Compare commits

..

323 Commits

Author SHA1 Message Date
ed
c07e0110f8 v1.9.15 2023-10-24 16:43:26 +00:00
ed
2808734047 drc: further reduce volume skip between songs 2023-10-24 16:38:29 +00:00
ed
1f75314463 placeholder expansion in readme and logues; closes #56
also fixes the "scan" volflag which broke in v1.9.14
2023-10-24 16:37:32 +00:00
ed
063fa3efde drc: fix volume jump on song change
(in exchange for a chance of clipping, which should be fine because
all browsers appear to have a limiter on the output anyways)
2023-10-23 09:05:31 +00:00
ed
44693d79ec update pkgs to 1.9.14 2023-10-21 14:52:22 +00:00
ed
cea746377e v1.9.14 2023-10-21 14:43:11 +00:00
ed
59a98bd2b5 update pkgs to 1.9.13 2023-10-21 13:34:50 +00:00
ed
250aa28185 v1.9.13 2023-10-21 13:14:41 +00:00
ed
5280792cd7 list existing tags even if tagscanning is disabled 2023-10-21 13:09:37 +00:00
ed
2529aa151d tersen volume listing on startup 2023-10-21 12:11:49 +00:00
ed
fc658e5b9e utcfromtimestamp was deprecated and nobody told me,
not even the deprecationwarning that got silently generated burning
20~30% of all CPU-time without actually displaying it anywhere, nice

python 3.12.0 is now only 5% slower than 3.11.6

also fixes some other, less-performance-fatal deprecations
2023-10-20 23:41:58 +00:00
ed
a4bad62b60 add clientside DRC / dynamic range compressor 2023-10-20 20:51:00 +00:00
ed
e1d78d8b23 increase timeout of unfinished uploads from 6 to 24 hours
plus make it configurable
2023-10-20 18:31:28 +00:00
ed
c7f826dbbe search by upload time 2023-10-19 23:57:27 +00:00
ed
801da8079b only 404-ban accounts with permission [gGh]:
never bonk anyone with read-access (able to see directory-listing)
or write-only (not able to retrieve any files at all) due to
either --ban-404 or --ban-url

fixes accidental ban when webdav-uploading files which
match any of the --ban-url patterns (#55)

also default-enables --ban-404 since it is now generally safe
(even when up2k is in turbo mode), plus make turbo smart enough to
disengage when necessary
2023-10-18 22:14:09 +00:00
ed
7d797dba3f strip filekeys from -txt- links;
accessing the syntax highlighter using a filekey is impossible anyways
because the client expects to build its state from the folder listing
and the backend refuses to return a listing given just a filekey
2023-10-18 20:57:53 +00:00
ed
cda90c285e mention lack of EINTR handling in older pythons 2023-10-17 17:58:01 +00:00
ed
4b5a0787ab option to show upload timestamps in directory listing;
enable with -mte +.ip_at
or volflag mte=+.ip_at

worst-case performance impact: 18%
2023-10-17 17:51:27 +00:00
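
A minimal sketch of both forms from that commit message; the paths and mountpoint are placeholders, and -e2dsa is assumed since upload times come from the file index:

```bash
# globally append the upload-time tag to the displayed tag list
python3 copyparty-sfx.py -e2dsa -mte +.ip_at

# same thing for a single volume via the mte volflag
python3 copyparty-sfx.py -e2dsa -v /srv/pub:pub:rw:c,mte=+.ip_at
```
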
ed
2048b7538e update pkgs to 1.9.12 2023-10-15 20:22:15 +00:00
ed
ac40dccc8f v1.9.12 2023-10-15 20:06:46 +00:00
ed
9ca8154651 prefer the new TTF in pillow 10.1 + pyinstaller 6.1 fixes 2023-10-15 18:47:34 +00:00
ed
db668ba491 spectrograms are never cropped; share thumbcache 2023-10-15 11:42:57 +00:00
ed
edbafd94c2 avoid iphone jank:
safari can immediately popstate when alt-tabbing back to the browser,
causing the page to load twice in parallel:

2174 log-capture ok
2295 h-repl $location
2498 h-pop $location <==
2551 sha-ok  # from initial load
2023-10-15 11:27:27 +00:00
ed
2df76eb6e1 client decides if thumbnails should be cropped or not
this carries some intentional side-effects; each thumbnail format will
now be stored in its own subfolder under .hist/th/ making cleanup more
effective (jpeg and webm are dropped separately)
2023-10-15 10:21:25 +00:00
ed
9b77c9ce7d more intuitive upload/filesearch toggle:
restore preferred mode after leaving a restricted folder
2023-10-15 09:00:57 +00:00
ed
dc2b67f155 ui-button to use upload-time instead of local last-modified 2023-10-15 08:46:23 +00:00
ed
9f32e9e11d set default sort order; --sort or volflag "sort" 2023-10-14 22:17:37 +00:00
ed
7086d2a305 ie9 support 2023-10-14 10:01:03 +00:00
ed
575615ca2d slight refactor; 7% faster, 1% more maintainable 2023-10-14 09:54:49 +00:00
kipukun ;_
c0da4b09bf contrib: bump python version in rc script
the default version of Python is now 3.9 as of FreeBSD 13.2-RELEASE
2023-10-13 10:15:27 +02:00
ed
22880ccc9a update pkgs to 1.9.11 2023-10-09 00:51:41 +00:00
ed
e4001550c1 v1.9.11 2023-10-09 00:36:54 +00:00
ed
e9f65be86a add cachebuster for dynamically loaded js files 2023-10-09 00:22:16 +00:00
ed
3b9919a486 update pkgs to 1.9.10 2023-10-08 21:16:12 +00:00
ed
acc363133f v1.9.10 2023-10-08 20:51:49 +00:00
ed
8f2d502d4d configurable printing of failed login attempts 2023-10-08 20:41:02 +00:00
ed
2ae93ad715 clear response headers for each request 2023-10-08 20:38:51 +00:00
ed
bb590e364a update pkgs to 1.9.9 2023-10-07 22:49:12 +00:00
ed
e7fff77735 v1.9.9 2023-10-07 22:29:37 +00:00
ed
753e3cfbaf revert 68c6794d (v1.6.2) and fix it better:
moving deduplicated files between volumes could drop some links
2023-10-07 22:25:44 +00:00
ed
99e9cba1f7 update pkgs to 1.9.8 2023-10-06 18:22:01 +00:00
ed
fcc3336760 v1.9.8 2023-10-06 17:50:35 +00:00
ed
0dc3c23b42 add alternative filekey generator; closes #52 2023-10-06 13:41:22 +00:00
ed
6aa10ecedc mention streaming unzip with bsdtar 2023-10-02 07:40:40 +02:00
ed
93125bba4d update pkgs to 1.9.7 2023-09-30 23:56:35 +00:00
ed
fae5a36e6f v1.9.7 2023-09-30 23:32:51 +00:00
ed
fc9b729fc2 fix #51:
* handle unexpected localstorage values
* handle unsupported --lang values
2023-09-30 22:54:21 +00:00
ed
8620ae5bb7 fix column-hiding ux on phones:
table header click-handler didn't cover the entire cell so it was
easy to sort the table by accident; also do not exit hiding mode
automatically since you usually want to hide several columns
(so also adjust css to make it obvious you're in hiding mode)
2023-09-28 09:28:26 +02:00
ed
01a851da28 mtp-deps: fix building on archlinux 2023-09-24 23:17:26 +00:00
ed
309895d39d docker: exploring alternative base images for performance 2023-09-24 22:26:51 +00:00
ed
7ac0803ded update pkgs to 1.9.6 2023-09-23 12:56:47 +00:00
ed
cae5ccea62 v1.9.6 2023-09-23 12:15:24 +00:00
ed
3768cb4723 add chat 2023-09-23 11:34:32 +00:00
ed
0815dce4c1 ensure indexing runs with --ign-ebind-all 2023-09-22 23:20:57 +00:00
ed
a62f744a18 prevent losing an out-of-volume index
if the server is started while an external drive is not mounted,
it would drop the database because all the files are missing
2023-09-22 23:05:07 +00:00
ed
163e3fce46 improve reverse-proxy support when containerized:
the x-forwarded-for header would get rejected since the reverse-proxy
is not asking from 127.0.0.1 or ::1, so make this allowlist configurable
2023-09-22 22:39:20 +00:00
ed
e76a50cb9d add indexer benchmark + bump default num cores from 4 to 5
and make the mtag deps build better on fedora
2023-09-22 20:40:52 +00:00
ed
72fc76ef48 golf / normalize window.location 2023-09-20 22:07:40 +00:00
ed
c47047c30d configurable real-ip header from reverse proxy 2023-09-20 21:56:39 +00:00
ed
3b8f66c0d5 fix a client crash when uploading from glitchy net
prevent reattempting chunks / handshakes after an upload has completed
since that is both pointless and crashy

bugreport ocr'ed from deepfried pic (thx kipu):
stack: exec_handshake -> xhr.onload -> tasked -> exec_upload -> do_send

529226 crash: t.fobj is null; firefox 117, win64
529083 zombie handshake onerror, some.flac
529081 chunkpit onerror,, 1, another.flac
528933 retrying stuck handshake
498842 ^
464213 zombie handshake onload, some.flac
464208 ^
462858 ignoring dupe-segment error, some.flac
462766 ^
462751 ^
462667 ^
462403 ^
462316 ^
461321 zombie handshake onload, some.flac
461302 ^
461152 ^
461114 ^
461110 ^
460769 ^
459954 ^
459492 ignoring dupe-segment error, some.flac
2023-09-20 21:25:59 +00:00
ed
aa96a1acdc misc optimizations / cleanup:
* slightly faster startup / shutdown
* forgot a jinja2 golf
* waste 4KiB changing prismjs back to gz since brotli is https-gated ;_;
* broke support for firefox<52 (non-var functions must be toplevel
   or immediately within another function), now even firefox 10 /
   centos 6 is somewhat supported again
2023-09-17 13:02:18 +00:00
ed
91cafc2511 faster startup on windows by asking for ffmpeg.exe explicitly
rather than just "ffmpeg" which makes windows try to open each of
ffmpeg.BAT,CMD,COM,EXE,JS,JSE,MSC,VBE,VBS,WSF,WSH one by one
(ffmpeg.js? hello??)
2023-09-13 23:32:19 +00:00
ed
23ca00bba8 support jython and graalpy 2023-09-13 23:24:56 +00:00
ed
a75a992951 golf the sfx-gz by ~27.6 kB;
* 11 kB webdeps: brotli easymde+prism instead of zopfli
* 8 kB jinja2
* 5 kB ftp
* 3 kB improve uncommenter
2023-09-13 23:21:22 +00:00
ed
4fbd6853f4 add msg-log.py initially by @clach04, closes #35 2023-09-12 19:56:05 +00:00
ed
71c3ad63b3 fix tests 2023-09-11 01:46:25 +00:00
ed
e1324e37a5 update pkgs to 1.9.5 2023-09-09 14:15:46 +00:00
ed
a996a09bba v1.9.5 2023-09-09 13:36:56 +00:00
ed
18c763ac08 smb: upgrade to impacket 0.11, full user account support,
permissions are now per-account instead of coalescing

also stops windows from freaking out if there's an offline volume
2023-09-09 12:46:37 +00:00
ed
3d9fb753ba stuff 2023-09-08 21:42:05 +00:00
ed
714fd1811a add option to generate pax-format tar archives
and forgot to commit the nix module
2023-09-08 21:13:23 +00:00
ed
4364581705 fix accidental 422-ban when uploading lots of dupes 2023-09-08 19:49:29 +00:00
ed
ba02c9cc12 readme fix + make hacker theme more hacker 2023-09-08 19:35:12 +00:00
ed
11eefaf968 create / edit non-markdown textfiles (if user has delete-access)
also enables the ansi escape code parser if the text looks like ansi
2023-09-08 18:47:31 +00:00
ed
5a968f9e47 add permission 'h': folders redirect to index.html;
safest way to make copyparty like a general-purpose webserver where
index.html is returned as expected yet directory listing is entirely
disabled / unavailable
2023-09-07 23:30:01 +00:00
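
A hedged illustration of how the new letter slots into the usual volume syntax; the paths and the admin account are placeholders:

```bash
# anonymous visitors get index.html instead of a directory listing,
# while the (placeholder) admin account keeps full read/write
python3 copyparty-sfx.py -a admin:hunter2 -v /srv/www::h:rw,admin
```
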
ed
6420c4bd03 up to 2.6x faster download-as-zip
when there's lots of files, and especially small ones
and also reduces cpu load by at least 15%
2023-09-05 22:57:03 +00:00
ed
0f9877201b support cache directives in --css-browser, --js-browser;
for example --css-browser=/the.css?cache=600 (seconds)
or --js-browser=/.res/the.js?cache=i (7 days)
2023-09-03 19:50:31 +00:00
ed
9ba2dec9b2 lightbox: fix ccw rotation hotkey 2023-09-03 19:23:29 +00:00
ed
ae9cfea939 update pkgs to 1.9.4 2023-09-02 00:45:57 +00:00
ed
cadaeeeace v1.9.4 2023-09-02 00:18:53 +00:00
ed
767696185b add ?tar=gz, ?tar=bz2, ?tar=xz with optional level;
defaults are ?tar=gz:3, ?tar=bz2:9, ?tar=xz:1
2023-09-01 23:44:10 +00:00
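
A rough usage sketch with curl; host, port and folder name are placeholders:

```bash
# default gzip level (gz:3)
curl -o pics.tar.gz 'http://127.0.0.1:3923/pics/?tar=gz'

# bump xz from the default level 1 to 9
curl -o pics.tar.xz 'http://127.0.0.1:3923/pics/?tar=xz:9'
```
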
ed
c1efd227b7 fix inconsistent use of symlink mtimes in database;
on upload, dupes are by default handled by symlinking to the existing
copy on disk, writing the uploader's local mtime into the symlink mtime,
which is also what gets indexed in the db

this worked as intended, however during an -e2dsa rescan on startup the
symlink destination timestamps would be used instead, causing a reindex
and the resulting loss of uploader metadata (ip, timestamp)

will now always use the symlink's mtime;
worst-case 1% slower startup (no dhash)

this change will cause a reindex of incorrectly indexed files, however
as this has already happened at least once due to the bug being fixed,
there will be no additional loss of metadata
2023-09-01 20:29:55 +00:00
ed
a50d0563c3 instantly perform search when URL contains a raw query 2023-09-01 20:16:19 +00:00
ed
e5641ddd16 update pkgs to 1.9.3 2023-08-31 23:08:32 +00:00
ed
700111ffeb v1.9.3 2023-08-31 22:11:31 +00:00
ed
b8adeb824a misc http correctness;
some of this looks shady af but appears to have been harmless
(decent amount of testing came out ok)

* some location normalization happened before unquoting; however vfs
   handled this correctly so the outcome was just confusing messages
* some url parameters were double-decoded (unpost filter, move
   destinations), causing some operations to fail unexpectedly
* invalid cache-control headers could be generated,
   but not in a maliciously-beneficial way
   (there are safeguards stripping newlines and control-characters)

also adds an exception-message cleanup step to strip away the
filesystem path that copyparty's python files are located at,
in case that could be interesting knowledge
2023-08-31 21:51:58 +00:00
ed
30cc9defcb cosmetics:
* in case someone gets a confusing access-related error message,
  include more context in serverlogs (exact path)
* fix js console spam in search results
* same markdown line-height in viewer and browser
2023-08-31 21:27:14 +00:00
ed
61875bd773 slightly reduce flickering during page load on chrome 2023-08-31 20:02:33 +00:00
ed
30905c6f5d add convenient debugs in case the fight is not over 2023-08-31 20:00:14 +00:00
ed
9986136dfb apple/ios/iphone: maybe fix background album playback
good news: apple finally added support for samplerates other than
44100 for AudioContext, meaning it would now have been possible to
set non-100% volume for audio files including opus files

bad news: apple broke AudioContext in a way that makes it bug out
mediaSessions, causing lockscreen controls to become mostly useless

bad news: apple broke AudioContext additionally where it randomly
causes playback issues, blocking playback of audio files, even if
the AudioContext is sitting idle doing nothing (which is a
requirement for reliable upload speeds on other platforms)

disable AudioContext on iOS
2023-08-31 19:57:05 +00:00
ed
1c0d978979 ios/iphone: autoreplace smart-quotes with sane quotes,
as the iphone keyboard is not able to produce ' or "
2023-08-31 19:29:37 +00:00
ed
0a0364e9f8 FTPd: fix py3.12 support; workaround until next release:
run sfx twice with PYTHONPATH=/tmp/pe-copyparty.$(id -u)/copyparty/vend
2023-08-28 00:25:33 +00:00
ed
3376fbde1a update pkgs to 1.9.2 2023-08-26 22:09:43 +00:00
ed
ac21fa7782 v1.9.2 2023-08-26 21:16:30 +00:00
ed
c1c8dc5e82 ok lets try that again 2023-08-26 19:07:23 +00:00
ed
5a38311481 mark offline volumes in directory tree sidebar 2023-08-26 19:00:46 +00:00
ed
9f8edb7f32 make markdown slightly safer without the nohtml volflag
by running dompurify after marked.parse if plugins are not enabled;
adds no protection against the more practical approach of just
putting a malicious <script> in an html file and uploading that,
but one footgun less is one less footgun
2023-08-26 17:37:02 +00:00
ed
c5a6ac8417 persist dotfile preference as cookie for initial listing 2023-08-26 15:50:57 +00:00
ed
50e01d6904 add more autoban triggers:
* --ban-url: URLs which 404 and also match --sus-urls (bot-scan)
* --ban-403: trying to access volumes that dont exist or require auth
* --ban-422: invalid POST messages, fuzzing and such
* --nonsus-urls: regex of 404s which shouldn't trigger --ban-404

in many situations it makes sense to handle this logic inside copyparty,
since stuff like cloudflare and running copyparty on another physical
box than the nginx frontend is on becomes fairly clunky
2023-08-26 13:52:24 +00:00
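
A loosely sketched combination of these flags; the count,minutes,ban-minutes threshold format is an assumption (check --help on your build), and the regex is purely illustrative:

```bash
# assumed threshold format: 50 hits in 60 minutes => 24h ban, etc.
python3 copyparty-sfx.py \
  --ban-404 50,60,1440 \
  --ban-403 9,2,1440 \
  --ban-422 9,2,1440 \
  --nonsus-urls 'favicon\.ico|robots\.txt'
```
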
ed
9b46291a20 add option to force-disable turbo,
making it safer to enable --ban-404
(u2c can still get banned inadvertently)
2023-08-26 13:19:38 +00:00
ed
14497b2425 docs:
* mention cloudflare-specific nginx config

versus.md:
* seafile has a size limit on zip downloads
* seafile and nextcloud are slow at uploading many small files

u2c: improve error message in funky environments
2023-08-25 21:57:26 +00:00
ed
f7ceae5a5f add filetable range-select with shift-pgup/pgdn,
and retain file selection cursor when lazyloading more files
2023-08-25 19:34:37 +00:00
ed
c9492d16ba fix textfile navigation hotkeys (broke in 5d13ebb4) 2023-08-25 18:41:45 +00:00
ed
9fb9ada3aa dont whine about inaccessible root on rootless configs,
and make it easier for on403 to invoke the homepage-redirect
2023-08-25 18:33:15 +00:00
ed
db0abbfdda typo 2023-08-21 00:05:39 +00:00
ed
e7f0009e57 update pkgs to 1.9.1 2023-08-20 23:53:58 +00:00
ed
4444f0f6ff v1.9.1 2023-08-20 23:38:42 +00:00
ed
418842d2d3 update pkgs to 1.9.0 2023-08-20 23:11:44 +00:00
ed
cafe53c055 v1.9.0 2023-08-20 22:02:40 +00:00
ed
7673beef72 actually impl --mc-hop (and improve --zm-spam) 2023-08-20 21:27:28 +00:00
ed
b28bfe64c0 explain apple bullshit 2023-08-20 22:09:00 +02:00
ed
135ece3fbd immediately allow uploading an interrupted and
deleted incomplete upload to another location
2023-08-20 19:16:35 +00:00
ed
bd3640d256 change to openmetrics 2023-08-20 18:50:14 +00:00
ed
fc0405c8f3 add prometheus metrics; closes #49 2023-08-20 17:58:06 +00:00
ed
7df890d964 wget: only allow http/https/ftp/ftps (#50):
these are all the protocols that are currently supported by wget,
so this has no practical effect aside from making sure we won't
suddenly get file:// support or something (which would be bad)
2023-08-20 09:47:50 +00:00
ed
8341041857 mdns: option to ignore spec to avoid issues on
networks where clients have multiple IPs of which some are subnets that
the copyparty server is not
2023-08-19 21:45:26 +00:00
ed
1b7634932d tar/zip-download: add opus transcoding filter 2023-08-19 19:40:46 +00:00
ed
48a3898aa6 suggest enabling the database on startup 2023-08-16 19:57:19 +00:00
ed
5d13ebb4ac avoid firefox-android quirk(?):
when repeatedly tapping the next-folder button, occasionally it will
reload the entire page instead of ajax'ing the directory contents.

Navigation happens by simulating a click in the directory sidebar,
so the incorrect behavior matches what would happen if the link to the
folder didn't have its onclick-handler attached, so should probably
double-check if there's some way for that to happen

Issue observed fairly easily in firefox on android, regardless if
copyparty is running locally or on a server in a different country.
Unable to reproduce with android-chrome or desktop-firefox

Could also be due to an addon (dark-reader, noscript, ublock-origin)

anyways, avoiding this by doing the navigation more explicitly
2023-08-16 19:56:47 +00:00
ed
015b87ee99 performance / cosmetic:
* js: use .call instead of .bind when possible
* when running without e2d, the message on startup regarding
  unfinished uploads didn't show the correct filesystem path
2023-08-16 19:32:43 +00:00
ed
0a48acf6be limit each column of the files table to screen width 2023-08-16 03:55:53 +00:00
ed
2b6a3afd38 fix iOS randomly increasing fontsize of some things:
* links which are wider than the display width
* probably input fields too
2023-08-16 03:47:19 +00:00
ed
18aa82fb2f make browser resizing smoother / less expensive 2023-08-15 16:55:19 +00:00
ed
f5407b2997 docker: persist autogenerated seeds, disable certgen, and
mention how to run the containers with selinux enabled
* assumes that a /cfg docker volume is provided
2023-08-15 15:07:33 +00:00
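
A minimal sketch of a matching docker invocation; the image name, port and host path are assumptions, and the :z suffix only matters on selinux hosts:

```bash
# keep the autogenerated seeds/config on the host so they survive container rebuilds
docker run --rm -it -p 3923:3923 -v /srv/copyparty/cfg:/cfg:z copyparty/ac
```
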
ed
474d5a155b android's got hella strict filename rules 2023-08-15 06:46:57 +02:00
ed
afcd98b794 mention some gotchas (thx noktuas) 2023-08-15 03:38:51 +02:00
ed
4f80e44ff7 option to exactly specify browser title prefix 2023-08-15 03:17:01 +02:00
ed
406e413594 hint at additional context in exceptions 2023-08-15 01:42:13 +02:00
ed
033b50ae1b u2c: exclude files by regex 2023-08-15 00:45:12 +02:00
ed
bee26e853b show server hostname in html titles:
* --doctitle defines most titles, prefixed with "--name: " by default
* the file browser is only prefixed with the --name itself
* --nth ("no-title-hostname") removes it
* also removed by --nih ("no-info-hostname")
2023-08-14 23:50:13 +02:00
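
A small illustration of how these flags might combine; the values are placeholders:

```bash
# custom server label and page title; add --nth to drop the hostname prefix again
python3 copyparty-sfx.py --name nas --doctitle "ed's fileserver"
```
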
ed
04a1f7040e adjustable timestamp resolution in log messages 2023-08-14 17:22:22 +02:00
ed
f9d5bb3b29 support upload by dragdrop from other browser windows,
hello from LO484 https://ocv.me/stuff/aircode.jpg
2023-07-28 21:43:40 +02:00
ed
ca0cd04085 update pkgs to 1.8.8 2023-07-25 16:25:27 +00:00
ed
999ee2e7bc v1.8.8 2023-07-25 15:50:48 +00:00
ed
1ff7f968e8 fix tls-cert regeneration on windows 2023-07-25 15:27:27 +00:00
ed
3966266207 remember ?edit and trailing-slash during login redirect 2023-07-25 15:14:47 +00:00
ed
d03e96a392 html5 strips the first leading LF in textareas; stop it 2023-07-25 14:16:54 +00:00
ed
4c843c6df9 fix md-editor lastmod cmp when browsercache is belligerent 2023-07-25 14:06:53 +00:00
ed
0896c5295c range-select fixes:
* dont crash when shiftclicking between folders
* remember origin when lazyloading more files
2023-07-25 14:06:31 +02:00
ed
cc0c9839eb update pkgs to 1.8.7 2023-07-23 16:16:49 +00:00
ed
d0aa20e17c v1.8.7 2023-07-23 15:43:38 +00:00
ed
1a658dedb7 fix infinite playback spin on servers with one single file 2023-07-23 14:52:42 +00:00
ed
8d376b854c this is the wrong way around 2023-07-23 14:10:23 +00:00
ed
490c16b01d be even stricter with ?hc 2023-07-23 13:23:52 +00:00
ed
2437a4e864 the CVE-2023-37474 fix was overly strict; loosen 2023-07-23 11:31:11 +00:00
ed
007d948cb9 fix GHSA-f54q-j679-p9hh: reflected-XSS in cookie-setters;
it was possible to set cookie values which contained newlines,
thus terminating the http header and bleeding into the body.

We now disallow control-characters in queries,
but still allow them in paths, as copyparty supports
filenames containing newlines and other mojibake.

The changes in `set_k304` are not necessary in fixing the vulnerability,
but makes the behavior more correct.
2023-07-23 10:55:08 +00:00
ed
335fcc8535 update pkgs to 1.8.6 2023-07-21 01:12:55 +00:00
ed
9eaa9904e0 v1.8.6 2023-07-21 00:36:37 +00:00
ed
0778da6c4d fix GHSA-cw7j-v52w-fp5r: reflected-XSS through /?hc 2023-07-21 00:35:43 +00:00
ed
a1bb10012d update pkgs to 1.8.4 2023-07-18 08:26:39 +00:00
ed
1441ccee4f v1.8.4 2023-07-18 07:46:22 +00:00
ed
491803d8b7 update pkgs to 1.8.3 2023-07-16 23:03:30 +00:00
ed
3dcc386b6f v1.8.3 2023-07-16 22:00:04 +00:00
ed
5aa54d1217 shift/ctrl-click improvements:
* always enable shift-click selection in list-view
* shift-clicking thumbnails opens in new window by default as expected
* enable shift-select in grid-view when multiselect is on
* invert select when the same shift-select is made repeatedly
2023-07-16 18:15:56 +00:00
ed
88b876027c option to range-select files with shift-click; closes #47
also restores the browser-default behavior of
opening links in a new tab with CTRL / new window with SHIFT
2023-07-16 14:05:09 +00:00
ed
fcc3aa98fd add path-traversal scanners 2023-07-16 13:09:31 +00:00
ed
f2f5e266b4 support listing uploader IPs in d2t volumes 2023-07-15 18:50:35 +00:00
ed
e17bf8f325 require the new admin permission for the admin-panel 2023-07-15 18:39:41 +00:00
ed
d19cb32bf3 update pkgs to 1.8.2 2023-07-14 16:05:57 +00:00
ed
85a637af09 v1.8.2 2023-07-14 15:58:39 +00:00
ed
043e3c7dd6 fix traversal vulnerability GHSA-pxfv-7rr3-2qjg:
the /.cpr endpoint allowed full access to server filesystem,
unless mitigated by prisonparty
2023-07-14 15:55:49 +00:00
ed
8f59afb159 fix another race (unpost):
unposting could collide with most other database-related activities,
causing one or the other to fail.
luckily the unprotected query performed by the unpost API happens to be
very cheap, so also the most likely to fail, and would succeed upon a
manual reattempt from the UI.
even in the worst case scenario, there would be no unrecoverable damage
as the next rescan would auto-repair any resulting inconsistencies.
2023-07-14 15:21:14 +00:00
ed
77f1e51444 fix unlikely race (e2tsr):
if someone with admin rights refreshes the homepage exactly as the
directory indexer decides to `_drop_caches`, the indexer thread would
die and the up2k instance would become inoperable...
luckily the probability of hitting this by chance is absolutely minimal,
and the worst case scenario is having to restart copyparty if this
happens immediately after startup; there is no risk of database damage
2023-07-14 15:20:25 +00:00
ed
22fc4bb938 add event-hook for banning users 2023-07-13 22:29:32 +00:00
ed
50c7bba6ea volflag "nohtml" to never return html or rendered markdown from potentially unsafe volumes 2023-07-13 21:57:52 +00:00
ed
551d99b71b add permission "a" to show uploader IPs (#45) 2023-07-12 21:36:55 +00:00
ed
b54b7213a7 more thumbnailer configs available as volflags:
--th-convt = convt
--th-no-crop = nocrop
--th-size = thsize
2023-07-11 22:15:37 +00:00
ed
a14943c8de update pkgs to 1.8.1 2023-07-07 23:58:16 +00:00
ed
a10cad54fc v1.8.1 2023-07-07 22:20:01 +00:00
ed
8568b7702a add pillow10 support + improve text rendering 2023-07-07 22:13:04 +00:00
ed
5d8cb34885 404/403 can be handled with plugins 2023-07-07 21:33:40 +00:00
ed
8d248333e8 dont disable quickedit when hashing passwords interactively 2023-07-07 18:29:30 +00:00
ed
99e2ef7f33 ux: fix tabs clipping in fedora-ff, hackertheme up2k flags 2023-07-07 18:24:58 +00:00
ed
e767230383 very-bad-idea: prefer mpv / streamlink; closes #42 2023-06-28 21:25:40 +00:00
ed
90601314d6 better explain why very-bad-idea is a very bad idea 2023-06-27 22:30:14 +00:00
ed
9c5eac1274 add fedora package 2023-06-27 22:22:42 +00:00
ed
50905439e4 update pkgs to 1.8.0 2023-06-26 00:46:55 +00:00
ed
a0c1239246 v1.8.0 2023-06-26 00:05:12 +00:00
ed
b8e851c332 cloudflare update + cosmetics:
* toastb padding fixes scrollbar on norwegian 403 in firefox
* fix text aspect ratio in seekbaron compact toggle
* crashpage had link overlaps on homepage
2023-06-25 23:09:29 +00:00
ed
baaf2eb24d include mdns names in tls cert 2023-06-25 22:06:35 +00:00
ed
e197895c10 support hashed passwords; closes #39 2023-06-25 21:50:33 +00:00
ed
cb75efa05d md-editor: index file and trigger upload hooks 2023-06-20 18:11:35 +00:00
ed
8b0cf2c982 volflags to limit volume size / num files; closes #40 2023-06-19 00:42:45 +00:00
ed
fc7d9e1f9c update pkgs to 1.7.6 2023-06-11 09:13:58 +00:00
ed
10caafa34c v1.7.6 2023-06-11 08:14:45 +00:00
ed
22cc22225a v1.7.5 2023-06-11 01:32:56 +00:00
ed
22dff4b0e5 update pkgs to 1.7.4 2023-06-11 01:26:25 +00:00
ed
a00ff2b086 v1.7.4 2023-06-11 00:07:38 +00:00
ed
e4acddc23b v1.7.3 2023-06-11 00:03:03 +00:00
ed
2b2d8e4e02 tls / gencert fixes 2023-06-10 23:34:34 +00:00
ed
5501d49032 prefer urandom for fk-salt unless cert.pem exists 2023-06-10 22:47:39 +00:00
ed
fa54b2eec4 generate tls certs 2023-06-10 22:46:24 +00:00
ed
cb0160021f upgrade pyinstaller env/deps 2023-06-10 11:58:58 +00:00
ed
93a723d588 add --ansi to systemd, fix grid controls bg,
mention folder thumbs dependency on -e2d,
improve make-sfx warnings,
update changelog
2023-06-06 22:04:39 +00:00
ed
8ebe1fb5e8 mention cfssl.sh in the default-certificate warning,
and improve documentation inside cfssl.sh
2023-06-06 21:41:19 +00:00
clach04
2acdf685b1 Fix issue #33 - no color output expected when redirecting stdout 2023-06-05 01:58:49 +02:00
ed
9f122ccd16 make-sfx: option to auto-obtain webdeps 2023-06-04 23:46:38 +00:00
ed
03be26fafc improve check for type-hint support 2023-06-04 22:59:25 +00:00
ed
df5d309d6e document the make-sfx.sh fast option 2023-06-04 14:13:35 +00:00
ed
c355f9bd91 catch common environment issues (#32):
* error-message which explains how to run on py2 / older py3
   when trying to run from source
* check compatibility between jinja2 and cpython on startup
* verify that webdeps are present on startup
* verify that webdeps are present when building sfx
* make-sfx.sh grabs the strip-hints dependency
2023-06-04 13:13:36 +00:00
ed
9c28ba417e option to regex-exclude files in browser listings 2023-06-02 21:54:25 +00:00
ed
705b58c741 support the NO_COLOR environment variable
https://no-color.org/ and more importantly
https://youtu.be/biW5UVGkPMA?t=150
2023-06-02 20:22:57 +00:00
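
For illustration, setting the variable to any non-empty value is enough:

```bash
# plain, uncolored log output
NO_COLOR=1 python3 copyparty-sfx.py
```
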
ed
510302d667 support ftps-only; closes #30 2023-06-02 19:02:50 +00:00
ed
025a537413 add option to show thumbs by default; closes #31 2023-06-02 18:41:21 +00:00
ed
60a1ff0fc0 macos: mute select() noise on wake from suspend 2023-05-19 16:37:52 +02:00
ed
f94a0b1bff update pkgs to 1.7.2 2023-05-13 00:49:46 +00:00
ed
4ccfeeb2cd v1.7.2 2023-05-13 00:00:07 +00:00
ed
2646f6a4f2 oh nice, looks like 3.18 fixed whatever broke in 3.17 2023-05-12 23:38:10 +00:00
ed
b286ab539e readme: add more examples 2023-05-12 22:41:06 +00:00
ed
2cca6e0922 warn when sharing certain system locations 2023-05-12 21:38:16 +00:00
ed
db51f1b063 cfg: allow trailing colon on category headers 2023-05-12 21:01:34 +00:00
ed
d979c47f50 optimize clearTimeout + always shrink upload panes after completion + fix GET alignment 2023-05-12 20:46:45 +00:00
ed
e64b87b99b dont hardlink symlinks (they could be relative) 2023-05-12 20:41:09 +00:00
ed
b985011a00 upgrade docker to alpine 3.18:
* enables chiptune player
* smaller containers (generate pycache at runtime)
2023-05-11 06:56:21 +00:00
ed
c2ed2314c8 pkg/arch: add setuptools 2023-05-08 22:24:46 +00:00
ed
cd496658c3 update pkgs to 1.7.1 2023-05-07 19:51:59 +00:00
ed
deca082623 v1.7.1 2023-05-07 18:34:39 +00:00
ed
0ea8bb7c83 forgot the u2c symlink + sfx listing 2023-05-07 15:45:20 +00:00
ed
1fb251a4c2 was moved to pyproject 2023-05-07 15:41:00 +00:00
ed
4295923b76 rename up2k.py (client) to u2c.py 2023-05-07 15:37:52 +00:00
ed
572aa4b26c rename up2k.py (client) to u2c.py 2023-05-07 15:35:56 +00:00
ed
b1359f039f linter cleanup 2023-05-07 14:38:30 +00:00
ed
867d8ee49e replace setup.py with pyproject.toml + misc cleanup 2023-05-07 14:37:57 +00:00
ed
04c86e8a89 webdav: support write-only folders + force auth option 2023-05-06 20:33:29 +00:00
ed
bc0cb43ef9 include usernames in request logs 2023-05-06 20:17:56 +00:00
ed
769454fdce ftpd: only log invalid passwords 2023-05-06 19:16:52 +00:00
ed
4ee81af8f6 support ';' in passwords 2023-05-06 18:54:55 +00:00
ed
8b0e66122f smoother playback cursor on short songs + optimize 2023-05-06 16:31:04 +00:00
ed
8a98efb929 adapt to new archpkg layout 2023-05-05 20:51:18 +00:00
ed
b6fd555038 panic if two accounts have the same password 2023-05-05 20:24:24 +00:00
ed
7eb413ad51 doc tweaks 2023-05-05 19:39:10 +00:00
ixces
4421d509eb update PKGBUILD 2023-05-02 17:21:12 +02:00
ed
793ffd7b01 update pkgs to 1.7.0 2023-04-29 22:50:36 +00:00
ed
1e22222c60 v1.7.0 2023-04-29 21:14:38 +00:00
ed
544e0549bc make xvol and xdev apply at runtime (closes #24):
* when accessing files inside an xdev volume, verify that the file
   exists on the same device/filesystem as the volume root

* when accessing files inside an xvol volume, verify that the file
   exists within any volume where the user has read access
2023-04-29 21:10:02 +00:00
ed
83178d0836 preserve empty folders (closes #23):
* when deleting files, do not cascade upwards through empty folders
* when moving folders, also move any empty folders inside

the only remaining action which autoremoves empty folders is
files getting deleted as they expire volume lifetimes

also prevents accidentally moving parent folders into subfolders
(even though that actually worked surprisingly well)
2023-04-29 11:30:43 +00:00
ed
c44f5f5701 nit 2023-04-29 09:44:46 +00:00
ed
138f5bc989 warn about android powersave settings on music interruption + fix eq on folder change 2023-04-29 09:31:53 +00:00
ed
e4759f86ef ftpd correctness:
* winscp mkdir failed because the folder-not-found error got repeated
* rmdir fails after all files in the folder have poofed; that's OK
* add --ftp4 as a precaution
2023-04-28 20:50:45 +00:00
ed
d71416437a show file selection summary 2023-04-27 19:33:52 +00:00
ed
a84c583b2c ok that wasn't enough 2023-04-27 19:06:35 +00:00
ed
cdacdccdb8 update pkgs to 1.6.15 2023-04-27 00:36:56 +00:00
ed
d3ccd3f174 v1.6.15 2023-04-26 23:00:55 +00:00
ed
cb6de0387d a bit faster 2023-04-26 19:56:27 +00:00
ed
abff40519d eyecandy: restore playback indicator on folder hop 2023-04-26 19:09:16 +00:00
ed
55c74ad164 30% faster folder listings (wtf...) 2023-04-26 18:55:53 +00:00
ed
673b4f7e23 option to show symlink's lastmod instead of deref;
mainly motivated by u2cli's folder syncing in turbo mode
which would un-turbo on most dupes due to wrong lastmod

disabled by default for regular http listings
(to avoid confusion in most regular usecases),
enable per-request with urlparam lt

enabled by default for single-level webdav listings
(because rclone hits the same issue as u2cli),
can be disabled with arg --dav-rt or volflag davrt

impossible to enable for recursive webdav listings
2023-04-26 18:54:21 +00:00
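
A rough sketch of the per-request and server-wide knobs described above; host and paths are placeholders:

```bash
# per-request: ask one regular folder listing to show each symlink's own lastmod
curl 'http://127.0.0.1:3923/music/?lt'

# server-wide: switch single-level webdav listings back to dereferenced timestamps
python3 copyparty-sfx.py --dav-rt
```
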
ed
d11e02da49 u2cli: avoid dns lookups while uploading 2023-04-26 18:46:42 +00:00
ed
8790f89e08 fix installing from source tarball 2023-04-26 18:40:47 +00:00
ed
33442026b8 try to discourage android from stopping playback...
...when continuing into the next folder

accidentally introduces a neat bonus feature where the music
no longer stops while you go looking for stuff to play next
2023-04-26 18:33:30 +00:00
ed
03193de6d0 socket read/write timeout 2023-04-24 20:04:22 +00:00
ed
8675ff40f3 update pkgs to 1.6.14 2023-04-24 07:52:12 +00:00
ed
d88889d3fc v1.6.14 2023-04-24 06:09:44 +00:00
ed
6f244d4335 update pkgs to 1.6.13 2023-04-24 00:46:47 +00:00
ed
cacca663b3 v1.6.13 2023-04-23 23:05:31 +00:00
ed
d5109be559 ftp: track login state isolated from pyftpdlib;
for convenience, the password can be provided as the username
but that confuses pyftpd a little so let's do this
2023-04-23 21:06:19 +00:00
ed
d999f06bb9 volflags can be -unset 2023-04-23 21:05:29 +00:00
ed
a1a8a8c7b5 configurable tls-certificate location 2023-04-23 20:56:55 +00:00
ed
fdd6f3b4a6 tar/zip: use volume name as toplevel fallback 2023-04-23 20:55:34 +00:00
ed
f5191973df docs cleanup:
* mostly deprecate --http-only and --https-only since there is zero
   performance gain in recent python versions, however could still be
   useful for avoiding limitations in alternative python interpreters
   (and forcing http/https with mdns/ssdp/qr)

* mention antivirus being useless as usual
2023-04-23 20:25:44 +00:00
ed
ddbaebe779 update pkgs to 1.6.12 2023-04-20 22:47:37 +00:00
ed
42099baeff v1.6.12 2023-04-20 21:41:47 +00:00
ed
2459965ca8 u2cli: dont enter delete stage if something failed 2023-04-20 20:40:09 +00:00
ed
6acf436573 u2idx pool instead of per-socket;
prevents running out of FDs thanks to thousands of sqlite3 sessions
and neatly sidesteps what could possibly be a race in python's
sqlite3 bindings where it sometimes forgets to close the fd
2023-04-20 20:36:13 +00:00
ed
f217e1ce71 correctly ignore multirange requests 2023-04-20 19:14:38 +00:00
ed
418000aee3 explain tus incompatibility + update docs 2023-04-19 21:46:33 +00:00
ed
dbbba9625b nix: make deps optional + update docs 2023-04-17 13:17:53 +02:00
Chinpo Nya
397bc92fbc rewrite the nix module config with nix options 2023-04-17 00:26:57 +02:00
Chinpo Nya
6e615dcd03 fix: remove ffmpeg from python env build inputs 2023-04-17 00:26:57 +02:00
Chinpo Nya
9ac5908b33 refactor: remove unnecessary use of 'rec' 2023-04-17 00:26:57 +02:00
Chinpo Nya
50912480b9 automate nix package updates 2023-04-17 00:26:57 +02:00
Chinpo Nya
24b9b8319d nix/nixos documentation 2023-04-17 00:26:57 +02:00
Chinpo Nya
b0f4f0b653 nixos module 2023-04-17 00:26:57 +02:00
Chinpo Nya
05bbd41c4b nix package 2023-04-17 00:26:57 +02:00
ed
8f5f8a3cda expand userhomes everywhere:
* -c
* -lo
* --hist
* hist volflag
* --ssl-log
2023-04-14 18:55:19 +02:00
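
For example, paths like these should now expand as expected (filenames are placeholders):

```bash
# ~ expands in config, logfile and history-db locations alike
python3 copyparty-sfx.py -c ~/copyparty.conf -lo ~/logs/copyparty.log --hist ~/.cache/copyparty
```
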
ed
c8938fc033 fix ipv4 location header on dualstack 2023-04-14 14:06:44 +02:00
ed
1550350e05 update docs (performance tips, windows example) 2023-04-13 21:36:55 +00:00
ed
5cc190c026 better 2023-04-12 22:09:46 +00:00
ed
d6a0a738ce add windows example + update docs + some cosmetics 2023-04-12 22:06:44 +00:00
ed
f5fe3678ee more safari-on-touchbar-macbook workarounds:
* safari invokes pause on the mediasession
   whenever any Audio loads a new src (preload)

* ...and on some(?) seeks
2023-04-07 23:04:01 +02:00
ed
f2a7925387 avoid safari bugs on touchbar macbooks:
* songs would play backwards
* playback started immediately on folder change
2023-04-07 12:38:37 +02:00
ed
fa953ced52 update archpkg to 1.6.11 2023-04-01 22:59:20 +00:00
ed
f0000d9861 v1.6.11 2023-04-01 21:12:54 +00:00
ed
4e67516719 last.fm web-scrobbler support 2023-04-01 21:02:03 +00:00
ed
29db7a6270 deps: automate prismjs build 2023-04-01 17:46:42 +00:00
ed
852499e296 dont panic in case of extension-injected css 2023-04-01 16:08:45 +00:00
ed
f1775fd51c update deps 2023-04-01 15:15:53 +00:00
ed
4bb306932a update systemd notes 2023-04-01 10:32:12 +00:00
ed
2a37e81bd8 add rclone optimization, closes #21 2023-04-01 10:21:21 +00:00
ed
6a312ca856 something dumb 2023-04-01 00:16:30 +00:00
ed
e7f3e475a2 more accurate bpm detector 2023-03-31 21:20:37 +00:00
ed
854ba0ec06 add audio filter plugin thing 2023-03-31 20:20:28 +00:00
ed
209b49d771 remind sqlite we have indexes 2023-03-30 21:45:58 +00:00
ed
949baae539 integrate markdown thumbs with image gallery 2023-03-30 21:21:21 +00:00
ed
5f4ea27586 new hook: exif stripper 2023-03-26 22:19:15 +00:00
ed
099cc97247 hooks: more correct usage examples 2023-03-26 22:18:48 +00:00
ed
592b7d6315 gdi js 2023-03-26 02:06:49 +00:00
ed
0880bf55a1 markdown thumbnails 2023-03-26 01:53:41 +00:00
ed
4cbffec0ec u2cli: show more errors + drop --ws (does nothing) 2023-03-23 23:47:41 +00:00
ed
cc355417d4 update docs 2023-03-23 23:37:45 +00:00
ed
e2bc573e61 webdav correctness:
* generally respond without body
   (rclone likes this)
* don't connection:close on most mkcol errors
2023-03-23 23:25:00 +00:00
ed
41c0376177 update archpkg to 1.6.10 2023-03-20 23:37:20 +00:00
ed
c01cad091e v1.6.10 2023-03-20 21:56:31 +00:00
ed
eb349f339c update foldersync / rclone docs 2023-03-20 21:54:08 +00:00
ed
24d8caaf3e switch rclone to owncloud mode so it sends lastmod 2023-03-20 21:45:52 +00:00
ed
5ac2c20959 basic support for rclone sync 2023-03-20 21:17:53 +00:00
ed
bb72e6bf30 support propfind of files (not just dirs) 2023-03-20 20:58:51 +00:00
ed
d8142e866a accept last-modified from owncloud webdav extension 2023-03-20 20:28:26 +00:00
ed
7b7979fd61 add sftpgo to comparison + update docs 2023-03-19 21:45:35 +00:00
ed
749616d09d help iOS understand short audio files 2023-03-19 20:03:35 +00:00
ed
5485c6d7ca prisonparty: FFmpeg runs faster with /dev/urandom 2023-03-19 18:32:35 +00:00
ed
b7aea38d77 add iOS uploader (mk.ii) 2023-03-18 18:38:37 +00:00
ed
0ecd9f99e6 update archpkg to 1.6.9 2023-03-16 22:34:09 +00:00
ed
ca04a00662 v1.6.9 2023-03-16 21:06:18 +00:00
ed
8a09601be8 url-param ?v disables index.html 2023-03-16 20:52:43 +00:00
ed
1fe0d4693e fix logues bleeding into navpane 2023-03-16 20:23:01 +00:00
ed
bba8a3c6bc fix truncated search results 2023-03-16 20:12:13 +00:00
ed
e3d7f0c7d5 add tooltip delay to android too 2023-03-16 19:48:44 +00:00
ed
be7bb71bbc add option to show index.html instead of listing 2023-03-16 19:41:33 +00:00
ed
e0c4829ec6 verify covers against db instead of fs 2023-03-15 19:48:43 +00:00
ed
5af1575329 readme: ideas welcome w 2023-03-14 22:24:43 +00:00
ed
884f966b86 update archpkg to 1.6.8 2023-03-12 18:55:02 +00:00
ed
f6c6fbc223 fix exe builder 2023-03-12 18:54:16 +00:00
159 changed files with 9583 additions and 1863 deletions


@@ -1,2 +1,2 @@
Please include the following text somewhere in this PR description:
To show that your contribution is compatible with the MIT License, please include the following text somewhere in this PR description:
This PR complies with the DCO; https://developercertificate.org/

.gitignore (7 changed lines)

@@ -21,6 +21,9 @@ copyparty.egg-info/
# winmerge
*.bak
# apple pls
.DS_Store
# derived
copyparty/res/COPYING.txt
copyparty/web/deps/
@@ -34,3 +37,7 @@ up.*.txt
.hist/
scripts/docker/*.out
scripts/docker/*.err
/perf.*
# nix build output link
result

.vscode/launch.json (4 changed lines)

@@ -9,6 +9,10 @@
"console": "integratedTerminal",
"cwd": "${workspaceFolder}",
"justMyCode": false,
"env": {
"PYDEVD_DISABLE_FILE_VALIDATION": "1",
"PYTHONWARNINGS": "always", //error
},
"args": [
//"-nw",
"-ed",

.vscode/launch.py (12 changed lines)

@@ -30,10 +30,18 @@ except:
argv = [os.path.expanduser(x) if x.startswith("~") else x for x in argv]
sfx = ""
if len(sys.argv) > 1 and os.path.isfile(sys.argv[1]):
sfx = sys.argv[1]
sys.argv = [sys.argv[0]] + sys.argv[2:]
argv += sys.argv[1:]
if re.search(" -j ?[0-9]", " ".join(argv)):
argv = [sys.executable, "-m", "copyparty"] + argv
if sfx:
argv = [sys.executable, sfx] + argv
sp.check_call(argv)
elif re.search(" -j ?[0-9]", " ".join(argv)):
argv = [sys.executable, "-Wa", "-m", "copyparty"] + argv
sp.check_call(argv)
else:
sys.path.insert(0, os.getcwd())

.vscode/settings.json (32 changed lines)

@@ -35,34 +35,18 @@
"python.linting.flake8Enabled": true,
"python.linting.banditEnabled": true,
"python.linting.mypyEnabled": true,
"python.linting.mypyArgs": [
"--ignore-missing-imports",
"--follow-imports=silent",
"--show-column-numbers",
"--strict"
],
"python.linting.flake8Args": [
"--max-line-length=120",
"--ignore=E722,F405,E203,W503,W293,E402,E501,E128",
"--ignore=E722,F405,E203,W503,W293,E402,E501,E128,E226",
],
"python.linting.banditArgs": [
"--ignore=B104"
],
"python.linting.pylintArgs": [
"--disable=missing-module-docstring",
"--disable=missing-class-docstring",
"--disable=missing-function-docstring",
"--disable=import-outside-toplevel",
"--disable=wrong-import-position",
"--disable=raise-missing-from",
"--disable=bare-except",
"--disable=broad-except",
"--disable=invalid-name",
"--disable=line-too-long",
"--disable=consider-using-f-string"
"--ignore=B104,B110,B112"
],
// python3 -m isort --py=27 --profile=black copyparty/
"python.formatting.provider": "black",
"python.formatting.provider": "none",
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter"
},
"editor.formatOnSave": true,
"[html]": {
"editor.formatOnSave": false,
@@ -74,10 +58,6 @@
"files.associations": {
"*.makefile": "makefile"
},
"python.formatting.blackArgs": [
"-t",
"py27"
],
"python.linting.enabled": true,
"python.pythonPath": "/usr/bin/python3"
}

.vscode/tasks.json (1 changed line)

@@ -11,6 +11,7 @@
"type": "shell",
"command": "${config:python.pythonPath}",
"args": [
"-Wa", //-We
".vscode/launch.py"
]
}

README.md (580 changed lines)

@@ -1,29 +1,16 @@
# 💾🎉 copyparty
* portable file sharing hub (py2/py3) [(on PyPI)](https://pypi.org/project/copyparty/)
* MIT-Licensed, 2019-05-26, ed @ irc.rizon.net
turn almost any device into a file server with resumable uploads/downloads using [*any*](#browser-support) web browser
* server only needs Python (2 or 3), all dependencies optional
* 🔌 protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
* 📱 [android app](#android-app) // [iPhone shortcuts](#ios-shortcuts)
## summary
turn your phone or raspi into a portable file server with resumable uploads/downloads using *any* web browser
* server only needs Python (`2.7` or `3.3+`), all dependencies optional
* browse/upload with [IE4](#browser-support) / netscape4.0 on win3.11 (heh)
* protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
## get the app
<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
(the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet)
## readme toc
* top
@@ -33,9 +20,8 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [testimonials](#testimonials) - small collection of user feedback
* [motivations](#motivations) - project goals / philosophy
* [notes](#notes) - general notes
* [bugs](#bugs)
* [general bugs](#general-bugs)
* [not my bugs](#not-my-bugs)
* [bugs](#bugs) - roughly sorted by chance of encounter
* [not my bugs](#not-my-bugs) - same order here too
* [breaking changes](#breaking-changes) - upgrade notes
* [FAQ](#FAQ) - "frequently" asked questions
* [accounts and volumes](#accounts-and-volumes) - per-folder, per-user permissions
@@ -52,6 +38,9 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [self-destruct](#self-destruct) - uploads can be given a lifetime
* [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
* [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
* [media player](#media-player) - plays almost every audio format there is
* [audio equalizer](#audio-equalizer) - and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
* [fix unreliable playback on android](#fix-unreliable-playback-on-android) - due to phone / app settings
* [markdown viewer](#markdown-viewer) - and there are *two* editors
* [other tricks](#other-tricks)
* [searching](#searching) - search by size, date, path/name, mp3-tags, ...
@@ -76,22 +65,35 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags
* [event hooks](#event-hooks) - trigger a program on uploads, renames etc ([examples](./bin/hooks/))
* [upload events](#upload-events) - the older, more powerful approach ([examples](./bin/mtag/))
* [handlers](#handlers) - redefine behavior with plugins ([examples](./bin/handlers/))
* [hiding from google](#hiding-from-google) - tell search engines you dont wanna be indexed
* [themes](#themes)
* [complete examples](#complete-examples)
* [reverse-proxy](#reverse-proxy) - running copyparty next to other websites
* [prometheus](#prometheus) - metrics/stats can be enabled
* [packages](#packages) - the party might be closer than you think
* [arch package](#arch-package) - now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
* [fedora package](#fedora-package) - now [available on copr-pypi](https://copr.fedorainfracloud.org/coprs/g/copr/PyPI/)
* [nix package](#nix-package) - `nix profile install github:9001/copyparty`
* [nixos module](#nixos-module)
* [browser support](#browser-support) - TLDR: yes
* [client examples](#client-examples) - interact with copyparty using non-browser clients
* [folder sync](#folder-sync) - sync folders to/from copyparty
* [mount as drive](#mount-as-drive) - a remote copyparty server as a local filesystem
* [android app](#android-app) - upload to copyparty with one tap
* [iOS shortcuts](#iOS-shortcuts) - there is no iPhone app, but
* [performance](#performance) - defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
* [client-side](#client-side) - when uploading files
* [security](#security) - some notes on hardening
* [security](#security) - there is a [discord server](https://discord.gg/25J8CdTT6G)
* [gotchas](#gotchas) - behavior that might be unexpected
* [cors](#cors) - cross-site request config
* [filekeys](#filekeys) - prevent filename bruteforcing
* [password hashing](#password-hashing) - you can hash passwords
* [https](#https) - both HTTP and HTTPS are accepted
* [recovering from crashes](#recovering-from-crashes)
* [client crashes](#client-crashes)
* [frefox wsod](#frefox-wsod) - firefox 87 can crash during uploads
* [HTTP API](#HTTP-API) - see [devnotes](#./docs/devnotes.md#http-api)
* [HTTP API](#HTTP-API) - see [devnotes](./docs/devnotes.md#http-api)
* [dependencies](#dependencies) - mandatory deps
* [optional dependencies](#optional-dependencies) - install these to enable bonus features
* [optional gpl stuff](#optional-gpl-stuff)
@@ -106,8 +108,10 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** -- that's it! 🎉
* or install through pypi (python3 only): `python3 -m pip install --user -U copyparty`
* or install through pypi: `python3 -m pip install --user -U copyparty`
* or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
* or install [on arch](#arch-package) [on fedora](#fedora-package) [on NixOS](#nixos-module) [through nix](#nix-package)
* or if you are on android, [install copyparty in termux](#install-on-android)
* or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too
* docker has all deps built-in, so skip this step:
@@ -126,6 +130,8 @@ enable thumbnails (images/audio/video), media indexing, and audio transcoding by
running copyparty without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes)
or see [some usage examples](#complete-examples) for inspiration, or the [complete windows example](./docs/examples/windows.md)
some recommended options:
* `-e2dsa` enables general [file indexing](#file-indexing)
* `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen)
@@ -138,10 +144,11 @@ some recommended options:
you may also want these, especially on servers:
* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service
* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service (see guide inside)
* [contrib/systemd/prisonparty.service](contrib/systemd/prisonparty.service) to run it in a chroot (for extra security)
* [contrib/rc/copyparty](contrib/rc/copyparty) to run copyparty on FreeBSD
* [contrib/nginx/copyparty.conf](contrib/nginx/copyparty.conf) to [reverse-proxy](#reverse-proxy) behind nginx (for better https)
* [nixos module](#nixos-module) to run copyparty on NixOS hosts
and remember to open the ports you want; here's a complete example including every feature copyparty has to offer:
```
@@ -165,14 +172,18 @@ firewall-cmd --reload
* ☑ [smb/cifs server](#smb-server)
* ☑ [qr-code](#qr-code) for quick access
* ☑ [upnp / zeroconf / mdns / ssdp](#zeroconf)
* ☑ [event hooks](#event-hooks) / script runner
* ☑ [reverse-proxy support](https://github.com/9001/copyparty#reverse-proxy)
* upload
* ☑ basic: plain multipart, ie6 support
* ☑ [up2k](#uploading): js, resumable, multithreaded
* unaffected by cloudflare's max-upload-size (100 MiB)
* ☑ stash: simple PUT filedropper
* ☑ filename randomizer
* ☑ write-only folders
* ☑ [unpost](#unpost): undo/delete accidental uploads
* ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
* ☑ symlink/discard existing files (content-matching)
* ☑ symlink/discard duplicates (content-matching)
* download
* ☑ single files in browser
* ☑ [folders as zip / tar files](#zip-downloads)
@@ -193,10 +204,15 @@ firewall-cmd --reload
* ☑ [locate files by contents](#file-search)
* ☑ search by name/path/date/size
* ☑ [search by ID3-tags etc.](#searching)
* client support
* ☑ [folder sync](#folder-sync)
* ☑ [curl-friendly](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png)
* markdown
* ☑ [viewer](#markdown-viewer)
* ☑ editor (sure why not)
PS: something missing? post any crazy ideas you've got as a [feature request](https://github.com/9001/copyparty/issues/new?assignees=9001&labels=enhancement&template=feature_request.md) or [discussion](https://github.com/9001/copyparty/discussions/new?category=ideas) 🤙
## testimonials
@@ -246,29 +262,42 @@ server notes:
# bugs
* Windows: python 2.7 cannot index non-ascii filenames with `-e2d`
* Windows: python 2.7 cannot handle filenames with mojibake
* `--th-ff-jpg` may fix video thumbnails on some FFmpeg versions (macos, some linux)
* `--th-ff-swr` may fix audio thumbnails on some FFmpeg versions
roughly sorted by chance of encounter
## general bugs
* general:
* `--th-ff-jpg` may fix video thumbnails on some FFmpeg versions (macos, some linux)
* `--th-ff-swr` may fix audio thumbnails on some FFmpeg versions
* if the `up2k.db` (filesystem index) is on a samba-share or network disk, you'll get unpredictable behavior if the share is disconnected for a bit
* use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db on a local disk instead
* all volumes must exist / be available on startup; up2k (mtp especially) gets funky otherwise
* probably more, pls let me know
* Windows: if the `up2k.db` (filesystem index) is on a samba-share or network disk, you'll get unpredictable behavior if the share is disconnected for a bit
* use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db on a local disk instead
* all volumes must exist / be available on startup; up2k (mtp especially) gets funky otherwise
* probably more, pls let me know
* python 3.4 and older (including 2.7):
* many rare and exciting edge-cases because [python didn't handle EINTR yet](https://peps.python.org/pep-0475/)
* downloads from copyparty may suddenly fail, but uploads *should* be fine
* python 2.7 on Windows:
* cannot index non-ascii filenames with `-e2d`
* cannot handle filenames with mojibake
## not my bugs
same order here too
* [Chrome issue 1317069](https://bugs.chromium.org/p/chromium/issues/detail?id=1317069) -- if you try to upload a folder which contains symlinks by dragging it into the browser, the symlinked files will not get uploaded
* [Chrome issue 1352210](https://bugs.chromium.org/p/chromium/issues/detail?id=1352210) -- plaintext http may be faster at filehashing than https (but also extremely CPU-intensive)
* [Firefox issue 1790500](https://bugzilla.mozilla.org/show_bug.cgi?id=1790500) -- entire browser can crash after uploading ~4000 small files
* Android: music playback randomly stops due to [battery usage settings](#fix-unreliable-playback-on-android)
* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
* *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume
* "future" because `AudioContext` is broken in the current iOS version (15.1), maybe one day...
* `AudioContext` will probably never be a viable workaround as apple introduces new issues faster than they fix current ones
* iPhones: the preload feature (in the media-player-options tab) can cause a tiny audio glitch 20sec before the end of each song, but disabling it may cause worse iOS bugs to appear instead
* just a hunch, but disabling preloading may cause playback to stop entirely, or possibly mess with bluetooth speakers
* tried to add a tooltip regarding this but looks like apple broke my tooltips
* Windows: folders cannot be accessed if the name ends with `.`
* python or windows bug
@@ -278,6 +307,7 @@ server notes:
* VirtualBox: sqlite throws `Disk I/O Error` when running in a VM and the up2k database is in a vboxsf
* use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db inside the vm instead
* also happens on mergerfs, so put the db elsewhere
* Ubuntu: dragging files from certain folders into firefox or chrome is impossible
* due to snap security policies -- see `snap connections firefox` for the allowlist, `removable-media` permits all of `/mnt` and `/media` apparently
@@ -291,7 +321,7 @@ upgrade notes
* http-api: delete/move is now `POST` instead of `GET`
* everything other than `GET` and `HEAD` must pass [cors validation](#cors)
* `1.5.0` (2022-12-03): [new chunksize formula](https://github.com/9001/copyparty/commit/54e1c8d261df) for files larger than 128 GiB
* **users:** upgrade to the latest [cli uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) if you use that
* **users:** upgrade to the latest [cli uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py) if you use that
* **devs:** update third-party up2k clients (if those even exist)
@@ -306,11 +336,17 @@ upgrade notes
* can I make copyparty download a file to my server if I give it a URL?
* yes, using [hooks](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/wget.py)
* i want to learn python and/or programming and am considering looking at the copyparty source code in that occasion
```bash
_| _ __ _ _|_
(_| (_) | | (_) |_
```
# accounts and volumes
per-folder, per-user permissions - if your setup is getting complex, consider making a [config file](./docs/example.conf) instead of using arguments
* much easier to manage, and you can modify the config at runtime with `systemctl reload copyparty` or more conveniently using the `[reload cfg]` button in the control-panel (if logged in as admin)
* much easier to manage, and you can modify the config at runtime with `systemctl reload copyparty` or more conveniently using the `[reload cfg]` button in the control-panel (if the user has `a`/admin in any volume)
* changes to the `[global]` config section requires a restart to take effect
a quick summary can be seen using `--help-accounts`
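as a minimal sketch (account names, passwords and paths below are just placeholders), the same setup can be given either as arguments or as a config file:
```bash
# one account (u1/p1) with read-write access to /srv/inc, shared at /inc
python copyparty-sfx.py -a u1:p1 -v /srv/inc:inc:rw,u1

# or keep the accounts/volumes in a config file instead
python copyparty-sfx.py -c my.conf
```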
@@ -328,7 +364,9 @@ permissions:
* `m` (move): move files/folders *from* this folder
* `d` (delete): delete files/folders
* `g` (get): only download files, cannot see folder contents or zip/tar
* `G` (upget): same as `g` except uploaders get to see their own filekeys (see `fk` in examples below)
* `G` (upget): same as `g` except uploaders get to see their own [filekeys](#filekeys) (see `fk` in examples below)
* `h` (html): same as `g` except folders return their index.html, and filekeys are not necessary for index.html
* `a` (admin): can see upload time, uploader IPs, config-reload
examples:
* add accounts named u1, u2, u3 with passwords p1, p2, p3: `-a u1:p1 -a u2:p2 -a u3:p3`
@@ -340,7 +378,7 @@ examples:
* `u1` can open the `inc` folder, but cannot see the contents, only upload new files to it
* `u2` can browse it and move files *from* `/inc` into any folder where `u2` has write-access
* make folder `/mnt/ss` available at `/i`, read-write for u1, get-only for everyone else, and enable filekeys: `-v /mnt/ss:i:rw,u1:g:c,fk=4`
* `c,fk=4` sets the `fk` (filekey) volflag to 4, meaning each file gets a 4-character accesskey
* `c,fk=4` sets the `fk` ([filekey](#filekeys)) volflag to 4, meaning each file gets a 4-character accesskey
* `u1` can upload files, browse the folder, and see the generated filekeys
* other users cannot browse the folder, but can access the files if they have the full file URL with the filekey
* replacing the `g` permission with `wg` would let anonymous users upload files, but not see the required filekey to access it
@@ -457,6 +495,7 @@ click the `🌲` or pressing the `B` hotkey to toggle between breadcrumbs path (
## thumbnails
press `g` or `田` to toggle grid-view instead of the file listing and `t` toggles icons / thumbnails
* can be made default globally with `--grid` or per-volume with volflag `grid`
![copyparty-thumbs-fs8](https://user-images.githubusercontent.com/241032/129636211-abd20fa2-a953-4366-9423-1c88ebb96ba9.png)
@@ -467,10 +506,14 @@ it does static images with Pillow / pyvips / FFmpeg, and uses FFmpeg for video f
audio files are converted into spectrograms using FFmpeg unless you `--no-athumb` (and some FFmpeg builds may need `--th-ff-swr`)
images with the following names (see `--th-covers`) become the thumbnail of the folder they're in: `folder.png`, `folder.jpg`, `cover.png`, `cover.jpg`
* and, if you enable [file indexing](#file-indexing), all remaining folders will also get thumbnails (as long as they contain any pics at all)
in the grid/thumbnail view, if the audio player panel is open, songs will start playing when clicked
* indicated by the audio files having the ▶ icon instead of 💾
enabling `multiselect` lets you click files to select them, and then shift-click another file for range-select
* `multiselect` is mostly intended for phones/tablets, but the `sel` option in the `[⚙️] settings` tab is better suited for desktop use, allowing selection by CTRL-clicking and range-selection with SHIFT-click, all without affecting regular clicking
## zip downloads
@@ -481,12 +524,20 @@ select which type of archive you want in the `[⚙️] config` tab:
| name | url-suffix | description |
|--|--|--|
| `tar` | `?tar` | plain gnutar, works great with `curl \| tar -xv` |
| `pax` | `?tar=pax` | pax-format tar, futureproof, not as fast |
| `tgz` | `?tar=gz` | gzip compressed gnu-tar (slow), for `curl \| tar -xvz` |
| `txz` | `?tar=xz` | gnu-tar with xz / lzma compression (v.slow) |
| `zip` | `?zip=utf8` | works everywhere, glitchy filenames on win7 and older |
| `zip_dos` | `?zip` | traditional cp437 (no unicode) to fix glitchy filenames |
| `zip_crc` | `?zip=crc` | cp437 with crc32 computed early for truly ancient software |
* gzip default level is `3` (0=fast, 9=best), change with `?tar=gz:9`
* xz default level is `1` (0=fast, 9=best), change with `?tar=xz:9`
* bz2 default level is `2` (1=fast, 9=best), change with `?tar=bz2:9`
* hidden files (dotfiles) are excluded unless `-ed`
* `up2k.db` and `dir.txt` are always excluded
* bsdtar supports streaming unzipping: `curl foo?zip=utf8 | bsdtar -xv`
* good, because copyparty's zip is faster than tar on small files
* `zip_crc` will take longer to download since the server has to read each file twice
* this is only to support MS-DOS PKZIP v2.04g (october 1993) and older
* how are you accessing copyparty actually
@@ -495,10 +546,15 @@ you can also zip a selection of files or folders by clicking them in the browser
![copyparty-zipsel-fs8](https://user-images.githubusercontent.com/241032/129635374-e5136e01-470a-49b1-a762-848e8a4c9cdc.png)
cool trick: download a folder by appending url-params `?tar&opus` to transcode all audio files (except aac|m4a|mp3|ogg|opus|wma) to opus before they're added to the archive
* super useful if you're 5 minutes away from takeoff and realize you don't have any music on your phone but your server only has flac files and downloading those will burn through all your data + there wouldn't be enough time anyways
* and url-params `&j` / `&w` produce jpeg/webm thumbnails/spectrograms instead of the original audio/video/images
* can also be used to pregenerate thumbnails; combine with `--th-maxage=9999999` or `--th-clean=0`
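putting the table and the `?tar&opus` trick above into concrete commands (host and folder below are placeholders):
```bash
# gzip-compressed tar at max compression, saved locally
curl 'http://127.0.0.1:3923/music/?tar=gz:9' -o music.tar.gz

# stream-extract a zip on the fly with bsdtar
curl 'http://127.0.0.1:3923/music/?zip=utf8' | bsdtar -xv

# tar where every audio file is transcoded to opus before being archived
curl 'http://127.0.0.1:3923/music/?tar&opus' -o music-opus.tar
```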
## uploading
drag files/folders into the web-browser to upload (or use the [command-line uploader](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy))
drag files/folders into the web-browser to upload (or use the [command-line uploader](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy))
this initiates an upload using `up2k`; there are two uploaders available:
* `[🎈] bup`, the basic uploader, supports almost every browser since netscape 4.0
@@ -531,7 +587,8 @@ the up2k UI is the epitome of polished intuitive experiences:
* "parallel uploads" specifies how many chunks to upload at the same time
* `[🏃]` analysis of other files should continue while one is uploading
* `[🥔]` shows a simpler UI for faster uploads from slow devices
* `[💭]` ask for confirmation before files are added to the queue
* `[🎲]` generate random filenames during upload
* `[📅]` preserve last-modified timestamps; server times will match yours
* `[🔎]` switch between upload and [file-search](#file-search) mode
* ignore `[🔎]` if you add files by dragging them into the browser
@@ -593,6 +650,7 @@ file selection: click somewhere on the line (not the link itself), then:
* `up/down` to move
* `shift-up/down` to move-and-select
* `ctrl-shift-up/down` to also scroll
* shift-click another line for range-select
* cut: select some files and `ctrl-x`
* paste: `ctrl-v` in another folder
@@ -648,12 +706,70 @@ or a mix of both:
the metadata keys you can use in the format field are the ones in the file-browser table header (whatever is collected with `-mte` and `-mtp`)
## media player
plays almost every audio format there is (if the server has FFmpeg installed for on-demand transcoding)
the following audio formats are usually always playable, even without FFmpeg: `aac|flac|m4a|mp3|ogg|opus|wav`
some highlights:
* OS integration; control playback from your phone's lockscreen ([windows](https://user-images.githubusercontent.com/241032/233213022-298a98ba-721a-4cf1-a3d4-f62634bc53d5.png) // [iOS](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) // [android](https://user-images.githubusercontent.com/241032/233212311-a7368590-08c7-4f9f-a1af-48ccf3f36fad.png))
* shows the audio waveform in the seekbar
* not perfectly gapless but can get really close (see settings + eq below); good enough to enjoy gapless albums as intended
click the `play` link next to an audio file, or copy the link target to [share it](https://a.ocv.me/pub/demo/music/Ubiktune%20-%20SOUNDSHOCK%202%20-%20FM%20FUNK%20TERRROR!!/#af-1fbfba61&t=18) (optionally with a timestamp to start playing from, like that example does)
open the `[🎺]` media-player-settings tab to configure it,
* switches:
* `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
* `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
* `[~s]` toggles the seekbar waveform display
* `[/np]` enables buttons to copy the now-playing info as an irc message
* `[os-ctl]` makes it possible to control audio playback from the lockscreen of your device (enables [mediasession](https://developer.mozilla.org/en-US/docs/Web/API/MediaSession))
* `[seek]` allows seeking with lockscreen controls (buggy on some devices)
* `[art]` shows album art on the lockscreen
* `[🎯]` keeps the playing song scrolled into view (good when using the player as a taskbar dock)
* `[⟎]` shrinks the playback controls
* playback mode:
* `[loop]` keeps looping the folder
* `[next]` plays into the next folder
* transcode:
* `[flac]` converts `flac` and `wav` files into opus
* `[aac]` converts `aac` and `m4a` files into opus
* `[oth]` converts all other known formats into opus
* `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
* "tint" reduces the contrast of the playback bar
### audio equalizer
and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
can also boost the volume in general, or increase/decrease stereo width (like [crossfeed](https://www.foobar2000.org/components/view/foo_dsp_meiercf) just worse)
has the convenient side-effect of reducing the pause between songs, so gapless albums play better with the eq enabled (just make it flat)
not available on iPhones / iPads because AudioContext currently breaks background audio playback on iOS (15.7.8)
### fix unreliable playback on android
due to phone / app settings, android phones may randomly stop playing music when the power saver kicks in, especially at the end of an album -- you can fix it by [disabling power saving](https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png) in the [app settings](https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png) of the browser you use for music streaming (preferably a dedicated one)
## markdown viewer
and there are *two* editors
![copyparty-md-read-fs8](https://user-images.githubusercontent.com/241032/115978057-66419080-a57d-11eb-8539-d2be843991aa.png)
there is a built-in extension for inline clickable thumbnails;
* enable it by adding `<!-- th -->` somewhere in the doc
* add thumbnails with `!th[l](your.jpg)` where `l` means left-align (`r` = right-align)
* a single line with `---` clears the float / inlining
* in the case of README.md being displayed below a file listing, thumbnails will open in the gallery viewer
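a minimal sketch of a doc using the syntax above (`cover.jpg` is a placeholder; use any image in the same folder):
```bash
cat > README.md <<'EOF'
<!-- th -->
!th[l](cover.jpg) this text wraps around the left-aligned thumbnail

---

the --- above clears the float, so this line starts below the image
EOF
```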
other notes,
* the document preview has a max-width which is the same as an A4 paper when printed
@@ -673,6 +789,8 @@ and there are *two* editors
* files named `README.md` / `readme.md` will be rendered after directory listings unless `--no-readme` (but `.epilogue.html` takes precedence)
* `README.md` and `*logue.html` can contain placeholder values which are replaced server-side before embedding into directory listings; see `--help-exp`
## searching
@@ -698,7 +816,7 @@ for the above example to work, add the commandline argument `-e2ts` to also scan
using arguments or config files, or a mix of both:
* config files (`-c some.conf`) can set additional commandline arguments; see [./docs/example.conf](docs/example.conf) and [./docs/example2.conf](docs/example2.conf)
* `kill -s USR1` (same as `systemctl reload copyparty`) to reload accounts and volumes from config files without restarting
* or click the `[reload cfg]` button in the control-panel when logged in as admin
* or click the `[reload cfg]` button in the control-panel if the user has `a`/admin in any volume
* changes to the `[global]` config section require a restart to take effect
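for example (the pkill pattern is just one way of targeting the process; adjust it to your setup):
```bash
# start copyparty with a config file
python copyparty-sfx.py -c some.conf

# after editing some.conf, reload accounts/volumes without a restart
pkill -USR1 -f copyparty-sfx.py   # same effect as: systemctl reload copyparty
```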
@@ -758,6 +876,13 @@ an FTP server can be started using `--ftp 3921`, and/or `--ftps` for explicit T
* some older software (filezilla on debian-stable) cannot passive-mode with TLS
* login with any username + your password, or put your password in the username field
some recommended FTP / FTPS clients; `wark` = example password:
* https://winscp.net/eng/download.php
* https://filezilla-project.org/ struggles a bit with ftps in active-mode, but is fine otherwise
* https://rclone.org/ does FTPS with `tls=false explicit_tls=true`
* `lftp -u k,wark -p 3921 127.0.0.1 -e ls`
* `lftp -u k,wark -p 3990 127.0.0.1 -e 'set ssl:verify-certificate no; ls'`
## webdav server
@@ -799,15 +924,13 @@ unsafe, slow, not recommended for wan, enable with `--smb` for read-only or `--
click the [connect](http://127.0.0.1:3923/?hc) button in the control-panel to see connection instructions for windows, linux, macos
dependencies: `python3 -m pip install --user -U impacket==0.10.0`
dependencies: `python3 -m pip install --user -U impacket==0.11.0`
* newer versions of impacket will hopefully work just fine but there is monkeypatching so maybe not
some **BIG WARNINGS** specific to SMB/CIFS, in decreasing importance:
* not entirely confident that read-only is read-only
* the smb backend is not fully integrated with vfs, meaning there could be security issues (path traversal). Please use `--smb-port` (see below) and [prisonparty](./bin/prisonparty.sh)
* account passwords work per-volume as expected, but account permissions are coalesced; all accounts have read-access to all volumes, and if a single account has write-access to some volume then all other accounts also do
* if no accounts have write-access to a specific volume, or if `--smbw` is not set, then writing to that volume from smb *should* be impossible
* will be fixed once [impacket v0.11.0](https://github.com/SecureAuthCorp/impacket/commit/d923c00f75d54b972bca573a211a82f09b55261a) is released
* account passwords work per-volume as expected, and so does account permissions (read/write/move/delete), but `--smbw` must be given to allow write-access from smb
* [shadowing](#shadowing) probably works as expected but no guarantees
and some minor issues,
@@ -818,7 +941,7 @@ and some minor issues,
* win10 onwards does not allow connecting anonymously / without accounts
* on windows, creating a new file through rightclick --> new --> textfile throws an error due to impacket limitations -- hit OK and F5 to get your file
* python3 only
* slow
* slow (the builtin webdav support in windows is 5x faster, and rclone-webdav is 30x faster)
known client bugs:
* on win7 only, `--smb1` is much faster than smb2 (default) because it keeps rescanning folders on smb2
@@ -833,6 +956,16 @@ authenticate with one of the following:
* username `$password`, password `k`
## browser ux
tweaking the ui
* set default sort order globally with `--sort` or per-volume with the `sort` volflag; specify one or more comma-separated columns to sort by, and prefix the column name with `-` for reverse sort
* the column names you can use are visible as tooltips when hovering over the column headers in the directory listing, for example `href ext sz ts tags/.up_at tags/Circle tags/.tn tags/Artist tags/Title`
* to sort in music order (album, track, artist, title) with filename as fallback, you could `--sort tags/Circle,tags/.tn,tags/Artist,tags/Title,href`
* to sort by upload date, first enable showing the upload date in the listing with `-e2d -mte +.up_at` and then `--sort tags/.up_at`
## file indexing
enables dedup and music search ++
@@ -852,14 +985,14 @@ through arguments:
* `--xlink` enables deduplication across volumes
the same arguments can be set as volflags, in addition to `d2d`, `d2ds`, `d2t`, `d2ts`, `d2v` for disabling:
* `-v ~/music::r:c,e2dsa,e2tsr` does a full reindex of everything on startup
* `-v ~/music::r:c,e2ds,e2tsr` does a full reindex of everything on startup
* `-v ~/music::r:c,d2d` disables **all** indexing, even if any `-e2*` are on
* `-v ~/music::r:c,d2t` disables all `-e2t*` (tags), does not affect `-e2d*`
* `-v ~/music::r:c,d2ds` disables on-boot scans; only index new uploads
* `-v ~/music::r:c,d2ts` same except only affecting tags
note:
* the parser can finally handle `c,e2dsa,e2tsr` so you no longer have to `c,e2dsa:c,e2tsr`
* upload-times can be displayed in the file listing by enabling the `.up_at` metadata key, either globally with `-e2d -mte +.up_at` or per-volume with volflags `e2d,mte=+.up_at` (will have a ~17% performance impact on directory listings)
* `e2tsr` is probably always overkill, since `e2ds`/`e2dsa` would pick up any file modifications and `e2ts` would then reindex those, unless there is a new copyparty version with new parsers and the release note says otherwise
* the rescan button in the admin panel has no effect unless the volume has `-e2ds` or higher
* deduplication is possible on windows if you run copyparty as administrator (not saying you should!)
@@ -881,7 +1014,11 @@ avoid traversing into other filesystems using `--xdev` / volflag `:c,xdev`, ski
and/or you can `--xvol` / `:c,xvol` to ignore all symlinks leaving the volume's top directory, but still allow bind-mounts pointing elsewhere
**NB: only affects the indexer** -- users can still access anything inside a volume, unless shadowed by another volume
* symlinks are permitted with `xvol` if they point into another volume where the user has the same level of access
these options will reduce performance; unlikely worst-case estimates are 14% reduction for directory listings, 35% for download-as-tar
as of copyparty v1.7.0 these options also prevent file access at runtime -- in previous versions it was just hints for the indexer
### periodic rescan
@@ -898,6 +1035,8 @@ set upload rules using volflags, some examples:
* `:c,sz=1k-3m` sets allowed filesize between 1 KiB and 3 MiB inclusive (suffixes: `b`, `k`, `m`, `g`)
* `:c,df=4g` block uploads if there would be less than 4 GiB free disk space afterwards
* `:c,vmaxb=1g` block uploads if total volume size would exceed 1 GiB afterwards
* `:c,vmaxn=4k` block uploads if volume would contain more than 4096 files afterwards
* `:c,nosub` disallow uploading into subdirectories; goes well with `rotn` and `rotf`:
* `:c,rotn=1000,2` moves uploads into subfolders, up to 1000 files in each folder before making a new one, two levels deep (must be at least 1)
* `:c,rotf=%Y/%m/%d/%H` enforces files to be uploaded into a structure of subfolders according to that date format
@@ -911,6 +1050,9 @@ you can also set transaction limits which apply per-IP and per-volume, but these
* `:c,maxn=250,3600` allows 250 files over 1 hour from each IP (tracked per-volume)
* `:c,maxb=1g,300` allows 1 GiB total over 5 minutes from each IP (tracked per-volume)
notes:
* `vmaxb` and `vmaxn` require either the `e2ds` volflag or `-e2dsa` global-option (see the sketch below)
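a combined sketch, assuming an anonymous write-only inbox at `/inc` (paths and limits are just examples):
```bash
# filesize between 1 KiB and 3 MiB, keep 4 GiB free, cap the volume at 1 GiB, no subfolders;
# -e2dsa enables indexing so vmaxb/vmaxn can track the volume size
python copyparty-sfx.py -e2dsa -v srv/inc:inc:w:c,sz=1k-3m:c,df=4g:c,vmaxb=1g:c,nosub
```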
## compress uploads
@@ -1041,6 +1183,13 @@ note that this is way more complicated than the new [event hooks](#event-hooks)
note that it will occupy the parsing threads, so fork anything expensive (or set `kn` to have copyparty fork it for you) -- otoh if you want to intentionally queue/singlethread you can combine it with `--mtag-mt 1`
## handlers
redefine behavior with plugins ([examples](./bin/handlers/))
replace 404 and 403 errors with something completely different (that's it for now)
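for example, loading the bundled handlers (shown further down in this diff) either globally or per-volume:
```bash
# custom 404 message for every volume
python copyparty-sfx.py --on404 bin/handlers/sorry.py

# or only for one volume, using the volflag form
python copyparty-sfx.py -v srv/pub:pub:r:c,on404=bin/handlers/nooo.py
```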
## hiding from google
tell search engines you dont wanna be indexed, either using the good old [robots.txt](https://www.robotstxt.org/robotstxt.html) or through copyparty settings:
@@ -1077,7 +1226,33 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
## complete examples
* read-only music server
* see [running on windows](./docs/examples/windows.md) for a fancy windows setup
* or use any of the examples below, just replace `python copyparty-sfx.py` with `copyparty.exe` if you're using the exe edition
* allow anyone to download or upload files into the current folder:
`python copyparty-sfx.py`
* enable searching and music indexing with `-e2dsa -e2ts`
* start an FTP server on port 3921 with `--ftp 3921`
* announce it on your LAN with `-z` so it appears in windows/Linux file managers
* anyone can upload, but nobody can see any files (even the uploader):
`python copyparty-sfx.py -e2dsa -v .::w`
* block uploads if there's less than 4 GiB free disk space with `--df 4`
* show a popup on new uploads with `--xau bin/hooks/notify.py`
* anyone can upload, and receive "secret" links for each upload they do:
`python copyparty-sfx.py -e2dsa -v .::wG:c,fk=8`
* anyone can browse, only `kevin` (password `okgo`) can upload/move/delete files:
`python copyparty-sfx.py -e2dsa -a kevin:okgo -v .::r:rwmd,kevin`
* read-only music server:
`python copyparty-sfx.py -v /mnt/nas/music:/music:r -e2dsa -e2ts --no-robots --force-js --theme 2`
* ...with bpm and key scanning
@@ -1092,19 +1267,194 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
## reverse-proxy
running copyparty next to other websites hosted on an existing webserver such as nginx or apache
running copyparty next to other websites hosted on an existing webserver such as nginx, caddy, or apache
you can either:
* give copyparty its own domain or subdomain (recommended)
* or do location-based proxying, using `--rp-loc=/stuff` to tell copyparty where it is mounted -- has a slight performance cost and higher chance of bugs
* if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below
some reverse proxies (such as [Caddy](https://caddyserver.com/)) can automatically obtain a valid https/tls certificate for you, and some support HTTP/2 and QUIC which could be a nice speed boost
* **warning:** nginx-QUIC is still experimental and can make uploads much slower, so HTTP/2 is recommended for now
example webserver configs:
* [nginx config](contrib/nginx/copyparty.conf) -- entire domain/subdomain
* [apache2 config](contrib/apache/copyparty.conf) -- location-based
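a minimal sketch of the location-based option mentioned above, assuming the webserver forwards `/stuff` to copyparty on localhost:
```bash
# bind to localhost only and tell copyparty it lives under /stuff
python copyparty-sfx.py -i 127.0.0.1 --rp-loc=/stuff
```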
## prometheus
metrics/stats can be enabled at URL `/.cpr/metrics` for grafana / prometheus / etc (openmetrics 1.0.0)
must be enabled with `--stats` since it reduces startup time a tiny bit, and you probably want `-e2dsa` too
the endpoint is only accessible by `admin` accounts, meaning the `a` in `rwmda` in the following example commandline: `python3 -m copyparty -a ed:wark -v /mnt/nas::rwmda,ed --stats -e2dsa`
follow a guide for setting up `node_exporter` except have it read from copyparty instead; example `/etc/prometheus/prometheus.yml` below
```yaml
scrape_configs:
- job_name: copyparty
metrics_path: /.cpr/metrics
basic_auth:
password: wark
static_configs:
- targets: ['192.168.123.1:3923']
```
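to sanity-check the endpoint before pointing prometheus at it (same target and password as the example above):
```bash
curl -s 'http://192.168.123.1:3923/.cpr/metrics?pw=wark' | head
```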
currently the following metrics are available,
* `cpp_uptime_seconds`
* `cpp_bans` number of banned IPs
and these are available per-volume only:
* `cpp_disk_size_bytes` total HDD size
* `cpp_disk_free_bytes` free HDD space
and these are per-volume and `total`:
* `cpp_vol_bytes` size of all files in volume
* `cpp_vol_files` number of files
* `cpp_dupe_bytes` disk space presumably saved by deduplication
* `cpp_dupe_files` number of dupe files
* `cpp_unf_bytes` currently unfinished / incoming uploads
some of the metrics have additional requirements to function correctly,
* `cpp_vol_*` requires either the `e2ds` volflag or `-e2dsa` global-option
the following options are available to disable some of the metrics:
* `--nos-hdd` disables `cpp_disk_*` which can prevent spinning up HDDs
* `--nos-vol` disables `cpp_vol_*` which reduces server startup time
* `--nos-dup` disables `cpp_dupe_*` which reduces the server load caused by prometheus queries
* `--nos-unf` disables `cpp_unf_*` for no particular purpose
# packages
the party might be closer than you think
## arch package
now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
## fedora package
now [available on copr-pypi](https://copr.fedorainfracloud.org/coprs/g/copr/PyPI/) , maintained autonomously -- [track record](https://copr.fedorainfracloud.org/coprs/g/copr/PyPI/package/python-copyparty/) seems OK
```bash
dnf copr enable @copr/PyPI
dnf install python3-copyparty # just a minimal install, or...
dnf install python3-{copyparty,pillow,argon2-cffi,pyftpdlib,pyOpenSSL} ffmpeg-free # with recommended deps
```
this *may* also work on RHEL but [I'm not paying IBM to verify that](https://www.jeffgeerling.com/blog/2023/dear-red-hat-are-you-dumb)
## nix package
`nix profile install github:9001/copyparty`
requires a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of nix
some recommended dependencies are enabled by default; [override the package](https://github.com/9001/copyparty/blob/hovudstraum/contrib/package/nix/copyparty/default.nix#L3-L22) if you want to add/remove some features/deps
`ffmpeg-full` was chosen over `ffmpeg-headless` mainly because we need `withWebp` (and `withOpenmpt` is also nice) and being able to use a cached build felt more important than optimizing for size at the time -- PRs welcome if you disagree 👍
## nixos module
for this setup, you will need a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of NixOS.
```nix
{
# add copyparty flake to your inputs
inputs.copyparty.url = "github:9001/copyparty";
# ensure that copyparty is an allowed argument to the outputs function
outputs = { self, nixpkgs, copyparty }: {
nixosConfigurations.yourHostName = nixpkgs.lib.nixosSystem {
modules = [
# load the copyparty NixOS module
copyparty.nixosModules.default
({ pkgs, ... }: {
# add the copyparty overlay to expose the package to the module
nixpkgs.overlays = [ copyparty.overlays.default ];
# (optional) install the package globally
environment.systemPackages = [ pkgs.copyparty ];
# configure the copyparty module
services.copyparty.enable = true;
})
];
};
};
}
```
copyparty on NixOS is configured via `services.copyparty` options, for example:
```nix
services.copyparty = {
enable = true;
# directly maps to values in the [global] section of the copyparty config.
# see `copyparty --help` for available options
settings = {
i = "0.0.0.0";
# use lists to set multiple values
p = [ 3210 3211 ];
# use booleans to set binary flags
no-reload = true;
# using 'false' will do nothing and omit the value when generating a config
ignored-flag = false;
};
# create users
accounts = {
# specify the account name as the key
ed = {
# provide the path to a file containing the password, keeping it out of /nix/store
# must be readable by the copyparty service user
passwordFile = "/run/keys/copyparty/ed_password";
};
# or do both in one go
k.passwordFile = "/run/keys/copyparty/k_password";
};
# create a volume
volumes = {
# create a volume at "/" (the webroot), which will
"/" = {
# share the contents of "/srv/copyparty"
path = "/srv/copyparty";
# see `copyparty --help-accounts` for available options
access = {
# everyone gets read-access, but
r = "*";
# users "ed" and "k" get read-write
rw = [ "ed" "k" ];
};
# see `copyparty --help-flags` for available options
flags = {
# "fk" enables filekeys (necessary for upget permission) (4 chars long)
fk = 4;
# scan for new files every 60sec
scan = 60;
# volflag "e2d" enables the uploads database
e2d = true;
# "d2t" disables multimedia parsers (in case the uploads are malicious)
d2t = true;
# skips hashing file contents if path matches *.iso
nohash = "\.iso$";
};
};
};
# you may increase the open file limit for the process
openFilesLimit = 8192;
};
```
the passwordFile at /run/keys/copyparty/ could for example be generated by [agenix](https://github.com/ryantm/agenix), or you could just dump it in the nix store instead if that's acceptable
# browser support
TLDR: yes
@@ -1175,10 +1525,10 @@ interact with copyparty using non-browser clients
* `(printf 'PUT /junk?pw=wark HTTP/1.1\r\n\r\n'; cat movie.mkv) | nc 127.0.0.1 3923`
* `(printf 'PUT / HTTP/1.1\r\n\r\n'; cat movie.mkv) >/dev/tcp/127.0.0.1/3923`
* python: [up2k.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) is a command-line up2k client [(webm)](https://ocv.me/stuff/u2cli.webm)
* file uploads, file-search, folder sync, autoresume of aborted/broken uploads
* can be downloaded from copyparty: controlpanel -> connect -> [up2k.py](http://127.0.0.1:3923/.cpr/a/up2k.py)
* see [./bin/README.md#up2kpy](bin/README.md#up2kpy)
* python: [u2c.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py) is a command-line up2k client [(webm)](https://ocv.me/stuff/u2cli.webm)
* file uploads, file-search, [folder sync](#folder-sync), autoresume of aborted/broken uploads
* can be downloaded from copyparty: controlpanel -> connect -> [u2c.py](http://127.0.0.1:3923/.cpr/a/u2c.py)
* see [./bin/README.md#u2cpy](bin/README.md#u2cpy)
* FUSE: mount a copyparty server as a local filesystem
* cross-platform python client available in [./bin/](bin/)
@@ -1197,17 +1547,27 @@ you can provide passwords using header `PW: hunter2`, cookie `cppwd=hunter2`, ur
NOTE: curl will not send the original filename if you use `-T` combined with url-params! Also, make sure to always leave a trailing slash in URLs unless you want to override the filename
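a minimal sketch using the `PW` header and a trailing slash so the original filename is kept:
```bash
curl -T movie.mkv -H 'PW: hunter2' http://127.0.0.1:3923/inc/
```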
## folder sync
sync folders to/from copyparty
the commandline uploader [u2c.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy) with `--dr` is the best way to sync a folder to copyparty; verifies checksums and does files in parallel, and deletes unexpected files on the server after upload has finished which makes file-renames really cheap (it'll rename serverside and skip uploading)
alternatively there is [rclone](./docs/rclone.md) which allows for bidirectional sync and is *way* more flexible (stream files straight from sftp/s3/gcs to copyparty, ...), although there is no integrity check and it won't work with files over 100 MiB if copyparty is behind cloudflare
* starting from rclone v1.63 (currently [in beta](https://beta.rclone.org/?filter=latest)), rclone will also be faster than u2c.py
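a rough sketch of the u2c.py approach; the argument order shown here (server URL, then local folder) is an assumption, so check `u2c.py --help` for the exact syntax:
```bash
# mirror ~/music to the server; --dr deletes server files that no longer exist locally
python3 u2c.py --dr http://127.0.0.1:3923/music/ ~/music/
```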
## mount as drive
a remote copyparty server as a local filesystem; go to the control-panel and click `connect` to see a list of commands to do that
alternatively, here are some other options roughly sorted by speed (unreproducible benchmark), best first:
* [rclone-http](./docs/rclone.md) (25s), read-only
* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE ([v1.63-beta](https://beta.rclone.org/?filter=latest))
* [rclone-http](./docs/rclone.md) (26s), read-only
* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
* [rclone-ftp](./docs/rclone.md) (47s), read/WRITE
* [rclone-webdav](./docs/rclone.md) (51s), read/WRITE
* copyparty-1.5.0's webdav server is faster than rclone-1.60.0 (69s)
* [partyfuse.py](./bin/#partyfusepy) (71s), read-only
* davfs2 (103s), read/WRITE, *very fast* on small files
* [win10-webdav](#webdav-server) (138s), read/WRITE
* [win10-smb2](#smb-server) (387s), read/WRITE
@@ -1215,6 +1575,27 @@ alternatively, some alternatives roughly sorted by speed (unreproducible benchma
most clients will fail to mount the root of a copyparty server unless there is a root volume (so you get the admin-panel instead of a browser when accessing it) -- in that case, mount a specific volume instead
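as a sketch of the rclone-webdav option, assuming you already created an rclone remote named `cpp` pointing at the copyparty server (see [./docs/rclone.md](./docs/rclone.md)):
```bash
mkdir -p ~/mnt/copyparty
rclone mount cpp: ~/mnt/copyparty --vfs-cache-mode writes
```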
# android app
upload to copyparty with one tap
<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet
if you want to run the copyparty server on your android device, see [install on android](#install-on-android)
# iOS shortcuts
there is no iPhone app, but the following shortcuts are almost as good:
* [upload to copyparty](https://www.icloud.com/shortcuts/41e98dd985cb4d3bb433222bc1e9e770) ([offline](https://github.com/9001/copyparty/raw/hovudstraum/contrib/ios/upload-to-copyparty.shortcut)) ([png](https://user-images.githubusercontent.com/241032/226118053-78623554-b0ed-482e-98e4-6d57ada58ea4.png)) based on the [original](https://www.icloud.com/shortcuts/ab415d5b4de3467b9ce6f151b439a5d7) by [Daedren](https://github.com/Daedren) (thx!)
* can strip exif, upload files, pics, vids, links, clipboard
* can download links and rehost the target file on copyparty (see first comment inside the shortcut)
* pics become lowres if you share from gallery to shortcut, so better to launch the shortcut and pick stuff from there
# performance
defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
@@ -1222,15 +1603,17 @@ defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
below are some tweaks roughly ordered by usefulness:
* `-q` disables logging and can help a bunch, even when combined with `-lo` to redirect logs to file
* `--http-only` or `--https-only` (unless you want to support both protocols) will reduce the delay before a new connection is established
* `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
* `-j` enables multiprocessing (actual multithreading) and can make copyparty perform better in cpu-intensive workloads, for example:
* huge amount of short-lived connections
* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
* lots of connections (many users or heavy clients)
* simultaneous downloads and uploads saturating a 20gbps connection
* if `-e2d` is enabled, `-j2` gives 4x performance for directory listings; `-j4` gives 16x
...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
* using [pypy](https://www.pypy.org/) instead of [cpython](https://www.python.org/) *can* be 70% faster for some workloads, but slower for many others
* and pypy can sometimes crash on startup with `-j0` (TODO make issue)
## client-side
@@ -1240,7 +1623,7 @@ when uploading files,
* chrome is recommended, at least compared to firefox:
* up to 90% faster when hashing, especially on SSDs
* up to 40% faster when uploading over extremely fast internets
* but [up2k.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) can be 40% faster than chrome again
* but [u2c.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py) can be 40% faster than chrome again
* if you're cpu-bottlenecked, or the browser is maxing a cpu core:
* up to 30% faster uploads if you hide the upload status list by switching away from the `[🚀]` up2k ui-tab (or closing it)
@@ -1251,10 +1634,14 @@ when uploading files,
# security
there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` for all important updates (for lack of better ideas)
some notes on hardening
* set `--rproxy 0` if your copyparty is directly facing the internet (not through a reverse-proxy)
* cors doesn't work right otherwise
* if you allow anonymous uploads or otherwise don't trust the contents of a volume, you can prevent XSS with volflag `nohtml`
* this returns html documents as plaintext, and also disables markdown rendering
safety profiles:
@@ -1268,9 +1655,7 @@ safety profiles:
* `--unpost 0`, `--no-del`, `--no-mv` disables all move/delete support
* `--hardlink` creates hardlinks instead of symlinks when deduplicating uploads, which is less maintenance
* however note if you edit one file it will also affect the other copies
* `--vague-401` returns a "404 not found" instead of "401 unauthorized" which is a common enterprise meme
* `--ban-404=50,60,1440` ban client for 1440min (24h) if they hit 50 404's in 60min
* **NB:** will ban anyone who enables up2k turbo
* `--vague-403` returns a "404 not found" instead of "403 forbidden" which is a common enterprise meme
* `--nih` removes the server hostname from directory listings
* option `-sss` is a shortcut for the above plus:
@@ -1282,9 +1667,9 @@ safety profiles:
other misc notes:
* you can disable directory listings by giving permission `g` instead of `r`, only accepting direct URLs to files
* combine this with volflag `c,fk` to generate filekeys (per-file accesskeys); users which have full read-access will then see URLs with `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
* the default filekey entropy is fairly small so give `--fk-salt` around 30 characters if you want filekeys longer than 16 chars
* permissions `wG` lets users upload files and receive their own filekeys, still without being able to see other uploads
* you may want [filekeys](#filekeys) to prevent filename bruteforcing
* permission `h` instead of `r` makes copyparty behave like a traditional webserver with directory listing/index disabled, returning index.html instead
* compatibility with filekeys: index.html itself can be retrieved without the correct filekey, but all other files are protected
## gotchas
@@ -1292,10 +1677,12 @@ other misc notes:
behavior that might be unexpected
* users without read-access to a folder can still see the `.prologue.html` / `.epilogue.html` / `README.md` contents, for the purpose of showing a description on how to use the uploader for example
* users can submit `<script>`s which autorun for other visitors in a few ways;
* users can submit `<script>`s which autorun (in a sandbox) for other visitors in a few ways;
* uploading a `README.md` -- avoid with `--no-readme`
* renaming `some.html` to `.epilogue.html` -- avoid with either `--no-logues` or `--no-dot-ren`
* the directory-listing embed is sandboxed (so any malicious scripts can't do any damage) but the markdown editor is not
* the directory-listing embed is sandboxed (so any malicious scripts can't do any damage) but the markdown editor is not 100% safe, see below
* markdown documents can contain html and `<script>`s; attempts are made to prevent scripts from executing (unless `-emp` is specified) but this is not 100% bulletproof, so setting the `nohtml` volflag is still the safest choice
* or eliminate the problem entirely by only giving write-access to trustworthy people :^)
## cors
@@ -1310,6 +1697,40 @@ by default, except for `GET` and `HEAD` operations, all requests must either:
cors can be configured with `--acao` and `--acam`, or the protections entirely disabled with `--allow-csrf`
## filekeys
prevent filename bruteforcing
volflag `c,fk` generates filekeys (per-file accesskeys) for all files; users which have full read-access (permission `r`) will then see URLs with the correct filekey `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
by default, filekeys are generated based on salt (`--fk-salt`) + filesystem-path + file-size + inode (if not windows); add volflag `fka` to generate slightly weaker filekeys which will not be invalidated if the file is edited (only salt + path)
permissions `wG` (write + upget) lets users upload files and receive their own filekeys, still without being able to see other uploads
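so a `g` user would fetch a file roughly like this (the key below is a made-up placeholder; `r`/`G` users see the real one in the listing):
```bash
# without the correct ?k= this returns a 404
curl 'http://127.0.0.1:3923/i/song.flac?k=abcd'
```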
## password hashing
you can hash passwords before putting them into config files / providing them as arguments; see `--help-pwhash` for all the details
`--ah-alg argon2` enables it, and if you have any plaintext passwords then it'll print the hashed versions on startup so you can replace them
optionally also specify `--ah-cli` to enter an interactive mode where it will hash passwords without ever writing the plaintext ones to disk
the default configs take about 0.4 sec and 256 MiB RAM to process a new password on a decent laptop
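for example:
```bash
# prompt for passwords interactively and print the argon2 hashes, never writing plaintext to disk
python copyparty-sfx.py --ah-alg argon2 --ah-cli
```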
## https
both HTTP and HTTPS are accepted by default, but letting a [reverse proxy](#reverse-proxy) handle the https/tls/ssl would be better (probably more secure by default)
copyparty doesn't speak HTTP/2 or QUIC, so using a reverse proxy would solve that as well
if [cfssl](https://github.com/cloudflare/cfssl/releases/latest) is installed, copyparty will automatically create a CA and server-cert on startup
* the certs are written to `--crt-dir` for distribution, see `--help` for the other `--crt` options
* this will be a self-signed certificate so you must install your `ca.pem` into all your browsers/devices
* if you want to avoid the hassle of distributing certs manually, please consider using a reverse proxy
# recovering from crashes
## client crashes
@@ -1332,7 +1753,7 @@ however you can hit `F12` in the up2k tab and use the devtools to see how far yo
# HTTP API
see [devnotes](#./docs/devnotes.md#http-api)
see [devnotes](./docs/devnotes.md#http-api)
# dependencies
@@ -1345,6 +1766,8 @@ mandatory deps:
install these to enable bonus features
enable hashed passwords in config: `argon2-cffi`
enable ftp-server:
* for just plaintext FTP, `pyftpdlib` (is built into the SFX)
* with TLS encryption, `pyftpdlib pyopenssl`
@@ -1361,7 +1784,7 @@ enable [thumbnails](#thumbnails) of...
* **JPEG XL pictures:** `pyvips` or `ffmpeg`
enable [smb](#smb-server) support (**not** recommended):
* `impacket==0.10.0`
* `impacket==0.11.0`
`pyvips` gives higher quality thumbnails than `Pillow` and is 320% faster, using 270% more ram: `sudo apt install libvips42 && python3 -m pip install --user -U pyvips`
@@ -1377,7 +1800,7 @@ these are standalone programs and will never be imported / evaluated by copypart
the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](#./docs/devnotes.md#sfx-repack)
you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](./docs/devnotes.md#sfx-repack)
## copyparty.exe
@@ -1391,10 +1814,11 @@ can be convenient on machines where installing python is problematic, however is
* [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) runs on win8 or newer, was compiled on win10, does thumbnails + media tags, and is *currently* safe to use, but any future python/expat/pillow CVEs can only be remedied by downloading a newer version of the exe
* on win8 it needs [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145), on win10 it just works
* some antivirus may freak out (false-positive), possibly [Avast, AVG, and McAfee](https://www.virustotal.com/gui/file/52391a1e9842cf70ad243ef83844d46d29c0044d101ee0138fcdd3c8de2237d6/detection)
* dangerous: [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is compatible with [windows7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png), which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and should never be exposed to the internet (LAN is fine)
* dangerous and deprecated: [copyparty-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
* dangerous and deprecated: [copyparty-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.8.7/copyparty-winpe64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) instead relies on your system python which gives better performance and will stay safe as long as you keep your python install up-to-date

bin/README.md

@@ -1,4 +1,4 @@
# [`up2k.py`](up2k.py)
# [`u2c.py`](u2c.py)
* command-line up2k client [(webm)](https://ocv.me/stuff/u2cli.webm)
* file uploads, file-search, autoresume of aborted/broken uploads
* sync local folder to server

bin/handlers/README.md Normal file

@@ -0,0 +1,35 @@
replace the standard 404 / 403 responses with plugins
# usage
load plugins either globally with `--on404 ~/dev/copyparty/bin/handlers/sorry.py` or for a specific volume with `:c,on404=~/handlers/sorry.py`
# api
each plugin must define a `main()` which takes 3 arguments;
* `cli` is an instance of [copyparty/httpcli.py](https://github.com/9001/copyparty/blob/hovudstraum/copyparty/httpcli.py) (the monstrosity itself)
* `vn` is the VFS which overlaps with the requested URL, and
* `rem` is the URL remainder below the VFS mountpoint
* so `vn.vpath + rem` == `cli.vpath` == original request
# examples
## on404
* [sorry.py](sorry.py) replies with a custom message instead of the usual 404
* [nooo.py](nooo.py) replies with an endless noooooooooooooo
* [never404.py](never404.py) 100% guarantee that 404 will never be a thing again as it automatically creates dummy files whenever necessary
* [caching-proxy.py](caching-proxy.py) transforms copyparty into a squid/varnish knockoff
## on403
* [ip-ok.py](ip-ok.py) disables security checks if client-ip is 1.2.3.4
# notes
* on403 only works for trivial stuff (basic http access) since I haven't been able to think of any good usecases for it (was just easy to add while doing on404)

bin/handlers/caching-proxy.py Executable file

@@ -0,0 +1,36 @@
# assume each requested file exists on another webserver and
# download + mirror them as they're requested
# (basically pretend we're warnish)
import os
import requests
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from copyparty.httpcli import HttpCli
def main(cli: "HttpCli", vn, rem):
url = "https://mirrors.edge.kernel.org/alpine/" + rem
abspath = os.path.join(vn.realpath, rem)
# sneaky trick to preserve a requests-session between downloads
# so it doesn't have to spend ages reopening https connections;
# luckily we can stash it inside the copyparty client session,
# name just has to be definitely unused so "hacapo_req_s" it is
req_s = getattr(cli.conn, "hacapo_req_s", None) or requests.Session()
setattr(cli.conn, "hacapo_req_s", req_s)
try:
os.makedirs(os.path.dirname(abspath), exist_ok=True)
with req_s.get(url, stream=True, timeout=69) as r:
r.raise_for_status()
with open(abspath, "wb", 64 * 1024) as f:
for buf in r.iter_content(chunk_size=64 * 1024):
f.write(buf)
except:
os.unlink(abspath)
return "false"
return "retry"

bin/handlers/ip-ok.py Executable file

@@ -0,0 +1,6 @@
# disable permission checks and allow access if client-ip is 1.2.3.4
def main(cli, vn, rem):
if cli.ip == "1.2.3.4":
return "allow"

bin/handlers/never404.py Executable file

@@ -0,0 +1,11 @@
# create a dummy file and let copyparty return it
def main(cli, vn, rem):
print("hello", cli.ip)
abspath = vn.canonical(rem)
with open(abspath, "wb") as f:
f.write(b"404? not on MY watch!")
return "retry"

bin/handlers/nooo.py Executable file

@@ -0,0 +1,16 @@
# reply with an endless "noooooooooooooooooooooooo"
def say_no():
yield b"n"
while True:
yield b"o" * 4096
def main(cli, vn, rem):
cli.send_headers(None, 404, "text/plain")
for chunk in say_no():
cli.s.sendall(chunk)
return "false"

bin/handlers/sorry.py Executable file

@@ -0,0 +1,7 @@
# sends a custom response instead of the usual 404
def main(cli, vn, rem):
msg = f"sorry {cli.ip} but {cli.vpath} doesn't exist"
return str(cli.reply(msg.encode("utf-8"), 404, "text/plain"))

bin/hooks/README.md

@@ -10,6 +10,7 @@ run copyparty with `--help-hooks` for usage details / hook type explanations (xb
# after upload
* [notify.py](notify.py) shows a desktop notification ([example](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png))
* [notify2.py](notify2.py) uses the json API to show more context
* [image-noexif.py](image-noexif.py) removes image exif by overwriting / directly editing the uploaded file
* [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
* [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable

bin/hooks/discord-announce.py

@@ -13,9 +13,15 @@ example usage as global config:
--xau f,t5,j,bin/hooks/discord-announce.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xau=f,t5,j,bin/hooks/discord-announce.py
-v srv/inc:inc:r:rw,ed:c,xau=f,t5,j,bin/hooks/discord-announce.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xbu = execute after upload
f = fork; don't wait for it to finish
t5 = timeout if it's still running after 5 sec
j = provide upload information as json; not just the filename
@@ -30,6 +36,7 @@ then use this to design your message: https://discohook.org/
def main():
WEBHOOK = "https://discord.com/api/webhooks/1234/base64"
WEBHOOK = "https://discord.com/api/webhooks/1066830390280597718/M1TDD110hQA-meRLMRhdurych8iyG35LDoI1YhzbrjGP--BXNZodZFczNVwK4Ce7Yme5"
# read info from copyparty
inf = json.loads(sys.argv[1])

bin/hooks/image-noexif.py Executable file

@@ -0,0 +1,72 @@
#!/usr/bin/env python3
import os
import sys
import subprocess as sp
_ = r"""
remove exif tags from uploaded images; the eventhook edition of
https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py
dependencies:
exiftool / perl-Image-ExifTool
being an upload hook, this will take effect after upload completion
but before copyparty has hashed/indexed the file, which means that
copyparty will never index the original file, so deduplication will
not work as expected... which is mostly OK but ehhh
note: modifies the file in-place, so don't set the `f` (fork) flag
example usages; either as global config (all volumes) or as volflag:
--xau bin/hooks/image-noexif.py
-v srv/inc:inc:r:rw,ed:c,xau=bin/hooks/image-noexif.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
explained:
share fs-path srv/inc at /inc (readable by all, read-write for user ed)
running this xau (execute-after-upload) plugin for all uploaded files
"""
# filetypes to process; ignores everything else
EXTS = ("jpg", "jpeg", "avif", "heif", "heic")
try:
from copyparty.util import fsenc
except:
def fsenc(p):
return p.encode("utf-8")
def main():
fp = sys.argv[1]
ext = fp.lower().split(".")[-1]
if ext not in EXTS:
return
cwd, fn = os.path.split(fp)
os.chdir(cwd)
f1 = fsenc(fn)
cmd = [
b"exiftool",
b"-exif:all=",
b"-iptc:all=",
b"-xmp:all=",
b"-P",
b"-overwrite_original",
b"--",
f1,
]
sp.check_output(cmd)
print("image-noexif: stripped")
if __name__ == "__main__":
try:
main()
except:
pass

bin/hooks/msg-log.py Executable file

@@ -0,0 +1,123 @@
#!/usr/bin/env python
# coding: utf-8
# vim:ts=4:sw=4:softtabstop=4:smarttab:expandtab
from __future__ import print_function, unicode_literals
import json
import os
import sys
import time
try:
from datetime import datetime, timezone
except:
from datetime import datetime
"""
use copyparty as a dumb messaging server / guestbook thing;
initially contributed by @clach04 in https://github.com/9001/copyparty/issues/35 (thanks!)
Sample usage:
python copyparty-sfx.py --xm j,bin/hooks/msg-log.py
Where:
xm = execute on message-to-server-log
j = provide message information as json; not just the text - this script REQUIRES json
t10 = timeout and kill download after 10 secs
"""
# output filename
FILENAME = os.environ.get("COPYPARTY_MESSAGE_FILENAME", "") or "README.md"
# set True to write in descending order (newest message at top of file);
# note that this becomes very slow/expensive as the file gets bigger
DESCENDING = True
# the message template; the following parameters are provided by copyparty and can be referenced below:
# 'ap' = absolute filesystem path where the message was posted
# 'vp' = virtual path (URL 'path') where the message was posted
# 'mt' = 'at' = unix-timestamp when the message was posted
# 'datetime' = ISO-8601 time when the message was posted
# 'sz' = message size in bytes
# 'host' = the server hostname which the user was accessing (URL 'host')
# 'user' = username (if logged in), otherwise '*'
# 'txt' = the message text itself
# (uncomment the print(msg_info) to see if additional information has been introduced by copyparty since this was written)
TEMPLATE = """
🕒 %(datetime)s, 👤 %(user)s @ %(ip)s
%(txt)s
"""
def write_ascending(filepath, msg_text):
with open(filepath, "a", encoding="utf-8", errors="replace") as outfile:
outfile.write(msg_text)
def write_descending(filepath, msg_text):
lockpath = filepath + ".lock"
got_it = False
for _ in range(16):
try:
os.mkdir(lockpath)
got_it = True
break
except:
time.sleep(0.1)
continue
if not got_it:
return sys.exit(1)
try:
oldpath = filepath + ".old"
os.rename(filepath, oldpath)
with open(oldpath, "r", encoding="utf-8", errors="replace") as infile, open(
filepath, "w", encoding="utf-8", errors="replace"
) as outfile:
outfile.write(msg_text)
while True:
buf = infile.read(4096)
if not buf:
break
outfile.write(buf)
finally:
try:
os.unlink(oldpath)
except:
pass
os.rmdir(lockpath)
def main(argv=None):
if argv is None:
argv = sys.argv
msg_info = json.loads(sys.argv[1])
# print(msg_info)
try:
dt = datetime.fromtimestamp(msg_info["at"], timezone.utc)
except:
dt = datetime.utcfromtimestamp(msg_info["at"])
msg_info["datetime"] = dt.strftime("%Y-%m-%d, %H:%M:%S")
msg_text = TEMPLATE % msg_info
filepath = os.path.join(msg_info["ap"], FILENAME)
if DESCENDING and os.path.exists(filepath):
write_descending(filepath, msg_text)
else:
write_ascending(filepath, msg_text)
print(msg_text)
if __name__ == "__main__":
main()

bin/hooks/notify.py

@@ -17,8 +17,12 @@ dependencies:
example usages; either as global config (all volumes) or as volflag:
--xau f,bin/hooks/notify.py
-v srv/inc:inc:c,xau=f,bin/hooks/notify.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^
-v srv/inc:inc:r:rw,ed:c,xau=f,bin/hooks/notify.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xau = execute after upload

bin/hooks/notify2.py

@@ -4,7 +4,7 @@ import json
import os
import sys
import subprocess as sp
from datetime import datetime
from datetime import datetime, timezone
from plyer import notification
@@ -15,9 +15,13 @@ and also supports --xm (notify on 📟 message)
example usages; either as global config (all volumes) or as volflag:
--xm f,j,bin/hooks/notify2.py
--xau f,j,bin/hooks/notify2.py
-v srv/inc:inc:c,xm=f,j,bin/hooks/notify2.py
-v srv/inc:inc:c,xau=f,j,bin/hooks/notify2.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-v srv/inc:inc:r:rw,ed:c,xm=f,j,bin/hooks/notify2.py
-v srv/inc:inc:r:rw,ed:c,xau=f,j,bin/hooks/notify2.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads / msgs with the params listed below)
parameters explained,
xau = execute after upload
@@ -39,7 +43,8 @@ def main():
fp = inf["ap"]
sz = humansize(inf["sz"])
dp, fn = os.path.split(fp)
mt = datetime.utcfromtimestamp(inf["mt"]).strftime("%Y-%m-%d %H:%M:%S")
dt = datetime.fromtimestamp(inf["mt"], timezone.utc)
mt = dt.strftime("%Y-%m-%d %H:%M:%S")
msg = f"{fn} ({sz})\n📁 {dp}"
title = "File received"

bin/hooks/reject-extension.py

@@ -10,7 +10,12 @@ example usage as global config:
--xbu c,bin/hooks/reject-extension.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xbu=c,bin/hooks/reject-extension.py
-v srv/inc:inc:r:rw,ed:c,xbu=c,bin/hooks/reject-extension.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xbu = execute before upload

bin/hooks/reject-mimetype.py

@@ -17,7 +17,12 @@ example usage as global config:
--xau c,bin/hooks/reject-mimetype.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xau=c,bin/hooks/reject-mimetype.py
-v srv/inc:inc:r:rw,ed:c,xau=c,bin/hooks/reject-mimetype.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xau = execute after upload

View File

@@ -15,9 +15,15 @@ example usage as global config:
--xm f,j,t3600,bin/hooks/wget.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xm=f,j,t3600,bin/hooks/wget.py
-v srv/inc:inc:r:rw,ed:c,xm=f,j,t3600,bin/hooks/wget.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all messages with the params listed below)
parameters explained,
xm = execute on message-to-server-log
f = fork so it doesn't block uploads
j = provide message information as json; not just the text
c3 = mute all output
@@ -31,6 +37,10 @@ def main():
if "://" not in url:
url = "https://" + url
proto = url.split("://")[0].lower()
if proto not in ("http", "https", "ftp", "ftps"):
raise Exception("bad proto {}".format(proto))
os.chdir(inf["ap"])
name = url.split("?")[0].split("/")[-1]
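a minimal xm hook along the same lines (hypothetical; with the j flag the payload is JSON in argv[1]; "ap" is the field wget.py reads right above, while "txt" for the message text is an assumption):

    #!/usr/bin/env python3
    import json
    import sys

    def main():
        inf = json.loads(sys.argv[1])
        folder = inf.get("ap", ".")  # folder the message was sent to
        msg = inf.get("txt", "")     # the message text (assumed field name)
        print("got message in {}: {}".format(folder, msg))

    if __name__ == "__main__":
        main()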

View File

@@ -3,7 +3,7 @@
import hashlib
import json
import sys
from datetime import datetime
from datetime import datetime, timezone
_ = r"""
@@ -18,7 +18,12 @@ example usage as global config:
--xiu i5,j,bin/hooks/xiu-sha.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xiu=i5,j,bin/hooks/xiu-sha.py
-v srv/inc:inc:r:rw,ed:c,xiu=i5,j,bin/hooks/xiu-sha.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on batches of uploads with the params listed below)
parameters explained,
xiu = execute after uploads...
@@ -38,8 +43,11 @@ except:
return p
UTC = timezone.utc
def humantime(ts):
return datetime.utcfromtimestamp(ts).strftime("%Y-%m-%d %H:%M:%S")
return datetime.fromtimestamp(ts, UTC).strftime("%Y-%m-%d %H:%M:%S")
def find_files_root(inf):
@@ -91,7 +99,7 @@ def main():
ret.append("# {} files, {} bytes total".format(len(inf), total_sz))
ret.append("")
ftime = datetime.utcnow().strftime("%Y-%m%d-%H%M%S.%f")
ftime = datetime.now(UTC).strftime("%Y-%m%d-%H%M%S.%f")
fp = "{}xfer-{}.sha512".format(inf[0]["ap"][:di], ftime)
with open(fsenc(fp), "wb") as f:
f.write("\n".join(ret).encode("utf-8", "replace"))

View File

@@ -15,7 +15,12 @@ example usage as global config:
--xiu i1,j,bin/hooks/xiu.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xiu=i1,j,bin/hooks/xiu.py
-v srv/inc:inc:r:rw,ed:c,xiu=i1,j,bin/hooks/xiu.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on batches of uploads with the params listed below)
parameters explained,
xiu = execute after uploads...

View File

@@ -24,6 +24,15 @@ these do not have any problematic dependencies at all:
* also available as an [event hook](../hooks/wget.py)
## dangerous plugins
plugins in this section should only be used with appropriate precautions:
* [very-bad-idea.py](./very-bad-idea.py) combined with [meadup.js](https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/meadup.js) converts copyparty into a janky yet extremely flexible chromecast clone
* also adds a virtual keyboard by @steinuil to the basic-upload tab for comfy couch crowd control
* anything uploaded through the [android app](https://github.com/9001/party-up) (files or links) is executed on the server, meaning anyone can infect your PC with malware... so protect this with a password and keep it on a LAN!
# dependencies
run [`install-deps.sh`](install-deps.sh) to build/install most dependencies required by these programs (supports windows/linux/macos)

View File

@@ -16,6 +16,10 @@ dep: ffmpeg
"""
# save beat timestamps to ".beats/filename.txt"
SAVE = False
def det(tf):
# fmt: off
sp.check_call([
@@ -23,12 +27,11 @@ def det(tf):
b"-nostdin",
b"-hide_banner",
b"-v", b"fatal",
b"-ss", b"13",
b"-y", b"-i", fsenc(sys.argv[1]),
b"-map", b"0:a:0",
b"-ac", b"1",
b"-ar", b"22050",
b"-t", b"300",
b"-t", b"360",
b"-f", b"f32le",
fsenc(tf)
])
@@ -47,10 +50,29 @@ def det(tf):
print(c["list"][0]["label"].split(" ")[0])
return
# throws if detection failed:
bpm = float(cl[-1]["timestamp"] - cl[1]["timestamp"])
bpm = round(60 * ((len(cl) - 1) / bpm), 2)
print(f"{bpm:.2f}")
# throws if detection failed:
beats = [float(x["timestamp"]) for x in cl]
bds = [b - a for a, b in zip(beats, beats[1:])]
bds.sort()
n0 = int(len(bds) * 0.2)
n1 = int(len(bds) * 0.75) + 1
bds = bds[n0:n1]
bpm = sum(bds)
bpm = round(60 * (len(bds) / bpm), 2)
print(f"{bpm:.2f}")
if SAVE:
fdir, fname = os.path.split(sys.argv[1])
bdir = os.path.join(fdir, ".beats")
try:
os.mkdir(fsenc(bdir))
except:
pass
fp = os.path.join(bdir, fname) + ".txt"
with open(fsenc(fp), "wb") as f:
txt = "\n".join([f"{x:.2f}" for x in beats])
f.write(txt.encode("utf-8"))
def main():
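the bpm estimate above is now a trimmed mean of the inter-beat intervals: sort the gaps, drop the shortest ~20% and longest ~25%, and average the rest, so a single missed or doubled beat no longer skews the result the way the old span-based division could. a toy run of the same arithmetic:

    beats = [0.0, 0.5, 1.0, 1.52, 2.0, 2.5, 3.9]  # one bogus gap at the end
    bds = sorted(b - a for a, b in zip(beats, beats[1:]))
    bds = bds[int(len(bds) * 0.2):int(len(bds) * 0.75) + 1]
    print(round(60 * len(bds) / sum(bds), 2))  # ~118.8 bpm despite the outlier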

View File

@@ -7,6 +7,7 @@ set -e
# linux/alpine: requires gcc g++ make cmake patchelf {python3,ffmpeg,fftw,libsndfile}-dev py3-{wheel,pip} py3-numpy{,-dev}
# linux/debian: requires libav{codec,device,filter,format,resample,util}-dev {libfftw3,python3,libsndfile1}-dev python3-{numpy,pip} vamp-{plugin-sdk,examples} patchelf cmake
# linux/fedora: requires gcc gcc-c++ make cmake patchelf {python3,ffmpeg,fftw,libsndfile}-devel python3-numpy vamp-plugin-sdk qm-vamp-plugins
# linux/arch: requires gcc make cmake patchelf python3 ffmpeg fftw libsndfile python-{numpy,wheel,pip,setuptools}
# win64: requires msys2-mingw64 environment
# macos: requires macports
#
@@ -227,15 +228,16 @@ install_vamp() {
cd "$td"
echo '#include <vamp-sdk/Plugin.h>' | g++ -x c++ -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
printf '\033[33mcould not find the vamp-sdk, building from source\033[0m\n'
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2588/vamp-plugin-sdk-2.9.0.tar.gz)
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2691/vamp-plugin-sdk-2.10.0.tar.gz)
sha512sum -c <(
echo "7ef7f837d19a08048b059e0da408373a7964ced452b290fae40b85d6d70ca9000bcfb3302cd0b4dc76cf2a848528456f78c1ce1ee0c402228d812bd347b6983b -"
) <vamp-plugin-sdk-2.9.0.tar.gz
tar -xf vamp-plugin-sdk-2.9.0.tar.gz
echo "153b7f2fa01b77c65ad393ca0689742d66421017fd5931d216caa0fcf6909355fff74706fabbc062a3a04588a619c9b515a1dae00f21a57afd97902a355c48ed -"
) <vamp-plugin-sdk-2.10.0.tar.gz
tar -xf vamp-plugin-sdk-2.10.0.tar.gz
rm -- *.tar.gz
ls -al
cd vamp-plugin-sdk-*
./configure --prefix=$HOME/pe/vamp-sdk
printf '%s\n' "int main(int argc, char **argv) { return 0; }" > host/vamp-simple-host.cpp
./configure --disable-programs --prefix=$HOME/pe/vamp-sdk
make -j1 install
}
@@ -250,8 +252,9 @@ install_vamp() {
rm -- *.tar.gz
cd beatroot-vamp-v1.0
[ -e ~/pe/vamp-sdk ] &&
sed -ri 's`^(CFLAGS :=.*)`\1 -I'$HOME'/pe/vamp-sdk/include`' Makefile.linux
make -f Makefile.linux -j4 LDFLAGS=-L$HOME/pe/vamp-sdk/lib
sed -ri 's`^(CFLAGS :=.*)`\1 -I'$HOME'/pe/vamp-sdk/include`' Makefile.linux ||
sed -ri 's`^(CFLAGS :=.*)`\1 -I/usr/include/vamp-sdk`' Makefile.linux
make -f Makefile.linux -j4 LDFLAGS="-L$HOME/pe/vamp-sdk/lib -L/usr/lib64"
# /home/ed/vamp /home/ed/.vamp /usr/local/lib/vamp
mkdir ~/vamp
cp -pv beatroot-vamp.* ~/vamp/

View File

@@ -1,6 +1,11 @@
#!/usr/bin/env python3
"""
WARNING -- DANGEROUS PLUGIN --
if someone is able to upload files to a copyparty which is
running this plugin, they can execute malware on your machine
so please keep this on a LAN and protect it with a password
use copyparty as a chromecast replacement:
* post a URL and it will open in the default browser
* upload a file and it will open in the default application
@@ -10,16 +15,17 @@ use copyparty as a chromecast replacement:
the android app makes it a breeze to post pics and links:
https://github.com/9001/party-up/releases
(iOS devices have to rely on the web-UI)
goes without saying, but this is HELLA DANGEROUS,
GIVES RCE TO ANYONE WHO HAVE UPLOAD PERMISSIONS
iOS devices can use the web-UI or the shortcut instead:
https://github.com/9001/copyparty#ios-shortcuts
example copyparty config to use this:
--urlform save,get -v.::w:c,e2d,e2t,mte=+a1:c,mtp=a1=ad,kn,c0,bin/mtag/very-bad-idea.py
example copyparty config to use this;
lets the user "kevin" with password "hunter2" use this plugin:
-a kevin:hunter2 --urlform save,get -v.::w,kevin:c,e2d,e2t,mte=+a1:c,mtp=a1=ad,kn,c0,bin/mtag/very-bad-idea.py
recommended deps:
apt install xdotool libnotify-bin
apt install xdotool libnotify-bin mpv
python3 -m pip install --user -U streamlink yt-dlp
https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/meadup.js
and you probably want `twitter-unmute.user.js` from the res folder
@@ -63,8 +69,10 @@ set -e
EOF
chmod 755 /usr/local/bin/chromium-browser
# start the server (note: replace `-v.::rw:` with `-v.::w:` to disallow retrieving uploaded stuff)
cd ~/Downloads; python3 copyparty-sfx.py --urlform save,get -v.::rw:c,e2d,e2t,mte=+a1:c,mtp=a1=ad,kn,very-bad-idea.py
# start the server
# note 1: replace hunter2 with a better password to access the server
# note 2: replace `-v.::rw` with `-v.::w` to disallow retrieving uploaded stuff
cd ~/Downloads; python3 copyparty-sfx.py -a kevin:hunter2 --urlform save,get -v.::rw,kevin:c,e2d,e2t,mte=+a1:c,mtp=a1=ad,kn,very-bad-idea.py
"""
@@ -72,11 +80,23 @@ cd ~/Downloads; python3 copyparty-sfx.py --urlform save,get -v.::rw:c,e2d,e2t,mt
import os
import sys
import time
import shutil
import subprocess as sp
from urllib.parse import unquote_to_bytes as unquote
from urllib.parse import quote
have_mpv = shutil.which("mpv")
have_vlc = shutil.which("vlc")
def main():
if len(sys.argv) > 2 and sys.argv[1] == "x":
# invoked on commandline for testing;
# python3 very-bad-idea.py x msg=https://youtu.be/dQw4w9WgXcQ
txt = " ".join(sys.argv[2:])
txt = quote(txt.replace(" ", "+"))
return open_post(txt.encode("utf-8"))
fp = os.path.abspath(sys.argv[1])
with open(fp, "rb") as f:
txt = f.read(4096)
@@ -92,7 +112,7 @@ def open_post(txt):
try:
k, v = txt.split(" ", 1)
except:
open_url(txt)
return open_url(txt)
if k == "key":
sp.call(["xdotool", "key"] + v.split(" "))
@@ -128,6 +148,17 @@ def open_url(txt):
# else:
# sp.call(["xdotool", "getactivewindow", "windowminimize"]) # minimizes the focused windo
# mpv is probably smart enough to use streamlink automatically
if try_mpv(txt):
print("mpv got it")
return
# or maybe streamlink would be a good choice to open this
if try_streamlink(txt):
print("streamlink got it")
return
# nope,
# close any error messages:
sp.call(["xdotool", "search", "--name", "Error", "windowclose"])
# sp.call(["xdotool", "key", "ctrl+alt+d"]) # doesnt work at all
@@ -136,4 +167,39 @@ def open_url(txt):
sp.call(["xdg-open", txt])
def try_mpv(url):
t0 = time.time()
try:
print("trying mpv...")
sp.check_call(["mpv", "--fs", url])
return True
except:
# if it ran for 15 sec it probably succeeded and terminated
t = time.time()
return t - t0 > 15
def try_streamlink(url):
t0 = time.time()
try:
import streamlink
print("trying streamlink...")
streamlink.Streamlink().resolve_url(url)
if have_mpv:
args = "-m streamlink -p mpv -a --fs"
else:
args = "-m streamlink"
cmd = [sys.executable] + args.split() + [url, "best"]
t0 = time.time()
sp.check_call(cmd)
return True
except:
# if it ran for 10 sec it probably succeeded and terminated
t = time.time()
return t - t0 > 10
main()

View File

@@ -65,6 +65,10 @@ def main():
if "://" not in url:
url = "https://" + url
proto = url.split("://")[0].lower()
if proto not in ("http", "https", "ftp", "ftps"):
raise Exception("bad proto {}".format(proto))
os.chdir(fdir)
name = url.split("?")[0].split("/")[-1]

View File

@@ -46,12 +46,13 @@ import traceback
import http.client # py2: httplib
import urllib.parse
import calendar
from datetime import datetime
from datetime import datetime, timezone
from urllib.parse import quote_from_bytes as quote
from urllib.parse import unquote_to_bytes as unquote
WINDOWS = sys.platform == "win32"
MACOS = platform.system() == "Darwin"
UTC = timezone.utc
info = log = dbg = None
@@ -176,7 +177,7 @@ class RecentLog(object):
def put(self, msg):
msg = "{:10.6f} {} {}\n".format(time.time() % 900, rice_tid(), msg)
if self.f:
fmsg = " ".join([datetime.utcnow().strftime("%H%M%S.%f"), str(msg)])
fmsg = " ".join([datetime.now(UTC).strftime("%H%M%S.%f"), str(msg)])
self.f.write(fmsg.encode("utf-8"))
with self.mtx:

View File

@@ -20,12 +20,13 @@ import sys
import base64
import sqlite3
import argparse
from datetime import datetime
from datetime import datetime, timezone
from urllib.parse import quote_from_bytes as quote
from urllib.parse import unquote_to_bytes as unquote
FS_ENCODING = sys.getfilesystemencoding()
UTC = timezone.utc
class APF(argparse.ArgumentDefaultsHelpFormatter, argparse.RawDescriptionHelpFormatter):
@@ -155,11 +156,10 @@ th {
link = txt.decode("utf-8")[4:]
sz = "{:,}".format(sz)
dt = datetime.fromtimestamp(at if at > 0 else mt, UTC)
v = [
w[:16],
datetime.utcfromtimestamp(at if at > 0 else mt).strftime(
"%Y-%m-%d %H:%M:%S"
),
dt.strftime("%Y-%m-%d %H:%M:%S"),
sz,
imap.get(ip, ip),
]

View File

@@ -118,6 +118,15 @@ mkdir -p "$jail/tmp"
chmod 777 "$jail/tmp"
# create a dev
(cd $jail; mkdir -p dev; cd dev
[ -e null ] || mknod -m 666 null c 1 3
[ -e zero ] || mknod -m 666 zero c 1 5
[ -e random ] || mknod -m 444 random c 1 8
[ -e urandom ] || mknod -m 444 urandom c 1 9
)
# run copyparty
export HOME=$(getent passwd $uid | cut -d: -f6)
export USER=$(getent passwd $uid | cut -d: -f1)

View File

@@ -1,19 +1,20 @@
#!/usr/bin/env python3
from __future__ import print_function, unicode_literals
S_VERSION = "1.5"
S_BUILD_DT = "2023-03-12"
S_VERSION = "1.10"
S_BUILD_DT = "2023-08-15"
"""
up2k.py: upload to copyparty
u2c.py: upload to copyparty
2021, ed <irc.rizon.net>, MIT-Licensed
https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py
https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py
- dependencies: requests
- supports python 2.6, 2.7, and 3.3 through 3.12
- if something breaks just try again and it'll autoresume
"""
import re
import os
import sys
import stat
@@ -21,6 +22,7 @@ import math
import time
import atexit
import signal
import socket
import base64
import hashlib
import platform
@@ -38,7 +40,7 @@ except:
try:
import requests
except ImportError:
except ImportError as ex:
if EXE:
raise
elif sys.version_info > (2, 7):
@@ -49,7 +51,7 @@ except ImportError:
m = "\n ERROR: need these:\n" + "\n".join(m) + "\n"
m += "\n for f in *.whl; do unzip $f; done; rm -r *.dist-info\n"
print(m.format(sys.executable))
print(m.format(sys.executable), "\nspecifically,", ex)
sys.exit(1)
@@ -58,6 +60,7 @@ PY2 = sys.version_info < (3,)
if PY2:
from Queue import Queue
from urllib import quote, unquote
from urlparse import urlsplit, urlunsplit
sys.dont_write_bytecode = True
bytes = str
@@ -65,6 +68,7 @@ else:
from queue import Queue
from urllib.parse import unquote_to_bytes as unquote
from urllib.parse import quote_from_bytes as quote
from urllib.parse import urlsplit, urlunsplit
unicode = str
@@ -337,6 +341,32 @@ class CTermsize(object):
ss = CTermsize()
def undns(url):
usp = urlsplit(url)
hn = usp.hostname
gai = None
eprint("resolving host [{0}] ...".format(hn), end="")
try:
gai = socket.getaddrinfo(hn, None)
hn = gai[0][4][0]
except KeyboardInterrupt:
raise
except:
t = "\n\033[31mfailed to resolve upload destination host;\033[0m\ngai={0}\n"
eprint(t.format(repr(gai)))
raise
if usp.port:
hn = "{0}:{1}".format(hn, usp.port)
if usp.username or usp.password:
hn = "{0}:{1}@{2}".format(usp.username, usp.password, hn)
usp = usp._replace(netloc=hn)
url = urlunsplit(usp)
eprint(" {0}".format(url))
return url
def _scd(err, top):
"""non-recursive listing of directory contents, along with stat() info"""
with os.scandir(top) as dh:
@@ -382,10 +412,11 @@ def walkdir(err, top, seen):
err.append((ap, str(ex)))
def walkdirs(err, tops):
def walkdirs(err, tops, excl):
"""recursive statdir for a list of tops, yields [top, relpath, stat]"""
sep = "{0}".format(os.sep).encode("ascii")
if not VT100:
excl = excl.replace("/", r"\\")
za = []
for td in tops:
try:
@@ -402,6 +433,8 @@ def walkdirs(err, tops):
za = [x.replace(b"/", b"\\") for x in za]
tops = za
ptn = re.compile(excl.encode("utf-8") or b"\n")
for top in tops:
isdir = os.path.isdir(top)
if top[-1:] == sep:
@@ -414,6 +447,8 @@ def walkdirs(err, tops):
if isdir:
for ap, inf in walkdir(err, top, []):
if ptn.match(ap):
continue
yield stop, ap[len(stop) :].lstrip(sep), inf
else:
d, n = top.rsplit(sep, 1)
@@ -625,7 +660,7 @@ class Ctl(object):
nfiles = 0
nbytes = 0
err = []
for _, _, inf in walkdirs(err, ar.files):
for _, _, inf in walkdirs(err, ar.files, ar.x):
if stat.S_ISDIR(inf.st_mode):
continue
@@ -653,6 +688,7 @@ class Ctl(object):
return nfiles, nbytes
def __init__(self, ar, stats=None):
self.ok = False
self.ar = ar
self.stats = stats or self._scan()
if not self.stats:
@@ -666,7 +702,7 @@ class Ctl(object):
if ar.te:
req_ses.verify = ar.te
self.filegen = walkdirs([], ar.files)
self.filegen = walkdirs([], ar.files, ar.x)
self.recheck = [] # type: list[File]
if ar.safe:
@@ -700,6 +736,8 @@ class Ctl(object):
self._fancy()
self.ok = True
def _safe(self):
"""minimal basic slow boring fallback codepath"""
search = self.ar.s
@@ -850,7 +888,7 @@ class Ctl(object):
print(" ls ~{0}".format(srd))
zb = self.ar.url.encode("utf-8")
zb += quotep(rd.replace(b"\\", b"/"))
r = req_ses.get(zb + b"?ls&dots", headers=headers)
r = req_ses.get(zb + b"?ls&lt&dots", headers=headers)
if not r:
raise Exception("HTTP {0}".format(r.status_code))
@@ -928,7 +966,7 @@ class Ctl(object):
upath = file.abs.decode("utf-8", "replace")
if not VT100:
upath = upath[4:]
upath = upath.lstrip("\\?")
hs, sprs = handshake(self.ar, file, search)
if search:
@@ -1009,8 +1047,9 @@ class Ctl(object):
file, cid = task
try:
upload(file, cid, self.ar.a, stats)
except:
eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
except Exception as ex:
t = "upload failed, retrying: {0} #{1} ({2})\n"
eprint(t.format(file.name, cid[:8], ex))
# handshake will fix it
with self.mutex:
@@ -1049,6 +1088,8 @@ def main():
print(ver)
return
sys.argv = [x for x in sys.argv if x != "--ws"]
# fmt: off
ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog="""
NOTE:
@@ -1062,12 +1103,13 @@ source file/folder selection uses rsync syntax, meaning that:
ap.add_argument("-v", action="store_true", help="verbose")
ap.add_argument("-a", metavar="PASSWORD", help="password or $filepath")
ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
ap.add_argument("-x", type=unicode, metavar="REGEX", default="", help="skip file if filesystem-abspath matches REGEX, example: '.*/\.hist/.*'")
ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
ap.add_argument("--version", action="store_true", help="show version and exit")
ap = app.add_argument_group("compatibility")
ap.add_argument("--cls", action="store_true", help="clear screen before start")
ap.add_argument("--ws", action="store_true", help="copyparty is running on windows; wait before deleting files after uploading")
ap.add_argument("--rh", type=int, metavar="TRIES", default=0, help="resolve server hostname before upload (good for buggy networks, but TLS certs will break)")
ap = app.add_argument_group("folder sync")
ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
@@ -1078,7 +1120,7 @@ source file/folder selection uses rsync syntax, meaning that:
ap.add_argument("-j", type=int, metavar="THREADS", default=4, help="parallel connections")
ap.add_argument("-J", type=int, metavar="THREADS", default=hcores, help="num cpu-cores to use for hashing; set 0 or 1 for single-core hashing")
ap.add_argument("-nh", action="store_true", help="disable hashing while uploading")
ap.add_argument("-ns", action="store_true", help="no status panel (for slow consoles)")
ap.add_argument("-ns", action="store_true", help="no status panel (for slow consoles and macos)")
ap.add_argument("--safe", action="store_true", help="use simple fallback approach")
ap.add_argument("-z", action="store_true", help="ZOOMIN' (skip uploading files if they exist at the destination with the ~same last-modified timestamp, so same as yolo / turbo with date-chk but even faster)")
@@ -1091,7 +1133,7 @@ source file/folder selection uses rsync syntax, meaning that:
ar = app.parse_args()
finally:
if EXE and not sys.argv[1:]:
print("*** hit enter to exit ***")
eprint("*** hit enter to exit ***")
try:
input()
except:
@@ -1124,20 +1166,28 @@ source file/folder selection uses rsync syntax, meaning that:
with open(fn, "rb") as f:
ar.a = f.read().decode("utf-8").strip()
for n in range(ar.rh):
try:
ar.url = undns(ar.url)
break
except KeyboardInterrupt:
raise
except:
if n > ar.rh - 2:
raise
if ar.cls:
print("\x1b\x5b\x48\x1b\x5b\x32\x4a\x1b\x5b\x33\x4a", end="")
eprint("\033[H\033[2J\033[3J", end="")
ctl = Ctl(ar)
if ar.dr and not ar.drd:
if ar.dr and not ar.drd and ctl.ok:
print("\npass 2/2: delete")
if getattr(ctl, "up_br") and ar.ws:
# wait for up2k to mtime if there were uploads
time.sleep(4)
ar.drd = True
ar.z = True
Ctl(ar, ctl.stats)
ctl = Ctl(ar, ctl.stats)
sys.exit(0 if ctl.ok else 1)
if __name__ == "__main__":

View File

@@ -1,7 +1,6 @@
# when running copyparty behind a reverse proxy,
# the following arguments are recommended:
#
# --http-only lower latency on initial connection
# -i 127.0.0.1 only accept connections from nginx
#
# if you are doing location-based proxying (such as `/stuff` below)

View File

@@ -1,14 +1,44 @@
#!/bin/bash
set -e
cat >/dev/null <<'EOF'
NOTE: copyparty is now able to do this automatically;
however you may wish to use this script instead if
you have specific needs (or if copyparty breaks)
this script generates a new self-signed TLS certificate and
replaces the default insecure one that comes with copyparty
as it is trivial to impersonate a copyparty server using the
default certificate, it is highly recommended to do this
this will create a self-signed CA, and a Server certificate
which gets signed by that CA -- you can run it multiple times
with different server-FQDNs / IPs to create additional certs
for all your different servers / (non-)copyparty services
EOF
# ca-name and server-fqdn
ca_name="$1"
srv_fqdn="$2"
[ -z "$srv_fqdn" ] && {
echo "need arg 1: ca name"
echo "need arg 2: server fqdn and/or IPs, comma-separated"
echo "optional arg 3: if set, write cert into copyparty cfg"
[ -z "$srv_fqdn" ] && { cat <<'EOF'
need arg 1: ca name
need arg 2: server fqdn and/or IPs, comma-separated
optional arg 3: if set, write cert into copyparty cfg
example:
./cfssl.sh PartyCo partybox.local y
EOF
exit 1
}
command -v cfssljson 2>/dev/null || {
echo please install cfssl and try again
exit 1
}
@@ -59,12 +89,14 @@ show() {
}
show ca.pem
show "$srv_fqdn.pem"
echo
echo "successfully generated new certificates"
# write cert into copyparty config
[ -z "$3" ] || {
mkdir -p ~/.config/copyparty
cat "$srv_fqdn".{key,pem} ca.pem >~/.config/copyparty/cert.pem
echo "successfully replaced copyparty certificate"
}

View File

@@ -26,8 +26,8 @@ a {
<script>
var a = document.getElementById('redir'),
proto = window.location.protocol.indexOf('https') === 0 ? 'https' : 'http',
loc = window.location.hostname || '127.0.0.1',
proto = location.protocol.indexOf('https') === 0 ? 'https' : 'http',
loc = location.hostname || '127.0.0.1',
port = a.getAttribute('href').split(':').pop().split('/')[0],
url = proto + '://' + loc + ':' + port + '/';
@@ -35,7 +35,7 @@ a.setAttribute('href', url);
document.getElementById('desc').innerHTML = 'redirecting to';
setTimeout(function() {
window.location.href = url;
location.href = url;
}, 500);
</script>

Binary file not shown.

View File

@@ -1,7 +1,6 @@
# when running copyparty behind a reverse proxy,
# the following arguments are recommended:
#
# --http-only lower latency on initial connection
# -i 127.0.0.1 only accept connections from nginx
#
# -nc must match or exceed the webserver's max number of concurrent clients;
@@ -9,7 +8,7 @@
# nginx default is 512 (worker_processes 1, worker_connections 512)
#
# you may also consider adding -j0 for CPU-intensive configurations
# (not that i can really think of any good examples)
# (5'000 requests per second, or 20gbps upload/download in parallel)
#
# on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
@@ -35,7 +34,15 @@ server {
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
# NOTE: with cloudflare you want this instead:
#proxy_set_header X-Forwarded-For $http_cf_connecting_ip;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Connection "Keep-Alive";
}
}
# default client_max_body_size (1M) blocks uploads larger than 256 MiB
client_max_body_size 1024M;
client_header_timeout 610m;
client_body_timeout 610m;
send_timeout 610m;

View File

@@ -0,0 +1,283 @@
{ config, pkgs, lib, ... }:
with lib;
let
mkKeyValue = key: value:
if value == true then
# sets with a true boolean value are coerced to just the key name
key
else if value == false then
# or omitted completely when false
""
else
(generators.mkKeyValueDefault { inherit mkValueString; } ": " key value);
mkAttrsString = value: (generators.toKeyValue { inherit mkKeyValue; } value);
mkValueString = value:
if isList value then
(concatStringsSep ", " (map mkValueString value))
else if isAttrs value then
"\n" + (mkAttrsString value)
else
(generators.mkValueStringDefault { } value);
mkSectionName = value: "[" + (escape [ "[" "]" ] value) + "]";
mkSection = name: attrs: ''
${mkSectionName name}
${mkAttrsString attrs}
'';
mkVolume = name: attrs: ''
${mkSectionName name}
${attrs.path}
${mkAttrsString {
accs = attrs.access;
flags = attrs.flags;
}}
'';
passwordPlaceholder = name: "{{password-${name}}}";
accountsWithPlaceholders = mapAttrs (name: attrs: passwordPlaceholder name);
configStr = ''
${mkSection "global" cfg.settings}
${mkSection "accounts" (accountsWithPlaceholders cfg.accounts)}
${concatStringsSep "\n" (mapAttrsToList mkVolume cfg.volumes)}
'';
name = "copyparty";
cfg = config.services.copyparty;
configFile = pkgs.writeText "${name}.conf" configStr;
runtimeConfigPath = "/run/${name}/${name}.conf";
home = "/var/lib/${name}";
defaultShareDir = "${home}/data";
in {
options.services.copyparty = {
enable = mkEnableOption "web-based file manager";
package = mkOption {
type = types.package;
default = pkgs.copyparty;
defaultText = "pkgs.copyparty";
description = ''
Package of the application to run, exposed for overriding purposes.
'';
};
openFilesLimit = mkOption {
default = 4096;
type = types.either types.int types.str;
description = "Number of files to allow copyparty to open.";
};
settings = mkOption {
type = types.attrs;
description = ''
Global settings to apply.
Directly maps to values in the [global] section of the copyparty config.
See `${getExe cfg.package} --help` for more details.
'';
default = {
i = "127.0.0.1";
no-reload = true;
};
example = literalExpression ''
{
i = "0.0.0.0";
no-reload = true;
}
'';
};
accounts = mkOption {
type = types.attrsOf (types.submodule ({ ... }: {
options = {
passwordFile = mkOption {
type = types.str;
description = ''
Runtime file path to a file containing the user password.
Must be readable by the copyparty user.
'';
example = "/run/keys/copyparty/ed";
};
};
}));
description = ''
A set of copyparty accounts to create.
'';
default = { };
example = literalExpression ''
{
ed.passwordFile = "/run/keys/copyparty/ed";
};
'';
};
volumes = mkOption {
type = types.attrsOf (types.submodule ({ ... }: {
options = {
path = mkOption {
type = types.str;
description = ''
Path of a directory to share.
'';
};
access = mkOption {
type = types.attrs;
description = ''
Attribute list of permissions and the users to apply them to.
The key must be a string containing any combination of the allowed permissions:
"r" (read): list folder contents, download files
"w" (write): upload files; need "r" to see the uploads
"m" (move): move files and folders; need "w" at destination
"d" (delete): permanently delete files and folders
"g" (get): download files, but cannot see folder contents
"G" (upget): "get", but can see filekeys of their own uploads
"h" (html): "get", but folders return their index.html
"a" (admin): can see uploader IPs, config-reload
For example: "rwmd"
The value must be one of:
an account name, defined in `accounts`
a list of account names
"*", which means "any account"
'';
example = literalExpression ''
{
# wG = write-upget = see your own uploads only
wG = "*";
# read-write-modify-delete for users "ed" and "k"
rwmd = ["ed" "k"];
};
'';
};
flags = mkOption {
type = types.attrs;
description = ''
Attribute list of volume flags to apply.
See `${getExe cfg.package} --help-flags` for more details.
'';
example = literalExpression ''
{
# "fk" enables filekeys (necessary for upget permission) (4 chars long)
fk = 4;
# scan for new files every 60sec
scan = 60;
# volflag "e2d" enables the uploads database
e2d = true;
# "d2t" disables multimedia parsers (in case the uploads are malicious)
d2t = true;
# skips hashing file contents if path matches *.iso
nohash = "\.iso$";
};
'';
default = { };
};
};
}));
description = "A set of copyparty volumes to create";
default = {
"/" = {
path = defaultShareDir;
access = { r = "*"; };
};
};
example = literalExpression ''
{
"/" = {
path = ${defaultShareDir};
access = {
# wG = write-upget = see your own uploads only
wG = "*";
# read-write-modify-delete for users "ed" and "k"
rwmd = ["ed" "k"];
};
};
};
'';
};
};
config = mkIf cfg.enable {
systemd.services.copyparty = {
description = "http file sharing hub";
wantedBy = [ "multi-user.target" ];
environment = {
PYTHONUNBUFFERED = "true";
XDG_CONFIG_HOME = "${home}/.config";
};
preStart = let
replaceSecretCommand = name: attrs:
"${getExe pkgs.replace-secret} '${
passwordPlaceholder name
}' '${attrs.passwordFile}' ${runtimeConfigPath}";
in ''
set -euo pipefail
install -m 600 ${configFile} ${runtimeConfigPath}
${concatStringsSep "\n"
(mapAttrsToList replaceSecretCommand cfg.accounts)}
'';
serviceConfig = {
Type = "simple";
ExecStart = "${getExe cfg.package} -c ${runtimeConfigPath}";
# Hardening options
User = "copyparty";
Group = "copyparty";
RuntimeDirectory = name;
RuntimeDirectoryMode = "0700";
StateDirectory = [ name "${name}/data" "${name}/.config" ];
StateDirectoryMode = "0700";
WorkingDirectory = home;
TemporaryFileSystem = "/:ro";
BindReadOnlyPaths = [
"/nix/store"
"-/etc/resolv.conf"
"-/etc/nsswitch.conf"
"-/etc/hosts"
"-/etc/localtime"
] ++ (mapAttrsToList (k: v: "-${v.passwordFile}") cfg.accounts);
BindPaths = [ home ] ++ (mapAttrsToList (k: v: v.path) cfg.volumes);
# Would re-mount paths ignored by temporary root
#ProtectSystem = "strict";
ProtectHome = true;
PrivateTmp = true;
PrivateDevices = true;
ProtectKernelTunables = true;
ProtectControlGroups = true;
RestrictSUIDSGID = true;
PrivateMounts = true;
ProtectKernelModules = true;
ProtectKernelLogs = true;
ProtectHostname = true;
ProtectClock = true;
ProtectProc = "invisible";
ProcSubset = "pid";
RestrictNamespaces = true;
RemoveIPC = true;
UMask = "0077";
LimitNOFILE = cfg.openFilesLimit;
NoNewPrivileges = true;
LockPersonality = true;
RestrictRealtime = true;
};
};
users.groups.copyparty = { };
users.users.copyparty = {
description = "Service user for copyparty";
group = "copyparty";
home = home;
isSystemUser = true;
};
};
}

View File

@@ -1,50 +1,48 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.6.7"
pkgver="1.9.14"
pkgrel=1
pkgdesc="Portable file sharing hub"
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, zeroconf, media indexer, thumbnails++"
arch=("any")
url="https://github.com/9001/${pkgname}"
license=('MIT')
depends=("python" "lsof")
optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tags"
"python-jinja: faster html generator"
depends=("python" "lsof" "python-jinja")
makedepends=("python-wheel" "python-setuptools" "python-build" "python-installer" "make" "pigz")
optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tags"
"python-mutagen: music tags (alternative)"
"python-pillow: thumbnails for images"
"python-pyvips: thumbnails for images (higher quality, faster, uses more ram)"
"libkeyfinder-git: detection of musical keys"
"qm-vamp-plugins: BPM detection"
"python-pyopenssl: ftps functionality"
"python-argon2_cffi: hashed passwords in config"
"python-impacket-git: smb support (bad idea)"
)
source=("${url}/releases/download/v${pkgver}/${pkgname}-sfx.py"
"${pkgname}.conf"
"${pkgname}.service"
"prisonparty.service"
"index.md"
"https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/bin/prisonparty.sh"
"https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/LICENSE"
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("3fb40a631e9decf0073db06aab6fd8d743de91f4ddb82a65164d39d53e0b413f"
"b8565eba5e64dedba1cf6c7aac7e31c5a731ed7153d6810288a28f00a36c28b2"
"f65c207e0670f9d78ad2e399bda18d5502ff30d2ac79e0e7fc48e7fbdc39afdc"
"c4f396b083c9ec02ad50b52412c84d2a82be7f079b2d016e1c9fad22d68285ff"
"dba701de9fd584405917e923ea1e59dbb249b96ef23bad479cf4e42740b774c8"
"23054bb206153a1ed34038accaf490b8068f9c856e423c2f2595b148b40c0a0c"
"cb2ce3d6277bf2f5a82ecf336cc44963bc6490bcf496ffbd75fc9e21abaa75f3"
)
sha256sums=("96867ea1bcaf622e5dc29ee3224ffa8ea80218d3a146e7a10d04c12255bae00f")
build() {
cd "${srcdir}/${pkgname}-${pkgver}"
pushd copyparty/web
make -j$(nproc)
rm Makefile
popd
python3 -m build -wn
}
package() {
cd "${srcdir}/"
cd "${srcdir}/${pkgname}-${pkgver}"
python3 -m installer -d "$pkgdir" dist/*.whl
install -dm755 "${pkgdir}/etc/${pkgname}.d"
install -Dm755 "${pkgname}-sfx.py" "${pkgdir}/usr/bin/${pkgname}"
install -Dm755 "prisonparty.sh" "${pkgdir}/usr/bin/prisonparty"
install -Dm644 "${pkgname}.conf" "${pkgdir}/etc/${pkgname}.d/init"
install -Dm644 "${pkgname}.service" "${pkgdir}/usr/lib/systemd/system/${pkgname}.service"
install -Dm644 "prisonparty.service" "${pkgdir}/usr/lib/systemd/system/prisonparty.service"
install -Dm644 "index.md" "${pkgdir}/var/lib/${pkgname}-jail/README.md"
install -Dm755 "bin/prisonparty.sh" "${pkgdir}/usr/bin/prisonparty"
install -Dm644 "contrib/package/arch/${pkgname}.conf" "${pkgdir}/etc/${pkgname}.d/init"
install -Dm644 "contrib/package/arch/${pkgname}.service" "${pkgdir}/usr/lib/systemd/system/${pkgname}.service"
install -Dm644 "contrib/package/arch/prisonparty.service" "${pkgdir}/usr/lib/systemd/system/prisonparty.service"
install -Dm644 "contrib/package/arch/index.md" "${pkgdir}/var/lib/${pkgname}-jail/README.md"
install -Dm644 "LICENSE" "${pkgdir}/usr/share/licenses/${pkgname}/LICENSE"
find /etc/${pkgname}.d -iname '*.conf' 2>/dev/null | grep -qE . && return

View File

@@ -0,0 +1,59 @@
{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, argon2-cffi, pillow, pyvips, ffmpeg, mutagen,
# use argon2id-hashed passwords in config files (sha2 is always available)
withHashedPasswords ? true,
# create thumbnails with Pillow; faster than FFmpeg / MediaProcessing
withThumbnails ? true,
# create thumbnails with PyVIPS; even faster, uses more memory
# -- can be combined with Pillow to support more filetypes
withFastThumbnails ? false,
# enable FFmpeg; thumbnails for most filetypes (also video and audio), extract audio metadata, transcode audio to opus
# -- possibly dangerous if you allow anonymous uploads, since FFmpeg has a huge attack surface
# -- can be combined with Thumbnails and/or FastThumbnails, since FFmpeg is slower than both
withMediaProcessing ? true,
# if MediaProcessing is not enabled, you probably want this instead (less accurate, but much safer and faster)
withBasicAudioMetadata ? false,
# enable FTPS support in the FTP server
withFTPS ? false,
# samba/cifs server; dangerous and buggy, enable if you really need it
withSMB ? false,
}:
let
pinData = lib.importJSON ./pin.json;
pyEnv = python.withPackages (ps:
with ps; [
jinja2
]
++ lib.optional withSMB impacket
++ lib.optional withFTPS pyopenssl
++ lib.optional withThumbnails pillow
++ lib.optional withFastThumbnails pyvips
++ lib.optional withMediaProcessing ffmpeg
++ lib.optional withBasicAudioMetadata mutagen
++ lib.optional withHashedPasswords argon2-cffi
);
in stdenv.mkDerivation {
pname = "copyparty";
version = pinData.version;
src = fetchurl {
url = pinData.url;
hash = pinData.hash;
};
buildInputs = [ makeWrapper ];
dontUnpack = true;
dontBuild = true;
installPhase = ''
install -Dm755 $src $out/share/copyparty-sfx.py
makeWrapper ${pyEnv.interpreter} $out/bin/copyparty \
--set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
--add-flags "$out/share/copyparty-sfx.py"
'';
}

View File

@@ -0,0 +1,5 @@
{
"url": "https://github.com/9001/copyparty/releases/download/v1.9.14/copyparty-sfx.py",
"version": "1.9.14",
"hash": "sha256-H4hRi6Nn4jUouhvqLacFyr0odMQ+99crBXL3iNz7mXs="
}

View File

@@ -0,0 +1,77 @@
#!/usr/bin/env python3
# Update the Nix package pin
#
# Usage: ./update.sh [PATH]
# When [PATH] is not set, it will fetch the latest release from the repo.
# With [PATH] set, it will hash the given file and generate the URL,
# based on the version contained within the file
import base64
import json
import hashlib
import sys
import re
from pathlib import Path
OUTPUT_FILE = Path("pin.json")
TARGET_ASSET = "copyparty-sfx.py"
HASH_TYPE = "sha256"
LATEST_RELEASE_URL = "https://api.github.com/repos/9001/copyparty/releases/latest"
DOWNLOAD_URL = lambda version: f"https://github.com/9001/copyparty/releases/download/v{version}/{TARGET_ASSET}"
def get_formatted_hash(binary):
hasher = hashlib.new("sha256")
hasher.update(binary)
asset_hash = hasher.digest()
encoded_hash = base64.b64encode(asset_hash).decode("ascii")
return f"{HASH_TYPE}-{encoded_hash}"
def version_from_sfx(binary):
result = re.search(b'^VER = "(.*)"$', binary, re.MULTILINE)
if result:
return result.groups(1)[0].decode("ascii")
raise ValueError("version not found in provided file")
def remote_release_pin():
import requests
response = requests.get(LATEST_RELEASE_URL).json()
version = response["tag_name"].lstrip("v")
asset_info = [a for a in response["assets"] if a["name"] == TARGET_ASSET][0]
download_url = asset_info["browser_download_url"]
asset = requests.get(download_url)
formatted_hash = get_formatted_hash(asset.content)
result = {"url": download_url, "version": version, "hash": formatted_hash}
return result
def local_release_pin(path):
asset = path.read_bytes()
version = version_from_sfx(asset)
download_url = DOWNLOAD_URL(version)
formatted_hash = get_formatted_hash(asset)
result = {"url": download_url, "version": version, "hash": formatted_hash}
return result
def main():
if len(sys.argv) > 1:
asset_path = Path(sys.argv[1])
result = local_release_pin(asset_path)
else:
result = remote_release_pin()
print(result)
json_result = json.dumps(result, indent=4)
OUTPUT_FILE.write_text(json_result)
if __name__ == "__main__":
main()
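a quick way to sanity-check a freshly written pin (hypothetical verification snippet; assumes network access and the pin.json layout shown above):

    import base64, hashlib, json, urllib.request

    pin = json.load(open("pin.json"))
    data = urllib.request.urlopen(pin["url"]).read()
    sri = "sha256-" + base64.b64encode(hashlib.sha256(data).digest()).decode("ascii")
    assert sri == pin["hash"], "pin.json hash does not match the published asset"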

contrib/plugins/rave.js Normal file
View File

@@ -0,0 +1,208 @@
/* untz untz untz untz */
(function () {
var can, ctx, W, H, fft, buf, bars, barw, pv,
hue = 0,
ibeat = 0,
beats = [9001],
beats_url = '',
uofs = 0,
ops = ebi('ops'),
raving = false,
recalc = 0,
cdown = 0,
FC = 0.9,
css = `<style>
#fft {
position: fixed;
top: 0;
left: 0;
z-index: -1;
}
body {
box-shadow: inset 0 0 0 white;
}
#ops>a,
#path>a {
display: inline-block;
}
/*
body.untz {
animation: untz-body 200ms ease-out;
}
@keyframes untz-body {
0% {inset 0 0 20em white}
100% {inset 0 0 0 white}
}
*/
:root, html.a, html.b, html.c, html.d, html.e {
--row-alt: rgba(48,52,78,0.2);
}
#files td {
background: none;
}
</style>`;
QS('body').appendChild(mknod('div', null, css));
function rave_load() {
console.log('rave_load');
can = mknod('canvas', 'fft');
QS('body').appendChild(can);
ctx = can.getContext('2d');
fft = new AnalyserNode(actx, {
"fftSize": 2048,
"maxDecibels": 0,
"smoothingTimeConstant": 0.7,
});
ibeat = 0;
beats = [9001];
buf = new Uint8Array(fft.frequencyBinCount);
bars = buf.length * FC;
afilt.filters.push(fft);
if (!raving) {
raving = true;
raver();
}
beats_url = mp.au.src.split('?')[0].replace(/(.*\/)(.*)/, '$1.beats/$2.txt');
console.log("reading beats from", beats_url);
var xhr = new XHR();
xhr.open('GET', beats_url, true);
xhr.onload = readbeats;
xhr.url = beats_url;
xhr.send();
}
function rave_unload() {
qsr('#fft');
can = null;
}
function readbeats() {
if (this.url != beats_url)
return console.log('old beats??', this.url, beats_url);
var sbeats = this.responseText.replace(/\r/g, '').split(/\n/g);
if (sbeats.length < 3)
return;
beats = [];
for (var a = 0; a < sbeats.length; a++)
beats.push(parseFloat(sbeats[a]));
var end = beats.slice(-2),
t = end[1],
d = t - end[0];
while (d > 0.1 && t < 1200)
beats.push(t += d);
}
function hrand() {
return Math.random() - 0.5;
}
function raver() {
if (!can) {
raving = false;
return;
}
requestAnimationFrame(raver);
if (!mp || !mp.au || mp.au.paused)
return;
if (--uofs >= 0) {
document.body.style.marginLeft = hrand() * uofs + 'px';
ebi('tree').style.marginLeft = hrand() * uofs + 'px';
for (var a of QSA('#ops>a, #path>a, #pctl>a'))
a.style.transform = 'translate(' + hrand() * uofs * 1 + 'px, ' + hrand() * uofs * 0.7 + 'px) rotate(' + Math.random() * uofs * 0.7 + 'deg)'
}
if (--recalc < 0) {
recalc = 60;
var tree = ebi('tree'),
x = tree.style.display == 'none' ? 0 : tree.offsetWidth;
//W = can.width = window.innerWidth - x;
//H = can.height = window.innerHeight;
//H = ebi('widget').offsetTop;
W = can.width = bars;
H = can.height = 512;
barw = 1; //parseInt(0.8 + W / bars);
can.style.left = x + 'px';
can.style.width = (window.innerWidth - x) + 'px';
can.style.height = ebi('widget').offsetTop + 'px';
}
//if (--cdown == 1)
// clmod(ops, 'untz');
fft.getByteFrequencyData(buf);
var imax = 0, vmax = 0;
for (var a = 10; a < 50; a++)
if (vmax < buf[a]) {
vmax = buf[a];
imax = a;
}
hue = hue * 0.93 + imax * 0.07;
ctx.fillStyle = 'rgba(0,0,0,0)';
ctx.fillRect(0, 0, W, H);
ctx.clearRect(0, 0, W, H);
ctx.fillStyle = 'hsla(' + (hue * 2.5) + ',100%,50%,0.7)';
var x = 0, mul = (H / 256) * 0.5;
for (var a = 0; a < buf.length * FC; a++) {
var v = buf[a] * mul * (1 + 0.69 * a / buf.length);
ctx.fillRect(x, H - v, barw, v);
x += barw;
}
var t = mp.au.currentTime + 0.05;
if (ibeat >= beats.length || beats[ibeat] > t)
return;
while (ibeat < beats.length && beats[ibeat++] < t)
continue;
return untz();
var cv = 0;
for (var a = 0; a < 128; a++)
cv += buf[a];
if (cv - pv > 1000) {
console.log(pv, cv, cv - pv);
if (cdown < 0) {
clmod(ops, 'untz', 1);
cdown = 20;
}
}
pv = cv;
}
function untz() {
console.log('untz');
uofs = 14;
document.body.animate([
{ boxShadow: 'inset 0 0 1em #f0c' },
{ boxShadow: 'inset 0 0 20em #f0c', offset: 0.2 },
{ boxShadow: 'inset 0 0 0 #f0c' },
], { duration: 200, iterations: 1 });
}
afilt.plugs.push({
"en": true,
"load": rave_load,
"unload": rave_unload
});
})();

View File

@@ -10,7 +10,7 @@ name="copyparty"
rcvar="copyparty_enable"
copyparty_user="copyparty"
copyparty_args="-e2dsa -v /storage:/storage:r" # change as you see fit
copyparty_command="/usr/local/bin/python3.8 /usr/local/copyparty/copyparty-sfx.py ${copyparty_args}"
copyparty_command="/usr/local/bin/python3.9 /usr/local/copyparty/copyparty-sfx.py ${copyparty_args}"
pidfile="/var/run/copyparty/${name}.pid"
command="/usr/sbin/daemon"
command_args="-P ${pidfile} -r -f ${copyparty_command}"

View File

@@ -1,3 +1,6 @@
# NOTE: this is now a built-in feature in copyparty
# but you may still want this if you have specific needs
#
# systemd service which generates a new TLS certificate on each boot,
# that way the one-year expiry time won't cause any issues --
# just have everyone trust the ca.pem once every 10 years

View File

@@ -2,12 +2,16 @@
# and share '/mnt' with anonymous read+write
#
# installation:
# cp -pv copyparty.service /etc/systemd/system
# restorecon -vr /etc/systemd/system/copyparty.service
# wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py -O /usr/local/bin/copyparty-sfx.py
# cp -pv copyparty.service /etc/systemd/system/
# restorecon -vr /etc/systemd/system/copyparty.service # on fedora/rhel
# firewall-cmd --permanent --add-port={80,443,3923}/tcp # --zone=libvirt
# firewall-cmd --reload
# systemctl daemon-reload && systemctl enable --now copyparty
#
# if it fails to start, first check this: systemctl status copyparty
# then try starting it while viewing logs: journalctl -fan 100
#
# you may want to:
# change "User=cpp" and "/home/cpp/" to another user
# remove the nft lines to only listen on port 3923
@@ -18,6 +22,7 @@
# add '-i 127.0.0.1' to only allow local connections
# add '-e2dsa' to enable filesystem scanning + indexing
# add '-e2ts' to enable metadata indexing
# remove '--ansi' to disable colored logs
#
# with `Type=notify`, copyparty will signal systemd when it is ready to
# accept connections; correctly delaying units depending on copyparty.
@@ -44,7 +49,7 @@ ExecReload=/bin/kill -s USR1 $MAINPID
User=cpp
Environment=XDG_CONFIG_HOME=/home/cpp/.config
# setup forwarding from ports 80 and 443 to port 3923
# OPTIONAL: setup forwarding from ports 80 and 443 to port 3923
ExecStartPre=+/bin/bash -c 'nft -n -a list table nat | awk "/ to :3923 /{print\$NF}" | xargs -rL1 nft delete rule nat prerouting handle; true'
ExecStartPre=+nft add table ip nat
ExecStartPre=+nft -- add chain ip nat prerouting { type nat hook prerouting priority -100 \; }
@@ -55,7 +60,7 @@ ExecStartPre=+nft add rule ip nat prerouting tcp dport 443 redirect to :3923
ExecStartPre=+/bin/bash -c 'mkdir -p /run/tmpfiles.d/ && echo "x /tmp/pe-copyparty*" > /run/tmpfiles.d/copyparty.conf'
# copyparty settings
ExecStart=/usr/bin/python3 /usr/local/bin/copyparty-sfx.py -e2d -v /mnt::rw
ExecStart=/usr/bin/python3 /usr/local/bin/copyparty-sfx.py --ansi -e2d -v /mnt::rw
[Install]
WantedBy=multi-user.target

View File

@@ -0,0 +1,2 @@
rem run copyparty.exe on machines with busted environment variables
cmd /v /c "set TMP=\tmp && copyparty.exe"

View File

@@ -6,6 +6,10 @@ import platform
import sys
import time
# fmt: off
_:tuple[int,int]=(0,0) # _____________________________________________________________________ hey there! if you are reading this, your python is too old to run copyparty without some help. Please use https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py or the pypi package instead, or see https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#building if you want to build it yourself :-) ************************************************************************************************************************************************
# fmt: on
try:
from typing import TYPE_CHECKING
except:
@@ -27,7 +31,12 @@ WINDOWS: Any = (
else False
)
VT100 = not WINDOWS or WINDOWS >= [10, 0, 14393]
VT100 = "--ansi" in sys.argv or (
os.environ.get("NO_COLOR", "").lower() in ("", "0", "false")
and sys.stdout.isatty()
and "--no-ansi" not in sys.argv
and (not WINDOWS or WINDOWS >= [10, 0, 14393])
)
# introduced in anniversary update
ANYWIN = WINDOWS or sys.platform in ["msys", "cygwin"]

View File

@@ -10,11 +10,9 @@ __url__ = "https://github.com/9001/copyparty/"
import argparse
import base64
import filecmp
import locale
import os
import re
import shutil
import socket
import sys
import threading
@@ -29,6 +27,9 @@ from .authsrv import expand_config_file, re_vol, split_cfg_ln, upgrade_cfg_fmt
from .cfg import flagcats, onedash
from .svchub import SvcHub
from .util import (
DEF_EXP,
DEF_MTE,
DEF_MTH,
IMPLICATIONS,
JINJA_VER,
PYFTPD_VER,
@@ -186,7 +187,10 @@ def init_E(E: EnvParams) -> None:
with open_binary("copyparty", "z.tar") as tgz:
with tarfile.open(fileobj=tgz) as tf:
tf.extractall(tdn)
try:
tf.extractall(tdn, filter="tar")
except TypeError:
tf.extractall(tdn) # nosec (archive is safe)
return tdn
@@ -201,7 +205,7 @@ def init_E(E: EnvParams) -> None:
E.mod = _unpack()
if sys.platform == "win32":
bdir = os.environ.get("APPDATA") or os.environ.get("TEMP")
bdir = os.environ.get("APPDATA") or os.environ.get("TEMP") or "."
E.cfg = os.path.normpath(bdir + "/copyparty")
elif sys.platform == "darwin":
E.cfg = os.path.expanduser("~/Library/Preferences/copyparty")
@@ -242,6 +246,32 @@ def get_srvname() -> str:
return ret
def get_fk_salt(cert_path) -> str:
fp = os.path.join(E.cfg, "fk-salt.txt")
try:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(18))
with open(fp, "wb") as f:
f.write(ret + b"\n")
return ret.decode("utf-8")
def get_ah_salt() -> str:
fp = os.path.join(E.cfg, "ah-salt.txt")
try:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(18))
with open(fp, "wb") as f:
f.write(ret + b"\n")
return ret.decode("utf-8")
def ensure_locale() -> None:
safe = "en_US.UTF-8"
for x in [
@@ -261,30 +291,22 @@ def ensure_locale() -> None:
warn(t.format(safe))
def ensure_cert() -> None:
def ensure_webdeps() -> None:
ap = os.path.join(E.mod, "web/deps/mini-fa.woff")
if os.path.exists(ap):
return
warn(
"""could not find webdeps;
if you are running the sfx, or exe, or pypi package, or docker image,
then this is a bug! Please let me know so I can fix it, thanks :-)
https://github.com/9001/copyparty/issues/new?labels=bug&template=bug_report.md
however, if you are a dev, or running copyparty from source, and you want
full client functionality, you will need to build or obtain the webdeps:
https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#building
"""
the default cert (and the entire TLS support) is only here to enable the
crypto.subtle javascript API, which is necessary due to the webkit guys
being massive memers (https://www.chromium.org/blink/webcrypto)
i feel awful about this and so should they
"""
cert_insec = os.path.join(E.mod, "res/insecure.pem")
cert_cfg = os.path.join(E.cfg, "cert.pem")
if not os.path.exists(cert_cfg):
shutil.copy(cert_insec, cert_cfg)
try:
if filecmp.cmp(cert_cfg, cert_insec):
lprint(
"\033[33musing default TLS certificate; https will be insecure."
+ "\033[36m\ncertificate location: {}\033[0m\n".format(cert_cfg)
)
except:
pass
# speaking of the default cert,
# printf 'NO\n.\n.\n.\n.\ncopyparty-insecure\n.\n' | faketime '2000-01-01 00:00:00' openssl req -x509 -sha256 -newkey rsa:2048 -keyout insecure.pem -out insecure.pem -days $((($(printf %d 0x7fffffff)-$(date +%s --date=2000-01-01T00:00:00Z))/(60*60*24))) -nodes && ls -al insecure.pem && openssl x509 -in insecure.pem -text -noout
)
def configure_ssl_ver(al: argparse.Namespace) -> None:
@@ -471,6 +493,8 @@ def get_sects():
"d" (delete): permanently delete files and folders
"g" (get): download files, but cannot see folder contents
"G" (upget): "get", but can see filekeys of their own uploads
"h" (html): "get", but folders return their index.html
"a" (admin): can see uploader IPs, config-reload
too many volflags to list here, see --help-flags
@@ -499,10 +523,58 @@ def get_sects():
"""
volflags are appended to volume definitions, for example,
to create a write-only volume with the \033[33mnodupe\033[0m and \033[32mnosub\033[0m flags:
\033[35m-v /mnt/inc:/inc:w\033[33m:c,nodupe\033[32m:c,nosub"""
)
\033[35m-v /mnt/inc:/inc:w\033[33m:c,nodupe\033[32m:c,nosub\033[0m
if global config defines a volflag for all volumes,
you can unset it for a specific volume with -flag
"""
).rstrip()
+ build_flags_desc(),
],
[
"handlers",
"use plugins to handle certain events",
dedent(
"""
usually copyparty returns a \033[33m404\033[0m if a file does not exist, and
\033[33m403\033[0m if a user tries to access a file they don't have access to
you can load a plugin which will be invoked right before this
happens, and the plugin can choose to override this behavior
load the plugin using --args or volflags; for example \033[36m
--on404 ~/partyhandlers/not404.py
-v .::r:c,on404=~/partyhandlers/not404.py
\033[0m
the file must define the function \033[35mmain(cli,vn,rem)\033[0m:
\033[35mcli\033[0m: the copyparty HttpCli instance
\033[35mvn\033[0m: the VFS which overlaps with the requested URL
\033[35mrem\033[0m: the remainder of the URL below the VFS mountpoint
`main` must return a string; one of the following:
> \033[32m"true"\033[0m: the plugin has responded to the request,
and the TCP connection should be kept open
> \033[32m"false"\033[0m: the plugin has responded to the request,
and the TCP connection should be terminated
> \033[32m"retry"\033[0m: the plugin has done something to resolve the 404
situation, and copyparty should reattempt reading the file.
if it still fails, a regular 404 will be returned
> \033[32m"allow"\033[0m: should ignore the insufficient permissions
and let the client continue anyways
> \033[32m""\033[0m: the plugin has not handled the request;
try the next plugin or return the usual 404 or 403
\033[1;35mPS!\033[0m the folder that contains the python file should ideally
not contain many other python files, and especially nothing
with filenames that overlap with modules used by copyparty
"""
),
],
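# for example, a hypothetical ~/partyhandlers/not404.py following the contract above
# (vn.realpath is an assumption about the VFS object; the return values are the
# documented ones):
#
#   import os
#
#   def main(cli, vn, rem):
#       if not rem.endswith(".txt"):
#           return ""              # not handled; the usual 404/403 applies
#       ap = os.path.join(vn.realpath, rem)
#       open(ap, "a").close()      # drop an empty placeholder file
#       return "retry"             # ask copyparty to re-read it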
[
"hooks",
"execute commands before/after various events",
@@ -517,6 +589,7 @@ def get_sects():
\033[36mxbd\033[35m executes CMD before a file delete
\033[36mxad\033[35m executes CMD after a file delete
\033[36mxm\033[35m executes CMD on message
\033[36mxban\033[35m executes CMD if someone gets banned
\033[0m
can be defined as --args or volflags; for example \033[36m
--xau notify-send
@@ -552,6 +625,9 @@ def get_sects():
executed program on STDIN instead of as argv arguments, and
it also includes the wark (file-id/hash) as a json property
\033[36mxban\033[0m can be used to overrule / cancel a user ban event;
if the program returns 0 (true/OK) then the ban will NOT happen
except for \033[36mxm\033[0m, only one hook / one action can run at a time,
so it's recommended to use the \033[36mf\033[0m flag unless you really need
to wait for the hook to finish before continuing (without \033[36mf\033[0m
@@ -571,6 +647,47 @@ def get_sects():
"""
),
],
[
"exp",
"text expansion",
dedent(
"""
specify --exp or the "exp" volflag to enable placeholder expansions
in README.md / .prologue.html / .epilogue.html
--exp-md (volflag exp_md) holds the list of placeholders which can be
expanded in READMEs, and --exp-lg (volflag exp_lg) likewise for logues;
any placeholder not given in those lists will be ignored and shown as-is
the default list will expand the following placeholders:
\033[36m{{self.ip}} \033[35mclient ip
\033[36m{{self.ua}} \033[35mclient user-agent
\033[36m{{self.uname}} \033[35mclient username
\033[36m{{self.host}} \033[35mthe "Host" header, or the server's external IP otherwise
\033[36m{{cfg.name}} \033[35mthe --name global-config
\033[36m{{cfg.logout}} \033[35mthe --logout global-config
\033[36m{{vf.scan}} \033[35mthe "scan" volflag
\033[36m{{vf.thsize}} \033[35mthumbnail size
\033[36m{{srv.itime}} \033[35mserver time in seconds
\033[36m{{srv.htime}} \033[35mserver time as YY-mm-dd, HH:MM:SS (UTC)
\033[36m{{hdr.cf_ipcountry}} \033[35mthe "CF-IPCountry" client header (probably blank)
\033[0m
so the following types of placeholders can be added to the lists:
* any client header can be accessed through {{hdr.*}}
* any variable in httpcli.py can be accessed through {{self.*}}
* any global server setting can be accessed through {{cfg.*}}
* any volflag can be accessed through {{vf.*}}
remove vf.scan from default list using --exp-md /vf.scan
add "accept" header to def. list using --exp-md +hdr.accept
for performance reasons, expansion only happens while embedding
documents into directory listings, and when accessing a ?doc=...
link, but never otherwise, so if you click a -txt- link you'll
have to refresh the page to apply expansion
"""
),
],
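# for instance, a readme / logue line using only default-list placeholders,
#
#   welcome {{self.uname}} from {{self.ip}} -- served by {{cfg.name}} at {{srv.htime}}
#
# renders with the client's own details, while something like {{hdr.accept}}
# only expands after it is added with --exp-md / --exp-lg as noted above.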
[
"ls",
"volume inspection",
@@ -600,9 +717,9 @@ def get_sects():
\033[32macid\033[0m = extremely safe but slow; the old default. Should never lose any data no matter what
\033[32mswal\033[0m = 2.4x faster uploads yet 99.9%% as safe -- theoretical chance of losing metadata for the ~200 most recently uploaded files if there's a power-loss or your OS crashes
\033[32mswal\033[0m = 2.4x faster uploads yet 99.9% as safe -- theoretical chance of losing metadata for the ~200 most recently uploaded files if there's a power-loss or your OS crashes
\033[32mwal\033[0m = another 21x faster on HDDs yet 90%% as safe; same pitfall as \033[33mswal\033[0m except more likely
\033[32mwal\033[0m = another 21x faster on HDDs yet 90% as safe; same pitfall as \033[33mswal\033[0m except more likely
\033[32myolo\033[0m = another 1.5x faster, and removes the occasional sudden upload-pause while the disk syncs, but now you're at risk of losing the entire database in a powerloss / OS-crash
@@ -610,6 +727,72 @@ def get_sects():
"""
),
],
[
"pwhash",
"password hashing",
dedent(
"""
when \033[36m--ah-alg\033[0m is not the default [\033[32mnone\033[0m], all account passwords must be hashed
passwords can be hashed on the commandline with \033[36m--ah-gen\033[0m, but copyparty will also hash and print any passwords that are non-hashed (passwords which do not start with '+') and then terminate afterwards
\033[36m--ah-alg\033[0m specifies the hashing algorithm and a list of optional comma-separated arguments:
\033[36m--ah-alg argon2\033[0m # which is the same as:
\033[36m--ah-alg argon2,3,256,4,19\033[0m
use argon2id with timecost 3, 256 MiB, 4 threads, version 19 (0x13/v1.3)
\033[36m--ah-alg scrypt\033[0m # which is the same as:
\033[36m--ah-alg scrypt,13,2,8,4\033[0m
use scrypt with cost 2**13, 2 iterations, blocksize 8, 4 threads
\033[36m--ah-alg sha2\033[0m # which is the same as:
\033[36m--ah-alg sha2,424242\033[0m
use sha2-512 with 424242 iterations
recommended: \033[32m--ah-alg argon2\033[0m
(takes about 0.4 sec and 256M RAM to process a new password)
argon2 needs python-package argon2-cffi,
scrypt needs openssl,
sha2 is always available
"""
),
],
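As a sanity check of the argon2 defaults listed above (timecost 3, 256 MiB, 4 threads), the equivalent call through the argon2-cffi package would look roughly like this; parameter names are argon2-cffi's, and the output format will differ from the form copyparty stores:

    from argon2 import PasswordHasher  # pip install argon2-cffi

    # time_cost=3, 256 MiB (memory_cost is in KiB), 4 lanes
    ph = PasswordHasher(time_cost=3, memory_cost=256 * 1024, parallelism=4)

    hashed = ph.hash("hunter2")
    ph.verify(hashed, "hunter2")  # raises VerifyMismatchError on a wrong password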
[
"zm",
"mDNS debugging",
dedent(
"""
the mDNS protocol is multicast-based, which means there are thousands
of fun and interesting ways for it to break unexpectedly
things to check if it does not work at all:
* is there a firewall blocking port 5353 on either the server or client?
(for example, clients may be able to send queries to copyparty,
but the replies could get lost)
* is multicast accidentally disabled on either the server or client?
(look for mDNS log messages saying "new client on [...]")
* the router/switch must be multicast and igmp capable
things to check if it works for a while but then it doesn't:
* is there a firewall blocking port 5353 on either the server or client?
(copyparty may be unable to see the queries from the clients, but the
clients may still be able to see the initial unsolicited announce,
so it works for about 2 minutes after startup until TTL expires)
* does the client have multiple IPs on its interface, and some of the
IPs are in subnets which the copyparty server is not a member of?
for both of the above intermittent issues, try --zm-spam 30
(not spec-compliant but nothing will mind)
"""
),
],
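When working through the firewall/multicast checks above, a quick way to test reachability from a client is a one-shot mDNS query; a rough sketch using only the standard library (a generic probe, not tied to copyparty):

    import socket

    def mdns_probe(timeout=2.0):
        # minimal one-shot PTR query for _http._tcp.local, sent to the mDNS group
        q = b"\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00"  # header: 1 question
        for label in (b"_http", b"_tcp", b"local"):
            q += bytes([len(label)]) + label
        q += b"\x00\x00\x0c\x00\x01"  # root label, QTYPE=PTR, QCLASS=IN
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.settimeout(timeout)
        try:
            s.sendto(q, ("224.0.0.251", 5353))
            data, addr = s.recvfrom(4096)
            print("reply: %d bytes from %s" % (len(data), addr[0]))
        except socket.timeout:
            print("no reply; check firewalls (udp/5353) and multicast routing")
        finally:
            s.close()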
]
@@ -635,8 +818,6 @@ def add_general(ap, nc, srvname):
ap2.add_argument("-a", metavar="ACCT", type=u, action="append", help="add account, \033[33mUSER\033[0m:\033[33mPASS\033[0m; example [\033[32med:wark\033[0m]")
ap2.add_argument("-v", metavar="VOL", type=u, action="append", help="add volume, \033[33mSRC\033[0m:\033[33mDST\033[0m:\033[33mFLAG\033[0m; examples [\033[32m.::r\033[0m], [\033[32m/mnt/nas/music:/music:r:aed\033[0m]")
ap2.add_argument("-ed", action="store_true", help="enable the ?dots url parameter / client option which allows clients to see dotfiles / hidden files")
ap2.add_argument("-emp", action="store_true", help="enable markdown plugins -- neat but dangerous, big XSS risk")
ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="md-editor mod-chk rate")
ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-form POSTs; see --help-urlform")
ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="window title, for example [\033[32m$ip-10.1.2.\033[0m] or [\033[32m$ip-]")
ap2.add_argument("--name", metavar="TXT", type=u, default=srvname, help="server name (displayed topleft in browser and in mDNS)")
@@ -667,15 +848,18 @@ def add_upload(ap):
ap2.add_argument("--use-fpool", action="store_true", help="force file-handle pooling, even when it might be dangerous (multiprocessing, filesystems lacking sparse-files support, ...)")
ap2.add_argument("--hardlink", action="store_true", help="prefer hardlinks instead of symlinks when possible (within same filesystem) (volflag=hardlink)")
ap2.add_argument("--never-symlink", action="store_true", help="do not fallback to symlinks when a hardlink cannot be made (volflag=neversymlink)")
ap2.add_argument("--no-dedup", action="store_true", help="disable symlink/hardlink creation; copy file contents instead (volflag=copydupes")
ap2.add_argument("--no-dedup", action="store_true", help="disable symlink/hardlink creation; copy file contents instead (volflag=copydupes)")
ap2.add_argument("--no-dupe", action="store_true", help="reject duplicate files during upload; only matches within the same volume (volflag=nodupe)")
ap2.add_argument("--no-snap", action="store_true", help="disable snapshots -- forget unfinished uploads on shutdown; don't create .hist/up2k.snap files -- abandoned/interrupted uploads must be cleaned up manually")
ap2.add_argument("--snap-wri", metavar="SEC", type=int, default=300, help="write upload state to ./hist/up2k.snap every SEC seconds; allows resuming incomplete uploads after a server crash")
ap2.add_argument("--snap-drop", metavar="MIN", type=float, default=1440, help="forget unfinished uploads after MIN minutes; impossible to resume them after that (360=6h, 1440=24h)")
ap2.add_argument("--u2ts", metavar="TXT", type=u, default="c", help="how to timestamp uploaded files; [\033[32mc\033[0m]=client-last-modified, [\033[32mu\033[0m]=upload-time, [\033[32mfc\033[0m]=force-c, [\033[32mfu\033[0m]=force-u (volflag=u2ts)")
ap2.add_argument("--rand", action="store_true", help="force randomized filenames, --nrand chars long (volflag=rand)")
ap2.add_argument("--nrand", metavar="NUM", type=int, default=9, help="randomized filenames length (volflag=nrand)")
ap2.add_argument("--magic", action="store_true", help="enable filetype detection on nameless uploads (volflag=magic)")
ap2.add_argument("--df", metavar="GiB", type=float, default=0, help="ensure GiB free disk space by rejecting upload requests")
ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m0\033[0m] = off and warn if enabled, [\033[32m1\033[0m] = off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; [\033[32ms\033[0m]=smallest-first, [\033[32mn\033[0m]=alphabetical, [\033[32mfs\033[0m]=force-s, [\033[32mfn\033[0m]=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
ap2.add_argument("--write-uplog", action="store_true", help="write POST reports to textfiles in working-directory")
@@ -685,28 +869,52 @@ def add_network(ap):
ap2.add_argument("-i", metavar="IP", type=u, default="::", help="ip to bind (comma-sep.), default: all IPv4 and IPv6")
ap2.add_argument("-p", metavar="PORT", type=u, default="3923", help="ports to bind (comma/range)")
ap2.add_argument("--ll", action="store_true", help="include link-local IPv4/IPv6 even if the NIC has routable IPs (breaks some mdns clients)")
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; [\033[32m0\033[0m]=tcp, [\033[32m1\033[0m]=origin (first x-fwd), [\033[32m2\033[0m]=cloudflare, [\033[32m3\033[0m]=nginx, [\033[32m-1\033[0m]=closest proxy")
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; [\033[32m0\033[0m]=tcp, [\033[32m1\033[0m]=origin (first x-fwd, unsafe), [\033[32m2\033[0m]=outermost-proxy, [\033[32m3\033[0m]=second-proxy, [\033[32m-1\033[0m]=closest-proxy")
ap2.add_argument("--xff-hdr", metavar="NAME", type=u, default="x-forwarded-for", help="if reverse-proxied, which http header to read the client's real ip from (argument must be lowercase, but not the actual header)")
ap2.add_argument("--xff-src", metavar="IP", type=u, default="127., ::1", help="comma-separated list of trusted reverse-proxy IPs; only accept the real-ip header (--xff-hdr) if the incoming connection is from an IP starting with either of these. Can be disabled with [\033[32many\033[0m] if you are behind cloudflare (or similar) and are using --xff-hdr=cf-connecting-ip (or similar)")
ap2.add_argument("--rp-loc", metavar="PATH", type=u, default="", help="if reverse-proxying on a location instead of a dedicated domain/subdomain, provide the base location here (eg. /foo/bar)")
if ANYWIN:
ap2.add_argument("--reuseaddr", action="store_true", help="set reuseaddr on listening sockets on windows; allows rapid restart of copyparty at the expense of being able to accidentally start multiple instances")
else:
ap2.add_argument("--freebind", action="store_true", help="allow listening on IPs which do not yet exist, for example if the network interfaces haven't finished going up. Only makes sense for IPs other than '0.0.0.0', '127.0.0.1', '::', and '::1'. May require running as root (unless net.ipv6.ip_nonlocal_bind)")
ap2.add_argument("--s-thead", metavar="SEC", type=int, default=120, help="socket timeout (read request header)")
ap2.add_argument("--s-tbody", metavar="SEC", type=float, default=186, help="socket timeout (read/write request/response bodies). Use 60 on fast servers (default is extremely safe). Disable with 0 if reverse-proxied for a 2%% speed boost")
ap2.add_argument("--s-wr-sz", metavar="B", type=int, default=256*1024, help="socket write size in bytes")
ap2.add_argument("--s-wr-slp", metavar="SEC", type=float, default=0, help="debug: socket write delay in seconds")
ap2.add_argument("--rsp-slp", metavar="SEC", type=float, default=0, help="debug: response delay in seconds")
ap2.add_argument("--rsp-jtr", metavar="SEC", type=float, default=0, help="debug: response delay, random duration 0..SEC")
def add_tls(ap):
def add_tls(ap, cert_path):
ap2 = ap.add_argument_group('SSL/TLS options')
ap2.add_argument("--http-only", action="store_true", help="disable ssl/tls -- force plaintext")
ap2.add_argument("--https-only", action="store_true", help="disable plaintext -- force tls")
ap2.add_argument("--cert", metavar="PATH", type=u, default=cert_path, help="path to TLS certificate")
ap2.add_argument("--ssl-ver", metavar="LIST", type=u, help="set allowed ssl/tls versions; [\033[32mhelp\033[0m] shows available versions; default is what your python version considers safe")
ap2.add_argument("--ciphers", metavar="LIST", type=u, help="set allowed ssl/tls ciphers; [\033[32mhelp\033[0m] shows available ciphers")
ap2.add_argument("--ssl-dbg", action="store_true", help="dump some tls info")
ap2.add_argument("--ssl-log", metavar="PATH", type=u, help="log master secrets for later decryption in wireshark")
def add_cert(ap, cert_path):
cert_dir = os.path.dirname(cert_path)
ap2 = ap.add_argument_group('TLS certificate generator options')
ap2.add_argument("--no-crt", action="store_true", help="disable automatic certificate creation")
ap2.add_argument("--crt-ns", metavar="N,N", type=u, default="", help="comma-separated list of FQDNs (domains) to add into the certificate")
ap2.add_argument("--crt-exact", action="store_true", help="do not add wildcard entries for each --crt-ns")
ap2.add_argument("--crt-noip", action="store_true", help="do not add autodetected IP addresses into cert")
ap2.add_argument("--crt-nolo", action="store_true", help="do not add 127.0.0.1 / localhost into cert")
ap2.add_argument("--crt-nohn", action="store_true", help="do not add mDNS names / hostname into cert")
ap2.add_argument("--crt-dir", metavar="PATH", default=cert_dir, help="where to save the CA cert")
ap2.add_argument("--crt-cdays", metavar="D", type=float, default=3650, help="ca-certificate expiration time in days")
ap2.add_argument("--crt-sdays", metavar="D", type=float, default=365, help="server-cert expiration time in days")
ap2.add_argument("--crt-cn", metavar="TXT", type=u, default="partyco", help="CA/server-cert common-name")
ap2.add_argument("--crt-cnc", metavar="TXT", type=u, default="--crt-cn", help="override CA name")
ap2.add_argument("--crt-cns", metavar="TXT", type=u, default="--crt-cn cpp", help="override server-cert name")
ap2.add_argument("--crt-back", metavar="HRS", type=float, default=72, help="backdate in hours")
ap2.add_argument("--crt-alg", metavar="S-N", type=u, default="ecdsa-256", help="algorithm and keysize; one of these: ecdsa-256 rsa-4096 rsa-2048")
def add_zeroconf(ap):
ap2 = ap.add_argument_group("Zeroconf options")
ap2.add_argument("-z", action="store_true", help="enable all zeroconf backends (mdns, ssdp)")
@@ -718,7 +926,7 @@ def add_zeroconf(ap):
def add_zc_mdns(ap):
ap2 = ap.add_argument_group("Zeroconf-mDNS options")
ap2 = ap.add_argument_group("Zeroconf-mDNS options; also see --help-zm")
ap2.add_argument("--zm", action="store_true", help="announce the enabled protocols over mDNS (multicast DNS-SD) -- compatible with KDE, gnome, macOS, ...")
ap2.add_argument("--zm-on", metavar="NETS", type=u, default="", help="enable zeroconf ONLY on the comma-separated list of subnets and/or interface names/indexes")
ap2.add_argument("--zm-off", metavar="NETS", type=u, default="", help="disable zeroconf on the comma-separated list of subnets and/or interface names/indexes")
@@ -732,8 +940,9 @@ def add_zc_mdns(ap):
ap2.add_argument("--zm-lf", metavar="PATH", type=u, default="", help="link a specific folder for ftp shares")
ap2.add_argument("--zm-ls", metavar="PATH", type=u, default="", help="link a specific folder for smb shares")
ap2.add_argument("--zm-mnic", action="store_true", help="merge NICs which share subnets; assume that same subnet means same network")
ap2.add_argument("--zm-msub", action="store_true", help="merge subnets on each NIC -- always enabled for ipv6 -- reduces network load, but gnome-gvfs clients may stop working")
ap2.add_argument("--zm-msub", action="store_true", help="merge subnets on each NIC -- always enabled for ipv6 -- reduces network load, but gnome-gvfs clients may stop working, and clients cannot be in subnets that the server is not")
ap2.add_argument("--zm-noneg", action="store_true", help="disable NSEC replies -- try this if some clients don't see copyparty")
ap2.add_argument("--zm-spam", metavar="SEC", type=float, default=0, help="send unsolicited announce every SEC; useful if clients have IPs in a subnet which doesn't overlap with the server")
def add_zc_ssdp(ap):
@@ -751,6 +960,7 @@ def add_ftp(ap):
ap2.add_argument("--ftp", metavar="PORT", type=int, help="enable FTP server on PORT, for example \033[32m3921")
ap2.add_argument("--ftps", metavar="PORT", type=int, help="enable FTPS server on PORT, for example \033[32m3990")
ap2.add_argument("--ftpv", action="store_true", help="verbose")
ap2.add_argument("--ftp4", action="store_true", help="only listen on IPv4")
ap2.add_argument("--ftp-wt", metavar="SEC", type=int, default=7, help="grace period for resuming interrupted uploads (any client can write to any file last-modified more recently than SEC seconds ago)")
ap2.add_argument("--ftp-nat", metavar="ADDR", type=u, help="the NAT address to use for passive connections")
ap2.add_argument("--ftp-pr", metavar="P-P", type=u, help="the range of TCP ports to use for passive connections, for example \033[32m12000-13000")
@@ -758,24 +968,34 @@ def add_ftp(ap):
def add_webdav(ap):
ap2 = ap.add_argument_group('WebDAV options')
ap2.add_argument("--daw", action="store_true", help="enable full write support. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)")
ap2.add_argument("--daw", action="store_true", help="enable full write support, even if client may not be webdav. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)")
ap2.add_argument("--dav-inf", action="store_true", help="allow depth:infinite requests (recursive file listing); extremely server-heavy but required for spec compliance -- luckily few clients rely on this")
ap2.add_argument("--dav-mac", action="store_true", help="disable apple-garbage filter -- allow macos to create junk files (._* and .DS_Store, .Spotlight-*, .fseventsd, .Trashes, .AppleDouble, __MACOS)")
ap2.add_argument("--dav-rt", action="store_true", help="show symlink-destination's lastmodified instead of the link itself; always enabled for recursive listings (volflag=davrt)")
ap2.add_argument("--dav-auth", action="store_true", help="force auth for all folders (required by davfs2 when only some folders are world-readable) (volflag=davauth)")
def add_smb(ap):
ap2 = ap.add_argument_group('SMB/CIFS options')
ap2.add_argument("--smb", action="store_true", help="enable smb (read-only) -- this requires running copyparty as root on linux and macos unless --smb-port is set above 1024 and your OS does port-forwarding from 445 to that.\n\033[1;31mWARNING:\033[0m this protocol is dangerous! Never expose to the internet. Account permissions are coalesced; if one account has write-access to a volume, then all accounts do.")
ap2.add_argument("--smb", action="store_true", help="enable smb (read-only) -- this requires running copyparty as root on linux and macos unless --smb-port is set above 1024 and your OS does port-forwarding from 445 to that.\n\033[1;31mWARNING:\033[0m this protocol is dangerous! Never expose to the internet!")
ap2.add_argument("--smbw", action="store_true", help="enable write support (please dont)")
ap2.add_argument("--smb1", action="store_true", help="disable SMBv2, only enable SMBv1 (CIFS)")
ap2.add_argument("--smb-port", metavar="PORT", type=int, default=445, help="port to listen on -- if you change this value, you must NAT from TCP:445 to this port using iptables or similar")
ap2.add_argument("--smb-nwa-1", action="store_true", help="disable impacket#1433 workaround (truncate directory listings to 64kB)")
ap2.add_argument("--smb-nwa-2", action="store_true", help="disable impacket workaround for filecopy globs")
ap2.add_argument("--smba", action="store_true", help="small performance boost: disable per-account permissions, enables account coalescing instead (if one user has write/delete-access, then everyone does)")
ap2.add_argument("--smbv", action="store_true", help="verbose")
ap2.add_argument("--smbvv", action="store_true", help="verboser")
ap2.add_argument("--smbvvv", action="store_true", help="verbosest")
def add_handlers(ap):
ap2 = ap.add_argument_group('handlers (see --help-handlers)')
ap2.add_argument("--on404", metavar="PY", type=u, action="append", help="handle 404s by executing PY file")
ap2.add_argument("--on403", metavar="PY", type=u, action="append", help="handle 403s by executing PY file")
ap2.add_argument("--hot-handlers", action="store_true", help="reload handlers on each request -- expensive but convenient when hacking on stuff")
def add_hooks(ap):
ap2 = ap.add_argument_group('event hooks (see --help-hooks)')
ap2.add_argument("--xbu", metavar="CMD", type=u, action="append", help="execute CMD before a file upload starts")
@@ -786,6 +1006,16 @@ def add_hooks(ap):
ap2.add_argument("--xbd", metavar="CMD", type=u, action="append", help="execute CMD before a file delete")
ap2.add_argument("--xad", metavar="CMD", type=u, action="append", help="execute CMD after a file delete")
ap2.add_argument("--xm", metavar="CMD", type=u, action="append", help="execute CMD on message")
ap2.add_argument("--xban", metavar="CMD", type=u, action="append", help="execute CMD if someone gets banned (pw/404/403/url)")
def add_stats(ap):
ap2 = ap.add_argument_group('grafana/prometheus metrics endpoint')
ap2.add_argument("--stats", action="store_true", help="enable openmetrics at /.cpr/metrics for admin accounts")
ap2.add_argument("--nos-hdd", action="store_true", help="disable disk-space metrics (used/free space)")
ap2.add_argument("--nos-vol", action="store_true", help="disable volume size metrics (num files, total bytes, vmaxb/vmaxn)")
ap2.add_argument("--nos-dup", action="store_true", help="disable dupe-files metrics (good idea; very slow)")
ap2.add_argument("--nos-unf", action="store_true", help="disable unfinished-uploads metrics")
def add_yolo(ap):
@@ -801,21 +1031,23 @@ def add_optouts(ap):
ap2.add_argument("--no-dav", action="store_true", help="disable webdav support")
ap2.add_argument("--no-del", action="store_true", help="disable delete operations")
ap2.add_argument("--no-mv", action="store_true", help="disable move/rename operations")
ap2.add_argument("-nth", action="store_true", help="no title hostname; don't show --name in <title>")
ap2.add_argument("-nih", action="store_true", help="no info hostname -- don't show in UI")
ap2.add_argument("-nid", action="store_true", help="no info disk-usage -- don't show in UI")
ap2.add_argument("-nb", action="store_true", help="no powered-by-copyparty branding in UI")
ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
ap2.add_argument("--no-lifetime", action="store_true", help="disable automatic deletion of uploads after a certain time (as specified by the 'lifetime' volflag)")
def add_safety(ap, fk_salt):
def add_safety(ap):
ap2 = ap.add_argument_group('safety options')
ap2.add_argument("-s", action="count", default=0, help="increase safety: Disable thumbnails / potentially dangerous software (ffmpeg/pillow/vips), hide partial uploads, avoid crawlers.\n └─Alias of\033[32m --dotpart --no-thumb --no-mtag-ff --no-robots --force-js")
ap2.add_argument("-ss", action="store_true", help="further increase safety: Prevent js-injection, accidental move/delete, broken symlinks, webdav, 404 on 403, ban on excessive 404s.\n └─Alias of\033[32m -s --unpost=0 --no-del --no-mv --hardlink --vague-403 --ban-404=50,60,1440 -nih")
ap2.add_argument("-ss", action="store_true", help="further increase safety: Prevent js-injection, accidental move/delete, broken symlinks, webdav, 404 on 403, ban on excessive 404s.\n └─Alias of\033[32m -s --unpost=0 --no-del --no-mv --hardlink --vague-403 -nih")
ap2.add_argument("-sss", action="store_true", help="further increase safety: Enable logging to disk, scan for dangerous symlinks.\n └─Alias of\033[32m -ss --no-dav --no-logues --no-readme -lo=cpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz --ls=**,*,ln,p,r")
ap2.add_argument("--ls", metavar="U[,V[,F]]", type=u, help="do a sanity/safety check of all volumes on startup; arguments \033[33mUSER\033[0m,\033[33mVOL\033[0m,\033[33mFLAGS\033[0m; example [\033[32m**,*,ln,p,r\033[0m]")
ap2.add_argument("--salt", type=u, default="hunter2", help="up2k file-hash salt; serves no purpose, no reason to change this (but delete all databases if you do)")
ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files -- this one DOES matter")
ap2.add_argument("--xvol", action="store_true", help="never follow symlinks leaving the volume root, unless the link is into another volume where the user has similar access (volflag=xvol)")
ap2.add_argument("--xdev", action="store_true", help="stay within the filesystem of the volume root; do not descend into other devices (symlink or bind-mount to another HDD, ...) (volflag=xdev)")
ap2.add_argument("--no-dot-mv", action="store_true", help="disallow moving dotfiles; makes it impossible to move folders containing dotfiles")
ap2.add_argument("--no-dot-ren", action="store_true", help="disallow renaming dotfiles; makes it impossible to make something a dotfile")
ap2.add_argument("--no-logues", action="store_true", help="disable rendering .prologue/.epilogue.html into directory listings")
@@ -825,13 +1057,28 @@ def add_safety(ap, fk_salt):
ap2.add_argument("--no-robots", action="store_true", help="adds http and html headers asking search engines to not index anything (volflag=norobots)")
ap2.add_argument("--logout", metavar="H", type=float, default="8086", help="logout clients after H hours of inactivity; [\033[32m0.0028\033[0m]=10sec, [\033[32m0.1\033[0m]=6min, [\033[32m24\033[0m]=day, [\033[32m168\033[0m]=week, [\033[32m720\033[0m]=month, [\033[32m8760\033[0m]=year)")
ap2.add_argument("--ban-pw", metavar="N,W,B", type=u, default="9,60,1440", help="more than \033[33mN\033[0m wrong passwords in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes; disable with [\033[32mno\033[0m]")
ap2.add_argument("--ban-404", metavar="N,W,B", type=u, default="no", help="hitting more than \033[33mN\033[0m 404's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes (disabled by default since turbo-up2k counts as 404s)")
ap2.add_argument("--ban-404", metavar="N,W,B", type=u, default="50,60,1440", help="hitting more than \033[33mN\033[0m 404's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes; only affects users who cannot see directory listings because their access is either g/G/h")
ap2.add_argument("--ban-403", metavar="N,W,B", type=u, default="9,2,1440", help="hitting more than \033[33mN\033[0m 403's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes; [\033[32m1440\033[0m]=day, [\033[32m10080\033[0m]=week, [\033[32m43200\033[0m]=month")
ap2.add_argument("--ban-422", metavar="N,W,B", type=u, default="9,2,1440", help="hitting more than \033[33mN\033[0m 422's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes (422 is server fuzzing, invalid POSTs and so)")
ap2.add_argument("--ban-url", metavar="N,W,B", type=u, default="9,2,1440", help="hitting more than \033[33mN\033[0m sus URL's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes; applies only to access g/G/h (decent replacement for --ban-404 if that can't be used)")
ap2.add_argument("--sus-urls", metavar="R", type=u, default=r"\.php$|(^|/)wp-(admin|content|includes)/", help="URLs which are considered sus / eligible for banning; disable with blank or [\033[32mno\033[0m]")
ap2.add_argument("--nonsus-urls", metavar="R", type=u, default=r"^(favicon\.ico|robots\.txt)$|^apple-touch-icon|^\.well-known", help="harmless URLs ignored from 404-bans; disable with blank or [\033[32mno\033[0m]")
ap2.add_argument("--aclose", metavar="MIN", type=int, default=10, help="if a client maxes out the server connection limit, downgrade it from connection:keep-alive to connection:close for MIN minutes (and also kill its active connections) -- disable with 0")
ap2.add_argument("--loris", metavar="B", type=int, default=60, help="if a client maxes out the server connection limit without sending headers, ban it for B minutes; disable with [\033[32m0\033[0m]")
ap2.add_argument("--acao", metavar="V[,V]", type=u, default="*", help="Access-Control-Allow-Origin; list of origins (domains/IPs without port) to accept requests from; [\033[32mhttps://1.2.3.4\033[0m]. Default [\033[32m*\033[0m] allows requests from all sites but removes cookies and http-auth; only ?pw=hunter2 survives")
ap2.add_argument("--acam", metavar="V[,V]", type=u, default="GET,HEAD", help="Access-Control-Allow-Methods; list of methods to accept from offsite ('*' behaves like described in --acao)")
def add_salt(ap, fk_salt, ah_salt):
ap2 = ap.add_argument_group('salting options')
ap2.add_argument("--ah-alg", metavar="ALG", type=u, default="none", help="account-pw hashing algorithm; one of these, best to worst: argon2 scrypt sha2 none (each optionally followed by alg-specific comma-sep. config)")
ap2.add_argument("--ah-salt", metavar="SALT", type=u, default=ah_salt, help="account-pw salt; ignored if --ah-alg is none (default)")
ap2.add_argument("--ah-gen", metavar="PW", type=u, default="", help="generate hashed password for \033[33mPW\033[0m, or read passwords from STDIN if \033[33mPW\033[0m is [\033[32m-\033[0m]")
ap2.add_argument("--ah-cli", action="store_true", help="interactive shell which hashes passwords without ever storing or displaying the original passwords")
ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files")
ap2.add_argument("--warksalt", metavar="SALT", type=u, default="hunter2", help="up2k file-hash salt; serves no purpose, no reason to change this (but delete all databases if you do)")
def add_shutdown(ap):
ap2 = ap.add_argument_group('shutdown options')
ap2.add_argument("--ign-ebind", action="store_true", help="continue running even if it's impossible to listen on some of the requested endpoints")
@@ -843,7 +1090,11 @@ def add_logging(ap):
ap2 = ap.add_argument_group('logging options')
ap2.add_argument("-q", action="store_true", help="quiet")
ap2.add_argument("-lo", metavar="PATH", type=u, help="logfile, example: \033[32mcpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz")
ap2.add_argument("--no-ansi", action="store_true", default=not VT100, help="disable colors; same as environment-variable NO_COLOR")
ap2.add_argument("--ansi", action="store_true", help="force colors; overrides environment-variable NO_COLOR")
ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
ap2.add_argument("--log-tdec", metavar="N", type=int, default=3, help="timestamp resolution / number of timestamp decimals")
ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log passphrase of failed login attempts: 0=terse, 1=plaintext, 2=hashed")
ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
ap2.add_argument("--log-htp", action="store_true", help="debug: print http-server threadpool scaling")
ap2.add_argument("--ihead", metavar="HEADER", type=u, action='append', help="dump incoming header")
@@ -862,10 +1113,10 @@ def add_thumbnail(ap):
ap2.add_argument("--no-thumb", action="store_true", help="disable all thumbnails (volflag=dthumb)")
ap2.add_argument("--no-vthumb", action="store_true", help="disable video thumbnails (volflag=dvthumb)")
ap2.add_argument("--no-athumb", action="store_true", help="disable audio thumbnails (spectrograms) (volflag=dathumb)")
ap2.add_argument("--th-size", metavar="WxH", default="320x256", help="thumbnail res")
ap2.add_argument("--th-size", metavar="WxH", default="320x256", help="thumbnail res (volflag=thsize)")
ap2.add_argument("--th-mt", metavar="CORES", type=int, default=CORES, help="num cpu cores to use for generating thumbnails")
ap2.add_argument("--th-convt", metavar="SEC", type=int, default=60, help="conversion timeout in seconds")
ap2.add_argument("--th-no-crop", action="store_true", help="dynamic height; show full image")
ap2.add_argument("--th-convt", metavar="SEC", type=float, default=60, help="conversion timeout in seconds (volflag=convt)")
ap2.add_argument("--th-no-crop", action="store_true", help="dynamic height; show full image by default (volflag=nocrop)")
ap2.add_argument("--th-dec", metavar="LIBS", default="vips,pil,ff", help="image decoders, in order of preference")
ap2.add_argument("--th-no-jpg", action="store_true", help="disable jpg output")
ap2.add_argument("--th-no-webp", action="store_true", help="disable webp output")
@@ -874,13 +1125,13 @@ def add_thumbnail(ap):
ap2.add_argument("--th-poke", metavar="SEC", type=int, default=300, help="activity labeling cooldown -- avoids doing keepalive pokes (updating the mtime) on thumbnail folders more often than SEC seconds")
ap2.add_argument("--th-clean", metavar="SEC", type=int, default=43200, help="cleanup interval; 0=disabled")
ap2.add_argument("--th-maxage", metavar="SEC", type=int, default=604800, help="max folder age -- folders which haven't been poked for longer than --th-poke seconds will get deleted every --th-clean seconds")
ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for; case-insensitive if -e2d")
ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for; enabling -e2d will make these case-insensitive, and also automatically select thumbnails for all folders that contain pics, even if none match this pattern")
# https://pillow.readthedocs.io/en/stable/handbook/image-file-formats.html
# https://github.com/libvips/libvips
# ffmpeg -hide_banner -demuxers | awk '/^ D /{print$2}' | while IFS= read -r x; do ffmpeg -hide_banner -h demuxer=$x; done | grep -E '^Demuxer |extensions:'
ap2.add_argument("--th-r-pil", metavar="T,T", type=u, default="avif,avifs,blp,bmp,dcx,dds,dib,emf,eps,fits,flc,fli,fpx,gif,heic,heics,heif,heifs,icns,ico,im,j2p,j2k,jp2,jpeg,jpg,jpx,pbm,pcx,pgm,png,pnm,ppm,psd,sgi,spi,tga,tif,tiff,webp,wmf,xbm,xpm", help="image formats to decode using pillow")
ap2.add_argument("--th-r-pil", metavar="T,T", type=u, default="avif,avifs,blp,bmp,dcx,dds,dib,emf,eps,fits,flc,fli,fpx,gif,heic,heics,heif,heifs,icns,ico,im,j2p,j2k,jp2,jpeg,jpg,jpx,pbm,pcx,pgm,png,pnm,ppm,psd,qoi,sgi,spi,tga,tif,tiff,webp,wmf,xbm,xpm", help="image formats to decode using pillow")
ap2.add_argument("--th-r-vips", metavar="T,T", type=u, default="avif,exr,fit,fits,fts,gif,hdr,heic,jp2,jpeg,jpg,jpx,jxl,nii,pfm,pgm,png,ppm,svg,tif,tiff,webp", help="image formats to decode using pyvips")
ap2.add_argument("--th-r-ffi", metavar="T,T", type=u, default="apng,avif,avifs,bmp,dds,dib,fit,fits,fts,gif,hdr,heic,heics,heif,heifs,icns,ico,jp2,jpeg,jpg,jpx,jxl,pbm,pcx,pfm,pgm,png,pnm,ppm,psd,sgi,tga,tif,tiff,webp,xbm,xpm", help="image formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffi", metavar="T,T", type=u, default="apng,avif,avifs,bmp,dds,dib,fit,fits,fts,gif,hdr,heic,heics,heif,heifs,icns,ico,jp2,jpeg,jpg,jpx,jxl,pbm,pcx,pfm,pgm,png,pnm,ppm,psd,qoi,sgi,tga,tif,tiff,webp,xbm,xpm", help="image formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffv", metavar="T,T", type=u, default="3gp,asf,av1,avc,avi,flv,h264,h265,hevc,m4v,mjpeg,mjpg,mkv,mov,mp4,mpeg,mpeg2,mpegts,mpg,mpg2,mts,nut,ogm,ogv,rm,ts,vob,webm,wmv", help="video formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,m4a,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3m,tak,tta,ulaw,wav,wma,wv,xm,xpk", help="audio formats to decode using ffmpeg")
@@ -888,6 +1139,7 @@ def add_thumbnail(ap):
def add_transcoding(ap):
ap2 = ap.add_argument_group('transcoding options')
ap2.add_argument("--no-acode", action="store_true", help="disable audio transcoding")
ap2.add_argument("--no-bacode", action="store_true", help="disable batch audio transcoding by folder download (zip/tar)")
ap2.add_argument("--ac-maxage", metavar="SEC", type=int, default=86400, help="delete cached transcode output after SEC seconds")
@@ -900,15 +1152,13 @@ def add_db_general(ap, hcores):
ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash")
ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume data (db, thumbs) (volflag=hist)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice (volflag=noforget)")
ap2.add_argument("--dbd", metavar="PROFILE", default="wal", help="database durability profile; sets the tradeoff between robustness and speed, see --help-dbd (volflag=dbd)")
ap2.add_argument("--xlink", action="store_true", help="on upload: check all volumes for dupes, not just the target volume (volflag=xlink)")
ap2.add_argument("--xdev", action="store_true", help="do not descend into other filesystems (symlink or bind-mount to another HDD, ...) (volflag=xdev)")
ap2.add_argument("--xvol", action="store_true", help="skip symlinks leaving the volume root (volflag=xvol)")
ap2.add_argument("--hash-mt", metavar="CORES", type=int, default=hcores, help="num cpu cores to use for file hashing; set 0 or 1 for single-core hashing")
ap2.add_argument("--re-maxage", metavar="SEC", type=int, default=0, help="disk rescan volume interval, 0=off (volflag=scan)")
ap2.add_argument("--db-act", metavar="SEC", type=float, default=10, help="defer any scheduled volume reindexing until SEC seconds after last db write (uploads, renames, ...)")
@@ -929,28 +1179,40 @@ def add_db_metadata(ap):
ap2.add_argument("--mtag-v", action="store_true", help="verbose tag scanning; print errors from mtp subprocesses and such")
ap2.add_argument("--mtag-vv", action="store_true", help="debug mtp settings and mutagen/ffprobe parsers")
ap2.add_argument("-mtm", metavar="M=t,t,t", type=u, action="append", help="add/replace metadata mapping")
ap2.add_argument("-mte", metavar="M,M,M", type=u, help="tags to index/display (comma-sep.)",
default="circle,album,.tn,artist,title,.bpm,key,.dur,.q,.vq,.aq,vc,ac,fmt,res,.fps,ahash,vhash")
ap2.add_argument("-mth", metavar="M,M,M", type=u, help="tags to hide by default (comma-sep.)",
default=".vq,.aq,vc,ac,fmt,res,.fps")
ap2.add_argument("-mte", metavar="M,M,M", type=u, help="tags to index/display (comma-sep.); either an entire replacement list, or add/remove stuff on the default-list with +foo or /bar", default=DEF_MTE)
ap2.add_argument("-mth", metavar="M,M,M", type=u, help="tags to hide by default (comma-sep.); assign/add/remove same as -mte", default=DEF_MTH)
ap2.add_argument("-mtp", metavar="M=[f,]BIN", type=u, action="append", help="read tag M using program BIN to parse the file")
def add_txt(ap):
ap2 = ap.add_argument_group('textfile options')
ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="textfile editor checks for serverside changes every SEC seconds")
ap2.add_argument("-emp", action="store_true", help="enable markdown plugins -- neat but dangerous, big XSS risk")
ap2.add_argument("--exp", action="store_true", help="enable textfile expansion -- replace {{self.ip}} and such; see --help-exp (volflag=exp)")
ap2.add_argument("--exp-md", metavar="V,V,V", type=u, default=DEF_EXP, help="comma/space-separated list of placeholders to expand in markdown files; add/remove stuff on the default list with +hdr_foo or /vf.scan (volflag=exp_md)")
ap2.add_argument("--exp-lg", metavar="V,V,V", type=u, default=DEF_EXP, help="comma/space-separated list of placeholders to expand in prologue/epilogue files (volflag=exp_lg)")
def add_ui(ap, retry):
ap2 = ap.add_argument_group('ui options')
ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language")
ap2.add_argument("--grid", action="store_true", help="show grid/thumbnails by default (volflag=grid)")
ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language; one of the following: eng nor")
ap2.add_argument("--theme", metavar="NUM", type=int, default=0, help="default theme to use")
ap2.add_argument("--themes", metavar="NUM", type=int, default=8, help="number of themes installed")
ap2.add_argument("--sort", metavar="C,C,C", type=u, default="href", help="default sort order, comma-separated column IDs (see header tooltips), prefix with '-' for descending. Examples: \033[32mhref -href ext sz ts tags/Album tags/.tn\033[0m (volflag=sort)")
ap2.add_argument("--unlist", metavar="REGEX", type=u, default="", help="don't show files matching REGEX in file list. Purely cosmetic! Does not affect API calls, just the browser. Example: [\033[32m\\.(js|css)$\033[0m] (volflag=unlist)")
ap2.add_argument("--favico", metavar="TXT", type=u, default="c 000 none" if retry else "🎉 000 none", help="\033[33mfavicon-text\033[0m [ \033[33mforeground\033[0m [ \033[33mbackground\033[0m ] ], set blank to disable")
ap2.add_argument("--mpmc", metavar="URL", type=u, default="", help="change the mediaplayer-toggle mouse cursor; URL to a folder with {2..5}.png inside (or disable with [\033[32m.\033[0m])")
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI, or add ?v to URL for override)")
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty @ --name", help="title / service-name to show in html documents")
ap2.add_argument("--bname", metavar="TXT", type=u, default="--name", help="server name (displayed in filebrowser document title)")
ap2.add_argument("--pb-url", metavar="URL", type=u, default="https://github.com/9001/copyparty", help="powered-by link; disable with -np")
ap2.add_argument("--ver", action="store_true", help="show version on the control panel (incompatible by -np)")
ap2.add_argument("--ver", action="store_true", help="show version on the control panel (incompatible with -nb)")
ap2.add_argument("--md-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for README.md docs (volflag=md_sbf); see https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe#attr-sandbox")
ap2.add_argument("--lg-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for prologue/epilogue docs (volflag=lg_sbf)")
ap2.add_argument("--no-sb-md", action="store_true", help="don't sandbox README.md documents (volflags: no_sb_md | sb_md)")
@@ -986,12 +1248,15 @@ def run_argparse(
description="http file sharing hub v{} ({})".format(S_VERSION, S_BUILD_DT),
)
try:
fk_salt = unicode(os.path.getmtime(os.path.join(E.cfg, "cert.pem")))
except:
fk_salt = "hunter2"
cert_path = os.path.join(E.cfg, "cert.pem")
hcores = min(CORES, 4) # optimal on py3.11 @ r5-4500U
fk_salt = get_fk_salt(cert_path)
ah_salt = get_ah_salt()
# alpine peaks at 5 threads for some reason,
# all others scale past that (but try to avoid SMT),
# 5 should be plenty anyways (3 GiB/s on most machines)
hcores = min(CORES, 5 if CORES > 8 else 4)
tty = os.environ.get("TERM", "").lower() == "linux"
@@ -999,7 +1264,8 @@ def run_argparse(
add_general(ap, nc, srvname)
add_network(ap)
add_tls(ap)
add_tls(ap, cert_path)
add_cert(ap, cert_path)
add_qr(ap, tty)
add_zeroconf(ap)
add_zc_mdns(ap)
@@ -1012,11 +1278,15 @@ def run_argparse(
add_ftp(ap)
add_webdav(ap)
add_smb(ap)
add_safety(ap, fk_salt)
add_safety(ap)
add_salt(ap, fk_salt, ah_salt)
add_optouts(ap)
add_shutdown(ap)
add_yolo(ap)
add_handlers(ap)
add_hooks(ap)
add_stats(ap)
add_txt(ap)
add_ui(ap, retry)
add_admin(ap)
add_logging(ap)
@@ -1083,8 +1353,8 @@ def main(argv: Optional[list[str]] = None) -> None:
print("pybin: {}\n".format(pybin), end="")
ensure_locale()
if HAVE_SSL:
ensure_cert()
ensure_webdeps()
for k, v in zip(argv[1:], argv[2:]):
if k == "-c" and os.path.isfile(v):
@@ -1097,16 +1367,22 @@ def main(argv: Optional[list[str]] = None) -> None:
supp = args_from_cfg(v)
argv.extend(supp)
deprecated: list[tuple[str, str]] = []
deprecated: list[tuple[str, str]] = [("--salt", "--warksalt")]
for dk, nk in deprecated:
try:
idx = argv.index(dk)
except:
idx = -1
ov = ""
for n, k in enumerate(argv):
if k == dk or k.startswith(dk + "="):
idx = n
if "=" in k:
ov = "=" + k.split("=", 1)[1]
if idx < 0:
continue
msg = "\033[1;31mWARNING:\033[0;1m\n {} \033[0;33mwas replaced with\033[0;1m {} \033[0;33mand will be removed\n\033[0m"
lprint(msg.format(dk, nk))
argv[idx] = nk
argv[idx] = nk + ov
time.sleep(2)
da = len(argv) == 1
@@ -1152,13 +1428,18 @@ def main(argv: Optional[list[str]] = None) -> None:
except:
sys.exit(1)
if WINDOWS and not al.keep_qem:
if al.ansi:
al.no_ansi = False
elif not al.no_ansi:
al.ansi = VT100
if WINDOWS and not al.keep_qem and not al.ah_cli:
try:
disable_quickedit()
except:
lprint("\nfailed to disable quick-edit-mode:\n" + min_ex() + "\n")
if not VT100:
if al.ansi:
al.wintitle = ""
nstrs: list[str] = []
@@ -1177,11 +1458,9 @@ def main(argv: Optional[list[str]] = None) -> None:
if re.match("c[^,]", opt):
mod = True
na.append("c," + opt[1:])
elif re.sub("^[rwmdgG]*", "", opt) and "," not in opt:
elif re.sub("^[rwmdgGha]*", "", opt) and "," not in opt:
mod = True
perm = opt[0]
if perm == "a":
perm = "rw"
na.append(perm + "," + opt[1:])
else:
na.append(opt)
@@ -1237,6 +1516,7 @@ def main(argv: Optional[list[str]] = None) -> None:
configure_ssl_ciphers(al)
else:
warn("ssl module does not exist; cannot enable https")
al.http_only = True
if PY2 and WINDOWS and al.e2d:
warn(


@@ -1,8 +1,8 @@
# coding: utf-8
VERSION = (1, 6, 8)
CODENAME = "cors k"
BUILD_DT = (2023, 3, 12)
VERSION = (1, 9, 15)
CODENAME = "prometheable"
BUILD_DT = (2023, 10, 24)
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)


@@ -12,19 +12,23 @@ import threading
import time
from datetime import datetime
from .__init__ import ANYWIN, TYPE_CHECKING, WINDOWS
from .__init__ import ANYWIN, TYPE_CHECKING, WINDOWS, E
from .bos import bos
from .cfg import flagdescs, permdescs, vf_bmap, vf_cmap, vf_vmap
from .pwhash import PWHash
from .util import (
IMPLICATIONS,
META_NOBOTS,
SQLITE_VER,
UNPLICATIONS,
ODict,
Pebkac,
UTC,
absreal,
afsenc,
get_df,
humansize,
odfusion,
relchk,
statdir,
uncyg,
@@ -40,7 +44,10 @@ if True: # pylint: disable=using-constant-test
from .util import NamedLogger, RootLogger
if TYPE_CHECKING:
pass
from .broker_mp import BrokerMp
from .broker_thr import BrokerThr
from .broker_util import BrokerCli
# Vflags: TypeAlias = dict[str, str | bool | float | list[str]]
# Vflags: TypeAlias = dict[str, Any]
# Mflags: TypeAlias = dict[str, Vflags]
@@ -48,6 +55,11 @@ if TYPE_CHECKING:
LEELOO_DALLAS = "leeloo_dallas"
SEE_LOG = "see log for details"
SSEELOG = " ({})".format(SEE_LOG)
BAD_CFG = "invalid config; {}".format(SEE_LOG)
SBADCFG = " ({})".format(BAD_CFG)
class AXS(object):
def __init__(
@@ -58,6 +70,8 @@ class AXS(object):
udel: Optional[Union[list[str], set[str]]] = None,
uget: Optional[Union[list[str], set[str]]] = None,
upget: Optional[Union[list[str], set[str]]] = None,
uhtml: Optional[Union[list[str], set[str]]] = None,
uadmin: Optional[Union[list[str], set[str]]] = None,
) -> None:
self.uread: set[str] = set(uread or [])
self.uwrite: set[str] = set(uwrite or [])
@@ -65,14 +79,12 @@ class AXS(object):
self.udel: set[str] = set(udel or [])
self.uget: set[str] = set(uget or [])
self.upget: set[str] = set(upget or [])
self.uhtml: set[str] = set(uhtml or [])
self.uadmin: set[str] = set(uadmin or [])
def __repr__(self) -> str:
return "AXS({})".format(
", ".join(
"{}={!r}".format(k, self.__dict__[k])
for k in "uread uwrite umove udel uget upget".split()
)
)
ks = "uread uwrite umove udel uget upget uhtml uadmin".split()
return "AXS(%s)" % (", ".join("%s=%r" % (k, self.__dict__[k]) for k in ks),)
class Lim(object):
@@ -90,6 +102,8 @@ class Lim(object):
self.dfl = 0 # free disk space limit
self.dft = 0 # last-measured time
self.dfv = 0 # currently free
self.vbmax = 0 # volume bytes max
self.vnmax = 0 # volume max num files
self.smin = 0 # filesize min
self.smax = 0 # filesize max
@@ -119,8 +133,11 @@ class Lim(object):
ip: str,
rem: str,
sz: int,
ptop: str,
abspath: str,
broker: Optional[Union["BrokerCli", "BrokerMp", "BrokerThr"]] = None,
reg: Optional[dict[str, dict[str, Any]]] = None,
volgetter: str = "up2k.get_volsize",
) -> tuple[str, str]:
if reg is not None and self.reg is None:
self.reg = reg
@@ -131,6 +148,7 @@ class Lim(object):
self.chk_rem(rem)
if sz != -1:
self.chk_sz(sz)
self.chk_vsz(broker, ptop, sz, volgetter)
self.chk_df(abspath, sz) # side effects; keep last-ish
ap2, vp2 = self.rot(abspath)
@@ -146,6 +164,25 @@ class Lim(object):
if self.smax and sz > self.smax:
raise Pebkac(400, "file too big")
def chk_vsz(
self,
broker: Optional[Union["BrokerCli", "BrokerMp", "BrokerThr"]],
ptop: str,
sz: int,
volgetter: str = "up2k.get_volsize",
) -> None:
if not broker or not self.vbmax + self.vnmax:
return
x = broker.ask(volgetter, ptop)
nbytes, nfiles = x.get()
if self.vbmax and self.vbmax < nbytes + sz:
raise Pebkac(400, "volume has exceeded max size")
if self.vnmax and self.vnmax < nfiles + 1:
raise Pebkac(400, "volume has exceeded max num.files")
def chk_df(self, abspath: str, sz: int, already_written: bool = False) -> None:
if not self.dfl:
return
@@ -179,7 +216,7 @@ class Lim(object):
if self.rot_re.search(path.replace("\\", "/")):
return path, ""
suf = datetime.utcnow().strftime(self.rotf)
suf = datetime.now(UTC).strftime(self.rotf)
if path:
path += "/"
@@ -266,7 +303,7 @@ class Lim(object):
self.bupc[ip] = mark
if mark >= self.bmax:
raise Pebkac(429, "ingress saturated")
raise Pebkac(429, "upload size limit exceeded")
class VFS(object):
@@ -285,6 +322,8 @@ class VFS(object):
self.vpath = vpath # absolute path in the virtual filesystem
self.axs = axs
self.flags = flags # config options
self.root = self
self.dev = 0 # st_dev
self.nodes: dict[str, VFS] = {} # child nodes
self.histtab: dict[str, str] = {} # all realpath->histpath
self.dbv: Optional[VFS] = None # closest full/non-jump parent
@@ -295,28 +334,46 @@ class VFS(object):
self.adel: dict[str, list[str]] = {}
self.aget: dict[str, list[str]] = {}
self.apget: dict[str, list[str]] = {}
self.ahtml: dict[str, list[str]] = {}
self.aadmin: dict[str, list[str]] = {}
if realpath:
rp = realpath + ("" if realpath.endswith(os.sep) else os.sep)
vp = vpath + ("/" if vpath else "")
self.histpath = os.path.join(realpath, ".hist") # db / thumbcache
self.all_vols = {vpath: self} # flattened recursive
self.all_aps = [(rp, self)]
self.all_vps = [(vp, self)]
else:
self.histpath = ""
self.all_vols = {}
self.all_aps = []
self.all_vps = []
def __repr__(self) -> str:
return "VFS({})".format(
return "VFS(%s)" % (
", ".join(
"{}={!r}".format(k, self.__dict__[k])
"%s=%r" % (k, self.__dict__[k])
for k in "realpath vpath axs flags".split()
)
)
def get_all_vols(self, outdict: dict[str, "VFS"]) -> None:
def get_all_vols(
self,
vols: dict[str, "VFS"],
aps: list[tuple[str, "VFS"]],
vps: list[tuple[str, "VFS"]],
) -> None:
if self.realpath:
outdict[self.vpath] = self
vols[self.vpath] = self
rp = self.realpath
rp += "" if rp.endswith(os.sep) else os.sep
vp = self.vpath + ("/" if self.vpath else "")
aps.append((rp, self))
vps.append((vp, self))
for v in self.nodes.values():
v.get_all_vols(outdict)
v.get_all_vols(vols, aps, vps)
def add(self, src: str, dst: str) -> "VFS":
"""get existing, or add new path to the vfs"""
@@ -356,7 +413,8 @@ class VFS(object):
flags = {k: v for k, v in self.flags.items()}
hist = flags.get("hist")
if hist and hist != "-":
flags["hist"] = "{}/{}".format(hist.rstrip("/"), name)
zs = "{}/{}".format(hist.rstrip("/"), name)
flags["hist"] = os.path.expanduser(zs) if zs.startswith("~") else zs
return flags
@@ -387,9 +445,13 @@ class VFS(object):
def can_access(
self, vpath: str, uname: str
) -> tuple[bool, bool, bool, bool, bool, bool]:
"""can Read,Write,Move,Delete,Get,Upget"""
vn, _ = self._find(undot(vpath))
) -> tuple[bool, bool, bool, bool, bool, bool, bool]:
"""can Read,Write,Move,Delete,Get,Upget,Admin"""
if vpath:
vn, _ = self._find(undot(vpath))
else:
vn = self
c = vn.axs
return (
uname in c.uread or "*" in c.uread,
@@ -398,7 +460,9 @@ class VFS(object):
uname in c.udel or "*" in c.udel,
uname in c.uget or "*" in c.uget,
uname in c.upget or "*" in c.upget,
uname in c.uadmin or "*" in c.uadmin,
)
# skip uhtml because it's rarely needed
def get(
self,
@@ -419,7 +483,8 @@ class VFS(object):
self.log("vfs", "invalid relpath [{}]".format(vpath))
raise Pebkac(404)
vn, rem = self._find(undot(vpath))
cvpath = undot(vpath)
vn, rem = self._find(cvpath)
c: AXS = vn.axs
for req, d, msg in [
@@ -430,6 +495,11 @@ class VFS(object):
(will_get, c.uget, "get"),
]:
if req and (uname not in d and "*" not in d) and uname != LEELOO_DALLAS:
if vpath != cvpath and vpath != "." and self.log:
ap = vn.canonical(rem)
t = "{} has no {} in [{}] => [{}] => [{}]"
self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)
t = "you don't have {}-access for this location"
raise Pebkac(err, t.format(msg))
@@ -544,9 +614,20 @@ class VFS(object):
self.log("vfs.walk", t.format(seen[-1], fsroot, self.vpath, rem), 3)
return
if "xdev" in self.flags or "xvol" in self.flags:
rm1 = []
for le in vfs_ls:
ap = absreal(os.path.join(fsroot, le[0]))
vn2 = self.chk_ap(ap)
if not vn2 or not vn2.get("", uname, True, False):
rm1.append(le)
_ = [vfs_ls.remove(x) for x in rm1] # type: ignore
seen = seen[:] + [fsroot]
rfiles = [x for x in vfs_ls if not stat.S_ISDIR(x[1].st_mode)]
rdirs = [x for x in vfs_ls if stat.S_ISDIR(x[1].st_mode)]
# if lstat: ignore folder symlinks since copyparty will never make those
# (and we definitely don't want to descend into them)
rfiles.sort()
rdirs.sort()
@@ -577,6 +658,7 @@ class VFS(object):
def zipgen(
self,
vpath: str,
vrem: str,
flt: set[str],
uname: str,
@@ -588,7 +670,7 @@ class VFS(object):
# if multiselect: add all items to archive root
# if single folder: the folder itself is the top-level item
folder = "" if flt or not wrap else (vrem.split("/")[-1].lstrip(".") or "top")
folder = "" if flt or not wrap else (vpath.split("/")[-1].lstrip(".") or "top")
g = self.walk(folder, vrem, [], uname, [[True, False]], dots, scandir, False)
for _, _, vpath, apath, files, rd, vd in g:
@@ -639,6 +721,44 @@ class VFS(object):
for d in [{"vp": v, "ap": a, "st": n} for v, a, n in ret2]:
yield d
def chk_ap(self, ap: str, st: Optional[os.stat_result] = None) -> Optional["VFS"]:
aps = ap + os.sep
if "xdev" in self.flags and not ANYWIN:
if not st:
ap2 = ap.replace("\\", "/") if ANYWIN else ap
while ap2:
try:
st = bos.stat(ap2)
break
except:
if "/" not in ap2:
raise
ap2 = ap2.rsplit("/", 1)[0]
assert st
vdev = self.dev
if not vdev:
vdev = self.dev = bos.stat(self.realpath).st_dev
if vdev != st.st_dev:
if self.log:
t = "xdev: {}[{}] => {}[{}]"
self.log("vfs", t.format(vdev, self.realpath, st.st_dev, ap), 3)
return None
if "xvol" in self.flags:
for vap, vn in self.root.all_aps:
if aps.startswith(vap):
return vn
if self.log:
self.log("vfs", "xvol: [{}]".format(ap), 3)
return None
return self
if WINDOWS:
re_vol = re.compile(r"^([a-zA-Z]:[\\/][^:]*|[^:]*):([^:]*):(.*)$")
@@ -656,6 +776,7 @@ class AuthSrv(object):
warn_anonwrite: bool = True,
dargs: Optional[argparse.Namespace] = None,
) -> None:
self.ah = PWHash(args)
self.args = args
self.dargs = dargs or args
self.log_func = log_func
@@ -692,7 +813,7 @@ class AuthSrv(object):
if dst in mount:
t = "multiple filesystem-paths mounted at [/{}]:\n [{}]\n [{}]"
self.log(t.format(dst, mount[dst], src), c=1)
raise Exception("invalid config")
raise Exception(BAD_CFG)
if src in mount.values():
t = "filesystem-path [{}] mounted in multiple locations:"
@@ -701,7 +822,7 @@ class AuthSrv(object):
t += "\n /{}".format(v)
self.log(t, c=3)
raise Exception("invalid config")
raise Exception(BAD_CFG)
if not bos.path.isdir(src):
self.log("warning: filesystem-path does not exist: {}".format(src), 3)
@@ -768,6 +889,9 @@ class AuthSrv(object):
if not ln.split("#")[0].strip():
continue
if re.match(r"^\[.*\]:$", ln):
ln = ln[:-1]
subsection = ln in (catx, catf)
if ln.startswith("[") or subsection:
self._e()
@@ -797,7 +921,7 @@ class AuthSrv(object):
t = "volume-specific config (anything from --help-flags)"
self._l(ln, 6, t)
else:
raise Exception("invalid section header")
raise Exception("invalid section header" + SBADCFG)
self.indent = " " if subsection else " "
continue
@@ -820,7 +944,7 @@ class AuthSrv(object):
acct[u] = p
except:
t = 'lines inside the [accounts] section must be "username: password"'
raise Exception(t)
raise Exception(t + SBADCFG)
continue
if vp is not None and ap is None:
@@ -838,7 +962,7 @@ class AuthSrv(object):
try:
self._l(ln, 5, "volume access config:")
sk, sv = ln.split(":")
if re.sub("[rwmdgG]", "", sk) or not sk:
if re.sub("[rwmdgGha]", "", sk) or not sk:
err = "invalid accs permissions list; "
raise Exception(err)
if " " in re.sub(", *", "", sv).strip():
@@ -847,8 +971,8 @@ class AuthSrv(object):
self._read_vol_str(sk, sv.replace(" ", ""), daxs[vp], mflags[vp])
continue
except:
err += "accs entries must be 'rwmdgG: user1, user2, ...'"
raise Exception(err)
err += "accs entries must be 'rwmdgGha: user1, user2, ...'"
raise Exception(err + SBADCFG)
if cat == catf:
err = ""
@@ -857,11 +981,11 @@ class AuthSrv(object):
zd = split_cfg_ln(ln)
fstr = ""
for sk, sv in zd.items():
bad = re.sub(r"[a-z0-9_]", "", sk)
bad = re.sub(r"[a-z0-9_-]", "", sk).lstrip("-")
if bad:
err = "bad characters [{}] in volflag name [{}]; "
err = err.format(bad, sk)
raise Exception(err)
raise Exception(err + SBADCFG)
if sv is True:
fstr += "," + sk
else:
@@ -873,9 +997,9 @@ class AuthSrv(object):
continue
except:
err += "flags entries (volflags) must be one of the following:\n 'flag1, flag2, ...'\n 'key: value'\n 'flag1, flag2, key: value'"
raise Exception(err)
raise Exception(err + SBADCFG)
raise Exception("unprocessable line in config")
raise Exception("unprocessable line in config" + SBADCFG)
self._e()
self.line_ctr = 0
@@ -883,7 +1007,7 @@ class AuthSrv(object):
def _read_vol_str(
self, lvl: str, uname: str, axs: AXS, flags: dict[str, Any]
) -> None:
if lvl.strip("crwmdgG"):
if lvl.strip("crwmdgGha"):
raise Exception("invalid volflag: {},{}".format(lvl, uname))
if lvl == "c":
@@ -912,6 +1036,9 @@ class AuthSrv(object):
("w", axs.uwrite),
("m", axs.umove),
("d", axs.udel),
("a", axs.uadmin),
("h", axs.uhtml),
("h", axs.uget),
("g", axs.uget),
("G", axs.uget),
("G", axs.upget),
@@ -933,8 +1060,16 @@ class AuthSrv(object):
value: Union[str, bool, list[str]],
is_list: bool,
) -> None:
desc = flagdescs.get(name, "?").replace("\n", " ")
if name not in "mtp xbu xau xiu xbr xar xbd xad xm".split():
desc = flagdescs.get(name.lstrip("-"), "?").replace("\n", " ")
if re.match("^-[^-]+$", name):
t = "└─unset volflag [{}] ({})"
self._e(t.format(name[1:], desc))
flags[name] = True
return
zs = "mtp on403 on404 xbu xau xiu xbr xar xbd xad xm xban"
if name not in zs.split():
if value is True:
t = "└─add volflag [{}] = {} ({})"
else:
@@ -979,7 +1114,7 @@ class AuthSrv(object):
if self.args.v:
# list of src:dst:permset:permset:...
# permset is <rwmdgG>[,username][,username] or <c>,<flag>[=args]
# permset is <rwmdgGha>[,username][,username] or <c>,<flag>[=args]
for v_str in self.args.v:
m = re_vol.match(v_str)
if not m:
@@ -1023,6 +1158,8 @@ class AuthSrv(object):
self.log("\n{0}\n{1}{0}".format(t, "\n".join(slns)))
raise
self.setup_pwhash(acct)
# case-insensitive; normalize
if WINDOWS:
cased = {}
@@ -1058,9 +1195,15 @@ class AuthSrv(object):
assert vfs
vfs.all_vols = {}
vfs.get_all_vols(vfs.all_vols)
vfs.all_aps = []
vfs.all_vps = []
vfs.get_all_vols(vfs.all_vols, vfs.all_aps, vfs.all_vps)
for vol in vfs.all_vols.values():
vol.all_aps.sort(key=lambda x: len(x[0]), reverse=True)
vol.all_vps.sort(key=lambda x: len(x[0]), reverse=True)
vol.root = vfs
for perm in "read write move del get pget".split():
for perm in "read write move del get pget html admin".split():
axs_key = "u" + perm
unames = ["*"] + list(acct.keys())
umap: dict[str, list[str]] = {x: [] for x in unames}
@@ -1075,7 +1218,16 @@ class AuthSrv(object):
all_users = {}
missing_users = {}
for axs in daxs.values():
for d in [axs.uread, axs.uwrite, axs.umove, axs.udel, axs.uget, axs.upget]:
for d in [
axs.uread,
axs.uwrite,
axs.umove,
axs.udel,
axs.uget,
axs.upget,
axs.uhtml,
axs.uadmin,
]:
for usr in d:
all_users[usr] = 1
if usr != "*" and usr not in acct:
@@ -1087,11 +1239,19 @@ class AuthSrv(object):
+ ", ".join(k for k in sorted(missing_users)),
c=1,
)
raise Exception("invalid config")
raise Exception(BAD_CFG)
if LEELOO_DALLAS in all_users:
raise Exception("sorry, reserved username: " + LEELOO_DALLAS)
seenpwds = {}
for usr, pwd in acct.items():
if pwd in seenpwds:
t = "accounts [{}] and [{}] have the same password; this is not supported"
self.log(t.format(seenpwds[pwd], usr), 1)
raise Exception(BAD_CFG)
seenpwds[pwd] = usr
promote = []
demote = []
for vol in vfs.all_vols.values():
@@ -1101,6 +1261,9 @@ class AuthSrv(object):
if vflag == "-":
pass
elif vflag:
if vflag.startswith("~"):
vflag = os.path.expanduser(vflag)
vol.histpath = uncyg(vflag) if WINDOWS else vflag
elif self.args.hist:
for nch in range(len(hid)):
@@ -1190,6 +1353,16 @@ class AuthSrv(object):
use = True
lim.bmax, lim.bwin = [unhumanize(x) for x in zs.split(",")]
zs = vol.flags.get("vmaxb")
if zs:
use = True
lim.vbmax = unhumanize(zs)
zs = vol.flags.get("vmaxn")
if zs:
use = True
lim.vnmax = unhumanize(zs)
if use:
vol.lim = lim
@@ -1220,6 +1393,9 @@ class AuthSrv(object):
have_fk = False
for vol in vfs.all_vols.values():
fk = vol.flags.get("fk")
fka = vol.flags.get("fka")
if fka and not fk:
fk = fka
if fk:
vol.flags["fk"] = int(fk) if fk is not True else 8
have_fk = True
@@ -1227,6 +1403,12 @@ class AuthSrv(object):
if have_fk and re.match(r"^[0-9\.]+$", self.args.fk_salt):
self.log("filekey salt: {}".format(self.args.fk_salt))
fk_len = len(self.args.fk_salt)
if have_fk and fk_len < 14:
t = "WARNING: filekeys are enabled, but the salt is only %d chars long; %d or longer is recommended. Either specify a stronger salt using --fk-salt or delete this file and restart copyparty: %s"
zs = os.path.join(E.cfg, "fk-salt.txt")
self.log(t % (fk_len, 16, zs), 3)
for vol in vfs.all_vols.values():
if "pk" in vol.flags and "gz" not in vol.flags and "xz" not in vol.flags:
vol.flags["gz"] = False # def.pk
@@ -1247,12 +1429,12 @@ class AuthSrv(object):
for ga, vf in [["no_hash", "nohash"], ["no_idx", "noidx"]]:
if vf in vol.flags:
ptn = vol.flags.pop(vf)
ptn = re.compile(vol.flags.pop(vf))
else:
ptn = getattr(self.args, ga)
if ptn:
vol.flags[vf] = re.compile(ptn)
vol.flags[vf] = ptn
for ga, vf in vf_bmap().items():
if getattr(self.args, ga):
@@ -1278,6 +1460,10 @@ class AuthSrv(object):
if k in vol.flags:
vol.flags[k] = int(vol.flags[k])
for k in ("convt",):
if k in vol.flags:
vol.flags[k] = float(vol.flags[k])
for k1, k2 in IMPLICATIONS:
if k1 in vol.flags:
vol.flags[k2] = True
@@ -1293,18 +1479,15 @@ class AuthSrv(object):
raise Exception(t.format(dbd, dbds))
# default tag cfgs if unset
if "mte" not in vol.flags:
vol.flags["mte"] = self.args.mte
elif vol.flags["mte"].startswith("+"):
vol.flags["mte"] = ",".join(
x for x in [self.args.mte, vol.flags["mte"][1:]] if x
)
if "mth" not in vol.flags:
vol.flags["mth"] = self.args.mth
for k in ("mte", "mth", "exp_md", "exp_lg"):
if k not in vol.flags:
vol.flags[k] = getattr(self.args, k).copy()
else:
vol.flags[k] = odfusion(getattr(self.args, k), vol.flags[k])
# append additive args from argv to volflags
hooks = "xbu xau xiu xbr xar xbd xad xm".split()
for name in ["mtp"] + hooks:
hooks = "xbu xau xiu xbr xar xbd xad xm xban".split()
for name in "mtp on404 on403".split() + hooks:
self._read_volflag(vol.flags, name, getattr(self.args, name), True)
for hn in hooks:
@@ -1326,6 +1509,10 @@ class AuthSrv(object):
hfs = [x for x in hfs if x != "f"]
ocmd = ",".join(hfs + [cmd])
if "c" not in hfs and "f" not in hfs and hn == "xban":
hfs = ["c"] + hfs
ocmd = ",".join(hfs + [cmd])
ncmds.append(ocmd)
vol.flags[hn] = ncmds
@@ -1350,7 +1537,11 @@ class AuthSrv(object):
if vol.flags.get(grp, False):
continue
vol.flags = {k: v for k, v in vol.flags.items() if not k.startswith(rm)}
vol.flags = {
k: v
for k, v in vol.flags.items()
if not k.startswith(rm) or k == "mte"
}
for grp, rm in [["d2v", "e2v"]]:
if not vol.flags.get(grp, False):
@@ -1397,12 +1588,12 @@ class AuthSrv(object):
if local:
local_only_mtp[a] = True
local_mte = {}
for a in vol.flags.get("mte", "").split(","):
local_mte = ODict()
for a in vol.flags.get("mte", {}).keys():
local = True
all_mte[a] = True
local_mte[a] = True
for b in self.args.mte.split(","):
for b in self.args.mte.keys():
if not a or not b:
continue
@@ -1436,12 +1627,23 @@ class AuthSrv(object):
self.log(t, 1)
errors = True
if self.args.smb and self.ah.on and acct:
self.log("--smb can only be used when --ah-alg is none", 1)
errors = True
for vol in vfs.all_vols.values():
for k in list(vol.flags.keys()):
if re.match("^-[^-]+$", k):
vol.flags.pop(k[1:], None)
vol.flags.pop(k)
if errors:
sys.exit(1)
vfs.bubble_flags()
have_e2d = False
have_e2t = False
t = "volumes and permissions:\n"
for zv in vfs.all_vols.values():
if not self.warn_anonwrite:
@@ -1455,6 +1657,8 @@ class AuthSrv(object):
["delete", "udel"],
[" get", "uget"],
[" upget", "upget"],
[" html", "uhtml"],
["uadmin", "uadmin"],
]:
u = list(sorted(getattr(zv.axs, attr)))
u = ", ".join("\033[35meverybody\033[0m" if x == "*" else x for x in u)
@@ -1464,6 +1668,9 @@ class AuthSrv(object):
if "e2d" in zv.flags:
have_e2d = True
if "e2t" in zv.flags:
have_e2t = True
t += "\n"
if self.warn_anonwrite:
@@ -1475,8 +1682,21 @@ class AuthSrv(object):
if t:
self.log("\n\033[{}\033[0m\n".format(t))
if not have_e2t:
t = "hint: argument -e2ts enables multimedia indexing (artist/title/...)"
self.log(t, 6)
else:
t = "hint: argument -e2dsa enables searching, upload-undo, and better deduplication"
self.log(t, 6)
zv, _ = vfs.get("/", "*", False, False)
zs = zv.realpath.lower()
if zs in ("/", "c:\\") or zs.startswith(r"c:\windows"):
t = "you are sharing a system directory: {}\n"
self.log(t.format(zv.realpath), c=1)
try:
zv, _ = vfs.get("/", "*", False, True)
zv, _ = vfs.get("", "*", False, True, err=999)
if self.warn_anonwrite and os.getcwd() == zv.realpath:
t = "anyone can write to the current directory: {}\n"
self.log(t.format(zv.realpath), c=1)
@@ -1493,7 +1713,51 @@ class AuthSrv(object):
self.re_pwd = None
pwds = [re.escape(x) for x in self.iacct.keys()]
if pwds:
self.re_pwd = re.compile("=(" + "|".join(pwds) + ")([]&; ]|$)")
if self.ah.on:
zs = r"(\[H\] pw:.*|[?&]pw=)([^&]+)"
else:
zs = r"(\[H\] pw:.*|=)(" + "|".join(pwds) + r")([]&; ]|$)"
self.re_pwd = re.compile(zs)
def setup_pwhash(self, acct: dict[str, str]) -> None:
self.ah = PWHash(self.args)
if not self.ah.on:
return
if self.args.ah_cli:
self.ah.cli()
sys.exit()
elif self.args.ah_gen == "-":
self.ah.stdin()
sys.exit()
elif self.args.ah_gen:
print(self.ah.hash(self.args.ah_gen))
sys.exit()
if not acct:
return
changed = False
for uname, pw in list(acct.items())[:]:
if pw.startswith("+") and len(pw) == 33:
continue
changed = True
hpw = self.ah.hash(pw)
acct[uname] = hpw
t = "hashed password for account {}: {}"
self.log(t.format(uname, hpw), 3)
if not changed:
return
lns = []
for uname, pw in acct.items():
lns.append(" {}: {}".format(uname, pw))
t = "please use the following hashed passwords in your config:\n{}"
self.log(t.format("\n".join(lns)), 3)
def chk_sqlite_threadsafe(self) -> str:
v = SQLITE_VER[-1:]
@@ -1550,10 +1814,20 @@ class AuthSrv(object):
raise Exception("volume not found: " + zs)
self.log(str({"users": users, "vols": vols, "flags": flags}))
t = "/{}: read({}) write({}) move({}) del({}) get({}) upget({})"
t = "/{}: read({}) write({}) move({}) del({}) get({}) upget({}) uadmin({})"
for k, zv in self.vfs.all_vols.items():
vc = zv.axs
vs = [k, vc.uread, vc.uwrite, vc.umove, vc.udel, vc.uget, vc.upget]
vs = [
k,
vc.uread,
vc.uwrite,
vc.umove,
vc.udel,
vc.uget,
vc.upget,
vc.uhtml,
vc.uadmin,
]
self.log(t.format(*vs))
flag_v = "v" in flags
@@ -1633,7 +1907,8 @@ class AuthSrv(object):
]
csv = set("i p".split())
lst = set("c ihead mtm mtp xad xar xau xiu xbd xbr xbu xm".split())
zs = "c ihead mtm mtp on403 on404 xad xar xau xiu xban xbd xbr xbu xm"
lst = set(zs.split())
askip = set("a v c vc cgen theme".split())
# keymap from argv to vflag
@@ -1692,6 +1967,8 @@ class AuthSrv(object):
"d": "udel",
"g": "uget",
"G": "upget",
"h": "uhtml",
"a": "uadmin",
}
users = {}
for pkey in perms.values():
@@ -1783,13 +2060,19 @@ def expand_config_file(ret: list[str], fp: str, ipath: str) -> None:
if os.path.isdir(fp):
names = os.listdir(fp)
ret.append("#\033[36m cfg files in {} => {}\033[0m".format(fp, names))
crumb = "#\033[36m cfg files in {} => {}\033[0m".format(fp, names)
ret.append(crumb)
for fn in sorted(names):
fp2 = os.path.join(fp, fn)
if not fp2.endswith(".conf") or fp2 in ipath:
continue
expand_config_file(ret, fp2, ipath)
if ret[-1] == crumb:
# no config files below; remove breadcrumb
ret.pop()
return
ipath += " -> " + fp
@@ -1888,7 +2171,7 @@ def upgrade_cfg_fmt(
else:
sn = sn.replace(",", ", ")
ret.append(" " + sn)
elif sn[:1] in "rwmdgG":
elif sn[:1] in "rwmdgGha":
if cat != catx:
cat = catx
ret.append(cat)


@@ -9,7 +9,7 @@ import queue
from .__init__ import CORES, TYPE_CHECKING
from .broker_mpw import MpWorker
from .broker_util import try_exec
from .broker_util import ExceptionalQueue, try_exec
from .util import Daemon, mp
if TYPE_CHECKING:
@@ -69,7 +69,7 @@ class BrokerMp(object):
while procs:
if procs[-1].is_alive():
time.sleep(0.1)
time.sleep(0.05)
continue
procs.pop()
@@ -107,6 +107,19 @@ class BrokerMp(object):
if retq_id:
proc.q_pend.put((retq_id, "retq", rv))
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
# new non-ipc invoking managed service in hub
obj = self.hub
for node in dest.split("."):
obj = getattr(obj, node)
rv = try_exec(True, obj, *args)
retq = ExceptionalQueue(1)
retq.put(rv)
return retq
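In other words, ask() now behaves like a one-shot future even in the no-IPC case: it resolves the dotted destination on the hub, runs it in-process, and hands the result back on a queue, so callers (such as the broker.ask(...).get() calls in metrics.py further down) do not need to care which broker they got. A minimal sketch of that call shape, with the try_exec error-wrapping omitted and hypothetical stand-in names:

    import queue

    def ask_sketch(hub, dest, *args):
        # resolve "a.b.c" on the hub object and call it, like BrokerMp.ask
        obj = hub
        for node in dest.split("."):
            obj = getattr(obj, node)
        retq = queue.Queue(1)
        retq.put(obj(*args))
        return retq

    class Hub:  # hypothetical stand-in for SvcHub
        class up2k:
            @staticmethod
            def get_unfinished():
                return "{}"

    print(ask_sketch(Hub(), "up2k.get_unfinished").get())  # -> {}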
def say(self, dest: str, *args: Any) -> None:
"""
send message to non-hub component in other process,

copyparty/cert.py (new file, 226 lines)

@@ -0,0 +1,226 @@
import calendar
import errno
import filecmp
import json
import os
import shutil
import time
from .util import Netdev, runcmd
HAVE_CFSSL = True
if True: # pylint: disable=using-constant-test
from .util import RootLogger
def ensure_cert(log: "RootLogger", args) -> None:
"""
the default cert (and the entire TLS support) is only here to enable the
crypto.subtle javascript API, which is necessary due to the webkit guys
being massive memers (https://www.chromium.org/blink/webcrypto)
i feel awful about this and so should they
"""
cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
cert_appdata = os.path.join(args.E.cfg, "cert.pem")
if not os.path.isfile(args.cert):
if cert_appdata != args.cert:
raise Exception("certificate file does not exist: " + args.cert)
shutil.copy(cert_insec, args.cert)
with open(args.cert, "rb") as f:
buf = f.read()
o1 = buf.find(b" PRIVATE KEY-")
o2 = buf.find(b" CERTIFICATE-")
m = "unsupported certificate format: "
if o1 < 0:
raise Exception(m + "no private key inside pem")
if o2 < 0:
raise Exception(m + "no server certificate inside pem")
if o1 > o2:
raise Exception(m + "private key must appear before server certificate")
try:
if filecmp.cmp(args.cert, cert_insec):
t = "using default TLS certificate; https will be insecure:\033[36m {}"
log("cert", t.format(args.cert), 3)
except:
pass
# speaking of the default cert,
# printf 'NO\n.\n.\n.\n.\ncopyparty-insecure\n.\n' | faketime '2000-01-01 00:00:00' openssl req -x509 -sha256 -newkey rsa:2048 -keyout insecure.pem -out insecure.pem -days $((($(printf %d 0x7fffffff)-$(date +%s --date=2000-01-01T00:00:00Z))/(60*60*24))) -nodes && ls -al insecure.pem && openssl x509 -in insecure.pem -text -noout
def _read_crt(args, fn):
try:
if not os.path.exists(os.path.join(args.crt_dir, fn)):
return 0, {}
acmd = ["cfssl-certinfo", "-cert", fn]
rc, so, se = runcmd(acmd, cwd=args.crt_dir)
if rc:
return 0, {}
inf = json.loads(so)
zs = inf["not_after"]
expiry = calendar.timegm(time.strptime(zs, "%Y-%m-%dT%H:%M:%SZ"))
return expiry, inf
except OSError as ex:
if ex.errno == errno.ENOENT:
raise
return 0, {}
except:
return 0, {}
def _gen_ca(log: "RootLogger", args):
expiry = _read_crt(args, "ca.pem")[0]
if time.time() + args.crt_cdays * 60 * 60 * 24 * 0.1 < expiry:
return
backdate = "{}m".format(int(args.crt_back * 60))
expiry = "{}m".format(int(args.crt_cdays * 60 * 24))
cn = args.crt_cnc.replace("--crt-cn", args.crt_cn)
algo, ksz = args.crt_alg.split("-")
req = {
"CN": cn,
"CA": {"backdate": backdate, "expiry": expiry, "pathlen": 0},
"key": {"algo": algo, "size": int(ksz)},
"names": [{"O": cn}],
}
sin = json.dumps(req).encode("utf-8")
log("cert", "creating new ca ...", 6)
cmd = "cfssl gencert -initca -"
rc, so, se = runcmd(cmd.split(), 30, sin=sin)
if rc:
raise Exception("failed to create ca-cert: {}, {}".format(rc, se), 3)
cmd = "cfssljson -bare ca"
sin = so.encode("utf-8")
rc, so, se = runcmd(cmd.split(), 10, sin=sin, cwd=args.crt_dir)
if rc:
raise Exception("failed to translate ca-cert: {}, {}".format(rc, se), 3)
bname = os.path.join(args.crt_dir, "ca")
os.rename(bname + "-key.pem", bname + ".key")
os.unlink(bname + ".csr")
log("cert", "new ca OK", 2)
def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
names = args.crt_ns.split(",") if args.crt_ns else []
if not args.crt_exact:
for n in names[:]:
names.append("*.{}".format(n))
if not args.crt_noip:
for ip in netdevs.keys():
names.append(ip.split("/")[0])
if args.crt_nolo:
names = [x for x in names if x not in ("localhost", "127.0.0.1", "::1")]
if not args.crt_nohn:
names.append(args.name)
names.append(args.name + ".local")
if not names:
names = ["127.0.0.1"]
if "127.0.0.1" in names or "::1" in names:
names.append("localhost")
names = list({x: 1 for x in names}.keys())
try:
expiry, inf = _read_crt(args, "srv.pem")
expired = time.time() + args.crt_sdays * 60 * 60 * 24 * 0.1 > expiry
cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
for n in names:
if n not in inf["sans"]:
raise Exception("does not have {}".format(n))
if expired:
raise Exception("old server-cert has expired")
if not filecmp.cmp(args.cert, cert_insec):
return
except Exception as ex:
log("cert", "will create new server-cert; {}".format(ex))
log("cert", "creating server-cert ...", 6)
backdate = "{}m".format(int(args.crt_back * 60))
expiry = "{}m".format(int(args.crt_sdays * 60 * 24))
cfg = {
"signing": {
"default": {
"backdate": backdate,
"expiry": expiry,
"usages": ["signing", "key encipherment", "server auth"],
}
}
}
with open(os.path.join(args.crt_dir, "cfssl.json"), "wb") as f:
f.write(json.dumps(cfg).encode("utf-8"))
cn = args.crt_cns.replace("--crt-cn", args.crt_cn)
algo, ksz = args.crt_alg.split("-")
req = {
"key": {"algo": algo, "size": int(ksz)},
"names": [{"O": cn}],
}
sin = json.dumps(req).encode("utf-8")
cmd = "cfssl gencert -config=cfssl.json -ca ca.pem -ca-key ca.key -profile=www"
acmd = cmd.split() + ["-hostname=" + ",".join(names), "-"]
rc, so, se = runcmd(acmd, 30, sin=sin, cwd=args.crt_dir)
if rc:
raise Exception("failed to create cert: {}, {}".format(rc, se))
cmd = "cfssljson -bare srv"
sin = so.encode("utf-8")
rc, so, se = runcmd(cmd.split(), 10, sin=sin, cwd=args.crt_dir)
if rc:
raise Exception("failed to translate cert: {}, {}".format(rc, se))
bname = os.path.join(args.crt_dir, "srv")
try:
os.unlink(bname + ".key")
except:
pass
os.rename(bname + "-key.pem", bname + ".key")
os.unlink(bname + ".csr")
with open(os.path.join(args.crt_dir, "ca.pem"), "rb") as f:
ca = f.read()
with open(bname + ".key", "rb") as f:
skey = f.read()
with open(bname + ".pem", "rb") as f:
scrt = f.read()
with open(args.cert, "wb") as f:
f.write(skey + scrt + ca)
log("cert", "new server-cert OK", 2)
def gencert(log: "RootLogger", args, netdevs: dict[str, Netdev]):
global HAVE_CFSSL
if args.http_only:
return
if args.no_crt or not HAVE_CFSSL:
ensure_cert(log, args)
return
try:
_gen_ca(log, args)
_gen_srv(log, args, netdevs)
except Exception as ex:
HAVE_CFSSL = False
log("cert", "could not create TLS certificates: {}".format(ex), 3)
if getattr(ex, "errno", 0) == errno.ENOENT:
t = "install cfssl if you want to fix this; https://github.com/cloudflare/cfssl/releases/latest (cfssl, cfssljson, cfssl-certinfo)"
log("cert", t, 6)
ensure_cert(log, args)


@@ -13,15 +13,27 @@ def vf_bmap() -> dict[str, str]:
"no_dedup": "copydupes",
"no_dupe": "nodupe",
"no_forget": "noforget",
"no_robots": "norobots",
"no_thumb": "dthumb",
"no_vthumb": "dvthumb",
"no_athumb": "dathumb",
"th_no_crop": "nocrop",
"dav_auth": "davauth",
"dav_rt": "davrt",
}
for k in (
"dotsrch",
"e2d",
"e2ds",
"e2dsa",
"e2t",
"e2ts",
"e2tsr",
"e2v",
"e2vu",
"e2vp",
"exp",
"grid",
"hardlink",
"magic",
"no_sb_md",
@@ -37,8 +49,22 @@ def vf_bmap() -> dict[str, str]:
def vf_vmap() -> dict[str, str]:
"""argv-to-volflag: simple values"""
ret = {}
for k in ("lg_sbf", "md_sbf"):
ret = {
"no_hash": "nohash",
"no_idx": "noidx",
"re_maxage": "scan",
"th_convt": "convt",
"th_size": "thsize",
}
for k in (
"dbd",
"lg_sbf",
"md_sbf",
"nrand",
"sort",
"unlist",
"u2ts",
):
ret[k] = k
return ret
@@ -46,7 +72,23 @@ def vf_vmap() -> dict[str, str]:
def vf_cmap() -> dict[str, str]:
"""argv-to-volflag: complex/lists"""
ret = {}
for k in ("dbd", "html_head", "mte", "mth", "nrand"):
for k in (
"exp_lg",
"exp_md",
"html_head",
"mte",
"mth",
"mtp",
"xad",
"xar",
"xau",
"xban",
"xbd",
"xbr",
"xbu",
"xiu",
"xm",
):
ret[k] = k
return ret
@@ -58,6 +100,8 @@ permdescs = {
"d": "delete; permanently delete files and folders",
"g": "get; download files, but cannot see folder contents",
"G": 'upget; same as "g" but can see filekeys of their own uploads',
"h": 'html; same as "g" but folders return their index.html',
"a": "admin; can see uploader IPs, config-reload",
}
@@ -75,9 +119,12 @@ flagcats = {
},
"upload rules": {
"maxn=250,600": "max 250 uploads over 15min",
"maxb=1g,300": "max 1 GiB over 5min (suffixes: b, k, m, g)",
"maxb=1g,300": "max 1 GiB over 5min (suffixes: b, k, m, g, t)",
"vmaxb=1g": "total volume size max 1 GiB (suffixes: b, k, m, g, t)",
"vmaxn=4k": "max 4096 files in volume (suffixes: b, k, m, g, t)",
"rand": "force randomized filenames, 9 chars long by default",
"nrand=N": "randomized filenames are N chars long",
"u2ts=fc": "[f]orce [c]lient-last-modified or [u]pload-time",
"sz=1k-3m": "allow filesizes between 1 KiB and 3MiB",
"df=1g": "ensure 1 GiB free disk space",
},
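The size limits above take b/k/m/g/t style suffixes, interpreted as binary multiples (1g = 1 GiB). A minimal conversion sketch, for illustration only (not the project's own unhumanize):

    def parse_size(zs):
        # "1g" -> 1073741824; bare numbers pass through unchanged
        mul = {"b": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3, "t": 1024 ** 4}
        zs = zs.lower().strip()
        if zs and zs[-1] in mul:
            return int(float(zs[:-1]) * mul[zs[-1]])
        return int(zs)

    print(parse_size("1g"), parse_size("4k"), parse_size("3m"))
    # 1073741824 4096 3145728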
@@ -103,10 +150,11 @@ flagcats = {
"nohash=\\.iso$": "skips hashing file contents if path matches *.iso",
"noidx=\\.iso$": "fully ignores the contents at paths matching *.iso",
"noforget": "don't forget files when deleted from disk",
"fat32": "avoid excessive reindexing on android sdcardfs",
"dbd=[acid|swal|wal|yolo]": "database speed-durability tradeoff",
"xlink": "cross-volume dupe detection / linking",
"xdev": "do not descend into other filesystems",
"xvol": "skip symlinks leaving the volume root",
"xvol": "do not follow symlinks leaving the volume root",
"dotsrch": "show dotfiles in search results",
"nodotsrch": "hide dotfiles in search results (default)",
},
@@ -119,6 +167,13 @@ flagcats = {
"dvthumb": "disables video thumbnails",
"dathumb": "disables audio thumbnails (spectrograms)",
"dithumb": "disables image thumbnails",
"thsize": "thumbnail res; WxH",
"nocrop": "disable center-cropping by default",
"convt": "conversion timeout in seconds",
},
"handlers\n(better explained in --help-handlers)": {
"on404=PY": "handle 404s by executing PY file",
"on403=PY": "handle 403s by executing PY file",
},
"event hooks\n(better explained in --help-hooks)": {
"xbu=CMD": "execute CMD before a file upload starts",
@@ -129,8 +184,12 @@ flagcats = {
"xbd=CMD": "execute CMD before a file delete",
"xad=CMD": "execute CMD after a file delete",
"xm=CMD": "execute CMD on message",
"xban=CMD": "execute CMD if someone gets banned",
},
"client and ux": {
"grid": "show grid/thumbnails by default",
"sort": "default sort order",
"unlist": "dont list files matching REGEX",
"html_head=TXT": "includes TXT in the <head>",
"robots": "allows indexing by search engines (default)",
"norobots": "kindly asks search engines to leave",
@@ -140,9 +199,13 @@ flagcats = {
"sb_lg": "enable js sandbox for prologue/epilogue (default)",
"md_sbf": "list of markdown-sandbox safeguards to disable",
"lg_sbf": "list of *logue-sandbox safeguards to disable",
"nohtml": "return html and markdown as text/html",
},
"others": {
"fk=8": 'generates per-file accesskeys,\nwhich will then be required at the "g" permission'
"fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
"fka=8": 'generates slightly weaker per-file accesskeys,\nwhich are then required at the "g" permission;\nnot affected by filesize or inode numbers',
"davauth": "ask webdav clients to login for all folders",
"davrt": "show lastmod time of symlink destination, not the link itself\n(note: this option is always enabled for recursive listings)",
},
}


@@ -2,20 +2,28 @@
from __future__ import print_function, unicode_literals
import argparse
import errno
import logging
import os
import stat
import sys
import time
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
try:
import asynchat
except:
sys.path.append(os.path.join(E.mod, "vend"))
from pyftpdlib.authorizers import AuthenticationFailed, DummyAuthorizer
from pyftpdlib.filesystems import AbstractedFS, FilesystemError
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.ioloop import IOLoop
from pyftpdlib.servers import FTPServer
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
from .bos import bos
from .authsrv import VFS
from .bos import bos
from .util import (
Daemon,
Pebkac,
@@ -29,15 +37,6 @@ from .util import (
vjoin,
)
try:
from pyftpdlib.ioloop import IOLoop
except ImportError:
p = os.path.join(E.mod, "vend")
print("loading asynchat from " + p)
sys.path.append(p)
from pyftpdlib.ioloop import IOLoop
if TYPE_CHECKING:
from .svchub import SvcHub
@@ -46,6 +45,12 @@ if True: # pylint: disable=using-constant-test
from typing import Any, Optional
class FSE(FilesystemError):
def __init__(self, msg: str, severity: int = 0) -> None:
super(FilesystemError, self).__init__(msg)
self.severity = severity
class FtpAuth(DummyAuthorizer):
def __init__(self, hub: "SvcHub") -> None:
super(FtpAuth, self).__init__()
@@ -55,6 +60,7 @@ class FtpAuth(DummyAuthorizer):
self, username: str, password: str, handler: Any
) -> None:
handler.username = "{}:{}".format(username, password)
handler.uname = "*"
ip = handler.addr[0]
if ip.startswith("::ffff:"):
@@ -71,10 +77,13 @@ class FtpAuth(DummyAuthorizer):
raise AuthenticationFailed("banned")
asrv = self.hub.asrv
if username == "anonymous":
uname = "*"
else:
uname = asrv.iacct.get(password, "") or asrv.iacct.get(username, "") or "*"
uname = "*"
if username != "anonymous":
for zs in (password, username):
zs = asrv.iacct.get(asrv.ah.hash(zs), "")
if zs:
uname = zs
break
if not uname or not (asrv.vfs.aread.get(uname) or asrv.vfs.awrite.get(uname)):
g = self.hub.gpwd
@@ -86,14 +95,14 @@ class FtpAuth(DummyAuthorizer):
raise AuthenticationFailed("Authentication failed.")
handler.username = uname
handler.uname = handler.username = uname
def get_home_dir(self, username: str) -> str:
return "/"
def has_user(self, username: str) -> bool:
asrv = self.hub.asrv
return username in asrv.acct
return username in asrv.acct or username in asrv.iacct
def has_perm(self, username: str, perm: int, path: Optional[str] = None) -> bool:
return True # handled at filesystem layer
@@ -112,24 +121,22 @@ class FtpFs(AbstractedFS):
def __init__(
self, root: str, cmd_channel: Any
) -> None: # pylint: disable=super-init-not-called
self.h = self.cmd_channel = cmd_channel # type: FTPHandler
self.h = cmd_channel # type: FTPHandler
self.cmd_channel = cmd_channel # type: FTPHandler
self.hub: "SvcHub" = cmd_channel.hub
self.args = cmd_channel.args
self.uname = self.hub.asrv.iacct.get(cmd_channel.password, "*")
self.uname = cmd_channel.uname
self.cwd = "/" # pyftpdlib convention of leading slash
self.root = "/var/lib/empty"
self.can_read = self.can_write = self.can_move = False
self.can_delete = self.can_get = self.can_upget = False
self.can_admin = False
self.listdirinfo = self.listdir
self.chdir(".")
def die(self, msg):
self.h.die(msg)
def v2a(
self,
vpath: str,
@@ -139,21 +146,34 @@ class FtpFs(AbstractedFS):
d: bool = False,
) -> tuple[str, VFS, str]:
try:
vpath = vpath.replace("\\", "/").lstrip("/")
vpath = vpath.replace("\\", "/").strip("/")
rd, fn = os.path.split(vpath)
if ANYWIN and relchk(rd):
logging.warning("malicious vpath: %s", vpath)
self.die("Unsupported characters in filepath")
t = "Unsupported characters in [{}]"
raise FSE(t.format(vpath), 1)
fn = sanitize_fn(fn or "", "", [".prologue.html", ".epilogue.html"])
vpath = vjoin(rd, fn)
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, r, w, m, d)
if not vfs.realpath:
self.die("No filesystem mounted at this path")
t = "No filesystem mounted at [{}]"
raise FSE(t.format(vpath))
if "xdev" in vfs.flags or "xvol" in vfs.flags:
ap = vfs.canonical(rem)
avfs = vfs.chk_ap(ap)
t = "Permission denied in [{}]"
if not avfs:
raise FSE(t.format(vpath), 1)
cr, cw, cm, cd, _, _, _ = avfs.can_access("", self.h.uname)
if r and not cr or w and not cw or m and not cm or d and not cd:
raise FSE(t.format(vpath), 1)
return os.path.join(vfs.realpath, rem), vfs, rem
except Pebkac as ex:
self.die(str(ex))
raise FSE(str(ex))
def rv2a(
self,
@@ -176,7 +196,7 @@ class FtpFs(AbstractedFS):
def validpath(self, path: str) -> bool:
if "/.hist/" in path:
if "/up2k." in path or path.endswith("/dir.txt"):
self.die("Access to this file is forbidden")
raise FSE("Access to this file is forbidden", 1)
return True
@@ -193,7 +213,7 @@ class FtpFs(AbstractedFS):
td = 0
if td < -1 or td > self.args.ftp_wt:
self.die("Cannot open existing file for writing")
raise FSE("Cannot open existing file for writing")
self.validpath(ap)
return open(fsenc(ap), mode)
@@ -202,9 +222,17 @@ class FtpFs(AbstractedFS):
nwd = join(self.cwd, path)
vfs, rem = self.hub.asrv.vfs.get(nwd, self.uname, False, False)
ap = vfs.canonical(rem)
if not bos.path.isdir(ap):
try:
st = bos.stat(ap)
if not stat.S_ISDIR(st.st_mode):
raise Exception()
except:
# returning 550 is library-default and suitable
self.die("Failed to change directory")
raise FSE("No such file or directory")
avfs = vfs.chk_ap(ap, st)
if not avfs:
raise FSE("Permission denied", 1)
self.cwd = nwd
(
@@ -214,16 +242,19 @@ class FtpFs(AbstractedFS):
self.can_delete,
self.can_get,
self.can_upget,
) = self.hub.asrv.vfs.can_access(self.cwd.lstrip("/"), self.h.username)
self.can_admin,
) = avfs.can_access("", self.h.uname)
def mkdir(self, path: str) -> None:
ap = self.rv2a(path, w=True)[0]
bos.makedirs(ap) # filezilla expects this
def listdir(self, path: str) -> list[str]:
vpath = join(self.cwd, path).lstrip("/")
vpath = join(self.cwd, path)
try:
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, True, False)
ap, vfs, rem = self.v2a(vpath, True, False)
if not bos.path.isdir(ap):
raise FSE("No such file or directory", 1)
fsroot, vfs_ls1, vfs_virt = vfs.ls(
rem,
@@ -239,8 +270,12 @@ class FtpFs(AbstractedFS):
vfs_ls.sort()
return vfs_ls
except:
if vpath:
except Exception as ex:
# panic on malicious names
if getattr(ex, "severity", 0):
raise
if vpath.strip("/"):
# display write-only folders as empty
return []
@@ -250,31 +285,35 @@ class FtpFs(AbstractedFS):
def rmdir(self, path: str) -> None:
ap = self.rv2a(path, d=True)[0]
bos.rmdir(ap)
try:
bos.rmdir(ap)
except OSError as e:
if e.errno != errno.ENOENT:
raise
def remove(self, path: str) -> None:
if self.args.no_del:
self.die("The delete feature is disabled in server config")
raise FSE("The delete feature is disabled in server config")
vp = join(self.cwd, path).lstrip("/")
try:
self.hub.up2k.handle_rm(self.uname, self.h.cli_ip, [vp], [])
self.hub.up2k.handle_rm(self.uname, self.h.cli_ip, [vp], [], False)
except Exception as ex:
self.die(str(ex))
raise FSE(str(ex))
def rename(self, src: str, dst: str) -> None:
if not self.can_move:
self.die("Not allowed for user " + self.h.username)
raise FSE("Not allowed for user " + self.h.uname)
if self.args.no_mv:
self.die("The rename/move feature is disabled in server config")
raise FSE("The rename/move feature is disabled in server config")
svp = join(self.cwd, src).lstrip("/")
dvp = join(self.cwd, dst).lstrip("/")
try:
self.hub.up2k.handle_mv(self.uname, svp, dvp)
except Exception as ex:
self.die(str(ex))
raise FSE(str(ex))
def chmod(self, path: str, mode: str) -> None:
pass
@@ -283,7 +322,10 @@ class FtpFs(AbstractedFS):
try:
ap = self.rv2a(path, r=True)[0]
return bos.stat(ap)
except:
except FSE as ex:
if ex.severity:
raise
ap = self.rv2a(path)[0]
st = bos.stat(ap)
if not stat.S_ISDIR(st.st_mode):
@@ -303,7 +345,10 @@ class FtpFs(AbstractedFS):
try:
st = self.stat(path)
return stat.S_ISREG(st.st_mode)
except:
except Exception as ex:
if getattr(ex, "severity", 0):
raise
return False # expected for mojibake in ftp_SIZE()
def islink(self, path: str) -> bool:
@@ -314,7 +359,10 @@ class FtpFs(AbstractedFS):
try:
st = self.stat(path)
return stat.S_ISDIR(st.st_mode)
except:
except Exception as ex:
if getattr(ex, "severity", 0):
raise
return True
def getsize(self, path: str) -> int:
@@ -343,10 +391,12 @@ class FtpHandler(FTPHandler):
abstracted_fs = FtpFs
hub: "SvcHub"
args: argparse.Namespace
uname: str
def __init__(self, conn: Any, server: Any, ioloop: Any = None) -> None:
self.hub: "SvcHub" = FtpHandler.hub
self.args: argparse.Namespace = FtpHandler.args
self.uname = "*"
if PY2:
FTPHandler.__init__(self, conn, server, ioloop)
@@ -362,14 +412,10 @@ class FtpHandler(FTPHandler):
# reduce non-debug logging
self.log_cmds_list = [x for x in self.log_cmds_list if x not in ("CWD", "XCWD")]
def die(self, msg):
self.respond("550 {}".format(msg))
raise FilesystemError(msg)
def ftp_STOR(self, file: str, mode: str = "w") -> Any:
# Optional[str]
vp = join(self.fs.cwd, file).lstrip("/")
ap, vfs, rem = self.fs.v2a(vp)
ap, vfs, rem = self.fs.v2a(vp, w=True)
self.vfs_map[ap] = vp
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
@@ -378,14 +424,14 @@ class FtpHandler(FTPHandler):
ap,
vfs.canonical(rem),
"",
self.username,
self.uname,
0,
0,
self.cli_ip,
0,
"",
):
self.die("Upload blocked by xbu server config")
raise FSE("Upload blocked by xbu server config")
# print("ftp_STOR: {} {} => {}".format(vp, mode, ap))
ret = FTPHandler.ftp_STOR(self, file, mode)
@@ -407,7 +453,7 @@ class FtpHandler(FTPHandler):
# print("xfer_end: {} => {}".format(ap, vp))
if vp:
vp, fn = os.path.split(vp)
vfs, rem = self.hub.asrv.vfs.get(vp, self.username, False, True)
vfs, rem = self.hub.asrv.vfs.get(vp, self.uname, False, True)
vfs, rem = vfs.get_dbv(rem)
self.hub.up2k.hash_file(
vfs.realpath,
@@ -417,7 +463,7 @@ class FtpHandler(FTPHandler):
fn,
self.cli_ip,
time.time(),
self.username,
self.uname,
)
return FTPHandler.log_transfer(
@@ -451,7 +497,7 @@ class Ftpd(object):
print(t.format(pybin))
sys.exit(1)
h1.certfile = os.path.join(self.args.E.cfg, "cert.pem")
h1.certfile = self.args.cert
h1.tls_control_required = True
h1.tls_data_required = True
@@ -459,9 +505,9 @@ class Ftpd(object):
for h_lp in hs:
h2, lp = h_lp
h2.hub = hub
h2.args = hub.args
h2.authorizer = FtpAuth(hub)
FtpHandler.hub = h2.hub = hub
FtpHandler.args = h2.args = hub.args
FtpHandler.authorizer = h2.authorizer = FtpAuth(hub)
if self.args.ftp_pr:
p1, p2 = [int(x) for x in self.args.ftp_pr.split("-")]
@@ -485,6 +531,9 @@ class Ftpd(object):
if "::" in ips:
ips.append("0.0.0.0")
if self.args.ftp4:
ips = [x for x in ips if ":" not in x]
ioloop = IOLoop()
for ip in ips:
for h, lp in hs:

File diff suppressed because it is too large


@@ -54,7 +54,6 @@ class HttpConn(object):
self.args: argparse.Namespace = hsrv.args # mypy404
self.E: EnvParams = self.args.E
self.asrv: AuthSrv = hsrv.asrv # mypy404
self.cert_path = hsrv.cert_path
self.u2fh: Util.FHC = hsrv.u2fh # mypy404
self.iphash: HMaccas = hsrv.broker.iphash
self.bans: dict[str, int] = hsrv.bans
@@ -103,17 +102,18 @@ class HttpConn(object):
def log(self, msg: str, c: Union[int, str] = 0) -> None:
self.log_func(self.log_src, msg, c)
def get_u2idx(self) -> U2idx:
# one u2idx per tcp connection;
def get_u2idx(self) -> Optional[U2idx]:
# grab from a pool of u2idx instances;
# sqlite3 fully parallelizes under python threads
# but avoid running out of FDs by creating too many
if not self.u2idx:
self.u2idx = U2idx(self)
self.u2idx = self.hsrv.get_u2idx(str(self.addr))
return self.u2idx
def _detect_https(self) -> bool:
method = None
if self.cert_path:
if True:
try:
method = self.s.recv(4, socket.MSG_PEEK)
except socket.timeout:
@@ -147,7 +147,7 @@ class HttpConn(object):
self.sr = None
if self.args.https_only:
is_https = True
elif self.args.http_only or not HAVE_SSL:
elif self.args.http_only:
is_https = False
else:
# raise Exception("asdf")
@@ -161,7 +161,7 @@ class HttpConn(object):
self.log_src = self.log_src.replace("[36m", "[35m")
try:
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.load_cert_chain(self.cert_path)
ctx.load_cert_chain(self.args.cert)
if self.args.ssl_ver:
ctx.options &= ~self.args.ssl_flags_en
ctx.options |= self.args.ssl_flags_de
@@ -215,3 +215,7 @@ class HttpConn(object):
self.cli = HttpCli(self)
if not self.cli.run():
return
if self.u2idx:
self.hsrv.put_u2idx(str(self.addr), self.u2idx)
self.u2idx = None


@@ -4,6 +4,7 @@ from __future__ import print_function, unicode_literals
import base64
import math
import os
import re
import socket
import sys
import threading
@@ -11,7 +12,7 @@ import time
import queue
from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, EnvParams
from .__init__ import ANYWIN, CORES, EXE, MACOS, TYPE_CHECKING, EnvParams
try:
MNFE = ModuleNotFoundError
@@ -33,13 +34,30 @@ except MNFE:
* (try another python version, if you have one)
* (try copyparty.sfx instead)
""".format(
os.path.basename(sys.executable)
sys.executable
)
)
sys.exit(1)
except SyntaxError:
if EXE:
raise
print(
"""\033[1;31m
your jinja2 version is incompatible with your python version;\033[33m
please try to replace it with an older version:\033[0m
* {} -m pip install --user jinja2==2.11.3
* (try another python version, if you have one)
* (try copyparty.sfx instead)
""".format(
sys.executable
)
)
sys.exit(1)
from .bos import bos
from .httpconn import HttpConn
from .metrics import Metrics
from .u2idx import U2idx
from .util import (
E_SCK,
FHC,
@@ -48,6 +66,7 @@ from .util import (
Magician,
Netdev,
NetMap,
absreal,
ipnorm,
min_ex,
shut_socket,
@@ -81,12 +100,16 @@ class HttpSrv(object):
# redefine in case of multiprocessing
socket.setdefaulttimeout(120)
self.t0 = time.time()
nsuf = "-n{}-i{:x}".format(nid, os.getpid()) if nid else ""
self.magician = Magician()
self.nm = NetMap([], {})
self.ssdp: Optional["SSDPr"] = None
self.gpwd = Garda(self.args.ban_pw)
self.g404 = Garda(self.args.ban_404)
self.g403 = Garda(self.args.ban_403)
self.g422 = Garda(self.args.ban_422, False)
self.gurl = Garda(self.args.ban_url)
self.bans: dict[str, int] = {}
self.aclose: dict[str, int] = {}
@@ -104,6 +127,7 @@ class HttpSrv(object):
self.t_periodic: Optional[threading.Thread] = None
self.u2fh = FHC()
self.metrics = Metrics(self)
self.srvs: list[socket.socket] = []
self.ncli = 0 # exact
self.clients: set[HttpConn] = set() # laggy
@@ -111,6 +135,9 @@ class HttpSrv(object):
self.cb_ts = 0.0
self.cb_v = ""
self.u2idx_free: dict[str, U2idx] = {}
self.u2idx_n = 0
env = jinja2.Environment()
env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"]
@@ -118,6 +145,12 @@ class HttpSrv(object):
zs = os.path.join(self.E.mod, "web", "deps", "prism.js.gz")
self.prism = os.path.exists(zs)
self.statics: set[str] = set()
self._build_statics()
self.ptn_cc = re.compile(r"[\x00-\x1f]")
self.ptn_hsafe = re.compile(r"[\x00-\x1f<>\"'&]")
self.mallow = "GET HEAD POST PUT DELETE OPTIONS".split()
if not self.args.no_dav:
zs = "PROPFIND PROPPATCH LOCK UNLOCK MKCOL COPY MOVE"
@@ -128,12 +161,6 @@ class HttpSrv(object):
self.ssdp = SSDPr(broker)
cert_path = os.path.join(self.E.cfg, "cert.pem")
if bos.path.exists(cert_path):
self.cert_path = cert_path
else:
self.cert_path = ""
if self.tp_q:
self.start_threads(4)
@@ -144,7 +171,7 @@ class HttpSrv(object):
if self.args.log_thrs:
start_log_thrs(self.log, self.args.log_thrs, nid)
self.th_cfg: dict[str, Any] = {}
self.th_cfg: dict[str, set[str]] = {}
Daemon(self.post_init, "hsrv-init2")
def post_init(self) -> None:
@@ -154,6 +181,14 @@ class HttpSrv(object):
except:
pass
def _build_statics(self) -> None:
for dp, _, df in os.walk(os.path.join(self.E.mod, "web")):
for fn in df:
ap = absreal(os.path.join(dp, fn))
self.statics.add(ap)
if ap.endswith(".gz") or ap.endswith(".br"):
self.statics.add(ap[:-3])
def set_netdevs(self, netdevs: dict[str, Netdev]) -> None:
ips = set()
for ip, _ in self.bound:
@@ -445,6 +480,9 @@ class HttpSrv(object):
self.clients.remove(cli)
self.ncli -= 1
if cli.u2idx:
self.put_u2idx(str(addr), cli.u2idx)
def cachebuster(self) -> str:
if time.time() - self.cb_ts < 1:
return self.cb_v
@@ -466,3 +504,31 @@ class HttpSrv(object):
self.cb_v = v.decode("ascii")[-4:]
self.cb_ts = time.time()
return self.cb_v
def get_u2idx(self, ident: str) -> Optional[U2idx]:
utab = self.u2idx_free
for _ in range(100): # 5/0.05 = 5sec
with self.mutex:
if utab:
if ident in utab:
return utab.pop(ident)
return utab.pop(list(utab.keys())[0])
if self.u2idx_n < CORES:
self.u2idx_n += 1
return U2idx(self)
time.sleep(0.05)
# not using conditional waits, on a hunch that
# average performance will be faster like this
# since most servers won't be fully saturated
return None
def put_u2idx(self, ident: str, u2idx: U2idx) -> None:
with self.mutex:
while ident in self.u2idx_free:
ident += "a"
self.u2idx_free[ident] = u2idx
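The get/put pair above is a small bounded pool: a connection first tries to reclaim "its" U2idx, then any free one, may create a new one while fewer than CORES exist, and otherwise polls for roughly 5 seconds before giving up with None (callers such as Metrics.tx tolerate that). The same pattern in a self-contained form, with invented names:

    import threading, time

    class TinyPool:
        # illustration of the bounded checkout/return pattern, not copyparty code
        def __init__(self, factory, limit):
            self.factory = factory
            self.limit = limit
            self.made = 0
            self.free = {}
            self.mutex = threading.Lock()

        def get(self, ident):
            for _ in range(100):  # 100 * 0.05s = ~5 sec, like get_u2idx
                with self.mutex:
                    if self.free:
                        if ident in self.free:
                            return self.free.pop(ident)
                        return self.free.popitem()[1]
                    if self.made < self.limit:
                        self.made += 1
                        return self.factory()
                time.sleep(0.05)
            return None  # pool exhausted; caller must cope

        def put(self, ident, obj):
            with self.mutex:
                while ident in self.free:  # never clobber an existing entry
                    ident += "a"
                self.free[ident] = obj

    pool = TinyPool(dict, 2)
    a = pool.get("conn1")
    pool.put("conn1", a)
    print(pool.get("conn1") is a)  # True: the same instance is reused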


@@ -4,9 +4,10 @@ from __future__ import print_function, unicode_literals
import argparse # typechk
import colorsys
import hashlib
import re
from .__init__ import PY2
from .th_srv import HAVE_PIL
from .th_srv import HAVE_PIL, HAVE_PILF
from .util import BytesIO
@@ -17,12 +18,14 @@ class Ico(object):
def get(self, ext: str, as_thumb: bool, chrome: bool) -> tuple[str, bytes]:
"""placeholder to make thumbnails not break"""
zb = hashlib.sha1(ext.encode("utf-8")).digest()[2:4]
bext = ext.encode("ascii", "replace")
ext = bext.decode("utf-8")
zb = hashlib.sha1(bext).digest()[2:4]
if PY2:
zb = [ord(x) for x in zb]
c1 = colorsys.hsv_to_rgb(zb[0] / 256.0, 1, 0.3)
c2 = colorsys.hsv_to_rgb(zb[0] / 256.0, 1, 1)
c2 = colorsys.hsv_to_rgb(zb[0] / 256.0, 0.8 if HAVE_PILF else 1, 1)
ci = [int(x * 255) for x in list(c1) + list(c2)]
c = "".join(["{:02x}".format(x) for x in ci])
@@ -33,8 +36,34 @@ class Ico(object):
h = int(100 / (float(sw) / float(sh)))
w = 100
if chrome and as_thumb:
if chrome:
# cannot handle more than ~2000 unique SVGs
if HAVE_PILF:
# pillow 10.1 made this the default font;
# svg: 3.7s, this: 36s
try:
from PIL import Image, ImageDraw
# [.lt] are hard to see lowercase / unspaced
ext2 = re.sub("(.)", "\\1 ", ext).upper()
h = int(128 * h / w)
w = 128
img = Image.new("RGB", (w, h), "#" + c[:6])
pb = ImageDraw.Draw(img)
_, _, tw, th = pb.textbbox((0, 0), ext2, font_size=16)
xy = ((w - tw) // 2, (h - th) // 2)
pb.text(xy, ext2, fill="#" + c[6:], font_size=16)
img = img.resize((w * 2, h * 2), Image.NEAREST)
buf = BytesIO()
img.save(buf, format="PNG", compress_level=1)
return "image/png", buf.getvalue()
except:
pass
if HAVE_PIL:
# svg: 3s, cache: 6s, this: 8s
from PIL import Image, ImageDraw
@@ -43,8 +72,19 @@ class Ico(object):
w = 64
img = Image.new("RGB", (w, h), "#" + c[:6])
pb = ImageDraw.Draw(img)
tw, th = pb.textsize(ext)
pb.text(((w - tw) // 2, (h - th) // 2), ext, fill="#" + c[6:])
try:
_, _, tw, th = pb.textbbox((0, 0), ext)
except:
tw, th = pb.textsize(ext)
tw += len(ext)
cw = tw // len(ext)
x = ((w - tw) // 2) - (cw * 2) // 3
fill = "#" + c[6:]
for ch in ext:
pb.text((x, (h - th) // 2), " %s " % (ch,), fill=fill)
x += cw
img = img.resize((w * 3, h * 3), Image.NEAREST)
buf = BytesIO()


@@ -1,6 +1,7 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import errno
import random
import select
import socket
@@ -277,12 +278,26 @@ class MDNS(MCast):
zf = time.time() + 2
self.probing = zf # cant unicast so give everyone an extra sec
self.unsolicited = [zf, zf + 1, zf + 3, zf + 7] # rfc-8.3
try:
self.run2()
except OSError as ex:
if ex.errno != errno.EBADF:
raise
self.log("stopping due to {}".format(ex), "90")
self.log("stopped", 2)
def run2(self) -> None:
last_hop = time.time()
ihop = self.args.mc_hop
while self.running:
timeout = (
0.02 + random.random() * 0.07
if self.probing or self.q or self.defend or self.unsolicited
if self.probing or self.q or self.defend
else max(0.05, self.unsolicited[0] - time.time())
if self.unsolicited
else (last_hop + ihop if ihop else 180)
)
rdy = select.select(self.srv, [], [], timeout)
@@ -314,8 +329,6 @@ class MDNS(MCast):
self.log(t.format(self.hn[:-1]), 2)
self.probing = 0
self.log("stopped", 2)
def stop(self, panic=False) -> None:
self.running = False
for srv in self.srv.values():
@@ -502,6 +515,10 @@ class MDNS(MCast):
for srv in self.srv.values():
tx.add(srv)
if not self.unsolicited and self.args.zm_spam:
zf = time.time() + self.args.zm_spam + random.random() * 0.07
self.unsolicited.append(zf)
for srv, deadline in list(self.defend.items()):
if now < deadline:
continue

copyparty/metrics.py (new file, 165 lines)

@@ -0,0 +1,165 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import json
import time
from .__init__ import TYPE_CHECKING
from .util import Pebkac, get_df, unhumanize
if TYPE_CHECKING:
from .httpcli import HttpCli
from .httpsrv import HttpSrv
class Metrics(object):
def __init__(self, hsrv: "HttpSrv") -> None:
self.hsrv = hsrv
def tx(self, cli: "HttpCli") -> bool:
if not cli.avol:
raise Pebkac(403, "not allowed for user " + cli.uname)
args = cli.args
if not args.stats:
raise Pebkac(403, "the stats feature is not enabled in server config")
conn = cli.conn
vfs = conn.asrv.vfs
allvols = list(sorted(vfs.all_vols.items()))
idx = conn.get_u2idx()
if not idx or not hasattr(idx, "p_end"):
idx = None
ret: list[str] = []
def addc(k: str, unit: str, v: str, desc: str) -> None:
if unit:
k += "_" + unit
zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))
else:
zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
def addh(k: str, typ: str, desc: str) -> None:
zs = "# TYPE %s %s\n# HELP %s %s"
ret.append(zs % (k, typ, k, desc))
def addbh(k: str, desc: str) -> None:
zs = "# TYPE %s gauge\n# UNIT %s bytes\n# HELP %s %s"
ret.append(zs % (k, k, k, desc))
def addv(k: str, v: str) -> None:
ret.append("%s %s" % (k, v))
v = "{:.3f}".format(time.time() - self.hsrv.t0)
addc("cpp_uptime", "seconds", v, "time since last server restart")
v = str(len(conn.bans or []))
addc("cpp_bans", "", v, "number of banned IPs")
if not args.nos_hdd:
addbh("cpp_disk_size_bytes", "total HDD size of volume")
addbh("cpp_disk_free_bytes", "free HDD space in volume")
for vpath, vol in allvols:
free, total = get_df(vol.realpath)
addv('cpp_disk_size_bytes{vol="/%s"}' % (vpath), str(total))
addv('cpp_disk_free_bytes{vol="/%s"}' % (vpath), str(free))
if idx and not args.nos_vol:
addbh("cpp_vol_bytes", "num bytes of data in volume")
addh("cpp_vol_files", "gauge", "num files in volume")
addbh("cpp_vol_free_bytes", "free space (vmaxb) in volume")
addh("cpp_vol_free_files", "gauge", "free space (vmaxn) in volume")
tnbytes = 0
tnfiles = 0
volsizes = []
try:
ptops = [x.realpath for _, x in allvols]
x = self.hsrv.broker.ask("up2k.get_volsizes", ptops)
volsizes = x.get()
except Exception as ex:
cli.log("tx_stats get_volsizes: {!r}".format(ex), 3)
for (vpath, vol), (nbytes, nfiles) in zip(allvols, volsizes):
tnbytes += nbytes
tnfiles += nfiles
addv('cpp_vol_bytes{vol="/%s"}' % (vpath), str(nbytes))
addv('cpp_vol_files{vol="/%s"}' % (vpath), str(nfiles))
if vol.flags.get("vmaxb") or vol.flags.get("vmaxn"):
zi = unhumanize(vol.flags.get("vmaxb") or "0")
if zi:
v = str(zi - nbytes)
addv('cpp_vol_free_bytes{vol="/%s"}' % (vpath), v)
zi = unhumanize(vol.flags.get("vmaxn") or "0")
if zi:
v = str(zi - nfiles)
addv('cpp_vol_free_files{vol="/%s"}' % (vpath), v)
if volsizes:
addv('cpp_vol_bytes{vol="total"}', str(tnbytes))
addv('cpp_vol_files{vol="total"}', str(tnfiles))
if idx and not args.nos_dup:
addbh("cpp_dupe_bytes", "num dupe bytes in volume")
addh("cpp_dupe_files", "gauge", "num dupe files in volume")
tnbytes = 0
tnfiles = 0
for vpath, vol in allvols:
cur = idx.get_cur(vol.realpath)
if not cur:
continue
nbytes = 0
nfiles = 0
q = "select sz, count(*)-1 c from up group by w having c"
for sz, c in cur.execute(q):
nbytes += sz * c
nfiles += c
tnbytes += nbytes
tnfiles += nfiles
addv('cpp_dupe_bytes{vol="/%s"}' % (vpath), str(nbytes))
addv('cpp_dupe_files{vol="/%s"}' % (vpath), str(nfiles))
addv('cpp_dupe_bytes{vol="total"}', str(tnbytes))
addv('cpp_dupe_files{vol="total"}', str(tnfiles))
if not args.nos_unf:
addbh("cpp_unf_bytes", "incoming/unfinished uploads (num bytes)")
addh("cpp_unf_files", "gauge", "incoming/unfinished uploads (num files)")
tnbytes = 0
tnfiles = 0
try:
x = self.hsrv.broker.ask("up2k.get_unfinished")
xs = x.get()
xj = json.loads(xs)
for ptop, (nbytes, nfiles) in xj.items():
tnbytes += nbytes
tnfiles += nfiles
vol = next((x[1] for x in allvols if x[1].realpath == ptop), None)
if not vol:
t = "tx_stats get_unfinished: could not map {}"
cli.log(t.format(ptop), 3)
continue
addv('cpp_unf_bytes{vol="/%s"}' % (vol.vpath), str(nbytes))
addv('cpp_unf_files{vol="/%s"}' % (vol.vpath), str(nfiles))
addv('cpp_unf_bytes{vol="total"}', str(tnbytes))
addv('cpp_unf_files{vol="total"}', str(tnfiles))
except Exception as ex:
cli.log("tx_stats get_unfinished: {!r}".format(ex), 3)
ret.append("# EOF")
mime = "application/openmetrics-text; version=1.0.0; charset=utf-8"
cli.reply("\n".join(ret).encode("utf-8"), mime=mime)
return True
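For reference, the exposition format those small helpers emit (OpenMetrics text, terminated by "# EOF"); a standalone rendering with invented values:

    ret = []

    def addc(k, unit, v, desc, t0=1700000000):
        # counter, optionally with a unit, as in Metrics.tx
        if unit:
            k += "_" + unit
            zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
            ret.append(zs % (k, k, unit, k, desc, k, t0, k, v))
        else:
            zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
            ret.append(zs % (k, k, desc, k, t0, k, v))

    def addv(k, v):
        ret.append("%s %s" % (k, v))

    addc("cpp_uptime", "seconds", "123.456", "time since last server restart")
    addv('cpp_disk_free_bytes{vol="/music"}', "1099511627776")
    ret.append("# EOF")
    print("\n".join(ret))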


@@ -8,7 +8,7 @@ import shutil
import subprocess as sp
import sys
from .__init__ import EXE, PY2, WINDOWS, E, unicode
from .__init__ import ANYWIN, EXE, PY2, WINDOWS, E, unicode
from .bos import bos
from .util import (
FFMPEG_URL,
@@ -29,6 +29,9 @@ if True: # pylint: disable=using-constant-test
def have_ff(scmd: str) -> bool:
if ANYWIN:
scmd += ".exe"
if PY2:
print("# checking {}".format(scmd))
acmd = (scmd + " -version").encode("ascii").split(b" ")


@@ -15,7 +15,7 @@ from ipaddress import (
)
from .__init__ import MACOS, TYPE_CHECKING
from .util import Netdev, find_prefix, min_ex, spack
from .util import Daemon, Netdev, find_prefix, min_ex, spack
if TYPE_CHECKING:
from .svchub import SvcHub
@@ -228,6 +228,7 @@ class MCast(object):
for srv in self.srv.values():
assert srv.ip in self.sips
Daemon(self.hopper, "mc-hop")
return bound
def setup_socket(self, srv: MC_Sck) -> None:
@@ -299,33 +300,57 @@ class MCast(object):
t = "failed to set IPv4 TTL/LOOP; announcements may not survive multiple switches/routers"
self.log(t, 3)
self.hop(srv)
if self.hop(srv, False):
self.log("igmp was already joined?? chilling for a sec", 3)
time.sleep(1.2)
self.hop(srv, True)
self.b4.sort(reverse=True)
self.b6.sort(reverse=True)
def hop(self, srv: MC_Sck) -> None:
def hop(self, srv: MC_Sck, on: bool) -> bool:
"""rejoin to keepalive on routers/switches without igmp-snooping"""
sck = srv.sck
req = srv.mreq
if ":" in srv.ip:
try:
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_LEAVE_GROUP, req)
# linux does leaves/joins twice with 0.2~1.05s spacing
time.sleep(1.2)
except:
pass
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_JOIN_GROUP, req)
if not on:
try:
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_LEAVE_GROUP, req)
return True
except:
return False
else:
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_JOIN_GROUP, req)
else:
try:
sck.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP, req)
time.sleep(1.2)
except:
pass
if not on:
try:
sck.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP, req)
return True
except:
return False
else:
# t = "joining {} from ip {} idx {} with mreq {}"
# self.log(t.format(srv.grp, srv.ip, srv.idx, repr(srv.mreq)), 6)
sck.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, req)
# t = "joining {} from ip {} idx {} with mreq {}"
# self.log(t.format(srv.grp, srv.ip, srv.idx, repr(srv.mreq)), 6)
sck.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, req)
return True
def hopper(self):
while self.args.mc_hop and self.running:
time.sleep(self.args.mc_hop)
if not self.running:
return
for srv in self.srv.values():
self.hop(srv, False)
# linux does leaves/joins twice with 0.2~1.05s spacing
time.sleep(1.2)
if not self.running:
return
for srv in self.srv.values():
self.hop(srv, True)
def map_client(self, cip: str) -> Optional[MC_Sck]:
try:

copyparty/pwhash.py (new file, 145 lines)

@@ -0,0 +1,145 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import argparse
import base64
import hashlib
import sys
import threading
from .__init__ import unicode
class PWHash(object):
def __init__(self, args: argparse.Namespace):
self.args = args
try:
alg, ac = args.ah_alg.split(",")
except:
alg = args.ah_alg
ac = {}
if alg == "none":
alg = ""
self.alg = alg
self.ac = ac
if not alg:
self.on = False
self.hash = unicode
return
self.on = True
self.salt = args.ah_salt.encode("utf-8")
self.cache: dict[str, str] = {}
self.mutex = threading.Lock()
self.hash = self._cache_hash
if alg == "sha2":
self._hash = self._gen_sha2
elif alg == "scrypt":
self._hash = self._gen_scrypt
elif alg == "argon2":
self._hash = self._gen_argon2
else:
t = "unsupported password hashing algorithm [{}], must be one of these: argon2 scrypt sha2 none"
raise Exception(t.format(alg))
def _cache_hash(self, plain: str) -> str:
with self.mutex:
try:
return self.cache[plain]
except:
pass
if not plain:
return ""
if len(plain) > 255:
raise Exception("password too long")
if len(self.cache) > 9000:
self.cache = {}
ret = self._hash(plain)
self.cache[plain] = ret
return ret
def _gen_sha2(self, plain: str) -> str:
its = int(self.ac[0]) if self.ac else 424242
bplain = plain.encode("utf-8")
ret = b"\n"
for _ in range(its):
ret = hashlib.sha512(self.salt + bplain + ret).digest()
return "+" + base64.urlsafe_b64encode(ret[:24]).decode("utf-8")
def _gen_scrypt(self, plain: str) -> str:
cost = 2 << 13
its = 2
blksz = 8
para = 4
try:
cost = 2 << int(self.ac[0])
its = int(self.ac[1])
blksz = int(self.ac[2])
para = int(self.ac[3])
except:
pass
ret = plain.encode("utf-8")
for _ in range(its):
ret = hashlib.scrypt(ret, salt=self.salt, n=cost, r=blksz, p=para, dklen=24)
return "+" + base64.urlsafe_b64encode(ret).decode("utf-8")
def _gen_argon2(self, plain: str) -> str:
from argon2.low_level import Type as ArgonType
from argon2.low_level import hash_secret
time_cost = 3
mem_cost = 256
parallelism = 4
version = 19
try:
time_cost = int(self.ac[0])
mem_cost = int(self.ac[1])
parallelism = int(self.ac[2])
version = int(self.ac[3])
except:
pass
bplain = plain.encode("utf-8")
bret = hash_secret(
secret=bplain,
salt=self.salt,
time_cost=time_cost,
memory_cost=mem_cost * 1024,
parallelism=parallelism,
hash_len=24,
type=ArgonType.ID,
version=version,
)
ret = bret.split(b"$")[-1].decode("utf-8")
return "+" + ret.replace("/", "_").replace("+", "-")
def stdin(self) -> None:
while True:
ln = sys.stdin.readline().strip()
if not ln:
break
print(self.hash(ln))
def cli(self) -> None:
import getpass
while True:
p1 = getpass.getpass("password> ")
p2 = getpass.getpass("again or just hit ENTER> ")
if p2 and p1 != p2:
print("\033[31minputs don't match; try again\033[0m", file=sys.stderr)
continue
print(self.hash(p1))
print()


@@ -32,6 +32,8 @@ class SMB(object):
self.asrv = hub.asrv
self.log = hub.log
self.files: dict[int, tuple[float, str]] = {}
self.noacc = self.args.smba
self.accs = not self.args.smba
lg.setLevel(logging.DEBUG if self.args.smbvvv else logging.INFO)
for x in ["impacket", "impacket.smbserver"]:
@@ -94,6 +96,14 @@ class SMB(object):
port = int(self.args.smb_port)
srv = smbserver.SimpleSMBServer(listenAddress=ip, listenPort=port)
try:
if self.accs:
srv.setAuthCallback(self._auth_cb)
except:
self.accs = False
self.noacc = True
t = "impacket too old; access permissions will not work! all accounts are admin!"
self.log("smb", t, 1)
ro = "no" if self.args.smbw else "yes" # (does nothing)
srv.addShare("A", "/", readOnly=ro)
@@ -119,24 +129,74 @@ class SMB(object):
def start(self) -> None:
Daemon(self.srv.start)
def _v2a(self, caller: str, vpath: str, *a: Any) -> tuple[VFS, str]:
def _auth_cb(self, *a, **ka):
debug("auth-result: %s %s", a, ka)
conndata = ka["connData"]
auth_ok = conndata["Authenticated"]
uname = ka["user_name"] if auth_ok else "*"
uname = self.asrv.iacct.get(uname, uname) or "*"
oldname = conndata.get("partygoer", "*") or "*"
cli_ip = conndata["ClientIP"]
cli_hn = ka["host_name"]
if uname != "*":
conndata["partygoer"] = uname
info("client %s [%s] authed as %s", cli_ip, cli_hn, uname)
elif oldname != "*":
info("client %s [%s] keeping old auth as %s", cli_ip, cli_hn, oldname)
elif auth_ok:
info("client %s [%s] authed as [*] (anon)", cli_ip, cli_hn)
else:
info("client %s [%s] rejected", cli_ip, cli_hn)
def _uname(self) -> str:
if self.noacc:
return LEELOO_DALLAS
try:
# you found it! my single worst bit of code so far
# (if you can think of a better way to track users through impacket i'm all ears)
cf0 = inspect.currentframe().f_back.f_back
cf = cf0.f_back
for n in range(3):
cl = cf.f_locals
if "connData" in cl:
return cl["connData"]["partygoer"]
cf = cf.f_back
raise Exception()
except:
warning(
"nyoron... %s <<-- %s <<-- %s <<-- %s",
cf0.f_code.co_name,
cf0.f_back.f_code.co_name,
cf0.f_back.f_back.f_code.co_name,
cf0.f_back.f_back.f_back.f_code.co_name,
)
return "*"
def _v2a(
self, caller: str, vpath: str, *a: Any, uname="", perms=None
) -> tuple[VFS, str]:
vpath = vpath.replace("\\", "/").lstrip("/")
# cf = inspect.currentframe().f_back
# c1 = cf.f_back.f_code.co_name
# c2 = cf.f_code.co_name
debug('%s("%s", %s)\033[K\033[0m', caller, vpath, str(a))
if not uname:
uname = self._uname()
if not perms:
perms = [True, True]
# TODO find a way to grab `identity` in smbComSessionSetupAndX and smb2SessionSetup
vfs, rem = self.asrv.vfs.get(vpath, LEELOO_DALLAS, True, True)
debug('%s("%s", %s) %s @%s\033[K\033[0m', caller, vpath, str(a), perms, uname)
vfs, rem = self.asrv.vfs.get(vpath, uname, *perms)
return vfs, vfs.canonical(rem)
def _listdir(self, vpath: str, *a: Any, **ka: Any) -> list[str]:
vpath = vpath.replace("\\", "/").lstrip("/")
# caller = inspect.currentframe().f_back.f_code.co_name
debug('listdir("%s", %s)\033[K\033[0m', vpath, str(a))
vfs, rem = self.asrv.vfs.get(vpath, LEELOO_DALLAS, False, False)
uname = self._uname()
# debug('listdir("%s", %s) @%s\033[K\033[0m', vpath, str(a), uname)
vfs, rem = self.asrv.vfs.get(vpath, uname, False, False)
_, vfs_ls, vfs_virt = vfs.ls(
rem, LEELOO_DALLAS, not self.args.no_scandir, [[False, False]]
rem, uname, not self.args.no_scandir, [[False, False]]
)
dirs = [x[0] for x in vfs_ls if stat.S_ISDIR(x[1].st_mode)]
fils = [x[0] for x in vfs_ls if x[0] not in dirs]
@@ -149,8 +209,8 @@ class SMB(object):
sz = 112 * 2 # ['.', '..']
for n, fn in enumerate(ls):
if sz >= 64000:
t = "listing only %d of %d files (%d byte); see impacket#1433"
warning(t, n, len(ls), sz)
t = "listing only %d of %d files (%d byte) in /%s; see impacket#1433"
warning(t, n, len(ls), sz, vpath)
break
nsz = len(fn.encode("utf-16", "replace"))
@@ -171,10 +231,12 @@ class SMB(object):
if wr and not self.args.smbw:
yeet("blocked write (no --smbw): " + vpath)
vfs, ap = self._v2a("open", vpath, *a)
uname = self._uname()
vfs, ap = self._v2a("open", vpath, *a, uname=uname, perms=[True, wr])
if wr:
if not vfs.axs.uwrite:
yeet("blocked write (no-write-acc): " + vpath)
t = "blocked write (no-write-acc %s): /%s @%s"
yeet(t % (vfs.axs.uwrite, vpath, uname))
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
@@ -204,7 +266,7 @@ class SMB(object):
_, vp = self.files.pop(fd)
vp, fn = os.path.split(vp)
vfs, rem = self.hub.asrv.vfs.get(vp, LEELOO_DALLAS, False, True)
vfs, rem = self.hub.asrv.vfs.get(vp, self._uname(), False, True)
vfs, rem = vfs.get_dbv(rem)
self.hub.up2k.hash_file(
vfs.realpath,
@@ -224,15 +286,18 @@ class SMB(object):
vp1 = vp1.lstrip("/")
vp2 = vp2.lstrip("/")
vfs2, ap2 = self._v2a("rename", vp2, vp1)
uname = self._uname()
vfs2, ap2 = self._v2a("rename", vp2, vp1, uname=uname)
if not vfs2.axs.uwrite:
yeet("blocked rename (no-write-acc): " + vp2)
t = "blocked write (no-write-acc %s): /%s @%s"
yeet(t % (vfs2.axs.uwrite, vp2, uname))
vfs1, _ = self.asrv.vfs.get(vp1, LEELOO_DALLAS, True, True)
vfs1, _ = self.asrv.vfs.get(vp1, uname, True, True, True)
if not vfs1.axs.umove:
yeet("blocked rename (no-move-acc): " + vp1)
t = "blocked rename (no-move-acc %s): /%s @%s"
yeet(t % (vfs1.axs.umove, vp1, uname))
self.hub.up2k.handle_mv(LEELOO_DALLAS, vp1, vp2)
self.hub.up2k.handle_mv(uname, vp1, vp2)
try:
bos.makedirs(ap2)
except:
@@ -242,52 +307,74 @@ class SMB(object):
if not self.args.smbw:
yeet("blocked mkdir (no --smbw): " + vpath)
vfs, ap = self._v2a("mkdir", vpath)
uname = self._uname()
vfs, ap = self._v2a("mkdir", vpath, uname=uname)
if not vfs.axs.uwrite:
yeet("blocked mkdir (no-write-acc): " + vpath)
t = "blocked mkdir (no-write-acc %s): /%s @%s"
yeet(t % (vfs.axs.uwrite, vpath, uname))
return bos.mkdir(ap)
def _stat(self, vpath: str, *a: Any, **ka: Any) -> os.stat_result:
return bos.stat(self._v2a("stat", vpath, *a)[1], *a, **ka)
try:
ap = self._v2a("stat", vpath, *a, perms=[True, False])[1]
ret = bos.stat(ap, *a, **ka)
# debug(" `-stat:ok")
return ret
except:
# white lie: windows freaks out if we raise due to an offline volume
# debug(" `-stat:NOPE (faking a directory)")
ts = int(time.time())
return os.stat_result((16877, -1, -1, 1, 1000, 1000, 8, ts, ts, ts))
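When the stat fails (for example an offline volume), the handler above returns a hand-built os.stat_result describing a plain directory instead of raising, since Windows SMB clients handle the error poorly. os.stat_result can be constructed from a 10-tuple of (mode, ino, dev, nlink, uid, gid, size, atime, mtime, ctime); a small sketch of that fallback in isolation:

import os
import stat
import time

def fake_dir_stat() -> os.stat_result:
    """Build a stat_result for a fictional directory (mode 0o40755 == 16877)."""
    ts = int(time.time())
    # (st_mode, st_ino, st_dev, st_nlink, st_uid, st_gid, st_size, st_atime, st_mtime, st_ctime)
    return os.stat_result((16877, -1, -1, 1, 1000, 1000, 8, ts, ts, ts))

if __name__ == "__main__":
    st = fake_dir_stat()
    print(stat.S_ISDIR(st.st_mode), st.st_size)  # True 8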
def _unlink(self, vpath: str) -> None:
if not self.args.smbw:
yeet("blocked delete (no --smbw): " + vpath)
# return bos.unlink(self._v2a("stat", vpath, *a)[1])
vfs, ap = self._v2a("delete", vpath)
uname = self._uname()
vfs, ap = self._v2a(
"delete", vpath, uname=uname, perms=[True, False, False, True]
)
if not vfs.axs.udel:
yeet("blocked delete (no-del-acc): " + vpath)
vpath = vpath.replace("\\", "/").lstrip("/")
self.hub.up2k.handle_rm(LEELOO_DALLAS, "1.7.6.2", [vpath], [])
self.hub.up2k.handle_rm(uname, "1.7.6.2", [vpath], [], False)
def _utime(self, vpath: str, times: tuple[float, float]) -> None:
if not self.args.smbw:
yeet("blocked utime (no --smbw): " + vpath)
vfs, ap = self._v2a("utime", vpath)
uname = self._uname()
vfs, ap = self._v2a("utime", vpath, uname=uname)
if not vfs.axs.uwrite:
yeet("blocked utime (no-write-acc): " + vpath)
t = "blocked utime (no-write-acc %s): /%s @%s"
yeet(t % (vfs.axs.uwrite, vpath, uname))
return bos.utime(ap, times)
def _p_exists(self, vpath: str) -> bool:
# ap = "?"
try:
bos.stat(self._v2a("p.exists", vpath)[1])
ap = self._v2a("p.exists", vpath, perms=[True, False])[1]
bos.stat(ap)
# debug(" `-exists((%s)->(%s)):ok", vpath, ap)
return True
except:
# debug(" `-exists((%s)->(%s)):NOPE", vpath, ap)
return False
def _p_getsize(self, vpath: str) -> int:
st = bos.stat(self._v2a("p.getsize", vpath)[1])
st = bos.stat(self._v2a("p.getsize", vpath, perms=[True, False])[1])
return st.st_size
def _p_isdir(self, vpath: str) -> bool:
try:
st = bos.stat(self._v2a("p.isdir", vpath)[1])
return stat.S_ISDIR(st.st_mode)
st = bos.stat(self._v2a("p.isdir", vpath, perms=[True, False])[1])
ret = stat.S_ISDIR(st.st_mode)
# debug(" `-isdir:%s:%s", st.st_mode, ret)
return ret
except:
return False

View File

@@ -1,6 +1,7 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import errno
import re
import select
import socket
@@ -80,7 +81,7 @@ class SSDPr(object):
ubase = "{}://{}:{}".format(proto, sip, sport)
zsl = self.args.zsl
url = zsl if "://" in zsl else ubase + "/" + zsl.lstrip("/")
name = "{} @ {}".format(self.args.doctitle, self.args.name)
name = self.args.doctitle
zs = zs.strip().format(c(ubase), c(url), c(name), c(self.args.zsid))
hc.reply(zs.encode("utf-8", "replace"))
return False # close connection
@@ -129,6 +130,17 @@ class SSDPd(MCast):
srv.hport = hp
self.log("listening")
try:
self.run2()
except OSError as ex:
if ex.errno != errno.EBADF:
raise
self.log("stopping due to {}".format(ex), "90")
self.log("stopped", 2)
def run2(self) -> None:
while self.running:
rdy = select.select(self.srv, [], [], self.args.z_chk or 180)
rx: list[socket.socket] = rdy[0] # type: ignore
@@ -148,8 +160,6 @@ class SSDPd(MCast):
)
self.log(t, 6)
self.log("stopped", 2)
def stop(self) -> None:
self.running = False
for srv in self.srv.values():
@@ -204,7 +214,7 @@ CONFIGID.UPNP.ORG: 1
srv.sck.sendto(zb, addr[:2])
if cip not in self.txc.c:
self.log("{} [{}] --> {}".format(srv.name, srv.ip, cip), "6")
self.log("{} [{}] --> {}".format(srv.name, srv.ip, cip), 6)
self.txc.add(cip)
self.txc.cln()

View File

@@ -1,6 +1,7 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import re
import stat
import tarfile
@@ -44,6 +45,7 @@ class StreamTar(StreamArc):
self,
log: "NamedLogger",
fgen: Generator[dict[str, Any], None, None],
cmp: str = "",
**kwargs: Any
):
super(StreamTar, self).__init__(log, fgen)
@@ -53,14 +55,41 @@ class StreamTar(StreamArc):
self.qfile = QFile()
self.errf: dict[str, Any] = {}
# python 3.8 changed to PAX_FORMAT as default,
# waste of space and don't care about the new features
# python 3.8 changed to PAX_FORMAT as default;
# slower, bigger, and no particular advantage
fmt = tarfile.GNU_FORMAT
self.tar = tarfile.open(fileobj=self.qfile, mode="w|", format=fmt) # type: ignore
if "pax" in cmp:
# unless a client asks for it (currently
# gnu-tar has wider support than pax-tar)
fmt = tarfile.PAX_FORMAT
cmp = re.sub(r"[^a-z0-9]*pax[^a-z0-9]*", "", cmp)
try:
cmp, lv = cmp.replace(":", ",").split(",")
lv = int(lv)
except:
lv = None
arg = {"name": None, "fileobj": self.qfile, "mode": "w", "format": fmt}
if cmp == "gz":
fun = tarfile.TarFile.gzopen
arg["compresslevel"] = lv if lv is not None else 3
elif cmp == "bz2":
fun = tarfile.TarFile.bz2open
arg["compresslevel"] = lv if lv is not None else 2
elif cmp == "xz":
fun = tarfile.TarFile.xzopen
arg["preset"] = lv if lv is not None else 1
else:
fun = tarfile.open
arg["mode"] = "w|"
self.tar = fun(**arg)
Daemon(self._gen, "star-gen")
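The new cmp parameter lets the client pick the tar flavor and compression as a short spec such as "gz", "bz2,9", "xz:1" or "pax,gz". A sketch of just the spec-parsing and tarfile-opening part, separated from the class (default levels mirror the diff: gz→3, bz2→2, xz→1):

import re
import tarfile

def open_tar(fileobj, cmp: str = "") -> tarfile.TarFile:
    """Open a streaming tar writer according to a compact spec like 'pax,xz:1'."""
    fmt = tarfile.GNU_FORMAT
    if "pax" in cmp:
        fmt = tarfile.PAX_FORMAT
        cmp = re.sub(r"[^a-z0-9]*pax[^a-z0-9]*", "", cmp)

    lv = None
    try:
        cmp, zs = cmp.replace(":", ",").split(",")
        lv = int(zs)
    except ValueError:
        pass  # no explicit compression level given

    arg = {"name": None, "fileobj": fileobj, "mode": "w", "format": fmt}
    if cmp == "gz":
        fun = tarfile.TarFile.gzopen
        arg["compresslevel"] = lv if lv is not None else 3
    elif cmp == "bz2":
        fun = tarfile.TarFile.bz2open
        arg["compresslevel"] = lv if lv is not None else 2
    elif cmp == "xz":
        fun = tarfile.TarFile.xzopen
        arg["preset"] = lv if lv is not None else 1
    else:
        fun = tarfile.open
        arg["mode"] = "w|"  # plain uncompressed stream
    return fun(**arg)

if __name__ == "__main__":
    import io
    buf = io.BytesIO()
    open_tar(buf, "gz,9").close()
    print(len(buf.getvalue()), "bytes of empty gzipped tar")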
def gen(self) -> Generator[Optional[bytes], None, None]:
buf = b""
try:
while True:
buf = self.qfile.q.get()
@@ -72,6 +101,12 @@ class StreamTar(StreamArc):
yield None
finally:
while buf:
try:
buf = self.qfile.q.get()
except:
pass
if self.errf:
bos.unlink(self.errf["ap"])
@@ -101,6 +136,9 @@ class StreamTar(StreamArc):
errors.append((f["vp"], f["err"]))
continue
if self.stopped:
break
try:
self.ser(f)
except:

View File

@@ -1,10 +1,14 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import os
import tempfile
from datetime import datetime
from .__init__ import CORES
from .bos import bos
from .th_cli import ThumbCli
from .util import UTC, vjoin
if True: # pylint: disable=using-constant-test
from typing import Any, Generator, Optional
@@ -21,10 +25,78 @@ class StreamArc(object):
):
self.log = log
self.fgen = fgen
self.stopped = False
def gen(self) -> Generator[Optional[bytes], None, None]:
raise Exception("override me")
def stop(self) -> None:
self.stopped = True
def gfilter(
fgen: Generator[dict[str, Any], None, None],
thumbcli: ThumbCli,
uname: str,
vtop: str,
fmt: str,
) -> Generator[dict[str, Any], None, None]:
from concurrent.futures import ThreadPoolExecutor
pend = []
with ThreadPoolExecutor(max_workers=CORES) as tp:
try:
for f in fgen:
task = tp.submit(enthumb, thumbcli, uname, vtop, f, fmt)
pend.append((task, f))
if pend[0][0].done() or len(pend) > CORES * 4:
task, f = pend.pop(0)
try:
f = task.result(600)
except:
pass
yield f
for task, f in pend:
try:
f = task.result(600)
except:
pass
yield f
except Exception as ex:
thumbcli.log("gfilter flushing ({})".format(ex))
for task, f in pend:
try:
task.result(600)
except:
pass
thumbcli.log("gfilter flushed")
def enthumb(
thumbcli: ThumbCli, uname: str, vtop: str, f: dict[str, Any], fmt: str
) -> dict[str, Any]:
rem = f["vp"]
ext = rem.rsplit(".", 1)[-1].lower()
if fmt == "opus" and ext in "aac|m4a|mp3|ogg|opus|wma".split("|"):
raise Exception()
vp = vjoin(vtop, rem.split("/", 1)[1])
vn, rem = thumbcli.asrv.vfs.get(vp, uname, True, False)
dbv, vrem = vn.get_dbv(rem)
thp = thumbcli.get(dbv, vrem, f["st"].st_mtime, fmt)
if not thp:
raise Exception()
ext = "jpg" if fmt == "j" else "webp" if fmt == "w" else fmt
sz = bos.path.getsize(thp)
st: os.stat_result = f["st"]
ts = st.st_mtime
f["ap"] = thp
f["vp"] = f["vp"].rsplit(".", 1)[0] + "." + ext
f["st"] = os.stat_result((st.st_mode, -1, -1, 1, 1000, 1000, sz, ts, ts, ts))
return f
def errdesc(errors: list[tuple[str, str]]) -> tuple[dict[str, Any], list[str]]:
report = ["copyparty failed to add the following files to the archive:", ""]
@@ -36,7 +108,7 @@ def errdesc(errors: list[tuple[str, str]]) -> tuple[dict[str, Any], list[str]]:
tf_path = tf.name
tf.write("\r\n".join(report).encode("utf-8", "replace"))
dt = datetime.utcnow().strftime("%Y-%m%d-%H%M%S")
dt = datetime.now(UTC).strftime("%Y-%m%d-%H%M%S")
bos.chmod(tf_path, 0o444)
return {

View File

@@ -28,8 +28,9 @@ if True: # pylint: disable=using-constant-test
import typing
from typing import Any, Optional, Union
from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, VT100, EnvParams, unicode
from .authsrv import AuthSrv
from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, EnvParams, unicode
from .authsrv import BAD_CFG, AuthSrv
from .cert import ensure_cert
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE
from .tcpsrv import TcpSrv
from .th_srv import HAVE_PIL, HAVE_VIPS, HAVE_WEBP, ThumbSrv
@@ -38,13 +39,19 @@ from .util import (
FFMPEG_URL,
VERSIONS,
Daemon,
DEF_EXP,
DEF_MTE,
DEF_MTH,
Garda,
HLog,
HMaccas,
ODict,
UTC,
alltrace,
ansi_re,
min_ex,
mp,
odfusion,
pybin,
start_log_thrs,
start_stackmon,
@@ -80,6 +87,7 @@ class SvcHub(object):
self.dargs = dargs
self.argv = argv
self.E: EnvParams = args.E
self.no_ansi = args.no_ansi
self.logf: Optional[typing.TextIO] = None
self.logf_base_fn = ""
self.stop_req = False
@@ -98,11 +106,6 @@ class SvcHub(object):
self.iphash = HMaccas(os.path.join(self.E.cfg, "iphash"), 8)
# for non-http clients (ftp)
self.bans: dict[str, int] = {}
self.gpwd = Garda(self.args.ban_pw)
self.g404 = Garda(self.args.ban_404)
if args.sss or args.s >= 3:
args.ss = True
args.no_dav = True
@@ -118,7 +121,6 @@ class SvcHub(object):
args.no_mv = True
args.hardlink = True
args.vague_403 = True
args.ban_404 = "50,60,1440"
args.nih = True
if args.s:
@@ -128,6 +130,20 @@ class SvcHub(object):
args.no_robots = True
args.force_js = True
if not self._process_config():
raise Exception(BAD_CFG)
# for non-http clients (ftp)
self.bans: dict[str, int] = {}
self.gpwd = Garda(self.args.ban_pw)
self.g404 = Garda(self.args.ban_404)
self.g403 = Garda(self.args.ban_403)
self.g422 = Garda(self.args.ban_422)
self.gurl = Garda(self.args.ban_url)
self.log_div = 10 ** (6 - args.log_tdec)
self.log_efmt = "%02d:%02d:%02d.%0{}d".format(args.log_tdec)
self.log_dfmt = "%04d-%04d-%06d.%0{}d".format(args.log_tdec)
self.log = self._log_disabled if args.q else self._log_enabled
if args.lo:
self._setup_logfile(printed)
@@ -157,6 +173,14 @@ class SvcHub(object):
ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]
args.theme = "{0}{1} {0} {1}".format(ch, bri)
if args.nih:
args.vname = ""
args.doctitle = args.doctitle.replace(" @ --name", "")
else:
args.vname = args.name
args.doctitle = args.doctitle.replace("--name", args.vname)
args.bname = args.bname.replace("--name", args.vname) or args.vname
if args.log_fk:
args.log_fk = re.compile(args.log_fk)
@@ -177,10 +201,11 @@ class SvcHub(object):
self.log("root", "max clients: {}".format(self.args.nc))
if not self._process_config():
raise Exception("bad config")
self.tcpsrv = TcpSrv(self)
if not self.tcpsrv.srv and self.args.ign_ebind_all:
self.args.no_fastboot = True
self.up2k = Up2k(self)
decs = {k: 1 for k in self.args.th_dec.split(",")}
@@ -238,7 +263,8 @@ class SvcHub(object):
if args.ftp or args.ftps:
from .ftpd import Ftpd
self.ftpd = Ftpd(self)
self.ftpd: Optional[Ftpd] = None
Daemon(self.start_ftpd, "start_ftpd")
zms += "f" if args.ftp else "F"
if args.smb:
@@ -268,6 +294,28 @@ class SvcHub(object):
self.broker = Broker(self)
def start_ftpd(self) -> None:
time.sleep(30)
if self.ftpd:
return
self.restart_ftpd()
def restart_ftpd(self) -> None:
if not hasattr(self, "ftpd"):
return
from .ftpd import Ftpd
if self.ftpd:
return # todo
if not os.path.exists(self.args.cert):
ensure_cert(self.log, self.args)
self.ftpd = Ftpd(self)
self.log("root", "started FTPd")
def thr_httpsrv_up(self) -> None:
time.sleep(1 if self.args.ign_ebind_all else 5)
expected = self.broker.num_workers * self.tcpsrv.nsrv
@@ -299,12 +347,20 @@ class SvcHub(object):
if self.httpsrv_up != self.broker.num_workers:
return
time.sleep(0.1) # purely cosmetic dw
ar = self.args
for _ in range(10 if ar.ftp or ar.ftps else 0):
time.sleep(0.03)
if self.ftpd:
break
if self.tcpsrv.qr:
self.log("qr-code", self.tcpsrv.qr)
else:
self.log("root", "workers OK\n")
self.after_httpsrv_up()
def after_httpsrv_up(self) -> None:
self.up2k.init_vols()
Daemon(self.sd_notify, "sd-notify")
@@ -350,6 +406,52 @@ class SvcHub(object):
al.th_covers = set(al.th_covers.split(","))
for k in "c".split(" "):
vl = getattr(al, k)
if not vl:
continue
vl = [os.path.expanduser(x) if x.startswith("~") else x for x in vl]
setattr(al, k, vl)
for k in "lo hist ssl_log".split(" "):
vs = getattr(al, k)
if vs and vs.startswith("~"):
setattr(al, k, os.path.expanduser(vs))
for k in "sus_urls nonsus_urls".split(" "):
vs = getattr(al, k)
if not vs or vs == "no":
setattr(al, k, None)
else:
setattr(al, k, re.compile(vs))
if not al.sus_urls:
al.ban_url = "no"
elif al.ban_url == "no":
al.sus_urls = None
if al.xff_src in ("any", "0", ""):
al.xff_re = None
else:
zs = al.xff_src.replace(" ", "").replace(".", "\\.").replace(",", "|")
al.xff_re = re.compile("^(?:" + zs + ")")
mte = ODict.fromkeys(DEF_MTE.split(","), True)
al.mte = odfusion(mte, al.mte)
mth = ODict.fromkeys(DEF_MTH.split(","), True)
al.mth = odfusion(mth, al.mth)
exp = ODict.fromkeys(DEF_EXP.split(" "), True)
al.exp_md = odfusion(exp, al.exp_md.replace(" ", ","))
al.exp_lg = odfusion(exp, al.exp_lg.replace(" ", ","))
for k in ["no_hash", "no_idx"]:
ptn = getattr(self.args, k)
if ptn:
setattr(self.args, k, re.compile(ptn))
return True
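_process_config turns the comma-separated xff_src prefixes into a single anchored regex (dots escaped, commas turned into alternation), so each request only needs one re.match to decide whether the peer is a trusted proxy. The conversion in isolation (the example prefixes are mine):

import re

def xff_src_to_re(xff_src: str):
    """'10., 127.' -> compiled regex matching addresses that start with either prefix."""
    if xff_src in ("any", "0", ""):
        return None  # trust everyone; no filtering
    zs = xff_src.replace(" ", "").replace(".", "\\.").replace(",", "|")
    return re.compile("^(?:" + zs + ")")

if __name__ == "__main__":
    ptn = xff_src_to_re("10., 127.")
    for ip in ("10.1.2.3", "127.0.0.1", "192.168.1.9"):
        print(ip, bool(ptn.match(ip)))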
def _setlimits(self) -> None:
@@ -393,7 +495,7 @@ class SvcHub(object):
self.args.nc = min(self.args.nc, soft // 2)
def _logname(self) -> str:
dt = datetime.utcnow()
dt = datetime.now(UTC)
fn = str(self.args.lo)
for fs in "YmdHMS":
fs = "%" + fs
@@ -580,19 +682,25 @@ class SvcHub(object):
ret = 1
try:
self.pr("OPYTHAT")
tasks = []
slp = 0.0
if self.mdns:
Daemon(self.mdns.stop)
tasks.append(Daemon(self.mdns.stop, "mdns"))
slp = time.time() + 0.5
if self.ssdp:
Daemon(self.ssdp.stop)
tasks.append(Daemon(self.ssdp.stop, "ssdp"))
slp = time.time() + 0.5
self.broker.shutdown()
self.tcpsrv.shutdown()
self.up2k.shutdown()
if hasattr(self, "smbd"):
slp = max(slp, time.time() + 0.5)
tasks.append(Daemon(self.smbd.stop, "smbd"))
if self.thumbsrv:
self.thumbsrv.shutdown()
@@ -602,17 +710,19 @@ class SvcHub(object):
break
if n == 3:
self.pr("waiting for thumbsrv (10sec)...")
self.log("root", "waiting for thumbsrv (10sec)...")
if hasattr(self, "smbd"):
slp = max(slp, time.time() + 0.5)
Daemon(self.kill9, a=(1,))
Daemon(self.smbd.stop)
zf = max(time.time() - slp, 0)
Daemon(self.kill9, a=(zf + 0.5,))
while time.time() < slp:
time.sleep(0.1)
if not next((x for x in tasks if x.is_alive), None):
break
self.pr("nailed it", end="")
time.sleep(0.05)
self.log("root", "nailed it")
ret = self.retcode
except:
self.pr("\033[31m[ error during shutdown ]\n{}\033[0m".format(min_ex()))
@@ -622,7 +732,7 @@ class SvcHub(object):
print("\033]0;\033\\", file=sys.stderr, end="")
sys.stderr.flush()
self.pr("\033[0m")
self.pr("\033[0m", end="")
if self.logf:
self.logf.close()
@@ -634,8 +744,14 @@ class SvcHub(object):
return
with self.log_mutex:
ts = datetime.utcnow().strftime("%Y-%m%d-%H%M%S.%f")[:-3]
self.logf.write("@{} [{}\033[0m] {}\n".format(ts, src, msg))
zd = datetime.now(UTC)
ts = self.log_dfmt % (
zd.year,
zd.month * 100 + zd.day,
(zd.hour * 100 + zd.minute) * 100 + zd.second,
zd.microsecond // self.log_div,
)
self.logf.write("@%s [%s\033[0m] %s\n" % (ts, src, msg))
now = time.time()
if now >= self.next_day:
@@ -646,7 +762,7 @@ class SvcHub(object):
self.logf.close()
self._setup_logfile("")
dt = datetime.utcnow()
dt = datetime.now(UTC)
# unix timestamp of next 00:00:00 (leap-seconds safe)
day_now = dt.day
@@ -654,34 +770,50 @@ class SvcHub(object):
dt += timedelta(hours=12)
dt = dt.replace(hour=0, minute=0, second=0)
self.next_day = calendar.timegm(dt.utctimetuple())
try:
tt = dt.utctimetuple()
except:
# still makes me hella uncomfortable
tt = dt.timetuple()
self.next_day = calendar.timegm(tt)
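Several hunks in this file swap the deprecated datetime.utcnow()/utcfromtimestamp() for timezone-aware datetime.now(UTC)/fromtimestamp(now, UTC), which is what python 3.12 started warning about. The next-day rollover above finds the unix timestamp of the next 00:00:00 UTC by stepping forward in 12-hour jumps until the day changes; a stdlib-only sketch of that calculation (a simplification, assuming utctimetuple() is available on the aware datetime):

import calendar
import time
from datetime import datetime, timedelta, timezone

UTC = timezone.utc

def next_utc_midnight(now: float) -> int:
    """Unix timestamp of the next 00:00:00 UTC after `now`."""
    dt = datetime.fromtimestamp(now, UTC)
    day_now = dt.day
    while dt.day == day_now:
        dt += timedelta(hours=12)  # leap-second safe: never lands exactly on 24:00
    dt = dt.replace(hour=0, minute=0, second=0, microsecond=0)
    return calendar.timegm(dt.utctimetuple())

if __name__ == "__main__":
    ts = next_utc_midnight(time.time())
    print(datetime.fromtimestamp(ts, UTC).isoformat())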
def _log_enabled(self, src: str, msg: str, c: Union[int, str] = 0) -> None:
"""handles logging from all components"""
with self.log_mutex:
now = time.time()
if now >= self.next_day:
dt = datetime.utcfromtimestamp(now)
print("\033[36m{}\033[0m\n".format(dt.strftime("%Y-%m-%d")), end="")
dt = datetime.fromtimestamp(now, UTC)
zs = "{}\n" if self.no_ansi else "\033[36m{}\033[0m\n"
zs = zs.format(dt.strftime("%Y-%m-%d"))
print(zs, end="")
self._set_next_day()
if self.logf:
self.logf.write(zs)
fmt = "\033[36m{} \033[33m{:21} \033[0m{}\n"
if not VT100:
fmt = "{} {:21} {}\n"
fmt = "\033[36m%s \033[33m%-21s \033[0m%s\n"
if self.no_ansi:
fmt = "%s %-21s %s\n"
if "\033" in msg:
msg = ansi_re.sub("", msg)
if "\033" in src:
src = ansi_re.sub("", src)
elif c:
if isinstance(c, int):
msg = "\033[3{}m{}\033[0m".format(c, msg)
msg = "\033[3%sm%s\033[0m" % (c, msg)
elif "\033" not in c:
msg = "\033[{}m{}\033[0m".format(c, msg)
msg = "\033[%sm%s\033[0m" % (c, msg)
else:
msg = "{}{}\033[0m".format(c, msg)
msg = "%s%s\033[0m" % (c, msg)
ts = datetime.utcfromtimestamp(now).strftime("%H:%M:%S.%f")[:-3]
msg = fmt.format(ts, src, msg)
zd = datetime.fromtimestamp(now, UTC)
ts = self.log_efmt % (
zd.hour,
zd.minute,
zd.second,
zd.microsecond // self.log_div,
)
msg = fmt % (ts, src, msg)
try:
print(msg, end="")
except UnicodeEncodeError:

View File

@@ -221,6 +221,7 @@ class StreamZip(StreamArc):
fgen: Generator[dict[str, Any], None, None],
utf8: bool = False,
pre_crc: bool = False,
**kwargs: Any
) -> None:
super(StreamZip, self).__init__(log, fgen)
@@ -275,6 +276,7 @@ class StreamZip(StreamArc):
def gen(self) -> Generator[bytes, None, None]:
errf: dict[str, Any] = {}
errors = []
mbuf = b""
try:
for f in self.fgen:
if "err" in f:
@@ -283,13 +285,20 @@ class StreamZip(StreamArc):
try:
for x in self.ser(f):
yield x
mbuf += x
if len(mbuf) >= 16384:
yield mbuf
mbuf = b""
except GeneratorExit:
raise
except:
ex = min_ex(5, True).replace("\n", "\n-- ")
errors.append((f["vp"], ex))
if mbuf:
yield mbuf
mbuf = b""
if errors:
errf, txt = errdesc(errors)
self.log("\n".join(([repr(errf)] + txt[1:])))
@@ -299,20 +308,23 @@ class StreamZip(StreamArc):
cdir_pos = self.pos
for name, sz, ts, crc, h_pos in self.items:
buf = gen_hdr(h_pos, name, sz, ts, self.utf8, crc, self.pre_crc)
yield self._ct(buf)
mbuf += self._ct(buf)
if len(mbuf) >= 16384:
yield mbuf
mbuf = b""
cdir_end = self.pos
_, need_64 = gen_ecdr(self.items, cdir_pos, cdir_end)
if need_64:
ecdir64_pos = self.pos
buf = gen_ecdr64(self.items, cdir_pos, cdir_end)
yield self._ct(buf)
mbuf += self._ct(buf)
buf = gen_ecdr64_loc(ecdir64_pos)
yield self._ct(buf)
mbuf += self._ct(buf)
ecdr, _ = gen_ecdr(self.items, cdir_pos, cdir_end)
yield self._ct(ecdr)
yield mbuf + self._ct(ecdr)
finally:
if errf:
bos.unlink(errf["ap"])
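The StreamZip change stops yielding every tiny header on its own; output is accumulated in mbuf and only yielded once at least 16 KiB has piled up, cutting per-chunk overhead in the HTTP layer. The same buffering pattern in isolation (chunk size and the source generator here are illustrative):

from typing import Generator, Iterable

def coalesce(chunks: Iterable[bytes], bufsz: int = 16384) -> Generator[bytes, None, None]:
    """Re-chunk a stream of small byte strings into pieces of at least `bufsz` bytes."""
    mbuf = b""
    for buf in chunks:
        mbuf += buf
        if len(mbuf) >= bufsz:
            yield mbuf
            mbuf = b""
    if mbuf:
        yield mbuf  # flush the tail

if __name__ == "__main__":
    small = (b"x" * 100 for _ in range(1000))
    sizes = [len(x) for x in coalesce(small)]
    print(len(sizes), "chunks,", sizes[0], "bytes in the first one")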

View File

@@ -7,13 +7,15 @@ import socket
import sys
import time
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, VT100, unicode
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, unicode
from .cert import gencert
from .stolen.qrcodegen import QrCode
from .util import (
E_ACCESS,
E_ADDR_IN_USE,
E_ADDR_NOT_AVAIL,
E_UNREACH,
IP6ALL,
Netdev,
min_ex,
sunpack,
@@ -253,6 +255,9 @@ class TcpSrv(object):
srvs: list[socket.socket] = []
for srv in self.srv:
ip, port = srv.getsockname()[:2]
if ip == IP6ALL:
ip = "::" # jython
try:
srv.listen(self.args.nc)
try:
@@ -274,6 +279,8 @@ class TcpSrv(object):
srv.close()
continue
t = "\n\nERROR: could not open listening socket, probably because one of the server ports ({}) is busy on one of the requested interfaces ({}); avoid this issue by specifying a different port (-p 3939) and/or a specific interface to listen on (-i 192.168.56.1)\n"
self.log("tcpsrv", t.format(port, ip), 1)
raise
bound.append((ip, port))
@@ -295,6 +302,8 @@ class TcpSrv(object):
def _distribute_netdevs(self):
self.hub.broker.say("set_netdevs", self.netdevs)
self.hub.start_zeroconf()
gencert(self.log, self.args, self.netdevs)
self.hub.restart_ftpd()
def shutdown(self) -> None:
self.stopping = True
@@ -322,7 +331,7 @@ class TcpSrv(object):
if k not in netdevs:
removed = "{} = {}".format(k, v)
t = "network change detected:\n added {}\nremoved {}"
t = "network change detected:\n added {}\033[0;33m\nremoved {}"
self.log("tcpsrv", t.format(added, removed), 3)
self.netdevs = netdevs
self._distribute_netdevs()
@@ -501,7 +510,7 @@ class TcpSrv(object):
zoom = 1
qr = qrc.render(zoom, pad)
if not VT100:
if self.args.no_ansi:
return "{}\n{}".format(txt, qr)
halfc = "\033[40;48;5;{0}m{1}\033[47;48;5;{2}m"

View File

@@ -31,7 +31,7 @@ class ThumbCli(object):
if not c:
raise Exception()
except:
c = {k: {} for k in ["thumbable", "pil", "vips", "ffi", "ffv", "ffa"]}
c = {k: set() for k in ["thumbable", "pil", "vips", "ffi", "ffv", "ffa"]}
self.thumbable = c["thumbable"]
self.fmt_pil = c["pil"]
@@ -94,7 +94,7 @@ class ThumbCli(object):
self.log("no histpath for [{}]".format(ptop))
return None
tpath = thumb_path(histpath, rem, mtime, fmt)
tpath = thumb_path(histpath, rem, mtime, fmt, self.fmt_ffa)
tpaths = [tpath]
if fmt == "w":
# also check for jpg (maybe webp is unavailable)
@@ -108,6 +108,7 @@ class ThumbCli(object):
if st.st_size:
ret = tpath = tp
fmt = ret.rsplit(".")[1]
break
else:
abort = True
except:

View File

@@ -13,13 +13,14 @@ import time
from queue import Queue
from .__init__ import ANYWIN, TYPE_CHECKING
from .authsrv import VFS
from .bos import bos
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
from .util import (
FFMPEG_URL,
BytesIO,
Cooldown,
Daemon,
FFMPEG_URL,
Pebkac,
afsenc,
fsenc,
@@ -36,14 +37,21 @@ if TYPE_CHECKING:
from .svchub import SvcHub
HAVE_PIL = False
HAVE_PILF = False
HAVE_HEIF = False
HAVE_AVIF = False
HAVE_WEBP = False
try:
from PIL import ExifTags, Image, ImageOps
from PIL import ExifTags, Image, ImageFont, ImageOps
HAVE_PIL = True
try:
ImageFont.load_default(size=16)
HAVE_PILF = True
except:
pass
try:
Image.new("RGB", (2, 2)).save(BytesIO(), format="webp")
HAVE_WEBP = True
@@ -78,17 +86,23 @@ except:
HAVE_VIPS = False
def thumb_path(histpath: str, rem: str, mtime: float, fmt: str) -> str:
def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -> str:
# base16 = 16 = 256
# b64-lc = 38 = 1444
# base64 = 64 = 4096
rd, fn = vsplit(rem)
if rd:
h = hashlib.sha512(afsenc(rd)).digest()
b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24]
rd = "{}/{}/".format(b64[:2], b64[2:4]).lower() + b64
else:
rd = "top"
if not rd:
rd = "\ntop"
# spectrograms are never cropped; strip fullsize flag
ext = rem.split(".")[-1].lower()
if ext in ffa and fmt in ("wf", "jf"):
fmt = fmt[:1]
rd += "\n" + fmt
h = hashlib.sha512(afsenc(rd)).digest()
b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24]
rd = "{}/{}/".format(b64[:2], b64[2:4]).lower() + b64
# could keep original filenames but this is safer re pathlen
h = hashlib.sha512(afsenc(fn)).digest()
@@ -97,7 +111,8 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str) -> str:
if fmt in ("opus", "caf"):
cat = "ac"
else:
fmt = "webp" if fmt == "w" else "png" if fmt == "p" else "jpg"
fc = fmt[:1]
fmt = "webp" if fc == "w" else "png" if fc == "p" else "jpg"
cat = "th"
return "{}/{}/{}/{}.{:x}.{}".format(histpath, cat, rd, fn, int(mtime), fmt)
@@ -110,8 +125,6 @@ class ThumbSrv(object):
self.args = hub.args
self.log_func = hub.log
res = hub.args.th_size.split("x")
self.res = tuple([int(x) for x in res])
self.poke_cd = Cooldown(self.args.th_poke)
self.mutex = threading.Lock()
@@ -119,7 +132,7 @@ class ThumbSrv(object):
self.stopping = False
self.nthr = max(1, self.args.th_mt)
self.q: Queue[Optional[tuple[str, str]]] = Queue(self.nthr * 4)
self.q: Queue[Optional[tuple[str, str, str, VFS]]] = Queue(self.nthr * 4)
for n in range(self.nthr):
Daemon(self.worker, "thumb-{}-{}".format(n, self.nthr))
@@ -184,13 +197,17 @@ class ThumbSrv(object):
with self.mutex:
return not self.nthr
def getres(self, vn: VFS) -> tuple[int, int]:
w, h = vn.flags["thsize"].split("x")
return int(w), int(h)
def get(self, ptop: str, rem: str, mtime: float, fmt: str) -> Optional[str]:
histpath = self.asrv.vfs.histtab.get(ptop)
if not histpath:
self.log("no histpath for [{}]".format(ptop))
return None
tpath = thumb_path(histpath, rem, mtime, fmt)
tpath = thumb_path(histpath, rem, mtime, fmt, self.fmt_ffa)
abspath = os.path.join(ptop, rem)
cond = threading.Condition(self.mutex)
do_conv = False
@@ -211,8 +228,14 @@ class ThumbSrv(object):
do_conv = True
if do_conv:
self.q.put((abspath, tpath))
self.log("conv {} \033[0m{}".format(tpath, abspath), c=6)
allvols = list(self.asrv.vfs.all_vols.values())
vn = next((x for x in allvols if x.realpath == ptop), None)
if not vn:
self.log("ptop [{}] not in {}".format(ptop, allvols), 3)
vn = self.asrv.vfs.all_aps[0][1]
self.q.put((abspath, tpath, fmt, vn))
self.log("conv {} :{} \033[0m{}".format(tpath, fmt, abspath), c=6)
while not self.stopping:
with self.mutex:
@@ -248,7 +271,7 @@ class ThumbSrv(object):
if not task:
break
abspath, tpath = task
abspath, tpath, fmt, vn = task
ext = abspath.split(".")[-1].lower()
png_ok = False
funs = []
@@ -274,10 +297,14 @@ class ThumbSrv(object):
tdir, tfn = os.path.split(tpath)
ttpath = os.path.join(tdir, "w", tfn)
try:
bos.unlink(ttpath)
except:
pass
for fun in funs:
try:
fun(abspath, ttpath)
fun(abspath, ttpath, fmt, vn)
break
except Exception as ex:
msg = "{} could not create thumbnail of {}\n{}"
@@ -311,9 +338,10 @@ class ThumbSrv(object):
with self.mutex:
self.nthr -= 1
def fancy_pillow(self, im: "Image.Image") -> "Image.Image":
def fancy_pillow(self, im: "Image.Image", fmt: str, vn: VFS) -> "Image.Image":
# exif_transpose is expensive (loads full image + unconditional copy)
r = max(*self.res) * 2
res = self.getres(vn)
r = max(*res) * 2
im.thumbnail((r, r), resample=Image.LANCZOS)
try:
k = next(k for k, v in ExifTags.TAGS.items() if v == "Orientation")
@@ -327,23 +355,23 @@ class ThumbSrv(object):
if rot in rots:
im = im.transpose(rots[rot])
if self.args.th_no_crop:
im.thumbnail(self.res, resample=Image.LANCZOS)
if fmt.endswith("f"):
im.thumbnail(res, resample=Image.LANCZOS)
else:
iw, ih = im.size
dw, dh = self.res
dw, dh = res
res = (min(iw, dw), min(ih, dh))
im = ImageOps.fit(im, res, method=Image.LANCZOS)
return im
def conv_pil(self, abspath: str, tpath: str) -> None:
def conv_pil(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
with Image.open(fsenc(abspath)) as im:
try:
im = self.fancy_pillow(im)
im = self.fancy_pillow(im, fmt, vn)
except Exception as ex:
self.log("fancy_pillow {}".format(ex), "90")
im.thumbnail(self.res)
im.thumbnail(self.getres(vn))
fmts = ["RGB", "L"]
args = {"quality": 40}
@@ -366,12 +394,12 @@ class ThumbSrv(object):
im.save(tpath, **args)
def conv_vips(self, abspath: str, tpath: str) -> None:
def conv_vips(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
crops = ["centre", "none"]
if self.args.th_no_crop:
if fmt.endswith("f"):
crops = ["none"]
w, h = self.res
w, h = self.getres(vn)
kw = {"height": h, "size": "down", "intent": "relative"}
for c in crops:
@@ -385,8 +413,8 @@ class ThumbSrv(object):
img.write_to_file(tpath, Q=40)
def conv_ffmpeg(self, abspath: str, tpath: str) -> None:
ret, _ = ffprobe(abspath, int(self.args.th_convt / 2))
def conv_ffmpeg(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
if not ret:
return
@@ -398,12 +426,13 @@ class ThumbSrv(object):
seek = [b"-ss", "{:.0f}".format(dur / 3).encode("utf-8")]
scale = "scale={0}:{1}:force_original_aspect_ratio="
if self.args.th_no_crop:
if fmt.endswith("f"):
scale += "decrease,setsar=1:1"
else:
scale += "increase,crop={0}:{1},setsar=1:1"
bscale = scale.format(*list(self.res)).encode("utf-8")
res = self.getres(vn)
bscale = scale.format(*list(res)).encode("utf-8")
# fmt: off
cmd = [
b"ffmpeg",
@@ -435,11 +464,11 @@ class ThumbSrv(object):
]
cmd += [fsenc(tpath)]
self._run_ff(cmd)
self._run_ff(cmd, vn)
def _run_ff(self, cmd: list[bytes]) -> None:
def _run_ff(self, cmd: list[bytes], vn: VFS) -> None:
# self.log((b" ".join(cmd)).decode("utf-8"))
ret, _, serr = runcmd(cmd, timeout=self.args.th_convt)
ret, _, serr = runcmd(cmd, timeout=vn.flags["convt"])
if not ret:
return
@@ -482,8 +511,8 @@ class ThumbSrv(object):
self.log(t + txt, c=c)
raise sp.CalledProcessError(ret, (cmd[0], b"...", cmd[-1]))
def conv_waves(self, abspath: str, tpath: str) -> None:
ret, _ = ffprobe(abspath, int(self.args.th_convt / 2))
def conv_waves(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
if "ac" not in ret:
raise Exception("not audio")
@@ -508,10 +537,10 @@ class ThumbSrv(object):
# fmt: on
cmd += [fsenc(tpath)]
self._run_ff(cmd)
self._run_ff(cmd, vn)
def conv_spec(self, abspath: str, tpath: str) -> None:
ret, _ = ffprobe(abspath, int(self.args.th_convt / 2))
def conv_spec(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
if "ac" not in ret:
raise Exception("not audio")
@@ -551,23 +580,34 @@ class ThumbSrv(object):
]
cmd += [fsenc(tpath)]
self._run_ff(cmd)
self._run_ff(cmd, vn)
def conv_opus(self, abspath: str, tpath: str) -> None:
def conv_opus(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
if self.args.no_acode:
raise Exception("disabled in server config")
ret, _ = ffprobe(abspath, int(self.args.th_convt / 2))
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
if "ac" not in ret:
raise Exception("not audio")
try:
dur = ret[".dur"][1]
except:
dur = 0
src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus"
want_caf = tpath.endswith(".caf")
tmp_opus = tpath
if want_caf:
tmp_opus = tpath.rsplit(".", 1)[0] + ".opus"
tmp_opus = tpath + ".opus"
try:
bos.unlink(tmp_opus)
except:
pass
if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)):
caf_src = abspath if src_opus else tmp_opus
if not want_caf or not src_opus:
# fmt: off
cmd = [
b"ffmpeg",
@@ -582,9 +622,34 @@ class ThumbSrv(object):
fsenc(tmp_opus)
]
# fmt: on
self._run_ff(cmd)
self._run_ff(cmd, vn)
if want_caf:
# iOS fails to play some "insufficiently complex" files
# (average file shorter than 8 seconds), so of course we
# fix that by mixing in some inaudible pink noise :^)
# 6.3 sec seems like the cutoff so lets do 7, and
# 7 sec of psyqui-musou.opus @ 3:50 is 174 KiB
if want_caf and (dur < 20 or bos.path.getsize(caf_src) < 256 * 1024):
# fmt: off
cmd = [
b"ffmpeg",
b"-nostdin",
b"-v", b"error",
b"-hide_banner",
b"-i", fsenc(abspath),
b"-filter_complex", b"anoisesrc=a=0.001:d=7:c=pink,asplit[l][r]; [l][r]amerge[s]; [0:a:0][s]amix",
b"-map_metadata", b"-1",
b"-ac", b"2",
b"-c:a", b"libopus",
b"-b:a", b"128k",
b"-f", b"caf",
fsenc(tpath)
]
# fmt: on
self._run_ff(cmd, vn)
elif want_caf:
# simple remux should be safe
# fmt: off
cmd = [
b"ffmpeg",
@@ -599,7 +664,13 @@ class ThumbSrv(object):
fsenc(tpath)
]
# fmt: on
self._run_ff(cmd)
self._run_ff(cmd, vn)
if tmp_opus != tpath:
try:
bos.unlink(tmp_opus)
except:
pass
def poke(self, tdir: str) -> None:
if not self.poke_cd.poke(tdir):

View File

@@ -34,14 +34,14 @@ if True: # pylint: disable=using-constant-test
from typing import Any, Optional, Union
if TYPE_CHECKING:
from .httpconn import HttpConn
from .httpsrv import HttpSrv
class U2idx(object):
def __init__(self, conn: "HttpConn") -> None:
self.log_func = conn.log_func
self.asrv = conn.asrv
self.args = conn.args
def __init__(self, hsrv: "HttpSrv") -> None:
self.log_func = hsrv.log
self.asrv = hsrv.asrv
self.args = hsrv.args
self.timeout = self.args.srch_time
if not HAVE_SQLITE3:
@@ -51,7 +51,7 @@ class U2idx(object):
self.active_id = ""
self.active_cur: Optional["sqlite3.Cursor"] = None
self.cur: dict[str, "sqlite3.Cursor"] = {}
self.mem_cur = sqlite3.connect(":memory:").cursor()
self.mem_cur = sqlite3.connect(":memory:", check_same_thread=False).cursor()
self.mem_cur.execute(r"create table a (b text)")
self.p_end = 0.0
@@ -69,7 +69,7 @@ class U2idx(object):
fsize = body["size"]
fhash = body["hash"]
wark = up2k_wark_from_hashlist(self.args.salt, fsize, fhash)
wark = up2k_wark_from_hashlist(self.args.warksalt, fsize, fhash)
uq = "substr(w,1,16) = ? and w = ?"
uv: list[Union[str, int]] = [wark[:16], wark]
@@ -101,7 +101,8 @@ class U2idx(object):
uri = ""
try:
uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri())
cur = sqlite3.connect(uri, 2, uri=True).cursor()
db = sqlite3.connect(uri, 2, uri=True, check_same_thread=False)
cur = db.cursor()
cur.execute('pragma table_info("up")').fetchone()
self.log("ro: {}".format(db_path))
except:
@@ -112,7 +113,7 @@ class U2idx(object):
if not cur:
# on windows, this steals the write-lock from up2k.deferred_init --
# seen on win 10.0.17763.2686, py 3.10.4, sqlite 3.37.2
cur = sqlite3.connect(db_path, 2).cursor()
cur = sqlite3.connect(db_path, 2, check_same_thread=False).cursor()
self.log("opened {}".format(db_path))
self.cur[ptop] = cur
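U2idx opens each volume database read-only through a file: URI with mode=ro&nolock=1 (and now check_same_thread=False, since the cursor may be used from another worker thread), falling back to a normal connection if the URI form fails. A standalone sketch of the read-only open (the throwaway db path is illustrative):

import sqlite3
from pathlib import Path

def open_ro(db_path: str) -> sqlite3.Cursor:
    """Open an sqlite db read-only without taking any locks."""
    uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri())
    db = sqlite3.connect(uri, 2, uri=True, check_same_thread=False)
    cur = db.cursor()
    cur.execute('pragma table_info("up")').fetchone()  # probe; raises if not a valid sqlite file
    return cur

if __name__ == "__main__":
    # create a throwaway db first so the example runs end-to-end
    tmp = "/tmp/u2idx-demo.db"
    con = sqlite3.connect(tmp)
    con.execute("create table if not exists up (w text)")
    con.commit()
    con.close()
    print(open_ro(tmp).execute("select count(*) from up").fetchone())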
@@ -120,10 +121,10 @@ class U2idx(object):
def search(
self, vols: list[tuple[str, str, dict[str, Any]]], uq: str, lim: int
) -> tuple[list[dict[str, Any]], list[str]]:
) -> tuple[list[dict[str, Any]], list[str], bool]:
"""search by query params"""
if not HAVE_SQLITE3:
return [], []
return [], [], False
q = ""
v: Union[str, int] = ""
@@ -180,6 +181,11 @@ class U2idx(object):
is_date = True
have_up = True
elif v == "up_at":
v = "up.at"
is_date = True
have_up = True
elif v == "path":
v = "trim(?||up.rd,'/')"
va.append("\nrd")
@@ -275,7 +281,7 @@ class U2idx(object):
have_up: bool,
have_mt: bool,
lim: int,
) -> tuple[list[dict[str, Any]], list[str]]:
) -> tuple[list[dict[str, Any]], list[str], bool]:
done_flag: list[bool] = []
self.active_id = "{:.6f}_{}".format(
time.time(), threading.current_thread().ident
@@ -313,12 +319,10 @@ class U2idx(object):
sret = []
fk = flags.get("fk")
dots = flags.get("dotsrch")
fk_alg = 2 if "fka" in flags else 1
c = cur.execute(uq, tuple(vuv))
for hit in c:
w, ts, sz, rd, fn, ip, at = hit[:7]
lim -= 1
if lim < 0:
break
if rd.startswith("//") or fn.startswith("//"):
rd, fn = s3dec(rd, fn)
@@ -335,16 +339,21 @@ class U2idx(object):
else:
try:
ap = absreal(os.path.join(ptop, rd, fn))
inf = bos.stat(ap)
ino = 0 if ANYWIN or fk_alg == 2 else bos.stat(ap).st_ino
except:
continue
suf = (
"?k="
+ gen_filekey(
self.args.fk_salt, ap, sz, 0 if ANYWIN else inf.st_ino
)[:fk]
)
suf = "?k=" + gen_filekey(
fk_alg,
self.args.fk_salt,
ap,
sz,
ino,
)[:fk]
lim -= 1
if lim < 0:
break
seen_rps.add(rp)
sret.append({"ts": int(ts), "sz": sz, "rp": rp + suf, "w": w[:16]})
@@ -368,7 +377,7 @@ class U2idx(object):
ret.sort(key=itemgetter("rp"))
return ret, list(taglist.keys())
return ret, list(taglist.keys()), lim < 0
def terminator(self, identifier: str, done_flag: list[bool]) -> None:
for _ in range(self.timeout):

View File

@@ -22,8 +22,9 @@ from copy import deepcopy
from queue import Queue
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, WINDOWS
from .authsrv import LEELOO_DALLAS, VFS, AuthSrv
from .authsrv import LEELOO_DALLAS, SSEELOG, VFS, AuthSrv
from .bos import bos
from .cfg import vf_bmap, vf_cmap, vf_vmap
from .fsutil import Fstab
from .mtag import MParser, MTag
from .util import (
@@ -41,6 +42,7 @@ from .util import (
gen_filekey,
gen_filekey_dbg,
hidedir,
humansize,
min_ex,
quotep,
rand_name,
@@ -56,6 +58,7 @@ from .util import (
sfsenc,
spack,
statdir,
unhumanize,
vjoin,
vsplit,
w8b64dec,
@@ -73,8 +76,8 @@ if True: # pylint: disable=using-constant-test
if TYPE_CHECKING:
from .svchub import SvcHub
zs = "avif,avifs,bmp,gif,heic,heics,heif,heifs,ico,j2p,j2k,jp2,jpeg,jpg,jpx,png,tga,tif,tiff,webp"
CV_EXTS = set(zs.split(","))
zsg = "avif,avifs,bmp,gif,heic,heics,heif,heifs,ico,j2p,j2k,jp2,jpeg,jpg,jpx,png,tga,tif,tiff,webp"
CV_EXTS = set(zsg.split(","))
class Dbw(object):
@@ -110,7 +113,7 @@ class Up2k(object):
self.args = hub.args
self.log_func = hub.log
self.salt = self.args.salt
self.salt = self.args.warksalt
self.r_hash = re.compile("^[0-9a-zA-Z_-]{44}$")
self.gid = 0
@@ -125,12 +128,12 @@ class Up2k(object):
self.registry: dict[str, dict[str, dict[str, Any]]] = {}
self.flags: dict[str, dict[str, Any]] = {}
self.droppable: dict[str, list[str]] = {}
self.volnfiles: dict["sqlite3.Cursor", int] = {}
self.volsize: dict["sqlite3.Cursor", int] = {}
self.volstate: dict[str, str] = {}
self.vol_act: dict[str, float] = {}
self.busy_aps: set[str] = set()
self.dupesched: dict[str, list[tuple[str, str, float]]] = {}
self.snap_persist_interval = 300 # persist unfinished index every 5 min
self.snap_discard_interval = 21600 # drop unfinished after 6 hours inactivity
self.snap_prev: dict[str, Optional[tuple[int, float]]] = {}
self.mtag: Optional[MTag] = None
@@ -195,7 +198,8 @@ class Up2k(object):
if self.stop:
# up-mt consistency not guaranteed if init is interrupted;
# drop caches for a full scan on next boot
self._drop_caches()
with self.mutex:
self._drop_caches()
if self.pp:
self.pp.end = True
@@ -222,8 +226,10 @@ class Up2k(object):
self.log_func("up2k", msg, c)
def _gen_fk(self, salt: str, fspath: str, fsize: int, inode: int) -> str:
return gen_filekey_dbg(salt, fspath, fsize, inode, self.log, self.args.log_fk)
def _gen_fk(self, alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str:
return gen_filekey_dbg(
alg, salt, fspath, fsize, inode, self.log, self.args.log_fk
)
def _block(self, why: str) -> None:
self.blocked = why
@@ -261,6 +267,58 @@ class Up2k(object):
}
return json.dumps(ret, indent=4)
def get_unfinished(self) -> str:
if PY2 or not self.mutex.acquire(timeout=0.5):
return "{}"
ret: dict[str, tuple[int, int]] = {}
try:
for ptop, tab2 in self.registry.items():
nbytes = 0
nfiles = 0
drp = self.droppable.get(ptop, {})
for wark, job in tab2.items():
if wark in drp:
continue
nfiles += 1
try:
# close enough on average
nbytes += len(job["need"]) * job["size"] // len(job["hash"])
except:
pass
ret[ptop] = (nbytes, nfiles)
finally:
self.mutex.release()
return json.dumps(ret, indent=4)
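get_unfinished reports, per volume, how much data is still missing from incomplete up2k uploads; since only the list of needed chunk hashes is known, the byte count is approximated as needed_chunks / total_chunks of the full file size. The estimate in isolation (field names follow the registry entries above; the sample job is made up):

def remaining_bytes(job: dict) -> int:
    """Rough bytes left for one unfinished upload: the share of chunks still needed."""
    try:
        # "close enough on average": chunks are equal-sized except the last one
        return len(job["need"]) * job["size"] // len(job["hash"])
    except (KeyError, ZeroDivisionError):
        return 0

if __name__ == "__main__":
    job = {"size": 1000 * 1024, "hash": ["h%d" % i for i in range(10)], "need": ["h7", "h8", "h9"]}
    print(remaining_bytes(job), "bytes left of", job["size"])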
def get_volsize(self, ptop: str) -> tuple[int, int]:
with self.mutex:
return self._get_volsize(ptop)
def get_volsizes(self, ptops: list[str]) -> list[tuple[int, int]]:
ret = []
with self.mutex:
for ptop in ptops:
ret.append(self._get_volsize(ptop))
return ret
def _get_volsize(self, ptop: str) -> tuple[int, int]:
if "e2ds" not in self.flags.get(ptop, {}):
return (0, 0)
cur = self.cur[ptop]
nbytes = self.volsize[cur]
nfiles = self.volnfiles[cur]
for j in list(self.registry.get(ptop, {}).values()):
nbytes += j["size"]
nfiles += 1
return (nbytes, nfiles)
def rescan(
self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool
) -> str:
@@ -380,11 +438,11 @@ class Up2k(object):
if rd.startswith("//") or fn.startswith("//"):
rd, fn = s3dec(rd, fn)
fvp = "{}/{}".format(rd, fn).strip("/")
fvp = ("%s/%s" % (rd, fn)).strip("/")
if vp:
fvp = "{}/{}".format(vp, fvp)
fvp = "%s/%s" % (vp, fvp)
self._handle_rm(LEELOO_DALLAS, "", fvp, [])
self._handle_rm(LEELOO_DALLAS, "", fvp, [], True)
nrm += 1
if nrm:
@@ -575,16 +633,14 @@ class Up2k(object):
if self.args.re_dhash or [zv for zv in vols if "e2tsr" in zv.flags]:
self.args.re_dhash = False
self._drop_caches()
with self.mutex:
self._drop_caches()
for vol in vols:
if self.stop:
break
en: set[str] = set()
if "mte" in vol.flags:
en = set(vol.flags["mte"].split(","))
en = set(vol.flags.get("mte", {}))
self.entags[vol.realpath] = en
if "e2d" in vol.flags:
@@ -739,18 +795,34 @@ class Up2k(object):
ff = "\033[0;35m{}{:.0}"
fv = "\033[0;36m{}:\033[90m{}"
fx = set(("html_head",))
fdl = ("dbd", "lg_sbf", "md_sbf", "mte", "mth", "mtp", "nrand", "rand")
fd = {x: x for x in fdl}
fd = vf_bmap()
fd.update(vf_cmap())
fd.update(vf_vmap())
fd = {v: k for k, v in fd.items()}
fl = {
k: v
for k, v in flags.items()
if k not in fd or v != getattr(self.args, fd[k])
if k not in fd
or (
v != getattr(self.args, fd[k])
and str(v) != str(getattr(self.args, fd[k]))
)
}
for k1, k2 in vf_cmap().items():
if k1 not in fl or k1 in fx:
continue
if str(fl[k1]) == str(getattr(self.args, k2)):
del fl[k1]
else:
fl[k1] = ",".join(x for x in fl[k1])
a = [
(ft if v is True else ff if v is False else fv).format(k, str(v))
for k, v in fl.items()
if k not in fx
]
if not a:
a = ["\033[90mall-default"]
if a:
vpath = "?"
for k, v in self.asrv.vfs.all_vols.items():
@@ -761,14 +833,14 @@ class Up2k(object):
vpath += "/"
zs = " ".join(sorted(a))
zs = zs.replace("30mre.compile(", "30m(") # nohash
zs = zs.replace("90mre.compile(", "90m(") # nohash
self.log("/{} {}".format(vpath, zs), "35")
reg = {}
drp = None
path = os.path.join(histpath, "up2k.snap")
if bos.path.exists(path):
with gzip.GzipFile(path, "rb") as f:
snap = os.path.join(histpath, "up2k.snap")
if bos.path.exists(snap):
with gzip.GzipFile(snap, "rb") as f:
j = f.read().decode("utf-8")
reg2 = json.loads(j)
@@ -779,20 +851,20 @@ class Up2k(object):
pass
for k, job in reg2.items():
path = djoin(job["ptop"], job["prel"], job["name"])
if bos.path.exists(path):
fp = djoin(job["ptop"], job["prel"], job["name"])
if bos.path.exists(fp):
reg[k] = job
job["poke"] = time.time()
job["busy"] = {}
else:
self.log("ign deleted file in snap: [{}]".format(path))
self.log("ign deleted file in snap: [{}]".format(fp))
if drp is None:
drp = [k for k, v in reg.items() if not v.get("need", [])]
else:
drp = [x for x in drp if x in reg]
t = "loaded snap {} |{}| ({})".format(path, len(reg.keys()), len(drp or []))
t = "loaded snap {} |{}| ({})".format(snap, len(reg.keys()), len(drp or []))
ta = [t] + self._vis_reg_progress(reg)
self.log("\n".join(ta))
@@ -804,12 +876,17 @@ class Up2k(object):
if not HAVE_SQLITE3 or "e2d" not in flags or "d2d" in flags:
return None
if bos.makedirs(histpath):
hidedir(histpath)
try:
if bos.makedirs(histpath):
hidedir(histpath)
except:
return None
try:
cur = self._open_db(db_path)
self.cur[ptop] = cur
self.volsize[cur] = 0
self.volnfiles[cur] = 0
# speeds measured uploading 520 small files on a WD20SPZX (SMR 2.5" 5400rpm 4kb)
dbd = flags["dbd"]
@@ -856,6 +933,7 @@ class Up2k(object):
rei = vol.flags.get("noidx")
reh = vol.flags.get("nohash")
n4g = bool(vol.flags.get("noforget"))
ffat = "fat32" in vol.flags
cst = bos.stat(top)
dev = cst.st_dev if vol.flags.get("xdev") else 0
@@ -883,6 +961,11 @@ class Up2k(object):
rtop = absreal(top)
n_add = n_rm = 0
try:
if not bos.listdir(rtop):
t = "volume /%s at [%s] is empty; will not be indexed as this could be due to an offline filesystem"
self.log(t % (vol.vpath, rtop), 6)
return True, False
n_add = self._build_dir(
db,
top,
@@ -892,6 +975,7 @@ class Up2k(object):
rei,
reh,
n4g,
ffat,
[],
cst,
dev,
@@ -917,6 +1001,28 @@ class Up2k(object):
db.c.connection.commit()
if (
vol.flags.get("vmaxb")
or vol.flags.get("vmaxn")
or (self.args.stats and not self.args.nos_vol)
):
zs = "select count(sz), sum(sz) from up"
vn, vb = db.c.execute(zs).fetchone()
vb = vb or 0
vb += vn * 2048
self.volsize[db.c] = vb
self.volnfiles[db.c] = vn
vmaxb = unhumanize(vol.flags.get("vmaxb") or "0")
vmaxn = unhumanize(vol.flags.get("vmaxn") or "0")
t = "{:>5} / {:>5} ( {:>5} / {:>5} files) in {}".format(
humansize(vb, True),
humansize(vmaxb, True),
humansize(vn, True).rstrip("B"),
humansize(vmaxn, True).rstrip("B"),
vol.realpath,
)
self.log(t)
return True, bool(n_add or n_rm or do_vac)
def _build_dir(
@@ -929,6 +1035,7 @@ class Up2k(object):
rei: Optional[Pattern[str]],
reh: Optional[Pattern[str]],
n4g: bool,
ffat: bool,
seen: list[str],
cst: os.stat_result,
dev: int,
@@ -957,7 +1064,7 @@ class Up2k(object):
if WINDOWS:
rd = rd.replace("\\", "/").strip("/")
g = statdir(self.log_func, not self.args.no_scandir, False, cdir)
g = statdir(self.log_func, not self.args.no_scandir, True, cdir)
gl = sorted(g)
partials = set([x[0] for x in gl if "PARTIAL" in x[0]])
for iname, inf in gl:
@@ -972,8 +1079,14 @@ class Up2k(object):
continue
lmod = int(inf.st_mtime)
if stat.S_ISLNK(inf.st_mode):
try:
inf = bos.stat(abspath)
except:
continue
sz = inf.st_size
if fat32 and inf.st_mtime % 2:
if fat32 and not ffat and inf.st_mtime % 2:
fat32 = False
if stat.S_ISDIR(inf.st_mode):
@@ -990,7 +1103,19 @@ class Up2k(object):
# self.log(" dir: {}".format(abspath))
try:
ret += self._build_dir(
db, top, excl, abspath, rap, rei, reh, n4g, seen, inf, dev, xvol
db,
top,
excl,
abspath,
rap,
rei,
reh,
n4g,
fat32,
seen,
inf,
dev,
xvol,
)
except:
t = "failed to index subdir [{}]:\n{}"
@@ -1032,7 +1157,7 @@ class Up2k(object):
zh.update(cv.encode("utf-8", "replace"))
zh.update(spack(b"<d", cst.st_mtime))
dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
sql = "select d from dh where d = ? and h = ?"
sql = "select d from dh where d=? and +h=?"
try:
c = db.c.execute(sql, (rd, dhash))
drd = rd
@@ -1092,7 +1217,7 @@ class Up2k(object):
top, rp, dts, lmod, dsz, sz
)
self.log(t)
self.db_rm(db.c, rd, fn)
self.db_rm(db.c, rd, fn, 0)
ret += 1
db.n += 1
in_db = []
@@ -1175,7 +1300,7 @@ class Up2k(object):
rm_files = [x for x in hits if x not in seen_files]
n_rm = len(rm_files)
for fn in rm_files:
self.db_rm(db.c, rd, fn)
self.db_rm(db.c, rd, fn, 0)
if n_rm:
self.log("forgot {} deleted files".format(n_rm))
@@ -1251,12 +1376,15 @@ class Up2k(object):
if n_rm2:
self.log("forgetting {} shadowed deleted files".format(n_rm2))
c2.connection.commit()
# then covers
n_rm3 = 0
qu = "select 1 from up where rd=? and +fn=? limit 1"
q = "delete from cv where rd=? and dn=? and +fn=?"
for crd, cdn, fn in cur.execute("select * from cv"):
ap = os.path.join(top, crd, cdn, fn)
if not bos.path.exists(ap):
urd = vjoin(crd, cdn)
if not c2.execute(qu, (urd, fn)).fetchone():
c2.execute(q, (crd, cdn, fn))
n_rm3 += 1
@@ -1313,9 +1441,9 @@ class Up2k(object):
w, drd, dfn = zb[:-1].decode("utf-8").split("\x00")
with self.mutex:
q = "select mt, sz from up where w = ? and rd = ? and fn = ?"
q = "select mt, sz from up where rd=? and fn=? and +w=?"
try:
mt, sz = cur.execute(q, (w, drd, dfn)).fetchone()
mt, sz = cur.execute(q, (drd, dfn, w)).fetchone()
except:
# file moved/deleted since spooling
continue
@@ -1337,9 +1465,11 @@ class Up2k(object):
pf = "v{}, {:.0f}+".format(n_left, b_left / 1024 / 1024)
self.pp.msg = pf + abspath
st = bos.stat(abspath)
# throws on broken symlinks (always did)
stl = bos.lstat(abspath)
st = bos.stat(abspath) if stat.S_ISLNK(stl.st_mode) else stl
mt2 = int(stl.st_mtime)
sz2 = st.st_size
mt2 = int(st.st_mtime)
if nohash or not sz2:
w2 = up2k_wark_from_metadata(self.salt, sz2, mt2, rd, fn)
@@ -1361,6 +1491,13 @@ class Up2k(object):
if w == w2:
continue
# symlink mtime was inconsistent before v1.9.4; check if that's it
if st != stl and (nohash or not sz2):
mt2b = int(st.st_mtime)
w2b = up2k_wark_from_metadata(self.salt, sz2, mt2b, rd, fn)
if w == w2b:
continue
rewark.append((drd, dfn, w2, sz2, mt2))
t = "hash mismatch: {}\n db: {} ({} byte, {})\n fs: {} ({} byte, {})"
@@ -2005,7 +2142,7 @@ class Up2k(object):
try:
nfiles = next(cur.execute("select count(w) from up"))[0]
self.log("OK: {} |{}|".format(db_path, nfiles))
self.log(" {} |{}|".format(db_path, nfiles), "90")
return cur
except:
self.log("WARN: could not list files; DB corrupt?\n" + min_ex())
@@ -2209,6 +2346,9 @@ class Up2k(object):
vols = [(ptop, jcur)] if jcur else []
if vfs.flags.get("xlink"):
vols += [(k, v) for k, v in self.cur.items() if k != ptop]
if vfs.flags.get("up_ts", "") == "fu" or not cj["lmod"]:
# force upload time rather than last-modified
cj["lmod"] = int(time.time())
alts: list[tuple[int, int, dict[str, Any]]] = []
for ptop, cur in vols:
@@ -2220,7 +2360,7 @@ class Up2k(object):
q = r"select * from up where w = ?"
argv = [wark]
else:
q = r"select * from up where substr(w,1,16) = ? and w = ?"
q = r"select * from up where substr(w,1,16)=? and +w=?"
argv = [wark[:16], wark]
c2 = cur.execute(q, tuple(argv))
@@ -2281,7 +2421,9 @@ class Up2k(object):
if lost:
c2 = None
for cur, dp_dir, dp_fn in lost:
self.db_rm(cur, dp_dir, dp_fn)
t = "forgetting deleted file: /{}"
self.log(t.format(vjoin(vjoin(vfs.vpath, dp_dir), dp_fn)))
self.db_rm(cur, dp_dir, dp_fn, cj["size"])
if c2 and c2 != cur:
c2.connection.commit()
@@ -2293,27 +2435,31 @@ class Up2k(object):
cur = jcur
ptop = None # use cj or job as appropriate
if not job and wark in reg:
# ensure the files haven't been deleted manually
rj = reg[wark]
names = [rj[x] for x in ["name", "tnam"] if x in rj]
for fn in names:
path = djoin(rj["ptop"], rj["prel"], fn)
try:
if bos.path.getsize(path) > 0 or not rj["need"]:
# upload completed or both present
break
except:
# missing; restart
if not self.args.nw and not n4g:
t = "forgetting deleted partial upload at {}"
self.log(t.format(path))
del reg[wark]
break
if job or wark in reg:
job = job or reg[wark]
if (
job["ptop"] == cj["ptop"]
and job["prel"] == cj["prel"]
and job["name"] == cj["name"]
job["ptop"] != cj["ptop"]
or job["prel"] != cj["prel"]
or job["name"] != cj["name"]
):
# ensure the files haven't been deleted manually
names = [job[x] for x in ["name", "tnam"] if x in job]
for fn in names:
path = djoin(job["ptop"], job["prel"], fn)
try:
if bos.path.getsize(path) > 0:
# upload completed or both present
break
except:
# missing; restart
if not self.args.nw and not n4g:
job = None
break
else:
# file contents match, but not the path
src = djoin(job["ptop"], job["prel"], job["name"])
dst = djoin(cj["ptop"], cj["prel"], cj["name"])
@@ -2415,7 +2561,14 @@ class Up2k(object):
if vfs.lim:
ap2, cj["prel"] = vfs.lim.all(
cj["addr"], cj["prel"], cj["size"], ap1, reg
cj["addr"],
cj["prel"],
cj["size"],
cj["ptop"],
ap1,
self.hub.broker,
reg,
"up2k._get_volsize",
)
bos.makedirs(ap2)
vfs.lim.nup(cj["addr"])
@@ -2482,9 +2635,10 @@ class Up2k(object):
and not self.args.nw
and (cj["user"] in vfs.axs.uread or cj["user"] in vfs.axs.upget)
):
alg = 2 if "fka" in vfs.flags else 1
ap = absreal(djoin(job["ptop"], job["prel"], job["name"]))
ino = 0 if ANYWIN else bos.stat(ap).st_ino
fk = self.gen_fk(self.args.fk_salt, ap, job["size"], ino)
fk = self.gen_fk(alg, self.args.fk_salt, ap, job["size"], ino)
ret["fk"] = fk[: vfs.flags["fk"]]
return ret
@@ -2536,7 +2690,7 @@ class Up2k(object):
fs2 = bos.stat(os.path.dirname(dst)).st_dev
if fs1 == 0 or fs2 == 0:
# py2 on winxp or other unsupported combination
raise OSError(38, "filesystem does not have st_dev")
raise OSError(errno.ENOSYS, "filesystem does not have st_dev")
elif fs1 == fs2:
# same fs; make symlink as relative as possible
spl = r"[\\/]" if WINDOWS else "/"
@@ -2561,7 +2715,7 @@ class Up2k(object):
try:
if "hardlink" in flags:
os.link(fsenc(src), fsenc(dst))
os.link(fsenc(absreal(src)), fsenc(dst))
linked = True
except Exception as ex:
self.log("cannot hardlink: " + repr(ex))
@@ -2588,7 +2742,7 @@ class Up2k(object):
if not job:
known = " ".join([x for x in self.registry[ptop].keys()])
self.log("unknown wark [{}], known: {}".format(wark, known))
raise Pebkac(400, "unknown wark")
raise Pebkac(400, "unknown wark" + SSEELOG)
if chash not in job["need"]:
msg = "chash = {} , need:\n".format(chash)
@@ -2733,7 +2887,7 @@ class Up2k(object):
self._symlink(dst, d2, self.flags[ptop], lmod=lmod)
if cur:
self.db_rm(cur, rd, fn)
self.db_rm(cur, rd, fn, job["size"])
self.db_add(cur, vflags, rd, fn, lmod, *z2[3:])
if cur:
@@ -2776,7 +2930,7 @@ class Up2k(object):
self.db_act = self.vol_act[ptop] = time.time()
try:
self.db_rm(cur, rd, fn)
self.db_rm(cur, rd, fn, sz)
self.db_add(
cur,
vflags,
@@ -2806,13 +2960,17 @@ class Up2k(object):
return True
def db_rm(self, db: "sqlite3.Cursor", rd: str, fn: str) -> None:
def db_rm(self, db: "sqlite3.Cursor", rd: str, fn: str, sz: int) -> None:
sql = "delete from up where rd = ? and fn = ?"
try:
db.execute(sql, (rd, fn))
r = db.execute(sql, (rd, fn))
except:
assert self.mem_cur
db.execute(sql, s3enc(self.mem_cur, rd, fn))
r = db.execute(sql, s3enc(self.mem_cur, rd, fn))
if r.rowcount:
self.volsize[db] -= sz
self.volnfiles[db] -= 1
def db_add(
self,
@@ -2841,6 +2999,9 @@ class Up2k(object):
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0))
db.execute(sql, v)
self.volsize[db] += sz
self.volnfiles[db] += 1
xau = False if skip_xau else vflags.get("xau")
dst = djoin(ptop, rd, fn)
if xau and not runhook(
@@ -2894,7 +3055,9 @@ class Up2k(object):
except:
pass
def handle_rm(self, uname: str, ip: str, vpaths: list[str], lim: list[int]) -> str:
def handle_rm(
self, uname: str, ip: str, vpaths: list[str], lim: list[int], rm_up: bool
) -> str:
n_files = 0
ok = {}
ng = {}
@@ -2903,7 +3066,7 @@ class Up2k(object):
self.log("hit delete limit of {} files".format(lim[1]), 3)
break
a, b, c = self._handle_rm(uname, ip, vp, lim)
a, b, c = self._handle_rm(uname, ip, vp, lim, rm_up)
n_files += a
for k in b:
ok[k] = 1
@@ -2917,7 +3080,7 @@ class Up2k(object):
return "deleted {} files (and {}/{} folders)".format(n_files, iok, iok + ing)
def _handle_rm(
self, uname: str, ip: str, vpath: str, lim: list[int]
self, uname: str, ip: str, vpath: str, lim: list[int], rm_up: bool
) -> tuple[int, list[str], list[str]]:
self.db_act = time.time()
try:
@@ -2934,7 +3097,8 @@ class Up2k(object):
permsets = [[False, True]]
vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0])
vn, rem = vn.get_dbv(rem)
_, _, _, _, dip, dat = self._find_from_vpath(vn.realpath, rem)
with self.mutex:
_, _, _, _, dip, dat = self._find_from_vpath(vn.realpath, rem)
t = "you cannot delete this: "
if not dip:
@@ -2986,12 +3150,12 @@ class Up2k(object):
break
abspath = djoin(adir, fn)
st = bos.stat(abspath)
volpath = "{}/{}".format(vrem, fn).strip("/")
vpath = "{}/{}".format(dbv.vpath, volpath).strip("/")
self.log("rm {}\n {}".format(vpath, abspath))
_ = dbv.get(volpath, uname, *permsets[0])
if xbd:
st = bos.stat(abspath)
if not runhook(
self.log,
xbd,
@@ -3015,25 +3179,43 @@ class Up2k(object):
try:
ptop = dbv.realpath
cur, wark, _, _, _, _ = self._find_from_vpath(ptop, volpath)
self._forget_file(ptop, volpath, cur, wark, True)
self._forget_file(ptop, volpath, cur, wark, True, st.st_size)
finally:
if cur:
cur.connection.commit()
bos.unlink(abspath)
if xad:
runhook(self.log, xad, abspath, vpath, "", uname, 0, 0, ip, 0, "")
runhook(
self.log,
xad,
abspath,
vpath,
"",
uname,
st.st_mtime,
st.st_size,
ip,
0,
"",
)
ok: list[str] = []
ng: list[str] = []
if is_dir:
ok, ng = rmdirs(self.log_func, scandir, True, atop, 1)
else:
ok = ng = []
ok2, ng2 = rmdirs_up(os.path.dirname(atop), ptop)
if rm_up:
ok2, ng2 = rmdirs_up(os.path.dirname(atop), ptop)
else:
ok2 = ng2 = []
return n_files, ok + ok2, ng + ng2
def handle_mv(self, uname: str, svp: str, dvp: str) -> str:
if svp == dvp or dvp.startswith(svp + "/"):
raise Pebkac(400, "mv: cannot move parent into subfolder")
svn, srem = self.asrv.vfs.get(svp, uname, True, False, True)
svn, srem = svn.get_dbv(srem)
sabs = svn.canonical(srem, False)
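The new guard at the top of handle_mv rejects moves where the destination sits inside the source tree. A quick illustrative check of the path logic (hypothetical helper, not part of the codebase):

```python
def is_self_move(svp: str, dvp: str) -> bool:
    # mirrors the handle_mv check; the trailing "/" keeps "a/bc"
    # from being mistaken for a child of "a/b"
    return svp == dvp or dvp.startswith(svp + "/")

assert is_self_move("a/b", "a/b")        # same path
assert is_self_move("a/b", "a/b/c")      # would nest the folder into itself
assert not is_self_move("a/bc", "a/b/c") # sibling with a shared prefix is fine
assert not is_self_move("a/b", "a/c")
```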
@@ -3087,8 +3269,21 @@ class Up2k(object):
curs.clear()
rmdirs(self.log_func, scandir, True, sabs, 1)
rmdirs_up(os.path.dirname(sabs), svn.realpath)
rm_ok, rm_ng = rmdirs(self.log_func, scandir, True, sabs, 1)
for zsl in (rm_ok, rm_ng):
for ap in reversed(zsl):
if not ap.startswith(sabs):
raise Pebkac(500, "mv_d: bug at {}, top {}".format(ap, sabs))
rem = ap[len(sabs) :].replace(os.sep, "/").lstrip("/")
vp = vjoin(dvp, rem)
try:
dvn, drem = self.asrv.vfs.get(vp, uname, False, True)
bos.mkdir(dvn.canonical(drem))
except:
pass
return "k"
def _mv_file(
@@ -3115,10 +3310,15 @@ class Up2k(object):
if bos.path.exists(dabs):
raise Pebkac(400, "mv2: target file exists")
stl = bos.lstat(sabs)
try:
st = bos.stat(sabs)
except:
st = stl
xbr = svn.flags.get("xbr")
xar = dvn.flags.get("xar")
if xbr:
st = bos.stat(sabs)
if not runhook(
self.log, xbr, sabs, svp, "", uname, st.st_mtime, st.st_size, "", 0, ""
):
@@ -3126,9 +3326,16 @@ class Up2k(object):
self.log(t, 1)
raise Pebkac(405, t)
is_xvol = svn.realpath != dvn.realpath
if stat.S_ISLNK(stl.st_mode):
is_dirlink = stat.S_ISDIR(st.st_mode)
is_link = True
else:
is_link = is_dirlink = False
bos.makedirs(os.path.dirname(dabs))
if bos.path.islink(sabs):
if is_dirlink:
dlabs = absreal(sabs)
t = "moving symlink from [{}] to [{}], target [{}]"
self.log(t.format(sabs, dabs, dlabs))
@@ -3151,36 +3358,22 @@ class Up2k(object):
c2 = self.cur.get(dvn.realpath)
if ftime_ is None:
st = bos.stat(sabs)
ftime = st.st_mtime
fsize = st.st_size
else:
ftime = ftime_
fsize = fsize_ or 0
try:
atomic_move(sabs, dabs)
except OSError as ex:
if ex.errno != errno.EXDEV:
raise
self.log("cross-device move:\n {}\n {}".format(sabs, dabs))
b1, b2 = fsenc(sabs), fsenc(dabs)
try:
shutil.copy2(b1, b2)
except:
os.unlink(b2)
raise
os.unlink(b1)
has_dupes = False
if w:
assert c1
if c2 and c2 != c1:
self._copy_tags(c1, c2, w)
self._forget_file(svn.realpath, srem, c1, w, c1 != c2)
self._relink(w, svn.realpath, srem, dabs)
has_dupes = self._forget_file(svn.realpath, srem, c1, w, is_xvol, fsize)
if not is_xvol:
has_dupes = self._relink(w, svn.realpath, srem, dabs)
curs.add(c1)
if c2:
@@ -3203,6 +3396,47 @@ class Up2k(object):
else:
self.log("not found in src db: [{}]".format(svp))
try:
if is_xvol and has_dupes:
raise OSError(errno.EXDEV, "src is symlink")
atomic_move(sabs, dabs)
except OSError as ex:
if ex.errno != errno.EXDEV:
raise
self.log("using copy+delete (%s):\n %s\n %s" % (ex.strerror, sabs, dabs))
b1, b2 = fsenc(sabs), fsenc(dabs)
is_link = os.path.islink(b1) # due to _relink
try:
shutil.copy2(b1, b2)
except:
try:
os.unlink(b2)
except:
pass
if not is_link:
raise
# broken symlink? keep it as-is
try:
zb = os.readlink(b1)
os.symlink(zb, b2)
except:
os.unlink(b2)
raise
if is_link:
try:
times = (int(time.time()), int(stl.st_mtime))
bos.utime(dabs, times, False)
except:
pass
os.unlink(b1)
if xar:
runhook(self.log, xar, dabs, dvp, "", uname, 0, 0, "", 0, "")
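The block above extends the copy-then-delete fallback for cross-filesystem moves. Stripped of the symlink handling, db bookkeeping and event hooks, the underlying pattern looks roughly like this (standalone sketch with os.rename standing in for atomic_move; not the actual _mv_file):

```python
import errno
import os
import shutil

def move_file(src: str, dst: str) -> None:
    # hypothetical helper illustrating the EXDEV fallback pattern
    try:
        os.rename(src, dst)  # atomic when src and dst share a filesystem
    except OSError as ex:
        if ex.errno != errno.EXDEV:
            raise
        # destination is on another filesystem: copy, then delete the original
        try:
            shutil.copy2(src, dst)  # copy2 preserves mtime, as the diff intends
        except Exception:
            if os.path.lexists(dst):
                os.unlink(dst)  # clean up the partial copy
            raise
        os.unlink(src)
```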
@@ -3255,21 +3489,24 @@ class Up2k(object):
cur: Optional["sqlite3.Cursor"],
wark: Optional[str],
drop_tags: bool,
) -> None:
sz: int,
) -> bool:
"""forgets file in db, fixes symlinks, does not delete"""
srd, sfn = vsplit(vrem)
has_dupes = False
self.log("forgetting {}".format(vrem))
if wark and cur:
self.log("found {} in db".format(wark))
if drop_tags:
if self._relink(wark, ptop, vrem, ""):
has_dupes = True
drop_tags = False
if drop_tags:
q = "delete from mt where w=?"
cur.execute(q, (wark[:16],))
self.db_rm(cur, srd, sfn)
self.db_rm(cur, srd, sfn, sz)
reg = self.registry.get(ptop)
if reg:
@@ -3290,6 +3527,8 @@ class Up2k(object):
assert wark
del reg[wark]
return has_dupes
def _relink(self, wark: str, sptop: str, srem: str, dabs: str) -> int:
"""
update symlinks from file at svn/srem to dabs (rename),
@@ -3297,9 +3536,16 @@ class Up2k(object):
"""
dupes = []
sabs = djoin(sptop, srem)
q = "select rd, fn from up where substr(w,1,16)=? and w=?"
if self.no_expr_idx:
q = r"select rd, fn from up where w = ?"
argv = (wark,)
else:
q = r"select rd, fn from up where substr(w,1,16)=? and +w=?"
argv = (wark[:16], wark)
for ptop, cur in self.cur.items():
for rd, fn in cur.execute(q, (wark[:16], wark)):
for rd, fn in cur.execute(q, argv):
if rd.startswith("//") or fn.startswith("//"):
rd, fn = s3dec(rd, fn)
@@ -3509,13 +3755,16 @@ class Up2k(object):
self._finish_upload(job["ptop"], job["wark"])
def _snapshot(self) -> None:
slp = self.snap_persist_interval
slp = self.args.snap_wri
if not slp or self.args.no_snap:
return
while True:
time.sleep(slp)
if self.pp:
slp = 5
else:
slp = self.snap_persist_interval
slp = self.args.snap_wri
self.do_snapshot()
def do_snapshot(self) -> None:
@@ -3529,11 +3778,8 @@ class Up2k(object):
if not histpath:
return
rm = [
x
for x in reg.values()
if x["need"] and now - x["poke"] > self.snap_discard_interval
]
idrop = self.args.snap_drop * 60
rm = [x for x in reg.values() if x["need"] and now - x["poke"] >= idrop]
if self.args.nw:
lost = []
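The rewritten snapshot loop now takes both its write interval (args.snap_wri seconds, disabled by 0 or --no-snap) and its discard window (args.snap_drop minutes) from the config, and polls every 5 seconds while self.pp is set. A simplified version of the new discard rule, as read from the code above:

```python
import time

def stale_uploads(reg: dict, snap_drop_min: int) -> list:
    # unfinished uploads (non-empty "need") are forgotten once idle
    # for snap_drop minutes; mirrors the new idrop calculation
    idrop = snap_drop_min * 60
    now = time.time()
    return [x for x in reg.values() if x["need"] and now - x["poke"] >= idrop]
```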


@@ -25,7 +25,6 @@ import threading
import time
import traceback
from collections import Counter
from datetime import datetime
from email.utils import formatdate
from ipaddress import IPv4Address, IPv4Network, IPv6Address, IPv6Network
@@ -35,6 +34,35 @@ from .__init__ import ANYWIN, EXE, MACOS, PY2, TYPE_CHECKING, VT100, WINDOWS
from .__version__ import S_BUILD_DT, S_VERSION
from .stolen import surrogateescape
try:
from datetime import datetime, timezone
UTC = timezone.utc
except:
from datetime import datetime, timedelta, tzinfo
TD_ZERO = timedelta(0)
class _UTC(tzinfo):
def utcoffset(self, dt):
return TD_ZERO
def tzname(self, dt):
return "UTC"
def dst(self, dt):
return TD_ZERO
UTC = _UTC()
if sys.version_info >= (3, 7) or (
sys.version_info >= (3, 6) and platform.python_implementation() == "CPython"
):
ODict = dict
else:
from collections import OrderedDict as ODict
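The try/except above provides a UTC constant on every supported interpreter (falling back to a hand-rolled tzinfo where datetime.timezone is missing, i.e. python 2), so the deprecated naive helpers can be avoided. My reading of the intended replacement pattern, using only standard-library calls:

```python
from datetime import datetime, timezone

UTC = timezone.utc

# instead of the helpers deprecated in python 3.12:
#   datetime.utcnow()
#   datetime.utcfromtimestamp(ts)
# the code now builds timezone-aware values:
now = datetime.now(UTC)
then = datetime.fromtimestamp(1700000000, UTC)
print(now.isoformat(), then.isoformat())
```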
def _ens(want: str) -> tuple[int, ...]:
ret: list[int] = []
@@ -56,6 +84,8 @@ E_ADDR_IN_USE = _ens("EADDRINUSE WSAEADDRINUSE")
E_ACCESS = _ens("EACCES WSAEACCES")
E_UNREACH = _ens("EHOSTUNREACH WSAEHOSTUNREACH ENETUNREACH WSAENETUNREACH")
IP6ALL = "0:0:0:0:0:0:0:0"
try:
import ctypes
@@ -66,7 +96,9 @@ except:
try:
HAVE_SQLITE3 = True
import sqlite3 # pylint: disable=unused-import # typechk
import sqlite3
assert hasattr(sqlite3, "connect") # graalpy
except:
HAVE_SQLITE3 = False
@@ -171,6 +203,7 @@ HTTPCODE = {
500: "Internal Server Error",
501: "Not Implemented",
503: "Service Unavailable",
999: "MissingNo",
}
@@ -256,6 +289,13 @@ EXTS["vnd.mozilla.apng"] = "png"
MAGIC_MAP = {"jpeg": "jpg"}
DEF_EXP = "self.ip self.ua self.uname self.host cfg.name cfg.logout vf.scan vf.thsize hdr.cf_ipcountry srv.itime srv.htime"
DEF_MTE = "circle,album,.tn,artist,title,.bpm,key,.dur,.q,.vq,.aq,vc,ac,fmt,res,.fps,ahash,vhash"
DEF_MTH = ".vq,.aq,vc,ac,fmt,res,.fps"
REKOBO_KEY = {
v: ln.split(" ", 1)[0]
for ln in """
@@ -293,14 +333,22 @@ REKOBO_KEY = {
REKOBO_LKEY = {k.lower(): v for k, v in REKOBO_KEY.items()}
_exestr = "python3 python ffmpeg ffprobe cfssl cfssljson cfssl-certinfo"
CMD_EXEB = set(_exestr.encode("utf-8").split())
CMD_EXES = set(_exestr.split())
pybin = sys.executable or ""
if EXE:
pybin = ""
for p in "python3 python".split():
for zsg in "python3 python".split():
try:
p = shutil.which(p)
if p:
pybin = p
if ANYWIN:
zsg += ".exe"
zsg = shutil.which(zsg)
if zsg:
pybin = zsg
break
except:
pass
@@ -537,7 +585,7 @@ class _Unrecv(object):
self.log = log
self.buf: bytes = b""
def recv(self, nbytes: int) -> bytes:
def recv(self, nbytes: int, spins: int = 1) -> bytes:
if self.buf:
ret = self.buf[:nbytes]
self.buf = self.buf[nbytes:]
@@ -548,6 +596,10 @@ class _Unrecv(object):
ret = self.s.recv(nbytes)
break
except socket.timeout:
spins -= 1
if spins <= 0:
ret = b""
break
continue
except:
ret = b""
@@ -590,7 +642,7 @@ class _LUnrecv(object):
self.log = log
self.buf = b""
def recv(self, nbytes: int) -> bytes:
def recv(self, nbytes: int, spins: int) -> bytes:
if self.buf:
ret = self.buf[:nbytes]
self.buf = self.buf[nbytes:]
@@ -609,7 +661,7 @@ class _LUnrecv(object):
def recv_ex(self, nbytes: int, raise_on_trunc: bool = True) -> bytes:
"""read an exact number of bytes"""
try:
ret = self.recv(nbytes)
ret = self.recv(nbytes, 1)
err = False
except:
ret = b""
@@ -617,7 +669,7 @@ class _LUnrecv(object):
while not err and len(ret) < nbytes:
try:
ret += self.recv(nbytes - len(ret))
ret += self.recv(nbytes - len(ret), 1)
except OSError:
err = True
@@ -921,7 +973,8 @@ class Magician(object):
class Garda(object):
"""ban clients for repeated offenses"""
def __init__(self, cfg: str) -> None:
def __init__(self, cfg: str, uniq: bool = True) -> None:
self.uniq = uniq
try:
a, b, c = cfg.strip().split(",")
self.lim = int(a)
@@ -967,7 +1020,7 @@ class Garda(object):
# assume /64 clients; drop 4 groups
ip = IPv6Address(ip).exploded[:-20]
if prev:
if prev and self.uniq:
if self.prev.get(ip) == prev:
return 0, ip
@@ -1100,7 +1153,7 @@ def stackmon(fp: str, ival: float, suffix: str) -> None:
buf = lzma.compress(buf, preset=0)
if "%" in fp:
dt = datetime.utcnow()
dt = datetime.now(UTC)
for fs in "YmdHMS":
fs = "%" + fs
if fs in fp:
@@ -1223,12 +1276,15 @@ def ren_open(
except OSError as ex_:
ex = ex_
if ex.errno == errno.EINVAL and not asciified:
# EPERM: android13
if ex.errno in (errno.EINVAL, errno.EPERM) and not asciified:
asciified = True
bname, fname = [
zs.encode("ascii", "replace").decode("ascii").replace("?", "_")
for zs in [bname, fname]
]
zsl = []
for zs in (bname, fname):
zs = zs.encode("ascii", "replace").decode("ascii")
zs = re.sub(r"[^][a-zA-Z0-9(){}.,+=!-]", "_", zs)
zsl.append(zs)
bname, fname = zsl
continue
# ENOTSUP: zfs on ubuntu 20.04
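The widened fallback now also catches EPERM (seen on android 13) and squashes anything outside a conservative character set before retrying the open. A rough standalone version of just the renaming step (hypothetical helper name):

```python
import re

def asciify(name: str) -> str:
    # fallback rename used when the filesystem rejects the original name
    name = name.encode("ascii", "replace").decode("ascii")
    return re.sub(r"[^][a-zA-Z0-9(){}.,+=!-]", "_", name)

assert asciify("díscò mix vol.1.flac") == "d_sc__mix_vol.1.flac"
```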
@@ -1292,7 +1348,7 @@ class MultipartParser(object):
rfc1341/rfc1521/rfc2047/rfc2231/rfc2388/rfc6266/the-real-world
(only the fallback non-js uploader relies on these filenames)
"""
for ln in read_header(self.sr):
for ln in read_header(self.sr, 2, 2592000):
self.log(ln)
m = self.re_ctype.match(ln)
@@ -1439,7 +1495,7 @@ class MultipartParser(object):
for buf in iterable:
ret += buf
if len(ret) > max_len:
raise Pebkac(400, "field length is too long")
raise Pebkac(422, "field length is too long")
return ret
@@ -1492,15 +1548,15 @@ def get_boundary(headers: dict[str, str]) -> str:
return m.group(2)
def read_header(sr: Unrecv) -> list[str]:
def read_header(sr: Unrecv, t_idle: int, t_tot: int) -> list[str]:
t0 = time.time()
ret = b""
while True:
if time.time() - t0 > 120:
if time.time() - t0 >= t_tot:
return []
try:
ret += sr.recv(1024)
ret += sr.recv(1024, t_idle // 2)
except:
if not ret:
return []
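For reference, the numbers the multipart parser now passes in (t_idle=2, t_tot=2592000) work out as follows; this is just arithmetic on the values visible above:

```python
T_IDLE, T_TOT = 2, 2592000          # read_header(self.sr, 2, 2592000)
assert T_TOT == 30 * 24 * 60 * 60   # total header-read budget: 30 days, was a fixed 120s
assert T_IDLE // 2 == 1             # forwarded to Unrecv.recv() as its spin count
```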
@@ -1546,15 +1602,18 @@ def rand_name(fdir: str, fn: str, rnd: int) -> str:
return fn
def gen_filekey(salt: str, fspath: str, fsize: int, inode: int) -> str:
return base64.urlsafe_b64encode(
hashlib.sha512(
"{} {} {} {}".format(salt, fspath, fsize, inode).encode("utf-8", "replace")
).digest()
).decode("ascii")
def gen_filekey(alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str:
if alg == 1:
zs = "%s %s %s %s" % (salt, fspath, fsize, inode)
else:
zs = "%s %s" % (salt, fspath)
zb = zs.encode("utf-8", "replace")
return base64.urlsafe_b64encode(hashlib.sha512(zb).digest()).decode("ascii")
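As a usage sketch (assuming gen_filekey is importable from copyparty.util as defined above): alg 1 keeps the old behaviour of keying on size and inode, while alg 2, selected by the "fka" volflag in the Up2k hunk earlier, hashes only the salt and path, so a filekey survives the file being replaced.

```python
from copyparty.util import gen_filekey  # module defined in the diff above

a1 = gen_filekey(1, "salt", "/srv/music/song.flac", 1024, 42)
a2 = gen_filekey(1, "salt", "/srv/music/song.flac", 4096, 99)
assert a1 != a2  # alg 1: size and inode are part of the key

b1 = gen_filekey(2, "salt", "/srv/music/song.flac", 1024, 42)
b2 = gen_filekey(2, "salt", "/srv/music/song.flac", 4096, 99)
assert b1 == b2  # alg 2: path-only, stable across file replacement
```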
def gen_filekey_dbg(
alg: int,
salt: str,
fspath: str,
fsize: int,
@@ -1562,7 +1621,7 @@ def gen_filekey_dbg(
log: "NamedLogger",
log_ptn: Optional[Pattern[str]],
) -> str:
ret = gen_filekey(salt, fspath, fsize, inode)
ret = gen_filekey(alg, salt, fspath, fsize, inode)
assert log_ptn
if log_ptn.search(fspath):
@@ -1589,7 +1648,7 @@ def gen_filekey_dbg(
def gencookie(k: str, v: str, r: str, tls: bool, dur: Optional[int]) -> str:
v = v.replace(";", "")
v = v.replace("%", "%25").replace(";", "%3B")
if dur:
exp = formatdate(time.time() + dur, usegmt=True)
else:
@@ -1622,7 +1681,12 @@ def unhumanize(sz: str) -> int:
pass
mc = sz[-1:].lower()
mi = {"k": 1024, "m": 1024 * 1024, "g": 1024 * 1024 * 1024}.get(mc, 1)
mi = {
"k": 1024,
"m": 1024 * 1024,
"g": 1024 * 1024 * 1024,
"t": 1024 * 1024 * 1024 * 1024,
}.get(mc, 1)
return int(float(sz[:-1]) * mi)
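A quick usage sketch of the size parser with the newly added terabyte suffix (assuming copyparty.util as above):

```python
from copyparty.util import unhumanize

assert unhumanize("64k") == 64 * 1024
assert unhumanize("1.5m") == 1572864
assert unhumanize("2g") == 2 * 1024 ** 3
assert unhumanize("2t") == 2 * 1024 ** 4  # new: terabytes
```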
@@ -1658,7 +1722,7 @@ def uncyg(path: str) -> str:
if len(path) > 2 and path[2] != "/":
return path
return "{}:\\{}".format(path[1], path[3:])
return "%s:\\%s" % (path[1], path[3:])
def undot(path: str) -> str:
@@ -1701,7 +1765,7 @@ def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
bad = ["con", "prn", "aux", "nul"]
for n in range(1, 10):
bad += "com{0} lpt{0}".format(n).split(" ")
bad += ("com%s lpt%s" % (n, n)).split(" ")
if fn.lower().split(".")[0] in bad:
fn = "_" + fn
@@ -1745,6 +1809,24 @@ def exclude_dotfiles(filepaths: list[str]) -> list[str]:
return [x for x in filepaths if not x.split("/")[-1].startswith(".")]
def odfusion(base: ODict[str, bool], oth: str) -> ODict[str, bool]:
# merge an "ordered set" (just a dict really) with another list of keys
words0 = [x for x in oth.split(",") if x]
words1 = [x for x in oth[1:].split(",") if x]
ret = base.copy()
if oth.startswith("+"):
for k in words1:
ret[k] = True
elif oth[:1] in ("-", "/"):
for k in words1:
ret.pop(k, None)
else:
ret = ODict.fromkeys(words0, True)
return ret
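odfusion merges an ordered key set with a user-supplied spec: a leading "+" appends, "-" or "/" removes, and no prefix replaces the whole set. An illustrative run, assuming ODict and odfusion from the module above:

```python
from copyparty.util import ODict, odfusion

base = ODict.fromkeys(["artist", "title", ".bpm"], True)

add = odfusion(base, "+key,.dur")   # append
sub = odfusion(base, "-.bpm")       # remove
rep = odfusion(base, "key,.dur")    # no prefix: replace the whole set

assert list(add) == ["artist", "title", ".bpm", "key", ".dur"]
assert list(sub) == ["artist", "title"]
assert list(rep) == ["key", ".dur"]
```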
def ipnorm(ip: str) -> str:
if ":" in ip:
# assume /64 clients; drop 4 groups
@@ -2015,6 +2097,8 @@ def shut_socket(log: "NamedLogger", sck: socket.socket, timeout: int = 3) -> Non
sck.shutdown(socket.SHUT_RDWR)
except:
pass
except Exception as ex:
log("shut({}): {}".format(fd, ex), "90")
finally:
td = time.time() - t0
if td >= 1:
@@ -2109,7 +2193,7 @@ def list_ips() -> list[str]:
def yieldfile(fn: str) -> Generator[bytes, None, None]:
with open(fsenc(fn), "rb", 512 * 1024) as f:
while True:
buf = f.read(64 * 1024)
buf = f.read(128 * 1024)
if not buf:
break
@@ -2266,7 +2350,7 @@ def rmdirs(
dirs = [os.path.join(top, x) for x in dirs]
ok = []
ng = []
for d in dirs[::-1]:
for d in reversed(dirs):
a, b = rmdirs(logger, scandir, lstat, d, depth + 1)
ok += a
ng += b
@@ -2316,7 +2400,7 @@ def unescape_cookie(orig: str) -> str:
ret += chr(int(esc[1:], 16))
except:
ret += esc
esc = ""
esc = ""
else:
ret += ch
@@ -2416,7 +2500,7 @@ def killtree(root: int) -> None:
def runcmd(
argv: Union[list[bytes], list[str]], timeout: Optional[int] = None, **ka: Any
argv: Union[list[bytes], list[str]], timeout: Optional[float] = None, **ka: Any
) -> tuple[int, str, str]:
kill = ka.pop("kill", "t") # [t]ree [m]ain [n]one
capture = ka.pop("capture", 3) # 0=none 1=stdout 2=stderr 3=both
@@ -2430,6 +2514,14 @@ def runcmd(
bout: bytes
berr: bytes
if ANYWIN:
if isinstance(argv[0], (bytes, bytearray)):
if argv[0] in CMD_EXEB:
argv[0] += b".exe"
else:
if argv[0] in CMD_EXES:
argv[0] += ".exe"
p = sp.Popen(argv, stdout=cout, stderr=cerr, **ka)
if not timeout or PY2:
bout, berr = p.communicate(sin)
@@ -2469,7 +2561,7 @@ def chkcmd(argv: Union[list[bytes], list[str]], **ka: Any) -> tuple[str, str]:
return sout, serr
def mchkcmd(argv: Union[list[bytes], list[str]], timeout: int = 10) -> None:
def mchkcmd(argv: Union[list[bytes], list[str]], timeout: float = 10) -> None:
if PY2:
with open(os.devnull, "wb") as f:
rv = sp.call(argv, stdout=f, stderr=f)
@@ -2716,6 +2808,34 @@ def runhook(
return True
def loadpy(ap: str, hot: bool) -> Any:
"""
a nice can of worms capable of causing all sorts of bugs
depending on what other inconveniently named files happen
to be in the same folder
"""
if ap.startswith("~"):
ap = os.path.expanduser(ap)
mdir, mfile = os.path.split(absreal(ap))
mname = mfile.rsplit(".", 1)[0]
sys.path.insert(0, mdir)
if PY2:
mod = __import__(mname)
if hot:
reload(mod)
else:
import importlib
mod = importlib.import_module(mname)
if hot:
importlib.reload(mod)
sys.path.remove(mdir)
return mod
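A usage sketch of the plugin loader (the paths are hypothetical, purely for illustration); with hot=True the module is re-imported via importlib.reload on every call, so edits take effect without restarting:

```python
from copyparty.util import loadpy  # module defined in the diff above

mod = loadpy("~/plugins/rename_rules.py", False)  # plain import, cached by python
mod = loadpy("/srv/hooks/notify.py", True)        # hot: reloaded on every call
```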
def gzip_orig_sz(fn: str) -> int:
with open(fsenc(fn), "rb") as f:
f.seek(-4, 2)


@@ -6,7 +6,7 @@ pk: $(addsuffix .gz, $(wildcard *.js *.css))
un: $(addsuffix .un, $(wildcard *.gz))
%.gz: %
pigz -11 -J 34 -I 5730 $<
pigz -11 -J 34 -I 573 $<
%.un: %
pigz -d $<

copyparty/web/a/u2c.py Symbolic link

@@ -0,0 +1 @@
../../../bin/u2c.py


@@ -1 +0,0 @@
../../../bin/up2k.py


@@ -27,8 +27,8 @@ window.baguetteBox = (function () {
isOverlayVisible = false,
touch = {}, // start-pos
touchFlag = false, // busy
re_i = /.+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /.+\.(webm|mkv|mp4)(\?|$)/i,
re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
anims = ['slideIn', 'fadeIn', 'none'],
data = {}, // all galleries
imagesElements = [],
@@ -127,7 +127,7 @@ window.baguetteBox = (function () {
var gallery = [];
[].forEach.call(tagsNodeList, function (imageElement, imageIndex) {
var imageElementClickHandler = function (e) {
if (ctrl(e))
if (ctrl(e) || e && e.shiftKey)
return true;
e.preventDefault ? e.preventDefault() : e.returnValue = false;
@@ -261,7 +261,7 @@ window.baguetteBox = (function () {
setloop(1);
else if (k == "BracketRight")
setloop(2);
else if (e.shiftKey)
else if (e.shiftKey && k != 'KeyR')
return;
else if (k == "ArrowLeft" || k == "KeyJ")
showPreviousImage();
@@ -310,7 +310,7 @@ window.baguetteBox = (function () {
options = {};
setOptions(o);
if (tt.en)
tt.show.bind(this)();
tt.show.call(this);
}
function setVmode() {
@@ -356,7 +356,7 @@ window.baguetteBox = (function () {
setVmode();
if (tt.en)
tt.show.bind(this)();
tt.show.call(this);
}
function findfile() {
@@ -376,7 +376,12 @@ window.baguetteBox = (function () {
else
(vid() || ebi('bbox-overlay')).requestFullscreen();
}
catch (ex) { alert(ex); }
catch (ex) {
if (IPHONE)
alert('sorry, apple decided to make this impossible on iphones (should work on ipad tho)');
else
alert(ex);
}
}
function tglsel() {
@@ -519,7 +524,7 @@ window.baguetteBox = (function () {
options[item] = newOptions[item];
}
var an = options.animation = sread('ganim') || anims[ANIM ? 0 : 2];
var an = options.animation = sread('ganim', anims) || anims[ANIM ? 0 : 2];
btnAnim.textContent = ['⇄', '⮺', '⚡'][anims.indexOf(an)];
btnAnim.setAttribute('tt', 'animation: ' + an);
@@ -580,6 +585,7 @@ window.baguetteBox = (function () {
function hideOverlay(e) {
ev(e);
playvid(false);
removeFromCache('#files');
if (options.noScrollbars) {
document.documentElement.style.overflowY = 'auto';
document.body.style.overflowY = 'auto';
@@ -812,10 +818,16 @@ window.baguetteBox = (function () {
}
function vid() {
if (currentIndex >= imagesElements.length)
return;
return imagesElements[currentIndex].querySelector('video');
}
function vidimg() {
if (currentIndex >= imagesElements.length)
return;
return imagesElements[currentIndex].querySelector('img, video');
}
@@ -863,7 +875,7 @@ window.baguetteBox = (function () {
if (loopB !== null) {
timer.add(loopchk);
sethash(window.location.hash.slice(1).split('&')[0] + '&t=' + (loopA || 0) + '-' + loopB);
sethash(location.hash.slice(1).split('&')[0] + '&t=' + (loopA || 0) + '-' + loopB);
}
}
@@ -961,7 +973,7 @@ window.baguetteBox = (function () {
clmod(btnPrev, 'off', 't');
clmod(btnNext, 'off', 't');
if (Date.now() - ctime <= 500)
if (Date.now() - ctime <= 500 && !IPHONE)
tglfull();
ctime = Date.now();


@@ -55,6 +55,7 @@
--u2-sbtn-b1: #999;
--u2-txt-bg: var(--bg-u5);
--u2-tab-bg: linear-gradient(to bottom, var(--bg), var(--bg-u1));
--u2-tab-b1: rgba(128,128,128,0.8);
--u2-tab-1-fg: #fd7;
--u2-tab-1-bg: linear-gradient(to bottom, var(#353), var(--bg) 80%);
--u2-tab-1-b1: #7c5;
@@ -270,6 +271,7 @@ html.bz {
--btn-1h-fg: #000;
--txt-sh: a;
--u2-tab-b1: var(--bg-u5);
--u2-tab-1-fg: var(--fg-max);
--u2-tab-1-bg: var(--bg);
@@ -329,6 +331,7 @@ html.c {
html.cz {
--bgg: var(--bg-u2);
--srv-3: #fff;
--u2-tab-b1: var(--bg-d3);
}
html.cy {
--fg: #fff;
@@ -411,10 +414,11 @@ html.dz {
--op-aa-bg: var(--bg-d2);
--op-a-sh: rgba(0,0,0,0.5);
--u2-btn-b1: #999;
--u2-sbtn-b1: #999;
--u2-btn-b1: var(--fg-weak);
--u2-sbtn-b1: var(--fg-weak);
--u2-txt-bg: var(--bg-u5);
--u2-tab-bg: linear-gradient(to bottom, var(--bg), var(--bg-u1));
--u2-tab-b1: var(--fg-weak);
--u2-tab-1-fg: #fff;
--u2-tab-1-bg: linear-gradient(to bottom, var(#353), var(--bg) 80%);
--u2-tab-1-b1: #7c5;
@@ -423,6 +427,12 @@ html.dz {
--u2-b-fg: #fff;
--u2-b1-bg: #3a3;
--u2-b2-bg: #3a3;
--u2-o-bg: var(--btn-bg);
--u2-o-b1: var(--bg-u5);
--u2-o-h-bg: var(--fg-weak);
--u2-o-1-bg: var(--fg-weak);
--u2-o-1-b1: var(--a);
--u2-o-1h-bg: var(--a);
--u2-inf-bg: #07a;
--u2-inf-b1: #0be;
--u2-ok-bg: #380;
@@ -483,6 +493,7 @@ html.dz {
--err-ts: #500;
text-shadow: none;
font-family: 'scp', monospace, monospace;
}
html.dy {
--fg: #000;
@@ -717,18 +728,22 @@ a:hover {
html.y #files thead th {
box-shadow: 0 1px 0 rgba(0,0,0,0.12);
}
html #files.hhpick thead th {
color: #f7d;
background: #000;
box-shadow: .1em .2em 0 #f6c inset, -.1em -.1em 0 #f6c inset;
}
#files td {
margin: 0;
padding: .3em .5em;
background: var(--bg);
max-width: var(--file-td-w);
word-wrap: break-word;
overflow: hidden;
}
#files tr:nth-child(2n) td {
background: var(--row-alt);
}
#files td+td+td {
max-width: 30em;
overflow: hidden;
}
#files td+td {
box-shadow: 1px 0 0 0 rgba(128,128,128,var(--f-sh1)) inset, 0 1px 0 rgba(255,255,255,var(--f-sh2)) inset, 0 -1px 0 rgba(255,255,255,var(--f-sh2)) inset;
}
@@ -796,6 +811,8 @@ html.y #path a:hover {
}
.logue {
padding: .2em 0;
position: relative;
z-index: 1;
}
.logue.hidden,
.logue:empty {
@@ -849,7 +866,7 @@ html.y #path a:hover {
}
.mdo,
.mdo * {
line-height: 1.4em;
line-height: 1.5em;
}
#srv_info,
#srv_info2,
@@ -1072,6 +1089,9 @@ html.np_open #ggrid>a.au:before {
background: var(--bg-d3);
box-shadow: -.2em .2em 0 var(--f-sel-sh), -.2em -.2em 0 var(--f-sel-sh);
}
#player {
display: none;
}
#widget {
position: fixed;
font-size: 1.4em;
@@ -1154,10 +1174,10 @@ html.y #widget.open {
background: #fff;
background: var(--bg-u3);
}
#wfm, #wzip, #wnp {
#wfs, #wfm, #wzip, #wnp {
display: none;
}
#wzip, #wnp {
#wfs, #wzip, #wnp {
margin-right: .2em;
padding-right: .2em;
border: 1px solid var(--bg-u5);
@@ -1169,6 +1189,7 @@ html.y #widget.open {
padding-left: .2em;
border-left-width: .1em;
}
#wfs.act,
#wfm.act {
display: inline-block;
}
@@ -1192,6 +1213,13 @@ html.y #widget.open {
position: relative;
display: inline-block;
}
#wfs {
font-size: .36em;
text-align: right;
line-height: 1.3em;
padding: 0 .3em 0 0;
border-width: 0 .25em 0 0;
}
#wfm span,
#wnp span {
font-size: .6em;
@@ -1207,7 +1235,8 @@ html.y #widget.open {
#wfm a.hide {
display: none;
}
#files tbody tr.fcut td {
#files tbody tr.fcut td,
#ggrid>a.fcut {
animation: fcut .5s ease-out;
}
@keyframes fcut {
@@ -1374,6 +1403,9 @@ input[type="checkbox"]:checked+label {
color: #0e0;
color: var(--a);
}
html.dz input {
font-family: 'scp', monospace, monospace;
}
.opwide div>span>input+label {
padding: .3em 0 .3em .3em;
margin: 0 0 0 -.3em;
@@ -1382,14 +1414,17 @@ input[type="checkbox"]:checked+label {
.opview input.i {
width: calc(100% - 16.2em);
}
input.drc_v,
input.eq_gain {
width: 3em;
text-align: center;
margin: 0 .6em;
}
#audio_drc table,
#audio_eq table {
border-collapse: collapse;
}
#audio_drc td,
#audio_eq td {
text-align: center;
}
@@ -1398,11 +1433,15 @@ input.eq_gain {
display: block;
padding: 0;
}
#au_drc,
#au_eq {
display: block;
margin-top: .5em;
padding: 1.3em .3em;
}
#au_drc {
padding: .4em .3em;
}
#ico1 {
cursor: pointer;
}
@@ -1443,6 +1482,8 @@ input.eq_gain {
width: calc(100% - 2em);
margin: .3em 0 0 1.4em;
}
@media (max-width: 130em) { #srch_form.tags #tq_raw { width: calc(100% - 34em) } }
@media (max-width: 95em) { #srch_form.tags #tq_raw { width: calc(100% - 2em) } }
#tq_raw td+td {
width: 100%;
}
@@ -1554,6 +1595,7 @@ html.cz .btn {
border-bottom: .2em solid #709;
}
html.dz .btn {
font-size: 1em;
box-shadow: 0 0 0 .1em #080 inset;
}
html.dz .tgl.btn.on {
@@ -1597,6 +1639,12 @@ html.cz .tgl.btn.on {
list-style: none;
border-top: 1px solid var(--bg-u5);
}
#tree li.offline>a:first-child:before {
content: '❌';
position: absolute;
margin-left: -.25em;
z-index: 3;
}
#tree ul a.sel {
background: #000;
background: var(--bg-d3);
@@ -1738,6 +1786,8 @@ html.y #tree.nowrap .ntree a+a:hover {
display: none;
}
.ghead {
background: #fff;
background: var(--bg-u2);
border-radius: .3em;
padding: .2em .5em;
line-height: 2.3em;
@@ -1767,6 +1817,7 @@ html.y #tree.nowrap .ntree a+a:hover {
padding: 0;
}
#rui {
background: #fff;
background: var(--bg);
position: fixed;
top: 0;
@@ -1823,6 +1874,7 @@ html.y #tree.nowrap .ntree a+a:hover {
}
#doc {
overflow: visible;
background: #fff;
background: var(--bg);
margin: -1em 0 .5em 0;
padding: 1em 0 1em 0;
@@ -2077,12 +2129,12 @@ html.y #bbox-overlay figcaption a {
}
.bbox-btn,
#bbox-btns {
opacity: 1;
opacity: 1;
animation: opacity .2s infinite ease-in-out;
}
.bbox-btn.off,
#bbox-btns.off {
opacity: 0;
opacity: 0;
}
#bbox-overlay button {
cursor: pointer;
@@ -2352,7 +2404,7 @@ html.y #bbox-overlay figcaption a {
display: block;
}
#u2bm sup {
font-weight: bold;
font-weight: bold;
}
#u2notbtn {
display: none;
@@ -2451,7 +2503,7 @@ html.y #bbox-overlay figcaption a {
width: 21em;
}
#u2cards {
padding: 1em 1em .3em 1em;
padding: 1em 1em .42em 1em;
margin: 0 auto;
white-space: nowrap;
text-align: center;
@@ -2459,14 +2511,14 @@ html.y #bbox-overlay figcaption a {
min-width: 24em;
}
#u2cards.w {
width: 44em;
width: 48em;
text-align: left;
}
#u2cards.ww {
display: inline-block;
}
#u2etaw.w {
width: 52em;
width: 55em;
text-align: right;
margin: 2em auto -2.7em auto;
}
@@ -2476,7 +2528,8 @@ html.y #bbox-overlay figcaption a {
#u2cards a {
padding: .2em 1em;
background: var(--u2-tab-bg);
border: 1px solid rgba(128,128,128,0.8);
border: 1px solid #999;
border-color: var(--u2-tab-b1);
border-width: 0 0 1px 0;
}
#u2cards a:first-child {
@@ -2510,10 +2563,10 @@ html.y #bbox-overlay figcaption a {
width: 30em;
}
#u2conf.w {
width: 48em;
width: 51em;
}
#u2conf.ww {
width: 78em;
width: 82em;
}
#u2conf.ww #u2c3w {
width: 29em;
@@ -2735,6 +2788,9 @@ html.c .opbox,
html.a .opbox {
margin: 1.5em 0 0 0;
}
html.dz .opview input.i {
width: calc(100% - 18em);
}
html.c #tree,
html.c #treeh,
html.a #tree,
@@ -2787,6 +2843,9 @@ html.a #u2btn {
html.ay #u2btn {
box-shadow: .4em .4em 0 #ccc;
}
html.dz #u2btn {
letter-spacing: -.033em;
}
html.c #u2conf.ww #u2btn,
html.a #u2conf.ww #u2btn {
margin: -2em .5em -3em 0;
@@ -2934,6 +2993,7 @@ html.b #treepar {
html.b #wrap {
margin-top: 2em;
}
html.by .ghead,
html.bz .ghead {
background: var(--bg);
padding: .2em 0;
@@ -2948,13 +3008,13 @@ html.b .btn {
top: -.1em;
}
html.b #op_up2k.srch sup {
color: #fc0;
color: #fc0;
}
html.by #u2btn sup {
color: #06b;
color: #06b;
}
html.by #op_up2k.srch sup {
color: #b70;
color: #b70;
}
html.bz #u2cards a.act {
box-shadow: 0 -.1em .2em var(--bg-d2);
@@ -3003,6 +3063,16 @@ html.d #treepar {
@media (max-width: 32em) {
#u2conf {
font-size: .9em;
}
}
@media (max-width: 28em) {
#u2conf {
font-size: .8em;
}
}
@media (min-width: 70em) {
#barpos,
#barbuf {


@@ -11,7 +11,7 @@
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/browser.css?_={{ ts }}">
{%- if css %}
<link rel="stylesheet" media="screen" href="{{ css }}?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ css }}_={{ ts }}">
{%- endif %}
</head>
@@ -29,7 +29,7 @@
<div id="op_player" class="opview opbox opwide"></div>
<div id="op_bup" class="opview opbox act">
<div id="op_bup" class="opview opbox {% if not ls0 %}act{% endif %}">
<div id="u2err"></div>
<form method="post" enctype="multipart/form-data" accept-charset="utf-8" action="{{ url_suf }}">
<input type="hidden" name="act" value="bput" />
@@ -39,7 +39,7 @@
<a id="bbsw" href="?b=u" rel="nofollow"><br />switch to basic browser</a>
</div>
<div id="op_mkdir" class="opview opbox act">
<div id="op_mkdir" class="opview opbox {% if not ls0 %}act{% endif %}">
<form method="post" enctype="multipart/form-data" accept-charset="utf-8" action="{{ url_suf }}">
<input type="hidden" name="act" value="mkdir" />
📂<input type="text" name="name" class="i" placeholder="awesome mix vol.1">
@@ -55,7 +55,7 @@
</form>
</div>
<div id="op_msg" class="opview opbox act">
<div id="op_msg" class="opview opbox {% if not ls0 %}act{% endif %}">
<form method="post" enctype="application/x-www-form-urlencoded" accept-charset="utf-8" action="{{ url_suf }}">
📟<input type="text" name="msg" class="i" placeholder="lorem ipsum dolor sit amet">
<input type="submit" value="send msg to srv log">
@@ -135,42 +135,27 @@
<script>
var SR = {{ r|tojson }},
CGV = {{ cgv|tojson }},
TS = "{{ ts }}",
acct = "{{ acct }}",
perms = {{ perms }},
themes = {{ themes }},
dtheme = "{{ dtheme }}",
srvinf = "{{ srv_info }}",
s_name = "{{ s_name }}",
lang = "{{ lang }}",
dfavico = "{{ favico }}",
def_hcols = {{ def_hcols|tojson }},
have_up2k_idx = {{ have_up2k_idx|tojson }},
have_tags_idx = {{ have_tags_idx|tojson }},
have_acode = {{ have_acode|tojson }},
have_mv = {{ have_mv|tojson }},
have_del = {{ have_del|tojson }},
have_unpost = {{ have_unpost }},
have_zip = {{ have_zip|tojson }},
sb_md = "{{ sb_md }}",
sb_lg = "{{ sb_lg }}",
lifetime = {{ lifetime }},
turbolvl = {{ turbolvl }},
frand = {{ frand|tojson }},
u2sort = "{{ u2sort }}",
have_emp = {{ have_emp|tojson }},
txt_ext = "{{ txt_ext }}",
logues = {{ logues|tojson if sb_lg else "[]" }},
readme = {{ readme|tojson }},
ls0 = {{ ls0|tojson }};
document.documentElement.className = localStorage.theme || dtheme;
document.documentElement.className = localStorage.cpp_thm || dtheme;
</script>
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/baguettebox.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/browser.js?_={{ ts }}"></script>
<script src="{{ r }}/.cpr/up2k.js?_={{ ts }}"></script>
{%- if js %}
<script src="{{ js }}?_={{ ts }}"></script>
<script src="{{ js }}_={{ ts }}"></script>
{%- endif %}
</body>

File diff suppressed because it is too large


@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<title>{{ svcname }}</title>
<title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
</head>


@@ -31,7 +31,7 @@
<span id="lno">L#</span>
{%- else %}
<a href="{{ arg_base }}edit" tt="good: higher performance$Ngood: same document width as viewer$Nbad: assumes you know markdown">edit (basic)</a>
<a href="{{ arg_base }}edit2" tt="not in-house so probably less buggy">edit (fancy)</a>
<a href="{{ arg_base }}edit2" id="edit2" tt="not in-house so probably less buggy">edit (fancy)</a>
<a href="{{ arg_base }}">view raw</a>
{%- endif %}
</div>


@@ -52,7 +52,7 @@ var img_load = (function () {
var r = {};
r.callbacks = [];
function fire() {
var fire = function () {
for (var a = 0; a < r.callbacks.length; a++)
r.callbacks[a]();
}
@@ -212,6 +212,8 @@ function convert_markdown(md_text, dest_dom) {
try {
var md_html = marked.parse(md_text, marked_opts);
if (!have_emp)
md_html = DOMPurify.sanitize(md_html);
}
catch (ex) {
if (ext)
@@ -231,11 +233,11 @@ function convert_markdown(md_text, dest_dom) {
var nodes = md_dom.getElementsByTagName('a');
for (var a = nodes.length - 1; a >= 0; a--) {
var href = nodes[a].getAttribute('href');
var txt = nodes[a].textContent;
var txt = nodes[a].innerHTML;
if (!txt)
nodes[a].textContent = href;
else if (href !== txt)
else if (href !== txt && !nodes[a].className)
nodes[a].className = 'vis';
}
@@ -470,7 +472,7 @@ img_load.callbacks = [toc.refresh];
// scroll handler
var redraw = (function () {
var sbs = true;
function onresize() {
var onresize = function () {
if (window.matchMedia)
sbs = window.matchMedia('(min-width: 64em)').matches;
@@ -483,7 +485,7 @@ var redraw = (function () {
onscroll();
}
function onscroll() {
var onscroll = function () {
toc.refresh();
}
@@ -505,6 +507,13 @@ dom_navtgl.onclick = function () {
redraw();
};
if (!HTTPS && location.hostname != '127.0.0.1') try {
ebi('edit2').onclick = function (e) {
toast.err(0, "the fancy editor is only available over https");
return ev(e);
}
} catch (ex) { }
if (sread('hidenav') == 1)
dom_navtgl.onclick();


@@ -92,7 +92,7 @@ var action_stack = null;
var nlines = 0;
var draw_md = (function () {
var delay = 1;
function draw_md() {
var draw_md = function () {
var t0 = Date.now();
var src = dom_src.value;
convert_markdown(src, dom_pre);
@@ -135,7 +135,7 @@ img_load.callbacks = [function () {
// resize handler
redraw = (function () {
function onresize() {
var onresize = function () {
var y = (dom_hbar.offsetTop + dom_hbar.offsetHeight) + 'px';
dom_wrap.style.top = y;
dom_swrap.style.top = y;
@@ -143,12 +143,12 @@ redraw = (function () {
map_src = genmap(dom_ref, map_src);
map_pre = genmap(dom_pre, map_pre);
}
function setsbs() {
var setsbs = function () {
dom_wrap.className = '';
dom_swrap.className = '';
onresize();
}
function modetoggle() {
var modetoggle = function () {
var mode = dom_nsbs.innerHTML;
dom_nsbs.innerHTML = mode == 'editor' ? 'preview' : 'editor';
mode += ' single';
@@ -172,7 +172,7 @@ redraw = (function () {
(function () {
var skip_src = false, skip_pre = false;
function scroll(src, srcmap, dst, dstmap) {
var scroll = function (src, srcmap, dst, dstmap) {
var y = src.scrollTop;
if (y < 8) {
dst.scrollTop = 0;
@@ -278,6 +278,7 @@ function Modpoll() {
return;
var new_md = this.responseText,
new_mt = this.getResponseHeader('X-Lastmod3') || r.lastmod,
server_ref = server_md.replace(/\r/g, ''),
server_now = new_md.replace(/\r/g, '');
@@ -285,6 +286,7 @@ function Modpoll() {
if (r.initial && server_ref != server_now)
return modal.confirm('Your browser decided to show an outdated copy of the document!\n\nDo you want to load the latest version from the server instead?', function () {
dom_src.value = server_md = new_md;
last_modified = new_mt;
draw_md();
}, null);
@@ -898,12 +900,12 @@ var set_lno = (function () {
pv = null,
lno = ebi('lno');
function poke() {
var poke = function () {
clearTimeout(t);
t = setTimeout(fire, 20);
}
function fire() {
var fire = function () {
try {
clearTimeout(t);
@@ -928,7 +930,7 @@ var set_lno = (function () {
// hotkeys / toolbar
(function () {
function keydown(ev) {
var keydown = function (ev) {
ev = ev || window.event;
var kc = ev.code || ev.keyCode || ev.which,
editing = document.activeElement == dom_src;
@@ -1056,7 +1058,7 @@ action_stack = (function () {
var ignore = false;
var ref = dom_src.value;
function diff(from, to, cpos) {
var diff = function (from, to, cpos) {
if (from === to)
return null;
@@ -1087,14 +1089,14 @@ action_stack = (function () {
};
}
function undiff(from, change) {
var undiff = function (from, change) {
return {
txt: from.substring(0, change.car) + change.txt + from.substring(change.cdr),
cpos: change.cpos
};
}
function apply(src, dst) {
var apply = function (src, dst) {
dbg('undos(%d) redos(%d)', hist.un.length, hist.re.length);
if (src.length === 0)
@@ -1118,7 +1120,7 @@ action_stack = (function () {
return true;
}
function schedule_push() {
var schedule_push = function () {
if (ignore) {
ignore = false;
return;
@@ -1129,7 +1131,7 @@ action_stack = (function () {
sched_timer = setTimeout(push, 500);
}
function undo() {
var undo = function () {
if (hist.re.length == 0) {
clearTimeout(sched_timer);
push();
@@ -1137,11 +1139,11 @@ action_stack = (function () {
return apply(hist.un, hist.re);
}
function redo() {
var redo = function () {
return apply(hist.re, hist.un);
}
function push() {
var push = function () {
var newtxt = dom_src.value;
var change = diff(ref, newtxt, sched_cpos);
if (change !== null)


@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<title>{{ svcname }}</title>
<title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333">
@@ -42,7 +42,7 @@
{%- if redir %}
<script>
setTimeout(function() {
window.location.replace("{{ redir }}");
location.replace("{{ redir }}");
}, 1000);
</script>
{%- endif %}


@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<title>{{ svcname }}</title>
<title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333">
@@ -110,7 +110,7 @@ var SR = {{ r|tojson }},
lang="{{ lang }}",
dfavico="{{ favico }}";
document.documentElement.className=localStorage.theme||"{{ this.args.theme }}";
document.documentElement.className=localStorage.cpp_thm||"{{ this.args.theme }}";
</script>
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>


@@ -35,7 +35,7 @@ var Ls = {
"v2": "use this server as a local HDD$N$NWARNING: this will show your password!",
}
},
d = Ls[sread("lang") || lang];
d = Ls[sread("cpp_lang", ["eng", "nor"]) || lang] || Ls.eng || Ls.nor;
for (var k in (d || {})) {
var f = k.slice(-1),


@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<title>{{ args.doctitle }} @ {{ args.name }}</title>
<title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333">
@@ -43,10 +43,9 @@
<h1>WebDAV</h1>
<div class="os win">
<p><em>note: rclone-FTP is a bit faster, so {% if args.ftp or args.ftps %}try that first{% else %}consider enabling FTP in server settings{% endif %}</em></p>
<p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p>
<pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
</pre>
{% if s %}
@@ -69,9 +68,9 @@
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab
</pre>
<p>or you can use rclone instead, which is much slower but doesn't require root:</p>
<p>or you can use rclone instead, which is much slower but doesn't require root (plus it keeps lastmodified on upload):</p>
<pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
</pre>
{% if s %}
@@ -111,10 +110,21 @@
<div class="os win">
<p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p>
{% if args.ftp %}
<p>connect with plaintext FTP:</p>
<pre>
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp or args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls={{ "false" if args.ftp else "true" }}
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftp:{{ rvp }} <b>W:</b>
</pre>
{% endif %}
{% if args.ftps %}
<p>connect with TLS-encrypted FTPS:</p>
<pre>
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>W:</b>
</pre>
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
{% endif %}
<p>if you want to use the native FTP client in windows instead (please dont), press <code>win+R</code> and run this command:</p>
<pre>
explorer {{ "ftp" if args.ftp else "ftps" }}://{% if accs %}<b>{{ pw }}</b>:k@{% endif %}{{ host }}:{{ args.ftp or args.ftps }}/{{ rvp }}
@@ -122,10 +132,21 @@
</div>
<div class="os lin">
{% if args.ftp %}
<p>connect with plaintext FTP:</p>
<pre>
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp or args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls={{ "false" if args.ftp else "true" }}
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftp:{{ rvp }} <b>mp</b>
</pre>
{% endif %}
{% if args.ftps %}
<p>connect with TLS-encrypted FTPS:</p>
<pre>
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>mp</b>
</pre>
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
{% endif %}
<p>emergency alternative (gnome/gui-only):</p>
<!-- gnome-bug: ignores vp -->
<pre>
@@ -160,7 +181,7 @@
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>-td</code></em></p>
{% endif %}
<p>
you can use <a href="{{ r }}/.cpr/a/up2k.py">up2k.py</a> to upload (sometimes faster than web-browsers)
you can use <a href="{{ r }}/.cpr/a/u2c.py">u2c.py</a> to upload (sometimes faster than web-browsers)
</p>
@@ -197,7 +218,7 @@ var SR = {{ r|tojson }},
lang="{{ lang }}",
dfavico="{{ favico }}";
document.documentElement.className=localStorage.theme||"{{ args.theme }}";
document.documentElement.className=localStorage.cpp_thm||"{{ args.theme }}";
</script>
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>


@@ -4,6 +4,8 @@
src: local('Source Code Pro Regular'), local('SourceCodePro-Regular'), url(deps/scp.woff2) format('woff2');
}
html {
text-size-adjust: 100%;
-webkit-text-size-adjust: 100%;
touch-action: manipulation;
}
#tt, #toast {
@@ -13,6 +15,7 @@ html {
max-width: min(34em, calc(100% - 7em));
color: #ddd;
background: #333;
background: var(--bg-u2);
border: 0 solid #777;
box-shadow: 0 .2em .5em #111;
border-radius: .4em;
@@ -73,7 +76,7 @@ html {
#toastb {
max-height: 70vh;
overflow-y: auto;
padding: 1px;
padding: .1em;
}
#toast.scroll #toastb {
overflow-y: scroll;
@@ -157,9 +160,10 @@ html {
#modalc code,
#tt code {
color: #eee;
color: var(--fg-max);
background: #444;
background: var(--bg-u5);
padding: .1em .3em;
border-top: 1px solid #777;
border-radius: .3em;
line-height: 1.7em;
}
@@ -180,8 +184,10 @@ html.y #toast {
box-shadow: 0 .3em 1em rgba(0,0,0,0.4);
}
html.y #tt code {
background: #060;
color: #fff;
color: var(--fg-max);
background: #060;
background: var(--bg-u5);
}
#modalc code {
color: #060;
@@ -451,6 +457,20 @@ html.y textarea:focus {
padding: .2em .5em;
border: .12em solid #aaa;
}
.mdo .mdth,
.mdo .mdthl,
.mdo .mdthr {
margin: .5em .5em .5em 0;
}
.mdthl {
float: left;
}
.mdthr {
float: right;
}
hr {
clear: both;
}
@media screen {
.mdo {


@@ -588,7 +588,7 @@ function U2pvis(act, btns, uc, st) {
btns[a].onclick = function (e) {
ev(e);
var newtab = this.getAttribute('act');
function go() {
var go = function () {
for (var b = 0; b < btns.length; b++) {
btns[b].className = (
btns[b].getAttribute('act') == newtab) ? 'act' : '';
@@ -723,7 +723,7 @@ function Donut(uc, st) {
function strobe() {
var txt = strobes.pop();
wintitle(txt);
wintitle(txt, false);
if (!txt)
clearInterval(tstrober);
}
@@ -807,7 +807,7 @@ function up2k_init(subtle) {
function init_deps() {
if (!loading_deps && !got_deps()) {
var fn = 'sha512.' + sha_js + '.js',
m = L.u_https1 + ' <a href="' + (window.location + '').replace(':', 's:') + '">' + L.u_https2 + '</a> ' + L.u_https3;
m = L.u_https1 + ' <a href="' + (location + '').replace(':', 's:') + '">' + L.u_https2 + '</a> ' + L.u_https3;
showmodal('<h1>loading ' + fn + '</h1>');
import_js(SR + '/.cpr/deps/' + fn, unmodal);
@@ -861,6 +861,7 @@ function up2k_init(subtle) {
bcfg_bind(uc, 'multitask', 'multitask', true, null, false);
bcfg_bind(uc, 'potato', 'potato', false, set_potato, false);
bcfg_bind(uc, 'ask_up', 'ask_up', true, null, false);
bcfg_bind(uc, 'u2ts', 'u2ts', !u2ts.endsWith('u'), set_u2ts, false);
bcfg_bind(uc, 'fsearch', 'fsearch', false, set_fsearch, false);
bcfg_bind(uc, 'flag_en', 'flag_en', false, apply_flag_cfg);
@@ -971,7 +972,7 @@ function up2k_init(subtle) {
if (++nenters <= 0)
nenters = 1;
if (onover.bind(this)(e))
if (onover.call(this, e))
return true;
var mup, up = QS('#up_zd');
@@ -995,16 +996,29 @@ function up2k_init(subtle) {
function onoverb(e) {
// zones are alive; disable cuo2duo branch
document.body.ondragover = document.body.ondrop = null;
return onover.bind(this)(e);
return onover.call(this, e);
}
function onover(e) {
return onovercmn(this, e, false);
}
function onoverbtn(e) {
return onovercmn(this, e, true);
}
function onovercmn(self, e, btn) {
try {
var ok = false, dt = e.dataTransfer.types;
for (var a = 0; a < dt.length; a++)
if (dt[a] == 'Files')
ok = true;
else if (dt[a] == 'text/uri-list')
return true;
else if (dt[a] == 'text/uri-list') {
if (btn) {
ok = true;
if (toast.txt == L.u_uri)
toast.hide();
}
else
return toast.inf(10, L.u_uri) || true;
}
if (!ok)
return true;
@@ -1020,8 +1034,11 @@ function up2k_init(subtle) {
document.body.ondragenter = document.body.ondragleave = document.body.ondragover = null;
return modal.alert('your browser does not support drag-and-drop uploading');
}
if (btn)
return;
clmod(ebi('drops'), 'vis', 1);
var v = this.getAttribute('v');
var v = self.getAttribute('v');
if (v)
clmod(ebi(v), 'hl', 1);
}
@@ -1045,6 +1062,8 @@ function up2k_init(subtle) {
document.body.ondragleave = offdrag;
document.body.ondragover = onover;
document.body.ondrop = gotfile;
ebi('u2btn').ondrop = gotfile;
ebi('u2btn').ondragover = onoverbtn;
var drops = [ebi('up_dz'), ebi('srch_dz')];
for (var a = 0; a < 2; a++) {
@@ -1088,7 +1107,7 @@ function up2k_init(subtle) {
function gotfile(e) {
ev(e);
nenters = 0;
offdrag.bind(this)();
offdrag.call(this);
var dz = this && this.getAttribute('id');
if (!dz && e && e.clientY)
// cuo2duo fallback
@@ -1132,7 +1151,7 @@ function up2k_init(subtle) {
dst = good_files;
if (is_itemlist) {
if (fobj.kind !== 'file')
if (fobj.kind !== 'file' && fobj.type !== 'text/uri-list')
continue;
try {
@@ -1144,6 +1163,8 @@ function up2k_init(subtle) {
}
catch (ex) { }
fobj = fobj.getAsFile();
if (!fobj)
continue;
}
try {
if (fobj.size < 1)
@@ -1319,6 +1340,7 @@ function up2k_init(subtle) {
function up_them(good_files) {
start_actx();
draw_turbo();
var evpath = get_evpath(),
draw_each = good_files.length < 50;
@@ -1341,7 +1363,7 @@ function up2k_init(subtle) {
name = good_files[a][1],
fdir = evpath,
now = Date.now(),
lmod = fobj.lastModified || now,
lmod = uc.u2ts ? (fobj.lastModified || now) : 0,
ofs = name.lastIndexOf('/') + 1;
if (ofs) {
@@ -1568,7 +1590,7 @@ function up2k_init(subtle) {
return;
st.oserr = true;
var msg = HTTPS ? L.u_emtleak3 : L.u_emtleak2.format((window.location + '').replace(':', 's:'));
var msg = HTTPS ? L.u_emtleak3 : L.u_emtleak2.format((location + '').replace(':', 's:'));
modal.alert(L.u_emtleak1 + msg + (CHROME ? L.u_emtleakc : FIREFOX ? L.u_emtleakf : ''));
}
@@ -1634,11 +1656,11 @@ function up2k_init(subtle) {
var running = false,
was_busy = false;
function defer() {
var defer = function () {
running = false;
}
function taskerd() {
var taskerd = function () {
if (running)
return;
@@ -1668,7 +1690,7 @@ function up2k_init(subtle) {
is_busy = st.todo.handshake.length;
try {
if (!is_busy && !uc.fsearch && !msel.getsel().length && (!mp.au || mp.au.paused))
treectl.goto(get_evpath());
treectl.goto();
}
catch (ex) { }
}
@@ -1826,6 +1848,7 @@ function up2k_init(subtle) {
timer.rm(etafun);
timer.rm(donut.do);
ebi('u2tabw').style.minHeight = '0px';
utw_minh = 0;
}
@@ -1935,7 +1958,7 @@ function up2k_init(subtle) {
st.bytes.hashed += cdr - car;
st.etac.h++;
function orz(e) {
var orz = function (e) {
bpend--;
segm_next();
hash_calc(nch, e.target.result);
@@ -2139,6 +2162,9 @@ function up2k_init(subtle) {
function exec_head() {
var t = st.todo.head.shift();
if (t.done)
return console.log('done; skip head1', t.name, t);
st.busy.head.push(t);
var xhr = new XMLHttpRequest();
@@ -2151,6 +2177,9 @@ function up2k_init(subtle) {
st.todo.head.unshift(t);
};
function orz(e) {
if (t.done)
return console.log('done; skip head2', t.name, t);
var ok = false;
if (xhr.status == 200) {
var srv_sz = xhr.getResponseHeader('Content-Length'),
@@ -2197,6 +2226,9 @@ function up2k_init(subtle) {
keepalive = t.keepalive,
me = Date.now();
if (t.done)
return console.log('done; skip hs', t.name, t);
st.busy.handshake.push(t);
t.keepalive = undefined;
t.t_busied = me;
@@ -2206,10 +2238,9 @@ function up2k_init(subtle) {
var xhr = new XMLHttpRequest();
xhr.onerror = function () {
if (t.t_busied != me) {
console.log('zombie handshake onerror,', t.name, t);
return;
}
if (t.t_busied != me) // t.done ok
return console.log('zombie handshake onerror', t.name, t);
if (!toast.visible)
toast.warn(9.98, L.u_eneths + "\n\nfile: " + t.name, t);
@@ -2218,11 +2249,10 @@ function up2k_init(subtle) {
st.todo.handshake.unshift(t);
t.keepalive = keepalive;
};
function orz(e) {
if (t.t_busied != me) {
console.log('zombie handshake onload,', t.name, t);
return;
}
var orz = function (e) {
if (t.t_busied != me || t.done)
return console.log('zombie handshake onload', t.name, t);
if (xhr.status == 200) {
t.t_handshake = Date.now();
if (keepalive) {
@@ -2482,14 +2512,17 @@ function up2k_init(subtle) {
}
function exec_upload() {
var upt = st.todo.upload.shift();
var upt = st.todo.upload.shift(),
t = st.files[upt.nfile],
npart = upt.npart,
tries = 0;
if (t.done)
return console.log('done; skip chunk', t.name, t);
st.busy.upload.push(upt);
st.nfile.upload = upt.nfile;
var npart = upt.npart,
t = st.files[upt.nfile],
tries = 0;
if (!t.t_uploading)
t.t_uploading = Date.now();
@@ -2502,7 +2535,7 @@ function up2k_init(subtle) {
if (cdr >= t.size)
cdr = t.size;
function orz(xhr) {
var orz = function (xhr) {
var txt = ((xhr.response && xhr.response.err) || xhr.responseText) + '';
if (txt.indexOf('upload blocked by x') + 1) {
apop(st.busy.upload, upt);
@@ -2531,7 +2564,7 @@ function up2k_init(subtle) {
}
orz2(xhr);
}
function orz2(xhr) {
var orz2 = function (xhr) {
apop(st.busy.upload, upt);
apop(t.postlist, npart);
if (!t.postlist.length) {
@@ -2589,7 +2622,7 @@ function up2k_init(subtle) {
wpx = window.innerWidth,
fpx = parseInt(getComputedStyle(bar)['font-size']),
wem = wpx * 1.0 / fpx,
wide = wem > 54 ? 'w' : '',
wide = wem > 57 ? 'w' : '',
parent = ebi(wide ? 'u2btn_cw' : 'u2btn_ct'),
btn = ebi('u2btn');
@@ -2598,7 +2631,7 @@ function up2k_init(subtle) {
ebi('u2conf').className = ebi('u2cards').className = ebi('u2etaw').className = wide;
}
wide = wem > 82 ? 'ww' : wide;
wide = wem > 86 ? 'ww' : wide;
parent = ebi(wide == 'ww' ? 'u2c3w' : 'u2c3t');
var its = [ebi('u2etaw'), ebi('u2cards')];
if (its[0].parentNode !== parent) {
@@ -2609,8 +2642,7 @@ function up2k_init(subtle) {
}
}
}
window.addEventListener('resize', onresize);
onresize();
onresize100.add(onresize, true);
if (MOBILE) {
// android-chrome wobbles for a bit; firefox / iOS-safari are OK
@@ -2678,6 +2710,16 @@ function up2k_init(subtle) {
}
function draw_turbo() {
if (turbolvl < 0 && uc.turbo) {
bcfg_set('u2turbo', uc.turbo = false);
toast.err(10, L.u_turbo_c);
}
if (uc.turbo && !has(perms, 'read')) {
bcfg_set('u2turbo', uc.turbo = false);
toast.warn(30, L.u_turbo_g);
}
var msg = (turbolvl || !uc.turbo) ? null : uc.fsearch ? L.u_ts : L.u_tu,
html = ebi('u2foot').innerHTML;
@@ -2776,6 +2818,8 @@ function up2k_init(subtle) {
function set_fsearch(new_state) {
var fixed = false,
persist = new_state !== undefined,
preferred = bcfg_get('fsearch', undefined),
can_write = false;
if (!ebi('fsearch')) {
@@ -2792,8 +2836,14 @@ function up2k_init(subtle) {
}
}
if (new_state === undefined)
new_state = preferred;
if (new_state !== undefined)
bcfg_set('fsearch', uc.fsearch = new_state);
if (persist)
bcfg_set('fsearch', uc.fsearch = new_state);
else
bcfg_upd_ui('fsearch', uc.fsearch = new_state);
try {
clmod(ebi('u2c3w'), 's', !can_write);
@@ -2816,6 +2866,9 @@ function up2k_init(subtle) {
ebi('u2cards').style.display = ebi('u2tab').style.display = potato ? 'none' : '';
ebi('u2mu').style.display = potato ? '' : 'none';
if (u2ts.startsWith('f') || !sread('u2ts'))
uc.u2ts = bcfg_upd_ui('u2ts', !u2ts.endsWith('u'));
draw_turbo();
draw_life();
onresize();
@@ -2840,12 +2893,24 @@ function up2k_init(subtle) {
}
}
function set_u2sort() {
function set_u2sort(en) {
if (u2sort.indexOf('f') < 0)
return;
bcfg_set('u2sort', uc.az = u2sort.indexOf('n') + 1);
localStorage.removeItem('u2sort');
var fen = uc.az = u2sort.indexOf('n') + 1;
bcfg_upd_ui('u2sort', fen);
if (en != fen)
toast.warn(10, L.ul_btnlk);
}
function set_u2ts(en) {
if (u2ts.indexOf('f') < 0)
return;
var fen = !u2ts.endsWith('u');
bcfg_upd_ui('u2ts', fen);
if (en != fen)
toast.warn(10, L.ul_btnlk);
}
function set_hashw() {
@@ -2943,7 +3008,7 @@ ebi('ico1').onclick = function () {
if (QS('#op_up2k.act'))
goto_up2k();
apply_perms({ "perms": perms, "frand": frand });
apply_perms({ "perms": perms, "frand": frand, "u2ts": u2ts });
(function () {


@@ -6,13 +6,19 @@ if (!window.console || !console.log)
};
if (window.CGV)
for (var k in CGV)
window[k] = CGV[k];
var wah = '',
NOAC = 'autocorrect="off" autocapitalize="off"',
L, tt, treectl, thegrid, up2k, asmCrypto, hashwasm, vbar, marked,
CB = '?_=' + Date.now(),
R = SR.slice(1),
RS = R ? "/" + R : "",
HALFMAX = 8192 * 8192 * 8192 * 8192,
HTTPS = (window.location + '').indexOf('https:') === 0,
HTTPS = ('' + location).indexOf('https:') === 0,
TOUCH = 'ontouchstart' in window,
MOBILE = TOUCH,
CHROME = !!window.chrome,
@@ -139,29 +145,35 @@ catch (ex) {
}
var crashed = false, ignexd = {}, evalex_fatal = false;
function vis_exh(msg, url, lineNo, columnNo, error) {
if ((msg + '').indexOf('ResizeObserver') + 1)
msg = String(msg);
url = String(url);
if (msg.indexOf('ResizeObserver') + 1)
return; // chrome issue 809574 (benign, from <video>)
if ((msg + '').indexOf('l2d.js') + 1)
if (msg.indexOf('l2d.js') + 1)
return; // `t` undefined in tapEvent -> hitTestSimpleCustom
if (!/\.js($|\?)/.exec('' + url))
if (!/\.js($|\?)/.exec(url))
return; // chrome debugger
if ((url + '').indexOf(' > eval') + 1 && !evalex_fatal)
if (url.indexOf(' > eval') + 1 && !evalex_fatal)
return; // md timer
var ekey = url + '\n' + lineNo + '\n' + msg;
if (ignexd[ekey] || crashed)
return;
if (url.indexOf('deps/marked.js') + 1 && !window.WebAssembly)
return; // ff<52
crashed = true;
window.onerror = undefined;
var html = [
'<h1>you hit a bug!</h1>',
'<p style="font-size:1.3em;margin:0">try to <a href="#" onclick="localStorage.clear();location.reload();">reset copyparty settings</a> if you are stuck here, or <a href="#" onclick="ignex();">ignore this</a> / <a href="#" onclick="ignex(true);">ignore all</a> / <a href="?b=u">basic</a></p>',
'<p style="color:#fff">please send me a screenshot arigathanks gozaimuch: <a href="<ghi>" target="_blank">github issue</a> or <code>ed#2644</code></p>',
'<p class="b">' + esc(url + ' @' + lineNo + ':' + columnNo), '<br />' + esc(String(msg)).replace(/\n/g, '<br />') + '</p>',
'<p style="font-size:1.3em;margin:0;line-height:2em">try to <a href="#" onclick="localStorage.clear();location.reload();">reset copyparty settings</a> if you are stuck here, or <a href="#" onclick="ignex();">ignore this</a> / <a href="#" onclick="ignex(true);">ignore all</a> / <a href="?b=u">basic</a></p>',
'<p style="color:#fff">please send me a screenshot arigathanks gozaimuch: <a href="<ghi>" target="_blank">new github issue</a></p>',
'<p class="b">' + esc(url + ' @' + lineNo + ':' + columnNo), '<br />' + esc(msg).replace(/\n/g, '<br />') + '</p>',
'<p><b>UA:</b> ' + esc(navigator.userAgent + '')
];
@@ -225,7 +237,7 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
'#exbox{background:#222;color:#ddd;font-family:sans-serif;font-size:0.8em;padding:0 1em 1em 1em;z-index:80386;position:fixed;top:0;left:0;right:0;bottom:0;width:100%;height:100%;overflow:auto;width:calc(100% - 2em)} ' +
'#exbox,#exbox *{line-height:1.5em;overflow-wrap:break-word} ' +
'#exbox code{color:#bf7;background:#222;padding:.1em;margin:.2em;font-size:1.1em;font-family:monospace,monospace} ' +
'#exbox a{text-decoration:underline;color:#fc0} ' +
'#exbox a{text-decoration:underline;color:#fc0;background:#222;border:none} ' +
'#exbox h1{margin:.5em 1em 0 0;padding:0} ' +
'#exbox p.b{border-top:1px solid #999;margin:1em 0 0 0;font-size:1em} ' +
'#exbox ul, #exbox li {margin:0 0 0 .5em;padding:0} ' +
@@ -353,13 +365,13 @@ catch (ex) {
}
// https://stackoverflow.com/a/950146
function import_js(url, cb) {
function import_js(url, cb, ecb) {
var head = document.head || document.getElementsByTagName('head')[0];
var script = mknod('script');
script.type = 'text/javascript';
script.src = url;
script.src = url + '?_=' + (window.TS || 'a');
script.onload = cb;
script.onerror = function () {
script.onerror = ecb || function () {
var m = 'Failed to load module:\n' + url;
console.log(m);
toast.err(0, m);
@@ -368,6 +380,15 @@ function import_js(url, cb) {
}
function unsmart(txt) {
return !IPHONE ? txt : (txt.
replace(/[\u2014]/g, "--").
replace(/[\u2022]/g, "*").
replace(/[\u2018\u2019]/g, "'").
replace(/[\u201c\u201d]/g, '"'));
}
var crctab = (function () {
var c, tab = [];
for (var n = 0; n < 256; n++) {
@@ -462,7 +483,7 @@ function yscroll() {
function showsort(tab) {
var v, vn, v1, v2, th = tab.tHead,
sopts = jread('fsort', [["href", 1, ""]]);
sopts = jread('fsort', jcp(dsort));
th && (th = th.rows[0]) && (th = th.cells);
@@ -632,6 +653,29 @@ function vsplit(vp) {
}
function vjoin(p1, p2) {
if (!p1)
p1 = '';
if (!p2)
p2 = '';
if (p1.endsWith('/'))
p1 = p1.slice(0, -1);
if (p2.startsWith('/'))
p2 = p2.slice(1);
if (!p1)
return p2;
if (!p2)
return p1;
return p1 + '/' + p2;
}
function uricom_enc(txt, do_fb_enc) {
try {
return encodeURIComponent(txt);
@@ -719,7 +763,7 @@ function get_pwd() {
if (pwd.length < 2)
return null;
return pwd[1].split(';')[0];
return decodeURIComponent(pwd[1].split(';')[0]);
}
@@ -843,9 +887,10 @@ function jcp(obj) {
}
function sread(key) {
function sread(key, al) {
try {
return localStorage.getItem(key);
var ret = localStorage.getItem(key);
return (!al || has(al, ret)) ? ret : null;
}
catch (e) {
return null;
@@ -867,7 +912,13 @@ function jread(key, fb) {
if (!str)
return fb;
return JSON.parse(str);
try {
// '' throws, null is ok, sasuga
return JSON.parse(str);
}
catch (e) {
return fb;
}
}
function jwrite(key, val) {
@@ -931,13 +982,14 @@ function bcfg_set(name, val) {
function bcfg_upd_ui(name, val) {
var o = ebi(name);
if (!o)
return;
return val;
if (o.getAttribute('type') == 'checkbox')
o.checked = val;
else if (o) {
clmod(o, 'on', val);
}
return val;
}
function bcfg_bind(obj, oname, cname, defval, cb, un_ev) {
@@ -1028,10 +1080,13 @@ function cliptxt(txt, ok) {
}
var timer = (function () {
var r = {};
function Debounce(delay) {
var r = this;
r.delay = delay;
r.timer = 0;
r.t_hit = 0;
r.t_run = 0;
r.q = [];
r.last = 0;
r.add = function (fun, run) {
r.rm(fun);
@@ -1045,7 +1100,67 @@ var timer = (function () {
apop(r.q, fun);
};
function doevents() {
r.run = function () {
if (crashed)
return;
r.t_run = Date.now();
var q = r.q.slice(0);
for (var a = 0; a < q.length; a++)
q[a]();
};
r.hit = function () {
if (crashed)
return;
var now = Date.now(),
td_hit = now - r.t_hit,
td_run = now - r.t_run;
if (td_run >= r.delay * 2)
r.t_run = now;
if (td_run >= r.delay && td_run <= r.delay * 2) {
// r.delay is also deadline
clearTimeout(r.timer);
return r.run();
}
if (td_hit < r.delay / 5)
return;
clearTimeout(r.timer);
r.timer = setTimeout(r.run, r.delay);
r.t_hit = now;
};
};
var onresize100 = new Debounce(100);
window.addEventListener('resize', onresize100.hit);
var timer = (function () {
var r = {};
r.q = [];
r.last = 0;
r.fs = 0;
r.fc = 0;
r.add = function (fun, run) {
r.rm(fun);
r.q.push(fun);
if (run)
fun();
};
r.rm = function (fun) {
apop(r.q, fun);
};
var doevents = function () {
if (crashed)
return;
@@ -1057,6 +1172,7 @@ var timer = (function () {
q[a]();
r.last = Date.now();
//r.fc++; if (r.last - r.fs >= 2000) { console.log(r.last - r.fs, r.fc); r.fs = r.last; r.fc = 0; }
}
setInterval(doevents, 100);
@@ -1081,7 +1197,7 @@ var tt = (function () {
var prev = null;
r.cshow = function () {
if (this !== prev)
r.show.bind(this)();
r.show.call(this);
prev = this;
};
@@ -1093,7 +1209,7 @@ var tt = (function () {
return;
if (Date.now() - r.lvis < 400)
return r.show.bind(this)();
return r.show.call(this);
tev = setTimeout(r.show.bind(this), 800);
if (TOUCH)
@@ -1183,13 +1299,13 @@ var tt = (function () {
r.th.style.top = (e.pageY + 12 * sy) + 'px';
};
if (IPHONE) {
if (TOUCH) {
var f1 = r.show,
f2 = r.hide,
q = [];
// if an onclick-handler creates a new timer,
// iOS 13.1.2 delays the entire handler by up to 401ms,
// webkits delay the entire handler by up to 401ms,
// win by using a shared timer instead
timer.add(function () {
@@ -1251,8 +1367,11 @@ var toast = (function () {
r.visible = false;
r.txt = null;
r.tag = obj; // filler value (null is scary)
r.p_txt = '';
r.p_sec = 0;
r.p_t = 0;
function scrollchk() {
var scrollchk = function () {
if (scrolling)
return;
@@ -1267,7 +1386,7 @@ var toast = (function () {
scrolling = true;
}
function unscroll() {
var unscroll = function () {
timer.rm(scrollchk);
clmod(obj, 'scroll');
scrolling = false;
@@ -1283,10 +1402,23 @@ var toast = (function () {
};
r.show = function (cl, sec, txt, tag) {
var same = r.visible && txt == r.p_txt && r.p_sec == sec,
delta = Date.now() - r.p_t;
if (same && delta < 100)
return;
r.p_txt = txt;
r.p_sec = sec;
r.p_t = Date.now();
clearTimeout(te);
if (sec)
te = setTimeout(r.hide, sec * 1000);
if (same && delta < 1000)
return;
if (txt.indexOf('<body>') + 1)
txt = txt.slice(0, txt.indexOf('<')) + ' [...]';
@@ -1373,7 +1505,7 @@ var modal = (function () {
r.busy = false;
setTimeout(next, 50);
};
function ok(e) {
var ok = function (e) {
ev(e);
var v = ebi('modali');
v = v ? v.value : true;
@@ -1381,14 +1513,14 @@ var modal = (function () {
if (cb_ok)
cb_ok(v);
}
function ng(e) {
var ng = function (e) {
ev(e);
r.hide();
if (cb_ng)
cb_ng(null);
}
function onfocus(e) {
var onfocus = function (e) {
var ctr = ebi('modalc');
if (!ctr || !ctr.contains || !document.activeElement || ctr.contains(document.activeElement))
return;
@@ -1400,7 +1532,7 @@ var modal = (function () {
ev(e);
}
function onkey(e) {
var onkey = function (e) {
var k = e.code,
eok = ebi('modal-ok'),
eng = ebi('modal-ng'),
@@ -1423,7 +1555,7 @@ var modal = (function () {
return ng();
}
function next() {
var next = function () {
if (!r.busy && q.length)
q.shift()();
}
@@ -1434,7 +1566,7 @@ var modal = (function () {
});
next();
};
function _alert(html, cb, fun) {
var _alert = function (html, cb, fun) {
cb_ok = cb_ng = cb;
cb_up = fun;
html += '<div id="modalb"><a href="#" id="modal-ok">OK</a></div>';
@@ -1447,7 +1579,7 @@ var modal = (function () {
});
next();
}
function _confirm(html, cok, cng, fun, btns) {
var _confirm = function (html, cok, cng, fun, btns) {
cb_ok = cok;
cb_ng = cng === undefined ? cok : cng;
cb_up = fun;
@@ -1461,11 +1593,11 @@ var modal = (function () {
});
next();
}
function _prompt(html, v, cok, cng, fun) {
var _prompt = function (html, v, cok, cng, fun) {
cb_ok = cok;
cb_ng = cng === undefined ? cok : null;
cb_up = fun;
html += '<input id="modali" type="text" /><div id="modalb">' + ok_cancel + '</div>';
html += '<input id="modali" type="text" ' + NOAC + ' /><div id="modalb">' + ok_cancel + '</div>';
r.show(html);
ebi('modali').value = v || '';
@@ -1497,7 +1629,7 @@ function repl_load() {
ret = [
'var v=Object.keys(localStorage); v.sort(); JSON.stringify(v)',
"for (var a of QSA('#files a[id]')) a.setAttribute('download','')",
'console.hist.slice(-10).join("\\n")'
'console.hist.slice(-50).join("\\n")'
];
ipre.innerHTML = '<option value=""></option>';
@@ -1553,6 +1685,8 @@ function repl(e) {
if (!cmd)
return toast.inf(3, 'eval aborted');
cmd = unsmart(cmd);
if (cmd.startsWith(',')) {
evalex_fatal = true;
return modal.alert(esc(eval(cmd.slice(1)) + ''));
@@ -1579,11 +1713,19 @@ function load_md_plug(md_text, plug_type, defer) {
if (defer)
md_plug[plug_type] = null;
if (plug_type == 'pre')
try {
md_text = md_thumbs(md_text);
}
catch (ex) {
toast.warn(30, '' + ex);
}
if (!have_emp)
return md_text;
var find = '\n```copyparty_' + plug_type + '\n',
md = md_text.replace(/\r/g, ''),
md = '\n' + md_text.replace(/\r/g, '') + '\n',
ofs = md.indexOf(find),
ofs2 = md.indexOf('\n```', ofs + 1);
@@ -1619,6 +1761,47 @@ function load_md_plug(md_text, plug_type, defer) {
return md;
}
function md_thumbs(md) {
if (!/(^|\n)<!-- th -->/.exec(md))
return md;
// `!th[flags](some.jpg)`
// flags: nothing or "l" or "r"
md = md.split(/!th\[/g);
for (var a = 1; a < md.length; a++) {
if (!/^[^\]!()]*\]\([^\][!()]+\)/.exec(md[a])) {
md[a] = '!th[' + md[a];
continue;
}
var o1 = md[a].indexOf(']('),
o2 = md[a].indexOf(')', o1),
alt = md[a].slice(0, o1),
flags = alt.split(','),
url = md[a].slice(o1 + 2, o2),
float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : '';
if (!/[?&]cache/.exec(url))
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=i';
md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1);
}
return md.join('');
}
function md_th_set() {
var els = QSA('.mdth');
for (var a = 0, aa = els.length; a < aa; a++)
els[a].onclick = md_th_click;
}
function md_th_click(e) {
ev(e);
var url = this.getAttribute('href').split('?')[0];
if (window.sb_md)
window.parent.postMessage("imshow " + url, "*");
else
thegrid.imshow(url);
}
var svg_decl = '<?xml version="1.0" encoding="UTF-8"?>\n';
@@ -1629,7 +1812,7 @@ var favico = (function () {
r.en = true;
r.tag = null;
function gx(txt) {
var gx = function (txt) {
return (svg_decl +
'<svg version="1.1" viewBox="0 0 64 64" xmlns="http://www.w3.org/2000/svg">\n' +
(r.bg ? '<rect width="100%" height="100%" rx="16" fill="#' + r.bg + '" />\n' : '') +
@@ -1697,7 +1880,6 @@ function cprop(name) {
function bchrome() {
console.log(document.documentElement.className);
var v, o = QS('meta[name=theme-color]');
if (!o)
return;
@@ -1715,16 +1897,17 @@ function xhrchk(xhr, prefix, e404, lvl, tag) {
if (xhr.status < 400 && xhr.status >= 200)
return true;
if (xhr.status == 403)
var errtxt = (xhr.response && xhr.response.err) || xhr.responseText,
fun = toast[lvl || 'err'],
is_cf = /[Cc]loud[f]lare|>Just a mo[m]ent|#cf-b[u]bbles|Chec[k]ing your br[o]wser|\/chall[e]nge-platform|"chall[e]nge-error|nable Ja[v]aScript and cook/.test(errtxt);
if (xhr.status == 403 && !is_cf)
return toast.err(0, prefix + (L && L.xhr403 || "403: access denied\n\ntry pressing F5, maybe you got logged out"), tag);
if (xhr.status == 404)
return toast.err(0, prefix + e404, tag);
var errtxt = (xhr.response && xhr.response.err) || xhr.responseText,
fun = toast[lvl || 'err'];
if (xhr.status == 503 && /[Cc]loud[f]lare|>Just a mo[m]ent|#cf-b[u]bbles|Chec[k]ing your br[o]wser/.test(errtxt)) {
if (is_cf && (xhr.status == 403 || xhr.status == 503)) {
var now = Date.now(), td = now - cf_cha_t;
if (td < 15000)
return;


@@ -1,3 +1,787 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1021-1443 `v1.9.14` uptime
## new features
* search for files by upload time
* option to display upload time in directory listings
* enable globally with `-e2d -mte +.up_at` or per-volume with volflags `e2d,mte=+.up_at` (see the example below this list)
* has a ~17% performance impact on directory listings
* [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression) in the audioplayer settings
* `--ban-404` is now default-enabled
* the turbo-uploader will now un-turbo when necessary to avoid banning itself
* this only affects accounts with permissions `g`, `G`, or `h`
* accounts with read-access (which are able to see directory listings anyways) and accounts with write-only access are no longer affected by `--ban-404` or `--ban-url`
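for example, a minimal sketch of enabling the upload-time column (the `/srv/pub` path and anonymous `rw` access are just placeholders):
```bash
# globally: index uploads (-e2d) and add the upload-time column to listings
python3 copyparty-sfx.py -e2d -mte +.up_at -v /srv/pub:pub:rw

# or per-volume, using volflags (one c, prefix per flag)
python3 copyparty-sfx.py -v /srv/pub:pub:rw:c,e2d:c,mte=+.up_at
```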
## bugfixes
* #55 clients could hit the `--ban-url` filter when uploading over webdav
* fixed by limiting `--ban-404` and `--ban-url` to accounts with permission `g`, `G`, or `h`
* fixed 20% performance drop in python 3.12 due to utcfromtimestamp deprecation
* but 3.12.0 is still 5% slower than 3.11.6 for some reason
* volume listing on startup would display some redundant info
## other changes
* timeout for unfinished uploads increased from 6 to 24 hours
* and is now configurable with `--snap-drop`
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1015-2006 `v1.9.12` more buttons
just adding requested features, nothing important
## new features
* button `📅` in the uploader (default-enabled) sends your local last-modified timestamps to the server
* when deselected, the files on the server will have the upload time as their timestamps instead
* `--u2ts` specifies the default setting, `c` client-last-modified or `u` upload-time, or `fc` and `fu` to force
* button `full` in the gridview decides if thumbnails should be center-cropped or not
* `--no-crop` and the `nocrop` volflag now set the default value of this instead of forcing the setting
* thumbnail cleanup is now more granular, cleaning full-jpg separately from cropped-webp for example
* set default sort order with `--sort` or volflag `sort`
* one or more comma-separated values; `tags/Circle,tags/.tn,tags/Artist,tags/Title,href`
* see the column header tooltips in the browser to know what names (`id`) to use
* prefix a column name with `-` for descending sort
* specifying a sort order in the client will override all server-defined ones (see the example below this list)
* when visiting a read-only folder, the upload-or-filesearch toggle will remember its previous state and restore it when leaving the folder
* much more intuitive, if anything about this UI can be called that...
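for example, a sketch combining the two new defaults (column ids are illustrative; check the column-header tooltips for the real ones):
```bash
# force upload-time timestamps, sort by artist then by filename descending
python3 copyparty-sfx.py --u2ts fu --sort tags/Artist,-href -v /srv/music:music:r
```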
## bugfixes
* iPhone: rare javascript panic when switching between safari and another app
* ie9: file-rename ui was borked
## other changes
* copyparty.exe: upgrade to pillow 10.1 (which adds a new font for thumbnails in chrome)
* still based on python 3.11.6 because 3.12 is currently slower than 3.11
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1009-0036 `v1.9.11` bustin'
okay, i swear this is the last version for weeks! probably
## bugfixes
* cachebuster didn't apply to dynamically loaded javascript files
* READMEs could fail to render with `ReferenceError: DOMPurify is not defined` after upgrading from a copyparty older than v1.9.2
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1008-2051 `v1.9.10` badpwd
## new features
* argument `--log-badpwd` specifies how to log invalid login attempts;
* `0` = just a warning with no further information
* `1` = log incorrect password in plaintext (default)
* `2` = log sha512 hash of the incorrect password
* `1` and `2` are convenient for stuff like setting up autoban triggers for common passwords using fail2ban or similar
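for example (values straight from the list above; `2` keeps plaintext out of the logs while still giving fail2ban something stable to match on):
```bash
# only warn, without logging the attempted password
python3 copyparty-sfx.py --log-badpwd 0

# log the sha512 of each wrong password instead of plaintext
python3 copyparty-sfx.py --log-badpwd 2
```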
## bugfixes
* none!
* the formerly mentioned caching-directives bug turned out to be unreachable... oh well, better safe than sorry
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1007-2229 `v1.9.9` fix cross-volume dedup moves
## bugfixes
* v1.6.2 introduced a bug which, when moving files between volumes, could cause the move operation to abort when it encounters a deduplicated file
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1006-1750 `v1.9.8` static filekeys
## new features
* #52 add alternative filekey generator:
* volflag `fka` changes the calculation to ignore filesize and inode-number, only caring about the absolute-path on the filesystem and the `--fk-salt`
* good for linking to markdown files which might be edited, but reduces security a tiny bit (see the sketch below this list)
* add warning on startup if `--fk-salt` is too weak (for example when it was upgraded from before [v1.7.6](https://github.com/9001/copyparty/releases/tag/v1.7.6))
* removed the filekey upgrade feature to ensure a weak fk-salt is not selected; a new filekey will be generated from scratch on startup if necessary
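for example, a sketch of path-only filekeys on a single volume (assumes filekeys themselves are enabled with the `fk` volflag; path and salt are placeholders):
```bash
python3 copyparty-sfx.py --fk-salt 'some-long-random-string' -v /srv/notes:notes:r:c,fk:c,fka
```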
## other changes
* pyftpdlib upgraded to 1.5.8
* copyparty.exe built on python 3.11.6
* the exe in this release will be replaced with a 3.12.0 exe as soon as [pillow adds 3.12 support](https://github.com/python-pillow/Pillow/issues/6941)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0930-2332 `v1.9.7` better column hider
## new features
* column hiding on phones is much more intuitive
* since you usually want to hide multiple columns, the hiding mode must now be manually disengaged
* click-handler now covers the entire header cell, preventing a misclick from accidentally sorting the table instead
## bugfixes
* #51 running copyparty with an invalid value for `--lang` made it crash with a confusing error message
* also makes it more compatible with other localStorage-using webservices running on the same domain
## other changes
* CVE-2023-5217, a vulnerability in libvpx, was recently fixed by alpine and is no longer present in the docker images
* unlike the fix in v1.9.6, this is irrelevant since it was impossible to reach in all conceivable setups, but still nice
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0923-1215 `v1.9.6` configurable x-forwarded-for
## new features
* rudimentary support for jython and graalpy, plus a directory tree sidebar in internet explorer 9 through 11 and firefox 10
* all older browsers (ie4, ie6, ie8, Netscape) get basic html instead
* #35 adds a [hook](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/msg-log.py) which extends the message-to-serverlog feature so it writes the message to a textfile on the server
* could theoretically be extended into a [full instant-messaging feature](https://github.com/9001/copyparty/blob/hovudstraum/srv/chat.md) but that's silly, [nobody would do that](https://ocv.me/stuff/cchat.webm)
* [r0c is much better](https://github.com/9001/r0c) than this joke
## bugfixes
* 163e3fce46122d64bf824762b6733ff2c3551ba5 the `x-forwarded-for` header was ignored if the nearest reverse-proxy was not asking from 127.0.0.1, which broke client IPs in containerized deployments
* the serverlog will now explain how to trust the reverse-proxy to provide client IPs, but basically,
* `--xff-hdr` specifies which header to read the client's real ip from
* `--xff-src` is an allowlist of IP-addresses to trust that header from (see the sketch below this list)
* a62f744a187bc9f821b540e8bb4e0b9a67bd01c8 if copyparty was started while an external HDD was not connected, and that volume's index was stored elsewhere, then the index would get wiped (since all the files are gone)
* 3b8f66c0d5c27a68841814ec06f1758f146a5ff5 javascript could crash while uploading from a very unreliable internet connection
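for example, a sketch of trusting a reverse-proxy for client IPs (the header name and proxy address are placeholders for your setup):
```bash
python3 copyparty-sfx.py --xff-hdr x-forwarded-for --xff-src 127.0.0.1
```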
## other changes
* copyparty.exe: updated pillow to 10.0.1 which fixes the webp cve
* alpine, which the docker images are based on, turns out to be fairly slow -- currently working on a new docker image (probably fedora-based) which will be 30% faster at analyzing multimedia files and in general 20% faster on average
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0909-1336 `v1.9.5` webhotell
[happy 9/9!](https://safebooru.org/index.php?page=post&s=view&id=4027419)
## new features
* new permission `h` disables directory listing (so it works like `g`), except it redirects to the folder's index.html instead of returning 404
* index.html is accessible by anyone with `h` even if filekeys are enabled
* well suited for running a shared-webhosting gig (thx kipu) especially now that the...
* markdown editor can now be used on non-markdown files if account has `w`rite and `d`elete
* hotkey `e` to edit a textfile while it's open in the textfile viewer
* SMB: account permissions now work fully as intended, thanks to impacket 0.11
* but enabling `--smb` is still strongly discouraged as it's a massive security hazard
* download-as-zip can be 2.5x faster on tiny files, at least 15% faster in general
* download folders as pax-format tarfiles with `?tar=pax` or `?tar=pax,xz:9`
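for example, grabbing a folder as an xz-compressed pax tarball with curl (the URL is a placeholder):
```bash
curl -o music.tar.xz 'https://example.com/pub/music/?tar=pax,xz:9'
```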
## bugfixes
* 422-autoban accidentally triggered when uploading lots of duplicate files (thx hiem!)
* `--css-browser` and `--js-browser` now accept URLs with cache directives
* `--css-browser=/the.css?cache=600` (seconds) or `--js-browser=/.res/the.js?cache=i` (7 days)
* SMB: avoid windows freaking out and disconnecting if it hits an offline volume
* hotkey shift-r to rotate pictures counter-clockwise didn't do anything
* hacker theme wasn't hacker enough (everything is monospace now)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0902-0018 `v1.9.4` yes symlink times
hello! it's been a while, an entire day even...
## new features
* download folder as tar.gz, tar.bz2, tar.xz
* single-threaded, so extremely slow, but nice for easily compressed data or challenged networks
* append `?tar=gz`, `?tar=bz2` or `?tar=xz` to a folder URL to do it
* default compression levels are gz:3, bz2:2, xz:1; override with `?tar=gz:9`
## bugfixes
* c1efd227b7377144a5760bc6cff64f4e87b626d9 symlink-deduplicated files got indexed with the wrong last-modified timestamp
* mostly inconsequential; would cause the dupe's uploader-ip to be forgotten on the next server restart since it would reindex to "fix" the timestamp
* when linking [a search query](https://a.ocv.me/pub/#q=tags%20like%20soundsho*) it loads the results faster
## other changes
* update readme to mention that iPhones and iPads dislike the preload feature (the audio glitches a bit when a song is exactly 20 seconds away from ending), and that it's probably still a bad idea to disable preloading since i bet it's load-bearing against other iOS bugs
* speaking of iPhones and iPads, the [previous version](https://github.com/9001/copyparty/releases/tag/v1.9.3) should have fixed album playback on those
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0831-2211 `v1.9.3` iOS and http fixes
## new features
* iPhones and iPads are now able to...
* 9986136dfb2364edb35aa9fbb87410641c6d6af3 play entire albums while the screen is off without the music randomly stopping
* apple keeps breaking AudioContext in new and interesting ways; time to give up (no more equalizer)
* 1c0d978979a703edeb792e552b18d3b7695b2d90 perform search queries and execute js code
* by translating [smart-quotes](https://stackoverflow.com/questions/48678359/ios-11-safari-html-disable-smart-punctuation) into regular `'` and `"` characters
* python 3.12 support
* technically a bugfix since it was added [a year ago](https://github.com/9001/copyparty/commit/32e22dfe84d5e0b13914b4d0e15c1b8c9725a76d) way before the first py3.12 alpha was released but turns out i botched it, oh well
* filter error messages so they never include the filesystem path where copyparty's python files reside
* print more context in server logs if someone hits an unexpected permission-denied
## bugfixes
found some iffy stuff while combing over the code but, as far as I can tell, luckily none of it was dangerous:
* URL normalization was a bit funky, but it appears everything access-control-related was unaffected
* some url parameters were double-decoded, causing the unpost filtering and file renaming to fail if the values contained `%`
* clients could cause the server to return an invalid cache-control header, but newlines and control-characters got rejected correctly
* minor cosmetics / qol fixes:
* reduced flickering on page load in chrome
* fixed some console spam in search results
* markdown documents now have the same line-height in directory listings and the editor
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0826-2116 `v1.9.2` bigger hammer
## new features
* more ways to automatically ban users! three new sensors, all default-enabled, giving a 1 day ban after 9 hits in 2 minutes:
* `--ban-403`: trying to access volumes that dont exist or require authentication
* `--ban-422`: invalid POST messages (from bruteforcing POST parameters and such)
* `--ban-url`: URLs which 404 and also match `--sus-urls` (scanners/crawlers)
* if you want to run a vulnerability scan on copyparty, please just [download the server](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) and do it locally! takes less than 30 seconds to set up, you get lower latency, and you won't be filling up the logfiles on the demo server with junk, thank you 🙏
* more ban-related stuff,
* new global option `--nonsus-urls` specifies regex of URLs which are OK to 404 and shouldn't ban people
* `--turbo` now accepts the value `-1` which makes it impossible for clients to enable it, making `--ban-404` safe to use (see the sketch after this list)
* range-selecting files in the list-view by shift-pgup/pgdn
* volumes which are currently unavailable (dead nfs share, external HDD which is off, ...) are marked with a ❌ in the directory tree sidebar
* the toggle-button to see dotfiles is now persisted as a cookie so it also applies on the initial page load
* more effort is made to prevent `<script>`s inside markdown documents from running in the markdown editor and the fullpage viewer
* anyone who wanted to use markdown files for malicious stuff can still just upload an html file instead, so this doesn't make anything more secure, just less confusing
* the safest approach is still the `nohtml` volflag which disables markdown rendering outside sandboxes entirely, or only giving out write-access to trustworthy people
* enabling markdown plugins with `-emp` now has the side-effect of cancelling this band-aid too
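for example, a sketch of hard-disabling turbo so `--ban-404` stays safe, plus marking one URL as fine-to-404 (the regex is only an illustration):
```bash
python3 copyparty-sfx.py --turbo=-1 --nonsus-urls '^favicon\.ico$'
```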
## bugfixes
* textfile navigation hotkeys broke in the previous version
## other changes
* example [nginx config](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nginx/copyparty.conf) was not compatible with cloudflare (suggest `$http_cf_connecting_ip` instead of `$proxy_add_x_forwarded_for`)
* `copyparty.exe` is now built with python 3.11.5 which fixes [CVE-2023-40217](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-40217)
* `copyparty32.exe` is not, because python understandably ended win7 support
* [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md):
* copyparty appears to be 30x faster than nextcloud and seafile at receiving uploads of many small files
* seafile has a size limit when zip-downloading folders
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0820-2338 `v1.9.1` prometheable
## new features
* #49 prometheus / grafana / openmetrics integration ([see readme](https://github.com/9001/copyparty#prometheus))
* read metrics from http://127.0.0.1:3923/.cpr/metrics after enabling with `--stats` (see the example below this list)
* download a folder with all music transcoded to opus by adding `?tar=opus` or `?zip&opus` to the URL
* can also be used to download thumbnails instead of full images; `?tar=w` for webp, `?tar=j` for jpg
* so i guess the long-time requested feature of pre-generating thumbnails kind of happened after all, if you schedule a `curl http://127.0.0.1:3923/?tar=w >/dev/null` after server startup
* u2c (commandline uploader): argument `-x` to exclude files by regex (compares absolute filesystem paths)
* `--zm-spam 30` can be used to improve zeroconf / mDNS reliability on crazy networks
* only necessary if there are clients with multiple IPs and some of the IPs are outside the subnets that copyparty is in -- not spec-compliant, not really recommended, but shouldn't cause any issues either
* and `--mc-hop` wasn't actually implemented until now
* dragging an image from another browser window onto the upload button is now possible
* only works on chrome, and only on windows or linux (not macos)
* server hostname is prefixed in all window titles
* can be adjusted with `--bname` (the file explorer) and `--doctitle` (all other documents)
* can be disabled with `--nth` (just window title) or `--nih` (title + header)
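for example, a sketch of enabling the metrics endpoint and checking it by hand (volume and path are placeholders; point your prometheus scraper at the same URL):
```bash
python3 copyparty-sfx.py --stats -v /srv/pub:pub:r
curl http://127.0.0.1:3923/.cpr/metrics
```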
## bugfixes
* docker: the autogenerated seeds for filekeys and account passwords now get persisted to the config volume (thx noktuas)
* uploading files with fancy filenames could fail if the copyparty server is running on android
* improve workarounds for some apple/iphone/ios jank (thx noktuas and spiky)
* some ui elements had their font-size selected by fair dice roll
* the volume control does nothing because [apple disabled it](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11), so add a warning
* the image gallery cannot be fullscreened [as apple intended](https://developer.mozilla.org/en-US/docs/Web/API/Element/requestFullscreen#browser_compatibility) so add a warning
## other changes
* file table columns are now limited to browser window width
* readme: mention that nginx-QUIC is currently very slow (thx noktuas)
* #50 add a safeguard to the wget plugin in case wget at some point adds support for `file://` or similar
* show a suggestion on startup to enable the database
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0725-1550 `v1.8.8` just boring bugfixes
final release until late august unless something bad happens and i end up building this thing on a shinkansen
## recent security / vulnerability fixes
* there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates
* [v1.8.7](https://github.com/9001/copyparty/releases/tag/v1.8.7) (2023-07-23) - [CVE-2023-38501](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-38501) - reflected XSS
* [v1.8.2](https://github.com/9001/copyparty/releases/tag/v1.8.2) (2023-07-14) - [CVE-2023-37474](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-37474) - path traversal (first CVE)
* all serverlogs reviewed so far (5 public servers) showed no signs of exploitation
## bugfixes
* range-select with shiftclick:
* don't crash when entering another folder and shift-clicking some more
* remember selection origin when lazy-loading more stuff into the viewport
* markdown editor:
* fix confusing warnings when the browser cache decides it *really* wants to cache
* and when a document starts with a newline
* remember intended actions such as `?edit` on login prompts
* Windows: TLS-cert generation (triggered by network changes) could occasionally fail
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0723-1543 `v1.8.7` XSS for days
at the lack of better ideas, there is now a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` for all future important updates such as this one
## bugfixes
* reflected XSS through `/?k304` and `/?setck`
* if someone tricked you into clicking a URL containing a chain of `%0d` and `%0a` they could potentially have moved/deleted existing files on the server, or uploaded new files, using your account
* if you use a reverse proxy, you can check if you have been exploited like so:
* nginx: grep your logs for URLs containing `%0d%0a%0d%0a`, for example using the following command:
```bash
(gzip -dc access.log*.gz; cat access.log) | sed -r 's/" [0-9]+ .*//' | grep -iE '%0[da]%0[da]%0[da]%0[da]'
```
* if you find any traces of exploitation (or just want to be on the safe side) it's recommended to change the passwords of your copyparty accounts
* huge thanks *again* to @TheHackyDog !
* the original fix for CVE-2023-37474 broke the download links for u2c.py and partyfuse.py
* fix mediaplayer spinlock if the server only has a single audio file
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0721-0036 `v1.8.6` fix reflected XSS
## bugfixes
* reflected XSS through `/?hc` (the optional subfolder parameter to the [connect](https://a.ocv.me/?hc) page)
* if someone tricked you into clicking `http://127.0.0.1:3923/?hc=<script>alert(1)</script>` they could potentially have moved/deleted existing files on the server, or uploaded new files, using your account
* if you use a reverse proxy, you can check if you have been exploited like so:
* nginx: grep your logs for URLs containing `?hc=` with `<` somewhere in its value, for example using the following command:
```bash
(gzip -dc access.log*.gz; cat access.log) | sed -r 's/" [0-9]+ .*//' | grep -E '[?&](hc|pw)=.*[<>]'
```
* if you find any traces of exploitation (or just want to be on the safe side) it's recommended to change the passwords of your copyparty accounts
* thanks again to @TheHackyDog !
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0718-0746 `v1.8.4` range-select v2
**IMPORTANT:** `v1.8.2` (previous release) fixed [CVE-2023-37474](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-37474) ; please see the [1.8.2 release notes](https://github.com/9001/copyparty/releases/tag/v1.8.2) (all serverlogs reviewed so far showed no signs of exploitation)
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
## new features
* #47 file selection by shift-clicking
* in list-view: click a table row to select it, then shift-click another to select all files in-between
* in grid-view: either enable the `multiselect` button (mainly for phones/tablets), or the new `sel` button in the `[⚙️] settings` tab (better for mouse+keyboard), then shift-click two files
* volflag `fat32` avoids a bug in android's sdcardfs causing excessive reindexing on startup if any files were modified on the sdcard since last reboot
## bugfixes
* minor corrections to the new features from #45
* uploader IPs are now visible for `a`dmin accounts in `d2t` volumes as well
## other changes
* the admin-panel is only accessible for accounts which have the `a` (admin) permission-level in one or more volumes; so instead of giving your user `rwmd` access, you'll want `rwmda` instead:
```bash
python3 copyparty-sfx.py -a joe:hunter2 -v /mnt/nas/pub:pub:rwmda,joe
```
or in a settings file,
```yaml
[/pub]
/mnt/nas/pub
accs:
rwmda: joe
```
* until now, `rw` was enough; however, most readwrite users don't need access to those features
* grabbing a stacktrace with `?stack` is permitted for both `rw` and `a`
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0714-1558 `v1.8.2` URGENT: fix path traversal vulnerability
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
Starting with the bad and important news; this release fixes https://github.com/9001/copyparty/security/advisories/GHSA-pxfv-7rr3-2qjg / [CVE-2023-37474](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-37474) -- so please upgrade!
Every version until now had a [path traversal vulnerability](https://owasp.org/www-community/attacks/Path_Traversal) which allowed read-access to any file on the server's filesystem. To summarize,
* Every file that the copyparty process had the OS-level permissions to read, could be retrieved over HTTP without password authentication
* However, an attacker would need to know the full (or copyparty-module-relative) path to the file; it was luckily impossible to list directory contents to discover files on the server
* You may have been running copyparty with some mitigations against this:
* [prisonparty](https://github.com/9001/copyparty/tree/hovudstraum/bin#prisonpartysh) limited the scope of access to files which were intentionally given to copyparty for sharing; meaning all volumes, as well as the following read-only filesystem locations: `/bin`, `/lib`, `/lib32`, `/lib64`, `/sbin`, `/usr`, `/etc/alternatives`
* the [nix package](https://github.com/9001/copyparty#nix-package) has a similar mitigation implemented using systemd concepts
* [docker containers](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) would only expose the files which were intentionally mounted into the container, so even better
* More conventional setups, such as just running the sfx (python or exe editions), would unfortunately expose all files readable by the current user
* The following configurations would have made the impact much worse:
* running copyparty as root
So, three years, and finally a CVE -- which has been there since day one... Not great huh. There is a list of all the copyparty alternatives that I know of in the `similar software` link above.
Thanks for flying copyparty! And especially if you decide to continue doing so :-)
## new features
* #43 volflags to specify thumbnailer behavior per-volume;
* `--th-no-crop` / volflag `nocrop` to specify whether autocrop should be disabled
* `--th-size` / volflag `thsize` to set a custom thumbnail resolution
* `--th-convt` / volflag `convt` to specify conversion timeout
* #45 resulted in a handful of opportunities to tighten security in intentionally-dangerous setups (public folders with anonymous uploads enabled):
* a new permission, `a` (in addition to the existing `rwmdgG`), to show the uploader-IP and upload-time for each file in the file listing
* accidentally incompatible with the `d2t` volflag (will be fixed in the next ver)
* volflag `nohtml` is a good defense against (un)intentional XSS; it returns HTML-files and markdown-files as plaintext instead of rendering them, meaning any malicious `<script>` won't run -- bad idea for regular use since it breaks fundamental functionality, but good when you really need it
* the README-previews below the file-listing still renders as usual, as this is fine thanks to the sandbox
* a new eventhook `--xban` to run a plugin when copyparty decides to ban someone (for password bruteforcing or excessive 404's), for example to blackhole the IP using fail2ban or similar
## bugfixes
* **fixes a path traversal vulnerability,** https://github.com/9001/copyparty/security/advisories/GHSA-pxfv-7rr3-2qjg / [CVE-2023-37474](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-37474)
* HUGE thanks to @TheHackyDog for reporting this !!
* if you use a reverse proxy, you can check if you have been exploited like so:
* nginx: grep your logs for URLs containing both `.cpr/` and `%2[^0]`, for example using the following command:
```bash
(gzip -dc access.log.*.gz; cat access.log) | sed -r 's/" [0-9]+ .*//' | grep -E 'cpr/.*%2[^0]' | grep -vF data:image/svg
```
* 77f1e5144455eb946db7368792ea11c934f0f6da fixes an extremely unlikely race-condition (see the commit for details)
* 8f59afb1593a75b8ce8c91ceee304097a07aea6e fixes another race-condition which is a bit worse:
* the unpost feature could collide with other database activity, with the worst-case outcome being aborted batch operations, for example a directory move or a batch-rename which stops halfways
----
# 💾 what to download?
| download link | is it good? | description |
| -- | -- | -- |
| **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** | ✅ the best 👍 | runs anywhere! only needs python |
| [a docker image](https://github.com/9001/copyparty/blob/hovudstraum/scripts/docker/README.md) | it's ok | good if you prefer docker 🐋 |
| [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) | ⚠️ [acceptable](https://github.com/9001/copyparty#copypartyexe) | for [win8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png) or later; built-in thumbnailer |
| [u2c.exe](https://github.com/9001/copyparty/releases/download/v1.7.1/u2c.exe) | ⚠️ acceptable | [CLI uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py) as a win7+ exe ([video](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm)) |
| [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) | ⛔️ [dangerous](https://github.com/9001/copyparty#copypartyexe) | for [win7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png) -- never expose to the internet! |
| [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.8.2/copyparty-winpe64.exe) | ⛔️ dangerous | runs on [64bit WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png), otherwise useless |
* except for [u2c.exe](https://github.com/9001/copyparty/releases/download/v1.7.1/u2c.exe), all of the options above are equivalent
* the zip and tar.gz files below are just source code
* python packages are available at [PyPI](https://pypi.org/project/copyparty/#files)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0707-2220 `v1.8.1` in case of 404
## new features
* [handlers](https://github.com/9001/copyparty/tree/hovudstraum/bin/handlers); change the behavior of 404 / 403 with plugins
* makes it possible to use copyparty as a [caching proxy](https://github.com/9001/copyparty/blob/hovudstraum/bin/handlers/caching-proxy.py)
* #42 add mpv + streamlink support to [very-bad-idea](https://github.com/9001/copyparty/tree/hovudstraum/bin/mtag#dangerous-plugins)
* add support for Pillow 10
* also improved text rendering in icons
* mention the [fedora package](https://github.com/9001/copyparty#fedora-package) in the readme
## bugfixes
* theme 6 (hacker) didn't show the state of some toggle-switches
* windows: keep quickedit enabled when hashing passwords interactively
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0626-0005 `v1.8.0` argon
News: if you use rclone as a copyparty webdav client, upgrading to [rclone v1.63](https://github.com/rclone/rclone/releases/tag/v1.63.0) (just released) will give you [a huge speed boost](https://github.com/rclone/rclone/pull/6897) for small files
## new features
* #39 hashed passwords
* instead of keeping plaintext account passwords in config files, you can now store hashed ones instead
* `--ah-alg` specifies algorithm; best to worst: `argon2`, `scrypt`, `sha2`, or the default `none`
* the default settings of each algorithm take `0.4 sec` to hash a password, and argon2 eats `256 MiB` RAM
* can be adjusted with optional comma-separated args after the algorithm name; see `--help-pwhash`
* `--ah-salt` is the [static salt](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#hashed-passwords) for all passwords, and is autogenerated-and-persisted if not specified
* `--ah-cli` switches copyparty into a shell where you can hash passwords interactively (see the sketch below this list)
* but copyparty will also autoconvert any unhashed passwords on startup and give you the values to insert into the config anyways
* #40 volume size limit
* volflag `vmaxb` specifies max size of a volume
* volflag `vmaxn` specifies max number of files in a volume
* example: `-v [...]:c,vmaxb=900g:c,vmaxn=20k` blocks uploads if the volume reaches 900 GiB or a total of 20480 files
* good alternative to `--df` since it works per-volume
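for example, a sketch of moving to argon2 (hash the password interactively first, then put the printed hash wherever the plaintext used to go, in `-a` or in the config file):
```bash
# interactive shell for hashing passwords
python3 copyparty-sfx.py --ah-alg argon2 --ah-cli

# then start the server with the same algorithm selected
python3 copyparty-sfx.py --ah-alg argon2 -a ed:<hash-from-above> -v /srv/pub:pub:rwmda,ed
```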
## bugfixes
* autogenerated TLS certs didn't include the mDNS name
## other changes
* improved cloudflare challenge detection
* markdown edits will now trigger upload hooks
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0611-0814 `v1.7.6` NO_COLOR
## new features
* #31 `--grid` shows thumbnails instead of file-list by default
* #28 `--unlist` regex-exclude files from browser listings
* for example `--unlist '\.(js|css)$'` hides all `.js` and `.css` files
* **purely cosmetic!** the files are still fully accessible, and still appear in API calls
* auto-generate TLS certificates on startup / network-change
* mostly good for LAN, requires [cfssl](https://github.com/cloudflare/cfssl/releases/latest), can be disabled with `--no-crt`
* creates a self-signed CA and certs with SANs of all detected server IPs
* so it's still recommended to use a reverse-proxy / letsencrypt for WAN servers
* the default `--fk-salt` is now much stronger
* all existing installations will keep the previously selected seed -- you can choose to upgrade by deleting `~/.config/copyparty/cert.pem` but this will change all filekeys / per-file passwords
* the `NO_COLOR` environment-variable is now supported, removing colors from stdout
* see https://no-color.org/ and more importantly https://youtu.be/biW5UVGkPMA?t=150
* `--ansi` and `--no-ansi` can also be used to force-enable/disable colored output
* #33 disable colors when stdout is redirected to a pipe/file -- by @clach04
* #32 simplify building sfx from source
* upgraded [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) to [python 3.11.4](https://pythoninsider.blogspot.com/2023/06/python-3114-31012-3917-3817-3717-and.html)
## bugfixes
* #30 `--ftps` didn't work without `--ftp`
* tiny css bug in light themes (opaque thumbnail controls)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0513-0000 `v1.7.2` hard resolve
## new features
* print a warning if `c:\`, `c:\windows*`, or all of `/` are shared
* upgraded the docker image to v3.18 which enables the [chiptune player](https://a.ocv.me/pub/demo/music/chiptunes/#af-f6fb2e5f)
* in config files, allow trailing `:` in section headers
## bugfixes
* when `--hardlink` (or the volflag) is set, resolve symlinks before hardlinking
* uploads could fail due to relative symlinks
* really minor ux fixes
* left-align `GET` in access logs
* the upload panel didn't always shrink back down after uploads completed
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0507-1834 `v1.7.1` CräzY;PWDs
## new features
* webdav:
* support write-only folders
* option `--dav-auth` / volflag `davauth` forces clients to always auth
* helps clients such as `davfs2` see all folders if the root is anon-readable but some subfolders are not
* alternatively you could configure your client to always send the password in the `PW` header (see the example below this list)
* include usernames in http request logs
* audio player:
* consumes less power on phones when the screen is off
* smoother playback cursor on short songs
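for example, a quick way to test the `PW`-header approach with curl (URL and password are placeholders):
```bash
curl -H 'PW: hunter2' https://example.com/private-folder/
```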
## bugfixes
* the characters `;` and `%` can now be used in passwords
* but non-ascii characters (such as the ä in the release title) can, in fact, not
* verify that all accounts have unique passwords on startup (#25)
## other changes
* ftpd: log incorrect passwords only, not correct ones
* `up2k.py` (the upload, folder-sync, and file-search client) has been renamed to [u2c.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy)
* `u2c` as in `up2k client`, or `up2k CLI`, or `upload-to-copyparty` -- good name
* now the only things named "up2k" are the web-ui and the server backend which is way less confusing
* upgrade packaging from [setup.py](https://github.com/9001/copyparty/blob/hovudstraum/setup.py) to [pyproject.toml](https://github.com/9001/copyparty/blob/hovudstraum/pyproject.toml)
* no practical consequences aside from a warm fuzzy feeling of being in the future
* the docker images ~~will be~~ got rebuilt 2023-05-11 ~~in a few days (when [alpine](https://alpinelinux.org/) 3.18 is released)~~ enabling [the chiptune player](https://a.ocv.me/pub/demo/music/chiptunes/#af-f6fb2e5f)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0429-2114 `v1.7.0` unlinked
don't get excited! nothing new and revolutionary, but `xvol` and `xdev` changed behavior so there's an above-average chance of fresh bugs
## new features
* (#24): `xvol` and `xdev`, previously just hints to the filesystem indexer, now actively block access as well:
* `xvol` stops users following symlinks leaving the volumes they have access to
* so if you symlink `/home/ed/music` into `/srv/www/music` it'll get blocked
* ...unless both folders are accessible through volumes, and the user has read-access to both
* `xdev` stops users crossing the filesystem boundary of the volumes they have access to
* so if you symlink another HDD into a volume it'll get blocked, but you can still symlink from other places on the same FS
* enabling these will add a slight performance hit; the unlikely worst-case is `14%` slower directory listings, `35%` slower download-as-tar (see the sketch after this list)
* file selection summary (num files, size, audio duration) in the bottom right
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py): more aggressive resolving with `--rh`
* [add a warning](https://github.com/9001/copyparty#fix-unreliable-playback-on-android) that the default powersave settings in android may stop playing music during album changes
* also appears [in the media player](https://user-images.githubusercontent.com/241032/235327191-7aaefff9-5d41-4e42-b71f-042a8247f29d.png) if the issue is detected at runtime (playback halts for 30sec while screen is off)
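for example, a sketch of enabling both guards on one volume (path and access level are placeholders):
```bash
python3 copyparty-sfx.py -v /srv/pub:pub:r:c,xvol:c,xdev
```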
## bugfixes
* (#23): stop autodeleting empty folders when moving or deleting files
* but files which expire / [self-destruct](https://github.com/9001/copyparty#self-destruct) still clean up parent directories like before
* ftp-server: some clients could fail to `mkdir` at first attempt (and also complain during rmdir)
## other changes
* new version of [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.7.0/copyparty-winpe64.exe) since the ftp-server fix might be relevant
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0426-2300 `v1.6.15` unexpected boost
## new features
* 30% faster folder listings due to [the very last thing](https://github.com/9001/copyparty/commit/55c74ad164633a0a64dceb51f7f534da0422cbb5) i'd ever expect to be a bottleneck, [thx perf](https://docs.python.org/3.12/howto/perf_profiling.html)
* option to see the lastmod timestamps of symlinks instead of the target files
* makes the turbo mode of [u2cli, the commandline uploader and folder-sync tool](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) more turbo since copyparty dedupes uploads by symlinking to an existing copy and the symlink is stamped with the deduped file's lastmod
* **webdav:** enabled by default (because rclone will want this), can be disabled with arg `--dav-rt` or volflag `davrt`
* **http:** disabled by default, can be enabled per-request with urlparam `lt`
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py): option `--rh` to resolve server hostname only once at start of upload
* fantastic for buggy networks, but it'll break TLS
## bugfixes
* new arg `--s-tbody` specifies the network timeout before a dead connection gets dropped (default 3min)
* before there was no timeout at all, which could hang uploads or possibly consume all server resources
* ...but this is only relevant if your copyparty is directly exposed to the internet with no reverse proxy
* with nginx/caddy/etc you can disable the timeout with `--s-tbody 0` for a 3% performance boost (*wow!*)
* iPhone audio transcoder could turn bad and stop transcoding
* ~~maybe android phones no longer pause playback at the end of an album~~
* nope, that was due to [android's powersaver](https://github.com/9001/copyparty#fix-unreliable-playback-on-android), oh well
* ***bonus unintended feature:*** navigate into other folders while a song is playing
* [installing from the source tarball](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#build-from-release-tarball) should be ok now
* good base for making distro packages probably
## other changes
* since the network timeout fix is relevant for the single usecase that [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.15/copyparty-winpe64.exe) covers, there is now a new version of that
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0424-0609 `v1.6.14` unsettable flags
## new features
* unset a volflag (override a global option) by negating it (setting volflag `-flagname`) -- see the sketch after this list
* new argument `--cert` to specify TLS certificate location
* defaults to `~/.config/copyparty/cert.pem` like before
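for example, a sketch of negating a volflag, using the database flag `e2d` as the illustration (paths are placeholders):
```bash
# enable the database globally, but keep one volume out of the index
python3 copyparty-sfx.py -e2d -v /srv/pub:pub:r -v /srv/tmp:tmp:r:c,-e2d
```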
## bugfixes
* in zip/tar downloads, always use the parent-folder name as the archive root
* more reliable ftp authentication when providing password as username
* connect-page: fix rclone ftps example
## other changes
* stop suggesting `--http-only` and `--https-only` for performance since the difference is negligible
* mention how some antivirus (avast, avg, mcafee) thinks that pillow's webp encoder is a virus, affecting `copyparty.exe`
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0420-2141 `v1.6.12` as seen on nixos
## new features
* @chinponya [made](https://github.com/9001/copyparty/pull/22) a copyparty [Nix package](https://github.com/9001/copyparty#nix-package) and a [NixOS module](https://github.com/9001/copyparty#nixos-module)! nice 🎉
* with [systemd-based hardening](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nixos/modules/copyparty.nix#L230-L270) instead of [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh)
* complements the [arch package](https://github.com/9001/copyparty/tree/hovudstraum/contrib/package/arch) very well w
## bugfixes
* fix an sqlite fd leak
* with enough simultaneous traffic, copyparty could run out of file descriptors since it relied on the gc to close sqlite cursors
* now there's a pool of cursors shared between the tcp connections instead, limited to the number of CPU cores
* performance mostly unaffected (or slightly improved) compared to before, except for a 20% reduction only during max server load caused by directory-listings or searches
* ~~somehow explicitly closing the cursors didn't always work... maybe this was actually a python bug :\\/~~
* yes, it does incomplete cleanup if opening a WAL database fails
* multirange requests would fail with an error; now they get a 200 as expected (since they're kinda useless and not worth the overhead)
* [the only software i've ever seen do that](https://apps.kde.org/discover/) now works as intended
* expand `~/` filesystem paths in all remaining args: `-c`, `-lo`, `--hist`, `--ssl-log`, and the `hist` volflag
* never use IPv6-format IPv4 (`::ffff:127.0.0.1`) in responses
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py): don't enter delete stage if some of the uploads failed
* audio player in safari on touchbar macbooks
* songs would play backwards because the touchbar keeps spamming play/pause
* playback would stop when the preloader kicks in because safari sees the new audio object and freaks out
## other changes
* added [windows quickstart / service example](https://github.com/9001/copyparty/blob/hovudstraum/docs/examples/windows.md)
* updated pyinstaller (it makes smaller exe files now)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0401-2112 `v1.6.11` not joke
## new features
* new event-hook: [exif stripper](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/image-noexif.py)
* [markdown thumbnails](https://a.ocv.me/pub/demo/pics-vids/README.md?v) -- see [readme](https://github.com/9001/copyparty#markdown-viewer)
* soon: support for [web-scrobbler](https://github.com/web-scrobbler/web-scrobbler/) - the [Last.fm](https://www.last.fm/user/tripflag) browser extension
* will update here + readme with more info when [the v3](https://github.com/web-scrobbler/web-scrobbler/projects/5) is out
## bugfixes
* more sqlite query-planner twiddling
* deleting files is MUCH faster now, and uploads / bootup might be a bit better too
* webdav optimizations / compliance
* should make some webdav clients run faster than before
* in very related news, the webdav-client in [rclone](https://github.com/rclone/rclone/) v1.63 ([currently beta](https://beta.rclone.org/?filter=latest)) will be ***FAST!***
* does cool stuff such as [bidirectional sync](https://github.com/9001/copyparty#folder-sync) between copyparty and a local folder
* [bpm detector](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/audio-bpm.py) is a bit more accurate
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) / commandline uploader: better error messages if something goes wrong
* readme rendering could fail in firefox if certain addons were installed (not sure which)
* event-hooks: more accurate usage examples
## other changes
* @chinponya automated the prismjs build step (thx!)
* updated some js deps (markedjs, codemirror)
* copyparty.exe: updated Pillow to 9.5.0
* and finally [the joke](https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/rave.js) (looks [like this](https://cd.ocv.me/b/d2/d21/#af-9b927c42))
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0320-2156 `v1.6.10` rclone sync
## new features
* [iPhone "app"](https://github.com/9001/copyparty#ios-shortcuts) (upload shortcut) -- thanks @Daedren !
* can strip exif, upload files, pics, vids, links, clipboard
* can download links and rehost the target file on your server
* support `rclone sync` to [sync folders](https://github.com/9001/copyparty#folder-sync) to/from copyparty
* let webdav clients set lastmodified times during upload
* let webdav clients replace files during upload
## bugfixes
* [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh): FFmpeg transcoding was slow because there was no `/dev/urandom`
* iphones would fail to play *some* songs (low-bitrate and/or shorter than ~7 seconds)
* due to either an iOS bug or an FFmpeg bug in the caf remuxing idk
* fixed by mixing white noise into songs if an iPhone asks for them
* small correction in the docker readme regarding rootless podman
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0316-2106 `v1.6.9` index.html
## new features
* option to show `index.html` instead of the folder listing
* arg `--ih` makes it default-enabled
* clients can enable/disable it in the `[⚙️]` settings tab
* url-param `?v` skips it for a particular folder
* faster folder-thumbnail validation on startup (mostly on conventional HDDs)
## bugfixes
* "load more" button didn't always show up when search results got truncated
* ux: tooltips could block buttons on android
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0312-1610 `v1.6.8` folder thumbs
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
## new features
* folder thumbnails are indexed in the db
* now supports non-lowercase names (`Cover.jpg`, `Folder.JPG`)
* folders without a specific cover/folder image will show the first pic inside
* when audio playback continues into an empty folder, keep trying for a bit
* add no-index hints (google etc) in basic-browser HTML (`?b`, `?b=u`)
* [commandline uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) supports long filenames on win7
## bugfixes
* rotated logfiles didn't get xz compressed
* image-gallery links pointing to a deleted image now show an error instead of a crashpage
## other changes
* folder thumbnails have purple text to differentiate from files
* `copyparty32.exe` starts 30% faster (but is 6% larger)
----
# what to download?
| download link | is it good? | description |
| -- | -- | -- |
| **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** | ✅ the best 👍 | runs anywhere! only needs python |
| [a docker image](https://github.com/9001/copyparty/blob/hovudstraum/scripts/docker/README.md) | it's ok | good if you prefer docker 🐋 |
| [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) | ⚠️ [acceptable](https://github.com/9001/copyparty#copypartyexe) | for [win8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png) or later; built-in thumbnailer |
| [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe) | ⚠️ acceptable | [CLI uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) as a win7+ exe ([video](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm)) |
| [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) | ⛔️ [dangerous](https://github.com/9001/copyparty#copypartyexe) | for [win7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png) -- never expose to the internet! |
| [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) | ⛔️ dangerous | runs on [64bit WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png), otherwise useless |
* except for [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe), all of the options above are equivalent
* the zip and tar.gz files below are just source code
* python packages are available at [PyPI](https://pypi.org/project/copyparty/#files)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0305-2018 `v1.6.7` fix no-dedup + add up2k.exe

@@ -4,7 +4,9 @@
* [future plans](#future-plans) - some improvement ideas
* [design](#design)
* [up2k](#up2k) - quick outline of the up2k protocol
* [why not tus](#why-not-tus) - I didn't know about [tus](https://tus.io/)
* [why chunk-hashes](#why-chunk-hashes) - a single sha512 would be better, right?
* [hashed passwords](#hashed-passwords) - regarding the curious decisions
* [http api](#http-api)
* [read](#read)
* [write](#write)
@@ -16,6 +18,7 @@
* [building](#building)
* [dev env setup](#dev-env-setup)
* [just the sfx](#just-the-sfx)
* [build from release tarball](#build-from-release-tarball) - uses the included prebuilt webdeps
* [complete release](#complete-release)
* [todo](#todo) - roughly sorted by priority
* [discarded ideas](#discarded-ideas)
@@ -66,7 +69,14 @@ regarding the frequent server log message during uploads;
* on this http connection, `2.77 GiB` transferred, `102.9 MiB/s` average, `948` chunks handled
* client says `4` uploads OK, `0` failed, `3` busy, `1` queued, `10042 MiB` total size, `7198 MiB` and `00:01:09` left
## why chunk-hashes
### why not tus
I didn't know about [tus](https://tus.io/) when I made this, but:
* up2k has the advantage that it supports parallel uploading of non-contiguous chunks straight into the final file -- [tus does a merge at the end](https://tus.io/protocols/resumable-upload.html#concatenation) which is slow and taxing on the server HDD / filesystem (unless i'm misunderstanding); see the sketch after this list
* up2k has the slight disadvantage of requiring the client to hash the entire file before an upload can begin, but this has the benefit of immediately skipping duplicate files
* and the hashing happens in a separate thread anyways so it's usually not a bottleneck
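to make the first point concrete, a rough sketch of writing non-contiguous chunks straight into the final file -- illustrative only, not up2k's actual implementation, and the chunk size is made up:
```python
# illustrative sketch: chunks can arrive in any order and are written at their final
# offsets in a preallocated file, so no merge step is needed afterwards
# (not up2k's actual code; chunk size and filenames are made up)
CHUNK = 1024 * 1024  # assumed chunk size for this example

def prealloc(path: str, total_size: int) -> None:
    with open(path, "wb") as f:
        f.truncate(total_size)  # sparse file, fills in as chunks arrive

def write_chunk(path: str, idx: int, data: bytes) -> None:
    with open(path, "r+b") as f:  # r+b keeps existing contents
        f.seek(idx * CHUNK)       # jump to the chunk's final position
        f.write(data)

prealloc("upload.bin", 3 * CHUNK)
write_chunk("upload.bin", 2, b"\x02" * CHUNK)  # the last chunk may arrive first
write_chunk("upload.bin", 0, b"\x00" * CHUNK)
write_chunk("upload.bin", 1, b"\x01" * CHUNK)
```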
### why chunk-hashes
a single sha512 would be better, right?
@@ -76,13 +86,22 @@ as a result, the hashes are much less useful than they could have been (search t
however it allows for hashing multiple chunks in parallel, greatly increasing upload speed from fast storage (NVMe, raid-0 and such)
* both the [browser uploader](https://github.com/9001/copyparty#uploading) and the [commandline one](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy) do this now, allowing for fast uploading even from plaintext http (a rough sketch of the idea follows below)
hashwasm would solve the streaming issue but reduces hashing speed for sha512 (xxh128 does 6 GiB/s), and it would make old browsers and [iphones](https://bugs.webkit.org/show_bug.cgi?id=228552) unsupported
* blake2 might be a better choice since xxh is non-cryptographic, but that gets ~15 MiB/s on slower androids
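as a rough illustration of the parallel chunk-hashing idea (not the actual uploader code; chunk size and worker count are made up here):
```python
# rough sketch of hashing fixed-size chunks of a file in parallel threads
# (not the actual uploader code; chunk size and worker count are made up)
import hashlib
import os
import sys
from concurrent.futures import ThreadPoolExecutor

CHUNK = 8 * 1024 * 1024  # assumed chunk size

def hash_chunk(path: str, idx: int) -> str:
    with open(path, "rb") as f:
        f.seek(idx * CHUNK)
        return hashlib.sha512(f.read(CHUNK)).hexdigest()

def hash_file(path: str) -> list:
    nchunks = (os.path.getsize(path) + CHUNK - 1) // CHUNK
    # hashlib releases the GIL for large buffers, so threads give a real speedup
    with ThreadPoolExecutor(max_workers=4) as ex:
        return list(ex.map(lambda i: hash_chunk(path, i), range(nchunks)))

if __name__ == "__main__":
    print(hash_file(sys.argv[1]))
```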
# hashed passwords
regarding the curious decisions
there is a static salt for all passwords;
* because most copyparty APIs allow users to authenticate using only their password, the username is unknown at that point, so per-account salts are impossible
* the drawback is that an attacker can bruteforce all accounts in parallel; however, most copyparty instances only have a handful of accounts in the first place, and this can be compensated for by increasing the hashing cost (illustrated below)
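to illustrate the tradeoff, a generic sketch of password hashing with a static salt -- not copyparty's actual scheme, algorithm choice, or parameters:
```python
# generic illustration of password hashing with one static salt for every account
# (NOT copyparty's actual scheme, algorithm choice, or parameters)
import hashlib
import hmac

STATIC_SALT = b"example-static-salt"  # shared by all accounts (made-up value)
ITERATIONS = 600_000                  # crank this up to compensate for the shared salt

def hash_password(password: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha512", password.encode(), STATIC_SALT, ITERATIONS)

def check_password(password: str, stored: bytes) -> bool:
    return hmac.compare_digest(hash_password(password), stored)

# since clients may authenticate with just a password, the server can only
# look up accounts by the resulting hash -- hence no per-account salt
stored = hash_password("hunter2")
print(check_password("hunter2", stored))  # True
```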
# http api
* table-column `params` = URL parameters; `?foo=bar&qux=...`
@@ -102,11 +121,18 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| GET | `?ls&dots` | list files/folders at URL as JSON, including dotfiles |
| GET | `?ls=t` | list files/folders at URL as plaintext |
| GET | `?ls=v` | list files/folders at URL, terminal-formatted |
| GET | `?lt` | in listings, use symlink timestamps rather than targets |
| GET | `?b` | list files/folders at URL as simplified HTML |
| GET | `?tree=.` | list one level of subdirectories inside URL |
| GET | `?tree` | list one level of subdirectories for each level until URL |
| GET | `?tar` | download everything below URL as a gnu-tar file |
| GET | `?tar=gz:9` | ...as a gzip-level-9 gnu-tar file |
| GET | `?tar=xz:9` | ...as an xz-level-9 gnu-tar file |
| GET | `?tar=pax` | ...as a pax-tar file |
| GET | `?tar=pax,xz` | ...as an xz-level-1 pax-tar file |
| GET | `?zip=utf-8` | ...as a zip file |
| GET | `?zip` | ...as a WinXP-compatible zip file |
| GET | `?zip=crc` | ...as an MSDOS-compatible zip file |
| GET | `?ups` | show recent uploads from your IP |
| GET | `?ups&filter=f` | ...where URL contains `f` |
| GET | `?mime=foo` | specify return mimetype `foo` |
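for reference, a minimal sketch of calling a couple of the endpoints above from python; the server URL, folder, filename, and password are placeholders:
```python
# minimal sketch of calling two of the endpoints above with urllib
# (server URL, folder, filename and password are placeholders)
import json
import urllib.request

BASE = "http://127.0.0.1:3923/music/"  # assumed copyparty address + folder
PW = "hunter2"                         # placeholder password

# list files/folders as JSON, authenticating with the pw url-param
with urllib.request.urlopen(BASE + "?ls&pw=" + PW) as r:
    print(json.load(r))  # just dumps the structure; field names not assumed here

# grab the whole folder as a gzip-level-9 gnu-tar, authenticating with the cookie header
req = urllib.request.Request(BASE + "?tar=gz:9", headers={"Cookie": "cppwd=" + PW})
with urllib.request.urlopen(req) as r, open("music.tar.gz", "wb") as f:
    f.write(r.read())
```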
@@ -218,39 +244,55 @@ pip install mutagen # audio metadata
pip install pyftpdlib # ftp server
pip install impacket # smb server -- disable Windows Defender if you REALLY need this on windows
pip install Pillow pyheif-pillow-opener pillow-avif-plugin # thumbnails
pip install pyvips # faster thumbnails
pip install psutil # better cleanup of stuck metadata parsers on windows
pip install black==21.12b0 click==8.0.2 bandit pylint flake8 isort mypy # vscode tooling
```
## just the sfx
if you just want to modify the copyparty source code (py/html/css/js) then this is the easiest approach
first grab the web-dependencies from a previous sfx (assuming you don't need to modify something in those):
```sh
rm -rf copyparty/web/deps
curl -L https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py >x.py
python3 x.py --version
rm x.py
cp -R /tmp/pe-copyparty.$(id -u)/copyparty/web/deps copyparty/web/
```
or you could build the web-dependencies from source instead (NB: does not include prismjs, need to grab that manually):
```sh
make -C scripts/deps-docker
```
then build the sfx using any of the following examples:
```sh
./scripts/make-sfx.sh # regular edition
./scripts/make-sfx.sh fast # build faster (worse js/css compression)
./scripts/make-sfx.sh gz no-cm # gzip-compressed + no fancy markdown editor
```
## build from release tarball
uses the included prebuilt webdeps
if you downloaded a [release](https://github.com/9001/copyparty/releases) source tarball from github (for example [copyparty-1.6.15.tar.gz](https://github.com/9001/copyparty/releases/download/v1.6.15/copyparty-1.6.15.tar.gz), not the autogenerated one) you can build it like so,
```bash
python3 -m pip install --user -U build setuptools wheel jinja2 strip_hints
bash scripts/run-tests.sh python3 # optional
python3 -m build
```
if you are unable to use `build`, you can use the old setuptools approach instead,
```bash
python3 -m pip install --user setuptools wheel jinja2
python3 setup.py build
# you now have a wheel which you can install. or extract and repackage:
python3 setup.py install --skip-build --prefix=/usr --root=$HOME/pe/copyparty
```
## complete release
also builds the sfx so skip the sfx section above
*WARNING: `rls.sh` has not yet been updated with the docker-images and arch/nix packaging*
does everything completely from scratch, straight from your local repo
in the `scripts` folder:
* run `make -C deps-docker` to build all dependencies

docs/examples/README.md (new file)

@@ -0,0 +1,4 @@
copyparty server config examples
[windows.md](windows.md) -- running copyparty as a service on windows

Some files were not shown because too many files have changed in this diff.