Compare commits

...

109 Commits

Author SHA1 Message Date
ed
f00b939402 v1.13.3 2024-06-01 23:24:35 +00:00
ed
bef9617638 u2c.exe: explain that https is disabled 2024-06-01 22:26:47 +00:00
ed
692175f5b0 md-editor autoindent was duplicating hr markers
only keep characters `>+-*` if there are fewer than three of them,
and discard entire prefix if there's more

markdown spec only cares about exactly-one or three-or-more, but
let's keep pairs in case anyone uses that as unconventional markup
2024-06-01 20:56:15 +00:00
ed
5ad65450c4 more intuitive df option/volflag, closes #83 2024-06-01 01:15:34 +00:00
ed
60c96f990a ux: hide video ui + floor seekbar text
* hide lightbox buttons when a video is playing

* move audio seekbar text to the bottom, so it
   hides less of the waveform and minute-markers
2024-06-01 00:35:44 +00:00
ed
07b2bf1104 better support for 700+ connections
when there were more than ~700 active connections,
* sendfile (non-https downloads) could fail
* mdns and ssdp could fail to reinitialize on network changes

...because `select` can't handle FDs higher than 512 on windows
(1024 on linux/macos), so prefer `poll` where possible (linux/macos)

but apple keeps breaking and unbreaking `poll` in macos,
so use `--no-poll` if necessary to force `select` instead
2024-05-31 23:31:32 +00:00
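A minimal sketch of the preference described above (not copyparty's actual networking code): use `select.poll` where it exists and fall back to `select.select`, since `select()` stops working once file descriptors climb past ~512 on windows / ~1024 on linux and macos, while `poll()` has no such ceiling; forcing the fallback path is what `--no-poll` would correspond to.

```python
import select

def wait_readable(socks, timeout=1.0, force_select=False):
    # hypothetical helper: return the sockets that have data waiting
    by_fd = {s.fileno(): s for s in socks}
    if hasattr(select, "poll") and not force_select:
        # poll() is not limited by FD_SETSIZE, so it survives 700+ connections
        po = select.poll()
        for fd in by_fd:
            po.register(fd, select.POLLIN)
        return [by_fd[fd] for fd, _ in po.poll(int(timeout * 1000))]
    # select() fallback (windows, or when poll is forced off); breaks on high FD numbers
    ready, _, _ = select.select(list(by_fd.values()), [], [], timeout)
    return ready
```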
ed
ac1bc232a9 black 2024-05-31 08:57:33 +00:00
ed
5919607ad0 sanitize fs-paths in archive error summary
also gets rid of a dumb debug print i forgot
2024-05-30 23:55:37 +00:00
ed
07ea629ca5 keep most tags during audio transcode
metadata is no longer discarded when transcoding to opus or mp3;
this was a good idea back when the transcodes were only used by
the webplayer, but now that folders can be batch-downloaded with
on-the-fly transcoding, it makes sense to keep most of the tags

individual tags are discarded if their value exceeds 1023 characters

this should mainly affect the following:
* traktor beatmaps, size usually somewhere around 100 KiB
* non-standard cover-art embeddings, size around 250 KiB
* XMP (project data from adobe premiere), around 48 KiB
2024-05-30 23:46:56 +00:00
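For illustration only (the flags and bitrate below are an assumption, not the project's exact ffmpeg invocation): keeping the source tags while transcoding to opus boils down to mapping the input metadata and dropping the cover-art/video stream; the size-based tag filtering mentioned above would happen separately.

```python
import subprocess

def transcode_keep_tags(src_path, dst_path):
    # -map_metadata 0 copies the input's tags into the output,
    # -vn drops embedded cover art / video streams
    subprocess.run([
        "ffmpeg", "-nostdin", "-v", "error",
        "-i", src_path,
        "-map_metadata", "0",
        "-vn",
        "-c:a", "libopus", "-b:a", "128k",
        dst_path,
    ], check=True)
```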
ed
b629d18df6 print helpful warning if unix env is inhospitable
thx kipu you're the best
2024-05-11 18:34:41 +00:00
ed
566cbb6507 update pkgs to 1.13.2 2024-05-10 15:04:33 +00:00
ed
400d700845 v1.13.2 2024-05-10 14:31:50 +00:00
ed
82ce6862ee option to use pngquant for smaller waveform PNGs 2024-05-10 13:06:02 +00:00
ed
38e4fdfe03 batch-convert audio waveforms with ?tar&p 2024-05-10 12:55:35 +00:00
ed
c04662798d play compressed s3xmodit chiptunes
adds support for playing gz, xz, and zip-compressed tracker files

using the de-facto naming convention for compressed modules;

* mod: mdz, mdgz, mdxz
* s3m: s3z, s3gz, s3xz
* xm: xmz, xmgz, xmxz
* it: itz, itgz, itxz
2024-05-10 12:45:17 +00:00
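One plausible reading of that naming convention, as a lookup table plus a hypothetical helper (taking `*z` / `*gz` / `*xz` to mean zip / gzip / xz respectively):

```python
# compressed-module extension -> (tracker format, compression)
S3XMODIT_Z = {
    "mdz": ("mod", "zip"), "mdgz": ("mod", "gz"), "mdxz": ("mod", "xz"),
    "s3z": ("s3m", "zip"), "s3gz": ("s3m", "gz"), "s3xz": ("s3m", "xz"),
    "xmz": ("xm", "zip"),  "xmgz": ("xm", "gz"),  "xmxz": ("xm", "xz"),
    "itz": ("it", "zip"),  "itgz": ("it", "gz"),  "itxz": ("it", "xz"),
}

def sniff_compressed_module(fname):
    # returns (format, compression) or None if not a compressed tracker file
    ext = fname.rsplit(".", 1)[-1].lower()
    return S3XMODIT_Z.get(ext)
```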
ed
19d156ff4e option to add custom UI translations 2024-05-09 23:09:45 +00:00
ed
87c60a1ec9 ensure OS signals hit main-thread as intended;
use sigmasks to block SIGINT, SIGTERM, SIGUSR1 from all other threads

also initiate shutdown by calling sighandler directly,
in case this misses anything and that is still unreliable
(discovered by `--exit=idx` being noop once in a blue moon)
2024-05-09 22:28:16 +00:00
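A minimal sketch of the sigmask trick (POSIX-only, hypothetical helper names): block the signals in every worker thread so the kernel can only deliver them to the main thread, which owns the one real signal handler; the u2c.py hunk further down wraps the same call in try/except because `pthread_sigmask` does not exist on windows.

```python
import signal
import threading

BLOCKED = {signal.SIGINT, signal.SIGTERM, signal.SIGUSR1}

def _worker(fn, args):
    # signals blocked here get routed to a thread that still accepts them,
    # i.e. the main thread
    signal.pthread_sigmask(signal.SIG_BLOCK, BLOCKED)
    fn(*args)

def spawn(fn, *args):
    t = threading.Thread(target=_worker, args=(fn, args), daemon=True)
    t.start()
    return t
```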
ed
2c92dab165 fix small annoyances,
* mute exception on early shutdown
* sfx: give the utime thread a name
2024-05-09 14:17:53 +00:00
ed
5c1e23907d og: append full original filename as url suffix 2024-05-09 13:18:15 +00:00
ed
925c7f0a57 in gridview, assume .ts files are video, not typescript 2024-05-08 22:20:29 +00:00
ed
feed08deb2 doc: export --help to html and link it 2024-05-08 22:01:58 +00:00
ed
560d7b6672 option to add or change mimetype mappings 2024-05-08 21:12:14 +00:00
ed
565daee98b fix mimetype detection for uppercase file extensions 2024-05-08 20:08:11 +00:00
ed
e396c5c2b5 only drop index caches if necessary;
prevents having to rebuild covers due to unrelated changes
2024-05-08 20:03:51 +00:00
ed
1ee2cdd089 update pkgs to 1.13.1 2024-05-06 01:11:01 +00:00
ed
beacedab50 v1.13.1 2024-05-06 00:29:15 +00:00
ed
25139a4358 qr-code: better fallback ip when no default-route 2024-05-05 23:36:05 +00:00
ed
f8491970fd remember url-hash during login from 403 2024-05-05 22:37:41 +00:00
ed
da091aec85 "volume" is too overloaded, make it --au-vol instead 2024-05-05 21:27:07 +00:00
ed
e9eb5affcd add option to set default audio/video volume 2024-05-05 19:10:29 +00:00
ed
c1918bc36c expand tcolor early to avoid listing in volume props 2024-05-05 18:52:02 +00:00
ed
fdda567f50 ux: add "this folder is empty" banner 2024-05-05 18:44:36 +00:00
ed
603d0ed72b misc: messages, docs, ie4 / win311 support
* docker: improve config-not-found warning message
* readme: mention markdown variable expansion
* basic-browser: use zip=crc to support ie4 / win-3.11
2024-05-05 17:32:50 +00:00
ed
b15a4ef79f failed attempt at making images load on android-discord 2024-05-05 14:16:22 +00:00
ed
48a6789d36 use --og-title as fallback if template gives blank result 2024-05-05 11:25:52 +00:00
ed
36f2c446af opengraph stuff:
* template-based title formatting
* picture embeds are no longer ant-sized
* `--og-color` sets accent color; default #333
* `--og-s-title` forces default title, ignoring e2t
* add a music indicator to song titles because discord doesn't
2024-05-03 00:11:40 +00:00
ed
69517e4624 add general-purpose query-string parcelling;
currently only being used to workaround discord discarding
query strings in opengraph tags, but i'm sure there will be
plenty more wonderful usecases for this atrocity
2024-05-02 22:49:27 +00:00
ed
ea270ab9f2 add og / opengraph / discord embeds 2024-05-01 23:40:56 +00:00
ed
b6cf2d3089 --html-head can take a filepath and/or jinja2 2024-05-01 20:24:18 +00:00
ed
e8db3dd37f fix tests on windows 2024-04-25 22:25:38 +00:00
ed
27485a4cb1 add pyz builder 2024-04-24 23:45:01 +00:00
ed
253a414443 better ctrl-v upload ux 2024-04-24 23:49:34 +02:00
ed
f6e693f0f5 reevaluate support for sparse files periodically
if a given filesystem were to disappear (e.g. removable storage)
followed by another filesystem appearing at the same location,
this would not get noticed by up2k in a timely manner

fix this by discarding the mtab cache after `--mtab-age` seconds and
rebuild it from scratch, unless the previous values are definitely
correct (as indicated by identical output from `/bin/mount`)

probably reduces windows performance by an acceptable amount
2024-04-24 21:18:26 +00:00
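A sketch of that cache policy under assumed names (`MTAB_AGE` standing in for `--mtab-age`): after the cache expires, re-run `mount` and only re-parse when its output has actually changed.

```python
import subprocess
import time

MTAB_AGE = 60  # seconds; stand-in for --mtab-age
_cache = {"t": 0.0, "raw": b"", "parsed": None}

def get_mtab(parse):
    now = time.time()
    if _cache["parsed"] is not None and now - _cache["t"] < MTAB_AGE:
        return _cache["parsed"]          # still considered fresh
    raw = subprocess.check_output(["mount"])
    if raw != _cache["raw"] or _cache["parsed"] is None:
        _cache["parsed"] = parse(raw)    # mounts changed; rebuild from scratch
    _cache["raw"], _cache["t"] = raw, now
    return _cache["parsed"]
```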
ed
c5f7cfc355 upload files/images with CTRL-V (from explorer etc.) 2024-04-23 19:46:54 +00:00
ed
bc2c1e427a config-reset forgot the dots cookie 2024-04-23 19:39:43 +00:00
ed
95d9e693c6 d2d should disable search/unpost even if db exists 2024-04-22 18:55:13 +00:00
ed
70a3cf36d1 pipe: only flush FDs when necessary
should give higher performance on servers with slow storage
2024-04-21 23:53:04 +00:00
ed
aa45fccf11 update pkgs to 1.13.0 2024-04-20 22:48:16 +00:00
ed
42d00050c1 v1.13.0 2024-04-20 22:32:50 +00:00
ed
4bb0e6e75a pipe: windows: make it safe with aggressive flushing 2024-04-20 22:15:08 +00:00
ed
2f7f9de3f5 pipe: optimize (1 GiB/s @ ryzen5-4500U) 2024-04-20 20:13:31 +00:00
ed
f31ac90932 less confusing help-text for --re-dhash 2024-04-20 16:42:56 +00:00
ed
439cb7f85b u2c: add --ow (previously part of --dr) 2024-04-20 16:36:10 +00:00
ed
af193ee834 keep up2k state integrity on abort 2024-04-20 16:13:32 +00:00
ed
c06126cc9d pipe: add volflag to disable 2024-04-19 23:54:23 +00:00
ed
897ffbbbd0 pipe: add to docs 2024-04-19 00:02:28 +00:00
ed
8244d3b4fc pipe: add tapering to keep tcp alive 2024-04-18 23:10:37 +00:00
ed
74266af6d1 pipe: warn when trying to download a .PARTIAL
and fix file sorting indicators on firefox
2024-04-18 23:10:11 +00:00
ed
8c552f1ad1 windows: fix upload-abort 2024-04-18 23:08:05 +00:00
ed
bf5850785f add opt-out from storing uploader IPs 2024-04-18 17:16:00 +00:00
ed
feecb3e0b8 up2k: fix put-hasher dying + a harmless race
* hasher thread could die if a client rapidly
   uploads and deletes files (so very unlikely)

* two unprotected calls to register_vpath which was
   almost-definitely safe because the volumes
   already existed in the registry
2024-04-18 16:43:38 +00:00
ed
08d8c82167 PoC: ongoing uploads can be downloaded in lockstep 2024-04-18 00:10:54 +00:00
ed
5239e7ac0c separate registry mutex for faster access
also fix a harmless toctou in handle_json where clients
could get stuck hanging for a bit longer than necessary
2024-04-18 00:07:56 +00:00
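Not the actual up2k code, just the pattern the commit describes: a dedicated mutex for the registry so cheap lookups don't queue behind the heavier state lock, with the check-then-insert done under that one lock to close the toctou window.

```python
import threading

class Registry:
    def __init__(self):
        self.mtx = threading.Lock()  # registry-only mutex; rarely contended
        self.jobs = {}

    def claim(self, key, job):
        # check and insert atomically, so two simultaneous handshakes
        # for the same upload cannot both think they won
        with self.mtx:
            cur = self.jobs.get(key)
            if cur is None:
                self.jobs[key] = job
                return job
            return cur
```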
ed
9937c2e755 add ArozOS to comparison 2024-04-16 21:00:47 +00:00
ed
f1e947f37d rehost deps from a flaky server 2024-04-12 21:49:01 +00:00
ed
a70a49b9c9 update pkgs to 1.12.2 2024-04-12 21:25:21 +00:00
ed
fe700dcf1a v1.12.2 2024-04-12 21:10:02 +00:00
ed
c8e3ed3aae retry failed renames on windows
theoretical issue which nobody has run into yet,
probably because nobody uses this on windows
2024-04-12 20:38:30 +00:00
ed
b8733653a3 fix audio transcoding with filekeys 2024-04-11 21:54:15 +00:00
ed
b772a4f8bb fix wordwrap of buttons on ios 2024-04-11 21:31:40 +00:00
ed
9e5253ef87 ie11: restore load-bearing thing 2024-04-11 20:53:15 +00:00
ed
7b94e4edf3 configurable basic-auth preference;
adds options `--bauth-last` to lower the preference for
taking the basic-auth password in case of conflict,
and `--no-bauth` to entirely disable basic-authentication

if a client is providing multiple passwords, for example when
"logged in" with one password (the `cppwd` cookie) and switching
to another account by also sending a PW header/url-param, then
the default evaluation order to determine which password to use is:

url-param `pw`, header `pw`, basic-auth header, cookie (cppwd/cppws)

so if a client supplies a basic-auth header, it will ignore the cookie
and use the basic-auth password instead, which usually makes sense

but this can become a problem if you have other webservers running
on the same domain which also support basic-authentication

--bauth-last is a good choice for cooperating with such services, as
--no-bauth currently breaks support for the android app...
2024-04-11 20:15:49 +00:00
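The evaluation order above, as a hedged sketch (argument names are made up): `--no-bauth` removes the basic-auth candidate entirely, `--bauth-last` demotes it below the cookie.

```python
def pick_password(url_pw, hdr_pw, bauth_pw, cookie_pw,
                  bauth_last=False, no_bauth=False):
    # default: url-param pw, header pw, basic-auth header, cookie (cppwd/cppws)
    order = [url_pw, hdr_pw, bauth_pw, cookie_pw]
    if no_bauth:
        order = [url_pw, hdr_pw, cookie_pw]
    elif bauth_last:
        order = [url_pw, hdr_pw, cookie_pw, bauth_pw]
    for pw in order:
        if pw:
            return pw
    return None
```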
ed
da26ec36ca add password placeholder on login page
it was easy to assume you were supposed to put a username there
2024-04-11 19:31:02 +00:00
ed
443acf2f8b update nuitka notes 2024-04-10 22:04:43 +00:00
ed
6c90e3893d update pkgs to 1.12.1 2024-04-09 23:53:43 +00:00
ed
ea002ee71d v1.12.1 2024-04-09 23:34:31 +00:00
ed
ab18893cd2 update deps 2024-04-09 23:25:54 +00:00
ed
844d16b9e5 bbox: scrollwheel for prev/next pic
inspired by d385305f5e
2024-04-09 20:39:07 +00:00
ed
989cc613ef fix tree-rendering when history-popping into bbox
plus misc similar technically-incorrect addq usages;
most of these don't matter in practice since they'll
never get a url with a hash, but makes the intent clear

and make sure hashes never get passed around
like they're part of a dirkey, harmless as it is
2024-04-09 19:54:15 +00:00
ed
4f0cad5468 fix bbox destructor, closes #81 for real 2024-04-09 19:10:55 +00:00
ed
f89de6b35d preloading too aggressive, chill a bit 2024-04-09 18:44:23 +00:00
ed
e0bcb88ee7 update pkgs to 1.12.0 2024-04-06 20:56:52 +00:00
ed
a0022805d1 v1.12.0 (closes #64) 2024-04-06 20:11:49 +00:00
ed
853adb5d04 update deps 2024-04-06 19:51:38 +00:00
ed
7744226b5c apply audio equalizer to videos too 2024-04-06 18:44:08 +00:00
ed
d94b5b3fc9 fau doesn't work on iphones; compensate by preloading much earlier 2024-04-06 18:43:45 +00:00
ed
e6ba065bc2 improve cachebusters 2024-04-06 00:27:06 +00:00
ed
59a53ba9ac on phones, fix playback halting if next song didn't buffer in time 2024-04-06 00:25:28 +00:00
ed
b88cc7b5ce turns out it doesn't need to be audible... 2024-04-05 23:06:26 +00:00
ed
5ab54763c6 remove pyoxidizer (unmaintained)
partially reverts e430b2567a

the remaining stuff might be useful for other cpython alternatives
2024-04-05 17:51:26 +00:00
ed
59f815ff8c deps: add busy.mp3 2024-04-04 09:27:01 +00:00
ed
9c42cbec6f maybe fix #81 2024-04-03 00:28:15 +00:00
ed
f471b05aa4 todo: fix playback stopping on phones if slow preload 2024-04-02 23:20:58 +00:00
ed
34c32e3e89 golf:
util.js ensures `WebAssembly`, `Notification`, and `FormData`
are always declared, setting them false when not available
2024-04-02 20:25:06 +00:00
ed
a080759a03 add transcoding to mp3
because CU's car stereo can't do opus...

incidentally adds support for playing any audio format in ie11
2024-03-29 16:36:56 +00:00
ed
0ae12868e5 dirkeys: add volflag dky (skip keycheck) 2024-03-27 21:03:58 +00:00
ed
ef52e2c06c dirkeys: fix 403 in dks volumes 2024-03-27 20:34:34 +00:00
ed
32c912bb16 fix a bunch of dirkey stuff:
* breadcrumb navigation
* tree generation in `recvls`
* dirkeys in initial tree
2024-03-27 16:05:05 +00:00
ed
20870fda79 Merge branch 'dirkeys' into hovudstraum 2024-03-25 10:34:08 +00:00
ed
bdfe2c1a5f mention unproductive optimizations 2024-03-24 22:07:23 +00:00
ed
cb99fbf442 update pkgs to 1.11.2 2024-03-23 17:53:19 +00:00
ed
bccc44dc21 v1.11.2 2024-03-23 17:24:36 +00:00
ed
2f20d29edd idp: mention lack of volume persistence 2024-03-23 16:35:45 +00:00
ed
c6acd3a904 add option --s-rd-sz (socket read size):
counterpart of `--s-wr-sz` which existed already

the default (256 KiB) appears optimal in the most popular scenario
(linux host with storage on local physical disk, usually NVMe)

was previously 32 KiB, so large uploads should now use 17% less CPU

also adds sanchecks for values of `--iobuf`, `--s-rd-sz`, `--s-wr-sz`

also adds file-overwrite feature for multipart posts
2024-03-23 16:35:14 +00:00
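What the new knob controls, roughly (hypothetical helper; 262144 is the 256 KiB default mentioned above): every `recv()` asks for at most `--s-rd-sz` bytes, so a larger value means fewer syscalls per uploaded gigabyte.

```python
def recv_to_file(sock, f, remains, s_rd_sz=262144):
    # drain an upload body from the socket in s_rd_sz-sized chunks
    while remains > 0:
        buf = sock.recv(min(s_rd_sz, remains))
        if not buf:
            raise EOFError("client disconnected mid-upload")
        f.write(buf)
        remains -= len(buf)
```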
ed
2b24c50eb7 add option --iobuf (file r/w buffersize):
the default (256 KiB) appears optimal in the most popular scenario
(linux host with storage on local physical disk, usually NVMe)

was previously a mix of 64 and 512 KiB;
now the same value is enforced everywhere

download-as-tar is now 20% faster with the default value
2024-03-23 16:17:40 +00:00
ed
d30ae8453d idp: precise expansion of ${u} (fixes #79);
it is now possible to grant access to users other than `${u}`
(the user which the volume belongs to)

previously, permissions did not apply correctly to IdP volumes due to
the way `${u}` and `${g}` was expanded, which was a funky iteration
over all known users/groups instead of... just expanding them?

also adds another sanchk that a volume's URL must contain a
`${u}` to be allowed to mention `${u}` in the accs list, and
similarly for `${g}` / `@${g}` since users can be in multiple groups
2024-03-21 20:10:27 +00:00
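A rough illustration of the "precise expansion" (not the real config parser): `${u}` / `${g}` are substituted with exactly the user/group the volume is being instantiated for, instead of a loop over every known account.

```python
def expand_idp(vpath, accs, user, groups):
    # one volume per group the user belongs to; ${u} and ${g} expand to
    # exactly that user/group, so unrelated accounts no longer leak in
    vols = []
    for grp in groups:
        vp = vpath.replace("${u}", user).replace("${g}", grp)
        acl = [a.replace("${u}", user).replace("${g}", grp) for a in accs]
        vols.append((vp, acl))
    return vols
```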
ed
8e5c436bef black + isort 2024-03-21 18:51:23 +00:00
ed
f500e55e68 update pkgs to 1.11.1 2024-03-18 17:41:43 +00:00
ed
10bc2d9205 unsuccessful attempt at dirkeys (#64) 2023-12-17 22:30:22 +00:00
78 changed files with 3312 additions and 1006 deletions

.gitignore (vendored)

@@ -12,6 +12,7 @@ copyparty.egg-info/
 /dist/
 /py2/
 /sfx*
+/pyz/
 /unt/
 /log/

README.md

@@ -10,13 +10,16 @@ turn almost any device into a file server with resumable uploads/downloads using
 📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
+🎬 **videos:** [upload](https://a.ocv.me/pub/demo/pics-vids/up2k.webm) // [cli-upload](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm) // [race-the-beam](https://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)
 ## readme toc
 * top
 * [quickstart](#quickstart) - just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** -- that's it! 🎉
+* [at home](#at-home) - make it accessible over the internet
 * [on servers](#on-servers) - you may also want these, especially on servers
-* [features](#features)
+* [features](#features) - also see [comparison to similar software](./docs/versus.md)
 * [testimonials](#testimonials) - small collection of user feedback
 * [motivations](#motivations) - project goals / philosophy
 * [notes](#notes) - general notes
@@ -37,12 +40,14 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [file-search](#file-search) - dropping files into the browser also lets you see if they exist on the server
 * [unpost](#unpost) - undo/delete accidental uploads
 * [self-destruct](#self-destruct) - uploads can be given a lifetime
+* [race the beam](#race-the-beam) - download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm))
 * [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
 * [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
 * [media player](#media-player) - plays almost every audio format there is
 * [audio equalizer](#audio-equalizer) - and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
 * [fix unreliable playback on android](#fix-unreliable-playback-on-android) - due to phone / app settings
 * [markdown viewer](#markdown-viewer) - and there are *two* editors
+* [markdown vars](#markdown-vars) - dynamic docs with serverside variable expansion
 * [other tricks](#other-tricks)
 * [searching](#searching) - search by size, date, path/name, mp3-tags, ...
 * [server config](#server-config) - using arguments or config files, or a mix of both
@@ -56,6 +61,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [tftp server](#tftp-server) - a TFTP server (read/write) can be started using `--tftp 3969`
 * [smb server](#smb-server) - unsafe, slow, not recommended for wan
 * [browser ux](#browser-ux) - tweaking the ui
+* [opengraph](#opengraph) - discord and social-media embeds
 * [file indexing](#file-indexing) - enables dedup and music search ++
 * [exclude-patterns](#exclude-patterns) - to save some time
 * [filesystem guards](#filesystem-guards) - avoid traversing into other filesystems
@@ -94,6 +100,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [gotchas](#gotchas) - behavior that might be unexpected
 * [cors](#cors) - cross-site request config
 * [filekeys](#filekeys) - prevent filename bruteforcing
+* [dirkeys](#dirkeys) - share specific folders in a volume
 * [password hashing](#password-hashing) - you can hash passwords
 * [https](#https) - both HTTP and HTTPS are accepted
 * [recovering from crashes](#recovering-from-crashes)
@@ -103,8 +110,9 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [dependencies](#dependencies) - mandatory deps
 * [optional dependencies](#optional-dependencies) - install these to enable bonus features
 * [optional gpl stuff](#optional-gpl-stuff)
-* [sfx](#sfx) - the self-contained "binary"
+* [sfx](#sfx) - the self-contained "binary" (recommended!)
 * [copyparty.exe](#copypartyexe) - download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (win8+) or [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) (win7+)
+* [zipapp](#zipapp) - another emergency alternative, [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz)
 * [install on android](#install-on-android)
 * [reporting bugs](#reporting-bugs) - ideas for context to include, and where to submit them
 * [devnotes](#devnotes) - for build instructions etc, see [./docs/devnotes.md](./docs/devnotes.md)
@@ -118,6 +126,7 @@ just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/
 * or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
 * or install [on arch](#arch-package) [on NixOS](#nixos-module) [through nix](#nix-package)
 * or if you are on android, [install copyparty in termux](#install-on-android)
+* or if your computer is messed up and nothing else works, [try the pyz](#zipapp)
 * or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too
 * docker has all deps built-in, so skip this step:
@@ -125,7 +134,7 @@ enable thumbnails (images/audio/video), media indexing, and audio transcoding by
 * **Alpine:** `apk add py3-pillow ffmpeg`
 * **Debian:** `apt install --no-install-recommends python3-pil ffmpeg`
-* **Fedora:** rpmfusion + `dnf install python3-pillow ffmpeg`
+* **Fedora:** rpmfusion + `dnf install python3-pillow ffmpeg --allowerasing`
 * **FreeBSD:** `pkg install py39-sqlite3 py39-pillow ffmpeg`
 * **MacOS:** `port install py-Pillow ffmpeg`
 * **MacOS** (alternative): `brew install pillow ffmpeg`
@@ -146,6 +155,17 @@ some recommended options:
 * see [accounts and volumes](#accounts-and-volumes) (or `--help-accounts`) for the syntax and other permissions
+### at home
+make it accessible over the internet by starting a [cloudflare quicktunnel](https://developers.cloudflare.com/cloudflare-one/connections/connect-networks/do-more-with-tunnels/trycloudflare/) like so:
+first download [cloudflared](https://developers.cloudflare.com/cloudflare-one/connections/connect-networks/downloads/) and then start the tunnel with `cloudflared tunnel --url http://127.0.0.1:3923`
+as the tunnel starts, it will show a URL which you can share to let anyone browse your stash or upload files to you
+since people will be connecting through cloudflare, run copyparty with `--xff-hdr cf-connecting-ip` to detect client IPs correctly
 ### on servers
 you may also want these, especially on servers:
@@ -169,6 +189,8 @@ firewall-cmd --reload
 ## features
+also see [comparison to similar software](./docs/versus.md)
 * backend stuff
 * ☑ IPv6
 * ☑ [multiprocessing](#performance) (actual multithreading)
@@ -191,6 +213,7 @@ firewall-cmd --reload
 * ☑ write-only folders
 * ☑ [unpost](#unpost): undo/delete accidental uploads
 * ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
+* ☑ [race the beam](#race-the-beam) (almost like peer-to-peer)
 * ☑ symlink/discard duplicates (content-matching)
 * download
 * ☑ single files in browser
@@ -199,7 +222,7 @@ firewall-cmd --reload
 * browser
 * ☑ [navpane](#navpane) (directory tree sidebar)
 * ☑ file manager (cut/paste, delete, [batch-rename](#batch-rename))
-* ☑ audio player (with [OS media controls](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) and opus transcoding)
+* ☑ audio player (with [OS media controls](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) and opus/mp3 transcoding)
 * ☑ image gallery with webm player
 * ☑ textfile browser with syntax hilighting
 * ☑ [thumbnails](#thumbnails)
@@ -207,6 +230,7 @@ firewall-cmd --reload
 * ☑ ...of videos using FFmpeg
 * ☑ ...of audio (spectrograms) using FFmpeg
 * ☑ cache eviction (max-age; maybe max-size eventually)
+* ☑ multilingual UI (english, norwegian, [add your own](./docs/rice/#translations))
 * ☑ SPA (browse while uploading)
 * server indexing
 * ☑ [locate files by contents](#file-search)
@@ -215,9 +239,11 @@ firewall-cmd --reload
 * client support
 * ☑ [folder sync](#folder-sync)
 * ☑ [curl-friendly](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png)
+* ☑ [opengraph](#opengraph) (discord embeds)
 * markdown
 * ☑ [viewer](#markdown-viewer)
 * ☑ editor (sure why not)
+* ☑ [variables](#markdown-vars)
 PS: something missing? post any crazy ideas you've got as a [feature request](https://github.com/9001/copyparty/issues/new?assignees=9001&labels=enhancement&template=feature_request.md) or [discussion](https://github.com/9001/copyparty/discussions/new?category=ideas) 🤙
@@ -387,7 +413,7 @@ configuring accounts/volumes with arguments:
 `-v .::r,usr1,usr2:rw,usr3,usr4` = usr1/2 read-only, 3/4 read-write
 permissions:
-* `r` (read): browse folder contents, download files, download as zip/tar
+* `r` (read): browse folder contents, download files, download as zip/tar, see filekeys/dirkeys
 * `w` (write): upload files, move files *into* this folder
 * `m` (move): move files/folders *from* this folder
 * `d` (delete): delete files/folders
@@ -587,17 +613,23 @@ you can also zip a selection of files or folders by clicking them in the browser
 ![copyparty-zipsel-fs8](https://user-images.githubusercontent.com/241032/129635374-e5136e01-470a-49b1-a762-848e8a4c9cdc.png)
-cool trick: download a folder by appending url-params `?tar&opus` to transcode all audio files (except aac|m4a|mp3|ogg|opus|wma) to opus before they're added to the archive
+cool trick: download a folder by appending url-params `?tar&opus` or `?tar&mp3` to transcode all audio files (except aac|m4a|mp3|ogg|opus|wma) to opus/mp3 before they're added to the archive
 * super useful if you're 5 minutes away from takeoff and realize you don't have any music on your phone but your server only has flac files and downloading those will burn through all your data + there wouldn't be enough time anyways
-* and url-params `&j` / `&w` produce jpeg/webm thumbnails/spectrograms instead of the original audio/video/images
+* and url-params `&j` / `&w` produce jpeg/webm thumbnails/spectrograms instead of the original audio/video/images (`&p` for audio waveforms)
 * can also be used to pregenerate thumbnails; combine with `--th-maxage=9999999` or `--th-clean=0`
 ## uploading
-drag files/folders into the web-browser to upload (or use the [command-line uploader](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy))
-this initiates an upload using `up2k`; there are two uploaders available:
+drag files/folders into the web-browser to upload
+dragdrop is the recommended way, but you may also:
+* select some files (not folders) in your file explorer and press CTRL-V inside the browser window
+* use the [command-line uploader](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy)
+* upload using [curl or sharex](#client-examples)
+when uploading files through dragdrop or CTRL-V, this initiates an upload using `up2k`; there are two browser-based uploaders available:
 * `[🎈] bup`, the basic uploader, supports almost every browser since netscape 4.0
 * `[🚀] up2k`, the good / fancy one
@@ -616,7 +648,7 @@ up2k has several advantages:
 > it is perfectly safe to restart / upgrade copyparty while someone is uploading to it!
 > all known up2k clients will resume just fine 💪
-see [up2k](#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
+see [up2k](./docs/devnotes.md#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
 ![copyparty-upload-fs8](https://user-images.githubusercontent.com/241032/129635371-48fc54ca-fa91-48e3-9b1d-ba413e4b68cb.png)
@@ -682,6 +714,13 @@ clients can specify a shorter expiration time using the [up2k ui](#uploading) --
 specifying a custom expiration time client-side will affect the timespan in which unposts are permitted, so keep an eye on the estimates in the up2k ui
+### race the beam
+download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)) -- it's almost like peer-to-peer
+requires the file to be uploaded using up2k (which is the default drag-and-drop uploader), alternatively the command-line program
 ## file manager
 cut/paste, rename, and delete files/folders (if you have permission)
@@ -778,9 +817,9 @@ open the `[🎺]` media-player-settings tab to configure it,
 * `[loop]` keeps looping the folder
 * `[next]` plays into the next folder
 * "transcode":
-* `[flac]` converts `flac` and `wav` files into opus
-* `[aac]` converts `aac` and `m4a` files into opus
-* `[oth]` converts all other known formats into opus
+* `[flac]` converts `flac` and `wav` files into opus (if supported by browser) or mp3
+* `[aac]` converts `aac` and `m4a` files into opus (if supported by browser) or mp3
+* `[oth]` converts all other known formats into opus (if supported by browser) or mp3
 * `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
 * "tint" reduces the contrast of the playback bar
@@ -817,6 +856,13 @@ other notes,
 * the document preview has a max-width which is the same as an A4 paper when printed
+### markdown vars
+dynamic docs with serverside variable expansion to replace stuff like `{{self.ip}}` with the client's IP, or `{{srv.htime}}` with the current time on the server
+see [./srv/expand/](./srv/expand/) for usage and examples
 ## other tricks
 * you can link a particular timestamp in an audio file by adding it to the URL, such as `&20` / `&20s` / `&1m20` / `&t=1:20` after the `.../#af-c8960dab`
@@ -865,6 +911,8 @@ using arguments or config files, or a mix of both:
 **NB:** as humongous as this readme is, there is also a lot of undocumented features. Run copyparty with `--help` to see all available global options; all of those can be used in the `[global]` section of config files, and everything listed in `--help-flags` can be used in volumes as volflags.
 * if running in docker/podman, try this: `docker run --rm -it copyparty/ac --help`
+* or see this (probably outdated): https://ocv.me/copyparty/helptext.html
+* or if you prefer plaintext, https://ocv.me/copyparty/helptext.txt
 ## zeroconf
@@ -1041,6 +1089,23 @@ tweaking the ui
 * to sort in music order (album, track, artist, title) with filename as fallback, you could `--sort tags/Cirle,tags/.tn,tags/Artist,tags/Title,href`
 * to sort by upload date, first enable showing the upload date in the listing with `-e2d -mte +.up_at` and then `--sort tags/.up_at`
+see [./docs/rice](./docs/rice) for more, including how to add stuff (css/`<meta>`/...) to the html `<head>` tag, or to add your own translation
+## opengraph
+discord and social-media embeds
+can be enabled globally with `--og` or per-volume with volflag `og`
+note that this disables hotlinking because the opengraph spec demands it; to sneak past this intentional limitation, you can enable opengraph selectively by user-agent, for example `--og-ua '(Discord|Twitter|Slack)bot'` (or volflag `og_ua`)
+you can also hotlink files regardless by appending `?raw` to the url
+NOTE: because discord (and maybe others) strip query args such as `?raw` in opengraph tags, any links which require a filekey or dirkey will not work
+if you want to entirely replace the copyparty response with your own jinja2 template, give the template filepath to `--og-tpl` or volflag `og_tpl` (all members of `HttpCli` are available through the `this` object)
 ## file indexing
@@ -1291,6 +1356,8 @@ you may experience poor upload performance this way, but that can sometimes be f
 someone has also tested geesefs in combination with [gocryptfs](https://nuetzlich.net/gocryptfs/) with surprisingly good results, getting 60 MiB/s upload speeds on a gbit line, but JuiceFS won with 80 MiB/s using its built-in encryption
+you may improve performance by specifying larger values for `--iobuf` / `--s-rd-sz` / `--s-wr-sz`
 ## hiding from google
@@ -1740,6 +1807,7 @@ below are some tweaks roughly ordered by usefulness:
 * `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
 * and also makes thumbnails load faster, regardless of e2d/e2t
 * `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
+* if your volumes are on a network-disk such as NFS / SMB / s3, specifying larger values for `--iobuf` and/or `--s-rd-sz` and/or `--s-wr-sz` may help; try setting all of them to `524288` or `1048576` or `4194304`
 * `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
 * `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
 * lots of connections (many users or heavy clients)
@@ -1836,12 +1904,29 @@ cors can be configured with `--acao` and `--acam`, or the protections entirely d
 prevent filename bruteforcing
-volflag `c,fk` generates filekeys (per-file accesskeys) for all files; users which have full read-access (permission `r`) will then see URLs with the correct filekey `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
+volflag `fk` generates filekeys (per-file accesskeys) for all files; users which have full read-access (permission `r`) will then see URLs with the correct filekey `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
 by default, filekeys are generated based on salt (`--fk-salt`) + filesystem-path + file-size + inode (if not windows); add volflag `fka` to generate slightly weaker filekeys which will not be invalidated if the file is edited (only salt + path)
 permissions `wG` (write + upget) lets users upload files and receive their own filekeys, still without being able to see other uploads
+### dirkeys
+share specific folders in a volume without giving away full read-access to the rest -- the visitor only needs the `g` (get) permission to view the link
+volflag `dk` generates dirkeys (per-directory accesskeys) for all folders, granting read-access to that folder; by default only that folder itself, no subfolders
+volflag `dky` disables the actual key-check, meaning anyone can see the contents of a folder where they have `g` access, but not its subdirectories
+* `dk` + `dky` gives the same behavior as if all users with `g` access have full read-access, but subfolders are hidden files (their names start with a dot), so `dky` is an alternative to renaming all the folders for that purpose, maybe just for some users
+volflag `dks` lets people enter subfolders as well, and also enables download-as-zip/tar
+dirkeys are generated based on another salt (`--dk-salt`) + filesystem-path and have a few limitations:
+* the key does not change if the contents of the folder is modified
+* if you need a new dirkey, either change the salt or rename the folder
+* linking to a textfile (so it opens in the textfile viewer) is not possible if recipient doesn't have read-access
 ## password hashing
@@ -1933,7 +2018,7 @@ these are standalone programs and will never be imported / evaluated by copypart
 # sfx
-the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
+the self-contained "binary" (recommended!) [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
 you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](./docs/devnotes.md#sfx-repack)
@@ -1960,6 +2045,16 @@ meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/d
 then again, if you are already into downloading shady binaries from the internet, you may also want my [minimal builds](./scripts/pyinstaller#ffmpeg) of [ffmpeg](https://ocv.me/stuff/bin/ffmpeg.exe) and [ffprobe](https://ocv.me/stuff/bin/ffprobe.exe) which enables copyparty to extract multimedia-info, do audio-transcoding, and thumbnails/spectrograms/waveforms, however it's much better to instead grab a [recent official build](https://www.gyan.dev/ffmpeg/builds/ffmpeg-git-full.7z) every once in a while if you can afford the size
+## zipapp
+another emergency alternative, [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) has less features, requires python 3.7 or newer, worse compression, and more importantly is unable to benefit from more recent versions of jinja2 and such (which makes it less secure)... lots of drawbacks with this one really -- but it *may* just work if the regular sfx fails to start because the computer is messed up in certain funky ways, so it's worth a shot if all else fails
+run it by doubleclicking it, or try typing `python copyparty.pyz` in your terminal/console/commandline/telex if that fails
+it is a python [zipapp](https://docs.python.org/3/library/zipapp.html) meaning it doesn't have to unpack its own python code anywhere to run, so if the filesystem is busted it has a better chance of getting somewhere
+* but note that it currently still needs to extract the web-resources somewhere (they'll land in the default TEMP-folder of your OS)
 # install on android
 install [Termux](https://termux.com/) + its companion app `Termux:API` (see [ocv.me/termux](https://ocv.me/termux/)) and then copy-paste this into Termux (long-tap) all at once:


@@ -231,7 +231,7 @@ install_vamp() {
 cd "$td"
 echo '#include <vamp-sdk/Plugin.h>' | g++ -x c++ -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
 printf '\033[33mcould not find the vamp-sdk, building from source\033[0m\n'
-(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2691/vamp-plugin-sdk-2.10.0.tar.gz)
+(dl_files yolo https://ocv.me/mirror/vamp-plugin-sdk-2.10.0.tar.gz)
 sha512sum -c <(
 echo "153b7f2fa01b77c65ad393ca0689742d66421017fd5931d216caa0fcf6909355fff74706fabbc062a3a04588a619c9b515a1dae00f21a57afd97902a355c48ed -"
 ) <vamp-plugin-sdk-2.10.0.tar.gz
@@ -247,7 +247,7 @@ install_vamp() {
 cd "$td"
 have_beatroot || {
 printf '\033[33mcould not find the vamp beatroot plugin, building from source\033[0m\n'
-(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/885/beatroot-vamp-v1.0.tar.gz)
+(dl_files yolo https://ocv.me/mirror/beatroot-vamp-v1.0.tar.gz)
 sha512sum -c <(
 echo "1f444d1d58ccf565c0adfe99f1a1aa62789e19f5071e46857e2adfbc9d453037bc1c4dcb039b02c16240e9b97f444aaff3afb625c86aa2470233e711f55b6874 -"
 ) <beatroot-vamp-v1.0.tar.gz


@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
 from __future__ import print_function, unicode_literals
-S_VERSION = "1.15"
-S_BUILD_DT = "2024-02-18"
+S_VERSION = "1.18"
+S_BUILD_DT = "2024-06-01"
 """
 u2c.py: upload to copyparty
@@ -79,12 +79,21 @@ req_ses = requests.Session()
 class Daemon(threading.Thread):
-def __init__(self, target, name=None, a=None):
-# type: (Any, Any, Any) -> None
-threading.Thread.__init__(self, target=target, args=a or (), name=name)
+def __init__(self, target, name = None, a = None):
+threading.Thread.__init__(self, name=name)
+self.a = a or ()
+self.fun = target
 self.daemon = True
 self.start()
+def run(self):
+try:
+signal.pthread_sigmask(signal.SIG_BLOCK, [signal.SIGINT, signal.SIGTERM])
+except:
+pass
+self.fun(*self.a)
 class File(object):
 """an up2k upload task; represents a single file"""
@@ -563,7 +572,7 @@ def handshake(ar, file, search):
 else:
 if ar.touch:
 req["umod"] = True
-if ar.dr:
+if ar.ow:
 req["replace"] = True
 headers = {"Content-Type": "text/plain"} # <=1.5.1 compat
@@ -1135,11 +1144,12 @@ source file/folder selection uses rsync syntax, meaning that:
 ap.add_argument("url", type=unicode, help="server url, including destination folder")
 ap.add_argument("files", type=unicode, nargs="+", help="files and/or folders to process")
 ap.add_argument("-v", action="store_true", help="verbose")
-ap.add_argument("-a", metavar="PASSWORD", help="password or $filepath")
+ap.add_argument("-a", metavar="PASSWD", help="password or $filepath")
 ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
 ap.add_argument("-x", type=unicode, metavar="REGEX", default="", help="skip file if filesystem-abspath matches REGEX, example: '.*/\\.hist/.*'")
 ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
 ap.add_argument("--touch", action="store_true", help="if last-modified timestamps differ, push local to server (need write+delete perms)")
+ap.add_argument("--ow", action="store_true", help="overwrite existing files instead of autorenaming")
 ap.add_argument("--version", action="store_true", help="show version and exit")
 ap = app.add_argument_group("compatibility")
@@ -1148,12 +1158,12 @@ source file/folder selection uses rsync syntax, meaning that:
 ap = app.add_argument_group("folder sync")
 ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
-ap.add_argument("--dr", action="store_true", help="delete remote files which don't exist locally")
+ap.add_argument("--dr", action="store_true", help="delete remote files which don't exist locally (implies --ow)")
 ap.add_argument("--drd", action="store_true", help="delete remote files during upload instead of afterwards; reduces peak disk space usage, but will reupload instead of detecting renames")
 ap = app.add_argument_group("performance tweaks")
-ap.add_argument("-j", type=int, metavar="THREADS", default=4, help="parallel connections")
-ap.add_argument("-J", type=int, metavar="THREADS", default=hcores, help="num cpu-cores to use for hashing; set 0 or 1 for single-core hashing")
+ap.add_argument("-j", type=int, metavar="CONNS", default=2, help="parallel connections")
+ap.add_argument("-J", type=int, metavar="CORES", default=hcores, help="num cpu-cores to use for hashing; set 0 or 1 for single-core hashing")
 ap.add_argument("-nh", action="store_true", help="disable hashing while uploading")
 ap.add_argument("-ns", action="store_true", help="no status panel (for slow consoles and macos)")
 ap.add_argument("--cd", type=float, metavar="SEC", default=5, help="delay before reattempting a failed handshake/upload")
@@ -1161,7 +1171,7 @@ source file/folder selection uses rsync syntax, meaning that:
 ap.add_argument("-z", action="store_true", help="ZOOMIN' (skip uploading files if they exist at the destination with the ~same last-modified timestamp, so same as yolo / turbo with date-chk but even faster)")
 ap = app.add_argument_group("tls")
-ap.add_argument("-te", metavar="PEM_FILE", help="certificate to expect/verify")
+ap.add_argument("-te", metavar="PATH", help="path to ca.pem or cert.pem to expect/verify")
 ap.add_argument("-td", action="store_true", help="disable certificate check")
 # fmt: on
@@ -1178,6 +1188,9 @@ source file/folder selection uses rsync syntax, meaning that:
 if ar.drd:
 ar.dr = True
+if ar.dr:
+ar.ow = True
 for k in "dl dr drd".split():
 errs = []
 if ar.safe and getattr(ar, k):
@@ -1196,6 +1209,14 @@ source file/folder selection uses rsync syntax, meaning that:
 if "://" not in ar.url:
 ar.url = "http://" + ar.url
+if "https://" in ar.url.lower():
+try:
+import ssl, zipfile
+except:
+t = "ERROR: https is not available for some reason; please use http"
+print("\n\n %s\n\n" % (t,))
+raise
 if ar.a and ar.a.startswith("$"):
 fn = ar.a[1:]
 print("reading password from file [{0}]".format(fn))


@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.11.0"
+pkgver="1.13.2"
 pkgrel=1
 pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
 arch=("any")
@@ -21,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
 )
 source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("95f39a239dc38844fc27c5a1473635d07d8907bc98679dc79eb1de475e36fe42")
+sha256sums=("39937526aab77f4d78f19d16ed5c0eae9ac28f658e772ae9b54fff5281161c77")
 build() {
 cd "${srcdir}/${pkgname}-${pkgver}"


@@ -1,5 +1,5 @@
 {
-"url": "https://github.com/9001/copyparty/releases/download/v1.11.0/copyparty-sfx.py",
-"version": "1.11.0",
-"hash": "sha256-MkNp+tI/Pl5QB4FMdJNOePbSUPO1MHWJLLC7gNh9K+c="
+"url": "https://github.com/9001/copyparty/releases/download/v1.13.2/copyparty-sfx.py",
+"version": "1.13.2",
+"hash": "sha256-5vvhbiZtgW/km9csq9iYezCaS6wAsLn1qVXDjl6gvwU="
 }

View File

@@ -56,7 +56,6 @@ class EnvParams(object):
         self.t0 = time.time()
         self.mod = ""
         self.cfg = ""
-        self.ox = getattr(sys, "oxidized", None)
 E = EnvParams()

copyparty/__main__.py Executable file → Normal file
View File

@@ -13,6 +13,7 @@ import base64
 import locale
 import os
 import re
+import select
 import socket
 import sys
 import threading
@@ -43,11 +44,13 @@ from .util import (
     DEF_MTH,
     IMPLICATIONS,
     JINJA_VER,
+    MIMES,
     PARTFTPY_VER,
     PY_DESC,
     PYFTPD_VER,
     SQLITE_VER,
     UNPLICATIONS,
+    Daemon,
     align_tab,
     ansi_re,
     dedent,
@@ -157,7 +160,8 @@ def warn(msg: str) -> None:
 def init_E(EE: EnvParams) -> None:
-    # __init__ runs 18 times when oxidized; do expensive stuff here
+    # some cpython alternatives (such as pyoxidizer) can
+    # __init__ several times, so do expensive stuff here
     E = EE  # pylint: disable=redefined-outer-name
@@ -170,8 +174,10 @@ def init_E(EE: EnvParams) -> None:
         (os.environ.get, "TMP"),
         (unicode, "/tmp"),
     ]
+    errs = []
     for chk in [os.listdir, os.mkdir]:
-        for pf, pa in paths:
+        for npath, (pf, pa) in enumerate(paths):
+            p = ""
             try:
                 p = pf(pa)
                 # print(chk.__name__, p, pa)
@@ -184,40 +190,26 @@ def init_E(EE: EnvParams) -> None:
                 if not os.path.isdir(p):
                     os.mkdir(p)
+                if npath > 1:
+                    t = "Using [%s] for config; filekeys/dirkeys will change on every restart. Consider setting XDG_CONFIG_HOME or giving the unix-user a ~/.config/"
+                    errs.append(t % (p,))
+                elif errs:
+                    errs.append("Using [%s] instead" % (p,))
+                if errs:
+                    print("WARNING: " + ". ".join(errs))
                 return p  # type: ignore
-            except:
-                pass
+            except Exception as ex:
+                if p and npath < 2:
+                    t = "Unable to store config in [%s] due to %r"
+                    errs.append(t % (p, ex))
     raise Exception("could not find a writable path for config")
-    def _unpack() -> str:
-        import atexit
-        import tarfile
-        import tempfile
-        from importlib.resources import open_binary
-        td = tempfile.TemporaryDirectory(prefix="")
-        atexit.register(td.cleanup)
-        tdn = td.name
-        with open_binary("copyparty", "z.tar") as tgz:
-            with tarfile.open(fileobj=tgz) as tf:
-                try:
-                    tf.extractall(tdn, filter="tar")
-                except TypeError:
-                    tf.extractall(tdn)  # nosec (archive is safe)
-        return tdn
-    try:
-        E.mod = os.path.dirname(os.path.realpath(__file__))
-        if E.mod.endswith("__init__"):
-            E.mod = os.path.dirname(E.mod)
-    except:
-        if not E.ox:
-            raise
-        E.mod = _unpack()
+    E.mod = os.path.dirname(os.path.realpath(__file__))
+    if E.mod.endswith("__init__"):
+        E.mod = os.path.dirname(E.mod)
 if sys.platform == "win32":
     bdir = os.environ.get("APPDATA") or os.environ.get("TEMP") or "."
@@ -274,6 +266,19 @@ def get_fk_salt() -> str:
     return ret.decode("utf-8")
+def get_dk_salt() -> str:
+    fp = os.path.join(E.cfg, "dk-salt.txt")
+    try:
+        with open(fp, "rb") as f:
+            ret = f.read().strip()
+    except:
+        ret = base64.b64encode(os.urandom(30))
+        with open(fp, "wb") as f:
+            f.write(ret + b"\n")
+    return ret.decode("utf-8")
 def get_ah_salt() -> str:
     fp = os.path.join(E.cfg, "ah-salt.txt")
     try:
@@ -481,6 +486,16 @@ def disable_quickedit() -> None:
     cmode(True, mode | 4)
+def sfx_tpoke(top: str):
+    files = [os.path.join(dp, p) for dp, dd, df in os.walk(top) for p in dd + df]
+    while True:
+        t = int(time.time())
+        for f in [top] + files:
+            os.utime(f, (t, t))
+        time.sleep(78123)
 def showlic() -> None:
     p = os.path.join(E.mod, "res", "COPYING.txt")
     if not os.path.exists(p):
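The sfx_tpoke helper above walks the given directory once, then refreshes the mtime of the directory and everything inside it every 78123 seconds (roughly 21.7 hours), presumably to keep tmp-dir cleaners from reaping the unpacked sfx while the server is running. A minimal sketch of how it gets launched, mirroring the --sfx-tpoke handling further down in main(); the path below is a made-up example:

# hedged sketch; Daemon is the thread helper imported from .util in the diff above
from copyparty.util import Daemon

sfx_dir = "/tmp/pe-copyparty.1000"  # hypothetical unpack location, not taken from the diff
Daemon(sfx_tpoke, "sfx-tpoke", (sfx_dir,))  # keeps touching sfx_dir forever in the background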
@@ -831,7 +846,7 @@ def build_flags_desc():
         v = v.replace("\n", "\n ")
         ret += "\n \033[36m{}\033[35m {}".format(k, v)
-    return ret + "\033[0m"
+    return ret
 # fmt: off
@@ -849,6 +864,8 @@ def add_general(ap, nc, srvname):
     ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-form POSTs; see \033[33m--help-urlform\033[0m")
     ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="server terminal title, for example [\033[32m$ip-10.1.2.\033[0m] or [\033[32m$ip-]")
     ap2.add_argument("--name", metavar="TXT", type=u, default=srvname, help="server name (displayed topleft in browser and in mDNS)")
+    ap2.add_argument("--mime", metavar="EXT=MIME", type=u, action="append", help="map file \033[33mEXT\033[0mension to \033[33mMIME\033[0mtype, for example [\033[32mjpg=image/jpeg\033[0m]")
+    ap2.add_argument("--mimes", action="store_true", help="list default mimetype mapping and exit")
     ap2.add_argument("--license", action="store_true", help="show licenses and exit")
     ap2.add_argument("--version", action="store_true", help="show versions and exit")
@@ -867,8 +884,11 @@ def add_qr(ap, tty):
 def add_fs(ap):
     ap2 = ap.add_argument_group("filesystem options")
-    rm_re_def = "5/0.1" if ANYWIN else "0/0"
+    rm_re_def = "15/0.1" if ANYWIN else "0/0"
     ap2.add_argument("--rm-retry", metavar="T/R", type=u, default=rm_re_def, help="if a file cannot be deleted because it is busy, continue trying for \033[33mT\033[0m seconds, retry every \033[33mR\033[0m seconds; disable with 0/0 (volflag=rm_retry)")
+    ap2.add_argument("--mv-retry", metavar="T/R", type=u, default=rm_re_def, help="if a file cannot be renamed because it is busy, continue trying for \033[33mT\033[0m seconds, retry every \033[33mR\033[0m seconds; disable with 0/0 (volflag=mv_retry)")
+    ap2.add_argument("--iobuf", metavar="BYTES", type=int, default=256*1024, help="file I/O buffer-size; if your volumes are on a network drive, try increasing to \033[32m524288\033[0m or even \033[32m4194304\033[0m (and let me know if that improves your performance)")
+    ap2.add_argument("--mtab-age", metavar="SEC", type=int, default=60, help="rebuild mountpoint cache every \033[33mSEC\033[0m to keep track of sparse-files support; keep low on servers with removable media")
 def add_upload(ap):
@@ -892,7 +912,7 @@ def add_upload(ap):
     ap2.add_argument("--rand", action="store_true", help="force randomized filenames, \033[33m--nrand\033[0m chars long (volflag=rand)")
     ap2.add_argument("--nrand", metavar="NUM", type=int, default=9, help="randomized filenames length (volflag=nrand)")
     ap2.add_argument("--magic", action="store_true", help="enable filetype detection on nameless uploads (volflag=magic)")
-    ap2.add_argument("--df", metavar="GiB", type=float, default=0, help="ensure \033[33mGiB\033[0m free disk space by rejecting upload requests")
+    ap2.add_argument("--df", metavar="GiB", type=u, default="0", help="ensure \033[33mGiB\033[0m free disk space by rejecting upload requests; assumes gigabytes unless a unit suffix is given: [\033[32m256m\033[0m], [\033[32m4\033[0m], [\033[32m2T\033[0m] (volflag=df)")
     ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
     ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
     ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good for low-latency (same-country) connections, 4-8 for android clients, 16 for cross-atlantic (max=64)")
@@ -916,6 +936,7 @@ def add_network(ap):
     ap2.add_argument("--freebind", action="store_true", help="allow listening on IPs which do not yet exist, for example if the network interfaces haven't finished going up. Only makes sense for IPs other than '0.0.0.0', '127.0.0.1', '::', and '::1'. May require running as root (unless net.ipv6.ip_nonlocal_bind)")
     ap2.add_argument("--s-thead", metavar="SEC", type=int, default=120, help="socket timeout (read request header)")
     ap2.add_argument("--s-tbody", metavar="SEC", type=float, default=186, help="socket timeout (read/write request/response bodies). Use 60 on fast servers (default is extremely safe). Disable with 0 if reverse-proxied for a 2%% speed boost")
+    ap2.add_argument("--s-rd-sz", metavar="B", type=int, default=256*1024, help="socket read size in bytes (indirectly affects filesystem writes; recommendation: keep equal-to or lower-than \033[33m--iobuf\033[0m)")
     ap2.add_argument("--s-wr-sz", metavar="B", type=int, default=256*1024, help="socket write size in bytes")
     ap2.add_argument("--s-wr-slp", metavar="SEC", type=float, default=0, help="debug: socket write delay in seconds")
     ap2.add_argument("--rsp-slp", metavar="SEC", type=float, default=0, help="debug: response delay in seconds")
@@ -958,6 +979,8 @@ def add_auth(ap):
     ap2.add_argument("--idp-h-grp", metavar="HN", type=u, default="", help="assume the request-header \033[33mHN\033[0m contains the groupname of the requesting user; can be referenced in config files for group-based access control")
     ap2.add_argument("--idp-h-key", metavar="HN", type=u, default="", help="optional but recommended safeguard; your reverse-proxy will insert a secret header named \033[33mHN\033[0m into all requests, and the other IdP headers will be ignored if this header is not present")
     ap2.add_argument("--idp-gsep", metavar="RE", type=u, default="|:;+,", help="if there are multiple groups in \033[33m--idp-h-grp\033[0m, they are separated by one of the characters in \033[33mRE\033[0m")
+    ap2.add_argument("--no-bauth", action="store_true", help="disable basic-authentication support; do not accept passwords from the 'Authenticate' header at all. NOTE: This breaks support for the android app")
+    ap2.add_argument("--bauth-last", action="store_true", help="keeps basic-authentication enabled, but only as a last-resort; if a cookie is also provided then the cookie wins")
 def add_zeroconf(ap):
@@ -1097,6 +1120,8 @@ def add_optouts(ap):
     ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
     ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
     ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
+    ap2.add_argument("--no-pipe", action="store_true", help="disable race-the-beam (lockstep download of files which are currently being uploaded) (volflag=nopipe)")
+    ap2.add_argument("--no-db-ip", action="store_true", help="do not write uploader IPs into the database")
 def add_safety(ap):
@@ -1129,13 +1154,14 @@ def add_safety(ap):
     ap2.add_argument("--acam", metavar="V[,V]", type=u, default="GET,HEAD", help="Access-Control-Allow-Methods; list of methods to accept from offsite ('*' behaves like \033[33m--acao\033[0m's description)")
-def add_salt(ap, fk_salt, ah_salt):
+def add_salt(ap, fk_salt, dk_salt, ah_salt):
     ap2 = ap.add_argument_group('salting options')
     ap2.add_argument("--ah-alg", metavar="ALG", type=u, default="none", help="account-pw hashing algorithm; one of these, best to worst: \033[32margon2 scrypt sha2 none\033[0m (each optionally followed by alg-specific comma-sep. config)")
     ap2.add_argument("--ah-salt", metavar="SALT", type=u, default=ah_salt, help="account-pw salt; ignored if \033[33m--ah-alg\033[0m is none (default)")
     ap2.add_argument("--ah-gen", metavar="PW", type=u, default="", help="generate hashed password for \033[33mPW\033[0m, or read passwords from STDIN if \033[33mPW\033[0m is [\033[32m-\033[0m]")
     ap2.add_argument("--ah-cli", action="store_true", help="launch an interactive shell which hashes passwords without ever storing or displaying the original passwords")
     ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files")
+    ap2.add_argument("--dk-salt", metavar="SALT", type=u, default=dk_salt, help="per-directory accesskey salt; used to generate unpredictable URLs to share folders with users who only have the 'get' permission")
     ap2.add_argument("--warksalt", metavar="SALT", type=u, default="hunter2", help="up2k file-hash salt; serves no purpose, no reason to change this (but delete all databases if you do)")
@@ -1196,11 +1222,14 @@ def add_thumbnail(ap):
     ap2.add_argument("--th-r-vips", metavar="T,T", type=u, default="avif,exr,fit,fits,fts,gif,hdr,heic,jp2,jpeg,jpg,jpx,jxl,nii,pfm,pgm,png,ppm,svg,tif,tiff,webp", help="image formats to decode using pyvips")
     ap2.add_argument("--th-r-ffi", metavar="T,T", type=u, default="apng,avif,avifs,bmp,dds,dib,fit,fits,fts,gif,hdr,heic,heics,heif,heifs,icns,ico,jp2,jpeg,jpg,jpx,jxl,pbm,pcx,pfm,pgm,png,pnm,ppm,psd,qoi,sgi,tga,tif,tiff,webp,xbm,xpm", help="image formats to decode using ffmpeg")
     ap2.add_argument("--th-r-ffv", metavar="T,T", type=u, default="3gp,asf,av1,avc,avi,flv,h264,h265,hevc,m4v,mjpeg,mjpg,mkv,mov,mp4,mpeg,mpeg2,mpegts,mpg,mpg2,mts,nut,ogm,ogv,rm,ts,vob,webm,wmv", help="video formats to decode using ffmpeg")
-    ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,m4a,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3m,tak,tta,ulaw,wav,wma,wv,xm,xpk", help="audio formats to decode using ffmpeg")
+    ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,itgz,itxz,itz,m4a,mdgz,mdxz,mdz,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3m,s3gz,s3xz,s3z,tak,tta,ulaw,wav,wma,wv,xm,xmgz,xmxz,xmz,xpk", help="audio formats to decode using ffmpeg")
+    ap2.add_argument("--au-unpk", metavar="E=F.C", type=u, default="mdz=mod.zip, mdgz=mod.gz, mdxz=mod.xz, s3z=s3m.zip, s3gz=s3m.gz, s3xz=s3m.xz, xmz=xm.zip, xmgz=xm.gz, xmxz=xm.xz, itz=it.zip, itgz=it.gz, itxz=it.xz", help="audio formats to decompress before passing to ffmpeg")
 def add_transcoding(ap):
     ap2 = ap.add_argument_group('transcoding options')
+    ap2.add_argument("--q-opus", metavar="KBPS", type=int, default=128, help="target bitrate for transcoding to opus; set 0 to disable")
+    ap2.add_argument("--q-mp3", metavar="QUALITY", type=u, default="q2", help="target quality for transcoding to mp3, for example [\033[32m192k\033[0m] (CBR) or [\033[32mq0\033[0m] (CQ/CRF, q0=maxquality, q9=smallest); set 0 to disable")
     ap2.add_argument("--no-acode", action="store_true", help="disable audio transcoding")
     ap2.add_argument("--no-bacode", action="store_true", help="disable batch audio transcoding by folder download (zip/tar)")
     ap2.add_argument("--ac-maxage", metavar="SEC", type=int, default=86400, help="delete cached transcode output after \033[33mSEC\033[0m seconds")
@@ -1219,7 +1248,7 @@ def add_db_general(ap, hcores):
     ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
     ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
     ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
-    ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
+    ap2.add_argument("--re-dhash", action="store_true", help="force a cache rebuild on startup; enable this once if it gets out of sync (should never be necessary)")
     ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice -- only useful for offloading uploads to a cloud service or something (volflag=noforget)")
     ap2.add_argument("--dbd", metavar="PROFILE", default="wal", help="database durability profile; sets the tradeoff between robustness and speed, see \033[33m--help-dbd\033[0m (volflag=dbd)")
     ap2.add_argument("--xlink", action="store_true", help="on upload: check all volumes for dupes, not just the target volume (volflag=xlink)")
@@ -1257,19 +1286,38 @@ def add_txt(ap):
     ap2.add_argument("--exp-lg", metavar="V,V,V", type=u, default=DEF_EXP, help="comma/space-separated list of placeholders to expand in prologue/epilogue files (volflag=exp_lg)")
+def add_og(ap):
+    ap2 = ap.add_argument_group('og / open graph / discord-embed options')
+    ap2.add_argument("--og", action="store_true", help="disable hotlinking and return an html document instead; this is required by open-graph, but can also be useful on its own (volflag=og)")
+    ap2.add_argument("--og-ua", metavar="RE", type=u, default="", help="only disable hotlinking / engage OG behavior if the useragent matches regex \033[33mRE\033[0m (volflag=og_ua)")
+    ap2.add_argument("--og-tpl", metavar="PATH", type=u, default="", help="do not return the regular copyparty html, but instead load the jinja2 template at \033[33mPATH\033[0m (if path contains 'EXT' then EXT will be replaced with the requested file's extension) (volflag=og_tpl)")
+    ap2.add_argument("--og-no-head", action="store_true", help="do not automatically add OG entries into <head> (useful if you're doing this yourself in a template or such) (volflag=og_no_head)")
+    ap2.add_argument("--og-th", metavar="FMT", type=u, default="jf3", help="thumbnail format; j=jpeg, jf=jpeg-uncropped, jf3=jpeg-uncropped-large, w=webm, ... (volflag=og_th)")
+    ap2.add_argument("--og-title", metavar="TXT", type=u, default="", help="fallback title if there is nothing in the \033[33m-e2t\033[0m database (volflag=og_title)")
+    ap2.add_argument("--og-title-a", metavar="T", type=u, default="🎵 {{ artist }} - {{ title }}", help="audio title format; takes any metadata key (volflag=og_title_a)")
+    ap2.add_argument("--og-title-v", metavar="T", type=u, default="{{ title }}", help="video title format; takes any metadata key (volflag=og_title_v)")
+    ap2.add_argument("--og-title-i", metavar="T", type=u, default="{{ title }}", help="image title format; takes any metadata key (volflag=og_title_i)")
+    ap2.add_argument("--og-s-title", action="store_true", help="force default title; do not read from tags (volflag=og_s_title)")
+    ap2.add_argument("--og-desc", metavar="TXT", type=u, default="", help="description text; same for all files, disable with [\033[32m-\033[0m] (volflag=og_desc)")
+    ap2.add_argument("--og-site", metavar="TXT", type=u, default="", help="sitename; defaults to \033[33m--name\033[0m, disable with [\033[32m-\033[0m] (volflag=og_site)")
+    ap2.add_argument("--tcolor", metavar="RGB", type=u, default="333", help="accent color (3 or 6 hex digits); may also affect safari and/or android-chrome (volflag=tcolor)")
+    ap2.add_argument("--uqe", action="store_true", help="query-string parceling; translate a request for \033[33m/foo/.uqe/BASE64\033[0m into \033[33m/foo?TEXT\033[0m, or \033[33m/foo/?TEXT\033[0m if the first character in \033[33mTEXT\033[0m is a slash. Automatically enabled for \033[33m--og\033[0m")
 def add_ui(ap, retry):
     ap2 = ap.add_argument_group('ui options')
     ap2.add_argument("--grid", action="store_true", help="show grid/thumbnails by default (volflag=grid)")
     ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language; one of the following: \033[32meng nor\033[0m")
     ap2.add_argument("--theme", metavar="NUM", type=int, default=0, help="default theme to use (0..7)")
     ap2.add_argument("--themes", metavar="NUM", type=int, default=8, help="number of themes installed")
+    ap2.add_argument("--au-vol", metavar="0-100", type=int, default=50, choices=range(0, 101), help="default audio/video volume percent")
     ap2.add_argument("--sort", metavar="C,C,C", type=u, default="href", help="default sort order, comma-separated column IDs (see header tooltips), prefix with '-' for descending. Examples: \033[32mhref -href ext sz ts tags/Album tags/.tn\033[0m (volflag=sort)")
     ap2.add_argument("--unlist", metavar="REGEX", type=u, default="", help="don't show files matching \033[33mREGEX\033[0m in file list. Purely cosmetic! Does not affect API calls, just the browser. Example: [\033[32m\\.(js|css)$\033[0m] (volflag=unlist)")
     ap2.add_argument("--favico", metavar="TXT", type=u, default="c 000 none" if retry else "🎉 000 none", help="\033[33mfavicon-text\033[0m [ \033[33mforeground\033[0m [ \033[33mbackground\033[0m ] ], set blank to disable")
     ap2.add_argument("--mpmc", metavar="URL", type=u, default="", help="change the mediaplayer-toggle mouse cursor; URL to a folder with {2..5}.png inside (or disable with [\033[32m.\033[0m])")
     ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
     ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
-    ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
+    ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages; can be @PATH to send the contents of a file at PATH, and/or begin with %% to render as jinja2 template (volflag=html_head)")
     ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI, or add ?v to URL for override)")
     ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
     ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
@@ -1288,6 +1336,8 @@ def add_debug(ap):
     ap2 = ap.add_argument_group('debug options')
     ap2.add_argument("--vc", action="store_true", help="verbose config file parser (explain config)")
     ap2.add_argument("--cgen", action="store_true", help="generate config file from current config (best-effort; probably buggy)")
+    if hasattr(select, "poll"):
+        ap2.add_argument("--no-poll", action="store_true", help="kernel-bug workaround: disable poll; use select instead (limits max num clients to ~700)")
     ap2.add_argument("--no-sendfile", action="store_true", help="kernel-bug workaround: disable sendfile; do a safe and slow read-send-loop instead")
     ap2.add_argument("--no-scandir", action="store_true", help="kernel-bug workaround: disable scandir; do a listdir + stat on each file instead")
     ap2.add_argument("--no-fastboot", action="store_true", help="wait for initial filesystem indexing before accepting client requests")
@@ -1317,6 +1367,7 @@ def run_argparse(
     cert_path = os.path.join(E.cfg, "cert.pem")
     fk_salt = get_fk_salt()
+    dk_salt = get_dk_salt()
     ah_salt = get_ah_salt()
     # alpine peaks at 5 threads for some reason,
@@ -1348,7 +1399,7 @@ def run_argparse(
     add_tftp(ap)
     add_smb(ap)
     add_safety(ap)
-    add_salt(ap, fk_salt, ah_salt)
+    add_salt(ap, fk_salt, dk_salt, ah_salt)
     add_optouts(ap)
     add_shutdown(ap)
     add_yolo(ap)
@@ -1356,6 +1407,7 @@ def run_argparse(
     add_hooks(ap)
     add_stats(ap)
     add_txt(ap)
+    add_og(ap)
     add_ui(ap, retry)
     add_admin(ap)
     add_logging(ap)
@@ -1384,18 +1436,22 @@ def run_argparse(
         k2 = "help_" + k.replace("-", "_")
         if vars(ret)[k2]:
             lprint("# %s help page (%s)" % (k, h))
-            lprint(t + "\033[0m")
+            lprint(t.rstrip() + "\033[0m")
             sys.exit(0)
     return ret
-def main(argv: Optional[list[str]] = None) -> None:
+def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
     time.strptime("19970815", "%Y%m%d")  # python#7980
     if WINDOWS:
         os.system("rem")  # enables colors
     init_E(E)
+    if rsrc:  # pyz
+        E.mod = rsrc
     if argv is None:
         argv = sys.argv
@@ -1419,9 +1475,19 @@ def main(argv: Optional[list[str]] = None) -> None:
         showlic()
         sys.exit(0)
+    if "--mimes" in argv:
+        print("\n".join("%8s %s" % (k, v) for k, v in sorted(MIMES.items())))
+        sys.exit(0)
     if EXE:
         print("pybin: {}\n".format(pybin), end="")
+    for n, zs in enumerate(argv):
+        if zs.startswith("--sfx-tpoke="):
+            Daemon(sfx_tpoke, "sfx-tpoke", (zs.split("=", 1)[1],))
+            argv.pop(n)
+            break
     ensure_locale()
     ensure_webdeps()
@@ -1482,7 +1548,7 @@ def main(argv: Optional[list[str]] = None) -> None:
         if hard > 0:  # -1 == infinite
             nc = min(nc, int(hard / 4))
     except:
-        nc = 512
+        nc = 486  # mdns/ssdp restart headroom; select() maxfd is 512 on windows
     retry = False
     for fmtr in [RiceFormatter, RiceFormatter, Dodge11874, BasicDodge11874]:
@@ -1575,6 +1641,9 @@ def main(argv: Optional[list[str]] = None) -> None:
     if not hasattr(os, "sendfile"):
         al.no_sendfile = True
+    if not hasattr(select, "poll"):
+        al.no_poll = True
     # signal.signal(signal.SIGINT, sighandler)
     SvcHub(al, dal, argv, "".join(printed)).run()

View File

@@ -1,8 +1,8 @@
 # coding: utf-8
-VERSION = (1, 11, 1)
-CODENAME = "You Can (Not) Proceed"
-BUILD_DT = (2024, 3, 18)
+VERSION = (1, 13, 3)
+CODENAME = "race the beam"
+BUILD_DT = (2024, 6, 1)
 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -17,7 +17,9 @@ from .bos import bos
 from .cfg import flagdescs, permdescs, vf_bmap, vf_cmap, vf_vmap
 from .pwhash import PWHash
 from .util import (
+    EXTS,
     IMPLICATIONS,
+    MIMES,
     SQLITE_VER,
     UNPLICATIONS,
     UTC,
@@ -555,7 +557,12 @@ class VFS(object):
         # no vfs nodes in the list of real inodes
         real = [x for x in real if x[0] not in self.nodes]
+        dbv = self.dbv or self
         for name, vn2 in sorted(self.nodes.items()):
+            if vn2.dbv == dbv and self.flags.get("dk"):
+                virt_vis[name] = vn2
+                continue
             ok = False
             zx = vn2.axs
             axs = [zx.uread, zx.uwrite, zx.umove, zx.udel, zx.uget]
@@ -1224,7 +1231,9 @@ class AuthSrv(object):
         if un.startswith("@"):
             grp = un[1:]
             uns = [x[0] for x in un_gns.items() if grp in x[1]]
-            if not uns and grp != "${g}" and not self.args.idp_h_grp:
+            if grp == "${g}":
+                unames.append(un)
+            elif not uns and not self.args.idp_h_grp:
                 t = "group [%s] must be defined with --grp argument (or in a [groups] config section)"
                 raise CfgEx(t % (grp,))
@@ -1234,31 +1243,28 @@ class AuthSrv(object):
         # unames may still contain ${u} and ${g} so now expand those;
         un_gn = [(un, gn) for un, gns in un_gns.items() for gn in gns]
-        if "*" not in un_gns:
-            # need ("*","") to match "*" in unames
-            un_gn.append(("*", ""))
-        for _, dst, vu, vg in vols:
-            unames2 = set()
-            for un, gn in un_gn:
-                # if vu/vg (volume user/group) is non-null,
-                # then each non-null value corresponds to
-                # ${u}/${g}; consider this a filter to
-                # apply to unames, as well as un_gn
-                if (vu and vu != un) or (vg and vg != gn):
-                    continue
-                for uname in unames + ([un] if vu or vg else []):
-                    if uname == "${u}":
-                        uname = vu or un
-                    elif uname in ("${g}", "@${g}"):
-                        uname = vg or gn
-                    if vu and vu != uname:
-                        continue
-                    if uname:
-                        unames2.add(uname)
+        for src, dst, vu, vg in vols:
+            unames2 = set(unames)
+            if "${u}" in unames:
+                if not vu:
+                    t = "cannot use ${u} in accs of volume [%s] because the volume url does not contain ${u}"
+                    raise CfgEx(t % (src,))
+                unames2.add(vu)
+            if "@${g}" in unames:
+                if not vg:
+                    t = "cannot use @${g} in accs of volume [%s] because the volume url does not contain @${g}"
+                    raise CfgEx(t % (src,))
+                unames2.update([un for un, gn in un_gn if gn == vg])
+            if "${g}" in unames:
+                t = 'the accs of volume [%s] contains "${g}" but the only supported way of specifying that is "@${g}"'
+                raise CfgEx(t % (src,))
+            unames2.discard("${u}")
+            unames2.discard("@${g}")
             self._read_vol_str(lvl, list(unames2), axs[dst])
@@ -1438,6 +1444,7 @@ class AuthSrv(object):
         elif "" not in mount:
             # there's volumes but no root; make root inaccessible
             vfs = VFS(self.log_func, "", "", AXS(), {})
+            vfs.flags["tcolor"] = self.args.tcolor
             vfs.flags["d2d"] = True
         maxdepth = 0
@@ -1610,11 +1617,14 @@ class AuthSrv(object):
             use = True
             lim.nosub = True
-        zs = vol.flags.get("df") or (
-            "{}g".format(self.args.df) if self.args.df else ""
-        )
-        if zs:
+        zs = vol.flags.get("df") or self.args.df or ""
+        if zs not in ("", "0"):
             use = True
+            try:
+                _ = float(zs)
+                zs = "%sg" % (zs)
+            except:
+                pass
             lim.dfl = unhumanize(zs)
         zs = vol.flags.get("sz")
@@ -1682,6 +1692,20 @@ class AuthSrv(object):
             vol.flags["fk"] = int(fk) if fk is not True else 8
             have_fk = True
+        dk = vol.flags.get("dk")
+        dks = vol.flags.get("dks")
+        dky = vol.flags.get("dky")
+        if dks is not None and dky is not None:
+            t = "WARNING: volume /%s has both dks and dky enabled; this is too yolo and not permitted"
+            raise Exception(t % (vol.vpath,))
+        if dks and not dk:
+            dk = dks
+        if dky and not dk:
+            dk = dky
+        if dk:
+            vol.flags["dk"] = int(dk) if dk is not True else 8
         if have_fk and re.match(r"^[0-9\.]+$", self.args.fk_salt):
             self.log("filekey salt: {}".format(self.args.fk_salt))
@@ -1709,7 +1733,11 @@ class AuthSrv(object):
         if self.args.e2d or "e2ds" in vol.flags:
             vol.flags["e2d"] = True
-        for ga, vf in [["no_hash", "nohash"], ["no_idx", "noidx"]]:
+        for ga, vf in [
+            ["no_hash", "nohash"],
+            ["no_idx", "noidx"],
+            ["og_ua", "og_ua"],
+        ]:
             if vf in vol.flags:
                 ptn = re.compile(vol.flags.pop(vf))
             else:
@@ -1746,13 +1774,21 @@ class AuthSrv(object):
             if k in vol.flags:
                 vol.flags[k] = float(vol.flags[k])
-        try:
-            zs1, zs2 = vol.flags["rm_retry"].split("/")
-            vol.flags["rm_re_t"] = float(zs1)
-            vol.flags["rm_re_r"] = float(zs2)
-        except:
-            t = 'volume "/%s" has invalid rm_retry [%s]'
-            raise Exception(t % (vol.vpath, vol.flags.get("rm_retry")))
+        for k in ("mv_re", "rm_re"):
+            try:
+                zs1, zs2 = vol.flags[k + "try"].split("/")
+                vol.flags[k + "_t"] = float(zs1)
+                vol.flags[k + "_r"] = float(zs2)
+            except:
+                t = 'volume "/%s" has invalid %stry [%s]'
+                raise Exception(t % (vol.vpath, k, vol.flags.get(k + "try")))
+        if vol.flags.get("og"):
+            self.args.uqe = True
+        zs = str(vol.flags.get("tcolor", "")).lstrip("#")
+        if len(zs) == 3:  # fc5 => ffcc55
+            vol.flags["tcolor"] = "".join([x * 2 for x in zs])
         for k1, k2 in IMPLICATIONS:
             if k1 in vol.flags:
@@ -2034,6 +2070,13 @@ class AuthSrv(object):
         self.re_pwd = re.compile(zs)
+        # to ensure it propagates into tcpsrv with mp on
+        if self.args.mime:
+            for zs in self.args.mime:
+                ext, mime = zs.split("=", 1)
+                MIMES[ext] = mime
+        EXTS.update({v: k for k, v in MIMES.items()})
     def setup_pwhash(self, acct: dict[str, str]) -> None:
         self.ah = PWHash(self.args)
         if not self.ah.on:
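The MIMES/EXTS block in the hunk above is the server-side half of the new --mime / --mimes flags from __main__.py: each EXT=MIME pair is split on the first '=' and merged into the global MIMES table, and EXTS is refreshed as the reverse (mimetype to extension) lookup. A small self-contained sketch of the same parsing, using made-up example mappings rather than copyparty's real tables:

# hedged sketch; MIMES/EXTS here stand in for the dicts imported from .util above
MIMES = {"jpg": "image/jpeg"}
args_mime = ["log=text/plain", "jxl=image/jxl"]  # as if passed via --mime

for zs in args_mime:
    ext, mime = zs.split("=", 1)  # split on the first '=' only
    MIMES[ext] = mime

EXTS = {v: k for k, v in MIMES.items()}
print(EXTS["text/plain"])  # -> log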
@@ -2391,7 +2434,7 @@ def expand_config_file(
     if not cnames:
         t = "warning: tried to read config-files from folder '%s' but it does not contain any "
         if names:
-            t += ".conf files; the following files were ignored: %s"
+            t += ".conf files; the following files/subfolders were ignored: %s"
             t = t % (fp, ", ".join(names[:8]))
         else:
             t += "files at all"

View File

@@ -57,11 +57,8 @@ class BrokerMp(object):
     def shutdown(self) -> None:
         self.log("broker", "shutting down")
         for n, proc in enumerate(self.procs):
-            thr = threading.Thread(
-                target=proc.q_pend.put((0, "shutdown", [])),
-                name="mp-shutdown-{}-{}".format(n, len(self.procs)),
-            )
-            thr.start()
+            name = "mp-shut-%d-%d" % (n, len(self.procs))
+            Daemon(proc.q_pend.put, name, ((0, "shutdown", []),))
         with self.mutex:
             procs = self.procs

View File

@@ -6,7 +6,8 @@ import os
 import shutil
 import time
-from .util import Netdev, runcmd
+from .__init__ import ANYWIN
+from .util import Netdev, runcmd, wrename, wunlink
 HAVE_CFSSL = True
@@ -14,6 +15,12 @@ if True:  # pylint: disable=using-constant-test
     from .util import RootLogger
+if ANYWIN:
+    VF = {"mv_re_t": 5, "rm_re_t": 5, "mv_re_r": 0.1, "rm_re_r": 0.1}
+else:
+    VF = {"mv_re_t": 0, "rm_re_t": 0}
 def ensure_cert(log: "RootLogger", args) -> None:
     """
     the default cert (and the entire TLS support) is only here to enable the
@@ -105,8 +112,12 @@ def _gen_ca(log: "RootLogger", args):
         raise Exception("failed to translate ca-cert: {}, {}".format(rc, se), 3)
     bname = os.path.join(args.crt_dir, "ca")
-    os.rename(bname + "-key.pem", bname + ".key")
-    os.unlink(bname + ".csr")
+    try:
+        wunlink(log, bname + ".key", VF)
+    except:
+        pass
+    wrename(log, bname + "-key.pem", bname + ".key", VF)
+    wunlink(log, bname + ".csr", VF)
     log("cert", "new ca OK", 2)
@@ -185,11 +196,11 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
     bname = os.path.join(args.crt_dir, "srv")
     try:
-        os.unlink(bname + ".key")
+        wunlink(log, bname + ".key", VF)
     except:
         pass
-    os.rename(bname + "-key.pem", bname + ".key")
-    os.unlink(bname + ".csr")
+    wrename(log, bname + "-key.pem", bname + ".key", VF)
+    wunlink(log, bname + ".csr", VF)
     with open(os.path.join(args.crt_dir, "ca.pem"), "rb") as f:
         ca = f.read()

View File

@@ -16,6 +16,7 @@ def vf_bmap() -> dict[str, str]:
         "no_dedup": "copydupes",
         "no_dupe": "nodupe",
         "no_forget": "noforget",
+        "no_pipe": "nopipe",
         "no_robots": "norobots",
         "no_thumb": "dthumb",
         "no_vthumb": "dvthumb",
@@ -38,6 +39,9 @@ def vf_bmap() -> dict[str, str]:
         "magic",
         "no_sb_md",
         "no_sb_lg",
+        "og",
+        "og_no_head",
+        "og_s_title",
         "rand",
         "xdev",
         "xlink",
@@ -60,11 +64,23 @@ def vf_vmap() -> dict[str, str]:
     }
     for k in (
         "dbd",
+        "html_head",
         "lg_sbf",
         "md_sbf",
         "nrand",
+        "og_desc",
+        "og_site",
+        "og_th",
+        "og_title",
+        "og_title_a",
+        "og_title_v",
+        "og_title_i",
+        "og_tpl",
+        "og_ua",
+        "mv_retry",
         "rm_retry",
         "sort",
+        "tcolor",
        "unlist",
         "u2abort",
         "u2ts",
@@ -79,7 +95,6 @@ def vf_cmap() -> dict[str, str]:
     for k in (
         "exp_lg",
         "exp_md",
-        "html_head",
         "mte",
         "mth",
         "mtp",
@@ -175,6 +190,7 @@ flagcats = {
     "dvthumb": "disables video thumbnails",
     "dathumb": "disables audio thumbnails (spectrograms)",
     "dithumb": "disables image thumbnails",
+    "pngquant": "compress audio waveforms 33% better",
     "thsize": "thumbnail res; WxH",
     "crop": "center-cropping (y/n/fy/fn)",
     "th3x": "3x resolution (y/n/fy/fn)",
@@ -199,7 +215,7 @@ flagcats = {
     "grid": "show grid/thumbnails by default",
     "sort": "default sort order",
     "unlist": "dont list files matching REGEX",
-    "html_head=TXT": "includes TXT in the <head>",
+    "html_head=TXT": "includes TXT in the <head>, or @PATH for file at PATH",
     "robots": "allows indexing by search engines (default)",
     "norobots": "kindly asks search engines to leave",
     "no_sb_md": "disable js sandbox for markdown files",
@@ -214,6 +230,7 @@ flagcats = {
     "dots": "allow all users with read-access to\nenable the option to show dotfiles in listings",
     "fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
    "fka=8": 'generates slightly weaker per-file accesskeys,\nwhich are then required at the "g" permission;\nnot affected by filesize or inode numbers',
+    "mv_retry": "ms-windows: timeout for renaming busy files",
     "rm_retry": "ms-windows: timeout for deleting busy files",
     "davauth": "ask webdav clients to login for all folders",
     "davrt": "show lastmod time of symlink destination, not the link itself\n(note: this option is always enabled for recursive listings)",

View File

@@ -1,6 +1,7 @@
 # coding: utf-8
 from __future__ import print_function, unicode_literals
+import argparse
 import os
 import re
 import time
@@ -17,20 +18,26 @@ if True:  # pylint: disable=using-constant-test
 class Fstab(object):
-    def __init__(self, log: "RootLogger"):
+    def __init__(self, log: "RootLogger", args: argparse.Namespace):
         self.log_func = log
+        self.warned = False
         self.trusted = False
         self.tab: Optional[VFS] = None
+        self.oldtab: Optional[VFS] = None
+        self.srctab = "a"
         self.cache: dict[str, str] = {}
         self.age = 0.0
+        self.maxage = args.mtab_age
     def log(self, msg: str, c: Union[int, str] = 0) -> None:
         self.log_func("fstab", msg, c)
     def get(self, path: str) -> str:
-        if len(self.cache) > 9000:
-            self.age = time.time()
+        now = time.time()
+        if now - self.age > self.maxage or len(self.cache) > 9000:
+            self.age = now
+            self.oldtab = self.tab or self.oldtab
             self.tab = None
             self.cache = {}
@@ -75,7 +82,7 @@ class Fstab(object):
         self.trusted = False
     def build_tab(self) -> None:
-        self.log("building tab")
+        self.log("inspecting mtab for changes")
         sptn = r"^.*? on (.*) type ([^ ]+) \(.*"
         if MACOS:
@@ -84,6 +91,7 @@ class Fstab(object):
         ptn = re.compile(sptn)
         so, _ = chkcmd(["mount"])
         tab1: list[tuple[str, str]] = []
+        atab = []
         for ln in so.split("\n"):
             m = ptn.match(ln)
             if not m:
@@ -91,6 +99,15 @@ class Fstab(object):
             zs1, zs2 = m.groups()
             tab1.append((str(zs1), str(zs2)))
+            atab.append(ln)
+        # keep empirically-correct values if mounttab unchanged
+        srctab = "\n".join(sorted(atab))
+        if srctab == self.srctab:
+            self.tab = self.oldtab
+            return
+        self.log("mtab has changed; reevaluating support for sparse files")
         tab1.sort(key=lambda x: (len(x[0]), x[0]))
         path1, fs1 = tab1[0]
@@ -99,6 +116,7 @@ class Fstab(object):
             tab.add(fs, path.lstrip("/"))
         self.tab = tab
+        self.srctab = srctab
     def relabel(self, path: str, nval: str) -> None:
         assert self.tab
@@ -133,7 +151,9 @@ class Fstab(object):
             self.trusted = True
         except:
             # prisonparty or other restrictive environment
-            self.log("failed to build tab:\n{}".format(min_ex()), 3)
+            if not self.warned:
+                self.warned = True
+                self.log("failed to build tab:\n{}".format(min_ex()), 3)
             self.build_fallback()
         assert self.tab

View File

@@ -218,7 +218,7 @@ class FtpFs(AbstractedFS):
             raise FSE("Cannot open existing file for writing")
         self.validpath(ap)
-        return open(fsenc(ap), mode)
+        return open(fsenc(ap), mode, self.args.iobuf)
     def chdir(self, path: str) -> None:
         nwd = join(self.cwd, path)

File diff suppressed because it is too large

View File

@@ -55,6 +55,7 @@ class HttpConn(object):
         self.E: EnvParams = self.args.E
         self.asrv: AuthSrv = hsrv.asrv  # mypy404
         self.u2fh: Util.FHC = hsrv.u2fh  # mypy404
+        self.pipes: Util.CachedDict = hsrv.pipes  # mypy404
         self.ipa_nm: Optional[NetMap] = hsrv.ipa_nm
         self.xff_nm: Optional[NetMap] = hsrv.xff_nm
         self.xff_lan: NetMap = hsrv.xff_lan  # type: ignore

View File

@@ -61,6 +61,7 @@ from .u2idx import U2idx
 from .util import (
     E_SCK,
     FHC,
+    CachedDict,
     Daemon,
     Garda,
     Magician,
@@ -130,6 +131,7 @@ class HttpSrv(object):
         self.t_periodic: Optional[threading.Thread] = None
         self.u2fh = FHC()
+        self.pipes = CachedDict(0.2)
         self.metrics = Metrics(self)
         self.nreq = 0
         self.nsus = 0
@@ -264,10 +266,7 @@ class HttpSrv(object):
         msg = "subscribed @ {}:{} f{} p{}".format(hip, port, fno, os.getpid())
         self.log(self.name, msg)
-        def fun() -> None:
-            self.broker.say("cb_httpsrv_up")
-        threading.Thread(target=fun, name="sig-hsrv-up1").start()
+        Daemon(self.broker.say, "sig-hsrv-up1", ("cb_httpsrv_up",))
         while not self.stopping:
             if self.args.log_conn:

View File

@@ -292,6 +292,22 @@ class MDNS(MCast):
def run2(self) -> None: def run2(self) -> None:
last_hop = time.time() last_hop = time.time()
ihop = self.args.mc_hop ihop = self.args.mc_hop
try:
if self.args.no_poll:
raise Exception()
fd2sck = {}
srvpoll = select.poll()
for sck in self.srv:
fd = sck.fileno()
fd2sck[fd] = sck
srvpoll.register(fd, select.POLLIN)
except Exception as ex:
srvpoll = None
if not self.args.no_poll:
t = "WARNING: failed to poll(), will use select() instead: %r"
self.log(t % (ex,), 3)
while self.running: while self.running:
timeout = ( timeout = (
0.02 + random.random() * 0.07 0.02 + random.random() * 0.07
@@ -300,8 +316,13 @@ class MDNS(MCast):
if self.unsolicited if self.unsolicited
else (last_hop + ihop if ihop else 180) else (last_hop + ihop if ihop else 180)
) )
rdy = select.select(self.srv, [], [], timeout) if srvpoll:
rx: list[socket.socket] = rdy[0] # type: ignore pr = srvpoll.poll(timeout * 1000)
rx = [fd2sck[x[0]] for x in pr if x[1] & select.POLLIN]
else:
rdy = select.select(self.srv, [], [], timeout)
rx: list[socket.socket] = rdy[0] # type: ignore
self.rx4.cln() self.rx4.cln()
self.rx6.cln() self.rx6.cln()
buf = b"" buf = b""


@@ -179,7 +179,7 @@ class Metrics(object):
tnbytes = 0 tnbytes = 0
tnfiles = 0 tnfiles = 0
for vpath, vol in allvols: for vpath, vol in allvols:
-cur = idx.get_cur(vol.realpath)
+cur = idx.get_cur(vol)
if not cur: if not cur:
continue continue


@@ -7,12 +7,15 @@ import os
import shutil import shutil
import subprocess as sp import subprocess as sp
import sys import sys
import tempfile
from .__init__ import ANYWIN, EXE, PY2, WINDOWS, E, unicode from .__init__ import ANYWIN, EXE, PY2, WINDOWS, E, unicode
from .authsrv import VFS
from .bos import bos from .bos import bos
from .util import ( from .util import (
FFMPEG_URL, FFMPEG_URL,
REKOBO_LKEY, REKOBO_LKEY,
VF_CAREFUL,
fsenc, fsenc,
min_ex, min_ex,
pybin, pybin,
@@ -20,12 +23,13 @@ from .util import (
runcmd, runcmd,
sfsenc, sfsenc,
uncyg, uncyg,
wunlink,
) )
if True: # pylint: disable=using-constant-test if True: # pylint: disable=using-constant-test
-from typing import Any, Union
+from typing import Any, Optional, Union
-from .util import RootLogger
+from .util import NamedLogger, RootLogger
def have_ff(scmd: str) -> bool: def have_ff(scmd: str) -> bool:
@@ -107,6 +111,53 @@ class MParser(object):
raise Exception() raise Exception()
def au_unpk(
log: "NamedLogger", fmt_map: dict[str, str], abspath: str, vn: Optional[VFS] = None
) -> str:
ret = ""
try:
ext = abspath.split(".")[-1].lower()
au, pk = fmt_map[ext].split(".")
fd, ret = tempfile.mkstemp("." + au)
if pk == "gz":
import gzip
fi = gzip.GzipFile(abspath, mode="rb")
elif pk == "xz":
import lzma
fi = lzma.open(abspath, "rb")
elif pk == "zip":
import zipfile
zf = zipfile.ZipFile(abspath, "r")
zil = zf.infolist()
zil = [x for x in zil if x.filename.lower().split(".")[-1] == au]
fi = zf.open(zil[0])
with os.fdopen(fd, "wb") as fo:
while True:
buf = fi.read(32768)
if not buf:
break
fo.write(buf)
return ret
except Exception as ex:
if ret:
t = "failed to decompress audio file [%s]: %r"
log(t % (abspath, ex))
wunlink(log, ret, vn.flags if vn else VF_CAREFUL)
return abspath
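The gzip branch of au_unpk is the simplest of the three; a self-contained sketch of just that path, with `gunzip_to_temp` as a hypothetical helper name:

import gzip
import os
import tempfile

def gunzip_to_temp(abspath: str, ext: str) -> str:
    # decompress e.g. "song.mdgz" into a throwaway "/tmp/xxxx.mod"
    fd, tmp = tempfile.mkstemp("." + ext)
    with gzip.GzipFile(abspath, mode="rb") as fi, os.fdopen(fd, "wb") as fo:
        while True:
            buf = fi.read(32768)
            if not buf:
                break
            fo.write(buf)
    return tmp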
def ffprobe( def ffprobe(
abspath: str, timeout: int = 60 abspath: str, timeout: int = 60
) -> tuple[dict[str, tuple[int, Any]], dict[str, list[Any]]]: ) -> tuple[dict[str, tuple[int, Any]], dict[str, list[Any]]]:
@@ -281,7 +332,7 @@ class MTag(object):
or_ffprobe = " or FFprobe" or_ffprobe = " or FFprobe"
if self.backend == "mutagen": if self.backend == "mutagen":
-self.get = self.get_mutagen
+self._get = self.get_mutagen
try: try:
from mutagen import version # noqa: F401 from mutagen import version # noqa: F401
except: except:
@@ -290,7 +341,7 @@ class MTag(object):
if self.backend == "ffprobe": if self.backend == "ffprobe":
self.usable = self.can_ffprobe self.usable = self.can_ffprobe
-self.get = self.get_ffprobe
+self._get = self.get_ffprobe
self.prefer_mt = True self.prefer_mt = True
if not HAVE_FFPROBE: if not HAVE_FFPROBE:
@@ -460,6 +511,17 @@ class MTag(object):
return r1 return r1
def get(self, abspath: str) -> dict[str, Union[str, float]]:
ext = abspath.split(".")[-1].lower()
if ext not in self.args.au_unpk:
return self._get(abspath)
ap = au_unpk(self.log, self.args.au_unpk, abspath)
ret = self._get(ap)
if ap != abspath:
wunlink(self.log, ap, VF_CAREFUL)
return ret
def get_mutagen(self, abspath: str) -> dict[str, Union[str, float]]: def get_mutagen(self, abspath: str) -> dict[str, Union[str, float]]:
ret: dict[str, tuple[int, Any]] = {} ret: dict[str, tuple[int, Any]] = {}
@@ -551,13 +613,18 @@ class MTag(object):
pypath = str(os.pathsep.join(zsl)) pypath = str(os.pathsep.join(zsl))
env["PYTHONPATH"] = pypath env["PYTHONPATH"] = pypath
except: except:
-if not E.ox and not EXE:
-raise
+raise  # might be expected outside cpython
+ext = abspath.split(".")[-1].lower()
+if ext in self.args.au_unpk:
+ap = au_unpk(self.log, self.args.au_unpk, abspath)
+else:
+ap = abspath
ret: dict[str, Any] = {} ret: dict[str, Any] = {}
for tagname, parser in sorted(parsers.items(), key=lambda x: (x[1].pri, x[0])): for tagname, parser in sorted(parsers.items(), key=lambda x: (x[1].pri, x[0])):
try: try:
-cmd = [parser.bin, abspath]
+cmd = [parser.bin, ap]
if parser.bin.endswith(".py"): if parser.bin.endswith(".py"):
cmd = [pybin] + cmd cmd = [pybin] + cmd
@@ -594,4 +661,7 @@ class MTag(object):
t = "mtag error: tagname {}, parser {}, file {} => {}" t = "mtag error: tagname {}, parser {}, file {} => {}"
self.log(t.format(tagname, parser.bin, abspath, min_ex())) self.log(t.format(tagname, parser.bin, abspath, min_ex()))
if ap != abspath:
wunlink(self.log, ap, VF_CAREFUL)
return ret return ret


@@ -206,6 +206,7 @@ class MCast(object):
except: except:
t = "announce failed on {} [{}]:\n{}" t = "announce failed on {} [{}]:\n{}"
self.log(t.format(netdev, ip, min_ex()), 3) self.log(t.format(netdev, ip, min_ex()), 3)
sck.close()
if self.args.zm_msub: if self.args.zm_msub:
for s1 in self.srv.values(): for s1 in self.srv.values():


@@ -127,7 +127,7 @@ class SMB(object):
self.log("smb", msg, c) self.log("smb", msg, c)
def start(self) -> None: def start(self) -> None:
-Daemon(self.srv.start)
+Daemon(self.srv.start, "smbd")
def _auth_cb(self, *a, **ka): def _auth_cb(self, *a, **ka):
debug("auth-result: %s %s", a, ka) debug("auth-result: %s %s", a, ka)


@@ -141,9 +141,29 @@ class SSDPd(MCast):
self.log("stopped", 2) self.log("stopped", 2)
def run2(self) -> None: def run2(self) -> None:
try:
if self.args.no_poll:
raise Exception()
fd2sck = {}
srvpoll = select.poll()
for sck in self.srv:
fd = sck.fileno()
fd2sck[fd] = sck
srvpoll.register(fd, select.POLLIN)
except Exception as ex:
srvpoll = None
if not self.args.no_poll:
t = "WARNING: failed to poll(), will use select() instead: %r"
self.log(t % (ex,), 3)
while self.running: while self.running:
-rdy = select.select(self.srv, [], [], self.args.z_chk or 180)
-rx: list[socket.socket] = rdy[0]  # type: ignore
+if srvpoll:
+pr = srvpoll.poll((self.args.z_chk or 180) * 1000)
+rx = [fd2sck[x[0]] for x in pr if x[1] & select.POLLIN]
+else:
+rdy = select.select(self.srv, [], [], self.args.z_chk or 180)
+rx: list[socket.socket] = rdy[0]  # type: ignore
self.rxc.cln() self.rxc.cln()
buf = b"" buf = b""
addr = ("0", 0) addr = ("0", 0)


@@ -7,6 +7,7 @@ import tarfile
from queue import Queue from queue import Queue
from .authsrv import AuthSrv
from .bos import bos from .bos import bos
from .sutil import StreamArc, errdesc from .sutil import StreamArc, errdesc
from .util import Daemon, fsenc, min_ex from .util import Daemon, fsenc, min_ex
@@ -44,11 +45,12 @@ class StreamTar(StreamArc):
def __init__( def __init__(
self, self,
log: "NamedLogger", log: "NamedLogger",
asrv: AuthSrv,
fgen: Generator[dict[str, Any], None, None], fgen: Generator[dict[str, Any], None, None],
cmp: str = "", cmp: str = "",
**kwargs: Any **kwargs: Any
): ):
-super(StreamTar, self).__init__(log, fgen)
+super(StreamTar, self).__init__(log, asrv, fgen)
self.ci = 0 self.ci = 0
self.co = 0 self.co = 0
@@ -126,7 +128,7 @@ class StreamTar(StreamArc):
inf.gid = 0 inf.gid = 0
self.ci += inf.size self.ci += inf.size
with open(fsenc(src), "rb", 512 * 1024) as fo: with open(fsenc(src), "rb", self.args.iobuf) as fo:
self.tar.addfile(inf, fo) self.tar.addfile(inf, fo)
def _gen(self) -> None: def _gen(self) -> None:
@@ -146,7 +148,7 @@ class StreamTar(StreamArc):
errors.append((f["vp"], ex)) errors.append((f["vp"], ex))
if errors: if errors:
-self.errf, txt = errdesc(errors)
+self.errf, txt = errdesc(self.asrv.vfs, errors)
self.log("\n".join(([repr(self.errf)] + txt[1:]))) self.log("\n".join(([repr(self.errf)] + txt[1:])))
self.ser(self.errf) self.ser(self.errf)


@@ -6,9 +6,10 @@ import tempfile
from datetime import datetime from datetime import datetime
from .__init__ import CORES from .__init__ import CORES
from .authsrv import AuthSrv, VFS
from .bos import bos from .bos import bos
from .th_cli import ThumbCli from .th_cli import ThumbCli
-from .util import UTC, vjoin
+from .util import UTC, vjoin, vol_san
if True: # pylint: disable=using-constant-test if True: # pylint: disable=using-constant-test
from typing import Any, Generator, Optional from typing import Any, Generator, Optional
@@ -20,10 +21,13 @@ class StreamArc(object):
def __init__( def __init__(
self, self,
log: "NamedLogger", log: "NamedLogger",
asrv: AuthSrv,
fgen: Generator[dict[str, Any], None, None], fgen: Generator[dict[str, Any], None, None],
**kwargs: Any **kwargs: Any
): ):
self.log = log self.log = log
self.asrv = asrv
self.args = asrv.args
self.fgen = fgen self.fgen = fgen
self.stopped = False self.stopped = False
@@ -78,7 +82,9 @@ def enthumb(
) -> dict[str, Any]: ) -> dict[str, Any]:
rem = f["vp"] rem = f["vp"]
ext = rem.rsplit(".", 1)[-1].lower() ext = rem.rsplit(".", 1)[-1].lower()
if fmt == "opus" and ext in "aac|m4a|mp3|ogg|opus|wma".split("|"): if (fmt == "mp3" and ext == "mp3") or (
fmt == "opus" and ext in "aac|m4a|mp3|ogg|opus|wma".split("|")
):
raise Exception() raise Exception()
vp = vjoin(vtop, rem.split("/", 1)[1]) vp = vjoin(vtop, rem.split("/", 1)[1])
@@ -98,15 +104,20 @@ def enthumb(
return f return f
-def errdesc(errors: list[tuple[str, str]]) -> tuple[dict[str, Any], list[str]]:
+def errdesc(
+vfs: VFS, errors: list[tuple[str, str]]
+) -> tuple[dict[str, Any], list[str]]:
report = ["copyparty failed to add the following files to the archive:", ""] report = ["copyparty failed to add the following files to the archive:", ""]
for fn, err in errors: for fn, err in errors:
report.extend([" file: {}".format(fn), "error: {}".format(err), ""]) report.extend([" file: {}".format(fn), "error: {}".format(err), ""])
btxt = "\r\n".join(report).encode("utf-8", "replace")
btxt = vol_san(list(vfs.all_vols.values()), btxt)
with tempfile.NamedTemporaryFile(prefix="copyparty-", delete=False) as tf: with tempfile.NamedTemporaryFile(prefix="copyparty-", delete=False) as tf:
tf_path = tf.name tf_path = tf.name
tf.write("\r\n".join(report).encode("utf-8", "replace")) tf.write(btxt)
dt = datetime.now(UTC).strftime("%Y-%m%d-%H%M%S") dt = datetime.now(UTC).strftime("%Y-%m%d-%H%M%S")


@@ -28,7 +28,7 @@ if True: # pylint: disable=using-constant-test
import typing import typing
from typing import Any, Optional, Union from typing import Any, Optional, Union
-from .__init__ import ANYWIN, E, EXE, MACOS, TYPE_CHECKING, EnvParams, unicode
+from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, E, EnvParams, unicode
from .authsrv import BAD_CFG, AuthSrv from .authsrv import BAD_CFG, AuthSrv
from .cert import ensure_cert from .cert import ensure_cert
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE from .mtag import HAVE_FFMPEG, HAVE_FFPROBE
@@ -173,6 +173,26 @@ class SvcHub(object):
self.log("root", t.format(args.j), c=3) self.log("root", t.format(args.j), c=3)
args.no_fpool = True args.no_fpool = True
for name, arg in (
("iobuf", "iobuf"),
("s-rd-sz", "s_rd_sz"),
("s-wr-sz", "s_wr_sz"),
):
zi = getattr(args, arg)
if zi < 32768:
t = "WARNING: expect very poor performance because you specified a very low value (%d) for --%s"
self.log("root", t % (zi, name), 3)
zi = 2
zi2 = 2 ** (zi - 1).bit_length()
if zi != zi2:
zi3 = 2 ** ((zi - 1).bit_length() - 1)
t = "WARNING: expect poor performance because --%s is not a power-of-two; consider using %d or %d instead of %d"
self.log("root", t % (name, zi2, zi3, zi), 3)
if args.s_rd_sz > args.iobuf:
t = "WARNING: --s-rd-sz (%d) is larger than --iobuf (%d); this may lead to reduced performance"
self.log("root", t % (args.s_rd_sz, args.iobuf), 3)
bri = "zy"[args.theme % 2 :][:1] bri = "zy"[args.theme % 2 :][:1]
ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)] ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]
args.theme = "{0}{1} {0} {1}".format(ch, bri) args.theme = "{0}{1} {0} {1}".format(ch, bri)
@@ -220,6 +240,10 @@ class SvcHub(object):
if not HAVE_FFMPEG or not HAVE_FFPROBE: if not HAVE_FFMPEG or not HAVE_FFPROBE:
decs.pop("ff", None) decs.pop("ff", None)
# compressed formats; "s3z=s3m.zip, s3gz=s3m.gz, ..."
zlss = [x.strip().lower().split("=", 1) for x in args.au_unpk.split(",")]
args.au_unpk = {x[0]: x[1] for x in zlss}
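A sketch of how such a mapping string could be parsed, assuming the same comma/equals syntax (`parse_unpk` is a hypothetical name; copyparty does the split inline as shown above):

def parse_unpk(spec: str) -> dict:
    # "s3z=s3m.zip, s3gz=s3m.gz" -> {"s3z": "s3m.zip", "s3gz": "s3m.gz"}
    pairs = [x.strip().lower().split("=", 1) for x in spec.split(",") if x.strip()]
    return {k: v for k, v in pairs}

print(parse_unpk("s3z=s3m.zip, s3gz=s3m.gz")["s3gz"])  # s3m.gz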
self.args.th_dec = list(decs.keys()) self.args.th_dec = list(decs.keys())
self.thumbsrv = None self.thumbsrv = None
want_ff = False want_ff = False
@@ -256,6 +280,13 @@ class SvcHub(object):
if want_ff and ANYWIN: if want_ff and ANYWIN:
self.log("thumb", "download FFmpeg to fix it:\033[0m " + FFMPEG_URL, 3) self.log("thumb", "download FFmpeg to fix it:\033[0m " + FFMPEG_URL, 3)
if not args.no_acode:
if not re.match("^(0|[qv][0-9]|[0-9]{2,3}k)$", args.q_mp3.lower()):
t = "invalid mp3 transcoding quality [%s] specified; only supports [0] to disable, a CBR value such as [192k], or a CQ/CRF value such as [v2]"
raise Exception(t % (args.q_mp3,))
else:
args.au_unpk = {}
args.th_poke = min(args.th_poke, args.th_maxage, args.ac_maxage) args.th_poke = min(args.th_poke, args.th_maxage, args.ac_maxage)
zms = "" zms = ""
@@ -268,13 +299,14 @@ class SvcHub(object):
from .ftpd import Ftpd from .ftpd import Ftpd
self.ftpd: Optional[Ftpd] = None self.ftpd: Optional[Ftpd] = None
Daemon(self.start_ftpd, "start_ftpd")
zms += "f" if args.ftp else "F" zms += "f" if args.ftp else "F"
if args.tftp: if args.tftp:
from .tftpd import Tftpd from .tftpd import Tftpd
self.tftpd: Optional[Tftpd] = None self.tftpd: Optional[Tftpd] = None
if args.ftp or args.ftps or args.tftp:
Daemon(self.start_ftpd, "start_tftpd") Daemon(self.start_ftpd, "start_tftpd")
if args.smb: if args.smb:
@@ -363,7 +395,7 @@ class SvcHub(object):
self.sigterm() self.sigterm()
def sigterm(self) -> None: def sigterm(self) -> None:
-os.kill(os.getpid(), signal.SIGTERM)
+self.signal_handler(signal.SIGTERM, None)
def cb_httpsrv_up(self) -> None: def cb_httpsrv_up(self) -> None:
self.httpsrv_up += 1 self.httpsrv_up += 1
@@ -399,6 +431,12 @@ class SvcHub(object):
t = "WARNING: found config files in [%s]: %s\n config files are not expected here, and will NOT be loaded (unless your setup is intentionally hella funky)" t = "WARNING: found config files in [%s]: %s\n config files are not expected here, and will NOT be loaded (unless your setup is intentionally hella funky)"
self.log("root", t % (E.cfg, ", ".join(hits)), 3) self.log("root", t % (E.cfg, ", ".join(hits)), 3)
if self.args.no_bauth:
t = "WARNING: --no-bauth disables support for the Android app; you may want to use --bauth-last instead"
self.log("root", t, 3)
if self.args.bauth_last:
self.log("root", "WARNING: ignoring --bauth-last due to --no-bauth", 3)
def _process_config(self) -> bool: def _process_config(self) -> bool:
al = self.args al = self.args
@@ -495,7 +533,7 @@ class SvcHub(object):
al.exp_md = odfusion(exp, al.exp_md.replace(" ", ",")) al.exp_md = odfusion(exp, al.exp_md.replace(" ", ","))
al.exp_lg = odfusion(exp, al.exp_lg.replace(" ", ",")) al.exp_lg = odfusion(exp, al.exp_lg.replace(" ", ","))
for k in ["no_hash", "no_idx"]: for k in ["no_hash", "no_idx", "og_ua"]:
ptn = getattr(self.args, k) ptn = getattr(self.args, k)
if ptn: if ptn:
setattr(self.args, k, re.compile(ptn)) setattr(self.args, k, re.compile(ptn))
@@ -519,6 +557,17 @@ class SvcHub(object):
except: except:
raise Exception("invalid --rm-retry [%s]" % (self.args.rm_retry,)) raise Exception("invalid --rm-retry [%s]" % (self.args.rm_retry,))
try:
zf1, zf2 = self.args.mv_retry.split("/")
self.args.mv_re_t = float(zf1)
self.args.mv_re_r = float(zf2)
except:
raise Exception("invalid --mv-retry [%s]" % (self.args.mv_retry,))
al.tcolor = al.tcolor.lstrip("#")
if len(al.tcolor) == 3: # fc5 => ffcc55
al.tcolor = "".join([x * 2 for x in al.tcolor])
return True return True
def _ipa2re(self, txt) -> Optional[re.Pattern]: def _ipa2re(self, txt) -> Optional[re.Pattern]:


@@ -6,6 +6,7 @@ import stat
import time import time
import zlib import zlib
from .authsrv import AuthSrv
from .bos import bos from .bos import bos
from .sutil import StreamArc, errdesc from .sutil import StreamArc, errdesc
from .util import min_ex, sanitize_fn, spack, sunpack, yieldfile from .util import min_ex, sanitize_fn, spack, sunpack, yieldfile
@@ -218,12 +219,13 @@ class StreamZip(StreamArc):
def __init__( def __init__(
self, self,
log: "NamedLogger", log: "NamedLogger",
asrv: AuthSrv,
fgen: Generator[dict[str, Any], None, None], fgen: Generator[dict[str, Any], None, None],
utf8: bool = False, utf8: bool = False,
pre_crc: bool = False, pre_crc: bool = False,
**kwargs: Any **kwargs: Any
) -> None: ) -> None:
-super(StreamZip, self).__init__(log, fgen)
+super(StreamZip, self).__init__(log, asrv, fgen)
self.utf8 = utf8 self.utf8 = utf8
self.pre_crc = pre_crc self.pre_crc = pre_crc
@@ -248,7 +250,7 @@ class StreamZip(StreamArc):
crc = 0 crc = 0
if self.pre_crc: if self.pre_crc:
-for buf in yieldfile(src):
+for buf in yieldfile(src, self.args.iobuf):
crc = zlib.crc32(buf, crc) crc = zlib.crc32(buf, crc)
crc &= 0xFFFFFFFF crc &= 0xFFFFFFFF
@@ -257,7 +259,7 @@ class StreamZip(StreamArc):
buf = gen_hdr(None, name, sz, ts, self.utf8, crc, self.pre_crc) buf = gen_hdr(None, name, sz, ts, self.utf8, crc, self.pre_crc)
yield self._ct(buf) yield self._ct(buf)
-for buf in yieldfile(src):
+for buf in yieldfile(src, self.args.iobuf):
if not self.pre_crc: if not self.pre_crc:
crc = zlib.crc32(buf, crc) crc = zlib.crc32(buf, crc)
@@ -300,7 +302,7 @@ class StreamZip(StreamArc):
mbuf = b"" mbuf = b""
if errors: if errors:
-errf, txt = errdesc(errors)
+errf, txt = errdesc(self.asrv.vfs, errors)
self.log("\n".join(([repr(errf)] + txt[1:]))) self.log("\n".join(([repr(errf)] + txt[1:])))
for x in self.ser(errf): for x in self.ser(errf):
yield x yield x


@@ -463,6 +463,12 @@ class TcpSrv(object):
sys.stderr.flush() sys.stderr.flush()
def _qr(self, t1: dict[str, list[int]], t2: dict[str, list[int]]) -> str: def _qr(self, t1: dict[str, list[int]], t2: dict[str, list[int]]) -> str:
t2c = {zs: zli for zs, zli in t2.items() if zs in ("127.0.0.1", "::1")}
t2b = {zs: zli for zs, zli in t2.items() if ":" in zs and zs not in t2c}
t2 = {zs: zli for zs, zli in t2.items() if zs not in t2b and zs not in t2c}
t2.update(t2b) # first ipv4, then ipv6...
t2.update(t2c) # ...and finally localhost
ip = None ip = None
ips = list(t1) + list(t2) ips = list(t1) + list(t2)
qri = self.args.qri qri = self.args.qri
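The reordering above leans on insertion-ordered dicts (Python 3.7+) to put a reachable LAN IPv4 first, IPv6 next, and localhost last; a standalone sketch of the same idea, with `order_for_qr` as an assumed name:

def order_for_qr(addrs: dict) -> dict:
    # LAN ipv4 first, then ipv6, localhost last, so the QR code prefers a reachable URL
    lo = {k: v for k, v in addrs.items() if k in ("127.0.0.1", "::1")}
    v6 = {k: v for k, v in addrs.items() if ":" in k and k not in lo}
    out = {k: v for k, v in addrs.items() if k not in v6 and k not in lo}
    out.update(v6)
    out.update(lo)
    return out

print(list(order_for_qr({"::1": [80], "192.168.1.9": [80], "fe80::1": [80]})))
# ['192.168.1.9', 'fe80::1', '::1']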


@@ -340,6 +340,9 @@ class Tftpd(object):
if not self.args.tftp_nols and bos.path.isdir(ap): if not self.args.tftp_nols and bos.path.isdir(ap):
return self._ls(vpath, "", 0, True) return self._ls(vpath, "", 0, True)
if not a:
a = [self.args.iobuf]
return open(ap, mode, *a, **ka) return open(ap, mode, *a, **ka)
def _mkdir(self, vpath: str, *a) -> None: def _mkdir(self, vpath: str, *a) -> None:


@@ -57,7 +57,7 @@ class ThumbCli(object):
if is_vid and "dvthumb" in dbv.flags: if is_vid and "dvthumb" in dbv.flags:
return None return None
want_opus = fmt in ("opus", "caf") want_opus = fmt in ("opus", "caf", "mp3")
is_au = ext in self.fmt_ffa is_au = ext in self.fmt_ffa
if is_au: if is_au:
if want_opus: if want_opus:
@@ -107,6 +107,11 @@ class ThumbCli(object):
fmt = sfmt fmt = sfmt
elif fmt[:1] == "p" and not is_au:
t = "cannot thumbnail [%s]: png only allowed for waveforms"
self.log(t % (rem), 6)
return None
histpath = self.asrv.vfs.histtab.get(ptop) histpath = self.asrv.vfs.histtab.get(ptop)
if not histpath: if not histpath:
self.log("no histpath for [{}]".format(ptop)) self.log("no histpath for [{}]".format(ptop))


@@ -15,10 +15,10 @@ from queue import Queue
from .__init__ import ANYWIN, TYPE_CHECKING from .__init__ import ANYWIN, TYPE_CHECKING
from .authsrv import VFS from .authsrv import VFS
from .bos import bos from .bos import bos
-from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
+from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, au_unpk, ffprobe
-from .util import BytesIO  # type: ignore
from .util import (
FFMPEG_URL,
+BytesIO,  # type: ignore
Cooldown, Cooldown,
Daemon, Daemon,
Pebkac, Pebkac,
@@ -28,6 +28,7 @@ from .util import (
runcmd, runcmd,
statdir, statdir,
vsplit, vsplit,
wrename,
wunlink, wunlink,
) )
@@ -109,7 +110,7 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -
h = hashlib.sha512(afsenc(fn)).digest() h = hashlib.sha512(afsenc(fn)).digest()
fn = base64.urlsafe_b64encode(h).decode("ascii")[:24] fn = base64.urlsafe_b64encode(h).decode("ascii")[:24]
if fmt in ("opus", "caf"): if fmt in ("opus", "caf", "mp3"):
cat = "ac" cat = "ac"
else: else:
fc = fmt[:1] fc = fmt[:1]
@@ -296,6 +297,12 @@ class ThumbSrv(object):
ext = abspath.split(".")[-1].lower() ext = abspath.split(".")[-1].lower()
png_ok = False png_ok = False
funs = [] funs = []
if ext in self.args.au_unpk:
ap_unpk = au_unpk(self.log, self.args.au_unpk, abspath, vn)
else:
ap_unpk = abspath
if not bos.path.exists(tpath): if not bos.path.exists(tpath):
for lib in self.args.th_dec: for lib in self.args.th_dec:
if lib == "pil" and ext in self.fmt_pil: if lib == "pil" and ext in self.fmt_pil:
@@ -307,15 +314,14 @@ class ThumbSrv(object):
elif lib == "ff" and ext in self.fmt_ffa: elif lib == "ff" and ext in self.fmt_ffa:
if tpath.endswith(".opus") or tpath.endswith(".caf"): if tpath.endswith(".opus") or tpath.endswith(".caf"):
funs.append(self.conv_opus) funs.append(self.conv_opus)
elif tpath.endswith(".mp3"):
funs.append(self.conv_mp3)
elif tpath.endswith(".png"): elif tpath.endswith(".png"):
funs.append(self.conv_waves) funs.append(self.conv_waves)
png_ok = True png_ok = True
else: else:
funs.append(self.conv_spec) funs.append(self.conv_spec)
if not png_ok and tpath.endswith(".png"):
raise Pebkac(400, "png only allowed for waveforms")
tdir, tfn = os.path.split(tpath) tdir, tfn = os.path.split(tpath)
ttpath = os.path.join(tdir, "w", tfn) ttpath = os.path.join(tdir, "w", tfn)
try: try:
@@ -325,7 +331,10 @@ class ThumbSrv(object):
for fun in funs: for fun in funs:
try: try:
-fun(abspath, ttpath, fmt, vn)
+if not png_ok and tpath.endswith(".png"):
+raise Exception("png only allowed for waveforms")
+fun(ap_unpk, ttpath, fmt, vn)
break break
except Exception as ex: except Exception as ex:
msg = "{} could not create thumbnail of {}\n{}" msg = "{} could not create thumbnail of {}\n{}"
@@ -343,8 +352,11 @@ class ThumbSrv(object):
except: except:
pass pass
if abspath != ap_unpk:
wunlink(self.log, ap_unpk, vn.flags)
try: try:
-bos.rename(ttpath, tpath)
+wrename(self.log, ttpath, tpath, vn.flags)
except: except:
pass pass
@@ -581,6 +593,25 @@ class ThumbSrv(object):
cmd += [fsenc(tpath)] cmd += [fsenc(tpath)]
self._run_ff(cmd, vn) self._run_ff(cmd, vn)
if "pngquant" in vn.flags:
wtpath = tpath + ".png"
cmd = [
b"pngquant",
b"--strip",
b"--nofs",
b"--output",
fsenc(wtpath),
fsenc(tpath),
]
ret = runcmd(cmd, timeout=vn.flags["convt"], nice=True, oom=400)[0]
if ret:
try:
wunlink(self.log, wtpath, vn.flags)
except:
pass
else:
wrename(self.log, wtpath, tpath, vn.flags)
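pngquant is invoked as an external binary with the same flags shown above; roughly equivalent standalone logic could look like this sketch (assumes pngquant is on PATH; `shrink_png` is not a copyparty function):

import os
import subprocess

def shrink_png(path: str, timeout: float = 60.0) -> None:
    # write a temporary sibling file, then swap it over the original only on success
    out = path + ".png"
    cmd = ["pngquant", "--strip", "--nofs", "--output", out, path]
    if subprocess.run(cmd, timeout=timeout).returncode == 0:
        os.replace(out, path)
    elif os.path.exists(out):
        os.remove(out)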
def conv_spec(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None: def conv_spec(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2)) ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
if "ac" not in ret: if "ac" not in ret:
@@ -637,21 +668,60 @@ class ThumbSrv(object):
cmd += [fsenc(tpath)] cmd += [fsenc(tpath)]
self._run_ff(cmd, vn) self._run_ff(cmd, vn)
-def conv_opus(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
+def conv_mp3(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
-if self.args.no_acode:
+quality = self.args.q_mp3.lower()
+if self.args.no_acode or not quality:
raise Exception("disabled in server config")
self.wait4ram(0.2, tpath)
-ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
+tags, rawtags = ffprobe(abspath, int(vn.flags["convt"] / 2))
-if "ac" not in ret:
+if "ac" not in tags:
raise Exception("not audio")
if quality.endswith("k"):
qk = b"-b:a"
qv = quality.encode("ascii")
else:
qk = b"-q:a"
qv = quality[1:].encode("ascii")
# extremely conservative choices for output format
# (always 2ch 44k1) because if a device is old enough
# to not support opus then it's probably also super picky
# fmt: off
cmd = [
b"ffmpeg",
b"-nostdin",
b"-v", b"error",
b"-hide_banner",
b"-i", fsenc(abspath),
] + self.big_tags(rawtags) + [
b"-map", b"0:a:0",
b"-ar", b"44100",
b"-ac", b"2",
b"-c:a", b"libmp3lame",
qk, qv,
fsenc(tpath)
]
# fmt: on
self._run_ff(cmd, vn, oom=300)
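The CBR-vs-VBR flag selection for libmp3lame can be summarized separately; a sketch with an assumed helper name:

def mp3_quality_args(quality: str) -> list:
    # "192k" (CBR) -> [b"-b:a", b"192k"];  "v2" (VBR) -> [b"-q:a", b"2"]
    q = quality.lower()
    if q.endswith("k"):
        return [b"-b:a", q.encode("ascii")]
    return [b"-q:a", q[1:].encode("ascii")]

assert mp3_quality_args("V2") == [b"-q:a", b"2"]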
def conv_opus(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
if self.args.no_acode or not self.args.q_opus:
raise Exception("disabled in server config")
self.wait4ram(0.2, tpath)
tags, rawtags = ffprobe(abspath, int(vn.flags["convt"] / 2))
if "ac" not in tags:
raise Exception("not audio") raise Exception("not audio")
try: try:
dur = ret[".dur"][1] dur = tags[".dur"][1]
except: except:
dur = 0 dur = 0
src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus" src_opus = abspath.lower().endswith(".opus") or tags["ac"][1] == "opus"
want_caf = tpath.endswith(".caf") want_caf = tpath.endswith(".caf")
tmp_opus = tpath tmp_opus = tpath
if want_caf: if want_caf:
@@ -662,6 +732,7 @@ class ThumbSrv(object):
pass pass
caf_src = abspath if src_opus else tmp_opus caf_src = abspath if src_opus else tmp_opus
bq = ("%dk" % (self.args.q_opus,)).encode("ascii")
if not want_caf or not src_opus: if not want_caf or not src_opus:
# fmt: off # fmt: off
@@ -671,10 +742,10 @@ class ThumbSrv(object):
b"-v", b"error", b"-v", b"error",
b"-hide_banner", b"-hide_banner",
b"-i", fsenc(abspath), b"-i", fsenc(abspath),
b"-map_metadata", b"-1", ] + self.big_tags(rawtags) + [
b"-map", b"0:a:0", b"-map", b"0:a:0",
b"-c:a", b"libopus", b"-c:a", b"libopus",
b"-b:a", b"128k", b"-b:a", bq,
fsenc(tmp_opus) fsenc(tmp_opus)
] ]
# fmt: on # fmt: on
@@ -697,7 +768,7 @@ class ThumbSrv(object):
b"-map_metadata", b"-1", b"-map_metadata", b"-1",
b"-ac", b"2", b"-ac", b"2",
b"-c:a", b"libopus", b"-c:a", b"libopus",
b"-b:a", b"128k", b"-b:a", bq,
b"-f", b"caf", b"-f", b"caf",
fsenc(tpath) fsenc(tpath)
] ]
@@ -728,6 +799,16 @@ class ThumbSrv(object):
except: except:
pass pass
def big_tags(self, raw_tags: dict[str, list[str]]) -> list[bytes]:
ret = []
for k, vs in raw_tags.items():
for v in vs:
if len(str(v)) >= 1024:
bv = k.encode("utf-8", "replace")
ret += [b"-metadata", bv + b"="]
break
return ret
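big_tags emits one empty `-metadata key=` per oversized tag so ffmpeg drops that tag while keeping the rest; a standalone approximation:

def blank_big_tags(raw_tags: dict, limit: int = 1024) -> list:
    # one "-metadata key=" (empty value) per tag whose value is 1 KiB or more
    out = []
    for k, vs in raw_tags.items():
        if any(len(str(v)) >= limit for v in vs):
            out += [b"-metadata", k.encode("utf-8", "replace") + b"="]
    return out

print(blank_big_tags({"title": ["ok"], "traktor4": ["x" * 200000]}))
# [b'-metadata', b'traktor4=']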
def poke(self, tdir: str) -> None: def poke(self, tdir: str) -> None:
if not self.poke_cd.poke(tdir): if not self.poke_cd.poke(tdir):
return return
@@ -771,7 +852,7 @@ class ThumbSrv(object):
def _clean(self, cat: str, thumbpath: str) -> int: def _clean(self, cat: str, thumbpath: str) -> int:
# self.log("cln {}".format(thumbpath)) # self.log("cln {}".format(thumbpath))
exts = ["jpg", "webp", "png"] if cat == "th" else ["opus", "caf"] exts = ["jpg", "webp", "png"] if cat == "th" else ["opus", "caf", "mp3"]
maxage = getattr(self.args, cat + "_maxage") maxage = getattr(self.args, cat + "_maxage")
now = time.time() now = time.time()
prev_b64 = None prev_b64 = None

View File

@@ -62,6 +62,17 @@ class U2idx(object):
def log(self, msg: str, c: Union[int, str] = 0) -> None: def log(self, msg: str, c: Union[int, str] = 0) -> None:
self.log_func("u2idx", msg, c) self.log_func("u2idx", msg, c)
def shutdown(self) -> None:
for cur in self.cur.values():
db = cur.connection
try:
db.interrupt()
except:
pass
cur.close()
db.close()
def fsearch( def fsearch(
self, uname: str, vols: list[VFS], body: dict[str, Any] self, uname: str, vols: list[VFS], body: dict[str, Any]
) -> list[dict[str, Any]]: ) -> list[dict[str, Any]]:
@@ -81,14 +92,18 @@ class U2idx(object):
except: except:
raise Pebkac(500, min_ex()) raise Pebkac(500, min_ex())
def get_cur(self, ptop: str) -> Optional["sqlite3.Cursor"]: def get_cur(self, vn: VFS) -> Optional["sqlite3.Cursor"]:
if not HAVE_SQLITE3: if not HAVE_SQLITE3:
return None return None
-cur = self.cur.get(ptop)
+cur = self.cur.get(vn.realpath)
if cur: if cur:
return cur return cur
if "e2d" not in vn.flags:
return None
ptop = vn.realpath
histpath = self.asrv.vfs.histtab.get(ptop) histpath = self.asrv.vfs.histtab.get(ptop)
if not histpath: if not histpath:
self.log("no histpath for [{}]".format(ptop)) self.log("no histpath for [{}]".format(ptop))
@@ -317,7 +332,7 @@ class U2idx(object):
ptop = vol.realpath ptop = vol.realpath
flags = vol.flags flags = vol.flags
-cur = self.get_cur(ptop)
+cur = self.get_cur(vol)
if not cur: if not cur:
continue continue


@@ -10,7 +10,6 @@ import math
import os import os
import re import re
import shutil import shutil
-import signal
import stat import stat
import subprocess as sp import subprocess as sp
import tempfile import tempfile
@@ -29,6 +28,7 @@ from .fsutil import Fstab
from .mtag import MParser, MTag from .mtag import MParser, MTag
from .util import ( from .util import (
HAVE_SQLITE3, HAVE_SQLITE3,
VF_CAREFUL,
SYMTIME, SYMTIME,
Daemon, Daemon,
MTHash, MTHash,
@@ -136,6 +136,7 @@ class Up2k(object):
self.need_rescan: set[str] = set() self.need_rescan: set[str] = set()
self.db_act = 0.0 self.db_act = 0.0
self.reg_mutex = threading.Lock()
self.registry: dict[str, dict[str, dict[str, Any]]] = {} self.registry: dict[str, dict[str, dict[str, Any]]] = {}
self.flags: dict[str, dict[str, Any]] = {} self.flags: dict[str, dict[str, Any]] = {}
self.droppable: dict[str, list[str]] = {} self.droppable: dict[str, list[str]] = {}
@@ -143,7 +144,7 @@ class Up2k(object):
self.volsize: dict["sqlite3.Cursor", int] = {} self.volsize: dict["sqlite3.Cursor", int] = {}
self.volstate: dict[str, str] = {} self.volstate: dict[str, str] = {}
self.vol_act: dict[str, float] = {} self.vol_act: dict[str, float] = {}
-self.busy_aps: set[str] = set()
+self.busy_aps: dict[str, int] = {}
self.dupesched: dict[str, list[tuple[str, str, float]]] = {} self.dupesched: dict[str, list[tuple[str, str, float]]] = {}
self.snap_prev: dict[str, Optional[tuple[int, float]]] = {} self.snap_prev: dict[str, Optional[tuple[int, float]]] = {}
@@ -182,7 +183,7 @@ class Up2k(object):
t = "could not initialize sqlite3, will use in-memory registry only" t = "could not initialize sqlite3, will use in-memory registry only"
self.log(t, 3) self.log(t, 3)
-self.fstab = Fstab(self.log_func)
+self.fstab = Fstab(self.log_func, self.args)
self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
if self.args.hash_mt < 2: if self.args.hash_mt < 2:
@@ -200,11 +201,15 @@ class Up2k(object):
Daemon(self.deferred_init, "up2k-deferred-init") Daemon(self.deferred_init, "up2k-deferred-init")
def reload(self, rescan_all_vols: bool) -> None: def reload(self, rescan_all_vols: bool) -> None:
"""mutex me""" """mutex(main) me"""
self.log("reload #{} scheduled".format(self.gid + 1)) self.log("reload #{} scheduled".format(self.gid + 1))
all_vols = self.asrv.vfs.all_vols all_vols = self.asrv.vfs.all_vols
-scan_vols = [k for k, v in all_vols.items() if v.realpath not in self.registry]
+with self.reg_mutex:
+scan_vols = [
+k for k, v in all_vols.items() if v.realpath not in self.registry
+]
if rescan_all_vols: if rescan_all_vols:
scan_vols = list(all_vols.keys()) scan_vols = list(all_vols.keys())
@@ -217,7 +222,7 @@ class Up2k(object):
if self.stop: if self.stop:
# up-mt consistency not guaranteed if init is interrupted; # up-mt consistency not guaranteed if init is interrupted;
# drop caches for a full scan on next boot # drop caches for a full scan on next boot
-with self.mutex:
+with self.mutex, self.reg_mutex:
self._drop_caches() self._drop_caches()
if self.pp: if self.pp:
@@ -283,10 +288,27 @@ class Up2k(object):
min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act) min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
), ),
} }
-return json.dumps(ret, indent=4)
+return json.dumps(ret, separators=(",\n", ": "))
def find_job_by_ap(self, ptop: str, ap: str) -> str:
try:
if ANYWIN:
ap = ap.replace("\\", "/")
vp = ap[len(ptop) :].strip("/")
dn, fn = vsplit(vp)
with self.reg_mutex:
tab2 = self.registry[ptop]
for job in tab2.values():
if job["prel"] == dn and job["name"] == fn:
return json.dumps(job, separators=(",\n", ": "))
except:
pass
return "{}"
def get_unfinished_by_user(self, uname, ip) -> str: def get_unfinished_by_user(self, uname, ip) -> str:
-if PY2 or not self.mutex.acquire(timeout=2):
+if PY2 or not self.reg_mutex.acquire(timeout=2):
return '[{"timeout":1}]' return '[{"timeout":1}]'
ret: list[tuple[int, str, int, int, int]] = [] ret: list[tuple[int, str, int, int, int]] = []
@@ -315,17 +337,25 @@ class Up2k(object):
) )
ret.append(zt5) ret.append(zt5)
finally: finally:
-self.mutex.release()
+self.reg_mutex.release()
if ANYWIN:
ret = [(x[0], x[1].replace("\\", "/"), x[2], x[3], x[4]) for x in ret]
ret.sort(reverse=True) ret.sort(reverse=True)
ret2 = [ ret2 = [
{"at": at, "vp": "/" + vp, "pd": 100 - ((nn * 100) // (nh or 1)), "sz": sz} {
"at": at,
"vp": "/" + quotep(vp),
"pd": 100 - ((nn * 100) // (nh or 1)),
"sz": sz,
}
for (at, vp, sz, nn, nh) in ret for (at, vp, sz, nn, nh) in ret
] ]
-return json.dumps(ret2, indent=0)
+return json.dumps(ret2, separators=(",\n", ": "))
def get_unfinished(self) -> str: def get_unfinished(self) -> str:
-if PY2 or not self.mutex.acquire(timeout=0.5):
+if PY2 or not self.reg_mutex.acquire(timeout=0.5):
return "" return ""
ret: dict[str, tuple[int, int]] = {} ret: dict[str, tuple[int, int]] = {}
@@ -347,17 +377,17 @@ class Up2k(object):
ret[ptop] = (nbytes, nfiles) ret[ptop] = (nbytes, nfiles)
finally: finally:
-self.mutex.release()
+self.reg_mutex.release()
-return json.dumps(ret, indent=4)
+return json.dumps(ret, separators=(",\n", ": "))
def get_volsize(self, ptop: str) -> tuple[int, int]: def get_volsize(self, ptop: str) -> tuple[int, int]:
-with self.mutex:
+with self.reg_mutex:
return self._get_volsize(ptop) return self._get_volsize(ptop)
def get_volsizes(self, ptops: list[str]) -> list[tuple[int, int]]: def get_volsizes(self, ptops: list[str]) -> list[tuple[int, int]]:
ret = [] ret = []
-with self.mutex:
+with self.reg_mutex:
for ptop in ptops: for ptop in ptops:
ret.append(self._get_volsize(ptop)) ret.append(self._get_volsize(ptop))
@@ -385,7 +415,7 @@ class Up2k(object):
def _rescan( def _rescan(
self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool
) -> str: ) -> str:
"""mutex me""" """mutex(main) me"""
if not wait and self.pp: if not wait and self.pp:
return "cannot initiate; scan is already in progress" return "cannot initiate; scan is already in progress"
@@ -428,7 +458,13 @@ class Up2k(object):
# important; not deferred by db_act # important; not deferred by db_act
timeout = self._check_lifetimes() timeout = self._check_lifetimes()
-timeout = min(timeout, now + self._check_xiu())
+try:
+timeout = min(timeout, now + self._check_xiu())
+except Exception as ex:
+if "closed cursor" in str(ex):
+self.log("sched_rescan: lost db")
+return
+raise
with self.mutex: with self.mutex:
for vp, vol in sorted(self.asrv.vfs.all_vols.items()): for vp, vol in sorted(self.asrv.vfs.all_vols.items()):
@@ -667,15 +703,15 @@ class Up2k(object):
self.log(msg, c=3) self.log(msg, c=3)
live_vols = [] live_vols = []
-with self.mutex:
+with self.mutex, self.reg_mutex:
# only need to protect register_vpath but all in one go feels right # only need to protect register_vpath but all in one go feels right
for vol in vols: for vol in vols:
try: try:
bos.makedirs(vol.realpath) # gonna happen at snap anyways bos.makedirs(vol.realpath) # gonna happen at snap anyways
dir_is_empty(self.log_func, not self.args.no_scandir, vol.realpath) dir_is_empty(self.log_func, not self.args.no_scandir, vol.realpath)
-except:
+except Exception as ex:
self.volstate[vol.vpath] = "OFFLINE (cannot access folder)"
-self.log("cannot access " + vol.realpath, c=1)
+self.log("cannot access %s: %r" % (vol.realpath, ex), c=1)
continue continue
if scan_vols and vol.vpath not in scan_vols: if scan_vols and vol.vpath not in scan_vols:
@@ -709,7 +745,7 @@ class Up2k(object):
if self.args.re_dhash or [zv for zv in vols if "e2tsr" in zv.flags]: if self.args.re_dhash or [zv for zv in vols if "e2tsr" in zv.flags]:
self.args.re_dhash = False self.args.re_dhash = False
-with self.mutex:
+with self.mutex, self.reg_mutex:
self._drop_caches() self._drop_caches()
for vol in vols: for vol in vols:
@@ -786,7 +822,9 @@ class Up2k(object):
self.volstate[vol.vpath] = "online (mtp soon)" self.volstate[vol.vpath] = "online (mtp soon)"
for vol in need_vac: for vol in need_vac:
-reg = self.register_vpath(vol.realpath, vol.flags)
+with self.mutex, self.reg_mutex:
+reg = self.register_vpath(vol.realpath, vol.flags)
assert reg assert reg
cur, _ = reg cur, _ = reg
with self.mutex: with self.mutex:
@@ -800,7 +838,9 @@ class Up2k(object):
if vol.flags["dbd"] == "acid": if vol.flags["dbd"] == "acid":
continue continue
-reg = self.register_vpath(vol.realpath, vol.flags)
+with self.mutex, self.reg_mutex:
+reg = self.register_vpath(vol.realpath, vol.flags)
try: try:
assert reg assert reg
cur, db_path = reg cur, db_path = reg
@@ -847,6 +887,7 @@ class Up2k(object):
def register_vpath( def register_vpath(
self, ptop: str, flags: dict[str, Any] self, ptop: str, flags: dict[str, Any]
) -> Optional[tuple["sqlite3.Cursor", str]]: ) -> Optional[tuple["sqlite3.Cursor", str]]:
"""mutex(main,reg) me"""
histpath = self.asrv.vfs.histtab.get(ptop) histpath = self.asrv.vfs.histtab.get(ptop)
if not histpath: if not histpath:
self.log("no histpath for [{}]".format(ptop)) self.log("no histpath for [{}]".format(ptop))
@@ -869,7 +910,7 @@ class Up2k(object):
ft = "\033[0;32m{}{:.0}" ft = "\033[0;32m{}{:.0}"
ff = "\033[0;35m{}{:.0}" ff = "\033[0;35m{}{:.0}"
fv = "\033[0;36m{}:\033[90m{}" fv = "\033[0;36m{}:\033[90m{}"
fx = set(("html_head", "rm_re_t", "rm_re_r")) fx = set(("html_head", "rm_re_t", "rm_re_r", "mv_re_t", "mv_re_r"))
fd = vf_bmap() fd = vf_bmap()
fd.update(vf_cmap()) fd.update(vf_cmap())
fd.update(vf_vmap()) fd.update(vf_vmap())
@@ -1001,8 +1042,11 @@ class Up2k(object):
return None return None
def _verify_db_cache(self, cur: "sqlite3.Cursor", vpath: str) -> None: def _verify_db_cache(self, cur: "sqlite3.Cursor", vpath: str) -> None:
-# check if volume config changed since last use; drop caches if so
+# check if list of intersecting volumes changed since last use; drop caches if so
-zsl = [vpath] + list(sorted(self.asrv.vfs.all_vols.keys()))
+prefix = (vpath + "/").lstrip("/")
+zsl = [x for x in self.asrv.vfs.all_vols if x.startswith(prefix)]
+zsl = [x[len(prefix) :] for x in zsl]
+zsl.sort()
zb = hashlib.sha1("\n".join(zsl).encode("utf-8", "replace")).digest()
vcfg = base64.urlsafe_b64encode(zb[:18]).decode("ascii")
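The resulting cache key is a 24-character fingerprint of the nested-volume list; a minimal sketch of the same hashing, with an assumed function name:

import base64
import hashlib

def vol_fingerprint(names: list) -> str:
    # changes whenever the set of nested volumes changes, invalidating the db cache
    zb = hashlib.sha1("\n".join(sorted(names)).encode("utf-8", "replace")).digest()
    return base64.urlsafe_b64encode(zb[:18]).decode("ascii")

print(len(vol_fingerprint(["music", "music/incoming"])))  # 24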
@@ -1030,7 +1074,9 @@ class Up2k(object):
dev = cst.st_dev if vol.flags.get("xdev") else 0 dev = cst.st_dev if vol.flags.get("xdev") else 0
with self.mutex: with self.mutex:
-reg = self.register_vpath(top, vol.flags)
+with self.reg_mutex:
+reg = self.register_vpath(top, vol.flags)
assert reg and self.pp assert reg and self.pp
cur, db_path = reg cur, db_path = reg
@@ -1610,7 +1656,7 @@ class Up2k(object):
if e2vp and rewark: if e2vp and rewark:
self.hub.retcode = 1 self.hub.retcode = 1
-os.kill(os.getpid(), signal.SIGTERM)
+Daemon(self.hub.sigterm)
raise Exception("{} files have incorrect hashes".format(len(rewark))) raise Exception("{} files have incorrect hashes".format(len(rewark)))
if not e2vu or not rewark: if not e2vu or not rewark:
@@ -1627,7 +1673,7 @@ class Up2k(object):
def _build_tags_index(self, vol: VFS) -> tuple[int, int, bool]: def _build_tags_index(self, vol: VFS) -> tuple[int, int, bool]:
ptop = vol.realpath ptop = vol.realpath
-with self.mutex:
+with self.mutex, self.reg_mutex:
reg = self.register_vpath(ptop, vol.flags) reg = self.register_vpath(ptop, vol.flags)
assert reg and self.pp assert reg and self.pp
@@ -1648,6 +1694,7 @@ class Up2k(object):
return ret return ret
def _drop_caches(self) -> None: def _drop_caches(self) -> None:
"""mutex(main,reg) me"""
self.log("dropping caches for a full filesystem scan") self.log("dropping caches for a full filesystem scan")
for vol in self.asrv.vfs.all_vols.values(): for vol in self.asrv.vfs.all_vols.values():
reg = self.register_vpath(vol.realpath, vol.flags) reg = self.register_vpath(vol.realpath, vol.flags)
@@ -1823,7 +1870,7 @@ class Up2k(object):
params: tuple[Any, ...], params: tuple[Any, ...],
flt: int, flt: int,
) -> tuple[tempfile.SpooledTemporaryFile[bytes], int]: ) -> tuple[tempfile.SpooledTemporaryFile[bytes], int]:
"""mutex me""" """mutex(main) me"""
n = 0 n = 0
c2 = cur.connection.cursor() c2 = cur.connection.cursor()
tf = tempfile.SpooledTemporaryFile(1024 * 1024 * 8, "w+b", prefix="cpp-tq-") tf = tempfile.SpooledTemporaryFile(1024 * 1024 * 8, "w+b", prefix="cpp-tq-")
@@ -2157,7 +2204,7 @@ class Up2k(object):
ip: str, ip: str,
at: float, at: float,
) -> int: ) -> int:
"""will mutex""" """will mutex(main)"""
assert self.mtag assert self.mtag
try: try:
@@ -2189,7 +2236,7 @@ class Up2k(object):
abspath: str, abspath: str,
tags: dict[str, Union[str, float]], tags: dict[str, Union[str, float]],
) -> int: ) -> int:
"""mutex me""" """mutex(main) me"""
assert self.mtag assert self.mtag
if not bos.path.isfile(abspath): if not bos.path.isfile(abspath):
@@ -2474,28 +2521,36 @@ class Up2k(object):
cur.connection.commit() cur.connection.commit()
-def _job_volchk(self, cj: dict[str, Any]) -> None:
-if not self.register_vpath(cj["ptop"], cj["vcfg"]):
-if cj["ptop"] not in self.registry:
-raise Pebkac(410, "location unavailable")
-def handle_json(self, cj: dict[str, Any], busy_aps: set[str]) -> dict[str, Any]:
+def handle_json(
+self, cj: dict[str, Any], busy_aps: dict[str, int]
+) -> dict[str, Any]:
+# busy_aps is u2fh (always undefined if -j0) so this is safe
self.busy_aps = busy_aps
+got_lock = False
try:
# bit expensive; 3.9=10x 3.11=2x
if self.mutex.acquire(timeout=10):
-self._job_volchk(cj)
-self.mutex.release()
+got_lock = True
+with self.reg_mutex:
+return self._handle_json(cj)
else:
t = "cannot receive uploads right now;\nserver busy with {}.\nPlease wait; the client will retry..."
raise Pebkac(503, t.format(self.blocked or "[unknown]"))
except TypeError:
if not PY2:
raise
-with self.mutex:
-self._job_volchk(cj)
+with self.mutex, self.reg_mutex:
+return self._handle_json(cj)
+finally:
+if got_lock:
+self.mutex.release()
+def _handle_json(self, cj: dict[str, Any]) -> dict[str, Any]:
ptop = cj["ptop"]
+if not self.register_vpath(ptop, cj["vcfg"]):
+if ptop not in self.registry:
+raise Pebkac(410, "location unavailable")
cj["name"] = sanitize_fn(cj["name"], "", [".prologue.html", ".epilogue.html"]) cj["name"] = sanitize_fn(cj["name"], "", [".prologue.html", ".epilogue.html"])
cj["poke"] = now = self.db_act = self.vol_act[ptop] = time.time() cj["poke"] = now = self.db_act = self.vol_act[ptop] = time.time()
wark = self._get_wark(cj) wark = self._get_wark(cj)
@@ -2510,7 +2565,7 @@ class Up2k(object):
# refuse out-of-order / multithreaded uploading if sprs False # refuse out-of-order / multithreaded uploading if sprs False
sprs = self.fstab.get(pdir) != "ng" sprs = self.fstab.get(pdir) != "ng"
-with self.mutex:
+if True:
jcur = self.cur.get(ptop) jcur = self.cur.get(ptop)
reg = self.registry[ptop] reg = self.registry[ptop]
vfs = self.asrv.vfs.all_vols[cj["vtop"]] vfs = self.asrv.vfs.all_vols[cj["vtop"]]
@@ -2948,7 +3003,7 @@ class Up2k(object):
def handle_chunk( def handle_chunk(
self, ptop: str, wark: str, chash: str self, ptop: str, wark: str, chash: str
) -> tuple[int, list[int], str, float, bool]: ) -> tuple[int, list[int], str, float, bool]:
-with self.mutex:
+with self.mutex, self.reg_mutex:
self.db_act = self.vol_act[ptop] = time.time() self.db_act = self.vol_act[ptop] = time.time()
job = self.registry[ptop].get(wark) job = self.registry[ptop].get(wark)
if not job: if not job:
@@ -2991,7 +3046,7 @@ class Up2k(object):
return chunksize, ofs, path, job["lmod"], job["sprs"] return chunksize, ofs, path, job["lmod"], job["sprs"]
def release_chunk(self, ptop: str, wark: str, chash: str) -> bool: def release_chunk(self, ptop: str, wark: str, chash: str) -> bool:
-with self.mutex:
+with self.reg_mutex:
job = self.registry[ptop].get(wark) job = self.registry[ptop].get(wark)
if job: if job:
job["busy"].pop(chash, None) job["busy"].pop(chash, None)
@@ -2999,7 +3054,7 @@ class Up2k(object):
return True return True
def confirm_chunk(self, ptop: str, wark: str, chash: str) -> tuple[int, str]: def confirm_chunk(self, ptop: str, wark: str, chash: str) -> tuple[int, str]:
-with self.mutex:
+with self.mutex, self.reg_mutex:
self.db_act = self.vol_act[ptop] = time.time() self.db_act = self.vol_act[ptop] = time.time()
try: try:
job = self.registry[ptop][wark] job = self.registry[ptop][wark]
@@ -3022,16 +3077,16 @@ class Up2k(object):
if self.args.nw: if self.args.nw:
self.regdrop(ptop, wark) self.regdrop(ptop, wark)
return ret, dst
return ret, dst return ret, dst
def finish_upload(self, ptop: str, wark: str, busy_aps: set[str]) -> None: def finish_upload(self, ptop: str, wark: str, busy_aps: set[str]) -> None:
self.busy_aps = busy_aps self.busy_aps = busy_aps
-with self.mutex:
+with self.mutex, self.reg_mutex:
self._finish_upload(ptop, wark) self._finish_upload(ptop, wark)
def _finish_upload(self, ptop: str, wark: str) -> None: def _finish_upload(self, ptop: str, wark: str) -> None:
"""mutex(main,reg) me"""
try: try:
job = self.registry[ptop][wark] job = self.registry[ptop][wark]
pdir = djoin(job["ptop"], job["prel"]) pdir = djoin(job["ptop"], job["prel"])
@@ -3044,12 +3099,11 @@ class Up2k(object):
t = "finish_upload {} with remaining chunks {}" t = "finish_upload {} with remaining chunks {}"
raise Pebkac(500, t.format(wark, job["need"])) raise Pebkac(500, t.format(wark, job["need"]))
# self.log("--- " + wark + " " + dst + " finish_upload atomic " + dst, 4)
atomic_move(src, dst)
upt = job.get("at") or time.time() upt = job.get("at") or time.time()
vflags = self.flags[ptop] vflags = self.flags[ptop]
atomic_move(self.log, src, dst, vflags)
times = (int(time.time()), int(job["lmod"])) times = (int(time.time()), int(job["lmod"]))
self.log( self.log(
"no more chunks, setting times {} ({}) on {}".format( "no more chunks, setting times {} ({}) on {}".format(
@@ -3105,6 +3159,7 @@ class Up2k(object):
cur.connection.commit() cur.connection.commit()
def regdrop(self, ptop: str, wark: str) -> None: def regdrop(self, ptop: str, wark: str) -> None:
"""mutex(main,reg) me"""
olds = self.droppable[ptop] olds = self.droppable[ptop]
if wark: if wark:
olds.append(wark) olds.append(wark)
@@ -3199,16 +3254,23 @@ class Up2k(object):
at: float, at: float,
skip_xau: bool = False, skip_xau: bool = False,
) -> None: ) -> None:
"""mutex(main) me"""
self.db_rm(db, rd, fn, sz) self.db_rm(db, rd, fn, sz)
if not ip:
db_ip = ""
else:
# plugins may expect this to look like an actual IP
db_ip = "1.1.1.1" if self.args.no_db_ip else ip
sql = "insert into up values (?,?,?,?,?,?,?)" sql = "insert into up values (?,?,?,?,?,?,?)"
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0)) v = (wark, int(ts), sz, rd, fn, db_ip, int(at or 0))
try: try:
db.execute(sql, v) db.execute(sql, v)
except: except:
assert self.mem_cur assert self.mem_cur
rd, fn = s3enc(self.mem_cur, rd, fn) rd, fn = s3enc(self.mem_cur, rd, fn)
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0)) v = (wark, int(ts), sz, rd, fn, db_ip, int(at or 0))
db.execute(sql, v) db.execute(sql, v)
self.volsize[db] += sz self.volsize[db] += sz
@@ -3312,7 +3374,7 @@ class Up2k(object):
vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0]) vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0])
vn, rem = vn.get_dbv(rem) vn, rem = vn.get_dbv(rem)
ptop = vn.realpath ptop = vn.realpath
-with self.mutex:
+with self.mutex, self.reg_mutex:
abrt_cfg = self.flags.get(ptop, {}).get("u2abort", 1) abrt_cfg = self.flags.get(ptop, {}).get("u2abort", 1)
addr = (ip or "\n") if abrt_cfg in (1, 2) else "" addr = (ip or "\n") if abrt_cfg in (1, 2) else ""
user = (uname or "\n") if abrt_cfg in (1, 3) else "" user = (uname or "\n") if abrt_cfg in (1, 3) else ""
@@ -3320,7 +3382,10 @@ class Up2k(object):
for wark, job in reg.items(): for wark, job in reg.items():
if (user and user != job["user"]) or (addr and addr != job["addr"]): if (user and user != job["user"]) or (addr and addr != job["addr"]):
continue continue
if djoin(job["prel"], job["name"]) == rem: jrem = djoin(job["prel"], job["name"])
if ANYWIN:
jrem = jrem.replace("\\", "/")
if jrem == rem:
if job["ptop"] != ptop: if job["ptop"] != ptop:
t = "job.ptop [%s] != vol.ptop [%s] ??" t = "job.ptop [%s] != vol.ptop [%s] ??"
raise Exception(t % (job["ptop"] != ptop)) raise Exception(t % (job["ptop"] != ptop))
@@ -3416,7 +3481,7 @@ class Up2k(object):
continue continue
n_files += 1 n_files += 1
-with self.mutex:
+with self.mutex, self.reg_mutex:
cur = None cur = None
try: try:
ptop = dbv.realpath ptop = dbv.realpath
@@ -3534,6 +3599,7 @@ class Up2k(object):
def _mv_file( def _mv_file(
self, uname: str, svp: str, dvp: str, curs: set["sqlite3.Cursor"] self, uname: str, svp: str, dvp: str, curs: set["sqlite3.Cursor"]
) -> str: ) -> str:
"""mutex(main) me; will mutex(reg)"""
svn, srem = self.asrv.vfs.get(svp, uname, True, False, True) svn, srem = self.asrv.vfs.get(svp, uname, True, False, True)
svn, srem = svn.get_dbv(srem) svn, srem = svn.get_dbv(srem)
@@ -3614,7 +3680,9 @@ class Up2k(object):
if c2 and c2 != c1: if c2 and c2 != c1:
self._copy_tags(c1, c2, w) self._copy_tags(c1, c2, w)
-has_dupes = self._forget_file(svn.realpath, srem, c1, w, is_xvol, fsize)
+with self.reg_mutex:
+has_dupes = self._forget_file(svn.realpath, srem, c1, w, is_xvol, fsize)
if not is_xvol: if not is_xvol:
has_dupes = self._relink(w, svn.realpath, srem, dabs) has_dupes = self._relink(w, svn.realpath, srem, dabs)
@@ -3653,7 +3721,7 @@ class Up2k(object):
self._symlink(dlink, dabs, dvn.flags, lmod=ftime) self._symlink(dlink, dabs, dvn.flags, lmod=ftime)
wunlink(self.log, sabs, svn.flags) wunlink(self.log, sabs, svn.flags)
else: else:
-atomic_move(sabs, dabs)
+atomic_move(self.log, sabs, dabs, svn.flags)
except OSError as ex: except OSError as ex:
if ex.errno != errno.EXDEV: if ex.errno != errno.EXDEV:
@@ -3744,7 +3812,10 @@ class Up2k(object):
drop_tags: bool, drop_tags: bool,
sz: int, sz: int,
) -> bool: ) -> bool:
"""forgets file in db, fixes symlinks, does not delete""" """
mutex(main,reg) me
forgets file in db, fixes symlinks, does not delete
"""
srd, sfn = vsplit(vrem) srd, sfn = vsplit(vrem)
has_dupes = False has_dupes = False
self.log("forgetting {}".format(vrem)) self.log("forgetting {}".format(vrem))
@@ -3830,8 +3901,7 @@ class Up2k(object):
self.log("linkswap [{}] and [{}]".format(sabs, slabs)) self.log("linkswap [{}] and [{}]".format(sabs, slabs))
mt = bos.path.getmtime(slabs, False) mt = bos.path.getmtime(slabs, False)
flags = self.flags.get(ptop) or {} flags = self.flags.get(ptop) or {}
-wunlink(self.log, slabs, flags)
-bos.rename(sabs, slabs)
+atomic_move(self.log, sabs, slabs, flags)
bos.utime(slabs, (int(time.time()), int(mt)), False) bos.utime(slabs, (int(time.time()), int(mt)), False)
self._symlink(slabs, sabs, flags, False) self._symlink(slabs, sabs, flags, False)
full[slabs] = (ptop, rem) full[slabs] = (ptop, rem)
@@ -3920,7 +3990,7 @@ class Up2k(object):
csz = up2k_chunksize(fsz) csz = up2k_chunksize(fsz)
ret = [] ret = []
suffix = " MB, {}".format(path) suffix = " MB, {}".format(path)
with open(fsenc(path), "rb", 512 * 1024) as f: with open(fsenc(path), "rb", self.args.iobuf) as f:
if self.mth and fsz >= 1024 * 512: if self.mth and fsz >= 1024 * 512:
tlt = self.mth.hash(f, fsz, csz, self.pp, prefix, suffix) tlt = self.mth.hash(f, fsz, csz, self.pp, prefix, suffix)
ret = [x[0] for x in tlt] ret = [x[0] for x in tlt]
@@ -4070,7 +4140,7 @@ class Up2k(object):
self.do_snapshot() self.do_snapshot()
def do_snapshot(self) -> None: def do_snapshot(self) -> None:
-with self.mutex:
+with self.mutex, self.reg_mutex:
for k, reg in self.registry.items(): for k, reg in self.registry.items():
self._snap_reg(k, reg) self._snap_reg(k, reg)
@@ -4138,11 +4208,11 @@ class Up2k(object):
path2 = "{}.{}".format(path, os.getpid()) path2 = "{}.{}".format(path, os.getpid())
body = {"droppable": self.droppable[ptop], "registry": reg} body = {"droppable": self.droppable[ptop], "registry": reg}
-j = json.dumps(body, indent=2, sort_keys=True).encode("utf-8")
+j = json.dumps(body, sort_keys=True, separators=(",\n", ": ")).encode("utf-8")
with gzip.GzipFile(path2, "wb") as f: with gzip.GzipFile(path2, "wb") as f:
f.write(j) f.write(j)
-atomic_move(path2, path)
+atomic_move(self.log, path2, path, VF_CAREFUL)
self.log("snap: {} |{}|".format(path, len(reg.keys()))) self.log("snap: {} |{}|".format(path, len(reg.keys())))
self.snap_prev[ptop] = etag self.snap_prev[ptop] = etag
@@ -4211,7 +4281,7 @@ class Up2k(object):
raise Exception("invalid hash task") raise Exception("invalid hash task")
try: try:
-if not self._hash_t(task):
+if not self._hash_t(task) and self.stop:
return return
except Exception as ex: except Exception as ex:
self.log("failed to hash %s: %s" % (task, ex), 1) self.log("failed to hash %s: %s" % (task, ex), 1)
@@ -4221,7 +4291,7 @@ class Up2k(object):
) -> bool: ) -> bool:
ptop, vtop, flags, rd, fn, ip, at, usr, skip_xau = task ptop, vtop, flags, rd, fn, ip, at, usr, skip_xau = task
# self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn)) # self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
-with self.mutex:
+with self.mutex, self.reg_mutex:
if not self.register_vpath(ptop, flags): if not self.register_vpath(ptop, flags):
return True return True
@@ -4239,7 +4309,7 @@ class Up2k(object):
wark = up2k_wark_from_hashlist(self.salt, inf.st_size, hashes) wark = up2k_wark_from_hashlist(self.salt, inf.st_size, hashes)
-with self.mutex:
+with self.mutex, self.reg_mutex:
self.idx_wark( self.idx_wark(
self.flags[ptop], self.flags[ptop],
rd, rd,
@@ -4315,6 +4385,18 @@ class Up2k(object):
for x in list(self.spools): for x in list(self.spools):
self._unspool(x) self._unspool(x)
for cur in self.cur.values():
db = cur.connection
try:
db.interrupt()
except:
pass
cur.close()
db.close()
self.registry = {}
def up2k_chunksize(filesize: int) -> int: def up2k_chunksize(filesize: int) -> int:
chunksize = 1024 * 1024 chunksize = 1024 * 1024


@@ -35,6 +35,9 @@ from .__init__ import ANYWIN, EXE, MACOS, PY2, TYPE_CHECKING, VT100, WINDOWS
from .__version__ import S_BUILD_DT, S_VERSION from .__version__ import S_BUILD_DT, S_VERSION
from .stolen import surrogateescape from .stolen import surrogateescape
ub64dec = base64.urlsafe_b64decode
ub64enc = base64.urlsafe_b64encode
try: try:
from datetime import datetime, timezone from datetime import datetime, timezone
@@ -355,6 +358,21 @@ APPLESAN_TXT = r"/(__MACOS|Icon\r\r)|/\.(_|DS_Store|AppleDouble|LSOverride|Docum
APPLESAN_RE = re.compile(APPLESAN_TXT) APPLESAN_RE = re.compile(APPLESAN_TXT)
HUMANSIZE_UNITS = ("B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB")
UNHUMANIZE_UNITS = {
"b": 1,
"k": 1024,
"m": 1024 * 1024,
"g": 1024 * 1024 * 1024,
"t": 1024 * 1024 * 1024 * 1024,
"p": 1024 * 1024 * 1024 * 1024 * 1024,
"e": 1024 * 1024 * 1024 * 1024 * 1024 * 1024,
}
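Going the other way, parsing a human-readable size back into bytes, is presumably what this table supports; a hypothetical helper, trimmed to the smaller units:

UNHUMANIZE_SOME = {"b": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}

def unhumanize(sz: str) -> int:
    # "32768" -> 32768, "32k" -> 32768, "0.5g" -> 536870912
    try:
        return int(sz)
    except ValueError:
        return int(float(sz[:-1]) * UNHUMANIZE_SOME[sz[-1:].lower()])

assert unhumanize("32k") == 32768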
VF_CAREFUL = {"mv_re_t": 5, "rm_re_t": 5, "mv_re_r": 0.1, "rm_re_r": 0.1}
pybin = sys.executable or "" pybin = sys.executable or ""
if EXE: if EXE:
pybin = "" pybin = ""
@@ -460,13 +478,22 @@ class Daemon(threading.Thread):
r: bool = True, r: bool = True,
ka: Optional[dict[Any, Any]] = None, ka: Optional[dict[Any, Any]] = None,
) -> None: ) -> None:
-threading.Thread.__init__(
-self, target=target, name=name, args=a or (), kwargs=ka
-)
+threading.Thread.__init__(self, name=name)
+self.a = a or ()
+self.ka = ka or {}
+self.fun = target
self.daemon = True self.daemon = True
if r: if r:
self.start() self.start()
def run(self):
if not ANYWIN and not PY2:
signal.pthread_sigmask(
signal.SIG_BLOCK, [signal.SIGINT, signal.SIGTERM, signal.SIGUSR1]
)
self.fun(*self.a, **self.ka)
class Netdev(object): class Netdev(object):
def __init__(self, ip: str, idx: int, name: str, desc: str): def __init__(self, ip: str, idx: int, name: str, desc: str):
@@ -759,15 +786,46 @@ class CachedSet(object):
self.oldest = now self.oldest = now
class CachedDict(object):
def __init__(self, maxage: float) -> None:
self.c: dict[str, tuple[float, Any]] = {}
self.maxage = maxage
self.oldest = 0.0
def set(self, k: str, v: Any) -> None:
now = time.time()
self.c[k] = (now, v)
if now - self.oldest < self.maxage:
return
c = self.c = {k: v for k, v in self.c.items() if now - v[0] < self.maxage}
try:
self.oldest = min([x[0] for x in c.values()])
except:
self.oldest = now
def get(self, k: str) -> Optional[tuple[str, Any]]:
try:
ts, ret = self.c[k]
now = time.time()
if now - ts > self.maxage:
del self.c[k]
return None
return ret
except:
return None
class FHC(object): class FHC(object):
class CE(object): class CE(object):
def __init__(self, fh: typing.BinaryIO) -> None: def __init__(self, fh: typing.BinaryIO) -> None:
self.ts: float = 0 self.ts: float = 0
self.fhs = [fh] self.fhs = [fh]
self.all_fhs = set([fh])
def __init__(self) -> None: def __init__(self) -> None:
self.cache: dict[str, FHC.CE] = {} self.cache: dict[str, FHC.CE] = {}
self.aps: set[str] = set() self.aps: dict[str, int] = {}
def close(self, path: str) -> None: def close(self, path: str) -> None:
try: try:
@@ -779,7 +837,7 @@ class FHC(object):
fh.close() fh.close()
del self.cache[path] del self.cache[path]
self.aps.remove(path) del self.aps[path]
def clean(self) -> None: def clean(self) -> None:
if not self.cache: if not self.cache:
@@ -800,9 +858,12 @@ class FHC(object):
return self.cache[path].fhs.pop() return self.cache[path].fhs.pop()
def put(self, path: str, fh: typing.BinaryIO) -> None: def put(self, path: str, fh: typing.BinaryIO) -> None:
self.aps.add(path) if path not in self.aps:
self.aps[path] = 0
try: try:
ce = self.cache[path] ce = self.cache[path]
ce.all_fhs.add(fh)
ce.fhs.append(fh) ce.fhs.append(fh)
except: except:
ce = self.CE(fh) ce = self.CE(fh)
@@ -827,6 +888,7 @@ class ProgressPrinter(threading.Thread):
self.start() self.start()
def run(self) -> None: def run(self) -> None:
sigblock()
tp = 0 tp = 0
msg = None msg = None
no_stdout = self.args.q no_stdout = self.args.q
@@ -1271,6 +1333,15 @@ def log_thrs(log: Callable[[str, str, int], None], ival: float, name: str) -> No
log(name, "\033[0m \033[33m".join(tv), 3) log(name, "\033[0m \033[33m".join(tv), 3)
def sigblock():
if ANYWIN or PY2:
return
signal.pthread_sigmask(
signal.SIG_BLOCK, [signal.SIGINT, signal.SIGTERM, signal.SIGUSR1]
)
def vol_san(vols: list["VFS"], txt: bytes) -> bytes: def vol_san(vols: list["VFS"], txt: bytes) -> bytes:
txt0 = txt txt0 = txt
for vol in vols: for vol in vols:
@@ -1292,10 +1363,11 @@ def vol_san(vols: list["VFS"], txt: bytes) -> bytes:
def min_ex(max_lines: int = 8, reverse: bool = False) -> str: def min_ex(max_lines: int = 8, reverse: bool = False) -> str:
et, ev, tb = sys.exc_info() et, ev, tb = sys.exc_info()
stb = traceback.extract_tb(tb) stb = traceback.extract_tb(tb) if tb else traceback.extract_stack()[:-1]
fmt = "%s @ %d <%s>: %s" fmt = "%s @ %d <%s>: %s"
ex = [fmt % (fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in stb] ex = [fmt % (fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in stb]
ex.append("[%s] %s" % (et.__name__ if et else "(anonymous)", ev)) if et or ev or tb:
ex.append("[%s] %s" % (et.__name__ if et else "(anonymous)", ev))
return "\n".join(ex[-max_lines:][:: -1 if reverse else 1]) return "\n".join(ex[-max_lines:][:: -1 if reverse else 1])
@@ -1400,10 +1472,15 @@ def ren_open(
class MultipartParser(object): class MultipartParser(object):
def __init__( def __init__(
self, log_func: "NamedLogger", sr: Unrecv, http_headers: dict[str, str] self,
log_func: "NamedLogger",
args: argparse.Namespace,
sr: Unrecv,
http_headers: dict[str, str],
): ):
self.sr = sr self.sr = sr
self.log = log_func self.log = log_func
self.args = args
self.headers = http_headers self.headers = http_headers
self.re_ctype = re.compile(r"^content-type: *([^; ]+)", re.IGNORECASE) self.re_ctype = re.compile(r"^content-type: *([^; ]+)", re.IGNORECASE)
@@ -1502,7 +1579,7 @@ class MultipartParser(object):
def _read_data(self) -> Generator[bytes, None, None]: def _read_data(self) -> Generator[bytes, None, None]:
blen = len(self.boundary) blen = len(self.boundary)
bufsz = 32 * 1024 bufsz = self.args.s_rd_sz
while True: while True:
try: try:
buf = self.sr.recv(bufsz) buf = self.sr.recv(bufsz)
@@ -1743,7 +1820,7 @@ def gencookie(k: str, v: str, r: str, tls: bool, dur: int = 0, txt: str = "") ->
def humansize(sz: float, terse: bool = False) -> str: def humansize(sz: float, terse: bool = False) -> str:
for unit in ["B", "KiB", "MiB", "GiB", "TiB"]: for unit in HUMANSIZE_UNITS:
if sz < 1024: if sz < 1024:
break break
@@ -1764,12 +1841,7 @@ def unhumanize(sz: str) -> int:
pass pass
mc = sz[-1:].lower() mc = sz[-1:].lower()
mi = { mi = UNHUMANIZE_UNITS.get(mc, 1)
"k": 1024,
"m": 1024 * 1024,
"g": 1024 * 1024 * 1024,
"t": 1024 * 1024 * 1024 * 1024,
}.get(mc, 1)
return int(float(sz[:-1]) * mi) return int(float(sz[:-1]) * mi)
@@ -1992,6 +2064,7 @@ def vsplit(vpath: str) -> tuple[str, str]:
return vpath.rsplit("/", 1) # type: ignore return vpath.rsplit("/", 1) # type: ignore
# vpath-join
def vjoin(rd: str, fn: str) -> str: def vjoin(rd: str, fn: str) -> str:
if rd and fn: if rd and fn:
return rd + "/" + fn return rd + "/" + fn
@@ -1999,6 +2072,14 @@ def vjoin(rd: str, fn: str) -> str:
return rd or fn return rd or fn
# url-join
def ujoin(rd: str, fn: str) -> str:
if rd and fn:
return rd.rstrip("/") + "/" + fn.lstrip("/")
else:
return rd or fn
def _w8dec2(txt: bytes) -> str: def _w8dec2(txt: bytes) -> str:
"""decodes filesystem-bytes to wtf8""" """decodes filesystem-bytes to wtf8"""
return surrogateescape.decodefilename(txt) return surrogateescape.decodefilename(txt)
@@ -2120,26 +2201,29 @@ def lsof(log: "NamedLogger", abspath: str) -> None:
log("lsof failed; " + min_ex(), 3) log("lsof failed; " + min_ex(), 3)
def atomic_move(usrc: str, udst: str) -> None: def _fs_mvrm(
src = fsenc(usrc) log: "NamedLogger", src: str, dst: str, atomic: bool, flags: dict[str, Any]
dst = fsenc(udst) ) -> bool:
if not PY2: bsrc = fsenc(src)
os.replace(src, dst) bdst = fsenc(dst)
if atomic:
k = "mv_re_"
act = "atomic-rename"
osfun = os.replace
args = [bsrc, bdst]
elif dst:
k = "mv_re_"
act = "rename"
osfun = os.rename
args = [bsrc, bdst]
else: else:
if os.path.exists(dst): k = "rm_re_"
os.unlink(dst) act = "delete"
osfun = os.unlink
args = [bsrc]
os.rename(src, dst) maxtime = flags.get(k + "t", 0.0)
chill = flags.get(k + "r", 0.0)
def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
maxtime = flags.get("rm_re_t", 0.0)
bpath = fsenc(abspath)
if not maxtime:
os.unlink(bpath)
return True
chill = flags.get("rm_re_r", 0.0)
if chill < 0.001: if chill < 0.001:
chill = 0.1 chill = 0.1
@@ -2147,14 +2231,19 @@ def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
t0 = now = time.time() t0 = now = time.time()
for attempt in range(90210): for attempt in range(90210):
try: try:
if ino and os.stat(bpath).st_ino != ino: if ino and os.stat(bsrc).st_ino != ino:
log("inode changed; aborting delete") t = "src inode changed; aborting %s %s"
log(t % (act, src), 1)
return False return False
os.unlink(bpath) if (dst and not atomic) and os.path.exists(bdst):
t = "something appeared at dst; aborting rename [%s] ==> [%s]"
log(t % (src, dst), 1)
return False
osfun(*args)
if attempt: if attempt:
now = time.time() now = time.time()
t = "deleted in %.2f sec, attempt %d" t = "%sd in %.2f sec, attempt %d: %s"
log(t % (now - t0, attempt + 1)) log(t % (act, now - t0, attempt + 1, src))
return True return True
except OSError as ex: except OSError as ex:
now = time.time() now = time.time()
@@ -2164,15 +2253,45 @@ def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
raise raise
if not attempt: if not attempt:
if not PY2: if not PY2:
ino = os.stat(bpath).st_ino ino = os.stat(bsrc).st_ino
t = "delete failed (err.%d); retrying for %d sec: %s" t = "%s failed (err.%d); retrying for %d sec: [%s]"
log(t % (ex.errno, maxtime + 0.99, abspath)) log(t % (act, ex.errno, maxtime + 0.99, src))
time.sleep(chill) time.sleep(chill)
return False # makes pylance happy return False # makes pylance happy
def atomic_move(log: "NamedLogger", src: str, dst: str, flags: dict[str, Any]) -> None:
bsrc = fsenc(src)
bdst = fsenc(dst)
if PY2:
if os.path.exists(bdst):
_fs_mvrm(log, dst, "", False, flags) # unlink
_fs_mvrm(log, src, dst, False, flags) # rename
elif flags.get("mv_re_t"):
_fs_mvrm(log, src, dst, True, flags)
else:
os.replace(bsrc, bdst)
def wrename(log: "NamedLogger", src: str, dst: str, flags: dict[str, Any]) -> bool:
if not flags.get("mv_re_t"):
os.rename(fsenc(src), fsenc(dst))
return True
return _fs_mvrm(log, src, dst, False, flags)
def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
if not flags.get("rm_re_t"):
os.unlink(fsenc(abspath))
return True
return _fs_mvrm(log, abspath, "", False, flags)
def get_df(abspath: str) -> tuple[Optional[int], Optional[int]]: def get_df(abspath: str) -> tuple[Optional[int], Optional[int]]:
try: try:
# some fuses misbehave # some fuses misbehave
@@ -2243,10 +2362,11 @@ def shut_socket(log: "NamedLogger", sck: socket.socket, timeout: int = 3) -> Non
sck.close() sck.close()
def read_socket(sr: Unrecv, total_size: int) -> Generator[bytes, None, None]: def read_socket(
sr: Unrecv, bufsz: int, total_size: int
) -> Generator[bytes, None, None]:
remains = total_size remains = total_size
while remains > 0: while remains > 0:
bufsz = 32 * 1024
if bufsz > remains: if bufsz > remains:
bufsz = remains bufsz = remains
@@ -2260,16 +2380,16 @@ def read_socket(sr: Unrecv, total_size: int) -> Generator[bytes, None, None]:
yield buf yield buf
def read_socket_unbounded(sr: Unrecv) -> Generator[bytes, None, None]: def read_socket_unbounded(sr: Unrecv, bufsz: int) -> Generator[bytes, None, None]:
try: try:
while True: while True:
yield sr.recv(32 * 1024) yield sr.recv(bufsz)
except: except:
return return
def read_socket_chunked( def read_socket_chunked(
sr: Unrecv, log: Optional["NamedLogger"] = None sr: Unrecv, bufsz: int, log: Optional["NamedLogger"] = None
) -> Generator[bytes, None, None]: ) -> Generator[bytes, None, None]:
err = "upload aborted: expected chunk length, got [{}] |{}| instead" err = "upload aborted: expected chunk length, got [{}] |{}| instead"
while True: while True:
@@ -2303,7 +2423,7 @@ def read_socket_chunked(
if log: if log:
log("receiving %d byte chunk" % (chunklen,)) log("receiving %d byte chunk" % (chunklen,))
for chunk in read_socket(sr, chunklen): for chunk in read_socket(sr, bufsz, chunklen):
yield chunk yield chunk
x = sr.recv_ex(2, False) x = sr.recv_ex(2, False)
@@ -2361,10 +2481,11 @@ def build_netmap(csv: str):
return NetMap(ips, cidrs, True) return NetMap(ips, cidrs, True)
def yieldfile(fn: str) -> Generator[bytes, None, None]: def yieldfile(fn: str, bufsz: int) -> Generator[bytes, None, None]:
with open(fsenc(fn), "rb", 512 * 1024) as f: readsz = min(bufsz, 128 * 1024)
with open(fsenc(fn), "rb", bufsz) as f:
while True: while True:
buf = f.read(128 * 1024) buf = f.read(readsz)
if not buf: if not buf:
break break
@@ -2403,6 +2524,7 @@ def sendfile_py(
s: socket.socket, s: socket.socket,
bufsz: int, bufsz: int,
slp: int, slp: int,
use_poll: bool,
) -> int: ) -> int:
remains = upper - lower remains = upper - lower
f.seek(lower) f.seek(lower)
@@ -2431,22 +2553,31 @@ def sendfile_kern(
s: socket.socket, s: socket.socket,
bufsz: int, bufsz: int,
slp: int, slp: int,
use_poll: bool,
) -> int: ) -> int:
out_fd = s.fileno() out_fd = s.fileno()
in_fd = f.fileno() in_fd = f.fileno()
ofs = lower ofs = lower
stuck = 0.0 stuck = 0.0
if use_poll:
poll = select.poll()
poll.register(out_fd, select.POLLOUT)
while ofs < upper: while ofs < upper:
stuck = stuck or time.time() stuck = stuck or time.time()
try: try:
req = min(2 ** 30, upper - ofs) req = min(2 ** 30, upper - ofs)
select.select([], [out_fd], [], 10) if use_poll:
poll.poll(10000)
else:
select.select([], [out_fd], [], 10)
n = os.sendfile(out_fd, in_fd, ofs, req) n = os.sendfile(out_fd, in_fd, ofs, req)
stuck = 0 stuck = 0
except OSError as ex: except OSError as ex:
# client stopped reading; do another select # client stopped reading; do another select
d = time.time() - stuck d = time.time() - stuck
if d < 3600 and ex.errno == errno.EWOULDBLOCK: if d < 3600 and ex.errno == errno.EWOULDBLOCK:
time.sleep(0.02)
continue continue
n = 0 n = 0
@@ -2590,7 +2721,7 @@ def unescape_cookie(orig: str) -> str:
def guess_mime(url: str, fallback: str = "application/octet-stream") -> str: def guess_mime(url: str, fallback: str = "application/octet-stream") -> str:
try: try:
_, ext = url.rsplit(".", 1) ext = url.rsplit(".", 1)[1].lower()
except: except:
return fallback return fallback


@@ -29,6 +29,7 @@ window.baguetteBox = (function () {
isOverlayVisible = false, isOverlayVisible = false,
touch = {}, // start-pos touch = {}, // start-pos
touchFlag = false, // busy touchFlag = false, // busy
scrollTimer = 0,
re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i, re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i, re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
anims = ['slideIn', 'fadeIn', 'none'], anims = ['slideIn', 'fadeIn', 'none'],
@@ -91,6 +92,30 @@ window.baguetteBox = (function () {
touchendHandler(); touchendHandler();
}; };
var overlayWheelHandler = function (e) {
if (!options.noScrollbars || anymod(e))
return;
ev(e);
var x = e.deltaX,
y = e.deltaY,
d = Math.abs(x) > Math.abs(y) ? x : y;
if (e.deltaMode)
d *= 10;
if (Date.now() - scrollTimer < (Math.abs(d) > 20 ? 100 : 300))
return;
scrollTimer = Date.now();
if (d > 0)
showNextImage();
else
showPreviousImage();
};
var trapFocusInsideOverlay = function (e) { var trapFocusInsideOverlay = function (e) {
if (overlay.style.display === 'block' && (overlay.contains && !overlay.contains(e.target))) { if (overlay.style.display === 'block' && (overlay.contains && !overlay.contains(e.target))) {
e.stopPropagation(); e.stopPropagation();
@@ -394,8 +419,7 @@ window.baguetteBox = (function () {
} }
function dlpic() { function dlpic() {
var url = findfile()[3].href; var url = addq(findfile()[3].href, 'cache');
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache';
dl_file(url); dl_file(url);
} }
@@ -452,6 +476,7 @@ window.baguetteBox = (function () {
bind(document, 'keyup', keyUpHandler); bind(document, 'keyup', keyUpHandler);
bind(document, 'fullscreenchange', onFSC); bind(document, 'fullscreenchange', onFSC);
bind(overlay, 'click', overlayClickHandler); bind(overlay, 'click', overlayClickHandler);
bind(overlay, 'wheel', overlayWheelHandler);
bind(btnPrev, 'click', showPreviousImage); bind(btnPrev, 'click', showPreviousImage);
bind(btnNext, 'click', showNextImage); bind(btnNext, 'click', showNextImage);
bind(btnClose, 'click', hideOverlay); bind(btnClose, 'click', hideOverlay);
@@ -474,6 +499,7 @@ window.baguetteBox = (function () {
unbind(document, 'keyup', keyUpHandler); unbind(document, 'keyup', keyUpHandler);
unbind(document, 'fullscreenchange', onFSC); unbind(document, 'fullscreenchange', onFSC);
unbind(overlay, 'click', overlayClickHandler); unbind(overlay, 'click', overlayClickHandler);
unbind(overlay, 'wheel', overlayWheelHandler);
unbind(btnPrev, 'click', showPreviousImage); unbind(btnPrev, 'click', showPreviousImage);
unbind(btnNext, 'click', showNextImage); unbind(btnNext, 'click', showNextImage);
unbind(btnClose, 'click', hideOverlay); unbind(btnClose, 'click', hideOverlay);
@@ -584,7 +610,7 @@ window.baguetteBox = (function () {
isOverlayVisible = true; isOverlayVisible = true;
} }
function hideOverlay(e) { function hideOverlay(e, dtor) {
ev(e); ev(e);
playvid(false); playvid(false);
removeFromCache('#files'); removeFromCache('#files');
@@ -592,7 +618,15 @@ window.baguetteBox = (function () {
document.documentElement.style.overflowY = 'auto'; document.documentElement.style.overflowY = 'auto';
document.body.style.overflowY = 'auto'; document.body.style.overflowY = 'auto';
} }
if (overlay.style.display === 'none')
try {
if (document.fullscreenElement)
document.exitFullscreen();
}
catch (ex) { }
isFullscreen = false;
if (dtor || overlay.style.display === 'none')
return; return;
if (options.duringHide) if (options.duringHide)
@@ -600,11 +634,6 @@ window.baguetteBox = (function () {
sethash(''); sethash('');
unbindEvents(); unbindEvents();
try {
document.exitFullscreen();
isFullscreen = false;
}
catch (ex) { }
// Fade out and hide the overlay // Fade out and hide the overlay
overlay.className = ''; overlay.className = '';
@@ -682,7 +711,7 @@ window.baguetteBox = (function () {
options.captions.call(currentGallery, imageElement) : options.captions.call(currentGallery, imageElement) :
imageElement.getAttribute('data-caption') || imageElement.title; imageElement.getAttribute('data-caption') || imageElement.title;
imageSrc += imageSrc.indexOf('?') < 0 ? '?cache' : '&cache'; imageSrc = addq(imageSrc, 'cache');
if (is_vid && index != currentIndex) if (is_vid && index != currentIndex)
return; // no preload return; // no preload
@@ -711,8 +740,11 @@ window.baguetteBox = (function () {
}); });
image.setAttribute('src', imageSrc); image.setAttribute('src', imageSrc);
if (is_vid) { if (is_vid) {
image.volume = clamp(fcfg_get('vol', dvol / 100), 0, 1);
image.setAttribute('controls', 'controls'); image.setAttribute('controls', 'controls');
image.onended = vidEnd; image.onended = vidEnd;
image.onplay = function () { show_buttons(1); };
image.onpause = function () { show_buttons(); };
} }
image.alt = thumbnailElement ? thumbnailElement.alt || '' : ''; image.alt = thumbnailElement ? thumbnailElement.alt || '' : '';
if (options.titleTag && imageCaption) if (options.titleTag && imageCaption)
@@ -720,6 +752,9 @@ window.baguetteBox = (function () {
figure.appendChild(image); figure.appendChild(image);
if (is_vid && window.afilt)
afilt.apply(undefined, image);
if (options.async && callback) if (options.async && callback)
callback(); callback();
} }
@@ -955,6 +990,12 @@ window.baguetteBox = (function () {
} }
} }
function show_buttons(v) {
clmod(ebi('bbox-btns'), 'off', v);
clmod(btnPrev, 'off', v);
clmod(btnNext, 'off', v);
}
function bounceAnimation(direction) { function bounceAnimation(direction) {
slider.className = options.animation == 'slideIn' ? 'bounce-from-' + direction : 'eog'; slider.className = options.animation == 'slideIn' ? 'bounce-from-' + direction : 'eog';
setTimeout(function () { setTimeout(function () {
@@ -1018,9 +1059,7 @@ window.baguetteBox = (function () {
if (fx > 0.7) if (fx > 0.7)
return showNextImage(); return showNextImage();
clmod(ebi('bbox-btns'), 'off', 't'); show_buttons('t');
clmod(btnPrev, 'off', 't');
clmod(btnNext, 'off', 't');
if (Date.now() - ctime <= 500 && !IPHONE) if (Date.now() - ctime <= 500 && !IPHONE)
tglfull(); tglfull();
@@ -1062,6 +1101,7 @@ window.baguetteBox = (function () {
} }
function destroyPlugin() { function destroyPlugin() {
hideOverlay(undefined, true);
unbindEvents(); unbindEvents();
clearCachedData(); clearCachedData();
document.getElementsByTagName('body')[0].removeChild(ebi('bbox-overlay')); document.getElementsByTagName('body')[0].removeChild(ebi('bbox-overlay'));


@@ -28,6 +28,8 @@
--row-alt: #282828; --row-alt: #282828;
--scroll: #eb0; --scroll: #eb0;
--sel-fg: var(--bg-d1);
--sel-bg: var(--fg);
--a: #fc5; --a: #fc5;
--a-b: #c90; --a-b: #c90;
@@ -267,6 +269,7 @@ html.bz {
--btn-bg: #202231; --btn-bg: #202231;
--btn-h-bg: #2d2f45; --btn-h-bg: #2d2f45;
--btn-1-bg: #ba2959; --btn-1-bg: #ba2959;
--btn-1-is: #f59;
--btn-1-fg: #fff; --btn-1-fg: #fff;
--btn-1h-fg: #000; --btn-1h-fg: #000;
--txt-sh: a; --txt-sh: a;
@@ -330,6 +333,8 @@ html.c {
} }
html.cz { html.cz {
--bgg: var(--bg-u2); --bgg: var(--bg-u2);
--sel-bg: var(--bg-u5);
--sel-fg: var(--fg);
--srv-3: #fff; --srv-3: #fff;
--u2-tab-b1: var(--bg-d3); --u2-tab-b1: var(--bg-d3);
} }
@@ -343,6 +348,8 @@ html.cy {
--bg-d3: #f77; --bg-d3: #f77;
--bg-d2: #ff0; --bg-d2: #ff0;
--sel-bg: #f77;
--a: #fff; --a: #fff;
--a-hil: #fff; --a-hil: #fff;
--a-h-bg: #000; --a-h-bg: #000;
@@ -588,8 +595,8 @@ html.dy {
line-height: 1.2em; line-height: 1.2em;
} }
::selection { ::selection {
color: var(--bg-d1); color: var(--sel-fg);
background: var(--fg); background: var(--sel-bg);
text-shadow: none; text-shadow: none;
} }
html,body,tr,th,td,#files,a { html,body,tr,th,td,#files,a {
@@ -699,12 +706,12 @@ a:hover {
.s0:after, .s0:after,
.s1:after { .s1:after {
content: '⌄'; content: '⌄';
margin-left: -.1em; margin-left: -.15em;
} }
.s0r:after, .s0r:after,
.s1r:after { .s1r:after {
content: '⌃'; content: '⌃';
margin-left: -.1em; margin-left: -.15em;
} }
.s0:after, .s0:after,
.s0r:after { .s0r:after {
@@ -715,7 +722,7 @@ a:hover {
color: var(--sort-2); color: var(--sort-2);
} }
#files thead th:after { #files thead th:after {
margin-right: -.7em; margin-right: -.5em;
} }
#files tbody tr:hover td, #files tbody tr:hover td,
#files tbody tr:hover td+td { #files tbody tr:hover td+td {
@@ -744,6 +751,15 @@ html #files.hhpick thead th {
word-wrap: break-word; word-wrap: break-word;
overflow: hidden; overflow: hidden;
} }
#files tr.fade a {
color: #999;
color: rgba(255, 255, 255, 0.4);
font-style: italic;
}
html.y #files tr.fade a {
color: #999;
color: rgba(0, 0, 0, 0.4);
}
#files tr:nth-child(2n) td { #files tr:nth-child(2n) td {
background: var(--row-alt); background: var(--row-alt);
} }
@@ -1600,6 +1616,7 @@ html {
padding: .2em .4em; padding: .2em .4em;
font-size: 1.2em; font-size: 1.2em;
margin: .2em; margin: .2em;
display: inline-block;
white-space: pre; white-space: pre;
position: relative; position: relative;
top: -.12em; top: -.12em;
@@ -1736,6 +1753,7 @@ html.y #tree.nowrap .ntree a+a:hover {
} }
#files th span { #files th span {
position: relative; position: relative;
white-space: nowrap;
} }
#files>thead>tr>th.min, #files>thead>tr>th.min,
#files td.min { #files td.min {
@@ -1773,9 +1791,6 @@ html.y #tree.nowrap .ntree a+a:hover {
margin: .7em 0 .7em .5em; margin: .7em 0 .7em .5em;
padding-left: .5em; padding-left: .5em;
} }
.opwide>div>div>a {
line-height: 2em;
}
.opwide>div>h3 { .opwide>div>h3 {
color: var(--fg-weak); color: var(--fg-weak);
margin: 0 .4em; margin: 0 .4em;
@@ -3046,6 +3061,14 @@ html.b #ggrid>a {
html.b .btn { html.b .btn {
top: -.1em; top: -.1em;
} }
html.b .btn,
html.b #u2conf a.b,
html.b #u2conf input[type="checkbox"]:not(:checked)+label {
box-shadow: 0 .05em 0 var(--bg-d3) inset;
}
html.b .tgl.btn.on {
box-shadow: 0 .05em 0 var(--btn-1-is) inset;
}
html.b #op_up2k.srch sup { html.b #op_up2k.srch sup {
color: #fc0; color: #fc0;
} }


@@ -6,7 +6,7 @@
<title>{{ title }}</title> <title>{{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8, minimum-scale=0.6"> <meta name="viewport" content="width=device-width, initial-scale=0.8, minimum-scale=0.6">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#{{ tcolor }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/browser.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/browser.css?_={{ ts }}">
{{ html_head }} {{ html_head }}

File diff suppressed because it is too large


@@ -3,7 +3,7 @@
<title>📝 {{ title }}</title> <title>📝 {{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.7"> <meta name="viewport" content="width=device-width, initial-scale=0.7">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#{{ tcolor }}">
<link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" href="{{ r }}/.cpr/md.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/md.css?_={{ ts }}">
{%- if edit %} {%- if edit %}


@@ -607,10 +607,10 @@ function md_newline() {
var s = linebounds(true), var s = linebounds(true),
ln = s.md.substring(s.n1, s.n2), ln = s.md.substring(s.n1, s.n2),
m1 = /^( *)([0-9]+)(\. +)/.exec(ln), m1 = /^( *)([0-9]+)(\. +)/.exec(ln),
m2 = /^[ \t>+-]*(\* )?/.exec(ln), m2 = /^[ \t]*[>+*-]{0,2}[ \t]/.exec(ln),
drop = dom_src.selectionEnd - dom_src.selectionStart; drop = dom_src.selectionEnd - dom_src.selectionStart;
var pre = m2[0]; var pre = m2 ? m2[0] : '';
if (m1 !== null) if (m1 !== null)
pre = m1[1] + (parseInt(m1[2]) + 1) + m1[3]; pre = m1[1] + (parseInt(m1[2]) + 1) + m1[3];


@@ -3,7 +3,7 @@
<title>📝 {{ title }}</title> <title>📝 {{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.7"> <meta name="viewport" content="width=device-width, initial-scale=0.7">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#{{ tcolor }}">
<link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" href="{{ r }}/.cpr/mde.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/mde.css?_={{ ts }}">
<link rel="stylesheet" href="{{ r }}/.cpr/deps/mini-fa.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/deps/mini-fa.css?_={{ ts }}">


@@ -6,7 +6,7 @@
<title>{{ s_doctitle }}</title> <title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8"> <meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#{{ tcolor }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/msg.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/msg.css?_={{ ts }}">
{{ html_head }} {{ html_head }}
</head> </head>


@@ -190,11 +190,21 @@ input {
padding: .5em .7em; padding: .5em .7em;
margin: 0 .5em 0 0; margin: 0 .5em 0 0;
} }
input::placeholder {
font-size: 1.2em;
font-style: italic;
letter-spacing: .04em;
opacity: 0.64;
color: #930;
}
html.z input { html.z input {
color: #fff; color: #fff;
background: #626; background: #626;
border-color: #c2c; border-color: #c2c;
} }
html.z input::placeholder {
color: #fff;
}
html.z .num { html.z .num {
border-color: #777; border-color: #777;
} }


@@ -6,7 +6,7 @@
<title>{{ s_doctitle }}</title> <title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8"> <meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#{{ tcolor }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
{{ html_head }} {{ html_head }}
@@ -94,7 +94,8 @@
<div> <div>
<form method="post" enctype="multipart/form-data" action="{{ r }}/{{ qvpath }}"> <form method="post" enctype="multipart/form-data" action="{{ r }}/{{ qvpath }}">
<input type="hidden" name="act" value="login" /> <input type="hidden" name="act" value="login" />
<input type="password" name="cppwd" /> <input type="password" name="cppwd" placeholder=" password" />
<input type="hidden" name="uhash" id="uhash" value="x" />
<input type="submit" value="Login" /> <input type="submit" value="Login" />
{% if ahttps %} {% if ahttps %}
<a id="w" href="{{ ahttps }}">switch to https</a> <a id="w" href="{{ ahttps }}">switch to https</a>


@@ -34,8 +34,14 @@ var Ls = {
"u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds", "u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds",
"v2": "use this server as a local HDD$N$NWARNING: this will show your password!", "v2": "use this server as a local HDD$N$NWARNING: this will show your password!",
} }
}, };
d = Ls[sread("cpp_lang", ["eng", "nor"]) || lang] || Ls.eng || Ls.nor;
var LANGS = ["eng", "nor"];
if (window.langmod)
langmod();
var d = Ls[sread("cpp_lang", LANGS) || lang] || Ls.eng || Ls.nor;
for (var k in (d || {})) { for (var k in (d || {})) {
var f = k.slice(-1), var f = k.slice(-1),
@@ -66,3 +72,5 @@ if (!ebi('c') && o.offsetTop + o.offsetHeight < window.innerHeight)
o = ebi('u'); o = ebi('u');
if (o && /[0-9]+$/.exec(o.innerHTML)) if (o && /[0-9]+$/.exec(o.innerHTML))
o.innerHTML = shumantime(o.innerHTML); o.innerHTML = shumantime(o.innerHTML);
ebi('uhash').value = '' + location.hash;


@@ -6,7 +6,7 @@
<title>{{ s_doctitle }}</title> <title>{{ s_doctitle }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8"> <meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#{{ tcolor }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<style>ul{padding-left:1.3em}li{margin:.4em 0}</style> <style>ul{padding-left:1.3em}li{margin:.4em 0}</style>


@@ -17,7 +17,7 @@ function goto_up2k() {
var up2k = null, var up2k = null,
up2k_hooks = [], up2k_hooks = [],
hws = [], hws = [],
sha_js = window.WebAssembly ? 'hw' : 'ac', // ff53,c57,sa11 sha_js = WebAssembly ? 'hw' : 'ac', // ff53,c57,sa11
m = 'will use ' + sha_js + ' instead of native sha512 due to'; m = 'will use ' + sha_js + ' instead of native sha512 due to';
try { try {
@@ -717,7 +717,7 @@ function Donut(uc, st) {
sfx(); sfx();
// firefox may forget that filedrops are user-gestures so it can skip this: // firefox may forget that filedrops are user-gestures so it can skip this:
if (uc.upnag && window.Notification && Notification.permission == 'granted') if (uc.upnag && Notification && Notification.permission == 'granted')
new Notification(uc.nagtxt); new Notification(uc.nagtxt);
} }
@@ -779,8 +779,8 @@ function up2k_init(subtle) {
}; };
setTimeout(function () { setTimeout(function () {
if (window.WebAssembly && !hws.length) if (WebAssembly && !hws.length)
fetch(SR + '/.cpr/w.hash.js' + CB); fetch(SR + '/.cpr/w.hash.js?_=' + TS);
}, 1000); }, 1000);
function showmodal(msg) { function showmodal(msg) {
@@ -869,7 +869,7 @@ function up2k_init(subtle) {
bcfg_bind(uc, 'turbo', 'u2turbo', turbolvl > 1, draw_turbo); bcfg_bind(uc, 'turbo', 'u2turbo', turbolvl > 1, draw_turbo);
bcfg_bind(uc, 'datechk', 'u2tdate', turbolvl < 3, null); bcfg_bind(uc, 'datechk', 'u2tdate', turbolvl < 3, null);
bcfg_bind(uc, 'az', 'u2sort', u2sort.indexOf('n') + 1, set_u2sort); bcfg_bind(uc, 'az', 'u2sort', u2sort.indexOf('n') + 1, set_u2sort);
bcfg_bind(uc, 'hashw', 'hashw', !!window.WebAssembly && (!subtle || !CHROME || MOBILE || VCHROME >= 107), set_hashw); bcfg_bind(uc, 'hashw', 'hashw', !!WebAssembly && (!subtle || !CHROME || MOBILE || VCHROME >= 107), set_hashw);
bcfg_bind(uc, 'upnag', 'upnag', false, set_upnag); bcfg_bind(uc, 'upnag', 'upnag', false, set_upnag);
bcfg_bind(uc, 'upsfx', 'upsfx', false, set_upsfx); bcfg_bind(uc, 'upsfx', 'upsfx', false, set_upsfx);
@@ -1347,9 +1347,9 @@ function up2k_init(subtle) {
var evpath = get_evpath(), var evpath = get_evpath(),
draw_each = good_files.length < 50; draw_each = good_files.length < 50;
if (window.WebAssembly && !hws.length) { if (WebAssembly && !hws.length) {
for (var a = 0; a < Math.min(navigator.hardwareConcurrency || 4, 16); a++) for (var a = 0; a < Math.min(navigator.hardwareConcurrency || 4, 16); a++)
hws.push(new Worker(SR + '/.cpr/w.hash.js' + CB)); hws.push(new Worker(SR + '/.cpr/w.hash.js?_=' + TS));
console.log(hws.length + " hashers"); console.log(hws.length + " hashers");
} }
@@ -2950,7 +2950,7 @@ function up2k_init(subtle) {
} }
function set_hashw() { function set_hashw() {
if (!window.WebAssembly) { if (!WebAssembly) {
bcfg_set('hashw', uc.hashw = false); bcfg_set('hashw', uc.hashw = false);
toast.err(10, L.u_nowork); toast.err(10, L.u_nowork);
} }
@@ -2967,7 +2967,7 @@ function up2k_init(subtle) {
nopenag(); nopenag();
} }
if (!window.Notification || !HTTPS) if (!Notification || !HTTPS)
return nopenag(); return nopenag();
if (en && Notification.permission == 'default') if (en && Notification.permission == 'default')
@@ -2989,7 +2989,7 @@ function up2k_init(subtle) {
}; };
} }
if (uc.upnag && (!window.Notification || Notification.permission != 'granted')) if (uc.upnag && (!Notification || Notification.permission != 'granted'))
bcfg_set('upnag', uc.upnag = false); bcfg_set('upnag', uc.upnag = false);
ebi('nthread_add').onclick = function (e) { ebi('nthread_add').onclick = function (e) {


@@ -16,7 +16,6 @@ var wah = '',
NOAC = 'autocorrect="off" autocapitalize="off"', NOAC = 'autocorrect="off" autocapitalize="off"',
L, tt, treectl, thegrid, up2k, asmCrypto, hashwasm, vbar, marked, L, tt, treectl, thegrid, up2k, asmCrypto, hashwasm, vbar, marked,
T0 = Date.now(), T0 = Date.now(),
CB = '?_=' + Math.floor(T0 / 1000).toString(36),
R = SR.slice(1), R = SR.slice(1),
RS = R ? "/" + R : "", RS = R ? "/" + R : "",
HALFMAX = 8192 * 8192 * 8192 * 8192, HALFMAX = 8192 * 8192 * 8192 * 8192,
@@ -52,8 +51,6 @@ catch (ex) {
} }
try { try {
CB = '?' + document.currentScript.src.split('?').pop();
if (navigator.userAgentData.mobile) if (navigator.userAgentData.mobile)
MOBILE = true; MOBILE = true;
@@ -130,7 +127,7 @@ if ((document.location + '').indexOf(',rej,') + 1)
try { try {
console.hist = []; console.hist = [];
var CMAXHIST = 100; var CMAXHIST = 1000;
var hook = function (t) { var hook = function (t) {
var orig = console[t].bind(console), var orig = console[t].bind(console),
cfun = function () { cfun = function () {
@@ -182,7 +179,7 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
if (url.indexOf('easymde.js') + 1) if (url.indexOf('easymde.js') + 1)
return; // clicking the preview pane return; // clicking the preview pane
if (url.indexOf('deps/marked.js') + 1 && !window.WebAssembly) if (url.indexOf('deps/marked.js') + 1 && !WebAssembly)
return; // ff<52 return; // ff<52
crashed = true; crashed = true;
@@ -740,6 +737,15 @@ function vjoin(p1, p2) {
} }
function addq(url, q) {
var uh = url.split('#', 1),
u = uh[0],
h = uh.length == 1 ? '' : '#' + uh[1];
return u + (u.indexOf('?') < 0 ? '?' : '&') + (q === undefined ? '' : q) + h;
}
function uricom_enc(txt, do_fb_enc) { function uricom_enc(txt, do_fb_enc) {
try { try {
return encodeURIComponent(txt); return encodeURIComponent(txt);
@@ -1469,7 +1475,7 @@ var toast = (function () {
clmod(obj, 'vis'); clmod(obj, 'vis');
r.visible = false; r.visible = false;
r.tag = obj; r.tag = obj;
if (!window.WebAssembly) if (!WebAssembly)
te = setTimeout(function () { te = setTimeout(function () {
obj.className = 'hide'; obj.className = 'hide';
}, 500); }, 500);
@@ -1533,6 +1539,8 @@ var modal = (function () {
cb_up = null, cb_up = null,
cb_ok = null, cb_ok = null,
cb_ng = null, cb_ng = null,
sel_0 = 0,
sel_1 = 0,
tok, tng, prim, sec, ok_cancel; tok, tng, prim, sec, ok_cancel;
r.load = function () { r.load = function () {
@@ -1566,7 +1574,7 @@ var modal = (function () {
(inp || a).focus(); (inp || a).focus();
if (inp) if (inp)
setTimeout(function () { setTimeout(function () {
inp.setSelectionRange(0, inp.value.length, "forward"); inp.setSelectionRange(sel_0, sel_1, "forward");
}, 0); }, 0);
document.addEventListener('focus', onfocus); document.addEventListener('focus', onfocus);
@@ -1689,16 +1697,18 @@ var modal = (function () {
r.show(html); r.show(html);
} }
r.prompt = function (html, v, cok, cng, fun) { r.prompt = function (html, v, cok, cng, fun, so0, so1) {
q.push(function () { q.push(function () {
_prompt(lf2br(html), v, cok, cng, fun); _prompt(lf2br(html), v, cok, cng, fun, so0, so1);
}); });
next(); next();
} }
var _prompt = function (html, v, cok, cng, fun) { var _prompt = function (html, v, cok, cng, fun, so0, so1) {
cb_ok = cok; cb_ok = cok;
cb_ng = cng === undefined ? cok : null; cb_ng = cng === undefined ? cok : null;
cb_up = fun; cb_up = fun;
sel_0 = so0 || 0;
sel_1 = so1 === undefined ? v.length : so1;
html += '<input id="modali" type="text" ' + NOAC + ' /><div id="modalb">' + ok_cancel + '</div>'; html += '<input id="modali" type="text" ' + NOAC + ' /><div id="modalb">' + ok_cancel + '</div>';
r.show(html); r.show(html);
@@ -1885,7 +1895,7 @@ function md_thumbs(md) {
float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : ''; float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : '';
if (!/[?&]cache/.exec(url)) if (!/[?&]cache/.exec(url))
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=i'; url = addq(url, 'cache=i');
md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1); md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1);
} }


@@ -5,9 +5,6 @@ a living list of upcoming features / fixes / changes, very roughly in order of p
* maybe resumable downloads (chrome-only, jank api) * maybe resumable downloads (chrome-only, jank api)
* maybe checksum validation (return sha512 of requested range in responses, and probably also warks) * maybe checksum validation (return sha512 of requested range in responses, and probably also warks)
* [github issue #64](https://github.com/9001/copyparty/issues/64) - dirkeys 2nd season
* popular feature request, finally time to refactor browser.js i suppose...
* [github issue #37](https://github.com/9001/copyparty/issues/37) - upload PWA * [github issue #37](https://github.com/9001/copyparty/issues/37) - upload PWA
* or [maybe not](https://arstechnica.com/tech-policy/2024/02/apple-under-fire-for-disabling-iphone-web-apps-eu-asks-developers-to-weigh-in/), or [maybe](https://arstechnica.com/gadgets/2024/03/apple-changes-course-will-keep-iphone-eu-web-apps-how-they-are-in-ios-17-4/) * or [maybe not](https://arstechnica.com/tech-policy/2024/02/apple-under-fire-for-disabling-iphone-web-apps-eu-asks-developers-to-weigh-in/), or [maybe](https://arstechnica.com/gadgets/2024/03/apple-changes-course-will-keep-iphone-eu-web-apps-how-they-are-in-ios-17-4/)

docs/bufsize.txt (new file, 96 lines)

@@ -0,0 +1,96 @@
notes from testing various buffer sizes of files and sockets
summary:
download-folder-as-tar: would be 7% faster with --iobuf 65536 (but got 20% faster in v1.11.2)
download-folder-as-zip: optimal with default --iobuf 262144
download-file-over-https: optimal with default --iobuf 262144
put-large-file: optimal with default --iobuf 262144, --s-rd-sz 262144 (and got 14% faster in v1.11.2)
post-large-file: optimal with default --iobuf 262144, --s-rd-sz 262144 (and got 18% faster in v1.11.2)
----
oha -z10s -c1 --ipv4 --insecure http://127.0.0.1:3923/bigs/?tar
3.3 req/s 1.11.1
4.3 4.0 3.3 req/s 1.12.2
64 256 512 --iobuf 256 (prefer smaller)
32 32 32 --s-rd-sz
oha -z10s -c1 --ipv4 --insecure http://127.0.0.1:3923/bigs/?zip
2.9 req/s 1.11.1
2.5 2.9 2.9 req/s 1.12.2
64 256 512 --iobuf 256 (prefer bigger)
32 32 32 --s-rd-sz
oha -z10s -c1 --ipv4 --insecure http://127.0.0.1:3923/pairdupes/?tar
8.3 req/s 1.11.1
8.4 8.4 8.5 req/s 1.12.2
64 256 512 --iobuf 256 (prefer bigger)
32 32 32 --s-rd-sz
oha -z10s -c1 --ipv4 --insecure http://127.0.0.1:3923/pairdupes/?zip
13.9 req/s 1.11.1
14.1 14.0 13.8 req/s 1.12.2
64 256 512 --iobuf 256 (prefer smaller)
32 32 32 --s-rd-sz
oha -z10s -c1 --ipv4 --insecure http://127.0.0.1:3923/pairdupes/987a
5260 req/s 1.11.1
5246 5246 5280 5268 req/s 1.12.2
64 256 512 256 --iobuf dontcare
32 32 32 512 --s-rd-sz dontcare
oha -z10s -c1 --ipv4 --insecure https://127.0.0.1:3923/pairdupes/987a
4445 req/s 1.11.1
4462 4494 4444 req/s 1.12.2
64 256 512 --iobuf dontcare
32 32 32 --s-rd-sz
oha -z10s -c1 --ipv4 --insecure http://127.0.0.1:3923/bigs/gssc-02-cannonball-skydrift/track10.cdda.flac
95 req/s 1.11.1
95 97 req/s 1.12.2
64 512 --iobuf dontcare
32 32 --s-rd-sz
oha -z10s -c1 --ipv4 --insecure https://127.0.0.1:3923/bigs/gssc-02-cannonball-skydrift/track10.cdda.flac
15.4 req/s 1.11.1
15.4 15.3 14.9 15.4 req/s 1.12.2
64 256 512 512 --iobuf 256 (prefer smaller, and smaller than s-wr-sz)
32 32 32 32 --s-rd-sz
256 256 256 512 --s-wr-sz
----
python3 ~/dev/old/copyparty\ v1.11.1\ dont\ ban\ the\ pipes.py -q -i 127.0.0.1 -v .::A --daw
python3 ~/dev/copyparty/dist/copyparty-sfx.py -q -i 127.0.0.1 -v .::A --daw --iobuf $((1024*512))
oha -z10s -c1 --ipv4 --insecure -mPUT -r0 -D ~/Music/gssc-02-cannonball-skydrift/track10.cdda.flac http://127.0.0.1:3923/a.bin
10.8 req/s 1.11.1
10.8 11.5 11.8 12.1 12.2 12.3 req/s new
512 512 512 512 512 256 --iobuf 256
32 64 128 256 512 256 --s-rd-sz 256 (prefer bigger)
----
buildpost() {
b=--jeg-er-grensestaven;
printf -- "$b\r\nContent-Disposition: form-data; name=\"act\"\r\n\r\nbput\r\n$b\r\nContent-Disposition: form-data; name=\"f\"; filename=\"a.bin\"\r\nContent-Type: audio/mpeg\r\n\r\n"
cat "$1"
printf -- "\r\n${b}--\r\n"
}
buildpost ~/Music/gssc-02-cannonball-skydrift/track10.cdda.flac >big.post
buildpost ~/Music/bottomtext.txt >smol.post
oha -z10s -c1 --ipv4 --insecure -mPOST -r0 -T 'multipart/form-data; boundary=jeg-er-grensestaven' -D big.post http://127.0.0.1:3923/?replace
9.6 11.2 11.3 11.1 10.9 req/s v1.11.2
512 512 256 128 256 --iobuf 256
32 512 256 128 128 --s-rd-sz 256
oha -z10s -c1 --ipv4 --insecure -mPOST -r0 -T 'multipart/form-data; boundary=jeg-er-grensestaven' -D smol.post http://127.0.0.1:3923/?replace
2445 2414 2401 2437
256 128 256 256 --iobuf 256
128 128 256 64 --s-rd-sz 128 (but use 256 since big posts are more important)


@@ -1,3 +1,241 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0510-1431 `v1.13.2` s3xmodit.zip
## new features
* play [compressed](https://a.ocv.me/pub/demo/music/chiptunes/compressed/#af-99f0c0e4) s3xmodit chiptunes/modules c0466279
* can now read gz/xz/zip-compressed s3m/xm/mod/it songs
* new filetypes supported: mdz, mdgz, mdxz, s3z, s3gz, s3xz, xmz, xmgz, xmxz, itz, itgz, itxz
* and if you need to fit even more tracks on the mixtape, [try mo3](https://a.ocv.me/pub/demo/music/chiptunes/compressed/#af-0bc9b877)
* option to batch-convert audio waveforms 38e4fdfe (see the client sketch after this list)
* volflag to improve audio waveform compression with pngquant 82ce6862
* option to add or change mappings from file-extensions to mimetypes 560d7b66
* export and publish the `--help` text for online viewing 560d7b66
* now available [as html](https://ocv.me/copyparty/helptext.html) and as [plaintext](https://ocv.me/copyparty/helptext.txt), includes many features not documented in the readme
* another way to add your own UI translations 19d156ff
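to illustrate the batch-waveform option above, here is a minimal client-side sketch (not an official example); the host, port, folder path and output filename are assumptions, and the exact tar contents depend on the server config:

```python
# minimal sketch: download a folder as a tarball with &p appended,
# asking the server to pregenerate audio waveforms along the way
# (127.0.0.1:3923 and /music/ are assumptions; adjust to your setup)
import shutil
import urllib.request

url = "http://127.0.0.1:3923/music/?tar&p"
with urllib.request.urlopen(url) as resp, open("music.tar", "wb") as f:
    shutil.copyfileobj(resp, f)
```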
## bugfixes
* ensure OS signals are immediately received and processed 87c60a1e
* things like reload and shutdown signals from systemd could get lost/stuck
* fix mimetype detection for uppercase file extensions 565daee9
* when clicking a `.ts` file in the gridview, don't open it as text 925c7f0a
* ...as it's probably an mpeg transport-stream, not a typescript file
* be less aggressive in dropping volume caches e396c5c2
* very minor performance gain, only really relevant if you're doing something like burning a copyparty volume onto a CD
* previously, adding or removing any volume at all was enough to drop covers cache for all volumes; now this only happens if an intersecting volume is added/removed
## other changes
* updated dompurify to 3.1.2 566cbb65
* opengraph: add the full filename as url suffix 5c1e2390
* so discord picks a good filename when saving an image
----
# 💾 what to download?
| download link | is it good? | description |
| -- | -- | -- |
| **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** | ✅ the best 👍 | runs anywhere! only needs python |
| [a docker image](https://github.com/9001/copyparty/blob/hovudstraum/scripts/docker/README.md) | it's ok | good if you prefer docker 🐋 |
| [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) | ⚠️ [acceptable](https://github.com/9001/copyparty#copypartyexe) | for [win8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png) or later; built-in thumbnailer |
| [u2c.exe](https://github.com/9001/copyparty/releases/download/v1.13.0/u2c.exe) | ⚠️ acceptable | [CLI uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py) as a win7+ exe ([video](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm)) |
| [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) | ⚠️ acceptable | similar to the regular sfx, [mostly worse](https://github.com/9001/copyparty#zipapp) |
| [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) | ⛔️ [dangerous](https://github.com/9001/copyparty#copypartyexe) | for [win7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png) -- never expose to the internet! |
| [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.10.1/copyparty-winpe64.exe) | ⛔️ dangerous | runs on [64bit WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png), otherwise useless |
* except for [u2c.exe](https://github.com/9001/copyparty/releases/download/v1.13.0/u2c.exe), all of the options above are mostly equivalent
* the zip and tar.gz files below are just source code
* python packages are available at [PyPI](https://pypi.org/project/copyparty/#files)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0506-0029 `v1.13.1` ctrl-v
## new features
* upload files by `ctrl-c` from OS and `ctrl-v` into browser c5f7cfc3
* from just about any file manager (windows explorer, thunar on linux, etc.) into the copyparty web-ui
* only files, not folders, so drag-drop is still the recommended way
* empty folders show an "empty folder" banner fdda567f
* opengraph / discord embeds ea270ab9 36f2c446 48a6789d b15a4ef7
* embeds [audio with covers](https://cd.ocv.me/c/d2/d22/snowy.mp3) , [images](https://cd.ocv.me/c/d2/d22/cover.jpg) , [videos](https://cd.ocv.me/c/d2/d21/no-effect.webm) , [audio without coverart](https://cd.ocv.me/c/d2/bitconnect.mp3) (links to one of the copyparty demoservers where the feature is enabled; link those in discord to test)
* images are currently not rendering correctly once clicked on android-discord (works on ios and in browser)
* default-disabled because opengraph disables hotlinking by design
* enable with `--og` and [see readme](https://github.com/9001/copyparty#opengraph) and [the --help](https://github.com/9001/copyparty/assets/241032/2dabf21e-2470-4e20-8ef0-3821b24be1b6)
* add option to support base64-encoded url queries parceled into the url location 69517e46
* because android-specific discord bugs prevent the use of queries in opengraph tags
* improve server performance when downloading unfinished uploads, especially on slow storage 70a3cf36
* add dynamic content into `<head>` using `--html-head` which now takes files and/or jinja templates as input b6cf2d30
* `--au-vol` (default 50, same as before) sets default audio volume in percent da091aec
* add **[copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** buildscript 27485a4c
* support ie4 and the [version of winzip](https://a.ocv.me/pub/g/nerd-stuff/cpp/win311zip.png) you'd find on an average windows 3.11 pc 603d0ed7
## bugfixes
* when logging in from the 403 page, remember and apply the original url hash f8491970
* the config-reset button in the control-panel didn't clear the dotfiles preference bc2c1e42
* the search feature could discover and use stale indexes in volumes where indexing was since disabled 95d9e693
* when in doubt, periodically recheck if filesystems support sparse files f6e693f0
* reduces opportunities for confusion on servers with removable media (usb flashdrives)
----
this release introduces **[copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz)**, yet another way to bring copyparty where it's needed -- very limited and with many drawbacks (see [readme](https://github.com/9001/copyparty#zipapp)) but may work when the others don't
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0420-2232 `v1.13.0` race the beam
## new features
* files can be downloaded before the upload has completed ("almost like peer-to-peer")
* watch the [release trailer](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm) 👌
* if the downloader catches up with the upload, the speed is gradually slowed down so it never runs ahead
* can be disabled with `--no-pipe`
* option `--no-db-ip` disables storing the uploader IP in the database bf585078
* u2c (cli uploader): option `--ow` to overwrite existing files on the server 439cb7f8
## bugfixes
* when running on windows, using the web-UI to abort an upload could fail 8c552f1a
* rapidly PUT-uploading and then deleting files could crash the file hasher feecb3e0
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0412-2110 `v1.12.2` ie11 fix
## new features
* new option `--bauth-last` for when you're hosting other [basic-auth](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication) services on the same domain 7b94e4ed
* makes it possible to log into copyparty as intended, but it still sees the passwords from the other service until you do
* alternatively, the other new option `--no-bauth` entirely disables basic-auth support, but that also kills [the android app](https://github.com/9001/party-up)
## bugfixes
* internet explorer isn't working?! FIX IT!!! 9e5253ef
* audio transcoding was buggy with filekeys enabled b8733653
* on windows, theoretical chance that antivirus could interrupt renaming files, so preemptively guard against that c8e3ed3a
## other changes
* add a "password" placeholder on the login page since you might think it's asking for a username da26ec36
* config buttons were jank on iOS b772a4f8
* readme: [making your homeserver accessible from the internet](https://github.com/9001/copyparty#at-home)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0409-2334 `v1.12.1` scrolling stuff
## new features
* while viewing pictures/videos, the scrollwheel can be used to view the prev/next file 844d16b9
## bugfixes
* #81 (scrolling suddenly getting disabled) properly fixed after @icxes found another way to reproduce it (thx) 4f0cad54
* and fixed at least one javascript glitch introduced in v1.12.0 while adding dirkeys 989cc613
* directory tree sidebar could fail to render when popping browser history into the lightbox
## other changes
* music preloader is slightly less hyper f89de6b3
* u2c.exe: updated TLS-certs and deps ab18893c
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0406-2011 `v1.12.0` locksmith
## new features
* #64 dirkeys; option to auto-generate passwords for folders, so you can give someone a link to a specific folder inside a volume without sharing the rest of the volume 10bc2d92 32c912bb ef52e2c0 0ae12868
* enabled by volflag `dk` (exact folder only) and/or volflag `dks` (also subfolders); see [readme](https://github.com/9001/copyparty#dirkeys)
* audio transcoding to mp3 if browser doesn't support opus a080759a
* recursively transcode and download a folder using `?tar&mp3`
* accidentally adds support for playing just about any audio format in ie11
* audio equalizer also applies to videos 7744226b
## bugfixes
* #81 scrolling could break after viewing an image in the lightbox 9c42cbec
* on phones, audio playback could stop if network is slow/unreliable 59f815ff b88cc7b5 59a53ba9
* fixes the issue on android, but ios/safari appears to be [impossible](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#music-playback-halting-on-phones) d94b5b3f
## other changes
* updated dompurify to 3.0.11
* copyparty.exe: updated to python 3.11.9
* support for building with pyoxidizer was removed 5ab54763
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0323-1724 `v1.11.2` public idp volumes
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates, such as [vulnerabilities](https://github.com/9001/copyparty/security) (most recently 2023-07-23)
## new features
* global-option `--iobuf` to set a custom I/O buffersize 2b24c50e
* changes the default buffersize to 256 KiB everywhere (was a mix of 64 and 512)
* may improve performance of networked volumes (s3 etc.) if increased
* on gbit networks: download-as-tar is now up to 20% faster
* slightly faster FTP and TFTP too
* global-option `--s-rd-sz` to set a custom read-size for sockets c6acd3a9
* changes the default from 32 to 256 KiB
* may improve performance of networked volumes (s3 etc.) if increased
* on 10gbit networks: uploading large files is now up to 17% faster
* add url parameter `?replace` to overwrite any existing files with a multipart-post c6acd3a9 (see the sketch after this list)
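as a rough sketch of such a `?replace` multipart-post, mirroring the `buildpost()` helper in `docs/bufsize.txt`; the host, port, filename, content-type, and a writable root volume are assumptions:

```python
# minimal sketch: overwrite a.bin on the server with a multipart POST + ?replace
import urllib.request

boundary = "jeg-er-grensestaven"
with open("a.bin", "rb") as f:
    payload = f.read()

# same field layout as buildpost(): an "act"=bput field, then the file field "f"
body = (
    ("--%s\r\n" % boundary).encode()
    + b'Content-Disposition: form-data; name="act"\r\n\r\nbput\r\n'
    + ("--%s\r\n" % boundary).encode()
    + b'Content-Disposition: form-data; name="f"; filename="a.bin"\r\n'
    + b"Content-Type: application/octet-stream\r\n\r\n"
    + payload
    + ("\r\n--%s--\r\n" % boundary).encode()
)

req = urllib.request.Request(
    "http://127.0.0.1:3923/?replace",
    data=body,
    headers={"Content-Type": "multipart/form-data; boundary=" + boundary},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```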
## bugfixes
* #79 idp volumes (introduced in [v1.11.0](https://github.com/9001/copyparty/releases/tag/v1.11.0)) would only accept permissions for the user that owned the volume; was impossible to grant read/write-access to other users d30ae845
## other changes
* mention the [lack of persistence for idp volumes](https://github.com/9001/copyparty/blob/hovudstraum/docs/idp.md#important-notes) in the IdP docs 2f20d29e
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0318-1709 `v1.11.1` dont ban the pipes
the [previous release](https://github.com/9001/copyparty/releases/tag/v1.11.0) had all the fun new features... this one's just bugfixes
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
### no vulnerabilities since 2023-07-23
* there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates
* [v1.8.7](https://github.com/9001/copyparty/releases/tag/v1.8.7) (2023-07-23) - [CVE-2023-38501](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-38501) - reflected XSS
* [v1.8.2](https://github.com/9001/copyparty/releases/tag/v1.8.2) (2023-07-14) - [CVE-2023-37474](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-37474) - path traversal (first CVE)
## bugfixes
* less aggressive rejection of requests from banned IPs 51d31588
* clients would get kicked before the header was parsed (which contains the xff header), meaning the server could become inaccessible to everyone if the reverse-proxy itself were to "somehow" get banned
* ...which can happen if a server behind cloudflare also accepts non-cloudflare connections, meaning the client IP would not be resolved, and it'll ban the LAN IP instead heh
* that part still happens, but now it won't affect legit clients through the intended route
* the old behavior can be restored with `--early-ban` to save some cycles, and/or avoid slowloris somewhat
* the unpost feature could appear to be disabled on servers where no volume was mapped to `/` 0287c7ba
* python 3.12 support for [compiling the dependencies](https://github.com/9001/copyparty/tree/hovudstraum/bin/mtag#dependencies) necessary to detect bpm/key in audio files 32553e45
## other changes
* mention [real-ip configuration](https://github.com/9001/copyparty?tab=readme-ov-file#real-ip) in the readme ee80cdb9
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀ ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0315-2047 `v1.11.0` You Can (Not) Proceed # 2024-0315-2047 `v1.11.0` You Can (Not) Proceed


@@ -20,7 +20,8 @@
* [just the sfx](#just-the-sfx)
* [build from release tarball](#build-from-release-tarball) - uses the included prebuilt webdeps
* [complete release](#complete-release)
* [debugging](#debugging)
    * [music playback halting on phones](#music-playback-halting-on-phones) - mostly fine on android
* [discarded ideas](#discarded-ideas)
@@ -133,6 +134,9 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| GET | `?zip=utf-8` | ...as a zip file |
| GET | `?zip` | ...as a WinXP-compatible zip file |
| GET | `?zip=crc` | ...as an MSDOS-compatible zip file |
| GET | `?tar&w` | pregenerate webp thumbnails |
| GET | `?tar&j` | pregenerate jpg thumbnails |
| GET | `?tar&p` | pregenerate audio waveforms |
| GET | `?ups` | show recent uploads from your IP |
| GET | `?ups&filter=f` | ...where URL contains `f` |
| GET | `?mime=foo` | specify return mimetype `foo` |
@@ -164,6 +168,7 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| PUT | `?xz` | (binary data) | compress with xz and write into file at URL |
| mPOST | | `f=FILE` | upload `FILE` into the folder at URL |
| mPOST | `?j` | `f=FILE` | ...and reply with json |
| mPOST | `?replace` | `f=FILE` | ...and overwrite existing files |
| mPOST | | `act=mkdir`, `name=foo` | create directory `foo` at URL |
| POST | `?delete` | | delete URL recursively |
| jPOST | `?delete` | `["/foo","/bar"]` | delete `/foo` and `/bar` recursively |
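a couple of hedged examples of how these url parameters combine with curl (hostname, folders and the password are placeholders):

    curl 'http://127.0.0.1:3923/music/?tar&p&pw=wark' > music.tar   # download the folder as tar, pregenerating audio waveforms
    curl -F f=@pic.jpg 'http://127.0.0.1:3923/inc/?j&pw=wark'       # multipart upload, asking for a json reply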
@@ -300,15 +305,26 @@ in the `scripts` folder:
* run `./rls.sh 1.2.3` which uploads to pypi + creates github release + sfx

# debugging

## music playback halting on phones

currently mostly fine on android, but still haven't found a way to massage iphones into behaving well
* conditionally starting/stopping mp.fau according to mp.au.readyState <3 or <4 doesn't help
* loop=true doesn't work, and manually looping mp.fau from an onended also doesn't work (it does nothing)
* assigning fau.currentTime in a timer doesn't work, as safari merely pretends to assign it
* on ios 16.7.7, mp.fau can sometimes make everything visibly work correctly, but no audio is actually hitting the speakers

can be reproduced with `--no-sendfile --s-wr-sz 8192 --s-wr-slp 0.3 --rsp-slp 6` and then play a collection of small audio files with the screen off, `ffmpeg -i track01.cdda.flac -c:a libopus -b:a 128k -segment_time 12 -f segment smol-%02d.opus`

## discarded ideas

* optimization attempts which didn't improve performance
  * remove brokers / multiprocessing stuff; https://github.com/9001/copyparty/tree/no-broker
  * reduce the nesting / indirections in `HttpCli` / `httpcli.py`
    * nearly zero benefit from stuff like replacing all the `self.conn.hsrv` with a local `hsrv` variable
* reduce up2k roundtrips
  * start from a chunk index and just go
  * terminate client on bad data


@@ -46,7 +46,7 @@ open up notepad and save the following as `c:\users\you\documents\party.conf` (f
### config explained: [global]
the `[global]` section accepts any config parameters [listed here](https://ocv.me/copyparty/helptext.html), also viewable by running copyparty (either the exe or the sfx.py) with `--help`, so this is the same as running copyparty with arguments `--lo c:\users\you\logs\copyparty-%Y-%m%d.xz -e2dsa -e2ts --no-dedup -z -p 80,443 --theme 2 --lang nor`
* `lo: ~/logs/cpp-%Y-%m%d.xz` writes compressed logs (the compression will make them delayed)
* `e2dsa` enables the upload deduplicator and file indexer, which enables searching
* `e2ts` enables music metadata indexing, making albums / titles etc. searchable too
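a hedged sketch of what that could look like as a `[global]` section in the config file (same values as the arguments above; the exact option spelling should be checked against the linked helptext):

    [global]
      lo: c:\users\you\logs\copyparty-%Y-%m%d.xz   # compressed logfiles
      e2dsa      # file indexing + deduplication
      e2ts       # music metadata indexing
      no-dedup
      z
      p: 80,443
      theme: 2
      lang: nor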


@@ -5,3 +5,18 @@ to configure IdP from scratch, you must place copyparty behind a reverse-proxy w
in the copyparty `[global]` config, specify which headers to read client info from; username is required (`idp-h-usr: X-Authooley-User`), group(s) are optional (`idp-h-grp: X-Authooley-Groups`)
* it is also required to specify the subnet that legit requests will be coming from, for example `--xff-src=10.88.0.0/24` to allow 10.88.x.x (or `--xff-src=lan` for all private IPs), and it is recommended to configure the reverseproxy to include a secret header as proof that the other headers are also legit (and not smuggled in by a malicious client), telling copyparty the headername to expect with `idp-h-key: shangala-bangala`
# important notes
## IdP volumes are forgotten on shutdown
IdP volumes, meaning dynamically-created volumes, meaning volumes that contain `${u}` or `${g}` in their URL, will be forgotten during a server restart and then "revived" when the volume's owner sends their first request after the restart
until each IdP volume is revived, it will inherit the permissions of its parent volume (if any)
this means that, if an IdP volume is located inside a folder that is readable by anyone, then each of those IdP volumes will **also become readable by anyone** until the volume is revived
and likewise -- if the IdP volume is inside a folder that is only accessible by certain users, but the IdP volume is configured to allow access from unauthenticated users, then the contents of the volume will NOT be accessible until it is revived
until this limitation is fixed (if ever), it is recommended to place IdP volumes inside an appropriate parent volume, so they can inherit acceptable permissions until their revival; see the "strategic volumes" at the bottom of [./examples/docker/idp/copyparty.conf](./examples/docker/idp/copyparty.conf)
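pulling the above together, a rough sketch of such a config (header names and subnet are the placeholder values from this page; the volume syntax, account name and the `rw` permission are assumptions to illustrate the idea, so check the linked example config for the authoritative version):

    [global]
      idp-h-usr: x-authooley-user
      idp-h-grp: x-authooley-groups
      idp-h-key: shangala-bangala
      xff-src: 10.88.0.0/24

    [/u/${u}]            # IdP volume, revived when its owner sends their first request
      /srv/users/${u}
      accs:
        rw: ${u}

    [/u]                 # strategic parent volume, so nothing is world-readable before revival
      /srv/users
      accs:
        rw: youradminaccount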


@@ -221,6 +221,11 @@ sox -DnV -r8000 -b8 -c1 /dev/shm/a.wav synth 1.1 sin 400 vol 0.02
# play icon calibration pics
for w in 150 170 190 210 230 250; do for h in 130 150 170 190 210; do /c/Program\ Files/ImageMagick-7.0.11-Q16-HDRI/magick.exe convert -size ${w}x${h} xc:brown -fill orange -draw "circle $((w/2)),$((h/2)) $((w/2)),$((h/3))" $w-$h.png; done; done
# compress chiptune modules
mkdir gz; for f in *.*; do pigz -c11 -I100 <"$f" >gz/"$f"gz; touch -r "$f" gz/"$f"gz; done
mkdir xz; for f in *.*; do xz -cz9 <"$f" >xz/"$f"xz; touch -r "$f" xz/"$f"xz; done
mkdir z; for f in *.*; do 7z a -tzip -mx=9 -mm=lzma "z/${f}z" "$f" && touch -r "$f" z/"$f"z; done
##
## vscode


@@ -1,82 +1,71 @@
# recipe for building an exe with nuitka (extreme jank edition)

NOTE: copyparty runs SLOWER when compiled with nuitka;
just use copyparty-sfx.py and/or pyinstaller instead

( the sfx and the pyinstaller EXEs are equally fast if you
  have the latest jinja2 installed, but the older jinja that
  comes bundled with the sfx is slightly faster yet )

roughly, copyparty-sfx.py is 6% faster than copyparty.exe
(win10-pyinstaller), and copyparty.exe is 10% faster than
nuitka, making copyparty-sfx.py 17% faster than nuitka

NOTE: every time a nuitka-compiled copyparty.exe is launched,
it will show the windows firewall prompt since nuitka will
pick a new unique location in %TEMP% to unpack an exe into,
unlike pyinstaller which doesn't fork itself on startup...
might be fixable by configuring nuitka differently, idk

NOTE: nuitka EXEs are larger than pyinstaller ones;
a minimal nuitka build of just the sfx (with its bundled
dependencies) was already the same size as the pyinstaller
copyparty.exe which also includes Mutagen and Pillow

NOTE: nuitka takes a lot longer to build than pyinstaller
(due to actual compilation of course, but still)

NOTE: binaries built with nuitka cannot run on windows7,
even when compiled with python 3.6 on windows 7 itself

NOTE: `--python-flags=-m` is the magic sauce to
correctly compile `from .util import Daemon`
(which otherwise only explodes at runtime)

NOTE: `--deployment` doesn't seem to affect performance

########################################################################

# copypaste the rest of this file into cmd

python -m pip install --user -U nuitka

cd %homedrive%
cd %homepath%\downloads

rd /s /q copypuitka
mkdir copypuitka
cd copypuitka

rd /s /q %temp%\pe-copyparty
python ..\copyparty-sfx.py --version

move %temp%\pe-copyparty\copyparty .\
move %temp%\pe-copyparty\partftpy .\
move %temp%\pe-copyparty\ftp\pyftpdlib .\
move %temp%\pe-copyparty\j2\jinja2 .\
move %temp%\pe-copyparty\j2\markupsafe .\

rd /s /q %temp%\pe-copyparty

python -m nuitka ^
    --onefile --deployment --python-flag=-m ^
    --include-package=markupsafe ^
    --include-package=jinja2 ^
    --include-package=partftpy ^
    --include-package=pyftpdlib ^
    --include-data-dir=copyparty\web=copyparty\web ^
    --include-data-dir=copyparty\res=copyparty\res ^
    --run copyparty


@@ -1,52 +0,0 @@
pyoxidizer doesn't crosscompile yet so need to build in a windows vm,
luckily possible to do mostly airgapped (https-proxy for crates)
none of this is version-specific but doing absolute links just in case
(only exception is py3.8 which is the final win7 ver)
# deps (download on linux host):
https://www.python.org/ftp/python/3.10.7/python-3.10.7-amd64.exe
https://github.com/indygreg/PyOxidizer/releases/download/pyoxidizer%2F0.22.0/pyoxidizer-0.22.0-x86_64-pc-windows-msvc.zip
https://github.com/upx/upx/releases/download/v3.96/upx-3.96-win64.zip
https://static.rust-lang.org/dist/rust-1.61.0-x86_64-pc-windows-msvc.msi
https://github.com/indygreg/python-build-standalone/releases/download/20220528/cpython-3.8.13%2B20220528-i686-pc-windows-msvc-static-noopt-full.tar.zst
# need cl.exe, prefer 2017 -- download on linux host:
https://visualstudio.microsoft.com/downloads/?q=build+tools
https://docs.microsoft.com/en-us/visualstudio/releases/2022/release-history#release-dates-and-build-numbers
https://aka.ms/vs/15/release/vs_buildtools.exe # 2017
https://aka.ms/vs/16/release/vs_buildtools.exe # 2019
https://aka.ms/vs/17/release/vs_buildtools.exe # 2022
https://docs.microsoft.com/en-us/visualstudio/install/workload-component-id-vs-build-tools?view=vs-2017
# use disposable w10 vm to prep offline installer; xfer to linux host with firefox to copyparty
vs_buildtools-2017.exe --add Microsoft.VisualStudio.Workload.MSBuildTools --add Microsoft.VisualStudio.Workload.VCTools --add Microsoft.VisualStudio.Component.Windows10SDK.17763 --layout c:\msbt2017 --lang en-us
# need two proxies on host; s5s or ssh for msys2(socks5), and tinyproxy for rust(http)
UP=- python3 socks5server.py 192.168.123.1 4321
ssh -vND 192.168.123.1:4321 localhost
git clone https://github.com/tinyproxy/tinyproxy.git
./autogen.sh
./configure --prefix=/home/ed/pe/tinyproxy
make -j24 install
printf '%s\n' >cfg "Port 4380" "Listen 192.168.123.1"
./tinyproxy -dccfg
https://github.com/msys2/msys2-installer/releases/download/2022-09-04/msys2-x86_64-20220904.exe
export all_proxy=socks5h://192.168.123.1:4321
# if chat dies after auth (2 messages) it probably failed dns, note the h in socks5h to tunnel dns
pacman -Syuu
pacman -S git patch mingw64/mingw-w64-x86_64-zopfli
cd /c && curl -k https://192.168.123.1:3923/ro/ox/msbt2017/?tar | tar -xv
first install certs from msbt/certificates then admin-cmd `vs_buildtools.exe --noweb`,
default selection (vc++2017-v15.9-v14.16, vc++redist, vc++bt-core) += win10sdk (for io.h)
install rust without documentation, python 3.10, put upx and pyoxidizer into ~/bin,
[cmd.exe] python -m pip install --user -U wheel-0.37.1.tar.gz strip-hints-0.1.10.tar.gz
p=192.168.123.1:4380; export https_proxy=$p; export http_proxy=$p
# and with all of the one-time-setup out of the way,
mkdir /c/d; cd /c/d && curl -k https://192.168.123.1:3923/cpp/gb?pw=wark > gb && git clone gb copyparty
cd /c/d/copyparty/ && curl -k https://192.168.123.1:3923/cpp/patch?pw=wark | patch -p1
cd /c/d/copyparty/scripts && CARGO_HTTP_CHECK_REVOKE=false PATH=/c/Users/$USER/AppData/Local/Programs/Python/Python310:/c/Users/$USER/bin:"$(cygpath "C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC\14.16.27023\bin\Hostx86\x86"):$PATH" ./make-sfx.sh ox ultra


@@ -47,3 +47,25 @@ and if you want to have a monospace font in the fancy markdown editor, do this:
NB: `<textarea id="mt">` and `<div id="mtr">` in the regular markdown editor must have the same font; none of the suggestions above will cause any issues but keep it in mind if you're getting creative
# `<head>`
to add stuff to the html `<head>`, for example a css `<link>` or `<meta>` tags, use either the global-option `--html-head` or the volflag `html_head`
if you give it the value `@ASDF` it will try to open a file named ASDF and send the text within
if the value starts with `%` it will assume a jinja2 template and expand it; the template has access to the `HttpCli` object through a property named `this` as well as everything in `j2a` and the stuff added by `self.j2s`; see [browser.html](https://github.com/9001/copyparty/blob/hovudstraum/copyparty/web/browser.html) for inspiration or look under the hood in [httpcli.py](https://github.com/9001/copyparty/blob/hovudstraum/copyparty/httpcli.py)
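for example (the meta tag is just an illustration and the filename is hypothetical):

    copyparty --html-head='<meta name="robots" content="noindex, nofollow">'
    copyparty --html-head=@my-head.html    # read the snippet from that file instead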
# translations
add your own translations by using the english or norwegian one from `browser.js` as a template
the easy way is to open up and modify `browser.js` in your own installation; depending on how you installed copyparty it might be named `browser.js.gz` instead, in which case just decompress it, restart copyparty, and start editing it anyways
if you're running `copyparty-sfx.py` then you'll find it at `/tmp/pe-copyparty.1000/copyparty/web` (on linux) or `%TEMP%\pe-copyparty\copyparty\web` (on windows)
* make sure to keep backups of your work religiously! since that location is volatile af
if editing `browser.js` is inconvenient in your setup then you can instead do this:
* add your translation to a separate javascript file (`tl.js`) and make it load before `browser.js` with the help of `--html-head='<script src="/tl.js"></script>'`
* as the page loads, `browser.js` will look for a function named `langmod` so define that function and make it insert your translation into the `Ls` and `LANGS` variables so it'll take effect


@@ -48,6 +48,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [filebrowser](#filebrowser)
* [filegator](#filegator)
* [sftpgo](#sftpgo)
* [arozos](#arozos)
* [updog](#updog)
* [goshs](#goshs)
* [gimme-that](#gimme-that)
@@ -93,6 +94,7 @@ the softwares,
* `j` = [filebrowser](https://github.com/filebrowser/filebrowser)
* `k` = [filegator](https://github.com/filegator/filegator)
* `l` = [sftpgo](https://github.com/drakkan/sftpgo)
* `m` = [arozos](https://github.com/tobychui/arozos)

some softwares not in the matrixes,
* [updog](#updog)
@@ -113,22 +115,22 @@ symbol legend,
## general
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ | █ | | intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ | | config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | | | good documentation | | | | █ | █ | █ | █ | | | █ | █ | | |
| runs on iOS | | | | | | | | | | | | | | runs on iOS | | | | | | | | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | | | runs on Android | █ | | | | | █ | | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | | | runs on WinXP | █ | █ | | | | █ | | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ | █ | | runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ | █ | |
| runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ | | runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ | |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ | | portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ | █ |
| zero setup, just go | █ | █ | █ | | | | █ | | | █ | | | | zero setup, just go | █ | █ | █ | | | | █ | | | █ | | | █ |
| android app | | | | █ | █ | | | | | | | | | android app | | | | █ | █ | | | | | | | | |
| iOS app | | | | █ | █ | | | | | | | | | iOS app | | | | █ | █ | | | | | | | | |
* `zero setup` = you can get a mostly working setup by just launching the app, without having to install any software or configure whatever
* `a`/copyparty remarks:
@@ -140,37 +142,39 @@ symbol legend,
* `f`/rclone must be started with the command `rclone serve webdav .` or similar
* `h`/chibisafe has undocumented windows support
* `i`/sftpgo must be launched with a command
* `m`/arozos has partial windows support

## file transfer
*the thing that copyparty is actually kinda good at*
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | | | █ | | █ | █ | | █ | | download folder as zip | █ | █ | █ | █ | | | █ | | █ | █ | | █ | |
| download folder as tar | █ | | | | | | | | | █ | | | | download folder as tar | █ | | | | | | | | | █ | | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | | | parallel uploads | █ | | | █ | █ | | • | | █ | | █ | | █ |
| resumable uploads | █ | | | | | | | | █ | | █ | | | resumable uploads | █ | | | | | | | | █ | | █ | | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | | | upload segmenting | █ | | | | | | | █ | █ | | █ | | █ |
| upload acceleration | █ | | | | | | | | █ | | █ | | | upload acceleration | █ | | | | | | | | █ | | █ | | |
| upload verification | █ | | | █ | █ | | | | █ | | | | | upload verification | █ | | | █ | █ | | | | █ | | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | | | upload deduplication | █ | | | | █ | | | | █ | | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | | | upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | | |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | | | race the beam ("p2p") | █ | | | | | | | | | • | | | |
| upload rules | | | | | | | | | | | | | | keep last-modified time | █ | | | | | | | | | | | | |
| ┗ max disk usage | █ | █ | | | | | | | | | | | | upload rules | | | | | | | | | | | | | |
| ┗ max filesize | █ | | | | | | | █ | | | █ | █ | | ┗ max disk usage | █ | █ | | | | | | | █ | | | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | | | ┗ max filesize | █ | | | | | | | | | | █ | █ | |
| ┗ max file age | █ | | | | | | | | | | | | | ┗ max items in folder | █ | | | | | | | | | | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | | | ┗ max file age | █ | | | | | | | | █ | | | | |
| ┗ compress before write | █ | | | | | | | | | | | | | ┗ max uploads over time | █ | | | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | | | | | | | ┗ compress before write | █ | | | | | | | | | | | | |
| ┗ mimetype reject-list | | | | | | | | | | | | | | ┗ randomize filename | | | | | | | | █ | █ | | | | |
| ┗ extension reject-list | | | | | | | | | • | | | | | ┗ mimetype reject-list | | | | | | | | | • | | | | • |
| checksums provided | | | | █ | █ | | | | | | | | | ┗ extension reject-list | | | | | | | | | | | | | • |
| cloud storage backend | | | | █ | █ | | | | | | | | | checksums provided | | | | █ | █ | | | | | | | | |
| cloud storage backend | | | | █ | █ | █ | | | | | █ | █ | |
* `upload segmenting` = files are sliced into chunks, making it possible to upload files larger than 100 MiB on cloudflare for example
@@ -178,6 +182,8 @@ symbol legend,
* `upload verification` = uploads are checksummed or otherwise confirmed to have been transferred correctly
* `race the beam` = files can be downloaded while they're still uploading; downloaders are slowed down such that the uploader is always ahead
* `checksums provided` = when downloading a file from the server, the file's checksum is provided for verification client-side
* `cloud storage backend` = able to serve files from (and write to) s3 or similar cloud services; `` means the software can do this with some help from `rclone mount` as a bridge
@@ -192,26 +198,27 @@ symbol legend,
* resumable/segmented uploads only over SFTP, not over HTTP
* upload rules are totals only, not over time
* can probably do extension/mimetype rejection similar to copyparty
* `m`/arozos download-as-zip is not streaming; it creates the full zipfile before download can start, and fails on big folders

## protocols and client support
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ | | serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ | █ |
| serve ftp (tcp) | █ | | | | | █ | | | | | | █ | | serve ftp (tcp) | █ | | | | | █ | | | | | | █ | █ |
| serve ftps (tls) | █ | | | | | █ | | | | | | █ | | serve ftps (tls) | █ | | | | | █ | | | | | | █ | |
| serve tftp (udp) | █ | | | | | | | | | | | | | serve tftp (udp) | █ | | | | | | | | | | | | |
| serve sftp (ssh) | | | | | | █ | | | | | | █ | | serve sftp (ssh) | | | | | | █ | | | | | | █ | █ |
| serve smb/cifs | | | | | | █ | | | | | | | | serve smb/cifs | | | | | | █ | | | | | | | |
| serve dlna | | | | | | █ | | | | | | | | serve dlna | | | | | | █ | | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ | | listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ | |
| zeroconf | █ | | | | | | | | | | | | | zeroconf | █ | | | | | | | | | | | | █ |
| supports netscape 4 | | | | | | █ | | | | | • | | | supports netscape 4 | | | | | | █ | | | | | • | | |
| ...internet explorer 6 | | █ | | █ | | █ | | | | | • | | | ...internet explorer 6 | | █ | | █ | | █ | | | | | • | | |
| mojibake filenames | █ | | | • | • | █ | █ | • | | • | | | | mojibake filenames | █ | | | • | • | █ | █ | • | | • | | | |
| undecodable filenames | █ | | | • | • | █ | | • | | | | | | undecodable filenames | █ | | | • | • | █ | | • | | | | | |
* `webdav` = protocol convenient for mounting a remote server as a local filesystem; see zeroconf:
* `zeroconf` = the server announces itself on the LAN, [automatically appearing](https://user-images.githubusercontent.com/241032/215344737-0eae8d98-9496-4256-9aa8-cd2f6971810d.png) on other zeroconf-capable devices
@@ -222,61 +229,66 @@ symbol legend,
* extremely minimal samba/cifs server
* netscape 4 / ie6 support is mostly listed as a joke altho some people have actually found it useful ([ie4 tho](https://user-images.githubusercontent.com/241032/118192791-fb31fe00-b446-11eb-9647-898ea8efc1f7.png))
* `l`/sftpgo translates mojibake filenames into valid utf-8 (information loss)
* `m`/arozos has readonly-support for older browsers; no uploading

## server configuration
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| config from cmd args | █ | | | | | █ | █ | | | █ | | | | config from cmd args | █ | | | | | █ | █ | | | █ | | | |
| config files | █ | █ | █ | | | █ | | █ | | █ | • | | | config files | █ | █ | █ | | | █ | | █ | | █ | • | | |
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | | | runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | | █ |
| same-port http / https | █ | | | | | | | | | | | | | same-port http / https | █ | | | | | | | | | | | | |
| listen multiple ports | █ | | | | | | | | | | | █ | | listen multiple ports | █ | | | | | | | | | | | █ | |
| virtual file system | █ | █ | █ | | | | █ | | | | | █ | | virtual file system | █ | █ | █ | | | | █ | | | | | █ | |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ | | reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ | |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | | | folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | | • |
* `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead
* `l`/sftpgo:
  * config: users must be added through gui / api calls
* `m`/arozos:
  * configuration is primarily through GUI
  * reverse-proxy is not guaranteed to see the correct client IP

## server capabilities
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| per-account chroot | | | | | | | | | | | | █ | | per-account chroot | | | | | | | | | | | | █ | |
| single-sign-on | | | | █ | █ | | | | • | | | | | single-sign-on | | | | █ | █ | | | | • | | | | |
| token auth | | | | █ | █ | | | █ | | | | | | token auth | | | | █ | █ | | | █ | | | | | █ |
| 2fa | | | | █ | █ | | | | | | | █ | | 2fa | | | | █ | █ | | | | | | | █ | |
| per-volume permissions | █ | █ | █ | █ | █ | █ | █ | | █ | █ | | █ | | per-volume permissions | █ | █ | █ | █ | █ | █ | █ | | █ | █ | | █ | █ |
| per-folder permissions | | | | █ | █ | | █ | | █ | █ | | █ | | per-folder permissions | | | | █ | █ | | █ | | █ | █ | | █ | █ |
| per-file permissions | | | | █ | █ | | █ | | █ | | | | | per-file permissions | | | | █ | █ | | █ | | █ | | | | █ |
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | | | per-file passwords | █ | | | █ | █ | | █ | | █ | | | | █ |
| unmap subfolders | █ | | | | | | █ | | | █ | | • | | unmap subfolders | █ | | | | | | █ | | | █ | | • | |
| index.html blocks list | | | | | | | █ | | | • | | | | index.html blocks list | | | | | | | █ | | | • | | | |
| write-only folders | █ | | | | | | | | | | █ | █ | | write-only folders | █ | | | | | | | | | | █ | █ | |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ | | files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | | | file versioning | | | | █ | █ | | | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | | █ | | file encryption | | | | █ | █ | █ | | | | | | █ | |
| file indexing | █ | | █ | █ | █ | | | █ | █ | █ | | | | file indexing | █ | | █ | █ | █ | | | █ | █ | █ | | | |
| ┗ per-volume db | █ | | • | • | • | | | • | • | | | | | ┗ per-volume db | █ | | • | • | • | | | • | • | | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | | | ┗ db stored in folder | █ | | | | | | | • | • | █ | | | |
| ┗ db stored out-of-tree | █ | | █ | █ | █ | | | • | • | █ | | | | ┗ db stored out-of-tree | █ | | █ | █ | █ | | | • | • | █ | | | |
| ┗ existing file tree | █ | | █ | | | | | | | █ | | | | ┗ existing file tree | █ | | █ | | | | | | | █ | | | |
| file action event hooks | █ | | | | | | | | | █ | | █ | | file action event hooks | █ | | | | | | | | | █ | | █ | • |
| one-way folder sync | █ | | | █ | █ | █ | | | | | | | | one-way folder sync | █ | | | █ | █ | █ | | | | | | | |
| full sync | | | | █ | █ | | | | | | | | | full sync | | | | █ | █ | | | | | | | | |
| speed throttle | | █ | █ | | | █ | | | █ | | | █ | | speed throttle | | █ | █ | | | █ | | | █ | | | █ | |
| anti-bruteforce | █ | █ | █ | █ | █ | | | | • | | | █ | | anti-bruteforce | █ | █ | █ | █ | █ | | | | • | | | █ | • |
| dyndns updater | | █ | | | | | | | | | | | | dyndns updater | | █ | | | | | | | | | | | |
| self-updater | | | █ | | | | | | | | | | | self-updater | | | █ | | | | | | | | | | █ |
| log rotation | █ | | █ | █ | █ | | | • | █ | | | █ | | log rotation | █ | | █ | █ | █ | | | • | █ | | | █ | • |
| upload tracking / log | █ | █ | • | █ | █ | | | █ | █ | | | | | upload tracking / log | █ | █ | • | █ | █ | | | █ | █ | | | | █ |
| curl-friendly ls | █ | | | | | | | | | | | | | prometheus metrics | █ | | | █ | | | | | | | | | |
| curl-friendly upload | █ | | | | | | | | | | | | | curl-friendly ls | █ | | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | █ | █ | • | | | | | |
* `unmap subfolders` = "shadowing"; mounting a local folder in the middle of an existing filesystem tree in order to disable access below that path
* `files stored as-is` = uploaded files are trivially readable from the server HDD, not sliced into chunks or in weird folder structures or anything like that
@@ -302,49 +314,51 @@ symbol legend,
* `l`/sftpgo:
  * `file action event hooks` also include on-download triggers
  * `upload tracking / log` in main logfile
* `m`/arozos:
  * `2fa` maybe possible through LDAP/Oauth

## client features
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - | | ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | | | single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | | █ |
| themes | █ | █ | | █ | | | | | █ | | | | | themes | █ | █ | | █ | | | | | █ | | | | |
| directory tree nav | █ | | | | █ | | | | █ | | | | | directory tree nav | █ | | | | █ | | | | █ | | | | |
| multi-column sorting | █ | | | | | | | | | | | | | multi-column sorting | █ | | | | | | | | | | | | |
| thumbnails | █ | | | | | | | █ | █ | | | | | thumbnails | █ | | | | | | | █ | █ | | | | █ |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | | | ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | | █ |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | | | ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | | █ |
| ┗ audio spectrograms | █ | | | | | | | | | | | | | ┗ audio spectrograms | █ | | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | | | | | audio player | █ | | | █ | █ | | | | █ | | | | █ |
| ┗ gapless playback | █ | | | | | | | | • | | | | | ┗ gapless playback | █ | | | | | | | | • | | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | | | ┗ audio equalizer | █ | | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | | | ┗ waveform seekbar | █ | | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | | | ┗ OS integration | █ | | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | | | ┗ transcode to lossy | █ | | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | | | video player | █ | | | █ | █ | | | | █ | █ | | | █ |
| ┗ video transcoding | | | | | | | | | █ | | | | | ┗ video transcoding | | | | | | | | | █ | | | | |
| audio BPM detector | █ | | | | | | | | | | | | | audio BPM detector | █ | | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | | | audio key detector | █ | | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | | | | search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | | | |
| search by date / size | █ | | | | █ | | | █ | █ | | | | | search by date / size | █ | | | | █ | | | █ | █ | | | | |
| search by bpm / key | █ | | | | | | | | | | | | | search by bpm / key | █ | | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | | | search by custom tags | | | | | | | | █ | █ | | | | |
| search in file contents | | | | █ | █ | | | | █ | | | | | search in file contents | | | | █ | █ | | | | █ | | | | |
| search by custom parser | █ | | | | | | | | | | | | | search by custom parser | █ | | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | | | find local file | █ | | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | | | undo recent uploads | █ | | | | | | | | | | | | |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ | | create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | | | image viewer | █ | | | █ | █ | | | | █ | █ | █ | | █ |
| markdown viewer | █ | | | | █ | | | | █ | | | | | markdown viewer | █ | | | | █ | | | | █ | | | | █ |
| markdown editor | █ | | | | █ | | | | █ | | | | | markdown editor | █ | | | | █ | | | | █ | | | | █ |
| readme.md in listing | █ | | | █ | | | | | | | | | | readme.md in listing | █ | | | █ | | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ | █ | | rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | | | batch rename | █ | | | | | | | | █ | | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | | | cut / paste files | █ | █ | | █ | █ | | | | █ | | | | █ |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | | | move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | | █ |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ | | delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | | | copy files | | | | | █ | | | | █ | █ | █ | | █ |
* `single-page app` = multitasking; possible to continue navigating while uploading
* `audio player » os-integration` = use the [lockscreen](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) or [media hotkeys](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) to play/pause, prev/next song
@@ -360,14 +374,14 @@ symbol legend,
## integration
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | | | | | OS alert on upload | █ | | | | | | | | | | | | |
| discord | █ | | | | | | | | | | | | | discord | █ | | | | | | | | | | | | |
| ┗ announce uploads | █ | | | | | | | | | | | | | ┗ announce uploads | █ | | | | | | | | | | | | |
| ┗ custom embeds | | | | | | | | | | | | | | ┗ custom embeds | | | | | | | | | | | | | |
| sharex | █ | | | █ | | █ | | █ | | | | | | sharex | █ | | | █ | | █ | | █ | | | | | |
| flameshot | | | | | | █ | | | | | | | | flameshot | | | | | | █ | | | | | | | |
* sharex `` = yes, but does not provide example sharex config
* `a`/copyparty remarks:
@@ -393,6 +407,7 @@ symbol legend,
| filebrowser | go | █ apl2 | 20 MB |
| filegator | php | █ mit | • |
| sftpgo | go | ‼ agpl | 44 MB |
| arozos | go | ░ gpl3 | 531 MB |
| updog | python | █ mit | 17 MB |
| goshs | go | █ mit | 11 MB |
| gimme-that | python | █ mit | 4.8 MB |
@@ -504,12 +519,14 @@ symbol legend,
* ✅ token auth (api keys)

## [kodbox](https://github.com/kalcaddle/kodbox)
* this thing is insane (but is getting competition from [arozos](#arozos))
* php; [docker](https://hub.docker.com/r/kodcloud/kodbox)
* 🔵 *upload segmenting, acceleration, and integrity checking!*
* ⚠️ but uploads are not resumable(?)
* ⚠️ not portable
* ⚠️ isolated on-disk file hierarchy, incompatible with other software
* ⚠️ uploading small files to copyparty is 16x faster
* ⚠️ uploading large files to copyparty is 3x faster
* ⚠️ http/webdav only; no ftp or zeroconf
* ⚠️ some parts of the GUI are in chinese
* ✅ fantastic ui/ux
@@ -569,6 +586,24 @@ symbol legend,
* ✅ on-download event hook (otherwise same as copyparty)
* ✅ more extensive permissions control
## [arozos](https://github.com/tobychui/arozos)
* big suite of applications similar to [kodbox](#kodbox), copyparty is better at downloading/uploading/music/indexing but arozos has other advantages
* go; primarily linux (limited support for windows)
* ⚠️ uploads not resumable / integrity-checked
* ⚠️ uploading small files to copyparty is 2.7x faster
* ⚠️ uploading large files to copyparty is at least 10% faster
* arozos is websocket-based, 512 KiB chunks; writes each chunk to separate files and then merges
* copyparty splices directly into the final file; faster and better for the HDD and filesystem
* ⚠️ no directory tree navpane; not as easy to navigate
* ⚠️ download-as-zip is not streaming; creates a temp.file on the server
* ⚠️ not self-contained (pulls from jsdelivr)
* ⚠️ has an audio player, but supports less filetypes
* ⚠️ limited support for configuring real-ip detection
* ✅ sftp server
* ✅ settings gui
* ✅ good-looking gui
* ✅ an IDE, msoffice viewer, rich host integration, much more
## [updog](https://github.com/sc0tfree/updog) ## [updog](https://github.com/sc0tfree/updog)
* python; cross-platform * python; cross-platform
* basic directory listing with upload feature * basic directory listing with upload feature


@@ -1,48 +0,0 @@
# builds win7-i386 exe on win10-ltsc-1809(17763.316)
# see docs/pyoxidizer.txt

def make_exe():
    dist = default_python_distribution(flavor="standalone_static", python_version="3.8")
    policy = dist.make_python_packaging_policy()
    policy.allow_files = True
    policy.allow_in_memory_shared_library_loading = True
    #policy.bytecode_optimize_level_zero = True
    #policy.include_distribution_sources = False  # error instantiating embedded Python interpreter: during initializing Python main: init_fs_encoding: failed to get the Python codec of the filesystem encoding
    policy.include_distribution_resources = False
    policy.include_non_distribution_sources = False
    policy.include_test = False
    python_config = dist.make_python_interpreter_config()
    #python_config.module_search_paths = ["$ORIGIN/lib"]
    python_config.run_module = "copyparty"
    exe = dist.to_python_executable(
        name="copyparty",
        config=python_config,
        packaging_policy=policy,
    )
    exe.windows_runtime_dlls_mode = "never"
    exe.windows_subsystem = "console"
    exe.add_python_resources(exe.read_package_root(
        path="sfx",
        packages=[
            "copyparty",
            "jinja2",
            "markupsafe",
            "pyftpdlib",
            "python-magic",
        ]
    ))
    return exe

def make_embedded_resources(exe):
    return exe.to_embedded_resources()

def make_install(exe):
    files = FileManifest()
    files.add_python_resource("copyparty", exe)
    return files

register_target("exe", make_exe)
register_target("resources", make_embedded_resources, depends=["exe"], default_build_script=True)
register_target("install", make_install, depends=["exe"], default=True)

resolve_targets()


@@ -3,7 +3,7 @@ WORKDIR /z
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
    ver_hashwasm=4.10.0 \
    ver_marked=4.3.0 \
    ver_dompf=3.1.5 \
    ver_mde=2.18.0 \
    ver_codemirror=5.65.16 \
    ver_fontawesome=5.13.0 \
@@ -24,7 +24,9 @@ ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
# the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
RUN mkdir -p /z/dist/no-pk \
    && wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
    && apk add \
        bash brotli cmake make g++ git gzip lame npm patch pigz \
        python3 python3-dev py3-brotli sox tar unzip wget \
    && rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
    && wget https://github.com/openpgpjs/asmcrypto.js/archive/$ver_asmcrypto.tar.gz -O asmcrypto.tgz \
    && wget https://github.com/markedjs/marked/archive/v$ver_marked.tar.gz -O marked.tgz \
@@ -58,6 +60,11 @@ RUN mkdir -p /z/dist/no-pk \
    && tar -xf zopfli.tgz

COPY busy-mp3.sh /z/
RUN /z/busy-mp3.sh \
    && mv -v /dev/shm/busy.mp3.gz /z/dist

# build fonttools (which needs zopfli)
RUN tar -xf zopfli.tgz \
    && cd zopfli* \

scripts/deps-docker/busy-mp3.sh Executable file

@@ -0,0 +1,61 @@
#!/bin/bash
set -e
cat >/dev/null <<EOF
a frame is 1152 samples
1sec @ 48000 = 41.66 frames
11 frames = 12672 samples = 0.264 sec
22 frames = 25344 samples = 0.528 sec
EOF
fast=1
fast=
echo
mkdir -p /dev/shm/$1
cd /dev/shm/$1
find -maxdepth 1 -type f -iname 'a.*.mp3*' -delete
min=99999999
for freq in 425; do # {400..500}
for vol in 0; do # {10..30}
for kbps in 32; do
for fdur in 1124; do # {800..1200}
for fdu2 in 1042; do # {800..1200}
for ftyp in h; do # q h t l p
for ofs1 in 9214; do # {0..32768}
for ofs2 in 0; do # {0..4096}
for ofs3 in 0; do # {0..4096}
for nores in --nores; do # '' --nores
f=a.b$kbps$nores-f$freq-v$vol-$ftyp$fdur-$fdu2-o$ofs1-$ofs2-$ofs3
sox -r48000 -Dn -r48000 -b16 -c2 -t raw s1.pcm synth 25344s sin $freq vol 0.$vol fade $ftyp ${fdur}s 25344s ${fdu2}s
sox -r48000 -Dn -r48000 -b16 -c2 -t raw s0.pcm synth 12672s sin $freq vol 0
tail -c +$ofs1 s0.pcm >s0a.pcm
tail -c +$ofs2 s0.pcm >s0b.pcm
tail -c +$ofs3 s0.pcm >s0c.pcm
cat s{0a,1,0,0b,1,0c}.pcm > a.pcm
lame --silent -r -s 48 --bitwidth 16 --signed a.pcm -m j --resample 48 -b $kbps -q 0 $nores $f.mp3
if [ $fast ]
then gzip -c9 <$f.mp3 >$f.mp3.gz
else pigz -c11 -I1 <$f.mp3 >$f.mp3.gz
fi
sz=$(wc -c <$f.mp3.gz)
printf '\033[A%d %s\033[K\n' $sz $f
[ $sz -le $((min+10)) ] && echo
[ $sz -le $min ] && echo && min=$sz
done;done;done;done;done;done;done;done;done;done
true
f=a.b32--nores-f425-v0-h1124-1042-o9214-0-0.mp3
[ $fast ] &&
pigz -c11 -I1 <$f >busy.mp3.gz ||
mv $f.gz busy.mp3.gz
sz=$(wc -c <busy.mp3.gz)
[ "$sz" -eq 106 ] &&
echo busy.mp3 built successfully ||
echo WARNING: unexpected size of busy.mp3
find -maxdepth 1 -type f -iname 'a.*.mp3*' -delete

scripts/make-pyz.sh Executable file

@@ -0,0 +1,61 @@
#!/bin/bash
set -e
echo
# port install gnutar gsed coreutils
gtar=$(command -v gtar || command -v gnutar) || true
[ ! -z "$gtar" ] && command -v gsed >/dev/null && {
tar() { $gtar "$@"; }
sed() { gsed "$@"; }
command -v grealpath >/dev/null &&
realpath() { grealpath "$@"; }
}
targs=(--owner=1000 --group=1000)
[ "$OSTYPE" = msys ] &&
targs=()
[ -e copyparty/__main__.py ] || cd ..
[ -e copyparty/__main__.py ] || {
echo "run me from within the project root folder"
echo
exit 1
}
[ -e sfx/copyparty/__main__.py ] || {
echo "run ./scripts/make-sfx.py first"
echo
exit 1
}
rm -rf pyz
mkdir -p pyz
cd pyz
cp -pR ../sfx/{copyparty,partftpy} .
cp -pR ../sfx/{ftp,j2}/* .
ts=$(date -u +%s)
hts=$(date -u +%Y-%m%d-%H%M%S)
ver="$(cat ../sfx/ver)"
mkdir -p ../dist
pyz_out=../dist/copyparty.pyz
echo creating z.tar
( cd copyparty
tar -cf z.tar "${targs[@]}" --numeric-owner web res
rm -rf web res
)
echo creating loader
sed -r 's/^(VER = ).*/\1"'"$ver"'"/; s/^(STAMP = ).*/\1'$(date +%s)/ \
<../scripts/ziploader.py \
>__main__.py
echo creating pyz
rm -f $pyz_out
zip -9 -q -r $pyz_out *
echo done:
echo " $(realpath $pyz_out)"


@@ -16,8 +16,6 @@ help() { exec cat <<'EOF'
# `re` does a repack of an sfx which you already executed once # `re` does a repack of an sfx which you already executed once
# (grabs files from the sfx-created tempdir), overrides `clean` # (grabs files from the sfx-created tempdir), overrides `clean`
# #
# `ox` builds a pyoxidizer exe instead of py
#
# `gz` creates a gzip-compressed python sfx instead of bzip2 # `gz` creates a gzip-compressed python sfx instead of bzip2
# #
# `lang` limits which languages/translations to include, # `lang` limits which languages/translations to include,
@@ -101,9 +99,6 @@ pybin=$(command -v python3 || command -v python) || {
exit 1 exit 1
} }
[ $CSN ] ||
CSN=sfx
langs= langs=
use_gz= use_gz=
zopf=2560 zopf=2560
@@ -111,7 +106,6 @@ while [ ! -z "$1" ]; do
case $1 in case $1 in
clean) clean=1 ; ;; clean) clean=1 ; ;;
re) repack=1 ; ;; re) repack=1 ; ;;
ox) use_ox=1 ; ;;
gz) use_gz=1 ; ;; gz) use_gz=1 ; ;;
gzz) shift;use_gzz=$1;use_gz=1; ;; gzz) shift;use_gzz=$1;use_gz=1; ;;
no-ftp) no_ftp=1 ; ;; no-ftp) no_ftp=1 ; ;;
@@ -151,9 +145,9 @@ stamp=$(
done | sort | tail -n 1 | sha1sum | cut -c-16 done | sort | tail -n 1 | sha1sum | cut -c-16
) )
rm -rf $CSN/* rm -rf sfx$CSN/*
mkdir -p $CSN build mkdir -p sfx$CSN build
cd $CSN cd sfx$CSN
tmpdir="$( tmpdir="$(
printf '%s\n' "$TMPDIR" /tmp | printf '%s\n' "$TMPDIR" /tmp |
@@ -389,11 +383,13 @@ git describe --tags >/dev/null 2>/dev/null && {
ver="$(awk '/^VERSION *= \(/ { ver="$(awk '/^VERSION *= \(/ {
gsub(/[^0-9,a-g-]/,""); gsub(/,/,"."); print; exit}' < copyparty/__version__.py)" gsub(/[^0-9,a-g-]/,""); gsub(/,/,"."); print; exit}' < copyparty/__version__.py)"
echo "$ver" >ver # pyz
ts=$(date -u +%s) ts=$(date -u +%s)
hts=$(date -u +%Y-%m%d-%H%M%S) # --date=@$ts (thx osx) hts=$(date -u +%Y-%m%d-%H%M%S) # --date=@$ts (thx osx)
mkdir -p ../dist mkdir -p ../dist
sfx_out=../dist/copyparty-$CSN sfx_out=../dist/copyparty-sfx$CSN
echo cleanup echo cleanup
find -name '*.pyc' -delete find -name '*.pyc' -delete
@@ -461,8 +457,8 @@ rm -f ftp/pyftpdlib/{__main__,prefork}.py
iawk '/^\}/{l=0} !l; /^var Ls =/{l=1;next} o; /^\t["}]/{o=0} /^\t"'"$langs"'"/{o=1;print}' $f iawk '/^\}/{l=0} !l; /^var Ls =/{l=1;next} o; /^\t["}]/{o=0} /^\t"'"$langs"'"/{o=1;print}' $f
done done
[ ! $repack ] && [ ! $use_ox ] && { [ ! $repack ] && {
# uncomment; oxidized drops 45 KiB but becomes undebuggable # uncomment
find | grep -E '\.py$' | find | grep -E '\.py$' |
grep -vE '__version__' | grep -vE '__version__' |
tr '\n' '\0' | tr '\n' '\0' |
@@ -545,7 +541,7 @@ gzres() {
} }
zdir="$tmpdir/cpp-mk$CSN" zdir="$tmpdir/cpp-mksfx$CSN"
[ -e "$zdir/$stamp" ] || rm -rf "$zdir" [ -e "$zdir/$stamp" ] || rm -rf "$zdir"
mkdir -p "$zdir" mkdir -p "$zdir"
echo a > "$zdir/$stamp" echo a > "$zdir/$stamp"
@@ -570,33 +566,6 @@ nf=$(ls -1 "$zdir"/arc.* 2>/dev/null | wc -l)
} }
[ $use_ox ] && {
tgt=x86_64-pc-windows-msvc
tgt=i686-pc-windows-msvc # 2M smaller (770k after upx)
bdir=build/$tgt/release/install/copyparty
t="res web"
(printf "\n\n\nBUT WAIT! THERE'S MORE!!\n\n";
cat ../$bdir/COPYING.txt) >> copyparty/res/COPYING.txt ||
echo "copying.txt 404 pls rebuild"
mv ftp/* j2/* .
rm -rf ftp j2 py2 py37
(cd copyparty; tar -cvf z.tar $t; rm -rf $t)
cd ..
pyoxidizer build --release --target-triple $tgt
mv $bdir/copyparty.exe dist/
cp -pv "$(for d in '/c/Program Files (x86)/Microsoft Visual Studio/'*'/BuildTools/VC/Redist/MSVC'; do
find "$d" -name vcruntime140.dll; done | sort | grep -vE '/x64/|/onecore/' | head -n 1)" dist/
dist/copyparty.exe --version
cp -pv dist/copyparty{,.orig}.exe
[ $ultra ] && a="--best --lzma" || a=-1
/bin/time -f %es upx $a dist/copyparty.exe >/dev/null
ls -al dist/copyparty{,.orig}.exe
exit 0
}
echo gen tarlist echo gen tarlist
for d in copyparty partftpy magic j2 py2 py37 ftp; do find $d -type f || true; done | # strip_hints for d in copyparty partftpy magic j2 py2 py37 ftp; do find $d -type f || true; done | # strip_hints
sed -r 's/(.*)\.(.*)/\2 \1/' | LC_ALL=C sort | sed -r 's/(.*)\.(.*)/\2 \1/' | LC_ALL=C sort |


@@ -118,4 +118,12 @@ base64 | head -c12 >> dist/copyparty.exe
dist/copyparty.exe --version dist/copyparty.exe --version
curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$esuf.exe csum=$(sha512sum <dist/copyparty.exe | cut -c-56)
curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$esuf.exe >uplod.log
cat uplod.log
grep -q $csum uplod.log && echo upload OK || {
echo UPLOAD FAILED
exit 1
}
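
The new upload step above does not trust curl's exit code alone: it takes the first 56 hex characters of the exe's sha512, uploads, and greps the server's reply for that digest. A rough Python equivalent of the same check, as a sketch only (the URL, the PW header and the use of requests are placeholders, not taken from the build script):

import hashlib, sys
import requests  # assumed available; any HTTP client would do

def upload_and_verify(path, url, pw):
    data = open(path, "rb").read()
    csum = hashlib.sha512(data).hexdigest()[:56]
    r = requests.put(url, data=data, headers={"PW": pw}, verify=False)
    print(r.text)  # same role as `cat uplod.log`
    if csum not in r.text:  # the reply is expected to echo the file's sha512
        sys.exit("UPLOAD FAILED")
    print("upload OK")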


@@ -1,33 +1,33 @@
f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f84beadd139bfeef4982e40f21ca5d9065cb794eeb0e473e82 altgraph-0.17.4-py2.py3-none-any.whl f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f84beadd139bfeef4982e40f21ca5d9065cb794eeb0e473e82 altgraph-0.17.4-py2.py3-none-any.whl
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe e0d2e6183437af321a36944f04a501e85181243e5fa2da3254254305dd8119161f62048bc56bff8849b49f546ff175b02b4c999401f1c404f6b88e6f46a9c96e Git-2.44.0-32-bit.exe
9d2c31701a4d3fef553928c00528a48f9e1854ab5333528b50e358a214eba90029d687f039bcda5760b6fdf9f2de3bcf3784ae21a6374cf2a97a845d33b636c6 packaging-24.0-py3-none-any.whl
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl 17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
126ca016c00256f4ff13c88707ead21b3b98f3c665ae57a5bcbb80c8be3004bff36d9c7f9a1cc9d20551019708f2b195154f302d80a1e5a2026d6d0fe9f3d5f4 pyinstaller_hooks_contrib-2024.3-py2.py3-none-any.whl 126ca016c00256f4ff13c88707ead21b3b98f3c665ae57a5bcbb80c8be3004bff36d9c7f9a1cc9d20551019708f2b195154f302d80a1e5a2026d6d0fe9f3d5f4 pyinstaller_hooks_contrib-2024.3-py2.py3-none-any.whl
749a473646c6d4c7939989649733d4c7699fd1c359c27046bf5bc9c070d1a4b8b986bbc65f60d7da725baf16dbfdd75a4c2f5bb8335f2cb5685073f5fee5c2d1 pywin32_ctypes-0.2.2-py3-none-any.whl 749a473646c6d4c7939989649733d4c7699fd1c359c27046bf5bc9c070d1a4b8b986bbc65f60d7da725baf16dbfdd75a4c2f5bb8335f2cb5685073f5fee5c2d1 pywin32_ctypes-0.2.2-py3-none-any.whl
6e0d854040baff861e1647d2bece7d090bc793b2bd9819c56105b94090df54881a6a9b43ebd82578cd7c76d47181571b671e60672afd9def389d03c9dae84fcf setuptools-68.2.2-py3-none-any.whl 6e0d854040baff861e1647d2bece7d090bc793b2bd9819c56105b94090df54881a6a9b43ebd82578cd7c76d47181571b671e60672afd9def389d03c9dae84fcf setuptools-68.2.2-py3-none-any.whl
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
# u2c (win7) # u2c (win7)
f3390290b896019b2fa169932390e4930d1c03c014e1f6db2405ca2eb1f51f5f5213f725885853805b742997b0edb369787e5c0069d217bc4e8b957f847f58b6 certifi-2023.11.17-py3-none-any.whl 7a3bd4849f95e1715fe2e99613df70a0fedd944a9bfde71a0fadb837fe62c3431c30da4f0b75c74de6f1a459f1fdf7cb62eaf404fdbe45e2d121e0b1021f1580 certifi-2024.2.2-py3-none-any.whl
904eb57b13bea80aea861de86987e618665d37fa9ea0856e0125a9ba767a53e5064de0b9c4735435a2ddf4f16f7f7d2c75a682e1de83d9f57922bdca8e29988c charset_normalizer-3.3.0-cp37-cp37m-win32.whl 9cc8acc5e269e6421bc32bb89261101da29d6ca337d39d60b9106de9ed7904e188716e4a48d78a2c4329026443fcab7acab013d2fe43778e30d6c4e4506a1b91 charset_normalizer-3.3.2-cp37-cp37m-win32.whl
ffdd45326f4e91c02714f7a944cbcc2fdd09299f709cfa8aec0892053eef0134fb80d9ba3790afd319538a86feb619037cbf533e2f5939cb56b35bb17f56c858 idna-3.4-py3-none-any.whl 0ec1ae5c928b4a0001a254c8598b746049406e1eed720bfafa94d4474078eff76bf6e032124e2d4df4619052836523af36162443c6d746487b387d2e3476e691 idna-3.6-py3-none-any.whl
b795abb26ba2f04f1afcfb196f21f638014b26c8186f8f488f1c2d91e8e0220962fbd259dbc9c3875222eb47fc95c73fc0606aaa6602b9ebc524809c9ba3501f requests-2.31.0-py3-none-any.whl b795abb26ba2f04f1afcfb196f21f638014b26c8186f8f488f1c2d91e8e0220962fbd259dbc9c3875222eb47fc95c73fc0606aaa6602b9ebc524809c9ba3501f requests-2.31.0-py3-none-any.whl
5a25cb9b79bb6107f9055dc3e9f62ebc6d4d9ca2c730d824985c93cd82406b723c200d6300c5064e42ee9fc7a2853d6ec6661394f3ed7bac03750e1f2a6840d1 urllib3-1.26.17-py2.py3-none-any.whl 61ed4500b6361632030f05229705c5c5a52cb47e31c0e6b55151c8f3beed631cd752ca6c3d6393d56a2acf6a453cfcf801e877116123c550922249c3a976e0f4 urllib3-1.26.18-py2.py3-none-any.whl
# win7 # win7
91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz d130bfa136bd171b9972b5c281c578545f2a84a909fdf18a6d2d71dd12fb3d512a7a1fa5cf7300433adece1d306eb2f22d7278f4c90e744e04dc67ba627a82c0 future-1.0.0-py3-none-any.whl
c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl 0b4d07434bf8d314f42893d90bce005545b44a509e7353a73cad26dc9360b44e2824218a1a74f8174d02eba87fba91baffa82c8901279a32ebc6b8386b1b4275 importlib_metadata-6.7.0-py3-none-any.whl
016a8cbd09384f1a9a44cb0e8274df75a8bcb2f3966bb5d708c62145289efaa5db98f75256c97e4f8046735ce2e529fbb076f284a46cdb716e89a75660200ad9 pip-23.2.1-py3-none-any.whl 5d7462a584105bccaa9cf376f5a8c5827ead099c813c8af7392d478a4398f373d9e8cac7bbad2db51b335411ab966b21e119b1b1234c9a7ab70c6ddfc9306da6 pip-24.0-py3-none-any.whl
f298e34356b5590dde7477d7b3a88ad39c622a2bcf3fcd7c53870ce8384dd510f690af81b8f42e121a22d3968a767d2e07595036b2ed7049c8ef4d112bcf3a61 pyinstaller-5.13.2-py3-none-win32.whl f298e34356b5590dde7477d7b3a88ad39c622a2bcf3fcd7c53870ce8384dd510f690af81b8f42e121a22d3968a767d2e07595036b2ed7049c8ef4d112bcf3a61 pyinstaller-5.13.2-py3-none-win32.whl
6bb73cc2db795c59c92f2115727f5c173cacc9465af7710db9ff2f2aec2d73130d0992d0f16dcb3fac222dc15c0916562d0813b2337401022020673a4461df3d python-3.7.9-amd64.exe 6bb73cc2db795c59c92f2115727f5c173cacc9465af7710db9ff2f2aec2d73130d0992d0f16dcb3fac222dc15c0916562d0813b2337401022020673a4461df3d python-3.7.9-amd64.exe
500747651c87f59f2436c5ab91207b5b657856e43d10083f3ce27efb196a2580fadd199a4209519b409920c562aaaa7dcbdfb83ed2072a43eaccae6e2d056f31 python-3.7.9.exe 500747651c87f59f2436c5ab91207b5b657856e43d10083f3ce27efb196a2580fadd199a4209519b409920c562aaaa7dcbdfb83ed2072a43eaccae6e2d056f31 python-3.7.9.exe
03e50aecc85914567c114e38a1777e32628ee098756f37177bc23220eab33ac7d3ff591fd162db3b4d4e34d55cee93ef0dc67af68a69c38bb1435e0768dee57e typing_extensions-4.7.1-py3-none-any.whl
2e04acff170ca3bbceeeb18489c687126c951ec0bfd53cccfb389ba8d29a4576c1a9e8f2e5ea26c84dd21bfa2912f4e71fa72c1e2653b71e34afc0e65f1722d4 upx-4.2.2-win32.zip 2e04acff170ca3bbceeeb18489c687126c951ec0bfd53cccfb389ba8d29a4576c1a9e8f2e5ea26c84dd21bfa2912f4e71fa72c1e2653b71e34afc0e65f1722d4 upx-4.2.2-win32.zip
68e1b618d988be56aaae4e2eb92bc0093627a00441c1074ebe680c41aa98a6161e52733ad0c59888c643a33fe56884e4f935178b2557fbbdd105e92e0d993df6 windows6.1-kb2533623-x64.msu 68e1b618d988be56aaae4e2eb92bc0093627a00441c1074ebe680c41aa98a6161e52733ad0c59888c643a33fe56884e4f935178b2557fbbdd105e92e0d993df6 windows6.1-kb2533623-x64.msu
479a63e14586ab2f2228208116fc149ed8ee7b1e4ff360754f5bda4bf765c61af2e04b5ef123976623d04df4976b7886e0445647269da81436bd0a7b5671d361 windows6.1-kb2533623-x86.msu 479a63e14586ab2f2228208116fc149ed8ee7b1e4ff360754f5bda4bf765c61af2e04b5ef123976623d04df4976b7886e0445647269da81436bd0a7b5671d361 windows6.1-kb2533623-x86.msu
ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a25067512b8f75538c67c987cf3944bfa0229e3cb677e2fb81e763e zipp-3.10.0-py3-none-any.whl ac96786e5d35882e0c5b724794329c9125c2b86ae7847f17acfc49f0d294312c6afc1c3f248655de3f0ccb4ca426d7957d02ba702f4a15e9fcd7e2c314e72c19 zipp-3.15.0-py3-none-any.whl
# win10 # win10
e3e2e6bd511dec484dd0292f4c46c55c88a885eabf15413d53edea2dd4a4dbae1571735b9424f78c0cd7f1082476a8259f31fd3f63990f726175470f636df2b3 Jinja2-3.1.3-py3-none-any.whl d1420c8417fad7888766dd26b9706a87c63e8f33dceeb8e26d0056d5127b0b3ed9272e44b4b761132d4b3320327252eab1696520488ca64c25958896b41f547b jinja2-3.1.4-py3-none-any.whl
e21495f1d473d855103fb4a243095b498ec90eb68776b0f9b48e994990534f7286c0292448e129c507e5d70409f8a05cca58b98d59ce2a815993d0a873dfc480 MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl e21495f1d473d855103fb4a243095b498ec90eb68776b0f9b48e994990534f7286c0292448e129c507e5d70409f8a05cca58b98d59ce2a815993d0a873dfc480 MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl
8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl 8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
656015f5cc2c04aa0653ee5609c39a7e5f0b6a58c84fe26b20bd070c52d20b4effb810132f7fb771168483e9fd975cc3302837dd7a1a687ee058b0460c857cc4 packaging-23.2-py3-none-any.whl 1dfe6f66bef5c9d62c9028a964196b902772ec9e19db215f3f41acb8d2d563586988d81b455fa6b895b434e9e1e9d57e4d271d1b1214483bdb3eadffcbba6a33 pillow-10.3.0-cp311-cp311-win_amd64.whl
424e20dc7263a31d524307bc39ed755a9dd82f538086fff68d98dd97e236c9b00777a8ac2e3853081b532b0e93cef44983e74d0ab274877440e8b7341b19358a pillow-10.2.0-cp311-cp311-win_amd64.whl
8760eab271e79256ae3bfb4af8ccc59010cb5d2eccdd74b325d1a533ae25eb127d51c2ec28ff90d449afed32dd7d6af62934fe9caaf1ae1f4d4831e948e912da pyinstaller-6.5.0-py3-none-win_amd64.whl 8760eab271e79256ae3bfb4af8ccc59010cb5d2eccdd74b325d1a533ae25eb127d51c2ec28ff90d449afed32dd7d6af62934fe9caaf1ae1f4d4831e948e912da pyinstaller-6.5.0-py3-none-win_amd64.whl
e6bdbae1affd161e62fc87407c912462dfe875f535ba9f344d0c4ade13715c947cd3ae832eff60f1bad4161938311d06ac8bc9b52ef203f7b0d9de1409f052a5 python-3.11.8-amd64.exe 897a14d5ee5cbc6781a0f48beffc27807a4f789d58c4329d899233f615d168a5dcceddf7f8f2d5bb52212ddcf3eba4664590d9f1fdb25bb5201f44899e03b2f7 python-3.11.9-amd64.exe
729dc52f1a02bc6274d012ce33f534102975a828cba11f6029600ea40e2d23aefeb07bf4ae19f9621d0565dd03eb2635bbb97d45fb692c1f756315e8c86c5255 upx-4.2.2-win64.zip 729dc52f1a02bc6274d012ce33f534102975a828cba11f6029600ea40e2d23aefeb07bf4ae19f9621d0565dd03eb2635bbb97d45fb692c1f756315e8c86c5255 upx-4.2.2-win64.zip


@@ -8,7 +8,7 @@ run ./build.sh in git-bash to build + upload the exe
to obtain the files referenced below, see ./deps.txt to obtain the files referenced below, see ./deps.txt
download + install git (32bit OK on 64): download + install git (32bit OK on 64):
http://192.168.123.1:3923/ro/pyi/Git-2.39.1-32-bit.exe http://192.168.123.1:3923/ro/pyi/Git-2.44.0-32-bit.exe
===[ copy-paste into git-bash ]================================ ===[ copy-paste into git-bash ]================================
uname -s | grep NT-10 && w10=1 || { uname -s | grep NT-10 && w10=1 || {
@@ -16,6 +16,7 @@ uname -s | grep NT-10 && w10=1 || {
} }
fns=( fns=(
altgraph-0.17.4-py2.py3-none-any.whl altgraph-0.17.4-py2.py3-none-any.whl
packaging-24.0-py3-none-any.whl
pefile-2023.2.7-py3-none-any.whl pefile-2023.2.7-py3-none-any.whl
pyinstaller_hooks_contrib-2024.3-py2.py3-none-any.whl pyinstaller_hooks_contrib-2024.3-py2.py3-none-any.whl
pywin32_ctypes-0.2.2-py3-none-any.whl pywin32_ctypes-0.2.2-py3-none-any.whl
@@ -23,29 +24,28 @@ fns=(
) )
[ $w10 ] && fns+=( [ $w10 ] && fns+=(
pyinstaller-6.5.0-py3-none-win_amd64.whl pyinstaller-6.5.0-py3-none-win_amd64.whl
Jinja2-3.1.3-py3-none-any.whl Jinja2-3.1.4-py3-none-any.whl
MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl
mutagen-1.47.0-py3-none-any.whl mutagen-1.47.0-py3-none-any.whl
packaging-23.2-py3-none-any.whl pillow-10.3.0-cp311-cp311-win_amd64.whl
pillow-10.2.0-cp311-cp311-win_amd64.whl python-3.11.9-amd64.exe
python-3.11.8-amd64.exe
upx-4.2.2-win64.zip upx-4.2.2-win64.zip
) )
[ $w7 ] && fns+=( [ $w7 ] && fns+=(
pyinstaller-5.13.2-py3-none-win32.whl pyinstaller-5.13.2-py3-none-win32.whl
certifi-2023.11.17-py3-none-any.whl certifi-2024.2.2-py3-none-any.whl
chardet-5.1.0-py3-none-any.whl charset_normalizer-3.3.2-cp37-cp37m-win32.whl
idna-3.4-py3-none-any.whl idna-3.6-py3-none-any.whl
requests-2.28.2-py3-none-any.whl requests-2.31.0-py3-none-any.whl
urllib3-1.26.14-py2.py3-none-any.whl urllib3-1.26.18-py2.py3-none-any.whl
upx-4.2.2-win32.zip upx-4.2.2-win32.zip
) )
[ $w7 ] && fns+=( [ $w7 ] && fns+=(
future-0.18.2.tar.gz future-1.0.0-py3-none-any.whl
importlib_metadata-5.0.0-py3-none-any.whl importlib_metadata-6.7.0-py3-none-any.whl
pip-23.2.1-py3-none-any.whl pip-24.0-py3-none-any.whl
typing_extensions-4.4.0-py3-none-any.whl typing_extensions-4.7.1-py3-none-any.whl
zipp-3.10.0-py3-none-any.whl zipp-3.15.0-py3-none-any.whl
) )
[ $w7x64 ] && fns+=( [ $w7x64 ] && fns+=(
windows6.1-kb2533623-x64.msu windows6.1-kb2533623-x64.msu
@@ -77,9 +77,10 @@ yes | unzip upx-*-win32.zip &&
mv upx-*/upx.exe . && mv upx-*/upx.exe . &&
python -m ensurepip && python -m ensurepip &&
{ [ $w10 ] || python -m pip install --user -U pip-*.whl; } && { [ $w10 ] || python -m pip install --user -U pip-*.whl; } &&
{ [ $w7 ] || python -m pip install --user -U {packaging,setuptools,mutagen,Pillow,Jinja2,MarkupSafe}-*.whl; } && python -m pip install --user -U packaging-*.whl &&
{ [ $w7 ] || python -m pip install --user -U {setuptools,mutagen,Pillow,Jinja2,MarkupSafe}-*.whl; } &&
{ [ $w10 ] || python -m pip install --user -U {requests,urllib3,charset_normalizer,certifi,idna}-*.whl; } && { [ $w10 ] || python -m pip install --user -U {requests,urllib3,charset_normalizer,certifi,idna}-*.whl; } &&
{ [ $w10 ] || python -m pip install --user -U future-*.tar.gz importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } && { [ $w10 ] || python -m pip install --user -U future-*.whl importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl && python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl &&
sed -ri 's/--lzma/--best/' $appd/Python/Python$pyv/site-packages/pyinstaller/building/utils.py && sed -ri 's/--lzma/--best/' $appd/Python/Python$pyv/site-packages/pyinstaller/building/utils.py &&
curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py && curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&


@@ -37,6 +37,8 @@ grep -E '^from .ssl_ import' $APPDATA/python/python37/site-packages/urllib3/util
echo golfed echo golfed
} }
sed -ri 's/(add_argument."-t[de]",.*help=")[^"]+/\1not applicable; HTTPS is disabled in this exe/; s/for some reason/in this exe for safety reasons/' u2c.py
read a b _ < <(awk -F\" '/^S_VERSION =/{$0=$2;sub(/\./," ");print}' < u2c.py) read a b _ < <(awk -F\" '/^S_VERSION =/{$0=$2;sub(/\./," ");print}' < u2c.py)
sed -r 's/1,2,3,0/'$a,$b,0,0'/;s/1\.2\.3/'$a.$b.0/ <up2k.rc >up2k.rc2 sed -r 's/1,2,3,0/'$a,$b,0,0'/;s/1\.2\.3/'$a.$b.0/ <up2k.rc >up2k.rc2
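
The awk/sed pair above pulls the major and minor version out of u2c.py's S_VERSION string and stamps them into the .rc template before building. Roughly the same thing in Python, as an illustrative sketch (filenames are from the script, everything else is assumed):

import re

txt = open("u2c.py").read()
a, b = re.search(r'^S_VERSION = "(\d+)\.(\d+)', txt, re.M).groups()
rc = open("up2k.rc").read()
rc = rc.replace("1,2,3,0", "%s,%s,0,0" % (a, b)).replace("1.2.3", "%s.%s.0" % (a, b))
open("up2k.rc2", "w").write(rc)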
@@ -45,4 +47,12 @@ $APPDATA/python/python37/scripts/pyinstaller -y --clean --upx-dir=. up2k.spec
./dist/u2c.exe --version ./dist/u2c.exe --version
curl -fkT dist/u2c.exe -HPW:wark https://192.168.123.1:3923/ csum=$(sha512sum <dist/u2c.exe | cut -c-56)
curl -fkT dist/u2c.exe -HPW:wark https://192.168.123.1:3923/ >uplod.log
cat uplod.log
grep -q $csum uplod.log && echo upload OK || {
echo UPLOAD FAILED
exit 1
}


@@ -42,7 +42,7 @@ $f$s.py --version >/dev/null
min=99999999 min=99999999
for ((a=0; a<$parallel; a++)); do for ((a=0; a<$parallel; a++)); do
while [ -e .sfx-run ]; do while [ -e .sfx-run ]; do
CSN=sfx$a ./make-sfx.sh re "$@" CSN=$a ./make-sfx.sh re "$@"
sz=$(wc -c <$f$a$s.py | awk '{print$1}') sz=$(wc -c <$f$a$s.py | awk '{print$1}')
[ $sz -ge $min ] && continue [ $sz -ge $min ] && continue
mv $f$a$s.py $f$s.py.$sz mv $f$a$s.py $f$s.py.$sz


@@ -1,6 +1,21 @@
#!/bin/bash #!/bin/bash
set -ex set -ex
if uname | grep -iE '^(msys|mingw)'; then
pids=()
python -m unittest discover -s tests >/dev/null &
pids+=($!)
python scripts/test/smoketest.py &
pids+=($!)
for pid in ${pids[@]}; do
wait $pid
done
exit $?
fi
# osx support # osx support
gtar=$(command -v gtar || command -v gnutar) || true gtar=$(command -v gtar || command -v gnutar) || true
[ ! -z "$gtar" ] && command -v gfind >/dev/null && { [ ! -z "$gtar" ] && command -v gfind >/dev/null && {


@@ -81,6 +81,7 @@ copyparty/web/dd/5.png,
copyparty/web/dd/__init__.py, copyparty/web/dd/__init__.py,
copyparty/web/deps, copyparty/web/deps,
copyparty/web/deps/__init__.py, copyparty/web/deps/__init__.py,
copyparty/web/deps/busy.mp3,
copyparty/web/deps/easymde.css, copyparty/web/deps/easymde.css,
copyparty/web/deps/easymde.js, copyparty/web/deps/easymde.js,
copyparty/web/deps/marked.js, copyparty/web/deps/marked.js,


@@ -1,7 +1,7 @@
#!/usr/bin/env python3 #!/usr/bin/env python3
# coding: latin-1 # coding: latin-1
from __future__ import print_function, unicode_literals from __future__ import print_function, unicode_literals
import re, os, sys, time, shutil, signal, threading, tarfile, hashlib, platform, tempfile, traceback import re, os, sys, time, shutil, signal, tarfile, hashlib, platform, tempfile, traceback
import subprocess as sp import subprocess as sp
@@ -234,8 +234,9 @@ def u8(gen):
def yieldfile(fn): def yieldfile(fn):
with open(fn, "rb") as f: s = 64 * 1024
for block in iter(lambda: f.read(64 * 1024), b""): with open(fn, "rb", s * 4) as f:
for block in iter(lambda: f.read(s), b""):
yield block yield block
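
The revised yieldfile above names the chunk size and hands open() a 4x larger buffer; the read loop itself is the usual iter(callable, sentinel) idiom. A self-contained sketch of that idiom, here used to copy a file with constant memory (names and sizes are illustrative, not from sfx.py):

def read_chunks(path, chunk=64 * 1024):
    # yield fixed-size blocks; the larger stdio buffer reduces syscalls
    with open(path, "rb", chunk * 4) as f:
        for block in iter(lambda: f.read(chunk), b""):
            yield block

def copy_file(src, dst):
    with open(dst, "wb") as out:
        for block in read_chunks(src):
            out.write(block)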
@@ -367,17 +368,6 @@ def get_payload():
p = a p = a
def utime(top):
# avoid cleaners
files = [os.path.join(dp, p) for dp, dd, df in os.walk(top) for p in dd + df]
while True:
t = int(time.time())
for f in [top] + files:
os.utime(f, (t, t))
time.sleep(78123)
def confirm(rv): def confirm(rv):
msg() msg()
msg("retcode", rv if rv else traceback.format_exc()) msg("retcode", rv if rv else traceback.format_exc())
@@ -397,9 +387,7 @@ def run(tmp, j2, ftp):
msg("sfxdir:", tmp) msg("sfxdir:", tmp)
msg() msg()
t = threading.Thread(target=utime, args=(tmp,)) sys.argv.append("--sfx-tpoke=" + tmp)
t.daemon = True
t.start()
ld = (("", ""), (j2, "j2"), (ftp, "ftp"), (not PY2, "py2"), (PY37, "py37")) ld = (("", ""), (j2, "j2"), (ftp, "ftp"), (not PY2, "py2"), (PY37, "py37"))
ld = [os.path.join(tmp, b) for a, b in ld if not a] ld = [os.path.join(tmp, b) for a, b in ld if not a]

scripts/ziploader.py (new file, 109 lines)

@@ -0,0 +1,109 @@
#!/usr/bin/env python3
import atexit
import os
import platform
import sys
import tarfile
import tempfile
import time
import traceback
VER = None
STAMP = None
WINDOWS = sys.platform in ["win32", "msys"]
def msg(*a, **ka):
if a:
a = ["[ZIP]", a[0]] + list(a[1:])
ka["file"] = sys.stderr
print(*a, **ka)
def utime(top):
# avoid cleaners
files = [os.path.join(dp, p) for dp, dd, df in os.walk(top) for p in dd + df]
try:
while True:
t = int(time.time())
for f in [top] + files:
os.utime(f, (t, t))
time.sleep(78123)
except Exception as ex:
print("utime:", ex, f)
def confirm(rv):
msg()
msg("retcode", rv if rv else traceback.format_exc())
if WINDOWS:
msg("*** hit enter to exit ***")
try:
input()
except:
pass
sys.exit(rv or 1)
def run():
import copyparty
from copyparty.__main__ import main as cm
td = tempfile.TemporaryDirectory(prefix="")
atexit.register(td.cleanup)
rsrc = td.name
try:
from importlib.resources import files
f = files(copyparty).joinpath("z.tar").open("rb")
except:
from importlib.resources import open_binary
f = open_binary("copyparty", "z.tar")
with tarfile.open(fileobj=f) as tf:
try:
tf.extractall(rsrc, filter="tar")
except TypeError:
tf.extractall(rsrc) # nosec (archive is safe)
f.close()
f = None
msg(" rsrc dir:", rsrc)
msg()
sys.argv.append("--sfx-tpoke=" + rsrc)
cm(rsrc=rsrc)
def main():
sysver = str(sys.version).replace("\n", "\n" + " " * 18)
pktime = time.strftime("%Y-%m-%d, %H:%M:%S", time.gmtime(STAMP))
msg()
msg(" this is: copyparty", VER)
msg(" packed at:", pktime, "UTC,", STAMP)
msg("python bin:", sys.executable)
msg("python ver:", platform.python_implementation(), sysver)
try:
run()
except SystemExit as ex:
c = ex.code
if c not in [0, -15]:
confirm(ex.code)
except KeyboardInterrupt:
pass
except:
confirm(0)
if __name__ == "__main__":
main()
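
ziploader.py has to bridge two API generations: importlib.resources.files() on newer Pythons versus the older open_binary(), and tarfile's extractall(filter="tar") which older interpreters reject. A standalone sketch of the same double fallback (the package and member names mirror the loader above; the helper itself is illustrative):

import tarfile, tempfile

def unpack_blob(pkg="copyparty", name="z.tar"):
    try:
        from importlib.resources import files
        f = files(pkg).joinpath(name).open("rb")
    except ImportError:
        from importlib.resources import open_binary  # pre-3.9 fallback
        f = open_binary(pkg, name)
    dst = tempfile.mkdtemp()  # caller cleans this up
    with tarfile.open(fileobj=f) as tf:
        try:
            tf.extractall(dst, filter="tar")  # refuses unsafe members where supported
        except TypeError:
            tf.extractall(dst)  # older tarfile without the filter argument
    f.close()
    return dst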

tests/res/idp/6.conf (new file, 24 lines)

@@ -0,0 +1,24 @@
# -*- mode: yaml -*-
# vim: ft=yaml:
[global]
idp-h-usr: x-idp-user
idp-h-grp: x-idp-group
[/get/${u}]
/get/${u}
accs:
g: *
r: ${u}, @su
m: @su
[/priv/${u}]
/priv/${u}
accs:
r: ${u}, @su
m: @su
[/team/${g}/${u}]
/team/${g}/${u}
accs:
r: @${g}


@@ -3,6 +3,7 @@
from __future__ import print_function, unicode_literals from __future__ import print_function, unicode_literals
import io import io
import json
import os import os
import shutil import shutil
import tarfile import tarfile
@@ -49,11 +50,7 @@ class TestHttpCli(unittest.TestCase):
with open(filepath, "wb") as f: with open(filepath, "wb") as f:
f.write(filepath.encode("utf-8")) f.write(filepath.encode("utf-8"))
vcfg = [ vcfg = [".::r,u1:r.,u2", "a:a:r,u1:r,u2", ".b:.b:r.,u1:r,u2"]
".::r,u1:r.,u2",
"a:a:r,u1:r,u2",
".b:.b:r.,u1:r,u2"
]
self.args = Cfg(v=vcfg, a=["u1:u1", "u2:u2"], e2dsa=True) self.args = Cfg(v=vcfg, a=["u1:u1", "u2:u2"], e2dsa=True)
self.asrv = AuthSrv(self.args, self.log) self.asrv = AuthSrv(self.args, self.log)
@@ -66,7 +63,9 @@ class TestHttpCli(unittest.TestCase):
self.assertEqual(self.curl("?tar", "x")[1][:17], "\nJ2EOT") self.assertEqual(self.curl("?tar", "x")[1][:17], "\nJ2EOT")
# search ##
## search
up2k = Up2k(self) up2k = Up2k(self)
u2idx = U2idx(self) u2idx = U2idx(self)
allvols = list(self.asrv.vfs.all_vols.values()) allvols = list(self.asrv.vfs.all_vols.values())
@@ -81,7 +80,8 @@ class TestHttpCli(unittest.TestCase):
x = " ".join(sorted([x["rp"] for x in x[0]])) x = " ".join(sorted([x["rp"] for x in x[0]]))
self.assertEqual(x, ".f0 .t/.f2 .t/f2 a/da/f4 a/f3 f0 t/.f1 t/f1") self.assertEqual(x, ".f0 .t/.f2 .t/f2 a/da/f4 a/f3 f0 t/.f1 t/f1")
self.args = Cfg(v=vcfg, a=["u1:u1", "u2:u2"], dotsrch=False) u2idx.shutdown()
self.args = Cfg(v=vcfg, a=["u1:u1", "u2:u2"], dotsrch=False, e2d=True)
self.asrv = AuthSrv(self.args, self.log) self.asrv = AuthSrv(self.args, self.log)
u2idx = U2idx(self) u2idx = U2idx(self)
@@ -90,16 +90,58 @@ class TestHttpCli(unittest.TestCase):
# u1 can see dotfiles in volB so they should be included # u1 can see dotfiles in volB so they should be included
xe = "a/da/f4 a/f3 f0 t/f1" xe = "a/da/f4 a/f3 f0 t/f1"
self.assertEqual(x, xe) self.assertEqual(x, xe)
u2idx.shutdown()
up2k.shutdown()
##
## dirkeys
os.mkdir("v")
with open("v/f1.txt", "wb") as f:
f.write(b"a")
os.rename("a", "v/a")
os.rename(".b", "v/.b")
vcfg = [
".::r.,u1:g,u2:c,dk",
"v/a:v/a:r.,u1:g,u2:c,dk",
"v/.b:v/.b:r.,u1:g,u2:c,dk",
]
self.args = Cfg(v=vcfg, a=["u1:u1", "u2:u2"])
self.asrv = AuthSrv(self.args, self.log)
zj = json.loads(self.curl("?ls", "u1")[1])
url = "?k=" + zj["dk"]
# should descend into folders, but not other volumes:
self.assertEqual(self.tardir(url, "u2"), "f0 t/f1 v/f1.txt")
zj = json.loads(self.curl("v?ls", "u1")[1])
url = "v?k=" + zj["dk"]
self.assertEqual(self.tarsel(url, "u2", ["f1.txt", "a", ".b"]), "f1.txt")
def tardir(self, url, uname): def tardir(self, url, uname):
h, b = self.curl("/" + url + "?tar", uname, True) top = url.split("?")[0]
top = ("top" if not top else top.lstrip(".").split("/")[0]) + "/"
url += ("&" if "?" in url else "?") + "tar"
h, b = self.curl(url, uname, True)
tar = tarfile.open(fileobj=io.BytesIO(b), mode="r|").getnames() tar = tarfile.open(fileobj=io.BytesIO(b), mode="r|").getnames()
top = ("top" if not url else url.lstrip(".").split("/")[0]) + "/" if len(tar) != len([x for x in tar if x.startswith(top)]):
assert len(tar) == len([x for x in tar if x.startswith(top)]) raise Exception("bad-prefix:", tar)
return " ".join([x[len(top):] for x in tar]) return " ".join([x[len(top) :] for x in tar])
def curl(self, url, uname, binary=False): def tarsel(self, url, uname, sel):
conn = tu.VHttpConn(self.args, self.asrv, self.log, hdr(url, uname)) url += ("&" if "?" in url else "?") + "tar"
zs = '--XD\r\nContent-Disposition: form-data; name="act"\r\n\r\nzip\r\n--XD\r\nContent-Disposition: form-data; name="files"\r\n\r\n'
zs += "\r\n".join(sel) + "\r\n--XD--\r\n"
zb = zs.encode("utf-8")
hdr = "POST /%s HTTP/1.1\r\nPW: %s\r\nConnection: close\r\nContent-Type: multipart/form-data; boundary=XD\r\nContent-Length: %d\r\n\r\n"
req = (hdr % (url, uname, len(zb))).encode("utf-8") + zb
h, b = self.curl("/" + url, uname, True, req)
tar = tarfile.open(fileobj=io.BytesIO(b), mode="r|").getnames()
return " ".join(tar)
def curl(self, url, uname, binary=False, req=b""):
req = req or hdr(url, uname)
conn = tu.VHttpConn(self.args, self.asrv, self.log, req)
HttpCli(conn).run() HttpCli(conn).run()
if binary: if binary:
h, b = conn.s._reply.split(b"\r\n\r\n", 1) h, b = conn.s._reply.split(b"\r\n\r\n", 1)
@@ -108,4 +150,4 @@ class TestHttpCli(unittest.TestCase):
return conn.s._reply.decode("utf-8").split("\r\n\r\n", 1) return conn.s._reply.decode("utf-8").split("\r\n\r\n", 1)
def log(self, src, msg, c=0): def log(self, src, msg, c=0):
print(msg) print(msg, "\033[0m")
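
The new tarsel helper builds its multipart/form-data POST by hand: an act=zip field, a files field with one selected name per CRLF-separated line, and an XD boundary. A generic sketch of that wire format (boundary and field names follow the test; the helper is illustrative):

def multipart_body(fields, boundary="XD"):
    # fields: list of (name, value); MIME requires CRLF line endings
    parts = []
    for name, value in fields:
        parts.append('--%s\r\nContent-Disposition: form-data; name="%s"\r\n\r\n%s\r\n' % (boundary, name, value))
    parts.append("--%s--\r\n" % boundary)
    return "".join(parts).encode("utf-8")

body = multipart_body([("act", "zip"), ("files", "\r\n".join(["f1.txt", "a", ".b"]))])
ctype = "multipart/form-data; boundary=XD"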


@@ -6,6 +6,7 @@ import json
import os import os
import unittest import unittest
from copyparty.__init__ import ANYWIN
from copyparty.authsrv import AuthSrv from copyparty.authsrv import AuthSrv
from tests.util import Cfg from tests.util import Cfg
@@ -15,6 +16,16 @@ class TestVFS(unittest.TestCase):
print(json.dumps(vfs, indent=4, sort_keys=True, default=lambda o: o.__dict__)) print(json.dumps(vfs, indent=4, sort_keys=True, default=lambda o: o.__dict__))
def log(self, src, msg, c=0): def log(self, src, msg, c=0):
m = "%s" % (msg,)
if (
"warning: filesystem-path does not exist:" in m
or "you are sharing a system directory:" in m
or "reinitializing due to new user from IdP:" in m
or m.startswith("hint: argument")
or (m.startswith("loaded ") and " config files:" in m)
):
return
print(("[%s] %s" % (src, msg)).encode("ascii", "replace").decode("ascii")) print(("[%s] %s" % (src, msg)).encode("ascii", "replace").decode("ascii"))
def nav(self, au, vp): def nav(self, au, vp):
@@ -30,21 +41,29 @@ class TestVFS(unittest.TestCase):
self.assertEqual(unpacked, expected + [[]] * pad) self.assertEqual(unpacked, expected + [[]] * pad)
def assertAxsAt(self, au, vp, expected): def assertAxsAt(self, au, vp, expected):
self.assertAxs(self.nav(au, vp).axs, expected) vn = self.nav(au, vp)
self.assertAxs(vn.axs, expected)
def assertNodes(self, vfs, expected): def assertNodes(self, vfs, expected):
got = list(sorted(vfs.nodes.keys())) got = list(sorted(vfs.nodes.keys()))
self.assertEqual(got, expected) self.assertEqual(got, expected)
def assertNodesAt(self, au, vp, expected): def assertNodesAt(self, au, vp, expected):
self.assertNodes(self.nav(au, vp), expected) vn = self.nav(au, vp)
self.assertNodes(vn, expected)
def assertApEq(self, ap, rhs):
if ANYWIN and len(ap) > 2 and ap[1] == ":":
ap = ap[2:].replace("\\", "/")
return self.assertEqual(ap, rhs)
def prep(self): def prep(self):
here = os.path.abspath(os.path.dirname(__file__)) here = os.path.abspath(os.path.dirname(__file__))
cfgdir = os.path.join(here, "res", "idp") cfgdir = os.path.join(here, "res", "idp")
# globals are applied by main so need to cheat a little # globals are applied by main so need to cheat a little
xcfg = { "idp_h_usr": "x-idp-user", "idp_h_grp": "x-idp-group" } xcfg = {"idp_h_usr": "x-idp-user", "idp_h_grp": "x-idp-group"}
return here, cfgdir, xcfg return here, cfgdir, xcfg
@@ -58,7 +77,7 @@ class TestVFS(unittest.TestCase):
au = AuthSrv(Cfg(c=[cfgdir + "/1.conf"], **xcfg), self.log) au = AuthSrv(Cfg(c=[cfgdir + "/1.conf"], **xcfg), self.log)
self.assertEqual(au.vfs.vpath, "") self.assertEqual(au.vfs.vpath, "")
self.assertEqual(au.vfs.realpath, "/") self.assertApEq(au.vfs.realpath, "/")
self.assertNodes(au.vfs, ["vb"]) self.assertNodes(au.vfs, ["vb"])
self.assertNodes(au.vfs.nodes["vb"], []) self.assertNodes(au.vfs.nodes["vb"], [])
@@ -73,7 +92,7 @@ class TestVFS(unittest.TestCase):
au = AuthSrv(Cfg(c=[cfgdir + "/2.conf"], **xcfg), self.log) au = AuthSrv(Cfg(c=[cfgdir + "/2.conf"], **xcfg), self.log)
self.assertEqual(au.vfs.vpath, "") self.assertEqual(au.vfs.vpath, "")
self.assertEqual(au.vfs.realpath, "/") self.assertApEq(au.vfs.realpath, "/")
self.assertNodes(au.vfs, ["vb", "vc"]) self.assertNodes(au.vfs, ["vb", "vc"])
self.assertNodes(au.vfs.nodes["vb"], []) self.assertNodes(au.vfs.nodes["vb"], [])
self.assertNodes(au.vfs.nodes["vc"], []) self.assertNodes(au.vfs.nodes["vc"], [])
@@ -91,7 +110,7 @@ class TestVFS(unittest.TestCase):
au = AuthSrv(Cfg(c=[cfgdir + "/3.conf"], **xcfg), self.log) au = AuthSrv(Cfg(c=[cfgdir + "/3.conf"], **xcfg), self.log)
self.assertEqual(au.vfs.vpath, "") self.assertEqual(au.vfs.vpath, "")
self.assertEqual(au.vfs.realpath, "") self.assertApEq(au.vfs.realpath, "")
self.assertNodes(au.vfs, []) self.assertNodes(au.vfs, [])
self.assertAxs(au.vfs.axs, []) self.assertAxs(au.vfs.axs, [])
@@ -100,8 +119,8 @@ class TestVFS(unittest.TestCase):
self.assertNodesAt(au, "vu", ["iua"]) # same as: self.assertNodesAt(au, "vu", ["iua"]) # same as:
self.assertNodes(au.vfs.nodes["vu"], ["iua"]) self.assertNodes(au.vfs.nodes["vu"], ["iua"])
self.assertNodes(au.vfs.nodes["vg"], ["iga"]) self.assertNodes(au.vfs.nodes["vg"], ["iga"])
self.assertEqual(au.vfs.nodes["vu"].realpath, "") self.assertApEq(au.vfs.nodes["vu"].realpath, "")
self.assertEqual(au.vfs.nodes["vg"].realpath, "") self.assertApEq(au.vfs.nodes["vg"].realpath, "")
self.assertAxs(au.vfs.axs, []) self.assertAxs(au.vfs.axs, [])
self.assertAxsAt(au, "vu/iua", [["iua"]]) # same as: self.assertAxsAt(au, "vu/iua", [["iua"]]) # same as:
self.assertAxs(self.nav(au, "vu/iua").axs, [["iua"]]) self.assertAxs(self.nav(au, "vu/iua").axs, [["iua"]])
@@ -115,7 +134,7 @@ class TestVFS(unittest.TestCase):
au = AuthSrv(Cfg(c=[cfgdir + "/4.conf"], **xcfg), self.log) au = AuthSrv(Cfg(c=[cfgdir + "/4.conf"], **xcfg), self.log)
self.assertEqual(au.vfs.vpath, "") self.assertEqual(au.vfs.vpath, "")
self.assertEqual(au.vfs.realpath, "") self.assertApEq(au.vfs.realpath, "")
self.assertNodes(au.vfs, ["vu"]) self.assertNodes(au.vfs, ["vu"])
self.assertNodesAt(au, "vu", ["ua", "ub"]) self.assertNodesAt(au, "vu", ["ua", "ub"])
self.assertAxs(au.vfs.axs, []) self.assertAxs(au.vfs.axs, [])
@@ -135,10 +154,15 @@ class TestVFS(unittest.TestCase):
self.assertAxsAt(au, "vg", []) self.assertAxsAt(au, "vg", [])
self.assertAxsAt(au, "vg/iga1", [["iua"]]) self.assertAxsAt(au, "vg/iga1", [["iua"]])
self.assertAxsAt(au, "vg/iga2", [["iua", "ua"]]) self.assertAxsAt(au, "vg/iga2", [["iua", "ua"]])
self.assertEqual(self.nav(au, "vu/ua").realpath, "/u-ua") self.assertApEq(self.nav(au, "vu/ua").realpath, "/u-ua")
self.assertEqual(self.nav(au, "vu/iua").realpath, "/u-iua") self.assertApEq(self.nav(au, "vu/iua").realpath, "/u-iua")
self.assertEqual(self.nav(au, "vg/iga1").realpath, "/g1-iga") self.assertApEq(self.nav(au, "vg/iga1").realpath, "/g1-iga")
self.assertEqual(self.nav(au, "vg/iga2").realpath, "/g2-iga") self.assertApEq(self.nav(au, "vg/iga2").realpath, "/g2-iga")
au.idp_checkin(None, "iub", "iga")
self.assertAxsAt(au, "vu/iua", [["iua"]])
self.assertAxsAt(au, "vg/iga1", [["iua", "iub"]])
self.assertAxsAt(au, "vg/iga2", [["iua", "iub", "ua"]])
def test_5(self): def test_5(self):
""" """
@@ -148,7 +172,7 @@ class TestVFS(unittest.TestCase):
au = AuthSrv(Cfg(c=[cfgdir + "/5.conf"], **xcfg), self.log) au = AuthSrv(Cfg(c=[cfgdir + "/5.conf"], **xcfg), self.log)
self.assertEqual(au.vfs.vpath, "") self.assertEqual(au.vfs.vpath, "")
self.assertEqual(au.vfs.realpath, "") self.assertApEq(au.vfs.realpath, "")
self.assertNodes(au.vfs, ["g", "ga", "gb"]) self.assertNodes(au.vfs, ["g", "ga", "gb"])
self.assertAxs(au.vfs.axs, []) self.assertAxs(au.vfs.axs, [])
@@ -169,3 +193,44 @@ class TestVFS(unittest.TestCase):
self.assertAxsAt(au, "g", [["iua"]]) self.assertAxsAt(au, "g", [["iua"]])
self.assertAxsAt(au, "ga", [["iua"]]) self.assertAxsAt(au, "ga", [["iua"]])
self.assertAxsAt(au, "gb", [["iua"]]) self.assertAxsAt(au, "gb", [["iua"]])
def test_6(self):
"""
IdP volumes with anon-get and other users/groups (github#79)
"""
_, cfgdir, xcfg = self.prep()
au = AuthSrv(Cfg(c=[cfgdir + "/6.conf"], **xcfg), self.log)
self.assertAxs(au.vfs.axs, [])
self.assertEqual(au.vfs.vpath, "")
self.assertApEq(au.vfs.realpath, "")
self.assertNodes(au.vfs, [])
au.idp_checkin(None, "iua", "")
star = ["*", "iua"]
self.assertNodes(au.vfs, ["get", "priv"])
self.assertAxsAt(au, "get/iua", [["iua"], [], [], [], star])
self.assertAxsAt(au, "priv/iua", [["iua"], [], []])
au.idp_checkin(None, "iub", "")
star = ["*", "iua", "iub"]
self.assertNodes(au.vfs, ["get", "priv"])
self.assertAxsAt(au, "get/iua", [["iua"], [], [], [], star])
self.assertAxsAt(au, "get/iub", [["iub"], [], [], [], star])
self.assertAxsAt(au, "priv/iua", [["iua"], [], []])
self.assertAxsAt(au, "priv/iub", [["iub"], [], []])
au.idp_checkin(None, "iuc", "su")
star = ["*", "iua", "iub", "iuc"]
self.assertNodes(au.vfs, ["get", "priv", "team"])
self.assertAxsAt(au, "get/iua", [["iua", "iuc"], [], ["iuc"], [], star])
self.assertAxsAt(au, "get/iub", [["iub", "iuc"], [], ["iuc"], [], star])
self.assertAxsAt(au, "get/iuc", [["iuc"], [], ["iuc"], [], star])
self.assertAxsAt(au, "priv/iua", [["iua", "iuc"], [], ["iuc"]])
self.assertAxsAt(au, "priv/iub", [["iub", "iuc"], [], ["iuc"]])
self.assertAxsAt(au, "priv/iuc", [["iuc"], [], ["iuc"]])
self.assertAxsAt(au, "team/su/iuc", [["iuc"]])
au.idp_checkin(None, "iud", "su")
self.assertAxsAt(au, "team/su/iuc", [["iuc", "iud"]])
self.assertAxsAt(au, "team/su/iud", [["iuc", "iud"]])


@@ -44,7 +44,7 @@ if MACOS:
from copyparty.__init__ import E from copyparty.__init__ import E
from copyparty.__main__ import init_E from copyparty.__main__ import init_E
from copyparty.u2idx import U2idx from copyparty.u2idx import U2idx
from copyparty.util import FHC, Garda, Unrecv from copyparty.util import FHC, CachedDict, Garda, Unrecv
init_E(E) init_E(E)
@@ -110,25 +110,25 @@ class Cfg(Namespace):
def __init__(self, a=None, v=None, c=None, **ka0): def __init__(self, a=None, v=None, c=None, **ka0):
ka = {} ka = {}
ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw q rand smb srch_dbg stats vague_403 vc ver xdev xlink xvol" ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw og og_no_head og_s_title q rand smb srch_dbg stats uqe vague_403 vc ver xdev xlink xvol"
ka.update(**{k: False for k in ex.split()}) ka.update(**{k: False for k in ex.split()})
ex = "dotpart dotsrch no_dhash no_fastboot no_rescan no_sendfile no_voldump re_dhash plain_ip" ex = "dotpart dotsrch no_dhash no_fastboot no_rescan no_sendfile no_snap no_voldump re_dhash plain_ip"
ka.update(**{k: True for k in ex.split()}) ka.update(**{k: True for k in ex.split()})
ex = "ah_cli ah_gen css_browser hist js_browser no_forget no_hash no_idx nonsus_urls" ex = "ah_cli ah_gen css_browser hist js_browser mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua"
ka.update(**{k: None for k in ex.split()}) ka.update(**{k: None for k in ex.split()})
ex = "hash_mt srch_time u2abort u2j" ex = "hash_mt srch_time u2abort u2j"
ka.update(**{k: 1 for k in ex.split()}) ka.update(**{k: 1 for k in ex.split()})
ex = "reg_cap s_thead s_tbody th_convt" ex = "au_vol mtab_age reg_cap s_thead s_tbody th_convt"
ka.update(**{k: 9 for k in ex.split()}) ka.update(**{k: 9 for k in ex.split()})
ex = "db_act df k304 loris re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo" ex = "db_act k304 loris re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo"
ka.update(**{k: 0 for k in ex.split()}) ka.update(**{k: 0 for k in ex.split()})
ex = "ah_alg bname doctitle exit favico idp_h_usr html_head lg_sbf log_fk md_sbf name textfiles unlist vname R RS SR" ex = "ah_alg bname doctitle df exit favico idp_h_usr html_head lg_sbf log_fk md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i tcolor textfiles unlist vname R RS SR"
ka.update(**{k: "" for k in ex.split()}) ka.update(**{k: "" for k in ex.split()})
ex = "grp on403 on404 xad xar xau xban xbd xbr xbu xiu xm" ex = "grp on403 on404 xad xar xau xban xbd xbr xbu xiu xm"
@@ -145,16 +145,20 @@ class Cfg(Namespace):
c=c, c=c,
E=E, E=E,
dbd="wal", dbd="wal",
dk_salt="b" * 16,
fk_salt="a" * 16, fk_salt="a" * 16,
idp_gsep=re.compile("[|:;+,]"), idp_gsep=re.compile("[|:;+,]"),
iobuf=256 * 1024,
lang="eng", lang="eng",
log_badpwd=1, log_badpwd=1,
logout=573, logout=573,
mte={"a": True}, mte={"a": True},
mth={}, mth={},
mtp=[], mtp=[],
mv_retry="0/0",
rm_retry="0/0", rm_retry="0/0",
s_wr_sz=512 * 1024, s_rd_sz=256 * 1024,
s_wr_sz=256 * 1024,
sort="href", sort="href",
srch_hits=99999, srch_hits=99999,
th_crop="y", th_crop="y",
@@ -247,6 +251,7 @@ class VHttpConn(object):
self.log_func = log self.log_func = log
self.log_src = "a" self.log_src = "a"
self.mutex = threading.Lock() self.mutex = threading.Lock()
self.pipes = CachedDict(1)
self.u2mutex = threading.Lock() self.u2mutex = threading.Lock()
self.nbyte = 0 self.nbyte = 0
self.nid = None self.nid = None
@@ -255,3 +260,7 @@ class VHttpConn(object):
self.u2fh = FHC() self.u2fh = FHC()
self.get_u2idx = self.hsrv.get_u2idx self.get_u2idx = self.hsrv.get_u2idx
if WINDOWS:
os.system("rem")