Compare commits

153 Commits

Author SHA1 Message Date
ed
1e22222c60 v1.7.0 2023-04-29 21:14:38 +00:00
ed
544e0549bc make xvol and xdev apply at runtime (closes #24):
* when accessing files inside an xdev volume, verify that the file
   exists on the same device/filesystem as the volume root

* when accessing files inside an xvol volume, verify that the file
   exists within any volume where the user has read access
2023-04-29 21:10:02 +00:00
ed
83178d0836 preserve empty folders (closes #23):
* when deleting files, do not cascade upwards through empty folders
* when moving folders, also move any empty folders inside

the only remaining action which autoremoves empty folders is
files getting deleted as they expire volume lifetimes

also prevents accidentally moving parent folders into subfolders
(even though that actually worked surprisingly well)
2023-04-29 11:30:43 +00:00
ed
c44f5f5701 nit 2023-04-29 09:44:46 +00:00
ed
138f5bc989 warn about android powersave settings on music interruption + fix eq on folder change 2023-04-29 09:31:53 +00:00
ed
e4759f86ef ftpd correctness:
* winscp mkdir failed because the folder-not-found error got repeated
* rmdir fails after all files in the folder have poofed; that's OK
* add --ftp4 as a precaution
2023-04-28 20:50:45 +00:00
ed
d71416437a show file selection summary 2023-04-27 19:33:52 +00:00
ed
a84c583b2c ok that wasn't enough 2023-04-27 19:06:35 +00:00
ed
cdacdccdb8 update pkgs to 1.6.15 2023-04-27 00:36:56 +00:00
ed
d3ccd3f174 v1.6.15 2023-04-26 23:00:55 +00:00
ed
cb6de0387d a bit faster 2023-04-26 19:56:27 +00:00
ed
abff40519d eyecandy: restore playback indicator on folder hop 2023-04-26 19:09:16 +00:00
ed
55c74ad164 30% faster folder listings (wtf...) 2023-04-26 18:55:53 +00:00
ed
673b4f7e23 option to show symlink's lastmod instead of deref;
mainly motivated by u2cli's folder syncing in turbo mode
which would un-turbo on most dupes due to wrong lastmod

disabled by default for regular http listings
(to avoid confusion in most regular usecases),
enable per-request with urlparam lt

enabled by default for single-level webdav listings
(because rclone hits the same issue as u2cli),
can be disabled with arg --dav-rt or volflag davrt

impossible to enable for recursive webdav listings
2023-04-26 18:54:21 +00:00
ed
d11e02da49 u2cli: avoid dns lookups while uploading 2023-04-26 18:46:42 +00:00
ed
8790f89e08 fix installing from source tarball 2023-04-26 18:40:47 +00:00
ed
33442026b8 try to discourage android from stopping playback...
...when continuing into the next folder

accidentally introduces a neat bonus feature where the music
no longer stops while you go looking for stuff to play next
2023-04-26 18:33:30 +00:00
ed
03193de6d0 socket read/write timeout 2023-04-24 20:04:22 +00:00
ed
8675ff40f3 update pkgs to 1.6.14 2023-04-24 07:52:12 +00:00
ed
d88889d3fc v1.6.14 2023-04-24 06:09:44 +00:00
ed
6f244d4335 update pkgs to 1.6.13 2023-04-24 00:46:47 +00:00
ed
cacca663b3 v1.6.13 2023-04-23 23:05:31 +00:00
ed
d5109be559 ftp: track login state isolated from pyftpdlib;
for convenience, the password can be provided as the username
but that confuses pyftpd a little so let's do this
2023-04-23 21:06:19 +00:00
ed
d999f06bb9 volflags can be -unset 2023-04-23 21:05:29 +00:00
ed
a1a8a8c7b5 configurable tls-certificate location 2023-04-23 20:56:55 +00:00
ed
fdd6f3b4a6 tar/zip: use volume name as toplevel fallback 2023-04-23 20:55:34 +00:00
ed
f5191973df docs cleanup:
* mostly deprecate --http-only and --https-only since there is zero
   performance gain in recent python versions, however could still be
   useful for avoiding limitations in alternative python interpreters
   (and forcing http/https with mdns/ssdp/qr)

* mention antivirus being useless as usual
2023-04-23 20:25:44 +00:00
ed
ddbaebe779 update pkgs to 1.6.12 2023-04-20 22:47:37 +00:00
ed
42099baeff v1.6.12 2023-04-20 21:41:47 +00:00
ed
2459965ca8 u2cli: dont enter delete stage if something failed 2023-04-20 20:40:09 +00:00
ed
6acf436573 u2idx pool instead of per-socket;
prevents running out of FDs thanks to thousands of sqlite3 sessions
and neatly sidesteps what could possibly be a race in python's
sqlite3 bindings where it sometimes forgets to close the fd
2023-04-20 20:36:13 +00:00
ed
f217e1ce71 correctly ignore multirange requests 2023-04-20 19:14:38 +00:00
ed
418000aee3 explain tus incompatibility + update docs 2023-04-19 21:46:33 +00:00
ed
dbbba9625b nix: make deps optional + update docs 2023-04-17 13:17:53 +02:00
Chinpo Nya
397bc92fbc rewrite the nix module config with nix options 2023-04-17 00:26:57 +02:00
Chinpo Nya
6e615dcd03 fix: remove ffmpeg from python env build inputs 2023-04-17 00:26:57 +02:00
Chinpo Nya
9ac5908b33 refactor: remove unnecessary use of 'rec' 2023-04-17 00:26:57 +02:00
Chinpo Nya
50912480b9 automate nix package updates 2023-04-17 00:26:57 +02:00
Chinpo Nya
24b9b8319d nix/nixos documentation 2023-04-17 00:26:57 +02:00
Chinpo Nya
b0f4f0b653 nixos module 2023-04-17 00:26:57 +02:00
Chinpo Nya
05bbd41c4b nix package 2023-04-17 00:26:57 +02:00
ed
8f5f8a3cda expand userhomes everywhere:
* -c
* -lo
* --hist
* hist volflag
* --ssl-log
2023-04-14 18:55:19 +02:00
ed
c8938fc033 fix ipv4 location header on dualstack 2023-04-14 14:06:44 +02:00
ed
1550350e05 update docs (performance tips, windows example) 2023-04-13 21:36:55 +00:00
ed
5cc190c026 better 2023-04-12 22:09:46 +00:00
ed
d6a0a738ce add windows example + update docs + some cosmetics 2023-04-12 22:06:44 +00:00
ed
f5fe3678ee more safari-on-touchbar-macbook workarounds:
* safari invokes pause on the mediasession
   whenever any Audio loads a new src (preload)

* ...and on some(?) seeks
2023-04-07 23:04:01 +02:00
ed
f2a7925387 avoid safari bugs on touchbar macbooks:
* songs would play backwards
* playback started immediately on folder change
2023-04-07 12:38:37 +02:00
ed
fa953ced52 update archpkg to 1.6.11 2023-04-01 22:59:20 +00:00
ed
f0000d9861 v1.6.11 2023-04-01 21:12:54 +00:00
ed
4e67516719 last.fm web-scrobbler support 2023-04-01 21:02:03 +00:00
ed
29db7a6270 deps: automate prismjs build 2023-04-01 17:46:42 +00:00
ed
852499e296 dont panic in case of extension-injected css 2023-04-01 16:08:45 +00:00
ed
f1775fd51c update deps 2023-04-01 15:15:53 +00:00
ed
4bb306932a update systemd notes 2023-04-01 10:32:12 +00:00
ed
2a37e81bd8 add rclone optimization, closes #21 2023-04-01 10:21:21 +00:00
ed
6a312ca856 something dumb 2023-04-01 00:16:30 +00:00
ed
e7f3e475a2 more accurate bpm detector 2023-03-31 21:20:37 +00:00
ed
854ba0ec06 add audio filter plugin thing 2023-03-31 20:20:28 +00:00
ed
209b49d771 remind sqlite we have indexes 2023-03-30 21:45:58 +00:00
ed
949baae539 integrate markdown thumbs with image gallery 2023-03-30 21:21:21 +00:00
ed
5f4ea27586 new hook: exif stripper 2023-03-26 22:19:15 +00:00
ed
099cc97247 hooks: more correct usage examples 2023-03-26 22:18:48 +00:00
ed
592b7d6315 gdi js 2023-03-26 02:06:49 +00:00
ed
0880bf55a1 markdown thumbnails 2023-03-26 01:53:41 +00:00
ed
4cbffec0ec u2cli: show more errors + drop --ws (does nothing) 2023-03-23 23:47:41 +00:00
ed
cc355417d4 update docs 2023-03-23 23:37:45 +00:00
ed
e2bc573e61 webdav correctness:
* generally respond without body
   (rclone likes this)
* don't connection:close on most mkcol errors
2023-03-23 23:25:00 +00:00
ed
41c0376177 update archpkg to 1.6.10 2023-03-20 23:37:20 +00:00
ed
c01cad091e v1.6.10 2023-03-20 21:56:31 +00:00
ed
eb349f339c update foldersync / rclone docs 2023-03-20 21:54:08 +00:00
ed
24d8caaf3e switch rclone to owncloud mode so it sends lastmod 2023-03-20 21:45:52 +00:00
ed
5ac2c20959 basic support for rclone sync 2023-03-20 21:17:53 +00:00
ed
bb72e6bf30 support propfind of files (not just dirs) 2023-03-20 20:58:51 +00:00
ed
d8142e866a accept last-modified from owncloud webdav extension 2023-03-20 20:28:26 +00:00
ed
7b7979fd61 add sftpgo to comparison + update docs 2023-03-19 21:45:35 +00:00
ed
749616d09d help iOS understand short audio files 2023-03-19 20:03:35 +00:00
ed
5485c6d7ca prisonparty: FFmpeg runs faster with /dev/urandom 2023-03-19 18:32:35 +00:00
ed
b7aea38d77 add iOS uploader (mk.ii) 2023-03-18 18:38:37 +00:00
ed
0ecd9f99e6 update archpkg to 1.6.9 2023-03-16 22:34:09 +00:00
ed
ca04a00662 v1.6.9 2023-03-16 21:06:18 +00:00
ed
8a09601be8 url-param ?v disables index.html 2023-03-16 20:52:43 +00:00
ed
1fe0d4693e fix logues bleeding into navpane 2023-03-16 20:23:01 +00:00
ed
bba8a3c6bc fix truncated search results 2023-03-16 20:12:13 +00:00
ed
e3d7f0c7d5 add tooltip delay to android too 2023-03-16 19:48:44 +00:00
ed
be7bb71bbc add option to show index.html instead of listing 2023-03-16 19:41:33 +00:00
ed
e0c4829ec6 verify covers against db instead of fs 2023-03-15 19:48:43 +00:00
ed
5af1575329 readme: ideas welcome w 2023-03-14 22:24:43 +00:00
ed
884f966b86 update archpkg to 1.6.8 2023-03-12 18:55:02 +00:00
ed
f6c6fbc223 fix exe builder 2023-03-12 18:54:16 +00:00
ed
b0cc396bca v1.6.8 2023-03-12 16:10:07 +00:00
ed
ae463518f6 u2cli: send upload stats to server + fix py2.6 support 2023-03-11 21:39:56 +00:00
ed
2be2e9a0d8 index folder thumbs in db 2023-03-11 11:43:29 +00:00
ed
e405fddf74 specify that only up2k clients will resume uploads 2023-03-09 22:47:37 +00:00
ed
c269b0dd91 show an error (instead of crashing) if a pic is 404 2023-03-09 22:37:12 +00:00
ed
8c3211263a keep scanning folders for more music to play 2023-03-09 22:26:41 +00:00
ed
bf04e7c089 update some docs 2023-03-09 22:11:39 +00:00
ed
c7c6e48b1a didn't compress numbered logfiles 2023-03-09 21:59:59 +00:00
ed
974ca773be just to be extra sure 2023-03-09 21:49:29 +00:00
ed
9270c2df19 evict basic-browser from crawlers 2023-03-09 21:35:07 +00:00
ed
b39ff92f34 u2cli: support long paths on win7 2023-03-08 22:27:13 +00:00
ed
7454167f78 add DCO PR template 2023-03-08 08:27:17 +01:00
ed
5ceb3a962f build up2k.exe 2023-03-07 22:58:14 +00:00
ed
52bd5642da update archpkg to 1.6.7 2023-03-05 20:20:15 +00:00
ed
c39c93725f v1.6.7 2023-03-05 20:18:16 +00:00
ed
d00f0b9fa7 ftp: support filezilla mkdir 2023-03-05 20:18:02 +00:00
ed
01cfc70982 add example for webdav automount 2023-03-05 19:52:45 +00:00
ed
e6aec189bd fix flickering toast on upload finish 2023-03-05 19:49:54 +00:00
ed
c98fff1647 fix chunkpost-handshake race (affects --no-dedup only);
a handshake arriving in the middle of the final chunk could cause
dupes to become empty -- worst case leading to loss of data
2023-03-05 19:45:50 +00:00
ed
0009e31bd3 heavy webworker load can park the main thread of a
background chrome tab for 10sec; piggyback some pokes off postmessage
2023-03-02 22:35:32 +00:00
ed
db95e880b2 thats not how it works 2023-02-28 22:19:06 +00:00
ed
e69fea4a59 exe: update screenshots 2023-02-26 22:26:40 +00:00
ed
4360800a6e update archpkg to 1.6.6 2023-02-26 22:11:56 +00:00
ed
b179e2b031 prisonparty: ignore unresolvable mount paths;
allows startup even if some locations are missing,
for example if a server rebooted and some disks aren't up yet
2023-02-26 22:11:15 +00:00
ed
ecdec75b4e v1.6.6 2023-02-26 20:30:17 +00:00
ed
5cb2e33353 update readmes + fix typo 2023-02-26 19:22:54 +00:00
ed
43ff2e531a add deadline for filling data into a reserved filename 2023-02-26 19:13:35 +00:00
ed
1c2c9db8f0 retain upload time (but not ip) on file reindex 2023-02-26 19:09:24 +00:00
ed
7ea183baef let http thread handle upload verification plugins 2023-02-26 19:07:49 +00:00
ed
ab87fac6d8 db got the wrong lastmod when linking dupes 2023-02-26 18:52:04 +00:00
ed
1e3b7eee3b dont rmdir volume top on cleanup 2023-02-26 18:28:37 +00:00
ed
4de028fc3b let controlpanel rescan button override lack of e2dsa 2023-02-26 18:27:10 +00:00
ed
604e5dfaaf improve error handling / messages 2023-02-26 18:26:13 +00:00
ed
05e0c2ec9e add xiu (batching hook; runs on idle after uploads) +
bunch of tweaks/fixes for hooks
2023-02-26 18:23:32 +00:00
ed
76bd005bdc cgen fixes 2023-02-21 19:42:08 +00:00
ed
5effaed352 add reminder that SSDP launches IE by default 2023-02-21 19:38:35 +00:00
ed
cedaf4809f add exe integrity selfcheck 2023-02-21 19:18:10 +00:00
ed
6deaf5c268 add jitter simlation 2023-02-20 21:34:30 +00:00
ed
9dc6a26472 webdav.bat and readme tweaks 2023-02-20 21:00:04 +00:00
ed
14ad5916fc freebsd: fancy console listing for fetch 2023-02-19 22:14:21 +00:00
ed
1a46738649 raise edgecases (broken envs on windows) 2023-02-19 22:13:33 +00:00
ed
9e5e3b099a add optional deps to quickstart section 2023-02-19 22:13:02 +00:00
ed
292ce75cc2 return to previous url after login 2023-02-19 19:58:15 +00:00
ed
ce7df7afd4 update platform support listing 2023-02-19 15:16:50 +00:00
ed
e28e793f81 whoops 2023-02-19 15:11:04 +00:00
ed
3e561976db optimize docker build times (884 to 379 sec) 2023-02-19 14:19:35 +00:00
ed
273a4eb7d0 list supported platforms 2023-02-19 01:00:37 +00:00
ed
6175f85bb6 more docker images for arm, arm64, s390x 2023-02-19 00:50:07 +00:00
ed
a80579f63a build docker for x32 aarch64 armhf ppc64 s390x 2023-02-18 23:04:55 +00:00
ed
96d6bcf26e if non-TLS, show warning in the login form 2023-02-17 22:49:03 +00:00
ed
49e8df25ac ie11: support back button 2023-02-17 22:21:13 +00:00
ed
6a05850f21 also undupe search hits from overlapping volumes 2023-02-17 20:48:57 +00:00
ed
5e7c3defe3 update pypi description + docker links 2023-02-16 19:56:57 +00:00
ed
6c0987d4d0 mention --daw 2023-02-15 17:51:20 +00:00
ed
6eba9feffe condense uploads listing on view change 2023-02-14 21:58:15 +00:00
ed
8adfcf5950 win10-based copyparty64.exe 2023-02-14 21:50:14 +00:00
ed
36d6fa512a mention upcoming libopenmpt availability 2023-02-13 06:57:47 +00:00
ed
79b6e9b393 update archpkg to 1.6.5 2023-02-12 15:38:03 +00:00
ed
dc2e2cbd4b v1.6.5 2023-02-12 14:11:45 +00:00
ed
5c12dac30f most ffmpeg builds dont support compressed modules 2023-02-12 14:02:43 +00:00
ed
641929191e fix reading smb shares on windows 2023-02-12 13:59:34 +00:00
ed
617321631a docker: add annotations 2023-02-11 21:10:28 +00:00
ed
ddc0c899f8 update archpkg to 1.6.4 2023-02-11 21:01:45 +00:00
108 changed files with 5347 additions and 1019 deletions

.github/pull_request_template.md (vendored, new file, 2 changed lines)

@@ -0,0 +1,2 @@
Please include the following text somewhere in this PR description:
This PR complies with the DCO; https://developercertificate.org/

.gitignore (vendored, 11 changed lines)

@@ -21,6 +21,9 @@ copyparty.egg-info/
# winmerge
*.bak
# apple pls
.DS_Store
# derived
copyparty/res/COPYING.txt
copyparty/web/deps/
@@ -31,4 +34,10 @@ contrib/package/arch/src/
# state/logs
up.*.txt
.hist/
.hist/
scripts/docker/*.out
scripts/docker/*.err
/perf.*
# nix build output link
result

.vscode/launch.py (vendored, 10 changed lines)

@@ -30,9 +30,17 @@ except:
argv = [os.path.expanduser(x) if x.startswith("~") else x for x in argv]
sfx = ""
if len(sys.argv) > 1 and os.path.isfile(sys.argv[1]):
    sfx = sys.argv[1]
    sys.argv = [sys.argv[0]] + sys.argv[2:]
argv += sys.argv[1:]
if re.search(" -j ?[0-9]", " ".join(argv)):
if sfx:
    argv = [sys.executable, sfx] + argv
    sp.check_call(argv)
elif re.search(" -j ?[0-9]", " ".join(argv)):
    argv = [sys.executable, "-m", "copyparty"] + argv
    sp.check_call(argv)
else:

README.md (380 changed lines)

@@ -1,35 +1,21 @@
# 🎉 copyparty
# 💾🎉 copyparty
* portable file sharing hub (py2/py3) [(on PyPI)](https://pypi.org/project/copyparty/)
* MIT-Licensed, 2019-05-26, ed @ irc.rizon.net
turn almost any device into a file server with resumable uploads/downloads using [*any*](#browser-support) web browser
* server only needs Python (2 or 3), all dependencies optional
* 🔌 protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
* 📱 [android app](#android-app) // [iPhone shortcuts](#ios-shortcuts)
## summary
turn your phone or raspi into a portable file server with resumable uploads/downloads using *any* web browser
* server only needs Python (`2.7` or `3.3+`), all dependencies optional
* browse/upload with [IE4](#browser-support) / netscape4.0 on win3.11 (heh)
* protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
## get the app
<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
(the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet)
## readme toc
* top
* [quickstart](#quickstart) - download **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** and you're all set!
* [quickstart](#quickstart) - just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** -- that's it! 🎉
* [on servers](#on-servers) - you may also want these, especially on servers
* [on debian](#on-debian) - recommended additional steps on debian
* [features](#features)
* [testimonials](#testimonials) - small collection of user feedback
* [motivations](#motivations) - project goals / philosophy
@@ -53,6 +39,9 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [self-destruct](#self-destruct) - uploads can be given a lifetime
* [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
* [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
* [media player](#media-player) - plays almost every audio format there is
* [audio equalizer](#audio-equalizer) - bass boosted
* [fix unreliable playback on android](#fix-unreliable-playback-on-android) - due to phone / app settings
* [markdown viewer](#markdown-viewer) - and there are *two* editors
* [other tricks](#other-tricks)
* [searching](#searching) - search by size, date, path/name, mp3-tags, ...
@@ -81,24 +70,31 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [themes](#themes)
* [complete examples](#complete-examples)
* [reverse-proxy](#reverse-proxy) - running copyparty next to other websites
* [packages](#packages) - the party might be closer than you think
* [arch package](#arch-package) - now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
* [nix package](#nix-package) - `nix profile install github:9001/copyparty`
* [nixos module](#nixos-module)
* [browser support](#browser-support) - TLDR: yes
* [client examples](#client-examples) - interact with copyparty using non-browser clients
* [folder sync](#folder-sync) - sync folders to/from copyparty
* [mount as drive](#mount-as-drive) - a remote copyparty server as a local filesystem
* [android app](#android-app) - upload to copyparty with one tap
* [iOS shortcuts](#iOS-shortcuts) - there is no iPhone app, but
* [performance](#performance) - defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
* [client-side](#client-side) - when uploading files
* [security](#security) - some notes on hardening
* [gotchas](#gotchas) - behavior that might be unexpected
* [cors](#cors) - cross-site request config
* [https](#https) - both HTTP and HTTPS are accepted
* [recovering from crashes](#recovering-from-crashes)
* [client crashes](#client-crashes)
* [frefox wsod](#frefox-wsod) - firefox 87 can crash during uploads
* [HTTP API](#HTTP-API) - see [devnotes](#./docs/devnotes.md#http-api)
* [HTTP API](#HTTP-API) - see [devnotes](./docs/devnotes.md#http-api)
* [dependencies](#dependencies) - mandatory deps
* [optional dependencies](#optional-dependencies) - install these to enable bonus features
* [install recommended deps](#install-recommended-deps)
* [optional gpl stuff](#optional-gpl-stuff)
* [sfx](#sfx) - the self-contained "binary"
* [copyparty.exe](#copypartyexe) - download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) or [copyparty64.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty64.exe)
* [copyparty.exe](#copypartyexe) - download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (win8+) or [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) (win7+)
* [install on android](#install-on-android)
* [reporting bugs](#reporting-bugs) - ideas for context to include in bug reports
* [devnotes](#devnotes) - for build instructions etc, see [./docs/devnotes.md](./docs/devnotes.md)
@@ -106,29 +102,49 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
## quickstart
download **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** and you're all set!
just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** -- that's it! 🎉
* or install through pypi (python3 only): `python3 -m pip install --user -U copyparty`
* or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
* or [install through nix](#nix-package), or [on NixOS](#nixos-module), or [on arch](#arch-package)
* or if you are on android, [install copyparty in termux](#install-on-android)
* or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too
* docker has all deps built-in, so skip this step:
running the sfx without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes)
enable thumbnails (images/audio/video), media indexing, and audio transcoding by installing some recommended deps:
* **Alpine:** `apk add py3-pillow ffmpeg`
* **Debian:** `apt install python3-pil ffmpeg`
* **Fedora:** `dnf install python3-pillow ffmpeg`
* **FreeBSD:** `pkg install py39-sqlite3 py39-pillow ffmpeg`
* **MacOS:** `port install py-Pillow ffmpeg`
* **MacOS** (alternative): `brew install pillow ffmpeg`
* **Windows:** `python -m pip install --user -U Pillow`
* install python and ffmpeg manually; do not use `winget` or `Microsoft Store` (it breaks $PATH)
* copyparty.exe comes with `Pillow` and only needs `ffmpeg`
* see [optional dependencies](#optional-dependencies) to enable even more features
running copyparty without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes)
or see [complete windows example](./docs/examples/windows.md)
some recommended options:
* `-e2dsa` enables general [file indexing](#file-indexing)
* `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen), see [optional dependencies](#optional-dependencies) to enable thumbnails and more
* `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen)
* `-v /mnt/music:/music:r:rw,foo -a foo:bar` shares `/mnt/music` as `/music`, `r`eadable by anyone, and read-write for user `foo`, password `bar`
* replace `:r:rw,foo` with `:r,foo` to only make the folder readable by `foo` and nobody else
* see [accounts and volumes](#accounts-and-volumes) for the syntax and other permissions (`r`ead, `w`rite, `m`ove, `d`elete, `g`et, up`G`et)
* see [accounts and volumes](#accounts-and-volumes) (or `--help-accounts`) for the syntax and other permissions
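for example, a typical invocation combining the options above (the paths and the `foo:bar` account are just placeholders):
```
python3 copyparty-sfx.py -e2dsa -e2ts -v /mnt/music:/music:r:rw,foo -a foo:bar
```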
### on servers
you may also want these, especially on servers:
* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service
* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service (see guide inside)
* [contrib/systemd/prisonparty.service](contrib/systemd/prisonparty.service) to run it in a chroot (for extra security)
* [contrib/rc/copyparty](contrib/rc/copyparty) to run copyparty on FreeBSD
* [contrib/nginx/copyparty.conf](contrib/nginx/copyparty.conf) to [reverse-proxy](#reverse-proxy) behind nginx (for better https)
* [nixos module](#nixos-module) to run copyparty on NixOS hosts
and remember to open the ports you want; here's a complete example including every feature copyparty has to offer:
```
@@ -139,18 +155,6 @@ firewall-cmd --reload
```
(1900:ssdp, 3921:ftp, 3923:http/https, 3945:smb, 3990:ftps, 5353:mdns, 12000:passive-ftp)
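a rough firewalld sketch covering those ports (assuming ssdp and mdns are udp and the rest tcp -- trim it down to the protocols you actually enable):
```
firewall-cmd --permanent --add-port={3921,3923,3945,3990,12000}/tcp
firewall-cmd --permanent --add-port={1900,5353}/udp
firewall-cmd --reload
```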
### on debian
recommended additional steps on debian which enable audio metadata and thumbnails (from images and videos):
* as root, run the following:
`apt install python3 python3-pip python3-dev ffmpeg`
* then, as the user which will be running copyparty (so hopefully not root), run this:
`python3 -m pip install --user -U Pillow pillow-avif-plugin`
(skipped `pyheif-pillow-opener` because apparently debian is too old to build it)
## features
@@ -164,14 +168,18 @@ recommended additional steps on debian which enable audio metadata and thumbnai
* ☑ [smb/cifs server](#smb-server)
* ☑ [qr-code](#qr-code) for quick access
* ☑ [upnp / zeroconf / mdns / ssdp](#zeroconf)
* ☑ [event hooks](#event-hooks) / script runner
* ☑ [reverse-proxy support](https://github.com/9001/copyparty#reverse-proxy)
* upload
* ☑ basic: plain multipart, ie6 support
* ☑ [up2k](#uploading): js, resumable, multithreaded
* unaffected by cloudflare's max-upload-size (100 MiB)
* ☑ stash: simple PUT filedropper
* ☑ filename randomizer
* ☑ write-only folders
* ☑ [unpost](#unpost): undo/delete accidental uploads
* ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
* ☑ symlink/discard existing files (content-matching)
* ☑ symlink/discard duplicates (content-matching)
* download
* ☑ single files in browser
* ☑ [folders as zip / tar files](#zip-downloads)
@@ -192,10 +200,15 @@ recommended additional steps on debian which enable audio metadata and thumbnai
* ☑ [locate files by contents](#file-search)
* ☑ search by name/path/date/size
* ☑ [search by ID3-tags etc.](#searching)
* client support
* ☑ [folder sync](#folder-sync)
* ☑ [curl-friendly](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png)
* markdown
* ☑ [viewer](#markdown-viewer)
* ☑ editor (sure why not)
PS: something missing? post any crazy ideas you've got as a [feature request](https://github.com/9001/copyparty/issues/new?assignees=9001&labels=enhancement&template=feature_request.md) or [discussion](https://github.com/9001/copyparty/discussions/new?category=ideas) 🤙
## testimonials
@@ -239,6 +252,9 @@ browser-specific:
server-os-specific:
* RHEL8 / Rocky8: you can run copyparty using `/usr/libexec/platform-python`
server notes:
* pypy is supported but regular cpython is faster if you enable the database
# bugs
@@ -262,9 +278,11 @@ server-os-specific:
* [Firefox issue 1790500](https://bugzilla.mozilla.org/show_bug.cgi?id=1790500) -- entire browser can crash after uploading ~4000 small files
* Android: music playback randomly stops due to [battery usage settings](#fix-unreliable-playback-on-android)
* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
* *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume
* "future" because `AudioContext` is broken in the current iOS version (15.1), maybe one day...
* "future" because `AudioContext` can't maintain a stable playback speed in the current iOS version (15.7), maybe one day...
* Windows: folders cannot be accessed if the name ends with `.`
* python or windows bug
@@ -512,11 +530,14 @@ up2k has several advantages:
* much higher speeds than ftp/scp/tarpipe on some internet connections (mainly american ones) thanks to parallel connections
* the last-modified timestamp of the file is preserved
> it is perfectly safe to restart / upgrade copyparty while someone is uploading to it!
> all known up2k clients will resume just fine 💪
see [up2k](#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
![copyparty-upload-fs8](https://user-images.githubusercontent.com/241032/129635371-48fc54ca-fa91-48e3-9b1d-ba413e4b68cb.png)
**protip:** you can avoid scaring away users with [contrib/plugins/minimal-up2k.html](contrib/plugins/minimal-up2k.html) which makes it look [much simpler](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png)
**protip:** you can avoid scaring away users with [contrib/plugins/minimal-up2k.js](contrib/plugins/minimal-up2k.js) which makes it look [much simpler](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png)
**protip:** if you enable `favicon` in the `[⚙️] settings` tab (by typing something into the textbox), the icon in the browser tab will indicate upload progress -- also, the `[🔔]` and/or `[🔊]` switches enable visible and/or audible notifications on upload completion
@@ -641,12 +662,68 @@ or a mix of both:
the metadata keys you can use in the format field are the ones in the file-browser table header (whatever is collected with `-mte` and `-mtp`)
## media player
plays almost every audio format there is (if the server has FFmpeg installed for on-demand transcoding)
the following audio formats are usually always playable, even without FFmpeg: `aac|flac|m4a|mp3|ogg|opus|wav`
some hilights:
* OS integration; control playback from your phone's lockscreen ([windows](https://user-images.githubusercontent.com/241032/233213022-298a98ba-721a-4cf1-a3d4-f62634bc53d5.png) // [iOS](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) // [android](https://user-images.githubusercontent.com/241032/233212311-a7368590-08c7-4f9f-a1af-48ccf3f36fad.png))
* shows the audio waveform in the seekbar
* not perfectly gapless but can get really close (see settings + eq below); good enough to enjoy gapless albums as intended
click the `play` link next to an audio file, or copy the link target to [share it](https://a.ocv.me/pub/demo/music/Ubiktune%20-%20SOUNDSHOCK%202%20-%20FM%20FUNK%20TERRROR!!/#af-1fbfba61&t=18) (optionally with a timestamp to start playing from, like that example does)
open the `[🎺]` media-player-settings tab to configure it,
* switches:
* `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
* `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
* `[~s]` toggles the seekbar waveform display
* `[/np]` enables buttons to copy the now-playing info as an irc message
* `[os-ctl]` makes it possible to control audio playback from the lockscreen of your device (enables [mediasession](https://developer.mozilla.org/en-US/docs/Web/API/MediaSession))
* `[seek]` allows seeking with lockscreen controls (buggy on some devices)
* `[art]` shows album art on the lockscreen
* `[🎯]` keeps the playing song scrolled into view (good when using the player as a taskbar dock)
* `[⟎]` shrinks the playback controls
* playback mode:
* `[loop]` keeps looping the folder
* `[next]` plays into the next folder
* transcode:
* `[flac]` converts `flac` and `wav` files into opus
* `[aac]` converts `aac` and `m4a` files into opus
* `[oth]` converts all other known formats into opus
* `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
* "tint" reduces the contrast of the playback bar
### audio equalizer
bass boosted
can also boost the volume in general, or increase/decrease stereo width (like [crossfeed](https://www.foobar2000.org/components/view/foo_dsp_meiercf) just worse)
has the convenient side-effect of reducing the pause between songs, so gapless albums play better with the eq enabled (just make it flat)
### fix unreliable playback on android
due to phone / app settings, android phones may randomly stop playing music when the power saver kicks in, especially at the end of an album -- you can fix it by [disabling power saving](https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png) in the [app settings](https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png) of the browser you use for music streaming (preferably a dedicated one)
## markdown viewer
and there are *two* editors
![copyparty-md-read-fs8](https://user-images.githubusercontent.com/241032/115978057-66419080-a57d-11eb-8539-d2be843991aa.png)
there is a built-in extension for inline clickable thumbnails;
* enable it by adding `<!-- th -->` somewhere in the doc
* add thumbnails with `!th[l](your.jpg)` where `l` means left-align (`r` = right-align)
* a single line with `---` clears the float / inlining
* in the case of README.md being displayed below a file listing, thumbnails will open in the gallery viewer
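for example, a readme using the extension described above (the image filename is a placeholder):
```
<!-- th -->
!th[l](cover.jpg)
this text wraps around the left-aligned thumbnail
---
and this line starts below it
```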
other notes,
* the document preview has a max-width which is the same as an A4 paper when printed
@@ -751,6 +828,13 @@ an FTP server can be started using `--ftp 3921`, and/or `--ftps` for explicit T
* some older software (filezilla on debian-stable) cannot passive-mode with TLS
* login with any username + your password, or put your password in the username field
some recommended FTP / FTPS clients; `wark` = example password:
* https://winscp.net/eng/download.php
* https://filezilla-project.org/ struggles a bit with ftps in active-mode, but is fine otherwise
* https://rclone.org/ does FTPS with `tls=false explicit_tls=true`
* `lftp -u k,wark -p 3921 127.0.0.1 -e ls`
* `lftp -u k,wark -p 3990 127.0.0.1 -e 'set ssl:verify-certificate no; ls'`
## webdav server
@@ -764,6 +848,8 @@ general usage:
on macos, connect from finder:
* [Go] -> [Connect to Server...] -> http://192.168.123.1:3923/
in order to grant full write-access to webdav clients, the volflag `daw` must be set and the account must also have delete-access (otherwise the client won't be allowed to replace the contents of existing files, which is how webdav works)
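a hedged example of such a volume (paths, account name and password are placeholders; `rwd` grants read/write/delete as listed under [accounts and volumes](#accounts-and-volumes)):
```
python3 copyparty-sfx.py -a ed:wark -v srv/dav:dav:rwd,ed:c,daw
```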
### connecting to webdav from windows
@@ -872,7 +958,11 @@ avoid traversing into other filesystems using `--xdev` / volflag `:c,xdev`, ski
and/or you can `--xvol` / `:c,xvol` to ignore all symlinks leaving the volume's top directory, but still allow bind-mounts pointing elsewhere
**NB: only affects the indexer** -- users can still access anything inside a volume, unless shadowed by another volume
* symlinks are permitted with `xvol` if they point into another volume where the user has the same level of access
these options will reduce performance; unlikely worst-case estimates are 14% reduction for directory listings, 35% for download-as-tar
as of copyparty v1.7.0 these options also prevent file access at runtime -- in previous versions it was just hints for the indexer
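for example, either globally or per-volume (paths are placeholders):
```
python3 copyparty-sfx.py --xdev --xvol                # all volumes
python3 copyparty-sfx.py -v /mnt/nas:nas:r:c,xdev     # just this one volume
```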
### periodic rescan
@@ -1068,6 +1158,8 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
## complete examples
* [running on windows](./docs/examples/windows.md)
* read-only music server
`python copyparty-sfx.py -v /mnt/nas/music:/music:r -e2dsa -e2ts --no-robots --force-js --theme 2`
@@ -1083,19 +1175,135 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
## reverse-proxy
running copyparty next to other websites hosted on an existing webserver such as nginx or apache
running copyparty next to other websites hosted on an existing webserver such as nginx, caddy, or apache
you can either:
* give copyparty its own domain or subdomain (recommended)
* or do location-based proxying, using `--rp-loc=/stuff` to tell copyparty where it is mounted -- has a slight performance cost and higher chance of bugs
* if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below
some reverse proxies (such as [Caddy](https://caddyserver.com/)) can automatically obtain a valid https/tls certificate for you, and some support HTTP/2 and QUIC which could be a nice speed boost
example webserver configs:
* [nginx config](contrib/nginx/copyparty.conf) -- entire domain/subdomain
* [apache2 config](contrib/apache/copyparty.conf) -- location-based
# packages
the party might be closer than you think
## arch package
now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
## nix package
`nix profile install github:9001/copyparty`
requires a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of nix
some recommended dependencies are enabled by default; [override the package](https://github.com/9001/copyparty/blob/hovudstraum/contrib/package/nix/copyparty/default.nix#L3-L22) if you want to add/remove some features/deps
`ffmpeg-full` was chosen over `ffmpeg-headless` mainly because we need `withWebp` (and `withOpenmpt` is also nice) and being able to use a cached build felt more important than optimizing for size at the time -- PRs welcome if you disagree 👍
## nixos module
for this setup, you will need a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of NixOS.
```nix
{
  # add copyparty flake to your inputs
  inputs.copyparty.url = "github:9001/copyparty";
  # ensure that copyparty is an allowed argument to the outputs function
  outputs = { self, nixpkgs, copyparty }: {
    nixosConfigurations.yourHostName = nixpkgs.lib.nixosSystem {
      modules = [
        # load the copyparty NixOS module
        copyparty.nixosModules.default
        ({ pkgs, ... }: {
          # add the copyparty overlay to expose the package to the module
          nixpkgs.overlays = [ copyparty.overlays.default ];
          # (optional) install the package globally
          environment.systemPackages = [ pkgs.copyparty ];
          # configure the copyparty module
          services.copyparty.enable = true;
        })
      ];
    };
  };
}
```
copyparty on NixOS is configured via `services.copyparty` options, for example:
```nix
services.copyparty = {
  enable = true;
  # directly maps to values in the [global] section of the copyparty config.
  # see `copyparty --help` for available options
  settings = {
    i = "0.0.0.0";
    # use lists to set multiple values
    p = [ 3210 3211 ];
    # use booleans to set binary flags
    no-reload = true;
    # using 'false' will do nothing and omit the value when generating a config
    ignored-flag = false;
  };
  # create users
  accounts = {
    # specify the account name as the key
    ed = {
      # provide the path to a file containing the password, keeping it out of /nix/store
      # must be readable by the copyparty service user
      passwordFile = "/run/keys/copyparty/ed_password";
    };
    # or do both in one go
    k.passwordFile = "/run/keys/copyparty/k_password";
  };
  # create a volume
  volumes = {
    # create a volume at "/" (the webroot), which will
    "/" = {
      # share the contents of "/srv/copyparty"
      path = "/srv/copyparty";
      # see `copyparty --help-accounts` for available options
      access = {
        # everyone gets read-access, but
        r = "*";
        # users "ed" and "k" get read-write
        rw = [ "ed" "k" ];
      };
      # see `copyparty --help-flags` for available options
      flags = {
        # "fk" enables filekeys (necessary for upget permission) (4 chars long)
        fk = 4;
        # scan for new files every 60sec
        scan = 60;
        # volflag "e2d" enables the uploads database
        e2d = true;
        # "d2t" disables multimedia parsers (in case the uploads are malicious)
        d2t = true;
        # skips hashing file contents if path matches *.iso
        nohash = "\.iso$";
      };
    };
  };
  # you may increase the open file limit for the process
  openFilesLimit = 8192;
};
```
the passwordFile at /run/keys/copyparty/ could for example be generated by [agenix](https://github.com/ryantm/agenix), or you could just dump it in the nix store instead if that's acceptable
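a rough sketch of preparing that file by hand (the path is from the example above; adjust ownership so the copyparty service user can read it, and note that /run is usually a tmpfs):
```
install -d -m 750 /run/keys/copyparty
printf '%s' hunter2 > /run/keys/copyparty/ed_password
chmod 640 /run/keys/copyparty/ed_password
```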
# browser support
TLDR: yes
@@ -1167,7 +1375,7 @@ interact with copyparty using non-browser clients
* `(printf 'PUT / HTTP/1.1\r\n\r\n'; cat movie.mkv) >/dev/tcp/127.0.0.1/3923`
* python: [up2k.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) is a command-line up2k client [(webm)](https://ocv.me/stuff/u2cli.webm)
* file uploads, file-search, folder sync, autoresume of aborted/broken uploads
* file uploads, file-search, [folder sync](#folder-sync), autoresume of aborted/broken uploads
* can be downloaded from copyparty: controlpanel -> connect -> [up2k.py](http://127.0.0.1:3923/.cpr/a/up2k.py)
* see [./bin/README.md#up2kpy](bin/README.md#up2kpy)
@@ -1188,17 +1396,27 @@ you can provide passwords using header `PW: hunter2`, cookie `cppwd=hunter2`, ur
NOTE: curl will not send the original filename if you use `-T` combined with url-params! Also, make sure to always leave a trailing slash in URLs unless you want to override the filename
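for example, a curl upload into a write-enabled folder, keeping the original filename thanks to the trailing slash:
```
curl -H "PW: hunter2" -T song.flac http://127.0.0.1:3923/inc/
```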
## folder sync
sync folders to/from copyparty
the commandline uploader [up2k.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy) with `--dr` is the best way to sync a folder to copyparty; verifies checksums and does files in parallel, and deletes unexpected files on the server after upload has finished which makes file-renames really cheap (it'll rename serverside and skip uploading)
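a sketch of what that can look like (the argument order -- server URL first, then the local folder -- is an assumption; check `up2k.py --help`):
```
python3 up2k.py --dr http://127.0.0.1:3923/music/ ./music/
```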
alternatively there is [rclone](./docs/rclone.md) which allows for bidirectional sync and is *way* more flexible (stream files straight from sftp/s3/gcs to copyparty, ...), although there is no integrity check and it won't work with files over 100 MiB if copyparty is behind cloudflare
* starting from rclone v1.63 (currently [in beta](https://beta.rclone.org/?filter=latest)), rclone will also be faster than up2k.py
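a rough sketch of an rclone webdav setup (the remote name, account, and exact `rclone config create` syntax are assumptions; [./docs/rclone.md](./docs/rclone.md) is the maintained guide):
```
rclone config create cpp webdav url=http://127.0.0.1:3923/ vendor=owncloud user=k pass=wark --obscure
rclone sync ./music cpp:music
```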
## mount as drive
a remote copyparty server as a local filesystem; go to the control-panel and click `connect` to see a list of commands to do that
alternatively, some alternatives roughly sorted by speed (unreproducible benchmark), best first:
* [rclone-http](./docs/rclone.md) (25s), read-only
* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE ([v1.63-beta](https://beta.rclone.org/?filter=latest))
* [rclone-http](./docs/rclone.md) (26s), read-only
* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
* [rclone-ftp](./docs/rclone.md) (47s), read/WRITE
* [rclone-webdav](./docs/rclone.md) (51s), read/WRITE
* copyparty-1.5.0's webdav server is faster than rclone-1.60.0 (69s)
* [partyfuse.py](./bin/#partyfusepy) (71s), read-only
* davfs2 (103s), read/WRITE, *very fast* on small files
* [win10-webdav](#webdav-server) (138s), read/WRITE
* [win10-smb2](#smb-server) (387s), read/WRITE
@@ -1206,6 +1424,27 @@ alternatively, some alternatives roughly sorted by speed (unreproducible benchma
most clients will fail to mount the root of a copyparty server unless there is a root volume (so you get the admin-panel instead of a browser when accessing it) -- in that case, mount a specific volume instead
# android app
upload to copyparty with one tap
<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet
if you want to run the copyparty server on your android device, see [install on android](#install-on-android)
# iOS shortcuts
there is no iPhone app, but the following shortcuts are almost as good:
* [upload to copyparty](https://www.icloud.com/shortcuts/41e98dd985cb4d3bb433222bc1e9e770) ([offline](https://github.com/9001/copyparty/raw/hovudstraum/contrib/ios/upload-to-copyparty.shortcut)) ([png](https://user-images.githubusercontent.com/241032/226118053-78623554-b0ed-482e-98e4-6d57ada58ea4.png)) based on the [original](https://www.icloud.com/shortcuts/ab415d5b4de3467b9ce6f151b439a5d7) by [Daedren](https://github.com/Daedren) (thx!)
* can strip exif, upload files, pics, vids, links, clipboard
* can download links and rehost the target file on copyparty (see first comment inside the shortcut)
* pics become lowres if you share from gallery to shortcut, so better to launch the shortcut and pick stuff from there
# performance
defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
@@ -1213,15 +1452,16 @@ defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
below are some tweaks roughly ordered by usefulness:
* `-q` disables logging and can help a bunch, even when combined with `-lo` to redirect logs to file
* `--http-only` or `--https-only` (unless you want to support both protocols) will reduce the delay before a new connection is established
* `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
* `-j` enables multiprocessing (actual multithreading) and can make copyparty perform better in cpu-intensive workloads, for example:
* huge amount of short-lived connections
* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
* lots of connections (many users or heavy clients)
* simultaneous downloads and uploads saturating a 20gbps connection
...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
* using [pypy](https://www.pypy.org/) instead of [cpython](https://www.python.org/) *can* be 70% faster for some workloads, but slower for many others
* and pypy can sometimes crash on startup with `-j0` (TODO make issue)
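putting a few of those together, a hedged example for a server keeping its index on an ssd (paths are placeholders):
```
python3 copyparty-sfx.py -q -lo copyparty.log --hist /ssd/copyparty-hist -j0
```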
## client-side
@@ -1301,6 +1541,13 @@ by default, except for `GET` and `HEAD` operations, all requests must either:
cors can be configured with `--acao` and `--acam`, or the protections entirely disabled with `--allow-csrf`
## https
both HTTP and HTTPS are accepted by default, but letting a [reverse proxy](#reverse-proxy) handle the https/tls/ssl would be better (probably more secure by default)
copyparty doesn't speak HTTP/2 or QUIC, so using a reverse proxy would solve that as well
# recovering from crashes
## client crashes
@@ -1323,7 +1570,7 @@ however you can hit `F12` in the up2k tab and use the devtools to see how far yo
# HTTP API
see [devnotes](#./docs/devnotes.md#http-api)
see [devnotes](./docs/devnotes.md#http-api)
# dependencies
@@ -1357,12 +1604,6 @@ enable [smb](#smb-server) support (**not** recommended):
`pyvips` gives higher quality thumbnails than `Pillow` and is 320% faster, using 270% more ram: `sudo apt install libvips42 && python3 -m pip install --user -U pyvips`
## install recommended deps
```
python -m pip install --user -U jinja2 mutagen Pillow
```
## optional gpl stuff
some bundled tools have copyleft dependencies, see [./bin/#mtag](bin/#mtag)
@@ -1374,20 +1615,25 @@ these are standalone programs and will never be imported / evaluated by copypart
the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](#./docs/devnotes.md#sfx-repack)
you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](./docs/devnotes.md#sfx-repack)
## copyparty.exe
download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) or [copyparty64.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty64.exe)
download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (win8+) or [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) (win7+)
![copyparty-exe-fs8](https://user-images.githubusercontent.com/241032/194707422-cb7f66c9-41a2-4cb9-8dbc-2ab866cd4338.png)
![copyparty-exe-fs8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png)
can be convenient on old machines where installing python is problematic, however is **not recommended** and should be considered a last resort -- if possible, please use **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** instead
can be convenient on machines where installing python is problematic, however is **not recommended** -- if possible, please use **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** instead
* [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) is compatible with 32bit windows7, which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and will definitely become a security hazard at some point
* [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) runs on win8 or newer, was compiled on win10, does thumbnails + media tags, and is *currently* safe to use, but any future python/expat/pillow CVEs can only be remedied by downloading a newer version of the exe
* [copyparty64.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty64.exe) is identical except 64bit so it [works in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png)
* on win8 it needs [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145), on win10 it just works
* some antivirus may freak out (false-positive), possibly [Avast, AVG, and McAfee](https://www.virustotal.com/gui/file/52391a1e9842cf70ad243ef83844d46d29c0044d101ee0138fcdd3c8de2237d6/detection)
* dangerous: [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is compatible with [windows7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png), which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and should never be exposed to the internet (LAN is fine)
* dangerous and deprecated: [copyparty-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) instead relies on your system python which gives better performance and will stay safe as long as you keep your python install up-to-date

@@ -2,15 +2,25 @@ standalone programs which are executed by copyparty when an event happens (uploa
these programs either take zero arguments, or a filepath (the affected file), or a json message with filepath + additional info
run copyparty with `--help-hooks` for usage details / hook type explanations (xbu/xau/xiu/xbr/xar/xbd/xad)
> **note:** in addition to event hooks (the stuff described here), copyparty has another api to run your programs/scripts while providing way more information such as audio tags / video codecs / etc and optionally daisychaining data between scripts in a processing pipeline; if that's what you want then see [mtp plugins](../mtag/) instead
# after upload
* [notify.py](notify.py) shows a desktop notification ([example](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png))
* [notify2.py](notify2.py) uses the json API to show more context
* [image-noexif.py](image-noexif.py) removes image exif by overwriting / directly editing the uploaded file
* [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
* [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable
# upload batches
these are `--xiu` hooks; unlike `xbu` and `xau` (which get executed on every single file), `xiu` hooks are given a list of recent uploads on STDIN after the server has gone idle for N seconds, reducing server load + providing more context
* [xiu.py](xiu.py) is a "minimal" example showing a list of filenames + total filesize
* [xiu-sha.py](xiu-sha.py) produces a sha512 checksum list in the volume root
# before upload
* [reject-extension.py](reject-extension.py) rejects uploads if they match a list of file extensions

@@ -13,9 +13,15 @@ example usage as global config:
--xau f,t5,j,bin/hooks/discord-announce.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xau=f,t5,j,bin/hooks/discord-announce.py
-v srv/inc:inc:r:rw,ed:c,xau=f,t5,j,bin/hooks/discord-announce.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xau = execute after upload
f = fork; don't wait for it to finish
t5 = timeout if it's still running after 5 sec
j = provide upload information as json; not just the filename
@@ -30,6 +36,7 @@ then use this to design your message: https://discohook.org/
def main():
    WEBHOOK = "https://discord.com/api/webhooks/1234/base64"
    WEBHOOK = "https://discord.com/api/webhooks/1066830390280597718/M1TDD110hQA-meRLMRhdurych8iyG35LDoI1YhzbrjGP--BXNZodZFczNVwK4Ce7Yme5"

    # read info from copyparty
    inf = json.loads(sys.argv[1])

bin/hooks/image-noexif.py (new executable file, 72 lines)

@@ -0,0 +1,72 @@
#!/usr/bin/env python3

import os
import sys
import subprocess as sp

_ = r"""
remove exif tags from uploaded images; the eventhook edition of
https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py

dependencies:
  exiftool / perl-Image-ExifTool

being an upload hook, this will take effect after upload completion
but before copyparty has hashed/indexed the file, which means that
copyparty will never index the original file, so deduplication will
not work as expected... which is mostly OK but ehhh

note: modifies the file in-place, so don't set the `f` (fork) flag

example usages; either as global config (all volumes) or as volflag:
--xau bin/hooks/image-noexif.py
-v srv/inc:inc:r:rw,ed:c,xau=bin/hooks/image-noexif.py
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   explained:
   share fs-path srv/inc at /inc (readable by all, read-write for user ed)
   running this xau (execute-after-upload) plugin for all uploaded files
"""

# filetypes to process; ignores everything else
EXTS = ("jpg", "jpeg", "avif", "heif", "heic")

try:
    from copyparty.util import fsenc
except:

    def fsenc(p):
        return p.encode("utf-8")


def main():
    fp = sys.argv[1]
    ext = fp.lower().split(".")[-1]
    if ext not in EXTS:
        return

    cwd, fn = os.path.split(fp)
    os.chdir(cwd)
    f1 = fsenc(fn)
    cmd = [
        b"exiftool",
        b"-exif:all=",
        b"-iptc:all=",
        b"-xmp:all=",
        b"-P",
        b"-overwrite_original",
        b"--",
        f1,
    ]
    sp.check_output(cmd)
    print("image-noexif: stripped")


if __name__ == "__main__":
    try:
        main()
    except:
        pass

@@ -17,8 +17,12 @@ depdencies:
example usages; either as global config (all volumes) or as volflag:
--xau f,bin/hooks/notify.py
-v srv/inc:inc:c,xau=f,bin/hooks/notify.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^
-v srv/inc:inc:r:rw,ed:c,xau=f,bin/hooks/notify.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xau = execute after upload

bin/hooks/notify2.py (new executable file, 72 lines)

@@ -0,0 +1,72 @@
#!/usr/bin/env python3

import json
import os
import sys
import subprocess as sp
from datetime import datetime

from plyer import notification

_ = r"""
same as notify.py but with additional info (uploader, ...)
and also supports --xm (notify on 📟 message)

example usages; either as global config (all volumes) or as volflag:
--xm f,j,bin/hooks/notify2.py
--xau f,j,bin/hooks/notify2.py
-v srv/inc:inc:r:rw,ed:c,xm=f,j,bin/hooks/notify2.py
-v srv/inc:inc:r:rw,ed:c,xau=f,j,bin/hooks/notify2.py
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   (share filesystem-path srv/inc as volume /inc,
   readable by everyone, read-write for user 'ed',
   running this plugin on all uploads / msgs with the params listed below)

parameters explained,
  xau = execute after upload
  f   = fork so it doesn't block uploads
  j   = provide json instead of filepath list
"""

try:
    from copyparty.util import humansize
except:

    def humansize(n):
        return n


def main():
    inf = json.loads(sys.argv[1])
    fp = inf["ap"]
    sz = humansize(inf["sz"])
    dp, fn = os.path.split(fp)
    mt = datetime.utcfromtimestamp(inf["mt"]).strftime("%Y-%m-%d %H:%M:%S")

    msg = f"{fn} ({sz})\n📁 {dp}"
    title = "File received"
    icon = "emblem-documents-symbolic" if sys.platform == "linux" else ""

    if inf.get("txt"):
        msg = inf["txt"]
        title = "Message received"
        icon = "mail-unread-symbolic" if sys.platform == "linux" else ""

    msg += f"\n👤 {inf['user']} ({inf['ip']})\n🕒 {mt}"

    if "com.termux" in sys.executable:
        sp.run(["termux-notification", "-t", title, "-c", msg])
        return

    notification.notify(
        title=title,
        message=msg,
        app_icon=icon,
        timeout=10,
    )


if __name__ == "__main__":
    main()

View File

@@ -10,7 +10,12 @@ example usage as global config:
--xbu c,bin/hooks/reject-extension.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xbu=c,bin/hooks/reject-extension.py
-v srv/inc:inc:r:rw,ed:c,xbu=c,bin/hooks/reject-extension.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xbu = execute before upload

View File

@@ -17,7 +17,12 @@ example usage as global config:
--xau c,bin/hooks/reject-mimetype.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xau=c,bin/hooks/reject-mimetype.py
-v srv/inc:inc:r:rw,ed:c,xau=c,bin/hooks/reject-mimetype.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained,
xau = execute after upload

View File

@@ -15,9 +15,15 @@ example usage as global config:
--xm f,j,t3600,bin/hooks/wget.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xm=f,j,t3600,bin/hooks/wget.py
-v srv/inc:inc:r:rw,ed:c,xm=f,j,t3600,bin/hooks/wget.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all messages with the params listed below)
parameters explained,
xm = execute on message-to-server-log
f = fork so it doesn't block uploads
j = provide message information as json; not just the text
c3 = mute all output
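A rough sketch of how an xm hook like this can pick a URL out of the message; the "txt" field name is taken from notify2.py above, everything else is illustrative:
import json
import sys
inf = json.loads(sys.argv[1])       # j flag: message info as json in argv[1]
url = (inf.get("txt") or "").strip()
if url.startswith("http"):
    print("would fetch:", url)      # wget.py presumably downloads it at this point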

108
bin/hooks/xiu-sha.py Executable file
View File

@@ -0,0 +1,108 @@
#!/usr/bin/env python3
import hashlib
import json
import sys
from datetime import datetime
_ = r"""
this hook will produce a single sha512 file which
covers all recent uploads (plus metadata comments)
use this with --xiu, which makes copyparty buffer
uploads until server is idle, providing file infos
on stdin (filepaths or json)
example usage as global config:
--xiu i5,j,bin/hooks/xiu-sha.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:r:rw,ed:c,xiu=i5,j,bin/hooks/xiu-sha.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on batches of uploads with the params listed below)
parameters explained,
xiu = execute after uploads...
i5 = ...after volume has been idle for 5sec
j = provide json instead of filepath list
note the "f" (fork) flag is not set, so this xiu
will block other xiu hooks while it's running
"""
try:
from copyparty.util import fsenc
except:
def fsenc(p):
return p
def humantime(ts):
return datetime.utcfromtimestamp(ts).strftime("%Y-%m-%d %H:%M:%S")
def find_files_root(inf):
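# length of the longest directory prefix shared by every uploaded file,
# +1 to skip the path separator that follows it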
di = 9000
for f1, f2 in zip(inf, inf[1:]):
p1 = f1["ap"].replace("\\", "/").rsplit("/", 1)[0]
p2 = f2["ap"].replace("\\", "/").rsplit("/", 1)[0]
di = min(len(p1), len(p2), di)
di = next((i for i in range(di) if p1[i] != p2[i]), di)
return di + 1
def find_vol_root(inf):
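# prefix length of the absolute path up to the volume root (len(ap) minus len(vp))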
return len(inf[0]["ap"][: -len(inf[0]["vp"])])
def main():
zb = sys.stdin.buffer.read()
zs = zb.decode("utf-8", "replace")
inf = json.loads(zs)
# root directory (where to put the sha512 file);
# di = find_files_root(inf) # next to the file closest to volume root
di = find_vol_root(inf) # top of the entire volume
ret = []
total_sz = 0
for md in inf:
ap = md["ap"]
rp = ap[di:]
total_sz += md["sz"]
fsize = "{:,}".format(md["sz"])
mtime = humantime(md["mt"])
up_ts = humantime(md["at"])
h = hashlib.sha512()
with open(fsenc(md["ap"]), "rb", 512 * 1024) as f:
while True:
buf = f.read(512 * 1024)
if not buf:
break
h.update(buf)
cksum = h.hexdigest()
meta = " | ".join([md["wark"], up_ts, mtime, fsize, md["ip"]])
ret.append("# {}\n{} *{}".format(meta, cksum, rp))
ret.append("# {} files, {} bytes total".format(len(inf), total_sz))
ret.append("")
ftime = datetime.utcnow().strftime("%Y-%m%d-%H%M%S.%f")
fp = "{}xfer-{}.sha512".format(inf[0]["ap"][:di], ftime)
with open(fsenc(fp), "wb") as f:
f.write("\n".join(ret).encode("utf-8", "replace"))
print("wrote checksums to {}".format(fp))
if __name__ == "__main__":
main()
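The generated file uses the usual "<hexdigest> *<relative path>" layout with "# ..." metadata lines in between; a hedged sketch for verifying a single entry from the volume root (the line below is illustrative):
import hashlib
line = "deadbeef...cafe *subdir/file.bin"   # one non-comment line from xfer-*.sha512
digest, rel = line.split(" *", 1)
h = hashlib.sha512()
with open(rel, "rb") as f:
    for buf in iter(lambda: f.read(512 * 1024), b""):
        h.update(buf)
print("OK" if h.hexdigest() == digest else "MISMATCH", rel)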

50
bin/hooks/xiu.py Executable file
View File

@@ -0,0 +1,50 @@
#!/usr/bin/env python3
import json
import sys
_ = r"""
this hook prints absolute filepaths + total size
use this with --xiu, which makes copyparty buffer
uploads until server is idle, providing file infos
on stdin (filepaths or json)
example usage as global config:
--xiu i1,j,bin/hooks/xiu.py
example usage as a volflag (per-volume config):
-v srv/inc:inc:r:rw,ed:c,xiu=i1,j,bin/hooks/xiu.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on batches of uploads with the params listed below)
parameters explained,
xiu = execute after uploads...
i1 = ...after volume has been idle for 1sec
j = provide json instead of filepath list
note the "f" (fork) flag is not set, so this xiu
will block other xiu hooks while it's running
"""
def main():
zb = sys.stdin.buffer.read()
zs = zb.decode("utf-8", "replace")
inf = json.loads(zs)
total_sz = 0
for upload in inf:
sz = upload["sz"]
total_sz += sz
print("{:9} {}".format(sz, upload["ap"]))
print("{} files, {} bytes total".format(len(inf), total_sz))
if __name__ == "__main__":
main()

View File

@@ -31,7 +31,7 @@ run [`install-deps.sh`](install-deps.sh) to build/install most dependencies requ
*alternatively* (or preferably) use packages from your distro instead, then you'll need at least these:
* from distro: `numpy vamp-plugin-sdk beatroot-vamp mixxx-keyfinder ffmpeg`
* from pypy: `keyfinder vamp`
* from pip: `keyfinder vamp`
# usage from copyparty

View File

@@ -16,6 +16,10 @@ dep: ffmpeg
"""
# save beat timestamps to ".beats/filename.txt"
SAVE = False
def det(tf):
# fmt: off
sp.check_call([
@@ -23,12 +27,11 @@ def det(tf):
b"-nostdin",
b"-hide_banner",
b"-v", b"fatal",
b"-ss", b"13",
b"-y", b"-i", fsenc(sys.argv[1]),
b"-map", b"0:a:0",
b"-ac", b"1",
b"-ar", b"22050",
b"-t", b"300",
b"-t", b"360",
b"-f", b"f32le",
fsenc(tf)
])
@@ -47,10 +50,29 @@ def det(tf):
print(c["list"][0]["label"].split(" ")[0])
return
# throws if detection failed:
bpm = float(cl[-1]["timestamp"] - cl[1]["timestamp"])
bpm = round(60 * ((len(cl) - 1) / bpm), 2)
print(f"{bpm:.2f}")
# throws if detection failed:
beats = [float(x["timestamp"]) for x in cl]
bds = [b - a for a, b in zip(beats, beats[1:])]
bds.sort()
n0 = int(len(bds) * 0.2)
n1 = int(len(bds) * 0.75) + 1
bds = bds[n0:n1]
bpm = sum(bds)
bpm = round(60 * (len(bds) / bpm), 2)
print(f"{bpm:.2f}")
if SAVE:
fdir, fname = os.path.split(sys.argv[1])
bdir = os.path.join(fdir, ".beats")
try:
os.mkdir(fsenc(bdir))
except:
pass
fp = os.path.join(bdir, fname) + ".txt"
with open(fsenc(fp), "wb") as f:
txt = "\n".join([f"{x:.2f}" for x in beats])
f.write(txt.encode("utf-8"))
def main():

View File

@@ -225,7 +225,7 @@ install_vamp() {
$pybin -m pip install --user vamp
cd "$td"
echo '#include <vamp-sdk/Plugin.h>' | gcc -x c -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
echo '#include <vamp-sdk/Plugin.h>' | g++ -x c++ -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
printf '\033[33mcould not find the vamp-sdk, building from source\033[0m\n'
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2588/vamp-plugin-sdk-2.9.0.tar.gz)
sha512sum -c <(

View File

@@ -4,8 +4,9 @@ set -e
# runs copyparty (or any other program really) in a chroot
#
# assumption: these directories, and everything within, are owned by root
sysdirs=( /bin /lib /lib32 /lib64 /sbin /usr /etc/alternatives )
sysdirs=(); for v in /bin /lib /lib32 /lib64 /sbin /usr /etc/alternatives ; do
[ -e $v ] && sysdirs+=($v)
done
# error-handler
help() { cat <<'EOF'
@@ -38,7 +39,7 @@ while true; do
v="$1"; shift
[ "$v" = -- ] && break # end of volumes
[ "$#" -eq 0 ] && break # invalid usage
vols+=( "$(realpath "$v")" )
vols+=( "$(realpath "$v" || echo "$v")" )
done
pybin="$1"; shift
pybin="$(command -v "$pybin")"
@@ -82,7 +83,7 @@ jail="${jail%/}"
printf '%s\n' "${sysdirs[@]}" "${vols[@]}" | sed -r 's`/$``' | LC_ALL=C sort | uniq |
while IFS= read -r v; do
[ -e "$v" ] || {
# printf '\033[1;31mfolder does not exist:\033[0m %s\n' "/$v"
printf '\033[1;31mfolder does not exist:\033[0m %s\n' "$v"
continue
}
i1=$(stat -c%D.%i "$v" 2>/dev/null || echo a)
@@ -117,6 +118,15 @@ mkdir -p "$jail/tmp"
chmod 777 "$jail/tmp"
# create a dev
(cd $jail; mkdir -p dev; cd dev
[ -e null ] || mknod -m 666 null c 1 3
[ -e zero ] || mknod -m 666 zero c 1 5
[ -e random ] || mknod -m 444 random c 1 8
[ -e urandom ] || mknod -m 444 urandom c 1 9
)
# run copyparty
export HOME=$(getent passwd $uid | cut -d: -f6)
export USER=$(getent passwd $uid | cut -d: -f1)

View File

@@ -1,9 +1,12 @@
#!/usr/bin/env python3
from __future__ import print_function, unicode_literals
S_VERSION = "1.8"
S_BUILD_DT = "2023-04-27"
"""
up2k.py: upload to copyparty
2023-01-13, v1.2, ed <irc.rizon.net>, MIT-Licensed
2021, ed <irc.rizon.net>, MIT-Licensed
https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py
- dependencies: requests
@@ -18,12 +21,15 @@ import math
import time
import atexit
import signal
import socket
import base64
import hashlib
import platform
import threading
import datetime
EXE = sys.executable.endswith("exe")
try:
import argparse
except:
@@ -34,7 +40,9 @@ except:
try:
import requests
except ImportError:
if sys.version_info > (2, 7):
if EXE:
raise
elif sys.version_info > (2, 7):
m = "\nERROR: need 'requests'; please run this command:\n {0} -m pip install --user requests\n"
else:
m = "requests/2.18.4 urllib3/1.23 chardet/3.0.4 certifi/2020.4.5.1 idna/2.7"
@@ -51,6 +59,7 @@ PY2 = sys.version_info < (3,)
if PY2:
from Queue import Queue
from urllib import quote, unquote
from urlparse import urlsplit, urlunsplit
sys.dont_write_bytecode = True
bytes = str
@@ -58,6 +67,7 @@ else:
from queue import Queue
from urllib.parse import unquote_to_bytes as unquote
from urllib.parse import quote_from_bytes as quote
from urllib.parse import urlsplit, urlunsplit
unicode = str
@@ -245,7 +255,13 @@ def eprint(*a, **ka):
def flushing_print(*a, **ka):
_print(*a, **ka)
try:
_print(*a, **ka)
except:
v = " ".join(str(x) for x in a)
v = v.encode("ascii", "replace").decode("ascii")
_print(v, **ka)
if "flush" not in ka:
sys.stdout.flush()
@@ -324,6 +340,32 @@ class CTermsize(object):
ss = CTermsize()
def undns(url):
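# resolve the hostname once and pin the resulting IP into the URL (used by --rh)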
usp = urlsplit(url)
hn = usp.hostname
gai = None
eprint("resolving host [{0}] ...".format(hn), end="")
try:
gai = socket.getaddrinfo(hn, None)
hn = gai[0][4][0]
except KeyboardInterrupt:
raise
except:
t = "\n\033[31mfailed to resolve upload destination host;\033[0m\ngai={0}\n"
eprint(t.format(repr(gai)))
raise
if usp.port:
hn = "{0}:{1}".format(hn, usp.port)
if usp.username or usp.password:
hn = "{0}:{1}@{2}".format(usp.username, usp.password, hn)
usp = usp._replace(netloc=hn)
url = urlunsplit(usp)
eprint(" {0}".format(url))
return url
def _scd(err, top):
"""non-recursive listing of directory contents, along with stat() info"""
with os.scandir(top) as dh:
@@ -372,6 +414,23 @@ def walkdir(err, top, seen):
def walkdirs(err, tops):
"""recursive statdir for a list of tops, yields [top, relpath, stat]"""
sep = "{0}".format(os.sep).encode("ascii")
if not VT100:
za = []
for td in tops:
try:
ap = os.path.abspath(os.path.realpath(td))
if td[-1:] in (b"\\", b"/"):
ap += sep
except:
# maybe cpython #88013 (ok)
ap = td
za.append(ap)
za = [x if x.startswith(b"\\\\") else b"\\\\?\\" + x for x in za]
za = [x.replace(b"/", b"\\") for x in za]
tops = za
for top in tops:
isdir = os.path.isdir(top)
if top[-1:] == sep:
@@ -520,7 +579,11 @@ def handshake(ar, file, search):
except Exception as ex:
em = str(ex).split("SSLError(")[-1].split("\nURL: ")[0].strip()
if sc == 422 or "<pre>partial upload exists at a different" in txt:
if (
sc == 422
or "<pre>partial upload exists at a different" in txt
or "<pre>source file busy; please try again" in txt
):
file.recheck = True
return [], False
elif sc == 409 or "<pre>upload rejected, file already exists" in txt:
@@ -552,8 +615,8 @@ def handshake(ar, file, search):
return r["hash"], r["sprs"]
def upload(file, cid, pw):
# type: (File, str, str) -> None
def upload(file, cid, pw, stats):
# type: (File, str, str, str) -> None
"""upload one specific chunk, `cid` (a chunk-hash)"""
headers = {
@@ -561,6 +624,10 @@ def upload(file, cid, pw):
"X-Up2k-Wark": file.wark,
"Content-Type": "application/octet-stream",
}
if stats:
headers["X-Up2k-Stat"] = stats
if pw:
headers["Cookie"] = "=".join(["cppwd", pw])
@@ -615,6 +682,7 @@ class Ctl(object):
return nfiles, nbytes
def __init__(self, ar, stats=None):
self.ok = False
self.ar = ar
self.stats = stats or self._scan()
if not self.stats:
@@ -629,6 +697,8 @@ class Ctl(object):
req_ses.verify = ar.te
self.filegen = walkdirs([], ar.files)
self.recheck = [] # type: list[File]
if ar.safe:
self._safe()
else:
@@ -647,11 +717,11 @@ class Ctl(object):
self.t0 = time.time()
self.t0_up = None
self.spd = None
self.eta = "99:99:99"
self.mutex = threading.Lock()
self.q_handshake = Queue() # type: Queue[File]
self.q_upload = Queue() # type: Queue[tuple[File, str]]
self.recheck = [] # type: list[File]
self.st_hash = [None, "(idle, starting...)"] # type: tuple[File, int]
self.st_up = [None, "(idle, starting...)"] # type: tuple[File, int]
@@ -660,6 +730,8 @@ class Ctl(object):
self._fancy()
self.ok = True
def _safe(self):
"""minimal basic slow boring fallback codepath"""
search = self.ar.s
@@ -693,7 +765,8 @@ class Ctl(object):
ncs = len(hs)
for nc, cid in enumerate(hs):
print(" {0} up {1}".format(ncs - nc, cid))
upload(file, cid, self.ar.a)
stats = "{0}/0/0/{1}".format(nf, self.nfiles - nf)
upload(file, cid, self.ar.a, stats)
print(" ok!")
if file.recheck:
@@ -768,12 +841,12 @@ class Ctl(object):
eta = (self.nbytes - self.up_b) / (spd + 1)
spd = humansize(spd)
eta = str(datetime.timedelta(seconds=int(eta)))
self.eta = str(datetime.timedelta(seconds=int(eta)))
sleft = humansize(self.nbytes - self.up_b)
nleft = self.nfiles - self.up_f
tail = "\033[K\033[u" if VT100 and not self.ar.ns else "\r"
t = "{0} eta @ {1}/s, {2}, {3}# left".format(eta, spd, sleft, nleft)
t = "{0} eta @ {1}/s, {2}, {3}# left".format(self.eta, spd, sleft, nleft)
eprint(txt + "\033]0;{0}\033\\\r{0}{1}".format(t, tail))
if not self.recheck:
@@ -809,9 +882,9 @@ class Ctl(object):
print(" ls ~{0}".format(srd))
zb = self.ar.url.encode("utf-8")
zb += quotep(rd.replace(b"\\", b"/"))
r = req_ses.get(zb + b"?ls&dots", headers=headers)
r = req_ses.get(zb + b"?ls&lt&dots", headers=headers)
if not r:
raise Exception("HTTP {}".format(r.status_code))
raise Exception("HTTP {0}".format(r.status_code))
j = r.json()
for f in j["dirs"] + j["files"]:
@@ -886,6 +959,9 @@ class Ctl(object):
self.handshaker_busy += 1
upath = file.abs.decode("utf-8", "replace")
if not VT100:
upath = upath.lstrip("\\?")
hs, sprs = handshake(self.ar, file, search)
if search:
if hs:
@@ -951,11 +1027,23 @@ class Ctl(object):
self.uploader_busy += 1
self.t0_up = self.t0_up or time.time()
zs = "{0}/{1}/{2}/{3} {4}/{5} {6}"
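# sent to the server as the X-Up2k-Stat header (see upload() above):
# up_f / rechecks / busy uploaders / files left, then total MiB / MiB left, then eta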
stats = zs.format(
self.up_f,
len(self.recheck),
self.uploader_busy,
self.nfiles - self.up_f,
int(self.nbytes / (1024 * 1024)),
int((self.nbytes - self.up_b) / (1024 * 1024)),
self.eta,
)
file, cid = task
try:
upload(file, cid, self.ar.a)
except:
eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
upload(file, cid, self.ar.a, stats)
except Exception as ex:
t = "upload failed, retrying: {0} #{1} ({2})\n"
eprint(t.format(file.name, cid[:8], ex))
# handshake will fix it
with self.mutex:
@@ -989,8 +1077,15 @@ def main():
cores = (os.cpu_count() if hasattr(os, "cpu_count") else 0) or 2
hcores = min(cores, 3) # 4% faster than 4+ on py3.9 @ r5-4500U
ver = "{0}, v{1}".format(S_BUILD_DT, S_VERSION)
if "--version" in sys.argv:
print(ver)
return
sys.argv = [x for x in sys.argv if x != "--ws"]
# fmt: off
ap = app = argparse.ArgumentParser(formatter_class=APF, epilog="""
ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog="""
NOTE:
source file/folder selection uses rsync syntax, meaning that:
"foo" uploads the entire folder to URL/foo/
@@ -1003,10 +1098,11 @@ source file/folder selection uses rsync syntax, meaning that:
ap.add_argument("-a", metavar="PASSWORD", help="password or $filepath")
ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
ap.add_argument("--version", action="store_true", help="show version and exit")
ap = app.add_argument_group("compatibility")
ap.add_argument("--cls", action="store_true", help="clear screen before start")
ap.add_argument("--ws", action="store_true", help="copyparty is running on windows; wait before deleting files after uploading")
ap.add_argument("--rh", type=int, metavar="TRIES", default=0, help="resolve server hostname before upload (good for buggy networks, but TLS certs will break)")
ap = app.add_argument_group("folder sync")
ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
@@ -1026,7 +1122,16 @@ source file/folder selection uses rsync syntax, meaning that:
ap.add_argument("-td", action="store_true", help="disable certificate check")
# fmt: on
ar = app.parse_args()
try:
ar = app.parse_args()
finally:
if EXE and not sys.argv[1:]:
eprint("*** hit enter to exit ***")
try:
input()
except:
pass
if ar.drd:
ar.dr = True
@@ -1040,7 +1145,7 @@ source file/folder selection uses rsync syntax, meaning that:
ar.files = [
os.path.abspath(os.path.realpath(x.encode("utf-8")))
+ (x[-1:] if x[-1:] == os.sep else "").encode("utf-8")
+ (x[-1:] if x[-1:] in ("\\", "/") else "").encode("utf-8")
for x in ar.files
]
@@ -1050,24 +1155,32 @@ source file/folder selection uses rsync syntax, meaning that:
if ar.a and ar.a.startswith("$"):
fn = ar.a[1:]
print("reading password from file [{}]".format(fn))
print("reading password from file [{0}]".format(fn))
with open(fn, "rb") as f:
ar.a = f.read().decode("utf-8").strip()
for n in range(ar.rh):
try:
ar.url = undns(ar.url)
break
except KeyboardInterrupt:
raise
except:
if n > ar.rh - 2:
raise
if ar.cls:
print("\x1b\x5b\x48\x1b\x5b\x32\x4a\x1b\x5b\x33\x4a", end="")
eprint("\x1b\x5b\x48\x1b\x5b\x32\x4a\x1b\x5b\x33\x4a", end="")
ctl = Ctl(ar)
if ar.dr and not ar.drd:
if ar.dr and not ar.drd and ctl.ok:
print("\npass 2/2: delete")
if getattr(ctl, "up_br") and ar.ws:
# wait for up2k to mtime if there was uploads
time.sleep(4)
ar.drd = True
ar.z = True
Ctl(ar, ctl.stats)
ctl = Ctl(ar, ctl.stats)
sys.exit(0 if ctl.ok else 1)
if __name__ == "__main__":

View File

@@ -1,7 +1,6 @@
# when running copyparty behind a reverse proxy,
# the following arguments are recommended:
#
# --http-only lower latency on initial connection
# -i 127.0.0.1 only accept connections from nginx
#
# if you are doing location-based proxying (such as `/stuff` below)

View File

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<title>🎉 redirect</title>
<title>💾🎉 redirect</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<style>

Binary file not shown.

View File

@@ -1,7 +1,6 @@
# when running copyparty behind a reverse proxy,
# the following arguments are recommended:
#
# --http-only lower latency on initial connection
# -i 127.0.0.1 only accept connections from nginx
#
# -nc must match or exceed the webserver's max number of concurrent clients;
@@ -9,7 +8,7 @@
# nginx default is 512 (worker_processes 1, worker_connections 512)
#
# you may also consider adding -j0 for CPU-intensive configurations
# (not that i can really think of any good examples)
# (5'000 requests per second, or 20gbps upload/download in parallel)
#
# on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
@@ -39,3 +38,9 @@ server {
proxy_set_header Connection "Keep-Alive";
}
}
# default client_max_body_size (1M) blocks uploads larger than 256 MiB
client_max_body_size 1024M;
client_header_timeout 610m;
client_body_timeout 610m;
send_timeout 610m;

View File

@@ -0,0 +1,281 @@
{ config, pkgs, lib, ... }:
with lib;
let
mkKeyValue = key: value:
if value == true then
# sets with a true boolean value are coerced to just the key name
key
else if value == false then
# or omitted completely when false
""
else
(generators.mkKeyValueDefault { inherit mkValueString; } ": " key value);
mkAttrsString = value: (generators.toKeyValue { inherit mkKeyValue; } value);
mkValueString = value:
if isList value then
(concatStringsSep ", " (map mkValueString value))
else if isAttrs value then
"\n" + (mkAttrsString value)
else
(generators.mkValueStringDefault { } value);
mkSectionName = value: "[" + (escape [ "[" "]" ] value) + "]";
mkSection = name: attrs: ''
${mkSectionName name}
${mkAttrsString attrs}
'';
mkVolume = name: attrs: ''
${mkSectionName name}
${attrs.path}
${mkAttrsString {
accs = attrs.access;
flags = attrs.flags;
}}
'';
passwordPlaceholder = name: "{{password-${name}}}";
accountsWithPlaceholders = mapAttrs (name: attrs: passwordPlaceholder name);
configStr = ''
${mkSection "global" cfg.settings}
${mkSection "accounts" (accountsWithPlaceholders cfg.accounts)}
${concatStringsSep "\n" (mapAttrsToList mkVolume cfg.volumes)}
'';
name = "copyparty";
cfg = config.services.copyparty;
configFile = pkgs.writeText "${name}.conf" configStr;
runtimeConfigPath = "/run/${name}/${name}.conf";
home = "/var/lib/${name}";
defaultShareDir = "${home}/data";
in {
options.services.copyparty = {
enable = mkEnableOption "web-based file manager";
package = mkOption {
type = types.package;
default = pkgs.copyparty;
defaultText = "pkgs.copyparty";
description = ''
Package of the application to run, exposed for overriding purposes.
'';
};
openFilesLimit = mkOption {
default = 4096;
type = types.either types.int types.str;
description = "Number of files to allow copyparty to open.";
};
settings = mkOption {
type = types.attrs;
description = ''
Global settings to apply.
Directly maps to values in the [global] section of the copyparty config.
See `${getExe cfg.package} --help` for more details.
'';
default = {
i = "127.0.0.1";
no-reload = true;
};
example = literalExpression ''
{
i = "0.0.0.0";
no-reload = true;
}
'';
};
accounts = mkOption {
type = types.attrsOf (types.submodule ({ ... }: {
options = {
passwordFile = mkOption {
type = types.str;
description = ''
Runtime file path to a file containing the user password.
Must be readable by the copyparty user.
'';
example = "/run/keys/copyparty/ed";
};
};
}));
description = ''
A set of copyparty accounts to create.
'';
default = { };
example = literalExpression ''
{
ed.passwordFile = "/run/keys/copyparty/ed";
};
'';
};
volumes = mkOption {
type = types.attrsOf (types.submodule ({ ... }: {
options = {
path = mkOption {
type = types.str;
description = ''
Path of a directory to share.
'';
};
access = mkOption {
type = types.attrs;
description = ''
Attribute list of permissions and the users to apply them to.
The key must be a string containing any combination of the allowed permissions:
"r" (read): list folder contents, download files
"w" (write): upload files; need "r" to see the uploads
"m" (move): move files and folders; need "w" at destination
"d" (delete): permanently delete files and folders
"g" (get): download files, but cannot see folder contents
"G" (upget): "get", but can see filekeys of their own uploads
For example: "rwmd"
The value must be one of:
an account name, defined in `accounts`
a list of account names
"*", which means "any account"
'';
example = literalExpression ''
{
# wG = write-upget = see your own uploads only
wG = "*";
# read-write-modify-delete for users "ed" and "k"
rwmd = ["ed" "k"];
};
'';
};
flags = mkOption {
type = types.attrs;
description = ''
Attribute list of volume flags to apply.
See `${getExe cfg.package} --help-flags` for more details.
'';
example = literalExpression ''
{
# "fk" enables filekeys (necessary for upget permission) (4 chars long)
fk = 4;
# scan for new files every 60sec
scan = 60;
# volflag "e2d" enables the uploads database
e2d = true;
# "d2t" disables multimedia parsers (in case the uploads are malicious)
d2t = true;
# skips hashing file contents if path matches *.iso
nohash = "\.iso$";
};
'';
default = { };
};
};
}));
description = "A set of copyparty volumes to create";
default = {
"/" = {
path = defaultShareDir;
access = { r = "*"; };
};
};
example = literalExpression ''
{
"/" = {
path = ${defaultShareDir};
access = {
# wG = write-upget = see your own uploads only
wG = "*";
# read-write-modify-delete for users "ed" and "k"
rwmd = ["ed" "k"];
};
};
};
'';
};
};
config = mkIf cfg.enable {
systemd.services.copyparty = {
description = "http file sharing hub";
wantedBy = [ "multi-user.target" ];
environment = {
PYTHONUNBUFFERED = "true";
XDG_CONFIG_HOME = "${home}/.config";
};
preStart = let
replaceSecretCommand = name: attrs:
"${getExe pkgs.replace-secret} '${
passwordPlaceholder name
}' '${attrs.passwordFile}' ${runtimeConfigPath}";
in ''
set -euo pipefail
install -m 600 ${configFile} ${runtimeConfigPath}
${concatStringsSep "\n"
(mapAttrsToList replaceSecretCommand cfg.accounts)}
'';
serviceConfig = {
Type = "simple";
ExecStart = "${getExe cfg.package} -c ${runtimeConfigPath}";
# Hardening options
User = "copyparty";
Group = "copyparty";
RuntimeDirectory = name;
RuntimeDirectoryMode = "0700";
StateDirectory = [ name "${name}/data" "${name}/.config" ];
StateDirectoryMode = "0700";
WorkingDirectory = home;
TemporaryFileSystem = "/:ro";
BindReadOnlyPaths = [
"/nix/store"
"-/etc/resolv.conf"
"-/etc/nsswitch.conf"
"-/etc/hosts"
"-/etc/localtime"
] ++ (mapAttrsToList (k: v: "-${v.passwordFile}") cfg.accounts);
BindPaths = [ home ] ++ (mapAttrsToList (k: v: v.path) cfg.volumes);
# Would re-mount paths ignored by temporary root
#ProtectSystem = "strict";
ProtectHome = true;
PrivateTmp = true;
PrivateDevices = true;
ProtectKernelTunables = true;
ProtectControlGroups = true;
RestrictSUIDSGID = true;
PrivateMounts = true;
ProtectKernelModules = true;
ProtectKernelLogs = true;
ProtectHostname = true;
ProtectClock = true;
ProtectProc = "invisible";
ProcSubset = "pid";
RestrictNamespaces = true;
RemoveIPC = true;
UMask = "0077";
LimitNOFILE = cfg.openFilesLimit;
NoNewPrivileges = true;
LockPersonality = true;
RestrictRealtime = true;
};
};
users.groups.copyparty = { };
users.users.copyparty = {
description = "Service user for copyparty";
group = "copyparty";
home = home;
isSystemUser = true;
};
};
}

View File

@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.6.3"
pkgver="1.6.15"
pkgrel=1
pkgdesc="Portable file sharing hub"
arch=("any")
@@ -26,12 +26,12 @@ source=("${url}/releases/download/v${pkgver}/${pkgname}-sfx.py"
"https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/LICENSE"
)
backup=("etc/${pkgname}.d/init" )
sha256sums=("56c02d43a0e6c18d71295268674454b4c6f5ff2ccef30fb95f81d58d2d1e260d"
sha256sums=("efe3fe8710ef8f3b50b1eb0f935f8620e378bdee48f592d77a1c5cf3ed7ce244"
"b8565eba5e64dedba1cf6c7aac7e31c5a731ed7153d6810288a28f00a36c28b2"
"f65c207e0670f9d78ad2e399bda18d5502ff30d2ac79e0e7fc48e7fbdc39afdc"
"c4f396b083c9ec02ad50b52412c84d2a82be7f079b2d016e1c9fad22d68285ff"
"dba701de9fd584405917e923ea1e59dbb249b96ef23bad479cf4e42740b774c8"
"0530459e6fbd57f770c374e960d2eb07a4e8c082c0007fb754454e45c0af57c6"
"8e89d281483e22d11d111bed540652af35b66af6f14f49faae7b959f6cdc6475"
"cb2ce3d6277bf2f5a82ecf336cc44963bc6490bcf496ffbd75fc9e21abaa75f3"
)

View File

@@ -0,0 +1,55 @@
{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, pillow, pyvips, ffmpeg, mutagen,
# create thumbnails with Pillow; faster than FFmpeg / MediaProcessing
withThumbnails ? true,
# create thumbnails with PyVIPS; even faster, uses more memory
# -- can be combined with Pillow to support more filetypes
withFastThumbnails ? false,
# enable FFmpeg; thumbnails for most filetypes (also video and audio), extract audio metadata, transcode audio to opus
# -- possibly dangerous if you allow anonymous uploads, since FFmpeg has a huge attack surface
# -- can be combined with Thumbnails and/or FastThumbnails, since FFmpeg is slower than both
withMediaProcessing ? true,
# if MediaProcessing is not enabled, you probably want this instead (less accurate, but much safer and faster)
withBasicAudioMetadata ? false,
# enable FTPS support in the FTP server
withFTPS ? false,
# samba/cifs server; dangerous and buggy, enable if you really need it
withSMB ? false,
}:
let
pinData = lib.importJSON ./pin.json;
pyEnv = python.withPackages (ps:
with ps; [
jinja2
]
++ lib.optional withSMB impacket
++ lib.optional withFTPS pyopenssl
++ lib.optional withThumbnails pillow
++ lib.optional withFastThumbnails pyvips
++ lib.optional withMediaProcessing ffmpeg
++ lib.optional withBasicAudioMetadata mutagen
);
in stdenv.mkDerivation {
pname = "copyparty";
version = pinData.version;
src = fetchurl {
url = pinData.url;
hash = pinData.hash;
};
buildInputs = [ makeWrapper ];
dontUnpack = true;
dontBuild = true;
installPhase = ''
install -Dm755 $src $out/share/copyparty-sfx.py
makeWrapper ${pyEnv.interpreter} $out/bin/copyparty \
--set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
--add-flags "$out/share/copyparty-sfx.py"
'';
}

View File

@@ -0,0 +1,5 @@
{
"url": "https://github.com/9001/copyparty/releases/download/v1.6.15/copyparty-sfx.py",
"version": "1.6.15",
"hash": "sha256-7+P+hxDvjztQsesPk1+GION4ve5I9ZLXehxc8+184kQ="
}

View File

@@ -0,0 +1,77 @@
#!/usr/bin/env python3
# Update the Nix package pin
#
# Usage: ./update.sh [PATH]
# When the [PATH] is not set, it will fetch the latest release from the repo.
# With [PATH] set, it will hash the given file and generate the URL,
# based on the version contained within the file
import base64
import json
import hashlib
import sys
import re
from pathlib import Path
OUTPUT_FILE = Path("pin.json")
TARGET_ASSET = "copyparty-sfx.py"
HASH_TYPE = "sha256"
LATEST_RELEASE_URL = "https://api.github.com/repos/9001/copyparty/releases/latest"
DOWNLOAD_URL = lambda version: f"https://github.com/9001/copyparty/releases/download/v{version}/{TARGET_ASSET}"
def get_formatted_hash(binary):
hasher = hashlib.new("sha256")
hasher.update(binary)
asset_hash = hasher.digest()
encoded_hash = base64.b64encode(asset_hash).decode("ascii")
return f"{HASH_TYPE}-{encoded_hash}"
def version_from_sfx(binary):
result = re.search(b'^VER = "(.*)"$', binary, re.MULTILINE)
if result:
return result.groups(1)[0].decode("ascii")
raise ValueError("version not found in provided file")
def remote_release_pin():
import requests
response = requests.get(LATEST_RELEASE_URL).json()
version = response["tag_name"].lstrip("v")
asset_info = [a for a in response["assets"] if a["name"] == TARGET_ASSET][0]
download_url = asset_info["browser_download_url"]
asset = requests.get(download_url)
formatted_hash = get_formatted_hash(asset.content)
result = {"url": download_url, "version": version, "hash": formatted_hash}
return result
def local_release_pin(path):
asset = path.read_bytes()
version = version_from_sfx(asset)
download_url = DOWNLOAD_URL(version)
formatted_hash = get_formatted_hash(asset)
result = {"url": download_url, "version": version, "hash": formatted_hash}
return result
def main():
if len(sys.argv) > 1:
asset_path = Path(sys.argv[1])
result = local_release_pin(asset_path)
else:
result = remote_release_pin()
print(result)
json_result = json.dumps(result, indent=4)
OUTPUT_FILE.write_text(json_result)
if __name__ == "__main__":
main()

208
contrib/plugins/rave.js Normal file
View File

@@ -0,0 +1,208 @@
/* untz untz untz untz */
(function () {
var can, ctx, W, H, fft, buf, bars, barw, pv,
hue = 0,
ibeat = 0,
beats = [9001],
beats_url = '',
uofs = 0,
ops = ebi('ops'),
raving = false,
recalc = 0,
cdown = 0,
FC = 0.9,
css = `<style>
#fft {
position: fixed;
top: 0;
left: 0;
z-index: -1;
}
body {
box-shadow: inset 0 0 0 white;
}
#ops>a,
#path>a {
display: inline-block;
}
/*
body.untz {
animation: untz-body 200ms ease-out;
}
@keyframes untz-body {
0% {inset 0 0 20em white}
100% {inset 0 0 0 white}
}
*/
:root, html.a, html.b, html.c, html.d, html.e {
--row-alt: rgba(48,52,78,0.2);
}
#files td {
background: none;
}
</style>`;
QS('body').appendChild(mknod('div', null, css));
function rave_load() {
console.log('rave_load');
can = mknod('canvas', 'fft');
QS('body').appendChild(can);
ctx = can.getContext('2d');
fft = new AnalyserNode(actx, {
"fftSize": 2048,
"maxDecibels": 0,
"smoothingTimeConstant": 0.7,
});
ibeat = 0;
beats = [9001];
buf = new Uint8Array(fft.frequencyBinCount);
bars = buf.length * FC;
afilt.filters.push(fft);
if (!raving) {
raving = true;
raver();
}
beats_url = mp.au.src.split('?')[0].replace(/(.*\/)(.*)/, '$1.beats/$2.txt');
console.log("reading beats from", beats_url);
var xhr = new XHR();
xhr.open('GET', beats_url, true);
xhr.onload = readbeats;
xhr.url = beats_url;
xhr.send();
}
function rave_unload() {
qsr('#fft');
can = null;
}
function readbeats() {
if (this.url != beats_url)
return console.log('old beats??', this.url, beats_url);
var sbeats = this.responseText.replace(/\r/g, '').split(/\n/g);
if (sbeats.length < 3)
return;
beats = [];
for (var a = 0; a < sbeats.length; a++)
beats.push(parseFloat(sbeats[a]));
var end = beats.slice(-2),
t = end[1],
d = t - end[0];
while (d > 0.1 && t < 1200)
beats.push(t += d);
}
function hrand() {
return Math.random() - 0.5;
}
function raver() {
if (!can) {
raving = false;
return;
}
requestAnimationFrame(raver);
if (!mp || !mp.au || mp.au.paused)
return;
if (--uofs >= 0) {
document.body.style.marginLeft = hrand() * uofs + 'px';
ebi('tree').style.marginLeft = hrand() * uofs + 'px';
for (var a of QSA('#ops>a, #path>a, #pctl>a'))
a.style.transform = 'translate(' + hrand() * uofs * 1 + 'px, ' + hrand() * uofs * 0.7 + 'px) rotate(' + Math.random() * uofs * 0.7 + 'deg)'
}
if (--recalc < 0) {
recalc = 60;
var tree = ebi('tree'),
x = tree.style.display == 'none' ? 0 : tree.offsetWidth;
//W = can.width = window.innerWidth - x;
//H = can.height = window.innerHeight;
//H = ebi('widget').offsetTop;
W = can.width = bars;
H = can.height = 512;
barw = 1; //parseInt(0.8 + W / bars);
can.style.left = x + 'px';
can.style.width = (window.innerWidth - x) + 'px';
can.style.height = ebi('widget').offsetTop + 'px';
}
//if (--cdown == 1)
// clmod(ops, 'untz');
fft.getByteFrequencyData(buf);
var imax = 0, vmax = 0;
for (var a = 10; a < 50; a++)
if (vmax < buf[a]) {
vmax = buf[a];
imax = a;
}
hue = hue * 0.93 + imax * 0.07;
ctx.fillStyle = 'rgba(0,0,0,0)';
ctx.fillRect(0, 0, W, H);
ctx.clearRect(0, 0, W, H);
ctx.fillStyle = 'hsla(' + (hue * 2.5) + ',100%,50%,0.7)';
var x = 0, mul = (H / 256) * 0.5;
for (var a = 0; a < buf.length * FC; a++) {
var v = buf[a] * mul * (1 + 0.69 * a / buf.length);
ctx.fillRect(x, H - v, barw, v);
x += barw;
}
var t = mp.au.currentTime + 0.05;
if (ibeat >= beats.length || beats[ibeat] > t)
return;
while (ibeat < beats.length && beats[ibeat++] < t)
continue;
return untz();
var cv = 0;
for (var a = 0; a < 128; a++)
cv += buf[a];
if (cv - pv > 1000) {
console.log(pv, cv, cv - pv);
if (cdown < 0) {
clmod(ops, 'untz', 1);
cdown = 20;
}
}
pv = cv;
}
function untz() {
console.log('untz');
uofs = 14;
document.body.animate([
{ boxShadow: 'inset 0 0 1em #f0c' },
{ boxShadow: 'inset 0 0 20em #f0c', offset: 0.2 },
{ boxShadow: 'inset 0 0 0 #f0c' },
], { duration: 200, iterations: 1 });
}
afilt.plugs.push({
"en": true,
"load": rave_load,
"unload": rave_unload
});
})();

View File

@@ -2,12 +2,16 @@
# and share '/mnt' with anonymous read+write
#
# installation:
# cp -pv copyparty.service /etc/systemd/system
# restorecon -vr /etc/systemd/system/copyparty.service
# wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py -O /usr/local/bin/copyparty-sfx.py
# cp -pv copyparty.service /etc/systemd/system/
# restorecon -vr /etc/systemd/system/copyparty.service # on fedora/rhel
# firewall-cmd --permanent --add-port={80,443,3923}/tcp # --zone=libvirt
# firewall-cmd --reload
# systemctl daemon-reload && systemctl enable --now copyparty
#
# if it fails to start, first check this: systemctl status copyparty
# then try starting it while viewing logs: journalctl -fan 100
#
# you may want to:
# change "User=cpp" and "/home/cpp/" to another user
# remove the nft lines to only listen on port 3923
@@ -44,7 +48,7 @@ ExecReload=/bin/kill -s USR1 $MAINPID
User=cpp
Environment=XDG_CONFIG_HOME=/home/cpp/.config
# setup forwarding from ports 80 and 443 to port 3923
# OPTIONAL: setup forwarding from ports 80 and 443 to port 3923
ExecStartPre=+/bin/bash -c 'nft -n -a list table nat | awk "/ to :3923 /{print\$NF}" | xargs -rL1 nft delete rule nat prerouting handle; true'
ExecStartPre=+nft add table ip nat
ExecStartPre=+nft -- add chain ip nat prerouting { type nat hook prerouting priority -100 \; }

View File

@@ -3,8 +3,6 @@ rem removes the 47.6 MiB filesize limit when downloading from webdav
rem + optionally allows/enables password-auth over plaintext http
rem + optionally helps disable wpad, removing the 10sec latency
setlocal enabledelayedexpansion
net session >nul 2>&1
if %errorlevel% neq 0 (
echo sorry, you must run this as administrator
@@ -20,30 +18,26 @@ echo OK;
echo allow webdav basic-auth over plaintext http?
echo Y: login works, but the password will be visible in wireshark etc
echo N: login will NOT work unless you use https and valid certificates
set c=.
set /p "c=(Y/N): "
echo(
if /i not "!c!"=="y" goto :g1
reg add HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\WebClient\Parameters /v BasicAuthLevel /t REG_DWORD /d 0x2 /f
rem default is 1 (require tls)
choice
if %errorlevel% equ 1 (
reg add HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\WebClient\Parameters /v BasicAuthLevel /t REG_DWORD /d 0x2 /f
rem default is 1 (require tls)
)
:g1
echo(
echo OK;
echo do you want to disable wpad?
echo can give a HUGE speed boost depending on network settings
set c=.
set /p "c=(Y/N): "
echo(
if /i not "!c!"=="y" goto :g2
echo(
echo i'm about to open the [Connections] tab in [Internet Properties] for you;
echo please click [LAN settings] and disable [Automatically detect settings]
echo(
pause
control inetcpl.cpl,,4
choice
if %errorlevel% equ 1 (
echo(
echo i'm about to open the [Connections] tab in [Internet Properties] for you;
echo please click [LAN settings] and disable [Automatically detect settings]
echo(
pause
control inetcpl.cpl,,4
)
:g2
net stop webclient
net start webclient
echo(

View File

@@ -34,6 +34,8 @@ ANYWIN = WINDOWS or sys.platform in ["msys", "cygwin"]
MACOS = platform.system() == "Darwin"
EXE = bool(getattr(sys, "frozen", False))
try:
CORES = len(os.sched_getaffinity(0))
except:

View File

@@ -23,7 +23,7 @@ import traceback
import uuid
from textwrap import dedent
from .__init__ import ANYWIN, CORES, PY2, VT100, WINDOWS, E, EnvParams, unicode
from .__init__ import ANYWIN, CORES, EXE, PY2, VT100, WINDOWS, E, EnvParams, unicode
from .__version__ import CODENAME, S_BUILD_DT, S_VERSION
from .authsrv import expand_config_file, re_vol, split_cfg_ln, upgrade_cfg_fmt
from .cfg import flagcats, onedash
@@ -36,7 +36,6 @@ from .util import (
UNPLICATIONS,
align_tab,
ansi_re,
is_exe,
min_ex,
py_desc,
pybin,
@@ -262,7 +261,7 @@ def ensure_locale() -> None:
warn(t.format(safe))
def ensure_cert() -> None:
def ensure_cert(al: argparse.Namespace) -> None:
"""
the default cert (and the entire TLS support) is only here to enable the
crypto.subtle javascript API, which is necessary due to the webkit guys
@@ -271,15 +270,30 @@ def ensure_cert() -> None:
i feel awful about this and so should they
"""
cert_insec = os.path.join(E.mod, "res/insecure.pem")
cert_cfg = os.path.join(E.cfg, "cert.pem")
if not os.path.exists(cert_cfg):
shutil.copy(cert_insec, cert_cfg)
cert_appdata = os.path.join(E.cfg, "cert.pem")
if not os.path.isfile(al.cert):
if cert_appdata != al.cert:
raise Exception("certificate file does not exist: " + al.cert)
shutil.copy(cert_insec, al.cert)
with open(al.cert, "rb") as f:
buf = f.read()
o1 = buf.find(b" PRIVATE KEY-")
o2 = buf.find(b" CERTIFICATE-")
m = "unsupported certificate format: "
if o1 < 0:
raise Exception(m + "no private key inside pem")
if o2 < 0:
raise Exception(m + "no server certificate inside pem")
if o1 > o2:
raise Exception(m + "private key must appear before server certificate")
try:
if filecmp.cmp(cert_cfg, cert_insec):
if filecmp.cmp(al.cert, cert_insec):
lprint(
"\033[33musing default TLS certificate; https will be insecure."
+ "\033[36m\ncertificate location: {}\033[0m\n".format(cert_cfg)
+ "\033[36m\ncertificate location: {}\033[0m\n".format(al.cert)
)
except:
pass
@@ -500,8 +514,12 @@ def get_sects():
"""
volflags are appended to volume definitions, for example,
to create a write-only volume with the \033[33mnodupe\033[0m and \033[32mnosub\033[0m flags:
\033[35m-v /mnt/inc:/inc:w\033[33m:c,nodupe\033[32m:c,nosub"""
)
\033[35m-v /mnt/inc:/inc:w\033[33m:c,nodupe\033[32m:c,nosub\033[0m
if global config defines a volflag for all volumes,
you can unset it for a specific volume with -flag
"""
).rstrip()
+ build_flags_desc(),
],
[
@@ -512,6 +530,7 @@ def get_sects():
execute a command (a program or script) before or after various events;
\033[36mxbu\033[35m executes CMD before a file upload starts
\033[36mxau\033[35m executes CMD after a file upload finishes
\033[36mxiu\033[35m executes CMD after all uploads finish and volume is idle
\033[36mxbr\033[35m executes CMD before a file rename/move
\033[36mxar\033[35m executes CMD after a file rename/move
\033[36mxbd\033[35m executes CMD before a file delete
@@ -533,6 +552,7 @@ def get_sects():
\033[36mj\033[35m provides json with info as 1st arg instead of filepath
\033[36mwN\033[35m waits N sec after command has been started before continuing
\033[36mtN\033[35m sets an N sec timeout before the command is abandoned
\033[36miN\033[35m xiu only: volume must be idle for N sec (default = 5)
\033[36mkt\033[35m kills the entire process tree on timeout (default),
\033[36mkm\033[35m kills just the main process
@@ -543,6 +563,14 @@ def get_sects():
\033[36mc2\033[35m show only stdout
\033[36mc3\033[35m mute all process output
\033[0m
each hook is executed once for each event, except for \033[36mxiu\033[0m
which builds up a backlog of uploads, running the hook just once
as soon as the volume has been idle for iN seconds (5 by default)
\033[36mxiu\033[0m is also unique in that it will pass the metadata to the
executed program on STDIN instead of as argv arguments, and
it also includes the wark (file-id/hash) as a json property
except for \033[36mxm\033[0m, only one hook / one action can run at a time,
so it's recommended to use the \033[36mf\033[0m flag unless you really need
to wait for the hook to finish before continuing (without \033[36mf\033[0m
@@ -591,9 +619,9 @@ def get_sects():
\033[32macid\033[0m = extremely safe but slow; the old default. Should never lose any data no matter what
\033[32mswal\033[0m = 2.4x faster uploads yet 99.9%% as safe -- theoretical chance of losing metadata for the ~200 most recently uploaded files if there's a power-loss or your OS crashes
\033[32mswal\033[0m = 2.4x faster uploads yet 99.9% as safe -- theoretical chance of losing metadata for the ~200 most recently uploaded files if there's a power-loss or your OS crashes
\033[32mwal\033[0m = another 21x faster on HDDs yet 90%% as safe; same pitfall as \033[33mswal\033[0m except more likely
\033[32mwal\033[0m = another 21x faster on HDDs yet 90% as safe; same pitfall as \033[33mswal\033[0m except more likely
\033[32myolo\033[0m = another 1.5x faster, and removes the occasional sudden upload-pause while the disk syncs, but now you're at risk of losing the entire database in a powerloss / OS-crash
@@ -652,6 +680,7 @@ def add_upload(ap):
ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads, hiding them from clients unless -ed")
ap2.add_argument("--plain-ip", action="store_true", help="when avoiding filename collisions by appending the uploader's ip to the filename: append the plaintext ip instead of salting and hashing the ip")
ap2.add_argument("--unpost", metavar="SEC", type=int, default=3600*12, help="grace period where uploads can be deleted by the uploader, even without delete permissions; 0=disabled")
ap2.add_argument("--blank-wt", metavar="SEC", type=int, default=300, help="file write grace period (any client can write to a blank file last-modified more recently than SEC seconds ago)")
ap2.add_argument("--reg-cap", metavar="N", type=int, default=38400, help="max number of uploads to keep in memory when running without -e2d; roughly 1 MiB RAM per 600")
ap2.add_argument("--no-fpool", action="store_true", help="disable file-handle pooling -- instead, repeatedly close and reopen files during upload (very slow on windows)")
ap2.add_argument("--use-fpool", action="store_true", help="force file-handle pooling, even when it might be dangerous (multiprocessing, filesystems lacking sparse-files support, ...)")
@@ -681,15 +710,19 @@ def add_network(ap):
ap2.add_argument("--reuseaddr", action="store_true", help="set reuseaddr on listening sockets on windows; allows rapid restart of copyparty at the expense of being able to accidentally start multiple instances")
else:
ap2.add_argument("--freebind", action="store_true", help="allow listening on IPs which do not yet exist, for example if the network interfaces haven't finished going up. Only makes sense for IPs other than '0.0.0.0', '127.0.0.1', '::', and '::1'. May require running as root (unless net.ipv6.ip_nonlocal_bind)")
ap2.add_argument("--s-thead", metavar="SEC", type=int, default=120, help="socket timeout (read request header)")
ap2.add_argument("--s-tbody", metavar="SEC", type=float, default=186, help="socket timeout (read/write request/response bodies). Use 60 on fast servers (default is extremely safe). Disable with 0 if reverse-proxied for a 2%% speed boost")
ap2.add_argument("--s-wr-sz", metavar="B", type=int, default=256*1024, help="socket write size in bytes")
ap2.add_argument("--s-wr-slp", metavar="SEC", type=float, default=0, help="debug: socket write delay in seconds")
ap2.add_argument("--rsp-slp", metavar="SEC", type=float, default=0, help="debug: response delay in seconds")
ap2.add_argument("--rsp-jtr", metavar="SEC", type=float, default=0, help="debug: response delay, random duration 0..SEC")
def add_tls(ap):
def add_tls(ap, cert_path):
ap2 = ap.add_argument_group('SSL/TLS options')
ap2.add_argument("--http-only", action="store_true", help="disable ssl/tls -- force plaintext")
ap2.add_argument("--https-only", action="store_true", help="disable plaintext -- force tls")
ap2.add_argument("--cert", metavar="PATH", type=u, default=cert_path, help="path to TLS certificate")
ap2.add_argument("--ssl-ver", metavar="LIST", type=u, help="set allowed ssl/tls versions; [\033[32mhelp\033[0m] shows available versions; default is what your python version considers safe")
ap2.add_argument("--ciphers", metavar="LIST", type=u, help="set allowed ssl/tls ciphers; [\033[32mhelp\033[0m] shows available ciphers")
ap2.add_argument("--ssl-dbg", action="store_true", help="dump some tls info")
@@ -740,6 +773,7 @@ def add_ftp(ap):
ap2.add_argument("--ftp", metavar="PORT", type=int, help="enable FTP server on PORT, for example \033[32m3921")
ap2.add_argument("--ftps", metavar="PORT", type=int, help="enable FTPS server on PORT, for example \033[32m3990")
ap2.add_argument("--ftpv", action="store_true", help="verbose")
ap2.add_argument("--ftp4", action="store_true", help="only listen on IPv4")
ap2.add_argument("--ftp-wt", metavar="SEC", type=int, default=7, help="grace period for resuming interrupted uploads (any client can write to any file last-modified more recently than SEC seconds ago)")
ap2.add_argument("--ftp-nat", metavar="ADDR", type=u, help="the NAT address to use for passive connections")
ap2.add_argument("--ftp-pr", metavar="P-P", type=u, help="the range of TCP ports to use for passive connections, for example \033[32m12000-13000")
@@ -747,9 +781,10 @@ def add_ftp(ap):
def add_webdav(ap):
ap2 = ap.add_argument_group('WebDAV options')
ap2.add_argument("--daw", action="store_true", help="enable full write support. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)")
ap2.add_argument("--daw", action="store_true", help="enable full write support, even if client may not be webdav. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)")
ap2.add_argument("--dav-inf", action="store_true", help="allow depth:infinite requests (recursive file listing); extremely server-heavy but required for spec compliance -- luckily few clients rely on this")
ap2.add_argument("--dav-mac", action="store_true", help="disable apple-garbage filter -- allow macos to create junk files (._* and .DS_Store, .Spotlight-*, .fseventsd, .Trashes, .AppleDouble, __MACOS)")
ap2.add_argument("--dav-rt", action="store_true", help="show symlink-destination's lastmodified instead of the link itself; always enabled for recursive listings (volflag=davrt)")
def add_smb(ap):
@@ -769,6 +804,7 @@ def add_hooks(ap):
ap2 = ap.add_argument_group('event hooks (see --help-hooks)')
ap2.add_argument("--xbu", metavar="CMD", type=u, action="append", help="execute CMD before a file upload starts")
ap2.add_argument("--xau", metavar="CMD", type=u, action="append", help="execute CMD after a file upload finishes")
ap2.add_argument("--xiu", metavar="CMD", type=u, action="append", help="execute CMD after all uploads finish and volume is idle")
ap2.add_argument("--xbr", metavar="CMD", type=u, action="append", help="execute CMD before a file move/rename")
ap2.add_argument("--xar", metavar="CMD", type=u, action="append", help="execute CMD after a file move/rename")
ap2.add_argument("--xbd", metavar="CMD", type=u, action="append", help="execute CMD before a file delete")
@@ -802,7 +838,9 @@ def add_safety(ap, fk_salt):
ap2.add_argument("-ss", action="store_true", help="further increase safety: Prevent js-injection, accidental move/delete, broken symlinks, webdav, 404 on 403, ban on excessive 404s.\n └─Alias of\033[32m -s --unpost=0 --no-del --no-mv --hardlink --vague-403 --ban-404=50,60,1440 -nih")
ap2.add_argument("-sss", action="store_true", help="further increase safety: Enable logging to disk, scan for dangerous symlinks.\n └─Alias of\033[32m -ss --no-dav --no-logues --no-readme -lo=cpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz --ls=**,*,ln,p,r")
ap2.add_argument("--ls", metavar="U[,V[,F]]", type=u, help="do a sanity/safety check of all volumes on startup; arguments \033[33mUSER\033[0m,\033[33mVOL\033[0m,\033[33mFLAGS\033[0m; example [\033[32m**,*,ln,p,r\033[0m]")
ap2.add_argument("--salt", type=u, default="hunter2", help="up2k file-hash salt; used to generate unpredictable internal identifiers for uploads -- doesn't really matter")
ap2.add_argument("--xvol", action="store_true", help="never follow symlinks leaving the volume root, unless the link is into another volume where the user has similar access (volflag=xvol)")
ap2.add_argument("--xdev", action="store_true", help="stay within the filesystem of the volume root; do not descend into other devices (symlink or bind-mount to another HDD, ...) (volflag=xdev)")
ap2.add_argument("--salt", type=u, default="hunter2", help="up2k file-hash salt; serves no purpose, no reason to change this (but delete all databases if you do)")
ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files -- this one DOES matter")
ap2.add_argument("--no-dot-mv", action="store_true", help="disallow moving dotfiles; makes it impossible to move folders containing dotfiles")
ap2.add_argument("--no-dot-ren", action="store_true", help="disallow renaming dotfiles; makes it impossible to make something a dotfile")
@@ -862,7 +900,7 @@ def add_thumbnail(ap):
ap2.add_argument("--th-poke", metavar="SEC", type=int, default=300, help="activity labeling cooldown -- avoids doing keepalive pokes (updating the mtime) on thumbnail folders more often than SEC seconds")
ap2.add_argument("--th-clean", metavar="SEC", type=int, default=43200, help="cleanup interval; 0=disabled")
ap2.add_argument("--th-maxage", metavar="SEC", type=int, default=604800, help="max folder age -- folders which haven't been poked for longer than --th-poke seconds will get deleted every --th-clean seconds")
ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for")
ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for; case-insensitive if -e2d")
# https://pillow.readthedocs.io/en/stable/handbook/image-file-formats.html
# https://github.com/libvips/libvips
# ffmpeg -hide_banner -demuxers | awk '/^ D /{print$2}' | while IFS= read -r x; do ffmpeg -hide_banner -h demuxer=$x; done | grep -E '^Demuxer |extensions:'
@@ -870,7 +908,7 @@ def add_thumbnail(ap):
ap2.add_argument("--th-r-vips", metavar="T,T", type=u, default="avif,exr,fit,fits,fts,gif,hdr,heic,jp2,jpeg,jpg,jpx,jxl,nii,pfm,pgm,png,ppm,svg,tif,tiff,webp", help="image formats to decode using pyvips")
ap2.add_argument("--th-r-ffi", metavar="T,T", type=u, default="apng,avif,avifs,bmp,dds,dib,fit,fits,fts,gif,hdr,heic,heics,heif,heifs,icns,ico,jp2,jpeg,jpg,jpx,jxl,pbm,pcx,pfm,pgm,png,pnm,ppm,psd,sgi,tga,tif,tiff,webp,xbm,xpm", help="image formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffv", metavar="T,T", type=u, default="3gp,asf,av1,avc,avi,flv,h264,h265,hevc,m4v,mjpeg,mjpg,mkv,mov,mp4,mpeg,mpeg2,mpegts,mpg,mpg2,mts,nut,ogm,ogv,rm,ts,vob,webm,wmv", help="video formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,itgz,itr,itz,m4a,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3gz,s3m,s3r,s3z,tak,tta,ulaw,wav,wma,wv,xm,xmgz,xmr,xmz,xpk", help="audio formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,m4a,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3m,tak,tta,ulaw,wav,wma,wv,xm,xpk", help="audio formats to decode using ffmpeg")
def add_transcoding(ap):
@@ -888,15 +926,13 @@ def add_db_general(ap, hcores):
ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash")
ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume data (db, thumbs) (volflag=hist)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice (volflag=noforget)")
ap2.add_argument("--dbd", metavar="PROFILE", default="wal", help="database durability profile; sets the tradeoff between robustness and speed, see --help-dbd (volflag=dbd)")
ap2.add_argument("--xlink", action="store_true", help="on upload: check all volumes for dupes, not just the target volume (volflag=xlink)")
ap2.add_argument("--xdev", action="store_true", help="do not descend into other filesystems (symlink or bind-mount to another HDD, ...) (volflag=xdev)")
ap2.add_argument("--xvol", action="store_true", help="skip symlinks leaving the volume root (volflag=xvol)")
ap2.add_argument("--hash-mt", metavar="CORES", type=int, default=hcores, help="num cpu cores to use for file hashing; set 0 or 1 for single-core hashing")
ap2.add_argument("--re-maxage", metavar="SEC", type=int, default=0, help="disk rescan volume interval, 0=off (volflag=scan)")
ap2.add_argument("--db-act", metavar="SEC", type=float, default=10, help="defer any scheduled volume reindexing until SEC seconds after last db write (uploads, renames, ...)")
@@ -934,6 +970,7 @@ def add_ui(ap, retry):
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI)")
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")
@@ -974,8 +1011,10 @@ def run_argparse(
description="http file sharing hub v{} ({})".format(S_VERSION, S_BUILD_DT),
)
cert_path = os.path.join(E.cfg, "cert.pem")
try:
fk_salt = unicode(os.path.getmtime(os.path.join(E.cfg, "cert.pem")))
fk_salt = unicode(os.path.getmtime(cert_path))
except:
fk_salt = "hunter2"
@@ -987,7 +1026,7 @@ def run_argparse(
add_general(ap, nc, srvname)
add_network(ap)
add_tls(ap)
add_tls(ap, cert_path)
add_qr(ap, tty)
add_zeroconf(ap)
add_zc_mdns(ap)
@@ -1067,12 +1106,10 @@ def main(argv: Optional[list[str]] = None) -> None:
showlic()
sys.exit(0)
if is_exe:
if EXE:
print("pybin: {}\n".format(pybin), end="")
ensure_locale()
if HAVE_SSL:
ensure_cert()
for k, v in zip(argv[1:], argv[2:]):
if k == "-c" and os.path.isfile(v):
@@ -1140,6 +1177,9 @@ def main(argv: Optional[list[str]] = None) -> None:
except:
sys.exit(1)
if HAVE_SSL:
ensure_cert(al)
if WINDOWS and not al.keep_qem:
try:
disable_quickedit()

View File

@@ -1,8 +1,8 @@
# coding: utf-8
VERSION = (1, 6, 4)
CODENAME = "cors k"
BUILD_DT = (2023, 2, 11)
VERSION = (1, 7, 0)
CODENAME = "unlinked"
BUILD_DT = (2023, 4, 29)
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -67,9 +67,9 @@ class AXS(object):
self.upget: set[str] = set(upget or [])
def __repr__(self) -> str:
return "AXS({})".format(
return "AXS(%s)" % (
", ".join(
"{}={!r}".format(k, self.__dict__[k])
"%s=%r" % (k, self.__dict__[k])
for k in "uread uwrite umove udel uget upget".split()
)
)
@@ -285,6 +285,8 @@ class VFS(object):
self.vpath = vpath # absolute path in the virtual filesystem
self.axs = axs
self.flags = flags # config options
self.root = self
self.dev = 0 # st_dev
self.nodes: dict[str, VFS] = {} # child nodes
self.histtab: dict[str, str] = {} # all realpath->histpath
self.dbv: Optional[VFS] = None # closest full/non-jump parent
@@ -297,26 +299,42 @@ class VFS(object):
self.apget: dict[str, list[str]] = {}
if realpath:
rp = realpath + ("" if realpath.endswith(os.sep) else os.sep)
vp = vpath + ("/" if vpath else "")
self.histpath = os.path.join(realpath, ".hist") # db / thumbcache
self.all_vols = {vpath: self} # flattened recursive
self.all_aps = [(rp, self)]
self.all_vps = [(vp, self)]
else:
self.histpath = ""
self.all_vols = {}
self.all_aps = []
self.all_vps = []
def __repr__(self) -> str:
return "VFS({})".format(
return "VFS(%s)" % (
", ".join(
"{}={!r}".format(k, self.__dict__[k])
"%s=%r" % (k, self.__dict__[k])
for k in "realpath vpath axs flags".split()
)
)
def get_all_vols(self, outdict: dict[str, "VFS"]) -> None:
def get_all_vols(
self,
vols: dict[str, "VFS"],
aps: list[tuple[str, "VFS"]],
vps: list[tuple[str, "VFS"]],
) -> None:
if self.realpath:
outdict[self.vpath] = self
vols[self.vpath] = self
rp = self.realpath
rp += "" if rp.endswith(os.sep) else os.sep
vp = self.vpath + ("/" if self.vpath else "")
aps.append((rp, self))
vps.append((vp, self))
for v in self.nodes.values():
v.get_all_vols(outdict)
v.get_all_vols(vols, aps, vps)
def add(self, src: str, dst: str) -> "VFS":
"""get existing, or add new path to the vfs"""
@@ -356,7 +374,8 @@ class VFS(object):
flags = {k: v for k, v in self.flags.items()}
hist = flags.get("hist")
if hist and hist != "-":
flags["hist"] = "{}/{}".format(hist.rstrip("/"), name)
zs = "{}/{}".format(hist.rstrip("/"), name)
flags["hist"] = os.path.expanduser(zs) if zs.startswith("~") else zs
return flags
@@ -389,7 +408,11 @@ class VFS(object):
self, vpath: str, uname: str
) -> tuple[bool, bool, bool, bool, bool, bool]:
"""can Read,Write,Move,Delete,Get,Upget"""
vn, _ = self._find(undot(vpath))
if vpath:
vn, _ = self._find(undot(vpath))
else:
vn = self
c = vn.axs
return (
uname in c.uread or "*" in c.uread,
@@ -544,9 +567,20 @@ class VFS(object):
self.log("vfs.walk", t.format(seen[-1], fsroot, self.vpath, rem), 3)
return
if "xdev" in self.flags or "xvol" in self.flags:
rm1 = []
for le in vfs_ls:
ap = absreal(os.path.join(fsroot, le[0]))
vn2 = self.chk_ap(ap)
if not vn2 or not vn2.get("", uname, True, False):
rm1.append(le)
_ = [vfs_ls.remove(x) for x in rm1] # type: ignore
seen = seen[:] + [fsroot]
rfiles = [x for x in vfs_ls if not stat.S_ISDIR(x[1].st_mode)]
rdirs = [x for x in vfs_ls if stat.S_ISDIR(x[1].st_mode)]
# if lstat: ignore folder symlinks since copyparty will never make those
# (and we definitely don't want to descend into them)
rfiles.sort()
rdirs.sort()
@@ -577,6 +611,7 @@ class VFS(object):
def zipgen(
self,
vpath: str,
vrem: str,
flt: set[str],
uname: str,
@@ -588,7 +623,7 @@ class VFS(object):
# if multiselect: add all items to archive root
# if single folder: the folder itself is the top-level item
folder = "" if flt or not wrap else (vrem.split("/")[-1].lstrip(".") or "top")
folder = "" if flt or not wrap else (vpath.split("/")[-1].lstrip(".") or "top")
g = self.walk(folder, vrem, [], uname, [[True, False]], dots, scandir, False)
for _, _, vpath, apath, files, rd, vd in g:
@@ -639,6 +674,44 @@ class VFS(object):
for d in [{"vp": v, "ap": a, "st": n} for v, a, n in ret2]:
yield d
def chk_ap(self, ap: str, st: Optional[os.stat_result] = None) -> Optional["VFS"]:
aps = ap + os.sep
if "xdev" in self.flags and not ANYWIN:
if not st:
ap2 = ap.replace("\\", "/") if ANYWIN else ap
while ap2:
try:
st = bos.stat(ap2)
break
except:
if "/" not in ap2:
raise
ap2 = ap2.rsplit("/", 1)[0]
assert st
vdev = self.dev
if not vdev:
vdev = self.dev = bos.stat(self.realpath).st_dev
if vdev != st.st_dev:
if self.log:
t = "xdev: {}[{}] => {}[{}]"
self.log("vfs", t.format(vdev, self.realpath, st.st_dev, ap), 3)
return None
if "xvol" in self.flags:
for vap, vn in self.root.all_aps:
if aps.startswith(vap):
return vn
if self.log:
self.log("vfs", "xvol: [{}]".format(ap), 3)
return None
return self
if WINDOWS:
re_vol = re.compile(r"^([a-zA-Z]:[\\/][^:]*|[^:]*):([^:]*):(.*)$")
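The chk_ap routine added above is the runtime half of the xdev/xvol volflags: xdev compares the st_dev of the resolved file against the device of the volume root, and xvol requires the resolved path to sit under one of the known volume roots, which get_all_vols now collects and sorts longest-first so the deepest matching volume wins. A minimal standalone sketch of the same two checks, not copyparty's actual API (volume handling and logging are simplified assumptions):

import os

def check_xdev(vol_root: str, abspath: str) -> bool:
    # xdev: the file must live on the same device/filesystem as the volume root
    return os.stat(vol_root).st_dev == os.stat(abspath).st_dev

def check_xvol(vol_roots: list, abspath: str) -> bool:
    # xvol: the symlink-resolved path must sit inside some volume root;
    # roots get a trailing separator so "/srv/music2" cannot match "/srv/music"
    ap = os.path.realpath(abspath) + os.sep
    return any(ap.startswith(r.rstrip(os.sep) + os.sep) for r in vol_roots)

# a symlink inside a volume pointing at, say, /home/ed/secret would typically
# pass check_xdev (same disk) but fail check_xvol (outside every volume root)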
@@ -857,7 +930,7 @@ class AuthSrv(object):
zd = split_cfg_ln(ln)
fstr = ""
for sk, sv in zd.items():
bad = re.sub(r"[a-z0-9_]", "", sk)
bad = re.sub(r"[a-z0-9_-]", "", sk).lstrip("-")
if bad:
err = "bad characters [{}] in volflag name [{}]; "
err = err.format(bad, sk)
@@ -933,8 +1006,15 @@ class AuthSrv(object):
value: Union[str, bool, list[str]],
is_list: bool,
) -> None:
desc = flagdescs.get(name, "?").replace("\n", " ")
if name not in ["mtp", "xbu", "xau", "xbr", "xar", "xbd", "xad", "xm"]:
desc = flagdescs.get(name.lstrip("-"), "?").replace("\n", " ")
if re.match("^-[^-]+$", name):
t = "└─unset volflag [{}] ({})"
self._e(t.format(name[1:], desc))
flags[name] = True
return
if name not in "mtp xbu xau xiu xbr xar xbd xad xm".split():
if value is True:
t = "└─add volflag [{}] = {} ({})"
else:
@@ -1058,7 +1138,13 @@ class AuthSrv(object):
assert vfs
vfs.all_vols = {}
vfs.get_all_vols(vfs.all_vols)
vfs.all_aps = []
vfs.all_vps = []
vfs.get_all_vols(vfs.all_vols, vfs.all_aps, vfs.all_vps)
for vol in vfs.all_vols.values():
vol.all_aps.sort(key=lambda x: len(x[0]), reverse=True)
vol.all_vps.sort(key=lambda x: len(x[0]), reverse=True)
vol.root = vfs
for perm in "read write move del get pget".split():
axs_key = "u" + perm
@@ -1101,6 +1187,9 @@ class AuthSrv(object):
if vflag == "-":
pass
elif vflag:
if vflag.startswith("~"):
vflag = os.path.expanduser(vflag)
vol.histpath = uncyg(vflag) if WINDOWS else vflag
elif self.args.hist:
for nch in range(len(hid)):
@@ -1303,7 +1392,7 @@ class AuthSrv(object):
vol.flags["mth"] = self.args.mth
# append additive args from argv to volflags
hooks = "xbu xau xbr xar xbd xad xm".split()
hooks = "xbu xau xiu xbr xar xbd xad xm".split()
for name in ["mtp"] + hooks:
self._read_volflag(vol.flags, name, getattr(self.args, name), True)
@@ -1363,10 +1452,19 @@ class AuthSrv(object):
if k in ints:
vol.flags[k] = int(vol.flags[k])
if "lifetime" in vol.flags and "e2d" not in vol.flags:
t = 'removing lifetime config from volume "/{}" because e2d is disabled'
self.log(t.format(vol.vpath), 1)
del vol.flags["lifetime"]
if "e2d" not in vol.flags:
if "lifetime" in vol.flags:
t = 'removing lifetime config from volume "/{}" because e2d is disabled'
self.log(t.format(vol.vpath), 1)
del vol.flags["lifetime"]
needs_e2d = [x for x in hooks if x != "xm"]
drop = [x for x in needs_e2d if vol.flags.get(x)]
if drop:
t = 'removing [{}] from volume "/{}" because e2d is disabled'
self.log(t.format(", ".join(drop), vol.vpath), 1)
for x in drop:
vol.flags.pop(x)
if vol.flags.get("neversymlink") and not vol.flags.get("hardlink"):
vol.flags["copydupes"] = True
@@ -1427,6 +1525,12 @@ class AuthSrv(object):
self.log(t, 1)
errors = True
for vol in vfs.all_vols.values():
for k in list(vol.flags.keys()):
if re.match("^-[^-]+$", k):
vol.flags.pop(k[1:], None)
vol.flags.pop(k)
if errors:
sys.exit(1)
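To illustrate the volflag-unset mechanic above: a flag written as "-name" in a config survives parsing as a "-name" marker, and this final pass strips both the marker and the flag it cancels. A rough sketch of that pass, assuming flags is a plain dict rather than the volume objects used above:

import re

def drop_unset_volflags(flags: dict) -> dict:
    # "-nohash" cancels a "nohash" inherited from global args or a parent config
    for k in list(flags.keys()):
        if re.match("^-[^-]+$", k):
            flags.pop(k[1:], None)  # the flag being unset (if it was ever set)
            flags.pop(k)            # the "-flag" marker itself
    return flags

print(drop_unset_volflags({"nohash": r"\.pyc$", "-nohash": True, "e2d": True}))
# -> {'e2d': True}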
@@ -1624,8 +1728,8 @@ class AuthSrv(object):
]
csv = set("i p".split())
lst = set("c ihead mtm mtp xad xar xau xbd xbr xbu xm".split())
askip = set("a v c vc cgen".split())
lst = set("c ihead mtm mtp xad xar xau xiu xbd xbr xbu xm".split())
askip = set("a v c vc cgen theme".split())
# keymap from argv to vflag
amap = vf_bmap()
@@ -1722,6 +1826,11 @@ class AuthSrv(object):
elif v is True:
trues.append(k)
elif v is not False:
try:
v = v.pattern
except:
pass
vals.append("{}: {}".format(k, v))
pops = []
for k1, k2 in IMPLICATIONS:

View File

@@ -13,6 +13,7 @@ def vf_bmap() -> dict[str, str]:
"no_dedup": "copydupes",
"no_dupe": "nodupe",
"no_forget": "noforget",
"dav_rt": "davrt",
}
for k in (
"dotsrch",
@@ -106,7 +107,7 @@ flagcats = {
"dbd=[acid|swal|wal|yolo]": "database speed-durability tradeoff",
"xlink": "cross-volume dupe detection / linking",
"xdev": "do not descend into other filesystems",
"xvol": "skip symlinks leaving the volume root",
"xvol": "do not follow symlinks leaving the volume root",
"dotsrch": "show dotfiles in search results",
"nodotsrch": "hide dotfiles in search results (default)",
},
@@ -123,6 +124,7 @@ flagcats = {
"event hooks\n(better explained in --help-hooks)": {
"xbu=CMD": "execute CMD before a file upload starts",
"xau=CMD": "execute CMD after a file upload finishes",
"xiu=CMD": "execute CMD after all uploads finish and volume is idle",
"xbr=CMD": "execute CMD before a file rename/move",
"xar=CMD": "execute CMD after a file rename/move",
"xbd=CMD": "execute CMD before a file delete",
@@ -141,7 +143,8 @@ flagcats = {
"lg_sbf": "list of *logue-sandbox safeguards to disable",
},
"others": {
"fk=8": 'generates per-file accesskeys,\nwhich will then be required at the "g" permission'
"fk=8": 'generates per-file accesskeys,\nwhich will then be required at the "g" permission',
"davrt": "show lastmod time of symlink destination, not the link itself\n(note: this option is always enabled for recursive listings)",
},
}

View File

@@ -2,6 +2,7 @@
from __future__ import print_function, unicode_literals
import argparse
import errno
import logging
import os
import stat
@@ -14,6 +15,7 @@ from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
from .authsrv import VFS
from .bos import bos
from .util import (
Daemon,
@@ -23,6 +25,7 @@ from .util import (
ipnorm,
pybin,
relchk,
runhook,
sanitize_fn,
vjoin,
)
@@ -44,6 +47,12 @@ if True: # pylint: disable=using-constant-test
from typing import Any, Optional
class FSE(FilesystemError):
def __init__(self, msg: str, severity: int = 0) -> None:
super(FilesystemError, self).__init__(msg)
self.severity = severity
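The FSE subclass above gives FTP filesystem errors a severity, so routine failures (missing files, write-only folders, mojibake names) can stay quiet while serious ones (malicious path characters, permission violations) always propagate; the isfile/isdir/listdir changes further down all follow the same pattern. A self-contained sketch of that calling pattern:

from typing import Callable

def safe_probe(probe: Callable[[], bool]) -> bool:
    try:
        return probe()
    except Exception as ex:
        if getattr(ex, "severity", 0):
            raise  # severity>0 means do not swallow this (e.g. malicious vpath)
        return False  # mundane failures are expected and stay quiet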
class FtpAuth(DummyAuthorizer):
def __init__(self, hub: "SvcHub") -> None:
super(FtpAuth, self).__init__()
@@ -53,6 +62,7 @@ class FtpAuth(DummyAuthorizer):
self, username: str, password: str, handler: Any
) -> None:
handler.username = "{}:{}".format(username, password)
handler.uname = "*"
ip = handler.addr[0]
if ip.startswith("::ffff:"):
@@ -84,14 +94,14 @@ class FtpAuth(DummyAuthorizer):
raise AuthenticationFailed("Authentication failed.")
handler.username = uname
handler.uname = uname
def get_home_dir(self, username: str) -> str:
return "/"
def has_user(self, username: str) -> bool:
asrv = self.hub.asrv
return username in asrv.acct
return username in asrv.acct or username in asrv.iacct
def has_perm(self, username: str, perm: int, path: Optional[str] = None) -> bool:
return True # handled at filesystem layer
@@ -110,11 +120,11 @@ class FtpFs(AbstractedFS):
def __init__(
self, root: str, cmd_channel: Any
) -> None: # pylint: disable=super-init-not-called
self.h = self.cmd_channel = cmd_channel # type: FTPHandler
self.h = cmd_channel # type: FTPHandler
self.cmd_channel = cmd_channel # type: FTPHandler
self.hub: "SvcHub" = cmd_channel.hub
self.args = cmd_channel.args
self.uname = self.hub.asrv.iacct.get(cmd_channel.password, "*")
self.uname = cmd_channel.uname
self.cwd = "/" # pyftpdlib convention of leading slash
self.root = "/var/lib/empty"
@@ -132,23 +142,36 @@ class FtpFs(AbstractedFS):
w: bool = False,
m: bool = False,
d: bool = False,
) -> str:
) -> tuple[str, VFS, str]:
try:
vpath = vpath.replace("\\", "/").lstrip("/")
vpath = vpath.replace("\\", "/").strip("/")
rd, fn = os.path.split(vpath)
if ANYWIN and relchk(rd):
logging.warning("malicious vpath: %s", vpath)
raise FilesystemError("unsupported characters in filepath")
t = "Unsupported characters in [{}]"
raise FSE(t.format(vpath), 1)
fn = sanitize_fn(fn or "", "", [".prologue.html", ".epilogue.html"])
vpath = vjoin(rd, fn)
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, r, w, m, d)
if not vfs.realpath:
raise FilesystemError("no filesystem mounted at this path")
t = "No filesystem mounted at [{}]"
raise FSE(t.format(vpath))
return os.path.join(vfs.realpath, rem)
if "xdev" in vfs.flags or "xvol" in vfs.flags:
ap = vfs.canonical(rem)
avfs = vfs.chk_ap(ap)
t = "Permission denied in [{}]"
if not avfs:
raise FSE(t.format(vpath), 1)
cr, cw, cm, cd, _, _ = avfs.can_access("", self.h.uname)
if r and not cr or w and not cw or m and not cm or d and not cd:
raise FSE(t.format(vpath), 1)
return os.path.join(vfs.realpath, rem), vfs, rem
except Pebkac as ex:
raise FilesystemError(str(ex))
raise FSE(str(ex))
def rv2a(
self,
@@ -157,7 +180,7 @@ class FtpFs(AbstractedFS):
w: bool = False,
m: bool = False,
d: bool = False,
) -> str:
) -> tuple[str, VFS, str]:
return self.v2a(os.path.join(self.cwd, vpath), r, w, m, d)
def ftp2fs(self, ftppath: str) -> str:
@@ -171,7 +194,7 @@ class FtpFs(AbstractedFS):
def validpath(self, path: str) -> bool:
if "/.hist/" in path:
if "/up2k." in path or path.endswith("/dir.txt"):
raise FilesystemError("access to this file is forbidden")
raise FSE("Access to this file is forbidden", 1)
return True
@@ -179,7 +202,7 @@ class FtpFs(AbstractedFS):
r = "r" in mode
w = "w" in mode or "a" in mode or "+" in mode
ap = self.rv2a(filename, r, w)
ap = self.rv2a(filename, r, w)[0]
if w:
try:
st = bos.stat(ap)
@@ -188,7 +211,7 @@ class FtpFs(AbstractedFS):
td = 0
if td < -1 or td > self.args.ftp_wt:
raise FilesystemError("cannot open existing file for writing")
raise FSE("Cannot open existing file for writing")
self.validpath(ap)
return open(fsenc(ap), mode)
@@ -197,9 +220,17 @@ class FtpFs(AbstractedFS):
nwd = join(self.cwd, path)
vfs, rem = self.hub.asrv.vfs.get(nwd, self.uname, False, False)
ap = vfs.canonical(rem)
if not bos.path.isdir(ap):
try:
st = bos.stat(ap)
if not stat.S_ISDIR(st.st_mode):
raise Exception()
except:
# returning 550 is library-default and suitable
raise FilesystemError("Failed to change directory")
raise FSE("No such file or directory")
avfs = vfs.chk_ap(ap, st)
if not avfs:
raise FSE("Permission denied", 1)
self.cwd = nwd
(
@@ -209,16 +240,18 @@ class FtpFs(AbstractedFS):
self.can_delete,
self.can_get,
self.can_upget,
) = self.hub.asrv.vfs.can_access(self.cwd.lstrip("/"), self.h.username)
) = avfs.can_access("", self.h.uname)
def mkdir(self, path: str) -> None:
ap = self.rv2a(path, w=True)
bos.mkdir(ap)
ap = self.rv2a(path, w=True)[0]
bos.makedirs(ap) # filezilla expects this
def listdir(self, path: str) -> list[str]:
vpath = join(self.cwd, path).lstrip("/")
vpath = join(self.cwd, path)
try:
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, True, False)
ap, vfs, rem = self.v2a(vpath, True, False)
if not bos.path.isdir(ap):
raise FSE("No such file or directory", 1)
fsroot, vfs_ls1, vfs_virt = vfs.ls(
rem,
@@ -234,8 +267,12 @@ class FtpFs(AbstractedFS):
vfs_ls.sort()
return vfs_ls
except:
if vpath:
except Exception as ex:
# panic on malicious names
if getattr(ex, "severity", 0):
raise
if vpath.strip("/"):
# display write-only folders as empty
return []
@@ -244,43 +281,49 @@ class FtpFs(AbstractedFS):
return list(sorted(list(r.keys())))
def rmdir(self, path: str) -> None:
ap = self.rv2a(path, d=True)
bos.rmdir(ap)
ap = self.rv2a(path, d=True)[0]
try:
bos.rmdir(ap)
except OSError as e:
if e.errno != errno.ENOENT:
raise
def remove(self, path: str) -> None:
if self.args.no_del:
raise FilesystemError("the delete feature is disabled in server config")
raise FSE("The delete feature is disabled in server config")
vp = join(self.cwd, path).lstrip("/")
try:
self.hub.up2k.handle_rm(self.uname, self.h.remote_ip, [vp], [])
self.hub.up2k.handle_rm(self.uname, self.h.cli_ip, [vp], [], False)
except Exception as ex:
raise FilesystemError(str(ex))
raise FSE(str(ex))
def rename(self, src: str, dst: str) -> None:
if not self.can_move:
raise FilesystemError("not allowed for user " + self.h.username)
raise FSE("Not allowed for user " + self.h.uname)
if self.args.no_mv:
t = "the rename/move feature is disabled in server config"
raise FilesystemError(t)
raise FSE("The rename/move feature is disabled in server config")
svp = join(self.cwd, src).lstrip("/")
dvp = join(self.cwd, dst).lstrip("/")
try:
self.hub.up2k.handle_mv(self.uname, svp, dvp)
except Exception as ex:
raise FilesystemError(str(ex))
raise FSE(str(ex))
def chmod(self, path: str, mode: str) -> None:
pass
def stat(self, path: str) -> os.stat_result:
try:
ap = self.rv2a(path, r=True)
ap = self.rv2a(path, r=True)[0]
return bos.stat(ap)
except:
ap = self.rv2a(path)
except FSE as ex:
if ex.severity:
raise
ap = self.rv2a(path)[0]
st = bos.stat(ap)
if not stat.S_ISDIR(st.st_mode):
raise
@@ -288,44 +331,50 @@ class FtpFs(AbstractedFS):
return st
def utime(self, path: str, timeval: float) -> None:
ap = self.rv2a(path, w=True)
ap = self.rv2a(path, w=True)[0]
return bos.utime(ap, (timeval, timeval))
def lstat(self, path: str) -> os.stat_result:
ap = self.rv2a(path)
ap = self.rv2a(path)[0]
return bos.stat(ap)
def isfile(self, path: str) -> bool:
try:
st = self.stat(path)
return stat.S_ISREG(st.st_mode)
except:
except Exception as ex:
if getattr(ex, "severity", 0):
raise
return False # expected for mojibake in ftp_SIZE()
def islink(self, path: str) -> bool:
ap = self.rv2a(path)
ap = self.rv2a(path)[0]
return bos.path.islink(ap)
def isdir(self, path: str) -> bool:
try:
st = self.stat(path)
return stat.S_ISDIR(st.st_mode)
except:
except Exception as ex:
if getattr(ex, "severity", 0):
raise
return True
def getsize(self, path: str) -> int:
ap = self.rv2a(path)
ap = self.rv2a(path)[0]
return bos.path.getsize(ap)
def getmtime(self, path: str) -> float:
ap = self.rv2a(path)
ap = self.rv2a(path)[0]
return bos.path.getmtime(ap)
def realpath(self, path: str) -> str:
return path
def lexists(self, path: str) -> bool:
ap = self.rv2a(path)
ap = self.rv2a(path)[0]
return bos.path.lexists(ap)
def get_user_by_uid(self, uid: int) -> str:
@@ -339,16 +388,21 @@ class FtpHandler(FTPHandler):
abstracted_fs = FtpFs
hub: "SvcHub"
args: argparse.Namespace
uname: str
def __init__(self, conn: Any, server: Any, ioloop: Any = None) -> None:
self.hub: "SvcHub" = FtpHandler.hub
self.args: argparse.Namespace = FtpHandler.args
self.uname = "*"
if PY2:
FTPHandler.__init__(self, conn, server, ioloop)
else:
super(FtpHandler, self).__init__(conn, server, ioloop)
cip = self.remote_ip
self.cli_ip = cip[7:] if cip.startswith("::ffff:") else cip
# abspath->vpath mapping to resolve log_transfer paths
self.vfs_map: dict[str, str] = {}
@@ -358,8 +412,24 @@ class FtpHandler(FTPHandler):
def ftp_STOR(self, file: str, mode: str = "w") -> Any:
# Optional[str]
vp = join(self.fs.cwd, file).lstrip("/")
ap = self.fs.v2a(vp)
ap, vfs, rem = self.fs.v2a(vp, w=True)
self.vfs_map[ap] = vp
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
None,
xbu,
ap,
vfs.canonical(rem),
"",
self.uname,
0,
0,
self.cli_ip,
0,
"",
):
raise FSE("Upload blocked by xbu server config")
# print("ftp_STOR: {} {} => {}".format(vp, mode, ap))
ret = FTPHandler.ftp_STOR(self, file, mode)
# print("ftp_STOR: {} {} OK".format(vp, mode))
@@ -380,15 +450,17 @@ class FtpHandler(FTPHandler):
# print("xfer_end: {} => {}".format(ap, vp))
if vp:
vp, fn = os.path.split(vp)
vfs, rem = self.hub.asrv.vfs.get(vp, self.username, False, True)
vfs, rem = self.hub.asrv.vfs.get(vp, self.uname, False, True)
vfs, rem = vfs.get_dbv(rem)
self.hub.up2k.hash_file(
vfs.realpath,
vfs.vpath,
vfs.flags,
rem,
fn,
self.remote_ip,
self.cli_ip,
time.time(),
self.uname,
)
return FTPHandler.log_transfer(
@@ -422,7 +494,7 @@ class Ftpd(object):
print(t.format(pybin))
sys.exit(1)
h1.certfile = os.path.join(self.args.E.cfg, "cert.pem")
h1.certfile = self.args.cert
h1.tls_control_required = True
h1.tls_data_required = True
@@ -456,6 +528,9 @@ class Ftpd(object):
if "::" in ips:
ips.append("0.0.0.0")
if self.args.ftp4:
ips = [x for x in ips if ":" not in x]
ioloop = IOLoop()
for ip in ips:
for h, lp in hs:

View File

@@ -10,6 +10,7 @@ import gzip
import itertools
import json
import os
import random
import re
import stat
import string
@@ -134,6 +135,7 @@ class HttpCli(object):
self.ouparam: dict[str, str] = {}
self.uparam: dict[str, str] = {}
self.cookies: dict[str, str] = {}
self.avn: Optional[VFS] = None
self.vpath = " "
self.uname = " "
self.pw = " "
@@ -218,7 +220,7 @@ class HttpCli(object):
try:
self.s.settimeout(2)
headerlines = read_header(self.sr)
headerlines = read_header(self.sr, self.args.s_thead, self.args.s_thead)
self.in_hdr_recv = False
if not headerlines:
return False
@@ -263,9 +265,10 @@ class HttpCli(object):
self.is_https = (
self.headers.get("x-forwarded-proto", "").lower() == "https" or self.tls
)
self.host = self.headers.get("host") or "{}:{}".format(
*list(self.s.getsockname()[:2])
)
self.host = self.headers.get("host") or ""
if not self.host:
zs = "%s:%s" % self.s.getsockname()[:2]
self.host = zs[7:] if zs.startswith("::ffff:") else zs
n = self.args.rproxy
if n:
@@ -343,6 +346,8 @@ class HttpCli(object):
if self.args.rsp_slp:
time.sleep(self.args.rsp_slp)
if self.args.rsp_jtr:
time.sleep(random.random() * self.args.rsp_jtr)
zso = self.headers.get("cookie")
if zso:
@@ -399,10 +404,21 @@ class HttpCli(object):
self.get_pwd_cookie(self.pw)
if self.is_rclone:
# dots: always include dotfiles if permitted
# lt: showing the correct timestamps of any dupes it just uploaded is probably more important than the lastmod time of any non-copyparty-managed symlinks
# b: basic-browser if it tries to parse the html listing
uparam["dots"] = ""
uparam["lt"] = ""
uparam["b"] = ""
cookies["b"] = ""
vn, rem = self.asrv.vfs.get(self.vpath, self.uname, False, False)
if "xdev" in vn.flags or "xvol" in vn.flags:
ap = vn.canonical(rem)
avn = vn.chk_ap(ap)
else:
avn = vn
(
self.can_read,
self.can_write,
@@ -410,7 +426,12 @@ class HttpCli(object):
self.can_delete,
self.can_get,
self.can_upget,
) = self.asrv.vfs.can_access(self.vpath, self.uname)
) = (
avn.can_access("", self.uname) if avn else [False] * 6
)
self.avn = avn
self.s.settimeout(self.args.s_tbody or None)
try:
cors_k = self._cors()
@@ -526,7 +547,7 @@ class HttpCli(object):
mime: Optional[str] = None,
headers: Optional[dict[str, str]] = None,
) -> None:
response = ["{} {} {}".format(self.http_ver, status, HTTPCODE[status])]
response = ["%s %s %s" % (self.http_ver, status, HTTPCODE[status])]
if length is not None:
response.append("Content-Length: " + unicode(length))
@@ -550,11 +571,10 @@ class HttpCli(object):
self.out_headers["Content-Type"] = mime
for k, zs in list(self.out_headers.items()) + self.out_headerlist:
response.append("{}: {}".format(k, zs))
response.append("%s: %s" % (k, zs))
try:
# best practice to separate headers and body into different packets
self.s.settimeout(None)
self.s.sendall("\r\n".join(response).encode("utf-8") + b"\r\n\r\n")
except:
raise Pebkac(400, "client d/c while replying headers")
@@ -617,7 +637,7 @@ class HttpCli(object):
if not kv:
return ""
r = ["{}={}".format(k, quotep(zs)) if zs else k for k, zs in kv.items()]
r = ["%s=%s" % (k, quotep(zs)) if zs else k for k, zs in kv.items()]
return "?" + "&amp;".join(r)
def redirect(
@@ -775,8 +795,8 @@ class HttpCli(object):
if "k304" in self.uparam:
return self.set_k304()
if "am_js" in self.uparam:
return self.set_am_js()
if "setck" in self.uparam:
return self.setck()
if "reset" in self.uparam:
return self.set_cfg_reset()
@@ -784,8 +804,8 @@ class HttpCli(object):
if "hc" in self.uparam:
return self.tx_svcs()
if "h" in self.uparam:
return self.tx_mounts()
if "h" in self.uparam:
return self.tx_mounts()
# conditional redirect to single volumes
if self.vpath == "" and not self.ouparam:
@@ -860,16 +880,31 @@ class HttpCli(object):
props = set(props_lst)
vn, rem = self.asrv.vfs.get(self.vpath, self.uname, True, False, err=401)
tap = vn.canonical(rem)
depth = self.headers.get("depth", "infinity").lower()
if depth == "infinity":
try:
topdir = {"vp": "", "st": bos.stat(tap)}
except OSError as ex:
if ex.errno not in (errno.ENOENT, errno.ENOTDIR):
raise
raise Pebkac(404)
if not stat.S_ISDIR(topdir["st"].st_mode):
fgen = []
elif depth == "infinity":
if not self.args.dav_inf:
self.log("client wants --dav-inf", 3)
zb = b'<?xml version="1.0" encoding="utf-8"?>\n<D:error xmlns:D="DAV:"><D:propfind-finite-depth/></D:error>'
self.reply(zb, 403, "application/xml; charset=utf-8")
return True
# this will return symlink-target timestamps
# because lstat=true would not recurse into subfolders
# and this is a rare case where we actually want that
fgen = vn.zipgen(
rem,
rem,
set(),
self.uname,
@@ -881,7 +916,11 @@ class HttpCli(object):
elif depth == "1":
_, vfs_ls, vfs_virt = vn.ls(
rem, self.uname, not self.args.no_scandir, [[True, False]]
rem,
self.uname,
not self.args.no_scandir,
[[True, False]],
lstat="davrt" not in vn.flags,
)
if not self.args.ed:
names = set(exclude_dotfiles([x[0] for x in vfs_ls]))
@@ -901,13 +940,6 @@ class HttpCli(object):
t2 = " or 'infinity'" if self.args.dav_inf else ""
raise Pebkac(412, t.format(depth, t2))
try:
topdir = {"vp": "", "st": os.stat(vn.canonical(rem))}
except OSError as ex:
if ex.errno != errno.ENOENT:
raise
raise Pebkac(404)
fgen = itertools.chain([topdir], fgen) # type: ignore
vtop = vjoin(self.args.R, vjoin(vn.vpath, rem))
@@ -922,14 +954,23 @@ class HttpCli(object):
for x in fgen:
rp = vjoin(vtop, x["vp"])
st: os.stat_result = x["st"]
mtime = st.st_mtime
if stat.S_ISLNK(st.st_mode):
try:
st = bos.stat(os.path.join(tap, x["vp"]))
except:
continue
isdir = stat.S_ISDIR(st.st_mode)
t = "<D:response><D:href>/{}{}</D:href><D:propstat><D:prop>"
ret += t.format(quotep(rp), "/" if isdir and rp else "")
ret += "<D:response><D:href>/%s%s</D:href><D:propstat><D:prop>" % (
quotep(rp),
"/" if isdir and rp else "",
)
pvs: dict[str, str] = {
"displayname": html_escape(rp.split("/")[-1]),
"getlastmodified": formatdate(st.st_mtime, usegmt=True),
"getlastmodified": formatdate(mtime, usegmt=True),
"resourcetype": '<D:collection xmlns:D="DAV:"/>' if isdir else "",
"supportedlock": '<D:lockentry xmlns:D="DAV:"><D:lockscope><D:exclusive/></D:lockscope><D:locktype><D:write/></D:locktype></D:lockentry>',
}
@@ -941,13 +982,13 @@ class HttpCli(object):
if k not in props:
continue
elif v:
ret += "<D:{0}>{1}</D:{0}>".format(k, v)
ret += "<D:%s>%s</D:%s>" % (k, v, k)
else:
ret += "<D:{}/>".format(k)
ret += "<D:%s/>" % (k,)
ret += "</D:prop><D:status>HTTP/1.1 200 OK</D:status></D:propstat>"
missing = ["<D:{}/>".format(x) for x in props if x not in pvs]
missing = ["<D:%s/>" % (x,) for x in props if x not in pvs]
if missing and clen:
t = "<D:propstat><D:prop>{}</D:prop><D:status>HTTP/1.1 404 Not Found</D:status></D:propstat>"
ret += t.format("".join(missing))
@@ -1109,24 +1150,22 @@ class HttpCli(object):
if self.do_log:
self.log("MKCOL " + self.req)
return self._mkdir(self.vpath)
try:
return self._mkdir(self.vpath, True)
except Pebkac as ex:
if ex.code >= 500:
raise
self.reply(b"", ex.code)
return True
def handle_move(self) -> bool:
dst = self.headers["destination"]
dst = re.sub("^https?://[^/]+", "", dst).lstrip()
dst = unquotep(dst)
if not self._mv(self.vpath, dst):
if not self._mv(self.vpath, dst.lstrip("/")):
return False
# up2k only cares about files and removes all empty folders;
# clients naturally expect empty folders to survive a rename
vn, rem = self.asrv.vfs.get(dst, self.uname, False, False)
dabs = vn.canonical(rem)
try:
bos.makedirs(dabs)
except:
pass
return True
def _applesan(self) -> bool:
@@ -1191,7 +1230,6 @@ class HttpCli(object):
if self.headers.get("expect", "").lower() == "100-continue":
try:
self.s.settimeout(None)
self.s.sendall(b"HTTP/1.1 100 Continue\r\n\r\n")
except:
raise Pebkac(400, "client d/c before 100 continue")
@@ -1203,7 +1241,6 @@ class HttpCli(object):
if self.headers.get("expect", "").lower() == "100-continue":
try:
self.s.settimeout(None)
self.s.sendall(b"HTTP/1.1 100 Continue\r\n\r\n")
except:
raise Pebkac(400, "client d/c before 100 continue")
@@ -1269,9 +1306,10 @@ class HttpCli(object):
self.vpath,
self.host,
self.uname,
self.ip,
time.time(),
len(xm),
self.ip,
time.time(),
plain,
)
@@ -1412,31 +1450,35 @@ class HttpCli(object):
self.vpath,
self.host,
self.uname,
self.ip,
at,
remains,
self.ip,
at,
"",
):
t = "upload denied by xbu"
t = "upload blocked by xbu server config"
self.log(t, 1)
raise Pebkac(403, t)
if is_put and not self.args.no_dav:
if is_put and not (self.args.no_dav or self.args.nw) and bos.path.exists(path):
# allow overwrite if...
# * volflag 'daw' is set
# * volflag 'daw' is set, or client is definitely webdav
# * and account has delete-access
# or...
# * file exists and is empty
# * file exists, is empty, sufficiently new
# * and there is no .PARTIAL
tnam = fn + ".PARTIAL"
if self.args.dotpart:
tnam = "." + tnam
if (vfs.flags.get("daw") and self.can_delete) or (
if (
self.can_delete
and (vfs.flags.get("daw") or "x-oc-mtime" in self.headers)
) or (
not bos.path.exists(os.path.join(fdir, tnam))
and bos.path.exists(path)
and not bos.path.getsize(path)
and bos.path.getmtime(path) >= time.time() - self.args.blank_wt
):
# small toctou, but better than clobbering a hardlink
bos.unlink(path)
@@ -1458,6 +1500,16 @@ class HttpCli(object):
if self.args.nw:
return post_sz, sha_hex, sha_b64, remains, path, ""
at = mt = time.time() - lifetime
cli_mt = self.headers.get("x-oc-mtime")
if cli_mt:
try:
mt = int(cli_mt)
times = (int(time.time()), mt)
bos.utime(path, times, False)
except:
pass
if nameless and "magic" in vfs.flags:
try:
ext = self.conn.hsrv.magician.ext(path)
@@ -1480,7 +1532,6 @@ class HttpCli(object):
fn = fn2
path = path2
at = time.time() - lifetime
if xau and not runhook(
self.log,
xau,
@@ -1488,12 +1539,13 @@ class HttpCli(object):
self.vpath,
self.host,
self.uname,
mt,
post_sz,
self.ip,
at,
post_sz,
"",
):
t = "upload denied by xau"
t = "upload blocked by xau server config"
self.log(t, 1)
os.unlink(path)
raise Pebkac(403, t)
@@ -1502,11 +1554,14 @@ class HttpCli(object):
self.conn.hsrv.broker.say(
"up2k.hash_file",
vfs.realpath,
vfs.vpath,
vfs.flags,
rem,
fn,
self.ip,
at,
self.uname,
True,
)
vsuf = ""
@@ -1544,6 +1599,11 @@ class HttpCli(object):
t = "{}\n{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56], url)
h = {"Location": url} if is_put and url else {}
if "x-oc-mtime" in self.headers:
h["X-OC-MTime"] = "accepted"
t = "" # some webdav clients expect/prefer this
self.reply(t.encode("utf-8"), 201, headers=h)
return True
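The x-oc-mtime handling above follows the ownCloud/Nextcloud WebDAV convention: a client may send the file's original modification time (unix seconds) along with a PUT, the server applies it with utime, and it acknowledges with an "X-OC-MTime: accepted" header on the 201 reply. A hedged client-side sketch using only the standard library; the URL and file name are placeholders:

import urllib.request

req = urllib.request.Request(
    "http://127.0.0.1:3923/music/song.opus",   # placeholder URL
    data=open("song.opus", "rb").read(),
    method="PUT",
    headers={"X-OC-MTime": "1682799278"},      # unix mtime to preserve
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.headers.get("X-OC-MTime"))  # expect 201 and "accepted"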
@@ -1585,7 +1645,7 @@ class HttpCli(object):
spd1 = get_spd(nbytes, self.t0)
spd2 = get_spd(self.conn.nbyte, self.conn.t0)
return "{} {} n{}".format(spd1, spd2, self.conn.nreq)
return "%s %s n%s" % (spd1, spd2, self.conn.nreq)
def handle_post_multipart(self) -> bool:
self.parser = MultipartParser(self.log, self.sr, self.headers)
@@ -1632,7 +1692,7 @@ class HttpCli(object):
items = [unquotep(x) for x in items if items]
self.parser.drop()
return self.tx_zip(k, v, vn, rem, items, self.args.ed)
return self.tx_zip(k, v, "", vn, rem, items, self.args.ed)
def handle_post_json(self) -> bool:
try:
@@ -1704,7 +1764,7 @@ class HttpCli(object):
except:
raise Pebkac(500, min_ex())
x = self.conn.hsrv.broker.ask("up2k.handle_json", body)
x = self.conn.hsrv.broker.ask("up2k.handle_json", body, self.u2fh.aps)
ret = x.get()
if self.is_vproxied:
if "purl" in ret:
@@ -1717,7 +1777,7 @@ class HttpCli(object):
def handle_search(self, body: dict[str, Any]) -> bool:
idx = self.conn.get_u2idx()
if not hasattr(idx, "p_end"):
if not idx or not hasattr(idx, "p_end"):
raise Pebkac(500, "sqlite3 is not available on the server; cannot search")
vols = []
@@ -1747,12 +1807,13 @@ class HttpCli(object):
hits = idx.fsearch(vols, body)
msg: Any = repr(hits)
taglist: list[str] = []
trunc = False
else:
# search by query params
q = body["q"]
n = body.get("n", self.args.srch_hits)
self.log("qj: {} |{}|".format(q, n))
hits, taglist = idx.search(vols, q, n)
hits, taglist, trunc = idx.search(vols, q, n)
msg = len(hits)
idx.p_end = time.time()
@@ -1772,7 +1833,8 @@ class HttpCli(object):
for hit in hits:
hit["rp"] = self.args.RS + hit["rp"]
r = json.dumps({"hits": hits, "tag_order": order}).encode("utf-8")
rj = {"hits": hits, "tag_order": order, "trunc": trunc}
r = json.dumps(rj).encode("utf-8")
self.reply(r, mime="application/json")
return True
@@ -1874,17 +1936,10 @@ class HttpCli(object):
with self.mutex:
self.u2fh.close(path)
# windows cant rename open files
if ANYWIN and path != fin_path and not self.args.nw:
self.conn.hsrv.broker.ask("up2k.finish_upload", ptop, wark).get()
if not ANYWIN and not num_left:
times = (int(time.time()), int(lastmod))
self.log("no more chunks, setting times {}".format(times))
try:
bos.utime(fin_path, times)
except:
self.log("failed to utime ({}, {})".format(fin_path, times))
if not num_left and not self.args.nw:
self.conn.hsrv.broker.ask(
"up2k.finish_upload", ptop, wark, self.u2fh.aps
).get()
cinf = self.headers.get("x-up2k-stat", "")
@@ -1949,7 +2004,7 @@ class HttpCli(object):
sanitized = sanitize_fn(new_dir, "", [])
return self._mkdir(vjoin(self.vpath, sanitized))
def _mkdir(self, vpath: str) -> bool:
def _mkdir(self, vpath: str, dav: bool = False) -> bool:
nullwrite = self.args.nw
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
self._assert_safe_rem(rem)
@@ -1975,7 +2030,12 @@ class HttpCli(object):
raise Pebkac(500, min_ex())
self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1])
self.redirect(vpath, status=201)
if dav:
self.reply(b"", 201)
else:
self.redirect(vpath, status=201)
return True
def handle_new_md(self) -> bool:
@@ -2099,12 +2159,13 @@ class HttpCli(object):
self.vpath,
self.host,
self.uname,
self.ip,
at,
0,
self.ip,
at,
"",
):
t = "upload denied by xbu"
t = "upload blocked by xbu server config"
self.log(t, 1)
raise Pebkac(403, t)
@@ -2158,12 +2219,13 @@ class HttpCli(object):
self.vpath,
self.host,
self.uname,
self.ip,
at,
sz,
self.ip,
at,
"",
):
t = "upload denied by xau"
t = "upload blocked by xau server config"
self.log(t, 1)
os.unlink(abspath)
raise Pebkac(403, t)
@@ -2172,11 +2234,14 @@ class HttpCli(object):
self.conn.hsrv.broker.say(
"up2k.hash_file",
dbv.realpath,
vfs.vpath,
dbv.flags,
vrem,
fname,
self.ip,
at,
self.uname,
True,
)
self.conn.nbyte += sz
@@ -2514,8 +2579,12 @@ class HttpCli(object):
hrange = self.headers.get("range")
# let's not support 206 with compression
if do_send and not is_compressed and hrange and file_sz:
# and multirange / multipart is also not-impl (mostly because calculating contentlength is a pain)
if do_send and not is_compressed and hrange and file_sz and "," not in hrange:
try:
if not hrange.lower().startswith("bytes"):
raise Exception()
a, b = hrange.split("=", 1)[1].split("-")
if a.strip():
@@ -2610,7 +2679,14 @@ class HttpCli(object):
return ret
def tx_zip(
self, fmt: str, uarg: str, vn: VFS, rem: str, items: list[str], dots: bool
self,
fmt: str,
uarg: str,
vpath: str,
vn: VFS,
rem: str,
items: list[str],
dots: bool,
) -> bool:
if self.args.no_zip:
raise Pebkac(400, "not enabled")
@@ -2653,7 +2729,7 @@ class HttpCli(object):
self.send_headers(None, mime=mime, headers={"Content-Disposition": cdis})
fgen = vn.zipgen(
rem, set(items), self.uname, False, dots, not self.args.no_scandir
vpath, rem, set(items), self.uname, dots, False, not self.args.no_scandir
)
# for f in fgen: print(repr({k: f[k] for k in ["vp", "ap"]}))
bgen = packer(self.log, fgen, utf8="utf" in uarg, pre_crc="crc" in uarg)
@@ -2833,7 +2909,7 @@ class HttpCli(object):
}
fmt = self.uparam.get("ls", "")
if not fmt and self.ua.startswith("curl/"):
if not fmt and (self.ua.startswith("curl/") or self.ua.startswith("fetch")):
fmt = "v"
if fmt in ["v", "t", "txt"]:
@@ -2877,6 +2953,7 @@ class HttpCli(object):
url_suf=suf,
k304=self.k304(),
ver=S_VERSION if self.args.ver else "",
ahttps="" if self.is_https else "https://" + self.host + self.req,
)
self.reply(html.encode("utf-8"))
return True
@@ -2887,15 +2964,16 @@ class HttpCli(object):
self.redirect("", "?h#cc")
return True
def set_am_js(self) -> bool:
v = "n" if self.uparam["am_js"] == "n" else "y"
ck = gencookie("js", v, self.args.R, False, 86400 * 299)
def setck(self) -> bool:
k, v = self.uparam["setck"].split("=", 1)
t = None if v == "" else 86400 * 299
ck = gencookie(k, v, self.args.R, False, t)
self.out_headerlist.append(("Set-Cookie", ck))
self.reply(b"promoted\n")
self.reply(b"o7\n")
return True
def set_cfg_reset(self) -> bool:
for k in ("k304", "js", "cppwd", "cppws"):
for k in ("k304", "js", "idxh", "cppwd", "cppws"):
cookie = gencookie(k, "x", self.args.R, False, None)
self.out_headerlist.append(("Set-Cookie", cookie))
@@ -2926,7 +3004,7 @@ class HttpCli(object):
vn, _ = self.asrv.vfs.get(self.vpath, self.uname, True, True)
args = [self.asrv.vfs.all_vols, [vn.vpath], False]
args = [self.asrv.vfs.all_vols, [vn.vpath], False, True]
x = self.conn.hsrv.broker.ask("up2k.rescan", *args)
err = x.get()
@@ -2978,8 +3056,8 @@ class HttpCli(object):
ret = self.gen_tree(top, dst)
if self.is_vproxied:
parents = self.args.R.split("/")
for parent in parents[::-1]:
ret = {"k{}".format(parent): ret, "a": []}
for parent in reversed(parents):
ret = {"k%s" % (parent,): ret, "a": []}
zs = json.dumps(ret)
self.reply(zs.encode("utf-8"), mime="application/json")
@@ -3031,7 +3109,7 @@ class HttpCli(object):
raise Pebkac(403, "the unpost feature is disabled in server config")
idx = self.conn.get_u2idx()
if not hasattr(idx, "p_end"):
if not idx or not hasattr(idx, "p_end"):
raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
filt = self.uparam.get("filter")
@@ -3117,7 +3195,9 @@ class HttpCli(object):
nlim = int(self.uparam.get("lim") or 0)
lim = [nlim, nlim] if nlim else []
x = self.conn.hsrv.broker.ask("up2k.handle_rm", self.uname, self.ip, req, lim)
x = self.conn.hsrv.broker.ask(
"up2k.handle_rm", self.uname, self.ip, req, lim, False
)
self.loud_reply(x.get())
return True
@@ -3134,7 +3214,7 @@ class HttpCli(object):
# x-www-form-urlencoded (url query part) uses
# either + or %20 for 0x20 so handle both
dst = unquotep(dst.replace("+", " "))
return self._mv(self.vpath, dst)
return self._mv(self.vpath, dst.lstrip("/"))
def _mv(self, vsrc: str, vdst: str) -> bool:
if not self.can_move:
@@ -3239,23 +3319,48 @@ class HttpCli(object):
):
raise Pebkac(403)
e2d = "e2d" in vn.flags
e2t = "e2t" in vn.flags
self.html_head = vn.flags.get("html_head", "")
if vn.flags.get("norobots"):
if vn.flags.get("norobots") or "b" in self.uparam:
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
else:
self.out_headers.pop("X-Robots-Tag", None)
is_dir = stat.S_ISDIR(st.st_mode)
icur = None
if is_dir and (e2t or e2d):
idx = self.conn.get_u2idx()
if idx and hasattr(idx, "p_end"):
icur = idx.get_cur(dbv.realpath)
if self.can_read:
th_fmt = self.uparam.get("th")
if th_fmt is not None:
if is_dir:
for fn in self.args.th_covers.split(","):
fp = os.path.join(abspath, fn)
if bos.path.exists(fp):
vrem = "{}/{}".format(vrem.rstrip("/"), fn).strip("/")
is_dir = False
break
vrem = vrem.rstrip("/")
if icur and vrem:
q = "select fn from cv where rd=? and dn=?"
crd, cdn = vrem.rsplit("/", 1) if "/" in vrem else ("", vrem)
# no mojibake support:
try:
cfn = icur.execute(q, (crd, cdn)).fetchone()
if cfn:
fn = cfn[0]
fp = os.path.join(abspath, fn)
if bos.path.exists(fp):
vrem = "{}/{}".format(vrem, fn).strip("/")
is_dir = False
except:
pass
else:
for fn in self.args.th_covers:
fp = os.path.join(abspath, fn)
if bos.path.exists(fp):
vrem = "{}/{}".format(vrem, fn).strip("/")
is_dir = False
break
if is_dir:
return self.tx_ico("a.folder")
@@ -3329,7 +3434,7 @@ class HttpCli(object):
is_ls = "ls" in self.uparam
is_js = self.args.force_js or self.cookies.get("js") == "y"
if not is_ls and self.ua.startswith("curl/"):
if not is_ls and (self.ua.startswith("curl/") or self.ua.startswith("fetch")):
self.uparam["ls"] = "v"
is_ls = True
@@ -3362,8 +3467,8 @@ class HttpCli(object):
"taglist": [],
"srvinf": srv_infot,
"acct": self.uname,
"idx": ("e2d" in vn.flags),
"itag": ("e2t" in vn.flags),
"idx": e2d,
"itag": e2t,
"lifetime": vn.flags.get("lifetime") or 0,
"frand": bool(vn.flags.get("rand")),
"perms": perms,
@@ -3382,8 +3487,8 @@ class HttpCli(object):
"taglist": [],
"def_hcols": [],
"have_emp": self.args.emp,
"have_up2k_idx": ("e2d" in vn.flags),
"have_tags_idx": ("e2t" in vn.flags),
"have_up2k_idx": e2d,
"have_tags_idx": e2t,
"have_acode": (not self.args.no_acode),
"have_mv": (not self.args.no_mv),
"have_del": (not self.args.no_del),
@@ -3395,11 +3500,12 @@ class HttpCli(object):
"url_suf": url_suf,
"logues": logues,
"readme": readme,
"title": html_escape(self.vpath, crlf=True) or "🎉",
"title": html_escape(self.vpath, crlf=True) or "💾🎉",
"srv_info": srv_infot,
"dtheme": self.args.theme,
"themes": self.args.themes,
"turbolvl": self.args.turbo,
"idxh": int(self.args.ih),
"u2sort": self.args.u2sort,
}
@@ -3429,10 +3535,14 @@ class HttpCli(object):
for k in ["zip", "tar"]:
v = self.uparam.get(k)
if v is not None:
return self.tx_zip(k, v, vn, rem, [], self.args.ed)
return self.tx_zip(k, v, self.vpath, vn, rem, [], self.args.ed)
fsroot, vfs_ls, vfs_virt = vn.ls(
rem, self.uname, not self.args.no_scandir, [[True, False], [False, True]]
rem,
self.uname,
not self.args.no_scandir,
[[True, False], [False, True]],
lstat="lt" in self.uparam,
)
stats = {k: v for k, v in vfs_ls}
ls_names = [x[0] for x in vfs_ls]
@@ -3459,11 +3569,6 @@ class HttpCli(object):
if not self.args.ed or "dots" not in self.uparam:
ls_names = exclude_dotfiles(ls_names)
icur = None
if "e2t" in vn.flags:
idx = self.conn.get_u2idx()
icur = idx.get_cur(dbv.realpath)
add_fk = vn.flags.get("fk")
dirs = []
@@ -3481,7 +3586,8 @@ class HttpCli(object):
fspath = fsroot + "/" + fn
try:
inf = stats.get(fn) or bos.stat(fspath)
linf = stats.get(fn) or bos.lstat(fspath)
inf = bos.stat(fspath) if stat.S_ISLNK(linf.st_mode) else linf
except:
self.log("broken symlink: {}".format(repr(fspath)))
continue
@@ -3492,19 +3598,26 @@ class HttpCli(object):
if self.args.no_zip:
margin = "DIR"
else:
margin = '<a href="{}?zip" rel="nofollow">zip</a>'.format(
quotep(href)
)
margin = '<a href="%s?zip" rel="nofollow">zip</a>' % (quotep(href),)
elif fn in hist:
margin = '<a href="{}.hist/{}">#{}</a>'.format(
base, html_escape(hist[fn][2], quot=True, crlf=True), hist[fn][0]
margin = '<a href="%s.hist/%s">#%s</a>' % (
base,
html_escape(hist[fn][2], quot=True, crlf=True),
hist[fn][0],
)
else:
margin = "-"
sz = inf.st_size
zd = datetime.utcfromtimestamp(inf.st_mtime)
dt = zd.strftime("%Y-%m-%d %H:%M:%S")
zd = datetime.utcfromtimestamp(linf.st_mtime)
dt = "%04d-%02d-%02d %02d:%02d:%02d" % (
zd.year,
zd.month,
zd.day,
zd.hour,
zd.minute,
zd.second,
)
try:
ext = "---" if is_dir else fn.rsplit(".", 1)[1]
@@ -3514,7 +3627,7 @@ class HttpCli(object):
ext = "%"
if add_fk:
href = "{}?k={}".format(
href = "%s?k=%s" % (
quotep(href),
self.gen_fk(
self.args.fk_salt, fspath, sz, 0 if ANYWIN else inf.st_ino
@@ -3530,7 +3643,7 @@ class HttpCli(object):
"sz": sz,
"ext": ext,
"dt": dt,
"ts": int(inf.st_mtime),
"ts": int(linf.st_mtime),
}
if is_dir:
dirs.append(item)
@@ -3538,6 +3651,20 @@ class HttpCli(object):
files.append(item)
item["rd"] = rem
if (
self.cookies.get("idxh") == "y"
and "ls" not in self.uparam
and "v" not in self.uparam
):
idx_html = set(["index.htm", "index.html"])
for item in files:
if item["name"] in idx_html:
# do full resolve in case of shadowed file
vp = vjoin(self.vpath.split("?")[0], item["name"])
vn, rem = self.asrv.vfs.get(vp, self.uname, True, False)
ap = vn.canonical(rem)
return self.tx_file(ap) # is no-cache
tagset: set[str] = set()
for fe in files:
fn = fe["name"]

View File

@@ -103,11 +103,12 @@ class HttpConn(object):
def log(self, msg: str, c: Union[int, str] = 0) -> None:
self.log_func(self.log_src, msg, c)
def get_u2idx(self) -> U2idx:
# one u2idx per tcp connection;
def get_u2idx(self) -> Optional[U2idx]:
# grab from a pool of u2idx instances;
# sqlite3 fully parallelizes under python threads
# but avoid running out of FDs by creating too many
if not self.u2idx:
self.u2idx = U2idx(self)
self.u2idx = self.hsrv.get_u2idx(str(self.addr))
return self.u2idx
@@ -215,3 +216,7 @@ class HttpConn(object):
self.cli = HttpCli(self)
if not self.cli.run():
return
if self.u2idx:
self.hsrv.put_u2idx(str(self.addr), self.u2idx)
self.u2idx = None

View File

@@ -11,9 +11,19 @@ import time
import queue
from .__init__ import ANYWIN, CORES, EXE, MACOS, TYPE_CHECKING, EnvParams
try:
MNFE = ModuleNotFoundError
except:
MNFE = ImportError
try:
import jinja2
except ImportError:
except MNFE:
if EXE:
raise
print(
"""\033[1;31m
you do not have jinja2 installed,\033[33m
@@ -28,9 +38,9 @@ except ImportError:
)
sys.exit(1)
from .__init__ import ANYWIN, MACOS, TYPE_CHECKING, EnvParams
from .bos import bos
from .httpconn import HttpConn
from .u2idx import U2idx
from .util import (
E_SCK,
FHC,
@@ -102,6 +112,9 @@ class HttpSrv(object):
self.cb_ts = 0.0
self.cb_v = ""
self.u2idx_free: dict[str, U2idx] = {}
self.u2idx_n = 0
env = jinja2.Environment()
env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"]
@@ -119,7 +132,7 @@ class HttpSrv(object):
self.ssdp = SSDPr(broker)
cert_path = os.path.join(self.E.cfg, "cert.pem")
cert_path = self.args.cert
if bos.path.exists(cert_path):
self.cert_path = cert_path
else:
@@ -436,6 +449,9 @@ class HttpSrv(object):
self.clients.remove(cli)
self.ncli -= 1
if cli.u2idx:
self.put_u2idx(str(addr), cli.u2idx)
def cachebuster(self) -> str:
if time.time() - self.cb_ts < 1:
return self.cb_v
@@ -457,3 +473,31 @@ class HttpSrv(object):
self.cb_v = v.decode("ascii")[-4:]
self.cb_ts = time.time()
return self.cb_v
def get_u2idx(self, ident: str) -> Optional[U2idx]:
utab = self.u2idx_free
for _ in range(100): # 0.05s per attempt, 5 sec total
with self.mutex:
if utab:
if ident in utab:
return utab.pop(ident)
return utab.pop(list(utab.keys())[0])
if self.u2idx_n < CORES:
self.u2idx_n += 1
return U2idx(self)
time.sleep(0.05)
# not using conditional waits, on a hunch that
# average performance will be faster like this
# since most servers won't be fully saturated
return None
def put_u2idx(self, ident: str, u2idx: U2idx) -> None:
with self.mutex:
while ident in self.u2idx_free:
ident += "a"
self.u2idx_free[ident] = u2idx
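The get_u2idx/put_u2idx pair above is a small bounded pool: at most CORES search-index handles ever exist, a connection preferably gets back the handle it used last time, and a request that finds the pool exhausted polls for up to five seconds before giving up (which surfaces as the "sqlite3 is not available on the server" 500 seen earlier). A generic sketch of the same pattern with a placeholder factory:

import threading
import time

class BoundedPool:
    def __init__(self, factory, limit: int):
        self.factory = factory   # creates a new handle while under the limit
        self.limit = limit
        self.free: dict = {}     # ident -> idle handle
        self.n = 0
        self.mutex = threading.Lock()

    def get(self, ident: str, timeout: float = 5.0):
        deadline = time.time() + timeout
        while True:
            with self.mutex:
                if ident in self.free:
                    return self.free.pop(ident)   # same handle as last time
                if self.free:
                    return self.free.popitem()[1]
                if self.n < self.limit:
                    self.n += 1
                    return self.factory()
            if time.time() >= deadline:
                return None   # caller degrades gracefully instead of blocking forever
            time.sleep(0.05)

    def put(self, ident: str, handle) -> None:
        with self.mutex:
            self.free[ident] = handle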

View File

@@ -8,13 +8,12 @@ import shutil
import subprocess as sp
import sys
from .__init__ import PY2, WINDOWS, E, unicode
from .__init__ import EXE, PY2, WINDOWS, E, unicode
from .bos import bos
from .util import (
FFMPEG_URL,
REKOBO_LKEY,
fsenc,
is_exe,
min_ex,
pybin,
retchk,
@@ -270,7 +269,9 @@ class MTag(object):
self.args = args
self.usable = True
self.prefer_mt = not args.no_mtag_ff
self.backend = "ffprobe" if args.no_mutagen else "mutagen"
self.backend = (
"ffprobe" if args.no_mutagen or (HAVE_FFPROBE and EXE) else "mutagen"
)
self.can_ffprobe = HAVE_FFPROBE and not args.no_mtag_ff
mappings = args.mtm
or_ffprobe = " or FFprobe"
@@ -296,7 +297,7 @@ class MTag(object):
self.log(msg, c=3)
if not self.usable:
if is_exe:
if EXE:
t = "copyparty.exe cannot use mutagen; need ffprobe.exe to read media tags: "
self.log(t + FFMPEG_URL)
return
@@ -472,7 +473,10 @@ class MTag(object):
self.log("mutagen: {}\033[0m".format(" ".join(zl)), "90")
if not md.info.length and not md.info.codec:
raise Exception()
except:
except Exception as ex:
if self.args.mtag_v:
self.log("mutagen-err [{}] @ [{}]".format(ex, abspath), "90")
return self.get_ffprobe(abspath) if self.can_ffprobe else {}
sz = bos.path.getsize(abspath)
@@ -535,7 +539,7 @@ class MTag(object):
env = os.environ.copy()
try:
if is_exe:
if EXE:
raise Exception()
pypath = os.path.abspath(os.path.dirname(os.path.dirname(__file__)))
@@ -543,7 +547,7 @@ class MTag(object):
pypath = str(os.pathsep.join(zsl))
env["PYTHONPATH"] = pypath
except:
if not E.ox and not is_exe:
if not E.ox and not EXE:
raise
ret: dict[str, Any] = {}

View File

@@ -9,13 +9,13 @@ import sys
import time
from types import SimpleNamespace
from .__init__ import ANYWIN, TYPE_CHECKING
from .__init__ import ANYWIN, EXE, TYPE_CHECKING
from .authsrv import LEELOO_DALLAS, VFS
from .bos import bos
from .util import Daemon, is_exe, min_ex, pybin
from .util import Daemon, min_ex, pybin, runhook
if True: # pylint: disable=using-constant-test
from typing import Any
from typing import Any, Union
if TYPE_CHECKING:
from .svchub import SvcHub
@@ -42,7 +42,7 @@ class SMB(object):
from impacket import smbserver
from impacket.ntlm import compute_lmhash, compute_nthash
except ImportError:
if is_exe:
if EXE:
print("copyparty.exe cannot do SMB")
sys.exit(1)
@@ -113,6 +113,9 @@ class SMB(object):
self.stop = srv.stop
self.log("smb", "listening @ {}:{}".format(ip, port))
def nlog(self, msg: str, c: Union[int, str] = 0) -> None:
self.log("smb", msg, c)
def start(self) -> None:
Daemon(self.srv.start)
@@ -169,8 +172,15 @@ class SMB(object):
yeet("blocked write (no --smbw): " + vpath)
vfs, ap = self._v2a("open", vpath, *a)
if wr and not vfs.axs.uwrite:
yeet("blocked write (no-write-acc): " + vpath)
if wr:
if not vfs.axs.uwrite:
yeet("blocked write (no-write-acc): " + vpath)
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
self.nlog, xbu, ap, vpath, "", "", 0, 0, "1.7.6.2", 0, ""
):
yeet("blocked by xbu server config: " + vpath)
ret = bos.open(ap, flags, *a, mode=chmod, **ka)
if wr:
@@ -198,11 +208,13 @@ class SMB(object):
vfs, rem = vfs.get_dbv(rem)
self.hub.up2k.hash_file(
vfs.realpath,
vfs.vpath,
vfs.flags,
rem,
fn,
"1.7.6.2",
time.time(),
"",
)
def _rename(self, vp1: str, vp2: str) -> None:
@@ -249,7 +261,7 @@ class SMB(object):
yeet("blocked delete (no-del-acc): " + vpath)
vpath = vpath.replace("\\", "/").lstrip("/")
self.hub.up2k.handle_rm(LEELOO_DALLAS, "1.7.6.2", [vpath], [])
self.hub.up2k.handle_rm(LEELOO_DALLAS, "1.7.6.2", [vpath], [], False)
def _utime(self, vpath: str, times: tuple[float, float]) -> None:
if not self.args.smbw:

View File

@@ -28,7 +28,7 @@ if True: # pylint: disable=using-constant-test
import typing
from typing import Any, Optional, Union
from .__init__ import ANYWIN, MACOS, TYPE_CHECKING, VT100, EnvParams, unicode
from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, VT100, EnvParams, unicode
from .authsrv import AuthSrv
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE
from .tcpsrv import TcpSrv
@@ -43,7 +43,6 @@ from .util import (
HMaccas,
alltrace,
ansi_re,
is_exe,
min_ex,
mp,
pybin,
@@ -129,6 +128,9 @@ class SvcHub(object):
args.no_robots = True
args.force_js = True
if not self._process_config():
raise Exception("bad config")
self.log = self._log_disabled if args.q else self._log_enabled
if args.lo:
self._setup_logfile(printed)
@@ -150,12 +152,9 @@ class SvcHub(object):
self.log("root", t.format(args.j))
if not args.no_fpool and args.j != 1:
t = "WARNING: --use-fpool combined with multithreading is untested and can probably cause undefined behavior"
if ANYWIN:
t = 'windows cannot do multithreading without --no-fpool, so enabling that -- note that upload performance will suffer if you have microsoft defender "real-time protection" enabled, so you probably want to use -j 1 instead'
args.no_fpool = True
self.log("root", t, c=3)
t = "WARNING: ignoring --use-fpool because multithreading (-j{}) is enabled"
self.log("root", t.format(args.j), c=3)
args.no_fpool = True
bri = "zy"[args.theme % 2 :][:1]
ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]
@@ -181,9 +180,6 @@ class SvcHub(object):
self.log("root", "max clients: {}".format(self.args.nc))
if not self._process_config():
raise Exception("bad config")
self.tcpsrv = TcpSrv(self)
self.up2k = Up2k(self)
@@ -212,7 +208,7 @@ class SvcHub(object):
want_ff = True
msg = "need either Pillow, pyvips, or FFmpeg to create thumbnails; for example:\n{0}{1} -m pip install --user Pillow\n{0}{1} -m pip install --user pyvips\n{0}apt install ffmpeg"
msg = msg.format(" " * 37, os.path.basename(pybin))
if is_exe:
if EXE:
msg = "copyparty.exe cannot use Pillow or pyvips; need ffprobe.exe and ffmpeg.exe to create thumbnails"
self.log("thumb", msg, c=3)
@@ -349,6 +345,24 @@ class SvcHub(object):
al.RS = R + "/" if R else ""
al.SRS = "/" + R + "/" if R else "/"
if al.rsp_jtr:
al.rsp_slp = 0.000001
al.th_covers = set(al.th_covers.split(","))
for k in "c".split(" "):
vl = getattr(al, k)
if not vl:
continue
vl = [os.path.expanduser(x) if x.startswith("~") else x for x in vl]
setattr(al, k, vl)
for k in "lo hist ssl_log".split(" "):
vs = getattr(al, k)
if vs and vs.startswith("~"):
setattr(al, k, os.path.expanduser(vs))
return True
def _setlimits(self) -> None:
@@ -403,6 +417,7 @@ class SvcHub(object):
def _setup_logfile(self, printed: str) -> None:
base_fn = fn = sel_fn = self._logname()
do_xz = fn.lower().endswith(".xz")
if fn != self.args.lo:
ctr = 0
# yup this is a race; if started sufficiently concurrently, two
@@ -414,7 +429,7 @@ class SvcHub(object):
fn = sel_fn
try:
if fn.lower().endswith(".xz"):
if do_xz:
import lzma
lh = lzma.open(fn, "wt", encoding="utf-8", errors="replace", preset=0)
@@ -632,8 +647,14 @@ class SvcHub(object):
return
with self.log_mutex:
ts = datetime.utcnow().strftime("%Y-%m%d-%H%M%S.%f")[:-3]
self.logf.write("@{} [{}\033[0m] {}\n".format(ts, src, msg))
zd = datetime.utcnow()
ts = "%04d-%04d-%06d.%03d" % (
zd.year,
zd.month * 100 + zd.day,
(zd.hour * 100 + zd.minute) * 100 + zd.second,
zd.microsecond // 1000,
)
self.logf.write("@%s [%s\033[0m] %s\n" % (ts, src, msg))
now = time.time()
if now >= self.next_day:
@@ -663,23 +684,29 @@ class SvcHub(object):
print("\033[36m{}\033[0m\n".format(dt.strftime("%Y-%m-%d")), end="")
self._set_next_day()
fmt = "\033[36m{} \033[33m{:21} \033[0m{}\n"
fmt = "\033[36m%s \033[33m%-21s \033[0m%s\n"
if not VT100:
fmt = "{} {:21} {}\n"
fmt = "%s %-21s %s\n"
if "\033" in msg:
msg = ansi_re.sub("", msg)
if "\033" in src:
src = ansi_re.sub("", src)
elif c:
if isinstance(c, int):
msg = "\033[3{}m{}\033[0m".format(c, msg)
msg = "\033[3%sm%s\033[0m" % (c, msg)
elif "\033" not in c:
msg = "\033[{}m{}\033[0m".format(c, msg)
msg = "\033[%sm%s\033[0m" % (c, msg)
else:
msg = "{}{}\033[0m".format(c, msg)
msg = "%s%s\033[0m" % (c, msg)
ts = datetime.utcfromtimestamp(now).strftime("%H:%M:%S.%f")[:-3]
msg = fmt.format(ts, src, msg)
zd = datetime.utcfromtimestamp(now)
ts = "%02d:%02d:%02d.%03d" % (
zd.hour,
zd.minute,
zd.second,
zd.microsecond // 1000,
)
msg = fmt % (ts, src, msg)
try:
print(msg, end="")
except UnicodeEncodeError:

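The logging hunks above replace datetime.strftime with manual %-formatting for the timestamp strings, presumably to trim per-log-line overhead; below is a rough micro-benchmark sketch (not part of the diff; the epoch value and iteration count are arbitrary) comparing the two approaches, which produce the same HH:MM:SS.mmm output.

import timeit
from datetime import datetime

TS = 1682800000  # arbitrary fixed epoch for the comparison

def via_strftime():
    return datetime.utcfromtimestamp(TS).strftime("%H:%M:%S.%f")[:-3]

def via_modulo():
    zd = datetime.utcfromtimestamp(TS)
    return "%02d:%02d:%02d.%03d" % (zd.hour, zd.minute, zd.second, zd.microsecond // 1000)

assert via_strftime() == via_modulo()  # both truncate to milliseconds
print("strftime:", timeit.timeit(via_strftime, number=200000))
print("modulo:  ", timeit.timeit(via_modulo, number=200000))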

@@ -322,7 +322,7 @@ class TcpSrv(object):
if k not in netdevs:
removed = "{} = {}".format(k, v)
t = "network change detected:\n added {}\nremoved {}"
t = "network change detected:\n added {}\033[0;33m\nremoved {}"
self.log("tcpsrv", t.format(added, removed), 3)
self.netdevs = netdevs
self._distribute_netdevs()


@@ -16,10 +16,10 @@ from .__init__ import ANYWIN, TYPE_CHECKING
from .bos import bos
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
from .util import (
FFMPEG_URL,
BytesIO,
Cooldown,
Daemon,
FFMPEG_URL,
Pebkac,
afsenc,
fsenc,
@@ -135,7 +135,7 @@ class ThumbSrv(object):
msg = "cannot create audio/video thumbnails because some of the required programs are not available: "
msg += ", ".join(missing)
self.log(msg, c=3)
if ANYWIN:
if ANYWIN and self.args.no_acode:
self.log("download FFmpeg to fix it:\033[0m " + FFMPEG_URL, 3)
if self.args.th_clean:
@@ -274,6 +274,10 @@ class ThumbSrv(object):
tdir, tfn = os.path.split(tpath)
ttpath = os.path.join(tdir, "w", tfn)
try:
bos.unlink(ttpath)
except:
pass
for fun in funs:
try:
@@ -561,13 +565,24 @@ class ThumbSrv(object):
if "ac" not in ret:
raise Exception("not audio")
try:
dur = ret[".dur"][1]
except:
dur = 0
src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus"
want_caf = tpath.endswith(".caf")
tmp_opus = tpath
if want_caf:
tmp_opus = tpath.rsplit(".", 1)[0] + ".opus"
tmp_opus = tpath + ".opus"
try:
bos.unlink(tmp_opus)
except:
pass
if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)):
caf_src = abspath if src_opus else tmp_opus
if not want_caf or not src_opus:
# fmt: off
cmd = [
b"ffmpeg",
@@ -584,7 +599,32 @@ class ThumbSrv(object):
# fmt: on
self._run_ff(cmd)
if want_caf:
# iOS fails to play some "insufficiently complex" files
# (average file shorter than 8 seconds), so of course we
# fix that by mixing in some inaudible pink noise :^)
# 6.3 sec seems like the cutoff so lets do 7, and
# 7 sec of psyqui-musou.opus @ 3:50 is 174 KiB
if want_caf and (dur < 20 or bos.path.getsize(caf_src) < 256 * 1024):
# fmt: off
cmd = [
b"ffmpeg",
b"-nostdin",
b"-v", b"error",
b"-hide_banner",
b"-i", fsenc(abspath),
b"-filter_complex", b"anoisesrc=a=0.001:d=7:c=pink,asplit[l][r]; [l][r]amerge[s]; [0:a:0][s]amix",
b"-map_metadata", b"-1",
b"-ac", b"2",
b"-c:a", b"libopus",
b"-b:a", b"128k",
b"-f", b"caf",
fsenc(tpath)
]
# fmt: on
self._run_ff(cmd)
elif want_caf:
# simple remux should be safe
# fmt: off
cmd = [
b"ffmpeg",
@@ -601,6 +641,12 @@ class ThumbSrv(object):
# fmt: on
self._run_ff(cmd)
if tmp_opus != tpath:
try:
bos.unlink(tmp_opus)
except:
pass
def poke(self, tdir: str) -> None:
if not self.poke_cd.poke(tdir):
return


@@ -34,14 +34,14 @@ if True: # pylint: disable=using-constant-test
from typing import Any, Optional, Union
if TYPE_CHECKING:
from .httpconn import HttpConn
from .httpsrv import HttpSrv
class U2idx(object):
def __init__(self, conn: "HttpConn") -> None:
self.log_func = conn.log_func
self.asrv = conn.asrv
self.args = conn.args
def __init__(self, hsrv: "HttpSrv") -> None:
self.log_func = hsrv.log
self.asrv = hsrv.asrv
self.args = hsrv.args
self.timeout = self.args.srch_time
if not HAVE_SQLITE3:
@@ -51,7 +51,7 @@ class U2idx(object):
self.active_id = ""
self.active_cur: Optional["sqlite3.Cursor"] = None
self.cur: dict[str, "sqlite3.Cursor"] = {}
self.mem_cur = sqlite3.connect(":memory:").cursor()
self.mem_cur = sqlite3.connect(":memory:", check_same_thread=False).cursor()
self.mem_cur.execute(r"create table a (b text)")
self.p_end = 0.0
@@ -101,7 +101,8 @@ class U2idx(object):
uri = ""
try:
uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri())
cur = sqlite3.connect(uri, 2, uri=True).cursor()
db = sqlite3.connect(uri, 2, uri=True, check_same_thread=False)
cur = db.cursor()
cur.execute('pragma table_info("up")').fetchone()
self.log("ro: {}".format(db_path))
except:
@@ -112,7 +113,7 @@ class U2idx(object):
if not cur:
# on windows, this steals the write-lock from up2k.deferred_init --
# seen on win 10.0.17763.2686, py 3.10.4, sqlite 3.37.2
cur = sqlite3.connect(db_path, 2).cursor()
cur = sqlite3.connect(db_path, 2, check_same_thread=False).cursor()
self.log("opened {}".format(db_path))
self.cur[ptop] = cur
@@ -120,10 +121,10 @@ class U2idx(object):
def search(
self, vols: list[tuple[str, str, dict[str, Any]]], uq: str, lim: int
) -> tuple[list[dict[str, Any]], list[str]]:
) -> tuple[list[dict[str, Any]], list[str], bool]:
"""search by query params"""
if not HAVE_SQLITE3:
return [], []
return [], [], False
q = ""
v: Union[str, int] = ""
@@ -275,7 +276,7 @@ class U2idx(object):
have_up: bool,
have_mt: bool,
lim: int,
) -> tuple[list[dict[str, Any]], list[str]]:
) -> tuple[list[dict[str, Any]], list[str], bool]:
done_flag: list[bool] = []
self.active_id = "{:.6f}_{}".format(
time.time(), threading.current_thread().ident
@@ -293,6 +294,7 @@ class U2idx(object):
self.log("qs: {!r} {!r}".format(uq, uv))
ret = []
seen_rps: set[str] = set()
lim = min(lim, int(self.args.srch_hits))
taglist = {}
for (vtop, ptop, flags) in vols:
@@ -315,9 +317,6 @@ class U2idx(object):
c = cur.execute(uq, tuple(vuv))
for hit in c:
w, ts, sz, rd, fn, ip, at = hit[:7]
lim -= 1
if lim < 0:
break
if rd.startswith("//") or fn.startswith("//"):
rd, fn = s3dec(rd, fn)
@@ -326,6 +325,9 @@ class U2idx(object):
if not dots and "/." in ("/" + rp):
continue
if rp in seen_rps:
continue
if not fk:
suf = ""
else:
@@ -342,6 +344,11 @@ class U2idx(object):
)[:fk]
)
lim -= 1
if lim < 0:
break
seen_rps.add(rp)
sret.append({"ts": int(ts), "sz": sz, "rp": rp + suf, "w": w[:16]})
for hit in sret:
@@ -361,17 +368,9 @@ class U2idx(object):
done_flag.append(True)
self.active_id = ""
# undupe hits from multiple metadata keys
if len(ret) > 1:
ret = [ret[0]] + [
y
for x, y in zip(ret[:-1], ret[1:])
if x["rp"].split("?")[0] != y["rp"].split("?")[0]
]
ret.sort(key=itemgetter("rp"))
return ret, list(taglist.keys())
return ret, list(taglist.keys()), lim < 0
def terminator(self, identifier: str, done_flag: list[bool]) -> None:
for _ in range(self.timeout):

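The U2idx constructor above now takes the HttpSrv instead of a single HttpConn, so the sqlite cursors it opens are presumably reached from more than one worker thread; sqlite3 rejects that by default, which is what check_same_thread=False relaxes. A minimal standalone sketch (not from the diff) of the default behavior and the override:

import sqlite3
import threading

con = sqlite3.connect(":memory:")  # default: check_same_thread=True
cur = con.cursor()

def poke(c):
    try:
        c.execute("select 1")
        print("ok")
    except sqlite3.ProgrammingError as ex:
        print("rejected:", ex)  # created in one thread, used from another

t = threading.Thread(target=poke, args=(cur,))
t.start()
t.join()

# with the flag, cross-thread use is allowed; serializing access is then the caller's job
cur2 = sqlite3.connect(":memory:", check_same_thread=False).cursor()
t = threading.Thread(target=poke, args=(cur2,))
t.start()
t.join()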
File diff suppressed because it is too large


@@ -31,7 +31,7 @@ from email.utils import formatdate
from ipaddress import IPv4Address, IPv4Network, IPv6Address, IPv6Network
from queue import Queue
from .__init__ import ANYWIN, MACOS, PY2, TYPE_CHECKING, VT100, WINDOWS
from .__init__ import ANYWIN, EXE, MACOS, PY2, TYPE_CHECKING, VT100, WINDOWS
from .__version__ import S_BUILD_DT, S_VERSION
from .stolen import surrogateescape
@@ -294,8 +294,7 @@ REKOBO_LKEY = {k.lower(): v for k, v in REKOBO_KEY.items()}
pybin = sys.executable or ""
is_exe = bool(getattr(sys, "frozen", False))
if is_exe:
if EXE:
pybin = ""
for p in "python3 python".split():
try:
@@ -538,7 +537,7 @@ class _Unrecv(object):
self.log = log
self.buf: bytes = b""
def recv(self, nbytes: int) -> bytes:
def recv(self, nbytes: int, spins: int = 1) -> bytes:
if self.buf:
ret = self.buf[:nbytes]
self.buf = self.buf[nbytes:]
@@ -549,6 +548,10 @@ class _Unrecv(object):
ret = self.s.recv(nbytes)
break
except socket.timeout:
spins -= 1
if spins <= 0:
ret = b""
break
continue
except:
ret = b""
@@ -591,7 +594,7 @@ class _LUnrecv(object):
self.log = log
self.buf = b""
def recv(self, nbytes: int) -> bytes:
def recv(self, nbytes: int, spins: int) -> bytes:
if self.buf:
ret = self.buf[:nbytes]
self.buf = self.buf[nbytes:]
@@ -610,7 +613,7 @@ class _LUnrecv(object):
def recv_ex(self, nbytes: int, raise_on_trunc: bool = True) -> bytes:
"""read an exact number of bytes"""
try:
ret = self.recv(nbytes)
ret = self.recv(nbytes, 1)
err = False
except:
ret = b""
@@ -618,7 +621,7 @@ class _LUnrecv(object):
while not err and len(ret) < nbytes:
try:
ret += self.recv(nbytes - len(ret))
ret += self.recv(nbytes - len(ret), 1)
except OSError:
err = True
@@ -669,6 +672,7 @@ class FHC(object):
def __init__(self) -> None:
self.cache: dict[str, FHC.CE] = {}
self.aps: set[str] = set()
def close(self, path: str) -> None:
try:
@@ -680,6 +684,7 @@ class FHC(object):
fh.close()
del self.cache[path]
self.aps.remove(path)
def clean(self) -> None:
if not self.cache:
@@ -700,6 +705,7 @@ class FHC(object):
return self.cache[path].fhs.pop()
def put(self, path: str, fh: typing.BinaryIO) -> None:
self.aps.add(path)
try:
ce = self.cache[path]
ce.fhs.append(fh)
@@ -1290,7 +1296,7 @@ class MultipartParser(object):
rfc1341/rfc1521/rfc2047/rfc2231/rfc2388/rfc6266/the-real-world
(only the fallback non-js uploader relies on these filenames)
"""
for ln in read_header(self.sr):
for ln in read_header(self.sr, 2, 2592000):
self.log(ln)
m = self.re_ctype.match(ln)
@@ -1490,15 +1496,15 @@ def get_boundary(headers: dict[str, str]) -> str:
return m.group(2)
def read_header(sr: Unrecv) -> list[str]:
def read_header(sr: Unrecv, t_idle: int, t_tot: int) -> list[str]:
t0 = time.time()
ret = b""
while True:
if time.time() - t0 > 120:
if time.time() - t0 >= t_tot:
return []
try:
ret += sr.recv(1024)
ret += sr.recv(1024, t_idle // 2)
except:
if not ret:
return []
@@ -1547,7 +1553,7 @@ def rand_name(fdir: str, fn: str, rnd: int) -> str:
def gen_filekey(salt: str, fspath: str, fsize: int, inode: int) -> str:
return base64.urlsafe_b64encode(
hashlib.sha512(
"{} {} {} {}".format(salt, fspath, fsize, inode).encode("utf-8", "replace")
("%s %s %s %s" % (salt, fspath, fsize, inode)).encode("utf-8", "replace")
).digest()
).decode("ascii")
@@ -1656,7 +1662,7 @@ def uncyg(path: str) -> str:
if len(path) > 2 and path[2] != "/":
return path
return "{}:\\{}".format(path[1], path[3:])
return "%s:\\%s" % (path[1], path[3:])
def undot(path: str) -> str:
@@ -1699,7 +1705,7 @@ def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
bad = ["con", "prn", "aux", "nul"]
for n in range(1, 10):
bad += "com{0} lpt{0}".format(n).split(" ")
bad += ("com%s lpt%s" % (n, n)).split(" ")
if fn.lower().split(".")[0] in bad:
fn = "_" + fn
@@ -1850,13 +1856,21 @@ def _msaenc(txt: str) -> bytes:
return txt.replace("/", "\\").encode(FS_ENCODING, "surrogateescape")
def _uncify(txt: str) -> str:
txt = txt.replace("/", "\\")
if ":" not in txt and not txt.startswith("\\\\"):
txt = absreal(txt)
return txt if txt.startswith("\\\\") else "\\\\?\\" + txt
def _msenc(txt: str) -> bytes:
txt = txt.replace("/", "\\")
if ":" not in txt and not txt.startswith("\\\\"):
txt = absreal(txt)
ret = txt.encode(FS_ENCODING, "surrogateescape")
return ret if ret.startswith(b"\\\\?\\") else b"\\\\?\\" + ret
return ret if ret.startswith(b"\\\\") else b"\\\\?\\" + ret
w8dec = _w8dec3 if not PY2 else _w8dec2
@@ -1878,9 +1892,11 @@ if not PY2 and WINDOWS:
afsenc = _msaenc
fsenc = _msenc
fsdec = _msdec
uncify = _uncify
elif not PY2 or not WINDOWS:
fsenc = afsenc = sfsenc = w8enc
fsdec = w8dec
uncify = str
else:
# moonrunes become \x3f with bytestrings,
# losing mojibake support is worth
@@ -1892,6 +1908,7 @@ else:
fsenc = afsenc = sfsenc = _not_actually_mbcs_enc
fsdec = _not_actually_mbcs_dec
uncify = str
def s3enc(mem_cur: "sqlite3.Cursor", rd: str, fn: str) -> tuple[str, str]:
@@ -2253,7 +2270,7 @@ def rmdirs(
dirs = [os.path.join(top, x) for x in dirs]
ok = []
ng = []
for d in dirs[::-1]:
for d in reversed(dirs):
a, b = rmdirs(logger, scandir, lstat, d, depth + 1)
ok += a
ng += b
@@ -2268,18 +2285,21 @@ def rmdirs(
return ok, ng
def rmdirs_up(top: str) -> tuple[list[str], list[str]]:
def rmdirs_up(top: str, stop: str) -> tuple[list[str], list[str]]:
"""rmdir on self, then all parents"""
if top == stop:
return [], [top]
try:
os.rmdir(fsenc(top))
except:
return [], [top]
par = os.path.dirname(top)
if not par:
if not par or par == stop:
return [top], []
ok, ng = rmdirs_up(par)
ok, ng = rmdirs_up(par, stop)
return [top] + ok, ng
@@ -2513,23 +2533,14 @@ def retchk(
raise Exception(t)
def _runhook(
log: "NamedLogger",
cmd: str,
ap: str,
vp: str,
host: str,
uname: str,
ip: str,
at: float,
sz: int,
txt: str,
) -> bool:
def _parsehook(
log: Optional["NamedLogger"], cmd: str
) -> tuple[bool, bool, bool, float, dict[str, Any], str]:
chk = False
fork = False
jtxt = False
wait = 0
tout = 0
wait = 0.0
tout = 0.0
kill = "t"
cap = 0
ocmd = cmd
@@ -2549,13 +2560,15 @@ def _runhook(
cap = int(arg[1:]) # 0=none 1=stdout 2=stderr 3=both
elif arg.startswith("k"):
kill = arg[1:] # [t]ree [m]ain [n]one
elif arg.startswith("i"):
pass
else:
t = "hook: invalid flag {} in {}"
log(t.format(arg, ocmd))
(log or print)(t.format(arg, ocmd))
env = os.environ.copy()
try:
if is_exe:
if EXE:
raise Exception()
pypath = os.path.abspath(os.path.dirname(os.path.dirname(__file__)))
@@ -2563,25 +2576,95 @@ def _runhook(
pypath = str(os.pathsep.join(zsl))
env["PYTHONPATH"] = pypath
except:
if not is_exe:
if not EXE:
raise
ka = {
sp_ka = {
"env": env,
"timeout": tout,
"kill": kill,
"capture": cap,
}
if cmd.startswith("~"):
cmd = os.path.expanduser(cmd)
return chk, fork, jtxt, wait, sp_ka, cmd
def runihook(
log: Optional["NamedLogger"],
cmd: str,
vol: "VFS",
ups: list[tuple[str, int, int, str, str, str, int]],
) -> bool:
ocmd = cmd
chk, fork, jtxt, wait, sp_ka, cmd = _parsehook(log, cmd)
bcmd = [sfsenc(cmd)]
if cmd.endswith(".py"):
bcmd = [sfsenc(pybin)] + bcmd
vps = [vjoin(*list(s3dec(x[3], x[4]))) for x in ups]
aps = [djoin(vol.realpath, x) for x in vps]
if jtxt:
# 0w 1mt 2sz 3rd 4fn 5ip 6at
ja = [
{
"ap": uncify(ap), # utf8 for json
"vp": vp,
"wark": x[0][:16],
"mt": x[1],
"sz": x[2],
"ip": x[5],
"at": x[6],
}
for x, vp, ap in zip(ups, vps, aps)
]
sp_ka["sin"] = json.dumps(ja).encode("utf-8", "replace")
else:
sp_ka["sin"] = b"\n".join(fsenc(x) for x in aps)
t0 = time.time()
if fork:
Daemon(runcmd, ocmd, [bcmd], ka=sp_ka)
else:
rc, v, err = runcmd(bcmd, **sp_ka) # type: ignore
if chk and rc:
retchk(rc, bcmd, err, log, 5)
return False
wait -= time.time() - t0
if wait > 0:
time.sleep(wait)
return True
def _runhook(
log: Optional["NamedLogger"],
cmd: str,
ap: str,
vp: str,
host: str,
uname: str,
mt: float,
sz: int,
ip: str,
at: float,
txt: str,
) -> bool:
ocmd = cmd
chk, fork, jtxt, wait, sp_ka, cmd = _parsehook(log, cmd)
if jtxt:
ja = {
"ap": ap,
"vp": vp,
"mt": mt,
"sz": sz,
"ip": ip,
"at": at or time.time(),
"host": host,
"user": uname,
"at": at or time.time(),
"sz": sz,
"txt": txt,
}
arg = json.dumps(ja)
@@ -2596,9 +2679,9 @@ def _runhook(
t0 = time.time()
if fork:
Daemon(runcmd, ocmd, [acmd], ka=ka)
Daemon(runcmd, ocmd, [bcmd], ka=sp_ka)
else:
rc, v, err = runcmd(bcmd, **ka) # type: ignore
rc, v, err = runcmd(bcmd, **sp_ka) # type: ignore
if chk and rc:
retchk(rc, bcmd, err, log, 5)
return False
@@ -2611,24 +2694,25 @@ def _runhook(
def runhook(
log: "NamedLogger",
log: Optional["NamedLogger"],
cmds: list[str],
ap: str,
vp: str,
host: str,
uname: str,
mt: float,
sz: int,
ip: str,
at: float,
sz: int,
txt: str,
) -> bool:
vp = vp.replace("\\", "/")
for cmd in cmds:
try:
if not _runhook(log, cmd, ap, vp, host, uname, ip, at, sz, txt):
if not _runhook(log, cmd, ap, vp, host, uname, mt, sz, ip, at, txt):
return False
except Exception as ex:
log("hook: {}".format(ex))
(log or print)("hook: {}".format(ex))
if ",c," in "," + cmd:
return False
break


@@ -27,8 +27,8 @@ window.baguetteBox = (function () {
isOverlayVisible = false,
touch = {}, // start-pos
touchFlag = false, // busy
re_i = /.+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /.+\.(webm|mkv|mp4)(\?|$)/i,
re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
anims = ['slideIn', 'fadeIn', 'none'],
data = {}, // all galleries
imagesElements = [],
@@ -580,6 +580,7 @@ window.baguetteBox = (function () {
function hideOverlay(e) {
ev(e);
playvid(false);
removeFromCache('#files');
if (options.noScrollbars) {
document.documentElement.style.overflowY = 'auto';
document.body.style.overflowY = 'auto';
@@ -812,10 +813,16 @@ window.baguetteBox = (function () {
}
function vid() {
if (currentIndex >= imagesElements.length)
return;
return imagesElements[currentIndex].querySelector('video');
}
function vidimg() {
if (currentIndex >= imagesElements.length)
return;
return imagesElements[currentIndex].querySelector('img, video');
}

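The re_i / re_v change at the top of this file's diff anchors the match to the path portion of the URL, so a media extension hiding in the query string no longer counts. The same patterns behave identically in Python's re, used here purely for illustration (shortened extension list, made-up URL):

import re

old_i = re.compile(r".+\.(a?png|jpe?g|gif|webp)(\?|$)", re.I)
new_i = re.compile(r"^[^?]+\.(a?png|jpe?g|gif|webp)(\?|$)", re.I)

url = "track.opus?cover=front.jpg"          # audio file, image name in the query
print(bool(old_i.match(url)))               # True  -- wrongly detected as an image
print(bool(new_i.match(url)))               # False -- query string is ignored
print(bool(new_i.match("photo.jpeg?k=v")))  # True  -- real images still match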

@@ -93,6 +93,7 @@
--g-fsel-bg: #d39;
--g-fsel-b1: #f4a;
--g-fsel-ts: #804;
--g-dfg: var(--srv-3);
--g-fg: var(--a-hil);
--g-bg: var(--bg-u2);
--g-b1: var(--bg-u4);
@@ -327,6 +328,7 @@ html.c {
}
html.cz {
--bgg: var(--bg-u2);
--srv-3: #fff;
}
html.cy {
--fg: #fff;
@@ -354,6 +356,7 @@ html.cy {
--chk-fg: #fd0;
--srv-1: #f00;
--srv-3: #fff;
--op-aa-bg: #fff;
--u2-b1-bg: #f00;
@@ -793,6 +796,8 @@ html.y #path a:hover {
}
.logue {
padding: .2em 0;
position: relative;
z-index: 1;
}
.logue.hidden,
.logue:empty {
@@ -964,6 +969,9 @@ html.y #path a:hover {
#ggrid>a.dir:before {
content: '📂';
}
#ggrid>a.dir>span {
color: var(--g-dfg);
}
#ggrid>a.au:before {
content: '💾';
}
@@ -1010,6 +1018,9 @@ html.np_open #ggrid>a.au:before {
background: var(--g-sel-bg);
border-color: var(--g-sel-b1);
}
#ggrid>a.sel>span {
color: var(--g-sel-fg);
}
#ggrid>a.sel,
#ggrid>a[tt].sel {
border-top: 1px solid var(--g-fsel-b1);
@@ -1063,6 +1074,9 @@ html.np_open #ggrid>a.au:before {
background: var(--bg-d3);
box-shadow: -.2em .2em 0 var(--f-sel-sh), -.2em -.2em 0 var(--f-sel-sh);
}
#player {
display: none;
}
#widget {
position: fixed;
font-size: 1.4em;
@@ -1145,10 +1159,10 @@ html.y #widget.open {
background: #fff;
background: var(--bg-u3);
}
#wfm, #wzip, #wnp {
#wfs, #wfm, #wzip, #wnp {
display: none;
}
#wzip, #wnp {
#wfs, #wzip, #wnp {
margin-right: .2em;
padding-right: .2em;
border: 1px solid var(--bg-u5);
@@ -1160,6 +1174,7 @@ html.y #widget.open {
padding-left: .2em;
border-left-width: .1em;
}
#wfs.act,
#wfm.act {
display: inline-block;
}
@@ -1183,6 +1198,13 @@ html.y #widget.open {
position: relative;
display: inline-block;
}
#wfs {
font-size: .36em;
text-align: right;
line-height: 1.3em;
padding: 0 .3em 0 0;
border-width: 0 .25em 0 0;
}
#wfm span,
#wnp span {
font-size: .6em;
@@ -1321,6 +1343,10 @@ html.y #ops svg circle {
padding: .3em .6em;
white-space: nowrap;
}
#noie {
color: #b60;
margin: 0 0 0 .5em;
}
.opbox {
padding: .5em;
border-radius: 0 .3em .3em 0;


@@ -155,6 +155,7 @@
sb_lg = "{{ sb_lg }}",
lifetime = {{ lifetime }},
turbolvl = {{ turbolvl }},
idxh = {{ idxh }},
frand = {{ frand|tojson }},
u2sort = "{{ u2sort }}",
have_emp = {{ have_emp|tojson }},


@@ -110,6 +110,7 @@ var Ls = {
"ot_cfg": "configuration options",
"ot_u2i": 'up2k: upload files (if you have write-access) or toggle into the search-mode to see if they exist somewhere on the server$N$Nuploads are resumable, multithreaded, and file timestamps are preserved, but it uses more CPU than [🎈]&nbsp; (the basic uploader)<br /><br />during uploads, this icon becomes a progress indicator!',
"ot_u2w": 'up2k: upload files with resume support (close your browser and drop the same files in later)$N$Nmultithreaded, and file timestamps are preserved, but it uses more CPU than [🎈]&nbsp; (the basic uploader)<br /><br />during uploads, this icon becomes a progress indicator!',
"ot_noie": 'Please use Chrome / Firefox / Edge',
"ab_mkdir": "make directory",
"ab_mkdoc": "new markdown doc",
@@ -192,6 +193,7 @@ var Ls = {
"ct_dots": "show hidden files (if server permits)",
"ct_dir1st": "sort folders before files",
"ct_readme": "show README.md in folder listings",
"ct_idxh": "show index.html instead of folder listing",
"ct_sbars": "show scrollbars",
"cut_turbo": "the yolo button, you probably DO NOT want to enable this:$N$Nuse this if you were uploading a huge amount of files and had to restart for some reason, and want to continue the upload ASAP$N$Nthis replaces the hash-check with a simple <em>&quot;does this have the same filesize on the server?&quot;</em> so if the file contents are different it will NOT be uploaded$N$Nyou should turn this off when the upload is done, and then &quot;upload&quot; the same files again to let the client verify them",
@@ -258,6 +260,11 @@ var Ls = {
"mm_e404": "Could not play audio; error 404: File not found.",
"mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
"mm_e5xx": "Could not play audio; server error ",
"mm_nof": "not finding any more audio files nearby",
"mm_pwrsv": "<p>it looks like playback is being interrupted by your phone's power-saving settings!</p>" + '<p>please go to <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png">the app settings of your browser</a> and then <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png">allow unrestricted battery usage</a> to fix it.</p><p>(probably a good idea to use a separate browser dedicated for just music streaming...)</p>',
"mm_hnf": "that song no longer exists",
"im_hnf": "that image no longer exists",
"f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
"f_bigtxt": "this file is {0} MiB large -- really view as text?",
@@ -293,6 +300,7 @@ var Ls = {
"fd_ok": "delete OK",
"fd_err": "delete failed:\n",
"fd_none": "nothing was deleted; maybe blocked by server config (xbd)?",
"fd_busy": "deleting {0} items...\n\n{1}",
"fd_warn1": "DELETE these {0} items?",
"fd_warn2": "<b>Last chance!</b> No way to undo. Delete?",
@@ -450,7 +458,7 @@ var Ls = {
"ur_aun": "All {0} uploads failed, sorry",
"ur_1sn": "File was NOT found on server",
"ur_asn": "The {0} files were NOT found on server",
"ur_um": "Finished;\n{0} uplads OK,\n{1} uploads failed, sorry",
"ur_um": "Finished;\n{0} uploads OK,\n{1} uploads failed, sorry",
"ur_sm": "Finished;\n{0} files found on server,\n{1} files NOT found on server",
"lang_set": "refresh to make the change take effect?",
@@ -563,6 +571,7 @@ var Ls = {
"ot_cfg": "andre innstillinger",
"ot_u2i": 'up2k: last opp filer (hvis du har skrivetilgang) eller bytt til søkemodus for å sjekke om filene finnes et-eller-annet sted på serveren$N$Nopplastninger kan gjenopptas etter avbrudd, skjer stykkevis for potensielt høyere ytelse, og ivaretar datostempling -- men bruker litt mer prosessorkraft enn [🎈]&nbsp; (den primitive opplasteren "bup")<br /><br />mens opplastninger foregår så vises fremdriften her oppe!',
"ot_u2w": 'up2k: filopplastning med støtte for å gjenoppta avbrutte opplastninger -- steng ned nettleseren og dra de samme filene inn i nettleseren igjen for å plukke opp igjen der du slapp$N$Nopplastninger skjer stykkevis for potensielt høyere ytelse, og ivaretar datostempling -- men bruker litt mer prosessorkraft enn [🎈]&nbsp; (den primitive opplasteren "bup")<br /><br />mens opplastninger foregår så vises fremdriften her oppe!',
"ot_noie": 'Fungerer mye bedre i Chrome / Firefox / Edge',
"ab_mkdir": "lag mappe",
"ab_mkdoc": "nytt dokument",
@@ -645,6 +654,7 @@ var Ls = {
"ct_dots": "vis skjulte filer (gitt at serveren tillater det)",
"ct_dir1st": "sorter slik at mapper kommer foran filer",
"ct_readme": "vis README.md nedenfor filene",
"ct_idxh": "vis index.html istedenfor fil-liste",
"ct_sbars": "vis rullgardiner / skrollefelt",
"cut_turbo": "forenklet befaring ved opplastning; bør sannsynlig <em>ikke</em> skrus på:$N$Nnyttig dersom du var midt i en svær opplastning som måtte restartes av en eller annen grunn, og du vil komme igang igjen så raskt som overhodet mulig.$N$Nnår denne er skrudd på så forenkles befaringen kraftig; istedenfor å utføre en trygg sjekk på om filene finnes på serveren i god stand, så sjekkes kun om <em>filstørrelsen</em> stemmer. Så dersom en korrupt fil skulle befinne seg på serveren allerede, på samme sted med samme størrelse og navn, så blir det <em>ikke oppdaget</em>.$N$Ndet anbefales å kun benytte denne funksjonen for å komme seg raskt igjennom selve opplastningen, for så å skru den av, og til slutt &quot;laste opp&quot; de samme filene én gang til -- slik at integriteten kan verifiseres",
@@ -711,6 +721,11 @@ var Ls = {
"mm_e404": "Avspilling feilet: Fil ikke funnet.",
"mm_e403": "Avspilling feilet: Tilgang nektet.\n\nKanskje du ble logget ut?\nPrøv å trykk F5 for å laste siden på nytt.",
"mm_e5xx": "Avspilling feilet: ",
"mm_nof": "finner ikke flere sanger i nærheten",
"mm_pwrsv": "<p>det ser ut som musikken ble avbrutt av telefonen sine strømsparings-innstillinger!</p>" + '<p>ta en tur innom <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png">app-innstillingene til nettleseren din</a> og så <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png">tillat ubegrenset batteriforbruk</a></p><p>(sikkert smart å ha en egen nettleser kun for musikkspilling...)</p>',
"mm_hnf": "sangen finnes ikke lenger",
"im_hnf": "bildet finnes ikke lenger",
"f_chide": 'dette vil skjule kolonnen «{0}»\n\nfanen for "andre innstillinger" lar deg vise kolonnen igjen',
"f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
@@ -746,6 +761,7 @@ var Ls = {
"fd_ok": "sletting OK",
"fd_err": "sletting feilet:\n",
"fd_none": "ingenting ble slettet; kanskje avvist av serverkonfigurasjon (xbd)?",
"fd_busy": "sletter {0} filer...\n\n{1}",
"fd_warn1": "SLETT disse {0} filene?",
"fd_warn2": "<b>Siste sjanse!</b> Dette kan ikke angres. Slett?",
@@ -930,6 +946,7 @@ ebi('ops').innerHTML = (
'<a href="#" data-perm="write" data-dest="msg" tt="' + L.ot_msg + '">📟</a>' +
'<a href="#" data-dest="player" tt="' + L.ot_mp + '">🎺</a>' +
'<a href="#" data-dest="cfg" tt="' + L.ot_cfg + '">⚙️</a>' +
(IE ? '<span id="noie">' + L.ot_noie + '</span>' : '') +
'<div id="opdesc"></div>'
);
@@ -937,6 +954,7 @@ ebi('ops').innerHTML = (
// media player
ebi('widget').innerHTML = (
'<div id="wtoggle">' +
'<span id="wfs"></span>' +
'<span id="wfm"><a' +
' href="#" id="fren" tt="' + L.wt_ren + '">✎<span>name</span></a><a' +
' href="#" id="fdel" tt="' + L.wt_del + '">⌫<span>del.</span></a><a' +
@@ -959,6 +977,17 @@ ebi('widget').innerHTML = (
' <canvas id="pvol" width="288" height="38"></canvas>' +
' <canvas id="barpos"></canvas>' +
' <canvas id="barbuf"></canvas>' +
'</div>' +
'<div id="np_inf">' +
' <img id="np_img"></span>' +
' <span id="np_url"></span>' +
' <span id="np_circle"></span>' +
' <span id="np_album"></span>' +
' <span id="np_tn"></span>' +
' <span id="np_artist"></span>' +
' <span id="np_title"></span>' +
' <span id="np_pos"></span>' +
' <span id="np_dur"></span>' +
'</div>'
);
@@ -1070,6 +1099,7 @@ ebi('op_cfg').innerHTML = (
' <a id="dotfiles" class="tgl btn" href="#" tt="' + L.ct_dots + '">dotfiles</a>\n' +
' <a id="dir1st" class="tgl btn" href="#" tt="' + L.ct_dir1st + '">📁 first</a>\n' +
' <a id="ireadme" class="tgl btn" href="#" tt="' + L.ct_readme + '">📜 readme</a>\n' +
' <a id="idxh" class="tgl btn" href="#" tt="' + L.ct_idxh + '">htm</a>\n' +
' <a id="sbars" class="tgl btn" href="#" tt="' + L.ct_sbars + '">⟊</a>\n' +
' </div>\n' +
'</div>\n' +
@@ -1265,6 +1295,7 @@ function set_files_html(html) {
var ACtx = window.AudioContext || window.webkitAudioContext,
noih = /[?&]v\b/.exec('' + location),
hash0 = location.hash,
mp;
@@ -1307,6 +1338,7 @@ var mpl = (function () {
var r = {
"pb_mode": (sread('pb_mode') || 'next').split('-')[0],
"os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
'traversals': 0,
};
bcfg_bind(r, 'preload', 'au_preload', true);
bcfg_bind(r, 'fullpre', 'au_fullpre', false);
@@ -1387,10 +1419,17 @@ var mpl = (function () {
};
r.pp = function () {
var adur, apos, playing = mp.au && !mp.au.paused;
clmod(ebi('np_inf'), 'playing', playing);
if (mp.au && isNum(adur = mp.au.duration) && isNum(apos = mp.au.currentTime) && apos >= 0)
ebi('np_pos').textContent = s2ms(apos);
if (!r.os_ctl)
return;
navigator.mediaSession.playbackState = mp.au && !mp.au.paused ? "playing" : "paused";
navigator.mediaSession.playbackState = playing ? "playing" : "paused";
};
function setaufollow() {
@@ -1405,9 +1444,10 @@ var mpl = (function () {
var np = get_np()[0],
fns = np.file.split(' - '),
artist = (np.circle && np.circle != np.artist ? np.circle + ' // ' : '') + (np.artist || (fns.length > 1 ? fns[0] : '')),
tags = {
title: np.title || fns.pop()
};
title = np.title || fns.pop(),
cover = '',
pcover = '',
tags = { title: title };
if (artist)
tags.artist = artist;
@@ -1428,18 +1468,29 @@ var mpl = (function () {
if (cover) {
cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j';
pcover = cover;
var pwd = get_pwd();
if (pwd)
cover += '&pw=' + uricom_enc(pwd);
pcover += '&pw=' + uricom_enc(pwd);
tags.artwork = [{ "src": cover, type: "image/jpeg" }];
tags.artwork = [{ "src": pcover, type: "image/jpeg" }];
}
}
ebi('np_circle').textContent = np.circle || '';
ebi('np_album').textContent = np.album || '';
ebi('np_tn').textContent = np['.tn'] || '';
ebi('np_artist').textContent = np.artist || (fns.length > 1 ? fns[0] : '');
ebi('np_title').textContent = np.title || '';
ebi('np_dur').textContent = np['.dur'] || '';
ebi('np_url').textContent = get_vpath() + np.file.split('?')[0];
if (!MOBILE)
ebi('np_img').setAttribute('src', cover || ''); // dont give last.fm the pwd
navigator.mediaSession.metadata = new MediaMetadata(tags);
navigator.mediaSession.setActionHandler('play', playpause);
navigator.mediaSession.setActionHandler('pause', playpause);
navigator.mediaSession.setActionHandler('play', mplay);
navigator.mediaSession.setActionHandler('pause', mpause);
navigator.mediaSession.setActionHandler('seekbackward', r.os_seek ? function () { seek_au_rel(-10); } : null);
navigator.mediaSession.setActionHandler('seekforward', r.os_seek ? function () { seek_au_rel(10); } : null);
navigator.mediaSession.setActionHandler('previoustrack', prev_song);
@@ -1449,11 +1500,18 @@ var mpl = (function () {
r.announce = announce;
r.stop = function () {
if (!r.os_ctl || !navigator.mediaSession.metadata)
if (!r.os_ctl)
return;
// dead code; left for debug
navigator.mediaSession.metadata = null;
navigator.mediaSession.playbackState = "paused";
var hs = 'play pause seekbackward seekforward previoustrack nexttrack'.split(/ /g);
for (var a = 0; a < hs.length; a++)
navigator.mediaSession.setActionHandler(hs[a], null);
navigator.mediaSession.setPositionState();
};
r.unbuffer = function (url) {
@@ -1481,18 +1539,21 @@ catch (ex) { }
var re_au_native = can_ogg ? /\.(aac|flac|m4a|mp3|ogg|opus|wav)$/i :
have_acode ? /\.(aac|flac|m4a|mp3|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
re_au_all = /\.(aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|itgz|itr|itz|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3gz|s3m|s3r|s3z|tak|tta|ulaw|wav|wma|wv|xm|xmgz|xmr|xmz|xpk)$/i;
re_au_all = /\.(aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk)$/i;
// extract songs + add play column
var mpo = { "au": null, "au2": null, "acs": null };
var t_fchg = 0;
function MPlayer() {
var r = this;
r.id = Date.now();
r.au = null;
r.au = null;
r.au2 = null;
r.au = mpo.au;
r.au2 = mpo.au2;
r.acs = mpo.acs;
r.tracks = {};
r.order = [];
r.cd_pause = 0;
var re_audio = have_acode && mpl.ac_oth ? re_au_all : re_au_native,
trs = QSA('#files tbody tr');
@@ -1549,6 +1610,7 @@ function MPlayer() {
r.ftid = -1;
r.ftimer = null;
r.fade_in = function () {
r.nopause();
r.fvol = 0;
r.fdir = 0.025 * r.vol * (CHROME ? 1.5 : 1);
if (r.au) {
@@ -1581,9 +1643,9 @@ function MPlayer() {
r.au.pause();
mpl.pp();
var t = mp.au.currentTime - 0.8;
var t = r.au.currentTime - 0.8;
if (isNum(t))
mp.au.currentTime = Math.max(t, 0);
r.au.currentTime = Math.max(t, 0);
}
else if (r.fvol > r.vol)
r.fvol = r.vol;
@@ -1629,8 +1691,14 @@ function MPlayer() {
drop();
});
mp.au2.preload = "auto";
mp.au2.src = mp.au2.rsrc = url;
r.nopause();
r.au2.onloadeddata = r.au2.onloadedmetadata = r.nopause;
r.au2.preload = "auto";
r.au2.src = r.au2.rsrc = url;
};
r.nopause = function () {
r.cd_pause = Date.now();
};
}
@@ -1769,6 +1837,7 @@ function glossy_grad(can, h, s, l) {
var pbar = (function () {
var r = {},
bau = null,
html_txt = 'a',
lastmove = 0,
mousepos = 0,
gradh = -1,
@@ -1938,6 +2007,16 @@ var pbar = (function () {
pctx.strokeText(t2, xt2, yt);
pctx.fillText(t1, xt1, yt);
pctx.fillText(t2, xt2, yt);
if (w && html_txt != t2) {
ebi('np_pos').textContent = html_txt = t2;
if (mpl.os_ctl)
navigator.mediaSession.setPositionState({
'duration': adur,
'position': apos,
'playbackRate': 1
});
}
};
window.addEventListener('resize', r.onresize);
@@ -2067,6 +2146,7 @@ function seek_au_sec(seek) {
if (!isNum(seek))
return;
mp.nopause();
mp.au.currentTime = seek;
if (mp.au.paused)
@@ -2086,9 +2166,30 @@ function song_skip(n) {
else
play(mp.order[n == -1 ? mp.order.length - 1 : 0]);
}
function next_song_sig(e) {
t_fchg = document.hasFocus() ? 0 : Date.now();
return next_song_cmn(e);
}
function next_song(e) {
t_fchg = 0;
return next_song_cmn(e);
}
function next_song_cmn(e) {
ev(e);
return song_skip(1);
if (mp.order.length) {
mpl.traversals = 0;
return song_skip(1);
}
if (mpl.traversals++ < 5) {
if (MOBILE && t_fchg && Date.now() - t_fchg > 30 * 1000)
modal.alert(L.mm_pwrsv);
t_fchg = document.hasFocus() ? 0 : Date.now();
treectl.ls_cb = next_song_cmn;
return tree_neigh(1);
}
toast.inf(10, L.mm_nof);
t_fchg = 0;
}
function prev_song(e) {
ev(e);
@@ -2131,6 +2232,25 @@ function playpause(e) {
};
function mplay(e) {
if (mp.au && !mp.au.paused)
return;
playpause(e);
}
function mpause(e) {
if (mp.cd_pause > Date.now() - 100)
return;
if (mp.au && mp.au.paused)
return;
playpause(e);
}
// hook up the widget buttons
(function () {
ebi('bplay').onclick = playpause;
@@ -2185,6 +2305,10 @@ var mpui = (function () {
return;
}
var pos = mp.au.currentTime;
if (!isNum(pos))
pos = 0;
// indicate playback state in ui
widget.paused(mp.au.paused);
@@ -2207,10 +2331,18 @@ var mpui = (function () {
pbar.drawbuf();
}
if (pos > 0.3 && t_fchg) {
// cannot check document.hasFocus to avoid false positives;
// it continues on power-on, doesn't need to be in-browser
if (MOBILE && Date.now() - t_fchg > 30 * 1000)
modal.alert(L.mm_pwrsv);
t_fchg = 0;
}
// preload next song
if (mpl.preload && preloaded != mp.au.rsrc) {
var pos = mp.au.currentTime,
len = mp.au.duration,
var len = mp.au.duration,
rem = pos > 1 ? len - pos : 999,
full = null;
@@ -2276,12 +2408,14 @@ function start_actx() {
}
}
var audio_eq = (function () {
var afilt = (function () {
var r = {
"en": false,
"eqen": false,
"bands": [31.25, 62.5, 125, 250, 500, 1000, 2000, 4000, 8000, 16000],
"gains": [4, 3, 2, 1, 0, 0, 1, 2, 3, 4],
"filters": [],
"filterskip": [],
"plugs": [],
"amp": 0,
"chw": 1,
"last_au": null,
@@ -2370,6 +2504,10 @@ var audio_eq = (function () {
r.filters[a].disconnect();
r.filters = [];
r.filterskip = [];
for (var a = 0; a < r.plugs.length; a++)
r.plugs[a].unload();
if (!mp)
return;
@@ -2377,7 +2515,7 @@ var audio_eq = (function () {
if (mp.acs)
mp.acs.disconnect();
mp.acs = null;
mp.acs = mpo.acs = null;
};
r.apply = function () {
@@ -2387,18 +2525,34 @@ var audio_eq = (function () {
if (!actx)
bcfg_set('au_eq', false);
if (!actx || !mp.au || (!r.en && !mp.acs))
var plug = false;
for (var a = 0; a < r.plugs.length; a++)
if (r.plugs[a].en)
plug = true;
if (!actx || !mp.au || (!r.eqen && !plug && !mp.acs))
return;
r.stop();
mp.au.id = mp.au.id || Date.now();
mp.acs = r.acst[mp.au.id] = r.acst[mp.au.id] || actx.createMediaElementSource(mp.au);
if (!r.en) {
mp.acs.connect(actx.destination);
return;
}
if (r.eqen)
add_eq();
for (var a = 0; a < r.plugs.length; a++)
if (r.plugs[a].en)
r.plugs[a].load();
for (var a = 0; a < r.filters.length; a++)
if (!has(r.filterskip, a))
r.filters[a].connect(a ? r.filters[a - 1] : actx.destination);
mp.acs.connect(r.filters.length ?
r.filters[r.filters.length - 1] : actx.destination);
}
function add_eq() {
var min, max;
min = max = r.gains[0];
for (var a = 1; a < r.gains.length; a++) {
@@ -2429,9 +2583,6 @@ var audio_eq = (function () {
fi.gain.value = r.amp + 0.94; // +.137 dB measured; now -.25 dB and almost bitperfect
r.filters.push(fi);
for (var a = r.filters.length - 1; a >= 0; a--)
r.filters[a].connect(a > 0 ? r.filters[a - 1] : actx.destination);
if (Math.round(r.chw * 25) != 25) {
var split = actx.createChannelSplitter(2),
merge = actx.createChannelMerger(2),
@@ -2456,11 +2607,10 @@ var audio_eq = (function () {
split.connect(lg2, 0);
split.connect(rg1, 1);
split.connect(rg2, 1);
r.filterskip.push(r.filters.length);
r.filters.push(split);
mp.acs.channelCountMode = 'explicit';
}
mp.acs.connect(r.filters[r.filters.length - 1]);
}
function eq_step(e) {
@@ -2555,7 +2705,7 @@ var audio_eq = (function () {
txt[a].onkeydown = eq_keydown;
}
bcfg_bind(r, 'en', 'au_eq', false, r.apply);
bcfg_bind(r, 'eqen', 'au_eq', false, r.apply);
r.draw();
return r;
@@ -2577,7 +2727,7 @@ function play(tid, is_ev, seek) {
if ((tn + '').indexOf('f-') === 0) {
tn = mp.order.indexOf(tn);
if (tn < 0)
return;
return toast.warn(10, L.mm_hnf);
}
if (tn >= mp.order.length) {
@@ -2585,6 +2735,7 @@ function play(tid, is_ev, seek) {
tn = 0;
}
else if (mpl.pb_mode == 'next') {
t_fchg = document.hasFocus() ? 0 : Date.now();
treectl.ls_cb = next_song;
return tree_neigh(1);
}
@@ -2604,7 +2755,9 @@ function play(tid, is_ev, seek) {
if (mp.au) {
mp.au.pause();
clmod(ebi('a' + mp.au.tid), 'act');
var el = ebi('a' + mp.au.tid);
if (el)
clmod(el, 'act');
}
else {
mp.au = new Audio();
@@ -2612,7 +2765,7 @@ function play(tid, is_ev, seek) {
mp.au.onerror = evau_error;
mp.au.onprogress = pbar.drawpos;
mp.au.onplaying = mpui.progress_updater;
mp.au.onended = next_song;
mp.au.onended = next_song_sig;
widget.open();
}
@@ -2629,7 +2782,7 @@ function play(tid, is_ev, seek) {
mp.au.onerror = evau_error;
mp.au.onprogress = pbar.drawpos;
mp.au.onplaying = mpui.progress_updater;
mp.au.onended = next_song;
mp.au.onended = next_song_sig;
t = mp.au.currentTime;
if (isNum(t) && t > 0.1)
mp.au.currentTime = 0;
@@ -2637,7 +2790,7 @@ function play(tid, is_ev, seek) {
else
mp.au.src = mp.au.rsrc = url;
audio_eq.apply();
afilt.apply();
setTimeout(function () {
mpl.unbuffer(url);
@@ -2660,6 +2813,7 @@ function play(tid, is_ev, seek) {
scroll2playing();
try {
mp.nopause();
mp.au.play();
if (mp.au.paused)
autoplay_blocked(seek);
@@ -2688,7 +2842,7 @@ function play(tid, is_ev, seek) {
toast.err(0, esc(L.mm_playerr + basenames(ex)));
}
clmod(ebi(oid), 'act');
setTimeout(next_song, 5000);
setTimeout(next_song_sig, 5000);
}
@@ -2776,7 +2930,7 @@ function autoplay_blocked(seek) {
modal.confirm('<h6>' + L.mm_hashplay + '</h6>\n«' + esc(fn) + '»', function () {
// chrome 91 may permanently taint on a failed play()
// depending on win10 settings or something? idk
mp.au = null;
mp.au = mpo.au = null;
play(tid, true, seek);
mp.fade_in();
@@ -2845,6 +2999,9 @@ function eval_hash() {
clearInterval(t);
baguetteBox.urltime(ts);
var im = QS('#ggrid a[ref="' + id + '"]');
if (!im)
return toast.warn(10, L.im_hnf);
im.click();
im.scrollIntoView();
}, 50);
@@ -3112,7 +3269,9 @@ var fileman = (function () {
if (r.clip === null)
r.clip = jread('fman_clip', []).slice(1);
var nsel = msel.getsel().length;
var sel = msel.getsel(),
nsel = sel.length;
clmod(bren, 'en', nsel);
clmod(bdel, 'en', nsel);
clmod(bcut, 'en', nsel);
@@ -3124,9 +3283,51 @@ var fileman = (function () {
clmod(bpst, 'hide', !(have_mv && has(perms, 'write')));
clmod(ebi('wfm'), 'act', QS('#wfm a.en:not(.hide)'));
var wfs = ebi('wfs'), h = '';
try {
wfs.innerHTML = h = r.fsi(sel);
}
catch (ex) { }
clmod(wfs, 'act', h);
bpst.setAttribute('tt', L.ft_paste.format(r.clip.length));
};
r.fsi = function (sel) {
if (!sel.length)
return '';
var lf = treectl.lsc.files,
nf = 0,
sz = 0,
dur = 0,
ntab = new Set();
for (var a = 0; a < sel.length; a++)
ntab.add(sel[a].vp.split('/').pop());
for (var a = 0; a < lf.length; a++) {
if (!ntab.has(lf[a].href.split('?')[0]))
continue;
var f = lf[a];
nf++;
sz += f.sz;
if (f.tags && f.tags['.dur'])
dur += f.tags['.dur']
}
if (!nf)
return '';
var ret = '{0}<br />{1}<small>F</small>'.format(humansize(sz), nf);
if (dur)
ret += ' ' + s2ms(dur);
return ret;
};
r.rename = function (e) {
ev(e);
if (clgot(bren, 'hide'))
@@ -3423,12 +3624,14 @@ var fileman = (function () {
if (!sel.length)
return toast.err(3, L.fd_emore);
function deleter() {
function deleter(err) {
var xhr = new XHR(),
vp = vps.shift();
if (!vp) {
toast.ok(2, L.fd_ok);
if (err !== 'xbd')
toast.ok(2, L.fd_ok);
treectl.goto(get_evpath());
return;
}
@@ -3444,6 +3647,10 @@ var fileman = (function () {
toast.err(9, L.fd_err + msg);
return;
}
if (this.responseText.indexOf('deleted 0 files (and 0') + 1) {
toast.err(9, L.fd_none);
return deleter('xbd');
}
deleter();
}
@@ -4115,6 +4322,21 @@ var thegrid = (function () {
ev(e);
}
r.imshow = function (url) {
var sel = '#ggrid>a'
if (!thegrid.en) {
thegrid.bagit('#files');
sel = '#files a[id]';
}
var ims = QSA(sel);
for (var a = 0, aa = ims.length; a < aa; a++) {
var iu = ims[a].getAttribute('href').split('?')[0].split('/').slice(-1)[0];
if (iu == url)
return ims[a].click();
}
baguetteBox.hide();
};
r.loadsel = function () {
if (r.dirty)
return;
@@ -4254,19 +4476,19 @@ var thegrid = (function () {
}
r.dirty = false;
r.bagit();
r.bagit('#ggrid');
r.loadsel();
setTimeout(r.tippen, 20);
}
r.bagit = function () {
r.bagit = function (isrc) {
if (!window.baguetteBox)
return;
if (r.bbox)
baguetteBox.destroy();
r.bbox = baguetteBox.run('#ggrid', {
r.bbox = baguetteBox.run(isrc, {
captions: function (g) {
var idx = -1,
h = '' + g;
@@ -4841,7 +5063,7 @@ document.onkeydown = function (e) {
var html = mk_files_header(tagord), seen = {};
html.push('<tbody>');
html.push('<tr class="srch_hdr"><td>-</td><td><a href="#" id="unsearch"><big style="font-weight:bold">[❌] ' + L.sl_close + '</big></a> -- ' + L.sl_hits.format(res.hits.length) + (res.hits.length == cap ? ' -- <a href="#" id="moar">' + L.sl_moar + '</a>' : '') + '</td></tr>');
html.push('<tr class="srch_hdr"><td>-</td><td><a href="#" id="unsearch"><big style="font-weight:bold">[❌] ' + L.sl_close + '</big></a> -- ' + L.sl_hits.format(res.hits.length) + (res.trunc ? ' -- <a href="#" id="moar">' + L.sl_moar + '</a>' : '') + '</td></tr>');
for (var a = 0; a < res.hits.length; a++) {
var r = res.hits[a],
@@ -4955,6 +5177,7 @@ var treectl = (function () {
treesz = clamp(icfg_get('treesz', 16), 10, 50);
bcfg_bind(r, 'ireadme', 'ireadme', true);
bcfg_bind(r, 'idxh', 'idxh', idxh, setidxh);
bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
bcfg_bind(r, 'dots', 'dotfiles', false, function (v) {
r.goto(get_evpath());
@@ -4979,6 +5202,16 @@ var treectl = (function () {
}
setwrap(r.wtree);
function setidxh(v) {
if (!v == !/\bidxh=y\b/.exec('' + document.cookie))
return;
var xhr = new XHR();
xhr.open('GET', SR + '/?setck=idxh=' + (v ? 'y' : 'n'), true);
xhr.send();
}
setidxh(r.idxh);
r.entree = function (e, nostore) {
ev(e);
entreed = true;
@@ -5407,6 +5640,9 @@ var treectl = (function () {
return;
}
if (r.chk_index_html(this.top, res))
return;
for (var a = 0; a < res.files.length; a++)
if (res.files[a].tags === undefined)
res.files[a].tags = {};
@@ -5454,6 +5690,17 @@ var treectl = (function () {
}
}
r.chk_index_html = function (top, res) {
if (!r.idxh || !res || !res.files || noih)
return;
for (var a = 0; a < res.files.length; a++)
if (/^index.html?(\?|$)/i.exec(res.files[a].href)) {
window.location = vjoin(top, res.files[a].href);
return true;
}
};
r.gentab = function (top, res) {
var nodes = res.dirs.concat(res.files),
html = mk_files_header(res.taglist),
@@ -5574,14 +5821,18 @@ var treectl = (function () {
qsr('#bbsw');
if (ls0 === null) {
var xhr = new XHR();
xhr.open('GET', SR + '/?am_js', true);
xhr.open('GET', SR + '/?setck=js=y', true);
xhr.send();
r.ls_cb = showfile.addlinks;
return r.reqls(get_evpath(), false);
}
r.gentab(get_evpath(), ls0);
var top = get_evpath();
if (r.chk_index_html(top, ls0))
return;
r.gentab(top, ls0);
pbar.onresize();
vbar.onresize();
showfile.addlinks();
@@ -5805,7 +6056,7 @@ function apply_perms(res) {
ebi('acc_info').innerHTML = '<span id="srv_info2"><span>' + srvinf +
'</span></span><span' + aclass + axs + L.access + '</span>' + (acct != '*' ?
'<a href="' + SR + '/?pw=x">' + L.logout + acct + '</a>' :
'<a href="' + SR + '/?h">Login</a>');
'<a href="?h">Login</a>');
var o = QSA('#ops>a[data-perm]');
for (var a = 0; a < o.length; a++) {
@@ -6610,22 +6861,27 @@ var globalcss = (function () {
var dcs = document.styleSheets;
for (var a = 0; a < dcs.length; a++) {
var base = dcs[a].href,
var ds, base = '';
try {
base = dcs[a].href;
if (!base)
continue;
ds = dcs[a].cssRules;
if (!base)
continue;
base = base.replace(/[^/]+$/, '');
for (var b = 0; b < ds.length; b++) {
var css = ds[b].cssText.split(/\burl\(/g);
ret += css[0];
for (var c = 1; c < css.length; c++) {
var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : '';
ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
css[c].slice(delim ? 1 : 0);
base = base.replace(/[^/]+$/, '');
for (var b = 0; b < ds.length; b++) {
var css = ds[b].cssText.split(/\burl\(/g);
ret += css[0];
for (var c = 1; c < css.length; c++) {
var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : '';
ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
css[c].slice(delim ? 1 : 0);
}
ret += '\n';
}
ret += '\n';
}
catch (ex) {
console.log('could not read css', a, base);
}
}
return ret;
@@ -6711,6 +6967,7 @@ function show_md(md, name, div, url, depth) {
els[a].setAttribute('href', '#md-' + href.slice(1));
}
md_th_set();
set_tabindex();
var hash = location.hash;
if (hash.startsWith('#md-'))
@@ -6789,6 +7046,7 @@ function sandbox(tgt, rules, cls, html) {
'window.onblur=function(){say("ilost #' + tid + '")};' +
'var el="' + want + '"&&ebi("' + want + '");' +
'if(el)say("iscroll #' + tid + ' "+el.offsetTop);' +
'md_th_set();' +
(cls == 'mdo' && md_plug.post ?
'const x={' + md_plug.post + '};' +
'if(x.render)x.render(ebi("b"));' +
@@ -6821,6 +7079,9 @@ window.addEventListener("message", function (e) {
else if (t[0] == 'igot' || t[0] == 'ilost') {
clmod(QS(t[1] + '>iframe'), 'focus', t[0] == 'igot');
}
else if (t[0] == 'imshow') {
thegrid.imshow(e.data.slice(7));
}
} catch (ex) {
console.log('msg-err: ' + ex);
}
@@ -7095,21 +7356,25 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
function reload_mp() {
if (mp && mp.au) {
if (audio_eq)
audio_eq.stop();
mp.au.pause();
mp.au = null;
mpo.au = mp.au;
mpo.au2 = mp.au2;
mpo.acs = mp.acs;
mpl.unbuffer();
}
mpl.stop();
var plays = QSA('tr>td:first-child>a.play');
for (var a = plays.length - 1; a >= 0; a--)
plays[a].parentNode.innerHTML = '-';
mp = new MPlayer();
if (audio_eq)
audio_eq.acst = {};
if (mp.au && mp.au.tid) {
var el = QS('a#a' + mp.au.tid);
if (el)
clmod(el, 'act', 1);
el = el && el.closest('tr');
if (el)
clmod(el, 'play', 1);
}
setTimeout(pbar.onresize, 1);
}


@@ -231,11 +231,11 @@ function convert_markdown(md_text, dest_dom) {
var nodes = md_dom.getElementsByTagName('a');
for (var a = nodes.length - 1; a >= 0; a--) {
var href = nodes[a].getAttribute('href');
var txt = nodes[a].textContent;
var txt = nodes[a].innerHTML;
if (!txt)
nodes[a].textContent = href;
else if (href !== txt)
else if (href !== txt && !nodes[a].className)
nodes[a].className = 'vis';
}


@@ -36,6 +36,11 @@ a {
td a {
margin: 0;
}
#w {
color: #fff;
background: #940;
border-color: #b70;
}
.af,
.logout {
float: right;
@@ -175,15 +180,19 @@ html.z a.g {
border-color: #af4;
box-shadow: 0 .3em 1em #7d0;
}
html.z input {
color: #fff;
background: #626;
border: 1px solid #c2c;
border-width: 1px 0 0 0;
input {
color: #a50;
background: #fff;
border: 1px solid #a50;
border-radius: .5em;
padding: .5em .7em;
margin: 0 .5em 0 0;
}
html.z input {
color: #fff;
background: #626;
border-color: #c2c;
}
html.z .num {
border-color: #777;
}


@@ -89,13 +89,16 @@
</ul>
<h1 id="l">login for more:</h1>
<ul>
<div>
<form method="post" enctype="multipart/form-data" action="{{ r }}/{{ qvpath }}">
<input type="hidden" name="act" value="login" />
<input type="password" name="cppwd" />
<input type="submit" value="Login" />
{% if ahttps %}
<a id="w" href="{{ ahttps }}">switch to https</a>
{% endif %}
</form>
</ul>
</div>
</div>
<a href="#" id="repl">π</a>
{%- if not this.args.nb %}


@@ -25,7 +25,8 @@ var Ls = {
"t1": "handling",
"u2": "tid siden noen sist skrev til serveren$N( opplastning / navneendring / ... )$N$N17d = 17 dager$N1h23 = 1 time 23 minutter$N4m56 = 4 minuter 56 sekunder",
"v1": "koble til",
"v2": "bruk denne serveren som en lokal harddisk$N$NADVARSEL: kommer til å vise passordet ditt!"
"v2": "bruk denne serveren som en lokal harddisk$N$NADVARSEL: kommer til å vise passordet ditt!",
"w1": "bytt til https",
},
"eng": {
"d2": "shows the state of all active threads",


@@ -43,10 +43,9 @@
<h1>WebDAV</h1>
<div class="os win">
<p><em>note: rclone-FTP is a bit faster, so {% if args.ftp or args.ftps %}try that first{% else %}consider enabling FTP in server settings{% endif %}</em></p>
<p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p>
<pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
</pre>
{% if s %}
@@ -64,9 +63,14 @@
yum install davfs2
{% if accs %}printf '%s\n' <b>{{ pw }}</b> k | {% endif %}mount -t davfs -ouid=1000 http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b>
</pre>
<p>or you can use rclone instead, which is much slower but doesn't require root:</p>
<p>make it automount on boot:</p>
<pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab
</pre>
<p>or you can use rclone instead, which is much slower but doesn't require root (plus it keeps lastmodified on upload):</p>
<pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
</pre>
{% if s %}
@@ -106,10 +110,21 @@
<div class="os win">
<p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p>
{% if args.ftp %}
<p>connect with plaintext FTP:</p>
<pre>
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp or args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls={{ "false" if args.ftp else "true" }}
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftp:{{ rvp }} <b>W:</b>
</pre>
{% endif %}
{% if args.ftps %}
<p>connect with TLS-encrypted FTPS:</p>
<pre>
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>W:</b>
</pre>
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
{% endif %}
<p>if you want to use the native FTP client in windows instead (please dont), press <code>win+R</code> and run this command:</p>
<pre>
explorer {{ "ftp" if args.ftp else "ftps" }}://{% if accs %}<b>{{ pw }}</b>:k@{% endif %}{{ host }}:{{ args.ftp or args.ftps }}/{{ rvp }}
@@ -117,10 +132,21 @@
</div>
<div class="os lin">
{% if args.ftp %}
<p>connect with plaintext FTP:</p>
<pre>
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp or args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls={{ "false" if args.ftp else "true" }}
rclone config create {{ aname }}-ftp ftp host={{ rip }} port={{ args.ftp }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftp:{{ rvp }} <b>mp</b>
</pre>
{% endif %}
{% if args.ftps %}
<p>connect with TLS-encrypted FTPS:</p>
<pre>
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>mp</b>
</pre>
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
{% endif %}
<p>emergency alternative (gnome/gui-only):</p>
<!-- gnome-bug: ignores vp -->
<pre>


@@ -451,6 +451,20 @@ html.y textarea:focus {
padding: .2em .5em;
border: .12em solid #aaa;
}
.mdo .mdth,
.mdo .mdthl,
.mdo .mdthr {
margin: .5em .5em .5em 0;
}
.mdthl {
float: left;
}
.mdthr {
float: right;
}
hr {
clear: both;
}
@media screen {
.mdo {


@@ -114,10 +114,10 @@ function up2k_flagbus() {
do_take(now);
return;
}
if (flag.owner && now - flag.owner[1] > 5000) {
if (flag.owner && now - flag.owner[1] > 12000) {
flag.owner = null;
}
if (flag.wants && now - flag.wants[1] > 5000) {
if (flag.wants && now - flag.wants[1] > 12000) {
flag.wants = null;
}
if (!flag.owner && !flag.wants) {
@@ -772,6 +772,7 @@ function fsearch_explain(n) {
function up2k_init(subtle) {
var r = {
"tact": Date.now(),
"init_deps": init_deps,
"set_fsearch": set_fsearch,
"gotallfiles": [gotallfiles] // hooks
@@ -1452,7 +1453,7 @@ function up2k_init(subtle) {
});
};
var etaref = 0, etaskip = 0, utw_minh = 0, utw_read = 0;
var etaref = 0, etaskip = 0, utw_minh = 0, utw_read = 0, utw_card = 0;
function etafun() {
var nhash = st.busy.head.length + st.busy.hash.length + st.todo.head.length + st.todo.hash.length,
nsend = st.busy.upload.length + st.todo.upload.length,
@@ -1465,6 +1466,12 @@ function up2k_init(subtle) {
//ebi('acc_info').innerHTML = humantime(st.time.busy) + ' ' + f2f(now / 1000, 1);
if (utw_card != pvis.act) {
utw_card = pvis.act;
utw_read = 9001;
ebi('u2tabw').style.minHeight = '0px';
}
if (++utw_read >= 20) {
utw_read = 0;
utw_minh = parseInt(ebi('u2tabw').style.minHeight || '0');
@@ -1641,8 +1648,14 @@ function up2k_init(subtle) {
running = true;
while (true) {
var now = Date.now(),
blocktime = now - r.tact,
is_busy = st.car < st.files.length;
if (blocktime > 2500)
console.log('main thread blocked for ' + blocktime);
r.tact = now;
if (was_busy && !is_busy) {
for (var a = 0; a < st.files.length; a++) {
var t = st.files[a];
@@ -1782,6 +1795,15 @@ function up2k_init(subtle) {
})();
function uptoast() {
if (st.busy.handshake.length)
return;
for (var a = 0; a < st.files.length; a++) {
var t = st.files[a];
if (t.want_recheck && !t.rechecks)
return;
}
var sr = uc.fsearch,
ok = pvis.ctr.ok,
ng = pvis.ctr.ng,
@@ -2037,6 +2059,8 @@ function up2k_init(subtle) {
nbusy++;
reading++;
nchunk++;
if (Date.now() - up2k.tact > 1500)
tasker();
}
function onmsg(d) {
@@ -2367,16 +2391,17 @@ function up2k_init(subtle) {
}
var err_pend = rsp.indexOf('partial upload exists at a different') + 1,
err_srcb = rsp.indexOf('source file busy; please try again') + 1,
err_plug = rsp.indexOf('upload blocked by x') + 1,
err_dupe = rsp.indexOf('upload rejected, file already exists') + 1;
if (err_pend || err_plug || err_dupe) {
if (err_pend || err_srcb || err_plug || err_dupe) {
err = rsp;
ofs = err.indexOf('\n/');
if (ofs !== -1) {
err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' ');
}
if (!t.rechecks && err_pend) {
if (!t.rechecks && (err_pend || err_srcb)) {
t.rechecks = 0;
t.want_recheck = true;
}

View File

@@ -17,6 +17,7 @@ var wah = '',
MOBILE = TOUCH,
CHROME = !!window.chrome,
VCHROME = CHROME ? 1 : 0,
IE = /Trident\//.test(navigator.userAgent),
FIREFOX = ('netscape' in window) && / rv:/.test(navigator.userAgent),
IPHONE = TOUCH && /iPhone|iPad|iPod/i.test(navigator.userAgent),
LINUX = /Linux/.test(navigator.userAgent),
@@ -111,12 +112,13 @@ if ((document.location + '').indexOf(',rej,') + 1)
try {
console.hist = [];
var CMAXHIST = 100;
var hook = function (t) {
var orig = console[t].bind(console),
cfun = function () {
console.hist.push(Date.now() + ' ' + t + ': ' + Array.from(arguments).join(', '));
if (console.hist.length > 100)
console.hist = console.hist.slice(50);
if (console.hist.length > CMAXHIST)
console.hist = console.hist.slice(CMAXHIST / 2);
orig.apply(console, arguments);
};
@@ -331,6 +333,25 @@ if (!String.prototype.format)
});
};
try {
new URL('/a/', 'https://a.com/');
}
catch (ex) {
console.log('ie11 shim URL()');
window.URL = function (url, base) {
if (url.indexOf('//') < 0)
url = base + '/' + url.replace(/^\/?/, '');
else if (url.indexOf('//') == 0)
url = 'https:' + url;
var x = url.split('?');
return {
"pathname": '/' + x[0].split('://')[1].replace(/[^/]+\//, ''),
"search": x.length > 1 ? x[1] : ''
};
}
}
// https://stackoverflow.com/a/950146
function import_js(url, cb) {
var head = document.head || document.getElementsByTagName('head')[0];
@@ -611,6 +632,29 @@ function vsplit(vp) {
}
function vjoin(p1, p2) {
if (!p1)
p1 = '';
if (!p2)
p2 = '';
if (p1.endsWith('/'))
p1 = p1.slice(0, -1);
if (p2.startsWith('/'))
p2 = p2.slice(1);
if (!p1)
return p2;
if (!p2)
return p1;
return p1 + '/' + p2;
}
function uricom_enc(txt, do_fb_enc) {
try {
return encodeURIComponent(txt);
@@ -1162,13 +1206,13 @@ var tt = (function () {
r.th.style.top = (e.pageY + 12 * sy) + 'px';
};
if (IPHONE) {
if (TOUCH) {
var f1 = r.show,
f2 = r.hide,
q = [];
// if an onclick-handler creates a new timer,
// iOS 13.1.2 delays the entire handler by up to 401ms,
// webkits delay the entire handler by up to 401ms,
// win by using a shared timer instead
timer.add(function () {
@@ -1558,6 +1602,14 @@ function load_md_plug(md_text, plug_type, defer) {
if (defer)
md_plug[plug_type] = null;
if (plug_type == 'pre')
try {
md_text = md_thumbs(md_text);
}
catch (ex) {
toast.warn(30, '' + ex);
}
if (!have_emp)
return md_text;
@@ -1598,6 +1650,47 @@ function load_md_plug(md_text, plug_type, defer) {
return md;
}
function md_thumbs(md) {
if (!/(^|\n)<!-- th -->/.exec(md))
return md;
// `!th[flags](some.jpg)`
// flags: nothing or "l" or "r"
md = md.split(/!th\[/g);
for (var a = 1; a < md.length; a++) {
if (!/^[^\]!()]*\]\([^\][!()]+\)/.exec(md[a])) {
md[a] = '!th[' + md[a];
continue;
}
var o1 = md[a].indexOf(']('),
o2 = md[a].indexOf(')', o1),
alt = md[a].slice(0, o1),
flags = alt.split(','),
url = md[a].slice(o1 + 2, o2),
float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : '';
if (!/[?&]cache/.exec(url))
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=i';
md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1);
}
return md.join('');
}
function md_th_set() {
var els = QSA('.mdth');
for (var a = 0, aa = els.length; a < aa; a++)
els[a].onclick = md_th_click;
}
function md_th_click(e) {
ev(e);
var url = this.getAttribute('href').split('?')[0];
if (window.sb_md)
window.parent.postMessage("imshow " + url, "*");
else
thegrid.imshow(url);
}
var svg_decl = '<?xml version="1.0" encoding="UTF-8"?>\n';

View File

@@ -1,3 +1,341 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0426-2300 `v1.6.15` unexpected boost
## new features
* 30% faster folder listings due to [the very last thing](https://github.com/9001/copyparty/commit/55c74ad164633a0a64dceb51f7f534da0422cbb5) i'd ever expect to be a bottleneck, [thx perf](https://docs.python.org/3.12/howto/perf_profiling.html)
* option to see the lastmod timestamps of symlinks instead of the target files
* makes the turbo mode of [u2cli, the commandline uploader and folder-sync tool](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) more turbo since copyparty dedupes uploads by symlinking to an existing copy and the symlink is stamped with the deduped file's lastmod
* **webdav:** enabled by default (because rclone will want this), can be disabled with arg `--dav-rt` or volflag `davrt`
* **http:** disabled by default, can be enabled per-request with urlparam `lt`
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py): option `--rh` to resolve server hostname only once at start of upload
* fantastic for buggy networks, but it'll break TLS
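a rough sketch of the symlink-lastmod toggles above (host, volume and paths are made-up examples, and it assumes `lt` can be combined with the `?ls` json listing and the usual `-v src:dst:perm:c,flag` volflag syntax):
```bash
# http listing with symlink timestamps (?lt) instead of deref, per-request
curl "http://127.0.0.1:3923/music/?ls&lt"

# webdav listings have this on by default; opt out globally with --dav-rt,
# or per-volume with the davrt volflag
python3 copyparty-sfx.py --dav-rt -v /srv/music:music:r
python3 copyparty-sfx.py -v /srv/music:music:r:c,davrt
```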
## bugfixes
* new arg `--s-tbody` specifies the network timeout before a dead connection gets dropped (default 3min)
* before there was no timeout at all, which could hang uploads or possibly consume all server resources
* ...but this is only relevant if your copyparty is directly exposed to the internet with no reverse proxy
* with nginx/caddy/etc you can disable the timeout with `--s-tbody 0` for a 3% performance boost (*wow!*)
* iPhone audio transcoder could turn bad and stop transcoding
* ~~maybe android phones no longer pause playback at the end of an album~~
* nope, that was due to [android's powersaver](https://github.com/9001/copyparty#fix-unreliable-playback-on-android), oh well
* ***bonus unintended feature:*** navigate into other folders while a song is playing
* [installing from the source tarball](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#build-from-release-tarball) should be ok now
* good base for making distro packages probably
## other changes
* since the network timeout fix is relevant for the single usecase that [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.15/copyparty-winpe64.exe) covers, there is now a new version of that
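a minimal sketch of the two setups described in the timeout bugfix above (ports and paths are illustrative, not recommendations):
```bash
# exposed directly to the internet: keep the default 3-minute socket timeout
python3 copyparty-sfx.py -p 80,443 -v /srv/pub:pub:r

# behind nginx/caddy (the proxy already reaps dead connections):
# disable the timeout for the ~3% throughput gain mentioned above
python3 copyparty-sfx.py -p 3923 --s-tbody 0 -v /srv/pub:pub:r
```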
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0424-0609 `v1.6.14` unsettable flags
## new features
* unset a volflag (override a global option) by negating it (setting volflag `-flagname`)
* new argument `--cert` to specify TLS certificate location
* defaults to `~/.config/copyparty/cert.pem` like before
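a hedged sketch combining both features; the paths are made up, `rand` was picked arbitrarily as the flag to negate, and it assumes the usual `-v src:dst:perm:c,flag` volflag syntax:
```bash
# --rand forces randomized filenames globally, the "-rand" volflag turns it
# back off for one volume, and --cert points at a non-default certificate
python3 copyparty-sfx.py --rand --cert ~/certs/copyparty.pem \
  -v /srv/inbox:inbox:w \
  -v /srv/pub:pub:r:c,-rand
```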
## bugfixes
* in zip/tar downloads, always use the parent-folder name as the archive root
* more reliable ftp authentication when providing password as username
* connect-page: fix rclone ftps example
## other changes
* stop suggesting `--http-only` and `--https-only` for performance since the difference is negligible
* mention how some antivirus (avast, avg, mcafee) thinks that pillow's webp encoder is a virus, affecting `copyparty.exe`
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0420-2141 `v1.6.12` as seen on nixos
## new features
* @chinponya [made](https://github.com/9001/copyparty/pull/22) a copyparty [Nix package](https://github.com/9001/copyparty#nix-package) and a [NixOS module](https://github.com/9001/copyparty#nixos-module)! nice 🎉
* with [systemd-based hardening](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nixos/modules/copyparty.nix#L230-L270) instead of [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh)
* complements the [arch package](https://github.com/9001/copyparty/tree/hovudstraum/contrib/package/arch) very well w
## bugfixes
* fix an sqlite fd leak
* with enough simultaneous traffic, copyparty could run out of file descriptors since it relied on the gc to close sqlite cursors
* now there's a pool of cursors shared between the tcp connections instead, limited to the number of CPU cores
* performance mostly unaffected (or slightly improved) compared to before, except for a 20% reduction only during max server load caused by directory-listings or searches
* ~~somehow explicitly closing the cursors didn't always work... maybe this was actually a python bug :\\/~~
* yes, it does incomplete cleanup if opening a WAL database fails
* multirange requests would fail with an error; now they get a 200 as expected (since they're kinda useless and not worth the overhead)
* [the only software i've ever seen do that](https://apps.kde.org/discover/) now works as intended
* expand `~/` filesystem paths in all remaining args: `-c`, `-lo`, `--hist`, `--ssl-log`, and the `hist` volflag
* never use IPv6-format IPv4 (`::ffff:127.0.0.1`) in responses
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py): don't enter delete stage if some of the uploads failed
* audio player in safari on touchbar macbooks
* songs would play backwards because the touchbar keeps spamming play/pause
* playback would stop when the preloader kicks in because safari sees the new audio object and freaks out
## other changes
* added [windows quickstart / service example](https://github.com/9001/copyparty/blob/hovudstraum/docs/examples/windows.md)
* updated pyinstaller (it makes smaller exe files now)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0401-2112 `v1.6.11` not joke
## new features
* new event-hook: [exif stripper](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/image-noexif.py)
* [markdown thumbnails](https://a.ocv.me/pub/demo/pics-vids/README.md?v) -- see [readme](https://github.com/9001/copyparty#markdown-viewer)
* soon: support for [web-scrobbler](https://github.com/web-scrobbler/web-scrobbler/) - the [Last.fm](https://www.last.fm/user/tripflag) browser extension
* will update here + readme with more info when [the v3](https://github.com/web-scrobbler/web-scrobbler/projects/5) is out
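for reference, a hypothetical readme using the thumbnail syntax handled by `md_thumbs()` in the util.js diff further up (filenames are made up): the `<!-- th -->` marker enables it, and `!th[l](pic.jpg)` / `!th[r](pic.jpg)` insert a left- or right-floating thumbnail that links to the full image
```bash
cat > README.md <<'EOF'
<!-- th -->
# trip photos
!th[l](beach.jpg) the text wraps around the left-floating thumbnail
!th[r](sunset.jpg) and this one floats right instead
EOF
```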
## bugfixes
* more sqlite query-planner twiddling
* deleting files is MUCH faster now, and uploads / bootup might be a bit better too
* webdav optimizations / compliance
* should make some webdav clients run faster than before
* in very related news, the webdav-client in [rclone](https://github.com/rclone/rclone/) v1.63 ([currently beta](https://beta.rclone.org/?filter=latest)) will be ***FAST!***
* does cool stuff such as [bidirectional sync](https://github.com/9001/copyparty#folder-sync) between copyparty and a local folder
* [bpm detector](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/audio-bpm.py) is a bit more accurate
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) / commandline uploader: better error messages if something goes wrong
* readme rendering could fail in firefox if certain addons were installed (not sure which)
* event-hooks: more accurate usage examples
## other changes
* @chinponya automated the prismjs build step (thx!)
* updated some js deps (markedjs, codemirror)
* copyparty.exe: updated Pillow to 9.5.0
* and finally [the joke](https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/rave.js) (looks [like this](https://cd.ocv.me/b/d2/d21/#af-9b927c42))
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0320-2156 `v1.6.10` rclone sync
## new features
* [iPhone "app"](https://github.com/9001/copyparty#ios-shortcuts) (upload shortcut) -- thanks @Daedren !
* can strip exif, upload files, pics, vids, links, clipboard
* can download links and rehost the target file on your server
* support `rclone sync` to [sync folders](https://github.com/9001/copyparty#folder-sync) to/from copyparty
* let webdav clients set lastmodified times during upload
* let webdav clients replace files during upload
## bugfixes
* [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh): FFmpeg transcoding was slow because there was no `/dev/urandom`
* iphones would fail to play *some* songs (low-bitrate and/or shorter than ~7 seconds)
* due to either an iOS bug or an FFmpeg bug in the caf remuxing idk
* fixed by mixing white noise into songs if an iPhone asks for them
* small correction in the docker readme regarding rootless podman
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0316-2106 `v1.6.9` index.html
## new features
* option to show `index.html` instead of the folder listing
* arg `--ih` makes it default-enabled
* clients can enable/disable it in the `[⚙️]` settings tab
* url-param `?v` skips it for a particular folder
* faster folder-thumbnail validation on startup (mostly on conventional HDDs)
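a small sketch of how that fits together (paths are made up); `--ih` makes index.html the default view, and appending `?v` to a folder url shows the regular listing for just that request:
```bash
python3 copyparty-sfx.py --ih -v /srv/www:www:r
curl "http://127.0.0.1:3923/www/blog/?v"   # folder listing instead of index.html
```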
## bugfixes
* "load more" button didn't always show up when search results got truncated
* ux: tooltips could block buttons on android
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0312-1610 `v1.6.8` folder thumbs
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
## new features
* folder thumbnails are indexed in the db
* now supports non-lowercase names (`Cover.jpg`, `Folder.JPG`)
* folders without a specific cover/folder image will show the first pic inside
* when audio playback continues into an empty folder, keep trying for a bit
* add no-index hints (google etc) in basic-browser HTML (`?b`, `?b=u`)
* [commandline uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) supports long filenames on win7
## bugfixes
* rotated logfiles didn't get xz compressed
* image-gallery links pointing to a deleted image show an error instead of a crashpage
## other changes
* folder thumbnails have purple text to differentiate from files
* `copyparty32.exe` starts 30% faster (but is 6% larger)
----
# what to download?
| download link | is it good? | description |
| -- | -- | -- |
| **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** | ✅ the best 👍 | runs anywhere! only needs python |
| [a docker image](https://github.com/9001/copyparty/blob/hovudstraum/scripts/docker/README.md) | it's ok | good if you prefer docker 🐋 |
| [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) | ⚠️ [acceptable](https://github.com/9001/copyparty#copypartyexe) | for [win8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png) or later; built-in thumbnailer |
| [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe) | ⚠️ acceptable | [CLI uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) as a win7+ exe ([video](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm)) |
| [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) | ⛔️ [dangerous](https://github.com/9001/copyparty#copypartyexe) | for [win7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png) -- never expose to the internet! |
| [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) | ⛔️ dangerous | runs on [64bit WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png), otherwise useless |
* except for [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe), all of the options above are equivalent
* the zip and tar.gz files below are just source code
* python packages are available at [PyPI](https://pypi.org/project/copyparty/#files)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0305-2018 `v1.6.7` fix no-dedup + add up2k.exe
## new features
* controlpanel-connect: add example for webdav automount
## bugfixes
* fix a race which, in worst case (but unlikely on linux), **could cause data loss**
* could only happen if `--no-dedup` or volflag `copydupes` was set (**not** default)
* if two identical files were uploaded at the same time, there was a small chance that one of the files would become empty
* check if you were affected by doing a search for zero-byte files using either of the following:
* https://127.0.0.1:3923/#q=size%20%3D%200
* `find -type f -size 0`
* let me know if you lost something important and had logging enabled!
* ftp: mkdir can do multiple levels at once (support filezilla)
* fix flickering toast on upload finish
* `[💤]` (upload-baton) could disengage if chrome decides to pause the background tab for 10sec (which it sometimes does)
----
## introducing [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe)
the commandline up2k upload / filesearch client, now as a standalone windows exe
* based on python 3.7 so it runs on 32bit windows7 or anything newer
* *no https support* (saves space + the python3.7 openssl is getting old)
* built from b39ff92f34e3fca389c78109d20d5454af761f8e so it can do long filepaths and mojibake
----
⭐️ **you probably want [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) below;**
the exe is [not recommended](https://github.com/9001/copyparty#copypartyexe) for longterm use
and the zip and tar.gz files are source code
(python packages are available at [PyPI](https://pypi.org/project/copyparty/#files))
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0226-2030 `v1.6.6` r 2 0 0
two hundred releases wow
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
* currently fighting a ground fault so the demo server will be unreliable for a while
## new features
* more docker containers! now runs on x64, x32, aarch64, armhf, ppc64, s390x
* pls let me know if you actually run copyparty on an IBM mainframe 👍
* new [event hook](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks) type `xiu` runs just once for all recent uploads
* example hook [xiu-sha.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/xiu-sha.py) generates sha512 checksum files
* new arg `--rsp-jtr` simulates connection jitter
* copyparty.exe integrity selftest
* ux:
* return to previous page after logging in
* show a warning on the login page if you're not using https
* freebsd: detect `fetch` and return the [colorful sortable plaintext](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png) listing
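the same listing is also available on demand through the `?ls` url-params from the http api (see the devnotes diff further down), for example:
```bash
curl "http://127.0.0.1:3923/music/?ls=t"   # plaintext listing
curl "http://127.0.0.1:3923/music/?ls=v"   # terminal-formatted (colors)
```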
## bugfixes
* permit replacing empty files only during a `--blank-wt` grace period
* lifetimes: keep upload-time when a size/mtime change triggers a reindex
* during cleanup after an unlink, never rmdir the entire volume
* rescan button in the controlpanel required volumes to be e2ds
* dupes could get indexed with the wrong mtime
* only affected the search index; the filesystem got the right one
* ux: search results could include the same hit twice in case of overlapping volumes
* ux: upload UI would remain expanded permanently after visiting a huge tab
* ftp: return proper error messages when client does something illegal
* ie11: support the back button
## other changes
* [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) replaces copyparty64.exe -- now built for 64-bit windows 10
* **on win10 it just works** -- on win8 it needs [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145) -- no win7 support
* has the latest security patches, but sfx.py is still better for long-term use
* has pillow and mutagen; can make thumbnails and parse/index media
* [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is the old win7-compatible, dangerously-insecure edition
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0212-1411 `v1.6.5` windows smb fix + win10.exe
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
## bugfixes
* **windows-only:** smb locations (network drives) could not be accessed
* appeared in [v1.6.4](https://github.com/9001/copyparty/releases/tag/v1.6.4) while adding support for long filepaths (260chars+)
## other changes
* removed tentative support for compressed chiptunes (xmgz, xmz, xmj, ...) since FFmpeg usually doesn't support them
----
# introducing [copyparty640.exe](https://github.com/9001/copyparty/releases/download/v1.6.5/copyparty640.exe)
* built for win10, comes with the latest python and deps (supports win8 with [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145))
* __*much* safer__ than the old win7-compatible `copyparty.exe` and `copyparty64.exe`
* but only `copyparty-sfx.py` takes advantage of the operating system security patches
* includes pillow for thumbnails and mutagen for media indexing
* around 10% slower (trying to figure out what's up with that)
starting from the next release,
* `copyparty.exe` (win7 x32) will become `copyparty32.exe`
* `copyparty640.exe` (win10) will be the new `copyparty.exe`
* `copyparty64.exe` (win7 x64) will graduate
so the [copyparty64.exe](https://github.com/9001/copyparty/releases/download/v1.6.5/copyparty64.exe) in this release will be the "final" version able to run inside a [64bit Win7-era winPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) (all regular 32/64-bit win7 editions can just use `copyparty32.exe` instead)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0211-1802 `v1.6.4` 🔧🎲🔗🐳🇦🎶
* read-only demo server at https://a.ocv.me/pub/demo/
* [1.6 theme song](https://a.ocv.me/pub/demo/music/.bonus/#af-134e597c) // [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md)
## new features
* 🔧 new [config syntax](https://github.com/9001/copyparty/blob/hovudstraum/docs/example.conf) (#20)
* the new syntax is still kinda esoteric and funky but it's an improvement
* old config files are still supported
* `--vc` prints the autoconverted config which you can copy back into the config file to upgrade
* `--vc` will also [annotate and explain](https://user-images.githubusercontent.com/241032/217356028-eb3e141f-80a6-4bc6-8d04-d8d1d874c3e9.png) the config files
* new argument `--cgen` to generate config from commandline arguments
* kinda buggy, especially the `[global]` section, so give it a lookover before saving it
* 🎲 randomize filenames on upload
* either optionally, using the 🎲 button in the up2k ui
* or force-enabled; globally with `--rand` or per-volume with volflag `rand`
* specify filename length with `nrand` (globally or volflag), default 9
* 🔗 export a list of links to your recent uploads
* `copy links` in the up2k tab (🚀) will copy links to all uploads since last page refresh,
* `copy` in the unpost tab (🧯) will copy links to all your recent uploads (max 2000 files / 12 hours by default)
* filekeys are included if that's enabled and you have access to view those (permissions `G` or `r`)
* 🇦 [arch package](https://github.com/9001/copyparty/tree/hovudstraum/contrib/package/arch) -- added in #18, thx @icxes
* maybe in aur soon!
* 🐳 [docker containers](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) -- 5 editions,
* [min](https://hub.docker.com/r/copyparty/min) (57 MiB), just copyparty without thumbnails or audio transcoding
* [im](https://hub.docker.com/r/copyparty/im) (70 MiB), thumbnails of popular image formats + media tags with mutagen
* [ac (163 MiB)](https://hub.docker.com/r/copyparty/ac) 🥇 adds audio/video thumbnails + audio transcoding + better tags
* [iv](https://hub.docker.com/r/copyparty/iv) (211 MiB), makes heif/avif/jxl faster to thumbnail
* [dj](https://hub.docker.com/r/copyparty/dj) (309 MiB), adds optional detection of musical key / bpm
* 🎶 [chiptune player](https://a.ocv.me/pub/demo/music/chiptunes/#af-f6fb2e5f)
* transcodes mod/xm/s3m/it/mo3/mptm/mt2/okt to opus
* uses FFmpeg (libopenmpt) so the accuracy is not perfect, but most files play OK enough
* not **yet** supported in the docker container since Alpine's FFmpeg was built without libopenmpt
* windows: support long filepaths (over 260 chars)
* uses the `//?/` winapi syntax to also support windows 7
* `--ver` shows the server version on the control panel
## bugfixes
* markdown files didn't scale properly in the document browser
* detect and refuse multiple volume definitions sharing the same filesystem path
* don't return incomplete transcodes if multiple clients try to play the same flac file
* [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh): more reliable chroot cleanup, sigusr1 for config reload
* pypi packaging: compress web resources, include webdav.bat
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0131-2103 `v1.6.3` sandbox k

View File

@@ -17,6 +17,6 @@ problem: `svchost.exe` is using 100% of a cpu core, and upon further inspection
"solution": create a virtual filesystem which is intentionally slow and trick windows into reading it from there instead
* create a file called `AppxManifest.xml` and put something dumb in it
* serve the file from a copyparty instance with `--rsp-slp=9` so every request will hang for 9 sec
* serve the file from a copyparty instance with `--rsp-slp=1` so every request will hang for 1 sec
* `net use m: http://127.0.0.1:3993/` (mount copyparty using the windows-native webdav client)
* `mklink /d c:\windows\systemapps\microsoftwindows.client.cbs_cw5n1h2txyewy\AppxManifest.xml m:\AppxManifest.xml`

View File

@@ -4,6 +4,7 @@
* [future plans](#future-plans) - some improvement ideas
* [design](#design)
* [up2k](#up2k) - quick outline of the up2k protocol
* [why not tus](#why-not-tus) - I didn't know about [tus](https://tus.io/)
* [why chunk-hashes](#why-chunk-hashes) - a single sha512 would be better, right?
* [http api](#http-api)
* [read](#read)
@@ -16,6 +17,7 @@
* [building](#building)
* [dev env setup](#dev-env-setup)
* [just the sfx](#just-the-sfx)
* [build from release tarball](#build-from-release-tarball) - uses the included prebuilt webdeps
* [complete release](#complete-release)
* [todo](#todo) - roughly sorted by priority
* [discarded ideas](#discarded-ideas)
@@ -25,13 +27,15 @@
some improvement ideas
* the JS is a mess -- a preact rewrite would be nice
* the JS is a mess -- a ~~preact~~ rewrite would be nice
* preferably without build dependencies like webpack/babel/node.js, maybe a python thing to assemble js files into main.js
* good excuse to look at using virtual lists (browsers start to struggle when folders contain over 5000 files)
* maybe preact / vdom isn't the best choice, could just wait for the Next Big Thing
* the UX is a mess -- a proper design would be nice
* very organic (much like the python/js), everything was an afterthought
* true for both the layout and the visual flair
* something like the tron board-room ui (or most other hollywood ones, like ironman) would be :100:
* would preferably keep the information density, just more organized yet [not too boring](https://blog.rachelbinx.com/2023/02/unbearable-sameness/)
* some of the python files are way too big
* `up2k.py` ended up doing all the file indexing / db management
* `httpcli.py` should be separated into modules in general
@@ -64,6 +68,13 @@ regarding the frequent server log message during uploads;
* on this http connection, `2.77 GiB` transferred, `102.9 MiB/s` average, `948` chunks handled
* client says `4` uploads OK, `0` failed, `3` busy, `1` queued, `10042 MiB` total size, `7198 MiB` and `00:01:09` left
## why not tus
I didn't know about [tus](https://tus.io/) when I made this, but:
* up2k has the advantage that it supports parallel uploading of non-contiguous chunks straight into the final file -- [tus does a merge at the end](https://tus.io/protocols/resumable-upload.html#concatenation) which is slow and taxing on the server HDD / filesystem (unless i'm misunderstanding)
* up2k has the slight disadvantage of requiring the client to hash the entire file before an upload can begin, but this has the benefit of immediately skipping duplicate files
* and the hashing happens in a separate thread anyways so it's usually not a bottleneck
## why chunk-hashes
a single sha512 would be better, right?
@@ -100,6 +111,7 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| GET | `?ls&dots` | list files/folders at URL as JSON, including dotfiles |
| GET | `?ls=t` | list files/folders at URL as plaintext |
| GET | `?ls=v` | list files/folders at URL, terminal-formatted |
| GET | `?lt` | in listings, use symlink timestamps rather than targets |
| GET | `?b` | list files/folders at URL as simplified HTML |
| GET | `?tree=.` | list one level of subdirectories inside URL |
| GET | `?tree` | list one level of subdirectories for each level until URL |
@@ -245,10 +257,31 @@ then build the sfx using any of the following examples:
```
## build from release tarball
uses the included prebuilt webdeps
if you downloaded a [release](https://github.com/9001/copyparty/releases) source tarball from github (for example [copyparty-1.6.15.tar.gz](https://github.com/9001/copyparty/releases/download/v1.6.15/copyparty-1.6.15.tar.gz) so not the autogenerated one) you can build it like so,
```bash
python3 setup.py install --user
```
or if you're packaging it for a linux distro (nice), maybe something like
```bash
bash scripts/run-tests.sh python3 # optional
python3 setup.py build
python3 setup.py install --skip-build --prefix=/usr --root=$HOME/pe/copyparty
```
## complete release
also builds the sfx so skip the sfx section above
does everything completely from scratch, straight from your local repo
in the `scripts` folder:
* run `make -C deps-docker` to build all dependencies

4
docs/examples/README.md Normal file
View File

@@ -0,0 +1,4 @@
copyparty server config examples
[windows.md](windows.md) -- running copyparty as a service on windows

115
docs/examples/windows.md Normal file
View File

@@ -0,0 +1,115 @@
# running copyparty on windows
this is a complete example / quickstart for running copyparty on windows, optionally as a service (autostart on boot)
you will definitely need either [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (comfy, portable, more features) or [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) (smaller, safer)
* if you decided to grab `copyparty-sfx.py` instead of the exe you will also need to install the ["Latest Python 3 Release"](https://www.python.org/downloads/windows/)
then you probably want to download [FFmpeg](https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip) and put `ffmpeg.exe` and `ffprobe.exe` in your PATH (so for example `C:\Windows\System32\`) -- this enables thumbnails, audio transcoding, and making music metadata searchable
## the config file
open up notepad and save the following as `c:\users\you\documents\party.conf` (for example)
```yaml
[global]
lo: ~/logs/cpp-%Y-%m%d.xz # log to c:\users\you\logs\
e2dsa, e2ts, no-dedup, z # sets 4 flags; see expl.
p: 80, 443 # listen on ports 80 and 443, not 3923
theme: 2 # default theme: protonmail-monokai
lang: nor # default language: viking
[accounts] # usernames and passwords
kevin: shangalabangala # kevin's password
[/] # create a volume available at /
c:\pub # sharing this filesystem location
accs: # and set permissions:
r: * # everyone can read/download files,
rwmd: kevin # kevin can read/write/move/delete
[/inc] # create another volume at /inc
c:\pub\inc # sharing this filesystem location
accs: # permissions:
w: * # everyone can upload, but not browse
rwmd: kevin # kevin is admin here too
[/music] # and a third volume at /music
~/music # which shares c:\users\you\music
accs:
r: *
rwmd: kevin
```
### config explained: [global]
the `[global]` section accepts any config parameters you can see when running copyparty (either the exe or the sfx.py) with `--help`, so this is the same as running copyparty with arguments `--lo c:\users\you\logs\copyparty-%Y-%m%d.xz -e2dsa -e2ts --no-dedup -z -p 80,443 --theme 2 --lang nor`
* `lo: ~/logs/cpp-%Y-%m%d.xz` writes compressed logs (the compression will make them delayed)
* `e2dsa` enables the upload deduplicator and file indexer, which enables searching
* `e2ts` enables music metadata indexing, making albums / titles etc. searchable too
* `no-dedup` writes full dupes to disk instead of symlinking, since lots of windows software doesn't handle symlinks well
* but the improved upload speed from `e2dsa` is not affected
* `z` enables zeroconf, making the server available at `http://HOSTNAME.local/` from any other machine in the LAN
* `p: 80,443` listens on the ports `80` and `443` instead of the default `3923`
* `lang: nor` sets default language to viking
### config explained: [accounts]
the `[accounts]` section defines all the user accounts, which can then be referenced when granting people access to the different volumes
### config explained: volumes
then we create three volumes, one at `/`, one at `/inc`, and one at `/music`
* `/` and `/music` are readable without requiring people to login (`r: *`) but you need to login as kevin to write/move/delete files (`rwmd: kevin`)
* anyone can upload to `/inc` but you must be logged in as kevin to see the files inside
## run copyparty
to test your config it's best to just run copyparty in a console to watch the output:
```batch
copyparty.exe -c party.conf
```
or if you wanna use `copyparty-sfx.py` instead of the exe (understandable),
```batch
%localappdata%\programs\python\python311\python.exe copyparty-sfx.py -c party.conf
```
(please adjust `python311` to match the python version you installed, i'm not good enough at windows to make that bit generic)
## run it as a service
to run this as a service you need [NSSM](https://nssm.cc/ci/nssm-2.24-101-g897c7ad.zip), so put the exe somewhere in your PATH
then either do this for `copyparty.exe`:
```batch
nssm install cpp %homedrive%%homepath%\downloads\copyparty.exe -c %homedrive%%homepath%\documents\party.conf
```
or do this for `copyparty-sfx.py`:
```batch
nssm install cpp %localappdata%\programs\python\python311\python.exe %homedrive%%homepath%\downloads\copyparty-sfx.py -c %homedrive%%homepath%\documents\party.conf
```
then after creating the service, modify it so it runs with your own windows account (so file permissions don't get wonky and paths expand as expected):
```batch
nssm set cpp ObjectName .\yourAccountName yourWindowsPassword
nssm start cpp
```
and that's it, all good
if it doesn't start, enable stderr logging so you can see what went wrong:
```batch
nssm set cpp AppStderr %homedrive%%homepath%\logs\cppsvc.err
nssm set cpp AppStderrCreationDisposition 2
```

View File

@@ -194,6 +194,9 @@ sqlite3 .hist/up2k.db 'select * from mt where k="fgsfds" or k="t:mtp"' | tee /de
for ((f=420;f<1200;f++)); do sz=$(ffmpeg -y -f lavfi -i sine=frequency=$f:duration=2 -vf volume=0.1 -ac 1 -ar 44100 -f s16le /dev/shm/a.wav 2>/dev/null; base64 -w0 </dev/shm/a.wav | gzip -c | wc -c); printf '%d %d\n' $f $sz; done | tee /dev/stderr | sort -nrk2,2
ffmpeg -y -f lavfi -i sine=frequency=1050:duration=2 -vf volume=0.1 -ac 1 -ar 44100 /dev/shm/a.wav
# better sine
sox -DnV -r8000 -b8 -c1 /dev/shm/a.wav synth 1.1 sin 400 vol 0.02
# play icon calibration pics
for w in 150 170 190 210 230 250; do for h in 130 150 170 190 210; do /c/Program\ Files/ImageMagick-7.0.11-Q16-HDRI/magick.exe convert -size ${w}x${h} xc:brown -fill orange -draw "circle $((w/2)),$((h/2)) $((w/2)),$((h/3))" $w-$h.png; done; done

View File

@@ -0,0 +1,2 @@
vsftpd a.conf -olisten=YES -olisten_port=3921 -orun_as_launching_user=YES -obackground=NO -olog_ftp_protocol=YES

View File

@@ -14,6 +14,10 @@ when server is on another machine (1gbit LAN),
# creating the config file
the copyparty "connect" page at `/?hc` (so for example http://127.0.0.1:3923/?hc) will generate commands to autoconfigure rclone for your server
**if you prefer to configure rclone manually, continue reading:**
replace `hunter2` with your password, or remove the `hunter2` lines if you allow anonymous access
@@ -22,14 +26,16 @@ replace `hunter2` with your password, or remove the `hunter2` lines if you allow
(
echo [cpp-rw]
echo type = webdav
echo vendor = other
echo vendor = owncloud
echo url = http://127.0.0.1:3923/
echo headers = Cookie,cppwd=hunter2
echo pacer_min_sleep = 0.01ms
echo(
echo [cpp-ro]
echo type = http
echo url = http://127.0.0.1:3923/
echo headers = Cookie,cppwd=hunter2
echo pacer_min_sleep = 0.01ms
) > %userprofile%\.config\rclone\rclone.conf
```
@@ -41,14 +47,16 @@ also install the windows dependencies: [winfsp](https://github.com/billziss-gh/w
cat > ~/.config/rclone/rclone.conf <<'EOF'
[cpp-rw]
type = webdav
vendor = other
vendor = owncloud
url = http://127.0.0.1:3923/
headers = Cookie,cppwd=hunter2
pacer_min_sleep = 0.01ms
[cpp-ro]
type = http
url = http://127.0.0.1:3923/
headers = Cookie,cppwd=hunter2
pacer_min_sleep = 0.01ms
EOF
```
@@ -62,6 +70,15 @@ rclone.exe mount --vfs-cache-mode writes --vfs-cache-max-age 5s --attr-timeout 5
```
# sync folders to/from copyparty
note that the up2k client [up2k.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy) (available on the "connect" page of your copyparty server) does uploads much faster and safer, but rclone is bidirectional and more ubiquitous
```
rclone sync /usr/share/icons/ cpp-rw:fds/
```
# use rclone as server too, replacing copyparty
feels out of place but is too good not to mention
@@ -70,3 +87,26 @@ feels out of place but is too good not to mention
rclone.exe serve http --read-only .
rclone.exe serve webdav .
```
# devnotes
copyparty supports and expects [the following](https://github.com/rclone/rclone/blob/46484022b08f8756050aa45505ea0db23e62df8b/backend/webdav/webdav.go#L575-L578) from rclone,
```go
case "owncloud":
f.canStream = true
f.precision = time.Second
f.useOCMtime = true
f.hasOCMD5 = true
f.hasOCSHA1 = true
```
notably,
* `useOCMtime` enables the `x-oc-mtime` header to retain mtime of uploads from rclone
* `canStream` is supported but not required by us
* `hasOCMD5` / `hasOCSHA1` is conveniently dontcare on both ends
there's a scary comment mentioning PROPSET of lastmodified which is not something we wish to support
and if `vendor=owncloud` ever stops working, try `vendor=fastmail` instead
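roughly what `useOCMtime` amounts to on the wire: a plain webdav PUT with an `x-oc-mtime` header (unix timestamp) so the uploaded file keeps its original lastmod; url, password and timestamp below are placeholders
```bash
curl -T song.flac -H "x-oc-mtime: 1682530000" \
  -b cppwd=hunter2 "http://127.0.0.1:3923/music/song.flac"
```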

View File

@@ -7,6 +7,21 @@ there is probably some unintentional bias so please submit corrections
currently up to date with [awesome-selfhosted](https://github.com/awesome-selfhosted/awesome-selfhosted) but that probably won't last
## symbol legends
### ...in feature matrices:
* `█` = absolutely
* `` = partially
* `•` = maybe?
* ` ` = nope
### ...in reviews:
* ✅ = advantages over copyparty
* 💾 = what copyparty offers as an alternative
* 🔵 = similarities
* ⚠️ = disadvantages (something copyparty does "better")
## toc
* top
@@ -32,11 +47,15 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [kodbox](#kodbox)
* [filebrowser](#filebrowser)
* [filegator](#filegator)
* [sftpgo](#sftpgo)
* [updog](#updog)
* [goshs](#goshs)
* [gimme-that](#gimme-that)
* [ass](#ass)
* [linx](#linx)
* [h5ai](#h5ai)
* [autoindex](#autoindex)
* [miniserve](#miniserve)
* [briefly considered](#briefly-considered)
@@ -63,8 +82,8 @@ the table headers in the matrixes below are the different softwares, with a quic
the softwares,
* `a` = [copyparty](https://github.com/9001/copyparty)
* `b` = [hfs2](https://github.com/rejetto/hfs2)
* `c` = [hfs3](https://www.rejetto.com/hfs/)
* `b` = [hfs2](https://rejetto.com/hfs/)
* `c` = [hfs3](https://github.com/rejetto/hfs)
* `d` = [nextcloud](https://github.com/nextcloud/server)
* `e` = [seafile](https://github.com/haiwen/seafile)
* `f` = [rclone](https://github.com/rclone/rclone), specifically `rclone serve webdav .`
@@ -73,6 +92,7 @@ the softwares,
* `i` = [kodbox](https://github.com/kalcaddle/kodbox)
* `j` = [filebrowser](https://github.com/filebrowser/filebrowser)
* `k` = [filegator](https://github.com/filegator/filegator)
* `l` = [sftpgo](https://github.com/drakkan/sftpgo)
some softwares not in the matrixes,
* [updog](#updog)
@@ -80,6 +100,9 @@ some softwares not in the matrixes,
* [gimme-that](#gimmethat)
* [ass](#ass)
* [linx](#linx)
* [h5ai](#h5ai)
* [autoindex](#autoindex)
* [miniserve](#miniserve)
symbol legend,
* `█` = absolutely
@@ -90,62 +113,64 @@ symbol legend,
## general
| feature / software | a | b | c | d | e | f | g | h | i | j | k |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ |
| runs on iOS | | | | | | | | | | | |
| runs on Android | █ | | | | | █ | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ |
| runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | |
| zero setup, just go | █ | █ | █ | | | | █ | | | █ | |
| android app | | | | █ | █ | | | | | | |
| iOS app | | | | █ | █ | | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | |
| runs on iOS | | | | | | | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ | █ |
| runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ |
| zero setup, just go | █ | █ | █ | | | | █ | | | █ | | |
| android app | | | | █ | █ | | | | | | | |
| iOS app | | | | █ | █ | | | | | | | |
* `zero setup` = you can get a mostly working setup by just launching the app, without having to install any software or configure whatever
* `a`/copyparty remarks:
* no gui for server settings; only for client-side stuff
* can theoretically run on iOS / iPads using [iSH](https://ish.app/), but only the iPad will offer sufficient multitasking i think
* [android app](https://f-droid.org/en/packages/me.ocv.partyup/) is for uploading only
* no iOS app but has [shortcuts](https://github.com/9001/copyparty#ios-shortcuts) for easy uploading
* `b`/hfs2 runs on linux through wine
* `f`/rclone must be started with the command `rclone serve webdav .` or similar
* `h`/chibisafe has undocumented windows support
* `l`/sftpgo must be launched with a command
## file transfer
*the thing that copyparty is actually kinda good at*
| feature / software | a | b | c | d | e | f | g | h | i | j | k |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | █ | | █ | | █ | █ | |
| download folder as tar | █ | | | | | | | | | █ | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ |
| resumable uploads | █ | | | | | | | | █ | | █ |
| upload segmenting | █ | | | | | | | █ | █ | | █ |
| upload acceleration | █ | | | | | | | | █ | | █ |
| upload verification | █ | | | █ | █ | | | | █ | | |
| upload deduplication | █ | | | | █ | | | | █ | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ |
| keep last-modified time | █ | | | █ | █ | █ | | | | | |
| upload rules | | | | | | | | | | | |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | |
| ┗ max filesize | █ | | | | | | | █ | | | █ |
| ┗ max items in folder | █ | | | | | | | | | | |
| ┗ max file age | █ | | | | | | | | █ | | |
| ┗ max uploads over time | █ | | | | | | | | | | |
| ┗ compress before write | █ | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | █ | █ | | |
| ┗ mimetype reject-list | | | | | | | | | • | | |
| ┗ extension reject-list | | | | | | | | █ | • | | |
| checksums provided | | | | █ | █ | | | | █ | | |
| cloud storage backend | | | | █ | █ | █ | | | | | █ |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | █ | | █ | | █ | █ | | █ |
| download folder as tar | █ | | | | | | | | | █ | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | |
| resumable uploads | █ | | | | | | | | █ | | █ | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | |
| upload acceleration | █ | | | | | | | | █ | | █ | |
| upload verification | █ | | | █ | █ | | | | █ | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | █ |
| upload rules | | | | | | | | | | | | |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | | █ |
| ┗ max filesize | █ | | | | | | | █ | | | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | |
| ┗ max file age | █ | | | | | | | | █ | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | |
| ┗ compress before write | █ | | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | █ | █ | | | |
| ┗ mimetype reject-list | | | | | | | | | • | | | |
| ┗ extension reject-list | | | | | | | | █ | • | | | |
| checksums provided | | | | █ | █ | | | | █ | | | |
| cloud storage backend | | | | █ | █ | █ | | | | | █ | █ |
* `upload segmenting` = files are sliced into chunks, making it possible to upload files larger than 100 MiB on cloudflare for example
@@ -162,25 +187,29 @@ symbol legend,
* can provide checksums for single files on request
* can probably do extension/mimetype rejection similar to copyparty
* `k`/filegator download-as-zip is not streaming; it creates the full zipfile before download can start
* `l`/sftpgo:
* resumable/segmented uploads only over SFTP, not over HTTP
* upload rules are totals only, not over time
* can probably do extension/mimetype rejection similar to copyparty
## protocols and client support
| feature / software | a | b | c | d | e | f | g | h | i | j | k |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | |
| serve ftp | █ | | | | | █ | | | | | |
| serve ftps | █ | | | | | █ | | | | | |
| serve sftp | | | | | | █ | | | | | |
| serve smb/cifs | | | | | | █ | | | | | |
| serve dlna | | | | | | █ | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ |
| zeroconf | █ | | | | | | | | | | |
| supports netscape 4 | | | | | | █ | | | | | • |
| ...internet explorer 6 | | █ | | █ | | █ | | | | | • |
| mojibake filenames | █ | | | • | • | █ | █ | • | • | • | |
| undecodable filenames | █ | | | • | • | █ | | • | • | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ |
| serve ftp | █ | | | | | █ | | | | | | █ |
| serve ftps | █ | | | | | █ | | | | | | █ |
| serve sftp | | | | | | █ | | | | | | █ |
| serve smb/cifs | | | | | | █ | | | | | | |
| serve dlna | | | | | | █ | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ |
| zeroconf | █ | | | | | | | | | | | |
| supports netscape 4 | | | | | | █ | | | | | • | |
| ...internet explorer 6 | | █ | | █ | | █ | | | | | • | |
| mojibake filenames | █ | | | • | • | █ | █ | • | • | • | | |
| undecodable filenames | █ | | | • | • | █ | | • | • | | | |
* `webdav` = protocol convenient for mounting a remote server as a local filesystem; see zeroconf:
* `zeroconf` = the server announces itself on the LAN, [automatically appearing](https://user-images.githubusercontent.com/241032/215344737-0eae8d98-9496-4256-9aa8-cd2f6971810d.png) on other zeroconf-capable devices
@@ -190,57 +219,62 @@ symbol legend,
* `a`/copyparty remarks:
* extremely minimal samba/cifs server
* netscape 4 / ie6 support is mostly listed as a joke altho some people have actually found it useful ([ie4 tho](https://user-images.githubusercontent.com/241032/118192791-fb31fe00-b446-11eb-9647-898ea8efc1f7.png))
* `l`/sftpgo translates mojibake filenames into valid utf-8 (information loss)
## server configuration
| feature / software | a | b | c | d | e | f | g | h | i | j | k |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - |
| config from cmd args | █ | | | | | █ | █ | | | █ | |
| config files | █ | █ | █ | | | █ | | █ | | █ | • |
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ |
| same-port http / https | █ | | | | | | | | | | |
| listen multiple ports | █ | | | | | | | | | | |
| virtual file system | █ | █ | █ | | | | █ | | | | |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| config from cmd args | █ | | | | | █ | █ | | | █ | | |
| config files | █ | █ | █ | | | █ | | █ | | █ | • | |
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | |
| same-port http / https | █ | | | | | | | | | | | |
| listen multiple ports | █ | | | | | | | | | | | █ |
| virtual file system | █ | █ | █ | | | | █ | | | | | █ |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | |
* `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead
* `l`/sftpgo:
* config: users must be added through gui / api calls
## server capabilities
| feature / software | a | b | c | d | e | f | g | h | i | j | k |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| single-sign-on | | | | | | | | | • | | |
| token auth | | | | █ | █ | | | | | | |
| per-volume permissions | | | | █ | █ | | █ | | | | |
| per-folder permissions | | | | █ | █ | | | | █ | █ | |
| per-file permissions | | | | █ | █ | | █ | | █ | | |
| per-file passwords | | | | █ | █ | | █ | | █ | | |
| unmap subfolders | | | | | | | █ | | | █ | |
| index.html blocks list | | | | | | | █ | | | • | |
| write-only folders | █ | | | | | | | | | | |
| files stored as-is | █ | | | | | █ | | | | | |
| file versioning | | | | █ | █ | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | |
| file indexing | █ | | | █ | █ | | | | | | |
| ┗ per-volume db | | | | | | | | | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | |
| ┗ db stored out-of-tree | █ | | | | | | | • | • | | |
| ┗ existing file tree | █ | | █ | | | | | | | | |
| file action event hooks | █ | | | | | | | | | █ | |
| one-way folder sync | █ | | | | | | | | | | |
| full sync | | | | █ | █ | | | | | | |
| speed throttle | | █ | █ | | | █ | | | | | |
| anti-bruteforce | █ | █ | | █ | █ | | | | | | |
| dyndns updater | | █ | | | | | | | | | |
| self-updater | | | █ | | | | | | | | |
| log rotation | █ | | | | | | | • | █ | | |
| upload tracking / log | █ | █ | | | | | | | | | |
| curl-friendly ls | █ | | | | | | | | | | |
| curl-friendly upload | █ | | | | | | █ | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| per-account chroot | | | | | | | | | | | | |
| single-sign-on | | | | █ | █ | | | | • | | | |
| token auth | | | | █ | █ | | | █ | | | | |
| 2fa | | | | █ | █ | | | | | | | █ |
| per-volume permissions | | | █ | █ | █ | █ | █ | | █ | █ | | |
| per-folder permissions | | | | █ | █ | | █ | | █ | | | █ |
| per-file permissions | | | | █ | █ | | | | █ | | | |
| per-file passwords | | | | █ | █ | | █ | | █ | | | |
| unmap subfolders | █ | | | | | | | | | █ | | |
| index.html blocks list | | | | | | | █ | | | | | |
| write-only folders | █ | | | | | | | | | | █ | █ |
| files stored as-is | █ | █ | | █ | | █ | █ | | | | | |
| file versioning | | | | █ | █ | | | | | | | |
| file encryption | | | | | | | | | | | | |
| file indexing | █ | | | █ | █ | | | | █ | █ | | |
| ┗ per-volume db | █ | | | | | | | • | • | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | |
| ┗ db stored out-of-tree | █ | | █ | █ | | | | | • | █ | | |
| ┗ existing file tree | █ | | | | | | | | | █ | | |
| file action event hooks | █ | | | | | | | | | █ | | █ |
| one-way folder sync | █ | | | █ | █ | █ | | | | | | |
| full sync | | | | █ | █ | | | | | | | |
| speed throttle | | █ | | | | | | | █ | | | |
| anti-bruteforce | █ | █ | █ | █ | █ | | | | | | | |
| dyndns updater | | █ | | | | | | | | | | |
| self-updater | | | █ | | | | | | | | | |
| log rotation | █ | | █ | █ | | | | | | | | |
| upload tracking / log | █ | | | █ | █ | | | █ | | | | |
| curl-friendly ls | █ | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | █ | █ | • | | | | |
* `unmap subfolders` = "shadowing"; mounting a local folder in the middle of an existing filesystem tree in order to disable access below that path
* `files stored as-is` = uploaded files are trivially readable from the server HDD, not sliced into chunks or in weird folder structures or anything like that
@@ -261,49 +295,52 @@ symbol legend,
* `k`/filegator remarks:
* `per-* permissions` -- can limit a user to one folder and its subfolders
* `unmap subfolders` -- can globally filter a list of paths
* `l`/sftpgo:
* `file action event hooks` also include on-download triggers
* `upload tracking / log` in main logfile
## client features
| feature / software | a | b | c | d | e | f | g | h | i | j | k |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ |
| themes | █ | █ | | █ | | | | | █ | | |
| directory tree nav | █ | | | | █ | | | | █ | | |
| multi-column sorting | █ | | | | | | | | | | |
| thumbnails | █ | | | | | | | █ | █ | | |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | |
| ┗ audio spectrograms | █ | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | | |
| ┗ gapless playback | █ | | | | | | | | • | | |
| ┗ audio equalizer | █ | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | |
| ┗ video transcoding | | | | | | | | | █ | | |
| audio BPM detector | █ | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | |
| search by date / size | █ | | | | █ | | | █ | █ | | |
| search by bpm / key | █ | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | |
| search in file contents | | | | █ | █ | | | | █ | | |
| search by custom parser | █ | | | | | | | | | | |
| find local file | █ | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ |
| markdown viewer | █ | | | | █ | | | | █ | | |
| markdown editor | █ | | | | █ | | | | █ | | |
| readme.md in listing | █ | | | █ | | | | | | | |
| rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | |
| themes | █ | █ | | █ | | | | | █ | | | |
| directory tree nav | █ | | | | █ | | | | █ | | | |
| multi-column sorting | █ | | | | | | | | | | | |
| thumbnails | █ | | | | | | | █ | █ | | | |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | |
| ┗ audio spectrograms | █ | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | | | |
| ┗ gapless playback | █ | | | | | | | | • | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | |
| ┗ video transcoding | | | | | | | | | █ | | | |
| audio BPM detector | █ | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | | |
| search by date / size | █ | | | | █ | | | █ | █ | | | |
| search by bpm / key | █ | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | |
| search in file contents | | | | █ | █ | | | | █ | | | |
| search by custom parser | █ | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | |
| markdown viewer | █ | | | | █ | | | | █ | | | |
| markdown editor | █ | | | | █ | | | | █ | | | |
| readme.md in listing | █ | | | █ | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | |
* `single-page app` = multitasking; possible to continue navigating while uploading
* `audio player » os-integration` = use the [lockscreen](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) or [media hotkeys](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) to play/pause, prev/next song
@@ -319,20 +356,21 @@ symbol legend,
## integration
| feature / software | a | b | c | d | e | f | g | h | i | j | k |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | | |
| discord | █ | | | | | | | | | | |
| ┗ announce uploads | █ | | | | | | | | | | |
| ┗ custom embeds | | | | | | | | | | | |
| sharex | █ | | | █ | | █ | | █ | | | |
| flameshot | | | | | | █ | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | | | |
| discord | █ | | | | | | | | | | | |
| ┗ announce uploads | █ | | | | | | | | | | | |
| ┗ custom embeds | | | | | | | | | | | | |
| sharex | █ | | | █ | | █ | | █ | | | | |
| flameshot | | | | | | █ | | | | | | |
* sharex `` = yes, but does not provide example sharex config
* `a`/copyparty remarks:
* `OS alert on upload` available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/notify.py)
* `discord » announce uploads` available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/discord-announce.py)
* `j`/filebrowser can probably pull those off with command runners similar to copyparty
* `l`/sftpgo has nothing built-in but is very extensible
## another matrix
@@ -350,6 +388,7 @@ symbol legend,
| kodbox | php | ░ gpl3 | 92 MB |
| filebrowser | go | █ apl2 | 20 MB |
| filegator | php | █ mit | • |
| sftpgo | go | ‼ agpl | 44 MB |
| updog | python | █ mit | 17 MB |
| goshs | go | █ mit | 11 MB |
| gimme-that | python | █ mit | 4.8 MB |
@@ -363,7 +402,9 @@ symbol legend,
# reviews
* ✅ are advantages over copyparty
* ⚠️ are disadvantages
* 💾 are what copyparty offers as an alternative
* 🔵 are similarities
* ⚠️ are disadvantages (something copyparty does "better")
## [copyparty](https://github.com/9001/copyparty)
* resumable uploads which are verified server-side
@@ -371,7 +412,7 @@ symbol legend,
* both of the above are surprisingly uncommon features
* very cross-platform (python, no dependencies)
## [hfs2](https://github.com/rejetto/hfs2)
## [hfs2](https://rejetto.com/hfs/)
* the OG, the legend
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
@@ -380,7 +421,7 @@ symbol legend,
* vfs with gui config, per-volume permissions
* starting to show its age, hence the rewrite:
## [hfs3](https://www.rejetto.com/hfs/)
## [hfs3](https://github.com/rejetto/hfs)
* nodejs; cross-platform
* vfs with gui config, per-volume permissions
* still early development, let's revisit later
@@ -395,10 +436,11 @@ symbol legend,
* ⚠️ http/webdav only; no ftp, zeroconf
* ⚠️ less awesome music player
* ⚠️ doesn't run on android or ipads
* ⚠️ AGPL licensed
* ✅ great ui/ux
* ✅ config gui
* ✅ apps (android / iphone)
* copyparty: android upload-only app
* 💾 android upload-only app + iPhone upload shortcut
* ✅ more granular permissions (per-file)
* ✅ search: fulltext indexing of file contents
* ✅ webauthn passwordless authentication
@@ -413,10 +455,11 @@ symbol legend,
* ⚠️ http/webdav only; no ftp, zeroconf
* ⚠️ less awesome music player
* ⚠️ doesn't run on android or ipads
* ⚠️ AGPL licensed
* ✅ great ui/ux
* ✅ config gui
* ✅ apps (android / iphone)
* copyparty: android upload-only app
* 💾 android upload-only app + iPhone upload shortcut
* ✅ more granular permissions (per-file)
* ✅ search: fulltext indexing of file contents
@@ -434,12 +477,12 @@ symbol legend,
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ doesn't support crazy filenames
* ✅ per-url access control (copyparty is per-volume)
* basic but really snappy ui
* upload, rename, delete, ... see feature matrix
* 🔵 basic but really snappy ui
* 🔵 upload, rename, delete, ... see feature matrix
## [chibisafe](https://github.com/chibisafe/chibisafe)
* nodejs; recommends docker
* *it has upload segmenting!*
* 🔵 *it has upload segmenting!*
* ⚠️ but uploads are still not resumable / accelerated / integrity-checked
* ⚠️ not portable
* ⚠️ isolated on-disk file hierarchy, incompatible with other software
@@ -450,13 +493,13 @@ symbol legend,
* ✅ searchable image tags; delete by tag
* ✅ browser extension to upload files to the server
* ✅ reject uploads by file extension
* copyparty: can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* 💾 can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* ✅ token auth (api keys)
## [kodbox](https://github.com/kalcaddle/kodbox)
* this thing is insane
* php; [docker](https://hub.docker.com/r/kodcloud/kodbox)
* *upload segmenting, acceleration, and integrity checking!*
* 🔵 *upload segmenting, acceleration, and integrity checking!*
* ⚠️ but uploads are not resumable(?)
* ⚠️ not portable
* ⚠️ isolated on-disk file hierarchy, incompatible with other software
@@ -483,17 +526,41 @@ symbol legend,
* ⚠️ but no directory tree for navigation
* ✅ user signup
* ✅ command runner / remote shell
* supposed to have write-only folders but couldn't get it to work
* 🔵 supposed to have write-only folders but couldn't get it to work
## [filegator](https://github.com/filegator/filegator)
* php; cross-platform (windows, linux, mac)
* 🔵 *it has upload segmenting and acceleration*
* ⚠️ but uploads are still not integrity-checked
* ⚠️ http only; no webdav / ftp / zeroconf
* ⚠️ does not support symlinks
* ⚠️ expensive download-as-zip feature
* ⚠️ doesn't support crazy filenames
* ⚠️ limited file search
* *it has upload segmenting and acceleration*
* ⚠️ but uploads are still not integrity-checked
## [sftpgo](https://github.com/drakkan/sftpgo)
* go; cross-platform (windows, linux, mac)
* ⚠️ http uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* 🔵 sftp uploads are resumable
* ⚠️ web UI is very minimal + a bit slow
* ⚠️ no thumbnails / image viewer / audio player
* ⚠️ basic file manager (no cut/paste/move)
* ⚠️ no filesystem indexing / search
* ⚠️ doesn't run on phones, tablets
* ⚠️ no zeroconf (mdns/ssdp)
* ⚠️ AGPL licensed
* 🔵 ftp, ftps, webdav
* ✅ sftp server
* ✅ settings gui
* ✅ acme (automatic tls certs)
* 💾 relies on caddy/certbot/acme.sh
* ✅ at-rest encryption
* 💾 relies on LUKS/BitLocker
* ✅ can use S3/GCS as storage backend
* 💾 relies on rclone-mount
* ✅ on-download event hook (otherwise same as copyparty)
* ✅ more extensive permissions control
## [updog](https://github.com/sc0tfree/updog)
* python; cross-platform
@@ -509,9 +576,9 @@ symbol legend,
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ✅ cool clipboard widget
* copyparty: the markdown editor is an ok substitute
* read-only and upload-only modes (same as copyparty's write-only)
* https, webdav
* 💾 the markdown editor is an ok substitute
* 🔵 read-only and upload-only modes (same as copyparty's write-only)
* 🔵 https, webdav, but no ftp
## [gimme-that](https://github.com/nejdetckenobi/gimme-that)
* python, but with c dependencies
@@ -520,8 +587,8 @@ symbol legend,
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ weird folder structure for uploads
* ✅ clamav antivirus check on upload! neat
* optional max-filesize, os-notification on uploads
* copyparty: os-notification available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/notify.py)
* 🔵 optional max-filesize, os-notification on uploads
* 💾 os-notification available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/notify.py)
## [ass](https://github.com/tycrek/ass)
* nodejs; recommends docker
@@ -532,30 +599,52 @@ symbol legend,
* ⚠️ on cloudflare: max upload size 100 MiB
* ✅ token auth
* ✅ gps metadata stripping
* copyparty: possible with [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py)
* 💾 possible with [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py)
* ✅ discord integration (custom embeds, upload webhook)
* copyparty: [upload webhook plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/discord-announce.py)
* 💾 [upload webhook plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/discord-announce.py)
* ✅ reject uploads by mimetype
* copyparty: can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* ✅ can use S3 as storage backend; copyparty relies on rclone-mount for that
* 💾 can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* ✅ can use S3 as storage backend
* 💾 relies on rclone-mount
* ✅ custom 404 pages
## [linx](https://github.com/ZizzyDizzyMC/linx-server/)
* originally [andreimarcu/linx-server](https://github.com/andreimarcu/linx-server) but development has ended
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* some of its unique features have been added to copyparty as former linx users have migrated
* 🔵 some of its unique features have been added to copyparty as former linx users have migrated
* file expiration timers, filename randomization
* ✅ password-protected files
* copyparty: password-protected folders + filekeys to skip the folder password seem to cover most usecases
* 💾 password-protected folders + filekeys to skip the folder password seem to cover most usecases
* ✅ file deletion keys
* ✅ download files as torrents
* ✅ remote uploads (send a link to the server and it downloads it)
* copyparty: available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/wget.py)
* ✅ can use S3 as storage backend; copyparty relies on rclone-mount for that
* 💾 available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/wget.py)
* ✅ can use S3 as storage backend
* 💾 relies on rclone-mount
## [h5ai](https://larsjung.de/h5ai/)
* ⚠️ read only; no upload/move/delete
* ⚠️ search hits the filesystem directly; not indexed/cached
* ✅ slick ui
* ✅ in-browser qr generator to share URLs
* 🔵 directory tree, image viewer, thumbnails, download-as-tar
## [autoindex](https://github.com/nielsAD/autoindex)
* ⚠️ read only; no upload/move/delete
* ✅ directory cache for faster browsing of cloud storage
* 💾 local index/cache for recursive search (names/attrs/tags), but not for browsing
## [miniserve](https://github.com/svenstaro/miniserve)
* rust; cross-platform (windows, linux, mac)
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ no thumbnails / image viewer / audio player / file manager
* ⚠️ no filesystem indexing / search
* 🔵 upload, tar/zip download, qr-code
* ✅ faster at loading huge folders
# briefly considered
* [pydio](https://github.com/pydio/cells): python/agpl3, looks great, fantastic ux -- but needs mariadb, systemwide install
* [gossa](https://github.com/pldubouilh/gossa): go/mit, minimalistic, basic file upload, text editor, mkdir and rename (no delete/move)
* [h5ai](https://larsjung.de/h5ai/): php/mit, slick ui, image viewer, directory tree, no upload feature

42
flake.lock generated Normal file
View File

@@ -0,0 +1,42 @@
{
"nodes": {
"flake-utils": {
"locked": {
"lastModified": 1678901627,
"narHash": "sha256-U02riOqrKKzwjsxc/400XnElV+UtPUQWpANPlyazjH0=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "93a2b84fc4b70d9e089d029deacc3583435c2ed6",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1680334310,
"narHash": "sha256-ISWz16oGxBhF7wqAxefMPwFag6SlsA9up8muV79V9ck=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "884e3b68be02ff9d61a042bc9bd9dd2a358f95da",
"type": "github"
},
"original": {
"id": "nixpkgs",
"ref": "nixos-22.11",
"type": "indirect"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
}
},
"root": "root",
"version": 7
}

28
flake.nix Normal file
View File

@@ -0,0 +1,28 @@
{
inputs = {
nixpkgs.url = "nixpkgs/nixos-22.11";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }:
{
nixosModules.default = ./contrib/nixos/modules/copyparty.nix;
overlays.default = self: super: {
copyparty =
self.python3.pkgs.callPackage ./contrib/package/nix/copyparty {
ffmpeg = self.ffmpeg-full;
};
};
} // flake-utils.lib.eachDefaultSystem (system:
let
pkgs = import nixpkgs {
inherit system;
overlays = [ self.overlays.default ];
};
in {
packages = {
inherit (pkgs) copyparty;
default = self.packages.${system}.copyparty;
};
});
}
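for reference, a minimal sketch of consuming this flake (assuming it is fetched as `github:9001/copyparty` and that the nix package exposes a `copyparty` entrypoint -- neither is confirmed by this diff):
```bash
# requires flakes to be enabled: experimental-features = nix-command flakes
nix build github:9001/copyparty      # builds packages.<system>.default, i.e. copyparty
./result/bin/copyparty --version     # hypothetical entrypoint name; may differ in the actual package
```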

View File

@@ -3,12 +3,21 @@ FROM alpine:3.16
WORKDIR /z
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
ver_hashwasm=4.9.0 \
ver_marked=4.2.5 \
ver_marked=4.3.0 \
ver_mde=2.18.0 \
ver_codemirror=5.65.11 \
ver_codemirror=5.65.12 \
ver_fontawesome=5.13.0 \
ver_prism=1.29.0 \
ver_zopfli=1.0.3
# versioncheck:
# https://github.com/markedjs/marked/releases
# https://github.com/Ionaru/easy-markdown-editor/tags
# https://github.com/codemirror/codemirror5/releases
# https://github.com/Daninet/hash-wasm/releases
# https://github.com/openpgpjs/asmcrypto.js
# https://github.com/google/zopfli/tags
# download;
# the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
@@ -22,6 +31,7 @@ RUN mkdir -p /z/dist/no-pk \
&& wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
&& wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
&& wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
&& wget https://github.com/PrismJS/prism/archive/refs/tags/v$ver_prism.tar.gz -O prism.tgz \
&& (mkdir hash-wasm \
&& cd hash-wasm \
&& unzip ../hash-wasm.zip) \
@@ -39,14 +49,11 @@ RUN mkdir -p /z/dist/no-pk \
&& cd easy-markdown-editor* \
&& npm install \
&& npm i gulp-cli -g ) \
&& tar -xf prism.tgz \
&& unzip fontawesome.zip \
&& tar -xf zopfli.tgz
# todo
# https://prismjs.com/download.html#themes=prism-funky&languages=markup+css+clike+javascript+autohotkey+bash+basic+batch+c+csharp+cpp+cmake+diff+docker+go+ini+java+json+kotlin+latex+less+lisp+lua+makefile+objectivec+perl+powershell+python+r+jsx+ruby+rust+sass+scss+sql+swift+systemd+toml+typescript+vbnet+verilog+vhdl+yaml&plugins=line-highlight+line-numbers+autolinker
# build fonttools (which needs zopfli)
RUN tar -xf zopfli.tgz \
&& cd zopfli* \
@@ -121,6 +128,12 @@ COPY shiftbase.py /z
RUN /bin/ash /z/mini-fa.sh
# build prismjs
COPY genprism.py /z
COPY genprism.sh /z
RUN ./genprism.sh $ver_prism
# compress
COPY zopfli.makefile /z/dist/Makefile
RUN cd /z/dist \

View File

@@ -1,10 +1,9 @@
self := $(dir $(abspath $(lastword $(MAKEFILE_LIST))))
vend := $(self)/../../copyparty/web/deps
# prefers podman-docker (optionally rootless) over actual docker/moby
all:
-service docker start
-systemctl start docker
docker build -t build-copyparty-deps .
rm -rf $(vend)
@@ -14,6 +13,7 @@ all:
docker run --rm -i build-copyparty-deps:latest | \
tar -xvC $(vend) --strip-components=1
touch $(vend)/__init__.py
chown -R `stat $(self) -c %u:%g` $(vend)
purge:

198
scripts/deps-docker/genprism.py Executable file
View File

@@ -0,0 +1,198 @@
#!/usr/bin/env python3
# author: @chinponya
import argparse
import json
from pathlib import Path
from urllib.parse import urlparse, parse_qsl
def read_json(path):
return json.loads(path.read_text())
def get_prism_version(prism_path):
package_json_path = prism_path / "package.json"
package_json = read_json(package_json_path)
return package_json["version"]
def get_prism_components(prism_path):
components_json_path = prism_path / "components.json"
components_json = read_json(components_json_path)
return components_json
def parse_prism_configuration(url_str):
url = urlparse(url_str)
# prismjs.com uses a non-standard query string-like encoding
query = {k: v.split(" ") for k, v in parse_qsl(url.fragment)}
return query
def paths_of_component(prism_path, kind, components, name, minified):
component = components[kind][name]
meta = components[kind]["meta"]
path_format = meta["path"]
path_base = prism_path / path_format.replace("{id}", name)
if isinstance(component, str):
# 'core' component has a different shape, so we convert it to be consistent
component = {"title": component}
if meta.get("noCSS") or component.get("noCSS"):
extensions = ["js"]
elif kind == "themes":
extensions = ["css"]
else:
extensions = ["js", "css"]
if path_base.is_dir():
result = {ext: path_base / f"{name}.{ext}" for ext in extensions}
elif path_base.suffix:
ext = path_base.suffix.replace(".", "")
result = {ext: path_base}
else:
result = {ext: path_base.with_suffix(f".{ext}") for ext in extensions}
if minified:
result = {
ext: path.with_suffix(".min" + path.suffix) for ext, path in result.items()
}
return result
def read_component_contents(kv_paths):
return {k: path.read_text() for k, path in kv_paths.items()}
def get_language_dependencies(components, name):
dependencies = components["languages"][name].get("require")
if isinstance(dependencies, list):
return dependencies
elif isinstance(dependencies, str):
return [dependencies]
else:
return []
def make_header(prism_path, url):
version = get_prism_version(prism_path)
header = f"/* PrismJS {version}\n{url} */"
return {"js": header, "css": header}
def make_core(prism_path, components, minified):
kv_paths = paths_of_component(prism_path, "core", components, "core", minified)
return read_component_contents(kv_paths)
def make_theme(prism_path, components, name, minified):
kv_paths = paths_of_component(prism_path, "themes", components, name, minified)
return read_component_contents(kv_paths)
def make_language(prism_path, components, name, minified):
kv_paths = paths_of_component(prism_path, "languages", components, name, minified)
return read_component_contents(kv_paths)
def make_languages(prism_path, components, names, minified):
names_with_dependencies = sum(
([*get_language_dependencies(components, name), name] for name in names), []
)
seen = set()
names_with_dependencies = [
x for x in names_with_dependencies if not (x in seen or seen.add(x))
]
kv_code = [
make_language(prism_path, components, name, minified)
for name in names_with_dependencies
]
return kv_code
def make_plugin(prism_path, components, name, minified):
kv_paths = paths_of_component(prism_path, "plugins", components, name, minified)
return read_component_contents(kv_paths)
def make_plugins(prism_path, components, names, minified):
kv_code = [make_plugin(prism_path, components, name, minified) for name in names]
return kv_code
def make_code(prism_path, url, minified):
components = get_prism_components(prism_path)
configuration = parse_prism_configuration(url)
theme_name = configuration["themes"][0]
code = [
make_header(prism_path, url),
make_core(prism_path, components, minified),
make_theme(prism_path, components, theme_name, minified),
]
if configuration.get("languages"):
code.extend(
make_languages(prism_path, components, configuration["languages"], minified)
)
if configuration.get("plugins"):
code.extend(
make_plugins(prism_path, components, configuration["plugins"], minified)
)
return code
def join_code(kv_code):
result = {"js": "", "css": ""}
for row in kv_code:
for key, code in row.items():
result[key] += code
result[key] += "\n"
return result
def write_code(kv_code, js_out, css_out):
code = join_code(kv_code)
with js_out.open("w") as f:
f.write(code["js"])
print(f"written {js_out}")
with css_out.open("w") as f:
f.write(code["css"])
print(f"written {css_out}")
def parse_args():
# fmt: off
parser = argparse.ArgumentParser()
parser.add_argument("url", help="configured prism download url")
parser.add_argument("--dir", type=Path, default=Path("."), help="prism repo directory")
parser.add_argument("--minify", default=True, action=argparse.BooleanOptionalAction, help="use minified files",)
parser.add_argument("--js-out", type=Path, default=Path("prism.js"), help="JS output file path")
parser.add_argument("--css-out", type=Path, default=Path("prism.css"), help="CSS output file path")
# fmt: on
args = parser.parse_args()
return args
def main():
args = parse_args()
code = make_code(args.dir, args.url, args.minify)
write_code(code, args.js_out, args.css_out)
if __name__ == "__main__":
main()

66
scripts/deps-docker/genprism.sh Executable file
View File

@@ -0,0 +1,66 @@
#!/bin/bash
set -e
langs=(
markup
css
clike
javascript
autohotkey
bash
basic
batch
c
csharp
cpp
cmake
diff
docker
elixir
glsl
go
ini
java
json
kotlin
latex
less
lisp
lua
makefile
matlab
moonscript
nim
objectivec
perl
powershell
python
r
jsx
ruby
rust
sass
scss
sql
swift
systemd
toml
typescript
vbnet
verilog
vhdl
yaml
zig
)
slangs="${langs[*]}"
slangs="${slangs// /+}"
for theme in prism-funky prism ; do
u="https://prismjs.com/download.html#themes=$theme&languages=$slangs&plugins=line-highlight+line-numbers+autolinker"
echo "$u"
./genprism.py --dir prism-$1 --js-out prism.js --css-out $theme.css "$u"
done
mv prism-funky.css prismd.css
mv prismd.css prism.css prism.js /z/dist/

View File

@@ -1,5 +1,5 @@
diff --git a/src/Lexer.js b/src/Lexer.js
adds linetracking to marked.js v4.2.3;
adds linetracking to marked.js v4.3.0;
add data-ln="%d" to most tags, %d is the source markdown line
--- a/src/Lexer.js
+++ b/src/Lexer.js
@@ -206,7 +206,6 @@ index a22a2bc..884ad66 100644
// Run any renderer extensions
if (this.options.extensions && this.options.extensions.renderers && this.options.extensions.renderers[token.type]) {
diff --git a/src/Renderer.js b/src/Renderer.js
index 7c36a75..aa1a53a 100644
--- a/src/Renderer.js
+++ b/src/Renderer.js
@@ -11,6 +11,12 @@ export class Renderer {
@@ -290,10 +289,9 @@ index 7c36a75..aa1a53a 100644
if (title) {
out += ` title="${title}"`;
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
index e8a69b6..2cc772b 100644
--- a/src/Tokenizer.js
+++ b/src/Tokenizer.js
@@ -312,4 +312,7 @@ export class Tokenizer {
@@ -333,4 +333,7 @@ export class Tokenizer {
const l = list.items.length;
+ // each nested list gets +1 ahead; this hack makes every listgroup -1 but atleast it doesn't get infinitely bad

View File

@@ -1,4 +1,5 @@
diff --git a/src/Lexer.js b/src/Lexer.js
strip some features
--- a/src/Lexer.js
+++ b/src/Lexer.js
@@ -7,5 +7,5 @@ import { repeatString } from './helpers.js';
@@ -56,7 +57,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
--- a/src/Tokenizer.js
+++ b/src/Tokenizer.js
@@ -352,14 +352,7 @@ export class Tokenizer {
@@ -367,14 +367,7 @@ export class Tokenizer {
type: 'html',
raw: cap[0],
- pre: !this.options.sanitizer
@@ -72,7 +73,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
- }
return token;
}
@@ -502,15 +495,9 @@ export class Tokenizer {
@@ -517,15 +510,9 @@ export class Tokenizer {
return {
- type: this.options.sanitize
@@ -90,7 +91,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
+ text: cap[0]
};
}
@@ -699,10 +686,10 @@ export class Tokenizer {
@@ -714,10 +701,10 @@ export class Tokenizer {
}
- autolink(src, mangle) {
@@ -103,7 +104,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
+ text = escape(cap[1]);
href = 'mailto:' + text;
} else {
@@ -727,10 +714,10 @@ export class Tokenizer {
@@ -742,10 +729,10 @@ export class Tokenizer {
}
- url(src, mangle) {
@@ -116,7 +117,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
+ text = escape(cap[0]);
href = 'mailto:' + text;
} else {
@@ -764,12 +751,12 @@ export class Tokenizer {
@@ -779,12 +766,12 @@ export class Tokenizer {
}
- inlineText(src, smartypants) {
@@ -135,8 +136,8 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
diff --git a/src/defaults.js b/src/defaults.js
--- a/src/defaults.js
+++ b/src/defaults.js
@@ -10,11 +10,7 @@ export function getDefaults() {
highlight: null,
@@ -11,11 +11,7 @@ export function getDefaults() {
hooks: null,
langPrefix: 'language-',
- mangle: true,
pedantic: false,
@@ -170,7 +171,7 @@ diff --git a/src/helpers.js b/src/helpers.js
+export function cleanUrl(base, href) {
if (base && !originIndependentUrl.test(href)) {
href = resolveUrl(base, href);
@@ -250,10 +237,4 @@ export function findClosingBracket(str, b) {
@@ -233,10 +220,4 @@ export function findClosingBracket(str, b) {
}
-export function checkSanitizeDeprecation(opt) {
@@ -185,30 +186,25 @@ diff --git a/src/marked.js b/src/marked.js
--- a/src/marked.js
+++ b/src/marked.js
@@ -7,5 +7,4 @@ import { Slugger } from './Slugger.js';
import { Hooks } from './Hooks.js';
import {
merge,
- checkSanitizeDeprecation,
escape
} from './helpers.js';
@@ -35,5 +34,4 @@ export function marked(src, opt, callback) {
opt = merge({}, marked.defaults, opt || {});
- checkSanitizeDeprecation(opt);
if (callback) {
@@ -318,5 +316,4 @@ marked.parseInline = function(src, opt) {
opt = merge({}, marked.defaults, opt || {});
- checkSanitizeDeprecation(opt);
try {
@@ -327,5 +324,5 @@ marked.parseInline = function(src, opt) {
return Parser.parseInline(tokens, opt);
} catch (e) {
@@ -18,5 +17,5 @@ import {
function onError(silent, async, callback) {
return (e) => {
- e.message += '\nPlease report this to https://github.com/markedjs/marked.';
+ e.message += '\nmake issue @ https://github.com/9001/copyparty';
if (opt.silent) {
return '<p>An error occurred:</p><pre>'
if (silent) {
@@ -65,6 +64,4 @@ function parseMarkdown(lexer, parser) {
}
- checkSanitizeDeprecation(opt);
-
if (opt.hooks) {
opt.hooks.options = opt;
diff --git a/test/bench.js b/test/bench.js
--- a/test/bench.js
+++ b/test/bench.js
@@ -250,70 +246,70 @@ diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
--- a/test/unit/Lexer-spec.js
+++ b/test/unit/Lexer-spec.js
@@ -712,5 +712,5 @@ paragraph
@@ -794,5 +794,5 @@ paragraph
});
- it('sanitize', () => {
+ /*it('sanitize', () => {
expectTokens({
md: '<div>html</div>',
@@ -730,5 +730,5 @@ paragraph
@@ -812,5 +812,5 @@ paragraph
]
});
- });
+ });*/
});
@@ -810,5 +810,5 @@ paragraph
@@ -892,5 +892,5 @@ paragraph
});
- it('html sanitize', () => {
+ /*it('html sanitize', () => {
expectInlineTokens({
md: '<div>html</div>',
@@ -818,5 +818,5 @@ paragraph
@@ -900,5 +900,5 @@ paragraph
]
});
- });
+ });*/
it('link', () => {
@@ -1129,5 +1129,5 @@ paragraph
@@ -1211,5 +1211,5 @@ paragraph
});
- it('autolink mangle email', () => {
+ /*it('autolink mangle email', () => {
expectInlineTokens({
md: '<test@example.com>',
@@ -1149,5 +1149,5 @@ paragraph
@@ -1231,5 +1231,5 @@ paragraph
]
});
- });
+ });*/
it('url', () => {
@@ -1186,5 +1186,5 @@ paragraph
@@ -1268,5 +1268,5 @@ paragraph
});
- it('url mangle email', () => {
+ /*it('url mangle email', () => {
expectInlineTokens({
md: 'test@example.com',
@@ -1206,5 +1206,5 @@ paragraph
@@ -1288,5 +1288,5 @@ paragraph
]
});
- });
+ });*/
});
@@ -1222,5 +1222,5 @@ paragraph
@@ -1304,5 +1304,5 @@ paragraph
});
- describe('smartypants', () => {
+ /*describe('smartypants', () => {
it('single quotes', () => {
expectInlineTokens({
@@ -1292,5 +1292,5 @@ paragraph
@@ -1374,5 +1374,5 @@ paragraph
});
});
- });

View File

@@ -1,5 +1,10 @@
FROM alpine:latest
WORKDIR /z
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-ac" \
org.opencontainers.image.description="copyparty with Pillow and FFmpeg (image/audio/video thumbnails, audio transcoding, media tags)"
RUN apk --no-cache add \
wget \
@@ -11,4 +16,5 @@ RUN apk --no-cache add \
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]

View File

@@ -1,20 +1,27 @@
FROM alpine:latest
WORKDIR /z
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-dj" \
org.opencontainers.image.description="copyparty with all optional dependencies, including musical key / bpm detection"
COPY i/bin/mtag/install-deps.sh ./
COPY i/bin/mtag/audio-bpm.py /mtag/
COPY i/bin/mtag/audio-key.py /mtag/
RUN apk add -U \
wget \
py3-pillow py3-pip \
py3-pillow py3-pip py3-cffi \
ffmpeg \
vips-jxl vips-heif vips-poppler vips-magick \
py3-numpy fftw libsndfile \
vamp-sdk vamp-sdk-libs \
&& python3 -m pip install pyvips \
&& apk --no-cache add -t .bd \
bash wget gcc g++ make cmake patchelf \
python3-dev ffmpeg-dev fftw-dev libsndfile-dev \
py3-wheel py3-numpy-dev \
vamp-sdk-dev \
&& bash install-deps.sh \
&& apk del py3-pip .bd \
&& rm -rf /var/cache/apk/* \
@@ -26,4 +33,5 @@ RUN apk add -U \
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]

View File

@@ -1,13 +1,19 @@
FROM alpine:latest
WORKDIR /z
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-im" \
org.opencontainers.image.description="copyparty with Pillow and Mutagen (image thumbnails, media tags)"
RUN apk --no-cache add \
wget \
py3-pillow \
py3-pillow py3-mutagen \
&& mkdir /cfg /w \
&& chmod 777 /cfg /w \
&& echo % /cfg > initcfg
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]

View File

@@ -1,9 +1,14 @@
FROM alpine:latest
WORKDIR /z
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-iv" \
org.opencontainers.image.description="copyparty with Pillow, FFmpeg, libvips (image/audio/video thumbnails, audio transcoding, media tags)"
RUN apk --no-cache add \
wget \
py3-pillow py3-pip \
py3-pillow py3-pip py3-cffi \
ffmpeg \
vips-jxl vips-heif vips-poppler vips-magick \
&& python3 -m pip install pyvips \
@@ -14,4 +19,5 @@ RUN apk --no-cache add \
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]

View File

@@ -1,5 +1,10 @@
FROM alpine:latest
WORKDIR /z
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-min" \
org.opencontainers.image.description="just copyparty, no thumbnails / media tags / audio transcoding"
RUN apk --no-cache add \
python3 \
@@ -9,4 +14,5 @@ RUN apk --no-cache add \
COPY i/dist/copyparty-sfx.py ./
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]

View File

@@ -1,5 +1,10 @@
FROM alpine:latest
WORKDIR /z
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
org.opencontainers.image.licenses="MIT" \
org.opencontainers.image.title="copyparty-min-pip" \
org.opencontainers.image.description="just copyparty, no thumbnails, no media tags, no audio transcoding"
RUN apk --no-cache add python3 py3-pip \
&& python3 -m pip install copyparty \
@@ -9,4 +14,5 @@ RUN apk --no-cache add python3 py3-pip \
&& echo % /cfg > initcfg
WORKDIR /w
EXPOSE 3923
ENTRYPOINT ["python3", "-m", "copyparty", "-c", "/z/initcfg"]

View File

@@ -48,7 +48,7 @@ push:
clean:
-docker kill `docker ps -q`
-docker rm `docker ps -qa`
-docker rmi `docker images -a | awk '/^<none>/{print$$3}'`
-docker rmi -f `docker images -a | awk '/<none>/{print$$3}'`
hclean:
-docker kill `docker ps -q`

View File

@@ -1,5 +1,5 @@
copyparty is available in these repos:
* https://hub.docker.com/r/copyparty
* https://hub.docker.com/u/copyparty
* https://github.com/9001?tab=packages&repo_name=copyparty
@@ -14,6 +14,7 @@ docker run --rm -it -u 1000 -p 3923:3923 -v /mnt/nas:/w -v $PWD/cfgdir:/cfg copy
* `/cfg` is an optional folder with zero or more config files (*.conf) to load
* `copyparty/ac` is the recommended [image edition](#editions)
* you can download the image from github instead by replacing `copyparty/ac` with `ghcr.io/9001/copyparty-ac`
* if you are using rootless podman, remove `-u 1000`
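for example, a sketch combining the two notes above (ports and mount paths copied from the example run command; not an official recipe):
```bash
# same container, pulled from the github registry instead of docker hub
docker run --rm -it -u 1000 -p 3923:3923 -v /mnt/nas:/w -v "$PWD/cfgdir":/cfg ghcr.io/9001/copyparty-ac

# rootless podman: same thing minus the -u 1000
podman run --rm -it -p 3923:3923 -v /mnt/nas:/w -v "$PWD/cfgdir":/cfg docker.io/copyparty/ac
```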
i'm unfamiliar with docker-compose and alternatives so let me know if this section could be better 🙏
@@ -32,13 +33,17 @@ the recommended way to configure copyparty inside a container is to mount a fold
with image size after installation and when gzipped
* `min` (57 MiB, 20 gz) is just copyparty itself
* `im` (69 MiB, 24 gz) can create thumbnails using pillow (pics only)
* `ac` (163 MiB, 56 gz) is `im` plus ffmpeg for video/audio thumbnails + audio transcoding
* `iv` (211 MiB, 73 gz) is `ac` plus vips for faster heif / avif / jxl thumbnails
* `dj` (309 MiB, 104 gz) is `iv` plus beatroot/keyfinder to detect musical keys and bpm
* [`min`](https://hub.docker.com/r/copyparty/min) (57 MiB, 20 gz) is just copyparty itself
* [`im`](https://hub.docker.com/r/copyparty/im) (70 MiB, 25 gz) can thumbnail images with pillow, parse media files with mutagen
* [`ac`](https://hub.docker.com/r/copyparty/ac) (163 MiB, 56 gz) is `im` plus ffmpeg for video/audio thumbs + audio transcoding + better tags
* [`iv`](https://hub.docker.com/r/copyparty/iv) (211 MiB, 73 gz) is `ac` plus vips for faster heif / avif / jxl thumbnails
* [`dj`](https://hub.docker.com/r/copyparty/dj) (309 MiB, 104 gz) is `iv` plus beatroot/keyfinder to detect musical keys and bpm
`ac` is recommended since the additional features available in `iv` and `dj` are rarely useful
[`ac` is recommended](https://hub.docker.com/r/copyparty/ac) since the additional features available in `iv` and `dj` are rarely useful
most editions support `x86`, `x86_64`, `armhf`, `aarch64`, `ppc64le`, `s390x`
* `dj` doesn't run on `ppc64le`, `s390x`, `armhf`
* `iv` doesn't run on `ppc64le`, `s390x`
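picking an edition is just a matter of which tag you pull, for example (a sketch; tag names assumed to match the list above):
```bash
docker pull copyparty/min   # 57 MiB image; just copyparty, no thumbnails / media tags
docker pull copyparty/ac    # 163 MiB; the recommended default
docker pull copyparty/dj    # 309 MiB; adds musical key / bpm detection
```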
## detecting bpm and musical key
@@ -71,4 +76,9 @@ or using commandline arguments,
# build the images yourself
put `copyparty-sfx.py` into `../dist/` (or [build that from scratch](../../docs/devnotes.md#just-the-sfx) too) then run `make`
basically `./make.sh hclean pull img push` but see [devnotes.md](./devnotes.md)
# notes
* currently unable to play [tracker music](https://en.wikipedia.org/wiki/Module_file) (mod/s3m/xm/it/...) -- will be fixed in june 2023 (Alpine 3.18)

View File

@@ -0,0 +1,19 @@
# building the images yourself
```bash
./make.sh hclean pull img push
```
will download the latest copyparty-sfx.py from github unless you have [built it from scratch](../../docs/devnotes.md#just-the-sfx) and then build all the images based on that
deprecated alternative: run `make` to use the makefile; however, that uses docker instead of podman and only builds x86_64
`make.sh` is necessarily(?) overengineered because:
* podman keeps burning dockerhub pulls by not using the cached images (`--pull=never` does not apply to manifests)
* podman cannot build from a local manifest, only local images or remote manifests
but I don't really know what i'm doing here 💩
* auth for pushing images to repos;
`podman login docker.io`
`podman login ghcr.io -u 9001`
[about gchq](https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry) (takes a classic token as password)

152
scripts/docker/make.sh Executable file
View File

@@ -0,0 +1,152 @@
#!/bin/bash
set -e
[ $(id -u) -eq 0 ] && {
echo dont root
exit 1
}
sarchs="386 amd64 arm/v7 arm64/v8 ppc64le s390x"
archs="amd64 arm s390x 386 arm64 ppc64le"
imgs="dj iv min im ac"
dhub_order="iv dj min im ac"
ghcr_order="ac im min dj iv"
ngs=(
iv-{ppc64le,s390x}
dj-{ppc64le,s390x,arm}
)
for v in "$@"; do
[ "$v" = clean ] && clean=1
[ "$v" = hclean ] && hclean=1
[ "$v" = purge ] && purge=1
[ "$v" = pull ] && pull=1
[ "$v" = img ] && img=1
[ "$v" = push ] && push=1
[ "$v" = sh ] && sh=1
done
[ $# -gt 0 ] || {
echo "need list of commands, for example: hclean pull img push"
exit 1
}
[ $sh ] && {
printf "\n\033[1;31mopening a shell in the most recently created docker image\033[0m\n"
podman run --rm -it --entrypoint /bin/ash $(podman images -aq | head -n 1)
exit $?
}
filt=
[ $clean ] && filt='/<none>/{print$$3}'
[ $hclean ] && filt='/localhost\/copyparty-|^<none>.*localhost\/alpine-/{print$3}'
[ $purge ] && filt='NR>1{print$3}'
[ $filt ] && {
[ $purge ] && {
podman kill $(podman ps -q) || true
podman rm $(podman ps -qa) || true
}
podman rmi -f $(podman images -a --history | awk "$filt") || true
podman rmi $(podman images -a --history | awk '/^<none>.*<none>.*-tmp:/{print$3}') || true
}
[ $pull ] && {
for a in $sarchs; do # arm/v6
podman pull --arch=$a alpine:latest
done
podman images --format "{{.ID}} {{.History}}" |
awk '/library\/alpine/{print$1}' |
while read id; do
tag=alpine-$(podman inspect $id | jq -r '.[]|.Architecture' | tr / -)
[ -e .tag-$tag ] && continue
touch .tag-$tag
echo tagging $tag
podman untag $id
podman tag $id $tag
done
rm .tag-*
}
[ $img ] && {
fp=../../dist/copyparty-sfx.py
[ -e $fp ] || {
echo downloading copyparty-sfx.py ...
mkdir -p ../../dist
wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py -O $fp
}
# kill abandoned builders
ps aux | awk '/bin\/qemu-[^-]+-static/{print$2}' | xargs -r kill -9
# grab deps
rm -rf i err
mkdir i
tar -cC../.. dist/copyparty-sfx.py bin/mtag | tar -xvCi
for i in $imgs; do
podman rm copyparty-$i || true # old manifest
for a in $archs; do
[[ " ${ngs[*]} " =~ " $i-$a " ]] && continue # known incompat
# wait for a free slot
while true; do
touch .blk
[ $(jobs -p | wc -l) -lt $(nproc) ] && break
while [ -e .blk ]; do sleep 0.2; done
done
aa="$(printf '%7s' $a)"
# arm takes forever so make it top priority
[ ${a::3} == arm ] && nice= || nice=nice
# --pull=never does nothing at all btw
(set -x
$nice podman build \
--pull=never \
--from localhost/alpine-$a \
-t copyparty-$i-$a \
-f Dockerfile.$i . ||
(echo $? $i-$a >> err)
rm -f .blk
) 2> >(tee $a.err | sed "s/^/$aa:/" >&2) > >(tee $a.out | sed "s/^/$aa:/") &
done
[ -e err ] && {
echo something died,
cat err
pkill -P $$
exit 1
}
for a in $archs; do
rm -f $a.{out,err}
done
done
wait
[ -e err ] && {
echo something died,
cat err
pkill -P $$
exit 1
}
# avoid podman race-condition by creating manifest manually --
# Error: creating image to hold manifest list: image name "localhost/copyparty-dj:latest" is already associated with image "[0-9a-f]{64}": that name is already in use
for i in $imgs; do
variants=
for a in $archs; do
[[ " ${ngs[*]} " =~ " $i-$a " ]] && continue
variants="$variants containers-storage:localhost/copyparty-$i-$a"
done
podman manifest create copyparty-$i $variants
done
}
[ $push ] && {
for i in $dhub_order; do
podman manifest push --all copyparty-$i copyparty/$i:latest
done
for i in $ghcr_order; do
podman manifest push --all copyparty-$i ghcr.io/9001/copyparty-$i:latest
done
}
echo ok

View File

@@ -184,9 +184,9 @@ necho() {
mv {markupsafe,jinja2} j2/
necho collecting pyftpdlib
f="../build/pyftpdlib-1.5.6.tar.gz"
f="../build/pyftpdlib-1.5.7.tar.gz"
[ -e "$f" ] ||
(url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.6.tar.gz;
(url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.7.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
tar -zxf $f

View File

@@ -64,6 +64,8 @@ git archive hovudstraum | tar -xC "$rls_dir"
echo ">>> export untracked deps"
tar -c copyparty/web/deps | tar -xC "$rls_dir"
scripts/genlic.sh "$rls_dir/copyparty/res/COPYING.txt"
cd "$rls_dir"
find -type d -exec chmod 755 '{}' \+
find -type f -exec chmod 644 '{}' \+
@@ -93,7 +95,7 @@ rm \
.gitattributes \
.gitignore
mv LICENSE LICENSE.txt
cp -pv LICENSE LICENSE.txt
# the regular cleanup memes
find -name '*.pyc' -delete

View File

@@ -40,4 +40,10 @@ update_arch_pkgbuild() {
rm -rf x
}
update_nixos_pin() {
( cd $self/../contrib/package/nix/copyparty;
./update.py $self/../dist/copyparty-sfx.py )
}
update_arch_pkgbuild
update_nixos_pin

View File

@@ -1,7 +1,9 @@
builds a fully standalone copyparty.exe compatible with 32bit win7-sp1 and later
builds copyparty32.exe, fully standalone, compatible with 32bit win7-sp1 and later
requires a win7 vm which has never been connected to the internet and a host-only network with the linux host at 192.168.123.1
copyparty.exe is built by a win10-ltsc-2021 vm with similar setup
first-time setup steps in notes.txt
run build.sh in the vm to fetch src + compile + push a new exe to the linux host for manual publishing

View File

@@ -9,7 +9,17 @@ tee build2.sh | cmp build.sh && rm build2.sh || {
[[ $r =~ [yY] ]] && mv build{2,}.sh && exec ./build.sh
}
uname -s | grep WOW64 && m=64 || m=
[ -e up2k.sh ] && ./up2k.sh
uname -s | grep WOW64 && m=64 || m=32
uname -s | grep NT-10 && w10=1 || w7=1
[ $w7 ] && pyv=37 || pyv=311
esuf=
[ $w7 ] && [ $m = 32 ] && esuf=32
[ $w7 ] && [ $m = 64 ] && esuf=-winpe64
appd=$(cygpath.exe "$APPDATA")
spkgs=$appd/Python/Python$pyv/site-packages
dl() { curl -fkLO "$1"; }
@@ -25,14 +35,23 @@ python copyparty-sfx.py --version
rm -rf mods; mkdir mods
cp -pR $TEMP/pe-copyparty/copyparty/ $TEMP/pe-copyparty/{ftp,j2}/* mods/
[ $w10 ] && rm -rf mods/{jinja2,markupsafe}
af() { awk "$1" <$2 >tf; mv tf "$2"; }
rm -rf mods/magic/
sed -ri /pickle/d mods/jinja2/_compat.py
sed -ri '/(bccache|PackageLoader)/d' mods/jinja2/__init__.py
af '/^class/{s=0}/^class PackageLoader/{s=1}!s' mods/jinja2/loaders.py
[ $w7 ] && {
sed -ri /pickle/d mods/jinja2/_compat.py
sed -ri '/(bccache|PackageLoader)/d' mods/jinja2/__init__.py
af '/^class/{s=0}/^class PackageLoader/{s=1}!s' mods/jinja2/loaders.py
}
[ $w10 ] && {
sed -ri '/(bccache|PackageLoader)/d' $spkgs/jinja2/__init__.py
for f in nodes async_utils; do
sed -ri 's/\binspect\b/os/' $spkgs/jinja2/$f.py
done
}
sed -ri /fork_process/d mods/pyftpdlib/servers.py
af '/^class _Base/{s=1}!s' mods/pyftpdlib/authorizers.py
@@ -43,6 +62,7 @@ read a b c d _ < <(
sed -r 's/[^0-9]+//;s/[" )]//g;s/[-,]/ /g;s/$/ 0/'
)
sed -r 's/1,2,3,0/'$a,$b,$c,$d'/;s/1\.2\.3/'$a.$b.$c/ <loader.rc >loader.rc2
sed -ri s/copyparty.exe/copyparty$esuf.exe/ loader.rc2
excl=(
copyparty.broker_mp
@@ -59,7 +79,12 @@ excl=(
urllib.robotparser
zipfile
)
false || excl+=(
[ $w10 ] && excl+=(
PIL.ImageQt
PIL.ImageShow
PIL.ImageTk
PIL.ImageWin
) || excl+=(
PIL
PIL.ExifTags
PIL.Image
@@ -68,7 +93,7 @@ false || excl+=(
)
excl=( "${excl[@]/#/--exclude-module }" )
$APPDATA/python/python37/scripts/pyinstaller \
$APPDATA/python/python$pyv/scripts/pyinstaller \
-y --clean -p mods --upx-dir=. \
${excl[*]} \
--version-file loader.rc2 -i loader.ico -n copyparty -c -F loader.py \
@@ -77,4 +102,9 @@ $APPDATA/python/python37/scripts/pyinstaller \
# ./upx.exe --best --ultra-brute --lzma -k dist/copyparty.exe
curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$m.exe
printf $(sha512sum ~/Downloads/dist/copyparty.exe | head -c 18 | sed -r 's/(..)/\\x\1/g') |
base64 | head -c12 >> dist/copyparty.exe
dist/copyparty.exe --version
curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$esuf.exe

56
scripts/pyinstaller/depchk.sh Executable file
View File

@@ -0,0 +1,56 @@
#!/bin/bash
set -e
e=0
cd ~/dev/pyi
ckpypi() {
deps=(
altgraph
pefile
pyinstaller
pyinstaller-hooks-contrib
pywin32-ctypes
Jinja2
MarkupSafe
mutagen
Pillow
)
for dep in "${deps[@]}"; do
k=
echo -n .
curl -s https://pypi.org/pypi/$dep/json >h
ver=$(jq <h -r '.releases|keys|.[]' | sort -V | tail -n 1)
while IFS= read -r fn; do
[ -e "$fn" ] && k="$fn" && break
done < <(
jq -r '.releases["'"$ver"'"]|.[]|.filename' <h
)
[ -z "$k" ] && echo "outdated: $dep" && cp h "ng-$dep" && e=1
done
true
}
ckgh() {
deps=(
upx/upx
)
for dep in "${deps[@]}"; do
k=
echo -n .
while IFS= read -r fn; do
[ -e "$fn" ] && k="$fn" && break
done < <(
curl -s https://api.github.com/repos/$dep/releases | tee h |
jq -r 'first|.assets|.[]|.name'
)
[ -z "$k" ] && echo "outdated: $dep" && cp h "ng-$dep" e=1
done
true
}
ckpypi
ckgh
rm h
exit $e

View File

@@ -1,17 +1,30 @@
d5510a24cb5e15d6d30677335bbc7624c319b371c0513981843dc51d9b3a1e027661096dfcfc540634222bb2634be6db55bf95185b30133cb884f1e47652cf53 altgraph-0.17.3-py2.py3-none-any.whl
91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz
fd8ebead3572a924cd7a256efa67d9a63f49fe30a8890a82d8899c21b7b2411c6761460f334757411c9bf050d8485641e928ccb52455346bf0b5c2d5f89857bc Git-2.38.1-32-bit.exe
c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl
18f8070bfb13fe9b7005ed55c9d040465d1e4c0ddcc1e8adca9127070f8db7b32272fbe50622c1d5c937d9be3bb110167b6a55502e232e26d7da881a5342a9e3 pefile-2022.5.30.tar.gz
4e71295da5d1a26c71a0baa8905fdccb522bb16d56bc964db636de68688c5bf703f3b2880cdeea07138789e0eb4506e06f9ccd0da906c89d2cb6d55ad64659ea pip-22.3-py3-none-any.whl
30aa0f3b15d17867f5d8a35e2c86462fa90cfefd3fde5733570a1f86ee75ee1b08550cce9fb34c94d9bb68efeaba26217e99b8b0769bb179c20f61c856057f07 pyinstaller-5.6.1-py3-none-win32.whl
15319328d3ba5aee43e0b830fdf8030213622bc06e5959c0abf579a92061723b100b66325fea0b540de61b90acbbf3985c6ad6eacfc8ccd1c212d282d4e332ca pyinstaller-5.6.1-py3-none-win_amd64.whl
d2534bcf4b8ecc6708d02c6cc6b3c4d8dc3430db5d9a7d350b63239af097a84a325c3f9c3feddf9ebd82ceb8350d9239f79f8c9378fb5164c857f82b0fa25eb8 pyinstaller_hooks_contrib-2022.11-py2.py3-none-any.whl
6bb73cc2db795c59c92f2115727f5c173cacc9465af7710db9ff2f2aec2d73130d0992d0f16dcb3fac222dc15c0916562d0813b2337401022020673a4461df3d python-3.7.9-amd64.exe
500747651c87f59f2436c5ab91207b5b657856e43d10083f3ce27efb196a2580fadd199a4209519b409920c562aaaa7dcbdfb83ed2072a43eaccae6e2d056f31 python-3.7.9.exe
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
d68c78bc83f4f48c604912b2d1ca4772b0e6ed676cd2eb439411e0a74d63fe215aac93dd9dab04ed341909a4a6a1efc13ec982516e3cb0fc7c355055e63d9178 pyinstaller-5.10.1-py3-none-win32.whl
fe62705893c86eeb2d5b841da8debe05dedda98364dec190b487e718caad8a8735503bf93739a7a27ea793a835bf976fb919ceec1424b8fc550b936bae4a54e9 pyinstaller-5.10.1-py3-none-win_amd64.whl
61c543983ff67e2bdff94d2d6198023679437363db8c660fa81683aff87c5928cd800720488e18d09be89fe45d6ab99be3ccb912cb2e03e2bca385b4338e1e42 pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
132a5380f33a245f2e744413a0e1090bc42b7356376de5121397cec5976b04b79f7c9ebe28af222c9c7b01461f7d7920810d220e337694727e0d7cd9e91fa667 pywin32_ctypes-0.2.0-py2.py3-none-any.whl
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
58d50f3639373e3e2adad3c52328a234a0ca527f8a520731c0f03911074610a9c90dbb9bfa93d1f9cbeab3245ec9e45527128f0295e4b7085b97d288a0b4a2fd upx-4.0.0-win32.zip
4b6e9ae967a769fe32be8cf0bc0d5a213b138d1e0344e97656d08a3d15578d81c06c45b334c872009db2db8f39db0c77c94ff6c35168d5e13801917667c08678 upx-4.0.2-win32.zip
# up2k (win7)
a7d259277af4948bf960682bc9fb45a44b9ae9a19763c8a7c313cef4aa9ec2d447d843e4a7c409e9312c8c8f863a24487a8ee4ffa6891e9b1c4e111bb4723861 certifi-2022.12.7-py3-none-any.whl
2822c0dae180b1c8cfb7a70c8c00bad62af9afdbb18b656236680def9d3f1fcdcb8ef5eb64fc3b4c934385cd175ad5992a2284bcba78a243130de75b2d1650db charset_normalizer-3.1.0-cp37-cp37m-win32.whl
ffdd45326f4e91c02714f7a944cbcc2fdd09299f709cfa8aec0892053eef0134fb80d9ba3790afd319538a86feb619037cbf533e2f5939cb56b35bb17f56c858 idna-3.4-py3-none-any.whl
220e0e122d5851aaccf633224dd7fbd3ba8c8d2720944d8019d6a276ed818d83e3426fe21807f22d673b5428f19fcf9a6b4e645f69bbecd967c568bb6aeb7c8d requests-2.28.2-py3-none-any.whl
8770011f4ad1fe40a3062e6cdf1fda431530c59ee7de3fc5f8c57db54bfdb71c3aa220ca0e0bb1874fc6700e9ebb57defbae54ac84938bc9ad8f074910106681 urllib3-1.26.14-py2.py3-none-any.whl
# win7
91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz
c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl
4e71295da5d1a26c71a0baa8905fdccb522bb16d56bc964db636de68688c5bf703f3b2880cdeea07138789e0eb4506e06f9ccd0da906c89d2cb6d55ad64659ea pip-22.3-py3-none-any.whl
6bb73cc2db795c59c92f2115727f5c173cacc9465af7710db9ff2f2aec2d73130d0992d0f16dcb3fac222dc15c0916562d0813b2337401022020673a4461df3d python-3.7.9-amd64.exe
500747651c87f59f2436c5ab91207b5b657856e43d10083f3ce27efb196a2580fadd199a4209519b409920c562aaaa7dcbdfb83ed2072a43eaccae6e2d056f31 python-3.7.9.exe
68e1b618d988be56aaae4e2eb92bc0093627a00441c1074ebe680c41aa98a6161e52733ad0c59888c643a33fe56884e4f935178b2557fbbdd105e92e0d993df6 windows6.1-kb2533623-x64.msu
479a63e14586ab2f2228208116fc149ed8ee7b1e4ff360754f5bda4bf765c61af2e04b5ef123976623d04df4976b7886e0445647269da81436bd0a7b5671d361 windows6.1-kb2533623-x86.msu
ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a25067512b8f75538c67c987cf3944bfa0229e3cb677e2fb81e763e zipp-3.10.0-py3-none-any.whl
# win10
00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
b1db6f5a79fc15391547643e5973cf5946c0acfa6febb68bc90fc3f66369681100cc100f32dd04256dcefa510e7864c718515a436a4af3a10fe205c413c7e693 MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl
4a20aeb52d4fde6aabcba05ee261595eeb5482c72ee27332690f34dd6e7a49c0b3ba3813202ac15c9d21e29f1cd803f2e79ccc1c45ec314fcd0a937016bcbc56 mutagen-1.46.0-py3-none-any.whl
78414808cb9a5fa74e7b23360b8f46147952530e3cc78a3ad4b80be3e26598080537ac691a1be1f35b7428a22c1f65a6adf45986da2752fbe9d9819d77a58bf8 Pillow-9.5.0-cp311-cp311-win_amd64.whl
4b7711b950858f459d47145b88ccde659279c6af47144d58a1c54ea2ce4b80ec43eb7f69c68f12f8f6bc54c86a44e77441993257f7ad43aab364655de5c51bb1 python-3.11.2-amd64.exe

View File

@@ -1,16 +1,31 @@
# links to download pages for each dep
https://pypi.org/project/altgraph/#files
https://pypi.org/project/future/#files
https://pypi.org/project/importlib-metadata/#files
https://pypi.org/project/pefile/#files
https://pypi.org/project/pip/#files
https://pypi.org/project/pyinstaller/#files
https://pypi.org/project/pyinstaller-hooks-contrib/#files
https://pypi.org/project/pywin32-ctypes/#files
https://pypi.org/project/typing-extensions/#files
https://pypi.org/project/zipp/#files
https://github.com/git-for-windows/git/releases/latest
https://github.com/upx/upx/releases/latest
# win10 additionals
https://pypi.org/project/Jinja2/#files
https://pypi.org/project/MarkupSafe/#files
https://pypi.org/project/mutagen/#files
https://pypi.org/project/Pillow/#files
# up2k (win7) additionals
https://pypi.org/project/certifi/#files
https://pypi.org/project/charset-normalizer/#files # cp37-cp37m-win32.whl
https://pypi.org/project/idna/#files
https://pypi.org/project/requests/#files
https://pypi.org/project/urllib3/#files
# win7 additionals
https://pypi.org/project/future/#files
https://pypi.org/project/importlib-metadata/#files
https://pypi.org/project/pip/#files
https://pypi.org/project/typing-extensions/#files
https://pypi.org/project/zipp/#files
https://support.microsoft.com/en-us/topic/microsoft-security-advisory-insecure-library-loading-could-allow-remote-code-execution-486ea436-2d47-27e5-6cb9-26ab7230c704
http://www.microsoft.com/download/details.aspx?familyid=c79c41b0-fbfb-4d61-b5d8-cadbe184b9fc
http://www.microsoft.com/download/details.aspx?familyid=146ed6f7-b605-4270-8ec4-b9f0f284bb9e

View File

@@ -1,8 +1,10 @@
#!/bin/bash
set -e
genico() {
# imagemagick png compression is broken, use pillow instead
convert ~/AndroidStudioProjects/PartyUP/metadata/en-US/images/icon.png a.bmp
convert $1 a.bmp
#convert a.bmp -trim -resize '48x48!' -strip a.png
python3 <<'EOF'
@@ -17,11 +19,15 @@ EOF
pngquant --strip --quality 30 a.png
mv a-*.png a.png
python3 <<'EOF'
python3 <<EOF
from PIL import Image
Image.open('a.png').save('loader.ico',sizes=[(48,48)])
Image.open('a.png').save('$2',sizes=[(48,48)])
EOF
rm a.{bmp,png}
}
genico ~/AndroidStudioProjects/PartyUP/metadata/en-US/images/icon.png loader.ico
genico https://raw.githubusercontent.com/googlefonts/noto-emoji/main/png/512/emoji_u1f680.png up2k.ico
ls -al
exit 0

View File

@@ -1,29 +1,44 @@
# coding: utf-8
import base64
import hashlib
import os
import re
import shutil
import subprocess as sp
import sys
import traceback
v = r"""
this is the EXE edition of copyparty, compatible with Windows7-SP1
and later. To make this possible, the EXE was compiled with Python
3.7.9, which is EOL and does not receive security patches anymore.
this 32-bit copyparty.exe is compatible with Windows7-SP1 and later.
To make this possible, the EXE was compiled with Python 3.7.9,
which is EOL and does not receive security patches anymore.
if possible, for performance and security reasons, please use this instead:
https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
"""
if sys.version_info > (3, 10):
v = r"""
this 64-bit copyparty.exe is compatible with Windows 8 and later.
No security issues were known to affect this EXE at build time,
however that may have changed since then.
if possible, for performance and security reasons, please use this instead:
https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
"""
if sys.maxsize > 2 ** 32:
v = v.replace("32-bit", "64-bit")
try:
print(v.replace("\n", "\n▒▌ ")[1:] + "\n")
except:
print(v.replace("\n", "\n|| ")[1:] + "\n")
import re
import os
import sys
import shutil
import traceback
import subprocess as sp
def confirm(rv):
print()
print("retcode", rv if rv else traceback.format_exc())
@@ -36,6 +51,30 @@ def confirm(rv):
sys.exit(rv or 1)
def ckck():
    hs = hashlib.sha512()
    with open(sys.executable, "rb") as f:
        f.seek(-12, 2)
        rem = f.tell()
        esum = f.read().decode("ascii", "replace")
        f.seek(0)
        while rem:
            buf = f.read(min(rem, 64 * 1024))
            rem -= len(buf)
            hs.update(buf)
            if not buf:
                t = "unexpected eof @ {} with {} left"
                raise Exception(t.format(f.tell(), rem))
    fsum = base64.b64encode(hs.digest()[:9]).decode("utf-8")
    if fsum != esum:
        t = "exe integrity check error; [{}] != [{}]"
        raise Exception(t.format(esum, fsum))
ckck()
def meicln(mod):
    pdir, mine = os.path.split(mod)
    dirs = os.listdir(pdir)
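The ckck() routine above expects the last 12 bytes of the running exe to hold a base64 checksum: sha512 over everything before those 12 bytes, first 9 digest bytes, base64-encoded (9 raw bytes encode to exactly 12 ascii characters, no padding). A hedged sketch of the matching build-time step that would append such a trailer -- the path is illustrative, and this is not necessarily how build.sh actually produces it:

# hedged sketch: append the integrity trailer that ckck() verifies
# (sha512 over the exe body, first 9 digest bytes, base64 -> 12 ascii chars)
import base64
import hashlib

exe = "dist/copyparty.exe"  # illustrative path, not the project's actual output name
with open(exe, "rb") as f:
    digest = hashlib.sha512(f.read()).digest()
trailer = base64.b64encode(digest[:9])  # exactly 12 bytes of base64
with open(exe, "ab") as f:
    f.write(trailer)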

View File

@@ -16,7 +16,7 @@ VSVersionInfo(
StringTable(
'000004b0',
[StringStruct('CompanyName', 'ocv.me'),
StringStruct('FileDescription', 'copyparty'),
StringStruct('FileDescription', 'copyparty file server'),
StringStruct('FileVersion', '1.2.3'),
StringStruct('InternalName', 'copyparty'),
StringStruct('LegalCopyright', '2019, ed'),

View File

@@ -2,34 +2,53 @@ run ./build.sh in git-bash to build + upload the exe
## ============================================================
## first-time setup on a stock win7x32sp1 vm:
## first-time setup on a stock win7x32sp1 and/or win10x64 vm:
##
to obtain the files referenced below, see ./deps.txt
download + install git (32bit OK on 64):
http://192.168.123.1:3923/ro/pyi/Git-2.38.1-32-bit.exe
http://192.168.123.1:3923/ro/pyi/Git-2.39.1-32-bit.exe
===[ copy-paste into git-bash ]================================
uname -s | grep NT-10 && w10=1 || {
w7=1; uname -s | grep WOW64 && w7x64=1 || w7x32=1
}
fns=(
upx-4.0.0-win32.zip
pip-22.3-py3-none-any.whl
altgraph-0.17.3-py2.py3-none-any.whl
pefile-2023.2.7-py3-none-any.whl
pyinstaller-5.10.1-py3-none-win_amd64.whl
pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
pywin32_ctypes-0.2.0-py2.py3-none-any.whl
upx-4.0.2-win32.zip
)
[ $w10 ] && fns+=(
mutagen-1.46.0-py3-none-any.whl
Pillow-9.4.0-cp311-cp311-win_amd64.whl
python-3.11.3-amd64.exe
)
[ $w7 ] && fns+=(
certifi-2022.12.7-py3-none-any.whl
chardet-5.1.0-py3-none-any.whl
idna-3.4-py3-none-any.whl
requests-2.28.2-py3-none-any.whl
urllib3-1.26.14-py2.py3-none-any.whl
)
[ $w7 ] && fns+=(
future-0.18.2.tar.gz
importlib_metadata-5.0.0-py3-none-any.whl
pefile-2022.5.30.tar.gz
pyinstaller_hooks_contrib-2022.11-py2.py3-none-any.whl
pywin32_ctypes-0.2.0-py2.py3-none-any.whl
pip-22.3-py3-none-any.whl
typing_extensions-4.4.0-py3-none-any.whl
zipp-3.10.0-py3-none-any.whl
)
uname -s | grep WOW64 && fns+=(
[ $w7x64 ] && fns+=(
windows6.1-kb2533623-x64.msu
pyinstaller-5.6.1-py3-none-win_amd64.whl
pyinstaller-5.10.1-py3-none-win_amd64.whl
python-3.7.9-amd64.exe
) || fns+=(
)
[ $w7x32 ] && fns+=(
windows6.1-kb2533623-x86.msu
pyinstaller-5.6.1-py3-none-win32.whl
pyinstaller-5.10.1-py3-none-win32.whl
python-3.7.9.exe
)
dl() { curl -fkLOC- "$1" && return 0; echo "$1"; return 1; }
@@ -45,20 +64,35 @@ manually install:
python-3.7.9
===[ copy-paste into git-bash ]================================
uname -s | grep NT-10 && w10=1 || w7=1
[ $w7 ] && pyv=37 || pyv=311
appd=$(cygpath.exe "$APPDATA")
cd ~/Downloads &&
unzip upx-*-win32.zip &&
mv upx-*/upx.exe . &&
python -m ensurepip &&
python -m pip install --user -U pip-*.whl &&
python -m pip install --user -U pyinstaller-*.whl pefile-*.tar.gz pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl future-*.tar.gz importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl &&
{ [ $w7 ] || python -m pip install --user -U mutagen-*.whl Pillow-*.whl; } &&
{ [ $w10 ] || python -m pip install --user -U {requests,urllib3,charset_normalizer,certifi,idna}-*.whl; } &&
{ [ $w10 ] || python -m pip install --user -U future-*.tar.gz importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl &&
sed -ri 's/--lzma/--best/' $appd/Python/Python$pyv/site-packages/pyinstaller/building/utils.py &&
curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&
python uncomment.py $(for d in $appd/Python/Python$pyv/site-packages/{requests,urllib3,charset_normalizer,certifi,idna}; do find $d -name \*.py; done) &&
cd &&
rm -f build.sh &&
curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/build.sh &&
curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.sh &&
echo ok
# python -m pip install --user -U Pillow-9.2.0-cp37-cp37m-win32.whl
# sed -ri 's/, bestopt, /]+bestopt+[/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
# sed -ri 's/(^\s+bestopt = ).*/\1["--best","--lzma","--ultra-brute"]/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
===[ win10: copy-paste into git-bash ]=========================
#for f in $appd/Python/Python311/site-packages/mutagen/*.py; do awk -i inplace '/^\s*def _?(save|write)/{sub(/d.*/," ");s=$0;ns=length(s)} ns&&/[^ ]/&&substr($0,0,ns)!=s{ns=0} !ns' "$f"; done &&
python uncomment.py $appd/Python/Python311/site-packages/{mutagen,PIL,jinja2,markupsafe}/*.py &&
echo ok
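The uncomment.py pass invoked in both copy-paste blocks above shrinks the bundled site-packages before PyInstaller picks them up; the actual script is fetched from the repo. Purely as a hedged illustration of the general technique (and not that script), comments can be dropped with the stdlib tokenize module:

# hedged sketch of a comment-stripping pass over .py files
# (NOT the project's uncomment.py; just the generic tokenize-based idea)
import io
import sys
import tokenize

def strip_comments(path):
    with open(path, "rb") as f:
        src = f.read()
    toks = [
        t for t in tokenize.tokenize(io.BytesIO(src).readline)
        if t.type != tokenize.COMMENT  # drop comment tokens, keep everything else
    ]
    with open(path, "wb") as f:
        f.write(tokenize.untokenize(toks))  # positions are preserved, so the code stays valid

for p in sys.argv[1:]:
    strip_comments(p)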
## ============================================================
## notes

View File

@@ -0,0 +1,29 @@
# UTF-8
VSVersionInfo(
ffi=FixedFileInfo(
filevers=(1,2,3,0),
prodvers=(1,2,3,0),
mask=0x3f,
flags=0x0,
OS=0x4,
fileType=0x1,
subtype=0x0,
date=(0, 0)
),
kids=[
StringFileInfo(
[
StringTable(
'000004b0',
[StringStruct('CompanyName', 'ocv.me'),
StringStruct('FileDescription', 'copyparty uploader / filesearch command'),
StringStruct('FileVersion', '1.2.3'),
StringStruct('InternalName', 'up2k'),
StringStruct('LegalCopyright', '2019, ed'),
StringStruct('OriginalFilename', 'up2k.exe'),
StringStruct('ProductName', 'copyparty up2k client'),
StringStruct('ProductVersion', '1.2.3')])
]),
VarFileInfo([VarStruct('Translation', [0, 1200])])
]
)
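This VSVersionInfo file (like the loader one earlier) is handed to PyInstaller at build time so the resulting exe carries proper version metadata. A hedged sketch of wiring it in through PyInstaller's programmatic entry point, equivalent to the --version-file command-line flag; the entry script, icon and other flags here are assumptions rather than the project's actual build invocation:

# hedged sketch: pass a VSVersionInfo file to a pyinstaller build
# (same effect as `pyinstaller --version-file up2k.rc ...` on the command line)
import PyInstaller.__main__

PyInstaller.__main__.run([
    "--onefile",
    "--version-file", "up2k.rc",  # the VSVersionInfo structure shown above
    "--icon", "up2k.ico",         # icon produced by the genico step earlier
    "up2k.py",                    # illustrative entry-script name
])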

Some files were not shown because too many files have changed in this diff