Compare commits

31 Commits

| SHA1 |
|---|
| c7caecf77c |
| 1fe30363c7 |
| 54a7256c8d |
| 8e8e4ff132 |
| 1dace72092 |
| 3a5c1d9faf |
| f38c754301 |
| fff38f484d |
| 95390b655f |
| 5967c421ca |
| b8b5214f44 |
| cdd3b67a5c |
| 28c9de3f6a |
| f3b9bfc114 |
| c9eba39edd |
| 40a1c7116e |
| c03af9cfcc |
| c4cbc32cc5 |
| 1231ce199e |
| e0cac6fd99 |
| d9db1534b1 |
| 6a0aaaf069 |
| 4c04798aa5 |
| 3f84b0a015 |
| 917380ddbb |
| d9ae067e52 |
| b2e8bf6e89 |
| 170cbe98c5 |
| c94f662095 |
| 0987dcfb1c |
| 6920c01d4a |
README.md (57 changes)

@@ -80,6 +80,7 @@ turn almost any device into a file server with resumable uploads/downloads using

* [metadata from audio files](#metadata-from-audio-files) - set `-e2t` to index tags on upload
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags
* [event hooks](#event-hooks) - trigger a program on uploads, renames etc ([examples](./bin/hooks/))
* [zeromq](#zeromq) - event-hooks can send zeromq messages
* [upload events](#upload-events) - the older, more powerful approach ([examples](./bin/mtag/))
* [handlers](#handlers) - redefine behavior with plugins ([examples](./bin/handlers/))
* [ip auth](#ip-auth) - autologin based on IP range (CIDR)

@@ -352,10 +353,19 @@ same order here too

* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
  * `AudioContext` will probably never be a viable workaround as apple introduces new issues faster than they fix current ones

* iPhones: music volume goes on a rollercoaster during song changes
  * nothing I can do about it because `AudioContext` is still broken in safari

* iPhones: the preload feature (in the media-player-options tab) can cause a tiny audio glitch 20sec before the end of each song, but disabling it may cause worse iOS bugs to appear instead
  * just a hunch, but disabling preloading may cause playback to stop entirely, or possibly mess with bluetooth speakers
  * tried to add a tooltip regarding this but looks like apple broke my tooltips

* iPhones: preloaded awo files make safari log MEDIA_ERR_NETWORK errors as playback starts, but the song plays just fine so eh whatever
  * awo, opus-weba, is apple's new take on opus support, replacing opus-caf which was technically limited to cbr opus

* iPhones: preloading another awo file may cause playback to stop
  * can be somewhat mitigated with `mp.au.play()` in `mp.onpreload` but that can hit a race condition in safari that starts playing the same audio object twice in parallel...

* Windows: folders cannot be accessed if the name ends with `.`
  * python or windows bug

@@ -615,8 +625,8 @@ select which type of archive you want in the `[⚙️] config` tab:

| `pax` | `?tar=pax` | pax-format tar, futureproof, not as fast |
| `tgz` | `?tar=gz` | gzip compressed gnu-tar (slow), for `curl \| tar -xvz` |
| `txz` | `?tar=xz` | gnu-tar with xz / lzma compression (v.slow) |
| `zip` | `?zip=utf8` | works everywhere, glitchy filenames on win7 and older |
| `zip_dos` | `?zip` | traditional cp437 (no unicode) to fix glitchy filenames |
| `zip` | `?zip` | works everywhere, glitchy filenames on win7 and older |
| `zip_dos` | `?zip=dos` | traditional cp437 (no unicode) to fix glitchy filenames |
| `zip_crc` | `?zip=crc` | cp437 with crc32 computed early for truly ancient software |

* gzip default level is `3` (0=fast, 9=best), change with `?tar=gz:9`

@@ -624,7 +634,7 @@ select which type of archive you want in the `[⚙️] config` tab:

* bz2 default level is `2` (1=fast, 9=best), change with `?tar=bz2:9`
* hidden files ([dotfiles](#dotfiles)) are excluded unless account is allowed to list them
  * `up2k.db` and `dir.txt` is always excluded
* bsdtar supports streaming unzipping: `curl foo?zip=utf8 | bsdtar -xv`
* bsdtar supports streaming unzipping: `curl foo?zip | bsdtar -xv`
  * good, because copyparty's zip is faster than tar on small files
* `zip_crc` will take longer to download since the server has to read each file twice
  * this is only to support MS-DOS PKZIP v2.04g (october 1993) and older

@@ -887,6 +897,8 @@ will show uploader IP and upload-time if the visitor has the admin permission

* global-option `--ups-when` makes upload-time visible to all users, and not just admins

* global-option `--ups-who` (volflag `ups_who`) specifies who gets access (0=nobody, 1=admins, 2=everyone), default=2

note that the [🧯 unpost](#unpost) feature is better suited for viewing *your own* recent uploads, as it includes the option to undo/delete them

@@ -926,6 +938,11 @@ open the `[🎺]` media-player-settings tab to configure it,

* `[aac]` converts `aac` and `m4a` files into opus (if supported by browser) or mp3
* `[oth]` converts all other known formats into opus (if supported by browser) or mp3
  * `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
* "transcode to":
  * `[opus]` produces an `opus` whenever transcoding is necessary (the best choice on Android and PCs)
  * `[awo]` is `opus` in a `weba` file, good for iPhones (iOS 17.5 and newer) but Apple is still fixing some state-confusion bugs as of iOS 18.2.1
  * `[caf]` is `opus` in a `caf` file, good for iPhones (iOS 11 through 17), technically unsupported by Apple but works for the most part
  * `[mp3]` -- the myth, the legend, the undying master of mediocre sound quality that definitely works everywhere
* "tint" reduces the contrast of the playback bar

@@ -1458,6 +1475,23 @@ there's a bunch of flags and stuff, see `--help-hooks`

if you want to write your own hooks, see [devnotes](./docs/devnotes.md#event-hooks)


### zeromq

event-hooks can send zeromq messages instead of running programs

to send a 0mq message every time a file is uploaded,

* `--xau zmq:pub:tcp://*:5556` sends a PUB to any/all connected SUB clients
* `--xau t3,zmq:push:tcp://*:5557` sends a PUSH to exactly one connected PULL client
* `--xau t3,j,zmq:req:tcp://localhost:5555` sends a REQ to the connected REP client

the PUSH and REQ examples have `t3` (timeout after 3 seconds) because they block if there's no clients to talk to

* the REQ example does `t3,j` to send extended upload-info as json instead of just the filesystem-path

see [zmq-recv.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/zmq-recv.py) if you need something to receive the messages with


### upload events

the older, more powerful approach ([examples](./bin/mtag/)):

@@ -1545,12 +1579,16 @@ connecting to an aws s3 bucket and similar

there is no built-in support for this, but you can use FUSE-software such as [rclone](https://rclone.org/) / [geesefs](https://github.com/yandex-cloud/geesefs) / [JuiceFS](https://juicefs.com/en/) to first mount your cloud storage as a local disk, and then let copyparty use (a folder in) that disk as a volume

you may experience poor upload performance this way, but that can sometimes be fixed by specifying the volflag `sparse` to force the use of sparse files; this has improved the upload speeds from `1.5 MiB/s` to over `80 MiB/s` in one case, but note that you are also more likely to discover funny bugs in your FUSE software this way, so buckle up
you will probably get decent speeds with the default config, however most likely restricted to using one TCP connection per file, so the upload-client won't be able to send multiple chunks in parallel

> before [v1.13.5](https://github.com/9001/copyparty/releases/tag/v1.13.5) it was recommended to use the volflag `sparse` to force-allow multiple chunks in parallel; this would improve the upload-speed from `1.5 MiB/s` to over `80 MiB/s` at the risk of provoking latent bugs in S3 or JuiceFS. But v1.13.5 added chunk-stitching, so this is now probably much less important. On the contrary, `nosparse` *may* now increase performance in some cases. Please try all three options (default, `sparse`, `nosparse`) as the optimal choice depends on your network conditions and software stack (both the FUSE-driver and cloud-server)

someone has also tested geesefs in combination with [gocryptfs](https://nuetzlich.net/gocryptfs/) with surprisingly good results, getting 60 MiB/s upload speeds on a gbit line, but JuiceFS won with 80 MiB/s using its built-in encryption

you may improve performance by specifying larger values for `--iobuf` / `--s-rd-sz` / `--s-wr-sz`

> if you've experimented with this and made interesting observations, please share your findings so we can add a section with specific recommendations :-)


## hiding from google

@@ -2304,13 +2342,13 @@ mandatory deps:

install these to enable bonus features

enable hashed passwords in config: `argon2-cffi`
enable [hashed passwords](#password-hashing) in config: `argon2-cffi`

enable ftp-server:
enable [ftp-server](#ftp-server):
* for just plaintext FTP, `pyftpdlib` (is built into the SFX)
* with TLS encryption, `pyftpdlib pyopenssl`

enable music tags:
enable [music tags](#metadata-from-audio-files):
* either `mutagen` (fast, pure-python, skips a few tags, makes copyparty GPL? idk)
* or `ffprobe` (20x slower, more accurate, possibly dangerous depending on your distro and users)

@@ -2321,8 +2359,9 @@ enable [thumbnails](#thumbnails) of...

* **AVIF pictures:** `pyvips` or `ffmpeg` or `pillow-avif-plugin`
* **JPEG XL pictures:** `pyvips` or `ffmpeg`

enable [smb](#smb-server) support (**not** recommended):
* `impacket==0.12.0`
enable sending [zeromq messages](#zeromq) from event-hooks: `pyzmq`

enable [smb](#smb-server) support (**not** recommended): `impacket==0.12.0`

`pyvips` gives higher quality thumbnails than `Pillow` and is 320% faster, using 270% more ram: `sudo apt install libvips42 && python3 -m pip install --user -U pyvips`

@@ -30,4 +30,5 @@ these are `--xiu` hooks; unlike `xbu` and `xau` (which get executed on every sin

# on message
* [wget.py](wget.py) lets you download files by POSTing URLs to copyparty
* [qbittorrent-magnet.py](qbittorrent-magnet.py) starts downloading a torrent if you post a magnet url
* [usb-eject.py](usb-eject.py) adds web-UI buttons to safe-remove usb flashdrives shared through copyparty
* [msg-log.py](msg-log.py) is a guestbook; logs messages to a doc in the same folder
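The message-hooks listed above follow the calling convention that usb-eject.py (added later in this diff) demonstrates: copyparty runs the hook with the message text as its first argument, and with the `c1` flag the hook's stdout is redirected to the web-UI. As a minimal sketch only (this is not one of the bundled hooks; the file name and log path are made up), a tiny guestbook-style `--xm` hook could look like this:

```python
#!/usr/bin/env python3
# sketch of an "on message" (--xm) hook; NOT part of copyparty.
# the message text arrives as argv[1]; anything printed to stdout
# can be shown in the web-UI when the hook is registered with c1.
# the log path below is a hypothetical example.

import sys
import time


def main():
    msg = sys.argv[1] if len(sys.argv) > 1 else ""
    line = "%s %s\n" % (time.strftime("%Y-%m-%d %H:%M:%S"), msg)
    with open("/tmp/copyparty-messages.log", "a", encoding="utf-8") as f:
        f.write(line)
    print("logged %d characters" % (len(msg),))


if __name__ == "__main__":
    main()
```

It would then be registered like the bundled hooks, for example `--xm c1,bin/hooks/my-msg-log.py` (hypothetical path).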
bin/hooks/usb-eject.js (new file, 54 lines)

@@ -0,0 +1,54 @@
// see usb-eject.py for usage

function usbclick() {
    QS('#treeul a[href="/usb/"]').click();
}

function eject_cb() {
    var t = this.responseText;
    if (t.indexOf('can be safely unplugged') < 0 && t.indexOf('Device can be removed') < 0)
        return toast.err(30, 'usb eject failed:\n\n' + t);

    toast.ok(5, esc(t.replace(/ - /g, '\n\n')));
    usbclick(); setTimeout(usbclick, 10);
};

function add_eject_2(a) {
    var aw = a.getAttribute('href').split(/\//g);
    if (aw.length != 4 || aw[3])
        return;

    var v = aw[2],
        k = 'umount_' + v;

    qsr('#' + k);
    a.appendChild(mknod('span', k, '⏏'), a);

    var o = ebi(k);
    o.style.cssText = 'position:absolute; right:1em; margin-top:-.2em; font-size:1.3em';
    o.onclick = function (e) {
        ev(e);
        var xhr = new XHR();
        xhr.open('POST', get_evpath(), true);
        xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded;charset=UTF-8');
        xhr.send('msg=' + uricom_enc(':usb-eject:' + v + ':'));
        xhr.onload = xhr.onerror = eject_cb;
        toast.inf(10, "ejecting " + v + "...");
    };
};

function add_eject() {
    for (var a of QSA('#treeul a[href^="/usb/"]'))
        add_eject_2(a);
};

(function() {
    var f0 = treectl.rendertree;
    treectl.rendertree = function (res, ts, top0, dst, rst) {
        var ret = f0(res, ts, top0, dst, rst);
        add_eject();
        return ret;
    };
})();

setTimeout(add_eject, 50);
bin/hooks/usb-eject.py (new file, 49 lines)

@@ -0,0 +1,49 @@
#!/usr/bin/env python3

import os
import stat
import subprocess as sp
import sys


"""
if you've found yourself using copyparty to serve flashdrives on a LAN
and your only wish is that the web-UI had a button to unmount / safely
remove those flashdrives, then boy howdy are you in the right place :D

put usb-eject.js in the webroot (or somewhere else http-accessible)
then run copyparty with these args:

 -v /run/media/ed:/usb:A:c,hist=/tmp/junk
 --xm=c1,bin/hooks/usb-eject.py
 --js-browser=/usb-eject.js

which does the following respectively,

* share all of /run/media/ed as /usb with admin for everyone
  and put the histpath somewhere it won't cause trouble
* run the usb-eject hook with stdout redirect to the web-ui
* add the complementary usb-eject.js to the browser

"""


def main():
    try:
        label = sys.argv[1].split(":usb-eject:")[1].split(":")[0]
        mp = "/run/media/ed/" + label
        # print("ejecting [%s]... " % (mp,), end="")
        mp = os.path.abspath(os.path.realpath(mp.encode("utf-8")))
        st = os.lstat(mp)
        if not stat.S_ISDIR(st.st_mode):
            raise Exception("not a regular directory")

        cmd = [b"gio", b"mount", b"-e", mp]
        print(sp.check_output(cmd).decode("utf-8", "replace").strip())

    except Exception as ex:
        print("unmount failed: %r" % (ex,))


if __name__ == "__main__":
    main()
bin/u2c.py (55 changes)

@@ -1,8 +1,8 @@
#!/usr/bin/env python3
from __future__ import print_function, unicode_literals

S_VERSION = "2.7"
S_BUILD_DT = "2024-12-06"
S_VERSION = "2.9"
S_BUILD_DT = "2025-01-27"

"""
u2c.py: upload to copyparty

@@ -234,6 +234,10 @@ CLEN = "Content-Length"

web = None  # type: HCli

links = []  # type: list[str]
linkmtx = threading.Lock()
linkfile = None


class File(object):
    """an up2k upload task; represents a single file"""

@@ -761,6 +765,29 @@ def get_hashlist(file, pcb, mth):
        file.kchunks[k] = [v1, v2]


def printlink(ar, purl, name, fk):
    if not name:
        url = purl  # srch
    else:
        name = quotep(name.encode("utf-8", WTF8)).decode("utf-8")
        if fk:
            url = "%s%s?k=%s" % (purl, name, fk)
        else:
            url = "%s%s" % (purl, name)

    url = "%s/%s" % (ar.burl, url.lstrip("/"))

    with linkmtx:
        if ar.u:
            links.append(url)
        if ar.ud:
            print(url)
        if linkfile:
            zs = "%s\n" % (url,)
            zb = zs.encode("utf-8", "replace")
            linkfile.write(zb)


def handshake(ar, file, search):
    # type: (argparse.Namespace, File, bool) -> tuple[list[str], bool]
    """

@@ -832,12 +859,17 @@ def handshake(ar, file, search):
        raise Exception(txt)

    if search:
        if ar.uon and r["hits"]:
            printlink(ar, r["hits"][0]["rp"], "", "")
        return r["hits"], False

    file.url = quotep(r["purl"].encode("utf-8", WTF8)).decode("utf-8")
    file.name = r["name"]
    file.wark = r["wark"]

    if ar.uon and not r["hash"]:
        printlink(ar, file.url, r["name"], r.get("fk"))

    return r["hash"], r["sprs"]


@@ -1249,7 +1281,7 @@ class Ctl(object):
                for n, zsii in enumerate(file.cids)
            ]
            print("chs: %s\n%s" % (vp, "\n".join(zsl)))
        zsl = [self.ar.wsalt, str(file.size)] + [x[0] for x in file.kchunks]
        zsl = [self.ar.wsalt, str(file.size)] + [x[0] for x in file.cids]
        zb = hashlib.sha512("\n".join(zsl).encode("utf-8")).digest()[:33]
        wark = ub64enc(zb).decode("utf-8")
        if self.ar.jw:

@@ -1472,7 +1504,7 @@ class APF(argparse.ArgumentDefaultsHelpFormatter, argparse.RawDescriptionHelpFor


def main():
    global web
    global web, linkfile

    time.strptime("19970815", "%Y%m%d")  # python#7980
    "".encode("idna")  # python#29288

@@ -1509,6 +1541,11 @@ source file/folder selection uses rsync syntax, meaning that:
    ap.add_argument("--spd", action="store_true", help="print speeds for each file")
    ap.add_argument("--version", action="store_true", help="show version and exit")

    ap = app.add_argument_group("print links")
    ap.add_argument("-u", action="store_true", help="print list of download-links after all uploads finished")
    ap.add_argument("-ud", action="store_true", help="print download-link after each upload finishes")
    ap.add_argument("-uf", type=unicode, metavar="PATH", help="print list of download-links to file")

    ap = app.add_argument_group("compatibility")
    ap.add_argument("--cls", action="store_true", help="clear screen before start")
    ap.add_argument("--rh", type=int, metavar="TRIES", default=0, help="resolve server hostname before upload (good for buggy networks, but TLS certs will break)")

@@ -1594,6 +1631,10 @@ source file/folder selection uses rsync syntax, meaning that:
    ar.x = "|".join(ar.x or [])

    setattr(ar, "wlist", ar.url == "-")
    setattr(ar, "uon", ar.u or ar.ud or ar.uf)

    if ar.uf:
        linkfile = open(ar.uf, "wb")

    for k in "dl dr drd wlist".split():
        errs = []

@@ -1656,6 +1697,12 @@ source file/folder selection uses rsync syntax, meaning that:
            ar.z = True
        ctl = Ctl(ar, ctl.stats)

    if links:
        print()
        print("\n".join(links))
    if linkfile:
        linkfile.close()

    if ctl.errs:
        print("WARNING: %d errors" % (ctl.errs))
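The `-u` / `-ud` / `-uf PATH` options added above print download-links for finished uploads; with `-uf`, printlink() writes them to a file, one URL per line. Purely as an illustration of consuming that output (this script is not part of the changeset; the filenames and use of urllib are assumptions), a separate script could fetch everything listed in such a link-file:

```python
#!/usr/bin/env python3
# sketch: read a link-file produced by "u2c.py -uf links.txt ..."
# (one URL per line, as written by printlink above) and download each URL.
# not part of copyparty or u2c; filenames here are hypothetical.

import os
import sys
import urllib.request


def fetch_all(linkfile):
    with open(linkfile, "r", encoding="utf-8") as f:
        urls = [x.strip() for x in f if x.strip()]

    for url in urls:
        # drop the ?k=filekey query when picking a local filename
        name = os.path.basename(url.split("?")[0]) or "index.bin"
        print("downloading %s -> %s" % (url, name))
        urllib.request.urlretrieve(url, name)


if __name__ == "__main__":
    fetch_all(sys.argv[1] if len(sys.argv) > 1 else "links.txt")
```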
bin/zmq-recv.py (new executable file, 76 lines)

@@ -0,0 +1,76 @@
#!/usr/bin/env python3

import sys
import zmq

"""
zmq-recv.py: demo zmq receiver
2025-01-22, v1.0, ed <irc.rizon.net>, MIT-Licensed
https://github.com/9001/copyparty/blob/hovudstraum/bin/zmq-recv.py

basic zmq-server to receive events from copyparty; try one of
the below and then "send a message to serverlog" in the web-ui:

1) dumb fire-and-forget to any and all listeners;
   run this script with "sub" and run copyparty with this:
   --xm zmq:pub:tcp://*:5556

2) one lucky listener gets the message, blocks if no listeners:
   run this script with "pull" and run copyparty with this:
   --xm t3,zmq:push:tcp://*:5557

3) blocking syn/ack mode, client must ack each message;
   run this script with "rep" and run copyparty with this:
   --xm t3,zmq:req:tcp://localhost:5555

note: to conditionally block uploads based on message contents,
use rep_server to answer with "return 1" and run copyparty with
--xau t3,c,zmq:req:tcp://localhost:5555
"""


ctx = zmq.Context()


def sub_server():
    # PUB/SUB allows any number of servers/clients, and
    # messages are fire-and-forget
    sck = ctx.socket(zmq.SUB)
    sck.connect("tcp://localhost:5556")
    sck.setsockopt_string(zmq.SUBSCRIBE, "")
    while True:
        print("copyparty says %r" % (sck.recv_string(),))


def pull_server():
    # PUSH/PULL allows any number of servers/clients, and
    # each message is sent to exactly one PULL client
    sck = ctx.socket(zmq.PULL)
    sck.connect("tcp://localhost:5557")
    while True:
        print("copyparty says %r" % (sck.recv_string(),))


def rep_server():
    # REP/REQ is a server/client pair where each message must be
    # acked by the other before another message can be sent, so
    # copyparty will do a blocking-wait for the ack
    sck = ctx.socket(zmq.REP)
    sck.bind("tcp://*:5555")
    while True:
        print("copyparty says %r" % (sck.recv_string(),))
        reply = b"thx"
        # reply = b"return 1"  # non-zero to block an upload
        sck.send(reply)


mode = sys.argv[1].lower() if len(sys.argv) > 1 else ""

if mode == "sub":
    sub_server()
elif mode == "pull":
    pull_server()
elif mode == "rep":
    rep_server()
else:
    print("specify mode as first argument: SUB | PULL | REP")
@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.16.7"
pkgver="1.16.10"
pkgrel=1
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
arch=("any")

@@ -16,12 +16,13 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
    "libkeyfinder-git: detection of musical keys"
    "qm-vamp-plugins: BPM detection"
    "python-pyopenssl: ftps functionality"
    "python-argon2_cffi: hashed passwords in config"
    "python-pyzmq: send zeromq messages from event-hooks"
    "python-argon2-cffi: hashed passwords in config"
    "python-impacket-git: smb support (bad idea)"
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("22178c98513072a8ef1e0fdb85d1044becf345ee392a9f5a336cc340ae16e4e9")
sha256sums=("a33f5df985f5c6e8da717e913eb070eb488263759ee1a2668b3b26837e29a944")

build() {
    cd "${srcdir}/${pkgname}-${pkgver}"
@@ -1,4 +1,4 @@
{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, argon2-cffi, pillow, pyvips, ffmpeg, mutagen,
{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, argon2-cffi, pillow, pyvips, pyzmq, ffmpeg, mutagen,

# use argon2id-hashed passwords in config files (sha2 is always available)
withHashedPasswords ? true,

@@ -21,6 +21,9 @@ withMediaProcessing ? true,
# if MediaProcessing is not enabled, you probably want this instead (less accurate, but much safer and faster)
withBasicAudioMetadata ? false,

# send ZeroMQ messages from event-hooks
withZeroMQ ? true,

# enable FTPS support in the FTP server
withFTPS ? false,

@@ -43,6 +46,7 @@ let
    ++ lib.optional withMediaProcessing ffmpeg
    ++ lib.optional withBasicAudioMetadata mutagen
    ++ lib.optional withHashedPasswords argon2-cffi
    ++ lib.optional withZeroMQ pyzmq
  );
in stdenv.mkDerivation {
  pname = "copyparty";
@@ -1,5 +1,5 @@
{
  "url": "https://github.com/9001/copyparty/releases/download/v1.16.7/copyparty-sfx.py",
  "version": "1.16.7",
  "hash": "sha256-mAoZre3hArsdXorZwv0mYESn/mtyMXfcUzcOMwnk8Do="
  "url": "https://github.com/9001/copyparty/releases/download/v1.16.10/copyparty-sfx.py",
  "version": "1.16.10",
  "hash": "sha256-4CK/491U/fdor+McO94nYsL39g73WQ7i8loUYVrbiZA="
}
@@ -54,6 +54,8 @@ from .util import (
|
||||
RAM_TOTAL,
|
||||
SQLITE_VER,
|
||||
UNPLICATIONS,
|
||||
URL_BUG,
|
||||
URL_PRJ,
|
||||
Daemon,
|
||||
align_tab,
|
||||
ansi_re,
|
||||
@@ -332,17 +334,16 @@ def ensure_webdeps() -> None:
|
||||
if has_resource(E, "web/deps/mini-fa.woff"):
|
||||
return
|
||||
|
||||
warn(
|
||||
"""could not find webdeps;
|
||||
t = """could not find webdeps;
|
||||
if you are running the sfx, or exe, or pypi package, or docker image,
|
||||
then this is a bug! Please let me know so I can fix it, thanks :-)
|
||||
https://github.com/9001/copyparty/issues/new?labels=bug&template=bug_report.md
|
||||
%s
|
||||
|
||||
however, if you are a dev, or running copyparty from source, and you want
|
||||
full client functionality, you will need to build or obtain the webdeps:
|
||||
https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#building
|
||||
%s/blob/hovudstraum/docs/devnotes.md#building
|
||||
"""
|
||||
)
|
||||
warn(t % (URL_BUG, URL_PRJ))
|
||||
|
||||
|
||||
def configure_ssl_ver(al: argparse.Namespace) -> None:
|
||||
@@ -739,6 +740,10 @@ def get_sects():
|
||||
the \033[33m,,\033[35m stops copyparty from reading the rest as flags and
|
||||
the \033[33m--\033[35m stops notify-send from reading the message as args
|
||||
and the alert will be "hey" followed by the messagetext
|
||||
|
||||
\033[36m--xau zmq:pub:tcp://*:5556\033[35m announces uploads on zeromq;
|
||||
\033[36m--xau t3,zmq:push:tcp://*:5557\033[35m also works, and you can
|
||||
\033[36m--xau t3,j,zmq:req:tcp://localhost:5555\033[35m too for example
|
||||
\033[0m
|
||||
each hook is executed once for each event, except for \033[36mxiu\033[0m
|
||||
which builds up a backlog of uploads, running the hook just once
|
||||
@@ -770,11 +775,22 @@ def get_sects():
|
||||
values for --urlform:
|
||||
\033[36mstash\033[35m dumps the data to file and returns length + checksum
|
||||
\033[36msave,get\033[35m dumps to file and returns the page like a GET
|
||||
\033[36mprint,get\033[35m prints the data in the log and returns GET
|
||||
(leave out the ",get" to return an error instead)\033[0m
|
||||
\033[36mprint \033[35m prints the data to log and returns an error
|
||||
\033[36mprint,xm \033[35m prints the data to log and returns --xm output
|
||||
\033[36mprint,get\033[35m prints the data to log and returns GET\033[0m
|
||||
|
||||
note that the \033[35m--xm\033[0m hook will only run if \033[35m--urlform\033[0m
|
||||
is either \033[36mprint\033[0m or the default \033[36mprint,get\033[0m
|
||||
note that the \033[35m--xm\033[0m hook will only run if \033[35m--urlform\033[0m is
|
||||
either \033[36mprint\033[0m or \033[36mprint,get\033[0m or the default \033[36mprint,xm\033[0m
|
||||
|
||||
if an \033[35m--xm\033[0m hook returns text, then
|
||||
the response code will be HTTP 202;
|
||||
http/get responses will be HTTP 200
|
||||
|
||||
if there are multiple \033[35m--xm\033[0m hooks defined, then
|
||||
the first hook that produced output is returned
|
||||
|
||||
if there are no \033[35m--xm\033[0m hooks defined, then the default
|
||||
\033[36mprint,xm\033[0m behaves like \033[36mprint,get\033[0m (returning html)
|
||||
"""
|
||||
),
|
||||
],
|
||||
@@ -955,7 +971,7 @@ def add_general(ap, nc, srvname):
|
||||
ap2.add_argument("-v", metavar="VOL", type=u, action="append", help="add volume, \033[33mSRC\033[0m:\033[33mDST\033[0m:\033[33mFLAG\033[0m; examples [\033[32m.::r\033[0m], [\033[32m/mnt/nas/music:/music:r:aed\033[0m], see --help-accounts")
|
||||
ap2.add_argument("--grp", metavar="G:N,N", type=u, action="append", help="add group, \033[33mNAME\033[0m:\033[33mUSER1\033[0m,\033[33mUSER2\033[0m,\033[33m...\033[0m; example [\033[32madmins:ed,foo,bar\033[0m]")
|
||||
ap2.add_argument("-ed", action="store_true", help="enable the ?dots url parameter / client option which allows clients to see dotfiles / hidden files (volflag=dots)")
|
||||
ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-form POSTs; see \033[33m--help-urlform\033[0m")
|
||||
ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,xm", help="how to handle url-form POSTs; see \033[33m--help-urlform\033[0m")
|
||||
ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="server terminal title, for example [\033[32m$ip-10.1.2.\033[0m] or [\033[32m$ip-]")
|
||||
ap2.add_argument("--name", metavar="TXT", type=u, default=srvname, help="server name (displayed topleft in browser and in mDNS)")
|
||||
ap2.add_argument("--mime", metavar="EXT=MIME", type=u, action="append", help="map file \033[33mEXT\033[0mension to \033[33mMIME\033[0mtype, for example [\033[32mjpg=image/jpeg\033[0m]")
|
||||
@@ -1328,6 +1344,7 @@ def add_admin(ap):
|
||||
ap2.add_argument("--no-ups-page", action="store_true", help="disable ?ru (list of recent uploads)")
|
||||
ap2.add_argument("--no-up-list", action="store_true", help="don't show list of incoming files in controlpanel")
|
||||
ap2.add_argument("--dl-list", metavar="LVL", type=int, default=2, help="who can see active downloads in the controlpanel? [\033[32m0\033[0m]=nobody, [\033[32m1\033[0m]=admins, [\033[32m2\033[0m]=everyone")
|
||||
ap2.add_argument("--ups-who", metavar="LVL", type=int, default=2, help="who can see recent uploads on the ?ru page? [\033[32m0\033[0m]=nobody, [\033[32m1\033[0m]=admins, [\033[32m2\033[0m]=everyone (volflag=ups_who)")
|
||||
ap2.add_argument("--ups-when", action="store_true", help="let everyone see upload timestamps on the ?ru page, not just admins")
|
||||
|
||||
|
||||
@@ -1368,6 +1385,8 @@ def add_transcoding(ap):
|
||||
ap2 = ap.add_argument_group('transcoding options')
|
||||
ap2.add_argument("--q-opus", metavar="KBPS", type=int, default=128, help="target bitrate for transcoding to opus; set 0 to disable")
|
||||
ap2.add_argument("--q-mp3", metavar="QUALITY", type=u, default="q2", help="target quality for transcoding to mp3, for example [\033[32m192k\033[0m] (CBR) or [\033[32mq0\033[0m] (CQ/CRF, q0=maxquality, q9=smallest); set 0 to disable")
|
||||
ap2.add_argument("--no-caf", action="store_true", help="disable transcoding to caf-opus (affects iOS v12~v17), will use mp3 instead")
|
||||
ap2.add_argument("--no-owa", action="store_true", help="disable transcoding to webm-opus (iOS v18 and later), will use mp3 instead")
|
||||
ap2.add_argument("--no-acode", action="store_true", help="disable audio transcoding")
|
||||
ap2.add_argument("--no-bacode", action="store_true", help="disable batch audio transcoding by folder download (zip/tar)")
|
||||
ap2.add_argument("--ac-maxage", metavar="SEC", type=int, default=86400, help="delete cached transcode output after \033[33mSEC\033[0m seconds")
|
||||
@@ -1476,12 +1495,14 @@ def add_ui(ap, retry):
|
||||
ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
|
||||
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty @ --name", help="title / service-name to show in html documents")
|
||||
ap2.add_argument("--bname", metavar="TXT", type=u, default="--name", help="server name (displayed in filebrowser document title)")
|
||||
ap2.add_argument("--pb-url", metavar="URL", type=u, default="https://github.com/9001/copyparty", help="powered-by link; disable with \033[33m-np\033[0m")
|
||||
ap2.add_argument("--pb-url", metavar="URL", type=u, default=URL_PRJ, help="powered-by link; disable with \033[33m-np\033[0m")
|
||||
ap2.add_argument("--ver", action="store_true", help="show version on the control panel (incompatible with \033[33m-nb\033[0m)")
|
||||
ap2.add_argument("--k304", metavar="NUM", type=int, default=0, help="configure the option to enable/disable k304 on the controlpanel (workaround for buggy reverse-proxies); [\033[32m0\033[0m] = hidden and default-off, [\033[32m1\033[0m] = visible and default-off, [\033[32m2\033[0m] = visible and default-on")
|
||||
ap2.add_argument("--no304", metavar="NUM", type=int, default=0, help="configure the option to enable/disable no304 on the controlpanel (workaround for buggy caching in browsers); [\033[32m0\033[0m] = hidden and default-off, [\033[32m1\033[0m] = visible and default-off, [\033[32m2\033[0m] = visible and default-on")
|
||||
ap2.add_argument("--md-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for README.md docs (volflag=md_sbf); see https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe#attr-sandbox")
|
||||
ap2.add_argument("--lg-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for prologue/epilogue docs (volflag=lg_sbf)")
|
||||
ap2.add_argument("--md-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to allow in the iframe 'sandbox' attribute for README.md docs (volflag=md_sbf); see https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe#sandbox")
|
||||
ap2.add_argument("--lg-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to allow in the iframe 'sandbox' attribute for prologue/epilogue docs (volflag=lg_sbf)")
|
||||
ap2.add_argument("--md-sba", metavar="TXT", type=u, default="", help="the value of the iframe 'allow' attribute for README.md docs, for example [\033[32mfullscreen\033[0m] (volflag=md_sba)")
|
||||
ap2.add_argument("--lg-sba", metavar="TXT", type=u, default="", help="the value of the iframe 'allow' attribute for prologue/epilogue docs (volflag=lg_sba); see https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Permissions-Policy#iframes")
|
||||
ap2.add_argument("--no-sb-md", action="store_true", help="don't sandbox README/PREADME.md documents (volflags: no_sb_md | sb_md)")
|
||||
ap2.add_argument("--no-sb-lg", action="store_true", help="don't sandbox prologue/epilogue docs (volflags: no_sb_lg | sb_lg); enables non-js support")
|
||||
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
# coding: utf-8
|
||||
|
||||
VERSION = (1, 16, 8)
|
||||
VERSION = (1, 16, 11)
|
||||
CODENAME = "COPYparty"
|
||||
BUILD_DT = (2025, 1, 11)
|
||||
BUILD_DT = (2025, 1, 27)
|
||||
|
||||
S_VERSION = ".".join(map(str, VERSION))
|
||||
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
|
||||
|
||||
@@ -1832,7 +1832,11 @@ class AuthSrv(object):
|
||||
if fka and not fk:
|
||||
fk = fka
|
||||
if fk:
|
||||
vol.flags["fk"] = int(fk) if fk is not True else 8
|
||||
fk = 8 if fk is True else int(fk)
|
||||
if fk > 72:
|
||||
t = "max filekey-length is 72; volume /%s specified %d (anything higher than 16 is pointless btw)"
|
||||
raise Exception(t % (vol.vpath, fk))
|
||||
vol.flags["fk"] = fk
|
||||
have_fk = True
|
||||
|
||||
dk = vol.flags.get("dk")
|
||||
@@ -2339,6 +2343,7 @@ class AuthSrv(object):
|
||||
"frand": bool(vf.get("rand")),
|
||||
"lifetime": vf.get("lifetime") or 0,
|
||||
"unlist": vf.get("unlist") or "",
|
||||
"sb_lg": "" if "no_sb_lg" in vf else (vf.get("lg_sbf") or "y"),
|
||||
}
|
||||
js_htm = {
|
||||
"s_name": self.args.bname,
|
||||
@@ -2351,6 +2356,8 @@ class AuthSrv(object):
|
||||
"have_unpost": int(self.args.unpost),
|
||||
"have_emp": self.args.emp,
|
||||
"sb_md": "" if "no_sb_md" in vf else (vf.get("md_sbf") or "y"),
|
||||
"sba_md": vf.get("md_sba") or "",
|
||||
"sba_lg": vf.get("lg_sba") or "",
|
||||
"txt_ext": self.args.textfiles.replace(",", " "),
|
||||
"def_hcols": list(vf.get("mth") or []),
|
||||
"unlist0": vf.get("unlist") or "",
|
||||
|
||||
@@ -74,6 +74,8 @@ def vf_vmap() -> dict[str, str]:
|
||||
"html_head",
|
||||
"lg_sbf",
|
||||
"md_sbf",
|
||||
"lg_sba",
|
||||
"md_sba",
|
||||
"nrand",
|
||||
"og_desc",
|
||||
"og_site",
|
||||
@@ -91,6 +93,7 @@ def vf_vmap() -> dict[str, str]:
|
||||
"unlist",
|
||||
"u2abort",
|
||||
"u2ts",
|
||||
"ups_who",
|
||||
):
|
||||
ret[k] = k
|
||||
return ret
|
||||
@@ -144,6 +147,7 @@ flagcats = {
|
||||
"noclone": "take dupe data from clients, even if available on HDD",
|
||||
"nodupe": "rejects existing files (instead of linking/cloning them)",
|
||||
"sparse": "force use of sparse files, mainly for s3-backed storage",
|
||||
"nosparse": "deny use of sparse files, mainly for slow storage",
|
||||
"daw": "enable full WebDAV write support (dangerous);\nPUT-operations will now \033[1;31mOVERWRITE\033[0;35m existing files",
|
||||
"nosub": "forces all uploads into the top folder of the vfs",
|
||||
"magic": "enables filetype detection for nameless uploads",
|
||||
@@ -240,6 +244,8 @@ flagcats = {
|
||||
"sb_lg": "enable js sandbox for prologue/epilogue (default)",
|
||||
"md_sbf": "list of markdown-sandbox safeguards to disable",
|
||||
"lg_sbf": "list of *logue-sandbox safeguards to disable",
|
||||
"md_sba": "value of iframe allow-prop for markdown-sandbox",
|
||||
"lg_sba": "value of iframe allow-prop for *logue-sandbox",
|
||||
"nohtml": "return html and markdown as text/html",
|
||||
},
|
||||
"others": {
|
||||
|
||||
@@ -1,3 +1,6 @@
|
||||
# coding: utf-8
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import importlib
|
||||
import sys
|
||||
import xml.etree.ElementTree as ET
|
||||
@@ -8,6 +11,10 @@ if True: # pylint: disable=using-constant-test
|
||||
from typing import Any, Optional
|
||||
|
||||
|
||||
class BadXML(Exception):
|
||||
pass
|
||||
|
||||
|
||||
def get_ET() -> ET.XMLParser:
|
||||
pn = "xml.etree.ElementTree"
|
||||
cn = "_elementtree"
|
||||
@@ -34,7 +41,7 @@ def get_ET() -> ET.XMLParser:
|
||||
XMLParser: ET.XMLParser = get_ET()
|
||||
|
||||
|
||||
class DXMLParser(XMLParser): # type: ignore
|
||||
class _DXMLParser(XMLParser): # type: ignore
|
||||
def __init__(self) -> None:
|
||||
tb = ET.TreeBuilder()
|
||||
super(DXMLParser, self).__init__(target=tb)
|
||||
@@ -49,8 +56,12 @@ class DXMLParser(XMLParser): # type: ignore
|
||||
raise BadXML("{}, {}".format(a, ka))
|
||||
|
||||
|
||||
class BadXML(Exception):
|
||||
pass
|
||||
class _NG(XMLParser): # type: ignore
|
||||
def __int__(self) -> None:
|
||||
raise BadXML("dxml selftest failed")
|
||||
|
||||
|
||||
DXMLParser = _DXMLParser
|
||||
|
||||
|
||||
def parse_xml(txt: str) -> ET.Element:
|
||||
@@ -59,6 +70,40 @@ def parse_xml(txt: str) -> ET.Element:
|
||||
return parser.close() # type: ignore
|
||||
|
||||
|
||||
def selftest() -> bool:
|
||||
qbe = r"""<!DOCTYPE d [
|
||||
<!ENTITY a "nice_bakuretsu">
|
||||
]>
|
||||
<root>&a;&a;&a;</root>"""
|
||||
|
||||
emb = r"""<!DOCTYPE d [
|
||||
<!ENTITY a SYSTEM "file:///etc/hostname">
|
||||
]>
|
||||
<root>&a;</root>"""
|
||||
|
||||
# future-proofing; there's never been any known vulns
|
||||
# regarding DTDs and ET.XMLParser, but might as well
|
||||
# block them since webdav-clients don't use them
|
||||
dtd = r"""<!DOCTYPE d SYSTEM "a.dtd">
|
||||
<root>a</root>"""
|
||||
|
||||
for txt in (qbe, emb, dtd):
|
||||
try:
|
||||
parse_xml(txt)
|
||||
t = "WARNING: dxml selftest failed:\n%s\n"
|
||||
print(t % (txt,), file=sys.stderr)
|
||||
return False
|
||||
except BadXML:
|
||||
pass
|
||||
|
||||
return True
|
||||
|
||||
|
||||
DXML_OK = selftest()
|
||||
if not DXML_OK:
|
||||
DXMLParser = _NG
|
||||
|
||||
|
||||
def mktnod(name: str, text: str) -> ET.Element:
|
||||
el = ET.Element(name)
|
||||
el.text = text
|
||||
|
||||
@@ -132,6 +132,8 @@ NO_CACHE = {"Cache-Control": "no-cache"}
|
||||
|
||||
ALL_COOKIES = "k304 no304 js idxh dots cppwd cppws".split()
|
||||
|
||||
BADXFF = " due to dangerous misconfiguration (the http-header specified by --xff-hdr was received from an untrusted reverse-proxy)"
|
||||
|
||||
H_CONN_KEEPALIVE = "Connection: Keep-Alive"
|
||||
H_CONN_CLOSE = "Connection: Close"
|
||||
|
||||
@@ -162,6 +164,8 @@ class HttpCli(object):
|
||||
def __init__(self, conn: "HttpConn") -> None:
|
||||
assert conn.sr # !rm
|
||||
|
||||
empty_stringlist: list[str] = []
|
||||
|
||||
self.t0 = time.time()
|
||||
self.conn = conn
|
||||
self.u2mutex = conn.u2mutex # mypy404
|
||||
@@ -207,9 +211,7 @@ class HttpCli(object):
|
||||
self.trailing_slash = True
|
||||
self.uname = " "
|
||||
self.pw = " "
|
||||
self.rvol = [" "]
|
||||
self.wvol = [" "]
|
||||
self.avol = [" "]
|
||||
self.rvol = self.wvol = self.avol = empty_stringlist
|
||||
self.do_log = True
|
||||
self.can_read = False
|
||||
self.can_write = False
|
||||
@@ -390,6 +392,7 @@ class HttpCli(object):
|
||||
) + "0.0/16"
|
||||
zs2 = ' or "--xff-src=lan"' if self.conn.xff_lan.map(pip) else ""
|
||||
self.log(t % (self.args.xff_hdr, pip, cli_ip, zso, zs, zs2), 3)
|
||||
self.bad_xff = True
|
||||
else:
|
||||
self.ip = cli_ip
|
||||
self.is_vproxied = bool(self.args.R)
|
||||
@@ -510,7 +513,7 @@ class HttpCli(object):
|
||||
return False
|
||||
|
||||
if "k" in uparam:
|
||||
m = RE_K.search(uparam["k"])
|
||||
m = re_k.search(uparam["k"])
|
||||
if m:
|
||||
zs = uparam["k"]
|
||||
t = "malicious user; illegal filekey; req(%r) k(%r) => %r"
|
||||
@@ -1896,6 +1899,9 @@ class HttpCli(object):
|
||||
if "stash" in opt:
|
||||
return self.handle_stash(False)
|
||||
|
||||
xm = []
|
||||
xm_rsp = {}
|
||||
|
||||
if "save" in opt:
|
||||
post_sz, _, _, _, _, path, _ = self.dump_to_file(False)
|
||||
self.log("urlform: %d bytes, %r" % (post_sz, path))
|
||||
@@ -1918,7 +1924,7 @@ class HttpCli(object):
|
||||
plain = plain[4:]
|
||||
xm = self.vn.flags.get("xm")
|
||||
if xm:
|
||||
runhook(
|
||||
xm_rsp = runhook(
|
||||
self.log,
|
||||
self.conn.hsrv.broker,
|
||||
None,
|
||||
@@ -1942,6 +1948,13 @@ class HttpCli(object):
|
||||
except Exception as ex:
|
||||
self.log(repr(ex))
|
||||
|
||||
if "xm" in opt:
|
||||
if xm:
|
||||
self.loud_reply(xm_rsp.get("stdout") or "", status=202)
|
||||
return True
|
||||
else:
|
||||
return self.handle_get()
|
||||
|
||||
if "get" in opt:
|
||||
return self.handle_get()
|
||||
|
||||
@@ -4354,7 +4367,7 @@ class HttpCli(object):
|
||||
self.log,
|
||||
self.asrv,
|
||||
fgen,
|
||||
utf8="utf" in uarg,
|
||||
utf8="utf" in uarg or not uarg,
|
||||
pre_crc="crc" in uarg,
|
||||
cmp=uarg if cancmp or uarg == "pax" else "",
|
||||
)
|
||||
@@ -5003,8 +5016,16 @@ class HttpCli(object):
|
||||
and (self.uname in vol.axs.uread or self.uname in vol.axs.upget)
|
||||
}
|
||||
|
||||
bad_xff = hasattr(self, "bad_xff")
|
||||
if bad_xff:
|
||||
allvols = []
|
||||
t = "will not return list of recent uploads" + BADXFF
|
||||
self.log(t, 1)
|
||||
if self.avol:
|
||||
raise Pebkac(500, t)
|
||||
|
||||
x = self.conn.hsrv.broker.ask(
|
||||
"up2k.get_unfinished_by_user", self.uname, self.ip
|
||||
"up2k.get_unfinished_by_user", self.uname, "" if bad_xff else self.ip
|
||||
)
|
||||
uret = x.get()
|
||||
|
||||
@@ -5131,6 +5152,12 @@ class HttpCli(object):
|
||||
adm = "*" in vol.axs.uadmin or self.uname in vol.axs.uadmin
|
||||
dots = "*" in vol.axs.udot or self.uname in vol.axs.udot
|
||||
|
||||
lvl = int(vol.flags["ups_who"])
|
||||
if not lvl:
|
||||
continue
|
||||
elif lvl == 1 and not adm:
|
||||
continue
|
||||
|
||||
n = 1000
|
||||
q = "select sz, rd, fn, ip, at from up where at>0 order by at desc"
|
||||
for sz, rd, fn, ip, at in cur.execute(q):
|
||||
@@ -5399,12 +5426,16 @@ class HttpCli(object):
|
||||
if self.args.no_del:
|
||||
raise Pebkac(403, "the delete feature is disabled in server config")
|
||||
|
||||
unpost = "unpost" in self.uparam
|
||||
if unpost and hasattr(self, "bad_xff"):
|
||||
self.log("unpost was denied" + BADXFF, 1)
|
||||
raise Pebkac(403, "the delete feature is disabled in server config")
|
||||
|
||||
if not req:
|
||||
req = [self.vpath]
|
||||
elif self.is_vproxied:
|
||||
req = [x[len(self.args.SR) :] for x in req]
|
||||
|
||||
unpost = "unpost" in self.uparam
|
||||
nlim = int(self.uparam.get("lim") or 0)
|
||||
lim = [nlim, nlim] if nlim else []
|
||||
|
||||
@@ -5804,7 +5835,7 @@ class HttpCli(object):
|
||||
"taglist": [],
|
||||
"have_tags_idx": int(e2t),
|
||||
"have_b_u": (self.can_write and self.uparam.get("b") == "u"),
|
||||
"sb_lg": "" if "no_sb_lg" in vf else (vf.get("lg_sbf") or "y"),
|
||||
"sb_lg": vn.js_ls["sb_lg"],
|
||||
"url_suf": url_suf,
|
||||
"title": html_escape("%s %s" % (self.args.bname, self.vpath), crlf=True),
|
||||
"srv_info": srv_infot,
|
||||
|
||||
@@ -50,6 +50,8 @@ from .util import (
|
||||
FFMPEG_URL,
|
||||
HAVE_PSUTIL,
|
||||
HAVE_SQLITE3,
|
||||
HAVE_ZMQ,
|
||||
URL_BUG,
|
||||
UTC,
|
||||
VERSIONS,
|
||||
Daemon,
|
||||
@@ -60,6 +62,7 @@ from .util import (
|
||||
alltrace,
|
||||
ansi_re,
|
||||
build_netmap,
|
||||
expat_ver,
|
||||
load_ipu,
|
||||
min_ex,
|
||||
mp,
|
||||
@@ -639,6 +642,7 @@ class SvcHub(object):
|
||||
(HAVE_FFPROBE, "ffprobe", t_ff + ", read audio/media tags"),
|
||||
(HAVE_MUTAGEN, "mutagen", "read audio tags (ffprobe is better but slower)"),
|
||||
(HAVE_ARGON2, "argon2", "secure password hashing (advanced users only)"),
|
||||
(HAVE_ZMQ, "pyzmq", "send zeromq messages from event-hooks"),
|
||||
(HAVE_HEIF, "pillow-heif", "read .heif images with pillow (rarely useful)"),
|
||||
(HAVE_AVIF, "pillow-avif", "read .avif images with pillow (rarely useful)"),
|
||||
]
|
||||
@@ -695,6 +699,15 @@ class SvcHub(object):
|
||||
if self.args.bauth_last:
|
||||
self.log("root", "WARNING: ignoring --bauth-last due to --no-bauth", 3)
|
||||
|
||||
if not self.args.no_dav:
|
||||
from .dxml import DXML_OK
|
||||
|
||||
if not DXML_OK:
|
||||
if not self.args.no_dav:
|
||||
self.args.no_dav = True
|
||||
t = "WARNING:\nDisabling WebDAV support because dxml selftest failed. Please report this bug;\n%s\n...and include the following information in the bug-report:\n%s | expat %s\n"
|
||||
self.log("root", t % (URL_BUG, VERSIONS, expat_ver()), 1)
|
||||
|
||||
def _process_config(self) -> bool:
|
||||
al = self.args
|
||||
|
||||
|
||||
@@ -6,7 +6,7 @@ import os
|
||||
from .__init__ import TYPE_CHECKING
|
||||
from .authsrv import VFS
|
||||
from .bos import bos
|
||||
from .th_srv import HAVE_WEBP, thumb_path
|
||||
from .th_srv import EXTS_AC, HAVE_WEBP, thumb_path
|
||||
from .util import Cooldown
|
||||
|
||||
if True: # pylint: disable=using-constant-test
|
||||
@@ -57,13 +57,17 @@ class ThumbCli(object):
|
||||
if is_vid and "dvthumb" in dbv.flags:
|
||||
return None
|
||||
|
||||
want_opus = fmt in ("opus", "caf", "mp3")
|
||||
want_opus = fmt in EXTS_AC
|
||||
is_au = ext in self.fmt_ffa
|
||||
is_vau = want_opus and ext in self.fmt_ffv
|
||||
if is_au or is_vau:
|
||||
if want_opus:
|
||||
if self.args.no_acode:
|
||||
return None
|
||||
elif fmt == "caf" and self.args.no_caf:
|
||||
fmt = "mp3"
|
||||
elif fmt == "owa" and self.args.no_owa:
|
||||
fmt = "mp3"
|
||||
else:
|
||||
if "dathumb" in dbv.flags:
|
||||
return None
|
||||
|
||||
@@ -32,7 +32,7 @@ from .util import (
|
||||
)
|
||||
|
||||
if True: # pylint: disable=using-constant-test
|
||||
from typing import Optional, Union
|
||||
from typing import Any, Optional, Union
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .svchub import SvcHub
|
||||
@@ -46,6 +46,9 @@ HAVE_HEIF = False
|
||||
HAVE_AVIF = False
|
||||
HAVE_WEBP = False
|
||||
|
||||
EXTS_TH = set(["jpg", "webp", "png"])
|
||||
EXTS_AC = set(["opus", "owa", "caf", "mp3"])
|
||||
|
||||
try:
|
||||
if os.environ.get("PRTY_NO_PIL"):
|
||||
raise Exception()
|
||||
@@ -139,7 +142,7 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -
|
||||
h = hashlib.sha512(afsenc(fn)).digest()
|
||||
fn = ub64enc(h).decode("ascii")[:24]
|
||||
|
||||
if fmt in ("opus", "caf", "mp3"):
|
||||
if fmt in EXTS_AC:
|
||||
cat = "ac"
|
||||
else:
|
||||
fc = fmt[:1]
|
||||
@@ -334,9 +337,10 @@ class ThumbSrv(object):
|
||||
ap_unpk = abspath
|
||||
|
||||
if not bos.path.exists(tpath):
|
||||
want_mp3 = tpath.endswith(".mp3")
|
||||
want_opus = tpath.endswith(".opus") or tpath.endswith(".caf")
|
||||
want_png = tpath.endswith(".png")
|
||||
tex = tpath.rsplit(".", 1)[-1]
|
||||
want_mp3 = tex == "mp3"
|
||||
want_opus = tex in ("opus", "owa", "caf")
|
||||
want_png = tex == "png"
|
||||
want_au = want_mp3 or want_opus
|
||||
for lib in self.args.th_dec:
|
||||
can_au = lib == "ff" and (
|
||||
@@ -754,25 +758,37 @@ class ThumbSrv(object):
|
||||
if "ac" not in tags:
|
||||
raise Exception("not audio")
|
||||
|
||||
try:
|
||||
dur = tags[".dur"][1]
|
||||
except:
|
||||
dur = 0
|
||||
sq = "%dk" % (self.args.q_opus,)
|
||||
bq = sq.encode("ascii")
|
||||
if tags["ac"][1] == "opus":
|
||||
enc = "-c:a copy"
|
||||
else:
|
||||
enc = "-c:a libopus -b:a " + sq
|
||||
|
||||
src_opus = abspath.lower().endswith(".opus") or tags["ac"][1] == "opus"
|
||||
want_caf = tpath.endswith(".caf")
|
||||
tmp_opus = tpath
|
||||
if want_caf:
|
||||
tmp_opus = tpath + ".opus"
|
||||
try:
|
||||
wunlink(self.log, tmp_opus, vn.flags)
|
||||
except:
|
||||
pass
|
||||
fun = self._conv_caf if fmt == "caf" else self._conv_owa
|
||||
|
||||
caf_src = abspath if src_opus else tmp_opus
|
||||
bq = ("%dk" % (self.args.q_opus,)).encode("ascii")
|
||||
fun(abspath, tpath, tags, rawtags, enc, bq, vn)
|
||||
|
||||
def _conv_owa(
|
||||
self,
|
||||
abspath: str,
|
||||
tpath: str,
|
||||
tags: dict[str, tuple[int, Any]],
|
||||
rawtags: dict[str, list[Any]],
|
||||
enc: str,
|
||||
bq: bytes,
|
||||
vn: VFS,
|
||||
) -> None:
|
||||
if tpath.endswith(".owa"):
|
||||
container = b"webm"
|
||||
tagset = [b"-map_metadata", b"-1"]
|
||||
else:
|
||||
container = b"opus"
|
||||
tagset = self.big_tags(rawtags)
|
||||
|
||||
self.log("conv2 %s [%s]" % (container, enc), 6)
|
||||
benc = enc.encode("ascii").split(b" ")
|
||||
|
||||
if not want_caf or not src_opus:
|
||||
# fmt: off
|
||||
cmd = [
|
||||
b"ffmpeg",
|
||||
@@ -780,10 +796,50 @@ class ThumbSrv(object):
|
||||
b"-v", b"error",
|
||||
b"-hide_banner",
|
||||
b"-i", fsenc(abspath),
|
||||
] + self.big_tags(rawtags) + [
|
||||
] + tagset + [
|
||||
b"-map", b"0:a:0",
|
||||
b"-c:a", b"libopus",
|
||||
b"-b:a", bq,
|
||||
] + benc + [
|
||||
b"-f", container,
|
||||
fsenc(tpath)
|
||||
]
|
||||
# fmt: on
|
||||
self._run_ff(cmd, vn, oom=300)
|
||||
|
||||
def _conv_caf(
|
||||
self,
|
||||
abspath: str,
|
||||
tpath: str,
|
||||
tags: dict[str, tuple[int, Any]],
|
||||
rawtags: dict[str, list[Any]],
|
||||
enc: str,
|
||||
bq: bytes,
|
||||
vn: VFS,
|
||||
) -> None:
|
||||
tmp_opus = tpath + ".opus"
|
||||
try:
|
||||
wunlink(self.log, tmp_opus, vn.flags)
|
||||
except:
|
||||
pass
|
||||
|
||||
try:
|
||||
dur = tags[".dur"][1]
|
||||
except:
|
||||
dur = 0
|
||||
|
||||
self.log("conv2 caf-tmp [%s]" % (enc,), 6)
|
||||
benc = enc.encode("ascii").split(b" ")
|
||||
|
||||
# fmt: off
|
||||
cmd = [
|
||||
b"ffmpeg",
|
||||
b"-nostdin",
|
||||
b"-v", b"error",
|
||||
b"-hide_banner",
|
||||
b"-i", fsenc(abspath),
|
||||
b"-map_metadata", b"-1",
|
||||
b"-map", b"0:a:0",
|
||||
] + benc + [
|
||||
b"-f", b"opus",
|
||||
fsenc(tmp_opus)
|
||||
]
|
||||
# fmt: on
|
||||
@@ -794,7 +850,10 @@ class ThumbSrv(object):
|
||||
# fix that by mixing in some inaudible pink noise :^)
|
||||
# 6.3 sec seems like the cutoff so lets do 7, and
|
||||
# 7 sec of psyqui-musou.opus @ 3:50 is 174 KiB
|
||||
if want_caf and (dur < 20 or bos.path.getsize(caf_src) < 256 * 1024):
|
||||
sz = bos.path.getsize(tmp_opus)
|
||||
if dur < 20 or sz < 256 * 1024:
|
||||
zs = bq.decode("ascii")
|
||||
self.log("conv2 caf-transcode; dur=%d sz=%d q=%s" % (dur, sz, zs), 6)
|
||||
# fmt: off
|
||||
cmd = [
|
||||
b"ffmpeg",
|
||||
@@ -813,15 +872,16 @@ class ThumbSrv(object):
|
||||
# fmt: on
|
||||
self._run_ff(cmd, vn, oom=300)
|
||||
|
||||
elif want_caf:
|
||||
else:
|
||||
# simple remux should be safe
|
||||
self.log("conv2 caf-remux; dur=%d sz=%d" % (dur, sz), 6)
|
||||
# fmt: off
|
||||
cmd = [
|
||||
b"ffmpeg",
|
||||
b"-nostdin",
|
||||
b"-v", b"error",
|
||||
b"-hide_banner",
|
||||
b"-i", fsenc(abspath if src_opus else tmp_opus),
|
||||
b"-i", fsenc(tmp_opus),
|
||||
b"-map_metadata", b"-1",
|
||||
b"-map", b"0:a:0",
|
||||
b"-c:a", b"copy",
|
||||
@@ -831,7 +891,6 @@ class ThumbSrv(object):
|
||||
# fmt: on
|
||||
self._run_ff(cmd, vn, oom=300)
|
||||
|
||||
if tmp_opus != tpath:
|
||||
try:
|
||||
wunlink(self.log, tmp_opus, vn.flags)
|
||||
except:
|
||||
@@ -891,7 +950,7 @@ class ThumbSrv(object):
|
||||
|
||||
def _clean(self, cat: str, thumbpath: str) -> int:
|
||||
# self.log("cln {}".format(thumbpath))
|
||||
exts = ["jpg", "webp", "png"] if cat == "th" else ["opus", "caf", "mp3"]
|
||||
exts = EXTS_TH if cat == "th" else EXTS_AC
|
||||
maxage = getattr(self.args, cat + "_maxage")
now = time.time()
prev_b64 = None

@@ -795,7 +795,7 @@ class Up2k(object):
continue

self.log("xiu: %d# %r" % (len(wrfs), cmd))
runihook(self.log, cmd, vol, ups)
runihook(self.log, self.args.hook_v, cmd, vol, ups)

def _vis_job_progress(self, job: dict[str, Any]) -> str:
perc = 100 - (len(job["need"]) * 100.0 / (len(job["hash"]) or 1))
@@ -4905,7 +4905,8 @@ class Up2k(object):
except:
pass

xbu = self.flags[job["ptop"]].get("xbu")
vf = self.flags[job["ptop"]]
xbu = vf.get("xbu")
ap_chk = djoin(pdir, job["name"])
vp_chk = djoin(job["vtop"], job["prel"], job["name"])
if xbu:
@@ -4935,7 +4936,7 @@ class Up2k(object):
if x:
zvfs = vfs
pdir, _, job["name"], (vfs, rem) = x
job["vcfg"] = vfs.flags
job["vcfg"] = vf = vfs.flags
job["ptop"] = vfs.realpath
job["vtop"] = vfs.vpath
job["prel"] = rem
@@ -4985,8 +4986,13 @@ class Up2k(object):
fs = self.fstab.get(pdir)
if fs == "ok":
pass
elif "sparse" in self.flags[job["ptop"]]:
t = "volflag 'sparse' is forcing use of sparse files for uploads to [%s]"
elif "nosparse" in vf:
t = "volflag 'nosparse' is preventing creation of sparse files for uploads to [%s]"
self.log(t % (job["ptop"],))
relabel = True
sprs = False
elif "sparse" in vf:
t = "volflag 'sparse' is forcing creation of sparse files for uploads to [%s]"
self.log(t % (job["ptop"],))
relabel = True
else:

@@ -120,6 +120,13 @@ try:
except:
HAVE_SQLITE3 = False

try:
import importlib.util

HAVE_ZMQ = bool(importlib.util.find_spec("zmq"))
except:
HAVE_ZMQ = False

try:
if os.environ.get("PRTY_NO_PSUTIL"):
raise Exception()
@@ -229,9 +236,14 @@ META_NOBOTS = '<meta name="robots" content="noindex, nofollow">\n'

FFMPEG_URL = "https://www.gyan.dev/ffmpeg/builds/ffmpeg-git-full.7z"

URL_PRJ = "https://github.com/9001/copyparty"

URL_BUG = URL_PRJ + "/issues/new?labels=bug&template=bug_report.md"

HTTPCODE = {
200: "OK",
201: "Created",
202: "Accepted",
204: "No Content",
206: "Partial Content",
207: "Multi-Status",
@@ -319,6 +331,7 @@ DAV_ALLPROPS = set(DAV_ALLPROP_L)

MIMES = {
"opus": "audio/ogg; codecs=opus",
"owa": "audio/webm; codecs=opus",
}

@@ -491,6 +504,15 @@ def py_desc() -> str:
)

def expat_ver() -> str:
try:
import pyexpat

return ".".join([str(x) for x in pyexpat.version_info])
except:
return "?"

def _sqlite_ver() -> str:
assert sqlite3 # type: ignore # !rm
try:
@@ -3386,6 +3408,7 @@ def _parsehook(

def runihook(
log: Optional["NamedLogger"],
verbose: bool,
cmd: str,
vol: "VFS",
ups: list[tuple[str, int, int, str, str, str, int]],
@@ -3415,6 +3438,17 @@ def runihook(
else:
sp_ka["sin"] = b"\n".join(fsenc(x) for x in aps)

if acmd[0].startswith("zmq:"):
try:
msg = sp_ka["sin"].decode("utf-8", "replace")
_zmq_hook(log, verbose, "xiu", acmd[0][4:].lower(), msg, wait, sp_ka)
if verbose and log:
log("hook(xiu) %r OK" % (cmd,), 6)
except Exception as ex:
if log:
log("zeromq failed: %r" % (ex,))
return True

t0 = time.time()
if fork:
Daemon(runcmd, cmd, bcmd, ka=sp_ka)
@@ -3424,6 +3458,7 @@ def runihook(
retchk(rc, bcmd, err, log, 5)
return False

if wait:
wait -= time.time() - t0
if wait > 0:
time.sleep(wait)
@@ -3431,8 +3466,118 @@ def runihook(
return True

ZMQ = {}
ZMQ_DESC = {
"pub": "fire-and-forget to all/any connected SUB-clients",
"push": "fire-and-forget to one of the connected PULL-clients",
"req": "send messages to a REP-server and blocking-wait for ack",
}

def _zmq_hook(
log: Optional["NamedLogger"],
verbose: bool,
src: str,
cmd: str,
msg: str,
wait: float,
sp_ka: dict[str, Any],
) -> tuple[int, str]:
import zmq

try:
mtx = ZMQ["mtx"]
except:
ZMQ["mtx"] = threading.Lock()
time.sleep(0.1)
mtx = ZMQ["mtx"]

ret = ""
nret = 0
t0 = time.time()
if verbose and log:
log("hook(%s) %r entering zmq-main-lock" % (src, cmd), 6)

with mtx:
try:
mode, sck, mtx = ZMQ[cmd]
except:
mode, uri = cmd.split(":", 1)
try:
desc = ZMQ_DESC[mode]
if log:
t = "libzmq(%s) pyzmq(%s) init(%s); %s"
log(t % (zmq.zmq_version(), zmq.__version__, cmd, desc))
except:
raise Exception("the only supported ZMQ modes are REQ PUB PUSH")

try:
ctx = ZMQ["ctx"]
except:
ctx = ZMQ["ctx"] = zmq.Context()

timeout = sp_ka["timeout"]

if mode == "pub":
sck = ctx.socket(zmq.PUB)
sck.setsockopt(zmq.LINGER, 0)
sck.bind(uri)
time.sleep(1) # give clients time to connect; avoids losing first msg
elif mode == "push":
sck = ctx.socket(zmq.PUSH)
if timeout:
sck.SNDTIMEO = int(timeout * 1000)
sck.setsockopt(zmq.LINGER, 0)
sck.bind(uri)
elif mode == "req":
sck = ctx.socket(zmq.REQ)
if timeout:
sck.RCVTIMEO = int(timeout * 1000)
sck.setsockopt(zmq.LINGER, 0)
sck.connect(uri)
else:
raise Exception()

mtx = threading.Lock()
ZMQ[cmd] = (mode, sck, mtx)

if verbose and log:
log("hook(%s) %r entering socket-lock" % (src, cmd), 6)

with mtx:
if verbose and log:
log("hook(%s) %r sending |%d|" % (src, cmd, len(msg)), 6)

sck.send_string(msg) # PUSH can safely timeout here

if mode == "req":
if verbose and log:
log("hook(%s) %r awaiting ack from req" % (src, cmd), 6)
try:
ret = sck.recv().decode("utf-8", "replace")
if ret.startswith("return "):
m = re.search("^return ([0-9]+)", ret[:12])
if m:
nret = int(m.group(1))
except:
sck.close()
del ZMQ[cmd] # bad state; must reset
raise Exception("ack timeout; zmq socket killed")

if ret and log:
log("hook(%s) %r ACK: %r" % (src, cmd, ret), 6)

if wait:
wait -= time.time() - t0
if wait > 0:
time.sleep(wait)

return nret, ret
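The `req` mode above blocking-waits for an acknowledgement from a REP server, and a reply starting with `return <digits>` is parsed into a numeric return code. A minimal sketch of such a listener, assuming pyzmq is installed; the port and the reply wording are illustrative and are not copied from copyparty's own `bin/zmq-recv.py`:

```python
# minimal REP listener sketch for a hook configured as zmq:req:tcp://...
# the port and the reply convention shown here are assumptions for illustration
import zmq

ctx = zmq.Context()
sck = ctx.socket(zmq.REP)
sck.bind("tcp://*:5555")  # must match the uri given in the hook definition

while True:
    msg = sck.recv_string()       # the message sent by _zmq_hook
    print("copyparty says:", msg)
    sck.send_string("return 0")   # a nonzero value is parsed as an error code
```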
def _runhook(
log: Optional["NamedLogger"],
verbose: bool,
src: str,
cmd: str,
ap: str,
@@ -3473,6 +3618,12 @@ def _runhook(
else:
arg = txt or ap

if acmd[0].startswith("zmq:"):
zi, zs = _zmq_hook(log, verbose, src, acmd[0][4:].lower(), arg, wait, sp_ka)
if zi:
raise Exception("zmq says %d" % (zi,))
return {"rc": 0, "stdout": zs}

acmd += [arg]
if acmd[0].endswith(".py"):
acmd = [pybin] + acmd
@@ -3501,6 +3652,7 @@ def _runhook(
except:
ret = {"rc": rc, "stdout": v}

if wait:
wait -= time.time() - t0
if wait > 0:
time.sleep(wait)
@@ -3527,14 +3679,15 @@ def runhook(
) -> dict[str, Any]:
assert broker or up2k # !rm
args = (broker or up2k).args
verbose = args.hook_v
vp = vp.replace("\\", "/")
ret = {"rc": 0}
for cmd in cmds:
try:
hr = _runhook(
log, src, cmd, ap, vp, host, uname, perms, mt, sz, ip, at, txt
log, verbose, src, cmd, ap, vp, host, uname, perms, mt, sz, ip, at, txt
)
if log and args.hook_v:
if verbose and log:
log("hook(%s) %r => \033[32m%s" % (src, cmd, hr), 6)
if not hr:
return {}
@@ -3550,6 +3703,8 @@ def runhook(
elif k in ret:
if k == "rc" and v:
ret[k] = v
elif k == "stdout" and v and not ret[k]:
ret[k] = v
else:
ret[k] = v
except Exception as ex:

@@ -241,7 +241,7 @@ var Ls = {
"cut_nag": "OS notification when upload completes$N(only if the browser or tab is not active)",
"cut_sfx": "audible alert when upload completes$N(only if the browser or tab is not active)",

"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$N30% faster https, 4.5x faster http,$Nand 5.3x faster on android phones\">mt",
"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$Nmakes https 30% faster, http 4.5x faster\">mt",

"cft_text": "favicon text (blank and refresh to disable)",
"cft_fg": "foreground color",
@@ -263,6 +263,7 @@ var Ls = {
"ml_pmode": "at end of folder...",
"ml_btns": "cmds",
"ml_tcode": "transcode",
"ml_tcode2": "transcode to",
"ml_tint": "tint",
"ml_eq": "audio equalizer",
"ml_drc": "dynamic range compressor",
@@ -286,6 +287,14 @@ var Ls = {
"mt_cflac": "convert flac / wav to opus\">flac",
"mt_caac": "convert aac / m4a to opus\">aac",
"mt_coth": "convert all others (not mp3) to opus\">oth",
"mt_c2opus": "best choice for desktops, laptops, android\">opus",
"mt_c2owa": "opus-weba, for iOS 17.5 and newer\">owa",
"mt_c2caf": "opus-caf, for iOS 11 through 17\">caf",
"mt_c2mp3": "use this on very old devices\">mp3",
"mt_c2ok": "nice, good choice",
"mt_c2nd": "that's not the recommended output format for your device, but that's fine",
"mt_c2ng": "your device does not seem to support this output format, but let's try anyways",
"mt_xowa": "there are bugs in iOS preventing background playback using this format; please use caf or mp3 instead",
"mt_tint": "background level (0-100) on the seekbar$Nto make buffering less distracting",
"mt_eq": "enables the equalizer and gain control;$N$Nboost <code>0</code> = standard 100% volume (unmodified)$N$Nwidth <code>1 </code> = standard stereo (unmodified)$Nwidth <code>0.5</code> = 50% left-right crossfeed$Nwidth <code>0 </code> = mono$N$Nboost <code>-0.8</code> & width <code>10</code> = vocal removal :^)$N$Nenabling the equalizer makes gapless albums fully gapless, so leave it on with all the values at zero (except width = 1) if you care about that",
"mt_drc": "enables the dynamic range compressor (volume flattener / brickwaller); will also enable EQ to balance the spaghetti, so set all EQ fields except for 'width' to 0 if you don't want it$N$Nlowers the volume of audio above THRESHOLD dB; for every RATIO dB past THRESHOLD there is 1 dB of output, so default values of tresh -24 and ratio 12 means it should never get louder than -22 dB and it is safe to increase the equalizer boost to 0.8, or even 1.8 with ATK 0 and a huge RLS like 90 (only works in firefox; RLS is max 1 in other browsers)$N$N(see wikipedia, they explain it much better)",
@@ -830,7 +839,7 @@ var Ls = {
"cut_nag": "meldingsvarsel når opplastning er ferdig$N(kun on nettleserfanen ikke er synlig)",
"cut_sfx": "lydvarsel når opplastning er ferdig$N(kun on nettleserfanen ikke er synlig)",

"cut_mt": "raskere befaring ved å bruke hele CPU'en$N$Ndenne funksjonen anvender web-workers$Nog krever mer RAM (opptil 512 MiB ekstra)$N$N30% raskere https, 4.5x raskere http,$Nog 5.3x raskere på android-telefoner\">mt",
"cut_mt": "raskere befaring ved å bruke hele CPU'en$N$Ndenne funksjonen anvender web-workers$Nog krever mer RAM (opptil 512 MiB ekstra)$N$Ngjør https 30% raskere, http 4.5x raskere\">mt",

"cft_text": "ikontekst (blank ut og last siden på nytt for å deaktivere)",
"cft_fg": "farge",
@@ -852,6 +861,7 @@ var Ls = {
"ml_pmode": "ved enden av mappen",
"ml_btns": "knapper",
"ml_tcode": "konvertering",
"ml_tcode2": "konverter til",
"ml_tint": "tint",
"ml_eq": "audio equalizer (tonejustering)",
"ml_drc": "compressor (volum-utjevning)",
@@ -875,6 +885,14 @@ var Ls = {
"mt_cflac": "konverter flac / wav-filer til opus\">flac",
"mt_caac": "konverter aac / m4a-filer til to opus\">aac",
"mt_coth": "konverter alt annet (men ikke mp3) til opus\">andre",
"mt_c2opus": "det beste valget for alle PCer og Android\">opus",
"mt_c2owa": "opus-weba, for iOS 17.5 og nyere\">owa",
"mt_c2caf": "opus-caf, for iOS 11 tilogmed 17\">caf",
"mt_c2mp3": "bra valg for steinalder-utstyr (slår aldri feil)\">mp3",
"mt_c2ok": "bra valg!",
"mt_c2nd": "ikke det foretrukne valget for din enhet, men funker sikkert greit",
"mt_c2ng": "ser virkelig ikke ut som enheten din takler dette formatet... men ok, vi prøver",
"mt_xowa": "iOS har fortsatt problemer med avspilling av owa-musikk i bakgrunnen. Bruk caf eller mp3 istedenfor",
"mt_tint": "nivå av bakgrunnsfarge på søkestripa (0-100),$Ngjør oppdateringer mindre distraherende",
"mt_eq": "aktiver tonekontroll og forsterker;$N$Nboost <code>0</code> = normal volumskala$N$Nwidth <code>1 </code> = normal stereo$Nwidth <code>0.5</code> = 50% blanding venstre-høyre$Nwidth <code>0 </code> = mono$N$Nboost <code>-0.8</code> & width <code>10</code> = instrumental :^)$N$Nreduserer også dødtid imellom sangfiler",
"mt_drc": "aktiver volum-utjevning (dynamic range compressor); vil også aktivere tonejustering, så sett alle EQ-feltene bortsett fra 'width' til 0 hvis du ikke vil ha noe EQ$N$Nfilteret vil dempe volumet på alt som er høyere enn TRESH dB; for hver RATIO dB over grensen er det 1dB som treffer høyttalerne, så standardverdiene tresh -24 og ratio 12 skal bety at volumet ikke går høyere enn -22 dB, slik at man trygt kan øke boost-verdien i equalizer'n til rundt 0.8, eller 1.8 kombinert med ATK 0 og RLS 90 (bare mulig i firefox; andre nettlesere tar ikke høyere RLS enn 1)$N$Nwikipedia forklarer dette mye bedre forresten",
@@ -1419,7 +1437,7 @@ var Ls = {
"cut_nag": "上传完成时的操作系统通知$N(仅当浏览器或标签页不活跃时)",
"cut_sfx": "上传完成时的声音警报$N(仅当浏览器或标签页不活跃时)",

"cut_mt": "使用多线程加速文件哈希$N$N这使用 Web Worker 并且需要更多内存(额外最多 512 MiB)$N$N比https快30%,http快4.5倍,比Android 手机快5.3倍\">mt",
"cut_mt": "使用多线程加速文件哈希$N$N这使用 Web Worker 并且需要更多内存(额外最多 512 MiB)$N$N这使得 https 快 30%,http 快 4.5 倍\">mt",

"cft_text": "网站图标文本(为空并刷新以禁用)",
"cft_fg": "前景色",
@@ -1441,6 +1459,7 @@ var Ls = {
"ml_pmode": "在文件夹末尾时...",
"ml_btns": "命令",
"ml_tcode": "转码",
"ml_tcode2": "转换为", //m
"ml_tint": "透明度",
"ml_eq": "音频均衡器",
"ml_drc": "动态范围压缩器",
@@ -1464,6 +1483,14 @@ var Ls = {
"mt_cflac": "将 flac / wav 转换为 opus\">flac",
"mt_caac": "将 aac / m4a 转换为 opus\">aac",
"mt_coth": "将所有其他(不是 mp3)转换为 opus\">oth",
"mt_c2opus": "适合桌面电脑、笔记本电脑和安卓设备的最佳选择\">opus", //m
"mt_c2owa": "opus-weba(适用于 iOS 17.5 及更新版本)\">owa", //m
"mt_c2caf": "opus-caf(适用于 iOS 11 到 iOS 17)\">caf", //m
"mt_c2mp3": "适用于非常旧的设备\">mp3", //m
"mt_c2ok": "不错的选择!", //m
"mt_c2nd": "这不是您的设备推荐的输出格式,但应该没问题。", //m
"mt_c2ng": "您的设备似乎不支持此输出格式,不过我们还是试试看吧。", //m
"mt_xowa": "iOS 系统仍存在无法后台播放 owa 音乐的错误,请改用 caf 或 mp3 格式。", //m
"mt_tint": "在进度条上设置背景级别(0-100)",
"mt_eq": "启用均衡器和增益控制;$N$Nboost <code>0</code> = 标准 100% 音量(默认)$N$Nwidth <code>1 </code> = 标准立体声(默认)$Nwidth <code>0.5</code> = 50% 左右交叉反馈$Nwidth <code>0 </code> = 单声道$N$Nboost <code>-0.8</code> & width <code>10</code> = 人声移除 )$N$N启用均衡器使无缝专辑完全无缝,所以如果你在乎这一点,请保持启用,所有值设为零(除了宽度 = 1)",
"mt_drc": "启用动态范围压缩器(音量平滑器 / 限幅器);还会启用均衡器以平衡音频,因此如果你不想要它,请将均衡器字段除了 '宽度' 外的所有字段设置为 0$N$N降低 THRESHOLD dB 以上的音频的音量;每超过 THRESHOLD dB 的 RATIO 会有 1 dB 输出,所以默认值 tresh -24 和 ratio 12 意味着它的音量不应超过 -22 dB,可以安全地将均衡器增益提高到 0.8,甚至在 ATK 0 和 RLS 如 90 的情况下提高到 1.8(仅在 Firefox 中有效;其他浏览器中 RLS 最大为 1)$N$N(见维基百科,他们解释得更好)",
@@ -2269,6 +2296,12 @@ var mpl = (function () {
'<a href="#" id="ac_flac" class="tgl btn" tt="' + L.mt_cflac + '</a>' +
'<a href="#" id="ac_aac" class="tgl btn" tt="' + L.mt_caac + '</a>' +
'<a href="#" id="ac_oth" class="tgl btn" tt="' + L.mt_coth + '</a>' +
'</div></div>' +
'<div><h3>' + L.ml_tcode2 + '</h3><div>' +
'<a href="#" id="ac2opus" class="tgl btn" tt="' + L.mt_c2opus + '</a>' +
'<a href="#" id="ac2owa" class="tgl btn" tt="' + L.mt_c2owa + '</a>' +
'<a href="#" id="ac2caf" class="tgl btn" tt="' + L.mt_c2caf + '</a>' +
'<a href="#" id="ac2mp3" class="tgl btn" tt="' + L.mt_c2mp3 + '</a>' +
'</div></div>'
) : '') +

@@ -2376,7 +2409,7 @@ var mpl = (function () {
c = r.ac_flac;
else if (/\.(aac|m4a)$/i.exec(cs))
c = r.ac_aac;
else if (/\.(ogg|opus)$/i.exec(cs) && !can_ogg)
else if (/\.(ogg|opus)$/i.exec(cs) && (!can_ogg || mpl.ac2 == 'mp3'))
c = true;
else if (re_au_native.exec(cs))
c = false;
@@ -2384,7 +2417,56 @@ var mpl = (function () {
if (!c)
return url;

return addq(url, 'th=' + (can_ogg ? 'opus' : (IPHONE || MACOS) ? 'caf' : 'mp3'));
return addq(url, 'th=' + r.ac2);
};

r.set_ac2 = function () {
r.init_ac2(this.getAttribute('id').split('ac2')[1]);
};

r.init_ac2 = function (v) {
if (!window.have_acode) {
r.ac2 = 'opus';
return;
}

var dv = can_ogg ? 'opus' : can_caf ? 'caf' : 'mp3',
fmts = ['opus', 'owa', 'caf', 'mp3'],
btns = [];

if (v === dv)
toast.ok(5, L.mt_c2ok);
else if (v)
toast.inf(10, L.mt_c2nd);

if ((v == 'opus' && !can_ogg) ||
(v == 'caf' && !can_caf) ||
(v == 'owa' && !can_owa))
toast.warn(15, L.mt_c2ng);

if (v == 'owa' && IPHONE)
toast.err(30, L.mt_xowa);

for (var a = 0; a < fmts.length; a++) {
var btn = ebi('ac2' + fmts[a]);
if (!btn)
return console.log('!btn', fmts[a]);
btn.onclick = r.set_ac2;
btns.push(btn);
}
if (!IPHONE)
btns[1].style.display = btns[2].style.display = 'none';

if (v)
swrite('acode2', v);
else
v = dv;

v = sread('acode2', fmts) || v;
for (var a = 0; a < fmts.length; a++)
clmod(btns[a], 'on', fmts[a] == v)

r.ac2 = v;
};

r.pp = function () {
@@ -2483,6 +2565,7 @@ var mpl = (function () {
r.unbuffer = function (url) {
if (mp.au2 && (!url || mp.au2.rsrc == url)) {
mp.au2.src = mp.au2.rsrc = '';
mp.au2.ld = 0; //owa
mp.au2.load();
}
if (!url)
@@ -2493,11 +2576,22 @@ var mpl = (function () {
})();

var can_ogg = true;
var za,
can_ogg = true,
can_owa = false,
can_caf = IPHONE && !/ OS ([1-9]|1[01])_/.test(UA);
try {
can_ogg = new Audio().canPlayType('audio/ogg; codecs=opus') === 'probably';
za = new Audio();
can_ogg = za.canPlayType('audio/ogg; codecs=opus') === 'probably';
can_owa = za.canPlayType('audio/webm; codecs=opus') === 'probably';
}
catch (ex) { }
za = null;

if (can_owa && IPHONE && / OS ([1-9]|1[0-7])_/.test(UA))
can_owa = false;

mpl.init_ac2();

var re_au_native = (can_ogg || have_acode) ? /\.(aac|flac|m4a|mp3|ogg|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
@@ -2670,7 +2764,8 @@ function MPlayer() {
});

r.nopause();
r.au2.onloadeddata = r.au2.onloadedmetadata = r.nopause;
r.au2.ld = 0; //owa
r.au2.onloadeddata = r.au2.onloadedmetadata = r.onpreload;
r.au2.preload = "auto";
r.au2.src = r.au2.rsrc = url;

@@ -2685,6 +2780,11 @@ function MPlayer() {
r.cd_pause = Date.now();
};

r.onpreload = function () {
r.nopause();
this.ld++;
};

r.init_fau = function () {
if (r.fau || !mpl.fau)
return;
@@ -2985,7 +3085,7 @@ var pbar = (function () {
return;

pctx.fillStyle = light ? 'rgba(0,0,0,0.5)' : 'rgba(255,255,255,0.5)';
var m = /[?&]th=(opus|caf|mp3)/.exec('' + mp.au.rsrc),
var m = /[?&]th=(opus|owa|caf|mp3)/.exec('' + mp.au.rsrc),
txt = mp.au.ded ? L.mm_playerr.replace(':', ' ;_;') :
m ? L.mm_bconv.format(m[1]) : L.mm_bload;

@@ -3941,16 +4041,20 @@ function play(tid, is_ev, seek) {
mp.au = mp.au2;
mp.au2 = t;
t.onerror = t.onprogress = t.onended = null;
t.ld = 0; //owa
mp.au.onerror = evau_error;
mp.au.onprogress = pbar.drawpos;
mp.au.onplaying = mpui.progress_updater;
mp.au.onloadeddata = mp.au.onloadedmetadata = mp.nopause;
mp.au.onended = next_song;
t = mp.au.currentTime;
if (isNum(t) && t > 0.1)
mp.au.currentTime = 0;
}
else
else {
console.log('get ' + url.split('/').pop());
mp.au.src = mp.au.rsrc = url;
}

mp.au.osrc = mp.tracks[tid];
afilt.apply();
@@ -4036,6 +4140,14 @@ function evau_error(e) {
err = L.mm_eabrt;
break;
case eplaya.error.MEDIA_ERR_NETWORK:
if (IPHONE && eplaya.ld === 1 && mpl.ac2 == 'owa' && !eplaya.paused && !eplaya.currentTime) {
eplaya.ded = 0;
if (!mpl.owaw) {
mpl.owaw = 1;
console.log('ignored iOS bug; spurious error sent in parallel with preloaded songs starting to play just fine');
}
return;
}
err = L.mm_enet;
break;
case eplaya.error.MEDIA_ERR_DECODE:
@@ -7355,12 +7467,12 @@ var treectl = (function () {
xhr.rst = rst;
xhr.ts = Date.now();
xhr.open('GET', addq(dst, 'tree=' + top + (r.dots ? '&dots' : '') + k), true);
xhr.onload = xhr.onerror = recvtree;
xhr.onload = xhr.onerror = r.recvtree;
xhr.send();
enspin('#tree');
}

function recvtree() {
r.recvtree = function () {
if (!xhrchk(this, L.tl_xe1, L.tl_xe2))
return;

@@ -7370,10 +7482,10 @@ var treectl = (function () {
catch (ex) {
return toast.err(30, "bad <code>?tree</code> reply;\nexpected json, got this:\n\n" + esc(this.responseText + ''));
}
rendertree(res, this.ts, this.top, this.dst, this.rst);
}
r.rendertree(res, this.ts, this.top, this.dst, this.rst);
};

function rendertree(res, ts, top0, dst, rst) {
r.rendertree = function (res, ts, top0, dst, rst) {
var cur = ebi('treeul').getAttribute('ts');
if (cur && parseInt(cur) > ts + 20 && QS('#treeul>li>a+a')) {
console.log("reject tree; " + cur + " / " + (ts - cur));
@@ -7427,7 +7539,7 @@ var treectl = (function () {
console.log("dir_cb failed", ex);
}
}
}
};

function reload_tree() {
var cdir = r.nextdir || get_vpath(),
@@ -7649,7 +7761,7 @@ var treectl = (function () {
dirs.push(dn);
}

rendertree({ "a": dirs }, this.ts, ".", get_evpath() + (dk ? '?k=' + dk : ''));
r.rendertree({ "a": dirs }, this.ts, ".", get_evpath() + (dk ? '?k=' + dk : ''));
}

r.gentab(this.top, res);
@@ -7667,9 +7779,9 @@ var treectl = (function () {
if (lg1 === Ls.eng.f_empty)
lg1 = L.f_empty;

sandbox(ebi('pro'), sb_lg, '', lg0);
sandbox(ebi('pro'), sb_lg, sba_lg,'', lg0);
if (dirchg)
sandbox(ebi('epi'), sb_lg, '', lg1);
sandbox(ebi('epi'), sb_lg, sba_lg, '', lg1);

clmod(ebi('pro'), 'mdo');
clmod(ebi('epi'), 'mdo');
@@ -7680,7 +7792,7 @@ var treectl = (function () {
if (md1)
show_readme(md1, 1);
else if (!dirchg)
sandbox(ebi('epi'), sb_lg, '', lg1);
sandbox(ebi('epi'), sb_lg, sba_lg, '', lg1);

if (this.hpush && !this.back) {
var ofs = ebi('wrap').offsetTop;
@@ -8635,8 +8747,8 @@ var arcfmt = (function () {
["pax", "tar=pax", L.fz_pax],
["tgz", "tar=gz", L.fz_targz],
["txz", "tar=xz", L.fz_tarxz],
["zip", "zip=utf8", L.fz_zip8],
["zip_dos", "zip", L.fz_zipd],
["zip", "zip", L.fz_zip8],
["zip_dos", "zip=dos", L.fz_zipd],
["zip_crc", "zip=crc", L.fz_zipc]
];

@@ -9028,14 +9140,18 @@ var msel = (function () {
function cb() {
xhrchk(this, L.fsm_xe1, L.fsm_xe2);

if (this.status < 200 || this.status > 201) {
if (this.status < 200 || this.status > 202) {
sf.textContent = 'error: ' + hunpre(this.responseText);
return;
}

tb.value = '';
clmod(sf, 'vis');
sf.textContent = 'sent: "' + this.msg + '"';
var txt = 'sent: <code>' + esc(this.msg) + '</code>';
if (this.status == 202)
txt += '<br /> got: <code>' + esc(this.responseText) + '</code>';

sf.innerHTML = txt;
setTimeout(function () {
treectl.goto();
}, 100);
@@ -9156,7 +9272,7 @@ function show_md(md, name, div, url, depth) {
if (!have_emp)
md_html = DOMPurify.sanitize(md_html);

if (sandbox(div, sb_md, 'mdo', md_html))
if (sandbox(div, sb_md, sba_md, 'mdo', md_html))
return;

ext = md_plug.post;
@@ -9207,7 +9323,7 @@ function show_readme(md, n) {
var tgt = ebi(n ? 'epi' : 'pro');

if (!treectl.ireadme)
return sandbox(tgt, '', '', 'a');
return sandbox(tgt, '', '', '', 'a');

show_md(md, n ? 'README.md' : 'PREADME.md', tgt);
}
@@ -9216,7 +9332,7 @@ for (var a = 0; a < readmes.length; a++)
show_readme(readmes[a], a);

function sandbox(tgt, rules, cls, html) {
function sandbox(tgt, rules, allow, cls, html) {
if (!treectl.ireadme) {
tgt.innerHTML = html ? L.md_off : '';
return;
@@ -9272,6 +9388,7 @@ function sandbox(tgt, rules, cls, html) {
var fr = mknod('iframe');
fr.setAttribute('title', 'folder ' + tid + 'logue');
fr.setAttribute('sandbox', rules ? 'allow-' + rules.replace(/ /g, ' allow-') : '');
fr.setAttribute('allow', allow);
fr.setAttribute('srcdoc', html);
tgt.appendChild(fr);
treectl.sb_msg = true;
@@ -9321,8 +9438,8 @@ if (sb_lg && logues.length) {
if (logues[1] === Ls.eng.f_empty)
logues[1] = L.f_empty;

sandbox(ebi('pro'), sb_lg, '', logues[0]);
sandbox(ebi('epi'), sb_lg, '', logues[1]);
sandbox(ebi('pro'), sb_lg, sba_lg, '', logues[0]);
sandbox(ebi('epi'), sb_lg, sba_lg, '', logues[1]);
}

@@ -23,8 +23,7 @@ var dbg = function () { };

// dodge browser issues
(function () {
var ua = navigator.userAgent;
if (ua.indexOf(') Gecko/') !== -1 && /Linux| Mac /.exec(ua)) {
if (UA.indexOf(') Gecko/') !== -1 && /Linux| Mac /.exec(UA)) {
// necessary on ff-68.7 at least
var s = mknod('style');
s.innerHTML = '@page { margin: .5in .6in .8in .6in; }';

@@ -450,7 +450,7 @@ function savechk_cb() {

// firefox bug: initial selection offset isn't cleared properly through js
var ff_clearsel = (function () {
if (navigator.userAgent.indexOf(') Gecko/') === -1)
if (UA.indexOf(') Gecko/') === -1)
return function () { }

return function () {

@@ -239,7 +239,7 @@
<div class="os win">
<h1>ShareX</h1>

<p>to upload screenshots using ShareX <a href="https://github.com/ShareX/ShareX/releases/tag/v12.4.1">v12</a> or <a href="https://getsharex.com/">v15+</a>, save this as <code>copyparty.sxcu</code> and run it:</p>
<p>to upload screenshots using ShareX <a href="https://github.com/ShareX/ShareX/releases/tag/v12.1.1">v12</a> or <a href="https://getsharex.com/">v15+</a>, save this as <code>copyparty.sxcu</code> and run it:</p>

<pre class="dl" name="copyparty.sxcu">
{ "Name": "copyparty",

@@ -1,11 +1,3 @@
function QSA(x) {
return document.querySelectorAll(x);
}
var LINUX = /Linux/.test(navigator.userAgent),
MACOS = /[^a-z]mac ?os/i.test(navigator.userAgent),
WINDOWS = /Windows/.test(navigator.userAgent);

var oa = QSA('pre');
for (var a = 0; a < oa.length; a++) {
var html = oa[a].innerHTML,

@@ -969,7 +969,7 @@ function up2k_init(subtle) {
ud = function () { ebi('dir' + fdom_ctr).click(); };

// too buggy on chrome <= 72
var m = / Chrome\/([0-9]+)\./.exec(navigator.userAgent);
var m = / Chrome\/([0-9]+)\./.exec(UA);
if (m && parseInt(m[1]) < 73)
return uf();

@@ -31,12 +31,13 @@ var wah = '',
MOBILE = TOUCH,
CHROME = !!window.chrome, // safari=false
VCHROME = CHROME ? 1 : 0,
IE = /Trident\//.test(navigator.userAgent),
FIREFOX = ('netscape' in window) && / rv:/.test(navigator.userAgent),
IPHONE = TOUCH && /iPhone|iPad|iPod/i.test(navigator.userAgent),
LINUX = /Linux/.test(navigator.userAgent),
MACOS = /[^a-z]mac ?os/i.test(navigator.userAgent),
WINDOWS = /Windows/.test(navigator.userAgent);
UA = '' + navigator.userAgent,
IE = /Trident\//.test(UA),
FIREFOX = ('netscape' in window) && / rv:/.test(UA),
IPHONE = TOUCH && /iPhone|iPad|iPod/i.test(UA),
LINUX = /Linux/.test(UA),
MACOS = /Macintosh/.test(UA),
WINDOWS = /Windows/.test(UA);

if (!window.WebAssembly || !WebAssembly.Memory)
window.WebAssembly = false;
@@ -196,7 +197,7 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
'<p style="font-size:1.3em;margin:0;line-height:2em">try to <a href="#" onclick="localStorage.clear();location.reload();">reset copyparty settings</a> if you are stuck here, or <a href="#" onclick="ignex();">ignore this</a> / <a href="#" onclick="ignex(true);">ignore all</a> / <a href="?b=u">basic</a></p>',
'<p style="color:#fff">please send me a screenshot arigathanks gozaimuch: <a href="<ghi>" target="_blank">new github issue</a></p>',
'<p class="b">' + esc(url + ' @' + lineNo + ':' + columnNo), '<br />' + esc(msg).replace(/\n/g, '<br />') + '</p>',
'<p><b>UA:</b> ' + esc(navigator.userAgent + '')
'<p><b>UA:</b> ' + esc(UA)
];

try {
@@ -1,3 +1,93 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0125-1809 `v1.16.10` iOS9 is fine too

## 🧪 new features

* support audio playback on *really old* apple devices c9eba39e
  * will now transcode to mp3 when necessary, since iOS didn't support opus-in-caf before iOS 11
* support audio playback on *future* apple devices 28c9de3f 95390b65
  * iOS 17.5 introduced support for opus-in-weba (like webp, just audio instead) and, unlike caf, this intentionally supports vbr-opus (awesome)
  * ...but the current code in iOS is too buggy, so this new format is default-disabled and we'll stick to caf for now fff38f48
* ZeroMQ event-hooks can reject uploads 3a5c1d9f
  * see [the example zmq listener](https://github.com/9001/copyparty/blob/1dace720/bin/zmq-recv.py#L26-L28)
* chat with ZeroMQ event-hooks from javascript cdd3b67a
  * replies from ZMQ REP servers are included in the msg-to-log responses
  * which makes [this joke](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/usb-eject.py) possible f38c7543

## 🩹 bugfixes

* nope

## 🔧 other changes

* option to restrict the recent-uploads listing to admins-only b8b5214f



▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0122-2326 `v1.16.9` ZeroMQ says hello

## 🧪 new features

* event-hooks can send zeromq / zmq / 0mq messages; see [readme](https://github.com/9001/copyparty#zeromq) or `--help-hooks` for examples d9db1534 (a small client sketch follows this list)
* new volflags to specify the [allow-tag](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Permissions-Policy#iframes) of the markdown/logue sandbox, to allow fullscreen and such (see `--help-flags`) 6a0aaaf0
* new volflag `nosparse` for possibly-better performance in very rare and specific scenarios 917380dd
  * only enable this if you're uploading to s3 or something like that, and do plenty of benchmarking to make sure that it actually improved performance instead of making it worse
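A rough sketch of the receiving side, assuming a hook was registered in `pub` mode with a uri such as `tcp://*:5556` (the port and the exact hook spec are hypothetical; only the `zmq:pub:` / `zmq:push:` / `zmq:req:` prefix syntax comes from the diff above):

```python
# minimal SUB client sketch; tcp://localhost:5556 is a placeholder and must
# match whatever uri the zmq:pub:... hook was configured with
import zmq

ctx = zmq.Context()
sck = ctx.socket(zmq.SUB)
sck.connect("tcp://localhost:5556")
sck.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to every message

while True:
    print("copyparty says:", sck.recv_string())
```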
## 🩹 bugfixes

* restrict max-length of filekeys to 72 characters e0cac6fd
* the hash-calculator mode of the commandline uploader produced incorrect whole-file hashes 4c04798a
  * each chunk (`--chs`) was okay, but the final sum was not

## 🔧 other changes

* selftest the xml-parser on startup with malicious xml b2e8bf6e
  * just in case a future python-version suddenly makes it unsafe somehow
* disable some features if a dangerously misconfigured reverseproxy is detected 3f84b0a0
* the download-as-zip feature now defaults to utf8 filenames 1231ce19



▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0111-1611 `v1.16.8` android boost

## 🧪 new features

* 10x faster file hashing in android-chrome ec507889
  * on a recent pixel, speed went from 13 to 139 MiB/s
  * android's sandboxing makes small reads expensive, so do bigger reads instead
  * so the browser-tab will use more RAM on android now, maybe around 200 MiB
  * this only affects chrome-based browsers on android, not firefox
* PUT/multipart uploads: request-header `Accept: json` makes it return json instead of html, just like `?j` ce0e5be4 (see the sketch after this list)
* add config examples for [ishare](https://isharemac.app/), a macOS screenshot utility inspired by ShareX 0c0d6b2b
  * also includes a bug-workaround for [ishare#107](https://github.com/castdrian/ishare/issues/107) - copyparty will now include a toplevel json property `fileurl` in the response if exactly one file was uploaded
  * the [connect-page](https://a.ocv.me/?hc) generates an appropriate `copyparty.iscu` for ishare; [it looks like this](https://github.com/user-attachments/assets/820730ad-2319-4912-8eb2-733755a4cf54)
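As a quick illustration of the `Accept: json` bullet above, using only the python standard library; the hostname, target folder and the `?pw=` password are placeholders:

```python
# upload one file with a plain HTTP PUT and ask the server for a json reply;
# https://example.com/inc/ and the pw= value are placeholders
import json
import urllib.request

with open("photo.jpg", "rb") as f:
    body = f.read()

req = urllib.request.Request(
    "https://example.com/inc/photo.jpg?pw=hunter2",
    data=body,
    method="PUT",
    headers={"Accept": "json"},
)
with urllib.request.urlopen(req) as r:
    print(json.load(r))  # parsed json response instead of the html page
```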
## 🩹 bugfixes

* fix a potential upload deadlock when...
  * ...the database (`-e2d`) is **not** enabled for any volume, and...
  * ...either the shares feature, or user-changeable passwords, is enabled 9e542cf8
* when loading the partial-uploads registry on startup, a cosmetic desync could occur 467acb47

## 🔧 other changes

* remove some deprecated properties in partial-upload metadata aa2a8fa2
  * v1.15.7 is now the oldest version which still has any chance of reading a modern up2k.snap
* #129 added howto: [using webdav when copyparty is behind IdP](https://github.com/9001/copyparty/blob/hovudstraum/docs/idp.md#connecting-webdav-clients) -- thanks @wuast94 !
* added howto: [install copyparty on a synology nas](https://github.com/9001/copyparty/blob/hovudstraum/docs/synology-dsm.md) 21f93042
* more examples in the connect-page: 278258ee fb139697
  * config-file for sharex on windows
  * config-file for ishare on macos
  * script for flameshot on linux
* #75 add recommendation to use the [kamelåså project](https://github.com/steinuil/kameloso) instead of copyparty's [very-bad-idea.py](https://github.com/9001/copyparty/tree/hovudstraum/bin/mtag#dangerous-plugins) 9f84dc42
* more reverse-proxy examples (haproxy, lighttpd, traefik, caddy) and improved nginx performance ac0a2da3
  * readme has a [performance comparison](https://github.com/9001/copyparty?tab=readme-ov-file#reverse-proxy-performance) -- `haproxy > caddy > traefik > nginx > apache > lighttpd`
* copyparty.exe: updated pillow 244e952f



▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-1223-0005 `v1.16.7` an idp fix for xmas


@@ -172,8 +172,8 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| GET | `?tar=xz:9` | ...as an xz-level-9 gnu-tar file |
| GET | `?tar=pax` | ...as a pax-tar file |
| GET | `?tar=pax,xz` | ...as an xz-level-1 pax-tar file |
| GET | `?zip=utf-8` | ...as a zip file |
| GET | `?zip` | ...as a WinXP-compatible zip file |
| GET | `?zip` | ...as a zip file |
| GET | `?zip=dos` | ...as a WinXP-compatible zip file |
| GET | `?zip=crc` | ...as an MSDOS-compatible zip file |
| GET | `?tar&w` | pregenerate webp thumbnails |
| GET | `?tar&j` | pregenerate jpg thumbnails |
@@ -342,6 +342,7 @@ python3 -m venv .venv
. .venv/bin/activate
pip install jinja2 strip_hints  # MANDATORY
pip install argon2-cffi  # password hashing
pip install pyzmq  # send 0mq from hooks
pip install mutagen  # audio metadata
pip install pyftpdlib  # ftp server
pip install partftpy  # tftp server

@@ -279,7 +279,7 @@ symbol legend,
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | | █ |
| unmap subfolders | █ | | █ | | | | █ | | | █ | ╱ | • | |
| index.html blocks list | ╱ | | | | | | █ | | | • | | | |
| write-only folders | █ | | █ | | | | | | | | █ | █ | |
| write-only folders | █ | | █ | | █ | | | | | | █ | █ | |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | | █ | |
@@ -507,7 +507,6 @@ symbol legend,
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ uploading small files is slow; `4.7` files per sec (copyparty does `670`/sec, 140x faster)
* ⚠️ no write-only / upload-only folders
* ⚠️ big folders cannot be zip-downloaded
* ⚠️ http/webdav only; no ftp, zeroconf
* ⚠️ less awesome music player
@@ -593,6 +592,7 @@ symbol legend,
* ✅ user signup
* ✅ command runner / remote shell
* ✅ more efficient; can handle around twice as much simultaneous traffic
* note: keep an eye on [gtsteffaniak's fork](https://github.com/gtsteffaniak/filebrowser)

## [filegator](https://github.com/filegator/filegator)
* php; cross-platform (windows, linux, mac)

@@ -52,6 +52,7 @@ ftpd = ["pyftpdlib"]
ftps = ["pyftpdlib", "pyopenssl"]
tftpd = ["partftpy>=0.4.0"]
pwhash = ["argon2-cffi"]
zeromq = ["pyzmq"]

[project.scripts]
copyparty = "copyparty.__main__:main"

@@ -9,7 +9,7 @@ ENV XDG_CONFIG_HOME=/cfg

RUN apk --no-cache add !pyc \
tzdata wget \
py3-jinja2 py3-argon2-cffi py3-pillow \
py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
ffmpeg

COPY i/dist/copyparty-sfx.py innvikler.sh ./

@@ -12,7 +12,8 @@ COPY i/bin/mtag/audio-bpm.py /mtag/
COPY i/bin/mtag/audio-key.py /mtag/
RUN apk add -U !pyc \
tzdata wget \
py3-jinja2 py3-argon2-cffi py3-pillow py3-pip py3-cffi \
py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
py3-pip py3-cffi \
ffmpeg \
vips-jxl vips-heif vips-poppler vips-magick \
py3-numpy fftw libsndfile \

@@ -9,7 +9,8 @@ ENV XDG_CONFIG_HOME=/cfg

RUN apk add -U !pyc \
tzdata wget \
py3-jinja2 py3-argon2-cffi py3-pillow py3-pip py3-cffi \
py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
py3-pip py3-cffi \
ffmpeg \
vips-jxl vips-heif vips-poppler vips-magick \
&& apk add -t .bd \

@@ -32,6 +32,7 @@ def readclip():
def cnv(src):
hostname = str(socket.gethostname()).split(".")[0]

yield '<!DOCTYPE html>'
yield '<html style="background:#222;color:#fff"><body>'
skip_sfx = False
in_sfx = 0

1 setup.py
@@ -144,6 +144,7 @@ args = {
"ftps": ["pyftpdlib", "pyopenssl"],
"tftpd": ["partftpy>=0.4.0"],
"pwhash": ["argon2-cffi"],
"zeromq": ["pyzmq"],
},
"entry_points": {"console_scripts": ["copyparty = copyparty.__main__:main"]},
"scripts": ["bin/partyfuse.py", "bin/u2c.py"],

@@ -20,7 +20,8 @@ def _parse(txt):

class TestDXML(unittest.TestCase):
def test1(self):
def test_qbe(self):
# allowed by default; verify that we stopped it
txt = r"""<!DOCTYPE qbe [
<!ENTITY a "nice_bakuretsu">
]>
@@ -28,7 +29,8 @@ class TestDXML(unittest.TestCase):
_parse(txt)
ET.fromstring(txt)

def test2(self):
def test_ent_file(self):
# NOT allowed by default; should still be blocked
txt = r"""<!DOCTYPE ext [
<!ENTITY ee SYSTEM "file:///bin/bash">
]>
@@ -40,6 +42,25 @@ class TestDXML(unittest.TestCase):
except ET.ParseError:
pass

def test_ent_ext(self):
# NOT allowed by default; should still be blocked
txt = r"""<!DOCTYPE ext [
<!ENTITY ee SYSTEM "http://example.com/a.xml">
]>
<root>&ee;</root>"""
_parse(txt)

def test_dtd(self):
# allowed by default; verify that we stopped it
txt = r"""<!DOCTYPE d SYSTEM "a.dtd">
<root>a</root>"""
_parse(txt)
ET.fromstring(txt)

##
## end of negative/security tests; the rest is functional
##

def test3(self):
txt = r"""<?xml version="1.0" ?>
<propfind xmlns="DAV:">

@@ -141,13 +141,13 @@ class Cfg(Namespace):
ex = "hash_mt hsortn safe_dedup srch_time u2abort u2j u2sz"
ka.update(**{k: 1 for k in ex.split()})

ex = "au_vol dl_list mtab_age reg_cap s_thead s_tbody th_convt"
ex = "au_vol dl_list mtab_age reg_cap s_thead s_tbody th_convt ups_who"
ka.update(**{k: 9 for k in ex.split()})

ex = "db_act k304 loris no304 re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo"
ka.update(**{k: 0 for k in ex.split()})

ex = "ah_alg bname chpw_db doctitle df exit favico idp_h_usr ipa html_head lg_sbf log_fk md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i shr tcolor textfiles unlist vname xff_src R RS SR"
ex = "ah_alg bname chpw_db doctitle df exit favico idp_h_usr ipa html_head lg_sba lg_sbf log_fk md_sba md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i shr tcolor textfiles unlist vname xff_src R RS SR"
ka.update(**{k: "" for k in ex.split()})

ex = "ban_403 ban_404 ban_422 ban_pw ban_url"