Compare commits

12 Commits

| Author | SHA1 | Date |
|---|---|---|
|  | dabdaaee33 |  |
|  | 65e4d67c3e |  |
|  | 4b720f4150 |  |
|  | 2e85a25614 |  |
|  | 713fffcb8e |  |
|  | 8020b11ea0 |  |
|  | 2523d76756 |  |
|  | 7ede509973 |  |
|  | 7c1d97af3b |  |
|  | 95566e8388 |  |
|  | 76afb62b7b |  |
|  | 7dec922c70 |  |
README.md (25 changes)

@@ -53,6 +53,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [webdav server](#webdav-server) - with read-write support
   * [connecting to webdav from windows](#connecting-to-webdav-from-windows) - using the GUI
 * [smb server](#smb-server) - unsafe, slow, not recommended for wan
+* [browser ux](#browser-ux) - tweaking the ui
 * [file indexing](#file-indexing) - enables dedup and music search ++
   * [exclude-patterns](#exclude-patterns) - to save some time
   * [filesystem guards](#filesystem-guards) - avoid traversing into other filesystems

@@ -317,6 +318,8 @@ same order here too
 
 upgrade notes
 
+* `1.9.16` (2023-11-04):
+  * `--stats`/prometheus: `cpp_bans` renamed to `cpp_active_bans`, and that + `cpp_uptime` are gauges
 * `1.6.0` (2023-01-29):
   * http-api: delete/move is now `POST` instead of `GET`
   * everything other than `GET` and `HEAD` must pass [cors validation](#cors)

@@ -1304,8 +1307,23 @@ scrape_configs:
 ```
 
 currently the following metrics are available,
-* `cpp_uptime_seconds`
-* `cpp_bans` number of banned IPs
+* `cpp_uptime_seconds` time since last copyparty restart
+* `cpp_boot_unixtime_seconds` same but as an absolute timestamp
+* `cpp_http_conns` number of open http(s) connections
+* `cpp_http_reqs` number of http(s) requests handled
+* `cpp_sus_reqs` number of 403/422/malicious requests
+* `cpp_active_bans` number of currently banned IPs
+* `cpp_total_bans` number of IPs banned since last restart
+
+these are available unless `--nos-vst` is specified:
+* `cpp_db_idle_seconds` time since last database activity (upload/rename/delete)
+* `cpp_db_act_seconds` same but as an absolute timestamp
+* `cpp_idle_vols` number of volumes which are idle / ready
+* `cpp_busy_vols` number of volumes which are busy / indexing
+* `cpp_offline_vols` number of volumes which are offline / unavailable
+* `cpp_hashing_files` number of files queued for hashing / indexing
+* `cpp_tagq_files` number of files queued for metadata scanning
+* `cpp_mtpq_files` number of files queued for plugin-based analysis
 
 and these are available per-volume only:
 * `cpp_disk_size_bytes` total HDD size

@@ -1324,9 +1342,12 @@ some of the metrics have additional requirements to function correctly,
 the following options are available to disable some of the metrics:
 * `--nos-hdd` disables `cpp_disk_*` which can prevent spinning up HDDs
 * `--nos-vol` disables `cpp_vol_*` which reduces server startup time
+* `--nos-vst` disables volume state, reducing the worst-case prometheus query time by 0.5 sec
 * `--nos-dup` disables `cpp_dupe_*` which reduces the server load caused by prometheus queries
 * `--nos-unf` disables `cpp_unf_*` for no particular purpose
+
+note: the following metrics are counted incorrectly if multiprocessing is enabled with `-j`: `cpp_http_conns`, `cpp_http_reqs`, `cpp_sus_reqs`, `cpp_active_bans`, `cpp_total_bans`
 
 
 # packages
 
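To sanity-check the metric names listed above, the endpoint can be fetched directly. A minimal sketch, assuming `--stats` is enabled, the server listens on `127.0.0.1:3923` (the same address the nginx snippet below proxies to), and `hunter2` is the password of an admin account; the password is passed with the `?pw=` query parameter here, but any other supported auth method works the same:

```python
# minimal sketch; --stats enabled, server on 127.0.0.1:3923, and
# "hunter2" (an assumption) is the password of an admin account
import urllib.request

url = "http://127.0.0.1:3923/.cpr/metrics?pw=hunter2"
text = urllib.request.urlopen(url).read().decode("utf-8")

# show only the metric families discussed above
for ln in text.splitlines():
    if ln.startswith(("cpp_uptime", "cpp_http_reqs", "cpp_active_bans")):
        print(ln)
```

Counters such as `cpp_http_reqs` show up with `_created`/`_total` suffixes while gauges such as `cpp_active_bans` are bare samples; the helper changes in the metrics module further down in this compare are what produce that difference.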
@@ -13,7 +13,7 @@
 # on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
 
 upstream cpp {
-    server 127.0.0.1:3923;
+    server 127.0.0.1:3923 fail_timeout=1s;
     keepalive 1;
 }
 server {
@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.9.14"
+pkgver="1.9.15"
 pkgrel=1
 pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, zeroconf, media indexer, thumbnails++"
 arch=("any")

@@ -20,7 +20,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
 )
 source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("96867ea1bcaf622e5dc29ee3224ffa8ea80218d3a146e7a10d04c12255bae00f")
+sha256sums=("ee569d664b22cb59ac0eb11850380648d9f8d42d1c26283d43dab350745c102e")
 
 build() {
     cd "${srcdir}/${pkgname}-${pkgver}"

@@ -1,5 +1,5 @@
 {
-  "url": "https://github.com/9001/copyparty/releases/download/v1.9.14/copyparty-sfx.py",
-  "version": "1.9.14",
-  "hash": "sha256-H4hRi6Nn4jUouhvqLacFyr0odMQ+99crBXL3iNz7mXs="
+  "url": "https://github.com/9001/copyparty/releases/download/v1.9.15/copyparty-sfx.py",
+  "version": "1.9.15",
+  "hash": "sha256-EUenh567NYj1klMpjVOWKqiBSqZbdEA0ZGidzzzpnsY="
 }
@@ -1014,6 +1014,7 @@ def add_stats(ap):
|
|||||||
ap2.add_argument("--stats", action="store_true", help="enable openmetrics at /.cpr/metrics for admin accounts")
|
ap2.add_argument("--stats", action="store_true", help="enable openmetrics at /.cpr/metrics for admin accounts")
|
||||||
ap2.add_argument("--nos-hdd", action="store_true", help="disable disk-space metrics (used/free space)")
|
ap2.add_argument("--nos-hdd", action="store_true", help="disable disk-space metrics (used/free space)")
|
||||||
ap2.add_argument("--nos-vol", action="store_true", help="disable volume size metrics (num files, total bytes, vmaxb/vmaxn)")
|
ap2.add_argument("--nos-vol", action="store_true", help="disable volume size metrics (num files, total bytes, vmaxb/vmaxn)")
|
||||||
|
ap2.add_argument("--nos-vst", action="store_true", help="disable volume state metrics (indexing, analyzing, activity)")
|
||||||
ap2.add_argument("--nos-dup", action="store_true", help="disable dupe-files metrics (good idea; very slow)")
|
ap2.add_argument("--nos-dup", action="store_true", help="disable dupe-files metrics (good idea; very slow)")
|
||||||
ap2.add_argument("--nos-unf", action="store_true", help="disable unfinished-uploads metrics")
|
ap2.add_argument("--nos-unf", action="store_true", help="disable unfinished-uploads metrics")
|
||||||
|
|
||||||
@@ -1094,7 +1095,7 @@ def add_logging(ap):
|
|||||||
ap2.add_argument("--ansi", action="store_true", help="force colors; overrides environment-variable NO_COLOR")
|
ap2.add_argument("--ansi", action="store_true", help="force colors; overrides environment-variable NO_COLOR")
|
||||||
ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
|
ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
|
||||||
ap2.add_argument("--log-tdec", metavar="N", type=int, default=3, help="timestamp resolution / number of timestamp decimals")
|
ap2.add_argument("--log-tdec", metavar="N", type=int, default=3, help="timestamp resolution / number of timestamp decimals")
|
||||||
ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log passphrase of failed login attempts: 0=terse, 1=plaintext, 2=hashed")
|
ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log failed login attempt passwords: 0=terse, 1=plaintext, 2=hashed")
|
||||||
ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
|
ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
|
||||||
ap2.add_argument("--log-htp", action="store_true", help="debug: print http-server threadpool scaling")
|
ap2.add_argument("--log-htp", action="store_true", help="debug: print http-server threadpool scaling")
|
||||||
ap2.add_argument("--ihead", metavar="HEADER", type=u, action='append', help="dump incoming header")
|
ap2.add_argument("--ihead", metavar="HEADER", type=u, action='append', help="dump incoming header")
|
||||||
@@ -1314,7 +1315,7 @@ def run_argparse(
|
|||||||
for k, h, t in sects:
|
for k, h, t in sects:
|
||||||
k2 = "help_" + k.replace("-", "_")
|
k2 = "help_" + k.replace("-", "_")
|
||||||
if vars(ret)[k2]:
|
if vars(ret)[k2]:
|
||||||
lprint("# {} help page".format(k))
|
lprint("# %s help page (%s)" % (k, h))
|
||||||
lprint(t + "\033[0m")
|
lprint(t + "\033[0m")
|
||||||
sys.exit(0)
|
sys.exit(0)
|
||||||
|
|
||||||
|
|||||||
@@ -1,8 +1,8 @@
|
|||||||
# coding: utf-8
|
# coding: utf-8
|
||||||
|
|
||||||
VERSION = (1, 9, 15)
|
VERSION = (1, 9, 16)
|
||||||
CODENAME = "prometheable"
|
CODENAME = "prometheable"
|
||||||
BUILD_DT = (2023, 10, 24)
|
BUILD_DT = (2023, 11, 4)
|
||||||
|
|
||||||
S_VERSION = ".".join(map(str, VERSION))
|
S_VERSION = ".".join(map(str, VERSION))
|
||||||
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
|
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
|
||||||
|
|||||||
@@ -476,12 +476,10 @@ class VFS(object):
|
|||||||
err: int = 403,
|
err: int = 403,
|
||||||
) -> tuple["VFS", str]:
|
) -> tuple["VFS", str]:
|
||||||
"""returns [vfsnode,fs_remainder] if user has the requested permissions"""
|
"""returns [vfsnode,fs_remainder] if user has the requested permissions"""
|
||||||
if ANYWIN:
|
if relchk(vpath):
|
||||||
mod = relchk(vpath)
|
if self.log:
|
||||||
if mod:
|
self.log("vfs", "invalid relpath [{}]".format(vpath))
|
||||||
if self.log:
|
raise Pebkac(422)
|
||||||
self.log("vfs", "invalid relpath [{}]".format(vpath))
|
|
||||||
raise Pebkac(404)
|
|
||||||
|
|
||||||
cvpath = undot(vpath)
|
cvpath = undot(vpath)
|
||||||
vn, rem = self._find(cvpath)
|
vn, rem = self._find(cvpath)
|
||||||
@@ -500,8 +498,8 @@ class VFS(object):
|
|||||||
t = "{} has no {} in [{}] => [{}] => [{}]"
|
t = "{} has no {} in [{}] => [{}] => [{}]"
|
||||||
self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)
|
self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)
|
||||||
|
|
||||||
t = "you don't have {}-access for this location"
|
t = 'you don\'t have %s-access in "/%s"'
|
||||||
raise Pebkac(err, t.format(msg))
|
raise Pebkac(err, t % (msg, cvpath))
|
||||||
|
|
||||||
return vn, rem
|
return vn, rem
|
||||||
|
|
||||||
@@ -1723,6 +1721,9 @@ class AuthSrv(object):
|
|||||||
def setup_pwhash(self, acct: dict[str, str]) -> None:
|
def setup_pwhash(self, acct: dict[str, str]) -> None:
|
||||||
self.ah = PWHash(self.args)
|
self.ah = PWHash(self.args)
|
||||||
if not self.ah.on:
|
if not self.ah.on:
|
||||||
|
if self.args.ah_cli or self.args.ah_gen:
|
||||||
|
t = "\n BAD CONFIG:\n cannot --ah-cli or --ah-gen without --ah-alg"
|
||||||
|
raise Exception(t)
|
||||||
return
|
return
|
||||||
|
|
||||||
if self.args.ah_cli:
|
if self.args.ah_cli:
|
||||||
|
|||||||
@@ -92,6 +92,12 @@ class FtpAuth(DummyAuthorizer):
|
|||||||
if bonk:
|
if bonk:
|
||||||
logging.warning("client banned: invalid passwords")
|
logging.warning("client banned: invalid passwords")
|
||||||
bans[ip] = bonk
|
bans[ip] = bonk
|
||||||
|
try:
|
||||||
|
# only possible if multiprocessing disabled
|
||||||
|
self.hub.broker.httpsrv.bans[ip] = bonk
|
||||||
|
self.hub.broker.httpsrv.nban += 1
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
raise AuthenticationFailed("Authentication failed.")
|
raise AuthenticationFailed("Authentication failed.")
|
||||||
|
|
||||||
@@ -148,7 +154,7 @@ class FtpFs(AbstractedFS):
|
|||||||
try:
|
try:
|
||||||
vpath = vpath.replace("\\", "/").strip("/")
|
vpath = vpath.replace("\\", "/").strip("/")
|
||||||
rd, fn = os.path.split(vpath)
|
rd, fn = os.path.split(vpath)
|
||||||
if ANYWIN and relchk(rd):
|
if relchk(rd):
|
||||||
logging.warning("malicious vpath: %s", vpath)
|
logging.warning("malicious vpath: %s", vpath)
|
||||||
t = "Unsupported characters in [{}]"
|
t = "Unsupported characters in [{}]"
|
||||||
raise FSE(t.format(vpath), 1)
|
raise FSE(t.format(vpath), 1)
|
||||||
|
|||||||
@@ -37,6 +37,7 @@ from .star import StreamTar
|
|||||||
from .sutil import StreamArc, gfilter
|
from .sutil import StreamArc, gfilter
|
||||||
from .szip import StreamZip
|
from .szip import StreamZip
|
||||||
from .util import (
|
from .util import (
|
||||||
|
Garda,
|
||||||
HTTPCODE,
|
HTTPCODE,
|
||||||
META_NOBOTS,
|
META_NOBOTS,
|
||||||
MultipartParser,
|
MultipartParser,
|
||||||
@@ -75,6 +76,7 @@ from .util import (
|
|||||||
runhook,
|
runhook,
|
||||||
s3enc,
|
s3enc,
|
||||||
sanitize_fn,
|
sanitize_fn,
|
||||||
|
sanitize_vpath,
|
||||||
sendfile_kern,
|
sendfile_kern,
|
||||||
sendfile_py,
|
sendfile_py,
|
||||||
undot,
|
undot,
|
||||||
@@ -146,6 +148,7 @@ class HttpCli(object):
|
|||||||
self.rem = " "
|
self.rem = " "
|
||||||
self.vpath = " "
|
self.vpath = " "
|
||||||
self.vpaths = " "
|
self.vpaths = " "
|
||||||
|
self.gctx = " " # additional context for garda
|
||||||
self.trailing_slash = True
|
self.trailing_slash = True
|
||||||
self.uname = " "
|
self.uname = " "
|
||||||
self.pw = " "
|
self.pw = " "
|
||||||
@@ -254,8 +257,8 @@ class HttpCli(object):
|
|||||||
k, zs = header_line.split(":", 1)
|
k, zs = header_line.split(":", 1)
|
||||||
self.headers[k.lower()] = zs.strip()
|
self.headers[k.lower()] = zs.strip()
|
||||||
except:
|
except:
|
||||||
msg = " ]\n#[ ".join(headerlines)
|
msg = "#[ " + " ]\n#[ ".join(headerlines) + " ]"
|
||||||
raise Pebkac(400, "bad headers:\n#[ " + msg + " ]")
|
raise Pebkac(400, "bad headers", log=msg)
|
||||||
|
|
||||||
except Pebkac as ex:
|
except Pebkac as ex:
|
||||||
self.mode = "GET"
|
self.mode = "GET"
|
||||||
@@ -268,8 +271,14 @@ class HttpCli(object):
|
|||||||
self.loud_reply(unicode(ex), status=ex.code, headers=h, volsan=True)
|
self.loud_reply(unicode(ex), status=ex.code, headers=h, volsan=True)
|
||||||
except:
|
except:
|
||||||
pass
|
pass
|
||||||
|
|
||||||
|
if ex.log:
|
||||||
|
self.log("additional error context:\n" + ex.log, 6)
|
||||||
|
|
||||||
return False
|
return False
|
||||||
|
|
||||||
|
self.conn.hsrv.nreq += 1
|
||||||
|
|
||||||
self.ua = self.headers.get("user-agent", "")
|
self.ua = self.headers.get("user-agent", "")
|
||||||
self.is_rclone = self.ua.startswith("rclone/")
|
self.is_rclone = self.ua.startswith("rclone/")
|
||||||
|
|
||||||
@@ -411,12 +420,9 @@ class HttpCli(object):
|
|||||||
self.vpath + "/" if self.trailing_slash and self.vpath else self.vpath
|
self.vpath + "/" if self.trailing_slash and self.vpath else self.vpath
|
||||||
)
|
)
|
||||||
|
|
||||||
ok = "\x00" not in self.vpath
|
if relchk(self.vpath) and (self.vpath != "*" or self.mode != "OPTIONS"):
|
||||||
if ANYWIN:
|
|
||||||
ok = ok and not relchk(self.vpath)
|
|
||||||
|
|
||||||
if not ok and (self.vpath != "*" or self.mode != "OPTIONS"):
|
|
||||||
self.log("invalid relpath [{}]".format(self.vpath))
|
self.log("invalid relpath [{}]".format(self.vpath))
|
||||||
|
self.cbonk(self.conn.hsrv.g422, self.vpath, "bad_vp", "invalid relpaths")
|
||||||
return self.tx_404() and self.keepalive
|
return self.tx_404() and self.keepalive
|
||||||
|
|
||||||
zso = self.headers.get("authorization")
|
zso = self.headers.get("authorization")
|
||||||
@@ -549,6 +555,9 @@ class HttpCli(object):
|
|||||||
zb = b"<pre>" + html_escape(msg).encode("utf-8", "replace")
|
zb = b"<pre>" + html_escape(msg).encode("utf-8", "replace")
|
||||||
h = {"WWW-Authenticate": 'Basic realm="a"'} if pex.code == 401 else {}
|
h = {"WWW-Authenticate": 'Basic realm="a"'} if pex.code == 401 else {}
|
||||||
self.reply(zb, status=pex.code, headers=h, volsan=True)
|
self.reply(zb, status=pex.code, headers=h, volsan=True)
|
||||||
|
if pex.log:
|
||||||
|
self.log("additional error context:\n" + pex.log, 6)
|
||||||
|
|
||||||
return self.keepalive
|
return self.keepalive
|
||||||
except Pebkac:
|
except Pebkac:
|
||||||
return False
|
return False
|
||||||
@@ -559,6 +568,36 @@ class HttpCli(object):
|
|||||||
else:
|
else:
|
||||||
return self.conn.iphash.s(self.ip)
|
return self.conn.iphash.s(self.ip)
|
||||||
|
|
||||||
|
def cbonk(self, g: Garda, v: str, reason: str, descr: str) -> bool:
|
||||||
|
self.conn.hsrv.nsus += 1
|
||||||
|
if not g.lim:
|
||||||
|
return False
|
||||||
|
|
||||||
|
bonk, ip = g.bonk(self.ip, v + self.gctx)
|
||||||
|
if not bonk:
|
||||||
|
return False
|
||||||
|
|
||||||
|
xban = self.vn.flags.get("xban")
|
||||||
|
if not xban or not runhook(
|
||||||
|
self.log,
|
||||||
|
xban,
|
||||||
|
self.vn.canonical(self.rem),
|
||||||
|
self.vpath,
|
||||||
|
self.host,
|
||||||
|
self.uname,
|
||||||
|
time.time(),
|
||||||
|
0,
|
||||||
|
self.ip,
|
||||||
|
time.time(),
|
||||||
|
reason,
|
||||||
|
):
|
||||||
|
self.log("client banned: %s" % (descr,), 1)
|
||||||
|
self.conn.hsrv.bans[ip] = bonk
|
||||||
|
self.conn.hsrv.nban += 1
|
||||||
|
return True
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
def is_banned(self) -> bool:
|
def is_banned(self) -> bool:
|
||||||
if not self.conn.bans:
|
if not self.conn.bans:
|
||||||
return False
|
return False
|
||||||
@@ -678,24 +717,7 @@ class HttpCli(object):
|
|||||||
or not self.args.nonsus_urls
|
or not self.args.nonsus_urls
|
||||||
or not self.args.nonsus_urls.search(self.vpath)
|
or not self.args.nonsus_urls.search(self.vpath)
|
||||||
):
|
):
|
||||||
bonk, ip = g.bonk(self.ip, self.vpath)
|
self.cbonk(g, self.vpath, str(status), "%ss" % (status,))
|
||||||
if bonk:
|
|
||||||
xban = self.vn.flags.get("xban")
|
|
||||||
if not xban or not runhook(
|
|
||||||
self.log,
|
|
||||||
xban,
|
|
||||||
self.vn.canonical(self.rem),
|
|
||||||
self.vpath,
|
|
||||||
self.host,
|
|
||||||
self.uname,
|
|
||||||
time.time(),
|
|
||||||
0,
|
|
||||||
self.ip,
|
|
||||||
time.time(),
|
|
||||||
str(status),
|
|
||||||
):
|
|
||||||
self.log("client banned: %ss" % (status,), 1)
|
|
||||||
self.conn.hsrv.bans[ip] = bonk
|
|
||||||
|
|
||||||
if volsan:
|
if volsan:
|
||||||
vols = list(self.asrv.vfs.all_vols.values())
|
vols = list(self.asrv.vfs.all_vols.values())
|
||||||
@@ -2121,8 +2143,10 @@ class HttpCli(object):
|
|||||||
return True
|
return True
|
||||||
|
|
||||||
def get_pwd_cookie(self, pwd: str) -> str:
|
def get_pwd_cookie(self, pwd: str) -> str:
|
||||||
if self.asrv.ah.hash(pwd) in self.asrv.iacct:
|
hpwd = self.asrv.ah.hash(pwd)
|
||||||
msg = "login ok"
|
uname = self.asrv.iacct.get(hpwd)
|
||||||
|
if uname:
|
||||||
|
msg = "hi " + uname
|
||||||
dur = int(60 * 60 * self.args.logout)
|
dur = int(60 * 60 * self.args.logout)
|
||||||
else:
|
else:
|
||||||
logpwd = pwd
|
logpwd = pwd
|
||||||
@@ -2133,27 +2157,7 @@ class HttpCli(object):
|
|||||||
logpwd = "%" + base64.b64encode(zb[:12]).decode("utf-8")
|
logpwd = "%" + base64.b64encode(zb[:12]).decode("utf-8")
|
||||||
|
|
||||||
self.log("invalid password: {}".format(logpwd), 3)
|
self.log("invalid password: {}".format(logpwd), 3)
|
||||||
|
self.cbonk(self.conn.hsrv.gpwd, pwd, "pw", "invalid passwords")
|
||||||
g = self.conn.hsrv.gpwd
|
|
||||||
if g.lim:
|
|
||||||
bonk, ip = g.bonk(self.ip, pwd)
|
|
||||||
if bonk:
|
|
||||||
xban = self.vn.flags.get("xban")
|
|
||||||
if not xban or not runhook(
|
|
||||||
self.log,
|
|
||||||
xban,
|
|
||||||
self.vn.canonical(self.rem),
|
|
||||||
self.vpath,
|
|
||||||
self.host,
|
|
||||||
self.uname,
|
|
||||||
time.time(),
|
|
||||||
0,
|
|
||||||
self.ip,
|
|
||||||
time.time(),
|
|
||||||
"pw",
|
|
||||||
):
|
|
||||||
self.log("client banned: invalid passwords", 1)
|
|
||||||
self.conn.hsrv.bans[ip] = bonk
|
|
||||||
|
|
||||||
msg = "naw dude"
|
msg = "naw dude"
|
||||||
pwd = "x" # nosec
|
pwd = "x" # nosec
|
||||||
@@ -2177,26 +2181,30 @@ class HttpCli(object):
|
|||||||
new_dir = self.parser.require("name", 512)
|
new_dir = self.parser.require("name", 512)
|
||||||
self.parser.drop()
|
self.parser.drop()
|
||||||
|
|
||||||
sanitized = sanitize_fn(new_dir, "", [])
|
return self._mkdir(vjoin(self.vpath, new_dir))
|
||||||
return self._mkdir(vjoin(self.vpath, sanitized))
|
|
||||||
|
|
||||||
def _mkdir(self, vpath: str, dav: bool = False) -> bool:
|
def _mkdir(self, vpath: str, dav: bool = False) -> bool:
|
||||||
nullwrite = self.args.nw
|
nullwrite = self.args.nw
|
||||||
|
self.gctx = vpath
|
||||||
|
vpath = undot(vpath)
|
||||||
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
|
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
|
||||||
self._assert_safe_rem(rem)
|
rem = sanitize_vpath(rem, "/", [])
|
||||||
fn = vfs.canonical(rem)
|
fn = vfs.canonical(rem)
|
||||||
|
if not fn.startswith(vfs.realpath):
|
||||||
|
self.log("invalid mkdir [%s] [%s]" % (self.gctx, vpath), 1)
|
||||||
|
raise Pebkac(422)
|
||||||
|
|
||||||
if not nullwrite:
|
if not nullwrite:
|
||||||
fdir = os.path.dirname(fn)
|
fdir = os.path.dirname(fn)
|
||||||
|
|
||||||
if not bos.path.isdir(fdir):
|
if dav and not bos.path.isdir(fdir):
|
||||||
raise Pebkac(409, "parent folder does not exist")
|
raise Pebkac(409, "parent folder does not exist")
|
||||||
|
|
||||||
if bos.path.isdir(fn):
|
if bos.path.isdir(fn):
|
||||||
raise Pebkac(405, "that folder exists already")
|
raise Pebkac(405, 'folder "/%s" already exists' % (vpath,))
|
||||||
|
|
||||||
try:
|
try:
|
||||||
bos.mkdir(fn)
|
bos.makedirs(fn)
|
||||||
except OSError as ex:
|
except OSError as ex:
|
||||||
if ex.errno == errno.EACCES:
|
if ex.errno == errno.EACCES:
|
||||||
raise Pebkac(500, "the server OS denied write-access")
|
raise Pebkac(500, "the server OS denied write-access")
|
||||||
@@ -2205,7 +2213,7 @@ class HttpCli(object):
|
|||||||
except:
|
except:
|
||||||
raise Pebkac(500, min_ex())
|
raise Pebkac(500, min_ex())
|
||||||
|
|
||||||
self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1])
|
self.out_headers["X-New-Dir"] = quotep(vpath)
|
||||||
|
|
||||||
if dav:
|
if dav:
|
||||||
self.reply(b"", 201)
|
self.reply(b"", 201)
|
||||||
|
|||||||
@@ -128,6 +128,9 @@ class HttpSrv(object):
|
|||||||
|
|
||||||
self.u2fh = FHC()
|
self.u2fh = FHC()
|
||||||
self.metrics = Metrics(self)
|
self.metrics = Metrics(self)
|
||||||
|
self.nreq = 0
|
||||||
|
self.nsus = 0
|
||||||
|
self.nban = 0
|
||||||
self.srvs: list[socket.socket] = []
|
self.srvs: list[socket.socket] = []
|
||||||
self.ncli = 0 # exact
|
self.ncli = 0 # exact
|
||||||
self.clients: set[HttpConn] = set() # laggy
|
self.clients: set[HttpConn] = set() # laggy
|
||||||
|
|||||||
@@ -34,14 +34,23 @@ class Metrics(object):
|
|||||||
|
|
||||||
ret: list[str] = []
|
ret: list[str] = []
|
||||||
|
|
||||||
def addc(k: str, unit: str, v: str, desc: str) -> None:
|
def addc(k: str, v: str, desc: str) -> None:
|
||||||
if unit:
|
zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
|
||||||
k += "_" + unit
|
ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
|
||||||
zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
|
|
||||||
ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))
|
def adduc(k: str, unit: str, v: str, desc: str) -> None:
|
||||||
else:
|
k += "_" + unit
|
||||||
zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
|
zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
|
||||||
ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
|
ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))
|
||||||
|
|
||||||
|
def addg(k: str, v: str, desc: str) -> None:
|
||||||
|
zs = "# TYPE %s gauge\n# HELP %s %s\n%s %s"
|
||||||
|
ret.append(zs % (k, k, desc, k, v))
|
||||||
|
|
||||||
|
def addug(k: str, unit: str, v: str, desc: str) -> None:
|
||||||
|
k += "_" + unit
|
||||||
|
zs = "# TYPE %s gauge\n# UNIT %s %s\n# HELP %s %s\n%s %s"
|
||||||
|
ret.append(zs % (k, k, unit, k, desc, k, v))
|
||||||
|
|
||||||
def addh(k: str, typ: str, desc: str) -> None:
|
def addh(k: str, typ: str, desc: str) -> None:
|
||||||
zs = "# TYPE %s %s\n# HELP %s %s"
|
zs = "# TYPE %s %s\n# HELP %s %s"
|
||||||
@@ -54,17 +63,75 @@ class Metrics(object):
|
|||||||
def addv(k: str, v: str) -> None:
|
def addv(k: str, v: str) -> None:
|
||||||
ret.append("%s %s" % (k, v))
|
ret.append("%s %s" % (k, v))
|
||||||
|
|
||||||
|
t = "time since last copyparty restart"
|
||||||
v = "{:.3f}".format(time.time() - self.hsrv.t0)
|
v = "{:.3f}".format(time.time() - self.hsrv.t0)
|
||||||
addc("cpp_uptime", "seconds", v, "time since last server restart")
|
addug("cpp_uptime", "seconds", v, t)
|
||||||
|
|
||||||
|
# timestamps are gauges because initial value is not zero
|
||||||
|
t = "unixtime of last copyparty restart"
|
||||||
|
v = "{:.3f}".format(self.hsrv.t0)
|
||||||
|
addug("cpp_boot_unixtime", "seconds", v, t)
|
||||||
|
|
||||||
|
t = "number of open http(s) client connections"
|
||||||
|
addg("cpp_http_conns", str(self.hsrv.ncli), t)
|
||||||
|
|
||||||
|
t = "number of http(s) requests since last restart"
|
||||||
|
addc("cpp_http_reqs", str(self.hsrv.nreq), t)
|
||||||
|
|
||||||
|
t = "number of 403/422/malicious reqs since restart"
|
||||||
|
addc("cpp_sus_reqs", str(self.hsrv.nsus), t)
|
||||||
|
|
||||||
v = str(len(conn.bans or []))
|
v = str(len(conn.bans or []))
|
||||||
addc("cpp_bans", "", v, "number of banned IPs")
|
addg("cpp_active_bans", v, "number of currently banned IPs")
|
||||||
|
|
||||||
|
t = "number of IPs banned since last restart"
|
||||||
|
addg("cpp_total_bans", str(self.hsrv.nban), t)
|
||||||
|
|
||||||
|
if not args.nos_vst:
|
||||||
|
x = self.hsrv.broker.ask("up2k.get_state")
|
||||||
|
vs = json.loads(x.get())
|
||||||
|
|
||||||
|
nvidle = 0
|
||||||
|
nvbusy = 0
|
||||||
|
nvoffline = 0
|
||||||
|
for v in vs["volstate"].values():
|
||||||
|
if v == "online, idle":
|
||||||
|
nvidle += 1
|
||||||
|
elif "OFFLINE" in v:
|
||||||
|
nvoffline += 1
|
||||||
|
else:
|
||||||
|
nvbusy += 1
|
||||||
|
|
||||||
|
addg("cpp_idle_vols", str(nvidle), "number of idle/ready volumes")
|
||||||
|
addg("cpp_busy_vols", str(nvbusy), "number of busy/indexing volumes")
|
||||||
|
addg("cpp_offline_vols", str(nvoffline), "number of offline volumes")
|
||||||
|
|
||||||
|
t = "time since last database activity (upload/rename/delete)"
|
||||||
|
addug("cpp_db_idle", "seconds", str(vs["dbwt"]), t)
|
||||||
|
|
||||||
|
t = "unixtime of last database activity (upload/rename/delete)"
|
||||||
|
addug("cpp_db_act", "seconds", str(vs["dbwu"]), t)
|
||||||
|
|
||||||
|
t = "number of files queued for hashing/indexing"
|
||||||
|
addg("cpp_hashing_files", str(vs["hashq"]), t)
|
||||||
|
|
||||||
|
t = "number of files queued for metadata scanning"
|
||||||
|
addg("cpp_tagq_files", str(vs["tagq"]), t)
|
||||||
|
|
||||||
|
try:
|
||||||
|
t = "number of files queued for plugin-based analysis"
|
||||||
|
addg("cpp_mtpq_files", str(int(vs["mtpq"])), t)
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
if not args.nos_hdd:
|
if not args.nos_hdd:
|
||||||
addbh("cpp_disk_size_bytes", "total HDD size of volume")
|
addbh("cpp_disk_size_bytes", "total HDD size of volume")
|
||||||
addbh("cpp_disk_free_bytes", "free HDD space in volume")
|
addbh("cpp_disk_free_bytes", "free HDD space in volume")
|
||||||
for vpath, vol in allvols:
|
for vpath, vol in allvols:
|
||||||
free, total = get_df(vol.realpath)
|
free, total = get_df(vol.realpath)
|
||||||
|
if free is None or total is None:
|
||||||
|
continue
|
||||||
|
|
||||||
addv('cpp_disk_size_bytes{vol="/%s"}' % (vpath), str(total))
|
addv('cpp_disk_size_bytes{vol="/%s"}' % (vpath), str(total))
|
||||||
addv('cpp_disk_free_bytes{vol="/%s"}' % (vpath), str(free))
|
addv('cpp_disk_free_bytes{vol="/%s"}' % (vpath), str(free))
|
||||||
|
|
||||||
@@ -161,5 +228,6 @@ class Metrics(object):
|
|||||||
ret.append("# EOF")
|
ret.append("# EOF")
|
||||||
|
|
||||||
mime = "application/openmetrics-text; version=1.0.0; charset=utf-8"
|
mime = "application/openmetrics-text; version=1.0.0; charset=utf-8"
|
||||||
|
mime = cli.uparam.get("mime") or mime
|
||||||
cli.reply("\n".join(ret).encode("utf-8"), mime=mime)
|
cli.reply("\n".join(ret).encode("utf-8"), mime=mime)
|
||||||
return True
|
return True
|
||||||
|
|||||||
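The `addc`/`addg`/`addug` split above reduces to two exposition formats, counter vs gauge. A standalone sketch of just those format strings (the metric names and values below are made up for illustration, not read from a live server):

```python
# standalone sketch of the counter/gauge exposition formats used by
# addc() and addg() above; names and values here are illustrative only
t0 = 1699000000  # server start time, unix seconds

def addc(k: str, v: str, desc: str) -> str:
    # counters get a TYPE/HELP header plus _created and _total samples
    zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
    return zs % (k, k, desc, k, t0, k, v)

def addg(k: str, v: str, desc: str) -> str:
    # gauges get a TYPE/HELP header plus a single bare sample
    zs = "# TYPE %s gauge\n# HELP %s %s\n%s %s"
    return zs % (k, k, desc, k, v)

print(addc("cpp_http_reqs", "1234", "number of http(s) requests since last restart"))
print(addg("cpp_active_bans", "2", "number of currently banned IPs"))
```

This is also why the readme rename matters: the old `cpp_bans` went through the counter path and was therefore exposed as `cpp_bans_total`, whereas `cpp_active_bans` is now a plain gauge.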
@@ -136,8 +136,12 @@ class PWHash(object):
|
|||||||
import getpass
|
import getpass
|
||||||
|
|
||||||
while True:
|
while True:
|
||||||
p1 = getpass.getpass("password> ")
|
try:
|
||||||
p2 = getpass.getpass("again or just hit ENTER> ")
|
p1 = getpass.getpass("password> ")
|
||||||
|
p2 = getpass.getpass("again or just hit ENTER> ")
|
||||||
|
except EOFError:
|
||||||
|
return
|
||||||
|
|
||||||
if p2 and p1 != p2:
|
if p2 and p1 != p2:
|
||||||
print("\033[31minputs don't match; try again\033[0m", file=sys.stderr)
|
print("\033[31minputs don't match; try again\033[0m", file=sys.stderr)
|
||||||
continue
|
continue
|
||||||
|
|||||||
@@ -65,6 +65,11 @@ from .util import (
|
|||||||
w8b64enc,
|
w8b64enc,
|
||||||
)
|
)
|
||||||
|
|
||||||
|
try:
|
||||||
|
from pathlib import Path
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
if HAVE_SQLITE3:
|
if HAVE_SQLITE3:
|
||||||
import sqlite3
|
import sqlite3
|
||||||
|
|
||||||
@@ -261,6 +266,7 @@ class Up2k(object):
|
|||||||
"hashq": self.n_hashq,
|
"hashq": self.n_hashq,
|
||||||
"tagq": self.n_tagq,
|
"tagq": self.n_tagq,
|
||||||
"mtpq": mtpq,
|
"mtpq": mtpq,
|
||||||
|
"dbwu": "{:.2f}".format(self.db_act),
|
||||||
"dbwt": "{:.2f}".format(
|
"dbwt": "{:.2f}".format(
|
||||||
min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
|
min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
|
||||||
),
|
),
|
||||||
@@ -2723,7 +2729,18 @@ class Up2k(object):
|
|||||||
raise Exception("symlink-fallback disabled in cfg")
|
raise Exception("symlink-fallback disabled in cfg")
|
||||||
|
|
||||||
if not linked:
|
if not linked:
|
||||||
os.symlink(fsenc(lsrc), fsenc(ldst))
|
if ANYWIN:
|
||||||
|
Path(ldst).symlink_to(lsrc)
|
||||||
|
if not bos.path.exists(dst):
|
||||||
|
try:
|
||||||
|
bos.unlink(dst)
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
t = "the created symlink [%s] did not resolve to [%s]"
|
||||||
|
raise Exception(t % (ldst, lsrc))
|
||||||
|
else:
|
||||||
|
os.symlink(fsenc(lsrc), fsenc(ldst))
|
||||||
|
|
||||||
linked = True
|
linked = True
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
self.log("cannot link; creating copy: " + repr(ex))
|
self.log("cannot link; creating copy: " + repr(ex))
|
||||||
|
|||||||
@@ -1563,8 +1563,8 @@ def read_header(sr: Unrecv, t_idle: int, t_tot: int) -> list[str]:
|
|||||||
|
|
||||||
raise Pebkac(
|
raise Pebkac(
|
||||||
400,
|
400,
|
||||||
"protocol error while reading headers:\n"
|
"protocol error while reading headers",
|
||||||
+ ret.decode("utf-8", "replace"),
|
log=ret.decode("utf-8", "replace"),
|
||||||
)
|
)
|
||||||
|
|
||||||
ofs = ret.find(b"\r\n\r\n")
|
ofs = ret.find(b"\r\n\r\n")
|
||||||
@@ -1773,7 +1773,16 @@ def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
|
|||||||
return fn.strip()
|
return fn.strip()
|
||||||
|
|
||||||
|
|
||||||
|
def sanitize_vpath(vp: str, ok: str, bad: list[str]) -> str:
|
||||||
|
parts = vp.replace(os.sep, "/").split("/")
|
||||||
|
ret = [sanitize_fn(x, ok, bad) for x in parts]
|
||||||
|
return "/".join(ret)
|
||||||
|
|
||||||
|
|
||||||
def relchk(rp: str) -> str:
|
def relchk(rp: str) -> str:
|
||||||
|
if "\x00" in rp:
|
||||||
|
return "[nul]"
|
||||||
|
|
||||||
if ANYWIN:
|
if ANYWIN:
|
||||||
if "\n" in rp or "\r" in rp:
|
if "\n" in rp or "\r" in rp:
|
||||||
return "x\nx"
|
return "x\nx"
|
||||||
@@ -2976,9 +2985,12 @@ def hidedir(dp) -> None:
|
|||||||
|
|
||||||
|
|
||||||
class Pebkac(Exception):
|
class Pebkac(Exception):
|
||||||
def __init__(self, code: int, msg: Optional[str] = None) -> None:
|
def __init__(
|
||||||
|
self, code: int, msg: Optional[str] = None, log: Optional[str] = None
|
||||||
|
) -> None:
|
||||||
super(Pebkac, self).__init__(msg or HTTPCODE[code])
|
super(Pebkac, self).__init__(msg or HTTPCODE[code])
|
||||||
self.code = code
|
self.code = code
|
||||||
|
self.log = log
|
||||||
|
|
||||||
def __repr__(self) -> str:
|
def __repr__(self) -> str:
|
||||||
return "Pebkac({}, {})".format(self.code, repr(self.args))
|
return "Pebkac({}, {})".format(self.code, repr(self.args))
|
||||||
|
|||||||
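Together with the httpcli changes earlier in this compare (`raise Pebkac(400, "bad headers", log=msg)` and the `additional error context` log calls), the new `log` field keeps the client-facing reply short while the verbose payload only reaches the server log. A simplified sketch of that flow; the logging is stubbed with `print` instead of the real `HttpCli.log`:

```python
# simplified sketch of how the new Pebkac.log field flows; the real code
# logs through HttpCli.log instead of print()
from typing import Optional

HTTPCODE = {400: "bad request"}  # tiny stand-in for util.HTTPCODE

class Pebkac(Exception):
    def __init__(
        self, code: int, msg: Optional[str] = None, log: Optional[str] = None
    ) -> None:
        super(Pebkac, self).__init__(msg or HTTPCODE[code])
        self.code = code
        self.log = log  # verbose context; kept out of the reply to the client

try:
    bad = "#[ GET / HTTP/1.1 ]\n#[ H0st broken-header-without-colon ]"  # illustrative
    raise Pebkac(400, "bad headers", log=bad)
except Pebkac as ex:
    print("reply to client:", ex.code, str(ex))
    if ex.log:
        print("additional error context:\n" + ex.log)
```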
@@ -1891,6 +1891,10 @@ html.y #doc {
|
|||||||
text-align: center;
|
text-align: center;
|
||||||
padding: .5em;
|
padding: .5em;
|
||||||
}
|
}
|
||||||
|
#docul li.bn span {
|
||||||
|
font-weight: bold;
|
||||||
|
color: var(--fg-max);
|
||||||
|
}
|
||||||
#doc.prism {
|
#doc.prism {
|
||||||
padding-left: 3em;
|
padding-left: 3em;
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -3797,7 +3797,7 @@ var fileman = (function () {
|
|||||||
|
|
||||||
function rename_cb() {
|
function rename_cb() {
|
||||||
if (this.status !== 201) {
|
if (this.status !== 201) {
|
||||||
var msg = this.responseText;
|
var msg = unpre(this.responseText);
|
||||||
toast.err(9, L.fr_efail + msg);
|
toast.err(9, L.fr_efail + msg);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
@@ -3846,7 +3846,7 @@ var fileman = (function () {
|
|||||||
}
|
}
|
||||||
function delete_cb() {
|
function delete_cb() {
|
||||||
if (this.status !== 200) {
|
if (this.status !== 200) {
|
||||||
var msg = this.responseText;
|
var msg = unpre(this.responseText);
|
||||||
toast.err(9, L.fd_err + msg);
|
toast.err(9, L.fd_err + msg);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
@@ -3967,7 +3967,7 @@ var fileman = (function () {
|
|||||||
}
|
}
|
||||||
function paste_cb() {
|
function paste_cb() {
|
||||||
if (this.status !== 201) {
|
if (this.status !== 201) {
|
||||||
var msg = this.responseText;
|
var msg = unpre(this.responseText);
|
||||||
toast.err(9, L.fp_err + msg);
|
toast.err(9, L.fp_err + msg);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
@@ -4300,7 +4300,7 @@ var showfile = (function () {
|
|||||||
};
|
};
|
||||||
|
|
||||||
r.mktree = function () {
|
r.mktree = function () {
|
||||||
var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('') + '</li>'];
|
var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('<span>/</span>') + '</li>'];
|
||||||
for (var a = 0; a < r.files.length; a++) {
|
for (var a = 0; a < r.files.length; a++) {
|
||||||
var file = r.files[a];
|
var file = r.files[a];
|
||||||
html.push('<li><a href="?doc=' +
|
html.push('<li><a href="?doc=' +
|
||||||
@@ -5300,10 +5300,7 @@ document.onkeydown = function (e) {
|
|||||||
|
|
||||||
function xhr_search_results() {
|
function xhr_search_results() {
|
||||||
if (this.status !== 200) {
|
if (this.status !== 200) {
|
||||||
var msg = this.responseText;
|
var msg = unpre(this.responseText);
|
||||||
if (msg.indexOf('<pre>') === 0)
|
|
||||||
msg = msg.slice(5);
|
|
||||||
|
|
||||||
srch_msg(true, "http " + this.status + ": " + msg);
|
srch_msg(true, "http " + this.status + ": " + msg);
|
||||||
search_in_progress = 0;
|
search_in_progress = 0;
|
||||||
return;
|
return;
|
||||||
@@ -5342,7 +5339,7 @@ document.onkeydown = function (e) {
|
|||||||
if (ext.length > 8)
|
if (ext.length > 8)
|
||||||
ext = '%';
|
ext = '%';
|
||||||
|
|
||||||
var links = linksplit(r.rp + '', id).join(''),
|
var links = linksplit(r.rp + '', id).join('<span>/</span>'),
|
||||||
nodes = ['<tr><td>-</td><td><div>' + links + '</div>', sz];
|
nodes = ['<tr><td>-</td><td><div>' + links + '</div>', sz];
|
||||||
|
|
||||||
for (var b = 0; b < tagord.length; b++) {
|
for (var b = 0; b < tagord.length; b++) {
|
||||||
@@ -7168,16 +7165,17 @@ var msel = (function () {
|
|||||||
form.onsubmit = function (e) {
|
form.onsubmit = function (e) {
|
||||||
ev(e);
|
ev(e);
|
||||||
clmod(sf, 'vis', 1);
|
clmod(sf, 'vis', 1);
|
||||||
sf.textContent = 'creating "' + tb.value + '"...';
|
var dn = tb.value;
|
||||||
|
sf.textContent = 'creating "' + dn + '"...';
|
||||||
|
|
||||||
var fd = new FormData();
|
var fd = new FormData();
|
||||||
fd.append("act", "mkdir");
|
fd.append("act", "mkdir");
|
||||||
fd.append("name", tb.value);
|
fd.append("name", dn);
|
||||||
|
|
||||||
var xhr = new XHR();
|
var xhr = new XHR();
|
||||||
xhr.vp = get_evpath();
|
xhr.vp = get_evpath();
|
||||||
xhr.dn = tb.value;
|
xhr.dn = dn;
|
||||||
xhr.open('POST', xhr.vp, true);
|
xhr.open('POST', dn.startsWith('/') ? (SR || '/') : xhr.vp, true);
|
||||||
xhr.onload = xhr.onerror = cb;
|
xhr.onload = xhr.onerror = cb;
|
||||||
xhr.responseType = 'text';
|
xhr.responseType = 'text';
|
||||||
xhr.send(fd);
|
xhr.send(fd);
|
||||||
@@ -7194,7 +7192,7 @@ var msel = (function () {
|
|||||||
xhrchk(this, L.fd_xe1, L.fd_xe2);
|
xhrchk(this, L.fd_xe1, L.fd_xe2);
|
||||||
|
|
||||||
if (this.status !== 201) {
|
if (this.status !== 201) {
|
||||||
sf.textContent = 'error: ' + this.responseText;
|
sf.textContent = 'error: ' + unpre(this.responseText);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -7203,8 +7201,9 @@ var msel = (function () {
|
|||||||
sf.textContent = '';
|
sf.textContent = '';
|
||||||
|
|
||||||
var dn = this.getResponseHeader('X-New-Dir');
|
var dn = this.getResponseHeader('X-New-Dir');
|
||||||
dn = dn || uricom_enc(this.dn);
|
dn = dn ? '/' + dn + '/' : uricom_enc(this.dn);
|
||||||
treectl.goto(this.vp + dn + '/', true);
|
treectl.goto(dn, true);
|
||||||
|
tree_scrollto();
|
||||||
}
|
}
|
||||||
})();
|
})();
|
||||||
|
|
||||||
@@ -7241,7 +7240,7 @@ var msel = (function () {
|
|||||||
xhrchk(this, L.fsm_xe1, L.fsm_xe2);
|
xhrchk(this, L.fsm_xe1, L.fsm_xe2);
|
||||||
|
|
||||||
if (this.status < 200 || this.status > 201) {
|
if (this.status < 200 || this.status > 201) {
|
||||||
sf.textContent = 'error: ' + this.responseText;
|
sf.textContent = 'error: ' + unpre(this.responseText);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -7586,7 +7585,7 @@ var unpost = (function () {
|
|||||||
'<tr><td><a me="' + me + '" class="n' + a + '" href="#">' + L.un_del + '</a></td>' +
|
'<tr><td><a me="' + me + '" class="n' + a + '" href="#">' + L.un_del + '</a></td>' +
|
||||||
'<td>' + unix2iso(res[a].at) + '</td>' +
|
'<td>' + unix2iso(res[a].at) + '</td>' +
|
||||||
'<td>' + res[a].sz + '</td>' +
|
'<td>' + res[a].sz + '</td>' +
|
||||||
'<td>' + linksplit(res[a].vp).join(' ') + '</td></tr>');
|
'<td>' + linksplit(res[a].vp).join('<span> / </span>') + '</td></tr>');
|
||||||
}
|
}
|
||||||
|
|
||||||
html.push("</tbody></table>");
|
html.push("</tbody></table>");
|
||||||
@@ -7619,7 +7618,7 @@ var unpost = (function () {
|
|||||||
|
|
||||||
function unpost_delete_cb() {
|
function unpost_delete_cb() {
|
||||||
if (this.status !== 200) {
|
if (this.status !== 200) {
|
||||||
var msg = this.responseText;
|
var msg = unpre(this.responseText);
|
||||||
toast.err(9, L.un_derr + msg);
|
toast.err(9, L.un_derr + msg);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -10,6 +10,7 @@
|
|||||||
{{ html_head }}
|
{{ html_head }}
|
||||||
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
|
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
|
||||||
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
|
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
|
||||||
|
<style>ul{padding-left:1.3em}li{margin:.4em 0}</style>
|
||||||
</head>
|
</head>
|
||||||
|
|
||||||
<body>
|
<body>
|
||||||
@@ -48,9 +49,13 @@
|
|||||||
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
|
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
|
||||||
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
|
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
|
||||||
</pre>
|
</pre>
|
||||||
{% if s %}
|
<ul>
|
||||||
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>--no-check-certificate</code> to the mount command</em><br />---</p>
|
{% if s %}
|
||||||
{% endif %}
|
<li>running <code>rclone mount</code> on LAN (or just dont have valid certificates)? add <code>--no-check-certificate</code></li>
|
||||||
|
{% endif %}
|
||||||
|
<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
|
||||||
|
<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
|
||||||
|
</ul>
|
||||||
|
|
||||||
<p>if you want to use the native WebDAV client in windows instead (slow and buggy), first run <a href="{{ r }}/.cpr/a/webdav-cfg.bat">webdav-cfg.bat</a> to remove the 47 MiB filesize limit (also fixes latency and password login), then connect:</p>
|
<p>if you want to use the native WebDAV client in windows instead (slow and buggy), first run <a href="{{ r }}/.cpr/a/webdav-cfg.bat">webdav-cfg.bat</a> to remove the 47 MiB filesize limit (also fixes latency and password login), then connect:</p>
|
||||||
<pre>
|
<pre>
|
||||||
@@ -73,10 +78,13 @@
|
|||||||
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
|
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
|
||||||
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
|
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
|
||||||
</pre>
|
</pre>
|
||||||
{% if s %}
|
<ul>
|
||||||
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>--no-check-certificate</code> to the mount command</em><br />---</p>
|
{% if s %}
|
||||||
{% endif %}
|
<li>running <code>rclone mount</code> on LAN (or just dont have valid certificates)? add <code>--no-check-certificate</code></li>
|
||||||
|
{% endif %}
|
||||||
|
<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
|
||||||
|
<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
|
||||||
|
</ul>
|
||||||
<p>or the emergency alternative (gnome/gui-only):</p>
|
<p>or the emergency alternative (gnome/gui-only):</p>
|
||||||
<!-- gnome-bug: ignores vp -->
|
<!-- gnome-bug: ignores vp -->
|
||||||
<pre>
|
<pre>
|
||||||
@@ -123,8 +131,14 @@
|
|||||||
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
|
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
|
||||||
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>W:</b>
|
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>W:</b>
|
||||||
</pre>
|
</pre>
|
||||||
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
|
|
||||||
{% endif %}
|
{% endif %}
|
||||||
|
<ul>
|
||||||
|
{% if args.ftps %}
|
||||||
|
<li>running on LAN (or just dont have valid certificates)? add <code>no_check_certificate=true</code> to the config command</li>
|
||||||
|
{% endif %}
|
||||||
|
<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
|
||||||
|
<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
|
||||||
|
</ul>
|
||||||
<p>if you want to use the native FTP client in windows instead (please dont), press <code>win+R</code> and run this command:</p>
|
<p>if you want to use the native FTP client in windows instead (please dont), press <code>win+R</code> and run this command:</p>
|
||||||
<pre>
|
<pre>
|
||||||
explorer {{ "ftp" if args.ftp else "ftps" }}://{% if accs %}<b>{{ pw }}</b>:k@{% endif %}{{ host }}:{{ args.ftp or args.ftps }}/{{ rvp }}
|
explorer {{ "ftp" if args.ftp else "ftps" }}://{% if accs %}<b>{{ pw }}</b>:k@{% endif %}{{ host }}:{{ args.ftp or args.ftps }}/{{ rvp }}
|
||||||
@@ -145,8 +159,14 @@
|
|||||||
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
|
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
|
||||||
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>mp</b>
|
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>mp</b>
|
||||||
</pre>
|
</pre>
|
||||||
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
|
|
||||||
{% endif %}
|
{% endif %}
|
||||||
|
<ul>
|
||||||
|
{% if args.ftps %}
|
||||||
|
<li>running on LAN (or just dont have valid certificates)? add <code>no_check_certificate=true</code> to the config command</li>
|
||||||
|
{% endif %}
|
||||||
|
<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
|
||||||
|
<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
|
||||||
|
</ul>
|
||||||
<p>emergency alternative (gnome/gui-only):</p>
|
<p>emergency alternative (gnome/gui-only):</p>
|
||||||
<!-- gnome-bug: ignores vp -->
|
<!-- gnome-bug: ignores vp -->
|
||||||
<pre>
|
<pre>
|
||||||
@@ -178,7 +198,7 @@
|
|||||||
partyfuse.py{% if accs %} -a <b>{{ pw }}</b>{% endif %} http{{ s }}://{{ ep }}/{{ rvp }} <b><span class="os win">W:</span><span class="os lin mac">mp</span></b>
|
partyfuse.py{% if accs %} -a <b>{{ pw }}</b>{% endif %} http{{ s }}://{{ ep }}/{{ rvp }} <b><span class="os win">W:</span><span class="os lin mac">mp</span></b>
|
||||||
</pre>
|
</pre>
|
||||||
{% if s %}
|
{% if s %}
|
||||||
<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>-td</code></em></p>
|
<ul><li>if you are on LAN (or just dont have valid certificates), add <code>-td</code></li></ul>
|
||||||
{% endif %}
|
{% endif %}
|
||||||
<p>
|
<p>
|
||||||
you can use <a href="{{ r }}/.cpr/a/u2c.py">u2c.py</a> to upload (sometimes faster than web-browsers)
|
you can use <a href="{{ r }}/.cpr/a/u2c.py">u2c.py</a> to upload (sometimes faster than web-browsers)
|
||||||
|
|||||||
@@ -1,3 +1,18 @@
|
|||||||
|
:root {
|
||||||
|
--fg: #ccc;
|
||||||
|
--fg-max: #fff;
|
||||||
|
--bg-u2: #2b2b2b;
|
||||||
|
--bg-u5: #444;
|
||||||
|
}
|
||||||
|
html.y {
|
||||||
|
--fg: #222;
|
||||||
|
--fg-max: #000;
|
||||||
|
--bg-u2: #f7f7f7;
|
||||||
|
--bg-u5: #ccc;
|
||||||
|
}
|
||||||
|
html.bz {
|
||||||
|
--bg-u2: #202231;
|
||||||
|
}
|
||||||
@font-face {
|
@font-face {
|
||||||
font-family: 'scp';
|
font-family: 'scp';
|
||||||
font-display: swap;
|
font-display: swap;
|
||||||
@@ -14,6 +29,7 @@ html {
|
|||||||
max-width: min(34em, 90%);
|
max-width: min(34em, 90%);
|
||||||
max-width: min(34em, calc(100% - 7em));
|
max-width: min(34em, calc(100% - 7em));
|
||||||
color: #ddd;
|
color: #ddd;
|
||||||
|
color: var(--fg);
|
||||||
background: #333;
|
background: #333;
|
||||||
background: var(--bg-u2);
|
background: var(--bg-u2);
|
||||||
border: 0 solid #777;
|
border: 0 solid #777;
|
||||||
@@ -171,24 +187,15 @@ html {
|
|||||||
color: #f6a;
|
color: #f6a;
|
||||||
}
|
}
|
||||||
html.y #tt {
|
html.y #tt {
|
||||||
color: #333;
|
|
||||||
background: #fff;
|
|
||||||
border-color: #888 #000 #777 #000;
|
border-color: #888 #000 #777 #000;
|
||||||
}
|
}
|
||||||
html.bz #tt {
|
html.bz #tt {
|
||||||
background: #202231;
|
|
||||||
border-color: #3b3f58;
|
border-color: #3b3f58;
|
||||||
}
|
}
|
||||||
html.y #tt,
|
html.y #tt,
|
||||||
html.y #toast {
|
html.y #toast {
|
||||||
box-shadow: 0 .3em 1em rgba(0,0,0,0.4);
|
box-shadow: 0 .3em 1em rgba(0,0,0,0.4);
|
||||||
}
|
}
|
||||||
html.y #tt code {
|
|
||||||
color: #fff;
|
|
||||||
color: var(--fg-max);
|
|
||||||
background: #060;
|
|
||||||
background: var(--bg-u5);
|
|
||||||
}
|
|
||||||
#modalc code {
|
#modalc code {
|
||||||
color: #060;
|
color: #060;
|
||||||
background: transparent;
|
background: transparent;
|
||||||
@@ -326,6 +333,9 @@ html.y .btn:focus {
|
|||||||
box-shadow: 0 .1em .2em #037 inset;
|
box-shadow: 0 .1em .2em #037 inset;
|
||||||
outline: #037 solid .1em;
|
outline: #037 solid .1em;
|
||||||
}
|
}
|
||||||
|
input[type="submit"] {
|
||||||
|
cursor: pointer;
|
||||||
|
}
|
||||||
input[type="text"]:focus,
|
input[type="text"]:focus,
|
||||||
input:not([type]):focus,
|
input:not([type]):focus,
|
||||||
textarea:focus {
|
textarea:focus {
|
||||||
|
|||||||
@@ -1407,7 +1407,7 @@ function up2k_init(subtle) {
|
|||||||
|
|
||||||
pvis.addfile([
|
pvis.addfile([
|
||||||
uc.fsearch ? esc(entry.name) : linksplit(
|
uc.fsearch ? esc(entry.name) : linksplit(
|
||||||
entry.purl + uricom_enc(entry.name)).join(' '),
|
entry.purl + uricom_enc(entry.name)).join(' / '),
|
||||||
'📐 ' + L.u_hashing,
|
'📐 ' + L.u_hashing,
|
||||||
''
|
''
|
||||||
], entry.size, draw_each);
|
], entry.size, draw_each);
|
||||||
@@ -2284,7 +2284,7 @@ function up2k_init(subtle) {
|
|||||||
cdiff = (Math.abs(diff) <= 2) ? '3c0' : 'f0b',
|
cdiff = (Math.abs(diff) <= 2) ? '3c0' : 'f0b',
|
||||||
sdiff = '<span style="color:#' + cdiff + '">diff ' + diff;
|
sdiff = '<span style="color:#' + cdiff + '">diff ' + diff;
|
||||||
|
|
||||||
msg.push(linksplit(hit.rp).join('') + '<br /><small>' + tr + ' (srv), ' + tu + ' (You), ' + sdiff + '</small></span>');
|
msg.push(linksplit(hit.rp).join(' / ') + '<br /><small>' + tr + ' (srv), ' + tu + ' (You), ' + sdiff + '</small></span>');
|
||||||
}
|
}
|
||||||
msg = msg.join('<br />\n');
|
msg = msg.join('<br />\n');
|
||||||
}
|
}
|
||||||
@@ -2318,7 +2318,7 @@ function up2k_init(subtle) {
|
|||||||
url += '?k=' + fk;
|
url += '?k=' + fk;
|
||||||
}
|
}
|
||||||
|
|
||||||
pvis.seth(t.n, 0, linksplit(url).join(' '));
|
pvis.seth(t.n, 0, linksplit(url).join(' / '));
|
||||||
}
|
}
|
||||||
|
|
||||||
var chunksize = get_chunksize(t.size),
|
var chunksize = get_chunksize(t.size),
|
||||||
@@ -2402,15 +2402,12 @@ function up2k_init(subtle) {
|
|||||||
pvis.seth(t.n, 2, L.u_ehstmp, t);
|
pvis.seth(t.n, 2, L.u_ehstmp, t);
|
||||||
|
|
||||||
var err = "",
|
var err = "",
|
||||||
rsp = (xhr.responseText + ''),
|
rsp = unpre(this.responseText),
|
||||||
ofs = rsp.lastIndexOf('\nURL: ');
|
ofs = rsp.lastIndexOf('\nURL: ');
|
||||||
|
|
||||||
if (ofs !== -1)
|
if (ofs !== -1)
|
||||||
rsp = rsp.slice(0, ofs);
|
rsp = rsp.slice(0, ofs);
|
||||||
|
|
||||||
if (rsp.indexOf('<pre>') === 0)
|
|
||||||
rsp = rsp.slice(5);
|
|
||||||
|
|
||||||
if (rsp.indexOf('rate-limit ') !== -1) {
|
if (rsp.indexOf('rate-limit ') !== -1) {
|
||||||
var penalty = rsp.replace(/.*rate-limit /, "").split(' ')[0];
|
var penalty = rsp.replace(/.*rate-limit /, "").split(' ')[0];
|
||||||
console.log("rate-limit: " + penalty);
|
console.log("rate-limit: " + penalty);
|
||||||
@@ -2429,7 +2426,7 @@ function up2k_init(subtle) {
|
|||||||
err = rsp;
|
err = rsp;
|
||||||
ofs = err.indexOf('\n/');
|
ofs = err.indexOf('\n/');
|
||||||
if (ofs !== -1) {
|
if (ofs !== -1) {
|
||||||
err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' ');
|
err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' / ');
|
||||||
}
|
}
|
||||||
if (!t.rechecks && (err_pend || err_srcb)) {
|
if (!t.rechecks && (err_pend || err_srcb)) {
|
||||||
t.rechecks = 0;
|
t.rechecks = 0;
|
||||||
@@ -2536,7 +2533,7 @@ function up2k_init(subtle) {
|
|||||||
cdr = t.size;
|
cdr = t.size;
|
||||||
|
|
||||||
var orz = function (xhr) {
|
var orz = function (xhr) {
|
||||||
var txt = ((xhr.response && xhr.response.err) || xhr.responseText) + '';
|
var txt = unpre((xhr.response && xhr.response.err) || xhr.responseText);
|
||||||
if (txt.indexOf('upload blocked by x') + 1) {
|
if (txt.indexOf('upload blocked by x') + 1) {
|
||||||
apop(st.busy.upload, upt);
|
apop(st.busy.upload, upt);
|
||||||
apop(t.postlist, npart);
|
apop(t.postlist, npart);
|
||||||
|
|||||||
@@ -622,9 +622,8 @@ function linksplit(rp, id) {
|
|||||||
}
|
}
|
||||||
var vlink = esc(uricom_dec(link));
|
var vlink = esc(uricom_dec(link));
|
||||||
|
|
||||||
if (link.indexOf('/') !== -1) {
|
if (link.indexOf('/') !== -1)
|
||||||
vlink = vlink.slice(0, -1) + '<span>/</span>';
|
vlink = vlink.slice(0, -1);
|
||||||
}
|
|
||||||
|
|
||||||
if (!rp) {
|
if (!rp) {
|
||||||
if (q)
|
if (q)
|
||||||
@@ -1357,6 +1356,11 @@ function lf2br(txt) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
|
function unpre(txt) {
|
||||||
|
return ('' + txt).replace(/^<pre>/, '');
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
var toast = (function () {
|
var toast = (function () {
|
||||||
var r = {},
|
var r = {},
|
||||||
te = null,
|
te = null,
|
||||||
|
|||||||
@@ -1,3 +1,20 @@
+▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
+# 2023-1024-1643 `v1.9.15` expand placeholder
+
+[made it just in time!](https://a.ocv.me/pub/g/nerd-stuff/PXL_20231024_170348367.jpg) (EDIT: nevermind, three of the containers didn't finish uploading to ghcr before takeoff ;_; all up now)
+
+## new features
+* #56 placeholder variables in markdown documents and prologue/epilogue html files
+  * default-disabled; must be enabled globally with `--exp` or per-volume with volflag `exp`
+  * `{{self.ip}}` becomes the client IP; see [/srv/expand/README.md](https://github.com/9001/copyparty/blob/hovudstraum/srv/expand/README.md) for more examples
+* dynamic-range-compressor: reduced volume jumps between songs when enabled
+
+## bugfixes
+* v1.9.14 broke the `scan` volflag, causing volume rescans to happen every 10sec if enabled
+  * its global counterpart `--re-maxage` was not affected
+
+
+
 ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
 # 2023-1021-1443 `v1.9.14` uptime
 
@@ -106,20 +106,19 @@ def meichk():
|
|||||||
if filt not in sys.executable:
|
if filt not in sys.executable:
|
||||||
filt = os.path.basename(sys.executable)
|
filt = os.path.basename(sys.executable)
|
||||||
|
|
||||||
pids = []
|
hits = []
|
||||||
ptn = re.compile(r"^([^\s]+)\s+([0-9]+)")
|
|
||||||
try:
|
try:
|
||||||
procs = sp.check_output("tasklist").decode("utf-8", "replace")
|
cmd = "tasklist /fo csv".split(" ")
|
||||||
|
procs = sp.check_output(cmd).decode("utf-8", "replace")
|
||||||
except:
|
except:
|
||||||
procs = "" # winpe
|
procs = "" # winpe
|
||||||
|
|
||||||
for ln in procs.splitlines():
|
for ln in procs.split("\n"):
|
||||||
m = ptn.match(ln)
|
if filt in ln.split('"')[:2][-1]:
|
||||||
if m and filt in m.group(1).lower():
|
hits.append(ln)
|
||||||
pids.append(int(m.group(2)))
|
|
||||||
|
|
||||||
mod = os.path.dirname(os.path.realpath(__file__))
|
mod = os.path.dirname(os.path.realpath(__file__))
|
||||||
if os.path.basename(mod).startswith("_MEI") and len(pids) == 2:
|
if os.path.basename(mod).startswith("_MEI") and len(hits) == 2:
|
||||||
meicln(mod)
|
meicln(mod)
|
||||||
|
|
||||||
|
|
||||||