Compare commits
28 commits

| SHA1 |
|---|
| deef32335e |
| fc4b51ad00 |
| fa762754bf |
| 29bd8f57c4 |
| abc37354ef |
| ee3333362f |
| 7c0c6b94a3 |
| bac733113c |
| 32ab65d7cb |
| c6744dc483 |
| b9997d677d |
| 10defe6aef |
| 736aa125a8 |
| eb48373b8b |
| d4a7b7d84d |
| 2923a38b87 |
| dabdaaee33 |
| 65e4d67c3e |
| 4b720f4150 |
| 2e85a25614 |
| 713fffcb8e |
| 8020b11ea0 |
| 2523d76756 |
| 7ede509973 |
| 7c1d97af3b |
| 95566e8388 |
| 76afb62b7b |
| 7dec922c70 |
@@ -1,3 +1,43 @@
* do something cool

-really tho, send a PR or an issue or whatever, all appreciated, anything goes, just behave aight
+really tho, send a PR or an issue or whatever, all appreciated, anything goes, just behave aight 👍👍

but to be more specific,


# contribution ideas


## documentation

I think we can agree that the documentation leaves a LOT to be desired. I've realized I'm not exactly qualified for this 😅 but maybe the [soon-to-come setup GUI](https://github.com/9001/copyparty/issues/57) will make this more manageable. The best documentation is the one that never had to be written, right? :> so I suppose we can give this a wait-and-see approach for a bit longer.


## crazy ideas & features

assuming they won't cause too many problems or side-effects :>

i think someone was working on a way to list directories over DNS for example...

if you wanna have a go at coding it up yourself then maybe mention the idea on discord before you get too far, otherwise just go nuts 👍


## others

aside from documentation and ideas, some other things that would be cool to have some help with are:

* **translations** -- the copyparty web-UI has translations for english and norwegian at the top of [browser.js](https://github.com/9001/copyparty/blob/hovudstraum/copyparty/web/browser.js); if you'd like to add a translation for another language then that'd be welcome! and if that language has a grammar that doesn't fit into the way the strings are assembled, then we'll fix that as we go :>

* **UI ideas** -- at some point I was thinking of rewriting the UI in react/preact/something-not-vanilla-javascript, but I'll admit the comfiness of not having any build stage combined with raw performance has kinda convinced me otherwise :p but I'd be very open to ideas on how the UI could be improved, or be more intuitive.

* **docker improvements** -- I don't really know what I'm doing when it comes to containers, so I'm sure there's *huge* room for improvement here, mainly regarding how you're supposed to use the container with kubernetes / docker-compose / any of the other popular ways to do things. At some point I swear I'll start learning about docker so I can pick up clach04's [docker-compose draft](https://github.com/9001/copyparty/issues/38) and learn how that stuff ticks, unless someone beats me to it!

* **packaging** for various linux distributions -- this could either be as simple as just plopping the sfx.py in the right place and calling that from systemd (the archlinux package [originally did this](https://github.com/9001/copyparty/pull/18)); maybe with a small config-file which would cause copyparty to load settings from `/etc/copyparty.d` (like the [archlinux package](https://github.com/9001/copyparty/tree/hovudstraum/contrib/package/arch) does with `copyparty.conf`), or it could be a proper installation of the copyparty python package into /usr/lib or similar (the archlinux package [eventually went for this approach](https://github.com/9001/copyparty/pull/26))

  * [fpm](https://github.com/jordansissel/fpm) can probably help with the technical part of it, but someone needs to handle distro relations :-)

* **software integration** -- I'm sure there's a lot of usecases where copyparty could complement something else, or the other way around, so any ideas or any work in this regard would be dope. This doesn't necessarily have to be code inside copyparty itself;

  * [hooks](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks) -- these are small programs which are called by copyparty when certain things happen (files are uploaded, someone hits a 404, etc.), and could be a fun way to add support for more usecases; a minimal sketch follows this diff

  * [parser plugins](https://github.com/9001/copyparty/tree/hovudstraum/bin/mtag) -- if you want to have copyparty analyze and index metadata for some oddball file-formats, then additional plugins would be neat :>
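Since hooks are a natural entrypoint for contributions, here is a minimal sketch of what one could look like. It assumes the default calling convention where copyparty invokes the hook with the affected file's absolute path as its first argument; the logfile path is invented for the example, and the linked hooks folder has the real reference implementations.

```python
#!/usr/bin/env python3
# hypothetical upload-hook: append each uploaded file's path to a logfile;
# assumes copyparty passes the file's absolute path as argv[1]
import sys
import time


def main() -> None:
    path = sys.argv[1]
    with open("/tmp/uploads.log", "a", encoding="utf-8") as f:
        f.write("%.3f %s\n" % (time.time(), path))


if __name__ == "__main__":
    main()
```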
README.md (25 changed lines)
@@ -53,6 +53,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [webdav server](#webdav-server) - with read-write support
* [connecting to webdav from windows](#connecting-to-webdav-from-windows) - using the GUI
* [smb server](#smb-server) - unsafe, slow, not recommended for wan
* [browser ux](#browser-ux) - tweaking the ui
* [file indexing](#file-indexing) - enables dedup and music search ++
* [exclude-patterns](#exclude-patterns) - to save some time
* [filesystem guards](#filesystem-guards) - avoid traversing into other filesystems
@@ -317,6 +318,8 @@ same order here too

upgrade notes

+* `1.9.16` (2023-11-04):
+  * `--stats`/prometheus: `cpp_bans` renamed to `cpp_active_bans`, and that + `cpp_uptime` are gauges
* `1.6.0` (2023-01-29):
  * http-api: delete/move is now `POST` instead of `GET` (see the sketch after this list)
  * everything other than `GET` and `HEAD` must pass [cors validation](#cors)
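To make the 1.6.0 change concrete, a hedged sketch of a file deletion as a POST; the `?delete` query parameter and the `pw` auth parameter are quoted from memory rather than from this diff, so double-check against the current http-api docs:

```python
import requests

# hedged sketch: after 1.6.0, mutations such as delete/move must be POST;
# host, path, password and the exact query parameters are assumptions
r = requests.post("http://127.0.0.1:3923/inc/old.bin?delete&pw=hunter2")
r.raise_for_status()
```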
@@ -1304,8 +1307,23 @@ scrape_configs:
```

currently the following metrics are available,
-* `cpp_uptime_seconds`
-* `cpp_bans` number of banned IPs
+* `cpp_uptime_seconds` time since last copyparty restart
+* `cpp_boot_unixtime_seconds` same but as an absolute timestamp
+* `cpp_http_conns` number of open http(s) connections
+* `cpp_http_reqs` number of http(s) requests handled
+* `cpp_sus_reqs` number of 403/422/malicious requests
+* `cpp_active_bans` number of currently banned IPs
+* `cpp_total_bans` number of IPs banned since last restart

these are available unless `--nos-vst` is specified:
* `cpp_db_idle_seconds` time since last database activity (upload/rename/delete)
* `cpp_db_act_seconds` same but as an absolute timestamp
* `cpp_idle_vols` number of volumes which are idle / ready
* `cpp_busy_vols` number of volumes which are busy / indexing
* `cpp_offline_vols` number of volumes which are offline / unavailable
* `cpp_hashing_files` number of files queued for hashing / indexing
* `cpp_tagq_files` number of files queued for metadata scanning
* `cpp_mtpq_files` number of files queued for plugin-based analysis

and these are available per-volume only:
* `cpp_disk_size_bytes` total HDD size
@@ -1324,9 +1342,12 @@ some of the metrics have additional requirements to function correctly,

the following options are available to disable some of the metrics:
* `--nos-hdd` disables `cpp_disk_*` which can prevent spinning up HDDs
* `--nos-vol` disables `cpp_vol_*` which reduces server startup time
* `--nos-vst` disables volume state, reducing the worst-case prometheus query time by 0.5 sec
* `--nos-dup` disables `cpp_dupe_*` which reduces the server load caused by prometheus queries
* `--nos-unf` disables `cpp_unf_*` for no particular purpose

note: the following metrics are counted incorrectly if multiprocessing is enabled with `-j`: `cpp_http_conns`, `cpp_http_reqs`, `cpp_sus_reqs`, `cpp_active_bans`, `cpp_total_bans`
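For a quick sanity-check of the endpoint without prometheus, a minimal Python sketch; the host/port, the admin password, and passing it via the `pw` URL parameter are assumptions made for the example:

```python
import urllib.request

# hedged sketch: fetch the openmetrics exposition and grep one gauge;
# assumes copyparty on localhost:3923 and an admin password of "hunter2"
url = "http://127.0.0.1:3923/.cpr/metrics?pw=hunter2"
with urllib.request.urlopen(url) as r:
    for ln in r.read().decode("utf-8").splitlines():
        if ln.startswith("cpp_uptime_seconds"):
            print(ln)  # e.g. "cpp_uptime_seconds 1234.567"
```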
# packages
bin/u2c.py (33 changed lines)
@@ -1,8 +1,8 @@
#!/usr/bin/env python3
from __future__ import print_function, unicode_literals

-S_VERSION = "1.10"
-S_BUILD_DT = "2023-08-15"
+S_VERSION = "1.11"
+S_BUILD_DT = "2023-11-11"

"""
u2c.py: upload to copyparty

@@ -107,10 +107,12 @@ class File(object):
        self.ucids = []  # type: list[str]  # chunks which need to be uploaded
        self.wark = None  # type: str
        self.url = None  # type: str
+       self.nhs = 0

        # set by upload
        self.up_b = 0  # type: int
        self.up_c = 0  # type: int
+       self.cd = 0

        # t = "size({}) lmod({}) top({}) rel({}) abs({}) name({})\n"
        # eprint(t.format(self.size, self.lmod, self.top, self.rel, self.abs, self.name))

@@ -433,7 +435,7 @@ def walkdirs(err, tops, excl):
        za = [x.replace(b"/", b"\\") for x in za]
        tops = za

-   ptn = re.compile(excl.encode("utf-8") or b"\n")
+   ptn = re.compile(excl.encode("utf-8") or b"\n", re.I)

    for top in tops:
        isdir = os.path.isdir(top)

@@ -598,7 +600,7 @@ def handshake(ar, file, search):
            raise

        eprint("handshake failed, retrying: {0}\n  {1}\n\n".format(file.name, em))
-       time.sleep(1)
+       time.sleep(ar.cd)

    try:
        r = r.json()

@@ -689,6 +691,7 @@ class Ctl(object):
    def __init__(self, ar, stats=None):
        self.ok = False
+       self.errs = 0
        self.ar = ar
        self.stats = stats or self._scan()
        if not self.stats:

@@ -736,7 +739,7 @@ class Ctl(object):
        self._fancy()

-       self.ok = True
+       self.ok = not self.errs

    def _safe(self):
        """minimal basic slow boring fallback codepath"""

@@ -961,13 +964,22 @@ class Ctl(object):
                self.q_upload.put(None)
                break

-           with self.mutex:
-               self.handshaker_busy += 1
-
            upath = file.abs.decode("utf-8", "replace")
            if not VT100:
                upath = upath.lstrip("\\?")

+           file.nhs += 1
+           if file.nhs > 32:
+               print("ERROR: giving up on file %s" % (upath))
+               self.errs += 1
+               continue
+
+           with self.mutex:
+               self.handshaker_busy += 1
+
+           while time.time() < file.cd:
+               time.sleep(0.1)

            hs, sprs = handshake(self.ar, file, search)
            if search:
                if hs:

@@ -1050,6 +1062,7 @@ class Ctl(object):
            except Exception as ex:
                t = "upload failed, retrying: {0} #{1} ({2})\n"
                eprint(t.format(file.name, cid[:8], ex))
+               file.cd = time.time() + self.ar.cd
                # handshake will fix it

            with self.mutex:

@@ -1121,6 +1134,7 @@ source file/folder selection uses rsync syntax, meaning that:
    ap.add_argument("-J", type=int, metavar="THREADS", default=hcores, help="num cpu-cores to use for hashing; set 0 or 1 for single-core hashing")
    ap.add_argument("-nh", action="store_true", help="disable hashing while uploading")
    ap.add_argument("-ns", action="store_true", help="no status panel (for slow consoles and macos)")
+   ap.add_argument("--cd", type=float, metavar="SEC", default=5, help="delay before reattempting a failed handshake/upload")
    ap.add_argument("--safe", action="store_true", help="use simple fallback approach")
    ap.add_argument("-z", action="store_true", help="ZOOMIN' (skip uploading files if they exist at the destination with the ~same last-modified timestamp, so same as yolo / turbo with date-chk but even faster)")

@@ -1187,6 +1201,9 @@ source file/folder selection uses rsync syntax, meaning that:
        ar.z = True
        ctl = Ctl(ar, ctl.stats)

+   if ctl.errs:
+       print("WARNING: %d errors" % (ctl.errs))
+
    sys.exit(0 if ctl.ok else 1)
@@ -13,7 +13,7 @@
# on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1

upstream cpp {
-	server 127.0.0.1:3923;
+	server 127.0.0.1:3923 fail_timeout=1s;
	keepalive 1;
}
server {
@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
-pkgver="1.9.14"
+pkgver="1.9.17"
pkgrel=1
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, zeroconf, media indexer, thumbnails++"
arch=("any")

@@ -9,6 +9,7 @@ license=('MIT')
depends=("python" "lsof" "python-jinja")
makedepends=("python-wheel" "python-setuptools" "python-build" "python-installer" "make" "pigz")
optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tags"
+            "cfssl: generate TLS certificates on startup (pointless when reverse-proxied)"
            "python-mutagen: music tags (alternative)"
            "python-pillow: thumbnails for images"
            "python-pyvips: thumbnails for images (higher quality, faster, uses more ram)"

@@ -20,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
-sha256sums=("96867ea1bcaf622e5dc29ee3224ffa8ea80218d3a146e7a10d04c12255bae00f")
+sha256sums=("73d66a9ff21caf45d8093829ba7de5b161fcd595ff91f8674795f426db86644c")

build() {
    cd "${srcdir}/${pkgname}-${pkgver}"
@@ -3,6 +3,9 @@
# use argon2id-hashed passwords in config files (sha2 is always available)
withHashedPasswords ? true,

+# generate TLS certificates on startup (pointless when reverse-proxied)
+withCertgen ? false,
+
# create thumbnails with Pillow; faster than FFmpeg / MediaProcessing
withThumbnails ? true,

@@ -34,6 +37,7 @@ let
  ]
  ++ lib.optional withSMB impacket
  ++ lib.optional withFTPS pyopenssl
+ ++ lib.optional withCertgen cfssl
  ++ lib.optional withThumbnails pillow
  ++ lib.optional withFastThumbnails pyvips
  ++ lib.optional withMediaProcessing ffmpeg
@@ -1,5 +1,5 @@
{
-  "url": "https://github.com/9001/copyparty/releases/download/v1.9.14/copyparty-sfx.py",
-  "version": "1.9.14",
-  "hash": "sha256-H4hRi6Nn4jUouhvqLacFyr0odMQ+99crBXL3iNz7mXs="
+  "url": "https://github.com/9001/copyparty/releases/download/v1.9.17/copyparty-sfx.py",
+  "version": "1.9.17",
+  "hash": "sha256-YLl7hGWRDsFgxUvQ6hUbq+DWduhm2bs4FSZWs/AgvB0="
}
@@ -1014,6 +1014,7 @@ def add_stats(ap):
    ap2.add_argument("--stats", action="store_true", help="enable openmetrics at /.cpr/metrics for admin accounts")
    ap2.add_argument("--nos-hdd", action="store_true", help="disable disk-space metrics (used/free space)")
    ap2.add_argument("--nos-vol", action="store_true", help="disable volume size metrics (num files, total bytes, vmaxb/vmaxn)")
+   ap2.add_argument("--nos-vst", action="store_true", help="disable volume state metrics (indexing, analyzing, activity)")
    ap2.add_argument("--nos-dup", action="store_true", help="disable dupe-files metrics (good idea; very slow)")
    ap2.add_argument("--nos-unf", action="store_true", help="disable unfinished-uploads metrics")

@@ -1094,7 +1095,7 @@ def add_logging(ap):
    ap2.add_argument("--ansi", action="store_true", help="force colors; overrides environment-variable NO_COLOR")
    ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
    ap2.add_argument("--log-tdec", metavar="N", type=int, default=3, help="timestamp resolution / number of timestamp decimals")
-   ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log passphrase of failed login attempts: 0=terse, 1=plaintext, 2=hashed")
+   ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log failed login attempt passwords: 0=terse, 1=plaintext, 2=hashed")
    ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
    ap2.add_argument("--log-htp", action="store_true", help="debug: print http-server threadpool scaling")
    ap2.add_argument("--ihead", metavar="HEADER", type=u, action='append', help="dump incoming header")

@@ -1314,7 +1315,7 @@ def run_argparse(
    for k, h, t in sects:
        k2 = "help_" + k.replace("-", "_")
        if vars(ret)[k2]:
-           lprint("# {} help page".format(k))
+           lprint("# %s help page (%s)" % (k, h))
            lprint(t + "\033[0m")
            sys.exit(0)
@@ -1,8 +1,8 @@
# coding: utf-8

-VERSION = (1, 9, 15)
+VERSION = (1, 9, 18)
CODENAME = "prometheable"
-BUILD_DT = (2023, 10, 24)
+BUILD_DT = (2023, 11, 18)

S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -21,9 +21,9 @@ from .util import (
    META_NOBOTS,
    SQLITE_VER,
    UNPLICATIONS,
+   UTC,
    ODict,
    Pebkac,
-   UTC,
    absreal,
    afsenc,
    get_df,
@@ -476,12 +476,10 @@ class VFS(object):
        err: int = 403,
    ) -> tuple["VFS", str]:
        """returns [vfsnode,fs_remainder] if user has the requested permissions"""
-       if ANYWIN:
-           mod = relchk(vpath)
-           if mod:
-               if self.log:
-                   self.log("vfs", "invalid relpath [{}]".format(vpath))
-               raise Pebkac(404)
+       if relchk(vpath):
+           if self.log:
+               self.log("vfs", "invalid relpath [{}]".format(vpath))
+           raise Pebkac(422)

        cvpath = undot(vpath)
        vn, rem = self._find(cvpath)

@@ -500,8 +498,8 @@ class VFS(object):
            t = "{} has no {} in [{}] => [{}] => [{}]"
            self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)

-       t = "you don't have {}-access for this location"
-       raise Pebkac(err, t.format(msg))
+       t = 'you don\'t have %s-access in "/%s"'
+       raise Pebkac(err, t % (msg, cvpath))

        return vn, rem

@@ -1723,6 +1721,9 @@ class AuthSrv(object):
    def setup_pwhash(self, acct: dict[str, str]) -> None:
        self.ah = PWHash(self.args)
        if not self.ah.on:
+           if self.args.ah_cli or self.args.ah_gen:
+               t = "\n  BAD CONFIG:\n    cannot --ah-cli or --ah-gen without --ah-alg"
+               raise Exception(t)
            return

        if self.args.ah_cli:
@@ -132,7 +132,10 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):

    try:
        expiry, inf = _read_crt(args, "srv.pem")
-       expired = time.time() + args.crt_sdays * 60 * 60 * 24 * 0.1 > expiry
+       if "sans" not in inf:
+           raise Exception("no useable cert found")
+
+       expired = time.time() + args.crt_sdays * 60 * 60 * 24 * 0.5 > expiry
        cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
        for n in names:
            if n not in inf["sans"]:
@@ -11,11 +11,6 @@ import time

from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E

-try:
-    import asynchat
-except:
-    sys.path.append(os.path.join(E.mod, "vend"))
-
from pyftpdlib.authorizers import AuthenticationFailed, DummyAuthorizer
from pyftpdlib.filesystems import AbstractedFS, FilesystemError
from pyftpdlib.handlers import FTPHandler

@@ -92,6 +87,12 @@ class FtpAuth(DummyAuthorizer):
        if bonk:
            logging.warning("client banned: invalid passwords")
            bans[ip] = bonk
+           try:
+               # only possible if multiprocessing disabled
+               self.hub.broker.httpsrv.bans[ip] = bonk
+               self.hub.broker.httpsrv.nban += 1
+           except:
+               pass

        raise AuthenticationFailed("Authentication failed.")

@@ -148,7 +149,7 @@ class FtpFs(AbstractedFS):
        try:
            vpath = vpath.replace("\\", "/").strip("/")
            rd, fn = os.path.split(vpath)
-           if ANYWIN and relchk(rd):
+           if relchk(rd):
                logging.warning("malicious vpath: %s", vpath)
                t = "Unsupported characters in [{}]"
                raise FSE(t.format(vpath), 1)
@@ -39,10 +39,11 @@ from .szip import StreamZip
from .util import (
    HTTPCODE,
    META_NOBOTS,
+   UTC,
+   Garda,
    MultipartParser,
    ODict,
    Pebkac,
-   UTC,
    UnrecvEOF,
    absreal,
    alltrace,

@@ -75,6 +76,7 @@ from .util import (
    runhook,
    s3enc,
    sanitize_fn,
+   sanitize_vpath,
    sendfile_kern,
    sendfile_py,
    undot,

@@ -146,6 +148,7 @@ class HttpCli(object):
        self.rem = " "
        self.vpath = " "
        self.vpaths = " "
+       self.gctx = " "  # additional context for garda
        self.trailing_slash = True
        self.uname = " "
        self.pw = " "
@@ -254,8 +257,8 @@ class HttpCli(object):
                k, zs = header_line.split(":", 1)
                self.headers[k.lower()] = zs.strip()
            except:
-               msg = " ]\n#[ ".join(headerlines)
-               raise Pebkac(400, "bad headers:\n#[ " + msg + " ]")
+               msg = "#[ " + " ]\n#[ ".join(headerlines) + " ]"
+               raise Pebkac(400, "bad headers", log=msg)

        except Pebkac as ex:
            self.mode = "GET"

@@ -268,8 +271,14 @@ class HttpCli(object):
                self.loud_reply(unicode(ex), status=ex.code, headers=h, volsan=True)
            except:
                pass

+           if ex.log:
+               self.log("additional error context:\n" + ex.log, 6)
+
            return False

+       self.conn.hsrv.nreq += 1
+
        self.ua = self.headers.get("user-agent", "")
        self.is_rclone = self.ua.startswith("rclone/")
@@ -411,12 +420,9 @@ class HttpCli(object):
            self.vpath + "/" if self.trailing_slash and self.vpath else self.vpath
        )

-       ok = "\x00" not in self.vpath
-       if ANYWIN:
-           ok = ok and not relchk(self.vpath)
-
-       if not ok and (self.vpath != "*" or self.mode != "OPTIONS"):
+       if relchk(self.vpath) and (self.vpath != "*" or self.mode != "OPTIONS"):
            self.log("invalid relpath [{}]".format(self.vpath))
+           self.cbonk(self.conn.hsrv.g422, self.vpath, "bad_vp", "invalid relpaths")
            return self.tx_404() and self.keepalive

        zso = self.headers.get("authorization")
@@ -549,6 +555,9 @@ class HttpCli(object):
            zb = b"<pre>" + html_escape(msg).encode("utf-8", "replace")
            h = {"WWW-Authenticate": 'Basic realm="a"'} if pex.code == 401 else {}
            self.reply(zb, status=pex.code, headers=h, volsan=True)
+           if pex.log:
+               self.log("additional error context:\n" + pex.log, 6)
+
            return self.keepalive
        except Pebkac:
            return False
@@ -559,6 +568,36 @@ class HttpCli(object):
        else:
            return self.conn.iphash.s(self.ip)

+   def cbonk(self, g: Garda, v: str, reason: str, descr: str) -> bool:
+       self.conn.hsrv.nsus += 1
+       if not g.lim:
+           return False
+
+       bonk, ip = g.bonk(self.ip, v + self.gctx)
+       if not bonk:
+           return False
+
+       xban = self.vn.flags.get("xban")
+       if not xban or not runhook(
+           self.log,
+           xban,
+           self.vn.canonical(self.rem),
+           self.vpath,
+           self.host,
+           self.uname,
+           time.time(),
+           0,
+           self.ip,
+           time.time(),
+           reason,
+       ):
+           self.log("client banned: %s" % (descr,), 1)
+           self.conn.hsrv.bans[ip] = bonk
+           self.conn.hsrv.nban += 1
+           return True
+
+       return False
+
    def is_banned(self) -> bool:
        if not self.conn.bans:
            return False
@@ -678,24 +717,7 @@ class HttpCli(object):
            or not self.args.nonsus_urls
            or not self.args.nonsus_urls.search(self.vpath)
        ):
-           bonk, ip = g.bonk(self.ip, self.vpath)
-           if bonk:
-               xban = self.vn.flags.get("xban")
-               if not xban or not runhook(
-                   self.log,
-                   xban,
-                   self.vn.canonical(self.rem),
-                   self.vpath,
-                   self.host,
-                   self.uname,
-                   time.time(),
-                   0,
-                   self.ip,
-                   time.time(),
-                   str(status),
-               ):
-                   self.log("client banned: %ss" % (status,), 1)
-                   self.conn.hsrv.bans[ip] = bonk
+           self.cbonk(g, self.vpath, str(status), "%ss" % (status,))

        if volsan:
            vols = list(self.asrv.vfs.all_vols.values())
@@ -2121,8 +2143,10 @@ class HttpCli(object):
        return True

    def get_pwd_cookie(self, pwd: str) -> str:
-       if self.asrv.ah.hash(pwd) in self.asrv.iacct:
-           msg = "login ok"
+       hpwd = self.asrv.ah.hash(pwd)
+       uname = self.asrv.iacct.get(hpwd)
+       if uname:
+           msg = "hi " + uname
            dur = int(60 * 60 * self.args.logout)
        else:
            logpwd = pwd
@@ -2133,27 +2157,7 @@ class HttpCli(object):
                logpwd = "%" + base64.b64encode(zb[:12]).decode("utf-8")

            self.log("invalid password: {}".format(logpwd), 3)

-           g = self.conn.hsrv.gpwd
-           if g.lim:
-               bonk, ip = g.bonk(self.ip, pwd)
-               if bonk:
-                   xban = self.vn.flags.get("xban")
-                   if not xban or not runhook(
-                       self.log,
-                       xban,
-                       self.vn.canonical(self.rem),
-                       self.vpath,
-                       self.host,
-                       self.uname,
-                       time.time(),
-                       0,
-                       self.ip,
-                       time.time(),
-                       "pw",
-                   ):
-                       self.log("client banned: invalid passwords", 1)
-                       self.conn.hsrv.bans[ip] = bonk
+           self.cbonk(self.conn.hsrv.gpwd, pwd, "pw", "invalid passwords")

            msg = "naw dude"
            pwd = "x"  # nosec
@@ -2177,26 +2181,30 @@ class HttpCli(object):
        new_dir = self.parser.require("name", 512)
        self.parser.drop()

-       sanitized = sanitize_fn(new_dir, "", [])
-       return self._mkdir(vjoin(self.vpath, sanitized))
+       return self._mkdir(vjoin(self.vpath, new_dir))

    def _mkdir(self, vpath: str, dav: bool = False) -> bool:
        nullwrite = self.args.nw
+       self.gctx = vpath
+       vpath = undot(vpath)
        vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
        self._assert_safe_rem(rem)
+       rem = sanitize_vpath(rem, "/", [])
        fn = vfs.canonical(rem)
+       if not fn.startswith(vfs.realpath):
+           self.log("invalid mkdir [%s] [%s]" % (self.gctx, vpath), 1)
+           raise Pebkac(422)

        if not nullwrite:
            fdir = os.path.dirname(fn)

-           if not bos.path.isdir(fdir):
+           if dav and not bos.path.isdir(fdir):
                raise Pebkac(409, "parent folder does not exist")

            if bos.path.isdir(fn):
-               raise Pebkac(405, "that folder exists already")
+               raise Pebkac(405, 'folder "/%s" already exists' % (vpath,))

            try:
-               bos.mkdir(fn)
+               bos.makedirs(fn)
            except OSError as ex:
                if ex.errno == errno.EACCES:
                    raise Pebkac(500, "the server OS denied write-access")

@@ -2205,7 +2213,7 @@ class HttpCli(object):
        except:
            raise Pebkac(500, min_ex())

-       self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1])
+       self.out_headers["X-New-Dir"] = quotep(vpath)

        if dav:
            self.reply(b"", 201)
@@ -128,6 +128,9 @@ class HttpSrv(object):

        self.u2fh = FHC()
        self.metrics = Metrics(self)
+       self.nreq = 0
+       self.nsus = 0
+       self.nban = 0
        self.srvs: list[socket.socket] = []
        self.ncli = 0  # exact
        self.clients: set[HttpConn] = set()  # laggy
@@ -34,14 +34,23 @@ class Metrics(object):

        ret: list[str] = []

-       def addc(k: str, unit: str, v: str, desc: str) -> None:
-           if unit:
-               k += "_" + unit
-               zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
-               ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))
-           else:
-               zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
-               ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
+       def addc(k: str, v: str, desc: str) -> None:
+           zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
+           ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
+
+       def adduc(k: str, unit: str, v: str, desc: str) -> None:
+           k += "_" + unit
+           zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
+           ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))

        def addg(k: str, v: str, desc: str) -> None:
            zs = "# TYPE %s gauge\n# HELP %s %s\n%s %s"
            ret.append(zs % (k, k, desc, k, v))

+       def addug(k: str, unit: str, v: str, desc: str) -> None:
+           k += "_" + unit
+           zs = "# TYPE %s gauge\n# UNIT %s %s\n# HELP %s %s\n%s %s"
+           ret.append(zs % (k, k, unit, k, desc, k, v))
+
        def addh(k: str, typ: str, desc: str) -> None:
            zs = "# TYPE %s %s\n# HELP %s %s"
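To make the split between these emitters concrete, this is roughly the exposition text that `addug("cpp_uptime", "seconds", v, t)` produces, derived directly from the format string above (the sample value is made up):

```
# TYPE cpp_uptime_seconds gauge
# UNIT cpp_uptime_seconds seconds
# HELP cpp_uptime_seconds time since last copyparty restart
cpp_uptime_seconds 1234.567
```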
@@ -54,17 +63,75 @@ class Metrics(object):
        def addv(k: str, v: str) -> None:
            ret.append("%s %s" % (k, v))

+       t = "time since last copyparty restart"
        v = "{:.3f}".format(time.time() - self.hsrv.t0)
-       addc("cpp_uptime", "seconds", v, "time since last server restart")
+       addug("cpp_uptime", "seconds", v, t)

+       # timestamps are gauges because initial value is not zero
+       t = "unixtime of last copyparty restart"
+       v = "{:.3f}".format(self.hsrv.t0)
+       addug("cpp_boot_unixtime", "seconds", v, t)
+
+       t = "number of open http(s) client connections"
+       addg("cpp_http_conns", str(self.hsrv.ncli), t)
+
+       t = "number of http(s) requests since last restart"
+       addc("cpp_http_reqs", str(self.hsrv.nreq), t)
+
+       t = "number of 403/422/malicious reqs since restart"
+       addc("cpp_sus_reqs", str(self.hsrv.nsus), t)
+
        v = str(len(conn.bans or []))
-       addc("cpp_bans", "", v, "number of banned IPs")
+       addg("cpp_active_bans", v, "number of currently banned IPs")

+       t = "number of IPs banned since last restart"
+       addg("cpp_total_bans", str(self.hsrv.nban), t)
+
+       if not args.nos_vst:
+           x = self.hsrv.broker.ask("up2k.get_state")
+           vs = json.loads(x.get())
+
+           nvidle = 0
+           nvbusy = 0
+           nvoffline = 0
+           for v in vs["volstate"].values():
+               if v == "online, idle":
+                   nvidle += 1
+               elif "OFFLINE" in v:
+                   nvoffline += 1
+               else:
+                   nvbusy += 1
+
+           addg("cpp_idle_vols", str(nvidle), "number of idle/ready volumes")
+           addg("cpp_busy_vols", str(nvbusy), "number of busy/indexing volumes")
+           addg("cpp_offline_vols", str(nvoffline), "number of offline volumes")
+
+           t = "time since last database activity (upload/rename/delete)"
+           addug("cpp_db_idle", "seconds", str(vs["dbwt"]), t)
+
+           t = "unixtime of last database activity (upload/rename/delete)"
+           addug("cpp_db_act", "seconds", str(vs["dbwu"]), t)
+
+           t = "number of files queued for hashing/indexing"
+           addg("cpp_hashing_files", str(vs["hashq"]), t)
+
+           t = "number of files queued for metadata scanning"
+           addg("cpp_tagq_files", str(vs["tagq"]), t)
+
+           try:
+               t = "number of files queued for plugin-based analysis"
+               addg("cpp_mtpq_files", str(int(vs["mtpq"])), t)
+           except:
+               pass
+
        if not args.nos_hdd:
            addbh("cpp_disk_size_bytes", "total HDD size of volume")
            addbh("cpp_disk_free_bytes", "free HDD space in volume")
            for vpath, vol in allvols:
                free, total = get_df(vol.realpath)
                if free is None or total is None:
                    continue

                addv('cpp_disk_size_bytes{vol="/%s"}' % (vpath), str(total))
                addv('cpp_disk_free_bytes{vol="/%s"}' % (vpath), str(free))
@@ -161,5 +228,6 @@ class Metrics(object):
        ret.append("# EOF")

        mime = "application/openmetrics-text; version=1.0.0; charset=utf-8"
+       mime = cli.uparam.get("mime") or mime
        cli.reply("\n".join(ret).encode("utf-8"), mime=mime)
        return True
@@ -136,8 +136,12 @@ class PWHash(object):
        import getpass

        while True:
-           p1 = getpass.getpass("password> ")
-           p2 = getpass.getpass("again or just hit ENTER> ")
+           try:
+               p1 = getpass.getpass("password> ")
+               p2 = getpass.getpass("again or just hit ENTER> ")
+           except EOFError:
+               return

            if p2 and p1 != p2:
                print("\033[31minputs don't match; try again\033[0m", file=sys.stderr)
                continue
@@ -36,17 +36,17 @@ from .tcpsrv import TcpSrv
from .th_srv import HAVE_PIL, HAVE_VIPS, HAVE_WEBP, ThumbSrv
from .up2k import Up2k
from .util import (
-   FFMPEG_URL,
-   VERSIONS,
-   Daemon,
    DEF_EXP,
    DEF_MTE,
    DEF_MTH,
+   FFMPEG_URL,
+   UTC,
+   VERSIONS,
+   Daemon,
    Garda,
    HLog,
    HMaccas,
    ODict,
-   UTC,
    alltrace,
    ansi_re,
    min_ex,

@@ -65,6 +65,11 @@ from .util import (
    w8b64enc,
)

+try:
+    from pathlib import Path
+except:
+    pass
+
if HAVE_SQLITE3:
    import sqlite3
@@ -261,6 +266,7 @@ class Up2k(object):
            "hashq": self.n_hashq,
            "tagq": self.n_tagq,
            "mtpq": mtpq,
+           "dbwu": "{:.2f}".format(self.db_act),
            "dbwt": "{:.2f}".format(
                min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
            ),
@@ -789,6 +795,11 @@ class Up2k(object):
        except:
            return None

+       vpath = "?"
+       for k, v in self.asrv.vfs.all_vols.items():
+           if v.realpath == ptop:
+               vpath = k
+
        _, flags = self._expr_idx_filter(flags)

        ft = "\033[0;32m{}{:.0}"

@@ -824,17 +835,9 @@ class Up2k(object):
            a = ["\033[90mall-default"]

        if a:
-           vpath = "?"
-           for k, v in self.asrv.vfs.all_vols.items():
-               if v.realpath == ptop:
-                   vpath = k
-
-           if vpath:
-               vpath += "/"
-
            zs = " ".join(sorted(a))
            zs = zs.replace("90mre.compile(", "90m(")  # nohash
-           self.log("/{} {}".format(vpath, zs), "35")
+           self.log("/{} {}".format(vpath + ("/" if vpath else ""), zs), "35")

        reg = {}
        drp = None
@@ -884,9 +887,6 @@ class Up2k(object):

        try:
            cur = self._open_db(db_path)
-           self.cur[ptop] = cur
-           self.volsize[cur] = 0
-           self.volnfiles[cur] = 0

            # speeds measured uploading 520 small files on a WD20SPZX (SMR 2.5" 5400rpm 4kb)
            dbd = flags["dbd"]

@@ -920,6 +920,13 @@ class Up2k(object):

            cur.execute("pragma synchronous=" + sync)
            cur.connection.commit()

+           self._verify_db_cache(cur, vpath)
+
+           self.cur[ptop] = cur
+           self.volsize[cur] = 0
+           self.volnfiles[cur] = 0
+
            return cur, db_path
        except:
            msg = "cannot use database at [{}]:\n{}"
@@ -927,6 +934,25 @@ class Up2k(object):

        return None

+   def _verify_db_cache(self, cur: "sqlite3.Cursor", vpath: str) -> None:
+       # check if volume config changed since last use; drop caches if so
+       zsl = [vpath] + list(sorted(self.asrv.vfs.all_vols.keys()))
+       zb = hashlib.sha1("\n".join(zsl).encode("utf-8", "replace")).digest()
+       vcfg = base64.urlsafe_b64encode(zb[:18]).decode("ascii")
+
+       c = cur.execute("select v from kv where k = 'volcfg'")
+       try:
+           (oldcfg,) = c.fetchone()
+       except:
+           oldcfg = ""
+
+       if oldcfg != vcfg:
+           cur.execute("delete from kv where k = 'volcfg'")
+           cur.execute("delete from dh")
+           cur.execute("delete from cv")
+           cur.execute("insert into kv values ('volcfg',?)", (vcfg,))
+           cur.connection.commit()
+
    def _build_file_index(self, vol: VFS, all_vols: list[VFS]) -> tuple[bool, bool]:
        do_vac = False
        top = vol.realpath
@@ -2723,7 +2749,18 @@ class Up2k(object):
                    raise Exception("symlink-fallback disabled in cfg")

                if not linked:
-                   os.symlink(fsenc(lsrc), fsenc(ldst))
+                   if ANYWIN:
+                       Path(ldst).symlink_to(lsrc)
+                       if not bos.path.exists(dst):
+                           try:
+                               bos.unlink(dst)
+                           except:
+                               pass
+                           t = "the created symlink [%s] did not resolve to [%s]"
+                           raise Exception(t % (ldst, lsrc))
+                   else:
+                       os.symlink(fsenc(lsrc), fsenc(ldst))

                    linked = True
            except Exception as ex:
                self.log("cannot link; creating copy: " + repr(ex))
@@ -3904,45 +3941,58 @@ class Up2k(object):
                self.n_hashq -= 1
            # self.log("hashq {}".format(self.n_hashq))

-           ptop, vtop, rd, fn, ip, at, usr, skip_xau = self.hashq.get()
-           # self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
-           if "e2d" not in self.flags[ptop]:
-               continue
+           task = self.hashq.get()
+           if len(task) != 8:
+               raise Exception("invalid hash task")

-           abspath = djoin(ptop, rd, fn)
-           self.log("hashing " + abspath)
-           inf = bos.stat(abspath)
-           if not inf.st_size:
-               wark = up2k_wark_from_metadata(
-                   self.salt, inf.st_size, int(inf.st_mtime), rd, fn
-               )
-           else:
-               hashes = self._hashlist_from_file(abspath)
-               if not hashes:
+           try:
+               if not self._hash_t(task):
+                   return
+           except Exception as ex:
+               self.log("failed to hash %s: %s" % (task, ex), 1)

-               wark = up2k_wark_from_hashlist(self.salt, inf.st_size, hashes)
+   def _hash_t(self, task: tuple[str, str, str, str, str, float, str, bool]) -> bool:
+       ptop, vtop, rd, fn, ip, at, usr, skip_xau = task
+       # self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
+       if "e2d" not in self.flags[ptop]:
+           return True

-           with self.mutex:
-               self.idx_wark(
-                   self.flags[ptop],
-                   rd,
-                   fn,
-                   inf.st_mtime,
-                   inf.st_size,
-                   ptop,
-                   vtop,
-                   wark,
-                   "",
-                   usr,
-                   ip,
-                   at,
-                   skip_xau,
-               )
+       abspath = djoin(ptop, rd, fn)
+       self.log("hashing " + abspath)
+       inf = bos.stat(abspath)
+       if not inf.st_size:
+           wark = up2k_wark_from_metadata(
+               self.salt, inf.st_size, int(inf.st_mtime), rd, fn
+           )
+       else:
+           hashes = self._hashlist_from_file(abspath)
+           if not hashes:
+               return False

-           if at and time.time() - at > 30:
-               with self.rescan_cond:
-                   self.rescan_cond.notify_all()
+           wark = up2k_wark_from_hashlist(self.salt, inf.st_size, hashes)
+
+       with self.mutex:
+           self.idx_wark(
+               self.flags[ptop],
+               rd,
+               fn,
+               inf.st_mtime,
+               inf.st_size,
+               ptop,
+               vtop,
+               wark,
+               "",
+               usr,
+               ip,
+               at,
+               skip_xau,
+           )
+
+       if at and time.time() - at > 30:
+           with self.rescan_cond:
+               self.rescan_cond.notify_all()
+
+       return True

    def hash_file(
        self,
@@ -1563,8 +1563,8 @@ def read_header(sr: Unrecv, t_idle: int, t_tot: int) -> list[str]:

        raise Pebkac(
            400,
-           "protocol error while reading headers:\n"
-           + ret.decode("utf-8", "replace"),
+           "protocol error while reading headers",
+           log=ret.decode("utf-8", "replace"),
        )

    ofs = ret.find(b"\r\n\r\n")
@@ -1773,7 +1773,16 @@ def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
    return fn.strip()


+def sanitize_vpath(vp: str, ok: str, bad: list[str]) -> str:
+    parts = vp.replace(os.sep, "/").split("/")
+    ret = [sanitize_fn(x, ok, bad) for x in parts]
+    return "/".join(ret)
+
+
def relchk(rp: str) -> str:
+    if "\x00" in rp:
+        return "[nul]"
+
    if ANYWIN:
        if "\n" in rp or "\r" in rp:
            return "x\nx"
@@ -2976,9 +2985,12 @@ def hidedir(dp) -> None:


class Pebkac(Exception):
-   def __init__(self, code: int, msg: Optional[str] = None) -> None:
+   def __init__(
+       self, code: int, msg: Optional[str] = None, log: Optional[str] = None
+   ) -> None:
        super(Pebkac, self).__init__(msg or HTTPCODE[code])
        self.code = code
+       self.log = log

    def __repr__(self) -> str:
        return "Pebkac({}, {})".format(self.code, repr(self.args))
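A usage sketch of the extended exception (the header value is invented for illustration): the client-facing message stays terse while the verbose context only reaches the serverlog via the new `log` field.

```python
from copyparty.util import Pebkac

try:
    raise Pebkac(400, "bad headers", log="#[ Garbage-Header: zzz ]")
except Pebkac as ex:
    print(ex)      # -> "bad headers"; safe to echo to the client
    print(ex.log)  # -> the full header dump, server-side only
```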
@@ -1891,6 +1891,10 @@ html.y #doc {
	text-align: center;
	padding: .5em;
}
+#docul li.bn span {
+	font-weight: bold;
+	color: var(--fg-max);
+}
#doc.prism {
	padding-left: 3em;
}
@@ -3797,7 +3797,7 @@ var fileman = (function () {

	function rename_cb() {
		if (this.status !== 201) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
			toast.err(9, L.fr_efail + msg);
			return;
		}

@@ -3846,7 +3846,7 @@ var fileman = (function () {
	}
	function delete_cb() {
		if (this.status !== 200) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
			toast.err(9, L.fd_err + msg);
			return;
		}

@@ -3967,7 +3967,7 @@ var fileman = (function () {
	}
	function paste_cb() {
		if (this.status !== 201) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
			toast.err(9, L.fp_err + msg);
			return;
		}

@@ -4300,7 +4300,7 @@ var showfile = (function () {
	};

	r.mktree = function () {
-		var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('') + '</li>'];
+		var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('<span>/</span>') + '</li>'];
		for (var a = 0; a < r.files.length; a++) {
			var file = r.files[a];
			html.push('<li><a href="?doc=' +

@@ -4505,12 +4505,13 @@ var thegrid = (function () {
			aplay = ebi('a' + oth.getAttribute('id')),
			is_img = /\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp|webm|mkv|mp4)(\?|$)/i.test(href),
			is_dir = href.endsWith('/'),
+			is_srch = !!ebi('unsearch'),
			in_tree = is_dir && treectl.find(oth.textContent.slice(0, -1)),
			have_sel = QS('#files tr.sel'),
			td = oth.closest('td').nextSibling,
			tr = td.parentNode;

-		if ((r.sel && !dbl && !ctrl(e)) || (treectl.csel && (e.shiftKey || ctrl(e)))) {
+		if (!is_srch && ((r.sel && !dbl && !ctrl(e)) || (treectl.csel && (e.shiftKey || ctrl(e))))) {
			td.onclick.call(td, e);
			if (e.shiftKey)
				return r.loadsel();

@@ -4647,7 +4648,7 @@ var thegrid = (function () {
			if (r.full)
				ihref += 'f'
			if (href == "#")
-				ihref = SR + '/.cpr/ico/⏏️';
+				ihref = SR + '/.cpr/ico/' + (ref == 'moar' ? '++' : 'exit');
		}
		else if (isdir) {
			ihref = SR + '/.cpr/ico/folder';

@@ -5300,10 +5301,7 @@ document.onkeydown = function (e) {

	function xhr_search_results() {
		if (this.status !== 200) {
-			var msg = this.responseText;
-			if (msg.indexOf('<pre>') === 0)
-				msg = msg.slice(5);
-
+			var msg = unpre(this.responseText);
			srch_msg(true, "http " + this.status + ": " + msg);
			search_in_progress = 0;
			return;

@@ -5342,7 +5340,7 @@ document.onkeydown = function (e) {
		if (ext.length > 8)
			ext = '%';

-		var links = linksplit(r.rp + '', id).join(''),
+		var links = linksplit(r.rp + '', id).join('<span>/</span>'),
			nodes = ['<tr><td>-</td><td><div>' + links + '</div>', sz];

		for (var b = 0; b < tagord.length; b++) {

@@ -7168,16 +7166,17 @@ var msel = (function () {
	form.onsubmit = function (e) {
		ev(e);
		clmod(sf, 'vis', 1);
-		sf.textContent = 'creating "' + tb.value + '"...';
+		var dn = tb.value;
+		sf.textContent = 'creating "' + dn + '"...';

		var fd = new FormData();
		fd.append("act", "mkdir");
-		fd.append("name", tb.value);
+		fd.append("name", dn);

		var xhr = new XHR();
		xhr.vp = get_evpath();
-		xhr.dn = tb.value;
-		xhr.open('POST', xhr.vp, true);
+		xhr.dn = dn;
+		xhr.open('POST', dn.startsWith('/') ? (SR || '/') : xhr.vp, true);
		xhr.onload = xhr.onerror = cb;
		xhr.responseType = 'text';
		xhr.send(fd);

@@ -7194,7 +7193,7 @@ var msel = (function () {
		xhrchk(this, L.fd_xe1, L.fd_xe2);

		if (this.status !== 201) {
-			sf.textContent = 'error: ' + this.responseText;
+			sf.textContent = 'error: ' + unpre(this.responseText);
			return;
		}

@@ -7203,8 +7202,9 @@ var msel = (function () {
		sf.textContent = '';

		var dn = this.getResponseHeader('X-New-Dir');
-		dn = dn || uricom_enc(this.dn);
-		treectl.goto(this.vp + dn + '/', true);
+		dn = dn ? '/' + dn + '/' : uricom_enc(this.dn);
+		treectl.goto(dn, true);
		tree_scrollto();
	}
})();

@@ -7241,7 +7241,7 @@ var msel = (function () {
		xhrchk(this, L.fsm_xe1, L.fsm_xe2);

		if (this.status < 200 || this.status > 201) {
-			sf.textContent = 'error: ' + this.responseText;
+			sf.textContent = 'error: ' + unpre(this.responseText);
			return;
		}

@@ -7586,7 +7586,7 @@ var unpost = (function () {
				'<tr><td><a me="' + me + '" class="n' + a + '" href="#">' + L.un_del + '</a></td>' +
				'<td>' + unix2iso(res[a].at) + '</td>' +
				'<td>' + res[a].sz + '</td>' +
-				'<td>' + linksplit(res[a].vp).join(' ') + '</td></tr>');
+				'<td>' + linksplit(res[a].vp).join('<span> / </span>') + '</td></tr>');
		}

		html.push("</tbody></table>");

@@ -7619,7 +7619,7 @@ var unpost = (function () {

	function unpost_delete_cb() {
		if (this.status !== 200) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
			toast.err(9, L.un_derr + msg);
			return;
		}
@@ -10,6 +10,7 @@
{{ html_head }}
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
+<style>ul{padding-left:1.3em}li{margin:.4em 0}</style>
</head>

<body>

@@ -48,9 +49,13 @@
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
</pre>
-{% if s %}
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>--no-check-certificate</code> to the mount command</em><br />---</p>
-{% endif %}
+<ul>
+{% if s %}
+<li>running <code>rclone mount</code> on LAN (or just dont have valid certificates)? add <code>--no-check-certificate</code></li>
+{% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
+</ul>

<p>if you want to use the native WebDAV client in windows instead (slow and buggy), first run <a href="{{ r }}/.cpr/a/webdav-cfg.bat">webdav-cfg.bat</a> to remove the 47 MiB filesize limit (also fixes latency and password login), then connect:</p>
<pre>

@@ -73,10 +78,13 @@
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
</pre>
-{% if s %}
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>--no-check-certificate</code> to the mount command</em><br />---</p>
-{% endif %}
+<ul>
+{% if s %}
+<li>running <code>rclone mount</code> on LAN (or just dont have valid certificates)? add <code>--no-check-certificate</code></li>
+{% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
+</ul>
<p>or the emergency alternative (gnome/gui-only):</p>
<!-- gnome-bug: ignores vp -->
<pre>

@@ -123,8 +131,14 @@
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>W:</b>
</pre>
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
{% endif %}
+<ul>
+{% if args.ftps %}
+<li>running on LAN (or just dont have valid certificates)? add <code>no_check_certificate=true</code> to the config command</li>
+{% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
+</ul>
<p>if you want to use the native FTP client in windows instead (please dont), press <code>win+R</code> and run this command:</p>
<pre>
explorer {{ "ftp" if args.ftp else "ftps" }}://{% if accs %}<b>{{ pw }}</b>:k@{% endif %}{{ host }}:{{ args.ftp or args.ftps }}/{{ rvp }}

@@ -145,8 +159,14 @@
rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>mp</b>
</pre>
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
{% endif %}
+<ul>
+{% if args.ftps %}
+<li>running on LAN (or just dont have valid certificates)? add <code>no_check_certificate=true</code> to the config command</li>
+{% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code> </code> (space)</li>
+</ul>
<p>emergency alternative (gnome/gui-only):</p>
<!-- gnome-bug: ignores vp -->
<pre>

@@ -178,7 +198,7 @@
partyfuse.py{% if accs %} -a <b>{{ pw }}</b>{% endif %} http{{ s }}://{{ ep }}/{{ rvp }} <b><span class="os win">W:</span><span class="os lin mac">mp</span></b>
</pre>
{% if s %}
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>-td</code></em></p>
+<ul><li>if you are on LAN (or just dont have valid certificates), add <code>-td</code></li></ul>
{% endif %}
<p>
you can use <a href="{{ r }}/.cpr/a/u2c.py">u2c.py</a> to upload (sometimes faster than web-browsers)
@@ -1,3 +1,18 @@
+:root {
+	--fg: #ccc;
+	--fg-max: #fff;
+	--bg-u2: #2b2b2b;
+	--bg-u5: #444;
+}
+html.y {
+	--fg: #222;
+	--fg-max: #000;
+	--bg-u2: #f7f7f7;
+	--bg-u5: #ccc;
+}
+html.bz {
+	--bg-u2: #202231;
+}
@font-face {
	font-family: 'scp';
	font-display: swap;

@@ -14,6 +29,7 @@ html {
-	max-width: min(34em, 90%);
+	max-width: min(34em, calc(100% - 7em));
-	color: #ddd;
+	color: var(--fg);
-	background: #333;
+	background: var(--bg-u2);
	border: 0 solid #777;

@@ -171,24 +187,15 @@ html {
	color: #f6a;
}
-html.y #tt {
-	color: #333;
-	background: #fff;
-	border-color: #888 #000 #777 #000;
-}
-html.bz #tt {
-	background: #202231;
-	border-color: #3b3f58;
-}
html.y #tt,
html.y #toast {
	box-shadow: 0 .3em 1em rgba(0,0,0,0.4);
}
html.y #tt code {
-	color: #fff;
+	color: var(--fg-max);
-	background: #060;
+	background: var(--bg-u5);
}
#modalc code {
	color: #060;
	background: transparent;

@@ -326,6 +333,9 @@ html.y .btn:focus {
	box-shadow: 0 .1em .2em #037 inset;
	outline: #037 solid .1em;
}
+input[type="submit"] {
+	cursor: pointer;
+}
input[type="text"]:focus,
input:not([type]):focus,
textarea:focus {
@@ -1407,7 +1407,7 @@ function up2k_init(subtle) {

		pvis.addfile([
			uc.fsearch ? esc(entry.name) : linksplit(
-				entry.purl + uricom_enc(entry.name)).join(' '),
+				entry.purl + uricom_enc(entry.name)).join(' / '),
			'📐 ' + L.u_hashing,
			''
		], entry.size, draw_each);

@@ -2284,7 +2284,7 @@ function up2k_init(subtle) {
				cdiff = (Math.abs(diff) <= 2) ? '3c0' : 'f0b',
				sdiff = '<span style="color:#' + cdiff + '">diff ' + diff;

-			msg.push(linksplit(hit.rp).join('') + '<br /><small>' + tr + ' (srv), ' + tu + ' (You), ' + sdiff + '</small></span>');
+			msg.push(linksplit(hit.rp).join(' / ') + '<br /><small>' + tr + ' (srv), ' + tu + ' (You), ' + sdiff + '</small></span>');
		}
		msg = msg.join('<br />\n');
	}

@@ -2318,7 +2318,7 @@ function up2k_init(subtle) {
			url += '?k=' + fk;
		}

-		pvis.seth(t.n, 0, linksplit(url).join(' '));
+		pvis.seth(t.n, 0, linksplit(url).join(' / '));
	}

	var chunksize = get_chunksize(t.size),

@@ -2402,15 +2402,12 @@ function up2k_init(subtle) {
		pvis.seth(t.n, 2, L.u_ehstmp, t);

		var err = "",
-			rsp = (xhr.responseText + ''),
+			rsp = unpre(xhr.responseText),
			ofs = rsp.lastIndexOf('\nURL: ');

		if (ofs !== -1)
			rsp = rsp.slice(0, ofs);

-		if (rsp.indexOf('<pre>') === 0)
-			rsp = rsp.slice(5);
-
		if (rsp.indexOf('rate-limit ') !== -1) {
			var penalty = rsp.replace(/.*rate-limit /, "").split(' ')[0];
			console.log("rate-limit: " + penalty);

@@ -2429,7 +2426,7 @@ function up2k_init(subtle) {
		err = rsp;
		ofs = err.indexOf('\n/');
		if (ofs !== -1) {
-			err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' ');
+			err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' / ');
		}
		if (!t.rechecks && (err_pend || err_srcb)) {
			t.rechecks = 0;

@@ -2536,7 +2533,7 @@ function up2k_init(subtle) {
			cdr = t.size;

		var orz = function (xhr) {
-			var txt = ((xhr.response && xhr.response.err) || xhr.responseText) + '';
+			var txt = unpre((xhr.response && xhr.response.err) || xhr.responseText);
			if (txt.indexOf('upload blocked by x') + 1) {
				apop(st.busy.upload, upt);
				apop(t.postlist, npart);
@@ -622,9 +622,8 @@ function linksplit(rp, id) {
		}
		var vlink = esc(uricom_dec(link));

-		if (link.indexOf('/') !== -1) {
-			vlink = vlink.slice(0, -1) + '<span>/</span>';
-		}
+		if (link.indexOf('/') !== -1)
+			vlink = vlink.slice(0, -1);

		if (!rp) {
			if (q)

@@ -1357,6 +1356,11 @@ function lf2br(txt) {
}


+function unpre(txt) {
+	return ('' + txt).replace(/^<pre>/, '');
+}
+
+
var toast = (function () {
	var r = {},
		te = null,
@@ -1,3 +1,74 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1111-1738 `v1.9.17` 11-11

## new features
* `u2c.py` / `u2c.exe` (the commandline uploader):
  * `-x` is now case-insensitive (see the sketch after this list)
  * if a file fails to upload after 30 attempts, give up (bitflips)
  * add 5 sec delay before reattempts (configurable with `--cd`)
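A small sketch of what case-insensitive `-x` means in practice; the pattern and paths below are made up, but compiling the exclusion pattern over bytes with `re.I` matches the u2c.py change in this release:

```python
import re

# u2c now compiles the -x exclusion pattern with re.I
ptn = re.compile(rb"\.tmp$|node_modules", re.I)
for path in [b"Build/NODE_MODULES/x.js", b"photo.TMP", b"photo.jpg"]:
    print(path, bool(ptn.search(path)))  # the first two now match
```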
## bugfixes
* clients could crash the file indexer by uploading and then instantly deleting files (as some webdav clients tend to do)
  * and fix some upload errorhandling which broke during a refactoring in v1.9.16

## other changes
* upgraded pyftpdlib to v1.5.9
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2023-1104-2158 `v1.9.16` windedup
|
||||
|
||||
## breaking changes
|
||||
* two of the prometheus metrics have changed slightly; see the [breaking changes readme section](https://github.com/9001/copyparty#breaking-changes)
|
||||
* (i'm not familiar with prometheus so i'm not sure if this is a big deal)
|
||||
|
||||
## new features
|
||||
* #58 versioned docker images! no longer just `latest`
|
||||
* browser: the mkdir feature now accepts `foo/bar/qux` and `../foo` and `/bar`
|
||||
* add 14 more prometheus metrics; see [readme](https://github.com/9001/copyparty#prometheus) for details
|
||||
* connections, requests, malicious requests, volume state, file hashing/analyzation queues
|
||||
* catch some more malicious requests in the autoban filters
|
||||
* some malicious requests are now answered with HTTP 422, so that they count against `--ban-422`
|
||||
|
||||
## bugfixes
|
||||
* windows: fix symlink-based upload deduplication
|
||||
* MS decided to make symlinks relative to working-directory rather than destination-path...
|
||||
* `--stats` would produce invalid metrics if a volume was offline
|
||||
* minor improvements to password hashing ux:
|
||||
* properly warn if `--ah-cli` or `--ah-gen` is used without `--ah-alg`
|
||||
* support `^D` during `--ah-cli`
|
||||
* browser-ux / cosmetics:
|
||||
* fix toast/tooltip colors on splashpage
|
||||
* easier to do partial text selection inside links (search results, breadcrumbs, uploads)
|
||||
* more rclone-related hints on the connect-page
|
||||
|
||||
## other changes
|
||||
* malformed http headers from clients are no longer included in the client error-message
|
||||
* just in case there are deployments with a reverse-proxy inserting interesting stuff on the way in
|
||||
* the serverlog still contains all the necessary info to debug your own clients
|
||||
* updated [example nginx config](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nginx/copyparty.conf) to recover faster from brief server outages
|
||||
* the default value of `fail_timeout` (10sec) makes nginx cache the outage for longer than necessary
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2023-1024-1643 `v1.9.15` expand placeholder
|
||||
|
||||
[made it just in time!](https://a.ocv.me/pub/g/nerd-stuff/PXL_20231024_170348367.jpg) (EDIT: nevermind, three of the containers didn't finish uploading to ghcr before takeoff ;_; all up now)
|
||||
|
||||
## new features
|
||||
* #56 placeholder variables in markdown documents and prologue/epilogue html files
|
||||
* default-disabled; must be enabled globally with `--exp` or per-volume with volflag `exp`
|
||||
* `{{self.ip}}` becomes the client IP; see [/srv/expand/README.md](https://github.com/9001/copyparty/blob/hovudstraum/srv/expand/README.md) for more examples
|
||||
* dynamic-range-compressor: reduced volume jumps between songs when enabled
|
||||
|
||||
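to get a feel for the expansion, here's an illustrative sketch; copyparty does this server-side in python, and the set of available variables is defined by the server, not by this demo:

```js
// expand {{key}} style placeholders; unknown keys are left as-is
function expand(doc, vars) {
    return doc.replace(/\{\{([a-z._]+)\}\}/g, function (m, k) {
        return Object.prototype.hasOwnProperty.call(vars, k) ? vars[k] : m;
    });
}
console.log(expand('you are {{self.ip}}', { 'self.ip': '203.0.113.7' }));
// "you are 203.0.113.7"
```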
## bugfixes
* v1.9.14 broke the `scan` volflag, causing volume rescans to happen every 10sec if enabled
  * its global counterpart `--re-maxage` was not affected



▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1021-1443 `v1.9.14` uptime

@@ -28,10 +28,6 @@ https://github.com/nayuki/QR-Code-generator/
C: Project Nayuki
L: MIT

https://github.com/python/cpython/blob/3.10/Lib/asyncore.py
C: 1996 Sam Rushing
L: ISC

https://github.com/ahupp/python-magic/
C: 2001-2014 Adam Hupp
L: MIT

@@ -141,12 +141,25 @@ filt=
}

[ $push ] && {
    ver=$(
        python3 ../../dist/copyparty-sfx.py --version 2>/dev/null |
        awk '/^copyparty v/{sub(/-.*/,"");sub(/v/,"");print$2;exit}'
    )
    echo $ver | grep -E '[0-9]\.[0-9]' || {
        echo no ver
        exit 1
    }
    for i in $dhub_order; do
        printf '\ndockerhub %s\n' $i
        podman manifest push --all copyparty-$i copyparty/$i:$ver
        podman manifest push --all copyparty-$i copyparty/$i:latest
    done
    done &
    for i in $ghcr_order; do
        printf '\nghcr %s\n' $i
        podman manifest push --all copyparty-$i ghcr.io/9001/copyparty-$i:$ver
        podman manifest push --all copyparty-$i ghcr.io/9001/copyparty-$i:latest
    done
    done &
    wait
}

echo ok

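the awk one-liner above just peels the bare version number out of the `--version` banner so it can double as an image tag; the same extraction, sketched (the banner text here is hypothetical):

```js
// mirrors the awk: take field 2, sub(/-.*/,""), sub(/v/,"")
var banner = 'copyparty v1.9.17-0 "lost in the woods"'; // hypothetical
var ver = banner.split(' ')[1]  // "v1.9.17-0"  (awk's $2)
    .replace(/-.*/, '')         // "v1.9.17"
    .replace(/v/, '');          // "1.9.17"
console.log(ver);
```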
@@ -205,26 +205,22 @@ necho() {
mv {markupsafe,jinja2} j2/

necho collecting pyftpdlib
f="../build/pyftpdlib-1.5.8.tar.gz"
f="../build/pyftpdlib-1.5.9.tar.gz"
[ -e "$f" ] ||
    (url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.8.tar.gz;
    (url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.9.tar.gz;
    wget -O$f "$url" || curl -L "$url" >$f)

tar -zxf $f
mv pyftpdlib-release-*/pyftpdlib .
rm -rf pyftpdlib-release-* pyftpdlib/test
for f in pyftpdlib/_async{hat,ore}.py; do
    [ -e "$f" ] || continue;
    iawk 'NR<4||NR>27||!/^#/;NR==4{print"# license: https://opensource.org/licenses/ISC\n"}' $f
done

mkdir ftp/
mv pyftpdlib ftp/

necho collecting asyncore, asynchat
for n in asyncore.py asynchat.py; do
    f=../build/$n
    [ -e "$f" ] ||
        (url=https://raw.githubusercontent.com/python/cpython/c4d45ee670c09d4f6da709df072ec80cb7dfad22/Lib/$n;
        wget -O$f "$url" || curl -L "$url" >$f)
done

necho collecting python-magic
v=0.4.27
f="../build/python-magic-$v.tar.gz"
@@ -293,12 +289,6 @@ necho() {
(cd "${x%/*}"; cp -p "../$(cat "${x##*/}")" ${x##*/})
done

# insert asynchat
mkdir copyparty/vend
for n in asyncore.py asynchat.py; do
    awk 'NR<4||NR>27;NR==4{print"# license: https://opensource.org/licenses/ISC\n"}' ../build/$n >copyparty/vend/$n
done

rm -f copyparty/stolen/*/README.md

# remove type hints before build instead
@@ -419,7 +409,7 @@ iawk '/^ {0,4}[^ ]/{s=0}/^ {4}def (serve_forever|_loop)/{s=1}!s' ftp/pyftpdlib/s
rm -f ftp/pyftpdlib/{__main__,prefork}.py

[ $no_ftp ] &&
    rm -rf copyparty/ftpd.py ftp asyncore.py asynchat.py &&
    rm -rf copyparty/ftpd.py ftp &&
    sed -ri '/\.ftp/d' copyparty/svchub.py

[ $no_smb ] &&
@@ -576,8 +566,8 @@ nf=$(ls -1 "$zdir"/arc.* 2>/dev/null | wc -l)
cat ../$bdir/COPYING.txt) >> copyparty/res/COPYING.txt ||
    echo "copying.txt 404 pls rebuild"

mv ftp/* j2/* copyparty/vend/* .
rm -rf ftp j2 py2 py37 copyparty/vend
mv ftp/* j2/* .
rm -rf ftp j2 py2 py37
(cd copyparty; tar -cvf z.tar $t; rm -rf $t)
cd ..
pyoxidizer build --release --target-triple $tgt

@@ -9,7 +9,7 @@ f23615c522ed58b9a05978ba4c69c06224590f3a6adbd8e89b31838b181a57160739ceff1fc2ba6f
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
8d16a967a0a7872a7575b1005cf66915deacda6ee8611fbb52f42fc3e3beb2f901a5140c942a5d146bd412b92bfa9cbadd82beeba83df6d70930c6dc26608a5b upx-4.1.0-win32.zip
# u2c (win7)
4562b1065c6bce7084eb575b654985c990e26034bfcd8db54629312f43ac737e264db7a2b4d8b797e09919a485cbc6af3fd0931690b7ed79b62bcc0736aec9fc certifi-2023.7.22-py3-none-any.whl
f3390290b896019b2fa169932390e4930d1c03c014e1f6db2405ca2eb1f51f5f5213f725885853805b742997b0edb369787e5c0069d217bc4e8b957f847f58b6 certifi-2023.11.17-py3-none-any.whl
904eb57b13bea80aea861de86987e618665d37fa9ea0856e0125a9ba767a53e5064de0b9c4735435a2ddf4f16f7f7d2c75a682e1de83d9f57922bdca8e29988c charset_normalizer-3.3.0-cp37-cp37m-win32.whl
ffdd45326f4e91c02714f7a944cbcc2fdd09299f709cfa8aec0892053eef0134fb80d9ba3790afd319538a86feb619037cbf533e2f5939cb56b35bb17f56c858 idna-3.4-py3-none-any.whl
b795abb26ba2f04f1afcfb196f21f638014b26c8186f8f488f1c2d91e8e0220962fbd259dbc9c3875222eb47fc95c73fc0606aaa6602b9ebc524809c9ba3501f requests-2.31.0-py3-none-any.whl

@@ -106,20 +106,19 @@ def meichk():
    if filt not in sys.executable:
        filt = os.path.basename(sys.executable)

    pids = []
    ptn = re.compile(r"^([^\s]+)\s+([0-9]+)")
    hits = []
    try:
        procs = sp.check_output("tasklist").decode("utf-8", "replace")
        cmd = "tasklist /fo csv".split(" ")
        procs = sp.check_output(cmd).decode("utf-8", "replace")
    except:
        procs = ""  # winpe

    for ln in procs.splitlines():
        m = ptn.match(ln)
        if m and filt in m.group(1).lower():
            pids.append(int(m.group(2)))
    for ln in procs.split("\n"):
        if filt in ln.split('"')[:2][-1]:
            hits.append(ln)

    mod = os.path.dirname(os.path.realpath(__file__))
    if os.path.basename(mod).startswith("_MEI") and len(pids) == 2:
    if os.path.basename(mod).startswith("_MEI") and len(hits) == 2:
        meicln(mod)

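the old code regex-matched the default `tasklist` layout, which breaks on localized or truncated image names; with `/fo csv` the process name is simply the first quoted field, which is what the `ln.split('"')[:2][-1]` trick grabs. roughly the same thing (hypothetical row contents):

```js
// a csv row from `tasklist /fo csv`; the values are made up
var row = '"copyparty.exe","1234","Console","1","24,564 K"';
// splitting on '"' gives ['', 'copyparty.exe', ',', ...]; keeping the
// first two pieces and taking the last mirrors python's [:2][-1]
var name = row.split('"').slice(0, 2).pop();
console.log(name); // "copyparty.exe"
// header or blank lines have no quotes, so the whole line comes back
// and simply won't contain the filter string
```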
@@ -33,7 +33,7 @@ fns=(
)
[ $w7 ] && fns+=(
    pyinstaller-5.13.2-py3-none-win32.whl
    certifi-2022.12.7-py3-none-any.whl
    certifi-2023.11.17-py3-none-any.whl
    chardet-5.1.0-py3-none-any.whl
    idna-3.4-py3-none-any.whl
    requests-2.28.2-py3-none-any.whl

@@ -59,9 +59,6 @@ copyparty/th_srv.py,
copyparty/u2idx.py,
copyparty/up2k.py,
copyparty/util.py,
copyparty/vend,
copyparty/vend/asynchat.py,
copyparty/vend/asyncore.py,
copyparty/web,
copyparty/web/a,
copyparty/web/a/__init__.py,

@@ -16,16 +16,11 @@ def uncomment(fpath):
        orig = f.read().decode("utf-8")

    out = ""
    for ln in orig.split("\n"):
        if not ln.startswith("#"):
            break

        out += ln + "\n"

    io_obj = io.StringIO(orig)
    prev_toktype = tokenize.INDENT
    last_lineno = -1
    last_col = 0
    code = False
    for tok in tokenize.generate_tokens(io_obj.readline):
        # print(repr(tok))
        token_type = tok[0]
@@ -53,7 +48,11 @@ def uncomment(fpath):
                out += token_string
            else:
                out += '"a"'
        elif token_type != tokenize.COMMENT or is_legalese:
        elif token_type != tokenize.COMMENT:
            out += token_string
            if not code and token_string.strip():
                code = True
        elif is_legalese or (not start_col and not code):
            out += token_string
        else:
            if out.rstrip(" ").endswith("\n"):

@@ -115,7 +115,7 @@ class Cfg(Namespace):
        ex = "dotpart no_rescan no_sendfile no_voldump plain_ip"
        ka.update(**{k: True for k in ex.split()})

        ex = "css_browser hist js_browser no_forget no_hash no_idx nonsus_urls"
        ex = "ah_cli ah_gen css_browser hist js_browser no_forget no_hash no_idx nonsus_urls"
        ka.update(**{k: None for k in ex.split()})

        ex = "s_thead s_tbody th_convt"
@@ -190,6 +190,7 @@ class VHttpSrv(object):
        self.broker = NullBroker()
        self.prism = None
        self.bans = {}
        self.nreq = 0

        aliases = ["splash", "browser", "browser2", "msg", "md", "mde"]
        self.j2 = {x: J2_FILES for x in aliases}