Compare commits

...

28 Commits

Author  SHA1        Message  (Date)

ed      deef32335e  v1.9.18  (2023-11-18 21:06:55 +00:00)
ed      fc4b51ad00  make dhash more volatile; probably fixes #61:
                    if any volumes were added or removed since last use,
                    drop dhash to verify that there are no files to shadow
                    (2023-11-18 20:48:56 +00:00)
ed      fa762754bf  fix close/more thumbs in search results for pillow 10.x  (2023-11-18 13:57:35 +00:00)
ed      29bd8f57c4  fix js error when ctrl-clicking a search result; closes #60  (2023-11-18 13:47:00 +00:00)
ed      abc37354ef  update pkgs to 1.9.17  (2023-11-11 18:22:51 +00:00)
ed      ee3333362f  v1.9.17  (2023-11-11 17:38:43 +00:00)
ed      7c0c6b94a3  drop asyncore; pyftpdlib has vendored it  (2023-11-11 17:20:00 +00:00)
ed      bac733113c  up2k-hasher robustness:
                    webdav clients tend to upload and then immediately delete
                    files to test for write-access and available disk space,
                    so don't crash and burn when that happens
                    (2023-11-11 16:21:54 +00:00)
ed      32ab65d7cb  add cfssl to packaging + improve certgen expiration check  (2023-11-11 15:30:03 +00:00)
ed      c6744dc483  u2c: configurable retry delay  (2023-11-11 14:46:00 +00:00)
ed      b9997d677d  u2c: give up on files with bitflips  (2023-11-11 14:30:46 +00:00)
ed      10defe6aef  u2c: make -x case-insensitive  (2023-11-11 14:02:01 +00:00)
ed      736aa125a8  fix dumb  (2023-11-11 13:52:06 +00:00)
ed      eb48373b8b  mention fpm  (2023-11-08 00:55:16 +00:00)
ed      d4a7b7d84d  add contribution ideas  (2023-11-06 15:33:29 +00:00)
ed      2923a38b87  update pkgs to 1.9.16  (2023-11-04 23:30:07 +00:00)
ed      dabdaaee33  v1.9.16  (2023-11-04 21:58:01 +00:00)
ed      65e4d67c3e  mkdir with leading slash works as expected  (2023-11-04 22:21:56 +00:00)
ed      4b720f4150  add more prometheus metrics; breaking changes:
                    * cpp_uptime is now a gauge
                    * cpp_bans is now cpp_active_bans (and also a gauge)
                    and other related fixes:
                    * stop emitting invalid cpp_disk_size/free for offline volumes
                    * support overriding the spec-mandatory mimetype with ?mime=foo
                    (2023-11-04 20:32:34 +00:00)
ed      2e85a25614  improve service listing  (2023-11-04 10:23:37 +00:00)
ed      713fffcb8e  also mkdir missing intermediates,
                    unless requester is a webdav client (those expect a 409)
                    (2023-11-03 23:23:49 +00:00)
ed      8020b11ea0  improve/simplify validation/errorhandling:
                    * some malicious requests are now answered with HTTP 422,
                      so that they count against --ban-422
                    * do not include request headers when replying to invalid requests,
                      in case there is a reverse-proxy inserting something interesting
                    (2023-11-03 23:07:16 +00:00)
ed      2523d76756  windows: fix symlinks  (2023-11-03 17:16:12 +00:00)
ed      7ede509973  nginx: reduce cost of spurious connectivity loss;
                    the default fail_timeout (10 sec) makes the server unavailable for that
                    amount of time, even if the server is just down for a quick restart
                    (2023-11-03 17:13:11 +00:00)
ed      7c1d97af3b  slightly better pyinstaller loader  (2023-11-03 17:09:34 +00:00)
ed      95566e8388  cosmetics:
                    * fix toast/tooltip colors on splashpage
                    * properly warn if --ah-cli or --ah-gen is used without --ah-alg
                    * support ^D during --ah-cli
                    * improve flavor texts
                    (2023-11-03 16:52:43 +00:00)
ed      76afb62b7b  make each segment of links separately selectable  (2023-10-25 12:21:39 +00:00)
ed      7dec922c70  update pkgs to 1.9.15  (2023-10-24 16:56:57 +00:00)
35 changed files with 592 additions and 257 deletions

View File

@@ -1,3 +1,43 @@
 * do something cool
-really tho, send a PR or an issue or whatever, all appreciated, anything goes, just behave aight
+really tho, send a PR or an issue or whatever, all appreciated, anything goes, just behave aight 👍👍
+
+but to be more specific,
+
+# contribution ideas
+
+## documentation
+
+I think we can agree that the documentation leaves a LOT to be desired. I've realized I'm not exactly qualified for this 😅 but maybe the [soon-to-come setup GUI](https://github.com/9001/copyparty/issues/57) will make this more manageable. The best documentation is the one that never had to be written, right? :> so I suppose we can give this a wait-and-see approach for a bit longer.
+
+## crazy ideas & features
+
+assuming they won't cause too much problems or side-effects :>
+
+i think someone was working on a way to list directories over DNS for example...
+
+if you wanna have a go at coding it up yourself then maybe mention the idea on discord before you get too far, otherwise just go nuts 👍
+
+## others
+
+aside from documentation and ideas, some other things that would be cool to have some help with is:
+
+* **translations** -- the copyparty web-UI has translations for english and norwegian at the top of [browser.js](https://github.com/9001/copyparty/blob/hovudstraum/copyparty/web/browser.js); if you'd like to add a translation for another language then that'd be welcome! and if that language has a grammar that doesn't fit into the way the strings are assembled, then we'll fix that as we go :>
+
+* **UI ideas** -- at some point I was thinking of rewriting the UI in react/preact/something-not-vanilla-javascript, but I'll admit the comfiness of not having any build stage combined with raw performance has kinda convinced me otherwise :p but I'd be very open to ideas on how the UI could be improved, or be more intuitive.
+
+* **docker improvements** -- I don't really know what I'm doing when it comes to containers, so I'm sure there's a *huge* room for improvement here, mainly regarding how you're supposed to use the container with kubernetes / docker-compose / any of the other popular ways to do things. At some point I swear I'll start learning about docker so I can pick up clach04's [docker-compose draft](https://github.com/9001/copyparty/issues/38) and learn how that stuff ticks, unless someone beats me to it!
+
+* **packaging** for various linux distributions -- this could either be as simple as just plopping the sfx.py in the right place and calling that from systemd (the archlinux package [originally did this](https://github.com/9001/copyparty/pull/18)); maybe with a small config-file which would cause copyparty to load settings from `/etc/copyparty.d` (like the [archlinux package](https://github.com/9001/copyparty/tree/hovudstraum/contrib/package/arch) does with `copyparty.conf`), or it could be a proper installation of the copyparty python package into /usr/lib or similar (the archlinux package [eventually went for this approach](https://github.com/9001/copyparty/pull/26))
+  * [fpm](https://github.com/jordansissel/fpm) can probably help with the technical part of it, but someone needs to handle distro relations :-)
+
+* **software integration** -- I'm sure there's a lot of usecases where copyparty could complement something else, or the other way around, so any ideas or any work in this regard would be dope. This doesn't necessarily have to be code inside copyparty itself;
+  * [hooks](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks) -- these are small programs which are called by copyparty when certain things happen (files are uploaded, someone hits a 404, etc.), and could be a fun way to add support for more usecases
+  * [parser plugins](https://github.com/9001/copyparty/tree/hovudstraum/bin/mtag) -- if you want to have copyparty analyze and index metadata for some oddball file-formats, then additional plugins would be neat :>

View File

@@ -53,6 +53,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [webdav server](#webdav-server) - with read-write support
 * [connecting to webdav from windows](#connecting-to-webdav-from-windows) - using the GUI
 * [smb server](#smb-server) - unsafe, slow, not recommended for wan
+* [browser ux](#browser-ux) - tweaking the ui
 * [file indexing](#file-indexing) - enables dedup and music search ++
 * [exclude-patterns](#exclude-patterns) - to save some time
 * [filesystem guards](#filesystem-guards) - avoid traversing into other filesystems

@@ -317,6 +318,8 @@ same order here too
 upgrade notes
+* `1.9.16` (2023-11-04):
+  * `--stats`/prometheus: `cpp_bans` renamed to `cpp_active_bans`, and that + `cpp_uptime` are gauges
 * `1.6.0` (2023-01-29):
   * http-api: delete/move is now `POST` instead of `GET`
   * everything other than `GET` and `HEAD` must pass [cors validation](#cors)

@@ -1304,8 +1307,23 @@ scrape_configs:
 ```
 currently the following metrics are available,
-* `cpp_uptime_seconds`
-* `cpp_bans` number of banned IPs
+* `cpp_uptime_seconds` time since last copyparty restart
+* `cpp_boot_unixtime_seconds` same but as an absolute timestamp
+* `cpp_http_conns` number of open http(s) connections
+* `cpp_http_reqs` number of http(s) requests handled
+* `cpp_sus_reqs` number of 403/422/malicious requests
+* `cpp_active_bans` number of currently banned IPs
+* `cpp_total_bans` number of IPs banned since last restart
+
+these are available unless `--nos-vst` is specified:
+* `cpp_db_idle_seconds` time since last database activity (upload/rename/delete)
+* `cpp_db_act_seconds` same but as an absolute timestamp
+* `cpp_idle_vols` number of volumes which are idle / ready
+* `cpp_busy_vols` number of volumes which are busy / indexing
+* `cpp_offline_vols` number of volumes which are offline / unavailable
+* `cpp_hashing_files` number of files queued for hashing / indexing
+* `cpp_tagq_files` number of files queued for metadata scanning
+* `cpp_mtpq_files` number of files queued for plugin-based analysis

 and these are available per-volume only:
 * `cpp_disk_size_bytes` total HDD size

@@ -1324,9 +1342,12 @@ some of the metrics have additional requirements to function correctly,
 the following options are available to disable some of the metrics:
 * `--nos-hdd` disables `cpp_disk_*` which can prevent spinning up HDDs
 * `--nos-vol` disables `cpp_vol_*` which reduces server startup time
+* `--nos-vst` disables volume state, reducing the worst-case prometheus query time by 0.5 sec
 * `--nos-dup` disables `cpp_dupe_*` which reduces the server load caused by prometheus queries
 * `--nos-unf` disables `cpp_unf_*` for no particular purpose
+
+note: the following metrics are counted incorrectly if multiprocessing is enabled with `-j`: `cpp_http_conns`, `cpp_http_reqs`, `cpp_sus_reqs`, `cpp_active_bans`, `cpp_total_bans`

 # packages
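The rename from `cpp_bans` to `cpp_active_bans` also changed the metric type to a gauge (a value that may go down again, unlike a counter). A minimal sketch of the prometheus/openmetrics text exposition format that a metrics endpoint emits; the metric names come from the list above, but the emitter itself is illustrative, not copyparty's actual code:

```python
# Hypothetical emitter for the prometheus text exposition format;
# only the metric names/descriptions are taken from the README diff above.

def fmt_gauge(name, desc, value):
    """render one gauge in prometheus text exposition format"""
    return "# HELP {n} {d}\n# TYPE {n} gauge\n{n} {v}\n".format(
        n=name, d=desc, v=value
    )

print(fmt_gauge("cpp_uptime_seconds", "time since last copyparty restart", 123.4))
print(fmt_gauge("cpp_active_bans", "number of currently banned IPs", 0))
```

A scraper distinguishes the two types via the `# TYPE` line, which is why renaming the metric when its type changed avoids silently breaking existing dashboards.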

View File

@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
 from __future__ import print_function, unicode_literals

-S_VERSION = "1.10"
-S_BUILD_DT = "2023-08-15"
+S_VERSION = "1.11"
+S_BUILD_DT = "2023-11-11"

 """
 u2c.py: upload to copyparty

@@ -107,10 +107,12 @@ class File(object):
         self.ucids = []  # type: list[str]  # chunks which need to be uploaded
         self.wark = None  # type: str
         self.url = None  # type: str
+        self.nhs = 0

         # set by upload
         self.up_b = 0  # type: int
         self.up_c = 0  # type: int
+        self.cd = 0

         # t = "size({}) lmod({}) top({}) rel({}) abs({}) name({})\n"
         # eprint(t.format(self.size, self.lmod, self.top, self.rel, self.abs, self.name))
@@ -433,7 +435,7 @@ def walkdirs(err, tops, excl):
         za = [x.replace(b"/", b"\\") for x in za]
         tops = za

-    ptn = re.compile(excl.encode("utf-8") or b"\n")
+    ptn = re.compile(excl.encode("utf-8") or b"\n", re.I)

     for top in tops:
         isdir = os.path.isdir(top)
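Adding `re.I` makes the `-x` exclusion pattern case-insensitive, so an exclusion written once also matches differently-cased filesystems (common on Windows and macOS). A rough illustration; the pattern and sample paths are made up, but the byte-pattern compilation mirrors the diff:

```python
import re

# hypothetical value of u2c's -x argument
excl = "thumbs\\.db"

# same construction as in the diff: bytes pattern, case-insensitive
ptn = re.compile(excl.encode("utf-8") or b"\n", re.I)

print(bool(ptn.search(b"c:/pics/Thumbs.db")))  # True only because of re.I
print(bool(ptn.search(b"c:/pics/cover.jpg")))  # False
```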
@@ -598,7 +600,7 @@ def handshake(ar, file, search):
                 raise

             eprint("handshake failed, retrying: {0}\n  {1}\n\n".format(file.name, em))
-            time.sleep(1)
+            time.sleep(ar.cd)

     try:
         r = r.json()
@@ -689,6 +691,7 @@ class Ctl(object):
     def __init__(self, ar, stats=None):
         self.ok = False
+        self.errs = 0
         self.ar = ar
         self.stats = stats or self._scan()
         if not self.stats:

@@ -736,7 +739,7 @@ class Ctl(object):
             self._fancy()

-        self.ok = True
+        self.ok = not self.errs

     def _safe(self):
         """minimal basic slow boring fallback codepath"""
@@ -961,13 +964,22 @@ class Ctl(object):
                 self.q_upload.put(None)
                 break

-            with self.mutex:
-                self.handshaker_busy += 1
-
             upath = file.abs.decode("utf-8", "replace")
             if not VT100:
                 upath = upath.lstrip("\\?")

+            file.nhs += 1
+            if file.nhs > 32:
+                print("ERROR: giving up on file %s" % (upath))
+                self.errs += 1
+                continue
+
+            with self.mutex:
+                self.handshaker_busy += 1
+
+            while time.time() < file.cd:
+                time.sleep(0.1)
+
             hs, sprs = handshake(self.ar, file, search)
             if search:
                 if hs:
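The u2c changes in this hunk and the next add up to a bounded-retry policy: each failed upload pushes the file's cooldown (`file.cd`) `--cd` seconds into the future, and after 32 handshake attempts (`file.nhs`) the file is abandoned and counted in `self.errs`, which then flips the exit code. A standalone sketch of the same policy; the limit of 32 is taken from the diff, the retry driver itself is illustrative:

```python
import time

MAX_TRIES = 32  # same limit as file.nhs in the diff


def try_upload(attempt_fn, cooldown=5.0, max_tries=MAX_TRIES):
    """retry attempt_fn until it succeeds or the attempt budget runs out;
    returns True on success, False if we gave up (-> counts as an error)"""
    next_ok = 0.0  # like file.cd: earliest time the next attempt is allowed
    for _ in range(max_tries):
        while time.time() < next_ok:
            time.sleep(0.01)
        try:
            attempt_fn()
            return True
        except Exception:
            next_ok = time.time() + cooldown  # push the cooldown forward
    return False


# a flaky operation that succeeds on the third attempt
state = {"n": 0}

def flaky():
    state["n"] += 1
    if state["n"] < 3:
        raise OSError("simulated bitflip")

print(try_upload(flaky, cooldown=0.01))  # True
```

The "give up on files with bitflips" commit exists because a file whose content keeps hashing differently would otherwise loop through handshakes forever.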
@@ -1050,6 +1062,7 @@ class Ctl(object):
             except Exception as ex:
                 t = "upload failed, retrying: {0} #{1} ({2})\n"
                 eprint(t.format(file.name, cid[:8], ex))
+                file.cd = time.time() + self.ar.cd
                 # handshake will fix it

             with self.mutex:

@@ -1121,6 +1134,7 @@ source file/folder selection uses rsync syntax, meaning that:
     ap.add_argument("-J", type=int, metavar="THREADS", default=hcores, help="num cpu-cores to use for hashing; set 0 or 1 for single-core hashing")
     ap.add_argument("-nh", action="store_true", help="disable hashing while uploading")
     ap.add_argument("-ns", action="store_true", help="no status panel (for slow consoles and macos)")
+    ap.add_argument("--cd", type=float, metavar="SEC", default=5, help="delay before reattempting a failed handshake/upload")
     ap.add_argument("--safe", action="store_true", help="use simple fallback approach")
     ap.add_argument("-z", action="store_true", help="ZOOMIN' (skip uploading files if they exist at the destination with the ~same last-modified timestamp, so same as yolo / turbo with date-chk but even faster)")

@@ -1187,6 +1201,9 @@ source file/folder selection uses rsync syntax, meaning that:
         ar.z = True
         ctl = Ctl(ar, ctl.stats)

+    if ctl.errs:
+        print("WARNING: %d errors" % (ctl.errs))
+
     sys.exit(0 if ctl.ok else 1)

View File

@@ -13,7 +13,7 @@
 # on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
 upstream cpp {
-    server 127.0.0.1:3923;
+    server 127.0.0.1:3923 fail_timeout=1s;
    keepalive 1;
 }
 server {
server { server {

View File

@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.9.14"
+pkgver="1.9.17"
 pkgrel=1
 pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, zeroconf, media indexer, thumbnails++"
 arch=("any")

@@ -9,6 +9,7 @@ license=('MIT')
 depends=("python" "lsof" "python-jinja")
 makedepends=("python-wheel" "python-setuptools" "python-build" "python-installer" "make" "pigz")
 optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tags"
+            "cfssl: generate TLS certificates on startup (pointless when reverse-proxied)"
             "python-mutagen: music tags (alternative)"
             "python-pillow: thumbnails for images"
             "python-pyvips: thumbnails for images (higher quality, faster, uses more ram)"

@@ -20,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
 )
 source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("96867ea1bcaf622e5dc29ee3224ffa8ea80218d3a146e7a10d04c12255bae00f")
+sha256sums=("73d66a9ff21caf45d8093829ba7de5b161fcd595ff91f8674795f426db86644c")

 build() {
     cd "${srcdir}/${pkgname}-${pkgver}"

View File

@@ -3,6 +3,9 @@
   # use argon2id-hashed passwords in config files (sha2 is always available)
   withHashedPasswords ? true,

+  # generate TLS certificates on startup (pointless when reverse-proxied)
+  withCertgen ? false,
+
   # create thumbnails with Pillow; faster than FFmpeg / MediaProcessing
   withThumbnails ? true,

@@ -34,6 +37,7 @@ let
   ]
   ++ lib.optional withSMB impacket
   ++ lib.optional withFTPS pyopenssl
+  ++ lib.optional withCertgen cfssl
   ++ lib.optional withThumbnails pillow
   ++ lib.optional withFastThumbnails pyvips
   ++ lib.optional withMediaProcessing ffmpeg

View File

@@ -1,5 +1,5 @@
 {
-  "url": "https://github.com/9001/copyparty/releases/download/v1.9.14/copyparty-sfx.py",
-  "version": "1.9.14",
-  "hash": "sha256-H4hRi6Nn4jUouhvqLacFyr0odMQ+99crBXL3iNz7mXs="
+  "url": "https://github.com/9001/copyparty/releases/download/v1.9.17/copyparty-sfx.py",
+  "version": "1.9.17",
+  "hash": "sha256-YLl7hGWRDsFgxUvQ6hUbq+DWduhm2bs4FSZWs/AgvB0="
 }

View File

@@ -1014,6 +1014,7 @@ def add_stats(ap):
     ap2.add_argument("--stats", action="store_true", help="enable openmetrics at /.cpr/metrics for admin accounts")
     ap2.add_argument("--nos-hdd", action="store_true", help="disable disk-space metrics (used/free space)")
     ap2.add_argument("--nos-vol", action="store_true", help="disable volume size metrics (num files, total bytes, vmaxb/vmaxn)")
+    ap2.add_argument("--nos-vst", action="store_true", help="disable volume state metrics (indexing, analyzing, activity)")
     ap2.add_argument("--nos-dup", action="store_true", help="disable dupe-files metrics (good idea; very slow)")
     ap2.add_argument("--nos-unf", action="store_true", help="disable unfinished-uploads metrics")

@@ -1094,7 +1095,7 @@ def add_logging(ap):
     ap2.add_argument("--ansi", action="store_true", help="force colors; overrides environment-variable NO_COLOR")
     ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
     ap2.add_argument("--log-tdec", metavar="N", type=int, default=3, help="timestamp resolution / number of timestamp decimals")
-    ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log passphrase of failed login attempts: 0=terse, 1=plaintext, 2=hashed")
+    ap2.add_argument("--log-badpwd", metavar="N", type=int, default=1, help="log failed login attempt passwords: 0=terse, 1=plaintext, 2=hashed")
     ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
     ap2.add_argument("--log-htp", action="store_true", help="debug: print http-server threadpool scaling")
     ap2.add_argument("--ihead", metavar="HEADER", type=u, action='append', help="dump incoming header")

@@ -1314,7 +1315,7 @@ def run_argparse(
     for k, h, t in sects:
         k2 = "help_" + k.replace("-", "_")
         if vars(ret)[k2]:
-            lprint("# {} help page".format(k))
+            lprint("# %s help page (%s)" % (k, h))
             lprint(t + "\033[0m")
             sys.exit(0)

View File

@@ -1,8 +1,8 @@
 # coding: utf-8

-VERSION = (1, 9, 15)
+VERSION = (1, 9, 18)
 CODENAME = "prometheable"
-BUILD_DT = (2023, 10, 24)
+BUILD_DT = (2023, 11, 18)

 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -21,9 +21,9 @@ from .util import (
     META_NOBOTS,
     SQLITE_VER,
     UNPLICATIONS,
+    UTC,
     ODict,
     Pebkac,
-    UTC,
     absreal,
     afsenc,
     get_df,

@@ -476,12 +476,10 @@ class VFS(object):
         err: int = 403,
     ) -> tuple["VFS", str]:
         """returns [vfsnode,fs_remainder] if user has the requested permissions"""
-        if ANYWIN:
-            mod = relchk(vpath)
-            if mod:
-                if self.log:
-                    self.log("vfs", "invalid relpath [{}]".format(vpath))
-                raise Pebkac(404)
+        if relchk(vpath):
+            if self.log:
+                self.log("vfs", "invalid relpath [{}]".format(vpath))
+            raise Pebkac(422)

         cvpath = undot(vpath)
         vn, rem = self._find(cvpath)

@@ -500,8 +498,8 @@ class VFS(object):
                 t = "{} has no {} in [{}] => [{}] => [{}]"
                 self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)

-            t = "you don't have {}-access for this location"
-            raise Pebkac(err, t.format(msg))
+            t = 'you don\'t have %s-access in "/%s"'
+            raise Pebkac(err, t % (msg, cvpath))

         return vn, rem

@@ -1723,6 +1721,9 @@ class AuthSrv(object):
     def setup_pwhash(self, acct: dict[str, str]) -> None:
         self.ah = PWHash(self.args)
         if not self.ah.on:
+            if self.args.ah_cli or self.args.ah_gen:
+                t = "\n  BAD CONFIG:\n    cannot --ah-cli or --ah-gen without --ah-alg"
+                raise Exception(t)
             return

         if self.args.ah_cli:

View File

@@ -132,7 +132,10 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
     try:
         expiry, inf = _read_crt(args, "srv.pem")
-        expired = time.time() + args.crt_sdays * 60 * 60 * 24 * 0.1 > expiry
+        if "sans" not in inf:
+            raise Exception("no useable cert found")
+
+        expired = time.time() + args.crt_sdays * 60 * 60 * 24 * 0.5 > expiry
         cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
         for n in names:
             if n not in inf["sans"]:
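The threshold bump from 0.1 to 0.5 means a certificate now counts as expiring once less than half of its configured lifetime (`--crt-sdays`) remains, instead of only the last 10%, so it gets regenerated much earlier. The arithmetic, restated as a sketch; `crt_sdays` and the timestamps below are made-up values:

```python
import time

def needs_renewal(expiry, crt_sdays, frac=0.5, now=None):
    """True if the cert expires within frac * crt_sdays days from now;
    the diff above bumped frac from 0.1 to 0.5"""
    now = time.time() if now is None else now
    return now + crt_sdays * 60 * 60 * 24 * frac > expiry

day = 86400
now = 1_700_000_000
# cert valid for 100 more days, configured lifetime 365 days:
print(needs_renewal(now + 100 * day, 365, 0.5, now))  # True  (100 < 182.5)
print(needs_renewal(now + 100 * day, 365, 0.1, now))  # False (100 > 36.5)
```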

View File

@@ -11,11 +11,6 @@ import time

 from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E

-try:
-    import asynchat
-except:
-    sys.path.append(os.path.join(E.mod, "vend"))
-
 from pyftpdlib.authorizers import AuthenticationFailed, DummyAuthorizer
 from pyftpdlib.filesystems import AbstractedFS, FilesystemError
 from pyftpdlib.handlers import FTPHandler

@@ -92,6 +87,12 @@ class FtpAuth(DummyAuthorizer):
             if bonk:
                 logging.warning("client banned: invalid passwords")
                 bans[ip] = bonk
+                try:
+                    # only possible if multiprocessing disabled
+                    self.hub.broker.httpsrv.bans[ip] = bonk
+                    self.hub.broker.httpsrv.nban += 1
+                except:
+                    pass

         raise AuthenticationFailed("Authentication failed.")
@@ -148,7 +149,7 @@ class FtpFs(AbstractedFS):
         try:
             vpath = vpath.replace("\\", "/").strip("/")
             rd, fn = os.path.split(vpath)
-            if ANYWIN and relchk(rd):
+            if relchk(rd):
                 logging.warning("malicious vpath: %s", vpath)
                 t = "Unsupported characters in [{}]"
                 raise FSE(t.format(vpath), 1)
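Dropping the `ANYWIN` guard here (and in `authsrv`/`httpcli` above) means the path validation now runs on every platform instead of only Windows. `relchk` is copyparty's own helper; a hypothetical stand-in for the kind of check it performs on a client-supplied virtual path could look like:

```python
def is_bad_relpath(vpath):
    """hypothetical stand-in for copyparty's relchk: reject NUL bytes,
    backslashes, and path-traversal segments in a virtual path"""
    if "\x00" in vpath or "\\" in vpath:
        return True
    # any ".." segment could escape the volume root
    return any(seg == ".." for seg in vpath.split("/"))

print(is_bad_relpath("music/album/track.flac"))  # False
print(is_bad_relpath("../../etc/passwd"))        # True
```

Rejecting these early, before the path ever reaches the filesystem layer, is what lets the server answer with a generic 422/404 instead of leaking anything about the real directory layout.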

View File

@@ -39,10 +39,11 @@ from .szip import StreamZip
 from .util import (
     HTTPCODE,
     META_NOBOTS,
+    UTC,
+    Garda,
     MultipartParser,
     ODict,
     Pebkac,
-    UTC,
     UnrecvEOF,
     absreal,
     alltrace,

@@ -75,6 +76,7 @@ from .util import (
     runhook,
     s3enc,
     sanitize_fn,
+    sanitize_vpath,
     sendfile_kern,
     sendfile_py,
     undot,
@@ -146,6 +148,7 @@ class HttpCli(object):
         self.rem = " "
         self.vpath = " "
         self.vpaths = " "
+        self.gctx = " "  # additional context for garda
         self.trailing_slash = True
         self.uname = " "
         self.pw = " "
@@ -254,8 +257,8 @@ class HttpCli(object):
                     k, zs = header_line.split(":", 1)
                     self.headers[k.lower()] = zs.strip()
             except:
-                msg = " ]\n#[ ".join(headerlines)
-                raise Pebkac(400, "bad headers:\n#[ " + msg + " ]")
+                msg = "#[ " + " ]\n#[ ".join(headerlines) + " ]"
+                raise Pebkac(400, "bad headers", log=msg)

         except Pebkac as ex:
             self.mode = "GET"

@@ -268,8 +271,14 @@ class HttpCli(object):
                 self.loud_reply(unicode(ex), status=ex.code, headers=h, volsan=True)
             except:
                 pass

+            if ex.log:
+                self.log("additional error context:\n" + ex.log, 6)
+
             return False

+        self.conn.hsrv.nreq += 1
+
         self.ua = self.headers.get("user-agent", "")
         self.is_rclone = self.ua.startswith("rclone/")
@@ -411,12 +420,9 @@ class HttpCli(object):
             self.vpath + "/" if self.trailing_slash and self.vpath else self.vpath
         )

-        ok = "\x00" not in self.vpath
-        if ANYWIN:
-            ok = ok and not relchk(self.vpath)
-
-        if not ok and (self.vpath != "*" or self.mode != "OPTIONS"):
+        if relchk(self.vpath) and (self.vpath != "*" or self.mode != "OPTIONS"):
             self.log("invalid relpath [{}]".format(self.vpath))
+            self.cbonk(self.conn.hsrv.g422, self.vpath, "bad_vp", "invalid relpaths")
             return self.tx_404() and self.keepalive

         zso = self.headers.get("authorization")
@@ -549,6 +555,9 @@ class HttpCli(object):
             zb = b"<pre>" + html_escape(msg).encode("utf-8", "replace")
             h = {"WWW-Authenticate": 'Basic realm="a"'} if pex.code == 401 else {}
             self.reply(zb, status=pex.code, headers=h, volsan=True)
+            if pex.log:
+                self.log("additional error context:\n" + pex.log, 6)
+
             return self.keepalive
         except Pebkac:
             return False

@@ -559,6 +568,36 @@ class HttpCli(object):
         else:
             return self.conn.iphash.s(self.ip)

+    def cbonk(self, g: Garda, v: str, reason: str, descr: str) -> bool:
+        self.conn.hsrv.nsus += 1
+        if not g.lim:
+            return False
+
+        bonk, ip = g.bonk(self.ip, v + self.gctx)
+        if not bonk:
+            return False
+
+        xban = self.vn.flags.get("xban")
+        if not xban or not runhook(
+            self.log,
+            xban,
+            self.vn.canonical(self.rem),
+            self.vpath,
+            self.host,
+            self.uname,
+            time.time(),
+            0,
+            self.ip,
+            time.time(),
+            reason,
+        ):
+            self.log("client banned: %s" % (descr,), 1)
+            self.conn.hsrv.bans[ip] = bonk
+            self.conn.hsrv.nban += 1
+            return True
+
+        return False
+
     def is_banned(self) -> bool:
         if not self.conn.bans:
             return False
@@ -678,24 +717,7 @@ class HttpCli(object):
             or not self.args.nonsus_urls
             or not self.args.nonsus_urls.search(self.vpath)
         ):
-            bonk, ip = g.bonk(self.ip, self.vpath)
-            if bonk:
-                xban = self.vn.flags.get("xban")
-                if not xban or not runhook(
-                    self.log,
-                    xban,
-                    self.vn.canonical(self.rem),
-                    self.vpath,
-                    self.host,
-                    self.uname,
-                    time.time(),
-                    0,
-                    self.ip,
-                    time.time(),
-                    str(status),
-                ):
-                    self.log("client banned: %ss" % (status,), 1)
-                    self.conn.hsrv.bans[ip] = bonk
+            self.cbonk(g, self.vpath, str(status), "%ss" % (status,))

         if volsan:
             vols = list(self.asrv.vfs.all_vols.values())
@@ -2121,8 +2143,10 @@ class HttpCli(object):
         return True

     def get_pwd_cookie(self, pwd: str) -> str:
-        if self.asrv.ah.hash(pwd) in self.asrv.iacct:
-            msg = "login ok"
+        hpwd = self.asrv.ah.hash(pwd)
+        uname = self.asrv.iacct.get(hpwd)
+        if uname:
+            msg = "hi " + uname
             dur = int(60 * 60 * self.args.logout)
         else:
             logpwd = pwd

@@ -2133,27 +2157,7 @@ class HttpCli(object):
                 logpwd = "%" + base64.b64encode(zb[:12]).decode("utf-8")

             self.log("invalid password: {}".format(logpwd), 3)
-
-            g = self.conn.hsrv.gpwd
-            if g.lim:
-                bonk, ip = g.bonk(self.ip, pwd)
-                if bonk:
-                    xban = self.vn.flags.get("xban")
-                    if not xban or not runhook(
-                        self.log,
-                        xban,
-                        self.vn.canonical(self.rem),
-                        self.vpath,
-                        self.host,
-                        self.uname,
-                        time.time(),
-                        0,
-                        self.ip,
-                        time.time(),
-                        "pw",
-                    ):
-                        self.log("client banned: invalid passwords", 1)
-                        self.conn.hsrv.bans[ip] = bonk
+            self.cbonk(self.conn.hsrv.gpwd, pwd, "pw", "invalid passwords")

             msg = "naw dude"
             pwd = "x"  # nosec
@@ -2177,26 +2181,30 @@ class HttpCli(object):
new_dir = self.parser.require("name", 512) new_dir = self.parser.require("name", 512)
self.parser.drop() self.parser.drop()
sanitized = sanitize_fn(new_dir, "", []) return self._mkdir(vjoin(self.vpath, new_dir))
return self._mkdir(vjoin(self.vpath, sanitized))
def _mkdir(self, vpath: str, dav: bool = False) -> bool: def _mkdir(self, vpath: str, dav: bool = False) -> bool:
nullwrite = self.args.nw nullwrite = self.args.nw
self.gctx = vpath
vpath = undot(vpath)
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True) vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
self._assert_safe_rem(rem) rem = sanitize_vpath(rem, "/", [])
fn = vfs.canonical(rem) fn = vfs.canonical(rem)
if not fn.startswith(vfs.realpath):
self.log("invalid mkdir [%s] [%s]" % (self.gctx, vpath), 1)
raise Pebkac(422)
if not nullwrite: if not nullwrite:
fdir = os.path.dirname(fn) fdir = os.path.dirname(fn)
if not bos.path.isdir(fdir): if dav and not bos.path.isdir(fdir):
raise Pebkac(409, "parent folder does not exist") raise Pebkac(409, "parent folder does not exist")
if bos.path.isdir(fn): if bos.path.isdir(fn):
raise Pebkac(405, "that folder exists already") raise Pebkac(405, 'folder "/%s" already exists' % (vpath,))
try: try:
bos.mkdir(fn) bos.makedirs(fn)
except OSError as ex: except OSError as ex:
if ex.errno == errno.EACCES: if ex.errno == errno.EACCES:
raise Pebkac(500, "the server OS denied write-access") raise Pebkac(500, "the server OS denied write-access")
@@ -2205,7 +2213,7 @@ class HttpCli(object):
except: except:
raise Pebkac(500, min_ex()) raise Pebkac(500, min_ex())
self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1]) self.out_headers["X-New-Dir"] = quotep(vpath)
if dav: if dav:
self.reply(b"", 201) self.reply(b"", 201)
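The `_mkdir` rework above swaps `bos.mkdir` for `bos.makedirs` (so missing intermediate folders are created too) and guards against path traversal by checking the resolved path against the volume root. A standalone sketch of those two ideas using only the stdlib; `safe_mkdir` and its parameters are hypothetical names, not copyparty API:

```python
import os

def safe_mkdir(realpath: str, vpath: str) -> str:
    # normalize the requested virtual path, dropping empty/"."/".." segments
    parts = [p for p in vpath.replace("\\", "/").split("/") if p not in ("", ".", "..")]
    fn = os.path.join(realpath, *parts)

    # belt-and-braces: refuse anything that still escapes the volume root,
    # mirroring the fn.startswith(vfs.realpath) check in the diff
    if not os.path.abspath(fn).startswith(os.path.abspath(realpath)):
        raise ValueError("path escapes volume root")

    os.makedirs(fn, exist_ok=True)  # creates missing intermediates, like bos.makedirs
    return fn
```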
@@ -128,6 +128,9 @@ class HttpSrv(object):
         self.u2fh = FHC()
         self.metrics = Metrics(self)
+        self.nreq = 0
+        self.nsus = 0
+        self.nban = 0
         self.srvs: list[socket.socket] = []
         self.ncli = 0  # exact
         self.clients: set[HttpConn] = set()  # laggy
@@ -34,14 +34,23 @@ class Metrics(object):
         ret: list[str] = []

-        def addc(k: str, unit: str, v: str, desc: str) -> None:
-            if unit:
-                k += "_" + unit
-                zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
-                ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))
-            else:
-                zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
-                ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
+        def addc(k: str, v: str, desc: str) -> None:
+            zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
+            ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
+
+        def adduc(k: str, unit: str, v: str, desc: str) -> None:
+            k += "_" + unit
+            zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
+            ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))
+
+        def addg(k: str, v: str, desc: str) -> None:
+            zs = "# TYPE %s gauge\n# HELP %s %s\n%s %s"
+            ret.append(zs % (k, k, desc, k, v))
+
+        def addug(k: str, unit: str, v: str, desc: str) -> None:
+            k += "_" + unit
+            zs = "# TYPE %s gauge\n# UNIT %s %s\n# HELP %s %s\n%s %s"
+            ret.append(zs % (k, k, unit, k, desc, k, v))

         def addh(k: str, typ: str, desc: str) -> None:
             zs = "# TYPE %s %s\n# HELP %s %s"

@@ -54,17 +63,75 @@ class Metrics(object):
         def addv(k: str, v: str) -> None:
             ret.append("%s %s" % (k, v))

+        t = "time since last copyparty restart"
         v = "{:.3f}".format(time.time() - self.hsrv.t0)
-        addc("cpp_uptime", "seconds", v, "time since last server restart")
+        addug("cpp_uptime", "seconds", v, t)
+
+        # timestamps are gauges because initial value is not zero
+        t = "unixtime of last copyparty restart"
+        v = "{:.3f}".format(self.hsrv.t0)
+        addug("cpp_boot_unixtime", "seconds", v, t)
+
+        t = "number of open http(s) client connections"
+        addg("cpp_http_conns", str(self.hsrv.ncli), t)
+
+        t = "number of http(s) requests since last restart"
+        addc("cpp_http_reqs", str(self.hsrv.nreq), t)
+
+        t = "number of 403/422/malicious reqs since restart"
+        addc("cpp_sus_reqs", str(self.hsrv.nsus), t)

         v = str(len(conn.bans or []))
-        addc("cpp_bans", "", v, "number of banned IPs")
+        addg("cpp_active_bans", v, "number of currently banned IPs")
+
+        t = "number of IPs banned since last restart"
+        addg("cpp_total_bans", str(self.hsrv.nban), t)
+
+        if not args.nos_vst:
+            x = self.hsrv.broker.ask("up2k.get_state")
+            vs = json.loads(x.get())
+
+            nvidle = 0
+            nvbusy = 0
+            nvoffline = 0
+            for v in vs["volstate"].values():
+                if v == "online, idle":
+                    nvidle += 1
+                elif "OFFLINE" in v:
+                    nvoffline += 1
+                else:
+                    nvbusy += 1
+
+            addg("cpp_idle_vols", str(nvidle), "number of idle/ready volumes")
+            addg("cpp_busy_vols", str(nvbusy), "number of busy/indexing volumes")
+            addg("cpp_offline_vols", str(nvoffline), "number of offline volumes")
+
+            t = "time since last database activity (upload/rename/delete)"
+            addug("cpp_db_idle", "seconds", str(vs["dbwt"]), t)
+
+            t = "unixtime of last database activity (upload/rename/delete)"
+            addug("cpp_db_act", "seconds", str(vs["dbwu"]), t)
+
+            t = "number of files queued for hashing/indexing"
+            addg("cpp_hashing_files", str(vs["hashq"]), t)
+
+            t = "number of files queued for metadata scanning"
+            addg("cpp_tagq_files", str(vs["tagq"]), t)
+
+            try:
+                t = "number of files queued for plugin-based analysis"
+                addg("cpp_mtpq_files", str(int(vs["mtpq"])), t)
+            except:
+                pass

         if not args.nos_hdd:
             addbh("cpp_disk_size_bytes", "total HDD size of volume")
             addbh("cpp_disk_free_bytes", "free HDD space in volume")
             for vpath, vol in allvols:
                 free, total = get_df(vol.realpath)
+                if free is None or total is None:
+                    continue
+
                 addv('cpp_disk_size_bytes{vol="/%s"}' % (vpath), str(total))
                 addv('cpp_disk_free_bytes{vol="/%s"}' % (vpath), str(free))

@@ -161,5 +228,6 @@ class Metrics(object):
         ret.append("# EOF")

         mime = "application/openmetrics-text; version=1.0.0; charset=utf-8"
+        mime = cli.uparam.get("mime") or mime
         cli.reply("\n".join(ret).encode("utf-8"), mime=mime)
         return True
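The `addc`/`addg` split above follows the OpenMetrics exposition format: counters carry `_created` and `_total` samples, while gauges emit a single bare sample with no suffix. A minimal standalone approximation of the text these helpers produce (not the actual `Metrics` class; `T0` stands in for `self.hsrv.t0`):

```python
import time

T0 = int(time.time())  # stand-in for the server boot timestamp

def addc(k: str, v: str, desc: str) -> str:
    # counter without a unit: TYPE/HELP metadata plus _created and _total samples
    return "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s" % (
        k, k, desc, k, T0, k, v)

def addg(k: str, v: str, desc: str) -> str:
    # gauge: no _created/_total suffixes, just the bare sample
    return "# TYPE %s gauge\n# HELP %s %s\n%s %s" % (k, k, desc, k, v)

print(addg("cpp_http_conns", "3", "number of open http(s) client connections"))
```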
@@ -136,8 +136,12 @@ class PWHash(object):
         import getpass

         while True:
-            p1 = getpass.getpass("password> ")
-            p2 = getpass.getpass("again or just hit ENTER> ")
+            try:
+                p1 = getpass.getpass("password> ")
+                p2 = getpass.getpass("again or just hit ENTER> ")
+            except EOFError:
+                return
+
             if p2 and p1 != p2:
                 print("\033[31minputs don't match; try again\033[0m", file=sys.stderr)
                 continue
@@ -36,17 +36,17 @@ from .tcpsrv import TcpSrv
 from .th_srv import HAVE_PIL, HAVE_VIPS, HAVE_WEBP, ThumbSrv
 from .up2k import Up2k
 from .util import (
-    FFMPEG_URL,
-    VERSIONS,
-    Daemon,
     DEF_EXP,
     DEF_MTE,
     DEF_MTH,
+    FFMPEG_URL,
+    UTC,
+    VERSIONS,
+    Daemon,
     Garda,
     HLog,
     HMaccas,
     ODict,
-    UTC,
     alltrace,
     ansi_re,
     min_ex,
@@ -65,6 +65,11 @@ from .util import (
     w8b64enc,
 )

+try:
+    from pathlib import Path
+except:
+    pass
+
 if HAVE_SQLITE3:
     import sqlite3

@@ -261,6 +266,7 @@ class Up2k(object):
                 "hashq": self.n_hashq,
                 "tagq": self.n_tagq,
                 "mtpq": mtpq,
+                "dbwu": "{:.2f}".format(self.db_act),
                 "dbwt": "{:.2f}".format(
                     min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
                 ),

@@ -789,6 +795,11 @@ class Up2k(object):
             except:
                 return None

+        vpath = "?"
+        for k, v in self.asrv.vfs.all_vols.items():
+            if v.realpath == ptop:
+                vpath = k
+
         _, flags = self._expr_idx_filter(flags)

         ft = "\033[0;32m{}{:.0}"

@@ -824,17 +835,9 @@ class Up2k(object):
             a = ["\033[90mall-default"]

         if a:
-            vpath = "?"
-            for k, v in self.asrv.vfs.all_vols.items():
-                if v.realpath == ptop:
-                    vpath = k
-
-            if vpath:
-                vpath += "/"
-
             zs = " ".join(sorted(a))
             zs = zs.replace("90mre.compile(", "90m(")  # nohash
-            self.log("/{} {}".format(vpath, zs), "35")
+            self.log("/{} {}".format(vpath + ("/" if vpath else ""), zs), "35")

         reg = {}
         drp = None

@@ -884,9 +887,6 @@ class Up2k(object):
         try:
             cur = self._open_db(db_path)
-            self.cur[ptop] = cur
-            self.volsize[cur] = 0
-            self.volnfiles[cur] = 0

             # speeds measured uploading 520 small files on a WD20SPZX (SMR 2.5" 5400rpm 4kb)
             dbd = flags["dbd"]

@@ -920,6 +920,13 @@ class Up2k(object):
             cur.execute("pragma synchronous=" + sync)
             cur.connection.commit()

+            self._verify_db_cache(cur, vpath)
+
+            self.cur[ptop] = cur
+            self.volsize[cur] = 0
+            self.volnfiles[cur] = 0
+
             return cur, db_path
         except:
             msg = "cannot use database at [{}]:\n{}"

@@ -927,6 +934,25 @@ class Up2k(object):
         return None

+    def _verify_db_cache(self, cur: "sqlite3.Cursor", vpath: str) -> None:
+        # check if volume config changed since last use; drop caches if so
+        zsl = [vpath] + list(sorted(self.asrv.vfs.all_vols.keys()))
+        zb = hashlib.sha1("\n".join(zsl).encode("utf-8", "replace")).digest()
+        vcfg = base64.urlsafe_b64encode(zb[:18]).decode("ascii")
+
+        c = cur.execute("select v from kv where k = 'volcfg'")
+        try:
+            (oldcfg,) = c.fetchone()
+        except:
+            oldcfg = ""
+
+        if oldcfg != vcfg:
+            cur.execute("delete from kv where k = 'volcfg'")
+            cur.execute("delete from dh")
+            cur.execute("delete from cv")
+            cur.execute("insert into kv values ('volcfg',?)", (vcfg,))
+            cur.connection.commit()
+
     def _build_file_index(self, vol: VFS, all_vols: list[VFS]) -> tuple[bool, bool]:
         do_vac = False
         top = vol.realpath

@@ -2723,7 +2749,18 @@ class Up2k(object):
                     raise Exception("symlink-fallback disabled in cfg")

                 if not linked:
-                    os.symlink(fsenc(lsrc), fsenc(ldst))
+                    if ANYWIN:
+                        Path(ldst).symlink_to(lsrc)
+                        if not bos.path.exists(dst):
+                            try:
+                                bos.unlink(dst)
+                            except:
+                                pass
+                            t = "the created symlink [%s] did not resolve to [%s]"
+                            raise Exception(t % (ldst, lsrc))
+                    else:
+                        os.symlink(fsenc(lsrc), fsenc(ldst))
                     linked = True
             except Exception as ex:
                 self.log("cannot link; creating copy: " + repr(ex))

@@ -3904,10 +3941,21 @@ class Up2k(object):
                 self.n_hashq -= 1
             # self.log("hashq {}".format(self.n_hashq))

-            ptop, vtop, rd, fn, ip, at, usr, skip_xau = self.hashq.get()
+            task = self.hashq.get()
+            if len(task) != 8:
+                raise Exception("invalid hash task")
+
+            try:
+                if not self._hash_t(task):
+                    return
+            except Exception as ex:
+                self.log("failed to hash %s: %s" % (task, ex), 1)
+
+    def _hash_t(self, task: tuple[str, str, str, str, str, float, str, bool]) -> bool:
+        ptop, vtop, rd, fn, ip, at, usr, skip_xau = task
         # self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
         if "e2d" not in self.flags[ptop]:
-            continue
+            return True

         abspath = djoin(ptop, rd, fn)
         self.log("hashing " + abspath)

@@ -3919,7 +3967,7 @@ class Up2k(object):
         else:
             hashes = self._hashlist_from_file(abspath)

         if not hashes:
-            return
+            return False

         wark = up2k_wark_from_hashlist(self.salt, inf.st_size, hashes)

@@ -3944,6 +3992,8 @@ class Up2k(object):
             with self.rescan_cond:
                 self.rescan_cond.notify_all()

+        return True
+
     def hash_file(
         self,
@@ -1563,8 +1563,8 @@ def read_header(sr: Unrecv, t_idle: int, t_tot: int) -> list[str]:
                 raise Pebkac(
                     400,
-                    "protocol error while reading headers:\n"
-                    + ret.decode("utf-8", "replace"),
+                    "protocol error while reading headers",
+                    log=ret.decode("utf-8", "replace"),
                 )

             ofs = ret.find(b"\r\n\r\n")

@@ -1773,7 +1773,16 @@ def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
     return fn.strip()

+def sanitize_vpath(vp: str, ok: str, bad: list[str]) -> str:
+    parts = vp.replace(os.sep, "/").split("/")
+    ret = [sanitize_fn(x, ok, bad) for x in parts]
+    return "/".join(ret)
+
 def relchk(rp: str) -> str:
+    if "\x00" in rp:
+        return "[nul]"
+
     if ANYWIN:
         if "\n" in rp or "\r" in rp:
             return "x\nx"

@@ -2976,9 +2985,12 @@ def hidedir(dp) -> None:
 class Pebkac(Exception):
-    def __init__(self, code: int, msg: Optional[str] = None) -> None:
+    def __init__(
+        self, code: int, msg: Optional[str] = None, log: Optional[str] = None
+    ) -> None:
         super(Pebkac, self).__init__(msg or HTTPCODE[code])
         self.code = code
+        self.log = log

     def __repr__(self) -> str:
         return "Pebkac({}, {})".format(self.code, repr(self.args))
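The new `sanitize_vpath` above cleans each path segment individually while preserving `/` as the separator. A self-contained sketch of that behavior; the `sanitize_fn` below is a simplified stand-in (the real copyparty one also handles a bad-names list and more platform quirks):

```python
import os

def sanitize_fn(fn: str, ok: str, bad: list) -> str:
    # simplified stand-in: strip characters that are illegal in
    # windows filenames, except those explicitly whitelisted in `ok`
    # (`bad` is unused here; the real function rejects reserved names)
    for c in '<>:"\\|?*':
        if c not in ok:
            fn = fn.replace(c, "")
    return fn.strip()

def sanitize_vpath(vp: str, ok: str, bad: list) -> str:
    # sanitize each segment on its own, keeping the "/" separators
    parts = vp.replace(os.sep, "/").split("/")
    return "/".join(sanitize_fn(x, ok, bad) for x in parts)
```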
@@ -1891,6 +1891,10 @@ html.y #doc {
 	text-align: center;
 	padding: .5em;
 }
+#docul li.bn span {
+	font-weight: bold;
+	color: var(--fg-max);
+}
 #doc.prism {
 	padding-left: 3em;
 }
@@ -3797,7 +3797,7 @@ var fileman = (function () {
 	function rename_cb() {
 		if (this.status !== 201) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
 			toast.err(9, L.fr_efail + msg);
 			return;
 		}

@@ -3846,7 +3846,7 @@ var fileman = (function () {
 	}
 	function delete_cb() {
 		if (this.status !== 200) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
 			toast.err(9, L.fd_err + msg);
 			return;
 		}

@@ -3967,7 +3967,7 @@ var fileman = (function () {
 	}
 	function paste_cb() {
 		if (this.status !== 201) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
 			toast.err(9, L.fp_err + msg);
 			return;
 		}

@@ -4300,7 +4300,7 @@ var showfile = (function () {
 	};
 	r.mktree = function () {
-		var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('') + '</li>'];
+		var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('<span>/</span>') + '</li>'];
 		for (var a = 0; a < r.files.length; a++) {
 			var file = r.files[a];
 			html.push('<li><a href="?doc=' +

@@ -4505,12 +4505,13 @@ var thegrid = (function () {
 			aplay = ebi('a' + oth.getAttribute('id')),
 			is_img = /\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp|webm|mkv|mp4)(\?|$)/i.test(href),
 			is_dir = href.endsWith('/'),
+			is_srch = !!ebi('unsearch'),
 			in_tree = is_dir && treectl.find(oth.textContent.slice(0, -1)),
 			have_sel = QS('#files tr.sel'),
 			td = oth.closest('td').nextSibling,
 			tr = td.parentNode;

-		if ((r.sel && !dbl && !ctrl(e)) || (treectl.csel && (e.shiftKey || ctrl(e)))) {
+		if (!is_srch && ((r.sel && !dbl && !ctrl(e)) || (treectl.csel && (e.shiftKey || ctrl(e))))) {
 			td.onclick.call(td, e);
 			if (e.shiftKey)
 				return r.loadsel();

@@ -4647,7 +4648,7 @@ var thegrid = (function () {
 			if (r.full)
 				ihref += 'f'
 			if (href == "#")
-				ihref = SR + '/.cpr/ico/⏏️';
+				ihref = SR + '/.cpr/ico/' + (ref == 'moar' ? '++' : 'exit');
 		}
 		else if (isdir) {
 			ihref = SR + '/.cpr/ico/folder';

@@ -5300,10 +5301,7 @@ document.onkeydown = function (e) {
 	function xhr_search_results() {
 		if (this.status !== 200) {
-			var msg = this.responseText;
-			if (msg.indexOf('<pre>') === 0)
-				msg = msg.slice(5);
-
+			var msg = unpre(this.responseText);
 			srch_msg(true, "http " + this.status + ": " + msg);
 			search_in_progress = 0;
 			return;

@@ -5342,7 +5340,7 @@ document.onkeydown = function (e) {
 		if (ext.length > 8)
 			ext = '%';

-		var links = linksplit(r.rp + '', id).join(''),
+		var links = linksplit(r.rp + '', id).join('<span>/</span>'),
 			nodes = ['<tr><td>-</td><td><div>' + links + '</div>', sz];

 		for (var b = 0; b < tagord.length; b++) {

@@ -7168,16 +7166,17 @@ var msel = (function () {
 	form.onsubmit = function (e) {
 		ev(e);
 		clmod(sf, 'vis', 1);
-		sf.textContent = 'creating "' + tb.value + '"...';
+		var dn = tb.value;
+		sf.textContent = 'creating "' + dn + '"...';

 		var fd = new FormData();
 		fd.append("act", "mkdir");
-		fd.append("name", tb.value);
+		fd.append("name", dn);

 		var xhr = new XHR();
 		xhr.vp = get_evpath();
-		xhr.dn = tb.value;
-		xhr.open('POST', xhr.vp, true);
+		xhr.dn = dn;
+		xhr.open('POST', dn.startsWith('/') ? (SR || '/') : xhr.vp, true);
 		xhr.onload = xhr.onerror = cb;
 		xhr.responseType = 'text';
 		xhr.send(fd);

@@ -7194,7 +7193,7 @@ var msel = (function () {
 		xhrchk(this, L.fd_xe1, L.fd_xe2);

 		if (this.status !== 201) {
-			sf.textContent = 'error: ' + this.responseText;
+			sf.textContent = 'error: ' + unpre(this.responseText);
 			return;
 		}

@@ -7203,8 +7202,9 @@ var msel = (function () {
 		sf.textContent = '';

 		var dn = this.getResponseHeader('X-New-Dir');
-		dn = dn || uricom_enc(this.dn);
-		treectl.goto(this.vp + dn + '/', true);
+		dn = dn ? '/' + dn + '/' : uricom_enc(this.dn);
+		treectl.goto(dn, true);
+		tree_scrollto();
 	}
 })();

@@ -7241,7 +7241,7 @@ var msel = (function () {
 		xhrchk(this, L.fsm_xe1, L.fsm_xe2);

 		if (this.status < 200 || this.status > 201) {
-			sf.textContent = 'error: ' + this.responseText;
+			sf.textContent = 'error: ' + unpre(this.responseText);
 			return;
 		}

@@ -7586,7 +7586,7 @@ var unpost = (function () {
 			'<tr><td><a me="' + me + '" class="n' + a + '" href="#">' + L.un_del + '</a></td>' +
 			'<td>' + unix2iso(res[a].at) + '</td>' +
 			'<td>' + res[a].sz + '</td>' +
-			'<td>' + linksplit(res[a].vp).join(' ') + '</td></tr>');
+			'<td>' + linksplit(res[a].vp).join('<span> / </span>') + '</td></tr>');
 		}
 		html.push("</tbody></table>");

@@ -7619,7 +7619,7 @@ var unpost = (function () {
 	function unpost_delete_cb() {
 		if (this.status !== 200) {
-			var msg = this.responseText;
+			var msg = unpre(this.responseText);
 			toast.err(9, L.un_derr + msg);
 			return;
 		}
@@ -10,6 +10,7 @@
 {{ html_head }}
 <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
 <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
+<style>ul{padding-left:1.3em}li{margin:.4em 0}</style>
 </head>

 <body>

@@ -48,9 +49,13 @@
 rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
 </pre>
+<ul>
 {% if s %}
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>--no-check-certificate</code> to the mount command</em><br />---</p>
+<li>running <code>rclone mount</code> on LAN (or just dont have valid certificates)? add <code>--no-check-certificate</code></li>
 {% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code>&nbsp;</code> (space)</li>
+</ul>

 <p>if you want to use the native WebDAV client in windows instead (slow and buggy), first run <a href="{{ r }}/.cpr/a/webdav-cfg.bat">webdav-cfg.bat</a> to remove the 47 MiB filesize limit (also fixes latency and password login), then connect:</p>
 <pre>

@@ -73,10 +78,13 @@
 rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
 </pre>
+<ul>
 {% if s %}
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>--no-check-certificate</code> to the mount command</em><br />---</p>
+<li>running <code>rclone mount</code> on LAN (or just dont have valid certificates)? add <code>--no-check-certificate</code></li>
 {% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code>&nbsp;</code> (space)</li>
+</ul>

 <p>or the emergency alternative (gnome/gui-only):</p>
 <!-- gnome-bug: ignores vp -->
 <pre>

@@ -123,8 +131,14 @@
 rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>W:</b>
 </pre>
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
 {% endif %}
+<ul>
+{% if args.ftps %}
+<li>running on LAN (or just dont have valid certificates)? add <code>no_check_certificate=true</code> to the config command</li>
+{% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code>&nbsp;</code> (space)</li>
+</ul>

 <p>if you want to use the native FTP client in windows instead (please dont), press <code>win+R</code> and run this command:</p>
 <pre>
 explorer {{ "ftp" if args.ftp else "ftps" }}://{% if accs %}<b>{{ pw }}</b>:k@{% endif %}{{ host }}:{{ args.ftp or args.ftps }}/{{ rvp }}

@@ -145,8 +159,14 @@
 rclone config create {{ aname }}-ftps ftp host={{ rip }} port={{ args.ftps }} pass=k user={% if accs %}<b>{{ pw }}</b>{% else %}anonymous{% endif %} tls=false explicit_tls=true
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-ftps:{{ rvp }} <b>mp</b>
 </pre>
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>no_check_certificate=true</code> to the config command</em><br />---</p>
 {% endif %}
+<ul>
+{% if args.ftps %}
+<li>running on LAN (or just dont have valid certificates)? add <code>no_check_certificate=true</code> to the config command</li>
+{% endif %}
+<li>running <code>rclone mount</code> as root? add <code>--allow-other</code></li>
+<li>old version of rclone? replace all <code>=</code> with <code>&nbsp;</code> (space)</li>
+</ul>

 <p>emergency alternative (gnome/gui-only):</p>
 <!-- gnome-bug: ignores vp -->
 <pre>

@@ -178,7 +198,7 @@
 partyfuse.py{% if accs %} -a <b>{{ pw }}</b>{% endif %} http{{ s }}://{{ ep }}/{{ rvp }} <b><span class="os win">W:</span><span class="os lin mac">mp</span></b>
 </pre>
 {% if s %}
-<p><em>note: if you are on LAN (or just dont have valid certificates), add <code>-td</code></em></p>
+<ul><li>if you are on LAN (or just dont have valid certificates), add <code>-td</code></li></ul>
 {% endif %}
 <p>
 you can use <a href="{{ r }}/.cpr/a/u2c.py">u2c.py</a> to upload (sometimes faster than web-browsers)
@@ -1,3 +1,18 @@
+:root {
+	--fg: #ccc;
+	--fg-max: #fff;
+	--bg-u2: #2b2b2b;
+	--bg-u5: #444;
+}
+html.y {
+	--fg: #222;
+	--fg-max: #000;
+	--bg-u2: #f7f7f7;
+	--bg-u5: #ccc;
+}
+html.bz {
+	--bg-u2: #202231;
+}
 @font-face {
 	font-family: 'scp';
 	font-display: swap;

@@ -14,6 +29,7 @@ html {
 	max-width: min(34em, 90%);
 	max-width: min(34em, calc(100% - 7em));
 	color: #ddd;
+	color: var(--fg);
 	background: #333;
 	background: var(--bg-u2);
 	border: 0 solid #777;

@@ -171,24 +187,15 @@ html {
 	color: #f6a;
 }
 html.y #tt {
-	color: #333;
-	background: #fff;
 	border-color: #888 #000 #777 #000;
 }
 html.bz #tt {
-	background: #202231;
 	border-color: #3b3f58;
 }
 html.y #tt,
 html.y #toast {
 	box-shadow: 0 .3em 1em rgba(0,0,0,0.4);
 }
-html.y #tt code {
-	color: #fff;
-	color: var(--fg-max);
-	background: #060;
-	background: var(--bg-u5);
-}
 #modalc code {
 	color: #060;
 	background: transparent;

@@ -326,6 +333,9 @@ html.y .btn:focus {
 	box-shadow: 0 .1em .2em #037 inset;
 	outline: #037 solid .1em;
 }
+input[type="submit"] {
+	cursor: pointer;
+}
 input[type="text"]:focus,
 input:not([type]):focus,
 textarea:focus {
@@ -1407,7 +1407,7 @@ function up2k_init(subtle) {
pvis.addfile([ pvis.addfile([
uc.fsearch ? esc(entry.name) : linksplit( uc.fsearch ? esc(entry.name) : linksplit(
entry.purl + uricom_enc(entry.name)).join(' '), entry.purl + uricom_enc(entry.name)).join(' / '),
'📐 ' + L.u_hashing, '📐 ' + L.u_hashing,
'' ''
], entry.size, draw_each); ], entry.size, draw_each);
@@ -2284,7 +2284,7 @@ function up2k_init(subtle) {
cdiff = (Math.abs(diff) <= 2) ? '3c0' : 'f0b', cdiff = (Math.abs(diff) <= 2) ? '3c0' : 'f0b',
sdiff = '<span style="color:#' + cdiff + '">diff ' + diff; sdiff = '<span style="color:#' + cdiff + '">diff ' + diff;
msg.push(linksplit(hit.rp).join('') + '<br /><small>' + tr + ' (srv), ' + tu + ' (You), ' + sdiff + '</small></span>'); msg.push(linksplit(hit.rp).join(' / ') + '<br /><small>' + tr + ' (srv), ' + tu + ' (You), ' + sdiff + '</small></span>');
} }
msg = msg.join('<br />\n'); msg = msg.join('<br />\n');
} }
@@ -2318,7 +2318,7 @@ function up2k_init(subtle) {
url += '?k=' + fk; url += '?k=' + fk;
} }
pvis.seth(t.n, 0, linksplit(url).join(' ')); pvis.seth(t.n, 0, linksplit(url).join(' / '));
} }
var chunksize = get_chunksize(t.size), var chunksize = get_chunksize(t.size),
@@ -2402,15 +2402,12 @@ function up2k_init(subtle) {
 pvis.seth(t.n, 2, L.u_ehstmp, t);
 var err = "",
-    rsp = (xhr.responseText + ''),
+    rsp = unpre(xhr.responseText),
     ofs = rsp.lastIndexOf('\nURL: ');
 if (ofs !== -1)
     rsp = rsp.slice(0, ofs);
-if (rsp.indexOf('<pre>') === 0)
-    rsp = rsp.slice(5);
 if (rsp.indexOf('rate-limit ') !== -1) {
     var penalty = rsp.replace(/.*rate-limit /, "").split(' ')[0];
     console.log("rate-limit: " + penalty);
@@ -2429,7 +2426,7 @@ function up2k_init(subtle) {
 err = rsp;
 ofs = err.indexOf('\n/');
 if (ofs !== -1) {
-    err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' ');
+    err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' / ');
 }
 if (!t.rechecks && (err_pend || err_srcb)) {
     t.rechecks = 0;
@@ -2536,7 +2533,7 @@ function up2k_init(subtle) {
 cdr = t.size;
 var orz = function (xhr) {
-    var txt = ((xhr.response && xhr.response.err) || xhr.responseText) + '';
+    var txt = unpre((xhr.response && xhr.response.err) || xhr.responseText);
     if (txt.indexOf('upload blocked by x') + 1) {
         apop(st.busy.upload, upt);
         apop(t.postlist, npart);


@@ -622,9 +622,8 @@ function linksplit(rp, id) {
 }
 var vlink = esc(uricom_dec(link));
-if (link.indexOf('/') !== -1) {
-    vlink = vlink.slice(0, -1) + '<span>/</span>';
-}
+if (link.indexOf('/') !== -1)
+    vlink = vlink.slice(0, -1);
 if (!rp) {
     if (q)
@@ -1357,6 +1356,11 @@ function lf2br(txt) {
 }
+function unpre(txt) {
+    return ('' + txt).replace(/^<pre>/, '');
+}
 var toast = (function () {
     var r = {},
         te = null,


@@ -1,3 +1,74 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1111-1738 `v1.9.17` 11-11
## new features
* `u2c.py` / `u2c.exe` (the commandline uploader):
* `-x` is now case-insensitive
* if a file fails to upload after 30 attempts, give up (bitflips)
* add 5 sec delay before reattempts (configurable with `--cd`)
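in rough pseudo-form, the new retry behavior (a fixed delay between attempts, giving up after enough consecutive failures) looks something like this; the function and parameter names are illustrative, not u2c's actual internals:

```python
import time

def upload_with_retries(do_upload, max_tries=30, delay=5.0):
    """retry an upload, sleeping `delay` sec between attempts;
    give up after `max_tries` failures (e.g. persistent bitflips)"""
    for attempt in range(1, max_tries + 1):
        try:
            return do_upload()
        except Exception as ex:
            if attempt == max_tries:
                raise RuntimeError("giving up after %d attempts" % max_tries) from ex
            time.sleep(delay)
```

the `--cd` option mentioned above would correspond to `delay` in this sketch.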
## bugfixes
* clients could crash the file indexer by uploading and then instantly deleting files (as some webdav clients tend to do)
* and fixed some upload error handling which broke during a refactoring in v1.9.16
## other changes
* upgraded pyftpdlib to v1.5.9
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1104-2158 `v1.9.16` windedup
## breaking changes
* two of the prometheus metrics have changed slightly; see the [breaking changes readme section](https://github.com/9001/copyparty#breaking-changes)
* (i'm not familiar with prometheus so i'm not sure if this is a big deal)
## new features
* #58 versioned docker images! no longer just `latest`
* browser: the mkdir feature now accepts `foo/bar/qux` and `../foo` and `/bar`
* add 14 more prometheus metrics; see [readme](https://github.com/9001/copyparty#prometheus) for details
  * connections, requests, malicious requests, volume state, file hashing/analysis queues
* catch some more malicious requests in the autoban filters
* some malicious requests are now answered with HTTP 422, so that they count against `--ban-422`
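the mkdir bullet above (nested paths, `../`, and leading slashes) boils down to resolving the request against the current virtual folder and normalizing the result; a rough sketch of the idea, not copyparty's actual code:

```python
import posixpath

def resolve_mkdir(cwd, req):
    """resolve a mkdir request like 'foo/bar/qux', '../foo' or '/bar'
    against the current virtual folder, returning the absolute vpath
    (any missing intermediate folders would then be created too)"""
    if req.startswith("/"):
        path = req  # absolute: relative to the volume root
    else:
        path = cwd.rstrip("/") + "/" + req
    # collapse '.' and '..' segments
    return posixpath.normpath(path)

resolve_mkdir("/music/jazz", "foo/bar/qux")  # '/music/jazz/foo/bar/qux'
resolve_mkdir("/music/jazz", "../foo")       # '/music/foo'
resolve_mkdir("/music/jazz", "/bar")         # '/bar'
```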
## bugfixes
* windows: fix symlink-based upload deduplication
* MS decided to make symlinks relative to working-directory rather than destination-path...
* `--stats` would produce invalid metrics if a volume was offline
* minor improvements to password hashing ux:
* properly warn if `--ah-cli` or `--ah-gen` is used without `--ah-alg`
* support `^D` during `--ah-cli`
* browser-ux / cosmetics:
* fix toast/tooltip colors on splashpage
* easier to do partial text selection inside links (search results, breadcrumbs, uploads)
* more rclone-related hints on the connect-page
## other changes
* malformed http headers from clients are no longer included in the client error-message
* just in case there are deployments with a reverse-proxy inserting interesting stuff on the way in
* the serverlog still contains all the necessary info to debug your own clients
* updated [example nginx config](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nginx/copyparty.conf) to recover faster from brief server outages
* the default value of `fail_timeout` (10sec) makes nginx cache the outage for longer than necessary
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1024-1643 `v1.9.15` expand placeholder
[made it just in time!](https://a.ocv.me/pub/g/nerd-stuff/PXL_20231024_170348367.jpg) (EDIT: nevermind, three of the containers didn't finish uploading to ghcr before takeoff ;_; all up now)
## new features
* #56 placeholder variables in markdown documents and prologue/epilogue html files
* default-disabled; must be enabled globally with `--exp` or per-volume with volflag `exp`
* `{{self.ip}}` becomes the client IP; see [/srv/expand/README.md](https://github.com/9001/copyparty/blob/hovudstraum/srv/expand/README.md) for more examples
* dynamic-range-compressor: reduced volume jumps between songs when enabled
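placeholder expansion of the `{{self.ip}}` kind can be approximated as a regex substitution over a lookup table; a minimal sketch under assumed names, not the actual implementation:

```python
import re

def expand(text, vals):
    """replace {{key}} placeholders with values from `vals`,
    leaving unknown placeholders untouched"""
    def sub(m):
        k = m.group(1)
        return str(vals[k]) if k in vals else m.group(0)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", sub, text)

expand("hello {{self.ip}}!", {"self.ip": "10.1.2.3"})  # 'hello 10.1.2.3!'
```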
## bugfixes
* v1.9.14 broke the `scan` volflag, causing volume rescans to happen every 10sec if enabled
* its global counterpart `--re-maxage` was not affected
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-1021-1443 `v1.9.14` uptime


@@ -28,10 +28,6 @@ https://github.com/nayuki/QR-Code-generator/
 C: Project Nayuki
 L: MIT
-https://github.com/python/cpython/blob/3.10/Lib/asyncore.py
-C: 1996 Sam Rushing
-L: ISC
 https://github.com/ahupp/python-magic/
 C: 2001-2014 Adam Hupp
 L: MIT


@@ -141,12 +141,25 @@ filt=
 }
 [ $push ] && {
+    ver=$(
+        python3 ../../dist/copyparty-sfx.py --version 2>/dev/null |
+        awk '/^copyparty v/{sub(/-.*/,"");sub(/v/,"");print$2;exit}'
+    )
+    echo $ver | grep -E '[0-9]\.[0-9]' || {
+        echo no ver
+        exit 1
+    }
     for i in $dhub_order; do
+        printf '\ndockerhub %s\n' $i
+        podman manifest push --all copyparty-$i copyparty/$i:$ver
         podman manifest push --all copyparty-$i copyparty/$i:latest
-    done
+    done &
     for i in $ghcr_order; do
+        printf '\nghcr %s\n' $i
+        podman manifest push --all copyparty-$i ghcr.io/9001/copyparty-$i:$ver
         podman manifest push --all copyparty-$i ghcr.io/9001/copyparty-$i:latest
-    done
+    done &
+    wait
 }
echo ok echo ok


@@ -205,26 +205,22 @@ necho() {
 mv {markupsafe,jinja2} j2/
 necho collecting pyftpdlib
-f="../build/pyftpdlib-1.5.8.tar.gz"
+f="../build/pyftpdlib-1.5.9.tar.gz"
 [ -e "$f" ] ||
-    (url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.8.tar.gz;
+    (url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.9.tar.gz;
     wget -O$f "$url" || curl -L "$url" >$f)
 tar -zxf $f
 mv pyftpdlib-release-*/pyftpdlib .
 rm -rf pyftpdlib-release-* pyftpdlib/test
+for f in pyftpdlib/_async{hat,ore}.py; do
+    [ -e "$f" ] || continue;
+    iawk 'NR<4||NR>27||!/^#/;NR==4{print"# license: https://opensource.org/licenses/ISC\n"}' $f
+done
 mkdir ftp/
 mv pyftpdlib ftp/
-necho collecting asyncore, asynchat
-for n in asyncore.py asynchat.py; do
-    f=../build/$n
-    [ -e "$f" ] ||
-        (url=https://raw.githubusercontent.com/python/cpython/c4d45ee670c09d4f6da709df072ec80cb7dfad22/Lib/$n;
-        wget -O$f "$url" || curl -L "$url" >$f)
-done
 necho collecting python-magic
 v=0.4.27
 f="../build/python-magic-$v.tar.gz"
@@ -293,12 +289,6 @@ necho() {
 (cd "${x%/*}"; cp -p "../$(cat "${x##*/}")" ${x##*/})
 done
-# insert asynchat
-mkdir copyparty/vend
-for n in asyncore.py asynchat.py; do
-    awk 'NR<4||NR>27;NR==4{print"# license: https://opensource.org/licenses/ISC\n"}' ../build/$n >copyparty/vend/$n
-done
 rm -f copyparty/stolen/*/README.md
 # remove type hints before build instead
@@ -419,7 +409,7 @@ iawk '/^ {0,4}[^ ]/{s=0}/^ {4}def (serve_forever|_loop)/{s=1}!s' ftp/pyftpdlib/s
 rm -f ftp/pyftpdlib/{__main__,prefork}.py
 [ $no_ftp ] &&
-    rm -rf copyparty/ftpd.py ftp asyncore.py asynchat.py &&
+    rm -rf copyparty/ftpd.py ftp &&
     sed -ri '/\.ftp/d' copyparty/svchub.py
 [ $no_smb ] &&
@@ -576,8 +566,8 @@ nf=$(ls -1 "$zdir"/arc.* 2>/dev/null | wc -l)
 cat ../$bdir/COPYING.txt) >> copyparty/res/COPYING.txt ||
 echo "copying.txt 404 pls rebuild"
-mv ftp/* j2/* copyparty/vend/* .
-rm -rf ftp j2 py2 py37 copyparty/vend
+mv ftp/* j2/* .
+rm -rf ftp j2 py2 py37
 (cd copyparty; tar -cvf z.tar $t; rm -rf $t)
 cd ..
 pyoxidizer build --release --target-triple $tgt


@@ -9,7 +9,7 @@ f23615c522ed58b9a05978ba4c69c06224590f3a6adbd8e89b31838b181a57160739ceff1fc2ba6f
 3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
 8d16a967a0a7872a7575b1005cf66915deacda6ee8611fbb52f42fc3e3beb2f901a5140c942a5d146bd412b92bfa9cbadd82beeba83df6d70930c6dc26608a5b upx-4.1.0-win32.zip
 # u2c (win7)
-4562b1065c6bce7084eb575b654985c990e26034bfcd8db54629312f43ac737e264db7a2b4d8b797e09919a485cbc6af3fd0931690b7ed79b62bcc0736aec9fc certifi-2023.7.22-py3-none-any.whl
+f3390290b896019b2fa169932390e4930d1c03c014e1f6db2405ca2eb1f51f5f5213f725885853805b742997b0edb369787e5c0069d217bc4e8b957f847f58b6 certifi-2023.11.17-py3-none-any.whl
 904eb57b13bea80aea861de86987e618665d37fa9ea0856e0125a9ba767a53e5064de0b9c4735435a2ddf4f16f7f7d2c75a682e1de83d9f57922bdca8e29988c charset_normalizer-3.3.0-cp37-cp37m-win32.whl
 ffdd45326f4e91c02714f7a944cbcc2fdd09299f709cfa8aec0892053eef0134fb80d9ba3790afd319538a86feb619037cbf533e2f5939cb56b35bb17f56c858 idna-3.4-py3-none-any.whl
 b795abb26ba2f04f1afcfb196f21f638014b26c8186f8f488f1c2d91e8e0220962fbd259dbc9c3875222eb47fc95c73fc0606aaa6602b9ebc524809c9ba3501f requests-2.31.0-py3-none-any.whl


@@ -106,20 +106,19 @@ def meichk():
     if filt not in sys.executable:
         filt = os.path.basename(sys.executable)
-    pids = []
-    ptn = re.compile(r"^([^\s]+)\s+([0-9]+)")
+    hits = []
     try:
-        procs = sp.check_output("tasklist").decode("utf-8", "replace")
+        cmd = "tasklist /fo csv".split(" ")
+        procs = sp.check_output(cmd).decode("utf-8", "replace")
     except:
         procs = ""  # winpe
-    for ln in procs.splitlines():
-        m = ptn.match(ln)
-        if m and filt in m.group(1).lower():
-            pids.append(int(m.group(2)))
+    for ln in procs.split("\n"):
+        if filt in ln.split('"')[:2][-1]:
+            hits.append(ln)
     mod = os.path.dirname(os.path.realpath(__file__))
-    if os.path.basename(mod).startswith("_MEI") and len(pids) == 2:
+    if os.path.basename(mod).startswith("_MEI") and len(hits) == 2:
         meicln(mod)
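the hunk above switches from parsing tasklist's fixed-width output to its CSV mode, where every field is quoted and the image name (first column) parses unambiguously even if it contains spaces; a rough standalone illustration of the same idea (the sample lines are made up):

```python
import csv
import io

def matching_lines(procs, filt):
    """collect tasklist-csv rows whose image name (first field) contains filt"""
    hits = []
    for row in csv.reader(io.StringIO(procs)):
        if row and filt in row[0]:
            hits.append(row)
    return hits

sample = ('"copyparty.exe","1234","Console","1","10,240 K"\n'
          '"notepad.exe","5678","Console","1","8,192 K"\n')
len(matching_lines(sample, "copyparty"))  # 1
```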


@@ -33,7 +33,7 @@ fns=(
 )
 [ $w7 ] && fns+=(
     pyinstaller-5.13.2-py3-none-win32.whl
-    certifi-2022.12.7-py3-none-any.whl
+    certifi-2023.11.17-py3-none-any.whl
     chardet-5.1.0-py3-none-any.whl
     idna-3.4-py3-none-any.whl
     requests-2.28.2-py3-none-any.whl


@@ -59,9 +59,6 @@ copyparty/th_srv.py,
 copyparty/u2idx.py,
 copyparty/up2k.py,
 copyparty/util.py,
-copyparty/vend,
-copyparty/vend/asynchat.py,
-copyparty/vend/asyncore.py,
 copyparty/web,
 copyparty/web/a,
 copyparty/web/a/__init__.py,


@@ -16,16 +16,11 @@ def uncomment(fpath):
     orig = f.read().decode("utf-8")
     out = ""
-    for ln in orig.split("\n"):
-        if not ln.startswith("#"):
-            break
-        out += ln + "\n"
     io_obj = io.StringIO(orig)
     prev_toktype = tokenize.INDENT
     last_lineno = -1
     last_col = 0
+    code = False
     for tok in tokenize.generate_tokens(io_obj.readline):
         # print(repr(tok))
         token_type = tok[0]
@@ -53,7 +48,11 @@ def uncomment(fpath):
             out += token_string
         else:
             out += '"a"'
-    elif token_type != tokenize.COMMENT or is_legalese:
+    elif token_type != tokenize.COMMENT:
+        out += token_string
+        if not code and token_string.strip():
+            code = True
+    elif is_legalese or (not start_col and not code):
         out += token_string
     else:
         if out.rstrip(" ").endswith("\n"):


@@ -115,7 +115,7 @@ class Cfg(Namespace):
     ex = "dotpart no_rescan no_sendfile no_voldump plain_ip"
     ka.update(**{k: True for k in ex.split()})
-    ex = "css_browser hist js_browser no_forget no_hash no_idx nonsus_urls"
+    ex = "ah_cli ah_gen css_browser hist js_browser no_forget no_hash no_idx nonsus_urls"
     ka.update(**{k: None for k in ex.split()})
     ex = "s_thead s_tbody th_convt"
@@ -190,6 +190,7 @@ class VHttpSrv(object):
     self.broker = NullBroker()
     self.prism = None
     self.bans = {}
+    self.nreq = 0
     aliases = ["splash", "browser", "browser2", "msg", "md", "mde"]
     self.j2 = {x: J2_FILES for x in aliases}