Compare commits

18 commits (author and date columns were lost in extraction):

| SHA1 |
|---|
| 42d00050c1 |
| 4bb0e6e75a |
| 2f7f9de3f5 |
| f31ac90932 |
| 439cb7f85b |
| af193ee834 |
| c06126cc9d |
| 897ffbbbd0 |
| 8244d3b4fc |
| 74266af6d1 |
| 8c552f1ad1 |
| bf5850785f |
| feecb3e0b8 |
| 08d8c82167 |
| 5239e7ac0c |
| 9937c2e755 |
| f1e947f37d |
| a70a49b9c9 |
README.md (19 changed lines):
```diff
@@ -10,6 +10,8 @@ turn almost any device into a file server with resumable uploads/downloads using
 
 📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
 
+🎬 **videos:** [upload](https://a.ocv.me/pub/demo/pics-vids/up2k.webm) // [cli-upload](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm) // [race-the-beam](https://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)
+
 ## readme toc
 
@@ -17,7 +19,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [quickstart](#quickstart) - just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** -- that's it! 🎉
 * [at home](#at-home) - make it accessible over the internet
 * [on servers](#on-servers) - you may also want these, especially on servers
-* [features](#features)
+* [features](#features) - also see [comparison to similar software](./docs/versus.md)
 * [testimonials](#testimonials) - small collection of user feedback
 * [motivations](#motivations) - project goals / philosophy
 * [notes](#notes) - general notes
@@ -38,6 +40,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [file-search](#file-search) - dropping files into the browser also lets you see if they exist on the server
 * [unpost](#unpost) - undo/delete accidental uploads
 * [self-destruct](#self-destruct) - uploads can be given a lifetime
+* [race the beam](#race-the-beam) - download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm))
 * [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
 * [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
 * [media player](#media-player) - plays almost every audio format there is
@@ -127,7 +130,7 @@ enable thumbnails (images/audio/video), media indexing, and audio transcoding by
 
 * **Alpine:** `apk add py3-pillow ffmpeg`
 * **Debian:** `apt install --no-install-recommends python3-pil ffmpeg`
-* **Fedora:** rpmfusion + `dnf install python3-pillow ffmpeg`
+* **Fedora:** rpmfusion + `dnf install python3-pillow ffmpeg --allowerasing`
 * **FreeBSD:** `pkg install py39-sqlite3 py39-pillow ffmpeg`
 * **MacOS:** `port install py-Pillow ffmpeg`
 * **MacOS** (alternative): `brew install pillow ffmpeg`
@@ -182,6 +185,8 @@ firewall-cmd --reload
 
 ## features
 
+also see [comparison to similar software](./docs/versus.md)
+
 * backend stuff
 * ☑ IPv6
 * ☑ [multiprocessing](#performance) (actual multithreading)
@@ -204,6 +209,7 @@ firewall-cmd --reload
 * ☑ write-only folders
 * ☑ [unpost](#unpost): undo/delete accidental uploads
 * ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
+* ☑ [race the beam](#race-the-beam) (almost like peer-to-peer)
 * ☑ symlink/discard duplicates (content-matching)
 * download
 * ☑ single files in browser
@@ -629,7 +635,7 @@ up2k has several advantages:
 > it is perfectly safe to restart / upgrade copyparty while someone is uploading to it!
 > all known up2k clients will resume just fine 💪
 
-see [up2k](#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
+see [up2k](./docs/devnotes.md#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
 
 
@@ -695,6 +701,13 @@ clients can specify a shorter expiration time using the [up2k ui](#uploading) --
 specifying a custom expiration time client-side will affect the timespan in which unposts are permitted, so keep an eye on the estimates in the up2k ui
 
+
+### race the beam
+
+download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)) -- it's almost like peer-to-peer
+
+requires the file to be uploaded using up2k (which is the default drag-and-drop uploader), alternatively the command-line program
+
 ## file manager
 
 cut/paste, rename, and delete files/folders (if you have permission)
```
```diff
@@ -231,7 +231,7 @@ install_vamp() {
 	cd "$td"
 	echo '#include <vamp-sdk/Plugin.h>' | g++ -x c++ -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
 		printf '\033[33mcould not find the vamp-sdk, building from source\033[0m\n'
-		(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2691/vamp-plugin-sdk-2.10.0.tar.gz)
+		(dl_files yolo https://ocv.me/mirror/vamp-plugin-sdk-2.10.0.tar.gz)
 		sha512sum -c <(
 			echo "153b7f2fa01b77c65ad393ca0689742d66421017fd5931d216caa0fcf6909355fff74706fabbc062a3a04588a619c9b515a1dae00f21a57afd97902a355c48ed -"
 		) <vamp-plugin-sdk-2.10.0.tar.gz
@@ -247,7 +247,7 @@ install_vamp() {
 	cd "$td"
 	have_beatroot || {
 		printf '\033[33mcould not find the vamp beatroot plugin, building from source\033[0m\n'
-		(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/885/beatroot-vamp-v1.0.tar.gz)
+		(dl_files yolo https://ocv.me/mirror/beatroot-vamp-v1.0.tar.gz)
 		sha512sum -c <(
 			echo "1f444d1d58ccf565c0adfe99f1a1aa62789e19f5071e46857e2adfbc9d453037bc1c4dcb039b02c16240e9b97f444aaff3afb625c86aa2470233e711f55b6874 -"
 		) <beatroot-vamp-v1.0.tar.gz
```
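The install script pins each mirrored tarball to a sha512 digest so that switching download URLs to the ocv.me mirror cannot silently change the payload. The same check can be done portably in Python with hashlib; a minimal sketch (the function name and the idea of streaming in 1 MiB buffers are mine, not the script's):

```python
import hashlib

def sha512_of(path: str, bufsz: int = 1 << 20) -> str:
    """stream the file through hashlib so large tarballs don't need to fit in RAM"""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        while True:
            buf = f.read(bufsz)
            if not buf:
                break
            h.update(buf)
    return h.hexdigest()

# compare against the pinned digest from the diff above, e.g.:
#   sha512_of("beatroot-vamp-v1.0.tar.gz") == "1f444d1d58ccf565c0adfe99f1a1aa62..."
```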
bin/u2c.py (12 changed lines):

```diff
@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
 from __future__ import print_function, unicode_literals
 
-S_VERSION = "1.15"
-S_BUILD_DT = "2024-02-18"
+S_VERSION = "1.16"
+S_BUILD_DT = "2024-04-20"
 
 """
 u2c.py: upload to copyparty
@@ -563,7 +563,7 @@ def handshake(ar, file, search):
     else:
         if ar.touch:
            req["umod"] = True
-       if ar.dr:
+       if ar.ow:
            req["replace"] = True
 
    headers = {"Content-Type": "text/plain"}  # <=1.5.1 compat
@@ -1140,6 +1140,7 @@ source file/folder selection uses rsync syntax, meaning that:
     ap.add_argument("-x", type=unicode, metavar="REGEX", default="", help="skip file if filesystem-abspath matches REGEX, example: '.*/\\.hist/.*'")
     ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
     ap.add_argument("--touch", action="store_true", help="if last-modified timestamps differ, push local to server (need write+delete perms)")
+    ap.add_argument("--ow", action="store_true", help="overwrite existing files instead of autorenaming")
     ap.add_argument("--version", action="store_true", help="show version and exit")
 
     ap = app.add_argument_group("compatibility")
@@ -1148,7 +1149,7 @@ source file/folder selection uses rsync syntax, meaning that:
 
     ap = app.add_argument_group("folder sync")
     ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
-    ap.add_argument("--dr", action="store_true", help="delete remote files which don't exist locally")
+    ap.add_argument("--dr", action="store_true", help="delete remote files which don't exist locally (implies --ow)")
     ap.add_argument("--drd", action="store_true", help="delete remote files during upload instead of afterwards; reduces peak disk space usage, but will reupload instead of detecting renames")
 
     ap = app.add_argument_group("performance tweaks")
@@ -1178,6 +1179,9 @@ source file/folder selection uses rsync syntax, meaning that:
     if ar.drd:
         ar.dr = True
 
+    if ar.dr:
+        ar.ow = True
+
     for k in "dl dr drd".split():
         errs = []
         if ar.safe and getattr(ar, k):
```
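The u2c.py hunks form a flag cascade: `--drd` implies `--dr`, and `--dr` now implies the new `--ow`, so a folder-sync that deletes remote strays also overwrites changed files instead of autorenaming them. The cascade can be reduced to one post-parse normalization step; a standalone sketch (not u2c.py's actual parser, just the same implication logic):

```python
import argparse

def parse(argv):
    ap = argparse.ArgumentParser()
    ap.add_argument("--ow", action="store_true")   # overwrite instead of autorename
    ap.add_argument("--dr", action="store_true")   # delete remote strays (implies --ow)
    ap.add_argument("--drd", action="store_true")  # delete during upload (implies --dr)
    ar = ap.parse_args(argv)
    # normalize implications once, so later code only ever checks ar.dr / ar.ow
    if ar.drd:
        ar.dr = True
    if ar.dr:
        ar.ow = True
    return ar
```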
```diff
@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.12.1"
+pkgver="1.12.2"
 pkgrel=1
 pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
 arch=("any")
@@ -21,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
 )
 source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("c247de98727bc28aef4f696850c52b4816530847d839f5ff8451a8313bb7f983")
+sha256sums=("e4fd6733e5361f5ceb2ae950f71f65f2609c2b69d45f47e8b2a2f128fb67de0a")
 
 build() {
     cd "${srcdir}/${pkgname}-${pkgver}"
```
```diff
@@ -1,5 +1,5 @@
 {
-  "url": "https://github.com/9001/copyparty/releases/download/v1.12.1/copyparty-sfx.py",
-  "version": "1.12.1",
-  "hash": "sha256-EkkLOaGZPbu9NJyCoSTj6yso9wcAbNAk23TiT96QYJ8="
+  "url": "https://github.com/9001/copyparty/releases/download/v1.12.2/copyparty-sfx.py",
+  "version": "1.12.2",
+  "hash": "sha256-GJts5N0leK/WHqpqb+eB1JjBvf6TRpzCc9R7AIHkujo="
 }
```
```diff
@@ -856,7 +856,7 @@ def add_qr(ap, tty):
 
 def add_fs(ap):
     ap2 = ap.add_argument_group("filesystem options")
-    rm_re_def = "5/0.1" if ANYWIN else "0/0"
+    rm_re_def = "15/0.1" if ANYWIN else "0/0"
     ap2.add_argument("--rm-retry", metavar="T/R", type=u, default=rm_re_def, help="if a file cannot be deleted because it is busy, continue trying for \033[33mT\033[0m seconds, retry every \033[33mR\033[0m seconds; disable with 0/0 (volflag=rm_retry)")
     ap2.add_argument("--mv-retry", metavar="T/R", type=u, default=rm_re_def, help="if a file cannot be renamed because it is busy, continue trying for \033[33mT\033[0m seconds, retry every \033[33mR\033[0m seconds; disable with 0/0 (volflag=mv_retry)")
     ap2.add_argument("--iobuf", metavar="BYTES", type=int, default=256*1024, help="file I/O buffer-size; if your volumes are on a network drive, try increasing to \033[32m524288\033[0m or even \033[32m4194304\033[0m (and let me know if that improves your performance)")
@@ -1091,6 +1091,8 @@ def add_optouts(ap):
     ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
     ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
     ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
+    ap2.add_argument("--no-pipe", action="store_true", help="disable race-the-beam (lockstep download of files which are currently being uploaded) (volflag=nopipe)")
+    ap2.add_argument("--no-db-ip", action="store_true", help="do not write uploader IPs into the database")
 
 
 def add_safety(ap):
@@ -1216,7 +1218,7 @@ def add_db_general(ap, hcores):
     ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
     ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
     ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
-    ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
+    ap2.add_argument("--re-dhash", action="store_true", help="force a cache rebuild on startup; enable this once if it gets out of sync (should never be necessary)")
     ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice -- only useful for offloading uploads to a cloud service or something (volflag=noforget)")
     ap2.add_argument("--dbd", metavar="PROFILE", default="wal", help="database durability profile; sets the tradeoff between robustness and speed, see \033[33m--help-dbd\033[0m (volflag=dbd)")
    ap2.add_argument("--xlink", action="store_true", help="on upload: check all volumes for dupes, not just the target volume (volflag=xlink)")
```
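The `--rm-retry` / `--mv-retry` default bumped above ("5/0.1" to "15/0.1" on Windows) packs two numbers into one flag: total seconds to keep trying, and the pause between attempts. A minimal sketch of how such a T/R string could drive a retry loop (helper names here are illustrative, not copyparty's own):

```python
import time

def parse_tr(spec: str) -> tuple[float, float]:
    """split a "T/R" flag value into (total_seconds, retry_interval)"""
    t, r = spec.split("/")
    return float(t), float(r)

def retry_rm(op, spec: str) -> bool:
    """call op() until it succeeds or the T budget runs out; "0/0" means one attempt"""
    total, interval = parse_tr(spec)
    deadline = time.monotonic() + total
    while True:
        try:
            op()
            return True
        except OSError:
            if time.monotonic() >= deadline:
                return False  # file stayed busy for the whole T window
            time.sleep(interval)
```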
```diff
@@ -1,8 +1,8 @@
 # coding: utf-8
 
-VERSION = (1, 12, 2)
-CODENAME = "locksmith"
-BUILD_DT = (2024, 4, 12)
+VERSION = (1, 13, 0)
+CODENAME = "race the beam"
+BUILD_DT = (2024, 4, 20)
 
 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
```
```diff
@@ -16,6 +16,7 @@ def vf_bmap() -> dict[str, str]:
         "no_dedup": "copydupes",
         "no_dupe": "nodupe",
         "no_forget": "noforget",
+        "no_pipe": "nopipe",
         "no_robots": "norobots",
         "no_thumb": "dthumb",
         "no_vthumb": "dvthumb",
```
```diff
@@ -36,6 +36,7 @@ from .bos import bos
 from .star import StreamTar
 from .sutil import StreamArc, gfilter
 from .szip import StreamZip
+from .up2k import up2k_chunksize
 from .util import unquote  # type: ignore
 from .util import (
     APPLESAN_RE,
@@ -127,6 +128,7 @@ class HttpCli(object):
         self.ico = conn.ico  # mypy404
         self.thumbcli = conn.thumbcli  # mypy404
         self.u2fh = conn.u2fh  # mypy404
+        self.pipes = conn.pipes  # mypy404
         self.log_func = conn.log_func  # mypy404
         self.log_src = conn.log_src  # mypy404
         self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
```
```diff
@@ -2929,17 +2931,42 @@ class HttpCli(object):
 
         return txt
 
-    def tx_file(self, req_path: str) -> bool:
+    def tx_file(self, req_path: str, ptop: Optional[str] = None) -> bool:
         status = 200
         logmsg = "{:4} {} ".format("", self.req)
         logtail = ""
 
+        if ptop is not None:
+            try:
+                dp, fn = os.path.split(req_path)
+                tnam = fn + ".PARTIAL"
+                if self.args.dotpart:
+                    tnam = "." + tnam
+                ap_data = os.path.join(dp, tnam)
+                st_data = bos.stat(ap_data)
+                if not st_data.st_size:
+                    raise Exception("partial is empty")
+                x = self.conn.hsrv.broker.ask("up2k.find_job_by_ap", ptop, req_path)
+                job = json.loads(x.get())
+                if not job:
+                    raise Exception("not found in registry")
+                self.pipes.set(req_path, job)
+            except Exception as ex:
+                self.log("will not pipe [%s]; %s" % (ap_data, ex), 6)
+                ptop = None
+
         #
         # if request is for foo.js, check if we have foo.js.gz
 
         file_ts = 0.0
         editions: dict[str, tuple[str, int]] = {}
         for ext in ("", ".gz"):
+            if ptop is not None:
+                sz = job["size"]
+                file_ts = job["lmod"]
+                editions["plain"] = (ap_data, sz)
+                break
+
             try:
                 fs_path = req_path + ext
                 st = bos.stat(fs_path)
```
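Race-the-beam works by reading the uploader's in-progress temp file, and the hunk above derives that temp name from the requested path. A standalone sketch of the naming rule (mirroring the logic, with `dotpart` as a plain bool instead of `self.args.dotpart`):

```python
import os

def partial_path(req_path: str, dotpart: bool) -> str:
    """map /data/vid.mkv -> /data/vid.mkv.PARTIAL (or /data/.vid.mkv.PARTIAL)"""
    dp, fn = os.path.split(req_path)
    tnam = fn + ".PARTIAL"
    if dotpart:
        tnam = "." + tnam  # leading dot hides the temp file on unix-like filesystems
    return os.path.join(dp, tnam)
```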
```diff
@@ -3096,6 +3123,11 @@ class HttpCli(object):
             self.send_headers(length=upper - lower, status=status, mime=mime)
             return True
 
+        if ptop is not None:
+            return self.tx_pipe(
+                ptop, req_path, ap_data, job, lower, upper, status, mime, logmsg
+            )
+
         ret = True
         with open_func(*open_args) as f:
             self.send_headers(length=upper - lower, status=status, mime=mime)
@@ -3115,6 +3147,143 @@ class HttpCli(object):
 
         return ret
 
+    def tx_pipe(
+        self,
+        ptop: str,
+        req_path: str,
+        ap_data: str,
+        job: dict[str, Any],
+        lower: int,
+        upper: int,
+        status: int,
+        mime: str,
+        logmsg: str,
+    ) -> bool:
+        M = 1048576
+        self.send_headers(length=upper - lower, status=status, mime=mime)
+        wr_slp = self.args.s_wr_slp
+        wr_sz = self.args.s_wr_sz
+        file_size = job["size"]
+        chunk_size = up2k_chunksize(file_size)
+        num_need = -1
+        data_end = 0
+        remains = upper - lower
+        broken = False
+        spins = 0
+        tier = 0
+        tiers = ["uncapped", "reduced speed", "one byte per sec"]
+
+        while lower < upper and not broken:
+            with self.u2mutex:
+                job = self.pipes.get(req_path)
+                if not job:
+                    x = self.conn.hsrv.broker.ask("up2k.find_job_by_ap", ptop, req_path)
+                    job = json.loads(x.get())
+                    if job:
+                        self.pipes.set(req_path, job)
+
+            if not job:
+                t = "pipe: OK, upload has finished; yeeting remainder"
+                self.log(t, 2)
+                data_end = file_size
+                break
+
+            if num_need != len(job["need"]):
+                num_need = len(job["need"])
+                data_end = 0
+                for cid in job["hash"]:
+                    if cid in job["need"]:
+                        break
+                    data_end += chunk_size
+                t = "pipe: can stream %.2f MiB; requested range is %.2f to %.2f"
+                self.log(t % (data_end / M, lower / M, upper / M), 6)
+                with self.u2mutex:
+                    if data_end > self.u2fh.aps.get(ap_data, data_end):
+                        try:
+                            fhs = self.u2fh.cache[ap_data].all_fhs
+                            for fh in fhs:
+                                fh.flush()
+                            self.u2fh.aps[ap_data] = data_end
+                            self.log("pipe: flushed %d up2k-FDs" % (len(fhs),))
+                        except Exception as ex:
+                            self.log("pipe: u2fh flush failed: %r" % (ex,))
+
+            if lower >= data_end:
+                if data_end:
+                    t = "pipe: uploader is too slow; aborting download at %.2f MiB"
+                    self.log(t % (data_end / M))
+                    raise Pebkac(416, "uploader is too slow")
+
+                raise Pebkac(416, "no data available yet; please retry in a bit")
+
+            slack = data_end - lower
+            if slack >= 8 * M:
+                ntier = 0
+                winsz = M
+                bufsz = wr_sz
+                slp = wr_slp
+            else:
+                winsz = max(40, int(M * (slack / (12 * M))))
+                base_rate = M if not wr_slp else wr_sz / wr_slp
+                if winsz > base_rate:
+                    ntier = 0
+                    bufsz = wr_sz
+                    slp = wr_slp
+                elif winsz > 300:
+                    ntier = 1
+                    bufsz = winsz // 5
+                    slp = 0.2
+                else:
+                    ntier = 2
+                    bufsz = winsz = slp = 1
+
+            if tier != ntier:
+                tier = ntier
+                self.log("moved to tier %d (%s)" % (tier, tiers[tier]))
+
+            try:
+                with open(ap_data, "rb", self.args.iobuf) as f:
+                    f.seek(lower)
+                    page = f.read(min(winsz, data_end - lower, upper - lower))
+                    if not page:
+                        raise Exception("got 0 bytes (EOF?)")
+            except Exception as ex:
+                self.log("pipe: read failed at %.2f MiB: %s" % (lower / M, ex), 3)
+                with self.u2mutex:
+                    self.pipes.c.pop(req_path, None)
+                spins += 1
+                if spins > 3:
+                    raise Pebkac(500, "file became unreadable")
+                time.sleep(2)
+                continue
+
+            spins = 0
+            pofs = 0
+            while pofs < len(page):
+                if slp:
+                    time.sleep(slp)
+
+                try:
+                    buf = page[pofs : pofs + bufsz]
+                    self.s.sendall(buf)
+                    zi = len(buf)
+                    remains -= zi
+                    lower += zi
+                    pofs += zi
+                except:
+                    broken = True
+                    break
+
+        if lower < upper and not broken:
+            with open(req_path, "rb") as f:
+                remains = sendfile_py(self.log, lower, upper, f, self.s, wr_sz, wr_slp)
+
+        spd = self._spd((upper - lower) - remains)
+        if self.do_log:
+            self.log("{}, {}".format(logmsg, spd))
+
+        return not broken
+
     def tx_zip(
         self,
         fmt: str,
@@ -3752,7 +3921,7 @@ class HttpCli(object):
         if not allvols:
             ret = [{"kinshi": 1}]
 
-        jtxt = '{"u":%s,"c":%s}' % (uret, json.dumps(ret, indent=0))
+        jtxt = '{"u":%s,"c":%s}' % (uret, json.dumps(ret, separators=(",\n", ": ")))
         zi = len(uret.split('\n"pd":')) - 1
         self.log("%s #%d+%d %.2fsec" % (lm, zi, len(ret), time.time() - t0))
         self.reply(jtxt.encode("utf-8", "replace"), mime="application/json")
@@ -4031,7 +4200,9 @@ class HttpCli(object):
         ):
             return self.tx_md(vn, abspath)
 
-            return self.tx_file(abspath)
+            return self.tx_file(
+                abspath, None if st.st_size or "nopipe" in vn.flags else vn.realpath
+            )
 
         elif is_dir and not self.can_read:
             if self._use_dirkey(abspath):
```
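The tier logic in `tx_pipe` throttles the downloader based on "slack": how much verified data lies between the download cursor and the upload frontier. Plenty of slack streams uncapped; little slack throttles down; almost none crawls at one byte per second so the connection stays open without overtaking the uploader. That selection can be isolated as a pure function; a sketch mirroring the thresholds above (the 256 KiB / 0 defaults for `s_wr_sz` / `s_wr_slp` are my assumption):

```python
M = 1048576  # one MiB, as in tx_pipe

def pick_tier(slack: int, wr_sz: int = 262144, wr_slp: float = 0.0):
    """mirror of tx_pipe's throttle: returns (tier, window, bufsz, sleep)"""
    if slack >= 8 * M:
        return 0, M, wr_sz, wr_slp          # plenty of runway: full speed
    winsz = max(40, int(M * (slack / (12 * M))))
    base_rate = M if not wr_slp else wr_sz / wr_slp
    if winsz > base_rate:
        return 0, winsz, wr_sz, wr_slp      # still above the configured cap
    if winsz > 300:
        return 1, winsz, winsz // 5, 0.2    # reduced speed: pace the uploader
    return 2, 1, 1, 1                       # crawl at one byte per second
```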
```diff
@@ -55,6 +55,7 @@ class HttpConn(object):
         self.E: EnvParams = self.args.E
         self.asrv: AuthSrv = hsrv.asrv  # mypy404
         self.u2fh: Util.FHC = hsrv.u2fh  # mypy404
+        self.pipes: Util.CachedDict = hsrv.pipes  # mypy404
         self.ipa_nm: Optional[NetMap] = hsrv.ipa_nm
         self.xff_nm: Optional[NetMap] = hsrv.xff_nm
         self.xff_lan: NetMap = hsrv.xff_lan  # type: ignore
```
```diff
@@ -61,6 +61,7 @@ from .u2idx import U2idx
 from .util import (
     E_SCK,
     FHC,
+    CachedDict,
     Daemon,
     Garda,
     Magician,
@@ -130,6 +131,7 @@ class HttpSrv(object):
         self.t_periodic: Optional[threading.Thread] = None
 
         self.u2fh = FHC()
+        self.pipes = CachedDict(0.2)
         self.metrics = Metrics(self)
         self.nreq = 0
         self.nsus = 0
```
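`CachedDict(0.2)` caches each request-path's up2k job lookup for 200 ms, so the `tx_pipe` loop doesn't have to ask the broker on every iteration. Its real implementation lives in copyparty's util module; a minimal TTL-dict sketch of the same idea (the class and field names here are assumptions, not the real API):

```python
import time

class TTLDict:
    """tiny expiring cache: entries silently disappear `ttl` seconds after set()"""
    def __init__(self, ttl: float) -> None:
        self.ttl = ttl
        self.c: dict = {}  # key -> (deadline, value)

    def set(self, k, v) -> None:
        self.c[k] = (time.monotonic() + self.ttl, v)

    def get(self, k):
        hit = self.c.get(k)
        if not hit:
            return None
        deadline, v = hit
        if time.monotonic() > deadline:
            del self.c[k]  # expired; the caller must re-fetch fresh state
            return None
        return v
```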
@@ -139,6 +139,7 @@ class Up2k(object):
|
||||
self.need_rescan: set[str] = set()
|
||||
self.db_act = 0.0
|
||||
|
||||
self.reg_mutex = threading.Lock()
|
||||
self.registry: dict[str, dict[str, dict[str, Any]]] = {}
|
||||
self.flags: dict[str, dict[str, Any]] = {}
|
||||
self.droppable: dict[str, list[str]] = {}
|
||||
@@ -146,7 +147,7 @@ class Up2k(object):
|
||||
self.volsize: dict["sqlite3.Cursor", int] = {}
|
||||
self.volstate: dict[str, str] = {}
|
||||
self.vol_act: dict[str, float] = {}
|
||||
self.busy_aps: set[str] = set()
|
||||
self.busy_aps: dict[str, int] = {}
|
||||
self.dupesched: dict[str, list[tuple[str, str, float]]] = {}
|
||||
self.snap_prev: dict[str, Optional[tuple[int, float]]] = {}
|
||||
|
||||
@@ -203,11 +204,15 @@ class Up2k(object):
|
||||
Daemon(self.deferred_init, "up2k-deferred-init")
|
||||
|
||||
def reload(self, rescan_all_vols: bool) -> None:
|
||||
"""mutex me"""
|
||||
"""mutex(main) me"""
|
||||
self.log("reload #{} scheduled".format(self.gid + 1))
|
||||
all_vols = self.asrv.vfs.all_vols
|
||||
|
||||
scan_vols = [k for k, v in all_vols.items() if v.realpath not in self.registry]
|
||||
with self.reg_mutex:
|
||||
scan_vols = [
|
||||
k for k, v in all_vols.items() if v.realpath not in self.registry
|
||||
]
|
||||
|
||||
if rescan_all_vols:
|
||||
scan_vols = list(all_vols.keys())
|
||||
|
||||
@@ -220,7 +225,7 @@ class Up2k(object):
|
||||
if self.stop:
|
||||
# up-mt consistency not guaranteed if init is interrupted;
|
||||
# drop caches for a full scan on next boot
|
||||
with self.mutex:
|
||||
with self.mutex, self.reg_mutex:
|
||||
self._drop_caches()
|
||||
|
||||
if self.pp:
|
||||
@@ -286,10 +291,27 @@ class Up2k(object):
|
||||
min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
|
||||
),
|
||||
}
|
||||
return json.dumps(ret, indent=4)
|
||||
return json.dumps(ret, separators=(",\n", ": "))
|
||||
|
||||
def find_job_by_ap(self, ptop: str, ap: str) -> str:
|
||||
try:
|
||||
if ANYWIN:
|
||||
ap = ap.replace("\\", "/")
|
||||
|
||||
vp = ap[len(ptop) :].strip("/")
|
||||
dn, fn = vsplit(vp)
|
||||
with self.reg_mutex:
|
||||
tab2 = self.registry[ptop]
|
||||
for job in tab2.values():
|
||||
if job["prel"] == dn and job["name"] == fn:
|
||||
return json.dumps(job, separators=(",\n", ": "))
|
||||
except:
|
||||
pass
|
||||
|
||||
return "{}"
|
||||
|
||||
def get_unfinished_by_user(self, uname, ip) -> str:
|
||||
if PY2 or not self.mutex.acquire(timeout=2):
|
||||
if PY2 or not self.reg_mutex.acquire(timeout=2):
|
||||
return '[{"timeout":1}]'
|
||||
|
||||
ret: list[tuple[int, str, int, int, int]] = []
|
||||
@@ -318,17 +340,25 @@ class Up2k(object):
|
||||
)
|
||||
ret.append(zt5)
|
||||
finally:
|
||||
self.mutex.release()
|
||||
self.reg_mutex.release()
|
||||
|
||||
if ANYWIN:
|
||||
ret = [(x[0], x[1].replace("\\", "/"), x[2], x[3], x[4]) for x in ret]
|
||||
|
||||
ret.sort(reverse=True)
|
||||
ret2 = [
|
||||
{"at": at, "vp": "/" + vp, "pd": 100 - ((nn * 100) // (nh or 1)), "sz": sz}
|
||||
{
|
||||
"at": at,
|
||||
"vp": "/" + quotep(vp),
|
||||
"pd": 100 - ((nn * 100) // (nh or 1)),
|
||||
"sz": sz,
|
||||
}
|
||||
for (at, vp, sz, nn, nh) in ret
|
||||
]
|
||||
return json.dumps(ret2, indent=0)
|
||||
return json.dumps(ret2, separators=(",\n", ": "))
|
||||
|
||||
def get_unfinished(self) -> str:
|
||||
if PY2 or not self.mutex.acquire(timeout=0.5):
|
||||
if PY2 or not self.reg_mutex.acquire(timeout=0.5):
|
||||
return ""
|
||||
|
||||
ret: dict[str, tuple[int, int]] = {}
|
||||
@@ -350,17 +380,17 @@ class Up2k(object):
|
||||
|
||||
ret[ptop] = (nbytes, nfiles)
|
||||
finally:
|
||||
self.mutex.release()
|
||||
self.reg_mutex.release()
|
||||
|
||||
return json.dumps(ret, indent=4)
|
||||
return json.dumps(ret, separators=(",\n", ": "))
|
||||
|
||||
def get_volsize(self, ptop: str) -> tuple[int, int]:
|
||||
with self.mutex:
|
||||
with self.reg_mutex:
|
||||
return self._get_volsize(ptop)
|
||||
|
||||
def get_volsizes(self, ptops: list[str]) -> list[tuple[int, int]]:
|
||||
ret = []
|
||||
with self.mutex:
|
||||
with self.reg_mutex:
|
||||
for ptop in ptops:
|
||||
ret.append(self._get_volsize(ptop))
|
||||
|
||||
@@ -388,7 +418,7 @@ class Up2k(object):
|
||||
def _rescan(
|
||||
self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool
|
||||
) -> str:
|
||||
"""mutex me"""
|
||||
"""mutex(main) me"""
|
||||
if not wait and self.pp:
|
||||
return "cannot initiate; scan is already in progress"
|
||||
|
||||
@@ -670,7 +700,7 @@ class Up2k(object):
|
||||
self.log(msg, c=3)
|
||||
|
||||
live_vols = []
|
||||
with self.mutex:
|
||||
with self.mutex, self.reg_mutex:
|
||||
# only need to protect register_vpath but all in one go feels right
|
||||
for vol in vols:
|
||||
try:
|
||||
@@ -712,7 +742,7 @@ class Up2k(object):
|
||||
|
||||
if self.args.re_dhash or [zv for zv in vols if "e2tsr" in zv.flags]:
|
||||
self.args.re_dhash = False
|
||||
with self.mutex:
|
||||
with self.mutex, self.reg_mutex:
|
||||
self._drop_caches()
|
||||
|
||||
for vol in vols:
|
||||
@@ -789,7 +819,9 @@ class Up2k(object):
|
||||
self.volstate[vol.vpath] = "online (mtp soon)"
|
||||
|
||||
for vol in need_vac:
|
||||
reg = self.register_vpath(vol.realpath, vol.flags)
|
||||
with self.mutex, self.reg_mutex:
|
||||
reg = self.register_vpath(vol.realpath, vol.flags)
|
||||
|
||||
assert reg
|
||||
cur, _ = reg
|
||||
with self.mutex:
|
||||
@@ -803,7 +835,9 @@ class Up2k(object):
|
||||
if vol.flags["dbd"] == "acid":
|
||||
continue
|
||||
|
||||
reg = self.register_vpath(vol.realpath, vol.flags)
|
||||
with self.mutex, self.reg_mutex:
|
||||
reg = self.register_vpath(vol.realpath, vol.flags)
|
||||
|
||||
try:
|
||||
assert reg
|
||||
cur, db_path = reg
|
||||
@@ -850,6 +884,7 @@ class Up2k(object):
|
||||
def register_vpath(
|
||||
self, ptop: str, flags: dict[str, Any]
|
||||
) -> Optional[tuple["sqlite3.Cursor", str]]:
|
||||
"""mutex(main,reg) me"""
|
||||
histpath = self.asrv.vfs.histtab.get(ptop)
|
||||
if not histpath:
|
||||
self.log("no histpath for [{}]".format(ptop))
|
||||
@@ -1033,7 +1068,9 @@ class Up2k(object):
|
||||
dev = cst.st_dev if vol.flags.get("xdev") else 0
|
||||
|
||||
with self.mutex:
|
||||
reg = self.register_vpath(top, vol.flags)
|
||||
with self.reg_mutex:
|
||||
reg = self.register_vpath(top, vol.flags)
|
||||
|
||||
assert reg and self.pp
|
||||
cur, db_path = reg
|
||||
|
||||
@@ -1630,7 +1667,7 @@ class Up2k(object):
|
||||
|
||||
def _build_tags_index(self, vol: VFS) -> tuple[int, int, bool]:
|
||||
ptop = vol.realpath
|
||||
with self.mutex:
|
||||
with self.mutex, self.reg_mutex:
|
||||
reg = self.register_vpath(ptop, vol.flags)
|
||||
|
||||
assert reg and self.pp
|
||||
@@ -1651,6 +1688,7 @@ class Up2k(object):
|
||||
return ret
|
||||
|
||||
def _drop_caches(self) -> None:
|
||||
"""mutex(main,reg) me"""
|
||||
self.log("dropping caches for a full filesystem scan")
|
||||
for vol in self.asrv.vfs.all_vols.values():
|
||||
reg = self.register_vpath(vol.realpath, vol.flags)
|
||||
@@ -1826,7 +1864,7 @@ class Up2k(object):
|
||||
params: tuple[Any, ...],
|
||||
flt: int,
|
||||
) -> tuple[tempfile.SpooledTemporaryFile[bytes], int]:
|
||||
"""mutex me"""
|
||||
"""mutex(main) me"""
|
||||
n = 0
|
||||
c2 = cur.connection.cursor()
|
||||
tf = tempfile.SpooledTemporaryFile(1024 * 1024 * 8, "w+b", prefix="cpp-tq-")
|
||||
@@ -2160,7 +2198,7 @@ class Up2k(object):
|
||||
ip: str,
|
||||
at: float,
|
||||
) -> int:
|
||||
"""will mutex"""
|
||||
"""will mutex(main)"""
|
||||
assert self.mtag
|
||||
|
||||
try:
|
||||
@@ -2192,7 +2230,7 @@ class Up2k(object):
|
||||
abspath: str,
|
||||
tags: dict[str, Union[str, float]],
|
||||
) -> int:
|
||||
"""mutex me"""
|
||||
"""mutex(main) me"""
|
||||
assert self.mtag
|
||||
|
||||
if not bos.path.isfile(abspath):
|
||||
@@ -2477,28 +2515,36 @@ class Up2k(object):
|
||||
|
||||
cur.connection.commit()
|
||||
|
||||
def _job_volchk(self, cj: dict[str, Any]) -> None:
|
||||
if not self.register_vpath(cj["ptop"], cj["vcfg"]):
|
||||
if cj["ptop"] not in self.registry:
|
||||
raise Pebkac(410, "location unavailable")
|
||||
|
||||
def handle_json(self, cj: dict[str, Any], busy_aps: set[str]) -> dict[str, Any]:
|
||||
def handle_json(
|
||||
self, cj: dict[str, Any], busy_aps: dict[str, int]
|
||||
) -> dict[str, Any]:
|
||||
# busy_aps is u2fh (always undefined if -j0) so this is safe
|
||||
self.busy_aps = busy_aps
|
||||
got_lock = False
|
||||
try:
|
||||
# bit expensive; 3.9=10x 3.11=2x
|
||||
if self.mutex.acquire(timeout=10):
|
||||
self._job_volchk(cj)
|
||||
self.mutex.release()
|
||||
got_lock = True
|
||||
with self.reg_mutex:
|
||||
return self._handle_json(cj)
|
||||
else:
|
||||
t = "cannot receive uploads right now;\nserver busy with {}.\nPlease wait; the client will retry..."
|
||||
raise Pebkac(503, t.format(self.blocked or "[unknown]"))
|
||||
except TypeError:
|
||||
if not PY2:
|
||||
raise
|
||||
with self.mutex:
|
||||
self._job_volchk(cj)
|
||||
with self.mutex, self.reg_mutex:
|
||||
return self._handle_json(cj)
|
||||
finally:
|
||||
if got_lock:
|
||||
self.mutex.release()
|
||||
|
||||
def _handle_json(self, cj: dict[str, Any]) -> dict[str, Any]:
|
||||
ptop = cj["ptop"]
|
||||
if not self.register_vpath(ptop, cj["vcfg"]):
|
||||
if ptop not in self.registry:
|
||||
raise Pebkac(410, "location unavailable")
|
||||
|
||||
cj["name"] = sanitize_fn(cj["name"], "", [".prologue.html", ".epilogue.html"])
|
||||
cj["poke"] = now = self.db_act = self.vol_act[ptop] = time.time()
|
||||
wark = self._get_wark(cj)
|
||||
@@ -2513,7 +2559,7 @@ class Up2k(object):
# refuse out-of-order / multithreaded uploading if sprs False
sprs = self.fstab.get(pdir) != "ng"

with self.mutex:
if True:
jcur = self.cur.get(ptop)
reg = self.registry[ptop]
vfs = self.asrv.vfs.all_vols[cj["vtop"]]
@@ -2951,7 +2997,7 @@ class Up2k(object):
def handle_chunk(
self, ptop: str, wark: str, chash: str
) -> tuple[int, list[int], str, float, bool]:
with self.mutex:
with self.mutex, self.reg_mutex:
self.db_act = self.vol_act[ptop] = time.time()
job = self.registry[ptop].get(wark)
if not job:
@@ -2994,7 +3040,7 @@ class Up2k(object):
return chunksize, ofs, path, job["lmod"], job["sprs"]

def release_chunk(self, ptop: str, wark: str, chash: str) -> bool:
with self.mutex:
with self.reg_mutex:
job = self.registry[ptop].get(wark)
if job:
job["busy"].pop(chash, None)
@@ -3002,7 +3048,7 @@ class Up2k(object):
return True

def confirm_chunk(self, ptop: str, wark: str, chash: str) -> tuple[int, str]:
with self.mutex:
with self.mutex, self.reg_mutex:
self.db_act = self.vol_act[ptop] = time.time()
try:
job = self.registry[ptop][wark]
@@ -3025,16 +3071,16 @@ class Up2k(object):

if self.args.nw:
self.regdrop(ptop, wark)
return ret, dst

return ret, dst

def finish_upload(self, ptop: str, wark: str, busy_aps: set[str]) -> None:
self.busy_aps = busy_aps
with self.mutex:
with self.mutex, self.reg_mutex:
self._finish_upload(ptop, wark)

def _finish_upload(self, ptop: str, wark: str) -> None:
"""mutex(main,reg) me"""
try:
job = self.registry[ptop][wark]
pdir = djoin(job["ptop"], job["prel"])
@@ -3107,6 +3153,7 @@ class Up2k(object):
cur.connection.commit()

def regdrop(self, ptop: str, wark: str) -> None:
"""mutex(main,reg) me"""
olds = self.droppable[ptop]
if wark:
olds.append(wark)
@@ -3201,16 +3248,23 @@ class Up2k(object):
at: float,
skip_xau: bool = False,
) -> None:
"""mutex(main) me"""
self.db_rm(db, rd, fn, sz)

if not ip:
db_ip = ""
else:
# plugins may expect this to look like an actual IP
db_ip = "1.1.1.1" if self.args.no_db_ip else ip

sql = "insert into up values (?,?,?,?,?,?,?)"
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0))
v = (wark, int(ts), sz, rd, fn, db_ip, int(at or 0))
try:
db.execute(sql, v)
except:
assert self.mem_cur
rd, fn = s3enc(self.mem_cur, rd, fn)
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0))
v = (wark, int(ts), sz, rd, fn, db_ip, int(at or 0))
db.execute(sql, v)

self.volsize[db] += sz
@@ -3314,7 +3368,7 @@ class Up2k(object):
vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0])
vn, rem = vn.get_dbv(rem)
ptop = vn.realpath
with self.mutex:
with self.mutex, self.reg_mutex:
abrt_cfg = self.flags.get(ptop, {}).get("u2abort", 1)
addr = (ip or "\n") if abrt_cfg in (1, 2) else ""
user = (uname or "\n") if abrt_cfg in (1, 3) else ""
@@ -3322,7 +3376,10 @@ class Up2k(object):
for wark, job in reg.items():
if (user and user != job["user"]) or (addr and addr != job["addr"]):
continue
if djoin(job["prel"], job["name"]) == rem:
jrem = djoin(job["prel"], job["name"])
if ANYWIN:
jrem = jrem.replace("\\", "/")
if jrem == rem:
if job["ptop"] != ptop:
t = "job.ptop [%s] != vol.ptop [%s] ??"
raise Exception(t % (job["ptop"], ptop))
@@ -3418,7 +3475,7 @@ class Up2k(object):
continue

n_files += 1
with self.mutex:
with self.mutex, self.reg_mutex:
cur = None
try:
ptop = dbv.realpath
@@ -3536,6 +3593,7 @@ class Up2k(object):
def _mv_file(
self, uname: str, svp: str, dvp: str, curs: set["sqlite3.Cursor"]
) -> str:
"""mutex(main) me; will mutex(reg)"""
svn, srem = self.asrv.vfs.get(svp, uname, True, False, True)
svn, srem = svn.get_dbv(srem)

@@ -3616,7 +3674,9 @@ class Up2k(object):
if c2 and c2 != c1:
self._copy_tags(c1, c2, w)

has_dupes = self._forget_file(svn.realpath, srem, c1, w, is_xvol, fsize)
with self.reg_mutex:
has_dupes = self._forget_file(svn.realpath, srem, c1, w, is_xvol, fsize)

if not is_xvol:
has_dupes = self._relink(w, svn.realpath, srem, dabs)

@@ -3746,7 +3806,10 @@ class Up2k(object):
drop_tags: bool,
sz: int,
) -> bool:
"""forgets file in db, fixes symlinks, does not delete"""
"""
mutex(main,reg) me
forgets file in db, fixes symlinks, does not delete
"""
srd, sfn = vsplit(vrem)
has_dupes = False
self.log("forgetting {}".format(vrem))
@@ -4071,7 +4134,7 @@ class Up2k(object):
self.do_snapshot()

def do_snapshot(self) -> None:
with self.mutex:
with self.mutex, self.reg_mutex:
for k, reg in self.registry.items():
self._snap_reg(k, reg)

@@ -4139,7 +4202,7 @@ class Up2k(object):

path2 = "{}.{}".format(path, os.getpid())
body = {"droppable": self.droppable[ptop], "registry": reg}
j = json.dumps(body, indent=2, sort_keys=True).encode("utf-8")
j = json.dumps(body, sort_keys=True, separators=(",\n", ": ")).encode("utf-8")
with gzip.GzipFile(path2, "wb") as f:
f.write(j)
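The snapshot hunk above swaps `indent=2` for `separators=(",\n", ": ")`, which drops the indentation whitespace but keeps one item per line, so the gzip'd registry snapshot gets smaller while staying line-diffable. A standalone sketch of the same trick, with made-up data:

```python
import gzip
import io
import json

# illustrative payload; the real snapshot holds the upload registry
body = {"registry": {"wark1": {"name": "a.bin"}}, "droppable": []}

# pretty-printed vs compact-but-line-broken encodings
pretty = json.dumps(body, indent=2, sort_keys=True).encode("utf-8")
compact = json.dumps(body, sort_keys=True, separators=(",\n", ": ")).encode("utf-8")

# write the compact form through gzip, like the snapshot file
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode="wb") as f:
    f.write(compact)
```

The `",\n"` item separator is still plain JSON whitespace, so the file round-trips through `json.loads` unchanged.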
@@ -4212,7 +4275,7 @@ class Up2k(object):
raise Exception("invalid hash task")

try:
if not self._hash_t(task):
if not self._hash_t(task) and self.stop:
return
except Exception as ex:
self.log("failed to hash %s: %s" % (task, ex), 1)
@@ -4222,7 +4285,7 @@ class Up2k(object):
) -> bool:
ptop, vtop, flags, rd, fn, ip, at, usr, skip_xau = task
# self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
with self.mutex:
with self.mutex, self.reg_mutex:
if not self.register_vpath(ptop, flags):
return True

@@ -4240,7 +4303,7 @@ class Up2k(object):

wark = up2k_wark_from_hashlist(self.salt, inf.st_size, hashes)

with self.mutex:
with self.mutex, self.reg_mutex:
self.idx_wark(
self.flags[ptop],
rd,
@@ -759,15 +759,46 @@ class CachedSet(object):
self.oldest = now


class CachedDict(object):
def __init__(self, maxage: float) -> None:
self.c: dict[str, tuple[float, Any]] = {}
self.maxage = maxage
self.oldest = 0.0

def set(self, k: str, v: Any) -> None:
now = time.time()
self.c[k] = (now, v)
if now - self.oldest < self.maxage:
return

c = self.c = {k: v for k, v in self.c.items() if now - v[0] < self.maxage}
try:
self.oldest = min([x[0] for x in c.values()])
except:
self.oldest = now

def get(self, k: str) -> Optional[tuple[str, Any]]:
try:
ts, ret = self.c[k]
now = time.time()
if now - ts > self.maxage:
del self.c[k]
return None
return ret
except:
return None
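The new `CachedDict` above is a lazily-pruned TTL map: stale entries are dropped on `get`, and the whole dict is swept on `set` once the oldest entry has expired. Its behavior can be demonstrated with a self-contained sketch (`TTLDict` and the tiny `maxage` are illustrative, not the real class):

```python
import time

class TTLDict:
    """tiny expiring dict in the spirit of CachedDict: entries older
    than maxage are invisible to get() and pruned lazily on set()"""

    def __init__(self, maxage):
        self.c = {}  # key -> (timestamp, value)
        self.maxage = maxage

    def set(self, k, v):
        now = time.time()
        self.c[k] = (now, v)
        # lazy sweep: drop everything past its deadline
        self.c = {k2: tv for k2, tv in self.c.items() if now - tv[0] < self.maxage}

    def get(self, k):
        try:
            ts, v = self.c[k]
        except KeyError:
            return None
        if time.time() - ts > self.maxage:
            del self.c[k]  # expired; forget it
            return None
        return v
```

Like the original, there is no background thread; expiry cost is paid by the callers, which keeps the class lock-friendly.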
class FHC(object):
class CE(object):
def __init__(self, fh: typing.BinaryIO) -> None:
self.ts: float = 0
self.fhs = [fh]
self.all_fhs = set([fh])

def __init__(self) -> None:
self.cache: dict[str, FHC.CE] = {}
self.aps: set[str] = set()
self.aps: dict[str, int] = {}

def close(self, path: str) -> None:
try:
@@ -779,7 +810,7 @@ class FHC(object):
fh.close()

del self.cache[path]
self.aps.remove(path)
del self.aps[path]

def clean(self) -> None:
if not self.cache:
@@ -800,9 +831,12 @@ class FHC(object):
return self.cache[path].fhs.pop()

def put(self, path: str, fh: typing.BinaryIO) -> None:
self.aps.add(path)
if path not in self.aps:
self.aps[path] = 0

try:
ce = self.cache[path]
ce.all_fhs.add(fh)
ce.fhs.append(fh)
except:
ce = self.CE(fh)
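The `FHC` hunk above turns `aps` from a set of open paths into a dict, so each path can also carry an integer (for example a per-path activity counter) while keeping set-like membership semantics. A hedged sketch of that shape change, with made-up helper names:

```python
# aps as a dict: membership test works like the old set,
# but each path now maps to a counter instead of just existing
aps = {}

def put(path):
    if path not in aps:
        aps[path] = 0  # register path; counter starts at zero

def bump(path):
    aps[path] = aps.get(path, 0) + 1

def close(path):
    del aps[path]  # dict equivalent of the old set.remove()
```

`path in aps` is unchanged for callers; only the add/remove verbs differ (`aps[path] = 0` / `del aps[path]` instead of `add` / `remove`).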
@@ -699,12 +699,12 @@ a:hover {
.s0:after,
.s1:after {
content: '⌄';
margin-left: -.1em;
margin-left: -.15em;
}
.s0r:after,
.s1r:after {
content: '⌃';
margin-left: -.1em;
margin-left: -.15em;
}
.s0:after,
.s0r:after {
@@ -715,7 +715,7 @@ a:hover {
color: var(--sort-2);
}
#files thead th:after {
margin-right: -.7em;
margin-right: -.5em;
}
#files tbody tr:hover td,
#files tbody tr:hover td+td {
@@ -744,6 +744,15 @@ html #files.hhpick thead th {
word-wrap: break-word;
overflow: hidden;
}
#files tr.fade a {
color: #999;
color: rgba(255, 255, 255, 0.4);
font-style: italic;
}
html.y #files tr.fade a {
color: #999;
color: rgba(0, 0, 0, 0.4);
}
#files tr:nth-child(2n) td {
background: var(--row-alt);
}
@@ -1737,6 +1746,7 @@ html.y #tree.nowrap .ntree a+a:hover {
}
#files th span {
position: relative;
white-space: nowrap;
}
#files>thead>tr>th.min,
#files td.min {
@@ -290,6 +290,8 @@ var Ls = {

"f_dls": 'the file links in the current folder have\nbeen changed into download links',

"f_partial": "To safely download a file which is currently being uploaded, please click the file which has the same filename, but without the <code>.PARTIAL</code> file extension. Please press CANCEL or Escape to do this.\n\nPressing OK / Enter will ignore this warning and continue downloading the <code>.PARTIAL</code> scratchfile instead, which will almost definitely give you corrupted data.",

"ft_paste": "paste {0} items$NHotkey: ctrl-V",
"fr_eperm": 'cannot rename:\nyou do not have “move” permission in this folder',
"fd_eperm": 'cannot delete:\nyou do not have “delete” permission in this folder',
@@ -420,6 +422,7 @@ var Ls = {
"un_fclr": "clear filter",
"un_derr": 'unpost-delete failed:\n',
"un_f5": 'something broke, please try a refresh or hit F5',
"un_uf5": "sorry but you have to refresh the page (for example by pressing F5 or CTRL-R) before this upload can be aborted",
"un_nou": '<b>warning:</b> server too busy to show unfinished uploads; click the "refresh" link in a bit',
"un_noc": '<b>warning:</b> unpost of fully uploaded files is not enabled/permitted in server config',
"un_max": "showing first 2000 files (use the filter)",
@@ -792,6 +795,8 @@ var Ls = {

"f_dls": 'linkene i denne mappen er nå\nomgjort til nedlastningsknapper',

"f_partial": "For å laste ned en fil som enda ikke er ferdig opplastet, klikk på filen som har samme filnavn som denne, men uten <code>.PARTIAL</code> på slutten. Da vil serveren passe på at nedlastning går bra. Derfor anbefales det sterkt å trykke ABRYT eller Escape-tasten.\n\nHvis du virkelig ønsker å laste ned denne <code>.PARTIAL</code>-filen på en ukontrollert måte, trykk OK / Enter for å ignorere denne advarselen. Slik vil du høyst sannsynlig motta korrupt data.",

"ft_paste": "Lim inn {0} filer$NSnarvei: ctrl-V",
"fr_eperm": 'kan ikke endre navn:\ndu har ikke “move”-rettigheten i denne mappen',
"fd_eperm": 'kan ikke slette:\ndu har ikke “delete”-rettigheten i denne mappen',
@@ -922,6 +927,7 @@ var Ls = {
"un_fclr": "nullstill filter",
"un_derr": 'unpost-sletting feilet:\n',
"un_f5": 'noe gikk galt, prøv å oppdatere listen eller trykk F5',
"un_uf5": "beklager, men du må laste siden på nytt (f.eks. ved å trykke F5 eller CTRL-R) før denne opplastningen kan avbrytes",
"un_nou": '<b>advarsel:</b> kan ikke vise ufullstendige opplastninger akkurat nå; klikk på oppdater-linken om litt',
"un_noc": '<b>advarsel:</b> angring av fullførte opplastninger er deaktivert i serverkonfigurasjonen',
"un_max": "viser de første 2000 filene (bruk filteret for å innsnevre)",
@@ -1806,7 +1812,7 @@ function MPlayer() {

r.preload = function (url, full) {
var t0 = Date.now(),
fname = uricom_dec(url.split('/').pop());
fname = uricom_dec(url.split('/').pop().split('?')[0]);

url = addq(mpl.acode(url), 'cache=987&_=' + ACB);
mpl.preload_url = full ? url : null;
@@ -6381,8 +6387,9 @@ var treectl = (function () {
'" class="doc' + (lang ? ' bri' : '') +
'" hl="' + id + '" name="' + hname + '">-txt-</a>';

var ln = ['<tr><td>' + tn.lead + '</td><td><a href="' +
top + tn.href + '" id="' + id + '">' + hname + '</a>', tn.sz];
var cl = /\.PARTIAL$/.exec(fname) ? ' class="fade"' : '',
ln = ['<tr' + cl + '><td>' + tn.lead + '</td><td><a href="' +
top + tn.href + '" id="' + id + '">' + hname + '</a>', tn.sz];

for (var b = 0; b < res.taglist.length; b++) {
var k = res.taglist[b],
@@ -8092,7 +8099,17 @@ var unpost = (function () {
if (!links.length)
continue;

req.push(uricom_dec(r.files[a].vp.split('?')[0]));
var f = r.files[a];
if (f.k == 'u') {
var vp = vsplit(f.vp.split('?')[0]),
dfn = uricom_dec(vp[1]);
for (var iu = 0; iu < up2k.st.files.length; iu++) {
var uf = up2k.st.files[iu];
if (uf.name == dfn && uf.purl == vp[0])
return modal.alert(L.un_uf5);
}
}
req.push(uricom_dec(f.vp.split('?')[0]));
for (var b = 0; b < links.length; b++) {
links[b].removeAttribute('href');
links[b].innerHTML = '[busy]';
@@ -8215,6 +8232,13 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
treectl.reqls(tgt.getAttribute('href'), true);
return ev(e);
}
if (tgt && /\.PARTIAL(\?|$)/.exec('' + tgt.getAttribute('href')) && !window.partdlok) {
ev(e);
modal.confirm(L.f_partial, function () {
window.partdlok = 1;
tgt.click();
}, null);
}

tgt = e.target.closest('a[hl]');
if (tgt) {
@@ -1,3 +1,26 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0412-2110 `v1.12.2` ie11 fix

## new features

* new option `--bauth-last` for when you're hosting other [basic-auth](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication) services on the same domain 7b94e4ed
  * makes it possible to log into copyparty as intended, but it still sees the passwords from the other service until you do
  * alternatively, the other new option `--no-bauth` entirely disables basic-auth support, but that also kills [the android app](https://github.com/9001/party-up)

## bugfixes

* internet explorer isn't working?! FIX IT!!! 9e5253ef
* audio transcoding was buggy with filekeys enabled b8733653
* on windows, theoretical chance that antivirus could interrupt renaming files, so preemptively guard against that c8e3ed3a

## other changes

* add a "password" placeholder on the login page since you might think it's asking for a username da26ec36
* config buttons were jank on iOS b772a4f8
* readme: [making your homeserver accessible from the internet](https://github.com/9001/copyparty#at-home)


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0409-2334 `v1.12.1` scrolling stuff
331
docs/versus.md
@@ -48,6 +48,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [filebrowser](#filebrowser)
* [filegator](#filegator)
* [sftpgo](#sftpgo)
* [arozos](#arozos)
* [updog](#updog)
* [goshs](#goshs)
* [gimme-that](#gimme-that)
@@ -93,6 +94,7 @@ the softwares,
* `j` = [filebrowser](https://github.com/filebrowser/filebrowser)
* `k` = [filegator](https://github.com/filegator/filegator)
* `l` = [sftpgo](https://github.com/drakkan/sftpgo)
* `m` = [arozos](https://github.com/tobychui/arozos)

some softwares not in the matrixes,
* [updog](#updog)
@@ -113,22 +115,22 @@ symbol legend,

## general

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | ╱ | █ | █ | █ | | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | ╱ |
| runs on iOS | ╱ | | | | | ╱ | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | ╱ | █ | █ | █ | █ |
| runs on Linux | █ | ╱ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ |
| zero setup, just go | █ | █ | █ | | | ╱ | █ | | | █ | | ╱ |
| android app | ╱ | | | █ | █ | | | | | | | |
| iOS app | ╱ | | | █ | █ | | | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | ╱ | █ | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | ╱ | ╱ |
| runs on iOS | ╱ | | | | | ╱ | | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | ╱ | █ | █ | █ | █ | ╱ |
| runs on Linux | █ | ╱ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ | |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ | █ |
| zero setup, just go | █ | █ | █ | | | ╱ | █ | | | █ | | ╱ | █ |
| android app | ╱ | | | █ | █ | | | | | | | | |
| iOS app | ╱ | | | █ | █ | | | | | | | | |

* `zero setup` = you can get a mostly working setup by just launching the app, without having to install any software or configure whatever
* `a`/copyparty remarks:
@@ -140,37 +142,39 @@ symbol legend,
* `f`/rclone must be started with the command `rclone serve webdav .` or similar
* `h`/chibisafe has undocumented windows support
* `i`/sftpgo must be launched with a command
* `m`/arozos has partial windows support


## file transfer

*the thing that copyparty is actually kinda good at*

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | ╱ | | █ | | █ | █ | ╱ | █ |
| download folder as tar | █ | | | | | | | | | █ | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | |
| resumable uploads | █ | | | | | | | | █ | | █ | ╱ |
| upload segmenting | █ | | | | | | | █ | █ | | █ | ╱ |
| upload acceleration | █ | | | | | | | | █ | | █ | |
| upload verification | █ | | | █ | █ | | | | █ | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | ╱ |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | █ |
| upload rules | ╱ | ╱ | ╱ | ╱ | ╱ | | | ╱ | ╱ | | ╱ | ╱ |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | | █ |
| ┗ max filesize | █ | | | | | | | █ | | | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | ╱ |
| ┗ max file age | █ | | | | | | | | █ | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | ╱ |
| ┗ compress before write | █ | | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | █ | █ | | | |
| ┗ mimetype reject-list | ╱ | | | | | | | | • | ╱ | | ╱ |
| ┗ extension reject-list | ╱ | | | | | | | █ | • | ╱ | | ╱ |
| checksums provided | | | | █ | █ | | | | █ | ╱ | | |
| cloud storage backend | ╱ | ╱ | ╱ | █ | █ | █ | ╱ | | | ╱ | █ | █ |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | ╱ | | █ | | █ | █ | ╱ | █ | ╱ |
| download folder as tar | █ | | | | | | | | | █ | | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | | █ |
| resumable uploads | █ | | | | | | | | █ | | █ | ╱ | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | ╱ | █ |
| upload acceleration | █ | | | | | | | | █ | | █ | | |
| upload verification | █ | | | █ | █ | | | | █ | | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | ╱ | ╱ |
| race the beam ("p2p") | █ | | | | | | | | | • | | | |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | █ | |
| upload rules | ╱ | ╱ | ╱ | ╱ | ╱ | | | ╱ | ╱ | | ╱ | ╱ | ╱ |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | | █ | █ |
| ┗ max filesize | █ | | | | | | | █ | | | █ | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | ╱ | |
| ┗ max file age | █ | | | | | | | | █ | | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | ╱ | |
| ┗ compress before write | █ | | | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | █ | █ | | | | |
| ┗ mimetype reject-list | ╱ | | | | | | | | • | ╱ | | ╱ | • |
| ┗ extension reject-list | ╱ | | | | | | | █ | • | ╱ | | ╱ | • |
| checksums provided | | | | █ | █ | | | | █ | ╱ | | | |
| cloud storage backend | ╱ | ╱ | ╱ | █ | █ | █ | ╱ | | | ╱ | █ | █ | ╱ |

* `upload segmenting` = files are sliced into chunks, making it possible to upload files larger than 100 MiB on cloudflare for example

@@ -178,6 +182,8 @@ symbol legend,

* `upload verification` = uploads are checksummed or otherwise confirmed to have been transferred correctly

* `race the beam` = files can be downloaded while they're still uploading; downloaders are slowed down such that the uploader is always ahead

* `checksums provided` = when downloading a file from the server, the file's checksum is provided for verification client-side

* `cloud storage backend` = able to serve files from (and write to) s3 or similar cloud services; `╱` means the software can do this with some help from `rclone mount` as a bridge
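The `upload segmenting` and `upload verification` concepts from the legend above can be sketched together: slice the payload into fixed-size chunks, hash each chunk, and compare hash lists after transfer. This is an illustrative toy, not copyparty's actual wire format or chunk size:

```python
import hashlib

CHUNK = 4  # tiny chunk size for the demo; real servers use MiB-sized slices

def segment(data: bytes):
    """split a payload into fixed-size chunks (the last one may be short)"""
    return [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]

def hashlist(chunks):
    """one sha512 digest per chunk; comparing lists verifies a transfer"""
    return [hashlib.sha512(c).hexdigest() for c in chunks]

def verify(sent: bytes, received: bytes) -> bool:
    """true iff every chunk of the received copy matches what was sent"""
    return hashlist(segment(sent)) == hashlist(segment(received))
```

Per-chunk hashes are what make resume and acceleration possible: only chunks whose digests are missing server-side need to be (re)sent.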
@@ -192,26 +198,27 @@ symbol legend,
  * resumable/segmented uploads only over SFTP, not over HTTP
  * upload rules are totals only, not over time
  * can probably do extension/mimetype rejection similar to copyparty
* `m`/arozos download-as-zip is not streaming; it creates the full zipfile before download can start, and fails on big folders


## protocols and client support

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ |
| serve ftp (tcp) | █ | | | | | █ | | | | | | █ |
| serve ftps (tls) | █ | | | | | █ | | | | | | █ |
| serve tftp (udp) | █ | | | | | | | | | | | |
| serve sftp (ssh) | | | | | | █ | | | | | | █ |
| serve smb/cifs | ╱ | | | | | █ | | | | | | |
| serve dlna | | | | | | █ | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ |
| zeroconf | █ | | | | | | | | | | | |
| supports netscape 4 | ╱ | | | | | █ | | | | | • | |
| ...internet explorer 6 | ╱ | █ | | █ | | █ | | | | | • | |
| mojibake filenames | █ | | | • | • | █ | █ | • | • | • | | ╱ |
| undecodable filenames | █ | | | • | • | █ | | • | • | | | ╱ |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ | █ |
| serve ftp (tcp) | █ | | | | | █ | | | | | | █ | █ |
| serve ftps (tls) | █ | | | | | █ | | | | | | █ | |
| serve tftp (udp) | █ | | | | | | | | | | | | |
| serve sftp (ssh) | | | | | | █ | | | | | | █ | █ |
| serve smb/cifs | ╱ | | | | | █ | | | | | | | |
| serve dlna | | | | | | █ | | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ | |
| zeroconf | █ | | | | | | | | | | | | █ |
| supports netscape 4 | ╱ | | | | | █ | | | | | • | | ╱ |
| ...internet explorer 6 | ╱ | █ | | █ | | █ | | | | | • | | ╱ |
| mojibake filenames | █ | | | • | • | █ | █ | • | █ | • | | ╱ | |
| undecodable filenames | █ | | | • | • | █ | | • | | | | ╱ | |

* `webdav` = protocol convenient for mounting a remote server as a local filesystem
* `zeroconf` = the server announces itself on the LAN, [automatically appearing](https://user-images.githubusercontent.com/241032/215344737-0eae8d98-9496-4256-9aa8-cd2f6971810d.png) on other zeroconf-capable devices
@@ -222,61 +229,66 @@ symbol legend,
  * extremely minimal samba/cifs server
  * netscape 4 / ie6 support is mostly listed as a joke altho some people have actually found it useful ([ie4 tho](https://user-images.githubusercontent.com/241032/118192791-fb31fe00-b446-11eb-9647-898ea8efc1f7.png))
* `l`/sftpgo translates mojibake filenames into valid utf-8 (information loss)
* `m`/arozos has readonly-support for older browsers; no uploading

## server configuration
|
||||
|
||||
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
|
||||
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
|
||||
| config from cmd args | █ | | | | | █ | █ | | | █ | | ╱ |
|
||||
| config files | █ | █ | █ | ╱ | ╱ | █ | | █ | | █ | • | ╱ |
|
||||
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | |
|
||||
| same-port http / https | █ | | | | | | | | | | | |
|
||||
| listen multiple ports | █ | | | | | | | | | | | █ |
|
||||
| virtual file system | █ | █ | █ | | | | █ | | | | | █ |
|
||||
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ |
|
||||
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | |
|
||||
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
|
||||
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
|
||||
| config from cmd args | █ | | | | | █ | █ | | | █ | | ╱ | ╱ |
|
||||
| config files | █ | █ | █ | ╱ | ╱ | █ | | █ | | █ | • | ╱ | ╱ |
|
||||
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | | █ |
|
||||
| same-port http / https | █ | | | | | | | | | | | | |
|
||||
| listen multiple ports | █ | | | | | | | | | | | █ | |
|
||||
| virtual file system | █ | █ | █ | | | | █ | | | | | █ | |
|
||||
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ | ╱ |
|
||||
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | | • |
|
||||
|
||||
* `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead
|
||||
* `l`/sftpgo:
|
||||
* config: users must be added through gui / api calls
|
||||
* `m`/arozos:
|
||||
* configuration is primarily through GUI
|
||||
* reverse-proxy is not guaranteed to see the correct client IP
|
||||
|
||||
|
||||
## server capabilities
|
||||
|
||||
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
|
||||
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
|
||||
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
|
||||
| per-account chroot | | | | | | | | | | | | █ |
|
||||
| single-sign-on | ╱ | | | █ | █ | | | | • | | | |
|
||||
| token auth | ╱ | | | █ | █ | | | █ | | | | |
|
||||
| 2fa | ╱ | | | █ | █ | | | | | | | █ |
|
||||
| per-volume permissions | █ | █ | █ | █ | █ | █ | █ | | █ | █ | ╱ | █ |
|
||||
| per-folder permissions | ╱ | | | █ | █ | | █ | | █ | █ | ╱ | █ |
|
||||
| per-file permissions | | | | █ | █ | | █ | | █ | | | |
|
||||
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | |
|
||||
| unmap subfolders | █ | | | | | | █ | | | █ | ╱ | • |
|
||||
| index.html blocks list | ╱ | | | | | | █ | | | • | | |
|
||||
| write-only folders | █ | | | | | | | | | | █ | █ |
|
||||
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ |
|
||||
| file versioning | | | | █ | █ | | | | | | | |
|
||||
| file encryption | | | | █ | █ | █ | | | | | | █ |
|
||||
| file indexing | █ | | █ | █ | █ | | | █ | █ | █ | | |
|
||||
| ┗ per-volume db | █ | | • | • | • | | | • | • | | | |
|
||||
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | |
| ┗ db stored out-of-tree | █ | | █ | █ | █ | | | • | • | █ | | |
| ┗ existing file tree | █ | | █ | | | | | | | █ | | |
| file action event hooks | █ | | | | | | | | | █ | | █ |
| one-way folder sync | █ | | | █ | █ | █ | | | | | | |
| full sync | | | | █ | █ | | | | | | | |
| speed throttle | | █ | █ | | | █ | | | █ | | | █ |
| anti-bruteforce | █ | █ | █ | █ | █ | | | | • | | | █ |
| dyndns updater | | █ | | | | | | | | | | |
| self-updater | | | █ | | | | | | | | | |
| log rotation | █ | | █ | █ | █ | | | • | █ | | | █ |
| upload tracking / log | █ | █ | • | █ | █ | | | █ | █ | | | ╱ |
| curl-friendly ls | █ | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | █ | █ | • | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| per-account chroot | | | | | | | | | | | | █ | |
| single-sign-on | ╱ | | | █ | █ | | | | • | | | | |
| token auth | ╱ | | | █ | █ | | | █ | | | | | █ |
| 2fa | ╱ | | | █ | █ | | | | | | | █ | ╱ |
| per-volume permissions | █ | █ | █ | █ | █ | █ | █ | | █ | █ | ╱ | █ | █ |
| per-folder permissions | ╱ | | | █ | █ | | █ | | █ | █ | ╱ | █ | █ |
| per-file permissions | | | | █ | █ | | █ | | █ | | | | █ |
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | | █ |
| unmap subfolders | █ | | | | | | █ | | | █ | ╱ | • | |
| index.html blocks list | ╱ | | | | | | █ | | | • | | | |
| write-only folders | █ | | | | | | | | | | █ | █ | |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | | █ | |
| file indexing | █ | | █ | █ | █ | | | █ | █ | █ | | | |
| ┗ per-volume db | █ | | • | • | • | | | • | • | | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | | |
| ┗ db stored out-of-tree | █ | | █ | █ | █ | | | • | • | █ | | | |
| ┗ existing file tree | █ | | █ | | | | | | | █ | | | |
| file action event hooks | █ | | | | | | | | | █ | | █ | • |
| one-way folder sync | █ | | | █ | █ | █ | | | | | | | |
| full sync | | | | █ | █ | | | | | | | | |
| speed throttle | | █ | █ | | | █ | | | █ | | | █ | |
| anti-bruteforce | █ | █ | █ | █ | █ | | | | • | | | █ | • |
| dyndns updater | | █ | | | | | | | | | | | |
| self-updater | | | █ | | | | | | | | | | █ |
| log rotation | █ | | █ | █ | █ | | | • | █ | | | █ | • |
| upload tracking / log | █ | █ | • | █ | █ | | | █ | █ | | | ╱ | █ |
| prometheus metrics | █ | | | █ | | | | | | | | █ | |
| curl-friendly ls | █ | | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | █ | █ | • | | | | | |

* `unmap subfolders` = "shadowing"; mounting a local folder in the middle of an existing filesystem tree in order to disable access below that path
* `files stored as-is` = uploaded files are trivially readable from the server HDD, not sliced into chunks or in weird folder structures or anything like that
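the shadowing trick works because each request is resolved against the deepest matching volume, so an empty volume mounted at a subpath wins over its parent. a toy resolver showing the longest-prefix idea (the mount layout and function are made up for illustration, not copyparty's actual code):

```python
def resolve(mounts, path):
    # pick the deepest mount whose webpath is a prefix of the
    # requested path; a volume mounted at "priv" therefore hides
    # whatever the parent volume has below that folder
    best = ""
    for web in mounts:
        if web and (path == web or path.startswith(web + "/")) and len(web) > len(best):
            best = web
    return mounts[best], path[len(best):].lstrip("/")

# "priv" shadows the matching subfolder of the root volume
mounts = {"": "/srv/pub", "priv": "/var/empty"}
resolve(mounts, "priv/secret.txt")  # -> ("/var/empty", "secret.txt")
```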
@@ -302,49 +314,51 @@ symbol legend,
* `l`/sftpgo:
  * `file action event hooks` also include on-download triggers
  * `upload tracking / log` in main logfile
* `m`/arozos:
  * `2fa` maybe possible through LDAP/Oauth


## client features

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | |
| themes | █ | █ | | █ | | | | | █ | | | |
| directory tree nav | █ | ╱ | | | █ | | | | █ | | ╱ | |
| multi-column sorting | █ | | | | | | | | | | | |
| thumbnails | █ | | | ╱ | ╱ | | | █ | █ | ╱ | | |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | |
| ┗ audio spectrograms | █ | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | ╱ | | |
| ┗ gapless playback | █ | | | | | | | | • | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | |
| ┗ video transcoding | | | | | | | | | █ | | | |
| audio BPM detector | █ | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | ╱ | |
| search by date / size | █ | | | | █ | | | █ | █ | | | |
| search by bpm / key | █ | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | |
| search in file contents | | | | █ | █ | | | | █ | | | |
| search by custom parser | █ | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | |
| create directories | █ | | | █ | █ | ╱ | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | |
| markdown viewer | █ | | | | █ | | | | █ | ╱ | ╱ | |
| markdown editor | █ | | | | █ | | | | █ | ╱ | ╱ | |
| readme.md in listing | █ | | | █ | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | ╱ | █ | | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | |
| delete files | █ | █ | | █ | █ | ╱ | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | | █ |
| themes | █ | █ | | █ | | | | | █ | | | | |
| directory tree nav | █ | ╱ | | | █ | | | | █ | | ╱ | | |
| multi-column sorting | █ | | | | | | | | | | | | |
| thumbnails | █ | | | ╱ | ╱ | | | █ | █ | ╱ | | | █ |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | | █ |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | | █ |
| ┗ audio spectrograms | █ | | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | ╱ | | | █ |
| ┗ gapless playback | █ | | | | | | | | • | | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | | █ |
| ┗ video transcoding | | | | | | | | | █ | | | | |
| audio BPM detector | █ | | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | ╱ | | |
| search by date / size | █ | | | | █ | | | █ | █ | | | | |
| search by bpm / key | █ | | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | | |
| search in file contents | | | | █ | █ | | | | █ | | | | |
| search by custom parser | █ | | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | | |
| create directories | █ | | | █ | █ | ╱ | █ | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | | █ |
| markdown viewer | █ | | | | █ | | | | █ | ╱ | ╱ | | █ |
| markdown editor | █ | | | | █ | | | | █ | ╱ | ╱ | | █ |
| readme.md in listing | █ | | | █ | | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | ╱ | █ | | █ | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | | █ |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | | █ |
| delete files | █ | █ | | █ | █ | ╱ | █ | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | | █ |

* `single-page app` = multitasking; possible to continue navigating while uploading
* `audio player » os-integration` = use the [lockscreen](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) or [media hotkeys](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) to play/pause, prev/next song
@@ -360,14 +374,14 @@ symbol legend,

## integration

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | ╱ | | ╱ |
| discord | █ | | | | | | | | | ╱ | | ╱ |
| ┗ announce uploads | █ | | | | | | | | | | | ╱ |
| ┗ custom embeds | | | | | | | | | | | | ╱ |
| sharex | █ | | | █ | | █ | ╱ | █ | | | | |
| flameshot | | | | | | █ | | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | ╱ | | ╱ | |
| discord | █ | | | | | | | | | ╱ | | ╱ | |
| ┗ announce uploads | █ | | | | | | | | | | | ╱ | |
| ┗ custom embeds | | | | | | | | | | | | ╱ | |
| sharex | █ | | | █ | | █ | ╱ | █ | | | | | |
| flameshot | | | | | | █ | | | | | | | |

* sharex `╱` = yes, but does not provide example sharex config
* `a`/copyparty remarks:
@@ -393,6 +407,7 @@ symbol legend,
| filebrowser | go | █ apl2 | 20 MB |
| filegator | php | █ mit | • |
| sftpgo | go | ‼ agpl | 44 MB |
| arozos | go | ░ gpl3 | 531 MB |
| updog | python | █ mit | 17 MB |
| goshs | go | █ mit | 11 MB |
| gimme-that | python | █ mit | 4.8 MB |
@@ -504,12 +519,14 @@ symbol legend,
* ✅ token auth (api keys)

## [kodbox](https://github.com/kalcaddle/kodbox)
* this thing is insane
* this thing is insane (but is getting competition from [arozos](#arozos))
* php; [docker](https://hub.docker.com/r/kodcloud/kodbox)
* 🔵 *upload segmenting, acceleration, and integrity checking!*
* ⚠️ but uploads are not resumable(?)
* ⚠️ not portable
* ⚠️ isolated on-disk file hierarchy, incompatible with other software
* ⚠️ uploading small files to copyparty is 16x faster
* ⚠️ uploading large files to copyparty is 3x faster
* ⚠️ http/webdav only; no ftp or zeroconf
* ⚠️ some parts of the GUI are in chinese
* ✅ fantastic ui/ux
@@ -569,6 +586,24 @@ symbol legend,
* ✅ on-download event hook (otherwise same as copyparty)
* ✅ more extensive permissions control

## [arozos](https://github.com/tobychui/arozos)
* big suite of applications similar to [kodbox](#kodbox); copyparty is better at downloading/uploading/music/indexing, but arozos has other advantages
* go; primarily linux (limited support for windows)
* ⚠️ uploads not resumable / integrity-checked
* ⚠️ uploading small files to copyparty is 2.7x faster
* ⚠️ uploading large files to copyparty is at least 10% faster
  * arozos is websocket-based, 512 KiB chunks; writes each chunk to a separate file and then merges
  * copyparty splices directly into the final file; faster and better for the HDD and filesystem
* ⚠️ no directory tree navpane; not as easy to navigate
* ⚠️ download-as-zip is not streaming; creates a temp file on the server
* ⚠️ not self-contained (pulls from jsdelivr)
* ⚠️ has an audio player, but supports fewer filetypes
* ⚠️ limited support for configuring real-ip detection
* ✅ sftp server
* ✅ settings gui
* ✅ good-looking gui
* ✅ an IDE, msoffice viewer, rich host integration, much more

## [updog](https://github.com/sc0tfree/updog)
* python; cross-platform
* basic directory listing with upload feature

@@ -3,10 +3,8 @@
from __future__ import print_function, unicode_literals

import io
import os
import time
import json
import pprint
import os
import shutil
import tarfile
import tempfile

@@ -44,7 +44,7 @@ if MACOS:
from copyparty.__init__ import E
from copyparty.__main__ import init_E
from copyparty.u2idx import U2idx
from copyparty.util import FHC, Garda, Unrecv
from copyparty.util import FHC, CachedDict, Garda, Unrecv

init_E(E)

@@ -110,7 +110,7 @@ class Cfg(Namespace):
    def __init__(self, a=None, v=None, c=None, **ka0):
        ka = {}

        ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw q rand smb srch_dbg stats vague_403 vc ver xdev xlink xvol"
        ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_pipe no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw q rand smb srch_dbg stats vague_403 vc ver xdev xlink xvol"
        ka.update(**{k: False for k in ex.split()})

        ex = "dotpart dotsrch no_dhash no_fastboot no_rescan no_sendfile no_voldump re_dhash plain_ip"
@@ -251,6 +251,7 @@ class VHttpConn(object):
        self.log_func = log
        self.log_src = "a"
        self.mutex = threading.Lock()
        self.pipes = CachedDict(1)
        self.u2mutex = threading.Lock()
        self.nbyte = 0
        self.nid = None
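judging by the name and the `1` argument, the `CachedDict(1)` added above is presumably a dict whose entries expire the given number of seconds after being set. a stand-in sketch of that idea (not copyparty's actual implementation):

```python
import time

class TimedCache(object):
    # dict-like store whose entries expire "maxage" seconds after
    # being set; a stand-in for copyparty's CachedDict, used here
    # to forget idle pipe handles instead of holding them forever
    def __init__(self, maxage):
        self.maxage = maxage
        self.d = {}

    def set(self, k, v):
        self.d[k] = (time.time(), v)

    def get(self, k):
        hit = self.d.get(k)
        if not hit:
            return None
        ts, v = hit
        if time.time() - ts > self.maxage:
            del self.d[k]  # expired; drop it on access
            return None
        return v
```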