Compare commits


54 Commits

Author SHA1 Message Date
ed  42d00050c1  v1.13.0  2024-04-20 22:32:50 +00:00
ed  4bb0e6e75a  pipe: windows: make it safe with aggressive flushing  2024-04-20 22:15:08 +00:00
ed  2f7f9de3f5  pipe: optimize (1 GiB/s @ ryzen5-4500U)  2024-04-20 20:13:31 +00:00
ed  f31ac90932  less confusing help-text for --re-dhash  2024-04-20 16:42:56 +00:00
ed  439cb7f85b  u2c: add --ow (previously part of --dr)  2024-04-20 16:36:10 +00:00
ed  af193ee834  keep up2k state integrity on abort  2024-04-20 16:13:32 +00:00
ed  c06126cc9d  pipe: add volflag to disable  2024-04-19 23:54:23 +00:00
ed  897ffbbbd0  pipe: add to docs  2024-04-19 00:02:28 +00:00
ed  8244d3b4fc  pipe: add tapering to keep tcp alive  2024-04-18 23:10:37 +00:00
ed  74266af6d1  pipe: warn when trying to download a .PARTIAL  2024-04-18 23:10:11 +00:00
      and fix file sorting indicators on firefox
ed  8c552f1ad1  windows: fix upload-abort  2024-04-18 23:08:05 +00:00
ed  bf5850785f  add opt-out from storing uploader IPs  2024-04-18 17:16:00 +00:00
ed  feecb3e0b8  up2k: fix put-hasher dying + a harmless race  2024-04-18 16:43:38 +00:00
      * hasher thread could die if a client would rapidly
        upload and delete files (so very unlikely)
      * two unprotected calls to register_vpath which was
        almost-definitely safe because the volumes
        already existed in the registry
ed  08d8c82167  PoC: ongoing uploads can be downloaded in lockstep  2024-04-18 00:10:54 +00:00
ed  5239e7ac0c  separate registry mutex for faster access  2024-04-18 00:07:56 +00:00
      also fix a harmless toctou in handle_json where clients
      could get stuck hanging for a bit longer than necessary
ed  9937c2e755  add ArozOS to comparison  2024-04-16 21:00:47 +00:00
ed  f1e947f37d  rehost deps from a flaky server  2024-04-12 21:49:01 +00:00
ed  a70a49b9c9  update pkgs to 1.12.2  2024-04-12 21:25:21 +00:00
ed  fe700dcf1a  v1.12.2  2024-04-12 21:10:02 +00:00
ed  c8e3ed3aae  retry failed renames on windows  2024-04-12 20:38:30 +00:00
      theoretical issue which nobody has ran into yet,
      probably because nobody uses this on windows
ed  b8733653a3  fix audio transcoding with filekeys  2024-04-11 21:54:15 +00:00
ed  b772a4f8bb  fix wordwrap of buttons on ios  2024-04-11 21:31:40 +00:00
ed  9e5253ef87  ie11: restore load-bearing thing  2024-04-11 20:53:15 +00:00
ed  7b94e4edf3  configurable basic-auth preference  2024-04-11 20:15:49 +00:00
      adds options `--bauth-last` to lower the preference for
      taking the basic-auth password in case of conflict,
      and `--no-bauth` to entirely disable basic-authentication

      if a client is providing multiple passwords, for example when
      "logged in" with one password (the `cppwd` cookie) and switching
      to another account by also sending a PW header/url-param, then
      the default evaluation order to determine which password to use is:

      url-param `pw`, header `pw`, basic-auth header, cookie (cppwd/cppws)

      so if a client supplies a basic-auth header, it will ignore the cookie
      and use the basic-auth password instead, which usually makes sense

      but this can become a problem if you have other webservers running
      on the same domain which also support basic-authentication

      --bauth-last is a good choice for cooperating with such services, as
      --no-bauth currently breaks support for the android app...
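The evaluation order described in that commit message can be sketched as a simple priority list. This is an illustrative model only, not copyparty's actual implementation; the function name `pick_password` and its signature are hypothetical:

```python
# Illustrative sketch of the password-precedence logic described above;
# not copyparty's actual code. All names here are hypothetical.

def pick_password(url_pw=None, hdr_pw=None, bauth_pw=None, cookie_pw=None,
                  bauth_last=False, no_bauth=False):
    """Return the password to use, given every credential the client sent.

    Default order: url-param pw, header pw, basic-auth header, cookie.
    --bauth-last demotes basic-auth below the cookie;
    --no-bauth ignores basic-auth entirely.
    """
    if no_bauth:
        order = [url_pw, hdr_pw, cookie_pw]
    elif bauth_last:
        order = [url_pw, hdr_pw, cookie_pw, bauth_pw]
    else:
        order = [url_pw, hdr_pw, bauth_pw, cookie_pw]
    for pw in order:
        if pw is not None:
            return pw
    return None
```

With the defaults, a basic-auth header beats the cookie; with `--bauth-last` the cookie wins, which is the behavior the commit message recommends when other basic-auth services share the domain.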
ed  da26ec36ca  add password placeholder on login page  2024-04-11 19:31:02 +00:00
      was easy to assume you were supposed to put a username there
ed  443acf2f8b  update nuitka notes  2024-04-10 22:04:43 +00:00
ed  6c90e3893d  update pkgs to 1.12.1  2024-04-09 23:53:43 +00:00
ed  ea002ee71d  v1.12.1  2024-04-09 23:34:31 +00:00
ed  ab18893cd2  update deps  2024-04-09 23:25:54 +00:00
ed  844d16b9e5  bbox: scrollwheel for prev/next pic  2024-04-09 20:39:07 +00:00
      inspired by d385305f5e
ed  989cc613ef  fix tree-rendering when history-popping into bbox  2024-04-09 19:54:15 +00:00
      plus misc similar technically-incorrect addq usages;
      most of these don't matter in practice since they'll
      never get a url with a hash, but makes the intent clear

      and make sure hashes never get passed around
      like they're part of a dirkey, harmless as it is
ed  4f0cad5468  fix bbox destructor, closes #81 for real  2024-04-09 19:10:55 +00:00
ed  f89de6b35d  preloading too aggressive, chill a bit  2024-04-09 18:44:23 +00:00
ed  e0bcb88ee7  update pkgs to 1.12.0  2024-04-06 20:56:52 +00:00
ed  a0022805d1  v1.12.0 (closes #64)  2024-04-06 20:11:49 +00:00
ed  853adb5d04  update deps  2024-04-06 19:51:38 +00:00
ed  7744226b5c  apply audio equalizer to videos too  2024-04-06 18:44:08 +00:00
ed  d94b5b3fc9  fau doesn't work on iphones; compensate by preloading much earlier  2024-04-06 18:43:45 +00:00
ed  e6ba065bc2  improve cachebusters  2024-04-06 00:27:06 +00:00
ed  59a53ba9ac  on phones, fix playback halting if next song didn't buffer in time  2024-04-06 00:25:28 +00:00
ed  b88cc7b5ce  turns out it doesn't need to be audible...  2024-04-05 23:06:26 +00:00
ed  5ab54763c6  remove pyoxidizer (unmaintained)  2024-04-05 17:51:26 +00:00
      partially reverts e430b2567a

      the remaining stuff might be useful for other cpython alternatives
ed  59f815ff8c  deps: add busy.mp3  2024-04-04 09:27:01 +00:00
ed  9c42cbec6f  maybe fix #81  2024-04-03 00:28:15 +00:00
ed  f471b05aa4  todo: fix playback stopping on phones if slow preload  2024-04-02 23:20:58 +00:00
ed  34c32e3e89  golf:  2024-04-02 20:25:06 +00:00
      util.js ensures `WebAssembly`, `Notification`, and `FormData`
      are always declared, setting them false when not available
ed  a080759a03  add transcoding to mp3  2024-03-29 16:36:56 +00:00
      because CU's car stereo can't do opus...

      incidentally adds support for playing any audio format in ie11
ed  0ae12868e5  dirkeys: add volflag dky (skip keycheck)  2024-03-27 21:03:58 +00:00
ed  ef52e2c06c  dirkeys: fix 403 in dks volumes  2024-03-27 20:34:34 +00:00
ed  32c912bb16  fix a bunch of dirkey stuff:  2024-03-27 16:05:05 +00:00
      * breadcrumb navigation
      * tree generation in `recvls`
      * dirkeys in initial tree
ed  20870fda79  Merge branch 'dirkeys' into hovudstraum  2024-03-25 10:34:08 +00:00
ed  bdfe2c1a5f  mention unproductive optimizations  2024-03-24 22:07:23 +00:00
ed  cb99fbf442  update pkgs to 1.11.2  2024-03-23 17:53:19 +00:00
ed  10bc2d9205  unsuccessful attempt at dirkeys (#64)  2023-12-17 22:30:22 +00:00
46 changed files with 1571 additions and 702 deletions

View File

@@ -10,13 +10,16 @@ turn almost any device into a file server with resumable uploads/downloads using
📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
🎬 **videos:** [upload](https://a.ocv.me/pub/demo/pics-vids/up2k.webm) // [cli-upload](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm) // [race-the-beam](https://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)
## readme toc
* top
* [quickstart](#quickstart) - just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** -- that's it! 🎉
* [at home](#at-home) - make it accessible over the internet
* [on servers](#on-servers) - you may also want these, especially on servers
* [features](#features)
* [features](#features) - also see [comparison to similar software](./docs/versus.md)
* [testimonials](#testimonials) - small collection of user feedback
* [motivations](#motivations) - project goals / philosophy
* [notes](#notes) - general notes
@@ -37,6 +40,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [file-search](#file-search) - dropping files into the browser also lets you see if they exist on the server
* [unpost](#unpost) - undo/delete accidental uploads
* [self-destruct](#self-destruct) - uploads can be given a lifetime
* [race the beam](#race-the-beam) - download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm))
* [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
* [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
* [media player](#media-player) - plays almost every audio format there is
@@ -94,6 +98,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [gotchas](#gotchas) - behavior that might be unexpected
* [cors](#cors) - cross-site request config
* [filekeys](#filekeys) - prevent filename bruteforcing
* [dirkeys](#dirkeys) - share specific folders in a volume
* [password hashing](#password-hashing) - you can hash passwords
* [https](#https) - both HTTP and HTTPS are accepted
* [recovering from crashes](#recovering-from-crashes)
@@ -125,7 +130,7 @@ enable thumbnails (images/audio/video), media indexing, and audio transcoding by
* **Alpine:** `apk add py3-pillow ffmpeg`
* **Debian:** `apt install --no-install-recommends python3-pil ffmpeg`
* **Fedora:** rpmfusion + `dnf install python3-pillow ffmpeg`
* **Fedora:** rpmfusion + `dnf install python3-pillow ffmpeg --allowerasing`
* **FreeBSD:** `pkg install py39-sqlite3 py39-pillow ffmpeg`
* **MacOS:** `port install py-Pillow ffmpeg`
* **MacOS** (alternative): `brew install pillow ffmpeg`
@@ -146,6 +151,17 @@ some recommended options:
* see [accounts and volumes](#accounts-and-volumes) (or `--help-accounts`) for the syntax and other permissions
### at home
make it accessible over the internet by starting a [cloudflare quicktunnel](https://developers.cloudflare.com/cloudflare-one/connections/connect-networks/do-more-with-tunnels/trycloudflare/) like so:
first download [cloudflared](https://developers.cloudflare.com/cloudflare-one/connections/connect-networks/downloads/) and then start the tunnel with `cloudflared tunnel --url http://127.0.0.1:3923`
as the tunnel starts, it will show a URL which you can share to let anyone browse your stash or upload files to you
since people will be connecting through cloudflare, run copyparty with `--xff-hdr cf-connecting-ip` to detect client IPs correctly
### on servers
you may also want these, especially on servers:
@@ -169,6 +185,8 @@ firewall-cmd --reload
## features
also see [comparison to similar software](./docs/versus.md)
* backend stuff
* ☑ IPv6
* ☑ [multiprocessing](#performance) (actual multithreading)
@@ -191,6 +209,7 @@ firewall-cmd --reload
* ☑ write-only folders
* ☑ [unpost](#unpost): undo/delete accidental uploads
* ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
* ☑ [race the beam](#race-the-beam) (almost like peer-to-peer)
* ☑ symlink/discard duplicates (content-matching)
* download
* ☑ single files in browser
@@ -199,7 +218,7 @@ firewall-cmd --reload
* browser
* ☑ [navpane](#navpane) (directory tree sidebar)
* ☑ file manager (cut/paste, delete, [batch-rename](#batch-rename))
* ☑ audio player (with [OS media controls](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) and opus transcoding)
* ☑ audio player (with [OS media controls](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) and opus/mp3 transcoding)
* ☑ image gallery with webm player
* ☑ textfile browser with syntax hilighting
* ☑ [thumbnails](#thumbnails)
@@ -587,7 +606,7 @@ you can also zip a selection of files or folders by clicking them in the browser
![copyparty-zipsel-fs8](https://user-images.githubusercontent.com/241032/129635374-e5136e01-470a-49b1-a762-848e8a4c9cdc.png)
cool trick: download a folder by appending url-params `?tar&opus` to transcode all audio files (except aac|m4a|mp3|ogg|opus|wma) to opus before they're added to the archive
cool trick: download a folder by appending url-params `?tar&opus` or `?tar&mp3` to transcode all audio files (except aac|m4a|mp3|ogg|opus|wma) to opus/mp3 before they're added to the archive
* super useful if you're 5 minutes away from takeoff and realize you don't have any music on your phone but your server only has flac files and downloading those will burn through all your data + there wouldn't be enough time anyways
* and url-params `&j` / `&w` produce jpeg/webm thumbnails/spectrograms instead of the original audio/video/images
* can also be used to pregenerate thumbnails; combine with `--th-maxage=9999999` or `--th-clean=0`
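The url-param mechanics above can be wrapped in a tiny client-side helper. `archive_url` is a hypothetical name for illustration, not part of copyparty:

```python
def archive_url(folder_url: str, codec: str = "", thumbs: str = "") -> str:
    """Build a copyparty folder-download URL (illustrative helper, not
    part of copyparty). codec: "" for original files, "opus" or "mp3"
    to transcode audio; thumbs: "j" or "w" to get jpeg/webm
    thumbnails/spectrograms instead of the original media."""
    url = folder_url.rstrip("/") + "/?tar"
    if codec:
        url += "&" + codec
    if thumbs:
        url += "&" + thumbs
    return url
```

For example, `archive_url("https://example.com/music/", "mp3")` yields `https://example.com/music/?tar&mp3` (hostname assumed for illustration).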
@@ -616,7 +635,7 @@ up2k has several advantages:
> it is perfectly safe to restart / upgrade copyparty while someone is uploading to it!
> all known up2k clients will resume just fine 💪
see [up2k](#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
see [up2k](./docs/devnotes.md#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
![copyparty-upload-fs8](https://user-images.githubusercontent.com/241032/129635371-48fc54ca-fa91-48e3-9b1d-ba413e4b68cb.png)
@@ -682,6 +701,13 @@ clients can specify a shorter expiration time using the [up2k ui](#uploading) --
specifying a custom expiration time client-side will affect the timespan in which unposts are permitted, so keep an eye on the estimates in the up2k ui
### race the beam
download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)) -- it's almost like peer-to-peer
requires the file to be uploaded using up2k (which is the default drag-and-drop uploader), alternatively the command-line program
## file manager
cut/paste, rename, and delete files/folders (if you have permission)
@@ -778,9 +804,9 @@ open the `[🎺]` media-player-settings tab to configure it,
* `[loop]` keeps looping the folder
* `[next]` plays into the next folder
* "transcode":
* `[flac]` converts `flac` and `wav` files into opus
* `[aac]` converts `aac` and `m4a` files into opus
* `[oth]` converts all other known formats into opus
* `[flac]` converts `flac` and `wav` files into opus (if supported by browser) or mp3
* `[aac]` converts `aac` and `m4a` files into opus (if supported by browser) or mp3
* `[oth]` converts all other known formats into opus (if supported by browser) or mp3
* `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
* "tint" reduces the contrast of the playback bar
@@ -1041,6 +1067,8 @@ tweaking the ui
* to sort in music order (album, track, artist, title) with filename as fallback, you could `--sort tags/Cirle,tags/.tn,tags/Artist,tags/Title,href`
* to sort by upload date, first enable showing the upload date in the listing with `-e2d -mte +.up_at` and then `--sort tags/.up_at`
see [./docs/rice](./docs/rice) for more
## file indexing
@@ -1839,12 +1867,29 @@ cors can be configured with `--acao` and `--acam`, or the protections entirely d
prevent filename bruteforcing
volflag `c,fk` generates filekeys (per-file accesskeys) for all files; users which have full read-access (permission `r`) will then see URLs with the correct filekey `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
volflag `fk` generates filekeys (per-file accesskeys) for all files; users which have full read-access (permission `r`) will then see URLs with the correct filekey `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
by default, filekeys are generated based on salt (`--fk-salt`) + filesystem-path + file-size + inode (if not windows); add volflag `fka` to generate slightly weaker filekeys which will not be invalidated if the file is edited (only salt + path)
permissions `wG` (write + upget) lets users upload files and receive their own filekeys, still without being able to see other uploads
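As the README text describes, a filekey is a keyed digest of file properties. Below is a minimal sketch of such a scheme; the hash choice, encoding, and truncation length are assumptions for illustration, not copyparty's exact algorithm:

```python
import base64
import hashlib

def filekey(salt: str, path: str, size: int, inode: int, fka: bool = False) -> str:
    """Illustrative filekey derivation: hash salt + filesystem-path, plus
    size + inode unless the weaker 'fka' variant is used (which therefore
    survives file edits). Not copyparty's exact algorithm; the sha512 /
    urlsafe-base64 / 8-char truncation choices are assumptions."""
    material = "%s %s" % (salt, path)
    if not fka:
        material += " %d %d" % (size, inode)
    digest = hashlib.sha512(material.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest)[:8].decode("ascii")
```

Note how editing a file (changing its size) invalidates an `fk`-style key but leaves an `fka`-style key unchanged, matching the tradeoff described above.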
### dirkeys
share specific folders in a volume without giving away full read-access to the rest -- the visitor only needs the `g` (get) permission to view the link
volflag `dk` generates dirkeys (per-directory accesskeys) for all folders, granting read-access to that folder; by default only that folder itself, no subfolders
volflag `dky` disables the actual key-check, meaning anyone can see the contents of a folder where they have `g` access, but not its subdirectories
* `dk` + `dky` gives the same behavior as if all users with `g` access have full read-access, but subfolders are hidden files (their names start with a dot), so `dky` is an alternative to renaming all the folders for that purpose, maybe just for some users
volflag `dks` lets people enter subfolders as well, and also enables download-as-zip/tar
dirkeys are generated based on another salt (`--dk-salt`) + filesystem-path and have a few limitations:
* the key does not change if the contents of the folder is modified
* if you need a new dirkey, either change the salt or rename the folder
* linking to a textfile (so it opens in the textfile viewer) is not possible if recipient doesn't have read-access
## password hashing

View File

@@ -231,7 +231,7 @@ install_vamp() {
cd "$td"
echo '#include <vamp-sdk/Plugin.h>' | g++ -x c++ -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
printf '\033[33mcould not find the vamp-sdk, building from source\033[0m\n'
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2691/vamp-plugin-sdk-2.10.0.tar.gz)
(dl_files yolo https://ocv.me/mirror/vamp-plugin-sdk-2.10.0.tar.gz)
sha512sum -c <(
echo "153b7f2fa01b77c65ad393ca0689742d66421017fd5931d216caa0fcf6909355fff74706fabbc062a3a04588a619c9b515a1dae00f21a57afd97902a355c48ed -"
) <vamp-plugin-sdk-2.10.0.tar.gz
@@ -247,7 +247,7 @@ install_vamp() {
cd "$td"
have_beatroot || {
printf '\033[33mcould not find the vamp beatroot plugin, building from source\033[0m\n'
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/885/beatroot-vamp-v1.0.tar.gz)
(dl_files yolo https://ocv.me/mirror/beatroot-vamp-v1.0.tar.gz)
sha512sum -c <(
echo "1f444d1d58ccf565c0adfe99f1a1aa62789e19f5071e46857e2adfbc9d453037bc1c4dcb039b02c16240e9b97f444aaff3afb625c86aa2470233e711f55b6874 -"
) <beatroot-vamp-v1.0.tar.gz

View File

@@ -1,8 +1,8 @@
#!/usr/bin/env python3
from __future__ import print_function, unicode_literals
S_VERSION = "1.15"
S_BUILD_DT = "2024-02-18"
S_VERSION = "1.16"
S_BUILD_DT = "2024-04-20"
"""
u2c.py: upload to copyparty
@@ -563,7 +563,7 @@ def handshake(ar, file, search):
else:
if ar.touch:
req["umod"] = True
if ar.dr:
if ar.ow:
req["replace"] = True
headers = {"Content-Type": "text/plain"} # <=1.5.1 compat
@@ -1140,6 +1140,7 @@ source file/folder selection uses rsync syntax, meaning that:
ap.add_argument("-x", type=unicode, metavar="REGEX", default="", help="skip file if filesystem-abspath matches REGEX, example: '.*/\\.hist/.*'")
ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
ap.add_argument("--touch", action="store_true", help="if last-modified timestamps differ, push local to server (need write+delete perms)")
ap.add_argument("--ow", action="store_true", help="overwrite existing files instead of autorenaming")
ap.add_argument("--version", action="store_true", help="show version and exit")
ap = app.add_argument_group("compatibility")
@@ -1148,7 +1149,7 @@ source file/folder selection uses rsync syntax, meaning that:
ap = app.add_argument_group("folder sync")
ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
ap.add_argument("--dr", action="store_true", help="delete remote files which don't exist locally")
ap.add_argument("--dr", action="store_true", help="delete remote files which don't exist locally (implies --ow)")
ap.add_argument("--drd", action="store_true", help="delete remote files during upload instead of afterwards; reduces peak disk space usage, but will reupload instead of detecting renames")
ap = app.add_argument_group("performance tweaks")
@@ -1178,6 +1179,9 @@ source file/folder selection uses rsync syntax, meaning that:
if ar.drd:
ar.dr = True
if ar.dr:
ar.ow = True
for k in "dl dr drd".split():
errs = []
if ar.safe and getattr(ar, k):

View File

@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.11.1"
pkgver="1.12.2"
pkgrel=1
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
arch=("any")
@@ -21,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("13e4a65d1854f4f95308fa91c00bd8a5f5977b3ea4fa844ed08c7e1cb1c4bf29")
sha256sums=("e4fd6733e5361f5ceb2ae950f71f65f2609c2b69d45f47e8b2a2f128fb67de0a")
build() {
cd "${srcdir}/${pkgname}-${pkgver}"

View File

@@ -1,5 +1,5 @@
{
"url": "https://github.com/9001/copyparty/releases/download/v1.11.1/copyparty-sfx.py",
"version": "1.11.1",
"hash": "sha256-q7RiaB5yo1EDTwdPeMCNFnmcNj0TsKzBsbsddMSqTH4="
"url": "https://github.com/9001/copyparty/releases/download/v1.12.2/copyparty-sfx.py",
"version": "1.12.2",
"hash": "sha256-GJts5N0leK/WHqpqb+eB1JjBvf6TRpzCc9R7AIHkujo="
}

View File

@@ -56,7 +56,6 @@ class EnvParams(object):
self.t0 = time.time()
self.mod = ""
self.cfg = ""
self.ox = getattr(sys, "oxidized", None)
E = EnvParams()

View File

@@ -157,7 +157,8 @@ def warn(msg: str) -> None:
def init_E(EE: EnvParams) -> None:
# __init__ runs 18 times when oxidized; do expensive stuff here
# some cpython alternatives (such as pyoxidizer) can
# __init__ several times, so do expensive stuff here
E = EE # pylint: disable=redefined-outer-name
@@ -190,34 +191,9 @@ def init_E(EE: EnvParams) -> None:
raise Exception("could not find a writable path for config")
def _unpack() -> str:
import atexit
import tarfile
import tempfile
from importlib.resources import open_binary
td = tempfile.TemporaryDirectory(prefix="")
atexit.register(td.cleanup)
tdn = td.name
with open_binary("copyparty", "z.tar") as tgz:
with tarfile.open(fileobj=tgz) as tf:
try:
tf.extractall(tdn, filter="tar")
except TypeError:
tf.extractall(tdn) # nosec (archive is safe)
return tdn
try:
E.mod = os.path.dirname(os.path.realpath(__file__))
if E.mod.endswith("__init__"):
E.mod = os.path.dirname(E.mod)
except:
if not E.ox:
raise
E.mod = _unpack()
E.mod = os.path.dirname(os.path.realpath(__file__))
if E.mod.endswith("__init__"):
E.mod = os.path.dirname(E.mod)
if sys.platform == "win32":
bdir = os.environ.get("APPDATA") or os.environ.get("TEMP") or "."
@@ -274,6 +250,19 @@ def get_fk_salt() -> str:
return ret.decode("utf-8")
def get_dk_salt() -> str:
fp = os.path.join(E.cfg, "dk-salt.txt")
try:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(30))
with open(fp, "wb") as f:
f.write(ret + b"\n")
return ret.decode("utf-8")
def get_ah_salt() -> str:
fp = os.path.join(E.cfg, "ah-salt.txt")
try:
@@ -867,8 +856,9 @@ def add_qr(ap, tty):
def add_fs(ap):
ap2 = ap.add_argument_group("filesystem options")
rm_re_def = "5/0.1" if ANYWIN else "0/0"
rm_re_def = "15/0.1" if ANYWIN else "0/0"
ap2.add_argument("--rm-retry", metavar="T/R", type=u, default=rm_re_def, help="if a file cannot be deleted because it is busy, continue trying for \033[33mT\033[0m seconds, retry every \033[33mR\033[0m seconds; disable with 0/0 (volflag=rm_retry)")
ap2.add_argument("--mv-retry", metavar="T/R", type=u, default=rm_re_def, help="if a file cannot be renamed because it is busy, continue trying for \033[33mT\033[0m seconds, retry every \033[33mR\033[0m seconds; disable with 0/0 (volflag=mv_retry)")
ap2.add_argument("--iobuf", metavar="BYTES", type=int, default=256*1024, help="file I/O buffer-size; if your volumes are on a network drive, try increasing to \033[32m524288\033[0m or even \033[32m4194304\033[0m (and let me know if that improves your performance)")
@@ -960,6 +950,8 @@ def add_auth(ap):
ap2.add_argument("--idp-h-grp", metavar="HN", type=u, default="", help="assume the request-header \033[33mHN\033[0m contains the groupname of the requesting user; can be referenced in config files for group-based access control")
ap2.add_argument("--idp-h-key", metavar="HN", type=u, default="", help="optional but recommended safeguard; your reverse-proxy will insert a secret header named \033[33mHN\033[0m into all requests, and the other IdP headers will be ignored if this header is not present")
ap2.add_argument("--idp-gsep", metavar="RE", type=u, default="|:;+,", help="if there are multiple groups in \033[33m--idp-h-grp\033[0m, they are separated by one of the characters in \033[33mRE\033[0m")
ap2.add_argument("--no-bauth", action="store_true", help="disable basic-authentication support; do not accept passwords from the 'Authenticate' header at all. NOTE: This breaks support for the android app")
ap2.add_argument("--bauth-last", action="store_true", help="keeps basic-authentication enabled, but only as a last-resort; if a cookie is also provided then the cookie wins")
def add_zeroconf(ap):
@@ -1099,6 +1091,8 @@ def add_optouts(ap):
ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
ap2.add_argument("--no-pipe", action="store_true", help="disable race-the-beam (lockstep download of files which are currently being uploaded) (volflag=nopipe)")
ap2.add_argument("--no-db-ip", action="store_true", help="do not write uploader IPs into the database")
def add_safety(ap):
@@ -1131,13 +1125,14 @@ def add_safety(ap):
ap2.add_argument("--acam", metavar="V[,V]", type=u, default="GET,HEAD", help="Access-Control-Allow-Methods; list of methods to accept from offsite ('*' behaves like \033[33m--acao\033[0m's description)")
def add_salt(ap, fk_salt, ah_salt):
def add_salt(ap, fk_salt, dk_salt, ah_salt):
ap2 = ap.add_argument_group('salting options')
ap2.add_argument("--ah-alg", metavar="ALG", type=u, default="none", help="account-pw hashing algorithm; one of these, best to worst: \033[32margon2 scrypt sha2 none\033[0m (each optionally followed by alg-specific comma-sep. config)")
ap2.add_argument("--ah-salt", metavar="SALT", type=u, default=ah_salt, help="account-pw salt; ignored if \033[33m--ah-alg\033[0m is none (default)")
ap2.add_argument("--ah-gen", metavar="PW", type=u, default="", help="generate hashed password for \033[33mPW\033[0m, or read passwords from STDIN if \033[33mPW\033[0m is [\033[32m-\033[0m]")
ap2.add_argument("--ah-cli", action="store_true", help="launch an interactive shell which hashes passwords without ever storing or displaying the original passwords")
ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files")
ap2.add_argument("--dk-salt", metavar="SALT", type=u, default=dk_salt, help="per-directory accesskey salt; used to generate unpredictable URLs to share folders with users who only have the 'get' permission")
ap2.add_argument("--warksalt", metavar="SALT", type=u, default="hunter2", help="up2k file-hash salt; serves no purpose, no reason to change this (but delete all databases if you do)")
@@ -1203,6 +1198,8 @@ def add_thumbnail(ap):
def add_transcoding(ap):
ap2 = ap.add_argument_group('transcoding options')
ap2.add_argument("--q-opus", metavar="KBPS", type=int, default=128, help="target bitrate for transcoding to opus; set 0 to disable")
ap2.add_argument("--q-mp3", metavar="QUALITY", type=u, default="q2", help="target quality for transcoding to mp3, for example [\033[32m192k\033[0m] (CBR) or [\033[32mq0\033[0m] (CQ/CRF, q0=maxquality, q9=smallest); set 0 to disable")
ap2.add_argument("--no-acode", action="store_true", help="disable audio transcoding")
ap2.add_argument("--no-bacode", action="store_true", help="disable batch audio transcoding by folder download (zip/tar)")
ap2.add_argument("--ac-maxage", metavar="SEC", type=int, default=86400, help="delete cached transcode output after \033[33mSEC\033[0m seconds")
@@ -1221,7 +1218,7 @@ def add_db_general(ap, hcores):
ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
ap2.add_argument("--re-dhash", action="store_true", help="force a cache rebuild on startup; enable this once if it gets out of sync (should never be necessary)")
ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice -- only useful for offloading uploads to a cloud service or something (volflag=noforget)")
ap2.add_argument("--dbd", metavar="PROFILE", default="wal", help="database durability profile; sets the tradeoff between robustness and speed, see \033[33m--help-dbd\033[0m (volflag=dbd)")
ap2.add_argument("--xlink", action="store_true", help="on upload: check all volumes for dupes, not just the target volume (volflag=xlink)")
@@ -1319,6 +1316,7 @@ def run_argparse(
cert_path = os.path.join(E.cfg, "cert.pem")
fk_salt = get_fk_salt()
dk_salt = get_dk_salt()
ah_salt = get_ah_salt()
# alpine peaks at 5 threads for some reason,
@@ -1350,7 +1348,7 @@ def run_argparse(
add_tftp(ap)
add_smb(ap)
add_safety(ap)
add_salt(ap, fk_salt, ah_salt)
add_salt(ap, fk_salt, dk_salt, ah_salt)
add_optouts(ap)
add_shutdown(ap)
add_yolo(ap)

View File

@@ -1,8 +1,8 @@
# coding: utf-8
VERSION = (1, 11, 2)
CODENAME = "You Can (Not) Proceed"
BUILD_DT = (2024, 3, 23)
VERSION = (1, 13, 0)
CODENAME = "race the beam"
BUILD_DT = (2024, 4, 20)
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -555,7 +555,12 @@ class VFS(object):
# no vfs nodes in the list of real inodes
real = [x for x in real if x[0] not in self.nodes]
dbv = self.dbv or self
for name, vn2 in sorted(self.nodes.items()):
if vn2.dbv == dbv and self.flags.get("dk"):
virt_vis[name] = vn2
continue
ok = False
zx = vn2.axs
axs = [zx.uread, zx.uwrite, zx.umove, zx.udel, zx.uget]
@@ -1681,6 +1686,20 @@ class AuthSrv(object):
vol.flags["fk"] = int(fk) if fk is not True else 8
have_fk = True
dk = vol.flags.get("dk")
dks = vol.flags.get("dks")
dky = vol.flags.get("dky")
if dks is not None and dky is not None:
t = "WARNING: volume /%s has both dks and dky enabled; this is too yolo and not permitted"
raise Exception(t % (vol.vpath,))
if dks and not dk:
dk = dks
if dky and not dk:
dk = dky
if dk:
vol.flags["dk"] = int(dk) if dk is not True else 8
if have_fk and re.match(r"^[0-9\.]+$", self.args.fk_salt):
self.log("filekey salt: {}".format(self.args.fk_salt))
@@ -1745,13 +1764,14 @@ class AuthSrv(object):
if k in vol.flags:
vol.flags[k] = float(vol.flags[k])
try:
zs1, zs2 = vol.flags["rm_retry"].split("/")
vol.flags["rm_re_t"] = float(zs1)
vol.flags["rm_re_r"] = float(zs2)
except:
t = 'volume "/%s" has invalid rm_retry [%s]'
raise Exception(t % (vol.vpath, vol.flags.get("rm_retry")))
for k in ("mv_re", "rm_re"):
try:
zs1, zs2 = vol.flags[k + "try"].split("/")
vol.flags[k + "_t"] = float(zs1)
vol.flags[k + "_r"] = float(zs2)
except:
t = 'volume "/%s" has invalid %stry [%s]'
raise Exception(t % (vol.vpath, k, vol.flags.get(k + "try")))
for k1, k2 in IMPLICATIONS:
if k1 in vol.flags:
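The `mv_retry`/`rm_retry` flags parsed in the hunk above use a `T/R` string: total seconds to keep trying, and the interval between attempts. A minimal sketch of parsing and applying such a spec, under the assumption that a busy file raises `OSError` (the helper names here are hypothetical; copyparty's real wrappers are `wrename`/`wunlink`):

```python
import os
import time

def parse_t_r(spec: str):
    """Parse a 'T/R' retry spec, e.g. '15/0.1' -> (15.0, 0.1)."""
    t, r = spec.split("/")
    return float(t), float(r)

def retry_rename(src: str, dst: str, spec: str = "0/0") -> None:
    """Rename src to dst, retrying for T seconds every R seconds if the
    file is busy. Illustrative sketch of the behavior described by the
    rm-retry/mv-retry help text; not copyparty's actual code."""
    timeout, interval = parse_t_r(spec)
    deadline = time.time() + timeout
    while True:
        try:
            os.rename(src, dst)
            return
        except OSError:
            if time.time() >= deadline:
                raise
            time.sleep(interval)
```

With the `0/0` default on non-Windows platforms, the first failure is raised immediately; the Windows default keeps retrying for a few seconds to ride out transient locks from antivirus scanners and similar.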

View File

@@ -6,7 +6,8 @@ import os
import shutil
import time
from .util import Netdev, runcmd
from .__init__ import ANYWIN
from .util import Netdev, runcmd, wrename, wunlink
HAVE_CFSSL = True
@@ -14,6 +15,12 @@ if True: # pylint: disable=using-constant-test
from .util import RootLogger
if ANYWIN:
VF = {"mv_re_t": 5, "rm_re_t": 5, "mv_re_r": 0.1, "rm_re_r": 0.1}
else:
VF = {"mv_re_t": 0, "rm_re_t": 0}
def ensure_cert(log: "RootLogger", args) -> None:
"""
the default cert (and the entire TLS support) is only here to enable the
@@ -105,8 +112,12 @@ def _gen_ca(log: "RootLogger", args):
raise Exception("failed to translate ca-cert: {}, {}".format(rc, se), 3)
bname = os.path.join(args.crt_dir, "ca")
os.rename(bname + "-key.pem", bname + ".key")
os.unlink(bname + ".csr")
try:
wunlink(log, bname + ".key", VF)
except:
pass
wrename(log, bname + "-key.pem", bname + ".key", VF)
wunlink(log, bname + ".csr", VF)
log("cert", "new ca OK", 2)
@@ -185,11 +196,11 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
bname = os.path.join(args.crt_dir, "srv")
try:
os.unlink(bname + ".key")
wunlink(log, bname + ".key", VF)
except:
pass
os.rename(bname + "-key.pem", bname + ".key")
os.unlink(bname + ".csr")
wrename(log, bname + "-key.pem", bname + ".key", VF)
wunlink(log, bname + ".csr", VF)
with open(os.path.join(args.crt_dir, "ca.pem"), "rb") as f:
ca = f.read()
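The `wrename`/`wunlink` calls above wrap the bare `os` operations in a retry loop, so a transient sharing-violation on windows (another process briefly holding the file open) no longer aborts cert generation. A minimal sketch under the `VF` defaults shown earlier (`retry_fs_op` is a hypothetical name, not the real helper):

```python
import os
import time

def retry_fs_op(op, timeout: float = 5.0, interval: float = 0.1):
    # retry a filesystem call that may fail while another process
    # briefly holds the file open (mainly a windows concern);
    # timeout=0 means a single attempt, as on non-windows platforms
    deadline = time.time() + timeout
    while True:
        try:
            return op()
        except OSError:
            if time.time() >= deadline:
                raise
            time.sleep(interval)

# usage: retry_fs_op(lambda: os.replace(src, dst))
```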


@@ -16,6 +16,7 @@ def vf_bmap() -> dict[str, str]:
"no_dedup": "copydupes",
"no_dupe": "nodupe",
"no_forget": "noforget",
"no_pipe": "nopipe",
"no_robots": "norobots",
"no_thumb": "dthumb",
"no_vthumb": "dvthumb",
@@ -63,6 +64,7 @@ def vf_vmap() -> dict[str, str]:
"lg_sbf",
"md_sbf",
"nrand",
"mv_retry",
"rm_retry",
"sort",
"unlist",
@@ -214,6 +216,7 @@ flagcats = {
"dots": "allow all users with read-access to\nenable the option to show dotfiles in listings",
"fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
"fka=8": 'generates slightly weaker per-file accesskeys,\nwhich are then required at the "g" permission;\nnot affected by filesize or inode numbers',
"mv_retry": "ms-windows: timeout for renaming busy files",
"rm_retry": "ms-windows: timeout for deleting busy files",
"davauth": "ask webdav clients to login for all folders",
"davrt": "show lastmod time of symlink destination, not the link itself\n(note: this option is always enabled for recursive listings)",


@@ -36,6 +36,7 @@ from .bos import bos
from .star import StreamTar
from .sutil import StreamArc, gfilter
from .szip import StreamZip
from .up2k import up2k_chunksize
from .util import unquote # type: ignore
from .util import (
APPLESAN_RE,
@@ -89,6 +90,7 @@ from .util import (
vjoin,
vol_san,
vsplit,
wrename,
wunlink,
yieldfile,
)
@@ -126,6 +128,7 @@ class HttpCli(object):
self.ico = conn.ico # mypy404
self.thumbcli = conn.thumbcli # mypy404
self.u2fh = conn.u2fh # mypy404
self.pipes = conn.pipes # mypy404
self.log_func = conn.log_func # mypy404
self.log_src = conn.log_src # mypy404
self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
@@ -443,7 +446,11 @@ class HttpCli(object):
zso = self.headers.get("authorization")
bauth = ""
if zso:
if (
zso
and not self.args.no_bauth
and (not cookie_pw or not self.args.bauth_last)
):
try:
zb = zso.split(" ")[1].encode("ascii")
zs = base64.b64decode(zb).decode("utf-8")
@@ -1800,7 +1807,7 @@ class HttpCli(object):
f, fn = zfw["orz"]
path2 = os.path.join(fdir, fn2)
atomic_move(path, path2)
atomic_move(self.log, path, path2, vfs.flags)
fn = fn2
path = path2
@@ -1881,7 +1888,9 @@ class HttpCli(object):
self.reply(t.encode("utf-8"), 201, headers=h)
return True
def bakflip(self, f: typing.BinaryIO, ofs: int, sz: int, sha: str) -> None:
def bakflip(
self, f: typing.BinaryIO, ofs: int, sz: int, sha: str, flags: dict[str, Any]
) -> None:
if not self.args.bak_flips or self.args.nw:
return
@@ -1909,7 +1918,7 @@ class HttpCli(object):
if nrem:
self.log("bakflip truncated; {} remains".format(nrem), 1)
atomic_move(fp, fp + ".trunc")
atomic_move(self.log, fp, fp + ".trunc", flags)
else:
self.log("bakflip ok", 2)
@@ -1966,7 +1975,12 @@ class HttpCli(object):
v = self.uparam[k]
vn, rem = self.asrv.vfs.get(self.vpath, self.uname, True, False)
if self._use_dirkey():
vn = self.vn
rem = self.rem
else:
vn, rem = self.asrv.vfs.get(self.vpath, self.uname, True, False)
zs = self.parser.require("files", 1024 * 1024)
if not zs:
raise Pebkac(422, "need files list")
@@ -2170,7 +2184,7 @@ class HttpCli(object):
if sha_b64 != chash:
try:
self.bakflip(f, cstart[0], post_sz, sha_b64)
self.bakflip(f, cstart[0], post_sz, sha_b64, vfs.flags)
except:
self.log("bakflip failed: " + min_ex())
@@ -2445,7 +2459,7 @@ class HttpCli(object):
self.log("user not allowed to overwrite with ?replace")
elif bos.path.exists(abspath):
try:
bos.unlink(abspath)
wunlink(self.log, abspath, vfs.flags)
t = "overwriting file with new upload: %s"
except:
t = "toctou while deleting for ?replace: %s"
@@ -2522,7 +2536,7 @@ class HttpCli(object):
raise
if not nullwrite:
atomic_move(tabspath, abspath)
atomic_move(self.log, tabspath, abspath, vfs.flags)
tabspath = ""
@@ -2762,7 +2776,7 @@ class HttpCli(object):
hidedir(dp)
except:
pass
bos.rename(fp, os.path.join(mdir, ".hist", mfile2))
wrename(self.log, fp, os.path.join(mdir, ".hist", mfile2), vfs.flags)
assert self.parser.gen
p_field, _, p_data = next(self.parser.gen)
@@ -2870,6 +2884,30 @@ class HttpCli(object):
return file_lastmod, True
def _use_dirkey(self, ap: str = "") -> bool:
if self.can_read or not self.can_get:
return False
if self.vn.flags.get("dky"):
return True
req = self.uparam.get("k") or ""
if not req:
return False
dk_len = self.vn.flags.get("dk")
if not dk_len:
return False
ap = ap or self.vn.canonical(self.rem)
zs = self.gen_fk(2, self.args.dk_salt, ap, 0, 0)[:dk_len]
if req == zs:
return True
t = "wrong dirkey, want %s, got %s\n vp: %s\n ap: %s"
self.log(t % (zs, req, self.req, ap), 6)
return False
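`_use_dirkey` compares the `?k=` parameter against a key derived from the directory's absolute path, truncated to the volume's `dk` length. A minimal sketch, assuming the key is a salted sha512 digest like the filekeys elsewhere in the codebase (`gen_dirkey` and `check_dirkey` are hypothetical names; the real derivation is `gen_fk`):

```python
import base64
import hashlib

def gen_dirkey(salt: str, ap: str, dk_len: int = 8) -> str:
    # salted digest of the directory's absolute path,
    # urlsafe-base64'd and truncated to the configured "dk" length
    h = hashlib.sha512((salt + "\n" + ap).encode("utf-8")).digest()
    return base64.urlsafe_b64encode(h).decode("ascii")[:dk_len]

def check_dirkey(req: str, salt: str, ap: str, dk_len: int) -> bool:
    # an empty or wrong ?k= simply fails the comparison
    return bool(req) and req == gen_dirkey(salt, ap, dk_len)
```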
def _expand(self, txt: str, phs: list[str]) -> str:
for ph in phs:
if ph.startswith("hdr."):
@@ -2893,17 +2931,42 @@ class HttpCli(object):
return txt
def tx_file(self, req_path: str) -> bool:
def tx_file(self, req_path: str, ptop: Optional[str] = None) -> bool:
status = 200
logmsg = "{:4} {} ".format("", self.req)
logtail = ""
if ptop is not None:
try:
dp, fn = os.path.split(req_path)
tnam = fn + ".PARTIAL"
if self.args.dotpart:
tnam = "." + tnam
ap_data = os.path.join(dp, tnam)
st_data = bos.stat(ap_data)
if not st_data.st_size:
raise Exception("partial is empty")
x = self.conn.hsrv.broker.ask("up2k.find_job_by_ap", ptop, req_path)
job = json.loads(x.get())
if not job:
raise Exception("not found in registry")
self.pipes.set(req_path, job)
except Exception as ex:
self.log("will not pipe [%s]; %s" % (ap_data, ex), 6)
ptop = None
#
# if request is for foo.js, check if we have foo.js.gz
file_ts = 0.0
editions: dict[str, tuple[str, int]] = {}
for ext in ("", ".gz"):
if ptop is not None:
sz = job["size"]
file_ts = job["lmod"]
editions["plain"] = (ap_data, sz)
break
try:
fs_path = req_path + ext
st = bos.stat(fs_path)
@@ -3060,6 +3123,11 @@ class HttpCli(object):
self.send_headers(length=upper - lower, status=status, mime=mime)
return True
if ptop is not None:
return self.tx_pipe(
ptop, req_path, ap_data, job, lower, upper, status, mime, logmsg
)
ret = True
with open_func(*open_args) as f:
self.send_headers(length=upper - lower, status=status, mime=mime)
@@ -3079,6 +3147,143 @@ class HttpCli(object):
return ret
def tx_pipe(
self,
ptop: str,
req_path: str,
ap_data: str,
job: dict[str, Any],
lower: int,
upper: int,
status: int,
mime: str,
logmsg: str,
) -> bool:
M = 1048576
self.send_headers(length=upper - lower, status=status, mime=mime)
wr_slp = self.args.s_wr_slp
wr_sz = self.args.s_wr_sz
file_size = job["size"]
chunk_size = up2k_chunksize(file_size)
num_need = -1
data_end = 0
remains = upper - lower
broken = False
spins = 0
tier = 0
tiers = ["uncapped", "reduced speed", "one byte per sec"]
while lower < upper and not broken:
with self.u2mutex:
job = self.pipes.get(req_path)
if not job:
x = self.conn.hsrv.broker.ask("up2k.find_job_by_ap", ptop, req_path)
job = json.loads(x.get())
if job:
self.pipes.set(req_path, job)
if not job:
t = "pipe: OK, upload has finished; yeeting remainder"
self.log(t, 2)
data_end = file_size
break
if num_need != len(job["need"]):
num_need = len(job["need"])
data_end = 0
for cid in job["hash"]:
if cid in job["need"]:
break
data_end += chunk_size
t = "pipe: can stream %.2f MiB; requested range is %.2f to %.2f"
self.log(t % (data_end / M, lower / M, upper / M), 6)
with self.u2mutex:
if data_end > self.u2fh.aps.get(ap_data, data_end):
try:
fhs = self.u2fh.cache[ap_data].all_fhs
for fh in fhs:
fh.flush()
self.u2fh.aps[ap_data] = data_end
self.log("pipe: flushed %d up2k-FDs" % (len(fhs),))
except Exception as ex:
self.log("pipe: u2fh flush failed: %r" % (ex,))
if lower >= data_end:
if data_end:
t = "pipe: uploader is too slow; aborting download at %.2f MiB"
self.log(t % (data_end / M))
raise Pebkac(416, "uploader is too slow")
raise Pebkac(416, "no data available yet; please retry in a bit")
slack = data_end - lower
if slack >= 8 * M:
ntier = 0
winsz = M
bufsz = wr_sz
slp = wr_slp
else:
winsz = max(40, int(M * (slack / (12 * M))))
base_rate = M if not wr_slp else wr_sz / wr_slp
if winsz > base_rate:
ntier = 0
bufsz = wr_sz
slp = wr_slp
elif winsz > 300:
ntier = 1
bufsz = winsz // 5
slp = 0.2
else:
ntier = 2
bufsz = winsz = slp = 1
if tier != ntier:
tier = ntier
self.log("moved to tier %d (%s)" % (tier, tiers[tier]))
try:
with open(ap_data, "rb", self.args.iobuf) as f:
f.seek(lower)
page = f.read(min(winsz, data_end - lower, upper - lower))
if not page:
raise Exception("got 0 bytes (EOF?)")
except Exception as ex:
self.log("pipe: read failed at %.2f MiB: %s" % (lower / M, ex), 3)
with self.u2mutex:
self.pipes.c.pop(req_path, None)
spins += 1
if spins > 3:
raise Pebkac(500, "file became unreadable")
time.sleep(2)
continue
spins = 0
pofs = 0
while pofs < len(page):
if slp:
time.sleep(slp)
try:
buf = page[pofs : pofs + bufsz]
self.s.sendall(buf)
zi = len(buf)
remains -= zi
lower += zi
pofs += zi
except:
broken = True
break
if lower < upper and not broken:
with open(req_path, "rb") as f:
remains = sendfile_py(self.log, lower, upper, f, self.s, wr_sz, wr_slp)
spd = self._spd((upper - lower) - remains)
if self.do_log:
self.log("{}, {}".format(logmsg, spd))
return not broken
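`tx_pipe` above does two calculations each iteration: how much contiguous data is safe to stream (every chunk before the first one the uploader still `need`s), and how hard to taper the send rate as the reader closes in on that boundary, down to one byte per second so the TCP connection stays alive. A condensed sketch of both (function names are hypothetical; `chunk_size` comes from `up2k_chunksize` in the real code):

```python
M = 1048576  # one MiB

def streamable_end(hashes: list, need: set, chunk_size: int) -> int:
    # bytes that can be piped: every chunk before the first missing one
    end = 0
    for cid in hashes:
        if cid in need:
            break
        end += chunk_size
    return end

def pick_tier(slack: int, wr_sz: int = 262144, wr_slp: float = 0.0):
    # returns (tier, read-window, send-bufsize, sleep-per-send);
    # tier 0 = uncapped, 1 = reduced speed, 2 = one byte per sec
    if slack >= 8 * M:
        return 0, M, wr_sz, wr_slp
    winsz = max(40, int(M * (slack / (12 * M))))
    base_rate = M if not wr_slp else wr_sz / wr_slp
    if winsz > base_rate:
        return 0, winsz, wr_sz, wr_slp
    if winsz > 300:
        return 1, winsz, winsz // 5, 0.2
    return 2, 1, 1, 1
```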
def tx_zip(
self,
fmt: str,
@@ -3148,7 +3353,7 @@ class HttpCli(object):
# for f in fgen: print(repr({k: f[k] for k in ["vp", "ap"]}))
cfmt = ""
if self.thumbcli and not self.args.no_bacode:
for zs in ("opus", "w", "j"):
for zs in ("opus", "mp3", "w", "j"):
if zs in self.ouparam or uarg == zs:
cfmt = zs
@@ -3557,7 +3762,7 @@ class HttpCli(object):
dst = dst[len(top) + 1 :]
ret = self.gen_tree(top, dst)
ret = self.gen_tree(top, dst, self.uparam.get("k", ""))
if self.is_vproxied:
parents = self.args.R.split("/")
for parent in reversed(parents):
@@ -3567,18 +3772,25 @@ class HttpCli(object):
self.reply(zs.encode("utf-8"), mime="application/json")
return True
def gen_tree(self, top: str, target: str) -> dict[str, Any]:
def gen_tree(self, top: str, target: str, dk: str) -> dict[str, Any]:
ret: dict[str, Any] = {}
excl = None
if target:
excl, target = (target.split("/", 1) + [""])[:2]
sub = self.gen_tree("/".join([top, excl]).strip("/"), target)
sub = self.gen_tree("/".join([top, excl]).strip("/"), target, dk)
ret["k" + quotep(excl)] = sub
vfs = self.asrv.vfs
dk_sz = False
if dk:
vn, rem = vfs.get(top, self.uname, False, False)
if vn.flags.get("dks") and self._use_dirkey(vn.canonical(rem)):
dk_sz = vn.flags.get("dk")
dots = False
fsroot = ""
try:
vn, rem = vfs.get(top, self.uname, True, False)
vn, rem = vfs.get(top, self.uname, not dk_sz, False)
fsroot, vfs_ls, vfs_virt = vn.ls(
rem,
self.uname,
@@ -3586,7 +3798,9 @@ class HttpCli(object):
[[True, False], [False, True]],
)
dots = self.uname in vn.axs.udot
dk_sz = vn.flags.get("dk")
except:
dk_sz = None
vfs_ls = []
vfs_virt = {}
for v in self.rvol:
@@ -3601,6 +3815,14 @@ class HttpCli(object):
dirs = [quotep(x) for x in dirs if x != excl]
if dk_sz and fsroot:
kdirs = []
for dn in dirs:
ap = os.path.join(fsroot, dn)
zs = self.gen_fk(2, self.args.dk_salt, ap, 0, 0)[:dk_sz]
kdirs.append(dn + "?k=" + zs)
dirs = kdirs
for x in vfs_virt:
if x != excl:
try:
@@ -3699,7 +3921,7 @@ class HttpCli(object):
if not allvols:
ret = [{"kinshi": 1}]
jtxt = '{"u":%s,"c":%s}' % (uret, json.dumps(ret, indent=0))
jtxt = '{"u":%s,"c":%s}' % (uret, json.dumps(ret, separators=(",\n", ": ")))
zi = len(uret.split('\n"pd":')) - 1
self.log("%s #%d+%d %.2fsec" % (lm, zi, len(ret), time.time() - t0))
self.reply(jtxt.encode("utf-8", "replace"), mime="application/json")
@@ -3865,6 +4087,7 @@ class HttpCli(object):
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
is_dir = stat.S_ISDIR(st.st_mode)
is_dk = False
fk_pass = False
icur = None
if is_dir and (e2t or e2d):
@@ -3873,7 +4096,7 @@ class HttpCli(object):
icur = idx.get_cur(dbv.realpath)
th_fmt = self.uparam.get("th")
if self.can_read:
if self.can_read or (self.can_get and vn.flags.get("dk")):
if th_fmt is not None:
nothumb = "dthumb" in dbv.flags
if is_dir:
@@ -3977,10 +4200,15 @@ class HttpCli(object):
):
return self.tx_md(vn, abspath)
return self.tx_file(abspath)
return self.tx_file(
abspath, None if st.st_size or "nopipe" in vn.flags else vn.realpath
)
elif is_dir and not self.can_read and not self.can_write:
return self.tx_404(True)
elif is_dir and not self.can_read:
if self._use_dirkey(abspath):
is_dk = True
elif not self.can_write:
return self.tx_404(True)
srv_info = []
@@ -4002,7 +4230,7 @@ class HttpCli(object):
srv_infot = "</span> // <span>".join(srv_info)
perms = []
if self.can_read:
if self.can_read or is_dk:
perms.append("read")
if self.can_write:
perms.append("write")
@@ -4130,7 +4358,7 @@ class HttpCli(object):
if not self.conn.hsrv.prism:
j2a["no_prism"] = True
if not self.can_read:
if not self.can_read and not is_dk:
if is_ls:
return self.tx_ls(ls_ret)
@@ -4183,8 +4411,15 @@ class HttpCli(object):
):
ls_names = exclude_dotfiles(ls_names)
add_dk = vf.get("dk")
add_fk = vf.get("fk")
fk_alg = 2 if "fka" in vf else 1
if add_dk:
if vf.get("dky"):
add_dk = False
else:
zs = self.gen_fk(2, self.args.dk_salt, abspath, 0, 0)[:add_dk]
ls_ret["dk"] = cgv["dk"] = zs
dirs = []
files = []
@@ -4212,6 +4447,12 @@ class HttpCli(object):
href += "/"
if self.args.no_zip:
margin = "DIR"
elif add_dk:
zs = absreal(fspath)
margin = '<a href="%s?k=%s&zip" rel="nofollow">zip</a>' % (
quotep(href),
self.gen_fk(2, self.args.dk_salt, zs, 0, 0)[:add_dk],
)
else:
margin = '<a href="%s?zip" rel="nofollow">zip</a>' % (quotep(href),)
elif fn in hist:
@@ -4252,6 +4493,11 @@ class HttpCli(object):
0 if ANYWIN else inf.st_ino,
)[:add_fk],
)
elif add_dk and is_dir:
href = "%s?k=%s" % (
quotep(href),
self.gen_fk(2, self.args.dk_salt, fspath, 0, 0)[:add_dk],
)
else:
href = quotep(href)
@@ -4270,6 +4516,9 @@ class HttpCli(object):
files.append(item)
item["rd"] = rem
if is_dk and not vf.get("dks"):
dirs = []
if (
self.cookies.get("idxh") == "y"
and "ls" not in self.uparam


@@ -55,6 +55,7 @@ class HttpConn(object):
self.E: EnvParams = self.args.E
self.asrv: AuthSrv = hsrv.asrv # mypy404
self.u2fh: Util.FHC = hsrv.u2fh # mypy404
self.pipes: Util.CachedDict = hsrv.pipes # mypy404
self.ipa_nm: Optional[NetMap] = hsrv.ipa_nm
self.xff_nm: Optional[NetMap] = hsrv.xff_nm
self.xff_lan: NetMap = hsrv.xff_lan # type: ignore


@@ -61,6 +61,7 @@ from .u2idx import U2idx
from .util import (
E_SCK,
FHC,
CachedDict,
Daemon,
Garda,
Magician,
@@ -130,6 +131,7 @@ class HttpSrv(object):
self.t_periodic: Optional[threading.Thread] = None
self.u2fh = FHC()
self.pipes = CachedDict(0.2)
self.metrics = Metrics(self)
self.nreq = 0
self.nsus = 0
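`pipes` is a short-TTL cache (0.2s here) of registry lookups, so `tx_pipe` doesn't have to ask the broker on every loop iteration. A minimal sketch of such a dict, assuming an interface like the `set`/`get` plus the raw backing dict `c` seen in this diff:

```python
import time

class CachedDict:
    # dict whose entries expire after `maxage` seconds;
    # `c` maps key -> (expiry-timestamp, value)
    def __init__(self, maxage: float) -> None:
        self.maxage = maxage
        self.c: dict = {}

    def set(self, k, v) -> None:
        self.c[k] = (time.time() + self.maxage, v)

    def get(self, k):
        zi = self.c.get(k)
        if zi and zi[0] > time.time():
            return zi[1]
        self.c.pop(k, None)  # expired or absent; drop it
        return None
```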


@@ -551,8 +551,7 @@ class MTag(object):
pypath = str(os.pathsep.join(zsl))
env["PYTHONPATH"] = pypath
except:
if not E.ox and not EXE:
raise
raise # might be expected outside cpython
ret: dict[str, Any] = {}
for tagname, parser in sorted(parsers.items(), key=lambda x: (x[1].pri, x[0])):


@@ -206,6 +206,7 @@ class MCast(object):
except:
t = "announce failed on {} [{}]:\n{}"
self.log(t.format(netdev, ip, min_ex()), 3)
sck.close()
if self.args.zm_msub:
for s1 in self.srv.values():


@@ -81,7 +81,9 @@ def enthumb(
) -> dict[str, Any]:
rem = f["vp"]
ext = rem.rsplit(".", 1)[-1].lower()
if fmt == "opus" and ext in "aac|m4a|mp3|ogg|opus|wma".split("|"):
if (fmt == "mp3" and ext == "mp3") or (
fmt == "opus" and ext in "aac|m4a|mp3|ogg|opus|wma".split("|")
):
raise Exception()
vp = vjoin(vtop, rem.split("/", 1)[1])


@@ -276,6 +276,11 @@ class SvcHub(object):
if want_ff and ANYWIN:
self.log("thumb", "download FFmpeg to fix it:\033[0m " + FFMPEG_URL, 3)
if not args.no_acode:
if not re.match("^(0|[qv][0-9]|[0-9]{2,3}k)$", args.q_mp3.lower()):
t = "invalid mp3 transcoding quality [%s] specified; only supports [0] to disable, a CBR value such as [192k], or a CQ/CRF value such as [v2]"
raise Exception(t % (args.q_mp3,))
args.th_poke = min(args.th_poke, args.th_maxage, args.ac_maxage)
zms = ""
@@ -419,6 +424,12 @@ class SvcHub(object):
t = "WARNING: found config files in [%s]: %s\n config files are not expected here, and will NOT be loaded (unless your setup is intentionally hella funky)"
self.log("root", t % (E.cfg, ", ".join(hits)), 3)
if self.args.no_bauth:
t = "WARNING: --no-bauth disables support for the Android app; you may want to use --bauth-last instead"
self.log("root", t, 3)
if self.args.bauth_last:
self.log("root", "WARNING: ignoring --bauth-last due to --no-bauth", 3)
def _process_config(self) -> bool:
al = self.args
@@ -539,6 +550,13 @@ class SvcHub(object):
except:
raise Exception("invalid --rm-retry [%s]" % (self.args.rm_retry,))
try:
zf1, zf2 = self.args.mv_retry.split("/")
self.args.mv_re_t = float(zf1)
self.args.mv_re_r = float(zf2)
except:
raise Exception("invalid --mv-retry [%s]" % (self.args.mv_retry,))
return True
def _ipa2re(self, txt) -> Optional[re.Pattern]:


@@ -57,7 +57,7 @@ class ThumbCli(object):
if is_vid and "dvthumb" in dbv.flags:
return None
want_opus = fmt in ("opus", "caf")
want_opus = fmt in ("opus", "caf", "mp3")
is_au = ext in self.fmt_ffa
if is_au:
if want_opus:


@@ -28,6 +28,7 @@ from .util import (
runcmd,
statdir,
vsplit,
wrename,
wunlink,
)
@@ -109,7 +110,7 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -
h = hashlib.sha512(afsenc(fn)).digest()
fn = base64.urlsafe_b64encode(h).decode("ascii")[:24]
if fmt in ("opus", "caf"):
if fmt in ("opus", "caf", "mp3"):
cat = "ac"
else:
fc = fmt[:1]
@@ -307,6 +308,8 @@ class ThumbSrv(object):
elif lib == "ff" and ext in self.fmt_ffa:
if tpath.endswith(".opus") or tpath.endswith(".caf"):
funs.append(self.conv_opus)
elif tpath.endswith(".mp3"):
funs.append(self.conv_mp3)
elif tpath.endswith(".png"):
funs.append(self.conv_waves)
png_ok = True
@@ -344,7 +347,7 @@ class ThumbSrv(object):
pass
try:
bos.rename(ttpath, tpath)
wrename(self.log, ttpath, tpath, vn.flags)
except:
pass
@@ -637,8 +640,47 @@ class ThumbSrv(object):
cmd += [fsenc(tpath)]
self._run_ff(cmd, vn)
def conv_mp3(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
quality = self.args.q_mp3.lower()
if self.args.no_acode or not quality:
raise Exception("disabled in server config")
self.wait4ram(0.2, tpath)
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
if "ac" not in ret:
raise Exception("not audio")
if quality.endswith("k"):
qk = b"-b:a"
qv = quality.encode("ascii")
else:
qk = b"-q:a"
qv = quality[1:].encode("ascii")
# extremely conservative choices for output format
# (always 2ch 44k1) because if a device is old enough
# to not support opus then it's probably also super picky
# fmt: off
cmd = [
b"ffmpeg",
b"-nostdin",
b"-v", b"error",
b"-hide_banner",
b"-i", fsenc(abspath),
b"-map_metadata", b"-1",
b"-map", b"0:a:0",
b"-ar", b"44100",
b"-ac", b"2",
b"-c:a", b"libmp3lame",
qk, qv,
fsenc(tpath)
]
# fmt: on
self._run_ff(cmd, vn, oom=300)
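`conv_mp3` maps the quality string onto the matching libmp3lame flag: CBR values like `"192k"` keep the whole string as `-b:a`, while VBR/CQ values drop the leading letter and become `-q:a`. The selection in isolation (`mp3_quality_args` is a hypothetical name):

```python
def mp3_quality_args(quality: str) -> list:
    q = quality.lower()
    if q.endswith("k"):
        return ["-b:a", q]    # constant bitrate, e.g. "192k"
    return ["-q:a", q[1:]]    # VBR/CQ, e.g. "v2" -> lame quality 2
```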
def conv_opus(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
if self.args.no_acode:
if self.args.no_acode or not self.args.q_opus:
raise Exception("disabled in server config")
self.wait4ram(0.2, tpath)
@@ -662,6 +704,7 @@ class ThumbSrv(object):
pass
caf_src = abspath if src_opus else tmp_opus
bq = ("%dk" % (self.args.q_opus,)).encode("ascii")
if not want_caf or not src_opus:
# fmt: off
@@ -674,7 +717,7 @@ class ThumbSrv(object):
b"-map_metadata", b"-1",
b"-map", b"0:a:0",
b"-c:a", b"libopus",
b"-b:a", b"128k",
b"-b:a", bq,
fsenc(tmp_opus)
]
# fmt: on
@@ -697,7 +740,7 @@ class ThumbSrv(object):
b"-map_metadata", b"-1",
b"-ac", b"2",
b"-c:a", b"libopus",
b"-b:a", b"128k",
b"-b:a", bq,
b"-f", b"caf",
fsenc(tpath)
]
@@ -771,7 +814,7 @@ class ThumbSrv(object):
def _clean(self, cat: str, thumbpath: str) -> int:
# self.log("cln {}".format(thumbpath))
exts = ["jpg", "webp", "png"] if cat == "th" else ["opus", "caf"]
exts = ["jpg", "webp", "png"] if cat == "th" else ["opus", "caf", "mp3"]
maxage = getattr(self.args, cat + "_maxage")
now = time.time()
prev_b64 = None


@@ -91,6 +91,9 @@ CV_EXTS = set(zsg.split(","))
HINT_HISTPATH = "you could try moving the database to another location (preferably an SSD or NVME drive) using either the --hist argument (global option for all volumes), or the hist volflag (just for this volume)"
VF_CAREFUL = {"mv_re_t": 5, "rm_re_t": 5, "mv_re_r": 0.1, "rm_re_r": 0.1}
class Dbw(object):
def __init__(self, c: "sqlite3.Cursor", n: int, t: float) -> None:
self.c = c
@@ -136,6 +139,7 @@ class Up2k(object):
self.need_rescan: set[str] = set()
self.db_act = 0.0
self.reg_mutex = threading.Lock()
self.registry: dict[str, dict[str, dict[str, Any]]] = {}
self.flags: dict[str, dict[str, Any]] = {}
self.droppable: dict[str, list[str]] = {}
@@ -143,7 +147,7 @@ class Up2k(object):
self.volsize: dict["sqlite3.Cursor", int] = {}
self.volstate: dict[str, str] = {}
self.vol_act: dict[str, float] = {}
self.busy_aps: set[str] = set()
self.busy_aps: dict[str, int] = {}
self.dupesched: dict[str, list[tuple[str, str, float]]] = {}
self.snap_prev: dict[str, Optional[tuple[int, float]]] = {}
@@ -200,11 +204,15 @@ class Up2k(object):
Daemon(self.deferred_init, "up2k-deferred-init")
def reload(self, rescan_all_vols: bool) -> None:
"""mutex me"""
"""mutex(main) me"""
self.log("reload #{} scheduled".format(self.gid + 1))
all_vols = self.asrv.vfs.all_vols
scan_vols = [k for k, v in all_vols.items() if v.realpath not in self.registry]
with self.reg_mutex:
scan_vols = [
k for k, v in all_vols.items() if v.realpath not in self.registry
]
if rescan_all_vols:
scan_vols = list(all_vols.keys())
@@ -217,7 +225,7 @@ class Up2k(object):
if self.stop:
# up-mt consistency not guaranteed if init is interrupted;
# drop caches for a full scan on next boot
with self.mutex:
with self.mutex, self.reg_mutex:
self._drop_caches()
if self.pp:
@@ -283,10 +291,27 @@ class Up2k(object):
min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
),
}
return json.dumps(ret, indent=4)
return json.dumps(ret, separators=(",\n", ": "))
def find_job_by_ap(self, ptop: str, ap: str) -> str:
try:
if ANYWIN:
ap = ap.replace("\\", "/")
vp = ap[len(ptop) :].strip("/")
dn, fn = vsplit(vp)
with self.reg_mutex:
tab2 = self.registry[ptop]
for job in tab2.values():
if job["prel"] == dn and job["name"] == fn:
return json.dumps(job, separators=(",\n", ": "))
except:
pass
return "{}"
def get_unfinished_by_user(self, uname, ip) -> str:
if PY2 or not self.mutex.acquire(timeout=2):
if PY2 or not self.reg_mutex.acquire(timeout=2):
return '[{"timeout":1}]'
ret: list[tuple[int, str, int, int, int]] = []
@@ -315,17 +340,25 @@ class Up2k(object):
)
ret.append(zt5)
finally:
self.mutex.release()
self.reg_mutex.release()
if ANYWIN:
ret = [(x[0], x[1].replace("\\", "/"), x[2], x[3], x[4]) for x in ret]
ret.sort(reverse=True)
ret2 = [
{"at": at, "vp": "/" + vp, "pd": 100 - ((nn * 100) // (nh or 1)), "sz": sz}
{
"at": at,
"vp": "/" + quotep(vp),
"pd": 100 - ((nn * 100) // (nh or 1)),
"sz": sz,
}
for (at, vp, sz, nn, nh) in ret
]
return json.dumps(ret2, indent=0)
return json.dumps(ret2, separators=(",\n", ": "))
def get_unfinished(self) -> str:
if PY2 or not self.mutex.acquire(timeout=0.5):
if PY2 or not self.reg_mutex.acquire(timeout=0.5):
return ""
ret: dict[str, tuple[int, int]] = {}
@@ -347,17 +380,17 @@ class Up2k(object):
ret[ptop] = (nbytes, nfiles)
finally:
self.mutex.release()
self.reg_mutex.release()
return json.dumps(ret, indent=4)
return json.dumps(ret, separators=(",\n", ": "))
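Several of these methods switch from `indent=4` to `separators=(",\n", ": ")`: the output still keeps one entry per line, but drops the leading indentation bytes, which adds up for large registries. A quick comparison:

```python
import json

d = {"a": 1, "b": 2}
pretty = json.dumps(d, indent=4)
compact = json.dumps(d, separators=(",\n", ": "))
# compact parses identically but is smaller on the wire
```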
def get_volsize(self, ptop: str) -> tuple[int, int]:
with self.mutex:
with self.reg_mutex:
return self._get_volsize(ptop)
def get_volsizes(self, ptops: list[str]) -> list[tuple[int, int]]:
ret = []
with self.mutex:
with self.reg_mutex:
for ptop in ptops:
ret.append(self._get_volsize(ptop))
@@ -385,7 +418,7 @@ class Up2k(object):
def _rescan(
self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool
) -> str:
"""mutex me"""
"""mutex(main) me"""
if not wait and self.pp:
return "cannot initiate; scan is already in progress"
@@ -667,7 +700,7 @@ class Up2k(object):
self.log(msg, c=3)
live_vols = []
with self.mutex:
with self.mutex, self.reg_mutex:
# only need to protect register_vpath but all in one go feels right
for vol in vols:
try:
@@ -709,7 +742,7 @@ class Up2k(object):
if self.args.re_dhash or [zv for zv in vols if "e2tsr" in zv.flags]:
self.args.re_dhash = False
with self.mutex:
with self.mutex, self.reg_mutex:
self._drop_caches()
for vol in vols:
@@ -786,7 +819,9 @@ class Up2k(object):
self.volstate[vol.vpath] = "online (mtp soon)"
for vol in need_vac:
reg = self.register_vpath(vol.realpath, vol.flags)
with self.mutex, self.reg_mutex:
reg = self.register_vpath(vol.realpath, vol.flags)
assert reg
cur, _ = reg
with self.mutex:
@@ -800,7 +835,9 @@ class Up2k(object):
if vol.flags["dbd"] == "acid":
continue
reg = self.register_vpath(vol.realpath, vol.flags)
with self.mutex, self.reg_mutex:
reg = self.register_vpath(vol.realpath, vol.flags)
try:
assert reg
cur, db_path = reg
@@ -847,6 +884,7 @@ class Up2k(object):
def register_vpath(
self, ptop: str, flags: dict[str, Any]
) -> Optional[tuple["sqlite3.Cursor", str]]:
"""mutex(main,reg) me"""
histpath = self.asrv.vfs.histtab.get(ptop)
if not histpath:
self.log("no histpath for [{}]".format(ptop))
@@ -869,7 +907,7 @@ class Up2k(object):
ft = "\033[0;32m{}{:.0}"
ff = "\033[0;35m{}{:.0}"
fv = "\033[0;36m{}:\033[90m{}"
fx = set(("html_head", "rm_re_t", "rm_re_r"))
fx = set(("html_head", "rm_re_t", "rm_re_r", "mv_re_t", "mv_re_r"))
fd = vf_bmap()
fd.update(vf_cmap())
fd.update(vf_vmap())
@@ -1030,7 +1068,9 @@ class Up2k(object):
dev = cst.st_dev if vol.flags.get("xdev") else 0
with self.mutex:
reg = self.register_vpath(top, vol.flags)
with self.reg_mutex:
reg = self.register_vpath(top, vol.flags)
assert reg and self.pp
cur, db_path = reg
@@ -1627,7 +1667,7 @@ class Up2k(object):
def _build_tags_index(self, vol: VFS) -> tuple[int, int, bool]:
ptop = vol.realpath
with self.mutex:
with self.mutex, self.reg_mutex:
reg = self.register_vpath(ptop, vol.flags)
assert reg and self.pp
@@ -1648,6 +1688,7 @@ class Up2k(object):
return ret
def _drop_caches(self) -> None:
"""mutex(main,reg) me"""
self.log("dropping caches for a full filesystem scan")
for vol in self.asrv.vfs.all_vols.values():
reg = self.register_vpath(vol.realpath, vol.flags)
@@ -1823,7 +1864,7 @@ class Up2k(object):
params: tuple[Any, ...],
flt: int,
) -> tuple[tempfile.SpooledTemporaryFile[bytes], int]:
"""mutex me"""
"""mutex(main) me"""
n = 0
c2 = cur.connection.cursor()
tf = tempfile.SpooledTemporaryFile(1024 * 1024 * 8, "w+b", prefix="cpp-tq-")
@@ -2157,7 +2198,7 @@ class Up2k(object):
ip: str,
at: float,
) -> int:
"""will mutex"""
"""will mutex(main)"""
assert self.mtag
try:
@@ -2189,7 +2230,7 @@ class Up2k(object):
abspath: str,
tags: dict[str, Union[str, float]],
) -> int:
"""mutex me"""
"""mutex(main) me"""
assert self.mtag
if not bos.path.isfile(abspath):
@@ -2474,28 +2515,36 @@ class Up2k(object):
cur.connection.commit()
def _job_volchk(self, cj: dict[str, Any]) -> None:
if not self.register_vpath(cj["ptop"], cj["vcfg"]):
if cj["ptop"] not in self.registry:
raise Pebkac(410, "location unavailable")
def handle_json(self, cj: dict[str, Any], busy_aps: set[str]) -> dict[str, Any]:
def handle_json(
self, cj: dict[str, Any], busy_aps: dict[str, int]
) -> dict[str, Any]:
# busy_aps is u2fh (always undefined if -j0) so this is safe
self.busy_aps = busy_aps
got_lock = False
try:
# bit expensive; 3.9=10x 3.11=2x
if self.mutex.acquire(timeout=10):
self._job_volchk(cj)
self.mutex.release()
got_lock = True
with self.reg_mutex:
return self._handle_json(cj)
else:
t = "cannot receive uploads right now;\nserver busy with {}.\nPlease wait; the client will retry..."
raise Pebkac(503, t.format(self.blocked or "[unknown]"))
except TypeError:
if not PY2:
raise
with self.mutex:
self._job_volchk(cj)
with self.mutex, self.reg_mutex:
return self._handle_json(cj)
finally:
if got_lock:
self.mutex.release()
def _handle_json(self, cj: dict[str, Any]) -> dict[str, Any]:
ptop = cj["ptop"]
if not self.register_vpath(ptop, cj["vcfg"]):
if ptop not in self.registry:
raise Pebkac(410, "location unavailable")
cj["name"] = sanitize_fn(cj["name"], "", [".prologue.html", ".epilogue.html"])
cj["poke"] = now = self.db_act = self.vol_act[ptop] = time.time()
wark = self._get_wark(cj)
@@ -2510,7 +2559,7 @@ class Up2k(object):
# refuse out-of-order / multithreaded uploading if sprs False
sprs = self.fstab.get(pdir) != "ng"
with self.mutex:
if True:
jcur = self.cur.get(ptop)
reg = self.registry[ptop]
vfs = self.asrv.vfs.all_vols[cj["vtop"]]
@@ -2948,7 +2997,7 @@ class Up2k(object):
def handle_chunk(
self, ptop: str, wark: str, chash: str
) -> tuple[int, list[int], str, float, bool]:
with self.mutex:
with self.mutex, self.reg_mutex:
self.db_act = self.vol_act[ptop] = time.time()
job = self.registry[ptop].get(wark)
if not job:
@@ -2991,7 +3040,7 @@ class Up2k(object):
return chunksize, ofs, path, job["lmod"], job["sprs"]
def release_chunk(self, ptop: str, wark: str, chash: str) -> bool:
with self.mutex:
with self.reg_mutex:
job = self.registry[ptop].get(wark)
if job:
job["busy"].pop(chash, None)
@@ -2999,7 +3048,7 @@ class Up2k(object):
return True
def confirm_chunk(self, ptop: str, wark: str, chash: str) -> tuple[int, str]:
with self.mutex:
with self.mutex, self.reg_mutex:
self.db_act = self.vol_act[ptop] = time.time()
try:
job = self.registry[ptop][wark]
@@ -3022,16 +3071,16 @@ class Up2k(object):
if self.args.nw:
self.regdrop(ptop, wark)
return ret, dst
return ret, dst
def finish_upload(self, ptop: str, wark: str, busy_aps: set[str]) -> None:
self.busy_aps = busy_aps
with self.mutex:
with self.mutex, self.reg_mutex:
self._finish_upload(ptop, wark)
def _finish_upload(self, ptop: str, wark: str) -> None:
"""mutex(main,reg) me"""
try:
job = self.registry[ptop][wark]
pdir = djoin(job["ptop"], job["prel"])
@@ -3044,12 +3093,11 @@ class Up2k(object):
t = "finish_upload {} with remaining chunks {}"
raise Pebkac(500, t.format(wark, job["need"]))
# self.log("--- " + wark + " " + dst + " finish_upload atomic " + dst, 4)
atomic_move(src, dst)
upt = job.get("at") or time.time()
vflags = self.flags[ptop]
atomic_move(self.log, src, dst, vflags)
times = (int(time.time()), int(job["lmod"]))
self.log(
"no more chunks, setting times {} ({}) on {}".format(
@@ -3105,6 +3153,7 @@ class Up2k(object):
cur.connection.commit()
def regdrop(self, ptop: str, wark: str) -> None:
"""mutex(main,reg) me"""
olds = self.droppable[ptop]
if wark:
olds.append(wark)
@@ -3199,16 +3248,23 @@ class Up2k(object):
at: float,
skip_xau: bool = False,
) -> None:
"""mutex(main) me"""
self.db_rm(db, rd, fn, sz)
if not ip:
db_ip = ""
else:
# plugins may expect this to look like an actual IP
db_ip = "1.1.1.1" if self.args.no_db_ip else ip
sql = "insert into up values (?,?,?,?,?,?,?)"
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0))
v = (wark, int(ts), sz, rd, fn, db_ip, int(at or 0))
try:
db.execute(sql, v)
except:
assert self.mem_cur
rd, fn = s3enc(self.mem_cur, rd, fn)
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0))
v = (wark, int(ts), sz, rd, fn, db_ip, int(at or 0))
db.execute(sql, v)
self.volsize[db] += sz
@@ -3312,7 +3368,7 @@ class Up2k(object):
vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0])
vn, rem = vn.get_dbv(rem)
ptop = vn.realpath
with self.mutex:
with self.mutex, self.reg_mutex:
abrt_cfg = self.flags.get(ptop, {}).get("u2abort", 1)
addr = (ip or "\n") if abrt_cfg in (1, 2) else ""
user = (uname or "\n") if abrt_cfg in (1, 3) else ""
@@ -3320,7 +3376,10 @@ class Up2k(object):
for wark, job in reg.items():
if (user and user != job["user"]) or (addr and addr != job["addr"]):
continue
if djoin(job["prel"], job["name"]) == rem:
jrem = djoin(job["prel"], job["name"])
if ANYWIN:
jrem = jrem.replace("\\", "/")
if jrem == rem:
if job["ptop"] != ptop:
t = "job.ptop [%s] != vol.ptop [%s] ??"
raise Exception(t % (job["ptop"], ptop))
@@ -3416,7 +3475,7 @@ class Up2k(object):
continue
n_files += 1
with self.mutex:
with self.mutex, self.reg_mutex:
cur = None
try:
ptop = dbv.realpath
@@ -3534,6 +3593,7 @@ class Up2k(object):
def _mv_file(
self, uname: str, svp: str, dvp: str, curs: set["sqlite3.Cursor"]
) -> str:
"""mutex(main) me; will mutex(reg)"""
svn, srem = self.asrv.vfs.get(svp, uname, True, False, True)
svn, srem = svn.get_dbv(srem)
@@ -3614,7 +3674,9 @@ class Up2k(object):
if c2 and c2 != c1:
self._copy_tags(c1, c2, w)
has_dupes = self._forget_file(svn.realpath, srem, c1, w, is_xvol, fsize)
with self.reg_mutex:
has_dupes = self._forget_file(svn.realpath, srem, c1, w, is_xvol, fsize)
if not is_xvol:
has_dupes = self._relink(w, svn.realpath, srem, dabs)
@@ -3653,7 +3715,7 @@ class Up2k(object):
self._symlink(dlink, dabs, dvn.flags, lmod=ftime)
wunlink(self.log, sabs, svn.flags)
else:
atomic_move(sabs, dabs)
atomic_move(self.log, sabs, dabs, svn.flags)
except OSError as ex:
if ex.errno != errno.EXDEV:
@@ -3744,7 +3806,10 @@ class Up2k(object):
drop_tags: bool,
sz: int,
) -> bool:
"""forgets file in db, fixes symlinks, does not delete"""
"""
mutex(main,reg) me
forgets file in db, fixes symlinks, does not delete
"""
srd, sfn = vsplit(vrem)
has_dupes = False
self.log("forgetting {}".format(vrem))
@@ -3830,8 +3895,7 @@ class Up2k(object):
self.log("linkswap [{}] and [{}]".format(sabs, slabs))
mt = bos.path.getmtime(slabs, False)
flags = self.flags.get(ptop) or {}
wunlink(self.log, slabs, flags)
bos.rename(sabs, slabs)
atomic_move(self.log, sabs, slabs, flags)
bos.utime(slabs, (int(time.time()), int(mt)), False)
self._symlink(slabs, sabs, flags, False)
full[slabs] = (ptop, rem)
@@ -4070,7 +4134,7 @@ class Up2k(object):
self.do_snapshot()
def do_snapshot(self) -> None:
with self.mutex:
with self.mutex, self.reg_mutex:
for k, reg in self.registry.items():
self._snap_reg(k, reg)
@@ -4138,11 +4202,11 @@ class Up2k(object):
path2 = "{}.{}".format(path, os.getpid())
body = {"droppable": self.droppable[ptop], "registry": reg}
j = json.dumps(body, indent=2, sort_keys=True).encode("utf-8")
j = json.dumps(body, sort_keys=True, separators=(",\n", ": ")).encode("utf-8")
with gzip.GzipFile(path2, "wb") as f:
f.write(j)
atomic_move(path2, path)
atomic_move(self.log, path2, path, VF_CAREFUL)
self.log("snap: {} |{}|".format(path, len(reg.keys())))
self.snap_prev[ptop] = etag
@@ -4211,7 +4275,7 @@ class Up2k(object):
raise Exception("invalid hash task")
try:
if not self._hash_t(task):
if not self._hash_t(task) and self.stop:
return
except Exception as ex:
self.log("failed to hash %s: %s" % (task, ex), 1)
@@ -4221,7 +4285,7 @@ class Up2k(object):
) -> bool:
ptop, vtop, flags, rd, fn, ip, at, usr, skip_xau = task
# self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
with self.mutex:
with self.mutex, self.reg_mutex:
if not self.register_vpath(ptop, flags):
return True
@@ -4239,7 +4303,7 @@ class Up2k(object):
wark = up2k_wark_from_hashlist(self.salt, inf.st_size, hashes)
with self.mutex:
with self.mutex, self.reg_mutex:
self.idx_wark(
self.flags[ptop],
rd,
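The hunks above repeatedly widen `with self.mutex:` into `with self.mutex, self.reg_mutex:`, matching the new dedicated registry mutex ("separate registry mutex for faster access"). A minimal sketch of that two-lock pattern (class and method names are hypothetical, not copyparty's actual API):

```python
import threading

# Two-lock sketch: the registry gets its own mutex so readers can skip the
# heavier main mutex, while writers that touch both structures always take
# the locks in the same order (main, then reg) to rule out deadlock.
class TwoLockRegistry:
    def __init__(self):
        self.mutex = threading.Lock()      # guards volume/db state
        self.reg_mutex = threading.Lock()  # guards only self.registry
        self.registry = {}

    def peek(self, wark):
        # fast path: registry-only readers take just reg_mutex
        with self.reg_mutex:
            return self.registry.get(wark)

    def finish(self, wark, state):
        # slow path: both locks, always main -> reg
        with self.mutex, self.reg_mutex:
            self.registry[wark] = state

reg = TwoLockRegistry()
reg.finish("w1", "done")
print(reg.peek("w1"))  # done
```

The fixed acquisition order is what makes the `"""mutex(main,reg) me"""` docstrings above enforceable: no thread ever holds `reg_mutex` while waiting for `mutex`.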


@@ -759,15 +759,46 @@ class CachedSet(object):
self.oldest = now
class CachedDict(object):
def __init__(self, maxage: float) -> None:
self.c: dict[str, tuple[float, Any]] = {}
self.maxage = maxage
self.oldest = 0.0
def set(self, k: str, v: Any) -> None:
now = time.time()
self.c[k] = (now, v)
if now - self.oldest < self.maxage:
return
c = self.c = {k: v for k, v in self.c.items() if now - v[0] < self.maxage}
try:
self.oldest = min([x[0] for x in c.values()])
except:
self.oldest = now
def get(self, k: str) -> Optional[tuple[str, Any]]:
try:
ts, ret = self.c[k]
now = time.time()
if now - ts > self.maxage:
del self.c[k]
return None
return ret
except:
return None
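`CachedDict` above is a lazily swept TTL map: `get()` evicts a stale entry on access, and `set()` rebuilds the dict once the oldest entry ages out. A simplified standalone sketch (sweeping on every `set` instead of tracking `self.oldest`):

```python
import time

# Simplified TTL cache in the spirit of CachedDict above: values expire
# after `maxage` seconds; set() sweeps stale entries eagerly instead of
# tracking the oldest timestamp like the original does.
class TTLCache:
    def __init__(self, maxage):
        self.c = {}  # key -> (timestamp, value)
        self.maxage = maxage

    def set(self, k, v):
        now = time.time()
        self.c[k] = (now, v)
        self.c = {k2: tv for k2, tv in self.c.items() if now - tv[0] < self.maxage}

    def get(self, k):
        try:
            ts, ret = self.c[k]
        except KeyError:
            return None
        if time.time() - ts > self.maxage:
            del self.c[k]  # evict on access, like the original
            return None
        return ret

tc = TTLCache(60.0)
tc.set("a", 1)
print(tc.get("a"))  # 1
print(tc.get("b"))  # None
```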
class FHC(object):
class CE(object):
def __init__(self, fh: typing.BinaryIO) -> None:
self.ts: float = 0
self.fhs = [fh]
self.all_fhs = set([fh])
def __init__(self) -> None:
self.cache: dict[str, FHC.CE] = {}
self.aps: set[str] = set()
self.aps: dict[str, int] = {}
def close(self, path: str) -> None:
try:
@@ -779,7 +810,7 @@ class FHC(object):
fh.close()
del self.cache[path]
self.aps.remove(path)
del self.aps[path]
def clean(self) -> None:
if not self.cache:
@@ -800,9 +831,12 @@ class FHC(object):
return self.cache[path].fhs.pop()
def put(self, path: str, fh: typing.BinaryIO) -> None:
self.aps.add(path)
if path not in self.aps:
self.aps[path] = 0
try:
ce = self.cache[path]
ce.all_fhs.add(fh)
ce.fhs.append(fh)
except:
ce = self.CE(fh)
@@ -2125,26 +2159,29 @@ def lsof(log: "NamedLogger", abspath: str) -> None:
log("lsof failed; " + min_ex(), 3)
def atomic_move(usrc: str, udst: str) -> None:
src = fsenc(usrc)
dst = fsenc(udst)
if not PY2:
os.replace(src, dst)
def _fs_mvrm(
log: "NamedLogger", src: str, dst: str, atomic: bool, flags: dict[str, Any]
) -> bool:
bsrc = fsenc(src)
bdst = fsenc(dst)
if atomic:
k = "mv_re_"
act = "atomic-rename"
osfun = os.replace
args = [bsrc, bdst]
elif dst:
k = "mv_re_"
act = "rename"
osfun = os.rename
args = [bsrc, bdst]
else:
if os.path.exists(dst):
os.unlink(dst)
k = "rm_re_"
act = "delete"
osfun = os.unlink
args = [bsrc]
os.rename(src, dst)
def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
maxtime = flags.get("rm_re_t", 0.0)
bpath = fsenc(abspath)
if not maxtime:
os.unlink(bpath)
return True
chill = flags.get("rm_re_r", 0.0)
maxtime = flags.get(k + "t", 0.0)
chill = flags.get(k + "r", 0.0)
if chill < 0.001:
chill = 0.1
@@ -2152,14 +2189,19 @@ def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
t0 = now = time.time()
for attempt in range(90210):
try:
if ino and os.stat(bpath).st_ino != ino:
log("inode changed; aborting delete")
if ino and os.stat(bsrc).st_ino != ino:
t = "src inode changed; aborting %s %s"
log(t % (act, src), 1)
return False
os.unlink(bpath)
if (dst and not atomic) and os.path.exists(bdst):
t = "something appeared at dst; aborting rename [%s] ==> [%s]"
log(t % (src, dst), 1)
return False
osfun(*args)
if attempt:
now = time.time()
t = "deleted in %.2f sec, attempt %d"
log(t % (now - t0, attempt + 1))
t = "%sd in %.2f sec, attempt %d: %s"
log(t % (act, now - t0, attempt + 1, src))
return True
except OSError as ex:
now = time.time()
@@ -2169,15 +2211,45 @@ def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
raise
if not attempt:
if not PY2:
ino = os.stat(bpath).st_ino
t = "delete failed (err.%d); retrying for %d sec: %s"
log(t % (ex.errno, maxtime + 0.99, abspath))
ino = os.stat(bsrc).st_ino
t = "%s failed (err.%d); retrying for %d sec: [%s]"
log(t % (act, ex.errno, maxtime + 0.99, src))
time.sleep(chill)
return False # makes pylance happy
def atomic_move(log: "NamedLogger", src: str, dst: str, flags: dict[str, Any]) -> None:
bsrc = fsenc(src)
bdst = fsenc(dst)
if PY2:
if os.path.exists(bdst):
_fs_mvrm(log, dst, "", False, flags) # unlink
_fs_mvrm(log, src, dst, False, flags) # rename
elif flags.get("mv_re_t"):
_fs_mvrm(log, src, dst, True, flags)
else:
os.replace(bsrc, bdst)
def wrename(log: "NamedLogger", src: str, dst: str, flags: dict[str, Any]) -> bool:
if not flags.get("mv_re_t"):
os.rename(fsenc(src), fsenc(dst))
return True
return _fs_mvrm(log, src, dst, False, flags)
def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
if not flags.get("rm_re_t"):
os.unlink(fsenc(abspath))
return True
return _fs_mvrm(log, abspath, "", False, flags)
def get_df(abspath: str) -> tuple[Optional[int], Optional[int]]:
try:
# some fuses misbehave
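The new `_fs_mvrm` above wraps rename/replace/unlink in a retry loop governed by the `mv_re_t`/`mv_re_r` (and `rm_re_*`) flags, because Windows renames can fail transiently while another process holds the file. A standalone sketch of that retry idea (function name and defaults are illustrative, not the project's API):

```python
import os
import tempfile
import time

# Illustrative retry wrapper: keep attempting an atomic replace until it
# succeeds or `maxtime` seconds have passed, sleeping `chill` between tries.
def rename_with_retry(src, dst, maxtime=5.0, chill=0.1):
    t0 = time.time()
    while True:
        try:
            os.replace(src, dst)  # atomic within one filesystem
            return True
        except OSError:
            if time.time() - t0 > maxtime:
                raise
            time.sleep(chill)

d = tempfile.mkdtemp()
src = os.path.join(d, "a.txt")
dst = os.path.join(d, "b.txt")
with open(src, "w") as f:
    f.write("hi")
rename_with_retry(src, dst)
print(os.path.exists(dst))  # True
```

The real `_fs_mvrm` additionally re-checks the source inode and aborts if something appears at the destination mid-retry, as shown in the diff above.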


@@ -29,6 +29,7 @@ window.baguetteBox = (function () {
isOverlayVisible = false,
touch = {}, // start-pos
touchFlag = false, // busy
scrollTimer = 0,
re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
anims = ['slideIn', 'fadeIn', 'none'],
@@ -91,6 +92,30 @@ window.baguetteBox = (function () {
touchendHandler();
};
var overlayWheelHandler = function (e) {
if (!options.noScrollbars || anymod(e))
return;
ev(e);
var x = e.deltaX,
y = e.deltaY,
d = Math.abs(x) > Math.abs(y) ? x : y;
if (e.deltaMode)
d *= 10;
if (Date.now() - scrollTimer < (Math.abs(d) > 20 ? 100 : 300))
return;
scrollTimer = Date.now();
if (d > 0)
showNextImage();
else
showPreviousImage();
};
var trapFocusInsideOverlay = function (e) {
if (overlay.style.display === 'block' && (overlay.contains && !overlay.contains(e.target))) {
e.stopPropagation();
@@ -394,8 +419,7 @@ window.baguetteBox = (function () {
}
function dlpic() {
var url = findfile()[3].href;
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache';
var url = addq(findfile()[3].href, 'cache');
dl_file(url);
}
@@ -452,6 +476,7 @@ window.baguetteBox = (function () {
bind(document, 'keyup', keyUpHandler);
bind(document, 'fullscreenchange', onFSC);
bind(overlay, 'click', overlayClickHandler);
bind(overlay, 'wheel', overlayWheelHandler);
bind(btnPrev, 'click', showPreviousImage);
bind(btnNext, 'click', showNextImage);
bind(btnClose, 'click', hideOverlay);
@@ -474,6 +499,7 @@ window.baguetteBox = (function () {
unbind(document, 'keyup', keyUpHandler);
unbind(document, 'fullscreenchange', onFSC);
unbind(overlay, 'click', overlayClickHandler);
unbind(overlay, 'wheel', overlayWheelHandler);
unbind(btnPrev, 'click', showPreviousImage);
unbind(btnNext, 'click', showNextImage);
unbind(btnClose, 'click', hideOverlay);
@@ -584,7 +610,7 @@ window.baguetteBox = (function () {
isOverlayVisible = true;
}
function hideOverlay(e) {
function hideOverlay(e, dtor) {
ev(e);
playvid(false);
removeFromCache('#files');
@@ -592,7 +618,15 @@ window.baguetteBox = (function () {
document.documentElement.style.overflowY = 'auto';
document.body.style.overflowY = 'auto';
}
if (overlay.style.display === 'none')
try {
if (document.fullscreenElement)
document.exitFullscreen();
}
catch (ex) { }
isFullscreen = false;
if (dtor || overlay.style.display === 'none')
return;
if (options.duringHide)
@@ -600,11 +634,6 @@ window.baguetteBox = (function () {
sethash('');
unbindEvents();
try {
document.exitFullscreen();
isFullscreen = false;
}
catch (ex) { }
// Fade out and hide the overlay
overlay.className = '';
@@ -682,7 +711,7 @@ window.baguetteBox = (function () {
options.captions.call(currentGallery, imageElement) :
imageElement.getAttribute('data-caption') || imageElement.title;
imageSrc += imageSrc.indexOf('?') < 0 ? '?cache' : '&cache';
imageSrc = addq(imageSrc, 'cache');
if (is_vid && index != currentIndex)
return; // no preload
@@ -720,6 +749,9 @@ window.baguetteBox = (function () {
figure.appendChild(image);
if (is_vid && window.afilt)
afilt.apply(undefined, image);
if (options.async && callback)
callback();
}
@@ -1062,6 +1094,7 @@ window.baguetteBox = (function () {
}
function destroyPlugin() {
hideOverlay(undefined, true);
unbindEvents();
clearCachedData();
document.getElementsByTagName('body')[0].removeChild(ebi('bbox-overlay'));


@@ -699,12 +699,12 @@ a:hover {
.s0:after,
.s1:after {
content: '⌄';
margin-left: -.1em;
margin-left: -.15em;
}
.s0r:after,
.s1r:after {
content: '⌃';
margin-left: -.1em;
margin-left: -.15em;
}
.s0:after,
.s0r:after {
@@ -715,7 +715,7 @@ a:hover {
color: var(--sort-2);
}
#files thead th:after {
margin-right: -.7em;
margin-right: -.5em;
}
#files tbody tr:hover td,
#files tbody tr:hover td+td {
@@ -744,6 +744,15 @@ html #files.hhpick thead th {
word-wrap: break-word;
overflow: hidden;
}
#files tr.fade a {
color: #999;
color: rgba(255, 255, 255, 0.4);
font-style: italic;
}
html.y #files tr.fade a {
color: #999;
color: rgba(0, 0, 0, 0.4);
}
#files tr:nth-child(2n) td {
background: var(--row-alt);
}
@@ -1600,6 +1609,7 @@ html {
padding: .2em .4em;
font-size: 1.2em;
margin: .2em;
display: inline-block;
white-space: pre;
position: relative;
top: -.12em;
@@ -1736,6 +1746,7 @@ html.y #tree.nowrap .ntree a+a:hover {
}
#files th span {
position: relative;
white-space: nowrap;
}
#files>thead>tr>th.min,
#files td.min {
@@ -1773,9 +1784,6 @@ html.y #tree.nowrap .ntree a+a:hover {
margin: .7em 0 .7em .5em;
padding-left: .5em;
}
.opwide>div>div>a {
line-height: 2em;
}
.opwide>div>h3 {
color: var(--fg-weak);
margin: 0 .4em;


@@ -244,6 +244,7 @@ var Ls = {
"mt_preload": "start loading the next song near the end for gapless playback\">preload",
"mt_prescan": "go to the next folder before the last song$Nends, keeping the webbrowser happy$Nso it doesn't stop the playback\">nav",
"mt_fullpre": "try to preload the entire song;$N✅ enable on <b>unreliable</b> connections,$N❌ <b>disable</b> on slow connections probably\">full",
"mt_fau": "on phones, prevent music from stopping if the next song doesn't preload fast enough (can make tags display glitchy)\">☕️",
"mt_waves": "waveform seekbar:$Nshow audio amplitude in the scrubber\">~s",
"mt_npclip": "show buttons for clipboarding the currently playing song\">/np",
"mt_octl": "os integration (media hotkeys / osd)\">os-ctl",
@@ -289,6 +290,8 @@ var Ls = {
"f_dls": 'the file links in the current folder have\nbeen changed into download links',
"f_partial": "To safely download a file which is currently being uploaded, please click the file which has the same filename, but without the <code>.PARTIAL</code> file extension. Please press CANCEL or Escape to do this.\n\nPressing OK / Enter will ignore this warning and continue downloading the <code>.PARTIAL</code> scratchfile instead, which will almost definitely give you corrupted data.",
"ft_paste": "paste {0} items$NHotkey: ctrl-V",
"fr_eperm": 'cannot rename:\nyou do not have “move” permission in this folder',
"fd_eperm": 'cannot delete:\nyou do not have “delete” permission in this folder',
@@ -419,6 +422,7 @@ var Ls = {
"un_fclr": "clear filter",
"un_derr": 'unpost-delete failed:\n',
"un_f5": 'something broke, please try a refresh or hit F5',
"un_uf5": "sorry but you have to refresh the page (for example by pressing F5 or CTRL-R) before this upload can be aborted",
"un_nou": '<b>warning:</b> server too busy to show unfinished uploads; click the "refresh" link in a bit',
"un_noc": '<b>warning:</b> unpost of fully uploaded files is not enabled/permitted in server config',
"un_max": "showing first 2000 files (use the filter)",
@@ -745,6 +749,7 @@ var Ls = {
"mt_preload": "hent ned litt av neste sang i forkant,$Nslik at pausen i overgangen blir mindre\">forles",
"mt_prescan": "ved behov, bla til neste mappe$Nslik at nettleseren lar oss$Nfortsette å spille musikk\">bla",
"mt_fullpre": "hent ned hele neste sang, ikke bare litt:$N✅ skru på hvis nettet ditt er <b>ustabilt</b>,$N❌ skru av hvis nettet ditt er <b>tregt</b>\">full",
"mt_fau": "for telefoner: forhindre at avspilling stopper hvis nettet er for tregt til å laste neste sang i tide. Hvis påskrudd, kan forårsake at sang-info ikke vises korrekt i OS'et\">☕️",
"mt_waves": "waveform seekbar:$Nvis volumkurve i avspillingsfeltet\">~s",
"mt_npclip": "vis knapper for å kopiere info om sangen du hører på\">/np",
"mt_octl": "integrering med operativsystemet (fjernkontroll, info-skjerm)\">os-ctl",
@@ -790,6 +795,8 @@ var Ls = {
"f_dls": 'linkene i denne mappen er nå\nomgjort til nedlastningsknapper',
"f_partial": "For å laste ned en fil som enda ikke er ferdig opplastet, klikk på filen som har samme filnavn som denne, men uten <code>.PARTIAL</code> på slutten. Da vil serveren passe på at nedlastning går bra. Derfor anbefales det sterkt å trykke ABRYT eller Escape-tasten.\n\nHvis du virkelig ønsker å laste ned denne <code>.PARTIAL</code>-filen på en ukontrollert måte, trykk OK / Enter for å ignorere denne advarselen. Slik vil du høyst sannsynlig motta korrupt data.",
"ft_paste": "Lim inn {0} filer$NSnarvei: ctrl-V",
"fr_eperm": 'kan ikke endre navn:\ndu har ikke “move”-rettigheten i denne mappen',
"fd_eperm": 'kan ikke slette:\ndu har ikke “delete”-rettigheten i denne mappen',
@@ -920,6 +927,7 @@ var Ls = {
"un_fclr": "nullstill filter",
"un_derr": 'unpost-sletting feilet:\n',
"un_f5": 'noe gikk galt, prøv å oppdatere listen eller trykk F5',
"un_uf5": "beklager, men du må laste siden på nytt (f.eks. ved å trykke F5 eller CTRL-R) før denne opplastningen kan avbrytes",
"un_nou": '<b>advarsel:</b> kan ikke vise ufullstendige opplastninger akkurat nå; klikk på oppdater-linken om litt',
"un_noc": '<b>advarsel:</b> angring av fullførte opplastninger er deaktivert i serverkonfigurasjonen',
"un_max": "viser de første 2000 filene (bruk filteret for å innsnevre)",
@@ -1400,7 +1408,9 @@ var ACtx = !IPHONE && (window.AudioContext || window.webkitAudioContext),
ACB = sread('au_cbv') || 1,
noih = /[?&]v\b/.exec('' + location),
hash0 = location.hash,
mp;
ldks = [],
dks = {},
dk, mp;
var mpl = (function () {
@@ -1413,6 +1423,7 @@ var mpl = (function () {
'<a href="#" class="tgl btn" id="au_preload" tt="' + L.mt_preload + '</a>' +
'<a href="#" class="tgl btn" id="au_prescan" tt="' + L.mt_prescan + '</a>' +
'<a href="#" class="tgl btn" id="au_fullpre" tt="' + L.mt_fullpre + '</a>' +
'<a href="#" class="tgl btn" id="au_fau" tt="' + L.mt_fau + '</a>' +
'<a href="#" class="tgl btn" id="au_waves" tt="' + L.mt_waves + '</a>' +
'<a href="#" class="tgl btn" id="au_npclip" tt="' + L.mt_npclip + '</a>' +
'<a href="#" class="tgl btn" id="au_os_ctl" tt="' + L.mt_octl + '</a>' +
@@ -1459,6 +1470,15 @@ var mpl = (function () {
bcfg_bind(r, 'preload', 'au_preload', true);
bcfg_bind(r, 'prescan', 'au_prescan', true);
bcfg_bind(r, 'fullpre', 'au_fullpre', false);
bcfg_bind(r, 'fau', 'au_fau', MOBILE && !IPHONE, function (v) {
mp.nopause();
if (mp.fau) {
mp.fau.pause();
mp.fau = mpo.fau = null;
console.log('stop fau');
}
mp.init_fau();
});
bcfg_bind(r, 'waves', 'au_waves', true, function (v) {
if (!v) pbar.unwave();
});
@@ -1525,22 +1545,24 @@ var mpl = (function () {
set_tint();
r.acode = function (url) {
var c = true;
var c = true,
cs = url.split('?')[0];
if (!have_acode)
c = false;
else if (/\.(wav|flac)$/i.exec(url))
else if (/\.(wav|flac)$/i.exec(cs))
c = r.ac_flac;
else if (/\.(aac|m4a)$/i.exec(url))
else if (/\.(aac|m4a)$/i.exec(cs))
c = r.ac_aac;
else if (/\.opus$/i.exec(url) && !can_ogg)
else if (/\.(ogg|opus)$/i.exec(cs) && !can_ogg)
c = true;
else if (re_au_native.exec(url))
else if (re_au_native.exec(cs))
c = false;
if (!c)
return url;
return url + (url.indexOf('?') < 0 ? '?' : '&') + 'th=' + (can_ogg ? 'opus' : 'caf');
return addq(url, 'th=' + (can_ogg ? 'opus' : (IPHONE || MACOS) ? 'caf' : 'mp3'));
};
r.pp = function () {
@@ -1591,7 +1613,7 @@ var mpl = (function () {
}
if (cover) {
cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j';
cover = addq(cover, 'th=j');
tags.artwork = [{ "src": cover, type: "image/jpeg" }];
}
}
@@ -1650,26 +1672,23 @@ var mpl = (function () {
var can_ogg = true;
try {
can_ogg = new Audio().canPlayType('audio/ogg; codecs=opus') === 'probably';
if (document.documentMode)
can_ogg = true; // ie8-11
}
catch (ex) { }
var re_au_native = can_ogg ? /\.(aac|flac|m4a|mp3|ogg|opus|wav)$/i :
have_acode ? /\.(aac|flac|m4a|mp3|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
var re_au_native = (can_ogg || have_acode) ? /\.(aac|flac|m4a|mp3|ogg|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
re_au_all = /\.(aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk)$/i;
// extract songs + add play column
var mpo = { "au": null, "au2": null, "acs": null };
var mpo = { "au": null, "au2": null, "acs": null, "fau": null };
function MPlayer() {
var r = this;
r.id = Date.now();
r.au = mpo.au;
r.au2 = mpo.au2;
r.acs = mpo.acs;
r.fau = mpo.fau;
r.tracks = {};
r.order = [];
r.cd_pause = 0;
@@ -1682,8 +1701,8 @@ function MPlayer() {
link = tds[1].getElementsByTagName('a');
link = link[link.length - 1];
var url = noq_href(link),
m = re_audio.exec(url);
var url = link.getAttribute('href'),
m = re_audio.exec(url.split('?')[0]);
if (m) {
var tid = link.getAttribute('id');
@@ -1793,14 +1812,13 @@ function MPlayer() {
r.preload = function (url, full) {
var t0 = Date.now(),
fname = uricom_dec(url.split('/').pop());
fname = uricom_dec(url.split('/').pop().split('?')[0]);
url = mpl.acode(url);
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=987&_=' + ACB;
url = addq(mpl.acode(url), 'cache=987&_=' + ACB);
mpl.preload_url = full ? url : null;
if (mpl.waves)
fetch(url.replace(/\bth=opus&/, '') + '&th=p').then(function (x) {
fetch(url.replace(/\bth=(opus|mp3)&/, '') + '&th=p').then(function (x) {
x.body.getReader().read();
});
@@ -1842,6 +1860,17 @@ function MPlayer() {
r.nopause = function () {
r.cd_pause = Date.now();
};
r.init_fau = function () {
if (r.fau || !mpl.fau)
return;
// breaks touchbar-macs
console.log('init fau');
r.fau = new Audio(SR + '/.cpr/deps/busy.mp3?_=' + TS);
r.fau.loop = true;
r.fau.play();
};
}
@@ -2379,8 +2408,7 @@ function dl_song() {
return toast.inf(10, L.f_dls);
}
var url = mp.tracks[mp.au.tid];
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=987&_=' + ACB;
var url = addq(mp.tracks[mp.au.tid], 'cache=987&_=' + ACB);
dl_file(url);
}
@@ -2514,11 +2542,11 @@ var mpui = (function () {
rem = pos > 1 ? len - pos : 999,
full = null;
if (rem < (mpl.fullpre ? 7 : 20)) {
if (rem < 7 || (!mpl.fullpre && (rem < 40 || (rem < 90 && pos > 10)))) {
preloaded = fpreloaded = mp.au.rsrc;
full = false;
}
else if (rem < 40 && mpl.fullpre && fpreloaded != mp.au.rsrc) {
else if (rem < 60 && mpl.fullpre && fpreloaded != mp.au.rsrc) {
fpreloaded = mp.au.rsrc;
full = true;
}
@@ -2714,7 +2742,7 @@ var afilt = (function () {
mp.acs = mpo.acs = null;
};
r.apply = function (v) {
r.apply = function (v, au) {
r.init();
r.draw();
@@ -2734,12 +2762,13 @@ var afilt = (function () {
if (r.plugs[a].en)
plug = true;
if (!actx || !mp.au || (!r.eqen && !plug && !mp.acs))
au = au || (mp && mp.au);
if (!actx || !au || (!r.eqen && !plug && !mp.acs))
return;
r.stop();
mp.au.id = mp.au.id || Date.now();
mp.acs = r.acst[mp.au.id] = r.acst[mp.au.id] || actx.createMediaElementSource(mp.au);
au.id = au.id || Date.now();
mp.acs = r.acst[au.id] = r.acst[au.id] || actx.createMediaElementSource(au);
if (r.eqen)
add_eq();
@@ -2995,6 +3024,7 @@ function play(tid, is_ev, seek) {
return;
mpl.preload_url = null;
mp.nopause();
mp.stopfade(true);
var tn = tid;
@@ -3041,9 +3071,9 @@ function play(tid, is_ev, seek) {
mp.au.onended = next_song;
widget.open();
}
mp.init_fau();
var url = mpl.acode(mp.tracks[tid]);
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=987&_=' + ACB;
var url = addq(mpl.acode(mp.tracks[tid]), 'cache=987&_=' + ACB);
if (mp.au.rsrc == url)
mp.au.currentTime = 0;
@@ -3107,7 +3137,7 @@ function play(tid, is_ev, seek) {
pbar.unwave();
if (mpl.waves)
pbar.loadwaves(url.replace(/\bth=opus&/, '') + '&th=p');
pbar.loadwaves(url.replace(/\bth=(opus|mp3)&/, '') + '&th=p');
mpui.progress_updater();
pbar.onresize();
@@ -3194,7 +3224,10 @@ function evau_error(e) {
toast.warn(15, esc(basenames(err + mfile)));
};
xhr.send();
return;
}
setTimeout(next_song, 15000);
}
@@ -4214,12 +4247,12 @@ var showfile = (function () {
qsr('#prism_css');
var el = mknod('link', 'prism_css');
el.rel = 'stylesheet';
el.href = SR + '/.cpr/deps/prism' + (light ? '' : 'd') + '.css';
el.href = SR + '/.cpr/deps/prism' + (light ? '' : 'd') + '.css?_=' + TS;
document.head.appendChild(el);
};
r.active = function () {
return location.search.indexOf('doc=') + 1;
return !!/[?&]doc=/.exec(location.search);
};
r.getlang = function (fn) {
@@ -4260,12 +4293,15 @@ var showfile = (function () {
};
r.show = function (url, no_push) {
var xhr = new XHR();
var xhr = new XHR(),
m = /[?&](k=[^&#]+)/.exec(url);
url = url.split('?')[0] + (m ? '?' + m[1] : '');
xhr.url = url;
xhr.fname = uricom_dec(url.split('/').pop());
xhr.no_push = no_push;
xhr.ts = Date.now();
xhr.open('GET', url.split('?')[0], true);
xhr.open('GET', url, true);
xhr.onprogress = loading;
xhr.onload = xhr.onerror = load_cb;
xhr.send();
@@ -4303,14 +4339,14 @@ var showfile = (function () {
var url = doc[0],
lnh = doc[1],
txt = doc[2],
name = url.split('/').pop(),
name = url.split('?')[0].split('/').pop(),
tname = uricom_dec(name),
lang = r.getlang(name),
is_md = lang == 'md';
ebi('files').style.display = ebi('gfiles').style.display = ebi('lazy').style.display = ebi('pro').style.display = ebi('epi').style.display = 'none';
ebi('dldoc').setAttribute('href', url);
ebi('editdoc').setAttribute('href', url + (url.indexOf('?') > 0 ? '&' : '?') + 'edit');
ebi('editdoc').setAttribute('href', addq(url, 'edit'));
ebi('editdoc').style.display = (has(perms, 'write') && (is_md || has(perms, 'delete'))) ? '' : 'none';
var wr = ebi('bdoc'),
@@ -4361,7 +4397,7 @@ var showfile = (function () {
wintitle(tname + ' \u2014 ');
document.documentElement.scrollTop = 0;
var hfun = no_push ? hist_replace : hist_push;
hfun(get_evpath() + '?doc=' + url.split('/').pop());
hfun(get_evpath() + '?doc=' + name); // can't dk: server wants dk and js needs fk
qsr('#docname');
el = mknod('span', 'docname');
@@ -4563,7 +4599,7 @@ var thegrid = (function () {
if (!force)
return;
hist_push(get_evpath());
hist_push(get_evpath() + (dk ? '?k=' + dk : ''));
wintitle();
}
@@ -4795,10 +4831,10 @@ var thegrid = (function () {
ref = ao.getAttribute('id'),
isdir = href.endsWith('/'),
ac = isdir ? ' class="dir"' : '',
ihref = href;
ihref = ohref;
if (r.thumbs) {
ihref += '?th=' + (have_webp ? 'w' : 'j');
ihref = addq(ihref, 'th=' + (have_webp ? 'w' : 'j'));
if (!r.crop)
ihref += 'f';
if (r.x3)
@@ -4834,7 +4870,7 @@ var thegrid = (function () {
}
ihref = SR + '/.cpr/ico/' + ext;
}
ihref += (ihref.indexOf('?') > 0 ? '&' : '?') + 'cache=i&_=' + ACB;
ihref = addq(ihref, 'cache=i&_=' + ACB + TS);
if (CHROME)
ihref += "&raster";
@@ -4875,10 +4911,10 @@ var thegrid = (function () {
baguetteBox.destroy();
var br = baguetteBox.run(isrc, {
noScrollbars: true,
duringHide: r.onhide,
afterShow: function () {
r.bbox_opts.refocus = true;
document.body.style.overflow = 'hidden';
},
captions: function (g) {
var idx = -1,
@@ -4901,7 +4937,8 @@ var thegrid = (function () {
};
r.onhide = function () {
document.body.style.overflow = '';
afilt.apply();
if (!thegrid.ihop)
return;
@@ -5723,6 +5760,14 @@ var treectl = (function () {
};
r.nvis = r.lim;
ldks = jread('dks', []);
for (var a = ldks.length - 1; a >= 0; a--) {
var s = ldks[a],
o = s.lastIndexOf('?');
dks[s.slice(0, o)] = s.slice(o + 1);
}
function setwrap(v) {
clmod(ebi('tree'), 'nowrap', !v);
reload_tree();
@@ -5937,12 +5982,15 @@ var treectl = (function () {
};
function get_tree(top, dst, rst) {
var xhr = new XHR();
var xhr = new XHR(),
m = /[?&](k=[^&#]+)/.exec(dst),
k = m ? '&' + m[1] : dk ? '&k=' + dk : '';
xhr.top = top;
xhr.dst = dst;
xhr.rst = rst;
xhr.ts = Date.now();
xhr.open('GET', dst + '?tree=' + top + (r.dots ? '&dots' : ''), true);
xhr.open('GET', addq(dst, 'tree=' + top + (r.dots ? '&dots' : '') + k), true);
xhr.onload = xhr.onerror = recvtree;
xhr.send();
enspin('#tree');
@@ -5969,7 +6017,7 @@ var treectl = (function () {
}
ebi('treeul').setAttribute('ts', ts);
var top = top0 == '.' ? dst : top0,
var top = (top0 == '.' ? dst : top0).split('?')[0],
name = uricom_dec(top.split('/').slice(-2)[0]),
rtop = top.replace(/^\/+/, ""),
html = parsetree(res, rtop);
@@ -5986,7 +6034,7 @@ var treectl = (function () {
var links = QSA('#treeul a+a');
for (var a = 0, aa = links.length; a < aa; a++) {
if (links[a].getAttribute('href') == top) {
if (links[a].getAttribute('href').split('?')[0] == top) {
var o = links[a].parentNode;
if (!o.getElementsByTagName('li').length)
o.innerHTML = html;
@@ -6019,14 +6067,20 @@ var treectl = (function () {
function reload_tree() {
var cdir = r.nextdir || get_vpath(),
cevp = get_evpath(),
links = QSA('#treeul a+a'),
nowrap = QS('#tree.nowrap') && QS('#hovertree.on'),
act = null;
for (var a = 0, aa = links.length; a < aa; a++) {
var href = uricom_dec(links[a].getAttribute('href')),
var qhref = links[a].getAttribute('href'),
ehref = qhref.split('?')[0],
href = uricom_dec(ehref),
cl = '';
if (dk && ehref == cevp && !/[?&]k=/.exec(qhref))
links[a].setAttribute('href', addq(qhref, 'k=' + dk));
if (href == cdir) {
act = links[a];
cl = 'hl';
@@ -6125,13 +6179,16 @@ var treectl = (function () {
if (IE && !history.pushState)
return window.location = url;
var xhr = new XHR();
var xhr = new XHR(),
m = /[?&](k=[^&#]+)/.exec(url),
k = m ? '&' + m[1] : dk ? '&k=' + dk : '';
xhr.top = url.split('?')[0];
xhr.back = back;
xhr.hpush = hpush;
xhr.hydrate = hydrate;
xhr.ts = Date.now();
xhr.open('GET', xhr.top + '?ls' + (r.dots ? '&dots' : ''), true);
xhr.open('GET', xhr.top + '?ls' + (r.dots ? '&dots' : '') + k, true);
xhr.onload = xhr.onerror = recvls;
xhr.send();
@@ -6193,6 +6250,7 @@ var treectl = (function () {
read_dsort(res.dsort);
dcrop = res.dcrop;
dth3x = res.dth3x;
dk = res.dk;
srvinf = res.srvinf;
try {
@@ -6201,14 +6259,22 @@ var treectl = (function () {
catch (ex) { }
if (this.hpush && !showfile.active())
hist_push(this.top);
hist_push(this.top + (dk ? '?k=' + dk : ''));
if (!this.back) {
var dirs = [];
for (var a = 0; a < res.dirs.length; a++)
dirs.push(res.dirs[a].href.split('/')[0].split('?')[0]);
for (var a = 0; a < res.dirs.length; a++) {
var dh = res.dirs[a].href,
dn = dh.split('/')[0].split('?')[0],
m = /[?&](k=[^&#]+)/.exec(dh);
rendertree({ "a": dirs }, this.ts, ".", get_evpath());
if (m)
dn += '?' + m[1];
dirs.push(dn);
}
rendertree({ "a": dirs }, this.ts, ".", get_evpath() + (dk ? '?k=' + dk : ''));
}
r.gentab(this.top, res);
@@ -6268,6 +6334,10 @@ var treectl = (function () {
if (ae = ae.querySelector('a[id]'))
cid = ae.getAttribute('id');
var m = /[?&]k=([^&]+)/.exec(location.search);
if (m)
memo_dk(top, m[1]);
r.lsc = res;
if (res.unlist) {
var ptn = new RegExp(res.unlist);
@@ -6309,7 +6379,7 @@ var treectl = (function () {
if (lang) {
showfile.files.push({ 'id': id, 'name': fname });
if (lang == 'md')
tn.href += tn.href.indexOf('?') < 0 ? '?v' : '&v';
tn.href = addq(tn.href, 'v');
}
if (tn.lead == '-')
@@ -6317,8 +6387,9 @@ var treectl = (function () {
'" class="doc' + (lang ? ' bri' : '') +
'" hl="' + id + '" name="' + hname + '">-txt-</a>';
var ln = ['<tr><td>' + tn.lead + '</td><td><a href="' +
top + tn.href + '" id="' + id + '">' + hname + '</a>', tn.sz];
var cl = /\.PARTIAL$/.exec(fname) ? ' class="fade"' : '',
ln = ['<tr' + cl + '><td>' + tn.lead + '</td><td><a href="' +
top + tn.href + '" id="' + id + '">' + hname + '</a>', tn.sz];
for (var b = 0; b < res.taglist.length; b++) {
var k = res.taglist[b],
@@ -6385,7 +6456,7 @@ var treectl = (function () {
url = url.href;
var mt = m[0] == 'a' ? 'audio' : /\.(webm|mkv)($|\?)/i.exec(url) ? 'video' : 'image'
if (mt == 'image') {
url += url.indexOf('?') < 0 ? '?cache' : '&cache';
url = addq(url, 'cache');
console.log(url);
new Image().src = url;
}
@@ -6417,6 +6488,30 @@ var treectl = (function () {
setTimeout(eval_hash, 1);
};
function memo_dk(vp, k) {
dks[vp] = k;
var lv = vp + "?" + k;
if (has(ldks, lv))
return;
ldks.unshift(lv);
if (ldks.length > 32) {
var keep = [], evp = get_evpath();
for (var a = 0; a < ldks.length; a++) {
var s = ldks[a];
if (evp.startsWith(s.replace(/\?[^?]+$/, '')))
keep.push(s);
}
var lim = 32 - keep.length;
for (var a = 0; a < lim; a++) {
if (!has(keep, ldks[a]))
keep.push(ldks[a])
}
ldks = keep;
}
jwrite('dks', ldks);
}
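`memo_dk` above caps the remembered directory keys at 32, preferring entries whose path is an ancestor of the current location. The pruning step, rendered in Python for clarity (simplified; the original also preserves the recency order of the surviving entries):

```python
# Hypothetical Python rendering of the pruning in memo_dk: given a list of
# "vpath?key" strings, keep entries whose vpath prefixes the current path,
# then top up with the most recent remaining entries until the cap.
def prune_dks(ldks, evp, cap=32):
    keep = [s for s in ldks if evp.startswith(s.rsplit("?", 1)[0])]
    for s in ldks:
        if len(keep) >= cap:
            break
        if s not in keep:
            keep.append(s)
    return keep

print(prune_dks(["/a?k1", "/b?k2"], "/a/sub", cap=2))  # ['/a?k1', '/b?k2']
```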
r.setlazy = function (plain) {
var html = ['<div id="plazy">', esc(plain.join(' ')), '</div>'],
all = r.lsc.files.length + r.lsc.dirs.length,
@@ -6488,7 +6583,9 @@ var treectl = (function () {
keys.sort(function (a, b) { return a.localeCompare(b); });
for (var a = 0; a < keys.length; a++) {
var kk = keys[a],
ks = kk.slice(1),
m = /(\?k=[^\n]+)/.exec(kk),
kdk = m ? m[1] : '',
ks = kk.replace(kdk, '').slice(1),
ded = ks.endsWith('\n'),
k = uricom_sdec(ded ? ks.replace(/\n$/, '') : ks),
hek = esc(k[0]),
@@ -6496,7 +6593,7 @@ var treectl = (function () {
url = '/' + (top ? top + uek : uek) + '/',
sym = res[kk] ? '-' : '+',
link = '<a href="#">' + sym + '</a><a href="' +
url + '">' + hek + '</a>';
url + kdk + '">' + hek + '</a>';
if (res[kk]) {
var subtree = parsetree(res[kk], url.slice(1));
@@ -6537,16 +6634,24 @@ var treectl = (function () {
if (!e.state)
return;
var url = new URL(e.state, "https://" + document.location.host);
var hbase = url.pathname;
var cbase = document.location.pathname;
if (url.search.indexOf('doc=') + 1 && hbase == cbase)
var url = new URL(e.state, "https://" + location.host),
req = url.pathname,
hbase = req,
cbase = location.pathname,
mdoc = /[?&]doc=/.exec('' + url),
mdk = /[?&](k=[^&#]+)/.exec('' + url);
if (mdoc && hbase == cbase)
return showfile.show(hbase + showfile.sname(url.search), true);
r.goto(url.pathname, false, true);
if (mdk)
req += '?' + mdk[1];
r.goto(req, false, true);
};
hist_replace(get_evpath() + location.hash);
var evp = get_evpath() + (dk ? '?k=' + dk : '');
hist_replace(evp + location.hash);
r.onscroll = onscroll;
return r;
})();
@@ -7171,11 +7276,11 @@ var arcfmt = (function () {
if (!/^(zip|tar|pax|tgz|txz)$/.exec(txt))
continue;
var ofs = href.lastIndexOf('?');
if (ofs < 0)
var m = /(.*[?&])(tar|zip)([^&#]*)(.*)$/.exec(href);
if (!m)
throw new Error('missing arg in url');
o.setAttribute("href", href.slice(0, ofs + 1) + arg);
o.setAttribute("href", m[1] + arg + m[4]);
o.textContent = fmt.split('_')[0];
}
ebi('selzip').textContent = fmt.split('_')[0];
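The new regex-based replacement in arcfmt can be sanity-checked in isolation (the sample href and arg are made up):

```javascript
// m[1] = prefix up to and including the '?'/'&' before the format arg,
// m[2] = old format, m[3] = its params, m[4] = any remaining args
var href = '/mus/?k=abc&tar=gz&v',
    arg = 'zip',
    m = /(.*[?&])(tar|zip)([^&#]*)(.*)$/.exec(href);

console.log(m[1] + arg + m[4]);  // '/mus/?k=abc&zip&v'
```

Unlike the old `lastIndexOf('?')` slice, this keeps query parameters that follow the format argument.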
@@ -7238,13 +7343,20 @@ var msel = (function () {
vbase = get_evpath();
for (var a = 0, aa = links.length; a < aa; a++) {
var href = noq_href(links[a]).replace(/\/$/, ""),
var qhref = links[a].getAttribute('href'),
href = qhref.split('?')[0].replace(/\/$/, ""),
item = {};
item.id = links[a].getAttribute('id');
item.sel = clgot(links[a].closest('tr'), 'sel');
item.vp = href.indexOf('/') !== -1 ? href : vbase + href;
if (dk) {
var m = /[?&](k=[^&#]+)/.exec(qhref);
item.q = m ? '?' + m[1] : '';
}
else item.q = '';
r.all.push(item);
if (item.sel)
r.sel.push(item);
@@ -7361,6 +7473,9 @@ var msel = (function () {
frm = mknod('form'),
txt = [];
if (dk)
arg += '&k=' + dk;
for (var a = 0; a < sel.length; a++)
txt.push(vsplit(sel[a].vp)[1]);
@@ -7385,7 +7500,7 @@ var msel = (function () {
ev(e);
var sel = r.getsel();
for (var a = 0; a < sel.length; a++)
dl_file(sel[a].vp);
dl_file(sel[a].vp + sel[a].q);
};
r.render = function () {
var tds = QSA('#files tbody td+td+td'),
@@ -7405,7 +7520,7 @@ var msel = (function () {
(function () {
if (!window.FormData)
if (!FormData)
return;
var form = QS('#op_new_md>form'),
@@ -7427,7 +7542,7 @@ var msel = (function () {
(function () {
if (!window.FormData)
if (!FormData)
return;
var form = QS('#op_mkdir>form'),
@@ -7621,7 +7736,7 @@ function show_md(md, name, div, url, depth) {
if (depth) {
clmod(div, 'raw', 1);
div.textContent = "--[ " + name + " ]---------\r\n" + md;
return toast.warn(10, errmsg + (window.WebAssembly ? 'failed to load marked.js' : 'your browser is too old'));
return toast.warn(10, errmsg + (WebAssembly ? 'failed to load marked.js' : 'your browser is too old'));
}
wfp_debounce.n--;
@@ -7934,7 +8049,7 @@ var unpost = (function () {
function linklist() {
var ret = [],
base = document.location.origin.replace(/\/$/, '');
base = location.origin.replace(/\/$/, '');
for (var a = 0; a < r.files.length; a++)
ret.push(base + r.files[a].vp);
@@ -7984,7 +8099,17 @@ var unpost = (function () {
if (!links.length)
continue;
req.push(uricom_dec(r.files[a].vp.split('?')[0]));
var f = r.files[a];
if (f.k == 'u') {
var vp = vsplit(f.vp.split('?')[0]),
dfn = uricom_dec(vp[1]);
for (var iu = 0; iu < up2k.st.files.length; iu++) {
var uf = up2k.st.files[iu];
if (uf.name == dfn && uf.purl == vp[0])
return modal.alert(L.un_uf5);
}
}
req.push(uricom_dec(f.vp.split('?')[0]));
for (var b = 0; b < links.length; b++) {
links[b].removeAttribute('href');
links[b].innerHTML = '[busy]';
@@ -8107,12 +8232,20 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
treectl.reqls(tgt.getAttribute('href'), true);
return ev(e);
}
if (tgt && /\.PARTIAL(\?|$)/.exec('' + tgt.getAttribute('href')) && !window.partdlok) {
ev(e);
modal.confirm(L.f_partial, function () {
window.partdlok = 1;
tgt.click();
}, null);
}
tgt = e.target.closest('a[hl]');
if (tgt) {
var a = ebi(tgt.getAttribute('hl')),
href = a.getAttribute('href'),
fun = function () {
showfile.show(noq_href(a), tgt.getAttribute('lang'));
showfile.show(href, tgt.getAttribute('lang'));
},
szs = ft2dict(a.closest('tr'))[0].sz,
sz = parseInt(szs.replace(/[, ]/g, ''));
@@ -8139,6 +8272,7 @@ function reload_mp() {
mpo.au = mp.au;
mpo.au2 = mp.au2;
mpo.acs = mp.acs;
mpo.fau = mp.fau;
mpl.unbuffer();
}
var plays = QSA('tr>td:first-child>a.play');
@@ -8173,8 +8307,10 @@ function reload_browser() {
for (var a = 0; a < parts.length - 1; a++) {
link += parts[a] + '/';
var link2 = dks[link] ? addq(link, 'k=' + dks[link]) : link;
o = mknod('a');
o.setAttribute('href', link);
o.setAttribute('href', link2);
o.textContent = uricom_dec(parts[a]) || '/';
ebi('path').appendChild(mknod('i'));
ebi('path').appendChild(o);


@@ -190,11 +190,21 @@ input {
padding: .5em .7em;
margin: 0 .5em 0 0;
}
input::placeholder {
font-size: 1.2em;
font-style: italic;
letter-spacing: .04em;
opacity: 0.64;
color: #930;
}
html.z input {
color: #fff;
background: #626;
border-color: #c2c;
}
html.z input::placeholder {
color: #fff;
}
html.z .num {
border-color: #777;
}


@@ -94,7 +94,7 @@
<div>
<form method="post" enctype="multipart/form-data" action="{{ r }}/{{ qvpath }}">
<input type="hidden" name="act" value="login" />
<input type="password" name="cppwd" />
<input type="password" name="cppwd" placeholder=" password" />
<input type="submit" value="Login" />
{% if ahttps %}
<a id="w" href="{{ ahttps }}">switch to https</a>


@@ -17,7 +17,7 @@ function goto_up2k() {
var up2k = null,
up2k_hooks = [],
hws = [],
sha_js = window.WebAssembly ? 'hw' : 'ac', // ff53,c57,sa11
sha_js = WebAssembly ? 'hw' : 'ac', // ff53,c57,sa11
m = 'will use ' + sha_js + ' instead of native sha512 due to';
try {
@@ -717,7 +717,7 @@ function Donut(uc, st) {
sfx();
// firefox may forget that filedrops are user-gestures so it can skip this:
if (uc.upnag && window.Notification && Notification.permission == 'granted')
if (uc.upnag && Notification && Notification.permission == 'granted')
new Notification(uc.nagtxt);
}
@@ -779,8 +779,8 @@ function up2k_init(subtle) {
};
setTimeout(function () {
if (window.WebAssembly && !hws.length)
fetch(SR + '/.cpr/w.hash.js' + CB);
if (WebAssembly && !hws.length)
fetch(SR + '/.cpr/w.hash.js?_=' + TS);
}, 1000);
function showmodal(msg) {
@@ -869,7 +869,7 @@ function up2k_init(subtle) {
bcfg_bind(uc, 'turbo', 'u2turbo', turbolvl > 1, draw_turbo);
bcfg_bind(uc, 'datechk', 'u2tdate', turbolvl < 3, null);
bcfg_bind(uc, 'az', 'u2sort', u2sort.indexOf('n') + 1, set_u2sort);
bcfg_bind(uc, 'hashw', 'hashw', !!window.WebAssembly && (!subtle || !CHROME || MOBILE || VCHROME >= 107), set_hashw);
bcfg_bind(uc, 'hashw', 'hashw', !!WebAssembly && (!subtle || !CHROME || MOBILE || VCHROME >= 107), set_hashw);
bcfg_bind(uc, 'upnag', 'upnag', false, set_upnag);
bcfg_bind(uc, 'upsfx', 'upsfx', false, set_upsfx);
@@ -1347,9 +1347,9 @@ function up2k_init(subtle) {
var evpath = get_evpath(),
draw_each = good_files.length < 50;
if (window.WebAssembly && !hws.length) {
if (WebAssembly && !hws.length) {
for (var a = 0; a < Math.min(navigator.hardwareConcurrency || 4, 16); a++)
hws.push(new Worker(SR + '/.cpr/w.hash.js' + CB));
hws.push(new Worker(SR + '/.cpr/w.hash.js?_=' + TS));
console.log(hws.length + " hashers");
}
@@ -2950,7 +2950,7 @@ function up2k_init(subtle) {
}
function set_hashw() {
if (!window.WebAssembly) {
if (!WebAssembly) {
bcfg_set('hashw', uc.hashw = false);
toast.err(10, L.u_nowork);
}
@@ -2967,7 +2967,7 @@ function up2k_init(subtle) {
nopenag();
}
if (!window.Notification || !HTTPS)
if (!Notification || !HTTPS)
return nopenag();
if (en && Notification.permission == 'default')
@@ -2989,7 +2989,7 @@ function up2k_init(subtle) {
};
}
if (uc.upnag && (!window.Notification || Notification.permission != 'granted'))
if (uc.upnag && (!Notification || Notification.permission != 'granted'))
bcfg_set('upnag', uc.upnag = false);
ebi('nthread_add').onclick = function (e) {


@@ -16,7 +16,6 @@ var wah = '',
NOAC = 'autocorrect="off" autocapitalize="off"',
L, tt, treectl, thegrid, up2k, asmCrypto, hashwasm, vbar, marked,
T0 = Date.now(),
CB = '?_=' + Math.floor(T0 / 1000).toString(36),
R = SR.slice(1),
RS = R ? "/" + R : "",
HALFMAX = 8192 * 8192 * 8192 * 8192,
@@ -52,8 +51,6 @@ catch (ex) {
}
try {
CB = '?' + document.currentScript.src.split('?').pop();
if (navigator.userAgentData.mobile)
MOBILE = true;
@@ -130,7 +127,7 @@ if ((document.location + '').indexOf(',rej,') + 1)
try {
console.hist = [];
var CMAXHIST = 100;
var CMAXHIST = 1000;
var hook = function (t) {
var orig = console[t].bind(console),
cfun = function () {
@@ -182,7 +179,7 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
if (url.indexOf('easymde.js') + 1)
return; // clicking the preview pane
if (url.indexOf('deps/marked.js') + 1 && !window.WebAssembly)
if (url.indexOf('deps/marked.js') + 1 && !WebAssembly)
return; // ff<52
crashed = true;
@@ -740,6 +737,15 @@ function vjoin(p1, p2) {
}
function addq(url, q) {
var uh = url.split('#', 1),
u = uh[0],
h = uh.length == 1 ? '' : '#' + uh[1];
return u + (u.indexOf('?') < 0 ? '?' : '&') + (q === undefined ? '' : q) + h;
}
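The new addq() helper centralizes the scattered `indexOf('?') < 0 ? '?' : '&'` checks used by the callers above; a quick illustration of its behavior on hash-less URLs (sample URLs are made up):

```javascript
// copied from the hunk above
function addq(url, q) {
    var uh = url.split('#', 1),
        u = uh[0],
        h = uh.length == 1 ? '' : '#' + uh[1];
    return u + (u.indexOf('?') < 0 ? '?' : '&') + (q === undefined ? '' : q) + h;
}

console.log(addq('/a/b', 'v'));          // '/a/b?v'
console.log(addq('/a/b?x', 'cache=i'));  // '/a/b?x&cache=i'
```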
function uricom_enc(txt, do_fb_enc) {
try {
return encodeURIComponent(txt);
@@ -1469,7 +1475,7 @@ var toast = (function () {
clmod(obj, 'vis');
r.visible = false;
r.tag = obj;
if (!window.WebAssembly)
if (!WebAssembly)
te = setTimeout(function () {
obj.className = 'hide';
}, 500);
@@ -1885,7 +1891,7 @@ function md_thumbs(md) {
float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : '';
if (!/[?&]cache/.exec(url))
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=i';
url = addq(url, 'cache=i');
md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1);
}


@@ -5,9 +5,6 @@ a living list of upcoming features / fixes / changes, very roughly in order of p
* maybe resumable downloads (chrome-only, jank api)
* maybe checksum validation (return sha512 of requested range in responses, and probably also warks)
* [github issue #64](https://github.com/9001/copyparty/issues/64) - dirkeys 2nd season
* popular feature request, finally time to refactor browser.js i suppose...
* [github issue #37](https://github.com/9001/copyparty/issues/37) - upload PWA
* or [maybe not](https://arstechnica.com/tech-policy/2024/02/apple-under-fire-for-disabling-iphone-web-apps-eu-asks-developers-to-weigh-in/), or [maybe](https://arstechnica.com/gadgets/2024/03/apple-changes-course-will-keep-iphone-eu-web-apps-how-they-are-in-ios-17-4/)


@@ -1,8 +1,118 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0412-2110 `v1.12.2` ie11 fix
## new features
* new option `--bauth-last` for when you're hosting other [basic-auth](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication) services on the same domain 7b94e4ed
* makes it possible to log into copyparty as intended, but it still sees the passwords from the other service until you do
* alternatively, the other new option `--no-bauth` entirely disables basic-auth support, but that also kills [the android app](https://github.com/9001/party-up)
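A sketch of how the two flags might be used (the flag names are from the notes above; the launcher name, volume spec and paths are made up):

```shell
# read basic-auth only as a last resort, so another service's
# credentials on this domain don't get mistaken for copyparty logins:
python3 copyparty-sfx.py --bauth-last -v /srv/pub:pub:r

# or drop basic-auth support entirely (this breaks the android app):
python3 copyparty-sfx.py --no-bauth -v /srv/pub:pub:r
```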
## bugfixes
* internet explorer isn't working?! FIX IT!!! 9e5253ef
* audio transcoding was buggy with filekeys enabled b8733653
* on windows, theoretical chance that antivirus could interrupt renaming files, so preemptively guard against that c8e3ed3a
## other changes
* add a "password" placeholder on the login page since you might think it's asking for a username da26ec36
* config buttons were jank on iOS b772a4f8
* readme: [making your homeserver accessible from the internet](https://github.com/9001/copyparty#at-home)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0409-2334 `v1.12.1` scrolling stuff
## new features
* while viewing pictures/videos, the scrollwheel can be used to view the prev/next file 844d16b9
## bugfixes
* #81 (scrolling suddenly getting disabled) properly fixed after @icxes found another way to reproduce it (thx) 4f0cad54
* and fixed at least one javascript glitch introduced in v1.12.0 while adding dirkeys 989cc613
* directory tree sidebar could fail to render when popping browser history into the lightbox
## other changes
* music preloader is slightly less hyper f89de6b3
* u2c.exe: updated TLS-certs and deps ab18893c
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0406-2011 `v1.12.0` locksmith
## new features
* #64 dirkeys; option to auto-generate passwords for folders, so you can give someone a link to a specific folder inside a volume without sharing the rest of the volume 10bc2d92 32c912bb ef52e2c0 0ae12868
* enabled by volflag `dk` (exact folder only) and/or volflag `dks` (also subfolders); see [readme](https://github.com/9001/copyparty#dirkeys)
* audio transcoding to mp3 if browser doesn't support opus a080759a
* recursively transcode and download a folder using `?tar&mp3`
* accidentally adds support for playing just about any audio format in ie11
* audio equalizer also applies to videos 7744226b
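As a hypothetical example of the recursive transcode-download mentioned above (server URL is made up):

```shell
# fetch the folder as a tar stream with every audio file transcoded to mp3;
# -J -O take the filename from the content-disposition header
curl -J -O "https://example.com/music/?tar&mp3"
```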
## bugfixes
* #81 scrolling could break after viewing an image in the lightbox 9c42cbec
* on phones, audio playback could stop if network is slow/unreliable 59f815ff b88cc7b5 59a53ba9
* fixes the issue on android, but ios/safari appears to be [impossible](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#music-playback-halting-on-phones) d94b5b3f
## other changes
* updated dompurify to 3.0.11
* copyparty.exe: updated to python 3.11.9
* support for building with pyoxidizer was removed 5ab54763
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0323-1724 `v1.11.2` public idp volumes
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates, such as [vulnerabilities](https://github.com/9001/copyparty/security) (most recently 2023-07-23)
## new features
* global-option `--iobuf` to set a custom I/O buffersize 2b24c50e
* changes the default buffersize to 256 KiB everywhere (was a mix of 64 and 512)
* may improve performance of networked volumes (s3 etc.) if increased
* on gbit networks: download-as-tar is now up to 20% faster
* slightly faster FTP and TFTP too
* global-option `--s-rd-sz` to set a custom read-size for sockets c6acd3a9
* changes the default from 32 to 256 KiB
* may improve performance of networked volumes (s3 etc.) if increased
* on 10gbit networks: uploading large files is now up to 17% faster
* add url parameter `?replace` to overwrite any existing files with a multipart-post c6acd3a9
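A sketch combining the new knobs (the flag and parameter names are from the notes above; the values, paths, and multipart field names are assumptions, including that the sizes are given in bytes):

```shell
# bump both buffer sizes to 512 KiB for a networked (s3-backed) volume:
python3 copyparty-sfx.py --iobuf 524288 --s-rd-sz 524288 -v /mnt/s3:pub:rw

# multipart-post that overwrites any existing copy of the file
# (field names act=bput / f= are an assumption, not from the notes above):
curl -F act=bput -F f=@report.pdf "https://example.com/pub/?replace"
```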
## bugfixes
* #79 idp volumes (introduced in [v1.11.0](https://github.com/9001/copyparty/releases/tag/v1.11.0)) would only accept permissions for the user that owned the volume; was impossible to grant read/write-access to other users d30ae845
## other changes
* mention the [lack of persistence for idp volumes](https://github.com/9001/copyparty/blob/hovudstraum/docs/idp.md#important-notes) in the IdP docs 2f20d29e
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0318-1709 `v1.11.1` dont ban the pipes
the [previous release](https://github.com/9001/copyparty/releases/tag/v1.11.0) had all the fun new features... this one's just bugfixes
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
### no vulnerabilities since 2023-07-23
* there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates
* [v1.8.7](https://github.com/9001/copyparty/releases/tag/v1.8.7) (2023-07-23) - [CVE-2023-38501](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-38501) - reflected XSS
* [v1.8.2](https://github.com/9001/copyparty/releases/tag/v1.8.2) (2023-07-14) - [CVE-2023-37474](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-37474) - path traversal (first CVE)
## bugfixes
* less aggressive rejection of requests from banned IPs 51d31588


@@ -20,7 +20,8 @@
* [just the sfx](#just-the-sfx)
* [build from release tarball](#build-from-release-tarball) - uses the included prebuilt webdeps
* [complete release](#complete-release)
* [todo](#todo) - roughly sorted by priority
* [debugging](#debugging)
* [music playback halting on phones](#music-playback-halting-on-phones) - mostly fine on android
* [discarded ideas](#discarded-ideas)
@@ -301,15 +302,26 @@ in the `scripts` folder:
* run `./rls.sh 1.2.3` which uploads to pypi + creates github release + sfx
# todo
# debugging
roughly sorted by priority
## music playback halting on phones
* nothing! currently
mostly fine on android, but still haven't found a way to massage iphones into behaving well
* conditionally starting/stopping mp.fau according to mp.au.readyState <3 or <4 doesn't help
* loop=true doesn't work, and manually looping mp.fau from an onended also doesn't work (it does nothing)
* assigning fau.currentTime in a timer doesn't work, as safari merely pretends to assign it
* on ios 16.7.7, mp.fau can sometimes make everything visibly work correctly, but no audio is actually hitting the speakers
can be reproduced with `--no-sendfile --s-wr-sz 8192 --s-wr-slp 0.3 --rsp-slp 6` and then play a collection of small audio files with the screen off, `ffmpeg -i track01.cdda.flac -c:a libopus -b:a 128k -segment_time 12 -f segment smol-%02d.opus`
## discarded ideas
* optimization attempts which didn't improve performance
* remove brokers / multiprocessing stuff; https://github.com/9001/copyparty/tree/no-broker
* reduce the nesting / indirections in `HttpCli` / `httpcli.py`
* nearly zero benefit from stuff like replacing all the `self.conn.hsrv` with a local `hsrv` variable
* reduce up2k roundtrips
* start from a chunk index and just go
* terminate client on bad data


@@ -1,82 +1,71 @@
# recipe for building an exe with nuitka (extreme jank edition)
#
# NOTE: win7 and win10 builds both work on win10 but
# on win7 they immediately c0000005 in kernelbase.dll
#
# first install python-3.6.8-amd64.exe
# [x] add to path
#
NOTE: copyparty runs SLOWER when compiled with nuitka;
just use copyparty-sfx.py and/or pyinstaller instead
( the sfx and the pyinstaller EXEs are equally fast if you
have the latest jinja2 installed, but the older jinja that
comes bundled with the sfx is slightly faster yet )
roughly, copyparty-sfx.py is 6% faster than copyparty.exe
(win10-pyinstaller), and copyparty.exe is 10% faster than
nuitka, making copyparty-sfx.py 17% faster than nuitka
NOTE: every time a nuitka-compiled copyparty.exe is launched,
it will show the windows firewall prompt since nuitka will
pick a new unique location in %TEMP% to unpack an exe into,
unlike pyinstaller which doesn't fork itself on startup...
might be fixable by configuring nuitka differently, idk
NOTE: nuitka EXEs are larger than pyinstaller ones;
a minimal nuitka build of just the sfx (with its bundled
dependencies) was already the same size as the pyinstaller
copyparty.exe which also includes Mutagen and Pillow
NOTE: nuitka takes a lot longer to build than pyinstaller
(due to actual compilation of course, but still)
NOTE: binaries built with nuitka cannot run on windows7,
even when compiled with python 3.6 on windows 7 itself
NOTE: `--python-flags=-m` is the magic sauce to
correctly compile `from .util import Daemon`
(which otherwise only explodes at runtime)
NOTE: `--deployment` doesn't seem to affect performance
########################################################################
# copypaste the rest of this file into cmd
rem from pypi
cd \users\ed\downloads
python -m pip install --user Nuitka-0.6.14.7.tar.gz
rem https://github.com/brechtsanders/winlibs_mingw/releases/download/10.2.0-11.0.0-8.0.0-r5/winlibs-x86_64-posix-seh-gcc-10.2.0-llvm-11.0.0-mingw-w64-8.0.0-r5.zip
mkdir C:\Users\ed\AppData\Local\Nuitka\
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\gcc\
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\gcc\x86_64\
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\gcc\x86_64\10.2.0-11.0.0-8.0.0-r5\
copy c:\users\ed\downloads\winlibs-x86_64-posix-seh-gcc-10.2.0-llvm-11.0.0-mingw-w64-8.0.0-r5.zip C:\Users\ed\AppData\Local\Nuitka\Nuitka\gcc\x86_64\10.2.0-11.0.0-8.0.0-r5\winlibs-x86_64-posix-seh-gcc-10.2.0-llvm-11.0.0-mingw-w64-8.0.0-r5.zip
rem https://github.com/ccache/ccache/releases/download/v3.7.12/ccache-3.7.12-windows-32.zip
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\ccache\
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\ccache\v3.7.12\
copy c:\users\ed\downloads\ccache-3.7.12-windows-32.zip C:\Users\ed\AppData\Local\Nuitka\Nuitka\ccache\v3.7.12\ccache-3.7.12-windows-32.zip
python -m pip install --user -U nuitka
rem https://dependencywalker.com/depends22_x64.zip
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\depends\
mkdir C:\Users\ed\AppData\Local\Nuitka\Nuitka\depends\x86_64\
copy c:\users\ed\downloads\depends22_x64.zip C:\Users\ed\AppData\Local\Nuitka\Nuitka\depends\x86_64\depends22_x64.zip
cd %homedrive%
cd %homepath%\downloads
cd \
rd /s /q %appdata%\..\local\temp\pe-copyparty
cd \users\ed\downloads
python copyparty-sfx.py -h
cd %appdata%\..\local\temp\pe-copyparty\copyparty
rd /s /q copypuitka
mkdir copypuitka
cd copypuitka
python
import os, re
os.rename('../dep-j2/jinja2', '../jinja2')
os.rename('../dep-j2/markupsafe', '../markupsafe')
rd /s /q %temp%\pe-copyparty
python ..\copyparty-sfx.py --version
print("# nuitka dies if .__init__.stuff is imported")
with open('__init__.py','r',encoding='utf-8') as f:
t1 = f.read()
move %temp%\pe-copyparty\copyparty .\
move %temp%\pe-copyparty\partftpy .\
move %temp%\pe-copyparty\ftp\pyftpdlib .\
move %temp%\pe-copyparty\j2\jinja2 .\
move %temp%\pe-copyparty\j2\markupsafe .\
with open('util.py','r',encoding='utf-8') as f:
t2 = f.read().split('\n')[3:]
rd /s /q %temp%\pe-copyparty
t2 = [x for x in t2 if 'from .__init__' not in x]
t = t1 + '\n'.join(t2)
with open('__init__.py','w',encoding='utf-8') as f:
f.write('\n')
python -m nuitka ^
--onefile --deployment --python-flag=-m ^
--include-package=markupsafe ^
--include-package=jinja2 ^
--include-package=partftpy ^
--include-package=pyftpdlib ^
--include-data-dir=copyparty\web=copyparty\web ^
--include-data-dir=copyparty\res=copyparty\res ^
--run copyparty
with open('util.py','w',encoding='utf-8') as f:
f.write(t)
print("# local-imports fail, prefix module names")
ptn = re.compile(r'^( *from )(\.[^ ]+ import .*)')
for d, _, fs in os.walk('.'):
for f in fs:
fp = os.path.join(d, f)
if not fp.endswith('.py'):
continue
t = ''
with open(fp,'r',encoding='utf-8') as f:
for ln in [x.rstrip('\r\n') for x in f]:
m = ptn.match(ln)
if not m:
t += ln + '\n'
continue
p1, p2 = m.groups()
t += "{}copyparty{}\n".format(p1, p2).replace("__init__", "util")
with open(fp,'w',encoding='utf-8') as f:
f.write(t)
exit()
cd ..
rd /s /q bout & python -m nuitka --standalone --onefile --windows-onefile-tempdir --python-flag=no_site --assume-yes-for-downloads --include-data-dir=copyparty\web=copyparty\web --include-data-dir=copyparty\res=copyparty\res --run --output-dir=bout --mingw64 --include-package=markupsafe --include-package=jinja2 copyparty


@@ -1,52 +0,0 @@
pyoxidizer doesn't crosscompile yet so need to build in a windows vm,
luckily possible to do mostly airgapped (https-proxy for crates)
none of this is version-specific but doing absolute links just in case
(only exception is py3.8 which is the final win7 ver)
# deps (download on linux host):
https://www.python.org/ftp/python/3.10.7/python-3.10.7-amd64.exe
https://github.com/indygreg/PyOxidizer/releases/download/pyoxidizer%2F0.22.0/pyoxidizer-0.22.0-x86_64-pc-windows-msvc.zip
https://github.com/upx/upx/releases/download/v3.96/upx-3.96-win64.zip
https://static.rust-lang.org/dist/rust-1.61.0-x86_64-pc-windows-msvc.msi
https://github.com/indygreg/python-build-standalone/releases/download/20220528/cpython-3.8.13%2B20220528-i686-pc-windows-msvc-static-noopt-full.tar.zst
# need cl.exe, prefer 2017 -- download on linux host:
https://visualstudio.microsoft.com/downloads/?q=build+tools
https://docs.microsoft.com/en-us/visualstudio/releases/2022/release-history#release-dates-and-build-numbers
https://aka.ms/vs/15/release/vs_buildtools.exe # 2017
https://aka.ms/vs/16/release/vs_buildtools.exe # 2019
https://aka.ms/vs/17/release/vs_buildtools.exe # 2022
https://docs.microsoft.com/en-us/visualstudio/install/workload-component-id-vs-build-tools?view=vs-2017
# use disposable w10 vm to prep offline installer; xfer to linux host with firefox to copyparty
vs_buildtools-2017.exe --add Microsoft.VisualStudio.Workload.MSBuildTools --add Microsoft.VisualStudio.Workload.VCTools --add Microsoft.VisualStudio.Component.Windows10SDK.17763 --layout c:\msbt2017 --lang en-us
# need two proxies on host; s5s or ssh for msys2(socks5), and tinyproxy for rust(http)
UP=- python3 socks5server.py 192.168.123.1 4321
ssh -vND 192.168.123.1:4321 localhost
git clone https://github.com/tinyproxy/tinyproxy.git
./autogen.sh
./configure --prefix=/home/ed/pe/tinyproxy
make -j24 install
printf '%s\n' >cfg "Port 4380" "Listen 192.168.123.1"
./tinyproxy -dccfg
https://github.com/msys2/msys2-installer/releases/download/2022-09-04/msys2-x86_64-20220904.exe
export all_proxy=socks5h://192.168.123.1:4321
# if chat dies after auth (2 messages) it probably failed dns, note the h in socks5h to tunnel dns
pacman -Syuu
pacman -S git patch mingw64/mingw-w64-x86_64-zopfli
cd /c && curl -k https://192.168.123.1:3923/ro/ox/msbt2017/?tar | tar -xv
first install certs from msbt/certificates then admin-cmd `vs_buildtools.exe --noweb`,
default selection (vc++2017-v15.9-v14.16, vc++redist, vc++bt-core) += win10sdk (for io.h)
install rust without documentation, python 3.10, put upx and pyoxidizer into ~/bin,
[cmd.exe] python -m pip install --user -U wheel-0.37.1.tar.gz strip-hints-0.1.10.tar.gz
p=192.168.123.1:4380; export https_proxy=$p; export http_proxy=$p
# and with all of the one-time-setup out of the way,
mkdir /c/d; cd /c/d && curl -k https://192.168.123.1:3923/cpp/gb?pw=wark > gb && git clone gb copyparty
cd /c/d/copyparty/ && curl -k https://192.168.123.1:3923/cpp/patch?pw=wark | patch -p1
cd /c/d/copyparty/scripts && CARGO_HTTP_CHECK_REVOKE=false PATH=/c/Users/$USER/AppData/Local/Programs/Python/Python310:/c/Users/$USER/bin:"$(cygpath "C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC\14.16.27023\bin\Hostx86\x86"):$PATH" ./make-sfx.sh ox ultra


@@ -48,6 +48,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [filebrowser](#filebrowser)
* [filegator](#filegator)
* [sftpgo](#sftpgo)
* [arozos](#arozos)
* [updog](#updog)
* [goshs](#goshs)
* [gimme-that](#gimme-that)
@@ -93,6 +94,7 @@ the softwares,
* `j` = [filebrowser](https://github.com/filebrowser/filebrowser)
* `k` = [filegator](https://github.com/filegator/filegator)
* `l` = [sftpgo](https://github.com/drakkan/sftpgo)
* `m` = [arozos](https://github.com/tobychui/arozos)
some softwares not in the matrixes,
* [updog](#updog)
@@ -113,22 +115,22 @@ symbol legend,
## general
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | |
| runs on iOS | | | | | | | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ | █ |
| runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ |
| zero setup, just go | █ | █ | █ | | | | █ | | | █ | | |
| android app | | | | █ | █ | | | | | | | |
| iOS app | | | | █ | █ | | | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | | |
| runs on iOS | | | | | | | | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ | █ | |
| runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ | |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ | █ |
| zero setup, just go | █ | █ | █ | | | | █ | | | █ | | | █ |
| android app | | | | █ | █ | | | | | | | | |
| iOS app | | | | █ | █ | | | | | | | | |
* `zero setup` = you can get a mostly working setup by just launching the app, without having to install any software or configure whatever
* `a`/copyparty remarks:
@@ -140,37 +142,39 @@ symbol legend,
* `f`/rclone must be started with the command `rclone serve webdav .` or similar
* `h`/chibisafe has undocumented windows support
* `i`/sftpgo must be launched with a command
* `m`/arozos has partial windows support
## file transfer
*the thing that copyparty is actually kinda good at*
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | | | █ | | █ | █ | | █ |
| download folder as tar | █ | | | | | | | | | █ | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | |
| resumable uploads | █ | | | | | | | | █ | | █ | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | |
| upload acceleration | █ | | | | | | | | █ | | █ | |
| upload verification | █ | | | █ | █ | | | | █ | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | |
| upload rules | | | | | | | | | | | | |
| ┗ max disk usage | █ | █ | | | | | | | | | | |
| ┗ max filesize | █ | | | | | | | █ | | | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | |
| ┗ max file age | █ | | | | | | | | | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | |
| ┗ compress before write | █ | | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | | | | | |
| ┗ mimetype reject-list | | | | | | | | | | | | |
| ┗ extension reject-list | | | | | | | | | • | | | |
| checksums provided | | | | █ | █ | | | | | | | |
| cloud storage backend | | | | █ | █ | | | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | | | █ | | █ | █ | | █ | |
| download folder as tar | █ | | | | | | | | | █ | | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | | █ |
| resumable uploads | █ | | | | | | | | █ | | █ | | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | | █ |
| upload acceleration | █ | | | | | | | | █ | | █ | | |
| upload verification | █ | | | █ | █ | | | | █ | | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | | |
| race the beam ("p2p") | █ | | | | | | | | | • | | | |
| keep last-modified time | █ | | | | | | | | | | | | |
| upload rules | | | | | | | | | | | | | |
| ┗ max disk usage | █ | █ | | | | | | | █ | | | █ | █ |
| ┗ max filesize | █ | | | | | | | | | | █ | █ | |
| ┗ max items in folder | █ | | | | | | | | | | | | |
| ┗ max file age | █ | | | | | | | | █ | | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | | |
| ┗ compress before write | █ | | | | | | | | | | | | |
| ┗ randomize filename | | | | | | | | █ | █ | | | | |
| ┗ mimetype reject-list | | | | | | | | | • | | | | • |
| ┗ extension reject-list | | | | | | | | | | | | | • |
| checksums provided | | | | █ | █ | | | | | | | | |
| cloud storage backend | | | | █ | █ | █ | | | | | █ | █ | |
* `upload segmenting` = files are sliced into chunks, making it possible to upload files larger than 100 MiB through cloudflare, for example
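the segmenting idea above can be sketched in a few lines; this is illustrative only (not copyparty's actual protocol), and the 64 MiB chunk size is an assumption chosen to stay under cloudflare's 100 MiB request limit:

```python
# hypothetical sketch of upload segmenting: slice a file into
# fixed-size chunks so each HTTP request stays under a proxy's
# body-size limit; chunk size is an assumption, not copyparty's
CHUNK = 64 * 1024 * 1024  # 64 MiB, comfortably below 100 MiB

def iter_chunks(path, chunk=CHUNK):
    """yield (offset, data) pairs covering the whole file"""
    with open(path, "rb") as f:
        while True:
            ofs = f.tell()
            data = f.read(chunk)
            if not data:
                return
            yield ofs, data
```

each chunk can then be sent as its own request, with the offset letting the server place it correctly even if chunks arrive out of order.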
@@ -178,6 +182,8 @@ symbol legend,
* `upload verification` = uploads are checksummed or otherwise confirmed to have been transferred correctly
* `race the beam` = files can be downloaded while they're still uploading; downloaders are slowed down such that the uploader is always ahead
* `checksums provided` = when downloading a file from the server, the file's checksum is provided for verification client-side
* `cloud storage backend` = able to serve files from (and write to) s3 or similar cloud services; `` means the software can do this with some help from `rclone mount` as a bridge
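the `race the beam` behavior described above boils down to streaming a file that is still being written, pausing whenever the reader catches up to the writer; a minimal sketch, assuming a `done()` callback signals that the upload finished (copyparty's real implementation differs):

```python
# minimal sketch of "race the beam": stream a growing file,
# sleeping whenever we hit the current end-of-file so the
# uploader always stays ahead; poll interval is an assumption
import time

def stream_growing_file(path, done, chunk=65536, poll=0.05):
    """yield chunks of `path`; `done()` returns True once the
    uploader has finished, letting us drain the tail and stop"""
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if data:
                yield data
                continue
            if done():  # writer finished and we reached EOF
                return
            time.sleep(poll)  # wait for the uploader to get ahead
```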
@@ -192,26 +198,27 @@ symbol legend,
* resumable/segmented uploads only over SFTP, not over HTTP
* upload rules are totals only, not over time
* can probably do extension/mimetype rejection similar to copyparty
* `m`/arozos download-as-zip is not streaming; it creates the full zipfile before download can start, and fails on big folders
## protocols and client support
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ |
| serve ftp (tcp) | █ | | | | | █ | | | | | | █ |
| serve ftps (tls) | █ | | | | | █ | | | | | | █ |
| serve tftp (udp) | █ | | | | | | | | | | | |
| serve sftp (ssh) | | | | | | █ | | | | | | █ |
| serve smb/cifs | | | | | | █ | | | | | | |
| serve dlna | | | | | | █ | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ |
| zeroconf | █ | | | | | | | | | | | |
| supports netscape 4 | | | | | | █ | | | | | • | |
| ...internet explorer 6 | | █ | | █ | | █ | | | | | • | |
| mojibake filenames | █ | | | • | • | █ | █ | • | | • | | |
| undecodable filenames | █ | | | • | • | █ | | • | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ | █ |
| serve ftp (tcp) | █ | | | | | █ | | | | | | █ | █ |
| serve ftps (tls) | █ | | | | | █ | | | | | | █ | |
| serve tftp (udp) | █ | | | | | | | | | | | | |
| serve sftp (ssh) | | | | | | █ | | | | | | █ | █ |
| serve smb/cifs | | | | | | █ | | | | | | | |
| serve dlna | | | | | | █ | | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ | |
| zeroconf | █ | | | | | | | | | | | | █ |
| supports netscape 4 | | | | | | █ | | | | | • | | |
| ...internet explorer 6 | | █ | | █ | | █ | | | | | • | | |
| mojibake filenames | █ | | | • | • | █ | █ | • | | • | | | |
| undecodable filenames | █ | | | • | • | █ | | • | | | | | |
* `webdav` = protocol convenient for mounting a remote server as a local filesystem; see zeroconf:
* `zeroconf` = the server announces itself on the LAN, [automatically appearing](https://user-images.githubusercontent.com/241032/215344737-0eae8d98-9496-4256-9aa8-cd2f6971810d.png) on other zeroconf-capable devices
@@ -222,61 +229,66 @@ symbol legend,
* extremely minimal samba/cifs server
* netscape 4 / ie6 support is mostly listed as a joke altho some people have actually found it useful ([ie4 tho](https://user-images.githubusercontent.com/241032/118192791-fb31fe00-b446-11eb-9647-898ea8efc1f7.png))
* `l`/sftpgo translates mojibake filenames into valid utf-8 (information loss)
* `m`/arozos has readonly-support for older browsers; no uploading
## server configuration
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| config from cmd args | █ | | | | | █ | █ | | | █ | | |
| config files | █ | █ | █ | | | █ | | █ | | █ | • | |
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | |
| same-port http / https | █ | | | | | | | | | | | |
| listen multiple ports | █ | | | | | | | | | | | █ |
| virtual file system | █ | █ | █ | | | | █ | | | | | █ |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| config from cmd args | █ | | | | | █ | █ | | | █ | | | |
| config files | █ | █ | █ | | | █ | | █ | | █ | • | | |
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | | █ |
| same-port http / https | █ | | | | | | | | | | | | |
| listen multiple ports | █ | | | | | | | | | | | █ | |
| virtual file system | █ | █ | █ | | | | █ | | | | | █ | |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ | |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | | • |
* `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead
* `l`/sftpgo:
* config: users must be added through gui / api calls
* `m`/arozos:
* configuration is primarily through GUI
* reverse-proxy is not guaranteed to see the correct client IP
## server capabilities
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| per-account chroot | | | | | | | | | | | | █ |
| single-sign-on | | | | █ | █ | | | | • | | | |
| token auth | | | | █ | █ | | | █ | | | | |
| 2fa | | | | █ | █ | | | | | | | █ |
| per-volume permissions | █ | █ | █ | █ | █ | █ | █ | | █ | █ | | █ |
| per-folder permissions | | | | █ | █ | | █ | | █ | █ | | █ |
| per-file permissions | | | | █ | █ | | █ | | █ | | | |
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | |
| unmap subfolders | █ | | | | | | █ | | | █ | | • |
| index.html blocks list | | | | | | | █ | | | • | | |
| write-only folders | █ | | | | | | | | | | █ | █ |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | | █ |
| file indexing | █ | | █ | █ | █ | | | █ | █ | █ | | |
| ┗ per-volume db | █ | | • | • | • | | | • | • | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | |
| ┗ db stored out-of-tree | █ | | █ | █ | █ | | | • | • | █ | | |
| ┗ existing file tree | █ | | █ | | | | | | | █ | | |
| file action event hooks | █ | | | | | | | | | █ | | █ |
| one-way folder sync | █ | | | █ | █ | █ | | | | | | |
| full sync | | | | █ | █ | | | | | | | |
| speed throttle | | █ | █ | | | █ | | | █ | | | █ |
| anti-bruteforce | █ | █ | █ | █ | █ | | | | • | | | █ |
| dyndns updater | | █ | | | | | | | | | | |
| self-updater | | | █ | | | | | | | | | |
| log rotation | █ | | █ | █ | █ | | | • | █ | | | █ |
| upload tracking / log | █ | █ | • | █ | █ | | | █ | █ | | | |
| curl-friendly ls | █ | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | | | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| per-account chroot | | | | | | | | | | | | █ | |
| single-sign-on | | | | █ | █ | | | | • | | | | |
| token auth | | | | █ | █ | | | █ | | | | | █ |
| 2fa | | | | █ | █ | | | | | | | █ | |
| per-volume permissions | █ | █ | █ | █ | █ | █ | █ | | █ | █ | | █ | █ |
| per-folder permissions | | | | █ | █ | | █ | | █ | █ | | █ | █ |
| per-file permissions | | | | █ | █ | | █ | | █ | | | | █ |
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | | █ |
| unmap subfolders | █ | | | | | | █ | | | █ | | • | |
| index.html blocks list | | | | | | | █ | | | • | | | |
| write-only folders | █ | | | | | | | | | | █ | █ | |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | | █ | |
| file indexing | █ | | █ | █ | █ | | | █ | █ | █ | | | |
| ┗ per-volume db | █ | | • | • | • | | | • | • | | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | | |
| ┗ db stored out-of-tree | █ | | █ | █ | █ | | | • | • | █ | | | |
| ┗ existing file tree | █ | | █ | | | | | | | █ | | | |
| file action event hooks | █ | | | | | | | | | █ | | █ | • |
| one-way folder sync | █ | | | █ | █ | █ | | | | | | | |
| full sync | | | | █ | █ | | | | | | | | |
| speed throttle | | █ | █ | | | █ | | | █ | | | █ | |
| anti-bruteforce | █ | █ | █ | █ | █ | | | | • | | | █ | • |
| dyndns updater | | █ | | | | | | | | | | | |
| self-updater | | | █ | | | | | | | | | | █ |
| log rotation | █ | | █ | █ | █ | | | • | █ | | | █ | • |
| upload tracking / log | █ | █ | • | █ | █ | | | █ | █ | | | | █ |
| prometheus metrics | █ | | | █ | | | | | | | | | |
| curl-friendly ls | █ | | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | █ | █ | • | | | | | |
* `unmap subfolders` = "shadowing"; mounting a local folder in the middle of an existing filesystem tree in order to disable access below that path
* `files stored as-is` = uploaded files are trivially readable from the server HDD, not sliced into chunks or in weird folder structures or anything like that
@@ -302,49 +314,51 @@ symbol legend,
* `l`/sftpgo:
* `file action event hooks` also include on-download triggers
* `upload tracking / log` in main logfile
* `m`/arozos:
* `2fa` maybe possible through LDAP/Oauth
## client features
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | |
| themes | █ | █ | | █ | | | | | █ | | | |
| directory tree nav | █ | | | | █ | | | | █ | | | |
| multi-column sorting | █ | | | | | | | | | | | |
| thumbnails | █ | | | | | | | █ | █ | | | |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | |
| ┗ audio spectrograms | █ | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | | | |
| ┗ gapless playback | █ | | | | | | | | • | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | |
| ┗ video transcoding | | | | | | | | | █ | | | |
| audio BPM detector | █ | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | | |
| search by date / size | █ | | | | █ | | | █ | █ | | | |
| search by bpm / key | █ | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | |
| search in file contents | | | | █ | █ | | | | █ | | | |
| search by custom parser | █ | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | |
| markdown viewer | █ | | | | █ | | | | █ | | | |
| markdown editor | █ | | | | █ | | | | █ | | | |
| readme.md in listing | █ | | | █ | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | | █ |
| themes | █ | █ | | █ | | | | | █ | | | | |
| directory tree nav | █ | | | | █ | | | | █ | | | | |
| multi-column sorting | █ | | | | | | | | | | | | |
| thumbnails | █ | | | | | | | █ | █ | | | | █ |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | | █ |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | | █ |
| ┗ audio spectrograms | █ | | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | | | | █ |
| ┗ gapless playback | █ | | | | | | | | • | | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | | █ |
| ┗ video transcoding | | | | | | | | | █ | | | | |
| audio BPM detector | █ | | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | | | |
| search by date / size | █ | | | | █ | | | █ | █ | | | | |
| search by bpm / key | █ | | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | | |
| search in file contents | | | | █ | █ | | | | █ | | | | |
| search by custom parser | █ | | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | | |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | | █ |
| markdown viewer | █ | | | | █ | | | | █ | | | | █ |
| markdown editor | █ | | | | █ | | | | █ | | | | █ |
| readme.md in listing | █ | | | █ | | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | | █ |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | | █ |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | | █ |
* `single-page app` = multitasking; possible to continue navigating while uploading
* `audio player » os-integration` = use the [lockscreen](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) or [media hotkeys](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) to play/pause, prev/next song
@@ -360,14 +374,14 @@ symbol legend,
## integration
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | | | |
| discord | █ | | | | | | | | | | | |
| ┗ announce uploads | █ | | | | | | | | | | | |
| ┗ custom embeds | | | | | | | | | | | | |
| sharex | █ | | | █ | | █ | | █ | | | | |
| flameshot | | | | | | █ | | | | | | |
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l | m |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | | | | |
| discord | █ | | | | | | | | | | | | |
| ┗ announce uploads | █ | | | | | | | | | | | | |
| ┗ custom embeds | | | | | | | | | | | | | |
| sharex | █ | | | █ | | █ | | █ | | | | | |
| flameshot | | | | | | █ | | | | | | | |
* sharex `` = yes, but does not provide example sharex config
* `a`/copyparty remarks:
@@ -393,6 +407,7 @@ symbol legend,
| filebrowser | go | █ apl2 | 20 MB |
| filegator | php | █ mit | • |
| sftpgo | go | ‼ agpl | 44 MB |
| arozos | go | ░ gpl3 | 531 MB |
| updog | python | █ mit | 17 MB |
| goshs | go | █ mit | 11 MB |
| gimme-that | python | █ mit | 4.8 MB |
@@ -504,12 +519,14 @@ symbol legend,
* ✅ token auth (api keys)
## [kodbox](https://github.com/kalcaddle/kodbox)
* this thing is insane
* this thing is insane (but is getting competition from [arozos](#arozos))
* php; [docker](https://hub.docker.com/r/kodcloud/kodbox)
* 🔵 *upload segmenting, acceleration, and integrity checking!*
* ⚠️ but uploads are not resumable(?)
* ⚠️ not portable
* ⚠️ isolated on-disk file hierarchy, incompatible with other software
* ⚠️ uploading small files to copyparty is 16x faster
* ⚠️ uploading large files to copyparty is 3x faster
* ⚠️ http/webdav only; no ftp or zeroconf
* ⚠️ some parts of the GUI are in chinese
* ✅ fantastic ui/ux
@@ -569,6 +586,24 @@ symbol legend,
* ✅ on-download event hook (otherwise same as copyparty)
* ✅ more extensive permissions control
## [arozos](https://github.com/tobychui/arozos)
* big suite of applications similar to [kodbox](#kodbox); copyparty is better at downloading/uploading/music/indexing, but arozos has other advantages
* go; primarily linux (limited support for windows)
* ⚠️ uploads not resumable / integrity-checked
* ⚠️ uploading small files to copyparty is 2.7x faster
* ⚠️ uploading large files to copyparty is at least 10% faster
* arozos is websocket-based with 512 KiB chunks; it writes each chunk to a separate file and then merges them
* copyparty splices directly into the final file; faster and better for the HDD and filesystem
* ⚠️ no directory tree navpane; not as easy to navigate
* ⚠️ download-as-zip is not streaming; creates a temp file on the server
* ⚠️ not self-contained (pulls from jsdelivr)
* ⚠️ has an audio player, but supports fewer filetypes
* ⚠️ limited support for configuring real-ip detection
* ✅ sftp server
* ✅ settings gui
* ✅ good-looking gui
* ✅ an IDE, msoffice viewer, rich host integration, much more
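the splicing approach mentioned in the arozos comparison above (writing each chunk straight into its final position instead of merging per-chunk temp files) can be sketched with plain seek + write; illustrative only, and a real server would additionally track which byte ranges have arrived:

```python
# sketch of splicing out-of-order upload chunks directly into
# the destination file, avoiding per-chunk temp files and a
# merge pass; bookkeeping of received ranges is omitted
def preallocate(path, size):
    """create the destination file at its final size up front"""
    with open(path, "wb") as f:
        f.truncate(size)

def write_chunk(path, offset, data):
    """write one chunk at its known offset in the final file"""
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(data)
```

this keeps writes mostly sequential per chunk and spares the filesystem a second full copy of the file during the merge.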
## [updog](https://github.com/sc0tfree/updog)
* python; cross-platform
* basic directory listing with upload feature


@@ -1,48 +0,0 @@
# builds win7-i386 exe on win10-ltsc-1809(17763.316)
# see docs/pyoxidizer.txt
def make_exe():
dist = default_python_distribution(flavor="standalone_static", python_version="3.8")
policy = dist.make_python_packaging_policy()
policy.allow_files = True
policy.allow_in_memory_shared_library_loading = True
#policy.bytecode_optimize_level_zero = True
#policy.include_distribution_sources = False # error instantiating embedded Python interpreter: during initializing Python main: init_fs_encoding: failed to get the Python codec of the filesystem encoding
policy.include_distribution_resources = False
policy.include_non_distribution_sources = False
policy.include_test = False
python_config = dist.make_python_interpreter_config()
#python_config.module_search_paths = ["$ORIGIN/lib"]
python_config.run_module = "copyparty"
exe = dist.to_python_executable(
name="copyparty",
config=python_config,
packaging_policy=policy,
)
exe.windows_runtime_dlls_mode = "never"
exe.windows_subsystem = "console"
exe.add_python_resources(exe.read_package_root(
path="sfx",
packages=[
"copyparty",
"jinja2",
"markupsafe",
"pyftpdlib",
"python-magic",
]
))
return exe
def make_embedded_resources(exe):
return exe.to_embedded_resources()
def make_install(exe):
files = FileManifest()
files.add_python_resource("copyparty", exe)
return files
register_target("exe", make_exe)
register_target("resources", make_embedded_resources, depends=["exe"], default_build_script=True)
register_target("install", make_install, depends=["exe"], default=True)
resolve_targets()


@@ -3,7 +3,7 @@ WORKDIR /z
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
ver_hashwasm=4.10.0 \
ver_marked=4.3.0 \
ver_dompf=3.0.9 \
ver_dompf=3.0.11 \
ver_mde=2.18.0 \
ver_codemirror=5.65.16 \
ver_fontawesome=5.13.0 \
@@ -24,7 +24,9 @@ ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
# the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
RUN mkdir -p /z/dist/no-pk \
&& wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
&& apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev py3-brotli \
&& apk add \
bash brotli cmake make g++ git gzip lame npm patch pigz \
python3 python3-dev py3-brotli sox tar unzip wget \
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
&& wget https://github.com/openpgpjs/asmcrypto.js/archive/$ver_asmcrypto.tar.gz -O asmcrypto.tgz \
&& wget https://github.com/markedjs/marked/archive/v$ver_marked.tar.gz -O marked.tgz \
@@ -58,6 +60,11 @@ RUN mkdir -p /z/dist/no-pk \
&& tar -xf zopfli.tgz
COPY busy-mp3.sh /z/
RUN /z/busy-mp3.sh \
&& mv -v /dev/shm/busy.mp3.gz /z/dist
# build fonttools (which needs zopfli)
RUN tar -xf zopfli.tgz \
&& cd zopfli* \

scripts/deps-docker/busy-mp3.sh Executable file

@@ -0,0 +1,61 @@
#!/bin/bash
set -e
cat >/dev/null <<EOF
a frame is 1152 samples
1sec @ 48000 = 41.66 frames
11 frames = 12672 samples = 0.264 sec
22 frames = 25344 samples = 0.528 sec
EOF
fast=1
fast=
echo
mkdir -p /dev/shm/$1
cd /dev/shm/$1
find -maxdepth 1 -type f -iname 'a.*.mp3*' -delete
min=99999999
for freq in 425; do # {400..500}
for vol in 0; do # {10..30}
for kbps in 32; do
for fdur in 1124; do # {800..1200}
for fdu2 in 1042; do # {800..1200}
for ftyp in h; do # q h t l p
for ofs1 in 9214; do # {0..32768}
for ofs2 in 0; do # {0..4096}
for ofs3 in 0; do # {0..4096}
for nores in --nores; do # '' --nores
f=a.b$kbps$nores-f$freq-v$vol-$ftyp$fdur-$fdu2-o$ofs1-$ofs2-$ofs3
sox -r48000 -Dn -r48000 -b16 -c2 -t raw s1.pcm synth 25344s sin $freq vol 0.$vol fade $ftyp ${fdur}s 25344s ${fdu2}s
sox -r48000 -Dn -r48000 -b16 -c2 -t raw s0.pcm synth 12672s sin $freq vol 0
tail -c +$ofs1 s0.pcm >s0a.pcm
tail -c +$ofs2 s0.pcm >s0b.pcm
tail -c +$ofs3 s0.pcm >s0c.pcm
cat s{0a,1,0,0b,1,0c}.pcm > a.pcm
lame --silent -r -s 48 --bitwidth 16 --signed a.pcm -m j --resample 48 -b $kbps -q 0 $nores $f.mp3
if [ $fast ]
then gzip -c9 <$f.mp3 >$f.mp3.gz
else pigz -c11 -I1 <$f.mp3 >$f.mp3.gz
fi
sz=$(wc -c <$f.mp3.gz)
printf '\033[A%d %s\033[K\n' $sz $f
[ $sz -le $((min+10)) ] && echo
[ $sz -le $min ] && echo && min=$sz
done;done;done;done;done;done;done;done;done;done
true
f=a.b32--nores-f425-v0-h1124-1042-o9214-0-0.mp3
[ $fast ] &&
pigz -c11 -I1 <$f >busy.mp3.gz ||
mv $f.gz busy.mp3.gz
sz=$(wc -c <busy.mp3.gz)
[ "$sz" -eq 106 ] &&
echo busy.mp3 built successfully ||
echo WARNING: unexpected size of busy.mp3
find -maxdepth 1 -type f -iname 'a.*.mp3*' -delete


@@ -16,8 +16,6 @@ help() { exec cat <<'EOF'
# `re` does a repack of an sfx which you already executed once
# (grabs files from the sfx-created tempdir), overrides `clean`
#
# `ox` builds a pyoxidizer exe instead of py
#
# `gz` creates a gzip-compressed python sfx instead of bzip2
#
# `lang` limits which languages/translations to include,
@@ -111,7 +109,6 @@ while [ ! -z "$1" ]; do
case $1 in
clean) clean=1 ; ;;
re) repack=1 ; ;;
ox) use_ox=1 ; ;;
gz) use_gz=1 ; ;;
gzz) shift;use_gzz=$1;use_gz=1; ;;
no-ftp) no_ftp=1 ; ;;
@@ -461,8 +458,8 @@ rm -f ftp/pyftpdlib/{__main__,prefork}.py
iawk '/^\}/{l=0} !l; /^var Ls =/{l=1;next} o; /^\t["}]/{o=0} /^\t"'"$langs"'"/{o=1;print}' $f
done
[ ! $repack ] && [ ! $use_ox ] && {
# uncomment; oxidized drops 45 KiB but becomes undebuggable
[ ! $repack ] && {
# uncomment
find | grep -E '\.py$' |
grep -vE '__version__' |
tr '\n' '\0' |
@@ -570,33 +567,6 @@ nf=$(ls -1 "$zdir"/arc.* 2>/dev/null | wc -l)
}
[ $use_ox ] && {
tgt=x86_64-pc-windows-msvc
tgt=i686-pc-windows-msvc # 2M smaller (770k after upx)
bdir=build/$tgt/release/install/copyparty
t="res web"
(printf "\n\n\nBUT WAIT! THERE'S MORE!!\n\n";
cat ../$bdir/COPYING.txt) >> copyparty/res/COPYING.txt ||
echo "copying.txt 404 pls rebuild"
mv ftp/* j2/* .
rm -rf ftp j2 py2 py37
(cd copyparty; tar -cvf z.tar $t; rm -rf $t)
cd ..
pyoxidizer build --release --target-triple $tgt
mv $bdir/copyparty.exe dist/
cp -pv "$(for d in '/c/Program Files (x86)/Microsoft Visual Studio/'*'/BuildTools/VC/Redist/MSVC'; do
find "$d" -name vcruntime140.dll; done | sort | grep -vE '/x64/|/onecore/' | head -n 1)" dist/
dist/copyparty.exe --version
cp -pv dist/copyparty{,.orig}.exe
[ $ultra ] && a="--best --lzma" || a=-1
/bin/time -f %es upx $a dist/copyparty.exe >/dev/null
ls -al dist/copyparty{,.orig}.exe
exit 0
}
echo gen tarlist
for d in copyparty partftpy magic j2 py2 py37 ftp; do find $d -type f || true; done | # strip_hints
sed -r 's/(.*)\.(.*)/\2 \1/' | LC_ALL=C sort |


@@ -118,4 +118,12 @@ base64 | head -c12 >> dist/copyparty.exe
dist/copyparty.exe --version
curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$esuf.exe
csum=$(sha512sum <dist/copyparty.exe | cut -c-56)
curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$esuf.exe >uplod.log
cat uplod.log
grep -q $csum uplod.log && echo upload OK || {
echo UPLOAD FAILED
exit 1
}


@@ -1,33 +1,33 @@
f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f84beadd139bfeef4982e40f21ca5d9065cb794eeb0e473e82 altgraph-0.17.4-py2.py3-none-any.whl
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
e0d2e6183437af321a36944f04a501e85181243e5fa2da3254254305dd8119161f62048bc56bff8849b49f546ff175b02b4c999401f1c404f6b88e6f46a9c96e Git-2.44.0-32-bit.exe
9d2c31701a4d3fef553928c00528a48f9e1854ab5333528b50e358a214eba90029d687f039bcda5760b6fdf9f2de3bcf3784ae21a6374cf2a97a845d33b636c6 packaging-24.0-py3-none-any.whl
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
126ca016c00256f4ff13c88707ead21b3b98f3c665ae57a5bcbb80c8be3004bff36d9c7f9a1cc9d20551019708f2b195154f302d80a1e5a2026d6d0fe9f3d5f4 pyinstaller_hooks_contrib-2024.3-py2.py3-none-any.whl
749a473646c6d4c7939989649733d4c7699fd1c359c27046bf5bc9c070d1a4b8b986bbc65f60d7da725baf16dbfdd75a4c2f5bb8335f2cb5685073f5fee5c2d1 pywin32_ctypes-0.2.2-py3-none-any.whl
6e0d854040baff861e1647d2bece7d090bc793b2bd9819c56105b94090df54881a6a9b43ebd82578cd7c76d47181571b671e60672afd9def389d03c9dae84fcf setuptools-68.2.2-py3-none-any.whl
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
# u2c (win7)
f3390290b896019b2fa169932390e4930d1c03c014e1f6db2405ca2eb1f51f5f5213f725885853805b742997b0edb369787e5c0069d217bc4e8b957f847f58b6 certifi-2023.11.17-py3-none-any.whl
904eb57b13bea80aea861de86987e618665d37fa9ea0856e0125a9ba767a53e5064de0b9c4735435a2ddf4f16f7f7d2c75a682e1de83d9f57922bdca8e29988c charset_normalizer-3.3.0-cp37-cp37m-win32.whl
ffdd45326f4e91c02714f7a944cbcc2fdd09299f709cfa8aec0892053eef0134fb80d9ba3790afd319538a86feb619037cbf533e2f5939cb56b35bb17f56c858 idna-3.4-py3-none-any.whl
7a3bd4849f95e1715fe2e99613df70a0fedd944a9bfde71a0fadb837fe62c3431c30da4f0b75c74de6f1a459f1fdf7cb62eaf404fdbe45e2d121e0b1021f1580 certifi-2024.2.2-py3-none-any.whl
9cc8acc5e269e6421bc32bb89261101da29d6ca337d39d60b9106de9ed7904e188716e4a48d78a2c4329026443fcab7acab013d2fe43778e30d6c4e4506a1b91 charset_normalizer-3.3.2-cp37-cp37m-win32.whl
0ec1ae5c928b4a0001a254c8598b746049406e1eed720bfafa94d4474078eff76bf6e032124e2d4df4619052836523af36162443c6d746487b387d2e3476e691 idna-3.6-py3-none-any.whl
b795abb26ba2f04f1afcfb196f21f638014b26c8186f8f488f1c2d91e8e0220962fbd259dbc9c3875222eb47fc95c73fc0606aaa6602b9ebc524809c9ba3501f requests-2.31.0-py3-none-any.whl
5a25cb9b79bb6107f9055dc3e9f62ebc6d4d9ca2c730d824985c93cd82406b723c200d6300c5064e42ee9fc7a2853d6ec6661394f3ed7bac03750e1f2a6840d1 urllib3-1.26.17-py2.py3-none-any.whl
61ed4500b6361632030f05229705c5c5a52cb47e31c0e6b55151c8f3beed631cd752ca6c3d6393d56a2acf6a453cfcf801e877116123c550922249c3a976e0f4 urllib3-1.26.18-py2.py3-none-any.whl
# win7
91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz
c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl
016a8cbd09384f1a9a44cb0e8274df75a8bcb2f3966bb5d708c62145289efaa5db98f75256c97e4f8046735ce2e529fbb076f284a46cdb716e89a75660200ad9 pip-23.2.1-py3-none-any.whl
d130bfa136bd171b9972b5c281c578545f2a84a909fdf18a6d2d71dd12fb3d512a7a1fa5cf7300433adece1d306eb2f22d7278f4c90e744e04dc67ba627a82c0 future-1.0.0-py3-none-any.whl
0b4d07434bf8d314f42893d90bce005545b44a509e7353a73cad26dc9360b44e2824218a1a74f8174d02eba87fba91baffa82c8901279a32ebc6b8386b1b4275 importlib_metadata-6.7.0-py3-none-any.whl
5d7462a584105bccaa9cf376f5a8c5827ead099c813c8af7392d478a4398f373d9e8cac7bbad2db51b335411ab966b21e119b1b1234c9a7ab70c6ddfc9306da6 pip-24.0-py3-none-any.whl
f298e34356b5590dde7477d7b3a88ad39c622a2bcf3fcd7c53870ce8384dd510f690af81b8f42e121a22d3968a767d2e07595036b2ed7049c8ef4d112bcf3a61 pyinstaller-5.13.2-py3-none-win32.whl
6bb73cc2db795c59c92f2115727f5c173cacc9465af7710db9ff2f2aec2d73130d0992d0f16dcb3fac222dc15c0916562d0813b2337401022020673a4461df3d python-3.7.9-amd64.exe
500747651c87f59f2436c5ab91207b5b657856e43d10083f3ce27efb196a2580fadd199a4209519b409920c562aaaa7dcbdfb83ed2072a43eaccae6e2d056f31 python-3.7.9.exe
03e50aecc85914567c114e38a1777e32628ee098756f37177bc23220eab33ac7d3ff591fd162db3b4d4e34d55cee93ef0dc67af68a69c38bb1435e0768dee57e typing_extensions-4.7.1-py3-none-any.whl
2e04acff170ca3bbceeeb18489c687126c951ec0bfd53cccfb389ba8d29a4576c1a9e8f2e5ea26c84dd21bfa2912f4e71fa72c1e2653b71e34afc0e65f1722d4 upx-4.2.2-win32.zip
68e1b618d988be56aaae4e2eb92bc0093627a00441c1074ebe680c41aa98a6161e52733ad0c59888c643a33fe56884e4f935178b2557fbbdd105e92e0d993df6 windows6.1-kb2533623-x64.msu
479a63e14586ab2f2228208116fc149ed8ee7b1e4ff360754f5bda4bf765c61af2e04b5ef123976623d04df4976b7886e0445647269da81436bd0a7b5671d361 windows6.1-kb2533623-x86.msu
ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a25067512b8f75538c67c987cf3944bfa0229e3cb677e2fb81e763e zipp-3.10.0-py3-none-any.whl
ac96786e5d35882e0c5b724794329c9125c2b86ae7847f17acfc49f0d294312c6afc1c3f248655de3f0ccb4ca426d7957d02ba702f4a15e9fcd7e2c314e72c19 zipp-3.15.0-py3-none-any.whl
# win10
e3e2e6bd511dec484dd0292f4c46c55c88a885eabf15413d53edea2dd4a4dbae1571735b9424f78c0cd7f1082476a8259f31fd3f63990f726175470f636df2b3 Jinja2-3.1.3-py3-none-any.whl
e21495f1d473d855103fb4a243095b498ec90eb68776b0f9b48e994990534f7286c0292448e129c507e5d70409f8a05cca58b98d59ce2a815993d0a873dfc480 MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl
8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
656015f5cc2c04aa0653ee5609c39a7e5f0b6a58c84fe26b20bd070c52d20b4effb810132f7fb771168483e9fd975cc3302837dd7a1a687ee058b0460c857cc4 packaging-23.2-py3-none-any.whl
424e20dc7263a31d524307bc39ed755a9dd82f538086fff68d98dd97e236c9b00777a8ac2e3853081b532b0e93cef44983e74d0ab274877440e8b7341b19358a pillow-10.2.0-cp311-cp311-win_amd64.whl
1dfe6f66bef5c9d62c9028a964196b902772ec9e19db215f3f41acb8d2d563586988d81b455fa6b895b434e9e1e9d57e4d271d1b1214483bdb3eadffcbba6a33 pillow-10.3.0-cp311-cp311-win_amd64.whl
8760eab271e79256ae3bfb4af8ccc59010cb5d2eccdd74b325d1a533ae25eb127d51c2ec28ff90d449afed32dd7d6af62934fe9caaf1ae1f4d4831e948e912da pyinstaller-6.5.0-py3-none-win_amd64.whl
e6bdbae1affd161e62fc87407c912462dfe875f535ba9f344d0c4ade13715c947cd3ae832eff60f1bad4161938311d06ac8bc9b52ef203f7b0d9de1409f052a5 python-3.11.8-amd64.exe
897a14d5ee5cbc6781a0f48beffc27807a4f789d58c4329d899233f615d168a5dcceddf7f8f2d5bb52212ddcf3eba4664590d9f1fdb25bb5201f44899e03b2f7 python-3.11.9-amd64.exe
729dc52f1a02bc6274d012ce33f534102975a828cba11f6029600ea40e2d23aefeb07bf4ae19f9621d0565dd03eb2635bbb97d45fb692c1f756315e8c86c5255 upx-4.2.2-win64.zip
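The lists above are in standard `sha512sum` manifest format (hex digest, whitespace, filename), so a downloaded set of dependencies can be checked against them. A minimal sketch of such a verifier; the function name and file paths are illustrative, not part of the build scripts:

```python
import hashlib
import os

def check_manifest(manifest_path):
    """Verify files against a sha512 manifest (one 'digest filename'
    per line); comment lines and not-yet-downloaded files are skipped."""
    ok = True
    with open(manifest_path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # blank line or comment such as "# win10"
            digest, name = line.split(None, 1)
            if not os.path.exists(name):
                continue  # file not downloaded yet
            with open(name, "rb") as fh:
                got = hashlib.sha512(fh.read()).hexdigest()
            if got != digest.lower():
                print("checksum mismatch:", name)
                ok = False
    return ok
```

The same check can be done with `sha512sum -c manifest --ignore-missing` on any system with GNU coreutils.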


@@ -8,7 +8,7 @@ run ./build.sh in git-bash to build + upload the exe
 to obtain the files referenced below, see ./deps.txt
 download + install git (32bit OK on 64):
-http://192.168.123.1:3923/ro/pyi/Git-2.39.1-32-bit.exe
+http://192.168.123.1:3923/ro/pyi/Git-2.44.0-32-bit.exe
 ===[ copy-paste into git-bash ]================================
 uname -s | grep NT-10 && w10=1 || {
@@ -16,6 +16,7 @@ uname -s | grep NT-10 && w10=1 || {
 }
 fns=(
 altgraph-0.17.4-py2.py3-none-any.whl
+packaging-24.0-py3-none-any.whl
 pefile-2023.2.7-py3-none-any.whl
 pyinstaller_hooks_contrib-2024.3-py2.py3-none-any.whl
 pywin32_ctypes-0.2.2-py3-none-any.whl
@@ -26,26 +27,25 @@ fns=(
 Jinja2-3.1.3-py3-none-any.whl
 MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl
 mutagen-1.47.0-py3-none-any.whl
-packaging-23.2-py3-none-any.whl
-pillow-10.2.0-cp311-cp311-win_amd64.whl
-python-3.11.8-amd64.exe
+pillow-10.3.0-cp311-cp311-win_amd64.whl
+python-3.11.9-amd64.exe
 upx-4.2.2-win64.zip
 )
 [ $w7 ] && fns+=(
 pyinstaller-5.13.2-py3-none-win32.whl
-certifi-2023.11.17-py3-none-any.whl
-chardet-5.1.0-py3-none-any.whl
-idna-3.4-py3-none-any.whl
-requests-2.28.2-py3-none-any.whl
-urllib3-1.26.14-py2.py3-none-any.whl
+certifi-2024.2.2-py3-none-any.whl
+charset_normalizer-3.3.2-cp37-cp37m-win32.whl
+idna-3.6-py3-none-any.whl
+requests-2.31.0-py3-none-any.whl
+urllib3-1.26.18-py2.py3-none-any.whl
 upx-4.2.2-win32.zip
 )
 [ $w7 ] && fns+=(
-future-0.18.2.tar.gz
-importlib_metadata-5.0.0-py3-none-any.whl
-pip-23.2.1-py3-none-any.whl
-typing_extensions-4.4.0-py3-none-any.whl
-zipp-3.10.0-py3-none-any.whl
+future-1.0.0-py3-none-any.whl
+importlib_metadata-6.7.0-py3-none-any.whl
+pip-24.0-py3-none-any.whl
+typing_extensions-4.7.1-py3-none-any.whl
+zipp-3.15.0-py3-none-any.whl
 )
 [ $w7x64 ] && fns+=(
 windows6.1-kb2533623-x64.msu
@@ -77,9 +77,10 @@ yes | unzip upx-*-win32.zip &&
 mv upx-*/upx.exe . &&
 python -m ensurepip &&
 { [ $w10 ] || python -m pip install --user -U pip-*.whl; } &&
-{ [ $w7 ] || python -m pip install --user -U {packaging,setuptools,mutagen,Pillow,Jinja2,MarkupSafe}-*.whl; } &&
+python -m pip install --user -U packaging-*.whl &&
+{ [ $w7 ] || python -m pip install --user -U {setuptools,mutagen,Pillow,Jinja2,MarkupSafe}-*.whl; } &&
 { [ $w10 ] || python -m pip install --user -U {requests,urllib3,charset_normalizer,certifi,idna}-*.whl; } &&
-{ [ $w10 ] || python -m pip install --user -U future-*.tar.gz importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
+{ [ $w10 ] || python -m pip install --user -U future-*.whl importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
 python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl &&
 sed -ri 's/--lzma/--best/' $appd/Python/Python$pyv/site-packages/pyinstaller/building/utils.py &&
 curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&


@@ -45,4 +45,12 @@ $APPDATA/python/python37/scripts/pyinstaller -y --clean --upx-dir=. up2k.spec
 ./dist/u2c.exe --version
-curl -fkT dist/u2c.exe -HPW:wark https://192.168.123.1:3923/
+csum=$(sha512sum <dist/u2c.exe | cut -c-56)
+curl -fkT dist/u2c.exe -HPW:wark https://192.168.123.1:3923/ >uplod.log
+cat uplod.log
+grep -q $csum uplod.log && echo upload OK || {
+    echo UPLOAD FAILED
+    exit 1
+}


@@ -81,6 +81,7 @@ copyparty/web/dd/5.png,
 copyparty/web/dd/__init__.py,
 copyparty/web/deps,
 copyparty/web/deps/__init__.py,
+copyparty/web/deps/busy.mp3,
 copyparty/web/deps/easymde.css,
 copyparty/web/deps/easymde.js,
 copyparty/web/deps/marked.js,


@@ -3,6 +3,7 @@
 from __future__ import print_function, unicode_literals
 import io
+import json
 import os
 import shutil
 import tarfile
@@ -62,7 +63,9 @@ class TestHttpCli(unittest.TestCase):
         self.assertEqual(self.curl("?tar", "x")[1][:17], "\nJ2EOT")
-        # search
+        ##
+        ## search
+        up2k = Up2k(self)
         u2idx = U2idx(self)
         allvols = list(self.asrv.vfs.all_vols.values())
@@ -87,15 +90,55 @@ class TestHttpCli(unittest.TestCase):
         xe = "a/da/f4 a/f3 f0 t/f1"
         self.assertEqual(x, xe)
+        ##
+        ## dirkeys
+        os.mkdir("v")
+        with open("v/f1.txt", "wb") as f:
+            f.write(b"a")
+        os.rename("a", "v/a")
+        os.rename(".b", "v/.b")
+        vcfg = [
+            ".::r.,u1:g,u2:c,dk",
+            "v/a:v/a:r.,u1:g,u2:c,dk",
+            "v/.b:v/.b:r.,u1:g,u2:c,dk",
+        ]
+        self.args = Cfg(v=vcfg, a=["u1:u1", "u2:u2"])
+        self.asrv = AuthSrv(self.args, self.log)
+        zj = json.loads(self.curl("?ls", "u1")[1])
+        url = "?k=" + zj["dk"]
+        # should descend into folders, but not other volumes:
+        self.assertEqual(self.tardir(url, "u2"), "f0 t/f1 v/f1.txt")
+        zj = json.loads(self.curl("v?ls", "u1")[1])
+        url = "v?k=" + zj["dk"]
+        self.assertEqual(self.tarsel(url, "u2", ["f1.txt", "a", ".b"]), "f1.txt")
 
     def tardir(self, url, uname):
-        h, b = self.curl("/" + url + "?tar", uname, True)
+        top = url.split("?")[0]
+        top = ("top" if not top else top.lstrip(".").split("/")[0]) + "/"
+        url += ("&" if "?" in url else "?") + "tar"
+        h, b = self.curl(url, uname, True)
         tar = tarfile.open(fileobj=io.BytesIO(b), mode="r|").getnames()
-        top = ("top" if not url else url.lstrip(".").split("/")[0]) + "/"
-        assert len(tar) == len([x for x in tar if x.startswith(top)])
+        if len(tar) != len([x for x in tar if x.startswith(top)]):
+            raise Exception("bad-prefix:", tar)
         return " ".join([x[len(top) :] for x in tar])
 
-    def curl(self, url, uname, binary=False):
-        conn = tu.VHttpConn(self.args, self.asrv, self.log, hdr(url, uname))
+    def tarsel(self, url, uname, sel):
+        url += ("&" if "?" in url else "?") + "tar"
+        zs = '--XD\r\nContent-Disposition: form-data; name="act"\r\n\r\nzip\r\n--XD\r\nContent-Disposition: form-data; name="files"\r\n\r\n'
+        zs += "\r\n".join(sel) + "\r\n--XD--\r\n"
+        zb = zs.encode("utf-8")
+        hdr = "POST /%s HTTP/1.1\r\nPW: %s\r\nConnection: close\r\nContent-Type: multipart/form-data; boundary=XD\r\nContent-Length: %d\r\n\r\n"
+        req = (hdr % (url, uname, len(zb))).encode("utf-8") + zb
+        h, b = self.curl("/" + url, uname, True, req)
+        tar = tarfile.open(fileobj=io.BytesIO(b), mode="r|").getnames()
+        return " ".join(tar)
+
+    def curl(self, url, uname, binary=False, req=b""):
+        req = req or hdr(url, uname)
+        conn = tu.VHttpConn(self.args, self.asrv, self.log, req)
         HttpCli(conn).run()
         if binary:
             h, b = conn.s._reply.split(b"\r\n\r\n", 1)
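The request that `tarsel` assembles by hand follows the generic `multipart/form-data` wire format: one part per field, each introduced by the boundary and a `Content-Disposition` header, terminated by the closing `--boundary--` marker. A standalone sketch of the same body construction (the helper name and field values here are illustrative):

```python
def multipart_body(boundary, fields):
    """Build a multipart/form-data body: one part per (name, value)
    pair, terminated by the closing --boundary-- marker."""
    parts = []
    for name, value in fields:
        parts.append(
            '--%s\r\nContent-Disposition: form-data; name="%s"\r\n\r\n%s\r\n'
            % (boundary, name, value)
        )
    parts.append("--%s--\r\n" % boundary)
    return "".join(parts).encode("utf-8")

# same shape as the body tarsel sends: an "act" field plus
# a newline-separated "files" selection
body = multipart_body("XD", [("act", "zip"), ("files", "f1.txt\r\na\r\n.b")])
```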


@@ -44,7 +44,7 @@ if MACOS:
 from copyparty.__init__ import E
 from copyparty.__main__ import init_E
 from copyparty.u2idx import U2idx
-from copyparty.util import FHC, Garda, Unrecv
+from copyparty.util import FHC, CachedDict, Garda, Unrecv
init_E(E)
@@ -110,7 +110,7 @@ class Cfg(Namespace):
     def __init__(self, a=None, v=None, c=None, **ka0):
         ka = {}
-        ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw q rand smb srch_dbg stats vague_403 vc ver xdev xlink xvol"
+        ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_pipe no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw q rand smb srch_dbg stats vague_403 vc ver xdev xlink xvol"
         ka.update(**{k: False for k in ex.split()})
         ex = "dotpart dotsrch no_dhash no_fastboot no_rescan no_sendfile no_voldump re_dhash plain_ip"
@@ -145,6 +145,7 @@ class Cfg(Namespace):
             c=c,
             E=E,
             dbd="wal",
+            dk_salt="b" * 16,
             fk_salt="a" * 16,
             idp_gsep=re.compile("[|:;+,]"),
             iobuf=256 * 1024,
@@ -154,6 +155,7 @@ class Cfg(Namespace):
             mte={"a": True},
             mth={},
             mtp=[],
+            mv_retry="0/0",
             rm_retry="0/0",
             s_rd_sz=256 * 1024,
             s_wr_sz=256 * 1024,
@@ -249,6 +251,7 @@ class VHttpConn(object):
         self.log_func = log
         self.log_src = "a"
         self.mutex = threading.Lock()
+        self.pipes = CachedDict(1)
         self.u2mutex = threading.Lock()
         self.nbyte = 0
         self.nid = None
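`CachedDict(1)` above is copyparty's own helper from `copyparty.util`; judging from its use for the new pipe feature, it acts as a dict whose entries expire after the given number of seconds. A rough illustrative stand-in, not the real implementation:

```python
import time

class TtlDict:
    """Illustrative stand-in for a dict with per-entry expiry."""

    def __init__(self, maxage):
        self.maxage = maxage  # seconds an entry stays valid
        self.d = {}

    def set(self, k, v):
        # store the value together with its insertion time
        self.d[k] = (time.time(), v)

    def get(self, k):
        hit = self.d.get(k)
        if not hit:
            return None
        ts, v = hit
        if time.time() - ts > self.maxage:
            del self.d[k]  # expired; drop it
            return None
        return v
```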