Compare commits


58 Commits

Author SHA1 Message Date
ed
ff012221ae v1.15.5 2024-10-05 18:03:04 +00:00
ed
c398553748 pkgres: fix multiprocessing 2024-10-05 17:32:08 +00:00
ed
3ccbcf6185 update pkgs to 1.15.4 2024-10-04 23:56:45 +00:00
ed
f0abc0ef59 v1.15.4 2024-10-04 23:19:28 +00:00
ed
a99fa3375d the impresources.files traversible is not threadsafe 2024-10-04 22:37:29 +00:00
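For context, a minimal sketch of what such a fix can look like -- serializing access to the shared Traversable behind a lock; the function name and exact strategy are illustrative, not taken from the commit:

    import threading
    from importlib import resources  # py3.9+; older pythons use the importlib_resources backport

    _res_lock = threading.Lock()

    def read_resource(pkg, name):
        # the Traversable returned by resources.files() shares one
        # underlying reader, which is unsafe to use from many threads
        # at once; taking a lock around every open/read avoids the race
        # (illustrative fix -- caching the bytes per-process also works)
        with _res_lock:
            with (resources.files(pkg) / name).open("rb") as f:
                return f.read()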
ed
22c7e09b3f small fixes;
* make-sfx: delete failed deps downloads
* tlcheck: detect untranslated strings
2024-10-04 20:56:16 +00:00
ed
0dfe1d5b35 toast countdown bar 2024-10-04 19:29:54 +00:00
ed
a99a3bc6d7 audio-player: fix compact-mode rendering glitch on narrow screens 2024-10-04 18:15:18 +00:00
ed
9804f25de3 add option for natural sorting; thx @oshiteku 2024-10-04 00:30:04 +00:00
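A minimal sketch of the natural-sort idea, assuming the standard digit-splitting approach (the option name and actual implementation in copyparty may differ):

    import re

    def natsort_key(s):
        # split runs of digits out so they compare numerically:
        # "file10.txt" then sorts after "file2.txt" instead of before it
        return [int(t) if t.isdigit() else t.lower()
                for t in re.split(r"(\d+)", s)]

    print(sorted(["file10.txt", "file2.txt"], key=natsort_key))
    # ['file2.txt', 'file10.txt']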
ed
ae98200660 og: support filekeys 2024-10-03 23:52:11 +00:00
ed
e45420646f share folders as qr-codes 2024-10-03 23:14:06 +00:00
ed
21be82ef8b fix #101 (show logues even if dotfiles are hidden) 2024-10-03 22:19:32 +02:00
ed
001afe00cb i18n: time plurals 2024-10-03 07:38:33 +00:00
ed
19a5985f29 allow uploading logues; closes #100 2024-10-02 23:16:59 +00:00
ed
2715ee6c61 fix confusing toast on F2 with nothing selected (#100) 2024-10-02 23:11:29 +00:00
ed
dc157fa28f webdav: support explicit <allprop/> (WinSCP) 2024-10-02 22:28:23 +00:00
ed
1ff14b4e05 optimizations, failsafes, formatting 2024-10-02 21:59:53 +00:00
ed
480ac254ab webdav: show toplevel volumes when root is unmapped
previously, only real folders could be listed by a webdav client;
a server which does not have any filesystem paths mapped to `/`
would cause clients to panic when trying to list the server root

now, assuming volumes `/foo` and `/bar/qux` exist, when accessing `/`
the user will see `/foo` but not `/bar` due to limitations in `walk`,
and `qux` will only appear when viewing `/bar`

a future rework of the recursion logic should further improve this
2024-10-02 21:12:58 +00:00
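The described behavior, reduced to a sketch (the volume set and function name are made up for illustration; this is not the actual walk code):

    def list_vols(vols, vpath):
        # show volumes that are direct children of vpath; mirrors the
        # limitation noted above: /bar/qux does not produce a synthetic
        # /bar entry at the root
        base = "/" if vpath == "/" else vpath.rstrip("/") + "/"
        return sorted(vp[len(base):] for vp in vols
                      if vp.startswith(base) and "/" not in vp[len(base):])

    vols = {"/foo", "/bar/qux"}
    print(list_vols(vols, "/"))     # ['foo']  -- no 'bar'
    print(list_vols(vols, "/bar"))  # ['qux']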
ed
4b95db81aa thx keth 2024-10-01 22:37:50 +00:00
ed
c81e898435 partyfuse: also support mounting nginx, iis
these additional parsers are not included in the sfx-embedded
copy of partyfuse.py; grab it from github when necessary
2024-10-01 22:37:07 +00:00
ed
f1646b96ca dist: strip some pointless code 2024-10-01 18:35:36 +00:00
ed
44f2b63e43 partyfuse: embed fuse.py into sfx 2024-10-01 18:27:42 +00:00
ed
847a2bdc85 partyfuse: bump datacache chunksize
previous approach:
* cache 64K on first read
* cache 1M on subsequent intersecting reads

new approach:
* cache 64K on first read
* cache 1M on the next intersecting read
* cache 8M on subsequent intersecting reads
* cache 4M on standalone reads at offsets >1M

improves performance by 50% on windows
and should help on high-latency connections
2024-10-01 17:15:35 +00:00
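The new policy, expressed as a sketch (the real bookkeeping in partyfuse is more involved):

    def cache_read_size(n_intersecting, standalone, ofs):
        # escalating cache sizes as described above
        K, M = 1024, 1024 * 1024
        if standalone and ofs > M:
            return 4 * M      # standalone read past 1M: cache 4M
        if n_intersecting == 0:
            return 64 * K     # first read: cache 64K
        if n_intersecting == 1:
            return 1 * M      # next intersecting read: cache 1M
        return 8 * M          # subsequent intersecting reads: cache 8M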
ed
03f0f99469 partyfuse: fix extremely slow dircache lookups
the cache was a list of files instead of a dict... dude

also adds a max-num dircache limit
in addition to the expiration time
2024-10-01 17:07:28 +00:00
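In other words, roughly this (a sketch; field names and defaults are illustrative):

    import time
    from collections import OrderedDict

    class DirCache(object):
        def __init__(self, max_n=64, max_age=60):
            self.d = OrderedDict()   # path -> (timestamp, listing); O(1) lookups
            self.max_n = max_n       # the new max-num limit
            self.max_age = max_age   # the existing expiration time

        def get(self, path):
            hit = self.d.get(path)
            if hit and time.time() - hit[0] < self.max_age:
                return hit[1]
            self.d.pop(path, None)   # expired or missing
            return None

        def put(self, path, listing):
            self.d[path] = (time.time(), listing)
            while len(self.d) > self.max_n:
                self.d.popitem(last=False)   # evict the oldest entry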
ed
3900e66158 partyfuse: modernize html parser (just in case) 2024-10-01 17:00:17 +00:00
ed
3dff6cda40 partyfuse: normalize naming in parsers 2024-10-01 16:55:00 +00:00
ed
73d05095b5 partyfuse: misc correctness;
* support more unix envs with granular fuse config
* generated URLs were OK but technically incorrect
2024-10-01 16:49:39 +00:00
ed
fcdc1728eb #102: make UI translation easier in docker 2024-10-01 00:04:07 +00:00
ed
8b942ea237 partyfuse: cleanup logging and exceptions
windows runs 50% faster with recentlog on infos too...
2024-09-29 23:19:33 +00:00
ed
88a1c5ca5d optimize non-e2d ram usage down to 10% or so
drop chunk-hashes in the up2k snap, plus other insignificant attribs
to reduce both the snapfile size and the ram usage by about 90%

reduces startup/shutdown time by a lot since there's less to serdes
(does not affect -e2d which was already optimal)

other changes:

* improve incoming-eta accuracy when the initial handshake
   was made a long time before the upload actually started

* move the list of incoming files in the controlpanel to the top
2024-09-27 21:11:10 +00:00
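Conceptually the snapfile shrink is just this (attribute names are guesses, not the actual up2k schema):

    def prune_snap_entry(entry):
        # keep only what is needed to resume bookkeeping; the per-chunk
        # hash list dominates the size of each entry, so dropping it
        # (plus minor attributes) cuts snapfile size and ram by ~90%
        keep = ("name", "size", "lmod", "wark")
        return {k: v for k, v in entry.items() if k in keep}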
ed
047176b297 py2 fix 2024-09-27 21:06:01 +00:00
ed
dc4d0d8e71 smb: upto 2x faster; but still very buggy:
* do not absreal paths unless necessary
* do not determine username if no users configured
* impacket 0.12 fixed the foldersize limit, but now
   you get extremely poor performance in large folders
   so the previous workaround is still default-enabled
2024-09-27 17:09:48 +00:00
ed
b9c5c7bbde u2c: early exclude + fix py2 perf
* don't traverse into excluded folders
* 3.5x faster on python2.7
2024-09-23 17:20:04 +00:00
ed
9daeed923f u2c: remove all deps to become 3x faster on small files
reduces performance on python 2.7, but that is ok

also fixes `unknown encoding: idna` due to cpython race
2024-09-22 18:07:36 +00:00
ed
66b260cea9 pkgres: fix tiny leak in template loader 2024-09-20 22:25:36 +00:00
ed
58cf01c2ad fix linter warnings 2024-09-20 22:24:39 +00:00
ed
d866841c19 pkgres:
* pyz: yeet the resource tar which is now pointless thanks to pkgres
* cache impresource stuff because pyz lookups are Extremely slow
* prefer tx_file when possible for slightly better performance
* use hardcoded list of expected resources instead of dynamic
   discovery at runtime; much simpler and probably safer
* fix some forgotten resources (copying.txt, insecure.pem)
* fix loading jinja templates on windows
2024-09-19 22:04:49 +00:00
Shiz
a462a644fb Python 3.7 package resources support (#98)
add support for reading webdeps and jinja-templates using either
importlib_resources or pkg_resources, which removes the need for
extracting these to a temporary folder on the filesystem

* util: add helper functions to abstract embedded resource access
* http*: serve embedded resources through resource abstraction
* main: check webdeps through resource abstraction
* httpconn: remove unused method `respath(name)`
* use __package__ to find package resources
* util: use importlib_resources backport if available
* pass E.pkg as module object for importlib_resources compatibility
* util: add pkg_resources compatibility to resource abstraction
2024-09-19 09:00:34 +00:00
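A condensed sketch of such an abstraction (the real helper in util.py takes the EnvParams object and differs in detail):

    import io

    def load_resource(pkg, name):
        # prefer importlib.resources (stdlib py3.9+, or the
        # importlib_resources backport on py3.7), fall back to
        # pkg_resources; either way, no temp-folder extraction
        # e.g. load_resource("copyparty", "web/browser.css")
        try:
            try:
                from importlib.resources import files
            except ImportError:
                from importlib_resources import files
            return (files(pkg) / name).open("rb")
        except ImportError:
            import pkg_resources
            return io.BytesIO(pkg_resources.resource_string(pkg, name))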
ed
678675a9a6 fix prometheus metrics; broke in 609c5921 2024-09-16 21:04:58 +00:00
ed
de9069ef1d update pkgs to 1.15.3 2024-09-16 01:20:16 +00:00
ed
c0c0a1a83a v1.15.3 2024-09-16 01:07:50 +00:00
ed
1d004b6dbd update pkgs to 1.15.2 2024-09-16 00:44:48 +00:00
ed
b90e1200d7 v1.15.2 2024-09-16 00:20:20 +00:00
ed
4493a0a804 misc mojibake filename support 2024-09-16 00:12:49 +00:00
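One common way to support such filenames is surrogateescape round-tripping, which decodes arbitrary bytes losslessly; whether this commit does exactly that is an assumption:

    raw = b"caf\xe9.txt"  # latin-1 bytes, not valid utf-8
    name = raw.decode("utf-8", "surrogateescape")   # lossless decode
    assert name.encode("utf-8", "surrogateescape") == raw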
ed
58835b2b42 ux bugfixes:
* show media tags in shares
* html hydrator assumed a folder named `foo.txt` was a doc
* due to sessions, use `pwd` as password placeholder on services
2024-09-15 23:37:24 +00:00
ed
427597b603 show total directory size in listings
sizes are computed during `-e2ds` indexing, and new uploads
are counted, but a rescan is necessary after a move or delete
2024-09-15 23:01:18 +00:00
ed
7d64879ba8 more optimizations,
* 5% less cpu load from clients fetching thumbnails
* and slight improvement to up2k stuff
2024-09-15 17:46:43 +00:00
ed
bb715704b7 ren_open was too fancy 2024-09-15 14:39:35 +00:00
ed
d67e9cc507 sqlite and misc optimizations:
* exponentially slow upload handshakes caused by lack of rd+fn
   sqlite index; became apparent after a volume hit 200k files
* listing big folders 5% faster due to `_quotep3b`
* optimize `unquote`, 20% faster but only used rarely
* reindex on startup 150x faster in some rare cases
   (same filename in MANY folders)

the database is now around 10% larger (likely worst-case)
2024-09-15 13:18:43 +00:00
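The fix for the first point boils down to one composite index (table and column names inferred from the commit message; the exact statement and schema are assumptions), which is also what makes the database ~10% larger:

    import sqlite3

    db = sqlite3.connect(":memory:")
    # minimal stand-in for the up2k table; the real schema has more columns
    db.execute("CREATE TABLE IF NOT EXISTS up (w TEXT, rd TEXT, fn TEXT)")
    # a composite index over (directory, filename) turns the handshake's
    # exact-path lookup into a b-tree probe instead of a full table scan
    db.execute("CREATE INDEX IF NOT EXISTS up_rd_fn ON up (rd, fn)")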
ed
2927bbb2d6 strip dev-only asserts at build stage 2024-09-14 22:17:35 +00:00
ed
0527b59180 cosmetic: only print hostname warning once 2024-09-14 20:37:56 +00:00
ed
a5ce1032d3 tlnote + nginx unix-socket example 2024-09-12 21:42:33 +00:00
ed
1c2acdc985 add flameshot client example 2024-09-11 20:56:38 +00:00
ed
4e75534ef8 optimize BrokerThr, 7x faster:
reduce the overhead of function-calls from the client thread
to the svchub singletons (up2k, thumbs, metrics) down to 14%

and optimize up2k chunk-receiver to spend 5x less time bookkeeping
which restores up2k performance to before introducing incoming-ETA
2024-09-11 20:37:10 +00:00
ultwcz
7a573cafd1 fix: translation: Check the newly added Chinese translation (#97) 2024-09-11 19:03:53 +00:00
ed
844194ee29 incoming-ETA: improve accuracy 2024-09-11 06:56:12 +00:00
ed
609c5921d4 list incoming files + ETA in controlpanel 2024-09-10 21:24:05 +00:00
ed
c79eaa089a update pkgs to 1.15.1 2024-09-09 23:55:37 +00:00
74 changed files with 3298 additions and 1111 deletions

.gitignore

@@ -30,6 +30,7 @@ copyparty/res/COPYING.txt
copyparty/web/deps/
srv/
scripts/docker/i/
scripts/deps-docker/uncomment.py
contrib/package/arch/pkg/
contrib/package/arch/src/


@@ -1,4 +1,4 @@
<img src="docs/logo.svg" width="250" align="right"/>
<img src="https://github.com/9001/copyparty/raw/hovudstraum/docs/logo.svg" width="250" align="right"/>
### 💾🎉 copyparty
@@ -43,6 +43,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [unpost](#unpost) - undo/delete accidental uploads
* [self-destruct](#self-destruct) - uploads can be given a lifetime
* [race the beam](#race-the-beam) - download files while they're still uploading ([demo video](http://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm))
* [incoming files](#incoming-files) - the control-panel shows the ETA for all incoming files
* [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
* [shares](#shares) - share a file or folder by creating a temporary link
* [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
@@ -240,7 +241,7 @@ also see [comparison to similar software](./docs/versus.md)
* ☑ ...of videos using FFmpeg
* ☑ ...of audio (spectrograms) using FFmpeg
* ☑ cache eviction (max-age; maybe max-size eventually)
* ☑ multilingual UI (english, norwegian, [add your own](./docs/rice/#translations)))
* ☑ multilingual UI (english, norwegian, chinese, [add your own](./docs/rice/#translations)))
* ☑ SPA (browse while uploading)
* server indexing
* ☑ [locate files by contents](#file-search)
@@ -731,6 +732,13 @@ download files while they're still uploading ([demo video](http://a.ocv.me/pub/g
requires the file to be uploaded using up2k (which is the default drag-and-drop uploader), alternatively the command-line program
### incoming files
the control-panel shows the ETA for all incoming files, but only for files being uploaded into volumes where you have read-access
![copyparty-cpanel-upload-eta-or8](https://github.com/user-attachments/assets/fd275ffa-698c-4fca-a307-4d2181269a6a)
## file manager
cut/paste, rename, and delete files/folders (if you have permission)
@@ -769,6 +777,7 @@ semi-intentional limitations:
* cleanup of expired shares only works when global option `e2d` is set, and/or at least one volume on the server has volflag `e2d`
* only folders from the same volume are shared; if you are sharing a folder which contains other volumes, then the contents of those volumes will not be available
* related to [IdP volumes being forgotten on shutdown](https://github.com/9001/copyparty/blob/hovudstraum/docs/idp.md#idp-volumes-are-forgotten-on-shutdown), any shares pointing into a user's IdP volume will be unavailable until that user makes their first request after a restart
* no option to "delete after first access" because tricky
* when linking something to discord (for example) it'll get accessed by their scraper and that would count as a hit
* browsers wouldn't be able to resume a broken download unless the requester's IP gets allowlisted for X minutes (ref. tricky)
@@ -1107,12 +1116,12 @@ some **BIG WARNINGS** specific to SMB/CIFS, in decreasing importance:
* [shadowing](#shadowing) probably works as expected but no guarantees
and some minor issues,
* clients only see the first ~400 files in big folders; [impacket#1433](https://github.com/SecureAuthCorp/impacket/issues/1433)
* clients only see the first ~400 files in big folders;
* this was originally due to [impacket#1433](https://github.com/SecureAuthCorp/impacket/issues/1433) which was fixed in impacket-0.12, so you can disable the workaround with `--smb-nwa-1` but then you get unacceptably poor performance instead
* hot-reload of server config (`/?reload=cfg`) does not include the `[global]` section (commandline args)
* listens on the first IPv4 `-i` interface only (default = :: = 0.0.0.0 = all)
* login doesn't work on winxp, but anonymous access is ok -- remove all accounts from copyparty config for that to work
* win10 onwards does not allow connecting anonymously / without accounts
* on windows, creating a new file through rightclick --> new --> textfile throws an error due to impacket limitations -- hit OK and F5 to get your file
* python3 only
* slow (the builtin webdav support in windows is 5x faster, and rclone-webdav is 30x faster)
@@ -1151,8 +1160,6 @@ note that this disables hotlinking because the opengraph spec demands it; to sne
you can also hotlink files regardless by appending `?raw` to the url
NOTE: because discord (and maybe others) strip query args such as `?raw` in opengraph tags, any links which require a filekey or dirkey will not work
if you want to entirely replace the copyparty response with your own jinja2 template, give the template filepath to `--og-tpl` or volflag `og_tpl` (all members of `HttpCli` are available through the `this` object)
@@ -1549,6 +1556,8 @@ you can either:
* or do location-based proxying, using `--rp-loc=/stuff` to tell copyparty where it is mounted -- has a slight performance cost and higher chance of bugs
* if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below
when running behind a reverse-proxy (this includes services like cloudflare), it is important to configure real-ip correctly, as many features rely on knowing the client's IP. Look out for red and yellow log messages which explain how to do this. But basically, set `--xff-hdr` to the name of the http header to read the IP from (usually `x-forwarded-for`, but cloudflare uses `cf-connecting-ip`), and then `--xff-src` to the IP of the reverse-proxy so copyparty will trust the xff-hdr. Note that `--rp-loc` in particular will not work at all unless you do this
some reverse proxies (such as [Caddy](https://caddyserver.com/)) can automatically obtain a valid https/tls certificate for you, and some support HTTP/2 and QUIC which *could* be a nice speed boost, depending on a lot of factors
* **warning:** nginx-QUIC (HTTP/3) is still experimental and can make uploads much slower, so HTTP/1.1 is recommended for now
* depending on server/client, HTTP/1.1 can also be 5x faster than HTTP/2
@@ -1875,10 +1884,12 @@ interact with copyparty using non-browser clients
* FUSE: mount a copyparty server as a local filesystem
* cross-platform python client available in [./bin/](bin/)
* able to mount nginx and iis directory listings too, not just copyparty
* can be downloaded from copyparty: controlpanel -> connect -> [partyfuse.py](http://127.0.0.1:3923/.cpr/a/partyfuse.py)
* [rclone](https://rclone.org/) as client can give ~5x performance, see [./docs/rclone.md](docs/rclone.md)
* sharex (screenshot utility): see [./contrib/sharex.sxcu](contrib/#sharexsxcu)
* and for screenshots on linux, see [./contrib/flameshot.sh](./contrib/flameshot.sh)
* contextlet (web browser integration); see [contrib contextlet](contrib/#send-to-cppcontextletjson)
@@ -1913,7 +1924,7 @@ alternatively, some alternatives roughly sorted by speed (unreproducible benchma
* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE (rclone v1.63 or later)
* [rclone-http](./docs/rclone.md) (26s), read-only
* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
* [partyfuse.py](./bin/#partyfusepy) (26s), read-only
* [rclone-ftp](./docs/rclone.md) (47s), read/WRITE
* davfs2 (103s), read/WRITE
* [win10-webdav](#webdav-server) (138s), read/WRITE
@@ -1957,6 +1968,7 @@ below are some tweaks roughly ordered by usefulness:
* and also makes thumbnails load faster, regardless of e2d/e2t
* `--dedup` enables deduplication and thus avoids writing to the HDD if someone uploads a dupe
* `--safe-dedup 1` makes deduplication much faster during upload by skipping verification of file contents; safe if there is no other software editing/moving the files in the volumes
* `--no-dirsz` shows the size of folder inodes instead of the total size of the contents, giving about 30% faster folder listings
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* if your volumes are on a network-disk such as NFS / SMB / s3, specifying larger values for `--iobuf` and/or `--s-rd-sz` and/or `--s-wr-sz` may help; try setting all of them to `524288` or `1048576` or `4194304`
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
@@ -2074,6 +2086,8 @@ volflag `dky` disables the actual key-check, meaning anyone can see the contents
volflag `dks` lets people enter subfolders as well, and also enables download-as-zip/tar
if you enable dirkeys, it is probably a good idea to enable filekeys too, otherwise it will be impossible to hotlink files from a folder which was accessed using a dirkey
dirkeys are generated based on another salt (`--dk-salt`) + filesystem-path and have a few limitations:
* the key does not change if the contents of the folder is modified
* if you need a new dirkey, either change the salt or rename the folder
@@ -2156,7 +2170,7 @@ enable [thumbnails](#thumbnails) of...
* **JPEG XL pictures:** `pyvips` or `ffmpeg`
enable [smb](#smb-server) support (**not** recommended):
* `impacket==0.11.0`
* `impacket==0.12.0`
`pyvips` gives higher quality thumbnails than `Pillow` and is 320% faster, using 270% more ram: `sudo apt install libvips42 && python3 -m pip install --user -U pyvips`
@@ -2230,7 +2244,7 @@ then again, if you are already into downloading shady binaries from the internet
## zipapp
another emergency alternative, [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) has less features, requires python 3.7 or newer, worse compression, and more importantly is unable to benefit from more recent versions of jinja2 and such (which makes it less secure)... lots of drawbacks with this one really -- but it *may* just work if the regular sfx fails to start because the computer is messed up in certain funky ways, so it's worth a shot if all else fails
another emergency alternative, [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) has less features, is slow, requires python 3.7 or newer, worse compression, and more importantly is unable to benefit from more recent versions of jinja2 and such (which makes it less secure)... lots of drawbacks with this one really -- but it does not unpack any temporary files to disk, so it *may* just work if the regular sfx fails to start because the computer is messed up in certain funky ways, so it's worth a shot if all else fails
run it by doubleclicking it, or try typing `python copyparty.pyz` in your terminal/console/commandline/telex if that fails


@@ -15,22 +15,18 @@ produces a chronological list of all uploads by collecting info from up2k databa
# [`partyfuse.py`](partyfuse.py)
* mount a copyparty server as a local filesystem (read-only)
* **supports Windows!** -- expect `194 MiB/s` sequential read
* **supports Linux** -- expect `117 MiB/s` sequential read
* **supports Linux** -- expect `600 MiB/s` sequential read
* **supports macos** -- expect `85 MiB/s` sequential read
filecache is default-on for windows and macos;
* macos readsize is 64kB, so speed ~32 MiB/s without the cache
* windows readsize varies by software; explorer=1M, pv=32k
note that copyparty should run with `-ed` to enable dotfiles (hidden otherwise)
also consider using [../docs/rclone.md](../docs/rclone.md) instead for 5x performance
and consider using [../docs/rclone.md](../docs/rclone.md) instead; usually a bit faster, especially on windows
## to run this on windows:
* install [winfsp](https://github.com/billziss-gh/winfsp/releases/latest) and [python 3](https://www.python.org/downloads/)
* [x] add python 3.x to PATH (it asks during install)
* `python -m pip install --user fusepy`
* `python -m pip install --user fusepy` (or grab a copy of `fuse.py` from the `connect` page on your copyparty, and keep it in the same folder)
* `python ./partyfuse.py n: http://192.168.1.69:3923/`
10% faster in [msys2](https://www.msys2.org/), 700% faster if debug prints are enabled:

File diff suppressed because it is too large


@@ -1,34 +1,35 @@
#!/usr/bin/env python3
from __future__ import print_function, unicode_literals
S_VERSION = "1.24"
S_BUILD_DT = "2024-09-05"
S_VERSION = "2.1"
S_BUILD_DT = "2024-09-23"
"""
u2c.py: upload to copyparty
2021, ed <irc.rizon.net>, MIT-Licensed
https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py
- dependencies: requests
- dependencies: no
- supports python 2.6, 2.7, and 3.3 through 3.12
- if something breaks just try again and it'll autoresume
"""
import re
import os
import sys
import stat
import math
import time
import json
import atexit
import base64
import binascii
import datetime
import hashlib
import json
import math
import os
import platform
import re
import signal
import socket
import base64
import hashlib
import platform
import stat
import sys
import threading
import datetime
import time
EXE = bool(getattr(sys, "frozen", False))
@@ -39,32 +40,12 @@ except:
print(m)
raise
try:
import requests
req_ses = requests.Session()
except ImportError as ex:
if "-" in sys.argv or "-h" in sys.argv:
m = ""
elif EXE:
raise
elif sys.version_info > (2, 7):
m = "\nERROR: need 'requests'{0}; please run this command:\n {1} -m pip install --user requests\n"
else:
m = "requests/2.18.4 urllib3/1.23 chardet/3.0.4 certifi/2020.4.5.1 idna/2.7"
m = [" https://pypi.org/project/" + x + "/#files" for x in m.split()]
m = "\n ERROR: need these{0}:\n" + "\n".join(m) + "\n"
m += "\n for f in *.whl; do unzip $f; done; rm -r *.dist-info\n"
if m:
t = " when not running with '-h' or url '-'"
print(m.format(t, sys.executable), "\nspecifically,", ex)
sys.exit(1)
# from copyparty/__init__.py
PY2 = sys.version_info < (3,)
PY27 = sys.version_info > (2, 7) and PY2
PY37 = sys.version_info > (3, 7)
if PY2:
import httplib as http_client
from Queue import Queue
from urllib import quote, unquote
from urlparse import urlsplit, urlunsplit
@@ -72,11 +53,13 @@ if PY2:
sys.dont_write_bytecode = True
bytes = str
else:
from queue import Queue
from urllib.parse import unquote_to_bytes as unquote
from urllib.parse import quote_from_bytes as quote
from urllib.parse import unquote_to_bytes as unquote
from urllib.parse import urlsplit, urlunsplit
import http.client as http_client
from queue import Queue
unicode = str
VT100 = platform.system() != "Windows"
@@ -100,6 +83,22 @@ except:
UTC = _UTC()
try:
_b64etl = bytes.maketrans(b"+/", b"-_")
def ub64enc(bs):
x = binascii.b2a_base64(bs, newline=False)
return x.translate(_b64etl)
ub64enc(b"a")
except:
ub64enc = base64.urlsafe_b64encode
class BadAuth(Exception):
pass
class Daemon(threading.Thread):
def __init__(self, target, name=None, a=None):
threading.Thread.__init__(self, name=name)
@@ -117,6 +116,108 @@ class Daemon(threading.Thread):
self.fun(*self.a)
class HSQueue(Queue):
def _init(self, maxsize):
from collections import deque
self.q = deque()
def _qsize(self):
return len(self.q)
def _put(self, item):
if item and item.nhs:
self.q.appendleft(item)
else:
self.q.append(item)
def _get(self):
return self.q.popleft()
class HCli(object):
def __init__(self, ar):
self.ar = ar
url = urlsplit(ar.url)
tls = url.scheme.lower() == "https"
try:
addr, port = url.netloc.split(":")
except:
addr = url.netloc
port = 443 if tls else 80
self.addr = addr
self.port = int(port)
self.tls = tls
self.verify = ar.te or not ar.td
self.conns = []
if tls:
import ssl
if not self.verify:
self.ctx = ssl._create_unverified_context()
elif self.verify is True:
self.ctx = None
else:
self.ctx = ssl.SSLContext(ssl.PROTOCOL_TLS)
self.ctx.load_verify_locations(self.verify)
self.base_hdrs = {
"Accept": "*/*",
"Connection": "keep-alive",
"Host": url.netloc,
"Origin": self.ar.burl,
"User-Agent": "u2c/%s" % (S_VERSION,),
}
def _connect(self):
args = {}
if PY37:
args["blocksize"] = 1048576
if not self.tls:
C = http_client.HTTPConnection
else:
C = http_client.HTTPSConnection
if self.ctx:
args = {"context": self.ctx}
return C(self.addr, self.port, timeout=999, **args)
def req(self, meth, vpath, hdrs, body=None, ctype=None):
hdrs.update(self.base_hdrs)
if self.ar.a:
hdrs["PW"] = self.ar.a
if ctype:
hdrs["Content-Type"] = ctype
if meth == "POST" and CLEN not in hdrs:
hdrs[CLEN] = (
0 if not body else body.len if hasattr(body, "len") else len(body)
)
c = self.conns.pop() if self.conns else self._connect()
try:
c.request(meth, vpath, body, hdrs)
if PY27:
rsp = c.getresponse(buffering=True)
else:
rsp = c.getresponse()
data = rsp.read()
self.conns.append(c)
return rsp.status, data.decode("utf-8")
except:
c.close()
raise
MJ = "application/json"
MO = "application/octet-stream"
CLEN = "Content-Length"
web = None # type: HCli
class File(object):
"""an up2k upload task; represents a single file"""
@@ -149,9 +250,6 @@ class File(object):
self.up_c = 0 # type: int
self.cd = 0 # type: int
# t = "size({}) lmod({}) top({}) rel({}) abs({}) name({})\n"
# eprint(t.format(self.size, self.lmod, self.top, self.rel, self.abs, self.name))
class FileSlice(object):
"""file-like object providing a fixed window into a file"""
@@ -284,8 +382,7 @@ class MTHash(object):
chunk_rem -= len(buf)
ofs += len(buf)
digest = hashobj.digest()[:33]
digest = base64.urlsafe_b64encode(digest).decode("utf-8")
digest = ub64enc(hashobj.digest()[:33]).decode("utf-8")
return nch, digest, ofs0, chunk_sz
@@ -329,7 +426,9 @@ def termsize():
def ioctl_GWINSZ(fd):
try:
import fcntl, termios, struct
import fcntl
import struct
import termios
r = struct.unpack(b"hh", fcntl.ioctl(fd, termios.TIOCGWINSZ, b"AAAA"))
return r[::-1]
@@ -387,8 +486,8 @@ class CTermsize(object):
eprint("\033[s\033[r\033[u")
else:
self.g = 1 + self.h - margin
t = "{0}\033[{1}A".format("\n" * margin, margin)
eprint("{0}\033[s\033[1;{1}r\033[u".format(t, self.g - 1))
t = "%s\033[%dA" % ("\n" * margin, margin)
eprint("%s\033[s\033[1;%dr\033[u" % (t, self.g - 1))
ss = CTermsize()
@@ -405,14 +504,14 @@ def undns(url):
except KeyboardInterrupt:
raise
except:
t = "\n\033[31mfailed to resolve upload destination host;\033[0m\ngai={0}\n"
eprint(t.format(repr(gai)))
t = "\n\033[31mfailed to resolve upload destination host;\033[0m\ngai=%r\n"
eprint(t % (gai,))
raise
if usp.port:
hn = "{0}:{1}".format(hn, usp.port)
hn = "%s:%s" % (hn, usp.port)
if usp.username or usp.password:
hn = "{0}:{1}@{2}".format(usp.username, usp.password, hn)
hn = "%s:%s@%s" % (usp.username, usp.password, hn)
usp = usp._replace(netloc=hn)
url = urlunsplit(usp)
@@ -447,7 +546,7 @@ else:
statdir = _lsd
def walkdir(err, top, seen):
def walkdir(err, top, excl, seen):
"""recursive statdir"""
atop = os.path.abspath(os.path.realpath(top))
if atop in seen:
@@ -456,10 +555,12 @@ def walkdir(err, top, seen):
seen = seen[:] + [atop]
for ap, inf in sorted(statdir(err, top)):
if excl.match(ap):
continue
yield ap, inf
if stat.S_ISDIR(inf.st_mode):
try:
for x in walkdir(err, ap, seen):
for x in walkdir(err, ap, excl, seen):
yield x
except Exception as ex:
err.append((ap, str(ex)))
@@ -499,9 +600,7 @@ def walkdirs(err, tops, excl):
yield stop, dn, os.stat(stop)
if isdir:
for ap, inf in walkdir(err, top, []):
if ptn.match(ap):
continue
for ap, inf in walkdir(err, top, ptn, []):
yield stop, ap[len(stop) :].lstrip(sep), inf
else:
d, n = top.rsplit(sep, 1)
@@ -577,8 +676,7 @@ def get_hashlist(file, pcb, mth):
hashobj.update(buf)
chunk_rem -= len(buf)
digest = hashobj.digest()[:33]
digest = base64.urlsafe_b64encode(digest).decode("utf-8")
digest = ub64enc(hashobj.digest()[:33]).decode("utf-8")
ret.append([digest, file_ofs, chunk_sz])
file_ofs += chunk_sz
@@ -603,9 +701,6 @@ def handshake(ar, file, search):
otherwise, a list of chunks to upload
"""
url = ar.url
pw = ar.a
req = {
"hash": [x[0] for x in file.cids],
"name": file.name,
@@ -620,28 +715,26 @@ def handshake(ar, file, search):
if ar.ow:
req["replace"] = True
headers = {"Content-Type": "text/plain"} # <=1.5.1 compat
if pw:
headers["Cookie"] = "=".join(["cppwd", pw])
file.recheck = False
if file.url:
url = file.url
elif b"/" in file.rel:
url += quotep(file.rel.rsplit(b"/", 1)[0]).decode("utf-8", "replace")
else:
if b"/" in file.rel:
url = quotep(file.rel.rsplit(b"/", 1)[0]).decode("utf-8", "replace")
else:
url = ""
url = ar.vtop + url
while True:
sc = 600
txt = ""
try:
zs = json.dumps(req, separators=(",\n", ": "))
r = req_ses.post(url, headers=headers, data=zs)
sc = r.status_code
txt = r.text
sc, txt = web.req("POST", url, {}, zs.encode("utf-8"), MJ)
if sc < 400:
break
raise Exception("http {0}: {1}".format(sc, txt))
raise Exception("http %d: %s" % (sc, txt))
except Exception as ex:
em = str(ex).split("SSLError(")[-1].split("\nURL: ")[0].strip()
@@ -655,35 +748,30 @@ def handshake(ar, file, search):
return [], False
elif sc == 409 or "<pre>upload rejected, file already exists" in txt:
return [], False
elif "<pre>you don't have " in txt:
raise
elif sc == 403:
print("\nERROR: login required, or wrong password:\n%s" % (txt,))
raise BadAuth()
eprint("handshake failed, retrying: {0}\n {1}\n\n".format(file.name, em))
eprint("handshake failed, retrying: %s\n %s\n\n" % (file.name, em))
time.sleep(ar.cd)
try:
r = r.json()
r = json.loads(txt)
except:
raise Exception(r.text)
raise Exception(txt)
if search:
return r["hits"], False
try:
pre, url = url.split("://")
pre += "://"
except:
pre = ""
file.url = pre + url.split("/")[0] + r["purl"]
file.url = r["purl"]
file.name = r["name"]
file.wark = r["wark"]
return r["hash"], r["sprs"]
def upload(fsl, pw, stats):
# type: (FileSlice, str, str) -> None
def upload(fsl, stats):
# type: (FileSlice, str) -> None
"""upload a range of file data, defined by one or more `cid` (chunk-hash)"""
ctxt = fsl.cids[0]
@@ -696,20 +784,15 @@ def upload(fsl, pw, stats):
headers = {
"X-Up2k-Hash": ctxt,
"X-Up2k-Wark": fsl.file.wark,
"Content-Type": "application/octet-stream",
}
if stats:
headers["X-Up2k-Stat"] = stats
if pw:
headers["Cookie"] = "=".join(["cppwd", pw])
try:
r = req_ses.post(fsl.file.url, headers=headers, data=fsl)
sc, txt = web.req("POST", fsl.file.url, headers, fsl, MO)
if r.status_code == 400:
txt = r.text
if sc == 400:
if (
"already being written" in txt
or "already got that" in txt
@@ -717,10 +800,8 @@ def upload(fsl, pw, stats):
):
fsl.file.nojoin = 1
if not r:
raise Exception(repr(r))
_ = r.content
if sc >= 400:
raise Exception("http %s: %s" % (sc, txt))
finally:
fsl.f.close()
@@ -733,7 +814,7 @@ class Ctl(object):
def _scan(self):
ar = self.ar
eprint("\nscanning {0} locations\n".format(len(ar.files)))
eprint("\nscanning %d locations\n" % (len(ar.files),))
nfiles = 0
nbytes = 0
err = []
@@ -745,14 +826,14 @@ class Ctl(object):
nbytes += inf.st_size
if err:
eprint("\n# failed to access {0} paths:\n".format(len(err)))
eprint("\n# failed to access %d paths:\n" % (len(err),))
for ap, msg in err:
if ar.v:
eprint("{0}\n `-{1}\n\n".format(ap.decode("utf-8", "replace"), msg))
eprint("%s\n `-%s\n\n" % (ap.decode("utf-8", "replace"), msg))
else:
eprint(ap.decode("utf-8", "replace") + "\n")
eprint("^ failed to access those {0} paths ^\n\n".format(len(err)))
eprint("^ failed to access those %d paths ^\n\n" % (len(err),))
if not ar.v:
eprint("hint: set -v for detailed error messages\n")
@@ -761,11 +842,12 @@ class Ctl(object):
eprint("hint: aborting because --ok is not set\n")
return
eprint("found {0} files, {1}\n\n".format(nfiles, humansize(nbytes)))
eprint("found %d files, %s\n\n" % (nfiles, humansize(nbytes)))
return nfiles, nbytes
def __init__(self, ar, stats=None):
self.ok = False
self.panik = 0
self.errs = 0
self.ar = ar
self.stats = stats or self._scan()
@@ -773,13 +855,6 @@ class Ctl(object):
return
self.nfiles, self.nbytes = self.stats
if ar.td:
requests.packages.urllib3.disable_warnings()
req_ses.verify = False
if ar.te:
req_ses.verify = ar.te
self.filegen = walkdirs([], ar.files, ar.x)
self.recheck = [] # type: list[File]
@@ -808,7 +883,7 @@ class Ctl(object):
self.exit_cond = threading.Condition()
self.uploader_alive = ar.j
self.handshaker_alive = ar.j
self.q_handshake = Queue() # type: Queue[File]
self.q_handshake = HSQueue() # type: Queue[File]
self.q_upload = Queue() # type: Queue[FileSlice]
self.st_hash = [None, "(idle, starting...)"] # type: tuple[File, int]
@@ -823,24 +898,29 @@ class Ctl(object):
def _safe(self):
"""minimal basic slow boring fallback codepath"""
search = self.ar.s
for nf, (top, rel, inf) in enumerate(self.filegen):
nf = 0
for top, rel, inf in self.filegen:
if stat.S_ISDIR(inf.st_mode) or not rel:
continue
nf += 1
file = File(top, rel, inf.st_size, inf.st_mtime)
upath = file.abs.decode("utf-8", "replace")
print("{0} {1}\n hash...".format(self.nfiles - nf, upath))
print("%d %s\n hash..." % (self.nfiles - nf, upath))
get_hashlist(file, None, None)
burl = self.ar.url[:12] + self.ar.url[8:].split("/")[0] + "/"
while True:
print(" hs...")
hs, _ = handshake(self.ar, file, search)
try:
hs, _ = handshake(self.ar, file, search)
except BadAuth:
sys.exit(1)
if search:
if hs:
for hit in hs:
print(" found: {0}{1}".format(burl, hit["rp"]))
print(" found: %s/%s" % (self.ar.burl, hit["rp"]))
else:
print(" NOT found")
break
@@ -849,13 +929,13 @@ class Ctl(object):
if not hs:
break
print("{0} {1}".format(self.nfiles - nf, upath))
print("%d %s" % (self.nfiles - nf, upath))
ncs = len(hs)
for nc, cid in enumerate(hs):
print(" {0} up {1}".format(ncs - nc, cid))
stats = "{0}/0/0/{1}".format(nf, self.nfiles - nf)
print(" %d up %s" % (ncs - nc, cid))
stats = "%d/0/0/%d" % (nf, self.nfiles - nf)
fslice = FileSlice(file, [cid])
upload(fslice, self.ar.a, stats)
upload(fslice, stats)
print(" ok!")
if file.recheck:
@@ -866,7 +946,7 @@ class Ctl(object):
eprint("finalizing %d duplicate files\n" % (len(self.recheck),))
for file in self.recheck:
handshake(self.ar, file, search)
handshake(self.ar, file, False)
def _fancy(self):
if VT100 and not self.ar.ns:
@@ -881,6 +961,8 @@ class Ctl(object):
while True:
with self.exit_cond:
self.exit_cond.wait(0.07)
if self.panik:
sys.exit(1)
with self.mutex:
if not self.handshaker_alive and not self.uploader_alive:
break
@@ -889,15 +971,15 @@ class Ctl(object):
if VT100 and not self.ar.ns:
maxlen = ss.w - len(str(self.nfiles)) - 14
txt = "\033[s\033[{0}H".format(ss.g)
txt = "\033[s\033[%dH" % (ss.g,)
for y, k, st, f in [
[0, "hash", st_hash, self.hash_f],
[1, "send", st_up, self.up_f],
]:
txt += "\033[{0}H{1}:".format(ss.g + y, k)
txt += "\033[%dH%s:" % (ss.g + y, k)
file, arg = st
if not file:
txt += " {0}\033[K".format(arg)
txt += " %s\033[K" % (arg,)
else:
if y:
p = 100 * file.up_b / file.size
@@ -906,12 +988,11 @@ class Ctl(object):
name = file.abs.decode("utf-8", "replace")[-maxlen:]
if "/" in name:
name = "\033[36m{0}\033[0m/{1}".format(*name.rsplit("/", 1))
name = "\033[36m%s\033[0m/%s" % tuple(name.rsplit("/", 1))
t = "{0:6.1f}% {1} {2}\033[K"
txt += t.format(p, self.nfiles - f, name)
txt += "%6.1f%% %d %s\033[K" % (p, self.nfiles - f, name)
txt += "\033[{0}H ".format(ss.g + 2)
txt += "\033[%dH " % (ss.g + 2,)
else:
txt = " "
@@ -929,7 +1010,7 @@ class Ctl(object):
nleft = self.nfiles - self.up_f
tail = "\033[K\033[u" if VT100 and not self.ar.ns else "\r"
t = "{0} eta @ {1}/s, {2}, {3}# left".format(self.eta, spd, sleft, nleft)
t = "%s eta @ %s/s, %s, %d# left\033[K" % (self.eta, spd, sleft, nleft)
eprint(txt + "\033]0;{0}\033\\\r{0}{1}".format(t, tail))
if self.hash_b and self.at_hash:
@@ -965,20 +1046,18 @@ class Ctl(object):
srd = rd.decode("utf-8", "replace").replace("\\", "/")
if prd != rd:
prd = rd
headers = {}
if self.ar.a:
headers["Cookie"] = "=".join(["cppwd", self.ar.a])
ls = {}
try:
print(" ls ~{0}".format(srd))
zb = self.ar.url.encode("utf-8")
zb += quotep(rd.replace(b"\\", b"/"))
r = req_ses.get(zb + b"?ls&lt&dots", headers=headers)
if not r:
raise Exception("HTTP {0}".format(r.status_code))
zt = (
self.ar.vtop,
quotep(rd.replace(b"\\", b"/")).decode("utf-8", "replace"),
)
sc, txt = web.req("GET", "%s%s?ls&lt&dots" % zt, {})
if sc >= 400:
raise Exception("http %s" % (sc,))
j = r.json()
j = json.loads(txt)
for f in j["dirs"] + j["files"]:
rfn = f["href"].split("?")[0].rstrip("/")
ls[unquote(rfn.encode("utf-8", "replace"))] = f
@@ -1001,14 +1080,17 @@ class Ctl(object):
req = locs
while req:
print("DELETING ~%s/#%s" % (srd, len(req)))
r = req_ses.post(self.ar.url + "?delete", json=req)
if r.status_code == 413 and "json 2big" in r.text:
body = json.dumps(req).encode("utf-8")
sc, txt = web.req(
"POST", self.ar.url + "?delete", {}, body, MJ
)
if sc == 413 and "json 2big" in txt:
print(" (delete request too big; slicing...)")
req = req[: len(req) // 2]
continue
elif not r:
t = "delete request failed: %r %s"
raise Exception(t % (r, r.text))
elif sc >= 400:
t = "delete request failed: %s %s"
raise Exception(t % (sc, txt))
break
locs = locs[len(req) :]
@@ -1056,7 +1138,7 @@ class Ctl(object):
if self.ar.wlist:
zsl = [self.ar.wsalt, str(file.size)] + [x[0] for x in file.kchunks]
zb = hashlib.sha512("\n".join(zsl).encode("utf-8")).digest()[:33]
wark = base64.urlsafe_b64encode(zb).decode("utf-8")
wark = ub64enc(zb).decode("utf-8")
vp = file.rel.decode("utf-8")
if self.ar.jw:
print("%s %s" % (wark, vp))
@@ -1087,7 +1169,6 @@ class Ctl(object):
def handshaker(self):
search = self.ar.s
burl = self.ar.url[:8] + self.ar.url[8:].split("/")[0] + "/"
while True:
file = self.q_handshake.get()
if not file:
@@ -1109,12 +1190,16 @@ class Ctl(object):
while time.time() < file.cd:
time.sleep(0.1)
hs, sprs = handshake(self.ar, file, search)
try:
hs, sprs = handshake(self.ar, file, search)
except BadAuth:
self.panik = 1
break
if search:
if hs:
for hit in hs:
t = "found: {0}\n {1}{2}"
print(t.format(upath, burl, hit["rp"]))
print("found: %s\n %s/%s" % (upath, self.ar.burl, hit["rp"]))
else:
print("NOT found: {0}".format(upath))
@@ -1236,7 +1321,7 @@ class Ctl(object):
)
try:
upload(fsl, self.ar.a, stats)
upload(fsl, stats)
except Exception as ex:
t = "upload failed, retrying: %s #%s+%d (%s)\n"
eprint(t % (file.name, cids[0][:8], len(cids) - 1, ex))
@@ -1270,14 +1355,17 @@ class APF(argparse.ArgumentDefaultsHelpFormatter, argparse.RawDescriptionHelpFor
def main():
global web
time.strptime("19970815", "%Y%m%d") # python#7980
"".encode("idna") # python#29288
if not VT100:
os.system("rem") # enables colors
cores = (os.cpu_count() if hasattr(os, "cpu_count") else 0) or 2
hcores = min(cores, 3) # 4% faster than 4+ on py3.9 @ r5-4500U
ver = "{0}, v{1}".format(S_BUILD_DT, S_VERSION)
ver = "{0} v{1} https://youtu.be/BIcOO6TLKaY".format(S_BUILD_DT, S_VERSION)
if "--version" in sys.argv:
print(ver)
return
@@ -1285,7 +1373,7 @@ def main():
sys.argv = [x for x in sys.argv if x != "--ws"]
# fmt: off
ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog="""
ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool " + ver, epilog="""
NOTE:
source file/folder selection uses rsync syntax, meaning that:
"foo" uploads the entire folder to URL/foo/
@@ -1366,13 +1454,20 @@ source file/folder selection uses rsync syntax, meaning that:
for x in ar.files
]
ar.url = ar.url.rstrip("/") + "/"
if "://" not in ar.url:
ar.url = "http://" + ar.url
# urlsplit needs scheme;
zs = ar.url.rstrip("/") + "/"
if "://" not in zs:
zs = "http://" + zs
ar.url = zs
url = urlsplit(zs)
ar.burl = "%s://%s" % (url.scheme, url.netloc)
ar.vtop = url.path
if "https://" in ar.url.lower():
try:
import ssl, zipfile
import ssl
import zipfile
except:
t = "ERROR: https is not available for some reason; please use http"
print("\n\n %s\n\n" % (t,))
@@ -1397,6 +1492,7 @@ source file/folder selection uses rsync syntax, meaning that:
if ar.cls:
eprint("\033[H\033[2J\033[3J", end="")
web = HCli(ar)
ctl = Ctl(ar)
if ar.dr and not ar.drd and ctl.ok:


@@ -19,6 +19,9 @@
* the `act:bput` thing is optional since copyparty v1.9.29
* using an older sharex version, maybe sharex v12.1.1 for example? dw fam i got your back 👉😎👉 [`sharex12.sxcu`](sharex12.sxcu)
### [`flameshot.sh`](flameshot.sh)
* takes a screenshot with [flameshot](https://flameshot.org/) on Linux, uploads it, and writes the URL to clipboard
### [`send-to-cpp.contextlet.json`](send-to-cpp.contextlet.json)
* browser integration, kind of? custom rightclick actions and stuff
* rightclick a pic and send it to copyparty straight from your browser

contrib/flameshot.sh (new executable file)

@@ -0,0 +1,14 @@
#!/bin/bash
set -e
# take a screenshot with flameshot and send it to copyparty;
# the image url will be placed on your clipboard
password=wark
url=https://a.ocv.me/up/
filename=$(date +%Y-%m%d-%H%M%S).png
flameshot gui -s -r |
curl -T- $url$filename?pw=$password |
tail -n 1 |
xsel -ib


@@ -1,14 +1,10 @@
# when running copyparty behind a reverse proxy,
# the following arguments are recommended:
#
# -i 127.0.0.1 only accept connections from nginx
#
# -nc must match or exceed the webserver's max number of concurrent clients;
# copyparty default is 1024 if OS permits it (see "max clients:" on startup),
# look for "max clients:" when starting copyparty, as nginx should
# not accept more consecutive clients than what copyparty is able to;
# nginx default is 512 (worker_processes 1, worker_connections 512)
#
# you may also consider adding -j0 for CPU-intensive configurations
# (5'000 requests per second, or 20gbps upload/download in parallel)
# rarely, in some extreme usecases, it can be good to add -j0
# (40'000 requests per second, or 20gbps upload/download in parallel)
# but this is usually counterproductive and slightly buggy
#
# on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
#
@@ -20,10 +16,33 @@
#
# and then enable it below by uncommenting the cloudflare-only.conf line
upstream cpp {
upstream cpp_tcp {
# alternative 1: connect to copyparty using tcp;
# cpp_uds is slightly faster and more secure, but
# cpp_tcp is easier to setup and "just works"
# ...you should however restrict copyparty to only
# accept connections from nginx by adding these args:
# -i 127.0.0.1
server 127.0.0.1:3923 fail_timeout=1s;
keepalive 1;
}
upstream cpp_uds {
# alternative 2: unix-socket, aka. "unix domain socket";
# 5-10% faster, and better isolation from other software,
# but there must be at least one unix-group which both
# nginx and copyparty is a member of; if that group is
# "www" then run copyparty with the following args:
# -i unix:770:www:/tmp/party.sock
server unix:/tmp/party.sock fail_timeout=1s;
keepalive 1;
}
server {
listen 443 ssl;
listen [::]:443 ssl;
@@ -34,7 +53,8 @@ server {
#include /etc/nginx/cloudflare-only.conf;
location / {
proxy_pass http://cpp;
# recommendation: replace cpp_tcp with cpp_uds below
proxy_pass http://cpp_tcp;
proxy_redirect off;
# disable buffering (next 4 lines)
proxy_http_version 1.1;
@@ -52,6 +72,7 @@ server {
}
}
# default client_max_body_size (1M) blocks uploads larger than 256 MiB
client_max_body_size 1024M;
client_header_timeout 610m;


@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.15.0"
pkgver="1.15.4"
pkgrel=1
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
arch=("any")
@@ -21,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("cd082e1dc93ef0bd8b6115155f467e14bf450874d0a822567416f5e30fc55618")
sha256sums=("e0c91b2344f1cbec8be60f715d076f8ba79eef09c1f9f016f5192f743621800d")
build() {
cd "${srcdir}/${pkgname}-${pkgver}"


@@ -1,5 +1,5 @@
{
"url": "https://github.com/9001/copyparty/releases/download/v1.15.0/copyparty-sfx.py",
"version": "1.15.0",
"hash": "sha256-4W7GMdukwG6CNaVrCCOF12tdQ/12XZz/orHAoB/3G8U="
"url": "https://github.com/9001/copyparty/releases/download/v1.15.4/copyparty-sfx.py",
"version": "1.15.4",
"hash": "sha256-a6zgGg1EOu7wtiOzPLwv4h17yrmq2Iharip9rpI8kt0="
}


@@ -16,9 +16,10 @@ except:
TYPE_CHECKING = False
if True:
from typing import Any, Callable
from typing import Any, Callable, Optional
PY2 = sys.version_info < (3,)
PY36 = sys.version_info > (3, 6)
if not PY2:
unicode: Callable[[Any], str] = str
else:
@@ -50,6 +51,60 @@ try:
except:
CORES = (os.cpu_count() if hasattr(os, "cpu_count") else 0) or 2
# all embedded resources to be retrievable over http
zs = """
web/a/partyfuse.py
web/a/u2c.py
web/a/webdav-cfg.bat
web/baguettebox.js
web/browser.css
web/browser.html
web/browser.js
web/browser2.html
web/cf.html
web/copyparty.gif
web/dd/2.png
web/dd/3.png
web/dd/4.png
web/dd/5.png
web/deps/busy.mp3
web/deps/easymde.css
web/deps/easymde.js
web/deps/marked.js
web/deps/fuse.py
web/deps/mini-fa.css
web/deps/mini-fa.woff
web/deps/prism.css
web/deps/prism.js
web/deps/prismd.css
web/deps/scp.woff2
web/deps/sha512.ac.js
web/deps/sha512.hw.js
web/md.css
web/md.html
web/md.js
web/md2.css
web/md2.js
web/mde.css
web/mde.html
web/mde.js
web/msg.css
web/msg.html
web/shares.css
web/shares.html
web/shares.js
web/splash.css
web/splash.html
web/splash.js
web/svcs.html
web/svcs.js
web/ui.css
web/up2k.js
web/util.js
web/w.hash.js
"""
RES = set(zs.strip().split("\n"))
class EnvParams(object):
def __init__(self) -> None:


@@ -27,6 +27,7 @@ from .__init__ import (
EXE,
MACOS,
PY2,
PY36,
VT100,
WINDOWS,
E,
@@ -54,7 +55,10 @@ from .util import (
Daemon,
align_tab,
ansi_re,
b64enc,
dedent,
has_resource,
load_resource,
min_ex,
pybin,
termsize,
@@ -204,7 +208,7 @@ def init_E(EE: EnvParams) -> None:
errs.append("Using [%s] instead" % (p,))
if errs:
print("WARNING: " + ". ".join(errs))
warn(". ".join(errs))
return p # type: ignore
except Exception as ex:
@@ -234,7 +238,7 @@ def init_E(EE: EnvParams) -> None:
raise
def get_srvname() -> str:
def get_srvname(verbose) -> str:
try:
ret: str = unicode(socket.gethostname()).split(".")[0]
except:
@@ -244,7 +248,8 @@ def get_srvname() -> str:
return ret
fp = os.path.join(E.cfg, "name.txt")
lprint("using hostname from {}\n".format(fp))
if verbose:
lprint("using hostname from {}\n".format(fp))
try:
with open(fp, "rb") as f:
ret = f.read().decode("utf-8", "replace").strip()
@@ -266,7 +271,7 @@ def get_fk_salt() -> str:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(18))
ret = b64enc(os.urandom(18))
with open(fp, "wb") as f:
f.write(ret + b"\n")
@@ -279,7 +284,7 @@ def get_dk_salt() -> str:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(30))
ret = b64enc(os.urandom(30))
with open(fp, "wb") as f:
f.write(ret + b"\n")
@@ -292,7 +297,7 @@ def get_ah_salt() -> str:
with open(fp, "rb") as f:
ret = f.read().strip()
except:
ret = base64.b64encode(os.urandom(18))
ret = b64enc(os.urandom(18))
with open(fp, "wb") as f:
f.write(ret + b"\n")
@@ -322,8 +327,7 @@ def ensure_locale() -> None:
def ensure_webdeps() -> None:
ap = os.path.join(E.mod, "web/deps/mini-fa.woff")
if os.path.exists(ap):
if has_resource(E, "web/deps/mini-fa.woff"):
return
warn(
@@ -350,7 +354,7 @@ def configure_ssl_ver(al: argparse.Namespace) -> None:
# oh man i love openssl
# check this out
# hold my beer
assert ssl # type: ignore
assert ssl # type: ignore # !rm
ptn = re.compile(r"^OP_NO_(TLS|SSL)v")
sslver = terse_sslver(al.ssl_ver).split(",")
flags = [k for k in ssl.__dict__ if ptn.match(k)]
@@ -384,7 +388,7 @@ def configure_ssl_ver(al: argparse.Namespace) -> None:
def configure_ssl_ciphers(al: argparse.Namespace) -> None:
assert ssl # type: ignore
assert ssl # type: ignore # !rm
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
if al.ssl_ver:
ctx.options &= ~al.ssl_flags_en
@@ -516,14 +520,18 @@ def sfx_tpoke(top: str):
def showlic() -> None:
p = os.path.join(E.mod, "res", "COPYING.txt")
if not os.path.exists(p):
try:
with load_resource(E, "res/COPYING.txt") as f:
buf = f.read()
except:
buf = b""
if buf:
print(buf.decode("utf-8", "replace"))
else:
print("no relevant license info to display")
return
with open(p, "rb") as f:
print(f.read().decode("utf-8", "replace"))
def get_sects():
return [
@@ -1171,7 +1179,7 @@ def add_smb(ap):
ap2.add_argument("--smbw", action="store_true", help="enable write support (please dont)")
ap2.add_argument("--smb1", action="store_true", help="disable SMBv2, only enable SMBv1 (CIFS)")
ap2.add_argument("--smb-port", metavar="PORT", type=int, default=445, help="port to listen on -- if you change this value, you must NAT from TCP:445 to this port using iptables or similar")
ap2.add_argument("--smb-nwa-1", action="store_true", help="disable impacket#1433 workaround (truncate directory listings to 64kB)")
ap2.add_argument("--smb-nwa-1", action="store_true", help="truncate directory listings to 64kB (~400 files); avoids impacket-0.11 bug, fixes impacket-0.12 performance")
ap2.add_argument("--smb-nwa-2", action="store_true", help="disable impacket workaround for filecopy globs")
ap2.add_argument("--smba", action="store_true", help="small performance boost: disable per-account permissions, enables account coalescing instead (if one user has write/delete-access, then everyone does)")
ap2.add_argument("--smbv", action="store_true", help="verbose")
@@ -1230,6 +1238,7 @@ def add_optouts(ap):
ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
ap2.add_argument("--no-up-list", action="store_true", help="don't show list of incoming files in controlpanel")
ap2.add_argument("--no-pipe", action="store_true", help="disable race-the-beam (lockstep download of files which are currently being uploaded) (volflag=nopipe)")
ap2.add_argument("--no-db-ip", action="store_true", help="do not write uploader IPs into the database")
@@ -1358,6 +1367,8 @@ def add_db_general(ap, hcores):
ap2.add_argument("--hist", metavar="PATH", type=u, default="", help="where to store volume data (db, thumbs); default is a folder named \".hist\" inside each volume (volflag=hist)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, default="", help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-dirsz", action="store_true", help="do not show total recursive size of folders in listings, show inode size instead; slightly faster (volflag=nodirsz)")
ap2.add_argument("--re-dirsz", action="store_true", help="if the directory-sizes in the UI are bonkers, use this along with \033[33m-e2dsa\033[0m to rebuild the index from scratch")
ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
ap2.add_argument("--re-dhash", action="store_true", help="force a cache rebuild on startup; enable this once if it gets out of sync (should never be necessary)")
ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice -- only useful for offloading uploads to a cloud service or something (volflag=noforget)")
@@ -1471,7 +1482,7 @@ def add_debug(ap):
def run_argparse(
argv: list[str], formatter: Any, retry: bool, nc: int
argv: list[str], formatter: Any, retry: bool, nc: int, verbose=True
) -> argparse.Namespace:
ap = argparse.ArgumentParser(
formatter_class=formatter,
@@ -1493,7 +1504,7 @@ def run_argparse(
tty = os.environ.get("TERM", "").lower() == "linux"
srvname = get_srvname()
srvname = get_srvname(verbose)
add_general(ap, nc, srvname)
add_network(ap)
@@ -1560,16 +1571,13 @@ def run_argparse(
return ret
def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
def main(argv: Optional[list[str]] = None) -> None:
time.strptime("19970815", "%Y%m%d") # python#7980
if WINDOWS:
os.system("rem") # enables colors
init_E(E)
if rsrc: # pyz
E.mod = rsrc
if argv is None:
argv = sys.argv
@@ -1673,7 +1681,7 @@ def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
for fmtr in [RiceFormatter, RiceFormatter, Dodge11874, BasicDodge11874]:
try:
al = run_argparse(argv, fmtr, retry, nc)
dal = run_argparse([], fmtr, retry, nc)
dal = run_argparse([], fmtr, retry, nc, False)
break
except SystemExit:
raise
@@ -1757,7 +1765,7 @@ def main(argv: Optional[list[str]] = None, rsrc: Optional[str] = None) -> None:
print("error: python2 cannot --smb")
return
if sys.version_info < (3, 6):
if not PY36:
al.no_scandir = True
if not hasattr(os, "sendfile"):


@@ -1,8 +1,8 @@
# coding: utf-8
VERSION = (1, 15, 1)
VERSION = (1, 15, 5)
CODENAME = "fill the drives"
BUILD_DT = (2024, 9, 9)
BUILD_DT = (2024, 10, 5)
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)


@@ -855,6 +855,7 @@ class AuthSrv(object):
self.idp_accs: dict[str, list[str]] = {} # username->groupnames
self.idp_usr_gh: dict[str, str] = {} # username->group-header-value (cache)
self.hid_cache: dict[str, str] = {}
self.mutex = threading.Lock()
self.reload()
@@ -1531,7 +1532,7 @@ class AuthSrv(object):
if enshare:
import sqlite3
shv = VFS(self.log_func, "", shr, AXS(), {"d2d": True})
shv = VFS(self.log_func, "", shr, AXS(), {})
db_path = self.args.shr_db
db = sqlite3.connect(db_path)
@@ -1550,8 +1551,8 @@ class AuthSrv(object):
if s_pw:
# gotta reuse the "account" for all shares with this pw,
# so do a light scramble as this appears in the web-ui
zs = ub64enc(hashlib.sha512(s_pw.encode("utf-8")).digest())[4:16]
sun = "s_%s" % (zs.decode("utf-8"),)
zb = hashlib.sha512(s_pw.encode("utf-8")).digest()
sun = "s_%s" % (ub64enc(zb)[4:16].decode("ascii"),)
acct[sun] = s_pw
else:
sun = "*"
@@ -1656,8 +1657,12 @@ class AuthSrv(object):
promote = []
demote = []
for vol in vfs.all_vols.values():
zb = hashlib.sha512(afsenc(vol.realpath)).digest()
hid = base64.b32encode(zb).decode("ascii").lower()
hid = self.hid_cache.get(vol.realpath)
if not hid:
zb = hashlib.sha512(afsenc(vol.realpath)).digest()
hid = base64.b32encode(zb).decode("ascii").lower()
self.hid_cache[vol.realpath] = hid
vflag = vol.flags.get("hist")
if vflag == "-":
pass
@@ -2286,7 +2291,7 @@ class AuthSrv(object):
q = "insert into us values (?,?,?)"
for uname in self.acct:
if uname not in ases:
sid = ub64enc(os.urandom(blen)).decode("utf-8")
sid = ub64enc(os.urandom(blen)).decode("ascii")
cur.execute(q, (uname, sid, int(time.time())))
ases[uname] = sid
n.append(uname)


@@ -9,14 +9,14 @@ import queue
from .__init__ import CORES, TYPE_CHECKING
from .broker_mpw import MpWorker
from .broker_util import ExceptionalQueue, try_exec
from .broker_util import ExceptionalQueue, NotExQueue, try_exec
from .util import Daemon, mp
if TYPE_CHECKING:
from .svchub import SvcHub
if True: # pylint: disable=using-constant-test
from typing import Any
from typing import Any, Union
class MProcess(mp.Process):
@@ -108,7 +108,7 @@ class BrokerMp(object):
if retq_id:
proc.q_pend.put((retq_id, "retq", rv))
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
# new non-ipc invoking managed service in hub
obj = self.hub


@@ -11,7 +11,7 @@ import queue
from .__init__ import ANYWIN
from .authsrv import AuthSrv
from .broker_util import BrokerCli, ExceptionalQueue
from .broker_util import BrokerCli, ExceptionalQueue, NotExQueue
from .httpsrv import HttpSrv
from .util import FAKE_MP, Daemon, HMaccas
@@ -114,7 +114,7 @@ class MpWorker(BrokerCli):
else:
raise Exception("what is " + str(dest))
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
retq = ExceptionalQueue(1)
retq_id = id(retq)
with self.retpend_mutex:

View File

@@ -5,7 +5,7 @@ import os
import threading
from .__init__ import TYPE_CHECKING
from .broker_util import BrokerCli, ExceptionalQueue, try_exec
from .broker_util import BrokerCli, ExceptionalQueue, NotExQueue
from .httpsrv import HttpSrv
from .util import HMaccas
@@ -13,7 +13,7 @@ if TYPE_CHECKING:
from .svchub import SvcHub
if True: # pylint: disable=using-constant-test
from typing import Any
from typing import Any, Union
class BrokerThr(BrokerCli):
@@ -43,19 +43,14 @@ class BrokerThr(BrokerCli):
def noop(self) -> None:
pass
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
# new ipc invoking managed service in hub
obj = self.hub
for node in dest.split("."):
obj = getattr(obj, node)
rv = try_exec(True, obj, *args)
# pretend we're broker_mp
retq = ExceptionalQueue(1)
retq.put(rv)
return retq
return NotExQueue(obj(*args)) # type: ignore
def say(self, dest: str, *args: Any) -> None:
if dest == "listen":
@@ -71,4 +66,4 @@ class BrokerThr(BrokerCli):
for node in dest.split("."):
obj = getattr(obj, node)
try_exec(False, obj, *args)
obj(*args) # type: ignore

View File

@@ -33,6 +33,18 @@ class ExceptionalQueue(Queue, object):
return rv
class NotExQueue(object):
"""
BrokerThr uses this instead of ExceptionalQueue; 7x faster
"""
def __init__(self, rv: Any) -> None:
self.rv = rv
def get(self) -> Any:
return self.rv
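The 7x figure in the docstring is plausible: Queue.get takes a lock and waits on a Condition, while NotExQueue.get is a single attribute read. A rough way to compare the two (not part of the diff):

import queue, timeit

class NotExQueue(object):
    def __init__(self, rv): self.rv = rv
    def get(self): return self.rv

def with_queue():
    q = queue.Queue(1)  # ExceptionalQueue is a Queue subclass
    q.put(123)
    return q.get()

def without():
    return NotExQueue(123).get()

print(timeit.timeit(with_queue, number=100000))
print(timeit.timeit(without, number=100000))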
class BrokerCli(object):
"""
helps mypy understand httpsrv.broker but still fails a few levels deeper,
@@ -48,7 +60,7 @@ class BrokerCli(object):
def __init__(self) -> None:
pass
def ask(self, dest: str, *args: Any) -> ExceptionalQueue:
def ask(self, dest: str, *args: Any) -> Union[ExceptionalQueue, NotExQueue]:
return ExceptionalQueue(1)
def say(self, dest: str, *args: Any) -> None:

View File

@@ -7,7 +7,7 @@ import shutil
import time
from .__init__ import ANYWIN
from .util import Netdev, runcmd, wrename, wunlink
from .util import Netdev, load_resource, runcmd, wrename, wunlink
HAVE_CFSSL = not os.environ.get("PRTY_NO_CFSSL")
@@ -29,13 +29,15 @@ def ensure_cert(log: "RootLogger", args) -> None:
i feel awful about this and so should they
"""
cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
with load_resource(args.E, "res/insecure.pem") as f:
cert_insec = f.read()
cert_appdata = os.path.join(args.E.cfg, "cert.pem")
if not os.path.isfile(args.cert):
if cert_appdata != args.cert:
raise Exception("certificate file does not exist: " + args.cert)
shutil.copy(cert_insec, args.cert)
with open(args.cert, "wb") as f:
f.write(cert_insec)
with open(args.cert, "rb") as f:
buf = f.read()
@@ -50,7 +52,9 @@ def ensure_cert(log: "RootLogger", args) -> None:
raise Exception(m + "private key must appear before server certificate")
try:
if filecmp.cmp(args.cert, cert_insec):
with open(args.cert, "rb") as f:
active_cert = f.read()
if active_cert == cert_insec:
t = "using default TLS certificate; https will be insecure:\033[36m {}"
log("cert", t.format(args.cert), 3)
except:
@@ -151,14 +155,22 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
raise Exception("no useable cert found")
expired = time.time() + args.crt_sdays * 60 * 60 * 24 * 0.5 > expiry
cert_insec = os.path.join(args.E.mod, "res/insecure.pem")
if expired:
raise Exception("old server-cert has expired")
for n in names:
if n not in inf["sans"]:
raise Exception("does not have {}".format(n))
if expired:
raise Exception("old server-cert has expired")
if not filecmp.cmp(args.cert, cert_insec):
with load_resource(args.E, "res/insecure.pem") as f:
cert_insec = f.read()
with open(args.cert, "rb") as f:
active_cert = f.read()
if active_cert and active_cert != cert_insec:
return
except Exception as ex:
log("cert", "will create new server-cert; {}".format(ex))

View File

@@ -2,7 +2,7 @@
from __future__ import print_function, unicode_literals
# awk -F\" '/add_argument\("-[^-]/{print(substr($2,2))}' copyparty/__main__.py | sort | tr '\n' ' '
zs = "a c e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vp e2vu ed emp i j lo mcr mte mth mtm mtp nb nc nid nih nw p q s ss sss v z zv"
zs = "a c e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vp e2vu ed emp i j lo mcr mte mth mtm mtp nb nc nid nih nth nw p q s ss sss v z zv"
onedash = set(zs.split())
@@ -13,6 +13,7 @@ def vf_bmap() -> dict[str, str]:
"dav_rt": "davrt",
"ed": "dots",
"hardlink_only": "hardlinkonly",
"no_dirsz": "nodirsz",
"no_dupe": "nodupe",
"no_forget": "noforget",
"no_pipe": "nopipe",

View File

@@ -119,7 +119,7 @@ class Fstab(object):
self.srctab = srctab
def relabel(self, path: str, nval: str) -> None:
assert self.tab
assert self.tab # !rm
self.cache = {}
if ANYWIN:
path = self._winpath(path)
@@ -156,7 +156,7 @@ class Fstab(object):
self.log("failed to build tab:\n{}".format(min_ex()), 3)
self.build_fallback()
assert self.tab
assert self.tab # !rm
ret = self.tab._find(path)[0]
if self.trusted or path == ret.vpath:
return ret.realpath.split("/")[0]
@@ -167,6 +167,6 @@ class Fstab(object):
if not self.tab:
self.build_fallback()
assert self.tab
assert self.tab # !rm
ret = self.tab._find(path)[0]
return ret.realpath

View File

@@ -163,7 +163,7 @@ class FtpFs(AbstractedFS):
t = "Unsupported characters in [{}]"
raise FSE(t.format(vpath), 1)
fn = sanitize_fn(fn or "", "", [".prologue.html", ".epilogue.html"])
fn = sanitize_fn(fn or "", "")
vpath = vjoin(rd, fn)
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, r, w, m, d)
if not vfs.realpath:

View File

@@ -2,7 +2,6 @@
from __future__ import print_function, unicode_literals
import argparse # typechk
import base64
import calendar
import copy
import errno
@@ -33,11 +32,12 @@ try:
except:
pass
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, EnvParams, unicode
from .__init__ import ANYWIN, PY2, RES, TYPE_CHECKING, EnvParams, unicode
from .__version__ import S_VERSION
from .authsrv import VFS # typechk
from .bos import bos
from .star import StreamTar
from .stolen.qrcodegen import QrCode, qr2svg
from .sutil import StreamArc, gfilter
from .szip import StreamZip
from .up2k import up2k_chunksize
@@ -45,6 +45,7 @@ from .util import unquote # type: ignore
from .util import (
APPLESAN_RE,
BITNESS,
DAV_ALLPROPS,
HAVE_SQLITE3,
HTTPCODE,
META_NOBOTS,
@@ -58,6 +59,7 @@ from .util import (
absreal,
alltrace,
atomic_move,
b64dec,
exclude_dotfiles,
formatdate,
fsenc,
@@ -67,13 +69,16 @@ from .util import (
get_df,
get_spd,
guess_mime,
gzip_file_orig_sz,
gzip_orig_sz,
has_resource,
hashcopy,
hidedir,
html_bescape,
html_escape,
humansize,
ipnorm,
load_resource,
loadpy,
log_reloc,
min_ex,
@@ -87,11 +92,13 @@ from .util import (
relchk,
ren_open,
runhook,
s2hms,
s3enc,
sanitize_fn,
sanitize_vpath,
sendfile_kern,
sendfile_py,
stat_resource,
ub64dec,
ub64enc,
ujoin,
@@ -108,7 +115,7 @@ from .util import (
if True: # pylint: disable=using-constant-test
import typing
from typing import Any, Generator, Match, Optional, Pattern, Type, Union
from typing import Any, Generator, Iterable, Match, Optional, Pattern, Type, Union
if TYPE_CHECKING:
from .httpconn import HttpConn
@@ -127,7 +134,7 @@ class HttpCli(object):
"""
def __init__(self, conn: "HttpConn") -> None:
assert conn.sr
assert conn.sr # !rm
self.t0 = time.time()
self.conn = conn
@@ -422,6 +429,7 @@ class HttpCli(object):
vpath = undot(vpath)
ptn = self.conn.hsrv.ptn_cc
k_safe = self.conn.hsrv.uparam_cc_ok
for k in arglist.split("&"):
if "=" in k:
k, zs = k.split("=", 1)
@@ -434,7 +442,7 @@ class HttpCli(object):
k = k.lower()
uparam[k] = sv
if k in ("doc", "move", "tree"):
if k in k_safe:
continue
zs = "%s=%s" % (k, sv)
@@ -488,6 +496,9 @@ class HttpCli(object):
self.vpath + "/" if self.trailing_slash and self.vpath else self.vpath
)
if "qr" in uparam:
return self.tx_qr()
if relchk(self.vpath) and (self.vpath != "*" or self.mode != "OPTIONS"):
self.log("invalid relpath [{}]".format(self.vpath))
self.cbonk(self.conn.hsrv.gmal, self.req, "bad_vp", "invalid relpaths")
@@ -502,7 +513,7 @@ class HttpCli(object):
):
try:
zb = zso.split(" ")[1].encode("ascii")
zs = base64.b64decode(zb).decode("utf-8")
zs = b64dec(zb).decode("utf-8")
# try "pwd", "x:pwd", "pwd:x"
for bauth in [zs] + zs.split(":", 1)[::-1]:
if bauth in self.asrv.sesa:
@@ -1092,14 +1103,17 @@ class HttpCli(object):
if self.vpath == ".cpr/metrics":
return self.conn.hsrv.metrics.tx(self)
path_base = os.path.join(self.E.mod, "web")
static_path = absreal(os.path.join(path_base, self.vpath[5:]))
if static_path in self.conn.hsrv.statics:
return self.tx_file(static_path)
res_path = "web/" + self.vpath[5:]
if res_path in RES:
ap = os.path.join(self.E.mod, res_path)
if bos.path.exists(ap) or bos.path.exists(ap + ".gz"):
return self.tx_file(ap)
else:
return self.tx_res(res_path)
if not static_path.startswith(path_base):
if res_path != undot(res_path):
t = "malicious user; attempted path traversal [{}] => [{}]"
self.log(t.format(self.vpath, static_path), 1)
self.log(t.format(self.vpath, res_path), 1)
self.cbonk(self.conn.hsrv.gmal, self.req, "trav", "path traversal")
self.tx_404()
@@ -1194,10 +1208,6 @@ class HttpCli(object):
tap = vn.canonical(rem)
if "davauth" in vn.flags and self.uname == "*":
self.can_read = self.can_write = self.can_get = False
if not self.can_read and not self.can_write and not self.can_get:
self.log("inaccessible: [%s]" % (self.vpath,))
raise Pebkac(401, "authenticate")
from .dxml import parse_xml
@@ -1206,6 +1216,7 @@ class HttpCli(object):
# enc = "shift_jis"
enc = "utf-8"
uenc = enc.upper()
props = DAV_ALLPROPS
clen = int(self.headers.get("content-length", 0))
if clen:
@@ -1216,33 +1227,13 @@ class HttpCli(object):
break
xroot = parse_xml(buf.decode(enc, "replace"))
xtag = next(x for x in xroot if x.tag.split("}")[-1] == "prop")
props_lst = [y.tag.split("}")[-1] for y in xtag]
else:
props_lst = [
"contentclass",
"creationdate",
"defaultdocument",
"displayname",
"getcontentlanguage",
"getcontentlength",
"getcontenttype",
"getlastmodified",
"href",
"iscollection",
"ishidden",
"isreadonly",
"isroot",
"isstructureddocument",
"lastaccessed",
"name",
"parentname",
"resourcetype",
"supportedlock",
]
xtag = next((x for x in xroot if x.tag.split("}")[-1] == "prop"), None)
if xtag is not None:
props = set([y.tag.split("}")[-1] for y in xtag])
# assume <allprop/> otherwise; nobody ever gonna <propname/>
props = set(props_lst)
depth = self.headers.get("depth", "infinity").lower()
zi = int(time.time())
vst = os.stat_result((16877, -1, -1, 1, 1000, 1000, 8, zi, zi, zi))
try:
topdir = {"vp": "", "st": bos.stat(tap)}
@@ -1251,10 +1242,22 @@ class HttpCli(object):
raise
raise Pebkac(404)
if depth == "0" or not self.can_read or not stat.S_ISDIR(topdir["st"].st_mode):
fgen = []
fgen: Iterable[dict[str, Any]] = []
depth = self.headers.get("depth", "infinity").lower()
if depth == "infinity":
if not self.can_read:
t = "depth:infinity requires read-access in /%s"
t = t % (self.vpath,)
self.log(t, 3)
raise Pebkac(401, t)
if not stat.S_ISDIR(topdir["st"].st_mode):
t = "depth:infinity can only be used on folders; /%s is 0o%o"
t = t % (self.vpath, topdir["st"].st_mode)
self.log(t, 3)
raise Pebkac(400, t)
elif depth == "infinity":
if not self.args.dav_inf:
self.log("client wants --dav-inf", 3)
zb = b'<?xml version="1.0" encoding="utf-8"?>\n<D:error xmlns:D="DAV:"><D:propfind-finite-depth/></D:error>'
@@ -1282,22 +1285,28 @@ class HttpCli(object):
[[True, False]],
lstat="davrt" not in vn.flags,
)
if not self.can_read:
vfs_ls = []
if not self.can_dot:
names = set(exclude_dotfiles([x[0] for x in vfs_ls]))
vfs_ls = [x for x in vfs_ls if x[0] in names]
zi = int(time.time())
zsr = os.stat_result((16877, -1, -1, 1, 1000, 1000, 8, zi, zi, zi))
ls = [{"vp": vp, "st": st} for vp, st in vfs_ls]
ls += [{"vp": v, "st": zsr} for v in vfs_virt]
fgen = ls # type: ignore
fgen = [{"vp": vp, "st": st} for vp, st in vfs_ls]
fgen += [{"vp": v, "st": vst} for v in vfs_virt]
elif depth == "0":
pass
else:
t = "invalid depth value '{}' (must be either '0' or '1'{})"
t2 = " or 'infinity'" if self.args.dav_inf else ""
raise Pebkac(412, t.format(depth, t2))
fgen = itertools.chain([topdir], fgen) # type: ignore
if not self.can_read and not self.can_write and not self.can_get and not fgen:
self.log("inaccessible: [%s]" % (self.vpath,))
raise Pebkac(401, "authenticate")
fgen = itertools.chain([topdir], fgen)
vtop = vjoin(self.args.R, vjoin(vn.vpath, rem))
chunksz = 0x7FF8 # preferred by nginx or cf (dunno which)
@@ -1395,7 +1404,7 @@ class HttpCli(object):
xroot = mkenod("D:orz")
xroot.insert(0, parse_xml(txt))
xprop = xroot.find(r"./{DAV:}propertyupdate/{DAV:}set/{DAV:}prop")
assert xprop
assert xprop # !rm
for ze in xprop:
ze.clear()
@@ -1403,12 +1412,12 @@ class HttpCli(object):
xroot = parse_xml(txt)
el = xroot.find(r"./{DAV:}response")
assert el
assert el # !rm
e2 = mktnod("D:href", quotep(self.args.SRS + self.vpath))
el.insert(0, e2)
el = xroot.find(r"./{DAV:}response/{DAV:}propstat")
assert el
assert el # !rm
el.insert(0, xprop)
ret = '<?xml version="1.0" encoding="{}"?>\n'.format(uenc)
@@ -1775,6 +1784,7 @@ class HttpCli(object):
open_ka["fun"] = gzip.GzipFile
open_a = ["wb", lv[alg], None, 0x5FEE6600] # 2021-01-01
elif alg == "xz":
assert lzma # type: ignore # !rm
open_ka = {"fun": lzma.open, "preset": lv[alg]}
open_a = ["wb"]
else:
@@ -1792,13 +1802,13 @@ class HttpCli(object):
fn = os.devnull
params.update(open_ka)
assert fn
assert fn # !rm
if not self.args.nw:
if rnd:
fn = rand_name(fdir, fn, rnd)
fn = sanitize_fn(fn or "", "", [".prologue.html", ".epilogue.html"])
fn = sanitize_fn(fn or "", "")
path = os.path.join(fdir, fn)
@@ -1864,10 +1874,12 @@ class HttpCli(object):
# small toctou, but better than clobbering a hardlink
wunlink(self.log, path, vfs.flags)
with ren_open(fn, *open_a, **params) as zfw:
f, fn = zfw["orz"]
f, fn = ren_open(fn, *open_a, **params)
try:
path = os.path.join(fdir, fn)
post_sz, sha_hex, sha_b64 = hashcopy(reader, f, self.args.s_wr_slp)
finally:
f.close()
if lim:
lim.nup(self.ip)
@@ -1906,8 +1918,8 @@ class HttpCli(object):
fn2 = fn.rsplit(".", 1)[0] + "." + ext
params["suffix"] = suffix[:-4]
with ren_open(fn, *open_a, **params) as zfw:
f, fn = zfw["orz"]
f, fn2 = ren_open(fn2, *open_a, **params)
f.close()
path2 = os.path.join(fdir, fn2)
atomic_move(self.log, path, path2, vfs.flags)
@@ -2101,7 +2113,7 @@ class HttpCli(object):
raise Pebkac(422, 'invalid action "{}"'.format(act))
def handle_zip_post(self) -> bool:
assert self.parser
assert self.parser # !rm
try:
k = next(x for x in self.uparam if x in ("zip", "tar"))
except:
@@ -2301,11 +2313,16 @@ class HttpCli(object):
vfs, _ = self.asrv.vfs.get(self.vpath, self.uname, False, True)
ptop = (vfs.dbv or vfs).realpath
x = self.conn.hsrv.broker.ask("up2k.handle_chunks", ptop, wark, chashes)
broker = self.conn.hsrv.broker
x = broker.ask("up2k.handle_chunks", ptop, wark, chashes)
response = x.get()
chashes, chunksize, cstarts, path, lastmod, sprs = response
maxsize = chunksize * len(chashes)
cstart0 = cstarts[0]
locked = chashes # remaining chunks to be received in this request
written = [] # chunks written to disk, but not yet released by up2k
num_left = -1 # num chunks left according to most recent up2k release
treport = time.time() # ratelimit up2k reporting to reduce overhead
try:
if self.args.nw:
@@ -2351,11 +2368,8 @@ class HttpCli(object):
remains -= chunksize
if len(cstart) > 1 and path != os.devnull:
self.log(
"clone {} to {}".format(
cstart[0], " & ".join(unicode(x) for x in cstart[1:])
)
)
t = " & ".join(unicode(x) for x in cstart[1:])
self.log("clone %s to %s" % (cstart[0], t))
ofs = 0
while ofs < chunksize:
bufsz = max(4 * 1024 * 1024, self.args.iobuf)
@@ -2370,6 +2384,25 @@ class HttpCli(object):
self.log("clone {} done".format(cstart[0]))
# be quick to keep the tcp winsize scale;
# if we can't confirm rn then that's fine
written.append(chash)
now = time.time()
if now - treport < 1:
continue
treport = now
x = broker.ask("up2k.fast_confirm_chunks", ptop, wark, written)
num_left, t = x.get()
if num_left < -1:
self.loud_reply(t, status=500)
locked = written = []
return False
elif num_left >= 0:
t = "got %d more chunks, %d left"
self.log(t % (len(written), num_left), 6)
locked = locked[len(written) :]
written = []
if not fpool:
f.close()
else:
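The shape of the new loop: chunks keep streaming, and at most once per second the handler attempts a non-blocking confirm so the client's TCP window never stalls waiting on up2k. A sketch, assuming fast_confirm returns (num_left, msg) where num_left == -1 means "mutex busy, retry later":

import time

def recv_chunks(chunks, write_chunk, fast_confirm):
    written, treport = [], time.time()
    for chash in chunks:
        write_chunk(chash)
        written.append(chash)
        now = time.time()
        if now - treport < 1:
            continue  # ratelimit: confirm at most once per second
        treport = now
        num_left, _ = fast_confirm(written)
        if num_left >= 0:
            written = []  # confirmed; stop tracking these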
@@ -2380,25 +2413,25 @@ class HttpCli(object):
f.close()
raise
finally:
x = self.conn.hsrv.broker.ask("up2k.release_chunks", ptop, wark, chashes)
x.get() # block client until released
if locked:
# now block until all chunks released+confirmed
x = broker.ask("up2k.confirm_chunks", ptop, wark, locked)
num_left, t = x.get()
if num_left < 0:
self.loud_reply(t, status=500)
return False
t = "got %d more chunks, %d left"
self.log(t % (len(locked), num_left), 6)
x = self.conn.hsrv.broker.ask("up2k.confirm_chunks", ptop, wark, chashes)
ztis = x.get()
try:
num_left, fin_path = ztis
except:
self.loud_reply(ztis, status=500)
return False
if num_left < 0:
raise Pebkac(500, "unconfirmed; see serverlog")
if not num_left and fpool:
with self.u2mutex:
self.u2fh.close(path)
if not num_left and not self.args.nw:
self.conn.hsrv.broker.ask(
"up2k.finish_upload", ptop, wark, self.u2fh.aps
).get()
broker.ask("up2k.finish_upload", ptop, wark, self.u2fh.aps).get()
cinf = self.headers.get("x-up2k-stat", "")
@@ -2408,7 +2441,7 @@ class HttpCli(object):
return True
def handle_chpw(self) -> bool:
assert self.parser
assert self.parser # !rm
pwd = self.parser.require("pw", 64)
self.parser.drop()
@@ -2425,7 +2458,7 @@ class HttpCli(object):
return True
def handle_login(self) -> bool:
assert self.parser
assert self.parser # !rm
pwd = self.parser.require("cppwd", 64)
try:
uhash = self.parser.require("uhash", 256)
@@ -2453,7 +2486,7 @@ class HttpCli(object):
return True
def handle_logout(self) -> bool:
assert self.parser
assert self.parser # !rm
self.parser.drop()
self.log("logout " + self.uname)
@@ -2482,7 +2515,7 @@ class HttpCli(object):
logpwd = ""
elif self.args.log_badpwd == 2:
zb = hashlib.sha512(pwd.encode("utf-8", "replace")).digest()
logpwd = "%" + base64.b64encode(zb[:12]).decode("utf-8")
logpwd = "%" + ub64enc(zb[:12]).decode("ascii")
if pwd != "x":
self.log("invalid password: {}".format(logpwd), 3)
@@ -2507,7 +2540,7 @@ class HttpCli(object):
return dur > 0, msg
def handle_mkdir(self) -> bool:
assert self.parser
assert self.parser # !rm
new_dir = self.parser.require("name", 512)
self.parser.drop()
@@ -2518,7 +2551,7 @@ class HttpCli(object):
self.gctx = vpath
vpath = undot(vpath)
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
rem = sanitize_vpath(rem, "/", [])
rem = sanitize_vpath(rem, "/")
fn = vfs.canonical(rem)
if not fn.startswith(vfs.realpath):
self.log("invalid mkdir [%s] [%s]" % (self.gctx, vpath), 1)
@@ -2553,7 +2586,7 @@ class HttpCli(object):
return True
def handle_new_md(self) -> bool:
assert self.parser
assert self.parser # !rm
new_file = self.parser.require("name", 512)
self.parser.drop()
@@ -2565,7 +2598,7 @@ class HttpCli(object):
if not ext or len(ext) > 5 or not self.can_delete:
new_file += ".md"
sanitized = sanitize_fn(new_file, "", [])
sanitized = sanitize_fn(new_file, "")
if not nullwrite:
fdir = vfs.canonical(rem)
@@ -2644,9 +2677,7 @@ class HttpCli(object):
# fallthrough
fdir = fdir_base
fname = sanitize_fn(
p_file or "", "", [".prologue.html", ".epilogue.html"]
)
fname = sanitize_fn(p_file or "", "")
abspath = os.path.join(fdir, fname)
suffix = "-%.6f-%s" % (time.time(), dip)
if p_file and not nullwrite:
@@ -2719,8 +2750,8 @@ class HttpCli(object):
bos.makedirs(fdir)
# reserve destination filename
with ren_open(fname, "wb", fdir=fdir, suffix=suffix) as zfw:
fname = zfw["orz"][1]
f, fname = ren_open(fname, "wb", fdir=fdir, suffix=suffix)
f.close()
tnam = fname + ".PARTIAL"
if self.args.dotpart:
@@ -2743,8 +2774,8 @@ class HttpCli(object):
v2 = lim.dfv - lim.dfl
max_sz = min(v1, v2) if v1 and v2 else v1 or v2
with ren_open(tnam, "wb", self.args.iobuf, **open_args) as zfw:
f, tnam = zfw["orz"]
f, tnam = ren_open(tnam, "wb", self.args.iobuf, **open_args)
try:
tabspath = os.path.join(fdir, tnam)
self.log("writing to {}".format(tabspath))
sz, sha_hex, sha_b64 = hashcopy(
@@ -2752,6 +2783,8 @@ class HttpCli(object):
)
if sz == 0:
raise Pebkac(400, "empty files in post")
finally:
f.close()
if lim:
lim.nup(self.ip)
@@ -2961,7 +2994,7 @@ class HttpCli(object):
return True
def handle_text_upload(self) -> bool:
assert self.parser
assert self.parser # !rm
try:
cli_lastmod3 = int(self.parser.require("lastmod", 16))
except:
@@ -3046,7 +3079,7 @@ class HttpCli(object):
pass
wrename(self.log, fp, os.path.join(mdir, ".hist", mfile2), vfs.flags)
assert self.parser.gen
assert self.parser.gen # !rm
p_field, _, p_data = next(self.parser.gen)
if p_field != "body":
raise Pebkac(400, "expected body, got {}".format(p_field))
@@ -3147,7 +3180,7 @@ class HttpCli(object):
# some browser append "; length=573"
cli_lastmod = cli_lastmod.split(";")[0].strip()
cli_dt = parsedate(cli_lastmod)
assert cli_dt
assert cli_dt # !rm
cli_ts = calendar.timegm(cli_dt)
return file_lastmod, int(file_ts) > int(cli_ts)
except Exception as ex:
@@ -3274,6 +3307,130 @@ class HttpCli(object):
return txt
def tx_res(self, req_path: str) -> bool:
status = 200
logmsg = "{:4} {} ".format("", self.req)
logtail = ""
editions = {}
file_ts = 0
if has_resource(self.E, req_path):
st = stat_resource(self.E, req_path)
if st:
file_ts = max(file_ts, st.st_mtime)
editions["plain"] = req_path
if has_resource(self.E, req_path + ".gz"):
st = stat_resource(self.E, req_path + ".gz")
if st:
file_ts = max(file_ts, st.st_mtime)
if not st or st.st_mtime >= file_ts:
editions[".gz"] = req_path + ".gz"
if not editions:
return self.tx_404()
#
# if-modified
if file_ts > 0:
file_lastmod, do_send = self._chk_lastmod(int(file_ts))
self.out_headers["Last-Modified"] = file_lastmod
if not do_send:
status = 304
if self.can_write:
self.out_headers["X-Lastmod3"] = str(int(file_ts * 1000))
else:
do_send = True
#
# Accept-Encoding and UA decides which edition to send
decompress = False
supported_editions = [
x.strip()
for x in self.headers.get("accept-encoding", "").lower().split(",")
]
if ".gz" in editions:
is_compressed = True
selected_edition = ".gz"
if "gzip" not in supported_editions:
decompress = True
else:
if re.match(r"MSIE [4-6]\.", self.ua) and " SV1" not in self.ua:
decompress = True
if not decompress:
self.out_headers["Content-Encoding"] = "gzip"
else:
is_compressed = False
selected_edition = "plain"
res_path = editions[selected_edition]
logmsg += "{} ".format(selected_edition.lstrip("."))
res = load_resource(self.E, res_path)
if decompress:
file_sz = gzip_file_orig_sz(res)
res = gzip.open(res)
else:
res.seek(0, os.SEEK_END)
file_sz = res.tell()
res.seek(0, os.SEEK_SET)
#
# send reply
if is_compressed:
self.out_headers["Cache-Control"] = "max-age=604869"
else:
self.permit_caching()
if "txt" in self.uparam:
mime = "text/plain; charset={}".format(self.uparam["txt"] or "utf-8")
elif "mime" in self.uparam:
mime = str(self.uparam.get("mime"))
else:
mime = guess_mime(req_path)
logmsg += unicode(status) + logtail
if self.mode == "HEAD" or not do_send:
res.close()
if self.do_log:
self.log(logmsg)
self.send_headers(length=file_sz, status=status, mime=mime)
return True
ret = True
self.send_headers(length=file_sz, status=status, mime=mime)
remains = sendfile_py(
self.log,
0,
file_sz,
res,
self.s,
self.args.s_wr_sz,
self.args.s_wr_slp,
not self.args.no_poll,
)
res.close()
if remains > 0:
logmsg += " \033[31m" + unicode(file_sz - remains) + "\033[0m"
ret = False
spd = self._spd(file_sz - remains)
if self.do_log:
self.log("{}, {}".format(logmsg, spd))
return ret
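The decompress path needs the uncompressed size before streaming, which is presumably what gzip_file_orig_sz reads: a gzip member ends with a 4-byte little-endian ISIZE trailer holding the input size mod 2**32. A standalone sketch:

import os, struct

def read_isize(f) -> int:
    # gzip ISIZE trailer: uncompressed length mod 2**32, little-endian
    f.seek(-4, os.SEEK_END)
    (sz,) = struct.unpack("<I", f.read(4))
    f.seek(0, os.SEEK_SET)
    return sz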
def tx_file(self, req_path: str, ptop: Optional[str] = None) -> bool:
status = 200
logmsg = "{:4} {} ".format("", self.req)
@@ -3655,7 +3812,7 @@ class HttpCli(object):
items: list[str],
) -> bool:
if self.args.no_zip:
raise Pebkac(400, "not enabled")
raise Pebkac(400, "not enabled in server config")
logmsg = "{:4} {} ".format("", self.req)
self.keepalive = False
@@ -3781,6 +3938,33 @@ class HttpCli(object):
self.reply(ico, mime=mime, headers={"Last-Modified": lm})
return True
def tx_qr(self):
url = "%s://%s%s%s" % (
"https" if self.is_https else "http",
self.host,
self.args.SRS,
self.vpaths,
)
uhash = ""
uparams = []
if self.ouparam:
for k, v in self.ouparam.items():
if k == "qr":
continue
if k == "uhash":
uhash = v
continue
uparams.append(k if v == "" else "%s=%s" % (k, v))
if uparams:
url += "?" + "&".join(uparams)
if uhash:
url += "#" + uhash
self.log("qrcode(%r)" % (url,))
ret = qr2svg(QrCode.encode_binary(url.encode("utf-8")), 2)
self.reply(ret.encode("utf-8"), mime="image/svg+xml")
return True
def tx_md(self, vn: VFS, fs_path: str) -> bool:
logmsg = " %s @%s " % (self.req, self.uname)
@@ -3789,15 +3973,11 @@ class HttpCli(object):
return self.tx_404(True)
tpl = "mde" if "edit2" in self.uparam else "md"
html_path = os.path.join(self.E.mod, "web", "{}.html".format(tpl))
template = self.j2j(tpl)
st = bos.stat(fs_path)
ts_md = st.st_mtime
st = bos.stat(html_path)
ts_html = st.st_mtime
max_sz = 1024 * self.args.txt_max
sz_md = 0
lead = b""
@@ -3831,7 +4011,7 @@ class HttpCli(object):
fullfile = html_bescape(fullfile)
sz_md = len(lead) + len(fullfile)
file_ts = int(max(ts_md, ts_html, self.E.t0))
file_ts = int(max(ts_md, self.E.t0))
file_lastmod, do_send = self._chk_lastmod(file_ts)
self.out_headers["Last-Modified"] = file_lastmod
self.out_headers.update(NO_CACHE)
@@ -3870,7 +4050,7 @@ class HttpCli(object):
zs = template.render(**targs).encode("utf-8", "replace")
html = zs.split(boundary.encode("utf-8"))
if len(html) != 2:
raise Exception("boundary appears in " + html_path)
raise Exception("boundary appears in " + tpl)
self.send_headers(sz_md + len(html[0]) + len(html[1]), status)
@@ -3915,6 +4095,9 @@ class HttpCli(object):
vp = re.sub(r"[<>&$?`\"']", "_", self.uparam["hc"] or "").lstrip("/")
pw = pw.replace(" ", "%20")
vp = vp.replace(" ", "%20")
if pw in self.asrv.sesa:
pw = "pwd"
html = self.j2s(
"svcs",
args=self.args,
@@ -3939,11 +4122,30 @@ class HttpCli(object):
for y in [self.rvol, self.wvol, self.avol]
]
if self.avol and not self.args.no_rescan:
x = self.conn.hsrv.broker.ask("up2k.get_state")
ups = []
now = time.time()
get_vst = self.avol and not self.args.no_rescan
get_ups = self.rvol and not self.args.no_up_list and self.uname or ""
if get_vst or get_ups:
x = self.conn.hsrv.broker.ask("up2k.get_state", get_vst, get_ups)
vs = json.loads(x.get())
vstate = {("/" + k).rstrip("/") + "/": v for k, v in vs["volstate"].items()}
else:
try:
for rem, sz, t0, poke, vp in vs["ups"]:
fdone = max(0.001, 1 - rem)
td = max(0.1, now - t0)
rd, fn = vsplit(vp.replace(os.sep, "/"))
if not rd:
rd = "/"
erd = quotep(rd)
rds = rd.replace("/", " / ")
spd = humansize(sz * fdone / td, True) + "/s"
eta = s2hms((td / fdone) - td, True) if rem < 1 else "--"
idle = s2hms(now - poke, True)
ups.append((int(100 * fdone), spd, eta, idle, erd, rds, fn))
except Exception as ex:
self.log("failed to list upload progress: %r" % (ex,), 1)
if not get_vst:
vstate = {}
vs = {
"scanning": None,
@@ -3953,6 +4155,8 @@ class HttpCli(object):
"dbwt": None,
}
assert vstate.items and vs # type: ignore # !rm
fmt = self.uparam.get("ls", "")
if not fmt and (self.ua.startswith("curl/") or self.ua.startswith("fetch")):
fmt = "v"
@@ -3968,6 +4172,12 @@ class HttpCli(object):
for k in ["scanning", "hashq", "tagq", "mtpq", "dbwt"]:
txt += " {}({})".format(k, vs[k])
if ups:
txt += "\n\nincoming files:"
for zt in ups:
txt += "\n%s" % (", ".join((str(x) for x in zt)),)
txt += "\n"
if rvol:
txt += "\nyou can browse:"
for v in rvol:
@@ -3991,6 +4201,7 @@ class HttpCli(object):
avol=avol,
in_shr=self.args.shr and self.vpath.startswith(self.args.shr[1:]),
vstate=vstate,
ups=ups,
scanning=vs["scanning"],
hashq=vs["hashq"],
tagq=vs["tagq"],
@@ -4325,9 +4536,6 @@ class HttpCli(object):
if self.uname != self.args.shr_adm:
rows = [x for x in rows if x[5] == self.uname]
for x in rows:
x[1] = "yes" if x[1] else ""
html = self.j2s(
"shares", this=self, shr=self.args.shr, rows=rows, now=int(time.time())
)
@@ -4405,7 +4613,7 @@ class HttpCli(object):
else:
for zs in vps:
if zs.endswith("/"):
t = "you cannot select more than one folder, or mix flies and folders in one selection"
t = "you cannot select more than one folder, or mix files and folders in one selection"
raise Pebkac(400, t)
vp = vps[0].rsplit("/", 1)[0]
for zs in vps:
@@ -4948,6 +5156,9 @@ class HttpCli(object):
for k in ["zip", "tar"]:
v = self.uparam.get(k)
if v is not None and (not add_og or not og_fn):
if is_dk and "dks" not in vn.flags:
t = "server config does not allow download-as-zip/tar; only dk is specified, need dks too"
raise Pebkac(403, t)
return self.tx_zip(k, v, self.vpath, vn, rem, [])
fsroot, vfs_ls, vfs_virt = vn.ls(
@@ -4982,14 +5193,14 @@ class HttpCli(object):
except:
pass
lnames = {x.lower(): x for x in ls_names}
# show dotfiles if permitted and requested
if not self.can_dot or (
"dots" not in self.uparam and (is_ls or "dots" not in self.cookies)
):
ls_names = exclude_dotfiles(ls_names)
lnames = {x.lower(): x for x in ls_names}
add_dk = vf.get("dk")
add_fk = vf.get("fk")
fk_alg = 2 if "fka" in vf else 1
@@ -5095,7 +5306,6 @@ class HttpCli(object):
dirs.append(item)
else:
files.append(item)
item["rd"] = rem
if is_dk and not vf.get("dks"):
dirs = []
@@ -5118,16 +5328,10 @@ class HttpCli(object):
add_up_at = ".up_at" in mte
is_admin = self.can_admin
tagset: set[str] = set()
for fe in files:
rd = vrem
for fe in files if icur else []:
assert icur # !rm
fn = fe["name"]
rd = fe["rd"]
del fe["rd"]
if not icur:
continue
if vn != dbv:
_, rd = vn.get_dbv(rd)
erd_efn = (rd, fn)
q = "select mt.k, mt.v from up inner join mt on mt.w = substr(up.w,1,16) where up.rd = ? and up.fn = ? and +mt.k != 'x'"
try:
@@ -5169,13 +5373,25 @@ class HttpCli(object):
fe["tags"] = tags
if icur:
for fe in dirs:
fe["tags"] = ODict()
lmte = list(mte)
if self.can_admin:
lmte.extend(("up_ip", ".up_at"))
if "nodirsz" not in vf:
tagset.add(".files")
vdir = "%s/" % (rd,) if rd else ""
q = "select sz, nf from ds where rd=? limit 1"
for fe in dirs:
try:
hit = icur.execute(q, (vdir + fe["name"],)).fetchone()
(fe["sz"], fe["tags"][".files"]) = hit
except:
pass # 404 or mojibake
taglist = [k for k in lmte if k in tagset]
for fe in dirs:
fe["tags"] = ODict()
else:
taglist = list(tagset)
@@ -5319,7 +5535,9 @@ class HttpCli(object):
fmt = vn.flags.get("og_th", "j")
th_base = ujoin(url_base, quotep(thumb))
query = "th=%s&cache" % (fmt,)
query = ub64enc(query.encode("utf-8")).decode("utf-8")
if use_filekey:
query += "&k=" + self.uparam["k"]
query = ub64enc(query.encode("utf-8")).decode("ascii")
# discord looks at file extension, not content-type...
query += "/th.jpg" if "j" in fmt else "/th.webp"
j2a["og_thumb"] = "%s/.uqe/%s" % (th_base, query)
@@ -5328,7 +5546,10 @@ class HttpCli(object):
j2a["og_file"] = file
if og_fn:
og_fn_q = quotep(og_fn)
query = ub64enc(b"raw").decode("utf-8")
query = "raw"
if use_filekey:
query += "&k=" + self.uparam["k"]
query = ub64enc(query.encode("utf-8")).decode("ascii")
query += "/%s" % (og_fn_q,)
j2a["og_url"] = ujoin(url_base, og_fn_q)
j2a["og_raw"] = j2a["og_url"] + "/.uqe/" + query

View File

@@ -103,9 +103,6 @@ class HttpConn(object):
self.log_src = ("%s \033[%dm%d" % (ip, color, self.addr[1])).ljust(26)
return self.log_src
def respath(self, res_name: str) -> str:
return os.path.join(self.E.mod, "web", res_name)
def log(self, msg: str, c: Union[int, str] = 0) -> None:
self.log_func(self.log_src, msg, c)
@@ -165,6 +162,7 @@ class HttpConn(object):
self.log_src = self.log_src.replace("[36m", "[35m")
try:
assert ssl # type: ignore # !rm
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.load_cert_chain(self.args.cert)
if self.args.ssl_ver:
@@ -190,7 +188,7 @@ class HttpConn(object):
if self.args.ssl_dbg and hasattr(self.s, "shared_ciphers"):
ciphers = self.s.shared_ciphers()
assert ciphers
assert ciphers # !rm
overlap = [str(y[::-1]) for y in ciphers]
self.log("TLS cipher overlap:" + "\n".join(overlap))
for k, v in [

View File

@@ -1,7 +1,6 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import base64
import math
import os
import re
@@ -67,14 +66,16 @@ from .util import (
Magician,
Netdev,
NetMap,
absreal,
build_netmap,
has_resource,
ipnorm,
load_resource,
min_ex,
shut_socket,
spack,
start_log_thrs,
start_stackmon,
ub64enc,
)
if TYPE_CHECKING:
@@ -91,6 +92,11 @@ if not hasattr(socket, "AF_UNIX"):
setattr(socket, "AF_UNIX", -9001)
def load_jinja2_resource(E: EnvParams, name: str):
with load_resource(E, "web/" + name, "r") as f:
return f.read()
class HttpSrv(object):
"""
handles incoming connections using HttpConn to process http,
@@ -152,8 +158,9 @@ class HttpSrv(object):
self.u2idx_free: dict[str, U2idx] = {}
self.u2idx_n = 0
assert jinja2 # type: ignore # !rm
env = jinja2.Environment()
env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
env.loader = jinja2.FunctionLoader(lambda f: load_jinja2_resource(self.E, f))
jn = [
"splash",
"shares",
@@ -166,18 +173,15 @@ class HttpSrv(object):
"cf",
]
self.j2 = {x: env.get_template(x + ".html") for x in jn}
zs = os.path.join(self.E.mod, "web", "deps", "prism.js.gz")
self.prism = os.path.exists(zs)
self.prism = has_resource(self.E, "web/deps/prism.js.gz")
self.ipa_nm = build_netmap(self.args.ipa)
self.xff_nm = build_netmap(self.args.xff_src)
self.xff_lan = build_netmap("lan")
self.statics: set[str] = set()
self._build_statics()
self.ptn_cc = re.compile(r"[\x00-\x1f]")
self.ptn_hsafe = re.compile(r"[\x00-\x1f<>\"'&]")
self.uparam_cc_ok = set("doc move tree".split())
self.mallow = "GET HEAD POST PUT DELETE OPTIONS".split()
if not self.args.no_dav:
@@ -209,14 +213,6 @@ class HttpSrv(object):
except:
pass
def _build_statics(self) -> None:
for dp, _, df in os.walk(os.path.join(self.E.mod, "web")):
for fn in df:
ap = absreal(os.path.join(dp, fn))
self.statics.add(ap)
if ap.endswith(".gz"):
self.statics.add(ap[:-3])
def set_netdevs(self, netdevs: dict[str, Netdev]) -> None:
ips = set()
for ip, _ in self.bound:
@@ -237,7 +233,7 @@ class HttpSrv(object):
if self.args.log_htp:
self.log(self.name, "workers -= {} = {}".format(n, self.tp_nthr), 6)
assert self.tp_q
assert self.tp_q # !rm
for _ in range(n):
self.tp_q.put(None)
@@ -431,7 +427,7 @@ class HttpSrv(object):
)
def thr_poolw(self) -> None:
assert self.tp_q
assert self.tp_q # !rm
while True:
task = self.tp_q.get()
if not task:
@@ -543,8 +539,8 @@ class HttpSrv(object):
except:
pass
v = base64.urlsafe_b64encode(spack(b">xxL", int(v)))
self.cb_v = v.decode("ascii")[-4:]
# spack gives 4 lsb, take 3 lsb, get 4 ch
self.cb_v = ub64enc(spack(b">L", int(v))[1:]).decode("ascii")
self.cb_ts = time.time()
return self.cb_v
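The new cb_v packs the timestamp as 4 big-endian bytes, drops the most significant one, and urlsafe-b64 encodes the remaining 3 bytes, which yields exactly 4 characters with no padding:

import base64, struct

def cachebuster(ts: int) -> str:
    zb = struct.pack(">L", ts)[1:]  # keep the 3 least-significant bytes
    return base64.urlsafe_b64encode(zb).decode("ascii")  # 3 bytes -> 4 chars

print(len(cachebuster(1728151384)))  # 4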

View File

@@ -88,7 +88,7 @@ class Metrics(object):
addg("cpp_total_bans", str(self.hsrv.nban), t)
if not args.nos_vst:
x = self.hsrv.broker.ask("up2k.get_state")
x = self.hsrv.broker.ask("up2k.get_state", True, "")
vs = json.loads(x.get())
nvidle = 0

View File

@@ -12,7 +12,7 @@ from types import SimpleNamespace
from .__init__ import ANYWIN, EXE, TYPE_CHECKING
from .authsrv import LEELOO_DALLAS, VFS
from .bos import bos
from .util import Daemon, min_ex, pybin, runhook
from .util import Daemon, absreal, min_ex, pybin, runhook, vjoin
if True: # pylint: disable=using-constant-test
from typing import Any, Union
@@ -151,6 +151,8 @@ class SMB(object):
def _uname(self) -> str:
if self.noacc:
return LEELOO_DALLAS
if not self.asrv.acct:
return "*"
try:
# you found it! my single worst bit of code so far
@@ -189,7 +191,7 @@ class SMB(object):
vfs, rem = self.asrv.vfs.get(vpath, uname, *perms)
if not vfs.realpath:
raise Exception("unmapped vfs")
return vfs, vfs.canonical(rem)
return vfs, vjoin(vfs.realpath, rem)
def _listdir(self, vpath: str, *a: Any, **ka: Any) -> list[str]:
vpath = vpath.replace("\\", "/").lstrip("/")
@@ -213,7 +215,7 @@ class SMB(object):
sz = 112 * 2 # ['.', '..']
for n, fn in enumerate(ls):
if sz >= 64000:
t = "listing only %d of %d files (%d byte) in /%s; see impacket#1433"
t = "listing only %d of %d files (%d byte) in /%s for performance; see --smb-nwa-1"
warning(t, n, len(ls), sz, vpath)
break
@@ -242,6 +244,7 @@ class SMB(object):
t = "blocked write (no-write-acc %s): /%s @%s"
yeet(t % (vfs.axs.uwrite, vpath, uname))
ap = absreal(ap)
xbu = vfs.flags.get("xbu")
if xbu and not runhook(
self.nlog,

View File

@@ -594,3 +594,20 @@ def _get_bit(x: int, i: int) -> bool:
class DataTooLongError(ValueError):
pass
def qr2svg(qr: QrCode, border: int) -> str:
parts: list[str] = []
for y in range(qr.size):
sy = border + y
for x in range(qr.size):
if qr.modules[y][x]:
parts.append("M%d,%dh1v1h-1z" % (border + x, sy))
t = """\
<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg" version="1.1" viewBox="0 0 {0} {0}" stroke="none">
<rect width="100%" height="100%" fill="#F7F7F7"/>
<path d="{1}" fill="#111111"/>
</svg>
"""
return t.format(qr.size + border * 2, " ".join(parts))
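Hypothetical usage of the new helper, writing a share link as a standalone SVG with a border of 2 modules (matching the caller in httpcli; the URL is an example):

svg = qr2svg(QrCode.encode_binary(b"https://example.com/share"), 2)
with open("share.svg", "w") as f:
    f.write(svg)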

View File

@@ -2,7 +2,6 @@
from __future__ import print_function, unicode_literals
import argparse
import base64
import errno
import gzip
import logging
@@ -67,6 +66,7 @@ from .util import (
pybin,
start_log_thrs,
start_stackmon,
ub64enc,
)
if TYPE_CHECKING:
@@ -419,8 +419,8 @@ class SvcHub(object):
r"insert into kv values ('sver', 1)",
]
assert db # type: ignore
assert cur # type: ignore
assert db # type: ignore # !rm
assert cur # type: ignore # !rm
if create:
for cmd in sch:
cur.execute(cmd)
@@ -488,8 +488,8 @@ class SvcHub(object):
r"create index sh_t1 on sh(t1)",
]
assert db # type: ignore
assert cur # type: ignore
assert db # type: ignore # !rm
assert cur # type: ignore # !rm
if create:
dver = 2
modified = True
@@ -1297,5 +1297,5 @@ class SvcHub(object):
zs = "{}\n{}".format(VERSIONS, alltrace())
zb = zs.encode("utf-8", "replace")
zb = gzip.compress(zb)
zs = base64.b64encode(zb).decode("ascii")
zs = ub64enc(zb).decode("ascii")
self.log("stacks", zs)

View File

@@ -105,7 +105,7 @@ def gen_hdr(
ret += spack(b"<LL", vsz, vsz)
# windows support (the "?" replace below too)
fn = sanitize_fn(fn, "/", [])
fn = sanitize_fn(fn, "/")
bfn = fn.encode("utf-8" if utf8 else "cp437", "replace").replace(b"?", b"_")
# add ntfs (0x24) and/or unix (0x10) extrafields for utc, add z64 if requested

View File

@@ -1,7 +1,6 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import base64
import hashlib
import logging
import os
@@ -27,6 +26,7 @@ from .util import (
min_ex,
runcmd,
statdir,
ub64enc,
vsplit,
wrename,
wunlink,
@@ -109,6 +109,9 @@ except:
HAVE_VIPS = False
th_dir_cache = {}
def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -> str:
# base16 = 16 = 256
# b64-lc = 38 = 1444
@@ -122,14 +125,20 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -
if ext in ffa and fmt[:2] in ("wf", "jf"):
fmt = fmt.replace("f", "")
rd += "\n" + fmt
h = hashlib.sha512(afsenc(rd)).digest()
b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24]
rd = ("%s/%s/" % (b64[:2], b64[2:4])).lower() + b64
dcache = th_dir_cache
rd_key = rd + "\n" + fmt
rd = dcache.get(rd_key)
if not rd:
h = hashlib.sha512(afsenc(rd_key)).digest()
b64 = ub64enc(h).decode("ascii")[:24]
rd = ("%s/%s/" % (b64[:2], b64[2:4])).lower() + b64
if len(dcache) > 9001:
dcache.clear()
dcache[rd_key] = rd
# could keep original filenames but this is safer re pathlen
h = hashlib.sha512(afsenc(fn)).digest()
fn = base64.urlsafe_b64encode(h).decode("ascii")[:24]
fn = ub64enc(h).decode("ascii")[:24]
if fmt in ("opus", "caf", "mp3"):
cat = "ac"
@@ -479,7 +488,7 @@ class ThumbSrv(object):
if c == crops[-1]:
raise
assert img # type: ignore
assert img # type: ignore # !rm
img.write_to_file(tpath, Q=40)
def conv_ffmpeg(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:

View File

@@ -53,6 +53,8 @@ class U2idx(object):
self.log("your python does not have sqlite3; searching will be disabled")
return
assert sqlite3 # type: ignore # !rm
self.active_id = ""
self.active_cur: Optional["sqlite3.Cursor"] = None
self.cur: dict[str, "sqlite3.Cursor"] = {}
@@ -104,7 +106,7 @@ class U2idx(object):
if not HAVE_SQLITE3 or not self.args.shr:
return None
assert sqlite3 # type: ignore
assert sqlite3 # type: ignore # !rm
db = sqlite3.connect(self.args.shr_db, timeout=2, check_same_thread=False)
cur = db.cursor()
@@ -120,7 +122,7 @@ class U2idx(object):
if not HAVE_SQLITE3 or "e2d" not in vn.flags:
return None
assert sqlite3 # type: ignore
assert sqlite3 # type: ignore # !rm
ptop = vn.realpath
histpath = self.asrv.vfs.histtab.get(ptop)
@@ -467,5 +469,5 @@ class U2idx(object):
return
if identifier == self.active_id:
assert self.active_cur
assert self.active_cur # !rm
self.active_cur.connection.interrupt()

View File

@@ -1,7 +1,6 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import base64
import errno
import gzip
import hashlib
@@ -61,6 +60,7 @@ from .util import (
sfsenc,
spack,
statdir,
ub64enc,
unhumanize,
vjoin,
vsplit,
@@ -176,6 +176,7 @@ class Up2k(object):
self.spools: set[tempfile.SpooledTemporaryFile[bytes]] = set()
if HAVE_SQLITE3:
# mojibake detector
assert sqlite3 # type: ignore # !rm
self.mem_cur = self._orz(":memory:")
self.mem_cur.execute(r"create table a (b text)")
self.sqlite_ver = tuple([int(x) for x in sqlite3.sqlite_version.split(".")])
@@ -268,19 +269,31 @@ class Up2k(object):
if not self.stop:
self.log("uploads are now possible", 2)
def get_state(self) -> str:
def get_state(self, get_q: bool, uname: str) -> str:
mtpq: Union[int, str] = 0
ups = []
up_en = not self.args.no_up_list
q = "select count(w) from mt where k = 't:mtp'"
got_lock = False if PY2 else self.mutex.acquire(timeout=0.5)
if got_lock:
for cur in self.cur.values():
try:
mtpq += cur.execute(q).fetchone()[0]
except:
pass
self.mutex.release()
try:
for cur in self.cur.values() if get_q else []:
try:
mtpq += cur.execute(q).fetchone()[0]
except:
pass
if uname and up_en:
ups = self._active_uploads(uname)
finally:
self.mutex.release()
else:
mtpq = "(?)"
if up_en:
ups = [(1, 0, 0, time.time(), "cannot show list (server too busy)")]
if PY2:
ups = []
ups.sort(reverse=True)
ret = {
"volstate": self.volstate,
@@ -288,6 +301,7 @@ class Up2k(object):
"hashq": self.n_hashq,
"tagq": self.n_tagq,
"mtpq": mtpq,
"ups": ups,
"dbwu": "{:.2f}".format(self.db_act),
"dbwt": "{:.2f}".format(
min(1000 * 24 * 60 * 60 - 1, time.time() - self.db_act)
@@ -295,6 +309,32 @@ class Up2k(object):
}
return json.dumps(ret, separators=(",\n", ": "))
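get_state now holds the mutex while collecting active uploads too, but still refuses to block the status page: acquire with a timeout and degrade gracefully. The pattern in isolation:

import threading

mutex = threading.Lock()

def report():
    if not mutex.acquire(timeout=0.5):
        return "(?)"  # server busy; show a placeholder instead of hanging
    try:
        return "real numbers"
    finally:
        mutex.release()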
def _active_uploads(self, uname: str) -> list[tuple[float, int, int, int, str]]:
ret = []
for vtop in self.asrv.vfs.aread[uname]:
vfs = self.asrv.vfs.all_vols.get(vtop)
if not vfs: # dbv only
continue
ptop = vfs.realpath
tab = self.registry.get(ptop)
if not tab:
continue
for job in tab.values():
ineed = len(job["need"])
ihash = len(job["hash"])
if ineed == ihash or not ineed:
continue
zt = (
ineed / ihash,
job["size"],
int(job["t0c"]),
int(job["poke"]),
djoin(vtop, job["prel"], job["name"]),
)
ret.append(zt)
return ret
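Each tuple is (fraction-remaining, size, t0c, poke, vpath); the splash-page code earlier in this diff turns that into speed and ETA. The math as a standalone sketch:

import time

def progress(rem: float, sz: int, t0: int, now: float = 0):
    now = now or time.time()
    fdone = max(0.001, 1 - rem)  # clamp so a fresh upload cannot div-by-zero
    td = max(0.1, now - t0)
    spd = sz * fdone / td        # bytes/sec so far
    eta = (td / fdone) - td      # time for the rest at the same rate
    return fdone, spd, eta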
def find_job_by_ap(self, ptop: str, ap: str) -> str:
try:
if ANYWIN:
@@ -324,10 +364,9 @@ class Up2k(object):
continue
addr = (ip or "\n") if cfg in (1, 2) else ""
user = (uname or "\n") if cfg in (1, 3) else ""
drp = self.droppable.get(ptop, {})
for wark, job in tab2.items():
for job in tab2.values():
if (
wark in drp
"done" in job
or (user and user != job["user"])
or (addr and addr != job["addr"])
):
@@ -368,9 +407,8 @@ class Up2k(object):
for ptop, tab2 in self.registry.items():
nbytes = 0
nfiles = 0
drp = self.droppable.get(ptop, {})
for wark, job in tab2.items():
if wark in drp:
for job in tab2.values():
if "done" in job:
continue
nfiles += 1
@@ -575,7 +613,7 @@ class Up2k(object):
return timeout
def _check_shares(self) -> float:
assert sqlite3 # type: ignore
assert sqlite3 # type: ignore # !rm
now = time.time()
timeout = now + 9001
@@ -896,7 +934,7 @@ class Up2k(object):
with self.mutex, self.reg_mutex:
reg = self.register_vpath(vol.realpath, vol.flags)
assert reg
assert reg # !rm
cur, _ = reg
with self.mutex:
cur.connection.commit()
@@ -913,7 +951,7 @@ class Up2k(object):
reg = self.register_vpath(vol.realpath, vol.flags)
try:
assert reg
assert reg # !rm
cur, db_path = reg
if bos.path.getsize(db_path + "-wal") < 1024 * 1024 * 5:
continue
@@ -1017,6 +1055,7 @@ class Up2k(object):
reg = {}
drp = None
emptylist = []
snap = os.path.join(histpath, "up2k.snap")
if bos.path.exists(snap):
with gzip.GzipFile(snap, "rb") as f:
@@ -1033,6 +1072,9 @@ class Up2k(object):
fp = djoin(job["ptop"], job["prel"], job["name"])
if bos.path.exists(fp):
reg[k] = job
if "done" in job:
job["need"] = job["hash"] = emptylist
continue
job["poke"] = time.time()
job["busy"] = {}
else:
@@ -1119,7 +1161,7 @@ class Up2k(object):
zsl = [x[len(prefix) :] for x in zsl]
zsl.sort()
zb = hashlib.sha1("\n".join(zsl).encode("utf-8", "replace")).digest()
vcfg = base64.urlsafe_b64encode(zb[:18]).decode("ascii")
vcfg = ub64enc(zb[:18]).decode("ascii")
c = cur.execute("select v from kv where k = 'volcfg'")
try:
@@ -1148,7 +1190,7 @@ class Up2k(object):
with self.reg_mutex:
reg = self.register_vpath(top, vol.flags)
assert reg and self.pp
assert reg and self.pp # !rm
cur, db_path = reg
db = Dbw(cur, 0, time.time())
@@ -1167,6 +1209,10 @@ class Up2k(object):
# ~/.wine/dosdevices/z:/ and such
excl.extend(("/dev", "/proc", "/run", "/sys"))
if self.args.re_dirsz:
db.c.execute("delete from ds")
db.n += 1
rtop = absreal(top)
n_add = n_rm = 0
try:
@@ -1175,7 +1221,7 @@ class Up2k(object):
self.log(t % (vol.vpath, rtop), 6)
return True, False
n_add = self._build_dir(
n_add, _, _ = self._build_dir(
db,
top,
set(excl),
@@ -1249,17 +1295,18 @@ class Up2k(object):
cst: os.stat_result,
dev: int,
xvol: bool,
) -> int:
) -> tuple[int, int, int]:
if xvol and not rcdir.startswith(top):
self.log("skip xvol: [{}] -> [{}]".format(cdir, rcdir), 6)
return 0
return 0, 0, 0
if rcdir in seen:
t = "bailing from symlink loop,\n prev: {}\n curr: {}\n from: {}"
self.log(t.format(seen[-1], rcdir, cdir), 3)
return 0
return 0, 0, 0
ret = 0
# total-files-added, total-num-files, recursive-size
tfa = tnf = rsz = 0
seen = seen + [rcdir]
unreg: list[str] = []
files: list[tuple[int, int, str]] = []
@@ -1269,22 +1316,25 @@ class Up2k(object):
th_cvd = self.args.th_coversd
th_cvds = self.args.th_coversd_set
assert self.pp and self.mem_cur
assert self.pp and self.mem_cur # !rm
self.pp.msg = "a%d %s" % (self.pp.n, cdir)
rd = cdir[len(top) :].strip("/")
if WINDOWS:
rd = rd.replace("\\", "/").strip("/")
rds = rd + "/" if rd else ""
cdirs = cdir + os.sep
g = statdir(self.log_func, not self.args.no_scandir, True, cdir)
gl = sorted(g)
partials = set([x[0] for x in gl if "PARTIAL" in x[0]])
for iname, inf in gl:
if self.stop:
return -1
return -1, 0, 0
rp = vjoin(rd, iname)
abspath = os.path.join(cdir, iname)
rp = rds + iname
abspath = cdirs + iname
if rei and rei.search(abspath):
unreg.append(rp)
@@ -1318,7 +1368,7 @@ class Up2k(object):
continue
# self.log(" dir: {}".format(abspath))
try:
ret += self._build_dir(
i1, i2, i3 = self._build_dir(
db,
top,
excl,
@@ -1333,6 +1383,9 @@ class Up2k(object):
dev,
xvol,
)
tfa += i1
tnf += i2
rsz += i3
except:
t = "failed to index subdir [{}]:\n{}"
self.log(t.format(abspath, min_ex()), c=1)
@@ -1351,6 +1404,7 @@ class Up2k(object):
# placeholder for unfinished upload
continue
rsz += sz
files.append((sz, lmod, iname))
liname = iname.lower()
if (
@@ -1372,6 +1426,18 @@ class Up2k(object):
):
cv = iname
if not self.args.no_dirsz:
tnf += len(files)
q = "select sz, nf from ds where rd=? limit 1"
try:
db_sz, db_nf = db.c.execute(q, (rd,)).fetchone() or (-1, -1)
if rsz != db_sz or tnf != db_nf:
db.c.execute("delete from ds where rd=?", (rd,))
db.c.execute("insert into ds values (?,?,?)", (rd, rsz, tnf))
db.n += 1
except:
pass # mojibake rd
# folder of 1000 files = ~1 MiB RAM best-case (tiny filenames);
# free up stuff we're done with before dhashing
gl = []
@@ -1385,7 +1451,7 @@ class Up2k(object):
zh.update(cv.encode("utf-8", "replace"))
zh.update(spack(b"<d", cst.st_mtime))
dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
dhash = ub64enc(zh.digest()[:12]).decode("ascii")
sql = "select d from dh where d=? and +h=?"
try:
c = db.c.execute(sql, (rd, dhash))
@@ -1395,7 +1461,7 @@ class Up2k(object):
c = db.c.execute(sql, (drd, dhash))
if c.fetchone():
return ret
return tfa, tnf, rsz
if cv and rd:
# mojibake not supported (for performance / simplicity):
@@ -1412,10 +1478,10 @@ class Up2k(object):
seen_files = set([x[2] for x in files]) # for dropcheck
for sz, lmod, fn in files:
if self.stop:
return -1
return -1, 0, 0
rp = vjoin(rd, fn)
abspath = os.path.join(cdir, fn)
rp = rds + fn
abspath = cdirs + fn
nohash = reh.search(abspath) if reh else False
sql = "select w, mt, sz, ip, at from up where rd = ? and fn = ?"
@@ -1445,7 +1511,7 @@ class Up2k(object):
)
self.log(t)
self.db_rm(db.c, rd, fn, 0)
ret += 1
tfa += 1
db.n += 1
in_db = []
else:
@@ -1470,7 +1536,7 @@ class Up2k(object):
continue
if not hashes:
return -1
return -1, 0, 0
wark = up2k_wark_from_hashlist(self.salt, sz, hashes)
@@ -1481,7 +1547,7 @@ class Up2k(object):
# skip upload hooks by not providing vflags
self.db_add(db.c, {}, rd, fn, lmod, sz, "", "", wark, "", "", ip, at)
db.n += 1
ret += 1
tfa += 1
td = time.time() - db.t
if db.n >= 4096 or td >= 60:
self.log("commit {} new files".format(db.n))
@@ -1494,33 +1560,38 @@ class Up2k(object):
db.c.execute("insert into dh values (?,?)", (drd, dhash)) # type: ignore
if self.stop:
return -1
return -1, 0, 0
# drop shadowed folders
for sh_rd in unreg:
n = 0
q = "select count(w) from up where (rd = ? or rd like ?||'%') and at == 0"
q = "select count(w) from up where (rd=? or rd like ?||'/%') and +at == 0"
for sh_erd in [sh_rd, "//" + w8b64enc(sh_rd)]:
try:
n = db.c.execute(q, (sh_erd, sh_erd + "/")).fetchone()[0]
erd_erd = (sh_erd, sh_erd)
n = db.c.execute(q, erd_erd).fetchone()[0]
break
except:
pass
assert erd_erd # type: ignore # !rm
if n:
t = "forgetting {} shadowed autoindexed files in [{}] > [{}]"
self.log(t.format(n, top, sh_rd))
assert sh_erd # type: ignore
q = "delete from dh where (d = ? or d like ?||'%')"
db.c.execute(q, (sh_erd, sh_erd + "/"))
q = "delete from dh where (d = ? or d like ?||'/%')"
db.c.execute(q, erd_erd)
q = "delete from up where (rd = ? or rd like ?||'%') and at == 0"
db.c.execute(q, (sh_erd, sh_erd + "/"))
ret += n
q = "delete from up where (rd=? or rd like ?||'/%') and +at == 0"
db.c.execute(q, erd_erd)
tfa += n
q = "delete from ds where (rd=? or rd like ?||'/%')"
db.c.execute(q, erd_erd)
if n4g:
return ret
return tfa, tnf, rsz
# drop missing files
q = "select fn from up where rd = ?"
@@ -1538,7 +1609,7 @@ class Up2k(object):
if n_rm:
self.log("forgot {} deleted files".format(n_rm))
return ret
return tfa, tnf, rsz
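The + prefix on columns like +at and +fn in these queries is a sqlite idiom: unary plus turns the column reference into an expression, so the query planner cannot use an index on it. Demonstrable like so:

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table up (rd text, at int)")
db.execute("create index up_at on up(at)")
for q in ("select * from up where at == 0",
          "select * from up where +at == 0"):
    print(db.execute("explain query plan " + q).fetchall())
# the first plan uses up_at; the second does a full scan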
def _drop_lost(self, cur: "sqlite3.Cursor", top: str, excl: list[str]) -> int:
rm = []
@@ -1613,7 +1684,7 @@ class Up2k(object):
# then covers
n_rm3 = 0
qu = "select 1 from up where rd=? and +fn=? limit 1"
qu = "select 1 from up where rd=? and fn=? limit 1"
q = "delete from cv where rd=? and dn=? and +fn=?"
for crd, cdn, fn in cur.execute("select * from cv"):
urd = vjoin(crd, cdn)
@@ -1756,13 +1827,13 @@ class Up2k(object):
return 0
with self.mutex:
q = "update up set w=?, sz=?, mt=? where rd=? and fn=?"
for rd, fn, w, sz, mt in rewark:
q = "update up set w = ?, sz = ?, mt = ? where rd = ? and fn = ? limit 1"
cur.execute(q, (w, sz, int(mt), rd, fn))
for _, _, w in f404:
q = "delete from up where w = ? limit 1"
cur.execute(q, (w,))
if f404:
q = "delete from up where rd=? and fn=? and +w=?"
cur.executemany(q, f404)
cur.connection.commit()
@@ -2229,7 +2300,7 @@ class Up2k(object):
# mp.pool.ThreadPool and concurrent.futures.ThreadPoolExecutor
# both do crazy runahead so lets reinvent another wheel
nw = max(1, self.args.mtag_mt)
assert self.mtag
assert self.mtag # !rm
if not self.mpool_used:
self.mpool_used = True
self.log("using {}x {}".format(nw, self.mtag.backend))
@@ -2303,7 +2374,7 @@ class Up2k(object):
at: float,
) -> int:
"""will mutex(main)"""
assert self.mtag
assert self.mtag # !rm
try:
st = bos.stat(abspath)
@@ -2335,7 +2406,7 @@ class Up2k(object):
tags: dict[str, Union[str, float]],
) -> int:
"""mutex(main) me"""
assert self.mtag
assert self.mtag # !rm
if not bos.path.isfile(abspath):
return 0
@@ -2391,7 +2462,7 @@ class Up2k(object):
def _log_sqlite_incompat(self, db_path, t0) -> None:
txt = t0 or ""
digest = hashlib.sha512(db_path.encode("utf-8", "replace")).digest()
stackname = base64.urlsafe_b64encode(digest[:9]).decode("utf-8")
stackname = ub64enc(digest[:9]).decode("ascii")
stackpath = os.path.join(E.cfg, "stack-%s.txt" % (stackname,))
t = " the filesystem at %s may not support locking, or is otherwise incompatible with sqlite\n\n %s\n\n"
@@ -2411,6 +2482,7 @@ class Up2k(object):
raise Exception("%s\n%s\n%s" % (t, txt, t))
def _orz(self, db_path: str) -> "sqlite3.Cursor":
assert sqlite3 # type: ignore # !rm
c = sqlite3.connect(
db_path, timeout=self.timeout, check_same_thread=False
).cursor()
@@ -2434,12 +2506,11 @@ class Up2k(object):
self.log("WARN: failed to upgrade from v4", 3)
if ver == DB_VER:
try:
self._add_cv_tab(cur)
self._add_xiu_tab(cur)
self._add_dhash_tab(cur)
except:
pass
self._add_dhash_tab(cur)
self._add_xiu_tab(cur)
self._add_cv_tab(cur)
self._add_idx_up_vp(cur, db_path)
self._add_ds_tab(cur)
try:
nfiles = next(cur.execute("select count(w) from up"))[0]
@@ -2474,6 +2545,7 @@ class Up2k(object):
def _backup_db(
self, db_path: str, cur: "sqlite3.Cursor", ver: Optional[int], msg: str
) -> "sqlite3.Cursor":
assert sqlite3 # type: ignore # !rm
bak = "{}.bak.{:x}.v{}".format(db_path, int(time.time()), ver)
self.log(msg + bak)
try:
@@ -2536,9 +2608,10 @@ class Up2k(object):
for cmd in [
r"create table up (w text, mt int, sz int, rd text, fn text, ip text, at int)",
r"create index up_rd on up(rd)",
r"create index up_vp on up(rd, fn)",
r"create index up_fn on up(fn)",
r"create index up_ip on up(ip)",
r"create index up_at on up(at)",
idx,
r"create table mt (w text, k text, v int)",
r"create index mt_w on mt(w)",
@@ -2552,6 +2625,7 @@ class Up2k(object):
self._add_dhash_tab(cur)
self._add_xiu_tab(cur)
self._add_cv_tab(cur)
self._add_ds_tab(cur)
self.log("created DB at {}".format(db_path))
return cur
@@ -2568,6 +2642,12 @@ class Up2k(object):
def _add_dhash_tab(self, cur: "sqlite3.Cursor") -> None:
# v5 -> v5a
try:
cur.execute("select d, h from dh limit 1").fetchone()
return
except:
pass
for cmd in [
r"create table dh (d text, h text)",
r"create index dh_d on dh(d)",
@@ -2621,6 +2701,40 @@ class Up2k(object):
cur.connection.commit()
def _add_idx_up_vp(self, cur: "sqlite3.Cursor", db_path: str) -> None:
# v5c -> v5d
try:
cur.execute("drop index up_rd")
except:
return
for cmd in [
r"create index up_vp on up(rd, fn)",
r"create index up_at on up(at)",
]:
self.log("upgrading db [%s]: %s" % (db_path, cmd[:18]))
cur.execute(cmd)
self.log("upgrading db [%s]: writing to disk..." % (db_path,))
cur.connection.commit()
cur.execute("vacuum")
def _add_ds_tab(self, cur: "sqlite3.Cursor") -> None:
# v5d -> v5e
try:
cur.execute("select rd, sz from ds limit 1").fetchone()
return
except:
pass
for cmd in [
r"create table ds (rd text, sz int, nf int)",
r"create index ds_rd on ds(rd)",
]:
cur.execute(cmd)
cur.connection.commit()
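Each _add_*_tab migration uses the same probe: try a select against the new table and return early if it succeeds. In isolation, with the exception type narrowed for clarity:

import sqlite3

def needs_ds_tab(cur: sqlite3.Cursor) -> bool:
    try:
        cur.execute("select rd, sz from ds limit 1").fetchone()
        return False  # table already exists
    except sqlite3.OperationalError:
        return True

cur = sqlite3.connect(":memory:").cursor()
print(needs_ds_tab(cur))  # True on a fresh db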
def wake_rescanner(self):
with self.rescan_cond:
self.rescan_cond.notify_all()
@@ -2663,7 +2777,7 @@ class Up2k(object):
if ptop not in self.registry:
raise Pebkac(410, "location unavailable")
cj["name"] = sanitize_fn(cj["name"], "", [".prologue.html", ".epilogue.html"])
cj["name"] = sanitize_fn(cj["name"], "")
cj["poke"] = now = self.db_act = self.vol_act[ptop] = time.time()
wark = self._get_wark(cj)
job = None
@@ -2805,7 +2919,7 @@ class Up2k(object):
c2 = cur
assert c2
assert c2 # !rm
c2.connection.commit()
cur = jcur
@@ -2909,10 +3023,13 @@ class Up2k(object):
job = deepcopy(job)
job["wark"] = wark
job["at"] = cj.get("at") or time.time()
zs = "lmod ptop vtop prel name host user addr poke"
job["at"] = cj.get("at") or now
zs = "vtop ptop prel name lmod host user addr poke"
for k in zs.split():
job[k] = cj.get(k) or ""
for k in ("life", "replace"):
if k in cj:
job[k] = cj[k]
pdir = djoin(cj["ptop"], cj["prel"])
if rand:
@@ -3013,18 +3130,8 @@ class Up2k(object):
"busy": {},
}
# client-provided, sanitized by _get_wark: name, size, lmod
for k in [
"host",
"user",
"addr",
"vtop",
"ptop",
"prel",
"name",
"size",
"lmod",
"poke",
]:
zs = "vtop ptop prel name size lmod host user addr poke"
for k in zs.split():
job[k] = cj[k]
for k in ["life", "replace"]:
@@ -3130,8 +3237,9 @@ class Up2k(object):
dip = self.hub.iphash.s(ip)
suffix = "-%.6f-%s" % (ts, dip)
with ren_open(fname, "wb", fdir=fdir, suffix=suffix) as zfw:
return zfw["orz"][1]
f, ret = ren_open(fname, "wb", fdir=fdir, suffix=suffix)
f.close()
return ret
def _symlink(
self,
@@ -3240,6 +3348,9 @@ class Up2k(object):
self.log("unknown wark [{}], known: {}".format(wark, known))
raise Pebkac(400, "unknown wark" + SSEELOG)
if "t0c" not in job:
job["t0c"] = time.time()
if len(chashes) > 1 and len(chashes[1]) < 44:
# first hash is full-length; expand remaining ones
uniq = []
@@ -3278,7 +3389,7 @@ class Up2k(object):
t = "that chunk is already being written to:\n {}\n {} {}/{}\n {}"
raise Pebkac(400, t.format(wark, chash, idx, nh, job["name"]))
assert chash # type: ignore
assert chash # type: ignore # !rm
chunksize = up2k_chunksize(job["size"])
coffsets = []
@@ -3313,19 +3424,30 @@ class Up2k(object):
return chashes, chunksize, coffsets, path, job["lmod"], job["sprs"]
def release_chunks(self, ptop: str, wark: str, chashes: list[str]) -> bool:
with self.reg_mutex:
job = self.registry[ptop].get(wark)
if job:
for chash in chashes:
job["busy"].pop(chash, None)
return True
def fast_confirm_chunks(
self, ptop: str, wark: str, chashes: list[str]
) -> tuple[int, str]:
if not self.mutex.acquire(False):
return -1, ""
if not self.reg_mutex.acquire(False):
self.mutex.release()
return -1, ""
try:
return self._confirm_chunks(ptop, wark, chashes)
finally:
self.reg_mutex.release()
self.mutex.release()
def confirm_chunks(
self, ptop: str, wark: str, chashes: list[str]
) -> tuple[int, str]:
with self.mutex, self.reg_mutex:
return self._confirm_chunks(ptop, wark, chashes)
def _confirm_chunks(
self, ptop: str, wark: str, chashes: list[str]
) -> tuple[int, str]:
if True:
self.db_act = self.vol_act[ptop] = time.time()
try:
job = self.registry[ptop][wark]
@@ -3333,7 +3455,7 @@ class Up2k(object):
src = djoin(pdir, job["tnam"])
dst = djoin(pdir, job["name"])
except Exception as ex:
return "confirm_chunk, wark(%r)" % (ex,) # type: ignore
return -2, "confirm_chunk, wark(%r)" % (ex,) # type: ignore
for chash in chashes:
job["busy"].pop(chash, None)
@@ -3342,7 +3464,7 @@ class Up2k(object):
for chash in chashes:
job["need"].remove(chash)
except Exception as ex:
return "confirm_chunk, chash(%s) %r" % (chash, ex) # type: ignore
return -2, "confirm_chunk, chash(%s) %r" % (chash, ex) # type: ignore
ret = len(job["need"])
if ret > 0:
@@ -3410,7 +3532,11 @@ class Up2k(object):
if self.idx_wark(vflags, *z2):
del self.registry[ptop][wark]
else:
self.registry[ptop][wark]["done"] = 1
for k in "host tnam busy sprs poke t0c".split():
del job[k]
job["t0"] = int(job["t0"])
job["hash"] = []
job["done"] = 1
self.regdrop(ptop, wark)
if wake_sr:
@@ -3491,7 +3617,7 @@ class Up2k(object):
cur.connection.commit()
except Exception as ex:
x = self.register_vpath(ptop, {})
assert x
assert x # !rm
db_ex_chk(self.log, ex, x[1])
raise
@@ -3506,7 +3632,7 @@ class Up2k(object):
try:
r = db.execute(sql, (rd, fn))
except:
assert self.mem_cur
assert self.mem_cur # !rm
r = db.execute(sql, s3enc(self.mem_cur, rd, fn))
if r.rowcount:
@@ -3544,7 +3670,7 @@ class Up2k(object):
try:
db.execute(sql, v)
except:
assert self.mem_cur
assert self.mem_cur # !rm
rd, fn = s3enc(self.mem_cur, rd, fn)
v = (wark, int(ts), sz, rd, fn, db_ip, int(at or 0))
db.execute(sql, v)
@@ -3592,7 +3718,7 @@ class Up2k(object):
try:
db.execute(q, (cd, wark[:16], rd, fn))
except:
assert self.mem_cur
assert self.mem_cur # !rm
rd, fn = s3enc(self.mem_cur, rd, fn)
db.execute(q, (cd, wark[:16], rd, fn))
@@ -3625,6 +3751,19 @@ class Up2k(object):
except:
pass
if "nodirsz" not in vflags:
try:
q = "update ds set nf=nf+1, sz=sz+? where rd=?"
q2 = "insert into ds values(?,?,1)"
while True:
if not db.execute(q, (sz, rd)).rowcount:
db.execute(q2, (rd, sz))
if not rd:
break
rd = rd.rsplit("/", 1)[0] if "/" in rd else ""
except:
pass
def handle_rm(
self,
uname: str,
@@ -4009,7 +4148,7 @@ class Up2k(object):
has_dupes = False
if w:
assert c1
assert c1 # !rm
if c2 and c2 != c1:
self._copy_tags(c1, c2, w)
@@ -4147,7 +4286,7 @@ class Up2k(object):
try:
c = cur.execute(q, (rd, fn))
except:
assert self.mem_cur
assert self.mem_cur # !rm
c = cur.execute(q, s3enc(self.mem_cur, rd, fn))
hit = c.fetchone()
@@ -4390,8 +4529,7 @@ class Up2k(object):
rem -= len(buf)
digest = hashobj.digest()[:33]
digest = base64.urlsafe_b64encode(digest)
ret.append(digest.decode("utf-8"))
ret.append(ub64enc(digest).decode("ascii"))
return ret, st
@@ -4463,8 +4601,8 @@ class Up2k(object):
dip = self.hub.iphash.s(job["addr"])
suffix = "-%.6f-%s" % (job["t0"], dip)
with ren_open(tnam, "wb", fdir=pdir, suffix=suffix) as zfw:
f, job["tnam"] = zfw["orz"]
f, job["tnam"] = ren_open(tnam, "wb", fdir=pdir, suffix=suffix)
try:
abspath = djoin(pdir, job["tnam"])
sprs = job["sprs"]
sz = job["size"]
@@ -4511,6 +4649,8 @@ class Up2k(object):
if job["hash"] and sprs:
f.seek(sz - 1)
f.write(b"e")
finally:
f.close()
if not job["hash"]:
self._finish_upload(job["ptop"], job["wark"])
@@ -4589,7 +4729,11 @@ class Up2k(object):
bos.unlink(path)
return
newest = float(max(x["poke"] for _, x in reg.items()) if reg else 0)
newest = float(
max(x["t0"] if "done" in x else x["poke"] for _, x in reg.items())
if reg
else 0
)
etag = (len(reg), newest)
if etag == self.snap_prev.get(ptop):
return
@@ -4600,12 +4744,15 @@ class Up2k(object):
path2 = "{}.{}".format(path, os.getpid())
body = {"droppable": self.droppable[ptop], "registry": reg}
j = json.dumps(body, sort_keys=True, separators=(",\n", ": ")).encode("utf-8")
# j = re.sub(r'"(need|hash)": \[\],\n', "", j) # bytes=slow, utf8=hungry
j = j.replace(b'"need": [],\n', b"") # surprisingly optimal
j = j.replace(b'"hash": [],\n', b"")
with gzip.GzipFile(path2, "wb") as f:
f.write(j)
atomic_move(self.log, path2, path, VF_CAREFUL)
self.log("snap: {} |{}|".format(path, len(reg.keys())))
self.log("snap: %s |%d| %.2fs" % (path, len(reg), time.time() - now))
self.snap_prev[ptop] = etag
def _tagger(self) -> None:
@@ -4853,11 +5000,10 @@ def up2k_wark_from_hashlist(salt: str, filesize: int, hashes: list[str]) -> str:
vstr = "\n".join(values)
wark = hashlib.sha512(vstr.encode("utf-8")).digest()[:33]
wark = base64.urlsafe_b64encode(wark)
return wark.decode("ascii")
return ub64enc(wark).decode("ascii")
def up2k_wark_from_metadata(salt: str, sz: int, lastmod: int, rd: str, fn: str) -> str:
ret = sfsenc("%s\n%d\n%d\n%s\n%s" % (salt, lastmod, sz, rd, fn))
ret = base64.urlsafe_b64encode(hashlib.sha512(ret).digest())
ret = ub64enc(hashlib.sha512(ret).digest())
return ("#%s" % (ret.decode("ascii"),))[:44]

View File

@@ -3,7 +3,8 @@ from __future__ import print_function, unicode_literals
import argparse
import base64
import contextlib
import binascii
import codecs
import errno
import hashlib
import hmac
@@ -30,13 +31,20 @@ from collections import Counter
from ipaddress import IPv4Address, IPv4Network, IPv6Address, IPv6Network
from queue import Queue
from .__init__ import ANYWIN, EXE, MACOS, PY2, TYPE_CHECKING, VT100, WINDOWS
from .__init__ import (
ANYWIN,
EXE,
MACOS,
PY2,
PY36,
TYPE_CHECKING,
VT100,
WINDOWS,
EnvParams,
)
from .__version__ import S_BUILD_DT, S_VERSION
from .stolen import surrogateescape
ub64dec = base64.urlsafe_b64decode
ub64enc = base64.urlsafe_b64encode
try:
from datetime import datetime, timezone
@@ -64,7 +72,7 @@ if PY2:
if sys.version_info >= (3, 7) or (
sys.version_info >= (3, 6) and platform.python_implementation() == "CPython"
PY36 and platform.python_implementation() == "CPython"
):
ODict = dict
else:
@@ -126,7 +134,7 @@ if True: # pylint: disable=using-constant-test
from collections.abc import Callable, Iterable
import typing
from typing import Any, Generator, Optional, Pattern, Protocol, Union
from typing import IO, Any, Generator, Optional, Pattern, Protocol, Union
try:
from typing import LiteralString
@@ -164,12 +172,8 @@ except ImportError:
if not PY2:
from io import BytesIO
from urllib.parse import quote_from_bytes as quote
from urllib.parse import unquote_to_bytes as unquote
else:
from StringIO import StringIO as BytesIO # type: ignore
from urllib import quote # type: ignore # pylint: disable=no-name-in-module
from urllib import unquote # type: ignore # pylint: disable=no-name-in-module
try:
@@ -216,7 +220,7 @@ else:
FS_ENCODING = sys.getfilesystemencoding()
SYMTIME = sys.version_info > (3, 6) and os.utime in os.supports_follow_symlinks
SYMTIME = PY36 and os.utime in os.supports_follow_symlinks
META_NOBOTS = '<meta name="robots" content="noindex, nofollow">\n'
@@ -286,6 +290,30 @@ if ANYWIN:
UNPLICATIONS = [["no_dav", "daw"]]
DAV_ALLPROP_L = [
"contentclass",
"creationdate",
"defaultdocument",
"displayname",
"getcontentlanguage",
"getcontentlength",
"getcontenttype",
"getlastmodified",
"href",
"iscollection",
"ishidden",
"isreadonly",
"isroot",
"isstructureddocument",
"lastaccessed",
"name",
"parentname",
"resourcetype",
"supportedlock",
]
DAV_ALLPROPS = set(DAV_ALLPROP_L)
MIMES = {
"opus": "audio/ogg; codecs=opus",
}
@@ -338,7 +366,7 @@ MAGIC_MAP = {"jpeg": "jpg"}
DEF_EXP = "self.ip self.ua self.uname self.host cfg.name cfg.logout vf.scan vf.thsize hdr.cf_ipcountry srv.itime srv.htime"
DEF_MTE = "circle,album,.tn,artist,title,.bpm,key,.dur,.q,.vq,.aq,vc,ac,fmt,res,.fps,ahash,vhash"
DEF_MTE = ".files,circle,album,.tn,artist,title,.bpm,key,.dur,.q,.vq,.aq,vc,ac,fmt,res,.fps,ahash,vhash"
DEF_MTH = ".vq,.aq,vc,ac,fmt,res,.fps"
@@ -440,7 +468,7 @@ def py_desc() -> str:
def _sqlite_ver() -> str:
assert sqlite3 # type: ignore
assert sqlite3 # type: ignore # !rm
try:
co = sqlite3.connect(":memory:")
cur = co.cursor()
@@ -488,17 +516,36 @@ VERSIONS = (
)
_: Any = (mp, BytesIO, quote, unquote, SQLITE_VER, JINJA_VER, PYFTPD_VER, PARTFTPY_VER)
__all__ = [
"mp",
"BytesIO",
"quote",
"unquote",
"SQLITE_VER",
"JINJA_VER",
"PYFTPD_VER",
"PARTFTPY_VER",
]
try:
_b64_enc_tl = bytes.maketrans(b"+/", b"-_")
_b64_dec_tl = bytes.maketrans(b"-_", b"+/")
def ub64enc(bs: bytes) -> bytes:
x = binascii.b2a_base64(bs, newline=False)
return x.translate(_b64_enc_tl)
def ub64dec(bs: bytes) -> bytes:
bs = bs.translate(_b64_dec_tl)
return binascii.a2b_base64(bs)
def b64enc(bs: bytes) -> bytes:
return binascii.b2a_base64(bs, newline=False)
def b64dec(bs: bytes) -> bytes:
return binascii.a2b_base64(bs)
zb = b">>>????"
zb2 = base64.urlsafe_b64encode(zb)
if zb2 != ub64enc(zb) or zb != ub64dec(zb2):
raise Exception("bad smoke")
except Exception as ex:
ub64enc = base64.urlsafe_b64encode # type: ignore
ub64dec = base64.urlsafe_b64decode # type: ignore
b64enc = base64.b64encode # type: ignore
b64dec = base64.b64decode # type: ignore
if not PY36:
print("using fallback base64 codec due to %r" % (ex,))
class Daemon(threading.Thread):
@@ -738,6 +785,7 @@ class _Unrecv(object):
self.buf = buf + self.buf
# !rm.yes>
class _LUnrecv(object):
"""
with expensive debug logging
@@ -794,6 +842,9 @@ class _LUnrecv(object):
print(t.format(buf, self.buf))
# !rm.no>
Unrecv = _Unrecv
@@ -1030,7 +1081,7 @@ class MTHash(object):
if self.stop:
return nch, "", ofs0, chunk_sz
assert f
assert f # !rm
hashobj = hashlib.sha512()
while chunk_rem > 0:
with self.imutex:
@@ -1045,7 +1096,7 @@ class MTHash(object):
ofs += len(buf)
bdig = hashobj.digest()[:33]
udig = base64.urlsafe_b64encode(bdig).decode("utf-8")
udig = ub64enc(bdig).decode("ascii")
return nch, udig, ofs0, chunk_sz
@@ -1071,7 +1122,7 @@ class HMaccas(object):
self.cache = {}
zb = hmac.new(self.key, msg, hashlib.sha512).digest()
zs = base64.urlsafe_b64encode(zb)[: self.retlen].decode("utf-8")
zs = ub64enc(zb)[: self.retlen].decode("ascii")
self.cache[msg] = zs
return zs
@@ -1402,18 +1453,13 @@ def min_ex(max_lines: int = 8, reverse: bool = False) -> str:
return "\n".join(ex[-max_lines:][:: -1 if reverse else 1])
@contextlib.contextmanager
def ren_open(
fname: str, *args: Any, **kwargs: Any
) -> Generator[dict[str, tuple[typing.IO[Any], str]], None, None]:
def ren_open(fname: str, *args: Any, **kwargs: Any) -> tuple[typing.IO[Any], str]:
fun = kwargs.pop("fun", open)
fdir = kwargs.pop("fdir", None)
suffix = kwargs.pop("suffix", None)
if fname == os.devnull:
with fun(fname, *args, **kwargs) as f:
yield {"orz": (f, fname)}
return
return fun(fname, *args, **kwargs), fname
if suffix:
ext = fname.split(".")[-1]
@@ -1435,6 +1481,7 @@ def ren_open(
asciified = False
b64 = ""
while True:
f = None
try:
if fdir:
fpath = os.path.join(fdir, fname)
@@ -1446,19 +1493,20 @@ def ren_open(
fname += suffix
ext += suffix
with fun(fsenc(fpath), *args, **kwargs) as f:
if b64:
assert fdir
fp2 = "fn-trunc.%s.txt" % (b64,)
fp2 = os.path.join(fdir, fp2)
with open(fsenc(fp2), "wb") as f2:
f2.write(orig_name.encode("utf-8"))
f = fun(fsenc(fpath), *args, **kwargs)
if b64:
assert fdir # !rm
fp2 = "fn-trunc.%s.txt" % (b64,)
fp2 = os.path.join(fdir, fp2)
with open(fsenc(fp2), "wb") as f2:
f2.write(orig_name.encode("utf-8"))
yield {"orz": (f, fname)}
return
return f, fname
except OSError as ex_:
ex = ex_
if f:
f.close()
# EPERM: android13
if ex.errno in (errno.EINVAL, errno.EPERM) and not asciified:
@@ -1479,8 +1527,7 @@ def ren_open(
if not b64:
zs = ("%s\n%s" % (orig_name, suffix)).encode("utf-8", "replace")
zs = hashlib.sha512(zs).digest()[:12]
b64 = base64.urlsafe_b64encode(zs).decode("utf-8")
b64 = ub64enc(hashlib.sha512(zs).digest()[:12]).decode("ascii")
badlen = len(fname)
while len(fname) >= badlen:
@@ -1713,7 +1760,7 @@ class MultipartParser(object):
returns the value of the next field in the multipart body,
raises if the field name is not as expected
"""
assert self.gen
assert self.gen # !rm
p_field, p_fname, p_data = next(self.gen)
if p_field != field_name:
raise WrongPostKey(field_name, p_field, p_fname, p_data)
@@ -1722,7 +1769,7 @@ class MultipartParser(object):
def drop(self) -> None:
"""discards the remaining multipart body"""
assert self.gen
assert self.gen # !rm
for _, _, data in self.gen:
for _ in data:
pass
@@ -1786,9 +1833,8 @@ def rand_name(fdir: str, fn: str, rnd: int) -> str:
nc = rnd + extra
nb = (6 + 6 * nc) // 8
zb = os.urandom(nb)
zb = base64.urlsafe_b64encode(zb)
fn = zb[:nc].decode("utf-8") + ext
zb = ub64enc(os.urandom(nb))
fn = zb[:nc].decode("ascii") + ext
ok = not os.path.exists(fsenc(os.path.join(fdir, fn)))
return fn
@@ -1801,7 +1847,7 @@ def gen_filekey(alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str
zs = "%s %s" % (salt, fspath)
zb = zs.encode("utf-8", "replace")
return base64.urlsafe_b64encode(hashlib.sha512(zb).digest()).decode("ascii")
return ub64enc(hashlib.sha512(zb).digest()).decode("ascii")
def gen_filekey_dbg(
@@ -1815,7 +1861,7 @@ def gen_filekey_dbg(
) -> str:
ret = gen_filekey(alg, salt, fspath, fsize, inode)
assert log_ptn
assert log_ptn # !rm
if log_ptn.search(fspath):
try:
import inspect
@@ -1936,13 +1982,10 @@ def undot(path: str) -> str:
return "/".join(ret)
def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
def sanitize_fn(fn: str, ok: str) -> str:
if "/" not in ok:
fn = fn.replace("\\", "/").split("/")[-1]
if fn.lower() in bad:
fn = "_" + fn
if ANYWIN:
remap = [
["<", ""],
@@ -1968,9 +2011,9 @@ def sanitize_fn(fn: str, ok: str, bad: list[str]) -> str:
return fn.strip()
def sanitize_vpath(vp: str, ok: str, bad: list[str]) -> str:
def sanitize_vpath(vp: str, ok: str) -> str:
parts = vp.replace(os.sep, "/").split("/")
ret = [sanitize_fn(x, ok, bad) for x in parts]
ret = [sanitize_fn(x, ok) for x in parts]
return "/".join(ret)
@@ -2074,6 +2117,8 @@ def html_bescape(s: bytes, quot: bool = False, crlf: bool = False) -> bytes:
def _quotep2(txt: str) -> str:
"""url quoter which deals with bytes correctly"""
if not txt:
return ""
btxt = w8enc(txt)
quot = quote(btxt, safe=b"/")
return w8dec(quot.replace(b" ", b"+")) # type: ignore
@@ -2081,18 +2126,61 @@ def _quotep2(txt: str) -> str:
def _quotep3(txt: str) -> str:
"""url quoter which deals with bytes correctly"""
if not txt:
return ""
btxt = w8enc(txt)
quot = quote(btxt, safe=b"/").encode("utf-8")
return w8dec(quot.replace(b" ", b"+"))
quotep = _quotep3 if not PY2 else _quotep2
if not PY2:
_uqsb = b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789_.-~/"
_uqtl = {
n: ("%%%02X" % (n,) if n not in _uqsb else chr(n)).encode("utf-8")
for n in range(256)
}
_uqtl[b" "] = b"+"
def _quotep3b(txt: str) -> str:
"""url quoter which deals with bytes correctly"""
if not txt:
return ""
btxt = w8enc(txt)
if btxt.rstrip(_uqsb):
lut = _uqtl
btxt = b"".join([lut[ch] for ch in btxt])
return w8dec(btxt)
quotep = _quotep3b
_hexd = "0123456789ABCDEFabcdef"
_hex2b = {(a + b).encode(): bytes.fromhex(a + b) for a in _hexd for b in _hexd}
def unquote(btxt: bytes) -> bytes:
h2b = _hex2b
parts = iter(btxt.split(b"%"))
ret = [next(parts)]
for item in parts:
c = h2b.get(item[:2])
if c is None:
ret.append(b"%")
ret.append(item)
else:
ret.append(c)
ret.append(item[2:])
return b"".join(ret)
from urllib.parse import quote_from_bytes as quote
else:
from urllib import quote # type: ignore # pylint: disable=no-name-in-module
from urllib import unquote # type: ignore # pylint: disable=no-name-in-module
quotep = _quotep2
def unquotep(txt: str) -> str:
"""url unquoter which deals with bytes correctly"""
btxt = w8enc(txt)
# btxt = btxt.replace(b"+", b" ")
unq2 = unquote(btxt)
return w8dec(unq2)
@@ -2238,12 +2326,12 @@ w8enc = _w8enc3 if not PY2 else _w8enc2
def w8b64dec(txt: str) -> str:
"""decodes base64(filesystem-bytes) to wtf8"""
return w8dec(base64.urlsafe_b64decode(txt.encode("ascii")))
return w8dec(ub64dec(txt.encode("ascii")))
def w8b64enc(txt: str) -> str:
"""encodes wtf8 to base64(filesystem-bytes)"""
return base64.urlsafe_b64encode(w8enc(txt)).decode("ascii")
return ub64enc(w8enc(txt)).decode("ascii")
if not PY2 and WINDOWS:
@@ -2401,7 +2489,7 @@ def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
def get_df(abspath: str) -> tuple[Optional[int], Optional[int]]:
try:
# some fuses misbehave
assert ctypes
assert ctypes # type: ignore # !rm
if ANYWIN:
bfree = ctypes.c_ulonglong(0)
ctypes.windll.kernel32.GetDiskFreeSpaceExW( # type: ignore
@@ -2619,8 +2707,7 @@ def hashcopy(
if slp:
time.sleep(slp)
digest = hashobj.digest()[:33]
digest_b64 = base64.urlsafe_b64encode(digest).decode("utf-8")
digest_b64 = ub64enc(hashobj.digest()[:33]).decode("ascii")
return tlen, hashobj.hexdigest(), digest_b64
@@ -2860,7 +2947,7 @@ def getalive(pids: list[int], pgid: int) -> list[int]:
alive.append(pid)
else:
# windows doesn't have pgroups; assume
assert psutil
assert psutil # type: ignore # !rm
psutil.Process(pid)
alive.append(pid)
except:
@@ -2878,7 +2965,7 @@ def killtree(root: int) -> None:
pgid = 0
if HAVE_PSUTIL:
assert psutil
assert psutil # type: ignore # !rm
pids = [root]
parent = psutil.Process(root)
for child in parent.children(recursive=True):
@@ -3291,7 +3378,7 @@ def runhook(
at: float,
txt: str,
) -> dict[str, Any]:
assert broker or up2k
assert broker or up2k # !rm
asrv = (broker or up2k).asrv
args = (broker or up2k).args
vp = vp.replace("\\", "/")
@@ -3356,9 +3443,15 @@ def loadpy(ap: str, hot: bool) -> Any:
def gzip_orig_sz(fn: str) -> int:
with open(fsenc(fn), "rb") as f:
f.seek(-4, 2)
rv = f.read(4)
return sunpack(b"I", rv)[0] # type: ignore
return gzip_file_orig_sz(f)
def gzip_file_orig_sz(f) -> int:
start = f.tell()
f.seek(-4, 2)
rv = f.read(4)
f.seek(start, 0)
return sunpack(b"I", rv)[0] # type: ignore
def align_tab(lines: list[str]) -> list[str]:
@@ -3485,7 +3578,7 @@ def termsize() -> tuple[int, int]:
def hidedir(dp) -> None:
if ANYWIN:
try:
assert ctypes
assert ctypes # type: ignore # !rm
k32 = ctypes.WinDLL("kernel32")
attrs = k32.GetFileAttributesW(dp)
if attrs >= 0:
@@ -3494,6 +3587,110 @@ def hidedir(dp) -> None:
pass
try:
if sys.version_info < (3, 10):
# py3.8 doesn't have .files
# py3.9 has broken .is_file
raise ImportError()
import importlib.resources as impresources
except ImportError:
try:
import importlib_resources as impresources
except ImportError:
impresources = None
try:
if sys.version_info > (3, 10):
raise ImportError()
import pkg_resources
except ImportError:
pkg_resources = None
def _pkg_resource_exists(pkg: str, name: str) -> bool:
if not pkg_resources:
return False
try:
return pkg_resources.resource_exists(pkg, name)
except NotImplementedError:
return False
def stat_resource(E: EnvParams, name: str):
path = os.path.join(E.mod, name)
if os.path.exists(path):
return os.stat(fsenc(path))
return None
def _find_impresource(pkg: types.ModuleType, name: str):
assert impresources # !rm
try:
files = impresources.files(pkg)
except ImportError:
return None
return files.joinpath(name)
_rescache_has = {}
def _has_resource(name: str):
try:
return _rescache_has[name]
except:
pass
if len(_rescache_has) > 999:
_rescache_has.clear()
assert __package__ # !rm
pkg = sys.modules[__package__]
if impresources:
res = _find_impresource(pkg, name)
if res and res.is_file():
_rescache_has[name] = True
return True
if pkg_resources:
if _pkg_resource_exists(pkg.__name__, name):
_rescache_has[name] = True
return True
_rescache_has[name] = False
return False
def has_resource(E: EnvParams, name: str):
return _has_resource(name) or os.path.exists(os.path.join(E.mod, name))
def load_resource(E: EnvParams, name: str, mode="rb") -> IO[bytes]:
enc = None if "b" in mode else "utf-8"
if impresources:
assert __package__ # !rm
res = _find_impresource(sys.modules[__package__], name)
if res and res.is_file():
if enc:
return res.open(mode, encoding=enc)
else:
# throws if encoding= is mentioned at all
return res.open(mode)
if pkg_resources:
assert __package__ # !rm
pkg = sys.modules[__package__]
if _pkg_resource_exists(pkg.__name__, name):
stream = pkg_resources.resource_stream(pkg.__name__, name)
if enc:
stream = codecs.getreader(enc)(stream)
return stream
return open(os.path.join(E.mod, name), mode, encoding=enc)
class Pebkac(Exception):
def __init__(
self, code: int, msg: Optional[str] = None, log: Optional[str] = None
@@ -3521,3 +3718,16 @@ class WrongPostKey(Pebkac):
self.got = got
self.fname = fname
self.datagen = datagen
_: Any = (mp, BytesIO, quote, unquote, SQLITE_VER, JINJA_VER, PYFTPD_VER, PARTFTPY_VER)
__all__ = [
"mp",
"BytesIO",
"quote",
"unquote",
"SQLITE_VER",
"JINJA_VER",
"PYFTPD_VER",
"PARTFTPY_VER",
]

View File

@@ -1362,6 +1362,7 @@ html.y #widget.open {
}
#widget.cmp #barpos,
#widget.cmp #barbuf {
height: 1.6em;
width: calc(100% - 11em);
border-radius: 0;
left: 5em;

View File

@@ -108,12 +108,9 @@
{%- for f in files %}
<tr><td>{{ f.lead }}</td><td><a href="{{ f.href }}">{{ f.name|e }}</a></td><td>{{ f.sz }}</td>
{%- if f.tags is defined %}
{%- for k in taglist %}
<td>{{ f.tags[k] }}</td>
{%- endfor %}
{%- endif %}
<td>{{ f.ext }}</td><td>{{ f.dt }}</td></tr>
{%- if f.tags is defined %}
{%- for k in taglist %}<td>{{ f.tags[k] }}</td>{%- endfor %}
{%- endif %}<td>{{ f.ext }}</td><td>{{ f.dt }}</td></tr>
{%- endfor %}
</tbody>

View File

@@ -94,10 +94,14 @@ var Ls = {
"danger": "DANGER",
"clipped": "copied to clipboard",
"ht_s": "second!s",
"ht_m": "minute!s",
"ht_h": "hour!s",
"ht_d": "day!s",
"ht_s1": "second",
"ht_s2": "seconds",
"ht_m1": "minute",
"ht_m2": "minutes",
"ht_h1": "hour",
"ht_h2": "hours",
"ht_d1": "day",
"ht_d2": "days",
"ht_and": " and ",
"goh": "control-panel",
@@ -214,6 +218,7 @@ var Ls = {
"ct_ihop": 'when the image viewer is closed, scroll down to the last viewed file">g⮯',
"ct_dots": 'show hidden files (if server permits)">dotfiles',
"ct_dir1st": 'sort folders before files">📁 first',
"ct_nsort": 'natural sort (for filenames with leading digits)">nsort',
"ct_readme": 'show README.md in folder listings">📜 readme',
"ct_idxh": 'show index.html instead of folder listing">htm',
"ct_sbars": 'show scrollbars">⟊',
@@ -325,7 +330,7 @@ var Ls = {
"fs_sc": "share the folder you're in",
"fs_ss": "share the selected files",
"fs_just1d": "you cannot select more than one folder,\nor mix flies and folders in one selection",
"fs_just1d": "you cannot select more than one folder,\nor mix files and folders in one selection",
"fs_abrt": "❌ abort",
"fs_rand": "🎲 rand.name",
"fs_go": "✅ create share",
@@ -341,7 +346,7 @@ var Ls = {
"fs_tsrc": "the file or folder to share",
"fs_ppwd": "optional password",
"fs_w8": "creating share...",
"fs_ok": "<h6>share-URL created</h6>\npress <code>Enter/OK</code> to Clipboard\npress <code>ESC/Cancel</code> to Close\n\n",
"fs_ok": "press <code>Enter/OK</code> to Clipboard\npress <code>ESC/Cancel</code> to Close",
"frt_dec": "may fix some cases of broken filenames\">url-decode",
"frt_rst": "reset modified filenames back to the original ones\">↺ reset",
@@ -658,10 +663,14 @@ var Ls = {
"danger": "VARSKU",
"clipped": "kopiert til utklippstavlen",
"ht_s": "sekund!er",
"ht_m": "minutt!er",
"ht_h": "time!r",
"ht_d": "dag!er",
"ht_s1": "sekund",
"ht_s2": "sekunder",
"ht_m1": "minutt",
"ht_m2": "minutter",
"ht_h1": "time",
"ht_h2": "timer",
"ht_d1": "dag",
"ht_d2": "dager",
"ht_and": " og ",
"goh": "kontrollpanel",
@@ -778,6 +787,7 @@ var Ls = {
"ct_ihop": 'bla ned til sist viste bilde når bildeviseren lukkes">g⮯',
"ct_dots": 'vis skjulte filer (gitt at serveren tillater det)">.synlig',
"ct_dir1st": 'sorter slik at mapper kommer foran filer">📁 først',
"ct_nsort": 'naturlig sortering (forstår tall i filnavn)">nsort',
"ct_readme": 'vis README.md nedenfor filene">📜 readme',
"ct_idxh": 'vis index.html istedenfor fil-liste">htm',
"ct_sbars": 'vis rullgardiner / skrollefelt">⟊',
@@ -905,7 +915,7 @@ var Ls = {
"fs_tsrc": "fil/mappe som skal deles",
"fs_ppwd": "frivillig passord",
"fs_w8": "oppretter deling...",
"fs_ok": "<h6>URL opprettet</h6>\ntrykk <code>Enter/OK</code> for å kopiere linken (for CTRL-V)\ntrykk <code>ESC/Avbryt</code> for å bare bekrefte\n\n",
"fs_ok": "trykk <code>Enter/OK</code> for å kopiere linken (for CTRL-V)\ntrykk <code>ESC/Avbryt</code> for å bare bekrefte",
"frt_dec": "kan korrigere visse ødelagte filnavn\">url-decode",
"frt_rst": "nullstiller endringer (tilbake til de originale filnavnene)\">↺ reset",
@@ -1218,14 +1228,18 @@ var Ls = {
"m_ok": "确定",
"m_ng": "取消",
"enable": "启用", //m
"danger": "危险", //m
"clipped": "已复制到剪贴板", //m
"enable": "启用",
"danger": "危险",
"clipped": "已复制到剪贴板",
"ht_s": "秒",
"ht_m": "",
"ht_h": "",
"ht_d": "",
"ht_s1": "秒",
"ht_s2": "",
"ht_m1": "",
"ht_m2": "",
"ht_h1": "时",
"ht_h2": "时",
"ht_d1": "天",
"ht_d2": "天",
"ht_and": " 和 ",
"goh": "控制面板",
@@ -1303,12 +1317,12 @@ var Ls = {
"utl_prog": "进度",
// 保持简短:
"utl_404": "404", //m
"utl_err": "故障", //m
"utl_oserr": "OS故障", //m
"utl_found": "已找到", //m
"utl_defer": "延期", //m
"utl_yolo": "加速", //m
"utl_404": "404",
"utl_err": "错误",
"utl_oserr": "OS错误",
"utl_found": "已找到",
"utl_defer": "延期",
"utl_yolo": "加速",
"utl_done": "完成",
"ul_flagblk": "文件已添加到队列</b><br>但另一个浏览器标签中有一个繁忙的 up2k<br>因此等待它完成",
@@ -1336,12 +1350,13 @@ var Ls = {
"cl_hcancel": "列隐藏已取消",
"ct_grid": '网格视图',
"ct_ttips": '◔ ◡ ◔"> 工具提示', //m
"ct_ttips": '◔ ◡ ◔"> 工具提示',
"ct_thumb": '在网格视图中,切换图标或缩略图$N快捷键: T">🖼️ 缩略图',
"ct_csel": '在网格视图中使用 CTRL 和 SHIFT 进行文件选择">CTRL',
"ct_ihop": '当图像查看器关闭时,滚动到最后查看的文件">滚动',
"ct_dots": '显示隐藏文件(如果服务器允许)">隐藏文件',
"ct_dir1st": '在文件之前排序文件夹">📁 排序',
"ct_nsort": '正确排序以数字开头的文件名">数字排序', //m
"ct_readme": '在文件夹列表中显示 README.md">📜 readme',
"ct_idxh": '显示 index.html 代替文件夹列表">htm',
"ct_sbars": '显示滚动条">⟊',
@@ -1468,8 +1483,8 @@ var Ls = {
"fs_pname": "链接名称可选;如果为空则随机",
"fs_tsrc": "共享的文件或文件夹",
"fs_ppwd": "密码可选",
"fs_w8": "正在创建文件共享...", //m
"fs_ok": "<h6>分享链接已创建</h6>\n按 <code>Enter/OK</code> 复制到剪贴板\n按 <code>ESC/Cancel</code> 关闭\n\n",
"fs_w8": "正在创建文件共享...",
"fs_ok": "按 <code>Enter/OK</code> 复制到剪贴板\n按 <code>ESC/Cancel</code> 关闭",
"frt_dec": "可能修复一些损坏的文件名\">url-decode",
"frt_rst": "将修改后的文件名重置为原始文件名\">↺ 重置",
@@ -1479,8 +1494,8 @@ var Ls = {
"fr_case": "区分大小写的正则表达式\">case",
"fr_win": "Windows 安全名称;将 <code>&lt;&gt;:&quot;\\|?*</code> 替换为日文全角字符\">win",
"fr_slash": "将 <code>/</code> 替换为不会导致新文件夹创建的字符\">不使用 /",
"fr_re": "正则表达式搜索模式应用于原始文件名;$N可以在下面的格式字段中引用捕获组如&lt;code&gt;(1)&lt;/code&gt;和&lt;code&gt;(2)&lt;/code&gt;等等。", //m
"fr_fmt": "受到 foobar2000 的启发:$N&lt;code&gt;(title)&lt;/code&gt; 被歌曲名称替换,$N&lt;code&gt;[(artist) - ](title)&lt;/code&gt; 仅当歌曲艺术家不为空时才包含&lt;code&gt;[此]&lt;/code&gt;部分$N&lt;code&gt;$lpad((tn),2,0)&lt;/code&gt; 将曲目编号填充为 2 位数字", //m
"fr_re": "正则表达式搜索模式应用于原始文件名;$N可以在下面的格式字段中引用捕获组如&lt;code&gt;(1)&lt;/code&gt;和&lt;code&gt;(2)&lt;/code&gt;等等。",
"fr_fmt": "受到 foobar2000 的启发:$N&lt;code&gt;(title)&lt;/code&gt; 被歌曲名称替换,$N&lt;code&gt;[(artist) - ](title)&lt;/code&gt; 仅当歌曲艺术家不为空时才包含&lt;code&gt;[此]&lt;/code&gt;部分$N&lt;code&gt;$lpad((tn),2,0)&lt;/code&gt; 将曲目编号填充为 2 位数字",
"fr_pdel": "删除",
"fr_pnew": "另存为",
"fr_pname": "为你的新预设提供一个名称",
@@ -1540,8 +1555,8 @@ var Ls = {
"gt_c1": "截断文件名更多(显示更少)",
"gt_c2": "截断文件名更少(显示更多)",
"sm_w8": "正在搜寻匹配...", //m
"sm_prev": "以下是来自先前查询的搜索结果:\n ",
"sm_w8": "正在搜...",
"sm_prev": "上次查询的搜索结果:\n ",
"sl_close": "关闭搜索结果",
"sl_hits": "显示 {0} 个结果",
"sl_moar": "加载更多",
@@ -1613,20 +1628,20 @@ var Ls = {
"un_del": "删除",
"un_m3": "正在加载你的近期上传...",
"un_busy": "正在删除 {0} 个文件...",
"un_clip": "{0} 个链接已复制到剪贴板", //m
"un_clip": "{0} 个链接已复制到剪贴板",
"u_https1": "你应该",
"u_https2": "切换到 https",
"u_https3": "以获得更好的性能",
"u_ancient": '你的浏览器非常古老 -- 也许你应该 <a href="#" onclick="goto(\'bup\')">改用 bup</a>',
"u_nowork": "需要 Firefox 53+ 或 Chrome 57+ 或 iOS 11+",
"u_nodrop": '您的浏览器太旧,不支持通过拖动文件到窗口来上传文件', //m
"u_notdir": "未收到文件夹!\n\n您的浏览器太旧\n请尝试将文件夹拖入窗口", //m
"u_nodrop": '浏览器版本低,不支持通过拖动文件到窗口来上传文件',
"u_notdir": "不是文件夹!\n\n您的浏览器太旧\n请尝试将文件夹拖入窗口",
"u_uri": "要从其他浏览器窗口拖放图片,\n请将其拖放到大的上传按钮上",
"u_enpot": '切换到 <a href="#">简约 UI</a>(可能提高上传速度)',
"u_depot": '切换到 <a href="#">精美 UI</a>(可能降低上传速度)',
"u_gotpot": '切换到土豆 UI 以提高上传速度,\n\n随时可以不同意并切换回去',
"u_pott": "<p>个文件: &nbsp; <b>{0}</b> 已完成, &nbsp; <b>{1}</b> 失败, &nbsp; <b>{2}</b> 正在处理, &nbsp; <b>{3}</b> 排队</p>", //m
"u_gotpot": '切换到简化UI以提高上传速度\n\n随时可以不同意并切换回去',
"u_pott": "<p>个文件: &nbsp; <b>{0}</b> 已完成, &nbsp; <b>{1}</b> 失败, &nbsp; <b>{2}</b> 正在处理, &nbsp; <b>{3}</b> 排队</p>",
"u_ever": "这是基本的上传工具; up2k 需要至少<br>chrome 21 // firefox 13 // edge 12 // opera 12 // safari 5.1",
"u_su2k": '这是基本的上传工具;<a href="#" id="u2yea">up2k</a> 更好',
"u_ewrite": '你对这个文件夹没有写入权限',
@@ -1639,16 +1654,16 @@ var Ls = {
"u_up_life": "此上传将在 {0} 后从服务器删除",
"u_asku": '将这些 {0} 个文件上传到 <code>{1}</code>',
"u_unpt": "你可以使用左上角的 🧯 撤销/删除此上传",
"u_bigtab": '将显示 {0} 个文件。这可能会导致您的浏览器崩溃。您确定吗?', //m
"u_scan": '正在扫描文件...', //m
"u_dirstuck": '您的浏览器无法访问以下 {0} 个文件/文件夹,因此它们将被跳过:', //m
"u_bigtab": '将显示 {0} 个文件,可能会导致您的浏览器崩溃。您确定吗?',
"u_scan": '正在扫描文件...',
"u_dirstuck": '您的浏览器无法访问以下 {0} 个文件/文件夹,它们将被跳过:',
"u_etadone": '完成 ({0}, {1} 个文件)',
"u_etaprep": '(准备上传)',
"u_hashdone": '哈希完成',
"u_hashing": '哈希',
"u_hs": '正在等待服务器...', //m
"u_dupdefer": "这是一个重复文件。它将在所有其他文件上传后进行处理", //m
"u_actx": "单击此文本以防止切换到其他窗口/选项卡时性能下降", //m
"u_hs": '正在等待服务器...',
"u_dupdefer": "这是一个重复文件。它将在所有其他文件上传后进行处理",
"u_actx": "单击此文本以防止切换到其他窗口/选项卡时性能下降",
"u_fixed": "好!&nbsp;已修复 👍",
"u_cuerr": "上传块 {0} 的 {1} 失败;\n可能无害继续中\n\n文件{2}",
"u_cuerr2": "服务器拒绝上传(块 {0} 的 {1}\n稍后重试\n\n文件{2}\n\n错误 ",
@@ -1776,7 +1791,7 @@ ebi('widget').innerHTML = (
' <canvas id="barbuf"></canvas>' +
'</div>' +
'<div id="np_inf">' +
' <img id="np_img"></span>' +
' <img id="np_img" />' +
' <span id="np_url"></span>' +
' <span id="np_circle"></span>' +
' <span id="np_album"></span>' +
@@ -1901,6 +1916,7 @@ ebi('op_cfg').innerHTML = (
' <a id="ihop" class="tgl btn" href="#" tt="' + L.ct_ihop + '</a>\n' +
' <a id="dotfiles" class="tgl btn" href="#" tt="' + L.ct_dots + '</a>\n' +
' <a id="dir1st" class="tgl btn" href="#" tt="' + L.ct_dir1st + '</a>\n' +
' <a id="nsort" class="tgl btn" href="#" tt="' + L.ct_nsort + '</a>\n' +
' <a id="ireadme" class="tgl btn" href="#" tt="' + L.ct_readme + '</a>\n' +
' <a id="idxh" class="tgl btn" href="#" tt="' + L.ct_idxh + '</a>\n' +
' <a id="sbars" class="tgl btn" href="#" tt="' + L.ct_sbars + '</a>\n' +
@@ -4175,6 +4191,9 @@ function sortfiles(nodes) {
var sopts = jread('fsort', jcp(dsort)),
dir1st = sread('dir1st') !== '0';
var collator = sread('nsort') != 1 ? null :
new Intl.Collator([], {numeric: true});
try {
var is_srch = false;
if (nodes[0]['rp']) {
@@ -4224,7 +4243,9 @@ function sortfiles(nodes) {
}
if (v2 === undefined) return 1 * rev;
var ret = rev * (typ == 'int' ? (v1 - v2) : (v1.localeCompare(v2)));
var ret = rev * (typ == 'int' ? (v1 - v2) : collator ?
collator.compare(v1, v2) : v1.localeCompare(v2));
if (ret === 0)
ret = onodes.indexOf(n1) - onodes.indexOf(n2);
@@ -4584,11 +4605,12 @@ var fileman = (function () {
return;
}
surl = surl.slice(15);
modal.confirm(L.fs_ok + esc(surl), function() {
var txt = esc(surl) + '<img class="b64" src="' + surl + '?qr" />';
modal.confirm(txt + L.fs_ok, function() {
cliptxt(surl, function () {
toast.ok(2, L.clipped);
});
});
}, null);
}
sh_apply.onclick = function () {
@@ -4622,13 +4644,13 @@ var fileman = (function () {
r.rename = function (e) {
ev(e);
if (clgot(bren, 'hide'))
return toast.err(3, L.fr_eperm);
var sel = msel.getsel();
if (!sel.length)
return toast.err(3, L.fr_emore);
if (clgot(bren, 'hide'))
return toast.err(3, L.fr_eperm);
var f = [],
base = vsplit(sel[0].vp)[0],
mkeys;
@@ -4931,9 +4953,6 @@ var fileman = (function () {
r.delete = function (e) {
ev(e);
if (clgot(bdel, 'hide'))
return toast.err(3, L.fd_eperm);
var sel = msel.getsel(),
vps = [];
@@ -4943,6 +4962,9 @@ var fileman = (function () {
if (!sel.length)
return toast.err(3, L.fd_emore);
if (clgot(bdel, 'hide'))
return toast.err(3, L.fd_eperm);
function deleter(err) {
var xhr = new XHR(),
vp = vps.shift();
@@ -4980,14 +5002,14 @@ var fileman = (function () {
r.cut = function (e) {
ev(e);
if (clgot(bcut, 'hide'))
return toast.err(3, L.fc_eperm);
var sel = msel.getsel(),
vps = [];
if (!sel.length)
toast.err(3, L.fc_emore);
return toast.err(3, L.fc_emore);
if (clgot(bcut, 'hide'))
return toast.err(3, L.fc_eperm);
var els = [], griden = thegrid.en;
for (var a = 0; a < sel.length; a++) {
@@ -5095,12 +5117,12 @@ var fileman = (function () {
};
r.paste = function () {
if (clgot(bpst, 'hide'))
return toast.err(3, L.fp_eperm);
if (!r.clip.length)
return toast.err(5, L.fp_ecut);
if (clgot(bpst, 'hide'))
return toast.err(3, L.fp_eperm);
var req = [],
exists = [],
indir = [],
@@ -5301,8 +5323,12 @@ var showfile = (function () {
r.files.push({ 'id': link.id, 'name': uricom_dec(fn) });
var td = ebi(link.id).closest('tr').getElementsByTagName('td')[0];
var ah = ebi(link.id),
td = ah.closest('tr').getElementsByTagName('td')[0];
if (ah.textContent.endsWith('/'))
continue;
if (lang == 'ts' || (lang == 'md' && td.textContent != '-'))
continue;
@@ -5670,10 +5696,7 @@ var thegrid = (function () {
swrite('gridln', r.ln);
setTimeout(r.tippen, 20);
}
try {
document.documentElement.style.setProperty('--grid-ln', r.ln);
}
catch (ex) { }
setcvar('--grid-ln', r.ln);
}
setln();
@@ -5683,10 +5706,7 @@ var thegrid = (function () {
swrite('gridsz', r.sz);
setTimeout(r.tippen, 20);
}
try {
document.documentElement.style.setProperty('--grid-sz', r.sz + 'em');
}
catch (ex) { }
setcvar('--grid-sz', r.sz + 'em');
aligngriditems();
}
setsz();
@@ -6749,10 +6769,7 @@ var filecolwidth = (function () {
return;
lastwidth = w;
try {
document.documentElement.style.setProperty('--file-td-w', w + 'em');
}
catch (ex) { }
setcvar('--file-td-w', w + 'em');
}
})();
onresize100.add(filecolwidth, true);
@@ -6773,6 +6790,9 @@ var treectl = (function () {
mentered = null,
treesz = clamp(icfg_get('treesz', 16), 10, 50);
var resort = function () {
treectl.gentab(get_evpath(), treectl.lsc);
};
bcfg_bind(r, 'ireadme', 'ireadme', true);
bcfg_bind(r, 'idxh', 'idxh', idxh, setidxh);
bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
@@ -6783,9 +6803,8 @@ var treectl = (function () {
xhr.open('GET', SR + '/?setck=dots=' + (v ? 'y' : ''), true);
xhr.send();
});
bcfg_bind(r, 'dir1st', 'dir1st', true, function (v) {
treectl.gentab(get_evpath(), treectl.lsc);
});
bcfg_bind(r, 'nsort', 'nsort', false, resort);
bcfg_bind(r, 'dir1st', 'dir1st', true, resort);
setwrap(bcfg_bind(r, 'wtree', 'wraptree', true, setwrap));
setwrap(bcfg_bind(r, 'parpane', 'parpane', true, onscroll));
bcfg_bind(r, 'htree', 'hovertree', false, reload_tree);
@@ -6994,10 +7013,7 @@ var treectl = (function () {
w = iw + 'em',
w2 = (iw + 2) + 'em';
try {
document.documentElement.style.setProperty('--nav-sz', w);
}
catch (ex) { }
setcvar('--nav-sz', w);
ebi('tree').style.width = w;
ebi('wrap').style.marginLeft = w2;
onscroll();

View File

@@ -27,7 +27,7 @@ a {
padding: .2em .6em;
margin: 0 .3em;
}
td a {
#wrap td a {
margin: 0;
}
#w {
@@ -53,12 +53,13 @@ th {
position: sticky;
background: #f7f7f7;
}
td, th {
#wrap td,
#wrap th {
padding: .3em .6em;
text-align: left;
white-space: nowrap;
}
td+td+td+td+td+td+td+td {
#wrap td+td+td+td+td+td+td+td {
font-family: var(--font-mono), monospace, monospace;
}

View File

@@ -22,8 +22,8 @@
<span>min/hrs = time left</span>
<table id="tab"><thead><tr>
<th>delete</th>
<th>sharekey</th>
<th>delete</th>
<th>pw</th>
<th>source</th>
<th>axs</th>
@@ -37,10 +37,13 @@
</tr></thead><tbody>
{% for k, pw, vp, pr, st, un, t0, t1 in rows %}
<tr>
<td>
<a href="{{ r }}{{ shr }}{{ k }}?qr">qr</a>
<a href="{{ r }}{{ shr }}{{ k }}">{{ k }}</a>
</td>
<td><a href="#" k="{{ k }}">delete</a></td>
<td><a href="{{ r }}{{ shr }}{{ k }}">{{ k }}</a></td>
<td>{{ pw }}</td>
<td><a href="{{ r }}/{{ vp|e }}">{{ vp|e }}</a></td>
<td>{{ "yes" if pw else "--" }}</td>
<td><a href="{{ r }}/{{ vp|e }}">/{{ vp|e }}</a></td>
<td>{{ pr }}</td>
<td>{{ st }}</td>
<td>{{ un|e }}</td>

View File

@@ -12,7 +12,7 @@ function rm() {
}
function bump() {
var k = this.closest('tr').getElementsByTagName('a')[0].getAttribute('k'),
var k = this.closest('tr').getElementsByTagName('a')[2].getAttribute('k'),
u = SR + shr + uricom_enc(k) + '?eshare=' + this.value,
xhr = new XHR();
@@ -28,14 +28,36 @@ function cb() {
document.location = '?shares';
}
function qr(e) {
ev(e);
var href = this.href,
pw = this.closest('tr').cells[2].textContent;
if (pw.indexOf('yes') < 0)
return showqr(href);
modal.prompt("if you want to bypass the password protection by\nembedding the password into the qr-code, then\ntype the password now, otherwise leave this empty", "", function (v) {
if (v)
href += "&pw=" + v;
showqr(href);
});
}
function showqr(href) {
var vhref = href.replace('?qr&', '?').replace('?qr', '');
modal.alert(esc(vhref) + '<img class="b64" src="' + href + '" />');
}
(function() {
var tab = ebi('tab').tBodies[0],
tr = Array.prototype.slice.call(tab.rows, 0);
var buf = [];
for (var a = 0; a < tr.length; a++)
for (var a = 0; a < tr.length; a++) {
tr[a].cells[0].getElementsByTagName('a')[0].onclick = qr;
for (var b = 7; b < 9; b++)
buf.push(parseInt(tr[a].cells[b].innerHTML));
}
var ibuf = 0;
for (var a = 0; a < tr.length; a++)

View File

@@ -53,7 +53,7 @@ a.r {
border-color: #c7a;
}
a.g {
color: #2b0;
color: #0a0;
border-color: #3a0;
box-shadow: 0 .3em 1em #4c0;
}
@@ -152,11 +152,13 @@ pre b,
code b {
color: #000;
font-weight: normal;
text-shadow: 0 0 .2em #0f0;
text-shadow: 0 0 .2em #3f3;
border-bottom: 1px solid #090;
}
html.z pre b,
html.z code b {
color: #fff;
border-bottom: 1px solid #9f9;
}

View File

@@ -32,6 +32,18 @@
</div>
{%- endif %}
{%- if ups %}
<h1 id="aa">incoming files:</h1>
<table class="vols">
<thead><tr><th>%</th><th>speed</th><th>eta</th><th>idle</th><th>dir</th><th>file</th></tr></thead>
<tbody>
{% for u in ups %}
<tr><td>{{ u[0] }}</td><td>{{ u[1] }}</td><td>{{ u[2] }}</td><td>{{ u[3] }}</td><td><a href="{{ u[4] }}">{{ u[5]|e }}</a></td><td>{{ u[6]|e }}</td></tr>
{% endfor %}
</tbody>
</table>
{%- endif %}
{%- if avol %}
<h1>admin panel:</h1>
<table><tr><td> <!-- hehehe -->

View File

@@ -9,7 +9,7 @@ var Ls = {
"e2": "leser inn konfigurasjonsfiler på nytt$N(kontoer, volumer, volumbrytere)$Nog kartlegger alle e2ds-volumer$N$Nmerk: endringer i globale parametere$Nkrever en full restart for å ta gjenge",
"f1": "du kan betrakte:",
"g1": "du kan laste opp til:",
"cc1": "brytere og sånt",
"cc1": "brytere og sånt:",
"h1": "skru av k304",
"i1": "skru på k304",
"j1": "k304 bryter tilkoplingen for hver HTTP 304. Dette hjelper mot visse mellomtjenere som kan sette seg fast / plutselig slutter å laste sider, men det reduserer også ytelsen betydelig",
@@ -25,20 +25,21 @@ var Ls = {
"t1": "handling",
"u2": "tid siden noen sist skrev til serveren$N( opplastning / navneendring / ... )$N$N17d = 17 dager$N1h23 = 1 time 23 minutter$N4m56 = 4 minuter 56 sekunder",
"v1": "koble til",
"v2": "bruk denne serveren som en lokal harddisk$N$NADVARSEL: kommer til å vise passordet ditt!",
"v2": "bruk denne serveren som en lokal harddisk",
"w1": "bytt til https",
"x1": "bytt passord",
"y1": "dine delinger",
"z1": "lås opp område",
"z1": "lås opp område:",
"ta1": "du må skrive et nytt passord først",
"ta2": "gjenta for å bekrefte nytt passord:",
"ta3": "fant en skrivefeil; vennligst prøv igjen",
"aa1": "innkommende:",
},
"eng": {
"d2": "shows the state of all active threads",
"e2": "reload config files (accounts/volumes/volflags),$Nand rescan all e2ds volumes$N$Nnote: any changes to global settings$Nrequire a full restart to take effect",
"u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds",
"v2": "use this server as a local HDD$N$NWARNING: this will show your password!",
"v2": "use this server as a local HDD",
"ta1": "fill in your new password first",
"ta2": "repeat to confirm new password:",
"ta3": "found a typo; please try again",
@@ -70,7 +71,7 @@ var Ls = {
"t1": "操作",
"u2": "自上次服务器写入的时间$N( 上传 / 重命名 / ... )$N$N17d = 17 天$N1h23 = 1 小时 23 分钟$N4m56 = 4 分钟 56 秒",
"v1": "连接",
"v2": "将此服务器用作本地硬盘$N$N警告这将显示你的密码",
"v2": "将此服务器用作本地硬盘",
"w1": "切换到 https",
"x1": "更改密码",
"y1": "你的分享",
@@ -78,6 +79,7 @@ var Ls = {
"ta1": "请先输入新密码",
"ta2": "重复以确认新密码:",
"ta3": "发现拼写错误;请重试",
"aa1": "正在接收的文件:", //m
}
};

View File

@@ -192,6 +192,7 @@
<h1>partyfuse</h1>
<p>
<a href="{{ r }}/.cpr/a/partyfuse.py">partyfuse.py</a> -- fast, read-only,
needs <a href="{{ r }}/.cpr/deps/fuse.py">fuse.py</a> in the same folder,
<span class="os win">needs <a href="https://winfsp.dev/rel/">winfsp</a></span>
<span class="os lin">doesn't need root</span>
</p>
@@ -208,7 +209,6 @@
{% if args.smb %}
<h1>SMB / CIFS</h1>
<em><a href="https://github.com/SecureAuthCorp/impacket/issues/1433">bug:</a> max ~300 files in each folder</em>
<div class="os win">
<pre>

View File

@@ -69,6 +69,18 @@ html {
top: 2em;
bottom: unset;
}
#toastt {
position: absolute;
height: 1px;
top: 1px;
right: 1%;
width: 99%;
animation: toastt var(--tmtime) steps(var(--tmstep)) forwards;
transform-origin: right;
}
@keyframes toastt {
to {transform: scaleX(0)}
}
#toast a {
color: inherit;
text-shadow: inherit;
@@ -130,6 +142,9 @@ html {
#toast.inf #toastc {
background: #0be;
}
#toast.inf #toastt {
background: #8ef;
}
#toast.ok {
background: #380;
border-color: #8e4;
@@ -137,6 +152,9 @@ html {
#toast.ok #toastc {
background: #8e4;
}
#toast.ok #toastt {
background: #cf9;
}
#toast.warn {
background: #960;
border-color: #fc0;
@@ -144,6 +162,9 @@ html {
#toast.warn #toastc {
background: #fc0;
}
#toast.warn #toastt {
background: #fe9;
}
#toast.err {
background: #900;
border-color: #d06;
@@ -151,6 +172,9 @@ html {
#toast.err #toastc {
background: #d06;
}
#toast.err #toastt {
background: #f9c;
}
#toast code {
padding: 0 .2em;
background: rgba(0,0,0,0.2);
@@ -293,6 +317,12 @@ html.y #tth {
#modalc a {
color: #07b;
}
#modalc .b64 {
display: block;
margin: .1em auto;
width: 60%;
height: 60%;
}
#modalb {
position: sticky;
text-align: right;

View File

@@ -535,6 +535,14 @@ function clgot(el, cls) {
}
function setcvar(k, v) {
try {
document.documentElement.style.setProperty(k, v);
}
catch (e) { }
}
var ANIM = true;
try {
var mq = window.matchMedia('(prefers-reduced-motion: reduce)');
@@ -922,15 +930,18 @@ function shumantime(v, long) {
function lhumantime(v) {
var t = shumantime(v, 1),
tp = t.replace(/([a-z])/g, " $1 ").split(/ /g).slice(0, -1);
var t = shumantime(v, 1);
if (/[0-9]$/.exec(t))
t += 's';
var tp = t.replace(/([a-z])/g, " $1 ").split(/ /g).slice(0, -1);
if (!L || tp.length < 2 || tp[1].indexOf('$') + 1)
return t;
var ret = '';
for (var a = 0; a < tp.length; a += 2)
ret += tp[a] + ' ' + L['ht_' + tp[a + 1]].replace(tp[a] == 1 ? /!.*/ : /!/, '') + L.ht_and;
ret += tp[a] + ' ' + L['ht_' + tp[a + 1] + (tp[a]==1?1:2)] + L.ht_and;
return ret.slice(0, -L.ht_and.length);
}
@@ -1516,13 +1527,21 @@ var toast = (function () {
if (sec)
te = setTimeout(r.hide, sec * 1000);
if (same && delta < 1000)
var tb = ebi('toastt');
if (same && delta < 1000 && tb) {
tb.style.animation = 'none';
tb.offsetHeight;
tb.style.animation = null;
return;
}
if (txt.indexOf('<body>') + 1)
txt = txt.slice(0, txt.indexOf('<')) + ' [...]';
obj.innerHTML = '<a href="#" id="toastc">x</a><div id="toastb">' + lf2br(txt) + '</div>';
setcvar('--tmtime', sec + 's');
setcvar('--tmstep', sec * 15);
obj.innerHTML = '<div id="toastt"></div><a href="#" id="toastc">x</a><div id="toastb">' + lf2br(txt) + '</div>';
obj.className = cl;
sec += obj.offsetWidth;
obj.className += ' vis';

View File

@@ -1,3 +1,112 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-1004-2319 `v1.15.4` hermetic
## 🧪 new features
* [u2c](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy) (commandline uploader):
* remove all dependencies; now entirely self-contained 9daeed92
* made it 3x faster for small files, 2x faster in general
* improve `-x` behavior to not traverse into excluded folders b9c5c7bb
* [partyfuse](https://github.com/9001/copyparty/tree/hovudstraum/bin#partyfusepy) (fuse client; mount a copyparty server as a local filesystem):
* 9x faster directory listings 03f0f994
* 4x faster downloads on high-latency connections 847a2bdc
* embed `fuse.py` (its only dependency) -- can be downloaded from the connect-page 44f2b63e
* support mounting nginx and iis servers too, not just copyparty c81e8984
* reduce ram usage down to 10% when running without `-e2d` 88a1c5ca
* does not affect servers with `-e2d` enabled (was already optimal)
* share folders as qr-codes e4542064
* when creating a share, you get a qr-code for quick access
* buttons in the shares controlpanel to reshow it, optionally with the password embedded into the qr-code
* #98 read embedded webdeps and templates with `pkg_resources`; thx @shizmob! a462a644 d866841c (the loading fallback is sketched right after this list)
* [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) now runs straight from the source file without unpacking anything to disk
* ...and is now much slower at returning resource GETs, but that is fine
* og / opengraph / discord embeds: support filekeys ae982006
* add option for natural sorting; thx @oshiteku! 9804f25d
* eyecandy timer bar on toasts 0dfe1d5b
* smb-server: impacket 0.12 is out! dc4d0d8e
* now *possible* to list folders with more than 400 files (it's REALLY slow)
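as a minimal sketch of the loading fallback (the helper name `open_resource` and its parameters are made up for illustration; the real `load_resource` in util.py, shown further down in this diff, additionally supports the `importlib_resources` backport, `pkg_resources`, lookup caching, and text-mode decoding):
```py
import os
import sys

try:
    if sys.version_info < (3, 10):
        # py3.8 doesn't have .files, py3.9 has broken .is_file
        raise ImportError()
    import importlib.resources as impresources
except ImportError:
    impresources = None

def open_resource(pkg_name: str, name: str, fs_root: str):
    # prefer reading straight out of the package, which also
    # works when running from an unextracted copyparty.pyz
    if impresources:
        res = impresources.files(sys.modules[pkg_name]).joinpath(name)
        if res.is_file():
            return res.open("rb")
    # otherwise fall back to the unpacked files on disk
    return open(os.path.join(fs_root, name), "rb")
```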
## 🩹 bugfixes
* webdav:
* support `<allprop/>` in propfind dc157fa2
* list volumes when root is unmapped 480ac254
* previously, clients couldn't connect to the root of a copyparty server unless a volume existed at `/`
* #101 show `.prologue.html` and `.epilogue.html` in directory listings even if user cannot see hidden files 21be82ef
* #100 confusing toast when pressing F2 without selecting anything 2715ee6c
* fix prometheus metrics 678675a9
## 🔧 other changes
* #100 allow uploading `.prologue.html` and `.epilogue.html` 19a5985f
* #102 make translation easier when running in docker
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0916-0107 `v1.15.3` incoming eta
## 🧪 new features
![cpanel-upload-eta2-or8](https://github.com/user-attachments/assets/eb003bd4-da3c-4995-bf6e-3a8c1c1b26dd)
* incoming uploads (and their ETA) are shown in the controlpanel 609c5921 844194ee
* list total directory sizes 427597b6
* show the total size and number of files of each directory in listings
* makes browsing a bit slower (up to 30%) so it can be disabled with `--no-dirsz`
* sizes are calculated during startup, so it requires `-e2dsa`
* file-uploads will recalculate the sizes immediately, but a full rescan is necessary to see changes caused by moves/deletes (the size bookkeeping is sketched after this list)
* optimizations;
* reduce broker overhead when multiprocessing is disabled 4e75534e
* should reduce cpu usage by uploads, thumbnails, prometheus metrics
* reduce cpu usage from downloading thumbnails 7d64879b
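the bookkeeping is a new sqlite table `ds`; the schema and queries below are taken from the up2k.py diff in this changeset, wrapped in a hypothetical standalone helper to show how a single upload rolls up into every parent folder:
```py
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table ds (rd text, sz int, nf int)")
db.execute("create index ds_rd on ds(rd)")

def account_upload(rd: str, sz: int) -> None:
    # bump filecount and size for the folder itself, then for each
    # of its parents, terminating after the volume root (rd == "")
    q = "update ds set nf=nf+1, sz=sz+? where rd=?"
    q2 = "insert into ds values(?,?,1)"
    while True:
        if not db.execute(q, (sz, rd)).rowcount:
            db.execute(q2, (rd, sz))
        if not rd:
            break
        rd = rd.rsplit("/", 1)[0] if "/" in rd else ""

account_upload("music/flac", 1024)
print(db.execute("select * from ds order by rd").fetchall())
# [('', 1024, 1), ('music', 1024, 1), ('music/flac', 1024, 1)]
```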
## 🩹 bugfixes
* fix sqlite indexes d67e9cc5
* upload handshakes would get exponentially slow if a volume has more than 200'000 files
* reindex on startup can be 150x faster in some rare cases (same filename in MANY folders)
* the database is now around 10% larger (likely worst-case)
* misc ux: 58835b2b
* shares: show media tags
* html hydrator assumed a folder named `foo.txt` was a doc
* due to sessions, use `pwd` as password placeholder on services
## 🔧 other changes
* add [example](https://github.com/9001/copyparty/tree/hovudstraum/contrib#flameshotsh) for uploading screenshots from linux with flameshot 1c2acdc9
* [nginx example](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nginx/copyparty.conf): use unix-sockets for higher performance a5ce1032
* #97 chinese translation was improved, thx again @ultwcz 7a573caf
## 🗿 known issues
* prometheus metrics are busted
* **workaround:** disable monitoring of volume status with `--nos-vst`
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0909-2343 `v1.15.1` session
<img src="https://github.com/9001/copyparty/raw/hovudstraum/docs/logo.svg" width="250" align="right"/>
blessed by ⑨, this release is [certified strong](https://github.com/user-attachments/assets/05459032-736c-4b9a-9ade-a0044461194a) ([artist](https://x.com/hcnone))
## new features
* login sessions b5405174
* a random session cookie is generated for each known user, replacing the previous plaintext login cookie (a rough sketch follows this list)
* the logout button will nuke the session on all clients where that user is logged in
* the sessions are stored in the database at `--ses-db`, default `~/.config/copyparty/sessions.db` (docker uses `/cfg/sessions.db` similar to the other runtime configs)
* if you run multiple copyparty instances, much like [shares](https://github.com/9001/copyparty#shares) and [user-changeable passwords](https://github.com/9001/copyparty#user-changeable-passwords), you'll want to keep a separate db for each instance
* can be mostly disabled with `--no-ses` when it turns out to be buggy
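roughly the idea, as a minimal sketch -- the table layout and function names here are illustrative assumptions, not copyparty's actual code:
```py
import secrets
import sqlite3

ses_db = sqlite3.connect("sessions.db")  # real location is set by --ses-db
ses_db.execute("create table if not exists us (un text, si text)")

def get_session(username: str) -> str:
    # each known user gets one random token, shared across clients,
    # handed out as the session cookie instead of the plaintext password
    row = ses_db.execute("select si from us where un=?", (username,)).fetchone()
    if row:
        return row[0]
    token = secrets.token_urlsafe(24)
    ses_db.execute("insert into us values (?,?)", (username, token))
    ses_db.commit()
    return token

def logout(username: str) -> None:
    # deleting the row invalidates the session on every client at once
    ses_db.execute("delete from us where un=?", (username,))
    ses_db.commit()
```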
## bugfixes
* v1.13.8 broke the u2c `--ow` option to replace/overwrite files on the server during upload 6eee6015
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2024-0908-1925 `v1.15.0` fill the drives

View File

@@ -255,6 +255,9 @@ cat copyparty/httpcli.py | awk '/^[^a-zA-Z0-9]+def / {printf "%s\n%s\n\n", f, pl
# create a folder with symlinks to big files
for d in /usr /var; do find $d -type f -size +30M 2>/dev/null; done | while IFS= read -r x; do ln -s "$x" big/; done
# up2k worst-case testfiles: create 64 GiB (256 x 256 MiB) of sparse files; each file takes 1 MiB disk space; each 1 MiB chunk is globally unique
for f in {0..255}; do echo $f; truncate -s 256M $f; b1=$(printf '%02x' $f); for o in {0..255}; do b2=$(printf '%02x' $o); printf "\x$b1\x$b2" | dd of=$f bs=2 seek=$((o*1024*1024)) conv=notrunc 2>/dev/null; done; done
# py2 on osx
brew install python@2
pip install virtualenv

View File

@@ -63,9 +63,28 @@ add your own translations by using the english or norwegian one from `browser.js
the easy way is to open up and modify `browser.js` in your own installation; depending on how you installed copyparty it might be named `browser.js.gz` instead, in which case just decompress it, restart copyparty, and start editing it anyway
you will be delighted to see inline html in the translation strings; to help prevent syntax errors, there is [a very jank linux script](https://github.com/9001/copyparty/blob/hovudstraum/scripts/tlcheck.sh) which is slightly better than nothing -- just beware of false positives; even if it complains, your translation is not necessarily wrong/bad
if you're running `copyparty-sfx.py` then you'll find it at `/tmp/pe-copyparty.1000/copyparty/web` (on linux) or `%TEMP%\pe-copyparty\copyparty\web` (on windows)
* make sure to keep backups of your work religiously, since that location is volatile af
if editing `browser.js` is inconvenient in your setup then you can instead do this:
* add your translation to a separate javascript file (`tl.js`) and make it load before `browser.js` with the help of `--html-head='<script src="/tl.js"></script>'`
* as the page loads, `browser.js` will look for a function named `langmod`; define that function and make it insert your translation into the `Ls` and `LANGS` variables so it takes effect
## translations (docker-friendly)
if editing `browser.js` is inconvenient in your setup, for example if you're running in docker, then you can instead do this:
* if you have python, go to the `scripts` folder and run `./tl.py fra Français` to generate a `tl.js` which is perfect for translating to French, using the three-letter language code `fra`
* if you do not have python, you can also just grab `tl.js` from the scripts folder, but I'll probably forget to keep that up to date... and then you'll have to find/replace all `"eng"` and `Ls.eng` to your three-letter language code
* put your `tl.js` inside a folder that is being shared by your copyparty, preferably the webroot
* run copyparty with the argument `--html-head='<script src="/tl.js"></script>'`
* if you placed `tl.js` in the webroot then you're all good, but if you put it somewhere else then change `/tl.js` accordingly
* if you are running copyparty with config files, you can do this:
```yaml
[global]
html-head: <script src="/tl.js"></script>
```
you can now edit `tl.js` and press CTRL-SHIFT-R in the browser to see your changes take effect as you go
if you want to contribute your translation back to the project (please do!) then you'll want to...
* grab all of the text inside your `var tl_cpanel = {` and add it to the translations inside `copyparty/web/splash.js` in the repo
* and the text inside your `var tl_browser = {` and add that to the translations inside `copyparty/web/browser.js` in the repo

View File

@@ -3,7 +3,7 @@ WORKDIR /z
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
ver_hashwasm=4.10.0 \
ver_marked=4.3.0 \
ver_dompf=3.1.6 \
ver_dompf=3.1.7 \
ver_mde=2.18.0 \
ver_codemirror=5.65.16 \
ver_fontawesome=5.13.0 \
@@ -37,6 +37,7 @@ RUN mkdir -p /z/dist/no-pk \
&& wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
&& wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
&& wget https://github.com/PrismJS/prism/archive/refs/tags/v$ver_prism.tar.gz -O prism.tgz \
&& wget https://files.pythonhosted.org/packages/04/0b/4506cb2e831cea4b0214d3625430e921faaa05a7fb520458c75a2dbd2152/fusepy-3.0.1.tar.gz -O fusepy.tgz \
&& (mkdir hash-wasm \
&& cd hash-wasm \
&& unzip ../hash-wasm.zip) \
@@ -56,6 +57,7 @@ RUN mkdir -p /z/dist/no-pk \
&& npm i gulp-cli -g ) \
&& tar -xf dompurify.tgz \
&& tar -xf prism.tgz \
&& tar -xf fusepy.tgz \
&& unzip fontawesome.zip \
&& tar -xf zopfli.tgz
@@ -158,6 +160,18 @@ RUN cd /z/dist \
&& rmdir no-pk
# build fusepy
COPY uncomment.py /z
RUN mv /z/fusepy-3.0.1/fuse.py /z/dist/f1 \
&& cd /z/dist \
&& python3 /z/uncomment.py f1 \
&& sed -ri '/self.__critical_exception = e/d' f1 \
&& awk '/^log =/{s=0} !s; /^from traceback im/{s=1;print"from functools import partial";print"basestring = str"}' <f1 >f2 \
&& awk '/LoggingMixIn:/{exit} --s<0;/self.use_ns = getattr/{s=7}' <f2 >f1 \
&& awk "/if _machine =/{s=0} /'(mips|ppc|ppc64)'/{s=1} !s" <f1 >f2 \
&& rm f1 && mv f2 fuse.py
# git diff -U2 --no-index marked-1.1.0-orig/ marked-1.1.0-edit/ -U2 | sed -r '/^index /d;s`^(diff --git a/)[^/]+/(.* b/)[^/]+/`\1\2`; s`^(---|\+\+\+) ([ab]/)[^/]+/`\1 \2`' > ../dev/copyparty/scripts/deps-docker/marked-ln.patch
# d=/home/ed/dev/copyparty/scripts/deps-docker/; tar -cf ../x . && ssh root@$bip "cd $d && tar -xv >&2 && make >&2 && tar -cC ../../copyparty/web deps" <../x | (cd ../../copyparty/web/; cat > the.tgz; tar -xvf the.tgz; rm the.tgz)
# gzip -dkf ../dev/copyparty/copyparty/web/deps/deps/marked.full.js.gz && diff -NarU2 ../dev/copyparty/copyparty/web/deps/{,deps/}marked.full.js

View File

@@ -4,8 +4,10 @@ vend := $(self)/../../copyparty/web/deps
# prefers podman-docker (optionally rootless) over actual docker/moby
all:
cp -pv ../uncomment.py .
docker build -t build-copyparty-deps .
rm -rf $(vend)
mkdir $(vend)

View File

@@ -11,6 +11,15 @@ gtar=$(command -v gtar || command -v gnutar) || true
realpath() { grealpath "$@"; }
}
tmv() {
touch -r "$1" t
mv t "$1"
}
ised() {
sed -r "$1" <"$2" >t
tmv "$2"
}
targs=(--owner=1000 --group=1000)
[ "$OSTYPE" = msys ] &&
targs=()
@@ -35,6 +44,12 @@ cd pyz
cp -pR ../sfx/{copyparty,partftpy} .
cp -pR ../sfx/{ftp,j2}/* .
true && {
rm -rf copyparty/web/mde.* copyparty/web/deps/easymde*
echo h > copyparty/web/mde.html
ised '/edit2">edit \(fancy/d' copyparty/web/md.html
}
ts=$(date -u +%s)
hts=$(date -u +%Y-%m%d-%H%M%S)
ver="$(cat ../sfx/ver)"
@@ -42,12 +57,6 @@ ver="$(cat ../sfx/ver)"
mkdir -p ../dist
pyz_out=../dist/copyparty.pyz
echo creating z.tar
( cd copyparty
tar -cf z.tar "${targs[@]}" --numeric-owner web res
rm -rf web res
)
echo creating loader
sed -r 's/^(VER = ).*/\1"'"$ver"'"/; s/^(STAMP = ).*/\1'$(date +%s)/ \
<../scripts/ziploader.py \

View File

@@ -27,14 +27,14 @@ help() { exec cat <<'EOF'
#
# `no-ftp` saves ~30k by removing the ftp server, disabling --ftp
#
# `no-pf` saves ~12k by removing the option to download partyfuse
#
# `no-tfp` saves ~10k by removing the tftp server, disabling --tftp
#
# `no-zm` saves ~7k by removing the zeroconf mDNS server
#
# `no-smb` saves ~3.5k by removing the smb / cifs server
#
# `no-pf` saves ~2.8k by removing the option to download partyfuse
#
# _____________________________________________________________________
# web features:
#
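These `no-*` switches are plain positional arguments, so several can be combined in one build; a hypothetical invocation (script path assumed from the repo layout):

./scripts/make-sfx.sh no-ftp no-tfp no-pf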
@@ -109,7 +109,7 @@ pybin=$(command -v python3 || command -v python) || {
langs=
use_gz=
zopf=2560
zopf=2000
while [ ! -z "$1" ]; do
case $1 in
clean) clean=1 ; ;;
@@ -146,6 +146,13 @@ ised() {
sed -r "$1" <"$2" >t
tmv "$2"
}
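# dlf: download into $f with fallback; keep an existing non-empty $f,
# try wget first, then curl, and delete the partial file + abort on failure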
dlf() {
[ -s "$f" ] && return 0
wget -O "$f" "$1" && return 0
curl -L "$1" >"$f" && return 0
rm -f "$f"
exit 1
}
stamp=$(
for d in copyparty scripts; do
@@ -178,8 +185,7 @@ necho() {
necho collecting ipaddress
f="../build/ipaddress-1.0.23.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz
tar -zxf $f
mkdir py37
@@ -189,8 +195,7 @@ necho() {
necho collecting jinja2
f="../build/Jinja2-2.11.3.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/4f/e7/65300e6b32e69768ded990494809106f87da1d436418d5f1367ed3966fd7/Jinja2-2.11.3.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/4f/e7/65300e6b32e69768ded990494809106f87da1d436418d5f1367ed3966fd7/Jinja2-2.11.3.tar.gz
tar -zxf $f
mv Jinja2-*/src/jinja2 .
@@ -199,8 +204,7 @@ necho() {
necho collecting markupsafe
f="../build/MarkupSafe-1.1.1.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz
tar -zxf $f
mv MarkupSafe-*/src/markupsafe .
@@ -212,8 +216,7 @@ necho() {
necho collecting pyftpdlib
f="../build/pyftpdlib-1.5.10.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/cf/31/8d910cf40317dd0db74ba0b8558d0dee23c8b002468c14d3a5dec0e6e9fd/pyftpdlib-1.5.10.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/cf/31/8d910cf40317dd0db74ba0b8558d0dee23c8b002468c14d3a5dec0e6e9fd/pyftpdlib-1.5.10.tar.gz
tar -zxf $f
mv pyftpdlib-*/pyftpdlib .
@@ -229,8 +232,7 @@ necho() {
necho collecting partftpy
f="../build/partftpy-0.4.0.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/8c/96/642bb3ddcb07a2c6764eb29aa562d1cf56877ad6c330c3c8921a5f05606d/partftpy-0.4.0.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/8c/96/642bb3ddcb07a2c6764eb29aa562d1cf56877ad6c330c3c8921a5f05606d/partftpy-0.4.0.tar.gz
tar -zxf $f
mv partftpy-*/partftpy .
@@ -240,8 +242,7 @@ necho() {
v=0.4.27
f="../build/python-magic-$v.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/da/db/0b3e28ac047452d079d375ec6798bf76a036a08182dbb39ed38116a49130/python-magic-0.4.27.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/da/db/0b3e28ac047452d079d375ec6798bf76a036a08182dbb39ed38116a49130/python-magic-0.4.27.tar.gz
tar -zxf $f
mkdir magic
@@ -256,8 +257,7 @@ necho() {
necho collecting strip-hints
f=../build/strip-hints-0.1.10.tar.gz
[ -e $f ] ||
(url=https://files.pythonhosted.org/packages/9c/d4/312ddce71ee10f7e0ab762afc027e07a918f1c0e1be5b0069db5b0e7542d/strip-hints-0.1.10.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)
dlf https://files.pythonhosted.org/packages/9c/d4/312ddce71ee10f7e0ab762afc027e07a918f1c0e1be5b0069db5b0e7542d/strip-hints-0.1.10.tar.gz
tar -zxf $f
mv strip-hints-0.1.10/src/strip_hints .
@@ -315,7 +315,7 @@ necho() {
[ ! -e copyparty/web/deps/mini-fa.woff ] && [ $dl_wd ] && {
echo "could not find webdeps; downloading..."
url=https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
wget -Ox.py "$url" || curl -L "$url" >x.py
f=x.py; rm -f $f; dlf $url
echo "extracting webdeps..."
wdsrc="$("$pybin" x.py --version 2>&1 | tee /dev/stderr | awk '/sfxdir:/{sub(/.*: /,"");print;exit}')"
@@ -437,7 +437,7 @@ rm -f ftp/pyftpdlib/{__main__,prefork}.py
rm -rf copyparty/mdns.py copyparty/stolen/dnslib
[ $no_pf ] &&
rm -rf copyparty/web/a/partyfuse.py
rm -rf copyparty/web/a/partyfuse.py copyparty/web/deps/fuse.py
[ $no_cm ] && {
rm -rf copyparty/web/mde.* copyparty/web/deps/easymde*
@@ -492,8 +492,8 @@ iawk '/^def /{s=0}/^def generate_lorem_ipsum/{s=1}!s' j2/jinja2/utils.py
iawk '/^(class|def) /{s=0}/^(class InternationalizationExtension|def _make_new_n?gettext)/{s=1}!s' j2/jinja2/ext.py
iawk '/^[^ ]/{s=0}/^def babel_extract/{s=1}!s' j2/jinja2/ext.py
ised '/InternationalizationExtension/d' j2/jinja2/ext.py
iawk '/^class/{s=0}/^class (Package|Dict|Function|Prefix|Choice|Module)Loader/{s=1}!s' j2/jinja2/loaders.py
sed -ri '/^from .bccache | (Package|Dict|Function|Prefix|Choice|Module)Loader$/d' j2/jinja2/__init__.py
iawk '/^class/{s=0}/^class (Package|Dict|Prefix|Choice|Module)Loader/{s=1}!s' j2/jinja2/loaders.py
sed -ri '/^from .bccache | (Package|Dict|Prefix|Choice|Module)Loader$/d' j2/jinja2/__init__.py
rm -f j2/jinja2/async* j2/jinja2/{bccache,sandbox}.py
cat > j2/jinja2/_identifier.py <<'EOF'
import re

@@ -77,11 +77,14 @@ excl=(
email._header_value_parser
email.header
email.parser
importlib.resources
importlib_resources
inspect
multiprocessing
packaging
pdb
pickle
pkg_resources
PIL.EpsImagePlugin
pyftpdlib.prefork
urllib.request

@@ -11,14 +11,10 @@ ckpypi() {
pyinstaller
pyinstaller-hooks-contrib
pywin32-ctypes
certifi
charset_normalizer
idna
Jinja2
MarkupSafe
mutagen
Pillow
requests
)
for dep in "${deps[@]}"; do
k=

@@ -4,12 +4,6 @@ f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f8
749a473646c6d4c7939989649733d4c7699fd1c359c27046bf5bc9c070d1a4b8b986bbc65f60d7da725baf16dbfdd75a4c2f5bb8335f2cb5685073f5fee5c2d1 pywin32_ctypes-0.2.2-py3-none-any.whl
085d39ef4426aa5f097fbc484595becc16e61ca23fc7da4d2a8bba540a3b82e789e390b176c7151bdc67d01735cce22b1562cdb2e31273225a2d3e275851a4ad setuptools-70.3.0-py3-none-any.whl
360a141928f4a7ec18a994602cbb28bbf8b5cc7c077a06ac76b54b12fa769ed95ca0333a5cf728923a8e0baeb5cc4d5e73e5b3de2666beb05eb477d8ae719093 upx-4.2.4-win32.zip
# u2c (win7)
7a3bd4849f95e1715fe2e99613df70a0fedd944a9bfde71a0fadb837fe62c3431c30da4f0b75c74de6f1a459f1fdf7cb62eaf404fdbe45e2d121e0b1021f1580 certifi-2024.2.2-py3-none-any.whl
9cc8acc5e269e6421bc32bb89261101da29d6ca337d39d60b9106de9ed7904e188716e4a48d78a2c4329026443fcab7acab013d2fe43778e30d6c4e4506a1b91 charset_normalizer-3.3.2-cp37-cp37m-win32.whl
0ec1ae5c928b4a0001a254c8598b746049406e1eed720bfafa94d4474078eff76bf6e032124e2d4df4619052836523af36162443c6d746487b387d2e3476e691 idna-3.6-py3-none-any.whl
cc08d0d87d184401872a2f82266d589253979b4cd02f23b51290fbb2a20082848fc72acbed8aacb74ac4af068d575ef96e66196c5068bc38fb0bcafdc7626869 requests-2.29.0-py3-none-any.whl
fe5fee6cb8a2c68800b32353a0015e5d2e1ad1cb6e0c9e6acf86e48e5cdb5606ad465dc4485ea5fbc8701d8716a8a7f7148c57724ef9da26b0c0a76f6dbbd698 urllib3-1.26.19-py2.py3-none-any.whl
# win7
3253e86471e6f9fa85bfdb7684cd2f964ed6e35c6a4db87f81cca157c049bef43e66dfcae1e037b2fb904567b1e028aaeefe8983ba3255105df787406d2aa71e en_windows_7_professional_with_sp1_x86_dvd_u_677056.iso
ab0db0283f61a5bbe44797d74546786bf41685175764a448d2e3bd629f292f1e7d829757b26be346b5044d78c9c1891736d93237cee4b1b6f5996a902c86d15f en_windows_7_professional_with_sp1_x64_dvd_u_676939.iso
@@ -36,4 +30,4 @@ d1420c8417fad7888766dd26b9706a87c63e8f33dceeb8e26d0056d5127b0b3ed9272e44b4b76113
2be320b4191f208cdd6af183c77ba2cf460ea52164ee45ac3ff17d6dfa57acd9deff016636c2dd42a21f4f6af977d5f72df7dacf599bebcf41757272354d14c1 pillow-10.4.0-cp312-cp312-win_amd64.whl
896ddddbd4b85e86e0600cb65eb4c07fbc7f3802d47e7f660411e20b5500831469b97ed4770f25820f4e75cbfac40308da624fd86d4f62e578149d5c276a9cde pyinstaller-6.10.0-py3-none-win_amd64.whl
873781decaeef07f6a79b0ed8b9f35f3fa534a1ea0d866991e40278a10818fa5b60c70b0d5828971b045364f1099694cd1e5d5d60d480acb93fcfbfbced4a09e pyinstaller_hooks_contrib-2024.8-py3-none-any.whl
37fa7250b10b0c03b87d800bf4f920589649309cb4fbd25864475084bb7873d62b809a4fdeabd06c79f03f33614218eb7e01a9bd796de29dd3b141f1906d588c python-3.12.6-amd64.exe
912b710007c7b29f29c0097aff8f825412166eed7777a7cef135b14316e8fff31b5df56d26d835d8ca090468cc0e914730f201a56caa3dd6dbef2f91088942b1 python-3.12.7-amd64.exe

@@ -13,13 +13,6 @@ https://pypi.org/project/MarkupSafe/#files
https://pypi.org/project/mutagen/#files
https://pypi.org/project/Pillow/#files
# u2c (win7) additionals
https://pypi.org/project/certifi/#files
https://pypi.org/project/charset-normalizer/#files # cp37-cp37m-win32.whl
https://pypi.org/project/idna/#files
https://pypi.org/project/requests/#files
https://pypi.org/project/urllib3/#files
# win7 additionals
https://pypi.org/project/future/#files
https://pypi.org/project/importlib-metadata/#files

@@ -41,14 +41,7 @@ fns=(
pillow-10.4.0-cp312-cp312-win_amd64.whl
pyinstaller-6.10.0-py3-none-win_amd64.whl
pyinstaller_hooks_contrib-2024.8-py3-none-any.whl
python-3.12.6-amd64.exe
)
[ $w7 ] && fns+=( # u2c stuff
certifi-2024.2.2-py3-none-any.whl
charset_normalizer-3.3.2-cp37-cp37m-win32.whl
idna-3.6-py3-none-any.whl
requests-2.29.0-py3-none-any.whl
urllib3-1.26.19-py2.py3-none-any.whl
python-3.12.7-amd64.exe
)
[ $w7 ] && fns+=(
future-1.0.0-py3-none-any.whl
@@ -96,12 +89,11 @@ python -m ensurepip &&
{ [ $w10 ] || python -m pip install --user -U pip-*.whl; } &&
python -m pip install --user -U packaging-*.whl &&
{ [ $w7 ] || python -m pip install --user -U {setuptools,mutagen,pillow,jinja2,MarkupSafe}-*.whl; } &&
{ [ $w10 ] || python -m pip install --user -U {requests,urllib3,charset_normalizer,certifi,idna}-*.whl; } &&
{ [ $w10 ] || python -m pip install --user -U future-*.whl importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl &&
sed -ri 's/--lzma/--best/' $appd/Python/Python$pyv/site-packages/pyinstaller/building/utils.py &&
curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&
python uncomment.py 1 $(for d in $appd/Python/Python$pyv/site-packages/{requests,urllib3,charset_normalizer,certifi,idna,mutagen,PIL,jinja2,markupsafe}; do find $d -name \*.py; done) &&
python uncomment.py 1 $(for d in $appd/Python/Python$pyv/site-packages/{mutagen,PIL,jinja2,markupsafe}; do find $d -name \*.py; done) &&
cd &&
rm -f build.sh &&
curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/build.sh &&

@@ -19,25 +19,12 @@ dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.ico
dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.rc
dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.spec
# $LOCALAPPDATA/programs/python/python37-32/python -m pip install --user -U pyinstaller requests
# $LOCALAPPDATA/programs/python/python37-32/python -m pip install --user -U pyinstaller
grep -E '^from .ssl_ import' $APPDATA/python/python37/site-packages/urllib3/util/proxy.py && {
echo golfing
echo > $APPDATA/python/python37/site-packages/requests/certs.py
sed -ri 's/^(DEFAULT_CA_BUNDLE_PATH = ).*/\1""/' $APPDATA/python/python37/site-packages/requests/utils.py
sed -ri '/^import zipfile$/d' $APPDATA/python/python37/site-packages/requests/utils.py
sed -ri 's/"idna"//' $APPDATA/python/python37/site-packages/requests/packages.py
sed -ri 's/import charset_normalizer.*/pass/' $APPDATA/python/python37/site-packages/requests/compat.py
sed -ri 's/raise.*charset_normalizer.*/pass/' $APPDATA/python/python37/site-packages/requests/__init__.py
sed -ri 's/import charset_normalizer.*//' $APPDATA/python/python37/site-packages/requests/packages.py
sed -ri 's/chardet.__name__/"\\roll\\tide"/' $APPDATA/python/python37/site-packages/requests/packages.py
sed -ri 's/chardet,//' $APPDATA/python/python37/site-packages/requests/models.py
for n in util/__init__.py connection.py; do awk -i inplace '/^from (\.util)?\.ssl_ /{s=1} !s; /^\)/{s=0}' $APPDATA/python/python37/site-packages/urllib3/$n; done
sed -ri 's/^from .ssl_ import .*//' $APPDATA/python/python37/site-packages/urllib3/util/proxy.py
echo golfed
}
sed -ri 's/^(import .*), selectors$/\1\ntry: import selectors\nexcept: pass/' $LOCALAPPDATA/programs/python/python37-32/Lib/socket.py
sed -ri 's/(add_argument."-t[de]",.*help=")[^"]+/\1not applicable; HTTPS is disabled in this exe/; s/for some reason/in this exe for safety reasons/' u2c.py
sed -ri '/^import platform/d;s/^(VT100 = )pla.*/\1False/' u2c.py
read a b _ < <(awk -F\" '/^S_VERSION =/{$0=$2;sub(/\./," ");print}' < u2c.py)
sed -r 's/1,2,3,0/'$a,$b,0,0'/;s/1\.2\.3/'$a.$b.0/ <up2k.rc >up2k.rc2

@@ -14,22 +14,21 @@ a = Analysis(
hooksconfig={},
runtime_hooks=[],
excludes=[
'bz2',
'ftplib',
'getpass',
'lzma',
'pickle',
'platform',
'selectors',
'ssl',
'subprocess',
'tarfile',
'bz2',
'zipfile',
'tempfile',
'tracemalloc',
'typing',
'zipfile',
'zlib',
'urllib3.util.ssl_',
'urllib3.contrib.pyopenssl',
'urllib3.contrib.socks',
'certifi',
'idna',
'chardet',
'charset_normalizer',
'email.contentmanager',
'email.policy',
'encodings.zlib_codec',
@@ -40,6 +39,8 @@ a = Analysis(
'encodings.palmos',
'encodings.punycode',
'encodings.rot_13',
'urllib.response',
'urllib.robotparser',
],
win_no_prefer_redirects=False,
win_private_assemblies=False,

@@ -6,7 +6,6 @@ set -e
ex=(
ftplib lzma pickle ssl tarfile bz2 zipfile tracemalloc zlib
urllib3.util.ssl_ urllib3.contrib.pyopenssl urllib3.contrib.socks certifi idna chardet charset_normalizer
email.contentmanager email.policy
encodings.{zlib_codec,base64_codec,bz2_codec,charmap,hex_codec,palmos,punycode,rot_13}
);

@@ -84,6 +84,7 @@ copyparty/web/deps/__init__.py,
copyparty/web/deps/busy.mp3,
copyparty/web/deps/easymde.css,
copyparty/web/deps/easymde.js,
copyparty/web/deps/fuse.py,
copyparty/web/deps/marked.js,
copyparty/web/deps/mini-fa.css,
copyparty/web/deps/mini-fa.woff,

@@ -61,15 +61,29 @@ def uh2(fp):
# remove expensive imports too
lns = []
on = True
on2 = True
for ln in cs.split("\n"):
if ln.startswith("if True:"):
on = False
continue
if ln.endswith("# !rm.yes>"):
on2 = False
continue
if not on2:
if ln.endswith("# !rm.no>"):
on2 = True
continue
if not on and (not ln.strip() or ln.startswith(" ")):
continue
on = True
if " # !rm" in ln:
continue
lns.append(ln)
cs = "\n".join(lns)
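A minimal sketch of the marker syntax this pass recognizes (the lines are hypothetical; the marker strings come from the code above):

x = debug_info()  # !rm  <- this single line is dropped
# !rm.yes>
import pdb  # everything in this region is dropped,
pdb.set_trace()  # marker lines included
# !rm.no>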

@@ -98,6 +98,7 @@ def tc1(vflags):
args = [
"-q",
"-j0",
"-p4321",
"-e2dsa",
"-e2tsr",

@@ -3,7 +3,7 @@ set -ex
# PYTHONPATH=.:~/dev/partftpy/ taskset -c 0 python3 -m copyparty -v srv::r -v srv/junk:junk:A --tftp 3969
get_src=~/dev/copyparty/srv/ro/palette.flac
get_src=~/dev/copyparty/srv/ro/palette.flac # tftpwa...
get_fp=ro/${get_src##*/} # server url
get_fn=${get_fp##*/} # just filename

scripts/tl.js (new file, 657 lines)

@@ -0,0 +1,657 @@
"use strict";
// NOTE: please use tabs for indenting, not spaces :-)
// NOTE: since you are using the tl.js straight from the repo, you'll
// need to find/replace all "eng" with your own three-letter name
// the three-letter name of the language you're translating to
var my_lang = "eng";
// and because we don't know these yet...
var SR='', wah='';
// this function is automatically called when the page is loaded:
function langmod() {
// which page is the javascript currently running on?
// look for some well-known elements to figure it out:
if (QS("h1#cc") && QS("a#k")) {
// we are running in the control-panel
Ls[my_lang] = tl_cpanel[my_lang];
}
else if (ebi("op_cfg")) {
// we are running in the filebrowser
Ls[my_lang] = tl_browser[my_lang];
// inform the settings tab that a new lang is available
LANGS.push(my_lang);
}
}
////////////////////////////////////////////////////////////////////////
// below this point is where the actual translation happens;
// here are the pairs of "text-identifier": "text-to-translate"
////////////////////////////////////////////////////////////////////////
// translation of splash.js (the control-panel);
// you do not need to translate the TLNotes, those are just for you :-)
var tl_cpanel = {
"eng": {
"a1": "refresh",
"b1": "howdy stranger &nbsp; <small>(you're not logged in)</small>",
"c1": "logout",
"d1": "dump stack", // TLNote: "d2" is the tooltip for this button
"d2": "shows the state of all active threads",
"e1": "reload cfg",
"e2": "reload config files (accounts/volumes/volflags),$Nand rescan all e2ds volumes$N$Nnote: any changes to global settings$Nrequire a full restart to take effect",
"f1": "you can browse:",
"g1": "you can upload to:",
"cc1": "other stuff:",
"h1": "disable k304", // TLNote: "j1" explains what k304 is
"i1": "enable k304",
"j1": "enabling this will disconnect your client on every HTTP 304, which can prevent some buggy proxies from getting stuck (suddenly not loading pages), <em>but</em> it will also make things slower in general",
"k1": "reset client settings",
"l1": "login for more:",
"m1": "welcome back,", // TLNote: "welcome back, USERNAME"
"n1": "404 not found &nbsp;┐( ´ -`)┌",
"o1": 'or maybe you don\'t have access -- try a password or <a href="' + SR + '/?h">go home</a>',
"p1": "403 forbiddena &nbsp;~┻━┻",
"q1": 'use a password or <a href="' + SR + '/?h">go home</a>',
"r1": "go home",
".s1": "rescan",
"t1": "action", // TLNote: this is the header above the "rescan" buttons
"u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds",
"v1": "connect",
"v2": "use this server as a local HDD",
"w1": "switch to https",
"x1": "change password",
"y1": "edit shares", // TLNote: shows the list of folders that the user has decided to share
"z1": "unlock this share:", // TLNote: the password prompt to see a hidden share
"ta1": "fill in your new password first",
"ta2": "repeat to confirm new password:",
"ta3": "found a typo; please try again",
"aa1": "incoming files:",
},
};
////////////////////////////////////////////////////////////////////////
// translation of browser.js (the filebrowser):
var tl_browser = {
"eng": {
"tt": "English",
"cols": {
"c": "action buttons",
"dur": "duration",
"q": "quality / bitrate",
"Ac": "audio codec",
"Vc": "video codec",
"Fmt": "format / container",
"Ahash": "audio checksum",
"Vhash": "video checksum",
"Res": "resolution",
"T": "filetype",
"aq": "audio quality / bitrate",
"vq": "video quality / bitrate",
"pixfmt": "subsampling / pixel structure",
"resw": "horizontal resolution",
"resh": "veritcal resolution",
"chs": "audio channels",
"hz": "sample rate"
},
"hks": [
[
"misc",
["ESC", "close various things"],
"file-manager",
["G", "toggle list / grid view"],
["T", "toggle thumbnails / icons"],
["🡅 A/D", "thumbnail size"],
["ctrl-K", "delete selected"],
["ctrl-X", "cut selected"],
["ctrl-V", "paste into folder"],
["Y", "download selected"],
["F2", "rename selected"],
"file-list-sel",
["space", "toggle file selection"],
["🡑/🡓", "move selection cursor"],
["ctrl 🡑/🡓", "move cursor and viewport"],
["🡅 🡑/🡓", "select prev/next file"],
["ctrl-A", "select all files / folders"],
], [
"navigation",
["B", "toggle breadcrumbs / navpane"],
["I/K", "prev/next folder"],
["M", "parent folder (or unexpand current)"],
["V", "toggle folders / textfiles in navpane"],
["A/D", "navpane size"],
], [
"audio-player",
["J/L", "prev/next song"],
["U/O", "skip 10sec back/fwd"],
["0..9", "jump to 0%..90%"],
["P", "play/pause (also initiates)"],
["Y", "download song"],
], [
"image-viewer",
["J/L, ←/→", "prev/next pic"],
["Home/End", "first/last pic"],
["F", "fullscreen"],
["R", "rotate clockwise"],
["🡅 R", "rotate ccw"],
["Y", "download pic"],
], [
"video-player",
["U/O", "skip 10sec back/fwd"],
["P/K/Space", "play/pause"],
["C", "continue playing next"],
["V", "loop"],
["M", "mute"],
["[ and ]", "set loop interval"],
], [
"textfile-viewer",
["I/K", "prev/next file"],
["M", "close textfile"],
["E", "edit textfile"],
["S", "select file (for cut/rename)"],
]
],
"m_ok": "OK",
"m_ng": "Cancel",
"enable": "Enable",
"danger": "DANGER",
"clipped": "copied to clipboard",
"ht_s1": "second",
"ht_s2": "seconds",
"ht_m1": "minute",
"ht_m2": "minutes",
"ht_h1": "hour",
"ht_h2": "hours",
"ht_d1": "day",
"ht_d2": "days",
"ht_and": " and ",
"goh": "control-panel",
"gop": 'previous sibling">prev',
"gou": 'parent folder">up',
"gon": 'next folder">next',
"logout": "Logout ",
"access": " access",
"ot_close": "close submenu",
"ot_search": "search for files by attributes, path / name, music tags, or any combination of those$N$N&lt;code&gt;foo bar&lt;/code&gt; = must contain both «foo» and «bar»,$N&lt;code&gt;foo -bar&lt;/code&gt; = must contain «foo» but not «bar»,$N&lt;code&gt;^yana .opus$&lt;/code&gt; = start with «yana» and be an «opus» file$N&lt;code&gt;&quot;try unite&quot;&lt;/code&gt; = contain exactly «try unite»$N$Nthe date format is iso-8601, like$N&lt;code&gt;2009-12-31&lt;/code&gt; or &lt;code&gt;2020-09-12 23:30:00&lt;/code&gt;",
"ot_unpost": "unpost: delete your recent uploads, or abort unfinished ones",
"ot_bup": "bup: basic uploader, even supports netscape 4.0",
"ot_mkdir": "mkdir: create a new directory",
"ot_md": "new-md: create a new markdown document",
"ot_msg": "msg: send a message to the server log",
"ot_mp": "media player options",
"ot_cfg": "configuration options",
"ot_u2i": 'up2k: upload files (if you have write-access) or toggle into the search-mode to see if they exist somewhere on the server$N$Nuploads are resumable, multithreaded, and file timestamps are preserved, but it uses more CPU than [🎈]&nbsp; (the basic uploader)<br /><br />during uploads, this icon becomes a progress indicator!',
"ot_u2w": 'up2k: upload files with resume support (close your browser and drop the same files in later)$N$Nmultithreaded, and file timestamps are preserved, but it uses more CPU than [🎈]&nbsp; (the basic uploader)<br /><br />during uploads, this icon becomes a progress indicator!',
"ot_noie": 'Please use Chrome / Firefox / Edge',
"ab_mkdir": "make directory",
"ab_mkdoc": "new markdown doc",
"ab_msg": "send msg to srv log",
"ay_path": "skip to folders",
"ay_files": "skip to files",
"wt_ren": "rename selected items$NHotkey: F2",
"wt_del": "delete selected items$NHotkey: ctrl-K",
"wt_cut": "cut selected items &lt;small&gt;(then paste somewhere else)&lt;/small&gt;$NHotkey: ctrl-X",
"wt_pst": "paste a previously cut / copied selection$NHotkey: ctrl-V",
"wt_selall": "select all files$NHotkey: ctrl-A (when file focused)",
"wt_selinv": "invert selection",
"wt_selzip": "download selection as archive",
"wt_seldl": "download selection as separate files$NHotkey: Y",
"wt_npirc": "copy irc-formatted track info",
"wt_nptxt": "copy plaintext track info",
"wt_grid": "toggle grid / list view$NHotkey: G",
"wt_prev": "previous track$NHotkey: J",
"wt_play": "play / pause$NHotkey: P",
"wt_next": "next track$NHotkey: L",
"ul_par": "parallel uploads:",
"ut_rand": "randomize filenames",
"ut_u2ts": "copy the last-modified timestamp$Nfrom your filesystem to the server",
"ut_mt": "continue hashing other files while uploading$N$Nmaybe disable if your CPU or HDD is a bottleneck",
"ut_ask": 'ask for confirmation before upload starts">💭',
"ut_pot": "improve upload speed on slow devices$Nby making the UI less complex",
"ut_srch": "don't actually upload, instead check if the files already $N exist on the server (will scan all folders you can read)",
"ut_par": "pause uploads by setting it to 0$N$Nincrease if your connection is slow / high latency$N$Nkeep it 1 on LAN or if the server HDD is a bottleneck",
"ul_btn": "drop files / folders<br>here (or click me)",
"ul_btnu": "U P L O A D",
"ul_btns": "S E A R C H",
"ul_hash": "hash",
"ul_send": "send",
"ul_done": "done",
"ul_idle1": "no uploads are queued yet",
"ut_etah": "average &lt;em&gt;hashing&lt;/em&gt; speed, and estimated time until finish",
"ut_etau": "average &lt;em&gt;upload&lt;/em&gt; speed and estimated time until finish",
"ut_etat": "average &lt;em&gt;total&lt;/em&gt; speed and estimated time until finish",
"uct_ok": "completed successfully",
"uct_ng": "no-good: failed / rejected / not-found",
"uct_done": "ok and ng combined",
"uct_bz": "hashing or uploading",
"uct_q": "idle, pending",
"utl_name": "filename",
"utl_ulist": "list",
"utl_ucopy": "copy",
"utl_links": "links",
"utl_stat": "status",
"utl_prog": "progress",
// keep short:
"utl_404": "404",
"utl_err": "ERROR",
"utl_oserr": "OS-error",
"utl_found": "found",
"utl_defer": "defer",
"utl_yolo": "YOLO",
"utl_done": "done",
"ul_flagblk": "the files were added to the queue</b><br>however there is a busy up2k in another browser tab,<br>so waiting for that to finish first",
"ul_btnlk": "the server configuration has locked this switch into this state",
"udt_up": "Upload",
"udt_srch": "Search",
"udt_drop": "drop it here",
"u_nav_m": '<h6>aight, what do you have?</h6><code>Enter</code> = Files (one or more)\n<code>ESC</code> = One folder (including subfolders)',
"u_nav_b": '<a href="#" id="modal-ok">Files</a><a href="#" id="modal-ng">One folder</a>',
"cl_opts": "switches",
"cl_themes": "theme",
"cl_langs": "language",
"cl_ziptype": "folder download",
"cl_uopts": "up2k switches",
"cl_favico": "favicon",
"cl_bigdir": "big dirs",
"cl_keytype": "key notation",
"cl_hiddenc": "hidden columns",
"cl_hidec": "hide",
"cl_reset": "reset",
"cl_hpick": "tap on column headers to hide in the table below",
"cl_hcancel": "column hiding aborted",
"ct_grid": '田 the grid',
"ct_ttips": '◔ ◡ ◔"> tooltips',
"ct_thumb": 'in grid-view, toggle icons or thumbnails$NHotkey: T">🖼️ thumbs',
"ct_csel": 'use CTRL and SHIFT for file selection in grid-view">sel',
"ct_ihop": 'when the image viewer is closed, scroll down to the last viewed file">g⮯',
"ct_dots": 'show hidden files (if server permits)">dotfiles',
"ct_dir1st": 'sort folders before files">📁 first',
"ct_nsort": 'natural sort (for filenames with leading digits)">nsort',
"ct_readme": 'show README.md in folder listings">📜 readme',
"ct_idxh": 'show index.html instead of folder listing">htm',
"ct_sbars": 'show scrollbars">⟊',
"cut_umod": "if a file already exists on the server, update the server's last-modified timestamp to match your local file (requires write+delete permissions)\">re📅",
"cut_turbo": "the yolo button, you probably DO NOT want to enable this:$N$Nuse this if you were uploading a huge amount of files and had to restart for some reason, and want to continue the upload ASAP$N$Nthis replaces the hash-check with a simple <em>&quot;does this have the same filesize on the server?&quot;</em> so if the file contents are different it will NOT be uploaded$N$Nyou should turn this off when the upload is done, and then &quot;upload&quot; the same files again to let the client verify them\">turbo",
"cut_datechk": "has no effect unless the turbo button is enabled$N$Nreduces the yolo factor by a tiny amount; checks whether the file timestamps on the server matches yours$N$Nshould <em>theoretically</em> catch most unfinished / corrupted uploads, but is not a substitute for doing a verification pass with turbo disabled afterwards\">date-chk",
"cut_u2sz": "size (in MiB) of each upload chunk; big values fly better across the atlantic. Try low values on very unreliable connections",
"cut_flag": "ensure only one tab is uploading at a time $N -- other tabs must have this enabled too $N -- only affects tabs on the same domain",
"cut_az": "upload files in alphabetical order, rather than smallest-file-first$N$Nalphabetical order can make it easier to eyeball if something went wrong on the server, but it makes uploading slightly slower on fiber / LAN",
"cut_nag": "OS notification when upload completes$N(only if the browser or tab is not active)",
"cut_sfx": "audible alert when upload completes$N(only if the browser or tab is not active)",
"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$N30% faster https, 4.5x faster http,$Nand 5.3x faster on android phones\">mt",
"cft_text": "favicon text (blank and refresh to disable)",
"cft_fg": "foreground color",
"cft_bg": "background color",
"cdt_lim": "max number of files to show in a folder",
"cdt_ask": "when scrolling to the bottom,$Ninstead of loading more files,$Nask what to do",
"tt_entree": "show navpane (directory tree sidebar)$NHotkey: B",
"tt_detree": "show breadcrumbs$NHotkey: B",
"tt_visdir": "scroll to selected folder",
"tt_ftree": "toggle folder-tree / textfiles$NHotkey: V",
"tt_pdock": "show parent folders in a docked pane at the top",
"tt_dynt": "autogrow as tree expands",
"tt_wrap": "word wrap",
"tt_hover": "reveal overflowing lines on hover$N( breaks scrolling unless mouse $N&nbsp; cursor is in the left gutter )",
"ml_pmode": "at end of folder...",
"ml_btns": "cmds",
"ml_tcode": "transcode",
"ml_tint": "tint",
"ml_eq": "audio equalizer",
"ml_drc": "dynamic range compressor",
"mt_shuf": "shuffle the songs in each folder\">🔀",
"mt_aplay": "autoplay if there is a song-ID in the link you clicked to access the server$N$Ndisabling this will also stop the page URL from being updated with song-IDs when playing music, to prevent autoplay if these settings are lost but the URL remains\">a▶",
"mt_preload": "start loading the next song near the end for gapless playback\">preload",
"mt_prescan": "go to the next folder before the last song$Nends, keeping the webbrowser happy$Nso it doesn't stop the playback\">nav",
"mt_fullpre": "try to preload the entire song;$N✅ enable on <b>unreliable</b> connections,$N❌ <b>disable</b> on slow connections probably\">full",
"mt_fau": "on phones, prevent music from stopping if the next song doesn't preload fast enough (can make tags display glitchy)\">☕️",
"mt_waves": "waveform seekbar:$Nshow audio amplitude in the scrubber\">~s",
"mt_npclip": "show buttons for clipboarding the currently playing song\">/np",
"mt_octl": "os integration (media hotkeys / osd)\">os-ctl",
"mt_oseek": "allow seeking through os integration$N$Nnote: on some devices (iPhones),$Nthis replaces the next-song button\">seek",
"mt_oscv": "show album cover in osd\">art",
"mt_follow": "keep the playing track scrolled into view\">🎯",
"mt_compact": "compact controls\">⟎",
"mt_uncache": "clear cache &nbsp;(try this if your browser cached$Na broken copy of a song so it refuses to play)\">uncache",
"mt_mloop": "loop the open folder\">🔁 loop",
"mt_mnext": "load the next folder and continue\">📂 next",
"mt_cflac": "convert flac / wav to opus\">flac",
"mt_caac": "convert aac / m4a to opus\">aac",
"mt_coth": "convert all others (not mp3) to opus\">oth",
"mt_tint": "background level (0-100) on the seekbar$Nto make buffering less distracting",
"mt_eq": "enables the equalizer and gain control;$N$Nboost &lt;code&gt;0&lt;/code&gt; = standard 100% volume (unmodified)$N$Nwidth &lt;code&gt;1 &nbsp;&lt;/code&gt; = standard stereo (unmodified)$Nwidth &lt;code&gt;0.5&lt;/code&gt; = 50% left-right crossfeed$Nwidth &lt;code&gt;0 &nbsp;&lt;/code&gt; = mono$N$Nboost &lt;code&gt;-0.8&lt;/code&gt; &amp; width &lt;code&gt;10&lt;/code&gt; = vocal removal :^)$N$Nenabling the equalizer makes gapless albums fully gapless, so leave it on with all the values at zero (except width = 1) if you care about that",
"mt_drc": "enables the dynamic range compressor (volume flattener / brickwaller); will also enable EQ to balance the spaghetti, so set all EQ fields except for 'width' to 0 if you don't want it$N$Nlowers the volume of audio above THRESHOLD dB; for every RATIO dB past THRESHOLD there is 1 dB of output, so default values of tresh -24 and ratio 12 means it should never get louder than -22 dB and it is safe to increase the equalizer boost to 0.8, or even 1.8 with ATK 0 and a huge RLS like 90 (only works in firefox; RLS is max 1 in other browsers)$N$N(see wikipedia, they explain it much better)",
"mb_play": "play",
"mm_hashplay": "play this audio file?",
"mp_breq": "need firefox 82+ or chrome 73+ or iOS 15+",
"mm_bload": "now loading...",
"mm_bconv": "converting to {0}, please wait...",
"mm_opusen": "your browser cannot play aac / m4a files;\ntranscoding to opus is now enabled",
"mm_playerr": "playback failed: ",
"mm_eabrt": "The playback attempt was cancelled",
"mm_enet": "Your internet connection is wonky",
"mm_edec": "This file is supposedly corrupted??",
"mm_esupp": "Your browser does not understand this audio format",
"mm_eunk": "Unknown Errol",
"mm_e404": "Could not play audio; error 404: File not found.",
"mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
"mm_e5xx": "Could not play audio; server error ",
"mm_nof": "not finding any more audio files nearby",
"mm_prescan": "Looking for music to play next...",
"mm_scank": "Found the next song:",
"mm_uncache": "cache cleared; all songs will redownload on next playback",
"mm_hnf": "that song no longer exists",
"im_hnf": "that image no longer exists",
"f_empty": 'this folder is empty',
"f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
"f_bigtxt": "this file is {0} MiB large -- really view as text?",
"fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
"fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
"f_dls": 'the file links in the current folder have\nbeen changed into download links',
"f_partial": "To safely download a file which is currently being uploaded, please click the file which has the same filename, but without the <code>.PARTIAL</code> file extension. Please press CANCEL or Escape to do this.\n\nPressing OK / Enter will ignore this warning and continue downloading the <code>.PARTIAL</code> scratchfile instead, which will almost definitely give you corrupted data.",
"ft_paste": "paste {0} items$NHotkey: ctrl-V",
"fr_eperm": 'cannot rename:\nyou do not have “move” permission in this folder',
"fd_eperm": 'cannot delete:\nyou do not have “delete” permission in this folder',
"fc_eperm": 'cannot cut:\nyou do not have “move” permission in this folder',
"fp_eperm": 'cannot paste:\nyou do not have “write” permission in this folder',
"fr_emore": "select at least one item to rename",
"fd_emore": "select at least one item to delete",
"fc_emore": "select at least one item to cut",
"fs_sc": "share the folder you're in",
"fs_ss": "share the selected files",
"fs_just1d": "you cannot select more than one folder,\nor mix files and folders in one selection",
"fs_abrt": "❌ abort",
"fs_rand": "🎲 rand.name",
"fs_go": "✅ create share",
"fs_name": "name",
"fs_src": "source",
"fs_pwd": "passwd",
"fs_exp": "expiry",
"fs_tmin": "min",
"fs_thrs": "hours",
"fs_tdays": "days",
"fs_never": "eternal",
"fs_pname": "optional link name; will be random if blank",
"fs_tsrc": "the file or folder to share",
"fs_ppwd": "optional password",
"fs_w8": "creating share...",
"fs_ok": "press <code>Enter/OK</code> to Clipboard\npress <code>ESC/Cancel</code> to Close",
"frt_dec": "may fix some cases of broken filenames\">url-decode",
"frt_rst": "reset modified filenames back to the original ones\">↺ reset",
"frt_abrt": "abort and close this window\">❌ cancel",
"frb_apply": "APPLY RENAME",
"fr_adv": "batch / metadata / pattern renaming\">advanced",
"fr_case": "case-sensitive regex\">case",
"fr_win": "windows-safe names; replace <code>&lt;&gt;:&quot;\\|?*</code> with japanese fullwidth characters\">win",
"fr_slash": "replace <code>/</code> with a character that doesn't cause new folders to be created\">no /",
"fr_re": "regex search pattern to apply to original filenames; capturing groups can be referenced in the format field below like &lt;code&gt;(1)&lt;/code&gt; and &lt;code&gt;(2)&lt;/code&gt; and so on",
"fr_fmt": "inspired by foobar2000:$N&lt;code&gt;(title)&lt;/code&gt; is replaced by song title,$N&lt;code&gt;[(artist) - ](title)&lt;/code&gt; skips [this] part if artist is blank$N&lt;code&gt;$lpad((tn),2,0)&lt;/code&gt; pads tracknumber to 2 digits",
"fr_pdel": "delete",
"fr_pnew": "save as",
"fr_pname": "provide a name for your new preset",
"fr_aborted": "aborted",
"fr_lold": "old name",
"fr_lnew": "new name",
"fr_tags": "tags for the selected files (read-only, just for reference):",
"fr_busy": "renaming {0} items...\n\n{1}",
"fr_efail": "rename failed:\n",
"fr_nchg": "{0} of the new names were altered due to <code>win</code> and/or <code>no /</code>\n\nOK to continue with these altered new names?",
"fd_ok": "delete OK",
"fd_err": "delete failed:\n",
"fd_none": "nothing was deleted; maybe blocked by server config (xbd)?",
"fd_busy": "deleting {0} items...\n\n{1}",
"fd_warn1": "DELETE these {0} items?",
"fd_warn2": "<b>Last chance!</b> No way to undo. Delete?",
"fc_ok": "cut {0} items",
"fc_warn": 'cut {0} items\n\nbut: only <b>this</b> browser-tab can paste them\n(since the selection is so absolutely massive)',
"fp_ecut": "first cut some files / folders to paste / move\n\nnote: you can cut / paste across different browser tabs",
"fp_ename": "these {0} items cannot be moved here (names already exist):",
"fp_ok": "move OK",
"fp_busy": "moving {0} items...\n\n{1}",
"fp_err": "move failed:\n",
"fp_confirm": "move these {0} items here?",
"fp_etab": 'failed to read clipboard from other browser tab',
"fp_name": "uploading a file from your device. Give it a name:",
"fp_both_m": '<h6>choose what to paste</h6><code>Enter</code> = Move {0} files from «{1}»\n<code>ESC</code> = Upload {2} files from your device',
"fp_both_b": '<a href="#" id="modal-ok">Move</a><a href="#" id="modal-ng">Upload</a>',
"mk_noname": "type a name into the text field on the left before you do that :p",
"tv_load": "Loading text document:\n\n{0}\n\n{1}% ({2} of {3} MiB loaded)",
"tv_xe1": "could not load textfile:\n\nerror ",
"tv_xe2": "404, file not found",
"tv_lst": "list of textfiles in",
"tvt_close": "return to folder view$NHotkey: M (or Esc)\">❌ close",
"tvt_dl": "download this file$NHotkey: Y\">💾 download",
"tvt_prev": "show previous document$NHotkey: i\">⬆ prev",
"tvt_next": "show next document$NHotkey: K\">⬇ next",
"tvt_sel": "select file &nbsp; ( for cut / delete / ... )$NHotkey: S\">sel",
"tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
"gt_vau": "don't show videos, just play the audio\">🎧",
"gt_msel": "enable file selection; ctrl-click a file to override$N$N&lt;em&gt;when active: doubleclick a file / folder to open it&lt;/em&gt;$N$NHotkey: S\">multiselect",
"gt_crop": "center-crop thumbnails\">crop",
"gt_3x": "hi-res thumbnails\">3x",
"gt_zoom": "zoom",
"gt_chop": "chop",
"gt_sort": "sort by",
"gt_name": "name",
"gt_sz": "size",
"gt_ts": "date",
"gt_ext": "type",
"gt_c1": "truncate filenames more (show less)",
"gt_c2": "truncate filenames less (show more)",
"sm_w8": "searching...",
"sm_prev": "search results below are from a previous query:\n ",
"sl_close": "close search results",
"sl_hits": "showing {0} hits",
"sl_moar": "load more",
"s_sz": "size",
"s_dt": "date",
"s_rd": "path",
"s_fn": "name",
"s_ta": "tags",
"s_ua": "up@",
"s_ad": "adv.",
"s_s1": "minimum MiB",
"s_s2": "maximum MiB",
"s_d1": "min. iso8601",
"s_d2": "max. iso8601",
"s_u1": "uploaded after",
"s_u2": "and/or before",
"s_r1": "path contains &nbsp; (space-separated)",
"s_f1": "name contains &nbsp; (negate with -nope)",
"s_t1": "tags contains &nbsp; (^=start, end=$)",
"s_a1": "specific metadata properties",
"md_eshow": "cannot render ",
"md_off": "[📜<em>readme</em>] disabled in [⚙️] -- document hidden",
"badreply": "Failed to parse reply from server",
"xhr403": "403: Access denied\n\ntry pressing F5, maybe you got logged out",
"xhr0": "unknown (probably lost connection to server, or server is offline)",
"cf_ok": "sorry about that -- DD" + wah + "oS protection kicked in\n\nthings should resume in about 30 sec\n\nif nothing happens, hit F5 to reload the page",
"tl_xe1": "could not list subfolders:\n\nerror ",
"tl_xe2": "404: Folder not found",
"fl_xe1": "could not list files in folder:\n\nerror ",
"fl_xe2": "404: Folder not found",
"fd_xe1": "could not create subfolder:\n\nerror ",
"fd_xe2": "404: Parent folder not found",
"fsm_xe1": "could not send message:\n\nerror ",
"fsm_xe2": "404: Parent folder not found",
"fu_xe1": "failed to load unpost list from server:\n\nerror ",
"fu_xe2": "404: File not found??",
"fz_tar": "uncompressed gnu-tar file (linux / mac)",
"fz_pax": "uncompressed pax-format tar (slower)",
"fz_targz": "gnu-tar with gzip level 3 compression$N$Nthis is usually very slow, so$Nuse uncompressed tar instead",
"fz_tarxz": "gnu-tar with xz level 1 compression$N$Nthis is usually very slow, so$Nuse uncompressed tar instead",
"fz_zip8": "zip with utf8 filenames (maybe wonky on windows 7 and older)",
"fz_zipd": "zip with traditional cp437 filenames, for really old software",
"fz_zipc": "cp437 with crc32 computed early,$Nfor MS-DOS PKZIP v2.04g (october 1993)$N(takes longer to process before download can start)",
"un_m1": "you can delete your recent uploads (or abort unfinished ones) below",
"un_upd": "refresh",
"un_m4": "or share the files visible below:",
"un_ulist": "show",
"un_ucopy": "copy",
"un_flt": "optional filter:&nbsp; URL must contain",
"un_fclr": "clear filter",
"un_derr": 'unpost-delete failed:\n',
"un_f5": 'something broke, please try a refresh or hit F5',
"un_uf5": "sorry but you have to refresh the page (for example by pressing F5 or CTRL-R) before this upload can be aborted",
"un_nou": '<b>warning:</b> server too busy to show unfinished uploads; click the "refresh" link in a bit',
"un_noc": '<b>warning:</b> unpost of fully uploaded files is not enabled/permitted in server config',
"un_max": "showing first 2000 files (use the filter)",
"un_avail": "{0} recent uploads can be deleted<br />{1} unfinished ones can be aborted",
"un_m2": "sorted by upload time; most recent first:",
"un_no1": "sike! no uploads are sufficiently recent",
"un_no2": "sike! no uploads matching that filter are sufficiently recent",
"un_next": "delete the next {0} files below",
"un_abrt": "abort",
"un_del": "delete",
"un_m3": "loading your recent uploads...",
"un_busy": "deleting {0} files...",
"un_clip": "{0} links copied to clipboard",
"u_https1": "you should",
"u_https2": "switch to https",
"u_https3": "for better performance",
"u_ancient": 'your browser is impressively ancient -- maybe you should <a href="#" onclick="goto(\'bup\')">use bup instead</a>',
"u_nowork": "need firefox 53+ or chrome 57+ or iOS 11+",
"u_nodrop": 'your browser is too old for drag-and-drop uploading',
"u_notdir": "that's not a folder!\n\nyour browser is too old,\nplease try dragdrop instead",
"u_uri": "to dragdrop images from other browser windows,\nplease drop it onto the big upload button",
"u_enpot": 'switch to <a href="#">potato UI</a> (may improve upload speed)',
"u_depot": 'switch to <a href="#">fancy UI</a> (may reduce upload speed)',
"u_gotpot": 'switching to the potato UI for improved upload speed,\n\nfeel free to disagree and switch back!',
"u_pott": "<p>files: &nbsp; <b>{0}</b> finished, &nbsp; <b>{1}</b> failed, &nbsp; <b>{2}</b> busy, &nbsp; <b>{3}</b> queued</p>",
"u_ever": "this is the basic uploader; up2k needs at least<br>chrome 21 // firefox 13 // edge 12 // opera 12 // safari 5.1",
"u_su2k": 'this is the basic uploader; <a href="#" id="u2yea">up2k</a> is better',
"u_ewrite": 'you do not have write-access to this folder',
"u_eread": 'you do not have read-access to this folder',
"u_enoi": 'file-search is not enabled in server config',
"u_badf": 'These {0} files (of {1} total) were skipped, possibly due to filesystem permissions:\n\n',
"u_blankf": 'These {0} files (of {1} total) are blank / empty; upload them anyways?\n\n',
"u_just1": '\nMaybe it works better if you select just one file',
"u_ff_many": "if you're using <b>Linux / MacOS / Android,</b> then this amount of files <a href=\"https://bugzilla.mozilla.org/show_bug.cgi?id=1790500\" target=\"_blank\"><em>may</em> crash Firefox!</a>\nif that happens, please try again (or use Chrome).",
"u_up_life": "This upload will be deleted from the server\n{0} after it completes",
"u_asku": 'upload these {0} files to <code>{1}</code>',
"u_unpt": "you can undo / delete this upload using the top-left 🧯",
"u_bigtab": 'about to show {0} files\n\nthis may crash your browser, are you sure?',
"u_scan": 'Scanning files...',
"u_dirstuck": 'directory iterator got stuck trying to access the following {0} items; will skip:',
"u_etadone": 'Done ({0}, {1} files)',
"u_etaprep": '(preparing to upload)',
"u_hashdone": 'hashing done',
"u_hashing": 'hash',
"u_hs": 'handshaking...',
"u_dupdefer": "duplicate; will be processed after all other files",
"u_actx": "click this text to prevent loss of<br />performance when switching to other windows/tabs",
"u_fixed": "OK!&nbsp; Fixed it 👍",
"u_cuerr": "failed to upload chunk {0} of {1};\nprobably harmless, continuing\n\nfile: {2}",
"u_cuerr2": "server rejected upload (chunk {0} of {1});\nwill retry later\n\nfile: {2}\n\nerror ",
"u_ehstmp": "will retry; see bottom-right",
"u_ehsfin": "server rejected the request to finalize upload; retrying...",
"u_ehssrch": "server rejected the request to perform search; retrying...",
"u_ehsinit": "server rejected the request to initiate upload; retrying...",
"u_eneths": "network error while performing upload handshake; retrying...",
"u_enethd": "network error while testing target existence; retrying...",
"u_cbusy": "waiting for server to trust us again after a network glitch...",
"u_ehsdf": "server ran out of disk space!\n\nwill keep retrying, in case someone\nfrees up enough space to continue",
"u_emtleak1": "it looks like your webbrowser may have a memory leak;\nplease",
"u_emtleak2": ' <a href="{0}">switch to https (recommended)</a> or ',
"u_emtleak3": ' ',
"u_emtleakc": 'try the following:\n<ul><li>hit <code>F5</code> to refresh the page</li><li>then disable the &nbsp;<code>mt</code>&nbsp; button in the &nbsp;<code>⚙️ settings</code></li><li>and try that upload again</li></ul>Uploads will be a bit slower, but oh well.\nSorry for the trouble !\n\nPS: chrome v107 <a href="https://bugs.chromium.org/p/chromium/issues/detail?id=1354816" target="_blank">has a bugfix</a> for this',
"u_emtleakf": 'try the following:\n<ul><li>hit <code>F5</code> to refresh the page</li><li>then enable <code>🥔</code> (potato) in the upload UI<li>and try that upload again</li></ul>\nPS: firefox <a href="https://bugzilla.mozilla.org/show_bug.cgi?id=1790500" target="_blank">will hopefully have a bugfix</a> at some point',
"u_s404": "not found on server",
"u_expl": "explain",
"u_maxconn": "most browsers limit this to 6, but firefox lets you raise it with <code>connections-per-server</code> in <code>about:config</code>",
"u_tu": '<p class="warn">WARNING: turbo enabled, <span>&nbsp;client may not detect and resume incomplete uploads; see turbo-button tooltip</span></p>',
"u_ts": '<p class="warn">WARNING: turbo enabled, <span>&nbsp;search results can be incorrect; see turbo-button tooltip</span></p>',
"u_turbo_c": "turbo is disabled in server config",
"u_turbo_g": "disabling turbo because you don't have\ndirectory listing privileges within this volume",
"u_life_cfg": 'autodelete after <input id="lifem" p="60" /> min (or <input id="lifeh" p="3600" /> hours)',
"u_life_est": 'upload will be deleted <span id="lifew" tt="local time">---</span>',
"u_life_max": 'this folder enforces a\nmax lifetime of {0}',
"u_unp_ok": 'unpost is allowed for {0}',
"u_unp_ng": 'unpost will NOT be allowed',
"ue_ro": 'your access to this folder is Read-Only\n\n',
"ue_nl": 'you are currently not logged in',
"ue_la": 'you are currently logged in as "{0}"',
"ue_sr": 'you are currently in file-search mode\n\nswitch to upload-mode by clicking the magnifying glass 🔎 (next to the big SEARCH button), and try uploading again\n\nsorry',
"ue_ta": 'try uploading again, it should work now',
"ur_1uo": "OK: File uploaded successfully",
"ur_auo": "OK: All {0} files uploaded successfully",
"ur_1so": "OK: File found on server",
"ur_aso": "OK: All {0} files found on server",
"ur_1un": "Upload failed, sorry",
"ur_aun": "All {0} uploads failed, sorry",
"ur_1sn": "File was NOT found on server",
"ur_asn": "The {0} files were NOT found on server",
"ur_um": "Finished;\n{0} uploads OK,\n{1} uploads failed, sorry",
"ur_sm": "Finished;\n{0} files found on server,\n{1} files NOT found on server",
"lang_set": "refresh to make the change take effect?",
},
};

scripts/tl.py (new executable file, 184 lines)

@@ -0,0 +1,184 @@
#!/usr/bin/env python3
import os
import sys
"""
generates a "tl.js" which can be used to prototype a new translation
without having to run from source, or tamper with the files that are
extracted from the sfx
to generate a tl.js for the language you are translating to,
run the following command: python tl.py fra Français
("fra" being the three-letter language code of your language, and
"Français" the native name of the language)
then copy tl.js into the webroot and run copyparty with the following:
on Linux: --html-head='<script src="/tl.js"></script>'
on Windows: "--html-head=<script src='/tl.js'></script>"
or in a copyparty config file:
[global]
html-head: <script src="/tl.js"></script>
after editing tl.js, reload your webbrowser by pressing ctrl-shift-r
"""
#######################################################################
#######################################################################
def generate_javascript(lang3, native_name, tl_browser):
note = "// NOTE: please use tabs for indenting, not spaces :-)"
if lang3 == "eng":
note += """\n
// NOTE: since you are using the tl.js straight from the repo, you'll
// need to find/replace all "eng" with your own three-letter name"""
return f""""use strict";
{note}
// the three-letter name of the language you're translating to
var my_lang = "{lang3}";
// and because we don't know these yet...
var SR='', wah='';
// this function is automatically called when the page is loaded:
function langmod() {{
// which page is the javascript currently running on?
// look for some well-known elements to figure it out:
if (QS("h1#cc") && QS("a#k")) {{
// we are running in the control-panel
Ls[my_lang] = tl_cpanel[my_lang];
}}
else if (ebi("op_cfg")) {{
// we are running in the filebrowser
Ls[my_lang] = tl_browser[my_lang];
// inform the settings tab that a new lang is available
LANGS.push(my_lang);
}}
}}
////////////////////////////////////////////////////////////////////////
// below this point is where the actual translation happens;
// here are the pairs of "text-identifier": "text-to-translate"
////////////////////////////////////////////////////////////////////////
// translation of splash.js (the control-panel);
// you do not need to translate the TLNotes, those are just for you :-)
var tl_cpanel = {{
"{lang3}": {{
"a1": "refresh",
"b1": "howdy stranger &nbsp; <small>(you're not logged in)</small>",
"c1": "logout",
"d1": "dump stack", // TLNote: "d2" is the tooltip for this button
"d2": "shows the state of all active threads",
"e1": "reload cfg",
"e2": "reload config files (accounts/volumes/volflags),$Nand rescan all e2ds volumes$N$Nnote: any changes to global settings$Nrequire a full restart to take effect",
"f1": "you can browse:",
"g1": "you can upload to:",
"cc1": "other stuff:",
"h1": "disable k304", // TLNote: "j1" explains what k304 is
"i1": "enable k304",
"j1": "enabling this will disconnect your client on every HTTP 304, which can prevent some buggy proxies from getting stuck (suddenly not loading pages), <em>but</em> it will also make things slower in general",
"k1": "reset client settings",
"l1": "login for more:",
"m1": "welcome back,", // TLNote: "welcome back, USERNAME"
"n1": "404 not found &nbsp;┐( ´ -`)┌",
"o1": 'or maybe you don\\'t have access -- try a password or <a href="' + SR + '/?h">go home</a>',
"p1": "403 forbiddena &nbsp;~┻━┻",
"q1": 'use a password or <a href="' + SR + '/?h">go home</a>',
"r1": "go home",
".s1": "rescan",
"t1": "action", // TLNote: this is the header above the "rescan" buttons
"u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds",
"v1": "connect",
"v2": "use this server as a local HDD",
"w1": "switch to https",
"x1": "change password",
"y1": "edit shares", // TLNote: shows the list of folders that the user has decided to share
"z1": "unlock this share:", // TLNote: the password prompt to see a hidden share
"ta1": "fill in your new password first",
"ta2": "repeat to confirm new password:",
"ta3": "found a typo; please try again",
"aa1": "incoming files:",
}},
}};
////////////////////////////////////////////////////////////////////////
// translation of browser.js (the filebrowser):
var tl_browser = {{
"{lang3}": {{
"tt": "{native_name}",
{tl_browser}
}};
"""
#######################################################################
#######################################################################
def die(*a):
print(*a)
sys.exit(1)
def main():
webdir = "../copyparty/web/splash.js"
while webdir and not os.path.exists(webdir):
webdir = webdir.split("/", 1)[1]
if not webdir:
t = "could not find the copyparty/web/*.js files!\nplease cd into the copyparty repo before running this script"
die(t)
webdir = webdir.rsplit("/", 1)[0] if "/" in webdir else "."
with open(os.path.join(webdir, "browser.js"), "rb") as f:
browserjs = f.read().decode("utf-8")
_, browserjs = browserjs.split('\n\t\t"tt": "English",\n', 1)
browserjs, _ = browserjs.split('\n\t"nor": {', 1)
try:
lang3 = sys.argv[1]
except:
t = "you need to provide one more argument: the three-letter language code for the language you are translating to, for example: ger"
die(t)
try:
native_name = sys.argv[2]
except:
t = "you need to provide one more argument: the native name of the language you are translating to, for example: Deutsch"
die(t)
ret = generate_javascript(lang3, native_name, browserjs)
outpath = os.path.abspath("tl.js")
if os.path.exists(outpath):
t = "the output file already exists! if you really want to overwrite it, then delete the following file and try again:"
die(t, outpath)
with open(outpath, "wb") as f:
f.write(ret.encode("utf-8"))
print("successfully created", outpath)
if __name__ == "__main__":
main()

@@ -22,8 +22,12 @@ awk -v apos=\' -v quot=\" '
!$0 && t!=tp {
print "\n\033[1;37;41m====DIFF===="
}
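# two consecutive entries comparing identical usually means a string was left untranslated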
!$0 && s==sp {
print "\n\033[1;37;44m====IDENTICAL===="
}
!$0 { print; next; }
{
sp=s; s=$0;
tp=t; t="";
c(quot);
c(apos);

@@ -1,11 +1,6 @@
#!/usr/bin/env python3
import atexit
import os
import platform
import sys
import tarfile
import tempfile
import time
import traceback
@@ -23,20 +18,6 @@ def msg(*a, **ka):
print(*a, **ka)
def utime(top):
# avoid cleaners
files = [os.path.join(dp, p) for dp, dd, df in os.walk(top) for p in dd + df]
try:
while True:
t = int(time.time())
for f in [top] + files:
os.utime(f, (t, t))
time.sleep(78123)
except Exception as ex:
print("utime:", ex, f)
def confirm(rv):
msg()
msg("retcode", rv if rv else traceback.format_exc())
@@ -51,47 +32,17 @@ def confirm(rv):
def run():
import copyparty
from copyparty.__main__ import main as cm
td = tempfile.TemporaryDirectory(prefix="")
atexit.register(td.cleanup)
rsrc = td.name
try:
from importlib.resources import files
f = files(copyparty).joinpath("z.tar").open("rb")
except:
from importlib.resources import open_binary
f = open_binary("copyparty", "z.tar")
with tarfile.open(fileobj=f) as tf:
try:
tf.extractall(rsrc, filter="tar")
except TypeError:
tf.extractall(rsrc) # nosec (archive is safe)
f.close()
f = None
msg(" rsrc dir:", rsrc)
msg()
sys.argv.append("--sfx-tpoke=" + rsrc)
cm(rsrc=rsrc)
cm()
def main():
sysver = str(sys.version).replace("\n", "\n" + " " * 18)
pktime = time.strftime("%Y-%m-%d, %H:%M:%S", time.gmtime(STAMP))
msg()
msg(" this is: copyparty", VER)
msg(" packed at:", pktime, "UTC,", STAMP)
msg("python bin:", sys.executable)
msg("python ver:", platform.python_implementation(), sysver)
msg("build-time:", pktime, "UTC,", STAMP)
msg("python-bin:", sys.executable)
msg()
try:
run()

@@ -107,9 +107,9 @@ class TestDots(unittest.TestCase):
os.rename(".b", "v/.b")
vcfg = [
".::r.,u1:g,u2:c,dk",
"v/a:v/a:r.,u1:g,u2:c,dk",
"v/.b:v/.b:r.,u1:g,u2:c,dk",
".::r.,u1:g,u2:c,dks",
"v/a:v/a:r.,u1:g,u2:c,dks",
"v/.b:v/.b:r.,u1:g,u2:c,dks",
]
self.args = Cfg(v=vcfg, a=["u1:u1", "u2:u2"])
self.asrv = AuthSrv(self.args, self.log)

@@ -26,6 +26,7 @@ def hdr(query):
class TestHttpCli(unittest.TestCase):
def setUp(self):
self.td = tu.get_ramdisk()
self.maxDiff = 99999
def tearDown(self):
os.chdir(tempfile.gettempdir())
@@ -39,6 +40,7 @@ class TestHttpCli(unittest.TestCase):
os.mkdir(td)
os.chdir(td)
# "perm+user"; r/w/a (a=rw) for user a/o/x (a=all)
self.dtypes = ["ra", "ro", "rx", "wa", "wo", "wx", "aa", "ao", "ax"]
self.can_read = ["ra", "ro", "aa", "ao"]
self.can_write = ["wa", "wo", "aa", "ao"]
@@ -121,7 +123,8 @@ class TestHttpCli(unittest.TestCase):
# expected files in archives
if rok:
ref = [x for x in vfiles if self.in_dive(top + "/" + durl, x)]
zs = top + "/" + durl
ref = [x for x in vfiles if self.in_dive(zs, x)]
ref.sort()
else:
ref = []
@@ -166,7 +169,7 @@ class TestHttpCli(unittest.TestCase):
self.assertEqual([], zf_ng)
# stash
h, ret = self.put(url)
h, ret = self.put(durl)
res = h.startswith("HTTP/1.1 201 ")
self.assertEqual(res, wok)
if wok:
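
Judging by the can_read/can_write lists, the second letter of each label encodes who holds the permission: a = everyone, o = the requesting user, x = some other user. A hedged decoder for the expected outcome:

def expected(dtype):
    # first letter: r=read, w=write, a=both; second letter: who has it
    perm, who = dtype
    granted = who in "ao"  # the test user is covered by "all" and "own"
    return (granted and perm in "ra",  # readable?
            granted and perm in "wa")  # writable?

print(expected("ro"))  # (True, False)
print(expected("wx"))  # (False, False)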

tests/test_metrics.py (new file, 104 lines)
View File

@@ -0,0 +1,104 @@
#!/usr/bin/env python3
# coding: utf-8
from __future__ import print_function, unicode_literals
import os
import re
import shutil
import tempfile
import unittest
from copyparty.__init__ import PY2
from copyparty.authsrv import AuthSrv
from copyparty.httpcli import HttpCli
from copyparty.metrics import Metrics
from tests import util as tu
from tests.util import Cfg
def hdr(query):
h = "GET /{} HTTP/1.1\r\nCookie: cppwd=o\r\nConnection: close\r\n\r\n"
return h.format(query).encode("utf-8")
class TestMetrics(unittest.TestCase):
def setUp(self):
self.td = tu.get_ramdisk()
os.chdir(self.td)
def tearDown(self):
os.chdir(tempfile.gettempdir())
shutil.rmtree(self.td)
def cinit(self):
if self.conn:
self.fstab = self.conn.hsrv.hub.up2k.fstab
self.conn.hsrv.hub.up2k.shutdown()
self.asrv = AuthSrv(self.args, self.log)
self.conn = tu.VHttpConn(self.args, self.asrv, self.log, b"", True)
if self.fstab:
self.conn.hsrv.hub.up2k.fstab = self.fstab
if not self.metrics:
self.metrics = Metrics(self.conn)
self.conn.broker = self.conn.hsrv.broker
self.conn.hsrv.metrics = self.metrics
for k in "ncli nsus nban".split():
setattr(self.conn, k, 9)
def test(self):
zs = "nos_dup nos_hdd nos_unf nos_vol nos_vst"
opts = {x: False for x in zs.split()}
self.args = Cfg(v=[".::A"], a=[], stats=True, **opts)
self.conn = self.fstab = self.metrics = None
self.cinit()
h, b = self.curl(".cpr/metrics")
self.assertIn(".1 200 OK", h)
ptns = r"""
cpp_uptime_seconds [0-9]\.[0-9]{3}$
cpp_boot_unixtime_seconds [0-9]{7,10}\.[0-9]{3}$
cpp_http_reqs_created [0-9]{7,10}$
cpp_http_reqs_total -1$
cpp_http_conns 9$
cpp_total_bans 9$
cpp_sus_reqs_total 9$
cpp_active_bans 0$
cpp_idle_vols 0$
cpp_busy_vols 0$
cpp_offline_vols 0$
cpp_db_idle_seconds 86399999\.00$
cpp_db_act_seconds 0\.00$
cpp_hashing_files 0$
cpp_tagq_files 0$
cpp_disk_size_bytes\{vol="/"\} [0-9]+$
cpp_disk_free_bytes\{vol="/"\} [0-9]+$
cpp_vol_bytes\{vol="/"\} 0$
cpp_vol_files\{vol="/"\} 0$
cpp_vol_bytes\{vol="total"\} 0$
cpp_vol_files\{vol="total"\} 0$
cpp_dupe_bytes\{vol="total"\} 0$
cpp_dupe_files\{vol="total"\} 0$
"""
if not PY2:
ptns += r"""
cpp_mtpq_files 0$
cpp_unf_bytes\{vol="/"\} 0$
cpp_unf_files\{vol="/"\} 0$
cpp_unf_bytes\{vol="total"\} 0$
cpp_unf_files\{vol="total"\} 0$
"""
for want in [x for x in ptns.split("\n") if x]:
if not re.search("^" + want, b.replace("\r", ""), re.MULTILINE):
raise Exception("server response:\n%s\n\nmissing item: %s" % (b, want))
def curl(self, url, binary=False):
conn = self.conn.setbuf(hdr(url))
HttpCli(conn).run()
if binary:
h, b = conn.s._reply.split(b"\r\n\r\n", 1)
return [h.decode("utf-8"), b]
return conn.s._reply.decode("utf-8").split("\r\n\r\n", 1)
def log(self, src, msg, c=0):
print(msg)
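
The patterns above target Prometheus' text exposition format, one name{labels} value sample per line; a one-liner showing what the vol-labelled regexes accept (the value is made up):

import re
line = 'cpp_disk_free_bytes{vol="/"} 123456789'
assert re.search(r'^cpp_disk_free_bytes\{vol="/"\} [0-9]+$', line)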

tests/test_utils.py (new file, 38 lines)
View File

@@ -0,0 +1,38 @@
#!/usr/bin/env python3
# coding: utf-8
from __future__ import print_function, unicode_literals
import unittest
from copyparty.__main__ import PY2
from copyparty.util import w8enc
from tests import util as tu
class TestUtils(unittest.TestCase):
def cmp(self, orig, t1, t2):
if t1 != t2:
raise Exception("\n%r\n%r\n%r\n" % (w8enc(orig), t1, t2))
def test_quotep(self):
if PY2:
raise unittest.SkipTest()
from copyparty.util import _quotep3, _quotep3b, w8dec
txt = w8dec(tu.randbytes(8192))
self.cmp(txt, _quotep3(txt), _quotep3b(txt))
def test_unquote(self):
if PY2:
raise unittest.SkipTest()
from urllib.parse import unquote_to_bytes as u2b
from copyparty.util import unquote
for btxt in (
tu.randbytes(8192),
br"%ed%91qw,er;ty%20as df?gh+jkl%zxc&vbn <qwe>\"rty'uio&asd&nbsp;fgh",
):
self.cmp(btxt, unquote(btxt), u2b(btxt))
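
The parity being asserted: copyparty's unquote must match urllib byte-for-byte, including on malformed escapes, which the stdlib passes through untouched:

from urllib.parse import unquote_to_bytes
print(unquote_to_bytes(b"%ed%91qw%20as%zx"))
# b'\xed\x91qw as%zx'  -- the invalid "%zx" survives as-is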

View File

@@ -3,6 +3,7 @@
from __future__ import print_function, unicode_literals
import os
import random
import re
import shutil
import socket
@@ -49,6 +50,10 @@ from copyparty.util import FHC, CachedDict, Garda, Unrecv
init_E(E)
def randbytes(n):
return random.getrandbits(n * 8).to_bytes(n, "little")
def runcmd(argv):
p = sp.Popen(argv, stdout=sp.PIPE, stderr=sp.PIPE)
stdout, stderr = p.communicate()
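
randbytes above draws one wide integer and packs it little-endian; unlike os.urandom it is reproducible under random.seed, which suits test fixtures:

import random
random.seed(1)
a = random.getrandbits(4 * 8).to_bytes(4, "little")
random.seed(1)
b = random.getrandbits(4 * 8).to_bytes(4, "little")
print(a == b, a.hex())  # True, same bytes every run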
@@ -117,10 +122,10 @@ class Cfg(Namespace):
def __init__(self, a=None, v=None, c=None, **ka0):
ka = {}
ex = "chpw daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink ih ihead magic hardlink_only nid nih no_acode no_athumb no_dav no_db_ip no_del no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw og og_no_head og_s_title q rand smb srch_dbg stats uqe vague_403 vc ver write_uplog xdev xlink xvol zs"
ex = "chpw daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink ih ihead magic hardlink_only nid nih no_acode no_athumb no_dav no_db_ip no_del no_dirsz no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw og og_no_head og_s_title q rand re_dirsz smb srch_dbg stats uqe vague_403 vc ver write_uplog xdev xlink xvol zs"
ka.update(**{k: False for k in ex.split()})
ex = "dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_voldump re_dhash plain_ip"
ex = "dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_up_list no_voldump re_dhash plain_ip"
ka.update(**{k: True for k in ex.split()})
ex = "ah_cli ah_gen css_browser hist js_browser js_other mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua"
@@ -262,6 +267,7 @@ class VHttpSrv(object):
self.u2idx = None
self.ptn_cc = re.compile(r"[\x00-\x1f]")
self.uparam_cc_ok = set("doc move tree".split())
def cachebuster(self):
return "a"