Compare commits
34 Commits
- acd32abac5
- 2b47c96cf2
- 1027378bda
- e979d30659
- 574db704cc
- fdb969ea89
- 08977854b3
- cecac64b68
- 7dabdade2a
- e788f098e2
- 69406d4344
- d16dd26c65
- 12219c1bea
- 118bdcc26e
- 78fa96f0f4
- c7deb63a04
- 4f811eb9e9
- 0b265bd673
- ee67fabbeb
- b213de7e62
- 7c01505750
- ae28dfd020
- 2a5a4e785f
- d8bddede6a
- b8a93e74bf
- e60ec94d35
- 84af5fd0a3
- dbb3edec77
- d284b46a3e
- 9fcb4d222b
- d0bb1ad141
- b299aaed93
- abb3224cc5
- 1c66d06702
.gitignore (vendored)

@@ -22,6 +22,7 @@ copyparty.egg-info/
*.bak

# derived
copyparty/res/COPYING.txt
copyparty/web/deps/
srv/
README.md

@@ -56,6 +56,7 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [other tricks](#other-tricks)
* [searching](#searching) - search by size, date, path/name, mp3-tags, ...
* [server config](#server-config) - using arguments or config files, or a mix of both
* [qr-code](#qr-code) - print a qr-code [(screenshot)](https://user-images.githubusercontent.com/241032/194728533-6f00849b-c6ac-43c6-9359-83e454d11e00.png) for quick access
* [ftp-server](#ftp-server) - an FTP server can be started using `--ftp 3921`
* [file indexing](#file-indexing) - enables dedup and music search ++
* [exclude-patterns](#exclude-patterns) - to save some time
@@ -66,7 +67,7 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [other flags](#other-flags)
* [database location](#database-location) - in-volume (`.hist/up2k.db`, default) or somewhere else
* [metadata from audio files](#metadata-from-audio-files) - set `-e2t` to index tags on upload
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags
* [upload events](#upload-events) - trigger a script/program on each upload
* [hiding from google](#hiding-from-google) - tell search engines you dont wanna be indexed
* [themes](#themes)
@@ -93,6 +94,7 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [optional gpl stuff](#optional-gpl-stuff)
* [sfx](#sfx) - the self-contained "binary"
* [sfx repack](#sfx-repack) - reduce the size of an sfx by removing features
* [copyparty.exe](#copypartyexe)
* [install on android](#install-on-android)
* [reporting bugs](#reporting-bugs) - ideas for context to include in bug reports
* [building](#building)
@@ -107,6 +109,8 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro

download **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** and you're all set!

if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead

running the sfx without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes)

some recommended options:
@@ -114,7 +118,7 @@ some recommended options:
* `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen), see [optional dependencies](#optional-dependencies)
* `-v /mnt/music:/music:r:rw,foo -a foo:bar` shares `/mnt/music` as `/music`, `r`eadable by anyone, and read-write for user `foo`, password `bar`
  * replace `:r:rw,foo` with `:r,foo` to only make the folder readable by `foo` and nobody else
  * see [accounts and volumes](#accounts-and-volumes) for the syntax and other permissions (`r`ead, `w`rite, `m`ove, `d`elete, `g`et)
  * see [accounts and volumes](#accounts-and-volumes) for the syntax and other permissions (`r`ead, `w`rite, `m`ove, `d`elete, `g`et, up`G`et)
* `--ls '**,*,ln,p,r'` to crash on startup if any of the volumes contain a symlink which point outside the volume, as that could give users unintended access (see `--help-ls`)

@@ -164,6 +168,7 @@ feature summary
* ☑ volumes (mountpoints)
* ☑ [accounts](#accounts-and-volumes)
* ☑ [ftp-server](#ftp-server)
* ☑ [qr-code](#qr-code) for quick access
* upload
  * ☑ basic: plain multipart, ie6 support
  * ☑ [up2k](#uploading): js, resumable, multithreaded
@@ -318,6 +323,7 @@ permissions:
* `m` (move): move files/folders *from* this folder
* `d` (delete): delete files/folders
* `g` (get): only download files, cannot see folder contents or zip/tar
* `G` (upget): same as `g` except uploaders get to see their own filekeys (see `fk` in examples below)

examples:
* add accounts named u1, u2, u3 with passwords p1, p2, p3: `-a u1:p1 -a u2:p2 -a u3:p3`
@@ -328,10 +334,12 @@ examples:
  * unauthorized users accessing the webroot can see that the `inc` folder exists, but cannot open it
  * `u1` can open the `inc` folder, but cannot see the contents, only upload new files to it
  * `u2` can browse it and move files *from* `/inc` into any folder where `u2` has write-access
* make folder `/mnt/ss` available at `/i`, read-write for u1, get-only for everyone else, and enable accesskeys: `-v /mnt/ss:i:rw,u1:g:c,fk=4`
  * `c,fk=4` sets the `fk` volflag to 4, meaning each file gets a 4-character accesskey
  * `u1` can upload files, browse the folder, and see the generated accesskeys
  * other users cannot browse the folder, but can access the files if they have the full file URL with the accesskey
* make folder `/mnt/ss` available at `/i`, read-write for u1, get-only for everyone else, and enable filekeys: `-v /mnt/ss:i:rw,u1:g:c,fk=4`
  * `c,fk=4` sets the `fk` (filekey) volflag to 4, meaning each file gets a 4-character accesskey
  * `u1` can upload files, browse the folder, and see the generated filekeys
  * other users cannot browse the folder, but can access the files if they have the full file URL with the filekey
  * replacing the `g` permission with `wg` would let anonymous users upload files, but not see the required filekey to access it
  * replacing the `g` permission with `wG` would let anonymous users upload files, receiving a working direct link in return
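The filekeys above are never stored anywhere; as the HTTP-server hunks later in this compare show, they are derived on demand by `gen_fk(fk_salt, abspath, size, inode)` and truncated to the length of the `fk` volflag. A minimal sketch of the idea (hypothetical helper, not copyparty's exact algorithm):

```python
import hashlib

def gen_fk(salt: str, abspath: str, fsize: int, inode: int) -> str:
    # hypothetical stand-in: hash a secret salt together with the file's
    # identity, then truncate to the volume's fk length
    blob = "{}\n{}\n{}\n{}".format(salt, abspath, fsize, inode)
    return hashlib.sha512(blob.encode("utf-8")).hexdigest()

# with `c,fk=4`, only the first 4 characters end up in the ?k= parameter
key = gen_fk("some-secret-salt", "/mnt/ss/cat.jpg", 123456, 789)[:4]
print("/i/cat.jpg?k=" + key)
```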
anyone trying to bruteforce a password gets banned according to `--ban-pw`; default is 24h ban for 9 failed attempts in 1 hour

@@ -675,6 +683,19 @@ using arguments or config files, or a mix of both:
* or click the `[reload cfg]` button in the control-panel when logged in as admin


## qr-code

print a qr-code [(screenshot)](https://user-images.githubusercontent.com/241032/194728533-6f00849b-c6ac-43c6-9359-83e454d11e00.png) for quick access, great between phones on android hotspots which keep changing the subnet

* `--qr` enables it
* `--qrs` does https instead of http
* `--qrl lootbox/?pw=hunter2` appends to the url, linking to the `lootbox` folder with password `hunter2`
* `--qrz 1` forces 1x zoom instead of autoscaling to fit the terminal size
  * 1x may render incorrectly on some terminals/fonts, but 2x should always work

it will use your external ip (default route) unless `--qri` specifies an ip-prefix or domain

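The terminal rendering behind `--qr` comes from `copyparty/stolen/qrcodegen.py`, added as a new file later in this compare; a minimal sketch of calling it directly (URL and import path assumed):

```python
from copyparty.stolen.qrcodegen import QrCode

# encode an assumed access URL as a byte-mode QR code and print it with
# half-block characters (zoom 1 packs two module rows into one text row)
url = b"http://192.168.1.5:3923/"
print(QrCode.encode_binary(url).render(zoom=1, pad=4))
```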
## ftp-server

an FTP server can be started using `--ftp 3921`, and/or `--ftps` for explicit TLS (ftpes)
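A quick way to sanity-check that listener from Python's standard library, assuming the port from the example above, a server on localhost, and an account added with `-a foo:bar`:

```python
from ftplib import FTP

ftp = FTP()
ftp.connect("127.0.0.1", 3921)  # --ftp 3921
ftp.login("foo", "bar")         # account from -a foo:bar
ftp.retrlines("LIST")           # directory listing of the volume root
ftp.quit()
```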
@@ -841,7 +862,7 @@ see the beautiful mess of a dictionary in [mtag.py](https://github.com/9001/copy

## file parser plugins

provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)

copyparty can invoke external programs to collect additional metadata for files using `mtp` (either as argument or volflag), there is a default timeout of 60sec, and only files which contain audio get analyzed by default (see ay/an/ad below)

@@ -1074,6 +1095,7 @@ below are some tweaks roughly ordered by usefulness:
* `--http-only` or `--https-only` (unless you want to support both protocols) will reduce the delay before a new connection is established
* `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* `--no-htp --hash-mt=0 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
* `-j` enables multiprocessing (actual multithreading) and can make copyparty perform better in cpu-intensive workloads, for example:
  * huge amount of short-lived connections
  * really heavy traffic (downloads/uploads)
@@ -1124,7 +1146,8 @@ some notes on hardening
other misc notes:

* you can disable directory listings by giving permission `g` instead of `r`, only accepting direct URLs to files
  * combine this with volflag `c,fk` to generate per-file accesskeys; users which have full read-access will then see URLs with `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
  * combine this with volflag `c,fk` to generate filekeys (per-file accesskeys); users which have full read-access will then see URLs with `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
  * permissions `wG` lets users upload files and receive their own filekeys, still without being able to see other uploads


## gotchas

@@ -1308,6 +1331,19 @@ for the `re`pack to work, first run one of the sfx'es once to unpack it
**note:** you can also just download and run [scripts/copyparty-repack.sh](scripts/copyparty-repack.sh) -- this will grab the latest copyparty release from github and do a few repacks; works on linux/macos (and windows with msys2 or WSL)

## copyparty.exe



[copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) can be convenient on old machines where installing python is problematic, however is **not recommended** and should be considered a last resort -- if possible, please use **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** instead

the exe is compatible with 32bit windows7, which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and will definitely become a security hazard at some point

meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) instead relies on your system python which gives better performance and will stay safe as long as you keep your python install up-to-date

then again, if you are already into downloading shady binaries from the internet, you may also want my [minimal builds](./scripts/pyinstaller#ffmpeg) of [ffmpeg](https://ocv.me/stuff/bin/ffmpeg.exe) and [ffprobe](https://ocv.me/stuff/bin/ffprobe.exe) which enables copyparty to extract multimedia-info, do audio-transcoding, and thumbnails/spectrograms/waveforms, however it's much better to instead grab a [recent official build](https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip) every once in a while if you can afford the size


# install on android

install [Termux](https://termux.com/) (see [ocv.me/termux](https://ocv.me/termux/)) and then copy-paste this into Termux (long-tap) all at once:
@@ -1316,7 +1352,7 @@ apt update && apt -y full-upgrade && apt update && termux-setup-storage && apt -
echo $?
```

after the initial setup, you can launch copyparty at any time by running `copyparty` anywhere in Termux
after the initial setup, you can launch copyparty at any time by running `copyparty` anywhere in Termux -- and if you run it with `--qr` you'll get a [neat qr-code](#qr-code) pointing to your external ip

if you want thumbnails, `apt -y install ffmpeg`

@@ -1363,7 +1399,7 @@ first grab the web-dependencies from a previous sfx (assuming you don't need to
```sh
rm -rf copyparty/web/deps
curl -L https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py >x.py
python3 x.py -h
python3 x.py --version
rm x.py
mv /tmp/pe-copyparty/copyparty/web/deps/ copyparty/web/deps/
```

@@ -48,7 +48,7 @@ except ImportError:


# from copyparty/__init__.py
PY2 = sys.version_info[0] == 2
PY2 = sys.version_info < (3,)
if PY2:
from Queue import Queue
from urllib import unquote

@@ -11,7 +11,7 @@ try:
except:
TYPE_CHECKING = False

PY2 = sys.version_info[0] == 2
PY2 = sys.version_info < (3,)
if PY2:
sys.dont_write_bytecode = True
unicode = unicode  # noqa: F821 # pylint: disable=undefined-variable,self-assigning-variable

@@ -76,6 +76,10 @@ class RiceFormatter(argparse.HelpFormatter):
defaulting_nargs = [argparse.OPTIONAL, argparse.ZERO_OR_MORE]
if action.option_strings or action.nargs in defaulting_nargs:
ret += fmt

if not VT100:
ret = re.sub("\033\[[0-9;]+m", "", ret)

return ret

def _fill_text(self, text: str, width: int, indent: str) -> str:

@@ -438,6 +442,7 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
"m" (move): move files and folders; need "w" at destination
"d" (delete): permanently delete files and folders
"g" (get): download files, but cannot see folder contents
"G" (upget): "get", but can see filekeys of their own uploads

too many volflags to list here, see the other sections

@@ -566,16 +571,26 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
ap2.add_argument("-c", metavar="PATH", type=u, action="append", help="add config file")
ap2.add_argument("-nc", metavar="NUM", type=int, default=64, help="max num clients")
ap2.add_argument("-j", metavar="CORES", type=int, default=1, help="max num cpu cores, 0=all")
ap2.add_argument("-a", metavar="ACCT", type=u, action="append", help="add account, USER:PASS; example [ed:wark]")
ap2.add_argument("-v", metavar="VOL", type=u, action="append", help="add volume, SRC:DST:FLAG; examples [.::r], [/mnt/nas/music:/music:r:aed]")
ap2.add_argument("-a", metavar="ACCT", type=u, action="append", help="add account, \033[33mUSER\033[0m:\033[33mPASS\033[0m; example [\033[32med:wark\033[0m]")
ap2.add_argument("-v", metavar="VOL", type=u, action="append", help="add volume, \033[33mSRC\033[0m:\033[33mDST\033[0m:\033[33mFLAG\033[0m; examples [\033[32m.::r\033[0m], [\033[32m/mnt/nas/music:/music:r:aed\033[0m]")
ap2.add_argument("-ed", action="store_true", help="enable the ?dots url parameter / client option which allows clients to see dotfiles / hidden files")
ap2.add_argument("-emp", action="store_true", help="enable markdown plugins -- neat but dangerous, big XSS risk")
ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="md-editor mod-chk rate")
ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-form POSTs; see --help-urlform")
ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="window title, for example '$ip-10.1.2.' or '$ip-'")
ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="window title, for example [\033[32m$ip-10.1.2.\033[0m] or [\033[32m$ip-]")
ap2.add_argument("--license", action="store_true", help="show licenses and exit")
ap2.add_argument("--version", action="store_true", help="show versions and exit")

ap2 = ap.add_argument_group('qr options')
ap2.add_argument("--qr", action="store_true", help="show http:// QR-code on startup")
ap2.add_argument("--qrs", action="store_true", help="show https:// QR-code on startup")
ap2.add_argument("--qrl", metavar="PATH", type=u, default="", help="location to include in the url, for example [\033[32mpriv/?pw=hunter2\033[0m]")
ap2.add_argument("--qri", metavar="PREFIX", type=u, default="", help="select IP which starts with PREFIX")
ap2.add_argument("--qr-fg", metavar="COLOR", type=int, default=16, help="foreground")
ap2.add_argument("--qr-bg", metavar="COLOR", type=int, default=229, help="background (white=255)")
ap2.add_argument("--qrp", metavar="CELLS", type=int, default=4, help="padding (spec says 4 or more, but 1 is usually fine)")
ap2.add_argument("--qrz", metavar="N", type=int, default=0, help="[\033[32m1\033[0m]=1x, [\033[32m2\033[0m]=2x, [\033[32m0\033[0m]=auto (try 2 on broken fonts)")

ap2 = ap.add_argument_group('upload options')
ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads, hiding them from clients unless -ed")
ap2.add_argument("--plain-ip", action="store_true", help="when avoiding filename collisions by appending the uploader's ip to the filename: append the plaintext ip instead of salting and hashing the ip")

@@ -589,14 +604,14 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
ap2.add_argument("--magic", action="store_true", help="enable filetype detection on nameless uploads")
ap2.add_argument("--df", metavar="GiB", type=float, default=0, help="ensure GiB free disk space by rejecting upload requests")
ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; 0 = off and warn if enabled, 1 = off, 2 = on, 3 = on and disable datecheck")
ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; s=smallest-first, n=alphabetical, fs=force-s, fn=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m0\033[0m] = off and warn if enabled, [\033[32m1\033[0m] = off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; [\033[32ms\033[0m]=smallest-first, [\033[32mn\033[0m]=alphabetical, [\033[32mfs\033[0m]=force-s, [\033[32mfn\033[0m]=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
ap2.add_argument("--write-uplog", action="store_true", help="write POST reports to textfiles in working-directory")

ap2 = ap.add_argument_group('network options')
ap2.add_argument("-i", metavar="IP", type=u, default="0.0.0.0", help="ip to bind (comma-sep.)")
ap2.add_argument("-p", metavar="PORT", type=u, default="3923", help="ports to bind (comma/range)")
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; 0 = tcp, 1 = origin (first x-fwd), 2 = cloudflare, 3 = nginx, -1 = closest proxy")
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; [\033[32m0\033[0m]=tcp, [\033[32m1\033[0m]=origin (first x-fwd), [\033[32m2\033[0m]=cloudflare, [\033[32m3\033[0m]=nginx, [\033[32m-1\033[0m]=closest proxy")
ap2.add_argument("--s-wr-sz", metavar="B", type=int, default=256*1024, help="socket write size in bytes")
ap2.add_argument("--s-wr-slp", metavar="SEC", type=float, default=0, help="debug: socket write delay in seconds")
ap2.add_argument("--rsp-slp", metavar="SEC", type=float, default=0, help="debug: response delay in seconds")

@@ -604,17 +619,17 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
ap2 = ap.add_argument_group('SSL/TLS options')
ap2.add_argument("--http-only", action="store_true", help="disable ssl/tls -- force plaintext")
ap2.add_argument("--https-only", action="store_true", help="disable plaintext -- force tls")
ap2.add_argument("--ssl-ver", metavar="LIST", type=u, help="set allowed ssl/tls versions; [help] shows available versions; default is what your python version considers safe")
ap2.add_argument("--ciphers", metavar="LIST", type=u, help="set allowed ssl/tls ciphers; [help] shows available ciphers")
ap2.add_argument("--ssl-ver", metavar="LIST", type=u, help="set allowed ssl/tls versions; [\033[32mhelp\033[0m] shows available versions; default is what your python version considers safe")
ap2.add_argument("--ciphers", metavar="LIST", type=u, help="set allowed ssl/tls ciphers; [\033[32mhelp\033[0m] shows available ciphers")
ap2.add_argument("--ssl-dbg", action="store_true", help="dump some tls info")
ap2.add_argument("--ssl-log", metavar="PATH", type=u, help="log master secrets for later decryption in wireshark")

ap2 = ap.add_argument_group('FTP options')
ap2.add_argument("--ftp", metavar="PORT", type=int, help="enable FTP server on PORT, for example 3921")
ap2.add_argument("--ftps", metavar="PORT", type=int, help="enable FTPS server on PORT, for example 3990")
ap2.add_argument("--ftp", metavar="PORT", type=int, help="enable FTP server on PORT, for example \033[32m3921")
ap2.add_argument("--ftps", metavar="PORT", type=int, help="enable FTPS server on PORT, for example \033[32m3990")
ap2.add_argument("--ftp-dbg", action="store_true", help="enable debug logging")
ap2.add_argument("--ftp-nat", metavar="ADDR", type=u, help="the NAT address to use for passive connections")
ap2.add_argument("--ftp-pr", metavar="P-P", type=u, help="the range of TCP ports to use for passive connections, for example 12000-13000")
ap2.add_argument("--ftp-pr", metavar="P-P", type=u, help="the range of TCP ports to use for passive connections, for example \033[32m12000-13000")

ap2 = ap.add_argument_group('opt-outs')
ap2.add_argument("-nw", action="store_true", help="never write anything to disk (debug/benchmark)")

@@ -630,7 +645,7 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
ap2.add_argument("-s", action="count", default=0, help="increase safety: Disable thumbnails / potentially dangerous software (ffmpeg/pillow/vips), hide partial uploads, avoid crawlers.\n └─Alias of\033[32m --dotpart --no-thumb --no-mtag-ff --no-robots --force-js")
ap2.add_argument("-ss", action="store_true", help="further increase safety: Prevent js-injection, accidental move/delete, broken symlinks, 404 on 403, ban on excessive 404s.\n └─Alias of\033[32m -s --no-dot-mv --no-dot-ren --unpost=0 --no-del --no-mv --hardlink --vague-403 --ban-404=50,60,1440 -nih")
ap2.add_argument("-sss", action="store_true", help="further increase safety: Enable logging to disk, scan for dangerous symlinks.\n └─Alias of\033[32m -ss -lo=cpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz --ls=**,*,ln,p,r")
ap2.add_argument("--ls", metavar="U[,V[,F]]", type=u, help="do a sanity/safety check of all volumes on startup; arguments USER,VOL,FLAGS; example [**,*,ln,p,r]")
ap2.add_argument("--ls", metavar="U[,V[,F]]", type=u, help="do a sanity/safety check of all volumes on startup; arguments \033[33mUSER\033[0m,\033[33mVOL\033[0m,\033[33mFLAGS\033[0m; example [\033[32m**,*,ln,p,r\033[0m]")
ap2.add_argument("--salt", type=u, default="hunter2", help="up2k file-hash salt; used to generate unpredictable internal identifiers for uploads -- doesn't really matter")
ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files -- this one DOES matter")
ap2.add_argument("--no-dot-mv", action="store_true", help="disallow moving dotfiles; makes it impossible to move folders containing dotfiles")

@@ -640,18 +655,18 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
ap2.add_argument("--vague-403", action="store_true", help="send 404 instead of 403 (security through ambiguity, very enterprise)")
ap2.add_argument("--force-js", action="store_true", help="don't send folder listings as HTML, force clients to use the embedded json instead -- slight protection against misbehaving search engines which ignore --no-robots")
ap2.add_argument("--no-robots", action="store_true", help="adds http and html headers asking search engines to not index anything")
ap2.add_argument("--logout", metavar="H", type=float, default="8086", help="logout clients after H hours of inactivity (0.0028=10sec, 0.1=6min, 24=day, 168=week, 720=month, 8760=year)")
ap2.add_argument("--ban-pw", metavar="N,W,B", type=u, default="9,60,1440", help="more than N wrong passwords in W minutes = ban for B minutes (disable with \"no\")")
ap2.add_argument("--ban-404", metavar="N,W,B", type=u, default="no", help="hitting more than N 404's in W minutes = ban for B minutes (disabled by default since turbo-up2k counts as 404s)")
ap2.add_argument("--logout", metavar="H", type=float, default="8086", help="logout clients after H hours of inactivity; [\033[32m0.0028\033[0m]=10sec, [\033[32m0.1\033[0m]=6min, [\033[32m24\033[0m]=day, [\033[32m168\033[0m]=week, [\033[32m720\033[0m]=month, [\033[32m8760\033[0m]=year)")
ap2.add_argument("--ban-pw", metavar="N,W,B", type=u, default="9,60,1440", help="more than \033[33mN\033[0m wrong passwords in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes; disable with [\033[32mno\033[0m]")
ap2.add_argument("--ban-404", metavar="N,W,B", type=u, default="no", help="hitting more than \033[33mN\033[0m 404's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes (disabled by default since turbo-up2k counts as 404s)")

ap2 = ap.add_argument_group('shutdown options')
ap2.add_argument("--ign-ebind", action="store_true", help="continue running even if it's impossible to listen on some of the requested endpoints")
ap2.add_argument("--ign-ebind-all", action="store_true", help="continue running even if it's impossible to receive connections at all")
ap2.add_argument("--exit", metavar="WHEN", type=u, default="", help="shutdown after WHEN has finished; for example 'idx' will do volume indexing + metadata analysis")
ap2.add_argument("--exit", metavar="WHEN", type=u, default="", help="shutdown after WHEN has finished; for example [\033[32midx\033[0m] will do volume indexing + metadata analysis")

ap2 = ap.add_argument_group('logging options')
ap2.add_argument("-q", action="store_true", help="quiet")
ap2.add_argument("-lo", metavar="PATH", type=u, help="logfile, example: cpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz")
ap2.add_argument("-lo", metavar="PATH", type=u, help="logfile, example: \033[32mcpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz")
ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
ap2.add_argument("--log-htp", action="store_true", help="debug: print http-server threadpool scaling")

@@ -734,7 +749,7 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language")
ap2.add_argument("--theme", metavar="NUM", type=int, default=0, help="default theme to use")
ap2.add_argument("--themes", metavar="NUM", type=int, default=8, help="number of themes installed")
ap2.add_argument("--favico", metavar="TXT", type=u, default="c 000 none" if retry else "🎉 000 none", help="favicon text [ foreground [ background ] ], set blank to disable")
ap2.add_argument("--favico", metavar="TXT", type=u, default="c 000 none" if retry else "🎉 000 none", help="\033[33mfavicon-text\033[0m [ \033[33mforeground\033[0m [ \033[33mbackground\033[0m ] ], set blank to disable")
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")

@@ -747,9 +762,12 @@ def run_argparse(argv: list[str], formatter: Any, retry: bool) -> argparse.Names
ap2.add_argument("--no-scandir", action="store_true", help="disable scandir; instead using listdir + stat on each file")
ap2.add_argument("--no-fastboot", action="store_true", help="wait for up2k indexing before starting the httpd")
ap2.add_argument("--no-htp", action="store_true", help="disable httpserver threadpool, create threads as-needed instead")
ap2.add_argument("--stackmon", metavar="P,S", type=u, help="write stacktrace to Path every S second, for example --stackmon=./st/%%Y-%%m/%%d/%%H%%M.xz,60")
ap2.add_argument("--stackmon", metavar="P,S", type=u, help="write stacktrace to Path every S second, for example --stackmon=\033[32m./st/%%Y-%%m/%%d/%%H%%M.xz,60")
ap2.add_argument("--log-thrs", metavar="SEC", type=float, help="list active threads every SEC")
ap2.add_argument("--log-fk", metavar="REGEX", type=u, default="", help="log filekey params for files where path matches REGEX; '.' (a single dot) = all files")
ap2.add_argument("--log-fk", metavar="REGEX", type=u, default="", help="log filekey params for files where path matches REGEX; [\033[32m.\033[0m] (a single dot) = all files")
ap2.add_argument("--bak-flips", action="store_true", help="[up2k] if a client uploads a bitflipped/corrupted chunk, store a copy according to --bf-nc and --bf-dir")
ap2.add_argument("--bf-nc", metavar="NUM", type=int, default=200, help="bak-flips: stop if there's more than NUM files at --kf-dir already; default: 6.3 GiB max (200*32M)")
ap2.add_argument("--bf-dir", metavar="PATH", type=u, default="bf", help="bak-flips: store corrupted chunks at PATH; default: folder named 'bf' wherever copyparty was started")
# fmt: on

ap2 = ap.add_argument_group("help sections")
@@ -817,8 +835,10 @@ def main(argv: Optional[list[str]] = None) -> None:
time.sleep(2)

try:
if len(argv) == 1 and (ANYWIN or not os.geteuid()):
argv.extend(["-p80,443,3923", "--ign-ebind"])
if len(argv) == 1:
argv.extend(["--qr"])
if ANYWIN or not os.geteuid():
argv.extend(["-p80,443,3923", "--ign-ebind"])
except:
pass

@@ -860,7 +880,7 @@ def main(argv: Optional[list[str]] = None) -> None:
if re.match("c[^,]", opt):
mod = True
na.append("c," + opt[1:])
elif re.sub("^[rwmdg]*", "", opt) and "," not in opt:
elif re.sub("^[rwmdgG]*", "", opt) and "," not in opt:
mod = True
perm = opt[0]
if perm == "a":

@@ -901,6 +921,9 @@ def main(argv: Optional[list[str]] = None) -> None:
zs = "argument {} cannot be '{}'; try one of these: {}"
raise Exception(zs.format(arg, val, okays))

if not al.qrs and [k for k in argv if k.startswith("--qr")]:
al.qr = True

if HAVE_SSL:
if al.ssl_ver:
configure_ssl_ver(al)

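Many help strings in the run_argparse hunks above now embed ANSI SGR sequences (`\033[33m` for argument names, `\033[32m` for example values, `\033[0m` to reset); when VT100 support is missing, RiceFormatter removes them with the regex shown earlier. A minimal sketch of that round-trip, using a made-up help string:

```python
import re

colored = "add account, \033[33mUSER\033[0m:\033[33mPASS\033[0m; example [\033[32med:wark\033[0m]"

# same pattern RiceFormatter applies when VT100 is unavailable
plain = re.sub("\033\\[[0-9;]+m", "", colored)
print(plain)  # -> add account, USER:PASS; example [ed:wark]
```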
@@ -1,8 +1,8 @@
# coding: utf-8

VERSION = (1, 4, 2)
VERSION = (1, 4, 6)
CODENAME = "mostly reliable"
BUILD_DT = (2022, 9, 25)
BUILD_DT = (2022, 10, 13)

S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

@@ -58,18 +58,20 @@ class AXS(object):
umove: Optional[Union[list[str], set[str]]] = None,
udel: Optional[Union[list[str], set[str]]] = None,
uget: Optional[Union[list[str], set[str]]] = None,
upget: Optional[Union[list[str], set[str]]] = None,
) -> None:
self.uread: set[str] = set(uread or [])
self.uwrite: set[str] = set(uwrite or [])
self.umove: set[str] = set(umove or [])
self.udel: set[str] = set(udel or [])
self.uget: set[str] = set(uget or [])
self.upget: set[str] = set(upget or [])

def __repr__(self) -> str:
return "AXS({})".format(
", ".join(
"{}={!r}".format(k, self.__dict__[k])
for k in "uread uwrite umove udel uget".split()
for k in "uread uwrite umove udel uget upget".split()
)
)

@@ -293,6 +295,7 @@ class VFS(object):
self.amove: dict[str, list[str]] = {}
self.adel: dict[str, list[str]] = {}
self.aget: dict[str, list[str]] = {}
self.apget: dict[str, list[str]] = {}

if realpath:
self.histpath = os.path.join(realpath, ".hist")  # db / thumbcache

@@ -384,8 +387,10 @@ class VFS(object):

return self, vpath

def can_access(self, vpath: str, uname: str) -> tuple[bool, bool, bool, bool, bool]:
"""can Read,Write,Move,Delete,Get"""
def can_access(
self, vpath: str, uname: str
) -> tuple[bool, bool, bool, bool, bool, bool]:
"""can Read,Write,Move,Delete,Get,Upget"""
vn, _ = self._find(vpath)
c = vn.axs
return (

@@ -394,6 +399,7 @@ class VFS(object):
uname in c.umove or "*" in c.umove,
uname in c.udel or "*" in c.udel,
uname in c.uget or "*" in c.uget,
uname in c.upget or "*" in c.upget,
)

def get(

@@ -441,11 +447,20 @@ class VFS(object):

def canonical(self, rem: str, resolve: bool = True) -> str:
"""returns the canonical path (fully-resolved absolute fs path)"""
rp = self.realpath
ap = self.realpath
if rem:
rp += "/" + rem
ap += "/" + rem

return absreal(rp) if resolve else rp
return absreal(ap) if resolve else ap

def dcanonical(self, rem: str) -> str:
"""resolves until the final component (filename)"""
ap = self.realpath
if rem:
ap += "/" + rem

ad, fn = os.path.split(ap)
return os.path.join(absreal(ad), fn)

def ls(
self,
@@ -728,7 +743,7 @@ class AuthSrv(object):
def _read_vol_str(
self, lvl: str, uname: str, axs: AXS, flags: dict[str, Any]
) -> None:
if lvl.strip("crwmdg"):
if lvl.strip("crwmdgG"):
raise Exception("invalid volflag: {},{}".format(lvl, uname))

if lvl == "c":

@@ -758,7 +773,9 @@ class AuthSrv(object):
("m", axs.umove),
("d", axs.udel),
("g", axs.uget),
]:
("G", axs.uget),
("G", axs.upget),
]:  # b bb bbb
if ch in lvl:
al.add(un)

@@ -808,7 +825,7 @@ class AuthSrv(object):

if self.args.v:
# list of src:dst:permset:permset:...
# permset is <rwmdg>[,username][,username] or <c>,<flag>[=args]
# permset is <rwmdgG>[,username][,username] or <c>,<flag>[=args]
for v_str in self.args.v:
m = re_vol.match(v_str)
if not m:

@@ -873,7 +890,7 @@ class AuthSrv(object):
vfs.all_vols = {}
vfs.get_all_vols(vfs.all_vols)

for perm in "read write move del get".split():
for perm in "read write move del get pget".split():
axs_key = "u" + perm
unames = ["*"] + list(acct.keys())
umap: dict[str, list[str]] = {x: [] for x in unames}

@@ -888,7 +905,7 @@ class AuthSrv(object):
all_users = {}
missing_users = {}
for axs in daxs.values():
for d in [axs.uread, axs.uwrite, axs.umove, axs.udel, axs.uget]:
for d in [axs.uread, axs.uwrite, axs.umove, axs.udel, axs.uget, axs.upget]:
for usr in d:
all_users[usr] = 1
if usr != "*" and usr not in acct:

@@ -1193,6 +1210,7 @@ class AuthSrv(object):
[" move", "umove"],
["delete", "udel"],
[" get", "uget"],
[" upget", "upget"],
]:
u = list(sorted(getattr(zv.axs, attr)))
u = ", ".join("\033[35meverybody\033[0m" if x == "*" else x for x in u)

@@ -1288,10 +1306,11 @@ class AuthSrv(object):
raise Exception("volume not found: " + zs)

self.log(str({"users": users, "vols": vols, "flags": flags}))
t = "/{}: read({}) write({}) move({}) del({}) get({})"
t = "/{}: read({}) write({}) move({}) del({}) get({}) upget({})"
for k, zv in self.vfs.all_vols.items():
vc = zv.axs
self.log(t.format(k, vc.uread, vc.uwrite, vc.umove, vc.udel, vc.uget))
vs = [k, vc.uread, vc.uwrite, vc.umove, vc.udel, vc.uget, vc.upget]
self.log(t.format(*vs))

flag_v = "v" in flags
flag_ln = "ln" in flags

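can_access now returns six booleans instead of five, with up-get last; both the FTP and HTTP frontends further down unpack it positionally. A small sketch of consuming it, with a hypothetical `asrv` (an AuthSrv built from volumes like the examples above) and account `u1`:

```python
# hypothetical setup: asrv is an AuthSrv instance, "u1" a configured account
r, w, m, d, g, upg = asrv.vfs.can_access("inc", "u1")
if upg and not r:
    # "G" volumes: u1 can upload and gets filekeys back for own uploads,
    # but cannot browse the folder contents
    print("upget-only access")
```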
@@ -9,6 +9,7 @@ import threading

import queue

from .__init__ import ANYWIN
from .authsrv import AuthSrv
from .broker_util import BrokerCli, ExceptionalQueue
from .httpsrv import HttpSrv

@@ -48,7 +49,11 @@ class MpWorker(BrokerCli):
# we inherited signal_handler from parent,
# replace it with something harmless
if not FAKE_MP:
for sig in [signal.SIGINT, signal.SIGTERM, signal.SIGUSR1]:
sigs = [signal.SIGINT, signal.SIGTERM]
if not ANYWIN:
sigs.append(signal.SIGUSR1)

for sig in sigs:
signal.signal(sig, self.signal_handler)

# starting to look like a good idea

@@ -94,6 +94,9 @@ class FtpFs(AbstractedFS):
self.cwd = "/"  # pyftpdlib convention of leading slash
self.root = "/var/lib/empty"

self.can_read = self.can_write = self.can_move = False
self.can_delete = self.can_get = self.can_upget = False

self.listdirinfo = self.listdir
self.chdir(".")

@@ -153,8 +156,14 @@ class FtpFs(AbstractedFS):

def chdir(self, path: str) -> None:
self.cwd = join(self.cwd, path)
x = self.hub.asrv.vfs.can_access(self.cwd.lstrip("/"), self.h.username)
self.can_read, self.can_write, self.can_move, self.can_delete, self.can_get = x
(
self.can_read,
self.can_write,
self.can_move,
self.can_delete,
self.can_get,
self.can_upget,
) = self.hub.asrv.vfs.can_access(self.cwd.lstrip("/"), self.h.username)

def mkdir(self, path: str) -> None:
ap = self.rv2a(path, w=True)

@@ -195,7 +204,7 @@

vp = join(self.cwd, path).lstrip("/")
try:
self.hub.up2k.handle_rm(self.uname, self.h.remote_ip, [vp])
self.hub.up2k.handle_rm(self.uname, self.h.remote_ip, [vp], [])
except Exception as ex:
raise FilesystemError(str(ex))

@@ -81,6 +81,7 @@ from .util import (
)

try:
import typing
from typing import Any, Generator, Match, Optional, Pattern, Type, Union
except:
pass

@@ -146,6 +147,7 @@ class HttpCli(object):
self.can_move = False
self.can_delete = False
self.can_get = False
self.can_upget = False
# post
self.parser: Optional[MultipartParser] = None
# end placeholders

@@ -362,6 +364,7 @@ class HttpCli(object):
self.mvol = self.asrv.vfs.amove[self.uname]
self.dvol = self.asrv.vfs.adel[self.uname]
self.gvol = self.asrv.vfs.aget[self.uname]
self.upvol = self.asrv.vfs.apget[self.uname]

if pwd:
self.out_headerlist.append(("Set-Cookie", self.get_pwd_cookie(pwd)[0]))

@@ -375,8 +378,14 @@ class HttpCli(object):
ptn: Optional[Pattern[str]] = self.conn.lf_url  # mypy404
self.do_log = not ptn or not ptn.search(self.req)

x = self.asrv.vfs.can_access(self.vpath, self.uname)
self.can_read, self.can_write, self.can_move, self.can_delete, self.can_get = x
(
self.can_read,
self.can_write,
self.can_move,
self.can_delete,
self.can_get,
self.can_upget,
) = self.asrv.vfs.can_access(self.vpath, self.uname)

try:
if self.mode in ["GET", "HEAD"]:

@@ -884,7 +893,7 @@ class HttpCli(object):
)

vsuf = ""
if self.can_read and "fk" in vfs.flags:
if (self.can_read or self.can_upget) and "fk" in vfs.flags:
vsuf = "?k=" + self.gen_fk(
self.args.fk_salt,
path,

@@ -920,6 +929,38 @@ class HttpCli(object):
self.reply(t.encode("utf-8"))
return True

def bakflip(self, f: typing.BinaryIO, ofs: int, sz: int, sha: str) -> None:
if not self.args.bak_flips or self.args.nw:
return

sdir = self.args.bf_dir
fp = os.path.join(sdir, sha)
if bos.path.exists(fp):
return self.log("no bakflip; have it", 6)

if not bos.path.isdir(sdir):
bos.makedirs(sdir)

if len(bos.listdir(sdir)) >= self.args.bf_nc:
return self.log("no bakflip; too many", 3)

nrem = sz
f.seek(ofs)
with open(fp, "wb") as fo:
while nrem:
buf = f.read(min(nrem, 512 * 1024))
if not buf:
break

nrem -= len(buf)
fo.write(buf)

if nrem:
self.log("bakflip truncated; {} remains".format(nrem), 1)
atomic_move(fp, fp + ".trunc")
else:
self.log("bakflip ok", 2)

def rand_name(self, fdir: str, fn: str, rnd: int) -> str:
ok = False
try:
@@ -1177,6 +1218,11 @@ class HttpCli(object):
post_sz, _, sha_b64 = hashcopy(reader, f, self.args.s_wr_slp)

if sha_b64 != chash:
try:
self.bakflip(f, cstart[0], post_sz, sha_b64)
except:
self.log("bakflip failed: " + min_ex())

t = "your chunk got corrupted somehow (received {} bytes); expected vs received hash:\n{}\n{}"
raise Pebkac(400, t.format(post_sz, chash, sha_b64))

@@ -1506,7 +1552,7 @@ class HttpCli(object):

for sz, sha_hex, sha_b64, ofn, lfn, ap in files:
vsuf = ""
if self.can_read and "fk" in vfs.flags:
if (self.can_read or self.can_upget) and "fk" in vfs.flags:
vsuf = "?k=" + self.gen_fk(
self.args.fk_salt,
ap,

@@ -2275,25 +2321,56 @@ class HttpCli(object):
ret: list[dict[str, Any]] = []
t0 = time.time()
lim = time.time() - self.args.unpost
fk_vols = {
vol: vol.flags["fk"]
for vp, vol in self.asrv.vfs.all_vols.items()
if "fk" in vol.flags and (vp in self.rvol or vp in self.upvol)
}
for vol in self.asrv.vfs.all_vols.values():
cur = idx.get_cur(vol.realpath)
if not cur:
continue

nfk = fk_vols.get(vol, 0)

q = "select sz, rd, fn, at from up where ip=? and at>?"
for sz, rd, fn, at in cur.execute(q, (self.ip, lim)):
vp = "/" + "/".join(x for x in [vol.vpath, rd, fn] if x)
if filt and filt not in vp:
continue

ret.append({"vp": quotep(vp), "sz": sz, "at": at})
rv = {"vp": quotep(vp), "sz": sz, "at": at, "nfk": nfk}
if nfk:
rv["ap"] = vol.canonical(vjoin(rd, fn))

ret.append(rv)
if len(ret) > 3000:
ret.sort(key=lambda x: x["at"], reverse=True)  # type: ignore
ret = ret[:2000]

ret.sort(key=lambda x: x["at"], reverse=True)  # type: ignore
ret = ret[:2000]
n = 0
for rv in ret[:11000]:
nfk = rv.pop("nfk")
if not nfk:
continue

ap = rv.pop("ap")
try:
st = bos.stat(ap)
except:
continue

fk = self.gen_fk(
self.args.fk_salt, ap, st.st_size, 0 if ANYWIN else st.st_ino
)
rv["vp"] += "?k=" + fk[:nfk]

n += 1
if n > 2000:
break

ret = ret[:2000]
jtxt = json.dumps(ret, indent=2, sort_keys=True).encode("utf-8", "replace")
self.log("{} #{} {:.2f}sec".format(lm, len(ret), time.time() - t0))
self.reply(jtxt, mime="application/json")
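With this change, unpost listings for filekey-enabled volumes carry a ready-to-use link; the helper keys `nfk` and `ap` are popped before the JSON is sent, so a surviving entry looks roughly like this (values made up):

```python
# illustrative only -- the real values come from the up2k database
entry = {
    "vp": "/inc/track01.opus?k=abcd",  # vpath with the truncated filekey appended
    "sz": 4886718,                     # size in bytes
    "at": 1665617340,                  # upload timestamp
}
```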
@@ -2309,7 +2386,10 @@ class HttpCli(object):
if not req:
req = [self.vpath]

x = self.conn.hsrv.broker.ask("up2k.handle_rm", self.uname, self.ip, req)
nlim = int(self.uparam.get("lim") or 0)
lim = [nlim, nlim] if nlim else []

x = self.conn.hsrv.broker.ask("up2k.handle_rm", self.uname, self.ip, req, lim)
self.loud_reply(x.get())
return True

@@ -2411,7 +2491,7 @@ class HttpCli(object):
vpnodes.append([quotep(vpath) + "/", html_escape(node, crlf=True)])

vn, rem = self.asrv.vfs.get(self.vpath, self.uname, False, False)
abspath = vn.canonical(rem)
abspath = vn.dcanonical(rem)
dbv, vrem = vn.get_dbv(rem)

try:

@@ -2452,13 +2532,15 @@ class HttpCli(object):
if thp:
return self.tx_file(thp)

if th_fmt == "p":
raise Pebkac(404)

return self.tx_ico(rem)

if not is_dir and (self.can_read or self.can_get):
if not self.can_read and "fk" in vn.flags:
vabs = vjoin(vn.realpath, rem)
correct = self.gen_fk(
self.args.fk_salt, vabs, st.st_size, 0 if ANYWIN else st.st_ino
self.args.fk_salt, abspath, st.st_size, 0 if ANYWIN else st.st_ino
)[: vn.flags["fk"]]
got = self.uparam.get("k")
if got != correct:

@@ -2503,6 +2585,8 @@ class HttpCli(object):
perms.append("delete")
if self.can_get:
perms.append("get")
if self.can_upget:
perms.append("upget")

url_suf = self.urlq({}, ["k"])
is_ls = "ls" in self.uparam

copyparty/stolen/qrcodegen.py (new file, 593 lines)
@@ -0,0 +1,593 @@
# coding: utf-8

# modified copy of Project Nayuki's qrcodegen (MIT-licensed);
# https://github.com/nayuki/QR-Code-generator/blob/daa3114/python/qrcodegen.py
# the original ^ is extremely well commented so refer to that for explanations

# hacks: binary-only, auto-ecc, render, py2-compat

from __future__ import print_function, unicode_literals

import collections
import itertools

try:
    from collections.abc import Sequence

    from typing import Callable, List, Optional, Tuple, Union
except:
    pass


def num_char_count_bits(ver: int) -> int:
    return 16 if (ver + 7) // 17 else 8


class Ecc(object):
    ordinal: int
    formatbits: int

    def __init__(self, i: int, fb: int) -> None:
        self.ordinal = i
        self.formatbits = fb

    LOW: "Ecc"
    MEDIUM: "Ecc"
    QUARTILE: "Ecc"
    HIGH: "Ecc"


Ecc.LOW = Ecc(0, 1)
Ecc.MEDIUM = Ecc(1, 0)
Ecc.QUARTILE = Ecc(2, 3)
Ecc.HIGH = Ecc(3, 2)


class QrSegment(object):
    @staticmethod
    def make_seg(data: Union[bytes, Sequence[int]]) -> "QrSegment":
        bb = _BitBuffer()
        for b in data:
            bb.append_bits(b, 8)
        return QrSegment(len(data), bb)

    numchars: int  # num bytes, not the same as the data's bit length
    bitdata: List[int]  # The data bits of this segment

    def __init__(self, numch: int, bitdata: Sequence[int]) -> None:
        if numch < 0:
            raise ValueError()
        self.numchars = numch
        self.bitdata = list(bitdata)

    @staticmethod
    def get_total_bits(segs: Sequence["QrSegment"], ver: int) -> Optional[int]:
        result = 0
        for seg in segs:
            ccbits: int = num_char_count_bits(ver)
            if seg.numchars >= (1 << ccbits):
                return None  # segment length doesn't fit the field's bit width
            result += 4 + ccbits + len(seg.bitdata)
        return result


class QrCode(object):
    @staticmethod
    def encode_binary(data: Union[bytes, Sequence[int]]) -> "QrCode":
        return QrCode.encode_segments([QrSegment.make_seg(data)])

    @staticmethod
    def encode_segments(
        segs: Sequence[QrSegment],
        ecl: Ecc = Ecc.LOW,
        minver: int = 2,
        maxver: int = 40,
        mask: int = -1,
    ) -> "QrCode":
        for ver in range(minver, maxver + 1):
            datacapacitybits: int = QrCode._get_num_data_codewords(ver, ecl) * 8
            datausedbits: Optional[int] = QrSegment.get_total_bits(segs, ver)
            if (datausedbits is not None) and (datausedbits <= datacapacitybits):
                break

        assert datausedbits

        for newecl in (
            Ecc.MEDIUM,
            Ecc.QUARTILE,
            Ecc.HIGH,
        ):
            if datausedbits <= QrCode._get_num_data_codewords(ver, newecl) * 8:
                ecl = newecl

        # Concatenate all segments to create the data bit string
        bb = _BitBuffer()
        for seg in segs:
            bb.append_bits(4, 4)
            bb.append_bits(seg.numchars, num_char_count_bits(ver))
            bb.extend(seg.bitdata)
        assert len(bb) == datausedbits

        # Add terminator and pad up to a byte if applicable
        datacapacitybits = QrCode._get_num_data_codewords(ver, ecl) * 8
        assert len(bb) <= datacapacitybits
        bb.append_bits(0, min(4, datacapacitybits - len(bb)))
        bb.append_bits(0, -len(bb) % 8)
        assert len(bb) % 8 == 0

        # Pad with alternating bytes until data capacity is reached
        for padbyte in itertools.cycle((0xEC, 0x11)):
            if len(bb) >= datacapacitybits:
                break
            bb.append_bits(padbyte, 8)

        # Pack bits into bytes in big endian
        datacodewords = bytearray([0] * (len(bb) // 8))
        for (i, bit) in enumerate(bb):
            datacodewords[i >> 3] |= bit << (7 - (i & 7))

        return QrCode(ver, ecl, datacodewords, mask)

|
||||
ver: int
|
||||
size: int # w/h; 21..177 (ver * 4 + 17)
|
||||
ecclvl: Ecc
|
||||
mask: int # 0..7
|
||||
modules: List[List[bool]]
|
||||
unmaskable: List[List[bool]]
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
ver: int,
|
||||
ecclvl: Ecc,
|
||||
datacodewords: Union[bytes, Sequence[int]],
|
||||
msk: int,
|
||||
) -> None:
|
||||
self.ver = ver
|
||||
self.size = ver * 4 + 17
|
||||
self.ecclvl = ecclvl
|
||||
|
||||
self.modules = [[False] * self.size for _ in range(self.size)]
|
||||
self.unmaskable = [[False] * self.size for _ in range(self.size)]
|
||||
|
||||
# Compute ECC, draw modules
|
||||
self._draw_function_patterns()
|
||||
allcodewords: bytes = self._add_ecc_and_interleave(bytearray(datacodewords))
|
||||
self._draw_codewords(allcodewords)
|
||||
|
||||
if msk == -1: # automask
|
||||
minpenalty: int = 1 << 32
|
||||
for i in range(8):
|
||||
self._apply_mask(i)
|
||||
self._draw_format_bits(i)
|
||||
penalty = self._get_penalty_score()
|
||||
if penalty < minpenalty:
|
||||
msk = i
|
||||
minpenalty = penalty
|
||||
self._apply_mask(i) # xor/undo
|
||||
|
||||
assert 0 <= msk <= 7
|
||||
self.mask = msk
|
||||
self._apply_mask(msk) # Apply the final choice of mask
|
||||
self._draw_format_bits(msk) # Overwrite old format bits
|
||||
|
||||
def render(self, zoom=1, pad=4) -> str:
|
||||
tab = self.modules
|
||||
sz = self.size
|
||||
if sz % 2 and zoom == 1:
|
||||
tab.append([False] * sz)
|
||||
|
||||
tab = [[False] * sz] * pad + tab + [[False] * sz] * pad
|
||||
tab = [[False] * pad + x + [False] * pad for x in tab]
|
||||
|
||||
rows: list[str] = []
|
||||
if zoom == 1:
|
||||
for y in range(0, len(tab), 2):
|
||||
row = ""
|
||||
for x in range(len(tab[y])):
|
||||
v = 2 if tab[y][x] else 0
|
||||
v += 1 if tab[y + 1][x] else 0
|
||||
row += " ▄▀█"[v]
|
||||
rows.append(row)
|
||||
else:
|
||||
for tr in tab:
|
||||
row = ""
|
||||
for zb in tr:
|
||||
row += " █"[int(zb)] * 2
|
||||
rows.append(row)
|
||||
|
||||
return "\n".join(rows)
|
||||
|
||||
    def _draw_function_patterns(self) -> None:
        # Draw horizontal and vertical timing patterns
        for i in range(self.size):
            self._set_function_module(6, i, i % 2 == 0)
            self._set_function_module(i, 6, i % 2 == 0)

        # Draw 3 finder patterns (all corners except bottom right; overwrites some timing modules)
        self._draw_finder_pattern(3, 3)
        self._draw_finder_pattern(self.size - 4, 3)
        self._draw_finder_pattern(3, self.size - 4)

        # Draw numerous alignment patterns
        alignpatpos: List[int] = self._get_alignment_pattern_positions()
        numalign: int = len(alignpatpos)
        skips: Sequence[Tuple[int, int]] = (
            (0, 0),
            (0, numalign - 1),
            (numalign - 1, 0),
        )
        for i in range(numalign):
            for j in range(numalign):
                if (i, j) not in skips:  # avoid finder corners
                    self._draw_alignment_pattern(alignpatpos[i], alignpatpos[j])

        # draw config data with dummy mask value; ctor overwrites it
        self._draw_format_bits(0)
        self._draw_ver()

    def _draw_format_bits(self, mask: int) -> None:
        # Calculate error correction code and pack bits; ecclvl is uint2, mask is uint3
        data: int = self.ecclvl.formatbits << 3 | mask
        rem: int = data
        for _ in range(10):
            rem = (rem << 1) ^ ((rem >> 9) * 0x537)
        bits: int = (data << 10 | rem) ^ 0x5412  # uint15
        assert bits >> 15 == 0

        # first copy
        for i in range(0, 6):
            self._set_function_module(8, i, _get_bit(bits, i))
        self._set_function_module(8, 7, _get_bit(bits, 6))
        self._set_function_module(8, 8, _get_bit(bits, 7))
        self._set_function_module(7, 8, _get_bit(bits, 8))
        for i in range(9, 15):
            self._set_function_module(14 - i, 8, _get_bit(bits, i))

        # second copy
        for i in range(0, 8):
            self._set_function_module(self.size - 1 - i, 8, _get_bit(bits, i))
        for i in range(8, 15):
            self._set_function_module(8, self.size - 15 + i, _get_bit(bits, i))
        self._set_function_module(8, self.size - 8, True)  # Always dark

    def _draw_ver(self) -> None:
        if self.ver < 7:
            return

        # Calculate error correction code and pack bits
        rem: int = self.ver  # ver is uint6, 7..40
        for _ in range(12):
            rem = (rem << 1) ^ ((rem >> 11) * 0x1F25)
        bits: int = self.ver << 12 | rem  # uint18
        assert bits >> 18 == 0

        # Draw two copies
        for i in range(18):
            bit: bool = _get_bit(bits, i)
            a: int = self.size - 11 + i % 3
            b: int = i // 3
            self._set_function_module(a, b, bit)
            self._set_function_module(b, a, bit)

    def _draw_finder_pattern(self, x: int, y: int) -> None:
        for dy in range(-4, 5):
            for dx in range(-4, 5):
                xx, yy = x + dx, y + dy
                if (0 <= xx < self.size) and (0 <= yy < self.size):
                    # Chebyshev/infinity norm
                    self._set_function_module(
                        xx, yy, max(abs(dx), abs(dy)) not in (2, 4)
                    )

    def _draw_alignment_pattern(self, x: int, y: int) -> None:
        for dy in range(-2, 3):
            for dx in range(-2, 3):
                self._set_function_module(x + dx, y + dy, max(abs(dx), abs(dy)) != 1)

    def _set_function_module(self, x: int, y: int, isdark: bool) -> None:
        self.modules[y][x] = isdark
        self.unmaskable[y][x] = True

    def _add_ecc_and_interleave(self, data: bytearray) -> bytes:
        ver: int = self.ver
        assert len(data) == QrCode._get_num_data_codewords(ver, self.ecclvl)

        # Calculate parameter numbers
        numblocks: int = QrCode._NUM_ERROR_CORRECTION_BLOCKS[self.ecclvl.ordinal][ver]
        blockecclen: int = QrCode._ECC_CODEWORDS_PER_BLOCK[self.ecclvl.ordinal][ver]
        rawcodewords: int = QrCode._get_num_raw_data_modules(ver) // 8
        numshortblocks: int = numblocks - rawcodewords % numblocks
        shortblocklen: int = rawcodewords // numblocks

        # Split data into blocks and append ECC to each block
        blocks: List[bytes] = []
        rsdiv: bytes = QrCode._reed_solomon_compute_divisor(blockecclen)
        k: int = 0
        for i in range(numblocks):
            dat: bytearray = data[
                k : k + shortblocklen - blockecclen + (0 if i < numshortblocks else 1)
            ]
            k += len(dat)
            ecc: bytes = QrCode._reed_solomon_compute_remainder(dat, rsdiv)
            if i < numshortblocks:
                dat.append(0)
            blocks.append(dat + ecc)
        assert k == len(data)

        # Interleave (not concatenate) the bytes from every block into a single sequence
        result = bytearray()
        for i in range(len(blocks[0])):
            for (j, blk) in enumerate(blocks):
                # Skip the padding byte in short blocks
                if (i != shortblocklen - blockecclen) or (j >= numshortblocks):
                    result.append(blk[i])
        assert len(result) == rawcodewords
        return result

    def _draw_codewords(self, data: bytes) -> None:
        assert len(data) == QrCode._get_num_raw_data_modules(self.ver) // 8

        i: int = 0  # Bit index into the data
        for right in range(self.size - 1, 0, -2):
            # idx of right column in each column pair
            if right <= 6:
                right -= 1
            for vert in range(self.size):  # Vertical counter
                for j in range(2):
                    x: int = right - j
                    upward: bool = (right + 1) & 2 == 0
                    y: int = (self.size - 1 - vert) if upward else vert
                    if (not self.unmaskable[y][x]) and (i < len(data) * 8):
                        self.modules[y][x] = _get_bit(data[i >> 3], 7 - (i & 7))
                        i += 1
                    # any remainder bits (0..7) were set 0/false/light by ctor

        assert i == len(data) * 8

    def _apply_mask(self, mask: int) -> None:
        masker: Callable[[int, int], int] = QrCode._MASK_PATTERNS[mask]
        for y in range(self.size):
            for x in range(self.size):
                self.modules[y][x] ^= (masker(x, y) == 0) and (
                    not self.unmaskable[y][x]
                )

    def _get_penalty_score(self) -> int:
        result: int = 0
        size: int = self.size
        modules: List[List[bool]] = self.modules

        # Adjacent modules in row having same color, and finder-like patterns
        for y in range(size):
            runcolor: bool = False
            runx: int = 0
            runhistory = collections.deque([0] * 7, 7)
            for x in range(size):
                if modules[y][x] == runcolor:
                    runx += 1
                    if runx == 5:
                        result += QrCode._PENALTY_N1
                    elif runx > 5:
                        result += 1
                else:
                    self._finder_penalty_add_history(runx, runhistory)
                    if not runcolor:
                        result += (
                            self._finder_penalty_count_patterns(runhistory)
                            * QrCode._PENALTY_N3
                        )
                    runcolor = modules[y][x]
                    runx = 1
            result += (
                self._finder_penalty_terminate_and_count(runcolor, runx, runhistory)
|
||||
* QrCode._PENALTY_N3
|
||||
)
|
||||
|
||||
# Adjacent modules in column having same color, and finder-like patterns
|
||||
for x in range(size):
|
||||
runcolor = False
|
||||
runy = 0
|
||||
runhistory = collections.deque([0] * 7, 7)
|
||||
for y in range(size):
|
||||
if modules[y][x] == runcolor:
|
||||
runy += 1
|
||||
if runy == 5:
|
||||
result += QrCode._PENALTY_N1
|
||||
elif runy > 5:
|
||||
result += 1
|
||||
else:
|
||||
self._finder_penalty_add_history(runy, runhistory)
|
||||
if not runcolor:
|
||||
result += (
|
||||
self._finder_penalty_count_patterns(runhistory)
|
||||
* QrCode._PENALTY_N3
|
||||
)
|
||||
runcolor = modules[y][x]
|
||||
runy = 1
|
||||
result += (
|
||||
self._finder_penalty_terminate_and_count(runcolor, runy, runhistory)
|
||||
* QrCode._PENALTY_N3
|
||||
)
|
||||
|
||||
# 2*2 blocks of modules having same color
|
||||
for y in range(size - 1):
|
||||
for x in range(size - 1):
|
||||
if (
|
||||
modules[y][x]
|
||||
== modules[y][x + 1]
|
||||
== modules[y + 1][x]
|
||||
== modules[y + 1][x + 1]
|
||||
):
|
||||
result += QrCode._PENALTY_N2
|
||||
|
||||
# Balance of dark and light modules
|
||||
dark: int = sum((1 if cell else 0) for row in modules for cell in row)
|
||||
total: int = size ** 2 # Note that size is odd, so dark/total != 1/2
|
||||
|
||||
# Compute the smallest integer k >= 0 such that (45-5k)% <= dark/total <= (55+5k)%
|
||||
k: int = (abs(dark * 20 - total * 10) + total - 1) // total - 1
|
||||
assert 0 <= k <= 9
|
||||
result += k * QrCode._PENALTY_N4
|
||||
assert 0 <= result <= 2568888
|
||||
# ^ Non-tight upper bound based on default values of PENALTY_N1, ..., N4
|
||||
|
||||
return result
|
||||
|
||||
def _get_alignment_pattern_positions(self) -> List[int]:
|
||||
ver: int = self.ver
|
||||
if ver == 1:
|
||||
return []
|
||||
|
||||
numalign: int = ver // 7 + 2
|
||||
step: int = (
|
||||
26
|
||||
if (ver == 32)
|
||||
else (ver * 4 + numalign * 2 + 1) // (numalign * 2 - 2) * 2
|
||||
)
|
||||
result: List[int] = [
|
||||
(self.size - 7 - i * step) for i in range(numalign - 1)
|
||||
] + [6]
|
||||
return list(reversed(result))
|
||||
|
||||
@staticmethod
|
||||
def _get_num_raw_data_modules(ver: int) -> int:
|
||||
result: int = (16 * ver + 128) * ver + 64
|
||||
if ver >= 2:
|
||||
numalign: int = ver // 7 + 2
|
||||
result -= (25 * numalign - 10) * numalign - 55
|
||||
if ver >= 7:
|
||||
result -= 36
|
||||
assert 208 <= result <= 29648
|
||||
return result
|
||||
|
||||
@staticmethod
|
||||
def _get_num_data_codewords(ver: int, ecl: Ecc) -> int:
|
||||
return (
|
||||
QrCode._get_num_raw_data_modules(ver) // 8
|
||||
- QrCode._ECC_CODEWORDS_PER_BLOCK[ecl.ordinal][ver]
|
||||
* QrCode._NUM_ERROR_CORRECTION_BLOCKS[ecl.ordinal][ver]
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def _reed_solomon_compute_divisor(degree: int) -> bytes:
|
||||
if not (1 <= degree <= 255):
|
||||
raise ValueError("Degree out of range")
|
||||
|
||||
# Polynomial coefficients are stored from highest to lowest power, excluding the leading term which is always 1.
|
||||
# For example the polynomial x^3 + 255x^2 + 8x + 93 is stored as the uint8 array [255, 8, 93].
|
||||
result = bytearray([0] * (degree - 1) + [1]) # start with monomial x^0
|
||||
|
||||
# Compute the product polynomial (x - r^0) * (x - r^1) * (x - r^2) * ... * (x - r^{degree-1}),
|
||||
# and drop the highest monomial term which is always 1x^degree.
|
||||
# Note that r = 0x02, which is a generator element of this field GF(2^8/0x11D).
|
||||
root: int = 1
|
||||
for _ in range(degree):
|
||||
# Multiply the current product by (x - r^i)
|
||||
for j in range(degree):
|
||||
result[j] = QrCode._reed_solomon_multiply(result[j], root)
|
||||
if j + 1 < degree:
|
||||
result[j] ^= result[j + 1]
|
||||
root = QrCode._reed_solomon_multiply(root, 0x02)
|
||||
|
||||
return result
|
||||
|
||||
@staticmethod
|
||||
def _reed_solomon_compute_remainder(data: bytes, divisor: bytes) -> bytes:
|
||||
result = bytearray([0] * len(divisor))
|
||||
for b in data: # Polynomial division
|
||||
factor: int = b ^ result.pop(0)
|
||||
result.append(0)
|
||||
for (i, coef) in enumerate(divisor):
|
||||
result[i] ^= QrCode._reed_solomon_multiply(coef, factor)
|
||||
|
||||
return result
|
||||
|
||||
@staticmethod
|
||||
def _reed_solomon_multiply(x: int, y: int) -> int:
|
||||
if (x >> 8 != 0) or (y >> 8 != 0):
|
||||
raise ValueError("Byte out of range")
|
||||
z: int = 0 # Russian peasant multiplication
|
||||
for i in reversed(range(8)):
|
||||
z = (z << 1) ^ ((z >> 7) * 0x11D)
|
||||
z ^= ((y >> i) & 1) * x
|
||||
assert z >> 8 == 0
|
||||
return z
|
||||
|
||||
def _finder_penalty_count_patterns(self, runhistory: collections.deque[int]) -> int:
|
||||
n: int = runhistory[1]
|
||||
assert n <= self.size * 3
|
||||
core: bool = (
|
||||
n > 0
|
||||
and (runhistory[2] == runhistory[4] == runhistory[5] == n)
|
||||
and runhistory[3] == n * 3
|
||||
)
|
||||
return (
|
||||
1 if (core and runhistory[0] >= n * 4 and runhistory[6] >= n) else 0
|
||||
) + (1 if (core and runhistory[6] >= n * 4 and runhistory[0] >= n) else 0)
|
||||
|
||||
def _finder_penalty_terminate_and_count(
|
||||
self,
|
||||
currentruncolor: bool,
|
||||
currentrunlength: int,
|
||||
runhistory: collections.deque[int],
|
||||
) -> int:
|
||||
if currentruncolor: # Terminate dark run
|
||||
self._finder_penalty_add_history(currentrunlength, runhistory)
|
||||
currentrunlength = 0
|
||||
currentrunlength += self.size # Add light border to final run
|
||||
self._finder_penalty_add_history(currentrunlength, runhistory)
|
||||
return self._finder_penalty_count_patterns(runhistory)
|
||||
|
||||
def _finder_penalty_add_history(
|
||||
self, currentrunlength: int, runhistory: collections.deque[int]
|
||||
) -> None:
|
||||
if runhistory[0] == 0:
|
||||
currentrunlength += self.size # Add light border to initial run
|
||||
|
||||
runhistory.appendleft(currentrunlength)
|
||||
|
||||
_PENALTY_N1: int = 3
|
||||
_PENALTY_N2: int = 3
|
||||
_PENALTY_N3: int = 40
|
||||
_PENALTY_N4: int = 10
|
||||
|
||||
# fmt: off
|
||||
_ECC_CODEWORDS_PER_BLOCK: Sequence[Sequence[int]] = (
|
||||
(-1, 7, 10, 15, 20, 26, 18, 20, 24, 30, 18, 20, 24, 26, 30, 22, 24, 28, 30, 28, 28, 28, 28, 30, 30, 26, 28, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30), # noqa: E241 # L
|
||||
(-1, 10, 16, 26, 18, 24, 16, 18, 22, 22, 26, 30, 22, 22, 24, 24, 28, 28, 26, 26, 26, 26, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28), # noqa: E241 # M
|
||||
(-1, 13, 22, 18, 26, 18, 24, 18, 22, 20, 24, 28, 26, 24, 20, 30, 24, 28, 28, 26, 30, 28, 30, 30, 30, 30, 28, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30), # noqa: E241 # Q
|
||||
(-1, 17, 28, 22, 16, 22, 28, 26, 26, 24, 28, 24, 28, 22, 24, 24, 30, 28, 28, 26, 28, 30, 24, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30, 30)) # noqa: E241 # H
|
||||
|
||||
_NUM_ERROR_CORRECTION_BLOCKS: Sequence[Sequence[int]] = (
|
||||
(-1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 4, 4, 4, 4, 4, 6, 6, 6, 6, 7, 8, 8, 9, 9, 10, 12, 12, 12, 13, 14, 15, 16, 17, 18, 19, 19, 20, 21, 22, 24, 25), # noqa: E241 # L
|
||||
(-1, 1, 1, 1, 2, 2, 4, 4, 4, 5, 5, 5, 8, 9, 9, 10, 10, 11, 13, 14, 16, 17, 17, 18, 20, 21, 23, 25, 26, 28, 29, 31, 33, 35, 37, 38, 40, 43, 45, 47, 49), # noqa: E241 # M
|
||||
(-1, 1, 1, 2, 2, 4, 4, 6, 6, 8, 8, 8, 10, 12, 16, 12, 17, 16, 18, 21, 20, 23, 23, 25, 27, 29, 34, 34, 35, 38, 40, 43, 45, 48, 51, 53, 56, 59, 62, 65, 68), # noqa: E241 # Q
|
||||
(-1, 1, 1, 2, 4, 4, 4, 5, 6, 8, 8, 11, 11, 16, 16, 18, 16, 19, 21, 25, 25, 25, 34, 30, 32, 35, 37, 40, 42, 45, 48, 51, 54, 57, 60, 63, 66, 70, 74, 77, 81)) # noqa: E241 # H
|
||||
# fmt: on
|
||||
|
||||
_MASK_PATTERNS: Sequence[Callable[[int, int], int]] = (
|
||||
(lambda x, y: (x + y) % 2),
|
||||
(lambda x, y: y % 2),
|
||||
(lambda x, y: x % 3),
|
||||
(lambda x, y: (x + y) % 3),
|
||||
(lambda x, y: (x // 3 + y // 2) % 2),
|
||||
(lambda x, y: x * y % 2 + x * y % 3),
|
||||
(lambda x, y: (x * y % 2 + x * y % 3) % 2),
|
||||
(lambda x, y: ((x + y) % 2 + x * y % 3) % 2),
|
||||
)
|
||||
|
||||
|
||||
class _BitBuffer(list): # type: ignore
|
||||
def append_bits(self, val: int, n: int) -> None:
|
||||
if (n < 0) or (val >> n != 0):
|
||||
raise ValueError("Value out of range")
|
||||
|
||||
self.extend(((val >> i) & 1) for i in reversed(range(n)))
|
||||
|
||||
|
||||
def _get_bit(x: int, i: int) -> bool:
|
||||
return (x >> i) & 1 != 0
|
||||
|
||||
|
||||
class DataTooLongError(ValueError):
|
||||
pass
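

# usage sketch, added for illustration -- not part of the vendored module.
# assumes this file stays importable as copyparty.stolen.qrcodegen (that is
# how the tcpsrv changes further down import it); the url is a made-up example
if __name__ == "__main__":
    demo = QrCode.encode_binary(b"http://192.168.1.2:3923/")
    # render(zoom, pad) is the same call the --qr startup banner uses;
    # it returns the symbol as lines of space / U+2588 full-block characters
    print(demo.render(2, 4))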
|
||||
@@ -16,7 +16,7 @@ import codecs
|
||||
import platform
|
||||
import sys
|
||||
|
||||
PY3 = sys.version_info[0] > 2
|
||||
PY3 = sys.version_info > (3,)
|
||||
WINDOWS = platform.system() == "Windows"
|
||||
FS_ERRORS = "surrogateescape"
|
||||
|
||||
@@ -26,20 +26,6 @@ except:
|
||||
pass
|
||||
|
||||
|
||||
def u(text: Any) -> str:
|
||||
if PY3:
|
||||
return text
|
||||
else:
|
||||
return text.decode("unicode_escape")
|
||||
|
||||
|
||||
def b(data: Any) -> bytes:
|
||||
if PY3:
|
||||
return data.encode("latin1")
|
||||
else:
|
||||
return data
|
||||
|
||||
|
||||
if PY3:
|
||||
_unichr = chr
|
||||
bytes_chr = lambda code: bytes((code,))
|
||||
@@ -171,9 +157,6 @@ def decodefilename(fn: bytes) -> str:
|
||||
|
||||
|
||||
FS_ENCODING = sys.getfilesystemencoding()
|
||||
# FS_ENCODING = "ascii"; fn = b("[abc\xff]"); encoded = u("[abc\udcff]")
|
||||
# FS_ENCODING = 'cp932'; fn = b('[abc\x81\x00]'); encoded = u('[abc\udc81\x00]')
|
||||
# FS_ENCODING = 'UTF-8'; fn = b('[abc\xff]'); encoded = u('[abc\udcff]')
|
||||
|
||||
|
||||
if WINDOWS and not PY3:
|
||||
|
||||
@@ -24,7 +24,7 @@ try:
|
||||
except:
|
||||
pass
|
||||
|
||||
from .__init__ import ANYWIN, MACOS, PY2, VT100, WINDOWS, EnvParams, unicode
|
||||
from .__init__ import ANYWIN, MACOS, VT100, EnvParams, unicode
|
||||
from .authsrv import AuthSrv
|
||||
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE
|
||||
from .tcpsrv import TcpSrv
|
||||
@@ -222,7 +222,11 @@ class SvcHub(object):
|
||||
return
|
||||
|
||||
time.sleep(0.1) # purely cosmetic dw
|
||||
self.log("root", "workers OK\n")
|
||||
if self.tcpsrv.qr:
|
||||
self.log("qr-code", self.tcpsrv.qr)
|
||||
else:
|
||||
self.log("root", "workers OK\n")
|
||||
|
||||
self.up2k.init_vols()
|
||||
|
||||
thr = threading.Thread(target=self.sd_notify, name="sd-notify")
|
||||
@@ -269,7 +273,8 @@ class SvcHub(object):
|
||||
|
||||
msg = "[+] opened logfile [{}]\n".format(fn)
|
||||
printed += msg
|
||||
lh.write("t0: {:.3f}\nargv: {}\n\n{}".format(self.E.t0, " ".join(argv), printed))
|
||||
t = "t0: {:.3f}\nargv: {}\n\n{}"
|
||||
lh.write(t.format(self.E.t0, " ".join(argv), printed))
|
||||
self.logf = lh
|
||||
self.logf_base_fn = base_fn
|
||||
print(msg, end="")
|
||||
@@ -417,7 +422,7 @@ class SvcHub(object):
|
||||
|
||||
with self.log_mutex:
|
||||
ts = datetime.utcnow().strftime("%Y-%m%d-%H%M%S.%f")[:-3]
|
||||
self.logf.write("@{} [{}] {}\n".format(ts, src, msg))
|
||||
self.logf.write("@{} [{}\033[0m] {}\n".format(ts, src, msg))
|
||||
|
||||
now = time.time()
|
||||
if now >= self.next_day:
|
||||
@@ -480,17 +485,10 @@ class SvcHub(object):
|
||||
print(*a, **ka)
|
||||
|
||||
def check_mp_support(self) -> str:
|
||||
vmin = sys.version_info[1]
|
||||
if WINDOWS:
|
||||
msg = "need python 3.3 or newer for multiprocessing;"
|
||||
if PY2 or vmin < 3:
|
||||
return msg
|
||||
elif MACOS:
|
||||
if MACOS:
|
||||
return "multiprocessing is wonky on mac osx;"
|
||||
else:
|
||||
msg = "need python 3.3+ for multiprocessing;"
|
||||
if PY2 or vmin < 3:
|
||||
return msg
|
||||
elif sys.version_info < (3, 3):
|
||||
return "need python 3.3 or newer for multiprocessing;"
|
||||
|
||||
try:
|
||||
x: mp.Queue[tuple[str, str]] = mp.Queue(1)
|
||||
|
||||
@@ -6,8 +6,9 @@ import re
|
||||
import socket
|
||||
import sys
|
||||
|
||||
from .__init__ import ANYWIN, MACOS, TYPE_CHECKING, unicode
|
||||
from .util import chkcmd
|
||||
from .__init__ import ANYWIN, MACOS, PY2, TYPE_CHECKING, VT100, unicode
|
||||
from .stolen.qrcodegen import QrCode
|
||||
from .util import chkcmd, sunpack, termsize
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .svchub import SvcHub
|
||||
@@ -30,6 +31,7 @@ class TcpSrv(object):
|
||||
self.stopping = False
|
||||
self.srv: list[socket.socket] = []
|
||||
self.nsrv = 0
|
||||
self.qr = ""
|
||||
ok: dict[str, list[int]] = {}
|
||||
for ip in self.args.i:
|
||||
ok[ip] = []
|
||||
@@ -60,6 +62,8 @@ class TcpSrv(object):
|
||||
for x in nonlocals:
|
||||
eps[x] = "external"
|
||||
|
||||
qr1: dict[str, list[int]] = {}
|
||||
qr2: dict[str, list[int]] = {}
|
||||
msgs = []
|
||||
title_tab: dict[str, dict[str, int]] = {}
|
||||
title_vars = [x[1:] for x in self.args.wintitle.split(" ") if x.startswith("$")]
|
||||
@@ -77,6 +81,13 @@ class TcpSrv(object):
|
||||
|
||||
msgs.append(t.format(proto, ip, port, desc))
|
||||
|
||||
is_ext = "external" in unicode(desc)
|
||||
qrt = qr1 if is_ext else qr2
|
||||
try:
|
||||
qrt[ip].append(port)
|
||||
except:
|
||||
qrt[ip] = [port]
|
||||
|
||||
if not self.args.wintitle:
|
||||
continue
|
||||
|
||||
@@ -86,7 +97,7 @@ class TcpSrv(object):
|
||||
ep = "{}:{}".format(ip, port)
|
||||
|
||||
hits = []
|
||||
if "pub" in title_vars and "external" in unicode(desc):
|
||||
if "pub" in title_vars and is_ext:
|
||||
hits.append(("pub", ep))
|
||||
|
||||
if "pub" in title_vars or "all" in title_vars:
|
||||
@@ -110,6 +121,9 @@ class TcpSrv(object):
|
||||
if self.args.wintitle:
|
||||
self._set_wintitle(title_tab)
|
||||
|
||||
if self.args.qr or self.args.qrs:
|
||||
self.qr = self._qr(qr1, qr2)
|
||||
|
||||
def _listen(self, ip: str, port: int) -> None:
|
||||
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
|
||||
@@ -330,19 +344,89 @@ class TcpSrv(object):
|
||||
|
||||
vs2 = {}
|
||||
for k, eps in vs.items():
|
||||
vs2[k] = {
|
||||
ep: 1
|
||||
for ep in eps.keys()
|
||||
if ":" not in ep or ep.split(":")[0] not in eps
|
||||
}
|
||||
filt = {ep: 1 for ep in eps if ":" not in ep}
|
||||
have = set(filt)
|
||||
for ep in sorted(eps):
|
||||
ip = ep.split(":")[0]
|
||||
if ip not in have:
|
||||
have.add(ip)
|
||||
filt[ep] = 1
|
||||
|
||||
lo = [x for x in filt if x.startswith("127.")]
|
||||
if len(filt) > 3 and lo:
|
||||
for ip in lo:
|
||||
filt.pop(ip)
|
||||
|
||||
vs2[k] = filt
|
||||
|
||||
title = ""
|
||||
vs = vs2
|
||||
for p in self.args.wintitle.split(" "):
|
||||
if p.startswith("$"):
|
||||
p = " and ".join(sorted(vs.get(p[1:], {"(None)": 1}).keys()))
|
||||
seps = list(sorted(vs.get(p[1:], {"(None)": 1}).keys()))
|
||||
p = ", ".join(seps[:3])
|
||||
if len(seps) > 3:
|
||||
p += ", ..."
|
||||
|
||||
title += "{} ".format(p)
|
||||
|
||||
print("\033]0;{}\033\\".format(title), file=sys.stderr, end="")
|
||||
sys.stderr.flush()
|
||||
|
||||
def _qr(self, t1: dict[str, list[int]], t2: dict[str, list[int]]) -> str:
|
||||
ip = None
|
||||
for ip in list(t1) + list(t2):
|
||||
if ip.startswith(self.args.qri):
|
||||
break
|
||||
ip = ""
|
||||
|
||||
if not ip:
|
||||
# maybe /bin/ip is missing or smth
|
||||
ip = self.args.qri
|
||||
|
||||
if not ip:
|
||||
return ""
|
||||
|
||||
if self.args.http_only:
|
||||
https = ""
|
||||
elif self.args.https_only:
|
||||
https = "s"
|
||||
else:
|
||||
https = "s" if self.args.qrs else ""
|
||||
|
||||
ports = t1.get(ip, t2.get(ip, []))
|
||||
dport = 443 if https else 80
|
||||
port = "" if dport in ports or not ports else ":{}".format(ports[0])
|
||||
txt = "http{}://{}{}/{}".format(https, ip, port, self.args.qrl)
|
||||
|
||||
btxt = txt.encode("utf-8")
|
||||
if PY2:
|
||||
btxt = sunpack(b"B" * len(btxt), btxt)
|
||||
|
||||
fg = self.args.qr_fg
|
||||
bg = self.args.qr_bg
|
||||
pad = self.args.qrp
|
||||
zoom = self.args.qrz
|
||||
qrc = QrCode.encode_binary(btxt)
|
||||
if zoom == 0:
|
||||
try:
|
||||
tw, th = termsize()
|
||||
tsz = min(tw // 2, th)
|
||||
zoom = 1 if qrc.size + pad * 2 >= tsz else 2
|
||||
except:
|
||||
zoom = 1
|
||||
|
||||
qr = qrc.render(zoom, pad)
|
||||
if not VT100:
|
||||
return "{}\n{}".format(txt, qr)
|
||||
|
||||
def ansify(m: re.Match) -> str:
|
||||
t = "\033[40;48;5;{}m{}\033[47;48;5;{}m"
|
||||
return t.format(fg, " " * len(m.group(1)), bg)
|
||||
|
||||
if zoom > 1:
|
||||
qr = re.sub("(█+)", ansify, qr)
|
||||
|
||||
qr = qr.replace("\n", "\033[K\n") + "\033[K" # win10do
|
||||
t = "{} \033[0;38;5;{};48;5;{}m\033[J\n{}\033[999G\033[0m\033[J"
|
||||
return t.format(txt, fg, bg, qr)
|
||||
|
||||
@@ -30,6 +30,8 @@ class ThumbCli(object):
|
||||
|
||||
try:
|
||||
c = hsrv.th_cfg
|
||||
if not c:
|
||||
raise Exception()
|
||||
except:
|
||||
c = {k: {} for k in ["thumbable", "pil", "vips", "ffi", "ffv", "ffa"]}
|
||||
|
||||
|
||||
@@ -621,7 +621,7 @@ class ThumbSrv(object):
|
||||
|
||||
def _clean(self, cat: str, thumbpath: str) -> int:
|
||||
# self.log("cln {}".format(thumbpath))
|
||||
exts = ["jpg", "webp"] if cat == "th" else ["opus", "caf"]
|
||||
exts = ["jpg", "webp", "png"] if cat == "th" else ["opus", "caf"]
|
||||
maxage = getattr(self.args, cat + "_maxage")
|
||||
now = time.time()
|
||||
prev_b64 = None
|
||||
|
||||
@@ -45,6 +45,7 @@ from .util import (
|
||||
s3dec,
|
||||
s3enc,
|
||||
sanitize_fn,
|
||||
spack,
|
||||
statdir,
|
||||
vjoin,
|
||||
vsplit,
|
||||
@@ -370,7 +371,7 @@ class Up2k(object):
|
||||
if vp:
|
||||
fvp = "{}/{}".format(vp, fvp)
|
||||
|
||||
self._handle_rm(LEELOO_DALLAS, "", fvp)
|
||||
self._handle_rm(LEELOO_DALLAS, "", fvp, [])
|
||||
nrm += 1
|
||||
|
||||
if nrm:
|
||||
@@ -689,10 +690,8 @@ class Up2k(object):
|
||||
rei = vol.flags.get("noidx")
|
||||
reh = vol.flags.get("nohash")
|
||||
n4g = bool(vol.flags.get("noforget"))
|
||||
|
||||
dev = 0
|
||||
if vol.flags.get("xdev"):
|
||||
dev = bos.stat(top).st_dev
|
||||
cst = bos.stat(top)
|
||||
dev = cst.st_dev if vol.flags.get("xdev") else 0
|
||||
|
||||
with self.mutex:
|
||||
reg = self.register_vpath(top, vol.flags)
|
||||
@@ -728,6 +727,7 @@ class Up2k(object):
|
||||
reh,
|
||||
n4g,
|
||||
[],
|
||||
cst,
|
||||
dev,
|
||||
bool(vol.flags.get("xvol")),
|
||||
)
|
||||
@@ -764,6 +764,7 @@ class Up2k(object):
|
||||
reh: Optional[Pattern[str]],
|
||||
n4g: bool,
|
||||
seen: list[str],
|
||||
cst: os.stat_result,
|
||||
dev: int,
|
||||
xvol: bool,
|
||||
) -> int:
|
||||
@@ -818,7 +819,7 @@ class Up2k(object):
|
||||
# self.log(" dir: {}".format(abspath))
|
||||
try:
|
||||
ret += self._build_dir(
|
||||
db, top, excl, abspath, rap, rei, reh, n4g, seen, dev, xvol
|
||||
db, top, excl, abspath, rap, rei, reh, n4g, seen, inf, dev, xvol
|
||||
)
|
||||
except:
|
||||
t = "failed to index subdir [{}]:\n{}"
|
||||
@@ -851,6 +852,7 @@ class Up2k(object):
|
||||
zh = hashlib.sha1()
|
||||
_ = [zh.update(str(x).encode("utf-8", "replace")) for x in files]
|
||||
|
||||
zh.update(spack(b"<d", cst.st_mtime))
|
||||
dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
|
||||
sql = "select d from dh where d = ? and h = ?"
|
||||
try:
|
||||
@@ -941,25 +943,25 @@ class Up2k(object):
|
||||
return -1
|
||||
|
||||
# drop shadowed folders
|
||||
for rd in unreg:
|
||||
for sh_rd in unreg:
|
||||
n = 0
|
||||
q = "select count(w) from up where (rd = ? or rd like ?||'%') and at == 0"
|
||||
for erd in [rd, "//" + w8b64enc(rd)]:
|
||||
for sh_erd in [sh_rd, "//" + w8b64enc(sh_rd)]:
|
||||
try:
|
||||
n = db.c.execute(q, (erd, erd + "/")).fetchone()[0]
|
||||
n = db.c.execute(q, (sh_erd, sh_erd + "/")).fetchone()[0]
|
||||
break
|
||||
except:
|
||||
pass
|
||||
|
||||
if n:
|
||||
t = "forgetting {} shadowed autoindexed files in [{}] > [{}]"
|
||||
self.log(t.format(n, top, rd))
|
||||
self.log(t.format(n, top, sh_rd))
|
||||
|
||||
q = "delete from dh where (d = ? or d like ?||'%')"
|
||||
db.c.execute(q, (erd, erd + "/"))
|
||||
db.c.execute(q, (sh_erd, sh_erd + "/"))
|
||||
|
||||
q = "delete from up where (rd = ? or rd like ?||'%') and at == 0"
|
||||
db.c.execute(q, (erd, erd + "/"))
|
||||
db.c.execute(q, (sh_erd, sh_erd + "/"))
|
||||
ret += n
|
||||
|
||||
if n4g:
|
||||
@@ -1924,6 +1926,7 @@ class Up2k(object):
|
||||
reg = self.registry[cj["ptop"]]
|
||||
vfs = self.asrv.vfs.all_vols[cj["vtop"]]
|
||||
n4g = vfs.flags.get("noforget")
|
||||
lost: list[tuple[str, str]] = []
|
||||
if cur:
|
||||
if self.no_expr_idx:
|
||||
q = r"select * from up where w = ?"
|
||||
@@ -1948,6 +1951,7 @@ class Up2k(object):
|
||||
if n4g:
|
||||
st = os.stat_result((0, -1, -1, 0, 0, 0, 0, 0, 0, 0))
|
||||
else:
|
||||
lost.append((dp_dir, dp_fn))
|
||||
continue
|
||||
|
||||
j = {
|
||||
@@ -1980,6 +1984,12 @@ class Up2k(object):
|
||||
# self.log("pop " + wark + " " + job["name"] + " handle_json db", 4)
|
||||
del reg[wark]
|
||||
|
||||
if lost:
|
||||
for dp_dir, dp_fn in lost:
|
||||
self.db_rm(cur, dp_dir, dp_fn)
|
||||
|
||||
cur.connection.commit()
|
||||
|
||||
if job or wark in reg:
|
||||
job = job or reg[wark]
|
||||
if job["prel"] == cj["prel"] and job["name"] == cj["name"]:
|
||||
@@ -2305,7 +2315,7 @@ class Up2k(object):
|
||||
flt = job["life"]
|
||||
vfs = self.asrv.vfs.all_vols[job["vtop"]]
|
||||
vlt = vfs.flags["lifetime"]
|
||||
if vlt and flt < vlt:
|
||||
if vlt and flt > 1 and flt < vlt:
|
||||
upt -= vlt - flt
|
||||
wake_sr = True
|
||||
t = "using client lifetime; at={:.0f} ({}-{})"
|
||||
@@ -2416,12 +2426,16 @@ class Up2k(object):
|
||||
v = (wark, int(ts), sz, rd, fn, ip or "", int(at or 0))
|
||||
db.execute(sql, v)
|
||||
|
||||
def handle_rm(self, uname: str, ip: str, vpaths: list[str]) -> str:
|
||||
def handle_rm(self, uname: str, ip: str, vpaths: list[str], lim: list[int]) -> str:
|
||||
n_files = 0
|
||||
ok = {}
|
||||
ng = {}
|
||||
for vp in vpaths:
|
||||
a, b, c = self._handle_rm(uname, ip, vp)
|
||||
if lim and lim[0] <= 0:
|
||||
self.log("hit delete limit of {} files".format(lim[1]), 3)
|
||||
break
|
||||
|
||||
a, b, c = self._handle_rm(uname, ip, vp, lim)
|
||||
n_files += a
|
||||
for k in b:
|
||||
ok[k] = 1
|
||||
@@ -2435,7 +2449,7 @@ class Up2k(object):
|
||||
return "deleted {} files (and {}/{} folders)".format(n_files, iok, iok + ing)
|
||||
|
||||
def _handle_rm(
|
||||
self, uname: str, ip: str, vpath: str
|
||||
self, uname: str, ip: str, vpath: str, lim: list[int]
|
||||
) -> tuple[int, list[str], list[str]]:
|
||||
self.db_act = time.time()
|
||||
try:
|
||||
@@ -2494,6 +2508,12 @@ class Up2k(object):
|
||||
n_files = 0
|
||||
for dbv, vrem, _, adir, files, rd, vd in g:
|
||||
for fn in [x[0] for x in files]:
|
||||
if lim:
|
||||
lim[0] -= 1
|
||||
if lim[0] < 0:
|
||||
self.log("hit delete limit of {} files".format(lim[1]), 3)
|
||||
break
|
||||
|
||||
n_files += 1
|
||||
abspath = os.path.join(adir, fn)
|
||||
volpath = "{}/{}".format(vrem, fn).strip("/")
|
||||
@@ -2526,6 +2546,7 @@ class Up2k(object):
|
||||
svn, srem = self.asrv.vfs.get(svp, uname, True, False, True)
|
||||
svn, srem = svn.get_dbv(srem)
|
||||
sabs = svn.canonical(srem, False)
|
||||
curs: set["sqlite3.Cursor"] = set()
|
||||
|
||||
if not srem:
|
||||
raise Pebkac(400, "mv: cannot move a mountpoint")
|
||||
@@ -2533,7 +2554,13 @@ class Up2k(object):
|
||||
st = bos.lstat(sabs)
|
||||
if stat.S_ISREG(st.st_mode) or stat.S_ISLNK(st.st_mode):
|
||||
with self.mutex:
|
||||
return self._mv_file(uname, svp, dvp)
|
||||
try:
|
||||
ret = self._mv_file(uname, svp, dvp, curs)
|
||||
finally:
|
||||
for v in curs:
|
||||
v.connection.commit()
|
||||
|
||||
return ret
|
||||
|
||||
jail = svn.get_dbv(srem)[0]
|
||||
permsets = [[True, False, True]]
|
||||
@@ -2552,20 +2579,29 @@ class Up2k(object):
|
||||
# the actual check (avoid toctou)
|
||||
raise Pebkac(400, "mv: source folder contains other volumes")
|
||||
|
||||
for fn in files:
|
||||
svpf = "/".join(x for x in [dbv.vpath, vrem, fn[0]] if x)
|
||||
if not svpf.startswith(svp + "/"): # assert
|
||||
raise Pebkac(500, "mv: bug at {}, top {}".format(svpf, svp))
|
||||
with self.mutex:
|
||||
try:
|
||||
for fn in files:
|
||||
self.db_act = time.time()
|
||||
svpf = "/".join(x for x in [dbv.vpath, vrem, fn[0]] if x)
|
||||
if not svpf.startswith(svp + "/"): # assert
|
||||
raise Pebkac(500, "mv: bug at {}, top {}".format(svpf, svp))
|
||||
|
||||
dvpf = dvp + svpf[len(svp) :]
|
||||
with self.mutex:
|
||||
self._mv_file(uname, svpf, dvpf)
|
||||
dvpf = dvp + svpf[len(svp) :]
|
||||
self._mv_file(uname, svpf, dvpf, curs)
|
||||
finally:
|
||||
for v in curs:
|
||||
v.connection.commit()
|
||||
|
||||
curs.clear()
|
||||
|
||||
rmdirs(self.log_func, scandir, True, sabs, 1)
|
||||
rmdirs_up(os.path.dirname(sabs))
|
||||
return "k"
|
||||
|
||||
def _mv_file(self, uname: str, svp: str, dvp: str) -> str:
|
||||
def _mv_file(
|
||||
self, uname: str, svp: str, dvp: str, curs: set["sqlite3.Cursor"]
|
||||
) -> str:
|
||||
svn, srem = self.asrv.vfs.get(svp, uname, True, False, True)
|
||||
svn, srem = svn.get_dbv(srem)
|
||||
|
||||
@@ -2623,11 +2659,11 @@ class Up2k(object):
|
||||
|
||||
self._forget_file(svn.realpath, srem, c1, w, c1 != c2)
|
||||
self._relink(w, svn.realpath, srem, dabs)
|
||||
c1.connection.commit()
|
||||
curs.add(c1)
|
||||
|
||||
if c2:
|
||||
self.db_add(c2, w, drd, dfn, ftime, fsize, ip or "", at or 0)
|
||||
c2.connection.commit()
|
||||
curs.add(c2)
|
||||
else:
|
||||
self.log("not found in src db: [{}]".format(svp))
|
||||
|
||||
|
||||
@@ -120,7 +120,7 @@ else:
|
||||
FS_ENCODING = sys.getfilesystemencoding()
|
||||
|
||||
|
||||
SYMTIME = sys.version_info >= (3, 6) and os.utime in os.supports_follow_symlinks
|
||||
SYMTIME = sys.version_info > (3, 6) and os.utime in os.supports_follow_symlinks
|
||||
|
||||
HTTP_TS_FMT = "%a, %d %b %Y %H:%M:%S GMT"
|
||||
|
||||
@@ -258,7 +258,7 @@ def py_desc() -> str:
|
||||
bitness = struct.calcsize("P") * 8
|
||||
|
||||
host_os = platform.system()
|
||||
compiler = platform.python_compiler()
|
||||
compiler = platform.python_compiler().split("http")[0]
|
||||
|
||||
m = re.search(r"([0-9]+\.[0-9\.]+)", platform.version())
|
||||
os_ver = m.group(1) if m else ""
|
||||
@@ -643,6 +643,9 @@ class HMaccas(object):
|
||||
try:
|
||||
return self.cache[msg]
|
||||
except:
|
||||
if len(self.cache) > 9000:
|
||||
self.cache = {}
|
||||
|
||||
zb = hmac.new(self.key, msg, hashlib.sha512).digest()
|
||||
zs = base64.urlsafe_b64encode(zb)[: self.retlen].decode("utf-8")
|
||||
self.cache[msg] = zs
|
||||
@@ -1327,8 +1330,24 @@ def gen_filekey_dbg(
|
||||
|
||||
assert log_ptn
|
||||
if log_ptn.search(fspath):
|
||||
t = "fk({}) salt({}) size({}) inode({}) fspath({})"
|
||||
log(t.format(ret[:8], salt, fsize, inode, fspath))
|
||||
try:
|
||||
import inspect
|
||||
|
||||
ctx = ",".join(inspect.stack()[n][3] for n in range(2, 5))
|
||||
except:
|
||||
ctx = ""
|
||||
|
||||
try:
|
||||
p2 = "a"
|
||||
p2 = absreal(fspath)
|
||||
if p2 != fspath:
|
||||
raise Exception()
|
||||
except:
|
||||
t = "maybe wrong abspath for filekey;\norig: {}\nreal: {}"
|
||||
log(t.format(fspath, p2), 1)
|
||||
|
||||
t = "fk({}) salt({}) size({}) inode({}) fspath({}) at({})"
|
||||
log(t.format(ret[:8], salt, fsize, inode, fspath, ctx), 5)
|
||||
|
||||
return ret
|
||||
|
||||
@@ -1786,7 +1805,7 @@ def yieldfile(fn: str) -> Generator[bytes, None, None]:
|
||||
|
||||
|
||||
def hashcopy(
|
||||
fin: Union[typing.BinaryIO, Generator[bytes, None, None]],
|
||||
fin: Generator[bytes, None, None],
|
||||
fout: Union[typing.BinaryIO, typing.IO[Any]],
|
||||
slp: int = 0,
|
||||
max_sz: int = 0,
|
||||
|
||||
@@ -246,12 +246,18 @@ window.baguetteBox = (function () {
|
||||
}
|
||||
|
||||
function keyDownHandler(e) {
|
||||
if (e.ctrlKey || e.altKey || e.metaKey || e.isComposing || modal.busy)
|
||||
if (anymod(e, true) || modal.busy)
|
||||
return;
|
||||
|
||||
var k = e.code + '', v = vid(), pos = -1;
|
||||
|
||||
if (k == "ArrowLeft" || k == "KeyJ")
|
||||
if (k == "BracketLeft")
|
||||
setloop(1);
|
||||
else if (k == "BracketRight")
|
||||
setloop(2);
|
||||
else if (e.shiftKey)
|
||||
return;
|
||||
else if (k == "ArrowLeft" || k == "KeyJ")
|
||||
showPreviousImage();
|
||||
else if (k == "ArrowRight" || k == "KeyL")
|
||||
showNextImage();
|
||||
@@ -289,10 +295,6 @@ window.baguetteBox = (function () {
|
||||
rotn(e.shiftKey ? -1 : 1);
|
||||
else if (k == "KeyY")
|
||||
dlpic();
|
||||
else if (k == "BracketLeft")
|
||||
setloop(1);
|
||||
else if (k == "BracketRight")
|
||||
setloop(2);
|
||||
}
|
||||
|
||||
function anim() {
|
||||
@@ -406,7 +408,7 @@ window.baguetteBox = (function () {
|
||||
}
|
||||
|
||||
function keyUpHandler(e) {
|
||||
if (e.ctrlKey || e.altKey || e.metaKey || e.isComposing)
|
||||
if (anymod(e))
|
||||
return;
|
||||
|
||||
var k = e.code + '';
|
||||
|
||||
@@ -5,7 +5,7 @@
|
||||
<meta charset="utf-8">
|
||||
<title>{{ title }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8, minimum-scale=0.6">
|
||||
{{ html_head }}
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/ui.css?_={{ ts }}">
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/browser.css?_={{ ts }}">
|
||||
|
||||
@@ -166,7 +166,7 @@ var Ls = {
|
||||
"mt_oscv": "show album cover in osd\">art",
|
||||
"mt_mloop": "loop the open folder\">🔁 loop",
|
||||
"mt_mnext": "load the next folder and continue\">📂 next",
|
||||
"mt_cflac": "convert flac to opus\">flac",
|
||||
"mt_cflac": "convert flac / wav to opus\">flac",
|
||||
"mt_caac": "convert aac / m4a to opus\">aac",
|
||||
"mt_coth": "convert all others (not mp3) to opus\">oth",
|
||||
"mt_tint": "background level (0-100) on the seekbar$Nto make buffering less distracting",
|
||||
@@ -534,14 +534,14 @@ var Ls = {
|
||||
|
||||
"mt_preload": "hent ned litt av neste sang i forkant,$Nslik at pausen i overgangen blir mindre\">forles",
|
||||
"mt_fullpre": "hent ned hele neste sang, ikke bare litt:$N✅ skru på hvis nettet ditt er <b>ustabilt</b>,$N❌ skru av hvis nettet ditt er <b>tregt</b>\">full",
|
||||
"mt_waves": "waveform seekbar:$Nvis volumkart i avspillingsindikatoren\">~s",
|
||||
"mt_waves": "waveform seekbar:$Nvis volumindikator i avspillingsfeltet\">~s",
|
||||
"mt_npclip": "vis knapper for å kopiere info om sangen du hører på\">/np",
|
||||
"mt_octl": "integrering med operativsystemet (fjernkontroll, info-skjerm)\">os-ctl",
|
||||
"mt_oseek": "tillat spoling med fjernkontroll\">spoling",
|
||||
"mt_oscv": "vis album-cover på infoskjermen\">bilde",
|
||||
"mt_mloop": "repeter hele mappen\">🔁 gjenta",
|
||||
"mt_mnext": "hopp til neste mappe og fortsett\">📂 neste",
|
||||
"mt_cflac": "konverter flac-filer til opus\">flac",
|
||||
"mt_cflac": "konverter flac / wav-filer til opus\">flac",
|
||||
"mt_caac": "konverter aac / m4a-filer til to opus\">aac",
|
||||
"mt_coth": "konverter alt annet (men ikke mp3) til opus\">andre",
|
||||
"mt_tint": "nivå av bakgrunnsfarge på søkestripa (0-100),$Ngjør oppdateringer mindre distraherende",
|
||||
@@ -1052,7 +1052,7 @@ function goto(dest) {
|
||||
|
||||
clmod(document.documentElement, 'op_open', dest);
|
||||
|
||||
if (window['treectl'])
|
||||
if (treectl)
|
||||
treectl.onscroll();
|
||||
}
|
||||
|
||||
@@ -1197,7 +1197,7 @@ var mpl = (function () {
|
||||
var c = true;
|
||||
if (!have_acode)
|
||||
c = false;
|
||||
else if (/\.flac$/i.exec(url))
|
||||
else if (/\.(wav|flac)$/i.exec(url))
|
||||
c = r.ac_flac;
|
||||
else if (/\.(aac|m4a)$/i.exec(url))
|
||||
c = r.ac_aac;
|
||||
@@ -1508,7 +1508,7 @@ var widget = (function () {
|
||||
clmod(document.documentElement, 'np_open', is_open);
|
||||
clmod(widget, 'open', is_open);
|
||||
bcfg_set('au_open', r.is_open = is_open);
|
||||
if (window.vbar) {
|
||||
if (vbar) {
|
||||
pbar.onresize();
|
||||
vbar.onresize();
|
||||
}
|
||||
@@ -2421,7 +2421,7 @@ function play(tid, is_ev, seek) {
|
||||
clmod(ebi(oid), 'act', 1);
|
||||
clmod(ebi(oid).closest('tr'), 'play', 1);
|
||||
clmod(ebi('wtoggle'), 'np', mpl.clip);
|
||||
if (window.thegrid)
|
||||
if (thegrid)
|
||||
thegrid.loadsel();
|
||||
|
||||
try {
|
||||
@@ -3392,7 +3392,7 @@ var showfile = (function () {
|
||||
}
|
||||
|
||||
r.setstyle = function () {
|
||||
if (window['no_prism'])
|
||||
if (window.no_prism)
|
||||
return;
|
||||
|
||||
qsr('#prism_css');
|
||||
@@ -3521,7 +3521,7 @@ var showfile = (function () {
|
||||
else {
|
||||
el.textContent = txt;
|
||||
el.innerHTML = '<code>' + el.innerHTML + '</code>';
|
||||
if (!window['no_prism']) {
|
||||
if (!window.no_prism) {
|
||||
el.className = 'prism linkable-line-numbers line-numbers language-' + lang;
|
||||
if (!defer)
|
||||
fun(el.firstChild);
|
||||
@@ -3747,7 +3747,7 @@ var thegrid = (function () {
|
||||
ebi('bdoc').style.display = 'none';
|
||||
clmod(ebi('wrap'), 'doc');
|
||||
qsr('#docname');
|
||||
if (window['treectl'])
|
||||
if (treectl)
|
||||
treectl.textmode(false);
|
||||
|
||||
aligngriditems();
|
||||
@@ -4610,7 +4610,7 @@ document.onkeydown = function (e) {
|
||||
})();
|
||||
|
||||
function aligngriditems() {
|
||||
if (!window.treectl)
|
||||
if (!treectl)
|
||||
return;
|
||||
|
||||
var em2px = parseFloat(getComputedStyle(ebi('ggrid')).fontSize);
|
||||
@@ -4658,7 +4658,7 @@ var treectl = (function () {
|
||||
setwrap(bcfg_bind(r, 'wtree', 'wraptree', true, setwrap));
|
||||
setwrap(bcfg_bind(r, 'parpane', 'parpane', true, onscroll));
|
||||
bcfg_bind(r, 'htree', 'hovertree', false, reload_tree);
|
||||
bcfg_bind(r, 'ask', 'bd_ask', false);
|
||||
bcfg_bind(r, 'ask', 'bd_ask', MOBILE && FIREFOX);
|
||||
ebi('bd_lim').value = r.lim = icfg_get('bd_lim');
|
||||
ebi('bd_lim').oninput = function (e) {
|
||||
var n = parseInt(this.value);
|
||||
@@ -5496,7 +5496,7 @@ function apply_perms(newperms) {
|
||||
'table-cell' : 'none';
|
||||
}
|
||||
|
||||
if (window['up2k'])
|
||||
if (up2k)
|
||||
up2k.set_fsearch();
|
||||
|
||||
ebi('widget').style.display = have_read ? '' : 'none';
|
||||
@@ -5629,7 +5629,7 @@ var filecols = (function () {
|
||||
for (var b = 0, bb = tds.length; b < bb; b++)
|
||||
tds[b].className = cls;
|
||||
}
|
||||
if (window['tt']) {
|
||||
if (tt) {
|
||||
tt.att(ebi('hcols'));
|
||||
tt.att(QS('#files>thead'));
|
||||
}
|
||||
@@ -6215,7 +6215,7 @@ function show_md(md, name, div, url, depth) {
|
||||
if (url != now)
|
||||
return;
|
||||
|
||||
if (!window['marked']) {
|
||||
if (!marked) {
|
||||
if (depth)
|
||||
return toast.warn(10, errmsg + 'failed to load marked.js')
|
||||
|
||||
@@ -6415,7 +6415,7 @@ var unpost = (function () {
|
||||
|
||||
for (var a = n; a < n2; a++)
|
||||
if (QS('#op_unpost a.n' + a))
|
||||
req.push(uricom_dec(r.files[a].vp));
|
||||
req.push(uricom_dec(r.files[a].vp.split('?')[0]));
|
||||
|
||||
var links = QSA('#op_unpost a.n' + n);
|
||||
for (var a = 0, aa = links.length; a < aa; a++) {
|
||||
@@ -6428,7 +6428,7 @@ var unpost = (function () {
|
||||
var xhr = new XHR();
|
||||
xhr.n = n;
|
||||
xhr.n2 = n2;
|
||||
xhr.open('POST', '/?delete', true);
|
||||
xhr.open('POST', '/?delete&lim=' + links.length, true);
|
||||
xhr.onload = xhr.onerror = unpost_delete_cb;
|
||||
xhr.send(JSON.stringify(req));
|
||||
};
|
||||
@@ -6579,7 +6579,7 @@ function reload_browser() {
|
||||
for (var a = 0; a < ns.length; a++)
|
||||
clmod(ebi(ns[a]), 'hidden', ebi('unsearch'));
|
||||
|
||||
if (window['up2k'])
|
||||
if (up2k)
|
||||
up2k.set_fsearch();
|
||||
|
||||
thegrid.setdirty();
|
||||
|
||||
@@ -498,5 +498,5 @@ dom_navtgl.onclick = function () {
|
||||
if (sread('hidenav') == 1)
|
||||
dom_navtgl.onclick();
|
||||
|
||||
if (window['tt'])
|
||||
if (window.tt && tt.init)
|
||||
tt.init();
|
||||
|
||||
@@ -12,7 +12,7 @@ var Ls = {
|
||||
"cc1": "klient-konfigurasjon",
|
||||
"h1": "skru av k304",
|
||||
"i1": "skru på k304",
|
||||
"j1": "k304 bryter tilkoplingen for hver HTTP 304. Dette hjelper visse mellomtjenere som kan sette seg fast / plutselig slutter å laste sider, men det reduserer også ytelsen betydelig",
|
||||
"j1": "k304 bryter tilkoplingen for hver HTTP 304. Dette hjelper mot visse mellomtjenere som kan sette seg fast / plutselig slutter å laste sider, men det reduserer også ytelsen betydelig",
|
||||
"k1": "nullstill innstillinger",
|
||||
"l1": "logg inn:",
|
||||
"m1": "velkommen tilbake,",
|
||||
|
||||
@@ -928,7 +928,7 @@ function up2k_init(subtle) {
|
||||
r.st = st;
|
||||
r.uc = uc;
|
||||
|
||||
if (!window.File || !File.prototype.slice || !window.FileReader || !window.FileList)
|
||||
if (!window.File || !window.FileReader || !window.FileList || !File.prototype || !File.prototype.slice)
|
||||
return un2k(L.u_ever);
|
||||
|
||||
var flag = false;
|
||||
@@ -938,15 +938,19 @@ function up2k_init(subtle) {
|
||||
function nav() {
|
||||
start_actx();
|
||||
|
||||
var uf = function () { ebi('file' + fdom_ctr).click(); },
|
||||
ud = function () { ebi('dir' + fdom_ctr).click(); };
|
||||
|
||||
// too buggy on chrome <= 72
|
||||
var m = / Chrome\/([0-9]+)\./.exec(navigator.userAgent);
|
||||
if (m && parseInt(m[1]) < 73)
|
||||
return ebi('file' + fdom_ctr).click();
|
||||
return uf();
|
||||
|
||||
modal.confirm(L.u_nav_m,
|
||||
function () { ebi('file' + fdom_ctr).click(); },
|
||||
function () { ebi('dir' + fdom_ctr).click(); },
|
||||
null, L.u_nav_b);
|
||||
// phones dont support folder upload
|
||||
if (MOBILE)
|
||||
return uf();
|
||||
|
||||
modal.confirm(L.u_nav_m, uf, ud, null, L.u_nav_b);
|
||||
}
|
||||
ebi('u2btn').onclick = nav;
|
||||
|
||||
@@ -2497,7 +2501,7 @@ function up2k_init(subtle) {
|
||||
tt.att(QS('#u2conf'));
|
||||
|
||||
function bumpthread2(e) {
|
||||
if (e.ctrlKey || e.altKey || e.metaKey || e.isComposing)
|
||||
if (anymod(e))
|
||||
return;
|
||||
|
||||
if (e.code == 'ArrowUp')
|
||||
@@ -2571,6 +2575,7 @@ function up2k_init(subtle) {
|
||||
el.innerHTML = '<div>' + L.u_life_cfg + '</div><div>' + L.u_life_est + '</div><div id="undor"></div>';
|
||||
set_life(Math.min(lifetime, icfg_get('lifetime', lifetime)));
|
||||
ebi('lifem').oninput = ebi('lifeh').oninput = mod_life;
|
||||
ebi('lifem').onkeydown = ebi('lifeh').onkeydown = kd_life;
|
||||
tt.att(ebi('u2life'));
|
||||
}
|
||||
draw_life();
|
||||
@@ -2596,12 +2601,23 @@ function up2k_init(subtle) {
|
||||
set_life(v);
|
||||
}
|
||||
|
||||
function kd_life(e) {
|
||||
var el = e.target,
|
||||
d = e.code == 'ArrowUp' ? 1 : e.code == 'ArrowDown' ? -1 : 0;
|
||||
|
||||
if (anymod(e) || !d)
|
||||
return;
|
||||
|
||||
el.value = parseInt(el.value) + d;
|
||||
mod_life(e);
|
||||
}
|
||||
|
||||
function set_life(v) {
|
||||
//ebi('lifes').value = v;
|
||||
ebi('lifem').value = parseInt(v / 60);
|
||||
ebi('lifeh').value = parseInt(v / 3600);
|
||||
|
||||
var undo = have_unpost - (v || lifetime);
|
||||
var undo = have_unpost - (v ? lifetime - v : 0);
|
||||
ebi('undor').innerHTML = undo <= 0 ?
|
||||
L.u_unp_ng : L.u_unp_ok.format(lhumantime(undo));
|
||||
|
||||
|
||||
@@ -1,12 +1,13 @@
|
||||
"use strict";
|
||||
|
||||
if (!window['console'])
|
||||
window['console'] = {
|
||||
if (!window.console || !console.log)
|
||||
window.console = {
|
||||
"log": function (msg) { }
|
||||
};
|
||||
|
||||
|
||||
var wah = '',
|
||||
L, tt, treectl, thegrid, up2k, asmCrypto, hashwasm, vbar, marked,
|
||||
CB = '?_=' + Date.now(),
|
||||
HALFMAX = 8192 * 8192 * 8192 * 8192,
|
||||
HTTPS = (window.location + '').indexOf('https:') === 0,
|
||||
@@ -17,6 +18,15 @@ var wah = '',
|
||||
IPHONE = TOUCH && /iPhone|iPad|iPod/i.test(navigator.userAgent),
|
||||
WINDOWS = navigator.platform ? navigator.platform == 'Win32' : /Windows/.test(navigator.userAgent);
|
||||
|
||||
if (!window.WebAssembly || !WebAssembly.Memory)
|
||||
window.WebAssembly = false;
|
||||
|
||||
if (!window.Notification || !Notification.permission)
|
||||
window.Notification = false;
|
||||
|
||||
if (!window.FormData)
|
||||
window.FormData = false;
|
||||
|
||||
try {
|
||||
CB = '?' + document.currentScript.src.split('?').pop();
|
||||
|
||||
@@ -229,6 +239,11 @@ function ctrl(e) {
|
||||
}
|
||||
|
||||
|
||||
function anymod(e, shift_ok) {
|
||||
return e && (e.ctrlKey || e.altKey || e.metaKey || e.isComposing || (!shift_ok && e.shiftKey));
|
||||
}
|
||||
|
||||
|
||||
function ev(e) {
|
||||
e = e || window.event;
|
||||
if (!e)
|
||||
@@ -381,13 +396,14 @@ function clgot(el, cls) {
|
||||
|
||||
|
||||
var ANIM = true;
|
||||
if (window.matchMedia) {
|
||||
try {
|
||||
var mq = window.matchMedia('(prefers-reduced-motion: reduce)');
|
||||
mq.onchange = function () {
|
||||
ANIM = !mq.matches;
|
||||
};
|
||||
ANIM = !mq.matches;
|
||||
}
|
||||
catch (ex) { }
|
||||
|
||||
|
||||
function yscroll() {
|
||||
@@ -747,7 +763,7 @@ function lhumantime(v) {
|
||||
var t = shumantime(v, 1),
|
||||
tp = t.replace(/([a-z])/g, " $1 ").split(/ /g).slice(0, -1);
|
||||
|
||||
if (!window.L || tp.length < 2 || tp[1].indexOf('$') + 1)
|
||||
if (!L || tp.length < 2 || tp[1].indexOf('$') + 1)
|
||||
return t;
|
||||
|
||||
var ret = '';
|
||||
@@ -1251,8 +1267,8 @@ var modal = (function () {
|
||||
tok, tng, prim, sec, ok_cancel;
|
||||
|
||||
r.load = function () {
|
||||
tok = (window.L && L.m_ok) || 'OK';
|
||||
tng = (window.L && L.m_ng) || 'Cancel';
|
||||
tok = (L && L.m_ok) || 'OK';
|
||||
tng = (L && L.m_ng) || 'Cancel';
|
||||
prim = '<a href="#" id="modal-ok">' + tok + '</a>';
|
||||
sec = '<a href="#" id="modal-ng">' + tng + '</a>';
|
||||
ok_cancel = WINDOWS ? prim + sec : sec + prim;
|
||||
@@ -1612,7 +1628,7 @@ function xhrchk(xhr, prefix, e404, lvl, tag) {
|
||||
return true;
|
||||
|
||||
if (xhr.status == 403)
|
||||
return toast.err(0, prefix + (window.L && L.xhr403 || "403: access denied\n\ntry pressing F5, maybe you got logged out"), tag);
|
||||
return toast.err(0, prefix + (L && L.xhr403 || "403: access denied\n\ntry pressing F5, maybe you got logged out"), tag);
|
||||
|
||||
if (xhr.status == 404)
|
||||
return toast.err(0, prefix + e404, tag);
|
||||
|
||||
@@ -1,3 +1,69 @@
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2022-1009-0919 `v1.4.5` qr-code
|
||||
|
||||
* read-only demo server at https://a.ocv.me/pub/demo/
|
||||
* latest gzip edition of the sfx: [v1.0.14](https://github.com/9001/copyparty/releases/tag/v1.0.14#:~:text=release-specific%20notes)
|
||||
|
||||
## new features
|
||||
* display a server [qr-code](https://github.com/9001/copyparty#qr-code) [(screenshot)](https://user-images.githubusercontent.com/241032/194728533-6f00849b-c6ac-43c6-9359-83e454d11e00.png) on startup
|
||||
* primarily for running copyparty on a phone and accessing it from another
|
||||
* optionally specify a path or password with `--qrl lootbox/?pw=hunter2`
|
||||
* uses the server's external ip (default route) unless `--qri` specifies a domain / ip-prefix (see the example command after this list)
|
||||
* classic cp437 `▄` `▀` for space efficiency; some misbehaving terminals / fonts need `--qrz 2`
|
||||
* new permission `G` returns the filekey of uploaded files for users without read-access
|
||||
* when combined with permission `w` and volflag `fk`, uploaded files will not be accessible unless the filekey is provided in the url, and `G` provides the filekey to the uploader unlike `g`
|
||||
* filekeys are added to the unpost listing
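
for example (an illustration, not part of the original notes): `python3 copyparty-sfx.py --qr --qrl lootbox/?pw=hunter2` prints the qr-code on startup and points it at the `lootbox` folder with the password prefilled; add `--qrz 2` if your terminal font mangles the halfblock characters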
|
||||
|
||||
## bugfixes
|
||||
* renaming / moving folders is now **at least 120x faster**
|
||||
* and that's on nvme drives, so probably like 2000x on HDDs
|
||||
* uploads to volumes with lifetimes could get instapurged depending on browser and browser settings
|
||||
* ux fixes
|
||||
* FINALLY fixed messageboxes appearing offscreen on phones (and some other layout issues)
|
||||
* stop asking about folder-uploads on phones because they dont support it
|
||||
* on android-firefox, default to truncating huge folders with the load-more button due to ff onscroll being buggy
|
||||
* audioplayer looking funky if ffmpeg unavailable
|
||||
* waveform-seekbar cache expiration (the thumbcleaner complaining about png files)
|
||||
* ie11 panic when opening a folder which contains a file named `up2k`
|
||||
* turns out `<a name=foo>` becomes `window.foo` unless that's already declared somewhere in js -- luckily other browsers "only" do that with IDs
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2022-0926-2037 `v1.4.3` signal in the noise
|
||||
|
||||
* read-only demo server at https://a.ocv.me/pub/demo/
|
||||
* latest gzip edition of the sfx: [v1.0.14](https://github.com/9001/copyparty/releases/tag/v1.0.14#:~:text=release-specific%20notes)
|
||||
|
||||
## new features
|
||||
* `--bak-flips` saves a copy of corrupted / bitflipped up2k uploads
|
||||
* comparing against a good copy can help pinpoint the culprit
|
||||
* also see [tracking bitflips](https://github.com/9001/copyparty/blob/hovudstraum/docs/notes.sh#:~:text=tracking%20bitflips)
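
for example (an illustration, not part of the original notes): start the server with `python3 copyparty-sfx.py --bak-flips`, and when an up2k upload fails its hash-check the corrupted copy is kept so it can be diffed against a known-good file, e.g. `cmp -l good.flac bad.flac | head`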
|
||||
|
||||
## bugfixes
|
||||
* some edgecases where deleted files didn't get dropped from the db
|
||||
* can reduce performance over time, hitting the filesystem more than necessary
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2022-0925-1236 `v1.4.2` fuhgeddaboudit
|
||||
|
||||
* read-only demo server at https://a.ocv.me/pub/demo/
|
||||
* latest gzip edition of the sfx: [v1.0.14](https://github.com/9001/copyparty/releases/tag/v1.0.14#:~:text=release-specific%20notes)
|
||||
|
||||
## new features
|
||||
* forget incoming uploads by deleting the name-reservation
|
||||
* (the zerobyte file with the actual filename, not the .PARTIAL)
|
||||
* can take 5min to kick in
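
in other words (hypothetical paths): if someone is uploading `song.flac` into `/srv/pub/`, deleting the empty `/srv/pub/song.flac` placeholder (and leaving the `.PARTIAL` file alone) makes copyparty abandon that upload within ~5 minutes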
|
||||
|
||||
## bugfixes
|
||||
* zfs on ubuntu 20.04 would reject files with big unicode names such as `148. Профессор Лебединский, Виктор Бондарюк, Дмитрий Нагиев - Я её хой (Я танцую пьяный на столе) (feat. Виктор Бондарюк & Дмитрий Нагиев).mp3`
|
||||
* usually not a problem since copyparty truncates names to fit filesystem limits, except zfs uses a nonstandard errorcode
|
||||
* in the "print-message-to-serverlog" feature, a unicode message larger than one tcp-frame could decode incorrectly
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2022-0924-1245 `v1.4.1` fix api compat
|
||||
|
||||
|
||||
@@ -16,6 +16,10 @@ https://github.com/giampaolo/pyftpdlib/
|
||||
C: 2007 Giampaolo Rodola'
|
||||
L: MIT
|
||||
|
||||
https://github.com/nayuki/QR-Code-generator
|
||||
C: Project Nayuki
|
||||
L: MIT
|
||||
|
||||
https://github.com/python/cpython/blob/3.10/Lib/asyncore.py
|
||||
C: 1996 Sam Rushing
|
||||
L: ISC
|
||||
|
||||
@@ -67,6 +67,7 @@ mkdir -p "${dirs[@]}"
|
||||
for dir in "${dirs[@]}"; do for fn in ふが "$(printf \\xed\\x93)" 'qw,er;ty%20as df?gh+jkl%zxc&vbn <qwe>"rty'"'"'uio&asd fgh'; do echo "$dir" > "$dir/$fn.html"; done; done
|
||||
# qw er+ty%20ui%%20op<as>df&gh&jk#zx'cv"bn`m=qw*er^ty?ui@op,as.df-gh_jk
|
||||
|
||||
|
||||
##
|
||||
## upload mojibake
|
||||
|
||||
@@ -143,6 +144,17 @@ sqlite3 -readonly up2k.db.key-full 'select w, v from mt where k = "key" order by
|
||||
sqlite3 -readonly up2k.db.key-full 'select w, v from mt where k = "key" order by w' > k1; sqlite3 -readonly up2k.db 'select mt.w, mt.v, up.rd, up.fn from mt inner join up on mt.w = substr(up.w,1,16) where mt.k = "key" order by up.rd, up.fn' > k2; ok=0; ng=0; while IFS='|' read w k2 path; do k1="$(grep -E "^$w" k1 | sed -r 's/.*\|//')"; [ "$k1" = "$k2" ] && ok=$((ok+1)) || { ng=$((ng+1)); printf '%3s %3s %s\n' "$k1" "$k2" "$path"; }; done < <(cat k2); echo "match $ok diff $ng"
|
||||
|
||||
|
||||
##
|
||||
## scanning for exceptions
|
||||
|
||||
cd /dev/shm
|
||||
journalctl -aS '720 hour ago' -t python3 -o with-unit --utc | cut -d\ -f2,6- > cpp.log
|
||||
tac cpp.log | awk '/RuntimeError: generator ignored GeneratorExit/{n=1} n{n--;if(n==0)print} 1' | grep 'generator ignored GeneratorExit' -C7 | head -n 100
|
||||
awk '/Exception ignored in: <generator object StreamZip.gen/{s=1;next} /could not create thumbnail/{s=3;next} s{s--;next} 1' <cpp.log | less -R
|
||||
less-search:
|
||||
>: |Exception|Traceback
|
||||
|
||||
|
||||
##
|
||||
## tracking bitflips
|
||||
|
||||
@@ -168,6 +180,7 @@ printf ' %s [%s]\n' $h2 "$(grep -F $h2 <handshakes | head -n 1)"
|
||||
# BUT the clients will immediately re-handshake the upload with the same bitflipped hashes, so the uploaders have to refresh their browsers before you do that,
|
||||
# so maybe just ask them to refresh and do nothing for 6 hours so the timeout kicks in, which deletes the placeholders/name-reservations and you can then manually delete the .PARTIALs at some point later
|
||||
|
||||
|
||||
##
|
||||
## media
|
||||
|
||||
@@ -215,7 +228,7 @@ brew install python@2
|
||||
pip install virtualenv
|
||||
|
||||
# readme toc
|
||||
cat README.md | awk 'function pr() { if (!h) {return}; if (/^ *[*!#|]/||!s) {printf "%s\n",h;h=0;return}; if (/.../) {printf "%s - %s\n",h,$0;h=0}; }; /^#/{s=1;pr()} /^#* *(install on android|dev env setup|just the sfx|complete release|optional gpl stuff)|`$/{s=0} /^#/{lv=length($1);sub(/[^ ]+ /,"");bab=$0;gsub(/ /,"-",bab); h=sprintf("%" ((lv-1)*4+1) "s [%s](#%s)", "*",$0,bab);next} !h{next} {sub(/ .*/,"");sub(/[:;,]$/,"")} {pr()}' > toc; grep -E '^## readme toc' -B1000 -A2 <README.md >p1; grep -E '^## quickstart' -B2 -A999999 <README.md >p2; (cat p1; grep quickstart -A1000 <toc; cat p2) >README.md; rm p1 p2 toc
|
||||
cat README.md | awk 'function pr() { if (!h) {return}; if (/^ *[*!#|]/||!s) {printf "%s\n",h;h=0;return}; if (/.../) {printf "%s - %s\n",h,$0;h=0}; }; /^#/{s=1;pr()} /^#* *(install on android|dev env setup|just the sfx|complete release|optional gpl stuff)|`$/{s=0} /^#/{lv=length($1);sub(/[^ ]+ /,"");bab=$0;gsub(/ /,"-",bab);gsub(/\./,"",bab); h=sprintf("%" ((lv-1)*4+1) "s [%s](#%s)", "*",$0,bab);next} !h{next} {sub(/ .*/,"");sub(/[:;,]$/,"")} {pr()}' > toc; grep -E '^## readme toc' -B1000 -A2 <README.md >p1; grep -E '^## quickstart' -B2 -A999999 <README.md >p2; (cat p1; grep quickstart -A1000 <toc; cat p2) >README.md; rm p1 p2 toc
|
||||
|
||||
# fix firefox phantom breakpoints,
|
||||
# suggestions from bugtracker, doesnt work (debugger is not attachable)
|
||||
|
||||
@@ -119,7 +119,7 @@ chmod 755 \
|
||||
|
||||
# extract the sfx
|
||||
( cd copyparty-extras/sfx-full/
|
||||
./copyparty-sfx.py -h
|
||||
./copyparty-sfx.py --version
|
||||
)
|
||||
|
||||
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
FROM alpine:3
|
||||
WORKDIR /z
|
||||
ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
|
||||
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
|
||||
ver_hashwasm=4.9.0 \
|
||||
ver_marked=4.0.18 \
|
||||
ver_mde=2.18.0 \
|
||||
|
||||
47
scripts/genlic.sh
Executable file
@@ -0,0 +1,47 @@
|
||||
#!/bin/bash
|
||||
set -e
|
||||
|
||||
outfile="$(realpath "$1")"
|
||||
|
||||
[ -e genlic.sh ] || cd scripts
|
||||
[ -e genlic.sh ]
|
||||
|
||||
f=../build/mit.txt
|
||||
[ -e $f ] ||
|
||||
curl https://opensource.org/licenses/MIT |
|
||||
awk '/div>/{o=0}o>1;o{o++}/;COPYRIGHT HOLDER/{o=1}' |
|
||||
awk '{gsub(/<[^>]+>/,"")};1' >$f
|
||||
|
||||
f=../build/isc.txt
|
||||
[ -e $f ] ||
|
||||
curl https://opensource.org/licenses/ISC |
|
||||
awk '/div>/{o=0}o>2;o{o++}/;OWNER/{o=1}' |
|
||||
awk '{gsub(/<[^>]+>/,"")};/./{b=0}!/./{b++}b>1{next}1' >$f
|
||||
|
||||
f=../build/3bsd.txt
|
||||
[ -e $f ] ||
|
||||
curl https://opensource.org/licenses/BSD-3-Clause |
|
||||
awk '/div>/{o=0}o>1;o{o++}/HOLDER/{o=1}' |
|
||||
awk '{gsub(/<[^>]+>/,"")};1' >$f
|
||||
|
||||
f=../build/ofl.txt
|
||||
[ -e $f ] ||
|
||||
curl https://opensource.org/licenses/OFL-1.1 |
|
||||
awk '/PREAMBLE/{o=1}/sil\.org/{o=0}!o{next}/./{printf "%s ",$0;next}{print"\n"}' |
|
||||
awk '{gsub(/<[^>]+>/,"");gsub(/^\s+/,"");gsub(/&/,"\\&")}/./{b=0}!/./{b++}b>1{next}1' >$f
|
||||
|
||||
(sed -r 's/^L: /License: /;s/^C: /Copyright (c) /' <../docs/lics.txt
|
||||
printf '\n\n--- MIT License ---\n\n'; cat ../build/mit.txt
|
||||
printf '\n\n--- ISC License ---\n\n'; cat ../build/isc.txt
|
||||
printf '\n\n--- BSD 3-Clause License ---\n\n'; cat ../build/3bsd.txt
|
||||
printf '\n\n--- SIL Open Font License v1.1 ---\n\n'; cat ../build/ofl.txt
|
||||
) |
|
||||
while IFS= read -r x; do
|
||||
[ "${x:0:4}" = "--- " ] || {
|
||||
printf '%s\n' "$x"
|
||||
continue
|
||||
}
|
||||
n=${#x}
|
||||
p=$(( (80-n)/2 ))
|
||||
printf "%${p}s\033[07m%s\033[0m\n" "" "$x"
|
||||
done > "$outfile"
|
||||
@@ -87,6 +87,7 @@ while [ ! -z "$1" ]; do
|
||||
re) repack=1 ; ;;
|
||||
ox) use_ox=1 ; ;;
|
||||
gz) use_gz=1 ; ;;
|
||||
gzz) shift;use_gzz=$1;use_gz=1; ;;
|
||||
no-fnt) no_fnt=1 ; ;;
|
||||
no-hl) no_hl=1 ; ;;
|
||||
no-dd) no_dd=1 ; ;;
|
||||

@@ -224,48 +225,11 @@ tmpdir="$(

# remove type hints before build instead
(cd copyparty; "$pybin" ../../scripts/strip_hints/a.py; rm uh)

licfile=$(realpath copyparty/res/COPYING.txt)
(cd ../scripts; ./genlic.sh "$licfile")
}

f=../build/mit.txt
[ -e $f ] ||
    curl https://opensource.org/licenses/MIT |
    awk '/div>/{o=0}o>1;o{o++}/;COPYRIGHT HOLDER/{o=1}' |
    awk '{gsub(/<[^>]+>/,"")};1' >$f

f=../build/isc.txt
[ -e $f ] ||
    curl https://opensource.org/licenses/ISC |
    awk '/div>/{o=0}o>2;o{o++}/;OWNER/{o=1}' |
    awk '{gsub(/<[^>]+>/,"")};/./{b=0}!/./{b++}b>1{next}1' >$f

f=../build/3bsd.txt
[ -e $f ] ||
    curl https://opensource.org/licenses/BSD-3-Clause |
    awk '/div>/{o=0}o>1;o{o++}/HOLDER/{o=1}' |
    awk '{gsub(/<[^>]+>/,"")};1' >$f

f=../build/ofl.txt
[ -e $f ] ||
    curl https://opensource.org/licenses/OFL-1.1 |
    awk '/PREAMBLE/{o=1}/sil\.org/{o=0}!o{next}/./{printf "%s ",$0;next}{print"\n"}' |
    awk '{gsub(/<[^>]+>/,"");gsub(/^\s+/,"");gsub(/&/,"\\&")}/./{b=0}!/./{b++}b>1{next}1' >$f

(sed -r 's/^L: /License: /;s/^C: /Copyright (c) /' <../docs/lics.txt
 printf '\n\n--- MIT License ---\n\n'; cat ../build/mit.txt
 printf '\n\n--- ISC License ---\n\n'; cat ../build/isc.txt
 printf '\n\n--- BSD 3-Clause License ---\n\n'; cat ../build/3bsd.txt
 printf '\n\n--- SIL Open Font License v1.1 ---\n\n'; cat ../build/ofl.txt
) |
while IFS= read -r x; do
    [ "${x:0:4}" = "--- " ] || {
        printf '%s\n' "$x"
        continue
    }
    n=${#x}
    p=$(( (80-n)/2 ))
    printf "%${p}s\033[07m%s\033[0m\n" "" "$x"
done > copyparty/res/COPYING.txt

ver=
[ -z "$repack" ] &&
    git describe --tags >/dev/null 2>/dev/null && {

@@ -310,6 +274,7 @@ sfx_out=../dist/copyparty-$CSN

echo cleanup
find -name '*.pyc' -delete
find -name __pycache__ -delete
find -name py.typed -delete

# especially prevent osx from leaking your lan ip (wtf apple)
find -type f \( -name .DS_Store -or -name ._.DS_Store \) -delete

@@ -482,7 +447,7 @@ nf=$(ls -1 "$zdir"/arc.* | wc -l)

pyoxidizer build --release --target-triple $tgt
mv $bdir/copyparty.exe dist/
cp -pv "$(for d in '/c/Program Files (x86)/Microsoft Visual Studio/'*'/BuildTools/VC/Redist/MSVC'; do
    find "$d" -name vcruntime140.dll; done | sort | grep -vE '/x64/|/onecore/' | head -n 1)" dist/
    find "$d" -name vcruntime140.dll; done | sort | grep -vE '/x64/|/onecore/' | head -n 1)" dist/
dist/copyparty.exe --version
cp -pv dist/copyparty{,.orig}.exe
[ $ultra ] && a="--best --lzma" || a=-1

@@ -509,13 +474,18 @@ done

echo creating tar
tar -cf tar "${targs[@]}" --numeric-owner -T list

pc=bzip2
pe=bz2
[ $use_gz ] && pc=gzip && pe=gz
pc="bzip2 -"; pe=bz2
[ $use_gz ] && pc="gzip -" && pe=gz
[ $use_gzz ] && pc="pigz -11 -I$use_gzz" && pe=gz

echo compressing tar
# detect best level; bzip2 -7 is usually better than -9
for n in {2..9}; do cp tar t.$n; nice $pc -$n t.$n & done; wait; mv -v $(ls -1S t.*.$pe | tail -n 1) tar.bz2
for n in {2..9}; do cp tar t.$n; nice $pc$n t.$n & done; wait
minf=$(for f in t.*.$pe; do
    s1=$(wc -c <$f)
    s2=$(tr -d '\r\n\0' <$f | wc -c)
    echo "$(( s2+(s1-s2)*3 )) $f"
done | sort -n | awk '{print$2;exit}')
mv -v $minf tar.bz2
rm t.* || true
exts=()
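The selection criterion also changed: instead of simply keeping the smallest compressed candidate, the new code estimates how big each one will be once embedded in the sfx. Every `\r`, `\n` (and, given the `tr -d '\r\n\0'` set, presumably `\0`) byte gets escaped into roughly three bytes when attached to the script (see the `\n#n` / `\n#r` substitution in sfx.py further down), so the winner is the file with the smallest `s2 + (s1 - s2) * 3`. A rough Python equivalent of that estimate (helper name is hypothetical):

    def embedded_size(path):
        raw = open(path, "rb").read()
        keep = sum(1 for b in raw if b not in (0x00, 0x0A, 0x0D))   # s2: bytes that survive tr -d '\r\n\0'
        return keep + (len(raw) - keep) * 3                         # s2 + (s1 - s2) * 3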

@@ -47,6 +47,7 @@ $APPDATA/python/python37/scripts/pyinstaller \
    --exclude-module copyparty.broker_mpw \
    --exclude-module curses \
    --exclude-module ctypes.macholib \
    --exclude-module inspect \
    --exclude-module multiprocessing \
    --exclude-module pdb \
    --exclude-module pickle \

@@ -31,10 +31,9 @@ rm -f ../dist/copyparty-sfx*
shift
./make-sfx.sh "$@"
f=../dist/copyparty-sfx
[ -e $f.py ] ||
    f=../dist/copyparty-sfx-gz
[ -e $f.py ] && s= || s=-gz

$f.py -h >/dev/null
$f$s.py --version >/dev/null

[ $parallel -gt 1 ] && {
    printf '\033[%s' s 2r H "0;1;37;44mbruteforcing sfx size -- press enter to terminate" K u "7m $* " K $'27m\n'

@@ -44,9 +43,9 @@ $f.py -h >/dev/null
for ((a=0; a<$parallel; a++)); do
    while [ -e .sfx-run ]; do
        CSN=sfx$a ./make-sfx.sh re "$@"
        sz=$(wc -c <$f$a.py | awk '{print$1}')
        sz=$(wc -c <$f$a$s.py | awk '{print$1}')
        [ $sz -ge $min ] && continue
        mv $f$a.py $f.py.$sz
        mv $f$a$s.py $f$s.py.$sz
        min=$sz
    done &
done

@@ -55,7 +54,7 @@ $f.py -h >/dev/null
}

while true; do
    mv $f.py $f.$(wc -c <$f.py | awk '{print$1}').py
    mv $f$s.py $f$s.$(wc -c <$f$s.py | awk '{print$1}').py
    ./make-sfx.sh re "$@"
done

@@ -24,6 +24,7 @@ copyparty/res/insecure.pem,
copyparty/star.py,
copyparty/stolen,
copyparty/stolen/__init__.py,
copyparty/stolen/qrcodegen.py,
copyparty/stolen/surrogateescape.py,
copyparty/sutil.py,
copyparty/svchub.py,

@@ -27,7 +27,7 @@ SIZE = None
CKSUM = None
STAMP = None

PY2 = sys.version_info[0] == 2
PY2 = sys.version_info < (3,)
WINDOWS = sys.platform in ["win32", "msys"]
sys.dont_write_bytecode = True
me = os.path.abspath(os.path.realpath(__file__))

@@ -142,6 +142,7 @@ def testchk(cdata):


def encode(data, size, cksum, ver, ts):
    """creates a new sfx; `data` should yield bufs to attach"""
    nb = 0
    nin = 0
    nout = 0
    skip = False

@@ -151,6 +152,7 @@ def encode(data, size, cksum, ver, ts):
    for ln in src.split("\n"):
        if ln.endswith("# skip 0"):
            skip = False
            nb = 9
            continue

        if ln.endswith("# skip 1") or skip:

@@ -160,6 +162,13 @@ def encode(data, size, cksum, ver, ts):
        if ln.strip().startswith("# fmt: "):
            continue

        if ln:
            nb = 0
        else:
            nb += 1
            if nb > 2:
                continue

        unpk += ln + "\n"

    for k, v in [
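The new `nb` counter collapses runs of blank lines while the loader stub is being inlined: a non-blank line resets it, each blank line increments it, and anything past two consecutive blanks is dropped (setting `nb = 9` right after a skip-block makes sure the blanks left behind by a removed region get dropped as well), shaving a few bytes off every sfx. A standalone sketch of the same squeeze, for illustration only:

    def squeeze_blanks(lines, max_blank=2):
        out, nb = [], 0
        for ln in lines:
            nb = 0 if ln else nb + 1    # consecutive blank-line counter, reset on content
            if nb > max_blank:
                continue                # drop the third-and-later blank in a row
            out.append(ln)
        return out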

@@ -177,7 +186,7 @@ def encode(data, size, cksum, ver, ts):
    unpk = unpk.replace("\t ", "\t\t")

    with open("sfx.out", "wb") as f:
        f.write(unpk.encode("utf-8") + b"\n\n# eof\n# ")
        f.write(unpk.encode("utf-8").rstrip(b"\n") + b"\n\n\n# eof\n# ")
        for buf in data:
            ebuf = buf.replace(b"\n", b"\n#n").replace(b"\r", b"\n#r")
            f.write(ebuf)
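The payload bytes are escaped so that the generated file stays line-oriented: every `\n` in the attached data becomes `\n#n` and every `\r` becomes `\n#r`, so each line of the embedded payload begins with `#` and reads as a Python comment after the `# eof` marker. A sketch of the inverse mapping, assuming the loader simply reverses the substitution (this decoder is illustrative, not copied from copyparty):

    def unescape(ebuf):
        # undo buf.replace(b"\n", b"\n#n").replace(b"\r", b"\n#r") from encode()
        return ebuf.replace(b"\n#r", b"\r").replace(b"\n#n", b"\n")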

@@ -260,6 +269,12 @@ def unpack():
        raise Exception(t.format(CKSUM, SIZE, ck, sz))

    with tarfile.open(tar, "r:bz2") as tf:
        # this is safe against traversal
        # skip 1
        # since it will never process user-provided data;
        # the only possible input is a single tar.bz2
        # which gets hardcoded into this script at build stage
        # skip 0
        tf.extractall(mine)

    os.remove(tar)
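The `# skip 1` / `# skip 0` markers tie back to the encode() loop above: lines between them stay in the repository copy of sfx.py but are stripped from the stub that gets embedded into the built sfx, so this explanatory comment block costs nothing in the shipped file. A sketch of that marker-driven filter, with the skip-branch body filled in as an assumption (the hunk above cuts off right after the `# skip 1` test):

    def strip_marked(lines):
        out, skip = [], False
        for ln in lines:
            if ln.endswith("# skip 0"):
                skip = False            # close the skipped region; the marker line is dropped too
                continue
            if ln.endswith("# skip 1") or skip:
                skip = True             # assumed body of the branch shown in the diff
                continue
            out.append(ln)
        return out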

7  setup.py
@@ -4,6 +4,7 @@ from __future__ import print_function

import os
import sys
import subprocess as sp
from shutil import rmtree
from setuptools import setup, Command, find_packages

@@ -28,6 +29,11 @@ with open(here + "/README.md", "rb") as f:
    txt = f.read().decode("utf-8")
    long_description = txt

try:
    cmd = "bash scripts/genlic.sh copyparty/res/COPYING.txt"
    sp.Popen(cmd.split()).wait()
except:
    pass

about = {}
if not VERSION:

@@ -100,6 +106,7 @@ args = {
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9",
        "Programming Language :: Python :: 3.10",
        "Programming Language :: Python :: 3.11",
        "Programming Language :: Python :: Implementation :: CPython",
        "Programming Language :: Python :: Implementation :: Jython",
        "Programming Language :: Python :: Implementation :: PyPy",

@@ -178,10 +178,13 @@ class TestVFS(unittest.TestCase):
        self.assertEqual(n.realpath, os.path.join(td, "a"))
        self.assertAxs(n.axs.uread, ["*"])
        self.assertAxs(n.axs.uwrite, [])
        self.assertEqual(vfs.can_access("/", "*"), (False, False, False, False, False))
        self.assertEqual(vfs.can_access("/", "k"), (True, True, False, False, False))
        self.assertEqual(vfs.can_access("/a", "*"), (True, False, False, False, False))
        self.assertEqual(vfs.can_access("/a", "k"), (True, False, False, False, False))
        perm_na = (False, False, False, False, False, False)
        perm_rw = (True, True, False, False, False, False)
        perm_ro = (True, False, False, False, False, False)
        self.assertEqual(vfs.can_access("/", "*"), perm_na)
        self.assertEqual(vfs.can_access("/", "k"), perm_rw)
        self.assertEqual(vfs.can_access("/a", "*"), perm_ro)
        self.assertEqual(vfs.can_access("/a", "k"), perm_ro)

        # breadth-first construction
        vfs = AuthSrv(