Compare commits
86 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 13e77777d7 | |
| | 89c6c2e0d9 | |
| | 14af136fcd | |
| | d39a99c929 | |
| | 43ee6b9f5b | |
| | 8a38101e48 | |
| | 5026b21226 | |
| | d07859e8e6 | |
| | df7219d3b6 | |
| | ad9be54f55 | |
| | eeecc50757 | |
| | 8ff7094e4d | |
| | 58ae38c613 | |
| | 7f1c992601 | |
| | fbfdd8338b | |
| | bbc379906a | |
| | 33f41f3e61 | |
| | 655f6d00f8 | |
| | fd552842d4 | |
| | 6bd087ddc5 | |
| | 0504b010a1 | |
| | 39cc92d4bc | |
| | a0da0122b9 | |
| | 879e83e24f | |
| | 64ad585318 | |
| | f262aee800 | |
| | d4da386172 | |
| | 5d92f4df49 | |
| | 6f8a588c4d | |
| | 7c8e368721 | |
| | f7a43a8e46 | |
| | 02879713a2 | |
| | acbb8267e1 | |
| | 8796c09f56 | |
| | d636316a19 | |
| | ed524d84bb | |
| | f0cdd9f25d | |
| | 4e797a7156 | |
| | 136c0fdc2b | |
| | cab999978e | |
| | fabeebd96b | |
| | b1cf588452 | |
| | c354a38b4c | |
| | a17c267d87 | |
| | c1180d6f9c | |
| | d3db6d296f | |
| | eefa0518db | |
| | 945170e271 | |
| | 6c2c6090dc | |
| | b2e233403d | |
| | e397ec2e48 | |
| | fade751a3e | |
| | 0f386c4b08 | |
| | 14bccbe45f | |
| | 55eb692134 | |
| | b32d65207b | |
| | 64cac003d8 | |
| | 6dbfcddcda | |
| | b4e0a34193 | |
| | 01c82b54a7 | |
| | 4ef3106009 | |
| | aa3a971961 | |
| | b9d0c8536b | |
| | 3313503ea5 | |
| | d999d3a921 | |
| | e7d00bae39 | |
| | 650e41c717 | |
| | 140f6e0389 | |
| | 5e111ba5ee | |
| | 95a599961e | |
| | a55e0d6eb8 | |
| | 2fd2c6b948 | |
| | 7a936ea01e | |
| | 226c7c3045 | |
| | a4239a466b | |
| | d0eb014c38 | |
| | e01ba8552a | |
| | 024303592a | |
| | 86419b8f47 | |
| | f1358dbaba | |
| | e8a653ca0c | |
| | 9bc09ce949 | |
| | dc8e621d7c | |
| | dee0950f74 | |
| | 143f72fe36 | |
| | a7889fb6a2 | |
`.vscode/launch.json` (vendored), 3 changes

```
@@ -19,8 +19,7 @@
    "-emp",
    "-e2dsa",
    "-e2ts",
    "-mtp",
    ".bpm=f,bin/mtag/audio-bpm.py",
    "-mtp=.bpm=f,bin/mtag/audio-bpm.py",
    "-aed:wark",
    "-vsrv::r:rw,ed:c,dupe",
    "-vdist:dist:r"
```
`LICENSE`, 2 changes

```
@@ -1,6 +1,6 @@
MIT License

Copyright (c) 2019 ed
Copyright (c) 2019 ed <oss@ocv.me>

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
```
`README.md`, 80 changes

@@ -3,7 +3,7 @@

turn almost any device into a file server with resumable uploads/downloads using [*any*](#browser-support) web browser

* server only needs Python (2 or 3), all dependencies optional
* 🔌 protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
* 🔌 protocols: [http](#the-browser) // [webdav](#webdav-server) // [ftp](#ftp-server) // [tftp](#tftp-server) // [smb/cifs](#smb-server)
* 📱 [android app](#android-app) // [iPhone shortcuts](#ios-shortcuts)

👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland

@@ -53,6 +53,7 @@ turn almost any device into a file server with resumable uploads/downloads using

* [ftp server](#ftp-server) - an FTP server can be started using `--ftp 3921`
* [webdav server](#webdav-server) - with read-write support
  * [connecting to webdav from windows](#connecting-to-webdav-from-windows) - using the GUI
* [tftp server](#tftp-server) - a TFTP server (read/write) can be started using `--tftp 3969`
* [smb server](#smb-server) - unsafe, slow, not recommended for wan
* [browser ux](#browser-ux) - tweaking the ui
* [file indexing](#file-indexing) - enables dedup and music search ++

@@ -157,11 +158,11 @@ you may also want these, especially on servers:

and remember to open the ports you want; here's a complete example including every feature copyparty has to offer:
```
firewall-cmd --permanent --add-port={80,443,3921,3923,3945,3990}/tcp  # --zone=libvirt
firewall-cmd --permanent --add-port=12000-12099/tcp --permanent  # --zone=libvirt
firewall-cmd --permanent --add-port={1900,5353}/udp  # --zone=libvirt
firewall-cmd --permanent --add-port=12000-12099/tcp  # --zone=libvirt
firewall-cmd --permanent --add-port={69,1900,3969,5353}/udp  # --zone=libvirt
firewall-cmd --reload
```
(1900:ssdp, 3921:ftp, 3923:http/https, 3945:smb, 3990:ftps, 5353:mdns, 12000:passive-ftp)
(69:tftp, 1900:ssdp, 3921:ftp, 3923:http/https, 3945:smb, 3969:tftp, 3990:ftps, 5353:mdns, 12000:passive-ftp)


## features

@@ -172,6 +173,7 @@ firewall-cmd --reload

* ☑ volumes (mountpoints)
* ☑ [accounts](#accounts-and-volumes)
* ☑ [ftp server](#ftp-server)
* ☑ [tftp server](#tftp-server)
* ☑ [webdav server](#webdav-server)
* ☑ [smb/cifs server](#smb-server)
* ☑ [qr-code](#qr-code) for quick access

@@ -739,7 +741,8 @@ some hilights:

click the `play` link next to an audio file, or copy the link target to [share it](https://a.ocv.me/pub/demo/music/Ubiktune%20-%20SOUNDSHOCK%202%20-%20FM%20FUNK%20TERRROR!!/#af-1fbfba61&t=18) (optionally with a timestamp to start playing from, like that example does)

open the `[🎺]` media-player-settings tab to configure it,
* switches:
* "switches":
  * `[🔀]` shuffles the files inside each folder
  * `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
  * `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
  * `[~s]` toggles the seekbar waveform display

@@ -749,10 +752,12 @@ open the `[🎺]` media-player-settings tab to configure it,

  * `[art]` shows album art on the lockscreen
  * `[🎯]` keeps the playing song scrolled into view (good when using the player as a taskbar dock)
  * `[⟎]` shrinks the playback controls
* playback mode:
* "buttons":
  * `[uncache]` may fix songs that won't play correctly due to bad files in browser cache
* "at end of folder":
  * `[loop]` keeps looping the folder
  * `[next]` plays into the next folder
* transcode:
* "transcode":
  * `[flac]` converts `flac` and `wav` files into opus
  * `[aac]` converts `aac` and `m4a` files into opus
  * `[oth]` converts all other known formats into opus
@@ -940,6 +945,35 @@ known client bugs:

* latin-1 is fine, hiragana is not (not even as shift-jis on japanese xp)


## tftp server

a TFTP server (read/write) can be started using `--tftp 3969` (you probably want [ftp](#ftp-server) instead unless you are *actually* communicating with hardware from the 90s (in which case we should definitely hang some time))

> that makes this the first RTX DECT Base that has been updated using copyparty 🎉

* based on [partftpy](https://github.com/9001/partftpy)
* no accounts; read from world-readable folders, write to world-writable, overwrite in world-deletable
* needs a dedicated port (cannot share with the HTTP/HTTPS API)
  * run as root (or see below) to use the spec-recommended port `69` (nice)
* can reply from a predefined portrange (good for firewalls)
* only supports the binary/octet/image transfer mode (no netascii)
* [RFC 7440](https://datatracker.ietf.org/doc/html/rfc7440) is **not** supported, so will be extremely slow over WAN
  * assuming default blksize (512), expect 1100 KiB/s over 100BASE-T, 400-500 KiB/s over wifi, 200 on bad wifi

most clients expect to find TFTP on port 69, but on linux and macos you need to be root to listen on that. Alternatively, listen on 3969 and use NAT on the server to forward 69 to that port;
* on linux: `iptables -t nat -A PREROUTING -i eth0 -p udp --dport 69 -j REDIRECT --to-port 3969`

some recommended TFTP clients:
* curl (cross-platform, read/write)
  * get: `curl --tftp-blksize 1428 tftp://127.0.0.1:3969/firmware.bin`
  * put: `curl --tftp-blksize 1428 -T firmware.bin tftp://127.0.0.1:3969/`
* windows: `tftp.exe` (you probably already have it)
  * `tftp -i 127.0.0.1 put firmware.bin`
* linux: `tftp-hpa`, `atftp`
  * `atftp --option "blksize 1428" 127.0.0.1 3969 -p -l firmware.bin -r firmware.bin`
  * `tftp -v -m binary 127.0.0.1 3969 -c put firmware.bin`
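and if you want to script against the TFTP endpoint without any of the clients above, a minimal sketch in plain Python (standard library only) is shown below; it speaks octet mode with the default 512-byte blocks described earlier, and the host, port and filename are placeholder values borrowed from the examples, so treat this as an illustration of the protocol rather than a supported tool:

```python
# minimal TFTP "get" (RFC 1350, octet mode, default 512-byte blocks);
# host/port/filename are assumptions taken from the examples above
import socket
import struct

def tftp_get(host, port, fname, out):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.settimeout(5)
    # RRQ: opcode 1, filename, NUL, mode "octet", NUL
    s.sendto(b"\x00\x01" + fname.encode() + b"\x00octet\x00", (host, port))
    with open(out, "wb") as f:
        while True:
            pkt, addr = s.recvfrom(4 + 512)
            op, blk = struct.unpack("!HH", pkt[:4])
            if op == 5:  # ERROR packet; payload is the error message
                raise RuntimeError(pkt[4:].decode(errors="replace"))
            f.write(pkt[4:])
            s.sendto(struct.pack("!HH", 4, blk), addr)  # ACK this block
            if len(pkt) - 4 < 512:  # a short block means the transfer is done
                break

tftp_get("127.0.0.1", 3969, "firmware.bin", "firmware.bin")
```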

## smb server

unsafe, slow, not recommended for wan, enable with `--smb` for read-only or `--smbw` for read-write

@@ -970,7 +1004,7 @@ known client bugs:

* however smb1 is buggy and is not enabled by default on win10 onwards
* windows cannot access folders which contain filenames with invalid unicode or forbidden characters (`<>:"/\|?*`), or names ending with `.`

the smb protocol listens on TCP port 445, which is a privileged port on linux and macos, which would require running copyparty as root. However, this can be avoided by listening on another port using `--smb-port 3945` and then using NAT to forward the traffic from 445 to there;
the smb protocol listens on TCP port 445, which is a privileged port on linux and macos, which would require running copyparty as root. However, this can be avoided by listening on another port using `--smb-port 3945` and then using NAT on the server to forward the traffic from 445 to there;
* on linux: `iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 445 -j REDIRECT --to-port 3945`

authenticate with one of the following:
@@ -1028,6 +1062,8 @@ to save some time, you can provide a regex pattern for filepaths to only index

similarly, you can fully ignore files/folders using `--no-idx [...]` and `:c,noidx=\.iso$`

* when running on macos, all the usual apple metadata files are excluded by default

if you set `--no-hash [...]` globally, you can enable hashing for specific volumes using flag `:c,nohash=`

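to make the matching concrete, here is a hedged sketch of what the `\.iso$` example above would skip; the values are regexes tested against absolute filesystem paths, and the paths below are invented for illustration:

```python
# the noidx/nohash values are regexes tested against absolute filesystem
# paths; anything that matches is skipped during indexing (paths invented)
import re

noidx = re.compile(r"\.iso$")
for p in ["/mnt/nas/backups/win10.iso", "/mnt/nas/music/song.flac"]:
    print(p, "-> skipped" if noidx.search(p) else "-> indexed")
```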
### filesystem guards

@@ -1535,8 +1571,8 @@ TLDR: yes

| navpane | - | yep | yep | yep | yep | yep | yep | yep |
| image viewer | - | yep | yep | yep | yep | yep | yep | yep |
| video player | - | yep | yep | yep | yep | yep | yep | yep |
| markdown editor | - | - | yep | yep | yep | yep | yep | yep |
| markdown viewer | - | yep | yep | yep | yep | yep | yep | yep |
| markdown editor | - | - | `*2` | `*2` | yep | yep | yep | yep |
| markdown viewer | - | `*2` | `*2` | `*2` | yep | yep | yep | yep |
| play mp3/m4a | - | yep | yep | yep | yep | yep | yep | yep |
| play ogg/opus | - | - | - | - | yep | yep | `*3` | yep |
| **= feature =** | ie6 | ie9 | ie10 | ie11 | ff 52 | c 49 | iOS | Andr |

@@ -1544,6 +1580,7 @@ TLDR: yes

* internet explorer 6 through 8 behave the same
* firefox 52 and chrome 49 are the final winxp versions
* `*1` yes, but extremely slow (ie10: `1 MiB/s`, ie11: `270 KiB/s`)
* `*2` only able to do plaintext documents (no markdown rendering)
* `*3` iOS 11 and newer, opus only, and requires FFmpeg on the server

quick summary of more eccentric web-browsers trying to view a directory index:
@@ -1569,10 +1606,12 @@ interact with copyparty using non-browser clients

* `var xhr = new XMLHttpRequest(); xhr.open('POST', '//127.0.0.1:3923/msgs?raw'); xhr.send('foo');`

* curl/wget: upload some files (post=file, chunk=stdin)
  * `post(){ curl -F act=bput -F f=@"$1" http://127.0.0.1:3923/?pw=wark;}`
    `post movie.mkv`
  * `post(){ curl -F f=@"$1" http://127.0.0.1:3923/?pw=wark;}`
    `post movie.mkv` (gives HTML in return)
  * `post(){ curl -F f=@"$1" 'http://127.0.0.1:3923/?want=url&pw=wark';}`
    `post movie.mkv` (gives hotlink in return)
  * `post(){ curl -H pw:wark -H rand:8 -T "$1" http://127.0.0.1:3923/;}`
    `post movie.mkv`
    `post movie.mkv` (randomized filename)
  * `post(){ wget --header='pw: wark' --post-file="$1" -O- http://127.0.0.1:3923/?raw;}`
    `post movie.mkv`
  * `chunk(){ curl -H pw:wark -T- http://127.0.0.1:3923/;}`
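the same `bput` upload also works from a script; below is a rough sketch using the third-party `requests` library (an assumption, it is not bundled with copyparty), mirroring the field name, action and url parameters of the curl examples above:

```python
# rough equivalent of the curl "post" functions above, using the 'requests'
# library (not part of copyparty); field name "f" and act=bput are the same
# multipart parameters the curl examples use
import requests

url = "http://127.0.0.1:3923/"
with open("movie.mkv", "rb") as f:
    r = requests.post(
        url,
        params={"pw": "wark", "want": "url"},  # want=url returns a hotlink
        data={"act": "bput"},
        files={"f": ("movie.mkv", f)},
    )
print(r.status_code, r.text)
```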
@@ -1594,6 +1633,10 @@ interact with copyparty using non-browser clients

* sharex (screenshot utility): see [./contrib/sharex.sxcu](contrib/#sharexsxcu)

* contextlet (web browser integration); see [contrib contextlet](contrib/#send-to-cppcontextletjson)

* [igloo irc](https://iglooirc.com/): Method: `post` Host: `https://you.com/up/?want=url&pw=hunter2` Multipart: `yes` File parameter: `f`

copyparty returns a truncated sha512sum of your PUT/POST as base64; you can generate the same checksum locally to verify uploads:

b512(){ printf "$((sha512sum||shasum -a512)|sed -E 's/ .*//;s/(..)/\\x\1/g')"|base64|tr '+/' '-_'|head -c44;}
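if `sha512sum`/`shasum` are unavailable, a rough Python equivalent of the shell function above looks like this (same truncated, url-safe base64 output):

```python
# same checksum as the b512() shell function above: sha512 digest, base64
# with '+/' swapped for '-_', truncated to 44 characters
import base64
import hashlib
import sys

def b512(path):
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for blk in iter(lambda: f.read(1 << 20), b""):
            h.update(blk)
    return base64.urlsafe_b64encode(h.digest()).decode()[:44]

print(b512(sys.argv[1]))
```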
@@ -1612,7 +1655,7 @@ the commandline uploader [u2c.py](https://github.com/9001/copyparty/tree/hovudst

alternatively there is [rclone](./docs/rclone.md) which allows for bidirectional sync and is *way* more flexible (stream files straight from sftp/s3/gcs to copyparty, ...), although there is no integrity check and it won't work with files over 100 MiB if copyparty is behind cloudflare

* starting from rclone v1.63 (currently [in beta](https://beta.rclone.org/?filter=latest)), rclone will also be faster than u2c.py
* starting from rclone v1.63, rclone is faster than u2c.py on low-latency connections


## mount as drive

@@ -1621,7 +1664,7 @@ a remote copyparty server as a local filesystem; go to the control-panel and cl

alternatively, some alternatives roughly sorted by speed (unreproducible benchmark), best first:

* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE ([v1.63-beta](https://beta.rclone.org/?filter=latest))
* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE (rclone v1.63 or later)
* [rclone-http](./docs/rclone.md) (26s), read-only
* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
* [rclone-ftp](./docs/rclone.md) (47s), read/WRITE

@@ -1661,6 +1704,7 @@ below are some tweaks roughly ordered by usefulness:

* `-q` disables logging and can help a bunch, even when combined with `-lo` to redirect logs to file
* `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
  * and also makes thumbnails load faster, regardless of e2d/e2t
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:

@@ -1668,7 +1712,7 @@ below are some tweaks roughly ordered by usefulness:

  * simultaneous downloads and uploads saturating a 20gbps connection
  * if `-e2d` is enabled, `-j2` gives 4x performance for directory listings; `-j4` gives 16x

  ...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
  ...however it also increases the server/filesystem/HDD load during uploads, and adds an overhead to internal communication, so it is usually a better idea to don't
* using [pypy](https://www.pypy.org/) instead of [cpython](https://www.python.org/) *can* be 70% faster for some workloads, but slower for many others
  * and pypy can sometimes crash on startup with `-j0` (TODO make issue)

@@ -1677,7 +1721,7 @@ below are some tweaks roughly ordered by usefulness:

when uploading files,

* chrome is recommended, at least compared to firefox:
* chrome is recommended (unfortunately), at least compared to firefox:
  * up to 90% faster when hashing, especially on SSDs
  * up to 40% faster when uploading over extremely fast internets
  * but [u2c.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py) can be 40% faster than chrome again

@@ -1879,7 +1923,7 @@ can be convenient on machines where installing python is problematic, however is

meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) instead relies on your system python which gives better performance and will stay safe as long as you keep your python install up-to-date

then again, if you are already into downloading shady binaries from the internet, you may also want my [minimal builds](./scripts/pyinstaller#ffmpeg) of [ffmpeg](https://ocv.me/stuff/bin/ffmpeg.exe) and [ffprobe](https://ocv.me/stuff/bin/ffprobe.exe) which enables copyparty to extract multimedia-info, do audio-transcoding, and thumbnails/spectrograms/waveforms, however it's much better to instead grab a [recent official build](https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip) every once in a while if you can afford the size
then again, if you are already into downloading shady binaries from the internet, you may also want my [minimal builds](./scripts/pyinstaller#ffmpeg) of [ffmpeg](https://ocv.me/stuff/bin/ffmpeg.exe) and [ffprobe](https://ocv.me/stuff/bin/ffprobe.exe) which enables copyparty to extract multimedia-info, do audio-transcoding, and thumbnails/spectrograms/waveforms, however it's much better to instead grab a [recent official build](https://www.gyan.dev/ffmpeg/builds/ffmpeg-git-full.7z) every once in a while if you can afford the size


# install on android
`bin/u2c.py`, 38 changes

```
@@ -1,8 +1,8 @@
#!/usr/bin/env python3
from __future__ import print_function, unicode_literals

S_VERSION = "1.12"
S_BUILD_DT = "2023-12-08"
S_VERSION = "1.15"
S_BUILD_DT = "2024-02-18"

"""
u2c.py: upload to copyparty
```
```
@@ -29,7 +29,7 @@ import platform
import threading
import datetime

EXE = sys.executable.endswith("exe")
EXE = bool(getattr(sys, "frozen", False))

try:
    import argparse
```
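For context on the `EXE` change above: `sys.executable` ends with `exe` for any Python interpreter on Windows, while the `frozen` attribute is only set by bundlers such as PyInstaller, so the new check only matches actual single-file builds; a small sketch:

```python
# sys.executable is "...\python.exe" on every windows install, so the old
# suffix check also matched plain interpreters; bundlers (pyinstaller, py2exe)
# set sys.frozen, which does not exist otherwise
import sys

EXE = bool(getattr(sys, "frozen", False))
print("running from a frozen bundle" if EXE else "running from a normal interpreter")
```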
```
@@ -560,8 +560,11 @@ def handshake(ar, file, search):
    }
    if search:
        req["srch"] = 1
    elif ar.dr:
        req["replace"] = True
    else:
        if ar.touch:
            req["umod"] = True
        if ar.dr:
            req["replace"] = True

    headers = {"Content-Type": "text/plain"}  # <=1.5.1 compat
    if pw:
```
```
@@ -843,12 +846,12 @@ class Ctl(object):
    txt = " "

    if not self.up_br:
        spd = self.hash_b / (time.time() - self.t0)
        eta = (self.nbytes - self.hash_b) / (spd + 1)
        spd = self.hash_b / ((time.time() - self.t0) or 1)
        eta = (self.nbytes - self.hash_b) / (spd or 1)
    else:
        spd = self.up_br / (time.time() - self.t0_up)
        spd = self.up_br / ((time.time() - self.t0_up) or 1)
        spd = self.spd = (self.spd or spd) * 0.9 + spd * 0.1
        eta = (self.nbytes - self.up_b) / (spd + 1)
        eta = (self.nbytes - self.up_b) / (spd or 1)

    spd = humansize(spd)
    self.eta = str(datetime.timedelta(seconds=int(eta)))
```
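The `or 1` guards added above avoid a division by zero (or a nonsense ETA) when no time has passed yet or the measured speed is still zero; a tiny illustration:

```python
# why "(elapsed) or 1": right after start, elapsed can be 0.0, and dividing
# by it would raise ZeroDivisionError; falling back to 1 just means "assume
# one second has passed" until real numbers are available
elapsed = 0.0
spd = 1024 / (elapsed or 1)    # 1024 B/s instead of a crash
eta = 10_000_000 / (spd or 1)  # same trick when spd itself is zero
print(spd, eta)
```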
```
@@ -874,6 +877,8 @@ class Ctl(object):
    self.st_hash = [file, ofs]

def hasher(self):
    ptn = re.compile(self.ar.x.encode("utf-8"), re.I) if self.ar.x else None
    sep = "{0}".format(os.sep).encode("ascii")
    prd = None
    ls = {}
    for top, rel, inf in self.filegen:

@@ -906,7 +911,12 @@ class Ctl(object):
    if self.ar.drd:
        dp = os.path.join(top, rd)
        lnodes = set(os.listdir(dp))
        bnames = [x for x in ls if x not in lnodes]
        if ptn:
            zs = dp.replace(sep, b"/").rstrip(b"/") + b"/"
            zls = [zs + x for x in lnodes]
            zls = [x for x in zls if not ptn.match(x)]
            lnodes = [x.split(b"/")[-1] for x in zls]
        bnames = [x for x in ls if x not in lnodes and x != b".hist"]
        vpath = self.ar.url.split("://")[-1].split("/", 1)[-1]
        names = [x.decode("utf-8", "replace") for x in bnames]
        locs = [vpath + srd + "/" + x for x in names]

@@ -1057,14 +1067,13 @@ class Ctl(object):
    self.uploader_busy += 1
    self.t0_up = self.t0_up or time.time()

    zs = "{0}/{1}/{2}/{3} {4}/{5} {6}"
    stats = zs.format(
    stats = "%d/%d/%d/%d %d/%d %s" % (
        self.up_f,
        len(self.recheck),
        self.uploader_busy,
        self.nfiles - self.up_f,
        int(self.nbytes / (1024 * 1024)),
        int((self.nbytes - self.up_b) / (1024 * 1024)),
        self.nbytes // (1024 * 1024),
        (self.nbytes - self.up_b) // (1024 * 1024),
        self.eta,
    )

@@ -1130,6 +1139,7 @@ source file/folder selection uses rsync syntax, meaning that:
    ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
    ap.add_argument("-x", type=unicode, metavar="REGEX", default="", help="skip file if filesystem-abspath matches REGEX, example: '.*/\.hist/.*'")
    ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
    ap.add_argument("--touch", action="store_true", help="if last-modified timestamps differ, push local to server (need write+delete perms)")
    ap.add_argument("--version", action="store_true", help="show version and exit")

    ap = app.add_argument_group("compatibility")
```
@@ -17,10 +17,10 @@

* `RequestURL`: full URL to the target folder
* `pw`: password (remove the `pw` line if anon-write)

however if your copyparty is behind a reverse-proxy, you may want to use [`sharex-html.sxcu`](sharex-html.sxcu) instead:
* `RequestURL`: full URL to the target folder
* `URL`: full URL to the root folder (with trailing slash) followed by `$regex:1|1$`
* `pw`: password (remove `Parameters` if anon-write)

### [`send-to-cpp.contextlet.json`](send-to-cpp.contextlet.json)
* browser integration, kind of? custom rightclick actions and stuff
* rightclick a pic and send it to copyparty straight from your browser
* for the [contextlet](https://addons.mozilla.org/en-US/firefox/addon/contextlets/) firefox extension

### [`media-osd-bgone.ps1`](media-osd-bgone.ps1)
* disables the [windows OSD popup](https://user-images.githubusercontent.com/241032/122821375-0e08df80-d2dd-11eb-9fd9-184e8aacf1d0.png) (the thing on the left) which appears every time you hit media hotkeys to adjust volume or change song while playing music with the copyparty web-ui, or most other audio players really
```
@@ -1,8 +1,8 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.9.27"
pkgver="1.10.1"
pkgrel=1
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, zeroconf, media indexer, thumbnails++"
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
arch=("any")
url="https://github.com/9001/${pkgname}"
license=('MIT')

@@ -21,7 +21,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("0bcf9362bc1bd9c85c228312c8341fcb9a37e383af6d8bee123e3a84e66394be")
sha256sums=("3969bbacccaa2fbb4c0bb1c971d9fd7d1851c35f829a1f2f02ad281f5f6dfe53")

build() {
    cd "${srcdir}/${pkgname}-${pkgver}"
```

```
@@ -1,5 +1,5 @@
{
    "url": "https://github.com/9001/copyparty/releases/download/v1.9.27/copyparty-sfx.py",
    "version": "1.9.27",
    "hash": "sha256-o06o/gUIkQhDw9a9SLjuAyQUoLMfFCqkIeonUFzezds="
    "url": "https://github.com/9001/copyparty/releases/download/v1.10.1/copyparty-sfx.py",
    "version": "1.10.1",
    "hash": "sha256-p1SF0BKY+qcs+/ZpqgU3dfK4E+/rpxezsiY6U1obhx4="
}
```
`contrib/send-to-cpp.contextlet.json` (new file), 11 lines

```
@@ -0,0 +1,11 @@
{
    "code": "// https://addons.mozilla.org/en-US/firefox/addon/contextlets/\n// https://github.com/davidmhammond/contextlets\n\nvar url = 'http://partybox.local:3923/';\nvar pw = 'wark';\n\nvar xhr = new XMLHttpRequest();\nxhr.msg = this.info.linkUrl || this.info.srcUrl;\nxhr.open('POST', url, true);\nxhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded;charset=UTF-8');\nxhr.setRequestHeader('PW', pw);\nxhr.send('msg=' + xhr.msg);\n",
    "contexts": [
        "link"
    ],
    "icons": null,
    "patterns": "",
    "scope": "background",
    "title": "send to cpp",
    "type": "normal"
}
```

```
@@ -1,19 +0,0 @@
{
    "Version": "13.5.0",
    "Name": "copyparty-html",
    "DestinationType": "ImageUploader",
    "RequestMethod": "POST",
    "RequestURL": "http://127.0.0.1:3923/sharex",
    "Parameters": {
        "pw": "wark"
    },
    "Body": "MultipartFormData",
    "Arguments": {
        "act": "bput"
    },
    "FileFormName": "f",
    "RegexList": [
        "bytes // <a href=\"/([^\"]+)\""
    ],
    "URL": "http://127.0.0.1:3923/$regex:1|1$"
}
```

```
@@ -1,17 +1,19 @@
{
    "Version": "13.5.0",
    "Version": "15.0.0",
    "Name": "copyparty",
    "DestinationType": "ImageUploader",
    "RequestMethod": "POST",
    "RequestURL": "http://127.0.0.1:3923/sharex",
    "Parameters": {
        "pw": "wark",
        "j": null
    },
    "Headers": {
        "pw": "PUT_YOUR_PASSWORD_HERE_MY_DUDE"
    },
    "Body": "MultipartFormData",
    "Arguments": {
        "act": "bput"
    },
    "FileFormName": "f",
    "URL": "$json:files[0].url$"
    "URL": "{json:files[0].url}"
}
```
```
@@ -20,17 +20,31 @@ import time
import traceback
import uuid

from .__init__ import ANYWIN, CORES, EXE, PY2, VT100, WINDOWS, E, EnvParams, unicode
from .__init__ import (
    ANYWIN,
    CORES,
    EXE,
    MACOS,
    PY2,
    VT100,
    WINDOWS,
    E,
    EnvParams,
    unicode,
)
from .__version__ import CODENAME, S_BUILD_DT, S_VERSION
from .authsrv import expand_config_file, split_cfg_ln, upgrade_cfg_fmt
from .cfg import flagcats, onedash
from .svchub import SvcHub
from .util import (
    APPLESAN_TXT,
    DEF_EXP,
    DEF_MTE,
    DEF_MTH,
    IMPLICATIONS,
    JINJA_VER,
    PARTFTPY_VER,
    PY_DESC,
    PYFTPD_VER,
    SQLITE_VER,
    UNPLICATIONS,

@@ -38,7 +52,6 @@ from .util import (
    ansi_re,
    dedent,
    min_ex,
    py_desc,
    pybin,
    termsize,
    wrap,

@@ -826,7 +839,7 @@ def add_general(ap, nc, srvname):
    ap2.add_argument("-nc", metavar="NUM", type=int, default=nc, help="max num clients")
    ap2.add_argument("-j", metavar="CORES", type=int, default=1, help="max num cpu cores, 0=all")
    ap2.add_argument("-a", metavar="ACCT", type=u, action="append", help="add account, \033[33mUSER\033[0m:\033[33mPASS\033[0m; example [\033[32med:wark\033[0m]")
    ap2.add_argument("-v", metavar="VOL", type=u, action="append", help="add volume, \033[33mSRC\033[0m:\033[33mDST\033[0m:\033[33mFLAG\033[0m; examples [\033[32m.::r\033[0m], [\033[32m/mnt/nas/music:/music:r:aed\033[0m]")
    ap2.add_argument("-v", metavar="VOL", type=u, action="append", help="add volume, \033[33mSRC\033[0m:\033[33mDST\033[0m:\033[33mFLAG\033[0m; examples [\033[32m.::r\033[0m], [\033[32m/mnt/nas/music:/music:r:aed\033[0m], see --help-accounts")
    ap2.add_argument("-ed", action="store_true", help="enable the ?dots url parameter / client option which allows clients to see dotfiles / hidden files (volflag=dots)")
    ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-form POSTs; see \033[33m--help-urlform\033[0m")
    ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="server terminal title, for example [\033[32m$ip-10.1.2.\033[0m] or [\033[32m$ip-]")

@@ -847,6 +860,12 @@ def add_qr(ap, tty):
    ap2.add_argument("--qrz", metavar="N", type=int, default=0, help="[\033[32m1\033[0m]=1x, [\033[32m2\033[0m]=2x, [\033[32m0\033[0m]=auto (try [\033[32m2\033[0m] on broken fonts)")


def add_fs(ap):
    ap2 = ap.add_argument_group("filesystem options")
    rm_re_def = "5/0.1" if ANYWIN else "0/0"
    ap2.add_argument("--rm-retry", metavar="T/R", type=u, default=rm_re_def, help="if a file cannot be deleted because it is busy, continue trying for \033[33mT\033[0m seconds, retry every \033[33mR\033[0m seconds; disable with 0/0 (volflag=rm_retry)")


def add_upload(ap):
    ap2 = ap.add_argument_group('upload options')
    ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads, hiding them from clients unless \033[33m-ed\033[0m")

@@ -870,7 +889,7 @@ def add_upload(ap):
    ap2.add_argument("--df", metavar="GiB", type=float, default=0, help="ensure \033[33mGiB\033[0m free disk space by rejecting upload requests")
    ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
    ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
    ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good for low-latency (same-country) connections, 4-8 for android clients, 16-32 for cross-atlantic (max=64)")
    ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good for low-latency (same-country) connections, 4-8 for android clients, 16 for cross-atlantic (max=64)")
    ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; [\033[32ms\033[0m]=smallest-first, [\033[32mn\033[0m]=alphabetical, [\033[32mfs\033[0m]=force-s, [\033[32mfn\033[0m]=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
    ap2.add_argument("--write-uplog", action="store_true", help="write POST reports to textfiles in working-directory")

@@ -975,7 +994,7 @@ def add_zc_ssdp(ap):


def add_ftp(ap):
    ap2 = ap.add_argument_group('FTP options')
    ap2 = ap.add_argument_group('FTP options (TCP only)')
    ap2.add_argument("--ftp", metavar="PORT", type=int, help="enable FTP server on \033[33mPORT\033[0m, for example \033[32m3921")
    ap2.add_argument("--ftps", metavar="PORT", type=int, help="enable FTPS server on \033[33mPORT\033[0m, for example \033[32m3990")
    ap2.add_argument("--ftpv", action="store_true", help="verbose")

@@ -995,6 +1014,18 @@ def add_webdav(ap):
    ap2.add_argument("--dav-auth", action="store_true", help="force auth for all folders (required by davfs2 when only some folders are world-readable) (volflag=davauth)")


def add_tftp(ap):
    ap2 = ap.add_argument_group('TFTP options (UDP only)')
    ap2.add_argument("--tftp", metavar="PORT", type=int, help="enable TFTP server on \033[33mPORT\033[0m, for example \033[32m69 \033[0mor \033[32m3969")
    ap2.add_argument("--tftpv", action="store_true", help="verbose")
    ap2.add_argument("--tftpvv", action="store_true", help="verboser")
    ap2.add_argument("--tftp-no-fast", action="store_true", help="debug: disable optimizations")
    ap2.add_argument("--tftp-lsf", metavar="PTN", type=u, default="\\.?(dir|ls)(\\.txt)?", help="return a directory listing if a file with this name is requested and it does not exist; defaults matches .ls, dir, .dir.txt, ls.txt, ...")
    ap2.add_argument("--tftp-nols", action="store_true", help="if someone tries to download a directory, return an error instead of showing its directory listing")
    ap2.add_argument("--tftp-ipa", metavar="PFX", type=u, default="", help="only accept connections from IP-addresses starting with \033[33mPFX\033[0m; specify [\033[32many\033[0m] to disable inheriting \033[33m--ipa\033[0m. Example: [\033[32m127., 10.89., 192.168.\033[0m]")
    ap2.add_argument("--tftp-pr", metavar="P-P", type=u, help="the range of UDP ports to use for data transfer, for example \033[32m12000-13000")


def add_smb(ap):
    ap2 = ap.add_argument_group('SMB/CIFS options')
    ap2.add_argument("--smb", action="store_true", help="enable smb (read-only) -- this requires running copyparty as root on linux and macos unless \033[33m--smb-port\033[0m is set above 1024 and your OS does port-forwarding from 445 to that.\n\033[1;31mWARNING:\033[0m this protocol is DANGEROUS and buggy! Never expose to the internet!")

@@ -1138,7 +1169,9 @@ def add_thumbnail(ap):
    ap2.add_argument("--th-size", metavar="WxH", default="320x256", help="thumbnail res (volflag=thsize)")
    ap2.add_argument("--th-mt", metavar="CORES", type=int, default=CORES, help="num cpu cores to use for generating thumbnails")
    ap2.add_argument("--th-convt", metavar="SEC", type=float, default=60, help="conversion timeout in seconds (volflag=convt)")
    ap2.add_argument("--th-no-crop", action="store_true", help="dynamic height; show full image by default (client can override in UI) (volflag=nocrop)")
    ap2.add_argument("--th-ram-max", metavar="GB", type=float, default=6, help="max memory usage (GiB) permitted by thumbnailer; not very accurate")
    ap2.add_argument("--th-crop", metavar="TXT", type=u, default="y", help="crop thumbnails to 4:3 or keep dynamic height; client can override in UI unless force. [\033[32mfy\033[0m]=crop, [\033[32mfn\033[0m]=nocrop, [\033[32mfy\033[0m]=force-y, [\033[32mfn\033[0m]=force-n (volflag=crop)")
    ap2.add_argument("--th-x3", metavar="TXT", type=u, default="n", help="show thumbs at 3x resolution; client can override in UI unless force. [\033[32mfy\033[0m]=yes, [\033[32mfn\033[0m]=no, [\033[32mfy\033[0m]=force-yes, [\033[32mfn\033[0m]=force-no (volflag=th3x)")
    ap2.add_argument("--th-dec", metavar="LIBS", default="vips,pil,ff", help="image decoders, in order of preference")
    ap2.add_argument("--th-no-jpg", action="store_true", help="disable jpg output")
    ap2.add_argument("--th-no-webp", action="store_true", help="disable webp output")

@@ -1166,6 +1199,7 @@ def add_transcoding(ap):


def add_db_general(ap, hcores):
    noidx = APPLESAN_TXT if MACOS else ""
    ap2 = ap.add_argument_group('general db options')
    ap2.add_argument("-e2d", action="store_true", help="enable up2k database, making files searchable + enables upload deduplication")
    ap2.add_argument("-e2ds", action="store_true", help="scan writable folders for new files on startup; sets \033[33m-e2d\033[0m")

@@ -1175,7 +1209,7 @@ def add_db_general(ap, hcores):
    ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
    ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume data (db, thumbs) (volflag=hist)")
    ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
    ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
    ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
    ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
    ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
    ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice -- only useful for offloading uploads to a cloud service or something (volflag=noforget)")

@@ -1294,6 +1328,7 @@ def run_argparse(
    add_zeroconf(ap)
    add_zc_mdns(ap)
    add_zc_ssdp(ap)
    add_fs(ap)
    add_upload(ap)
    add_db_general(ap, hcores)
    add_db_metadata(ap)

@@ -1301,6 +1336,7 @@ def run_argparse(
    add_transcoding(ap)
    add_ftp(ap)
    add_webdav(ap)
    add_tftp(ap)
    add_smb(ap)
    add_safety(ap)
    add_salt(ap, fk_salt, ah_salt)

@@ -1354,15 +1390,16 @@ def main(argv: Optional[list[str]] = None) -> None:
    if argv is None:
        argv = sys.argv

    f = '\033[36mcopyparty v{} "\033[35m{}\033[36m" ({})\n{}\033[0;36m\n   sqlite v{} | jinja2 v{} | pyftpd v{}\n\033[0m'
    f = '\033[36mcopyparty v{} "\033[35m{}\033[36m" ({})\n{}\033[0;36m\n   sqlite {} | jinja {} | pyftpd {} | tftp {}\n\033[0m'
    f = f.format(
        S_VERSION,
        CODENAME,
        S_BUILD_DT,
        py_desc().replace("[", "\033[90m["),
        PY_DESC.replace("[", "\033[90m["),
        SQLITE_VER,
        JINJA_VER,
        PYFTPD_VER,
        PARTFTPY_VER,
    )
    lprint(f)

@@ -1394,6 +1431,7 @@ def main(argv: Optional[list[str]] = None) -> None:
    deprecated: list[tuple[str, str]] = [
        ("--salt", "--warksalt"),
        ("--hdr-au-usr", "--idp-h-usr"),
        ("--th-no-crop", "--th-crop=n"),
    ]
    for dk, nk in deprecated:
        idx = -1
```
```
@@ -1432,7 +1470,7 @@ def main(argv: Optional[list[str]] = None) -> None:

    _, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    if hard > 0:  # -1 == infinite
        nc = min(nc, hard // 4)
        nc = min(nc, int(hard / 4))
    except:
        nc = 512

```
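For reference on the hunk above, `getrlimit` returns `(soft, hard)` and an unlimited hard cap is reported as `-1` (`resource.RLIM_INFINITY`), which is why the code only divides when `hard > 0`; a small sketch of the same idea:

```python
# the hard limit is -1 (RLIM_INFINITY) when unlimited, hence the "hard > 0"
# guard before capping the client count at a quarter of the fd limit
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
nc = 512 if hard <= 0 else min(512, int(hard / 4))
print(soft, hard, nc)
```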
```
@@ -1524,6 +1562,9 @@ def main(argv: Optional[list[str]] = None) -> None:
    if sys.version_info < (3, 6):
        al.no_scandir = True

    if not hasattr(os, "sendfile"):
        al.no_sendfile = True

    # signal.signal(signal.SIGINT, sighandler)

    SvcHub(al, dal, argv, "".join(printed)).run()
```
```
@@ -1,8 +1,8 @@
# coding: utf-8

VERSION = (1, 9, 28)
CODENAME = "prometheable"
BUILD_DT = (2023, 12, 31)
VERSION = (1, 10, 2)
CODENAME = "tftp"
BUILD_DT = (2024, 2, 21)

S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
```
```
@@ -193,7 +193,7 @@ class Lim(object):
    self.dft = int(time.time()) + 300
    self.dfv = get_df(abspath)[0] or 0
    for j in list(self.reg.values()) if self.reg else []:
        self.dfv -= int(j["size"] / len(j["hash"]) * len(j["need"]))
        self.dfv -= int(j["size"] / (len(j["hash"]) or 999) * len(j["need"]))

    if already_written:
        sz = 0

@@ -381,7 +381,7 @@

def add(self, src: str, dst: str) -> "VFS":
    """get existing, or add new path to the vfs"""
    assert not src.endswith("/")  # nosec
    assert src == "/" or not src.endswith("/")  # nosec
    assert not dst.endswith("/")  # nosec

    if "/" in dst:

@@ -779,7 +779,6 @@ class AuthSrv(object):
    self.warn_anonwrite = warn_anonwrite
    self.line_ctr = 0
    self.indent = ""
    self.desc = []

    self.mutex = threading.Lock()
    self.reload()

@@ -862,7 +861,6 @@ class AuthSrv(object):
    mflags: dict[str, dict[str, Any]],
    mount: dict[str, str],
) -> None:
    self.desc = []
    self.line_ctr = 0

    expand_config_file(cfg_lines, fp, "")

@@ -1009,6 +1007,7 @@ class AuthSrv(object):
    raise Exception("invalid config value (volume or volflag): %s" % (t,))

if lvl == "c":
    # here, 'uname' is not a username; it is a volflag name... sorry
    cval: Union[bool, str] = True
    try:
        # volflag with arguments, possibly with a preceding list of bools

@@ -1235,6 +1234,7 @@ class AuthSrv(object):

all_users = {}
missing_users = {}
associated_users = {}
for axs in daxs.values():
    for d in [
        axs.uread,

@@ -1251,6 +1251,8 @@ class AuthSrv(object):
        all_users[usr] = 1
        if usr != "*" and usr not in acct:
            missing_users[usr] = 1
        if "*" not in d:
            associated_users[usr] = 1

if missing_users:
    self.log(

@@ -1271,6 +1273,16 @@ class AuthSrv(object):
        raise Exception(BAD_CFG)
    seenpwds[pwd] = usr

for usr in acct:
    if usr not in associated_users:
        if len(vfs.all_vols) > 1:
            # user probably familiar enough that the verbose message is not necessary
            t = "account [%s] is not mentioned in any volume definitions; see --help-accounts"
            self.log(t % (usr,), 1)
        else:
            t = "WARNING: the account [%s] is not mentioned in any volume definitions and thus has the same access-level and privileges that guests have; please see --help-accounts for details. For example, if you intended to give that user full access to the current directory, you could do this: -v .::A,%s"
            self.log(t % (usr, usr), 1)

promote = []
demote = []
for vol in vfs.all_vols.values():

@@ -1481,6 +1493,14 @@ class AuthSrv(object):
    if k in vol.flags:
        vol.flags[k] = float(vol.flags[k])

try:
    zs1, zs2 = vol.flags["rm_retry"].split("/")
    vol.flags["rm_re_t"] = float(zs1)
    vol.flags["rm_re_r"] = float(zs2)
except:
    t = 'volume "/%s" has invalid rm_retry [%s]'
    raise Exception(t % (vol.vpath, vol.flags.get("rm_retry")))

for k1, k2 in IMPLICATIONS:
    if k1 in vol.flags:
        vol.flags[k2] = True

@@ -1492,8 +1512,8 @@ class AuthSrv(object):
dbds = "acid|swal|wal|yolo"
vol.flags["dbd"] = dbd = vol.flags.get("dbd") or self.args.dbd
if dbd not in dbds.split("|"):
    t = "invalid dbd [{}]; must be one of [{}]"
    raise Exception(t.format(dbd, dbds))
    t = 'volume "/%s" has invalid dbd [%s]; must be one of [%s]'
    raise Exception(t % (vol.vpath, dbd, dbds))

# default tag cfgs if unset
for k in ("mte", "mth", "exp_md", "exp_lg"):
```
```
@@ -76,7 +76,7 @@ class MpWorker(BrokerCli):
    pass

def logw(self, msg: str, c: Union[int, str] = 0) -> None:
    self.log("mp{}".format(self.n), msg, c)
    self.log("mp%d" % (self.n,), msg, c)

def main(self) -> None:
    while True:
```
```
@@ -20,7 +20,6 @@ def vf_bmap() -> dict[str, str]:
    "no_thumb": "dthumb",
    "no_vthumb": "dvthumb",
    "no_athumb": "dathumb",
    "th_no_crop": "nocrop",
}
for k in (
    "dotsrch",

@@ -56,12 +55,15 @@ def vf_vmap() -> dict[str, str]:
    "re_maxage": "scan",
    "th_convt": "convt",
    "th_size": "thsize",
    "th_crop": "crop",
    "th_x3": "th3x",
}
for k in (
    "dbd",
    "lg_sbf",
    "md_sbf",
    "nrand",
    "rm_retry",
    "sort",
    "unlist",
    "u2ts",

@@ -171,7 +173,8 @@ flagcats = {
    "dathumb": "disables audio thumbnails (spectrograms)",
    "dithumb": "disables image thumbnails",
    "thsize": "thumbnail res; WxH",
    "nocrop": "disable center-cropping by default",
    "crop": "center-cropping (y/n/fy/fn)",
    "th3x": "3x resolution (y/n/fy/fn)",
    "convt": "conversion timeout in seconds",
},
"handlers\n(better explained in --help-handlers)": {

@@ -208,6 +211,7 @@ flagcats = {
    "dots": "allow all users with read-access to\nenable the option to show dotfiles in listings",
    "fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
    "fka=8": 'generates slightly weaker per-file accesskeys,\nwhich are then required at the "g" permission;\nnot affected by filesize or inode numbers',
    "rm_retry": "ms-windows: timeout for deleting busy files",
    "davauth": "ask webdav clients to login for all folders",
    "davrt": "show lastmod time of symlink destination, not the link itself\n(note: this option is always enabled for recursive listings)",
},
```
```
@@ -20,6 +20,7 @@ from .authsrv import VFS
from .bos import bos
from .util import (
    Daemon,
    ODict,
    Pebkac,
    exclude_dotfiles,
    fsenc,

@@ -545,6 +546,8 @@ class Ftpd(object):
    if self.args.ftp4:
        ips = [x for x in ips if ":" not in x]

    ips = list(ODict.fromkeys(ips))  # dedup

    ioloop = IOLoop()
    for ip in ips:
        for h, lp in hs:
```
```
@@ -37,6 +37,8 @@ from .star import StreamTar
from .sutil import StreamArc, gfilter
from .szip import StreamZip
from .util import (
    APPLESAN_RE,
    BITNESS,
    HTTPCODE,
    META_NOBOTS,
    UTC,

@@ -45,6 +47,7 @@ from .util import (
    ODict,
    Pebkac,
    UnrecvEOF,
    WrongPostKey,
    absreal,
    alltrace,
    atomic_move,

@@ -86,6 +89,7 @@ from .util import (
    vjoin,
    vol_san,
    vsplit,
    wunlink,
    yieldfile,
)

@@ -111,7 +115,7 @@ class HttpCli(object):

    self.t0 = time.time()
    self.conn = conn
    self.mutex = conn.mutex  # mypy404
    self.u2mutex = conn.u2mutex  # mypy404
    self.s = conn.s
    self.sr = conn.sr
    self.ip = conn.addr[0]

@@ -189,7 +193,7 @@ class HttpCli(object):

def unpwd(self, m: Match[str]) -> str:
    a, b, c = m.groups()
    return "{}\033[7m {} \033[27m{}".format(a, self.asrv.iacct[b], c)
    return "%s\033[7m %s \033[27m%s" % (a, self.asrv.iacct[b], c)

def _check_nonfatal(self, ex: Pebkac, post: bool) -> bool:
    if post:

@@ -556,16 +560,16 @@ class HttpCli(object):
    self.keepalive = False

    em = str(ex)
    msg = em if pex == ex else min_ex()
    msg = em if pex is ex else min_ex()
    if pex.code != 404 or self.do_log:
        self.log(
            "{}\033[0m, {}".format(msg, self.vpath),
            "%s\033[0m, %s" % (msg, self.vpath),
            6 if em.startswith("client d/c ") else 3,
        )

    msg = "{}\r\nURL: {}\r\n".format(em, self.vpath)
    msg = "%s\r\nURL: %s\r\n" % (em, self.vpath)
    if self.hint:
        msg += "hint: {}\r\n".format(self.hint)
        msg += "hint: %s\r\n" % (self.hint,)

    if "database is locked" in em:
        self.conn.hsrv.broker.say("log_stacks")

@@ -808,7 +812,7 @@ class HttpCli(object):
    if k in skip:
        continue

    t = "{}={}".format(quotep(k), quotep(v))
    t = "%s=%s" % (quotep(k), quotep(v))
    ret.append(t.replace(" ", "+").rstrip("="))

    if not ret:

@@ -856,7 +860,8 @@ class HttpCli(object):
    oh = self.out_headers
    origin = origin.lower()
    good_origins = self.args.acao + [
        "{}://{}".format(
        "%s://%s"
        % (
            "https" if self.is_https else "http",
            self.host.lower().split(":")[0],
        )

@@ -1053,7 +1058,7 @@ class HttpCli(object):
    self.can_read = self.can_write = self.can_get = False

    if not self.can_read and not self.can_write and not self.can_get:
        self.log("inaccessible: [{}]".format(self.vpath))
        self.log("inaccessible: [%s]" % (self.vpath,))
        raise Pebkac(401, "authenticate")

    from .dxml import parse_xml

@@ -1382,8 +1387,7 @@ class HttpCli(object):
    return False

    vp = "/" + self.vpath
    ptn = r"/\.(_|DS_Store|Spotlight-|fseventsd|Trashes|AppleDouble)|/__MACOS"
    if re.search(ptn, vp):
    if re.search(APPLESAN_RE, vp):
        zt = '<?xml version="1.0" encoding="utf-8"?>\n<D:error xmlns:D="DAV:"><D:lock-token-submitted><D:href>{}</D:href></D:lock-token-submitted></D:error>'
        zb = zt.format(vp).encode("utf-8", "replace")
        self.reply(zb, 423, "text/xml; charset=utf-8")

@@ -1403,7 +1407,7 @@ class HttpCli(object):
    if txt and len(txt) == orig_len:
        raise Pebkac(500, "chunk slicing failed")

    buf = "{:x}\r\n".format(len(buf)).encode(enc) + buf
    buf = ("%x\r\n" % (len(buf),)).encode(enc) + buf
    self.s.sendall(buf + b"\r\n")
    return txt

@@ -1689,7 +1693,7 @@ class HttpCli(object):
    and bos.path.getmtime(path) >= time.time() - self.args.blank_wt
):
    # small toctou, but better than clobbering a hardlink
    bos.unlink(path)
    wunlink(self.log, path, vfs.flags)

with ren_open(fn, *open_a, **params) as zfw:
    f, fn = zfw["orz"]

@@ -1703,7 +1707,7 @@ class HttpCli(object):
    lim.chk_sz(post_sz)
    lim.chk_vsz(self.conn.hsrv.broker, vfs.realpath, post_sz)
except:
    bos.unlink(path)
    wunlink(self.log, path, vfs.flags)
    raise

if self.args.nw:

@@ -1756,7 +1760,7 @@ class HttpCli(object):
):
    t = "upload blocked by xau server config"
    self.log(t, 1)
    os.unlink(path)
    wunlink(self.log, path, vfs.flags)
    raise Pebkac(403, t)

vfs, rem = vfs.get_dbv(rem)

@@ -1862,7 +1866,16 @@ class HttpCli(object):
    self.parser = MultipartParser(self.log, self.sr, self.headers)
    self.parser.parse()

    act = self.parser.require("act", 64)
    file0: list[tuple[str, Optional[str], Generator[bytes, None, None]]] = []
    try:
        act = self.parser.require("act", 64)
    except WrongPostKey as ex:
        if ex.got == "f" and ex.fname:
            self.log("missing 'act', but looks like an upload so assuming that")
            file0 = [(ex.got, ex.fname, ex.datagen)]
            act = "bput"
        else:
            raise

    if act == "login":
        return self.handle_login()

@@ -1875,7 +1888,7 @@ class HttpCli(object):
    return self.handle_new_md()

    if act == "bput":
        return self.handle_plain_upload()
        return self.handle_plain_upload(file0)

    if act == "tput":
        return self.handle_text_upload()

@@ -1975,8 +1988,11 @@ class HttpCli(object):
    except:
        raise Pebkac(500, min_ex())

    x = self.conn.hsrv.broker.ask("up2k.handle_json", body, self.u2fh.aps)
    ret = x.get()
    # not to protect u2fh, but to prevent handshakes while files are closing
    with self.u2mutex:
        x = self.conn.hsrv.broker.ask("up2k.handle_json", body, self.u2fh.aps)
        ret = x.get()

    if self.is_vproxied:
        if "purl" in ret:
            ret["purl"] = self.args.SR + ret["purl"]

@@ -2081,7 +2097,7 @@ class HttpCli(object):
    f = None
    fpool = not self.args.no_fpool and sprs
    if fpool:
        with self.mutex:
        with self.u2mutex:
            try:
                f = self.u2fh.pop(path)
            except:

@@ -2124,7 +2140,7 @@ class HttpCli(object):
    if not fpool:
        f.close()
    else:
        with self.mutex:
        with self.u2mutex:
            self.u2fh.put(path, f)
    except:
        # maybe busted handle (eg. disk went full)

@@ -2143,7 +2159,7 @@ class HttpCli(object):
    return False

    if not num_left and fpool:
        with self.mutex:
        with self.u2mutex:
            self.u2fh.close(path)

    if not num_left and not self.args.nw:

@@ -2314,7 +2330,9 @@ class HttpCli(object):
    vfs.flags.get("xau") or [],
)

def handle_plain_upload(self) -> bool:
def handle_plain_upload(
    self, file0: list[tuple[str, Optional[str], Generator[bytes, None, None]]]
) -> bool:
    assert self.parser
    nullwrite = self.args.nw
    vfs, rem = self.asrv.vfs.get(self.vpath, self.uname, False, True)

@@ -2336,11 +2354,13 @@ class HttpCli(object):
    files: list[tuple[int, str, str, str, str, str]] = []
    # sz, sha_hex, sha_b64, p_file, fname, abspath
    errmsg = ""
    tabspath = ""
    dip = self.dip()
    t0 = time.time()
    try:
        assert self.parser.gen
        for nfile, (p_field, p_file, p_data) in enumerate(self.parser.gen):
        gens = itertools.chain(file0, self.parser.gen)
        for nfile, (p_field, p_file, p_data) in enumerate(gens):
            if not p_file:
                self.log("discarding incoming file without filename")
                # fallthrough

@@ -2424,14 +2444,16 @@ class HttpCli(object):
    lim.chk_nup(self.ip)
except:
    if not nullwrite:
        bos.unlink(tabspath)
        bos.unlink(abspath)
        wunlink(self.log, tabspath, vfs.flags)
        wunlink(self.log, abspath, vfs.flags)
    fname = os.devnull
    raise

if not nullwrite:
    atomic_move(tabspath, abspath)

tabspath = ""

files.append(
    (sz, sha_hex, sha_b64, p_file or "(discarded)", fname, abspath)
)

@@ -2451,7 +2473,7 @@ class HttpCli(object):
):
    t = "upload blocked by xau server config"
    self.log(t, 1)
    os.unlink(abspath)
    wunlink(self.log, abspath, vfs.flags)
    raise Pebkac(403, t)

dbv, vrem = vfs.get_dbv(rem)

@@ -2477,6 +2499,12 @@ class HttpCli(object):
    errmsg = vol_san(
        list(self.asrv.vfs.all_vols.values()), unicode(ex).encode("utf-8")
    ).decode("utf-8")
    try:
        got = bos.path.getsize(tabspath)
        t = "connection lost after receiving %s of the file"
        self.log(t % (humansize(got),), 3)
    except:
        pass

td = max(0.1, time.time() - t0)
sz_total = sum(x[0] for x in files)

@@ -2689,7 +2717,7 @@ class HttpCli(object):
    raise Pebkac(403, t)

    if bos.path.exists(fp):
        bos.unlink(fp)
        wunlink(self.log, fp, vfs.flags)

    with open(fsenc(fp), "wb", 512 * 1024) as f:
        sz, sha512, _ = hashcopy(p_data, f, self.args.s_wr_slp)

@@ -2701,7 +2729,7 @@ class HttpCli(object):
    lim.chk_sz(sz)
    lim.chk_vsz(self.conn.hsrv.broker, vfs.realpath, sz)
except:
    bos.unlink(fp)
    wunlink(self.log, fp, vfs.flags)
    raise

new_lastmod = bos.stat(fp).st_mtime

@@ -2724,7 +2752,7 @@ class HttpCli(object):
):
    t = "save blocked by xau server config"
    self.log(t, 1)
    os.unlink(fp)
    wunlink(self.log, fp, vfs.flags)
    raise Pebkac(403, t)

vfs, rem = vfs.get_dbv(rem)

@@ -2936,9 +2964,11 @@ class HttpCli(object):
    # 512 kB is optimal for huge files, use 64k
    open_args = [fsenc(fs_path), "rb", 64 * 1024]
    use_sendfile = (
        not self.tls  #
        # fmt: off
        not self.tls
        and not self.args.no_sendfile
        and hasattr(os, "sendfile")
        and (BITNESS > 32 or file_sz < 0x7fffFFFF)
        # fmt: on
    )

    #

@@ -3109,11 +3139,15 @@ class HttpCli(object):

    ext = ext.rstrip(".") or "unk"
    if len(ext) > 11:
        ext = "⋯" + ext[-9:]
        ext = "~" + ext[-9:]

    return self.tx_svg(ext, exact)

def tx_svg(self, txt: str, small: bool = False) -> bool:
    # chrome cannot handle more than ~2000 unique SVGs
    chrome = " rv:" not in self.ua
    mime, ico = self.ico.get(ext, not exact, chrome)
    # so url-param "raster" returns a png/webp instead
    # (useragent-sniffing kinshi due to caching proxies)
    mime, ico = self.ico.get(txt, not small, "raster" in self.uparam)

    lm = formatdate(self.E.t0, usegmt=True)
    self.reply(ico, mime=mime, headers={"Last-Modified": lm})

@@ -3373,10 +3407,13 @@ class HttpCli(object):
    pt = "404 not found ┐( ´ -`)┌"

    if self.ua.startswith("curl/") or self.ua.startswith("fetch"):
        pt = "# acct: %s\n%s" % (self.uname, pt)
        pt = "# acct: %s\n%s\n" % (self.uname, pt)
        self.reply(pt.encode("utf-8"), status=rc)
        return True

    if "th" in self.ouparam:
        return self.tx_svg("e" + pt[:3])

    t = t.format(self.args.SR)
    qv = quotep(self.vpaths) + self.ourlq()
    html = self.j2s("splash", this=self, qvpath=qv, msg=t)

@@ -3757,12 +3794,15 @@ class HttpCli(object):
    if idx and hasattr(idx, "p_end"):
        icur = idx.get_cur(dbv.realpath)

    th_fmt = self.uparam.get("th")
    if self.can_read:
        th_fmt = self.uparam.get("th")
        if th_fmt is not None:
            nothumb = "dthumb" in dbv.flags
            if is_dir:
                vrem = vrem.rstrip("/")
                if icur and vrem:
                if nothumb:
                    pass
                elif icur and vrem:
                    q = "select fn from cv where rd=? and dn=?"
                    crd, cdn = vrem.rsplit("/", 1) if "/" in vrem else ("", vrem)
                    # no mojibake support:

@@ -3785,10 +3825,10 @@ class HttpCli(object):
    break

    if is_dir:
        return self.tx_ico("a.folder")
        return self.tx_svg("folder")

    thp = None
    if self.thumbcli:
    if self.thumbcli and not nothumb:
        thp = self.thumbcli.get(dbv, vrem, int(st.st_mtime), th_fmt)

    if thp:

@@ -3799,6 +3839,9 @@ class HttpCli(object):

    return self.tx_ico(rem)

elif self.can_write and th_fmt is not None:
    return self.tx_svg("upload\nonly")

elif self.can_get and self.avn:
    axs = self.avn.axs
    if self.uname not in axs.uhtml:

@@ -3943,7 +3986,8 @@ class HttpCli(object):
    "idx": e2d,
    "itag": e2t,
    "dsort": vf["sort"],
    "dfull": "nocrop" in vf,
    "dcrop": vf["crop"],
    "dth3x": vf["th3x"],
    "u2ts": vf["u2ts"],
    "lifetime": vn.flags.get("lifetime") or 0,
    "frand": bool(vn.flags.get("rand")),

@@ -3970,8 +4014,9 @@ class HttpCli(object):
    "sb_md": "" if "no_sb_md" in vf else (vf.get("md_sbf") or "y"),
    "readme": readme,
    "dgrid": "grid" in vf,
    "dfull": "nocrop" in vf,
    "dsort": vf["sort"],
    "dcrop": vf["crop"],
    "dth3x": vf["th3x"],
    "themes": self.args.themes,
    "turbolvl": self.args.turbo,
    "u2j": self.args.u2j,

@@ -4218,7 +4263,7 @@ class HttpCli(object):
    if icur:
        lmte = list(mte)
        if self.can_admin:
            lmte += ["up_ip", ".up_at"]
            lmte.extend(("up_ip", ".up_at"))

        taglist = [k for k in lmte if k in tagset]
        for fe in dirs:
```
@@ -50,7 +50,7 @@ class HttpConn(object):
|
||||
self.addr = addr
|
||||
self.hsrv = hsrv
|
||||
|
||||
self.mutex: threading.Lock = hsrv.mutex # mypy404
|
||||
self.u2mutex: threading.Lock = hsrv.u2mutex # mypy404
|
||||
self.args: argparse.Namespace = hsrv.args # mypy404
|
||||
self.E: EnvParams = self.args.E
|
||||
self.asrv: AuthSrv = hsrv.asrv # mypy404
|
||||
@@ -93,7 +93,7 @@ class HttpConn(object):
|
||||
self.rproxy = ip
|
||||
|
||||
self.ip = ip
|
||||
self.log_src = "{} \033[{}m{}".format(ip, color, self.addr[1]).ljust(26)
|
||||
self.log_src = ("%s \033[%dm%d" % (ip, color, self.addr[1])).ljust(26)
|
||||
return self.log_src
|
||||
|
||||
def respath(self, res_name: str) -> str:
|
||||
@@ -176,7 +176,7 @@ class HttpConn(object):
|
||||
|
||||
self.s = ctx.wrap_socket(self.s, server_side=True)
|
||||
msg = [
|
||||
"\033[1;3{:d}m{}".format(c, s)
|
||||
"\033[1;3%dm%s" % (c, s)
|
||||
for c, s in zip([0, 5, 0], self.s.cipher()) # type: ignore
|
||||
]
|
||||
self.log(" ".join(msg) + "\033[0m")
|
||||
|
||||
@@ -117,6 +117,7 @@ class HttpSrv(object):
|
||||
self.bound: set[tuple[str, int]] = set()
|
||||
self.name = "hsrv" + nsuf
|
||||
self.mutex = threading.Lock()
|
||||
self.u2mutex = threading.Lock()
|
||||
self.stopping = False
|
||||
|
||||
self.tp_nthr = 0 # actual
|
||||
@@ -220,7 +221,7 @@ class HttpSrv(object):
|
||||
def periodic(self) -> None:
|
||||
while True:
|
||||
time.sleep(2 if self.tp_ncli or self.ncli else 10)
|
||||
with self.mutex:
|
||||
with self.u2mutex, self.mutex:
|
||||
self.u2fh.clean()
|
||||
if self.tp_q:
|
||||
self.tp_ncli = max(self.ncli, self.tp_ncli - 2)
|
||||
@@ -366,7 +367,7 @@ class HttpSrv(object):
|
||||
if not self.t_periodic:
|
||||
name = "hsrv-pt"
|
||||
if self.nid:
|
||||
name += "-{}".format(self.nid)
|
||||
name += "-%d" % (self.nid,)
|
||||
|
||||
self.t_periodic = Daemon(self.periodic, name)
|
||||
|
||||
@@ -385,7 +386,7 @@ class HttpSrv(object):
|
||||
|
||||
Daemon(
|
||||
self.thr_client,
|
||||
"httpconn-{}-{}".format(addr[0].split(".", 2)[-1][-6:], addr[1]),
|
||||
"httpconn-%s-%d" % (addr[0].split(".", 2)[-1][-6:], addr[1]),
|
||||
(sck, addr),
|
||||
)
|
||||
|
||||
@@ -402,9 +403,7 @@ class HttpSrv(object):
|
||||
try:
|
||||
sck, addr = task
|
||||
me = threading.current_thread()
|
||||
me.name = "httpconn-{}-{}".format(
|
||||
addr[0].split(".", 2)[-1][-6:], addr[1]
|
||||
)
|
||||
me.name = "httpconn-%s-%d" % (addr[0].split(".", 2)[-1][-6:], addr[1])
|
||||
self.thr_client(sck, addr)
|
||||
me.name = self.name + "-poolw"
|
||||
except Exception as ex:
|
||||
|
||||
@@ -8,7 +8,7 @@ import re
|
||||
|
||||
from .__init__ import PY2
|
||||
from .th_srv import HAVE_PIL, HAVE_PILF
|
||||
from .util import BytesIO # type: ignore
|
||||
from .util import BytesIO, html_escape # type: ignore
|
||||
|
||||
|
||||
class Ico(object):
|
||||
@@ -27,14 +27,13 @@ class Ico(object):
|
||||
c1 = colorsys.hsv_to_rgb(zb[0] / 256.0, 1, 0.3)
|
||||
c2 = colorsys.hsv_to_rgb(zb[0] / 256.0, 0.8 if HAVE_PILF else 1, 1)
|
||||
ci = [int(x * 255) for x in list(c1) + list(c2)]
|
||||
c = "".join(["{:02x}".format(x) for x in ci])
|
||||
c = "".join(["%02x" % (x,) for x in ci])
|
||||
|
||||
w = 100
|
||||
h = 30
|
||||
if not self.args.th_no_crop and as_thumb:
|
||||
if as_thumb:
|
||||
sw, sh = self.args.th_size.split("x")
|
||||
h = int(100 / (float(sw) / float(sh)))
|
||||
w = 100
|
||||
h = int(100.0 / (float(sw) / float(sh)))
|
||||
|
||||
if chrome:
|
||||
# cannot handle more than ~2000 unique SVGs
|
||||
@@ -47,12 +46,12 @@ class Ico(object):
|
||||
# [.lt] are hard to see lowercase / unspaced
|
||||
ext2 = re.sub("(.)", "\\1 ", ext).upper()
|
||||
|
||||
h = int(128 * h / w)
|
||||
h = int(128.0 * h / w)
|
||||
w = 128
|
||||
img = Image.new("RGB", (w, h), "#" + c[:6])
|
||||
pb = ImageDraw.Draw(img)
|
||||
_, _, tw, th = pb.textbbox((0, 0), ext2, font_size=16)
|
||||
xy = ((w - tw) // 2, (h - th) // 2)
|
||||
xy = (int((w - tw) / 2), int((h - th) / 2))
|
||||
pb.text(xy, ext2, fill="#" + c[6:], font_size=16)
|
||||
|
||||
img = img.resize((w * 2, h * 2), Image.NEAREST)
|
||||
@@ -68,7 +67,7 @@ class Ico(object):
|
||||
# svg: 3s, cache: 6s, this: 8s
|
||||
from PIL import Image, ImageDraw
|
||||
|
||||
h = int(64 * h / w)
|
||||
h = int(64.0 * h / w)
|
||||
w = 64
|
||||
img = Image.new("RGB", (w, h), "#" + c[:6])
|
||||
pb = ImageDraw.Draw(img)
|
||||
@@ -99,6 +98,6 @@ class Ico(object):
|
||||
fill="#{}" font-family="monospace" font-size="14px" style="letter-spacing:.5px">{}</text>
|
||||
</g></svg>
|
||||
"""
|
||||
svg = svg.format(h, c[:6], c[6:], ext)
|
||||
svg = svg.format(h, c[:6], c[6:], html_escape(ext, True))
|
||||
|
||||
return "image/svg+xml", svg.encode("utf-8")
|
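The switch from `ext` to `html_escape(ext, True)` matters because the extension text is client-controlled and gets spliced straight into SVG markup; without escaping, characters like `<` or `"` would break, or inject into, the generated XML. A minimal sketch of such an escape helper, assuming quote-escaping is also wanted (the real `html_escape` in copyparty's util module may differ):

```python
def html_escape(s: str, quot: bool = False) -> str:
    # order matters: escape '&' first so later entities are not double-escaped
    s = s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;")
    if quot:
        s = s.replace('"', "&quot;").replace("'", "&#39;")
    return s

# example: a hostile "extension" is rendered as inert text inside the SVG
svg_text = '<text x="50%%" y="50%%">%s</text>' % (html_escape('a"><script>', True),)
```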
||||
|
||||
@@ -118,7 +118,7 @@ def ffprobe(
|
||||
b"--",
|
||||
fsenc(abspath),
|
||||
]
|
||||
rc, so, se = runcmd(cmd, timeout=timeout, nice=True)
|
||||
rc, so, se = runcmd(cmd, timeout=timeout, nice=True, oom=200)
|
||||
retchk(rc, cmd, se)
|
||||
return parse_ffprobe(so)
|
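Several `runcmd()` calls gain an `oom=...` argument so that ffprobe/ffmpeg children become preferred targets for the kernel OOM killer instead of the server itself. A hedged sketch of one way to achieve that on Linux with the standard library, assuming the value maps onto `/proc/self/oom_score_adj`; the actual `runcmd` implementation may differ:

```python
import subprocess

def run_with_oom(cmd, oom=200, timeout=10):
    def set_oom():
        # runs in the child between fork and exec; raise the child's
        # oom_score_adj so the kernel kills it before the parent process
        try:
            with open("/proc/self/oom_score_adj", "w") as f:
                f.write(str(oom))
        except OSError:
            pass  # not Linux, or procfs unavailable

    return subprocess.run(
        cmd, capture_output=True, timeout=timeout, preexec_fn=set_oom
    )
```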
||||
|
||||
@@ -240,7 +240,7 @@ def parse_ffprobe(txt: str) -> tuple[dict[str, tuple[int, Any]], dict[str, list[
|
||||
if "/" in fps:
|
||||
fa, fb = fps.split("/")
|
||||
try:
|
||||
fps = int(fa) * 1.0 / int(fb)
|
||||
fps = float(fa) / float(fb)
|
||||
except:
|
||||
fps = 9001
|
||||
|
||||
@@ -564,6 +564,7 @@ class MTag(object):
|
||||
args = {
|
||||
"env": env,
|
||||
"nice": True,
|
||||
"oom": 300,
|
||||
"timeout": parser.timeout,
|
||||
"kill": parser.kill,
|
||||
"capture": parser.capture,
|
||||
|
||||
@@ -65,21 +65,21 @@ class StreamTar(StreamArc):
|
||||
cmp = re.sub(r"[^a-z0-9]*pax[^a-z0-9]*", "", cmp)
|
||||
|
||||
try:
|
||||
cmp, lv = cmp.replace(":", ",").split(",")
|
||||
lv = int(lv)
|
||||
cmp, zs = cmp.replace(":", ",").split(",")
|
||||
lv = int(zs)
|
||||
except:
|
||||
lv = None
|
||||
lv = -1
|
||||
|
||||
arg = {"name": None, "fileobj": self.qfile, "mode": "w", "format": fmt}
|
||||
if cmp == "gz":
|
||||
fun = tarfile.TarFile.gzopen
|
||||
arg["compresslevel"] = lv if lv is not None else 3
|
||||
arg["compresslevel"] = lv if lv >= 0 else 3
|
||||
elif cmp == "bz2":
|
||||
fun = tarfile.TarFile.bz2open
|
||||
arg["compresslevel"] = lv if lv is not None else 2
|
||||
arg["compresslevel"] = lv if lv >= 0 else 2
|
||||
elif cmp == "xz":
|
||||
fun = tarfile.TarFile.xzopen
|
||||
arg["preset"] = lv if lv is not None else 1
|
||||
arg["preset"] = lv if lv >= 0 else 1
|
||||
else:
|
||||
fun = tarfile.open
|
||||
arg["mode"] = "w|"
|
||||
|
||||
@@ -133,7 +133,7 @@ class SvcHub(object):
|
||||
if not self._process_config():
|
||||
raise Exception(BAD_CFG)
|
||||
|
||||
# for non-http clients (ftp)
|
||||
# for non-http clients (ftp, tftp)
|
||||
self.bans: dict[str, int] = {}
|
||||
self.gpwd = Garda(self.args.ban_pw)
|
||||
self.g404 = Garda(self.args.ban_404)
|
||||
@@ -268,6 +268,12 @@ class SvcHub(object):
|
||||
Daemon(self.start_ftpd, "start_ftpd")
|
||||
zms += "f" if args.ftp else "F"
|
||||
|
||||
if args.tftp:
|
||||
from .tftpd import Tftpd
|
||||
|
||||
self.tftpd: Optional[Tftpd] = None
|
||||
Daemon(self.start_ftpd, "start_tftpd")
|
||||
|
||||
if args.smb:
|
||||
# impacket.dcerpc is noisy about listen timeouts
|
||||
sto = socket.getdefaulttimeout()
|
||||
@@ -297,10 +303,12 @@ class SvcHub(object):
|
||||
|
||||
def start_ftpd(self) -> None:
|
||||
time.sleep(30)
|
||||
if self.ftpd:
|
||||
return
|
||||
|
||||
self.restart_ftpd()
|
||||
if hasattr(self, "ftpd") and not self.ftpd:
|
||||
self.restart_ftpd()
|
||||
|
||||
if hasattr(self, "tftpd") and not self.tftpd:
|
||||
self.restart_tftpd()
|
||||
|
||||
def restart_ftpd(self) -> None:
|
||||
if not hasattr(self, "ftpd"):
|
||||
@@ -317,6 +325,17 @@ class SvcHub(object):
|
||||
self.ftpd = Ftpd(self)
|
||||
self.log("root", "started FTPd")
|
||||
|
||||
def restart_tftpd(self) -> None:
|
||||
if not hasattr(self, "tftpd"):
|
||||
return
|
||||
|
||||
from .tftpd import Tftpd
|
||||
|
||||
if self.tftpd:
|
||||
return # todo
|
||||
|
||||
self.tftpd = Tftpd(self)
|
||||
|
||||
def thr_httpsrv_up(self) -> None:
|
||||
time.sleep(1 if self.args.ign_ebind_all else 5)
|
||||
expected = self.broker.num_workers * self.tcpsrv.nsrv
|
||||
@@ -432,6 +451,13 @@ class SvcHub(object):
|
||||
else:
|
||||
setattr(al, k, re.compile(vs))
|
||||
|
||||
for k in "tftp_lsf".split(" "):
|
||||
vs = getattr(al, k)
|
||||
if not vs or vs == "no":
|
||||
setattr(al, k, None)
|
||||
else:
|
||||
setattr(al, k, re.compile("^" + vs + "$"))
|
||||
|
||||
if not al.sus_urls:
|
||||
al.ban_url = "no"
|
||||
elif al.ban_url == "no":
|
||||
@@ -444,6 +470,7 @@ class SvcHub(object):
|
||||
al.xff_re = self._ipa2re(al.xff_src)
|
||||
al.ipa_re = self._ipa2re(al.ipa)
|
||||
al.ftp_ipa_re = self._ipa2re(al.ftp_ipa or al.ipa)
|
||||
al.tftp_ipa_re = self._ipa2re(al.tftp_ipa or al.ipa)
|
||||
|
||||
mte = ODict.fromkeys(DEF_MTE.split(","), True)
|
||||
al.mte = odfusion(mte, al.mte)
|
||||
@@ -460,6 +487,13 @@ class SvcHub(object):
|
||||
if ptn:
|
||||
setattr(self.args, k, re.compile(ptn))
|
||||
|
||||
try:
|
||||
zf1, zf2 = self.args.rm_retry.split("/")
|
||||
self.args.rm_re_t = float(zf1)
|
||||
self.args.rm_re_r = float(zf2)
|
||||
except:
|
||||
raise Exception("invalid --rm-retry [%s]" % (self.args.rm_retry,))
|
||||
|
||||
return True
|
||||
|
||||
def _ipa2re(self, txt) -> Optional[re.Pattern]:
|
||||
@@ -474,7 +508,7 @@ class SvcHub(object):
|
||||
import resource
|
||||
|
||||
soft, hard = [
|
||||
x if x > 0 else 1024 * 1024
|
||||
int(x) if x > 0 else 1024 * 1024
|
||||
for x in list(resource.getrlimit(resource.RLIMIT_NOFILE))
|
||||
]
|
||||
except:
|
||||
@@ -791,7 +825,7 @@ class SvcHub(object):
|
||||
self.logf.flush()
|
||||
|
||||
now = time.time()
|
||||
if now >= self.next_day:
|
||||
if int(now) >= self.next_day:
|
||||
self._set_next_day()
|
||||
|
||||
def _set_next_day(self) -> None:
|
||||
@@ -819,7 +853,7 @@ class SvcHub(object):
|
||||
"""handles logging from all components"""
|
||||
with self.log_mutex:
|
||||
now = time.time()
|
||||
if now >= self.next_day:
|
||||
if int(now) >= self.next_day:
|
||||
dt = datetime.fromtimestamp(now, UTC)
|
||||
zs = "{}\n" if self.no_ansi else "\033[36m{}\033[0m\n"
|
||||
zs = zs.format(dt.strftime("%Y-%m-%d"))
|
||||
|
||||
@@ -309,6 +309,7 @@ class TcpSrv(object):
|
||||
self.hub.start_zeroconf()
|
||||
gencert(self.log, self.args, self.netdevs)
|
||||
self.hub.restart_ftpd()
|
||||
self.hub.restart_tftpd()
|
||||
|
||||
def shutdown(self) -> None:
|
||||
self.stopping = True
|
||||
|
||||
429
copyparty/tftpd.py
Normal file
@@ -0,0 +1,429 @@
|
||||
# coding: utf-8
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
try:
|
||||
from types import SimpleNamespace
|
||||
except:
|
||||
|
||||
class SimpleNamespace(object):
|
||||
def __init__(self, **attr):
|
||||
self.__dict__.update(attr)
|
||||
|
||||
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
import socket
|
||||
import stat
|
||||
import threading
|
||||
import time
|
||||
from datetime import datetime
|
||||
|
||||
try:
|
||||
import inspect
|
||||
except:
|
||||
pass
|
||||
|
||||
from partftpy import (
|
||||
TftpContexts,
|
||||
TftpPacketFactory,
|
||||
TftpPacketTypes,
|
||||
TftpServer,
|
||||
TftpStates,
|
||||
)
|
||||
from partftpy.TftpShared import TftpException
|
||||
|
||||
from .__init__ import EXE, TYPE_CHECKING
|
||||
from .authsrv import VFS
|
||||
from .bos import bos
|
||||
from .util import BytesIO, Daemon, ODict, exclude_dotfiles, min_ex, runhook, undot
|
||||
|
||||
if True: # pylint: disable=using-constant-test
|
||||
from typing import Any, Union
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .svchub import SvcHub
|
||||
|
||||
|
||||
lg = logging.getLogger("tftp")
|
||||
debug, info, warning, error = (lg.debug, lg.info, lg.warning, lg.error)
|
||||
|
||||
|
||||
def noop(*a, **ka) -> None:
|
||||
pass
|
||||
|
||||
|
||||
def _serverInitial(self, pkt: Any, raddress: str, rport: int) -> bool:
|
||||
info("connection from %s:%s", raddress, rport)
|
||||
ret = _orig_serverInitial(self, pkt, raddress, rport)
|
||||
ptn = _hub[0].args.tftp_ipa_re
|
||||
if ptn and not ptn.match(raddress):
|
||||
yeet("client rejected (--tftp-ipa): %s" % (raddress,))
|
||||
return ret
|
||||
|
||||
|
||||
# patch ipa-check into partftpd
|
||||
_hub: list["SvcHub"] = []
|
||||
_orig_serverInitial = TftpStates.TftpServerState.serverInitial
|
||||
TftpStates.TftpServerState.serverInitial = _serverInitial
|
||||
|
||||
|
||||
class Tftpd(object):
|
||||
def __init__(self, hub: "SvcHub") -> None:
|
||||
self.hub = hub
|
||||
self.args = hub.args
|
||||
self.asrv = hub.asrv
|
||||
self.log = hub.log
|
||||
self.mutex = threading.Lock()
|
||||
|
||||
_hub[:] = []
|
||||
_hub.append(hub)
|
||||
|
||||
lg.setLevel(logging.DEBUG if self.args.tftpv else logging.INFO)
|
||||
for x in ["partftpy", "partftpy.TftpStates", "partftpy.TftpServer"]:
|
||||
lgr = logging.getLogger(x)
|
||||
lgr.setLevel(logging.DEBUG if self.args.tftpv else logging.INFO)
|
||||
|
||||
if not self.args.tftpv and not self.args.tftpvv:
|
||||
# contexts -> states -> packettypes -> shared
|
||||
# contexts -> packetfactory
|
||||
# packetfactory -> packettypes
|
||||
Cs = [
|
||||
TftpPacketTypes,
|
||||
TftpPacketFactory,
|
||||
TftpStates,
|
||||
TftpContexts,
|
||||
TftpServer,
|
||||
]
|
||||
cbak = []
|
||||
if not self.args.tftp_no_fast and not EXE:
|
||||
try:
|
||||
import inspect
|
||||
|
||||
ptn = re.compile(r"(^\s*)log\.debug\(.*\)$")
|
||||
for C in Cs:
|
||||
cbak.append(C.__dict__)
|
||||
src1 = inspect.getsource(C).split("\n")
|
||||
src2 = "\n".join([ptn.sub("\\1pass", ln) for ln in src1])
|
||||
cfn = C.__spec__.origin
|
||||
exec (compile(src2, filename=cfn, mode="exec"), C.__dict__)
|
||||
except Exception:
|
||||
t = "failed to optimize tftp code; run with --tftp-noopt if there are issues:\n"
|
||||
self.log("tftp", t + min_ex(), 3)
|
||||
for n, zd in enumerate(cbak):
|
||||
Cs[n].__dict__ = zd
|
||||
|
||||
for C in Cs:
|
||||
C.log.debug = noop
|
||||
|
||||
# patch vfs into partftpy
|
||||
TftpContexts.open = self._open
|
||||
TftpStates.open = self._open
|
||||
|
||||
fos = SimpleNamespace()
|
||||
for k in os.__dict__:
|
||||
try:
|
||||
setattr(fos, k, getattr(os, k))
|
||||
except:
|
||||
pass
|
||||
fos.access = self._access
|
||||
fos.mkdir = self._mkdir
|
||||
fos.unlink = self._unlink
|
||||
fos.sep = "/"
|
||||
TftpContexts.os = fos
|
||||
TftpServer.os = fos
|
||||
TftpStates.os = fos
|
||||
|
||||
fop = SimpleNamespace()
|
||||
for k in os.path.__dict__:
|
||||
try:
|
||||
setattr(fop, k, getattr(os.path, k))
|
||||
except:
|
||||
pass
|
||||
fop.abspath = self._p_abspath
|
||||
fop.exists = self._p_exists
|
||||
fop.isdir = self._p_isdir
|
||||
fop.normpath = self._p_normpath
|
||||
fos.path = fop
|
||||
|
||||
self._disarm(fos)
|
||||
|
||||
ip = next((x for x in self.args.i if ":" not in x), None)
|
||||
if not ip:
|
||||
self.log("tftp", "IPv6 not supported for tftp; listening on 0.0.0.0", 3)
|
||||
ip = "0.0.0.0"
|
||||
|
||||
self.port = int(self.args.tftp)
|
||||
self.srv = []
|
||||
self.ips = []
|
||||
|
||||
ports = []
|
||||
if self.args.tftp_pr:
|
||||
p1, p2 = [int(x) for x in self.args.tftp_pr.split("-")]
|
||||
ports = list(range(p1, p2 + 1))
|
||||
|
||||
ips = self.args.i
|
||||
if "::" in ips:
|
||||
ips.append("0.0.0.0")
|
||||
|
||||
if self.args.ftp4:
|
||||
ips = [x for x in ips if ":" not in x]
|
||||
|
||||
ips = list(ODict.fromkeys(ips)) # dedup
|
||||
|
||||
for ip in ips:
|
||||
name = "tftp_%s" % (ip,)
|
||||
Daemon(self._start, name, [ip, ports])
|
||||
time.sleep(0.2) # give dualstack a chance
|
||||
|
||||
def nlog(self, msg: str, c: Union[int, str] = 0) -> None:
|
||||
self.log("tftp", msg, c)
|
||||
|
||||
def _start(self, ip, ports):
|
||||
fam = socket.AF_INET6 if ":" in ip else socket.AF_INET
|
||||
have_been_alive = False
|
||||
while True:
|
||||
srv = TftpServer.TftpServer("/", self._ls)
|
||||
with self.mutex:
|
||||
self.srv.append(srv)
|
||||
self.ips.append(ip)
|
||||
|
||||
try:
|
||||
# this is the listen loop; it should block forever
|
||||
srv.listen(ip, self.port, af_family=fam, ports=ports)
|
||||
except:
|
||||
with self.mutex:
|
||||
self.srv.remove(srv)
|
||||
self.ips.remove(ip)
|
||||
|
||||
try:
|
||||
srv.sock.close()
|
||||
except:
|
||||
pass
|
||||
|
||||
try:
|
||||
bound = bool(srv.listenport)
|
||||
except:
|
||||
bound = False
|
||||
|
||||
if bound:
|
||||
# this instance has managed to bind at least once
|
||||
have_been_alive = True
|
||||
|
||||
if have_been_alive:
|
||||
t = "tftp server [%s]:%d crashed; restarting in 3 sec:\n%s"
|
||||
error(t, ip, self.port, min_ex())
|
||||
time.sleep(3)
|
||||
continue
|
||||
|
||||
# server failed to start; could be due to dualstack (ipv6 managed to bind and this is ipv4)
|
||||
if ip != "0.0.0.0" or "::" not in self.ips:
|
||||
# nope, it's fatal
|
||||
t = "tftp server [%s]:%d failed to start:\n%s"
|
||||
error(t, ip, self.port, min_ex())
|
||||
|
||||
# yep; ignore
|
||||
# (TODO: move the "listening @ ..." infolog in partftpy to
|
||||
# after the bind attempt so it doesn't print twice)
|
||||
return
|
||||
|
||||
info("tftp server [%s]:%d terminated", ip, self.port)
|
||||
break
|
||||
|
||||
def stop(self):
|
||||
with self.mutex:
|
||||
srvs = self.srv[:]
|
||||
|
||||
for srv in srvs:
|
||||
srv.stop()
|
||||
|
||||
def _v2a(self, caller: str, vpath: str, perms: list, *a: Any) -> tuple[VFS, str]:
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
if not perms:
|
||||
perms = [True, True]
|
||||
|
||||
debug('%s("%s", %s) %s\033[K\033[0m', caller, vpath, str(a), perms)
|
||||
vfs, rem = self.asrv.vfs.get(vpath, "*", *perms)
|
||||
return vfs, vfs.canonical(rem)
|
||||
|
||||
def _ls(self, vpath: str, raddress: str, rport: int, force=False) -> Any:
|
||||
# generate file listing if vpath is dir.txt and return as file object
|
||||
if not force:
|
||||
vpath, fn = os.path.split(vpath.replace("\\", "/"))
|
||||
ptn = self.args.tftp_lsf
|
||||
if not ptn or not ptn.match(fn.lower()):
|
||||
return None
|
||||
|
||||
vn, rem = self.asrv.vfs.get(vpath, "*", True, False)
|
||||
fsroot, vfs_ls, vfs_virt = vn.ls(
|
||||
rem,
|
||||
"*",
|
||||
not self.args.no_scandir,
|
||||
[[True, False]],
|
||||
)
|
||||
dnames = set([x[0] for x in vfs_ls if stat.S_ISDIR(x[1].st_mode)])
|
||||
dirs1 = [(v.st_mtime, v.st_size, k + "/") for k, v in vfs_ls if k in dnames]
|
||||
fils1 = [(v.st_mtime, v.st_size, k) for k, v in vfs_ls if k not in dnames]
|
||||
real1 = dirs1 + fils1
|
||||
realt = [(datetime.fromtimestamp(mt), sz, fn) for mt, sz, fn in real1]
|
||||
reals = [
|
||||
(
|
||||
"%04d-%02d-%02d %02d:%02d:%02d"
|
||||
% (
|
||||
zd.year,
|
||||
zd.month,
|
||||
zd.day,
|
||||
zd.hour,
|
||||
zd.minute,
|
||||
zd.second,
|
||||
),
|
||||
sz,
|
||||
fn,
|
||||
)
|
||||
for zd, sz, fn in realt
|
||||
]
|
||||
virs = [("????-??-?? ??:??:??", 0, k + "/") for k in vfs_virt.keys()]
|
||||
ls = virs + reals
|
||||
|
||||
if "*" not in vn.axs.udot:
|
||||
names = set(exclude_dotfiles([x[2] for x in ls]))
|
||||
ls = [x for x in ls if x[2] in names]
|
||||
|
||||
try:
|
||||
biggest = max([x[1] for x in ls])
|
||||
except:
|
||||
biggest = 0
|
||||
|
||||
perms = []
|
||||
if "*" in vn.axs.uread:
|
||||
perms.append("read")
|
||||
if "*" in vn.axs.udot:
|
||||
perms.append("hidden")
|
||||
if "*" in vn.axs.uwrite:
|
||||
if "*" in vn.axs.udel:
|
||||
perms.append("overwrite")
|
||||
else:
|
||||
perms.append("write")
|
||||
|
||||
fmt = "{{}} {{:{},}} {{}}"
|
||||
fmt = fmt.format(len("{:,}".format(biggest)))
|
||||
retl = ["# permissions: %s" % (", ".join(perms),)]
|
||||
retl += [fmt.format(*x) for x in ls]
|
||||
ret = "\n".join(retl).encode("utf-8", "replace")
|
||||
return BytesIO(ret + b"\n")
|
||||
|
||||
def _open(self, vpath: str, mode: str, *a: Any, **ka: Any) -> Any:
|
||||
rd = wr = False
|
||||
if mode == "rb":
|
||||
rd = True
|
||||
elif mode == "wb":
|
||||
wr = True
|
||||
else:
|
||||
raise Exception("bad mode %s" % (mode,))
|
||||
|
||||
vfs, ap = self._v2a("open", vpath, [rd, wr])
|
||||
if wr:
|
||||
if "*" not in vfs.axs.uwrite:
|
||||
yeet("blocked write; folder not world-writable: /%s" % (vpath,))
|
||||
|
||||
if bos.path.exists(ap) and "*" not in vfs.axs.udel:
|
||||
yeet("blocked write; folder not world-deletable: /%s" % (vpath,))
|
||||
|
||||
xbu = vfs.flags.get("xbu")
|
||||
if xbu and not runhook(
|
||||
self.nlog, xbu, ap, vpath, "", "", 0, 0, "8.3.8.7", 0, ""
|
||||
):
|
||||
yeet("blocked by xbu server config: " + vpath)
|
||||
|
||||
if not self.args.tftp_nols and bos.path.isdir(ap):
|
||||
return self._ls(vpath, "", 0, True)
|
||||
|
||||
return open(ap, mode, *a, **ka)
|
||||
|
||||
def _mkdir(self, vpath: str, *a) -> None:
|
||||
vfs, ap = self._v2a("mkdir", vpath, [])
|
||||
if "*" not in vfs.axs.uwrite:
|
||||
yeet("blocked mkdir; folder not world-writable: /%s" % (vpath,))
|
||||
|
||||
return bos.mkdir(ap)
|
||||
|
||||
def _unlink(self, vpath: str) -> None:
|
||||
# return bos.unlink(self._v2a("stat", vpath, *a)[1])
|
||||
vfs, ap = self._v2a("delete", vpath, [True, False, False, True])
|
||||
|
||||
try:
|
||||
inf = bos.stat(ap)
|
||||
except:
|
||||
return
|
||||
|
||||
if not stat.S_ISREG(inf.st_mode) or inf.st_size:
|
||||
yeet("attempted delete of non-empty file")
|
||||
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
self.hub.up2k.handle_rm("*", "8.3.8.7", [vpath], [], False)
|
||||
|
||||
def _access(self, *a: Any) -> bool:
|
||||
return True
|
||||
|
||||
def _p_abspath(self, vpath: str) -> str:
|
||||
return "/" + undot(vpath)
|
||||
|
||||
def _p_normpath(self, *a: Any) -> str:
|
||||
return ""
|
||||
|
||||
def _p_exists(self, vpath: str) -> bool:
|
||||
try:
|
||||
ap = self._v2a("p.exists", vpath, [False, False])[1]
|
||||
bos.stat(ap)
|
||||
return True
|
||||
except:
|
||||
return False
|
||||
|
||||
def _p_isdir(self, vpath: str) -> bool:
|
||||
try:
|
||||
st = bos.stat(self._v2a("p.isdir", vpath, [False, False])[1])
|
||||
ret = stat.S_ISDIR(st.st_mode)
|
||||
return ret
|
||||
except:
|
||||
return False
|
||||
|
||||
def _hook(self, *a: Any, **ka: Any) -> None:
|
||||
src = inspect.currentframe().f_back.f_code.co_name
|
||||
error("\033[31m%s:hook(%s)\033[0m", src, a)
|
||||
raise Exception("nope")
|
||||
|
||||
def _disarm(self, fos: SimpleNamespace) -> None:
|
||||
fos.chmod = self._hook
|
||||
fos.chown = self._hook
|
||||
fos.close = self._hook
|
||||
fos.ftruncate = self._hook
|
||||
fos.lchown = self._hook
|
||||
fos.link = self._hook
|
||||
fos.listdir = self._hook
|
||||
fos.lstat = self._hook
|
||||
fos.open = self._hook
|
||||
fos.remove = self._hook
|
||||
fos.rename = self._hook
|
||||
fos.replace = self._hook
|
||||
fos.scandir = self._hook
|
||||
fos.stat = self._hook
|
||||
fos.symlink = self._hook
|
||||
fos.truncate = self._hook
|
||||
fos.utime = self._hook
|
||||
fos.walk = self._hook
|
||||
|
||||
fos.path.expanduser = self._hook
|
||||
fos.path.expandvars = self._hook
|
||||
fos.path.getatime = self._hook
|
||||
fos.path.getctime = self._hook
|
||||
fos.path.getmtime = self._hook
|
||||
fos.path.getsize = self._hook
|
||||
fos.path.isabs = self._hook
|
||||
fos.path.isfile = self._hook
|
||||
fos.path.islink = self._hook
|
||||
fos.path.realpath = self._hook
|
||||
|
||||
|
||||
def yeet(msg: str) -> None:
|
||||
warning(msg)
|
||||
raise TftpException(msg)
|
||||
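The new tftpd module confines partftpy to copyparty's virtual filesystem by handing it a `SimpleNamespace` that mimics `os` but routes path operations through access-checked wrappers, while `_disarm()` replaces everything else with a hook that simply raises. A condensed sketch of that idea, assuming a hypothetical third-party module that accesses the filesystem via a swappable `lib.os` attribute; names below are illustrative:

```python
import os
from types import SimpleNamespace

class Guard(object):
    def _open(self, path, mode="rb", *a, **ka):
        # map the virtual path to a real one and enforce permissions here
        if ".." in path.split("/"):
            raise PermissionError(path)
        return open(path, mode, *a, **ka)

    def _hook(self, *a, **ka):
        # any call not explicitly allowed is a bug or an attack; refuse it
        raise Exception("unexpected filesystem access: %r" % (a,))

    def build_fos(self):
        fos = SimpleNamespace()
        for k in os.__dict__:  # start from a full copy of the os module
            try:
                setattr(fos, k, getattr(os, k))
            except Exception:
                pass
        fos.open = self._open  # allow only the calls we mediate
        for k in ("remove", "rename", "symlink", "truncate", "walk"):
            setattr(fos, k, self._hook)  # disarm everything risky
        return fos

# lib.os = Guard().build_fos()  # the library now only sees the guarded API
```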
@@ -78,16 +78,34 @@ class ThumbCli(object):
|
||||
if rem.startswith(".hist/th/") and rem.split(".")[-1] in ["webp", "jpg", "png"]:
|
||||
return os.path.join(ptop, rem)
|
||||
|
||||
if fmt == "j" and self.args.th_no_jpg:
|
||||
fmt = "w"
|
||||
if fmt[:1] in "jw":
|
||||
sfmt = fmt[:1]
|
||||
|
||||
if fmt == "w":
|
||||
if (
|
||||
self.args.th_no_webp
|
||||
or (is_img and not self.can_webp)
|
||||
or (self.args.th_ff_jpg and (not is_img or preferred == "ff"))
|
||||
):
|
||||
fmt = "j"
|
||||
if sfmt == "j" and self.args.th_no_jpg:
|
||||
sfmt = "w"
|
||||
|
||||
if sfmt == "w":
|
||||
if (
|
||||
self.args.th_no_webp
|
||||
or (is_img and not self.can_webp)
|
||||
or (self.args.th_ff_jpg and (not is_img or preferred == "ff"))
|
||||
):
|
||||
sfmt = "j"
|
||||
|
||||
vf_crop = dbv.flags["crop"]
|
||||
vf_th3x = dbv.flags["th3x"]
|
||||
|
||||
if "f" in vf_crop:
|
||||
sfmt += "f" if "n" in vf_crop else ""
|
||||
else:
|
||||
sfmt += "f" if "f" in fmt else ""
|
||||
|
||||
if "f" in vf_th3x:
|
||||
sfmt += "3" if "y" in vf_th3x else ""
|
||||
else:
|
||||
sfmt += "3" if "3" in fmt else ""
|
||||
|
||||
fmt = sfmt
|
||||
|
||||
histpath = self.asrv.vfs.histtab.get(ptop)
|
||||
if not histpath:
|
||||
|
||||
@@ -28,6 +28,7 @@ from .util import (
|
||||
runcmd,
|
||||
statdir,
|
||||
vsplit,
|
||||
wunlink,
|
||||
)
|
||||
|
||||
if True: # pylint: disable=using-constant-test
|
||||
@@ -96,13 +97,13 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -
|
||||
|
||||
# spectrograms are never cropped; strip fullsize flag
|
||||
ext = rem.split(".")[-1].lower()
|
||||
if ext in ffa and fmt in ("wf", "jf"):
|
||||
fmt = fmt[:1]
|
||||
if ext in ffa and fmt[:2] in ("wf", "jf"):
|
||||
fmt = fmt.replace("f", "")
|
||||
|
||||
rd += "\n" + fmt
|
||||
h = hashlib.sha512(afsenc(rd)).digest()
|
||||
b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24]
|
||||
rd = "{}/{}/".format(b64[:2], b64[2:4]).lower() + b64
|
||||
rd = ("%s/%s/" % (b64[:2], b64[2:4])).lower() + b64
|
||||
|
||||
# could keep original filenames but this is safer re pathlen
|
||||
h = hashlib.sha512(afsenc(fn)).digest()
|
||||
@@ -115,7 +116,7 @@ def thumb_path(histpath: str, rem: str, mtime: float, fmt: str, ffa: set[str]) -
|
||||
fmt = "webp" if fc == "w" else "png" if fc == "p" else "jpg"
|
||||
cat = "th"
|
||||
|
||||
return "{}/{}/{}/{}.{:x}.{}".format(histpath, cat, rd, fn, int(mtime), fmt)
|
||||
return "%s/%s/%s/%s.%x.%s" % (histpath, cat, rd, fn, int(mtime), fmt)
|
||||
|
||||
|
||||
class ThumbSrv(object):
|
||||
@@ -129,6 +130,8 @@ class ThumbSrv(object):
|
||||
|
||||
self.mutex = threading.Lock()
|
||||
self.busy: dict[str, list[threading.Condition]] = {}
|
||||
self.ram: dict[str, float] = {}
|
||||
self.memcond = threading.Condition(self.mutex)
|
||||
self.stopping = False
|
||||
self.nthr = max(1, self.args.th_mt)
|
||||
|
||||
@@ -197,9 +200,10 @@ class ThumbSrv(object):
|
||||
with self.mutex:
|
||||
return not self.nthr
|
||||
|
||||
def getres(self, vn: VFS) -> tuple[int, int]:
|
||||
def getres(self, vn: VFS, fmt: str) -> tuple[int, int]:
|
||||
mul = 3 if "3" in fmt else 1
|
||||
w, h = vn.flags["thsize"].split("x")
|
||||
return int(w), int(h)
|
||||
return int(w) * mul, int(h) * mul
|
||||
|
||||
def get(self, ptop: str, rem: str, mtime: float, fmt: str) -> Optional[str]:
|
||||
histpath = self.asrv.vfs.histtab.get(ptop)
|
||||
@@ -214,7 +218,7 @@ class ThumbSrv(object):
|
||||
with self.mutex:
|
||||
try:
|
||||
self.busy[tpath].append(cond)
|
||||
self.log("wait {}".format(tpath))
|
||||
self.log("joined waiting room for %s" % (tpath,))
|
||||
except:
|
||||
thdir = os.path.dirname(tpath)
|
||||
bos.makedirs(os.path.join(thdir, "w"))
|
||||
@@ -265,6 +269,23 @@ class ThumbSrv(object):
|
||||
"ffa": self.fmt_ffa,
|
||||
}
|
||||
|
||||
def wait4ram(self, need: float, ttpath: str) -> None:
|
||||
ram = self.args.th_ram_max
|
||||
if need > ram * 0.99:
|
||||
t = "file too big; need %.2f GiB RAM, but --th-ram-max is only %.1f"
|
||||
raise Exception(t % (need, ram))
|
||||
|
||||
while True:
|
||||
with self.mutex:
|
||||
used = sum([v for k, v in self.ram.items() if k != ttpath]) + need
|
||||
if used < ram:
|
||||
# self.log("XXX self.ram: %s" % (self.ram,), 5)
|
||||
self.ram[ttpath] = need
|
||||
return
|
||||
with self.memcond:
|
||||
# self.log("at RAM limit; used %.2f GiB, need %.2f more" % (used-need, need), 1)
|
||||
self.memcond.wait(3)
|
||||
|
||||
def worker(self) -> None:
|
||||
while not self.stopping:
|
||||
task = self.q.get()
|
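`wait4ram()` above implements a soft RAM budget: each worker registers how many GiB its task will need, blocks on a condition variable while the running total would exceed `--th-ram-max`, and the reservation is dropped (and waiters woken) when the thumbnail job finishes. A stripped-down sketch of the same gating pattern, with illustrative names rather than the copyparty API:

```python
import threading

class RamGate(object):
    def __init__(self, budget_gib):
        self.budget = budget_gib
        self.cond = threading.Condition(threading.Lock())
        self.claims = {}  # task-id -> GiB reserved

    def acquire(self, need, key):
        if need > self.budget * 0.99:
            raise Exception("task needs more RAM than the whole budget")
        with self.cond:
            while sum(v for k, v in self.claims.items() if k != key) + need > self.budget:
                self.cond.wait(3)  # re-check periodically, like wait4ram
            self.claims[key] = need

    def release(self, key):
        with self.cond:
            self.claims.pop(key, None)
            self.cond.notify_all()  # wake anyone waiting for headroom
```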
||||
@@ -298,7 +319,7 @@ class ThumbSrv(object):
|
||||
tdir, tfn = os.path.split(tpath)
|
||||
ttpath = os.path.join(tdir, "w", tfn)
|
||||
try:
|
||||
bos.unlink(ttpath)
|
||||
wunlink(self.log, ttpath, vn.flags)
|
||||
except:
|
||||
pass
|
||||
|
||||
@@ -318,7 +339,7 @@ class ThumbSrv(object):
|
||||
else:
|
||||
# ffmpeg may spawn empty files on windows
|
||||
try:
|
||||
os.unlink(ttpath)
|
||||
wunlink(self.log, ttpath, vn.flags)
|
||||
except:
|
||||
pass
|
||||
|
||||
@@ -330,17 +351,21 @@ class ThumbSrv(object):
|
||||
with self.mutex:
|
||||
subs = self.busy[tpath]
|
||||
del self.busy[tpath]
|
||||
self.ram.pop(ttpath, None)
|
||||
|
||||
for x in subs:
|
||||
with x:
|
||||
x.notify_all()
|
||||
|
||||
with self.memcond:
|
||||
self.memcond.notify_all()
|
||||
|
||||
with self.mutex:
|
||||
self.nthr -= 1
|
||||
|
||||
def fancy_pillow(self, im: "Image.Image", fmt: str, vn: VFS) -> "Image.Image":
|
||||
# exif_transpose is expensive (loads full image + unconditional copy)
|
||||
res = self.getres(vn)
|
||||
res = self.getres(vn, fmt)
|
||||
r = max(*res) * 2
|
||||
im.thumbnail((r, r), resample=Image.LANCZOS)
|
||||
try:
|
||||
@@ -355,7 +380,7 @@ class ThumbSrv(object):
|
||||
if rot in rots:
|
||||
im = im.transpose(rots[rot])
|
||||
|
||||
if fmt.endswith("f"):
|
||||
if "f" in fmt:
|
||||
im.thumbnail(res, resample=Image.LANCZOS)
|
||||
else:
|
||||
iw, ih = im.size
|
||||
@@ -366,12 +391,13 @@ class ThumbSrv(object):
|
||||
return im
|
||||
|
||||
def conv_pil(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
|
||||
self.wait4ram(0.2, tpath)
|
||||
with Image.open(fsenc(abspath)) as im:
|
||||
try:
|
||||
im = self.fancy_pillow(im, fmt, vn)
|
||||
except Exception as ex:
|
||||
self.log("fancy_pillow {}".format(ex), "90")
|
||||
im.thumbnail(self.getres(vn))
|
||||
im.thumbnail(self.getres(vn, fmt))
|
||||
|
||||
fmts = ["RGB", "L"]
|
||||
args = {"quality": 40}
|
||||
@@ -382,7 +408,7 @@ class ThumbSrv(object):
|
||||
# method 0 = pillow-default, fast
|
||||
# method 4 = ffmpeg-default
|
||||
# method 6 = max, slow
|
||||
fmts += ["RGBA", "LA"]
|
||||
fmts.extend(("RGBA", "LA"))
|
||||
args["method"] = 6
|
||||
else:
|
||||
# default q = 75
|
||||
@@ -395,11 +421,12 @@ class ThumbSrv(object):
|
||||
im.save(tpath, **args)
|
||||
|
||||
def conv_vips(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
|
||||
self.wait4ram(0.2, tpath)
|
||||
crops = ["centre", "none"]
|
||||
if fmt.endswith("f"):
|
||||
if "f" in fmt:
|
||||
crops = ["none"]
|
||||
|
||||
w, h = self.getres(vn)
|
||||
w, h = self.getres(vn, fmt)
|
||||
kw = {"height": h, "size": "down", "intent": "relative"}
|
||||
|
||||
for c in crops:
|
||||
@@ -415,6 +442,7 @@ class ThumbSrv(object):
|
||||
img.write_to_file(tpath, Q=40)
|
||||
|
||||
def conv_ffmpeg(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
|
||||
self.wait4ram(0.2, tpath)
|
||||
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
|
||||
if not ret:
|
||||
return
|
||||
@@ -427,12 +455,12 @@ class ThumbSrv(object):
|
||||
seek = [b"-ss", "{:.0f}".format(dur / 3).encode("utf-8")]
|
||||
|
||||
scale = "scale={0}:{1}:force_original_aspect_ratio="
|
||||
if fmt.endswith("f"):
|
||||
if "f" in fmt:
|
||||
scale += "decrease,setsar=1:1"
|
||||
else:
|
||||
scale += "increase,crop={0}:{1},setsar=1:1"
|
||||
|
||||
res = self.getres(vn)
|
||||
res = self.getres(vn, fmt)
|
||||
bscale = scale.format(*list(res)).encode("utf-8")
|
||||
# fmt: off
|
||||
cmd = [
|
||||
@@ -467,9 +495,9 @@ class ThumbSrv(object):
|
||||
cmd += [fsenc(tpath)]
|
||||
self._run_ff(cmd, vn)
|
||||
|
||||
def _run_ff(self, cmd: list[bytes], vn: VFS) -> None:
|
||||
def _run_ff(self, cmd: list[bytes], vn: VFS, oom: int = 400) -> None:
|
||||
# self.log((b" ".join(cmd)).decode("utf-8"))
|
||||
ret, _, serr = runcmd(cmd, timeout=vn.flags["convt"], nice=True)
|
||||
ret, _, serr = runcmd(cmd, timeout=vn.flags["convt"], nice=True, oom=oom)
|
||||
if not ret:
|
||||
return
|
||||
|
||||
@@ -517,8 +545,21 @@ class ThumbSrv(object):
|
||||
if "ac" not in ret:
|
||||
raise Exception("not audio")
|
||||
|
||||
flt = (
|
||||
b"[0:a:0]"
|
||||
# jt_versi.xm: 405M/839s
|
||||
dur = ret[".dur"][1] if ".dur" in ret else 300
|
||||
need = 0.2 + dur / 3000
|
||||
speedup = b""
|
||||
if need > self.args.th_ram_max * 0.7:
|
||||
self.log("waves too big (need %.2f GiB); trying to optimize" % (need,))
|
||||
need = 0.2 + dur / 4200 # only helps about this much...
|
||||
speedup = b"aresample=8000,"
|
||||
if need > self.args.th_ram_max * 0.96:
|
||||
raise Exception("file too big; cannot waves")
|
||||
|
||||
self.wait4ram(need, tpath)
|
||||
|
||||
flt = b"[0:a:0]" + speedup
|
||||
flt += (
|
||||
b"compand=.3|.3:1|1:-90/-60|-60/-40|-40/-30|-20/-20:6:0:-90:0.2"
|
||||
b",volume=2"
|
||||
b",showwavespic=s=2048x64:colors=white"
|
||||
@@ -545,7 +586,20 @@ class ThumbSrv(object):
|
||||
if "ac" not in ret:
|
||||
raise Exception("not audio")
|
||||
|
||||
fc = "[0:a:0]aresample=48000{},showspectrumpic=s=640x512,crop=780:544:70:50[o]"
|
||||
# https://trac.ffmpeg.org/ticket/10797
|
||||
# expect 1 GiB every 600 seconds when duration is tricky;
|
||||
# simple filetypes are generally safer so let's special-case those
|
||||
safe = ("flac", "wav", "aif", "aiff", "opus")
|
||||
coeff = 1800 if abspath.split(".")[-1].lower() in safe else 600
|
||||
dur = ret[".dur"][1] if ".dur" in ret else 300
|
||||
need = 0.2 + dur / coeff
|
||||
self.wait4ram(need, tpath)
|
||||
|
||||
fc = "[0:a:0]aresample=48000{},showspectrumpic=s="
|
||||
if "3" in fmt:
|
||||
fc += "1280x1024,crop=1420:1056:70:48[o]"
|
||||
else:
|
||||
fc += "640x512,crop=780:544:70:48[o]"
|
||||
|
||||
if self.args.th_ff_swr:
|
||||
fco = ":filter_size=128:cutoff=0.877"
|
||||
@@ -587,6 +641,7 @@ class ThumbSrv(object):
|
||||
if self.args.no_acode:
|
||||
raise Exception("disabled in server config")
|
||||
|
||||
self.wait4ram(0.2, tpath)
|
||||
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
|
||||
if "ac" not in ret:
|
||||
raise Exception("not audio")
|
||||
@@ -602,7 +657,7 @@ class ThumbSrv(object):
|
||||
if want_caf:
|
||||
tmp_opus = tpath + ".opus"
|
||||
try:
|
||||
bos.unlink(tmp_opus)
|
||||
wunlink(self.log, tmp_opus, vn.flags)
|
||||
except:
|
||||
pass
|
||||
|
||||
@@ -623,7 +678,7 @@ class ThumbSrv(object):
|
||||
fsenc(tmp_opus)
|
||||
]
|
||||
# fmt: on
|
||||
self._run_ff(cmd, vn)
|
||||
self._run_ff(cmd, vn, oom=300)
|
||||
|
||||
# iOS fails to play some "insufficiently complex" files
|
||||
# (average file shorter than 8 seconds), so of course we
|
||||
@@ -647,7 +702,7 @@ class ThumbSrv(object):
|
||||
fsenc(tpath)
|
||||
]
|
||||
# fmt: on
|
||||
self._run_ff(cmd, vn)
|
||||
self._run_ff(cmd, vn, oom=300)
|
||||
|
||||
elif want_caf:
|
||||
# simple remux should be safe
|
||||
@@ -665,11 +720,11 @@ class ThumbSrv(object):
|
||||
fsenc(tpath)
|
||||
]
|
||||
# fmt: on
|
||||
self._run_ff(cmd, vn)
|
||||
self._run_ff(cmd, vn, oom=300)
|
||||
|
||||
if tmp_opus != tpath:
|
||||
try:
|
||||
bos.unlink(tmp_opus)
|
||||
wunlink(self.log, tmp_opus, vn.flags)
|
||||
except:
|
||||
pass
|
||||
|
||||
@@ -696,7 +751,10 @@ class ThumbSrv(object):
|
||||
else:
|
||||
self.log("\033[Jcln {} ({})/\033[A".format(histpath, vol))
|
||||
|
||||
ndirs += self.clean(histpath)
|
||||
try:
|
||||
ndirs += self.clean(histpath)
|
||||
except Exception as ex:
|
||||
self.log("\033[Jcln err in %s: %r" % (histpath, ex), 3)
|
||||
|
||||
self.log("\033[Jcln ok; rm {} dirs".format(ndirs))
|
||||
|
||||
|
||||
@@ -21,7 +21,7 @@ from copy import deepcopy
|
||||
|
||||
from queue import Queue
|
||||
|
||||
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, WINDOWS
|
||||
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, WINDOWS, E
|
||||
from .authsrv import LEELOO_DALLAS, SSEELOG, VFS, AuthSrv
|
||||
from .bos import bos
|
||||
from .cfg import vf_bmap, vf_cmap, vf_vmap
|
||||
@@ -35,8 +35,10 @@ from .util import (
|
||||
Pebkac,
|
||||
ProgressPrinter,
|
||||
absreal,
|
||||
alltrace,
|
||||
atomic_move,
|
||||
db_ex_chk,
|
||||
dir_is_empty,
|
||||
djoin,
|
||||
fsenc,
|
||||
gen_filekey,
|
||||
@@ -63,6 +65,7 @@ from .util import (
|
||||
vsplit,
|
||||
w8b64dec,
|
||||
w8b64enc,
|
||||
wunlink,
|
||||
)
|
||||
|
||||
try:
|
||||
@@ -85,6 +88,9 @@ zsg = "avif,avifs,bmp,gif,heic,heics,heif,heifs,ico,j2p,j2k,jp2,jpeg,jpg,jpx,png
|
||||
CV_EXTS = set(zsg.split(","))
|
||||
|
||||
|
||||
HINT_HISTPATH = "you could try moving the database to another location (preferably an SSD or NVME drive) using either the --hist argument (global option for all volumes), or the hist volflag (just for this volume)"
|
||||
|
||||
|
||||
class Dbw(object):
|
||||
def __init__(self, c: "sqlite3.Cursor", n: int, t: float) -> None:
|
||||
self.c = c
|
||||
@@ -145,9 +151,12 @@ class Up2k(object):
|
||||
self.entags: dict[str, set[str]] = {}
|
||||
self.mtp_parsers: dict[str, dict[str, MParser]] = {}
|
||||
self.pending_tags: list[tuple[set[str], str, str, dict[str, Any]]] = []
|
||||
self.hashq: Queue[tuple[str, str, str, str, str, float, str, bool]] = Queue()
|
||||
self.tagq: Queue[tuple[str, str, str, str, str, float]] = Queue()
|
||||
self.hashq: Queue[
|
||||
tuple[str, str, dict[str, Any], str, str, str, float, str, bool]
|
||||
] = Queue()
|
||||
self.tagq: Queue[tuple[str, str, str, str, int, str, float]] = Queue()
|
||||
self.tag_event = threading.Condition()
|
||||
self.hashq_mutex = threading.Lock()
|
||||
self.n_hashq = 0
|
||||
self.n_tagq = 0
|
||||
self.mpool_used = False
|
||||
@@ -543,7 +552,7 @@ class Up2k(object):
|
||||
runihook(self.log, cmd, vol, ups)
|
||||
|
||||
def _vis_job_progress(self, job: dict[str, Any]) -> str:
|
||||
perc = 100 - (len(job["need"]) * 100.0 / len(job["hash"]))
|
||||
perc = 100 - (len(job["need"]) * 100.0 / (len(job["hash"]) or 1))
|
||||
path = djoin(job["ptop"], job["prel"], job["name"])
|
||||
return "{:5.1f}% {}".format(perc, path)
|
||||
|
||||
@@ -578,7 +587,7 @@ class Up2k(object):
|
||||
if gid:
|
||||
self.log("reload #{} running".format(self.gid))
|
||||
|
||||
self.pp = ProgressPrinter()
|
||||
self.pp = ProgressPrinter(self.log, self.args)
|
||||
vols = list(all_vols.values())
|
||||
t0 = time.time()
|
||||
have_e2d = False
|
||||
@@ -601,7 +610,7 @@ class Up2k(object):
|
||||
for vol in vols:
|
||||
try:
|
||||
bos.makedirs(vol.realpath) # gonna happen at snap anyways
|
||||
bos.listdir(vol.realpath)
|
||||
dir_is_empty(self.log_func, not self.args.no_scandir, vol.realpath)
|
||||
except:
|
||||
self.volstate[vol.vpath] = "OFFLINE (cannot access folder)"
|
||||
self.log("cannot access " + vol.realpath, c=1)
|
||||
@@ -804,7 +813,7 @@ class Up2k(object):
|
||||
ft = "\033[0;32m{}{:.0}"
|
||||
ff = "\033[0;35m{}{:.0}"
|
||||
fv = "\033[0;36m{}:\033[90m{}"
|
||||
fx = set(("html_head",))
|
||||
fx = set(("html_head", "rm_re_t", "rm_re_r"))
|
||||
fd = vf_bmap()
|
||||
fd.update(vf_cmap())
|
||||
fd.update(vf_vmap())
|
||||
@@ -887,7 +896,7 @@ class Up2k(object):
|
||||
return None
|
||||
|
||||
try:
|
||||
cur = self._open_db(db_path)
|
||||
cur = self._open_db_wd(db_path)
|
||||
|
||||
# speeds measured uploading 520 small files on a WD20SPZX (SMR 2.5" 5400rpm 4kb)
|
||||
dbd = flags["dbd"]
|
||||
@@ -930,8 +939,8 @@ class Up2k(object):
|
||||
|
||||
return cur, db_path
|
||||
except:
|
||||
msg = "cannot use database at [{}]:\n{}"
|
||||
self.log(msg.format(ptop, traceback.format_exc()))
|
||||
msg = "ERROR: cannot use database at [%s]:\n%s\n\033[33mhint: %s\n"
|
||||
self.log(msg % (db_path, traceback.format_exc(), HINT_HISTPATH), 1)
|
||||
|
||||
return None
|
||||
|
||||
@@ -983,12 +992,12 @@ class Up2k(object):
|
||||
excl = [x.replace("/", "\\") for x in excl]
|
||||
else:
|
||||
# ~/.wine/dosdevices/z:/ and such
|
||||
excl += ["/dev", "/proc", "/run", "/sys"]
|
||||
excl.extend(("/dev", "/proc", "/run", "/sys"))
|
||||
|
||||
rtop = absreal(top)
|
||||
n_add = n_rm = 0
|
||||
try:
|
||||
if not bos.listdir(rtop):
|
||||
if dir_is_empty(self.log_func, not self.args.no_scandir, rtop):
|
||||
t = "volume /%s at [%s] is empty; will not be indexed as this could be due to an offline filesystem"
|
||||
self.log(t % (vol.vpath, rtop), 6)
|
||||
return True, False
|
||||
@@ -1085,7 +1094,7 @@ class Up2k(object):
|
||||
cv = ""
|
||||
|
||||
assert self.pp and self.mem_cur
|
||||
self.pp.msg = "a{} {}".format(self.pp.n, cdir)
|
||||
self.pp.msg = "a%d %s" % (self.pp.n, cdir)
|
||||
|
||||
rd = cdir[len(top) :].strip("/")
|
||||
if WINDOWS:
|
||||
@@ -1160,8 +1169,8 @@ class Up2k(object):
|
||||
continue
|
||||
|
||||
if not sz and (
|
||||
"{}.PARTIAL".format(iname) in partials
|
||||
or ".{}.PARTIAL".format(iname) in partials
|
||||
"%s.PARTIAL" % (iname,) in partials
|
||||
or ".%s.PARTIAL" % (iname,) in partials
|
||||
):
|
||||
# placeholder for unfinished upload
|
||||
continue
|
||||
@@ -1224,7 +1233,7 @@ class Up2k(object):
|
||||
abspath = os.path.join(cdir, fn)
|
||||
nohash = reh.search(abspath) if reh else False
|
||||
|
||||
sql = "select w, mt, sz, at from up where rd = ? and fn = ?"
|
||||
sql = "select w, mt, sz, ip, at from up where rd = ? and fn = ?"
|
||||
try:
|
||||
c = db.c.execute(sql, (rd, fn))
|
||||
except:
|
||||
@@ -1233,7 +1242,7 @@ class Up2k(object):
|
||||
in_db = list(c.fetchall())
|
||||
if in_db:
|
||||
self.pp.n -= 1
|
||||
dw, dts, dsz, at = in_db[0]
|
||||
dw, dts, dsz, ip, at = in_db[0]
|
||||
if len(in_db) > 1:
|
||||
t = "WARN: multiple entries: [{}] => [{}] |{}|\n{}"
|
||||
rep_db = "\n".join([repr(x) for x in in_db])
|
||||
@@ -1255,9 +1264,11 @@ class Up2k(object):
|
||||
db.n += 1
|
||||
in_db = []
|
||||
else:
|
||||
dw = ""
|
||||
ip = ""
|
||||
at = 0
|
||||
|
||||
self.pp.msg = "a{} {}".format(self.pp.n, abspath)
|
||||
self.pp.msg = "a%d %s" % (self.pp.n, abspath)
|
||||
|
||||
if nohash or not sz:
|
||||
wark = up2k_wark_from_metadata(self.salt, sz, lmod, rd, fn)
|
||||
@@ -1278,8 +1289,12 @@ class Up2k(object):
|
||||
|
||||
wark = up2k_wark_from_hashlist(self.salt, sz, hashes)
|
||||
|
||||
if dw and dw != wark:
|
||||
ip = ""
|
||||
at = 0
|
||||
|
||||
# skip upload hooks by not providing vflags
|
||||
self.db_add(db.c, {}, rd, fn, lmod, sz, "", "", wark, "", "", "", at)
|
||||
self.db_add(db.c, {}, rd, fn, lmod, sz, "", "", wark, "", "", ip, at)
|
||||
db.n += 1
|
||||
ret += 1
|
||||
td = time.time() - db.t
|
||||
@@ -1357,7 +1372,7 @@ class Up2k(object):
|
||||
rd = drd
|
||||
|
||||
abspath = djoin(top, rd)
|
||||
self.pp.msg = "b{} {}".format(ndirs - nchecked, abspath)
|
||||
self.pp.msg = "b%d %s" % (ndirs - nchecked, abspath)
|
||||
try:
|
||||
if os.path.isdir(abspath):
|
||||
continue
|
||||
@@ -1709,7 +1724,7 @@ class Up2k(object):
|
||||
cur.execute(q, (w[:16],))
|
||||
|
||||
abspath = djoin(ptop, rd, fn)
|
||||
self.pp.msg = "c{} {}".format(nq, abspath)
|
||||
self.pp.msg = "c%d %s" % (nq, abspath)
|
||||
if not mpool:
|
||||
n_tags = self._tagscan_file(cur, entags, w, abspath, ip, at)
|
||||
else:
|
||||
@@ -1766,7 +1781,7 @@ class Up2k(object):
|
||||
if c2.execute(q, (row[0][:16],)).fetchone():
|
||||
continue
|
||||
|
||||
gf.write("{}\n".format("\x00".join(row)).encode("utf-8"))
|
||||
gf.write(("%s\n" % ("\x00".join(row),)).encode("utf-8"))
|
||||
n += 1
|
||||
|
||||
c2.close()
|
||||
@@ -2040,12 +2055,13 @@ class Up2k(object):
|
||||
return
|
||||
|
||||
try:
|
||||
st = bos.stat(qe.abspath)
|
||||
if not qe.mtp:
|
||||
if self.args.mtag_vv:
|
||||
t = "tag-thr: {}({})"
|
||||
self.log(t.format(self.mtag.backend, qe.abspath), "90")
|
||||
|
||||
tags = self.mtag.get(qe.abspath)
|
||||
tags = self.mtag.get(qe.abspath) if st.st_size else {}
|
||||
else:
|
||||
if self.args.mtag_vv:
|
||||
t = "tag-thr: {}({})"
|
||||
@@ -2086,11 +2102,16 @@ class Up2k(object):
|
||||
"""will mutex"""
|
||||
assert self.mtag
|
||||
|
||||
if not bos.path.isfile(abspath):
|
||||
try:
|
||||
st = bos.stat(abspath)
|
||||
except:
|
||||
return 0
|
||||
|
||||
if not stat.S_ISREG(st.st_mode):
|
||||
return 0
|
||||
|
||||
try:
|
||||
tags = self.mtag.get(abspath)
|
||||
tags = self.mtag.get(abspath) if st.st_size else {}
|
||||
except Exception as ex:
|
||||
self._log_tag_err("", abspath, ex)
|
||||
return 0
|
||||
@@ -2144,6 +2165,46 @@ class Up2k(object):
|
||||
def _trace(self, msg: str) -> None:
|
||||
self.log("ST: {}".format(msg))
|
||||
|
||||
def _open_db_wd(self, db_path: str) -> "sqlite3.Cursor":
|
||||
ok: list[int] = []
|
||||
Daemon(self._open_db_timeout, "opendb_watchdog", [db_path, ok])
|
||||
try:
|
||||
return self._open_db(db_path)
|
||||
finally:
|
||||
ok.append(1)
|
||||
|
||||
def _open_db_timeout(self, db_path, ok: list[int]) -> None:
|
||||
# give it plenty of time due to the count statement (and wisdom from byte's box)
|
||||
for _ in range(60):
|
||||
time.sleep(1)
|
||||
if ok:
|
||||
return
|
||||
|
||||
t = "WARNING:\n\n initializing an up2k database is taking longer than one minute; something has probably gone wrong:\n\n"
|
||||
self._log_sqlite_incompat(db_path, t)
|
||||
|
||||
def _log_sqlite_incompat(self, db_path, t0) -> None:
|
||||
txt = t0 or ""
|
||||
digest = hashlib.sha512(db_path.encode("utf-8", "replace")).digest()
|
||||
stackname = base64.urlsafe_b64encode(digest[:9]).decode("utf-8")
|
||||
stackpath = os.path.join(E.cfg, "stack-%s.txt" % (stackname,))
|
||||
|
||||
t = " the filesystem at %s may not support locking, or is otherwise incompatible with sqlite\n\n %s\n\n"
|
||||
t += " PS: if you think this is a bug and wish to report it, please include your configuration + the following file: %s\n"
|
||||
txt += t % (db_path, HINT_HISTPATH, stackpath)
|
||||
self.log(txt, 3)
|
||||
|
||||
try:
|
||||
stk = alltrace()
|
||||
with open(stackpath, "wb") as f:
|
||||
f.write(stk.encode("utf-8", "replace"))
|
||||
except Exception as ex:
|
||||
self.log("warning: failed to write %s: %s" % (stackpath, ex), 3)
|
||||
|
||||
if self.args.q:
|
||||
t = "-" * 72
|
||||
raise Exception("%s\n%s\n%s" % (t, txt, t))
|
||||
|
||||
def _orz(self, db_path: str) -> "sqlite3.Cursor":
|
||||
c = sqlite3.connect(
|
||||
db_path, timeout=self.timeout, check_same_thread=False
|
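`_open_db_wd()` wraps the potentially slow `_open_db()` call with a watchdog: a daemon thread sleeps for up to a minute and, if a shared `ok` list is still empty, logs a warning plus a thread dump hinting that the filesystem may not support sqlite locking. A small generic sketch of that sentinel-list pattern, with assumed names rather than the actual helpers:

```python
import threading
import time

def open_with_watchdog(opener, path, warn, limit=60):
    ok = []  # the caller appends to this once the open completes

    def watchdog():
        for _ in range(limit):
            time.sleep(1)
            if ok:
                return
        warn("opening %s is taking over %d sec; filesystem may not support locking" % (path, limit))

    threading.Thread(target=watchdog, daemon=True).start()
    try:
        return opener(path)  # the blocking call being monitored
    finally:
        ok.append(1)  # tell the watchdog we finished (or failed fast)
```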
||||
@@ -2156,7 +2217,7 @@ class Up2k(object):
|
||||
cur = self._orz(db_path)
|
||||
ver = self._read_ver(cur)
|
||||
if not existed and ver is None:
|
||||
return self._create_db(db_path, cur)
|
||||
return self._try_create_db(db_path, cur)
|
||||
|
||||
if ver == 4:
|
||||
try:
|
||||
@@ -2194,8 +2255,16 @@ class Up2k(object):
|
||||
db = cur.connection
|
||||
cur.close()
|
||||
db.close()
|
||||
bos.unlink(db_path)
|
||||
return self._create_db(db_path, None)
|
||||
self._delete_db(db_path)
|
||||
return self._try_create_db(db_path, None)
|
||||
|
||||
def _delete_db(self, db_path: str):
|
||||
for suf in ("", "-shm", "-wal", "-journal"):
|
||||
try:
|
||||
bos.unlink(db_path + suf)
|
||||
except:
|
||||
if not suf:
|
||||
raise
|
||||
|
||||
def _backup_db(
|
||||
self, db_path: str, cur: "sqlite3.Cursor", ver: Optional[int], msg: str
|
||||
@@ -2232,6 +2301,18 @@ class Up2k(object):
|
||||
return int(rows[0][0])
|
||||
return None
|
||||
|
||||
def _try_create_db(
|
||||
self, db_path: str, cur: Optional["sqlite3.Cursor"]
|
||||
) -> "sqlite3.Cursor":
|
||||
try:
|
||||
return self._create_db(db_path, cur)
|
||||
except:
|
||||
try:
|
||||
self._delete_db(db_path)
|
||||
except:
|
||||
pass
|
||||
raise
|
||||
|
||||
def _create_db(
|
||||
self, db_path: str, cur: Optional["sqlite3.Cursor"]
|
||||
) -> "sqlite3.Cursor":
|
||||
@@ -2351,7 +2432,8 @@ class Up2k(object):
|
||||
t = "cannot receive uploads right now;\nserver busy with {}.\nPlease wait; the client will retry..."
|
||||
raise Pebkac(503, t.format(self.blocked or "[unknown]"))
|
||||
except TypeError:
|
||||
# py2
|
||||
if not PY2:
|
||||
raise
|
||||
with self.mutex:
|
||||
self._job_volchk(cj)
|
||||
|
||||
@@ -2574,12 +2656,13 @@ class Up2k(object):
|
||||
raise Pebkac(403, t)
|
||||
|
||||
if not self.args.nw:
|
||||
dvf: dict[str, Any] = vfs.flags
|
||||
try:
|
||||
dvf = self.flags[job["ptop"]]
|
||||
self._symlink(src, dst, dvf, lmod=cj["lmod"], rm=True)
|
||||
except:
|
||||
if bos.path.exists(dst):
|
||||
bos.unlink(dst)
|
||||
wunlink(self.log, dst, dvf)
|
||||
if not n4g:
|
||||
raise
|
||||
|
||||
@@ -2676,6 +2759,28 @@ class Up2k(object):
|
||||
fk = self.gen_fk(alg, self.args.fk_salt, ap, job["size"], ino)
|
||||
ret["fk"] = fk[: vfs.flags["fk"]]
|
||||
|
||||
if (
|
||||
not ret["hash"]
|
||||
and cur
|
||||
and cj.get("umod")
|
||||
and int(cj["lmod"]) != int(job["lmod"])
|
||||
and not self.args.nw
|
||||
and cj["user"] in vfs.axs.uwrite
|
||||
and cj["user"] in vfs.axs.udel
|
||||
):
|
||||
sql = "update up set mt=? where substr(w,1,16)=? and +rd=? and +fn=?"
|
||||
try:
|
||||
cur.execute(sql, (cj["lmod"], wark[:16], job["prel"], job["name"]))
|
||||
cur.connection.commit()
|
||||
|
||||
ap = djoin(job["ptop"], job["prel"], job["name"])
|
||||
times = (int(time.time()), int(cj["lmod"]))
|
||||
bos.utime(ap, times, False)
|
||||
|
||||
self.log("touched %s from %d to %d" % (ap, job["lmod"], cj["lmod"]))
|
||||
except Exception as ex:
|
||||
self.log("umod failed, %r" % (ex,), 3)
|
||||
|
||||
return ret
|
||||
|
||||
def _untaken(self, fdir: str, job: dict[str, Any], ts: float) -> str:
|
||||
@@ -2688,14 +2793,14 @@ class Up2k(object):
|
||||
fp = djoin(fdir, fname)
|
||||
if job.get("replace") and bos.path.exists(fp):
|
||||
self.log("replacing existing file at {}".format(fp))
|
||||
bos.unlink(fp)
|
||||
wunlink(self.log, fp, self.flags.get(job["ptop"]) or {})
|
||||
|
||||
if self.args.plain_ip:
|
||||
dip = ip.replace(":", ".")
|
||||
else:
|
||||
dip = self.hub.iphash.s(ip)
|
||||
|
||||
suffix = "-{:.6f}-{}".format(ts, dip)
|
||||
suffix = "-%.6f-%s" % (ts, dip)
|
||||
with ren_open(fname, "wb", fdir=fdir, suffix=suffix) as zfw:
|
||||
return zfw["orz"][1]
|
||||
|
||||
@@ -2746,7 +2851,7 @@ class Up2k(object):
|
||||
ldst = ldst.replace("/", "\\")
|
||||
|
||||
if rm and bos.path.exists(dst):
|
||||
bos.unlink(dst)
|
||||
wunlink(self.log, dst, flags)
|
||||
|
||||
try:
|
||||
if "hardlink" in flags:
|
||||
@@ -2762,7 +2867,7 @@ class Up2k(object):
|
||||
Path(ldst).symlink_to(lsrc)
|
||||
if not bos.path.exists(dst):
|
||||
try:
|
||||
bos.unlink(dst)
|
||||
wunlink(self.log, dst, flags)
|
||||
except:
|
||||
pass
|
||||
t = "the created symlink [%s] did not resolve to [%s]"
|
||||
@@ -2910,7 +3015,7 @@ class Up2k(object):
|
||||
except:
|
||||
pass
|
||||
|
||||
z2 += [upt]
|
||||
z2.append(upt)
|
||||
if self.idx_wark(vflags, *z2):
|
||||
del self.registry[ptop][wark]
|
||||
else:
|
||||
@@ -2999,7 +3104,7 @@ class Up2k(object):
|
||||
raise
|
||||
|
||||
if "e2t" in self.flags[ptop]:
|
||||
self.tagq.put((ptop, wark, rd, fn, ip, at))
|
||||
self.tagq.put((ptop, wark, rd, fn, sz, ip, at))
|
||||
self.n_tagq += 1
|
||||
|
||||
return True
|
||||
@@ -3065,7 +3170,7 @@ class Up2k(object):
|
||||
):
|
||||
t = "upload blocked by xau server config"
|
||||
self.log(t, 1)
|
||||
bos.unlink(dst)
|
||||
wunlink(self.log, dst, vflags)
|
||||
self.registry[ptop].pop(wark, None)
|
||||
raise Pebkac(403, t)
|
||||
|
||||
@@ -3203,9 +3308,9 @@ class Up2k(object):
|
||||
except:
|
||||
pass
|
||||
|
||||
volpath = "{}/{}".format(vrem, fn).strip("/")
|
||||
vpath = "{}/{}".format(dbv.vpath, volpath).strip("/")
|
||||
self.log("rm {}\n {}".format(vpath, abspath))
|
||||
volpath = ("%s/%s" % (vrem, fn)).strip("/")
|
||||
vpath = ("%s/%s" % (dbv.vpath, volpath)).strip("/")
|
||||
self.log("rm %s\n %s" % (vpath, abspath))
|
||||
_ = dbv.get(volpath, uname, *permsets[0])
|
||||
if xbd:
|
||||
if not runhook(
|
||||
@@ -3236,7 +3341,7 @@ class Up2k(object):
|
||||
if cur:
|
||||
cur.connection.commit()
|
||||
|
||||
bos.unlink(abspath)
|
||||
wunlink(self.log, abspath, dbv.flags)
|
||||
if xad:
|
||||
runhook(
|
||||
self.log,
|
||||
@@ -3391,7 +3496,7 @@ class Up2k(object):
|
||||
t = "moving symlink from [{}] to [{}], target [{}]"
|
||||
self.log(t.format(sabs, dabs, dlabs))
|
||||
mt = bos.path.getmtime(sabs, False)
|
||||
bos.unlink(sabs)
|
||||
wunlink(self.log, sabs, svn.flags)
|
||||
self._symlink(dlabs, dabs, dvn.flags, False, lmod=mt)
|
||||
|
||||
# folders are too scary, schedule rescan of both vols
|
||||
@@ -3458,7 +3563,7 @@ class Up2k(object):
|
||||
dlink = os.path.join(os.path.dirname(sabs), dlink)
|
||||
dlink = bos.path.abspath(dlink)
|
||||
self._symlink(dlink, dabs, dvn.flags, lmod=ftime)
|
||||
bos.unlink(sabs)
|
||||
wunlink(self.log, sabs, svn.flags)
|
||||
else:
|
||||
atomic_move(sabs, dabs)
|
||||
|
||||
@@ -3473,7 +3578,7 @@ class Up2k(object):
|
||||
shutil.copy2(b1, b2)
|
||||
except:
|
||||
try:
|
||||
os.unlink(b2)
|
||||
wunlink(self.log, dabs, dvn.flags)
|
||||
except:
|
||||
pass
|
||||
|
||||
@@ -3485,7 +3590,7 @@ class Up2k(object):
|
||||
zb = os.readlink(b1)
|
||||
os.symlink(zb, b2)
|
||||
except:
|
||||
os.unlink(b2)
|
||||
wunlink(self.log, dabs, dvn.flags)
|
||||
raise
|
||||
|
||||
if is_link:
|
||||
@@ -3495,7 +3600,7 @@ class Up2k(object):
|
||||
except:
|
||||
pass
|
||||
|
||||
os.unlink(b1)
|
||||
wunlink(self.log, sabs, svn.flags)
|
||||
|
||||
if xar:
|
||||
runhook(self.log, xar, dabs, dvp, "", uname, 0, 0, "", 0, "")
|
||||
@@ -3581,9 +3686,10 @@ class Up2k(object):
|
||||
)
|
||||
job = reg.get(wark) if wark else None
|
||||
if job:
|
||||
t = "forgetting partial upload {} ({})"
|
||||
p = self._vis_job_progress(job)
|
||||
self.log(t.format(wark, p))
|
||||
if job["need"]:
|
||||
t = "forgetting partial upload {} ({})"
|
||||
p = self._vis_job_progress(job)
|
||||
self.log(t.format(wark, p))
|
||||
assert wark
|
||||
del reg[wark]
|
||||
|
||||
@@ -3635,10 +3741,11 @@ class Up2k(object):
|
||||
ptop, rem = links.pop(slabs)
|
||||
self.log("linkswap [{}] and [{}]".format(sabs, slabs))
|
||||
mt = bos.path.getmtime(slabs, False)
|
||||
bos.unlink(slabs)
|
||||
flags = self.flags.get(ptop) or {}
|
||||
wunlink(self.log, slabs, flags)
|
||||
bos.rename(sabs, slabs)
|
||||
bos.utime(slabs, (int(time.time()), int(mt)), False)
|
||||
self._symlink(slabs, sabs, self.flags.get(ptop) or {}, False)
|
||||
self._symlink(slabs, sabs, flags, False)
|
||||
full[slabs] = (ptop, rem)
|
||||
sabs = slabs
|
||||
|
||||
@@ -3684,13 +3791,13 @@ class Up2k(object):
|
||||
self.log(t % (ex, ex), 3)
|
||||
|
||||
self.log("relinking [%s] to [%s]" % (alink, dabs))
|
||||
flags = self.flags.get(parts[0]) or {}
|
||||
try:
|
||||
lmod = bos.path.getmtime(alink, False)
|
||||
bos.unlink(alink)
|
||||
wunlink(self.log, alink, flags)
|
||||
except:
|
||||
pass
|
||||
|
||||
flags = self.flags.get(parts[0]) or {}
|
||||
self._symlink(dabs, alink, flags, False, lmod=lmod or 0)
|
||||
|
||||
return len(full) + len(links)
|
||||
@@ -3737,7 +3844,7 @@ class Up2k(object):
|
||||
return []
|
||||
|
||||
if self.pp:
|
||||
mb = int(fsz / 1024 / 1024)
|
||||
mb = fsz // (1024 * 1024)
|
||||
self.pp.msg = prefix + str(mb) + suffix
|
||||
|
||||
hashobj = hashlib.sha512()
|
||||
@@ -3759,8 +3866,14 @@ class Up2k(object):
|
||||
|
||||
def _new_upload(self, job: dict[str, Any]) -> None:
|
||||
pdir = djoin(job["ptop"], job["prel"])
|
||||
if not job["size"] and bos.path.isfile(djoin(pdir, job["name"])):
|
||||
return
|
||||
if not job["size"]:
|
||||
try:
|
||||
inf = bos.stat(djoin(pdir, job["name"]))
|
||||
if stat.S_ISREG(inf.st_mode):
|
||||
job["lmod"] = inf.st_size
|
||||
return
|
||||
except:
|
||||
pass
|
||||
|
||||
self.registry[job["ptop"]][job["wark"]] = job
|
||||
job["name"] = self._untaken(pdir, job, job["t0"])
|
||||
@@ -3802,7 +3915,7 @@ class Up2k(object):
|
||||
else:
|
||||
dip = self.hub.iphash.s(job["addr"])
|
||||
|
||||
suffix = "-{:.6f}-{}".format(job["t0"], dip)
|
||||
suffix = "-%.6f-%s" % (job["t0"], dip)
|
||||
with ren_open(tnam, "wb", fdir=pdir, suffix=suffix) as zfw:
|
||||
f, job["tnam"] = zfw["orz"]
|
||||
abspath = djoin(pdir, job["tnam"])
|
||||
@@ -3949,14 +4062,14 @@ class Up2k(object):
|
||||
with self.mutex:
|
||||
self.n_tagq -= 1
|
||||
|
||||
ptop, wark, rd, fn, ip, at = self.tagq.get()
|
||||
ptop, wark, rd, fn, sz, ip, at = self.tagq.get()
|
||||
if "e2t" not in self.flags[ptop]:
|
||||
continue
|
||||
|
||||
# self.log("\n " + repr([ptop, rd, fn]))
|
||||
abspath = djoin(ptop, rd, fn)
|
||||
try:
|
||||
tags = self.mtag.get(abspath)
|
||||
tags = self.mtag.get(abspath) if sz else {}
|
||||
ntags1 = len(tags)
|
||||
parsers = self._get_parsers(ptop, tags, abspath)
|
||||
if self.args.mtag_vv:
|
||||
@@ -3991,16 +4104,16 @@ class Up2k(object):
|
||||
self.log("tagged {} ({}+{})".format(abspath, ntags1, len(tags) - ntags1))
|
||||
|
||||
def _hasher(self) -> None:
|
||||
with self.mutex:
|
||||
with self.hashq_mutex:
|
||||
self.n_hashq += 1
|
||||
|
||||
while True:
|
||||
with self.mutex:
|
||||
with self.hashq_mutex:
|
||||
self.n_hashq -= 1
|
||||
# self.log("hashq {}".format(self.n_hashq))
|
||||
|
||||
task = self.hashq.get()
|
||||
if len(task) != 8:
|
||||
if len(task) != 9:
|
||||
raise Exception("invalid hash task")
|
||||
|
||||
try:
|
||||
@@ -4009,11 +4122,14 @@ class Up2k(object):
|
||||
except Exception as ex:
|
||||
self.log("failed to hash %s: %s" % (task, ex), 1)
|
||||
|
||||
def _hash_t(self, task: tuple[str, str, str, str, str, float, str, bool]) -> bool:
|
||||
ptop, vtop, rd, fn, ip, at, usr, skip_xau = task
|
||||
def _hash_t(
|
||||
self, task: tuple[str, str, dict[str, Any], str, str, str, float, str, bool]
|
||||
) -> bool:
|
||||
ptop, vtop, flags, rd, fn, ip, at, usr, skip_xau = task
|
||||
# self.log("hashq {} pop {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
|
||||
if "e2d" not in self.flags[ptop]:
|
||||
return True
|
||||
with self.mutex:
|
||||
if not self.register_vpath(ptop, flags):
|
||||
return True
|
||||
|
||||
abspath = djoin(ptop, rd, fn)
|
||||
self.log("hashing " + abspath)
|
||||
@@ -4064,11 +4180,22 @@ class Up2k(object):
|
||||
usr: str,
|
||||
skip_xau: bool = False,
|
||||
) -> None:
|
||||
with self.mutex:
|
||||
self.register_vpath(ptop, flags)
|
||||
self.hashq.put((ptop, vtop, rd, fn, ip, at, usr, skip_xau))
|
||||
if "e2d" not in flags:
|
||||
return
|
||||
|
||||
if self.n_hashq > 1024:
|
||||
t = "%d files in hashq; taking a nap"
|
||||
self.log(t % (self.n_hashq,), 6)
|
||||
|
||||
for _ in range(self.n_hashq // 1024):
|
||||
time.sleep(0.1)
|
||||
if self.n_hashq < 1024:
|
||||
break
|
||||
|
||||
zt = (ptop, vtop, flags, rd, fn, ip, at, usr, skip_xau)
|
||||
with self.hashq_mutex:
|
||||
self.hashq.put(zt)
|
||||
self.n_hashq += 1
|
||||
# self.log("hashq {} push {}/{}/{}".format(self.n_hashq, ptop, rd, fn))
|
||||
|
||||
def shutdown(self) -> None:
|
||||
self.stop = True
|
||||
@@ -4119,6 +4246,6 @@ def up2k_wark_from_hashlist(salt: str, filesize: int, hashes: list[str]) -> str:
|
||||
|
||||
|
||||
def up2k_wark_from_metadata(salt: str, sz: int, lastmod: int, rd: str, fn: str) -> str:
|
||||
ret = sfsenc("{}\n{}\n{}\n{}\n{}".format(salt, lastmod, sz, rd, fn))
|
||||
ret = sfsenc("%s\n%d\n%d\n%s\n%s" % (salt, lastmod, sz, rd, fn))
|
||||
ret = base64.urlsafe_b64encode(hashlib.sha512(ret).digest())
|
||||
return "#{}".format(ret.decode("ascii"))[:44]
|
||||
return ("#%s" % (ret.decode("ascii"),))[:44]
|
||||
|
||||
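For reference, the metadata-wark shown in the hunk above can be reproduced standalone. The sketch below follows the same formula as the diff (sha512 over salt, lastmod, size, and path; urlsafe base64; `#` prefix; 44 chars total), but substitutes a plain utf-8 encode for the `sfsenc()` helper and uses made-up sample values:

```python
# minimal standalone sketch of up2k_wark_from_metadata() as shown above;
# plain utf-8 stands in for sfsenc(), and the inputs are invented examples
import base64
import hashlib

def wark_from_metadata(salt: str, sz: int, lastmod: int, rd: str, fn: str) -> str:
    zb = ("%s\n%d\n%d\n%s\n%s" % (salt, lastmod, sz, rd, fn)).encode("utf-8", "replace")
    ret = base64.urlsafe_b64encode(hashlib.sha512(zb).digest())
    return ("#%s" % (ret.decode("ascii"),))[:44]

print(wark_from_metadata("salt", 1024, 1700000000, "music", "song.opus"))
```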
@@ -1,6 +1,7 @@
|
||||
# coding: utf-8
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import argparse
|
||||
import base64
|
||||
import contextlib
|
||||
import errno
|
||||
@@ -167,6 +168,12 @@ except:
|
||||
return struct.unpack(fmt.decode("ascii"), a)
|
||||
|
||||
|
||||
try:
|
||||
BITNESS = struct.calcsize(b"P") * 8
|
||||
except:
|
||||
BITNESS = struct.calcsize("P") * 8
|
||||
|
||||
|
||||
ansi_re = re.compile("\033\\[[^mK]*[mK]")
|
||||
|
||||
|
||||
@@ -343,6 +350,11 @@ CMD_EXEB = set(_exestr.encode("utf-8").split())
|
||||
CMD_EXES = set(_exestr.split())
|
||||
|
||||
|
||||
# mostly from https://github.com/github/gitignore/blob/main/Global/macOS.gitignore
|
||||
APPLESAN_TXT = r"/(__MACOS|Icon\r\r)|/\.(_|DS_Store|AppleDouble|LSOverride|DocumentRevisions-|fseventsd|Spotlight-|TemporaryItems|Trashes|VolumeIcon\.icns|com\.apple\.timemachine\.donotpresent|AppleDB|AppleDesktop|apdisk)"
|
||||
APPLESAN_RE = re.compile(APPLESAN_TXT)
|
||||
|
||||
|
||||
pybin = sys.executable or ""
|
||||
if EXE:
|
||||
pybin = ""
|
||||
@@ -366,11 +378,6 @@ def py_desc() -> str:
|
||||
if ofs > 0:
|
||||
py_ver = py_ver[:ofs]
|
||||
|
||||
try:
|
||||
bitness = struct.calcsize(b"P") * 8
|
||||
except:
|
||||
bitness = struct.calcsize("P") * 8
|
||||
|
||||
host_os = platform.system()
|
||||
compiler = platform.python_compiler().split("http")[0]
|
||||
|
||||
@@ -378,7 +385,7 @@ def py_desc() -> str:
|
||||
os_ver = m.group(1) if m else ""
|
||||
|
||||
return "{:>9} v{} on {}{} {} [{}]".format(
|
||||
interp, py_ver, host_os, bitness, os_ver, compiler
|
||||
interp, py_ver, host_os, BITNESS, os_ver, compiler
|
||||
)
|
||||
|
||||
|
||||
@@ -416,14 +423,32 @@ try:
|
||||
except:
|
||||
PYFTPD_VER = "(None)"
|
||||
|
||||
try:
|
||||
from partftpy.__init__ import __version__ as PARTFTPY_VER
|
||||
except:
|
||||
PARTFTPY_VER = "(None)"
|
||||
|
||||
VERSIONS = "copyparty v{} ({})\n{}\n sqlite v{} | jinja v{} | pyftpd v{}".format(
|
||||
S_VERSION, S_BUILD_DT, py_desc(), SQLITE_VER, JINJA_VER, PYFTPD_VER
|
||||
|
||||
PY_DESC = py_desc()
|
||||
|
||||
VERSIONS = (
|
||||
"copyparty v{} ({})\n{}\n sqlite {} | jinja {} | pyftpd {} | tftp {}".format(
|
||||
S_VERSION, S_BUILD_DT, PY_DESC, SQLITE_VER, JINJA_VER, PYFTPD_VER, PARTFTPY_VER
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
_: Any = (mp, BytesIO, quote, unquote, SQLITE_VER, JINJA_VER, PYFTPD_VER)
|
||||
__all__ = ["mp", "BytesIO", "quote", "unquote", "SQLITE_VER", "JINJA_VER", "PYFTPD_VER"]
|
||||
_: Any = (mp, BytesIO, quote, unquote, SQLITE_VER, JINJA_VER, PYFTPD_VER, PARTFTPY_VER)
|
||||
__all__ = [
|
||||
"mp",
|
||||
"BytesIO",
|
||||
"quote",
|
||||
"unquote",
|
||||
"SQLITE_VER",
|
||||
"JINJA_VER",
|
||||
"PYFTPD_VER",
|
||||
"PARTFTPY_VER",
|
||||
]
|
||||
|
||||
|
||||
class Daemon(threading.Thread):
|
||||
@@ -527,6 +552,8 @@ class HLog(logging.Handler):
|
||||
elif record.name.startswith("impacket"):
|
||||
if self.ptn_smb_ign.match(msg):
|
||||
return
|
||||
elif record.name.startswith("partftpy."):
|
||||
record.name = record.name[9:]
|
||||
|
||||
self.log_func(record.name[-21:], msg, c)
|
||||
|
||||
@@ -623,9 +650,14 @@ class _Unrecv(object):
|
||||
while nbytes > len(ret):
|
||||
ret += self.recv(nbytes - len(ret))
|
||||
except OSError:
|
||||
t = "client only sent {} of {} expected bytes".format(len(ret), nbytes)
|
||||
if len(ret) <= 16:
|
||||
t += "; got {!r}".format(ret)
|
||||
t = "client stopped sending data; expected at least %d more bytes"
|
||||
if not ret:
|
||||
t = t % (nbytes,)
|
||||
else:
|
||||
t += ", only got %d"
|
||||
t = t % (nbytes, len(ret))
|
||||
if len(ret) <= 16:
|
||||
t += "; %r" % (ret,)
|
||||
|
||||
if raise_on_trunc:
|
||||
raise UnrecvEOF(5, t)
|
||||
@@ -775,16 +807,20 @@ class ProgressPrinter(threading.Thread):
|
||||
periodically print progress info without linefeeds
|
||||
"""
|
||||
|
||||
def __init__(self) -> None:
|
||||
def __init__(self, log: "NamedLogger", args: argparse.Namespace) -> None:
|
||||
threading.Thread.__init__(self, name="pp")
|
||||
self.daemon = True
|
||||
self.log = log
|
||||
self.args = args
|
||||
self.msg = ""
|
||||
self.end = False
|
||||
self.n = -1
|
||||
self.start()
|
||||
|
||||
def run(self) -> None:
|
||||
tp = 0
|
||||
msg = None
|
||||
no_stdout = self.args.q
|
||||
fmt = " {}\033[K\r" if VT100 else " {} $\r"
|
||||
while not self.end:
|
||||
time.sleep(0.1)
|
||||
@@ -792,10 +828,21 @@ class ProgressPrinter(threading.Thread):
|
||||
continue
|
||||
|
||||
msg = self.msg
|
||||
now = time.time()
|
||||
if msg and now - tp > 10:
|
||||
tp = now
|
||||
self.log("progress: %s" % (msg,), 6)
|
||||
|
||||
if no_stdout:
|
||||
continue
|
||||
|
||||
uprint(fmt.format(msg))
|
||||
if PY2:
|
||||
sys.stdout.flush()
|
||||
|
||||
if no_stdout:
|
||||
return
|
||||
|
||||
if VT100:
|
||||
print("\033[K", end="")
|
||||
elif msg:
|
||||
@@ -849,7 +896,7 @@ class MTHash(object):
|
||||
ex = ex or str(qe)
|
||||
|
||||
if pp:
|
||||
mb = int((fsz - nch * chunksz) / 1024 / 1024)
|
||||
mb = (fsz - nch * chunksz) // (1024 * 1024)
|
||||
pp.msg = prefix + str(mb) + suffix
|
||||
|
||||
if ex:
|
||||
@@ -1071,7 +1118,7 @@ def uprint(msg: str) -> None:
|
||||
|
||||
|
||||
def nuprint(msg: str) -> None:
|
||||
uprint("{}\n".format(msg))
|
||||
uprint("%s\n" % (msg,))
|
||||
|
||||
|
||||
def dedent(txt: str) -> str:
|
||||
@@ -1094,10 +1141,10 @@ def rice_tid() -> str:
|
||||
def trace(*args: Any, **kwargs: Any) -> None:
|
||||
t = time.time()
|
||||
stack = "".join(
|
||||
"\033[36m{}\033[33m{}".format(x[0].split(os.sep)[-1][:-3], x[1])
|
||||
"\033[36m%s\033[33m%s" % (x[0].split(os.sep)[-1][:-3], x[1])
|
||||
for x in traceback.extract_stack()[3:-1]
|
||||
)
|
||||
parts = ["{:.6f}".format(t), rice_tid(), stack]
|
||||
parts = ["%.6f" % (t,), rice_tid(), stack]
|
||||
|
||||
if args:
|
||||
parts.append(repr(args))
|
||||
@@ -1114,17 +1161,17 @@ def alltrace() -> str:
|
||||
threads: dict[str, types.FrameType] = {}
|
||||
names = dict([(t.ident, t.name) for t in threading.enumerate()])
|
||||
for tid, stack in sys._current_frames().items():
|
||||
name = "{} ({:x})".format(names.get(tid), tid)
|
||||
name = "%s (%x)" % (names.get(tid), tid)
|
||||
threads[name] = stack
|
||||
|
||||
rret: list[str] = []
|
||||
bret: list[str] = []
|
||||
for name, stack in sorted(threads.items()):
|
||||
ret = ["\n\n# {}".format(name)]
|
||||
ret = ["\n\n# %s" % (name,)]
|
||||
pad = None
|
||||
for fn, lno, name, line in traceback.extract_stack(stack):
|
||||
fn = os.sep.join(fn.split(os.sep)[-3:])
|
||||
ret.append('File: "{}", line {}, in {}'.format(fn, lno, name))
|
||||
ret.append('File: "%s", line %d, in %s' % (fn, lno, name))
|
||||
if line:
|
||||
ret.append(" " + str(line.strip()))
|
||||
if "self.not_empty.wait()" in line:
|
||||
@@ -1133,7 +1180,7 @@ def alltrace() -> str:
|
||||
if pad:
|
||||
bret += [ret[0]] + [pad + x for x in ret[1:]]
|
||||
else:
|
||||
rret += ret
|
||||
rret.extend(ret)
|
||||
|
||||
return "\n".join(rret + bret) + "\n"
|
||||
|
||||
@@ -1216,12 +1263,20 @@ def log_thrs(log: Callable[[str, str, int], None], ival: float, name: str) -> No
|
||||
|
||||
|
||||
def vol_san(vols: list["VFS"], txt: bytes) -> bytes:
|
||||
txt0 = txt
|
||||
for vol in vols:
|
||||
txt = txt.replace(vol.realpath.encode("utf-8"), vol.vpath.encode("utf-8"))
|
||||
txt = txt.replace(
|
||||
vol.realpath.encode("utf-8").replace(b"\\", b"\\\\"),
|
||||
vol.vpath.encode("utf-8"),
|
||||
)
|
||||
bap = vol.realpath.encode("utf-8")
|
||||
bhp = vol.histpath.encode("utf-8")
|
||||
bvp = vol.vpath.encode("utf-8")
|
||||
bvph = b"$hist(/" + bvp + b")"
|
||||
|
||||
txt = txt.replace(bap, bvp)
|
||||
txt = txt.replace(bhp, bvph)
|
||||
txt = txt.replace(bap.replace(b"\\", b"\\\\"), bvp)
|
||||
txt = txt.replace(bhp.replace(b"\\", b"\\\\"), bvph)
|
||||
|
||||
if txt != txt0:
|
||||
txt += b"\r\nNOTE: filepaths sanitized; see serverlog for correct values"
|
||||
|
||||
return txt
|
||||
|
||||
@@ -1229,9 +1284,9 @@ def vol_san(vols: list["VFS"], txt: bytes) -> bytes:
|
||||
def min_ex(max_lines: int = 8, reverse: bool = False) -> str:
|
||||
et, ev, tb = sys.exc_info()
|
||||
stb = traceback.extract_tb(tb)
|
||||
fmt = "{} @ {} <{}>: {}"
|
||||
ex = [fmt.format(fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in stb]
|
||||
ex.append("[{}] {}".format(et.__name__ if et else "(anonymous)", ev))
|
||||
fmt = "%s @ %d <%s>: %s"
|
||||
ex = [fmt % (fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in stb]
|
||||
ex.append("[%s] %s" % (et.__name__ if et else "(anonymous)", ev))
|
||||
return "\n".join(ex[-max_lines:][:: -1 if reverse else 1])
|
||||
|
||||
|
||||
@@ -1282,7 +1337,7 @@ def ren_open(
|
||||
with fun(fsenc(fpath), *args, **kwargs) as f:
|
||||
if b64:
|
||||
assert fdir
|
||||
fp2 = "fn-trunc.{}.txt".format(b64)
|
||||
fp2 = "fn-trunc.%s.txt" % (b64,)
|
||||
fp2 = os.path.join(fdir, fp2)
|
||||
with open(fsenc(fp2), "wb") as f2:
|
||||
f2.write(orig_name.encode("utf-8"))
|
||||
@@ -1311,7 +1366,7 @@ def ren_open(
|
||||
raise
|
||||
|
||||
if not b64:
|
||||
zs = "{}\n{}".format(orig_name, suffix).encode("utf-8", "replace")
|
||||
zs = ("%s\n%s" % (orig_name, suffix)).encode("utf-8", "replace")
|
||||
zs = hashlib.sha512(zs).digest()[:12]
|
||||
b64 = base64.urlsafe_b64encode(zs).decode("utf-8")
|
||||
|
||||
@@ -1331,7 +1386,7 @@ def ren_open(
|
||||
# okay do the first letter then
|
||||
ext = "." + ext[2:]
|
||||
|
||||
fname = "{}~{}{}".format(bname, b64, ext)
|
||||
fname = "%s~%s%s" % (bname, b64, ext)
|
||||
|
||||
|
||||
class MultipartParser(object):
|
||||
@@ -1517,15 +1572,20 @@ class MultipartParser(object):
|
||||
return ret
|
||||
|
||||
def parse(self) -> None:
|
||||
boundary = get_boundary(self.headers)
|
||||
self.log("boundary=%r" % (boundary,))
|
||||
|
||||
# spec says there might be junk before the first boundary,
|
||||
# can't have the leading \r\n if that's not the case
|
||||
self.boundary = b"--" + get_boundary(self.headers).encode("utf-8")
|
||||
self.boundary = b"--" + boundary.encode("utf-8")
|
||||
|
||||
# discard junk before the first boundary
|
||||
for junk in self._read_data():
|
||||
self.log(
|
||||
"discarding preamble: [{}]".format(junk.decode("utf-8", "replace"))
|
||||
)
|
||||
if not junk:
|
||||
continue
|
||||
|
||||
jtxt = junk.decode("utf-8", "replace")
|
||||
self.log("discarding preamble |%d| %r" % (len(junk), jtxt))
|
||||
|
||||
# nice, now make it fast
|
||||
self.boundary = b"\r\n" + self.boundary
|
||||
@@ -1537,11 +1597,9 @@ class MultipartParser(object):
|
||||
raises if the field name is not as expected
|
||||
"""
|
||||
assert self.gen
|
||||
p_field, _, p_data = next(self.gen)
|
||||
p_field, p_fname, p_data = next(self.gen)
|
||||
if p_field != field_name:
|
||||
raise Pebkac(
|
||||
422, 'expected field "{}", got "{}"'.format(field_name, p_field)
|
||||
)
|
||||
raise WrongPostKey(field_name, p_field, p_fname, p_data)
|
||||
|
||||
return self._read_value(p_data, max_len).decode("utf-8", "surrogateescape")
|
||||
|
||||
@@ -1610,7 +1668,7 @@ def rand_name(fdir: str, fn: str, rnd: int) -> str:
|
||||
break
|
||||
|
||||
nc = rnd + extra
|
||||
nb = int((6 + 6 * nc) / 8)
|
||||
nb = (6 + 6 * nc) // 8
|
||||
zb = os.urandom(nb)
|
||||
zb = base64.urlsafe_b64encode(zb)
|
||||
fn = zb[:nc].decode("utf-8") + ext
|
||||
@@ -1710,10 +1768,10 @@ def get_spd(nbyte: int, t0: float, t: Optional[float] = None) -> str:
|
||||
if t is None:
|
||||
t = time.time()
|
||||
|
||||
bps = nbyte / ((t - t0) + 0.001)
|
||||
bps = nbyte / ((t - t0) or 0.001)
|
||||
s1 = humansize(nbyte).replace(" ", "\033[33m").replace("iB", "")
|
||||
s2 = humansize(bps).replace(" ", "\033[35m").replace("iB", "")
|
||||
return "{} \033[0m{}/s\033[0m".format(s1, s2)
|
||||
return "%s \033[0m%s/s\033[0m" % (s1, s2)
|
||||
|
||||
|
||||
def s2hms(s: float, optional_h: bool = False) -> str:
|
||||
@@ -1721,9 +1779,9 @@ def s2hms(s: float, optional_h: bool = False) -> str:
|
||||
h, s = divmod(s, 3600)
|
||||
m, s = divmod(s, 60)
|
||||
if not h and optional_h:
|
||||
return "{}:{:02}".format(m, s)
|
||||
return "%d:%02d" % (m, s)
|
||||
|
||||
return "{}:{:02}:{:02}".format(h, m, s)
|
||||
return "%d:%02d:%02d" % (h, m, s)
|
||||
|
||||
|
||||
def djoin(*paths: str) -> str:
|
||||
@@ -2065,6 +2123,47 @@ def atomic_move(usrc: str, udst: str) -> None:
|
||||
os.rename(src, dst)
|
||||
|
||||
|
||||
def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
|
||||
maxtime = flags.get("rm_re_t", 0.0)
|
||||
bpath = fsenc(abspath)
|
||||
if not maxtime:
|
||||
os.unlink(bpath)
|
||||
return True
|
||||
|
||||
chill = flags.get("rm_re_r", 0.0)
|
||||
if chill < 0.001:
|
||||
chill = 0.1
|
||||
|
||||
ino = 0
|
||||
t0 = now = time.time()
|
||||
for attempt in range(90210):
|
||||
try:
|
||||
if ino and os.stat(bpath).st_ino != ino:
|
||||
log("inode changed; aborting delete")
|
||||
return False
|
||||
os.unlink(bpath)
|
||||
if attempt:
|
||||
now = time.time()
|
||||
t = "deleted in %.2f sec, attempt %d"
|
||||
log(t % (now - t0, attempt + 1))
|
||||
return True
|
||||
except OSError as ex:
|
||||
now = time.time()
|
||||
if ex.errno == errno.ENOENT:
|
||||
return False
|
||||
if now - t0 > maxtime or attempt == 90209:
|
||||
raise
|
||||
if not attempt:
|
||||
if not PY2:
|
||||
ino = os.stat(bpath).st_ino
|
||||
t = "delete failed (err.%d); retrying for %d sec: %s"
|
||||
log(t % (ex.errno, maxtime + 0.99, abspath))
|
||||
|
||||
time.sleep(chill)
|
||||
|
||||
return False # makes pylance happy
|
||||
|
||||
|
||||
def get_df(abspath: str) -> tuple[Optional[int], Optional[int]]:
|
||||
try:
|
||||
# some fuses misbehave
|
||||
@@ -2193,7 +2292,7 @@ def read_socket_chunked(
|
||||
raise Pebkac(400, t.format(x))
|
||||
|
||||
if log:
|
||||
log("receiving {} byte chunk".format(chunklen))
|
||||
log("receiving %d byte chunk" % (chunklen,))
|
||||
|
||||
for chunk in read_socket(sr, chunklen):
|
||||
yield chunk
|
||||
@@ -2365,6 +2464,12 @@ def statdir(
|
||||
print(t)
|
||||
|
||||
|
||||
def dir_is_empty(logger: "RootLogger", scandir: bool, top: str):
|
||||
for _ in statdir(logger, scandir, False, top):
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
def rmdirs(
|
||||
logger: "RootLogger", scandir: bool, lstat: bool, top: str, depth: int
|
||||
) -> tuple[list[str], list[str]]:
|
||||
@@ -2557,6 +2662,7 @@ def runcmd(
|
||||
argv: Union[list[bytes], list[str]], timeout: Optional[float] = None, **ka: Any
|
||||
) -> tuple[int, str, str]:
|
||||
isbytes = isinstance(argv[0], (bytes, bytearray))
|
||||
oom = ka.pop("oom", 0) # 0..1000
|
||||
kill = ka.pop("kill", "t") # [t]ree [m]ain [n]one
|
||||
capture = ka.pop("capture", 3) # 0=none 1=stdout 2=stderr 3=both
|
||||
|
||||
@@ -2587,6 +2693,14 @@ def runcmd(
|
||||
argv = [NICES] + argv
|
||||
|
||||
p = sp.Popen(argv, stdout=cout, stderr=cerr, **ka)
|
||||
|
||||
if oom and not ANYWIN and not MACOS:
|
||||
try:
|
||||
with open("/proc/%d/oom_score_adj" % (p.pid,), "wb") as f:
|
||||
f.write(("%d\n" % (oom,)).encode("utf-8"))
|
||||
except:
|
||||
pass
|
||||
|
||||
if not timeout or PY2:
|
||||
bout, berr = p.communicate(sin)
|
||||
else:
|
||||
@@ -2734,6 +2848,7 @@ def _parsehook(
|
||||
sp_ka = {
|
||||
"env": env,
|
||||
"nice": True,
|
||||
"oom": 300,
|
||||
"timeout": tout,
|
||||
"kill": kill,
|
||||
"capture": cap,
|
||||
@@ -2950,7 +3065,7 @@ def visual_length(txt: str) -> int:
|
||||
pend = None
|
||||
else:
|
||||
if ch == "\033":
|
||||
pend = "{0}".format(ch)
|
||||
pend = "%s" % (ch,)
|
||||
else:
|
||||
co = ord(ch)
|
||||
# the safe parts of latin1 and cp437 (no greek stuff)
|
||||
@@ -3048,3 +3163,20 @@ class Pebkac(Exception):
|
||||
|
||||
def __repr__(self) -> str:
|
||||
return "Pebkac({}, {})".format(self.code, repr(self.args))
|
||||
|
||||
|
||||
class WrongPostKey(Pebkac):
|
||||
def __init__(
|
||||
self,
|
||||
expected: str,
|
||||
got: str,
|
||||
fname: Optional[str],
|
||||
datagen: Generator[bytes, None, None],
|
||||
) -> None:
|
||||
msg = 'expected field "{}", got "{}"'.format(expected, got)
|
||||
super(WrongPostKey, self).__init__(422, msg)
|
||||
|
||||
self.expected = expected
|
||||
self.got = got
|
||||
self.fname = fname
|
||||
self.datagen = datagen
|
||||
|
||||
@@ -17,8 +17,10 @@ window.baguetteBox = (function () {
|
||||
titleTag: false,
|
||||
async: false,
|
||||
preload: 2,
|
||||
refocus: true,
|
||||
afterShow: null,
|
||||
afterHide: null,
|
||||
duringHide: null,
|
||||
onChange: null,
|
||||
},
|
||||
overlay, slider, btnPrev, btnNext, btnHelp, btnAnim, btnRotL, btnRotR, btnSel, btnFull, btnVmode, btnClose,
|
||||
@@ -144,7 +146,7 @@ window.baguetteBox = (function () {
|
||||
selectorData.galleries.push(gallery);
|
||||
});
|
||||
|
||||
return selectorData.galleries;
|
||||
return [selectorData.galleries, options];
|
||||
}
|
||||
|
||||
function clearCachedData() {
|
||||
@@ -255,19 +257,19 @@ window.baguetteBox = (function () {
|
||||
if (anymod(e, true))
|
||||
return;
|
||||
|
||||
var k = e.code + '', v = vid(), pos = -1;
|
||||
var k = (e.code || e.key) + '', v = vid(), pos = -1;
|
||||
|
||||
if (k == "BracketLeft")
|
||||
setloop(1);
|
||||
else if (k == "BracketRight")
|
||||
setloop(2);
|
||||
else if (e.shiftKey && k != 'KeyR')
|
||||
else if (e.shiftKey && k != "KeyR" && k != "R")
|
||||
return;
|
||||
else if (k == "ArrowLeft" || k == "KeyJ")
|
||||
else if (k == "ArrowLeft" || k == "KeyJ" || k == "Left" || k == "j")
|
||||
showPreviousImage();
|
||||
else if (k == "ArrowRight" || k == "KeyL")
|
||||
else if (k == "ArrowRight" || k == "KeyL" || k == "Right" || k == "l")
|
||||
showNextImage();
|
||||
else if (k == "Escape")
|
||||
else if (k == "Escape" || k == "Esc")
|
||||
hideOverlay();
|
||||
else if (k == "Home")
|
||||
showFirstImage(e);
|
||||
@@ -295,9 +297,9 @@ window.baguetteBox = (function () {
|
||||
}
|
||||
else if (k == "KeyF")
|
||||
tglfull();
|
||||
else if (k == "KeyS")
|
||||
else if (k == "KeyS" || k == "s")
|
||||
tglsel();
|
||||
else if (k == "KeyR")
|
||||
else if (k == "KeyR" || k == "r" || k == "R")
|
||||
rotn(e.shiftKey ? -1 : 1);
|
||||
else if (k == "KeyY")
|
||||
dlpic();
|
||||
@@ -593,6 +595,9 @@ window.baguetteBox = (function () {
|
||||
if (overlay.style.display === 'none')
|
||||
return;
|
||||
|
||||
if (options.duringHide)
|
||||
options.duringHide();
|
||||
|
||||
sethash('');
|
||||
unbindEvents();
|
||||
try {
|
||||
@@ -613,9 +618,45 @@ window.baguetteBox = (function () {
|
||||
if (options.afterHide)
|
||||
options.afterHide();
|
||||
|
||||
documentLastFocus && documentLastFocus.focus();
|
||||
options.refocus && documentLastFocus && documentLastFocus.focus();
|
||||
isOverlayVisible = false;
|
||||
}, 500);
|
||||
unvid();
|
||||
unfig();
|
||||
}, 250);
|
||||
}
|
||||
|
||||
function unvid(keep) {
|
||||
var vids = QSA('#bbox-overlay video');
|
||||
for (var a = vids.length - 1; a >= 0; a--) {
|
||||
var v = vids[a];
|
||||
if (v == keep)
|
||||
continue;
|
||||
|
||||
v.src = '';
|
||||
v.load();
|
||||
|
||||
var p = v.parentNode;
|
||||
p.removeChild(v);
|
||||
p.parentNode.removeChild(p);
|
||||
}
|
||||
}
|
||||
|
||||
function unfig(keep) {
|
||||
var figs = QSA('#bbox-overlay figure'),
|
||||
npre = options.preload || 0,
|
||||
k = [];
|
||||
|
||||
if (keep === undefined)
|
||||
keep = -9;
|
||||
|
||||
for (var a = keep - npre; a <= keep + npre; a++)
|
||||
k.push('bbox-figure-' + a);
|
||||
|
||||
for (var a = figs.length - 1; a >= 0; a--) {
|
||||
var f = figs[a];
|
||||
if (!has(k, f.getAttribute('id')))
|
||||
f.parentNode.removeChild(f);
|
||||
}
|
||||
}
|
||||
|
||||
function loadImage(index, callback) {
|
||||
@@ -708,6 +749,7 @@ window.baguetteBox = (function () {
|
||||
}
|
||||
|
||||
function show(index, gallery) {
|
||||
gallery = gallery || currentGallery;
|
||||
if (!isOverlayVisible && index >= 0 && index < gallery.length) {
|
||||
prepareOverlay(gallery, options);
|
||||
showOverlay(index);
|
||||
@@ -720,12 +762,10 @@ window.baguetteBox = (function () {
|
||||
if (index >= imagesElements.length)
|
||||
return bounceAnimation('right');
|
||||
|
||||
var v = vid();
|
||||
if (v) {
|
||||
v.src = '';
|
||||
v.load();
|
||||
v.parentNode.removeChild(v);
|
||||
try {
|
||||
vid().pause();
|
||||
}
|
||||
catch (ex) { }
|
||||
|
||||
currentIndex = index;
|
||||
loadImage(currentIndex, function () {
|
||||
@@ -734,6 +774,15 @@ window.baguetteBox = (function () {
|
||||
});
|
||||
updateOffset();
|
||||
|
||||
if (options.animation == 'none')
|
||||
unvid(vid());
|
||||
else
|
||||
setTimeout(function () {
|
||||
unvid(vid());
|
||||
}, 100);
|
||||
|
||||
unfig(index);
|
||||
|
||||
if (options.onChange)
|
||||
options.onChange(currentIndex, imagesElements.length);
|
||||
|
||||
|
||||
@@ -818,6 +818,10 @@ html.y #path a:hover {
|
||||
.logue:empty {
|
||||
display: none;
|
||||
}
|
||||
.logue.raw {
|
||||
white-space: pre;
|
||||
font-family: 'scp', 'consolas', monospace;
|
||||
}
|
||||
#doc>iframe,
|
||||
.logue>iframe {
|
||||
background: var(--bgg);
|
||||
@@ -981,6 +985,10 @@ html.y #path a:hover {
|
||||
margin: 0 auto;
|
||||
display: block;
|
||||
}
|
||||
#ggrid.nocrop>a img {
|
||||
max-height: 20em;
|
||||
max-height: calc(var(--grid-sz)*2);
|
||||
}
|
||||
#ggrid>a.dir:before {
|
||||
content: '📂';
|
||||
}
|
||||
@@ -1147,9 +1155,6 @@ html.y #widget.open {
|
||||
@keyframes spin {
|
||||
100% {transform: rotate(360deg)}
|
||||
}
|
||||
@media (prefers-reduced-motion) {
|
||||
@keyframes spin { }
|
||||
}
|
||||
@keyframes fadein {
|
||||
0% {opacity: 0}
|
||||
100% {opacity: 1}
|
||||
@@ -1243,6 +1248,13 @@ html.y #widget.open {
|
||||
0% {opacity:0}
|
||||
100% {opacity:1}
|
||||
}
|
||||
#ggrid>a.glow {
|
||||
animation: gexit .6s ease-out;
|
||||
}
|
||||
@keyframes gexit {
|
||||
0% {box-shadow: 0 0 0 2em var(--a)}
|
||||
100% {box-shadow: 0 0 0em 0em var(--a)}
|
||||
}
|
||||
#wzip a {
|
||||
font-size: .4em;
|
||||
margin: -.3em .1em;
|
||||
@@ -1653,7 +1665,9 @@ html.cz .tgl.btn.on {
|
||||
color: var(--fg-max);
|
||||
}
|
||||
#tree ul a.hl {
|
||||
color: #fff;
|
||||
color: var(--btn-1-fg);
|
||||
background: #000;
|
||||
background: var(--btn-1-bg);
|
||||
text-shadow: none;
|
||||
}
|
||||
@@ -1769,6 +1783,7 @@ html.y #tree.nowrap .ntree a+a:hover {
|
||||
padding: 0;
|
||||
}
|
||||
#thumbs,
|
||||
#au_prescan,
|
||||
#au_fullpre,
|
||||
#au_os_seek,
|
||||
#au_osd_cv,
|
||||
@@ -1776,7 +1791,8 @@ html.y #tree.nowrap .ntree a+a:hover {
|
||||
opacity: .3;
|
||||
}
|
||||
#griden.on+#thumbs,
|
||||
#au_preload.on+#au_fullpre,
|
||||
#au_preload.on+#au_prescan,
|
||||
#au_preload.on+#au_prescan+#au_fullpre,
|
||||
#au_os_ctl.on+#au_os_seek,
|
||||
#au_os_ctl.on+#au_os_seek+#au_osd_cv,
|
||||
#u2turbo.on+#u2tdate {
|
||||
@@ -2174,6 +2190,7 @@ html.y #bbox-overlay figcaption a {
|
||||
}
|
||||
#bbox-halp {
|
||||
color: var(--fg-max);
|
||||
background: #fff;
|
||||
background: var(--bg);
|
||||
position: absolute;
|
||||
top: 0;
|
||||
@@ -3129,7 +3146,7 @@ html.d #treepar {
|
||||
margin-top: 1.7em;
|
||||
}
|
||||
}
|
||||
@supports (display: grid) {
|
||||
@supports (display: grid) and (gap: 1em) {
|
||||
#ggrid {
|
||||
display: grid;
|
||||
margin: 0em 0.25em;
|
||||
@@ -3154,3 +3171,24 @@ html.d #treepar {
|
||||
padding: 0.2em;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
@media (prefers-reduced-motion) {
|
||||
@keyframes spin { }
|
||||
@keyframes gexit { }
|
||||
@keyframes bounce { }
|
||||
@keyframes bounceFromLeft { }
|
||||
@keyframes bounceFromRight { }
|
||||
|
||||
#ggrid>a:before,
|
||||
#widget.anim,
|
||||
#u2tabw,
|
||||
.dropdesc,
|
||||
.dropdesc b,
|
||||
.dropdesc>div>div {
|
||||
transition: none;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -148,7 +148,8 @@
|
||||
logues = {{ logues|tojson if sb_lg else "[]" }},
|
||||
ls0 = {{ ls0|tojson }};
|
||||
|
||||
document.documentElement.className = localStorage.cpp_thm || dtheme;
|
||||
var STG = window.localStorage;
|
||||
document.documentElement.className = (STG && STG.cpp_thm) || dtheme;
|
||||
</script>
|
||||
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/baguettebox.js?_={{ ts }}"></script>
|
||||
@@ -160,3 +161,4 @@
|
||||
</body>
|
||||
|
||||
</html>
|
||||
|
||||
|
||||
File diff suppressed because it is too large
@@ -61,3 +61,4 @@
|
||||
|
||||
</body>
|
||||
</html>
|
||||
|
||||
|
||||
@@ -25,3 +25,4 @@
|
||||
</body>
|
||||
|
||||
</html>
|
||||
|
||||
|
||||
@@ -139,16 +139,15 @@ var md_opt = {
|
||||
};
|
||||
|
||||
(function () {
|
||||
var l = localStorage,
|
||||
drk = l.light != 1,
|
||||
var l = window.localStorage,
|
||||
drk = (l && l.light) != 1,
|
||||
btn = document.getElementById("lightswitch"),
|
||||
f = function (e) {
|
||||
if (e) { e.preventDefault(); drk = !drk; }
|
||||
document.documentElement.className = drk? "z":"y";
|
||||
btn.innerHTML = "go " + (drk ? "light":"dark");
|
||||
l.light = drk? 0:1;
|
||||
try { l.light = drk? 0:1; } catch (ex) { }
|
||||
};
|
||||
|
||||
btn.onclick = f;
|
||||
f();
|
||||
})();
|
||||
@@ -161,3 +160,4 @@ l.light = drk? 0:1;
|
||||
<script src="{{ r }}/.cpr/md2.js?_={{ ts }}"></script>
|
||||
{%- endif %}
|
||||
</body></html>
|
||||
|
||||
|
||||
@@ -216,6 +216,11 @@ function convert_markdown(md_text, dest_dom) {
|
||||
md_html = DOMPurify.sanitize(md_html);
|
||||
}
|
||||
catch (ex) {
|
||||
if (IE) {
|
||||
dest_dom.innerHTML = 'IE cannot into markdown ;_;';
|
||||
return;
|
||||
}
|
||||
|
||||
if (ext)
|
||||
md_plug_err(ex, ext[1]);
|
||||
|
||||
|
||||
@@ -163,7 +163,7 @@ redraw = (function () {
|
||||
dom_sbs.onclick = setsbs;
|
||||
dom_nsbs.onclick = modetoggle;
|
||||
|
||||
onresize();
|
||||
(IE ? modetoggle : onresize)();
|
||||
return onresize;
|
||||
})();
|
||||
|
||||
@@ -933,7 +933,7 @@ var set_lno = (function () {
|
||||
var keydown = function (ev) {
|
||||
if (!ev && window.event) {
|
||||
ev = window.event;
|
||||
if (localStorage.dev_fbw == 1) {
|
||||
if (dev_fbw == 1) {
|
||||
toast.warn(10, 'hello from fallback code ;_;\ncheck console trace');
|
||||
console.error('using window.event');
|
||||
}
|
||||
@@ -1009,7 +1009,7 @@ var set_lno = (function () {
|
||||
md_home(ev.shiftKey);
|
||||
return false;
|
||||
}
|
||||
if (!ev.shiftKey && (ev.code.endsWith("Enter") || kc == 13)) {
|
||||
if (!ev.shiftKey && ((ev.code + '').endsWith("Enter") || kc == 13)) {
|
||||
return md_newline();
|
||||
}
|
||||
if (!ev.shiftKey && kc == 8) {
|
||||
|
||||
@@ -37,12 +37,12 @@ var md_opt = {
|
||||
};
|
||||
|
||||
var lightswitch = (function () {
|
||||
var l = localStorage,
|
||||
drk = l.light != 1,
|
||||
var l = window.localStorage,
|
||||
drk = (l && l.light) != 1,
|
||||
f = function (e) {
|
||||
if (e) drk = !drk;
|
||||
document.documentElement.className = drk? "z":"y";
|
||||
l.light = drk? 0:1;
|
||||
try { l.light = drk? 0:1; } catch (ex) { }
|
||||
};
|
||||
f();
|
||||
return f;
|
||||
@@ -54,3 +54,4 @@ l.light = drk? 0:1;
|
||||
<script src="{{ r }}/.cpr/deps/easymde.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/mde.js?_={{ ts }}"></script>
|
||||
</body></html>
|
||||
|
||||
|
||||
@@ -48,4 +48,5 @@
|
||||
{%- endif %}
|
||||
</body>
|
||||
|
||||
</html>
|
||||
</html>
|
||||
|
||||
|
||||
@@ -110,10 +110,12 @@ var SR = {{ r|tojson }},
|
||||
lang="{{ lang }}",
|
||||
dfavico="{{ favico }}";
|
||||
|
||||
document.documentElement.className=localStorage.cpp_thm||"{{ this.args.theme }}";
|
||||
var STG = window.localStorage;
|
||||
document.documentElement.className = (STG && STG.cpp_thm) || "{{ this.args.theme }}";
|
||||
|
||||
</script>
|
||||
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/splash.js?_={{ ts }}"></script>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
|
||||
@@ -238,10 +238,12 @@ var SR = {{ r|tojson }},
|
||||
lang="{{ lang }}",
|
||||
dfavico="{{ favico }}";
|
||||
|
||||
document.documentElement.className=localStorage.cpp_thm||"{{ args.theme }}";
|
||||
var STG = window.localStorage;
|
||||
document.documentElement.className = (STG && STG.cpp_thm) || "{{ args.theme }}";
|
||||
|
||||
</script>
|
||||
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/svcs.js?_={{ ts }}"></script>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
|
||||
@@ -105,6 +105,9 @@ html {
|
||||
#toast pre {
|
||||
margin: 0;
|
||||
}
|
||||
#toast.hide {
|
||||
display: none;
|
||||
}
|
||||
#toast.vis {
|
||||
right: 1.3em;
|
||||
transform: inherit;
|
||||
@@ -144,6 +147,10 @@ html {
|
||||
#toast.err #toastc {
|
||||
background: #d06;
|
||||
}
|
||||
#toast code {
|
||||
padding: 0 .2em;
|
||||
background: rgba(0,0,0,0.2);
|
||||
}
|
||||
#tth {
|
||||
color: #fff;
|
||||
background: #111;
|
||||
@@ -573,3 +580,11 @@ hr {
|
||||
border: .07em dashed #444;
|
||||
}
|
||||
}
|
||||
|
||||
@media (prefers-reduced-motion) {
|
||||
#toast,
|
||||
#toast a#toastc,
|
||||
#tt {
|
||||
transition: none;
|
||||
}
|
||||
}
|
||||
@@ -431,7 +431,7 @@ function U2pvis(act, btns, uc, st) {
|
||||
if (sread('potato') === null) {
|
||||
btn.click();
|
||||
toast.inf(30, L.u_gotpot);
|
||||
localStorage.removeItem('potato');
|
||||
sdrop('potato');
|
||||
}
|
||||
|
||||
u2f.appendChild(ode);
|
||||
@@ -861,6 +861,7 @@ function up2k_init(subtle) {
|
||||
bcfg_bind(uc, 'multitask', 'multitask', true, null, false);
|
||||
bcfg_bind(uc, 'potato', 'potato', false, set_potato, false);
|
||||
bcfg_bind(uc, 'ask_up', 'ask_up', true, null, false);
|
||||
bcfg_bind(uc, 'umod', 'umod', false, null, false);
|
||||
bcfg_bind(uc, 'u2ts', 'u2ts', !u2ts.endsWith('u'), set_u2ts, false);
|
||||
bcfg_bind(uc, 'fsearch', 'fsearch', false, set_fsearch, false);
|
||||
|
||||
@@ -894,6 +895,7 @@ function up2k_init(subtle) {
|
||||
"bytes": {
|
||||
"total": 0,
|
||||
"hashed": 0,
|
||||
"inflight": 0,
|
||||
"uploaded": 0,
|
||||
"finished": 0
|
||||
},
|
||||
@@ -1332,7 +1334,8 @@ function up2k_init(subtle) {
|
||||
return modal.confirm(msg.join('') + '</ul>', function () {
|
||||
start_actx();
|
||||
up_them(good_files);
|
||||
toast.inf(15, L.u_unpt, L.u_unpt);
|
||||
if (have_up2k_idx)
|
||||
toast.inf(15, L.u_unpt, L.u_unpt);
|
||||
}, null);
|
||||
|
||||
up_them(good_files);
|
||||
@@ -1391,6 +1394,8 @@ function up2k_init(subtle) {
|
||||
entry.rand = true;
|
||||
entry.name = 'a\n' + entry.name;
|
||||
}
|
||||
else if (uc.umod)
|
||||
entry.umod = true;
|
||||
|
||||
if (biggest_file < entry.size)
|
||||
biggest_file = entry.size;
|
||||
@@ -1539,17 +1544,21 @@ function up2k_init(subtle) {
|
||||
if (uc.fsearch)
|
||||
t.push(['u2etat', st.bytes.hashed, st.bytes.hashed, st.time.hashing]);
|
||||
}
|
||||
|
||||
var b_up = st.bytes.inflight + st.bytes.uploaded,
|
||||
b_fin = st.bytes.inflight + st.bytes.finished;
|
||||
|
||||
if (nsend) {
|
||||
st.time.uploading += td;
|
||||
t.push(['u2etau', st.bytes.uploaded, st.bytes.finished, st.time.uploading]);
|
||||
t.push(['u2etau', b_up, b_fin, st.time.uploading]);
|
||||
}
|
||||
if ((nhash || nsend) && !uc.fsearch) {
|
||||
if (!st.bytes.finished) {
|
||||
if (!b_fin) {
|
||||
ebi('u2etat').innerHTML = L.u_etaprep;
|
||||
}
|
||||
else {
|
||||
st.time.busy += td;
|
||||
t.push(['u2etat', st.bytes.finished, st.bytes.finished, st.time.busy]);
|
||||
t.push(['u2etat', b_fin, b_fin, st.time.busy]);
|
||||
}
|
||||
}
|
||||
for (var a = 0; a < t.length; a++) {
|
||||
@@ -2467,6 +2476,8 @@ function up2k_init(subtle) {
|
||||
req.srch = 1;
|
||||
else if (t.rand)
|
||||
req.rand = true;
|
||||
else if (t.umod)
|
||||
req.umod = true;
|
||||
|
||||
xhr.open('POST', t.purl, true);
|
||||
xhr.responseType = 'text';
|
||||
@@ -2533,6 +2544,7 @@ function up2k_init(subtle) {
|
||||
cdr = t.size;
|
||||
|
||||
var orz = function (xhr) {
|
||||
st.bytes.inflight -= xhr.bsent;
|
||||
var txt = unpre((xhr.response && xhr.response.err) || xhr.responseText);
|
||||
if (txt.indexOf('upload blocked by x') + 1) {
|
||||
apop(st.busy.upload, upt);
|
||||
@@ -2577,7 +2589,10 @@ function up2k_init(subtle) {
|
||||
btot = Math.floor(st.bytes.total / 1024 / 1024);
|
||||
|
||||
xhr.upload.onprogress = function (xev) {
|
||||
pvis.prog(t, npart, xev.loaded);
|
||||
var nb = xev.loaded;
|
||||
st.bytes.inflight += nb - xhr.bsent;
|
||||
xhr.bsent = nb;
|
||||
pvis.prog(t, npart, nb);
|
||||
};
|
||||
xhr.onload = function (xev) {
|
||||
try { orz(xhr); } catch (ex) { vis_exh(ex + '', 'up2k.js', '', '', ex); }
|
||||
@@ -2586,6 +2601,8 @@ function up2k_init(subtle) {
|
||||
if (crashed)
|
||||
return;
|
||||
|
||||
st.bytes.inflight -= (xhr.bsent || 0);
|
||||
|
||||
if (!toast.visible)
|
||||
toast.warn(9.98, L.u_cuerr.format(npart, Math.ceil(t.size / chunksize), t.name), t);
|
||||
|
||||
@@ -2602,6 +2619,7 @@ function up2k_init(subtle) {
|
||||
if (xhr.overrideMimeType)
|
||||
xhr.overrideMimeType('Content-Type', 'application/octet-stream');
|
||||
|
||||
xhr.bsent = 0;
|
||||
xhr.responseType = 'text';
|
||||
xhr.send(t.fobj.slice(car, cdr));
|
||||
}
|
||||
@@ -2686,7 +2704,7 @@ function up2k_init(subtle) {
|
||||
|
||||
parallel_uploads = v;
|
||||
if (v == u2j)
|
||||
localStorage.removeItem('nthread');
|
||||
sdrop('nthread');
|
||||
else
|
||||
swrite('nthread', v);
|
||||
|
||||
@@ -2702,6 +2720,9 @@ function up2k_init(subtle) {
|
||||
if (parallel_uploads > 16)
|
||||
parallel_uploads = 16;
|
||||
|
||||
if (parallel_uploads > 7)
|
||||
toast.warn(10, L.u_maxconn);
|
||||
|
||||
obj.value = parallel_uploads;
|
||||
bumpthread({ "target": 1 });
|
||||
}
|
||||
|
||||
@@ -12,9 +12,11 @@ if (window.CGV)
|
||||
|
||||
|
||||
var wah = '',
|
||||
STG = null,
|
||||
NOAC = 'autocorrect="off" autocapitalize="off"',
|
||||
L, tt, treectl, thegrid, up2k, asmCrypto, hashwasm, vbar, marked,
|
||||
CB = '?_=' + Date.now(),
|
||||
T0 = Date.now(),
|
||||
CB = '?_=' + Math.floor(T0 / 1000).toString(36),
|
||||
R = SR.slice(1),
|
||||
RS = R ? "/" + R : "",
|
||||
HALFMAX = 8192 * 8192 * 8192 * 8192,
|
||||
@@ -39,6 +41,16 @@ if (!window.Notification || !Notification.permission)
|
||||
if (!window.FormData)
|
||||
window.FormData = false;
|
||||
|
||||
try {
|
||||
STG = window.localStorage;
|
||||
STG.STG;
|
||||
}
|
||||
catch (ex) {
|
||||
STG = null;
|
||||
if ((ex + '').indexOf('sandbox') < 0)
|
||||
console.log('no localStorage: ' + ex);
|
||||
}
|
||||
|
||||
try {
|
||||
CB = '?' + document.currentScript.src.split('?').pop();
|
||||
|
||||
@@ -145,6 +157,10 @@ catch (ex) {
|
||||
}
|
||||
var crashed = false, ignexd = {}, evalex_fatal = false;
|
||||
function vis_exh(msg, url, lineNo, columnNo, error) {
|
||||
var ekey = url + '\n' + lineNo + '\n' + msg;
|
||||
if (ignexd[ekey] || crashed)
|
||||
return;
|
||||
|
||||
msg = String(msg);
|
||||
url = String(url);
|
||||
|
||||
@@ -160,10 +176,12 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
|
||||
if (url.indexOf(' > eval') + 1 && !evalex_fatal)
|
||||
return; // md timer
|
||||
|
||||
var ekey = url + '\n' + lineNo + '\n' + msg;
|
||||
if (ignexd[ekey] || crashed)
|
||||
if (IE && url.indexOf('prism.js') + 1)
|
||||
return;
|
||||
|
||||
if (url.indexOf('easymde.js') + 1)
|
||||
return; // clicking the preview pane
|
||||
|
||||
if (url.indexOf('deps/marked.js') + 1 && !window.WebAssembly)
|
||||
return; // ff<52
|
||||
|
||||
@@ -202,19 +220,24 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
|
||||
}
|
||||
ignexd[ekey] = true;
|
||||
|
||||
var ls = jcp(localStorage);
|
||||
if (ls.fman_clip)
|
||||
ls.fman_clip = ls.fman_clip.length + ' items';
|
||||
var ls = {},
|
||||
lsk = Object.keys(localStorage),
|
||||
nka = lsk.length,
|
||||
nk = Math.min(200, nka);
|
||||
|
||||
var lsk = Object.keys(ls);
|
||||
lsk.sort();
|
||||
html.push('<p class="b">');
|
||||
for (var a = 0; a < lsk.length; a++) {
|
||||
if (ls[lsk[a]].length > 9000)
|
||||
continue;
|
||||
for (var a = 0; a < nk; a++) {
|
||||
var k = lsk[a],
|
||||
v = localStorage.getItem(k);
|
||||
|
||||
html.push(' <b>' + esc(lsk[a]) + '</b> <code>' + esc(ls[lsk[a]]) + '</code> ');
|
||||
ls[k] = v.length > 256 ? v.slice(0, 32) + '[...' + v.length + 'b]' : v;
|
||||
}
|
||||
|
||||
lsk = Object.keys(ls);
|
||||
lsk.sort();
|
||||
html.push('<p class="b"><b>' + nka + ': </b>');
|
||||
for (var a = 0; a < nk; a++)
|
||||
html.push(' <b>' + esc(lsk[a]) + '</b> <code>' + esc(ls[lsk[a]]) + '</code> ');
|
||||
|
||||
html.push('</p>');
|
||||
}
|
||||
catch (e) { }
|
||||
@@ -276,10 +299,11 @@ function anymod(e, shift_ok) {
|
||||
}
|
||||
|
||||
|
||||
var dev_fbw = sread('dev_fbw');
|
||||
function ev(e) {
|
||||
if (!e && window.event) {
|
||||
e = window.event;
|
||||
if (localStorage.dev_fbw == 1) {
|
||||
if (dev_fbw == 1) {
|
||||
toast.warn(10, 'hello from fallback code ;_;\ncheck console trace');
|
||||
console.error('using window.event');
|
||||
}
|
||||
@@ -370,6 +394,22 @@ catch (ex) {
|
||||
}
|
||||
}
|
||||
|
||||
if (!window.Set)
|
||||
window.Set = function () {
|
||||
var r = this;
|
||||
r.size = 0;
|
||||
r.d = {};
|
||||
r.add = function (k) {
|
||||
if (!r.d[k]) {
|
||||
r.d[k] = 1;
|
||||
r.size++;
|
||||
}
|
||||
};
|
||||
r.has = function (k) {
|
||||
return r.d[k];
|
||||
};
|
||||
};
|
||||
|
||||
// https://stackoverflow.com/a/950146
|
||||
function import_js(url, cb, ecb) {
|
||||
var head = document.head || document.getElementsByTagName('head')[0];
|
||||
@@ -395,6 +435,25 @@ function unsmart(txt) {
|
||||
}
|
||||
|
||||
|
||||
function namesan(txt, win, fslash) {
|
||||
if (win)
|
||||
txt = (txt.
|
||||
replace(/</g, "＜").
|
||||
replace(/>/g, "＞").
|
||||
replace(/:/g, "：").
|
||||
replace(/"/g, """).
|
||||
replace(/\\/g, "＼").
|
||||
replace(/\|/g, "｜").
|
||||
replace(/\?/g, "？").
|
||||
replace(/\*/g, "＊"));
|
||||
|
||||
if (fslash)
|
||||
txt = txt.replace(/\//g, "／");
|
||||
|
||||
return txt;
|
||||
}
|
||||
|
||||
|
||||
var crctab = (function () {
|
||||
var c, tab = [];
|
||||
for (var n = 0; n < 256; n++) {
|
||||
@@ -881,9 +940,16 @@ function jcp(obj) {
|
||||
}
|
||||
|
||||
|
||||
function sdrop(key) {
|
||||
try {
|
||||
STG.removeItem(key);
|
||||
}
|
||||
catch (ex) { }
|
||||
}
|
||||
|
||||
function sread(key, al) {
|
||||
try {
|
||||
var ret = localStorage.getItem(key);
|
||||
var ret = STG.getItem(key);
|
||||
return (!al || has(al, ret)) ? ret : null;
|
||||
}
|
||||
catch (e) {
|
||||
@@ -894,9 +960,9 @@ function sread(key, al) {
|
||||
function swrite(key, val) {
|
||||
try {
|
||||
if (val === undefined || val === null)
|
||||
localStorage.removeItem(key);
|
||||
STG.removeItem(key);
|
||||
else
|
||||
localStorage.setItem(key, val);
|
||||
STG.setItem(key, val);
|
||||
}
|
||||
catch (e) { }
|
||||
}
|
||||
@@ -1057,7 +1123,7 @@ function dl_file(url) {
|
||||
|
||||
function cliptxt(txt, ok) {
|
||||
var fb = function () {
|
||||
console.log('fb');
|
||||
console.log('clip-fb');
|
||||
var o = mknod('input');
|
||||
o.value = txt;
|
||||
document.body.appendChild(o);
|
||||
@@ -1392,15 +1458,23 @@ var toast = (function () {
|
||||
}
|
||||
|
||||
r.hide = function (e) {
|
||||
ev(e);
|
||||
if (this === ebi('toastc'))
|
||||
ev(e);
|
||||
|
||||
unscroll();
|
||||
clearTimeout(te);
|
||||
clmod(obj, 'vis');
|
||||
r.visible = false;
|
||||
r.tag = obj;
|
||||
if (!window.WebAssembly)
|
||||
te = setTimeout(function () {
|
||||
obj.className = 'hide';
|
||||
}, 500);
|
||||
};
|
||||
|
||||
r.show = function (cl, sec, txt, tag) {
|
||||
txt = (txt + '').slice(0, 16384);
|
||||
|
||||
var same = r.visible && txt == r.p_txt && r.p_sec == sec,
|
||||
delta = Date.now() - r.p_t;
|
||||
|
||||
@@ -1558,7 +1632,7 @@ var modal = (function () {
|
||||
};
|
||||
|
||||
var onkey = function (e) {
|
||||
var k = e.code,
|
||||
var k = (e.code || e.key) + '',
|
||||
eok = ebi('modal-ok'),
|
||||
eng = ebi('modal-ng'),
|
||||
ae = document.activeElement;
|
||||
@@ -1573,10 +1647,10 @@ var modal = (function () {
|
||||
return ok(e);
|
||||
}
|
||||
|
||||
if ((k == 'ArrowLeft' || k == 'ArrowRight') && eng && (ae == eok || ae == eng))
|
||||
if ((k == 'ArrowLeft' || k == 'ArrowRight' || k == 'Left' || k == 'Right') && eng && (ae == eok || ae == eng))
|
||||
return (ae == eok ? eng : eok).focus() || ev(e);
|
||||
|
||||
if (k == 'Escape')
|
||||
if (k == 'Escape' || k == 'Esc')
|
||||
return ng(e);
|
||||
}
|
||||
|
||||
@@ -1854,21 +1928,17 @@ var favico = (function () {
|
||||
var b64;
|
||||
try {
|
||||
b64 = btoa(svg ? svg_decl + svg : gx(r.txt));
|
||||
//console.log('f1');
|
||||
}
|
||||
catch (e1) {
|
||||
try {
|
||||
b64 = btoa(gx(encodeURIComponent(r.txt).replace(/%([0-9A-F]{2})/g,
|
||||
function x(m, v) { return String.fromCharCode('0x' + v); })));
|
||||
//console.log('f2');
|
||||
}
|
||||
catch (e2) {
|
||||
try {
|
||||
b64 = btoa(gx(unescape(encodeURIComponent(r.txt))));
|
||||
//console.log('f3');
|
||||
}
|
||||
catch (e3) {
|
||||
//console.log('fe');
|
||||
return;
|
||||
}
|
||||
}
|
||||
@@ -1922,6 +1992,9 @@ function xhrchk(xhr, prefix, e404, lvl, tag) {
|
||||
if (xhr.status < 400 && xhr.status >= 200)
|
||||
return true;
|
||||
|
||||
if (tag === undefined)
|
||||
tag = prefix;
|
||||
|
||||
var errtxt = (xhr.response && xhr.response.err) || xhr.responseText,
|
||||
fun = toast[lvl || 'err'],
|
||||
is_cf = /[Cc]loud[f]lare|>Just a mo[m]ent|#cf-b[u]bbles|Chec[k]ing your br[o]wser|\/chall[e]nge-platform|"chall[e]nge-error|nable Ja[v]aScript and cook/.test(errtxt);
|
||||
|
||||
1013 docs/changelog.md
File diff suppressed because it is too large
@@ -162,8 +162,8 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
|
||||
| PUT | | (binary data) | upload into file at URL |
|
||||
| PUT | `?gz` | (binary data) | compress with gzip and write into file at URL |
|
||||
| PUT | `?xz` | (binary data) | compress with xz and write into file at URL |
|
||||
| mPOST | | `act=bput`, `f=FILE` | upload `FILE` into the folder at URL |
|
||||
| mPOST | `?j` | `act=bput`, `f=FILE` | ...and reply with json |
|
||||
| mPOST | | `f=FILE` | upload `FILE` into the folder at URL |
|
||||
| mPOST | `?j` | `f=FILE` | ...and reply with json |
|
||||
| mPOST | | `act=mkdir`, `name=foo` | create directory `foo` at URL |
|
||||
| POST | `?delete` | | delete URL recursively |
|
||||
| jPOST | `?delete` | `["/foo","/bar"]` | delete `/foo` and `/bar` recursively |
|
||||
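For illustration, here is a minimal client for the plain PUT upload documented in the table above, written against the table rather than the server code; the host, port, volume path, and password are placeholders:

```python
# hedged sketch of "PUT (binary data) -> upload into file at URL";
# auth uses the ?pw= url param mentioned above (Cookie: cppwd=... also works);
# BASE, PW and the paths below are assumptions, not taken from the docs
import urllib.request

BASE = "http://127.0.0.1:3923"  # assumed copyparty address
PW = "foo"                      # example password from the auth note

def put_file(vpath: str, local_path: str) -> int:
    url = "%s/%s?pw=%s" % (BASE, vpath, PW)
    with open(local_path, "rb") as f:
        req = urllib.request.Request(url, data=f.read(), method="PUT")
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    print(put_file("inc/hello.txt", "hello.txt"))
```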
@@ -242,6 +242,7 @@ python3 -m venv .venv
|
||||
pip install jinja2 strip_hints # MANDATORY
|
||||
pip install mutagen # audio metadata
|
||||
pip install pyftpdlib # ftp server
|
||||
pip install partftpy # tftp server
|
||||
pip install impacket # smb server -- disable Windows Defender if you REALLY need this on windows
|
||||
pip install Pillow pyheif-pillow-opener pillow-avif-plugin # thumbnails
|
||||
pip install pyvips # faster thumbnails
|
||||
|
||||
@@ -31,33 +31,33 @@
|
||||
/w # share /w (the docker data volume)
|
||||
accs:
|
||||
rw: * # everyone gets read-access, but
|
||||
rwmda: %su # the group "su" gets read-write-move-delete-admin
|
||||
rwmda: @su # the group "su" gets read-write-move-delete-admin
|
||||
|
||||
|
||||
[/u/${u}] # each user gets their own home-folder at /u/username
|
||||
/w/u/${u} # which will be "u/username" in the docker data volume
|
||||
accs:
|
||||
r: * # read-access for anyone, and
|
||||
rwmda: ${u}, %su # read-write-move-delete-admin for that username + the "su" group
|
||||
rwmda: ${u}, @su # read-write-move-delete-admin for that username + the "su" group
|
||||
|
||||
|
||||
[/u/${u}/priv] # each user also gets a private area at /u/username/priv
|
||||
/w/u/${u}/priv # stored at DATAVOLUME/u/username/priv
|
||||
accs:
|
||||
rwmda: ${u}, %su # read-write-move-delete-admin for that username + the "su" group
|
||||
rwmda: ${u}, @su # read-write-move-delete-admin for that username + the "su" group
|
||||
|
||||
|
||||
[/lounge/${g}] # each group gets their own shared volume
|
||||
/w/lounge/${g} # stored at DATAVOLUME/lounge/groupname
|
||||
accs:
|
||||
r: * # read-access for anyone, and
|
||||
rwmda: %${g}, %su # read-write-move-delete-admin for that group + the "su" group
|
||||
rwmda: @${g}, @su # read-write-move-delete-admin for that group + the "su" group
|
||||
|
||||
|
||||
[/lounge/${g}/priv] # and a private area for each group too
|
||||
/w/lounge/${g}/priv # stored at DATAVOLUME/lounge/groupname/priv
|
||||
accs:
|
||||
rwmda: %${g}, %su # read-write-move-delete-admin for that group + the "su" group
|
||||
rwmda: @${g}, @su # read-write-move-delete-admin for that group + the "su" group
|
||||
|
||||
|
||||
# and create some strategic volumes to prevent anyone from gaining
|
||||
@@ -65,8 +65,8 @@
|
||||
[/u]
|
||||
/w/u
|
||||
accs:
|
||||
rwmda: %su
|
||||
rwmda: @su
|
||||
[/lounge]
|
||||
/w/lounge
|
||||
accs:
|
||||
rwmda: %su
|
||||
rwmda: @su
|
||||
|
||||
@@ -6,7 +6,7 @@ you will definitely need either [copyparty.exe](https://github.com/9001/copypart
|
||||
|
||||
* if you decided to grab `copyparty-sfx.py` instead of the exe you will also need to install the ["Latest Python 3 Release"](https://www.python.org/downloads/windows/)
|
||||
|
||||
then you probably want to download [FFmpeg](https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip) and put `ffmpeg.exe` and `ffprobe.exe` in your PATH (so for example `C:\Windows\System32\`) -- this enables thumbnails, audio transcoding, and making music metadata searchable
|
||||
then you probably want to download [FFmpeg](https://www.gyan.dev/ffmpeg/builds/ffmpeg-git-full.7z) and put `ffmpeg.exe` and `ffprobe.exe` in your PATH (so for example `C:\Windows\System32\`) -- this enables thumbnails, audio transcoding, and making music metadata searchable
|
||||
|
||||
|
||||
## the config file
|
||||
|
||||
@@ -24,6 +24,10 @@ https://github.com/giampaolo/pyftpdlib/
|
||||
C: 2007 Giampaolo Rodola
|
||||
L: MIT
|
||||
|
||||
https://github.com/9001/partftpy
|
||||
C: 2010-2021 Michael P. Soulier
|
||||
L: MIT
|
||||
|
||||
https://github.com/nayuki/QR-Code-generator/
|
||||
C: Project Nayuki
|
||||
L: MIT
|
||||
|
||||
@@ -1,3 +1,19 @@
|
||||
this file accidentally got committed at some point, so let's put it to use
|
||||
|
||||
# trivia / lore
|
||||
|
||||
copyparty started as [three separate php projects](https://a.ocv.me/pub/stuff/old-php-projects/); an nginx custom directory listing (which became a php script), and a php music/picture viewer, and an additional php project for resumable uploads:
|
||||
|
||||
* findex -- directory browser / gallery with thumbnails and a music player which sometime back in 2009 had a canvas visualizer grabbing fft data from a flash audio player
|
||||
* findex.mini -- plain-listing fork of findex with streaming zip-download of folders (the js and design should look familiar)
|
||||
* upper and up2k -- up2k being the star of the show and where copyparty's chunked resumable uploads came from
|
||||
|
||||
the first link has screenshots but if that doesn't work there's also a [tar here](https://ocv.me/dev/old-php-projects.tgz)
|
||||
|
||||
----
|
||||
|
||||
below this point is misc useless scribbles
|
||||
|
||||
# up2k.js
|
||||
|
||||
## potato detection
|
||||
|
||||
@@ -24,6 +24,27 @@ gzip -d < .hist/up2k.snap | jq -r '.[].name' | while IFS= read -r f; do wc -c --
|
||||
echo; find -type f | while IFS= read -r x; do printf '\033[A\033[36m%s\033[K\033[0m\n' "$x"; tail -c$((1024*1024)) <"$x" | xxd -a | awk 'NR==1&&/^[0: ]+.{16}$/{next} NR==2&&/^\*$/{next} NR==3&&/^[0f]+: [0 ]+65 +.{16}$/{next} {e=1} END {exit e}' || continue; printf '\033[A\033[31msus:\033[33m %s \033[0m\n\n' "$x"; done
|
||||
|
||||
|
||||
##
|
||||
## sync pics/vids from phone
|
||||
## (takes all files named (IMG|PXL|PANORAMA|Screenshot)_20231224_*)
|
||||
|
||||
cd /storage/emulated/0/DCIM/Camera
|
||||
find -mindepth 1 -maxdepth 1 | sort | cut -c3- > ls
|
||||
url=https://192.168.1.3:3923/rw/pics/Camera/$d/; awk -F_ '!/^[A-Z][A-Za-z]{1,16}_[0-9]{8}[_-]/{next} {d=substr($2,1,6)} !t[d]++{print d}' ls | while read d; do grep -E "^[A-Z][A-Za-z]{1,16}_$d" ls | tr '\n' '\0' | xargs -0 python3 ~/dev/copyparty/bin/u2c.py -td $url --; done
|
||||
|
||||
|
||||
##
|
||||
## convert symlinks to hardlinks (probably safe, no guarantees)
|
||||
|
||||
find -type l | while IFS= read -r lnk; do [ -h "$lnk" ] || { printf 'nonlink: %s\n' "$lnk"; continue; }; dst="$(readlink -f -- "$lnk")"; [ -e "$dst" ] || { printf '???\n%s\n%s\n' "$lnk" "$dst"; continue; }; printf 'relinking:\n %s\n %s\n' "$lnk" "$dst"; rm -- "$lnk"; ln -- "$dst" "$lnk"; done
|
||||
|
||||
|
||||
##
|
||||
## convert hardlinks to symlinks (maybe not as safe? use with caution)
|
||||
|
||||
e=; p=; find -printf '%i %p\n' | awk '{i=$1;sub(/[^ ]+ /,"")} !n[i]++{p[i]=$0;next} {printf "real %s\nlink %s\n",p[i],$0}' | while read cls p; do [ -e "$p" ] || e=1; p="$(realpath -- "$p")" || e=1; [ -e "$p" ] || e=1; [ $cls = real ] && { real="$p"; continue; }; [ $cls = link ] || e=1; [ "$p" ] || e=1; [ $e ] && { echo "ERROR $p"; break; }; printf '\033[36m%s \033[0m -> \033[35m%s\033[0m\n' "$p" "$real"; rm "$p"; ln -s "$real" "$p" || { echo LINK FAILED; break; }; done
|
||||
|
||||
|
||||
##
|
||||
## create a test payload
|
||||
|
||||
|
||||
@@ -200,9 +200,10 @@ symbol legend,
|
||||
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
|
||||
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
|
||||
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ |
|
||||
| serve ftp | █ | | | | | █ | | | | | | █ |
|
||||
| serve ftps | █ | | | | | █ | | | | | | █ |
|
||||
| serve sftp | | | | | | █ | | | | | | █ |
|
||||
| serve ftp (tcp) | █ | | | | | █ | | | | | | █ |
|
||||
| serve ftps (tls) | █ | | | | | █ | | | | | | █ |
|
||||
| serve tftp (udp) | █ | | | | | | | | | | | |
|
||||
| serve sftp (ssh) | | | | | | █ | | | | | | █ |
|
||||
| serve smb/cifs | ╱ | | | | | █ | | | | | | |
|
||||
| serve dlna | | | | | | █ | | | | | | |
|
||||
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ |
|
||||
|
||||
@@ -28,6 +28,7 @@ classifiers = [
|
||||
"Programming Language :: Python :: Implementation :: CPython",
|
||||
"Programming Language :: Python :: Implementation :: Jython",
|
||||
"Programming Language :: Python :: Implementation :: PyPy",
|
||||
"Operating System :: OS Independent",
|
||||
"Environment :: Console",
|
||||
"Environment :: No Input/Output (Daemon)",
|
||||
"Intended Audience :: End Users/Desktop",
|
||||
@@ -48,6 +49,7 @@ thumbnails2 = ["pyvips"]
|
||||
audiotags = ["mutagen"]
|
||||
ftpd = ["pyftpdlib"]
|
||||
ftps = ["pyftpdlib", "pyopenssl"]
|
||||
tftpd = ["partftpy>=0.3.0"]
|
||||
pwhash = ["argon2-cffi"]
|
||||
|
||||
[project.scripts]

@@ -12,6 +12,11 @@ set -euo pipefail
#
# can be adjusted with --hash-mt (but alpine caps out at 5)

fsize=256
nfiles=128
pybin=$(command -v python3 || command -v python)
#pybin=~/.pyenv/versions/nogil-3.9.10-2/bin/python3

[ $# -ge 1 ] || {
echo 'need arg 1: path to copyparty-sfx.py'
echo ' (remaining args will be passed on to copyparty,'
@@ -22,6 +27,8 @@ sfx="$1"
shift
sfx="$(realpath "$sfx" || readlink -e "$sfx" || echo "$sfx")"
awk=$(command -v gawk || command -v awk)
uname -s | grep -E MSYS && win=1 || win=
totalsize=$((fsize*nfiles))

# try to use /dev/shm to avoid hitting filesystems at all,
# otherwise fallback to mktemp which probably uses /tmp
@@ -30,20 +37,24 @@ mkdir $td || td=$(mktemp -d)
trap "rm -rf $td" INT TERM EXIT
cd $td

echo creating 256 MiB testfile in $td
head -c $((1024*1024*256)) /dev/urandom > 1
echo creating $fsize MiB testfile in $td
sz=$((1024*1024*fsize))
head -c $sz /dev/zero | openssl enc -aes-256-ctr -iter 1 -pass pass:k -nosalt 2>/dev/null >1 || true
wc -c 1 | awk '$1=='$sz'{r=1}END{exit 1-r}' || head -c $sz /dev/urandom >1

echo creating 127 symlinks to it
for n in $(seq 2 128); do ln -s 1 $n; done
echo creating $((nfiles-1)) symlinks to it
for n in $(seq 2 $nfiles); do MSYS=winsymlinks:nativestrict ln -s 1 $n; done

echo warming up cache
cat 1 >/dev/null

echo ok lets go
python3 "$sfx" -p39204 -e2dsa --dbd=yolo --exit=idx -lo=t -q "$@"
$pybin "$sfx" -p39204 -e2dsa --dbd=yolo --exit=idx -lo=t -q "$@" && err= || err=$?
[ $win ] && [ $err = 15 ] && err= # sigterm doesn't hook on windows, ah whatever
[ $err ] && echo ERROR $err && exit $err

echo and the results are...
$awk '/1 volumes in / {printf "%s MiB/s\n", 256*128/$(NF-1)}' <t
LC_ALL=C $awk '/1 volumes in / {s=$(NF-1); printf "speed: %.1f MiB/s (time=%.2fs)\n", '$totalsize'/s, s}' <t

echo deleting $td and exiting

@@ -52,16 +63,30 @@ echo deleting $td and exiting

# MiB/s @ cpu or device (copyparty, pythonver, distro/os) // comment

# 3887 @ Ryzen 5 4500U (cpp 1.9.5, nogil 3.9, fedora 39) // --hash-mt=6; laptop
# 3732 @ Ryzen 5 4500U (cpp 1.9.5, py 3.12.1, fedora 39) // --hash-mt=6; laptop
# 3608 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.5, fedora 38) // --hash-mt=6; laptop
# 2726 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.5, fedora 38) // --hash-mt=4 (old-default)
# 2202 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.5, docker-alpine 3.18.3) ??? alpine slow
# 2719 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.2, docker-debian 12.1)

# 7746 @ mbp 2023 m3pro (cpp 1.9.5, py 3.11.7, macos 14.1) // --hash-mt=6
# 6687 @ mbp 2023 m3pro (cpp 1.9.5, py 3.11.7, macos 14.1) // --hash-mt=5 (default)
# 5544 @ Intel i5-12500 (cpp 1.9.5, py 3.11.2, debian 12.0) // --hash-mt=12; desktop
# 5197 @ Ryzen 7 3700X (cpp 1.9.5, py 3.9.18, freebsd 13.2) // --hash-mt=8; 2u server
# 4551 @ mbp 2020 m1 (cpp 1.9.5, py 3.11.7, macos 14.2.1)
# 4190 @ Ryzen 7 5800X (cpp 1.9.5, py 3.11.6, fedora 37) // --hash-mt=8 (vbox-VM on win10-17763.4974)
# 3028 @ Ryzen 7 5800X (cpp 1.9.5, py 3.11.6, fedora 37) // --hash-mt=5 (vbox-VM on win10-17763.4974)
# 2629 @ Ryzen 7 5800X (cpp 1.9.5, py 3.11.7, win10-ltsc-1809-17763.4974) // --hash-mt=5 (default)
# 2576 @ Ryzen 7 5800X (cpp 1.9.5, py 3.11.7, win10-ltsc-1809-17763.4974) // --hash-mt=8 (hello??)
# 2606 @ Ryzen 7 3700X (cpp 1.9.5, py 3.9.18, freebsd 13.2) // --hash-mt=4 (old-default)
# 1436 @ Ryzen 5 5500U (cpp 1.9.5, py 3.11.4, alpine 3.18.3) // nuc
# 1065 @ Pixel 7 (cpp 1.9.5, py 3.11.5, termux 2023-09)
# 945 @ Pi 5B v1.0 (cpp 1.9.5, py 3.11.6, alpine 3.19.0)
# 548 @ Pi 4B v1.5 (cpp 1.9.5, py 3.11.6, debian 11)
# 435 @ Pi 4B v1.5 (cpp 1.9.5, py 3.11.6, alpine 3.19.0)
# 212 @ Pi Zero2W v1.0 (cpp 1.9.5, py 3.11.6, alpine 3.19.0)
# 10.0 @ Pi Zero W v1.1 (cpp 1.9.5, py 3.11.6, alpine 3.19.0)

# notes,
# podman run --rm -it --shm-size 512m --entrypoint /bin/ash localhost/copyparty-min
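
a possible invocation of the benchmark; the script's own filename is not visible in this hunk, so the path below is a placeholder (arg 1 is the sfx, everything after it is passed on to copyparty):

```bash
# hypothetical script path; adjust to wherever the benchmark lives in your checkout
./hash-speed.sh ~/dev/copyparty/dist/copyparty-sfx.py --hash-mt=6
```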

@@ -1,11 +1,11 @@
FROM alpine:3.18
WORKDIR /z
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
ver_hashwasm=4.9.0 \
ver_hashwasm=4.10.0 \
ver_marked=4.3.0 \
ver_dompf=3.0.5 \
ver_dompf=3.0.9 \
ver_mde=2.18.0 \
ver_codemirror=5.65.12 \
ver_codemirror=5.65.16 \
ver_fontawesome=5.13.0 \
ver_prism=1.29.0 \
ver_zopfli=1.0.3
@@ -80,7 +80,7 @@ RUN cd asmcrypto.js-$ver_asmcrypto \

# build hash-wasm
RUN cd hash-wasm \
RUN cd hash-wasm/dist \
&& mv sha512.umd.min.js /z/dist/sha512.hw.js

@@ -1,6 +1,6 @@
all: $(addsuffix .gz, $(wildcard *.js *.css))

%.gz: %
pigz -11 -I 573 $<
pigz -11 -I 2048 $<

# pigz -11 -J 34 -I 100 -F < $< > $@.first
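
for reference, the same recompression can be run by hand outside make; `-11` selects zopfli-based deflate and `-I` sets its iteration count, so raising 573 to 2048 trades longer build time for slightly smaller gzip output:

```bash
# hand-testing on a single asset (filename is just an example); -k keeps the input file
pigz -11 -I 2048 -k browser.js   # writes browser.js.gz next to the original
```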

@@ -79,6 +79,15 @@ or using commandline arguments,
```

# faq

the following advice is best-effort and not guaranteed to be entirely correct

* q: starting a rootless container on debian 12 fails with `failed to register layer: lsetxattr user.overlay.impure /etc: operation not supported`
* a: docker's default rootless configuration on debian is to use the overlay2 storage driver; this does not work. Your options are to replace docker with podman (good choice), or to configure docker to use the `fuse-overlayfs` storage driver
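
a minimal sketch of the second option, assuming rootless docker reads its config from `~/.config/docker/daemon.json` and that debian's `fuse-overlayfs` package is available:

```bash
# switch rootless docker from overlay2 to the fuse-overlayfs storage driver
sudo apt install fuse-overlayfs
mkdir -p ~/.config/docker
echo '{ "storage-driver": "fuse-overlayfs" }' > ~/.config/docker/daemon.json
systemctl --user restart docker
```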


# build the images yourself

basically `./make.sh hclean pull img push` but see [devnotes.md](./devnotes.md)

73
scripts/logpack.sh
Executable file
@@ -0,0 +1,73 @@
#!/bin/bash
set -e

# recompress logs so they decompress faster + save some space;
# * will not recurse into subfolders
# * each file in current folder gets recompressed to zstd; input file is DELETED
# * any xz-compressed logfiles are decompressed before converting to zstd
# * SHOULD ignore and skip files which are currently open; SHOULD be safe to run while copyparty is running


# for files larger than $cutoff, compress with `zstd -T0`
# (otherwise do several files in parallel (scales better))
cutoff=400M


# osx support:
# port install findutils gsed coreutils
command -v gfind >/dev/null &&
command -v gsed >/dev/null &&
command -v gsort >/dev/null && {
find() { gfind "$@"; }
sed() { gsed "$@"; }
sort() { gsort "$@"; }
}

packfun() {
local jobs=$1 fn="$2"
printf '%s\n' "$fn" | grep -qF .zst && return

local of="$(printf '%s\n' "$fn" | sed -r 's/\.(xz|txt)/.zst/')"
[ "$fn" = "$of" ] &&
of="$of.zst"

[ -e "$of" ] &&
echo "SKIP: output file exists: $of" &&
return

lsof -- "$fn" 2>/dev/null | grep -E .. &&
printf "SKIP: file in use: %s\n\n" $fn &&
return

# determine by header; old copyparty versions would produce xz output without .xz names
head -c3 "$fn" | grep -qF 7z &&
cmd="xz -dkc" || cmd="cat"

printf '<%s> T%d: %s\n' "$cmd" $jobs "$of"

$cmd <"$fn" >/dev/null || {
echo "ERROR: uncompress failed: $fn"
return
}

$cmd <"$fn" | zstd --long -19 -T$jobs >"$of"
touch -r "$fn" -- "$of"

cmp <($cmd <"$fn") <(zstd -d <"$of") || {
echo "ERROR: data mismatch: $of"
mv "$of"{,.BAD}
return
}
rm -- "$fn"
}

# do small files in parallel first (in descending size);
# each file can use 4 threads in case the cutoff is poor
export -f packfun
export -f sed 2>/dev/null || true
find -maxdepth 1 -type f -size -$cutoff -printf '%s %p\n' |
sort -nr | sed -r 's`[^ ]+ ``; s`^\./``' | tr '\n' '\0' |
xargs "$@" -0i -P$(nproc) bash -c 'packfun 4 "$@"' _ {}

# then the big ones, letting each file use the whole cpu
for f in *; do packfun 0 "$f"; done
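
a possible way to run it, assuming the logs were written into `/var/log/copyparty` (both paths below are assumptions; adjust to your setup):

```bash
# run from inside the log folder; needs zstd (and xz for any old xz-compressed logs)
cd /var/log/copyparty
bash ~/dev/copyparty/scripts/logpack.sh

# spot-check one of the recompressed files (filename is just an example)
zstd -d < copyparty-1.txt.zst | tail
```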

@@ -77,13 +77,14 @@ function have() {
}

function load_env() {
. buildenv/bin/activate
have setuptools
have wheel
have build
have twine
have jinja2
have strip_hints
. buildenv/bin/activate || return 1
have setuptools &&
have wheel &&
have build &&
have twine &&
have jinja2 &&
have strip_hints &&
return 0 || return 1
}

load_env || {

@@ -26,8 +26,9 @@ help() { exec cat <<'EOF'
# _____________________________________________________________________
# core features:
#
# `no-ftp` saves ~33k by removing the ftp server and filetype detector,
# disabling --ftpd and --magic
# `no-ftp` saves ~30k by removing the ftp server, disabling --ftp
#
# `no-tfp` saves ~10k by removing the tftp server, disabling --tftp
#
# `no-smb` saves ~3.5k by removing the smb / cifs server
#
@@ -114,6 +115,7 @@ while [ ! -z "$1" ]; do
gz) use_gz=1 ; ;;
gzz) shift;use_gzz=$1;use_gz=1; ;;
no-ftp) no_ftp=1 ; ;;
no-tfp) no_tfp=1 ; ;;
no-smb) no_smb=1 ; ;;
no-zm) no_zm=1 ; ;;
no-fnt) no_fnt=1 ; ;;
@@ -165,7 +167,8 @@ necho() {
[ $repack ] && {
old="$tmpdir/pe-copyparty.$(id -u)"
echo "repack of files in $old"
cp -pR "$old/"*{py2,py37,j2,copyparty} .
cp -pR "$old/"*{py2,py37,magic,j2,copyparty} .
cp -pR "$old/"*partftpy . || true
cp -pR "$old/"*ftp . || true
}

@@ -221,6 +224,16 @@ necho() {
mkdir ftp/
mv pyftpdlib ftp/

necho collecting partftpy
f="../build/partftpy-0.3.0.tar.gz"
[ -e "$f" ] ||
(url=https://files.pythonhosted.org/packages/06/ce/531978c831c47f79bc72d5bbb3f12757daf1602d1fffad012305f2d270f6/partftpy-0.3.0.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)

tar -zxf $f
mv partftpy-*/partftpy .
rm -rf partftpy-* partftpy/bin

necho collecting python-magic
v=0.4.27
f="../build/python-magic-$v.tar.gz"
@@ -234,7 +247,6 @@ necho() {
rm -rf python-magic-*
rm magic/compat.py
iawk '/^def _add_compat/{o=1} !o; /^_add_compat/{o=0}' magic/__init__.py
mv magic ftp/ # doesn't provide a version label anyways

# enable this to dynamically remove type hints at startup,
# in case a future python version can use them for performance
@@ -409,8 +421,10 @@ iawk '/^ {0,4}[^ ]/{s=0}/^ {4}def (serve_forever|_loop)/{s=1}!s' ftp/pyftpdlib/s
rm -f ftp/pyftpdlib/{__main__,prefork}.py

[ $no_ftp ] &&
rm -rf copyparty/ftpd.py ftp &&
sed -ri '/\.ftp/d' copyparty/svchub.py
rm -rf copyparty/ftpd.py ftp

[ $no_tfp ] &&
rm -rf copyparty/tftpd.py partftpy

[ $no_smb ] &&
rm -f copyparty/smbd.py
@@ -584,7 +598,7 @@ nf=$(ls -1 "$zdir"/arc.* 2>/dev/null | wc -l)

echo gen tarlist
for d in copyparty j2 py2 py37 ftp; do find $d -type f; done | # strip_hints
for d in copyparty partftpy magic j2 py2 py37 ftp; do find $d -type f || true; done | # strip_hints
sed -r 's/(.*)\.(.*)/\2 \1/' | LC_ALL=C sort |
sed -r 's/([^ ]*) (.*)/\2.\1/' | grep -vE '/list1?$' > list1

@@ -37,7 +37,7 @@ rm -rf $TEMP/pe-copyparty*
python copyparty-sfx.py --version

rm -rf mods; mkdir mods
cp -pR $TEMP/pe-copyparty/copyparty/ $TEMP/pe-copyparty/{ftp,j2}/* mods/
cp -pR $TEMP/pe-copyparty/{copyparty,partftpy}/ $TEMP/pe-copyparty/{ftp,j2}/* mods/
[ $w10 ] && rm -rf mods/{jinja2,markupsafe}

af() { awk "$1" <$2 >tf; mv tf "$2"; }

@@ -1,13 +1,10 @@
f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f84beadd139bfeef4982e40f21ca5d9065cb794eeb0e473e82 altgraph-0.17.4-py2.py3-none-any.whl
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
f298e34356b5590dde7477d7b3a88ad39c622a2bcf3fcd7c53870ce8384dd510f690af81b8f42e121a22d3968a767d2e07595036b2ed7049c8ef4d112bcf3a61 pyinstaller-5.13.2-py3-none-win32.whl
f23615c522ed58b9a05978ba4c69c06224590f3a6adbd8e89b31838b181a57160739ceff1fc2ba6f4239b8fee46f92ce02910b2debda2710558ed42cff1ce3f1 pyinstaller-6.1.0-py3-none-win_amd64.whl
5747b3b119629c4cf956f0eaa85f29218bb3680d3a4a262fa6e976e56b35067302e153d2c0a001505f2cb642b1f78752567889b3b82e342d6cd29aac8b70e92e pyinstaller_hooks_contrib-2023.10-py2.py3-none-any.whl
f042aabe6cca2ae368180eaf313dd58f9ee96384c0ac1064aefe24a9e0e7e9cd6efa74eacb125d51a8feb61eaf200bc84812ab4d90c08fe33ef315eb2d9e6c30 pyinstaller_hooks_contrib-2024.1-py2.py3-none-any.whl
749a473646c6d4c7939989649733d4c7699fd1c359c27046bf5bc9c070d1a4b8b986bbc65f60d7da725baf16dbfdd75a4c2f5bb8335f2cb5685073f5fee5c2d1 pywin32_ctypes-0.2.2-py3-none-any.whl
6e0d854040baff861e1647d2bece7d090bc793b2bd9819c56105b94090df54881a6a9b43ebd82578cd7c76d47181571b671e60672afd9def389d03c9dae84fcf setuptools-68.2.2-py3-none-any.whl
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
8d16a967a0a7872a7575b1005cf66915deacda6ee8611fbb52f42fc3e3beb2f901a5140c942a5d146bd412b92bfa9cbadd82beeba83df6d70930c6dc26608a5b upx-4.1.0-win32.zip
# u2c (win7)
f3390290b896019b2fa169932390e4930d1c03c014e1f6db2405ca2eb1f51f5f5213f725885853805b742997b0edb369787e5c0069d217bc4e8b957f847f58b6 certifi-2023.11.17-py3-none-any.whl
904eb57b13bea80aea861de86987e618665d37fa9ea0856e0125a9ba767a53e5064de0b9c4735435a2ddf4f16f7f7d2c75a682e1de83d9f57922bdca8e29988c charset_normalizer-3.3.0-cp37-cp37m-win32.whl
@@ -18,15 +15,19 @@ b795abb26ba2f04f1afcfb196f21f638014b26c8186f8f488f1c2d91e8e0220962fbd259dbc9c387
91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz
c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl
016a8cbd09384f1a9a44cb0e8274df75a8bcb2f3966bb5d708c62145289efaa5db98f75256c97e4f8046735ce2e529fbb076f284a46cdb716e89a75660200ad9 pip-23.2.1-py3-none-any.whl
f298e34356b5590dde7477d7b3a88ad39c622a2bcf3fcd7c53870ce8384dd510f690af81b8f42e121a22d3968a767d2e07595036b2ed7049c8ef4d112bcf3a61 pyinstaller-5.13.2-py3-none-win32.whl
6bb73cc2db795c59c92f2115727f5c173cacc9465af7710db9ff2f2aec2d73130d0992d0f16dcb3fac222dc15c0916562d0813b2337401022020673a4461df3d python-3.7.9-amd64.exe
500747651c87f59f2436c5ab91207b5b657856e43d10083f3ce27efb196a2580fadd199a4209519b409920c562aaaa7dcbdfb83ed2072a43eaccae6e2d056f31 python-3.7.9.exe
2e04acff170ca3bbceeeb18489c687126c951ec0bfd53cccfb389ba8d29a4576c1a9e8f2e5ea26c84dd21bfa2912f4e71fa72c1e2653b71e34afc0e65f1722d4 upx-4.2.2-win32.zip
68e1b618d988be56aaae4e2eb92bc0093627a00441c1074ebe680c41aa98a6161e52733ad0c59888c643a33fe56884e4f935178b2557fbbdd105e92e0d993df6 windows6.1-kb2533623-x64.msu
479a63e14586ab2f2228208116fc149ed8ee7b1e4ff360754f5bda4bf765c61af2e04b5ef123976623d04df4976b7886e0445647269da81436bd0a7b5671d361 windows6.1-kb2533623-x86.msu
ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a25067512b8f75538c67c987cf3944bfa0229e3cb677e2fb81e763e zipp-3.10.0-py3-none-any.whl
# win10
00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
7f8f4daa4f4f2dbf24cdd534b2952ee3fba6334eb42b37465ccda3aa1cccc3d6204aa6bfffb8a83bf42ec59c702b5b5247d4c8ee0d4df906334ae53072ef8c4c MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl
e3e2e6bd511dec484dd0292f4c46c55c88a885eabf15413d53edea2dd4a4dbae1571735b9424f78c0cd7f1082476a8259f31fd3f63990f726175470f636df2b3 Jinja2-3.1.3-py3-none-any.whl
e21495f1d473d855103fb4a243095b498ec90eb68776b0f9b48e994990534f7286c0292448e129c507e5d70409f8a05cca58b98d59ce2a815993d0a873dfc480 MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl
8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
656015f5cc2c04aa0653ee5609c39a7e5f0b6a58c84fe26b20bd070c52d20b4effb810132f7fb771168483e9fd975cc3302837dd7a1a687ee058b0460c857cc4 packaging-23.2-py3-none-any.whl
6401616fdfdd720d1aaa9a0ed1398d00664b28b6d84517dff8d1f9c416452610c6afa64cfb012a78e61d1cf4f6d0784eca6e7610957859e511f15bc6f3b3bd53 Pillow-10.1.0-cp311-cp311-win_amd64.whl
2e6a57bab45b5a825a2073780c73980cbf5aafd99dc3b28660ea3f5f658f04668cd0f01c7de0bb79e362ff4e3b8f01dd4f671d3a2e054d3071baefdcf0b0e4ba python-3.11.7-amd64.exe
424e20dc7263a31d524307bc39ed755a9dd82f538086fff68d98dd97e236c9b00777a8ac2e3853081b532b0e93cef44983e74d0ab274877440e8b7341b19358a pillow-10.2.0-cp311-cp311-win_amd64.whl
533b1aec21439032cf13084d84c4d862e41835a0468f34fef36c5d7cb9cf106a030826ac2e95c9e860f623f6a55ea58548f749c31594f388207d0809dc0859b5 pyinstaller-6.4.0-py3-none-win_amd64.whl
e6bdbae1affd161e62fc87407c912462dfe875f535ba9f344d0c4ade13715c947cd3ae832eff60f1bad4161938311d06ac8bc9b52ef203f7b0d9de1409f052a5 python-3.11.8-amd64.exe
729dc52f1a02bc6274d012ce33f534102975a828cba11f6029600ea40e2d23aefeb07bf4ae19f9621d0565dd03eb2635bbb97d45fb692c1f756315e8c86c5255 upx-4.2.2-win64.zip
@@ -17,19 +17,19 @@ uname -s | grep NT-10 && w10=1 || {
fns=(
altgraph-0.17.4-py2.py3-none-any.whl
pefile-2023.2.7-py3-none-any.whl
pyinstaller_hooks_contrib-2023.10-py2.py3-none-any.whl
pyinstaller_hooks_contrib-2024.1-py2.py3-none-any.whl
pywin32_ctypes-0.2.2-py3-none-any.whl
setuptools-68.2.2-py3-none-any.whl
upx-4.1.0-win32.zip
)
[ $w10 ] && fns+=(
pyinstaller-6.1.0-py3-none-win_amd64.whl
Jinja2-3.1.2-py3-none-any.whl
MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl
pyinstaller-6.4.0-py3-none-win_amd64.whl
Jinja2-3.1.3-py3-none-any.whl
MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl
mutagen-1.47.0-py3-none-any.whl
packaging-23.2-py3-none-any.whl
Pillow-10.1.0-cp311-cp311-win_amd64.whl
python-3.11.7-amd64.exe
pillow-10.2.0-cp311-cp311-win_amd64.whl
python-3.11.8-amd64.exe
upx-4.2.2-win64.zip
)
[ $w7 ] && fns+=(
pyinstaller-5.13.2-py3-none-win32.whl
@@ -38,6 +38,7 @@ fns=(
idna-3.4-py3-none-any.whl
requests-2.28.2-py3-none-any.whl
urllib3-1.26.14-py2.py3-none-any.whl
upx-4.2.2-win32.zip
)
[ $w7 ] && fns+=(
future-0.18.2.tar.gz

@@ -54,6 +54,7 @@ copyparty/sutil.py,
copyparty/svchub.py,
copyparty/szip.py,
copyparty/tcpsrv.py,
copyparty/tftpd.py,
copyparty/th_cli.py,
copyparty/th_srv.py,
copyparty/u2idx.py,

@@ -262,7 +262,7 @@ def unpack():
final = opj(top, name)
san = opj(final, "copyparty/up2k.py")
for suf in range(0, 9001):
withpid = "{}.{}.{}".format(name, os.getpid(), suf)
withpid = "%s.%d.%s" % (name, os.getpid(), suf)
mine = opj(top, withpid)
if not ofe(mine):
break
@@ -285,8 +285,8 @@ def unpack():

ck = hashfile(tar)
if ck != CKSUM:
t = "\n\nexpected {} ({} byte)\nobtained {} ({} byte)\nsfx corrupt"
raise Exception(t.format(CKSUM, SIZE, ck, sz))
t = "\n\nexpected %s (%d byte)\nobtained %s (%d byte)\nsfx corrupt"
raise Exception(t % (CKSUM, SIZE, ck, sz))

with tarfile.open(tar, "r:bz2") as tf:
# this is safe against traversal

36
scripts/test/tftp.sh
Executable file
@@ -0,0 +1,36 @@
#!/bin/bash
set -ex

# PYTHONPATH=.:~/dev/partftpy/ taskset -c 0 python3 -m copyparty -v srv::r -v srv/junk:junk:A --tftp 3969

get_src=~/dev/copyparty/srv/palette.flac
get_fn=${get_src##*/}

put_src=~/Downloads/102.zip
put_dst=~/dev/copyparty/srv/junk/102.zip

cd /dev/shm

echo curl get 1428 v4; curl --tftp-blksize 1428 tftp://127.0.0.1:3969/$get_fn | cmp $get_src || exit 1
echo curl get 1428 v6; curl --tftp-blksize 1428 tftp://[::1]:3969/$get_fn | cmp $get_src || exit 1

echo curl put 1428 v4; rm -f $put_dst && curl --tftp-blksize 1428 -T $put_src tftp://127.0.0.1:3969/junk/ && cmp $put_src $put_dst || exit 1
echo curl put 1428 v6; rm -f $put_dst && curl --tftp-blksize 1428 -T $put_src tftp://[::1]:3969/junk/ && cmp $put_src $put_dst || exit 1

echo atftp get 1428; rm -f $get_fn && ~/src/atftp/atftp --option "blksize 1428" -g -r $get_fn 127.0.0.1 3969 && cmp $get_fn $get_src || exit 1

echo atftp put 1428; rm -f $put_dst && ~/src/atftp/atftp --option "blksize 1428" 127.0.0.1 3969 -p -l $put_src -r junk/102.zip && cmp $put_src $put_dst || exit 1

echo tftp-hpa get; rm -f $put_dst && tftp -v -m binary 127.0.0.1 3969 -c get $get_fn && cmp $get_src $get_fn || exit 1

echo tftp-hpa put; rm -f $put_dst && tftp -v -m binary 127.0.0.1 3969 -c put $put_src junk/102.zip && cmp $put_src $put_dst || exit 1

echo curl get 512; curl tftp://127.0.0.1:3969/$get_fn | cmp $get_src || exit 1

echo curl put 512; rm -f $put_dst && curl -T $put_src tftp://127.0.0.1:3969/junk/ && cmp $put_src $put_dst || exit 1

echo atftp get 512; rm -f $get_fn && ~/src/atftp/atftp -g -r $get_fn 127.0.0.1 3969 && cmp $get_fn $get_src || exit 1

echo atftp put 512; rm -f $put_dst && ~/src/atftp/atftp 127.0.0.1 3969 -p -l $put_src -r junk/102.zip && cmp $put_src $put_dst || exit 1

echo nice

4
setup.py
@@ -84,7 +84,7 @@ args = {
"version": about["__version__"],
"description": (
"Portable file server with accelerated resumable uploads, "
+ "deduplication, WebDAV, FTP, zeroconf, media indexer, "
+ "deduplication, WebDAV, FTP, TFTP, zeroconf, media indexer, "
+ "video thumbnails, audio transcoding, and write-only folders"
),
"long_description": long_description,
@@ -111,6 +111,7 @@ args = {
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: Jython",
"Programming Language :: Python :: Implementation :: PyPy",
"Operating System :: OS Independent",
"Environment :: Console",
"Environment :: No Input/Output (Daemon)",
"Intended Audience :: End Users/Desktop",
@@ -140,6 +141,7 @@ args = {
"audiotags": ["mutagen"],
"ftpd": ["pyftpdlib"],
"ftps": ["pyftpdlib", "pyopenssl"],
"tftpd": ["partftpy>=0.3.0"],
"pwhash": ["argon2-cffi"],
},
"entry_points": {"console_scripts": ["copyparty = copyparty.__main__:main"]},

@@ -11,8 +11,8 @@ import unittest

from copyparty.authsrv import AuthSrv
from copyparty.httpcli import HttpCli
from copyparty.up2k import Up2k
from copyparty.u2idx import U2idx
from copyparty.up2k import Up2k
from tests import util as tu
from tests.util import Cfg


@@ -43,8 +43,8 @@ if MACOS:

from copyparty.__init__ import E
from copyparty.__main__ import init_E
from copyparty.util import FHC, Garda, Unrecv
from copyparty.u2idx import U2idx
from copyparty.util import FHC, Garda, Unrecv

init_E(E)

@@ -110,7 +110,7 @@ class Cfg(Namespace):
def __init__(self, a=None, v=None, c=None, **ka0):
ka = {}

ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw rand smb srch_dbg stats th_no_crop vague_403 vc ver xdev xlink xvol"
ex = "daw dav_auth dav_inf dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp ed emp exp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_lifetime no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw q rand smb srch_dbg stats vague_403 vc ver xdev xlink xvol"
ka.update(**{k: False for k in ex.split()})

ex = "dotpart dotsrch no_dhash no_fastboot no_rescan no_sendfile no_voldump re_dhash plain_ip"
@@ -152,10 +152,13 @@ class Cfg(Namespace):
mte={"a": True},
mth={},
mtp=[],
rm_retry="0/0",
s_wr_sz=512 * 1024,
sort="href",
srch_hits=99999,
th_crop="y",
th_size="320x256",
th_x3="n",
u2sort="s",
u2ts="c",
unpost=600,
@@ -242,6 +245,7 @@ class VHttpConn(object):
self.log_func = log
self.log_src = "a"
self.mutex = threading.Lock()
self.u2mutex = threading.Lock()
self.nbyte = 0
self.nid = None
self.nreq = -1