Compare commits
95 Commits
| SHA1 |
|---|
| fae5a36e6f |
| fc9b729fc2 |
| 8620ae5bb7 |
| 01a851da28 |
| 309895d39d |
| 7ac0803ded |
| cae5ccea62 |
| 3768cb4723 |
| 0815dce4c1 |
| a62f744a18 |
| 163e3fce46 |
| e76a50cb9d |
| 72fc76ef48 |
| c47047c30d |
| 3b8f66c0d5 |
| aa96a1acdc |
| 91cafc2511 |
| 23ca00bba8 |
| a75a992951 |
| 4fbd6853f4 |
| 71c3ad63b3 |
| e1324e37a5 |
| a996a09bba |
| 18c763ac08 |
| 3d9fb753ba |
| 714fd1811a |
| 4364581705 |
| ba02c9cc12 |
| 11eefaf968 |
| 5a968f9e47 |
| 6420c4bd03 |
| 0f9877201b |
| 9ba2dec9b2 |
| ae9cfea939 |
| cadaeeeace |
| 767696185b |
| c1efd227b7 |
| a50d0563c3 |
| e5641ddd16 |
| 700111ffeb |
| b8adeb824a |
| 30cc9defcb |
| 61875bd773 |
| 30905c6f5d |
| 9986136dfb |
| 1c0d978979 |
| 0a0364e9f8 |
| 3376fbde1a |
| ac21fa7782 |
| c1c8dc5e82 |
| 5a38311481 |
| 9f8edb7f32 |
| c5a6ac8417 |
| 50e01d6904 |
| 9b46291a20 |
| 14497b2425 |
| f7ceae5a5f |
| c9492d16ba |
| 9fb9ada3aa |
| db0abbfdda |
| e7f0009e57 |
| 4444f0f6ff |
| 418842d2d3 |
| cafe53c055 |
| 7673beef72 |
| b28bfe64c0 |
| 135ece3fbd |
| bd3640d256 |
| fc0405c8f3 |
| 7df890d964 |
| 8341041857 |
| 1b7634932d |
| 48a3898aa6 |
| 5d13ebb4ac |
| 015b87ee99 |
| 0a48acf6be |
| 2b6a3afd38 |
| 18aa82fb2f |
| f5407b2997 |
| 474d5a155b |
| afcd98b794 |
| 4f80e44ff7 |
| 406e413594 |
| 033b50ae1b |
| bee26e853b |
| 04a1f7040e |
| f9d5bb3b29 |
| ca0cd04085 |
| 999ee2e7bc |
| 1ff7f968e8 |
| 3966266207 |
| d03e96a392 |
| 4c843c6df9 |
| 0896c5295c |
| cc0c9839eb |
107 README.md
@@ -71,6 +71,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [themes](#themes)
* [complete examples](#complete-examples)
* [reverse-proxy](#reverse-proxy) - running copyparty next to other websites
* [prometheus](#prometheus) - metrics/stats can be enabled
* [packages](#packages) - the party might be closer than you think
* [arch package](#arch-package) - now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
* [fedora package](#fedora-package) - now [available on copr-pypi](https://copr.fedorainfracloud.org/coprs/g/copr/PyPI/)
@@ -109,7 +110,7 @@ just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/

* or install through pypi: `python3 -m pip install --user -U copyparty`
* or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
* or install [on arch](#arch-package) ╱ [on NixOS](#nixos-module) ╱ [through nix](#nix-package)
* or install [on arch](#arch-package) ╱ [on fedora](#fedora-package) ╱ [on NixOS](#nixos-module) ╱ [through nix](#nix-package)
* or if you are on android, [install copyparty in termux](#install-on-android)
* or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too
* docker has all deps built-in, so skip this step:
@@ -284,8 +285,11 @@ server notes:
* Android: music playback randomly stops due to [battery usage settings](#fix-unreliable-playback-on-android)

* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
* *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume
* "future" because `AudioContext` can't maintain a stable playback speed in the current iOS version (15.7), maybe one day...
* `AudioContext` will probably never be a viable workaround as apple introduces new issues faster than they fix current ones

* iPhones: the preload feature (in the media-player-options tab) can cause a tiny audio glitch 20sec before the end of each song, but disabling it may cause worse iOS bugs to appear instead
* just a hunch, but disabling preloading may cause playback to stop entirely, or possibly mess with bluetooth speakers
* tried to add a tooltip regarding this but looks like apple broke my tooltips

* Windows: folders cannot be accessed if the name ends with `.`
* python or windows bug
@@ -295,6 +299,7 @@ server notes:

* VirtualBox: sqlite throws `Disk I/O Error` when running in a VM and the up2k database is in a vboxsf
* use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db inside the vm instead
* also happens on mergerfs, so put the db elsewhere

* Ubuntu: dragging files from certain folders into firefox or chrome is impossible
* due to snap security policies -- see `snap connections firefox` for the allowlist, `removable-media` permits all of `/mnt` and `/media` apparently
@@ -323,6 +328,12 @@ upgrade notes
* can I make copyparty download a file to my server if I give it a URL?
* yes, using [hooks](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/wget.py)

* i want to learn python and/or programming and am considering looking at the copyparty source code in that occasion
```bash
_| _ __ _ _|_
(_| (_) | | (_) |_
```
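following up on the URL-download question in the FAQ above, a rough sketch of how the wget hook might be wired up; the flag letters follow the msg-log.py hook documented later in this diff, and the exact combination wget.py expects may differ:

```bash
# hypothetical invocation: `j` passes the message (the URL) to the hook as json,
# `t3600` would kill the download after an hour; check bin/hooks/wget.py for the
# flags it actually documents before relying on this
python copyparty-sfx.py --xm j,t3600,bin/hooks/wget.py
```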


# accounts and volumes

@@ -346,6 +357,7 @@ permissions:
* `d` (delete): delete files/folders
* `g` (get): only download files, cannot see folder contents or zip/tar
* `G` (upget): same as `g` except uploaders get to see their own filekeys (see `fk` in examples below)
* `h` (html): same as `g` except folders return their index.html, and filekeys are not necessary for index.html
* `a` (admin): can see uploader IPs, config-reload

examples:
@@ -504,10 +516,16 @@ select which type of archive you want in the `[⚙️] config` tab:
| name | url-suffix | description |
|--|--|--|
| `tar` | `?tar` | plain gnutar, works great with `curl \| tar -xv` |
| `pax` | `?tar=pax` | pax-format tar, futureproof, not as fast |
| `tgz` | `?tar=gz` | gzip compressed gnu-tar (slow), for `curl \| tar -xvz` |
| `txz` | `?tar=xz` | gnu-tar with xz / lzma compression (v.slow) |
| `zip` | `?zip=utf8` | works everywhere, glitchy filenames on win7 and older |
| `zip_dos` | `?zip` | traditional cp437 (no unicode) to fix glitchy filenames |
| `zip_crc` | `?zip=crc` | cp437 with crc32 computed early for truly ancient software |

* gzip default level is `3` (0=fast, 9=best), change with `?tar=gz:9`
* xz default level is `1` (0=fast, 9=best), change with `?tar=xz:9`
* bz2 default level is `2` (1=fast, 9=best), change with `?tar=bz2:9`
* hidden files (dotfiles) are excluded unless `-ed`
* `up2k.db` and `dir.txt` is always excluded
* `zip_crc` will take longer to download since the server has to read each file twice
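a quick sketch of the url-suffixes above (server address and volume name are made up):

```bash
# grab a folder as a level-9 gzip tar and unpack it on the fly
curl "http://127.0.0.1:3923/music/?tar=gz:9" | tar -xvz

# or save it as a plain utf8 zip for windows clients
curl "http://127.0.0.1:3923/music/?zip=utf8" -o music.zip
```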
@@ -518,6 +536,10 @@ you can also zip a selection of files or folders by clicking them in the browser



cool trick: download a folder by appending url-params `?tar&opus` to transcode all audio files (except aac|m4a|mp3|ogg|opus|wma) to opus before they're added to the archive
* super useful if you're 5 minutes away from takeoff and realize you don't have any music on your phone but your server only has flac files and downloading those will burn through all your data + there wouldn't be enough time anyways
* and url-params `&j` / `&w` produce jpeg/webm thumbnails/spectrograms instead of the original audio/video/images
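and a sketch of that on-the-fly transcode trick, assuming `&j` combines with `?tar` the same way `&opus` does (hostname/folder are placeholders):

```bash
# folder download where the flac/wav/etc files are converted to opus first
curl "http://127.0.0.1:3923/flacs/?tar&opus" | tar -xv

# same idea, but grab jpeg thumbnails/spectrograms instead of the originals
curl "http://127.0.0.1:3923/flacs/?tar&j" | tar -xv
```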


## uploading

@@ -700,7 +722,7 @@ open the `[🎺]` media-player-settings tab to configure it,
* `[loop]` keeps looping the folder
* `[next]` plays into the next folder
* transcode:
* `[flac]` convers `flac` and `wav` files into opus
* `[flac]` converts `flac` and `wav` files into opus
* `[aac]` converts `aac` and `m4a` files into opus
* `[oth]` converts all other known formats into opus
* `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
@@ -715,6 +737,8 @@ can also boost the volume in general, or increase/decrease stereo width (like [c

has the convenient side-effect of reducing the pause between songs, so gapless albums play better with the eq enabled (just make it flat)

not available on iPhones / iPads because AudioContext currently breaks background audio playback on iOS (15.7.8)


### fix unreliable playback on android

@@ -886,15 +910,13 @@ unsafe, slow, not recommended for wan, enable with `--smb` for read-only or `--

click the [connect](http://127.0.0.1:3923/?hc) button in the control-panel to see connection instructions for windows, linux, macos

dependencies: `python3 -m pip install --user -U impacket==0.10.0`
dependencies: `python3 -m pip install --user -U impacket==0.11.0`
* newer versions of impacket will hopefully work just fine but there is monkeypatching so maybe not
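a minimal setup sketch based on the notes above and the `--smb-port` hint in the option descriptions further down in this diff; the iptables redirect is an assumption about how you'd forward port 445 yourself, not something copyparty configures for you:

```bash
python3 -m pip install --user -U impacket==0.11.0

# run unprivileged on a high port, then redirect the well-known SMB port to it
python3 -m copyparty --smb --smb-port 3945 -a ed:wark -v /mnt/nas::rwmda,ed
sudo iptables -t nat -A PREROUTING -p tcp --dport 445 -j REDIRECT --to-ports 3945
```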

some **BIG WARNINGS** specific to SMB/CIFS, in decreasing importance:
* not entirely confident that read-only is read-only
* the smb backend is not fully integrated with vfs, meaning there could be security issues (path traversal). Please use `--smb-port` (see below) and [prisonparty](./bin/prisonparty.sh)
* account passwords work per-volume as expected, but account permissions are coalesced; all accounts have read-access to all volumes, and if a single account has write-access to some volume then all other accounts also do
* if no accounts have write-access to a specific volume, or if `--smbw` is not set, then writing to that volume from smb *should* be impossible
* will be fixed once [impacket v0.11.0](https://github.com/SecureAuthCorp/impacket/commit/d923c00f75d54b972bca573a211a82f09b55261a) is released
* account passwords work per-volume as expected, and so does account permissions (read/write/move/delete), but `--smbw` must be given to allow write-access from smb
* [shadowing](#shadowing) probably works as expected but no guarantees

and some minor issues,
@@ -905,7 +927,7 @@ and some minor issues,
* win10 onwards does not allow connecting anonymously / without accounts
* on windows, creating a new file through rightclick --> new --> textfile throws an error due to impacket limitations -- hit OK and F5 to get your file
* python3 only
* slow
* slow (the builtin webdav support in windows is 5x faster, and rclone-webdav is 30x faster)

known client bugs:
* on win7 only, `--smb1` is much faster than smb2 (default) because it keeps rescanning folders on smb2
@@ -1003,6 +1025,9 @@ you can also set transaction limits which apply per-IP and per-volume, but these
* `:c,maxn=250,3600` allows 250 files over 1 hour from each IP (tracked per-volume)
* `:c,maxb=1g,300` allows 1 GiB total over 5 minutes from each IP (tracked per-volume)

notes:
* `vmaxb` and `vmaxn` requires either the `e2ds` volflag or `-e2dsa` global-option
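for example, the two per-IP limits above attached to a single upload volume (paths are invented):

```bash
# 250 files per hour and 1 GiB per 5 minutes from each IP, tracked per-volume;
# add -e2dsa (or the e2ds volflag) if you also want the vmaxb/vmaxn volume caps
python3 -m copyparty -v /mnt/nas/inc:inc:rw:c,maxn=250,3600:c,maxb=1g,300
```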


## compress uploads

@@ -1225,6 +1250,7 @@ you can either:
* if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below

some reverse proxies (such as [Caddy](https://caddyserver.com/)) can automatically obtain a valid https/tls certificate for you, and some support HTTP/2 and QUIC which could be a nice speed boost
* **warning:** nginx-QUIC is still experimental and can make uploads much slower, so HTTP/2 is recommended for now

example webserver configs:

@@ -1232,6 +1258,51 @@ example webserver configs:
* [apache2 config](contrib/apache/copyparty.conf) -- location-based
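a rough sketch of the copyparty side when proxying on a location (the webserver side is in the contrib configs linked above); the flag names are taken from the option descriptions later in this diff, the values are illustrative:

```bash
# reachable as https://example.com/files/ behind the proxy;
# only trust the x-forwarded-for header when the connection comes from localhost
python3 -m copyparty --rp-loc=/files --xff-hdr=x-forwarded-for --xff-src=127.
```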


## prometheus

metrics/stats can be enabled at URL `/.cpr/metrics` for grafana / prometheus / etc (openmetrics 1.0.0)

must be enabled with `--stats` since it reduces startup time a tiny bit, and you probably want `-e2dsa` too

the endpoint is only accessible by `admin` accounts, meaning the `a` in `rwmda` in the following example commandline: `python3 -m copyparty -a ed:wark -v /mnt/nas::rwmda,ed --stats -e2dsa`
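a quick way to sanity-check the endpoint for that example commandline before pointing prometheus at it, assuming the `?pw=` query-parameter works here the same way the cors note further down describes:

```bash
# should print openmetrics text if --stats is enabled and `ed` has the `a` permission
curl "http://127.0.0.1:3923/.cpr/metrics?pw=wark"
```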

follow a guide for setting up `node_exporter` except have it read from copyparty instead; example `/etc/prometheus/prometheus.yml` below

```yaml
scrape_configs:
  - job_name: copyparty
    metrics_path: /.cpr/metrics
    basic_auth:
      password: wark
    static_configs:
      - targets: ['192.168.123.1:3923']
```

currently the following metrics are available,
* `cpp_uptime_seconds`
* `cpp_bans` number of banned IPs

and these are available per-volume only:
* `cpp_disk_size_bytes` total HDD size
* `cpp_disk_free_bytes` free HDD space

and these are per-volume and `total`:
* `cpp_vol_bytes` size of all files in volume
* `cpp_vol_files` number of files
* `cpp_dupe_bytes` disk space presumably saved by deduplication
* `cpp_dupe_files` number of dupe files
* `cpp_unf_bytes` currently unfinished / incoming uploads

some of the metrics have additional requirements to function correctly,
* `cpp_vol_*` requires either the `e2ds` volflag or `-e2dsa` global-option

the following options are available to disable some of the metrics:
* `--nos-hdd` disables `cpp_disk_*` which can prevent spinning up HDDs
* `--nos-vol` disables `cpp_vol_*` which reduces server startup time
* `--nos-dup` disables `cpp_dupe_*` which reduces the server load caused by prometheus queries
* `--nos-unf` disables `cpp_unf_*` for no particular purpose


# packages

the party might be closer than you think
@@ -1513,6 +1584,7 @@ below are some tweaks roughly ordered by usefulness:
* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
* lots of connections (many users or heavy clients)
* simultaneous downloads and uploads saturating a 20gbps connection
* if `-e2d` is enabled, `-j2` gives 4x performance for directory listings; `-j4` gives 16x

...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
* using [pypy](https://www.pypy.org/) instead of [cpython](https://www.python.org/) *can* be 70% faster for some workloads, but slower for many others
@@ -1544,6 +1616,7 @@ some notes on hardening
* set `--rproxy 0` if your copyparty is directly facing the internet (not through a reverse-proxy)
* cors doesn't work right otherwise
* if you allow anonymous uploads or otherwise don't trust the contents of a volume, you can prevent XSS with volflag `nohtml`
* this returns html documents as plaintext, and also disables markdown rendering

safety profiles:

@@ -1557,9 +1630,9 @@ safety profiles:
* `--unpost 0`, `--no-del`, `--no-mv` disables all move/delete support
* `--hardlink` creates hardlinks instead of symlinks when deduplicating uploads, which is less maintenance
* however note if you edit one file it will also affect the other copies
* `--vague-401` returns a "404 not found" instead of "401 unauthorized" which is a common enterprise meme
* `--vague-403` returns a "404 not found" instead of "401 unauthorized" which is a common enterprise meme
* `--ban-404=50,60,1440` ban client for 1440min (24h) if they hit 50 404's in 60min
* **NB:** will ban anyone who enables up2k turbo
* `--turbo=-1` to force-disable turbo-mode in the uploader which could otherwise hit the 404-ban
* `--nih` removes the server hostname from directory listings

* option `-sss` is a shortcut for the above plus:
@@ -1574,6 +1647,8 @@ other misc notes:
* combine this with volflag `c,fk` to generate filekeys (per-file accesskeys); users which have full read-access will then see URLs with `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
* the default filekey entropy is fairly small so give `--fk-salt` around 30 characters if you want filekeys longer than 16 chars
* permissions `wG` lets users upload files and receive their own filekeys, still without being able to see other uploads
* permission `h` instead of `r` makes copyparty behave like a traditional webserver with directory listing/index disabled, returning index.html instead
* compatibility with filekeys: index.html itself can be retrieved without the correct filekey, but all other files are protected
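a possible volume layout for the filekey notes above (paths, account and salt are invented, and it assumes permission groups can be chained on a volume the way the accounts-and-volumes section describes):

```bash
# `ed` gets full access and sees ?k=... filekeys in listings;
# anonymous visitors can upload into /inc but only receive links to their own files
python3 -m copyparty -a ed:wark --fk-salt 'some-long-random-string-pls-change' \
  -v /mnt/nas/inc:inc:rwmda,ed:wG:c,fk
```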


## gotchas
@@ -1581,10 +1656,12 @@ other misc notes:
behavior that might be unexpected

* users without read-access to a folder can still see the `.prologue.html` / `.epilogue.html` / `README.md` contents, for the purpose of showing a description on how to use the uploader for example
* users can submit `<script>`s which autorun for other visitors in a few ways;
* users can submit `<script>`s which autorun (in a sandbox) for other visitors in a few ways;
* uploading a `README.md` -- avoid with `--no-readme`
* renaming `some.html` to `.epilogue.html` -- avoid with either `--no-logues` or `--no-dot-ren`
* the directory-listing embed is sandboxed (so any malicious scripts can't do any damage) but the markdown editor is not
* the directory-listing embed is sandboxed (so any malicious scripts can't do any damage) but the markdown editor is not 100% safe, see below
* markdown documents can contain html and `<script>`s; attempts are made to prevent scripts from executing (unless `-emp` is specified) but this is not 100% bulletproof, so setting the `nohtml` volflag is still the safest choice
* or eliminate the problem entirely by only giving write-access to trustworthy people :^)
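tying the mitigations in this list together, a locked-down anonymous-upload volume might look something like this (made-up paths, flags as named above):

```bash
# plaintext-only rendering for uploaded html/markdown, plus no logues/readmes at all
python3 -m copyparty --no-readme --no-logues \
  -v /mnt/nas/dropbox:dropbox:w:c,nohtml
```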


## cors
@@ -1675,7 +1752,7 @@ enable [thumbnails](#thumbnails) of...
* **JPEG XL pictures:** `pyvips` or `ffmpeg`

enable [smb](#smb-server) support (**not** recommended):
* `impacket==0.10.0`
* `impacket==0.11.0`

`pyvips` gives higher quality thumbnails than `Pillow` and is 320% faster, using 270% more ram: `sudo apt install libvips42 && python3 -m pip install --user -U pyvips`

@@ -1709,7 +1786,7 @@ can be convenient on machines where installing python is problematic, however is

* dangerous: [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is compatible with [windows7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png), which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and should never be exposed to the internet (LAN is fine)

* dangerous and deprecated: [copyparty-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
* dangerous and deprecated: [copyparty-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.8.7/copyparty-winpe64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless

meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) instead relies on your system python which gives better performance and will stay safe as long as you keep your python install up-to-date

115 bin/hooks/msg-log.py Executable file
@@ -0,0 +1,115 @@
#!/usr/bin/env python
# coding: utf-8
# vim:ts=4:sw=4:softtabstop=4:smarttab:expandtab
from __future__ import print_function, unicode_literals

import json
import os
import sys
import time
from datetime import datetime


"""
use copyparty as a dumb messaging server / guestbook thing;
initially contributed by @clach04 in https://github.com/9001/copyparty/issues/35 (thanks!)

Sample usage:

python copyparty-sfx.py --xm j,bin/hooks/msg-log.py

Where:

xm = execute on message-to-server-log
j = provide message information as json; not just the text - this script REQUIRES json
t10 = timeout and kill download after 10 secs
"""


# output filename
FILENAME = os.environ.get("COPYPARTY_MESSAGE_FILENAME", "") or "README.md"

# set True to write in descending order (newest message at top of file);
# note that this becomes very slow/expensive as the file gets bigger
DESCENDING = True

# the message template; the following parameters are provided by copyparty and can be referenced below:
# 'ap' = absolute filesystem path where the message was posted
# 'vp' = virtual path (URL 'path') where the message was posted
# 'mt' = 'at' = unix-timestamp when the message was posted
# 'datetime' = ISO-8601 time when the message was posted
# 'sz' = message size in bytes
# 'host' = the server hostname which the user was accessing (URL 'host')
# 'user' = username (if logged in), otherwise '*'
# 'txt' = the message text itself
# (uncomment the print(msg_info) to see if additional information has been introduced by copyparty since this was written)
TEMPLATE = """
🕒 %(datetime)s, 👤 %(user)s @ %(ip)s
%(txt)s
"""


def write_ascending(filepath, msg_text):
    with open(filepath, "a", encoding="utf-8", errors="replace") as outfile:
        outfile.write(msg_text)


def write_descending(filepath, msg_text):
    lockpath = filepath + ".lock"
    got_it = False
    for _ in range(16):
        try:
            os.mkdir(lockpath)
            got_it = True
            break
        except:
            time.sleep(0.1)
            continue

    if not got_it:
        return sys.exit(1)

    try:
        oldpath = filepath + ".old"
        os.rename(filepath, oldpath)
        with open(oldpath, "r", encoding="utf-8", errors="replace") as infile, open(
            filepath, "w", encoding="utf-8", errors="replace"
        ) as outfile:
            outfile.write(msg_text)
            while True:
                buf = infile.read(4096)
                if not buf:
                    break
                outfile.write(buf)
    finally:
        try:
            os.unlink(oldpath)
        except:
            pass
        os.rmdir(lockpath)


def main(argv=None):
    if argv is None:
        argv = sys.argv

    msg_info = json.loads(sys.argv[1])
    # print(msg_info)

    dt = datetime.utcfromtimestamp(msg_info["at"])
    msg_info["datetime"] = dt.strftime("%Y-%m-%d, %H:%M:%S")

    msg_text = TEMPLATE % msg_info

    filepath = os.path.join(msg_info["ap"], FILENAME)

    if DESCENDING and os.path.exists(filepath):
        write_descending(filepath, msg_text)
    else:
        write_ascending(filepath, msg_text)

    print(msg_text)


if __name__ == "__main__":
    main()
@@ -37,6 +37,10 @@ def main():
    if "://" not in url:
        url = "https://" + url

    proto = url.split("://")[0].lower()
    if proto not in ("http", "https", "ftp", "ftps"):
        raise Exception("bad proto {}".format(proto))

    os.chdir(inf["ap"])

    name = url.split("?")[0].split("/")[-1]
@@ -7,6 +7,7 @@ set -e
|
||||
# linux/alpine: requires gcc g++ make cmake patchelf {python3,ffmpeg,fftw,libsndfile}-dev py3-{wheel,pip} py3-numpy{,-dev}
|
||||
# linux/debian: requires libav{codec,device,filter,format,resample,util}-dev {libfftw3,python3,libsndfile1}-dev python3-{numpy,pip} vamp-{plugin-sdk,examples} patchelf cmake
|
||||
# linux/fedora: requires gcc gcc-c++ make cmake patchelf {python3,ffmpeg,fftw,libsndfile}-devel python3-numpy vamp-plugin-sdk qm-vamp-plugins
|
||||
# linux/arch: requires gcc make cmake patchelf python3 ffmpeg fftw libsndfile python-{numpy,wheel,pip,setuptools}
|
||||
# win64: requires msys2-mingw64 environment
|
||||
# macos: requires macports
|
||||
#
|
||||
@@ -227,15 +228,16 @@ install_vamp() {
|
||||
cd "$td"
|
||||
echo '#include <vamp-sdk/Plugin.h>' | g++ -x c++ -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
|
||||
printf '\033[33mcould not find the vamp-sdk, building from source\033[0m\n'
|
||||
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2588/vamp-plugin-sdk-2.9.0.tar.gz)
|
||||
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2691/vamp-plugin-sdk-2.10.0.tar.gz)
|
||||
sha512sum -c <(
|
||||
echo "7ef7f837d19a08048b059e0da408373a7964ced452b290fae40b85d6d70ca9000bcfb3302cd0b4dc76cf2a848528456f78c1ce1ee0c402228d812bd347b6983b -"
|
||||
) <vamp-plugin-sdk-2.9.0.tar.gz
|
||||
tar -xf vamp-plugin-sdk-2.9.0.tar.gz
|
||||
echo "153b7f2fa01b77c65ad393ca0689742d66421017fd5931d216caa0fcf6909355fff74706fabbc062a3a04588a619c9b515a1dae00f21a57afd97902a355c48ed -"
|
||||
) <vamp-plugin-sdk-2.10.0.tar.gz
|
||||
tar -xf vamp-plugin-sdk-2.10.0.tar.gz
|
||||
rm -- *.tar.gz
|
||||
ls -al
|
||||
cd vamp-plugin-sdk-*
|
||||
./configure --prefix=$HOME/pe/vamp-sdk
|
||||
printf '%s\n' "int main(int argc, char **argv) { return 0; }" > host/vamp-simple-host.cpp
|
||||
./configure --disable-programs --prefix=$HOME/pe/vamp-sdk
|
||||
make -j1 install
|
||||
}
|
||||
|
||||
@@ -250,8 +252,9 @@ install_vamp() {
|
||||
rm -- *.tar.gz
|
||||
cd beatroot-vamp-v1.0
|
||||
[ -e ~/pe/vamp-sdk ] &&
|
||||
sed -ri 's`^(CFLAGS :=.*)`\1 -I'$HOME'/pe/vamp-sdk/include`' Makefile.linux
|
||||
make -f Makefile.linux -j4 LDFLAGS=-L$HOME/pe/vamp-sdk/lib
|
||||
sed -ri 's`^(CFLAGS :=.*)`\1 -I'$HOME'/pe/vamp-sdk/include`' Makefile.linux ||
|
||||
sed -ri 's`^(CFLAGS :=.*)`\1 -I/usr/include/vamp-sdk`' Makefile.linux
|
||||
make -f Makefile.linux -j4 LDFLAGS="-L$HOME/pe/vamp-sdk/lib -L/usr/lib64"
|
||||
# /home/ed/vamp /home/ed/.vamp /usr/local/lib/vamp
|
||||
mkdir ~/vamp
|
||||
cp -pv beatroot-vamp.* ~/vamp/
|
||||
|
||||
@@ -65,6 +65,10 @@ def main():
    if "://" not in url:
        url = "https://" + url

    proto = url.split("://")[0].lower()
    if proto not in ("http", "https", "ftp", "ftps"):
        raise Exception("bad proto {}".format(proto))

    os.chdir(fdir)

    name = url.split("?")[0].split("/")[-1]
23 bin/u2c.py
@@ -1,8 +1,8 @@
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
S_VERSION = "1.9"
|
||||
S_BUILD_DT = "2023-05-07"
|
||||
S_VERSION = "1.10"
|
||||
S_BUILD_DT = "2023-08-15"
|
||||
|
||||
"""
|
||||
u2c.py: upload to copyparty
|
||||
@@ -14,6 +14,7 @@ https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py
|
||||
- if something breaks just try again and it'll autoresume
|
||||
"""
|
||||
|
||||
import re
|
||||
import os
|
||||
import sys
|
||||
import stat
|
||||
@@ -39,7 +40,7 @@ except:
|
||||
|
||||
try:
|
||||
import requests
|
||||
except ImportError:
|
||||
except ImportError as ex:
|
||||
if EXE:
|
||||
raise
|
||||
elif sys.version_info > (2, 7):
|
||||
@@ -50,7 +51,7 @@ except ImportError:
|
||||
m = "\n ERROR: need these:\n" + "\n".join(m) + "\n"
|
||||
m += "\n for f in *.whl; do unzip $f; done; rm -r *.dist-info\n"
|
||||
|
||||
print(m.format(sys.executable))
|
||||
print(m.format(sys.executable), "\nspecifically,", ex)
|
||||
sys.exit(1)
|
||||
|
||||
|
||||
@@ -411,10 +412,11 @@ def walkdir(err, top, seen):
|
||||
err.append((ap, str(ex)))
|
||||
|
||||
|
||||
def walkdirs(err, tops):
|
||||
def walkdirs(err, tops, excl):
|
||||
"""recursive statdir for a list of tops, yields [top, relpath, stat]"""
|
||||
sep = "{0}".format(os.sep).encode("ascii")
|
||||
if not VT100:
|
||||
excl = excl.replace("/", r"\\")
|
||||
za = []
|
||||
for td in tops:
|
||||
try:
|
||||
@@ -431,6 +433,8 @@ def walkdirs(err, tops):
|
||||
za = [x.replace(b"/", b"\\") for x in za]
|
||||
tops = za
|
||||
|
||||
ptn = re.compile(excl.encode("utf-8") or b"\n")
|
||||
|
||||
for top in tops:
|
||||
isdir = os.path.isdir(top)
|
||||
if top[-1:] == sep:
|
||||
@@ -443,6 +447,8 @@ def walkdirs(err, tops):
|
||||
|
||||
if isdir:
|
||||
for ap, inf in walkdir(err, top, []):
|
||||
if ptn.match(ap):
|
||||
continue
|
||||
yield stop, ap[len(stop) :].lstrip(sep), inf
|
||||
else:
|
||||
d, n = top.rsplit(sep, 1)
|
||||
@@ -654,7 +660,7 @@ class Ctl(object):
|
||||
nfiles = 0
|
||||
nbytes = 0
|
||||
err = []
|
||||
for _, _, inf in walkdirs(err, ar.files):
|
||||
for _, _, inf in walkdirs(err, ar.files, ar.x):
|
||||
if stat.S_ISDIR(inf.st_mode):
|
||||
continue
|
||||
|
||||
@@ -696,7 +702,7 @@ class Ctl(object):
|
||||
if ar.te:
|
||||
req_ses.verify = ar.te
|
||||
|
||||
self.filegen = walkdirs([], ar.files)
|
||||
self.filegen = walkdirs([], ar.files, ar.x)
|
||||
self.recheck = [] # type: list[File]
|
||||
|
||||
if ar.safe:
|
||||
@@ -1097,6 +1103,7 @@ source file/folder selection uses rsync syntax, meaning that:
ap.add_argument("-v", action="store_true", help="verbose")
ap.add_argument("-a", metavar="PASSWORD", help="password or $filepath")
ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
ap.add_argument("-x", type=unicode, metavar="REGEX", default="", help="skip file if filesystem-abspath matches REGEX, example: '.*/\.hist/.*'")
ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
ap.add_argument("--version", action="store_true", help="show version and exit")
|
||||
|
||||
@@ -1113,7 +1120,7 @@ source file/folder selection uses rsync syntax, meaning that:
|
||||
ap.add_argument("-j", type=int, metavar="THREADS", default=4, help="parallel connections")
|
||||
ap.add_argument("-J", type=int, metavar="THREADS", default=hcores, help="num cpu-cores to use for hashing; set 0 or 1 for single-core hashing")
|
||||
ap.add_argument("-nh", action="store_true", help="disable hashing while uploading")
|
||||
ap.add_argument("-ns", action="store_true", help="no status panel (for slow consoles)")
|
||||
ap.add_argument("-ns", action="store_true", help="no status panel (for slow consoles and macos)")
|
||||
ap.add_argument("--safe", action="store_true", help="use simple fallback approach")
|
||||
ap.add_argument("-z", action="store_true", help="ZOOMIN' (skip uploading files if they exist at the destination with the ~same last-modified timestamp, so same as yolo / turbo with date-chk but even faster)")
|
||||
|
||||
|
||||
@@ -26,8 +26,8 @@ a {
|
||||
<script>
|
||||
|
||||
var a = document.getElementById('redir'),
|
||||
proto = window.location.protocol.indexOf('https') === 0 ? 'https' : 'http',
|
||||
loc = window.location.hostname || '127.0.0.1',
|
||||
proto = location.protocol.indexOf('https') === 0 ? 'https' : 'http',
|
||||
loc = location.hostname || '127.0.0.1',
|
||||
port = a.getAttribute('href').split(':').pop().split('/')[0],
|
||||
url = proto + '://' + loc + ':' + port + '/';
|
||||
|
||||
@@ -35,7 +35,7 @@ a.setAttribute('href', url);
|
||||
document.getElementById('desc').innerHTML = 'redirecting to';
|
||||
|
||||
setTimeout(function() {
|
||||
window.location.href = url;
|
||||
location.href = url;
|
||||
}, 500);
|
||||
|
||||
</script>
|
||||
|
||||
@@ -34,6 +34,8 @@ server {
|
||||
proxy_set_header Host $host;
|
||||
proxy_set_header X-Real-IP $remote_addr;
|
||||
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
||||
# NOTE: with cloudflare you want this instead:
|
||||
#proxy_set_header X-Forwarded-For $http_cf_connecting_ip;
|
||||
proxy_set_header X-Forwarded-Proto $scheme;
|
||||
proxy_set_header Connection "Keep-Alive";
|
||||
}
|
||||
|
||||
@@ -138,7 +138,8 @@ in {
|
||||
"d" (delete): permanently delete files and folders
|
||||
"g" (get): download files, but cannot see folder contents
|
||||
"G" (upget): "get", but can see filekeys of their own uploads
|
||||
"a" (upget): can see uploader IPs, config-reload
|
||||
"h" (html): "get", but folders return their index.html
|
||||
"a" (admin): can see uploader IPs, config-reload
|
||||
|
||||
For example: "rwmd"
|
||||
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
# Maintainer: icxes <dev.null@need.moe>
|
||||
pkgname=copyparty
|
||||
pkgver="1.8.6"
|
||||
pkgver="1.9.6"
|
||||
pkgrel=1
|
||||
pkgdesc="Portable file sharing hub"
|
||||
arch=("any")
|
||||
@@ -20,7 +20,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
|
||||
)
|
||||
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
|
||||
backup=("etc/${pkgname}.d/init" )
|
||||
sha256sums=("a37aacc30b9bec375ff6e7815fd763ec555b9bfbd70415aefdd18552c6491faa")
|
||||
sha256sums=("e849da36fe21f14078a8935407d1aefd41efa13d021cd490d2fa51269294ecac")
|
||||
|
||||
build() {
|
||||
cd "${srcdir}/${pkgname}-${pkgver}"
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
{
|
||||
"url": "https://github.com/9001/copyparty/releases/download/v1.8.6/copyparty-sfx.py",
|
||||
"version": "1.8.6",
|
||||
"hash": "sha256-yTcMW4QVf1QH8jfYpn5BdG5LXilcrmakdbTk9NsVTGE="
|
||||
"url": "https://github.com/9001/copyparty/releases/download/v1.9.6/copyparty-sfx.py",
|
||||
"version": "1.9.6",
|
||||
"hash": "sha256-EO7CLN7XQ2a9Pw2fgKbbtQY9LACZ+LsW+HjuYsn7X8Y="
|
||||
}
|
||||
@@ -492,6 +492,7 @@ def get_sects():
|
||||
"d" (delete): permanently delete files and folders
|
||||
"g" (get): download files, but cannot see folder contents
|
||||
"G" (upget): "get", but can see filekeys of their own uploads
|
||||
"h" (html): "get", but folders return their index.html
|
||||
"a" (admin): can see uploader IPs, config-reload
|
||||
|
||||
too many volflags to list here, see --help-flags
|
||||
@@ -716,6 +717,40 @@ def get_sects():
|
||||
"""
|
||||
),
|
||||
],
|
||||
[
|
||||
"zm",
|
||||
"mDNS debugging",
|
||||
dedent(
|
||||
"""
|
||||
the mDNS protocol is multicast-based, which means there are thousands
|
||||
of fun and interesting ways for it to break unexpectedly
|
||||
|
||||
things to check if it does not work at all:
|
||||
|
||||
* is there a firewall blocking port 5353 on either the server or client?
|
||||
(for example, clients may be able to send queries to copyparty,
|
||||
but the replies could get lost)
|
||||
|
||||
* is multicast accidentally disabled on either the server or client?
|
||||
(look for mDNS log messages saying "new client on [...]")
|
||||
|
||||
* the router/switch must be multicast and igmp capable
|
||||
|
||||
things to check if it works for a while but then it doesn't:
|
||||
|
||||
* is there a firewall blocking port 5353 on either the server or client?
|
||||
(copyparty may be unable to see the queries from the clients, but the
|
||||
clients may still be able to see the initial unsolicited announce,
|
||||
so it works for about 2 minutes after startup until TTL expires)
|
||||
|
||||
* does the client have multiple IPs on its interface, and some of the
|
||||
IPs are in subnets which the copyparty server is not a member of?
|
||||
|
||||
for both of the above intermittent issues, try --zm-spam 30
|
||||
(not spec-compliant but nothing will mind)
|
||||
"""
|
||||
),
|
||||
],
|
||||
]
|
||||
|
||||
|
||||
@@ -781,7 +816,7 @@ def add_upload(ap):
|
||||
ap2.add_argument("--magic", action="store_true", help="enable filetype detection on nameless uploads (volflag=magic)")
|
||||
ap2.add_argument("--df", metavar="GiB", type=float, default=0, help="ensure GiB free disk space by rejecting upload requests")
|
||||
ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
|
||||
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m0\033[0m] = off and warn if enabled, [\033[32m1\033[0m] = off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
|
||||
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
|
||||
ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; [\033[32ms\033[0m]=smallest-first, [\033[32mn\033[0m]=alphabetical, [\033[32mfs\033[0m]=force-s, [\033[32mfn\033[0m]=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
|
||||
ap2.add_argument("--write-uplog", action="store_true", help="write POST reports to textfiles in working-directory")
|
||||
|
||||
@@ -791,7 +826,9 @@ def add_network(ap):
|
||||
ap2.add_argument("-i", metavar="IP", type=u, default="::", help="ip to bind (comma-sep.), default: all IPv4 and IPv6")
|
||||
ap2.add_argument("-p", metavar="PORT", type=u, default="3923", help="ports to bind (comma/range)")
|
||||
ap2.add_argument("--ll", action="store_true", help="include link-local IPv4/IPv6 even if the NIC has routable IPs (breaks some mdns clients)")
|
||||
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; [\033[32m0\033[0m]=tcp, [\033[32m1\033[0m]=origin (first x-fwd), [\033[32m2\033[0m]=cloudflare, [\033[32m3\033[0m]=nginx, [\033[32m-1\033[0m]=closest proxy")
|
||||
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; [\033[32m0\033[0m]=tcp, [\033[32m1\033[0m]=origin (first x-fwd, unsafe), [\033[32m2\033[0m]=outermost-proxy, [\033[32m3\033[0m]=second-proxy, [\033[32m-1\033[0m]=closest-proxy")
|
||||
ap2.add_argument("--xff-hdr", metavar="NAME", type=u, default="x-forwarded-for", help="if reverse-proxied, which http header to read the client's real ip from (argument must be lowercase, but not the actual header)")
|
||||
ap2.add_argument("--xff-src", metavar="IP", type=u, default="127., ::1", help="comma-separated list of trusted reverse-proxy IPs; only accept the real-ip header (--xff-hdr) if the incoming connection is from an IP starting with either of these. Can be disabled with [\033[32many\033[0m] if you are behind cloudflare (or similar) and are using --xff-hdr=cf-connecting-ip (or similar)")
|
||||
ap2.add_argument("--rp-loc", metavar="PATH", type=u, default="", help="if reverse-proxying on a location instead of a dedicated domain/subdomain, provide the base location here (eg. /foo/bar)")
|
||||
if ANYWIN:
|
||||
ap2.add_argument("--reuseaddr", action="store_true", help="set reuseaddr on listening sockets on windows; allows rapid restart of copyparty at the expense of being able to accidentally start multiple instances")
|
||||
@@ -846,7 +883,7 @@ def add_zeroconf(ap):
|
||||
|
||||
|
||||
def add_zc_mdns(ap):
|
||||
ap2 = ap.add_argument_group("Zeroconf-mDNS options")
|
||||
ap2 = ap.add_argument_group("Zeroconf-mDNS options; also see --help-zm")
|
||||
ap2.add_argument("--zm", action="store_true", help="announce the enabled protocols over mDNS (multicast DNS-SD) -- compatible with KDE, gnome, macOS, ...")
|
||||
ap2.add_argument("--zm-on", metavar="NETS", type=u, default="", help="enable zeroconf ONLY on the comma-separated list of subnets and/or interface names/indexes")
|
||||
ap2.add_argument("--zm-off", metavar="NETS", type=u, default="", help="disable zeroconf on the comma-separated list of subnets and/or interface names/indexes")
|
||||
@@ -860,8 +897,9 @@ def add_zc_mdns(ap):
|
||||
ap2.add_argument("--zm-lf", metavar="PATH", type=u, default="", help="link a specific folder for ftp shares")
|
||||
ap2.add_argument("--zm-ls", metavar="PATH", type=u, default="", help="link a specific folder for smb shares")
|
||||
ap2.add_argument("--zm-mnic", action="store_true", help="merge NICs which share subnets; assume that same subnet means same network")
|
||||
ap2.add_argument("--zm-msub", action="store_true", help="merge subnets on each NIC -- always enabled for ipv6 -- reduces network load, but gnome-gvfs clients may stop working")
|
||||
ap2.add_argument("--zm-msub", action="store_true", help="merge subnets on each NIC -- always enabled for ipv6 -- reduces network load, but gnome-gvfs clients may stop working, and clients cannot be in subnets that the server is not")
|
||||
ap2.add_argument("--zm-noneg", action="store_true", help="disable NSEC replies -- try this if some clients don't see copyparty")
|
||||
ap2.add_argument("--zm-spam", metavar="SEC", type=float, default=0, help="send unsolicited announce every SEC; useful if clients have IPs in a subnet which doesn't overlap with the server")
|
||||
|
||||
|
||||
def add_zc_ssdp(ap):
|
||||
@@ -896,12 +934,13 @@ def add_webdav(ap):
|
||||
|
||||
def add_smb(ap):
|
||||
ap2 = ap.add_argument_group('SMB/CIFS options')
|
||||
ap2.add_argument("--smb", action="store_true", help="enable smb (read-only) -- this requires running copyparty as root on linux and macos unless --smb-port is set above 1024 and your OS does port-forwarding from 445 to that.\n\033[1;31mWARNING:\033[0m this protocol is dangerous! Never expose to the internet. Account permissions are coalesced; if one account has write-access to a volume, then all accounts do.")
|
||||
ap2.add_argument("--smb", action="store_true", help="enable smb (read-only) -- this requires running copyparty as root on linux and macos unless --smb-port is set above 1024 and your OS does port-forwarding from 445 to that.\n\033[1;31mWARNING:\033[0m this protocol is dangerous! Never expose to the internet!")
|
||||
ap2.add_argument("--smbw", action="store_true", help="enable write support (please dont)")
|
||||
ap2.add_argument("--smb1", action="store_true", help="disable SMBv2, only enable SMBv1 (CIFS)")
|
||||
ap2.add_argument("--smb-port", metavar="PORT", type=int, default=445, help="port to listen on -- if you change this value, you must NAT from TCP:445 to this port using iptables or similar")
|
||||
ap2.add_argument("--smb-nwa-1", action="store_true", help="disable impacket#1433 workaround (truncate directory listings to 64kB)")
|
||||
ap2.add_argument("--smb-nwa-2", action="store_true", help="disable impacket workaround for filecopy globs")
|
||||
ap2.add_argument("--smba", action="store_true", help="small performance boost: disable per-account permissions, enables account coalescing instead (if one user has write/delete-access, then everyone does)")
|
||||
ap2.add_argument("--smbv", action="store_true", help="verbose")
|
||||
ap2.add_argument("--smbvv", action="store_true", help="verboser")
|
||||
ap2.add_argument("--smbvvv", action="store_true", help="verbosest")
|
||||
@@ -924,7 +963,16 @@ def add_hooks(ap):
|
||||
ap2.add_argument("--xbd", metavar="CMD", type=u, action="append", help="execute CMD before a file delete")
|
||||
ap2.add_argument("--xad", metavar="CMD", type=u, action="append", help="execute CMD after a file delete")
|
||||
ap2.add_argument("--xm", metavar="CMD", type=u, action="append", help="execute CMD on message")
|
||||
ap2.add_argument("--xban", metavar="CMD", type=u, action="append", help="execute CMD if someone gets banned (pw/404)")
|
||||
ap2.add_argument("--xban", metavar="CMD", type=u, action="append", help="execute CMD if someone gets banned (pw/404/403/url)")
|
||||
|
||||
|
||||
def add_stats(ap):
|
||||
ap2 = ap.add_argument_group('grafana/prometheus metrics endpoint')
|
||||
ap2.add_argument("--stats", action="store_true", help="enable openmetrics at /.cpr/metrics for admin accounts")
|
||||
ap2.add_argument("--nos-hdd", action="store_true", help="disable disk-space metrics (used/free space)")
|
||||
ap2.add_argument("--nos-vol", action="store_true", help="disable volume size metrics (num files, total bytes, vmaxb/vmaxn)")
|
||||
ap2.add_argument("--nos-dup", action="store_true", help="disable dupe-files metrics (good idea; very slow)")
|
||||
ap2.add_argument("--nos-unf", action="store_true", help="disable unfinished-uploads metrics")
|
||||
|
||||
|
||||
def add_yolo(ap):
|
||||
@@ -940,17 +988,19 @@ def add_optouts(ap):
|
||||
ap2.add_argument("--no-dav", action="store_true", help="disable webdav support")
|
||||
ap2.add_argument("--no-del", action="store_true", help="disable delete operations")
|
||||
ap2.add_argument("--no-mv", action="store_true", help="disable move/rename operations")
|
||||
ap2.add_argument("-nth", action="store_true", help="no title hostname; don't show --name in <title>")
|
||||
ap2.add_argument("-nih", action="store_true", help="no info hostname -- don't show in UI")
|
||||
ap2.add_argument("-nid", action="store_true", help="no info disk-usage -- don't show in UI")
|
||||
ap2.add_argument("-nb", action="store_true", help="no powered-by-copyparty branding in UI")
|
||||
ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
|
||||
ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
|
||||
ap2.add_argument("--no-lifetime", action="store_true", help="disable automatic deletion of uploads after a certain time (as specified by the 'lifetime' volflag)")
|
||||
|
||||
|
||||
def add_safety(ap):
|
||||
ap2 = ap.add_argument_group('safety options')
|
||||
ap2.add_argument("-s", action="count", default=0, help="increase safety: Disable thumbnails / potentially dangerous software (ffmpeg/pillow/vips), hide partial uploads, avoid crawlers.\n └─Alias of\033[32m --dotpart --no-thumb --no-mtag-ff --no-robots --force-js")
|
||||
ap2.add_argument("-ss", action="store_true", help="further increase safety: Prevent js-injection, accidental move/delete, broken symlinks, webdav, 404 on 403, ban on excessive 404s.\n └─Alias of\033[32m -s --unpost=0 --no-del --no-mv --hardlink --vague-403 --ban-404=50,60,1440 -nih")
|
||||
ap2.add_argument("-ss", action="store_true", help="further increase safety: Prevent js-injection, accidental move/delete, broken symlinks, webdav, 404 on 403, ban on excessive 404s.\n └─Alias of\033[32m -s --unpost=0 --no-del --no-mv --hardlink --vague-403 --ban-404=50,60,1440 --turbo=-1 -nih")
|
||||
ap2.add_argument("-sss", action="store_true", help="further increase safety: Enable logging to disk, scan for dangerous symlinks.\n └─Alias of\033[32m -ss --no-dav --no-logues --no-readme -lo=cpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz --ls=**,*,ln,p,r")
|
||||
ap2.add_argument("--ls", metavar="U[,V[,F]]", type=u, help="do a sanity/safety check of all volumes on startup; arguments \033[33mUSER\033[0m,\033[33mVOL\033[0m,\033[33mFLAGS\033[0m; example [\033[32m**,*,ln,p,r\033[0m]")
|
||||
ap2.add_argument("--xvol", action="store_true", help="never follow symlinks leaving the volume root, unless the link is into another volume where the user has similar access (volflag=xvol)")
|
||||
@@ -965,6 +1015,11 @@ def add_safety(ap):
|
||||
ap2.add_argument("--logout", metavar="H", type=float, default="8086", help="logout clients after H hours of inactivity; [\033[32m0.0028\033[0m]=10sec, [\033[32m0.1\033[0m]=6min, [\033[32m24\033[0m]=day, [\033[32m168\033[0m]=week, [\033[32m720\033[0m]=month, [\033[32m8760\033[0m]=year)")
|
||||
ap2.add_argument("--ban-pw", metavar="N,W,B", type=u, default="9,60,1440", help="more than \033[33mN\033[0m wrong passwords in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes; disable with [\033[32mno\033[0m]")
|
||||
ap2.add_argument("--ban-404", metavar="N,W,B", type=u, default="no", help="hitting more than \033[33mN\033[0m 404's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes (disabled by default since turbo-up2k counts as 404s)")
|
||||
ap2.add_argument("--ban-403", metavar="N,W,B", type=u, default="9,2,1440", help="hitting more than \033[33mN\033[0m 403's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes; [\033[32m1440\033[0m]=day, [\033[32m10080\033[0m]=week, [\033[32m43200\033[0m]=month")
|
||||
ap2.add_argument("--ban-422", metavar="N,W,B", type=u, default="9,2,1440", help="hitting more than \033[33mN\033[0m 422's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes (422 is server fuzzing, invalid POSTs and so)")
|
||||
ap2.add_argument("--ban-url", metavar="N,W,B", type=u, default="9,2,1440", help="hitting more than \033[33mN\033[0m sus URL's in \033[33mW\033[0m minutes = ban for \033[33mB\033[0m minutes (decent replacement for --ban-404 if that can't be used)")
|
||||
ap2.add_argument("--sus-urls", metavar="R", type=u, default=r"\.php$|(^|/)wp-(admin|content|includes)/", help="URLs which are considered sus / eligible for banning; disable with blank or [\033[32mno\033[0m]")
|
||||
ap2.add_argument("--nonsus-urls", metavar="R", type=u, default=r"^(favicon\.ico|robots\.txt)$|^apple-touch-icon|^\.well-known", help="harmless URLs ignored from 404-bans; disable with blank or [\033[32mno\033[0m]")
|
||||
ap2.add_argument("--aclose", metavar="MIN", type=int, default=10, help="if a client maxes out the server connection limit, downgrade it from connection:keep-alive to connection:close for MIN minutes (and also kill its active connections) -- disable with 0")
|
||||
ap2.add_argument("--loris", metavar="B", type=int, default=60, help="if a client maxes out the server connection limit without sending headers, ban it for B minutes; disable with [\033[32m0\033[0m]")
|
||||
ap2.add_argument("--acao", metavar="V[,V]", type=u, default="*", help="Access-Control-Allow-Origin; list of origins (domains/IPs without port) to accept requests from; [\033[32mhttps://1.2.3.4\033[0m]. Default [\033[32m*\033[0m] allows requests from all sites but removes cookies and http-auth; only ?pw=hunter2 survives")
|
||||
@@ -995,6 +1050,7 @@ def add_logging(ap):
|
||||
ap2.add_argument("--no-ansi", action="store_true", default=not VT100, help="disable colors; same as environment-variable NO_COLOR")
|
||||
ap2.add_argument("--ansi", action="store_true", help="force colors; overrides environment-variable NO_COLOR")
|
||||
ap2.add_argument("--no-voldump", action="store_true", help="do not list volumes and permissions on startup")
|
||||
ap2.add_argument("--log-tdec", metavar="N", type=int, default=3, help="timestamp resolution / number of timestamp decimals")
|
||||
ap2.add_argument("--log-conn", action="store_true", help="debug: print tcp-server msgs")
|
||||
ap2.add_argument("--log-htp", action="store_true", help="debug: print http-server threadpool scaling")
|
||||
ap2.add_argument("--ihead", metavar="HEADER", type=u, action='append', help="dump incoming header")
|
||||
@@ -1039,6 +1095,7 @@ def add_thumbnail(ap):
|
||||
def add_transcoding(ap):
|
||||
ap2 = ap.add_argument_group('transcoding options')
|
||||
ap2.add_argument("--no-acode", action="store_true", help="disable audio transcoding")
|
||||
ap2.add_argument("--no-bacode", action="store_true", help="disable batch audio transcoding by folder download (zip/tar)")
|
||||
ap2.add_argument("--ac-maxage", metavar="SEC", type=int, default=86400, help="delete cached transcode output after SEC seconds")
|
||||
|
||||
|
||||
@@ -1088,7 +1145,7 @@ def add_db_metadata(ap):
|
||||
def add_ui(ap, retry):
|
||||
ap2 = ap.add_argument_group('ui options')
|
||||
ap2.add_argument("--grid", action="store_true", help="show grid/thumbnails by default (volflag=grid)")
|
||||
ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language")
|
||||
ap2.add_argument("--lang", metavar="LANG", type=u, default="eng", help="language; one of the following: eng nor")
|
||||
ap2.add_argument("--theme", metavar="NUM", type=int, default=0, help="default theme to use")
|
||||
ap2.add_argument("--themes", metavar="NUM", type=int, default=8, help="number of themes installed")
|
||||
ap2.add_argument("--unlist", metavar="REGEX", type=u, default="", help="don't show files matching REGEX in file list. Purely cosmetic! Does not affect API calls, just the browser. Example: [\033[32m\\.(js|css)$\033[0m] (volflag=unlist)")
|
||||
@@ -1097,12 +1154,13 @@ def add_ui(ap, retry):
|
||||
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
|
||||
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
|
||||
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
|
||||
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI)")
|
||||
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI, or add ?v to URL for override)")
|
||||
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
|
||||
ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
|
||||
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")
|
||||
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty @ --name", help="title / service-name to show in html documents")
|
||||
ap2.add_argument("--bname", metavar="TXT", type=u, default="--name", help="server name (displayed in filebrowser document title)")
|
||||
ap2.add_argument("--pb-url", metavar="URL", type=u, default="https://github.com/9001/copyparty", help="powered-by link; disable with -np")
|
||||
ap2.add_argument("--ver", action="store_true", help="show version on the control panel (incompatible by -np)")
|
||||
ap2.add_argument("--ver", action="store_true", help="show version on the control panel (incompatible with -nb)")
|
||||
ap2.add_argument("--md-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for README.md docs (volflag=md_sbf); see https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe#attr-sandbox")
|
||||
ap2.add_argument("--lg-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for prologue/epilogue docs (volflag=lg_sbf)")
|
||||
ap2.add_argument("--no-sb-md", action="store_true", help="don't sandbox README.md documents (volflags: no_sb_md | sb_md)")
|
||||
@@ -1143,7 +1201,10 @@ def run_argparse(
|
||||
fk_salt = get_fk_salt(cert_path)
|
||||
ah_salt = get_ah_salt()
|
||||
|
||||
hcores = min(CORES, 4) # optimal on py3.11 @ r5-4500U
|
||||
# alpine peaks at 5 threads for some reason,
|
||||
# all others scale past that (but try to avoid SMT),
|
||||
# 5 should be plenty anyways (3 GiB/s on most machines)
|
||||
hcores = min(CORES, 5 if CORES > 8 else 4)
|
||||
|
||||
tty = os.environ.get("TERM", "").lower() == "linux"
|
||||
|
||||
@@ -1172,6 +1233,7 @@ def run_argparse(
|
||||
add_yolo(ap)
|
||||
add_handlers(ap)
|
||||
add_hooks(ap)
|
||||
add_stats(ap)
|
||||
add_ui(ap, retry)
|
||||
add_admin(ap)
|
||||
add_logging(ap)
|
||||
@@ -1343,7 +1405,7 @@ def main(argv: Optional[list[str]] = None) -> None:
|
||||
if re.match("c[^,]", opt):
|
||||
mod = True
|
||||
na.append("c," + opt[1:])
|
||||
elif re.sub("^[rwmdgGa]*", "", opt) and "," not in opt:
|
||||
elif re.sub("^[rwmdgGha]*", "", opt) and "," not in opt:
|
||||
mod = True
|
||||
perm = opt[0]
|
||||
na.append(perm + "," + opt[1:])
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
# coding: utf-8
|
||||
|
||||
VERSION = (1, 8, 7)
|
||||
CODENAME = "argon"
|
||||
BUILD_DT = (2023, 7, 23)
|
||||
VERSION = (1, 9, 7)
|
||||
CODENAME = "prometheable"
|
||||
BUILD_DT = (2023, 9, 30)
|
||||
|
||||
S_VERSION = ".".join(map(str, VERSION))
|
||||
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
|
||||
|
||||
@@ -52,6 +52,11 @@ if TYPE_CHECKING:
|
||||
|
||||
LEELOO_DALLAS = "leeloo_dallas"
|
||||
|
||||
SEE_LOG = "see log for details"
|
||||
SSEELOG = " ({})".format(SEE_LOG)
|
||||
BAD_CFG = "invalid config; {}".format(SEE_LOG)
|
||||
SBADCFG = " ({})".format(BAD_CFG)
|
||||
|
||||
|
||||
class AXS(object):
|
||||
def __init__(
|
||||
@@ -62,6 +67,7 @@ class AXS(object):
|
||||
udel: Optional[Union[list[str], set[str]]] = None,
|
||||
uget: Optional[Union[list[str], set[str]]] = None,
|
||||
upget: Optional[Union[list[str], set[str]]] = None,
|
||||
uhtml: Optional[Union[list[str], set[str]]] = None,
|
||||
uadmin: Optional[Union[list[str], set[str]]] = None,
|
||||
) -> None:
|
||||
self.uread: set[str] = set(uread or [])
|
||||
@@ -70,10 +76,11 @@ class AXS(object):
|
||||
self.udel: set[str] = set(udel or [])
|
||||
self.uget: set[str] = set(uget or [])
|
||||
self.upget: set[str] = set(upget or [])
|
||||
self.uhtml: set[str] = set(uhtml or [])
|
||||
self.uadmin: set[str] = set(uadmin or [])
|
||||
|
||||
def __repr__(self) -> str:
|
||||
ks = "uread uwrite umove udel uget upget uadmin".split()
|
||||
ks = "uread uwrite umove udel uget upget uhtml uadmin".split()
|
||||
return "AXS(%s)" % (", ".join("%s=%r" % (k, self.__dict__[k]) for k in ks),)
|
||||
|
||||
|
||||
@@ -324,6 +331,7 @@ class VFS(object):
|
||||
self.adel: dict[str, list[str]] = {}
|
||||
self.aget: dict[str, list[str]] = {}
|
||||
self.apget: dict[str, list[str]] = {}
|
||||
self.ahtml: dict[str, list[str]] = {}
|
||||
self.aadmin: dict[str, list[str]] = {}
|
||||
|
||||
if realpath:
|
||||
@@ -451,6 +459,7 @@ class VFS(object):
|
||||
uname in c.upget or "*" in c.upget,
|
||||
uname in c.uadmin or "*" in c.uadmin,
|
||||
)
|
||||
# skip uhtml because it's rarely needed
|
||||
|
||||
def get(
|
||||
self,
|
||||
@@ -471,7 +480,8 @@ class VFS(object):
|
||||
self.log("vfs", "invalid relpath [{}]".format(vpath))
|
||||
raise Pebkac(404)
|
||||
|
||||
vn, rem = self._find(undot(vpath))
|
||||
cvpath = undot(vpath)
|
||||
vn, rem = self._find(cvpath)
|
||||
c: AXS = vn.axs
|
||||
|
||||
for req, d, msg in [
|
||||
@@ -482,6 +492,11 @@ class VFS(object):
|
||||
(will_get, c.uget, "get"),
|
||||
]:
|
||||
if req and (uname not in d and "*" not in d) and uname != LEELOO_DALLAS:
|
||||
if vpath != cvpath and vpath != "." and self.log:
|
||||
ap = vn.canonical(rem)
|
||||
t = "{} has no {} in [{}] => [{}] => [{}]"
|
||||
self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)
|
||||
|
||||
t = "you don't have {}-access for this location"
|
||||
raise Pebkac(err, t.format(msg))
|
||||
|
||||
@@ -795,7 +810,7 @@ class AuthSrv(object):
|
||||
if dst in mount:
|
||||
t = "multiple filesystem-paths mounted at [/{}]:\n [{}]\n [{}]"
|
||||
self.log(t.format(dst, mount[dst], src), c=1)
|
||||
raise Exception("invalid config")
|
||||
raise Exception(BAD_CFG)
|
||||
|
||||
if src in mount.values():
|
||||
t = "filesystem-path [{}] mounted in multiple locations:"
|
||||
@@ -804,7 +819,7 @@ class AuthSrv(object):
|
||||
t += "\n /{}".format(v)
|
||||
|
||||
self.log(t, c=3)
|
||||
raise Exception("invalid config")
|
||||
raise Exception(BAD_CFG)
|
||||
|
||||
if not bos.path.isdir(src):
|
||||
self.log("warning: filesystem-path does not exist: {}".format(src), 3)
|
||||
@@ -903,7 +918,7 @@ class AuthSrv(object):
|
||||
t = "volume-specific config (anything from --help-flags)"
|
||||
self._l(ln, 6, t)
|
||||
else:
|
||||
raise Exception("invalid section header")
|
||||
raise Exception("invalid section header" + SBADCFG)
|
||||
|
||||
self.indent = " " if subsection else " "
|
||||
continue
|
||||
@@ -926,7 +941,7 @@ class AuthSrv(object):
|
||||
acct[u] = p
|
||||
except:
|
||||
t = 'lines inside the [accounts] section must be "username: password"'
|
||||
raise Exception(t)
|
||||
raise Exception(t + SBADCFG)
|
||||
continue
|
||||
|
||||
if vp is not None and ap is None:
|
||||
@@ -944,7 +959,7 @@ class AuthSrv(object):
|
||||
try:
|
||||
self._l(ln, 5, "volume access config:")
|
||||
sk, sv = ln.split(":")
|
||||
if re.sub("[rwmdgGa]", "", sk) or not sk:
|
||||
if re.sub("[rwmdgGha]", "", sk) or not sk:
|
||||
err = "invalid accs permissions list; "
|
||||
raise Exception(err)
|
||||
if " " in re.sub(", *", "", sv).strip():
|
||||
@@ -953,8 +968,8 @@ class AuthSrv(object):
|
||||
self._read_vol_str(sk, sv.replace(" ", ""), daxs[vp], mflags[vp])
|
||||
continue
|
||||
except:
|
||||
err += "accs entries must be 'rwmdgGa: user1, user2, ...'"
|
||||
raise Exception(err)
|
||||
err += "accs entries must be 'rwmdgGha: user1, user2, ...'"
|
||||
raise Exception(err + SBADCFG)
|
||||
|
||||
if cat == catf:
|
||||
err = ""
|
||||
@@ -967,7 +982,7 @@ class AuthSrv(object):
|
||||
if bad:
|
||||
err = "bad characters [{}] in volflag name [{}]; "
|
||||
err = err.format(bad, sk)
|
||||
raise Exception(err)
|
||||
raise Exception(err + SBADCFG)
|
||||
if sv is True:
|
||||
fstr += "," + sk
|
||||
else:
|
||||
@@ -979,9 +994,9 @@ class AuthSrv(object):
|
||||
continue
|
||||
except:
|
||||
err += "flags entries (volflags) must be one of the following:\n 'flag1, flag2, ...'\n 'key: value'\n 'flag1, flag2, key: value'"
|
||||
raise Exception(err)
|
||||
raise Exception(err + SBADCFG)
|
||||
|
||||
raise Exception("unprocessable line in config")
|
||||
raise Exception("unprocessable line in config" + SBADCFG)
|
||||
|
||||
self._e()
|
||||
self.line_ctr = 0
|
||||
@@ -989,7 +1004,7 @@ class AuthSrv(object):
|
||||
def _read_vol_str(
|
||||
self, lvl: str, uname: str, axs: AXS, flags: dict[str, Any]
|
||||
) -> None:
|
||||
if lvl.strip("crwmdgGa"):
|
||||
if lvl.strip("crwmdgGha"):
|
||||
raise Exception("invalid volflag: {},{}".format(lvl, uname))
|
||||
|
||||
if lvl == "c":
|
||||
@@ -1018,10 +1033,12 @@ class AuthSrv(object):
|
||||
("w", axs.uwrite),
|
||||
("m", axs.umove),
|
||||
("d", axs.udel),
|
||||
("a", axs.uadmin),
|
||||
("h", axs.uhtml),
|
||||
("h", axs.uget),
|
||||
("g", axs.uget),
|
||||
("G", axs.uget),
|
||||
("G", axs.upget),
|
||||
("a", axs.uadmin),
|
||||
]: # b bb bbb
|
||||
if ch in lvl:
|
||||
if un == "*":
|
||||
@@ -1094,7 +1111,7 @@ class AuthSrv(object):
|
||||
|
||||
if self.args.v:
|
||||
# list of src:dst:permset:permset:...
|
||||
# permset is <rwmdgGa>[,username][,username] or <c>,<flag>[=args]
|
||||
# permset is <rwmdgGha>[,username][,username] or <c>,<flag>[=args]
|
||||
for v_str in self.args.v:
|
||||
m = re_vol.match(v_str)
|
||||
if not m:
|
||||
@@ -1183,7 +1200,7 @@ class AuthSrv(object):
|
||||
vol.all_vps.sort(key=lambda x: len(x[0]), reverse=True)
|
||||
vol.root = vfs
|
||||
|
||||
for perm in "read write move del get pget admin".split():
|
||||
for perm in "read write move del get pget html admin".split():
|
||||
axs_key = "u" + perm
|
||||
unames = ["*"] + list(acct.keys())
|
||||
umap: dict[str, list[str]] = {x: [] for x in unames}
|
||||
@@ -1205,6 +1222,7 @@ class AuthSrv(object):
|
||||
axs.udel,
|
||||
axs.uget,
|
||||
axs.upget,
|
||||
axs.uhtml,
|
||||
axs.uadmin,
|
||||
]:
|
||||
for usr in d:
|
||||
@@ -1218,7 +1236,7 @@ class AuthSrv(object):
|
||||
+ ", ".join(k for k in sorted(missing_users)),
|
||||
c=1,
|
||||
)
|
||||
raise Exception("invalid config")
|
||||
raise Exception(BAD_CFG)
|
||||
|
||||
if LEELOO_DALLAS in all_users:
|
||||
raise Exception("sorry, reserved username: " + LEELOO_DALLAS)
|
||||
@@ -1228,7 +1246,7 @@ class AuthSrv(object):
|
||||
if pwd in seenpwds:
|
||||
t = "accounts [{}] and [{}] have the same password; this is not supported"
|
||||
self.log(t.format(seenpwds[pwd], usr), 1)
|
||||
raise Exception("invalid config")
|
||||
raise Exception(BAD_CFG)
|
||||
seenpwds[pwd] = usr
|
||||
|
||||
promote = []
|
||||
@@ -1612,6 +1630,7 @@ class AuthSrv(object):
|
||||
vfs.bubble_flags()
|
||||
|
||||
have_e2d = False
|
||||
have_e2t = False
|
||||
t = "volumes and permissions:\n"
|
||||
for zv in vfs.all_vols.values():
|
||||
if not self.warn_anonwrite:
|
||||
@@ -1625,6 +1644,7 @@ class AuthSrv(object):
|
||||
["delete", "udel"],
|
||||
[" get", "uget"],
|
||||
[" upget", "upget"],
|
||||
[" html", "uhtml"],
|
||||
["uadmin", "uadmin"],
|
||||
]:
|
||||
u = list(sorted(getattr(zv.axs, attr)))
|
||||
@@ -1635,6 +1655,9 @@ class AuthSrv(object):
|
||||
if "e2d" in zv.flags:
|
||||
have_e2d = True
|
||||
|
||||
if "e2t" in zv.flags:
|
||||
have_e2t = True
|
||||
|
||||
t += "\n"
|
||||
|
||||
if self.warn_anonwrite:
|
||||
@@ -1646,6 +1669,13 @@ class AuthSrv(object):
|
||||
if t:
|
||||
self.log("\n\033[{}\033[0m\n".format(t))
|
||||
|
||||
if not have_e2t:
|
||||
t = "hint: argument -e2ts enables multimedia indexing (artist/title/...)"
|
||||
self.log(t, 6)
|
||||
else:
|
||||
t = "hint: argument -e2dsa enables searching, upload-undo, and better deduplication"
|
||||
self.log(t, 6)
|
||||
|
||||
zv, _ = vfs.get("/", "*", False, False)
|
||||
zs = zv.realpath.lower()
|
||||
if zs in ("/", "c:\\") or zs.startswith(r"c:\windows"):
|
||||
@@ -1653,7 +1683,7 @@ class AuthSrv(object):
|
||||
self.log(t.format(zv.realpath), c=1)
|
||||
|
||||
try:
|
||||
zv, _ = vfs.get("/", "*", False, True)
|
||||
zv, _ = vfs.get("", "*", False, True, err=999)
|
||||
if self.warn_anonwrite and os.getcwd() == zv.realpath:
|
||||
t = "anyone can write to the current directory: {}\n"
|
||||
self.log(t.format(zv.realpath), c=1)
|
||||
@@ -1782,6 +1812,7 @@ class AuthSrv(object):
|
||||
vc.udel,
|
||||
vc.uget,
|
||||
vc.upget,
|
||||
vc.uhtml,
|
||||
vc.uadmin,
|
||||
]
|
||||
self.log(t.format(*vs))
|
||||
@@ -1923,6 +1954,7 @@ class AuthSrv(object):
|
||||
"d": "udel",
|
||||
"g": "uget",
|
||||
"G": "upget",
|
||||
"h": "uhtml",
|
||||
"a": "uadmin",
|
||||
}
|
||||
users = {}
|
||||
@@ -2015,13 +2047,19 @@ def expand_config_file(ret: list[str], fp: str, ipath: str) -> None:
|
||||
|
||||
if os.path.isdir(fp):
|
||||
names = os.listdir(fp)
|
||||
ret.append("#\033[36m cfg files in {} => {}\033[0m".format(fp, names))
|
||||
crumb = "#\033[36m cfg files in {} => {}\033[0m".format(fp, names)
|
||||
ret.append(crumb)
|
||||
for fn in sorted(names):
|
||||
fp2 = os.path.join(fp, fn)
|
||||
if not fp2.endswith(".conf") or fp2 in ipath:
|
||||
continue
|
||||
|
||||
expand_config_file(ret, fp2, ipath)
|
||||
|
||||
if ret[-1] == crumb:
|
||||
# no config files below; remove breadcrumb
|
||||
ret.pop()
|
||||
|
||||
return
|
||||
|
||||
ipath += " -> " + fp
|
||||
@@ -2120,7 +2158,7 @@ def upgrade_cfg_fmt(
|
||||
else:
|
||||
sn = sn.replace(",", ", ")
|
||||
ret.append(" " + sn)
|
||||
elif sn[:1] in "rwmdgGa":
|
||||
elif sn[:1] in "rwmdgGha":
|
||||
if cat != catx:
|
||||
cat = catx
|
||||
ret.append(cat)
|
||||
|
||||
@@ -69,7 +69,7 @@ class BrokerMp(object):

while procs:
if procs[-1].is_alive():
time.sleep(0.1)
time.sleep(0.05)
continue

procs.pop()

@@ -181,6 +181,10 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
|
||||
raise Exception("failed to translate cert: {}, {}".format(rc, se))
|
||||
|
||||
bname = os.path.join(args.crt_dir, "srv")
|
||||
try:
|
||||
os.unlink(bname + ".key")
|
||||
except:
|
||||
pass
|
||||
os.rename(bname + "-key.pem", bname + ".key")
|
||||
os.unlink(bname + ".csr")
|
||||
|
||||
@@ -216,7 +220,7 @@ def gencert(log: "RootLogger", args, netdevs: dict[str, Netdev]):
|
||||
HAVE_CFSSL = False
|
||||
log("cert", "could not create TLS certificates: {}".format(ex), 3)
|
||||
if getattr(ex, "errno", 0) == errno.ENOENT:
|
||||
t = "install cfssl if you want to fix this; https://github.com/cloudflare/cfssl/releases/latest"
|
||||
t = "install cfssl if you want to fix this; https://github.com/cloudflare/cfssl/releases/latest (cfssl, cfssljson, cfssl-certinfo)"
|
||||
log("cert", t, 6)
|
||||
|
||||
ensure_cert(log, args)
|
||||
|
||||
@@ -62,6 +62,8 @@ permdescs = {
|
||||
"d": "delete; permanently delete files and folders",
|
||||
"g": "get; download files, but cannot see folder contents",
|
||||
"G": 'upget; same as "g" but can see filekeys of their own uploads',
|
||||
"h": 'html; same as "g" but folders return their index.html',
|
||||
"a": "admin; can see uploader IPs, config-reload",
|
||||
}
|
||||
|
||||
|
||||
|
||||
@@ -9,12 +9,19 @@ import stat
|
||||
import sys
|
||||
import time
|
||||
|
||||
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
|
||||
|
||||
try:
|
||||
import asynchat
|
||||
except:
|
||||
sys.path.append(os.path.join(E.mod, "vend"))
|
||||
|
||||
from pyftpdlib.authorizers import AuthenticationFailed, DummyAuthorizer
|
||||
from pyftpdlib.filesystems import AbstractedFS, FilesystemError
|
||||
from pyftpdlib.handlers import FTPHandler
|
||||
from pyftpdlib.ioloop import IOLoop
|
||||
from pyftpdlib.servers import FTPServer
|
||||
|
||||
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
|
||||
from .authsrv import VFS
|
||||
from .bos import bos
|
||||
from .util import (
|
||||
@@ -30,15 +37,6 @@ from .util import (
|
||||
vjoin,
|
||||
)
|
||||
|
||||
try:
|
||||
from pyftpdlib.ioloop import IOLoop
|
||||
except ImportError:
|
||||
p = os.path.join(E.mod, "vend")
|
||||
print("loading asynchat from " + p)
|
||||
sys.path.append(p)
|
||||
from pyftpdlib.ioloop import IOLoop
|
||||
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .svchub import SvcHub
|
||||
|
||||
|
||||
@@ -33,7 +33,7 @@ from .__version__ import S_VERSION
|
||||
from .authsrv import VFS # typechk
|
||||
from .bos import bos
|
||||
from .star import StreamTar
|
||||
from .sutil import StreamArc # typechk
|
||||
from .sutil import StreamArc, gfilter
|
||||
from .szip import StreamZip
|
||||
from .util import (
|
||||
HTTPCODE,
|
||||
@@ -41,8 +41,8 @@ from .util import (
|
||||
MultipartParser,
|
||||
Pebkac,
|
||||
UnrecvEOF,
|
||||
alltrace,
|
||||
absreal,
|
||||
alltrace,
|
||||
atomic_move,
|
||||
exclude_dotfiles,
|
||||
fsenc,
|
||||
@@ -141,6 +141,7 @@ class HttpCli(object):
|
||||
self.vn = self.asrv.vfs
|
||||
self.rem = " "
|
||||
self.vpath = " "
|
||||
self.vpaths = " "
|
||||
self.uname = " "
|
||||
self.pw = " "
|
||||
self.rvol = [" "]
|
||||
@@ -210,7 +211,8 @@ class HttpCli(object):
|
||||
ka["ts"] = self.conn.hsrv.cachebuster()
|
||||
ka["lang"] = self.args.lang
|
||||
ka["favico"] = self.args.favico
|
||||
ka["svcname"] = self.args.doctitle
|
||||
ka["s_name"] = self.args.bname
|
||||
ka["s_doctitle"] = self.args.doctitle
|
||||
ka["html_head"] = self.html_head
|
||||
return tpl.render(**ka) # type: ignore
|
||||
|
||||
@@ -281,22 +283,35 @@ class HttpCli(object):
|
||||
|
||||
n = self.args.rproxy
|
||||
if n:
|
||||
zso = self.headers.get("x-forwarded-for")
|
||||
if zso and self.conn.addr[0] in ["127.0.0.1", "::1"]:
|
||||
zso = self.headers.get(self.args.xff_hdr)
|
||||
if zso:
|
||||
if n > 0:
|
||||
n -= 1
|
||||
|
||||
zsl = zso.split(",")
|
||||
try:
|
||||
self.ip = zsl[n].strip()
|
||||
cli_ip = zsl[n].strip()
|
||||
except:
|
||||
self.ip = zsl[0].strip()
|
||||
cli_ip = zsl[0].strip()
|
||||
t = "rproxy={} oob x-fwd {}"
|
||||
self.log(t.format(self.args.rproxy, zso), c=3)
|
||||
|
||||
self.log_src = self.conn.set_rproxy(self.ip)
|
||||
self.is_vproxied = bool(self.args.R)
|
||||
self.host = self.headers.get("x-forwarded-host") or self.host
|
||||
pip = self.conn.addr[0]
|
||||
if self.args.xff_re and not self.args.xff_re.match(pip):
|
||||
t = 'got header "%s" from untrusted source "%s" claiming the true client ip is "%s" (raw value: "%s"); if you trust this, you must allowlist this proxy with "--xff-src=%s"'
|
||||
if self.headers.get("cf-connecting-ip"):
|
||||
t += " Alternatively, if you are behind cloudflare, it is better to specify these two instead: --xff-hdr=cf-connecting-ip --xff-src=any"
|
||||
zs = (
|
||||
".".join(pip.split(".")[:2]) + "."
|
||||
if "." in pip
|
||||
else ":".join(pip.split(":")[:4]) + ":"
|
||||
)
|
||||
self.log(t % (self.args.xff_hdr, pip, cli_ip, zso, zs), 3)
|
||||
else:
|
||||
self.ip = cli_ip
|
||||
self.is_vproxied = bool(self.args.R)
|
||||
self.log_src = self.conn.set_rproxy(self.ip)
|
||||
self.host = self.headers.get("x-forwarded-host") or self.host
|
||||
|
||||
if self.is_banned():
|
||||
return False
|
||||
@@ -331,10 +346,12 @@ class HttpCli(object):
|
||||
# split req into vpath + uparam
|
||||
uparam = {}
|
||||
if "?" not in self.req:
|
||||
self.trailing_slash = self.req.endswith("/")
|
||||
vpath = undot(self.req)
|
||||
vpath = unquotep(self.req) # not query, so + means +
|
||||
self.trailing_slash = vpath.endswith("/")
|
||||
vpath = undot(vpath)
|
||||
else:
|
||||
vpath, arglist = self.req.split("?", 1)
|
||||
vpath = unquotep(vpath)
|
||||
self.trailing_slash = vpath.endswith("/")
|
||||
vpath = undot(vpath)
|
||||
|
||||
@@ -349,6 +366,8 @@ class HttpCli(object):
|
||||
for k in arglist.split("&"):
|
||||
if "=" in k:
|
||||
k, zs = k.split("=", 1)
|
||||
# x-www-form-urlencoded (url query part) uses
|
||||
# either + or %20 for 0x20 so handle both
|
||||
uparam[k.lower()] = unquotep(zs.strip().replace("+", " "))
|
||||
else:
|
||||
uparam[k.lower()] = ""
|
||||
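The comment added in the hunk above states the rationale: in the query part of a URL, a space may arrive either as `+` (x-www-form-urlencoded) or as `%20`, so both are normalized before decoding. A quick standard-library illustration (not part of the diff):

```python
# both spellings of a space decode to the same string once "+" is folded in
from urllib.parse import unquote

for raw in ("hello+world", "hello%20world"):
    print(unquote(raw.replace("+", " ")))  # -> "hello world" both times
```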
@@ -383,7 +402,10 @@ class HttpCli(object):
|
||||
|
||||
self.uparam = uparam
|
||||
self.cookies = cookies
|
||||
self.vpath = unquotep(vpath) # not query, so + means +
|
||||
self.vpath = vpath
|
||||
self.vpaths = (
|
||||
self.vpath + "/" if self.trailing_slash and self.vpath else self.vpath
|
||||
)
|
||||
|
||||
ok = "\x00" not in self.vpath
|
||||
if ANYWIN:
|
||||
@@ -559,8 +581,8 @@ class HttpCli(object):
|
||||
self.out_headers.update(NO_CACHE)
|
||||
return
|
||||
|
||||
n = "604869" if cache == "i" else cache or "69"
|
||||
self.out_headers["Cache-Control"] = "max-age=" + n
|
||||
n = 69 if not cache else 604869 if cache == "i" else int(cache)
|
||||
self.out_headers["Cache-Control"] = "max-age=" + str(n)
|
||||
|
||||
def k304(self) -> bool:
|
||||
k304 = self.cookies.get("k304")
|
||||
@@ -620,9 +642,37 @@ class HttpCli(object):
|
||||
headers: Optional[dict[str, str]] = None,
|
||||
volsan: bool = False,
|
||||
) -> bytes:
|
||||
if status == 404:
|
||||
g = self.conn.hsrv.g404
|
||||
if g.lim:
|
||||
if (
|
||||
status > 400
|
||||
and status in (403, 404, 422)
|
||||
and (
|
||||
status != 422
|
||||
or (
|
||||
not body.startswith(b"<pre>partial upload exists")
|
||||
and not body.startswith(b"<pre>source file busy")
|
||||
)
|
||||
)
|
||||
):
|
||||
if status == 404:
|
||||
g = self.conn.hsrv.g404
|
||||
elif status == 403:
|
||||
g = self.conn.hsrv.g403
|
||||
else:
|
||||
g = self.conn.hsrv.g422
|
||||
|
||||
gurl = self.conn.hsrv.gurl
|
||||
if (
|
||||
gurl.lim
|
||||
and (not g.lim or gurl.lim < g.lim)
|
||||
and self.args.sus_urls.search(self.vpath)
|
||||
):
|
||||
g = self.conn.hsrv.gurl
|
||||
|
||||
if g.lim and (
|
||||
g == self.conn.hsrv.g422
|
||||
or not self.args.nonsus_urls
|
||||
or not self.args.nonsus_urls.search(self.vpath)
|
||||
):
|
||||
bonk, ip = g.bonk(self.ip, self.vpath)
|
||||
if bonk:
|
||||
xban = self.vn.flags.get("xban")
|
||||
@@ -637,14 +687,19 @@ class HttpCli(object):
|
||||
0,
|
||||
self.ip,
|
||||
time.time(),
|
||||
"404",
|
||||
str(status),
|
||||
):
|
||||
self.log("client banned: 404s", 1)
|
||||
self.log("client banned: %ss" % (status,), 1)
|
||||
self.conn.hsrv.bans[ip] = bonk
|
||||
|
||||
if volsan:
|
||||
vols = list(self.asrv.vfs.all_vols.values())
|
||||
body = vol_san(vols, body)
|
||||
try:
|
||||
zs = absreal(__file__).rsplit(os.path.sep, 2)[0]
|
||||
body = body.replace(zs.encode("utf-8"), b"PP")
|
||||
except:
|
||||
pass
|
||||
|
||||
self.send_headers(len(body), status, mime, headers)
|
||||
|
||||
@@ -687,6 +742,21 @@ class HttpCli(object):
|
||||
r = ["%s=%s" % (k, quotep(zs)) if zs else k for k, zs in kv.items()]
|
||||
return "?" + "&".join(r)
|
||||
|
||||
def ourlq(self) -> str:
|
||||
skip = ("pw", "h", "k")
|
||||
ret = []
|
||||
for k, v in self.ouparam.items():
|
||||
if k in skip:
|
||||
continue
|
||||
|
||||
t = "{}={}".format(quotep(k), quotep(v))
|
||||
ret.append(t.replace(" ", "+").rstrip("="))
|
||||
|
||||
if not ret:
|
||||
return ""
|
||||
|
||||
return "?" + "&".join(ret)
|
||||
|
||||
def redirect(
|
||||
self,
|
||||
vpath: str,
|
||||
@@ -802,6 +872,9 @@ class HttpCli(object):
|
||||
self.reply(b"", 301, headers=h)
|
||||
return True
|
||||
|
||||
if self.vpath == ".cpr/metrics":
|
||||
return self.conn.hsrv.metrics.tx(self)
|
||||
|
||||
path_base = os.path.join(self.E.mod, "web")
|
||||
static_path = absreal(os.path.join(path_base, self.vpath[5:]))
|
||||
if static_path in self.conn.hsrv.statics:
|
||||
@@ -820,14 +893,17 @@ class HttpCli(object):
|
||||
|
||||
if not self.can_read and not self.can_write and not self.can_get:
|
||||
t = "@{} has no access to [{}]"
|
||||
self.log(t.format(self.uname, self.vpath))
|
||||
|
||||
if "on403" in self.vn.flags:
|
||||
t += " (on403)"
|
||||
self.log(t.format(self.uname, self.vpath))
|
||||
ret = self.on40x(self.vn.flags["on403"], self.vn, self.rem)
|
||||
if ret == "true":
|
||||
return True
|
||||
elif ret == "false":
|
||||
return False
|
||||
elif ret == "home":
|
||||
self.uparam["h"] = ""
|
||||
elif ret == "allow":
|
||||
self.log("plugin override; access permitted")
|
||||
self.can_read = self.can_write = self.can_move = True
|
||||
@@ -837,6 +913,10 @@ class HttpCli(object):
|
||||
return self.tx_404(True)
|
||||
else:
|
||||
if self.vpath:
|
||||
ptn = self.args.nonsus_urls
|
||||
if not ptn or not ptn.search(self.vpath):
|
||||
self.log(t.format(self.uname, self.vpath))
|
||||
|
||||
return self.tx_404(True)
|
||||
|
||||
self.uparam["h"] = ""
|
||||
@@ -2025,7 +2105,9 @@ class HttpCli(object):
|
||||
|
||||
dst = self.args.SRS
|
||||
if self.vpath:
|
||||
dst += quotep(self.vpath)
|
||||
dst += quotep(self.vpaths)
|
||||
|
||||
dst += self.ourlq()
|
||||
|
||||
msg = self.get_pwd_cookie(pwd)
|
||||
html = self.j2s("msg", h1=msg, h2='<a href="' + dst + '">ack</a>', redir=dst)
|
||||
@@ -2127,7 +2209,8 @@ class HttpCli(object):
|
||||
vfs, rem = self.asrv.vfs.get(self.vpath, self.uname, False, True)
|
||||
self._assert_safe_rem(rem)
|
||||
|
||||
if not new_file.endswith(".md"):
|
||||
ext = "" if "." not in new_file else new_file.split(".")[-1]
|
||||
if not ext or len(ext) > 5 or not self.can_delete:
|
||||
new_file += ".md"
|
||||
|
||||
sanitized = sanitize_fn(new_file, "", [])
|
||||
@@ -2460,7 +2543,7 @@ class HttpCli(object):
|
||||
fp = os.path.join(fp, fn)
|
||||
rem = "{}/{}".format(rp, fn).strip("/")
|
||||
|
||||
if not rem.endswith(".md"):
|
||||
if not rem.endswith(".md") and not self.can_delete:
|
||||
raise Pebkac(400, "only markdown pls")
|
||||
|
||||
if nullwrite:
|
||||
@@ -2511,7 +2594,8 @@ class HttpCli(object):
|
||||
return True
|
||||
|
||||
mdir, mfile = os.path.split(fp)
|
||||
mfile2 = "{}.{:.3f}.md".format(mfile[:-3], srv_lastmod)
|
||||
fname, fext = mfile.rsplit(".", 1) if "." in mfile else (mfile, "md")
|
||||
mfile2 = "{}.{:.3f}.{}".format(fname, srv_lastmod, fext)
|
||||
try:
|
||||
dp = os.path.join(mdir, ".hist")
|
||||
bos.mkdir(dp)
|
||||
@@ -2634,7 +2718,7 @@ class HttpCli(object):
|
||||
#
|
||||
# if request is for foo.js, check if we have foo.js.{gz,br}
|
||||
|
||||
file_ts = 0
|
||||
file_ts = 0.0
|
||||
editions: dict[str, tuple[str, int]] = {}
|
||||
for ext in ["", ".gz", ".br"]:
|
||||
try:
|
||||
@@ -2652,7 +2736,7 @@ class HttpCli(object):
|
||||
else:
|
||||
sz = st.st_size
|
||||
|
||||
file_ts = max(file_ts, int(st.st_mtime))
|
||||
file_ts = max(file_ts, st.st_mtime)
|
||||
editions[ext or "plain"] = (fs_path, sz)
|
||||
except:
|
||||
pass
|
||||
@@ -2665,11 +2749,14 @@ class HttpCli(object):
|
||||
#
|
||||
# if-modified
|
||||
|
||||
file_lastmod, do_send = self._chk_lastmod(file_ts)
|
||||
file_lastmod, do_send = self._chk_lastmod(int(file_ts))
|
||||
self.out_headers["Last-Modified"] = file_lastmod
|
||||
if not do_send:
|
||||
status = 304
|
||||
|
||||
if self.can_write:
|
||||
self.out_headers["X-Lastmod3"] = str(int(file_ts * 1000))
|
||||
|
||||
#
|
||||
# Accept-Encoding and UA decides which edition to send
|
||||
|
||||
@@ -2833,12 +2920,26 @@ class HttpCli(object):
|
||||
logmsg = "{:4} {} ".format("", self.req)
|
||||
self.keepalive = False
|
||||
|
||||
cancmp = not self.args.no_tarcmp
|
||||
|
||||
if fmt == "tar":
|
||||
mime = "application/x-tar"
|
||||
packer: Type[StreamArc] = StreamTar
|
||||
if cancmp and "gz" in uarg:
|
||||
mime = "application/gzip"
|
||||
ext = "tar.gz"
|
||||
elif cancmp and "bz2" in uarg:
|
||||
mime = "application/x-bzip"
|
||||
ext = "tar.bz2"
|
||||
elif cancmp and "xz" in uarg:
|
||||
mime = "application/x-xz"
|
||||
ext = "tar.xz"
|
||||
else:
|
||||
mime = "application/x-tar"
|
||||
ext = "tar"
|
||||
else:
|
||||
mime = "application/zip"
|
||||
packer = StreamZip
|
||||
ext = "zip"
|
||||
|
||||
fn = items[0] if items and items[0] else self.vpath
|
||||
if fn:
|
||||
@@ -2863,7 +2964,7 @@ class HttpCli(object):
|
||||
ufn = b"".join(zbl).decode("ascii")
|
||||
|
||||
cdis = "attachment; filename=\"{}.{}\"; filename*=UTF-8''{}.{}"
|
||||
cdis = cdis.format(afn, fmt, ufn, fmt)
|
||||
cdis = cdis.format(afn, ext, ufn, ext)
|
||||
self.log(cdis)
|
||||
self.send_headers(None, mime=mime, headers={"Content-Disposition": cdis})
|
||||
|
||||
@@ -2871,7 +2972,23 @@ class HttpCli(object):
|
||||
vpath, rem, set(items), self.uname, dots, False, not self.args.no_scandir
|
||||
)
|
||||
# for f in fgen: print(repr({k: f[k] for k in ["vp", "ap"]}))
|
||||
bgen = packer(self.log, fgen, utf8="utf" in uarg, pre_crc="crc" in uarg)
|
||||
cfmt = ""
|
||||
if self.thumbcli and not self.args.no_bacode:
|
||||
for zs in ("opus", "w", "j"):
|
||||
if zs in self.ouparam or uarg == zs:
|
||||
cfmt = zs
|
||||
|
||||
if cfmt:
|
||||
self.log("transcoding to [{}]".format(cfmt))
|
||||
fgen = gfilter(fgen, self.thumbcli, self.uname, vpath, cfmt)
|
||||
|
||||
bgen = packer(
|
||||
self.log,
|
||||
fgen,
|
||||
utf8="utf" in uarg,
|
||||
pre_crc="crc" in uarg,
|
||||
cmp=uarg if cancmp or uarg == "pax" else "",
|
||||
)
|
||||
bsent = 0
|
||||
for buf in bgen.gen():
|
||||
if not buf:
|
||||
@@ -2882,6 +2999,7 @@ class HttpCli(object):
|
||||
bsent += len(buf)
|
||||
except:
|
||||
logmsg += " \033[31m" + unicode(bsent) + "\033[0m"
|
||||
bgen.stop()
|
||||
break
|
||||
|
||||
spd = self._spd(bsent)
|
||||
@@ -2936,7 +3054,12 @@ class HttpCli(object):
|
||||
ts_html = st.st_mtime
|
||||
|
||||
sz_md = 0
|
||||
lead = b""
|
||||
for buf in yieldfile(fs_path):
|
||||
if not sz_md and b"\n" in buf[:2]:
|
||||
lead = buf[: buf.find(b"\n") + 1]
|
||||
sz_md += len(lead)
|
||||
|
||||
sz_md += len(buf)
|
||||
for c, v in [(b"&", 4), (b"<", 3), (b">", 3)]:
|
||||
sz_md += (len(buf) - len(buf.replace(c, b""))) * v
|
||||
@@ -2955,7 +3078,6 @@ class HttpCli(object):
|
||||
targs = {
|
||||
"r": self.args.SR if self.is_vproxied else "",
|
||||
"ts": self.conn.hsrv.cachebuster(),
|
||||
"svcname": self.args.doctitle,
|
||||
"html_head": self.html_head,
|
||||
"edit": "edit" in self.uparam,
|
||||
"title": html_escape(self.vpath, crlf=True),
|
||||
@@ -2982,7 +3104,7 @@ class HttpCli(object):
|
||||
return True
|
||||
|
||||
try:
|
||||
self.s.sendall(html[0])
|
||||
self.s.sendall(html[0] + lead)
|
||||
for buf in yieldfile(fs_path):
|
||||
self.s.sendall(html_bescape(buf))
|
||||
|
||||
@@ -2998,7 +3120,7 @@ class HttpCli(object):
|
||||
return True
|
||||
|
||||
def tx_svcs(self) -> bool:
|
||||
aname = re.sub("[^0-9a-zA-Z]+", "", self.args.name) or "a"
|
||||
aname = re.sub("[^0-9a-zA-Z]+", "", self.args.vname) or "a"
|
||||
ep = self.host
|
||||
host = ep.split(":")[0]
|
||||
hport = ep[ep.find(":") :] if ":" in ep else ""
|
||||
@@ -3082,7 +3204,7 @@ class HttpCli(object):
|
||||
html = self.j2s(
|
||||
"splash",
|
||||
this=self,
|
||||
qvpath=quotep(self.vpath),
|
||||
qvpath=quotep(self.vpaths) + self.ourlq(),
|
||||
rvol=rvol,
|
||||
wvol=wvol,
|
||||
avol=avol,
|
||||
@@ -3140,7 +3262,8 @@ class HttpCli(object):
|
||||
t = '<h1 id="n">404 not found ┐( ´ -`)┌</h1><p><a id="r" href="{}/?h">go home</a></p>'
|
||||
|
||||
t = t.format(self.args.SR)
|
||||
html = self.j2s("splash", this=self, qvpath=quotep(self.vpath), msg=t)
|
||||
qv = quotep(self.vpaths) + self.ourlq()
|
||||
html = self.j2s("splash", this=self, qvpath=qv, msg=t)
|
||||
self.reply(html.encode("utf-8"), status=rc)
|
||||
return True
|
||||
|
||||
@@ -3212,7 +3335,7 @@ class HttpCli(object):
|
||||
dst = ""
|
||||
elif top:
|
||||
if not dst.startswith(top + "/"):
|
||||
raise Pebkac(400, "arg funk")
|
||||
raise Pebkac(422, "arg funk")
|
||||
|
||||
dst = dst[len(top) + 1 :]
|
||||
|
||||
@@ -3234,8 +3357,9 @@ class HttpCli(object):
|
||||
sub = self.gen_tree("/".join([top, excl]).strip("/"), target)
|
||||
ret["k" + quotep(excl)] = sub
|
||||
|
||||
vfs = self.asrv.vfs
|
||||
try:
|
||||
vn, rem = self.asrv.vfs.get(top, self.uname, True, False)
|
||||
vn, rem = vfs.get(top, self.uname, True, False)
|
||||
fsroot, vfs_ls, vfs_virt = vn.ls(
|
||||
rem,
|
||||
self.uname,
|
||||
@@ -3248,7 +3372,7 @@ class HttpCli(object):
|
||||
for v in self.rvol:
|
||||
d1, d2 = v.rsplit("/", 1) if "/" in v else ["", v]
|
||||
if d1 == top:
|
||||
vfs_virt[d2] = self.asrv.vfs # typechk, value never read
|
||||
vfs_virt[d2] = vfs # typechk, value never read
|
||||
|
||||
dirs = []
|
||||
|
||||
@@ -3262,6 +3386,11 @@ class HttpCli(object):
|
||||
|
||||
for x in vfs_virt:
|
||||
if x != excl:
|
||||
try:
|
||||
dvn, drem = vfs.get(vjoin(top, x), self.uname, True, False)
|
||||
bos.stat(dvn.canonical(drem, False))
|
||||
except:
|
||||
x += "\n"
|
||||
dirs.append(x)
|
||||
|
||||
ret["a"] = dirs
|
||||
@@ -3275,8 +3404,7 @@ class HttpCli(object):
|
||||
if not idx or not hasattr(idx, "p_end"):
|
||||
raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
|
||||
|
||||
filt = self.uparam.get("filter")
|
||||
filt = unquotep(filt or "")
|
||||
filt = self.uparam.get("filter") or ""
|
||||
lm = "ups [{}]".format(filt)
|
||||
self.log(lm)
|
||||
|
||||
@@ -3374,9 +3502,6 @@ class HttpCli(object):
|
||||
if not dst:
|
||||
raise Pebkac(400, "need dst vpath")
|
||||
|
||||
# x-www-form-urlencoded (url query part) uses
|
||||
# either + or %20 for 0x20 so handle both
|
||||
dst = unquotep(dst.replace("+", " "))
|
||||
return self._mv(self.vpath, dst.lstrip("/"))
|
||||
|
||||
def _mv(self, vsrc: str, vdst: str) -> bool:
|
||||
@@ -3507,6 +3632,7 @@ class HttpCli(object):
|
||||
self.out_headers.pop("X-Robots-Tag", None)
|
||||
|
||||
is_dir = stat.S_ISDIR(st.st_mode)
|
||||
fk_pass = False
|
||||
icur = None
|
||||
if is_dir and (e2t or e2d):
|
||||
idx = self.conn.get_u2idx()
|
||||
@@ -3555,8 +3681,38 @@ class HttpCli(object):
|
||||
|
||||
return self.tx_ico(rem)
|
||||
|
||||
elif self.can_get and self.avn:
|
||||
axs = self.avn.axs
|
||||
if self.uname not in axs.uhtml and "*" not in axs.uhtml:
|
||||
pass
|
||||
elif is_dir:
|
||||
for fn in ("index.htm", "index.html"):
|
||||
ap2 = os.path.join(abspath, fn)
|
||||
try:
|
||||
st2 = bos.stat(ap2)
|
||||
except:
|
||||
continue
|
||||
|
||||
# might as well be extra careful
|
||||
if not stat.S_ISREG(st2.st_mode):
|
||||
continue
|
||||
|
||||
if not self.trailing_slash:
|
||||
return self.redirect(
|
||||
self.vpath + "/", flavor="redirecting to", use302=True
|
||||
)
|
||||
|
||||
fk_pass = True
|
||||
is_dir = False
|
||||
rem = vjoin(rem, fn)
|
||||
vrem = vjoin(vrem, fn)
|
||||
abspath = ap2
|
||||
break
|
||||
elif self.vpath.rsplit("/", 1)[1] in ("index.htm", "index.html"):
|
||||
fk_pass = True
|
||||
|
||||
if not is_dir and (self.can_read or self.can_get):
|
||||
if not self.can_read and "fk" in vn.flags:
|
||||
if not self.can_read and not fk_pass and "fk" in vn.flags:
|
||||
correct = self.gen_fk(
|
||||
self.args.fk_salt, abspath, st.st_size, 0 if ANYWIN else st.st_ino
|
||||
)[: vn.flags["fk"]]
|
||||
@@ -3566,7 +3722,7 @@ class HttpCli(object):
|
||||
return self.tx_404()
|
||||
|
||||
if (
|
||||
abspath.endswith(".md")
|
||||
(abspath.endswith(".md") or self.can_delete)
|
||||
and "nohtml" not in vn.flags
|
||||
and (
|
||||
"v" in self.uparam
|
||||
@@ -3688,7 +3844,7 @@ class HttpCli(object):
|
||||
"url_suf": url_suf,
|
||||
"logues": logues,
|
||||
"readme": readme,
|
||||
"title": html_escape(self.vpath, crlf=True) or "💾🎉",
|
||||
"title": html_escape("%s %s" % (self.args.bname, self.vpath), crlf=True),
|
||||
"srv_info": srv_infot,
|
||||
"dgrid": "grid" in vf,
|
||||
"unlist": unlist,
|
||||
@@ -3700,10 +3856,14 @@ class HttpCli(object):
|
||||
}
|
||||
|
||||
if self.args.js_browser:
|
||||
j2a["js"] = self.args.js_browser
|
||||
zs = self.args.js_browser
|
||||
zs += "&" if "?" in zs else "?"
|
||||
j2a["js"] = zs
|
||||
|
||||
if self.args.css_browser:
|
||||
j2a["css"] = self.args.css_browser
|
||||
zs = self.args.css_browser
|
||||
zs += "&" if "?" in zs else "?"
|
||||
j2a["css"] = zs
|
||||
|
||||
if not self.conn.hsrv.prism:
|
||||
j2a["no_prism"] = True
|
||||
@@ -3756,7 +3916,9 @@ class HttpCli(object):
|
||||
pass
|
||||
|
||||
# show dotfiles if permitted and requested
|
||||
if not self.args.ed or "dots" not in self.uparam:
|
||||
if not self.args.ed or (
|
||||
"dots" not in self.uparam and (is_ls or "dots" not in self.cookies)
|
||||
):
|
||||
ls_names = exclude_dotfiles(ls_names)
|
||||
|
||||
add_fk = vn.flags.get("fk")
|
||||
|
||||
@@ -56,6 +56,7 @@ except SyntaxError:
|
||||
sys.exit(1)
|
||||
|
||||
from .httpconn import HttpConn
|
||||
from .metrics import Metrics
|
||||
from .u2idx import U2idx
|
||||
from .util import (
|
||||
E_SCK,
|
||||
@@ -99,12 +100,16 @@ class HttpSrv(object):
|
||||
# redefine in case of multiprocessing
|
||||
socket.setdefaulttimeout(120)
|
||||
|
||||
self.t0 = time.time()
|
||||
nsuf = "-n{}-i{:x}".format(nid, os.getpid()) if nid else ""
|
||||
self.magician = Magician()
|
||||
self.nm = NetMap([], {})
|
||||
self.ssdp: Optional["SSDPr"] = None
|
||||
self.gpwd = Garda(self.args.ban_pw)
|
||||
self.g404 = Garda(self.args.ban_404)
|
||||
self.g403 = Garda(self.args.ban_403)
|
||||
self.g422 = Garda(self.args.ban_422, False)
|
||||
self.gurl = Garda(self.args.ban_url)
|
||||
self.bans: dict[str, int] = {}
|
||||
self.aclose: dict[str, int] = {}
|
||||
|
||||
@@ -122,6 +127,7 @@ class HttpSrv(object):
|
||||
self.t_periodic: Optional[threading.Thread] = None
|
||||
|
||||
self.u2fh = FHC()
|
||||
self.metrics = Metrics(self)
|
||||
self.srvs: list[socket.socket] = []
|
||||
self.ncli = 0 # exact
|
||||
self.clients: set[HttpConn] = set() # laggy
|
||||
|
||||
@@ -295,7 +295,9 @@ class MDNS(MCast):
|
||||
while self.running:
|
||||
timeout = (
|
||||
0.02 + random.random() * 0.07
|
||||
if self.probing or self.q or self.defend or self.unsolicited
|
||||
if self.probing or self.q or self.defend
|
||||
else max(0.05, self.unsolicited[0] - time.time())
|
||||
if self.unsolicited
|
||||
else (last_hop + ihop if ihop else 180)
|
||||
)
|
||||
rdy = select.select(self.srv, [], [], timeout)
|
||||
@@ -513,6 +515,10 @@ class MDNS(MCast):
|
||||
for srv in self.srv.values():
|
||||
tx.add(srv)
|
||||
|
||||
if not self.unsolicited and self.args.zm_spam:
|
||||
zf = time.time() + self.args.zm_spam + random.random() * 0.07
|
||||
self.unsolicited.append(zf)
|
||||
|
||||
for srv, deadline in list(self.defend.items()):
|
||||
if now < deadline:
|
||||
continue
|
||||
|
||||
copyparty/metrics.py (new file, 165 lines)
@@ -0,0 +1,165 @@
|
||||
# coding: utf-8
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import json
|
||||
import time
|
||||
|
||||
from .__init__ import TYPE_CHECKING
|
||||
from .util import Pebkac, get_df, unhumanize
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .httpcli import HttpCli
|
||||
from .httpsrv import HttpSrv
|
||||
|
||||
|
||||
class Metrics(object):
|
||||
def __init__(self, hsrv: "HttpSrv") -> None:
|
||||
self.hsrv = hsrv
|
||||
|
||||
def tx(self, cli: "HttpCli") -> bool:
|
||||
if not cli.avol:
|
||||
raise Pebkac(403, "not allowed for user " + cli.uname)
|
||||
|
||||
args = cli.args
|
||||
if not args.stats:
|
||||
raise Pebkac(403, "the stats feature is not enabled in server config")
|
||||
|
||||
conn = cli.conn
|
||||
vfs = conn.asrv.vfs
|
||||
allvols = list(sorted(vfs.all_vols.items()))
|
||||
|
||||
idx = conn.get_u2idx()
|
||||
if not idx or not hasattr(idx, "p_end"):
|
||||
idx = None
|
||||
|
||||
ret: list[str] = []
|
||||
|
||||
def addc(k: str, unit: str, v: str, desc: str) -> None:
|
||||
if unit:
|
||||
k += "_" + unit
|
||||
zs = "# TYPE %s counter\n# UNIT %s %s\n# HELP %s %s\n%s_created %s\n%s_total %s"
|
||||
ret.append(zs % (k, k, unit, k, desc, k, int(self.hsrv.t0), k, v))
|
||||
else:
|
||||
zs = "# TYPE %s counter\n# HELP %s %s\n%s_created %s\n%s_total %s"
|
||||
ret.append(zs % (k, k, desc, k, int(self.hsrv.t0), k, v))
|
||||
|
||||
def addh(k: str, typ: str, desc: str) -> None:
|
||||
zs = "# TYPE %s %s\n# HELP %s %s"
|
||||
ret.append(zs % (k, typ, k, desc))
|
||||
|
||||
def addbh(k: str, desc: str) -> None:
|
||||
zs = "# TYPE %s gauge\n# UNIT %s bytes\n# HELP %s %s"
|
||||
ret.append(zs % (k, k, k, desc))
|
||||
|
||||
def addv(k: str, v: str) -> None:
|
||||
ret.append("%s %s" % (k, v))
|
||||
|
||||
v = "{:.3f}".format(time.time() - self.hsrv.t0)
|
||||
addc("cpp_uptime", "seconds", v, "time since last server restart")
|
||||
|
||||
v = str(len(conn.bans or []))
|
||||
addc("cpp_bans", "", v, "number of banned IPs")
|
||||
|
||||
if not args.nos_hdd:
|
||||
addbh("cpp_disk_size_bytes", "total HDD size of volume")
|
||||
addbh("cpp_disk_free_bytes", "free HDD space in volume")
|
||||
for vpath, vol in allvols:
|
||||
free, total = get_df(vol.realpath)
|
||||
addv('cpp_disk_size_bytes{vol="/%s"}' % (vpath), str(total))
|
||||
addv('cpp_disk_free_bytes{vol="/%s"}' % (vpath), str(free))
|
||||
|
||||
if idx and not args.nos_vol:
|
||||
addbh("cpp_vol_bytes", "num bytes of data in volume")
|
||||
addh("cpp_vol_files", "gauge", "num files in volume")
|
||||
addbh("cpp_vol_free_bytes", "free space (vmaxb) in volume")
|
||||
addh("cpp_vol_free_files", "gauge", "free space (vmaxn) in volume")
|
||||
tnbytes = 0
|
||||
tnfiles = 0
|
||||
|
||||
volsizes = []
|
||||
try:
|
||||
ptops = [x.realpath for _, x in allvols]
|
||||
x = self.hsrv.broker.ask("up2k.get_volsizes", ptops)
|
||||
volsizes = x.get()
|
||||
except Exception as ex:
|
||||
cli.log("tx_stats get_volsizes: {!r}".format(ex), 3)
|
||||
|
||||
for (vpath, vol), (nbytes, nfiles) in zip(allvols, volsizes):
|
||||
tnbytes += nbytes
|
||||
tnfiles += nfiles
|
||||
addv('cpp_vol_bytes{vol="/%s"}' % (vpath), str(nbytes))
|
||||
addv('cpp_vol_files{vol="/%s"}' % (vpath), str(nfiles))
|
||||
|
||||
if vol.flags.get("vmaxb") or vol.flags.get("vmaxn"):
|
||||
|
||||
zi = unhumanize(vol.flags.get("vmaxb") or "0")
|
||||
if zi:
|
||||
v = str(zi - nbytes)
|
||||
addv('cpp_vol_free_bytes{vol="/%s"}' % (vpath), v)
|
||||
|
||||
zi = unhumanize(vol.flags.get("vmaxn") or "0")
|
||||
if zi:
|
||||
v = str(zi - nfiles)
|
||||
addv('cpp_vol_free_files{vol="/%s"}' % (vpath), v)
|
||||
|
||||
if volsizes:
|
||||
addv('cpp_vol_bytes{vol="total"}', str(tnbytes))
|
||||
addv('cpp_vol_files{vol="total"}', str(tnfiles))
|
||||
|
||||
if idx and not args.nos_dup:
|
||||
addbh("cpp_dupe_bytes", "num dupe bytes in volume")
|
||||
addh("cpp_dupe_files", "gauge", "num dupe files in volume")
|
||||
tnbytes = 0
|
||||
tnfiles = 0
|
||||
for vpath, vol in allvols:
|
||||
cur = idx.get_cur(vol.realpath)
|
||||
if not cur:
|
||||
continue
|
||||
|
||||
nbytes = 0
|
||||
nfiles = 0
|
||||
q = "select sz, count(*)-1 c from up group by w having c"
|
||||
for sz, c in cur.execute(q):
|
||||
nbytes += sz * c
|
||||
nfiles += c
|
||||
|
||||
tnbytes += nbytes
|
||||
tnfiles += nfiles
|
||||
addv('cpp_dupe_bytes{vol="/%s"}' % (vpath), str(nbytes))
|
||||
addv('cpp_dupe_files{vol="/%s"}' % (vpath), str(nfiles))
|
||||
|
||||
addv('cpp_dupe_bytes{vol="total"}', str(tnbytes))
|
||||
addv('cpp_dupe_files{vol="total"}', str(tnfiles))
|
||||
|
||||
if not args.nos_unf:
|
||||
addbh("cpp_unf_bytes", "incoming/unfinished uploads (num bytes)")
|
||||
addh("cpp_unf_files", "gauge", "incoming/unfinished uploads (num files)")
|
||||
tnbytes = 0
|
||||
tnfiles = 0
|
||||
try:
|
||||
x = self.hsrv.broker.ask("up2k.get_unfinished")
|
||||
xs = x.get()
|
||||
xj = json.loads(xs)
|
||||
for ptop, (nbytes, nfiles) in xj.items():
|
||||
tnbytes += nbytes
|
||||
tnfiles += nfiles
|
||||
vol = next((x[1] for x in allvols if x[1].realpath == ptop), None)
|
||||
if not vol:
|
||||
t = "tx_stats get_unfinished: could not map {}"
|
||||
cli.log(t.format(ptop), 3)
|
||||
continue
|
||||
|
||||
addv('cpp_unf_bytes{vol="/%s"}' % (vol.vpath), str(nbytes))
|
||||
addv('cpp_unf_files{vol="/%s"}' % (vol.vpath), str(nfiles))
|
||||
|
||||
addv('cpp_unf_bytes{vol="total"}', str(tnbytes))
|
||||
addv('cpp_unf_files{vol="total"}', str(tnfiles))
|
||||
|
||||
except Exception as ex:
|
||||
cli.log("tx_stats get_unfinished: {!r}".format(ex), 3)
|
||||
|
||||
ret.append("# EOF")
|
||||
|
||||
mime = "application/openmetrics-text; version=1.0.0; charset=utf-8"
|
||||
cli.reply("\n".join(ret).encode("utf-8"), mime=mime)
|
||||
return True
|
||||
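The new copyparty/metrics.py above serves these counters and gauges in OpenMetrics text format; the earlier httpcli.py hunk wires it up at `/.cpr/metrics`, and the handler requires an account with admin access on some volume plus stats enabled in the server config. A minimal scrape sketch in Python; the host, port, and the `pw` query parameter used for authentication are assumptions for illustration, not shown in this diff:

```python
# hypothetical client: fetch the OpenMetrics endpoint added in this diff;
# host/port and the "pw" auth parameter are assumptions, adjust as needed
import urllib.request

url = "http://127.0.0.1:3923/.cpr/metrics?pw=hunter2"
with urllib.request.urlopen(url) as resp:
    text = resp.read().decode("utf-8")

for line in text.splitlines():
    if line.startswith("cpp_uptime"):  # e.g. cpp_uptime_seconds_total 1234.567
        print(line)
```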
@@ -8,7 +8,7 @@ import shutil
|
||||
import subprocess as sp
|
||||
import sys
|
||||
|
||||
from .__init__ import EXE, PY2, WINDOWS, E, unicode
|
||||
from .__init__ import ANYWIN, EXE, PY2, WINDOWS, E, unicode
|
||||
from .bos import bos
|
||||
from .util import (
|
||||
FFMPEG_URL,
|
||||
@@ -29,6 +29,9 @@ if True: # pylint: disable=using-constant-test
|
||||
|
||||
|
||||
def have_ff(scmd: str) -> bool:
|
||||
if ANYWIN:
|
||||
scmd += ".exe"
|
||||
|
||||
if PY2:
|
||||
print("# checking {}".format(scmd))
|
||||
acmd = (scmd + " -version").encode("ascii").split(b" ")
|
||||
|
||||
@@ -15,7 +15,7 @@ from ipaddress import (
|
||||
)
|
||||
|
||||
from .__init__ import MACOS, TYPE_CHECKING
|
||||
from .util import Netdev, find_prefix, min_ex, spack
|
||||
from .util import Daemon, Netdev, find_prefix, min_ex, spack
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .svchub import SvcHub
|
||||
@@ -228,6 +228,7 @@ class MCast(object):
|
||||
for srv in self.srv.values():
|
||||
assert srv.ip in self.sips
|
||||
|
||||
Daemon(self.hopper, "mc-hop")
|
||||
return bound
|
||||
|
||||
def setup_socket(self, srv: MC_Sck) -> None:
|
||||
@@ -299,33 +300,57 @@ class MCast(object):
|
||||
t = "failed to set IPv4 TTL/LOOP; announcements may not survive multiple switches/routers"
|
||||
self.log(t, 3)
|
||||
|
||||
self.hop(srv)
|
||||
if self.hop(srv, False):
|
||||
self.log("igmp was already joined?? chilling for a sec", 3)
|
||||
time.sleep(1.2)
|
||||
|
||||
self.hop(srv, True)
|
||||
self.b4.sort(reverse=True)
|
||||
self.b6.sort(reverse=True)
|
||||
|
||||
def hop(self, srv: MC_Sck) -> None:
|
||||
def hop(self, srv: MC_Sck, on: bool) -> bool:
|
||||
"""rejoin to keepalive on routers/switches without igmp-snooping"""
|
||||
sck = srv.sck
|
||||
req = srv.mreq
|
||||
if ":" in srv.ip:
|
||||
try:
|
||||
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_LEAVE_GROUP, req)
|
||||
# linux does leaves/joins twice with 0.2~1.05s spacing
|
||||
time.sleep(1.2)
|
||||
except:
|
||||
pass
|
||||
|
||||
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_JOIN_GROUP, req)
|
||||
if not on:
|
||||
try:
|
||||
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_LEAVE_GROUP, req)
|
||||
return True
|
||||
except:
|
||||
return False
|
||||
else:
|
||||
sck.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_JOIN_GROUP, req)
|
||||
else:
|
||||
try:
|
||||
sck.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP, req)
|
||||
time.sleep(1.2)
|
||||
except:
|
||||
pass
|
||||
if not on:
|
||||
try:
|
||||
sck.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP, req)
|
||||
return True
|
||||
except:
|
||||
return False
|
||||
else:
|
||||
# t = "joining {} from ip {} idx {} with mreq {}"
|
||||
# self.log(t.format(srv.grp, srv.ip, srv.idx, repr(srv.mreq)), 6)
|
||||
sck.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, req)
|
||||
|
||||
# t = "joining {} from ip {} idx {} with mreq {}"
|
||||
# self.log(t.format(srv.grp, srv.ip, srv.idx, repr(srv.mreq)), 6)
|
||||
sck.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, req)
|
||||
return True
|
||||
|
||||
def hopper(self):
|
||||
while self.args.mc_hop and self.running:
|
||||
time.sleep(self.args.mc_hop)
|
||||
if not self.running:
|
||||
return
|
||||
|
||||
for srv in self.srv.values():
|
||||
self.hop(srv, False)
|
||||
|
||||
# linux does leaves/joins twice with 0.2~1.05s spacing
|
||||
time.sleep(1.2)
|
||||
if not self.running:
|
||||
return
|
||||
|
||||
for srv in self.srv.values():
|
||||
self.hop(srv, True)
|
||||
|
||||
def map_client(self, cip: str) -> Optional[MC_Sck]:
|
||||
try:
|
||||
|
||||
@@ -32,6 +32,8 @@ class SMB(object):
|
||||
self.asrv = hub.asrv
|
||||
self.log = hub.log
|
||||
self.files: dict[int, tuple[float, str]] = {}
|
||||
self.noacc = self.args.smba
|
||||
self.accs = not self.args.smba
|
||||
|
||||
lg.setLevel(logging.DEBUG if self.args.smbvvv else logging.INFO)
|
||||
for x in ["impacket", "impacket.smbserver"]:
|
||||
@@ -94,6 +96,14 @@ class SMB(object):
|
||||
|
||||
port = int(self.args.smb_port)
|
||||
srv = smbserver.SimpleSMBServer(listenAddress=ip, listenPort=port)
|
||||
try:
|
||||
if self.accs:
|
||||
srv.setAuthCallback(self._auth_cb)
|
||||
except:
|
||||
self.accs = False
|
||||
self.noacc = True
|
||||
t = "impacket too old; access permissions will not work! all accounts are admin!"
|
||||
self.log("smb", t, 1)
|
||||
|
||||
ro = "no" if self.args.smbw else "yes" # (does nothing)
|
||||
srv.addShare("A", "/", readOnly=ro)
|
||||
@@ -119,24 +129,73 @@ class SMB(object):
|
||||
def start(self) -> None:
|
||||
Daemon(self.srv.start)
|
||||
|
||||
def _v2a(self, caller: str, vpath: str, *a: Any) -> tuple[VFS, str]:
|
||||
def _auth_cb(self, *a, **ka):
|
||||
debug("auth-result: %s %s", a, ka)
|
||||
conndata = ka["connData"]
|
||||
auth_ok = conndata["Authenticated"]
|
||||
uname = ka["user_name"] if auth_ok else "*"
|
||||
uname = self.asrv.iacct.get(uname, uname) or "*"
|
||||
oldname = conndata.get("partygoer", "*") or "*"
|
||||
cli_ip = conndata["ClientIP"]
|
||||
cli_hn = ka["host_name"]
|
||||
if uname != "*":
|
||||
conndata["partygoer"] = uname
|
||||
info("client %s [%s] authed as %s", cli_ip, cli_hn, uname)
|
||||
elif oldname != "*":
|
||||
info("client %s [%s] keeping old auth as %s", cli_ip, cli_hn, oldname)
|
||||
elif auth_ok:
|
||||
info("client %s [%s] authed as [*] (anon)", cli_ip, cli_hn)
|
||||
else:
|
||||
info("client %s [%s] rejected", cli_ip, cli_hn)
|
||||
|
||||
def _uname(self) -> str:
|
||||
if self.noacc:
|
||||
return LEELOO_DALLAS
|
||||
|
||||
try:
|
||||
# you found it! my single worst bit of code so far
|
||||
# (if you can think of a better way to track users through impacket i'm all ears)
|
||||
cf0 = inspect.currentframe().f_back.f_back
|
||||
cf = cf0.f_back
|
||||
for n in range(3):
|
||||
cl = cf.f_locals
|
||||
if "connData" in cl:
|
||||
return cl["connData"]["partygoer"]
|
||||
cf = cf.f_back
|
||||
except:
|
||||
warning(
|
||||
"nyoron... %s <<-- %s <<-- %s <<-- %s",
|
||||
cf0.f_code.co_name,
|
||||
cf0.f_back.f_code.co_name,
|
||||
cf0.f_back.f_back.f_code.co_name,
|
||||
cf0.f_back.f_back.f_back.f_code.co_name,
|
||||
)
|
||||
return "*"
|
||||
|
||||
def _v2a(
|
||||
self, caller: str, vpath: str, *a: Any, uname="", perms=None
|
||||
) -> tuple[VFS, str]:
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
# cf = inspect.currentframe().f_back
|
||||
# c1 = cf.f_back.f_code.co_name
|
||||
# c2 = cf.f_code.co_name
|
||||
debug('%s("%s", %s)\033[K\033[0m', caller, vpath, str(a))
|
||||
if not uname:
|
||||
uname = self._uname()
|
||||
if not perms:
|
||||
perms = [True, True]
|
||||
|
||||
# TODO find a way to grab `identity` in smbComSessionSetupAndX and smb2SessionSetup
|
||||
vfs, rem = self.asrv.vfs.get(vpath, LEELOO_DALLAS, True, True)
|
||||
debug('%s("%s", %s) %s @%s\033[K\033[0m', caller, vpath, str(a), perms, uname)
|
||||
vfs, rem = self.asrv.vfs.get(vpath, uname, *perms)
|
||||
return vfs, vfs.canonical(rem)
|
||||
|
||||
def _listdir(self, vpath: str, *a: Any, **ka: Any) -> list[str]:
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
# caller = inspect.currentframe().f_back.f_code.co_name
|
||||
debug('listdir("%s", %s)\033[K\033[0m', vpath, str(a))
|
||||
vfs, rem = self.asrv.vfs.get(vpath, LEELOO_DALLAS, False, False)
|
||||
uname = self._uname()
|
||||
# debug('listdir("%s", %s) @%s\033[K\033[0m', vpath, str(a), uname)
|
||||
vfs, rem = self.asrv.vfs.get(vpath, uname, False, False)
|
||||
_, vfs_ls, vfs_virt = vfs.ls(
|
||||
rem, LEELOO_DALLAS, not self.args.no_scandir, [[False, False]]
|
||||
rem, uname, not self.args.no_scandir, [[False, False]]
|
||||
)
|
||||
dirs = [x[0] for x in vfs_ls if stat.S_ISDIR(x[1].st_mode)]
|
||||
fils = [x[0] for x in vfs_ls if x[0] not in dirs]
|
||||
@@ -149,8 +208,8 @@ class SMB(object):
|
||||
sz = 112 * 2 # ['.', '..']
|
||||
for n, fn in enumerate(ls):
|
||||
if sz >= 64000:
|
||||
t = "listing only %d of %d files (%d byte); see impacket#1433"
|
||||
warning(t, n, len(ls), sz)
|
||||
t = "listing only %d of %d files (%d byte) in /%s; see impacket#1433"
|
||||
warning(t, n, len(ls), sz, vpath)
|
||||
break
|
||||
|
||||
nsz = len(fn.encode("utf-16", "replace"))
|
||||
@@ -171,10 +230,12 @@ class SMB(object):
|
||||
if wr and not self.args.smbw:
|
||||
yeet("blocked write (no --smbw): " + vpath)
|
||||
|
||||
vfs, ap = self._v2a("open", vpath, *a)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a("open", vpath, *a, uname=uname, perms=[True, wr])
|
||||
if wr:
|
||||
if not vfs.axs.uwrite:
|
||||
yeet("blocked write (no-write-acc): " + vpath)
|
||||
t = "blocked write (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs.axs.uwrite, vpath, uname))
|
||||
|
||||
xbu = vfs.flags.get("xbu")
|
||||
if xbu and not runhook(
|
||||
@@ -204,7 +265,7 @@ class SMB(object):
|
||||
|
||||
_, vp = self.files.pop(fd)
|
||||
vp, fn = os.path.split(vp)
|
||||
vfs, rem = self.hub.asrv.vfs.get(vp, LEELOO_DALLAS, False, True)
|
||||
vfs, rem = self.hub.asrv.vfs.get(vp, self._uname(), False, True)
|
||||
vfs, rem = vfs.get_dbv(rem)
|
||||
self.hub.up2k.hash_file(
|
||||
vfs.realpath,
|
||||
@@ -224,15 +285,18 @@ class SMB(object):
|
||||
vp1 = vp1.lstrip("/")
|
||||
vp2 = vp2.lstrip("/")
|
||||
|
||||
vfs2, ap2 = self._v2a("rename", vp2, vp1)
|
||||
uname = self._uname()
|
||||
vfs2, ap2 = self._v2a("rename", vp2, vp1, uname=uname)
|
||||
if not vfs2.axs.uwrite:
|
||||
yeet("blocked rename (no-write-acc): " + vp2)
|
||||
t = "blocked write (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs2.axs.uwrite, vp2, uname))
|
||||
|
||||
vfs1, _ = self.asrv.vfs.get(vp1, LEELOO_DALLAS, True, True)
|
||||
vfs1, _ = self.asrv.vfs.get(vp1, uname, True, True, True)
|
||||
if not vfs1.axs.umove:
|
||||
yeet("blocked rename (no-move-acc): " + vp1)
|
||||
t = "blocked rename (no-move-acc %s): /%s @%s"
|
||||
yeet(t % (vfs1.axs.umove, vp1, uname))
|
||||
|
||||
self.hub.up2k.handle_mv(LEELOO_DALLAS, vp1, vp2)
|
||||
self.hub.up2k.handle_mv(uname, vp1, vp2)
|
||||
try:
|
||||
bos.makedirs(ap2)
|
||||
except:
|
||||
@@ -242,52 +306,74 @@ class SMB(object):
|
||||
if not self.args.smbw:
|
||||
yeet("blocked mkdir (no --smbw): " + vpath)
|
||||
|
||||
vfs, ap = self._v2a("mkdir", vpath)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a("mkdir", vpath, uname=uname)
|
||||
if not vfs.axs.uwrite:
|
||||
yeet("blocked mkdir (no-write-acc): " + vpath)
|
||||
t = "blocked mkdir (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs.axs.uwrite, vpath, uname))
|
||||
|
||||
return bos.mkdir(ap)
|
||||
|
||||
def _stat(self, vpath: str, *a: Any, **ka: Any) -> os.stat_result:
|
||||
return bos.stat(self._v2a("stat", vpath, *a)[1], *a, **ka)
|
||||
try:
|
||||
ap = self._v2a("stat", vpath, *a, perms=[True, False])[1]
|
||||
ret = bos.stat(ap, *a, **ka)
|
||||
# debug(" `-stat:ok")
|
||||
return ret
|
||||
except:
|
||||
# white lie: windows freaks out if we raise due to an offline volume
|
||||
# debug(" `-stat:NOPE (faking a directory)")
|
||||
ts = int(time.time())
|
||||
return os.stat_result((16877, -1, -1, 1, 1000, 1000, 8, ts, ts, ts))
|
||||
|
||||
def _unlink(self, vpath: str) -> None:
|
||||
if not self.args.smbw:
|
||||
yeet("blocked delete (no --smbw): " + vpath)
|
||||
|
||||
# return bos.unlink(self._v2a("stat", vpath, *a)[1])
|
||||
vfs, ap = self._v2a("delete", vpath)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a(
|
||||
"delete", vpath, uname=uname, perms=[True, False, False, True]
|
||||
)
|
||||
if not vfs.axs.udel:
|
||||
yeet("blocked delete (no-del-acc): " + vpath)
|
||||
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
self.hub.up2k.handle_rm(LEELOO_DALLAS, "1.7.6.2", [vpath], [], False)
|
||||
self.hub.up2k.handle_rm(uname, "1.7.6.2", [vpath], [], False)
|
||||
|
||||
def _utime(self, vpath: str, times: tuple[float, float]) -> None:
|
||||
if not self.args.smbw:
|
||||
yeet("blocked utime (no --smbw): " + vpath)
|
||||
|
||||
vfs, ap = self._v2a("utime", vpath)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a("utime", vpath, uname=uname)
|
||||
if not vfs.axs.uwrite:
|
||||
yeet("blocked utime (no-write-acc): " + vpath)
|
||||
t = "blocked utime (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs.axs.uwrite, vpath, uname))
|
||||
|
||||
return bos.utime(ap, times)
|
||||
|
||||
def _p_exists(self, vpath: str) -> bool:
|
||||
# ap = "?"
|
||||
try:
|
||||
bos.stat(self._v2a("p.exists", vpath)[1])
|
||||
ap = self._v2a("p.exists", vpath, perms=[True, False])[1]
|
||||
bos.stat(ap)
|
||||
# debug(" `-exists((%s)->(%s)):ok", vpath, ap)
|
||||
return True
|
||||
except:
|
||||
# debug(" `-exists((%s)->(%s)):NOPE", vpath, ap)
|
||||
return False
|
||||
|
||||
def _p_getsize(self, vpath: str) -> int:
|
||||
st = bos.stat(self._v2a("p.getsize", vpath)[1])
|
||||
st = bos.stat(self._v2a("p.getsize", vpath, perms=[True, False])[1])
|
||||
return st.st_size
|
||||
|
||||
def _p_isdir(self, vpath: str) -> bool:
|
||||
try:
|
||||
st = bos.stat(self._v2a("p.isdir", vpath)[1])
|
||||
return stat.S_ISDIR(st.st_mode)
|
||||
st = bos.stat(self._v2a("p.isdir", vpath, perms=[True, False])[1])
|
||||
ret = stat.S_ISDIR(st.st_mode)
|
||||
# debug(" `-isdir:%s:%s", st.st_mode, ret)
|
||||
return ret
|
||||
except:
|
||||
return False
|
||||
|
||||
|
||||
@@ -81,7 +81,7 @@ class SSDPr(object):
|
||||
ubase = "{}://{}:{}".format(proto, sip, sport)
|
||||
zsl = self.args.zsl
|
||||
url = zsl if "://" in zsl else ubase + "/" + zsl.lstrip("/")
|
||||
name = "{} @ {}".format(self.args.doctitle, self.args.name)
|
||||
name = self.args.doctitle
|
||||
zs = zs.strip().format(c(ubase), c(url), c(name), c(self.args.zsid))
|
||||
hc.reply(zs.encode("utf-8", "replace"))
|
||||
return False # close connectino
|
||||
@@ -214,7 +214,7 @@ CONFIGID.UPNP.ORG: 1
|
||||
srv.sck.sendto(zb, addr[:2])
|
||||
|
||||
if cip not in self.txc.c:
|
||||
self.log("{} [{}] --> {}".format(srv.name, srv.ip, cip), "6")
|
||||
self.log("{} [{}] --> {}".format(srv.name, srv.ip, cip), 6)
|
||||
|
||||
self.txc.add(cip)
|
||||
self.txc.cln()
|
||||
|
||||
@@ -1,6 +1,7 @@
# coding: utf-8
from __future__ import print_function, unicode_literals

import re
import stat
import tarfile

@@ -44,6 +45,7 @@ class StreamTar(StreamArc):
self,
log: "NamedLogger",
fgen: Generator[dict[str, Any], None, None],
cmp: str = "",
**kwargs: Any
):
super(StreamTar, self).__init__(log, fgen)
@@ -53,14 +55,41 @@ class StreamTar(StreamArc):
self.qfile = QFile()
self.errf: dict[str, Any] = {}

# python 3.8 changed to PAX_FORMAT as default,
# waste of space and don't care about the new features
# python 3.8 changed to PAX_FORMAT as default;
# slower, bigger, and no particular advantage
fmt = tarfile.GNU_FORMAT
self.tar = tarfile.open(fileobj=self.qfile, mode="w|", format=fmt) # type: ignore
if "pax" in cmp:
# unless a client asks for it (currently
# gnu-tar has wider support than pax-tar)
fmt = tarfile.PAX_FORMAT
cmp = re.sub(r"[^a-z0-9]*pax[^a-z0-9]*", "", cmp)

try:
cmp, lv = cmp.replace(":", ",").split(",")
lv = int(lv)
except:
lv = None

arg = {"name": None, "fileobj": self.qfile, "mode": "w", "format": fmt}
if cmp == "gz":
fun = tarfile.TarFile.gzopen
arg["compresslevel"] = lv if lv is not None else 3
elif cmp == "bz2":
fun = tarfile.TarFile.bz2open
arg["compresslevel"] = lv if lv is not None else 2
elif cmp == "xz":
fun = tarfile.TarFile.xzopen
arg["preset"] = lv if lv is not None else 1
else:
fun = tarfile.open
arg["mode"] = "w|"

self.tar = fun(**arg)

Daemon(self._gen, "star-gen")

def gen(self) -> Generator[Optional[bytes], None, None]:
buf = b""
try:
while True:
buf = self.qfile.q.get()
@@ -72,6 +101,12 @@ class StreamTar(StreamArc):

yield None
finally:
while buf:
try:
buf = self.qfile.q.get()
except:
pass

if self.errf:
bos.unlink(self.errf["ap"])

@@ -101,6 +136,9 @@ class StreamTar(StreamArc):
errors.append((f["vp"], f["err"]))
continue

if self.stopped:
break

try:
self.ser(f)
except:

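The StreamTar changes above let a client pick the tar flavor and compression level from a short spec string (for example `tar=gz:9` or `tar=pax`). Below is a minimal standalone sketch of that dispatch using only the stdlib `tarfile` module; the `open_tar` helper and its `spec` parameter are illustrative, not part of copyparty:

```python
import re
import tarfile


def open_tar(fileobj, spec: str = ""):
    """map a spec like "", "pax", "gz,3" or "xz:1" onto a tarfile writer"""
    fmt = tarfile.GNU_FORMAT  # default: smaller, more widely supported
    if "pax" in spec:
        fmt = tarfile.PAX_FORMAT
        spec = re.sub(r"[^a-z0-9]*pax[^a-z0-9]*", "", spec)

    try:
        alg, lv = spec.replace(":", ",").split(",")
        lv = int(lv)
    except ValueError:
        alg, lv = spec, None  # no level given (or no compression at all)

    if alg == "gz":
        return tarfile.open(fileobj=fileobj, mode="w:gz", format=fmt,
                            compresslevel=lv if lv is not None else 3)
    if alg == "bz2":
        return tarfile.open(fileobj=fileobj, mode="w:bz2", format=fmt,
                            compresslevel=lv if lv is not None else 2)
    if alg == "xz":
        return tarfile.open(fileobj=fileobj, mode="w:xz", format=fmt,
                            preset=lv if lv is not None else 1)
    # uncompressed streaming tar
    return tarfile.open(fileobj=fileobj, mode="w|", format=fmt)
```

The real code reaches the same result through `TarFile.gzopen` / `bz2open` / `xzopen`; the sketch just shows how the format string and level map to `tarfile` options.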
@@ -1,10 +1,14 @@
# coding: utf-8
from __future__ import print_function, unicode_literals

import os
import tempfile
from datetime import datetime

from .__init__ import CORES
from .bos import bos
from .th_cli import ThumbCli
from .util import vjoin

if True: # pylint: disable=using-constant-test
from typing import Any, Generator, Optional
@@ -21,10 +25,78 @@ class StreamArc(object):
):
self.log = log
self.fgen = fgen
self.stopped = False

def gen(self) -> Generator[Optional[bytes], None, None]:
raise Exception("override me")

def stop(self) -> None:
self.stopped = True

def gfilter(
fgen: Generator[dict[str, Any], None, None],
thumbcli: ThumbCli,
uname: str,
vtop: str,
fmt: str,
) -> Generator[dict[str, Any], None, None]:
from concurrent.futures import ThreadPoolExecutor

pend = []
with ThreadPoolExecutor(max_workers=CORES) as tp:
try:
for f in fgen:
task = tp.submit(enthumb, thumbcli, uname, vtop, f, fmt)
pend.append((task, f))
if pend[0][0].done() or len(pend) > CORES * 4:
task, f = pend.pop(0)
try:
f = task.result(600)
except:
pass
yield f

for task, f in pend:
try:
f = task.result(600)
except:
pass
yield f
except Exception as ex:
thumbcli.log("gfilter flushing ({})".format(ex))
for task, f in pend:
try:
task.result(600)
except:
pass
thumbcli.log("gfilter flushed")

def enthumb(
thumbcli: ThumbCli, uname: str, vtop: str, f: dict[str, Any], fmt: str
) -> dict[str, Any]:
rem = f["vp"]
ext = rem.rsplit(".", 1)[-1].lower()
if fmt == "opus" and ext in "aac|m4a|mp3|ogg|opus|wma".split("|"):
raise Exception()

vp = vjoin(vtop, rem.split("/", 1)[1])
vn, rem = thumbcli.asrv.vfs.get(vp, uname, True, False)
dbv, vrem = vn.get_dbv(rem)
thp = thumbcli.get(dbv, vrem, f["st"].st_mtime, fmt)
if not thp:
raise Exception()

ext = "jpg" if fmt == "j" else "webp" if fmt == "w" else fmt
sz = bos.path.getsize(thp)
st: os.stat_result = f["st"]
ts = st.st_mtime
f["ap"] = thp
f["vp"] = f["vp"].rsplit(".", 1)[0] + "." + ext
f["st"] = os.stat_result((st.st_mode, -1, -1, 1, 1000, 1000, sz, ts, ts, ts))
return f

def errdesc(errors: list[tuple[str, str]]) -> tuple[dict[str, Any], list[str]]:
report = ["copyparty failed to add the following files to the archive:", ""]

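The new `gfilter` above runs thumbnail conversion in a thread pool while still yielding files in their original order: each file becomes a future, and the head of the queue is drained once it finishes or once the backlog grows past a few times the core count. A self-contained sketch of the same pattern; the `convert` worker, the `4 * workers` bound, and the 600-second timeout are assumptions lifted from the diff, not a copyparty API:

```python
from concurrent.futures import ThreadPoolExecutor


def pipelined(items, convert, workers=4):
    """yield convert(x) for each x, in submission order, with bounded read-ahead"""
    pend = []
    with ThreadPoolExecutor(max_workers=workers) as tp:
        for x in items:
            pend.append(tp.submit(convert, x))
            # drain the head as soon as it is done, or when the backlog grows too big
            while pend and (pend[0].done() or len(pend) > workers * 4):
                yield pend.pop(0).result(timeout=600)

        # flush whatever is still in flight
        for task in pend:
            yield task.result(timeout=600)
```

The real `gfilter` additionally falls back to the original file entry when a conversion fails, and flushes the pool on error so no worker is left hanging; the sketch only shows the ordering/backpressure idea.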
@@ -29,7 +29,7 @@ if True: # pylint: disable=using-constant-test
from typing import Any, Optional, Union

from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, EnvParams, unicode
from .authsrv import AuthSrv
from .authsrv import BAD_CFG, AuthSrv
from .cert import ensure_cert
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE
from .tcpsrv import TcpSrv
@@ -100,11 +100,6 @@ class SvcHub(object):

self.iphash = HMaccas(os.path.join(self.E.cfg, "iphash"), 8)

# for non-http clients (ftp)
self.bans: dict[str, int] = {}
self.gpwd = Garda(self.args.ban_pw)
self.g404 = Garda(self.args.ban_404)

if args.sss or args.s >= 3:
args.ss = True
args.no_dav = True
@@ -121,6 +116,7 @@ class SvcHub(object):
args.hardlink = True
args.vague_403 = True
args.ban_404 = "50,60,1440"
args.turbo = -1
args.nih = True

if args.s:
@@ -131,8 +127,19 @@ class SvcHub(object):
args.force_js = True

if not self._process_config():
raise Exception("bad config")
raise Exception(BAD_CFG)

# for non-http clients (ftp)
self.bans: dict[str, int] = {}
self.gpwd = Garda(self.args.ban_pw)
self.g404 = Garda(self.args.ban_404)
self.g403 = Garda(self.args.ban_403)
self.g422 = Garda(self.args.ban_422)
self.gurl = Garda(self.args.ban_url)

self.log_div = 10 ** (6 - args.log_tdec)
self.log_efmt = "%02d:%02d:%02d.%0{}d".format(args.log_tdec)
self.log_dfmt = "%04d-%04d-%06d.%0{}d".format(args.log_tdec)
self.log = self._log_disabled if args.q else self._log_enabled
if args.lo:
self._setup_logfile(printed)
@@ -162,6 +169,14 @@ class SvcHub(object):
ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]
args.theme = "{0}{1} {0} {1}".format(ch, bri)

if args.nih:
args.vname = ""
args.doctitle = args.doctitle.replace(" @ --name", "")
else:
args.vname = args.name
args.doctitle = args.doctitle.replace("--name", args.vname)
args.bname = args.bname.replace("--name", args.vname) or args.vname

if args.log_fk:
args.log_fk = re.compile(args.log_fk)

@@ -183,6 +198,10 @@ class SvcHub(object):
self.log("root", "max clients: {}".format(self.args.nc))

self.tcpsrv = TcpSrv(self)

if not self.tcpsrv.srv and self.args.ign_ebind_all:
self.args.no_fastboot = True

self.up2k = Up2k(self)

decs = {k: 1 for k in self.args.th_dec.split(",")}
@@ -324,12 +343,20 @@ class SvcHub(object):
if self.httpsrv_up != self.broker.num_workers:
return

time.sleep(0.1) # purely cosmetic dw
ar = self.args
for _ in range(10 if ar.ftp or ar.ftps else 0):
time.sleep(0.03)
if self.ftpd:
break

if self.tcpsrv.qr:
self.log("qr-code", self.tcpsrv.qr)
else:
self.log("root", "workers OK\n")

self.after_httpsrv_up()

def after_httpsrv_up(self) -> None:
self.up2k.init_vols()

Daemon(self.sd_notify, "sd-notify")
@@ -388,6 +415,24 @@ class SvcHub(object):
if vs and vs.startswith("~"):
setattr(al, k, os.path.expanduser(vs))

for k in "sus_urls nonsus_urls".split(" "):
vs = getattr(al, k)
if not vs or vs == "no":
setattr(al, k, None)
else:
setattr(al, k, re.compile(vs))

if not al.sus_urls:
al.ban_url = "no"
elif al.ban_url == "no":
al.sus_urls = None

if al.xff_src in ("any", "0", ""):
al.xff_re = None
else:
zs = al.xff_src.replace(" ", "").replace(".", "\\.").replace(",", "|")
al.xff_re = re.compile("^(?:" + zs + ")")

return True

def _setlimits(self) -> None:
@@ -618,19 +663,25 @@ class SvcHub(object):
ret = 1
try:
self.pr("OPYTHAT")
tasks = []
slp = 0.0

if self.mdns:
Daemon(self.mdns.stop)
tasks.append(Daemon(self.mdns.stop, "mdns"))
slp = time.time() + 0.5

if self.ssdp:
Daemon(self.ssdp.stop)
tasks.append(Daemon(self.ssdp.stop, "ssdp"))
slp = time.time() + 0.5

self.broker.shutdown()
self.tcpsrv.shutdown()
self.up2k.shutdown()

if hasattr(self, "smbd"):
slp = max(slp, time.time() + 0.5)
tasks.append(Daemon(self.smbd.stop, "smbd"))

if self.thumbsrv:
self.thumbsrv.shutdown()

@@ -640,17 +691,19 @@ class SvcHub(object):
break

if n == 3:
self.pr("waiting for thumbsrv (10sec)...")
self.log("root", "waiting for thumbsrv (10sec)...")

if hasattr(self, "smbd"):
slp = max(slp, time.time() + 0.5)
Daemon(self.kill9, a=(1,))
Daemon(self.smbd.stop)
zf = max(time.time() - slp, 0)
Daemon(self.kill9, a=(zf + 0.5,))

while time.time() < slp:
time.sleep(0.1)
if not next((x for x in tasks if x.is_alive), None):
break

self.pr("nailed it", end="")
time.sleep(0.05)

self.log("root", "nailed it")
ret = self.retcode
except:
self.pr("\033[31m[ error during shutdown ]\n{}\033[0m".format(min_ex()))
@@ -660,7 +713,7 @@ class SvcHub(object):
print("\033]0;\033\\", file=sys.stderr, end="")
sys.stderr.flush()

self.pr("\033[0m")
self.pr("\033[0m", end="")
if self.logf:
self.logf.close()

@@ -673,11 +726,11 @@ class SvcHub(object):

with self.log_mutex:
zd = datetime.utcnow()
ts = "%04d-%04d-%06d.%03d" % (
ts = self.log_dfmt % (
zd.year,
zd.month * 100 + zd.day,
(zd.hour * 100 + zd.minute) * 100 + zd.second,
zd.microsecond // 1000,
zd.microsecond // self.log_div,
)
self.logf.write("@%s [%s\033[0m] %s\n" % (ts, src, msg))

@@ -729,11 +782,11 @@ class SvcHub(object):
msg = "%s%s\033[0m" % (c, msg)

zd = datetime.utcfromtimestamp(now)
ts = "%02d:%02d:%02d.%03d" % (
ts = self.log_efmt % (
zd.hour,
zd.minute,
zd.second,
zd.microsecond // 1000,
zd.microsecond // self.log_div,
)
msg = fmt % (ts, src, msg)
try:

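Several SvcHub hunks above replace the hardcoded `%03d` millisecond timestamps with `log_efmt` / `log_dfmt` built from `--log-tdec`, so the number of sub-second digits in the log is configurable. The idea in isolation (the `tdec` argument name is taken from the diff; the helper itself is just an illustration):

```python
from datetime import datetime, timezone


def make_ts_fmt(tdec: int = 3):
    """tdec = how many sub-second digits to log (1..6)"""
    div = 10 ** (6 - tdec)                      # scale microseconds down to tdec digits
    efmt = "%02d:%02d:%02d.%0{}d".format(tdec)  # time-of-day format string
    return div, efmt


def fmt_now(div, efmt):
    zd = datetime.now(timezone.utc)
    return efmt % (zd.hour, zd.minute, zd.second, zd.microsecond // div)


# example: tdec=3 -> "14:05:09.123", tdec=6 -> "14:05:09.123456"
div, efmt = make_ts_fmt(3)
print(fmt_now(div, efmt))
```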
@@ -221,6 +221,7 @@ class StreamZip(StreamArc):
fgen: Generator[dict[str, Any], None, None],
utf8: bool = False,
pre_crc: bool = False,
**kwargs: Any
) -> None:
super(StreamZip, self).__init__(log, fgen)

@@ -275,6 +276,7 @@ class StreamZip(StreamArc):
def gen(self) -> Generator[bytes, None, None]:
errf: dict[str, Any] = {}
errors = []
mbuf = b""
try:
for f in self.fgen:
if "err" in f:
@@ -283,13 +285,20 @@ class StreamZip(StreamArc):

try:
for x in self.ser(f):
yield x
mbuf += x
if len(mbuf) >= 16384:
yield mbuf
mbuf = b""
except GeneratorExit:
raise
except:
ex = min_ex(5, True).replace("\n", "\n-- ")
errors.append((f["vp"], ex))

if mbuf:
yield mbuf
mbuf = b""

if errors:
errf, txt = errdesc(errors)
self.log("\n".join(([repr(errf)] + txt[1:])))
@@ -299,20 +308,23 @@ class StreamZip(StreamArc):
cdir_pos = self.pos
for name, sz, ts, crc, h_pos in self.items:
buf = gen_hdr(h_pos, name, sz, ts, self.utf8, crc, self.pre_crc)
yield self._ct(buf)
mbuf += self._ct(buf)
if len(mbuf) >= 16384:
yield mbuf
mbuf = b""
cdir_end = self.pos

_, need_64 = gen_ecdr(self.items, cdir_pos, cdir_end)
if need_64:
ecdir64_pos = self.pos
buf = gen_ecdr64(self.items, cdir_pos, cdir_end)
yield self._ct(buf)
mbuf += self._ct(buf)

buf = gen_ecdr64_loc(ecdir64_pos)
yield self._ct(buf)
mbuf += self._ct(buf)

ecdr, _ = gen_ecdr(self.items, cdir_pos, cdir_end)
yield self._ct(ecdr)
yield mbuf + self._ct(ecdr)
finally:
if errf:
bos.unlink(errf["ap"])

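The StreamZip changes above stop yielding every tiny zip header on its own: small pieces accumulate in `mbuf` and are only flushed once at least 16 KiB is buffered, which reduces per-write overhead for archives with many small files. The same idea as a generic generator wrapper; the 16384 threshold is the one used in the diff, everything else is a sketch:

```python
def coalesce(chunks, min_send=16384):
    """re-chunk a byte stream so the consumer sees fewer, larger writes"""
    mbuf = b""
    for buf in chunks:
        mbuf += buf
        if len(mbuf) >= min_send:
            yield mbuf
            mbuf = b""
    if mbuf:  # flush the remainder
        yield mbuf


# usage sketch:
#   for chunk in coalesce(zip_generator()):
#       sock.sendall(chunk)
```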
@@ -15,6 +15,7 @@ from .util import (
|
||||
E_ADDR_IN_USE,
|
||||
E_ADDR_NOT_AVAIL,
|
||||
E_UNREACH,
|
||||
IP6ALL,
|
||||
Netdev,
|
||||
min_ex,
|
||||
sunpack,
|
||||
@@ -254,6 +255,9 @@ class TcpSrv(object):
|
||||
srvs: list[socket.socket] = []
|
||||
for srv in self.srv:
|
||||
ip, port = srv.getsockname()[:2]
|
||||
if ip == IP6ALL:
|
||||
ip = "::" # jython
|
||||
|
||||
try:
|
||||
srv.listen(self.args.nc)
|
||||
try:
|
||||
@@ -275,6 +279,8 @@ class TcpSrv(object):
|
||||
srv.close()
|
||||
continue
|
||||
|
||||
t = "\n\nERROR: could not open listening socket, probably because one of the server ports ({}) is busy on one of the requested interfaces ({}); avoid this issue by specifying a different port (-p 3939) and/or a specific interface to listen on (-i 192.168.56.1)\n"
|
||||
self.log("tcpsrv", t.format(port, ip), 1)
|
||||
raise
|
||||
|
||||
bound.append((ip, port))
|
||||
|
||||
@@ -108,6 +108,7 @@ class ThumbCli(object):
|
||||
if st.st_size:
|
||||
ret = tpath = tp
|
||||
fmt = ret.rsplit(".")[1]
|
||||
break
|
||||
else:
|
||||
abort = True
|
||||
except:
|
||||
|
||||
@@ -22,7 +22,7 @@ from copy import deepcopy
from queue import Queue

from .__init__ import ANYWIN, PY2, TYPE_CHECKING, WINDOWS
from .authsrv import LEELOO_DALLAS, VFS, AuthSrv
from .authsrv import LEELOO_DALLAS, SSEELOG, VFS, AuthSrv
from .bos import bos
from .cfg import vf_bmap, vf_vmap
from .fsutil import Fstab
@@ -267,11 +267,49 @@ class Up2k(object):
}
return json.dumps(ret, indent=4)

def get_unfinished(self) -> str:
if PY2 or not self.mutex.acquire(timeout=0.5):
return "{}"

ret: dict[str, tuple[int, int]] = {}
try:
for ptop, tab2 in self.registry.items():
nbytes = 0
nfiles = 0
drp = self.droppable.get(ptop, {})
for wark, job in tab2.items():
if wark in drp:
continue

nfiles += 1
try:
# close enough on average
nbytes += len(job["need"]) * job["size"] // len(job["hash"])
except:
pass

ret[ptop] = (nbytes, nfiles)
finally:
self.mutex.release()

return json.dumps(ret, indent=4)

def get_volsize(self, ptop: str) -> tuple[int, int]:
with self.mutex:
return self._get_volsize(ptop)

def get_volsizes(self, ptops: list[str]) -> list[tuple[int, int]]:
ret = []
with self.mutex:
for ptop in ptops:
ret.append(self._get_volsize(ptop))

return ret

def _get_volsize(self, ptop: str) -> tuple[int, int]:
if "e2ds" not in self.flags.get(ptop, {}):
return (0, 0)

cur = self.cur[ptop]
nbytes = self.volsize[cur]
nfiles = self.volnfiles[cur]
@@ -791,9 +829,9 @@ class Up2k(object):

reg = {}
drp = None
path = os.path.join(histpath, "up2k.snap")
if bos.path.exists(path):
with gzip.GzipFile(path, "rb") as f:
snap = os.path.join(histpath, "up2k.snap")
if bos.path.exists(snap):
with gzip.GzipFile(snap, "rb") as f:
j = f.read().decode("utf-8")

reg2 = json.loads(j)
@@ -804,20 +842,20 @@ class Up2k(object):
pass

for k, job in reg2.items():
path = djoin(job["ptop"], job["prel"], job["name"])
if bos.path.exists(path):
fp = djoin(job["ptop"], job["prel"], job["name"])
if bos.path.exists(fp):
reg[k] = job
job["poke"] = time.time()
job["busy"] = {}
else:
self.log("ign deleted file in snap: [{}]".format(path))
self.log("ign deleted file in snap: [{}]".format(fp))

if drp is None:
drp = [k for k, v in reg.items() if not v.get("need", [])]
else:
drp = [x for x in drp if x in reg]

t = "loaded snap {} |{}| ({})".format(path, len(reg.keys()), len(drp or []))
t = "loaded snap {} |{}| ({})".format(snap, len(reg.keys()), len(drp or []))
ta = [t] + self._vis_reg_progress(reg)
self.log("\n".join(ta))

@@ -829,8 +867,11 @@ class Up2k(object):
if not HAVE_SQLITE3 or "e2d" not in flags or "d2d" in flags:
return None

if bos.makedirs(histpath):
hidedir(histpath)
try:
if bos.makedirs(histpath):
hidedir(histpath)
except:
return None

try:
cur = self._open_db(db_path)
@@ -911,6 +952,11 @@ class Up2k(object):
rtop = absreal(top)
n_add = n_rm = 0
try:
if not bos.listdir(rtop):
t = "volume /%s at [%s] is empty; will not be indexed as this could be due to an offline filesystem"
self.log(t % (vol.vpath, rtop), 6)
return True, False

n_add = self._build_dir(
db,
top,
@@ -946,7 +992,11 @@ class Up2k(object):

db.c.connection.commit()

if vol.flags.get("vmaxb") or vol.flags.get("vmaxn"):
if (
vol.flags.get("vmaxb")
or vol.flags.get("vmaxn")
or (self.args.stats and not self.args.nos_vol)
):
zs = "select count(sz), sum(sz) from up"
vn, vb = db.c.execute(zs).fetchone()
vb = vb or 0
@@ -955,7 +1005,7 @@ class Up2k(object):
self.volnfiles[db.c] = vn
vmaxb = unhumanize(vol.flags.get("vmaxb") or "0")
vmaxn = unhumanize(vol.flags.get("vmaxn") or "0")
t = "{} / {} ( {} / {} files) in {}".format(
t = "{:>5} / {:>5} ( {:>5} / {:>5} files) in {}".format(
humansize(vb, True),
humansize(vmaxb, True),
humansize(vn, True).rstrip("B"),
@@ -1005,7 +1055,7 @@ class Up2k(object):
if WINDOWS:
rd = rd.replace("\\", "/").strip("/")

g = statdir(self.log_func, not self.args.no_scandir, False, cdir)
g = statdir(self.log_func, not self.args.no_scandir, True, cdir)
gl = sorted(g)
partials = set([x[0] for x in gl if "PARTIAL" in x[0]])
for iname, inf in gl:
@@ -1020,6 +1070,12 @@ class Up2k(object):
continue

lmod = int(inf.st_mtime)
if stat.S_ISLNK(inf.st_mode):
try:
inf = bos.stat(abspath)
except:
continue

sz = inf.st_size
if fat32 and not ffat and inf.st_mtime % 2:
fat32 = False
@@ -1400,9 +1456,11 @@ class Up2k(object):
pf = "v{}, {:.0f}+".format(n_left, b_left / 1024 / 1024)
self.pp.msg = pf + abspath

st = bos.stat(abspath)
# throws on broken symlinks (always did)
stl = bos.lstat(abspath)
st = bos.stat(abspath) if stat.S_ISLNK(stl.st_mode) else stl
mt2 = int(stl.st_mtime)
sz2 = st.st_size
mt2 = int(st.st_mtime)

if nohash or not sz2:
w2 = up2k_wark_from_metadata(self.salt, sz2, mt2, rd, fn)
@@ -1424,6 +1482,13 @@ class Up2k(object):
if w == w2:
continue

# symlink mtime was inconsistent before v1.9.4; check if that's it
if st != stl and (nohash or not sz2):
mt2b = int(st.st_mtime)
w2b = up2k_wark_from_metadata(self.salt, sz2, mt2b, rd, fn)
if w == w2b:
continue

rewark.append((drd, dfn, w2, sz2, mt2))

t = "hash mismatch: {}\n db: {} ({} byte, {})\n fs: {} ({} byte, {})"
@@ -2358,27 +2423,31 @@ class Up2k(object):
cur = jcur
ptop = None # use cj or job as appropriate

if not job and wark in reg:
# ensure the files haven't been deleted manually
rj = reg[wark]
names = [rj[x] for x in ["name", "tnam"] if x in rj]
for fn in names:
path = djoin(rj["ptop"], rj["prel"], fn)
try:
if bos.path.getsize(path) > 0 or not rj["need"]:
# upload completed or both present
break
except:
# missing; restart
if not self.args.nw and not n4g:
t = "forgetting deleted partial upload at {}"
self.log(t.format(path))
del reg[wark]
break

if job or wark in reg:
job = job or reg[wark]
if (
job["ptop"] == cj["ptop"]
and job["prel"] == cj["prel"]
and job["name"] == cj["name"]
job["ptop"] != cj["ptop"]
or job["prel"] != cj["prel"]
or job["name"] != cj["name"]
):
# ensure the files haven't been deleted manually
names = [job[x] for x in ["name", "tnam"] if x in job]
for fn in names:
path = djoin(job["ptop"], job["prel"], fn)
try:
if bos.path.getsize(path) > 0:
# upload completed or both present
break
except:
# missing; restart
if not self.args.nw and not n4g:
job = None
break
else:
# file contents match, but not the path
src = djoin(job["ptop"], job["prel"], job["name"])
dst = djoin(cj["ptop"], cj["prel"], cj["name"])
@@ -2660,7 +2729,7 @@ class Up2k(object):
if not job:
known = " ".join([x for x in self.registry[ptop].keys()])
self.log("unknown wark [{}], known: {}".format(wark, known))
raise Pebkac(400, "unknown wark")
raise Pebkac(400, "unknown wark" + SSEELOG)

if chash not in job["need"]:
msg = "chash = {} , need:\n".format(chash)

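The new `get_unfinished` above reports, per volume, how many uploads are incomplete and roughly how many bytes are still missing; since every chunk of an up2k upload has (nearly) the same size, the remaining bytes are estimated from the fraction of chunk hashes not yet received. As plain arithmetic (field names follow the job dict shown in the diff):

```python
def estimate_remaining(job: dict) -> int:
    """bytes left to receive for one partial upload (close enough on average)"""
    total_chunks = len(job["hash"])    # one hash per chunk of the file
    missing_chunks = len(job["need"])  # chunks not yet received
    return job["size"] * missing_chunks // total_chunks


# example: a 100 MiB file split into 64 chunks, 16 of them still missing
job = {"size": 100 * 1024 * 1024, "hash": [0] * 64, "need": [0] * 16}
print(estimate_remaining(job))  # -> 26214400 (25 MiB)
```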
@@ -56,6 +56,8 @@ E_ADDR_IN_USE = _ens("EADDRINUSE WSAEADDRINUSE")
E_ACCESS = _ens("EACCES WSAEACCES")
E_UNREACH = _ens("EHOSTUNREACH WSAEHOSTUNREACH ENETUNREACH WSAENETUNREACH")

IP6ALL = "0:0:0:0:0:0:0:0"

try:
import ctypes
@@ -66,7 +68,9 @@ except:

try:
HAVE_SQLITE3 = True
import sqlite3 # pylint: disable=unused-import # typechk
import sqlite3

assert hasattr(sqlite3, "connect") # graalpy
except:
HAVE_SQLITE3 = False

@@ -294,11 +298,19 @@ REKOBO_KEY = {
REKOBO_LKEY = {k.lower(): v for k, v in REKOBO_KEY.items()}

_exestr = "python3 python ffmpeg ffprobe cfssl cfssljson cfssl-certinfo"
CMD_EXEB = set(_exestr.encode("utf-8").split())
CMD_EXES = set(_exestr.split())

pybin = sys.executable or ""
if EXE:
pybin = ""
for zsg in "python3 python".split():
try:
if ANYWIN:
zsg += ".exe"

zsg = shutil.which(zsg)
if zsg:
pybin = zsg
@@ -926,7 +938,8 @@ class Magician(object):
class Garda(object):
"""ban clients for repeated offenses"""

def __init__(self, cfg: str) -> None:
def __init__(self, cfg: str, uniq: bool = True) -> None:
self.uniq = uniq
try:
a, b, c = cfg.strip().split(",")
self.lim = int(a)
@@ -972,7 +985,7 @@ class Garda(object):
# assume /64 clients; drop 4 groups
ip = IPv6Address(ip).exploded[:-20]

if prev:
if prev and self.uniq:
if self.prev.get(ip) == prev:
return 0, ip

@@ -1228,12 +1241,15 @@ def ren_open(
except OSError as ex_:
ex = ex_

if ex.errno == errno.EINVAL and not asciified:
# EPERM: android13
if ex.errno in (errno.EINVAL, errno.EPERM) and not asciified:
asciified = True
bname, fname = [
zs.encode("ascii", "replace").decode("ascii").replace("?", "_")
for zs in [bname, fname]
]
zsl = []
for zs in (bname, fname):
zs = zs.encode("ascii", "replace").decode("ascii")
zs = re.sub(r"[^][a-zA-Z0-9(){}.,+=!-]", "_", zs)
zsl.append(zs)
bname, fname = zsl
continue

# ENOTSUP: zfs on ubuntu 20.04
@@ -1444,7 +1460,7 @@ class MultipartParser(object):
for buf in iterable:
ret += buf
if len(ret) > max_len:
raise Pebkac(400, "field length is too long")
raise Pebkac(422, "field length is too long")

return ret

@@ -2121,7 +2137,7 @@ def list_ips() -> list[str]:
def yieldfile(fn: str) -> Generator[bytes, None, None]:
with open(fsenc(fn), "rb", 512 * 1024) as f:
while True:
buf = f.read(64 * 1024)
buf = f.read(128 * 1024)
if not buf:
break

@@ -2442,6 +2458,14 @@ def runcmd(
bout: bytes
berr: bytes

if ANYWIN:
if isinstance(argv[0], (bytes, bytearray)):
if argv[0] in CMD_EXEB:
argv[0] += b".exe"
else:
if argv[0] in CMD_EXES:
argv[0] += ".exe"

p = sp.Popen(argv, stdout=cout, stderr=cerr, **ka)
if not timeout or PY2:
bout, berr = p.communicate(sin)

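Two of the util.py changes above concern `Garda`, the repeat-offender ban tracker: a new `uniq` flag lets some counters record every hit instead of deduplicating identical consecutive requests, and IPv6 clients keep being bucketed by their /64 prefix so one host cannot dodge a ban by rotating addresses within its subnet. A small sketch of just the address-bucketing part, using the stdlib `ipaddress` module; the `ban_key` helper is illustrative, not copyparty's API:

```python
from ipaddress import IPv6Address


def ban_key(ip: str) -> str:
    """collapse an IPv6 address to its /64 prefix; keep IPv4 as-is"""
    if ":" not in ip:
        return ip
    # exploded form is 8 groups of 4 hex digits; dropping the last
    # 20 characters removes the final 4 groups (the host part)
    return IPv6Address(ip).exploded[:-20]


print(ban_key("2001:db8:85a3::8a2e:370:7334"))  # -> "2001:0db8:85a3:0000"
print(ban_key("203.0.113.7"))                   # -> "203.0.113.7"
```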
@@ -261,7 +261,7 @@ window.baguetteBox = (function () {
|
||||
setloop(1);
|
||||
else if (k == "BracketRight")
|
||||
setloop(2);
|
||||
else if (e.shiftKey)
|
||||
else if (e.shiftKey && k != 'KeyR')
|
||||
return;
|
||||
else if (k == "ArrowLeft" || k == "KeyJ")
|
||||
showPreviousImage();
|
||||
@@ -310,7 +310,7 @@ window.baguetteBox = (function () {
|
||||
options = {};
|
||||
setOptions(o);
|
||||
if (tt.en)
|
||||
tt.show.bind(this)();
|
||||
tt.show.call(this);
|
||||
}
|
||||
|
||||
function setVmode() {
|
||||
@@ -356,7 +356,7 @@ window.baguetteBox = (function () {
|
||||
|
||||
setVmode();
|
||||
if (tt.en)
|
||||
tt.show.bind(this)();
|
||||
tt.show.call(this);
|
||||
}
|
||||
|
||||
function findfile() {
|
||||
@@ -376,7 +376,12 @@ window.baguetteBox = (function () {
|
||||
else
|
||||
(vid() || ebi('bbox-overlay')).requestFullscreen();
|
||||
}
|
||||
catch (ex) { alert(ex); }
|
||||
catch (ex) {
|
||||
if (IPHONE)
|
||||
alert('sorry, apple decided to make this impossible on iphones (should work on ipad tho)');
|
||||
else
|
||||
alert(ex);
|
||||
}
|
||||
}
|
||||
|
||||
function tglsel() {
|
||||
@@ -519,7 +524,7 @@ window.baguetteBox = (function () {
|
||||
options[item] = newOptions[item];
|
||||
}
|
||||
|
||||
var an = options.animation = sread('ganim') || anims[ANIM ? 0 : 2];
|
||||
var an = options.animation = sread('ganim', anims) || anims[ANIM ? 0 : 2];
|
||||
btnAnim.textContent = ['⇄', '⮺', '⚡'][anims.indexOf(an)];
|
||||
btnAnim.setAttribute('tt', 'animation: ' + an);
|
||||
|
||||
@@ -870,7 +875,7 @@ window.baguetteBox = (function () {
|
||||
|
||||
if (loopB !== null) {
|
||||
timer.add(loopchk);
|
||||
sethash(window.location.hash.slice(1).split('&')[0] + '&t=' + (loopA || 0) + '-' + loopB);
|
||||
sethash(location.hash.slice(1).split('&')[0] + '&t=' + (loopA || 0) + '-' + loopB);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -968,7 +973,7 @@ window.baguetteBox = (function () {
|
||||
clmod(btnPrev, 'off', 't');
|
||||
clmod(btnNext, 'off', 't');
|
||||
|
||||
if (Date.now() - ctime <= 500)
|
||||
if (Date.now() - ctime <= 500 && !IPHONE)
|
||||
tglfull();
|
||||
|
||||
ctime = Date.now();
|
||||
|
||||
@@ -493,6 +493,7 @@ html.dz {
|
||||
--err-ts: #500;
|
||||
|
||||
text-shadow: none;
|
||||
font-family: 'scp', monospace, monospace;
|
||||
}
|
||||
html.dy {
|
||||
--fg: #000;
|
||||
@@ -727,18 +728,22 @@ a:hover {
|
||||
html.y #files thead th {
|
||||
box-shadow: 0 1px 0 rgba(0,0,0,0.12);
|
||||
}
|
||||
html #files.hhpick thead th {
|
||||
color: #f7d;
|
||||
background: #000;
|
||||
box-shadow: .1em .2em 0 #f6c inset, -.1em -.1em 0 #f6c inset;
|
||||
}
|
||||
#files td {
|
||||
margin: 0;
|
||||
padding: .3em .5em;
|
||||
background: var(--bg);
|
||||
max-width: var(--file-td-w);
|
||||
word-wrap: break-word;
|
||||
overflow: hidden;
|
||||
}
|
||||
#files tr:nth-child(2n) td {
|
||||
background: var(--row-alt);
|
||||
}
|
||||
#files td+td+td {
|
||||
max-width: 30em;
|
||||
overflow: hidden;
|
||||
}
|
||||
#files td+td {
|
||||
box-shadow: 1px 0 0 0 rgba(128,128,128,var(--f-sh1)) inset, 0 1px 0 rgba(255,255,255,var(--f-sh2)) inset, 0 -1px 0 rgba(255,255,255,var(--f-sh2)) inset;
|
||||
}
|
||||
@@ -861,7 +866,7 @@ html.y #path a:hover {
|
||||
}
|
||||
.mdo,
|
||||
.mdo * {
|
||||
line-height: 1.4em;
|
||||
line-height: 1.5em;
|
||||
}
|
||||
#srv_info,
|
||||
#srv_info2,
|
||||
@@ -1398,6 +1403,9 @@ input[type="checkbox"]:checked+label {
|
||||
color: #0e0;
|
||||
color: var(--a);
|
||||
}
|
||||
html.dz input {
|
||||
font-family: 'scp', monospace, monospace;
|
||||
}
|
||||
.opwide div>span>input+label {
|
||||
padding: .3em 0 .3em .3em;
|
||||
margin: 0 0 0 -.3em;
|
||||
@@ -1578,6 +1586,7 @@ html.cz .btn {
|
||||
border-bottom: .2em solid #709;
|
||||
}
|
||||
html.dz .btn {
|
||||
font-size: 1em;
|
||||
box-shadow: 0 0 0 .1em #080 inset;
|
||||
}
|
||||
html.dz .tgl.btn.on {
|
||||
@@ -1621,6 +1630,12 @@ html.cz .tgl.btn.on {
|
||||
list-style: none;
|
||||
border-top: 1px solid var(--bg-u5);
|
||||
}
|
||||
#tree li.offline>a:first-child:before {
|
||||
content: '❌';
|
||||
position: absolute;
|
||||
margin-left: -.25em;
|
||||
z-index: 3;
|
||||
}
|
||||
#tree ul a.sel {
|
||||
background: #000;
|
||||
background: var(--bg-d3);
|
||||
@@ -2102,12 +2117,12 @@ html.y #bbox-overlay figcaption a {
|
||||
}
|
||||
.bbox-btn,
|
||||
#bbox-btns {
|
||||
opacity: 1;
|
||||
opacity: 1;
|
||||
animation: opacity .2s infinite ease-in-out;
|
||||
}
|
||||
.bbox-btn.off,
|
||||
#bbox-btns.off {
|
||||
opacity: 0;
|
||||
opacity: 0;
|
||||
}
|
||||
#bbox-overlay button {
|
||||
cursor: pointer;
|
||||
@@ -2377,7 +2392,7 @@ html.y #bbox-overlay figcaption a {
|
||||
display: block;
|
||||
}
|
||||
#u2bm sup {
|
||||
font-weight: bold;
|
||||
font-weight: bold;
|
||||
}
|
||||
#u2notbtn {
|
||||
display: none;
|
||||
@@ -2761,6 +2776,9 @@ html.c .opbox,
|
||||
html.a .opbox {
|
||||
margin: 1.5em 0 0 0;
|
||||
}
|
||||
html.dz .opview input.i {
|
||||
width: calc(100% - 18em);
|
||||
}
|
||||
html.c #tree,
|
||||
html.c #treeh,
|
||||
html.a #tree,
|
||||
@@ -2813,6 +2831,9 @@ html.a #u2btn {
|
||||
html.ay #u2btn {
|
||||
box-shadow: .4em .4em 0 #ccc;
|
||||
}
|
||||
html.dz #u2btn {
|
||||
letter-spacing: -.033em;
|
||||
}
|
||||
html.c #u2conf.ww #u2btn,
|
||||
html.a #u2conf.ww #u2btn {
|
||||
margin: -2em .5em -3em 0;
|
||||
@@ -2975,13 +2996,13 @@ html.b .btn {
|
||||
top: -.1em;
|
||||
}
|
||||
html.b #op_up2k.srch sup {
|
||||
color: #fc0;
|
||||
color: #fc0;
|
||||
}
|
||||
html.by #u2btn sup {
|
||||
color: #06b;
|
||||
color: #06b;
|
||||
}
|
||||
html.by #op_up2k.srch sup {
|
||||
color: #b70;
|
||||
color: #b70;
|
||||
}
|
||||
html.bz #u2cards a.act {
|
||||
box-shadow: 0 -.1em .2em var(--bg-d2);
|
||||
|
||||
@@ -11,7 +11,7 @@
|
||||
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
|
||||
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/browser.css?_={{ ts }}">
|
||||
{%- if css %}
|
||||
<link rel="stylesheet" media="screen" href="{{ css }}?_={{ ts }}">
|
||||
<link rel="stylesheet" media="screen" href="{{ css }}_={{ ts }}">
|
||||
{%- endif %}
|
||||
</head>
|
||||
|
||||
@@ -29,7 +29,7 @@
|
||||
|
||||
<div id="op_player" class="opview opbox opwide"></div>
|
||||
|
||||
<div id="op_bup" class="opview opbox act">
|
||||
<div id="op_bup" class="opview opbox {% if not ls0 %}act{% endif %}">
|
||||
<div id="u2err"></div>
|
||||
<form method="post" enctype="multipart/form-data" accept-charset="utf-8" action="{{ url_suf }}">
|
||||
<input type="hidden" name="act" value="bput" />
|
||||
@@ -39,7 +39,7 @@
|
||||
<a id="bbsw" href="?b=u" rel="nofollow"><br />switch to basic browser</a>
|
||||
</div>
|
||||
|
||||
<div id="op_mkdir" class="opview opbox act">
|
||||
<div id="op_mkdir" class="opview opbox {% if not ls0 %}act{% endif %}">
|
||||
<form method="post" enctype="multipart/form-data" accept-charset="utf-8" action="{{ url_suf }}">
|
||||
<input type="hidden" name="act" value="mkdir" />
|
||||
📂<input type="text" name="name" class="i" placeholder="awesome mix vol.1">
|
||||
@@ -55,7 +55,7 @@
|
||||
</form>
|
||||
</div>
|
||||
|
||||
<div id="op_msg" class="opview opbox act">
|
||||
<div id="op_msg" class="opview opbox {% if not ls0 %}act{% endif %}">
|
||||
<form method="post" enctype="application/x-www-form-urlencoded" accept-charset="utf-8" action="{{ url_suf }}">
|
||||
📟<input type="text" name="msg" class="i" placeholder="lorem ipsum dolor sit amet">
|
||||
<input type="submit" value="send msg to srv log">
|
||||
@@ -142,6 +142,7 @@
|
||||
themes = {{ themes }},
|
||||
dtheme = "{{ dtheme }}",
|
||||
srvinf = "{{ srv_info }}",
|
||||
s_name = "{{ s_name }}",
|
||||
lang = "{{ lang }}",
|
||||
dfavico = "{{ favico }}",
|
||||
def_hcols = {{ def_hcols|tojson }},
|
||||
@@ -165,14 +166,14 @@
|
||||
readme = {{ readme|tojson }},
|
||||
ls0 = {{ ls0|tojson }};
|
||||
|
||||
document.documentElement.className = localStorage.theme || dtheme;
|
||||
document.documentElement.className = localStorage.cpp_thm || dtheme;
|
||||
</script>
|
||||
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/baguettebox.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/browser.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/up2k.js?_={{ ts }}"></script>
|
||||
{%- if js %}
|
||||
<script src="{{ js }}?_={{ ts }}"></script>
|
||||
<script src="{{ js }}_={{ ts }}"></script>
|
||||
{%- endif %}
|
||||
</body>
|
||||
|
||||
|
||||
@@ -80,6 +80,7 @@ var Ls = {
|
||||
"textfile-viewer",
|
||||
["I/K", "prev/next file"],
|
||||
["M", "close textfile"],
|
||||
["E", "edit textfile"],
|
||||
["S", "select file (for cut/rename)"],
|
||||
]
|
||||
],
|
||||
@@ -186,7 +187,7 @@ var Ls = {
|
||||
"cl_hiddenc": "hidden columns",
|
||||
"cl_hidec": "hide",
|
||||
"cl_reset": "reset",
|
||||
"cl_hpick": "click one column header to hide in the table below",
|
||||
"cl_hpick": "tap on column headers to hide in the table below",
|
||||
"cl_hcancel": "column hiding aborted",
|
||||
|
||||
"ct_thumb": "in grid-view, toggle icons or thumbnails$NHotkey: T",
|
||||
@@ -262,7 +263,8 @@ var Ls = {
|
||||
"mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
|
||||
"mm_e5xx": "Could not play audio; server error ",
|
||||
"mm_nof": "not finding any more audio files nearby",
|
||||
"mm_pwrsv": "<p>it looks like playback is being interrupted by your phone's power-saving settings!</p>" + '<p>please go to <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png">the app settings of your browser</a> and then <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png">allow unrestricted battery usage</a> to fix it.</p><p>(probably a good idea to use a separate browser dedicated for just music streaming...)</p>',
|
||||
"mm_pwrsv": "<p>it looks like playback is being interrupted by your phone's power-saving settings!</p>" + '<p>please go to <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png">the app settings of your browser</a> and then <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png">allow unrestricted battery usage</a> to fix it.</p><p><em>however,</em> it could also be due to the browser\'s autoplay settings;</p><p>Firefox: tap the icon on the left side of the address bar, then select "autoplay" and "allow audio"</p><p>Chrome: the problem will gradually dissipate as you play more music on this site</p>',
|
||||
"mm_iosblk": "<p>your web browser thinks the audio playback is unwanted, and it decided to block playback until you start another track manually... unfortunately we are both powerless in telling it otherwise</p><p>supposedly this will get better as you continue playing music on this site, but I'm unfamiliar with apple devices so idk if that's true</p><p>you could try another browser, maybe firefox or chrome?</p>",
|
||||
"mm_hnf": "that song no longer exists",
|
||||
|
||||
"im_hnf": "that image no longer exists",
|
||||
@@ -326,6 +328,7 @@ var Ls = {
|
||||
"tvt_prev": "show previous document$NHotkey: i\">⬆ prev",
|
||||
"tvt_next": "show next document$NHotkey: K\">⬇ next",
|
||||
"tvt_sel": "select file ( for cut / delete / ... )$NHotkey: S\">sel",
|
||||
"tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
|
||||
|
||||
"gt_msel": "enable file selection; ctrl-click a file to override$N$N<em>when active: doubleclick a file / folder to open it</em>$N$NHotkey: S\">multiselect",
|
||||
"gt_zoom": "zoom",
|
||||
@@ -374,7 +377,10 @@ var Ls = {
|
||||
"fu_xe1": "failed to load unpost list from server:\n\nerror ",
|
||||
"fu_xe2": "404: File not found??",
|
||||
|
||||
"fz_tar": "plain gnutar file (linux / mac)",
|
||||
"fz_tar": "uncompressed gnu-tar file (linux / mac)",
|
||||
"fz_pax": "uncompressed pax-format tar (slower)",
|
||||
"fz_targz": "gnu-tar with gzip level 3 compression$N$Nthis is usually very slow, so$Nuse uncompressed tar instead",
|
||||
"fz_tarxz": "gnu-tar with xz level 1 compression$N$Nthis is usually very slow, so$Nuse uncompressed tar instead",
|
||||
"fz_zip8": "zip with utf8 filenames (maybe wonky on windows 7 and older)",
|
||||
"fz_zipd": "zip with traditional cp437 filenames, for really old software",
|
||||
"fz_zipc": "cp437 with crc32 computed early,$Nfor MS-DOS PKZIP v2.04g (october 1993)$N(takes longer to process before download can start)",
|
||||
@@ -403,6 +409,7 @@ var Ls = {
|
||||
"u_https3": "for better performance",
|
||||
"u_ancient": 'your browser is impressively ancient -- maybe you should <a href="#" onclick="goto(\'bup\')">use bup instead</a>',
|
||||
"u_nowork": "need firefox 53+ or chrome 57+ or iOS 11+",
|
||||
"u_uri": "to dragdrop images from other browser windows,\nplease drop it onto the big upload button",
|
||||
"u_enpot": 'switch to <a href="#">potato UI</a> (may improve upload speed)',
|
||||
"u_depot": 'switch to <a href="#">fancy UI</a> (may reduce upload speed)',
|
||||
"u_gotpot": 'switching to the potato UI for improved upload speed,\n\nfeel free to disagree and switch back!',
|
||||
@@ -542,6 +549,7 @@ var Ls = {
|
||||
"dokumentviser",
|
||||
["I/K", "forr./neste fil"],
|
||||
["M", "lukk tekstdokument"],
|
||||
["E", "rediger tekstdokument"]
|
||||
["S", "velg fil (for F2/ctrl-x/...)"]
|
||||
]
|
||||
],
|
||||
@@ -648,7 +656,7 @@ var Ls = {
|
||||
"cl_hiddenc": "skjulte kolonner",
|
||||
"cl_hidec": "skjul",
|
||||
"cl_reset": "nullstill",
|
||||
"cl_hpick": "klikk overskriften til kolonnen du ønsker å skjule i tabellen nedenfor",
|
||||
"cl_hpick": "klikk på overskriften til kolonnene du ønsker å skjule i tabellen nedenfor",
|
||||
"cl_hcancel": "kolonne-skjuling avbrutt",
|
||||
|
||||
"ct_thumb": "vis miniatyrbilder istedenfor ikoner$NSnarvei: T",
|
||||
@@ -724,7 +732,8 @@ var Ls = {
|
||||
"mm_e403": "Avspilling feilet: Tilgang nektet.\n\nKanskje du ble logget ut?\nPrøv å trykk F5 for å laste siden på nytt.",
|
||||
"mm_e5xx": "Avspilling feilet: ",
|
||||
"mm_nof": "finner ikke flere sanger i nærheten",
|
||||
"mm_pwrsv": "<p>det ser ut som musikken ble avbrutt av telefonen sine strømsparings-innstillinger!</p>" + '<p>ta en tur innom <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png">app-innstillingene til nettleseren din</a> og så <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png">tillat ubegrenset batteriforbruk</a></p><p>(sikkert smart å ha en egen nettleser kun for musikkspilling...)</p>',
|
||||
"mm_pwrsv": "<p>det ser ut som musikken ble avbrutt av telefonen sine strømsparings-innstillinger!</p>" + '<p>ta en tur innom <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png">app-innstillingene til nettleseren din</a> og så <a target="_blank" href="https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png">tillat ubegrenset batteriforbruk</a></p><p>NB: det kan også være pga. autoplay-innstillingene, så prøv dette:</p><p>Firefox: klikk på ikonet i venstre side av addressefeltet, velg "autoplay" og "tillat lyd"</p><p>Chrome: problemet vil minske gradvis jo mer musikk du spiller på denne siden</p>',
|
||||
"mm_iosblk": "<p>nettleseren din tror at musikken er uønsket, og den bestemte seg for å stoppe avspillingen slik at du manuelt må velge en ny sang... dessverre er både du og jeg maktesløse når den har bestemt seg.</p><p>det ryktes at problemet vil minske jo mer musikk du spiller på denne siden, men jeg er ikke godt kjent med apple-dingser så jeg er ikke sikker.</p><p>kanskje firefox eller chrome fungerer bedre?</p>",
|
||||
"mm_hnf": "sangen finnes ikke lenger",
|
||||
|
||||
"im_hnf": "bildet finnes ikke lenger",
|
||||
@@ -783,11 +792,12 @@ var Ls = {
|
||||
"tv_xe1": "kunne ikke laste tekstfil:\n\nfeil ",
|
||||
"tv_xe2": "404, Fil ikke funnet",
|
||||
"tv_lst": "tekstfiler i mappen",
|
||||
"tvt_close": "gå tilbake til mappen$NSnarvei: M\">❌ close",
|
||||
"tvt_close": "gå tilbake til mappen$NSnarvei: M\">❌ lukk",
|
||||
"tvt_dl": "last ned denne filen\">💾 last ned",
|
||||
"tvt_prev": "vis forrige dokument$NSnarvei: i\">⬆ prev",
|
||||
"tvt_next": "vis neste dokument$NSnarvei: K\">⬇ next",
|
||||
"tvt_sel": "markér filen ( for utklipp / sletting / ... )$NSnarvei: S\">sel",
|
||||
"tvt_prev": "vis forrige dokument$NSnarvei: i\">⬆ forr.",
|
||||
"tvt_next": "vis neste dokument$NSnarvei: K\">⬇ neste",
|
||||
"tvt_sel": "markér filen ( for utklipp / sletting / ... )$NSnarvei: S\">merk",
|
||||
"tvt_edit": "redigér filen$NSnarvei: E\">✏️ endre",
|
||||
|
||||
"gt_msel": "markér filer istedenfor å åpne dem; ctrl-klikk filer for å overstyre$N$N<em>når aktiv: dobbelklikk en fil / mappe for å åpne</em>$N$NSnarvei: S\">markering",
|
||||
"gt_zoom": "zoom",
|
||||
@@ -837,6 +847,9 @@ var Ls = {
|
||||
"fu_xe2": "404: Filen finnes ikke??",
|
||||
|
||||
"fz_tar": "ukomprimert gnu-tar arkiv, for linux og mac",
|
||||
"fz_pax": "ukomprimert pax-tar arkiv, litt tregere",
|
||||
"fz_targz": "gnu-tar pakket med gzip (nivå 3)$N$NNB: denne er veldig treg;$Nukomprimert tar er bedre",
|
||||
"fz_tarxz": "gnu-tar pakket med xz (nivå 1)$N$NNB: denne er veldig treg;$Nukomprimert tar er bedre",
|
||||
"fz_zip8": "zip med filnavn i utf8 (noe problematisk på windows 7 og eldre)",
|
||||
"fz_zipd": "zip med filnavn i cp437, for høggamle maskiner",
|
||||
"fz_zipc": "cp437 med tidlig crc32,$Nfor MS-DOS PKZIP v2.04g (oktober 1993)$N(øker behandlingstid på server)",
|
||||
@@ -865,6 +878,7 @@ var Ls = {
|
||||
"u_https3": "for høyere hastighet",
|
||||
"u_ancient": 'nettleseren din er prehistorisk -- mulig du burde <a href="#" onclick="goto(\'bup\')">bruke bup istedenfor</a>',
|
||||
"u_nowork": "krever firefox 53+, chrome 57+, eller iOS 11+",
|
||||
"u_uri": "for å laste opp bilder ifra andre nettleservinduer,\nslipp bildet rett på den store last-opp-knappen",
|
||||
"u_enpot": 'bytt til <a href="#">enkelt UI</a> (gir sannsynlig raskere opplastning)',
|
||||
"u_depot": 'bytt til <a href="#">snæsent UI</a> (gir sannsynlig tregere opplastning)',
|
||||
"u_gotpot": 'byttet til et enklere UI for å laste opp raskere,\n\ndu kan gjerne bytte tilbake altså!',
|
||||
@@ -927,12 +941,27 @@ var Ls = {
|
||||
"lang_set": "passer det å laste siden på nytt?",
|
||||
},
|
||||
};
|
||||
var L = Ls[sread("lang") || lang];
|
||||
if (Ls.eng && L != Ls.eng) {
|
||||
for (var k in Ls.eng)
|
||||
if (!L[k])
|
||||
L[k] = Ls.eng[k];
|
||||
var LANGS = ["eng", "nor"],
|
||||
L = Ls[sread("cpp_lang", LANGS) || lang] || Ls.eng || Ls.nor;
|
||||
|
||||
for (var a = 0; a < LANGS.length; a++) {
|
||||
for (var b = a + 1; b < LANGS.length; b++) {
|
||||
var i1 = Object.keys(Ls[LANGS[a]]).length > Object.keys(Ls[LANGS[b]]).length ? a : b,
|
||||
i2 = i1 == a ? b : a,
|
||||
t1 = Ls[LANGS[i1]],
|
||||
t2 = Ls[LANGS[i2]];
|
||||
|
||||
for (var k in t1)
|
||||
if (!t2[k]) {
|
||||
console.log("E missing TL", LANGS[i2], k);
|
||||
t2[k] = t1[k];
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (!has(LANGS, lang))
|
||||
alert('unsupported --lang "' + lang + '" specified in server args;\nplease use one of these: ' + LANGS);
|
||||
|
||||
modal.load();
|
||||
|
||||
|
||||
@@ -1136,16 +1165,16 @@ ebi('op_cfg').innerHTML = (
|
||||
'<div>\n' +
|
||||
' <h3>' + L.cl_favico + ' <span id="ico1">🎉</span></h3>\n' +
|
||||
' <div>\n' +
|
||||
' <input type="text" id="icot" style="width:1.3em" value="" tt="' + L.cft_text + '" />' +
|
||||
' <input type="text" id="icof" style="width:2em" value="" tt="' + L.cft_fg + '" />' +
|
||||
' <input type="text" id="icob" style="width:2em" value="" tt="' + L.cft_bg + '" />' +
|
||||
' <input type="text" id="icot" value="" ' + NOAC + ' style="width:1.3em" tt="' + L.cft_text + '" />' +
|
||||
' <input type="text" id="icof" value="" ' + NOAC + ' style="width:2em" tt="' + L.cft_fg + '" />' +
|
||||
' <input type="text" id="icob" value="" ' + NOAC + ' style="width:2em" tt="' + L.cft_bg + '" />' +
|
||||
' </td>\n' +
|
||||
' </div>\n' +
|
||||
'</div>\n' +
|
||||
'<div>\n' +
|
||||
' <h3>' + L.cl_bigdir + '</h3>\n' +
|
||||
' <div>\n' +
|
||||
' <input type="text" id="bd_lim" value="250" style="width:4em" tt="' + L.cdt_lim + '" />' +
|
||||
' <input type="text" id="bd_lim" value="250" ' + NOAC + ' style="width:4em" tt="' + L.cdt_lim + '" />' +
|
||||
' <a id="bd_ask" class="tgl btn" href="#" tt="' + L.cdt_ask + '">ask</a>\n' +
|
||||
' </td>\n' +
|
||||
' </div>\n' +
|
||||
@@ -1297,7 +1326,8 @@ function set_files_html(html) {
|
||||
}
|
||||
|
||||
|
||||
var ACtx = window.AudioContext || window.webkitAudioContext,
|
||||
// actx breaks background album playback on ios
|
||||
var ACtx = !IPHONE && (window.AudioContext || window.webkitAudioContext),
|
||||
noih = /[?&]v\b/.exec('' + location),
|
||||
hash0 = location.hash,
|
||||
mp;
|
||||
@@ -1333,13 +1363,13 @@ var mpl = (function () {
|
||||
) : '') +
|
||||
|
||||
'<div><h3>' + L.ml_tint + '</h3><div>' +
|
||||
'<input type="text" id="pb_tint" style="width:2.4em" value="0" tt="' + L.mt_tint + '" />' +
|
||||
'<input type="text" id="pb_tint" value="0" ' + NOAC + ' style="width:2.4em" tt="' + L.mt_tint + '" />' +
|
||||
'</div></div>' +
|
||||
|
||||
'<div><h3>' + L.ml_eq + '</h3><div id="audio_eq"></div></div>');
|
||||
|
||||
var r = {
|
||||
"pb_mode": (sread('pb_mode') || 'next').split('-')[0],
|
||||
"pb_mode": (sread('pb_mode', ['loop', 'next']) || 'next').split('-')[0],
|
||||
"os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
|
||||
'traversals': 0,
|
||||
};
|
||||
@@ -2036,8 +2066,7 @@ var pbar = (function () {
|
||||
t_redraw = setTimeout(r.drawpos, sm > 50 ? 20 : 50);
|
||||
};
|
||||
|
||||
window.addEventListener('resize', r.onresize);
|
||||
r.onresize();
|
||||
onresize100.add(r.onresize, true);
|
||||
return r;
|
||||
})();
|
||||
|
||||
@@ -2099,8 +2128,7 @@ var vbar = (function () {
|
||||
clearTimeout(untext);
|
||||
untext = setTimeout(r.draw, 1000);
|
||||
};
|
||||
window.addEventListener('resize', r.onresize);
|
||||
r.onresize();
|
||||
onresize100.add(r.onresize, true);
|
||||
|
||||
var rect;
|
||||
function mousedown(e) {
|
||||
@@ -2125,6 +2153,11 @@ var vbar = (function () {
|
||||
lastv = Date.now();
|
||||
mp.setvol(mul);
|
||||
r.draw();
|
||||
|
||||
setTimeout(function () {
|
||||
if (IPHONE && mp.au && mul < 0.9 && mp.au.volume == 1)
|
||||
toast.inf(6, 'volume doesnt work because <a href="https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11" target="_blank">apple says no</a>');
|
||||
}, 1);
|
||||
}
|
||||
can.onmousedown = function (e) {
|
||||
if (e.button !== 0)
|
||||
@@ -2179,6 +2212,7 @@ function song_skip(n, dirskip) {
|
||||
|
||||
if (dirskip && ofs + 1 && ofs > mp.order.length - 2) {
|
||||
toast.inf(10, L.mm_nof);
|
||||
console.log("mm_nof1");
|
||||
mpl.traversals = 0;
|
||||
return;
|
||||
}
|
||||
@@ -2205,13 +2239,14 @@ function next_song_cmn(e) {
|
||||
}
|
||||
if (mpl.traversals++ < 5) {
|
||||
if (MOBILE && t_fchg && Date.now() - t_fchg > 30 * 1000)
|
||||
modal.alert(L.mm_pwrsv);
|
||||
modal.alert(IPHONE ? L.mm_iosblk : L.mm_pwrsv);
|
||||
|
||||
t_fchg = document.hasFocus() ? 0 : Date.now();
|
||||
treectl.ls_cb = next_song_cmn;
|
||||
return tree_neigh(1);
|
||||
}
|
||||
toast.inf(10, L.mm_nof);
|
||||
console.log("mm_nof2");
|
||||
mpl.traversals = 0;
|
||||
t_fchg = 0;
|
||||
}
|
||||
@@ -2361,7 +2396,7 @@ var mpui = (function () {
|
||||
// cannot check document.hasFocus to avoid false positives;
|
||||
// it continues on power-on, doesn't need to be in-browser
|
||||
if (MOBILE && Date.now() - t_fchg > 30 * 1000)
|
||||
modal.alert(L.mm_pwrsv);
|
||||
modal.alert(IPHONE ? L.mm_iosblk : L.mm_pwrsv);
|
||||
|
||||
t_fchg = 0;
|
||||
}
|
||||
@@ -2713,7 +2748,7 @@ var afilt = (function () {
|
||||
html.push('<td><a href="#" class="eq_step" step="0.5" band="' + b + '">+</a></td>');
|
||||
h2.push('<td>' + vs[a][1] + '</td>');
|
||||
h4.push('<td><a href="#" class="eq_step" step="-0.5" band="' + b + '">–</a></td>');
|
||||
h3.push('<td><input type="text" class="eq_gain" band="' + b + '" value="' + vs[a][2] + '" /></td>');
|
||||
h3.push('<td><input type="text" class="eq_gain" ' + NOAC + ' band="' + b + '" value="' + vs[a][2] + '" /></td>');
|
||||
}
|
||||
html = html.join('\n') + '</tr><tr>';
|
||||
html += h2.join('\n') + '</tr><tr>';
|
||||
@@ -2927,6 +2962,7 @@ function evau_error(e) {
|
||||
err = e404;
|
||||
|
||||
toast.warn(15, esc(basenames(err + mfile)));
|
||||
console.log(basenames(err + mfile));
|
||||
|
||||
if (em.startsWith('MEDIA_ELEMENT_ERROR:')) {
|
||||
// chromish for 40x
|
||||
@@ -3038,7 +3074,7 @@ function eval_hash() {
|
||||
goto('search');
|
||||
var i = ebi('q_raw');
|
||||
i.value = uricom_dec(v.slice(3));
|
||||
return i.oninput();
|
||||
return i.onkeydown({ 'key': 'Enter' });
|
||||
}
|
||||
|
||||
if (v.indexOf('#v=') === 0) {
|
||||
@@ -3423,8 +3459,8 @@ var fileman = (function () {
|
||||
'<a id="rn_case" class="tgl btn" href="#" tt="' + L.fr_case + '</a>',
|
||||
'</div>',
|
||||
'<div id="rn_vadv"><table>',
|
||||
'<tr><td>regex</td><td><input type="text" id="rn_re" tt="regex search pattern to apply to original filenames; capturing groups can be referenced in the format field below like <code>(1)</code> and <code>(2)</code> and so on" placeholder="^[0-9]+[\\. ]+(.*) - (.*)" /></td></tr>',
|
||||
'<tr><td>format</td><td><input type="text" id="rn_fmt" tt="inspired by foobar2000:$N<code>(title)</code> is replaced by song title,$N<code>[(artist) - ](title)</code> skips the first part if artist is blank$N<code>$lpad((tn),2,0)</code> pads tracknumber to 2 digits" placeholder="[(artist) - ](title).(ext)" /></td></tr>',
|
||||
'<tr><td>regex</td><td><input type="text" id="rn_re" ' + NOAC + ' tt="regex search pattern to apply to original filenames; capturing groups can be referenced in the format field below like <code>(1)</code> and <code>(2)</code> and so on" placeholder="^[0-9]+[\\. ]+(.*) - (.*)" /></td></tr>',
|
||||
'<tr><td>format</td><td><input type="text" id="rn_fmt" ' + NOAC + ' tt="inspired by foobar2000:$N<code>(title)</code> is replaced by song title,$N<code>[(artist) - ](title)</code> skips the first part if artist is blank$N<code>$lpad((tn),2,0)</code> pads tracknumber to 2 digits" placeholder="[(artist) - ](title).(ext)" /></td></tr>',
|
||||
'<tr><td>preset</td><td><select id="rn_pre"></select>',
|
||||
'<button id="rn_pdel">❌ ' + L.fr_pdel + '</button>',
|
||||
'<button id="rn_pnew">💾 ' + L.fr_pnew + '</button>',
|
||||
@@ -3613,7 +3649,7 @@ var fileman = (function () {
|
||||
|
||||
if (!f.length) {
|
||||
toast.ok(2, 'rename OK');
|
||||
treectl.goto(get_evpath());
|
||||
treectl.goto();
|
||||
return rn_cancel();
|
||||
}
|
||||
|
||||
@@ -3660,7 +3696,7 @@ var fileman = (function () {
|
||||
if (err !== 'xbd')
|
||||
toast.ok(2, L.fd_ok);
|
||||
|
||||
treectl.goto(get_evpath());
|
||||
treectl.goto();
|
||||
return;
|
||||
}
|
||||
toast.show('inf r', 0, esc(L.fd_busy.format(vps.length + 1, vp)), 'r');
|
||||
@@ -3778,7 +3814,7 @@ var fileman = (function () {
|
||||
|
||||
if (!vp) {
|
||||
toast.ok(2, L.fp_ok);
|
||||
treectl.goto(get_evpath());
|
||||
treectl.goto();
|
||||
r.tx(srcdir);
|
||||
return;
|
||||
}
|
||||
@@ -3894,9 +3930,9 @@ var showfile = (function () {
|
||||
window.Prism = { 'manual': true };
|
||||
var em = QS('#bdoc>pre');
|
||||
if (em)
|
||||
em = [r.sname(window.location.search), location.hash, em.textContent];
|
||||
em = [r.sname(location.search), location.hash, em.textContent];
|
||||
else {
|
||||
var m = /[?&]doc=([^&]+)/.exec(window.location.search);
|
||||
var m = /[?&]doc=([^&]+)/.exec(location.search);
|
||||
if (m) {
|
||||
setTimeout(function () {
|
||||
r.show(uricom_dec(m[1]), true);
|
||||
@@ -3916,7 +3952,7 @@ var showfile = (function () {
|
||||
};
|
||||
|
||||
r.active = function () {
|
||||
return document.location.search.indexOf('doc=') + 1;
|
||||
return location.search.indexOf('doc=') + 1;
|
||||
};
|
||||
|
||||
r.getlang = function (fn) {
|
||||
@@ -4004,6 +4040,8 @@ var showfile = (function () {
|
||||
|
||||
ebi('files').style.display = ebi('gfiles').style.display = ebi('lazy').style.display = ebi('pro').style.display = ebi('epi').style.display = 'none';
|
||||
ebi('dldoc').setAttribute('href', url);
|
||||
ebi('editdoc').setAttribute('href', url + (url.indexOf('?') > 0 ? '&' : '?') + 'edit');
|
||||
ebi('editdoc').style.display = (has(perms, 'write') && (is_md || has(perms, 'delete'))) ? '' : 'none';
|
||||
|
||||
var wr = ebi('bdoc'),
|
||||
defer = !Prism.highlightElement;
|
||||
@@ -4015,7 +4053,7 @@ var showfile = (function () {
|
||||
|
||||
el = el || QS('#doc>code');
|
||||
Prism.highlightElement(el);
|
||||
if (el.className == 'language-ans')
|
||||
if (el.className == 'language-ans' || (!lang && /\x1b\[[0-9;]{0,16}m/.exec(txt.slice(0, 4096))))
|
||||
r.ansify(el);
|
||||
}
|
||||
catch (ex) { }
|
||||
@@ -4174,6 +4212,7 @@ var showfile = (function () {
|
||||
'<a href="#" class="btn" id="prevdoc" tt="' + L.tvt_prev + '</a>\n' +
|
||||
'<a href="#" class="btn" id="nextdoc" tt="' + L.tvt_next + '</a>\n' +
|
||||
'<a href="#" class="btn" id="seldoc" tt="' + L.tvt_sel + '</a>\n' +
|
||||
'<a href="#" class="btn" id="editdoc" tt="' + L.tvt_edit + '</a>\n' +
|
||||
'</div>'
|
||||
);
|
||||
ebi('xdoc').onclick = function () {
|
||||
@@ -4310,14 +4349,14 @@ var thegrid = (function () {
|
||||
if (ctrl(e) && !treectl.csel && !r.sel)
|
||||
return true;
|
||||
|
||||
return gclick.bind(this)(e, false);
|
||||
return gclick.call(this, e, false);
|
||||
}
|
||||
|
||||
function gclick2(e) {
|
||||
if (ctrl(e) || !r.sel)
|
||||
return true;
|
||||
|
||||
return gclick.bind(this)(e, true);
|
||||
return gclick.call(this, e, true);
|
||||
}
|
||||
|
||||
function gclick(e, dbl) {
|
||||
@@ -4332,7 +4371,7 @@ var thegrid = (function () {
|
||||
tr = td.parentNode;
|
||||
|
||||
if ((r.sel && !dbl && !ctrl(e)) || (treectl.csel && (e.shiftKey || ctrl(e)))) {
|
||||
td.onclick.bind(td)(e);
|
||||
td.onclick.call(td, e);
|
||||
if (e.shiftKey)
|
||||
return r.loadsel();
|
||||
clmod(this, 'sel', clgot(tr, 'sel'));
|
||||
@@ -4618,8 +4657,11 @@ function tree_neigh(n) {
|
||||
if (act >= links.length)
|
||||
act = 0;
|
||||
|
||||
treectl.dir_cb = tree_scrollto;
|
||||
links[act].click();
|
||||
if (showfile.active())
|
||||
links[act].click();
|
||||
else
|
||||
treectl.treego.call(links[act]);
|
||||
|
||||
links[act].focus();
|
||||
}
|
||||
|
||||
@@ -4671,6 +4713,40 @@ function hkhelp() {
|
||||
}
|
||||
|
||||
|
||||
var fselgen, fselctr;
|
||||
function fselfunw(e, ae, d, rem) {
|
||||
fselctr = 0;
|
||||
var gen = fselgen = Date.now();
|
||||
if (rem)
|
||||
rem *= window.innerHeight;
|
||||
|
||||
var selfun = function () {
|
||||
var el = ae[d + 'ElementSibling'];
|
||||
if (!el || gen != fselgen)
|
||||
return;
|
||||
|
||||
el.focus();
|
||||
var elh = el.offsetHeight;
|
||||
if (ctrl(e))
|
||||
document.documentElement.scrollTop += (d == 'next' ? 1 : -1) * elh;
|
||||
|
||||
if (e.shiftKey) {
|
||||
clmod(el, 'sel', 't');
|
||||
msel.origin_tr(el);
|
||||
msel.selui();
|
||||
}
|
||||
|
||||
rem -= elh;
|
||||
if (rem > 0) {
|
||||
ae = document.activeElement;
|
||||
if (++fselctr % 5 && rem > elh * (FIREFOX ? 5 : 2))
|
||||
selfun();
|
||||
else
|
||||
setTimeout(selfun, 1);
|
||||
}
|
||||
}
|
||||
selfun();
|
||||
}
|
||||
document.onkeydown = function (e) {
|
||||
if (e.altKey || e.isComposing)
|
||||
return;
|
||||
@@ -4715,24 +4791,14 @@ document.onkeydown = function (e) {
|
||||
}
|
||||
|
||||
if (aet == 'tr' && ae.closest('#files')) {
|
||||
var d = '';
|
||||
var d = '', rem = 0;
|
||||
if (k == 'ArrowUp') d = 'previous';
|
||||
if (k == 'ArrowDown') d = 'next';
|
||||
if (k == 'PageUp') { d = 'previous'; rem = 0.6; }
|
||||
if (k == 'PageDown') { d = 'next'; rem = 0.6; }
|
||||
if (d) {
|
||||
var el = ae[d + 'ElementSibling'];
|
||||
if (el) {
|
||||
el.focus();
|
||||
if (ctrl(e))
|
||||
document.documentElement.scrollTop += (d == 'next' ? 1 : -1) * el.offsetHeight;
|
||||
|
||||
if (e.shiftKey) {
|
||||
clmod(el, 'sel', 't');
|
||||
msel.origin_tr(el);
|
||||
msel.selui();
|
||||
}
|
||||
|
||||
return ev(e);
|
||||
}
|
||||
fselfunw(e, ae, d, rem);
|
||||
return ev(e);
|
||||
}
|
||||
if (k == 'Space') {
|
||||
clmod(ae, 'sel', 't');
|
||||
@@ -4852,6 +4918,8 @@ document.onkeydown = function (e) {
|
||||
if (showfile.active()) {
|
||||
if (k == 'KeyS')
|
||||
showfile.tglsel();
|
||||
if (k == 'KeyE' && ebi('editdoc').style.display != 'none')
|
||||
ebi('editdoc').click();
|
||||
}
|
||||
};
|
||||
|
||||
@@ -4913,7 +4981,7 @@ document.onkeydown = function (e) {
|
||||
for (var a = 0; a < trs.length; a += 2) {
|
||||
html.push('<table>' + (trs[a].concat(trs[a + 1])).join('\n') + '</table>');
|
||||
}
|
||||
html.push('<table id="tq_raw"><tr><td>raw</td><td><input id="q_raw" type="text" name="q" placeholder="( tags like *nhato* or tags like *taishi* ) and ( not tags like *nhato* or not tags like *taishi* )" /></td></tr></table>');
|
||||
html.push('<table id="tq_raw"><tr><td>raw</td><td><input id="q_raw" type="text" name="q" ' + NOAC + ' placeholder="( tags like *nhato* or tags like *taishi* ) and ( not tags like *nhato* or not tags like *taishi* )" /></td></tr></table>');
|
||||
ebi('srch_form').innerHTML = html.join('\n');
|
||||
|
||||
var o = QSA('#op_search input');
|
||||
@@ -4933,7 +5001,7 @@ document.onkeydown = function (e) {
|
||||
search_in_progress = 0;
|
||||
|
||||
function ev_search_input() {
|
||||
var v = this.value,
|
||||
var v = unsmart(this.value),
|
||||
id = this.getAttribute('id');
|
||||
|
||||
if (id.slice(-1) == 'v') {
|
||||
@@ -4970,7 +5038,7 @@ document.onkeydown = function (e) {
|
||||
if (search_in_progress)
|
||||
return;
|
||||
|
||||
var q = ebi('q_raw').value,
|
||||
var q = unsmart(ebi('q_raw').value),
|
||||
vq = ebi('files').getAttribute('q_raw');
|
||||
|
||||
srch_msg(false, (q == vq) ? '' : L.sm_prev + (vq ? vq : '(*)'));
|
||||
@@ -4982,7 +5050,7 @@ document.onkeydown = function (e) {
|
||||
for (var b = 1; b < sconf[a].length; b++) {
|
||||
var k = sconf[a][b][0],
|
||||
chk = 'srch_' + k + 'c',
|
||||
vs = ebi('srch_' + k + 'v').value,
|
||||
vs = unsmart(ebi('srch_' + k + 'v').value),
|
||||
tvs = [];
|
||||
|
||||
if (a == 1)
|
||||
@@ -5075,7 +5143,7 @@ document.onkeydown = function (e) {
|
||||
xhr.setRequestHeader('Content-Type', 'text/plain');
|
||||
xhr.onload = xhr.onerror = xhr_search_results;
|
||||
xhr.ts = Date.now();
|
||||
xhr.q_raw = ebi('q_raw').value;
|
||||
xhr.q_raw = unsmart(ebi('q_raw').value);
|
||||
xhr.send(JSON.stringify({ "q": xhr.q_raw, "n": cap }));
|
||||
}
|
||||
|
||||
@@ -5181,6 +5249,7 @@ document.onkeydown = function (e) {
|
||||
}
|
||||
})();
|
||||
|
||||
|
||||
function aligngriditems() {
|
||||
if (!treectl)
|
||||
return;
|
||||
@@ -5203,11 +5272,33 @@ function aligngriditems() {
|
||||
ebi('ggrid').style.justifyContent = treectl.hidden ? 'center' : 'space-between';
|
||||
}
|
||||
}
|
||||
window.addEventListener('resize', aligngriditems);
|
||||
onresize100.add(aligngriditems);
|
||||
|
||||
|
||||
var filecolwidth = (function () {
|
||||
var lastwidth = -1;
|
||||
|
||||
return function () {
|
||||
var vw = window.innerWidth / parseFloat(getComputedStyle(document.body)['font-size']),
|
||||
w = Math.floor(vw - 2);
|
||||
|
||||
if (w == lastwidth)
|
||||
return;
|
||||
|
||||
lastwidth = w;
|
||||
try {
|
||||
document.documentElement.style.setProperty('--file-td-w', w + 'em');
|
||||
}
|
||||
catch (ex) { }
|
||||
}
|
||||
})();
|
||||
onresize100.add(filecolwidth, true);
|
||||
|
||||
|
||||
var treectl = (function () {
|
||||
var r = {
|
||||
"hidden": true,
|
||||
"sb_msg": false,
|
||||
"ls_cb": null,
|
||||
"dir_cb": tree_scrollto,
|
||||
"pdir": []
|
||||
@@ -5224,7 +5315,10 @@ var treectl = (function () {
|
||||
bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
|
||||
bcfg_bind(r, 'csel', 'csel', false);
|
||||
bcfg_bind(r, 'dots', 'dotfiles', false, function (v) {
|
||||
r.goto(get_evpath());
|
||||
r.goto();
|
||||
var xhr = new XHR();
|
||||
xhr.open('GET', SR + '/?setck=dots=' + (v ? 'y' : ''), true);
|
||||
xhr.send();
|
||||
});
|
||||
bcfg_bind(r, 'dir1st', 'dir1st', true, function (v) {
|
||||
treectl.gentab(get_evpath(), treectl.lsc);
|
||||
@@ -5446,6 +5540,9 @@ var treectl = (function () {
|
||||
};
|
||||
|
||||
r.goto = function (url, push, back) {
|
||||
if (!url || !url.startsWith('/'))
|
||||
url = get_evpath() + (url || '');
|
||||
|
||||
get_tree("", url, true);
|
||||
r.reqls(url, push, back);
|
||||
};
|
||||
@@ -5550,7 +5647,7 @@ var treectl = (function () {
|
||||
}
|
||||
|
||||
links[a].className = cl;
|
||||
links[a].onclick = treego;
|
||||
links[a].onclick = r.treego;
|
||||
links[a].onmouseenter = nowrap ? menter : null;
|
||||
links[a].onmouseleave = nowrap ? mleave : null;
|
||||
}
|
||||
@@ -5612,7 +5709,7 @@ var treectl = (function () {
|
||||
return els[a].click();
|
||||
}
|
||||
|
||||
function treego(e) {
|
||||
r.treego = function (e) {
|
||||
if (ctrl(e))
|
||||
return true;
|
||||
|
||||
@@ -5624,7 +5721,7 @@ var treectl = (function () {
|
||||
}
|
||||
var href = this.getAttribute('href');
|
||||
if (R && !href.startsWith(SR)) {
|
||||
window.location = href;
|
||||
location = href;
|
||||
return;
|
||||
}
|
||||
r.reqls(href, true);
|
||||
@@ -5643,6 +5740,7 @@ var treectl = (function () {
|
||||
xhr.send();
|
||||
|
||||
r.nvis = r.lim;
|
||||
r.sb_msg = false;
|
||||
r.nextdir = xhr.top;
|
||||
enspin('#tree');
|
||||
enspin(thegrid.en ? '#gfiles' : '#files');
|
||||
@@ -5669,7 +5767,9 @@ var treectl = (function () {
|
||||
return;
|
||||
|
||||
r.nextdir = null;
|
||||
var cur = ebi('files').getAttribute('ts');
|
||||
var cdir = get_evpath(),
|
||||
cur = ebi('files').getAttribute('ts');
|
||||
|
||||
if (cur && parseInt(cur) > this.ts) {
|
||||
console.log("reject ls");
|
||||
return;
|
||||
@@ -5680,7 +5780,7 @@ var treectl = (function () {
|
||||
var res = JSON.parse(this.responseText);
|
||||
}
|
||||
catch (ex) {
|
||||
window.location = this.top;
|
||||
location = this.top;
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -5713,12 +5813,19 @@ var treectl = (function () {
|
||||
despin('#files');
|
||||
despin('#gfiles');
|
||||
|
||||
sandbox(ebi('pro'), sb_lg, '', res.logues ? res.logues[0] || "" : "");
|
||||
sandbox(ebi('epi'), sb_lg, '', res.logues ? res.logues[1] || "" : "");
|
||||
var lg0 = res.logues ? res.logues[0] || "" : "",
|
||||
lg1 = res.logues ? res.logues[1] || "" : "",
|
||||
dirchg = get_evpath() != cdir;
|
||||
|
||||
sandbox(ebi('pro'), sb_lg, '', lg0);
|
||||
if (dirchg)
|
||||
sandbox(ebi('epi'), sb_lg, '', lg1);
|
||||
|
||||
clmod(ebi('epi'), 'mdo');
|
||||
if (res.readme)
|
||||
if (res.readme && treectl.ireadme)
|
||||
show_readme(res.readme);
|
||||
else if (!dirchg)
|
||||
sandbox(ebi('epi'), sb_lg, '', lg1);
|
||||
|
||||
if (this.hpush && !this.back) {
|
||||
var ofs = ebi('wrap').offsetTop;
|
||||
@@ -5740,7 +5847,7 @@ var treectl = (function () {
|
||||
|
||||
for (var a = 0; a < res.files.length; a++)
|
||||
if (/^index.html?(\?|$)/i.exec(res.files[a].href)) {
|
||||
window.location = vjoin(top, res.files[a].href);
|
||||
location = vjoin(top, res.files[a].href);
|
||||
return true;
|
||||
}
|
||||
};
|
||||
@@ -5749,9 +5856,15 @@ var treectl = (function () {
|
||||
var nodes = res.dirs.concat(res.files),
|
||||
html = mk_files_header(res.taglist),
|
||||
sel = r.lsc === res ? msel.getsel() : [],
|
||||
ae = document.activeElement,
|
||||
cid = null,
|
||||
plain = [],
|
||||
seen = {};
|
||||
|
||||
if (ae && /^tr$/i.exec(ae.nodeName))
|
||||
if (ae = ae.querySelector('a[id]'))
|
||||
cid = ae.getAttribute('id');
|
||||
|
||||
r.lsc = res;
|
||||
if (res.unlist) {
|
||||
var ptn = new RegExp(res.unlist);
|
||||
@@ -5848,6 +5961,12 @@ var treectl = (function () {
|
||||
}
|
||||
if (sel.length)
|
||||
msel.loadsel(sel);
|
||||
else
|
||||
msel.origin_id(null);
|
||||
|
||||
if (cid) try {
|
||||
ebi(cid).closest('tr').focus();
|
||||
} catch (ex) { }
|
||||
|
||||
setTimeout(eval_hash, 1);
|
||||
}
|
||||
@@ -5965,7 +6084,8 @@ var treectl = (function () {
|
||||
for (var a = 0; a < keys.length; a++) {
|
||||
var kk = keys[a],
|
||||
ks = kk.slice(1),
|
||||
k = uricom_sdec(ks),
|
||||
ded = ks.endsWith('\n'),
|
||||
k = uricom_sdec(ded ? ks.replace(/\n$/, '') : ks),
|
||||
hek = esc(k[0]),
|
||||
uek = k[1] ? uricom_enc(k[0], true) : k[0],
|
||||
url = '/' + (top ? top + uek : uek) + '/',
|
||||
@@ -5978,7 +6098,7 @@ var treectl = (function () {
|
||||
ret += '<li>' + link + '\n<ul>\n' + subtree + '</ul></li>\n';
|
||||
}
|
||||
else {
|
||||
ret += '<li>' + link + '</li>\n';
|
||||
ret += (ded ? '<li class="offline">' : '<li>') + link + '</li>\n';
|
||||
}
|
||||
}
|
||||
return ret;
|
||||
@@ -6231,10 +6351,11 @@ var filecols = (function () {
|
||||
var ths = QSA('#files>thead th>span');
|
||||
for (var a = 0, aa = ths.length; a < aa; a++) {
|
||||
var th = ths[a].parentElement,
|
||||
toh = ths[a].outerHTML, // !ff10
|
||||
ttv = L.cols[ths[a].textContent];
|
||||
|
||||
if (!MOBILE) {
|
||||
th.innerHTML = '<div class="cfg"><a href="#">-</a></div>' + ths[a].outerHTML;
|
||||
if (!MOBILE && toh) {
|
||||
th.innerHTML = '<div class="cfg"><a href="#">-</a></div>' + toh;
|
||||
th.getElementsByTagName('a')[0].onclick = ev_row_tgl;
|
||||
}
|
||||
if (ttv) {
|
||||
@@ -6339,9 +6460,9 @@ var filecols = (function () {
|
||||
ebi('hcolsh').onclick = function (e) {
|
||||
ev(e);
|
||||
if (r.picking)
|
||||
return r.unpick(false);
|
||||
return r.unpick();
|
||||
|
||||
var lbs = QSA('#files>thead th>span');
|
||||
var lbs = QSA('#files>thead th');
|
||||
for (var a = 0; a < lbs.length; a++) {
|
||||
lbs[a].onclick = function (e) {
|
||||
ev(e);
|
||||
@@ -6349,19 +6470,20 @@ var filecols = (function () {
|
||||
toast.hide();
|
||||
|
||||
r.hide(e.target.textContent);
|
||||
r.unpick(true);
|
||||
};
|
||||
};
|
||||
r.picking = true;
|
||||
clmod(ebi('files'), 'hhpick', 1);
|
||||
toast.inf(0, L.cl_hpick, 'pickhide');
|
||||
};
|
||||
|
||||
r.unpick = function (picked) {
|
||||
r.unpick = function () {
|
||||
r.picking = false;
|
||||
if (!picked)
|
||||
toast.inf(5, L.cl_hcancel);
|
||||
toast.inf(5, L.cl_hcancel);
|
||||
|
||||
var lbs = QSA('#files>thead th>span');
|
||||
clmod(ebi('files'), 'hhpick');
|
||||
|
||||
var lbs = QSA('#files>thead th');
|
||||
for (var a = 0; a < lbs.length; a++)
|
||||
lbs[a].onclick = null;
|
||||
};
|
||||
@@ -6425,7 +6547,9 @@ var mukey = (function () {
|
||||
"6d ", "7d ", "8d ", "9d ", "10d", "11d", "12d", "1d ", "2d ", "3d ", "4d ", "5d ",
|
||||
"6m ", "7m ", "8m ", "9m ", "10m", "11m", "12m", "1m ", "2m ", "3m ", "4m ", "5m "
|
||||
]
|
||||
};
|
||||
},
|
||||
defnot = 'rekobo_alnum';
|
||||
|
||||
var map = {},
|
||||
html = [];
|
||||
|
||||
@@ -6450,7 +6574,7 @@ var mukey = (function () {
|
||||
}
|
||||
|
||||
function load_notation(notation) {
|
||||
swrite("key_notation", notation);
|
||||
swrite("cpp_keynot", notation);
|
||||
map = {};
|
||||
var dst = maps[notation];
|
||||
for (var k in maps)
|
||||
@@ -6498,7 +6622,10 @@ var mukey = (function () {
|
||||
}
|
||||
}
|
||||
|
||||
var notation = sread("key_notation") || "rekobo_alnum";
|
||||
var notation = sread("cpp_keynot") || defnot;
|
||||
if (!maps[notation])
|
||||
notation = defnot;
|
||||
|
||||
ebi('key_' + notation).checked = true;
|
||||
load_notation(notation);
|
||||
|
||||
@@ -6517,7 +6644,7 @@ var light, theme, themen;
|
||||
var settheme = (function () {
|
||||
var ax = 'abcdefghijklmnopqrstuvwx';
|
||||
|
||||
theme = sread('theme') || 'a';
|
||||
theme = sread('cpp_thm') || 'a';
|
||||
if (!/^[a-x][yz]/.exec(theme))
|
||||
theme = dtheme;
|
||||
|
||||
@@ -6559,7 +6686,7 @@ var settheme = (function () {
|
||||
l = light ? 'y' : 'z';
|
||||
theme = c + l + ' ' + c + ' ' + l;
|
||||
themen = c + l;
|
||||
swrite('theme', theme);
|
||||
swrite('cpp_thm', theme);
|
||||
freshen();
|
||||
}
|
||||
|
||||
@@ -6570,7 +6697,7 @@ var settheme = (function () {
|
||||
|
||||
(function () {
|
||||
function freshen() {
|
||||
lang = sread("lang") || lang;
|
||||
lang = sread("cpp_lang", LANGS) || lang;
|
||||
var html = [];
|
||||
for (var k in Ls)
|
||||
if (Ls.hasOwnProperty(k))
|
||||
@@ -6586,7 +6713,7 @@ var settheme = (function () {
|
||||
function setlang(e) {
|
||||
ev(e);
|
||||
L = Ls[this.textContent];
|
||||
swrite("lang", this.textContent);
|
||||
swrite("cpp_lang", this.textContent);
|
||||
freshen();
|
||||
modal.confirm(Ls.eng.lang_set + "\n\n" + Ls.nor.lang_set, location.reload.bind(location), null);
|
||||
};
|
||||
@@ -6602,6 +6729,9 @@ var arcfmt = (function () {
|
||||
var html = [],
|
||||
fmts = [
|
||||
["tar", "tar", L.fz_tar],
|
||||
["pax", "tar=pax", L.fz_pax],
|
||||
["tgz", "tar=gz", L.fz_targz],
|
||||
["txz", "tar=xz", L.fz_tarxz],
|
||||
["zip", "zip=utf8", L.fz_zip8],
|
||||
["zip_dos", "zip", L.fz_zipd],
|
||||
["zip_crc", "zip=crc", L.fz_zipc]
|
||||
@@ -6631,7 +6761,7 @@ var arcfmt = (function () {
|
||||
|
||||
for (var a = 0, aa = tds.length; a < aa; a++) {
|
||||
var o = tds[a], txt = o.textContent, href = o.getAttribute('href');
|
||||
if (txt != 'tar' && txt != 'zip')
|
||||
if (!/^(zip|tar|pax|tgz|txz)$/.exec(txt))
|
||||
continue;
|
||||
|
||||
var ofs = href.lastIndexOf('?');
|
||||
@@ -6718,7 +6848,9 @@ var msel = (function () {
|
||||
};
|
||||
|
||||
r.loadsel = function (sel) {
|
||||
r.so = r.pr = null;
|
||||
if (!sel || !r.so || !ebi(r.so))
|
||||
r.so = r.pr = null;
|
||||
|
||||
r.sel = [];
|
||||
r.load();
|
||||
|
||||
@@ -6960,7 +7092,7 @@ var msel = (function () {
|
||||
clmod(sf, 'vis');
|
||||
sf.textContent = 'sent: "' + this.msg + '"';
|
||||
setTimeout(function () {
|
||||
treectl.goto(get_evpath());
|
||||
treectl.goto();
|
||||
}, 100);
|
||||
}
|
||||
})();
|
||||
@@ -7035,7 +7167,7 @@ function show_md(md, name, div, url, depth) {
|
||||
wfp_debounce.hide();
|
||||
if (!marked) {
|
||||
if (depth)
|
||||
return toast.warn(10, errmsg + 'failed to load marked.js')
|
||||
return toast.warn(10, errmsg + (window.WebAssembly ? 'failed to load marked.js' : 'your browser is too old'));
|
||||
|
||||
wfp_debounce.n--;
|
||||
return import_js(SR + '/.cpr/deps/marked.js', function () {
|
||||
@@ -7058,7 +7190,12 @@ function show_md(md, name, div, url, depth) {
|
||||
|
||||
try {
|
||||
clmod(div, 'mdo', 1);
|
||||
if (sandbox(div, sb_md, 'mdo', marked.parse(md, marked_opts)))
|
||||
|
||||
var md_html = marked.parse(md, marked_opts);
|
||||
if (!have_emp)
|
||||
md_html = DOMPurify.sanitize(md_html);
|
||||
|
||||
if (sandbox(div, sb_md, 'mdo', md_html))
|
||||
return;
|
||||
|
||||
ext = md_plug.post;
|
||||
@@ -7152,11 +7289,12 @@ function sandbox(tgt, rules, cls, html) {
|
||||
'function say(m){window.parent.postMessage(m,"*")};' +
|
||||
'setTimeout(function(){var its=0,pih=-1,f=function(){' +
|
||||
'var ih=2+Math.min(parseInt(getComputedStyle(d).height),d.scrollHeight);' +
|
||||
'if(ih!=pih){pih=ih;say("iheight #' + tid + ' "+ih,"*")}' +
|
||||
'if(ih!=pih&&!isNaN(ih)){pih=ih;say("iheight #' + tid + ' "+ih,"*")}' +
|
||||
'if(++its<20)return setTimeout(f,20);if(its==20)setInterval(f,200)' +
|
||||
'};f();' +
|
||||
'window.onfocus=function(){say("igot #' + tid + '")};' +
|
||||
'window.onblur=function(){say("ilost #' + tid + '")};' +
|
||||
'window.treectl={"goto":function(a){say("goto #' + tid + ' "+(a||""))}};' +
|
||||
'var el="' + want + '"&&ebi("' + want + '");' +
|
||||
'if(el)say("iscroll #' + tid + ' "+el.offsetTop);' +
|
||||
'md_th_set();' +
|
||||
@@ -7170,16 +7308,24 @@ function sandbox(tgt, rules, cls, html) {
|
||||
fr.setAttribute('title', 'folder ' + tid + 'logue');
|
||||
fr.setAttribute('sandbox', rules ? 'allow-' + rules.replace(/ /g, ' allow-') : '');
|
||||
fr.setAttribute('srcdoc', html);
|
||||
tgt.innerHTML = '';
|
||||
tgt.appendChild(fr);
|
||||
treectl.sb_msg = true;
|
||||
return true;
|
||||
}
|
||||
window.addEventListener("message", function (e) {
|
||||
if (!treectl.sb_msg)
|
||||
return;
|
||||
|
||||
try {
|
||||
console.log('msg:' + e.data);
|
||||
var t = e.data.split(/ /g);
|
||||
if (t[0] == 'iheight') {
|
||||
var el = QS(t[1] + '>iframe');
|
||||
var el = QSA(t[1] + '>iframe');
|
||||
el = el[el.length - 1];
|
||||
if (wfp_debounce.n)
|
||||
while (el.previousSibling)
|
||||
el.parentNode.removeChild(el.previousSibling);
|
||||
|
||||
el.style.height = (parseInt(t[2]) + SBH) + 'px';
|
||||
el.style.visibility = 'unset';
|
||||
wfp_debounce.show();
|
||||
@@ -7196,6 +7342,10 @@ window.addEventListener("message", function (e) {
|
||||
else if (t[0] == 'imshow') {
|
||||
thegrid.imshow(e.data.slice(7));
|
||||
}
|
||||
else if (t[0] == 'goto') {
|
||||
var t = e.data.replace(/^[^ ]+ [^ ]+ /, '').split(/[?&]/)[0];
|
||||
treectl.goto(t, !!t);
|
||||
}
|
||||
} catch (ex) {
|
||||
console.log('msg-err: ' + ex);
|
||||
}
|
||||
@@ -7408,8 +7558,16 @@ function goto_unpost(e) {
|
||||
}
|
||||
|
||||
|
||||
function wintitle(txt) {
|
||||
document.title = (txt ? txt : '') + get_vpath().slice(1, -1).split('/').pop();
|
||||
function wintitle(txt, noname) {
|
||||
if (txt === undefined)
|
||||
txt = '';
|
||||
|
||||
if (s_name && !noname)
|
||||
txt = s_name + ' ' + txt;
|
||||
|
||||
txt += get_vpath().slice(1, -1).split('/').pop();
|
||||
|
||||
document.title = txt;
|
||||
}
|
||||
|
||||
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>{{ svcname }}</title>
|
||||
<title>{{ s_doctitle }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
</head>
|
||||
|
||||
@@ -31,7 +31,7 @@
|
||||
<span id="lno">L#</span>
|
||||
{%- else %}
|
||||
<a href="{{ arg_base }}edit" tt="good: higher performance$Ngood: same document width as viewer$Nbad: assumes you know markdown">edit (basic)</a>
|
||||
<a href="{{ arg_base }}edit2" tt="not in-house so probably less buggy">edit (fancy)</a>
|
||||
<a href="{{ arg_base }}edit2" id="edit2" tt="not in-house so probably less buggy">edit (fancy)</a>
|
||||
<a href="{{ arg_base }}">view raw</a>
|
||||
{%- endif %}
|
||||
</div>
|
||||
|
||||
@@ -52,7 +52,7 @@ var img_load = (function () {
|
||||
var r = {};
|
||||
r.callbacks = [];
|
||||
|
||||
function fire() {
|
||||
var fire = function () {
|
||||
for (var a = 0; a < r.callbacks.length; a++)
|
||||
r.callbacks[a]();
|
||||
}
|
||||
@@ -212,6 +212,8 @@ function convert_markdown(md_text, dest_dom) {
|
||||
|
||||
try {
|
||||
var md_html = marked.parse(md_text, marked_opts);
|
||||
if (!have_emp)
|
||||
md_html = DOMPurify.sanitize(md_html);
|
||||
}
|
||||
catch (ex) {
|
||||
if (ext)
|
||||
@@ -470,7 +472,7 @@ img_load.callbacks = [toc.refresh];
|
||||
// scroll handler
|
||||
var redraw = (function () {
|
||||
var sbs = true;
|
||||
function onresize() {
|
||||
var onresize = function () {
|
||||
if (window.matchMedia)
|
||||
sbs = window.matchMedia('(min-width: 64em)').matches;
|
||||
|
||||
@@ -483,7 +485,7 @@ var redraw = (function () {
|
||||
onscroll();
|
||||
}
|
||||
|
||||
function onscroll() {
|
||||
var onscroll = function () {
|
||||
toc.refresh();
|
||||
}
|
||||
|
||||
@@ -505,6 +507,13 @@ dom_navtgl.onclick = function () {
|
||||
redraw();
|
||||
};
|
||||
|
||||
if (!HTTPS && location.hostname != '127.0.0.1') try {
|
||||
ebi('edit2').onclick = function (e) {
|
||||
toast.err(0, "the fancy editor is only available over https");
|
||||
return ev(e);
|
||||
}
|
||||
} catch (ex) { }
|
||||
|
||||
if (sread('hidenav') == 1)
|
||||
dom_navtgl.onclick();
|
||||
|
||||
|
||||
@@ -92,7 +92,7 @@ var action_stack = null;
|
||||
var nlines = 0;
|
||||
var draw_md = (function () {
|
||||
var delay = 1;
|
||||
function draw_md() {
|
||||
var draw_md = function () {
|
||||
var t0 = Date.now();
|
||||
var src = dom_src.value;
|
||||
convert_markdown(src, dom_pre);
|
||||
@@ -135,7 +135,7 @@ img_load.callbacks = [function () {
|
||||
|
||||
// resize handler
|
||||
redraw = (function () {
|
||||
function onresize() {
|
||||
var onresize = function () {
|
||||
var y = (dom_hbar.offsetTop + dom_hbar.offsetHeight) + 'px';
|
||||
dom_wrap.style.top = y;
|
||||
dom_swrap.style.top = y;
|
||||
@@ -143,12 +143,12 @@ redraw = (function () {
|
||||
map_src = genmap(dom_ref, map_src);
|
||||
map_pre = genmap(dom_pre, map_pre);
|
||||
}
|
||||
function setsbs() {
|
||||
var setsbs = function () {
|
||||
dom_wrap.className = '';
|
||||
dom_swrap.className = '';
|
||||
onresize();
|
||||
}
|
||||
function modetoggle() {
|
||||
var modetoggle = function () {
|
||||
var mode = dom_nsbs.innerHTML;
|
||||
dom_nsbs.innerHTML = mode == 'editor' ? 'preview' : 'editor';
|
||||
mode += ' single';
|
||||
@@ -172,7 +172,7 @@ redraw = (function () {
|
||||
(function () {
|
||||
var skip_src = false, skip_pre = false;
|
||||
|
||||
function scroll(src, srcmap, dst, dstmap) {
|
||||
var scroll = function (src, srcmap, dst, dstmap) {
|
||||
var y = src.scrollTop;
|
||||
if (y < 8) {
|
||||
dst.scrollTop = 0;
|
||||
@@ -278,6 +278,7 @@ function Modpoll() {
|
||||
return;
|
||||
|
||||
var new_md = this.responseText,
|
||||
new_mt = this.getResponseHeader('X-Lastmod3') || r.lastmod,
|
||||
server_ref = server_md.replace(/\r/g, ''),
|
||||
server_now = new_md.replace(/\r/g, '');
|
||||
|
||||
@@ -285,6 +286,7 @@ function Modpoll() {
|
||||
if (r.initial && server_ref != server_now)
|
||||
return modal.confirm('Your browser decided to show an outdated copy of the document!\n\nDo you want to load the latest version from the server instead?', function () {
|
||||
dom_src.value = server_md = new_md;
|
||||
last_modified = new_mt;
|
||||
draw_md();
|
||||
}, null);
|
||||
|
||||
@@ -898,12 +900,12 @@ var set_lno = (function () {
|
||||
pv = null,
|
||||
lno = ebi('lno');
|
||||
|
||||
function poke() {
|
||||
var poke = function () {
|
||||
clearTimeout(t);
|
||||
t = setTimeout(fire, 20);
|
||||
}
|
||||
|
||||
function fire() {
|
||||
var fire = function () {
|
||||
try {
|
||||
clearTimeout(t);
|
||||
|
||||
@@ -928,7 +930,7 @@ var set_lno = (function () {
|
||||
|
||||
// hotkeys / toolbar
|
||||
(function () {
|
||||
function keydown(ev) {
|
||||
var keydown = function (ev) {
|
||||
ev = ev || window.event;
|
||||
var kc = ev.code || ev.keyCode || ev.which,
|
||||
editing = document.activeElement == dom_src;
|
||||
@@ -1056,7 +1058,7 @@ action_stack = (function () {
|
||||
var ignore = false;
|
||||
var ref = dom_src.value;
|
||||
|
||||
function diff(from, to, cpos) {
|
||||
var diff = function (from, to, cpos) {
|
||||
if (from === to)
|
||||
return null;
|
||||
|
||||
@@ -1087,14 +1089,14 @@ action_stack = (function () {
|
||||
};
|
||||
}
|
||||
|
||||
function undiff(from, change) {
|
||||
var undiff = function (from, change) {
|
||||
return {
|
||||
txt: from.substring(0, change.car) + change.txt + from.substring(change.cdr),
|
||||
cpos: change.cpos
|
||||
};
|
||||
}
|
||||
|
||||
function apply(src, dst) {
|
||||
var apply = function (src, dst) {
|
||||
dbg('undos(%d) redos(%d)', hist.un.length, hist.re.length);
|
||||
|
||||
if (src.length === 0)
|
||||
@@ -1118,7 +1120,7 @@ action_stack = (function () {
|
||||
return true;
|
||||
}
|
||||
|
||||
function schedule_push() {
|
||||
var schedule_push = function () {
|
||||
if (ignore) {
|
||||
ignore = false;
|
||||
return;
|
||||
@@ -1129,7 +1131,7 @@ action_stack = (function () {
|
||||
sched_timer = setTimeout(push, 500);
|
||||
}
|
||||
|
||||
function undo() {
|
||||
var undo = function () {
|
||||
if (hist.re.length == 0) {
|
||||
clearTimeout(sched_timer);
|
||||
push();
|
||||
@@ -1137,11 +1139,11 @@ action_stack = (function () {
|
||||
return apply(hist.un, hist.re);
|
||||
}
|
||||
|
||||
function redo() {
|
||||
var redo = function () {
|
||||
return apply(hist.re, hist.un);
|
||||
}
|
||||
|
||||
function push() {
|
||||
var push = function () {
|
||||
var newtxt = dom_src.value;
|
||||
var change = diff(ref, newtxt, sched_cpos);
|
||||
if (change !== null)
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>{{ svcname }}</title>
|
||||
<title>{{ s_doctitle }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<meta name="theme-color" content="#333">
|
||||
@@ -42,7 +42,7 @@
|
||||
{%- if redir %}
|
||||
<script>
|
||||
setTimeout(function() {
|
||||
window.location.replace("{{ redir }}");
|
||||
location.replace("{{ redir }}");
|
||||
}, 1000);
|
||||
</script>
|
||||
{%- endif %}
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>{{ svcname }}</title>
|
||||
<title>{{ s_doctitle }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<meta name="theme-color" content="#333">
|
||||
@@ -110,7 +110,7 @@ var SR = {{ r|tojson }},
|
||||
lang="{{ lang }}",
|
||||
dfavico="{{ favico }}";
|
||||
|
||||
document.documentElement.className=localStorage.theme||"{{ this.args.theme }}";
|
||||
document.documentElement.className=localStorage.cpp_thm||"{{ this.args.theme }}";
|
||||
|
||||
</script>
|
||||
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
|
||||
|
||||
@@ -35,7 +35,7 @@ var Ls = {
|
||||
"v2": "use this server as a local HDD$N$NWARNING: this will show your password!",
|
||||
}
|
||||
},
|
||||
d = Ls[sread("lang") || lang];
|
||||
d = Ls[sread("cpp_lang", ["eng", "nor"]) || lang] || Ls.eng || Ls.nor;
|
||||
|
||||
for (var k in (d || {})) {
|
||||
var f = k.slice(-1),
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>{{ args.doctitle }} @ {{ args.name }}</title>
|
||||
<title>{{ s_doctitle }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<meta name="theme-color" content="#333">
|
||||
@@ -218,7 +218,7 @@ var SR = {{ r|tojson }},
|
||||
lang="{{ lang }}",
|
||||
dfavico="{{ favico }}";
|
||||
|
||||
document.documentElement.className=localStorage.theme||"{{ args.theme }}";
|
||||
document.documentElement.className=localStorage.cpp_thm||"{{ args.theme }}";
|
||||
|
||||
</script>
|
||||
<script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
|
||||
|
||||
@@ -4,6 +4,8 @@
|
||||
src: local('Source Code Pro Regular'), local('SourceCodePro-Regular'), url(deps/scp.woff2) format('woff2');
|
||||
}
|
||||
html {
|
||||
text-size-adjust: 100%;
|
||||
-webkit-text-size-adjust: 100%;
|
||||
touch-action: manipulation;
|
||||
}
|
||||
#tt, #toast {
|
||||
|
||||
@@ -588,7 +588,7 @@ function U2pvis(act, btns, uc, st) {
|
||||
btns[a].onclick = function (e) {
|
||||
ev(e);
|
||||
var newtab = this.getAttribute('act');
|
||||
function go() {
|
||||
var go = function () {
|
||||
for (var b = 0; b < btns.length; b++) {
|
||||
btns[b].className = (
|
||||
btns[b].getAttribute('act') == newtab) ? 'act' : '';
|
||||
@@ -723,7 +723,7 @@ function Donut(uc, st) {
|
||||
|
||||
function strobe() {
|
||||
var txt = strobes.pop();
|
||||
wintitle(txt);
|
||||
wintitle(txt, false);
|
||||
if (!txt)
|
||||
clearInterval(tstrober);
|
||||
}
|
||||
@@ -807,7 +807,7 @@ function up2k_init(subtle) {
|
||||
function init_deps() {
|
||||
if (!loading_deps && !got_deps()) {
|
||||
var fn = 'sha512.' + sha_js + '.js',
|
||||
m = L.u_https1 + ' <a href="' + (window.location + '').replace(':', 's:') + '">' + L.u_https2 + '</a> ' + L.u_https3;
|
||||
m = L.u_https1 + ' <a href="' + (location + '').replace(':', 's:') + '">' + L.u_https2 + '</a> ' + L.u_https3;
|
||||
|
||||
showmodal('<h1>loading ' + fn + '</h1>');
|
||||
import_js(SR + '/.cpr/deps/' + fn, unmodal);
|
||||
@@ -971,7 +971,7 @@ function up2k_init(subtle) {
|
||||
if (++nenters <= 0)
|
||||
nenters = 1;
|
||||
|
||||
if (onover.bind(this)(e))
|
||||
if (onover.call(this, e))
|
||||
return true;
|
||||
|
||||
var mup, up = QS('#up_zd');
|
||||
@@ -995,16 +995,29 @@ function up2k_init(subtle) {
|
||||
function onoverb(e) {
|
||||
// zones are alive; disable cuo2duo branch
|
||||
document.body.ondragover = document.body.ondrop = null;
|
||||
return onover.bind(this)(e);
|
||||
return onover.call(this, e);
|
||||
}
|
||||
function onover(e) {
|
||||
return onovercmn(this, e, false);
|
||||
}
|
||||
function onoverbtn(e) {
|
||||
return onovercmn(this, e, true);
|
||||
}
|
||||
function onovercmn(self, e, btn) {
|
||||
try {
|
||||
var ok = false, dt = e.dataTransfer.types;
|
||||
for (var a = 0; a < dt.length; a++)
|
||||
if (dt[a] == 'Files')
|
||||
ok = true;
|
||||
else if (dt[a] == 'text/uri-list')
|
||||
return true;
|
||||
else if (dt[a] == 'text/uri-list') {
|
||||
if (btn) {
|
||||
ok = true;
|
||||
if (toast.txt == L.u_uri)
|
||||
toast.hide();
|
||||
}
|
||||
else
|
||||
return toast.inf(10, L.u_uri) || true;
|
||||
}
|
||||
|
||||
if (!ok)
|
||||
return true;
|
||||
@@ -1020,8 +1033,11 @@ function up2k_init(subtle) {
|
||||
document.body.ondragenter = document.body.ondragleave = document.body.ondragover = null;
|
||||
return modal.alert('your browser does not support drag-and-drop uploading');
|
||||
}
|
||||
if (btn)
|
||||
return;
|
||||
|
||||
clmod(ebi('drops'), 'vis', 1);
|
||||
var v = this.getAttribute('v');
|
||||
var v = self.getAttribute('v');
|
||||
if (v)
|
||||
clmod(ebi(v), 'hl', 1);
|
||||
}
|
||||
@@ -1045,6 +1061,8 @@ function up2k_init(subtle) {
|
||||
document.body.ondragleave = offdrag;
|
||||
document.body.ondragover = onover;
|
||||
document.body.ondrop = gotfile;
|
||||
ebi('u2btn').ondrop = gotfile;
|
||||
ebi('u2btn').ondragover = onoverbtn;
|
||||
|
||||
var drops = [ebi('up_dz'), ebi('srch_dz')];
|
||||
for (var a = 0; a < 2; a++) {
|
||||
@@ -1088,7 +1106,7 @@ function up2k_init(subtle) {
|
||||
function gotfile(e) {
|
||||
ev(e);
|
||||
nenters = 0;
|
||||
offdrag.bind(this)();
|
||||
offdrag.call(this);
|
||||
var dz = this && this.getAttribute('id');
|
||||
if (!dz && e && e.clientY)
|
||||
// cuo2duo fallback
|
||||
@@ -1132,7 +1150,7 @@ function up2k_init(subtle) {
|
||||
dst = good_files;
|
||||
|
||||
if (is_itemlist) {
|
||||
if (fobj.kind !== 'file')
|
||||
if (fobj.kind !== 'file' && fobj.type !== 'text/uri-list')
|
||||
continue;
|
||||
|
||||
try {
|
||||
@@ -1144,6 +1162,8 @@ function up2k_init(subtle) {
|
||||
}
|
||||
catch (ex) { }
|
||||
fobj = fobj.getAsFile();
|
||||
if (!fobj)
|
||||
continue;
|
||||
}
|
||||
try {
|
||||
if (fobj.size < 1)
|
||||
@@ -1568,7 +1588,7 @@ function up2k_init(subtle) {
|
||||
return;
|
||||
|
||||
st.oserr = true;
|
||||
var msg = HTTPS ? L.u_emtleak3 : L.u_emtleak2.format((window.location + '').replace(':', 's:'));
|
||||
var msg = HTTPS ? L.u_emtleak3 : L.u_emtleak2.format((location + '').replace(':', 's:'));
|
||||
modal.alert(L.u_emtleak1 + msg + (CHROME ? L.u_emtleakc : FIREFOX ? L.u_emtleakf : ''));
|
||||
}
|
||||
|
||||
@@ -1634,11 +1654,11 @@ function up2k_init(subtle) {
|
||||
var running = false,
|
||||
was_busy = false;
|
||||
|
||||
function defer() {
|
||||
var defer = function () {
|
||||
running = false;
|
||||
}
|
||||
|
||||
function taskerd() {
|
||||
var taskerd = function () {
|
||||
if (running)
|
||||
return;
|
||||
|
||||
@@ -1668,7 +1688,7 @@ function up2k_init(subtle) {
|
||||
is_busy = st.todo.handshake.length;
|
||||
try {
|
||||
if (!is_busy && !uc.fsearch && !msel.getsel().length && (!mp.au || mp.au.paused))
|
||||
treectl.goto(get_evpath());
|
||||
treectl.goto();
|
||||
}
|
||||
catch (ex) { }
|
||||
}
|
||||
@@ -1936,7 +1956,7 @@ function up2k_init(subtle) {
|
||||
st.bytes.hashed += cdr - car;
|
||||
st.etac.h++;
|
||||
|
||||
function orz(e) {
|
||||
var orz = function (e) {
|
||||
bpend--;
|
||||
segm_next();
|
||||
hash_calc(nch, e.target.result);
|
||||
@@ -2140,6 +2160,9 @@ function up2k_init(subtle) {
|
||||
|
||||
function exec_head() {
|
||||
var t = st.todo.head.shift();
|
||||
if (t.done)
|
||||
return console.log('done; skip head1', t.name, t);
|
||||
|
||||
st.busy.head.push(t);
|
||||
|
||||
var xhr = new XMLHttpRequest();
|
||||
@@ -2152,6 +2175,9 @@ function up2k_init(subtle) {
|
||||
st.todo.head.unshift(t);
|
||||
};
|
||||
function orz(e) {
|
||||
if (t.done)
|
||||
return console.log('done; skip head2', t.name, t);
|
||||
|
||||
var ok = false;
|
||||
if (xhr.status == 200) {
|
||||
var srv_sz = xhr.getResponseHeader('Content-Length'),
|
||||
@@ -2198,6 +2224,9 @@ function up2k_init(subtle) {
|
||||
keepalive = t.keepalive,
|
||||
me = Date.now();
|
||||
|
||||
if (t.done)
|
||||
return console.log('done; skip hs', t.name, t);
|
||||
|
||||
st.busy.handshake.push(t);
|
||||
t.keepalive = undefined;
|
||||
t.t_busied = me;
|
||||
@@ -2207,10 +2236,9 @@ function up2k_init(subtle) {
|
||||
|
||||
var xhr = new XMLHttpRequest();
|
||||
xhr.onerror = function () {
|
||||
if (t.t_busied != me) {
|
||||
console.log('zombie handshake onerror,', t.name, t);
|
||||
return;
|
||||
}
|
||||
if (t.t_busied != me) // t.done ok
|
||||
return console.log('zombie handshake onerror', t.name, t);
|
||||
|
||||
if (!toast.visible)
|
||||
toast.warn(9.98, L.u_eneths + "\n\nfile: " + t.name, t);
|
||||
|
||||
@@ -2219,11 +2247,10 @@ function up2k_init(subtle) {
|
||||
st.todo.handshake.unshift(t);
|
||||
t.keepalive = keepalive;
|
||||
};
|
||||
function orz(e) {
|
||||
if (t.t_busied != me) {
|
||||
console.log('zombie handshake onload,', t.name, t);
|
||||
return;
|
||||
}
|
||||
var orz = function (e) {
|
||||
if (t.t_busied != me || t.done)
|
||||
return console.log('zombie handshake onload', t.name, t);
|
||||
|
||||
if (xhr.status == 200) {
|
||||
t.t_handshake = Date.now();
|
||||
if (keepalive) {
|
||||
@@ -2483,14 +2510,17 @@ function up2k_init(subtle) {
|
||||
}
|
||||
|
||||
function exec_upload() {
|
||||
var upt = st.todo.upload.shift();
|
||||
var upt = st.todo.upload.shift(),
|
||||
t = st.files[upt.nfile],
|
||||
npart = upt.npart,
|
||||
tries = 0;
|
||||
|
||||
if (t.done)
|
||||
return console.log('done; skip chunk', t.name, t);
|
||||
|
||||
st.busy.upload.push(upt);
|
||||
st.nfile.upload = upt.nfile;
|
||||
|
||||
var npart = upt.npart,
|
||||
t = st.files[upt.nfile],
|
||||
tries = 0;
|
||||
|
||||
if (!t.t_uploading)
|
||||
t.t_uploading = Date.now();
|
||||
|
||||
@@ -2503,7 +2533,7 @@ function up2k_init(subtle) {
|
||||
if (cdr >= t.size)
|
||||
cdr = t.size;
|
||||
|
||||
function orz(xhr) {
|
||||
var orz = function (xhr) {
|
||||
var txt = ((xhr.response && xhr.response.err) || xhr.responseText) + '';
|
||||
if (txt.indexOf('upload blocked by x') + 1) {
|
||||
apop(st.busy.upload, upt);
|
||||
@@ -2532,7 +2562,7 @@ function up2k_init(subtle) {
|
||||
}
|
||||
orz2(xhr);
|
||||
}
|
||||
function orz2(xhr) {
|
||||
var orz2 = function (xhr) {
|
||||
apop(st.busy.upload, upt);
|
||||
apop(t.postlist, npart);
|
||||
if (!t.postlist.length) {
|
||||
@@ -2610,8 +2640,7 @@ function up2k_init(subtle) {
|
||||
}
|
||||
}
|
||||
}
|
||||
window.addEventListener('resize', onresize);
|
||||
onresize();
|
||||
onresize100.add(onresize, true);
|
||||
|
||||
if (MOBILE) {
|
||||
// android-chrome wobbles for a bit; firefox / iOS-safari are OK
|
||||
@@ -2679,6 +2708,11 @@ function up2k_init(subtle) {
|
||||
}
|
||||
|
||||
function draw_turbo() {
|
||||
if (turbolvl < 0 && uc.turbo) {
|
||||
bcfg_set('u2turbo', uc.turbo = false);
|
||||
toast.err(10, "turbo is disabled in server config");
|
||||
}
|
||||
|
||||
var msg = (turbolvl || !uc.turbo) ? null : uc.fsearch ? L.u_ts : L.u_tu,
|
||||
html = ebi('u2foot').innerHTML;
|
||||
|
||||
|
||||
@@ -7,12 +7,13 @@ if (!window.console || !console.log)
|
||||
|
||||
|
||||
var wah = '',
|
||||
NOAC = 'autocorrect="off" autocapitalize="off"',
|
||||
L, tt, treectl, thegrid, up2k, asmCrypto, hashwasm, vbar, marked,
|
||||
CB = '?_=' + Date.now(),
|
||||
R = SR.slice(1),
|
||||
RS = R ? "/" + R : "",
|
||||
HALFMAX = 8192 * 8192 * 8192 * 8192,
|
||||
HTTPS = (window.location + '').indexOf('https:') === 0,
|
||||
HTTPS = ('' + location).indexOf('https:') === 0,
|
||||
TOUCH = 'ontouchstart' in window,
|
||||
MOBILE = TOUCH,
|
||||
CHROME = !!window.chrome,
|
||||
@@ -139,29 +140,35 @@ catch (ex) {
|
||||
}
|
||||
var crashed = false, ignexd = {}, evalex_fatal = false;
|
||||
function vis_exh(msg, url, lineNo, columnNo, error) {
|
||||
if ((msg + '').indexOf('ResizeObserver') + 1)
|
||||
msg = String(msg);
|
||||
url = String(url);
|
||||
|
||||
if (msg.indexOf('ResizeObserver') + 1)
|
||||
return; // chrome issue 809574 (benign, from <video>)
|
||||
|
||||
if ((msg + '').indexOf('l2d.js') + 1)
|
||||
if (msg.indexOf('l2d.js') + 1)
|
||||
return; // `t` undefined in tapEvent -> hitTestSimpleCustom
|
||||
|
||||
if (!/\.js($|\?)/.exec('' + url))
|
||||
if (!/\.js($|\?)/.exec(url))
|
||||
return; // chrome debugger
|
||||
|
||||
if ((url + '').indexOf(' > eval') + 1 && !evalex_fatal)
|
||||
if (url.indexOf(' > eval') + 1 && !evalex_fatal)
|
||||
return; // md timer
|
||||
|
||||
var ekey = url + '\n' + lineNo + '\n' + msg;
|
||||
if (ignexd[ekey] || crashed)
|
||||
return;
|
||||
|
||||
if (url.indexOf('deps/marked.js') + 1 && !window.WebAssembly)
|
||||
return; // ff<52
|
||||
|
||||
crashed = true;
|
||||
window.onerror = undefined;
|
||||
var html = [
|
||||
'<h1>you hit a bug!</h1>',
|
||||
'<p style="font-size:1.3em;margin:0;line-height:2em">try to <a href="#" onclick="localStorage.clear();location.reload();">reset copyparty settings</a> if you are stuck here, or <a href="#" onclick="ignex();">ignore this</a> / <a href="#" onclick="ignex(true);">ignore all</a> / <a href="?b=u">basic</a></p>',
|
||||
'<p style="color:#fff">please send me a screenshot arigathanks gozaimuch: <a href="<ghi>" target="_blank">new github issue</a></p>',
|
||||
'<p class="b">' + esc(url + ' @' + lineNo + ':' + columnNo), '<br />' + esc(String(msg)).replace(/\n/g, '<br />') + '</p>',
|
||||
'<p class="b">' + esc(url + ' @' + lineNo + ':' + columnNo), '<br />' + esc(msg).replace(/\n/g, '<br />') + '</p>',
|
||||
'<p><b>UA:</b> ' + esc(navigator.userAgent + '')
|
||||
];
|
||||
|
||||
@@ -353,13 +360,13 @@ catch (ex) {
|
||||
}
|
||||
|
||||
// https://stackoverflow.com/a/950146
|
||||
function import_js(url, cb) {
|
||||
function import_js(url, cb, ecb) {
|
||||
var head = document.head || document.getElementsByTagName('head')[0];
|
||||
var script = mknod('script');
|
||||
script.type = 'text/javascript';
|
||||
script.src = url;
|
||||
script.onload = cb;
|
||||
script.onerror = function () {
|
||||
script.onerror = ecb || function () {
|
||||
var m = 'Failed to load module:\n' + url;
|
||||
console.log(m);
|
||||
toast.err(0, m);
|
||||
@@ -368,6 +375,15 @@ function import_js(url, cb) {
|
||||
}
|
||||
|
||||
|
||||
function unsmart(txt) {
    return !IPHONE ? txt : (txt.
        replace(/[\u2014]/g, "--").
        replace(/[\u2022]/g, "*").
        replace(/[\u2018\u2019]/g, "'").
        replace(/[\u201c\u201d]/g, '"'));
}
|
||||
|
||||
|
||||
var crctab = (function () {
|
||||
var c, tab = [];
|
||||
for (var n = 0; n < 256; n++) {
|
||||
@@ -866,9 +882,10 @@ function jcp(obj) {
|
||||
}
|
||||
|
||||
|
||||
function sread(key) {
|
||||
function sread(key, al) {
|
||||
try {
|
||||
return localStorage.getItem(key);
|
||||
var ret = localStorage.getItem(key);
|
||||
return (!al || has(al, ret)) ? ret : null;
|
||||
}
|
||||
catch (e) {
|
||||
return null;
|
||||
@@ -890,7 +907,13 @@ function jread(key, fb) {
|
||||
if (!str)
|
||||
return fb;
|
||||
|
||||
return JSON.parse(str);
|
||||
try {
|
||||
// '' throws, null is ok, sasuga
|
||||
return JSON.parse(str);
|
||||
}
|
||||
catch (e) {
|
||||
return fb;
|
||||
}
|
||||
}
|
||||
|
||||
function jwrite(key, val) {
|
||||
@@ -1051,10 +1074,13 @@ function cliptxt(txt, ok) {
|
||||
}
|
||||
|
||||
|
||||
var timer = (function () {
|
||||
var r = {};
|
||||
function Debounce(delay) {
|
||||
var r = this;
|
||||
r.delay = delay;
|
||||
r.timer = 0;
|
||||
r.t_hit = 0;
|
||||
r.t_run = 0;
|
||||
r.q = [];
|
||||
r.last = 0;
|
||||
|
||||
r.add = function (fun, run) {
|
||||
r.rm(fun);
|
||||
@@ -1068,7 +1094,67 @@ var timer = (function () {
|
||||
apop(r.q, fun);
|
||||
};
|
||||
|
||||
function doevents() {
|
||||
r.run = function () {
|
||||
if (crashed)
|
||||
return;
|
||||
|
||||
r.t_run = Date.now();
|
||||
|
||||
var q = r.q.slice(0);
|
||||
for (var a = 0; a < q.length; a++)
|
||||
q[a]();
|
||||
};
|
||||
|
||||
r.hit = function () {
|
||||
if (crashed)
|
||||
return;
|
||||
|
||||
var now = Date.now(),
|
||||
td_hit = now - r.t_hit,
|
||||
td_run = now - r.t_run;
|
||||
|
||||
if (td_run >= r.delay * 2)
|
||||
r.t_run = now;
|
||||
|
||||
if (td_run >= r.delay && td_run <= r.delay * 2) {
|
||||
// r.delay is also deadline
|
||||
clearTimeout(r.timer);
|
||||
return r.run();
|
||||
}
|
||||
|
||||
if (td_hit < r.delay / 5)
|
||||
return;
|
||||
|
||||
clearTimeout(r.timer);
|
||||
r.timer = setTimeout(r.run, r.delay);
|
||||
r.t_hit = now;
|
||||
};
|
||||
};
|
||||
|
||||
var onresize100 = new Debounce(100);
|
||||
window.addEventListener('resize', onresize100.hit);
|
||||
|
||||
|
||||
var timer = (function () {
|
||||
var r = {};
|
||||
r.q = [];
|
||||
r.last = 0;
|
||||
r.fs = 0;
|
||||
r.fc = 0;
|
||||
|
||||
r.add = function (fun, run) {
|
||||
r.rm(fun);
|
||||
r.q.push(fun);
|
||||
|
||||
if (run)
|
||||
fun();
|
||||
};
|
||||
|
||||
r.rm = function (fun) {
|
||||
apop(r.q, fun);
|
||||
};
|
||||
|
||||
var doevents = function () {
|
||||
if (crashed)
|
||||
return;
|
||||
|
||||
@@ -1080,6 +1166,7 @@ var timer = (function () {
|
||||
q[a]();
|
||||
|
||||
r.last = Date.now();
|
||||
//r.fc++; if (r.last - r.fs >= 2000) { console.log(r.last - r.fs, r.fc); r.fs = r.last; r.fc = 0; }
|
||||
}
|
||||
setInterval(doevents, 100);
|
||||
|
||||
@@ -1104,7 +1191,7 @@ var tt = (function () {
|
||||
var prev = null;
|
||||
r.cshow = function () {
|
||||
if (this !== prev)
|
||||
r.show.bind(this)();
|
||||
r.show.call(this);
|
||||
|
||||
prev = this;
|
||||
};
|
||||
@@ -1116,7 +1203,7 @@ var tt = (function () {
|
||||
return;
|
||||
|
||||
if (Date.now() - r.lvis < 400)
|
||||
return r.show.bind(this)();
|
||||
return r.show.call(this);
|
||||
|
||||
tev = setTimeout(r.show.bind(this), 800);
|
||||
if (TOUCH)
|
||||
@@ -1274,8 +1361,11 @@ var toast = (function () {
|
||||
r.visible = false;
|
||||
r.txt = null;
|
||||
r.tag = obj; // filler value (null is scary)
|
||||
r.p_txt = '';
|
||||
r.p_sec = 0;
|
||||
r.p_t = 0;
|
||||
|
||||
function scrollchk() {
|
||||
var scrollchk = function () {
|
||||
if (scrolling)
|
||||
return;
|
||||
|
||||
@@ -1290,7 +1380,7 @@ var toast = (function () {
|
||||
scrolling = true;
|
||||
}
|
||||
|
||||
function unscroll() {
|
||||
var unscroll = function () {
|
||||
timer.rm(scrollchk);
|
||||
clmod(obj, 'scroll');
|
||||
scrolling = false;
|
||||
@@ -1306,10 +1396,23 @@ var toast = (function () {
|
||||
};
|
||||
|
||||
r.show = function (cl, sec, txt, tag) {
|
||||
var same = r.visible && txt == r.p_txt && r.p_sec == sec,
|
||||
delta = Date.now() - r.p_t;
|
||||
|
||||
if (same && delta < 100)
|
||||
return;
|
||||
|
||||
r.p_txt = txt;
|
||||
r.p_sec = sec;
|
||||
r.p_t = Date.now();
|
||||
|
||||
clearTimeout(te);
|
||||
if (sec)
|
||||
te = setTimeout(r.hide, sec * 1000);
|
||||
|
||||
if (same && delta < 1000)
|
||||
return;
|
||||
|
||||
if (txt.indexOf('<body>') + 1)
|
||||
txt = txt.slice(0, txt.indexOf('<')) + ' [...]';
|
||||
|
||||
@@ -1396,7 +1499,7 @@ var modal = (function () {
|
||||
r.busy = false;
|
||||
setTimeout(next, 50);
|
||||
};
|
||||
function ok(e) {
|
||||
var ok = function (e) {
|
||||
ev(e);
|
||||
var v = ebi('modali');
|
||||
v = v ? v.value : true;
|
||||
@@ -1404,14 +1507,14 @@ var modal = (function () {
|
||||
if (cb_ok)
|
||||
cb_ok(v);
|
||||
}
|
||||
function ng(e) {
|
||||
var ng = function (e) {
|
||||
ev(e);
|
||||
r.hide();
|
||||
if (cb_ng)
|
||||
cb_ng(null);
|
||||
}
|
||||
|
||||
function onfocus(e) {
|
||||
var onfocus = function (e) {
|
||||
var ctr = ebi('modalc');
|
||||
if (!ctr || !ctr.contains || !document.activeElement || ctr.contains(document.activeElement))
|
||||
return;
|
||||
@@ -1423,7 +1526,7 @@ var modal = (function () {
|
||||
ev(e);
|
||||
}
|
||||
|
||||
function onkey(e) {
|
||||
var onkey = function (e) {
|
||||
var k = e.code,
|
||||
eok = ebi('modal-ok'),
|
||||
eng = ebi('modal-ng'),
|
||||
@@ -1446,7 +1549,7 @@ var modal = (function () {
|
||||
return ng();
|
||||
}
|
||||
|
||||
function next() {
|
||||
var next = function () {
|
||||
if (!r.busy && q.length)
|
||||
q.shift()();
|
||||
}
|
||||
@@ -1457,7 +1560,7 @@ var modal = (function () {
|
||||
});
|
||||
next();
|
||||
};
|
||||
function _alert(html, cb, fun) {
|
||||
var _alert = function (html, cb, fun) {
|
||||
cb_ok = cb_ng = cb;
|
||||
cb_up = fun;
|
||||
html += '<div id="modalb"><a href="#" id="modal-ok">OK</a></div>';
|
||||
@@ -1470,7 +1573,7 @@ var modal = (function () {
|
||||
});
|
||||
next();
|
||||
}
|
||||
function _confirm(html, cok, cng, fun, btns) {
|
||||
var _confirm = function (html, cok, cng, fun, btns) {
|
||||
cb_ok = cok;
|
||||
cb_ng = cng === undefined ? cok : cng;
|
||||
cb_up = fun;
|
||||
@@ -1484,11 +1587,11 @@ var modal = (function () {
|
||||
});
|
||||
next();
|
||||
}
|
||||
function _prompt(html, v, cok, cng, fun) {
|
||||
var _prompt = function (html, v, cok, cng, fun) {
|
||||
cb_ok = cok;
|
||||
cb_ng = cng === undefined ? cok : null;
|
||||
cb_up = fun;
|
||||
html += '<input id="modali" type="text" /><div id="modalb">' + ok_cancel + '</div>';
|
||||
html += '<input id="modali" type="text" ' + NOAC + ' /><div id="modalb">' + ok_cancel + '</div>';
|
||||
r.show(html);
|
||||
|
||||
ebi('modali').value = v || '';
|
||||
@@ -1520,7 +1623,7 @@ function repl_load() {
|
||||
ret = [
|
||||
'var v=Object.keys(localStorage); v.sort(); JSON.stringify(v)',
|
||||
"for (var a of QSA('#files a[id]')) a.setAttribute('download','')",
|
||||
'console.hist.slice(-10).join("\\n")'
|
||||
'console.hist.slice(-50).join("\\n")'
|
||||
];
|
||||
|
||||
ipre.innerHTML = '<option value=""></option>';
|
||||
@@ -1576,6 +1679,8 @@ function repl(e) {
|
||||
if (!cmd)
|
||||
return toast.inf(3, 'eval aborted');
|
||||
|
||||
cmd = unsmart(cmd);
|
||||
|
||||
if (cmd.startsWith(',')) {
|
||||
evalex_fatal = true;
|
||||
return modal.alert(esc(eval(cmd.slice(1)) + ''));
|
||||
@@ -1614,7 +1719,7 @@ function load_md_plug(md_text, plug_type, defer) {
|
||||
return md_text;
|
||||
|
||||
var find = '\n```copyparty_' + plug_type + '\n',
|
||||
md = md_text.replace(/\r/g, ''),
|
||||
md = '\n' + md_text.replace(/\r/g, '') + '\n',
|
||||
ofs = md.indexOf(find),
|
||||
ofs2 = md.indexOf('\n```', ofs + 1);
|
||||
|
||||
@@ -1701,7 +1806,7 @@ var favico = (function () {
|
||||
r.en = true;
|
||||
r.tag = null;
|
||||
|
||||
function gx(txt) {
|
||||
var gx = function (txt) {
|
||||
return (svg_decl +
|
||||
'<svg version="1.1" viewBox="0 0 64 64" xmlns="http://www.w3.org/2000/svg">\n' +
|
||||
(r.bg ? '<rect width="100%" height="100%" rx="16" fill="#' + r.bg + '" />\n' : '') +
|
||||
|
||||
@@ -1,3 +1,212 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0923-1215 `v1.9.6` configurable x-forwarded-for

## new features
* rudimentary support for jython and graalpy, and directory tree sidebar in internet explorer 9 through 11, and firefox 10
  * all older browsers (ie4, ie6, ie8, Netscape) get basic html instead
* #35 adds a [hook](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/msg-log.py) which extends the message-to-serverlog feature so it writes the message to a textfile on the server
  * could theoretically be extended into a [full instant-messaging feature](https://github.com/9001/copyparty/blob/hovudstraum/srv/chat.md) but that's silly, [nobody would do that](https://ocv.me/stuff/cchat.webm)
    * [r0c is much better](https://github.com/9001/r0c) than this joke

## bugfixes
* 163e3fce46122d64bf824762b6733ff2c3551ba5 the `x-forwarded-for` header was ignored if the nearest reverse-proxy is not asking from 127.0.0.1, which broke client IPs in containerized deployments
  * the serverlog will now explain how to trust the reverse-proxy to provide client IPs, but basically,
    * `--xff-hdr` specifies which header to read the client's real ip from
    * `--xff-src` is an allowlist of IP-addresses to trust that header from
* a62f744a187bc9f821b540e8bb4e0b9a67bd01c8 if copyparty was started while an external HDD was not connected, and that volume's index was stored elsewhere, then the index would get wiped (since all the files are gone)
* 3b8f66c0d5c27a68841814ec06f1758f146a5ff5 javascript could crash while uploading from a very unreliable internet connection
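
the trust check that the two `--xff-*` switches describe boils down to something like this minimal Python sketch; the function name, defaults, and header-parsing details are illustrative assumptions, not copyparty's actual code:

```py
# illustrative sketch (not copyparty's real code) of the logic described above:
# only honor the forwarded-for header when the directly-connected peer is in
# the --xff-src allowlist, otherwise fall back to the socket address
import ipaddress

def effective_client_ip(peer_ip, headers, xff_hdr="x-forwarded-for", xff_src=("127.0.0.1/32",)):
    # peer_ip: address of the directly-connected client (possibly a reverse-proxy)
    # headers: dict of lowercased request headers
    trusted = any(ipaddress.ip_address(peer_ip) in ipaddress.ip_network(net)
                  for net in xff_src)
    fwd = headers.get(xff_hdr, "")
    if trusted and fwd:
        # leftmost entry is the original client in the usual XFF convention
        return fwd.split(",")[0].strip()
    return peer_ip

print(effective_client_ip("127.0.0.1", {"x-forwarded-for": "203.0.113.7"}))    # 203.0.113.7
print(effective_client_ip("198.51.100.9", {"x-forwarded-for": "203.0.113.7"}))  # 198.51.100.9
```
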
## other changes
* copyparty.exe: updated pillow to 10.0.1 which fixes the webp cve
* alpine, which the docker images are based on, turns out to be fairly slow -- currently working on a new docker image (probably fedora-based) which will be 30% faster at analyzing multimedia files and in general 20% faster on average


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0909-1336 `v1.9.5` webhotell

[happy 9/9!](https://safebooru.org/index.php?page=post&s=view&id=4027419)

## new features
* new permission `h` disables directory listing (so works like `g`) except it redirects to the folder's index.html instead of 404
  * index.html is accessible by anyone with `h` even if filekeys are enabled
  * well suited for running a shared-webhosting gig (thx kipu) especially now that the...
* markdown editor can now be used on non-markdown files if account has `w`rite and `d`elete
  * hotkey `e` to edit a textfile while it's open in the textfile viewer
* SMB: account permissions now work fully as intended, thanks to impacket 0.11
  * but enabling `--smb` is still strongly discouraged as it's a massive security hazard
* download-as-zip can be 2.5x faster on tiny files, at least 15% faster in general
* download folders as pax-format tarfiles with `?tar=pax` or `?tar=pax,xz:9`

## bugfixes
* 422-autoban accidentally triggered when uploading lots of duplicate files (thx hiem!)
* `--css-browser` and `--js-browser` now accept URLs with cache directives
  * `--css-browser=/the.css?cache=600` (seconds) or `--js-browser=/.res/the.js?cache=i` (7 days)
* SMB: avoid windows freaking out and disconnecting if it hits an offline volume
* hotkey shift-r to rotate pictures counter-clockwise didn't do anything
* hacker theme wasn't hacker enough (everything is monospace now)


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0902-0018 `v1.9.4` yes symlink times

hello! it's been a while, an entire day even...

## new features
* download folder as tar.gz, tar.bz2, tar.xz
  * single-threaded, so extremely slow, but nice for easily compressed data or challenged networks
  * append `?tar=gz`, `?tar=bz2` or `?tar=xz` to a folder URL to do it
  * default compression levels are gz:3, bz2:2, xz:1; override with `?tar=gz:9`
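
as a quick usage illustration of the `?tar=` parameter above, any HTTP client can fetch a folder as a compressed tarball; a minimal Python sketch, assuming a copyparty instance on the default `127.0.0.1:3923` and a folder named `/music` (both placeholders):

```py
# minimal sketch: grab the folder "/music" from a local copyparty instance as a
# .tar.gz at compression level 9, using the ?tar=gz:9 syntax described above;
# the server address and folder name are placeholders
from urllib.request import urlopen

url = "http://127.0.0.1:3923/music/?tar=gz:9"
with urlopen(url) as resp, open("music.tar.gz", "wb") as f:
    while True:
        chunk = resp.read(65536)
        if not chunk:
            break
        f.write(chunk)
```
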
## bugfixes
* c1efd227b7377144a5760bc6cff64f4e87b626d9 symlink-deduplicated files got indexed with the wrong last-modified timestamp
  * mostly inconsequential; would cause the dupe's uploader-ip to be forgotten on the next server restart since it would reindex to "fix" the timestamp
* when linking [a search query](https://a.ocv.me/pub/#q=tags%20like%20soundsho*) it loads the results faster

## other changes
* update readme to mention that iPhones and iPads dislike the preload feature and respond by glitching the audio a bit when a song is exactly 20 seconds away from ending and yet how it's probably a bad idea to disable preloading since i bet it's load-bearing against other iOS bugs
* speaking of iPhones and iPads, the [previous version](https://github.com/9001/copyparty/releases/tag/v1.9.3) should have fixed album playback on those


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0831-2211 `v1.9.3` iOS and http fixes

## new features
* iPhones and iPads are now able to...
  * 9986136dfb2364edb35aa9fbb87410641c6d6af3 play entire albums while the screen is off without the music randomly stopping
    * apple keeps breaking AudioContext in new and interesting ways; time to give up (no more equalizer)
  * 1c0d978979a703edeb792e552b18d3b7695b2d90 perform search queries and execute js code
    * by translating [smart-quotes](https://stackoverflow.com/questions/48678359/ios-11-safari-html-disable-smart-punctuation) into regular `'` and `"` characters
* python 3.12 support
  * technically a bugfix since it was added [a year ago](https://github.com/9001/copyparty/commit/32e22dfe84d5e0b13914b4d0e15c1b8c9725a76d) way before the first py3.12 alpha was released but turns out i botched it, oh well
* filter error messages so they never include the filesystem path where copyparty's python files reside
* print more context in server logs if someone hits an unexpected permission-denied
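
the smart-quote translation is just a handful of character substitutions (see the `unsmart()` helper in the javascript diff further up); a rough Python equivalent of that mapping, with the function name reused purely for illustration:

```py
# rough Python equivalent of the unsmart() javascript helper shown in the diff
# above: map iOS "smart punctuation" back to plain ascii before the text is
# used as a search query or as js code
SMART = {
    "\u2018": "'", "\u2019": "'",   # curly single quotes
    "\u201c": '"', "\u201d": '"',   # curly double quotes
    "\u2014": "--",                 # em-dash
    "\u2022": "*",                  # bullet
}

def unsmart(txt: str) -> str:
    for k, v in SMART.items():
        txt = txt.replace(k, v)
    return txt

print(unsmart("\u201chello\u201d \u2014 it\u2019s fine"))  # "hello" -- it's fine
```
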
## bugfixes
found some iffy stuff combing over the code but, as far as I can tell, luckily none of these were dangerous:
* URL normalization was a bit funky, but it appears everything access-control-related was unaffected
* some url parameters were double-decoded, causing the unpost filtering and file renaming to fail if the values contained `%`
* clients could cause the server to return an invalid cache-control header, but newlines and control-characters got rejected correctly
* minor cosmetics / qol fixes:
  * reduced flickering on page load in chrome
  * fixed some console spam in search results
  * markdown documents now have the same line-height in directory listings and the editor


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2023-0826-2116 `v1.9.2` bigger hammer
|
||||
|
||||
## new features
|
||||
* more ways to automatically ban users! three new sensors, all default-enabled, giving a 1 day ban after 9 hits in 2 minutes:
|
||||
* `--ban-403`: trying to access volumes that dont exist or require authentication
|
||||
* `--ban-422`: invalid POST messages (from brutefocing POST parameters and such)
|
||||
* `--ban-url`: URLs which 404 and also match `--sus-urls` (scanners/crawlers)
|
||||
* if you want to run a vulnerability scan on copyparty, please just [download the server](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) and do it locally! takes less than 30 seconds to set up, you get lower latency, and you won't be filling up the logfiles on the demo server with junk, thank you 🙏
|
||||
* more ban-related stuff,
|
||||
* new global option `--nonsus-urls` specifies regex of URLs which are OK to 404 and shouldn't ban people
|
||||
* `--turbo` now accepts the value `-1` which makes it impossible for clients to enable it, making `--ban-404` safe to use
|
||||
* range-selecting files in the list-view by shift-pgup/pgdn
|
||||
* volumes which are currently unavailable (dead nfs share, external HDD which is off, ...) are marked with a ❌ in the directory tree sidebar
|
||||
* the toggle-button to see dotfiles is now persisted as a cookie so it also applies on the initial page load
|
||||
* more effort is made to prevent `<script>`s inside markdown documents from running in the markdown editor and the fullpage viewer
|
||||
* anyone who wanted to use markdown files for malicious stuff can still just upload an html file instead, so this doesn't make anything more secure, just less confusing
|
||||
* the safest approach is still the `nohtml` volflag which disables markdown rendering outside sandboxes entirely, or only giving out write-access to trustworthy people
|
||||
* enabling markdown plugins with `-emp` now has the side-effect of cancelling this band-aid too
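as promised in the ban-related items above, here is a combined sketch of the new switches. all three sensors are on by default, so this only shows how to disable one, whitelist a URL pattern from bans, and lock turbo down; the regex value is illustrative, not a recommendation

```bash
# illustrative tweaking only; the new sensors need no flags to be active
python3 copyparty-sfx.py \
  --ban-403=no \
  --nonsus-urls='favicon\.ico$' \
  --turbo=-1
```

with `--turbo=-1` in place, clients can no longer force turbo on, which is what makes `--ban-404` safe to combine with it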
|
||||
|
||||
## bugfixes
|
||||
* textfile navigation hotkeys broke in the previous version
|
||||
|
||||
## other changes
|
||||
* example [nginx config](https://github.com/9001/copyparty/blob/hovudstraum/contrib/nginx/copyparty.conf) was not compatible with cloudflare (suggest `$http_cf_connecting_ip` instead of `$proxy_add_x_forwarded_for`)
|
||||
* `copyparty.exe` is now built with python 3.11.5 which fixes [CVE-2023-40217](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-40217)
|
||||
* `copyparty32.exe` is not, because python understandably ended win7 support
|
||||
* [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md):
|
||||
* copyparty appears to be 30x faster than nextcloud and seafile at receiving uploads of many small files
|
||||
* seafile has a size limit when zip-downloading folders
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2023-0820-2338 `v1.9.1` prometheable
|
||||
|
||||
## new features
|
||||
* #49 prometheus / grafana / openmetrics integration ([see readme](https://github.com/9001/copyparty#prometheus))
|
||||
* read metrics from http://127.0.0.1:3923/.cpr/metrics after enabling with `--stats` (see the example after this list)
|
||||
* download a folder with all music transcoded to opus by adding `?tar=opus` or `?zip&opus` to the URL
|
||||
* can also be used to download thumbnails instead of full images; `?tar=w` for webp, `?tar=j` for jpg
|
||||
* so i guess the long-time requested feature of pre-generating thumbnails kind of happened after all, if you schedule a `curl http://127.0.0.1:3923/?tar=w >/dev/null` after server startup
|
||||
* u2c (commandline uploader): argument `-x` to exclude files by regex (compares absolute filesystem paths)
|
||||
* `--zm-spam 30` can be used to improve zeroconf / mDNS reliability on crazy networks
|
||||
* only necessary if there are clients with multiple IPs and some of the IPs are outside the subnets that copyparty is in -- not spec-compliant, not really recommended, but shouldn't cause any issues either
|
||||
* and `--mc-hop` wasn't actually implemented until now
|
||||
* dragging an image from another browser window onto the upload button is now possible
|
||||
* only works on chrome, and only on windows or linux (not macos)
|
||||
* server hostname is prefixed in all window titles
|
||||
* can be adjusted with `--bname` (the file explorer) and `--doctitle` (all other documents)
|
||||
* can be disabled with `--nth` (just window title) or `--nih` (title + header)
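as referenced in the metrics item above, a few of these features strung together as shell one-liners; the port, paths, regex, and the u2c argument order shown here are assumptions for illustration

```bash
# start with metrics enabled, then read the openmetrics endpoint
python3 copyparty-sfx.py --stats &
sleep 2 && curl -s http://127.0.0.1:3923/.cpr/metrics | head

# pre-generate webp thumbnails by requesting a thumbnail-tar once after startup
curl -s 'http://127.0.0.1:3923/?tar=w' >/dev/null

# upload a folder, skipping anything whose absolute path matches a regex
python3 u2c.py -x '\.tmp$' http://127.0.0.1:3923/inc/ ~/Pictures/
```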
|
||||
|
||||
## bugfixes
|
||||
* docker: the autogenerated seeds for filekeys and account passwords now get persisted to the config volume (thx noktuas)
|
||||
* uploading files with fancy filenames could fail if the copyparty server is running on android
|
||||
* improve workarounds for some apple/iphone/ios jank (thx noktuas and spiky)
|
||||
* some ui elements had their font-size selected by fair dice roll
|
||||
* the volume control does nothing because [apple disabled it](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11), so a warning is now shown
|
||||
* the image gallery cannot be fullscreened [as apple intended](https://developer.mozilla.org/en-US/docs/Web/API/Element/requestFullscreen#browser_compatibility), so a warning is now shown
|
||||
|
||||
## other changes
|
||||
* file table columns are now limited to browser window width
|
||||
* readme: mention that nginx-QUIC is currently very slow (thx noktuas)
|
||||
* #50 add a safeguard to the wget plugin in case wget at some point adds support for `file://` or similar
|
||||
* show a suggestion on startup to enable the database
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2023-0725-1550 `v1.8.8` just boring bugfixes
|
||||
|
||||
final release until late august unless something bad happens and i end up building this thing on a shinkansen
|
||||
|
||||
## recent security / vulnerability fixes
|
||||
* there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates
|
||||
* [v1.8.7](https://github.com/9001/copyparty/releases/tag/v1.8.7) (2023-07-23) - [CVE-2023-38501](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-38501) - reflected XSS
|
||||
* [v1.8.2](https://github.com/9001/copyparty/releases/tag/v1.8.2) (2023-07-14) - [CVE-2023-37474](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-37474) - path traversal (first CVE)
|
||||
* all serverlogs reviewed so far (5 public servers) showed no signs of exploitation
|
||||
|
||||
## bugfixes
|
||||
* range-select with shiftclick:
|
||||
* don't crash when entering another folder and shift-clicking some more
|
||||
* remember selection origin when lazy-loading more stuff into the viewport
|
||||
* markdown editor:
|
||||
* fix confusing warnings when the browser cache decides it *really* wants to cache
|
||||
* and when a document starts with a newline
|
||||
* remember intended actions such as `?edit` on login prompts
|
||||
* Windows: TLS-cert generation (triggered by network changes) could occasionally fail
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2023-0723-1543 `v1.8.7` XSS for days
|
||||
|
||||
for lack of better ideas, there is now a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` for all future important updates such as this one
|
||||
|
||||
## bugfixes
|
||||
* reflected XSS through `/?k304` and `/?setck`
|
||||
* if someone tricked you into clicking a URL containing a chain of `%0d` and `%0a` they could potentially have moved/deleted existing files on the server, or uploaded new files, using your account
|
||||
* if you use a reverse proxy, you can check if you have been exploited like so:
|
||||
* nginx: grep your logs for URLs containing `%0d%0a%0d%0a`, for example using the following command:
|
||||
```bash
|
||||
(gzip -dc access.log*.gz; cat access.log) | sed -r 's/" [0-9]+ .*//' | grep -iE '%0[da]%0[da]%0[da]%0[da]'
|
||||
```
|
||||
* if you find any traces of exploitation (or just want to be on the safe side) it's recommended to change the passwords of your copyparty accounts
|
||||
* huge thanks *again* to @TheHackyDog !
|
||||
* the original fix for CVE-2023-37474 broke the download links for u2c.py and partyfuse.py
|
||||
* fix mediaplayer spinlock if the server only has a single audio file
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2023-0721-0036 `v1.8.6` fix reflected XSS
|
||||
|
||||
|
||||
@@ -125,8 +125,14 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
|
||||
| GET | `?b` | list files/folders at URL as simplified HTML |
|
||||
| GET | `?tree=.` | list one level of subdirectories inside URL |
|
||||
| GET | `?tree` | list one level of subdirectories for each level until URL |
|
||||
| GET | `?tar` | download everything below URL as a tar file |
|
||||
| GET | `?zip=utf-8` | download everything below URL as a zip file |
|
||||
| GET | `?tar` | download everything below URL as a gnu-tar file |
|
||||
| GET | `?tar=gz:9` | ...as a gzip-level-9 gnu-tar file |
|
||||
| GET | `?tar=xz:9` | ...as an xz-level-9 gnu-tar file |
|
||||
| GET | `?tar=pax` | ...as a pax-tar file |
|
||||
| GET | `?tar=pax,xz` | ...as an xz-level-1 pax-tar file |
|
||||
| GET | `?zip=utf-8` | ...as a zip file |
|
||||
| GET | `?zip` | ...as a WinXP-compatible zip file |
|
||||
| GET | `?zip=crc` | ...as an MSDOS-compatible zip file |
|
||||
| GET | `?ups` | show recent uploads from your IP |
|
||||
| GET | `?ups&filter=f` | ...where URL contains `f` |
|
||||
| GET | `?mime=foo` | specify return mimetype `foo` |
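for reference, the rows above map onto plain `curl` calls like the following; the hostname, port, and folder names are placeholders

```bash
# grab a folder as a gzip-level-9 gnu-tar, and as a windows-friendly zip
curl -o music.tar.gz 'http://127.0.0.1:3923/music/?tar=gz:9'
curl -o music.zip    'http://127.0.0.1:3923/music/?zip'

# list one level of subfolders, then show recent uploads from your IP containing "flac"
curl 'http://127.0.0.1:3923/music/?tree=.'
curl 'http://127.0.0.1:3923/?ups&filter=flac'
```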
|
||||
|
||||
@@ -148,7 +148,7 @@ symbol legend,
|
||||
|
||||
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
|
||||
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
|
||||
| download folder as zip | █ | █ | █ | █ | █ | | █ | | █ | █ | ╱ | █ |
|
||||
| download folder as zip | █ | █ | █ | █ | ╱ | | █ | | █ | █ | ╱ | █ |
|
||||
| download folder as tar | █ | | | | | | | | | █ | | |
|
||||
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
|
||||
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | |
|
||||
@@ -183,6 +183,7 @@ symbol legend,
|
||||
* `cloud storage backend` = able to serve files from (and write to) s3 or similar cloud services; `╱` means the software can do this with some help from `rclone mount` as a bridge
|
||||
|
||||
* `a`/copyparty can reject uploaded files (based on complex conditions), for example [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py)
|
||||
* `e`/seafile download-as-zip is not streaming; it creates the full zipfile before download can start, and fails on big folders
|
||||
* `j`/filebrowser remarks:
|
||||
* can provide checksums for single files on request
|
||||
* can probably do extension/mimetype rejection similar to copyparty
|
||||
@@ -254,7 +255,7 @@ symbol legend,
|
||||
| per-file permissions | | | | █ | █ | | █ | | █ | | | |
|
||||
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | |
|
||||
| unmap subfolders | █ | | | | | | █ | | | █ | ╱ | • |
|
||||
| index.html blocks list | | | | | | | █ | | | • | | |
|
||||
| index.html blocks list | ╱ | | | | | | █ | | | • | | |
|
||||
| write-only folders | █ | | | | | | | | | | █ | █ |
|
||||
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ |
|
||||
| file versioning | | | | █ | █ | | | | | | | |
|
||||
@@ -290,6 +291,7 @@ symbol legend,
|
||||
* one-way folder sync from local to server can be done efficiently with [u2c.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy), or with webdav and conventional rsync
|
||||
* can hot-reload config files (with just a few exceptions)
|
||||
* can set per-folder permissions if that folder is made into a separate volume, so there is configuration overhead
|
||||
* `index.html` on its own does not prevent directory listing, but permission `h` (instead of `r`) enforces index.html to be returned instead of folder contents
|
||||
* [event hooks](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks) ([discord](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png), [desktop](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png)) inspired by filebrowser, as well as the more complex [media parser](https://github.com/9001/copyparty/tree/hovudstraum/bin/mtag) alternative
|
||||
* upload history can be visualized using [partyjournal](https://github.com/9001/copyparty/blob/hovudstraum/bin/partyjournal.py)
|
||||
* `k`/filegator remarks:
|
||||
@@ -432,6 +434,7 @@ symbol legend,
|
||||
* not that bad, can probably be remedied with bindmounts or maybe symlinks
|
||||
* ⚠️ uploads not resumable / accelerated / integrity-checked
|
||||
* ⚠️ on cloudflare: max upload size 100 MiB
|
||||
* ⚠️ uploading small files is slow; `2.2` files per sec (copyparty does `87`/sec), tested locally with [linuxserver/nextcloud](https://hub.docker.com/r/linuxserver/nextcloud) (sqlite)
|
||||
* ⚠️ no write-only / upload-only folders
|
||||
* ⚠️ http/webdav only; no ftp, zeroconf
|
||||
* ⚠️ less awesome music player
|
||||
@@ -451,7 +454,9 @@ symbol legend,
|
||||
* *much worse than nextcloud* in that regard
|
||||
* ⚠️ uploads not resumable / accelerated / integrity-checked
|
||||
* ⚠️ on cloudflare: max upload size 100 MiB
|
||||
* ⚠️ uploading small files is slow; `2.7` files per sec (copyparty does `87`/sec), tested locally with [official container](https://manual.seafile.com/docker/deploy_seafile_with_docker/)
|
||||
* ⚠️ no write-only / upload-only folders
|
||||
* ⚠️ big folders cannot be zip-downloaded
|
||||
* ⚠️ http/webdav only; no ftp, zeroconf
|
||||
* ⚠️ less awesome music player
|
||||
* ⚠️ doesn't run on android or ipads
|
||||
|
||||
68
scripts/bench/filehash.sh
Executable file
@@ -0,0 +1,68 @@
|
||||
#!/bin/bash
|
||||
set -euo pipefail
|
||||
|
||||
# check how fast copyparty is able to hash files during indexing
|
||||
# assuming an infinitely fast HDD to read from (in other words,
|
||||
# checks whether you will be bottlenecked by CPU or HDD)
|
||||
#
|
||||
# uses copyparty's default config of using, well, it's complicated:
|
||||
# * if you have more than 8 cores, then 5 threads,
|
||||
# * if you have between 4 and 8, then 4 threads,
|
||||
# * anything less and it takes your number of cores
|
||||
#
|
||||
# can be adjusted with --hash-mt (but alpine caps out at 5)
|
||||
|
||||
[ $# -ge 1 ] || {
|
||||
echo 'need arg 1: path to copyparty-sfx.py'
|
||||
echo ' (remaining args will be passed on to copyparty,'
|
||||
echo ' for example to tweak the hasher settings)'
|
||||
exit 1
|
||||
}
|
||||
sfx="$1"
|
||||
shift
|
||||
sfx="$(realpath "$sfx" || readlink -e "$sfx" || echo "$sfx")"
|
||||
awk=$(command -v gawk || command -v awk)
|
||||
|
||||
# try to use /dev/shm to avoid hitting filesystems at all,
|
||||
# otherwise fallback to mktemp which probably uses /tmp
|
||||
td=/dev/shm/cppbenchtmp
|
||||
mkdir $td || td=$(mktemp -d)
|
||||
trap "rm -rf $td" INT TERM EXIT
|
||||
cd $td
|
||||
|
||||
echo creating 256 MiB testfile in $td
|
||||
head -c $((1024*1024*256)) /dev/urandom > 1
|
||||
|
||||
echo creating 127 symlinks to it
|
||||
for n in $(seq 2 128); do ln -s 1 $n; done
|
||||
|
||||
echo warming up cache
|
||||
cat 1 >/dev/null
|
||||
|
||||
echo ok lets go
|
||||
python3 "$sfx" -p39204 -e2dsa --dbd=yolo --exit=idx -lo=t -q "$@"
|
||||
|
||||
echo and the results are...
|
||||
$awk '/1 volumes in / {printf "%s MiB/s\n", 256*128/$(NF-1)}' <t
|
||||
|
||||
echo deleting $td and exiting
|
||||
|
||||
##
|
||||
## some results:
|
||||
|
||||
# MiB/s @ cpu or device (copyparty, pythonver, distro/os) // comment
|
||||
|
||||
# 3608 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.5, fedora 38) // --hash-mt=6; laptop
|
||||
# 2726 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.5, fedora 38) // --hash-mt=4 (old-default)
|
||||
# 2202 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.5, docker-alpine 3.18.3) // ??? alpine slow
|
||||
# 2719 @ Ryzen 5 4500U (cpp 1.9.5, py 3.11.2, docker-debian 12.1)
|
||||
|
||||
# 5544 @ Intel i5-12500 (cpp 1.9.5, py 3.11.2, debian 12.0) // --hash-mt=12; desktop
|
||||
# 5197 @ Ryzen 7 3700X (cpp 1.9.5, py 3.9.18, freebsd 13.2) // --hash-mt=8; 2u server
|
||||
# 2606 @ Ryzen 7 3700X (cpp 1.9.5, py 3.9.18, freebsd 13.2) // --hash-mt=4 (old-default)
|
||||
# 1436 @ Ryzen 5 5500U (cpp 1.9.5, py 3.11.4, alpine 3.18.3) // nuc
|
||||
# 1065 @ Pixel 7 (cpp 1.9.5, py 3.11.5, termux 2023-09)
|
||||
|
||||
# notes,
|
||||
# podman run --rm -it --shm-size 512m --entrypoint /bin/ash localhost/copyparty-min
|
||||
# podman <filehash.sh run --rm -i --shm-size 512m --entrypoint /bin/ash localhost/copyparty-min -s - /z/copyparty-sfx.py
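for completeness, running the benchmark directly on the host looks something like this; the sfx path and the `--hash-mt` value are placeholders, and any extra args are passed straight through to copyparty as described in the script header

```bash
# benchmark hashing with 8 hasher threads instead of the default
./scripts/bench/filehash.sh ~/Downloads/copyparty-sfx.py --hash-mt=8
```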
|
||||
@@ -3,6 +3,7 @@ WORKDIR /z
|
||||
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
|
||||
ver_hashwasm=4.9.0 \
|
||||
ver_marked=4.3.0 \
|
||||
ver_dompf=3.0.5 \
|
||||
ver_mde=2.18.0 \
|
||||
ver_codemirror=5.65.12 \
|
||||
ver_fontawesome=5.13.0 \
|
||||
@@ -13,6 +14,7 @@ ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
|
||||
# https://github.com/markedjs/marked/releases
|
||||
# https://github.com/Ionaru/easy-markdown-editor/tags
|
||||
# https://github.com/codemirror/codemirror5/releases
|
||||
# https://github.com/cure53/DOMPurify/releases
|
||||
# https://github.com/Daninet/hash-wasm/releases
|
||||
# https://github.com/openpgpjs/asmcrypto.js
|
||||
# https://github.com/google/zopfli/tags
|
||||
@@ -23,10 +25,12 @@ ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
|
||||
RUN mkdir -p /z/dist/no-pk \
|
||||
&& wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
|
||||
&& apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev brotli py3-brotli \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& wget https://github.com/openpgpjs/asmcrypto.js/archive/$ver_asmcrypto.tar.gz -O asmcrypto.tgz \
|
||||
&& wget https://github.com/markedjs/marked/archive/v$ver_marked.tar.gz -O marked.tgz \
|
||||
&& wget https://github.com/Ionaru/easy-markdown-editor/archive/$ver_mde.tar.gz -O mde.tgz \
|
||||
&& wget https://github.com/codemirror/codemirror5/archive/$ver_codemirror.tar.gz -O codemirror.tgz \
|
||||
&& wget https://github.com/cure53/DOMPurify/archive/refs/tags/$ver_dompf.tar.gz -O dompurify.tgz \
|
||||
&& wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
|
||||
&& wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
|
||||
&& wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
|
||||
@@ -48,6 +52,7 @@ RUN mkdir -p /z/dist/no-pk \
|
||||
&& cd easy-markdown-editor* \
|
||||
&& npm install \
|
||||
&& npm i gulp-cli -g ) \
|
||||
&& tar -xf dompurify.tgz \
|
||||
&& tar -xf prism.tgz \
|
||||
&& unzip fontawesome.zip \
|
||||
&& tar -xf zopfli.tgz
|
||||
@@ -120,6 +125,10 @@ RUN cd easy-markdown-editor-$ver_mde \
|
||||
&& cp -pv dist/easymde.min.js /z/dist/easymde.js
|
||||
|
||||
|
||||
# build dompurify
|
||||
RUN (echo; cat DOMPurify-$ver_dompf/dist/purify.min.js) >> /z/dist/marked.js
|
||||
|
||||
|
||||
# build fontawesome and scp
|
||||
COPY mini-fa.sh /z
|
||||
COPY mini-fa.css /z
|
||||
@@ -134,10 +143,11 @@ RUN ./genprism.sh $ver_prism
|
||||
|
||||
|
||||
# compress
|
||||
COPY zopfli.makefile /z/dist/Makefile
|
||||
COPY brotli.makefile zopfli.makefile /z/dist/
|
||||
RUN cd /z/dist \
|
||||
&& make -j$(nproc) \
|
||||
&& rm Makefile \
|
||||
&& make -j$(nproc) -f brotli.makefile \
|
||||
&& make -j$(nproc) -f zopfli.makefile \
|
||||
&& rm *.makefile \
|
||||
&& mv no-pk/* . \
|
||||
&& rmdir no-pk
|
||||
|
||||
|
||||
4
scripts/deps-docker/brotli.makefile
Normal file
@@ -0,0 +1,4 @@
|
||||
all: $(addsuffix .br, $(wildcard easymde*))
|
||||
|
||||
%.br: %
|
||||
brotli -jZ $<
|
||||
@@ -1,10 +1,6 @@
|
||||
all: $(addsuffix .gz, $(wildcard *.*))
|
||||
all: $(addsuffix .gz, $(wildcard *.js *.css))
|
||||
|
||||
%.gz: %
|
||||
#brotli -q 11 $<
|
||||
pigz -11 -I 573 $<
|
||||
|
||||
# pigz -11 -J 34 -I 100 -F < $< > $@.first
|
||||
|
||||
# disabling brotli after all since the gain is meh
|
||||
# and it bloats sfx and wheels by like 70%
|
||||
|
||||
@@ -5,7 +5,8 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-ac" \
|
||||
org.opencontainers.image.description="copyparty with Pillow and FFmpeg (image/audio/video thumbnails, audio transcoding, media tags)"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
RUN apk --no-cache add !pyc \
|
||||
wget \
|
||||
@@ -19,4 +20,4 @@ RUN apk --no-cache add !pyc \
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
@@ -5,7 +5,8 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-dj" \
|
||||
org.opencontainers.image.description="copyparty with all optional dependencies, including musical key / bpm detection"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
COPY i/bin/mtag/install-deps.sh ./
|
||||
COPY i/bin/mtag/audio-bpm.py /mtag/
|
||||
@@ -22,6 +23,7 @@ RUN apk add -U !pyc \
|
||||
python3-dev ffmpeg-dev fftw-dev libsndfile-dev \
|
||||
py3-wheel py3-numpy-dev \
|
||||
vamp-sdk-dev \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install pyvips \
|
||||
&& bash install-deps.sh \
|
||||
&& apk del py3-pip .bd \
|
||||
@@ -35,4 +37,8 @@ RUN apk add -U !pyc \
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
# size: 286 MB
|
||||
# bpm/key: 529 sec
|
||||
# idx-bench: 2352 MB/s
|
||||
|
||||
78
scripts/docker/Dockerfile.djd
Normal file
@@ -0,0 +1,78 @@
|
||||
FROM debian:12-slim
|
||||
WORKDIR /z
|
||||
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-djd" \
|
||||
org.opencontainers.image.description="copyparty with all optional dependencies, including musical key / bpm detection, and higher performance than the other editions"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
COPY i/bin/mtag/install-deps.sh ./
|
||||
COPY i/bin/mtag/audio-bpm.py /mtag/
|
||||
COPY i/bin/mtag/audio-key.py /mtag/
|
||||
|
||||
# Suites: bookworm bookworm-updates
|
||||
# Components: main
|
||||
RUN sed -ri 's/( main)$/\1 contrib non-free non-free-firmware/' /etc/apt/sources.list.d/debian.sources \
|
||||
&& apt update \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt install -y --no-install-recommends \
|
||||
wget \
|
||||
python3-argon2 python3-pillow python3-pip \
|
||||
ffmpeg libvips42 vamp-plugin-sdk \
|
||||
python3-numpy libfftw3-double3 libsndfile1 \
|
||||
gcc g++ make cmake patchelf jq \
|
||||
libavcodec-dev libavdevice-dev libavfilter-dev libavformat-dev libavutil-dev \
|
||||
libfftw3-dev python3-dev libsndfile1-dev python3-pip \
|
||||
patchelf cmake \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install --user pyvips \
|
||||
&& bash install-deps.sh \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt purge -y \
|
||||
gcc g++ make cmake patchelf jq \
|
||||
libavcodec-dev libavdevice-dev libavfilter-dev libavformat-dev libavutil-dev \
|
||||
libfftw3-dev python3-dev libsndfile1-dev python3-pip \
|
||||
patchelf cmake \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt-get autoremove -y \
|
||||
&& dpkg -r --force-depends libraqm0 libgdbm-compat4 libgdbm6 libperl5.36 perl-modules-5.36 mailcap mime-support \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt-get clean -y \
|
||||
&& find /usr/ -name __pycache__ | xargs rm -rf \
|
||||
&& find /usr/ -type d -name tests | grep packages/numpy | xargs rm -rf \
|
||||
&& rm -rf \
|
||||
/var/lib/apt/lists/* \
|
||||
/tmp/pyc \
|
||||
/usr/lib/python*/dist-packages/pip \
|
||||
/usr/lib/python*/dist-packages/setuptools \
|
||||
/usr/lib/*/dri \
|
||||
/usr/lib/*/mfx \
|
||||
/usr/share/doc \
|
||||
/usr/share/X11 \
|
||||
/usr/share/fonts \
|
||||
/usr/share/libmysofa \
|
||||
/usr/share/libthai \
|
||||
/usr/share/alsa \
|
||||
/usr/share/bash-completion \
|
||||
&& chmod 777 /root \
|
||||
&& ln -s /root/vamp /root/.local / \
|
||||
&& mkdir /cfg /w \
|
||||
&& chmod 777 /cfg /w \
|
||||
&& echo % /cfg > initcfg
|
||||
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
# size: 598 MB
|
||||
# bpm/key: 485 sec
|
||||
# idx-bench: 2751 MB/s
|
||||
|
||||
# notes:
|
||||
# libraqm0 (pillow dep) pulls in the other packages mentioned on the dpkg line; saves 50m
|
||||
|
||||
# advantage: official packages only
|
||||
# advantage: ffmpeg with gme, codec2, radiance-hdr
|
||||
# drawback: ffmpeg bloat; dc1394, flite, mfx, xorg
|
||||
# drawback: python packaging is a bit jank
|
||||
# drawback: they apply exciting patches due to old deps
|
||||
# drawback: dropping perl the hard way might cause issues
|
||||
69
scripts/docker/Dockerfile.djf
Normal file
@@ -0,0 +1,69 @@
|
||||
FROM fedora:38
|
||||
WORKDIR /z
|
||||
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-djf" \
|
||||
org.opencontainers.image.description="copyparty with all optional dependencies, including musical key / bpm detection, and higher performance than the other editions"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
COPY i/bin/mtag/install-deps.sh ./
|
||||
COPY i/bin/mtag/audio-bpm.py /mtag/
|
||||
COPY i/bin/mtag/audio-key.py /mtag/
|
||||
RUN dnf install -y \
|
||||
https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
|
||||
https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm \
|
||||
&& dnf install -y --setopt=install_weak_deps=False \
|
||||
wget \
|
||||
python3-argon2-cffi python3-pillow python3-pip python3-cffi \
|
||||
ffmpeg \
|
||||
vips vips-jxl vips-poppler vips-magick \
|
||||
python3-numpy fftw libsndfile \
|
||||
gcc gcc-c++ make cmake patchelf jq \
|
||||
python3-devel ffmpeg-devel fftw-devel libsndfile-devel python3-setuptools \
|
||||
vamp-plugin-sdk qm-vamp-plugins \
|
||||
vamp-plugin-sdk-devel vamp-plugin-sdk-static \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install --user pyvips \
|
||||
&& bash install-deps.sh \
|
||||
&& dnf erase -y \
|
||||
gcc gcc-c++ make cmake patchelf jq \
|
||||
python3-devel ffmpeg-devel fftw-devel libsndfile-devel python3-setuptools \
|
||||
vamp-plugin-sdk-devel vamp-plugin-sdk-static \
|
||||
&& dnf clean all \
|
||||
&& find /usr/ -name __pycache__ | xargs rm -rf \
|
||||
&& find /usr/ -type d -name tests | grep packages/numpy | xargs rm -rf \
|
||||
&& rm -rf \
|
||||
/usr/share/adobe \
|
||||
/usr/share/fonts \
|
||||
/usr/share/graphviz \
|
||||
/usr/share/poppler/cMap \
|
||||
/usr/share/licenses \
|
||||
/usr/share/ghostscript \
|
||||
/usr/share/tesseract \
|
||||
/usr/share/X11 \
|
||||
/usr/share/hwdata \
|
||||
/usr/share/python-wheels \
|
||||
/usr/bin/cyrusbdb2current \
|
||||
&& rm -rf /tmp/pyc \
|
||||
&& chmod 777 /root \
|
||||
&& ln -s /root/vamp /root/.local / \
|
||||
&& mkdir /cfg /w \
|
||||
&& chmod 777 /cfg /w \
|
||||
&& echo % /cfg > initcfg
|
||||
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
# size: 648 MB
|
||||
# bpm/key: 410 sec
|
||||
# idx-bench: 2744 MB/s
|
||||
|
||||
# advantage: fairly recent and sane ffmpeg build
|
||||
# drawback: ffmpeg without gme, codec2, radiance-hdr
|
||||
# drawback: ffmpeg from rpmfusion, which is both better and smaller than ffmpeg-free, can occasionally fail to install due to repo desync / conflicts
|
||||
# drawback: ffmpeg bloat; samba, modplug, v4l2
|
||||
# drawback: manual purging (graphviz/poppler/cmap) can break stuff
|
||||
62
scripts/docker/Dockerfile.djff
Normal file
@@ -0,0 +1,62 @@
|
||||
FROM fedora:38
|
||||
WORKDIR /z
|
||||
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-djff" \
|
||||
org.opencontainers.image.description="copyparty with all optional dependencies, including musical key / bpm detection, and higher performance than the other editions"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
COPY i/bin/mtag/install-deps.sh ./
|
||||
COPY i/bin/mtag/audio-bpm.py /mtag/
|
||||
COPY i/bin/mtag/audio-key.py /mtag/
|
||||
RUN dnf install -y --setopt=install_weak_deps=False \
|
||||
wget \
|
||||
python3-argon2-cffi python3-pillow python3-pip python3-cffi \
|
||||
ffmpeg-free \
|
||||
vips vips-jxl vips-poppler vips-magick \
|
||||
python3-numpy fftw libsndfile \
|
||||
gcc gcc-c++ make cmake patchelf jq \
|
||||
python3-devel ffmpeg-free-devel fftw-devel libsndfile-devel python3-setuptools \
|
||||
vamp-plugin-sdk qm-vamp-plugins \
|
||||
vamp-plugin-sdk-devel vamp-plugin-sdk-static \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install --user pyvips \
|
||||
&& bash install-deps.sh \
|
||||
&& dnf erase -y \
|
||||
gcc gcc-c++ make cmake patchelf jq \
|
||||
python3-devel ffmpeg-free-devel fftw-devel libsndfile-devel python3-setuptools \
|
||||
vamp-plugin-sdk-devel vamp-plugin-sdk-static \
|
||||
&& dnf clean all \
|
||||
&& find /usr/ -name __pycache__ | xargs rm -rf \
|
||||
&& find /usr/ -type d -name tests | grep packages/numpy | xargs rm -rf \
|
||||
&& rm -rf \
|
||||
/usr/share/adobe \
|
||||
/usr/share/fonts \
|
||||
/usr/share/graphviz \
|
||||
/usr/share/poppler/cMap \
|
||||
/usr/share/licenses \
|
||||
/usr/share/ghostscript \
|
||||
/usr/share/tesseract \
|
||||
/usr/share/X11 \
|
||||
/usr/share/hwdata \
|
||||
/usr/share/python-wheels \
|
||||
/usr/bin/cyrusbdb2current \
|
||||
&& rm -rf /tmp/pyc \
|
||||
&& chmod 777 /root \
|
||||
&& ln -s /root/vamp /root/.local / \
|
||||
&& mkdir /cfg /w \
|
||||
&& chmod 777 /cfg /w \
|
||||
&& echo % /cfg > initcfg
|
||||
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
# size: 673 MB
|
||||
# bpm/key: 408 sec
|
||||
# idx-bench: 2744 MB/s
|
||||
|
||||
# drawback: fewer ffmpeg features we want, more features we don't (compared to rpmfusion)
|
||||
69
scripts/docker/Dockerfile.dju
Normal file
@@ -0,0 +1,69 @@
|
||||
FROM ubuntu:23.04
|
||||
WORKDIR /z
|
||||
LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-dju" \
|
||||
org.opencontainers.image.description="copyparty with all optional dependencies, including musical key / bpm detection, and higher performance than the other editions"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
COPY i/bin/mtag/install-deps.sh ./
|
||||
COPY i/bin/mtag/audio-bpm.py /mtag/
|
||||
COPY i/bin/mtag/audio-key.py /mtag/
|
||||
|
||||
RUN apt update \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt install -y --no-install-recommends \
|
||||
wget \
|
||||
python3-argon2 python3-pillow python3-pip \
|
||||
ffmpeg libvips42 vamp-plugin-sdk \
|
||||
python3-numpy libfftw3-double3 libsndfile1 \
|
||||
gcc g++ make cmake patchelf jq \
|
||||
libavcodec-dev libavdevice-dev libavfilter-dev libavformat-dev libavutil-dev \
|
||||
libfftw3-dev python3-dev libsndfile1-dev python3-pip \
|
||||
patchelf cmake \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED
|
||||
|
||||
RUN python3 -m pip install --user pyvips \
|
||||
&& bash install-deps.sh \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt purge -y \
|
||||
gcc g++ make cmake patchelf jq \
|
||||
libavcodec-dev libavdevice-dev libavfilter-dev libavformat-dev libavutil-dev \
|
||||
libfftw3-dev python3-dev libsndfile1-dev python3-pip \
|
||||
patchelf cmake \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt-get clean -y \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt-get autoremove -y \
|
||||
&& find /usr/ -name __pycache__ | xargs rm -rf \
|
||||
&& find /usr/ -type d -name tests | grep site-packages/numpy | xargs rm -rf \
|
||||
&& rm -rf \
|
||||
/var/lib/apt/lists/* \
|
||||
/tmp/pyc \
|
||||
/usr/lib/python*/dist-packages/pip \
|
||||
/usr/lib/python*/dist-packages/setuptools \
|
||||
/usr/lib/*/dri \
|
||||
/usr/lib/*/mfx \
|
||||
/usr/share/doc \
|
||||
/usr/share/X11 \
|
||||
/usr/share/fonts \
|
||||
/usr/share/libmysofa \
|
||||
/usr/share/libthai \
|
||||
/usr/share/alsa \
|
||||
/usr/share/bash-completion \
|
||||
&& chmod 777 /root \
|
||||
&& ln -s /root/vamp /root/.local / \
|
||||
&& mkdir /cfg /w \
|
||||
&& chmod 777 /cfg /w \
|
||||
&& echo % /cfg > initcfg
|
||||
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
# size: 1198 MB (wowee)
|
||||
# bpm/key: 516 sec
|
||||
# idx-bench: 2751 MB/s
|
||||
|
||||
# advantage: official packages only
|
||||
# advantage: ffmpeg with gme, codec2, radiance-hdr
|
||||
# drawback: python packaging is a bit jank
|
||||
@@ -5,7 +5,8 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-im" \
|
||||
org.opencontainers.image.description="copyparty with Pillow and Mutagen (image thumbnails, media tags)"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
RUN apk --no-cache add !pyc \
|
||||
wget \
|
||||
@@ -18,4 +19,4 @@ RUN apk --no-cache add !pyc \
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
@@ -5,7 +5,8 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-iv" \
|
||||
org.opencontainers.image.description="copyparty with Pillow, FFmpeg, libvips (image/audio/video thumbnails, audio transcoding, media tags)"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
RUN apk add -U !pyc \
|
||||
wget \
|
||||
@@ -15,6 +16,7 @@ RUN apk add -U !pyc \
|
||||
&& apk add -t .bd \
|
||||
bash wget gcc g++ make cmake patchelf \
|
||||
python3-dev py3-wheel \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install pyvips \
|
||||
&& apk del py3-pip .bd \
|
||||
&& rm -rf /var/cache/apk/* /tmp/pyc \
|
||||
@@ -25,4 +27,4 @@ RUN apk add -U !pyc \
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "-c", "/z/initcfg"]
|
||||
|
||||
@@ -5,7 +5,8 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-min" \
|
||||
org.opencontainers.image.description="just copyparty, no thumbnails / media tags / audio transcoding"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
RUN apk --no-cache add !pyc \
|
||||
python3 \
|
||||
@@ -17,4 +18,4 @@ RUN apk --no-cache add !pyc \
|
||||
COPY i/dist/copyparty-sfx.py ./
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "-c", "/z/initcfg"]
|
||||
ENTRYPOINT ["python3", "/z/copyparty-sfx.py", "--no-crt", "--no-thumb", "-c", "/z/initcfg"]
|
||||
|
||||
@@ -5,9 +5,11 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
|
||||
org.opencontainers.image.licenses="MIT" \
|
||||
org.opencontainers.image.title="copyparty-min-pip" \
|
||||
org.opencontainers.image.description="just copyparty, no thumbnails, no media tags, no audio transcoding"
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc
|
||||
ENV PYTHONPYCACHEPREFIX=/tmp/pyc \
|
||||
XDG_CONFIG_HOME=/cfg
|
||||
|
||||
RUN apk --no-cache add python3 py3-pip !pyc \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install copyparty \
|
||||
&& apk del py3-pip \
|
||||
&& rm -rf /tmp/pyc \
|
||||
@@ -17,4 +19,4 @@ RUN apk --no-cache add python3 py3-pip !pyc \
|
||||
|
||||
WORKDIR /w
|
||||
EXPOSE 3923
|
||||
ENTRYPOINT ["python3", "-m", "copyparty", "-c", "/z/initcfg"]
|
||||
ENTRYPOINT ["python3", "-m", "copyparty", "--no-crt", "--no-thumb", "-c", "/z/initcfg"]
|
||||
|
||||
@@ -15,6 +15,7 @@ docker run --rm -it -u 1000 -p 3923:3923 -v /mnt/nas:/w -v $PWD/cfgdir:/cfg copy
|
||||
* `copyparty/ac` is the recommended [image edition](#editions)
|
||||
* you can download the image from github instead by replacing `copyparty/ac` with `ghcr.io/9001/copyparty-ac`
|
||||
* if you are using rootless podman, remove `-u 1000`
|
||||
* if you have selinux, append `:z` to all `-v` args (for example `-v /mnt/nas:/w:z`)
|
||||
|
||||
i'm unfamiliar with docker-compose and alternatives so let me know if this section could be better 🙏
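putting the bullets above together, a rootless-podman launch on an selinux host would look roughly like this -- same volumes as the docker example, `-u 1000` dropped, `:z` appended; the host paths are placeholders

```bash
podman run --rm -it -p 3923:3923 \
  -v /mnt/nas:/w:z \
  -v "$PWD/cfgdir:/cfg:z" \
  ghcr.io/9001/copyparty-ac
```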
|
||||
|
||||
|
||||
@@ -134,6 +134,14 @@ tmv() {
|
||||
touch -r "$1" t
|
||||
mv t "$1"
|
||||
}
|
||||
iawk() {
|
||||
awk "$1" <"$2" >t
|
||||
tmv "$2"
|
||||
}
|
||||
ised() {
|
||||
sed -r "$1" <"$2" >t
|
||||
tmv "$2"
|
||||
}
|
||||
|
||||
stamp=$(
|
||||
for d in copyparty scripts; do
|
||||
@@ -229,9 +237,7 @@ necho() {
|
||||
mv python-magic-*/magic .
|
||||
rm -rf python-magic-*
|
||||
rm magic/compat.py
|
||||
f=magic/__init__.py
|
||||
awk '/^def _add_compat/{o=1} !o; /^_add_compat/{o=0}' <$f >t
|
||||
tmv "$f"
|
||||
iawk '/^def _add_compat/{o=1} !o; /^_add_compat/{o=0}' magic/__init__.py
|
||||
mv magic ftp/ # doesn't provide a version label anyways
|
||||
|
||||
# enable this to dynamically remove type hints at startup,
|
||||
@@ -398,7 +404,7 @@ find -type f -name ._\* | while IFS= read -r f; do cmp <(printf '\x00\x05\x16')
|
||||
|
||||
rm -f copyparty/web/deps/*.full.* copyparty/web/dbg-* copyparty/web/Makefile
|
||||
|
||||
find copyparty | LC_ALL=C sort | sed 's/\.gz$//;s/$/,/' > have
|
||||
find copyparty | LC_ALL=C sort | sed -r 's/\.(gz|br)$//;s/$/,/' > have
|
||||
cat have | while IFS= read -r x; do
|
||||
grep -qF -- "$x" ../scripts/sfx.ls || {
|
||||
echo "unexpected file: $x"
|
||||
@@ -407,6 +413,11 @@ cat have | while IFS= read -r x; do
|
||||
done
|
||||
rm have
|
||||
|
||||
ised /fork_process/d ftp/pyftpdlib/servers.py
|
||||
iawk '/^class _Base/{s=1}!s' ftp/pyftpdlib/authorizers.py
|
||||
iawk '/^ {0,4}[^ ]/{s=0}/^ {4}def (serve_forever|_loop)/{s=1}!s' ftp/pyftpdlib/servers.py
|
||||
rm -f ftp/pyftpdlib/{__main__,prefork}.py
|
||||
|
||||
[ $no_ftp ] &&
|
||||
rm -rf copyparty/ftpd.py ftp asyncore.py asynchat.py &&
|
||||
sed -ri '/add_argument\("--ftp/d' copyparty/__main__.py &&
|
||||
@@ -423,9 +434,7 @@ rm have
|
||||
[ $no_cm ] && {
|
||||
rm -rf copyparty/web/mde.* copyparty/web/deps/easymde*
|
||||
echo h > copyparty/web/mde.html
|
||||
f=copyparty/web/md.html
|
||||
sed -r '/edit2">edit \(fancy/d' <$f >t
|
||||
tmv "$f"
|
||||
ised '/edit2">edit \(fancy/d' copyparty/web/md.html
|
||||
}
|
||||
|
||||
[ $no_hl ] &&
|
||||
@@ -435,23 +444,20 @@ rm have
|
||||
rm -f copyparty/web/deps/scp.woff2
|
||||
f=copyparty/web/ui.css
|
||||
gzip -d "$f.gz" || true
|
||||
sed -r "s/src:.*scp.*\)/src:local('Consolas')/" <$f >t
|
||||
tmv "$f"
|
||||
ised "s/src:.*scp.*\)/src:local('Consolas')/" $f
|
||||
}
|
||||
|
||||
[ $no_dd ] && {
|
||||
rm -rf copyparty/web/dd
|
||||
f=copyparty/web/browser.css
|
||||
gzip -d "$f.gz" || true
|
||||
sed -r 's/(cursor: ?)url\([^)]+\), ?(pointer)/\1\2/; s/[0-9]+% \{cursor:[^}]+\}//; s/animation: ?cursor[^};]+//' <$f >t
|
||||
tmv "$f"
|
||||
ised 's/(cursor: ?)url\([^)]+\), ?(pointer)/\1\2/; s/[0-9]+% \{cursor:[^}]+\}//; s/animation: ?cursor[^};]+//' $f
|
||||
}
|
||||
|
||||
[ $langs ] &&
|
||||
for f in copyparty/web/{browser.js,splash.js}; do
|
||||
gzip -d "$f.gz" || true
|
||||
awk '/^\}/{l=0} !l; /^var Ls =/{l=1;next} o; /^\t["}]/{o=0} /^\t"'"$langs"'"/{o=1;print}' <$f >t
|
||||
tmv "$f"
|
||||
iawk '/^\}/{l=0} !l; /^var Ls =/{l=1;next} o; /^\t["}]/{o=0} /^\t"'"$langs"'"/{o=1;print}' $f
|
||||
done
|
||||
|
||||
[ ! $repack ] && [ ! $use_ox ] && {
|
||||
@@ -466,10 +472,20 @@ rm have
|
||||
# sed -ri '/: TypeAlias = /d' "$x"; done
|
||||
}
|
||||
|
||||
f=j2/jinja2/constants.py
|
||||
awk '/^LOREM_IPSUM_WORDS/{o=1;print "LOREM_IPSUM_WORDS = u\"a\"";next} !o; /"""/{o=0}' <$f >t
|
||||
tmv "$f"
|
||||
rm -f j2/jinja2/async*
|
||||
rm -f j2/jinja2/constants.py
|
||||
iawk '/^ {4}def /{s=0}/^ {4}def compile_templates\(/{s=1}!s' j2/jinja2/environment.py
|
||||
ised '/generate_lorem_ipsum/d' j2/jinja2/defaults.py
|
||||
iawk '/^def /{s=0}/^def generate_lorem_ipsum/{s=1}!s' j2/jinja2/utils.py
|
||||
iawk '/^(class|def) /{s=0}/^(class InternationalizationExtension|def _make_new_n?gettext)/{s=1}!s' j2/jinja2/ext.py
|
||||
iawk '/^[^ ]/{s=0}/^def babel_extract/{s=1}!s' j2/jinja2/ext.py
|
||||
ised '/InternationalizationExtension/d' j2/jinja2/ext.py
|
||||
iawk '/^class/{s=0}/^class (Package|Dict|Function|Prefix|Choice|Module)Loader/{s=1}!s' j2/jinja2/loaders.py
|
||||
sed -ri '/^from .bccache | (Package|Dict|Function|Prefix|Choice|Module)Loader$/d' j2/jinja2/__init__.py
|
||||
rm -f j2/jinja2/async* j2/jinja2/{bccache,sandbox}.py
|
||||
cat > j2/jinja2/_identifier.py <<'EOF'
|
||||
import re
|
||||
pattern = re.compile(r"\w+")
|
||||
EOF
|
||||
|
||||
grep -rLE '^#[^a-z]*coding: utf-8' j2 |
|
||||
while IFS= read -r f; do
|
||||
|
||||
@@ -9,7 +9,7 @@ tee build2.sh | cmp build.sh && rm build2.sh || {
|
||||
[[ $r =~ [yY] ]] && mv build{2,}.sh && exec ./build.sh
|
||||
}
|
||||
|
||||
[ -e up2k.sh ] && ./up2k.sh
|
||||
[ -e up2k.sh ] && [ ! "$1" ] && ./up2k.sh
|
||||
|
||||
uname -s | grep WOW64 && m=64 || m=32
|
||||
uname -s | grep NT-10 && w10=1 || w7=1
|
||||
@@ -73,6 +73,7 @@ excl=(
|
||||
multiprocessing
|
||||
pdb
|
||||
pickle
|
||||
PIL.EpsImagePlugin
|
||||
pyftpdlib.prefork
|
||||
urllib.request
|
||||
urllib.response
|
||||
|
||||
@@ -44,7 +44,7 @@ ckgh() {
|
||||
curl -s https://api.github.com/repos/$dep/releases | tee h |
|
||||
jq -r 'first|.assets|.[]|.name'
|
||||
)
|
||||
[ -z "$k" ] && echo "outdated: $dep" && cp h "ng-$dep" e=1
|
||||
[ -z "$k" ] && echo "outdated: $dep" && cp h "ng-$dep" && e=1
|
||||
done
|
||||
true
|
||||
}
|
||||
|
||||
@@ -1,12 +1,12 @@
|
||||
d5510a24cb5e15d6d30677335bbc7624c319b371c0513981843dc51d9b3a1e027661096dfcfc540634222bb2634be6db55bf95185b30133cb884f1e47652cf53 altgraph-0.17.3-py2.py3-none-any.whl
|
||||
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
|
||||
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
|
||||
2410f79f25b55829169fdd45611c04f51932f7701c0601df64ade0eb545c96ba950b7be186eb082482506bc689fcde5fe09c1f6f7cd77c2107028959b7e0d06f pyinstaller-5.12.0-py3-none-win32.whl
|
||||
62f4f3dda0526ea88cfc5af1806c7b53094672f4237d64c088626c226ad2fbc7549f6c9c6bbe5b228b1f87faf1e5c343ec468c485e4c17fe6d79c6b1f570153a pyinstaller-5.12.0-py3-none-win_amd64.whl
|
||||
2612c263f73a02eab41404ba96e0c7cf8be4475104668b47dfbae50fadf977b3621dd4102682b301264d82b6e130d95ea84a28bf2106a626a1a2845dac16df47 pyinstaller_hooks_contrib-2023.3-py2.py3-none-any.whl
|
||||
132a5380f33a245f2e744413a0e1090bc42b7356376de5121397cec5976b04b79f7c9ebe28af222c9c7b01461f7d7920810d220e337694727e0d7cd9e91fa667 pywin32_ctypes-0.2.0-py2.py3-none-any.whl
|
||||
f298e34356b5590dde7477d7b3a88ad39c622a2bcf3fcd7c53870ce8384dd510f690af81b8f42e121a22d3968a767d2e07595036b2ed7049c8ef4d112bcf3a61 pyinstaller-5.13.2-py3-none-win32.whl
|
||||
ea73aa54cc6d5db20dfb127e54562dabf890e4cd6171a91b10a51af2bcfc76e1d64cbdce4546df2dcfe42b624724c85b1cd05934be2413425b1f880222727b4f pyinstaller-5.13.2-py3-none-win_amd64.whl
|
||||
2f4e3927a38cf7757bc9a1c06370d79209669a285a80f1b09cf9917137825c7022a50a56b351807e6e687e2c3a7bd7b2c5cc6daeb4d90e11920284c1a04a1cc3 pyinstaller_hooks_contrib-2023.8-py2.py3-none-any.whl
|
||||
749a473646c6d4c7939989649733d4c7699fd1c359c27046bf5bc9c070d1a4b8b986bbc65f60d7da725baf16dbfdd75a4c2f5bb8335f2cb5685073f5fee5c2d1 pywin32_ctypes-0.2.2-py3-none-any.whl
|
||||
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
|
||||
4b6e9ae967a769fe32be8cf0bc0d5a213b138d1e0344e97656d08a3d15578d81c06c45b334c872009db2db8f39db0c77c94ff6c35168d5e13801917667c08678 upx-4.0.2-win32.zip
|
||||
8d16a967a0a7872a7575b1005cf66915deacda6ee8611fbb52f42fc3e3beb2f901a5140c942a5d146bd412b92bfa9cbadd82beeba83df6d70930c6dc26608a5b upx-4.1.0-win32.zip
|
||||
# u2c (win7)
|
||||
a7d259277af4948bf960682bc9fb45a44b9ae9a19763c8a7c313cef4aa9ec2d447d843e4a7c409e9312c8c8f863a24487a8ee4ffa6891e9b1c4e111bb4723861 certifi-2022.12.7-py3-none-any.whl
|
||||
2822c0dae180b1c8cfb7a70c8c00bad62af9afdbb18b656236680def9d3f1fcdcb8ef5eb64fc3b4c934385cd175ad5992a2284bcba78a243130de75b2d1650db charset_normalizer-3.1.0-cp37-cp37m-win32.whl
|
||||
@@ -25,6 +25,6 @@ ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a250675
|
||||
# win10
|
||||
00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
|
||||
7f8f4daa4f4f2dbf24cdd534b2952ee3fba6334eb42b37465ccda3aa1cccc3d6204aa6bfffb8a83bf42ec59c702b5b5247d4c8ee0d4df906334ae53072ef8c4c MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl
|
||||
4a20aeb52d4fde6aabcba05ee261595eeb5482c72ee27332690f34dd6e7a49c0b3ba3813202ac15c9d21e29f1cd803f2e79ccc1c45ec314fcd0a937016bcbc56 mutagen-1.46.0-py3-none-any.whl
|
||||
926d408a886059a75cf12706fa061146f9f042b27fb6e65be7d49f398ed23fb0227639d84804586ac014c6bcf7d08cd86a09c1a20793d341aa0802d3d32a546b Pillow-10.0.0-cp311-cp311-win_amd64.whl
|
||||
a48ee8992eee60a0d620dced71b9f96596f5dd510e3024015aca55884cdb3f9e2405734bfc13f3f40b79106a77bc442cce02ac4c8f5d16207448052b368fd52a python-3.11.4-amd64.exe
|
||||
8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
|
||||
08a033202b5c51e50609b2700dd69cbae30edb367f34762fd1633aae08b35949b4f67f12c75f25868a5b62b4956190d0cc8d201b170758d9c04a523bc8442b9b Pillow-10.0.1-cp311-cp311-win_amd64.whl
|
||||
c86bbeacad3ae3c7bde747f5b4f09c11eced841add14e79ec4a064e5e29ebca35460e543ba735b11bfb882837d5ff4371ce64492d28d096b4686233c9a8cda6d python-3.11.5-amd64.exe
|
||||
|
||||
@@ -17,14 +17,14 @@ uname -s | grep NT-10 && w10=1 || {
|
||||
fns=(
|
||||
altgraph-0.17.3-py2.py3-none-any.whl
|
||||
pefile-2023.2.7-py3-none-any.whl
|
||||
pyinstaller-5.10.1-py3-none-win_amd64.whl
|
||||
pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
|
||||
pywin32_ctypes-0.2.0-py2.py3-none-any.whl
|
||||
upx-4.0.2-win32.zip
|
||||
pyinstaller-5.13.2-py3-none-win_amd64.whl
|
||||
pyinstaller_hooks_contrib-2023.7-py2.py3-none-any.whl
|
||||
pywin32_ctypes-0.2.2-py3-none-any.whl
|
||||
upx-4.1.0-win32.zip
|
||||
)
|
||||
[ $w10 ] && fns+=(
|
||||
mutagen-1.46.0-py3-none-any.whl
|
||||
Pillow-9.4.0-cp311-cp311-win_amd64.whl
|
||||
mutagen-1.47.0-py3-none-any.whl
|
||||
Pillow-10.0.1-cp311-cp311-win_amd64.whl
|
||||
python-3.11.3-amd64.exe
|
||||
}
|
||||
[ $w7 ] && fns+=(
|
||||
@@ -43,12 +43,11 @@ fns=(
|
||||
)
|
||||
[ $w7x64 ] && fns+=(
|
||||
windows6.1-kb2533623-x64.msu
|
||||
pyinstaller-5.10.1-py3-none-win_amd64.whl
|
||||
python-3.7.9-amd64.exe
|
||||
)
|
||||
[ $w7x32 ] && fns+=(
|
||||
windows6.1-kb2533623-x86.msu
|
||||
pyinstaller-5.10.1-py3-none-win32.whl
|
||||
pyinstaller-5.13.2-py3-none-win32.whl
|
||||
python-3.7.9.exe
|
||||
)
|
||||
dl() { curl -fkLOC- "$1" && return 0; echo "$1"; return 1; }
|
||||
|
||||
@@ -21,6 +21,7 @@ copyparty/httpconn.py,
|
||||
copyparty/httpsrv.py,
|
||||
copyparty/ico.py,
|
||||
copyparty/mdns.py,
|
||||
copyparty/metrics.py,
|
||||
copyparty/mtag.py,
|
||||
copyparty/multicast.py,
|
||||
copyparty/pwhash.py,
|
||||
|
||||
@@ -100,6 +100,7 @@ def tc1(vflags):
|
||||
"-p4321",
|
||||
"-e2dsa",
|
||||
"-e2tsr",
|
||||
"--ban-403=no",
|
||||
"--dbd=yolo",
|
||||
"--no-mutagen",
|
||||
"--th-ff-jpg",
|
||||
|
||||
@@ -55,6 +55,11 @@ def uncomment(fpath):
|
||||
out += '"a"'
|
||||
elif token_type != tokenize.COMMENT or is_legalese:
|
||||
out += token_string
|
||||
else:
|
||||
if out.rstrip(" ").endswith("\n"):
|
||||
out = out.rstrip() + "\n"
|
||||
else:
|
||||
out = out.rstrip()
|
||||
|
||||
prev_toktype = token_type
|
||||
last_lineno = end_line
|
||||
|
||||
18
srv/chat.md
Normal file
@@ -0,0 +1,18 @@
|
||||
## chattyparty
|
||||
|
||||
this file, combined with the [msg-log](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/msg-log.py) hook, turns copyparty into a makeshift instant-messaging / chat service
|
||||
|
||||
name this file `README.md` and run copyparty as such:
|
||||
```bash
|
||||
python copyparty-sfx.py -emp --xm j,bin/hooks/msg-log.py
|
||||
```
|
||||
|
||||
only the stuff below is important; delete everything from this line up
|
||||
|
||||
```copyparty_post
|
||||
render2(dom) {
|
||||
if (/[?&]edit/.test(location)) return;
|
||||
setTimeout(function() { treectl.goto(); }, 1000);
|
||||
// if you wanna go to another folder: treectl.goto('foo <bar> baz/', true);
|
||||
}
|
||||
```
|
||||
@@ -12,7 +12,7 @@ import tempfile
|
||||
import unittest
|
||||
|
||||
from tests import util as tu
|
||||
from tests.util import Cfg
|
||||
from tests.util import Cfg, eprint
|
||||
|
||||
from copyparty.authsrv import AuthSrv
|
||||
from copyparty.httpcli import HttpCli
|
||||
@@ -93,7 +93,7 @@ class TestHttpCli(unittest.TestCase):
|
||||
res = "ok " + fp in ret
|
||||
print("[{}] {} {} = {}".format(fp, rok, wok, res))
|
||||
if rok != res:
|
||||
print("\033[33m{}\n# {}\033[0m".format(ret, furl))
|
||||
eprint("\033[33m{}\n# {}\033[0m".format(ret, furl))
|
||||
self.fail()
|
||||
|
||||
# file browser: html
|
||||
@@ -101,7 +101,7 @@ class TestHttpCli(unittest.TestCase):
|
||||
res = "'{}'".format(self.fn) in ret
|
||||
print(res)
|
||||
if rok != res:
|
||||
print("\033[33m{}\n# {}\033[0m".format(ret, durl))
|
||||
eprint("\033[33m{}\n# {}\033[0m".format(ret, durl))
|
||||
self.fail()
|
||||
|
||||
# file browser: json
|
||||
@@ -110,7 +110,7 @@ class TestHttpCli(unittest.TestCase):
|
||||
res = '"{}"'.format(self.fn) in ret
|
||||
print(res)
|
||||
if rok != res:
|
||||
print("\033[33m{}\n# {}\033[0m".format(ret, url))
|
||||
eprint("\033[33m{}\n# {}\033[0m".format(ret, url))
|
||||
self.fail()
|
||||
|
||||
# tar
|
||||
@@ -119,8 +119,11 @@ class TestHttpCli(unittest.TestCase):
|
||||
# with open(os.path.join(td, "tar"), "wb") as f:
|
||||
# f.write(b)
|
||||
try:
|
||||
tar = tarfile.open(fileobj=io.BytesIO(b)).getnames()
|
||||
tar = tarfile.open(fileobj=io.BytesIO(b), mode="r|").getnames()
|
||||
except:
|
||||
if "HTTP/1.1 403 Forbidden" not in h and b != b"\nJ2EOT":
|
||||
eprint("bad tar?", url, h, b)
|
||||
raise
|
||||
tar = []
|
||||
tar = [x.split("/", 1)[1] for x in tar]
|
||||
tar = ["/".join([y for y in [top, durl, x] if y]) for x in tar]
|
||||
@@ -132,7 +135,9 @@ class TestHttpCli(unittest.TestCase):
|
||||
if durl.split("/")[-1] in self.can_read:
|
||||
ref = [x for x in vfiles if self.in_dive(top + "/" + durl, x)]
|
||||
for f in ref:
|
||||
print("{}: {}".format("ok" if f in tar_ok else "NG", f))
|
||||
ok = f in tar_ok
|
||||
pr = print if ok else eprint
|
||||
pr("{}: {}".format("ok" if ok else "NG", f))
|
||||
ref.sort()
|
||||
tar_ok.sort()
|
||||
self.assertEqual(ref, tar_ok)
|
||||
|
||||
@@ -1,3 +1,7 @@
|
||||
#!/usr/bin/env python3
|
||||
# coding: utf-8
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import os
|
||||
import re
|
||||
import sys
|
||||
@@ -16,13 +20,19 @@ ANYWIN = WINDOWS or sys.platform in ["msys"]
|
||||
MACOS = platform.system() == "Darwin"
|
||||
|
||||
J2_ENV = jinja2.Environment(loader=jinja2.BaseLoader)
|
||||
J2_FILES = J2_ENV.from_string("{{ files|join('\n') }}")
|
||||
J2_FILES = J2_ENV.from_string("{{ files|join('\n') }}\nJ2EOT")
|
||||
|
||||
|
||||
def nah(*a, **ka):
|
||||
return False
|
||||
|
||||
|
||||
def eprint(*a, **ka):
|
||||
ka["file"] = sys.stderr
|
||||
print(*a, **ka)
|
||||
sys.stderr.flush()
|
||||
|
||||
|
||||
if MACOS:
|
||||
import posixpath
|
||||
|
||||
@@ -99,13 +109,13 @@ class Cfg(Namespace):
|
||||
def __init__(self, a=None, v=None, c=None):
|
||||
ka = {}
|
||||
|
||||
ex = "daw dav_auth dav_inf dav_mac dav_rt dotsrch e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp ed emp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_thumb no_vthumb no_zip nrand nw rand smb th_no_crop vague_403 vc ver xdev xlink xvol"
|
||||
ex = "daw dav_auth dav_inf dav_mac dav_rt dotsrch e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp ed emp force_js getmod grid hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nw rand smb th_no_crop vague_403 vc ver xdev xlink xvol"
|
||||
ka.update(**{k: False for k in ex.split()})
|
||||
|
||||
ex = "dotpart no_rescan no_sendfile no_voldump plain_ip"
|
||||
ka.update(**{k: True for k in ex.split()})
|
||||
|
||||
ex = "css_browser hist js_browser no_forget no_hash no_idx"
|
||||
ex = "css_browser hist js_browser no_forget no_hash no_idx nonsus_urls"
|
||||
ka.update(**{k: None for k in ex.split()})
|
||||
|
||||
ex = "s_thead s_tbody th_convt"
|
||||
@@ -114,7 +124,7 @@ class Cfg(Namespace):
|
||||
ex = "df loris re_maxage rproxy rsp_jtr rsp_slp s_wr_slp theme themes turbo"
|
||||
ka.update(**{k: 0 for k in ex.split()})
|
||||
|
||||
ex = "ah_alg doctitle favico html_head lg_sbf log_fk md_sbf mth name textfiles unlist R RS SR"
|
||||
ex = "ah_alg bname doctitle favico html_head lg_sbf log_fk md_sbf mth name textfiles unlist vname R RS SR"
|
||||
ka.update(**{k: "" for k in ex.split()})
|
||||
|
||||
ex = "on403 on404 xad xar xau xban xbd xbr xbu xiu xm"
|
||||
@@ -179,6 +189,8 @@ class VHttpSrv(object):
|
||||
|
||||
self.gpwd = Garda("")
|
||||
self.g404 = Garda("")
|
||||
self.g403 = Garda("")
|
||||
self.gurl = Garda("")
|
||||
|
||||
self.ptn_cc = re.compile(r"[\x00-\x1f]")
|
||||
|
||||
|
||||