Compare commits
93 Commits
| SHA1 |
|---|
| fff45552da |
| 95157d02c9 |
| 3090c74832 |
| 4195762d2a |
| dc3b7a2720 |
| ad200f2b97 |
| 897f9d328d |
| efbe34f29d |
| dbfc899d79 |
| 74fb4b0cb8 |
| 68e7000275 |
| 38c2dcce3e |
| 5b3a5fe76b |
| d5a9bd80b2 |
| 71c5565949 |
| db33d68d42 |
| e1c20c7a18 |
| d3f1b45ce3 |
| c7aa1a3558 |
| 7b2bd6da83 |
| 2bd955ba9f |
| 98dcaee210 |
| 361aebf877 |
| ffc1610980 |
| 233075aee7 |
| d1a4d335df |
| 96acbd3593 |
| 4b876dd133 |
| a06c5eb048 |
| c9cdc3e1c1 |
| c0becc6418 |
| b17ccc38ee |
| acfaacbd46 |
| 8e0364efad |
| e3043004ba |
| b2aaf40a3e |
| 21db8833dc |
| ec14c3944e |
| 20920e844f |
| f9954bc4e5 |
| d450f61534 |
| 2b50fc2010 |
| c2034f7bc5 |
| cec3bee020 |
| e1b9ac631f |
| 19ee64e5e3 |
| 4f397b9b5b |
| 71775dcccb |
| b383c08cc3 |
| fc88341820 |
| 43bbd566d7 |
| e1dea7ef3e |
| de2fedd2cd |
| 6aaafeee6d |
| 99f63adf58 |
| de2c978842 |
| 3c90cec0cd |
| 57a56073d8 |
| 2525d594c5 |
| a0ecc4d88e |
| accd003d15 |
| 9c2c423761 |
| 999789c742 |
| 14bb299918 |
| 0a33336dd4 |
| 6a2644fece |
| 5ab09769e1 |
| 782084056d |
| 494179bd1c |
| 29a17ae2b7 |
| 815d46f2c4 |
| 8417098c68 |
| 25974d660d |
| 12fcb42201 |
| 16462ee573 |
| 540664e0c2 |
| b5cb763ab1 |
| c24a0ec364 |
| 4accef00fb |
| d779525500 |
| 65a7706f77 |
| 5e12abbb9b |
| e0fe2b97be |
| bd33863f9f |
| a011139894 |
| 36866f1d36 |
| 407531bcb1 |
| 3adbb2ff41 |
| 499ae1c7a1 |
| 438ea6ccb0 |
| 598a29a733 |
| 6d102fc826 |
| fca07fbb62 |
1
.github/ISSUE_TEMPLATE/bug_report.md
vendored
@@ -8,6 +8,7 @@ assignees: '9001'
---

NOTE:
**please use english, or include an english translation.** aside from that,
all of the below are optional, consider them as inspiration, delete and rewrite at will, thx md

2
.github/ISSUE_TEMPLATE/feature_request.md
vendored
@@ -7,6 +7,8 @@ assignees: '9001'

---

NOTE:
**please use english, or include an english translation.** aside from that,
all of the below are optional, consider them as inspiration, delete and rewrite at will

**is your feature request related to a problem? Please describe.**

@@ -1,8 +1,21 @@

* do something cool
* **found a bug?** [create an issue!](https://github.com/9001/copyparty/issues) or let me know in the [discord](https://discord.gg/25J8CdTT6G) :>
* **fixed a bug?** create a PR or post a patch! big thx in advance :>
* **have a cool idea?** let's discuss it! anywhere's fine, you choose.

really tho, send a PR or an issue or whatever, all appreciated, anything goes, just behave aight 👍👍
but please:


# do not use AI / LMM when writing code

copyparty is 100% organic, free-range, human-written software!

> ⚠ you are now entering a no-copilot zone

the *only* place where LMM/AI *may* be accepted is for [localization](https://github.com/9001/copyparty/tree/hovudstraum/docs/rice#translations) if you are fluent and have confirmed that the translation is accurate.

sorry for the harsh tone, but this is important to me 🙏

but to be more specific,


# contribution ideas

71
README.md
@@ -50,6 +50,8 @@ turn almost any device into a file server with resumable uploads/downloads using
* [rss feeds](#rss-feeds) - monitor a folder with your RSS reader
* [recent uploads](#recent-uploads) - list all recent uploads
* [media player](#media-player) - plays almost every audio format there is
* [playlists](#playlists) - create and play [m3u8](https://en.wikipedia.org/wiki/M3U) playlists
* [creating a playlist](#creating-a-playlist) - with a standalone mediaplayer or copyparty
* [audio equalizer](#audio-equalizer) - and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
* [fix unreliable playback on android](#fix-unreliable-playback-on-android) - due to phone / app settings
* [markdown viewer](#markdown-viewer) - and there are *two* editors
@@ -100,6 +102,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [custom mimetypes](#custom-mimetypes) - change the association of a file extension
* [GDPR compliance](#GDPR-compliance) - imagine using copyparty professionally...
* [feature chickenbits](#feature-chickenbits) - buggy feature? rip it out
* [feature beefybits](#feature-beefybits) - force-enable features with known issues on your OS/env
* [packages](#packages) - the party might be closer than you think
* [arch package](#arch-package) - now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
* [fedora package](#fedora-package) - does not exist yet
@@ -146,6 +149,7 @@ just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/
* or if you are on android, [install copyparty in termux](#install-on-android)
* or maybe you have a [synology nas / dsm](./docs/synology-dsm.md)
* or if your computer is messed up and nothing else works, [try the pyz](#zipapp)
* or if your OS is dead, give the [bootable flashdrive / cd-rom](https://a.ocv.me/pub/stuff/edcd001/enterprise-edition/) a spin
* or if you don't trust copyparty yet and want to isolate it a little, then...
  * ...maybe [prisonparty](./bin/prisonparty.sh) to create a tiny [chroot](https://wiki.archlinux.org/title/Chroot) (very portable),
  * ...or [bubbleparty](./bin/bubbleparty.sh) to wrap it in [bubblewrap](https://github.com/containers/bubblewrap) (much better)
@@ -228,6 +232,7 @@ also see [comparison to similar software](./docs/versus.md)
* ☑ [upnp / zeroconf / mdns / ssdp](#zeroconf)
* ☑ [event hooks](#event-hooks) / script runner
* ☑ [reverse-proxy support](https://github.com/9001/copyparty#reverse-proxy)
* ☑ cross-platform (Windows, Linux, Macos, Android, FreeBSD, arm32/arm64, ppc64le, s390x, risc-v/riscv64)
* upload
  * ☑ basic: plain multipart, ie6 support
  * ☑ [up2k](#uploading): js, resumable, multithreaded

@@ -248,6 +253,7 @@ also see [comparison to similar software](./docs/versus.md)
* ☑ file manager (cut/paste, delete, [batch-rename](#batch-rename))
* ☑ audio player (with [OS media controls](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) and opus/mp3 transcoding)
  * ☑ play video files as audio (converted on server)
* ☑ create and play [m3u8 playlists](#playlists)
* ☑ image gallery with webm player
* ☑ textfile browser with syntax hilighting
* ☑ [thumbnails](#thumbnails)
@@ -279,6 +285,8 @@ small collection of user feedback

`good enough`, `surprisingly correct`, `certified good software`, `just works`, `why`, `wow this is better than nextcloud`

* UI просто ужасно. Если буду описывать детально не смогу удержаться в рамках приличий ("the UI is simply terrible. if I were to describe it in detail I would not be able to stay within the bounds of decency")


# motivations

@@ -298,6 +306,8 @@ project goals / philosophy
* adaptable, malleable, hackable
  * no build steps; modify the js/python without needing node.js or anything like that

becoming rich is specifically *not* a motivation, but if you wanna donate then see my [github profile](https://github.com/9001) regarding donations for my FOSS stuff in general (also THANKS!)


## notes

@@ -327,7 +337,8 @@ roughly sorted by chance of encounter
* `--th-ff-jpg` may fix video thumbnails on some FFmpeg versions (macos, some linux)
* `--th-ff-swr` may fix audio thumbnails on some FFmpeg versions
* if the `up2k.db` (filesystem index) is on a samba-share or network disk, you'll get unpredictable behavior if the share is disconnected for a bit
-  * use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db on a local disk instead
+  * use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db and thumbnails on a local disk instead
+    * or, if you only want to move the db (and not the thumbnails), then use `--dbpath` or the `dbpath` volflag
* all volumes must exist / be available on startup; up2k (mtp especially) gets funky otherwise
* probably more, pls let me know
@@ -380,7 +391,8 @@ same order here too
* this is an msys2 bug, the regular windows edition of python is fine

* VirtualBox: sqlite throws `Disk I/O Error` when running in a VM and the up2k database is in a vboxsf
-  * use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db inside the vm instead
+  * use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db and thumbnails inside the vm instead
+    * or, if you only want to move the db (and not the thumbnails), then use `--dbpath` or the `dbpath` volflag
  * also happens on mergerfs, so put the db elsewhere

* Ubuntu: dragging files from certain folders into firefox or chrome is impossible
@@ -724,6 +736,7 @@ select which type of archive you want in the `[⚙️] config` tab:
* `up2k.db` and `dir.txt` is always excluded
* bsdtar supports streaming unzipping: `curl foo?zip | bsdtar -xv`
  * good, because copyparty's zip is faster than tar on small files
  * but `?tar` is better for large files, especially if the total exceeds 4 GiB
* `zip_crc` will take longer to download since the server has to read each file twice
  * this is only to support MS-DOS PKZIP v2.04g (october 1993) and older
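As a concrete sketch of the streaming extraction mentioned above -- the hostname, port, and folder name here are placeholders, not from the docs:

```shell
# stream-extract a folder without saving the archive to disk first
curl 'http://127.0.0.1:3923/music/?zip' | bsdtar -xv

# prefer ?tar for large files, or when the total exceeds 4 GiB
curl 'http://127.0.0.1:3923/music/?tar' | tar -xv
```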

how are you accessing copyparty actually
@@ -802,6 +815,8 @@ if you are resuming a massive upload and want to skip hashing the files which al

if the server is behind a proxy which imposes a request-size limit, you can configure up2k to sneak below the limit with server-option `--u2sz` (the default is 96 MiB to support Cloudflare)

if you want to replace existing files on the server with new uploads by default, run with `--u2ow 2` (only works if users have the delete-permission, and can still be disabled with `🛡️` in the UI)
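A speculative one-liner combining the two options above; the `64` assumes `--u2sz` accepts a plain MiB value as the text implies -- check `--help` for the exact syntax before relying on it:

```shell
# stay below a proxy's hypothetical 64 MiB request limit, and let new
# uploads overwrite existing files (users still need delete-permission)
copyparty --u2sz 64 --u2ow 2
```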

### file-search
@@ -1026,11 +1041,13 @@ click the `play` link next to an audio file, or copy the link target to [share i

open the `[🎺]` media-player-settings tab to configure it,
* "switches":
  * `[🔁]` repeats one single song forever
  * `[🔀]` shuffles the files inside each folder
  * `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
    * `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
  * `[~s]` toggles the seekbar waveform display
  * `[/np]` enables buttons to copy the now-playing info as an irc message
  * `[📻]` enables buttons to create an [m3u playlist](#playlists) with the selected songs
  * `[os-ctl]` makes it possible to control audio playback from the lockscreen of your device (enables [mediasession](https://developer.mozilla.org/en-US/docs/Web/API/MediaSession))
    * `[seek]` allows seeking with lockscreen controls (buggy on some devices)
    * `[art]` shows album art on the lockscreen
@@ -1049,11 +1066,39 @@ open the `[🎺]` media-player-settings tab to configure it,
* "transcode to":
  * `[opus]` produces an `opus` whenever transcoding is necessary (the best choice on Android and PCs)
  * `[awo]` is `opus` in a `weba` file, good for iPhones (iOS 17.5 and newer) but Apple is still fixing some state-confusion bugs as of iOS 18.2.1
-  * `[caf]` is `opus` in a `caf` file, good for iPhones (iOS 11 through 17), technically unsupported by Apple but works for the mos tpart
+  * `[caf]` is `opus` in a `caf` file, good for iPhones (iOS 11 through 17), technically unsupported by Apple but works for the most part
  * `[mp3]` -- the myth, the legend, the undying master of mediocre sound quality that definitely works everywhere
* "tint" reduces the contrast of the playback bar


### playlists

create and play [m3u8](https://en.wikipedia.org/wiki/M3U) playlists -- see example [text](https://a.ocv.me/pub/demo/music/?doc=example-playlist.m3u) and [player](https://a.ocv.me/pub/demo/music/#m3u=example-playlist.m3u)

click a file with the extension `m3u` or `m3u8` (for example `mixtape.m3u` or `touhou.m3u8`) and you get two choices: Play / Edit

playlists can include songs across folders anywhere on the server, but filekeys/dirkeys are NOT supported, so the listener must have read-access or get-access to the files


### creating a playlist

with a standalone mediaplayer or copyparty

you can use foobar2000, deadbeef, just about any standalone player should work -- but you might need to edit the filepaths in the playlist so they fit with the server-URLs

alternatively, you can create the playlist using copyparty itself:

* open the `[🎺]` media-player-settings tab and enable the `[📻]` create-playlist feature -- this adds two new buttons in the bottom-right tray, `[📻add]` and `[📻copy]` which appear when you listen to music, or when you select a few audiofiles

* click the `📻add` button while a song is playing (or when you've selected some songs) and they'll be added to "the list" (you can't see it yet)

* at any time, click `📻copy` to send the playlist to your clipboard
  * you can then continue adding more songs if you'd like
  * if you want to wipe the playlist and start from scratch, just refresh the page

* create a new textfile, name it `something.m3u` and paste the playlist there
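Since an m3u playlist is at its simplest just a text file listing one track path per line, the last step can also be done from a shell; the filenames here are made up for illustration:

```shell
# write two (hypothetical) track paths into a new playlist file
printf '%s\n' 'Music/alpha.mp3' 'Music/beta.opus' > mixtape.m3u

# inspect the result
cat mixtape.m3u
```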

### audio equalizer

and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
@@ -1590,6 +1635,8 @@ copyparty creates a subfolder named `.hist` inside each volume where it stores t
this can instead be kept in a single place using the `--hist` argument, or the `hist=` volflag, or a mix of both:
* `--hist ~/.cache/copyparty -v ~/music::r:c,hist=-` sets `~/.cache/copyparty` as the default place to put volume info, but `~/music` gets the regular `.hist` subfolder (`-` restores default behavior)

by default, the per-volume `up2k.db` sqlite3-database for `-e2d` and `-e2t` is stored next to the thumbnails according to the `--hist` option, but the global-option `--dbpath` and/or volflag `dbpath` can be used to put the database somewhere else
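For example, the two knobs above could be combined like so -- a sketch with made-up paths, not a recommendation from the docs:

```shell
# thumbnails and misc volume state go under --hist, while the
# sqlite databases land on a (hypothetically faster) disk via --dbpath
copyparty -e2d -v ~/music::r --hist ~/.cache/copyparty --dbpath /ssd/copyparty-db
```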

note:
* putting the hist-folders on an SSD is strongly recommended for performance
* markdown edits are always stored in a local `.hist` subdirectory
@@ -1835,7 +1882,7 @@ tell search engines you don't wanna be indexed, either using the good old [robo
* volflag `[...]:c,norobots` does the same thing for that single volume
* volflag `[...]:c,robots` ALLOWS search-engine crawling for that volume, even if `--no-robots` is set globally

-also, `--force-js` disables the plain HTML folder listing, making things harder to parse for search engines
+also, `--force-js` disables the plain HTML folder listing, making things harder to parse for *some* search engines -- note that crawlers which understand javascript (such as google) will not be affected

## themes

@@ -2125,7 +2172,9 @@ buggy feature? rip it out by setting any of the following environment variables

| env-var | what it does |
| -------------------- | ------------ |
| `PRTY_NO_DB_LOCK` | do not lock session/shares-databases for exclusive access |
| `PRTY_NO_IFADDR` | disable ip/nic discovery by poking into your OS with ctypes |
| `PRTY_NO_IMPRESO` | do not try to load js/css files using `importlib.resources` |
| `PRTY_NO_IPV6` | disable some ipv6 support (should not be necessary since windows 2000) |
| `PRTY_NO_LZMA` | disable streaming xz compression of incoming uploads |
| `PRTY_NO_MP` | disable all use of the python `multiprocessing` module (actual multithreading, cpu-count for parsers/thumbnailers) |
@@ -2136,6 +2185,15 @@ buggy feature? rip it out by setting any of the following environment variables
example: `PRTY_NO_IFADDR=1 python3 copyparty-sfx.py`


### feature beefybits

force-enable features with known issues on your OS/env by setting any of the following environment variables, also affectionately known as `fuckitbits` or `hail-mary-bits`

| env-var | what it does |
| ------------------------ | ------------ |
| `PRTY_FORCE_MP` | force-enable multiprocessing (real multithreading) on MacOS and other broken platforms |


# packages

the party might be closer than you think
@@ -2306,6 +2364,7 @@ quick summary of more eccentric web-browsers trying to view a directory index:
| **ie4** and **netscape** 4.0 | can browse, upload with `?b=u`, auth with `&pw=wark` |
| **ncsa mosaic** 2.7 | does not get a pass, [pic1](https://user-images.githubusercontent.com/241032/174189227-ae816026-cf6f-4be5-a26e-1b3b072c1b2f.png) - [pic2](https://user-images.githubusercontent.com/241032/174189225-5651c059-5152-46e9-ac26-7e98e497901b.png) |
| **SerenityOS** (7e98457) | hits a page fault, works with `?b=u`, file upload not-impl |
| **sony psp** 5.50 | can browse, upload/mkdir/msg (thx dwarf) [screenshot](https://github.com/user-attachments/assets/9d21f020-1110-4652-abeb-6fc09c533d4f) |
| **nintendo 3ds** | can browse, upload, view thumbnails (thx bnjmn) |

<p align="center"><img src="https://github.com/user-attachments/assets/88deab3d-6cad-4017-8841-2f041472b853" /></p>
@@ -2361,6 +2420,8 @@ copyparty returns a truncated sha512sum of your PUT/POST as base64; you can gene

you can provide passwords using header `PW: hunter2`, cookie `cppwd=hunter2`, url-param `?pw=hunter2`, or with basic-authentication (either as the username or password)

> for basic-authentication, all of the following are accepted: `password` / `whatever:password` / `password:whatever` (the username is ignored)

NOTE: curl will not send the original filename if you use `-T` combined with url-params! Also, make sure to always leave a trailing slash in URLs unless you want to override the filename
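Putting the notes above together, a minimal curl sketch; the server address, volume path, and password are placeholders:

```shell
# PUT with a header-based password; the trailing slash keeps the
# original filename instead of overriding it
curl -T song.mp3 -H 'PW: hunter2' http://127.0.0.1:3923/inc/

# same upload with basic-auth; the username part is ignored
curl -T song.mp3 -u 'whatever:hunter2' http://127.0.0.1:3923/inc/
```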

@@ -2433,6 +2494,8 @@ below are some tweaks roughly ordered by usefulness:
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* if your volumes are on a network-disk such as NFS / SMB / s3, specifying larger values for `--iobuf` and/or `--s-rd-sz` and/or `--s-wr-sz` may help; try setting all of them to `524288` or `1048576` or `4194304`
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
* when running on AlpineLinux or other musl-based distro, try mimalloc for higher performance (and twice as much RAM usage); `apk add mimalloc2` and run copyparty with env-var `LD_PRELOAD=/usr/lib/libmimalloc-secure.so.2`
  * note that mimalloc requires special care when combined with prisonparty and/or bubbleparty/bubblewrap; you must give it access to `/proc` and `/sys` otherwise you'll encounter issues with FFmpeg (audio transcoding, thumbnails)
* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
  * lots of connections (many users or heavy clients)
  * simultaneous downloads and uploads saturating a 20gbps connection
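A speculative invocation pulling several of these tweaks together for a musl-based host with volumes on a network disk; treat the buffer sizes as starting points from the list above, not tuned numbers:

```shell
# mimalloc preload + multiprocessing + bigger IO buffers for network disks
LD_PRELOAD=/usr/lib/libmimalloc-secure.so.2 \
  copyparty -j0 --iobuf 1048576 --s-rd-sz 1048576 --s-wr-sz 1048576
```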

@@ -14,6 +14,8 @@ run copyparty with `--help-hooks` for usage details / hook type explanations (xm
* [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
* [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable
* [into-the-cache-it-goes.py](into-the-cache-it-goes.py) avoids bugs in caching proxies by immediately downloading each file that is uploaded
* [podcast-normalizer.py](podcast-normalizer.py) creates a second file with dynamic-range-compression whenever an audio file is uploaded
  * good example of the `idx` [hook effect](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#hook-effects) to tell copyparty about additional files to scan/index


# upload batches

@@ -25,6 +27,7 @@ these are `--xiu` hooks; unlike `xbu` and `xau` (which get executed on every sin

# before upload
* [reject-extension.py](reject-extension.py) rejects uploads if they match a list of file extensions
* [reloc-by-ext.py](reloc-by-ext.py) redirects an upload to another destination based on the file extension
  * good example of the `reloc` [hook effect](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#hook-effects)


# on message

121
bin/hooks/podcast-normalizer.py
Executable file
@@ -0,0 +1,121 @@
#!/usr/bin/env python3

import json
import os
import sys
import subprocess as sp


_ = r"""
sends all uploaded audio files through an aggressive
dynamic-range-compressor to even out the volume levels

dependencies:
  ffmpeg

being an xau hook, this gets eXecuted After Upload completion
but before copyparty has started hashing/indexing the file, so
we'll create a second normalized copy in a subfolder and tell
copyparty to hash/index that additional file as well

example usage as global config:
    -e2d -e2t --xau j,c1,bin/hooks/podcast-normalizer.py

parameters explained,
    e2d/e2t = enable database and metadata indexing
    xau = execute after upload
    j = this hook needs upload information as json (not just the filename)
    c1 = this hook returns json on stdout, so tell copyparty to read that

example usage as a volflag (per-volume config):
    -v srv/inc/pods:inc/pods:r:rw,ed:c,xau=j,c1,bin/hooks/podcast-normalizer.py
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

    (share fs-path srv/inc/pods at URL /inc/pods,
     readable by all, read-write for user ed,
     running this xau (exec-after-upload) plugin for all uploaded files)

example usage as a volflag in a copyparty config file:
    [/inc/pods]
      srv/inc/pods
      accs:
        r: *
        rw: ed
      flags:
        e2d  # enables file indexing
        e2t  # metadata tags too
        xau: j,c1,bin/hooks/podcast-normalizer.py

"""

########################################################################
### CONFIG

# filetypes to process; ignores everything else
EXTS = "mp3 flac ogg opus m4a aac wav wma"

# the name of the subdir to put the normalized files in
SUBDIR = "normalized"

########################################################################


# try to enable support for crazy filenames
try:
    from copyparty.util import fsenc
except:

    def fsenc(p):
        return p.encode("utf-8")


def main():
    # read info from copyparty
    inf = json.loads(sys.argv[1])
    vpath = inf["vp"]
    abspath = inf["ap"]

    # check if the file-extension is on the to-be-processed list
    ext = abspath.lower().split(".")[-1]
    if ext not in EXTS.split():
        return

    # jump into the folder where the file was uploaded
    # and create the subfolder to place the normalized copy inside
    dirpath, filename = os.path.split(abspath)
    os.chdir(fsenc(dirpath))
    os.makedirs(SUBDIR, exist_ok=True)

    # the input and output filenames to give ffmpeg
    fname_in = fsenc(f"./{filename}")
    fname_out = fsenc(f"{SUBDIR}/{filename}.opus")

    # fmt: off
    # create and run the ffmpeg command
    cmd = [
        b"ffmpeg",
        b"-nostdin",
        b"-hide_banner",
        b"-i", fname_in,
        b"-af", b"dynaudnorm=f=100:g=9",  # the normalizer config
        b"-c:a", b"libopus",
        b"-b:a", b"128k",
        fname_out,
    ]
    # fmt: on
    sp.check_output(cmd)

    # and finally, tell copyparty about the new file
    # so it appears in the database and rss-feed:
    vpath = f"{SUBDIR}/{filename}.opus"
    print(json.dumps({"idx": {"vp": [vpath]}}))

    # (it's fine to give it a relative path like that; it gets
    #  resolved relative to the folder the file was uploaded into)


if __name__ == "__main__":
    try:
        main()
    except Exception as ex:
        print("podcast-normalizer failed; %r" % (ex,))
@@ -1,11 +1,13 @@
|
||||
// see usb-eject.py for usage
|
||||
|
||||
function usbclick() {
|
||||
QS('#treeul a[href="/usb/"]').click();
|
||||
var o = QS('#treeul a[dst="/usb/"]') || QS('#treepar a[dst="/usb/"]');
|
||||
if (o)
|
||||
o.click();
|
||||
}
|
||||
|
||||
function eject_cb() {
|
||||
var t = this.responseText;
|
||||
var t = ('' + this.responseText).trim();
|
||||
if (t.indexOf('can be safely unplugged') < 0 && t.indexOf('Device can be removed') < 0)
|
||||
return toast.err(30, 'usb eject failed:\n\n' + t);
|
||||
|
||||
@@ -19,11 +21,14 @@ function add_eject_2(a) {
|
||||
return;
|
||||
|
||||
var v = aw[2],
|
||||
k = 'umount_' + v,
|
||||
o = ebi(k);
|
||||
k = 'umount_' + v;
|
||||
|
||||
if (o)
|
||||
for (var b = 0; b < 9; b++) {
|
||||
var o = ebi(k);
|
||||
if (!o)
|
||||
break;
|
||||
o.parentNode.removeChild(o);
|
||||
}
|
||||
|
||||
a.appendChild(mknod('span', k, '⏏'), a);
|
||||
o = ebi(k);
|
||||
@@ -40,7 +45,7 @@ function add_eject_2(a) {
|
||||
};
|
||||
|
||||
function add_eject() {
|
||||
var o = QSA('#treeul a[href^="/usb/"]');
|
||||
var o = QSA('#treeul a[href^="/usb/"]') || QSA('#treepar a[href^="/usb/"]');
|
||||
for (var a = o.length - 1; a > 0; a--)
|
||||
add_eject_2(o[a]);
|
||||
};
|
||||
|
||||
@@ -2,11 +2,15 @@

import sys
import json
-import zlib
import struct
import base64
import hashlib

+try:
+    from zlib_ng import zlib_ng as zlib
+except:
+    import zlib

try:
    from copyparty.util import fsenc
except:
@@ -50,6 +50,9 @@
* give a 3rd argument to install it to your copyparty config
* systemd service at [`systemd/cfssl.service`](systemd/cfssl.service)

### [`zfs-tune.py`](zfs-tune.py)
* optimizes databases for optimal performance when stored on a zfs filesystem; also see [openzfs docs](https://openzfs.github.io/openzfs-docs/Performance%20and%20Tuning/Workload%20Tuning.html#database-workloads) and specifically the SQLite subsection

# OS integration
init-scripts to start copyparty as a service
* [`systemd/copyparty.service`](systemd/copyparty.service) runs the sfx normally
@@ -1,6 +1,6 @@
|
||||
# Maintainer: icxes <dev.null@need.moe>
|
||||
pkgname=copyparty
|
||||
pkgver="1.16.13"
|
||||
pkgver="1.16.21"
|
||||
pkgrel=1
|
||||
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
|
||||
arch=("any")
|
||||
@@ -22,7 +22,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
|
||||
)
|
||||
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
|
||||
backup=("etc/${pkgname}.d/init" )
|
||||
sha256sums=("75a3a2e3cdc82e18f27f26135e9ac8e71cd42e31fdd2a0c2bfa4ab88f2961b6a")
|
||||
sha256sums=("2e416e18dc854c65643b8aaedca56e0a5c5a03b0c3d45b7ff3f68daa38d8e9c6")
|
||||
|
||||
build() {
|
||||
cd "${srcdir}/${pkgname}-${pkgver}"
|
||||
|
||||
@@ -64,4 +64,5 @@ in stdenv.mkDerivation {
|
||||
--set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
|
||||
--add-flags "$out/share/copyparty-sfx.py"
|
||||
'';
|
||||
meta.mainProgram = "copyparty";
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
{
|
||||
"url": "https://github.com/9001/copyparty/releases/download/v1.16.13/copyparty-sfx.py",
|
||||
"version": "1.16.13",
|
||||
"hash": "sha256-PGk7ix77oBXd+c+WytRKG/WwX2pkhK3pq6AdLcRgw6I="
|
||||
"url": "https://github.com/9001/copyparty/releases/download/v1.16.21/copyparty-sfx.py",
|
||||
"version": "1.16.21",
|
||||
"hash": "sha256-+/f4g8J2Mv0l6ChXzbNJ84G8LeB+mP1UfkWzQxizd/g="
|
||||
}
|
||||
107
contrib/zfs-tune.py
Executable file
@@ -0,0 +1,107 @@
#!/usr/bin/env python3

import os
import sqlite3
import sys
import traceback


"""
when the up2k-database is stored on a zfs volume, this may give
slightly higher performance (actual gains not measured yet)

NOTE: must be applied in combination with the related advice in the openzfs documentation;
https://openzfs.github.io/openzfs-docs/Performance%20and%20Tuning/Workload%20Tuning.html#database-workloads
and see specifically the SQLite subsection

it is assumed that all databases are stored in a single location,
for example with `--hist /var/store/hists`

three alternatives for running this script:

1. copy it into /var/store/hists and run "python3 zfs-tune.py s"
   (s = modify all databases below folder containing script)

2. cd into /var/store/hists and run "python3 ~/zfs-tune.py w"
   (w = modify all databases below current working directory)

3. python3 ~/zfs-tune.py /var/store/hists

if you use docker, run copyparty with `--hist /cfg/hists`, copy this script into /cfg, and run this:
podman run --rm -it --entrypoint /usr/bin/python3 ghcr.io/9001/copyparty-ac /cfg/zfs-tune.py s
"""


PAGESIZE = 65536


# borrowed from copyparty; short efficient stacktrace for errors
def min_ex(max_lines: int = 8, reverse: bool = False) -> str:
    et, ev, tb = sys.exc_info()
    stb = traceback.extract_tb(tb) if tb else traceback.extract_stack()[:-1]
    fmt = "%s:%d <%s>: %s"
    ex = [fmt % (fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in stb]
    if et or ev or tb:
        ex.append("[%s] %s" % (et.__name__ if et else "(anonymous)", ev))
    return "\n".join(ex[-max_lines:][:: -1 if reverse else 1])


def set_pagesize(db_path):
    try:
        # check current page_size
        with sqlite3.connect(db_path) as db:
            v = db.execute("pragma page_size").fetchone()[0]
            if v == PAGESIZE:
                print(" `-- OK")
                return

        # https://www.sqlite.org/pragma.html#pragma_page_size
        # `- disable wal; set pagesize; vacuum
        # (copyparty will reenable wal if necessary)

        with sqlite3.connect(db_path) as db:
            db.execute("pragma journal_mode=delete")
            db.commit()

        with sqlite3.connect(db_path) as db:
            db.execute(f"pragma page_size = {PAGESIZE}")
            db.execute("vacuum")

        print(" `-- new pagesize OK")

    except Exception:
        err = min_ex().replace("\n", "\n -- ")
        print(f"FAILED: {db_path}\n -- {err}")


def main():
    top = os.path.dirname(os.path.abspath(__file__))
    cwd = os.path.abspath(os.getcwd())
    try:
        x = sys.argv[1]
    except:
        print(f"""
this script takes one mandatory argument:
specify 's' to start recursing from folder containing this script file ({top})
specify 'w' to start recursing from the current working directory ({cwd})
specify a path to start recursing from there
""")
        sys.exit(1)

    if x.lower() == "w":
        top = cwd
    elif x.lower() != "s":
        top = x

    for dirpath, dirs, files in os.walk(top):
        for fname in files:
            if not fname.endswith(".db"):
                continue
            db_path = os.path.join(dirpath, fname)
            print(db_path)
            set_pagesize(db_path)


if __name__ == "__main__":
    main()
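After the script has run, the tuned values can be read back with a couple of PRAGMA queries; a minimal sketch (the `inspect` helper is made up for illustration, not part of the script):

```python
import os
import sqlite3
import tempfile

# read back the page_size and journal_mode of an sqlite database;
# a db processed by zfs-tune.py should report 65536 and "delete"
# (copyparty re-enables wal on its next startup if needed)
def inspect(db_path):
    with sqlite3.connect(db_path) as db:
        page = db.execute("pragma page_size").fetchone()[0]
        mode = db.execute("pragma journal_mode").fetchone()[0]
    return page, mode
```

for example, `inspect("/var/store/hists/up2k.db")` right after a successful run should show the 64 KiB page size that the script just applied.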
@@ -40,6 +40,7 @@ from .cfg import flagcats, onedash
from .svchub import SvcHub
from .util import (
    APPLESAN_TXT,
    BAD_BOTS,
    DEF_EXP,
    DEF_MTE,
    DEF_MTH,
@@ -65,6 +66,7 @@ from .util import (
    load_resource,
    min_ex,
    pybin,
    read_utf8,
    termsize,
    wrap,
)
@@ -226,7 +228,23 @@ def init_E(EE: EnvParams) -> None:
    if E.mod.endswith("__init__"):
        E.mod = os.path.dirname(E.mod)

    if sys.platform == "win32":
        try:
            p = os.environ.get("XDG_CONFIG_HOME")
            if not p:
                raise Exception()
            if p.startswith("~"):
                p = os.path.expanduser(p)
            p = os.path.abspath(os.path.realpath(p))
            p = os.path.join(p, "copyparty")
            if not os.path.isdir(p):
                os.mkdir(p)
            os.listdir(p)
        except:
            p = ""

    if p:
        E.cfg = p
    elif sys.platform == "win32":
        bdir = os.environ.get("APPDATA") or os.environ.get("TEMP") or "."
        E.cfg = os.path.normpath(bdir + "/copyparty")
    elif sys.platform == "darwin":
@@ -255,8 +273,7 @@ def get_srvname(verbose) -> str:
    if verbose:
        lprint("using hostname from {}\n".format(fp))
    try:
        with open(fp, "rb") as f:
            ret = f.read().decode("utf-8", "replace").strip()
        return read_utf8(None, fp, True).strip()
    except:
        ret = ""
    namelen = 5
@@ -265,47 +282,18 @@ def get_srvname(verbose) -> str:
    ret = re.sub("[234567=]", "", ret)[:namelen]
    with open(fp, "wb") as f:
        f.write(ret.encode("utf-8") + b"\n")

    return ret
    return ret


def get_fk_salt() -> str:
    fp = os.path.join(E.cfg, "fk-salt.txt")
def get_salt(name: str, nbytes: int) -> str:
    fp = os.path.join(E.cfg, "%s-salt.txt" % (name,))
    try:
        with open(fp, "rb") as f:
            ret = f.read().strip()
        return read_utf8(None, fp, True).strip()
    except:
        ret = b64enc(os.urandom(18))
        ret = b64enc(os.urandom(nbytes))
        with open(fp, "wb") as f:
            f.write(ret + b"\n")

    return ret.decode("utf-8")


def get_dk_salt() -> str:
    fp = os.path.join(E.cfg, "dk-salt.txt")
    try:
        with open(fp, "rb") as f:
            ret = f.read().strip()
    except:
        ret = b64enc(os.urandom(30))
        with open(fp, "wb") as f:
            f.write(ret + b"\n")

    return ret.decode("utf-8")


def get_ah_salt() -> str:
    fp = os.path.join(E.cfg, "ah-salt.txt")
    try:
        with open(fp, "rb") as f:
            ret = f.read().strip()
    except:
        ret = b64enc(os.urandom(18))
        with open(fp, "wb") as f:
            f.write(ret + b"\n")

    return ret.decode("utf-8")
    return ret.decode("utf-8")


def ensure_locale() -> None:
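The three copy-pasted salt helpers above collapse into a single `get_salt(name, nbytes)`. A simplified standalone sketch of the same caching behavior, using stdlib `b64encode` in place of copyparty's `b64enc` and an explicit `cfg_dir` argument instead of `E.cfg` (both substitutions are assumptions for illustration):

```python
import os
from base64 import b64encode

# one cached, randomly generated salt per name (fk/dk/ah),
# persisted next to the config so it survives restarts
def get_salt(cfg_dir, name, nbytes):
    fp = os.path.join(cfg_dir, "%s-salt.txt" % (name,))
    try:
        # reuse the previously generated salt if one exists
        with open(fp, "rb") as f:
            return f.read().strip().decode("utf-8")
    except OSError:
        # first run: generate, persist, return
        ret = b64encode(os.urandom(nbytes))
        with open(fp, "wb") as f:
            f.write(ret + b"\n")
        return ret.decode("utf-8")
```

18 random bytes encode to a 24-character salt and 30 bytes to 40 characters, matching the `get_salt("fk", 18)` / `get_salt("dk", 30)` / `get_salt("ah", 18)` call sites in `run_argparse`.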
@@ -1037,9 +1025,9 @@ def add_upload(ap):
    ap2.add_argument("--df", metavar="GiB", type=u, default="0", help="ensure \033[33mGiB\033[0m free disk space by rejecting upload requests; assumes gigabytes unless a unit suffix is given: [\033[32m256m\033[0m], [\033[32m4\033[0m], [\033[32m2T\033[0m] (volflag=df)")
    ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
    ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
    ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good for low-latency (same-country) connections, 4-8 for android clients, 16 for cross-atlantic (max=64)")
    ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good when latency is low (same-country), 2~4 for android-clients, 2~6 for cross-atlantic. Max is 6 in most browsers. Big values increase network-speed but may reduce HDD-speed")
    ap2.add_argument("--u2sz", metavar="N,N,N", type=u, default="1,64,96", help="web-client: default upload chunksize (MiB); sets \033[33mmin,default,max\033[0m in the settings gui. Each HTTP POST will aim for \033[33mdefault\033[0m, and never exceed \033[33mmax\033[0m. Cloudflare max is 96. Big values are good for cross-atlantic but may increase HDD fragmentation on some FS. Disable this optimization with [\033[32m1,1,1\033[0m]")
    ap2.add_argument("--u2ow", metavar="NUM", type=int, default=0, help="web-client: default setting for when to overwrite existing files; [\033[32m0\033[0m]=never, [\033[32m1\033[0m]=if-client-newer, [\033[32m2\033[0m]=always (volflag=u2ow)")
    ap2.add_argument("--u2ow", metavar="NUM", type=int, default=0, help="web-client: default setting for when to replace/overwrite existing files; [\033[32m0\033[0m]=never, [\033[32m1\033[0m]=if-client-newer, [\033[32m2\033[0m]=always (volflag=u2ow)")
    ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; [\033[32ms\033[0m]=smallest-first, [\033[32mn\033[0m]=alphabetical, [\033[32mfs\033[0m]=force-s, [\033[32mfn\033[0m]=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
    ap2.add_argument("--write-uplog", action="store_true", help="write POST reports to textfiles in working-directory")

@@ -1058,6 +1046,8 @@ def add_network(ap):
        ap2.add_argument("--reuseaddr", action="store_true", help="set reuseaddr on listening sockets on windows; allows rapid restart of copyparty at the expense of being able to accidentally start multiple instances")
    else:
        ap2.add_argument("--freebind", action="store_true", help="allow listening on IPs which do not yet exist, for example if the network interfaces haven't finished going up. Only makes sense for IPs other than '0.0.0.0', '127.0.0.1', '::', and '::1'. May require running as root (unless net.ipv6.ip_nonlocal_bind)")
    ap2.add_argument("--wr-h-eps", metavar="PATH", type=u, default="", help="write list of listening-on ip:port to textfile at \033[33mPATH\033[0m when http-servers have started")
    ap2.add_argument("--wr-h-aon", metavar="PATH", type=u, default="", help="write list of accessible-on ip:port to textfile at \033[33mPATH\033[0m when http-servers have started")
    ap2.add_argument("--s-thead", metavar="SEC", type=int, default=120, help="socket timeout (read request header)")
    ap2.add_argument("--s-tbody", metavar="SEC", type=float, default=128.0, help="socket timeout (read/write request/response bodies). Use 60 on fast servers (default is extremely safe). Disable with 0 if reverse-proxied for a 2%% speed boost")
    ap2.add_argument("--s-rd-sz", metavar="B", type=int, default=256*1024, help="socket read size in bytes (indirectly affects filesystem writes; recommendation: keep equal-to or lower-than \033[33m--iobuf\033[0m)")
@@ -1251,6 +1241,7 @@ def add_yolo(ap):
    ap2 = ap.add_argument_group('yolo options')
    ap2.add_argument("--allow-csrf", action="store_true", help="disable csrf protections; let other domains/sites impersonate you through cross-site requests")
    ap2.add_argument("--getmod", action="store_true", help="permit ?move=[...] and ?delete as GET")
    ap2.add_argument("--wo-up-readme", action="store_true", help="allow users with write-only access to upload logues and readmes without adding the _wo_ filename prefix (volflag=wo_up_readme)")


def add_optouts(ap):
@@ -1265,7 +1256,12 @@ def add_optouts(ap):
    ap2.add_argument("-nih", action="store_true", help="no info hostname -- don't show in UI")
    ap2.add_argument("-nid", action="store_true", help="no info disk-usage -- don't show in UI")
    ap2.add_argument("-nb", action="store_true", help="no powered-by-copyparty branding in UI")
    ap2.add_argument("--zipmaxn", metavar="N", type=u, default="0", help="reject download-as-zip if more than \033[33mN\033[0m files in total; optionally takes a unit suffix: [\033[32m256\033[0m], [\033[32m9K\033[0m], [\033[32m4G\033[0m] (volflag=zipmaxn)")
    ap2.add_argument("--zipmaxs", metavar="SZ", type=u, default="0", help="reject download-as-zip if total download size exceeds \033[33mSZ\033[0m bytes; optionally takes a unit suffix: [\033[32m256M\033[0m], [\033[32m4G\033[0m], [\033[32m2T\033[0m] (volflag=zipmaxs)")
    ap2.add_argument("--zipmaxt", metavar="TXT", type=u, default="", help="custom errormessage when download size exceeds max (volflag=zipmaxt)")
    ap2.add_argument("--zipmaxu", action="store_true", help="authenticated users bypass the zip size limit (volflag=zipmaxu)")
    ap2.add_argument("--zip-who", metavar="LVL", type=int, default=3, help="who can download as zip/tar? [\033[32m0\033[0m]=nobody, [\033[32m1\033[0m]=admins, [\033[32m2\033[0m]=authenticated-with-read-access, [\033[32m3\033[0m]=everyone-with-read-access (volflag=zip_who)\n\033[1;31mWARNING:\033[0m if a nested volume has a more restrictive value than a parent volume, then this will be \033[33mignored\033[0m if the download is initiated from the parent, more lenient volume")
    ap2.add_argument("--ua-nozip", metavar="PTN", type=u, default=BAD_BOTS, help="regex of user-agents to reject from download-as-zip/tar; disable with [\033[32mno\033[0m] or blank")
    ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar; same as \033[33m--zip-who=0\033[0m")
    ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
    ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
@@ -1312,6 +1308,9 @@ def add_salt(ap, fk_salt, dk_salt, ah_salt):
    ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files")
    ap2.add_argument("--dk-salt", metavar="SALT", type=u, default=dk_salt, help="per-directory accesskey salt; used to generate unpredictable URLs to share folders with users who only have the 'get' permission")
    ap2.add_argument("--warksalt", metavar="SALT", type=u, default="hunter2", help="up2k file-hash salt; serves no purpose, no reason to change this (but delete all databases if you do)")
    ap2.add_argument("--show-ah-salt", action="store_true", help="on startup, print the effective value of \033[33m--ah-salt\033[0m (the autogenerated value in $XDG_CONFIG_HOME unless otherwise specified)")
    ap2.add_argument("--show-fk-salt", action="store_true", help="on startup, print the effective value of \033[33m--fk-salt\033[0m (the autogenerated value in $XDG_CONFIG_HOME unless otherwise specified)")
    ap2.add_argument("--show-dk-salt", action="store_true", help="on startup, print the effective value of \033[33m--dk-salt\033[0m (the autogenerated value in $XDG_CONFIG_HOME unless otherwise specified)")


def add_shutdown(ap):
@@ -1353,7 +1352,7 @@ def add_admin(ap):

def add_thumbnail(ap):
    th_ram = (RAM_AVAIL or RAM_TOTAL or 9) * 0.6
    th_ram = int(max(min(th_ram, 6), 1) * 10) / 10
    th_ram = int(max(min(th_ram, 6), 0.3) * 10) / 10
    ap2 = ap.add_argument_group('thumbnail options')
    ap2.add_argument("--no-thumb", action="store_true", help="disable all thumbnails (volflag=dthumb)")
    ap2.add_argument("--no-vthumb", action="store_true", help="disable video thumbnails (volflag=dvthumb)")
@@ -1381,6 +1380,7 @@ def add_thumbnail(ap):
    ap2.add_argument("--th-r-ffi", metavar="T,T", type=u, default="apng,avif,avifs,bmp,cbz,dds,dib,fit,fits,fts,gif,hdr,heic,heics,heif,heifs,icns,ico,jp2,jpeg,jpg,jpx,jxl,pbm,pcx,pfm,pgm,png,pnm,ppm,psd,qoi,sgi,tga,tif,tiff,webp,xbm,xpm", help="image formats to decode using ffmpeg")
    ap2.add_argument("--th-r-ffv", metavar="T,T", type=u, default="3gp,asf,av1,avc,avi,flv,h264,h265,hevc,m4v,mjpeg,mjpg,mkv,mov,mp4,mpeg,mpeg2,mpegts,mpg,mpg2,mts,nut,ogm,ogv,rm,ts,vob,webm,wmv", help="video formats to decode using ffmpeg")
    ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,itgz,itxz,itz,m4a,mdgz,mdxz,mdz,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3m,s3gz,s3xz,s3z,tak,tta,ulaw,wav,wma,wv,xm,xmgz,xmxz,xmz,xpk", help="audio formats to decode using ffmpeg")
    ap2.add_argument("--th-spec-cnv", metavar="T,T", type=u, default="it,itgz,itxz,itz,mdgz,mdxz,mdz,mo3,mod,s3m,s3gz,s3xz,s3z,xm,xmgz,xmxz,xmz,xpk", help="audio formats which provoke https://trac.ffmpeg.org/ticket/10797 (huge ram usage for s3xmodit spectrograms)")
    ap2.add_argument("--au-unpk", metavar="E=F.C", type=u, default="mdz=mod.zip, mdgz=mod.gz, mdxz=mod.xz, s3z=s3m.zip, s3gz=s3m.gz, s3xz=s3m.xz, xmz=xm.zip, xmgz=xm.gz, xmxz=xm.xz, itz=it.zip, itgz=it.gz, itxz=it.xz, cbz=jpg.cbz", help="audio/image formats to decompress before passing to ffmpeg")


@@ -1413,6 +1413,7 @@ def add_db_general(ap, hcores):
    ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash")
    ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
    ap2.add_argument("--hist", metavar="PATH", type=u, default="", help="where to store volume data (db, thumbs); default is a folder named \".hist\" inside each volume (volflag=hist)")
    ap2.add_argument("--dbpath", metavar="PATH", type=u, default="", help="override where the volume databases are to be placed; default is the same as \033[33m--hist\033[0m (volflag=dbpath)")
    ap2.add_argument("--no-hash", metavar="PTN", type=u, default="", help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
    ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
    ap2.add_argument("--no-dirsz", action="store_true", help="do not show total recursive size of folders in listings, show inode size instead; slightly faster (volflag=nodirsz)")
@@ -1451,11 +1452,13 @@ def add_db_metadata(ap):

def add_txt(ap):
    ap2 = ap.add_argument_group('textfile options')
    ap2.add_argument("--md-hist", metavar="TXT", type=u, default="s", help="where to store old version of markdown files; [\033[32ms\033[0m]=subfolder, [\033[32mv\033[0m]=volume-histpath, [\033[32mn\033[0m]=nope/disabled (volflag=md_hist)")
    ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="the textfile editor will check for serverside changes every \033[33mSEC\033[0m seconds")
    ap2.add_argument("-emp", action="store_true", help="enable markdown plugins -- neat but dangerous, big XSS risk")
    ap2.add_argument("--exp", action="store_true", help="enable textfile expansion -- replace {{self.ip}} and such; see \033[33m--help-exp\033[0m (volflag=exp)")
    ap2.add_argument("--exp-md", metavar="V,V,V", type=u, default=DEF_EXP, help="comma/space-separated list of placeholders to expand in markdown files; add/remove stuff on the default list with +hdr_foo or /vf.scan (volflag=exp_md)")
    ap2.add_argument("--exp-lg", metavar="V,V,V", type=u, default=DEF_EXP, help="comma/space-separated list of placeholders to expand in prologue/epilogue files (volflag=exp_lg)")
    ap2.add_argument("--ua-nodoc", metavar="PTN", type=u, default=BAD_BOTS, help="regex of user-agents to reject from viewing documents through ?doc=[...]; disable with [\033[32mno\033[0m] or blank")


def add_og(ap):
@@ -1552,9 +1555,9 @@ def run_argparse(

    cert_path = os.path.join(E.cfg, "cert.pem")

    fk_salt = get_fk_salt()
    dk_salt = get_dk_salt()
    ah_salt = get_ah_salt()
    fk_salt = get_salt("fk", 18)
    dk_salt = get_salt("dk", 30)
    ah_salt = get_salt("ah", 18)

    # alpine peaks at 5 threads for some reason,
    # all others scale past that (but try to avoid SMT),

@@ -1,8 +1,8 @@
# coding: utf-8

VERSION = (1, 16, 14)
CODENAME = "COPYparty"
BUILD_DT = (2025, 2, 19)
VERSION = (1, 17, 0)
CODENAME = "mixtape.m3u"
BUILD_DT = (2025, 4, 26)

S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

@@ -33,6 +33,7 @@ from .util import (
    get_df,
    humansize,
    odfusion,
    read_utf8,
    relchk,
    statdir,
    ub64enc,
@@ -46,7 +47,7 @@ from .util import (
if True:  # pylint: disable=using-constant-test
    from collections.abc import Iterable

    from typing import Any, Generator, Optional, Union
    from typing import Any, Generator, Optional, Sequence, Union

    from .util import NamedLogger, RootLogger

@@ -71,6 +72,8 @@ SSEELOG = " ({})".format(SEE_LOG)
BAD_CFG = "invalid config; {}".format(SEE_LOG)
SBADCFG = " ({})".format(BAD_CFG)

PTN_U_GRP = re.compile(r"\$\{u%([+-])([^}]+)\}")


class CfgEx(Exception):
    pass
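The new `PTN_U_GRP` pattern drives group-conditional volume placeholders; a standalone illustration of the behavior that `_map_volume`'s caller implements further down (the group name `admins` and username `ed` are made up for this sketch):

```python
import re

# ${u%+grp} only generates the volume for members of grp,
# ${u%-grp} only for non-members; either way the placeholder
# is then substituted with the username, like a plain ${u}
PTN_U_GRP = re.compile(r"\$\{u%([+-])([^}]+)\}")

m = PTN_U_GRP.search("/srv/homes/${u%+admins}")
req, gnc = m.groups()  # the +/- requirement and the group name
path = PTN_U_GRP.sub("ed", "/srv/homes/${u%+admins}")
```

here `req` is `"+"`, `gnc` is `"admins"`, and `path` becomes `/srv/homes/ed`.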
@@ -342,22 +345,27 @@ class VFS(object):
        log: Optional["RootLogger"],
        realpath: str,
        vpath: str,
        vpath0: str,
        axs: AXS,
        flags: dict[str, Any],
    ) -> None:
        self.log = log
        self.realpath = realpath  # absolute path on host filesystem
        self.vpath = vpath  # absolute path in the virtual filesystem
        self.vpath0 = vpath0  # original vpath (before idp expansion)
        self.axs = axs
        self.flags = flags  # config options
        self.root = self
        self.dev = 0  # st_dev
        self.badcfg1 = False
        self.nodes: dict[str, VFS] = {}  # child nodes
        self.histtab: dict[str, str] = {}  # all realpath->histpath
        self.dbpaths: dict[str, str] = {}  # all realpath->dbpath
        self.dbv: Optional[VFS] = None  # closest full/non-jump parent
        self.lim: Optional[Lim] = None  # upload limits; only set for dbv
        self.shr_src: Optional[tuple[VFS, str]] = None  # source vfs+rem of a share
        self.shr_files: set[str] = set()  # filenames to include from shr_src
        self.shr_owner: str = ""  # uname
        self.aread: dict[str, list[str]] = {}
        self.awrite: dict[str, list[str]] = {}
        self.amove: dict[str, list[str]] = {}
@@ -374,12 +382,13 @@ class VFS(object):
            rp = realpath + ("" if realpath.endswith(os.sep) else os.sep)
            vp = vpath + ("/" if vpath else "")
            self.histpath = os.path.join(realpath, ".hist")  # db / thumbcache
            self.dbpath = self.histpath
            self.all_vols = {vpath: self}  # flattened recursive
            self.all_nodes = {vpath: self}  # also jumpvols
            self.all_nodes = {vpath: self}  # also jumpvols/shares
            self.all_aps = [(rp, self)]
            self.all_vps = [(vp, self)]
        else:
            self.histpath = ""
            self.histpath = self.dbpath = ""
            self.all_vols = {}
            self.all_nodes = {}
            self.all_aps = []
@@ -415,7 +424,7 @@ class VFS(object):
        for v in self.nodes.values():
            v.get_all_vols(vols, nodes, aps, vps)

    def add(self, src: str, dst: str) -> "VFS":
    def add(self, src: str, dst: str, dst0: str) -> "VFS":
        """get existing, or add new path to the vfs"""
        assert src == "/" or not src.endswith("/")  # nosec
        assert not dst.endswith("/")  # nosec
@@ -423,20 +432,22 @@ class VFS(object):
        if "/" in dst:
            # requires breadth-first population (permissions trickle down)
            name, dst = dst.split("/", 1)
            name0, dst0 = dst0.split("/", 1)
            if name in self.nodes:
                # exists; do not manipulate permissions
                return self.nodes[name].add(src, dst)
                return self.nodes[name].add(src, dst, dst0)

            vn = VFS(
                self.log,
                os.path.join(self.realpath, name) if self.realpath else "",
                "{}/{}".format(self.vpath, name).lstrip("/"),
                "{}/{}".format(self.vpath0, name0).lstrip("/"),
                self.axs,
                self._copy_flags(name),
            )
            vn.dbv = self.dbv or self
            self.nodes[name] = vn
            return vn.add(src, dst)
            return vn.add(src, dst, dst0)

        if dst in self.nodes:
            # leaf exists; return as-is
@@ -444,24 +455,31 @@ class VFS(object):

        # leaf does not exist; create and keep permissions blank
        vp = "{}/{}".format(self.vpath, dst).lstrip("/")
        vn = VFS(self.log, src, vp, AXS(), {})
        vp0 = "{}/{}".format(self.vpath0, dst0).lstrip("/")
        vn = VFS(self.log, src, vp, vp0, AXS(), {})
        vn.dbv = self.dbv or self
        self.nodes[dst] = vn
        return vn

    def _copy_flags(self, name: str) -> dict[str, Any]:
        flags = {k: v for k, v in self.flags.items()}

        hist = flags.get("hist")
        if hist and hist != "-":
            zs = "{}/{}".format(hist.rstrip("/"), name)
            flags["hist"] = os.path.expandvars(os.path.expanduser(zs))

        dbp = flags.get("dbpath")
        if dbp and dbp != "-":
            zs = "{}/{}".format(dbp.rstrip("/"), name)
            flags["dbpath"] = os.path.expandvars(os.path.expanduser(zs))

        return flags

    def bubble_flags(self) -> None:
        if self.dbv:
            for k, v in self.dbv.flags.items():
                if k not in ["hist"]:
                if k not in ("hist", "dbpath"):
                    self.flags[k] = v

        for n in self.nodes.values():
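`_copy_flags` above derives a child volume's `hist`/`dbpath` flag from the parent's by appending the child's name before expanding `~` and environment variables; restated as a tiny standalone helper (the name `child_flag` is made up, the logic is the same two lines):

```python
import os

# a child volume inherits hist/dbpath with its own name appended,
# then ~ and $VARS are expanded in the resulting path
def child_flag(parent_flag, name):
    zs = "{}/{}".format(parent_flag.rstrip("/"), name)
    return os.path.expandvars(os.path.expanduser(zs))
```

so a parent with `hist=/var/store/hists/` gives its child `music` the value `/var/store/hists/music`.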
@@ -861,7 +879,7 @@ class AuthSrv(object):
        self.indent = ""

        # fwd-decl
        self.vfs = VFS(log_func, "", "", AXS(), {})
        self.vfs = VFS(log_func, "", "", "", AXS(), {})
        self.acct: dict[str, str] = {}  # uname->pw
        self.iacct: dict[str, str] = {}  # pw->uname
        self.ases: dict[str, str] = {}  # uname->session
@@ -929,7 +947,7 @@ class AuthSrv(object):
        self,
        src: str,
        dst: str,
        mount: dict[str, str],
        mount: dict[str, tuple[str, str]],
        daxs: dict[str, AXS],
        mflags: dict[str, dict[str, Any]],
        un_gns: dict[str, list[str]],
@@ -945,12 +963,24 @@ class AuthSrv(object):
            un_gn = [("", "")]

        for un, gn in un_gn:
            m = PTN_U_GRP.search(dst0)
            if m:
                req, gnc = m.groups()
                hit = gnc in (un_gns.get(un) or [])
                if req == "+":
                    if not hit:
                        continue
                elif hit:
                    continue

            # if ap/vp has a user/group placeholder, make sure to keep
            # track so the same user/group is mapped when setting perms;
            # otherwise clear un/gn to indicate it's a regular volume

            src1 = src0.replace("${u}", un or "\n")
            dst1 = dst0.replace("${u}", un or "\n")
            src1 = PTN_U_GRP.sub(un or "\n", src1)
            dst1 = PTN_U_GRP.sub(un or "\n", dst1)
            if src0 == src1 and dst0 == dst1:
                un = ""

@@ -967,7 +997,7 @@ class AuthSrv(object):
                continue
            visited.add(label)

            src, dst = self._map_volume(src, dst, mount, daxs, mflags)
            src, dst = self._map_volume(src, dst, dst0, mount, daxs, mflags)
            if src:
                ret.append((src, dst, un, gn))
                if un or gn:
@@ -979,7 +1009,8 @@ class AuthSrv(object):
        self,
        src: str,
        dst: str,
        mount: dict[str, str],
        dst0: str,
        mount: dict[str, tuple[str, str]],
        daxs: dict[str, AXS],
        mflags: dict[str, dict[str, Any]],
    ) -> tuple[str, str]:
@@ -989,13 +1020,13 @@ class AuthSrv(object):

        if dst in mount:
            t = "multiple filesystem-paths mounted at [/{}]:\n [{}]\n [{}]"
            self.log(t.format(dst, mount[dst], src), c=1)
            self.log(t.format(dst, mount[dst][0], src), c=1)
            raise Exception(BAD_CFG)

        if src in mount.values():
            t = "filesystem-path [{}] mounted in multiple locations:"
            t = t.format(src)
            for v in [k for k, v in mount.items() if v == src] + [dst]:
            for v in [k for k, v in mount.items() if v[0] == src] + [dst]:
                t += "\n /{}".format(v)

            self.log(t, c=3)
@@ -1004,7 +1035,7 @@ class AuthSrv(object):
        if not bos.path.isdir(src):
            self.log("warning: filesystem-path does not exist: {}".format(src), 3)

        mount[dst] = src
        mount[dst] = (src, dst0)
        daxs[dst] = AXS()
        mflags[dst] = {}
        return (src, dst)
@@ -1065,7 +1096,7 @@ class AuthSrv(object):
        grps: dict[str, list[str]],
        daxs: dict[str, AXS],
        mflags: dict[str, dict[str, Any]],
        mount: dict[str, str],
        mount: dict[str, tuple[str, str]],
    ) -> None:
        self.line_ctr = 0

@@ -1090,7 +1121,7 @@ class AuthSrv(object):
        grps: dict[str, list[str]],
        daxs: dict[str, AXS],
        mflags: dict[str, dict[str, Any]],
        mount: dict[str, str],
        mount: dict[str, tuple[str, str]],
        npass: int,
    ) -> None:
        self.line_ctr = 0
@@ -1449,8 +1480,8 @@ class AuthSrv(object):
        acct: dict[str, str] = {}  # username:password
        grps: dict[str, list[str]] = {}  # groupname:usernames
        daxs: dict[str, AXS] = {}
        mflags: dict[str, dict[str, Any]] = {}  # moutpoint:flags
        mount: dict[str, str] = {}  # dst:src (mountpoint:realpath)
        mflags: dict[str, dict[str, Any]] = {}  # vpath:flags
        mount: dict[str, tuple[str, str]] = {}  # dst:src (vp:(ap,vp0))

        self.idp_vols = {}  # yolo

@@ -1529,8 +1560,8 @@ class AuthSrv(object):
        # case-insensitive; normalize
        if WINDOWS:
            cased = {}
            for k, v in mount.items():
                cased[k] = absreal(v)
            for vp, (ap, vp0) in mount.items():
                cased[vp] = (absreal(ap), vp0)

            mount = cased

@@ -1545,25 +1576,28 @@ class AuthSrv(object):
            t = "Read-access has been disabled due to failsafe: No volumes were defined by the config-file. This failsafe is to prevent unintended access if this is due to accidental loss of config. You can override this safeguard and allow read/write to the working-directory by adding the following arguments: -v .::rw"
            self.log(t, 1)
            axs = AXS()
            vfs = VFS(self.log_func, absreal("."), "", axs, {})
            vfs = VFS(self.log_func, absreal("."), "", "", axs, {})
            if not axs.uread:
                vfs.badcfg1 = True
        elif "" not in mount:
            # there's volumes but no root; make root inaccessible
            zsd = {"d2d": True, "tcolor": self.args.tcolor}
            vfs = VFS(self.log_func, "", "", AXS(), zsd)
            vfs = VFS(self.log_func, "", "", "", AXS(), zsd)

        maxdepth = 0
        for dst in sorted(mount.keys(), key=lambda x: (x.count("/"), len(x))):
            depth = dst.count("/")
            assert maxdepth <= depth  # nosec
            maxdepth = depth
            src, dst0 = mount[dst]

            if dst == "":
                # rootfs was mapped; fully replaces the default CWD vfs
                vfs = VFS(self.log_func, mount[dst], dst, daxs[dst], mflags[dst])
                vfs = VFS(self.log_func, src, dst, dst0, daxs[dst], mflags[dst])
                continue

            assert vfs  # type: ignore
            zv = vfs.add(mount[dst], dst)
            zv = vfs.add(src, dst, dst0)
            zv.axs = daxs[dst]
            zv.flags = mflags[dst]
            zv.dbv = None
@@ -1584,12 +1618,8 @@ class AuthSrv(object):
        for vol in vfs.all_vols.values():
            unknown_flags = set()
            for k, v in vol.flags.items():
                stripped = k.lstrip("-")
                if k != stripped and stripped not in vol.flags:
                    t = "WARNING: the config for volume [/%s] tried to remove volflag [%s] by specifying [%s] but that volflag was not already set"
                    self.log(t % (vol.vpath, stripped, k), 3)
                k = stripped
                if k not in flagdescs and k not in k_ign:
                ks = k.lstrip("-")
                if ks not in flagdescs and ks not in k_ign:
                    unknown_flags.add(k)
            if unknown_flags:
                t = "WARNING: the config for volume [/%s] has unrecognized volflags; will ignore: '%s'"
@@ -1601,7 +1631,8 @@ class AuthSrv(object):
        if enshare:
            import sqlite3

            shv = VFS(self.log_func, "", shr, AXS(), {})
            zsd = {"d2d": True, "tcolor": self.args.tcolor}
            shv = VFS(self.log_func, "", shr, shr, AXS(), zsd)

            db_path = self.args.shr_db
            db = sqlite3.connect(db_path)
@@ -1635,9 +1666,8 @@ class AuthSrv(object):

                # don't know the abspath yet + wanna ensure the user
                # still has the privs they granted, so nullmap it
                shv.nodes[s_k] = VFS(
                    self.log_func, "", "%s/%s" % (shr, s_k), s_axs, shv.flags.copy()
                )
                vp = "%s/%s" % (shr, s_k)
                shv.nodes[s_k] = VFS(self.log_func, "", vp, vp, s_axs, shv.flags.copy())

            vfs.nodes[shr] = vfs.all_vols[shr] = shv
            for vol in shv.nodes.values():
@@ -1737,7 +1767,7 @@ class AuthSrv(object):
                pass
            elif vflag:
                vflag = os.path.expandvars(os.path.expanduser(vflag))
                vol.histpath = uncyg(vflag) if WINDOWS else vflag
                vol.histpath = vol.dbpath = uncyg(vflag) if WINDOWS else vflag
            elif self.args.hist:
                for nch in range(len(hid)):
                    hpath = os.path.join(self.args.hist, hid[: nch + 1])
@@ -1758,12 +1788,45 @@ class AuthSrv(object):
                    with open(powner, "wb") as f:
                        f.write(me)

                    vol.histpath = hpath
                    vol.histpath = vol.dbpath = hpath
                    break

            vol.histpath = absreal(vol.histpath)

        for vol in vfs.all_vols.values():
            hid = self.hid_cache[vol.realpath]
            vflag = vol.flags.get("dbpath")
            if vflag == "-":
                pass
            elif vflag:
                vflag = os.path.expandvars(os.path.expanduser(vflag))
                vol.dbpath = uncyg(vflag) if WINDOWS else vflag
            elif self.args.dbpath:
                for nch in range(len(hid)):
                    hpath = os.path.join(self.args.dbpath, hid[: nch + 1])
                    bos.makedirs(hpath)

                    powner = os.path.join(hpath, "owner.txt")
                    try:
                        with open(powner, "rb") as f:
                            owner = f.read().rstrip()
                    except:
                        owner = None

                    me = afsenc(vol.realpath).rstrip()
                    if owner not in [None, me]:
                        continue

                    if owner is None:
                        with open(powner, "wb") as f:
                            f.write(me)

                    vol.dbpath = hpath
                    break

            vol.dbpath = absreal(vol.dbpath)
            if vol.dbv:
                if bos.path.exists(os.path.join(vol.histpath, "up2k.db")):
                if bos.path.exists(os.path.join(vol.dbpath, "up2k.db")):
                    promote.append(vol)
                    vol.dbv = None
                else:
@@ -1778,9 +1841,7 @@ class AuthSrv(object):
            "\n the following jump-volumes were generated to assist the vfs.\n As they contain a database (probably from v0.11.11 or older),\n they are promoted to full volumes:"
        ]
        for vol in promote:
            ta.append(
                " /{} ({}) ({})".format(vol.vpath, vol.realpath, vol.histpath)
            )
            ta.append(" /%s (%s) (%s)" % (vol.vpath, vol.realpath, vol.dbpath))

        self.log("\n\n".join(ta) + "\n", c=3)

@@ -1791,13 +1852,45 @@ class AuthSrv(object):
            is_shr = shr and zv.vpath.split("/")[0] == shr
            if histp and not is_shr and histp in rhisttab:
                zv2 = rhisttab[histp]
                t = "invalid config; multiple volumes share the same histpath (database location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
                t = "invalid config; multiple volumes share the same histpath (database+thumbnails location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
|
||||
t = t % (histp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
|
||||
self.log(t, 1)
|
||||
raise Exception(t)
|
||||
rhisttab[histp] = zv
|
||||
vfs.histtab[zv.realpath] = histp
|
||||
|
||||
rdbpaths = {}
|
||||
vfs.dbpaths = {}
|
||||
for zv in vfs.all_vols.values():
|
||||
dbp = zv.dbpath
|
||||
is_shr = shr and zv.vpath.split("/")[0] == shr
|
||||
if dbp and not is_shr and dbp in rdbpaths:
|
||||
zv2 = rdbpaths[dbp]
|
||||
t = "invalid config; multiple volumes share the same dbpath (database location):\n dbpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
|
||||
t = t % (dbp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
|
||||
self.log(t, 1)
|
||||
raise Exception(t)
|
||||
rdbpaths[dbp] = zv
|
||||
vfs.dbpaths[zv.realpath] = dbp
|
||||
|
||||
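The dedup check above keys every volume by its dbpath and rejects reuse, first claimer wins. A minimal standalone sketch of that scan; the argument shape and return value are illustrative, not copyparty's actual API:

```python
def find_dbpath_collisions(vols: dict) -> list:
    """vols maps volume-path -> dbpath; returns (dbpath, vol1, vol2) conflicts."""
    seen = {}  # dbpath -> first volume that claimed it
    bad = []
    for vpath, dbp in vols.items():
        if dbp in seen:
            bad.append((dbp, seen[dbp], vpath))
        else:
            seen[dbp] = vpath
    return bad
```

Two volumes pointing their indexes at the same directory would corrupt each other's up2k databases, hence the hard failure in the hunk above.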
for vol in vfs.all_vols.values():
use = False
for k in ["zipmaxn", "zipmaxs"]:
try:
zs = vol.flags[k]
except:
zs = getattr(self.args, k)
if zs in ("", "0"):
vol.flags[k] = 0
continue

zf = unhumanize(zs)
vol.flags[k + "_v"] = zf
if zf:
use = True
if use:
vol.flags["zipmax"] = True

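The zipmaxn/zipmaxs parsing above relies on copyparty's `unhumanize` helper to turn suffixed sizes like `9k` or `2g` into integers. A rough standalone approximation of such a parser; the real helper may differ in accepted suffixes and rounding:

```python
def unhumanize_sz(zs: str) -> int:
    """approximate size-suffix parser: '2g' -> bytes; bare numbers pass through"""
    mul = {"b": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3, "t": 1024 ** 4}
    zs = zs.strip().lower()
    if zs and zs[-1] in mul:
        return int(float(zs[:-1]) * mul[zs[-1]])
    return int(zs or 0)
```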
for vol in vfs.all_vols.values():
lim = Lim(self.log_func)
use = False
@@ -2181,8 +2274,13 @@ class AuthSrv(object):
for vol in vfs.all_nodes.values():
for k in list(vol.flags.keys()):
if re.match("^-[^-]+$", k):
vol.flags.pop(k[1:], None)
vol.flags.pop(k)
zs = k[1:]
if zs in vol.flags:
vol.flags.pop(k[1:])
else:
t = "WARNING: the config for volume [/%s] tried to remove volflag [%s] by specifying [%s] but that volflag was not already set"
self.log(t % (vol.vpath, zs, k), 3)

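The hunk above replaces the silent `pop(k[1:], None)` with an explicit warning when a `-flag` tries to unset a volflag that was never set. The same negation pass, sketched as a standalone helper (`strip_negations` is a hypothetical name; the real code logs instead of returning):

```python
import re

def strip_negations(flags: dict) -> list:
    """apply '-name' keys: pop 'name' if present, else collect its name as a warning"""
    warnings = []
    for k in list(flags):
        if re.match("^-[^-]+$", k):
            flags.pop(k)
            zs = k[1:]
            if zs in flags:
                flags.pop(zs)
            else:
                warnings.append(zs)
    return warnings
```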
if vol.flags.get("dots"):
for name in vol.axs.uread:
@@ -2275,22 +2373,56 @@ class AuthSrv(object):
except Pebkac:
self.warn_anonwrite = True

idp_err = "WARNING! The following IdP volumes are mounted directly below another volume where anonymous users can read and/or write files. This is a SECURITY HAZARD!! When copyparty is restarted, it will not know about these IdP volumes yet. These volumes will then be accessible by anonymous users UNTIL one of the users associated with their volume sends a request to the server. RECOMMENDATION: You should create a restricted volume where nobody can read/write files, and make sure that all IdP volumes are configured to appear somewhere below that volume."
self.idp_warn = []
self.idp_err = []
for idp_vp in self.idp_vols:
parent_vp = vsplit(idp_vp)[0]
vn, _ = vfs.get(parent_vp, "*", False, False)
zs = (
"READABLE"
if "*" in vn.axs.uread
else "WRITABLE"
if "*" in vn.axs.uwrite
else ""
)
if zs:
t = '\nWARNING: Volume "/%s" appears below "/%s" and would be WORLD-%s'
idp_err += t % (idp_vp, vn.vpath, zs)
if "\n" in idp_err:
self.log(idp_err, 1)
idp_vn, _ = vfs.get(idp_vp, "*", False, False)
idp_vp0 = idp_vn.vpath0

sigils = set(re.findall(r"(\${[ug][}%])", idp_vp0))
if len(sigils) > 1:
t = '\nWARNING: IdP-volume "/%s" created by "/%s" has multiple IdP placeholders: %s'
self.idp_warn.append(t % (idp_vp, idp_vp0, list(sigils)))
continue

sigil = sigils.pop()
par_vp = idp_vp
while par_vp:
par_vp = vsplit(par_vp)[0]
par_vn, _ = vfs.get(par_vp, "*", False, False)
if sigil in par_vn.vpath0:
continue # parent was spawned for and by same user

oth_read = []
oth_write = []
for usr in par_vn.axs.uread:
if usr not in idp_vn.axs.uread:
oth_read.append(usr)
for usr in par_vn.axs.uwrite:
if usr not in idp_vn.axs.uwrite:
oth_write.append(usr)

if "*" in oth_read:
taxs = "WORLD-READABLE"
elif "*" in oth_write:
taxs = "WORLD-WRITABLE"
elif oth_read:
taxs = "READABLE BY %r" % (oth_read,)
elif oth_write:
taxs = "WRITABLE BY %r" % (oth_write,)
else:
break # no sigil; not idp; safe to stop

t = '\nWARNING: IdP-volume "/%s" created by "/%s" has parent/grandparent "/%s" and would be %s'
self.idp_err.append(t % (idp_vp, idp_vp0, par_vn.vpath, taxs))

if self.idp_warn:
t = "WARNING! Some IdP volumes include multiple IdP placeholders; this is too complex to automatically determine if safe or not. To ensure that no users gain unintended access, please use only a single placeholder for each IdP volume."
self.log(t + "".join(self.idp_warn), 1)

if self.idp_err:
t = "WARNING! The following IdP volumes are mounted below another volume where other users can read and/or write files. This is a SECURITY HAZARD!! When copyparty is restarted, it will not know about these IdP volumes yet. These volumes will then be accessible by an unexpected set of permissions UNTIL one of the users associated with their volume sends a request to the server. RECOMMENDATION: You should create a restricted volume where nobody can read/write files, and make sure that all IdP volumes are configured to appear somewhere below that volume."
self.log(t + "".join(self.idp_err), 1)

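The parent-walk above compares the parent volume's read/write sets against the IdP volume's own, then classifies any extra access from most to least severe. The classification step in isolation, using plain sets instead of copyparty's AXS objects:

```python
def audit_extra_access(parent_read: set, parent_write: set,
                       vol_read: set, vol_write: set) -> str:
    """classify who could reach a not-yet-mounted IdP volume via its parent;
    empty string means nobody unexpected"""
    oth_read = sorted(parent_read - vol_read)
    oth_write = sorted(parent_write - vol_write)
    if "*" in oth_read:
        return "WORLD-READABLE"
    if "*" in oth_write:
        return "WORLD-WRITABLE"
    if oth_read:
        return "READABLE BY %r" % (oth_read,)
    if oth_write:
        return "WRITABLE BY %r" % (oth_write,)
    return ""
```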
self.vfs = vfs
self.acct = acct
@@ -2325,11 +2457,6 @@ class AuthSrv(object):
for x, y in vfs.all_vols.items()
if x != shr and not x.startswith(shrs)
}
vfs.all_nodes = {
x: y
for x, y in vfs.all_nodes.items()
if x != shr and not x.startswith(shrs)
}

assert db and cur and cur2 and shv # type: ignore
for row in cur.execute("select * from sh"):
@@ -2359,6 +2486,7 @@ class AuthSrv(object):
else:
shn.ls = shn._ls

shn.shr_owner = s_un
shn.shr_src = (s_vfs, s_rem)
shn.realpath = s_vfs.canonical(s_rem)

@@ -2376,7 +2504,7 @@ class AuthSrv(object):
continue # also fine
for zs in svn.nodes.keys():
# hide subvolume
vn.nodes[zs] = VFS(self.log_func, "", "", AXS(), {})
vn.nodes[zs] = VFS(self.log_func, "", "", "", AXS(), {})

cur2.close()
cur.close()
@@ -2384,7 +2512,9 @@ class AuthSrv(object):

self.js_ls = {}
self.js_htm = {}
for vn in self.vfs.all_nodes.values():
for vp, vn in self.vfs.all_nodes.items():
if enshare and vp.startswith(shrs):
continue # propagates later in this func
vf = vn.flags
vn.js_ls = {
"idx": "e2d" in vf,
@@ -2442,8 +2572,12 @@ class AuthSrv(object):
vols = list(vfs.all_nodes.values())
if enshare:
assert shv # type: ignore # !rm
vols.append(shv)
vols.extend(list(shv.nodes.values()))
for vol in shv.nodes.values():
if vol.vpath not in vfs.all_nodes:
self.log("BUG: /%s not in all_nodes" % (vol.vpath,), 1)
vols.append(vol)
if shr in vfs.all_nodes:
self.log("BUG: %s found in all_nodes" % (shr,), 1)

for vol in vols:
dbv = vol.get_dbv("")[0]
@@ -2546,8 +2680,8 @@ class AuthSrv(object):
if not bos.path.exists(ap):
pwdb = {}
else:
with open(ap, "r", encoding="utf-8") as f:
pwdb = json.load(f)
jtxt = read_utf8(self.log, ap, True)
pwdb = json.loads(jtxt)

pwdb = [x for x in pwdb if x[0] != uname]
pwdb.append((uname, self.defpw[uname], hpw))
@@ -2570,8 +2704,8 @@ class AuthSrv(object):
if not self.args.chpw or not bos.path.exists(ap):
return

with open(ap, "r", encoding="utf-8") as f:
pwdb = json.load(f)
jtxt = read_utf8(self.log, ap, True)
pwdb = json.loads(jtxt)

useen = set()
urst = set()
@@ -2685,7 +2819,7 @@ class AuthSrv(object):
def dbg_ls(self) -> None:
users = self.args.ls
vol = "*"
flags: list[str] = []
flags: Sequence[str] = []

try:
users, vol = users.split(",", 1)
@@ -3067,8 +3201,9 @@ def expand_config_file(
ipath += " -> " + fp
ret.append("#\033[36m opening cfg file{}\033[0m".format(ipath))

with open(fp, "rb") as f:
for oln in [x.decode("utf-8").rstrip() for x in f]:
cfg_lines = read_utf8(log, fp, True).split("\n")
if True: # diff-golf
for oln in [x.rstrip() for x in cfg_lines]:
ln = oln.split(" #")[0].strip()
if ln.startswith("% "):
pad = " " * len(oln.split("%")[0])

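The rewritten loop above still strips trailing comments from each config line and detects `% ` include directives. That per-line split, as a tiny standalone function; the tuple return shape is illustrative, not copyparty's:

```python
def parse_cfg_line(oln: str):
    """split a cfg line into (value, include-target-or-None)"""
    ln = oln.split("  #")[0].strip()  # drop trailing comment
    if ln.startswith("% "):
        return ln, ln[2:].strip()  # '% file.conf' pulls in another cfg file
    return ln, None
```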
@@ -52,9 +52,11 @@ def vf_bmap() -> dict[str, str]:
"og_s_title",
"rand",
"rss",
"wo_up_readme",
"xdev",
"xlink",
"xvol",
"zipmaxu",
):
ret[k] = k
return ret
@@ -81,6 +83,7 @@ def vf_vmap() -> dict[str, str]:
"md_sbf",
"lg_sba",
"md_sba",
"md_hist",
"nrand",
"u2ow",
"og_desc",
@@ -101,6 +104,9 @@ def vf_vmap() -> dict[str, str]:
"u2ts",
"ups_who",
"zip_who",
"zipmaxn",
"zipmaxs",
"zipmaxt",
):
ret[k] = k
return ret
@@ -169,6 +175,7 @@ flagcats = {
"vmaxb=1g": "total volume size max 1 GiB (suffixes: b, k, m, g, t)",
"vmaxn=4k": "max 4096 files in volume (suffixes: b, k, m, g, t)",
"medialinks": "return medialinks for non-up2k uploads (not hotlinks)",
"wo_up_readme": "write-only users can upload logues without getting renamed",
"rand": "force randomized filenames, 9 chars long by default",
"nrand=N": "randomized filenames are N chars long",
"u2ow=N": "overwrite existing files? 0=no 1=if-older 2=always",
@@ -198,6 +205,7 @@ flagcats = {
"d2v": "disables file verification, overrides -e2v*",
"d2d": "disables all database stuff, overrides -e2*",
"hist=/tmp/cdb": "puts thumbnails and indexes at that location",
"dbpath=/tmp/cdb": "puts indexes at that location",
"scan=60": "scan for new files every 60sec, same as --re-maxage",
"nohash=\\.iso$": "skips hashing file contents if path matches *.iso",
"noidx=\\.iso$": "fully ignores the contents at paths matching *.iso",
@@ -285,6 +293,7 @@ flagcats = {
"og_ua": "if defined: only send OG html if useragent matches this regex",
},
"textfiles": {
"md_hist": "where to put markdown backups; s=subfolder, v=volHist, n=nope",
"exp": "enable textfile expansion; see --help-exp",
"exp_md": "placeholders to expand in markdown files; see --help",
"exp_lg": "placeholders to expand in prologue/epilogue; see --help",
@@ -293,9 +302,16 @@ flagcats = {
"dots": "allow all users with read-access to\nenable the option to show dotfiles in listings",
"fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
"fka=8": 'generates slightly weaker per-file accesskeys,\nwhich are then required at the "g" permission;\nnot affected by filesize or inode numbers',
"dk=8": 'generates per-directory accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
"dks": "per-directory accesskeys allow browsing into subdirs",
"dky": 'allow seeing files (not folders) inside a specific folder\nwith "g" perm, and does not require a valid dirkey to do so',
"rss": "allow '?rss' URL suffix (experimental)",
"ups_who=2": "restrict viewing the list of recent uploads",
"zip_who=2": "restrict access to download-as-zip/tar",
"zipmaxn=9k": "reject download-as-zip if more than 9000 files",
"zipmaxs=2g": "reject download-as-zip if size over 2 GiB",
"zipmaxt=no": "reply with 'no' if download-as-zip exceeds max",
"zipmaxu": "zip-size-limit does not apply to authenticated users",
"nopipe": "disable race-the-beam (download unfinished uploads)",
"mv_retry": "ms-windows: timeout for renaming busy files",
"rm_retry": "ms-windows: timeout for deleting busy files",

@@ -78,7 +78,7 @@ class Fstab(object):
return vid

def build_fallback(self) -> None:
self.tab = VFS(self.log_func, "idk", "/", AXS(), {})
self.tab = VFS(self.log_func, "idk", "/", "/", AXS(), {})
self.trusted = False

def build_tab(self) -> None:
@@ -111,9 +111,10 @@ class Fstab(object):

tab1.sort(key=lambda x: (len(x[0]), x[0]))
path1, fs1 = tab1[0]
tab = VFS(self.log_func, fs1, path1, AXS(), {})
tab = VFS(self.log_func, fs1, path1, path1, AXS(), {})
for path, fs in tab1[1:]:
tab.add(fs, path.lstrip("/"))
zs = path.lstrip("/")
tab.add(fs, zs, zs)

self.tab = tab
self.srctab = srctab
@@ -130,9 +131,10 @@ class Fstab(object):
if not self.trusted:
# no mtab access; have to build as we go
if "/" in rem:
self.tab.add("idk", os.path.join(vn.vpath, rem.split("/")[0]))
zs = os.path.join(vn.vpath, rem.split("/")[0])
self.tab.add("idk", zs, zs)
if rem:
self.tab.add(nval, path)
self.tab.add(nval, path, path)
else:
vn.realpath = nval


@@ -19,6 +19,7 @@ from .__init__ import PY2, TYPE_CHECKING
from .authsrv import VFS
from .bos import bos
from .util import (
FN_EMB,
VF_CAREFUL,
Daemon,
ODict,
@@ -170,6 +171,16 @@ class FtpFs(AbstractedFS):
fn = sanitize_fn(fn or "", "")
vpath = vjoin(rd, fn)
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, r, w, m, d)
if (
w
and fn.lower() in FN_EMB
and self.h.uname not in vfs.axs.uread
and "wo_up_readme" not in vfs.flags
):
fn = "_wo_" + fn
vpath = vjoin(rd, fn)
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, r, w, m, d)

if not vfs.realpath:
t = "No filesystem mounted at [{}]"
raise FSE(t.format(vpath))

@@ -4,7 +4,6 @@ from __future__ import print_function, unicode_literals
import argparse # typechk
import copy
import errno
import gzip
import hashlib
import itertools
import json
@@ -22,6 +21,7 @@ from datetime import datetime
from operator import itemgetter

import jinja2 # typechk
from ipaddress import IPv6Network

try:
if os.environ.get("PRTY_NO_LZMA"):
@@ -45,6 +45,7 @@ from .util import (
APPLESAN_RE,
BITNESS,
DAV_ALLPROPS,
FN_EMB,
HAVE_SQLITE3,
HTTPCODE,
META_NOBOTS,
@@ -56,6 +57,7 @@ from .util import (
UnrecvEOF,
WrongPostKey,
absreal,
afsenc,
alltrace,
atomic_move,
b64dec,
@@ -68,6 +70,7 @@ from .util import (
get_df,
get_spd,
guess_mime,
gzip,
gzip_file_orig_sz,
gzip_orig_sz,
has_resource,
@@ -89,6 +92,7 @@ from .util import (
read_socket,
read_socket_chunked,
read_socket_unbounded,
read_utf8,
relchk,
ren_open,
runhook,
@@ -152,6 +156,8 @@ RE_HSAFE = re.compile(r"[\x00-\x1f<>\"'&]") # search always much faster
RE_HOST = re.compile(r"[^][0-9a-zA-Z.:_-]") # search faster <=17ch
RE_MHOST = re.compile(r"^[][0-9a-zA-Z.:_-]+$") # match faster >=18ch
RE_K = re.compile(r"[^0-9a-zA-Z_-]") # search faster <=17ch
RE_HR = re.compile(r"[<>\"'&]")
RE_MDV = re.compile(r"(.*)\.([0-9]+\.[0-9]{3})(\.[Mm][Dd])$")

UPARAM_CC_OK = set("doc move tree".split())

@@ -385,11 +391,12 @@ class HttpCli(object):
t += ' Note: if you are behind cloudflare, then this default header is not a good choice; please first make sure your local reverse-proxy (if any) does not allow non-cloudflare IPs from providing cf-* headers, and then add this additional global setting: "--xff-hdr=cf-connecting-ip"'
else:
t += ' Note: depending on your reverse-proxy, and/or WAF, and/or other intermediates, you may want to read the true client IP from another header by also specifying "--xff-hdr=SomeOtherHeader"'
zs = (
".".join(pip.split(".")[:2]) + "."
if "." in pip
else ":".join(pip.split(":")[:4]) + ":"
) + "0.0/16"

if "." in pip:
zs = ".".join(pip.split(".")[:2]) + ".0.0/16"
else:
zs = IPv6Network(pip + "/64", False).compressed

zs2 = ' or "--xff-src=lan"' if self.conn.xff_lan.map(pip) else ""
self.log(t % (self.args.xff_hdr, pip, cli_ip, zso, zs, zs2), 3)
self.bad_xff = True
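The new branch above switches the IPv6 case from string-slicing to `ipaddress.IPv6Network` with `strict=False`, so the suggested `--xff-src` block is a properly masked /64 instead of a truncated address. The same masking in isolation:

```python
from ipaddress import IPv6Network

def anon_net(pip: str) -> str:
    """summarize a client IP as its surrounding block, as the xff hint above does"""
    if "." in pip:
        # IPv4: keep the first two octets, suggest the /16
        return ".".join(pip.split(".")[:2]) + ".0.0/16"
    # IPv6: strict=False zeroes the host bits instead of raising
    return IPv6Network(pip + "/64", False).compressed
```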
@@ -866,8 +873,7 @@ class HttpCli(object):
html = html.replace("%", "", 1)

if html.startswith("@"):
with open(html[1:], "rb") as f:
html = f.read().decode("utf-8")
html = read_utf8(self.log, html[1:], True)

if html.startswith("%"):
html = html[1:]
@@ -1200,11 +1206,6 @@ class HttpCli(object):
else:
return self.tx_res(res_path)

if res_path != undot(res_path):
t = "malicious user; attempted path traversal; req(%r) vp(%r) => %r"
self.log(t % (self.req, "/" + self.vpath, res_path), 1)
self.cbonk(self.conn.hsrv.gmal, self.req, "trav", "path traversal")

self.tx_404()
return False

@@ -1234,14 +1235,7 @@ class HttpCli(object):
return self.tx_404(True)
else:
vfs = self.asrv.vfs
if (
not vfs.nodes
and not vfs.axs.uread
and not vfs.axs.uwrite
and not vfs.axs.uget
and not vfs.axs.uhtml
and not vfs.axs.uadmin
):
if vfs.badcfg1:
t = "<h2>access denied due to failsafe; check server log</h2>"
html = self.j2s("splash", this=self, msg=t)
self.reply(html.encode("utf-8", "replace"), 500)
@@ -2553,6 +2547,16 @@ class HttpCli(object):
vfs, rem = self.asrv.vfs.get(self.vpath, self.uname, False, True)
dbv, vrem = vfs.get_dbv(rem)

name = sanitize_fn(name, "")
if (
not self.can_read
and self.can_write
and name.lower() in FN_EMB
and "wo_up_readme" not in dbv.flags
):
name = "_wo_" + name

body["name"] = name
body["vtop"] = dbv.vpath
body["ptop"] = dbv.realpath
body["prel"] = vrem
@@ -2990,9 +2994,6 @@ class HttpCli(object):
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
rem = sanitize_vpath(rem, "/")
fn = vfs.canonical(rem)
if not fn.startswith(vfs.realpath):
self.log("invalid mkdir %r %r" % (self.gctx, vpath), 1)
raise Pebkac(422)

if not nullwrite:
fdir = os.path.dirname(fn)
@@ -3493,6 +3494,7 @@ class HttpCli(object):

fp = os.path.join(fp, fn)
rem = "{}/{}".format(rp, fn).strip("/")
dbv, vrem = vfs.get_dbv(rem)

if not rem.endswith(".md") and not self.can_delete:
raise Pebkac(400, "only markdown pls")
@@ -3547,13 +3549,27 @@ class HttpCli(object):
mdir, mfile = os.path.split(fp)
fname, fext = mfile.rsplit(".", 1) if "." in mfile else (mfile, "md")
mfile2 = "{}.{:.3f}.{}".format(fname, srv_lastmod, fext)
try:

dp = ""
hist_cfg = dbv.flags["md_hist"]
if hist_cfg == "v":
vrd = vsplit(vrem)[0]
zb = hashlib.sha512(afsenc(vrd)).digest()
zs = ub64enc(zb).decode("ascii")[:24].lower()
dp = "%s/md/%s/%s/%s" % (dbv.histpath, zs[:2], zs[2:4], zs)
self.log("moving old version to %s/%s" % (dp, mfile2))
if bos.makedirs(dp):
with open(os.path.join(dp, "dir.txt"), "wb") as f:
f.write(afsenc(vrd))
elif hist_cfg == "s":
dp = os.path.join(mdir, ".hist")
bos.mkdir(dp)
hidedir(dp)
except:
pass
wrename(self.log, fp, os.path.join(mdir, ".hist", mfile2), vfs.flags)
try:
bos.mkdir(dp)
hidedir(dp)
except:
pass
if dp:
wrename(self.log, fp, os.path.join(dp, mfile2), vfs.flags)

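The `md_hist=v` branch above derives a sharded backup directory from a sha512 of the markdown file's folder path, using the first two and next two characters of the encoded digest as fan-out levels. A standalone approximation; copyparty's `afsenc`/`ub64enc` are stood in for here by utf-8 encoding and `base64.urlsafe_b64encode`, which may differ in alphabet details:

```python
import base64
import hashlib

def vol_hist_subdir(histpath: str, vrd: str) -> str:
    """derive the sharded backup dir for a markdown folder, like md_hist=v above"""
    zb = hashlib.sha512(vrd.encode("utf-8")).digest()
    zs = base64.urlsafe_b64encode(zb).decode("ascii")[:24].lower()
    # two fan-out levels keep any single directory from growing huge
    return "%s/md/%s/%s/%s" % (histpath, zs[:2], zs[2:4], zs)
```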
assert self.parser.gen # !rm
p_field, _, p_data = next(self.parser.gen)
@@ -3626,13 +3642,12 @@ class HttpCli(object):
wunlink(self.log, fp, vfs.flags)
raise Pebkac(403, t)

vfs, rem = vfs.get_dbv(rem)
self.conn.hsrv.broker.say(
"up2k.hash_file",
vfs.realpath,
vfs.vpath,
vfs.flags,
vsplit(rem)[0],
dbv.realpath,
dbv.vpath,
dbv.flags,
vsplit(vrem)[0],
fn,
self.ip,
new_lastmod,
@@ -3736,8 +3751,7 @@ class HttpCli(object):
continue
fn = "%s/%s" % (abspath, fn)
if bos.path.isfile(fn):
with open(fsenc(fn), "rb") as f:
logues[n] = f.read().decode("utf-8")
logues[n] = read_utf8(self.log, fsenc(fn), False)
if "exp" in vn.flags:
logues[n] = self._expand(
logues[n], vn.flags.get("exp_lg") or []
@@ -3758,9 +3772,8 @@ class HttpCli(object):
for fn in fns:
fn = "%s/%s" % (abspath, fn)
if bos.path.isfile(fn):
with open(fsenc(fn), "rb") as f:
txt = f.read().decode("utf-8")
break
txt = read_utf8(self.log, fsenc(fn), False)
break

if txt and "exp" in vn.flags:
txt = self._expand(txt, vn.flags.get("exp_md") or [])
@@ -3793,6 +3806,19 @@ class HttpCli(object):

return txt

def _can_zip(self, volflags: dict[str, Any]) -> str:
lvl = volflags["zip_who"]
if self.args.no_zip or not lvl:
return "download-as-zip/tar is disabled in server config"
elif lvl <= 1 and not self.can_admin:
return "download-as-zip/tar is admin-only on this server"
elif lvl <= 2 and self.uname in ("", "*"):
return "you must be authenticated to download-as-zip/tar on this server"
elif self.args.ua_nozip and self.args.ua_nozip.search(self.ua):
t = "this URL contains no valuable information for bots/crawlers"
raise Pebkac(403, t)
return ""

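The new `_can_zip` above centralizes the `zip_who` gate so the zip handler and the listing code share one rule: an empty string means allowed, any other return value is the refusal reason. The same ladder as a free function, with the user-agent check omitted since that branch raises rather than returns:

```python
def can_zip_reason(lvl: int, is_admin: bool, uname: str, no_zip: bool = False) -> str:
    """zip_who gate: 0=off, 1=admins, 2=authed users, higher=everyone"""
    if no_zip or not lvl:
        return "download-as-zip/tar is disabled in server config"
    if lvl <= 1 and not is_admin:
        return "download-as-zip/tar is admin-only on this server"
    if lvl <= 2 and uname in ("", "*"):
        return "you must be authenticated to download-as-zip/tar on this server"
    return ""
```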
def tx_res(self, req_path: str) -> bool:
status = 200
logmsg = "{:4} {} ".format("", self.req)
@@ -4220,6 +4246,7 @@ class HttpCli(object):
self.log(t % (data_end / M, lower / M, upper / M), 6)
with self.u2mutex:
if data_end > self.u2fh.aps.get(ap_data, data_end):
fhs: Optional[set[typing.BinaryIO]] = None
try:
fhs = self.u2fh.cache[ap_data].all_fhs
for fh in fhs:
@@ -4227,7 +4254,11 @@ class HttpCli(object):
self.u2fh.aps[ap_data] = data_end
self.log("pipe: flushed %d up2k-FDs" % (len(fhs),))
except Exception as ex:
self.log("pipe: u2fh flush failed: %r" % (ex,))
if fhs is None:
err = "file is not being written to right now"
else:
err = repr(ex)
self.log("pipe: u2fh flush failed: " + err)

if lower >= data_end:
if data_end:
@@ -4325,13 +4356,8 @@ class HttpCli(object):
rem: str,
items: list[str],
) -> bool:
lvl = vn.flags["zip_who"]
if self.args.no_zip or not lvl:
raise Pebkac(400, "download-as-zip/tar is disabled in server config")
elif lvl <= 1 and not self.can_admin:
raise Pebkac(400, "download-as-zip/tar is admin-only on this server")
elif lvl <= 2 and self.uname in ("", "*"):
t = "you must be authenticated to download-as-zip/tar on this server"
t = self._can_zip(vn.flags)
if t:
raise Pebkac(400, t)

logmsg = "{:4} {} ".format("", self.req)
@@ -4364,6 +4390,33 @@ class HttpCli(object):
else:
fn = self.host.split(":")[0]

if vn.flags.get("zipmax") and (not self.uname or not "zipmaxu" in vn.flags):
maxs = vn.flags.get("zipmaxs_v") or 0
maxn = vn.flags.get("zipmaxn_v") or 0
nf = 0
nb = 0
fgen = vn.zipgen(
vpath, rem, set(items), self.uname, False, not self.args.no_scandir
)
t = "total size exceeds a limit specified in server config"
t = vn.flags.get("zipmaxt") or t
if maxs and maxn:
for zd in fgen:
nf += 1
nb += zd["st"].st_size
if maxs < nb or maxn < nf:
raise Pebkac(400, t)
elif maxs:
for zd in fgen:
nb += zd["st"].st_size
if maxs < nb:
raise Pebkac(400, t)
elif maxn:
for zd in fgen:
nf += 1
if maxn < nf:
raise Pebkac(400, t)

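The zipmax enforcement above walks the planned archive entries and aborts as soon as either the byte budget or the file-count budget is exceeded. A simplification of the three-branch version, folding both checks into one loop and treating 0 as "no limit" (the name and argument shape are illustrative):

```python
def exceeds_zip_limits(sizes, maxs: int, maxn: int) -> bool:
    """walk planned zip-entry sizes; stop as soon as either limit trips"""
    nb = nf = 0
    for sz in sizes:
        nf += 1
        nb += sz
        if (maxs and nb > maxs) or (maxn and nf > maxn):
            return True
    return False
```

Checking limits before streaming means an oversized request costs one directory walk instead of a half-transferred archive.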
safe = (string.ascii_letters + string.digits).replace("%", "")
|
||||
afn = "".join([x if x in safe.replace('"', "") else "_" for x in fn])
|
||||
bascii = unicode(safe).encode("utf-8")
|
||||
@@ -4829,7 +4882,7 @@ class HttpCli(object):
|
||||
self.reply(pt.encode("utf-8"), status=rc)
|
||||
return True
|
||||
|
||||
if "th" in self.ouparam:
|
||||
if "th" in self.ouparam and str(self.ouparam["th"])[:1] in "jw":
|
||||
return self.tx_svg("e" + pt[:3])
|
||||
|
||||
# most webdav clients will not send credentials until they
|
||||
@@ -5010,6 +5063,8 @@ class HttpCli(object):
|
||||
def get_dls(self) -> list[list[Any]]:
|
||||
ret = []
|
||||
dls = self.conn.hsrv.tdls
|
||||
enshare = self.args.shr
|
||||
shrs = enshare[1:]
|
||||
for dl_id, (t0, sz, vn, vp, uname) in self.conn.hsrv.tdli.items():
|
||||
t1, sent = dls[dl_id]
|
||||
if sent > 0x100000: # 1m; buffers 2~4
|
||||
@@ -5018,6 +5073,15 @@ class HttpCli(object):
|
||||
vp = ""
|
||||
elif self.uname not in vn.axs.udot and (vp.startswith(".") or "/." in vp):
|
||||
vp = ""
|
||||
elif (
|
||||
enshare
|
||||
and vp.startswith(shrs)
|
||||
and self.uname != vn.shr_owner
|
||||
and self.uname not in vn.axs.uadmin
|
||||
and self.uname not in self.args.shr_adm
|
||||
and not dl_id.startswith(self.ip + ":")
|
||||
):
|
||||
vp = ""
|
||||
if self.uname not in vn.axs.uadmin:
|
||||
dl_id = uname = ""
|
||||
|
||||
@@ -5746,7 +5810,13 @@ class HttpCli(object):
|
||||
|
||||
thp = None
|
||||
if self.thumbcli and not nothumb:
|
||||
thp = self.thumbcli.get(dbv, vrem, int(st.st_mtime), th_fmt)
|
||||
try:
|
||||
thp = self.thumbcli.get(dbv, vrem, int(st.st_mtime), th_fmt)
|
||||
except Pebkac as ex:
|
||||
if ex.code == 500 and th_fmt[:1] in "jw":
|
||||
self.log("failed to convert [%s]:\n%s" % (abspath, ex), 3)
|
||||
return self.tx_svg("--error--\ncheck\nserver\nlog")
|
||||
raise
|
||||
|
||||
if thp:
|
||||
return self.tx_file(thp)
|
||||
@@ -5968,9 +6038,11 @@ class HttpCli(object):
|
||||
# check for old versions of files,
|
||||
# [num-backups, most-recent, hist-path]
|
||||
hist: dict[str, tuple[int, float, str]] = {}
|
||||
histdir = os.path.join(fsroot, ".hist")
|
||||
ptn = re.compile(r"(.*)\.([0-9]+\.[0-9]{3})(\.[^\.]+)$")
|
||||
try:
|
||||
if vf["md_hist"] != "s":
|
||||
raise Exception()
|
||||
histdir = os.path.join(fsroot, ".hist")
|
||||
ptn = RE_MDV
|
||||
for hfn in bos.listdir(histdir):
|
||||
m = ptn.match(hfn)
|
||||
if not m:
|
||||
@@ -6000,8 +6072,11 @@ class HttpCli(object):
|
||||
zs = self.gen_fk(2, self.args.dk_salt, abspath, 0, 0)[:add_dk]
|
||||
ls_ret["dk"] = cgv["dk"] = zs
|
||||
|
||||
no_zip = bool(self._can_zip(vf))
|
||||
|
||||
dirs = []
|
||||
files = []
|
||||
ptn_hr = RE_HR
|
||||
for fn in ls_names:
|
||||
base = ""
|
||||
href = fn
|
||||
@@ -6024,7 +6099,7 @@ class HttpCli(object):
|
||||
is_dir = stat.S_ISDIR(inf.st_mode)
|
||||
if is_dir:
|
||||
href += "/"
|
||||
if self.args.no_zip:
|
||||
if no_zip:
|
||||
margin = "DIR"
|
||||
elif add_dk:
|
||||
zs = absreal(fspath)
|
||||
@@ -6037,7 +6112,7 @@ class HttpCli(object):
|
||||
quotep(href),
|
||||
)
|
||||
elif fn in hist:
|
||||
margin = '<a href="%s.hist/%s">#%s</a>' % (
|
||||
margin = '<a href="%s.hist/%s" rel="nofollow">#%s</a>' % (
|
||||
base,
|
||||
html_escape(hist[fn][2], quot=True, crlf=True),
|
||||
hist[fn][0],
|
||||
@@ -6056,11 +6131,13 @@ class HttpCli(object):
|
||||
zd.second,
|
||||
)
|
||||
|
||||
try:
|
||||
ext = "---" if is_dir else fn.rsplit(".", 1)[1]
|
||||
if is_dir:
|
||||
ext = "---"
|
||||
elif "." in fn:
|
||||
ext = ptn_hr.sub("@", fn.rsplit(".", 1)[1])
|
||||
if len(ext) > 16:
|
||||
ext = ext[:16]
|
||||
except:
|
||||
else:
|
||||
ext = "%"
|
||||
|
||||
if add_fk and not is_dir:
|
||||
@@ -6237,6 +6314,10 @@ class HttpCli(object):
|
||||
|
||||
doc = self.uparam.get("doc") if self.can_read else None
|
||||
if doc:
|
||||
zp = self.args.ua_nodoc
|
||||
if zp and zp.search(self.ua):
|
||||
t = "this URL contains no valuable information for bots/crawlers"
|
||||
raise Pebkac(403, t)
|
||||
j2a["docname"] = doc
|
||||
doctxt = None
|
||||
dfn = lnames.get(doc.lower())
|
||||
@@ -6247,9 +6328,7 @@ class HttpCli(object):
|
||||
docpath = os.path.join(abspath, doc)
|
||||
sz = bos.path.getsize(docpath)
|
||||
if sz < 1024 * self.args.txt_max:
|
||||
with open(fsenc(docpath), "rb") as f:
|
||||
doctxt = f.read().decode("utf-8", "replace")
|
||||
|
||||
doctxt = read_utf8(self.log, fsenc(docpath), False)
|
||||
if doc.lower().endswith(".md") and "exp" in vn.flags:
|
||||
doctxt = self._expand(doctxt, vn.flags.get("exp_md") or [])
|
||||
else:
|
||||
|
||||
@@ -94,10 +94,21 @@ class Ico(object):
<?xml version="1.0" encoding="UTF-8"?>
<svg version="1.1" viewBox="0 0 100 {}" xmlns="http://www.w3.org/2000/svg"><g>
<rect width="100%" height="100%" fill="#{}" />
<text x="50%" y="50%" dominant-baseline="middle" text-anchor="middle" xml:space="preserve"
<text x="50%" y="{}" dominant-baseline="middle" text-anchor="middle" xml:space="preserve"
fill="#{}" font-family="monospace" font-size="14px" style="letter-spacing:.5px">{}</text>
</g></svg>
"""
svg = svg.format(h, c[:6], c[6:], html_escape(ext, True))

txt = html_escape(ext, True)
if "\n" in txt:
lines = txt.split("\n")
n = len(lines)
y = "20%" if n == 2 else "10%" if n == 3 else "0"
zs = '<tspan x="50%%" dy="1.2em">%s</tspan>'
txt = "".join([zs % (x,) for x in lines])
else:
y = "50%"

svg = svg.format(h, c[:6], y, c[6:], txt)

return "image/svg+xml", svg.encode("utf-8")

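The `Ico` hunk above swaps the fixed `y="50%"` for a computed offset and splits multi-line captions into stacked `<tspan>` rows. A rough standalone sketch of the same layout trick (simplified; `icon_svg` is an illustrative name, not copyparty's actual helper):

```python
from xml.sax.saxutils import escape

def icon_svg(ext: str, h: int = 30) -> str:
    """render a file-extension caption as a tiny SVG; multi-line
    captions become stacked <tspan> rows"""
    txt = escape(ext)
    if "\n" in txt:
        lines = txt.split("\n")
        n = len(lines)
        # start higher up the more rows there are
        y = "20%" if n == 2 else "10%" if n == 3 else "0"
        txt = "".join('<tspan x="50%%" dy="1.2em">%s</tspan>' % (s,) for s in lines)
    else:
        y = "50%"
    return (
        '<svg viewBox="0 0 100 %d" xmlns="http://www.w3.org/2000/svg">'
        '<text x="50%%" y="%s" text-anchor="middle">%s</text></svg>' % (h, y, txt)
    )
```

Each `dy="1.2em"` advances the next row by one line-height, which is why the starting `y` has to move up as the row-count grows.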
@@ -18,6 +18,7 @@ from .util import (
REKOBO_LKEY,
VF_CAREFUL,
fsenc,
gzip,
min_ex,
pybin,
retchk,
@@ -138,8 +139,6 @@ def au_unpk(
fd, ret = tempfile.mkstemp("." + au)

if pk == "gz":
import gzip

fi = gzip.GzipFile(abspath, mode="rb")

elif pk == "xz":

@@ -15,7 +15,7 @@ try:
raise Exception()

HAVE_ARGON2 = True
from argon2 import __version__ as argon2ver
from argon2 import exceptions as argon2ex
except:
HAVE_ARGON2 = False

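The argon2 hunk moves `HAVE_ARGON2 = True` above the imports, so the flag only sticks when every import in the `try` body succeeds. The general optional-dependency probe can be sketched generically (`probe` is a made-up helper, not a copyparty function):

```python
import importlib

def probe(modname):
    """try to import an optional dependency; return (available, module-or-None)"""
    try:
        return True, importlib.import_module(modname)
    except Exception:
        return False, None
```

Callers then branch on the boolean instead of sprinkling try/except around every use site.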
@@ -3,7 +3,6 @@ from __future__ import print_function, unicode_literals

import argparse
import errno
import gzip
import logging
import os
import re
@@ -63,7 +62,9 @@ from .util import (
ansi_re,
build_netmap,
expat_ver,
gzip,
load_ipu,
lock_file,
min_ex,
mp,
odfusion,
@@ -73,6 +74,9 @@ from .util import (
ub64enc,
)

if HAVE_SQLITE3:
import sqlite3

if TYPE_CHECKING:
try:
from .mdns import MDNS
@@ -84,6 +88,10 @@ if PY2:
range = xrange # type: ignore


VER_SESSION_DB = 1
VER_SHARES_DB = 2


class SvcHub(object):
"""
Hosts all services which cannot be parallelized due to reliance on monolithic resources.
@@ -186,8 +194,14 @@ class SvcHub(object):

if not args.use_fpool and args.j != 1:
args.no_fpool = True
t = "multithreading enabled with -j {}, so disabling fpool -- this can reduce upload performance on some filesystems"
self.log("root", t.format(args.j))
t = "multithreading enabled with -j {}, so disabling fpool -- this can reduce upload performance on some filesystems, and make some antivirus-softwares "
c = 0
if ANYWIN:
t += "(especially Microsoft Defender) stress your CPU and HDD severely during big uploads"
c = 3
else:
t += "consume more resources (CPU/HDD) than normal"
self.log("root", t.format(args.j), c)

if not args.no_fpool and args.j != 1:
t = "WARNING: ignoring --use-fpool because multithreading (-j{}) is enabled"
@@ -239,6 +253,14 @@ class SvcHub(object):
setattr(args, "ipu_iu", iu)
setattr(args, "ipu_nm", nm)

for zs in "ah_salt fk_salt dk_salt".split():
if getattr(args, "show_%s" % (zs,)):
self.log("root", "effective %s is %s" % (zs, getattr(args, zs)))

if args.ah_cli or args.ah_gen:
args.no_ses = True
args.shr = ""

if not self.args.no_ses:
self.setup_session_db()

@@ -406,25 +428,49 @@ class SvcHub(object):
self.log("root", t, 3)
return

import sqlite3
assert sqlite3 # type: ignore # !rm

# policy:
# the sessions-db is whatever, if something looks broken then just nuke it

create = True
db_path = self.args.ses_db
self.log("root", "opening sessions-db %s" % (db_path,))
for n in range(2):
db_lock = db_path + ".lock"
try:
create = not os.path.getsize(db_path)
except:
create = True
zs = "creating new" if create else "opening"
self.log("root", "%s sessions-db %s" % (zs, db_path))

for tries in range(2):
sver = 0
try:
db = sqlite3.connect(db_path)
cur = db.cursor()
try:
zs = "select v from kv where k='sver'"
sver = cur.execute(zs).fetchall()[0][0]
if sver > VER_SESSION_DB:
zs = "this version of copyparty only understands session-db v%d and older; the db is v%d"
raise Exception(zs % (VER_SESSION_DB, sver))

cur.execute("select count(*) from us").fetchone()
create = False
break
except:
pass
if sver:
raise
sver = 1
self._create_session_db(cur)
err = self._verify_session_db(cur, sver, db_path)
if err:
tries = 99
self.args.no_ses = True
self.log("root", err, 3)
break

except Exception as ex:
if n:
if tries or sver > VER_SESSION_DB:
raise
t = "sessions-db corrupt; deleting and recreating: %r"
t = "sessions-db is unusable; deleting and recreating: %r"
self.log("root", t % (ex,), 3)
try:
cur.close() # type: ignore
@@ -434,8 +480,13 @@ class SvcHub(object):
db.close() # type: ignore
except:
pass
try:
os.unlink(db_lock)
except:
pass
os.unlink(db_path)

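The rewritten open-loop reads a `sver` row from a `kv` table, refuses databases written by a newer copyparty, and nukes anything else that looks broken. The version-gate alone can be sketched like this (table layout as in the diff; `open_or_create` is an illustrative name):

```python
import sqlite3

VER = 1  # highest schema version this build understands

def open_or_create(db):
    """return the schema version, creating the schema if the db is empty;
    raise if the db was written by a newer version"""
    cur = db.cursor()
    try:
        sver = cur.execute("select v from kv where k='sver'").fetchall()[0][0]
    except sqlite3.OperationalError:
        # no kv table => fresh database; create the schema
        cur.execute("create table kv (k text, v int)")
        cur.execute("insert into kv values ('sver', ?)", (VER,))
        db.commit()
        return VER
    if sver > VER:
        raise Exception("db is v%d; this build only understands v%d" % (sver, VER))
    return sver
```

An older-but-valid `sver` falls through and can be upgraded in place, which is what the `sver < VER_SESSION_DB` branch later in the diff does.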
def _create_session_db(self, cur: "sqlite3.Cursor") -> None:
sch = [
r"create table kv (k text, v int)",
r"create table us (un text, si text, t0 int)",
@@ -445,17 +496,44 @@ class SvcHub(object):
r"create index us_t0 on us(t0)",
r"insert into kv values ('sver', 1)",
]
for cmd in sch:
cur.execute(cmd)
self.log("root", "created new sessions-db")

assert db # type: ignore # !rm
assert cur # type: ignore # !rm
if create:
for cmd in sch:
cur.execute(cmd)
self.log("root", "created new sessions-db")
db.commit()
def _verify_session_db(self, cur: "sqlite3.Cursor", sver: int, db_path: str) -> str:
# ensure writable (maybe owned by other user)
db = cur.connection

try:
zil = cur.execute("select v from kv where k='pid'").fetchall()
if len(zil) > 1:
raise Exception()
owner = zil[0][0]
except:
owner = 0

if not lock_file(db_path + ".lock"):
t = "the sessions-db [%s] is already in use by another copyparty instance (pid:%d). This is not supported; please provide another database with --ses-db or give this copyparty-instance its entirely separate config-folder by setting another path in the XDG_CONFIG_HOME env-var. You can also disable this safeguard by setting env-var PRTY_NO_DB_LOCK=1. Will now disable sessions and instead use plaintext passwords in cookies."
return t % (db_path, owner)

vars = (("pid", os.getpid()), ("ts", int(time.time() * 1000)))
if owner:
# wear-estimate: 2 cells; offsets 0x10, 0x50, 0x19720
for k, v in vars:
cur.execute("update kv set v=? where k=?", (v, k))
else:
# wear-estimate: 3~4 cells; offsets 0x10, 0x50, 0x19180, 0x19710, 0x36000, 0x360b0, 0x36b90
for k, v in vars:
cur.execute("insert into kv values(?, ?)", (k, v))

if sver < VER_SESSION_DB:
cur.execute("delete from kv where k='sver'")
cur.execute("insert into kv values('sver',?)", (VER_SESSION_DB,))

db.commit()
cur.close()
db.close()
return ""

def setup_share_db(self) -> None:
al = self.args
@@ -464,7 +542,7 @@ class SvcHub(object):
al.shr = ""
return

import sqlite3
assert sqlite3 # type: ignore # !rm

al.shr = al.shr.strip("/")
if "/" in al.shr or not al.shr:
@@ -475,34 +553,48 @@ class SvcHub(object):
al.shr = "/%s/" % (al.shr,)
al.shr1 = al.shr[1:]

create = True
modified = False
# policy:
# the shares-db is important, so panic if something is wrong

db_path = self.args.shr_db
self.log("root", "opening shares-db %s" % (db_path,))
for n in range(2):
try:
db = sqlite3.connect(db_path)
cur = db.cursor()
try:
cur.execute("select count(*) from sh").fetchone()
create = False
break
except:
pass
except Exception as ex:
if n:
raise
t = "shares-db corrupt; deleting and recreating: %r"
self.log("root", t % (ex,), 3)
try:
cur.close() # type: ignore
except:
pass
try:
db.close() # type: ignore
except:
pass
os.unlink(db_path)
db_lock = db_path + ".lock"
try:
create = not os.path.getsize(db_path)
except:
create = True
zs = "creating new" if create else "opening"
self.log("root", "%s shares-db %s" % (zs, db_path))

sver = 0
try:
db = sqlite3.connect(db_path)
cur = db.cursor()
if not create:
zs = "select v from kv where k='sver'"
sver = cur.execute(zs).fetchall()[0][0]
if sver > VER_SHARES_DB:
zs = "this version of copyparty only understands shares-db v%d and older; the db is v%d"
raise Exception(zs % (VER_SHARES_DB, sver))

cur.execute("select count(*) from sh").fetchone()
except Exception as ex:
t = "could not open shares-db; will now panic...\nthe following database must be repaired or deleted before you can launch copyparty:\n%s\n\nERROR: %s\n\nadditional details:\n%s\n"
self.log("root", t % (db_path, ex, min_ex()), 1)
raise

try:
zil = cur.execute("select v from kv where k='pid'").fetchall()
if len(zil) > 1:
raise Exception()
owner = zil[0][0]
except:
owner = 0

if not lock_file(db_lock):
t = "the shares-db [%s] is already in use by another copyparty instance (pid:%d). This is not supported; please provide another database with --shr-db or give this copyparty-instance its entirely separate config-folder by setting another path in the XDG_CONFIG_HOME env-var. You can also disable this safeguard by setting env-var PRTY_NO_DB_LOCK=1. Will now panic."
t = t % (db_path, owner)
self.log("root", t, 1)
raise Exception(t)

sch1 = [
r"create table kv (k text, v int)",
@@ -514,34 +606,37 @@ class SvcHub(object):
r"create index sf_k on sf(k)",
r"create index sh_k on sh(k)",
r"create index sh_t1 on sh(t1)",
r"insert into kv values ('sver', 2)",
]

assert db # type: ignore # !rm
assert cur # type: ignore # !rm
if create:
dver = 2
modified = True
if not sver:
sver = VER_SHARES_DB
for cmd in sch1 + sch2:
cur.execute(cmd)
self.log("root", "created new shares-db")
else:
(dver,) = cur.execute("select v from kv where k = 'sver'").fetchall()[0]

if dver == 1:
modified = True
if sver == 1:
for cmd in sch2:
cur.execute(cmd)
cur.execute("update sh set st = 0")
self.log("root", "shares-db schema upgrade ok")

if modified:
for cmd in [
r"delete from kv where k = 'sver'",
r"insert into kv values ('sver', %d)" % (2,),
]:
cur.execute(cmd)
db.commit()
if sver < VER_SHARES_DB:
cur.execute("delete from kv where k='sver'")
cur.execute("insert into kv values('sver',?)", (VER_SHARES_DB,))

vars = (("pid", os.getpid()), ("ts", int(time.time() * 1000)))
if owner:
# wear-estimate: same as sessions-db
for k, v in vars:
cur.execute("update kv set v=? where k=?", (v, k))
else:
for k, v in vars:
cur.execute("insert into kv values(?, ?)", (k, v))

db.commit()
cur.close()
db.close()

@@ -679,10 +774,11 @@ class SvcHub(object):
t += ", "
t += "\033[0mNG: \033[35m" + sng

t += "\033[0m, see --deps"
self.log("dependencies", t, 6)
t += "\033[0m, see --deps (this is fine btw)"
self.log("optional-dependencies", t, 6)

def _check_env(self) -> None:
al = self.args
try:
files = os.listdir(E.cfg)
except:
@@ -699,6 +795,21 @@ class SvcHub(object):
if self.args.bauth_last:
self.log("root", "WARNING: ignoring --bauth-last due to --no-bauth", 3)

have_tcp = False
for zs in al.i:
if not zs.startswith("unix:"):
have_tcp = True
if not have_tcp:
zb = False
zs = "z zm zm4 zm6 zmv zmvv zs zsv zv"
for zs in zs.split():
if getattr(al, zs, False):
setattr(al, zs, False)
zb = True
if zb:
t = "only listening on unix-sockets; cannot enable zeroconf/mdns/ssdp as requested"
self.log("root", t, 3)

if not self.args.no_dav:
from .dxml import DXML_OK

@@ -763,13 +874,14 @@ class SvcHub(object):
vl = [os.path.expandvars(os.path.expanduser(x)) for x in vl]
setattr(al, k, vl)

for k in "lo hist ssl_log".split(" "):
for k in "lo hist dbpath ssl_log".split(" "):
vs = getattr(al, k)
if vs:
vs = os.path.expandvars(os.path.expanduser(vs))
setattr(al, k, vs)

for k in "dav_ua1 sus_urls nonsus_urls".split(" "):
zs = "dav_ua1 sus_urls nonsus_urls ua_nodoc ua_nozip"
for k in zs.split(" "):
vs = getattr(al, k)
if not vs or vs == "no":
setattr(al, k, None)
@@ -1260,7 +1372,7 @@ class SvcHub(object):
raise

def check_mp_support(self) -> str:
if MACOS:
if MACOS and not os.environ.get("PRTY_FORCE_MP"):
return "multiprocessing is wonky on mac osx;"
elif sys.version_info < (3, 3):
return "need python 3.3 or newer for multiprocessing;"
@@ -1280,7 +1392,7 @@ class SvcHub(object):
return False

try:
if mp.cpu_count() <= 1:
if mp.cpu_count() <= 1 and not os.environ.get("PRTY_FORCE_MP"):
raise Exception()
except:
self.log("svchub", "only one CPU detected; multiprocessing disabled")

@@ -4,12 +4,11 @@ from __future__ import print_function, unicode_literals
import calendar
import stat
import time
import zlib

from .authsrv import AuthSrv
from .bos import bos
from .sutil import StreamArc, errdesc
from .util import min_ex, sanitize_fn, spack, sunpack, yieldfile
from .util import min_ex, sanitize_fn, spack, sunpack, yieldfile, zlib

if True: # pylint: disable=using-constant-test
from typing import Any, Generator, Optional
@@ -55,6 +54,7 @@ def gen_fdesc(sz: int, crc32: int, z64: bool) -> bytes:

def gen_hdr(
h_pos: Optional[int],
z64: bool,
fn: str,
sz: int,
lastmod: int,
@@ -71,7 +71,6 @@ def gen_hdr(
# appnote 4.5 / zip 3.0 (2008) / unzip 6.0 (2009) says to add z64
# extinfo for values which exceed H, but that becomes an off-by-one
# (can't tell if it was clamped or exactly maxval), make it obvious
z64 = sz >= 0xFFFFFFFF
z64v = [sz, sz] if z64 else []
if h_pos and h_pos >= 0xFFFFFFFF:
# central, also consider ptr to original header
@@ -245,6 +244,7 @@ class StreamZip(StreamArc):

sz = st.st_size
ts = st.st_mtime
h_pos = self.pos

crc = 0
if self.pre_crc:
@@ -253,8 +253,12 @@ class StreamZip(StreamArc):

crc &= 0xFFFFFFFF

h_pos = self.pos
buf = gen_hdr(None, name, sz, ts, self.utf8, crc, self.pre_crc)
# some unzip-programs expect a 64bit data-descriptor
# even if the only 32bit-exceeding value is the offset,
# so force that by placeholdering the filesize too
z64 = h_pos >= 0xFFFFFFFF or sz >= 0xFFFFFFFF

buf = gen_hdr(None, z64, name, sz, ts, self.utf8, crc, self.pre_crc)
yield self._ct(buf)

for buf in yieldfile(src, self.args.iobuf):
@@ -267,8 +271,6 @@ class StreamZip(StreamArc):

self.items.append((name, sz, ts, crc, h_pos))

z64 = sz >= 4 * 1024 * 1024 * 1024

if z64 or not self.pre_crc:
buf = gen_fdesc(sz, crc, z64)
yield self._ct(buf)
@@ -307,7 +309,8 @@ class StreamZip(StreamArc):

cdir_pos = self.pos
for name, sz, ts, crc, h_pos in self.items:
buf = gen_hdr(h_pos, name, sz, ts, self.utf8, crc, self.pre_crc)
z64 = h_pos >= 0xFFFFFFFF or sz >= 0xFFFFFFFF
buf = gen_hdr(h_pos, z64, name, sz, ts, self.utf8, crc, self.pre_crc)
mbuf += self._ct(buf)
if len(mbuf) >= 16384:
yield mbuf

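The StreamZip hunks thread a precomputed `z64` flag into `gen_hdr`: a 64-bit data-descriptor is now forced whenever either the local-header offset or the filesize crosses the 32-bit ceiling, since some unzip programs expect it even when only the offset overflows. The decision itself reduces to:

```python
MAX32 = 0xFFFFFFFF

def needs_zip64(h_pos, sz):
    # force zip64 if either the local-header offset or the
    # filesize no longer fits in an unsigned 32-bit field
    return h_pos >= MAX32 or sz >= MAX32
```

This replaces the old size-only check (`sz >= 4 GiB`), which missed small files stored past the 4 GiB mark of a large archive.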
@@ -151,9 +151,15 @@ class TcpSrv(object):
if just_ll or self.args.ll:
ll_ok.add(ip.split("/")[0])

listening_on = []
for ip, ports in sorted(ok.items()):
for port in sorted(ports):
listening_on.append("%s %s" % (ip, port))

qr1: dict[str, list[int]] = {}
qr2: dict[str, list[int]] = {}
msgs = []
accessible_on = []
title_tab: dict[str, dict[str, int]] = {}
title_vars = [x[1:] for x in self.args.wintitle.split(" ") if x.startswith("$")]
t = "available @ {}://{}:{}/ (\033[33m{}\033[0m)"
@@ -169,6 +175,10 @@ class TcpSrv(object):
):
continue

zs = "%s %s" % (ip, port)
if zs not in accessible_on:
accessible_on.append(zs)

proto = " http"
if self.args.http_only:
pass
@@ -219,6 +229,14 @@ class TcpSrv(object):
else:
print("\n", end="")

for fn, ls in (
(self.args.wr_h_eps, listening_on),
(self.args.wr_h_aon, accessible_on),
):
if fn:
with open(fn, "wb") as f:
f.write(("\n".join(ls)).encode("utf-8"))

if self.args.qr or self.args.qrs:
self.qr = self._qr(qr1, qr2)

@@ -548,7 +566,7 @@ class TcpSrv(object):
ip = None
ips = list(t1) + list(t2)
qri = self.args.qri
if self.args.zm and not qri:
if self.args.zm and not qri and ips:
name = self.args.name + ".local"
t1[name] = next(v for v in (t1 or t2).values())
ips = [name] + ips

@@ -36,7 +36,19 @@ from partftpy.TftpShared import TftpException
from .__init__ import EXE, PY2, TYPE_CHECKING
from .authsrv import VFS
from .bos import bos
from .util import UTC, BytesIO, Daemon, ODict, exclude_dotfiles, min_ex, runhook, undot
from .util import (
FN_EMB,
UTC,
BytesIO,
Daemon,
ODict,
exclude_dotfiles,
min_ex,
runhook,
undot,
vjoin,
vsplit,
)

if True: # pylint: disable=using-constant-test
from typing import Any, Union
@@ -244,16 +256,25 @@ class Tftpd(object):
for srv in srvs:
srv.stop()

def _v2a(self, caller: str, vpath: str, perms: list, *a: Any) -> tuple[VFS, str]:
def _v2a(
self, caller: str, vpath: str, perms: list, *a: Any
) -> tuple[VFS, str, str]:
vpath = vpath.replace("\\", "/").lstrip("/")
if not perms:
perms = [True, True]

debug('%s("%s", %s) %s\033[K\033[0m', caller, vpath, str(a), perms)
vfs, rem = self.asrv.vfs.get(vpath, "*", *perms)
if perms[1] and "*" not in vfs.axs.uread and "wo_up_readme" not in vfs.flags:
zs, fn = vsplit(vpath)
if fn.lower() in FN_EMB:
vpath = vjoin(zs, "_wo_" + fn)
vfs, rem = self.asrv.vfs.get(vpath, "*", *perms)

if not vfs.realpath:
raise Exception("unmapped vfs")
return vfs, vfs.canonical(rem)

return vfs, vpath, vfs.canonical(rem)

def _ls(self, vpath: str, raddress: str, rport: int, force=False) -> Any:
# generate file listing if vpath is dir.txt and return as file object
@@ -331,7 +352,7 @@ class Tftpd(object):
else:
raise Exception("bad mode %s" % (mode,))

vfs, ap = self._v2a("open", vpath, [rd, wr])
vfs, vpath, ap = self._v2a("open", vpath, [rd, wr])
if wr:
if "*" not in vfs.axs.uwrite:
yeet("blocked write; folder not world-writable: /%s" % (vpath,))
@@ -368,7 +389,7 @@ class Tftpd(object):
return open(ap, mode, *a, **ka)

def _mkdir(self, vpath: str, *a) -> None:
vfs, ap = self._v2a("mkdir", vpath, [])
vfs, _, ap = self._v2a("mkdir", vpath, [False, True])
if "*" not in vfs.axs.uwrite:
yeet("blocked mkdir; folder not world-writable: /%s" % (vpath,))

@@ -376,7 +397,7 @@ class Tftpd(object):

def _unlink(self, vpath: str) -> None:
# return bos.unlink(self._v2a("stat", vpath, *a)[1])
vfs, ap = self._v2a("delete", vpath, [True, False, False, True])
vfs, _, ap = self._v2a("delete", vpath, [True, False, False, True])

try:
inf = bos.stat(ap)
@@ -400,7 +421,7 @@ class Tftpd(object):

def _p_exists(self, vpath: str) -> bool:
try:
ap = self._v2a("p.exists", vpath, [False, False])[1]
ap = self._v2a("p.exists", vpath, [False, False])[2]
bos.stat(ap)
return True
except:
@@ -408,7 +429,7 @@ class Tftpd(object):
def _p_isdir(self, vpath: str) -> bool:
try:
st = bos.stat(self._v2a("p.isdir", vpath, [False, False])[1])
st = bos.stat(self._v2a("p.isdir", vpath, [False, False])[2])
ret = stat.S_ISDIR(st.st_mode)
return ret
except:

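The new `_v2a` branch diverts TFTP uploads of embedded-readme filenames into a `_wo_`-prefixed name when the folder is not world-readable, so an anonymous upload cannot change how the folder renders for others. Roughly (the `FN_EMB` contents below are assumed for illustration, not copied from copyparty's `util`):

```python
import posixpath

# assumed contents; copyparty's real FN_EMB may differ
FN_EMB = {"readme.md", "preadme.md", ".prologue.html", ".epilogue.html"}

def divert_embedded(vpath, world_readable):
    """rename uploads which would change how a write-only folder renders"""
    if world_readable:
        return vpath
    folder, fn = posixpath.split(vpath)
    if fn.lower() in FN_EMB:
        return posixpath.join(folder, "_wo_" + fn)
    return vpath
```

The real code additionally re-resolves the renamed vpath through the VFS and honors a `wo_up_readme` volflag to opt out.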
@@ -1,13 +1,15 @@
# coding: utf-8
from __future__ import print_function, unicode_literals

import errno
import os
import stat

from .__init__ import TYPE_CHECKING
from .authsrv import VFS
from .bos import bos
from .th_srv import EXTS_AC, HAVE_WEBP, thumb_path
from .util import Cooldown
from .util import Cooldown, Pebkac

if True: # pylint: disable=using-constant-test
from typing import Optional, Union
@@ -16,6 +18,9 @@ if TYPE_CHECKING:
from .httpsrv import HttpSrv


IOERROR = "reading the file was denied by the server os; either due to filesystem permissions, selinux, apparmor, or similar:\n%r"


class ThumbCli(object):
def __init__(self, hsrv: "HttpSrv") -> None:
self.broker = hsrv.broker
@@ -124,7 +129,7 @@ class ThumbCli(object):

tpath = thumb_path(histpath, rem, mtime, fmt, self.fmt_ffa)
tpaths = [tpath]
if fmt == "w":
if fmt[:1] == "w":
# also check for jpg (maybe webp is unavailable)
tpaths.append(tpath.rsplit(".", 1)[0] + ".jpg")

@@ -157,8 +162,22 @@ class ThumbCli(object):
if abort:
return None

if not bos.path.getsize(os.path.join(ptop, rem)):
return None
ap = os.path.join(ptop, rem)
try:
st = bos.stat(ap)
if not st.st_size or not stat.S_ISREG(st.st_mode):
return None

with open(ap, "rb", 4) as f:
if not f.read(4):
raise Exception()
except OSError as ex:
if ex.errno == errno.ENOENT:
raise Pebkac(404)
else:
raise Pebkac(500, IOERROR % (ex,))
except Exception as ex:
raise Pebkac(500, IOERROR % (ex,))

x = self.broker.ask("thumbsrv.get", ptop, rem, mtime, fmt)
return x.get() # type: ignore

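The ThumbCli change probes the source file before queueing a conversion, so a missing file surfaces as a 404 and an unreadable one (permissions, selinux, apparmor) as a 500, instead of a silently empty thumbnail. The errno mapping, sketched with plain status codes in place of `Pebkac`:

```python
import errno
import os
import tempfile

def probe_status(path):
    """200 if the first bytes are readable, 404 if missing, 500 otherwise"""
    try:
        with open(path, "rb") as f:
            f.read(4)
        return 200
    except OSError as ex:
        return 404 if ex.errno == errno.ENOENT else 500
```

Reading a few bytes (rather than just calling `stat`) is what catches the access-denied class of failures, which stat alone can miss.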
@@ -4,8 +4,10 @@ from __future__ import print_function, unicode_literals
import hashlib
import logging
import os
import re
import shutil
import subprocess as sp
import tempfile
import threading
import time

@@ -18,6 +20,7 @@ from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, au_unpk, ffprobe
from .util import BytesIO # type: ignore
from .util import (
FFMPEG_URL,
VF_CAREFUL,
Cooldown,
Daemon,
afsenc,
@@ -48,6 +51,10 @@ HAVE_WEBP = False

EXTS_TH = set(["jpg", "webp", "png"])
EXTS_AC = set(["opus", "owa", "caf", "mp3"])
EXTS_SPEC_SAFE = set("aif aiff flac mp3 opus wav".split())

PTN_TS = re.compile("^-?[0-9a-f]{8,10}$")


try:
if os.environ.get("PRTY_NO_PIL"):
@@ -163,12 +170,15 @@ class ThumbSrv(object):

self.mutex = threading.Lock()
self.busy: dict[str, list[threading.Condition]] = {}
self.untemp: dict[str, list[str]] = {}
self.ram: dict[str, float] = {}
self.memcond = threading.Condition(self.mutex)
self.stopping = False
self.rm_nullthumbs = True # forget failed conversions on startup
self.nthr = max(1, self.args.th_mt)

self.exts_spec_unsafe = set(self.args.th_spec_cnv.split(","))

self.q: Queue[Optional[tuple[str, str, str, VFS]]] = Queue(self.nthr * 4)
for n in range(self.nthr):
Daemon(self.worker, "thumb-{}-{}".format(n, self.nthr))
@@ -385,8 +395,12 @@ class ThumbSrv(object):
self.log(msg, c)
if getattr(ex, "returncode", 0) != 321:
if fun == funs[-1]:
with open(ttpath, "wb") as _:
pass
try:
with open(ttpath, "wb") as _:
pass
except Exception as ex:
t = "failed to create the file [%s]: %r"
self.log(t % (ttpath, ex), 3)
else:
# ffmpeg may spawn empty files on windows
try:
@@ -399,13 +413,24 @@ class ThumbSrv(object):

try:
wrename(self.log, ttpath, tpath, vn.flags)
except:
except Exception as ex:
if not os.path.exists(tpath):
t = "failed to move [%s] to [%s]: %r"
self.log(t % (ttpath, tpath, ex), 3)
pass

untemp = []
with self.mutex:
subs = self.busy[tpath]
del self.busy[tpath]
self.ram.pop(ttpath, None)
untemp = self.untemp.pop(ttpath, None) or []

for ap in untemp:
try:
wunlink(self.log, ap, VF_CAREFUL)
except:
pass

for x in subs:
with x:
@@ -659,15 +684,43 @@ class ThumbSrv(object):
if "ac" not in ret:
raise Exception("not audio")

fext = abspath.split(".")[-1].lower()

# https://trac.ffmpeg.org/ticket/10797
# expect 1 GiB every 600 seconds when duration is tricky;
# simple filetypes are generally safer so let's special-case those
safe = ("flac", "wav", "aif", "aiff", "opus")
coeff = 1800 if abspath.split(".")[-1].lower() in safe else 600
dur = ret[".dur"][1] if ".dur" in ret else 300
coeff = 1800 if fext in EXTS_SPEC_SAFE else 600
dur = ret[".dur"][1] if ".dur" in ret else 900
need = 0.2 + dur / coeff
self.wait4ram(need, tpath)

infile = abspath
if dur >= 900 or fext in self.exts_spec_unsafe:
with tempfile.NamedTemporaryFile(suffix=".spec.flac", delete=False) as f:
f.write(b"h")
infile = f.name
try:
self.untemp[tpath].append(infile)
except:
self.untemp[tpath] = [infile]

# fmt: off
cmd = [
b"ffmpeg",
b"-nostdin",
b"-v", b"error",
b"-hide_banner",
b"-i", fsenc(abspath),
b"-map", b"0:a:0",
b"-ac", b"1",
b"-ar", b"48000",
b"-sample_fmt", b"s16",
b"-t", b"900",
b"-y", fsenc(infile),
]
# fmt: on
self._run_ff(cmd, vn)

fc = "[0:a:0]aresample=48000{},showspectrumpic=s="
if "3" in fmt:
fc += "1280x1024,crop=1420:1056:70:48[o]"
@@ -687,7 +740,7 @@ class ThumbSrv(object):
b"-nostdin",
b"-v", b"error",
b"-hide_banner",
b"-i", fsenc(abspath),
b"-i", fsenc(infile),
b"-filter_complex", fc.encode("utf-8"),
b"-map", b"[o]",
b"-frames:v", b"1",
@@ -991,6 +1044,8 @@ class ThumbSrv(object):
# thumb file
try:
b64, ts, ext = f.split(".")
if len(ts) > 8 and PTN_TS.match(ts):
ts = "yeahokay"
if len(b64) != 24 or len(ts) != 8 or ext not in exts:
raise Exception()
except:

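The spectrogram path budgets RAM from track duration before invoking ffmpeg: roughly 1 GiB per 600 s when the demuxer's duration estimate may be wrong (ffmpeg ticket 10797), and per 1800 s for the simple formats in `EXTS_SPEC_SAFE`. The estimate as a standalone formula:

```python
EXTS_SPEC_SAFE = set("aif aiff flac mp3 opus wav".split())

def spec_ram_gib(fext, dur_sec):
    # worst-case ffmpeg memory estimate for one spectrogram, in GiB;
    # untrusted container formats get the more pessimistic divisor
    coeff = 1800 if fext in EXTS_SPEC_SAFE else 600
    return 0.2 + dur_sec / coeff
```

Long or unsafe inputs are additionally transcoded to a bounded 900 s mono flac first, which caps the cost of the actual `showspectrumpic` run.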
@@ -134,9 +134,9 @@ class U2idx(object):
assert sqlite3 # type: ignore # !rm

ptop = vn.realpath
histpath = self.asrv.vfs.histtab.get(ptop)
histpath = self.asrv.vfs.dbpaths.get(ptop)
if not histpath:
self.log("no histpath for %r" % (ptop,))
self.log("no dbpath for %r" % (ptop,))
return None

db_path = os.path.join(histpath, "up2k.db")

@@ -2,7 +2,6 @@
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import errno
|
||||
import gzip
|
||||
import hashlib
|
||||
import json
|
||||
import math
|
||||
@@ -42,6 +41,7 @@ from .util import (
|
||||
fsenc,
|
||||
gen_filekey,
|
||||
gen_filekey_dbg,
|
||||
gzip,
|
||||
hidedir,
|
||||
humansize,
|
||||
min_ex,
|
||||
@@ -94,7 +94,7 @@ VF_AFFECTS_INDEXING = set(zsg.split(" "))
|
||||
|
||||
SBUSY = "cannot receive uploads right now;\nserver busy with %s.\nPlease wait; the client will retry..."
|
||||
|
||||
HINT_HISTPATH = "you could try moving the database to another location (preferably an SSD or NVME drive) using either the --hist argument (global option for all volumes), or the hist volflag (just for this volume)"
|
||||
HINT_HISTPATH = "you could try moving the database to another location (preferably an SSD or NVME drive) using either the --hist argument (global option for all volumes), or the hist volflag (just for this volume), or, if you want to keep the thumbnails in the current location and only move the database itself, then use --dbpath or volflag dbpath"
|
||||
|
||||
|
||||
NULLSTAT = os.stat_result((0, -1, -1, 0, 0, 0, 0, 0, 0, 0))
|
||||
@@ -1096,9 +1096,9 @@ class Up2k(object):
        self, ptop: str, flags: dict[str, Any]
    ) -> Optional[tuple["sqlite3.Cursor", str]]:
        """mutex(main,reg) me"""
        histpath = self.vfs.histtab.get(ptop)
        histpath = self.vfs.dbpaths.get(ptop)
        if not histpath:
            self.log("no histpath for %r" % (ptop,))
            self.log("no dbpath for %r" % (ptop,))
            return None

        db_path = os.path.join(histpath, "up2k.db")
@@ -1119,7 +1119,7 @@ class Up2k(object):
        ft = "\033[0;32m{}{:.0}"
        ff = "\033[0;35m{}{:.0}"
        fv = "\033[0;36m{}:\033[90m{}"
        zs = "ext_th_d html_head mv_re_r mv_re_t rm_re_r rm_re_t srch_re_dots srch_re_nodot"
        zs = "ext_th_d html_head mv_re_r mv_re_t rm_re_r rm_re_t srch_re_dots srch_re_nodot zipmax zipmaxn_v zipmaxs_v"
        fx = set(zs.split())
        fd = vf_bmap()
        fd.update(vf_cmap())
@@ -1344,12 +1344,15 @@ class Up2k(object):
        ]
        excl += [absreal(x) for x in excl]
        excl += list(self.vfs.histtab.values())
        excl += list(self.vfs.dbpaths.values())
        if WINDOWS:
            excl = [x.replace("/", "\\") for x in excl]
        else:
            # ~/.wine/dosdevices/z:/ and such
            excl.extend(("/dev", "/proc", "/run", "/sys"))

        excl = list({k: 1 for k in excl})

        if self.args.re_dirsz:
            db.c.execute("delete from ds")
            db.n += 1
@@ -2918,7 +2921,6 @@ class Up2k(object):
        if ptop not in self.registry:
            raise Pebkac(410, "location unavailable")

        cj["name"] = sanitize_fn(cj["name"], "")
        cj["poke"] = now = self.db_act = self.vol_act[ptop] = time.time()
        wark = dwark = self._get_wark(cj)
        job = None
@@ -2954,9 +2956,14 @@ class Up2k(object):
            self.salt, cj["size"], cj["lmod"], cj["prel"], cj["name"]
        )

        if vfs.flags.get("up_ts", "") == "fu" or not cj["lmod"]:
        zi = cj["lmod"]
        bad_mt = zi <= 0 or zi > 0xAAAAAAAA
        if bad_mt or vfs.flags.get("up_ts", "") == "fu":
            # force upload time rather than last-modified
            cj["lmod"] = int(time.time())
            if zi and bad_mt:
                t = "ignoring impossible last-modified time from client: %s"
                self.log(t % (zi,), 6)
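The hunk above replaces the old `not cj["lmod"]` test with an explicit range check before trusting a client-supplied last-modified time. A minimal standalone sketch of that validation logic (the function name and signature are illustrative, not copyparty's API):

```python
import time

# upper bound used in the hunk above: 0xAAAAAAAA (~year 2060); anything
# outside (0, 0xAAAAAAAA] is treated as an impossible client-supplied mtime
MAX_MT = 0xAAAAAAAA

def pick_mtime(client_lmod: int, force_upload_time: bool = False) -> int:
    """return a sane last-modified timestamp for an incoming upload"""
    bad_mt = client_lmod <= 0 or client_lmod > MAX_MT
    if bad_mt or force_upload_time:
        # fall back to the time of upload, as the `up_ts=fu` volflag does
        return int(time.time())
    return client_lmod
```

`force_upload_time` stands in for the `up_ts == "fu"` volflag check; a plausible epoch value passes through untouched, while zero, negative, or far-future values are replaced by the current time.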
        alts: list[tuple[int, int, dict[str, Any], "sqlite3.Cursor", str, str]] = []
        for ptop, cur in vols:
@@ -3231,6 +3238,7 @@ class Up2k(object):
            job["ptop"] = vfs.realpath
            job["vtop"] = vfs.vpath
            job["prel"] = rem
            job["name"] = sanitize_fn(job["name"], "")
            if zvfs.vpath != vfs.vpath:
                # print(json.dumps(job, sort_keys=True, indent=4))
                job["hash"] = cj["hash"]
@@ -3421,6 +3429,7 @@ class Up2k(object):
        rm: bool = False,
        lmod: float = 0,
        fsrc: Optional[str] = None,
        is_mv: bool = False,
    ) -> None:
        if src == dst or (fsrc and fsrc == dst):
            t = "symlinking a file to itself?? orig(%s) fsrc(%s) link(%s)"
@@ -3437,7 +3446,7 @@ class Up2k(object):

        linked = False
        try:
            if not flags.get("dedup"):
            if not is_mv and not flags.get("dedup"):
                raise Exception("dedup is disabled in config")

            lsrc = src
@@ -3703,8 +3712,9 @@ class Up2k(object):
            if self.idx_wark(vflags, *z2):
                del self.registry[ptop][wark]
            else:
                for k in "host tnam busy sprs poke t0c".split():
                for k in "host tnam busy sprs poke".split():
                    del job[k]
                job.pop("t0c", None)
                job["t0"] = int(job["t0"])
                job["hash"] = []
                job["done"] = 1
@@ -4596,7 +4606,7 @@ class Up2k(object):
                dlink = bos.readlink(sabs)
                dlink = os.path.join(os.path.dirname(sabs), dlink)
                dlink = bos.path.abspath(dlink)
                self._symlink(dlink, dabs, dvn.flags, lmod=ftime)
                self._symlink(dlink, dabs, dvn.flags, lmod=ftime, is_mv=True)
                wunlink(self.log, sabs, svn.flags)
            else:
                atomic_move(self.log, sabs, dabs, svn.flags)
@@ -4815,7 +4825,7 @@ class Up2k(object):
            flags = self.flags.get(ptop) or {}
            atomic_move(self.log, sabs, slabs, flags)
            bos.utime(slabs, (int(time.time()), int(mt)), False)
            self._symlink(slabs, sabs, flags, False)
            self._symlink(slabs, sabs, flags, False, is_mv=True)
            full[slabs] = (ptop, rem)
            sabs = slabs

@@ -4874,7 +4884,9 @@ class Up2k(object):
            # (for example a volume with symlinked dupes but no --dedup);
            # fsrc=sabs is then a source that currently resolves to copy

            self._symlink(dabs, alink, flags, False, lmod=lmod or 0, fsrc=sabs)
            self._symlink(
                dabs, alink, flags, False, lmod=lmod or 0, fsrc=sabs, is_mv=True
            )

        return len(full) + len(links)

@@ -4988,6 +5000,7 @@ class Up2k(object):
            job["ptop"] = vfs.realpath
            job["vtop"] = vfs.vpath
            job["prel"] = rem
            job["name"] = sanitize_fn(job["name"], "")
            if zvfs.vpath != vfs.vpath:
                self.log("xbu reloc2:%d..." % (depth,), 6)
                return self._handle_json(job, depth + 1)
@@ -5092,7 +5105,7 @@ class Up2k(object):

    def _snap_reg(self, ptop: str, reg: dict[str, dict[str, Any]]) -> None:
        now = time.time()
        histpath = self.vfs.histtab.get(ptop)
        histpath = self.vfs.dbpaths.get(ptop)
        if not histpath:
            return

@@ -31,6 +31,17 @@ from collections import Counter
from ipaddress import IPv4Address, IPv4Network, IPv6Address, IPv6Network
from queue import Queue

try:
    from zlib_ng import gzip_ng as gzip
    from zlib_ng import zlib_ng as zlib

    sys.modules["gzip"] = gzip
    # sys.modules["zlib"] = zlib
    # `- somehow makes tarfile 3% slower with default malloc, and barely faster with mimalloc
except:
    import gzip
    import zlib

from .__init__ import (
    ANYWIN,
    EXE,
@@ -103,8 +114,14 @@ IP6ALL = "0:0:0:0:0:0:0:0"


try:
    import ctypes
    import fcntl

    HAVE_FCNTL = True
except:
    HAVE_FCNTL = False

try:
    import ctypes
    import termios
except:
    pass
@@ -234,6 +251,9 @@ SYMTIME = PY36 and os.utime in os.supports_follow_symlinks

META_NOBOTS = '<meta name="robots" content="noindex, nofollow">\n'

# smart enough to understand javascript while also ignoring rel="nofollow"
BAD_BOTS = r"Barkrowler|bingbot|BLEXBot|Googlebot|GoogleOther|GPTBot|PetalBot|SeekportBot|SemrushBot|YandexBot"

FFMPEG_URL = "https://www.gyan.dev/ffmpeg/builds/ffmpeg-git-full.7z"

URL_PRJ = "https://github.com/9001/copyparty"
@@ -448,8 +468,12 @@ UNHUMANIZE_UNITS = {

VF_CAREFUL = {"mv_re_t": 5, "rm_re_t": 5, "mv_re_r": 0.1, "rm_re_r": 0.1}

FN_EMB = set([".prologue.html", ".epilogue.html", "readme.md", "preadme.md"])


def read_ram() -> tuple[float, float]:
    # NOTE: apparently no need to consider /sys/fs/cgroup/memory.max
    # (cgroups2) since the limit is synced to /proc/meminfo
    a = b = 0
    try:
        with open("/proc/meminfo", "rb", 0x10000) as f:
@@ -594,6 +618,38 @@ except Exception as ex:
    print("using fallback base64 codec due to %r" % (ex,))


class NotUTF8(Exception):
    pass


def read_utf8(log: Optional["NamedLogger"], ap: Union[str, bytes], strict: bool) -> str:
    with open(ap, "rb") as f:
        buf = f.read()

    try:
        return buf.decode("utf-8", "strict")
    except UnicodeDecodeError as ex:
        eo = ex.start
        eb = buf[eo : eo + 1]

    if not strict:
        t = "WARNING: The file [%s] is not using the UTF-8 character encoding; some characters in the file will be skipped/ignored. The first unreadable character was byte %r at offset %d. Please convert this file to UTF-8 by opening the file in your text-editor and saving it as UTF-8."
        t = t % (ap, eb, eo)
        if log:
            log(t, 3)
        else:
            print(t)
        return buf.decode("utf-8", "replace")

    t = "ERROR: The file [%s] is not using the UTF-8 character encoding, and cannot be loaded. The first unreadable character was byte %r at offset %d. Please convert this file to UTF-8 by opening the file in your text-editor and saving it as UTF-8."
    t = t % (ap, eb, eo)
    if log:
        log(t, 3)
    else:
        print(t)
    raise NotUTF8(t)

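The new `read_utf8` helper reports the first offending byte and its offset, then either decodes leniently or raises `NotUTF8`. A quick standalone sketch of the underlying decode behavior it relies on (plain stdlib, no copyparty imports):

```python
import os
import tempfile

# write a file containing a latin-1 byte (0xE6, "æ"), which is invalid UTF-8
fd, ap = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"hello \xe6 world")

with open(ap, "rb") as f:
    buf = f.read()

try:
    buf.decode("utf-8", "strict")
    strict_ok = True
except UnicodeDecodeError as ex:
    strict_ok = False
    # ex.start is the offset reported in read_utf8's warning/error message
    print("first bad byte %r at offset %d" % (buf[ex.start : ex.start + 1], ex.start))

# lenient mode: decode with "replace", as read_utf8 does when strict=False;
# the bad byte becomes U+FFFD instead of aborting the load
lenient = buf.decode("utf-8", "replace")
os.unlink(ap)
```

This mirrors the two code paths above: `strict=True` surfaces the exact byte and offset before raising, while `strict=False` keeps going with replacement characters.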
class Daemon(threading.Thread):
    def __init__(
        self,
@@ -1419,8 +1475,6 @@ def stackmon(fp: str, ival: float, suffix: str) -> None:
        buf = st.encode("utf-8", "replace")

        if fp.endswith(".gz"):
            import gzip

            # 2459b 2304b 2241b 2202b 2194b 2191b lv3..8
            # 0.06s 0.08s 0.11s 0.13s 0.16s 0.19s
            buf = gzip.compress(buf, compresslevel=6)
@@ -1500,6 +1554,12 @@ def vol_san(vols: list["VFS"], txt: bytes) -> bytes:
        txt = txt.replace(bap.replace(b"\\", b"\\\\"), bvp)
        txt = txt.replace(bhp.replace(b"\\", b"\\\\"), bvph)

        if vol.histpath != vol.dbpath:
            bdp = vol.dbpath.encode("utf-8")
            bdph = b"$db(/" + bvp + b")"
            txt = txt.replace(bdp, bdph)
            txt = txt.replace(bdp.replace(b"\\", b"\\\\"), bdph)

    if txt != txt0:
        txt += b"\r\nNOTE: filepaths sanitized; see serverlog for correct values"

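The `vol_san` addition masks the new on-disk dbpath in error text the same way histpaths are already masked, covering both plain and backslash-escaped spellings. A simplified standalone sketch of that byte-level substitution (the paths and the `san` helper are hypothetical; the real function iterates copyparty's VFS volumes):

```python
def san(txt: bytes, dbpath: str, vpath: bytes) -> bytes:
    # replace the absolute db path with a "$db(/volume)" token, so
    # server-side filesystem layout does not leak into client-visible errors
    bdp = dbpath.encode("utf-8")
    bdph = b"$db(/" + vpath + b")"
    txt = txt.replace(bdp, bdph)
    # also catch the backslash-escaped spelling (e.g. inside repr/json output)
    txt = txt.replace(bdp.replace(b"\\", b"\\\\"), bdph)
    return txt

msg = b"sqlite error in /srv/db/music/up2k.db"
print(san(msg, "/srv/db/music", b"music"))
```

The masked output keeps enough context (the volume name) for the user to report the problem, while the full path stays in the server log only.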
@@ -3888,8 +3948,75 @@ def hidedir(dp) -> None:
        pass


_flocks = {}


def _lock_file_noop(ap: str) -> bool:
    return True


def _lock_file_ioctl(ap: str) -> bool:
    assert fcntl  # type: ignore # !rm
    try:
        fd = _flocks.pop(ap)
        os.close(fd)
    except:
        pass

    fd = os.open(ap, os.O_RDWR | os.O_CREAT, 438)
    # NOTE: the fcntl.lockf identifier is (pid,node);
    # the lock will be dropped if os.close(os.open(ap))
    # is performed anywhere else in this thread

    try:
        fcntl.lockf(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
        _flocks[ap] = fd
        return True
    except Exception as ex:
        eno = getattr(ex, "errno", -1)
        try:
            os.close(fd)
        except:
            pass
        if eno in (errno.EAGAIN, errno.EACCES):
            return False
        print("WARNING: unexpected errno %d from fcntl.lockf; %r" % (eno, ex))
        return True


def _lock_file_windows(ap: str) -> bool:
    try:
        import msvcrt

        try:
            fd = _flocks.pop(ap)
            os.close(fd)
        except:
            pass

        fd = os.open(ap, os.O_RDWR | os.O_CREAT, 438)
        msvcrt.locking(fd, msvcrt.LK_NBLCK, 1)
        return True
    except Exception as ex:
        eno = getattr(ex, "errno", -1)
        if eno == errno.EACCES:
            return False
        print("WARNING: unexpected errno %d from msvcrt.locking; %r" % (eno, ex))
        return True


if os.environ.get("PRTY_NO_DB_LOCK"):
    lock_file = _lock_file_noop
elif ANYWIN:
    lock_file = _lock_file_windows
elif HAVE_FCNTL:
    lock_file = _lock_file_ioctl
else:
    lock_file = _lock_file_noop

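A minimal usage sketch of the POSIX variant above: take an exclusive non-blocking lock on a sentinel file (POSIX-only; on Windows the dispatch falls back to `msvcrt.locking`). Note that `fcntl` record locks are owned per-process, so a second attempt from the *same* process still succeeds; the lock only guards against *other* processes opening the same database. The `try_lock` name below is illustrative:

```python
import errno
import fcntl
import os
import tempfile

def try_lock(ap: str):
    """return the locked fd on success, None if another process holds the lock"""
    fd = os.open(ap, os.O_RDWR | os.O_CREAT, 438)
    try:
        # LOCK_NB: fail immediately instead of blocking until release
        fcntl.lockf(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
        return fd
    except OSError as ex:
        os.close(fd)
        if ex.errno in (errno.EAGAIN, errno.EACCES):
            return None  # held by another process
        raise

ap = os.path.join(tempfile.mkdtemp(), "lock")
fd = try_lock(ap)
print("locked" if fd is not None else "held by another process")
```

The `_flocks` dict in the real code exists exactly because of the per-process ownership: it keeps the fd open for the process lifetime so the lock is not silently dropped by an unrelated `os.close`.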
try:
    if sys.version_info < (3, 10):
    if sys.version_info < (3, 10) or os.environ.get("PRTY_NO_IMPRESO"):
        # py3.8 doesn't have .files
        # py3.9 has broken .is_file
        raise ImportError()
@@ -4021,9 +4148,22 @@ class WrongPostKey(Pebkac):
        self.datagen = datagen


_: Any = (mp, BytesIO, quote, unquote, SQLITE_VER, JINJA_VER, PYFTPD_VER, PARTFTPY_VER)
_: Any = (
    gzip,
    mp,
    zlib,
    BytesIO,
    quote,
    unquote,
    SQLITE_VER,
    JINJA_VER,
    PYFTPD_VER,
    PARTFTPY_VER,
)
__all__ = [
    "gzip",
    "mp",
    "zlib",
    "BytesIO",
    "quote",
    "unquote",

@@ -1151,10 +1151,10 @@ html.y #widget.open {
	background: #fff;
	background: var(--bg-u3);
}
#wfs, #wfm, #wzip, #wnp {
#wfs, #wfm, #wzip, #wnp, #wm3u {
	display: none;
}
#wfs, #wzip, #wnp {
#wfs, #wzip, #wnp, #wm3u {
	margin-right: .2em;
	padding-right: .2em;
	border: 1px solid var(--bg-u5);
@@ -1175,6 +1175,7 @@ html.y #widget.open {
	line-height: 1em;
}
#wtoggle.sel #wzip,
#wtoggle.m3u #wm3u,
#wtoggle.np #wnp {
	display: inline-block;
}
@@ -1183,6 +1184,7 @@ html.y #widget.open {
}
#wfm a,
#wnp a,
#wm3u a,
#wzip a {
	font-size: .5em;
	padding: 0 .3em;
@@ -1190,6 +1192,10 @@ html.y #widget.open {
	position: relative;
	display: inline-block;
}
#wm3u a {
	margin: -.2em .1em;
	font-size: .45em;
}
#wfs {
	font-size: .36em;
	text-align: right;
@@ -1198,6 +1204,7 @@ html.y #widget.open {
	border-width: 0 .25em 0 0;
}
#wfm span,
#wm3u span,
#wnp span {
	font-size: .6em;
	display: block;
@@ -1205,6 +1212,10 @@ html.y #widget.open {
#wnp span {
	font-size: .7em;
}
#wm3u span {
	font-size: .77em;
	padding-top: .2em;
}
#wfm a:not(.en) {
	opacity: .3;
	color: var(--fm-off);

@@ -144,6 +144,8 @@ var Ls = {
	"wt_seldl": "download selection as separate files$NHotkey: Y",
	"wt_npirc": "copy irc-formatted track info",
	"wt_nptxt": "copy plaintext track info",
	"wt_m3ua": "add to m3u playlist (click <code>📻copy</code> later)",
	"wt_m3uc": "copy m3u playlist to clipboard",
	"wt_grid": "toggle grid / list view$NHotkey: G",
	"wt_prev": "previous track$NHotkey: J",
	"wt_play": "play / pause$NHotkey: P",
@@ -271,6 +273,7 @@ var Ls = {
	"ml_eq": "audio equalizer",
	"ml_drc": "dynamic range compressor",

	"mt_loop": "loop/repeat one song\">🔁",
	"mt_shuf": "shuffle the songs in each folder\">🔀",
	"mt_aplay": "autoplay if there is a song-ID in the link you clicked to access the server$N$Ndisabling this will also stop the page URL from being updated with song-IDs when playing music, to prevent autoplay if these settings are lost but the URL remains\">a▶",
	"mt_preload": "start loading the next song near the end for gapless playback\">preload",
@@ -279,6 +282,7 @@ var Ls = {
	"mt_fau": "on phones, prevent music from stopping if the next song doesn't preload fast enough (can make tags display glitchy)\">☕️",
	"mt_waves": "waveform seekbar:$Nshow audio amplitude in the scrubber\">~s",
	"mt_npclip": "show buttons for clipboarding the currently playing song\">/np",
	"mt_m3u_c": "show buttons for clipboarding the$Nselected songs as m3u8 playlist entries\">📻",
	"mt_octl": "os integration (media hotkeys / osd)\">os-ctl",
	"mt_oseek": "allow seeking through os integration$N$Nnote: on some devices (iPhones),$Nthis replaces the next-song button\">seek",
	"mt_oscv": "show album cover in osd\">art",
@@ -304,6 +308,7 @@ var Ls = {

	"mb_play": "play",
	"mm_hashplay": "play this audio file?",
	"mm_m3u": "press <code>Enter/OK</code> to Play\npress <code>ESC/Cancel</code> to Edit",
	"mp_breq": "need firefox 82+ or chrome 73+ or iOS 15+",
	"mm_bload": "now loading...",
	"mm_bconv": "converting to {0}, please wait...",
@@ -316,6 +321,7 @@ var Ls = {
	"mm_eunk": "Unknown Errol",
	"mm_e404": "Could not play audio; error 404: File not found.",
	"mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
	"mm_e500": "Could not play audio; error 500: Check server logs.",
	"mm_e5xx": "Could not play audio; server error ",
	"mm_nof": "not finding any more audio files nearby",
	"mm_prescan": "Looking for music to play next...",
@@ -330,6 +336,7 @@ var Ls = {
	"f_bigtxt": "this file is {0} MiB large -- really view as text?",
	"fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
	"fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
	"f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",

	"f_dls": 'the file links in the current folder have\nbeen changed into download links',

@@ -432,6 +439,10 @@ var Ls = {
	"tvt_sel": "select file ( for cut / copy / delete / ... )$NHotkey: S\">sel",
	"tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",

	"m3u_add1": "song added to m3u playlist",
	"m3u_addn": "{0} songs added to m3u playlist",
	"m3u_clip": "m3u playlist now copied to clipboard\n\nyou should create a new textfile named something.m3u and paste the playlist in that document; this will make it playable",

	"gt_vau": "don't show videos, just play the audio\">🎧",
	"gt_msel": "enable file selection; ctrl-click a file to override$N$N<em>when active: doubleclick a file / folder to open it</em>$N$NHotkey: S\">multiselect",
	"gt_crop": "center-crop thumbnails\">crop",
@@ -542,6 +553,7 @@ var Ls = {
	"u_enoow": "overwrite will not work here; need Delete-permission",
	"u_badf": 'These {0} files (of {1} total) were skipped, possibly due to filesystem permissions:\n\n',
	"u_blankf": 'These {0} files (of {1} total) are blank / empty; upload them anyways?\n\n',
	"u_applef": 'These {0} files (of {1} total) are probably undesirable;\nPress <code>OK/Enter</code> to SKIP the following files,\nPress <code>Cancel/ESC</code> to NOT exclude, and UPLOAD those as well:\n\n',
	"u_just1": '\nMaybe it works better if you select just one file',
	"u_ff_many": "if you're using <b>Linux / MacOS / Android,</b> then this amount of files <a href=\"https://bugzilla.mozilla.org/show_bug.cgi?id=1790500\" target=\"_blank\"><em>may</em> crash Firefox!</a>\nif that happens, please try again (or use Chrome).",
	"u_up_life": "This upload will be deleted from the server\n{0} after it completes",
@@ -746,6 +758,8 @@ var Ls = {
	"wt_seldl": "last ned de valgte filene$NSnarvei: Y",
	"wt_npirc": "kopiér sang-info (irc-formatert)",
	"wt_nptxt": "kopiér sang-info",
	"wt_m3ua": "legg til sang i m3u-spilleliste$N(husk å klikke på <code>📻copy</code> senere)",
	"wt_m3uc": "kopiér m3u-spillelisten til utklippstavlen",
	"wt_grid": "bytt mellom ikoner og listevisning$NSnarvei: G",
	"wt_prev": "forrige sang$NSnarvei: J",
	"wt_play": "play / pause$NSnarvei: P",
@@ -873,6 +887,7 @@ var Ls = {
	"ml_eq": "audio equalizer (tonejustering)",
	"ml_drc": "compressor (volum-utjevning)",

	"mt_loop": "spill den samme sangen om og om igjen\">🔁",
	"mt_shuf": "sangene i hver mappe$Nspilles i tilfeldig rekkefølge\">🔀",
	"mt_aplay": "forsøk å starte avspilling hvis linken du klikket på for å åpne nettsiden inneholder en sang-ID$N$Nhvis denne deaktiveres så vil heller ikke nettside-URLen bli oppdatert med sang-ID'er når musikk spilles, i tilfelle innstillingene skulle gå tapt og nettsiden lastes på ny\">a▶",
	"mt_preload": "hent ned litt av neste sang i forkant,$Nslik at pausen i overgangen blir mindre\">forles",
@@ -881,6 +896,7 @@ var Ls = {
	"mt_fau": "for telefoner: forhindre at avspilling stopper hvis nettet er for tregt til å laste neste sang i tide. Hvis påskrudd, kan forårsake at sang-info ikke vises korrekt i OS'et\">☕️",
	"mt_waves": "waveform seekbar:$Nvis volumkurve i avspillingsfeltet\">~s",
	"mt_npclip": "vis knapper for å kopiere info om sangen du hører på\">/np",
	"mt_m3u_c": "vis knapper for å kopiere de valgte$Nsangene som innslag i en m3u8 spilleliste\">📻",
	"mt_octl": "integrering med operativsystemet (fjernkontroll, info-skjerm)\">os-ctl",
	"mt_oseek": "tillat spoling med fjernkontroll$N$Nmerk: på noen enheter (iPhones) så vil$Ndette erstatte knappen for neste sang\">spoling",
	"mt_oscv": "vis album-cover på infoskjermen\">bilde",
@@ -906,6 +922,7 @@ var Ls = {

	"mb_play": "lytt",
	"mm_hashplay": "spill denne sangen?",
	"mm_m3u": "trykk <code>Enter/OK</code> for å spille\ntrykk <code>ESC/Avbryt</code> for å redigere",
	"mp_breq": "krever firefox 82+, chrome 73+, eller iOS 15+",
	"mm_bload": "laster inn...",
	"mm_bconv": "konverterer til {0}, vent litt...",
@@ -918,6 +935,7 @@ var Ls = {
	"mm_eunk": "Ukjent feil",
	"mm_e404": "Avspilling feilet: Fil ikke funnet.",
	"mm_e403": "Avspilling feilet: Tilgang nektet.\n\nKanskje du ble logget ut?\nPrøv å trykk F5 for å laste siden på nytt.",
	"mm_e500": "Avspilling feilet: Rusk i maskineriet, sjekk serverloggen.",
	"mm_e5xx": "Avspilling feilet: ",
	"mm_nof": "finner ikke flere sanger i nærheten",
	"mm_prescan": "Leter etter neste sang...",
@@ -932,6 +950,7 @@ var Ls = {
	"f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
	"fbd_more": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_more">vis {2}</a> eller <a href="#" id="bd_all">vis alle</a></div>',
	"fbd_all": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_all">vis alle</a></div>',
	"f_anota": "kun {0} av totalt {1} elementer ble markert;\nfor å velge alt må du bla til bunnen av mappen først",

	"f_dls": 'linkene i denne mappen er nå\nomgjort til nedlastningsknapper',

@@ -1034,6 +1053,10 @@ var Ls = {
	"tvt_sel": "markér filen ( for utklipp / sletting / ... )$NSnarvei: S\">merk",
	"tvt_edit": "redigér filen$NSnarvei: E\">✏️ endre",

	"m3u_add1": "sangen ble lagt til i m3u-spillelisten",
	"m3u_addn": "{0} sanger ble lagt til i m3u-spillelisten",
	"m3u_clip": "m3u-spillelisten ble kopiert til utklippstavlen\n\nneste steg er å opprette et tekstdokument med filnavn som slutter på <code>.m3u</code> og lime inn spillelisten der",

	"gt_vau": "ikke vis videofiler, bare spill lyden\">🎧",
	"gt_msel": "markér filer istedenfor å åpne dem; ctrl-klikk filer for å overstyre$N$N<em>når aktiv: dobbelklikk en fil / mappe for å åpne</em>$N$NSnarvei: S\">markering",
	"gt_crop": "beskjær ikonene så de passer bedre\">✂",
@@ -1144,6 +1167,7 @@ var Ls = {
	"u_enoow": "kan ikke overskrive filer her (Delete-rettigheten er nødvendig)",
	"u_badf": 'Disse {0} filene (av totalt {1}) kan ikke leses, kanskje pga rettighetsproblemer i filsystemet på datamaskinen din:\n\n',
	"u_blankf": 'Disse {0} filene (av totalt {1}) er blanke / uten innhold; ønsker du å laste dem opp uansett?\n\n',
	"u_applef": 'Disse {0} filene (av totalt {1}) er antagelig uønskede;\nTrykk <code>OK/Enter</code> for å HOPPE OVER disse filene,\nTrykk <code>Avbryt/ESC</code> for å LASTE OPP disse filene også:\n\n',
	"u_just1": '\nFunker kanskje bedre hvis du bare tar én fil om gangen',
	"u_ff_many": 'Hvis du bruker <b>Linux / MacOS / Android,</b> så kan dette antallet filer<br /><a href="https://bugzilla.mozilla.org/show_bug.cgi?id=1790500" target="_blank"><em>kanskje</em> krasje Firefox!</a> Hvis det skjer, så prøv igjen (eller bruk Chrome).',
	"u_up_life": "Filene slettes fra serveren {0}\netter at opplastningen er fullført",
@@ -1475,6 +1499,7 @@ var Ls = {
	"ml_eq": "音频均衡器",
	"ml_drc": "动态范围压缩器",

	"mt_loop": "循环播放当前的歌曲\">🔁", //m
	"mt_shuf": "在每个文件夹中随机播放歌曲\">🔀",
	"mt_aplay": "如果链接中有歌曲 ID,则自动播放,禁用此选项将停止在播放音乐时更新页面 URL 中的歌曲 ID,以防止在设置丢失但 URL 保留时自动播放\">自动播放▶",
	"mt_preload": "在歌曲快结束时开始加载下一首歌,以实现无缝播放\">预加载",
@@ -1520,6 +1545,7 @@ var Ls = {
	"mm_eunk": "未知错误",
	"mm_e404": "无法播放音频;错误 404:文件未找到。",
	"mm_e403": "无法播放音频;错误 403:访问被拒绝。\n\n尝试按 F5 重新加载,也许你已被注销",
	"mm_e500": "无法播放音频;错误 500:检查服务器日志。", //m
	"mm_e5xx": "无法播放音频;服务器错误",
	"mm_nof": "附近找不到更多音频文件",
	"mm_prescan": "正在寻找下一首音乐...",
@@ -1746,6 +1772,7 @@ var Ls = {
	"u_enoow": "无法覆盖此处的文件;需要删除权限", //m
	"u_badf": '这些 {0} 个文件(共 {1} 个)被跳过,可能是由于文件系统权限:\n\n',
	"u_blankf": '这些 {0} 个文件(共 {1} 个)是空白的;是否仍然上传?\n\n',
	"u_applef": "这些 {0} 个文件(共 {1} 个)可能是不需要的;\n按 <code>确定/Enter</code> 跳过以下文件,\n按 <code>取消/ESC</code> 取消排除,并上传这些文件:\n\n", //m
	"u_just1": '\n也许如果你只选择一个文件会更好',
	"u_ff_many": "如果你使用的是 <b>Linux / MacOS / Android,</b> 那么这个文件数量 <a href=\"https://bugzilla.mozilla.org/show_bug.cgi?id=1790500\" target=\"_blank\"><em>可能</em> 崩溃 Firefox!</a>\n如果发生这种情况,请再试一次(或使用 Chrome)。",
	"u_up_life": "此上传将在 {0} 后从服务器删除",
@@ -1880,6 +1907,9 @@ ebi('widget').innerHTML = (
	'</span><span id="wnp"><a' +
	' href="#" id="npirc" tt="' + L.wt_npirc + '">📋<span>irc</span></a><a' +
	' href="#" id="nptxt" tt="' + L.wt_nptxt + '">📋<span>txt</span></a>' +
	'</span><span id="wm3u"><a' +
	' href="#" id="m3ua" tt="' + L.wt_m3ua + '">📻<span>add</span></a><a' +
	' href="#" id="m3uc" tt="' + L.wt_m3uc + '">📻<span>copy</span></a>' +
	'</span><a' +
	' href="#" id="wtgrid" tt="' + L.wt_grid + '">田</a><a' +
	' href="#" id="wtico">♫</a>' +
@@ -2286,6 +2316,7 @@ var mpl = (function () {

	ebi('op_player').innerHTML = (
		'<div><h3>' + L.cl_opts + '</h3><div>' +
		'<a href="#" class="tgl btn" id="au_loop" tt="' + L.mt_loop + '</a>' +
		'<a href="#" class="tgl btn" id="au_shuf" tt="' + L.mt_shuf + '</a>' +
		'<a href="#" class="tgl btn" id="au_aplay" tt="' + L.mt_aplay + '</a>' +
		'<a href="#" class="tgl btn" id="au_preload" tt="' + L.mt_preload + '</a>' +
@@ -2294,6 +2325,7 @@ var mpl = (function () {
		'<a href="#" class="tgl btn" id="au_fau" tt="' + L.mt_fau + '</a>' +
		'<a href="#" class="tgl btn" id="au_waves" tt="' + L.mt_waves + '</a>' +
		'<a href="#" class="tgl btn" id="au_npclip" tt="' + L.mt_npclip + '</a>' +
		'<a href="#" class="tgl btn" id="au_m3u_c" tt="' + L.mt_m3u_c + '</a>' +
		'<a href="#" class="tgl btn" id="au_os_ctl" tt="' + L.mt_octl + '</a>' +
		'<a href="#" class="tgl btn" id="au_os_seek" tt="' + L.mt_oseek + '</a>' +
		'<a href="#" class="tgl btn" id="au_osd_cv" tt="' + L.mt_oscv + '</a>' +
@@ -2336,7 +2368,12 @@ var mpl = (function () {
		"pb_mode": (sread('pb_mode', ['loop', 'next']) || 'next').split('-')[0],
		"os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
		'traversals': 0,
		'm3ut': '#EXTM3U\n',
	};
	bcfg_bind(r, 'loop', 'au_loop', false, function (v) {
		if (mp.au)
			mp.au.loop = v;
	});
	bcfg_bind(r, 'shuf', 'au_shuf', false, function () {
		mp.read_order(); // don't bind
	});
@@ -2361,6 +2398,9 @@ var mpl = (function () {
	bcfg_bind(r, 'clip', 'au_npclip', false, function (v) {
		clmod(ebi('wtoggle'), 'np', v && mp.au);
	});
	bcfg_bind(r, 'm3uen', 'au_m3u_c', false, function (v) {
		clmod(ebi('wtoggle'), 'm3u', v && (mp.au || msel.getsel().length));
	});
	bcfg_bind(r, 'follow', 'au_follow', false, setaufollow);
	bcfg_bind(r, 'ac_flac', 'ac_flac', true);
	bcfg_bind(r, 'ac_aac', 'ac_aac', false);
@@ -2549,7 +2589,7 @@ var mpl = (function () {
		ebi('np_artist').textContent = np.artist || (fns.length > 1 ? fns[0] : '');
		ebi('np_title').textContent = np.title || '';
		ebi('np_dur').textContent = np['.dur'] || '';
		ebi('np_url').textContent = get_vpath() + np.file.split('?')[0];
		ebi('np_url').textContent = uricom_dec(get_evpath()) + np.file.split('?')[0];
		if (!MOBILE && cover)
			ebi('np_img').setAttribute('src', cover);
		else
@@ -2614,6 +2654,7 @@ if (can_owa && APPLE && / OS ([1-9]|1[0-7])_/.test(UA))
	mpl.init_ac2();


var re_m3u = /\.(m3u8?)$/i;
var re_au_native = (can_ogg || have_acode) ? /\.(aac|flac|m4a|mp3|ogg|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
	re_au_all = /\.(aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|itgz|itxz|itz|m4a|mdgz|mdxz|mdz|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|s3gz|s3xz|s3z|tak|tta|ulaw|wav|wma|wv|xm|xmgz|xmxz|xmz|xpk|3gp|asf|avi|flv|m4v|mkv|mov|mp4|mpeg|mpeg2|mpegts|mpg|mpg2|nut|ogm|ogv|rm|ts|vob|webm|wmv)$/i;

@@ -2640,9 +2681,9 @@ function MPlayer() {

		link = link[link.length - 1];
		var url = link.getAttribute('href'),
			m = re_audio.exec(url.split('?')[0]);
			fn = url.split('?')[0];

		if (m) {
		if (re_audio.exec(fn)) {
			var tid = link.getAttribute('id');
			r.order.push(tid);
			r.tracks[tid] = url;
@@ -2650,6 +2691,11 @@ function MPlayer() {
			ebi('a' + tid).onclick = ev_play;
			clmod(trs[a], 'au', 1);
		}
		else if (re_m3u.exec(fn)) {
			var tid = link.getAttribute('id');
			tds[0].innerHTML = '<a id="a' + tid + '" href="#a' + tid + '" class="play">' + L.mb_play + '</a></td>';
			ebi('a' + tid).onclick = ev_load_m3u;
		}
	}

	r.vol = clamp(fcfg_get('vol', IPHONE ? 1 : dvol / 100), 0, 1);
@@ -2815,6 +2861,14 @@ function MPlayer() {
		r.fau.loop = true;
		r.fau.play();
	};

	r.set_ev = function () {
		mp.au.onended = evau_end;
		mp.au.onerror = evau_error;
		mp.au.onprogress = pbar.drawpos;
		mp.au.onplaying = mpui.progress_updater;
		mp.au.onloadeddata = mp.au.onloadedmetadata = mp.nopause;
	};
}


@@ -2856,6 +2910,8 @@ var widget = (function () {
		wtico = ebi('wtico'),
		nptxt = ebi('nptxt'),
		npirc = ebi('npirc'),
		m3ua = ebi('m3ua'),
		m3uc = ebi('m3uc'),
		touchmode = false,
		was_paused = true;

@@ -2914,6 +2970,49 @@ var widget = (function () {
			toast.ok(1, L.clipped, null, 'top');
		});
	};
	m3ua.onclick = function (e) {
		ev(e);
		var el,
			files = [],
			sel = msel.getsel();

		for (var a = 0; a < sel.length; a++) {
			el = ebi(sel[a].id).closest('tr');
			if (clgot(el, 'au'))
				files.push(el);
		}
		el = QS('#files tr.play');
		if (!sel.length && el)
			files.push(el);

		for (var a = 0; a < files.length; a++) {
			var md = ft2dict(files[a])[0],
				dur = md['.dur'] || '1',
				tag = '';

			if (md.artist && md.title)
				tag = md.artist + ' - ' + md.title;
			else if (md.artist)
				tag = md.artist + ' - ' + md.file;
			else if (md.title)
				tag = md.title;

			if (dur.indexOf(':') > 0) {
				dur = dur.split(':');
				dur = 60 * parseInt(dur[0]) + parseInt(dur[1]);
			}
			else dur = parseInt(dur);

			mpl.m3ut += '#EXTINF:' + dur + ',' + tag + '\n' + uricom_dec(get_evpath()) + md.file + '\n';
		}
		toast.ok(2, files.length == 1 ? L.m3u_add1 : L.m3u_addn.format(files.length), null, 'top');
	};
	m3uc.onclick = function (e) {
		ev(e);
		cliptxt(mpl.m3ut, function () {
			toast.ok(15, L.m3u_clip, null, 'top');
		});
	};
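The `m3ua` handler above builds playlist entries by converting a `mm:ss` duration into seconds and formatting an `#EXTINF` line ahead of the file path. The same logic in Python, for illustration (the `extinf` helper and its parameters are illustrative, not part of copyparty):

```python
def extinf(dur: str, artist: str = "", title: str = "", fn: str = "") -> str:
    # "3:41" -> 221 seconds; a bare number is used as-is; default is 1s,
    # matching the `md['.dur'] || '1'` fallback in the JS above
    if ":" in dur:
        m, s = dur.split(":")
        secs = 60 * int(m) + int(s)
    else:
        secs = int(dur or "1")

    # same tag preference order as the handler: artist+title, artist+file, title
    if artist and title:
        tag = artist + " - " + title
    elif artist:
        tag = artist + " - " + fn
    else:
        tag = title

    return "#EXTINF:%d,%s\n%s\n" % (secs, tag, fn)

print(extinf("3:41", "artist", "song", "song.mp3"), end="")
```

Each generated entry is appended to the running `#EXTM3U` buffer (`mpl.m3ut` in the JS), which the `m3uc` button then copies to the clipboard as a complete playlist.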
	r.set(sread('au_open') == 1);
	setTimeout(function () {
		clmod(widget, 'anim', 1);
@@ -4052,10 +4151,7 @@ function play(tid, is_ev, seek) {
	else {
		mp.au = new Audio();
		mp.au2 = new Audio();
		mp.au.onerror = evau_error;
		mp.au.onprogress = pbar.drawpos;
		mp.au.onplaying = mpui.progress_updater;
		mp.au.onended = next_song;
		mp.set_ev();
		widget.open();
	}
	mp.init_fau();
@@ -4068,13 +4164,9 @@ function play(tid, is_ev, seek) {
	var t = mp.au;
	mp.au = mp.au2;
	mp.au2 = t;
	t.onerror = t.onprogress = t.onended = null;
	t.onerror = t.onprogress = t.onended = t.loop = null;
	t.ld = 0; //owa
	mp.au.onerror = evau_error;
	mp.au.onprogress = pbar.drawpos;
	mp.au.onplaying = mpui.progress_updater;
	mp.au.onloadeddata = mp.au.onloadedmetadata = mp.nopause;
	mp.au.onended = next_song;
	mp.set_ev();
	t = mp.au.currentTime;
	if (isNum(t) && t > 0.1)
		mp.au.currentTime = 0;
@@ -4104,6 +4196,7 @@ function play(tid, is_ev, seek) {
	clmod(ebi(oid), 'act', 1);
	clmod(ebi(oid).closest('tr'), 'play', 1);
	clmod(ebi('wtoggle'), 'np', mpl.clip);
	clmod(ebi('wtoggle'), 'm3u', mpl.m3uen);
	if (thegrid)
		thegrid.loadsel();

@@ -4112,6 +4205,7 @@ function play(tid, is_ev, seek) {

	try {
		mp.nopause();
		mp.au.loop = mpl.loop;
		if (mpl.aplay || is_ev !== -1)
			mp.au.play();

@@ -4156,6 +4250,15 @@ function scroll2playing() {
}


function evau_end(e) {
	if (!mpl.loop)
		return next_song(e);
	ev(e);
	mp.au.currentTime = 0;
	mp.au.play();
}


// event from the audio object if something breaks
function evau_error(e) {
	var err = '',
@@ -4199,6 +4302,7 @@ function evau_error(e) {
	}
	var em = '' + eplaya.error.message,
		mfile = '\n\nFile: «' + uricom_dec(eplaya.src.split('/').pop()) + '»',
		e500 = L.mm_e500,
		e404 = L.mm_e404,
		e403 = L.mm_e403;

@@ -4211,6 +4315,9 @@ function evau_error(e) {
	if (em.startsWith('404: '))
		err = e404;

	if (em.startsWith('500: '))
		err = e500;

	toast.warn(15, esc(basenames(err + mfile)));
	console.log(basenames(err + mfile));

@@ -4222,7 +4329,9 @@ function evau_error(e) {
	if (this.status < 400)
		return;

	err = this.status == 403 ? e403 : this.status == 404 ? e404 :
	err = this.status == 403 ? e403 :
		this.status == 404 ? e404 :
		this.status == 500 ? e500 :
		L.mm_e5xx + this.status;

	toast.warn(15, esc(basenames(err + mfile)));
@@ -4362,6 +4471,11 @@ function eval_hash() {
		goto(v.slice(3));
		return;
	}

	if (v.startsWith("#m3u=")) {
		load_m3u(v.slice(5));
		return;
	}
}


@@ -4421,7 +4535,8 @@ function eval_hash() {

function read_dsort(txt) {
	dnsort = dnsort ? 1 : 0;
	clmod(ebi('nsort'), 'on', (sread('nsort') || dnsort) == 1);
	ENATSORT = NATSORT && (sread('nsort') || dnsort) == 1;
	clmod(ebi('nsort'), 'on', ENATSORT);
	try {
		var zt = (('' + txt).trim() || 'href').split(/,+/g);
		dsort = [];
@@ -4438,7 +4553,7 @@ function read_dsort(txt) {
		}
	}
	catch (ex) {
		toast.warn(10, 'failed to apply default sort order [' + txt + ']:\n' + ex);
		toast.warn(10, 'failed to apply default sort order [' + esc('' + txt) + ']:\n' + ex);
|
||||
dsort = [['href', 1, '']];
|
||||
}
|
||||
}
|
||||
@@ -4467,9 +4582,6 @@ function sortfiles(nodes) {
|
||||
|
||||
sopts = sopts && sopts.length ? sopts : jcp(dsort);
|
||||
|
||||
var collator = !clgot(ebi('nsort'), 'on') ? null :
|
||||
new Intl.Collator([], {numeric: true});
|
||||
|
||||
try {
|
||||
var is_srch = false;
|
||||
if (nodes[0]['rp']) {
|
||||
@@ -4521,8 +4633,9 @@ function sortfiles(nodes) {
|
||||
}
|
||||
if (v2 === undefined) return 1 * rev;
|
||||
|
||||
var ret = rev * (typ == 'int' ? (v1 - v2) : collator ?
|
||||
collator.compare(v1, v2) : v1.localeCompare(v2));
|
||||
var ret = rev * (typ == 'int' ? (v1 - v2) :
|
||||
ENATSORT ? NATSORT.compare(v1, v2) :
|
||||
v1.localeCompare(v2));
|
||||
|
||||
if (ret === 0)
|
||||
ret = onodes.indexOf(n1) - onodes.indexOf(n2);
|
||||
@@ -4704,6 +4817,7 @@ var fileman = (function () {
|
||||
clmod(bshr, 'hide', hshr);
|
||||
|
||||
clmod(ebi('wfm'), 'act', QS('#wfm a.en:not(.hide)'));
|
||||
clmod(ebi('wtoggle'), 'm3u', mpl.m3uen && (nsel || (mp && mp.au)));
|
||||
|
||||
var wfs = ebi('wfs'), h = '';
|
||||
try {
|
||||
@@ -5771,7 +5885,7 @@ var showfile = (function () {
|
||||
|
||||
td.innerHTML = '<a href="#" id="t' +
|
||||
link.id + '" class="doc bri" hl="' +
|
||||
link.id + '">-txt-</a>';
|
||||
link.id + '" rel="nofollow">-txt-</a>';
|
||||
|
||||
td.getElementsByTagName('a')[0].setAttribute('href', '?doc=' + fn);
|
||||
}
|
||||
@@ -5960,7 +6074,8 @@ var showfile = (function () {
|
||||
};
|
||||
|
||||
r.mktree = function () {
|
||||
var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('<span>/</span>') + '</li>'];
|
||||
var crumbs = linksplit(get_evpath()).join('<span>/</span>'),
|
||||
html = ['<li class="bn">' + L.tv_lst + '<br />' + crumbs + '</li>'];
|
||||
for (var a = 0; a < r.files.length; a++) {
|
||||
var file = r.files[a];
|
||||
html.push('<li><a href="?doc=' +
|
||||
@@ -6535,8 +6650,8 @@ function tree_scrolltoo(q) {
|
||||
var ctr = ebi('tree'),
|
||||
em = parseFloat(getComputedStyle(act).fontSize),
|
||||
top = act.offsetTop + ul.offsetTop,
|
||||
min = top - 11 * em,
|
||||
max = top - (ctr.offsetHeight - 10 * em);
|
||||
min = top - 20 * em,
|
||||
max = top - (ctr.offsetHeight - 16 * em);
|
||||
|
||||
if (ctr.scrollTop > min)
|
||||
ctr.scrollTop = Math.floor(min);
|
||||
@@ -6707,7 +6822,8 @@ var ahotkeys = function (e) {
|
||||
return ebi('griden').click();
|
||||
}
|
||||
|
||||
if ((aet == 'tr' || aet == 'td') && ae.closest('#files')) {
|
||||
var in_ftab = (aet == 'tr' || aet == 'td') && ae.closest('#files');
|
||||
if (in_ftab) {
|
||||
var d = '', rem = 0;
|
||||
if (aet == 'td') ae = ae.closest('tr'); //ie11
|
||||
if (k == 'ArrowUp' || k == 'Up') d = 'previous';
|
||||
@@ -6724,12 +6840,19 @@ var ahotkeys = function (e) {
|
||||
msel.selui();
|
||||
return ev(e);
|
||||
}
|
||||
}
|
||||
if (in_ftab || !aet || (ae && ae.closest('#ggrid'))) {
|
||||
if ((k == 'KeyA' || k == 'a') && ctrl(e)) {
|
||||
var sel = msel.getsel(),
|
||||
var ntot = treectl.lsc.files.length + treectl.lsc.dirs.length,
|
||||
sel = msel.getsel(),
|
||||
all = msel.getall();
|
||||
|
||||
msel.evsel(e, sel.length < all.length);
|
||||
msel.origin_id(null);
|
||||
if (ntot > all.length)
|
||||
toast.warn(10, L.f_anota.format(all.length, ntot), L.f_anota);
|
||||
else if (toast.tag == L.f_anota)
|
||||
toast.hide();
|
||||
return ev(e);
|
||||
}
|
||||
}
|
||||
@@ -6859,7 +6982,7 @@ var ahotkeys = function (e) {
|
||||
|
||||
|
||||
// search
|
||||
(function () {
|
||||
var search_ui = (function () {
|
||||
var sconf = [
|
||||
[
|
||||
L.s_sz,
|
||||
@@ -6894,7 +7017,8 @@ var ahotkeys = function (e) {
|
||||
]
|
||||
];
|
||||
|
||||
var trs = [],
|
||||
var r = {},
|
||||
trs = [],
|
||||
orig_url = null,
|
||||
orig_html = null,
|
||||
cap = 125;
|
||||
@@ -7101,13 +7225,19 @@ var ahotkeys = function (e) {
|
||||
search_in_progress = 0;
|
||||
srch_msg(false, '');
|
||||
|
||||
var res = JSON.parse(this.responseText),
|
||||
tagord = res.tag_order;
|
||||
var res = JSON.parse(this.responseText);
|
||||
r.render(res, this, true);
|
||||
}
|
||||
|
||||
sortfiles(res.hits);
|
||||
r.render = function (res, xhr, sort) {
|
||||
var tagord = res.tag_order;
|
||||
|
||||
srch_msg(false, '');
|
||||
if (sort)
|
||||
sortfiles(res.hits);
|
||||
|
||||
var ofiles = ebi('files');
|
||||
if (ofiles.getAttribute('ts') > this.ts)
|
||||
if (xhr && ofiles.getAttribute('ts') > xhr.ts)
|
||||
return;
|
||||
|
||||
treectl.hide();
|
||||
@@ -7159,19 +7289,21 @@ var ahotkeys = function (e) {
|
||||
}
|
||||
|
||||
ofiles = set_files_html(html.join('\n'));
|
||||
ofiles.setAttribute("ts", this.ts);
|
||||
ofiles.setAttribute("q_raw", this.q_raw);
|
||||
ofiles.setAttribute("ts", xhr ? xhr.ts : 1);
|
||||
ofiles.setAttribute("q_raw", xhr ? xhr.q_raw : 'playlist');
|
||||
set_vq();
|
||||
mukey.render();
|
||||
reload_browser();
|
||||
filecols.set_style(['File Name']);
|
||||
|
||||
sethash('q=' + uricom_enc(this.q_raw));
|
||||
if (xhr)
|
||||
sethash('q=' + uricom_enc(xhr.q_raw));
|
||||
|
||||
ebi('unsearch').onclick = unsearch;
|
||||
var m = ebi('moar');
|
||||
if (m)
|
||||
m.onclick = moar;
|
||||
}
|
||||
};
|
||||
|
||||
function unsearch(e) {
|
||||
ev(e);
|
||||
@@ -7188,9 +7320,98 @@ var ahotkeys = function (e) {
|
||||
cap *= 2;
|
||||
do_search();
|
||||
}
|
||||
|
||||
return r;
|
||||
})();
|
||||
|
||||
|
||||
function ev_load_m3u(e) {
|
||||
ev(e);
|
||||
var id = this.getAttribute('id').slice(1),
|
||||
url = ebi(id).getAttribute('href').split('?')[0];
|
||||
|
||||
modal.confirm(L.mm_m3u,
|
||||
function () { load_m3u(url); },
|
||||
function () {
|
||||
if (has(perms, 'write') && has(perms, 'delete'))
|
||||
window.location = url + '?edit';
|
||||
else
|
||||
showfile.show(url);
|
||||
}
|
||||
);
|
||||
return false;
|
||||
}
|
||||
function load_m3u(url) {
|
||||
var xhr = new XHR();
|
||||
xhr.open('GET', url, true);
|
||||
xhr.onload = render_m3u;
|
||||
xhr.url = url;
|
||||
xhr.send();
|
||||
return false;
|
||||
}
|
||||
function render_m3u() {
|
||||
if (!xhrchk(this, L.tv_xe1, L.tv_xe2))
|
||||
return;
|
||||
|
||||
var evp = get_evpath(),
|
||||
m3u = this.responseText,
|
||||
xtd = m3u.slice(0, 12).indexOf('#EXTM3U') + 1,
|
||||
lines = m3u.replace(/\r/g, '\n').split('\n'),
|
||||
dur = 1,
|
||||
artist = '',
|
||||
title = '',
|
||||
ret = {'hits': [], 'tag_order': ['artist', 'title', '.dur'], 'trunc': false};
|
||||
|
||||
for (var a = 0; a < lines.length; a++) {
|
||||
var ln = lines[a].trim();
|
||||
if (xtd && ln.startsWith('#')) {
|
||||
var m = /^#EXTINF:([0-9]+)[, ](.*)/.exec(ln);
|
||||
if (m) {
|
||||
dur = m[1];
|
||||
title = m[2];
|
||||
var ofs = title.indexOf(' - ');
|
||||
if (ofs > 0) {
|
||||
artist = title.slice(0, ofs);
|
||||
title = title.slice(ofs + 3);
|
||||
}
|
||||
}
|
||||
continue;
|
||||
}
|
||||
if (ln.indexOf('.') < 0)
|
||||
continue;
|
||||
|
||||
var n = ret.hits.length + 1,
|
||||
url = ln;
|
||||
|
||||
if (url.indexOf(':\\')) // C:\
|
||||
url = url.split(/\\/g).pop();
|
||||
|
||||
url = url.replace(/\\/g, '/');
|
||||
url = uricom_enc(url).replace(/%2f/gi, '/')
|
||||
|
||||
if (!url.startsWith('/'))
|
||||
url = vjoin(evp, url);
|
||||
|
||||
ret.hits.push({
|
||||
"ts": 946684800 + n,
|
||||
"sz": 100000 + n,
|
||||
"rp": url,
|
||||
"tags": {".dur": dur, "artist": artist, "title": title}
|
||||
});
|
||||
dur = 1;
|
||||
artist = title = '';
|
||||
}
|
||||
|
||||
search_ui.render(ret, null, false);
|
||||
sethash('m3u=' + this.url.split('?')[0].split('/').pop());
|
||||
goto();
|
||||
|
||||
var el = QS('#files>tbody>tr.au>td>a.play');
|
||||
if (el)
|
||||
el.click();
|
||||
}
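The `#EXTINF` handling in `render_m3u` above can be tried standalone; a minimal sketch using the same regex and the same ` - ` artist/title split (the sample playlist line is made up for illustration):

```javascript
// standalone sketch of the EXTINF parsing in render_m3u above;
// the playlist line is a hypothetical example
var ln = '#EXTINF:240,Some Artist - Some Title';
var m = /^#EXTINF:([0-9]+)[, ](.*)/.exec(ln);
var dur = m[1],           // duration in seconds, as a string
    title = m[2],
    artist = '';
var ofs = title.indexOf(' - ');
if (ofs > 0) {
    artist = title.slice(0, ofs);
    title = title.slice(ofs + 3);
}
// dur == '240', artist == 'Some Artist', title == 'Some Title'
```

note that a title without a ` - ` separator keeps the whole string as the title, with artist left blank, matching the function above.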


function aligngriditems() {
if (!treectl)
return;
@@ -7255,6 +7476,7 @@ var treectl = (function () {
treesz = clamp(icfg_get('treesz', 16), 10, 50);

var resort = function () {
ENATSORT = NATSORT && clgot(ebi('nsort'), 'on');
treectl.gentab(get_evpath(), treectl.lsc);
};
bcfg_bind(r, 'ireadme', 'ireadme', true);
@@ -7583,8 +7805,8 @@ var treectl = (function () {
};

function reload_tree() {
var cdir = r.nextdir || get_vpath(),
cevp = get_evpath(),
var cevp = get_evpath(),
cdir = r.nextdir || uricom_dec(cevp),
links = QSA('#treeul a+a'),
nowrap = QS('#tree.nowrap') && QS('#hovertree.on'),
act = null;
@@ -7933,7 +8155,7 @@ var treectl = (function () {

if (tn.lead == '-')
tn.lead = '<a href="?doc=' + bhref + '" id="t' + id +
'" class="doc' + (lang ? ' bri' : '') +
'" rel="nofollow" class="doc' + (lang ? ' bri' : '') +
'" hl="' + id + '" name="' + hname + '">-txt-</a>';

var cl = /\.PARTIAL$/.exec(fname) ? ' class="fade"' : '',
@@ -8091,7 +8313,7 @@ var treectl = (function () {
document.documentElement.scrollLeft = 0;
setTimeout(function () {
r.gentab(get_evpath(), r.lsc);
ebi('wrap').style.opacity = 'unset';
ebi('wrap').style.opacity = CLOSEST ? 'unset' : 1;
}, 1);
};

@@ -8127,9 +8349,16 @@ var treectl = (function () {
}
delete res['a'];
var keys = Object.keys(res);
keys.sort(function (a, b) { return a.localeCompare(b); });
for (var a = 0; a < keys.length; a++)
keys[a] = [uricom_dec(keys[a]), keys[a]];

if (ENATSORT)
keys.sort(function (a, b) { return NATSORT.compare(a[0], b[0]); });
else
keys.sort(function (a, b) { return a[0].localeCompare(b[0]); });

for (var a = 0; a < keys.length; a++) {
var kk = keys[a],
var kk = keys[a][1],
m = /(\?k=[^\n]+)/.exec(kk),
kdk = m ? m[1] : '',
ks = kk.replace(kdk, '').slice(1),
@@ -8237,7 +8466,7 @@ var wfp_debounce = (function () {
if (--r.n <= 0) {
r.n = 0;
clearTimeout(r.t);
ebi('wfp').style.opacity = 'unset';
ebi('wfp').style.opacity = CLOSEST ? 'unset' : 1;
}
};
r.reset = function () {
@@ -8379,7 +8608,7 @@ function mk_files_header(taglist) {
var tag = taglist[a],
c1 = tag.slice(0, 1).toUpperCase();

tag = c1 + tag.slice(1);
tag = esc(c1 + tag.slice(1));
if (c1 == '.')
tag = '<th name="tags/' + tag + '" sort="int"><span>' + tag.slice(1);
else
@@ -9408,6 +9637,11 @@ function sandbox(tgt, rules, allow, cls, html) {
clmod(tgt, 'sb');
return false;
}
if (!CLOSEST) {
tgt.textContent = html;
clmod(tgt, 'sb');
return false;
}
clmod(tgt, 'sb', 1);

var tid = tgt.getAttribute('id'),
@@ -9475,7 +9709,7 @@ window.addEventListener("message", function (e) {
el.parentNode.removeChild(el.previousSibling);

el.style.height = (parseInt(t[2]) + SBH) + 'px';
el.style.visibility = 'unset';
el.style.visibility = CLOSEST ? 'unset' : 'block';
wfp_debounce.show();
}
else if (t[0] == 'iscroll') {
@@ -9769,7 +10003,7 @@ function wintitle(txt, noname) {
if (s_name && !noname)
txt = s_name + ' ' + txt;

txt += get_vpath().slice(1, -1).split('/').pop();
txt += uricom_dec(get_evpath()).slice(1, -1).split('/').pop();

document.title = txt;
}

@@ -1078,26 +1078,28 @@ action_stack = (function () {
var p1 = from.length,
p2 = to.length;

while (p1-- > 0 && p2-- > 0)
while (p1 --> 0 && p2 --> 0)
if (from[p1] != to[p2])
break;

if (car > ++p1) {
if (car > ++p1)
car = p1;
}

var txt = from.substring(car, p1)
return {
car: car,
cdr: ++p2,
cdr: p2 + (car && 1),
txt: txt,
cpos: cpos
};
}

var undiff = function (from, change) {
var t1 = from.substring(0, change.car),
t2 = from.substring(change.cdr);

return {
txt: from.substring(0, change.car) + change.txt + from.substring(change.cdr),
txt: t1 + change.txt + t2,
cpos: change.cpos
};
}

@@ -122,7 +122,7 @@
<input type="hidden" id="la" name="act" value="login" />
<input type="password" id="lp" name="cppwd" placeholder=" password" />
<input type="hidden" name="uhash" id="uhash" value="x" />
<input type="submit" id="ls" value="Login" />
<input type="submit" id="ls" value="login" />
{% if chpw %}
<a id="x" href="#">change password</a>
{% endif %}

@@ -381,6 +381,9 @@ html.y .btn:focus {
box-shadow: 0 .1em .2em #037 inset;
outline: #037 solid .1em;
}
input, button {
font-family: var(--font-main), sans-serif;
}
input[type="submit"] {
cursor: pointer;
}

@@ -1319,7 +1319,7 @@ function up2k_init(subtle) {
if (bad_files.length) {
var msg = L.u_badf.format(bad_files.length, ntot);
for (var a = 0, aa = Math.min(20, bad_files.length); a < aa; a++)
msg += '-- ' + bad_files[a][1] + '\n';
msg += '-- ' + esc(bad_files[a][1]) + '\n';

msg += L.u_just1;
return modal.alert(msg, function () {
@@ -1331,7 +1331,7 @@ function up2k_init(subtle) {
if (nil_files.length) {
var msg = L.u_blankf.format(nil_files.length, ntot);
for (var a = 0, aa = Math.min(20, nil_files.length); a < aa; a++)
msg += '-- ' + nil_files[a][1] + '\n';
msg += '-- ' + esc(nil_files[a][1]) + '\n';

msg += L.u_just1;
return modal.confirm(msg, function () {
@@ -1343,10 +1343,68 @@ function up2k_init(subtle) {
});
}

var fps = new Set(), pdp = '';
for (var a = 0; a < good_files.length; a++) {
var fp = good_files[a][1],
dp = vsplit(fp)[0];
fps.add(fp);
if (pdp != dp) {
pdp = dp;
dp = dp.slice(0, -1);
while (dp) {
fps.add(dp);
dp = vsplit(dp)[0].slice(0, -1);
}
}
}

var junk = [], rmi = [];
for (var a = 0; a < good_files.length; a++) {
var fn = good_files[a][1];
if (fn.indexOf("/.") < 0 && fn.indexOf("/__MACOS") < 0)
continue;

if (/\/__MACOS|\/\.(DS_Store|AppleDouble|LSOverride|DocumentRevisions-|fseventsd|Spotlight-V[0-9]|TemporaryItems|Trashes|VolumeIcon\.icns|com\.apple\.timemachine\.donotpresent|AppleDB|AppleDesktop|apdisk)/.exec(fn)) {
junk.push(good_files[a]);
rmi.push(a);
continue;
}

if (fn.indexOf("/._") + 1 &&
fps.has(fn.replace("/._", "/")) &&
fn.split("/").pop().startsWith("._") &&
!has(rmi, a)
) {
junk.push(good_files[a]);
rmi.push(a);
}
}

if (!junk.length)
return gotallfiles2(good_files);

junk.sort();
rmi.sort(function (a, b) { return a - b; });

var msg = L.u_applef.format(junk.length, good_files.length);
for (var a = 0, aa = Math.min(1000, junk.length); a < aa; a++)
msg += '-- ' + esc(junk[a][1]) + '\n';

return modal.confirm(msg, function () {
for (var a = rmi.length - 1; a >= 0; a--)
good_files.splice(rmi[a], 1);

start_actx();
gotallfiles2(good_files);
}, function () {
start_actx();
gotallfiles2(good_files);
});
}

function gotallfiles2(good_files) {
good_files.sort(function (a, b) {
a = a[1];
b = b[1];
return a < b ? -1 : a > b ? 1 : 0;
return a[1] < b[1] ? -1 : 1;
});

var msg = [];
@@ -1357,7 +1415,7 @@ function up2k_init(subtle) {
if (FIREFOX && good_files.length > 3000)
msg.push(L.u_ff_many + "\n\n");

msg.push(L.u_asku.format(good_files.length, esc(get_vpath())) + '<ul>');
msg.push(L.u_asku.format(good_files.length, esc(uricom_dec(get_evpath()))) + '<ul>');
for (var a = 0, aa = Math.min(20, good_files.length); a < aa; a++)
msg.push('<li>' + esc(good_files[a][1]) + '</li>');

@@ -1399,9 +1457,7 @@ function up2k_init(subtle) {

if (!uc.az)
good_files.sort(function (a, b) {
a = a[0].size;
b = b[0].size;
return a < b ? -1 : a > b ? 1 : 0;
return a[0].size - b[0].size;
});

for (var a = 0; a < good_files.length; a++) {
@@ -1409,7 +1465,7 @@ function up2k_init(subtle) {
name = good_files[a][1],
fdir = evpath,
now = Date.now(),
lmod = uc.u2ts ? (fobj.lastModified || now) : 0,
lmod = (uc.u2ts && fobj.lastModified) || 0,
ofs = name.lastIndexOf('/') + 1;

if (ofs) {
@@ -2073,8 +2129,8 @@ function up2k_init(subtle) {
try { orz(e); } catch (ex) { vis_exh(ex + '', 'up2k.js', '', '', ex); }
};
reader.onerror = function () {
var err = reader.error + '';
var handled = false;
var err = esc('' + reader.error),
handled = false;

if (err.indexOf('NotReadableError') !== -1 || // win10-chrome defender
err.indexOf('NotFoundError') !== -1 // macos-firefox permissions
@@ -2298,7 +2354,7 @@ function up2k_init(subtle) {
xhr.onerror = xhr.ontimeout = function () {
console.log('head onerror, retrying', t.name, t);
if (!toast.visible)
toast.warn(9.98, L.u_enethd + "\n\nfile: " + t.name, t);
toast.warn(9.98, L.u_enethd + "\n\nfile: " + esc(t.name), t);

apop(st.busy.head, t);
st.todo.head.unshift(t);
@@ -2373,7 +2429,7 @@ function up2k_init(subtle) {
return console.log('zombie handshake onerror', t.name, t);

if (!toast.visible)
toast.warn(9.98, L.u_eneths + "\n\nfile: " + t.name, t);
toast.warn(9.98, L.u_eneths + "\n\nfile: " + esc(t.name), t);

console.log('handshake onerror, retrying', t.name, t);
apop(st.busy.handshake, t);
@@ -2478,7 +2534,7 @@ function up2k_init(subtle) {
var idx = t.hash.indexOf(missing[a]);
if (idx < 0)
return modal.alert('wtf negative index for hash "{0}" in task:\n{1}'.format(
missing[a], JSON.stringify(t)));
missing[a], esc(JSON.stringify(t))));

t.postlist.push(idx);
cbd[idx] = 0;
@@ -2632,7 +2688,7 @@ function up2k_init(subtle) {
return toast.err(0, L.u_ehsdf + "\n\n" + rsp.replace(/.*; /, ''));

err = t.t_uploading ? L.u_ehsfin : t.srch ? L.u_ehssrch : L.u_ehsinit;
xhrchk(xhr, err + "\n\nfile: " + t.name + "\n\nerror ", "404, target folder not found", "warn", t);
xhrchk(xhr, err + "\n\nfile: " + esc(t.name) + "\n\nerror ", "404, target folder not found", "warn", t);
}
}
xhr.onload = function (e) {
@@ -2789,7 +2845,7 @@ function up2k_init(subtle) {
toast.inf(10, L.u_cbusy);
}
else {
xhrchk(xhr, L.u_cuerr2.format(snpart, Math.ceil(t.size / chunksize), t.name), "404, target folder not found (???)", "warn", t);
xhrchk(xhr, L.u_cuerr2.format(snpart, Math.ceil(t.size / chunksize), esc(t.name)), "404, target folder not found (???)", "warn", t);
chill(t);
}
orz2(xhr);
@@ -2833,7 +2889,7 @@ function up2k_init(subtle) {
xhr.bsent = 0;

if (!toast.visible)
toast.warn(9.98, L.u_cuerr.format(snpart, Math.ceil(t.size / chunksize), t.name), t);
toast.warn(9.98, L.u_cuerr.format(snpart, Math.ceil(t.size / chunksize), esc(t.name)), t);

t.nojoin = t.nojoin || t.postlist.length; // maybe rproxy postsize limit
console.log('chunkpit onerror,', t.name, t);

@@ -364,7 +364,8 @@ if (!Element.prototype.matches)
Element.prototype.mozMatchesSelector ||
Element.prototype.webkitMatchesSelector;

if (!Element.prototype.closest)
var CLOSEST = !!Element.prototype.closest;
if (!CLOSEST)
Element.prototype.closest = function (s) {
var el = this;
do {
@@ -461,6 +462,13 @@ function namesan(txt, win, fslash) {
}


var NATSORT, ENATSORT;
try {
NATSORT = new Intl.Collator([], {numeric: true});
}
catch (ex) { }


var crctab = (function () {
var c, tab = [];
for (var n = 0; n < 256; n++) {
@@ -614,6 +622,33 @@ function showsort(tab) {
}
}
}
function st_cmp_num(a, b) {
a = a[0];
b = b[0];
return (
a === null ? -1 :
b === null ? 1 :
(a - b)
);
}
function st_cmp_nat(a, b) {
a = a[0];
b = b[0];
return (
a === null ? -1 :
b === null ? 1 :
NATSORT.compare(a, b)
);
}
function st_cmp_gen(a, b) {
a = a[0];
b = b[0];
return (
a === null ? -1 :
b === null ? 1 :
a.localeCompare(b)
);
}
function sortTable(table, col, cb) {
var tb = table.tBodies[0],
th = table.tHead.rows[0].cells,
@@ -659,19 +694,17 @@ function sortTable(table, col, cb) {
}
vl.push([v, a]);
}
vl.sort(function (a, b) {
a = a[0];
b = b[0];
if (a === null)
return -1;
if (b === null)
return 1;

if (stype == 'int') {
return reverse * (a - b);
}
return reverse * (a.localeCompare(b));
});
if (stype == 'int')
vl.sort(st_cmp_num);
else if (ENATSORT)
vl.sort(st_cmp_nat);
else
vl.sort(st_cmp_gen);

if (reverse < 0)
vl.reverse();

if (sread('dir1st') !== '0') {
var r1 = [], r2 = [];
for (var i = 0; i < tr.length; i++) {
@@ -857,11 +890,6 @@ function get_evpath() {
}


function get_vpath() {
return uricom_dec(get_evpath());
}


function noq_href(el) {
return el.getAttribute('href').split('?')[0];
}

@@ -64,7 +64,7 @@ onmessage = (d) => {
};
reader.onerror = function () {
busy = false;
var err = reader.error + '';
var err = esc('' + reader.error);

if (err.indexOf('NotReadableError') !== -1 || // win10-chrome defender
err.indexOf('NotFoundError') !== -1 // macos-firefox permissions

@@ -1,3 +1,286 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0420-1836 `v1.16.21` unzip-compat

a couple guys have been asking if I accept donations -- thanks a lot!! added a few options on [my github page](https://github.com/9001/) :>

## 🧪 new features

* #156 add button to loop/repeat music 71c55659

## 🩹 bugfixes

* #155 download-as-zip: increase compatibility with the unix `unzip` command db33d68d
  * this unfortunately reduces support for huge zipfiles on old software (WinXP and such)
  * and makes it less safe to stream zips into unzippers, so use tar.gz instead
  * and is perhaps not even a copyparty bug; see commit-message for the full story

## 🔧 other changes

* show warning on Ctrl-A in lazy-loaded folders 5b3a5fe7
* docker: hide keepalive pings from logs d5a9bd80



▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0413-2151 `v1.16.20` all sorted

## 🧪 new features

* when enabled, natural-sort will now also apply to tags, not just filenames 7b2bd6da
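The natural-sort mentioned here is powered by a single `Intl.Collator` with `numeric: true` (the `NATSORT` collator added to util.js in this release); a quick illustration of the numeric-aware ordering it produces (the filenames are made-up examples):

```javascript
// numeric-aware comparison, as done by the NATSORT collator in util.js
var NATSORT = new Intl.Collator([], {numeric: true});
var names = ['track10.opus', 'track2.opus', 'track1.opus'];
// Intl.Collator's `compare` getter returns a bound function,
// so it can be passed to sort() directly
names.sort(NATSORT.compare);
// -> ['track1.opus', 'track2.opus', 'track10.opus']
```

a plain `localeCompare` sort would instead put `track10` before `track2`, which is why the collator is preferred whenever natural-sort is enabled.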

## 🩹 bugfixes

* some sorting-related stuff 7b2bd6da
  * folders with non-ascii names would sort incorrectly in the navpane/sidebar
  * natural-sort didn't apply correctly after changing the sort order
* workaround [ffmpeg-bug 10797](https://trac.ffmpeg.org/ticket/10797) 98dcaee2
  * reduces ram usage from 1534 to 230 MiB when generating spectrograms of s3xmodit songs (amiga chiptunes)
* disable mdns if only listening on uds (unix-sockets) ffc16109 361aebf8

## 🔧 other changes

* hotkey CTRL-A will now select all files in gridview 233075ae
  * and it toggles (just like in list-view) so try pressing it again
* copyparty.exe: upgrade to pillow v11.2.1 c7aa1a35



▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0408-2132 `v1.16.19` GHOST

did you know that every song named `GHOST` is a banger? it's true! [ghost](https://www.youtube.com/watch?v=NoUAwC4yiAw) // [ghost](https://www.youtube.com/watch?v=IKKar5SS29E) // [ghost](https://www.youtube.com/watch?v=tFSFlgm_tsw)

## 🧪 new features

* option to store markdown backups out-of-volume fc883418
  * the default is still a subfolder named `.hist` next to the markdown file
  * `--md-hist v` puts them in the volume's hist-folder instead
  * `--md-hist n` disables markdown-backups entirely
* #149 option to store the volume sqlite databases at a custom location outside the hist-folder e1b9ac63
  * new option `--dbpath` works like `--hist` but it only moves the database file, not the thumbnails
  * they can be combined, in which case `--hist` is applied to thumbnails, `--dbpath` to the db
  * useful when you're squeezing every last drop of performance out of your filesystem (see the issue)
* actively prevent sharing certain databases (sessions/shares) between multiple copyparty instances acfaacbd
  * an errormessage was added to explain some different alternatives for doing this safely
  * for example by setting `XDG_CONFIG_HOME` which now works on all platforms b17ccc38

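The two flags compose as described above; a hypothetical invocation (the volume and both target paths are made-up examples, only the flag names are from the release notes):

```shell
# thumbnails go to one disk, the sqlite db to another;
# all three paths are hypothetical examples
copyparty -v /mnt/media::r \
  --hist /mnt/fast/thumbs \
  --dbpath /mnt/nvme/db
```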
## 🩹 bugfixes

* #151 mkdir did not work in locations outside the volume root (via symlinks) 2b50fc20
* improve the ui feedback when trying to play an audio file which failed to transcode f9954bc4
  * also helps with server-filesystem issues, including image-thumbs

## 🔧 other changes

* #152 custom fonts are also applied to textboxes and buttons (thx @thaddeuskkr) d450f615
* be more careful with the shares-db 8e0364ef
* be less careful with the sessions-db 8e0364ef
* update deps c0becc64
  * web: dompurify
  * copyparty.exe: python 3.12.10
* rephrase `-j0` warning on windows to also mention that Microsoft Defender will freak out c0becc64
* #149 add [a script](https://github.com/9001/copyparty/tree/hovudstraum/contrib#zfs-tunepy) to optimize the sqlite databases for storage on zfs 4f397b9b
* block `GoogleOther` (another recalcitrant bot) from zip-downloads c2034f7b
* update [contributing.md](https://github.com/9001/copyparty/blob/hovudstraum/CONTRIBUTING.md) with a section regarding LLM/AI-written code cec3bee0
|
||||
* the [helptext](https://ocv.me/copyparty/helptext.html) will also be uploaded to each github release from now on, [permalink](https://github.com/9001/copyparty/releases/latest/download/helptext.html)
|
||||
* add review from ixbt forums b383c08c
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2025-0323-2216 `v1.16.18` zlib-ng
|
||||
|
||||
## 🧪 new features
|
||||
|
||||
* prefer zlib-ng when available 57a56073
|
||||
* download-as-tar-gz becomes 2.5x faster
|
||||
* default-enabled in docker-images
|
||||
* not enabled in copyparty.exe yet; coming in a future python version
|
||||
* docker: add mimalloc (optional, default-disabled) de2c9788
|
||||
* gives twice the speed, and twice the ram usage
|
||||
|
||||
## 🩹 bugfixes
|
||||
|
||||
* small up2k glitch 3c90cec0
|
||||
|
||||
## 🔧 other changes
|
||||
|
||||
* rename logues/readmes when uploaded with write-only access 2525d594
|
||||
* since they are used as helptext when viewing the page
|
||||
* try to block google and other bad bots from `?doc` and `?zip` 99f63adf
|
||||
* apparently `rel="nofollow"` means nothing these days
|
||||
|
||||
### the docker images for this release were built from e1dea7ef
|
||||
|
||||
|
||||
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||
# 2025-0316-2002 `v1.16.17` boot2party
|
||||
|
||||
## NEW: make it a bootable usb flashdrive
|
||||
|
||||
get the party going anywhere, anytime, no OS required! [download flashdrive image](https://a.ocv.me/pub/stuff/edcd001/enterprise-edition/) or watch the [low-effort demo video](https://a.ocv.me/pub/stuff/edcd001/enterprise-edition/hub-demo-hq.webm) which eventually gets to the copyparty part after showing off a bunch of other stuff on there
|
||||
|
||||
* there is [source code](https://github.com/9001/asm/tree/hovudstraum/p/hub) and [build instructions](https://github.com/9001/asm/tree/hovudstraum/p/hub/sm/how2build) too
|
||||
* please don't take this too seriously
|
||||
|
||||
## 🧪 new features
|
||||
|
||||
* option to specify max-size for download-as-zip/tar 494179bd 0a33336d
|
||||
* either the total download size (`--zipmaxs 500M`), and/or max number of files (`--zipmaxn 9k`)
|
||||
* applies to all uesrs by default; can also ignore limits for authorized users (`--zipmaxu`)
|
||||
* errormessage can be customized with `--zipmaxt "winter is coming... but this download isn't"`
* [appledoubles](https://a.ocv.me/pub/stuff/?doc=appledoubles-and-friends.txt) are detected and skipped when uploading with the browser-UI 78208405
* IdP-volumes can be filtered by group 9c2c4237
  * `[/users/${u}]` in a config-file creates the volume for all users, like before
  * `[/users/${u%+canwrite}]` creates it only if the user is in the `canwrite` group
  * `[/users/${u%-admins}]` creates it only if the user is NOT in the `admins` group
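combined with per-user volumes, the group filters allow for layouts like the following config sketch (the volume names, filesystem paths, and the `su` group are illustrative):

```
[/sus/${u%+su}]    # only users IN group "su" get /sus/username
  /srv/tank1/${u}
  accs:
    rwmda: ${u}    # read-write-move-delete-admin for that user

[/m8s/${u%-su}]    # only users NOT in group "su" get /m8s/username
  /srv/tank2/${u}
  accs:
    rwmda: ${u}
```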
## 🩹 bugfixes

* when moving a folder with symlinks, don't expand them into full files 5ab09769
  * absolute symlinks are moved as-is; relative symlinks are rewritten so they still point to the same file when possible (if both source and destination are indexed in the db)
  * the previous behavior was good for un-deduplicating files after changing the server-settings, but was too inconvenient for all other usecases
* #146 fix downloading from shares when `-j0` is enabled 8417098c
* only show the download-as-zip link when the user is actually allowed to use it 14bb2999
* the suggestions in the serverlog on how to fix incorrect X-Forwarded-For settings would themselves be incorrect if the reverse-proxy used IPv6 to communicate with copyparty 16462ee5
* set nofollow on `?doc` links so crawlers don't download binary files as text 6a2644fe

## 🔧 other changes

* #147 IdP: make the warning about dangerous misconfigurations more accurate 29a17ae2
* #143 print a warning on incorrect character-encoding in textfiles (config-files, logues, readmes etc.) 25974d66
* copyparty.exe: update to jinja 3.1.6 (copyparty was *not affected* by the jinja-3.1.5 vuln)


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

# 2025-0228-1846 `v1.16.16` lemon melon cookie

<img src="https://github.com/9001/copyparty/raw/hovudstraum/docs/logo.svg" width="250" align="right"/>

webdev is [like a lemon](https://youtu.be/HPURbfKb7to) sometimes

* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) ╱ [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) ╱ [client testbed](https://cd.ocv.me/b/)

there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates, such as [vulnerabilities](https://github.com/9001/copyparty/security) (most recently 2025-02-25)

## recent important news

* [v1.16.15 (2025-02-25)](https://github.com/9001/copyparty/releases/tag/v1.16.15) fixed low-severity xss when uploading maliciously-named files
* [v1.15.0 (2024-09-08)](https://github.com/9001/copyparty/releases/tag/v1.15.0) changed upload deduplication to be default-disabled
* [v1.14.3 (2024-08-30)](https://github.com/9001/copyparty/releases/tag/v1.14.3) fixed a bug that was introduced in v1.13.8 (2024-08-13); this bug could lead to **data loss** -- see the v1.14.3 release-notes for details

## 🧪 new features

* #142 workaround for an android-chrome timestamp bug 5e12abbb
  * specific recent versions of chrome would upload all files with last-modified year 1601
  * https://issues.chromium.org/issues/393149335 has the actual fix; will be out soon

## 🩹 bugfixes

* add helptext for volflags `dk`, `dks`, `dky` 65a7706f
* fix a false-positive warning when disabling a global option per-volume by unsetting the volflag

## 🔧 other changes

* #140 nixos: @daimond113 fixed a warning in the nixpkg (thx!) e0fe2b97


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

# 2025-0225-0017 `v1.16.15` fix low-severity vuln

<img src="https://github.com/9001/copyparty/raw/hovudstraum/docs/logo.svg" width="250" align="right"/>

* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) ╱ [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) ╱ [client testbed](https://cd.ocv.me/b/)

## ⚠️ this fixes a minor vulnerability; CVE-score `3.6`/`10`

[GHSA-m2jw-cj8v-937r](https://github.com/9001/copyparty/security/advisories/GHSA-m2jw-cj8v-937r) aka [CVE-2025-27145](https://www.cve.org/CVERecord?id=CVE-2025-27145) could let an attacker run arbitrary javascript by tricking an authenticated user into uploading files with malicious filenames

* ...but it required some clever social engineering, and is **not likely** to be a cause for concern... still, better safe than sorry

there is a [discord server](https://discord.gg/25J8CdTT6G) with an `@everyone` in case of future important updates, such as [vulnerabilities](https://github.com/9001/copyparty/security) (most recently 2025-02-25)

## recent important news

* [v1.15.0 (2024-09-08)](https://github.com/9001/copyparty/releases/tag/v1.15.0) changed upload deduplication to be default-disabled
* [v1.14.3 (2024-08-30)](https://github.com/9001/copyparty/releases/tag/v1.14.3) fixed a bug that was introduced in v1.13.8 (2024-08-13); this bug could lead to **data loss** -- see the v1.14.3 release-notes for details

## 🧪 new features

* nothing this time

## 🩹 bugfixes

* fix [GHSA-m2jw-cj8v-937r](https://github.com/9001/copyparty/security/advisories/GHSA-m2jw-cj8v-937r) / [CVE-2025-27145](https://www.cve.org/CVERecord?id=CVE-2025-27145) in 438ea6cc
  * when trying to upload an empty file by dragging it into the browser, the filename would be rendered as HTML, allowing javascript injection if the filename was malicious
  * issue discovered and reported by @JayPatel48 (thx!)
* related issues in the errorhandling of uploads 499ae1c7 36866f1d
  * these all had the same consequences as the GHSA above, but a network outage was necessary to trigger them
    * which would probably have the lucky side-effect of blocking the javascript download, nice
* paranoid fixing of probably-not-even-issues 3adbb2ff
* fix some markdown / texteditor bugs 407531bc
  * only indicate file-versions for markdown files in listings, since it's tricky to edit non-textfiles otherwise
  * CTRL-C followed by CTRL-V and CTRL-Z in a single-line file would make a character fall off
  * ensure safety of extensions

## 🔧 other changes

* readme:
  * mention support for running the server on risc-v 6d102fc8
  * mention that the [sony psp](https://github.com/user-attachments/assets/9d21f020-1110-4652-abeb-6fc09c533d4f) can browse and upload 598a29a7

----

# 💾 what to download?

| download link | is it good? | description |
| -- | -- | -- |
| **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** | ✅ the best 👍 | runs anywhere! only needs python |
| [a docker image](https://github.com/9001/copyparty/blob/hovudstraum/scripts/docker/README.md) | it's ok | good if you prefer docker 🐋 |
| [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) | ⚠️ [acceptable](https://github.com/9001/copyparty#copypartyexe) | for [win8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png) or later; built-in thumbnailer |
| [u2c.exe](https://github.com/9001/copyparty/releases/download/v1.16.14/u2c.exe) | ⚠️ acceptable | [CLI uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/u2c.py) as a win7+ exe ([video](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm)) |
| [copyparty.pyz](https://github.com/9001/copyparty/releases/latest/download/copyparty.pyz) | ⚠️ acceptable | similar to the regular sfx, [mostly worse](https://github.com/9001/copyparty#zipapp) |
| [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) | ⛔️ [dangerous](https://github.com/9001/copyparty#copypartyexe) | for [win7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png) -- never expose to the internet! |
| [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.16.5/copyparty-winpe64.exe) | ⛔️ dangerous | runs on [64bit WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png), otherwise useless |

* except for [u2c.exe](https://github.com/9001/copyparty/releases/download/v1.16.14/u2c.exe), all of the options above are mostly equivalent
* the zip and tar.gz files below are just source code
* python packages are available at [PyPI](https://pypi.org/project/copyparty/#files)


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

# 2025-0219-2309 `v1.16.14` overwrite by upload

## 🧪 new features

* #139 overwrite existing files by uploading over them e9f78ea7
  * default-disabled; a new togglebutton in the upload-UI configures it
  * can optionally compare last-modified-time and only overwrite older files
* [GDPR compliance](https://github.com/9001/copyparty#GDPR-compliance) (maybe/probably) 4be0d426

## 🩹 bugfixes

* some cosmetic volflag stuff, all harmless b190e676
  * disabling a volflag `foo` with `-foo` shows a warning that `-foo` was not a recognized volflag, but it still does the right thing
  * some volflags give the *"unrecognized volflag, will ignore"* warning, but not to worry, they still work just fine:
    * `xz` to allow serverside xz-compression of uploaded files
* the option to customize the loader-spinner would glitch out during the initial page load 7d7d5d6c

## 🔧 other changes

* [randpic.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/handlers/randpic.py), new 404-handler example, returns a random pic from a folder 60d5f271
* readme: [howto permanent cloudflare tunnel](https://github.com/9001/copyparty#permanent-cloudflare-tunnel) for easy hosting from home 2beb2acc
* [synology-dsm](https://github.com/9001/copyparty/blob/hovudstraum/docs/synology-dsm.md): mention how to update the docker image 56ce5919
* spinner improvements 6858cb06


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

# 2025-0213-2057 `v1.16.13` configure with confidence

@@ -281,8 +281,11 @@ on writing your own [hooks](../README.md#event-hooks)
hooks can cause intentional side-effects, such as redirecting an upload into another location, or creating+indexing additional files, or deleting existing files, by returning json on stdout

* `reloc` can redirect uploads before/after uploading has finished, based on filename, extension, file contents, uploader ip/name etc.
  * example: [reloc-by-ext](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reloc-by-ext.py)
* `idx` informs copyparty about a new file to index as a consequence of this upload
  * example: [podcast-normalizer.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/podcast-normalizer.py)
* `del` tells copyparty to delete an unrelated file by vpath
  * example: ( ´・ω・) nyoro~n

for these to take effect, the hook must be defined with the `c1` flag; see example [reloc-by-ext](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reloc-by-ext.py)
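as a rough sketch of the json-on-stdout mechanism described above (this is not the actual reloc-by-ext script; the argument format, the json shape, and the `/images` target are assumptions, so check the linked example for the real interface), a `c1` hook that redirects jpeg uploads might look like:

```python
#!/usr/bin/env python3
# hypothetical copyparty upload-hook; prints a json object on stdout
# to redirect jpeg uploads into a (made-up) /images volume-path
import json
import sys


def decide(fname):
    """return a reloc instruction for jpeg files, None for everything else"""
    if fname.lower().endswith((".jpg", ".jpeg")):
        return {"reloc": {"vp": "/images"}}  # assumed json shape
    return None


if __name__ == "__main__" and len(sys.argv) > 1:
    verdict = decide(sys.argv[1])  # assuming the filepath arrives as argv[1]
    if verdict:
        print(json.dumps(verdict))
```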

@@ -11,9 +11,14 @@ services:
       - ./:/cfg:z
       - /path/to/your/fileshare/top/folder:/w:z
 
+    # enabling mimalloc by replacing "NOPE" with "2" will make some stuff twice as fast, but everything will use twice as much ram:
+    environment:
+      LD_PRELOAD: /usr/lib/libmimalloc-secure.so.NOPE
+
     stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal
     healthcheck:
-      test: ["CMD-SHELL", "wget --spider -q 127.0.0.1:3923/?reset"]
+      # hide it from logs with "/._" so it matches the default --lf-url filter
+      test: ["CMD-SHELL", "wget --spider -q 127.0.0.1:3923/?reset=/._"]
       interval: 1m
       timeout: 2s
       retries: 5
@@ -23,6 +23,9 @@ services:
       - 'traefik.http.routers.copyparty.tls=true'
       - 'traefik.http.routers.copyparty.middlewares=authelia@docker'
     stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal
+    environment:
+      LD_PRELOAD: /usr/lib/libmimalloc-secure.so.NOPE
+      # enable mimalloc by replacing "NOPE" with "2" for a nice speed-boost (will use twice as much ram)
 
   authelia:
     image: authelia/authelia:v4.38.0-beta3 # the config files in the authelia folder use the new syntax
@@ -22,13 +22,10 @@ services:
       - 'traefik.http.routers.fs.rule=Host(`fs.example.com`)'
       - 'traefik.http.routers.fs.entrypoints=http'
       #- 'traefik.http.routers.fs.middlewares=authelia@docker' # TODO: ???
-    healthcheck:
-      test: ["CMD-SHELL", "wget --spider -q 127.0.0.1:3923/?reset"]
-      interval: 1m
-      timeout: 2s
-      retries: 5
-      start_period: 15s
     stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal
+    environment:
+      LD_PRELOAD: /usr/lib/libmimalloc-secure.so.NOPE
+      # enable mimalloc by replacing "NOPE" with "2" for a nice speed-boost (will use twice as much ram)
 
   traefik:
     image: traefik:v2.11
@@ -13,6 +13,8 @@
 # because that is the data-volume in the docker containers,
 # because a deployment like this (with an IdP) is more commonly
 # seen in containerized environments -- but this is not required
+#
+# the example group "su" (super-user) is the admins group
 
 
 [global]
@@ -78,6 +80,18 @@
   rwmda: @${g}, @su # read-write-move-delete-admin for that group + the "su" group
 
 
+[/sus/${u%+su}] # users which ARE members of group "su" gets /sus/username
+  /w/tank1/${u} # which will be "tank1/username" in the docker data volume
+  accs:
+    rwmda: ${u} # read-write-move-delete-admin for that username
+
+
+[/m8s/${u%-su}] # users which are NOT members of group "su" gets /m8s/username
+  /w/tank2/${u} # which will be "tank2/username" in the docker data volume
+  accs:
+    rwmda: ${u} # read-write-move-delete-admin for that username
+
+
 # and create some strategic volumes to prevent anyone from gaining
 # unintended access to priv folders if the users/groups db is lost
 [/u]
@@ -88,3 +102,7 @@
   /w/lounge
   accs:
     rwmda: @su
+[/sus]
+  /w/tank1
+[/m8s]
+  /w/tank2
@@ -33,12 +33,6 @@ if you are introducing a new ttf/woff font, don't forget to declare the font its
 }
 ```
 
-and because textboxes don't inherit fonts by default, you can force it like this:
-
-```css
-input[type=text], input[type=submit], input[type=button] { font-family: var(--font-main) }
-```
-
 and if you want to have a monospace font in the fancy markdown editor, do this:
 
 ```css
@@ -131,6 +131,7 @@ symbol legend,
 | runs on Linux | █ | ╱ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
 | runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | |
 | runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ | |
+| runs on Risc-V | █ | | | █ | █ | █ | | • | | █ | | | |
 | portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ | █ |
 | zero setup, just go | █ | █ | █ | | | ╱ | █ | | | █ | | ╱ | █ |
 | android app | ╱ | | | █ | █ | | | | | | | | |
@@ -3,7 +3,7 @@ WORKDIR /z
 ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
     ver_hashwasm=4.12.0 \
     ver_marked=4.3.0 \
-    ver_dompf=3.2.4 \
+    ver_dompf=3.2.5 \
     ver_mde=2.18.0 \
     ver_codemirror=5.65.18 \
     ver_fontawesome=5.13.0 \
@@ -12,7 +12,7 @@ ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
 
 # versioncheck:
 # https://github.com/markedjs/marked/releases
-# https://github.com/Ionaru/easy-markdown-editor/tags
+# https://github.com/Ionaru/easy-markdown-editor/tags # ignore 2.20.0
 # https://github.com/codemirror/codemirror5/releases
 # https://github.com/cure53/DOMPurify/releases
 # https://github.com/Daninet/hash-wasm/releases
@@ -8,12 +8,13 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
 ENV XDG_CONFIG_HOME=/cfg
 
 RUN apk --no-cache add !pyc \
-    tzdata wget \
+    tzdata wget mimalloc2 mimalloc2-insecure \
     py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
     ffmpeg
 
 COPY i/dist/copyparty-sfx.py innvikler.sh ./
-RUN ash innvikler.sh && rm innvikler.sh
+ADD base ./base
+RUN ash innvikler.sh ac
 
 WORKDIR /w
 EXPOSE 3923
@@ -11,7 +11,7 @@ COPY i/bin/mtag/install-deps.sh ./
 COPY i/bin/mtag/audio-bpm.py /mtag/
 COPY i/bin/mtag/audio-key.py /mtag/
 RUN apk add -U !pyc \
-    tzdata wget \
+    tzdata wget mimalloc2 mimalloc2-insecure \
     py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
     py3-pip py3-cffi \
     ffmpeg \
@@ -31,7 +31,8 @@ RUN apk add -U !pyc \
     && ln -s /root/vamp /root/.local /
 
 COPY i/dist/copyparty-sfx.py innvikler.sh ./
-RUN ash innvikler.sh && rm innvikler.sh
+ADD base ./base
+RUN ash innvikler.sh dj
 
 WORKDIR /w
 EXPOSE 3923
@@ -8,11 +8,12 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
 ENV XDG_CONFIG_HOME=/cfg
 
 RUN apk --no-cache add !pyc \
-    tzdata wget \
+    tzdata wget mimalloc2 mimalloc2-insecure \
     py3-jinja2 py3-argon2-cffi py3-pillow py3-mutagen
 
 COPY i/dist/copyparty-sfx.py innvikler.sh ./
-RUN ash innvikler.sh && rm innvikler.sh
+ADD base ./base
+RUN ash innvikler.sh im
 
 WORKDIR /w
 EXPOSE 3923
@@ -8,7 +8,7 @@ LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
 ENV XDG_CONFIG_HOME=/cfg
 
 RUN apk add -U !pyc \
-    tzdata wget \
+    tzdata wget mimalloc2 mimalloc2-insecure \
     py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
     py3-pip py3-cffi \
     ffmpeg \
@@ -21,7 +21,8 @@ RUN apk add -U !pyc \
     && apk del py3-pip .bd
 
 COPY i/dist/copyparty-sfx.py innvikler.sh ./
-RUN ash innvikler.sh && rm innvikler.sh
+ADD base ./base
+RUN ash innvikler.sh iv
 
 WORKDIR /w
 EXPOSE 3923
@@ -11,7 +11,7 @@ RUN apk --no-cache add !pyc \
     py3-jinja2
 
 COPY i/dist/copyparty-sfx.py innvikler.sh ./
-RUN ash innvikler.sh && rm innvikler.sh
+RUN ash innvikler.sh min
 
 WORKDIR /w
 EXPOSE 3923
@@ -101,6 +101,14 @@ the following advice is best-effort and not guaranteed to be entirely correct
 
 * copyparty will generally create a `.hist` folder at the top of each volume, which contains the filesystem index, thumbnails and such. For performance reasons, but also just to keep things tidy, it might be convenient to store these inside the config folder instead. Add the line `hist: /cfg/hists/` inside the `[global]` section of your `copyparty.conf` to do this
 
+* if you want more performance, and you're OK with doubling the RAM usage, then consider enabling mimalloc **(maybe buggy)** with one of these:
+
+  * `-e LD_PRELOAD=/usr/lib/libmimalloc-secure.so.2` makes download-as-zip **3x** as fast, filesystem-indexing **1.5x** as fast, etc.
+
+  * `-e LD_PRELOAD=/usr/lib/libmimalloc-insecure.so.2` adds another 10% speed but makes it easier to exploit future vulnerabilities
+
+  * complete example: `podman run --rm -it -p 3923:3923 -v "$PWD:/w:z" -e LD_PRELOAD=/usr/lib/libmimalloc-secure.so.2 copyparty/ac -v /w::r`
+
 
 ## enabling the ftp server
scripts/docker/base/Dockerfile.zlibng (new file, 5 lines)

@@ -0,0 +1,5 @@
+FROM alpine:latest
+WORKDIR /z
+
+RUN apk add py3-pip make gcc musl-dev python3-dev
+RUN pip wheel https://files.pythonhosted.org/packages/c4/a7/0b7673be5945071e99364a3ac1987b02fc1d416617e97f3e8816d275174e/zlib_ng-0.5.1.tar.gz
scripts/docker/base/Makefile (new file, 15 lines)

@@ -0,0 +1,15 @@
+self := $(dir $(abspath $(lastword $(MAKEFILE_LIST))))
+
+all:
+	# build zlib-ng from source so we know how the sausage was made
+	# (still only doing the archs which are officially supported/tested)
+
+	podman build --arch amd64 -t localhost/cpp-zlibng-amd64:latest -f Dockerfile.zlibng .
+	podman run --arch amd64 --rm --log-driver=none -i localhost/cpp-zlibng-amd64:latest tar -cC/z . | tar -xv
+
+	podman build --arch arm64 -t localhost/cpp-zlibng-amd64:latest -f Dockerfile.zlibng .
+	podman run --arch arm64 --rm --log-driver=none -i localhost/cpp-zlibng-amd64:latest tar -cC/z . | tar -xv
+
+sh:
+	@printf "\n\033[1;31mopening a shell in the most recently created docker image\033[0m\n"
+	docker run --rm -it --entrypoint /bin/ash `docker images -aq | head -n 1`

@@ -1,6 +1,16 @@
 #!/bin/ash
 set -ex
 
+# use zlib-ng if available
+f=/z/base/zlib_ng-0.5.1-cp312-cp312-linux_$(cat /etc/apk/arch).whl
+[ "$1" != min ] && [ -e $f ] && {
+	apk add -t .bd !pyc py3-pip
+	rm -f /usr/lib/python3*/EXTERNALLY-MANAGED
+	pip install $f
+	apk del .bd
+}
+rm -rf /z/base
+
 # cleanup for flavors with python build steps (dj/iv)
 rm -rf /var/cache/apk/* /root/.cache
@@ -22,6 +32,9 @@ rm -rf \
   /tmp/pe-* /z/copyparty-sfx.py \
   ensurepip pydoc_data turtle.py turtledemo lib2to3
 
+# speedhack
+sed -ri 's/os.environ.get\("PRTY_NO_IMPRESO"\)/"1"/' /usr/lib/python3.*/site-packages/copyparty/util.py
+
 # drop bytecode
 find / -xdev -name __pycache__ -print0 | xargs -0 rm -rf
@@ -40,7 +53,33 @@ find -name __pycache__ |
 cd /z
 python3 -m copyparty \
   --ign-ebind -p$((1024+RANDOM)),$((1024+RANDOM)),$((1024+RANDOM)) \
-  --no-crt -qi127.1 --exit=idx -e2dsa -e2ts
+  -v .::r --no-crt -qi127.1 --exit=idx -e2dsa -e2ts
+
+########################################################################
+# test download-as-tar.gz
+
+t=$(mktemp)
+python3 -m copyparty \
+  --ign-ebind -p$((1024+RANDOM)),$((1024+RANDOM)),$((1024+RANDOM)) \
+  -v .::r --no-crt -qi127.1 --wr-h-eps $t & pid=$!
+
+for n in $(seq 1 200); do sleep 0.2
+	v=$(awk '/^127/{print;n=1;exit}END{exit n-1}' $t) && break
+done
+[ -z "$v" ] && echo SNAAAAAKE && exit 1
+
+for n in $(seq 1 200); do sleep 0.2
+	wget -O- http://${v/ /:}/?tar=gz:1 >tf && break
+done
+tar -xzO top/innvikler.sh <tf | cmp innvikler.sh
+rm tf
+
+kill $pid; wait $pid
+
+########################################################################
 
 # output from -e2d
 rm -rf .hist
+
+# goodbye
+exec rm innvikler.sh
@@ -237,6 +237,8 @@ necho() {
 	tar -zxf $f
 	mv partftpy-*/partftpy .
 	rm -rf partftpy-* partftpy/bin
+	#(cd partftpy && "$pybin" ../../scripts/strip_hints/a.py; rm uh) # dont need the full thing, just this:
+	sed -ri 's/from typing import TYPE_CHECKING$/TYPE_CHECKING = False/' partftpy/TftpShared.py
 
 	necho collecting python-magic
 	v=0.4.27
@@ -79,7 +79,6 @@ excl=(
 	email.parser
 	importlib.resources
 	importlib_resources
-	inspect
 	multiprocessing
 	packaging
 	pdb
@@ -99,6 +98,7 @@ excl=(
 	PIL.ImageWin
 	PIL.PdfParser
 ) || excl+=(
+	inspect
 	PIL
 	PIL.ExifTags
 	PIL.Image
@@ -23,11 +23,11 @@ ac96786e5d35882e0c5b724794329c9125c2b86ae7847f17acfc49f0d294312c6afc1c3f248655de
 # win10
 0a2cd4cadf0395f0374974cd2bc2407e5cc65c111275acdffb6ecc5a2026eee9e1bb3da528b35c7f0ff4b64563a74857d5c2149051e281cc09ebd0d1968be9aa en-us_windows_10_enterprise_ltsc_2021_x64_dvd_d289cf96.iso
 16cc0c58b5df6c7040893089f3eb29c074aed61d76dae6cd628d8a89a05f6223ac5d7f3f709a12417c147594a87a94cc808d1e04a6f1e407cc41f7c9f47790d1 virtio-win-0.1.248.iso
-18b9e8cfa682da51da1b682612652030bd7f10e4a1d5ea5220ab32bde734b0e6fe1c7dbd903ac37928c0171fd45d5ca602952054de40a4e55e9ed596279516b5 jinja2-3.1.5-py3-none-any.whl
+9a7f40edc6f9209a2acd23793f3cbd6213c94f36064048cb8bf6eb04f1bdb2c2fe991cb09f77fe8b13e5cd85c618ef23573e79813b2fef899ab2f290cd129779 jinja2-3.1.6-py3-none-any.whl
 6df21f0da408a89f6504417c7cdf9aaafe4ed88cfa13e9b8fa8414f604c0401f885a04bbad0484dc51a29284af5d1548e33c6cc6bfb9896d9992c1b1074f332d MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl
 8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
 0203ec2551c4836696cfab0b2c9fff603352f03fa36e7476e2e1ca7ec57a3a0c24bd791fcd92f342bf817f0887854d9f072e0271c643de4b313d8c9569ba8813 packaging-24.1-py3-none-any.whl
-12d7921dc7dfd8a4b0ea0fa2bae8f1354fcdd59ece3d7f4e075aed631f9ba791dc142c70b1ccd1e6287c43139df1db26bd57a7a217c8da3a77326036495cdb57 pillow-11.1.0-cp312-cp312-win_amd64.whl
+c9051daaf34ec934962c743a5ac2dbe55a9b0cababb693a8cde0001d24d4a50b67bd534d714d935def6ca7b898ec0a352e58bd9ccdce01c54eaf2281b18e478d pillow-11.2.1-cp312-cp312-win_amd64.whl
 f0463895e9aee97f31a2003323de235fed1b26289766dc0837261e3f4a594a31162b69e9adbb0e9a31e2e2d4b5f25c762ed1669553df7dc89a8ba4f85d297873 pyinstaller-6.11.1-py3-none-win_amd64.whl
 d550a0a14428386945533de2220c4c2e37c0c890fc51a600f626c6ca90a32d39572c121ec04c157ba3a8d6601cb021f8433d871b5c562a3d342c804fffec90c1 pyinstaller_hooks_contrib-2024.11-py3-none-any.whl
-17b64ff6744004a05d475c8f6de3e48286db4069afad4cae690f83b3555f8e35ceafb210eeba69a11e983d0da3001099de284b6696ed0f1bf9cd791938a7f2cd python-3.12.9-amd64.exe
+4f9a4d9f65c93e2d851e2674057343a9599f30f5dc582ffca485522237d4fcf43653b3d393ed5eb11e518c4ba93714a07134bbb13a97d421cce211e1da34682e python-3.12.10-amd64.exe
@@ -34,14 +34,14 @@ fns=(
 	upx-4.2.4-win32.zip
 )
 [ $w10 ] && fns+=(
-	jinja2-3.1.4-py3-none-any.whl
+	jinja2-3.1.6-py3-none-any.whl
 	MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl
 	mutagen-1.47.0-py3-none-any.whl
 	packaging-24.1-py3-none-any.whl
-	pillow-11.1.0-cp312-cp312-win_amd64.whl
+	pillow-11.2.1-cp312-cp312-win_amd64.whl
 	pyinstaller-6.10.0-py3-none-win_amd64.whl
 	pyinstaller_hooks_contrib-2024.8-py3-none-any.whl
-	python-3.12.9-amd64.exe
+	python-3.12.10-amd64.exe
 )
 [ $w7 ] && fns+=(
 	future-1.0.0-py3-none-any.whl
@@ -413,6 +413,9 @@ def run_i(ld):
     for x in ld:
         sys.path.insert(0, x)
 
+    e = os.environ
+    e["PRTY_NO_IMPRESO"] = "1"
+
     from copyparty.__main__ import main as p
 
     p()
@@ -84,6 +84,9 @@ def uh2(fp):
         if " # !rm" in ln:
             continue
 
+        if ln.endswith("TYPE_CHECKING"):
+            ln = ln.replace("from typing import TYPE_CHECKING", "TYPE_CHECKING = False")
+
         lns.append(ln)
 
     cs = "\n".join(lns)
@@ -148,6 +148,7 @@ var tl_browser = {
|
||||
["U/O", "skip 10sec back/fwd"],
|
||||
["0..9", "jump to 0%..90%"],
|
||||
["P", "play/pause (also initiates)"],
|
||||
["S", "select playing song"],
|
||||
["Y", "download song"],
|
||||
], [
|
||||
"image-viewer",
|
||||
@@ -156,6 +157,7 @@ var tl_browser = {
|
||||
["F", "fullscreen"],
|
||||
["R", "rotate clockwise"],
|
||||
["🡅 R", "rotate ccw"],
|
||||
["S", "select pic"],
|
||||
["Y", "download pic"],
|
||||
], [
|
||||
"video-player",
|
||||
@@ -235,7 +237,8 @@ var tl_browser = {
|
||||
|
||||
"ul_par": "parallel uploads:",
|
||||
"ut_rand": "randomize filenames",
|
||||
"ut_u2ts": "copy the last-modified timestamp$Nfrom your filesystem to the server",
|
||||
"ut_u2ts": "copy the last-modified timestamp$Nfrom your filesystem to the server\">📅",
|
||||
"ut_ow": "overwrite existing files on the server?$N🛡️: never (will generate a new filename instead)$N🕒: overwrite if server-file is older than yours$N♻️: always overwrite if the files are different",
|
||||
"ut_mt": "continue hashing other files while uploading$N$Nmaybe disable if your CPU or HDD is a bottleneck",
|
||||
"ut_ask": 'ask for confirmation before upload starts">💭',
|
||||
"ut_pot": "improve upload speed on slow devices$Nby making the UI less complex",
|
||||
@@ -327,7 +330,7 @@ var tl_browser = {
|
||||
"cut_nag": "OS notification when upload completes$N(only if the browser or tab is not active)",
|
||||
"cut_sfx": "audible alert when upload completes$N(only if the browser or tab is not active)",
|
||||
|
||||
"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$N30% faster https, 4.5x faster http,$Nand 5.3x faster on android phones\">mt",
|
||||
"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$Nmakes https 30% faster, http 4.5x faster\">mt",
|
||||
|
||||
"cft_text": "favicon text (blank and refresh to disable)",
|
||||
"cft_fg": "foreground color",
|
||||
@@ -349,10 +352,12 @@ var tl_browser = {
|
||||
"ml_pmode": "at end of folder...",
|
||||
"ml_btns": "cmds",
|
||||
"ml_tcode": "transcode",
|
||||
"ml_tcode2": "transcode to",
|
||||
"ml_tint": "tint",
|
||||
"ml_eq": "audio equalizer",
|
||||
"ml_drc": "dynamic range compressor",
|
||||
|
||||
"mt_loop": "loop/repeat one song\">🔁",
|
||||
"mt_shuf": "shuffle the songs in each folder\">🔀",
|
||||
"mt_aplay": "autoplay if there is a song-ID in the link you clicked to access the server$N$Ndisabling this will also stop the page URL from being updated with song-IDs when playing music, to prevent autoplay if these settings are lost but the URL remains\">a▶",
|
||||
"mt_preload": "start loading the next song near the end for gapless playback\">preload",
|
||||
@@ -372,6 +377,14 @@ var tl_browser = {

"mt_cflac": "convert flac / wav to opus\">flac",
"mt_caac": "convert aac / m4a to opus\">aac",
"mt_coth": "convert all others (not mp3) to opus\">oth",
"mt_c2opus": "best choice for desktops, laptops, android\">opus",
"mt_c2owa": "opus-weba, for iOS 17.5 and newer\">owa",
"mt_c2caf": "opus-caf, for iOS 11 through 17\">caf",
"mt_c2mp3": "use this on very old devices\">mp3",
"mt_c2ok": "nice, good choice",
"mt_c2nd": "that's not the recommended output format for your device, but that's fine",
"mt_c2ng": "your device does not seem to support this output format, but let's try anyways",
"mt_xowa": "there are bugs in iOS preventing background playback using this format; please use caf or mp3 instead",
"mt_tint": "background level (0-100) on the seekbar$Nto make buffering less distracting",
"mt_eq": "enables the equalizer and gain control;$N$Nboost <code>0</code> = standard 100% volume (unmodified)$N$Nwidth <code>1 </code> = standard stereo (unmodified)$Nwidth <code>0.5</code> = 50% left-right crossfeed$Nwidth <code>0 </code> = mono$N$Nboost <code>-0.8</code> & width <code>10</code> = vocal removal :^)$N$Nenabling the equalizer makes gapless albums fully gapless, so leave it on with all the values at zero (except width = 1) if you care about that",
"mt_drc": "enables the dynamic range compressor (volume flattener / brickwaller); will also enable EQ to balance the spaghetti, so set all EQ fields except for 'width' to 0 if you don't want it$N$Nlowers the volume of audio above THRESHOLD dB; for every RATIO dB past THRESHOLD there is 1 dB of output, so default values of tresh -24 and ratio 12 means it should never get louder than -22 dB and it is safe to increase the equalizer boost to 0.8, or even 1.8 with ATK 0 and a huge RLS like 90 (only works in firefox; RLS is max 1 in other browsers)$N$N(see wikipedia, they explain it much better)",
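The `mt_drc` tooltip above describes a standard static compression curve: input above THRESHOLD is divided by RATIO, so a 0 dB (full-scale) input with the defaults (thresh -24, ratio 12) comes out at -22 dB. A minimal sketch of that arithmetic (function name is illustrative, not part of the codebase):

```python
def drc_out_db(in_db, thresh=-24.0, ratio=12.0):
    """Static downward-compression curve as described in the mt_drc tooltip."""
    # below the threshold, audio passes through unchanged
    if in_db <= thresh:
        return in_db
    # above it, every `ratio` dB of input yields only 1 dB of output
    return thresh + (in_db - thresh) / ratio

# full-scale input (0 dB) with the defaults lands at -22 dB,
# matching the tooltip's "should never get louder than -22 dB"
print(drc_out_db(0.0))  # -22.0
```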
@@ -390,6 +403,7 @@ var tl_browser = {

"mm_eunk": "Unknown Errol",
"mm_e404": "Could not play audio; error 404: File not found.",
"mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
"mm_e500": "Could not play audio; error 500: Check server logs.",
"mm_e5xx": "Could not play audio; server error ",
"mm_nof": "not finding any more audio files nearby",
"mm_prescan": "Looking for music to play next...",
@@ -404,6 +418,7 @@ var tl_browser = {

"f_bigtxt": "this file is {0} MiB large -- really view as text?",
"fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
"fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
"f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",

"f_dls": 'the file links in the current folder have\nbeen changed into download links',
@@ -613,8 +628,10 @@ var tl_browser = {

"u_ewrite": 'you do not have write-access to this folder',
"u_eread": 'you do not have read-access to this folder',
"u_enoi": 'file-search is not enabled in server config',
"u_enoow": "overwrite will not work here; need Delete-permission",
"u_badf": 'These {0} files (of {1} total) were skipped, possibly due to filesystem permissions:\n\n',
"u_blankf": 'These {0} files (of {1} total) are blank / empty; upload them anyways?\n\n',
"u_applef": 'These {0} files (of {1} total) are probably undesirable;\nPress <code>OK/Enter</code> to SKIP the following files,\nPress <code>Cancel/ESC</code> to NOT exclude, and UPLOAD those as well:\n\n',
"u_just1": '\nMaybe it works better if you select just one file',
"u_ff_many": "if you're using <b>Linux / MacOS / Android,</b> then this amount of files <a href=\"https://bugzilla.mozilla.org/show_bug.cgi?id=1790500\" target=\"_blank\"><em>may</em> crash Firefox!</a>\nif that happens, please try again (or use Chrome).",
"u_up_life": "This upload will be deleted from the server\n{0} after it completes",
@@ -371,7 +371,7 @@ class TestDots(unittest.TestCase):

        return " ".join(tar)

    def curl(self, url, uname, binary=False, req=b""):
-        req = req or hdr(url, uname)
+        req = req or hdr(url.replace("th=x", "th=j"), uname)
        conn = self.conn.setbuf(req)
        HttpCli(conn).run()
        if binary:
@@ -129,13 +129,13 @@ class Cfg(Namespace):

    def __init__(self, a=None, v=None, c=None, **ka0):
        ka = {}

-        ex = "chpw daw dav_auth dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink ih ihead magic hardlink_only nid nih no_acode no_athumb no_bauth no_clone no_cp no_dav no_db_ip no_del no_dirsz no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nsort nw og og_no_head og_s_title ohead q rand re_dirsz rss smb srch_dbg srch_excl stats uqe vague_403 vc ver write_uplog xdev xlink xvol zs"
+        ex = "chpw daw dav_auth dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink ih ihead magic hardlink_only nid nih no_acode no_athumb no_bauth no_clone no_cp no_dav no_db_ip no_del no_dirsz no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nsort nw og og_no_head og_s_title ohead q rand re_dirsz rss smb srch_dbg srch_excl stats uqe vague_403 vc ver wo_up_readme write_uplog xdev xlink xvol zipmaxu zs"
        ka.update(**{k: False for k in ex.split()})

        ex = "dav_inf dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_up_list no_voldump re_dhash plain_ip"
        ka.update(**{k: True for k in ex.split()})

-        ex = "ah_cli ah_gen css_browser hist ipu js_browser js_other mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua"
+        ex = "ah_cli ah_gen css_browser dbpath hist ipu js_browser js_other mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua ua_nodoc ua_nozip"
        ka.update(**{k: None for k in ex.split()})

        ex = "hash_mt hsortn safe_dedup srch_time u2abort u2j u2sz"
@@ -144,10 +144,10 @@ class Cfg(Namespace):

        ex = "au_vol dl_list mtab_age reg_cap s_thead s_tbody th_convt ups_who zip_who"
        ka.update(**{k: 9 for k in ex.split()})

-        ex = "db_act forget_ip k304 loris no304 re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo u2ow"
+        ex = "db_act forget_ip k304 loris no304 re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo u2ow zipmaxn zipmaxs"
        ka.update(**{k: 0 for k in ex.split()})

-        ex = "ah_alg bname chpw_db doctitle df exit favico idp_h_usr ipa html_head lg_sba lg_sbf log_fk md_sba md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i shr tcolor textfiles unlist vname xff_src R RS SR"
+        ex = "ah_alg bname chpw_db doctitle df exit favico idp_h_usr ipa html_head lg_sba lg_sbf log_fk md_sba md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i shr tcolor textfiles unlist vname xff_src zipmaxt R RS SR"
        ka.update(**{k: "" for k in ex.split()})

        ex = "ban_403 ban_404 ban_422 ban_pw ban_url spinner"
@@ -174,6 +174,7 @@ class Cfg(Namespace):

            lang="eng",
            log_badpwd=1,
            logout=573,
            md_hist="s",
            mte={"a": True},
            mth={},
            mtp=[],