Compare commits

65 Commits

| Author | SHA1 | Date |
|---|---|---|
| | 4dca1cf8f4 | |
| | edba7fffd3 | |
| | 21a96bcfe8 | |
| | 2d322dd48e | |
| | df6d4df4f8 | |
| | 5aa893973c | |
| | be0dd555a6 | |
| | 9921c43e3a | |
| | 14fa369fae | |
| | 0f0f8d90c1 | |
| | 1afbff7335 | |
| | 8c32b0e7bb | |
| | 9bc4c5d2e6 | |
| | 1534b7cb55 | |
| | 56d3bcf515 | |
| | 78605d9a79 | |
| | d46a40fed8 | |
| | ce4e489802 | |
| | fd7c71d6a3 | |
| | fad2268566 | |
| | a95ea03cd0 | |
| | f6be390579 | |
| | 4f264a0a9c | |
| | d27144340f | |
| | 299cff3ff7 | |
| | 42c199e78e | |
| | 1b2d39857b | |
| | ed908b9868 | |
| | d162502c38 | |
| | bf11b2a421 | |
| | 77274e9d59 | |
| | 8306e3d9de | |
| | deb6711b51 | |
| | 7ef6fd13cf | |
| | 65c4e03574 | |
| | c9fafb202d | |
| | d4d9069130 | |
| | 7eca90cc21 | |
| | 6ecf4fdceb | |
| | 8cae7a715b | |
| | c75b0c25a6 | |
| | 9dd5dec093 | |
| | ec05f8ccd5 | |
| | a1c7a095ee | |
| | 77df17d191 | |
| | fa5845ff5f | |
| | 17fa490687 | |
| | 1eff87c3bd | |
| | d123d2bff0 | |
| | 5ac3864874 | |
| | c599e2aaa3 | |
| | 2e53f7979a | |
| | f61511d8c8 | |
| | 47415a7120 | |
| | db7becacd2 | |
| | 28b63e587b | |
| | 9cb93ae1ed | |
| | e3e51fb83a | |
| | 49c7124776 | |
| | 60fb1207fc | |
| | 48470f6b50 | |
| | 1d308eeb4c | |
| | 84f5f41747 | |
| | 19189afb34 | |
| | 23e77a3389 | |
README.md (52 changes)
````diff
@@ -8,12 +8,14 @@ turn almost any device into a file server with resumable uploads/downloads using
 * 🔌 protocols: [http](#the-browser) // [webdav](#webdav-server) // [ftp](#ftp-server) // [tftp](#tftp-server) // [smb/cifs](#smb-server)
 * 📱 [android app](#android-app) // [iPhone shortcuts](#ios-shortcuts)

-👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
+👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running on a nuc in my basement

 📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)

 🎬 **videos:** [upload](https://a.ocv.me/pub/demo/pics-vids/up2k.webm) // [cli-upload](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm) // [race-the-beam](https://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)

+made in Norway 🇳🇴
+

 ## readme toc

````
````diff
@@ -54,6 +56,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [creating a playlist](#creating-a-playlist) - with a standalone mediaplayer or copyparty
 * [audio equalizer](#audio-equalizer) - and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
 * [fix unreliable playback on android](#fix-unreliable-playback-on-android) - due to phone / app settings
+* [textfile viewer](#textfile-viewer) - with realtime streaming of logfiles and such ([demo](https://a.ocv.me/pub/demo/logtail/))
 * [markdown viewer](#markdown-viewer) - and there are *two* editors
 * [markdown vars](#markdown-vars) - dynamic docs with serverside variable expansion
 * [other tricks](#other-tricks)
````
````diff
@@ -255,7 +258,8 @@ also see [comparison to similar software](./docs/versus.md)
 * ☑ play video files as audio (converted on server)
 * ☑ create and play [m3u8 playlists](#playlists)
 * ☑ image gallery with webm player
-* ☑ textfile browser with syntax hilighting
+* ☑ [textfile browser](#textfile-viewer) with syntax hilighting
+  * ☑ realtime streaming of growing files (logfiles and such)
 * ☑ [thumbnails](#thumbnails)
   * ☑ ...of images using Pillow, pyvips, or FFmpeg
   * ☑ ...of videos using FFmpeg
````
````diff
@@ -559,6 +563,8 @@ a client can request to see dotfiles in directory listings if global option `-ed

 dotfiles do not appear in search results unless one of the above is true, **and** the global option / volflag `dotsrch` is set

+> even if user has permission to see dotfiles, they are default-hidden unless `--see-dots` is set, and/or user has enabled the `dotfiles` option in the settings tab
+
 config file example, where the same permission to see dotfiles is given in two different ways just for reference:

 ```yaml
````
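As a reading aid for the hunk above: the listing rules combine a server-side permission (`-ed` / the `dots` volflag) with a client-side opt-in (`--see-dots` or the `dotfiles` setting). A minimal sketch of that filter, with hypothetical names (this is not copyparty's actual code):

```python
def visible_names(names, can_see_dots, wants_dots):
    # a dotfile is listed only if the server permits it AND the
    # client opted in; otherwise leading-dot entries are dropped
    if can_see_dots and wants_dots:
        return list(names)
    return [n for n in names if not n.startswith(".")]
```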
````diff
@@ -695,7 +701,10 @@ enabling `multiselect` lets you click files to select them, and then shift-click
 * `multiselect` is mostly intended for phones/tablets, but the `sel` option in the `[⚙️] settings` tab is better suited for desktop use, allowing selection by CTRL-clicking and range-selection with SHIFT-click, all without affecting regular clicking
   * the `sel` option can be made default globally with `--gsel` or per-volume with volflag `gsel`

-to show /icons/exe.png as the thumbnail for all .exe files, --ext-th=exe=/icons/exe.png (optionally as a volflag)
+to show `/icons/exe.png` and `/icons/elf.gif` as the thumbnail for all `.exe` and `.elf` files respectively, do this: `--ext-th=exe=/icons/exe.png --ext-th=elf=/icons/elf.gif`
+* optionally as separate volflags for each mapping; see config file example below
+* the supported image formats are [jpg, png, gif, webp, ico](https://developer.mozilla.org/en-US/docs/Web/Media/Guides/Formats/Image_types)
+* be careful with svg; chrome will crash if you have too many unique svg files showing on the same page (the limit is 250 or so) -- showing the same handful of svg files thousands of times is ok however

 config file example:

````
````diff
@@ -712,6 +721,7 @@ config file example:
     dthumb # disable ALL thumbnails and audio transcoding
     dvthumb # only disable video thumbnails
     ext-th: exe=/ico/exe.png # /ico/exe.png is the thumbnail of *.exe
+    ext-th: elf=/ico/elf.gif # ...and /ico/elf.gif is used for *.elf
     th-covers: folder.png,folder.jpg,cover.png,cover.jpg # the default
 ```
````
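The `ext-th` mappings above boil down to an extension-to-icon lookup. A sketch of that lookup, with hypothetical names (illustrative, not copyparty's implementation):

```python
import os

# hypothetical mirror of the ext-th volflags shown above
EXT_TH = {"exe": "/ico/exe.png", "elf": "/ico/elf.gif"}

def thumb_for(filename):
    # map a file's extension (dot stripped, case-insensitive)
    # to its configured static thumbnail, or None if unmapped
    ext = os.path.splitext(filename)[1].lstrip(".").lower()
    return EXT_TH.get(ext)
```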
````diff
@@ -1119,6 +1129,18 @@ not available on iPhones / iPads because AudioContext currently breaks backgroun
 due to phone / app settings, android phones may randomly stop playing music when the power saver kicks in, especially at the end of an album -- you can fix it by [disabling power saving](https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png) in the [app settings](https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png) of the browser you use for music streaming (preferably a dedicated one)


+## textfile viewer
+
+with realtime streaming of logfiles and such ([demo](https://a.ocv.me/pub/demo/logtail/)) , and terminal colors work too
+
+click `-txt-` next to a textfile to open the viewer, which has the following toolbar buttons:
+
+* `✏️ edit` opens the textfile editor
+* `📡 follow` starts monitoring the file for changes, streaming new lines in realtime
+  * similar to `tail -f`
+* [link directly](https://a.ocv.me/pub/demo/logtail/?doc=lipsum.txt&tail) to a file with tailing enabled by adding `&tail` to the textviewer URL
+
+
 ## markdown viewer

 and there are *two* editors
````
````diff
@@ -1479,7 +1501,6 @@ the same arguments can be set as volflags, in addition to `d2d`, `d2ds`, `d2t`,
 note:
 * upload-times can be displayed in the file listing by enabling the `.up_at` metadata key, either globally with `-e2d -mte +.up_at` or per-volume with volflags `e2d,mte=+.up_at` (will have a ~17% performance impact on directory listings)
 * `e2tsr` is probably always overkill, since `e2ds`/`e2dsa` would pick up any file modifications and `e2ts` would then reindex those, unless there is a new copyparty version with new parsers and the release note says otherwise
-* the rescan button in the admin panel has no effect unless the volume has `-e2ds` or higher

 config file example (these options are recommended btw):

````
````diff
@@ -1584,7 +1605,7 @@ config file example:
     w: * # anyone can upload here
     rw: ed # only user "ed" can read-write
   flags:
-    e2ds: # filesystem indexing is required for many of these:
+    e2ds # filesystem indexing is required for many of these:
     sz: 1k-3m # accept upload only if filesize in this range
     df: 4g # free disk space cannot go lower than this
     vmaxb: 1g # volume can never exceed 1 GiB
````
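To illustrate the `sz: 1k-3m` semantics above, here is a rough sketch of parsing such a range; the exact suffix handling in copyparty may differ (powers of 1024 are assumed here):

```python
UNITS = {"": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3, "t": 1024 ** 4}

def parse_size(s):
    # "3m" -> 3145728; bare numbers are plain bytes
    s = s.strip().lower()
    num = s.rstrip("kmgt")
    return int(float(num) * UNITS[s[len(num):]])

def size_ok(nbytes, rule):
    # rule like "1k-3m": accept only sizes within the inclusive bounds
    lo, hi = (parse_size(p) for p in rule.split("-"))
    return lo <= nbytes <= hi
```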
````diff
@@ -1641,6 +1662,8 @@ this can instead be kept in a single place using the `--hist` argument, or the `

 by default, the per-volume `up2k.db` sqlite3-database for `-e2d` and `-e2t` is stored next to the thumbnails according to the `--hist` option, but the global-option `--dbpath` and/or volflag `dbpath` can be used to put the database somewhere else

+if your storage backend is unreliable (NFS or bad HDDs), you can specify one or more "landmarks" to look for before doing anything database-related. A landmark is a file which is always expected to exist inside the volume. This avoids spurious filesystem rescans in the event of an outage. One line per landmark (see example below)
+
 note:
 * putting the hist-folders on an SSD is strongly recommended for performance
 * markdown edits are always stored in a local `.hist` subdirectory
````
````diff
@@ -1658,6 +1681,8 @@ config file example:
   flags:
     hist: - # restore the default (/mnt/nas/pics/.hist/)
     hist: /mnt/nas/cache/pics/ # can be absolute path
+    landmark: me.jpg # /mnt/nas/pics/me.jpg must be readable to enable db
+    landmark: info/a.txt^=ok # and this textfile must start with "ok"
 ```

````
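A sketch of how a landmark rule like `info/a.txt^=ok` could be evaluated, assuming `^=` means the file's contents must start with the given string (illustrative only, not copyparty's code):

```python
import os

def landmark_ok(vol_root, rule):
    # rule is "relpath" or "relpath^=prefix"; the landmark file must
    # exist, and with ^= its contents must start with the prefix
    relpath, _, prefix = rule.partition("^=")
    fp = os.path.join(vol_root, relpath)
    if not os.path.isfile(fp):
        return False
    if not prefix:
        return True
    with open(fp, "rb") as f:
        return f.read(len(prefix)) == prefix.encode()
```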
````diff
@@ -2343,8 +2368,10 @@ TLDR: yes
 | send message | yep | yep | yep | yep | yep | yep | yep | yep |
 | set sort order | - | yep | yep | yep | yep | yep | yep | yep |
 | zip selection | - | yep | yep | yep | yep | yep | yep | yep |
+| file search | - | yep | yep | yep | yep | yep | yep | yep |
 | file rename | - | yep | yep | yep | yep | yep | yep | yep |
 | file cut/paste | - | yep | yep | yep | yep | yep | yep | yep |
+| unpost uploads | - | - | yep | yep | yep | yep | yep | yep |
 | navpane | - | yep | yep | yep | yep | yep | yep | yep |
 | image viewer | - | yep | yep | yep | yep | yep | yep | yep |
 | video player | - | yep | yep | yep | yep | yep | yep | yep |
````
````diff
@@ -2417,6 +2444,9 @@ interact with copyparty using non-browser clients
 * and for screenshots on macos, see [./contrib/ishare.iscu](./contrib/#ishareiscu)
 * and for screenshots on linux, see [./contrib/flameshot.sh](./contrib/flameshot.sh)

+* [Custom Uploader](https://f-droid.org/en/packages/com.nyx.custom_uploader/) (an Android app) as an alternative to copyparty's own [PartyUP!](#android-app)
+  * works if you set UploadURL to `https://your.com/foo/?want=url&pw=hunter2` and FormDataName `f`
+
 * contextlet (web browser integration); see [contrib contextlet](contrib/#send-to-cppcontextletjson)

 * [igloo irc](https://iglooirc.com/): Method: `post` Host: `https://you.com/up/?want=url&pw=hunter2` Multipart: `yes` File parameter: `f`
````
````diff
@@ -2518,6 +2548,11 @@ below are some tweaks roughly ordered by usefulness:

 when uploading files,

+* when uploading from very fast storage (NVMe SSD) with chrome/firefox, enable `[wasm]` in the `[⚙️] settings` tab to more effectively use all CPU-cores for hashing
+  * don't do this on Safari (runs faster without)
+  * don't do this on older browsers; likely to provoke browser-bugs (browser eats all RAM and crashes)
+  * can be made default-enabled serverside with `--nosubtle 137` (chrome v137+) or `--nosubtle 2` (chrome+firefox)
+
 * chrome is recommended (unfortunately), at least compared to firefox:
   * up to 90% faster when hashing, especially on SSDs
   * up to 40% faster when uploading over extremely fast internets
````
````diff
@@ -2698,7 +2733,7 @@ enable [thumbnails](#thumbnails) of...
 * **images:** `Pillow` and/or `pyvips` and/or `ffmpeg` (requires py2.7 or py3.5+)
 * **videos/audio:** `ffmpeg` and `ffprobe` somewhere in `$PATH`
 * **HEIF pictures:** `pyvips` or `ffmpeg` or `pyheif-pillow-opener` (requires Linux or a C compiler)
-* **AVIF pictures:** `pyvips` or `ffmpeg` or `pillow-avif-plugin`
+* **AVIF pictures:** `pyvips` or `ffmpeg` or `pillow-avif-plugin` or pillow v11.3+
 * **JPEG XL pictures:** `pyvips` or `ffmpeg`

 enable sending [zeromq messages](#zeromq) from event-hooks: `pyzmq`
````
````diff
@@ -2725,10 +2760,11 @@ set any of the following environment variables to disable its associated optiona
 | `PRTY_NO_CFSSL` | never attempt to generate self-signed certificates using [cfssl](https://github.com/cloudflare/cfssl) |
 | `PRTY_NO_FFMPEG` | **audio transcoding** goes byebye, **thumbnailing** must be handled by Pillow/libvips |
 | `PRTY_NO_FFPROBE` | **audio transcoding** goes byebye, **thumbnailing** must be handled by Pillow/libvips, **metadata-scanning** must be handled by mutagen |
+| `PRTY_NO_MAGIC` | do not use [magic](https://pypi.org/project/python-magic/) for filetype detection |
 | `PRTY_NO_MUTAGEN` | do not use [mutagen](https://pypi.org/project/mutagen/) for reading metadata from media files; will fallback to ffprobe |
 | `PRTY_NO_PIL` | disable all [Pillow](https://pypi.org/project/pillow/)-based thumbnail support; will fallback to libvips or ffmpeg |
 | `PRTY_NO_PILF` | disable Pillow `ImageFont` text rendering, used for folder thumbnails |
-| `PRTY_NO_PIL_AVIF` | disable 3rd-party Pillow plugin for [AVIF support](https://pypi.org/project/pillow-avif-plugin/) |
+| `PRTY_NO_PIL_AVIF` | disable Pillow avif support (internal and/or [plugin](https://pypi.org/project/pillow-avif-plugin/)) |
 | `PRTY_NO_PIL_HEIF` | disable 3rd-party Pillow plugin for [HEIF support](https://pypi.org/project/pyheif-pillow-opener/) |
 | `PRTY_NO_PIL_WEBP` | disable use of native webp support in Pillow |
 | `PRTY_NO_PSUTIL` | do not use [psutil](https://pypi.org/project/psutil/) for reaping stuck hooks and plugins on Windows |
````
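The gating implied by the table above presumably reduces to a simple environment check when the optional dependency is imported; a hypothetical sketch (not copyparty's actual import logic):

```python
import os

def dep_enabled(name):
    # an optional dependency stays enabled unless its PRTY_NO_<NAME>
    # environment variable is set to anything non-empty
    return not os.environ.get("PRTY_NO_" + name.upper())
```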
````diff
@@ -2825,5 +2861,7 @@ if there's a wall of base64 in the log (thread stacks) then please include that,

 for build instructions etc, see [./docs/devnotes.md](./docs/devnotes.md)

+specifically you may want to [build the sfx](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#just-the-sfx) or [build from scratch](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#build-from-scratch)
+
 see [./docs/TODO.md](./docs/TODO.md) for planned features / fixes / changes

````
---

````diff
@@ -4,6 +4,7 @@ import os
 import stat
 import subprocess as sp
 import sys
+from urllib.parse import unquote_to_bytes as unquote


 """
````
````diff
@@ -28,14 +29,17 @@ which does the following respectively,
 """


+MOUNT_BASE = b"/run/media/egon/"
+
+
 def main():
     try:
         label = sys.argv[1].split(":usb-eject:")[1].split(":")[0]
-        mp = "/run/media/egon/" + label
+        mp = MOUNT_BASE + unquote(label)
         # print("ejecting [%s]... " % (mp,), end="")
-        mp = os.path.abspath(os.path.realpath(mp.encode("utf-8")))
+        mp = os.path.abspath(os.path.realpath(mp))
         st = os.lstat(mp)
-        if not stat.S_ISDIR(st.st_mode):
+        if not stat.S_ISDIR(st.st_mode) or not mp.startswith(MOUNT_BASE):
             raise Exception("not a regular directory")

         # if you're running copyparty as root (thx for the faith)
````
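The hardening in this hunk (resolve the path first, then require the result to stay under the mount root) is a standard guard against path traversal. The guard in isolation, as a simplified sketch of the same idea:

```python
import os
import stat

MOUNT_BASE = b"/run/media/egon/"

def is_safe_mount(label):
    # resolve ".." and symlinks FIRST, then check the prefix;
    # a label like b"../../../tmp" resolves outside MOUNT_BASE
    # and is rejected even though /tmp is a real directory
    mp = os.path.abspath(os.path.realpath(MOUNT_BASE + label))
    try:
        st = os.lstat(mp)
    except OSError:
        return False  # path does not exist
    return stat.S_ISDIR(st.st_mode) and mp.startswith(MOUNT_BASE)
```

Checking the prefix before resolving would be useless, since `..` segments and symlinks could still escape afterwards.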
---

````diff
@@ -22,6 +22,8 @@ set -e
 # modifies the keyfinder python lib to load the .so in ~/pe


+export FORCE_COLOR=1
+
 linux=1

 win=
````
````diff
@@ -186,12 +188,15 @@ install_keyfinder() {
 echo "so not found at $sop"
 exit 1
 }

+x=${-//[^x]/}; set -x; cat /etc/alpine-release
 # rm -rf /Users/ed/Library/Python/3.9/lib/python/site-packages/*keyfinder*
 CFLAGS="-I$h/pe/keyfinder/include -I/opt/local/include -I/usr/include/ffmpeg" \
+CXXFLAGS="-I$h/pe/keyfinder/include -I/opt/local/include -I/usr/include/ffmpeg" \
 LDFLAGS="-L$h/pe/keyfinder/lib -L$h/pe/keyfinder/lib64 -L/opt/local/lib" \
-PKG_CONFIG_PATH=/c/msys64/mingw64/lib/pkgconfig \
+PKG_CONFIG_PATH="/c/msys64/mingw64/lib/pkgconfig:$h/pe/keyfinder/lib/pkgconfig" \
 $pybin -m pip install --user keyfinder
+[ "$x" ] || set +x

 pypath="$($pybin -c 'import keyfinder; print(keyfinder.__file__)')"
 for pyso in "${pypath%/*}"/*.so; do
````
---

````diff
@@ -2,19 +2,38 @@
 # not accept more consecutive clients than what copyparty is able to;
 # nginx default is 512 (worker_processes 1, worker_connections 512)
 #
+# ======================================================================
+#
+# to reverse-proxy a specific path/subpath/location below a domain
+# (rather than a complete subdomain), for example "/qw/er", you must
+# run copyparty with --rp-loc /qw/as and also change the following:
+# location / {
+# proxy_pass http://cpp_tcp;
+# to this:
+# location /qw/er/ {
+# proxy_pass http://cpp_tcp/qw/er/;
+#
+# ======================================================================
+#
 # rarely, in some extreme usecases, it can be good to add -j0
 # (40'000 requests per second, or 20gbps upload/download in parallel)
 # but this is usually counterproductive and slightly buggy
 #
+# ======================================================================
+#
 # on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
 #
-# if you are behind cloudflare (or another protection service),
+# ======================================================================
+#
+# if you are behind cloudflare (or another CDN/WAF/protection service),
 # remember to reject all connections which are not coming from your
 # protection service -- for cloudflare in particular, you can
 # generate the list of permitted IP ranges like so:
 # (curl -s https://www.cloudflare.com/ips-v{4,6} | sed 's/^/allow /; s/$/;/'; echo; echo "deny all;") > /etc/nginx/cloudflare-only.conf
 #
 # and then enable it below by uncomenting the cloudflare-only.conf line
+#
+# ======================================================================


 upstream cpp_tcp {
````
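The curl|sed one-liner in the comments above turns Cloudflare's published ranges into nginx `allow` directives followed by a catch-all deny. The same transform, sketched offline in Python on sample input:

```python
def allowlist_conf(cidrs):
    # one "allow <cidr>;" per published range, then reject everyone
    # else; equivalent in spirit to the curl|sed one-liner
    lines = ["allow %s;" % c for c in cidrs]
    return "\n".join(lines + ["", "deny all;"])
```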
---

````diff
@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.17.0"
+pkgver="1.18.3"
 pkgrel=1
 pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
 arch=("any")
````
````diff
@@ -22,7 +22,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
 )
 source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("d8a49b3398f4cdb0754dd8b9e3ab32544e44e2fee94c88903f65ffc003b6eeec")
+sha256sums=("aa12f4779cf5c014cc9503798ac63872dac840ca91ddf122daa6befb4c883d48")

 build() {
 cd "${srcdir}/${pkgname}-${pkgver}"
````
---

````diff
@@ -1,4 +1,4 @@
-{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, argon2-cffi, pillow, pyvips, pyzmq, ffmpeg, mutagen,
+{ lib, stdenv, makeWrapper, fetchurl, util-linux, python, jinja2, impacket, pyftpdlib, pyopenssl, argon2-cffi, pillow, pyvips, pyzmq, ffmpeg, mutagen,

 # use argon2id-hashed passwords in config files (sha2 is always available)
 withHashedPasswords ? true,
````
````diff
@@ -61,7 +61,7 @@ in stdenv.mkDerivation {
 installPhase = ''
 install -Dm755 $src $out/share/copyparty-sfx.py
 makeWrapper ${pyEnv.interpreter} $out/bin/copyparty \
---set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
+--set PATH '${lib.makeBinPath ([ util-linux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
 --add-flags "$out/share/copyparty-sfx.py"
 '';
 meta.mainProgram = "copyparty";
````
---

````diff
@@ -1,5 +1,5 @@
 {
-  "url": "https://github.com/9001/copyparty/releases/download/v1.17.0/copyparty-sfx.py",
+  "url": "https://github.com/9001/copyparty/releases/download/v1.18.3/copyparty-sfx.py",
-  "version": "1.17.0",
+  "version": "1.18.3",
-  "hash": "sha256-iRqaXQvwX4DceOhZmI6g9KXeR+rFAWnNHK/GTHkoQ7Q="
+  "hash": "sha256-INqErls4gyhBAlDlY1vfNboKrrqHmeiyB+RAuuYRISQ="
 }
````
---

````diff
@@ -12,6 +12,23 @@ almost the same as minimal-up2k.html except this one...:

 -- looks slightly better


+========================
+== USAGE INSTRUCTIONS ==
+
+1. create a volume which anyone can read from (if you haven't already)
+2. copy this file into that volume, so anyone can download it
+3. enable the plugin by telling the webbrowser to load this file;
+   assuming the URL to the public volume is /res/, and
+   assuming you're using config-files, then add this to your config:
+
+  [global]
+    js-browser: /res/minimal-up2k.js
+
+alternatively, if you're not using config-files, then
+add the following commandline argument instead:
+--js-browser=/res/minimal-up2k.js
+
 */

 var u2min = `
````
|||||||
@@ -80,6 +80,7 @@ web/deps/prismd.css
 web/deps/scp.woff2
 web/deps/sha512.ac.js
 web/deps/sha512.hw.js
+web/idp.html
 web/iiam.gif
 web/md.css
 web/md.html
@@ -863,6 +863,43 @@ def get_sects():
             """
         ),
     ],
+    [
+        "chmod",
+        "file/folder permissions",
+        dedent(
+            """
+            global-option \033[33m--chmod-f\033[0m and volflag \033[33mchmod_f\033[0m specifies the unix-permission to use when creating a new file
+
+            similarly, \033[33m--chmod-d\033[0m and \033[33mchmod_d\033[0m sets the directory/folder perm
+
+            the value is a three-digit octal number such as 755, 750, 644, etc.
+
+            first digit = "User"; permission for the unix-user
+            second digit = "Group"; permission for the unix-group
+            third digit = "Other"; permission for all other users/groups
+
+            for files:
+            0 = --- = no access
+            1 = --x = can execute the file as a program
+            2 = -w- = can write
+            3 = -wx = can write and execute
+            4 = r-- = can read
+            5 = r-x = can read and execute
+            6 = rw- = can read and write
+            7 = rwx = can read, write, execute
+
+            for directories/folders:
+            0 = --- = no access
+            1 = --x = can read files in folder but not list contents
+            2 = -w- = n/a
+            3 = -wx = can create files but not list
+            4 = r-- = can list, but not read/write
+            5 = r-x = can list and read files
+            6 = rw- = n/a
+            7 = rwx = can read, write, list
+            """
+        ),
+    ],
     [
         "pwhash",
         "password hashing",
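As an aside (not part of the diff): the three-digit octal values that `--chmod-f` / `--chmod-d` accept decompose exactly as the help text above describes. A minimal sketch, with a hypothetical `parse_ugo` helper that is not copyparty's internal code:

```python
import stat

def parse_ugo(txt: str) -> int:
    """convert a three-digit octal string such as "755" into a mode int"""
    mode = int(txt, 8)  # "755" -> 0o755
    if mode > 0o777:
        raise ValueError("expected three octal digits, got %r" % (txt,))
    return mode

mode = parse_ugo("755")
# decompose into the User/Group/Other digits described in the help text
user, group, other = (mode >> 6) & 7, (mode >> 3) & 7, mode & 7
assert (user, group, other) == (7, 5, 5)
# stat.filemode renders the familiar ls-style string for a directory
assert stat.filemode(stat.S_IFDIR | mode) == "drwxr-xr-x"
```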
@@ -964,6 +1001,7 @@ def add_general(ap, nc, srvname):
     ap2.add_argument("--name", metavar="TXT", type=u, default=srvname, help="server name (displayed topleft in browser and in mDNS)")
     ap2.add_argument("--mime", metavar="EXT=MIME", type=u, action="append", help="map file \033[33mEXT\033[0mension to \033[33mMIME\033[0mtype, for example [\033[32mjpg=image/jpeg\033[0m]")
     ap2.add_argument("--mimes", action="store_true", help="list default mimetype mapping and exit")
+    ap2.add_argument("--rmagic", action="store_true", help="do expensive analysis to improve accuracy of returned mimetypes; will make file-downloads, rss, and webdav slower (volflag=rmagic)")
     ap2.add_argument("--license", action="store_true", help="show licenses and exit")
     ap2.add_argument("--version", action="store_true", help="show versions and exit")
 
@@ -1012,6 +1050,8 @@ def add_upload(ap):
     ap2.add_argument("--reg-cap", metavar="N", type=int, default=38400, help="max number of uploads to keep in memory when running without \033[33m-e2d\033[0m; roughly 1 MiB RAM per 600")
     ap2.add_argument("--no-fpool", action="store_true", help="disable file-handle pooling -- instead, repeatedly close and reopen files during upload (bad idea to enable this on windows and/or cow filesystems)")
     ap2.add_argument("--use-fpool", action="store_true", help="force file-handle pooling, even when it might be dangerous (multiprocessing, filesystems lacking sparse-files support, ...)")
+    ap2.add_argument("--chmod-f", metavar="UGO", type=u, default="", help="unix file permissions to use when creating files; default is probably 644 (OS-decided), see --help-chmod. Examples: [\033[32m644\033[0m] = owner-RW + all-R, [\033[32m755\033[0m] = owner-RWX + all-RX, [\033[32m777\033[0m] = full-yolo (volflag=chmod_f)")
+    ap2.add_argument("--chmod-d", metavar="UGO", type=u, default="755", help="unix file permissions to use when creating directories; see --help-chmod. Examples: [\033[32m755\033[0m] = owner-RW + all-R, [\033[32m777\033[0m] = full-yolo (volflag=chmod_d)")
     ap2.add_argument("--dedup", action="store_true", help="enable symlink-based upload deduplication (volflag=dedup)")
     ap2.add_argument("--safe-dedup", metavar="N", type=int, default=50, help="how careful to be when deduplicating files; [\033[32m1\033[0m] = just verify the filesize, [\033[32m50\033[0m] = verify file contents have not been altered (volflag=safededup)")
     ap2.add_argument("--hardlink", action="store_true", help="enable hardlink-based dedup; will fallback on symlinks when that is impossible (across filesystems) (volflag=hardlink)")
@@ -1028,6 +1068,7 @@ def add_upload(ap):
     ap2.add_argument("--df", metavar="GiB", type=u, default="0", help="ensure \033[33mGiB\033[0m free disk space by rejecting upload requests; assumes gigabytes unless a unit suffix is given: [\033[32m256m\033[0m], [\033[32m4\033[0m], [\033[32m2T\033[0m] (volflag=df)")
     ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
     ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
+    ap2.add_argument("--nosubtle", metavar="N", type=int, default=0, help="when to use a wasm-hasher instead of the browser's builtin; faster on chrome, but buggy in older chrome versions. [\033[32m0\033[0m] = only when necessary (non-https), [\033[32m1\033[0m] = always (all browsers), [\033[32m2\033[0m] = always on chrome/firefox, [\033[32m3\033[0m] = always on chrome, [\033[32mN\033[0m] = chrome-version N and newer (recommendation: 137)")
     ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good when latency is low (same-country), 2~4 for android-clients, 2~6 for cross-atlantic. Max is 6 in most browsers. Big values increase network-speed but may reduce HDD-speed")
     ap2.add_argument("--u2sz", metavar="N,N,N", type=u, default="1,64,96", help="web-client: default upload chunksize (MiB); sets \033[33mmin,default,max\033[0m in the settings gui. Each HTTP POST will aim for \033[33mdefault\033[0m, and never exceed \033[33mmax\033[0m. Cloudflare max is 96. Big values are good for cross-atlantic but may increase HDD fragmentation on some FS. Disable this optimization with [\033[32m1,1,1\033[0m]")
     ap2.add_argument("--u2ow", metavar="NUM", type=int, default=0, help="web-client: default setting for when to replace/overwrite existing files; [\033[32m0\033[0m]=never, [\033[32m1\033[0m]=if-client-newer, [\033[32m2\033[0m]=always (volflag=u2ow)")
@@ -1047,7 +1088,7 @@ def add_network(ap):
     ap2.add_argument("--rp-loc", metavar="PATH", type=u, default="", help="if reverse-proxying on a location instead of a dedicated domain/subdomain, provide the base location here; example: [\033[32m/foo/bar\033[0m]")
     if ANYWIN:
         ap2.add_argument("--reuseaddr", action="store_true", help="set reuseaddr on listening sockets on windows; allows rapid restart of copyparty at the expense of being able to accidentally start multiple instances")
-    else:
+    elif not MACOS:
         ap2.add_argument("--freebind", action="store_true", help="allow listening on IPs which do not yet exist, for example if the network interfaces haven't finished going up. Only makes sense for IPs other than '0.0.0.0', '127.0.0.1', '::', and '::1'. May require running as root (unless net.ipv6.ip_nonlocal_bind)")
     ap2.add_argument("--wr-h-eps", metavar="PATH", type=u, default="", help="write list of listening-on ip:port to textfile at \033[33mPATH\033[0m when http-servers have started")
     ap2.add_argument("--wr-h-aon", metavar="PATH", type=u, default="", help="write list of accessible-on ip:port to textfile at \033[33mPATH\033[0m when http-servers have started")
@@ -1091,12 +1132,16 @@ def add_cert(ap, cert_path):
 
 
 def add_auth(ap):
+    idp_db = os.path.join(E.cfg, "idp.db")
     ses_db = os.path.join(E.cfg, "sessions.db")
     ap2 = ap.add_argument_group('IdP / identity provider / user authentication options')
     ap2.add_argument("--idp-h-usr", metavar="HN", type=u, default="", help="bypass the copyparty authentication checks if the request-header \033[33mHN\033[0m contains a username to associate the request with (for use with authentik/oauth/...)\n\033[1;31mWARNING:\033[0m if you enable this, make sure clients are unable to specify this header themselves; must be washed away and replaced by a reverse-proxy")
     ap2.add_argument("--idp-h-grp", metavar="HN", type=u, default="", help="assume the request-header \033[33mHN\033[0m contains the groupname of the requesting user; can be referenced in config files for group-based access control")
     ap2.add_argument("--idp-h-key", metavar="HN", type=u, default="", help="optional but recommended safeguard; your reverse-proxy will insert a secret header named \033[33mHN\033[0m into all requests, and the other IdP headers will be ignored if this header is not present")
     ap2.add_argument("--idp-gsep", metavar="RE", type=u, default="|:;+,", help="if there are multiple groups in \033[33m--idp-h-grp\033[0m, they are separated by one of the characters in \033[33mRE\033[0m")
+    ap2.add_argument("--idp-db", metavar="PATH", type=u, default=idp_db, help="where to store the known IdP users/groups (if you run multiple copyparty instances, make sure they use different DBs)")
+    ap2.add_argument("--idp-store", metavar="N", type=int, default=1, help="how to use \033[33m--idp-db\033[0m; [\033[32m0\033[0m] = entirely disable, [\033[32m1\033[0m] = write-only (effectively disabled), [\033[32m2\033[0m] = remember users, [\033[32m3\033[0m] = remember users and groups.\nNOTE: Will remember and restore the IdP-volumes of all users for all eternity if set to 2 or 3, even when user is deleted from your IdP")
+    ap2.add_argument("--idp-adm", metavar="U,U", type=u, default="", help="comma-separated list of users allowed to use /?idp (the cache management UI)")
     ap2.add_argument("--no-bauth", action="store_true", help="disable basic-authentication support; do not accept passwords from the 'Authenticate' header at all. NOTE: This breaks support for the android app")
     ap2.add_argument("--bauth-last", action="store_true", help="keeps basic-authentication enabled, but only as a last-resort; if a cookie is also provided then the cookie wins")
     ap2.add_argument("--ses-db", metavar="PATH", type=u, default=ses_db, help="where to store the sessions database (if you run multiple copyparty instances, make sure they use different DBs)")
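An illustration (not part of the diff) of what the `--idp-gsep` default means: the header value from `--idp-h-grp` is split on any single character from the set `|:;+,`. A sketch with a hypothetical header value, not copyparty's actual parsing code:

```python
import re

# --idp-gsep defaults to "|:;+,": any one character from this set
# separates group names inside the --idp-h-grp request header
gsep = "|:;+,"
splitter = re.compile("[" + re.escape(gsep) + "]")

hdr = "admins;staff|dev"  # hypothetical header value
groups = [g for g in splitter.split(hdr) if g]
assert groups == ["admins", "staff", "dev"]
```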
@@ -1269,6 +1314,7 @@ def add_optouts(ap):
     ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
     ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
     ap2.add_argument("--no-pipe", action="store_true", help="disable race-the-beam (lockstep download of files which are currently being uploaded) (volflag=nopipe)")
+    ap2.add_argument("--no-tail", action="store_true", help="disable streaming a growing files with ?tail (volflag=notail)")
     ap2.add_argument("--no-db-ip", action="store_true", help="do not write uploader-IP into the database; will also disable unpost, you may want \033[32m--forget-ip\033[0m instead (volflag=no_db_ip)")
 
 
@@ -1398,6 +1444,16 @@ def add_transcoding(ap):
     ap2.add_argument("--ac-maxage", metavar="SEC", type=int, default=86400, help="delete cached transcode output after \033[33mSEC\033[0m seconds")
 
 
+def add_tail(ap):
+    ap2 = ap.add_argument_group('tailing options (realtime streaming of a growing file)')
+    ap2.add_argument("--tail-who", metavar="LVL", type=int, default=2, help="who can tail? [\033[32m0\033[0m]=nobody, [\033[32m1\033[0m]=admins, [\033[32m2\033[0m]=authenticated-with-read-access, [\033[32m3\033[0m]=everyone-with-read-access (volflag=tail_who)")
+    ap2.add_argument("--tail-cmax", metavar="N", type=int, default=64, help="do not allow starting a new tail if more than \033[33mN\033[0m active downloads")
+    ap2.add_argument("--tail-tmax", metavar="SEC", type=float, default=0, help="terminate connection after \033[33mSEC\033[0m seconds; [\033[32m0\033[0m]=never (volflag=tail_tmax)")
+    ap2.add_argument("--tail-rate", metavar="SEC", type=float, default=0.2, help="check for new data every \033[33mSEC\033[0m seconds (volflag=tail_rate)")
+    ap2.add_argument("--tail-ka", metavar="SEC", type=float, default=3.0, help="send a zerobyte if connection is idle for \033[33mSEC\033[0m seconds to prevent disconnect")
+    ap2.add_argument("--tail-fd", metavar="SEC", type=float, default=1.0, help="check if file was replaced (new fd) if idle for \033[33mSEC\033[0m seconds (volflag=tail_fd)")
+
+
 def add_rss(ap):
     ap2 = ap.add_argument_group('RSS options')
     ap2.add_argument("--rss", action="store_true", help="enable RSS output (experimental) (volflag=rss)")
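For context (not copyparty's implementation): the tailing feature these options configure boils down to polling a regular file for appended bytes every `--tail-rate` seconds. A minimal sketch of that polling idea:

```python
import io
import os
import tempfile

def tail_once(f: io.BufferedReader) -> bytes:
    """read whatever has been appended to the file since the last call"""
    return f.read()

# demo: append to a file and pick up the new bytes on the next poll
path = os.path.join(tempfile.mkdtemp(), "log.txt")
with open(path, "wb") as w:
    w.write(b"hello\n")

with open(path, "rb") as f:
    first = tail_once(f)
    with open(path, "ab") as w:
        w.write(b"world\n")
    # a real tailer would time.sleep(tail_rate) between polls,
    # and periodically re-open the file to detect replacement (tail-fd)
    second = tail_once(f)

assert first == b"hello\n"
assert second == b"world\n"
```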
@@ -1493,6 +1549,8 @@ def add_ui(ap, retry):
     ap2.add_argument("--sort", metavar="C,C,C", type=u, default="href", help="default sort order, comma-separated column IDs (see header tooltips), prefix with '-' for descending. Examples: \033[32mhref -href ext sz ts tags/Album tags/.tn\033[0m (volflag=sort)")
     ap2.add_argument("--nsort", action="store_true", help="default-enable natural sort of filenames with leading numbers (volflag=nsort)")
     ap2.add_argument("--hsortn", metavar="N", type=int, default=2, help="number of sorting rules to include in media URLs by default (volflag=hsortn)")
+    ap2.add_argument("--see-dots", action="store_true", help="default-enable seeing dotfiles; only takes effect if user has the necessary permissions")
+    ap2.add_argument("--qdel", metavar="LVL", type=int, default=2, help="number of confirmations to show when deleting files (2/1/0)")
     ap2.add_argument("--unlist", metavar="REGEX", type=u, default="", help="don't show files matching \033[33mREGEX\033[0m in file list. Purely cosmetic! Does not affect API calls, just the browser. Example: [\033[32m\\.(js|css)$\033[0m] (volflag=unlist)")
     ap2.add_argument("--favico", metavar="TXT", type=u, default="c 000 none" if retry else "🎉 000 none", help="\033[33mfavicon-text\033[0m [ \033[33mforeground\033[0m [ \033[33mbackground\033[0m ] ], set blank to disable")
     ap2.add_argument("--ext-th", metavar="E=VP", type=u, action="append", help="use thumbnail-image \033[33mVP\033[0m for file-extension \033[33mE\033[0m, example: [\033[32mexe=/.res/exe.png\033[0m] (volflag=ext_th)")
@@ -1517,6 +1575,7 @@ def add_ui(ap, retry):
     ap2.add_argument("--lg-sba", metavar="TXT", type=u, default="", help="the value of the iframe 'allow' attribute for prologue/epilogue docs (volflag=lg_sba); see https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Permissions-Policy#iframes")
     ap2.add_argument("--no-sb-md", action="store_true", help="don't sandbox README/PREADME.md documents (volflags: no_sb_md | sb_md)")
     ap2.add_argument("--no-sb-lg", action="store_true", help="don't sandbox prologue/epilogue docs (volflags: no_sb_lg | sb_lg); enables non-js support")
+    ap2.add_argument("--have-unlistc", action="store_true", help=argparse.SUPPRESS)
 
 
 def add_debug(ap):
@@ -1602,6 +1661,7 @@ def run_argparse(
     add_hooks(ap)
     add_stats(ap)
     add_txt(ap)
+    add_tail(ap)
     add_og(ap)
     add_ui(ap, retry)
     add_admin(ap)
@@ -1,8 +1,8 @@
 # coding: utf-8
 
-VERSION = (1, 17, 1)
-CODENAME = "mixtape.m3u"
-BUILD_DT = (2025, 5, 18)
+VERSION = (1, 18, 4)
+CODENAME = "logtail"
+BUILD_DT = (2025, 7, 25)
 
 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
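For reference, the two derived strings in the version hunk above format the new tuple values as follows (the constants are taken straight from the diff):

```python
# values from the version-bump hunk
VERSION = (1, 18, 4)
BUILD_DT = (2025, 7, 25)

# the same derivations as in the module
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

assert S_VERSION == "1.18.4"
assert S_BUILD_DT == "2025-07-25"
```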
@@ -21,6 +21,7 @@ from .util import (
     DEF_MTE,
     DEF_MTH,
     EXTS,
+    HAVE_SQLITE3,
     IMPLICATIONS,
     MIMES,
     SQLITE_VER,
@@ -32,6 +33,7 @@ from .util import (
     afsenc,
     get_df,
     humansize,
+    min_ex,
     odfusion,
     read_utf8,
     relchk,
@@ -44,6 +46,9 @@ from .util import (
     vsplit,
 )
 
+if HAVE_SQLITE3:
+    import sqlite3
+
 if True:  # pylint: disable=using-constant-test
     from collections.abc import Iterable
 
@@ -72,7 +77,9 @@ SSEELOG = " ({})".format(SEE_LOG)
 BAD_CFG = "invalid config; {}".format(SEE_LOG)
 SBADCFG = " ({})".format(BAD_CFG)
 
-PTN_U_GRP = re.compile(r"\$\{u%([+-])([^}]+)\}")
+PTN_U_GRP = re.compile(r"\$\{u(%[+-][^}]+)\}")
+PTN_G_GRP = re.compile(r"\$\{g(%[+-][^}]+)\}")
+PTN_SIGIL = re.compile(r"(\${[ug][}%])")
 
 
 class CfgEx(Exception):
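A quick demonstration (not part of the diff) of what the reworked patterns match — the regexes are copied verbatim from the hunk above, while the sample paths are made up:

```python
import re

# the three patterns from the hunk above
PTN_U_GRP = re.compile(r"\$\{u(%[+-][^}]+)\}")
PTN_G_GRP = re.compile(r"\$\{g(%[+-][^}]+)\}")
PTN_SIGIL = re.compile(r"(\${[ug][}%])")

# the new PTN_U_GRP captures the whole "%+group" filter in one group,
# instead of splitting the sign and name into two groups like before
m = PTN_U_GRP.search("/srv/${u%+admins}/inc")
assert m and m.group(1) == "%+admins"

m = PTN_G_GRP.search("/srv/${g%-guests}/inc")
assert m and m.group(1) == "%-guests"

# PTN_SIGIL spots the opening of any ${u...} / ${g...} placeholder
assert PTN_SIGIL.search("/srv/${u}/inc")
assert not PTN_SIGIL.search("/srv/plain/inc")
```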
@@ -113,6 +120,8 @@ class Lim(object):
 
         self.reg: Optional[dict[str, dict[str, Any]]] = None  # up2k registry
 
+        self.chmod_d = 0o755
+
         self.nups: dict[str, list[float]] = {}  # num tracker
         self.bups: dict[str, list[tuple[float, int]]] = {}  # byte tracker list
         self.bupc: dict[str, int] = {}  # byte tracker cache
@@ -273,7 +282,7 @@ class Lim(object):
         if not dirs:
             # no branches yet; make one
             sub = os.path.join(path, "0")
-            bos.mkdir(sub)
+            bos.mkdir(sub, self.chmod_d)
         else:
             # try newest branch only
             sub = os.path.join(path, str(dirs[-1]))
@@ -288,7 +297,7 @@ class Lim(object):
 
         # make a branch
         sub = os.path.join(path, str(dirs[-1] + 1))
-        bos.mkdir(sub)
+        bos.mkdir(sub, self.chmod_d)
         ret = self.dive(sub, lvs - 1)
         if ret is None:
             raise Pebkac(500, "rotation bug")
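The two `bos.mkdir(sub, self.chmod_d)` changes above pass the configured directory mode down to the OS. One detail worth knowing about the underlying `os.mkdir` call: the mode is filtered through the process umask, so the exact bits only stick if the umask permits them. A sketch (the `mkdir_chmod` helper is hypothetical, and the exact-bits assertion assumes a POSIX system):

```python
import os
import stat
import tempfile

def mkdir_chmod(path: str, chmod_d: int = 0o755) -> None:
    # os.mkdir filters `mode` through the process umask, so clear it
    # temporarily when the configured bits must apply exactly
    old = os.umask(0)
    try:
        os.mkdir(path, chmod_d)
    finally:
        os.umask(old)

sub = os.path.join(tempfile.mkdtemp(), "0")
mkdir_chmod(sub, 0o755)
assert stat.S_IMODE(os.stat(sub).st_mode) == 0o755
```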
@@ -357,7 +366,6 @@ class VFS(object):
         self.flags = flags  # config options
         self.root = self
         self.dev = 0  # st_dev
-        self.badcfg1 = False
         self.nodes: dict[str, VFS] = {}  # child nodes
         self.histtab: dict[str, str] = {}  # all realpath->histpath
         self.dbpaths: dict[str, str] = {}  # all realpath->dbpath
@@ -366,6 +374,7 @@ class VFS(object):
         self.shr_src: Optional[tuple[VFS, str]] = None  # source vfs+rem of a share
         self.shr_files: set[str] = set()  # filenames to include from shr_src
         self.shr_owner: str = ""  # uname
+        self.shr_all_aps: list[tuple[str, list[VFS]]] = []
         self.aread: dict[str, list[str]] = {}
         self.awrite: dict[str, list[str]] = {}
         self.amove: dict[str, list[str]] = {}
@@ -377,20 +386,20 @@ class VFS(object):
         self.adot: dict[str, list[str]] = {}
         self.js_ls = {}
         self.js_htm = ""
+        self.all_vols: dict[str, VFS] = {}  # flattened recursive
+        self.all_nodes: dict[str, VFS] = {}  # also jumpvols/shares
 
         if realpath:
             rp = realpath + ("" if realpath.endswith(os.sep) else os.sep)
             vp = vpath + ("/" if vpath else "")
             self.histpath = os.path.join(realpath, ".hist")  # db / thumbcache
             self.dbpath = self.histpath
-            self.all_vols = {vpath: self}  # flattened recursive
-            self.all_nodes = {vpath: self}  # also jumpvols/shares
-            self.all_aps = [(rp, self)]
+            self.all_vols[vpath] = self
+            self.all_nodes[vpath] = self
+            self.all_aps = [(rp, [self])]
             self.all_vps = [(vp, self)]
         else:
             self.histpath = self.dbpath = ""
-            self.all_vols = {}
-            self.all_nodes = {}
             self.all_aps = []
             self.all_vps = []
 
@@ -409,7 +418,7 @@ class VFS(object):
|
|||||||
self,
|
self,
|
||||||
vols: dict[str, "VFS"],
|
vols: dict[str, "VFS"],
|
||||||
nodes: dict[str, "VFS"],
|
nodes: dict[str, "VFS"],
|
||||||
aps: list[tuple[str, "VFS"]],
|
aps: list[tuple[str, list["VFS"]]],
|
||||||
vps: list[tuple[str, "VFS"]],
|
vps: list[tuple[str, "VFS"]],
|
||||||
) -> None:
|
) -> None:
|
||||||
nodes[self.vpath] = self
|
nodes[self.vpath] = self
|
||||||
@@ -418,7 +427,11 @@ class VFS(object):
|
|||||||
rp = self.realpath
|
rp = self.realpath
|
||||||
rp += "" if rp.endswith(os.sep) else os.sep
|
rp += "" if rp.endswith(os.sep) else os.sep
|
||||||
vp = self.vpath + ("/" if self.vpath else "")
|
vp = self.vpath + ("/" if self.vpath else "")
|
||||||
aps.append((rp, self))
|
hit = next((x[1] for x in aps if x[0] == rp), None)
|
||||||
|
if hit:
|
||||||
|
hit.append(self)
|
||||||
|
else:
|
||||||
|
aps.append((rp, [self]))
|
||||||
vps.append((vp, self))
|
vps.append((vp, self))
|
||||||
|
|
||||||
for v in self.nodes.values():
|
for v in self.nodes.values():
|
||||||
@@ -842,9 +855,11 @@ class VFS(object):
|
|||||||
return None
|
return None
|
||||||
|
|
||||||
if "xvol" in self.flags:
|
if "xvol" in self.flags:
|
||||||
for vap, vn in self.root.all_aps:
|
all_aps = self.shr_all_aps or self.root.all_aps
|
||||||
|
|
||||||
|
for vap, vns in all_aps:
|
||||||
if aps.startswith(vap):
|
if aps.startswith(vap):
|
||||||
return vn
|
return self if self in vns else vns[0]
|
||||||
|
|
||||||
if self.log:
|
if self.log:
|
||||||
self.log("vfs", "xvol: %r" % (ap,), 3)
|
self.log("vfs", "xvol: %r" % (ap,), 3)
|
||||||
@@ -853,6 +868,53 @@ class VFS(object):
|
|||||||
|
|
||||||
return self
|
return self
|
||||||
|
|
||||||
|
def check_landmarks(self) -> bool:
|
||||||
|
if self.dbv:
|
||||||
|
return True
|
||||||
|
|
||||||
|
vps = self.flags.get("landmark") or []
|
||||||
|
if not vps:
|
||||||
|
return True
|
||||||
|
|
||||||
|
failed = ""
|
||||||
|
for vp in vps:
|
||||||
|
if "^=" in vp:
|
||||||
|
vp, zs = vp.split("^=", 1)
|
||||||
|
expect = zs.encode("utf-8")
|
||||||
|
else:
|
||||||
|
expect = b""
|
||||||
|
|
||||||
|
if self.log:
|
||||||
|
t = "checking [/%s] landmark [%s]"
|
||||||
|
self.log("vfs", t % (self.vpath, vp), 6)
|
||||||
|
|
||||||
|
ap = "?"
|
||||||
|
try:
|
||||||
|
ap = self.canonical(vp)
|
||||||
|
with open(ap, "rb") as f:
|
||||||
|
buf = f.read(4096)
|
||||||
|
if not buf.startswith(expect):
|
||||||
|
t = "file [%s] does not start with the expected bytes %s"
|
||||||
|
failed = t % (ap, expect)
|
||||||
|
break
|
||||||
|
except Exception as ex:
|
||||||
|
t = "%r while trying to read [%s] => [%s]"
|
||||||
|
failed = t % (ex, vp, ap)
|
||||||
|
break
|
||||||
|
|
||||||
|
if not failed:
|
||||||
|
return True
|
||||||
|
|
||||||
|
if self.log:
|
||||||
|
t = "WARNING: landmark verification failed; %s; will now disable up2k database for volume [/%s]"
|
||||||
|
self.log("vfs", t % (failed, self.vpath), 3)
|
||||||
|
|
||||||
|
for rm in "e2d e2t e2v".split():
|
||||||
|
self.flags = {k: v for k, v in self.flags.items() if not k.startswith(rm)}
|
||||||
|
self.flags["d2d"] = True
|
||||||
|
self.flags["d2t"] = True
|
||||||
|
return False
|
||||||
|
|
||||||
|
|
||||||
if WINDOWS:
|
if WINDOWS:
|
||||||
re_vol = re.compile(r"^([a-zA-Z]:[\\/][^:]*|[^:]*):([^:]*):(.*)$")
|
re_vol = re.compile(r"^([a-zA-Z]:[\\/][^:]*|[^:]*):([^:]*):(.*)$")
|
||||||
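The new `check_landmarks` method above accepts rules of the form `vpath` or `vpath^=prefix`: the named file must exist and, with the `^=` suffix, must start with those bytes. A standalone sketch of that rule check (the helper name `check_one` is hypothetical, not part of copyparty):

```python
import os
import tempfile

def check_one(ap: str, rule: str) -> bool:
    # verify one landmark rule against abspath ap;
    # rule is "name" or "name^=expected-prefix" as in the diff above
    if "^=" in rule:
        rule, zs = rule.split("^=", 1)
        expect = zs.encode("utf-8")
    else:
        expect = b""
    try:
        with open(ap, "rb") as f:
            # only the first 4 KiB is inspected, like check_landmarks
            return f.read(4096).startswith(expect)
    except Exception:
        return False

# usage: a matching prefix passes; a wrong prefix or missing file fails
td = tempfile.mkdtemp()
ap = os.path.join(td, "flag.txt")
with open(ap, "wb") as f:
    f.write(b"NAS-OK\n")
print(check_one(ap, "flag.txt^=NAS-OK"))  # True
print(check_one(ap, "flag.txt^=WRONG"))   # False
```

When any rule fails, the real method strips the `e2d`/`e2t`/`e2v` flags and sets `d2d`/`d2t`, i.e. the volume keeps serving files but its database features shut off.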
```diff
@@ -877,6 +939,7 @@ class AuthSrv(object):
         self.warn_anonwrite = warn_anonwrite
         self.line_ctr = 0
         self.indent = ""
+        self.is_lxc = args.c == ["/z/initcfg"]

         # fwd-decl
         self.vfs = VFS(log_func, "", "", "", AXS(), {})
@@ -887,6 +950,8 @@ class AuthSrv(object):
         self.defpw: dict[str, str] = {}
         self.grps: dict[str, list[str]] = {}
         self.re_pwd: Optional[re.Pattern] = None
+        self.cfg_files_loaded: list[str] = []
+        self.badcfg1 = False

         # all volumes observed since last restart
         self.idp_vols: dict[str, str] = {}  # vpath->abspath
@@ -913,6 +978,9 @@ class AuthSrv(object):

         yield prev, True

+    def vf0(self):
+        return {"d2d": True, "tcolor": self.args.tcolor}
+
     def idp_checkin(
         self, broker: Optional["BrokerCli"], uname: str, gname: str
     ) -> bool:
@@ -931,6 +999,10 @@ class AuthSrv(object):
             return False

         self.idp_accs[uname] = gnames
+        try:
+            self._update_idp_db(uname, gname)
+        except:
+            self.log("failed to update the --idp-db:\n%s" % (min_ex(),), 3)

         t = "reinitializing due to new user from IdP: [%r:%r]"
         self.log(t % (uname, gnames), 3)
@@ -943,6 +1015,22 @@ class AuthSrv(object):
             broker.ask("reload", False, True).get()
         return True

+    def _update_idp_db(self, uname: str, gname: str) -> None:
+        if not self.args.idp_store:
+            return
+
+        assert sqlite3  # type: ignore  # !rm
+
+        db = sqlite3.connect(self.args.idp_db)
+        cur = db.cursor()
+
+        cur.execute("delete from us where un = ?", (uname,))
+        cur.execute("insert into us values (?,?)", (uname, gname))
+
+        db.commit()
+        cur.close()
+        db.close()
+
```
```diff
     def _map_volume_idp(
         self,
         src: str,
@@ -963,15 +1051,27 @@ class AuthSrv(object):
             un_gn = [("", "")]

         for un, gn in un_gn:
-            m = PTN_U_GRP.search(dst0)
-            if m:
-                req, gnc = m.groups()
-                hit = gnc in (un_gns.get(un) or [])
-                if req == "+":
-                    if not hit:
-                        continue
-                elif hit:
-                    continue
+            rejected = False
+            for ptn in [PTN_U_GRP, PTN_G_GRP]:
+                m = ptn.search(dst0)
+                if not m:
+                    continue
+                zs = m.group(1)
+                zs = zs.replace(",%+", "\n%+")
+                zs = zs.replace(",%-", "\n%-")
+                for rule in zs.split("\n"):
+                    gnc = rule[2:]
+                    if ptn == PTN_U_GRP:
+                        # is user member of group?
+                        hit = gnc in (un_gns.get(un) or [])
+                    else:
+                        # is it this specific group?
+                        hit = gn == gnc
+
+                    if rule.startswith("%+") != hit:
+                        rejected = True
+            if rejected:
+                continue

             # if ap/vp has a user/group placeholder, make sure to keep
             # track so the same user/group is mapped when setting perms;
@@ -986,6 +1086,8 @@ class AuthSrv(object):

             src = src1.replace("${g}", gn or "\n")
             dst = dst1.replace("${g}", gn or "\n")
+            src = PTN_G_GRP.sub(gn or "\n", src)
+            dst = PTN_G_GRP.sub(gn or "\n", dst)
             if src == src1 and dst == dst1:
                 gn = ""
```
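The rewritten filter above splits a matched rule-string into comma-separated `%+name` / `%-name` clauses: `%+` demands the membership test to hit, `%-` demands it to miss, and any violated clause rejects the volume for that user/group. A minimal sketch of just that clause evaluation, outside the surrounding regex machinery (the function name `rejects` is hypothetical):

```python
def rejects(rules: str, groups: list) -> bool:
    # True if the %+/%- rule-string (as in the diff above) rejects
    # a user who is a member of `groups`
    zs = rules.replace(",%+", "\n%+").replace(",%-", "\n%-")
    for rule in zs.split("\n"):
        gnc = rule[2:]          # strip the %+ / %- prefix
        hit = gnc in groups     # membership test (PTN_U_GRP case)
        if rule.startswith("%+") != hit:
            return True         # clause violated
    return False

print(rejects("%+admin", ["admin", "su"]))    # False: member, as required
print(rejects("%+admin,%-guest", ["guest"]))  # True: not admin (and guest is banned)
```

Note that unlike the pre-change code, which honored only the first `%+`/`%-` group, every clause must now hold.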
```diff
@@ -1077,6 +1179,7 @@ class AuthSrv(object):
         * any non-zero value from IdP group header
         * otherwise take --grps / [groups]
         """
+        self.load_idp_db(bool(self.idp_accs))
         ret = {un: gns[:] for un, gns in self.idp_accs.items()}
         ret.update({zs: [""] for zs in acct if zs not in ret})
         for gn, uns in grps.items():
@@ -1445,7 +1548,7 @@ class AuthSrv(object):
             flags[name] = True
             return

-        zs = "ext_th mtp on403 on404 xbu xau xiu xbc xac xbr xar xbd xad xm xban"
+        zs = "ext_th landmark mtp on403 on404 xbu xau xiu xbc xac xbr xar xbd xad xm xban"
         if name not in zs.split():
             if value is True:
                 t = "└─add volflag [{}] = {} ({})"
@@ -1482,8 +1585,10 @@ class AuthSrv(object):
         daxs: dict[str, AXS] = {}
         mflags: dict[str, dict[str, Any]] = {}  # vpath:flags
         mount: dict[str, tuple[str, str]] = {}  # dst:src (vp:(ap,vp0))
+        cfg_files_loaded: list[str] = []

         self.idp_vols = {}  # yolo
+        self.badcfg1 = False

         if self.args.a:
             # list of username:password
@@ -1544,6 +1649,7 @@ class AuthSrv(object):
                 zst = [(max(0, len(x) - 2) * " ") + "└" + x[-1] for x in zstt]
                 t = "loaded {} config files:\n{}"
                 self.log(t.format(len(zst), "\n".join(zst)))
+                cfg_files_loaded = zst

             except:
                 lns = lns[: self.line_ctr]
```
```diff
@@ -1568,21 +1674,25 @@ class AuthSrv(object):
         if not mount and not self.args.idp_h_usr:
             # -h says our defaults are CWD at root and read/write for everyone
             axs = AXS(["*"], ["*"], None, None)
-            if os.path.exists("/z/initcfg"):
-                t = "Read-access has been disabled due to failsafe: Docker detected, but the config does not define any volumes. This failsafe is to prevent unintended access if this is due to accidental loss of config. You can override this safeguard and allow read/write to all of /w/ by adding the following arguments to the docker container: -v .::rw"
-                self.log(t, 1)
+            if self.is_lxc:
+                t = "Read-access has been disabled due to failsafe: Docker detected, but %s. This failsafe is to prevent unintended access if this is due to accidental loss of config. You can override this safeguard and allow read/write to all of /w/ by adding the following arguments to the docker container: -v .::rw"
+                if len(cfg_files_loaded) == 1:
+                    self.log(t % ("no config-file was provided",), 1)
+                    t = "it is strongly recommended to add a config-file instead, for example based on https://github.com/9001/copyparty/blob/hovudstraum/docs/examples/docker/basic-docker-compose/copyparty.conf"
+                    self.log(t, 3)
+                else:
+                    self.log(t % ("the config does not define any volumes",), 1)
                 axs = AXS()
             elif self.args.c:
                 t = "Read-access has been disabled due to failsafe: No volumes were defined by the config-file. This failsafe is to prevent unintended access if this is due to accidental loss of config. You can override this safeguard and allow read/write to the working-directory by adding the following arguments: -v .::rw"
                 self.log(t, 1)
                 axs = AXS()
-            vfs = VFS(self.log_func, absreal("."), "", "", axs, {})
+            vfs = VFS(self.log_func, absreal("."), "", "", axs, self.vf0())
             if not axs.uread:
-                vfs.badcfg1 = True
+                self.badcfg1 = True
         elif "" not in mount:
             # there's volumes but no root; make root inaccessible
-            zsd = {"d2d": True, "tcolor": self.args.tcolor}
-            vfs = VFS(self.log_func, "", "", "", AXS(), zsd)
+            vfs = VFS(self.log_func, "", "", "", AXS(), self.vf0())

         maxdepth = 0
         for dst in sorted(mount.keys(), key=lambda x: (x.count("/"), len(x))):
@@ -1629,10 +1739,9 @@ class AuthSrv(object):
         shr = enshare[1:-1]
         shrs = enshare[1:]
         if enshare:
-            import sqlite3
+            assert sqlite3  # type: ignore  # !rm

-            zsd = {"d2d": True, "tcolor": self.args.tcolor}
-            shv = VFS(self.log_func, "", shr, shr, AXS(), zsd)
+            shv = VFS(self.log_func, "", shr, shr, AXS(), self.vf0())

             db_path = self.args.shr_db
             db = sqlite3.connect(db_path)
@@ -1852,7 +1961,7 @@ class AuthSrv(object):
             is_shr = shr and zv.vpath.split("/")[0] == shr
             if histp and not is_shr and histp in rhisttab:
                 zv2 = rhisttab[histp]
-                t = "invalid config; multiple volumes share the same histpath (database+thumbnails location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
+                t = "invalid config; multiple volumes share the same histpath (database+thumbnails location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: /%s [%s]"
                 t = t % (histp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
                 self.log(t, 1)
                 raise Exception(t)
@@ -1866,7 +1975,7 @@ class AuthSrv(object):
             is_shr = shr and zv.vpath.split("/")[0] == shr
             if dbp and not is_shr and dbp in rdbpaths:
                 zv2 = rdbpaths[dbp]
-                t = "invalid config; multiple volumes share the same dbpath (database location):\n dbpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
+                t = "invalid config; multiple volumes share the same dbpath (database location):\n dbpath: %s\n volume 1: /%s [%s]\n volume 2: /%s [%s]"
                 t = t % (dbp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
                 self.log(t, 1)
                 raise Exception(t)
```
```diff
@@ -2010,8 +2119,11 @@ class AuthSrv(object):
         elif self.args.re_maxage:
             vol.flags["scan"] = self.args.re_maxage

+        self.args.have_unlistc = False
+
         all_mte = {}
         errors = False
+        free_umask = False
         for vol in vfs.all_nodes.values():
             if (self.args.e2ds and vol.axs.uwrite) or self.args.e2dsa:
                 vol.flags["e2ds"] = True
@@ -2049,12 +2161,13 @@ class AuthSrv(object):
             if vf not in vol.flags:
                 vol.flags[vf] = getattr(self.args, ga)

-            zs = "forget_ip nrand u2abort u2ow ups_who zip_who"
+            zs = "forget_ip nrand tail_who u2abort u2ow ups_who zip_who"
             for k in zs.split():
                 if k in vol.flags:
                     vol.flags[k] = int(vol.flags[k])

-            for k in ("convt",):
+            zs = "convt tail_fd tail_rate tail_tmax"
+            for k in zs.split():
                 if k in vol.flags:
                     vol.flags[k] = float(vol.flags[k])

@@ -2067,9 +2180,33 @@ class AuthSrv(object):
                 t = 'volume "/%s" has invalid %stry [%s]'
                 raise Exception(t % (vol.vpath, k, vol.flags.get(k + "try")))

+            for k in ("chmod_d", "chmod_f"):
+                is_d = k == "chmod_d"
+                zs = vol.flags.get(k, "")
+                if not zs and is_d:
+                    zs = "755"
+                if not zs:
+                    vol.flags.pop(k, None)
+                    continue
+                if not re.match("^[0-7]{3}$", zs):
+                    t = "config-option '%s' must be a three-digit octal value such as [755] or [644] but the value was [%s]"
+                    t = t % (k, zs)
+                    self.log(t, 1)
+                    raise Exception(t)
+                zi = int(zs, 8)
+                vol.flags[k] = zi
+                if (is_d and zi != 0o755) or not is_d:
+                    free_umask = True
+
+            if vol.lim:
+                vol.lim.chmod_d = vol.flags["chmod_d"]
+
             if vol.flags.get("og"):
                 self.args.uqe = True

+            if "unlistcr" in vol.flags or "unlistcw" in vol.flags:
+                self.args.have_unlistc = True
+
             zs = str(vol.flags.get("tcolor", "")).lstrip("#")
             if len(zs) == 3:  # fc5 => ffcc55
                 vol.flags["tcolor"] = "".join([x * 2 for x in zs])
@@ -2147,6 +2284,8 @@ class AuthSrv(object):
                 t = "WARNING: volume [/%s]: invalid value specified for ext-th: %s"
                 self.log(t % (vol.vpath, etv), 3)

+            vol.check_landmarks()
+
             # d2d drops all database features for a volume
             for grp, rm in [["d2d", "e2d"], ["d2t", "e2t"], ["d2d", "e2v"]]:
                 if not vol.flags.get(grp, False):
@@ -2293,6 +2432,10 @@ class AuthSrv(object):
         if errors:
             sys.exit(1)

+        setattr(self.args, "free_umask", free_umask)
+        if free_umask:
+            os.umask(0)
+
         vfs.bubble_flags()
```
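The chmod validation above accepts only a three-digit octal string and converts it with `int(zs, 8)`; whenever a non-default mode is configured, `free_umask` triggers `os.umask(0)` so the configured bits reach the filesystem unmasked. A standalone sketch of that parse step (the helper name `parse_chmod` is hypothetical):

```python
import re

def parse_chmod(zs: str) -> int:
    # three-digit octal string -> mode int, mirroring the diff above
    if not re.match("^[0-7]{3}$", zs):
        raise ValueError(
            "must be a three-digit octal value such as [755] or [644], got [%s]" % (zs,)
        )
    return int(zs, 8)

print(oct(parse_chmod("755")))  # 0o755
print(oct(parse_chmod("644")))  # 0o644
```

Clearing the umask matters because the mode handed to `os.mkdir`/`os.open` is otherwise ANDed with the process umask, which would silently strip bits from e.g. `chmod_f=666`.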
```diff

         have_e2d = False
@@ -2383,7 +2526,7 @@ class AuthSrv(object):
             idp_vn, _ = vfs.get(idp_vp, "*", False, False)
             idp_vp0 = idp_vn.vpath0

-            sigils = set(re.findall(r"(\${[ug][}%])", idp_vp0))
+            sigils = set(PTN_SIGIL.findall(idp_vp0))
             if len(sigils) > 1:
                 t = '\nWARNING: IdP-volume "/%s" created by "/%s" has multiple IdP placeholders: %s'
                 self.idp_warn.append(t % (idp_vp, idp_vp0, list(sigils)))
@@ -2433,6 +2576,7 @@ class AuthSrv(object):
         self.defpw = defpw
         self.grps = grps
         self.iacct = {v: k for k, v in acct.items()}
+        self.cfg_files_loaded = cfg_files_loaded

         self.load_sessions()

@@ -2494,6 +2638,28 @@ class AuthSrv(object):
             shn.shr_src = (s_vfs, s_rem)
             shn.realpath = s_vfs.canonical(s_rem)

+            # root.all_aps doesn't include any shares, so make a copy where the
+            # share appears in all abspaths it can provide (for example for chk_ap)
+            ap = shn.realpath
+            if not ap.endswith(os.sep):
+                ap += os.sep
+            shn.shr_all_aps = [(x, y[:]) for x, y in vfs.all_aps]
+            exact = False
+            for ap2, vns in shn.shr_all_aps:
+                if ap == ap2:
+                    exact = True
+                if ap2.startswith(ap):
+                    try:
+                        vp2 = vjoin(s_rem, ap2[len(ap) :])
+                        vn2, _ = s_vfs.get(vp2, "*", False, False)
+                        if vn2 == s_vfs or vn2.dbv == s_vfs:
+                            vns.append(shn)
+                    except:
+                        pass
+            if not exact:
+                shn.shr_all_aps.append((ap, [shn]))
+            shn.shr_all_aps.sort(key=lambda x: len(x[0]), reverse=True)
+
             if self.args.shr_v:
                 t = "mapped %s share [%s] by [%s] => [%s] => [%s]"
                 self.log(t % (s_pr, s_k, s_un, s_vp, shn.realpath))
@@ -2508,7 +2674,7 @@ class AuthSrv(object):
                 continue  # also fine
             for zs in svn.nodes.keys():
                 # hide subvolume
-                vn.nodes[zs] = VFS(self.log_func, "", "", "", AXS(), {})
+                vn.nodes[zs] = VFS(self.log_func, "", "", "", AXS(), self.vf0())

             cur2.close()
             cur.close()
```
```diff
@@ -2552,6 +2718,8 @@ class AuthSrv(object):
             "txt_ext": self.args.textfiles.replace(",", " "),
             "def_hcols": list(vf.get("mth") or []),
             "unlist0": vf.get("unlist") or "",
+            "see_dots": self.args.see_dots,
+            "dqdel": self.args.qdel,
             "dgrid": "grid" in vf,
             "dgsel": "gsel" in vf,
             "dnsort": "nsort" in vf,
@@ -2563,6 +2731,7 @@ class AuthSrv(object):
             "idxh": int(self.args.ih),
             "themes": self.args.themes,
             "turbolvl": self.args.turbo,
+            "nosubtle": self.args.nosubtle,
             "u2j": self.args.u2j,
             "u2sz": self.args.u2sz,
             "u2ts": vf["u2ts"],
@@ -2591,6 +2760,43 @@ class AuthSrv(object):
             zs = str(vol.flags.get("tcolor") or self.args.tcolor)
             vol.flags["tcolor"] = zs.lstrip("#")

+    def load_idp_db(self, quiet=False) -> None:
+        # mutex me
+        level = self.args.idp_store
+        if level < 2 or not self.args.idp_h_usr:
+            return
+
+        assert sqlite3  # type: ignore  # !rm
+
+        db = sqlite3.connect(self.args.idp_db)
+        cur = db.cursor()
+        from_cache = cur.execute("select un, gs from us").fetchall()
+        cur.close()
+        db.close()
+
+        self.idp_accs.clear()
+        self.idp_usr_gh.clear()
+
+        gsep = self.args.idp_gsep
+        n = []
+        for uname, gname in from_cache:
+            if level < 3:
+                if uname in self.idp_accs:
+                    continue
+                gname = ""
+            gnames = [x.strip() for x in gsep.split(gname)]
+            gnames.sort()
+
+            # self.idp_usr_gh[uname] = gname
+            self.idp_accs[uname] = gnames
+            n.append(uname)
+
+        if n and not quiet:
+            t = ", ".join(n[:9])
+            if len(n) > 9:
+                t += "..."
+            self.log("found %d IdP users in db (%s)" % (len(n), t))
+
     def load_sessions(self, quiet=False) -> None:
         # mutex me
         if self.args.no_ses:
@@ -2598,7 +2804,7 @@ class AuthSrv(object):
             self.sesa = {}
             return

-        import sqlite3
+        assert sqlite3  # type: ignore  # !rm

         ases = {}
         blen = (self.args.ses_len // 4) * 4  # 3 bytes in 4 chars
@@ -2645,7 +2851,7 @@ class AuthSrv(object):
         if self.args.no_ses:
             return

-        import sqlite3
+        assert sqlite3  # type: ignore  # !rm

         db = sqlite3.connect(self.args.ses_db)
         cur = db.cursor()
```
```diff
@@ -25,14 +25,26 @@ def listdir(p: str = ".") -> list[str]:


 def makedirs(name: str, mode: int = 0o755, exist_ok: bool = True) -> bool:
+    # os.makedirs does 777 for all but leaf; this does mode on all
+    todo = []
     bname = fsenc(name)
-    try:
-        os.makedirs(bname, mode)
-        return True
-    except:
-        if not exist_ok or not os.path.isdir(bname):
-            raise
+    while bname:
+        if os.path.isdir(bname):
+            break
+        todo.append(bname)
+        bname = os.path.dirname(bname)
+    if not todo:
+        if not exist_ok:
+            os.mkdir(bname)  # to throw
         return False
+    for zb in todo[::-1]:
+        try:
+            os.mkdir(zb, mode)
+        except:
+            if os.path.isdir(zb):
+                continue
+            raise
+    return True


 def mkdir(p: str, mode: int = 0o755) -> None:
```
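The motivation for the `makedirs` rewrite above is visible in its new comment: `os.makedirs` applies `mode` only to the leaf directory, so intermediate directories get the default permissions. The replacement walks up until it finds an existing ancestor, then creates the missing chain top-down with `os.mkdir(path, mode)`, tolerating a concurrent creator. A simplified, self-contained sketch of that approach (the name `makedirs_mode` is hypothetical; the real function also handles bytes paths via `fsenc`):

```python
import os
import tempfile

def makedirs_mode(name: str, mode: int = 0o755) -> bool:
    # apply `mode` to every newly-created dir, not just the leaf
    todo = []
    p = name
    while p and not os.path.isdir(p):
        todo.append(p)
        p = os.path.dirname(p)
    if not todo:
        return False  # nothing to create
    for d in todo[::-1]:  # create top-down
        try:
            os.mkdir(d, mode)
        except OSError:
            if not os.path.isdir(d):  # a racing creator is fine
                raise
    return True

# usage: first call creates the chain, second call is a no-op
root = tempfile.mkdtemp()
deep = os.path.join(root, "a", "b", "c")
print(makedirs_mode(deep))  # True (created)
print(makedirs_mode(deep))  # False (already exists)
```

The mode passed to `os.mkdir` is still subject to the process umask, which is why the authsrv change elsewhere in this compare calls `os.umask(0)` when custom `chmod_d`/`chmod_f` values are configured.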
|||||||
@@ -22,6 +22,7 @@ def vf_bmap() -> dict[str, str]:
|
|||||||
"no_forget": "noforget",
|
"no_forget": "noforget",
|
||||||
"no_pipe": "nopipe",
|
"no_pipe": "nopipe",
|
||||||
"no_robots": "norobots",
|
"no_robots": "norobots",
|
||||||
|
"no_tail": "notail",
|
||||||
"no_thumb": "dthumb",
|
"no_thumb": "dthumb",
|
||||||
"no_vthumb": "dvthumb",
|
"no_vthumb": "dvthumb",
|
||||||
"no_athumb": "dathumb",
|
"no_athumb": "dathumb",
|
||||||
@@ -51,6 +52,7 @@ def vf_bmap() -> dict[str, str]:
|
|||||||
"og_no_head",
|
"og_no_head",
|
||||||
"og_s_title",
|
"og_s_title",
|
||||||
"rand",
|
"rand",
|
||||||
|
"rmagic",
|
||||||
"rss",
|
"rss",
|
||||||
"wo_up_readme",
|
"wo_up_readme",
|
||||||
"xdev",
|
"xdev",
|
||||||
@@ -76,6 +78,8 @@ def vf_vmap() -> dict[str, str]:
|
|||||||
}
|
}
|
||||||
for k in (
|
for k in (
|
||||||
"bup_ck",
|
"bup_ck",
|
||||||
|
"chmod_d",
|
||||||
|
"chmod_f",
|
||||||
"dbd",
|
"dbd",
|
||||||
"forget_ip",
|
"forget_ip",
|
||||||
"hsortn",
|
"hsortn",
|
||||||
@@ -101,6 +105,10 @@ def vf_vmap() -> dict[str, str]:
|
|||||||
"mv_retry",
|
"mv_retry",
|
||||||
"rm_retry",
|
"rm_retry",
|
||||||
"sort",
|
"sort",
|
||||||
|
"tail_fd",
|
||||||
|
"tail_rate",
|
||||||
|
"tail_tmax",
|
||||||
|
"tail_who",
|
||||||
"tcolor",
|
"tcolor",
|
||||||
"unlist",
|
"unlist",
|
||||||
"u2abort",
|
"u2abort",
|
||||||
@@ -163,6 +171,8 @@ flagcats = {
|
|||||||
"safededup": "verify on-disk data before using it for dedup",
|
"safededup": "verify on-disk data before using it for dedup",
|
||||||
"noclone": "take dupe data from clients, even if available on HDD",
|
"noclone": "take dupe data from clients, even if available on HDD",
|
||||||
"nodupe": "rejects existing files (instead of linking/cloning them)",
|
"nodupe": "rejects existing files (instead of linking/cloning them)",
|
||||||
|
"chmod_d=755": "unix-permission for new dirs/folders",
|
||||||
|
"chmod_f=644": "unix-permission for new files",
|
||||||
"sparse": "force use of sparse files, mainly for s3-backed storage",
|
"sparse": "force use of sparse files, mainly for s3-backed storage",
|
||||||
"nosparse": "deny use of sparse files, mainly for slow storage",
|
"nosparse": "deny use of sparse files, mainly for slow storage",
|
||||||
"daw": "enable full WebDAV write support (dangerous);\nPUT-operations will now \033[1;31mOVERWRITE\033[0;35m existing files",
|
"daw": "enable full WebDAV write support (dangerous);\nPUT-operations will now \033[1;31mOVERWRITE\033[0;35m existing files",
|
||||||
@@ -212,6 +222,7 @@ flagcats = {
|
|||||||
"d2d": "disables all database stuff, overrides -e2*",
|
"d2d": "disables all database stuff, overrides -e2*",
|
||||||
"hist=/tmp/cdb": "puts thumbnails and indexes at that location",
|
"hist=/tmp/cdb": "puts thumbnails and indexes at that location",
|
||||||
"dbpath=/tmp/cdb": "puts indexes at that location",
|
"dbpath=/tmp/cdb": "puts indexes at that location",
|
||||||
|
"landmark=foo": "disable db if file foo doesn't exist",
|
||||||
"scan=60": "scan for new files every 60sec, same as --re-maxage",
|
"scan=60": "scan for new files every 60sec, same as --re-maxage",
|
||||||
"nohash=\\.iso$": "skips hashing file contents if path matches *.iso",
|
"nohash=\\.iso$": "skips hashing file contents if path matches *.iso",
|
||||||
"noidx=\\.iso$": "fully ignores the contents at paths matching *.iso",
|
"noidx=\\.iso$": "fully ignores the contents at paths matching *.iso",
|
||||||
@@ -274,6 +285,8 @@ flagcats = {
|
|||||||
"nodirsz": "don't show total folder size",
|
"nodirsz": "don't show total folder size",
|
||||||
"robots": "allows indexing by search engines (default)",
|
"robots": "allows indexing by search engines (default)",
|
||||||
"norobots": "kindly asks search engines to leave",
|
"norobots": "kindly asks search engines to leave",
|
||||||
|
"unlistcr": "don't list read-access in controlpanel",
|
||||||
|
"unlistcw": "don't list write-access in controlpanel",
|
||||||
"no_sb_md": "disable js sandbox for markdown files",
|
"no_sb_md": "disable js sandbox for markdown files",
|
||||||
"no_sb_lg": "disable js sandbox for prologue/epilogue",
|
"no_sb_lg": "disable js sandbox for prologue/epilogue",
|
||||||
"sb_md": "enable js sandbox for markdown files (default)",
|
"sb_md": "enable js sandbox for markdown files (default)",
|
||||||
@@ -304,6 +317,13 @@ flagcats = {
|
|||||||
"exp_md": "placeholders to expand in markdown files; see --help",
|
"exp_md": "placeholders to expand in markdown files; see --help",
|
||||||
"exp_lg": "placeholders to expand in prologue/epilogue; see --help",
|
"exp_lg": "placeholders to expand in prologue/epilogue; see --help",
|
||||||
},
|
},
|
||||||
|
"tailing": {
|
||||||
|
"notail": "disable ?tail (download a growing file continuously)",
|
||||||
|
"tail_fd=1": "check if file was replaced (new fd) every 1 sec",
|
||||||
|
"tail_rate=0.2": "check for new data every 0.2 sec",
|
||||||
|
"tail_tmax=30": "kill connection after 30 sec",
|
||||||
|
"tail_who=2": "restrict ?tail access (1=admins,2=authed,3=everyone)",
|
||||||
|
},
|
||||||
"others": {
|
"others": {
|
||||||
"dots": "allow all users with read-access to\nenable the option to show dotfiles in listings",
|
"dots": "allow all users with read-access to\nenable the option to show dotfiles in listings",
|
||||||
"fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
|
"fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
|
||||||
@@ -312,6 +332,7 @@ flagcats = {
|
|||||||
"dks": "per-directory accesskeys allow browsing into subdirs",
|
"dks": "per-directory accesskeys allow browsing into subdirs",
|
||||||
"dky": 'allow seeing files (not folders) inside a specific folder\nwith "g" perm, and does not require a valid dirkey to do so',
|
"dky": 'allow seeing files (not folders) inside a specific folder\nwith "g" perm, and does not require a valid dirkey to do so',
|
||||||
"rss": "allow '?rss' URL suffix (experimental)",
|
"rss": "allow '?rss' URL suffix (experimental)",
|
||||||
|
"rmagic": "expensive analysis for mimetype accuracy",
|
||||||
"ups_who=2": "restrict viewing the list of recent uploads",
|
"ups_who=2": "restrict viewing the list of recent uploads",
|
||||||
"zip_who=2": "restrict access to download-as-zip/tar",
|
"zip_who=2": "restrict access to download-as-zip/tar",
|
||||||
"zipmaxn=9k": "reject download-as-zip if more than 9000 files",
|
"zipmaxn=9k": "reject download-as-zip if more than 9000 files",
|
||||||
|
|||||||
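The flag strings above follow a `name` or `name=value` convention. As a minimal sketch of how such volflag tokens could be split into a lookup dict (a hypothetical helper for illustration, not copyparty's actual parser):

```python
def parse_volflags(tokens):
    # bare flag -> True; "key=value" -> string value
    flags = {}
    for tok in tokens:
        if "=" in tok:
            k, v = tok.split("=", 1)
            flags[k] = v
        else:
            flags[tok] = True
    return flags

print(parse_volflags(["notail", "tail_rate=0.2", "tail_who=2"]))
# {'notail': True, 'tail_rate': '0.2', 'tail_who': '2'}
```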
```diff
@@ -229,7 +229,7 @@ class FtpFs(AbstractedFS):
         r = "r" in mode
         w = "w" in mode or "a" in mode or "+" in mode
 
-        ap = self.rv2a(filename, r, w)[0]
+        ap, vfs, _ = self.rv2a(filename, r, w)
         self.validpath(ap)
         if w:
             try:
@@ -261,7 +261,11 @@ class FtpFs(AbstractedFS):
 
                     wunlink(self.log, ap, VF_CAREFUL)
 
-        return open(fsenc(ap), mode, self.args.iobuf)
+        ret = open(fsenc(ap), mode, self.args.iobuf)
+        if w and "chmod_f" in vfs.flags:
+            os.fchmod(ret.fileno(), vfs.flags["chmod_f"])
+
+        return ret
 
     def chdir(self, path: str) -> None:
         nwd = join(self.cwd, path)
@@ -292,8 +296,9 @@ class FtpFs(AbstractedFS):
         ) = avfs.can_access("", self.h.uname)
 
     def mkdir(self, path: str) -> None:
-        ap = self.rv2a(path, w=True)[0]
-        bos.makedirs(ap)  # filezilla expects this
+        ap, vfs, _ = self.rv2a(path, w=True)
+        chmod = vfs.flags["chmod_d"]
+        bos.makedirs(ap, chmod)  # filezilla expects this
 
     def listdir(self, path: str) -> list[str]:
         vpath = join(self.cwd, path)
```
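The `chmod_f` handling above boils down to calling `os.fchmod` on the freshly opened handle; unlike the mode given to `open`/`os.open`, `fchmod` is not filtered through the process umask, so the configured bits land exactly. A standalone sketch of the pattern (Unix only; `create_with_mode` is a hypothetical helper, not a copyparty API):

```python
import os
import stat
import tempfile

def create_with_mode(path, mode):
    # mirror the chmod_f idea: open for writing, then force
    # the exact permission bits on the open file descriptor
    f = open(path, "wb")
    os.fchmod(f.fileno(), mode)
    return f

with tempfile.TemporaryDirectory() as td:
    p = os.path.join(td, "up.bin")
    with create_with_mode(p, 0o640) as f:
        f.write(b"hello")
    print(oct(stat.S_IMODE(os.stat(p).st_mode)))  # 0o640 regardless of umask
```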
```diff
@@ -45,6 +45,7 @@ from .util import (
     APPLESAN_RE,
     BITNESS,
     DAV_ALLPROPS,
+    E_SCK_WR,
     FN_EMB,
     HAVE_SQLITE3,
     HTTPCODE,
@@ -189,11 +190,11 @@ class HttpCli(object):
         self.log_src = conn.log_src  # mypy404
         self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
         self.tls: bool = hasattr(self.s, "cipher")
+        self.is_vproxied = bool(self.args.R)
 
         # placeholders; assigned by run()
         self.keepalive = False
         self.is_https = False
-        self.is_vproxied = False
         self.in_hdr_recv = True
         self.headers: dict[str, str] = {}
         self.mode = " "  # http verb
@@ -401,7 +402,6 @@ class HttpCli(object):
             self.bad_xff = True
         else:
             self.ip = cli_ip
-            self.is_vproxied = bool(self.args.R)
             self.log_src = self.conn.set_rproxy(self.ip)
             self.host = self.headers.get("x-forwarded-host") or self.host
             trusted_xff = True
@@ -534,6 +534,7 @@ class HttpCli(object):
             else:
                 t = "incorrect --rp-loc or webserver config; expected vpath starting with %r but got %r"
                 self.log(t % (self.args.R, vpath), 1)
+                self.is_vproxied = False
 
         self.ouparam = uparam.copy()
 
```
```diff
@@ -1233,10 +1234,19 @@ class HttpCli(object):
             else:
                 return self.tx_404(True)
         else:
-            vfs = self.asrv.vfs
-            if vfs.badcfg1:
-                t = "<h2>access denied due to failsafe; check server log</h2>"
-                html = self.j2s("splash", this=self, msg=t)
+            if (
+                self.asrv.badcfg1
+                and "h" not in self.ouparam
+                and "hc" not in self.ouparam
+            ):
+                zs1 = "copyparty refused to start due to a failsafe: invalid server config; check server log"
+                zs2 = 'you may <a href="/?h">access the controlpanel</a> but nothing will work until you shutdown the copyparty container and %s config-file (or provide the configuration as command-line arguments)'
+                if self.asrv.is_lxc and len(self.asrv.cfg_files_loaded) == 1:
+                    zs2 = zs2 % ("add a",)
+                else:
+                    zs2 = zs2 % ("fix the",)
+
+                html = self.j2s("msg", h1=zs1, h2=zs2)
                 self.reply(html.encode("utf-8", "replace"), 500)
                 return True
 
@@ -1290,6 +1300,9 @@ class HttpCli(object):
         if "ru" in self.uparam:
             return self.tx_rups()
 
+        if "idp" in self.uparam:
+            return self.tx_idp()
+
         if "h" in self.uparam:
             return self.tx_mounts()
 
```
```diff
@@ -1362,12 +1375,13 @@ class HttpCli(object):
         title = self.uparam.get("title") or self.vpath.split("/")[-1]
         etitle = html_escape(title, True, True)
 
-        baseurl = "%s://%s%s" % (
+        baseurl = "%s://%s/" % (
             "https" if self.is_https else "http",
             self.host,
-            self.args.SRS,
         )
-        feed = "%s%s" % (baseurl, self.req[1:])
+        feed = baseurl + self.req[1:]
+        if self.is_vproxied:
+            baseurl += self.args.RS
         efeed = html_escape(feed, True, True)
         edirlink = efeed.split("?")[0] + q_pw
 
@@ -1380,7 +1394,7 @@ class HttpCli(object):
 \t\t<title>%s</title>
 \t\t<description></description>
 \t\t<link>%s</link>
-\t\t<generator>copyparty-1</generator>
+\t\t<generator>copyparty-2</generator>
 """
             % (efeed, etitle, edirlink)
         ]
@@ -1403,7 +1417,13 @@ class HttpCli(object):
         except:
             pass
 
+        ap = ""
+        use_magic = "rmagic" in self.vn.flags
+
         for i in hits:
+            if use_magic:
+                ap = os.path.join(self.vn.realpath, i["rp"])
+
             iurl = html_escape("%s%s" % (baseurl, i["rp"]), True, True)
             title = unquotep(i["rp"].split("?")[0].split("/")[-1])
             title = html_escape(title, True, True)
@@ -1411,7 +1431,7 @@ class HttpCli(object):
             tag_a = str(i["tags"].get("artist") or "")
             desc = "%s - %s" % (tag_a, tag_t) if tag_t and tag_a else (tag_t or tag_a)
             desc = html_escape(desc, True, True) if desc else title
-            mime = html_escape(guess_mime(title))
+            mime = html_escape(guess_mime(title, ap))
             lmod = formatdate(max(0, i["ts"]))
             zsa = (iurl, iurl, title, desc, lmod, iurl, mime, i["sz"])
             zs = (
@@ -1564,6 +1584,9 @@ class HttpCli(object):
             None, 207, "text/xml; charset=" + enc, {"Transfer-Encoding": "chunked"}
         )
 
+        ap = ""
+        use_magic = "rmagic" in vn.flags
+
         ret = '<?xml version="1.0" encoding="{}"?>\n<D:multistatus xmlns:D="DAV:">'
         ret = ret.format(uenc)
         for x in fgen:
@@ -1590,7 +1613,9 @@ class HttpCli(object):
                 "supportedlock": '<D:lockentry xmlns:D="DAV:"><D:lockscope><D:exclusive/></D:lockscope><D:locktype><D:write/></D:locktype></D:lockentry>',
             }
             if not isdir:
-                pvs["getcontenttype"] = html_escape(guess_mime(rp))
+                if use_magic:
+                    ap = os.path.join(tap, x["vp"])
+                pvs["getcontenttype"] = html_escape(guess_mime(rp, ap))
                 pvs["getcontentlength"] = str(st.st_size)
 
             for k, v in pvs.items():
```
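The new `rmagic` volflag lets `guess_mime` fall back to the file's actual contents rather than only its name. A content-sniffing stand-in for that idea (the signature table is a tiny illustrative subset and `sniff_mime` is hypothetical, not the detector copyparty uses):

```python
# (signature-prefix, mimetype) pairs; a tiny subset for illustration
MAGIC = [
    (b"\x89PNG\r\n\x1a\n", "image/png"),
    (b"\xff\xd8\xff", "image/jpeg"),
    (b"%PDF-", "application/pdf"),
]

def sniff_mime(ap, fallback="application/octet-stream"):
    # trust the first bytes of the file over its extension
    with open(ap, "rb") as f:
        head = f.read(16)
    for sig, mime in MAGIC:
        if head.startswith(sig):
            return mime
    return fallback
```

This is the "expensive analysis" tradeoff the flag description mentions: one extra `open`+`read` per served file in exchange for extension-independent accuracy.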
```diff
@@ -2043,7 +2068,7 @@ class HttpCli(object):
             fdir, fn = os.path.split(fdir)
             rem, _ = vsplit(rem)
 
-        bos.makedirs(fdir)
+        bos.makedirs(fdir, vfs.flags["chmod_d"])
 
         open_ka: dict[str, Any] = {"fun": open}
         open_a = ["wb", self.args.iobuf]
@@ -2102,6 +2127,8 @@ class HttpCli(object):
             fn = vfs.flags["put_name2"].format(now=time.time(), cip=self.dip())
 
         params = {"suffix": suffix, "fdir": fdir}
+        if "chmod_f" in vfs.flags:
+            params["chmod"] = vfs.flags["chmod_f"]
         if self.args.nw:
             params = {}
             fn = os.devnull
@@ -2150,7 +2177,7 @@ class HttpCli(object):
         if self.args.nw:
             fn = os.devnull
         else:
-            bos.makedirs(fdir)
+            bos.makedirs(fdir, vfs.flags["chmod_d"])
             path = os.path.join(fdir, fn)
             if not nameless:
                 self.vpath = vjoin(self.vpath, fn)
@@ -2282,7 +2309,7 @@ class HttpCli(object):
                 if self.args.hook_v:
                     log_reloc(self.log, hr["reloc"], x, path, vp, fn, vfs, rem)
                 fdir, self.vpath, fn, (vfs, rem) = x
-                bos.makedirs(fdir)
+                bos.makedirs(fdir, vfs.flags["chmod_d"])
                 path2 = os.path.join(fdir, fn)
                 atomic_move(self.log, path, path2, vfs.flags)
                 path = path2
@@ -2568,7 +2595,7 @@ class HttpCli(object):
         dst = vfs.canonical(rem)
         try:
             if not bos.path.isdir(dst):
-                bos.makedirs(dst)
+                bos.makedirs(dst, vfs.flags["chmod_d"])
         except OSError as ex:
             self.log("makedirs failed %r" % (dst,))
             if not bos.path.isdir(dst):
@@ -2587,10 +2614,6 @@ class HttpCli(object):
         x = self.conn.hsrv.broker.ask("up2k.handle_json", body, self.u2fh.aps)
         ret = x.get()
 
-        if self.is_vproxied:
-            if "purl" in ret:
-                ret["purl"] = self.args.SR + ret["purl"]
-
         if self.args.shr and self.vpath.startswith(self.args.shr1):
             # strip common suffix (uploader's folder structure)
             vp_req, vp_vfs = vroots(self.vpath, vjoin(dbv.vpath, vrem))
@@ -2600,6 +2623,10 @@ class HttpCli(object):
                 raise Pebkac(500, t % zt)
             ret["purl"] = vp_req + ret["purl"][len(vp_vfs) :]
 
+        if self.is_vproxied:
+            if "purl" in ret:
+                ret["purl"] = self.args.SR + ret["purl"]
+
         ret = json.dumps(ret)
         self.log(ret)
         self.reply(ret.encode("utf-8"), mime="application/json")
```
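Each `bos.makedirs(..., vfs.flags["chmod_d"])` call above threads the volume's directory mode through to directory creation. Note that a mode passed to plain `os.makedirs` is still masked by the process umask, so an explicit `chmod` afterwards is one way to guarantee the exact bits (a sketch of the idea assuming a Unix host; `makedirs_mode` is a hypothetical helper, not copyparty's `bos` wrapper):

```python
import os
import stat
import tempfile

def makedirs_mode(path, mode):
    # create the directory tree, then pin the leaf's permission
    # bits so the result is independent of the process umask
    os.makedirs(path, mode, exist_ok=True)
    os.chmod(path, mode)

with tempfile.TemporaryDirectory() as td:
    p = os.path.join(td, "a", "b")
    makedirs_mode(p, 0o750)
    print(oct(stat.S_IMODE(os.stat(p).st_mode)))  # 0o750
```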
```diff
@@ -2707,6 +2734,7 @@ class HttpCli(object):
         locked = chashes  # remaining chunks to be received in this request
         written = []  # chunks written to disk, but not yet released by up2k
         num_left = -1  # num chunks left according to most recent up2k release
+        bail1 = False  # used in sad path to avoid contradicting error-text
         treport = time.time()  # ratelimit up2k reporting to reduce overhead
 
         if "x-up2k-subc" in self.headers:
@@ -2845,7 +2873,6 @@ class HttpCli(object):
                 except:
                     # maybe busted handle (eg. disk went full)
                     f.close()
-                    chashes = []  # exception flag
                     raise
         finally:
             if locked:
@@ -2854,13 +2881,14 @@ class HttpCli(object):
                 num_left, t = x.get()
                 if num_left < 0:
                     self.loud_reply(t, status=500)
-                    if chashes:  # kills exception bubbling otherwise
-                        return False
+                    bail1 = True
                 else:
                     t = "got %d more chunks, %d left"
                     self.log(t % (len(written), num_left), 6)
 
         if num_left < 0:
+            if bail1:
+                return False
             raise Pebkac(500, "unconfirmed; see serverlog")
 
         if not num_left and fpool:
```
```diff
@@ -3002,7 +3030,7 @@ class HttpCli(object):
             raise Pebkac(405, 'folder "/%s" already exists' % (vpath,))
 
         try:
-            bos.makedirs(fn)
+            bos.makedirs(fn, vfs.flags["chmod_d"])
         except OSError as ex:
             if ex.errno == errno.EACCES:
                 raise Pebkac(500, "the server OS denied write-access")
@@ -3044,6 +3072,8 @@ class HttpCli(object):
 
         with open(fsenc(fn), "wb") as f:
             f.write(b"`GRUNNUR`\n")
+            if "chmod_f" in vfs.flags:
+                os.fchmod(f.fileno(), vfs.flags["chmod_f"])
 
         vpath = "{}/{}".format(self.vpath, sanitized).lstrip("/")
         self.redirect(vpath, "?edit")
@@ -3117,7 +3147,7 @@ class HttpCli(object):
         )
         upload_vpath = "{}/{}".format(vfs.vpath, rem).strip("/")
         if not nullwrite:
-            bos.makedirs(fdir_base)
+            bos.makedirs(fdir_base, vfs.flags["chmod_d"])
 
         rnd, lifetime, xbu, xau = self.upload_flags(vfs)
         zs = self.uparam.get("want") or self.headers.get("accept") or ""
@@ -3212,8 +3242,11 @@ class HttpCli(object):
                 else:
                     open_args["fdir"] = fdir
 
+                if "chmod_f" in vfs.flags:
+                    open_args["chmod"] = vfs.flags["chmod_f"]
+
                 if p_file and not nullwrite:
-                    bos.makedirs(fdir)
+                    bos.makedirs(fdir, vfs.flags["chmod_d"])
 
                     # reserve destination filename
                     f, fname = ren_open(fname, "wb", fdir=fdir, suffix=suffix)
@@ -3317,7 +3350,7 @@ class HttpCli(object):
                 if nullwrite:
                     fdir = ap2 = ""
                 else:
-                    bos.makedirs(fdir)
+                    bos.makedirs(fdir, vfs.flags["chmod_d"])
                 atomic_move(self.log, abspath, ap2, vfs.flags)
                 abspath = ap2
                 sz = bos.path.getsize(abspath)
@@ -3438,6 +3471,8 @@ class HttpCli(object):
                 ft = "{}:{}".format(self.ip, self.addr[1])
                 ft = "{}\n{}\n{}\n".format(ft, msg.rstrip(), errmsg)
                 f.write(ft.encode("utf-8"))
+                if "chmod_f" in vfs.flags:
+                    os.fchmod(f.fileno(), vfs.flags["chmod_f"])
         except Exception as ex:
             suf = "\nfailed to write the upload report: {}".format(ex)
 
@@ -3488,7 +3523,7 @@ class HttpCli(object):
         lim = vfs.get_dbv(rem)[0].lim
         if lim:
             fp, rp = lim.all(self.ip, rp, clen, vfs.realpath, fp, self.conn.hsrv.broker)
-            bos.makedirs(fp)
+            bos.makedirs(fp, vfs.flags["chmod_d"])
 
         fp = os.path.join(fp, fn)
         rem = "{}/{}".format(rp, fn).strip("/")
@@ -3556,13 +3591,15 @@ class HttpCli(object):
             zs = ub64enc(zb).decode("ascii")[:24].lower()
             dp = "%s/md/%s/%s/%s" % (dbv.histpath, zs[:2], zs[2:4], zs)
             self.log("moving old version to %s/%s" % (dp, mfile2))
-            if bos.makedirs(dp):
+            if bos.makedirs(dp, vfs.flags["chmod_d"]):
                 with open(os.path.join(dp, "dir.txt"), "wb") as f:
                     f.write(afsenc(vrd))
+                    if "chmod_f" in vfs.flags:
+                        os.fchmod(f.fileno(), vfs.flags["chmod_f"])
         elif hist_cfg == "s":
             dp = os.path.join(mdir, ".hist")
             try:
-                bos.mkdir(dp)
+                bos.mkdir(dp, vfs.flags["chmod_d"])
                 hidedir(dp)
             except:
                 pass
@@ -3601,6 +3638,8 @@ class HttpCli(object):
             wunlink(self.log, fp, vfs.flags)
 
         with open(fsenc(fp), "wb", self.args.iobuf) as f:
+            if "chmod_f" in vfs.flags:
+                os.fchmod(f.fileno(), vfs.flags["chmod_f"])
             sz, sha512, _ = hashcopy(p_data, f, None, 0, self.args.s_wr_slp)
 
         if lim:
```
```diff
@@ -3804,6 +3843,20 @@ class HttpCli(object):
 
         return txt
 
+    def _can_tail(self, volflags: dict[str, Any]) -> bool:
+        zp = self.args.ua_nodoc
+        if zp and zp.search(self.ua):
+            t = "this URL contains no valuable information for bots/crawlers"
+            raise Pebkac(403, t)
+        lvl = volflags["tail_who"]
+        if "notail" in volflags or not lvl:
+            raise Pebkac(400, "tail is disabled in server config")
+        elif lvl <= 1 and not self.can_admin:
+            raise Pebkac(400, "tail is admin-only on this server")
+        elif lvl <= 2 and self.uname in ("", "*"):
+            raise Pebkac(400, "you must be authenticated to use ?tail on this server")
+        return True
+
     def _can_zip(self, volflags: dict[str, Any]) -> str:
         lvl = volflags["zip_who"]
         if self.args.no_zip or not lvl:
@@ -3948,6 +4001,8 @@ class HttpCli(object):
         logmsg = "{:4} {} ".format("", self.req)
         logtail = ""
 
+        is_tail = "tail" in self.uparam and self._can_tail(self.vn.flags)
+
         if ptop is not None:
             ap_data = "<%s>" % (req_path,)
             try:
@@ -4061,6 +4116,7 @@ class HttpCli(object):
             and can_range
             and file_sz
             and "," not in hrange
+            and not is_tail
         ):
             try:
                 if not hrange.lower().startswith("bytes"):
@@ -4129,6 +4185,8 @@ class HttpCli(object):
             mime = "text/plain; charset={}".format(self.uparam["txt"] or "utf-8")
         elif "mime" in self.uparam:
             mime = str(self.uparam.get("mime"))
+        elif "rmagic" in self.vn.flags:
+            mime = guess_mime(req_path, fs_path)
         else:
             mime = guess_mime(req_path)
 
@@ -4146,13 +4204,18 @@ class HttpCli(object):
             return True
 
         dls = self.conn.hsrv.dls
+        if is_tail:
+            upper = 1 << 30
+            if len(dls) > self.args.tail_cmax:
+                raise Pebkac(400, "too many active downloads to start a new tail")
+
         if upper - lower > 0x400000:  # 4m
             now = time.time()
             self.dl_id = "%s:%s" % (self.ip, self.addr[1])
             dls[self.dl_id] = (now, 0)
             self.conn.hsrv.dli[self.dl_id] = (
                 now,
-                upper - lower,
+                0 if is_tail else upper - lower,
                 self.vn,
                 self.vpath,
                 self.uname,
@@ -4163,6 +4226,9 @@ class HttpCli(object):
             return self.tx_pipe(
                 ptop, req_path, ap_data, job, lower, upper, status, mime, logmsg
             )
+        elif is_tail:
+            self.tx_tail(open_args, status, mime)
+            return False
 
         ret = True
         with open_func(*open_args) as f:
```
|
|||||||
|
|
||||||
return ret
|
return ret
|
||||||
|
|
||||||
|
def tx_tail(
|
||||||
|
self,
|
||||||
|
open_args: list[Any],
|
||||||
|
status: int,
|
||||||
|
mime: str,
|
||||||
|
) -> None:
|
||||||
|
vf = self.vn.flags
|
||||||
|
self.send_headers(length=None, status=status, mime=mime)
|
||||||
|
abspath: bytes = open_args[0]
|
||||||
|
sec_rate = vf["tail_rate"]
|
||||||
|
sec_max = vf["tail_tmax"]
|
||||||
|
sec_fd = vf["tail_fd"]
|
||||||
|
sec_ka = self.args.tail_ka
|
||||||
|
wr_slp = self.args.s_wr_slp
|
||||||
|
wr_sz = self.args.s_wr_sz
|
||||||
|
dls = self.conn.hsrv.dls
|
||||||
|
dl_id = self.dl_id
|
||||||
|
|
||||||
|
# non-numeric = full file from start
|
||||||
|
# positive = absolute offset from start
|
||||||
|
# negative = start that many bytes from eof
|
||||||
|
try:
|
||||||
|
ofs = int(self.uparam["tail"])
|
||||||
|
except:
|
||||||
|
ofs = 0
|
||||||
|
|
||||||
|
t0 = time.time()
|
||||||
|
ofs0 = ofs
|
||||||
|
f = None
|
||||||
|
try:
|
||||||
|
st = os.stat(abspath)
|
||||||
|
f = open(*open_args)
|
||||||
|
f.seek(0, os.SEEK_END)
|
||||||
|
eof = f.tell()
|
||||||
|
f.seek(0)
|
||||||
|
if ofs < 0:
|
||||||
|
ofs = max(0, ofs + eof)
|
||||||
|
|
||||||
|
self.log("tailing from byte %d: %r" % (ofs, abspath), 6)
|
||||||
|
|
||||||
|
# send initial data asap
|
||||||
|
remains = sendfile_py(
|
||||||
|
self.log, # d/c
|
||||||
|
ofs,
|
||||||
|
eof,
|
||||||
|
f,
|
||||||
|
self.s,
|
||||||
|
wr_sz,
|
||||||
|
wr_slp,
|
||||||
|
False, # d/c
|
||||||
|
dls,
|
||||||
|
dl_id,
|
||||||
|
)
|
||||||
|
sent = (eof - ofs) - remains
|
||||||
|
ofs = eof - remains
|
||||||
|
f.seek(ofs)
|
||||||
|
|
||||||
|
try:
|
||||||
|
st2 = os.stat(open_args[0])
|
||||||
|
if st.st_ino == st2.st_ino:
|
||||||
|
st = st2 # for filesize
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
|
gone = 0
|
||||||
|
t_fd = t_ka = time.time()
|
||||||
|
while True:
|
||||||
|
assert f # !rm
|
||||||
|
buf = f.read(4096)
|
||||||
|
now = time.time()
|
||||||
|
|
||||||
|
if sec_max and now - t0 >= sec_max:
|
||||||
|
self.log("max duration exceeded; kicking client", 6)
|
||||||
|
zb = b"\n\n*** max duration exceeded; disconnecting ***\n"
|
||||||
|
self.s.sendall(zb)
|
||||||
|
break
|
||||||
|
|
||||||
|
if buf:
|
||||||
|
t_fd = t_ka = now
|
||||||
|
self.s.sendall(buf)
|
||||||
|
sent += len(buf)
|
||||||
|
dls[dl_id] = (time.time(), sent)
|
||||||
|
continue
|
||||||
|
|
||||||
|
time.sleep(sec_rate)
|
||||||
|
if t_ka < now - sec_ka:
|
||||||
|
t_ka = now
|
||||||
|
self.s.send(b"\x00")
|
||||||
|
if t_fd < now - sec_fd:
|
||||||
|
try:
|
||||||
|
st2 = os.stat(open_args[0])
|
||||||
|
if (
|
||||||
|
st2.st_ino != st.st_ino
|
||||||
|
or st2.st_size < sent
|
||||||
|
or st2.st_size < st.st_size
|
||||||
|
):
|
||||||
|
assert f # !rm
|
||||||
|
# open new file before closing previous to avoid toctous (open may fail; cannot null f before)
|
||||||
|
f2 = open(*open_args)
|
||||||
|
f.close()
|
||||||
|
f = f2
|
||||||
|
f.seek(0, os.SEEK_END)
|
||||||
|
eof = f.tell()
|
||||||
|
if eof < sent:
|
||||||
|
ofs = sent = 0 # shrunk; send from start
|
||||||
|
zb = b"\n\n*** file size decreased -- rewinding to the start of the file ***\n\n"
|
||||||
|
self.s.sendall(zb)
|
||||||
|
if ofs0 < 0 and eof > -ofs0:
|
||||||
|
ofs = eof + ofs0
|
||||||
|
else:
|
||||||
|
ofs = sent # just new fd? resume from same ofs
|
||||||
|
f.seek(ofs)
|
||||||
|
self.log("reopened at byte %d: %r" % (ofs, abspath), 6)
|
||||||
|
gone = 0
|
||||||
|
st = st2
|
||||||
|
except:
|
||||||
|
gone += 1
|
||||||
|
if gone > 3:
|
||||||
|
self.log("file deleted; disconnecting")
|
||||||
|
break
|
||||||
|
except IOError as ex:
|
||||||
|
if ex.errno not in E_SCK_WR:
|
||||||
|
raise
|
||||||
|
finally:
|
||||||
|
if f:
|
||||||
|
f.close()
|
||||||
|
|
||||||
def tx_pipe(
|
def tx_pipe(
|
||||||
self,
|
self,
|
||||||
ptop: str,
|
ptop: str,
|
||||||
@@ -4752,7 +4945,6 @@ class HttpCli(object):
|
|||||||
if zi == 2 or (zi == 1 and self.avol):
|
if zi == 2 or (zi == 1 and self.avol):
|
||||||
dl_list = self.get_dls()
|
dl_list = self.get_dls()
|
||||||
for t0, t1, sent, sz, vp, dl_id, uname in dl_list:
|
for t0, t1, sent, sz, vp, dl_id, uname in dl_list:
|
||||||
rem = sz - sent
|
|
||||||
td = max(0.1, now - t0)
|
td = max(0.1, now - t0)
|
||||||
rd, fn = vsplit(vp)
|
rd, fn = vsplit(vp)
|
||||||
if not rd:
|
if not rd:
|
||||||
@@ -4772,6 +4964,11 @@ class HttpCli(object):
|
|||||||
fn = html_escape(fn) if fn else self.conn.hsrv.iiam
|
fn = html_escape(fn) if fn else self.conn.hsrv.iiam
|
||||||
dls.append((perc, hsent, spd, eta, idle, usr, erd, rds, fn))
|
dls.append((perc, hsent, spd, eta, idle, usr, erd, rds, fn))
|
||||||
|
|
||||||
|
if self.args.have_unlistc:
|
||||||
|
allvols = self.asrv.vfs.all_vols
|
||||||
|
rvol = [x for x in rvol if "unlistcr" not in allvols[x[1:-1]].flags]
|
||||||
|
wvol = [x for x in wvol if "unlistcw" not in allvols[x[1:-1]].flags]
|
||||||
|
|
||||||
fmt = self.uparam.get("ls", "")
|
fmt = self.uparam.get("ls", "")
|
||||||
if not fmt and (self.ua.startswith("curl/") or self.ua.startswith("fetch")):
|
if not fmt and (self.ua.startswith("curl/") or self.ua.startswith("fetch")):
|
||||||
fmt = "v"
|
fmt = "v"
|
||||||
@@ -4926,15 +5123,24 @@ class HttpCli(object):
|
|||||||
return "" # unhandled / fallthrough
|
return "" # unhandled / fallthrough
|
||||||
|
|
||||||
def scanvol(self) -> bool:
|
def scanvol(self) -> bool:
|
||||||
if not self.can_admin:
|
|
||||||
raise Pebkac(403, "'scanvol' not allowed for user " + self.uname)
|
|
||||||
|
|
||||||
if self.args.no_rescan:
|
if self.args.no_rescan:
|
||||||
raise Pebkac(403, "the rescan feature is disabled in server config")
|
raise Pebkac(403, "the rescan feature is disabled in server config")
|
||||||
|
|
||||||
vn, _ = self.asrv.vfs.get(self.vpath, self.uname, True, True)
|
vpaths = self.uparam["scan"].split(",/")
|
||||||
|
if vpaths == [""]:
|
||||||
|
vpaths = [self.vpath]
|
||||||
|
|
||||||
args = [self.asrv.vfs.all_vols, [vn.vpath], False, True]
|
vols = []
|
||||||
|
for vpath in vpaths:
|
||||||
|
vn, _ = self.asrv.vfs.get(vpath, self.uname, True, True)
|
||||||
|
vols.append(vn.vpath)
|
||||||
|
if self.uname not in vn.axs.uadmin:
|
||||||
|
self.log("rejected scanning [%s] => [%s];" % (vpath, vn.vpath), 3)
|
||||||
|
raise Pebkac(403, "'scanvol' not allowed for user " + self.uname)
|
||||||
|
|
||||||
|
self.log("trying to rescan %d volumes: %r" % (len(vols), vols))
|
||||||
|
|
||||||
|
args = [self.asrv.vfs.all_vols, vols, False, True]
|
||||||
|
|
||||||
x = self.conn.hsrv.broker.ask("up2k.rescan", *args)
|
x = self.conn.hsrv.broker.ask("up2k.rescan", *args)
|
||||||
err = x.get()
|
err = x.get()
|
||||||
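The reworked `scanvol` above accepts several volumes in one request: the `?scan=` parameter is split on `",/"`, and an empty parameter falls back to the volume currently being browsed. A minimal standalone sketch of just that parsing step (the function name is invented for illustration):

```python
def parse_scan_param(param: str, current_vpath: str) -> list[str]:
    # splitting on ",/" instead of "," keeps commas inside a single
    # volume name intact; each vpath after the first starts with "/"
    vpaths = param.split(",/")
    if vpaths == [""]:
        # empty ?scan= means "rescan the volume I'm currently browsing"
        vpaths = [current_vpath]
    return vpaths

print(parse_scan_param("", "music"))           # ['music']
print(parse_scan_param("music,/pics", "x"))    # ['music', 'pics']
```

Each resulting vpath is then resolved and permission-checked individually, so one request can only trigger rescans on volumes where the caller holds admin.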
@@ -5354,6 +5560,32 @@ class HttpCli(object):
         self.reply(html.encode("utf-8"), status=200)
         return True

+    def tx_idp(self) -> bool:
+        if self.uname.lower() not in self.args.idp_adm_set:
+            raise Pebkac(403, "'idp' not allowed for user " + self.uname)
+
+        cmd = self.uparam["idp"]
+        if cmd.startswith("rm="):
+            import sqlite3
+
+            db = sqlite3.connect(self.args.idp_db)
+            db.execute("delete from us where un=?", (cmd[3:],))
+            db.commit()
+            db.close()
+
+            self.conn.hsrv.broker.ask("reload", False, True).get()
+
+            self.redirect("", "?idp")
+            return True
+
+        rows = [
+            [k, "[%s]" % ("], [".join(v))]
+            for k, v in sorted(self.asrv.idp_accs.items())
+        ]
+        html = self.j2s("idp", this=self, rows=rows, now=int(time.time()))
+        self.reply(html.encode("utf-8"), status=200)
+        return True
+
     def tx_shares(self) -> bool:
         if self.uname == "*":
             self.loud_reply("you're not logged in")
@@ -5425,10 +5657,10 @@ class HttpCli(object):

         cur.connection.commit()
         if reload:
-            self.conn.hsrv.broker.ask("reload", False, False).get()
+            self.conn.hsrv.broker.ask("reload", False, True).get()
             self.conn.hsrv.broker.ask("up2k.wake_rescanner").get()

-        self.redirect(self.args.SRS + "?shares")
+        self.redirect("", "?shares")
         return True

     def handle_share(self, req: dict[str, str]) -> bool:
@@ -5517,7 +5749,7 @@ class HttpCli(object):
         cur.execute(q, (skey, fn))

         cur.connection.commit()
-        self.conn.hsrv.broker.ask("reload", False, False).get()
+        self.conn.hsrv.broker.ask("reload", False, True).get()
         self.conn.hsrv.broker.ask("up2k.wake_rescanner").get()

         fn = quotep(fns[0]) if len(fns) == 1 else ""
@@ -224,3 +224,6 @@ class HttpConn(object):
         if self.u2idx:
             self.hsrv.put_u2idx(str(self.addr), self.u2idx)
             self.u2idx = None
+
+        if self.rproxy:
+            self.set_rproxy()
@@ -175,6 +175,7 @@ class HttpSrv(object):
             "browser",
             "browser2",
             "cf",
+            "idp",
             "md",
             "mde",
             "msg",
@@ -313,6 +314,8 @@ class HttpSrv(object):

         Daemon(self.broker.say, "sig-hsrv-up1", ("cb_httpsrv_up",))

+        saddr = ("", 0)  # fwd-decl for `except TypeError as ex:`
+
         while not self.stopping:
             if self.args.log_conn:
                 self.log(self.name, "|%sC-ncli" % ("-" * 1,), c="90")
@@ -394,6 +397,19 @@ class HttpSrv(object):
                 self.log(self.name, "accept({}): {}".format(fno, ex), c=6)
                 time.sleep(0.02)
                 continue
+            except TypeError as ex:
+                # on macOS, accept() may return a None saddr if blocked by LittleSnitch;
+                # unicode(saddr[0]) ==> TypeError: 'NoneType' object is not subscriptable
+                if tcp and not saddr:
+                    t = "accept(%s): failed to accept connection from client due to firewall or network issue"
+                    self.log(self.name, t % (fno,), c=3)
+                    try:
+                        sck.close()  # type: ignore
+                    except:
+                        pass
+                    time.sleep(0.02)
+                    continue
+                raise

             if self.args.log_conn:
                 t = "|{}C-acc2 \033[0;36m{} \033[3{}m{}".format(
@@ -320,7 +320,7 @@ class SMB(object):

             self.hub.up2k.handle_mv(uname, "1.7.6.2", vp1, vp2)
         try:
-            bos.makedirs(ap2)
+            bos.makedirs(ap2, vfs2.flags["chmod_d"])
         except:
             pass

@@ -334,7 +334,7 @@ class SMB(object):
             t = "blocked mkdir (no-write-acc %s): /%s @%s"
             yeet(t % (vfs.axs.uwrite, vpath, uname))

-        return bos.mkdir(ap)
+        return bos.mkdir(ap, vfs.flags["chmod_d"])

     def _stat(self, vpath: str, *a: Any, **ka: Any) -> os.stat_result:
         try:
@@ -27,6 +27,7 @@ if True:  # pylint: disable=using-constant-test

 from .__init__ import ANYWIN, EXE, MACOS, PY2, TYPE_CHECKING, E, EnvParams, unicode
 from .authsrv import BAD_CFG, AuthSrv
+from .bos import bos
 from .cert import ensure_cert
 from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, HAVE_MUTAGEN
 from .pwhash import HAVE_ARGON2
@@ -88,6 +89,7 @@ if PY2:
     range = xrange  # type: ignore


+VER_IDP_DB = 1
 VER_SESSION_DB = 1
 VER_SHARES_DB = 2

@@ -258,11 +260,15 @@ class SvcHub(object):
             self.log("root", "effective %s is %s" % (zs, getattr(args, zs)))

         if args.ah_cli or args.ah_gen:
+            args.idp_store = 0
             args.no_ses = True
             args.shr = ""

+        if args.idp_store and args.idp_h_usr:
+            self.setup_db("idp")
+
         if not self.args.no_ses:
-            self.setup_session_db()
+            self.setup_db("ses")

         args.shr1 = ""
         if args.shr:
@@ -421,26 +427,58 @@ class SvcHub(object):
         except:
             pass

-    def setup_session_db(self) -> None:
+    def _db_onfail_ses(self) -> None:
+        self.args.no_ses = True
+
+    def _db_onfail_idp(self) -> None:
+        self.args.idp_store = 0
+
+    def setup_db(self, which: str) -> None:
+        """
+        the "non-mission-critical" databases; if something looks broken then just nuke it
+        """
+        if which == "ses":
+            native_ver = VER_SESSION_DB
+            db_path = self.args.ses_db
+            desc = "sessions-db"
+            pathopt = "ses-db"
+            sanchk_q = "select count(*) from us"
+            createfun = self._create_session_db
+            failfun = self._db_onfail_ses
+        elif which == "idp":
+            native_ver = VER_IDP_DB
+            db_path = self.args.idp_db
+            desc = "idp-db"
+            pathopt = "idp-db"
+            sanchk_q = "select count(*) from us"
+            createfun = self._create_idp_db
+            failfun = self._db_onfail_idp
+        else:
+            raise Exception("unknown cachetype")
+
+        if not db_path.endswith(".db"):
+            zs = "config option --%s (the %s) was configured to [%s] which is invalid; must be a filepath ending with .db"
+            self.log("root", zs % (pathopt, desc, db_path), 1)
+            raise Exception(BAD_CFG)
+
         if not HAVE_SQLITE3:
-            self.args.no_ses = True
-            t = "WARNING: sqlite3 not available; disabling sessions, will use plaintext passwords in cookies"
-            self.log("root", t, 3)
+            failfun()
+            if which == "ses":
+                zs = "disabling sessions, will use plaintext passwords in cookies"
+            elif which == "idp":
+                zs = "disabling idp-db, will be unable to remember IdP-volumes after a restart"
+            self.log("root", "WARNING: sqlite3 not available; %s" % (zs,), 3)
             return

         assert sqlite3  # type: ignore  # !rm

-        # policy:
-        # the sessions-db is whatever, if something looks broken then just nuke it
-
-        db_path = self.args.ses_db
         db_lock = db_path + ".lock"
         try:
             create = not os.path.getsize(db_path)
         except:
             create = True
         zs = "creating new" if create else "opening"
-        self.log("root", "%s sessions-db %s" % (zs, db_path))
+        self.log("root", "%s %s %s" % (zs, desc, db_path))

         for tries in range(2):
             sver = 0
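The `setup_db` refactor above turns the old sessions-db bootstrap into a shared routine for any "non-mission-critical" sqlite cache: a `kv` table stores a schema version (`sver`), a sanity query probes the payload table, and any failure on a version-less db leads to delete-and-recreate. A self-contained approximation of that pattern, with simplified names and only the idp-style schema (no locking or owner checks):

```python
import os
import sqlite3
import tempfile

NATIVE_VER = 1  # bump when the schema changes


def _create(cur):
    # same shape as the idp-db above: a kv table holding the schema
    # version ("sver") plus the actual payload table
    cur.execute("create table kv (k text, v int)")
    cur.execute("create table us (un text, gs text)")
    cur.execute("insert into kv values ('sver', ?)", (NATIVE_VER,))
    return NATIVE_VER


def setup_db(db_path):
    for tries in range(2):
        db = sqlite3.connect(db_path)
        cur = db.cursor()
        try:
            sver = 0
            try:
                zs = "select v from kv where k='sver'"
                sver = cur.execute(zs).fetchall()[0][0]
                if sver > NATIVE_VER:
                    raise Exception("db was written by a newer version")
                cur.execute("select count(*) from us").fetchone()  # sanity-check
            except Exception:
                if sver:
                    raise  # versioned but broken: not just an empty file
                sver = _create(cur)
            db.commit()
            return db
        except Exception:
            db.close()
            if tries:
                raise
            os.unlink(db_path)  # non-critical cache: nuke and retry once


db = setup_db(os.path.join(tempfile.mkdtemp(), "idp.db"))
print(db.execute("select v from kv where k='sver'").fetchone()[0])  # 1
```

The two-pass loop is the key design choice: the first iteration either opens a healthy db or destroys a corrupt one, and the second either recreates it cleanly or re-raises, so startup never wedges on a damaged cache file.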
@@ -450,17 +488,19 @@ class SvcHub(object):
             try:
                 zs = "select v from kv where k='sver'"
                 sver = cur.execute(zs).fetchall()[0][0]
-                if sver > VER_SESSION_DB:
-                    zs = "this version of copyparty only understands session-db v%d and older; the db is v%d"
-                    raise Exception(zs % (VER_SESSION_DB, sver))
+                if sver > native_ver:
+                    zs = "this version of copyparty only understands %s v%d and older; the db is v%d"
+                    raise Exception(zs % (desc, native_ver, sver))

-                cur.execute("select count(*) from us").fetchone()
+                cur.execute(sanchk_q).fetchone()
             except:
                 if sver:
                     raise
-                sver = 1
-                self._create_session_db(cur)
+                sver = createfun(cur)

-            err = self._verify_session_db(cur, sver, db_path)
+            err = self._verify_db(
+                cur, which, pathopt, db_path, desc, sver, native_ver
+            )
             if err:
                 tries = 99
                 self.args.no_ses = True
@@ -468,10 +508,10 @@ class SvcHub(object):
                 break

         except Exception as ex:
-            if tries or sver > VER_SESSION_DB:
+            if tries or sver > native_ver:
                 raise
-            t = "sessions-db is unusable; deleting and recreating: %r"
-            self.log("root", t % (ex,), 3)
+            t = "%s is unusable; deleting and recreating: %r"
+            self.log("root", t % (desc, ex), 3)
             try:
                 cur.close()  # type: ignore
             except:
@@ -486,7 +526,7 @@ class SvcHub(object):
             pass
         os.unlink(db_path)

-    def _create_session_db(self, cur: "sqlite3.Cursor") -> None:
+    def _create_session_db(self, cur: "sqlite3.Cursor") -> int:
         sch = [
             r"create table kv (k text, v int)",
             r"create table us (un text, si text, t0 int)",
@@ -499,8 +539,31 @@ class SvcHub(object):
         for cmd in sch:
             cur.execute(cmd)
         self.log("root", "created new sessions-db")
+        return 1

-    def _verify_session_db(self, cur: "sqlite3.Cursor", sver: int, db_path: str) -> str:
+    def _create_idp_db(self, cur: "sqlite3.Cursor") -> int:
+        sch = [
+            r"create table kv (k text, v int)",
+            r"create table us (un text, gs text)",
+            # username, groups
+            r"create index us_un on us(un)",
+            r"insert into kv values ('sver', 1)",
+        ]
+        for cmd in sch:
+            cur.execute(cmd)
+        self.log("root", "created new idp-db")
+        return 1
+
+    def _verify_db(
+        self,
+        cur: "sqlite3.Cursor",
+        which: str,
+        pathopt: str,
+        db_path: str,
+        desc: str,
+        sver: int,
+        native_ver: int,
+    ) -> str:
         # ensure writable (maybe owned by other user)
         db = cur.connection

@@ -512,9 +575,16 @@ class SvcHub(object):
         except:
             owner = 0

+        if which == "ses":
+            cons = "Will now disable sessions and instead use plaintext passwords in cookies."
+        elif which == "idp":
+            cons = "Each IdP-volume will not become available until its associated user sends their first request."
+        else:
+            raise Exception()
+
         if not lock_file(db_path + ".lock"):
-            t = "the sessions-db [%s] is already in use by another copyparty instance (pid:%d). This is not supported; please provide another database with --ses-db or give this copyparty-instance its entirely separate config-folder by setting another path in the XDG_CONFIG_HOME env-var. You can also disable this safeguard by setting env-var PRTY_NO_DB_LOCK=1. Will now disable sessions and instead use plaintext passwords in cookies."
-            return t % (db_path, owner)
+            t = "the %s [%s] is already in use by another copyparty instance (pid:%d). This is not supported; please provide another database with --%s or give this copyparty-instance its entirely separate config-folder by setting another path in the XDG_CONFIG_HOME env-var. You can also disable this safeguard by setting env-var PRTY_NO_DB_LOCK=1. %s"
+            return t % (desc, db_path, owner, pathopt, cons)

         vars = (("pid", os.getpid()), ("ts", int(time.time() * 1000)))
         if owner:
@@ -526,9 +596,9 @@ class SvcHub(object):
         for k, v in vars:
             cur.execute("insert into kv values(?, ?)", (k, v))

-        if sver < VER_SESSION_DB:
+        if sver < native_ver:
             cur.execute("delete from kv where k='sver'")
-            cur.execute("insert into kv values('sver',?)", (VER_SESSION_DB,))
+            cur.execute("insert into kv values('sver',?)", (native_ver,))

         db.commit()
         cur.close()
@@ -880,6 +950,12 @@ class SvcHub(object):
             vs = os.path.expandvars(os.path.expanduser(vs))
             setattr(al, k, vs)

+        for k in "idp_adm".split(" "):
+            vs = getattr(al, k)
+            vsa = [x.strip() for x in vs.split(",")]
+            vsa = [x.lower() for x in vsa if x]
+            setattr(al, k + "_set", set(vsa))
+
         zs = "dav_ua1 sus_urls nonsus_urls ua_nodoc ua_nozip"
         for k in zs.split(" "):
             vs = getattr(al, k)
@@ -1043,7 +1119,7 @@ class SvcHub(object):

         fn = sel_fn
         try:
-            os.makedirs(os.path.dirname(fn))
+            bos.makedirs(os.path.dirname(fn))
         except:
             pass

@@ -1060,6 +1136,9 @@ class SvcHub(object):

         lh = codecs.open(fn, "w", encoding="utf-8", errors="replace")

+        if getattr(self.args, "free_umask", False):
+            os.fchmod(lh.fileno(), 0o644)
+
         argv = [pybin] + self.argv
         if hasattr(shlex, "quote"):
             argv = [shlex.quote(x) for x in argv]
@@ -282,7 +282,7 @@ class TcpSrv(object):
         except:
             pass  # will create another ipv4 socket instead

-        if not ANYWIN and self.args.freebind:
+        if getattr(self.args, "freebind", False):
             srv.setsockopt(socket.SOL_IP, socket.IP_FREEBIND, 1)

         try:
@@ -387,14 +387,18 @@ class Tftpd(object):
         if not a:
             a = (self.args.iobuf,)

-        return open(ap, mode, *a, **ka)
+        ret = open(ap, mode, *a, **ka)
+        if wr and "chmod_f" in vfs.flags:
+            os.fchmod(ret.fileno(), vfs.flags["chmod_f"])
+
+        return ret

     def _mkdir(self, vpath: str, *a) -> None:
         vfs, _, ap = self._v2a("mkdir", vpath, [False, True])
         if "*" not in vfs.axs.uwrite:
             yeet("blocked mkdir; folder not world-writable: /%s" % (vpath,))

-        return bos.mkdir(ap)
+        return bos.mkdir(ap, vfs.flags["chmod_d"])

     def _unlink(self, vpath: str) -> None:
         # return bos.unlink(self._v2a("stat", vpath, *a)[1])
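The TFTP changes above apply the volume's `chmod_f`/`chmod_d` flags to newly created files and folders. A standalone sketch of the file half, assuming a POSIX system (`os.fchmod` does not exist on Windows); the helper name is invented:

```python
import os
import stat
import tempfile


def open_with_chmod(ap, mode, chmod_f=-1):
    # fchmod operates on the already-open fd, so the permission change
    # cannot race against a rename or re-creation of the path
    ret = open(ap, mode)
    wr = any(c in mode for c in "wa+")
    if wr and chmod_f >= 0:
        os.fchmod(ret.fileno(), chmod_f)
    return ret


ap = os.path.join(tempfile.mkdtemp(), "upload.bin")
f = open_with_chmod(ap, "wb", 0o640)
f.write(b"hi")
f.close()
print(oct(stat.S_IMODE(os.stat(ap).st_mode)))  # 0o640
```

Applying the mode through the fd rather than re-chmodding the path also means the permissions are correct before any other client can observe the file.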
@@ -96,6 +96,10 @@ try:
     if os.environ.get("PRTY_NO_PIL_AVIF"):
         raise Exception()

+    if ".avif" in Image.registered_extensions():
+        HAVE_AVIF = True
+        raise Exception()
+
     import pillow_avif  # noqa: F401 # pylint: disable=unused-import

     HAVE_AVIF = True
@@ -265,7 +269,8 @@ class ThumbSrv(object):
                 self.log("joined waiting room for %r" % (tpath,))
         except:
             thdir = os.path.dirname(tpath)
-            bos.makedirs(os.path.join(thdir, "w"))
+            chmod = 0o700 if self.args.free_umask else 0o755
+            bos.makedirs(os.path.join(thdir, "w"), chmod)

             inf_path = os.path.join(thdir, "dir.txt")
             if not bos.path.exists(inf_path):
@@ -280,7 +285,7 @@ class ThumbSrv(object):
                 vn = next((x for x in allvols if x.realpath == ptop), None)
                 if not vn:
                     self.log("ptop %r not in %s" % (ptop, allvols), 3)
-                    vn = self.asrv.vfs.all_aps[0][1]
+                    vn = self.asrv.vfs.all_aps[0][1][0]

             self.q.put((abspath, tpath, fmt, vn))
             self.log("conv %r :%s \033[0m%r" % (tpath, fmt, abspath), 6)
@@ -915,7 +915,8 @@ class Up2k(object):
         # only need to protect register_vpath but all in one go feels right
         for vol in vols:
             try:
-                bos.makedirs(vol.realpath)  # gonna happen at snap anyways
+                # mkdir gonna happen at snap anyways;
+                bos.makedirs(vol.realpath, vol.flags["chmod_d"])
                 dir_is_empty(self.log_func, not self.args.no_scandir, vol.realpath)
             except Exception as ex:
                 self.volstate[vol.vpath] = "OFFLINE (cannot access folder)"
@@ -1141,6 +1142,20 @@ class Up2k(object):
                 del fl[k1]
             else:
                 fl[k1] = ",".join(x for x in fl[k1])
+
+        if fl["chmod_d"] == int(self.args.chmod_d, 8):
+            fl.pop("chmod_d")
+        try:
+            if fl["chmod_f"] == int(self.args.chmod_f or "-1", 8):
+                fl.pop("chmod_f")
+        except:
+            pass
+        for k in ("chmod_f", "chmod_d"):
+            try:
+                fl[k] = "%o" % (fl[k])
+            except:
+                pass
+
         a = [
             (ft if v is True else ff if v is False else fv).format(k, str(v))
             for k, v in fl.items()
@@ -1364,6 +1379,10 @@ class Up2k(object):
             t = "volume /%s at [%s] is empty; will not be indexed as this could be due to an offline filesystem"
             self.log(t % (vol.vpath, rtop), 6)
             return True, False
+        if not vol.check_landmarks():
+            t = "volume /%s at [%s] will not be indexed due to bad landmarks"
+            self.log(t % (vol.vpath, rtop), 6)
+            return True, False

         n_add, _, _ = self._build_dir(
             db,
@@ -3290,7 +3309,7 @@ class Up2k(object):
             reg,
             "up2k._get_volsize",
         )
-        bos.makedirs(ap2)
+        bos.makedirs(ap2, vfs.flags["chmod_d"])
         vfs.lim.nup(cj["addr"])
         vfs.lim.bup(cj["addr"], cj["size"])

@@ -3397,11 +3416,11 @@ class Up2k(object):
                 self.log(t % (mts - mtc, mts, mtc, fp))
                 ow = False

+        ptop = job["ptop"]
+        vf = self.flags.get(ptop) or {}
         if ow:
             self.log("replacing existing file at %r" % (fp,))
             cur = None
-            ptop = job["ptop"]
-            vf = self.flags.get(ptop) or {}
             st = bos.stat(fp)
             try:
                 vrel = vjoin(job["prel"], fname)
@@ -3421,8 +3440,13 @@ class Up2k(object):
         else:
             dip = self.hub.iphash.s(ip)

-        suffix = "-%.6f-%s" % (ts, dip)
-        f, ret = ren_open(fname, "wb", fdir=fdir, suffix=suffix)
+        f, ret = ren_open(
+            fname,
+            "wb",
+            fdir=fdir,
+            suffix="-%.6f-%s" % (ts, dip),
+            chmod=vf.get("chmod_f", -1),
+        )
         f.close()
         return ret

@@ -4277,7 +4301,7 @@ class Up2k(object):
             self.log(t, 1)
             raise Pebkac(405, t)

-        bos.makedirs(os.path.dirname(dabs))
+        bos.makedirs(os.path.dirname(dabs), dvn.flags["chmod_d"])

         c1, w, ftime_, fsize_, ip, at = self._find_from_vpath(
             svn_dbv.realpath, srem_dbv
@@ -4453,7 +4477,7 @@ class Up2k(object):
                 vp = vjoin(dvp, rem)
                 try:
                     dvn, drem = self.vfs.get(vp, uname, False, True)
-                    bos.mkdir(dvn.canonical(drem))
+                    bos.mkdir(dvn.canonical(drem), dvn.flags["chmod_d"])
                 except:
                     pass

@@ -4523,7 +4547,7 @@ class Up2k(object):

         is_xvol = svn.realpath != dvn.realpath

-        bos.makedirs(os.path.dirname(dabs))
+        bos.makedirs(os.path.dirname(dabs), dvn.flags["chmod_d"])

         if is_dirlink:
             dlabs = absreal(sabs)
@@ -5030,8 +5054,13 @@ class Up2k(object):
         else:
             dip = self.hub.iphash.s(job["addr"])

-        suffix = "-%.6f-%s" % (job["t0"], dip)
-        f, job["tnam"] = ren_open(tnam, "wb", fdir=pdir, suffix=suffix)
+        f, job["tnam"] = ren_open(
+            tnam,
+            "wb",
+            fdir=pdir,
+            suffix="-%.6f-%s" % (job["t0"], dip),
+            chmod=vf.get("chmod_f", -1),
+        )
         try:
             abspath = djoin(pdir, job["tnam"])
             sprs = job["sprs"]
@@ -105,6 +105,7 @@ def _ens(want: str) -> tuple[int, ...]:
 # WSAENOTSOCK - no longer a socket
 # EUNATCH - can't assign requested address (wifi down)
 E_SCK = _ens("ENOTCONN EUNATCH EBADF WSAENOTSOCK WSAECONNRESET")
+E_SCK_WR = _ens("EPIPE ESHUTDOWN EBADFD")
 E_ADDR_NOT_AVAIL = _ens("EADDRNOTAVAIL WSAEADDRNOTAVAIL")
 E_ADDR_IN_USE = _ens("EADDRINUSE WSAEADDRINUSE")
 E_ACCESS = _ens("EACCES WSAEACCES")
@@ -153,6 +154,14 @@ try:
 except:
     HAVE_PSUTIL = False

+try:
+    if os.environ.get("PRTY_NO_MAGIC"):
+        raise Exception()
+
+    import magic
+except:
+    pass
+
 if True:  # pylint: disable=using-constant-test
     import types
     from collections.abc import Callable, Iterable
@@ -175,8 +184,6 @@ if True:  # pylint: disable=using-constant-test


 if TYPE_CHECKING:
-    import magic
-
     from .authsrv import VFS
     from .broker_util import BrokerCli
     from .up2k import Up2k
@@ -1256,8 +1263,6 @@ class Magician(object):
         self.magic: Optional["magic.Magic"] = None

     def ext(self, fpath: str) -> str:
-        import magic
-
         try:
             if self.bad_magic:
                 raise Exception()
@@ -1580,6 +1585,7 @@ def ren_open(fname: str, *args: Any, **kwargs: Any) -> tuple[typing.IO[Any], str
     fun = kwargs.pop("fun", open)
     fdir = kwargs.pop("fdir", None)
     suffix = kwargs.pop("suffix", None)
+    chmod = kwargs.pop("chmod", -1)

     if fname == os.devnull:
         return fun(fname, *args, **kwargs), fname
@@ -1623,6 +1629,11 @@ def ren_open(fname: str, *args: Any, **kwargs: Any) -> tuple[typing.IO[Any], str
                 fp2 = os.path.join(fdir, fp2)
                 with open(fsenc(fp2), "wb") as f2:
                     f2.write(orig_name.encode("utf-8"))
+                    if chmod >= 0:
+                        os.fchmod(f2.fileno(), chmod)
+
+            if chmod >= 0:
+                os.fchmod(f.fileno(), chmod)

             return f, fname

@@ -1963,7 +1974,7 @@ def rand_name(fdir: str, fn: str, rnd: int) -> str:
     return fn


-def gen_filekey(alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str:
+def _gen_filekey(alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str:
     if alg == 1:
         zs = "%s %s %s %s" % (salt, fspath, fsize, inode)
     else:
@@ -1973,6 +1984,13 @@ def gen_filekey(alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str
     return ub64enc(hashlib.sha512(zb).digest()).decode("ascii")


+def _gen_filekey_w(alg: int, salt: str, fspath: str, fsize: int, inode: int) -> str:
+    return _gen_filekey(alg, salt, fspath.replace("/", "\\"), fsize, inode)
+
+
+gen_filekey = _gen_filekey_w if ANYWIN else _gen_filekey
+
+
 def gen_filekey_dbg(
     alg: int,
     salt: str,
@@ -2396,11 +2414,11 @@ def pathmod(
|
|||||||
|
|
||||||
# try to map abspath to vpath
|
# try to map abspath to vpath
|
||||||
np = np.replace("/", os.sep)
|
np = np.replace("/", os.sep)
|
||||||
for vn_ap, vn in vfs.all_aps:
|
for vn_ap, vns in vfs.all_aps:
|
||||||
if not np.startswith(vn_ap):
|
if not np.startswith(vn_ap):
|
||||||
continue
|
continue
|
||||||
zs = np[len(vn_ap) :].replace(os.sep, "/")
|
zs = np[len(vn_ap) :].replace(os.sep, "/")
|
||||||
nvp = vjoin(vn.vpath, zs)
|
nvp = vjoin(vns[0].vpath, zs)
|
||||||
break
|
break
|
||||||
|
|
||||||
if nvp == "\n":
|
if nvp == "\n":
|
||||||
@@ -3152,11 +3170,13 @@ def unescape_cookie(orig: str) -> str:
     return "".join(ret)
 
 
-def guess_mime(url: str, fallback: str = "application/octet-stream") -> str:
+def guess_mime(
+    url: str, path: str = "", fallback: str = "application/octet-stream"
+) -> str:
     try:
         ext = url.rsplit(".", 1)[1].lower()
     except:
-        return fallback
+        ext = ""
 
     ret = MIMES.get(ext)
 
@@ -3164,6 +3184,16 @@ def guess_mime(url: str, fallback: str = "application/octet-stream") -> str
         x = mimetypes.guess_type(url)
         ret = "application/{}".format(x[1]) if x[1] else x[0]
 
+    if not ret and path:
+        try:
+            with open(fsenc(path), "rb", 0) as f:
+                ret = magic.from_buffer(f.read(4096), mime=True)
+                if ret.startswith("text/htm"):
+                    # avoid serving up HTML content unless there was actually a .html extension
+                    ret = "text/plain"
+        except Exception as ex:
+            pass
+
     if not ret:
         ret = fallback
 
@@ -592,9 +592,7 @@ window.baguetteBox = (function () {
         preloadPrev(currentIndex);
     });
 
-    clmod(ebi('bbox-btns'), 'off');
-    clmod(btnPrev, 'off');
-    clmod(btnNext, 'off');
+    show_buttons(0);
 
     updateOffset();
     overlay.style.display = 'block';
@@ -776,6 +774,8 @@ window.baguetteBox = (function () {
     if (is_vid) {
         image.volume = clamp(fcfg_get('vol', dvol / 100), 0, 1);
         image.setAttribute('controls', 'controls');
+        image.setAttribute('playsinline', '1');
+        // ios ignores poster
         image.onended = vidEnd;
         image.onplay = function () { show_buttons(1); };
         image.onpause = function () { show_buttons(); };
@@ -4,6 +4,8 @@
 	--grid-sz: 10em;
 	--grid-ln: 3;
 	--nav-sz: 16em;
+	--sbw: 0.5em;
+	--sbh: 0.5em;
 
 	--fg: #ccc;
 	--fg-max: #fff;
@@ -1558,8 +1560,8 @@ html {
 	z-index: 1;
 	position: fixed;
 	background: var(--tree-bg);
-	left: -.98em;
-	width: calc(var(--nav-sz) - 0.5em);
+	left: -.96em;
+	width: calc(.3em + var(--nav-sz) - var(--sbw));
 	border-bottom: 1px solid var(--bg-u5);
 	overflow: hidden;
 }
@@ -1825,10 +1827,11 @@ html.y #tree.nowrap .ntree a+a:hover {
 	line-height: 2.3em;
 	margin-bottom: 1.5em;
 }
+#hdoc,
 #ghead {
 	position: sticky;
 	top: -.3em;
-	z-index: 1;
+	z-index: 2;
 }
 .ghead .btn {
 	position: relative;
@@ -1838,6 +1841,13 @@ html.y #tree.nowrap .ntree a+a:hover {
 	white-space: pre;
 	padding-left: .3em;
 }
+#tailbtns {
+	display: none;
+}
+#taildoc.on+#tailbtns {
+	display: inherit;
+	display: unset;
+}
 #op_unpost {
 	padding: 1em;
 }
@@ -1934,6 +1944,9 @@ html.y #tree.nowrap .ntree a+a:hover {
 	padding: 1em 0 1em 0;
 	border-radius: .3em;
 }
+#doc.wrap {
+	white-space: pre-wrap;
+}
 html.y #doc {
 	box-shadow: 0 0 .3em var(--bg-u5);
 	background: #f7f7f7;
@@ -2022,6 +2035,9 @@ a.btn,
 	font-family: 'scp', monospace, monospace;
 	font-family: var(--font-mono), 'scp', monospace, monospace;
 }
+#hkhelp b {
+	text-shadow: 1px 0 0 var(--fg), -1px 0 0 var(--fg), 0 -1px 0 var(--fg);
+}
 html.noscroll,
 html.noscroll .sbar {
 	scrollbar-width: none;
@@ -2183,18 +2199,25 @@ html.y #bbox-overlay figcaption a {
 	top: calc(50% - 30px);
 	width: 44px;
 	height: 60px;
+	transition: background-color .3s ease, color .3s ease, left .3s ease, right .3s ease;
+}
+#bbox-btns button {
+	transition: background-color .3s ease, color .3s ease;
+}
+#bbox-btns {
+	transition: top .3s ease;
 }
 .bbox-btn {
 	position: fixed;
 }
-.bbox-btn,
-#bbox-btns {
-	opacity: 1;
-	animation: opacity .2s infinite ease-in-out;
+#bbox-next.off {
+	right: -2.6em;
+}
+#bbox-prev.off {
+	left: -2.6em;
 }
-.bbox-btn.off,
 #bbox-btns.off {
-	opacity: 0;
+	top: -2.2em;
 }
 #bbox-overlay button {
 	cursor: pointer;
@@ -2205,8 +2228,6 @@ html.y #bbox-overlay figcaption a {
 	border-radius: 15%;
 	background: rgba(50, 50, 50, 0.5);
 	color: rgba(255,255,255,0.7);
-	transition: background-color .3s ease;
-	transition: color .3s ease;
 	font-size: 1.4em;
 	line-height: 1.4em;
 	vertical-align: top;
@@ -3055,7 +3076,8 @@ html.b .ntree a {
 	padding: .6em .2em;
 }
 html.b #treepar {
-	margin-left: .62em;
+	margin-left: .63em;
+	width: calc(.1em + var(--nav-sz) - var(--sbw));
 	border-bottom: .2em solid var(--f-h-b1);
 }
 html.b #wrap {
@@ -3227,7 +3249,7 @@ html.d #treepar {
 
 	#ggrid>a>span {
 		text-align: center;
-		padding: 0.2em;
+		padding: .2em .2em .15em .2em;
 	}
 }
 
@@ -3250,4 +3272,9 @@ html.d #treepar {
 	.dropdesc>div>div {
 		transition: none;
 	}
+	#bbox-next,
+	#bbox-prev,
+	#bbox-btns {
+		transition: background-color .3s ease, color .3s ease;
+	}
 }
@@ -35,7 +35,7 @@ var Ls = {
 		"file-manager",
 		["G", "toggle list / grid view"],
 		["T", "toggle thumbnails / icons"],
-		["🡅 A/D", "thumbnail size"],
+		["⇧ A/D", "thumbnail size"],
 		["ctrl-K", "delete selected"],
 		["ctrl-X", "cut selection to clipboard"],
 		["ctrl-C", "copy selection to clipboard"],
@@ -45,9 +45,9 @@ var Ls = {
 
 		"file-list-sel",
 		["space", "toggle file selection"],
-		["🡑/🡓", "move selection cursor"],
-		["ctrl 🡑/🡓", "move cursor and viewport"],
-		["🡅 🡑/🡓", "select prev/next file"],
+		["↑/↓", "move selection cursor"],
+		["ctrl ↑/↓", "move cursor and viewport"],
+		["⇧ ↑/↓", "select prev/next file"],
 		["ctrl-A", "select all files / folders"],
 	], [
 		"navigation",
@@ -70,7 +70,7 @@ var Ls = {
 		["Home/End", "first/last pic"],
 		["F", "fullscreen"],
 		["R", "rotate clockwise"],
-		["🡅 R", "rotate ccw"],
+		["⇧ R", "rotate ccw"],
 		["S", "select pic"],
 		["Y", "download pic"],
 	], [
@@ -226,6 +226,7 @@ var Ls = {
 		"ct_csel": 'use CTRL and SHIFT for file selection in grid-view">sel',
 		"ct_ihop": 'when the image viewer is closed, scroll down to the last viewed file">g⮯',
 		"ct_dots": 'show hidden files (if server permits)">dotfiles',
+		"ct_qdel": 'when deleting files, only ask for confirmation once">qdel',
 		"ct_dir1st": 'sort folders before files">📁 first',
 		"ct_nsort": 'natural sort (for filenames with leading digits)">nsort',
 		"ct_readme": 'show README.md in folder listings">📜 readme',
@@ -249,6 +250,8 @@ var Ls = {
 
 		"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$Nmakes https 30% faster, http 4.5x faster\">mt",
 
+		"cut_wasm": "use wasm instead of the browser's built-in hasher; improves speed on chrome-based browsers but increases CPU load, and many older versions of chrome have bugs which makes the browser consume all RAM and crash if this is enabled\">wasm",
+
 		"cft_text": "favicon text (blank and refresh to disable)",
 		"cft_fg": "foreground color",
 		"cft_bg": "background color",
@@ -335,6 +338,7 @@ var Ls = {
 		"f_empty": 'this folder is empty',
 		"f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
 		"f_bigtxt": "this file is {0} MiB large -- really view as text?",
+		"f_bigtxt2": "view just the end of the file instead? this will also enable following/tailing, showing newly added lines of text in real time",
 		"fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
 		"fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
 		"f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",
@@ -439,6 +443,11 @@ var Ls = {
 		"tvt_next": "show next document$NHotkey: K\">⬇ next",
 		"tvt_sel": "select file ( for cut / copy / delete / ... )$NHotkey: S\">sel",
 		"tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
+		"tvt_tail": "monitor file for changes; show new lines in real time\">📡 follow",
+		"tvt_wrap": "word-wrap\">↵",
+		"tvt_atail": "lock scroll to bottom of page\">⚓",
+		"tvt_ctail": "decode terminal colors (ansi escape codes)\">🌈",
+		"tvt_ntail": "scrollback limit (how many bytes of text to keep loaded)",
 
 		"m3u_add1": "song added to m3u playlist",
 		"m3u_addn": "{0} songs added to m3u playlist",
@@ -538,6 +547,7 @@ var Ls = {
 		"u_https3": "for better performance",
 		"u_ancient": 'your browser is impressively ancient -- maybe you should <a href="#" onclick="goto(\'bup\')">use bup instead</a>',
 		"u_nowork": "need firefox 53+ or chrome 57+ or iOS 11+",
+		"tail_2old": "need firefox 105+ or chrome 71+ or iOS 14.5+",
 		"u_nodrop": 'your browser is too old for drag-and-drop uploading',
 		"u_notdir": "that's not a folder!\n\nyour browser is too old,\nplease try dragdrop instead",
 		"u_uri": "to dragdrop images from other browser windows,\nplease drop it onto the big upload button",
@@ -649,7 +659,7 @@ var Ls = {
 		"filbehandler",
 		["G", "listevisning eller ikoner"],
 		["T", "miniatyrbilder på/av"],
-		["🡅 A/D", "ikonstørrelse"],
+		["⇧ A/D", "ikonstørrelse"],
 		["ctrl-K", "slett valgte"],
 		["ctrl-X", "klipp ut valgte"],
 		["ctrl-C", "kopiér til utklippstavle"],
@@ -659,9 +669,9 @@ var Ls = {
 
 		"filmarkering",
 		["space", "marker fil"],
-		["🡑/🡓", "flytt markør"],
-		["ctrl 🡑/🡓", "flytt markør og scroll"],
-		["🡅 🡑/🡓", "velg forr./neste fil"],
+		["↑/↓", "flytt markør"],
+		["ctrl ↑/↓", "flytt markør og scroll"],
+		["⇧ ↑/↓", "velg forr./neste fil"],
 		["ctrl-A", "velg alle filer / mapper"],
 	], [
 		"navigering",
@@ -684,7 +694,7 @@ var Ls = {
 		["Home/End", "første/siste bilde"],
 		["F", "fullskjermvisning"],
 		["R", "rotere mot høyre"],
-		["🡅 R", "rotere mot venstre"],
+		["⇧ R", "rotere mot venstre"],
 		["S", "marker bilde"],
 		["Y", "last ned bilde"],
 	], [
@@ -841,6 +851,7 @@ var Ls = {
 		"ct_csel": 'bruk tastene CTRL og SHIFT for markering av filer i ikonvisning">merk',
 		"ct_ihop": 'bla ned til sist viste bilde når bildeviseren lukkes">g⮯',
 		"ct_dots": 'vis skjulte filer (gitt at serveren tillater det)">.synlig',
+		"ct_qdel": 'sletteknappen spør bare én gang om bekreftelse">hurtig🗑️',
 		"ct_dir1st": 'sorter slik at mapper kommer foran filer">📁 først',
 		"ct_nsort": 'naturlig sortering (forstår tall i filnavn)">nsort',
 		"ct_readme": 'vis README.md nedenfor filene">📜 readme',
@@ -864,6 +875,8 @@ var Ls = {
 
 		"cut_mt": "raskere befaring ved å bruke hele CPU'en$N$Ndenne funksjonen anvender web-workers$Nog krever mer RAM (opptil 512 MiB ekstra)$N$Ngjør https 30% raskere, http 4.5x raskere\">mt",
 
+		"cut_wasm": "bruk wasm istedenfor nettleserens sha512-funksjon; gir bedre ytelse på chrome-baserte nettlesere, men bruker mere CPU, og eldre versjoner av chrome tåler det ikke (spiser opp all RAM og krasjer)\">wasm",
+
 		"cft_text": "ikontekst (blank ut og last siden på nytt for å deaktivere)",
 		"cft_fg": "farge",
 		"cft_bg": "bakgrunnsfarge",
@@ -950,6 +963,7 @@ var Ls = {
 		"f_empty": 'denne mappen er tom',
 		"f_chide": 'dette vil skjule kolonnen «{0}»\n\nfanen for "andre innstillinger" lar deg vise kolonnen igjen',
 		"f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
+		"f_bigtxt2": "vil du se bunnen av filen istedenfor? du vil da også se nye linjer som blir lagt til på slutten av filen i sanntid",
 		"fbd_more": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_more">vis {2}</a> eller <a href="#" id="bd_all">vis alle</a></div>',
 		"fbd_all": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_all">vis alle</a></div>',
 		"f_anota": "kun {0} av totalt {1} elementer ble markert;\nfor å velge alt må du bla til bunnen av mappen først",
@@ -1054,6 +1068,11 @@ var Ls = {
 		"tvt_next": "vis neste dokument$NSnarvei: K\">⬇ neste",
 		"tvt_sel": "markér filen ( for utklipp / sletting / ... )$NSnarvei: S\">merk",
 		"tvt_edit": "redigér filen$NSnarvei: E\">✏️ endre",
+		"tvt_tail": "overvåk filen for endringer og vis nye linjer i sanntid\">📡 følg",
+		"tvt_wrap": "tekstbryting\">↵",
+		"tvt_atail": "hold de nyeste linjene synlig (lås til bunnen av siden)\">⚓",
+		"tvt_ctail": "forstå og vis terminalfarger (ansi-sekvenser)\">🌈",
+		"tvt_ntail": "maks-grense for antall bokstaver som skal vises i vinduet",
 
 		"m3u_add1": "sangen ble lagt til i m3u-spillelisten",
 		"m3u_addn": "{0} sanger ble lagt til i m3u-spillelisten",
@@ -1153,6 +1172,7 @@ var Ls = {
 		"u_https3": "for høyere hastighet",
 		"u_ancient": 'nettleseren din er prehistorisk -- mulig du burde <a href="#" onclick="goto(\'bup\')">bruke bup istedenfor</a>',
 		"u_nowork": "krever firefox 53+, chrome 57+, eller iOS 11+",
+		"tail_2old": "krever firefox 105+, chrome 71+, eller iOS 14.5+",
 		"u_nodrop": 'nettleseren din er for gammel til å laste opp filer ved å dra dem inn i vinduet',
 		"u_notdir": "mottok ikke mappen!\n\nnettleseren din er for gammel,\nprøv å dra mappen inn i vinduet istedenfor",
 		"u_uri": "for å laste opp bilder ifra andre nettleservinduer,\nslipp bildet rett på den store last-opp-knappen",
@@ -1265,7 +1285,7 @@ var Ls = {
 		"file-manager",
 		["G", "切换列表 / 网格视图"],
 		["T", "切换缩略图 / 图标"],
-		["🡅 A/D", "缩略图大小"],
+		["⇧ A/D", "缩略图大小"],
 		["ctrl-K", "删除选中项"],
 		["ctrl-X", "剪切选中项"],
 		["ctrl-C", "复制选中项"], //m
@@ -1275,9 +1295,9 @@ var Ls = {
 
 		"file-list-sel",
 		["space", "切换文件选择"],
-		["🡑/🡓", "移动选择光标"],
-		["ctrl 🡑/🡓", "移动光标和视图"],
-		["🡅 🡑/🡓", "选择上一个/下一个文件"],
+		["↑/↓", "移动选择光标"],
+		["ctrl ↑/↓", "移动光标和视图"],
+		["⇧ ↑/↓", "选择上一个/下一个文件"],
 		["ctrl-A", "选择所有文件 / 文件夹"]
 	], [
 		"navigation",
@@ -1300,7 +1320,7 @@ var Ls = {
 		["Home/End", "第一张/最后一张图片"],
 		["F", "全屏"],
 		["R", "顺时针旋转"],
-		["🡅 R", "逆时针旋转"],
+		["⇧ R", "逆时针旋转"],
 		["S", "选择图片"], //m
 		["Y", "下载图片"]
 	], [
@@ -1456,6 +1476,7 @@ var Ls = {
 		"ct_csel": '在网格视图中使用 CTRL 和 SHIFT 进行文件选择">CTRL',
 		"ct_ihop": '当图像查看器关闭时,滚动到最后查看的文件">滚动',
 		"ct_dots": '显示隐藏文件(如果服务器允许)">隐藏文件',
+		"ct_qdel": '删除文件时,只需确认一次">快删', //m
 		"ct_dir1st": '在文件之前排序文件夹">📁 排序',
 		"ct_nsort": '正确排序以数字开头的文件名">数字排序', //m
 		"ct_readme": '在文件夹列表中显示 README.md">📜 readme',
@@ -1479,6 +1500,8 @@ var Ls = {
 
 		"cut_mt": "使用多线程加速文件哈希$N$N这使用 Web Worker 并且需要更多内存(额外最多 512 MiB)$N$N这使得 https 快 30%,http 快 4.5 倍\">mt",
 
+		"cut_wasm": "使用基于 WASM 的哈希计算器代替浏览器内置的哈希功能;这可以提升在基于 Chrome 的浏览器上的速度,但会增加 CPU 使用率,而且许多旧版本的 Chrome 存在漏洞,启用此功能会导致浏览器占用所有内存并崩溃。\">wasm", //m
+
 		"cft_text": "网站图标文本(为空并刷新以禁用)",
 		"cft_fg": "前景色",
 		"cft_bg": "背景色",
@@ -1565,6 +1588,7 @@ var Ls = {
 		"f_empty": '该文件夹为空',
 		"f_chide": '隐藏列 «{0}»\n\n你可以在设置选项卡中重新显示列',
 		"f_bigtxt": "这个文件大小为 {0} MiB -- 真的以文本形式查看?",
+		"f_bigtxt2": " 你想查看文件的结尾部分吗?这也将启用实时跟踪功能,能够实时显示新添加的文本行。", //m
 		"fbd_more": '<div id="blazy">显示 <code>{0}</code> 个文件中的 <code>{1}</code> 个;<a href="#" id="bd_more">显示 {2}</a> 或 <a href="#" id="bd_all">显示全部</a></div>',
 		"fbd_all": '<div id="blazy">显示 <code>{0}</code> 个文件中的 <code>{1}</code> 个;<a href="#" id="bd_all">显示全部</a></div>',
 		"f_anota": "仅选择了 {0} 个项目,共 {1} 个;\n要选择整个文件夹,请先滚动到底部", //m
@@ -1669,6 +1693,11 @@ var Ls = {
 		"tvt_next": "显示下一个文档$N快捷键: K\">⬇ 下一个",
 		"tvt_sel": "选择文件 (用于剪切/删除/...)$N快捷键: S\">选择",
 		"tvt_edit": "在文本编辑器中打开文件$N快捷键: E\">✏️ 编辑",
+		"tvt_tail": "监视文件更改,并实时显示新增的行\">📡 跟踪", //m
+		"tvt_wrap": "自动换行\">↵", //m
+		"tvt_atail": "锁定到底部,显示最新内容\">⚓", //m
+		"tvt_ctail": "解析终端颜色(ANSI 转义码)\">🌈", //m
+		"tvt_ntail": "滚动历史上限(保留多少字节的文本)", //m
 
 		"m3u_add1": "歌曲已添加到 m3u 播放列表", //m
 		"m3u_addn": "已添加 {0} 首歌曲到 m3u 播放列表", //m
@@ -1768,6 +1797,7 @@ var Ls = {
 		"u_https3": "以获得更好的性能",
 		"u_ancient": '你的浏览器非常古老 -- 也许你应该 <a href="#" onclick="goto(\'bup\')">改用 bup</a>',
 		"u_nowork": "需要 Firefox 53+ 或 Chrome 57+ 或 iOS 11+",
+		"tail_2old": "需要 Firefox 105+ 或 Chrome 71+ 或 iOS 14.5+",
 		"u_nodrop": '浏览器版本低,不支持通过拖动文件到窗口来上传文件',
 		"u_notdir": "不是文件夹!\n\n您的浏览器太旧;\n请尝试将文件夹拖入窗口",
 		"u_uri": "要从其他浏览器窗口拖放图片,\n请将其拖放到大的上传按钮上",
@@ -2063,6 +2093,7 @@ ebi('op_cfg').innerHTML = (
 	' <a id="csel" class="tgl btn" href="#" tt="' + L.ct_csel + '</a>\n' +
 	' <a id="ihop" class="tgl btn" href="#" tt="' + L.ct_ihop + '</a>\n' +
 	' <a id="dotfiles" class="tgl btn" href="#" tt="' + L.ct_dots + '</a>\n' +
+	' <a id="qdel" class="tgl btn" href="#" tt="' + L.ct_qdel + '</a>\n' +
 	' <a id="dir1st" class="tgl btn" href="#" tt="' + L.ct_dir1st + '</a>\n' +
 	' <a id="nsort" class="tgl btn" href="#" tt="' + L.ct_nsort + '</a>\n' +
 	' <a id="ireadme" class="tgl btn" href="#" tt="' + L.ct_readme + '</a>\n' +
@@ -2090,6 +2121,7 @@ ebi('op_cfg').innerHTML = (
 	' <a id="u2ts" class="tgl btn" href="#" tt="' + L.ut_u2ts + '</a>\n' +
 	' <a id="umod" class="tgl btn" href="#" tt="' + L.cut_umod + '</a>\n' +
 	' <a id="hashw" class="tgl btn" href="#" tt="' + L.cut_mt + '</a>\n' +
+	' <a id="nosubtle" class="tgl btn" href="#" tt="' + L.cut_wasm + '</a>\n' +
 	' <a id="u2turbo" class="tgl btn ttb" href="#" tt="' + L.cut_turbo + '</a>\n' +
 	' <a id="u2tdate" class="tgl btn ttb" href="#" tt="' + L.cut_datechk + '</a>\n' +
 	' <input type="text" id="u2szg" value="" ' + NOAC + ' style="width:3em" tt="' + L.cut_u2sz + '" />' +
@@ -2230,14 +2262,17 @@ SPINNER = m[0];
 
 
 var SBW, SBH; // scrollbar size
-(function () {
+function read_sbw() {
 	var el = mknod('div');
-	el.style.cssText = 'overflow:scroll;width:100px;height:100px';
+	el.style.cssText = 'overflow:scroll;width:100px;height:100px;position:absolute;top:0;left:0';
 	document.body.appendChild(el);
 	SBW = el.offsetWidth - el.clientWidth;
 	SBH = el.offsetHeight - el.clientHeight;
 	document.body.removeChild(el);
-})();
+	setcvar('--sbw', SBW + 'px');
+	setcvar('--sbh', SBH + 'px');
+}
+onresize100.add(read_sbw, true);
 
 
 var have_webp = sread('have_webp');
@@ -2959,6 +2994,9 @@ var widget = (function () {
 			ebi('bplay').innerHTML = paused ? '▶' : '⏸';
 		}
 	};
+	r.setvis = function () {
+		widget.style.display = !has(perms, "read") || showfile.abrt ? 'none' : '';
+	};
 	wtico.onclick = function (e) {
 		if (!touchmode)
 			r.toggle(e);
@@ -3649,7 +3687,7 @@ var mpui = (function () {
 		var oi = mp.order.indexOf(mp.au.tid) + 1,
 			evp = get_evpath();
 
-		if (mpl.pb_mode == 'loop' || mp.au.evp != evp)
+		if (mpl.pb_mode == 'loop' || mp.au.evp != evp || ebi('unsearch'))
 			oi = 0;
 
 		if (oi >= mp.order.length) {
@@ -5419,7 +5457,16 @@ var fileman = (function () {
 			deleter();
 		}
 
+		var asks = r.qdel ? 1 : 2;
+		if (dqdel === 0)
+			asks -= 1;
+
+		if (!asks)
+			return deleter();
+
 		modal.confirm('<h6 style="color:#900">' + L.danger + '</h6>\n<b>' + L.fd_warn1.format(vps.length) + '</b><ul>' + uricom_adec(vps, true).join('') + '</ul>', function () {
+			if (asks === 1)
+				return deleter();
 			modal.confirm(L.fd_warn2, deleter, null);
 		}, null);
 	};
@@ -5780,6 +5827,8 @@ var fileman = (function () {
 		r.bus.onmessage();
 	};
 
+	bcfg_bind(r, 'qdel', 'qdel', dqdel == 1);
+
 	bren.onclick = r.rename;
 	bdel.onclick = r.delete;
 	bcut.onclick = r.cut;
@@ -5792,7 +5841,9 @@ var fileman = (function () {
 
 
 var showfile = (function () {
-    var r = {};
+    var r = {
+        'nrend': 0,
+    };
     r.map = {
         '.ahk': 'autohotkey',
         '.bas': 'basic',
@@ -5905,16 +5956,77 @@ var showfile = (function () {
         }
         r.mktree();
         if (em) {
-            render(em);
+            if (r.taildoc)
+                r.show(em[0], true);
+            else
+                render(em);
             em = null;
         }
     };
 
+    r.tail = function (url, no_push) {
+        r.abrt = new AbortController();
+        widget.setvis();
+        render([url, '', ''], no_push);
+        var me = r.tail_id = Date.now(),
+            wfp = ebi('wfp'),
+            edoc = ebi('doc'),
+            txt = '';
+
+        url = addq(url, 'tail=-' + r.tailnb);
+        fetch(url, {'signal': r.abrt.signal}).then(function(rsp) {
+            var ro = rsp.body.pipeThrough(
+                new TextDecoderStream('utf-8', {'fatal': false}),
+                {'signal': r.abrt.signal}).getReader();
+
+            var rf = function() {
+                ro.read().then(function(v) {
+                    if (r.tail_id != me)
+                        return;
+                    var vt = v.done ? '\n*** lost connection to copyparty ***' : v.value;
+                    if (vt == '\x00')
+                        return rf();
+                    txt += vt;
+                    var ofs = txt.length - r.tailnb;
+                    if (ofs > 0) {
+                        var ofs2 = txt.indexOf('\n', ofs);
+                        if (ofs2 >= ofs && ofs - ofs2 < 512)
+                            ofs = ofs2;
+                        txt = txt.slice(ofs);
+                    }
+                    var html = esc(txt);
+                    if (r.tailansi)
+                        html = r.ansify(html);
+                    edoc.innerHTML = html;
+                    if (r.tail2end)
+                        window.scrollTo(0, wfp.offsetTop - window.innerHeight);
+                    if (!v.done)
+                        rf();
+                });
+            };
+            if (r.tail_id == me)
+                rf();
+        });
+    };
+
+    r.untail = function () {
+        if (!r.abrt)
+            return;
+        r.abrt.abort();
+        r.abrt = null;
+        r.tail_id = -1;
+        widget.setvis();
+    };
+
     r.show = function (url, no_push) {
+        r.untail();
         var xhr = new XHR(),
             m = /[?&](k=[^&#]+)/.exec(url);
 
         url = url.split('?')[0] + (m ? '?' + m[1] : '');
+        if (r.taildoc)
+            return r.tail(url, no_push);
+
         xhr.url = url;
         xhr.fname = uricom_dec(url.split('/').pop());
         xhr.no_push = no_push;
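The tail view in `r.tail` keeps its buffer bounded by trimming to roughly the last `tailnb` characters, preferring to cut at a nearby newline so the first visible line is never a partial one. A standalone sketch of that trim (the function name `trim_tail` is hypothetical; the original checks `ofs - ofs2 < 512`, which this sketch writes as the presumably-intended `ofs2 - ofs < 512`, and like the original it keeps the newline itself):

```javascript
// standalone sketch of the buffer-trim idea in r.tail; hypothetical name.
// keeps roughly the last `max` characters of `txt`, snapping forward to a
// newline boundary when one is close so the view never starts mid-line
function trim_tail(txt, max) {
    var ofs = txt.length - max;
    if (ofs <= 0)
        return txt; // still within budget; keep everything

    var ofs2 = txt.indexOf('\n', ofs);
    if (ofs2 >= ofs && ofs2 - ofs < 512)
        ofs = ofs2; // drop the partial line; the newline itself survives

    return txt.slice(ofs);
}
```

Cutting at a newline costs at most 512 extra characters of trimming, which is invisible next to the default 131072-character budget.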
@@ -5954,7 +6066,8 @@ var showfile = (function () {
 
     function render(doc, no_push) {
         r.q = null;
-        var url = doc[0],
+        r.nrend++;
+        var url = r.url = doc[0],
             lnh = doc[1],
             txt = doc[2],
             name = url.split('?')[0].split('/').pop(),
@@ -5968,9 +6081,13 @@ var showfile = (function () {
         ebi('editdoc').style.display = (has(perms, 'write') && (is_md || has(perms, 'delete'))) ? '' : 'none';
 
         var wr = ebi('bdoc'),
+            nrend = r.nrend,
             defer = !Prism.highlightElement;
 
         var fun = function (el) {
+            if (r.nrend != nrend)
+                return;
+
             try {
                 if (lnh.slice(0, 5) == '#doc.')
                     sethash(lnh.slice(1));
@@ -5978,13 +6095,16 @@ var showfile = (function () {
                 el = el || QS('#doc>code');
                 Prism.highlightElement(el);
                 if (el.className == 'language-ans' || (!lang && /\x1b\[[0-9;]{0,16}m/.exec(txt.slice(0, 4096))))
-                    r.ansify(el);
+                    el.innerHTML = r.ansify(el.innerHTML);
             }
             catch (ex) { }
         }
 
-        if (txt.length > 1024 * 256)
+        var skip_prism = !txt || txt.length > 1024 * 256;
+        if (skip_prism) {
             fun = function (el) { };
+            is_md = false;
+        }
 
         qsr('#doc');
         var el = mknod('pre', 'doc');
@@ -5996,7 +6116,7 @@ var showfile = (function () {
         else {
             el.textContent = txt;
             el.innerHTML = '<code>' + el.innerHTML + '</code>';
-            if (!window.no_prism) {
+            if (!window.no_prism && !skip_prism) {
                 if ((lang == 'conf' || lang == 'cfg') && ('\n' + txt).indexOf('\n# -*- mode: yaml -*-') + 1)
                     lang = 'yaml';
 
@@ -6006,6 +6126,8 @@ var showfile = (function () {
             else
                 import_js(SR + '/.cpr/deps/prism.js', function () { fun(); });
         }
+        if (!txt && r.wrap)
+            el.className = 'wrap';
     }
 
     wr.appendChild(el);
@@ -6027,11 +6149,11 @@ var showfile = (function () {
         tree_scrollto();
     }
 
-    r.ansify = function (el) {
+    r.ansify = function (html) {
         var ctab = (light ?
             'bfbfbf d30253 497600 b96900 006fbb a50097 288276 2d2d2d 9f9f9f 943b55 3a5600 7f4f00 00507d 683794 004343 000000' :
             '404040 f03669 b8e346 ffa402 02a2ff f65be3 3da698 d2d2d2 606060 c75b79 c8e37e ffbe4a 71cbff b67fe3 9cf0ed ffffff').split(/ /g),
-            src = el.innerHTML.split(/\x1b\[/g),
+            src = html.split(/\x1b\[/g),
             out = ['<span>'], fg = 7, bg = null, bfg = 0, bbg = 0, inv = 0, bold = 0;
 
         for (var a = 0; a < src.length; a++) {
@@ -6084,7 +6206,7 @@ var showfile = (function () {
 
             out.push(s + '">' + txt);
         }
-        el.innerHTML = out.join('');
+        return out.join('');
     };
 
     r.mktree = function () {
@@ -6131,6 +6253,18 @@ var showfile = (function () {
         msel.selui();
     };
 
+    r.tgltail = function () {
+        if (!window.TextDecoderStream) {
+            bcfg_set('taildoc', r.taildoc = false);
+            return toast.err(10, L.tail_2old);
+        }
+        r.show(r.url, true);
+    };
+
+    r.tglwrap = function () {
+        r.show(r.url, true);
+    };
+
     var bdoc = ebi('bdoc');
     bdoc.className = 'line-numbers';
     bdoc.innerHTML = (
@@ -6141,15 +6275,38 @@ var showfile = (function () {
         '<a href="#" class="btn" id="nextdoc" tt="' + L.tvt_next + '</a>\n' +
         '<a href="#" class="btn" id="seldoc" tt="' + L.tvt_sel + '</a>\n' +
         '<a href="#" class="btn" id="editdoc" tt="' + L.tvt_edit + '</a>\n' +
+        '<a href="#" class="btn tgl" id="taildoc" tt="' + L.tvt_tail + '</a>\n' +
+        '<div id="tailbtns">\n' +
+        '<a href="#" class="btn tgl" id="wrapdoc" tt="' + L.tvt_wrap + '</a>\n' +
+        '<a href="#" class="btn tgl" id="tail2end" tt="' + L.tvt_atail + '</a>\n' +
+        '<a href="#" class="btn tgl" id="tailansi" tt="' + L.tvt_ctail + '</a>\n' +
+        '<input type="text" id="tailnb" value="" ' + NOAC + ' style="width:4em" tt="' + L.tvt_ntail + '" />' +
+        '</div>\n' +
         '</div>'
     );
     ebi('xdoc').onclick = function () {
+        r.untail();
         thegrid.setvis(true);
     };
     ebi('dldoc').setAttribute('download', '');
     ebi('prevdoc').onclick = function () { tree_neigh(-1); };
     ebi('nextdoc').onclick = function () { tree_neigh(1); };
     ebi('seldoc').onclick = r.tglsel;
+    bcfg_bind(r, 'wrap', 'wrapdoc', true, r.tglwrap);
+    bcfg_bind(r, 'taildoc', 'taildoc', false, r.tgltail);
+    bcfg_bind(r, 'tail2end', 'tail2end', true);
+    bcfg_bind(r, 'tailansi', 'tailansi', false, r.tgltail);
+
+    r.tailnb = ebi('tailnb').value = icfg_get('tailnb', 131072);
+    ebi('tailnb').oninput = function (e) {
+        swrite('tailnb', r.tailnb = this.value);
+    };
+
+    if (/[?&]tail\b/.exec(sloc0)) {
+        clmod(ebi('taildoc'), 'on', 1);
+        r.taildoc = true;
+    }
+
     return r;
 })();
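The `r.ansify` rework above changes it from mutating an element in place to a pure string-to-string transform, so the tail loop can run it on the escaped text before assigning `innerHTML`. The function itself is only partly visible in this diff; the core idea (split on the CSI introducer, map SGR parameters to spans) can be sketched in a heavily simplified form, handling only the basic foreground colors and treating everything else as a reset:

```javascript
// simplified sketch of the ansify idea: split the escaped text on the CSI
// introducer (ESC-[), then turn SGR color codes 30-37 into <span> tags.
// copyparty's real r.ansify also handles background, bright, bold and
// inverse via a 16-color palette; this is only the skeleton
function ansify_min(html) {
    var src = html.split(/\x1b\[/g),
        out = [src[0]];
    for (var a = 1; a < src.length; a++) {
        var m = /^([0-9;]*)m/.exec(src[a]);
        if (!m) {
            out.push(src[a]); // not an SGR sequence; keep as-is
            continue;
        }
        var txt = src[a].slice(m[0].length),
            code = parseInt(m[1] || '0');
        if (code >= 30 && code <= 37)
            out.push('<span class="ansi-fg' + (code - 30) + '">' + txt);
        else
            out.push('</span>' + txt); // treat anything else as a reset
    }
    return out.join('');
}
```

Because the input is already HTML-escaped, the only markup the output can contain is what this function emits itself.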
@@ -6444,6 +6601,7 @@ var thegrid = (function () {
             ohref = esc(ao.getAttribute('href')),
             href = ohref.split('?')[0],
             ext = '',
+            ext0 = '',
             name = uricom_dec(vsplit(href)[1]),
             ref = ao.getAttribute('id'),
             isdir = href.endsWith('/'),
@@ -6456,17 +6614,19 @@ var thegrid = (function () {
             ar.shift();
 
             ar.reverse();
+            ext0 = ar[0];
             for (var b = 0; b < Math.min(2, ar.length); b++) {
                 if (ar[b].length > 7)
                     break;
 
-                ext = ar[b] + '.' + ext;
+                ext = ext ? (ar[b] + '.' + ext) : ar[b];
             }
-            ext = (ext || 'unk.').slice(0, -1);
+            if (!ext)
+                ext = 'unk';
         }
 
-        if (use_ext_th && ext_th[ext]) {
-            ihref = ext_th[ext];
+        if (use_ext_th && (ext_th[ext] || ext_th[ext0])) {
+            ihref = ext_th[ext] || ext_th[ext0];
         }
         else if (r.thumbs) {
             ihref = addq(ihref, 'th=' + (have_webp ? 'w' : 'j'));
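The grid-view change above derives both the compound extension (up to two short suffixes, e.g. `tar.gz`) and the innermost suffix (`gz`) so a custom thumbnail registered for either one matches. Extracted into a standalone helper (the name `grid_ext` is hypothetical):

```javascript
// sketch of the extension logic changed above: returns [ext, ext0] where
// ext is up to two short suffixes joined ("tar.gz", or "unk" if none)
// and ext0 is the innermost suffix alone ("gz")
function grid_ext(name) {
    var ar = name.split('.');
    ar.shift();      // drop the basename
    ar.reverse();    // innermost suffix first

    var ext = '', ext0 = ar[0] || '';
    for (var b = 0; b < Math.min(2, ar.length); b++) {
        if (ar[b].length > 7)
            break;   // suspiciously long; not a real extension

        ext = ext ? (ar[b] + '.' + ext) : ar[b];
    }
    return [ext || 'unk', ext0];
}
```

This is what makes a thumbnail configured for `.gz` apply to `.tar.gz` files as well: the `ext_th[ext] || ext_th[ext0]` lookup tries the compound form first, then falls back to the inner suffix.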
@@ -6740,8 +6900,10 @@ function hkhelp() {
         try {
             if (c[a].length != 2)
                 html.push('<tr><th colspan="2">' + esc(c[a]) + '</th></tr>');
-            else
+            else {
-                html.push('<tr><td>{0}</td><td>{1}</td></tr>'.format(c[a][0], c[a][1]));
+                var t1 = c[a][0].replace('⇧', '<b>⇧</b>');
+                html.push('<tr><td>{0}</td><td>{1}</td></tr>'.format(t1, c[a][1]));
+            }
         }
         catch (ex) {
             html.push(">>> " + c[a]);
@@ -7497,7 +7659,7 @@ var treectl = (function () {
     bcfg_bind(r, 'idxh', 'idxh', idxh, setidxh);
     bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
     bcfg_bind(r, 'csel', 'csel', dgsel);
-    bcfg_bind(r, 'dots', 'dotfiles', false, function (v) {
+    bcfg_bind(r, 'dots', 'dotfiles', see_dots, function (v) {
         r.goto();
         var xhr = new XHR();
         xhr.open('GET', SR + '/?setck=dots=' + (v ? 'y' : ''), true);
@@ -8573,7 +8735,7 @@ function apply_perms(res) {
     if (up2k)
         up2k.set_fsearch();
 
-    ebi('widget').style.display = have_read ? '' : 'none';
+    widget.setvis();
     thegrid.setvis();
     if (!have_read && have_write)
         goto('up2k');
@@ -10104,13 +10266,18 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
         fun = function () {
             showfile.show(href, tgt.getAttribute('lang'));
         },
+        tfun = function () {
+            bcfg_set('taildoc', showfile.taildoc = true);
+            fun();
+        },
         szs = ft2dict(a.closest('tr'))[0].sz,
         sz = parseInt(szs.replace(/[, ]/g, ''));
 
-    if (sz < 1024 * 1024)
+    if (sz < 1024 * 1024 || showfile.taildoc)
         fun();
     else
-        modal.confirm(L.f_bigtxt.format(f2f(sz / 1024 / 1024, 1)), fun, null);
+        modal.confirm(L.f_bigtxt.format(f2f(sz / 1024 / 1024, 1)), fun, function() {
+            modal.confirm(L.f_bigtxt2, tfun, null)});
 
     return ev(e);
 }
copyparty/web/idp.html (new file, 55 lines)
@@ -0,0 +1,55 @@
+<!DOCTYPE html>
+<html lang="en">
+
+<head>
+    <meta charset="utf-8">
+    <title>{{ s_doctitle }}</title>
+    <meta http-equiv="X-UA-Compatible" content="IE=edge">
+    <meta name="viewport" content="width=device-width, initial-scale=0.8">
+    <meta name="robots" content="noindex, nofollow">
+    <meta name="theme-color" content="#{{ tcolor }}">
+    <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/shares.css?_={{ ts }}">
+    <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
+    {{ html_head }}
+</head>
+
+<body>
+    <div id="wrap">
+        <a href="{{ r }}/?idp">refresh</a>
+        <a href="{{ r }}/?h">control-panel</a>
+
+        <table id="tab"><thead><tr>
+            <th>forget</th>
+            <th>user</th>
+            <th>groups</th>
+        </tr></thead><tbody>
+            {% for un, gn in rows %}
+            <tr>
+                <td><a href="{{ r }}/?idp=rm={{ un|e }}">forget</a></td>
+                <td>{{ un|e }}</td>
+                <td>{{ gn|e }}</td>
+            </tr>
+            {% endfor %}
+        </tbody></table>
+        {% if not rows %}
+        (there are no IdP users in the cache)
+        {% endif %}
+    </div>
+    <a href="#" id="repl">π</a>
+    <script>
+
+var SR="{{ r }}",
+    lang="{{ lang }}",
+    dfavico="{{ favico }}";
+
+var STG = window.localStorage;
+document.documentElement.className = (STG && STG.cpp_thm) || "{{ this.args.theme }}";
+
+    </script>
+    <script src="{{ r }}/.cpr/util.js?_={{ ts }}"></script>
+    {%- if js %}
+    <script src="{{ js }}_={{ ts }}"></script>
+    {%- endif %}
+</body>
+</html>
@@ -135,6 +135,10 @@
 
 <h1 id="cc">other stuff:</h1>
 <ul>
+    {%- if this.uname in this.args.idp_adm_set %}
+    <li><a id="ag" href="{{ r }}/?idp">view idp cache</a></li>
+    {% endif %}
+
     {%- if this.uname != '*' and this.args.shr %}
     <li><a id="y" href="{{ r }}/?shares">edit shares</a></li>
     {% endif %}
@@ -39,6 +39,7 @@ var Ls = {
         "ad1": "no304 stopper all bruk av cache. Hvis ikke k304 var nok, prøv denne. Vil mangedoble dataforbruk!",
         "ae1": "utgående:",
         "af1": "vis nylig opplastede filer",
+        "ag1": "vis kjente IdP-brukere",
     },
     "eng": {
         "d2": "shows the state of all active threads",
@@ -90,6 +91,7 @@ var Ls = {
         "ad1": "启用 no304 将禁用所有缓存;如果 k304 不够,可以尝试此选项。这将消耗大量的网络流量!", //m
         "ae1": "正在下载:", //m
         "af1": "显示最近上传的文件", //m
+        "ag1": "查看已知 IdP 用户", //m
     }
 };
@@ -1,6 +1,18 @@
 "use strict";
 
 
+(function () {
+    var x = sread('nosubtle');
+    if (x === '0' || x === '1')
+        nosubtle = parseInt(x);
+    if ((nosubtle > 1 && !CHROME && !FIREFOX) ||
+        (nosubtle > 2 && !CHROME) ||
+        (CHROME && nosubtle > VCHROME) ||
+        !WebAssembly)
+        nosubtle = 0;
+})();
+
+
 function goto_up2k() {
     if (up2k === false)
         return goto('bup');
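The IIFE above sanitizes the `nosubtle` chickenbit before up2k initializes: a stored or preset value only survives if the current browser can plausibly honor it. Rewritten as a pure function so the branches can be exercised in isolation (the name `gate_nosubtle` and the explicit parameters are illustrative; in the real code these are globals):

```javascript
// standalone sketch of the nosubtle gate above; hypothetical name and
// parameters. returns the effective nosubtle level, or 0 (disabled) when
// the browser cannot support the requested level
function gate_nosubtle(nosubtle, CHROME, FIREFOX, VCHROME, hasWasm) {
    if ((nosubtle > 1 && !CHROME && !FIREFOX) || // level 2+ needs chrome/firefox
        (nosubtle > 2 && !CHROME) ||             // level 3+ needs chrome
        (CHROME && nosubtle > VCHROME) ||        // cannot exceed chrome version
        !hasWasm)                                // wasm hasher is mandatory
        return 0;
    return nosubtle;
}
```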
@@ -23,7 +35,7 @@ var up2k = null,
     m = 'will use ' + sha_js + ' instead of native sha512 due to';
 
 try {
-    if (sread('nosubtle') || window.nosubtle)
+    if (nosubtle)
         throw 'chickenbit';
     var cf = crypto.subtle || crypto.webkitSubtle;
     cf.digest('SHA-512', new Uint8Array(1)).then(
@@ -825,7 +837,7 @@ function up2k_init(subtle) {
         }
         qsr('#u2depmsg');
         var o = mknod('div', 'u2depmsg');
-        o.innerHTML = m;
+        o.innerHTML = nosubtle ? '' : m;
         ebi('u2foot').appendChild(o);
     }
     loading_deps = true;
@@ -881,7 +893,8 @@ function up2k_init(subtle) {
     bcfg_bind(uc, 'turbo', 'u2turbo', turbolvl > 1, draw_turbo);
     bcfg_bind(uc, 'datechk', 'u2tdate', turbolvl < 3, null);
     bcfg_bind(uc, 'az', 'u2sort', u2sort.indexOf('n') + 1, set_u2sort);
-    bcfg_bind(uc, 'hashw', 'hashw', !!WebAssembly && !(CHROME && MOBILE) && (!subtle || !CHROME), set_hashw);
+    bcfg_bind(uc, 'hashw', 'hashw', !!WebAssembly && !(CHROME && MOBILE) && (!subtle || !CHROME || VCHROME > 136), set_hashw);
+    bcfg_bind(uc, 'hwasm', 'nosubtle', nosubtle, set_nosubtle);
     bcfg_bind(uc, 'upnag', 'upnag', false, set_upnag);
     bcfg_bind(uc, 'upsfx', 'upsfx', false, set_upsfx);
 
@@ -1442,9 +1455,16 @@ function up2k_init(subtle) {
     if (CHROME) {
         // chrome-bug 383568268 // #124
         nw = Math.max(1, (nw > 4 ? 4 : (nw - 1)));
+        if (VCHROME < 137)
             nw = (subtle && !MOBILE && nw > 2) ? 2 : nw;
     }
 
+    var x = sread('u2hashers') || window.u2hashers;
+    if (x) {
+        console.log('u2hashers is overriding default-value ' + nw);
+        nw = parseInt(x);
+    }
+
     for (var a = 0; a < nw; a++)
         hws.push(new Worker(SR + '/.cpr/w.hash.js?_=' + TS));
 
@@ -2213,6 +2233,7 @@ function up2k_init(subtle) {
         reading = 0,
         max_readers = 1,
         opt_readers = 2,
+        failed = false,
         free = [],
         busy = {},
         nbusy = 0,
@@ -2262,6 +2283,14 @@ function up2k_init(subtle) {
         tasker();
     }
 
+    function go_fail() {
+        failed = true;
+        if (nbusy)
+            return;
+        apop(st.busy.hash, t);
+        st.bytes.finished += t.size;
+    }
+
     function onmsg(d) {
         d = d.data;
         var k = d[0];
@@ -2276,6 +2305,12 @@ function up2k_init(subtle) {
             return vis_exh(d[1], 'up2k.js', '', '', d[1]);
 
         if (k == "fail") {
+            var nchunk = d[1];
+            free.push(busy[nchunk]);
+            delete busy[nchunk];
+            nbusy--;
+            reading--;
+
             pvis.seth(t.n, 1, d[1]);
             pvis.seth(t.n, 2, d[2]);
             console.log(d[1], d[2]);
@@ -2283,9 +2318,7 @@ function up2k_init(subtle) {
                 got_oserr();
 
             pvis.move(t.n, 'ng');
-            apop(st.busy.hash, t);
-            st.bytes.finished += t.size;
-            return;
+            return go_fail();
         }
 
         if (k == "ferr")
@@ -2318,6 +2351,9 @@ function up2k_init(subtle) {
             t.hash.push(nchunk);
             pvis.hashed(t);
 
+            if (failed)
+                return go_fail();
+
             if (t.hash.length < nchunks)
                 return nbusy < opt_readers && go_next();
 
@@ -2395,8 +2431,8 @@ function up2k_init(subtle) {
         try { orz(e); } catch (ex) { vis_exh(ex + '', 'up2k.js', '', '', ex); }
     };
 
-    xhr.timeout = 34000;
     xhr.open('HEAD', t.purl + uricom_enc(t.name), true);
+    xhr.timeout = 34000;
     xhr.send();
 }
 
@@ -2875,7 +2911,8 @@ function up2k_init(subtle) {
 
         st.bytes.inflight += db;
         xhr.bsent = nb;
-        xhr.timeout = 64000 + Date.now() - xhr.t0;
+        if (!IE)
+            xhr.timeout = 64000 + Date.now() - xhr.t0;
         pvis.prog(t, pcar, nb);
     };
     xhr.onload = function (xev) {
@@ -2923,7 +2960,7 @@ function up2k_init(subtle) {
 
     xhr.bsent = 0;
     xhr.t0 = Date.now();
-    xhr.timeout = 42000;
+    xhr.timeout = 1000 * (IE ? 1234 : 42);
     xhr.responseType = 'text';
     xhr.send(t.fobj.slice(car, cdr));
 }
@@ -3269,6 +3306,12 @@ function up2k_init(subtle) {
         }
     }
 
+    function set_nosubtle(v) {
+        if (!WebAssembly)
+            return toast.err(10, L.u_nowork);
+        modal.confirm(L.lang_set, location.reload.bind(location), null);
+    }
+
     function set_upnag(en) {
         function nopenag() {
             bcfg_set('upnag', uc.upnag = false);
@@ -32,7 +32,7 @@ var wah = '',
     CHROME = !!window.chrome, // safari=false
     VCHROME = CHROME ? 1 : 0,
     UA = '' + navigator.userAgent,
-    IE = /Trident\//.test(UA),
+    IE = !!document.documentMode,
     FIREFOX = ('netscape' in window) && / rv:/.test(UA),
     IPHONE = TOUCH && /iPhone|iPad|iPod/i.test(UA),
     LINUX = /Linux/.test(UA),
@@ -69,7 +69,7 @@ try {
 
     CHROME = navigator.userAgentData.brands.find(function (d) { return d.brand == 'Chromium' });
     if (CHROME)
-        VCHROME = CHROME.version;
+        VCHROME = parseInt(CHROME.version);
    else
         VCHROME = 0;
 
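The `parseInt` change above matters because `navigator.userAgentData` reports brand versions as strings, and string comparison is lexicographic, so checks like `VCHROME < 137` or `VCHROME > 136` elsewhere in this changeset would silently misbehave:

```javascript
// brand versions arrive as strings; lexicographic order sorts "137"
// before "99", so numeric checks need parseInt first
var raw = '137';                  // e.g. what userAgentData would report
var lexi = raw < '99';            // true: misleading string comparison
var num = parseInt(raw) < 99;     // false: correct numeric comparison
```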
@@ -183,7 +183,7 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
     if (url.indexOf(' > eval') + 1 && !evalex_fatal)
         return; // md timer
 
-    if (IE && url.indexOf('prism.js') + 1)
+    if (url.indexOf('prism.js') + 1)
         return;
 
     if (url.indexOf('easymde.js') + 1)
@@ -4,6 +4,16 @@
 function hex2u8(txt) {
     return new Uint8Array(txt.match(/.{2}/g).map(function (b) { return parseInt(b, 16); }));
 }
+function esc(txt) {
+    return txt.replace(/[&"<>]/g, function (c) {
+        return {
+            '&': '&amp;',
+            '"': '&quot;',
+            '<': '&lt;',
+            '>': '&gt;'
+        }[c];
+    });
+}
 
 
 var subtle = null;
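The `esc()` helper added above (whose entity values were rendered decoded by the diff view; the mapping must produce the escaped forms) is what keeps the live-tailed log text safe to assign to `innerHTML`. Reproduced standalone with a usage check:

```javascript
// minimal HTML-escaping of the four characters that matter in element
// content and double-quoted attribute values, as added in the diff above
function esc(txt) {
    return txt.replace(/[&"<>]/g, function (c) {
        return {
            '&': '&amp;',
            '"': '&quot;',
            '<': '&lt;',
            '>': '&gt;'
        }[c];
    });
}
```

Escaping `&` first is implicit here: a single regex pass never re-examines its own output, so `<` never becomes `&amp;lt;`.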
@@ -19,6 +29,8 @@ catch (ex) {
 }
 function load_fb() {
     subtle = null;
+    if (self.hashwasm)
+        return;
     importScripts('deps/sha512.hw.js');
     console.log('using fallback hasher');
 }
@@ -1,3 +1,160 @@
+▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
+# 2025-0721-2307 `v1.18.3` drop the umask
+
+## 🧪 new features
+
+* #181 the default chmod (unix-permissions) of new files and folders can now be changed 9921c43e
+  * `--chmod-d` or volflag `chmod_d` sets directory permissions; default is 755
+  * `--chmod-f` or volflag `chmod_f` sets file permissions; default is usually 644 (OS-defined)
+  * see `--help-chmod` which explains the numbers
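As a hedged illustration of the flags described above (values are illustrative, not recommendations; see `--help-chmod` for the authoritative explanation):

```shell
# illustrative: group-writable dirs (770) and files (660)
# instead of the OS-defined defaults
copyparty --chmod-d 770 --chmod-f 660
```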
+
+## 🩹 bugfixes
+
+* #179 couldn't combine `--shr` (shares) and `--xvol` (symlink-guard) 0f0f8d90
+* #180 gallery buttons could still be clicked when faded-out 8c32b0e7
+* rss-feeds were slightly busted when combined with rp-loc (location-based proxying) 56d3bcf5
+* music-playback within search-results no longer jumps into the next folder at end-of-list 9bc4c5d2
+* video-playback on iOS now behaves like on all other platforms 78605d9a
+  * (it would force-switch into fullscreen because that's their default)
+
+
+
+▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
+# 2025-0707-1419 `v1.18.2` idp-vol persistence
+
+## 🧪 new features
+
+* IdP-volumes can optionally be persisted across restarts d162502c
+  * there is a UI to manage the cached users/groups 4f264a0a
+  * only available to users listed in the new option `--idp-adm`
+* api for manually rescanning several volumes at once 42c199e7
+  * `/some/path/?scan` does that one volume like before
+  * `/any/path/?scan=/vol1,/another/vol2` rescans `/vol1` and `/another/vol2`
+* volflag to hide volume from listing in controlpanel fd7c71d6
+
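A hedged sketch of the multi-volume rescan API described above (hostname is a placeholder; the request must be made by a user with sufficient permission on the listed volumes):

```shell
# assumed invocation: rescan /vol1 and /another/vol2 in one request,
# regardless of which path the request itself is sent to
curl "https://example.com/any/path/?scan=/vol1,/another/vol2"
```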
|
## 🩹 bugfixes
|
||||||
|
|
||||||
|
* macos: fix confusing crash when blocked by [Little Snitch](https://www.obdev.at/products/littlesnitch/) bf11b2a4
|
||||||
|
* unpost could break in some hairy reverseproxy setups 1b2d3985
|
||||||
|
* copyparty32.exe: fix segfault on win7 c9fafb20
|
||||||
|
* ui: fix navpane overlapping the scrollbar (still a bit jank but eh) 7ef6fd13
|
||||||
|
* usb-eject: support all volume names ed908b98
|
||||||
|
* docker: ensure clean slate deb6711b
|
||||||
|
* fix up2k on ie11 d2714434
|
||||||
|
|
||||||
|
## 🔧 other changes
|
||||||
|
|
||||||
|
* update buildscript for keyfinder to support llvm 65c4e035
|
||||||
|
* #175 add `python-magic` into the `iv` and `dj` docker flavors (thx @Morganamilo) 77274e9d
|
||||||
|
* properly killed the experimental docker flavors to avoid confusion 8306e3d9
|
||||||
|
* copyparty.exe: updated pillow 299cff3f f6be3905
|
||||||
|
* avif support was removed to save 2 MiB
|
||||||
|
|
||||||
|
## 🌠 fun facts
|
||||||
|
|
||||||
|
* this release was slightly delayed due to a [norwegian traffic jam](https://a.ocv.me/pub/g/2025/07/PXL_20250706_143558381.jpg)
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2025-0622-0020 `v1.18.0` Logtail
|
||||||
|
|
||||||
|
## 🧪 new features
|
||||||
|
|
||||||
|
* textfile-viewer can now livestream logfiles (and other growing files) 17fa4906 77df17d1 a1c7a095 6ecf4fdc
|
||||||
|
* see [readme](https://github.com/9001/copyparty/#textfile-viewer) and the [live demo](https://a.ocv.me/pub/demo/logtail/)
|
||||||
|
* IdP-volumes: extend syntax for excluding certain users/groups 2e53f797
  * the commit-message explains it well enough
* new option `--see-dots` to show dotfiles in the web-ui by default c599e2aa
* #171 automatic mimetype detection for files without extensions (thx @Morganamilo!) ec05f8cc 9dd5dec0
  * default-disabled since it has a performance impact on webdav
  * there are plans to fix this by using the db instead
* #170 improve custom filetype icons
  * be less strict; if a thumbnail is set for `.gz` files, use it for `.tar.gz` too c75b0c25
* improve config docs fa5845ff

## 🩹 bugfixes

* cosmetic: get rid of some noise along the bottom of some cards in the gridview 8cae7a71
* cosmetic: satisfy a new syntax warning in cpython-3.14 5ac38648

## 🔧 other changes

* properly document how to [build from source](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#build-from-scratch) / build from scratch f61511d8
* update deps
  * copyparty.exe: python 3.13 1eff87c3
  * webdeps: dompurify 7eca90cc

## 🌠 fun facts

* this release was cooked up in a [swedish forest cabin](https://a.ocv.me/pub/g/nerd-stuff/forestparty.jpg)

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2025-0527-1939 `v1.17.2` pushing chrome to the limits (and then some)

## 🧪 new features

* not this time

## 🩹 bugfixes

* up2k: improve file-hashing speed on recent versions of google chrome e3e51fb8
  * speed increased from 319 to 513 MiB/s by default (but older chrome versions did 748...)
  * read the commit message for the full story; in short, chrome has gotten gradually slower over the past couple of versions (starting from v133), and this makes it slightly less bad again
  * hashing speed can be further improved from `0.5` to `1.1` GiB/s by enabling the `[wasm]` option in the `[⚙️] settings` tab
    * this option can be made default-enabled with `--nosubtle 137`, but beware that this increases the chances of running into browser-bugs (foreshadowing...)
* up2k: fix errorhandler for browser-bugs (oom and such) 49c71247
  * because [chrome-bug 383568268](https://issues.chromium.org/issues/383568268) is about to make a [surprise return?!](https://issues.chromium.org/issues/383568268#comment14)
* #168 fix uploading into shares if path-based proxying is used 9cb93ae1
* #165 unconditionally heed `--rp-loc` 84f5f417
  * the config-option for [path-based proxying](https://github.com/9001/copyparty/#reverse-proxy) was ignored if the reverse-proxy was untrusted; this was confusing and not strictly necessary

## 🔧 other changes

* #166 the nixos module was improved once more (thx @msfjarvis!) 48470f6b 60fb1207
* added usage instructions to [minimal-up2k.js](https://github.com/9001/copyparty/tree/hovudstraum/contrib/plugins#example-browser-js), the up2k-ui [simplifier](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png) 1d308eeb
* docker: improve feedback if the config is bad or missing 28b63e58

## 🌠 fun facts

* this release was tested over an [unreliable rdp connection](https://a.ocv.me/pub/g/nerd-stuff/PXL_20250526_021207825.jpg), through two ssh-jumphosts, to a qemu win10 vm back home, from the bergen-oslo night train wifi

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2025-0518-2234 `v1.17.1` as seen on archlinux

## 🧪 new features

* new toolbar button to zip/tar the currently open folder 256dad8c
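the toolbar button maps onto the scriptable folder-download url-parameters; a sketch assuming the `?tar=gz:1` form which appears in this repo's own CI script (host and folder name are placeholder values, and the download itself is commented out since it needs a running server):

```shell
# download a folder as a gzip-compressed tarball (compression level 1);
# host and folder are placeholder values
base='http://127.0.0.1:3923'
url="$base/music/?tar=gz:1"
echo "$url"
# wget -O music.tar.gz "$url"  # uncomment once a server is running
```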
* new options to specify the default checksum algorithm for PUT/bup/WebDAV uploads 0de09860
* #164 new option `--put-name` to specify the filename of nameless uploads 5dcd88a6
  * the default is still `put-TIMESTAMP-IPADDRESS.bin`

## 🩹 bugfixes

* #162 password-protected shares were incompatible with password-hashing c3ef3fdc
* #161 m3u playlist creation was only possible over https 94352f27
* when relocating/redirecting an upload from an xbu hook (execute-before-upload), it could miss an already existing file at the destination and create another copy 0a9a8077
* some edgecases when moving files between filesystems f425ff51
* improve tagscan-resume after a server restart (primarily for dupes) 41fa6b25
* support prehistoric timestamps in fat16 vhd-drives on windows 261236e3

## 🔧 other changes

* #159 the nixos module was improved (thx @gabevenberg and @chinponya!) d1bca1f5
* an archlinux maintainer adopted the aur package; copyparty is now [officially in arch](https://archlinux.org/packages/extra/any/copyparty/) b9ba783c
* #162 add KDE Dolphin instructions to the connect-page d4a8071d
* audioplayer now knows that `.oga` means `.ogg`

## 🌠 fun facts

* this release contains code [pair-programmed during an anime rave](https://a.ocv.me/pub/g/nerd-stuff/PXL_20250503_222654610.jpg)

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

# 2025-0426-2149 `v1.17.0` mixtape.m3u

```diff
@@ -22,6 +22,7 @@
 * [dev env setup](#dev-env-setup)
 * [just the sfx](#just-the-sfx)
 * [build from release tarball](#build-from-release-tarball) - uses the included prebuilt webdeps
+* [build from scratch](#build-from-scratch) - how the sausage is made
 * [complete release](#complete-release)
 * [debugging](#debugging)
   * [music playback halting on phones](#music-playback-halting-on-phones) - mostly fine on android
```

```diff
@@ -190,6 +191,9 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
 | GET | `?v` | open image/video/audio in mediaplayer |
 | GET | `?txt` | get file at URL as plaintext |
 | GET | `?txt=iso-8859-1` | ...with specific charset |
+| GET | `?tail` | continuously stream a growing file |
+| GET | `?tail=1024` | ...starting from byte 1024 |
+| GET | `?tail=-128` | ...starting 128 bytes from the end |
 | GET | `?th` | get image/video at URL as thumbnail |
 | GET | `?th=opus` | convert audio file to 128kbps opus |
 | GET | `?th=caf` | ...in the iOS-proprietary container |
@@ -257,6 +261,7 @@ upload modifiers:
 |--|--|--|
 | GET | `?reload=cfg` | reload config files and rescan volumes |
 | GET | `?scan` | initiate a rescan of the volume which provides URL |
+| GET | `?scan=/a,/b` | initiate a rescan of volumes `/a` and `/b` |
 | GET | `?stack` | show a stacktrace of all threads |
 
 ## general
```

````diff
@@ -338,7 +343,7 @@ for the `re`pack to work, first run one of the sfx'es once to unpack it
 
 you need python 3.9 or newer due to type hints
 
-the rest is mostly optional; if you need a working env for vscode or similar
+setting up a venv with the below packages is only necessary if you want it for vscode or similar
 
 ```sh
 python3 -m venv .venv
@@ -350,7 +355,7 @@ pip install mutagen # audio metadata
 pip install pyftpdlib # ftp server
 pip install partftpy # tftp server
 pip install impacket # smb server -- disable Windows Defender if you REALLY need this on windows
-pip install Pillow pyheif-pillow-opener pillow-avif-plugin # thumbnails
+pip install Pillow pyheif-pillow-opener # thumbnails
 pip install pyvips # faster thumbnails
 pip install psutil # better cleanup of stuck metadata parsers on windows
 pip install black==21.12b0 click==8.0.2 bandit pylint flake8 isort mypy # vscode tooling
````

````diff
@@ -392,6 +397,39 @@ python3 setup.py install --skip-build --prefix=/usr --root=$HOME/pe/copyparty
 ```
 
 
+## build from scratch
+
+how the sausage is made:
+
+to get started, first `cd` into the `scripts` folder
+
+* the first step is the webdeps; they end up in `../copyparty/web/deps/` for example `../copyparty/web/deps/marked.js.gz` -- if you need to build the webdeps, run `make -C deps-docker`
+  * this needs rootless podman and the `podman-docker` compat-layer to pretend it's docker, although it *should* be possible to use rootful/rootless docker too
+  * if you don't have rootless podman/docker then `sudo make -C deps-docker` is fine too
+  * alternatively, you can entirely skip building the webdeps and instead extract the compiled webdeps from the latest github release with `./make-sfx.sh fast dl-wd`
+
+* next, build `copyparty-sfx.py` by running `./make-sfx.sh gz fast`
+  * this is a dependency for most of the remaining steps, since they take the sfx as input
+  * removing `fast` makes it compress better
+  * removing `gz` too compresses even better, but startup gets slower
+
+* if you want to build the `.pyz` standalone "binary", now run `./make-pyz.sh`
+
+* if you want to build a pypi package, now run `./make-pypi-release.sh d`
+
+* if you want to build a docker-image, you have two options:
+  * if you want to use podman to build all docker-images for all supported architectures, now run `(cd docker; ./make.sh hclean; ./make.sh hclean pull img)`
+  * if you want to use docker to build all docker-images for your native architecture, now run `sudo make -C docker`
+  * if you want to do something else, please take a look at `docker/make.sh` or `docker/Makefile` for inspiration
+
+* if you want to build the windows exe, first grab some snacks and a beer, [you'll need it](https://github.com/9001/copyparty/tree/hovudstraum/scripts/pyinstaller)
+
+the complete list of buildtime dependencies to do a build from scratch is as follows:
+
+* on ubuntu-server, install podman or [docker](https://get.docker.com/), and then `sudo apt install make zip bzip2`
+  * because ubuntu is specifically what someone asked about :-p
+
+
 ## complete release
 
 also builds the sfx so skip the sfx section above
````

```diff
@@ -106,3 +106,10 @@
 /w/tank1
 [/m8s]
 /w/tank2
+
+
+# some other things you can do:
+# [/demo/${u%-su,%-fds}]  # users which are NOT members of "su" or "fds"
+# [/demo/${u%+su,%+fds}]  # users which ARE members of BOTH "su" and "fds"
+# [/demo/${g%-su}]        # all groups except su
+# [/demo/${g%-su,%-fds}]  # all groups except su and fds
```

```diff
@@ -168,6 +168,7 @@ symbol legend,
 | upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | ╱ | ╱ |
 | CTRL-V from device | █ | | | █ | | | | | | | | | |
 | race the beam ("p2p") | █ | | | | | | | | | | | | |
+| "tail -f" streaming | █ | | | | | | | | | | | | |
 | keep last-modified time | █ | | | █ | █ | █ | | | | | | █ | |
 | upload rules | ╱ | ╱ | ╱ | ╱ | ╱ | | | ╱ | ╱ | | ╱ | ╱ | ╱ |
 | ┗ max disk usage | █ | █ | █ | | █ | | | | █ | | | █ | █ |
@@ -193,6 +194,8 @@ symbol legend,
 
 * `race the beam` = files can be downloaded while they're still uploading; downloaders are slowed down such that the uploader is always ahead
 
+* `tail -f` = when viewing or downloading a logfile, the connection can remain open to keep showing new lines as they are added in real time
+
 * `upload routing` = depending on filetype / contents / uploader etc., the file can be redirected to another location or otherwise transformed; mitigates limitations such as [sharex#3992](https://github.com/ShareX/ShareX/issues/3992)
   * copyparty example: [reloc-by-ext](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks#before-upload)
```

flake.lock (generated; 8 lines changed):

```diff
@@ -17,16 +17,16 @@
     },
     "nixpkgs": {
       "locked": {
-        "lastModified": 1680334310,
-        "narHash": "sha256-ISWz16oGxBhF7wqAxefMPwFag6SlsA9up8muV79V9ck=",
+        "lastModified": 1748162331,
+        "narHash": "sha256-rqc2RKYTxP3tbjA+PB3VMRQNnjesrT0pEofXQTrMsS8=",
         "owner": "NixOS",
         "repo": "nixpkgs",
-        "rev": "884e3b68be02ff9d61a042bc9bd9dd2a358f95da",
+        "rev": "7c43f080a7f28b2774f3b3f43234ca11661bf334",
         "type": "github"
       },
       "original": {
         "id": "nixpkgs",
-        "ref": "nixos-22.11",
+        "ref": "nixos-25.05",
         "type": "indirect"
       }
     },
```

```diff
@@ -1,6 +1,6 @@
 {
   inputs = {
-    nixpkgs.url = "nixpkgs/nixos-22.11";
+    nixpkgs.url = "nixpkgs/nixos-25.05";
     flake-utils.url = "github:numtide/flake-utils";
   };
 
@@ -17,6 +17,9 @@
     let
       pkgs = import nixpkgs {
         inherit system;
+        config = {
+          allowAliases = false;
+        };
         overlays = [ self.overlays.default ];
       };
     in {
```

```diff
@@ -3,7 +3,7 @@ WORKDIR /z
 ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
     ver_hashwasm=4.12.0 \
     ver_marked=4.3.0 \
-    ver_dompf=3.2.5 \
+    ver_dompf=3.2.6 \
     ver_mde=2.18.0 \
     ver_codemirror=5.65.18 \
     ver_fontawesome=5.13.0 \
```

```diff
@@ -15,13 +15,14 @@ RUN apk add -U !pyc \
     py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
     py3-pip py3-cffi \
     ffmpeg \
+    py3-magic \
     vips-jxl vips-heif vips-poppler vips-magick \
     py3-numpy fftw libsndfile \
     vamp-sdk vamp-sdk-libs \
     && apk add -t .bd \
     bash wget gcc g++ make cmake patchelf \
     python3-dev ffmpeg-dev fftw-dev libsndfile-dev \
-    py3-wheel py3-numpy-dev \
+    py3-wheel py3-numpy-dev libffi-dev \
     vamp-sdk-dev \
     && rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
     && python3 -m pip install pyvips \
```

```diff
@@ -1,4 +1,7 @@
-FROM debian:12-slim
+FROM DO_NOT_USE_THIS_DOCKER_IMAGE
+# this image is an unmaintained experiment to see whether alpine was the correct choice (it was)
+
+#FROM debian:12-slim
 WORKDIR /z
 LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
     org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
```

```diff
@@ -1,4 +1,7 @@
-FROM fedora:39
+FROM DO_NOT_USE_THIS_DOCKER_IMAGE
+# this image is an unmaintained experiment to see whether alpine was the correct choice (it was)
+
+#FROM fedora:39
 WORKDIR /z
 LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
     org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
```

```diff
@@ -1,4 +1,7 @@
-FROM fedora:38
+FROM DO_NOT_USE_THIS_DOCKER_IMAGE
+# this image is an unmaintained experiment to see whether alpine was the correct choice (it was)
+
+#FROM fedora:38
 WORKDIR /z
 LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
     org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
```

```diff
@@ -1,4 +1,7 @@
-FROM ubuntu:23.04
+FROM DO_NOT_USE_THIS_DOCKER_IMAGE
+# this image is an unmaintained experiment to see whether alpine was the correct choice (it was)
+
+#FROM ubuntu:23.04
 WORKDIR /z
 LABEL org.opencontainers.image.url="https://github.com/9001/copyparty" \
     org.opencontainers.image.source="https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker" \
```

```diff
@@ -12,10 +12,11 @@ RUN apk add -U !pyc \
     py3-jinja2 py3-argon2-cffi py3-pyzmq py3-pillow \
     py3-pip py3-cffi \
     ffmpeg \
+    py3-magic \
     vips-jxl vips-heif vips-poppler vips-magick \
     && apk add -t .bd \
     bash wget gcc g++ make cmake patchelf \
-    python3-dev py3-wheel \
+    python3-dev py3-wheel libffi-dev \
     && rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
     && python3 -m pip install pyvips \
     && apk del py3-pip .bd
```

```diff
@@ -28,6 +28,14 @@ all:
 
 	docker image ls
 
+min:
+	rm -rf i
+	mkdir i
+	tar -cC../.. dist/copyparty-sfx.py bin/mtag | tar -xvCi
+
+	podman build --squash --pull=always -t copyparty/min:latest -f Dockerfile.min .
+	echo 'scale=1;'`podman save copyparty/min:latest | pigz -c | wc -c`/1024/1024 | bc
+
 push:
 	docker push copyparty/min
 	docker push copyparty/im
```

```diff
@@ -63,12 +63,13 @@ python3 -m copyparty \
   --ign-ebind -p$((1024+RANDOM)),$((1024+RANDOM)),$((1024+RANDOM)) \
   -v .::r --no-crt -qi127.1 --wr-h-eps $t & pid=$!
 
-for n in $(seq 1 200); do sleep 0.2
+for n in $(seq 1 900); do sleep 0.2
   v=$(awk '/^127/{print;n=1;exit}END{exit n-1}' $t) && break
 done
 [ -z "$v" ] && echo SNAAAAAKE && exit 1
+rm $t
 
-for n in $(seq 1 200); do sleep 0.2
+for n in $(seq 1 900); do sleep 0.2
   wget -O- http://${v/ /:}/?tar=gz:1 >tf && break
 done
 tar -xzO top/innvikler.sh <tf | cmp innvikler.sh
@@ -79,7 +80,7 @@ kill $pid; wait $pid
 ########################################################################
 
 # output from -e2d
-rm -rf .hist
+rm -rf .hist /cfg/copyparty
 
 # goodbye
 exec rm innvikler.sh
```

```diff
@@ -7,7 +7,7 @@ import subprocess as sp
 
 # to convert the copyparty --help to html, run this in xfce4-terminal @ 140x43:
 _ = r""""
-echo; for a in '' -bind -accounts -flags -handlers -hooks -urlform -exp -ls -dbd -pwhash -zm; do
+echo; for a in '' -bind -accounts -flags -handlers -hooks -urlform -exp -ls -dbd -chmod -pwhash -zm; do
 ./copyparty-sfx.py --help$a 2>/dev/null; printf '\n\n\n%0139d\n\n\n'; done # xfce4-terminal @ 140x43
 """
 # click [edit] => [select all]
```

```diff
@@ -23,7 +23,7 @@ exit 0
 
 
 # first open an infinitely wide console (this is why you own an ultrawide) and copypaste this into it:
-for a in '' -bind -accounts -flags -handlers -hooks -urlform -exp -ls -dbd -pwhash -zm; do
+for a in '' -bind -accounts -flags -handlers -hooks -urlform -exp -ls -dbd -chmod -pwhash -zm; do
 ./copyparty-sfx.py --help$a 2>/dev/null; printf '\n\n\n%0255d\n\n\n'; done
 
 # then copypaste all of the output by pressing ctrl-shift-a, ctrl-shift-c
```

```diff
@@ -537,6 +537,7 @@ find | grep -E '\.(js|html)$' | while IFS= read -r f; do
 done
 
 gzres() {
+	local pk=
 	[ $zopf ] && command -v zopfli && pk="zopfli --i$zopf"
 	[ $zopf ] && command -v pigz && pk="pigz -11 -I $zopf"
 	[ -z "$pk" ] && pk='gzip'
@@ -628,7 +629,6 @@ suf=
 [ $use_gz ] && {
 	sed -r 's/"r:bz2"/"r:gz"/' <$py >$py.t
 	py=$py.t
-	suf=-gz
 }
 
 "$pybin" $py --sfx-make tar.bz2 $ver $ts
```

```diff
@@ -14,9 +14,10 @@ clean=--clean
 
 uname -s | grep WOW64 && m=64 || m=32
 uname -s | grep NT-10 && w10=1 || w7=1
+[ $w7 ] && export PRTY_NO_MAGIC=1
 [ $w7 ] && [ -e up2k.sh ] && [ ! "$1" ] && ./up2k.sh
 
-[ $w7 ] && pyv=37 || pyv=312
+[ $w7 ] && pyv=37 || pyv=313
 esuf=
 [ $w7 ] && [ $m = 32 ] && esuf=32
 [ $w7 ] && [ $m = 64 ] && esuf=-winpe64
@@ -89,14 +90,18 @@ excl=(
 	urllib.request
 	urllib.response
 	urllib.robotparser
-	zipfile
 )
 [ $w10 ] && excl+=(
+	_pyrepl
+	distutils
+	setuptools
+	PIL._avif
 	PIL.ImageQt
 	PIL.ImageShow
 	PIL.ImageTk
 	PIL.ImageWin
 	PIL.PdfParser
+	zipimport
 ) || excl+=(
 	inspect
 	PIL
@@ -104,6 +109,7 @@ excl=(
 	PIL.Image
 	PIL.ImageDraw
 	PIL.ImageOps
+	zipfile
 )
 excl=( "${excl[@]/#/--exclude-module }" )
 
```

```diff
@@ -3,7 +3,7 @@ f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f8
 17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
 b297ff66ec50cf5a1abcf07d6ac949644c5150ba094ffac974c5d27c81574c3e97ed814a47547f4b03a4c83ea0fb8f026433fca06a3f08e32742dc5c024f3d07 pywin32_ctypes-0.2.3-py3-none-any.whl
 085d39ef4426aa5f097fbc484595becc16e61ca23fc7da4d2a8bba540a3b82e789e390b176c7151bdc67d01735cce22b1562cdb2e31273225a2d3e275851a4ad setuptools-70.3.0-py3-none-any.whl
-360a141928f4a7ec18a994602cbb28bbf8b5cc7c077a06ac76b54b12fa769ed95ca0333a5cf728923a8e0baeb5cc4d5e73e5b3de2666beb05eb477d8ae719093 upx-4.2.4-win32.zip
+644931f8e1764e168c257c11c77b3d2ac5408397d97b0eef98168a058efe793d3ab6900dc2e9c54923a2bd906dd66bfbff8db6ff43418513e530a1bd501c6ccd upx-5.0.1-win32.zip
 # win7
 3253e86471e6f9fa85bfdb7684cd2f964ed6e35c6a4db87f81cca157c049bef43e66dfcae1e037b2fb904567b1e028aaeefe8983ba3255105df787406d2aa71e en_windows_7_professional_with_sp1_x86_dvd_u_677056.iso
 ab0db0283f61a5bbe44797d74546786bf41685175764a448d2e3bd629f292f1e7d829757b26be346b5044d78c9c1891736d93237cee4b1b6f5996a902c86d15f en_windows_7_professional_with_sp1_x64_dvd_u_676939.iso
@@ -24,10 +24,11 @@ ac96786e5d35882e0c5b724794329c9125c2b86ae7847f17acfc49f0d294312c6afc1c3f248655de
 0a2cd4cadf0395f0374974cd2bc2407e5cc65c111275acdffb6ecc5a2026eee9e1bb3da528b35c7f0ff4b64563a74857d5c2149051e281cc09ebd0d1968be9aa en-us_windows_10_enterprise_ltsc_2021_x64_dvd_d289cf96.iso
 16cc0c58b5df6c7040893089f3eb29c074aed61d76dae6cd628d8a89a05f6223ac5d7f3f709a12417c147594a87a94cc808d1e04a6f1e407cc41f7c9f47790d1 virtio-win-0.1.248.iso
 9a7f40edc6f9209a2acd23793f3cbd6213c94f36064048cb8bf6eb04f1bdb2c2fe991cb09f77fe8b13e5cd85c618ef23573e79813b2fef899ab2f290cd129779 jinja2-3.1.6-py3-none-any.whl
-6df21f0da408a89f6504417c7cdf9aaafe4ed88cfa13e9b8fa8414f604c0401f885a04bbad0484dc51a29284af5d1548e33c6cc6bfb9896d9992c1b1074f332d MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl
+00731cfdd9d5c12efef04a7161c90c1e5ed1dc4677aa88a1d4054aff836f3430df4da5262ed4289c21637358a9e10e5df16f76743cbf5a29bb3a44b146c19cf3 MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl
 8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
-0203ec2551c4836696cfab0b2c9fff603352f03fa36e7476e2e1ca7ec57a3a0c24bd791fcd92f342bf817f0887854d9f072e0271c643de4b313d8c9569ba8813 packaging-24.1-py3-none-any.whl
+a726fb46cce24f781fc8b55a3e6dea0a884ebc3b2b400ea74aa02333699f4955a5dc1e2ec5927ac72f35a624401f3f3b442882ba1cc4cadaf9c88558b5b8bdae packaging-25.0-py3-none-any.whl
-c9051daaf34ec934962c743a5ac2dbe55a9b0cababb693a8cde0001d24d4a50b67bd534d714d935def6ca7b898ec0a352e58bd9ccdce01c54eaf2281b18e478d pillow-11.2.1-cp312-cp312-win_amd64.whl
+3e39ea6e16b502d99a2e6544579095d0f7c6097761cd85135d5e929b9dec1b32e80669a846f94ee8c2cca9be2f5fe728625d09453988864c04e16bb8445c3f91 pillow-11.3.0-cp313-cp313-win_amd64.whl
-f0463895e9aee97f31a2003323de235fed1b26289766dc0837261e3f4a594a31162b69e9adbb0e9a31e2e2d4b5f25c762ed1669553df7dc89a8ba4f85d297873 pyinstaller-6.11.1-py3-none-win_amd64.whl
+59fbbcae044f4ee73d203ac74b553b27bfad3e6b2f3fb290fd3f8774753c6b545176b6b3399c240b092d131d152290ce732750accd962dc1e48e930be85f5e53 pyinstaller-6.14.1-py3-none-win_amd64.whl
-d550a0a14428386945533de2220c4c2e37c0c890fc51a600f626c6ca90a32d39572c121ec04c157ba3a8d6601cb021f8433d871b5c562a3d342c804fffec90c1 pyinstaller_hooks_contrib-2024.11-py3-none-any.whl
+fc6f3e144c5f5b662412de07cb8bf0c2eb3b3be21d19ec448aef3c4244d779b9ab8027fd67a4871e6e13823b248ea0f5a7a9241a53aef30f3b51a6d3cb5bdb3f pyinstaller_hooks_contrib-2025.5-py3-none-any.whl
-4f9a4d9f65c93e2d851e2674057343a9599f30f5dc582ffca485522237d4fcf43653b3d393ed5eb11e518c4ba93714a07134bbb13a97d421cce211e1da34682e python-3.12.10-amd64.exe
+2c7a52e223b8186c21009d3fa5ed6a856d8eb4ef3b98f5d24c378c6a1afbfa1378bd7a51d6addc500e263d7989efb544c862bf920055e740f137c702dfd9d18b python-3.13.5-amd64.exe
+2a0420f7faaa33d2132b82895a8282688030e939db0225ad8abb95a47bdb87b45318f10985fc3cee271a9121441c1526caa363d7f2e4a4b18b1a674068766e87 setuptools-80.9.0-py3-none-any.whl
```

|||||||
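The entries above follow the `sha512sum` manifest format (hex digest, whitespace, filename). As a hedged sketch of how such a manifest can be checked from Python, using a made-up file and digest rather than the installers listed above:

```python
import hashlib
import os
import tempfile

# work in a scratch dir; "example.whl" is a hypothetical stand-in,
# not one of the installers in the manifest above
os.chdir(tempfile.mkdtemp())
data = b"hello\n"
with open("example.whl", "wb") as f:
    f.write(data)

# manifest maps filename -> expected sha512 hex digest
manifest = {"example.whl": hashlib.sha512(data).hexdigest()}

def verify(fn):
    h = hashlib.sha512()
    with open(fn, "rb") as f:
        for blk in iter(lambda: f.read(1 << 20), b""):  # stream in 1 MiB chunks
            h.update(blk)
    return h.hexdigest() == manifest[fn]

print(verify("example.whl"))  # → True
```

The same check is what `sha512sum -c manifest.txt` does on the command line.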
@@ -29,19 +29,19 @@ uname -s | grep NT-10 && w10=1 || {
 fns=(
 altgraph-0.17.4-py2.py3-none-any.whl
 pefile-2023.2.7-py3-none-any.whl
-pywin32_ctypes-0.2.2-py3-none-any.whl
+pywin32_ctypes-0.2.3-py3-none-any.whl
-setuptools-70.3.0-py3-none-any.whl
-upx-4.2.4-win32.zip
+upx-5.0.1-win32.zip
 )
 [ $w10 ] && fns+=(
 jinja2-3.1.6-py3-none-any.whl
-MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl
+MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl
 mutagen-1.47.0-py3-none-any.whl
-packaging-24.1-py3-none-any.whl
+packaging-25.0-py3-none-any.whl
-pillow-11.2.1-cp312-cp312-win_amd64.whl
+pillow-11.3.0-cp313-cp313-win_amd64.whl
-pyinstaller-6.10.0-py3-none-win_amd64.whl
+pyinstaller-6.14.1-py3-none-win_amd64.whl
-pyinstaller_hooks_contrib-2024.8-py3-none-any.whl
+pyinstaller_hooks_contrib-2025.5-py3-none-any.whl
-python-3.12.10-amd64.exe
+python-3.13.5-amd64.exe
+setuptools-80.9.0-py3-none-any.whl
 )
 [ $w7 ] && fns+=(
 future-1.0.0-py3-none-any.whl
@@ -49,6 +49,7 @@ fns=(
 packaging-24.0-py3-none-any.whl
 pip-24.0-py3-none-any.whl
 pyinstaller_hooks_contrib-2023.8-py2.py3-none-any.whl
+setuptools-70.3.0-py3-none-any.whl
 typing_extensions-4.7.1-py3-none-any.whl
 zipp-3.15.0-py3-none-any.whl
 )
@@ -80,7 +81,7 @@ close and reopen git-bash so python is in PATH

 ===[ copy-paste into git-bash ]================================
 uname -s | grep NT-10 && w10=1 || w7=1
-[ $w7 ] && pyv=37 || pyv=312
+[ $w7 ] && pyv=37 || pyv=313
 appd=$(cygpath.exe "$APPDATA")
 cd ~/Downloads &&
 yes | unzip upx-*-win32.zip &&
@@ -34,6 +34,7 @@ shift
 ./make-sfx.sh "$@"
 f=../dist/copyparty-sfx
 [ -e $f.py ] && s= || s=-gz
+# TODO: the -gz suffix is gone, can drop all the $s stuff probably

 $f$s.py --version >/dev/null

@@ -94,6 +94,7 @@ copyparty/web/deps/prismd.css,
 copyparty/web/deps/scp.woff2,
 copyparty/web/deps/sha512.ac.js,
 copyparty/web/deps/sha512.hw.js,
+copyparty/web/idp.html,
 copyparty/web/iiam.gif,
 copyparty/web/md.css,
 copyparty/web/md.html,
@@ -121,7 +121,7 @@ var tl_browser = {
 "file-manager",
 ["G", "toggle list / grid view"],
 ["T", "toggle thumbnails / icons"],
-["🡅 A/D", "thumbnail size"],
+["⇧ A/D", "thumbnail size"],
 ["ctrl-K", "delete selected"],
 ["ctrl-X", "cut selection to clipboard"],
 ["ctrl-C", "copy selection to clipboard"],
@@ -131,9 +131,9 @@ var tl_browser = {

 "file-list-sel",
 ["space", "toggle file selection"],
-["🡑/🡓", "move selection cursor"],
+["↑/↓", "move selection cursor"],
-["ctrl 🡑/🡓", "move cursor and viewport"],
+["ctrl ↑/↓", "move cursor and viewport"],
-["🡅 🡑/🡓", "select prev/next file"],
+["⇧ ↑/↓", "select prev/next file"],
 ["ctrl-A", "select all files / folders"],
 ], [
 "navigation",
@@ -156,7 +156,7 @@ var tl_browser = {
 ["Home/End", "first/last pic"],
 ["F", "fullscreen"],
 ["R", "rotate clockwise"],
-["🡅 R", "rotate ccw"],
+["⇧ R", "rotate ccw"],
 ["S", "select pic"],
 ["Y", "download pic"],
 ], [
@@ -312,6 +312,7 @@ var tl_browser = {
 "ct_csel": 'use CTRL and SHIFT for file selection in grid-view">sel',
 "ct_ihop": 'when the image viewer is closed, scroll down to the last viewed file">g⮯',
 "ct_dots": 'show hidden files (if server permits)">dotfiles',
+"ct_qdel": 'when deleting files, only ask for confirmation once">qdel',
 "ct_dir1st": 'sort folders before files">📁 first',
 "ct_nsort": 'natural sort (for filenames with leading digits)">nsort',
 "ct_readme": 'show README.md in folder listings">📜 readme',
@@ -335,6 +336,8 @@ var tl_browser = {

 "cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$Nmakes https 30% faster, http 4.5x faster\">mt",

+"cut_wasm": "use wasm instead of the browser's built-in hasher; improves speed on chrome-based browsers but increases CPU load, and many older versions of chrome have bugs which makes the browser consume all RAM and crash if this is enabled\">wasm",
+
 "cft_text": "favicon text (blank and refresh to disable)",
 "cft_fg": "foreground color",
 "cft_bg": "background color",
@@ -421,6 +424,7 @@ var tl_browser = {
 "f_empty": 'this folder is empty',
 "f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
 "f_bigtxt": "this file is {0} MiB large -- really view as text?",
+"f_bigtxt2": "view just the end of the file instead? this will also enable following/tailing, showing newly added lines of text in real time",
 "fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
 "fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
 "f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",
@@ -525,6 +529,11 @@ var tl_browser = {
 "tvt_next": "show next document$NHotkey: K\">⬇ next",
 "tvt_sel": "select file ( for cut / copy / delete / ... )$NHotkey: S\">sel",
 "tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
+"tvt_tail": "monitor file for changes; show new lines in real time\">📡 follow",
+"tvt_wrap": "word-wrap\">↵",
+"tvt_atail": "lock scroll to bottom of page\">⚓",
+"tvt_ctail": "decode terminal colors (ansi escape codes)\">🌈",
+"tvt_ntail": "scrollback limit (how many bytes of text to keep loaded)",

 "m3u_add1": "song added to m3u playlist",
 "m3u_addn": "{0} songs added to m3u playlist",
@@ -624,6 +633,7 @@ var tl_browser = {
 "u_https3": "for better performance",
 "u_ancient": 'your browser is impressively ancient -- maybe you should <a href="#" onclick="goto(\'bup\')">use bup instead</a>',
 "u_nowork": "need firefox 53+ or chrome 57+ or iOS 11+",
+"tail_2old": "need firefox 105+ or chrome 71+ or iOS 14.5+",
 "u_nodrop": 'your browser is too old for drag-and-drop uploading',
 "u_notdir": "that's not a folder!\n\nyour browser is too old,\nplease try dragdrop instead",
 "u_uri": "to dragdrop images from other browser windows,\nplease drop it onto the big upload button",
tests/res/idp/7.conf (new file, 46 lines)
@@ -0,0 +1,46 @@
+# -*- mode: yaml -*-
+# vim: ft=yaml:
+
+[global]
+  idp-h-usr: x-idp-user
+  idp-h-grp: x-idp-group
+
+[/u/${u}]
+  /u/${u}
+  accs:
+    r: *
+
+[/uya/${u%+ga}]
+  /uya/${u}
+  accs:
+    r: *
+
+[/uyab/${u%+ga,%+gb}]
+  /uyab/${u}
+  accs:
+    r: *
+
+[/una/${u%-ga}]
+  /una/${u}
+  accs:
+    r: *
+
+[/unab/${u%-ga,%-gb}]
+  /unab/${u}
+  accs:
+    r: *
+
+[/gya/${g%+ga}]
+  /gya/${g}
+  accs:
+    r: *
+
+[/gna/${g%-ga}]
+  /gna/${g}
+  accs:
+    r: *
+
+[/gnab/${g%-ga,%-gb}]
+  /gnab/${g}
+  accs:
+    r: *
tests/res/idp/8.conf (new file, 47 lines)
@@ -0,0 +1,47 @@
+# -*- mode: yaml -*-
+# vim: ft=yaml:
+
+[groups]
+  ga: iua, iuab, iuabc
+  gb: iuab, iuabc, iub, iubc
+  gc: iuabc, iubc, iuc
+
+[/u/${u}]
+  /u/${u}
+  accs:
+    r: *
+
+[/uya/${u%+ga}]
+  /uya/${u}
+  accs:
+    r: *
+
+[/uyab/${u%+ga,%+gb}]
+  /uyab/${u}
+  accs:
+    r: *
+
+[/una/${u%-ga}]
+  /una/${u}
+  accs:
+    r: *
+
+[/unab/${u%-ga,%-gb}]
+  /unab/${u}
+  accs:
+    r: *
+
+[/gya/${g%+ga}]
+  /gya/${g}
+  accs:
+    r: *
+
+[/gna/${g%-ga}]
+  /gna/${g}
+  accs:
+    r: *
+
+[/gnab/${g%-ga,%-gb}]
+  /gnab/${g}
+  accs:
+    r: *
@@ -234,3 +234,74 @@ class TestVFS(unittest.TestCase):
         au.idp_checkin(None, "iud", "su")
         self.assertAxsAt(au, "team/su/iuc", [["iuc", "iud"]])
         self.assertAxsAt(au, "team/su/iud", [["iuc", "iud"]])
+
+    def test_7(self):
+        """
+        conditional idp-vols
+        """
+        _, cfgdir, xcfg = self.prep()
+        au = AuthSrv(Cfg(c=[cfgdir + "/7.conf"], **xcfg), self.log)
+        au.idp_checkin(None, "iua", "ga")
+        au.idp_checkin(None, "iuab", "ga,gb")
+        au.idp_checkin(None, "iuabc", "ga,gb,gc")
+        au.idp_checkin(None, "iub", "gb")
+        au.idp_checkin(None, "iubc", "gb,gc")
+        au.idp_checkin(None, "iuc", "gc")
+        zs = """
+u/iua
+u/iuab
+u/iuabc
+u/iub
+u/iubc
+u/iuc
+uya/iua
+uya/iuab
+uya/iuabc
+uyab/iuab
+uyab/iuabc
+una/iub
+una/iubc
+una/iuc
+unab/iuc
+gya/ga
+gna/gb
+gna/gc
+gnab/gc
+"""
+        zl1 = sorted(zs.strip().split("\n"))[:]
+        zl2 = sorted(list(au.vfs.all_vols))[:]
+        # print(" ".join(zl1))
+        # print(" ".join(zl2))
+        self.assertListEqual(zl1, zl2)
+
+    def test_8(self):
+        """
+        conditional non-idp vols
+        """
+        _, cfgdir, xcfg = self.prep()
+        xcfg = {"vc": True}
+        au = AuthSrv(Cfg(c=[cfgdir + "/8.conf"], **xcfg), self.log)
+        zs = """
+u/iua
+u/iuab
+u/iuabc
+u/iub
+u/iubc
+u/iuc
+uya/iua
+uya/iuab
+uya/iuabc
+uyab/iuab
+uyab/iuabc
+una/iub
+una/iubc
+una/iuc
+unab/iuc
+gya/ga
+gna/gb
+gna/gc
+gnab/gc
+"""
+        zl1 = sorted(zs.strip().split("\n"))[:]
+        zl2 = sorted(list(au.vfs.all_vols))[:]
+        self.assertListEqual(zl1, zl2)
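The expected volume lists in the conditional-vol tests above boil down to set logic over group membership. A minimal standalone sketch of that logic, assuming `%+g` means "only for users in group g" and `%-g` means "only for users in none of the listed groups" (an inference from the test data, not the actual copyparty implementation):

```python
# group memberships from tests/res/idp/8.conf
groups = {
    "ga": {"iua", "iuab", "iuabc"},
    "gb": {"iuab", "iuabc", "iub", "iubc"},
    "gc": {"iuabc", "iubc", "iuc"},
}
users = {"iua", "iuab", "iuabc", "iub", "iubc", "iuc"}

def vols(prefix, want=(), avoid=()):
    # a volume materializes for u iff u is in every "want" group
    # and in none of the "avoid" groups
    def ok(u):
        return all(u in groups[g] for g in want) and all(
            u not in groups[g] for g in avoid
        )
    return sorted(prefix + "/" + u for u in users if ok(u))

print(vols("uya", want=["ga"]))          # → ['uya/iua', 'uya/iuab', 'uya/iuabc']
print(vols("unab", avoid=["ga", "gb"]))  # → ['unab/iuc']
```

These outputs match the `uya/*` and `unab/*` entries the tests expect in `au.vfs.all_vols`.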
tests/test_shr.py (new file, 229 lines)
@@ -0,0 +1,229 @@
+#!/usr/bin/env python3
+# coding: utf-8
+from __future__ import print_function, unicode_literals
+
+import json
+import os
+import shutil
+import sqlite3
+import tempfile
+import unittest
+
+from copyparty.__init__ import ANYWIN
+from copyparty.authsrv import AuthSrv
+from copyparty.httpcli import HttpCli
+from copyparty.util import absreal
+from tests import util as tu
+from tests.util import Cfg
+
+
+class TestShr(unittest.TestCase):
+    def log(self, src, msg, c=0):
+        m = "%s" % (msg,)
+        if (
+            "warning: filesystem-path does not exist:" in m
+            or "you are sharing a system directory:" in m
+            or "symlink-based deduplication is enabled" in m
+            or m.startswith("hint: argument")
+        ):
+            return
+
+        print(("[%s] %s" % (src, msg)).encode("ascii", "replace").decode("ascii"))
+
+    def assertLD(self, url, auth, els, edl):
+        ls = self.ls(url, auth)
+        self.assertEqual(ls[0], len(els) == 2)
+        if not ls[0]:
+            return
+        a = [list(sorted(els[0])), list(sorted(els[1]))]
+        b = [list(sorted(ls[1])), list(sorted(ls[2]))]
+        self.assertEqual(a, b)
+
+        if edl is None:
+            edl = els[1]
+        can_dl = []
+        for fn in b[1]:
+            if fn == "a.db":
+                continue
+            furl = url + "/" + fn
+            if auth:
+                furl += "?pw=p1"
+            h, zb = self.curl(furl, True)
+            if h.startswith("HTTP/1.1 200 "):
+                can_dl.append(fn)
+        self.assertEqual(edl, can_dl)
+
+    def setUp(self):
+        self.td = tu.get_ramdisk()
+        td = os.path.join(self.td, "vfs")
+        os.mkdir(td)
+        os.chdir(td)
+        os.mkdir("d1")
+        os.mkdir("d2")
+        os.mkdir("d2/d3")
+        for zs in ("d1/f1", "d2/f2", "d2/d3/f3"):
+            with open(zs, "wb") as f:
+                f.write(zs.encode("utf-8"))
+            for dst in ("d1", "d2", "d2/d3"):
+                src, fn = zs.rsplit("/", 1)
+                os.symlink(absreal(zs), dst + "/l" + fn[-1:])
+
+        db = sqlite3.connect("a.db")
+        with db:
+            zs = r"create table sh (k text, pw text, vp text, pr text, st int, un text, t0 int, t1 int)"
+            db.execute(zs)
+        db.close()
+
+    def tearDown(self):
+        os.chdir(tempfile.gettempdir())
+        shutil.rmtree(self.td)
+
+    def cinit(self):
+        self.asrv = AuthSrv(self.args, self.log)
+        self.conn = tu.VHttpConn(self.args, self.asrv, self.log, b"", True)
+
+    def test1(self):
+        self.args = Cfg(
+            a=["u1:p1"],
+            v=["::A,u1", "d1:v1:A,u1", "d2/d3:d2/d3:A,u1"],
+            shr="/shr/",
+            shr1="shr/",
+            shr_db="a.db",
+            shr_v=False,
+        )
+        self.cinit()
+
+        self.assertLD("", True, [["d1", "d2", "v1"], ["a.db"]], [])
+        self.assertLD("d1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("v1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("d2", True, [["d3"], ["f2", "l1", "l2", "l3"]], None)
+        self.assertLD("d2/d3", True, [[], ["f3", "l1", "l2", "l3"]], None)
+        self.assertLD("d3", True, [], [])
+
+        jt = {
+            "k": "r",
+            "vp": ["/"],
+            "pw": "",
+            "exp": "99",
+            "perms": ["read"],
+        }
+        print(self.post_json("?pw=p1&share", jt)[1])
+        jt = {
+            "k": "d2",
+            "vp": ["/d2/"],
+            "pw": "",
+            "exp": "99",
+            "perms": ["read"],
+        }
+        print(self.post_json("?pw=p1&share", jt)[1])
+        self.conn.shutdown()
+        self.cinit()
+
+        self.assertLD("", True, [["d1", "d2", "v1"], ["a.db"]], [])
+        self.assertLD("d1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("v1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("d2", True, [["d3"], ["f2", "l1", "l2", "l3"]], None)
+        self.assertLD("d2/d3", True, [[], ["f3", "l1", "l2", "l3"]], None)
+        self.assertLD("d3", True, [], [])
+
+        self.assertLD("shr/d2", False, [[], ["f2", "l1", "l2", "l3"]], None)
+        self.assertLD("shr/d2/d3", False, [], None)
+
+        self.assertLD("shr/r", False, [["d1"], ["a.db"]], [])
+        self.assertLD("shr/r/d1", False, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("shr/r/d2", False, [], None)  # unfortunate
+        self.assertLD("shr/r/d2/d3", False, [], None)
+
+        self.conn.shutdown()
+
+    def test2(self):
+        self.args = Cfg(
+            a=["u1:p1"],
+            v=["::A,u1", "d1:v1:A,u1", "d2/d3:d2/d3:A,u1"],
+            shr="/shr/",
+            shr1="shr/",
+            shr_db="a.db",
+            shr_v=False,
+            xvol=True,
+        )
+        self.cinit()
+
+        self.assertLD("", True, [["d1", "d2", "v1"], ["a.db"]], [])
+        self.assertLD("d1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("v1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("d2", True, [["d3"], ["f2", "l1", "l2", "l3"]], None)
+        self.assertLD("d2/d3", True, [[], ["f3", "l1", "l2", "l3"]], None)
+        self.assertLD("d3", True, [], [])
+
+        jt = {
+            "k": "r",
+            "vp": ["/"],
+            "pw": "",
+            "exp": "99",
+            "perms": ["read"],
+        }
+        print(self.post_json("?pw=p1&share", jt)[1])
+        jt = {
+            "k": "d2",
+            "vp": ["/d2/"],
+            "pw": "",
+            "exp": "99",
+            "perms": ["read"],
+        }
+        print(self.post_json("?pw=p1&share", jt)[1])
+        self.conn.shutdown()
+        self.cinit()
+
+        self.assertLD("", True, [["d1", "d2", "v1"], ["a.db"]], [])
+        self.assertLD("d1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("v1", True, [[], ["f1", "l1", "l2", "l3"]], None)
+        self.assertLD("d2", True, [["d3"], ["f2", "l1", "l2", "l3"]], None)
+        self.assertLD("d2/d3", True, [[], ["f3", "l1", "l2", "l3"]], None)
+        self.assertLD("d3", True, [], [])
+
+        self.assertLD("shr/d2", False, [[], ["f2", "l1", "l2", "l3"]], ["f2", "l2"])
+        self.assertLD("shr/d2/d3", False, [], [])
+
+        self.assertLD("shr/r", False, [["d1"], ["a.db"]], [])
+        self.assertLD(
+            "shr/r/d1", False, [[], ["f1", "l1", "l2", "l3"]], ["f1", "l1", "l2"]
+        )
+        self.assertLD("shr/r/d2", False, [], [])  # unfortunate
+        self.assertLD("shr/r/d2/d3", False, [], [])
+
+        self.conn.shutdown()
+
+    def ls(self, url: str, auth: bool):
+        zs = url + "?ls" + ("&pw=p1" if auth else "")
+        h, b = self.curl(zs)
+        if not h.startswith("HTTP/1.1 200 "):
+            return (False, [], [])
+        jo = json.loads(b)
+        return (
+            True,
+            [x["href"].rstrip("/") for x in jo.get("dirs") or {}],
+            [x["href"] for x in jo.get("files") or {}],
+        )
+
+    def curl(self, url: str, binary=False):
+        h = "GET /%s HTTP/1.1\r\nConnection: close\r\n\r\n"
+        HttpCli(self.conn.setbuf((h % (url,)).encode("utf-8"))).run()
+        if binary:
+            h, b = self.conn.s._reply.split(b"\r\n\r\n", 1)
+            return [h.decode("utf-8"), b]
+
+        return self.conn.s._reply.decode("utf-8").split("\r\n\r\n", 1)
+
+    def post_json(self, url: str, data):
+        buf = json.dumps(data).encode("utf-8")
+        msg = [
+            "POST /%s HTTP/1.1" % (url,),
+            "Connection: close",
+            "Content-Type: application/json",
+            "Content-Length: %d" % (len(buf),),
+            "\r\n",
+        ]
+        buf = "\r\n".join(msg).encode("utf-8") + buf
+        print("PUT -->", buf)
+        HttpCli(self.conn.setbuf(buf)).run()
+        return self.conn.s._reply.decode("utf-8").split("\r\n\r\n", 1)
@@ -143,25 +143,25 @@ class Cfg(Namespace):
     def __init__(self, a=None, v=None, c=None, **ka0):
         ka = {}

-        ex = "chpw daw dav_auth dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink ih ihead magic hardlink_only nid nih no_acode no_athumb no_bauth no_clone no_cp no_dav no_db_ip no_del no_dirsz no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nsort nw og og_no_head og_s_title ohead q rand re_dirsz rss smb srch_dbg srch_excl stats uqe vague_403 vc ver wo_up_readme write_uplog xdev xlink xvol zipmaxu zs"
+        ex = "chpw daw dav_auth dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink hardlink_only ih ihead magic nid nih no_acode no_athumb no_bauth no_clone no_cp no_dav no_db_ip no_del no_dirsz no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tail no_tarcmp no_thumb no_vthumb no_zip nrand nsort nw og og_no_head og_s_title ohead q rand re_dirsz rmagic rss smb srch_dbg srch_excl stats uqe vague_403 vc ver wo_up_readme write_uplog xdev xlink xvol zipmaxu zs"
         ka.update(**{k: False for k in ex.split()})

-        ex = "dav_inf dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_up_list no_voldump re_dhash plain_ip"
+        ex = "dav_inf dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_up_list no_voldump re_dhash see_dots plain_ip"
         ka.update(**{k: True for k in ex.split()})

         ex = "ah_cli ah_gen css_browser dbpath hist ipu js_browser js_other mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua ua_nodoc ua_nozip"
         ka.update(**{k: None for k in ex.split()})

-        ex = "hash_mt hsortn safe_dedup srch_time u2abort u2j u2sz"
+        ex = "hash_mt hsortn qdel safe_dedup srch_time tail_fd tail_rate u2abort u2j u2sz"
         ka.update(**{k: 1 for k in ex.split()})

-        ex = "au_vol dl_list mtab_age reg_cap s_thead s_tbody th_convt ups_who zip_who"
+        ex = "au_vol dl_list mtab_age reg_cap s_thead s_tbody tail_tmax tail_who th_convt ups_who zip_who"
         ka.update(**{k: 9 for k in ex.split()})

-        ex = "db_act forget_ip k304 loris no304 re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo u2ow zipmaxn zipmaxs"
+        ex = "db_act forget_ip idp_store k304 loris no304 nosubtle re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo u2ow zipmaxn zipmaxs"
         ka.update(**{k: 0 for k in ex.split()})

-        ex = "ah_alg bname chpw_db doctitle df exit favico idp_h_usr ipa html_head lg_sba lg_sbf log_fk md_sba md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i shr tcolor textfiles unlist vname xff_src zipmaxt R RS SR"
+        ex = "ah_alg bname chmod_f chpw_db doctitle df exit favico idp_h_usr ipa html_head lg_sba lg_sbf log_fk md_sba md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i shr tcolor textfiles unlist vname xff_src zipmaxt R RS SR"
         ka.update(**{k: "" for k in ex.split()})

         ex = "ban_403 ban_404 ban_422 ban_pw ban_url spinner"
@@ -181,6 +181,7 @@ class Cfg(Namespace):
             c=c,
             E=E,
             bup_ck="sha512",
+            chmod_d="755",
             dbd="wal",
             dk_salt="b" * 16,
             fk_salt="a" * 16,
@@ -260,6 +261,9 @@ class VHub(object):
         self.is_dut = True
         self.up2k = Up2k(self)

+    def reload(self, a, b):
+        pass
+

 class VBrokerThr(BrokerThr):
     def __init__(self, hub):
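The `ex = "..."` strings in the `Cfg` diff above bulk-assign one default value to many flags at once. A minimal sketch of the same pattern, using a few of the flag names from this diff (the groupings mirror the diff; the `mkcfg` helper name is made up):

```python
from argparse import Namespace

def mkcfg(**overrides):
    # same trick as tests/util.py Cfg: one default per space-separated group
    ka = {}
    ka.update({k: False for k in "no_tail rmagic".split()})
    ka.update({k: True for k in "see_dots".split()})
    ka.update({k: 1 for k in "qdel tail_fd tail_rate".split()})
    ka.update(overrides)  # explicit args win over the bulk defaults
    return Namespace(**ka)

c = mkcfg(qdel=0)
print(c.qdel, c.no_tail, c.see_dots)  # → 0 False True
```

This keeps the test config in sync with new CLI flags by adding one word to a string rather than one keyword argument per flag.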