Compare commits
46 Commits
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
7eca90cc21 | ||
|
|
6ecf4fdceb | ||
|
|
8cae7a715b | ||
|
|
c75b0c25a6 | ||
|
|
9dd5dec093 | ||
|
|
ec05f8ccd5 | ||
|
|
a1c7a095ee | ||
|
|
77df17d191 | ||
|
|
fa5845ff5f | ||
|
|
17fa490687 | ||
|
|
1eff87c3bd | ||
|
|
d123d2bff0 | ||
|
|
5ac3864874 | ||
|
|
c599e2aaa3 | ||
|
|
2e53f7979a | ||
|
|
f61511d8c8 | ||
|
|
47415a7120 | ||
|
|
db7becacd2 | ||
|
|
28b63e587b | ||
|
|
9cb93ae1ed | ||
|
|
e3e51fb83a | ||
|
|
49c7124776 | ||
|
|
60fb1207fc | ||
|
|
48470f6b50 | ||
|
|
1d308eeb4c | ||
|
|
84f5f41747 | ||
|
|
19189afb34 | ||
|
|
23e77a3389 | ||
|
|
ecced0c4f2 | ||
|
|
d4a8071de5 | ||
|
|
261236e302 | ||
|
|
0de09860f6 | ||
|
|
bfb39969a4 | ||
|
|
256dad8cc0 | ||
|
|
a247ba9ca3 | ||
|
|
0a9a807772 | ||
|
|
41fa6b2552 | ||
|
|
f425ff51ae | ||
|
|
7cde9a2976 | ||
|
|
5dcd88a6c8 | ||
|
|
c3ef3fdc1f | ||
|
|
b9ba783c1c | ||
|
|
d1bca1f52f | ||
|
|
94352f278b | ||
|
|
4fb87ebe32 | ||
|
|
3cbb7243ab |
2
.github/ISSUE_TEMPLATE/bug_report.md
vendored
2
.github/ISSUE_TEMPLATE/bug_report.md
vendored
@@ -34,7 +34,7 @@ remove the ones that are not relevant:
|
||||
### Server details (if you're NOT using docker/podman)
|
||||
remove the ones that are not relevant:
|
||||
* **server OS / version:**
|
||||
* **what copyparty did you grab:** (sfx/exe/pip/aur/...)
|
||||
* **what copyparty did you grab:** (sfx/exe/pip/arch/...)
|
||||
* **how you're running it:** (in a terminal, as a systemd-service, ...)
|
||||
* run copyparty with `--version` and grab the last 3 lines (they start with `copyparty`, `CPython`, `sqlite`) and paste them below this line:
|
||||
* **copyparty arguments and/or config-file:**
|
||||
|
||||
51
README.md
51
README.md
@@ -8,12 +8,14 @@ turn almost any device into a file server with resumable uploads/downloads using
|
||||
* 🔌 protocols: [http](#the-browser) // [webdav](#webdav-server) // [ftp](#ftp-server) // [tftp](#tftp-server) // [smb/cifs](#smb-server)
|
||||
* 📱 [android app](#android-app) // [iPhone shortcuts](#ios-shortcuts)
|
||||
|
||||
👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
|
||||
👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running on a nuc in my basement
|
||||
|
||||
📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
|
||||
|
||||
🎬 **videos:** [upload](https://a.ocv.me/pub/demo/pics-vids/up2k.webm) // [cli-upload](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm) // [race-the-beam](https://a.ocv.me/pub/g/nerd-stuff/cpp/2024-0418-race-the-beam.webm)
|
||||
|
||||
made in Norway 🇳🇴
|
||||
|
||||
|
||||
## readme toc
|
||||
|
||||
@@ -54,6 +56,7 @@ turn almost any device into a file server with resumable uploads/downloads using
|
||||
* [creating a playlist](#creating-a-playlist) - with a standalone mediaplayer or copyparty
|
||||
* [audio equalizer](#audio-equalizer) - and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
|
||||
* [fix unreliable playback on android](#fix-unreliable-playback-on-android) - due to phone / app settings
|
||||
* [textfile viewer](#textfile-viewer) - with realtime streaming of logfiles and such ([demo](https://a.ocv.me/pub/demo/logtail/))
|
||||
* [markdown viewer](#markdown-viewer) - and there are *two* editors
|
||||
* [markdown vars](#markdown-vars) - dynamic docs with serverside variable expansion
|
||||
* [other tricks](#other-tricks)
|
||||
@@ -104,7 +107,7 @@ turn almost any device into a file server with resumable uploads/downloads using
|
||||
* [feature chickenbits](#feature-chickenbits) - buggy feature? rip it out
|
||||
* [feature beefybits](#feature-beefybits) - force-enable features with known issues on your OS/env
|
||||
* [packages](#packages) - the party might be closer than you think
|
||||
* [arch package](#arch-package) - now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
|
||||
* [arch package](#arch-package) - `pacman -S copyparty` (in [arch linux extra](https://archlinux.org/packages/extra/any/copyparty/))
|
||||
* [fedora package](#fedora-package) - does not exist yet
|
||||
* [nix package](#nix-package) - `nix profile install github:9001/copyparty`
|
||||
* [nixos module](#nixos-module)
|
||||
@@ -255,7 +258,8 @@ also see [comparison to similar software](./docs/versus.md)
|
||||
* ☑ play video files as audio (converted on server)
|
||||
* ☑ create and play [m3u8 playlists](#playlists)
|
||||
* ☑ image gallery with webm player
|
||||
* ☑ textfile browser with syntax hilighting
|
||||
* ☑ [textfile browser](#textfile-viewer) with syntax hilighting
|
||||
* ☑ realtime streaming of growing files (logfiles and such)
|
||||
* ☑ [thumbnails](#thumbnails)
|
||||
* ☑ ...of images using Pillow, pyvips, or FFmpeg
|
||||
* ☑ ...of videos using FFmpeg
|
||||
@@ -417,6 +421,9 @@ upgrade notes
|
||||
|
||||
"frequently" asked questions
|
||||
|
||||
* CopyParty?
|
||||
* nope! the name is either copyparty (all-lowercase) or Copyparty -- it's [one word](https://en.wiktionary.org/wiki/copyparty) after all :>
|
||||
|
||||
* can I change the 🌲 spinning pine-tree loading animation?
|
||||
* [yeah...](https://github.com/9001/copyparty/tree/hovudstraum/docs/rice#boring-loader-spinner) :-(
|
||||
|
||||
@@ -556,6 +563,8 @@ a client can request to see dotfiles in directory listings if global option `-ed
|
||||
|
||||
dotfiles do not appear in search results unless one of the above is true, **and** the global option / volflag `dotsrch` is set
|
||||
|
||||
> even if user has permission to see dotfiles, they are default-hidden unless `--see-dots` is set, and/or user has enabled the `dotfiles` option in the settings tab
|
||||
|
||||
config file example, where the same permission to see dotfiles is given in two different ways just for reference:
|
||||
|
||||
```yaml
|
||||
@@ -692,7 +701,10 @@ enabling `multiselect` lets you click files to select them, and then shift-click
|
||||
* `multiselect` is mostly intended for phones/tablets, but the `sel` option in the `[⚙️] settings` tab is better suited for desktop use, allowing selection by CTRL-clicking and range-selection with SHIFT-click, all without affecting regular clicking
|
||||
* the `sel` option can be made default globally with `--gsel` or per-volume with volflag `gsel`
|
||||
|
||||
to show `/icons/exe.png` as the thumbnail for all .exe files, `--ext-th=exe=/icons/exe.png` (optionally as a volflag)
|
||||
to show `/icons/exe.png` and `/icons/elf.gif` as the thumbnail for all `.exe` and `.elf` files respectively, do this: `--ext-th=exe=/icons/exe.png --ext-th=elf=/icons/elf.gif`
|
||||
* optionally as separate volflags for each mapping; see config file example below
|
||||
* the supported image formats are [jpg, png, gif, webp, ico](https://developer.mozilla.org/en-US/docs/Web/Media/Guides/Formats/Image_types)
|
||||
* be careful with svg; chrome will crash if you have too many unique svg files showing on the same page (the limit is 250 or so) -- showing the same handful of svg files thousands of times is ok however
|
||||
|
||||
config file example:
|
||||
|
||||
@@ -709,6 +721,7 @@ config file example:
|
||||
dthumb # disable ALL thumbnails and audio transcoding
|
||||
dvthumb # only disable video thumbnails
|
||||
ext-th: exe=/ico/exe.png # /ico/exe.png is the thumbnail of *.exe
|
||||
ext-th: elf=/ico/elf.gif # ...and /ico/elf.gif is used for *.elf
|
||||
th-covers: folder.png,folder.jpg,cover.png,cover.jpg # the default
|
||||
```
|
||||
|
||||
@@ -915,6 +928,7 @@ semi-intentional limitations:
|
||||
|
||||
* cleanup of expired shares only works when global option `e2d` is set, and/or at least one volume on the server has volflag `e2d`
|
||||
* only folders from the same volume are shared; if you are sharing a folder which contains other volumes, then the contents of those volumes will not be available
|
||||
* if you change [password hashing](#password-hashing) settings after creating a password-protected share, then that share will stop working
|
||||
* related to [IdP volumes being forgotten on shutdown](https://github.com/9001/copyparty/blob/hovudstraum/docs/idp.md#idp-volumes-are-forgotten-on-shutdown), any shares pointing into a user's IdP volume will be unavailable until that user makes their first request after a restart
|
||||
* no option to "delete after first access" because tricky
|
||||
* when linking something to discord (for example) it'll get accessed by their scraper and that would count as a hit
|
||||
@@ -1115,6 +1129,18 @@ not available on iPhones / iPads because AudioContext currently breaks backgroun
|
||||
due to phone / app settings, android phones may randomly stop playing music when the power saver kicks in, especially at the end of an album -- you can fix it by [disabling power saving](https://user-images.githubusercontent.com/241032/235262123-c328cca9-3930-4948-bd18-3949b9fd3fcf.png) in the [app settings](https://user-images.githubusercontent.com/241032/235262121-2ffc51ae-7821-4310-a322-c3b7a507890c.png) of the browser you use for music streaming (preferably a dedicated one)
|
||||
|
||||
|
||||
## textfile viewer
|
||||
|
||||
with realtime streaming of logfiles and such ([demo](https://a.ocv.me/pub/demo/logtail/)) , and terminal colors work too
|
||||
|
||||
click `-txt-` next to a textfile to open the viewer, which has the following toolbar buttons:
|
||||
|
||||
* `✏️ edit` opens the textfile editor
|
||||
* `📡 follow` starts monitoring the file for changes, streaming new lines in realtime
|
||||
* similar to `tail -f`
|
||||
* [link directly](https://a.ocv.me/pub/demo/logtail/?doc=lipsum.txt&tail) to a file with tailing enabled by adding `&tail` to the textviewer URL
|
||||
|
||||
|
||||
## markdown viewer
|
||||
|
||||
and there are *two* editors
|
||||
@@ -2203,10 +2229,14 @@ if your distro/OS is not mentioned below, there might be some hints in the [«on
|
||||
|
||||
## arch package
|
||||
|
||||
now [available on aur](https://aur.archlinux.org/packages/copyparty) maintained by [@icxes](https://github.com/icxes)
|
||||
`pacman -S copyparty` (in [arch linux extra](https://archlinux.org/packages/extra/any/copyparty/))
|
||||
|
||||
it comes with a [systemd service](./contrib/package/arch/copyparty.service) and expects to find one or more [config files](./docs/example.conf) in `/etc/copyparty.d/`
|
||||
|
||||
after installing it, you may want to `cp /usr/lib/systemd/system/copyparty.service /etc/systemd/system/` and then `vim /etc/systemd/system/copyparty.service` to change what user/group it is running as (you only need to do this once)
|
||||
|
||||
NOTE: there used to be an aur package; this evaporated when copyparty was adopted by the official archlinux repos. If you're still using the aur package, please move
|
||||
|
||||
|
||||
## fedora package
|
||||
|
||||
@@ -2409,6 +2439,9 @@ interact with copyparty using non-browser clients
|
||||
* and for screenshots on macos, see [./contrib/ishare.iscu](./contrib/#ishareiscu)
|
||||
* and for screenshots on linux, see [./contrib/flameshot.sh](./contrib/flameshot.sh)
|
||||
|
||||
* [Custom Uploader](https://f-droid.org/en/packages/com.nyx.custom_uploader/) (an Android app) as an alternative to copyparty's own [PartyUP!](#android-app)
|
||||
* works if you set UploadURL to `https://your.com/foo/?want=url&pw=hunter2` and FormDataName `f`
|
||||
|
||||
* contextlet (web browser integration); see [contrib contextlet](contrib/#send-to-cppcontextletjson)
|
||||
|
||||
* [igloo irc](https://iglooirc.com/): Method: `post` Host: `https://you.com/up/?want=url&pw=hunter2` Multipart: `yes` File parameter: `f`
|
||||
@@ -2510,6 +2543,11 @@ below are some tweaks roughly ordered by usefulness:
|
||||
|
||||
when uploading files,
|
||||
|
||||
* when uploading from very fast storage (NVMe SSD) with chrome/firefox, enable `[wasm]` in the `[⚙️] settings` tab to more effectively use all CPU-cores for hashing
|
||||
* don't do this on Safari (runs faster without)
|
||||
* don't do this on older browsers; likely to provoke browser-bugs (browser eats all RAM and crashes)
|
||||
* can be made default-enabled serverside with `--nosubtle 137` (chrome v137+) or `--nosubtle 2` (chrome+firefox)
|
||||
|
||||
* chrome is recommended (unfortunately), at least compared to firefox:
|
||||
* up to 90% faster when hashing, especially on SSDs
|
||||
* up to 40% faster when uploading over extremely fast internets
|
||||
@@ -2717,6 +2755,7 @@ set any of the following environment variables to disable its associated optiona
|
||||
| `PRTY_NO_CFSSL` | never attempt to generate self-signed certificates using [cfssl](https://github.com/cloudflare/cfssl) |
|
||||
| `PRTY_NO_FFMPEG` | **audio transcoding** goes byebye, **thumbnailing** must be handled by Pillow/libvips |
|
||||
| `PRTY_NO_FFPROBE` | **audio transcoding** goes byebye, **thumbnailing** must be handled by Pillow/libvips, **metadata-scanning** must be handled by mutagen |
|
||||
| `PRTY_NO_MAGIC` | do not use [magic](https://pypi.org/project/python-magic/) for filetype detection |
|
||||
| `PRTY_NO_MUTAGEN` | do not use [mutagen](https://pypi.org/project/mutagen/) for reading metadata from media files; will fallback to ffprobe |
|
||||
| `PRTY_NO_PIL` | disable all [Pillow](https://pypi.org/project/pillow/)-based thumbnail support; will fallback to libvips or ffmpeg |
|
||||
| `PRTY_NO_PILF` | disable Pillow `ImageFont` text rendering, used for folder thumbnails |
|
||||
@@ -2817,5 +2856,7 @@ if there's a wall of base64 in the log (thread stacks) then please include that,
|
||||
|
||||
for build instructions etc, see [./docs/devnotes.md](./docs/devnotes.md)
|
||||
|
||||
specifically you may want to [build the sfx](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#just-the-sfx) or [build from scratch](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#build-from-scratch)
|
||||
|
||||
see [./docs/TODO.md](./docs/TODO.md) for planned features / fixes / changes
|
||||
|
||||
|
||||
@@ -52,7 +52,7 @@ example usage as a volflag in a copyparty config file:
|
||||
### CONFIG
|
||||
|
||||
# filetypes to process; ignores everything else
|
||||
EXTS = "mp3 flac ogg opus m4a aac wav wma"
|
||||
EXTS = "mp3 flac ogg oga opus m4a aac wav wma"
|
||||
|
||||
# the name of the subdir to put the normalized files in
|
||||
SUBDIR = "normalized"
|
||||
|
||||
@@ -71,6 +71,9 @@ def main():
|
||||
## selecting it inside the print at the end:
|
||||
##
|
||||
|
||||
# move all uploads to one specific folder
|
||||
into_junk = {"vp": "/junk"}
|
||||
|
||||
# create a subfolder named after the filetype and move it into there
|
||||
into_subfolder = {"vp": ext}
|
||||
|
||||
@@ -92,8 +95,8 @@ def main():
|
||||
by_category = {} # no action
|
||||
|
||||
# now choose the default effect to apply; can be any of these:
|
||||
# into_subfolder into_toplevel into_sibling by_category
|
||||
effect = {"vp": "/junk"}
|
||||
# into_junk into_subfolder into_toplevel into_sibling by_category
|
||||
effect = into_sibling
|
||||
|
||||
##
|
||||
## but we can keep going, adding more speicifc rules
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
S_VERSION = "2.10"
|
||||
S_BUILD_DT = "2025-02-19"
|
||||
S_VERSION = "2.11"
|
||||
S_BUILD_DT = "2025-05-18"
|
||||
|
||||
"""
|
||||
u2c.py: upload to copyparty
|
||||
@@ -1289,7 +1289,7 @@ class Ctl(object):
|
||||
if self.ar.jw:
|
||||
print("%s %s" % (wark, vp))
|
||||
else:
|
||||
zd = datetime.datetime.fromtimestamp(file.lmod, UTC)
|
||||
zd = datetime.datetime.fromtimestamp(max(0, file.lmod), UTC)
|
||||
dt = "%04d-%02d-%02d %02d:%02d:%02d" % (
|
||||
zd.year,
|
||||
zd.month,
|
||||
|
||||
@@ -2,19 +2,38 @@
|
||||
# not accept more consecutive clients than what copyparty is able to;
|
||||
# nginx default is 512 (worker_processes 1, worker_connections 512)
|
||||
#
|
||||
# ======================================================================
|
||||
#
|
||||
# to reverse-proxy a specific path/subpath/location below a domain
|
||||
# (rather than a complete subdomain), for example "/qw/er", you must
|
||||
# run copyparty with --rp-loc /qw/as and also change the following:
|
||||
# location / {
|
||||
# proxy_pass http://cpp_tcp;
|
||||
# to this:
|
||||
# location /qw/er/ {
|
||||
# proxy_pass http://cpp_tcp/qw/er/;
|
||||
#
|
||||
# ======================================================================
|
||||
#
|
||||
# rarely, in some extreme usecases, it can be good to add -j0
|
||||
# (40'000 requests per second, or 20gbps upload/download in parallel)
|
||||
# but this is usually counterproductive and slightly buggy
|
||||
#
|
||||
# ======================================================================
|
||||
#
|
||||
# on fedora/rhel, remember to setsebool -P httpd_can_network_connect 1
|
||||
#
|
||||
# if you are behind cloudflare (or another protection service),
|
||||
# ======================================================================
|
||||
#
|
||||
# if you are behind cloudflare (or another CDN/WAF/protection service),
|
||||
# remember to reject all connections which are not coming from your
|
||||
# protection service -- for cloudflare in particular, you can
|
||||
# generate the list of permitted IP ranges like so:
|
||||
# (curl -s https://www.cloudflare.com/ips-v{4,6} | sed 's/^/allow /; s/$/;/'; echo; echo "deny all;") > /etc/nginx/cloudflare-only.conf
|
||||
#
|
||||
# and then enable it below by uncomenting the cloudflare-only.conf line
|
||||
#
|
||||
# ======================================================================
|
||||
|
||||
|
||||
upstream cpp_tcp {
|
||||
|
||||
@@ -1,29 +1,31 @@
|
||||
{ config, pkgs, lib, ... }:
|
||||
|
||||
with lib;
|
||||
|
||||
let
|
||||
{
|
||||
config,
|
||||
pkgs,
|
||||
lib,
|
||||
...
|
||||
}:
|
||||
with lib; let
|
||||
mkKeyValue = key: value:
|
||||
if value == true then
|
||||
# sets with a true boolean value are coerced to just the key name
|
||||
if value == true
|
||||
then
|
||||
# sets with a true boolean value are coerced to just the key name
|
||||
key
|
||||
else if value == false then
|
||||
# or omitted completely when false
|
||||
else if value == false
|
||||
then
|
||||
# or omitted completely when false
|
||||
""
|
||||
else
|
||||
(generators.mkKeyValueDefault { inherit mkValueString; } ": " key value);
|
||||
else (generators.mkKeyValueDefault {inherit mkValueString;} ": " key value);
|
||||
|
||||
mkAttrsString = value: (generators.toKeyValue { inherit mkKeyValue; } value);
|
||||
mkAttrsString = value: (generators.toKeyValue {inherit mkKeyValue;} value);
|
||||
|
||||
mkValueString = value:
|
||||
if isList value then
|
||||
(concatStringsSep ", " (map mkValueString value))
|
||||
else if isAttrs value then
|
||||
"\n" + (mkAttrsString value)
|
||||
else
|
||||
(generators.mkValueStringDefault { } value);
|
||||
if isList value
|
||||
then (concatStringsSep ", " (map mkValueString value))
|
||||
else if isAttrs value
|
||||
then "\n" + (mkAttrsString value)
|
||||
else (generators.mkValueStringDefault {} value);
|
||||
|
||||
mkSectionName = value: "[" + (escape [ "[" "]" ] value) + "]";
|
||||
mkSectionName = value: "[" + (escape ["[" "]"] value) + "]";
|
||||
|
||||
mkSection = name: attrs: ''
|
||||
${mkSectionName name}
|
||||
@@ -49,12 +51,12 @@ let
|
||||
${concatStringsSep "\n" (mapAttrsToList mkVolume cfg.volumes)}
|
||||
'';
|
||||
|
||||
name = "copyparty";
|
||||
cfg = config.services.copyparty;
|
||||
configFile = pkgs.writeText "${name}.conf" configStr;
|
||||
runtimeConfigPath = "/run/${name}/${name}.conf";
|
||||
home = "/var/lib/${name}";
|
||||
defaultShareDir = "${home}/data";
|
||||
configFile = pkgs.writeText "copyparty.conf" configStr;
|
||||
runtimeConfigPath = "/run/copyparty/copyparty.conf";
|
||||
externalCacheDir = "/var/cache/copyparty";
|
||||
externalStateDir = "/var/lib/copyparty";
|
||||
defaultShareDir = "${externalStateDir}/data";
|
||||
in {
|
||||
options.services.copyparty = {
|
||||
enable = mkEnableOption "web-based file manager";
|
||||
@@ -68,6 +70,35 @@ in {
|
||||
'';
|
||||
};
|
||||
|
||||
mkHashWrapper = mkOption {
|
||||
type = types.bool;
|
||||
default = true;
|
||||
description = ''
|
||||
Make a shell script wrapper called 'copyparty-hash' with all options set here,
|
||||
that launches the hashing cli.
|
||||
'';
|
||||
};
|
||||
|
||||
user = mkOption {
|
||||
type = types.str;
|
||||
default = "copyparty";
|
||||
description = ''
|
||||
The user that copyparty will run under.
|
||||
|
||||
If changed from default, you are responsible for making sure the user exists.
|
||||
'';
|
||||
};
|
||||
|
||||
group = mkOption {
|
||||
type = types.str;
|
||||
default = "copyparty";
|
||||
description = ''
|
||||
The group that copyparty will run under.
|
||||
|
||||
If changed from default, you are responsible for making sure the user exists.
|
||||
'';
|
||||
};
|
||||
|
||||
openFilesLimit = mkOption {
|
||||
default = 4096;
|
||||
type = types.either types.int types.str;
|
||||
@@ -79,22 +110,25 @@ in {
|
||||
description = ''
|
||||
Global settings to apply.
|
||||
Directly maps to values in the [global] section of the copyparty config.
|
||||
Cannot set "c" or "hist", those are set by this module.
|
||||
See `${getExe cfg.package} --help` for more details.
|
||||
'';
|
||||
default = {
|
||||
i = "127.0.0.1";
|
||||
no-reload = true;
|
||||
hist = externalCacheDir;
|
||||
};
|
||||
example = literalExpression ''
|
||||
{
|
||||
i = "0.0.0.0";
|
||||
no-reload = true;
|
||||
hist = ${externalCacheDir};
|
||||
}
|
||||
'';
|
||||
};
|
||||
|
||||
accounts = mkOption {
|
||||
type = types.attrsOf (types.submodule ({ ... }: {
|
||||
type = types.attrsOf (types.submodule ({...}: {
|
||||
options = {
|
||||
passwordFile = mkOption {
|
||||
type = types.str;
|
||||
@@ -109,7 +143,7 @@ in {
|
||||
description = ''
|
||||
A set of copyparty accounts to create.
|
||||
'';
|
||||
default = { };
|
||||
default = {};
|
||||
example = literalExpression ''
|
||||
{
|
||||
ed.passwordFile = "/run/keys/copyparty/ed";
|
||||
@@ -118,10 +152,10 @@ in {
|
||||
};
|
||||
|
||||
volumes = mkOption {
|
||||
type = types.attrsOf (types.submodule ({ ... }: {
|
||||
type = types.attrsOf (types.submodule ({...}: {
|
||||
options = {
|
||||
path = mkOption {
|
||||
type = types.str;
|
||||
type = types.path;
|
||||
description = ''
|
||||
Path of a directory to share.
|
||||
'';
|
||||
@@ -177,7 +211,7 @@ in {
|
||||
nohash = "\.iso$";
|
||||
};
|
||||
'';
|
||||
default = { };
|
||||
default = {};
|
||||
};
|
||||
};
|
||||
}));
|
||||
@@ -185,7 +219,7 @@ in {
|
||||
default = {
|
||||
"/" = {
|
||||
path = defaultShareDir;
|
||||
access = { r = "*"; };
|
||||
access = {r = "*";};
|
||||
};
|
||||
};
|
||||
example = literalExpression ''
|
||||
@@ -204,52 +238,65 @@ in {
|
||||
};
|
||||
};
|
||||
|
||||
config = mkIf cfg.enable {
|
||||
config = mkIf cfg.enable (let
|
||||
command = "${getExe cfg.package} -c ${runtimeConfigPath}";
|
||||
in {
|
||||
systemd.services.copyparty = {
|
||||
description = "http file sharing hub";
|
||||
wantedBy = [ "multi-user.target" ];
|
||||
wantedBy = ["multi-user.target"];
|
||||
|
||||
environment = {
|
||||
PYTHONUNBUFFERED = "true";
|
||||
XDG_CONFIG_HOME = "${home}/.config";
|
||||
XDG_CONFIG_HOME = externalStateDir;
|
||||
};
|
||||
|
||||
preStart = let
|
||||
replaceSecretCommand = name: attrs:
|
||||
"${getExe pkgs.replace-secret} '${
|
||||
passwordPlaceholder name
|
||||
}' '${attrs.passwordFile}' ${runtimeConfigPath}";
|
||||
replaceSecretCommand = name: attrs: "${getExe pkgs.replace-secret} '${
|
||||
passwordPlaceholder name
|
||||
}' '${attrs.passwordFile}' ${runtimeConfigPath}";
|
||||
in ''
|
||||
set -euo pipefail
|
||||
install -m 600 ${configFile} ${runtimeConfigPath}
|
||||
${concatStringsSep "\n"
|
||||
(mapAttrsToList replaceSecretCommand cfg.accounts)}
|
||||
(mapAttrsToList replaceSecretCommand cfg.accounts)}
|
||||
'';
|
||||
|
||||
serviceConfig = {
|
||||
Type = "simple";
|
||||
ExecStart = "${getExe cfg.package} -c ${runtimeConfigPath}";
|
||||
|
||||
ExecStart = command;
|
||||
# Hardening options
|
||||
User = "copyparty";
|
||||
Group = "copyparty";
|
||||
RuntimeDirectory = name;
|
||||
User = cfg.user;
|
||||
Group = cfg.group;
|
||||
RuntimeDirectory = ["copyparty"];
|
||||
RuntimeDirectoryMode = "0700";
|
||||
StateDirectory = [ name "${name}/data" "${name}/.config" ];
|
||||
StateDirectory = ["copyparty"];
|
||||
StateDirectoryMode = "0700";
|
||||
WorkingDirectory = home;
|
||||
CacheDirectory = lib.mkIf (cfg.settings ? hist) ["copyparty"];
|
||||
CacheDirectoryMode = lib.mkIf (cfg.settings ? hist) "0700";
|
||||
WorkingDirectory = externalStateDir;
|
||||
BindReadOnlyPaths =
|
||||
[
|
||||
"/nix/store"
|
||||
"-/etc/resolv.conf"
|
||||
"-/etc/nsswitch.conf"
|
||||
"-/etc/hosts"
|
||||
"-/etc/localtime"
|
||||
]
|
||||
++ (mapAttrsToList (k: v: "-${v.passwordFile}") cfg.accounts);
|
||||
BindPaths =
|
||||
(
|
||||
if cfg.settings ? hist
|
||||
then [cfg.settings.hist]
|
||||
else []
|
||||
)
|
||||
++ [externalStateDir]
|
||||
++ (mapAttrsToList (k: v: v.path) cfg.volumes);
|
||||
# ProtectSystem = "strict";
|
||||
# Note that unlike what 'ro' implies,
|
||||
# this actually makes it impossible to read anything in the root FS,
|
||||
# except for things explicitly mounted via `RuntimeDirectory`, `StateDirectory`, `CacheDirectory`, and `BindReadOnlyPaths`.
|
||||
# This is because TemporaryFileSystem creates a *new* *empty* filesystem for the process, so only bindmounts are visible.
|
||||
TemporaryFileSystem = "/:ro";
|
||||
BindReadOnlyPaths = [
|
||||
"/nix/store"
|
||||
"-/etc/resolv.conf"
|
||||
"-/etc/nsswitch.conf"
|
||||
"-/etc/hosts"
|
||||
"-/etc/localtime"
|
||||
] ++ (mapAttrsToList (k: v: "-${v.passwordFile}") cfg.accounts);
|
||||
BindPaths = [ home ] ++ (mapAttrsToList (k: v: v.path) cfg.volumes);
|
||||
# Would re-mount paths ignored by temporary root
|
||||
#ProtectSystem = "strict";
|
||||
ProtectHome = true;
|
||||
PrivateTmp = true;
|
||||
PrivateDevices = true;
|
||||
ProtectKernelTunables = true;
|
||||
@@ -269,15 +316,48 @@ in {
|
||||
NoNewPrivileges = true;
|
||||
LockPersonality = true;
|
||||
RestrictRealtime = true;
|
||||
MemoryDenyWriteExecute = true;
|
||||
};
|
||||
};
|
||||
|
||||
users.groups.copyparty = { };
|
||||
users.users.copyparty = {
|
||||
# ensure volumes exist:
|
||||
systemd.tmpfiles.settings."copyparty" = (
|
||||
lib.attrsets.mapAttrs' (
|
||||
name: value:
|
||||
lib.attrsets.nameValuePair (value.path) {
|
||||
d = {
|
||||
#: in front of things means it wont change it if the directory already exists.
|
||||
group = ":${cfg.group}";
|
||||
user = ":${cfg.user}";
|
||||
mode = ":755";
|
||||
};
|
||||
}
|
||||
)
|
||||
cfg.volumes
|
||||
);
|
||||
|
||||
users.groups.copyparty = lib.mkIf (cfg.user == "copyparty" && cfg.group == "copyparty") {};
|
||||
users.users.copyparty = lib.mkIf (cfg.user == "copyparty" && cfg.group == "copyparty") {
|
||||
description = "Service user for copyparty";
|
||||
group = "copyparty";
|
||||
home = home;
|
||||
home = externalStateDir;
|
||||
isSystemUser = true;
|
||||
};
|
||||
};
|
||||
environment.systemPackages = lib.mkIf cfg.mkHashWrapper [
|
||||
(pkgs.writeShellScriptBin
|
||||
"copyparty-hash"
|
||||
''
|
||||
set -a # automatically export variables
|
||||
# set same environment variables as the systemd service
|
||||
${lib.pipe config.systemd.services.copyparty.environment [
|
||||
(lib.filterAttrs (n: v: v != null && n != "PATH"))
|
||||
(lib.mapAttrs (_: v: "${v}"))
|
||||
(lib.toShellVars)
|
||||
]}
|
||||
PATH=${config.systemd.services.copyparty.environment.PATH}:$PATH
|
||||
|
||||
exec ${command} --ah-cli
|
||||
'')
|
||||
];
|
||||
});
|
||||
}
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
# Maintainer: icxes <dev.null@need.moe>
|
||||
pkgname=copyparty
|
||||
pkgver="1.16.21"
|
||||
pkgver="1.17.2"
|
||||
pkgrel=1
|
||||
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
|
||||
arch=("any")
|
||||
@@ -22,7 +22,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
|
||||
)
|
||||
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
|
||||
backup=("etc/${pkgname}.d/init" )
|
||||
sha256sums=("2e416e18dc854c65643b8aaedca56e0a5c5a03b0c3d45b7ff3f68daa38d8e9c6")
|
||||
sha256sums=("20af4a9b3188fee235c505af4a09190088d0094ab594e37ca1eabbda41c8912d")
|
||||
|
||||
build() {
|
||||
cd "${srcdir}/${pkgname}-${pkgver}"
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, argon2-cffi, pillow, pyvips, pyzmq, ffmpeg, mutagen,
|
||||
{ lib, stdenv, makeWrapper, fetchurl, util-linux, python, jinja2, impacket, pyftpdlib, pyopenssl, argon2-cffi, pillow, pyvips, pyzmq, ffmpeg, mutagen,
|
||||
|
||||
# use argon2id-hashed passwords in config files (sha2 is always available)
|
||||
withHashedPasswords ? true,
|
||||
@@ -61,7 +61,7 @@ in stdenv.mkDerivation {
|
||||
installPhase = ''
|
||||
install -Dm755 $src $out/share/copyparty-sfx.py
|
||||
makeWrapper ${pyEnv.interpreter} $out/bin/copyparty \
|
||||
--set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
|
||||
--set PATH '${lib.makeBinPath ([ util-linux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
|
||||
--add-flags "$out/share/copyparty-sfx.py"
|
||||
'';
|
||||
meta.mainProgram = "copyparty";
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
{
|
||||
"url": "https://github.com/9001/copyparty/releases/download/v1.16.21/copyparty-sfx.py",
|
||||
"version": "1.16.21",
|
||||
"hash": "sha256-+/f4g8J2Mv0l6ChXzbNJ84G8LeB+mP1UfkWzQxizd/g="
|
||||
"url": "https://github.com/9001/copyparty/releases/download/v1.17.2/copyparty-sfx.py",
|
||||
"version": "1.17.2",
|
||||
"hash": "sha256-qY3QTLthJLpvO9yo2kiwM2cT3eqg5/cGvGzp+JHy4Ko="
|
||||
}
|
||||
@@ -12,6 +12,23 @@ almost the same as minimal-up2k.html except this one...:
|
||||
|
||||
-- looks slightly better
|
||||
|
||||
|
||||
========================
|
||||
== USAGE INSTRUCTIONS ==
|
||||
|
||||
1. create a volume which anyone can read from (if you haven't already)
|
||||
2. copy this file into that volume, so anyone can download it
|
||||
3. enable the plugin by telling the webbrowser to load this file;
|
||||
assuming the URL to the public volume is /res/, and
|
||||
assuming you're using config-files, then add this to your config:
|
||||
|
||||
[global]
|
||||
js-browser: /res/minimal-up2k.js
|
||||
|
||||
alternatively, if you're not using config-files, then
|
||||
add the following commandline argument instead:
|
||||
--js-browser=/res/minimal-up2k.js
|
||||
|
||||
*/
|
||||
|
||||
var u2min = `
|
||||
|
||||
@@ -964,6 +964,7 @@ def add_general(ap, nc, srvname):
|
||||
ap2.add_argument("--name", metavar="TXT", type=u, default=srvname, help="server name (displayed topleft in browser and in mDNS)")
|
||||
ap2.add_argument("--mime", metavar="EXT=MIME", type=u, action="append", help="map file \033[33mEXT\033[0mension to \033[33mMIME\033[0mtype, for example [\033[32mjpg=image/jpeg\033[0m]")
|
||||
ap2.add_argument("--mimes", action="store_true", help="list default mimetype mapping and exit")
|
||||
ap2.add_argument("--rmagic", action="store_true", help="do expensive analysis to improve accuracy of returned mimetypes; will make file-downloads, rss, and webdav slower (volflag=rmagic)")
|
||||
ap2.add_argument("--license", action="store_true", help="show licenses and exit")
|
||||
ap2.add_argument("--version", action="store_true", help="show versions and exit")
|
||||
|
||||
@@ -1003,6 +1004,9 @@ def add_upload(ap):
|
||||
ap2 = ap.add_argument_group('upload options')
|
||||
ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads, hiding them from clients unless \033[33m-ed\033[0m")
|
||||
ap2.add_argument("--plain-ip", action="store_true", help="when avoiding filename collisions by appending the uploader's ip to the filename: append the plaintext ip instead of salting and hashing the ip")
|
||||
ap2.add_argument("--put-name", metavar="TXT", type=u, default="put-{now.6f}-{cip}.bin", help="filename for nameless uploads (when uploader doesn't provide a name); default is [\033[32mput-UNIXTIME-IP.bin\033[0m] (the \033[32m.6f\033[0m means six decimal places) (volflag=put_name)")
|
||||
ap2.add_argument("--put-ck", metavar="ALG", type=u, default="sha512", help="default checksum-hasher for PUT/WebDAV uploads: no / md5 / sha1 / sha256 / sha512 / b2 / blake2 / b2s / blake2s (volflag=put_ck)")
|
||||
ap2.add_argument("--bup-ck", metavar="ALG", type=u, default="sha512", help="default checksum-hasher for bup/basic-uploader: no / md5 / sha1 / sha256 / sha512 / b2 / blake2 / b2s / blake2s (volflag=bup_ck)")
|
||||
ap2.add_argument("--unpost", metavar="SEC", type=int, default=3600*12, help="grace period where uploads can be deleted by the uploader, even without delete permissions; 0=disabled, default=12h")
|
||||
ap2.add_argument("--u2abort", metavar="NUM", type=int, default=1, help="clients can abort incomplete uploads by using the unpost tab (requires \033[33m-e2d\033[0m). [\033[32m0\033[0m] = never allowed (disable feature), [\033[32m1\033[0m] = allow if client has the same IP as the upload AND is using the same account, [\033[32m2\033[0m] = just check the IP, [\033[32m3\033[0m] = just check account-name (volflag=u2abort)")
|
||||
ap2.add_argument("--blank-wt", metavar="SEC", type=int, default=300, help="file write grace period (any client can write to a blank file last-modified more recently than \033[33mSEC\033[0m seconds ago)")
|
||||
@@ -1025,6 +1029,7 @@ def add_upload(ap):
|
||||
ap2.add_argument("--df", metavar="GiB", type=u, default="0", help="ensure \033[33mGiB\033[0m free disk space by rejecting upload requests; assumes gigabytes unless a unit suffix is given: [\033[32m256m\033[0m], [\033[32m4\033[0m], [\033[32m2T\033[0m] (volflag=df)")
|
||||
ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
|
||||
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
|
||||
ap2.add_argument("--nosubtle", metavar="N", type=int, default=0, help="when to use a wasm-hasher instead of the browser's builtin; faster on chrome, but buggy in older chrome versions. [\033[32m0\033[0m] = only when necessary (non-https), [\033[32m1\033[0m] = always (all browsers), [\033[32m2\033[0m] = always on chrome/firefox, [\033[32m3\033[0m] = always on chrome, [\033[32mN\033[0m] = chrome-version N and newer (recommendation: 137)")
|
||||
ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good when latency is low (same-country), 2~4 for android-clients, 2~6 for cross-atlantic. Max is 6 in most browsers. Big values increase network-speed but may reduce HDD-speed")
|
||||
ap2.add_argument("--u2sz", metavar="N,N,N", type=u, default="1,64,96", help="web-client: default upload chunksize (MiB); sets \033[33mmin,default,max\033[0m in the settings gui. Each HTTP POST will aim for \033[33mdefault\033[0m, and never exceed \033[33mmax\033[0m. Cloudflare max is 96. Big values are good for cross-atlantic but may increase HDD fragmentation on some FS. Disable this optimization with [\033[32m1,1,1\033[0m]")
|
||||
ap2.add_argument("--u2ow", metavar="NUM", type=int, default=0, help="web-client: default setting for when to replace/overwrite existing files; [\033[32m0\033[0m]=never, [\033[32m1\033[0m]=if-client-newer, [\033[32m2\033[0m]=always (volflag=u2ow)")
|
||||
@@ -1266,6 +1271,7 @@ def add_optouts(ap):
|
||||
ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
|
||||
ap2.add_argument("--no-lifetime", action="store_true", help="do not allow clients (or server config) to schedule an upload to be deleted after a given time")
|
||||
ap2.add_argument("--no-pipe", action="store_true", help="disable race-the-beam (lockstep download of files which are currently being uploaded) (volflag=nopipe)")
|
||||
ap2.add_argument("--no-tail", action="store_true", help="disable streaming a growing files with ?tail (volflag=notail)")
|
||||
ap2.add_argument("--no-db-ip", action="store_true", help="do not write uploader-IP into the database; will also disable unpost, you may want \033[32m--forget-ip\033[0m instead (volflag=no_db_ip)")
|
||||
|
||||
|
||||
@@ -1379,8 +1385,8 @@ def add_thumbnail(ap):
|
||||
ap2.add_argument("--th-r-vips", metavar="T,T", type=u, default="avif,exr,fit,fits,fts,gif,hdr,heic,jp2,jpeg,jpg,jpx,jxl,nii,pfm,pgm,png,ppm,svg,tif,tiff,webp", help="image formats to decode using pyvips")
|
||||
ap2.add_argument("--th-r-ffi", metavar="T,T", type=u, default="apng,avif,avifs,bmp,cbz,dds,dib,fit,fits,fts,gif,hdr,heic,heics,heif,heifs,icns,ico,jp2,jpeg,jpg,jpx,jxl,pbm,pcx,pfm,pgm,png,pnm,ppm,psd,qoi,sgi,tga,tif,tiff,webp,xbm,xpm", help="image formats to decode using ffmpeg")
|
||||
ap2.add_argument("--th-r-ffv", metavar="T,T", type=u, default="3gp,asf,av1,avc,avi,flv,h264,h265,hevc,m4v,mjpeg,mjpg,mkv,mov,mp4,mpeg,mpeg2,mpegts,mpg,mpg2,mts,nut,ogm,ogv,rm,ts,vob,webm,wmv", help="video formats to decode using ffmpeg")
|
||||
ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,itgz,itxz,itz,m4a,mdgz,mdxz,mdz,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3m,s3gz,s3xz,s3z,tak,tta,ulaw,wav,wma,wv,xm,xmgz,xmxz,xmz,xpk", help="audio formats to decode using ffmpeg")
|
||||
ap2.add_argument("--th-spec-cnv", metavar="T,T", type=u, default="it,itgz,itxz,itz,mdgz,mdxz,mdz,mo3,mod,s3m,s3gz,s3xz,s3z,xm,xmgz,xmxz,xmz,xpk", help="audio formats which provoke https://trac.ffmpeg.org/ticket/10797 (huge ram usage for s3xmodit spectrograms)")
|
||||
ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,itgz,itxz,itz,m4a,mdgz,mdxz,mdz,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,oga,ogg,okt,opus,ra,s3m,s3gz,s3xz,s3z,tak,tta,ulaw,wav,wma,wv,xm,xmgz,xmxz,xmz,xpk", help="audio formats to decode using ffmpeg")
|
||||
ap2.add_argument("--th-spec-cnv", metavar="T", type=u, default="it,itgz,itxz,itz,mdgz,mdxz,mdz,mo3,mod,s3m,s3gz,s3xz,s3z,xm,xmgz,xmxz,xmz,xpk", help="audio formats which provoke https://trac.ffmpeg.org/ticket/10797 (huge ram usage for s3xmodit spectrograms)")
|
||||
ap2.add_argument("--au-unpk", metavar="E=F.C", type=u, default="mdz=mod.zip, mdgz=mod.gz, mdxz=mod.xz, s3z=s3m.zip, s3gz=s3m.gz, s3xz=s3m.xz, xmz=xm.zip, xmgz=xm.gz, xmxz=xm.xz, itz=it.zip, itgz=it.gz, itxz=it.xz, cbz=jpg.cbz", help="audio/image formats to decompress before passing to ffmpeg")
|
||||
|
||||
|
||||
@@ -1395,6 +1401,16 @@ def add_transcoding(ap):
|
||||
ap2.add_argument("--ac-maxage", metavar="SEC", type=int, default=86400, help="delete cached transcode output after \033[33mSEC\033[0m seconds")
|
||||
|
||||
|
||||
def add_tail(ap):
|
||||
ap2 = ap.add_argument_group('tailing options (realtime streaming of a growing file)')
|
||||
ap2.add_argument("--tail-who", metavar="LVL", type=int, default=2, help="who can tail? [\033[32m0\033[0m]=nobody, [\033[32m1\033[0m]=admins, [\033[32m2\033[0m]=authenticated-with-read-access, [\033[32m3\033[0m]=everyone-with-read-access (volflag=tail_who)")
|
||||
ap2.add_argument("--tail-cmax", metavar="N", type=int, default=64, help="do not allow starting a new tail if more than \033[33mN\033[0m active downloads")
|
||||
ap2.add_argument("--tail-tmax", metavar="SEC", type=float, default=0, help="terminate connection after \033[33mSEC\033[0m seconds; [\033[32m0\033[0m]=never (volflag=tail_tmax)")
|
||||
ap2.add_argument("--tail-rate", metavar="SEC", type=float, default=0.2, help="check for new data every \033[33mSEC\033[0m seconds (volflag=tail_rate)")
|
||||
ap2.add_argument("--tail-ka", metavar="SEC", type=float, default=3.0, help="send a zerobyte if connection is idle for \033[33mSEC\033[0m seconds to prevent disconnect")
|
||||
ap2.add_argument("--tail-fd", metavar="SEC", type=float, default=1.0, help="check if file was replaced (new fd) if idle for \033[33mSEC\033[0m seconds (volflag=tail_fd)")
|
||||
|
||||
|
||||
def add_rss(ap):
|
||||
ap2 = ap.add_argument_group('RSS options')
|
||||
ap2.add_argument("--rss", action="store_true", help="enable RSS output (experimental) (volflag=rss)")
|
||||
@@ -1490,6 +1506,7 @@ def add_ui(ap, retry):
|
||||
ap2.add_argument("--sort", metavar="C,C,C", type=u, default="href", help="default sort order, comma-separated column IDs (see header tooltips), prefix with '-' for descending. Examples: \033[32mhref -href ext sz ts tags/Album tags/.tn\033[0m (volflag=sort)")
|
||||
ap2.add_argument("--nsort", action="store_true", help="default-enable natural sort of filenames with leading numbers (volflag=nsort)")
|
||||
ap2.add_argument("--hsortn", metavar="N", type=int, default=2, help="number of sorting rules to include in media URLs by default (volflag=hsortn)")
|
||||
ap2.add_argument("--see-dots", action="store_true", help="default-enable seeing dotfiles; only takes effect if user has the necessary permissions")
|
||||
ap2.add_argument("--unlist", metavar="REGEX", type=u, default="", help="don't show files matching \033[33mREGEX\033[0m in file list. Purely cosmetic! Does not affect API calls, just the browser. Example: [\033[32m\\.(js|css)$\033[0m] (volflag=unlist)")
|
||||
ap2.add_argument("--favico", metavar="TXT", type=u, default="c 000 none" if retry else "🎉 000 none", help="\033[33mfavicon-text\033[0m [ \033[33mforeground\033[0m [ \033[33mbackground\033[0m ] ], set blank to disable")
|
||||
ap2.add_argument("--ext-th", metavar="E=VP", type=u, action="append", help="use thumbnail-image \033[33mVP\033[0m for file-extension \033[33mE\033[0m, example: [\033[32mexe=/.res/exe.png\033[0m] (volflag=ext_th)")
|
||||
@@ -1599,6 +1616,7 @@ def run_argparse(
|
||||
add_hooks(ap)
|
||||
add_stats(ap)
|
||||
add_txt(ap)
|
||||
add_tail(ap)
|
||||
add_og(ap)
|
||||
add_ui(ap, retry)
|
||||
add_admin(ap)
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
# coding: utf-8
|
||||
|
||||
VERSION = (1, 17, 0)
|
||||
CODENAME = "mixtape.m3u"
|
||||
BUILD_DT = (2025, 4, 26)
|
||||
VERSION = (1, 18, 0)
|
||||
CODENAME = "logtail"
|
||||
BUILD_DT = (2025, 6, 22)
|
||||
|
||||
S_VERSION = ".".join(map(str, VERSION))
|
||||
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
|
||||
|
||||
@@ -72,7 +72,9 @@ SSEELOG = " ({})".format(SEE_LOG)
|
||||
BAD_CFG = "invalid config; {}".format(SEE_LOG)
|
||||
SBADCFG = " ({})".format(BAD_CFG)
|
||||
|
||||
PTN_U_GRP = re.compile(r"\$\{u%([+-])([^}]+)\}")
|
||||
PTN_U_GRP = re.compile(r"\$\{u(%[+-][^}]+)\}")
|
||||
PTN_G_GRP = re.compile(r"\$\{g(%[+-][^}]+)\}")
|
||||
PTN_SIGIL = re.compile(r"(\${[ug][}%])")
|
||||
|
||||
|
||||
class CfgEx(Exception):
|
||||
@@ -357,7 +359,6 @@ class VFS(object):
|
||||
self.flags = flags # config options
|
||||
self.root = self
|
||||
self.dev = 0 # st_dev
|
||||
self.badcfg1 = False
|
||||
self.nodes: dict[str, VFS] = {} # child nodes
|
||||
self.histtab: dict[str, str] = {} # all realpath->histpath
|
||||
self.dbpaths: dict[str, str] = {} # all realpath->dbpath
|
||||
@@ -877,6 +878,7 @@ class AuthSrv(object):
|
||||
self.warn_anonwrite = warn_anonwrite
|
||||
self.line_ctr = 0
|
||||
self.indent = ""
|
||||
self.is_lxc = args.c == ["/z/initcfg"]
|
||||
|
||||
# fwd-decl
|
||||
self.vfs = VFS(log_func, "", "", "", AXS(), {})
|
||||
@@ -887,6 +889,8 @@ class AuthSrv(object):
|
||||
self.defpw: dict[str, str] = {}
|
||||
self.grps: dict[str, list[str]] = {}
|
||||
self.re_pwd: Optional[re.Pattern] = None
|
||||
self.cfg_files_loaded: list[str] = []
|
||||
self.badcfg1 = False
|
||||
|
||||
# all volumes observed since last restart
|
||||
self.idp_vols: dict[str, str] = {} # vpath->abspath
|
||||
@@ -963,15 +967,27 @@ class AuthSrv(object):
|
||||
un_gn = [("", "")]
|
||||
|
||||
for un, gn in un_gn:
|
||||
m = PTN_U_GRP.search(dst0)
|
||||
if m:
|
||||
req, gnc = m.groups()
|
||||
hit = gnc in (un_gns.get(un) or [])
|
||||
if req == "+":
|
||||
if not hit:
|
||||
continue
|
||||
elif hit:
|
||||
rejected = False
|
||||
for ptn in [PTN_U_GRP, PTN_G_GRP]:
|
||||
m = ptn.search(dst0)
|
||||
if not m:
|
||||
continue
|
||||
zs = m.group(1)
|
||||
zs = zs.replace(",%+", "\n%+")
|
||||
zs = zs.replace(",%-", "\n%-")
|
||||
for rule in zs.split("\n"):
|
||||
gnc = rule[2:]
|
||||
if ptn == PTN_U_GRP:
|
||||
# is user member of group?
|
||||
hit = gnc in (un_gns.get(un) or [])
|
||||
else:
|
||||
# is it this specific group?
|
||||
hit = gn == gnc
|
||||
|
||||
if rule.startswith("%+") != hit:
|
||||
rejected = True
|
||||
if rejected:
|
||||
continue
|
||||
|
||||
# if ap/vp has a user/group placeholder, make sure to keep
|
||||
# track so the same user/group is mapped when setting perms;
|
||||
@@ -986,6 +1002,8 @@ class AuthSrv(object):
|
||||
|
||||
src = src1.replace("${g}", gn or "\n")
|
||||
dst = dst1.replace("${g}", gn or "\n")
|
||||
src = PTN_G_GRP.sub(gn or "\n", src)
|
||||
dst = PTN_G_GRP.sub(gn or "\n", dst)
|
||||
if src == src1 and dst == dst1:
|
||||
gn = ""
|
||||
|
||||
@@ -1482,8 +1500,10 @@ class AuthSrv(object):
|
||||
daxs: dict[str, AXS] = {}
|
||||
mflags: dict[str, dict[str, Any]] = {} # vpath:flags
|
||||
mount: dict[str, tuple[str, str]] = {} # dst:src (vp:(ap,vp0))
|
||||
cfg_files_loaded: list[str] = []
|
||||
|
||||
self.idp_vols = {} # yolo
|
||||
self.badcfg1 = False
|
||||
|
||||
if self.args.a:
|
||||
# list of username:password
|
||||
@@ -1544,6 +1564,7 @@ class AuthSrv(object):
|
||||
zst = [(max(0, len(x) - 2) * " ") + "└" + x[-1] for x in zstt]
|
||||
t = "loaded {} config files:\n{}"
|
||||
self.log(t.format(len(zst), "\n".join(zst)))
|
||||
cfg_files_loaded = zst
|
||||
|
||||
except:
|
||||
lns = lns[: self.line_ctr]
|
||||
@@ -1568,9 +1589,14 @@ class AuthSrv(object):
|
||||
if not mount and not self.args.idp_h_usr:
|
||||
# -h says our defaults are CWD at root and read/write for everyone
|
||||
axs = AXS(["*"], ["*"], None, None)
|
||||
if os.path.exists("/z/initcfg"):
|
||||
t = "Read-access has been disabled due to failsafe: Docker detected, but the config does not define any volumes. This failsafe is to prevent unintended access if this is due to accidental loss of config. You can override this safeguard and allow read/write to all of /w/ by adding the following arguments to the docker container: -v .::rw"
|
||||
self.log(t, 1)
|
||||
if self.is_lxc:
|
||||
t = "Read-access has been disabled due to failsafe: Docker detected, but %s. This failsafe is to prevent unintended access if this is due to accidental loss of config. You can override this safeguard and allow read/write to all of /w/ by adding the following arguments to the docker container: -v .::rw"
|
||||
if len(cfg_files_loaded) == 1:
|
||||
self.log(t % ("no config-file was provided",), 1)
|
||||
t = "it is strongly recommended to add a config-file instead, for example based on https://github.com/9001/copyparty/blob/hovudstraum/docs/examples/docker/basic-docker-compose/copyparty.conf"
|
||||
self.log(t, 3)
|
||||
else:
|
||||
self.log(t % ("the config does not define any volumes",), 1)
|
||||
axs = AXS()
|
||||
elif self.args.c:
|
||||
t = "Read-access has been disabled due to failsafe: No volumes were defined by the config-file. This failsafe is to prevent unintended access if this is due to accidental loss of config. You can override this safeguard and allow read/write to the working-directory by adding the following arguments: -v .::rw"
|
||||
@@ -1578,7 +1604,7 @@ class AuthSrv(object):
|
||||
axs = AXS()
|
||||
vfs = VFS(self.log_func, absreal("."), "", "", axs, {})
|
||||
if not axs.uread:
|
||||
vfs.badcfg1 = True
|
||||
self.badcfg1 = True
|
||||
elif "" not in mount:
|
||||
# there's volumes but no root; make root inaccessible
|
||||
zsd = {"d2d": True, "tcolor": self.args.tcolor}
|
||||
@@ -1852,7 +1878,7 @@ class AuthSrv(object):
|
||||
is_shr = shr and zv.vpath.split("/")[0] == shr
|
||||
if histp and not is_shr and histp in rhisttab:
|
||||
zv2 = rhisttab[histp]
|
||||
t = "invalid config; multiple volumes share the same histpath (database+thumbnails location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
|
||||
t = "invalid config; multiple volumes share the same histpath (database+thumbnails location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: /%s [%s]"
|
||||
t = t % (histp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
|
||||
self.log(t, 1)
|
||||
raise Exception(t)
|
||||
@@ -1866,7 +1892,7 @@ class AuthSrv(object):
|
||||
is_shr = shr and zv.vpath.split("/")[0] == shr
|
||||
if dbp and not is_shr and dbp in rdbpaths:
|
||||
zv2 = rdbpaths[dbp]
|
||||
t = "invalid config; multiple volumes share the same dbpath (database location):\n dbpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
|
||||
t = "invalid config; multiple volumes share the same dbpath (database location):\n dbpath: %s\n volume 1: /%s [%s]\n volume 2: /%s [%s]"
|
||||
t = t % (dbp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
|
||||
self.log(t, 1)
|
||||
raise Exception(t)
|
||||
@@ -2049,12 +2075,13 @@ class AuthSrv(object):
|
||||
if vf not in vol.flags:
|
||||
vol.flags[vf] = getattr(self.args, ga)
|
||||
|
||||
zs = "forget_ip nrand u2abort u2ow ups_who zip_who"
|
||||
zs = "forget_ip nrand tail_who u2abort u2ow ups_who zip_who"
|
||||
for k in zs.split():
|
||||
if k in vol.flags:
|
||||
vol.flags[k] = int(vol.flags[k])
|
||||
|
||||
for k in ("convt",):
|
||||
zs = "convt tail_fd tail_rate tail_tmax"
|
||||
for k in zs.split():
|
||||
if k in vol.flags:
|
||||
vol.flags[k] = float(vol.flags[k])
|
||||
|
||||
@@ -2074,6 +2101,10 @@ class AuthSrv(object):
|
||||
if len(zs) == 3: # fc5 => ffcc55
|
||||
vol.flags["tcolor"] = "".join([x * 2 for x in zs])
|
||||
|
||||
# volflag syntax currently doesn't allow for ':' in value
|
||||
zs = vol.flags["put_name"]
|
||||
vol.flags["put_name2"] = zs.replace("{now.", "{now:.")
|
||||
|
||||
if vol.flags.get("neversymlink"):
|
||||
vol.flags["hardlinkonly"] = True # was renamed
|
||||
if vol.flags.get("hardlinkonly"):
|
||||
@@ -2379,7 +2410,7 @@ class AuthSrv(object):
|
||||
idp_vn, _ = vfs.get(idp_vp, "*", False, False)
|
||||
idp_vp0 = idp_vn.vpath0
|
||||
|
||||
sigils = set(re.findall(r"(\${[ug][}%])", idp_vp0))
|
||||
sigils = set(PTN_SIGIL.findall(idp_vp0))
|
||||
if len(sigils) > 1:
|
||||
t = '\nWARNING: IdP-volume "/%s" created by "/%s" has multiple IdP placeholders: %s'
|
||||
self.idp_warn.append(t % (idp_vp, idp_vp0, list(sigils)))
|
||||
@@ -2429,6 +2460,7 @@ class AuthSrv(object):
|
||||
self.defpw = defpw
|
||||
self.grps = grps
|
||||
self.iacct = {v: k for k, v in acct.items()}
|
||||
self.cfg_files_loaded = cfg_files_loaded
|
||||
|
||||
self.load_sessions()
|
||||
|
||||
@@ -2548,6 +2580,7 @@ class AuthSrv(object):
|
||||
"txt_ext": self.args.textfiles.replace(",", " "),
|
||||
"def_hcols": list(vf.get("mth") or []),
|
||||
"unlist0": vf.get("unlist") or "",
|
||||
"see_dots": self.args.see_dots,
|
||||
"dgrid": "grid" in vf,
|
||||
"dgsel": "gsel" in vf,
|
||||
"dnsort": "nsort" in vf,
|
||||
@@ -2559,6 +2592,7 @@ class AuthSrv(object):
|
||||
"idxh": int(self.args.ih),
|
||||
"themes": self.args.themes,
|
||||
"turbolvl": self.args.turbo,
|
||||
"nosubtle": self.args.nosubtle,
|
||||
"u2j": self.args.u2j,
|
||||
"u2sz": self.args.u2sz,
|
||||
"u2ts": vf["u2ts"],
|
||||
|
||||
@@ -1,13 +1,11 @@
|
||||
import calendar
|
||||
import errno
|
||||
import filecmp
|
||||
import json
|
||||
import os
|
||||
import shutil
|
||||
import time
|
||||
|
||||
from .__init__ import ANYWIN
|
||||
from .util import Netdev, load_resource, runcmd, wrename, wunlink
|
||||
from .util import Netdev, atomic_move, load_resource, runcmd, wunlink
|
||||
|
||||
HAVE_CFSSL = not os.environ.get("PRTY_NO_CFSSL")
|
||||
|
||||
@@ -122,7 +120,7 @@ def _gen_ca(log: "RootLogger", args):
|
||||
wunlink(nlog, bname + ".key", VF)
|
||||
except:
|
||||
pass
|
||||
wrename(nlog, bname + "-key.pem", bname + ".key", VF)
|
||||
atomic_move(nlog, bname + "-key.pem", bname + ".key", VF)
|
||||
wunlink(nlog, bname + ".csr", VF)
|
||||
|
||||
log("cert", "new ca OK", 2)
|
||||
@@ -215,7 +213,7 @@ def _gen_srv(log: "RootLogger", args, netdevs: dict[str, Netdev]):
|
||||
wunlink(nlog, bname + ".key", VF)
|
||||
except:
|
||||
pass
|
||||
wrename(nlog, bname + "-key.pem", bname + ".key", VF)
|
||||
atomic_move(nlog, bname + "-key.pem", bname + ".key", VF)
|
||||
wunlink(nlog, bname + ".csr", VF)
|
||||
|
||||
with open(os.path.join(args.crt_dir, "ca.pem"), "rb") as f:
|
||||
|
||||
@@ -22,6 +22,7 @@ def vf_bmap() -> dict[str, str]:
|
||||
"no_forget": "noforget",
|
||||
"no_pipe": "nopipe",
|
||||
"no_robots": "norobots",
|
||||
"no_tail": "notail",
|
||||
"no_thumb": "dthumb",
|
||||
"no_vthumb": "dvthumb",
|
||||
"no_athumb": "dathumb",
|
||||
@@ -51,6 +52,7 @@ def vf_bmap() -> dict[str, str]:
|
||||
"og_no_head",
|
||||
"og_s_title",
|
||||
"rand",
|
||||
"rmagic",
|
||||
"rss",
|
||||
"wo_up_readme",
|
||||
"xdev",
|
||||
@@ -75,6 +77,7 @@ def vf_vmap() -> dict[str, str]:
|
||||
"th_x3": "th3x",
|
||||
}
|
||||
for k in (
|
||||
"bup_ck",
|
||||
"dbd",
|
||||
"forget_ip",
|
||||
"hsortn",
|
||||
@@ -95,9 +98,15 @@ def vf_vmap() -> dict[str, str]:
|
||||
"og_title_i",
|
||||
"og_tpl",
|
||||
"og_ua",
|
||||
"put_ck",
|
||||
"put_name",
|
||||
"mv_retry",
|
||||
"rm_retry",
|
||||
"sort",
|
||||
"tail_fd",
|
||||
"tail_rate",
|
||||
"tail_tmax",
|
||||
"tail_who",
|
||||
"tcolor",
|
||||
"unlist",
|
||||
"u2abort",
|
||||
@@ -165,6 +174,9 @@ flagcats = {
|
||||
"daw": "enable full WebDAV write support (dangerous);\nPUT-operations will now \033[1;31mOVERWRITE\033[0;35m existing files",
|
||||
"nosub": "forces all uploads into the top folder of the vfs",
|
||||
"magic": "enables filetype detection for nameless uploads",
|
||||
"put_name": "fallback filename for nameless uploads",
|
||||
"put_ck": "default checksum-hasher for PUT/WebDAV uploads",
|
||||
"bup_ck": "default checksum-hasher for bup/basic uploads",
|
||||
"gz": "allows server-side gzip compression of uploads with ?gz",
|
||||
"xz": "allows server-side lzma compression of uploads with ?xz",
|
||||
"pk": "forces server-side compression, optional arg: xz,9",
|
||||
@@ -298,6 +310,13 @@ flagcats = {
|
||||
"exp_md": "placeholders to expand in markdown files; see --help",
|
||||
"exp_lg": "placeholders to expand in prologue/epilogue; see --help",
|
||||
},
|
||||
"tailing": {
|
||||
"notail": "disable ?tail (download a growing file continuously)",
|
||||
"tail_fd=1": "check if file was replaced (new fd) every 1 sec",
|
||||
"tail_rate=0.2": "check for new data every 0.2 sec",
|
||||
"tail_tmax=30": "kill connection after 30 sec",
|
||||
"tail_who=2": "restrict ?tail access (1=admins,2=authed,3=everyone)",
|
||||
},
|
||||
"others": {
|
||||
"dots": "allow all users with read-access to\nenable the option to show dotfiles in listings",
|
||||
"fk=8": 'generates per-file accesskeys,\nwhich are then required at the "g" permission;\nkeys are invalidated if filesize or inode changes',
|
||||
@@ -306,6 +325,7 @@ flagcats = {
|
||||
"dks": "per-directory accesskeys allow browsing into subdirs",
|
||||
"dky": 'allow seeing files (not folders) inside a specific folder\nwith "g" perm, and does not require a valid dirkey to do so',
|
||||
"rss": "allow '?rss' URL suffix (experimental)",
|
||||
"rmagic": "expensive analysis for mimetype accuracy",
|
||||
"ups_who=2": "restrict viewing the list of recent uploads",
|
||||
"zip_who=2": "restrict access to download-as-zip/tar",
|
||||
"zipmaxn=9k": "reject download-as-zip if more than 9000 files",
|
||||
|
||||
@@ -113,7 +113,6 @@ from .util import (
|
||||
vol_san,
|
||||
vroots,
|
||||
vsplit,
|
||||
wrename,
|
||||
wunlink,
|
||||
yieldfile,
|
||||
)
|
||||
@@ -190,11 +189,11 @@ class HttpCli(object):
|
||||
self.log_src = conn.log_src # mypy404
|
||||
self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
|
||||
self.tls: bool = hasattr(self.s, "cipher")
|
||||
self.is_vproxied = bool(self.args.R)
|
||||
|
||||
# placeholders; assigned by run()
|
||||
self.keepalive = False
|
||||
self.is_https = False
|
||||
self.is_vproxied = False
|
||||
self.in_hdr_recv = True
|
||||
self.headers: dict[str, str] = {}
|
||||
self.mode = " " # http verb
|
||||
@@ -402,7 +401,6 @@ class HttpCli(object):
|
||||
self.bad_xff = True
|
||||
else:
|
||||
self.ip = cli_ip
|
||||
self.is_vproxied = bool(self.args.R)
|
||||
self.log_src = self.conn.set_rproxy(self.ip)
|
||||
self.host = self.headers.get("x-forwarded-host") or self.host
|
||||
trusted_xff = True
|
||||
@@ -535,6 +533,7 @@ class HttpCli(object):
|
||||
else:
|
||||
t = "incorrect --rp-loc or webserver config; expected vpath starting with %r but got %r"
|
||||
self.log(t % (self.args.R, vpath), 1)
|
||||
self.is_vproxied = False
|
||||
|
||||
self.ouparam = uparam.copy()
|
||||
|
||||
@@ -1234,10 +1233,19 @@ class HttpCli(object):
|
||||
else:
|
||||
return self.tx_404(True)
|
||||
else:
|
||||
vfs = self.asrv.vfs
|
||||
if vfs.badcfg1:
|
||||
t = "<h2>access denied due to failsafe; check server log</h2>"
|
||||
html = self.j2s("splash", this=self, msg=t)
|
||||
if (
|
||||
self.asrv.badcfg1
|
||||
and "h" not in self.ouparam
|
||||
and "hc" not in self.ouparam
|
||||
):
|
||||
zs1 = "copyparty refused to start due to a failsafe: invalid server config; check server log"
|
||||
zs2 = 'you may <a href="/?h">access the controlpanel</a> but nothing will work until you shutdown the copyparty container and %s config-file (or provide the configuration as command-line arguments)'
|
||||
if self.asrv.is_lxc and len(self.asrv.cfg_files_loaded) == 1:
|
||||
zs2 = zs2 % ("add a",)
|
||||
else:
|
||||
zs2 = zs2 % ("fix the",)
|
||||
|
||||
html = self.j2s("msg", h1=zs1, h2=zs2)
|
||||
self.reply(html.encode("utf-8", "replace"), 500)
|
||||
return True
|
||||
|
||||
@@ -1404,7 +1412,13 @@ class HttpCli(object):
|
||||
except:
|
||||
pass
|
||||
|
||||
ap = ""
|
||||
use_magic = "rmagic" in self.vn.flags
|
||||
|
||||
for i in hits:
|
||||
if use_magic:
|
||||
ap = os.path.join(self.vn.realpath, i["rp"])
|
||||
|
||||
iurl = html_escape("%s%s" % (baseurl, i["rp"]), True, True)
|
||||
title = unquotep(i["rp"].split("?")[0].split("/")[-1])
|
||||
title = html_escape(title, True, True)
|
||||
@@ -1412,8 +1426,8 @@ class HttpCli(object):
|
||||
tag_a = str(i["tags"].get("artist") or "")
|
||||
desc = "%s - %s" % (tag_a, tag_t) if tag_t and tag_a else (tag_t or tag_a)
|
||||
desc = html_escape(desc, True, True) if desc else title
|
||||
mime = html_escape(guess_mime(title))
|
||||
lmod = formatdate(i["ts"])
|
||||
mime = html_escape(guess_mime(title, ap))
|
||||
lmod = formatdate(max(0, i["ts"]))
|
||||
zsa = (iurl, iurl, title, desc, lmod, iurl, mime, i["sz"])
|
||||
zs = (
|
||||
"""\
|
||||
@@ -1565,12 +1579,15 @@ class HttpCli(object):
|
||||
None, 207, "text/xml; charset=" + enc, {"Transfer-Encoding": "chunked"}
|
||||
)
|
||||
|
||||
ap = ""
|
||||
use_magic = "rmagic" in vn.flags
|
||||
|
||||
ret = '<?xml version="1.0" encoding="{}"?>\n<D:multistatus xmlns:D="DAV:">'
|
||||
ret = ret.format(uenc)
|
||||
for x in fgen:
|
||||
rp = vjoin(vtop, x["vp"])
|
||||
st: os.stat_result = x["st"]
|
||||
mtime = st.st_mtime
|
||||
mtime = max(0, st.st_mtime)
|
||||
if stat.S_ISLNK(st.st_mode):
|
||||
try:
|
||||
st = bos.stat(os.path.join(tap, x["vp"]))
|
||||
@@ -1591,7 +1608,9 @@ class HttpCli(object):
|
||||
"supportedlock": '<D:lockentry xmlns:D="DAV:"><D:lockscope><D:exclusive/></D:lockscope><D:locktype><D:write/></D:locktype></D:lockentry>',
|
||||
}
|
||||
if not isdir:
|
||||
pvs["getcontenttype"] = html_escape(guess_mime(rp))
|
||||
if use_magic:
|
||||
ap = os.path.join(tap, x["vp"])
|
||||
pvs["getcontenttype"] = html_escape(guess_mime(rp, ap))
|
||||
pvs["getcontentlength"] = str(st.st_size)
|
||||
|
||||
for k, v in pvs.items():
|
||||
@@ -2100,8 +2119,7 @@ class HttpCli(object):
|
||||
suffix = "-{:.6f}-{}".format(time.time(), self.dip())
|
||||
nameless = not fn
|
||||
if nameless:
|
||||
suffix += ".bin"
|
||||
fn = "put" + suffix
|
||||
fn = vfs.flags["put_name2"].format(now=time.time(), cip=self.dip())
|
||||
|
||||
params = {"suffix": suffix, "fdir": fdir}
|
||||
if self.args.nw:
|
||||
@@ -2181,28 +2199,26 @@ class HttpCli(object):
|
||||
# small toctou, but better than clobbering a hardlink
|
||||
wunlink(self.log, path, vfs.flags)
|
||||
|
||||
halg = "sha512"
|
||||
hasher = None
|
||||
copier = hashcopy
|
||||
if "ck" in self.ouparam or "ck" in self.headers:
|
||||
halg = zs = self.ouparam.get("ck") or self.headers.get("ck") or ""
|
||||
if not zs or zs == "no":
|
||||
copier = justcopy
|
||||
halg = ""
|
||||
elif zs == "md5":
|
||||
hasher = hashlib.md5(**USED4SEC)
|
||||
elif zs == "sha1":
|
||||
hasher = hashlib.sha1(**USED4SEC)
|
||||
elif zs == "sha256":
|
||||
hasher = hashlib.sha256(**USED4SEC)
|
||||
elif zs in ("blake2", "b2"):
|
||||
hasher = hashlib.blake2b(**USED4SEC)
|
||||
elif zs in ("blake2s", "b2s"):
|
||||
hasher = hashlib.blake2s(**USED4SEC)
|
||||
elif zs == "sha512":
|
||||
pass
|
||||
else:
|
||||
raise Pebkac(500, "unknown hash alg")
|
||||
halg = self.ouparam.get("ck") or self.headers.get("ck") or vfs.flags["put_ck"]
|
||||
if halg == "sha512":
|
||||
pass
|
||||
elif halg == "no":
|
||||
copier = justcopy
|
||||
halg = ""
|
||||
elif halg == "md5":
|
||||
hasher = hashlib.md5(**USED4SEC)
|
||||
elif halg == "sha1":
|
||||
hasher = hashlib.sha1(**USED4SEC)
|
||||
elif halg == "sha256":
|
||||
hasher = hashlib.sha256(**USED4SEC)
|
||||
elif halg in ("blake2", "b2"):
|
||||
hasher = hashlib.blake2b(**USED4SEC)
|
||||
elif halg in ("blake2s", "b2s"):
|
||||
hasher = hashlib.blake2s(**USED4SEC)
|
||||
else:
|
||||
raise Pebkac(500, "unknown hash alg")
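As shown above, PUT/WebDAV uploads now pick their checksum from the `ck` query-parameter or request-header, falling back to the new `put_ck` volflag instead of hardcoding sha512. A hedged client-side sketch (hypothetical URL and filename; assumes the `requests` library; the shape of the server's response is not assumed here):

```python
# hedged sketch: ask the server to hash a PUT upload with sha256 instead of
# the default; valid "ck" values per the code above are "no", "md5", "sha1",
# "sha256", "blake2"/"b2", "blake2s"/"b2s", and "sha512"
import requests  # assumption: requests is installed

BASE = "http://127.0.0.1:3923"  # hypothetical copyparty address

with open("backup.tar", "rb") as f:
    resp = requests.put(BASE + "/incoming/backup.tar", data=f, headers={"ck": "sha256"})

print(resp.status_code)
```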
|
||||
|
||||
f, fn = ren_open(fn, *open_a, **params)
|
||||
try:
|
||||
@@ -2591,10 +2607,6 @@ class HttpCli(object):
|
||||
x = self.conn.hsrv.broker.ask("up2k.handle_json", body, self.u2fh.aps)
|
||||
ret = x.get()
|
||||
|
||||
if self.is_vproxied:
|
||||
if "purl" in ret:
|
||||
ret["purl"] = self.args.SR + ret["purl"]
|
||||
|
||||
if self.args.shr and self.vpath.startswith(self.args.shr1):
|
||||
# strip common suffix (uploader's folder structure)
|
||||
vp_req, vp_vfs = vroots(self.vpath, vjoin(dbv.vpath, vrem))
|
||||
@@ -2604,6 +2616,10 @@ class HttpCli(object):
|
||||
raise Pebkac(500, t % zt)
|
||||
ret["purl"] = vp_req + ret["purl"][len(vp_vfs) :]
|
||||
|
||||
if self.is_vproxied:
|
||||
if "purl" in ret:
|
||||
ret["purl"] = self.args.SR + ret["purl"]
|
||||
|
||||
ret = json.dumps(ret)
|
||||
self.log(ret)
|
||||
self.reply(ret.encode("utf-8"), mime="application/json")
|
||||
@@ -2711,6 +2727,7 @@ class HttpCli(object):
|
||||
locked = chashes # remaining chunks to be received in this request
|
||||
written = [] # chunks written to disk, but not yet released by up2k
|
||||
num_left = -1 # num chunks left according to most recent up2k release
|
||||
bail1 = False # used in sad path to avoid contradicting error-text
|
||||
treport = time.time() # ratelimit up2k reporting to reduce overhead
|
||||
|
||||
if "x-up2k-subc" in self.headers:
|
||||
@@ -2849,7 +2866,6 @@ class HttpCli(object):
|
||||
except:
|
||||
# maybe busted handle (eg. disk went full)
|
||||
f.close()
|
||||
chashes = [] # exception flag
|
||||
raise
|
||||
finally:
|
||||
if locked:
|
||||
@@ -2858,13 +2874,14 @@ class HttpCli(object):
|
||||
num_left, t = x.get()
|
||||
if num_left < 0:
|
||||
self.loud_reply(t, status=500)
|
||||
if chashes: # kills exception bubbling otherwise
|
||||
return False
|
||||
bail1 = True
|
||||
else:
|
||||
t = "got %d more chunks, %d left"
|
||||
self.log(t % (len(written), num_left), 6)
|
||||
|
||||
if num_left < 0:
|
||||
if bail1:
|
||||
return False
|
||||
raise Pebkac(500, "unconfirmed; see serverlog")
|
||||
|
||||
if not num_left and fpool:
|
||||
@@ -2931,7 +2948,8 @@ class HttpCli(object):
|
||||
self.parser.drop()
|
||||
|
||||
self.log("logout " + self.uname)
|
||||
self.asrv.forget_session(self.conn.hsrv.broker, self.uname)
|
||||
if not self.uname.startswith("s_"):
|
||||
self.asrv.forget_session(self.conn.hsrv.broker, self.uname)
|
||||
self.get_pwd_cookie("x")
|
||||
|
||||
dst = self.args.SRS + "?h"
|
||||
@@ -3084,15 +3102,18 @@ class HttpCli(object):
|
||||
vfs, rem = self.asrv.vfs.get(self.vpath, self.uname, False, True)
|
||||
self._assert_safe_rem(rem)
|
||||
|
||||
halg = "sha512"
|
||||
hasher = None
|
||||
copier = hashcopy
|
||||
if nohash:
|
||||
halg = ""
|
||||
copier = justcopy
|
||||
elif "ck" in self.ouparam or "ck" in self.headers:
|
||||
halg = self.ouparam.get("ck") or self.headers.get("ck") or ""
|
||||
if not halg or halg == "no":
|
||||
else:
|
||||
copier = hashcopy
|
||||
halg = (
|
||||
self.ouparam.get("ck") or self.headers.get("ck") or vfs.flags["bup_ck"]
|
||||
)
|
||||
if halg == "sha512":
|
||||
pass
|
||||
elif halg == "no":
|
||||
copier = justcopy
|
||||
halg = ""
|
||||
elif halg == "md5":
|
||||
@@ -3105,8 +3126,6 @@ class HttpCli(object):
|
||||
hasher = hashlib.blake2b(**USED4SEC)
|
||||
elif halg in ("blake2s", "b2s"):
|
||||
hasher = hashlib.blake2s(**USED4SEC)
|
||||
elif halg == "sha512":
|
||||
pass
|
||||
else:
|
||||
raise Pebkac(500, "unknown hash alg")
|
||||
|
||||
@@ -3569,7 +3588,7 @@ class HttpCli(object):
|
||||
except:
|
||||
pass
|
||||
if dp:
|
||||
wrename(self.log, fp, os.path.join(dp, mfile2), vfs.flags)
|
||||
atomic_move(self.log, fp, os.path.join(dp, mfile2), vfs.flags)
|
||||
|
||||
assert self.parser.gen # !rm
|
||||
p_field, _, p_data = next(self.parser.gen)
|
||||
@@ -3806,6 +3825,20 @@ class HttpCli(object):
|
||||
|
||||
return txt
|
||||
|
||||
def _can_tail(self, volflags: dict[str, Any]) -> bool:
|
||||
zp = self.args.ua_nodoc
|
||||
if zp and zp.search(self.ua):
|
||||
t = "this URL contains no valuable information for bots/crawlers"
|
||||
raise Pebkac(403, t)
|
||||
lvl = volflags["tail_who"]
|
||||
if "notail" in volflags or not lvl:
|
||||
raise Pebkac(400, "tail is disabled in server config")
|
||||
elif lvl <= 1 and not self.can_admin:
|
||||
raise Pebkac(400, "tail is admin-only on this server")
|
||||
elif lvl <= 2 and self.uname in ("", "*"):
|
||||
raise Pebkac(400, "you must be authenticated to use ?tail on this server")
|
||||
return True
|
||||
|
||||
def _can_zip(self, volflags: dict[str, Any]) -> str:
|
||||
lvl = volflags["zip_who"]
|
||||
if self.args.no_zip or not lvl:
|
||||
@@ -3950,6 +3983,8 @@ class HttpCli(object):
|
||||
logmsg = "{:4} {} ".format("", self.req)
|
||||
logtail = ""
|
||||
|
||||
is_tail = "tail" in self.uparam and self._can_tail(self.vn.flags)
|
||||
|
||||
if ptop is not None:
|
||||
ap_data = "<%s>" % (req_path,)
|
||||
try:
|
||||
@@ -3980,7 +4015,7 @@ class HttpCli(object):
|
||||
if ptop is not None:
|
||||
assert job and ap_data # type: ignore # !rm
|
||||
sz = job["size"]
|
||||
file_ts = job["lmod"]
|
||||
file_ts = max(0, job["lmod"])
|
||||
editions["plain"] = (ap_data, sz)
|
||||
break
|
||||
|
||||
@@ -4063,6 +4098,7 @@ class HttpCli(object):
|
||||
and can_range
|
||||
and file_sz
|
||||
and "," not in hrange
|
||||
and not is_tail
|
||||
):
|
||||
try:
|
||||
if not hrange.lower().startswith("bytes"):
|
||||
@@ -4131,6 +4167,8 @@ class HttpCli(object):
|
||||
mime = "text/plain; charset={}".format(self.uparam["txt"] or "utf-8")
|
||||
elif "mime" in self.uparam:
|
||||
mime = str(self.uparam.get("mime"))
|
||||
elif "rmagic" in self.vn.flags:
|
||||
mime = guess_mime(req_path, fs_path)
|
||||
else:
|
||||
mime = guess_mime(req_path)
|
||||
|
||||
@@ -4148,13 +4186,18 @@ class HttpCli(object):
|
||||
return True
|
||||
|
||||
dls = self.conn.hsrv.dls
|
||||
if is_tail:
|
||||
upper = 1 << 30
|
||||
if len(dls) > self.args.tail_cmax:
|
||||
raise Pebkac(400, "too many active downloads to start a new tail")
|
||||
|
||||
if upper - lower > 0x400000: # 4m
|
||||
now = time.time()
|
||||
self.dl_id = "%s:%s" % (self.ip, self.addr[1])
|
||||
dls[self.dl_id] = (now, 0)
|
||||
self.conn.hsrv.dli[self.dl_id] = (
|
||||
now,
|
||||
upper - lower,
|
||||
0 if is_tail else upper - lower,
|
||||
self.vn,
|
||||
self.vpath,
|
||||
self.uname,
|
||||
@@ -4165,6 +4208,9 @@ class HttpCli(object):
|
||||
return self.tx_pipe(
|
||||
ptop, req_path, ap_data, job, lower, upper, status, mime, logmsg
|
||||
)
|
||||
elif is_tail:
|
||||
self.tx_tail(open_args, status, mime)
|
||||
return False
|
||||
|
||||
ret = True
|
||||
with open_func(*open_args) as f:
|
||||
@@ -4194,6 +4240,133 @@ class HttpCli(object):
|
||||
|
||||
return ret
|
||||
|
||||
def tx_tail(
|
||||
self,
|
||||
open_args: list[Any],
|
||||
status: int,
|
||||
mime: str,
|
||||
) -> None:
|
||||
vf = self.vn.flags
|
||||
self.send_headers(length=None, status=status, mime=mime)
|
||||
abspath: bytes = open_args[0]
|
||||
sec_rate = vf["tail_rate"]
|
||||
sec_max = vf["tail_tmax"]
|
||||
sec_fd = vf["tail_fd"]
|
||||
sec_ka = self.args.tail_ka
|
||||
wr_slp = self.args.s_wr_slp
|
||||
wr_sz = self.args.s_wr_sz
|
||||
dls = self.conn.hsrv.dls
|
||||
dl_id = self.dl_id
|
||||
|
||||
# non-numeric = full file from start
|
||||
# positive = absolute offset from start
|
||||
# negative = start that many bytes from eof
|
||||
try:
|
||||
ofs = int(self.uparam["tail"])
|
||||
except:
|
||||
ofs = 0
|
||||
|
||||
t0 = time.time()
|
||||
ofs0 = ofs
|
||||
f = None
|
||||
try:
|
||||
st = os.stat(abspath)
|
||||
f = open(*open_args)
|
||||
f.seek(0, os.SEEK_END)
|
||||
eof = f.tell()
|
||||
f.seek(0)
|
||||
if ofs < 0:
|
||||
ofs = max(0, ofs + eof)
|
||||
|
||||
self.log("tailing from byte %d: %r" % (ofs, abspath), 6)
|
||||
|
||||
# send initial data asap
|
||||
remains = sendfile_py(
|
||||
self.log, # d/c
|
||||
ofs,
|
||||
eof,
|
||||
f,
|
||||
self.s,
|
||||
wr_sz,
|
||||
wr_slp,
|
||||
False, # d/c
|
||||
dls,
|
||||
dl_id,
|
||||
)
|
||||
sent = (eof - ofs) - remains
|
||||
ofs = eof - remains
|
||||
f.seek(ofs)
|
||||
|
||||
try:
|
||||
st2 = os.stat(open_args[0])
|
||||
if st.st_ino == st2.st_ino:
|
||||
st = st2 # for filesize
|
||||
except:
|
||||
pass
|
||||
|
||||
gone = 0
|
||||
t_fd = t_ka = time.time()
|
||||
while True:
|
||||
assert f # !rm
|
||||
buf = f.read(4096)
|
||||
now = time.time()
|
||||
|
||||
if sec_max and now - t0 >= sec_max:
|
||||
self.log("max duration exceeded; kicking client", 6)
|
||||
zb = b"\n\n*** max duration exceeded; disconnecting ***\n"
|
||||
self.s.sendall(zb)
|
||||
break
|
||||
|
||||
if buf:
|
||||
t_fd = t_ka = now
|
||||
self.s.sendall(buf)
|
||||
sent += len(buf)
|
||||
dls[dl_id] = (time.time(), sent)
|
||||
continue
|
||||
|
||||
time.sleep(sec_rate)
|
||||
if t_ka < now - sec_ka:
|
||||
t_ka = now
|
||||
self.s.send(b"\x00")
|
||||
if t_fd < now - sec_fd:
|
||||
try:
|
||||
st2 = os.stat(open_args[0])
|
||||
if (
|
||||
st2.st_ino != st.st_ino
|
||||
or st2.st_size < sent
|
||||
or st2.st_size < st.st_size
|
||||
):
|
||||
assert f # !rm
|
||||
# open new file before closing previous to avoid toctous (open may fail; cannot null f before)
|
||||
f2 = open(*open_args)
|
||||
f.close()
|
||||
f = f2
|
||||
f.seek(0, os.SEEK_END)
|
||||
eof = f.tell()
|
||||
if eof < sent:
|
||||
ofs = sent = 0 # shrunk; send from start
|
||||
zb = b"\n\n*** file size decreased -- rewinding to the start of the file ***\n\n"
|
||||
self.s.sendall(zb)
|
||||
if ofs0 < 0 and eof > -ofs0:
|
||||
ofs = eof + ofs0
|
||||
else:
|
||||
ofs = sent # just new fd? resume from same ofs
|
||||
f.seek(ofs)
|
||||
self.log("reopened at byte %d: %r" % (ofs, abspath), 6)
|
||||
gone = 0
|
||||
st = st2
|
||||
except:
|
||||
gone += 1
|
||||
if gone > 3:
|
||||
self.log("file deleted; disconnecting")
|
||||
break
|
||||
except IOError as ex:
|
||||
if ex.errno not in (errno.EPIPE, errno.ESHUTDOWN, errno.EBADFD):
|
||||
raise
|
||||
finally:
|
||||
if f:
|
||||
f.close()
|
||||
|
||||
def tx_pipe(
|
||||
self,
|
||||
ptop: str,
|
||||
@@ -4754,7 +4927,6 @@ class HttpCli(object):
|
||||
if zi == 2 or (zi == 1 and self.avol):
|
||||
dl_list = self.get_dls()
|
||||
for t0, t1, sent, sz, vp, dl_id, uname in dl_list:
|
||||
rem = sz - sent
|
||||
td = max(0.1, now - t0)
|
||||
rd, fn = vsplit(vp)
|
||||
if not rd:
|
||||
@@ -5504,6 +5676,7 @@ class HttpCli(object):
|
||||
raise Pebkac(400, "selected file not found on disk: [%s]" % (fn,))
|
||||
|
||||
pw = req.get("pw") or ""
|
||||
pw = self.asrv.ah.hash(pw)
|
||||
now = int(time.time())
|
||||
sexp = req["exp"]
|
||||
exp = int(sexp) if sexp else 0
|
||||
@@ -6121,7 +6294,7 @@ class HttpCli(object):
|
||||
margin = "-"
|
||||
|
||||
sz = inf.st_size
|
||||
zd = datetime.fromtimestamp(linf.st_mtime, UTC)
|
||||
zd = datetime.fromtimestamp(max(0, linf.st_mtime), UTC)
|
||||
dt = "%04d-%02d-%02d %02d:%02d:%02d" % (
|
||||
zd.year,
|
||||
zd.month,
|
||||
|
||||
@@ -17,6 +17,9 @@ if True: # pylint: disable=using-constant-test
|
||||
from .util import NamedLogger
|
||||
|
||||
|
||||
TAR_NO_OPUS = set("aac|m4a|mp3|oga|ogg|opus|wma".split("|"))
|
||||
|
||||
|
||||
class StreamArc(object):
|
||||
def __init__(
|
||||
self,
|
||||
@@ -82,9 +85,7 @@ def enthumb(
|
||||
) -> dict[str, Any]:
|
||||
rem = f["vp"]
|
||||
ext = rem.rsplit(".", 1)[-1].lower()
|
||||
if (fmt == "mp3" and ext == "mp3") or (
|
||||
fmt == "opus" and ext in "aac|m4a|mp3|ogg|opus|wma".split("|")
|
||||
):
|
||||
if (fmt == "mp3" and ext == "mp3") or (fmt == "opus" and ext in TAR_NO_OPUS):
|
||||
raise Exception()
|
||||
|
||||
vp = vjoin(vtop, rem.split("/", 1)[1])
|
||||
|
||||
@@ -284,6 +284,7 @@ class Tftpd(object):
|
||||
if not ptn or not ptn.match(fn.lower()):
|
||||
return None
|
||||
|
||||
tsdt = datetime.fromtimestamp
|
||||
vn, rem = self.asrv.vfs.get(vpath, "*", True, False)
|
||||
fsroot, vfs_ls, vfs_virt = vn.ls(
|
||||
rem,
|
||||
@@ -296,7 +297,7 @@ class Tftpd(object):
|
||||
dirs1 = [(v.st_mtime, v.st_size, k + "/") for k, v in vfs_ls if k in dnames]
|
||||
fils1 = [(v.st_mtime, v.st_size, k) for k, v in vfs_ls if k not in dnames]
|
||||
real1 = dirs1 + fils1
|
||||
realt = [(datetime.fromtimestamp(mt, UTC), sz, fn) for mt, sz, fn in real1]
|
||||
realt = [(tsdt(max(0, mt), UTC), sz, fn) for mt, sz, fn in real1]
|
||||
reals = [
|
||||
(
|
||||
"%04d-%02d-%02d %02d:%02d:%02d"
|
||||
|
||||
@@ -24,13 +24,13 @@ from .util import (
|
||||
Cooldown,
|
||||
Daemon,
|
||||
afsenc,
|
||||
atomic_move,
|
||||
fsenc,
|
||||
min_ex,
|
||||
runcmd,
|
||||
statdir,
|
||||
ub64enc,
|
||||
vsplit,
|
||||
wrename,
|
||||
wunlink,
|
||||
)
|
||||
|
||||
@@ -412,7 +412,7 @@ class ThumbSrv(object):
|
||||
wunlink(self.log, ap_unpk, vn.flags)
|
||||
|
||||
try:
|
||||
wrename(self.log, ttpath, tpath, vn.flags)
|
||||
atomic_move(self.log, ttpath, tpath, vn.flags)
|
||||
except Exception as ex:
|
||||
if not os.path.exists(tpath):
|
||||
t = "failed to move [%s] to [%s]: %r"
|
||||
@@ -677,7 +677,7 @@ class ThumbSrv(object):
|
||||
except:
|
||||
pass
|
||||
else:
|
||||
wrename(self.log, wtpath, tpath, vn.flags)
|
||||
atomic_move(self.log, wtpath, tpath, vn.flags)
|
||||
|
||||
def conv_spec(self, abspath: str, tpath: str, fmt: str, vn: VFS) -> None:
|
||||
ret, _ = ffprobe(abspath, int(vn.flags["convt"] / 2))
|
||||
|
||||
@@ -1119,7 +1119,7 @@ class Up2k(object):
|
||||
ft = "\033[0;32m{}{:.0}"
|
||||
ff = "\033[0;35m{}{:.0}"
|
||||
fv = "\033[0;36m{}:\033[90m{}"
|
||||
zs = "ext_th_d html_head mv_re_r mv_re_t rm_re_r rm_re_t srch_re_dots srch_re_nodot zipmax zipmaxn_v zipmaxs_v"
|
||||
zs = "ext_th_d html_head put_name2 mv_re_r mv_re_t rm_re_r rm_re_t srch_re_dots srch_re_nodot zipmax zipmaxn_v zipmaxs_v"
|
||||
fx = set(zs.split())
|
||||
fd = vf_bmap()
|
||||
fd.update(vf_cmap())
|
||||
@@ -2120,11 +2120,12 @@ class Up2k(object):
|
||||
return -1
|
||||
|
||||
w = bw[:-1].decode("ascii")
|
||||
w16 = w[:16]
|
||||
|
||||
with self.mutex:
|
||||
try:
|
||||
q = "select rd, fn, ip, at from up where substr(w,1,16)=? and +w=?"
|
||||
rd, fn, ip, at = cur.execute(q, (w[:16], w)).fetchone()
|
||||
rd, fn, ip, at = cur.execute(q, (w16, w)).fetchone()
|
||||
except:
|
||||
# file modified/deleted since spooling
|
||||
continue
|
||||
@@ -2133,8 +2134,12 @@ class Up2k(object):
|
||||
rd, fn = s3dec(rd, fn)
|
||||
|
||||
if "mtp" in flags:
|
||||
q = "select 1 from mt where w=? and +k='t:mtp' limit 1"
|
||||
if cur.execute(q, (w16,)).fetchone():
|
||||
continue
|
||||
|
||||
q = "insert into mt values (?,'t:mtp','a')"
|
||||
cur.execute(q, (w[:16],))
|
||||
cur.execute(q, (w16,))
|
||||
|
||||
abspath = djoin(ptop, rd, fn)
|
||||
self.pp.msg = "c%d %s" % (nq, abspath)
|
||||
@@ -2190,7 +2195,7 @@ class Up2k(object):
|
||||
return tf, -1
|
||||
|
||||
if flt == 1:
|
||||
q = "select w from mt where w = ?"
|
||||
q = "select 1 from mt where w=? and +k != 't:mtp'"
|
||||
if c2.execute(q, (row[0][:16],)).fetchone():
|
||||
continue
|
||||
|
||||
@@ -3231,7 +3236,7 @@ class Up2k(object):
|
||||
if hr.get("reloc"):
|
||||
x = pathmod(self.vfs, dst, vp, hr["reloc"])
|
||||
if x:
|
||||
zvfs = vfs
|
||||
ud1 = (vfs.vpath, job["prel"], job["name"])
|
||||
pdir, _, job["name"], (vfs, rem) = x
|
||||
dst = os.path.join(pdir, job["name"])
|
||||
job["vcfg"] = vfs.flags
|
||||
@@ -3239,7 +3244,8 @@ class Up2k(object):
|
||||
job["vtop"] = vfs.vpath
|
||||
job["prel"] = rem
|
||||
job["name"] = sanitize_fn(job["name"], "")
|
||||
if zvfs.vpath != vfs.vpath:
|
||||
ud2 = (vfs.vpath, job["prel"], job["name"])
|
||||
if ud1 != ud2:
|
||||
# print(json.dumps(job, sort_keys=True, indent=4))
|
||||
job["hash"] = cj["hash"]
|
||||
self.log("xbu reloc1:%d..." % (depth,), 6)
|
||||
@@ -4994,14 +5000,15 @@ class Up2k(object):
|
||||
if hr.get("reloc"):
|
||||
x = pathmod(self.vfs, ap_chk, vp_chk, hr["reloc"])
|
||||
if x:
|
||||
zvfs = vfs
|
||||
ud1 = (vfs.vpath, job["prel"], job["name"])
|
||||
pdir, _, job["name"], (vfs, rem) = x
|
||||
job["vcfg"] = vf = vfs.flags
|
||||
job["ptop"] = vfs.realpath
|
||||
job["vtop"] = vfs.vpath
|
||||
job["prel"] = rem
|
||||
job["name"] = sanitize_fn(job["name"], "")
|
||||
if zvfs.vpath != vfs.vpath:
|
||||
ud2 = (vfs.vpath, job["prel"], job["name"])
|
||||
if ud1 != ud2:
|
||||
self.log("xbu reloc2:%d..." % (depth,), 6)
|
||||
return self._handle_json(job, depth + 1)
|
||||
|
||||
|
||||
@@ -153,6 +153,14 @@ try:
|
||||
except:
|
||||
HAVE_PSUTIL = False
|
||||
|
||||
try:
|
||||
if os.environ.get("PRTY_NO_MAGIC"):
|
||||
raise Exception()
|
||||
|
||||
import magic
|
||||
except:
|
||||
pass
|
||||
|
||||
if True: # pylint: disable=using-constant-test
|
||||
import types
|
||||
from collections.abc import Callable, Iterable
|
||||
@@ -175,8 +183,6 @@ if True: # pylint: disable=using-constant-test
|
||||
|
||||
|
||||
if TYPE_CHECKING:
|
||||
import magic
|
||||
|
||||
from .authsrv import VFS
|
||||
from .broker_util import BrokerCli
|
||||
from .up2k import Up2k
|
||||
@@ -1256,8 +1262,6 @@ class Magician(object):
|
||||
self.magic: Optional["magic.Magic"] = None
|
||||
|
||||
def ext(self, fpath: str) -> str:
|
||||
import magic
|
||||
|
||||
try:
|
||||
if self.bad_magic:
|
||||
raise Exception()
|
||||
@@ -2583,6 +2587,11 @@ def _fs_mvrm(
|
||||
now = time.time()
|
||||
if ex.errno == errno.ENOENT:
|
||||
return False
|
||||
if not attempt and ex.errno == errno.EXDEV:
|
||||
t = "using copy+delete (%s)\n %s\n %s"
|
||||
log(t % (ex.strerror, src, dst))
|
||||
osfun = shutil.move
|
||||
continue
|
||||
if now - t0 > maxtime or attempt == 90209:
|
||||
raise
|
||||
if not attempt:
|
||||
@@ -2607,15 +2616,18 @@ def atomic_move(log: "NamedLogger", src: str, dst: str, flags: dict[str, Any]) -
|
||||
elif flags.get("mv_re_t"):
|
||||
_fs_mvrm(log, src, dst, True, flags)
|
||||
else:
|
||||
os.replace(bsrc, bdst)
|
||||
|
||||
|
||||
def wrename(log: "NamedLogger", src: str, dst: str, flags: dict[str, Any]) -> bool:
|
||||
if not flags.get("mv_re_t"):
|
||||
os.rename(fsenc(src), fsenc(dst))
|
||||
return True
|
||||
|
||||
return _fs_mvrm(log, src, dst, False, flags)
|
||||
try:
|
||||
os.replace(bsrc, bdst)
|
||||
except OSError as ex:
|
||||
if ex.errno != errno.EXDEV:
|
||||
raise
|
||||
t = "using copy+delete (%s);\n %s\n %s"
|
||||
log(t % (ex.strerror, src, dst))
|
||||
try:
|
||||
os.unlink(bdst)
|
||||
except:
|
||||
pass
|
||||
shutil.move(bsrc, bdst)
|
||||
|
||||
|
||||
def wunlink(log: "NamedLogger", abspath: str, flags: dict[str, Any]) -> bool:
|
||||
@@ -3144,11 +3156,13 @@ def unescape_cookie(orig: str) -> str:
|
||||
return "".join(ret)
|
||||
|
||||
|
||||
def guess_mime(url: str, fallback: str = "application/octet-stream") -> str:
|
||||
def guess_mime(
|
||||
url: str, path: str = "", fallback: str = "application/octet-stream"
|
||||
) -> str:
|
||||
try:
|
||||
ext = url.rsplit(".", 1)[1].lower()
|
||||
except:
|
||||
return fallback
|
||||
ext = ""
|
||||
|
||||
ret = MIMES.get(ext)
|
||||
|
||||
@@ -3156,6 +3170,16 @@ def guess_mime(url: str, fallback: str = "application/octet-stream") -> str:
|
||||
x = mimetypes.guess_type(url)
|
||||
ret = "application/{}".format(x[1]) if x[1] else x[0]
|
||||
|
||||
if not ret and path:
|
||||
try:
|
||||
with open(fsenc(path), "rb", 0) as f:
|
||||
ret = magic.from_buffer(f.read(4096), mime=True)
|
||||
if ret.startswith("text/htm"):
|
||||
# avoid serving up HTML content unless there was actually a .html extension
|
||||
ret = "text/plain"
|
||||
except Exception as ex:
|
||||
pass
|
||||
|
||||
if not ret:
|
||||
ret = fallback
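`guess_mime` now takes an optional `path`; when the extension alone gives no answer and the volume has `rmagic` enabled, the callers above pass the absolute path so the first 4 KiB can be content-sniffed with libmagic (HTML results are downgraded to `text/plain`). A rough illustration, assuming copyparty (and the optional `magic` module) is importable:

```python
# illustrative only -- mirrors the behaviour added above
from copyparty.util import guess_mime  # assumption: copyparty is importable

print(guess_mime("notes.txt"))                   # known extension: text/plain
print(guess_mime("mystery"))                     # no extension, no path: application/octet-stream
print(guess_mime("mystery", "/srv/fs/mystery"))  # no extension + path: sniffed from file content
```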
|
||||
|
||||
|
||||
@@ -1160,8 +1160,8 @@ html.y #widget.open {
|
||||
border: 1px solid var(--bg-u5);
|
||||
border-width: 0 .1em 0 0;
|
||||
}
|
||||
#wfm.act+#wzip,
|
||||
#wfm.act+#wzip+#wnp {
|
||||
#wfm.act+#wzip1+#wzip,
|
||||
#wfm.act+#wzip1+#wzip+#wnp {
|
||||
margin-left: .2em;
|
||||
padding-left: .2em;
|
||||
border-left-width: .1em;
|
||||
@@ -1179,12 +1179,14 @@ html.y #widget.open {
|
||||
#wtoggle.np #wnp {
|
||||
display: inline-block;
|
||||
}
|
||||
#wtoggle.sel #wzip1,
|
||||
#wtoggle.sel.np #wnp {
|
||||
display: none;
|
||||
}
|
||||
#wfm a,
|
||||
#wnp a,
|
||||
#wm3u a,
|
||||
#zip1,
|
||||
#wzip a {
|
||||
font-size: .5em;
|
||||
padding: 0 .3em;
|
||||
@@ -1192,6 +1194,9 @@ html.y #widget.open {
|
||||
position: relative;
|
||||
display: inline-block;
|
||||
}
|
||||
#zip1 {
|
||||
font-size: .38em;
|
||||
}
|
||||
#wm3u a {
|
||||
margin: -.2em .1em;
|
||||
font-size: .45em;
|
||||
@@ -1205,10 +1210,14 @@ html.y #widget.open {
|
||||
}
|
||||
#wfm span,
|
||||
#wm3u span,
|
||||
#zip1 span,
|
||||
#wnp span {
|
||||
font-size: .6em;
|
||||
display: block;
|
||||
}
|
||||
#zip1 span {
|
||||
font-size: .9em;
|
||||
}
|
||||
#wnp span {
|
||||
font-size: .7em;
|
||||
}
|
||||
@@ -1816,10 +1825,11 @@ html.y #tree.nowrap .ntree a+a:hover {
|
||||
line-height: 2.3em;
|
||||
margin-bottom: 1.5em;
|
||||
}
|
||||
#hdoc,
|
||||
#ghead {
|
||||
position: sticky;
|
||||
top: -.3em;
|
||||
z-index: 1;
|
||||
z-index: 2;
|
||||
}
|
||||
.ghead .btn {
|
||||
position: relative;
|
||||
@@ -1829,6 +1839,13 @@ html.y #tree.nowrap .ntree a+a:hover {
|
||||
white-space: pre;
|
||||
padding-left: .3em;
|
||||
}
|
||||
#tailbtns {
|
||||
display: none;
|
||||
}
|
||||
#taildoc.on+#tailbtns {
|
||||
display: inherit;
|
||||
display: unset;
|
||||
}
|
||||
#op_unpost {
|
||||
padding: 1em;
|
||||
}
|
||||
@@ -1925,6 +1942,9 @@ html.y #tree.nowrap .ntree a+a:hover {
|
||||
padding: 1em 0 1em 0;
|
||||
border-radius: .3em;
|
||||
}
|
||||
#doc.wrap {
|
||||
white-space: pre-wrap;
|
||||
}
|
||||
html.y #doc {
|
||||
box-shadow: 0 0 .3em var(--bg-u5);
|
||||
background: #f7f7f7;
|
||||
@@ -3218,7 +3238,7 @@ html.d #treepar {
|
||||
|
||||
#ggrid>a>span {
|
||||
text-align: center;
|
||||
padding: 0.2em;
|
||||
padding: .2em .2em .15em .2em;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -140,6 +140,7 @@ var Ls = {
|
||||
"wt_pst": "paste a previously cut / copied selection$NHotkey: ctrl-V",
|
||||
"wt_selall": "select all files$NHotkey: ctrl-A (when file focused)",
|
||||
"wt_selinv": "invert selection",
|
||||
"wt_zip1": "download this folder as archive",
|
||||
"wt_selzip": "download selection as archive",
|
||||
"wt_seldl": "download selection as separate files$NHotkey: Y",
|
||||
"wt_npirc": "copy irc-formatted track info",
|
||||
@@ -248,6 +249,8 @@ var Ls = {
|
||||
|
||||
"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$Nmakes https 30% faster, http 4.5x faster\">mt",
|
||||
|
||||
"cut_wasm": "use wasm instead of the browser's built-in hasher; improves speed on chrome-based browsers but increases CPU load, and many older versions of chrome have bugs which makes the browser consume all RAM and crash if this is enabled\">wasm",
|
||||
|
||||
"cft_text": "favicon text (blank and refresh to disable)",
|
||||
"cft_fg": "foreground color",
|
||||
"cft_bg": "background color",
|
||||
@@ -334,6 +337,7 @@ var Ls = {
|
||||
"f_empty": 'this folder is empty',
|
||||
"f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
|
||||
"f_bigtxt": "this file is {0} MiB large -- really view as text?",
|
||||
"f_bigtxt2": "view just the end of the file instead? this will also enable following/tailing, showing newly added lines of text in real time",
|
||||
"fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
|
||||
"fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
|
||||
"f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",
|
||||
@@ -438,6 +442,11 @@ var Ls = {
|
||||
"tvt_next": "show next document$NHotkey: K\">⬇ next",
|
||||
"tvt_sel": "select file ( for cut / copy / delete / ... )$NHotkey: S\">sel",
|
||||
"tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
|
||||
"tvt_tail": "monitor file for changes; show new lines in real time\">📡 follow",
|
||||
"tvt_wrap": "word-wrap\">↵",
|
||||
"tvt_atail": "lock scroll to bottom of page\">⚓",
|
||||
"tvt_ctail": "decode terminal colors (ansi escape codes)\">🌈",
|
||||
"tvt_ntail": "scrollback limit (how many bytes of text to keep loaded)",
|
||||
|
||||
"m3u_add1": "song added to m3u playlist",
|
||||
"m3u_addn": "{0} songs added to m3u playlist",
|
||||
@@ -537,6 +546,7 @@ var Ls = {
|
||||
"u_https3": "for better performance",
|
||||
"u_ancient": 'your browser is impressively ancient -- maybe you should <a href="#" onclick="goto(\'bup\')">use bup instead</a>',
|
||||
"u_nowork": "need firefox 53+ or chrome 57+ or iOS 11+",
|
||||
"tail_2old": "need firefox 105+ or chrome 71+ or iOS 14.5+",
|
||||
"u_nodrop": 'your browser is too old for drag-and-drop uploading',
|
||||
"u_notdir": "that's not a folder!\n\nyour browser is too old,\nplease try dragdrop instead",
|
||||
"u_uri": "to dragdrop images from other browser windows,\nplease drop it onto the big upload button",
|
||||
@@ -754,6 +764,7 @@ var Ls = {
|
||||
"wt_pst": "lim inn filer (som tidligere ble klippet ut / kopiert et annet sted)$NSnarvei: ctrl-V",
|
||||
"wt_selall": "velg alle filer$NSnarvei: ctrl-A (mens fokus er på en fil)",
|
||||
"wt_selinv": "inverter utvalg",
|
||||
"wt_zip1": "last ned denne mappen som et arkiv",
|
||||
"wt_selzip": "last ned de valgte filene som et arkiv",
|
||||
"wt_seldl": "last ned de valgte filene$NSnarvei: Y",
|
||||
"wt_npirc": "kopiér sang-info (irc-formatert)",
|
||||
@@ -862,6 +873,8 @@ var Ls = {
|
||||
|
||||
"cut_mt": "raskere befaring ved å bruke hele CPU'en$N$Ndenne funksjonen anvender web-workers$Nog krever mer RAM (opptil 512 MiB ekstra)$N$Ngjør https 30% raskere, http 4.5x raskere\">mt",
|
||||
|
||||
"cut_wasm": "bruk wasm istedenfor nettleserens sha512-funksjon; gir bedre ytelse på chrome-baserte nettlesere, men bruker mere CPU, og eldre versjoner av chrome tåler det ikke (spiser opp all RAM og krasjer)\">wasm",
|
||||
|
||||
"cft_text": "ikontekst (blank ut og last siden på nytt for å deaktivere)",
|
||||
"cft_fg": "farge",
|
||||
"cft_bg": "bakgrunnsfarge",
|
||||
@@ -948,6 +961,7 @@ var Ls = {
|
||||
"f_empty": 'denne mappen er tom',
|
||||
"f_chide": 'dette vil skjule kolonnen «{0}»\n\nfanen for "andre innstillinger" lar deg vise kolonnen igjen',
|
||||
"f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
|
||||
"f_bigtxt2": "vil du se bunnen av filen istedenfor? du vil da også se nye linjer som blir lagt til på slutten av filen i sanntid",
|
||||
"fbd_more": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_more">vis {2}</a> eller <a href="#" id="bd_all">vis alle</a></div>',
|
||||
"fbd_all": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_all">vis alle</a></div>',
|
||||
"f_anota": "kun {0} av totalt {1} elementer ble markert;\nfor å velge alt må du bla til bunnen av mappen først",
|
||||
@@ -1052,6 +1066,11 @@ var Ls = {
|
||||
"tvt_next": "vis neste dokument$NSnarvei: K\">⬇ neste",
|
||||
"tvt_sel": "markér filen ( for utklipp / sletting / ... )$NSnarvei: S\">merk",
|
||||
"tvt_edit": "redigér filen$NSnarvei: E\">✏️ endre",
|
||||
"tvt_tail": "overvåk filen for endringer og vis nye linjer i sanntid\">📡 følg",
|
||||
"tvt_wrap": "tekstbryting\">↵",
|
||||
"tvt_atail": "hold de nyeste linjene synlig (lås til bunnen av siden)\">⚓",
|
||||
"tvt_ctail": "forstå og vis terminalfarger (ansi-sekvenser)\">🌈",
|
||||
"tvt_ntail": "maks-grense for antall bokstaver som skal vises i vinduet",
|
||||
|
||||
"m3u_add1": "sangen ble lagt til i m3u-spillelisten",
|
||||
"m3u_addn": "{0} sanger ble lagt til i m3u-spillelisten",
|
||||
@@ -1151,6 +1170,7 @@ var Ls = {
|
||||
"u_https3": "for høyere hastighet",
|
||||
"u_ancient": 'nettleseren din er prehistorisk -- mulig du burde <a href="#" onclick="goto(\'bup\')">bruke bup istedenfor</a>',
|
||||
"u_nowork": "krever firefox 53+, chrome 57+, eller iOS 11+",
|
||||
"tail_2old": "krever firefox 105+, chrome 71+, eller iOS 14.5+",
|
||||
"u_nodrop": 'nettleseren din er for gammel til å laste opp filer ved å dra dem inn i vinduet',
|
||||
"u_notdir": "mottok ikke mappen!\n\nnettleseren din er for gammel,\nprøv å dra mappen inn i vinduet istedenfor",
|
||||
"u_uri": "for å laste opp bilder ifra andre nettleservinduer,\nslipp bildet rett på den store last-opp-knappen",
|
||||
@@ -1368,10 +1388,13 @@ var Ls = {
|
||||
"wt_pst": "粘贴之前剪切/复制的选择$N快捷键: ctrl-V",
|
||||
"wt_selall": "选择所有文件$N快捷键: ctrl-A(当文件被聚焦时)",
|
||||
"wt_selinv": "反转选择",
|
||||
"wt_zip1": "将此文件夹下载为归档文件", //m
|
||||
"wt_selzip": "将选择下载为归档文件",
|
||||
"wt_seldl": "将选择下载为单独的文件$N快捷键: Y",
|
||||
"wt_npirc": "复制 IRC 格式的曲目信息",
|
||||
"wt_nptxt": "复制纯文本格式的曲目信息",
|
||||
"wt_m3ua": "添加到 m3u 播放列表(稍后点击 <code>📻copy</code>)", //m
|
||||
"wt_m3uc": "复制 m3u 播放列表到剪贴板", //m
|
||||
"wt_grid": "切换网格/列表视图$N快捷键: G",
|
||||
"wt_prev": "上一曲$N快捷键: J",
|
||||
"wt_play": "播放/暂停$N快捷键: P",
|
||||
@@ -1474,6 +1497,8 @@ var Ls = {
|
||||
|
||||
"cut_mt": "使用多线程加速文件哈希$N$N这使用 Web Worker 并且需要更多内存(额外最多 512 MiB)$N$N这使得 https 快 30%,http 快 4.5 倍\">mt",
|
||||
|
||||
"cut_wasm": "使用基于 WASM 的哈希计算器代替浏览器内置的哈希功能;这可以提升在基于 Chrome 的浏览器上的速度,但会增加 CPU 使用率,而且许多旧版本的 Chrome 存在漏洞,启用此功能会导致浏览器占用所有内存并崩溃。\">wasm", //m
|
||||
|
||||
"cft_text": "网站图标文本(为空并刷新以禁用)",
|
||||
"cft_fg": "前景色",
|
||||
"cft_bg": "背景色",
|
||||
@@ -1508,6 +1533,7 @@ var Ls = {
|
||||
"mt_fau": "在手机上,如果下一首歌未能快速预加载,防止音乐停止(可能导致标签显示异常)\">☕️",
|
||||
"mt_waves": "波形进度条:$N显示音频幅度\">进度条",
|
||||
"mt_npclip": "显示当前播放歌曲的剪贴板按钮\">♪剪切板",
|
||||
"mt_m3u_c": "显示按钮以将所选歌曲$N复制为 m3u8 播放列表条目\">📻", //m
|
||||
"mt_octl": "操作系统集成(媒体快捷键 / OSD)\">OSD",
|
||||
"mt_oseek": "允许通过操作系统集成进行跳转$N$N注意:在某些设备(如 iPhone)上,$N这将替代下一首歌按钮\">seek",
|
||||
"mt_oscv": "在 OSD 中显示专辑封面\">封面",
|
||||
@@ -1533,6 +1559,7 @@ var Ls = {
|
||||
|
||||
"mb_play": "播放",
|
||||
"mm_hashplay": "播放这个音频文件?",
|
||||
"mm_m3u": "按 <code>Enter/确定</code> 播放\n按 <code>ESC/取消</code> 编辑", //m
|
||||
"mp_breq": "需要 Firefox 82+ 或 Chrome 73+ 或 iOS 15+",
|
||||
"mm_bload": "正在加载...",
|
||||
"mm_bconv": "正在转换为 {0},请稍等...",
|
||||
@@ -1558,8 +1585,10 @@ var Ls = {
|
||||
"f_empty": '该文件夹为空',
|
||||
"f_chide": '隐藏列 «{0}»\n\n你可以在设置选项卡中重新显示列',
|
||||
"f_bigtxt": "这个文件大小为 {0} MiB -- 真的以文本形式查看?",
|
||||
"f_bigtxt2": " 你想查看文件的结尾部分吗?这也将启用实时跟踪功能,能够实时显示新添加的文本行。", //m
|
||||
"fbd_more": '<div id="blazy">显示 <code>{0}</code> 个文件中的 <code>{1}</code> 个;<a href="#" id="bd_more">显示 {2}</a> 或 <a href="#" id="bd_all">显示全部</a></div>',
|
||||
"fbd_all": '<div id="blazy">显示 <code>{0}</code> 个文件中的 <code>{1}</code> 个;<a href="#" id="bd_all">显示全部</a></div>',
|
||||
"f_anota": "仅选择了 {0} 个项目,共 {1} 个;\n要选择整个文件夹,请先滚动到底部", //m
|
||||
|
||||
"f_dls": '当前文件夹中的文件链接已\n更改为下载链接',
|
||||
|
||||
@@ -1593,7 +1622,7 @@ var Ls = {
|
||||
"fs_tsrc": "共享的文件或文件夹",
|
||||
"fs_ppwd": "密码可选",
|
||||
"fs_w8": "正在创建文件共享...",
|
||||
"fs_ok": "按 <code>Enter/OK</code> 复制到剪贴板\n按 <code>ESC/Cancel</code> 关闭",
|
||||
"fs_ok": "按 <code>Enter/确定</code> 复制到剪贴板\n按 <code>ESC/取消</code> 关闭",
|
||||
|
||||
"frt_dec": "可能修复一些损坏的文件名\">url-decode",
|
||||
"frt_rst": "将修改后的文件名重置为原始文件名\">↺ 重置",
|
||||
@@ -1661,6 +1690,15 @@ var Ls = {
|
||||
"tvt_next": "显示下一个文档$N快捷键: K\">⬇ 下一个",
|
||||
"tvt_sel": "选择文件 (用于剪切/删除/...)$N快捷键: S\">选择",
|
||||
"tvt_edit": "在文本编辑器中打开文件$N快捷键: E\">✏️ 编辑",
|
||||
"tvt_tail": "监视文件更改,并实时显示新增的行\">📡 跟踪", //m
|
||||
"tvt_wrap": "自动换行\">↵", //m
|
||||
"tvt_atail": "锁定到底部,显示最新内容\">⚓", //m
|
||||
"tvt_ctail": "解析终端颜色(ANSI 转义码)\">🌈", //m
|
||||
"tvt_ntail": "滚动历史上限(保留多少字节的文本)", //m
|
||||
|
||||
"m3u_add1": "歌曲已添加到 m3u 播放列表", //m
|
||||
"m3u_addn": "已添加 {0} 首歌曲到 m3u 播放列表", //m
|
||||
"m3u_clip": "m3u 播放列表已复制到剪贴板\n\n请创建一个以 <code>.m3u</code> 结尾的文本文件,\n并将播放列表粘贴到该文件中;\n这样就可以播放了", //m
|
||||
|
||||
"gt_vau": "不显示视频,仅播放音频\">🎧",
|
||||
"gt_msel": "启用文件选择;按住 ctrl 键点击文件以覆盖$N$N<em>当启用时:双击文件/文件夹以打开它</em>$N$N快捷键:S\">多选",
|
||||
@@ -1756,6 +1794,7 @@ var Ls = {
|
||||
"u_https3": "以获得更好的性能",
|
||||
"u_ancient": '你的浏览器非常古老 -- 也许你应该 <a href="#" onclick="goto(\'bup\')">改用 bup</a>',
|
||||
"u_nowork": "需要 Firefox 53+ 或 Chrome 57+ 或 iOS 11+",
|
||||
"tail_2old": "需要 Firefox 105+ 或 Chrome 71+ 或 iOS 14.5+",
|
||||
"u_nodrop": '浏览器版本低,不支持通过拖动文件到窗口来上传文件',
|
||||
"u_notdir": "不是文件夹!\n\n您的浏览器太旧;\n请尝试将文件夹拖入窗口",
|
||||
"u_uri": "要从其他浏览器窗口拖放图片,\n请将其拖放到大的上传按钮上",
|
||||
@@ -1899,6 +1938,8 @@ ebi('widget').innerHTML = (
|
||||
' href="#" id="fcut" tt="' + L.wt_cut + '">✂<span>cut</span></a><a' +
|
||||
' href="#" id="fcpy" tt="' + L.wt_cpy + '">⧉<span>copy</span></a><a' +
|
||||
' href="#" id="fpst" tt="' + L.wt_pst + '">📋<span>paste</span></a>' +
|
||||
'</span><span id="wzip1"><a' +
|
||||
' href="#" id="zip1" tt="' + L.wt_zip1 + '">📦<span>zip</span></a>' +
|
||||
'</span><span id="wzip"><a' +
|
||||
' href="#" id="selall" tt="' + L.wt_selall + '">sel.<br />all</a><a' +
|
||||
' href="#" id="selinv" tt="' + L.wt_selinv + '">sel.<br />inv.</a><a' +
|
||||
@@ -2076,6 +2117,7 @@ ebi('op_cfg').innerHTML = (
|
||||
' <a id="u2ts" class="tgl btn" href="#" tt="' + L.ut_u2ts + '</a>\n' +
|
||||
' <a id="umod" class="tgl btn" href="#" tt="' + L.cut_umod + '</a>\n' +
|
||||
' <a id="hashw" class="tgl btn" href="#" tt="' + L.cut_mt + '</a>\n' +
|
||||
' <a id="nosubtle" class="tgl btn" href="#" tt="' + L.cut_wasm + '</a>\n' +
|
||||
' <a id="u2turbo" class="tgl btn ttb" href="#" tt="' + L.cut_turbo + '</a>\n' +
|
||||
' <a id="u2tdate" class="tgl btn ttb" href="#" tt="' + L.cut_datechk + '</a>\n' +
|
||||
' <input type="text" id="u2szg" value="" ' + NOAC + ' style="width:3em" tt="' + L.cut_u2sz + '" />' +
|
||||
@@ -2468,7 +2510,7 @@ var mpl = (function () {
|
||||
c = r.ac_flac;
|
||||
else if (/\.(aac|m4a)$/i.exec(cs))
|
||||
c = r.ac_aac;
|
||||
else if (/\.(ogg|opus)$/i.exec(cs) && (!can_ogg || mpl.ac2 == 'mp3'))
|
||||
else if (/\.(oga|ogg|opus)$/i.exec(cs) && (!can_ogg || mpl.ac2 == 'mp3'))
|
||||
c = true;
|
||||
else if (re_au_native.exec(cs))
|
||||
c = false;
|
||||
@@ -2655,8 +2697,8 @@ mpl.init_ac2();
|
||||
|
||||
|
||||
var re_m3u = /\.(m3u8?)$/i;
|
||||
var re_au_native = (can_ogg || have_acode) ? /\.(aac|flac|m4a|mp3|ogg|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
|
||||
re_au_all = /\.(aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|itgz|itxz|itz|m4a|mdgz|mdxz|mdz|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|s3gz|s3xz|s3z|tak|tta|ulaw|wav|wma|wv|xm|xmgz|xmxz|xmz|xpk|3gp|asf|avi|flv|m4v|mkv|mov|mp4|mpeg|mpeg2|mpegts|mpg|mpg2|nut|ogm|ogv|rm|ts|vob|webm|wmv)$/i;
|
||||
var re_au_native = (can_ogg || have_acode) ? /\.(aac|flac|m4a|mp3|oga|ogg|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
|
||||
re_au_all = /\.(aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|itgz|itxz|itz|m4a|mdgz|mdxz|mdz|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|oga|ogg|okt|opus|ra|s3m|s3gz|s3xz|s3z|tak|tta|ulaw|wav|wma|wv|xm|xmgz|xmxz|xmz|xpk|3gp|asf|avi|flv|m4v|mkv|mov|mp4|mpeg|mpeg2|mpegts|mpg|mpg2|nut|ogm|ogv|rm|ts|vob|webm|wmv)$/i;
|
||||
|
||||
|
||||
// extract songs + add play column
|
||||
@@ -2945,6 +2987,9 @@ var widget = (function () {
|
||||
ebi('bplay').innerHTML = paused ? '▶' : '⏸';
|
||||
}
|
||||
};
|
||||
r.setvis = function () {
|
||||
widget.style.display = !has(perms, "read") || showfile.abrt ? 'none' : '';
|
||||
};
|
||||
wtico.onclick = function (e) {
|
||||
if (!touchmode)
|
||||
r.toggle(e);
|
||||
@@ -5778,7 +5823,9 @@ var fileman = (function () {
|
||||
|
||||
|
||||
var showfile = (function () {
|
||||
var r = {};
|
||||
var r = {
|
||||
'nrend': 0,
|
||||
};
|
||||
r.map = {
|
||||
'.ahk': 'autohotkey',
|
||||
'.bas': 'basic',
|
||||
@@ -5891,16 +5938,77 @@ var showfile = (function () {
|
||||
}
|
||||
r.mktree();
|
||||
if (em) {
|
||||
render(em);
|
||||
if (r.taildoc)
|
||||
r.show(em[0], true);
|
||||
else
|
||||
render(em);
|
||||
em = null;
|
||||
}
|
||||
};
|
||||
|
||||
r.tail = function (url, no_push) {
|
||||
r.abrt = new AbortController();
|
||||
widget.setvis();
|
||||
render([url, '', ''], no_push);
|
||||
var me = r.tail_id = Date.now(),
|
||||
wfp = ebi('wfp'),
|
||||
edoc = ebi('doc'),
|
||||
txt = '';
|
||||
|
||||
url = addq(url, 'tail=-' + r.tailnb);
|
||||
fetch(url, {'signal': r.abrt.signal}).then(function(rsp) {
|
||||
var ro = rsp.body.pipeThrough(
|
||||
new TextDecoderStream('utf-8', {'fatal': false}),
|
||||
{'signal': r.abrt.signal}).getReader();
|
||||
|
||||
var rf = function() {
|
||||
ro.read().then(function(v) {
|
||||
if (r.tail_id != me)
|
||||
return;
|
||||
var vt = v.done ? '\n*** lost connection to copyparty ***' : v.value;
|
||||
if (vt == '\x00')
|
||||
return rf();
|
||||
txt += vt;
|
||||
var ofs = txt.length - r.tailnb;
|
||||
if (ofs > 0) {
|
||||
var ofs2 = txt.indexOf('\n', ofs);
|
||||
if (ofs2 >= ofs && ofs - ofs2 < 512)
|
||||
ofs = ofs2;
|
||||
txt = txt.slice(ofs);
|
||||
}
|
||||
var html = esc(txt);
|
||||
if (r.tailansi)
|
||||
html = r.ansify(html);
|
||||
edoc.innerHTML = html;
|
||||
if (r.tail2end)
|
||||
window.scrollTo(0, wfp.offsetTop - window.innerHeight);
|
||||
if (!v.done)
|
||||
rf();
|
||||
});
|
||||
};
|
||||
if (r.tail_id == me)
|
||||
rf();
|
||||
});
|
||||
};
|
||||
|
||||
r.untail = function () {
|
||||
if (!r.abrt)
|
||||
return;
|
||||
r.abrt.abort();
|
||||
r.abrt = null;
|
||||
r.tail_id = -1;
|
||||
widget.setvis();
|
||||
};
|
||||
|
||||
r.show = function (url, no_push) {
|
||||
r.untail();
|
||||
var xhr = new XHR(),
|
||||
m = /[?&](k=[^&#]+)/.exec(url);
|
||||
|
||||
url = url.split('?')[0] + (m ? '?' + m[1] : '');
|
||||
if (r.taildoc)
|
||||
return r.tail(url, no_push);
|
||||
|
||||
xhr.url = url;
|
||||
xhr.fname = uricom_dec(url.split('/').pop());
|
||||
xhr.no_push = no_push;
|
||||
@@ -5940,7 +6048,8 @@ var showfile = (function () {
|
||||
|
||||
function render(doc, no_push) {
|
||||
r.q = null;
|
||||
var url = doc[0],
|
||||
r.nrend++;
|
||||
var url = r.url = doc[0],
|
||||
lnh = doc[1],
|
||||
txt = doc[2],
|
||||
name = url.split('?')[0].split('/').pop(),
|
||||
@@ -5954,9 +6063,13 @@ var showfile = (function () {
|
||||
ebi('editdoc').style.display = (has(perms, 'write') && (is_md || has(perms, 'delete'))) ? '' : 'none';
|
||||
|
||||
var wr = ebi('bdoc'),
|
||||
nrend = r.nrend,
|
||||
defer = !Prism.highlightElement;
|
||||
|
||||
var fun = function (el) {
|
||||
if (r.nrend != nrend)
|
||||
return;
|
||||
|
||||
try {
|
||||
if (lnh.slice(0, 5) == '#doc.')
|
||||
sethash(lnh.slice(1));
|
||||
@@ -5964,13 +6077,16 @@ var showfile = (function () {
|
||||
el = el || QS('#doc>code');
|
||||
Prism.highlightElement(el);
|
||||
if (el.className == 'language-ans' || (!lang && /\x1b\[[0-9;]{0,16}m/.exec(txt.slice(0, 4096))))
|
||||
r.ansify(el);
|
||||
el.innerHTML = r.ansify(el.innerHTML);
|
||||
}
|
||||
catch (ex) { }
|
||||
}
|
||||
|
||||
if (txt.length > 1024 * 256)
|
||||
var skip_prism = !txt || txt.length > 1024 * 256;
|
||||
if (skip_prism) {
|
||||
fun = function (el) { };
|
||||
is_md = false;
|
||||
}
|
||||
|
||||
qsr('#doc');
|
||||
var el = mknod('pre', 'doc');
|
||||
@@ -5982,7 +6098,7 @@ var showfile = (function () {
|
||||
else {
|
||||
el.textContent = txt;
|
||||
el.innerHTML = '<code>' + el.innerHTML + '</code>';
|
||||
if (!window.no_prism) {
|
||||
if (!window.no_prism && !skip_prism) {
|
||||
if ((lang == 'conf' || lang == 'cfg') && ('\n' + txt).indexOf('\n# -*- mode: yaml -*-') + 1)
|
||||
lang = 'yaml';
|
||||
|
||||
@@ -5992,6 +6108,8 @@ var showfile = (function () {
|
||||
else
|
||||
import_js(SR + '/.cpr/deps/prism.js', function () { fun(); });
|
||||
}
|
||||
if (!txt && r.wrap)
|
||||
el.className = 'wrap';
|
||||
}
|
||||
|
||||
wr.appendChild(el);
|
||||
@@ -6013,11 +6131,11 @@ var showfile = (function () {
|
||||
tree_scrollto();
|
||||
}
|
||||
|
||||
r.ansify = function (el) {
|
||||
r.ansify = function (html) {
|
||||
var ctab = (light ?
|
||||
'bfbfbf d30253 497600 b96900 006fbb a50097 288276 2d2d2d 9f9f9f 943b55 3a5600 7f4f00 00507d 683794 004343 000000' :
|
||||
'404040 f03669 b8e346 ffa402 02a2ff f65be3 3da698 d2d2d2 606060 c75b79 c8e37e ffbe4a 71cbff b67fe3 9cf0ed ffffff').split(/ /g),
|
||||
src = el.innerHTML.split(/\x1b\[/g),
|
||||
src = html.split(/\x1b\[/g),
|
||||
out = ['<span>'], fg = 7, bg = null, bfg = 0, bbg = 0, inv = 0, bold = 0;
|
||||
|
||||
for (var a = 0; a < src.length; a++) {
|
||||
@@ -6070,7 +6188,7 @@ var showfile = (function () {
|
||||
|
||||
out.push(s + '">' + txt);
|
||||
}
|
||||
el.innerHTML = out.join('');
|
||||
return out.join('');
|
||||
};
|
||||
|
||||
r.mktree = function () {
|
||||
@@ -6117,6 +6235,18 @@ var showfile = (function () {
|
||||
msel.selui();
|
||||
};
|
||||
|
||||
r.tgltail = function () {
|
||||
if (!window.TextDecoderStream) {
|
||||
bcfg_set('taildoc', r.taildoc = false);
|
||||
return toast.err(10, L.tail_2old);
|
||||
}
|
||||
r.show(r.url, true);
|
||||
};
|
||||
|
||||
r.tglwrap = function () {
|
||||
r.show(r.url, true);
|
||||
};
|
||||
|
||||
var bdoc = ebi('bdoc');
|
||||
bdoc.className = 'line-numbers';
|
||||
bdoc.innerHTML = (
|
||||
@@ -6127,15 +6257,38 @@ var showfile = (function () {
|
||||
'<a href="#" class="btn" id="nextdoc" tt="' + L.tvt_next + '</a>\n' +
|
||||
'<a href="#" class="btn" id="seldoc" tt="' + L.tvt_sel + '</a>\n' +
|
||||
'<a href="#" class="btn" id="editdoc" tt="' + L.tvt_edit + '</a>\n' +
|
||||
'<a href="#" class="btn tgl" id="taildoc" tt="' + L.tvt_tail + '</a>\n' +
|
||||
'<div id="tailbtns">\n' +
|
||||
'<a href="#" class="btn tgl" id="wrapdoc" tt="' + L.tvt_wrap + '</a>\n' +
|
||||
'<a href="#" class="btn tgl" id="tail2end" tt="' + L.tvt_atail + '</a>\n' +
|
||||
'<a href="#" class="btn tgl" id="tailansi" tt="' + L.tvt_ctail + '</a>\n' +
|
||||
'<input type="text" id="tailnb" value="" ' + NOAC + ' style="width:4em" tt="' + L.tvt_ntail + '" />' +
|
||||
'</div>\n' +
|
||||
'</div>'
|
||||
);
|
||||
ebi('xdoc').onclick = function () {
|
||||
r.untail();
|
||||
thegrid.setvis(true);
|
||||
bcfg_bind(r, 'taildoc', 'taildoc', false, r.tgltail);
|
||||
};
|
||||
ebi('dldoc').setAttribute('download', '');
|
||||
ebi('prevdoc').onclick = function () { tree_neigh(-1); };
|
||||
ebi('nextdoc').onclick = function () { tree_neigh(1); };
|
||||
ebi('seldoc').onclick = r.tglsel;
|
||||
bcfg_bind(r, 'wrap', 'wrapdoc', true, r.tglwrap);
|
||||
bcfg_bind(r, 'taildoc', 'taildoc', false, r.tgltail);
|
||||
bcfg_bind(r, 'tail2end', 'tail2end', true);
|
||||
bcfg_bind(r, 'tailansi', 'tailansi', false, r.tgltail);
|
||||
|
||||
r.tailnb = ebi('tailnb').value = icfg_get('tailnb', 131072);
|
||||
ebi('tailnb').oninput = function (e) {
|
||||
swrite('tailnb', r.tailnb = this.value);
|
||||
};
|
||||
|
||||
if (/[?&]tail\b/.exec(sloc0)) {
|
||||
clmod(ebi('taildoc'), 'on', 1);
|
||||
r.taildoc = true;
|
||||
}
|
||||
|
||||
return r;
|
||||
})();
|
||||
@@ -6430,6 +6583,7 @@ var thegrid = (function () {
|
||||
ohref = esc(ao.getAttribute('href')),
|
||||
href = ohref.split('?')[0],
|
||||
ext = '',
|
||||
ext0 = '',
|
||||
name = uricom_dec(vsplit(href)[1]),
|
||||
ref = ao.getAttribute('id'),
|
||||
isdir = href.endsWith('/'),
|
||||
@@ -6442,17 +6596,19 @@ var thegrid = (function () {
|
||||
ar.shift();
|
||||
|
||||
ar.reverse();
|
||||
ext0 = ar[0];
|
||||
for (var b = 0; b < Math.min(2, ar.length); b++) {
|
||||
if (ar[b].length > 7)
|
||||
break;
|
||||
|
||||
ext = ar[b] + '.' + ext;
|
||||
ext = ext ? (ar[b] + '.' + ext) : ar[b];
|
||||
}
|
||||
ext = (ext || 'unk.').slice(0, -1);
|
||||
if (!ext)
|
||||
ext = 'unk';
|
||||
}
|
||||
|
||||
if (use_ext_th && ext_th[ext]) {
|
||||
ihref = ext_th[ext];
|
||||
if (use_ext_th && (ext_th[ext] || ext_th[ext0])) {
|
||||
ihref = ext_th[ext] || ext_th[ext0];
|
||||
}
|
||||
else if (r.thumbs) {
|
||||
ihref = addq(ihref, 'th=' + (have_webp ? 'w' : 'j'));
|
||||
@@ -7483,7 +7639,7 @@ var treectl = (function () {
|
||||
bcfg_bind(r, 'idxh', 'idxh', idxh, setidxh);
|
||||
bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
|
||||
bcfg_bind(r, 'csel', 'csel', dgsel);
|
||||
bcfg_bind(r, 'dots', 'dotfiles', false, function (v) {
|
||||
bcfg_bind(r, 'dots', 'dotfiles', see_dots, function (v) {
|
||||
r.goto();
|
||||
var xhr = new XHR();
|
||||
xhr.open('GET', SR + '/?setck=dots=' + (v ? 'y' : ''), true);
|
||||
@@ -8559,7 +8715,7 @@ function apply_perms(res) {
|
||||
if (up2k)
|
||||
up2k.set_fsearch();
|
||||
|
||||
ebi('widget').style.display = have_read ? '' : 'none';
|
||||
widget.setvis();
|
||||
thegrid.setvis();
|
||||
if (!have_read && have_write)
|
||||
goto('up2k');
|
||||
@@ -9073,6 +9229,15 @@ var arcfmt = (function () {
|
||||
}
|
||||
ebi('selzip').textContent = fmt.split('_')[0];
|
||||
ebi('selzip').setAttribute('fmt', arg);
|
||||
|
||||
QS('#zip1 span').textContent = fmt.split('_')[0];
|
||||
ebi('zip1').setAttribute("href",
|
||||
get_evpath() + (dk ? '?k=' + dk + '&': '?') + arg);
|
||||
|
||||
if (!have_zip) {
|
||||
ebi('zip1').style.display = 'none';
|
||||
ebi('selzip').style.display = 'none';
|
||||
}
|
||||
}
|
||||
|
||||
function try_render() {
|
||||
@@ -9311,7 +9476,10 @@ var msel = (function () {
|
||||
r.selui(true);
|
||||
arcfmt.render();
|
||||
fileman.render();
|
||||
ebi('selzip').style.display = is_srch ? 'none' : '';
|
||||
|
||||
var zipvis = (is_srch || !have_zip) ? 'none' : '';
|
||||
ebi('selzip').style.display = zipvis;
|
||||
ebi('zip1').style.display = zipvis;
|
||||
}
|
||||
return r;
|
||||
})();
|
||||
@@ -10078,13 +10246,18 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
|
||||
fun = function () {
|
||||
showfile.show(href, tgt.getAttribute('lang'));
|
||||
},
|
||||
tfun = function () {
|
||||
bcfg_set('taildoc', showfile.taildoc = true);
|
||||
fun();
|
||||
},
|
||||
szs = ft2dict(a.closest('tr'))[0].sz,
|
||||
sz = parseInt(szs.replace(/[, ]/g, ''));
|
||||
|
||||
if (sz < 1024 * 1024)
|
||||
if (sz < 1024 * 1024 || showfile.taildoc)
|
||||
fun();
|
||||
else
|
||||
modal.confirm(L.f_bigtxt.format(f2f(sz / 1024 / 1024, 1)), fun, null);
|
||||
modal.confirm(L.f_bigtxt.format(f2f(sz / 1024 / 1024, 1)), fun, function() {
|
||||
modal.confirm(L.f_bigtxt2, tfun, null)});
|
||||
|
||||
return ev(e);
|
||||
}
|
||||
|
||||
@@ -101,6 +101,7 @@
|
||||
gio mount -a dav{{ s }}://{{ ep }}/{{ rvp }}
|
||||
{%- endif %}
|
||||
</pre>
|
||||
<p>on KDE Dolphin, use <code>webdav{{ s }}://{{ ep }}/{{ rvp }}</code></p>
|
||||
</div>
|
||||
|
||||
<div class="os mac">
|
||||
|
||||
@@ -1,6 +1,18 @@
|
||||
"use strict";
|
||||
|
||||
|
||||
(function () {
|
||||
var x = sread('nosubtle');
|
||||
if (x === '0' || x === '1')
|
||||
nosubtle = parseInt(x);
|
||||
if ((nosubtle > 1 && !CHROME && !FIREFOX) ||
|
||||
(nosubtle > 2 && !CHROME) ||
|
||||
(CHROME && nosubtle > VCHROME) ||
|
||||
!WebAssembly)
|
||||
nosubtle = 0;
|
||||
})();
|
||||
|
||||
|
||||
function goto_up2k() {
|
||||
if (up2k === false)
|
||||
return goto('bup');
|
||||
@@ -23,7 +35,7 @@ var up2k = null,
|
||||
m = 'will use ' + sha_js + ' instead of native sha512 due to';
|
||||
|
||||
try {
|
||||
if (sread('nosubtle') || window.nosubtle)
|
||||
if (nosubtle)
|
||||
throw 'chickenbit';
|
||||
var cf = crypto.subtle || crypto.webkitSubtle;
|
||||
cf.digest('SHA-512', new Uint8Array(1)).then(
|
||||
@@ -825,7 +837,7 @@ function up2k_init(subtle) {
|
||||
}
|
||||
qsr('#u2depmsg');
|
||||
var o = mknod('div', 'u2depmsg');
|
||||
o.innerHTML = m;
|
||||
o.innerHTML = nosubtle ? '' : m;
|
||||
ebi('u2foot').appendChild(o);
|
||||
}
|
||||
loading_deps = true;
|
||||
@@ -881,7 +893,8 @@ function up2k_init(subtle) {
|
||||
bcfg_bind(uc, 'turbo', 'u2turbo', turbolvl > 1, draw_turbo);
|
||||
bcfg_bind(uc, 'datechk', 'u2tdate', turbolvl < 3, null);
|
||||
bcfg_bind(uc, 'az', 'u2sort', u2sort.indexOf('n') + 1, set_u2sort);
|
||||
bcfg_bind(uc, 'hashw', 'hashw', !!WebAssembly && !(CHROME && MOBILE) && (!subtle || !CHROME), set_hashw);
|
||||
bcfg_bind(uc, 'hashw', 'hashw', !!WebAssembly && !(CHROME && MOBILE) && (!subtle || !CHROME || VCHROME > 136), set_hashw);
|
||||
bcfg_bind(uc, 'hwasm', 'nosubtle', nosubtle, set_nosubtle);
|
||||
bcfg_bind(uc, 'upnag', 'upnag', false, set_upnag);
|
||||
bcfg_bind(uc, 'upsfx', 'upsfx', false, set_upsfx);
|
||||
|
||||
@@ -1442,9 +1455,16 @@ function up2k_init(subtle) {
|
||||
if (CHROME) {
|
||||
// chrome-bug 383568268 // #124
|
||||
nw = Math.max(1, (nw > 4 ? 4 : (nw - 1)));
|
||||
if (VCHROME < 137)
|
||||
nw = (subtle && !MOBILE && nw > 2) ? 2 : nw;
|
||||
}
|
||||
|
||||
var x = sread('u2hashers') || window.u2hashers;
|
||||
if (x) {
|
||||
console.log('u2hashers is overriding default-value ' + nw);
|
||||
nw = parseInt(x);
|
||||
}
|
||||
|
||||
for (var a = 0; a < nw; a++)
|
||||
hws.push(new Worker(SR + '/.cpr/w.hash.js?_=' + TS));
|
||||
|
||||
@@ -2213,6 +2233,7 @@ function up2k_init(subtle) {
|
||||
reading = 0,
|
||||
max_readers = 1,
|
||||
opt_readers = 2,
|
||||
failed = false,
|
||||
free = [],
|
||||
busy = {},
|
||||
nbusy = 0,
|
||||
@@ -2262,6 +2283,14 @@ function up2k_init(subtle) {
|
||||
tasker();
|
||||
}
|
||||
|
||||
function go_fail() {
|
||||
failed = true;
|
||||
if (nbusy)
|
||||
return;
|
||||
apop(st.busy.hash, t);
|
||||
st.bytes.finished += t.size;
|
||||
}
|
||||
|
||||
function onmsg(d) {
|
||||
d = d.data;
|
||||
var k = d[0];
|
||||
@@ -2276,6 +2305,12 @@ function up2k_init(subtle) {
|
||||
return vis_exh(d[1], 'up2k.js', '', '', d[1]);
|
||||
|
||||
if (k == "fail") {
|
||||
var nchunk = d[1];
|
||||
free.push(busy[nchunk]);
|
||||
delete busy[nchunk];
|
||||
nbusy--;
|
||||
reading--;
|
||||
|
||||
pvis.seth(t.n, 1, d[1]);
|
||||
pvis.seth(t.n, 2, d[2]);
|
||||
console.log(d[1], d[2]);
|
||||
@@ -2283,9 +2318,7 @@ function up2k_init(subtle) {
|
||||
got_oserr();
|
||||
|
||||
pvis.move(t.n, 'ng');
|
||||
apop(st.busy.hash, t);
|
||||
st.bytes.finished += t.size;
|
||||
return;
|
||||
return go_fail();
|
||||
}
|
||||
|
||||
if (k == "ferr")
|
||||
@@ -2318,6 +2351,9 @@ function up2k_init(subtle) {
|
||||
t.hash.push(nchunk);
|
||||
pvis.hashed(t);
|
||||
|
||||
if (failed)
|
||||
return go_fail();
|
||||
|
||||
if (t.hash.length < nchunks)
|
||||
return nbusy < opt_readers && go_next();
|
||||
|
||||
@@ -3269,6 +3305,12 @@ function up2k_init(subtle) {
|
||||
}
|
||||
}
|
||||
|
||||
function set_nosubtle(v) {
|
||||
if (!WebAssembly)
|
||||
return toast.err(10, L.u_nowork);
|
||||
modal.confirm(L.lang_set, location.reload.bind(location), null);
|
||||
}
|
||||
|
||||
function set_upnag(en) {
|
||||
function nopenag() {
|
||||
bcfg_set('upnag', uc.upnag = false);
|
||||
|
||||
@@ -69,7 +69,7 @@ try {
|
||||
|
||||
CHROME = navigator.userAgentData.brands.find(function (d) { return d.brand == 'Chromium' });
|
||||
if (CHROME)
|
||||
VCHROME = CHROME.version;
|
||||
VCHROME = parseInt(CHROME.version);
|
||||
else
|
||||
VCHROME = 0;
|
||||
|
||||
@@ -183,7 +183,7 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
|
||||
if (url.indexOf(' > eval') + 1 && !evalex_fatal)
|
||||
return; // md timer
|
||||
|
||||
if (IE && url.indexOf('prism.js') + 1)
|
||||
if (url.indexOf('prism.js') + 1)
|
||||
return;
|
||||
|
||||
if (url.indexOf('easymde.js') + 1)
|
||||
@@ -1229,7 +1229,7 @@ function dl_file(url) {
|
||||
function cliptxt(txt, ok) {
|
||||
var fb = function () {
|
||||
console.log('clip-fb');
|
||||
var o = mknod('input');
|
||||
var o = mknod('textarea');
|
||||
o.value = txt;
|
||||
document.body.appendChild(o);
|
||||
o.focus();
|
||||
@@ -1239,6 +1239,8 @@ function cliptxt(txt, ok) {
|
||||
ok();
|
||||
};
|
||||
try {
|
||||
if (!window.isSecureContext)
|
||||
throw 1;
|
||||
navigator.clipboard.writeText(txt).then(ok, fb);
|
||||
}
|
||||
catch (ex) { fb(); }
|
||||
|
||||
@@ -4,6 +4,16 @@
|
||||
function hex2u8(txt) {
|
||||
return new Uint8Array(txt.match(/.{2}/g).map(function (b) { return parseInt(b, 16); }));
|
||||
}
|
||||
function esc(txt) {
|
||||
return txt.replace(/[&"<>]/g, function (c) {
|
||||
return {
|
||||
'&': '&',
|
||||
'"': '"',
|
||||
'<': '<',
|
||||
'>': '>'
|
||||
}[c];
|
||||
});
|
||||
}
|
||||
|
||||
|
||||
var subtle = null;
|
||||
@@ -19,6 +29,8 @@ catch (ex) {
|
||||
}
|
||||
function load_fb() {
|
||||
subtle = null;
|
||||
if (self.hashwasm)
|
||||
return;
|
||||
importScripts('deps/sha512.hw.js');
|
||||
console.log('using fallback hasher');
|
||||
}
|
||||
|
||||
@@ -1,3 +1,93 @@
|
||||
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0527-1939 `v1.17.2` pushing chrome to the limits (and then some)

## 🧪 new features

* not this time

## 🩹 bugfixes

* up2k: improve file-hashing speed on recent versions of google chrome e3e51fb8
  * speed increased from 319 to 513 MiB/s by default (but older chrome versions did 748...)
  * read the commit message for the full story; in short, chrome has gotten gradually slower over the past few versions (starting from v133), and this makes it slightly less bad again
  * hashing speed can be further improved from `0.5` to `1.1` GiB/s by enabling the `[wasm]` option in the `[⚙️] settings` tab
  * this option can be made default-enabled with `--nosubtle 137` (see the example below this list), but beware that this increases the chances of running into browser-bugs (foreshadowing...)
* up2k: fix the error-handler for browser-bugs (oom and such) 49c71247
  * because [chrome-bug 383568268](https://issues.chromium.org/issues/383568268) is about to make a [surprise return?!](https://issues.chromium.org/issues/383568268#comment14)
* #168 fix uploading into shares if path-based proxying is used 9cb93ae1
* #165 unconditionally heed `--rp-loc` 84f5f417
  * the config-option for [path-based proxying](https://github.com/9001/copyparty/#reverse-proxy) was ignored if the reverse-proxy was untrusted; this was confusing and not strictly necessary

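a minimal sketch of making it default-enabled server-side (the volume `/srv/pub` and its read-only flag are just placeholders):

```sh
# share /srv/pub read-only, and default-enable the wasm hasher
# for chrome 137 and newer, as described in the bullets above
copyparty -v /srv/pub::r --nosubtle 137
```
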
## 🔧 other changes

* #166 the nixos module was improved once more (thx @msfjarvis!) 48470f6b 60fb1207
* added usage instructions to [minimal-up2k.js](https://github.com/9001/copyparty/tree/hovudstraum/contrib/plugins#example-browser-js), the up2k-ui [simplifier](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png) 1d308eeb
* docker: improve feedback if the config is bad or missing 28b63e58

## 🌠 fun facts

* this release was tested over an [unreliable rdp connection](https://a.ocv.me/pub/g/nerd-stuff/PXL_20250526_021207825.jpg), through two ssh-jumphosts, to a qemu win10 vm back home, from the bergen-oslo night train wifi


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0518-2234 `v1.17.1` as seen on archlinux

## 🧪 new features

* new toolbar button to zip/tar the currently open folder 256dad8c
* new options to specify the default checksum algorithm for PUT/bup/WebDAV uploads 0de09860
* #164 new option `--put-name` to specify the filename of nameless uploads 5dcd88a6 (see the example below this list)
  * the default is still `put-TIMESTAMP-IPADDRESS.bin`

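a hypothetical example of that option; the volume and the exact format-string are illustrative, but `{now.6f}` and `{cip}` appear to be the fields used by the default:

```sh
# write-only inbox where nameless PUT/bup uploads are named
# by timestamp only, dropping the uploader's ip-address
copyparty -v /srv/inbox::w --put-name "put-{now.6f}.bin"
```
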
## 🩹 bugfixes

* #162 password-protected shares were incompatible with password-hashing c3ef3fdc
* #161 m3u playlist creation was only possible over https 94352f27
* when relocating/redirecting an upload from an xbu hook (execute-before-upload), it could miss an already existing file at the destination and create another copy 0a9a8077
* fixed some edgecases when moving files between filesystems f425ff51
* improve tagscan-resume after a server restart (primarily for dupes) 41fa6b25
* support prehistoric timestamps in fat16 vhd-drives on windows 261236e3

## 🔧 other changes

* #159 the nixos module was improved (thx @gabevenberg and @chinponya!) d1bca1f5
* an archlinux maintainer adopted the aur package; copyparty is now [officially in arch](https://archlinux.org/packages/extra/any/copyparty/) b9ba783c
* #162 add KDE Dolphin instructions to the connect-page d4a8071d
* the audioplayer now knows that `.oga` means `.ogg`

## 🌠 fun facts

* this release contains code [pair-programmed during an anime rave](https://a.ocv.me/pub/g/nerd-stuff/PXL_20250503_222654610.jpg)


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0426-2149 `v1.17.0` mixtape.m3u

## 🧪 new features

* [m3u playlists](https://github.com/9001/copyparty/#playlists) 897f9d32 ad200f2b 4195762d fff45552
  * create and play m3u / m3u8 files

## 🩹 bugfixes

* improve support for ie11 (yes, internet explorer 11) 3090c748 95157d02
* now possible to launch the password-hasher cli while another instance is running dbfc899d
  * in preparation of #157 / #159

## 🔧 other changes

* make better decisions when running in a VM with less than 1 GiB RAM dc3b7a27

## 🌠 fun facts

* this release contains code written [less than 1 masl](https://a.ocv.me/pub/g/nerd-stuff/PXL_20250425_170037812.jpg) and was gonna be named [hash again](https://www.youtube.com/watch?v=twUFbqyul_M) since it was originally just the password-hasher fix, but then kipun suggested adding playlist support (thx kipun)
* [donations](https://github.com/9001/) are now also possible through github -- a good alternative to paypal (y)
  * and thanks a lot for the support (and kind words therein) so far, appreciate it :>



▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0420-1836 `v1.16.21` unzip-compat


@@ -22,6 +22,7 @@
* [dev env setup](#dev-env-setup)
* [just the sfx](#just-the-sfx)
* [build from release tarball](#build-from-release-tarball) - uses the included prebuilt webdeps
* [build from scratch](#build-from-scratch) - how the sausage is made
* [complete release](#complete-release)
* [debugging](#debugging)
* [music playback halting on phones](#music-playback-halting-on-phones) - mostly fine on android
@@ -190,6 +191,9 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
| GET | `?v` | open image/video/audio in mediaplayer |
| GET | `?txt` | get file at URL as plaintext |
| GET | `?txt=iso-8859-1` | ...with specific charset |
| GET | `?tail` | continuously stream a growing file (see example below) |
| GET | `?tail=1024` | ...starting from byte 1024 |
| GET | `?tail=-128` | ...starting 128 bytes from the end |
| GET | `?th` | get image/video at URL as thumbnail |
| GET | `?th=opus` | convert audio file to 128kbps opus |
| GET | `?th=caf` | ...in the iOS-proprietary container |
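for example, following a growing logfile from a terminal could look something like this (hostname, filepath, and the password are placeholders):

```sh
# stream new lines as they are appended, starting 128 bytes before the end
curl -sN "https://example.com/logs/copyparty.log?tail=-128&pw=hunter2"
```
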
@@ -338,7 +342,7 @@ for the `re`pack to work, first run one of the sfx'es once to unpack it
you need python 3.9 or newer due to type hints

the rest is mostly optional; if you need a working env for vscode or similar
setting up a venv with the below packages is only necessary if you want it for vscode or similar

```sh
python3 -m venv .venv
@@ -392,6 +396,39 @@ python3 setup.py install --skip-build --prefix=/usr --root=$HOME/pe/copyparty
```


## build from scratch

how the sausage is made:

to get started, first `cd` into the `scripts` folder (the whole dance is also condensed into a copy-paste sketch below this list)

* the first step is the webdeps; they end up in `../copyparty/web/deps/`, for example `../copyparty/web/deps/marked.js.gz` -- if you need to build the webdeps, run `make -C deps-docker`
  * this needs rootless podman and the `podman-docker` compat-layer to pretend it's docker, although it *should* be possible to use rootful/rootless docker too
  * if you don't have rootless podman/docker then `sudo make -C deps-docker` is fine too
  * alternatively, you can entirely skip building the webdeps and instead extract the compiled webdeps from the latest github release with `./make-sfx.sh fast dl-wd`

* next, build `copyparty-sfx.py` by running `./make-sfx.sh gz fast`
  * this is a dependency for most of the remaining steps, since they take the sfx as input
  * removing `fast` makes it compress better
  * removing `gz` too compresses even better, but startup gets slower

* if you want to build the `.pyz` standalone "binary", now run `./make-pyz.sh`

* if you want to build a pypi package, now run `./make-pypi-release.sh d`

* if you want to build a docker-image, you have two options:
  * if you want to use podman to build all docker-images for all supported architectures, now run `(cd docker; ./make.sh hclean; ./make.sh hclean pull img)`
  * if you want to use docker to build all docker-images for your native architecture, now run `sudo make -C docker`
  * if you want to do something else, please take a look at `docker/make.sh` or `docker/Makefile` for inspiration

* if you want to build the windows exe, first grab some snacks and a beer, [you'll need it](https://github.com/9001/copyparty/tree/hovudstraum/scripts/pyinstaller)

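condensed, a quick run through the steps above might look like this (assuming the prebuilt webdeps from the latest github release are good enough for you):

```sh
cd scripts
./make-sfx.sh fast dl-wd   # fetch prebuilt webdeps and build ../dist/copyparty-sfx.py
./make-pyz.sh              # optional: the .pyz standalone "binary" (needs the sfx)
```
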
the complete list of buildtime dependencies to do a build from scratch is as follows:

* on ubuntu-server, install podman or [docker](https://get.docker.com/), and then `sudo apt install make zip bzip2`
  * because ubuntu is specifically what someone asked about :-p


## complete release

this also builds the sfx, so you can skip the sfx section above

@@ -106,3 +106,10 @@
|
||||
/w/tank1
|
||||
[/m8s]
|
||||
/w/tank2
|
||||
|
||||
|
||||
# some other things you can do:
|
||||
# [/demo/${u%-su,%-fds}] # users which are NOT members of "su" or "fds"
|
||||
# [/demo/${u%+su,%+fds}] # users which ARE members of BOTH "su" and "fds"
|
||||
# [/demo/${g%-su}] # all groups except su
|
||||
# [/demo/${g%-su,%-fds}] # all groups except su and fds
|
||||
|
||||
@@ -168,6 +168,7 @@ symbol legend,
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | ╱ | ╱ |
| CTRL-V from device | █ | | | █ | | | | | | | | | |
| race the beam ("p2p") | █ | | | | | | | | | | | | |
| "tail -f" streaming | █ | | | | | | | | | | | | |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | █ | |
| upload rules | ╱ | ╱ | ╱ | ╱ | ╱ | | | ╱ | ╱ | | ╱ | ╱ | ╱ |
| ┗ max disk usage | █ | █ | █ | | █ | | | | █ | | | █ | █ |
@@ -193,6 +194,8 @@ symbol legend,

* `race the beam` = files can be downloaded while they're still uploading; downloaders are slowed down such that the uploader is always ahead

* `tail -f` = when viewing or downloading a logfile, the connection can remain open to keep showing new lines as they are added in real time

* `upload routing` = depending on filetype / contents / uploader etc., the file can be redirected to another location or otherwise transformed; mitigates limitations such as [sharex#3992](https://github.com/ShareX/ShareX/issues/3992)
  * copyparty example: [reloc-by-ext](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks#before-upload)


8
flake.lock
generated
@@ -17,16 +17,16 @@
|
||||
},
|
||||
"nixpkgs": {
|
||||
"locked": {
|
||||
"lastModified": 1680334310,
|
||||
"narHash": "sha256-ISWz16oGxBhF7wqAxefMPwFag6SlsA9up8muV79V9ck=",
|
||||
"lastModified": 1748162331,
|
||||
"narHash": "sha256-rqc2RKYTxP3tbjA+PB3VMRQNnjesrT0pEofXQTrMsS8=",
|
||||
"owner": "NixOS",
|
||||
"repo": "nixpkgs",
|
||||
"rev": "884e3b68be02ff9d61a042bc9bd9dd2a358f95da",
|
||||
"rev": "7c43f080a7f28b2774f3b3f43234ca11661bf334",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"id": "nixpkgs",
|
||||
"ref": "nixos-22.11",
|
||||
"ref": "nixos-25.05",
|
||||
"type": "indirect"
|
||||
}
|
||||
},
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
inputs = {
|
||||
nixpkgs.url = "nixpkgs/nixos-22.11";
|
||||
nixpkgs.url = "nixpkgs/nixos-25.05";
|
||||
flake-utils.url = "github:numtide/flake-utils";
|
||||
};
|
||||
|
||||
@@ -17,6 +17,9 @@
|
||||
let
|
||||
pkgs = import nixpkgs {
|
||||
inherit system;
|
||||
config = {
|
||||
allowAliases = false;
|
||||
};
|
||||
overlays = [ self.overlays.default ];
|
||||
};
|
||||
in {
|
||||
|
||||
@@ -3,7 +3,7 @@ WORKDIR /z
|
||||
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
|
||||
ver_hashwasm=4.12.0 \
|
||||
ver_marked=4.3.0 \
|
||||
ver_dompf=3.2.5 \
|
||||
ver_dompf=3.2.6 \
|
||||
ver_mde=2.18.0 \
|
||||
ver_codemirror=5.65.18 \
|
||||
ver_fontawesome=5.13.0 \
|
||||
|
||||
@@ -21,7 +21,7 @@ RUN apk add -U !pyc \
|
||||
&& apk add -t .bd \
|
||||
bash wget gcc g++ make cmake patchelf \
|
||||
python3-dev ffmpeg-dev fftw-dev libsndfile-dev \
|
||||
py3-wheel py3-numpy-dev \
|
||||
py3-wheel py3-numpy-dev libffi-dev \
|
||||
vamp-sdk-dev \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install pyvips \
|
||||
|
||||
@@ -15,7 +15,7 @@ RUN apk add -U !pyc \
|
||||
vips-jxl vips-heif vips-poppler vips-magick \
|
||||
&& apk add -t .bd \
|
||||
bash wget gcc g++ make cmake patchelf \
|
||||
python3-dev py3-wheel \
|
||||
python3-dev py3-wheel libffi-dev \
|
||||
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
|
||||
&& python3 -m pip install pyvips \
|
||||
&& apk del py3-pip .bd
|
||||
|
||||
@@ -28,6 +28,14 @@ all:
|
||||
|
||||
docker image ls
|
||||
|
||||
min:
|
||||
rm -rf i
|
||||
mkdir i
|
||||
tar -cC../.. dist/copyparty-sfx.py bin/mtag | tar -xvCi
|
||||
|
||||
podman build --squash --pull=always -t copyparty/min:latest -f Dockerfile.min .
|
||||
echo 'scale=1;'`podman save copyparty/min:latest | pigz -c | wc -c`/1024/1024 | bc
|
||||
|
||||
push:
|
||||
docker push copyparty/min
|
||||
docker push copyparty/im
|
||||
|
||||
@@ -537,6 +537,7 @@ find | grep -E '\.(js|html)$' | while IFS= read -r f; do
|
||||
done
|
||||
|
||||
gzres() {
|
||||
local pk=
|
||||
[ $zopf ] && command -v zopfli && pk="zopfli --i$zopf"
|
||||
[ $zopf ] && command -v pigz && pk="pigz -11 -I $zopf"
|
||||
[ -z "$pk" ] && pk='gzip'
|
||||
@@ -628,7 +629,6 @@ suf=
|
||||
[ $use_gz ] && {
|
||||
sed -r 's/"r:bz2"/"r:gz"/' <$py >$py.t
|
||||
py=$py.t
|
||||
suf=-gz
|
||||
}
|
||||
|
||||
"$pybin" $py --sfx-make tar.bz2 $ver $ts
|
||||
|
||||
@@ -16,7 +16,7 @@ uname -s | grep WOW64 && m=64 || m=32
|
||||
uname -s | grep NT-10 && w10=1 || w7=1
|
||||
[ $w7 ] && [ -e up2k.sh ] && [ ! "$1" ] && ./up2k.sh
|
||||
|
||||
[ $w7 ] && pyv=37 || pyv=312
|
||||
[ $w7 ] && pyv=37 || pyv=313
|
||||
esuf=
|
||||
[ $w7 ] && [ $m = 32 ] && esuf=32
|
||||
[ $w7 ] && [ $m = 64 ] && esuf=-winpe64
|
||||
@@ -89,14 +89,17 @@ excl=(
|
||||
urllib.request
|
||||
urllib.response
|
||||
urllib.robotparser
|
||||
zipfile
|
||||
)
|
||||
[ $w10 ] && excl+=(
|
||||
_pyrepl
|
||||
distutils
|
||||
setuptools
|
||||
PIL.ImageQt
|
||||
PIL.ImageShow
|
||||
PIL.ImageTk
|
||||
PIL.ImageWin
|
||||
PIL.PdfParser
|
||||
zipimport
|
||||
) || excl+=(
|
||||
inspect
|
||||
PIL
|
||||
@@ -104,6 +107,7 @@ excl=(
|
||||
PIL.Image
|
||||
PIL.ImageDraw
|
||||
PIL.ImageOps
|
||||
zipfile
|
||||
)
|
||||
excl=( "${excl[@]/#/--exclude-module }" )
|
||||
|
||||
|
||||
@@ -3,7 +3,7 @@ f117016b1e6a7d7e745db30d3e67f1acf7957c443a0dd301b6c5e10b8368f2aa4db6be9782d2d3f8
|
||||
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
|
||||
b297ff66ec50cf5a1abcf07d6ac949644c5150ba094ffac974c5d27c81574c3e97ed814a47547f4b03a4c83ea0fb8f026433fca06a3f08e32742dc5c024f3d07 pywin32_ctypes-0.2.3-py3-none-any.whl
|
||||
085d39ef4426aa5f097fbc484595becc16e61ca23fc7da4d2a8bba540a3b82e789e390b176c7151bdc67d01735cce22b1562cdb2e31273225a2d3e275851a4ad setuptools-70.3.0-py3-none-any.whl
|
||||
360a141928f4a7ec18a994602cbb28bbf8b5cc7c077a06ac76b54b12fa769ed95ca0333a5cf728923a8e0baeb5cc4d5e73e5b3de2666beb05eb477d8ae719093 upx-4.2.4-win32.zip
|
||||
644931f8e1764e168c257c11c77b3d2ac5408397d97b0eef98168a058efe793d3ab6900dc2e9c54923a2bd906dd66bfbff8db6ff43418513e530a1bd501c6ccd upx-5.0.1-win32.zip
|
||||
# win7
|
||||
3253e86471e6f9fa85bfdb7684cd2f964ed6e35c6a4db87f81cca157c049bef43e66dfcae1e037b2fb904567b1e028aaeefe8983ba3255105df787406d2aa71e en_windows_7_professional_with_sp1_x86_dvd_u_677056.iso
|
||||
ab0db0283f61a5bbe44797d74546786bf41685175764a448d2e3bd629f292f1e7d829757b26be346b5044d78c9c1891736d93237cee4b1b6f5996a902c86d15f en_windows_7_professional_with_sp1_x64_dvd_u_676939.iso
|
||||
@@ -24,10 +24,11 @@ ac96786e5d35882e0c5b724794329c9125c2b86ae7847f17acfc49f0d294312c6afc1c3f248655de
|
||||
0a2cd4cadf0395f0374974cd2bc2407e5cc65c111275acdffb6ecc5a2026eee9e1bb3da528b35c7f0ff4b64563a74857d5c2149051e281cc09ebd0d1968be9aa en-us_windows_10_enterprise_ltsc_2021_x64_dvd_d289cf96.iso
|
||||
16cc0c58b5df6c7040893089f3eb29c074aed61d76dae6cd628d8a89a05f6223ac5d7f3f709a12417c147594a87a94cc808d1e04a6f1e407cc41f7c9f47790d1 virtio-win-0.1.248.iso
|
||||
9a7f40edc6f9209a2acd23793f3cbd6213c94f36064048cb8bf6eb04f1bdb2c2fe991cb09f77fe8b13e5cd85c618ef23573e79813b2fef899ab2f290cd129779 jinja2-3.1.6-py3-none-any.whl
|
||||
6df21f0da408a89f6504417c7cdf9aaafe4ed88cfa13e9b8fa8414f604c0401f885a04bbad0484dc51a29284af5d1548e33c6cc6bfb9896d9992c1b1074f332d MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl
|
||||
00731cfdd9d5c12efef04a7161c90c1e5ed1dc4677aa88a1d4054aff836f3430df4da5262ed4289c21637358a9e10e5df16f76743cbf5a29bb3a44b146c19cf3 MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl
|
||||
8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
|
||||
0203ec2551c4836696cfab0b2c9fff603352f03fa36e7476e2e1ca7ec57a3a0c24bd791fcd92f342bf817f0887854d9f072e0271c643de4b313d8c9569ba8813 packaging-24.1-py3-none-any.whl
|
||||
c9051daaf34ec934962c743a5ac2dbe55a9b0cababb693a8cde0001d24d4a50b67bd534d714d935def6ca7b898ec0a352e58bd9ccdce01c54eaf2281b18e478d pillow-11.2.1-cp312-cp312-win_amd64.whl
|
||||
f0463895e9aee97f31a2003323de235fed1b26289766dc0837261e3f4a594a31162b69e9adbb0e9a31e2e2d4b5f25c762ed1669553df7dc89a8ba4f85d297873 pyinstaller-6.11.1-py3-none-win_amd64.whl
|
||||
d550a0a14428386945533de2220c4c2e37c0c890fc51a600f626c6ca90a32d39572c121ec04c157ba3a8d6601cb021f8433d871b5c562a3d342c804fffec90c1 pyinstaller_hooks_contrib-2024.11-py3-none-any.whl
|
||||
4f9a4d9f65c93e2d851e2674057343a9599f30f5dc582ffca485522237d4fcf43653b3d393ed5eb11e518c4ba93714a07134bbb13a97d421cce211e1da34682e python-3.12.10-amd64.exe
|
||||
a726fb46cce24f781fc8b55a3e6dea0a884ebc3b2b400ea74aa02333699f4955a5dc1e2ec5927ac72f35a624401f3f3b442882ba1cc4cadaf9c88558b5b8bdae packaging-25.0-py3-none-any.whl
|
||||
9265164114db16c7f4286f188d6ebc7f3e2c9a9aca7144f92bc66c320d7d5db44e2c3a93e5e8f8cc12b8ffb30d0402a9510716ed7dbd10df39d19f5cf7ed7486 pillow-11.2.1-cp313-cp313-win_amd64.whl
|
||||
59fbbcae044f4ee73d203ac74b553b27bfad3e6b2f3fb290fd3f8774753c6b545176b6b3399c240b092d131d152290ce732750accd962dc1e48e930be85f5e53 pyinstaller-6.14.1-py3-none-win_amd64.whl
|
||||
fc6f3e144c5f5b662412de07cb8bf0c2eb3b3be21d19ec448aef3c4244d779b9ab8027fd67a4871e6e13823b248ea0f5a7a9241a53aef30f3b51a6d3cb5bdb3f pyinstaller_hooks_contrib-2025.5-py3-none-any.whl
|
||||
2c7a52e223b8186c21009d3fa5ed6a856d8eb4ef3b98f5d24c378c6a1afbfa1378bd7a51d6addc500e263d7989efb544c862bf920055e740f137c702dfd9d18b python-3.13.5-amd64.exe
|
||||
2a0420f7faaa33d2132b82895a8282688030e939db0225ad8abb95a47bdb87b45318f10985fc3cee271a9121441c1526caa363d7f2e4a4b18b1a674068766e87 setuptools-80.9.0-py3-none-any.whl
|
||||
|
||||
@@ -29,19 +29,19 @@ uname -s | grep NT-10 && w10=1 || {
|
||||
fns=(
|
||||
altgraph-0.17.4-py2.py3-none-any.whl
|
||||
pefile-2023.2.7-py3-none-any.whl
|
||||
pywin32_ctypes-0.2.2-py3-none-any.whl
|
||||
setuptools-70.3.0-py3-none-any.whl
|
||||
upx-4.2.4-win32.zip
|
||||
pywin32_ctypes-0.2.3-py3-none-any.whl
|
||||
upx-5.0.1-win32.zip
|
||||
)
|
||||
[ $w10 ] && fns+=(
|
||||
jinja2-3.1.6-py3-none-any.whl
|
||||
MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl
|
||||
MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl
|
||||
mutagen-1.47.0-py3-none-any.whl
|
||||
packaging-24.1-py3-none-any.whl
|
||||
pillow-11.2.1-cp312-cp312-win_amd64.whl
|
||||
pyinstaller-6.10.0-py3-none-win_amd64.whl
|
||||
pyinstaller_hooks_contrib-2024.8-py3-none-any.whl
|
||||
python-3.12.10-amd64.exe
|
||||
packaging-25.0-py3-none-any.whl
|
||||
pillow-11.2.1-cp313-cp313-win_amd64.whl
|
||||
pyinstaller-6.14.1-py3-none-win_amd64.whl
|
||||
pyinstaller_hooks_contrib-2025.5-py3-none-any.whl
|
||||
python-3.13.5-amd64.exe
|
||||
setuptools-80.9.0-py3-none-any.whl
|
||||
)
|
||||
[ $w7 ] && fns+=(
|
||||
future-1.0.0-py3-none-any.whl
|
||||
@@ -49,6 +49,7 @@ fns=(
|
||||
packaging-24.0-py3-none-any.whl
|
||||
pip-24.0-py3-none-any.whl
|
||||
pyinstaller_hooks_contrib-2023.8-py2.py3-none-any.whl
|
||||
setuptools-70.3.0-py3-none-any.whl
|
||||
typing_extensions-4.7.1-py3-none-any.whl
|
||||
zipp-3.15.0-py3-none-any.whl
|
||||
)
|
||||
@@ -80,7 +81,7 @@ close and reopen git-bash so python is in PATH
|
||||
|
||||
===[ copy-paste into git-bash ]================================
|
||||
uname -s | grep NT-10 && w10=1 || w7=1
|
||||
[ $w7 ] && pyv=37 || pyv=312
|
||||
[ $w7 ] && pyv=37 || pyv=313
|
||||
appd=$(cygpath.exe "$APPDATA")
|
||||
cd ~/Downloads &&
|
||||
yes | unzip upx-*-win32.zip &&
|
||||
|
||||
@@ -34,6 +34,7 @@ shift
|
||||
./make-sfx.sh "$@"
|
||||
f=../dist/copyparty-sfx
|
||||
[ -e $f.py ] && s= || s=-gz
|
||||
# TODO: the -gz suffix is gone, can drop all the $s stuff probably
|
||||
|
||||
$f$s.py --version >/dev/null
|
||||
|
||||
|
||||
@@ -226,10 +226,13 @@ var tl_browser = {
|
||||
"wt_pst": "paste a previously cut / copied selection$NHotkey: ctrl-V",
|
||||
"wt_selall": "select all files$NHotkey: ctrl-A (when file focused)",
|
||||
"wt_selinv": "invert selection",
|
||||
"wt_zip1": "download this folder as archive",
|
||||
"wt_selzip": "download selection as archive",
|
||||
"wt_seldl": "download selection as separate files$NHotkey: Y",
|
||||
"wt_npirc": "copy irc-formatted track info",
|
||||
"wt_nptxt": "copy plaintext track info",
|
||||
"wt_m3ua": "add to m3u playlist (click <code>📻copy</code> later)",
|
||||
"wt_m3uc": "copy m3u playlist to clipboard",
|
||||
"wt_grid": "toggle grid / list view$NHotkey: G",
|
||||
"wt_prev": "previous track$NHotkey: J",
|
||||
"wt_play": "play / pause$NHotkey: P",
|
||||
@@ -332,6 +335,8 @@ var tl_browser = {
|
||||
|
||||
"cut_mt": "use multithreading to accelerate file hashing$N$Nthis uses web-workers and requires$Nmore RAM (up to 512 MiB extra)$N$Nmakes https 30% faster, http 4.5x faster\">mt",
|
||||
|
||||
"cut_wasm": "use wasm instead of the browser's built-in hasher; improves speed on chrome-based browsers but increases CPU load, and many older versions of chrome have bugs which makes the browser consume all RAM and crash if this is enabled\">wasm",
|
||||
|
||||
"cft_text": "favicon text (blank and refresh to disable)",
|
||||
"cft_fg": "foreground color",
|
||||
"cft_bg": "background color",
|
||||
@@ -366,6 +371,7 @@ var tl_browser = {
|
||||
"mt_fau": "on phones, prevent music from stopping if the next song doesn't preload fast enough (can make tags display glitchy)\">☕️",
|
||||
"mt_waves": "waveform seekbar:$Nshow audio amplitude in the scrubber\">~s",
|
||||
"mt_npclip": "show buttons for clipboarding the currently playing song\">/np",
|
||||
"mt_m3u_c": "show buttons for clipboarding the$Nselected songs as m3u8 playlist entries\">📻",
|
||||
"mt_octl": "os integration (media hotkeys / osd)\">os-ctl",
|
||||
"mt_oseek": "allow seeking through os integration$N$Nnote: on some devices (iPhones),$Nthis replaces the next-song button\">seek",
|
||||
"mt_oscv": "show album cover in osd\">art",
|
||||
@@ -391,6 +397,7 @@ var tl_browser = {
|
||||
|
||||
"mb_play": "play",
|
||||
"mm_hashplay": "play this audio file?",
|
||||
"mm_m3u": "press <code>Enter/OK</code> to Play\npress <code>ESC/Cancel</code> to Edit",
|
||||
"mp_breq": "need firefox 82+ or chrome 73+ or iOS 15+",
|
||||
"mm_bload": "now loading...",
|
||||
"mm_bconv": "converting to {0}, please wait...",
|
||||
@@ -416,6 +423,7 @@ var tl_browser = {
|
||||
"f_empty": 'this folder is empty',
|
||||
"f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
|
||||
"f_bigtxt": "this file is {0} MiB large -- really view as text?",
|
||||
"f_bigtxt2": "view just the end of the file instead? this will also enable following/tailing, showing newly added lines of text in real time",
|
||||
"fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
|
||||
"fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
|
||||
"f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",
|
||||
@@ -520,6 +528,15 @@ var tl_browser = {
|
||||
"tvt_next": "show next document$NHotkey: K\">⬇ next",
|
||||
"tvt_sel": "select file ( for cut / copy / delete / ... )$NHotkey: S\">sel",
|
||||
"tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
|
||||
"tvt_tail": "monitor file for changes; show new lines in real time\">📡 follow",
|
||||
"tvt_wrap": "word-wrap\">↵",
|
||||
"tvt_atail": "lock scroll to bottom of page\">⚓",
|
||||
"tvt_ctail": "decode terminal colors (ansi escape codes)\">🌈",
|
||||
"tvt_ntail": "scrollback limit (how many bytes of text to keep loaded)",
|
||||
|
||||
"m3u_add1": "song added to m3u playlist",
|
||||
"m3u_addn": "{0} songs added to m3u playlist",
|
||||
"m3u_clip": "m3u playlist now copied to clipboard\n\nyou should create a new textfile named something.m3u and paste the playlist in that document; this will make it playable",
|
||||
|
||||
"gt_vau": "don't show videos, just play the audio\">🎧",
|
||||
"gt_msel": "enable file selection; ctrl-click a file to override$N$N<em>when active: doubleclick a file / folder to open it</em>$N$NHotkey: S\">multiselect",
|
||||
@@ -615,6 +632,7 @@ var tl_browser = {
|
||||
"u_https3": "for better performance",
|
||||
"u_ancient": 'your browser is impressively ancient -- maybe you should <a href="#" onclick="goto(\'bup\')">use bup instead</a>',
|
||||
"u_nowork": "need firefox 53+ or chrome 57+ or iOS 11+",
|
||||
"tail_2old": "need firefox 105+ or chrome 71+ or iOS 14.5+",
|
||||
"u_nodrop": 'your browser is too old for drag-and-drop uploading',
|
||||
"u_notdir": "that's not a folder!\n\nyour browser is too old,\nplease try dragdrop instead",
|
||||
"u_uri": "to dragdrop images from other browser windows,\nplease drop it onto the big upload button",
|
||||
|
||||
46
tests/res/idp/7.conf
Normal file
@@ -0,0 +1,46 @@
|
||||
# -*- mode: yaml -*-
|
||||
# vim: ft=yaml:
|
||||
|
||||
[global]
|
||||
idp-h-usr: x-idp-user
|
||||
idp-h-grp: x-idp-group
|
||||
|
||||
[/u/${u}]
|
||||
/u/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/uya/${u%+ga}]
|
||||
/uya/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/uyab/${u%+ga,%+gb}]
|
||||
/uyab/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/una/${u%-ga}]
|
||||
/una/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/unab/${u%-ga,%-gb}]
|
||||
/unab/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/gya/${g%+ga}]
|
||||
/gya/${g}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/gna/${g%-ga}]
|
||||
/gna/${g}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/gnab/${g%-ga,%-gb}]
|
||||
/gnab/${g}
|
||||
accs:
|
||||
r: *
|
||||
47
tests/res/idp/8.conf
Normal file
@@ -0,0 +1,47 @@
|
||||
# -*- mode: yaml -*-
|
||||
# vim: ft=yaml:
|
||||
|
||||
[groups]
|
||||
ga: iua, iuab, iuabc
|
||||
gb: iuab, iuabc, iub, iubc
|
||||
gc: iuabc, iubc, iuc
|
||||
|
||||
[/u/${u}]
|
||||
/u/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/uya/${u%+ga}]
|
||||
/uya/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/uyab/${u%+ga,%+gb}]
|
||||
/uyab/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/una/${u%-ga}]
|
||||
/una/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/unab/${u%-ga,%-gb}]
|
||||
/unab/${u}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/gya/${g%+ga}]
|
||||
/gya/${g}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/gna/${g%-ga}]
|
||||
/gna/${g}
|
||||
accs:
|
||||
r: *
|
||||
|
||||
[/gnab/${g%-ga,%-gb}]
|
||||
/gnab/${g}
|
||||
accs:
|
||||
r: *
|
||||
@@ -234,3 +234,74 @@ class TestVFS(unittest.TestCase):
|
||||
au.idp_checkin(None, "iud", "su")
|
||||
self.assertAxsAt(au, "team/su/iuc", [["iuc", "iud"]])
|
||||
self.assertAxsAt(au, "team/su/iud", [["iuc", "iud"]])
|
||||
|
||||
def test_7(self):
|
||||
"""
|
||||
conditional idp-vols
|
||||
"""
|
||||
_, cfgdir, xcfg = self.prep()
|
||||
au = AuthSrv(Cfg(c=[cfgdir + "/7.conf"], **xcfg), self.log)
|
||||
au.idp_checkin(None, "iua", "ga")
|
||||
au.idp_checkin(None, "iuab", "ga,gb")
|
||||
au.idp_checkin(None, "iuabc", "ga,gb,gc")
|
||||
au.idp_checkin(None, "iub", "gb")
|
||||
au.idp_checkin(None, "iubc", "gb,gc")
|
||||
au.idp_checkin(None, "iuc", "gc")
|
||||
zs = """
|
||||
u/iua
|
||||
u/iuab
|
||||
u/iuabc
|
||||
u/iub
|
||||
u/iubc
|
||||
u/iuc
|
||||
uya/iua
|
||||
uya/iuab
|
||||
uya/iuabc
|
||||
uyab/iuab
|
||||
uyab/iuabc
|
||||
una/iub
|
||||
una/iubc
|
||||
una/iuc
|
||||
unab/iuc
|
||||
gya/ga
|
||||
gna/gb
|
||||
gna/gc
|
||||
gnab/gc
|
||||
"""
|
||||
zl1 = sorted(zs.strip().split("\n"))[:]
|
||||
zl2 = sorted(list(au.vfs.all_vols))[:]
|
||||
# print(" ".join(zl1))
|
||||
# print(" ".join(zl2))
|
||||
self.assertListEqual(zl1, zl2)
|
||||
|
||||
def test_8(self):
|
||||
"""
|
||||
conditional non-idp vols
|
||||
"""
|
||||
_, cfgdir, xcfg = self.prep()
|
||||
xcfg = {"vc": True}
|
||||
au = AuthSrv(Cfg(c=[cfgdir + "/8.conf"], **xcfg), self.log)
|
||||
zs = """
|
||||
u/iua
|
||||
u/iuab
|
||||
u/iuabc
|
||||
u/iub
|
||||
u/iubc
|
||||
u/iuc
|
||||
uya/iua
|
||||
uya/iuab
|
||||
uya/iuabc
|
||||
uyab/iuab
|
||||
uyab/iuabc
|
||||
una/iub
|
||||
una/iubc
|
||||
una/iuc
|
||||
unab/iuc
|
||||
gya/ga
|
||||
gna/gb
|
||||
gna/gc
|
||||
gnab/gc
|
||||
"""
|
||||
zl1 = sorted(zs.strip().split("\n"))[:]
|
||||
zl2 = sorted(list(au.vfs.all_vols))[:]
|
||||
self.assertListEqual(zl1, zl2)
|
||||
|
||||
@@ -82,6 +82,19 @@ def get_ramdisk():
|
||||
return subdir(vol)
|
||||
|
||||
if os.path.exists("/Volumes"):
|
||||
sck = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
while True:
|
||||
try:
|
||||
sck.bind(("127.0.0.1", 2775))
|
||||
break
|
||||
except:
|
||||
print("waiting for 2775")
|
||||
time.sleep(0.5)
|
||||
|
||||
v = "/Volumes/cptd"
|
||||
if os.path.exists(v):
|
||||
return subdir(v)
|
||||
|
||||
# hdiutil eject /Volumes/cptd/
|
||||
devname, _ = chkcmd("hdiutil attach -nomount ram://131072".split())
|
||||
devname = devname.strip()
|
||||
@@ -97,6 +110,7 @@ def get_ramdisk():
|
||||
except:
|
||||
pass
|
||||
|
||||
sck.close()
|
||||
return subdir("/Volumes/cptd")
|
||||
except Exception as ex:
|
||||
print(repr(ex))
|
||||
@@ -129,22 +143,22 @@ class Cfg(Namespace):
|
||||
def __init__(self, a=None, v=None, c=None, **ka0):
|
||||
ka = {}
|
||||
|
||||
ex = "chpw daw dav_auth dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink ih ihead magic hardlink_only nid nih no_acode no_athumb no_bauth no_clone no_cp no_dav no_db_ip no_del no_dirsz no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tarcmp no_thumb no_vthumb no_zip nrand nsort nw og og_no_head og_s_title ohead q rand re_dirsz rss smb srch_dbg srch_excl stats uqe vague_403 vc ver wo_up_readme write_uplog xdev xlink xvol zipmaxu zs"
|
||||
ex = "chpw daw dav_auth dav_mac dav_rt e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp early_ban ed emp exp force_js getmod grid gsel hardlink hardlink_only ih ihead magic nid nih no_acode no_athumb no_bauth no_clone no_cp no_dav no_db_ip no_del no_dirsz no_dupe no_lifetime no_logues no_mv no_pipe no_poll no_readme no_robots no_sb_md no_sb_lg no_scandir no_tail no_tarcmp no_thumb no_vthumb no_zip nrand nsort nw og og_no_head og_s_title ohead q rand re_dirsz rmagic rss smb srch_dbg srch_excl stats uqe vague_403 vc ver wo_up_readme write_uplog xdev xlink xvol zipmaxu zs"
|
||||
ka.update(**{k: False for k in ex.split()})
|
||||
|
||||
ex = "dav_inf dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_up_list no_voldump re_dhash plain_ip"
|
||||
ex = "dav_inf dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_up_list no_voldump re_dhash see_dots plain_ip"
|
||||
ka.update(**{k: True for k in ex.split()})
|
||||
|
||||
ex = "ah_cli ah_gen css_browser dbpath hist ipu js_browser js_other mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua ua_nodoc ua_nozip"
|
||||
ka.update(**{k: None for k in ex.split()})
|
||||
|
||||
ex = "hash_mt hsortn safe_dedup srch_time u2abort u2j u2sz"
|
||||
ex = "hash_mt hsortn safe_dedup srch_time tail_fd tail_rate u2abort u2j u2sz"
|
||||
ka.update(**{k: 1 for k in ex.split()})
|
||||
|
||||
ex = "au_vol dl_list mtab_age reg_cap s_thead s_tbody th_convt ups_who zip_who"
|
||||
ex = "au_vol dl_list mtab_age reg_cap s_thead s_tbody tail_tmax tail_who th_convt ups_who zip_who"
|
||||
ka.update(**{k: 9 for k in ex.split()})
|
||||
|
||||
ex = "db_act forget_ip k304 loris no304 re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo u2ow zipmaxn zipmaxs"
|
||||
ex = "db_act forget_ip k304 loris no304 nosubtle re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo u2ow zipmaxn zipmaxs"
|
||||
ka.update(**{k: 0 for k in ex.split()})
|
||||
|
||||
ex = "ah_alg bname chpw_db doctitle df exit favico idp_h_usr ipa html_head lg_sba lg_sbf log_fk md_sba md_sbf name og_desc og_site og_th og_title og_title_a og_title_v og_title_i shr tcolor textfiles unlist vname xff_src zipmaxt R RS SR"
|
||||
@@ -166,6 +180,7 @@ class Cfg(Namespace):
|
||||
v=v or [],
|
||||
c=c,
|
||||
E=E,
|
||||
bup_ck="sha512",
|
||||
dbd="wal",
|
||||
dk_salt="b" * 16,
|
||||
fk_salt="a" * 16,
|
||||
@@ -178,6 +193,8 @@ class Cfg(Namespace):
|
||||
mte={"a": True},
|
||||
mth={},
|
||||
mtp=[],
|
||||
put_ck="sha512",
|
||||
put_name="put-{now.6f}-{cip}.bin",
|
||||
mv_retry="0/0",
|
||||
rm_retry="0/0",
|
||||
s_rd_sz=256 * 1024,
|
||||
|
||||