Compare commits

41 Commits

| Author | SHA1 | Date |
|---|---|---|
|  | 42099baeff |  |
|  | 2459965ca8 |  |
|  | 6acf436573 |  |
|  | f217e1ce71 |  |
|  | 418000aee3 |  |
|  | dbbba9625b |  |
|  | 397bc92fbc |  |
|  | 6e615dcd03 |  |
|  | 9ac5908b33 |  |
|  | 50912480b9 |  |
|  | 24b9b8319d |  |
|  | b0f4f0b653 |  |
|  | 05bbd41c4b |  |
|  | 8f5f8a3cda |  |
|  | c8938fc033 |  |
|  | 1550350e05 |  |
|  | 5cc190c026 |  |
|  | d6a0a738ce |  |
|  | f5fe3678ee |  |
|  | f2a7925387 |  |
|  | fa953ced52 |  |
|  | f0000d9861 |  |
|  | 4e67516719 |  |
|  | 29db7a6270 |  |
|  | 852499e296 |  |
|  | f1775fd51c |  |
|  | 4bb306932a |  |
|  | 2a37e81bd8 |  |
|  | 6a312ca856 |  |
|  | e7f3e475a2 |  |
|  | 854ba0ec06 |  |
|  | 209b49d771 |  |
|  | 949baae539 |  |
|  | 5f4ea27586 |  |
|  | 099cc97247 |  |
|  | 592b7d6315 |  |
|  | 0880bf55a1 |  |
|  | 4cbffec0ec |  |
|  | cc355417d4 |  |
|  | e2bc573e61 |  |
|  | 41c0376177 |  |
6  .gitignore  (vendored)

@@ -21,6 +21,9 @@ copyparty.egg-info/
 # winmerge
 *.bak

+# apple pls
+.DS_Store
+
 # derived
 copyparty/res/COPYING.txt
 copyparty/web/deps/
@@ -34,3 +37,6 @@ up.*.txt
 .hist/
 scripts/docker/*.out
 scripts/docker/*.err
+
+# nix build output link
+result
190  README.md

@@ -39,6 +39,8 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [self-destruct](#self-destruct) - uploads can be given a lifetime
 * [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
 * [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
+* [media player](#media-player) - plays almost every audio format there is
+* [audio equalizer](#audio-equalizer) - bass boosted
 * [markdown viewer](#markdown-viewer) - and there are *two* editors
 * [other tricks](#other-tricks)
 * [searching](#searching) - search by size, date, path/name, mp3-tags, ...
@@ -67,6 +69,8 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [themes](#themes)
 * [complete examples](#complete-examples)
 * [reverse-proxy](#reverse-proxy) - running copyparty next to other websites
+* [nix package](#nix-package) - `nix profile install github:9001/copyparty`
+* [nixos module](#nixos-module)
 * [browser support](#browser-support) - TLDR: yes
 * [client examples](#client-examples) - interact with copyparty using non-browser clients
 * [folder sync](#folder-sync) - sync folders to/from copyparty
@@ -82,7 +86,7 @@ turn almost any device into a file server with resumable uploads/downloads using
 * [recovering from crashes](#recovering-from-crashes)
 * [client crashes](#client-crashes)
 * [frefox wsod](#frefox-wsod) - firefox 87 can crash during uploads
-* [HTTP API](#HTTP-API) - see [devnotes](#./docs/devnotes.md#http-api)
+* [HTTP API](#HTTP-API) - see [devnotes](./docs/devnotes.md#http-api)
 * [dependencies](#dependencies) - mandatory deps
 * [optional dependencies](#optional-dependencies) - install these to enable bonus features
 * [optional gpl stuff](#optional-gpl-stuff)
@@ -99,6 +103,7 @@ just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/

 * or install through pypi (python3 only): `python3 -m pip install --user -U copyparty`
 * or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
+* or [install through nix](#nix-package), or [on NixOS](#nixos-module)
 * or if you are on android, [install copyparty in termux](#install-on-android)
 * or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too
 * docker has all deps built-in, so skip this step:
@@ -118,6 +123,8 @@ enable thumbnails (images/audio/video), media indexing, and audio transcoding by

 running copyparty without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes)

+or see [complete windows example](./docs/examples/windows.md)
+
 some recommended options:
 * `-e2dsa` enables general [file indexing](#file-indexing)
 * `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen)
@@ -130,10 +137,11 @@ some recommended options:

 you may also want these, especially on servers:

-* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service
+* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service (see guide inside)
 * [contrib/systemd/prisonparty.service](contrib/systemd/prisonparty.service) to run it in a chroot (for extra security)
 * [contrib/rc/copyparty](contrib/rc/copyparty) to run copyparty on FreeBSD
 * [contrib/nginx/copyparty.conf](contrib/nginx/copyparty.conf) to [reverse-proxy](#reverse-proxy) behind nginx (for better https)
+* [nixos module](#nixos-module) to run copyparty on NixOS hosts

 and remember to open the ports you want; here's a complete example including every feature copyparty has to offer:
 ```
@@ -168,7 +176,7 @@ firewall-cmd --reload
 * ☑ write-only folders
 * ☑ [unpost](#unpost): undo/delete accidental uploads
 * ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
-* ☑ symlink/discard existing files (content-matching)
+* ☑ symlink/discard duplicates (content-matching)
 * download
 * ☑ single files in browser
 * ☑ [folders as zip / tar files](#zip-downloads)
@@ -649,12 +657,63 @@ or a mix of both:
 the metadata keys you can use in the format field are the ones in the file-browser table header (whatever is collected with `-mte` and `-mtp`)


+## media player
+
+plays almost every audio format there is (if the server has FFmpeg installed for on-demand transcoding)
+
+the following audio formats are usually always playable, even without FFmpeg: `aac|flac|m4a|mp3|ogg|opus|wav`
+
+some hilights:
+* OS integration; control playback from your phone's lockscreen ([windows](https://user-images.githubusercontent.com/241032/233213022-298a98ba-721a-4cf1-a3d4-f62634bc53d5.png) // [iOS](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) // [android](https://user-images.githubusercontent.com/241032/233212311-a7368590-08c7-4f9f-a1af-48ccf3f36fad.png))
+* shows the audio waveform in the seekbar
+* not perfectly gapless but can get really close (see settings + eq below); good enough to enjoy gapless albums as intended
+
+click the `play` link next to an audio file, or copy the link target to [share it](https://a.ocv.me/pub/demo/music/Ubiktune%20-%20SOUNDSHOCK%202%20-%20FM%20FUNK%20TERRROR!!/#af-1fbfba61&t=18) (optionally with a timestamp to start playing from, like that example does)
+
+open the `[🎺]` media-player-settings tab to configure it,
+* switches:
+  * `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
+  * `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
+  * `[~s]` toggles the seekbar waveform display
+  * `[/np]` enables buttons to copy the now-playing info as an irc message
+  * `[os-ctl]` makes it possible to control audio playback from the lockscreen of your device (enables [mediasession](https://developer.mozilla.org/en-US/docs/Web/API/MediaSession))
+    * `[seek]` allows seeking with lockscreen controls (buggy on some devices)
+    * `[art]` shows album art on the lockscreen
+  * `[🎯]` keeps the playing song scrolled into view (good when using the player as a taskbar dock)
+  * `[⟎]` shrinks the playback controls
+* playback mode:
+  * `[loop]` keeps looping the folder
+  * `[next]` plays into the next folder
+* transcode:
+  * `[flac]` converts `flac` and `wav` files into opus
+  * `[aac]` converts `aac` and `m4a` files into opus
+  * `[oth]` converts all other known formats into opus
+    * `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
+* "tint" reduces the contrast of the playback bar
+
+
+### audio equalizer
+
+bass boosted
+
+can also boost the volume in general, or increase/decrease stereo width (like [crossfeed](https://www.foobar2000.org/components/view/foo_dsp_meiercf) just worse)
+
+has the convenient side-effect of reducing the pause between songs, so gapless albums play better with the eq enabled (just make it flat)
+
+
 ## markdown viewer

 and there are *two* editors

 ![copyparty-lightbox-fs8](https://user-images.githubusercontent.com/241032/194313539-3c087207-cdf2-4912-902b-b96c9c629a28.png)

+there is a built-in extension for inline clickable thumbnails;
+* enable it by adding `<!-- th -->` somewhere in the doc
+* add thumbnails with `!th[l](your.jpg)` where `l` means left-align (`r` = right-align)
+* a single line with `---` clears the float / inlining
+* in the case of README.md being displayed below a file listing, thumbnails will open in the gallery viewer
+
+other notes,
 * the document preview has a max-width which is the same as an A4 paper when printed

@@ -1078,6 +1137,8 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where

 ## complete examples

+* [running on windows](./docs/examples/windows.md)
+
 * read-only music server
   `python copyparty-sfx.py -v /mnt/nas/music:/music:r -e2dsa -e2ts --no-robots --force-js --theme 2`

@@ -1108,6 +1169,110 @@ example webserver configs:
 * [apache2 config](contrib/apache/copyparty.conf) -- location-based


+## nix package
+
+`nix profile install github:9001/copyparty`
+
+requires a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of nix
+
+some recommended dependencies are enabled by default; [override the package](https://github.com/9001/copyparty/blob/hovudstraum/contrib/package/nix/copyparty/default.nix#L3-L22) if you want to add/remove some features/deps
+
+`ffmpeg-full` was chosen over `ffmpeg-headless` mainly because we need `withWebp` (and `withOpenmpt` is also nice) and being able to use a cached build felt more important than optimizing for size at the time -- PRs welcome if you disagree 👍
+
+
+## nixos module
+
+for this setup, you will need a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of NixOS.
+
+```nix
+{
+  # add copyparty flake to your inputs
+  inputs.copyparty.url = "github:9001/copyparty";
+
+  # ensure that copyparty is an allowed argument to the outputs function
+  outputs = { self, nixpkgs, copyparty }: {
+    nixosConfigurations.yourHostName = nixpkgs.lib.nixosSystem {
+      modules = [
+        # load the copyparty NixOS module
+        copyparty.nixosModules.default
+        ({ pkgs, ... }: {
+          # add the copyparty overlay to expose the package to the module
+          nixpkgs.overlays = [ copyparty.overlays.default ];
+          # (optional) install the package globally
+          environment.systemPackages = [ pkgs.copyparty ];
+          # configure the copyparty module
+          services.copyparty.enable = true;
+        })
+      ];
+    };
+  };
+}
+```
+
+copyparty on NixOS is configured via `services.copyparty` options, for example:
+
+```nix
+services.copyparty = {
+  enable = true;
+  # directly maps to values in the [global] section of the copyparty config.
+  # see `copyparty --help` for available options
+  settings = {
+    i = "0.0.0.0";
+    # use lists to set multiple values
+    p = [ 3210 3211 ];
+    # use booleans to set binary flags
+    no-reload = true;
+    # using 'false' will do nothing and omit the value when generating a config
+    ignored-flag = false;
+  };
+
+  # create users
+  accounts = {
+    # specify the account name as the key
+    ed = {
+      # provide the path to a file containing the password, keeping it out of /nix/store
+      # must be readable by the copyparty service user
+      passwordFile = "/run/keys/copyparty/ed_password";
+    };
+    # or do both in one go
+    k.passwordFile = "/run/keys/copyparty/k_password";
+  };
+
+  # create a volume
+  volumes = {
+    # create a volume at "/" (the webroot), which will
+    "/" = {
+      # share the contents of "/srv/copyparty"
+      path = "/srv/copyparty";
+      # see `copyparty --help-accounts` for available options
+      access = {
+        # everyone gets read-access, but
+        r = "*";
+        # users "ed" and "k" get read-write
+        rw = [ "ed" "k" ];
+      };
+      # see `copyparty --help-flags` for available options
+      flags = {
+        # "fk" enables filekeys (necessary for upget permission) (4 chars long)
+        fk = 4;
+        # scan for new files every 60sec
+        scan = 60;
+        # volflag "e2d" enables the uploads database
+        e2d = true;
+        # "d2t" disables multimedia parsers (in case the uploads are malicious)
+        d2t = true;
+        # skips hashing file contents if path matches *.iso
+        nohash = "\.iso$";
+      };
+    };
+  };
+  # you may increase the open file limit for the process
+  openFilesLimit = 8192;
+};
+```
+
+the passwordFile at /run/keys/copyparty/ could for example be generated by [agenix](https://github.com/ryantm/agenix), or you could just dump it in the nix store instead if that's acceptable
+
+
 # browser support

 TLDR: yes
@@ -1206,7 +1371,9 @@ sync folders to/from copyparty

 the commandline uploader [up2k.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy) with `--dr` is the best way to sync a folder to copyparty; verifies checksums and does files in parallel, and deletes unexpected files on the server after upload has finished which makes file-renames really cheap (it'll rename serverside and skip uploading)

-alternatively there is [rclone](./docs/rclone.md) which allows for bidirectional sync and is *way* more flexible (stream files straight from sftp/s3/gcs to copyparty for instance), although syncing to copyparty is about 5x slower than up2k.py if you have many small files in particular
+alternatively there is [rclone](./docs/rclone.md) which allows for bidirectional sync and is *way* more flexible (stream files straight from sftp/s3/gcs to copyparty, ...), although there is no integrity check and it won't work with files over 100 MiB if copyparty is behind cloudflare

+* starting from rclone v1.63 (currently [in beta](https://beta.rclone.org/?filter=latest)), rclone will also be faster than up2k.py
+

 ## mount as drive
@@ -1215,11 +1382,10 @@ a remote copyparty server as a local filesystem; go to the control-panel and cl

 alternatively, some alternatives roughly sorted by speed (unreproducible benchmark), best first:

-* [rclone-http](./docs/rclone.md) (25s), read-only
+* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE ([v1.63-beta](https://beta.rclone.org/?filter=latest))
+* [rclone-http](./docs/rclone.md) (26s), read-only
+* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
 * [rclone-ftp](./docs/rclone.md) (47s), read/WRITE
-* [rclone-webdav](./docs/rclone.md) (51s), read/WRITE
-* copyparty-1.5.0's webdav server is faster than rclone-1.60.0 (69s)
-* [partyfuse.py](./bin/#partyfusepy) (71s), read-only
 * davfs2 (103s), read/WRITE, *very fast* on small files
 * [win10-webdav](#webdav-server) (138s), read/WRITE
 * [win10-smb2](#smb-server) (387s), read/WRITE
@@ -1259,11 +1425,13 @@ below are some tweaks roughly ordered by usefulness:
 * `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
 * `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
 * `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
-* `-j` enables multiprocessing (actual multithreading) and can make copyparty perform better in cpu-intensive workloads, for example:
+* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
-* huge amount of short-lived connections
+* lots of connections (many users or heavy clients)
 * simultaneous downloads and uploads saturating a 20gbps connection

 ...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
+* using [pypy](https://www.pypy.org/) instead of [cpython](https://www.python.org/) *can* be 70% faster for some workloads, but slower for many others
+* and pypy can sometimes crash on startup with `-j0` (TODO make issue)


 ## client-side
@@ -1372,7 +1540,7 @@ however you can hit `F12` in the up2k tab and use the devtools to see how far yo

 # HTTP API

-see [devnotes](#./docs/devnotes.md#http-api)
+see [devnotes](./docs/devnotes.md#http-api)


 # dependencies
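As a side note on the `-j0` change above: the README quotes an estimated latency of `20+80/numCores` percent of the single-process latency. A tiny illustration of that arithmetic (not from the repo, just the formula evaluated for a few core counts):

```python
# rough reading of the `20+80/numCores` note in the -j0 bullet above:
# per-request latency with -j0 is estimated at (20 + 80/numCores) percent
# of what a single process would take
def estimated_latency_pct(num_cores: int) -> float:
    return 20 + 80 / num_cores

for cores in (1, 2, 4, 8, 16):
    print(f"{cores:2d} cores -> ~{estimated_latency_pct(cores):.0f}% of original latency")
# e.g. 4 cores -> ~40%, 16 cores -> ~25%
```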
@@ -10,6 +10,7 @@ run copyparty with `--help-hooks` for usage details / hook type explanations (xb
 # after upload
 * [notify.py](notify.py) shows a desktop notification ([example](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png))
 * [notify2.py](notify2.py) uses the json API to show more context
+* [image-noexif.py](image-noexif.py) removes image exif by overwriting / directly editing the uploaded file
 * [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
 * [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable

@@ -13,9 +13,15 @@ example usage as global config:
 --xau f,t5,j,bin/hooks/discord-announce.py

 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xau=f,t5,j,bin/hooks/discord-announce.py
+-v srv/inc:inc:r:rw,ed:c,xau=f,t5,j,bin/hooks/discord-announce.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)

 parameters explained,
+xau = execute after upload
 f = fork; don't wait for it to finish
 t5 = timeout if it's still running after 5 sec
 j = provide upload information as json; not just the filename
@@ -30,6 +36,7 @@ then use this to design your message: https://discohook.org/

 def main():
     WEBHOOK = "https://discord.com/api/webhooks/1234/base64"
+    WEBHOOK = "https://discord.com/api/webhooks/1066830390280597718/M1TDD110hQA-meRLMRhdurych8iyG35LDoI1YhzbrjGP--BXNZodZFczNVwK4Ce7Yme5"

     # read info from copyparty
     inf = json.loads(sys.argv[1])
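For orientation, a minimal hook of the same shape as the hunk above is sketched below; it assumes only what the diff itself shows (with the `j` flag, copyparty passes the upload info as JSON in `argv[1]`). The JSON field names are not shown in this diff, so the sketch just dumps whatever arrives; the hook filename is hypothetical.

```python
#!/usr/bin/env python3
# minimal xau-style hook sketch, invoked by copyparty as:  --xau f,j,bin/hooks/my-hook.py
# (assumes only what the hunk above shows: with the j flag, argv[1] is a JSON blob)

import json
import sys


def main():
    inf = json.loads(sys.argv[1])  # upload info from copyparty
    # print the received keys/values; replace this with a webhook call, notification, etc.
    for k, v in inf.items():
        print(f"{k} = {v!r}")


if __name__ == "__main__":
    main()
```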
72  bin/hooks/image-noexif.py  (new executable file)

@@ -0,0 +1,72 @@
#!/usr/bin/env python3

import os
import sys
import subprocess as sp


_ = r"""
remove exif tags from uploaded images; the eventhook edition of
https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py

dependencies:
exiftool / perl-Image-ExifTool

being an upload hook, this will take effect after upload completion
but before copyparty has hashed/indexed the file, which means that
copyparty will never index the original file, so deduplication will
not work as expected... which is mostly OK but ehhh

note: modifies the file in-place, so don't set the `f` (fork) flag

example usages; either as global config (all volumes) or as volflag:
--xau bin/hooks/image-noexif.py
-v srv/inc:inc:r:rw,ed:c,xau=bin/hooks/image-noexif.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

explained:
share fs-path srv/inc at /inc (readable by all, read-write for user ed)
running this xau (execute-after-upload) plugin for all uploaded files
"""


# filetypes to process; ignores everything else
EXTS = ("jpg", "jpeg", "avif", "heif", "heic")


try:
    from copyparty.util import fsenc
except:

    def fsenc(p):
        return p.encode("utf-8")


def main():
    fp = sys.argv[1]
    ext = fp.lower().split(".")[-1]
    if ext not in EXTS:
        return

    cwd, fn = os.path.split(fp)
    os.chdir(cwd)
    f1 = fsenc(fn)
    cmd = [
        b"exiftool",
        b"-exif:all=",
        b"-iptc:all=",
        b"-xmp:all=",
        b"-P",
        b"-overwrite_original",
        b"--",
        f1,
    ]
    sp.check_output(cmd)
    print("image-noexif: stripped")


if __name__ == "__main__":
    try:
        main()
    except:
        pass
@@ -17,8 +17,12 @@ depdencies:

 example usages; either as global config (all volumes) or as volflag:
 --xau f,bin/hooks/notify.py
--v srv/inc:inc:c,xau=f,bin/hooks/notify.py
+-v srv/inc:inc:r:rw,ed:c,xau=f,bin/hooks/notify.py
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)

 parameters explained,
 xau = execute after upload
@@ -15,9 +15,13 @@ and also supports --xm (notify on 📟 message)
 example usages; either as global config (all volumes) or as volflag:
 --xm f,j,bin/hooks/notify2.py
 --xau f,j,bin/hooks/notify2.py
--v srv/inc:inc:c,xm=f,j,bin/hooks/notify2.py
+-v srv/inc:inc:r:rw,ed:c,xm=f,j,bin/hooks/notify2.py
--v srv/inc:inc:c,xau=f,j,bin/hooks/notify2.py
+-v srv/inc:inc:r:rw,ed:c,xau=f,j,bin/hooks/notify2.py
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads / msgs with the params listed below)

 parameters explained,
 xau = execute after upload
@@ -10,7 +10,12 @@ example usage as global config:
 --xbu c,bin/hooks/reject-extension.py

 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xbu=c,bin/hooks/reject-extension.py
+-v srv/inc:inc:r:rw,ed:c,xbu=c,bin/hooks/reject-extension.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)

 parameters explained,
 xbu = execute before upload
@@ -17,7 +17,12 @@ example usage as global config:
 --xau c,bin/hooks/reject-mimetype.py

 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xau=c,bin/hooks/reject-mimetype.py
+-v srv/inc:inc:r:rw,ed:c,xau=c,bin/hooks/reject-mimetype.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)

 parameters explained,
 xau = execute after upload
@@ -15,9 +15,15 @@ example usage as global config:
 --xm f,j,t3600,bin/hooks/wget.py

 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xm=f,j,t3600,bin/hooks/wget.py
+-v srv/inc:inc:r:rw,ed:c,xm=f,j,t3600,bin/hooks/wget.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all messages with the params listed below)

 parameters explained,
+xm = execute on message-to-server-log
 f = fork so it doesn't block uploads
 j = provide message information as json; not just the text
 c3 = mute all output
@@ -18,7 +18,12 @@ example usage as global config:
 --xiu i5,j,bin/hooks/xiu-sha.py

 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xiu=i5,j,bin/hooks/xiu-sha.py
+-v srv/inc:inc:r:rw,ed:c,xiu=i5,j,bin/hooks/xiu-sha.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on batches of uploads with the params listed below)

 parameters explained,
 xiu = execute after uploads...
@@ -15,7 +15,12 @@ example usage as global config:
 --xiu i1,j,bin/hooks/xiu.py

 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xiu=i1,j,bin/hooks/xiu.py
+-v srv/inc:inc:r:rw,ed:c,xiu=i1,j,bin/hooks/xiu.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on batches of uploads with the params listed below)

 parameters explained,
 xiu = execute after uploads...
@@ -16,6 +16,10 @@ dep: ffmpeg
 """


+# save beat timestamps to ".beats/filename.txt"
+SAVE = False
+
+
 def det(tf):
     # fmt: off
     sp.check_call([
@@ -23,12 +27,11 @@ def det(tf):
         b"-nostdin",
         b"-hide_banner",
         b"-v", b"fatal",
-        b"-ss", b"13",
         b"-y", b"-i", fsenc(sys.argv[1]),
         b"-map", b"0:a:0",
         b"-ac", b"1",
         b"-ar", b"22050",
-        b"-t", b"300",
+        b"-t", b"360",
         b"-f", b"f32le",
         fsenc(tf)
     ])
@@ -47,10 +50,29 @@ def det(tf):
         print(c["list"][0]["label"].split(" ")[0])
         return

     # throws if detection failed:
-    bpm = float(cl[-1]["timestamp"] - cl[1]["timestamp"])
-    bpm = round(60 * ((len(cl) - 1) / bpm), 2)
-    print(f"{bpm:.2f}")
+    beats = [float(x["timestamp"]) for x in cl]
+    bds = [b - a for a, b in zip(beats, beats[1:])]
+    bds.sort()
+    n0 = int(len(bds) * 0.2)
+    n1 = int(len(bds) * 0.75) + 1
+    bds = bds[n0:n1]
+    bpm = sum(bds)
+    bpm = round(60 * (len(bds) / bpm), 2)
+    print(f"{bpm:.2f}")
+
+    if SAVE:
+        fdir, fname = os.path.split(sys.argv[1])
+        bdir = os.path.join(fdir, ".beats")
+        try:
+            os.mkdir(fsenc(bdir))
+        except:
+            pass
+
+        fp = os.path.join(bdir, fname) + ".txt"
+        with open(fsenc(fp), "wb") as f:
+            txt = "\n".join([f"{x:.2f}" for x in beats])
+            f.write(txt.encode("utf-8"))


 def main():
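The new calculation above replaces the old first-to-last-beat estimate with a trimmed mean of the beat-to-beat intervals (sort the intervals, drop the shortest 20% and roughly the longest quarter, average the rest), which is more robust against missed or spurious beats. A standalone sketch of that idea on made-up timestamps:

```python
# standalone illustration of the interval-trimming BPM estimate used in the hunk above,
# on hypothetical beat timestamps (seconds); not part of the repo
beats = [0.00, 0.50, 1.01, 1.49, 2.00, 2.52, 3.00, 3.49, 4.20]  # last beat is late/noisy

bds = sorted(b - a for a, b in zip(beats, beats[1:]))  # beat-to-beat intervals
n0 = int(len(bds) * 0.2)        # drop the shortest 20% ...
n1 = int(len(bds) * 0.75) + 1   # ... and roughly the longest quarter
bds = bds[n0:n1]

bpm = round(60 * (len(bds) / sum(bds)), 2)  # beats per minute from the kept intervals
print(bpm)  # ~120 despite the 0.71s outlier interval
```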
25  bin/up2k.py

@@ -1,8 +1,8 @@
 #!/usr/bin/env python3
 from __future__ import print_function, unicode_literals

-S_VERSION = "1.5"
-S_BUILD_DT = "2023-03-12"
+S_VERSION = "1.6"
+S_BUILD_DT = "2023-04-20"

 """
 up2k.py: upload to copyparty
@@ -653,6 +653,7 @@ class Ctl(object):
         return nfiles, nbytes

     def __init__(self, ar, stats=None):
+        self.ok = False
         self.ar = ar
         self.stats = stats or self._scan()
         if not self.stats:
@@ -700,6 +701,8 @@ class Ctl(object):

             self._fancy()

+        self.ok = True
+
     def _safe(self):
         """minimal basic slow boring fallback codepath"""
         search = self.ar.s
@@ -1009,8 +1012,9 @@ class Ctl(object):
             file, cid = task
             try:
                 upload(file, cid, self.ar.a, stats)
-            except:
-                eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
+            except Exception as ex:
+                t = "upload failed, retrying: {0} #{1} ({2})\n"
+                eprint(t.format(file.name, cid[:8], ex))
                 # handshake will fix it

             with self.mutex:
@@ -1049,6 +1053,8 @@ def main():
         print(ver)
         return

+    sys.argv = [x for x in sys.argv if x != "--ws"]
+
     # fmt: off
     ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog="""
 NOTE:
@@ -1067,7 +1073,6 @@ source file/folder selection uses rsync syntax, meaning that:

     ap = app.add_argument_group("compatibility")
     ap.add_argument("--cls", action="store_true", help="clear screen before start")
-    ap.add_argument("--ws", action="store_true", help="copyparty is running on windows; wait before deleting files after uploading")

     ap = app.add_argument_group("folder sync")
     ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
@@ -1129,15 +1134,13 @@ source file/folder selection uses rsync syntax, meaning that:

     ctl = Ctl(ar)

-    if ar.dr and not ar.drd:
+    if ar.dr and not ar.drd and ctl.ok:
         print("\npass 2/2: delete")
-        if getattr(ctl, "up_br") and ar.ws:
-            # wait for up2k to mtime if there was uploads
-            time.sleep(4)

         ar.drd = True
         ar.z = True
-        Ctl(ar, ctl.stats)
+        ctl = Ctl(ar, ctl.stats)

+    sys.exit(0 if ctl.ok else 1)


 if __name__ == "__main__":
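Since the diff above makes up2k.py report its outcome through the process exit status (`sys.exit(0 if ctl.ok else 1)`), a wrapper can now detect failed syncs. A minimal sketch of that usage; the argument list is deliberately left as a placeholder rather than a real invocation:

```python
# minimal sketch of relying on the exit status added above (0 = ok, 1 = failed);
# the argument list is a placeholder -- append the flags, server URL and local
# folder exactly as you would normally pass them to up2k.py
import subprocess
import sys

cmd = [sys.executable, "up2k.py", "--dr"]  # + your usual URL/folder arguments
rc = subprocess.call(cmd)
if rc != 0:
    print("sync failed, will retry later", file=sys.stderr)
```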
@@ -39,3 +39,9 @@ server {
         proxy_set_header Connection "Keep-Alive";
     }
 }
+
+# default client_max_body_size (1M) blocks uploads larger than 256 MiB
+client_max_body_size 1024M;
+client_header_timeout 610m;
+client_body_timeout 610m;
+send_timeout 610m;
281  contrib/nixos/modules/copyparty.nix  (new file)

@@ -0,0 +1,281 @@
{ config, pkgs, lib, ... }:

with lib;

let
  mkKeyValue = key: value:
    if value == true then
      # sets with a true boolean value are coerced to just the key name
      key
    else if value == false then
      # or omitted completely when false
      ""
    else
      (generators.mkKeyValueDefault { inherit mkValueString; } ": " key value);

  mkAttrsString = value: (generators.toKeyValue { inherit mkKeyValue; } value);

  mkValueString = value:
    if isList value then
      (concatStringsSep ", " (map mkValueString value))
    else if isAttrs value then
      "\n" + (mkAttrsString value)
    else
      (generators.mkValueStringDefault { } value);

  mkSectionName = value: "[" + (escape [ "[" "]" ] value) + "]";

  mkSection = name: attrs: ''
    ${mkSectionName name}
    ${mkAttrsString attrs}
  '';

  mkVolume = name: attrs: ''
    ${mkSectionName name}
    ${attrs.path}
    ${mkAttrsString {
      accs = attrs.access;
      flags = attrs.flags;
    }}
  '';

  passwordPlaceholder = name: "{{password-${name}}}";

  accountsWithPlaceholders = mapAttrs (name: attrs: passwordPlaceholder name);

  configStr = ''
    ${mkSection "global" cfg.settings}
    ${mkSection "accounts" (accountsWithPlaceholders cfg.accounts)}
    ${concatStringsSep "\n" (mapAttrsToList mkVolume cfg.volumes)}
  '';

  name = "copyparty";
  cfg = config.services.copyparty;
  configFile = pkgs.writeText "${name}.conf" configStr;
  runtimeConfigPath = "/run/${name}/${name}.conf";
  home = "/var/lib/${name}";
  defaultShareDir = "${home}/data";
in {
  options.services.copyparty = {
    enable = mkEnableOption "web-based file manager";

    package = mkOption {
      type = types.package;
      default = pkgs.copyparty;
      defaultText = "pkgs.copyparty";
      description = ''
        Package of the application to run, exposed for overriding purposes.
      '';
    };

    openFilesLimit = mkOption {
      default = 4096;
      type = types.either types.int types.str;
      description = "Number of files to allow copyparty to open.";
    };

    settings = mkOption {
      type = types.attrs;
      description = ''
        Global settings to apply.
        Directly maps to values in the [global] section of the copyparty config.
        See `${getExe cfg.package} --help` for more details.
      '';
      default = {
        i = "127.0.0.1";
        no-reload = true;
      };
      example = literalExpression ''
        {
          i = "0.0.0.0";
          no-reload = true;
        }
      '';
    };

    accounts = mkOption {
      type = types.attrsOf (types.submodule ({ ... }: {
        options = {
          passwordFile = mkOption {
            type = types.str;
            description = ''
              Runtime file path to a file containing the user password.
              Must be readable by the copyparty user.
            '';
            example = "/run/keys/copyparty/ed";
          };
        };
      }));
      description = ''
        A set of copyparty accounts to create.
      '';
      default = { };
      example = literalExpression ''
        {
          ed.passwordFile = "/run/keys/copyparty/ed";
        };
      '';
    };

    volumes = mkOption {
      type = types.attrsOf (types.submodule ({ ... }: {
        options = {
          path = mkOption {
            type = types.str;
            description = ''
              Path of a directory to share.
            '';
          };
          access = mkOption {
            type = types.attrs;
            description = ''
              Attribute list of permissions and the users to apply them to.

              The key must be a string containing any combination of allowed permission:
                "r" (read): list folder contents, download files
                "w" (write): upload files; need "r" to see the uploads
                "m" (move): move files and folders; need "w" at destination
                "d" (delete): permanently delete files and folders
                "g" (get): download files, but cannot see folder contents
                "G" (upget): "get", but can see filekeys of their own uploads

              For example: "rwmd"

              The value must be one of:
                an account name, defined in `accounts`
                a list of account names
                "*", which means "any account"
            '';
            example = literalExpression ''
              {
                # wG = write-upget = see your own uploads only
                wG = "*";
                # read-write-modify-delete for users "ed" and "k"
                rwmd = ["ed" "k"];
              };
            '';
          };
          flags = mkOption {
            type = types.attrs;
            description = ''
              Attribute list of volume flags to apply.
              See `${getExe cfg.package} --help-flags` for more details.
            '';
            example = literalExpression ''
              {
                # "fk" enables filekeys (necessary for upget permission) (4 chars long)
                fk = 4;
                # scan for new files every 60sec
                scan = 60;
                # volflag "e2d" enables the uploads database
                e2d = true;
                # "d2t" disables multimedia parsers (in case the uploads are malicious)
                d2t = true;
                # skips hashing file contents if path matches *.iso
                nohash = "\.iso$";
              };
            '';
            default = { };
          };
        };
      }));
      description = "A set of copyparty volumes to create";
      default = {
        "/" = {
          path = defaultShareDir;
          access = { r = "*"; };
        };
      };
      example = literalExpression ''
        {
          "/" = {
            path = ${defaultShareDir};
            access = {
              # wG = write-upget = see your own uploads only
              wG = "*";
              # read-write-modify-delete for users "ed" and "k"
              rwmd = ["ed" "k"];
            };
          };
        };
      '';
    };
  };

  config = mkIf cfg.enable {
    systemd.services.copyparty = {
      description = "http file sharing hub";
      wantedBy = [ "multi-user.target" ];

      environment = {
        PYTHONUNBUFFERED = "true";
        XDG_CONFIG_HOME = "${home}/.config";
      };

      preStart = let
        replaceSecretCommand = name: attrs:
          "${getExe pkgs.replace-secret} '${
            passwordPlaceholder name
          }' '${attrs.passwordFile}' ${runtimeConfigPath}";
      in ''
        set -euo pipefail
        install -m 600 ${configFile} ${runtimeConfigPath}
        ${concatStringsSep "\n"
        (mapAttrsToList replaceSecretCommand cfg.accounts)}
      '';

      serviceConfig = {
        Type = "simple";
        ExecStart = "${getExe cfg.package} -c ${runtimeConfigPath}";

        # Hardening options
        User = "copyparty";
        Group = "copyparty";
        RuntimeDirectory = name;
        RuntimeDirectoryMode = "0700";
        StateDirectory = [ name "${name}/data" "${name}/.config" ];
        StateDirectoryMode = "0700";
        WorkingDirectory = home;
        TemporaryFileSystem = "/:ro";
        BindReadOnlyPaths = [
          "/nix/store"
          "-/etc/resolv.conf"
          "-/etc/nsswitch.conf"
          "-/etc/hosts"
          "-/etc/localtime"
        ] ++ (mapAttrsToList (k: v: "-${v.passwordFile}") cfg.accounts);
        BindPaths = [ home ] ++ (mapAttrsToList (k: v: v.path) cfg.volumes);
        # Would re-mount paths ignored by temporary root
        #ProtectSystem = "strict";
        ProtectHome = true;
        PrivateTmp = true;
        PrivateDevices = true;
        ProtectKernelTunables = true;
        ProtectControlGroups = true;
        RestrictSUIDSGID = true;
        PrivateMounts = true;
        ProtectKernelModules = true;
        ProtectKernelLogs = true;
        ProtectHostname = true;
        ProtectClock = true;
        ProtectProc = "invisible";
        ProcSubset = "pid";
        RestrictNamespaces = true;
        RemoveIPC = true;
        UMask = "0077";
        LimitNOFILE = cfg.openFilesLimit;
        NoNewPrivileges = true;
        LockPersonality = true;
        RestrictRealtime = true;
      };
    };

    users.groups.copyparty = { };
    users.users.copyparty = {
      description = "Service user for copyparty";
      group = "copyparty";
      home = home;
      isSystemUser = true;
    };
  };
}
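The module's `preStart` above copies the generated config into place and then swaps each `{{password-<user>}}` placeholder for the contents of that account's `passwordFile` (via `replace-secret`), so plaintext passwords never land in the nix store. A rough Python illustration of that substitution step, with hypothetical paths and account names:

```python
# rough illustration of what the preStart step above accomplishes;
# the paths and the account name are hypothetical
from pathlib import Path

accounts = {"ed": "/run/keys/copyparty/ed_password"}  # account name -> passwordFile

conf_path = Path("/run/copyparty/copyparty.conf")
conf = conf_path.read_text()
for name, pw_file in accounts.items():
    secret = Path(pw_file).read_text().strip()
    conf = conf.replace("{{password-%s}}" % name, secret)  # fill the placeholder
conf_path.write_text(conf)
```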
@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.6.9"
+pkgver="1.6.11"
 pkgrel=1
 pkgdesc="Portable file sharing hub"
 arch=("any")
@@ -26,12 +26,12 @@ source=("${url}/releases/download/v${pkgver}/${pkgname}-sfx.py"
         "https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/LICENSE"
 )
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("64f3b6a7120b3e1c1167e6aa7c0f023c39abb18e50525013b97467326a2f73ab"
+sha256sums=("d096e33ab666ef45213899dd3a10735f62b5441339cb7374f93b232d0b6c8d34"
             "b8565eba5e64dedba1cf6c7aac7e31c5a731ed7153d6810288a28f00a36c28b2"
             "f65c207e0670f9d78ad2e399bda18d5502ff30d2ac79e0e7fc48e7fbdc39afdc"
             "c4f396b083c9ec02ad50b52412c84d2a82be7f079b2d016e1c9fad22d68285ff"
             "dba701de9fd584405917e923ea1e59dbb249b96ef23bad479cf4e42740b774c8"
-            "23054bb206153a1ed34038accaf490b8068f9c856e423c2f2595b148b40c0a0c"
+            "8e89d281483e22d11d111bed540652af35b66af6f14f49faae7b959f6cdc6475"
             "cb2ce3d6277bf2f5a82ecf336cc44963bc6490bcf496ffbd75fc9e21abaa75f3"
 )

55  contrib/package/nix/copyparty/default.nix  (new file)

@@ -0,0 +1,55 @@
{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, pillow, pyvips, ffmpeg, mutagen,

# create thumbnails with Pillow; faster than FFmpeg / MediaProcessing
withThumbnails ? true,

# create thumbnails with PyVIPS; even faster, uses more memory
# -- can be combined with Pillow to support more filetypes
withFastThumbnails ? false,

# enable FFmpeg; thumbnails for most filetypes (also video and audio), extract audio metadata, transcode audio to opus
# -- possibly dangerous if you allow anonymous uploads, since FFmpeg has a huge attack surface
# -- can be combined with Thumbnails and/or FastThumbnails, since FFmpeg is slower than both
withMediaProcessing ? true,

# if MediaProcessing is not enabled, you probably want this instead (less accurate, but much safer and faster)
withBasicAudioMetadata ? false,

# enable FTPS support in the FTP server
withFTPS ? false,

# samba/cifs server; dangerous and buggy, enable if you really need it
withSMB ? false,

}:

let
  pinData = lib.importJSON ./pin.json;
  pyEnv = python.withPackages (ps:
    with ps; [
      jinja2
    ]
    ++ lib.optional withSMB impacket
    ++ lib.optional withFTPS pyopenssl
    ++ lib.optional withThumbnails pillow
    ++ lib.optional withFastThumbnails pyvips
    ++ lib.optional withMediaProcessing ffmpeg
    ++ lib.optional withBasicAudioMetadata mutagen
  );
in stdenv.mkDerivation {
  pname = "copyparty";
  version = pinData.version;
  src = fetchurl {
    url = pinData.url;
    hash = pinData.hash;
  };
  buildInputs = [ makeWrapper ];
  dontUnpack = true;
  dontBuild = true;
  installPhase = ''
    install -Dm755 $src $out/share/copyparty-sfx.py
    makeWrapper ${pyEnv.interpreter} $out/bin/copyparty \
      --set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
      --add-flags "$out/share/copyparty-sfx.py"
  '';
}
5  contrib/package/nix/copyparty/pin.json  (new file)

@@ -0,0 +1,5 @@
{
    "url": "https://github.com/9001/copyparty/releases/download/v1.6.11/copyparty-sfx.py",
    "version": "1.6.11",
    "hash": "sha256-0JbjOrZm70UhOJndOhBzX2K1RBM5y3N0+TsjLQtsjTQ="
}
contrib/package/nix/copyparty/update.py (new executable file, 77 lines)
@@ -0,0 +1,77 @@
#!/usr/bin/env python3

# Update the Nix package pin
#
# Usage: ./update.py [PATH]
# When [PATH] is not given, it will fetch the latest release from the repo.
# With [PATH] set, it will hash the given file and generate the URL,
# based on the version contained within the file

import base64
import json
import hashlib
import sys
import re
from pathlib import Path

OUTPUT_FILE = Path("pin.json")
TARGET_ASSET = "copyparty-sfx.py"
HASH_TYPE = "sha256"
LATEST_RELEASE_URL = "https://api.github.com/repos/9001/copyparty/releases/latest"
DOWNLOAD_URL = lambda version: f"https://github.com/9001/copyparty/releases/download/v{version}/{TARGET_ASSET}"


def get_formatted_hash(binary):
    hasher = hashlib.new("sha256")
    hasher.update(binary)
    asset_hash = hasher.digest()
    encoded_hash = base64.b64encode(asset_hash).decode("ascii")
    return f"{HASH_TYPE}-{encoded_hash}"


def version_from_sfx(binary):
    result = re.search(b'^VER = "(.*)"$', binary, re.MULTILINE)
    if result:
        return result.groups(1)[0].decode("ascii")

    raise ValueError("version not found in provided file")


def remote_release_pin():
    import requests

    response = requests.get(LATEST_RELEASE_URL).json()
    version = response["tag_name"].lstrip("v")
    asset_info = [a for a in response["assets"] if a["name"] == TARGET_ASSET][0]
    download_url = asset_info["browser_download_url"]
    asset = requests.get(download_url)
    formatted_hash = get_formatted_hash(asset.content)

    result = {"url": download_url, "version": version, "hash": formatted_hash}
    return result


def local_release_pin(path):
    asset = path.read_bytes()
    version = version_from_sfx(asset)
    download_url = DOWNLOAD_URL(version)
    formatted_hash = get_formatted_hash(asset)

    result = {"url": download_url, "version": version, "hash": formatted_hash}
    return result


def main():
    if len(sys.argv) > 1:
        asset_path = Path(sys.argv[1])
        result = local_release_pin(asset_path)
    else:
        result = remote_release_pin()

    print(result)
    json_result = json.dumps(result, indent=4)
    OUTPUT_FILE.write_text(json_result)


if __name__ == "__main__":
    main()
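Not part of the changeset, but for orientation: a minimal sketch of how the pin written by update.py could be double-checked against a locally downloaded copyparty-sfx.py. It recomputes the same SRI-style sha256 that update.py produces; the file paths are assumptions.

import base64
import hashlib
import json
import sys
from pathlib import Path

# compare pin.json against a local sfx (path given as argv[1], default ./copyparty-sfx.py)
pin = json.loads(Path("pin.json").read_text())
blob = Path(sys.argv[1] if len(sys.argv) > 1 else "copyparty-sfx.py").read_bytes()
digest = "sha256-" + base64.b64encode(hashlib.sha256(blob).digest()).decode("ascii")
print("pinned:", pin["hash"])
print("local: ", digest)
sys.exit(0 if digest == pin["hash"] else 1)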
contrib/plugins/rave.js (new file, 208 lines)
@@ -0,0 +1,208 @@
/* untz untz untz untz */

(function () {

    var can, ctx, W, H, fft, buf, bars, barw, pv,
        hue = 0,
        ibeat = 0,
        beats = [9001],
        beats_url = '',
        uofs = 0,
        ops = ebi('ops'),
        raving = false,
        recalc = 0,
        cdown = 0,
        FC = 0.9,
        css = `<style>

#fft {
    position: fixed;
    top: 0;
    left: 0;
    z-index: -1;
}
body {
    box-shadow: inset 0 0 0 white;
}
#ops>a,
#path>a {
    display: inline-block;
}
/*
body.untz {
    animation: untz-body 200ms ease-out;
}
@keyframes untz-body {
    0% {inset 0 0 20em white}
    100% {inset 0 0 0 white}
}
*/
:root, html.a, html.b, html.c, html.d, html.e {
    --row-alt: rgba(48,52,78,0.2);
}
#files td {
    background: none;
}

</style>`;

    QS('body').appendChild(mknod('div', null, css));

    function rave_load() {
        console.log('rave_load');
        can = mknod('canvas', 'fft');
        QS('body').appendChild(can);
        ctx = can.getContext('2d');

        fft = new AnalyserNode(actx, {
            "fftSize": 2048,
            "maxDecibels": 0,
            "smoothingTimeConstant": 0.7,
        });
        ibeat = 0;
        beats = [9001];
        buf = new Uint8Array(fft.frequencyBinCount);
        bars = buf.length * FC;
        afilt.filters.push(fft);
        if (!raving) {
            raving = true;
            raver();
        }
        beats_url = mp.au.src.split('?')[0].replace(/(.*\/)(.*)/, '$1.beats/$2.txt');
        console.log("reading beats from", beats_url);
        var xhr = new XHR();
        xhr.open('GET', beats_url, true);
        xhr.onload = readbeats;
        xhr.url = beats_url;
        xhr.send();
    }

    function rave_unload() {
        qsr('#fft');
        can = null;
    }

    function readbeats() {
        if (this.url != beats_url)
            return console.log('old beats??', this.url, beats_url);

        var sbeats = this.responseText.replace(/\r/g, '').split(/\n/g);
        if (sbeats.length < 3)
            return;

        beats = [];
        for (var a = 0; a < sbeats.length; a++)
            beats.push(parseFloat(sbeats[a]));

        var end = beats.slice(-2),
            t = end[1],
            d = t - end[0];

        while (d > 0.1 && t < 1200)
            beats.push(t += d);
    }

    function hrand() {
        return Math.random() - 0.5;
    }

    function raver() {
        if (!can) {
            raving = false;
            return;
        }

        requestAnimationFrame(raver);
        if (!mp || !mp.au || mp.au.paused)
            return;

        if (--uofs >= 0) {
            document.body.style.marginLeft = hrand() * uofs + 'px';
            ebi('tree').style.marginLeft = hrand() * uofs + 'px';
            for (var a of QSA('#ops>a, #path>a, #pctl>a'))
                a.style.transform = 'translate(' + hrand() * uofs * 1 + 'px, ' + hrand() * uofs * 0.7 + 'px) rotate(' + Math.random() * uofs * 0.7 + 'deg)'
        }

        if (--recalc < 0) {
            recalc = 60;
            var tree = ebi('tree'),
                x = tree.style.display == 'none' ? 0 : tree.offsetWidth;

            //W = can.width = window.innerWidth - x;
            //H = can.height = window.innerHeight;
            //H = ebi('widget').offsetTop;
            W = can.width = bars;
            H = can.height = 512;
            barw = 1; //parseInt(0.8 + W / bars);
            can.style.left = x + 'px';
            can.style.width = (window.innerWidth - x) + 'px';
            can.style.height = ebi('widget').offsetTop + 'px';
        }

        //if (--cdown == 1)
        //    clmod(ops, 'untz');

        fft.getByteFrequencyData(buf);

        var imax = 0, vmax = 0;
        for (var a = 10; a < 50; a++)
            if (vmax < buf[a]) {
                vmax = buf[a];
                imax = a;
            }

        hue = hue * 0.93 + imax * 0.07;

        ctx.fillStyle = 'rgba(0,0,0,0)';
        ctx.fillRect(0, 0, W, H);
        ctx.clearRect(0, 0, W, H);
        ctx.fillStyle = 'hsla(' + (hue * 2.5) + ',100%,50%,0.7)';

        var x = 0, mul = (H / 256) * 0.5;
        for (var a = 0; a < buf.length * FC; a++) {
            var v = buf[a] * mul * (1 + 0.69 * a / buf.length);
            ctx.fillRect(x, H - v, barw, v);
            x += barw;
        }

        var t = mp.au.currentTime + 0.05;

        if (ibeat >= beats.length || beats[ibeat] > t)
            return;

        while (ibeat < beats.length && beats[ibeat++] < t)
            continue;

        return untz();

        var cv = 0;
        for (var a = 0; a < 128; a++)
            cv += buf[a];

        if (cv - pv > 1000) {
            console.log(pv, cv, cv - pv);
            if (cdown < 0) {
                clmod(ops, 'untz', 1);
                cdown = 20;
            }
        }
        pv = cv;
    }

    function untz() {
        console.log('untz');
        uofs = 14;
        document.body.animate([
            { boxShadow: 'inset 0 0 1em #f0c' },
            { boxShadow: 'inset 0 0 20em #f0c', offset: 0.2 },
            { boxShadow: 'inset 0 0 0 #f0c' },
        ], { duration: 200, iterations: 1 });
    }

    afilt.plugs.push({
        "en": true,
        "load": rave_load,
        "unload": rave_unload
    });

})();
@@ -2,12 +2,16 @@
 # and share '/mnt' with anonymous read+write
 #
 # installation:
-# cp -pv copyparty.service /etc/systemd/system
-# restorecon -vr /etc/systemd/system/copyparty.service
+# wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py -O /usr/local/bin/copyparty-sfx.py
+# cp -pv copyparty.service /etc/systemd/system/
+# restorecon -vr /etc/systemd/system/copyparty.service # on fedora/rhel
 # firewall-cmd --permanent --add-port={80,443,3923}/tcp # --zone=libvirt
 # firewall-cmd --reload
 # systemctl daemon-reload && systemctl enable --now copyparty
 #
+# if it fails to start, first check this: systemctl status copyparty
+# then try starting it while viewing logs: journalctl -fan 100
+#
 # you may want to:
 # change "User=cpp" and "/home/cpp/" to another user
 # remove the nft lines to only listen on port 3923
@@ -44,7 +48,7 @@ ExecReload=/bin/kill -s USR1 $MAINPID
 User=cpp
 Environment=XDG_CONFIG_HOME=/home/cpp/.config
 
-# setup forwarding from ports 80 and 443 to port 3923
+# OPTIONAL: setup forwarding from ports 80 and 443 to port 3923
 ExecStartPre=+/bin/bash -c 'nft -n -a list table nat | awk "/ to :3923 /{print\$NF}" | xargs -rL1 nft delete rule nat prerouting handle; true'
 ExecStartPre=+nft add table ip nat
 ExecStartPre=+nft -- add chain ip nat prerouting { type nat hook prerouting priority -100 \; }
@@ -900,8 +900,8 @@ def add_db_general(ap, hcores):
     ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash")
     ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
     ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume data (db, thumbs) (volflag=hist)")
-    ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching paths during e2ds folder scans (volflag=nohash)")
-    ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching paths during e2ds folder scans (volflag=noidx)")
+    ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
+    ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
     ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
     ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
     ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice (volflag=noforget)")
@@ -1,8 +1,8 @@
 # coding: utf-8
 
-VERSION = (1, 6, 10)
+VERSION = (1, 6, 12)
 CODENAME = "cors k"
-BUILD_DT = (2023, 3, 20)
+BUILD_DT = (2023, 4, 20)
 
 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -356,7 +356,8 @@ class VFS(object):
         flags = {k: v for k, v in self.flags.items()}
         hist = flags.get("hist")
         if hist and hist != "-":
-            flags["hist"] = "{}/{}".format(hist.rstrip("/"), name)
+            zs = "{}/{}".format(hist.rstrip("/"), name)
+            flags["hist"] = os.path.expanduser(zs) if zs.startswith("~") else zs
 
         return flags
 
@@ -1101,6 +1102,9 @@ class AuthSrv(object):
                 if vflag == "-":
                     pass
                 elif vflag:
+                    if vflag.startswith("~"):
+                        vflag = os.path.expanduser(vflag)
+
                     vol.histpath = uncyg(vflag) if WINDOWS else vflag
                 elif self.args.hist:
                     for nch in range(len(hid)):
@@ -14,8 +14,8 @@ from pyftpdlib.handlers import FTPHandler
 from pyftpdlib.servers import FTPServer
 
 from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
-from .bos import bos
 from .authsrv import VFS
+from .bos import bos
 from .util import (
     Daemon,
     Pebkac,
@@ -264,9 +264,10 @@ class HttpCli(object):
         self.is_https = (
             self.headers.get("x-forwarded-proto", "").lower() == "https" or self.tls
         )
-        self.host = self.headers.get("host") or "{}:{}".format(
-            *list(self.s.getsockname()[:2])
-        )
+        self.host = self.headers.get("host") or ""
+        if not self.host:
+            zs = "{}:{}".format(*list(self.s.getsockname()[:2]))
+            self.host = zs[7:] if zs.startswith("::ffff:") else zs
 
         n = self.args.rproxy
         if n:
@@ -1115,7 +1116,14 @@ class HttpCli(object):
         if self.do_log:
             self.log("MKCOL " + self.req)
 
-        return self._mkdir(self.vpath)
+        try:
+            return self._mkdir(self.vpath, True)
+        except Pebkac as ex:
+            if ex.code >= 500:
+                raise
+
+            self.reply(b"", ex.code)
+            return True
 
     def handle_move(self) -> bool:
         dst = self.headers["destination"]
@@ -1568,6 +1576,11 @@ class HttpCli(object):
         t = "{}\n{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56], url)
 
         h = {"Location": url} if is_put and url else {}
+
+        if "x-oc-mtime" in self.headers:
+            h["X-OC-MTime"] = "accepted"
+            t = "" # some webdav clients expect/prefer this
+
         self.reply(t.encode("utf-8"), 201, headers=h)
         return True
 
@@ -1741,7 +1754,7 @@ class HttpCli(object):
 
     def handle_search(self, body: dict[str, Any]) -> bool:
         idx = self.conn.get_u2idx()
-        if not hasattr(idx, "p_end"):
+        if not idx or not hasattr(idx, "p_end"):
             raise Pebkac(500, "sqlite3 is not available on the server; cannot search")
 
         vols = []
@@ -1968,7 +1981,7 @@ class HttpCli(object):
         sanitized = sanitize_fn(new_dir, "", [])
         return self._mkdir(vjoin(self.vpath, sanitized))
 
-    def _mkdir(self, vpath: str) -> bool:
+    def _mkdir(self, vpath: str, dav: bool = False) -> bool:
         nullwrite = self.args.nw
         vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
         self._assert_safe_rem(rem)
@@ -1994,7 +2007,12 @@ class HttpCli(object):
             raise Pebkac(500, min_ex())
 
         self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1])
-        self.redirect(vpath, status=201)
+
+        if dav:
+            self.reply(b"", 201)
+        else:
+            self.redirect(vpath, status=201)
+
         return True
 
     def handle_new_md(self) -> bool:
@@ -2538,8 +2556,12 @@ class HttpCli(object):
         hrange = self.headers.get("range")
 
         # let's not support 206 with compression
-        if do_send and not is_compressed and hrange and file_sz:
+        # and multirange / multipart is also not-impl (mostly because calculating contentlength is a pain)
+        if do_send and not is_compressed and hrange and file_sz and "," not in hrange:
             try:
+                if not hrange.lower().startswith("bytes"):
+                    raise Exception()
+
                 a, b = hrange.split("=", 1)[1].split("-")
 
                 if a.strip():
@@ -3057,7 +3079,7 @@ class HttpCli(object):
             raise Pebkac(403, "the unpost feature is disabled in server config")
 
         idx = self.conn.get_u2idx()
-        if not hasattr(idx, "p_end"):
+        if not idx or not hasattr(idx, "p_end"):
             raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
 
         filt = self.uparam.get("filter")
@@ -3276,9 +3298,10 @@ class HttpCli(object):
 
         is_dir = stat.S_ISDIR(st.st_mode)
         icur = None
-        if e2t or (e2d and is_dir):
+        if is_dir and (e2t or e2d):
             idx = self.conn.get_u2idx()
-            icur = idx.get_cur(dbv.realpath)
+            if idx and hasattr(idx, "p_end"):
+                icur = idx.get_cur(dbv.realpath)
 
         if self.can_read:
             th_fmt = self.uparam.get("th")
@@ -103,11 +103,12 @@ class HttpConn(object):
     def log(self, msg: str, c: Union[int, str] = 0) -> None:
         self.log_func(self.log_src, msg, c)
 
-    def get_u2idx(self) -> U2idx:
-        # one u2idx per tcp connection;
+    def get_u2idx(self) -> Optional[U2idx]:
+        # grab from a pool of u2idx instances;
         # sqlite3 fully parallelizes under python threads
+        # but avoid running out of FDs by creating too many
         if not self.u2idx:
-            self.u2idx = U2idx(self)
+            self.u2idx = self.hsrv.get_u2idx(str(self.addr))
 
         return self.u2idx
 
@@ -215,3 +216,7 @@ class HttpConn(object):
             self.cli = HttpCli(self)
             if not self.cli.run():
                 return
+
+            if self.u2idx:
+                self.hsrv.put_u2idx(str(self.addr), self.u2idx)
+                self.u2idx = None
@@ -11,7 +11,7 @@ import time
 
 import queue
 
-from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, EnvParams
+from .__init__ import ANYWIN, CORES, EXE, MACOS, TYPE_CHECKING, EnvParams
 
 try:
     MNFE = ModuleNotFoundError
@@ -40,6 +40,7 @@ except MNFE:
 
 from .bos import bos
 from .httpconn import HttpConn
+from .u2idx import U2idx
 from .util import (
     E_SCK,
     FHC,
@@ -111,6 +112,9 @@ class HttpSrv(object):
         self.cb_ts = 0.0
         self.cb_v = ""
 
+        self.u2idx_free: dict[str, U2idx] = {}
+        self.u2idx_n = 0
+
         env = jinja2.Environment()
         env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
         jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"]
@@ -445,6 +449,9 @@ class HttpSrv(object):
             self.clients.remove(cli)
             self.ncli -= 1
 
+            if cli.u2idx:
+                self.put_u2idx(str(addr), cli.u2idx)
+
     def cachebuster(self) -> str:
         if time.time() - self.cb_ts < 1:
             return self.cb_v
@@ -466,3 +473,31 @@ class HttpSrv(object):
         self.cb_v = v.decode("ascii")[-4:]
         self.cb_ts = time.time()
         return self.cb_v
+
+    def get_u2idx(self, ident: str) -> Optional[U2idx]:
+        utab = self.u2idx_free
+        for _ in range(100):  # 5/0.05 = 5sec
+            with self.mutex:
+                if utab:
+                    if ident in utab:
+                        return utab.pop(ident)
+
+                    return utab.pop(list(utab.keys())[0])
+
+                if self.u2idx_n < CORES:
+                    self.u2idx_n += 1
+                    return U2idx(self)
+
+            time.sleep(0.05)
+            # not using conditional waits, on a hunch that
+            # average performance will be faster like this
+            # since most servers won't be fully saturated
+
+        return None
+
+    def put_u2idx(self, ident: str, u2idx: U2idx) -> None:
+        with self.mutex:
+            while ident in self.u2idx_free:
+                ident += "a"
+
+            self.u2idx_free[ident] = u2idx
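Not part of the changeset: a minimal sketch of the borrow/use/return discipline that get_u2idx and put_u2idx establish, mirroring how httpconn.py uses them in the hunks further up; hsrv, ident and realpath are placeholders.

def search_with_pooled_idx(hsrv, ident, realpath):
    # hypothetical helper: borrow a U2idx from the pool, use it, always hand it back
    idx = hsrv.get_u2idx(ident)  # None if all CORES instances stayed busy for ~5 seconds
    if not idx:
        return None

    try:
        return idx.get_cur(realpath)
    finally:
        hsrv.put_u2idx(ident, idx)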
@@ -128,6 +128,9 @@ class SvcHub(object):
             args.no_robots = True
             args.force_js = True
 
+        if not self._process_config():
+            raise Exception("bad config")
+
         self.log = self._log_disabled if args.q else self._log_enabled
         if args.lo:
             self._setup_logfile(printed)
@@ -177,9 +180,6 @@ class SvcHub(object):
 
         self.log("root", "max clients: {}".format(self.args.nc))
 
-        if not self._process_config():
-            raise Exception("bad config")
-
         self.tcpsrv = TcpSrv(self)
         self.up2k = Up2k(self)
 
@@ -350,6 +350,19 @@ class SvcHub(object):
 
         al.th_covers = set(al.th_covers.split(","))
 
+        for k in "c".split(" "):
+            vl = getattr(al, k)
+            if not vl:
+                continue
+
+            vl = [os.path.expanduser(x) if x.startswith("~") else x for x in vl]
+            setattr(al, k, vl)
+
+        for k in "lo hist ssl_log".split(" "):
+            vs = getattr(al, k)
+            if vs and vs.startswith("~"):
+                setattr(al, k, os.path.expanduser(vs))
+
         return True
 
     def _setlimits(self) -> None:
@@ -322,7 +322,7 @@ class TcpSrv(object):
             if k not in netdevs:
                 removed = "{} = {}".format(k, v)
 
-        t = "network change detected:\n added {}\nremoved {}"
+        t = "network change detected:\n added {}\033[0;33m\nremoved {}"
         self.log("tcpsrv", t.format(added, removed), 3)
         self.netdevs = netdevs
         self._distribute_netdevs()
@@ -16,10 +16,10 @@ from .__init__ import ANYWIN, TYPE_CHECKING
 from .bos import bos
 from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
 from .util import (
+    FFMPEG_URL,
     BytesIO,
     Cooldown,
     Daemon,
-    FFMPEG_URL,
     Pebkac,
     afsenc,
     fsenc,
@@ -34,14 +34,14 @@ if True: # pylint: disable=using-constant-test
     from typing import Any, Optional, Union
 
 if TYPE_CHECKING:
-    from .httpconn import HttpConn
+    from .httpsrv import HttpSrv
 
 
 class U2idx(object):
-    def __init__(self, conn: "HttpConn") -> None:
-        self.log_func = conn.log_func
-        self.asrv = conn.asrv
-        self.args = conn.args
+    def __init__(self, hsrv: "HttpSrv") -> None:
+        self.log_func = hsrv.log
+        self.asrv = hsrv.asrv
+        self.args = hsrv.args
         self.timeout = self.args.srch_time
 
         if not HAVE_SQLITE3:
@@ -51,7 +51,7 @@ class U2idx(object):
         self.active_id = ""
         self.active_cur: Optional["sqlite3.Cursor"] = None
         self.cur: dict[str, "sqlite3.Cursor"] = {}
-        self.mem_cur = sqlite3.connect(":memory:").cursor()
+        self.mem_cur = sqlite3.connect(":memory:", check_same_thread=False).cursor()
         self.mem_cur.execute(r"create table a (b text)")
 
         self.p_end = 0.0
@@ -101,7 +101,8 @@ class U2idx(object):
         uri = ""
         try:
             uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri())
-            cur = sqlite3.connect(uri, 2, uri=True).cursor()
+            db = sqlite3.connect(uri, 2, uri=True, check_same_thread=False)
+            cur = db.cursor()
             cur.execute('pragma table_info("up")').fetchone()
             self.log("ro: {}".format(db_path))
         except:
@@ -112,7 +113,7 @@ class U2idx(object):
         if not cur:
             # on windows, this steals the write-lock from up2k.deferred_init --
            # seen on win 10.0.17763.2686, py 3.10.4, sqlite 3.37.2
-            cur = sqlite3.connect(db_path, 2).cursor()
+            cur = sqlite3.connect(db_path, 2, check_same_thread=False).cursor()
             self.log("opened {}".format(db_path))
 
         self.cur[ptop] = cur
@@ -1032,7 +1032,7 @@ class Up2k(object):
         zh.update(cv.encode("utf-8", "replace"))
         zh.update(spack(b"<d", cst.st_mtime))
         dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
-        sql = "select d from dh where d = ? and h = ?"
+        sql = "select d from dh where d=? and +h=?"
         try:
             c = db.c.execute(sql, (rd, dhash))
             drd = rd
@@ -1316,9 +1316,9 @@ class Up2k(object):
 
             w, drd, dfn = zb[:-1].decode("utf-8").split("\x00")
             with self.mutex:
-                q = "select mt, sz from up where w = ? and rd = ? and fn = ?"
+                q = "select mt, sz from up where rd=? and fn=? and +w=?"
                 try:
-                    mt, sz = cur.execute(q, (w, drd, dfn)).fetchone()
+                    mt, sz = cur.execute(q, (drd, dfn, w)).fetchone()
                 except:
                     # file moved/deleted since spooling
                     continue
@@ -2223,7 +2223,7 @@ class Up2k(object):
             q = r"select * from up where w = ?"
             argv = [wark]
         else:
-            q = r"select * from up where substr(w,1,16) = ? and w = ?"
+            q = r"select * from up where substr(w,1,16)=? and +w=?"
             argv = [wark[:16], wark]
 
         c2 = cur.execute(q, tuple(argv))
@@ -3300,9 +3300,16 @@ class Up2k(object):
         """
         dupes = []
         sabs = djoin(sptop, srem)
-        q = "select rd, fn from up where substr(w,1,16)=? and w=?"
+
+        if self.no_expr_idx:
+            q = r"select rd, fn from up where w = ?"
+            argv = (wark,)
+        else:
+            q = r"select rd, fn from up where substr(w,1,16)=? and +w=?"
+            argv = (wark[:16], wark)
+
         for ptop, cur in self.cur.items():
-            for rd, fn in cur.execute(q, (wark[:16], wark)):
+            for rd, fn in cur.execute(q, argv):
                 if rd.startswith("//") or fn.startswith("//"):
                     rd, fn = s3dec(rd, fn)
 
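Not part of the changeset: the unary `+` in `+w=?` and `+h=?` is the usual SQLite trick of wrapping a column so that term cannot be served from an index, which appears to be the intent here (nudging the planner toward the other indexed expression). A tiny standalone illustration of the general mechanism, using a made-up table:

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table up (w text)")
db.execute("create index up_w on up (w)")
for q in ("select * from up where w = ?", "select * from up where +w = ?"):
    # the first query can use up_w; the unary + in the second forces a full scan
    plan = db.execute("explain query plan " + q, ("x",)).fetchall()
    print(q, "->", plan[0][-1])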
@@ -27,8 +27,8 @@ window.baguetteBox = (function () {
         isOverlayVisible = false,
         touch = {}, // start-pos
         touchFlag = false, // busy
-        re_i = /.+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
-        re_v = /.+\.(webm|mkv|mp4)(\?|$)/i,
+        re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
+        re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
         anims = ['slideIn', 'fadeIn', 'none'],
         data = {}, // all galleries
         imagesElements = [],
@@ -580,6 +580,7 @@ window.baguetteBox = (function () {
     function hideOverlay(e) {
         ev(e);
         playvid(false);
+        removeFromCache('#files');
         if (options.noScrollbars) {
             document.documentElement.style.overflowY = 'auto';
             document.body.style.overflowY = 'auto';
@@ -812,10 +813,16 @@ window.baguetteBox = (function () {
     }
 
     function vid() {
+        if (currentIndex >= imagesElements.length)
+            return;
+
         return imagesElements[currentIndex].querySelector('video');
     }
 
     function vidimg() {
+        if (currentIndex >= imagesElements.length)
+            return;
+
         return imagesElements[currentIndex].querySelector('img, video');
     }
 
@@ -1074,6 +1074,9 @@ html.np_open #ggrid>a.au:before {
     background: var(--bg-d3);
     box-shadow: -.2em .2em 0 var(--f-sel-sh), -.2em -.2em 0 var(--f-sel-sh);
 }
+#player {
+    display: none;
+}
 #widget {
     position: fixed;
     font-size: 1.4em;
@@ -974,6 +974,17 @@ ebi('widget').innerHTML = (
     ' <canvas id="pvol" width="288" height="38"></canvas>' +
     ' <canvas id="barpos"></canvas>' +
     ' <canvas id="barbuf"></canvas>' +
+    '</div>' +
+    '<div id="np_inf">' +
+    ' <img id="np_img"></span>' +
+    ' <span id="np_url"></span>' +
+    ' <span id="np_circle"></span>' +
+    ' <span id="np_album"></span>' +
+    ' <span id="np_tn"></span>' +
+    ' <span id="np_artist"></span>' +
+    ' <span id="np_title"></span>' +
+    ' <span id="np_pos"></span>' +
+    ' <span id="np_dur"></span>' +
     '</div>'
 );
 
@@ -1405,10 +1416,17 @@ var mpl = (function () {
     };
 
     r.pp = function () {
+        var adur, apos, playing = mp.au && !mp.au.paused;
+
+        clmod(ebi('np_inf'), 'playing', playing);
+
+        if (mp.au && isNum(adur = mp.au.duration) && isNum(apos = mp.au.currentTime) && apos >= 0)
+            ebi('np_pos').textContent = s2ms(apos);
+
         if (!r.os_ctl)
            return;
 
-        navigator.mediaSession.playbackState = mp.au && !mp.au.paused ? "playing" : "paused";
+        navigator.mediaSession.playbackState = playing ? "playing" : "paused";
     };
 
     function setaufollow() {
@@ -1423,9 +1441,10 @@ var mpl = (function () {
        var np = get_np()[0],
            fns = np.file.split(' - '),
            artist = (np.circle && np.circle != np.artist ? np.circle + ' // ' : '') + (np.artist || (fns.length > 1 ? fns[0] : '')),
-           tags = {
-               title: np.title || fns.pop()
-           };
+           title = np.title || fns.pop(),
+           cover = '',
+           pcover = '',
+           tags = { title: title };
 
        if (artist)
            tags.artist = artist;
@@ -1446,18 +1465,28 @@ var mpl = (function () {
 
            if (cover) {
                cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j';
+               pcover = cover;
+
                var pwd = get_pwd();
                if (pwd)
-                   cover += '&pw=' + uricom_enc(pwd);
+                   pcover += '&pw=' + uricom_enc(pwd);
 
-               tags.artwork = [{ "src": cover, type: "image/jpeg" }];
+               tags.artwork = [{ "src": pcover, type: "image/jpeg" }];
            }
        }
 
+       ebi('np_circle').textContent = np.circle || '';
+       ebi('np_album').textContent = np.album || '';
+       ebi('np_tn').textContent = np['.tn'] || '';
+       ebi('np_artist').textContent = np.artist || (fns.length > 1 ? fns[0] : '');
+       ebi('np_title').textContent = np.title || '';
+       ebi('np_dur').textContent = np['.dur'] || '';
+       ebi('np_url').textContent = get_vpath() + np.file.split('?')[0];
+       ebi('np_img').setAttribute('src', cover); // dont give last.fm the pwd
+
        navigator.mediaSession.metadata = new MediaMetadata(tags);
-       navigator.mediaSession.setActionHandler('play', playpause);
-       navigator.mediaSession.setActionHandler('pause', playpause);
+       navigator.mediaSession.setActionHandler('play', mplay);
+       navigator.mediaSession.setActionHandler('pause', mpause);
        navigator.mediaSession.setActionHandler('seekbackward', r.os_seek ? function () { seek_au_rel(-10); } : null);
        navigator.mediaSession.setActionHandler('seekforward', r.os_seek ? function () { seek_au_rel(10); } : null);
        navigator.mediaSession.setActionHandler('previoustrack', prev_song);
@@ -1467,11 +1496,17 @@ var mpl = (function () {
     r.announce = announce;
 
     r.stop = function () {
-        if (!r.os_ctl || !navigator.mediaSession.metadata)
+        if (!r.os_ctl)
            return;
 
        navigator.mediaSession.metadata = null;
        navigator.mediaSession.playbackState = "paused";
+
+        var hs = 'play pause seekbackward seekforward previoustrack nexttrack'.split(/ /g);
+        for (var a = 0; a < hs.length; a++)
+            navigator.mediaSession.setActionHandler(hs[a], null);
+
+        navigator.mediaSession.setPositionState();
     };
 
     r.unbuffer = function (url) {
@@ -1511,6 +1546,7 @@ function MPlayer() {
     r.au2 = null;
     r.tracks = {};
     r.order = [];
+    r.cd_pause = 0;
 
     var re_audio = have_acode && mpl.ac_oth ? re_au_all : re_au_native,
         trs = QSA('#files tbody tr');
@@ -1567,6 +1603,7 @@ function MPlayer() {
     r.ftid = -1;
     r.ftimer = null;
     r.fade_in = function () {
+        r.nopause();
         r.fvol = 0;
         r.fdir = 0.025 * r.vol * (CHROME ? 1.5 : 1);
         if (r.au) {
@@ -1599,9 +1636,9 @@ function MPlayer() {
            r.au.pause();
            mpl.pp();
 
-           var t = mp.au.currentTime - 0.8;
+           var t = r.au.currentTime - 0.8;
            if (isNum(t))
-               mp.au.currentTime = Math.max(t, 0);
+               r.au.currentTime = Math.max(t, 0);
        }
        else if (r.fvol > r.vol)
            r.fvol = r.vol;
@@ -1647,8 +1684,14 @@ function MPlayer() {
            drop();
        });
 
-        mp.au2.preload = "auto";
-        mp.au2.src = mp.au2.rsrc = url;
+        r.nopause();
+        r.au2.onloadeddata = r.au2.onloadedmetadata = r.nopause;
+        r.au2.preload = "auto";
+        r.au2.src = r.au2.rsrc = url;
+    };
+
+    r.nopause = function () {
+        r.cd_pause = Date.now();
     };
 }
 
@@ -1787,6 +1830,7 @@ function glossy_grad(can, h, s, l) {
 var pbar = (function () {
     var r = {},
         bau = null,
+        html_txt = 'a',
         lastmove = 0,
         mousepos = 0,
         gradh = -1,
@@ -1956,6 +2000,16 @@ var pbar = (function () {
        pctx.strokeText(t2, xt2, yt);
        pctx.fillText(t1, xt1, yt);
        pctx.fillText(t2, xt2, yt);
+
+        if (w && html_txt != t2) {
+            ebi('np_pos').textContent = html_txt = t2;
+            if (mpl.os_ctl)
+                navigator.mediaSession.setPositionState({
+                    'duration': adur,
+                    'position': apos,
+                    'playbackRate': 1
+                });
+        }
     };
 
     window.addEventListener('resize', r.onresize);
@@ -2085,6 +2139,7 @@ function seek_au_sec(seek) {
     if (!isNum(seek))
         return;
 
+    mp.nopause();
     mp.au.currentTime = seek;
 
     if (mp.au.paused)
@@ -2157,6 +2212,25 @@ function playpause(e) {
 };
 
 
+function mplay(e) {
+    if (mp.au && !mp.au.paused)
+        return;
+
+    playpause(e);
+}
+
+
+function mpause(e) {
+    if (mp.cd_pause > Date.now() - 100)
+        return;
+
+    if (mp.au && mp.au.paused)
+        return;
+
+    playpause(e);
+}
+
+
 // hook up the widget buttons
 (function () {
     ebi('bplay').onclick = playpause;
@@ -2302,12 +2376,14 @@ function start_actx() {
     }
 }
 
-var audio_eq = (function () {
+var afilt = (function () {
     var r = {
-        "en": false,
+        "eqen": false,
        "bands": [31.25, 62.5, 125, 250, 500, 1000, 2000, 4000, 8000, 16000],
        "gains": [4, 3, 2, 1, 0, 0, 1, 2, 3, 4],
        "filters": [],
+        "filterskip": [],
+        "plugs": [],
        "amp": 0,
        "chw": 1,
        "last_au": null,
@@ -2396,6 +2472,10 @@ var audio_eq = (function () {
            r.filters[a].disconnect();
 
        r.filters = [];
+        r.filterskip = [];
+
+        for (var a = 0; a < r.plugs.length; a++)
+            r.plugs[a].unload();
 
        if (!mp)
            return;
@@ -2413,18 +2493,34 @@ var audio_eq = (function () {
        if (!actx)
            bcfg_set('au_eq', false);
 
-       if (!actx || !mp.au || (!r.en && !mp.acs))
+        var plug = false;
+        for (var a = 0; a < r.plugs.length; a++)
+            if (r.plugs[a].en)
+                plug = true;
+
+        if (!actx || !mp.au || (!r.eqen && !plug && !mp.acs))
            return;
 
        r.stop();
        mp.au.id = mp.au.id || Date.now();
        mp.acs = r.acst[mp.au.id] = r.acst[mp.au.id] || actx.createMediaElementSource(mp.au);
 
-       if (!r.en) {
-           mp.acs.connect(actx.destination);
-           return;
-       }
+        if (r.eqen)
+            add_eq();
+
+        for (var a = 0; a < r.plugs.length; a++)
+            if (r.plugs[a].en)
+                r.plugs[a].load();
+
+        for (var a = 0; a < r.filters.length; a++)
+            if (!has(r.filterskip, a))
+                r.filters[a].connect(a ? r.filters[a - 1] : actx.destination);
+
+        mp.acs.connect(r.filters.length ?
+            r.filters[r.filters.length - 1] : actx.destination);
+    }
 
+    function add_eq() {
        var min, max;
        min = max = r.gains[0];
        for (var a = 1; a < r.gains.length; a++) {
@@ -2455,9 +2551,6 @@ var audio_eq = (function () {
        fi.gain.value = r.amp + 0.94; // +.137 dB measured; now -.25 dB and almost bitperfect
        r.filters.push(fi);
 
-       for (var a = r.filters.length - 1; a >= 0; a--)
-           r.filters[a].connect(a > 0 ? r.filters[a - 1] : actx.destination);
-
        if (Math.round(r.chw * 25) != 25) {
            var split = actx.createChannelSplitter(2),
                merge = actx.createChannelMerger(2),
@@ -2482,11 +2575,10 @@ var audio_eq = (function () {
            split.connect(lg2, 0);
            split.connect(rg1, 1);
            split.connect(rg2, 1);
+            r.filterskip.push(r.filters.length);
            r.filters.push(split);
            mp.acs.channelCountMode = 'explicit';
        }
-
-       mp.acs.connect(r.filters[r.filters.length - 1]);
     }
 
     function eq_step(e) {
@@ -2581,7 +2673,7 @@ var audio_eq = (function () {
        txt[a].onkeydown = eq_keydown;
     }
 
-    bcfg_bind(r, 'en', 'au_eq', false, r.apply);
+    bcfg_bind(r, 'eqen', 'au_eq', false, r.apply);
 
     r.draw();
     return r;
@@ -2663,7 +2755,7 @@ function play(tid, is_ev, seek) {
     else
        mp.au.src = mp.au.rsrc = url;
 
-    audio_eq.apply();
+    afilt.apply();
 
     setTimeout(function () {
        mpl.unbuffer(url);
@@ -2686,6 +2778,7 @@ function play(tid, is_ev, seek) {
     scroll2playing();
 
     try {
+        mp.nopause();
        mp.au.play();
        if (mp.au.paused)
            autoplay_blocked(seek);
@@ -4150,6 +4243,21 @@ var thegrid = (function () {
        ev(e);
     }
 
+    r.imshow = function (url) {
+        var sel = '#ggrid>a'
+        if (!thegrid.en) {
+            thegrid.bagit('#files');
+            sel = '#files a[id]';
+        }
+        var ims = QSA(sel);
+        for (var a = 0, aa = ims.length; a < aa; a++) {
+            var iu = ims[a].getAttribute('href').split('?')[0].split('/').slice(-1)[0];
+            if (iu == url)
+                return ims[a].click();
+        }
+        baguetteBox.hide();
+    };
+
     r.loadsel = function () {
        if (r.dirty)
            return;
@@ -4289,19 +4397,19 @@ var thegrid = (function () {
        }
 
        r.dirty = false;
-       r.bagit();
+       r.bagit('#ggrid');
        r.loadsel();
        setTimeout(r.tippen, 20);
     }
 
-    r.bagit = function () {
+    r.bagit = function (isrc) {
        if (!window.baguetteBox)
            return;
 
        if (r.bbox)
            baguetteBox.destroy();
 
-       r.bbox = baguetteBox.run('#ggrid', {
+       r.bbox = baguetteBox.run(isrc, {
            captions: function (g) {
                var idx = -1,
                    h = '' + g;
@@ -6674,22 +6782,27 @@ var globalcss = (function () {
 
     var dcs = document.styleSheets;
     for (var a = 0; a < dcs.length; a++) {
-        var base = dcs[a].href,
-            ds = dcs[a].cssRules;
-
-        if (!base)
-            continue;
-
-        base = base.replace(/[^/]+$/, '');
-        for (var b = 0; b < ds.length; b++) {
-            var css = ds[b].cssText.split(/\burl\(/g);
-            ret += css[0];
-            for (var c = 1; c < css.length; c++) {
-                var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : '';
-                ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
-                    css[c].slice(delim ? 1 : 0);
-            }
-            ret += '\n';
-        }
+        var ds, base = '';
+        try {
+            base = dcs[a].href;
+            if (!base)
+                continue;
+
+            ds = dcs[a].cssRules;
+            base = base.replace(/[^/]+$/, '');
+            for (var b = 0; b < ds.length; b++) {
+                var css = ds[b].cssText.split(/\burl\(/g);
+                ret += css[0];
+                for (var c = 1; c < css.length; c++) {
+                    var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : '';
+                    ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
+                        css[c].slice(delim ? 1 : 0);
+                }
+                ret += '\n';
+            }
+        }
+        catch (ex) {
+            console.log('could not read css', a, base);
+        }
     }
     return ret;
@@ -6775,6 +6888,7 @@ function show_md(md, name, div, url, depth) {
 
            els[a].setAttribute('href', '#md-' + href.slice(1));
        }
+        md_th_set();
        set_tabindex();
        var hash = location.hash;
        if (hash.startsWith('#md-'))
@@ -6853,6 +6967,7 @@ function sandbox(tgt, rules, cls, html) {
        'window.onblur=function(){say("ilost #' + tid + '")};' +
        'var el="' + want + '"&&ebi("' + want + '");' +
        'if(el)say("iscroll #' + tid + ' "+el.offsetTop);' +
+        'md_th_set();' +
        (cls == 'mdo' && md_plug.post ?
            'const x={' + md_plug.post + '};' +
            'if(x.render)x.render(ebi("b"));' +
@@ -6885,6 +7000,9 @@ window.addEventListener("message", function (e) {
        else if (t[0] == 'igot' || t[0] == 'ilost') {
            clmod(QS(t[1] + '>iframe'), 'focus', t[0] == 'igot');
        }
+        else if (t[0] == 'imshow') {
+            thegrid.imshow(e.data.slice(7));
+        }
     } catch (ex) {
        console.log('msg-err: ' + ex);
     }
@@ -7159,8 +7277,8 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
 
 function reload_mp() {
     if (mp && mp.au) {
-        if (audio_eq)
-            audio_eq.stop();
+        if (afilt)
+            afilt.stop();
 
        mp.au.pause();
        mp.au = null;
@@ -7172,8 +7290,8 @@ function reload_mp() {
        plays[a].parentNode.innerHTML = '-';
 
     mp = new MPlayer();
-    if (audio_eq)
-        audio_eq.acst = {};
+    if (afilt)
+        afilt.acst = {};
 
     setTimeout(pbar.onresize, 1);
 }
@@ -231,11 +231,11 @@ function convert_markdown(md_text, dest_dom) {
     var nodes = md_dom.getElementsByTagName('a');
     for (var a = nodes.length - 1; a >= 0; a--) {
         var href = nodes[a].getAttribute('href');
-        var txt = nodes[a].textContent;
+        var txt = nodes[a].innerHTML;
 
         if (!txt)
             nodes[a].textContent = href;
-        else if (href !== txt)
+        else if (href !== txt && !nodes[a].className)
             nodes[a].className = 'vis';
     }
 
@@ -43,10 +43,9 @@
 <h1>WebDAV</h1>
 
 <div class="os win">
-<p><em>note: rclone-FTP is a bit faster, so {% if args.ftp or args.ftps %}try that first{% else %}consider enabling FTP in server settings{% endif %}</em></p>
 <p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p>
 <pre>
-rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
+rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
 </pre>
 {% if s %}
@@ -71,7 +70,7 @@
 </pre>
 <p>or you can use rclone instead, which is much slower but doesn't require root (plus it keeps lastmodified on upload):</p>
 <pre>
-rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
+rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
 </pre>
 {% if s %}
@@ -451,6 +451,20 @@ html.y textarea:focus {
     padding: .2em .5em;
     border: .12em solid #aaa;
 }
+.mdo .mdth,
+.mdo .mdthl,
+.mdo .mdthr {
+    margin: .5em .5em .5em 0;
+}
+.mdthl {
+    float: left;
+}
+.mdthr {
+    float: right;
+}
+hr {
+    clear: both;
+}

 @media screen {
     .mdo {
@@ -1602,6 +1602,14 @@ function load_md_plug(md_text, plug_type, defer) {
     if (defer)
         md_plug[plug_type] = null;

+    if (plug_type == 'pre')
+        try {
+            md_text = md_thumbs(md_text);
+        }
+        catch (ex) {
+            toast.warn(30, '' + ex);
+        }
+
     if (!have_emp)
         return md_text;

@@ -1642,6 +1650,47 @@ function load_md_plug(md_text, plug_type, defer) {

     return md;
 }
+function md_thumbs(md) {
+    if (!/(^|\n)<!-- th -->/.exec(md))
+        return md;
+
+    // `!th[flags](some.jpg)`
+    // flags: nothing or "l" or "r"
+
+    md = md.split(/!th\[/g);
+    for (var a = 1; a < md.length; a++) {
+        if (!/^[^\]!()]*\]\([^\][!()]+\)/.exec(md[a])) {
+            md[a] = '!th[' + md[a];
+            continue;
+        }
+
+        var o1 = md[a].indexOf(']('),
+            o2 = md[a].indexOf(')', o1),
+            alt = md[a].slice(0, o1),
+            flags = alt.split(','),
+            url = md[a].slice(o1 + 2, o2),
+            float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : '';
+
+        if (!/[?&]cache/.exec(url))
+            url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=i';
+
+        md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1);
+    }
+    return md.join('');
+}
+function md_th_set() {
+    var els = QSA('.mdth');
+    for (var a = 0, aa = els.length; a < aa; a++)
+        els[a].onclick = md_th_click;
+}
+function md_th_click(e) {
+    ev(e);
+    var url = this.getAttribute('href').split('?')[0];
+    if (window.sb_md)
+        window.parent.postMessage("imshow " + url, "*");
+    else
+        thegrid.imshow(url);
+}


 var svg_decl = '<?xml version="1.0" encoding="UTF-8"?>\n';
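For reference, the input syntax accepted by the md_thumbs() parser added above looks roughly like this (a minimal sketch; the filenames are made up): a document opts in with a `<!-- th -->` line, and each `!th[flags](file)` becomes a clickable thumbnail, where the `l` / `r` flags float it left or right.

```md
<!-- th -->

# vacation pics

!th[l](beach.jpg) this thumbnail floats left and the text wraps around it

!th[](sunset.jpg) no flags: a plain inline thumbnail
```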
@@ -1,3 +1,52 @@
+▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
+# 2023-0401-2112 `v1.6.11` not joke
+
+## new features
+* new event-hook: [exif stripper](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/image-noexif.py)
+* [markdown thumbnails](https://a.ocv.me/pub/demo/pics-vids/README.md?v) -- see [readme](https://github.com/9001/copyparty#markdown-viewer)
+* soon: support for [web-scrobbler](https://github.com/web-scrobbler/web-scrobbler/) - the [Last.fm](https://www.last.fm/user/tripflag) browser extension
+  * will update here + readme with more info when [the v3](https://github.com/web-scrobbler/web-scrobbler/projects/5) is out
+
+## bugfixes
+* more sqlite query-planner twiddling
+  * deleting files is MUCH faster now, and uploads / bootup might be a bit better too
+* webdav optimizations / compliance
+  * should make some webdav clients run faster than before
+  * in very related news, the webdav-client in [rclone](https://github.com/rclone/rclone/) v1.63 ([currently beta](https://beta.rclone.org/?filter=latest)) will be ***FAST!***
+    * does cool stuff such as [bidirectional sync](https://github.com/9001/copyparty#folder-sync) between copyparty and a local folder
+* [bpm detector](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/audio-bpm.py) is a bit more accurate
+* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) / commandline uploader: better error messages if something goes wrong
+* readme rendering could fail in firefox if certain addons were installed (not sure which)
+* event-hooks: more accurate usage examples
+
+## other changes
+* @chinponya automated the prismjs build step (thx!)
+* updated some js deps (markedjs, codemirror)
+* copyparty.exe: updated Pillow to 9.5.0
+* and finally [the joke](https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/rave.js) (looks [like this](https://cd.ocv.me/b/d2/d21/#af-9b927c42))
+
+
+
+▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
+# 2023-0320-2156 `v1.6.10` rclone sync
+
+## new features
+* [iPhone "app"](https://github.com/9001/copyparty#ios-shortcuts) (upload shortcut) -- thanks @Daedren !
+  * can strip exif, upload files, pics, vids, links, clipboard
+  * can download links and rehost the target file on your server
+* support `rclone sync` to [sync folders](https://github.com/9001/copyparty#folder-sync) to/from copyparty
+* let webdav clients set lastmodified times during upload
+* let webdav clients replace files during upload
+
+## bugfixes
+* [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh): FFmpeg transcoding was slow because there was no `/dev/urandom`
+* iphones would fail to play *some* songs (low-bitrate and/or shorter than ~7 seconds)
+  * due to either an iOS bug or an FFmpeg bug in the caf remuxing idk
+  * fixed by mixing in white noise into songs if an iPhone asks for them
+* small correction in the docker readme regarding rootless podman
+
+
+
 ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
 # 2023-0316-2106 `v1.6.9` index.html

@@ -4,6 +4,7 @@
 * [future plans](#future-plans) - some improvement ideas
 * [design](#design)
     * [up2k](#up2k) - quick outline of the up2k protocol
+    * [why not tus](#why-not-tus) - I didn't know about [tus](https://tus.io/)
     * [why chunk-hashes](#why-chunk-hashes) - a single sha512 would be better, right?
 * [http api](#http-api)
     * [read](#read)
@@ -66,6 +67,13 @@ regarding the frequent server log message during uploads;
 * on this http connection, `2.77 GiB` transferred, `102.9 MiB/s` average, `948` chunks handled
 * client says `4` uploads OK, `0` failed, `3` busy, `1` queued, `10042 MiB` total size, `7198 MiB` and `00:01:09` left

+## why not tus
+
+I didn't know about [tus](https://tus.io/) when I made this, but:
+* up2k has the advantage that it supports parallel uploading of non-contiguous chunks straight into the final file -- [tus does a merge at the end](https://tus.io/protocols/resumable-upload.html#concatenation) which is slow and taxing on the server HDD / filesystem (unless i'm misunderstanding)
+* up2k has the slight disadvantage of requiring the client to hash the entire file before an upload can begin, but this has the benefit of immediately skipping duplicate files
+  * and the hashing happens in a separate thread anyways so it's usually not a bottleneck
+
 ## why chunk-hashes

 a single sha512 would be better, right?

docs/examples/README.md (new file, 4 lines)
@@ -0,0 +1,4 @@
copyparty server config examples

[windows.md](windows.md) -- running copyparty as a service on windows

docs/examples/windows.md (new file, 116 lines)
@@ -0,0 +1,116 @@
# running copyparty on windows

this is a complete example / quickstart for running copyparty on windows, optionally as a service (autostart on boot)

you will definitely need either [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (comfy, portable, more features) or [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) (smaller, safer)

* if you decided to grab `copyparty-sfx.py` instead of the exe you will also need to install the ["Latest Python 3 Release"](https://www.python.org/downloads/windows/)

then you probably want to download [FFmpeg](https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip) and put `ffmpeg.exe` and `ffprobe.exe` in your PATH (so for example `C:\Windows\System32\`) -- this enables thumbnails, audio transcoding, and making music metadata searchable


## the config file

open up notepad and save the following as `c:\users\you\documents\party.conf` (for example)

```yaml
[global]
  lo: c:\users\you\logs\cpp-%Y-%m%d.xz  # log to file
  e2dsa, e2ts, no-dedup, z              # sets 4 flags; see expl.
  p: 80, 443                            # listen on ports 80 and 443, not 3923
  theme: 2                              # default theme: protonmail-monokai
  lang: nor                             # default language: viking

[accounts]                 # usernames and passwords
  kevin: shangalabangala   # kevin's password

[/]                        # create a volume available at /
  c:\pub                   # sharing this filesystem location
  accs:                    # and set permissions:
    r: *                   # everyone can read/download files,
    rwmd: kevin            # kevin can read/write/move/delete

[/inc]                     # create another volume at /inc
  c:\pub\inc               # sharing this filesystem location
  accs:                    # permissions:
    w: *                   # everyone can upload, but not browse
    rwmd: kevin            # kevin is admin here too

[/music]                   # and a third volume at /music
  ~/music                  # which shares c:\users\you\music
  accs:
    r: *
    rwmd: kevin
```


### config explained: [global]

the `[global]` section accepts any config parameters you can see when running copyparty (either the exe or the sfx.py) with `--help`, so this is the same as running copyparty with arguments `--lo c:\users\you\logs\copyparty-%Y-%m%d.xz -e2dsa -e2ts --no-dedup -z -p 80,443 --theme 2 --lang nor`
* `lo: c:\users\you\logs\cpp-%Y-%m%d.xz` writes compressed logs (the compression will make them delayed)
  * sorry that `~/logs/` doesn't work currently, good oversight
* `e2dsa` enables the upload deduplicator and file indexer, which enables searching
* `e2ts` enables music metadata indexing, making albums / titles etc. searchable too
* `no-dedup` writes full dupes to disk instead of symlinking, since lots of windows software doesn't handle symlinks well
  * but the improved upload speed from `e2dsa` is not affected
* `z` enables zeroconf, making the server available at `http://HOSTNAME.local/` from any other machine in the LAN
* `p: 80,443` listens on the ports `80` and `443` instead of the default `3923`
* `lang: nor` sets default language to viking


### config explained: [accounts]

the `[accounts]` section defines all the user accounts, which can then be referenced when granting people access to the different volumes


### config explained: volumes

then we create three volumes, one at `/`, one at `/inc`, and one at `/music`
* `/` and `/music` are readable without requiring people to login (`r: *`) but you need to login as kevin to write/move/delete files (`rwmd: kevin`)
* anyone can upload to `/inc` but you must be logged in as kevin to see the files inside


## run copyparty

to test your config it's best to just run copyparty in a console to watch the output:

```batch
copyparty.exe -c party.conf
```

or if you wanna use `copyparty-sfx.py` instead of the exe (understandable),

```batch
%localappdata%\programs\python\python311\python.exe copyparty-sfx.py -c party.conf
```

(please adjust `python311` to match the python version you installed, i'm not good enough at windows to make that bit generic)


## run it as a service

to run this as a service you need [NSSM](https://nssm.cc/ci/nssm-2.24-101-g897c7ad.zip), so put the exe somewhere in your PATH

then either do this for `copyparty.exe`:
```batch
nssm install cpp %homedrive%%homepath%\downloads\copyparty.exe -c %homedrive%%homepath%\documents\party.conf
```

or do this for `copyparty-sfx.py`:
```batch
nssm install cpp %localappdata%\programs\python\python311\python.exe %homedrive%%homepath%\downloads\copyparty-sfx.py -c %homedrive%%homepath%\documents\party.conf
```

then after creating the service, modify it so it runs with your own windows account (so file permissions don't get wonky and paths expand as expected):
```batch
nssm set cpp ObjectName .\yourAccoutName yourWindowsPassword
nssm start cpp
```

and that's it, all good

if it doesn't start, enable stderr logging so you can see what went wrong:
```batch
nssm set cpp AppStderr %homedrive%%homepath%\logs\cppsvc.err
nssm set cpp AppStderrCreationDisposition 2
```
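Not part of the file above, but as a follow-up to the NSSM setup it documents: if the service ever needs to be removed again, something along these lines should work (a sketch, assuming the same service name `cpp` as in the examples):

```batch
nssm stop cpp
nssm remove cpp confirm
```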
@@ -29,11 +29,13 @@ echo type = webdav
 echo vendor = owncloud
 echo url = http://127.0.0.1:3923/
 echo headers = Cookie,cppwd=hunter2
+echo pacer_min_sleep = 0.01ms
 echo(
 echo [cpp-ro]
 echo type = http
 echo url = http://127.0.0.1:3923/
 echo headers = Cookie,cppwd=hunter2
+echo pacer_min_sleep = 0.01ms
 ) > %userprofile%\.config\rclone\rclone.conf
 ```

@@ -48,11 +50,13 @@ type = webdav
 vendor = owncloud
 url = http://127.0.0.1:3923/
 headers = Cookie,cppwd=hunter2
+pacer_min_sleep = 0.01ms

 [cpp-ro]
 type = http
 url = http://127.0.0.1:3923/
 headers = Cookie,cppwd=hunter2
+pacer_min_sleep = 0.01ms
 EOF
 ```

@@ -74,8 +78,6 @@ note that the up2k client [up2k.py](https://github.com/9001/copyparty/tree/hovud
 rclone sync /usr/share/icons/ cpp-rw:fds/
 ```

-TODO: rclone bug? `--transfers=4` doesn't seem to do anything (it does one request at a time), doesn't matter if the webdav server is copyparty or rclone
-

 # use rclone as server too, replacing copyparty

flake.lock (new file, generated, 42 lines)
@@ -0,0 +1,42 @@
{
  "nodes": {
    "flake-utils": {
      "locked": {
        "lastModified": 1678901627,
        "narHash": "sha256-U02riOqrKKzwjsxc/400XnElV+UtPUQWpANPlyazjH0=",
        "owner": "numtide",
        "repo": "flake-utils",
        "rev": "93a2b84fc4b70d9e089d029deacc3583435c2ed6",
        "type": "github"
      },
      "original": {
        "owner": "numtide",
        "repo": "flake-utils",
        "type": "github"
      }
    },
    "nixpkgs": {
      "locked": {
        "lastModified": 1680334310,
        "narHash": "sha256-ISWz16oGxBhF7wqAxefMPwFag6SlsA9up8muV79V9ck=",
        "owner": "NixOS",
        "repo": "nixpkgs",
        "rev": "884e3b68be02ff9d61a042bc9bd9dd2a358f95da",
        "type": "github"
      },
      "original": {
        "id": "nixpkgs",
        "ref": "nixos-22.11",
        "type": "indirect"
      }
    },
    "root": {
      "inputs": {
        "flake-utils": "flake-utils",
        "nixpkgs": "nixpkgs"
      }
    }
  },
  "root": "root",
  "version": 7
}
flake.nix (new file, 28 lines)
@@ -0,0 +1,28 @@
{
  inputs = {
    nixpkgs.url = "nixpkgs/nixos-22.11";
    flake-utils.url = "github:numtide/flake-utils";
  };

  outputs = { self, nixpkgs, flake-utils }:
    {
      nixosModules.default = ./contrib/nixos/modules/copyparty.nix;
      overlays.default = self: super: {
        copyparty =
          self.python3.pkgs.callPackage ./contrib/package/nix/copyparty {
            ffmpeg = self.ffmpeg-full;
          };
      };
    } // flake-utils.lib.eachDefaultSystem (system:
      let
        pkgs = import nixpkgs {
          inherit system;
          overlays = [ self.overlays.default ];
        };
      in {
        packages = {
          inherit (pkgs) copyparty;
          default = self.packages.${system}.copyparty;
        };
      });
}
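A rough sketch of how the overlay and module exported above might be consumed from another flake; the hostname and `./configuration.nix` are placeholders, not part of this repo:

```nix
{
  inputs = {
    nixpkgs.url = "nixpkgs/nixos-22.11";
    copyparty.url = "github:9001/copyparty";
  };

  outputs = { self, nixpkgs, copyparty, ... }: {
    # hypothetical host; the overlay makes pkgs.copyparty available,
    # then the NixOS module shipped by the flake above is imported
    nixosConfigurations.myhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        { nixpkgs.overlays = [ copyparty.overlays.default ]; }
        copyparty.nixosModules.default
        ./configuration.nix
      ];
    };
  };
}
```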
@@ -3,12 +3,21 @@ FROM alpine:3.16
 WORKDIR /z
 ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
     ver_hashwasm=4.9.0 \
-    ver_marked=4.2.5 \
+    ver_marked=4.3.0 \
     ver_mde=2.18.0 \
-    ver_codemirror=5.65.11 \
+    ver_codemirror=5.65.12 \
     ver_fontawesome=5.13.0 \
+    ver_prism=1.29.0 \
     ver_zopfli=1.0.3

+# versioncheck:
+# https://github.com/markedjs/marked/releases
+# https://github.com/Ionaru/easy-markdown-editor/tags
+# https://github.com/codemirror/codemirror5/releases
+# https://github.com/Daninet/hash-wasm/releases
+# https://github.com/openpgpjs/asmcrypto.js
+# https://github.com/google/zopfli/tags
+

 # download;
 # the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
@@ -22,6 +31,7 @@ RUN mkdir -p /z/dist/no-pk \
  && wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
  && wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
  && wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
+ && wget https://github.com/PrismJS/prism/archive/refs/tags/v$ver_prism.tar.gz -O prism.tgz \
  && (mkdir hash-wasm \
      && cd hash-wasm \
      && unzip ../hash-wasm.zip) \
@@ -39,14 +49,11 @@ RUN mkdir -p /z/dist/no-pk \
      && cd easy-markdown-editor* \
      && npm install \
      && npm i gulp-cli -g ) \
+ && tar -xf prism.tgz \
  && unzip fontawesome.zip \
  && tar -xf zopfli.tgz


-# todo
-# https://prismjs.com/download.html#themes=prism-funky&languages=markup+css+clike+javascript+autohotkey+bash+basic+batch+c+csharp+cpp+cmake+diff+docker+go+ini+java+json+kotlin+latex+less+lisp+lua+makefile+objectivec+perl+powershell+python+r+jsx+ruby+rust+sass+scss+sql+swift+systemd+toml+typescript+vbnet+verilog+vhdl+yaml&plugins=line-highlight+line-numbers+autolinker
-
-
 # build fonttools (which needs zopfli)
 RUN tar -xf zopfli.tgz \
  && cd zopfli* \
@@ -121,6 +128,12 @@ COPY shiftbase.py /z
 RUN /bin/ash /z/mini-fa.sh


+# build prismjs
+COPY genprism.py /z
+COPY genprism.sh /z
+RUN ./genprism.sh $ver_prism
+
+
 # compress
 COPY zopfli.makefile /z/dist/Makefile
 RUN cd /z/dist \

@@ -1,10 +1,9 @@
 self := $(dir $(abspath $(lastword $(MAKEFILE_LIST))))
 vend := $(self)/../../copyparty/web/deps

+# prefers podman-docker (optionally rootless) over actual docker/moby

 all:
-	-service docker start
-	-systemctl start docker

 	docker build -t build-copyparty-deps .

 	rm -rf $(vend)
@@ -14,6 +13,7 @@ all:
 	docker run --rm -i build-copyparty-deps:latest | \
 	tar -xvC $(vend) --strip-components=1

+	touch $(vend)/__init__.py
 	chown -R `stat $(self) -c %u:%g` $(vend)

 purge:

scripts/deps-docker/genprism.py (new executable file, 198 lines)
@@ -0,0 +1,198 @@
#!/usr/bin/env python3

# author: @chinponya


import argparse
import json
from pathlib import Path
from urllib.parse import urlparse, parse_qsl


def read_json(path):
    return json.loads(path.read_text())


def get_prism_version(prism_path):
    package_json_path = prism_path / "package.json"
    package_json = read_json(package_json_path)
    return package_json["version"]


def get_prism_components(prism_path):
    components_json_path = prism_path / "components.json"
    components_json = read_json(components_json_path)
    return components_json


def parse_prism_configuration(url_str):
    url = urlparse(url_str)
    # prism.com uses a non-standard query string-like encoding
    query = {k: v.split(" ") for k, v in parse_qsl(url.fragment)}
    return query


def paths_of_component(prism_path, kind, components, name, minified):
    component = components[kind][name]
    meta = components[kind]["meta"]
    path_format = meta["path"]
    path_base = prism_path / path_format.replace("{id}", name)

    if isinstance(component, str):
        # 'core' component has a different shape, so we convert it to be consistent
        component = {"title": component}

    if meta.get("noCSS") or component.get("noCSS"):
        extensions = ["js"]
    elif kind == "themes":
        extensions = ["css"]
    else:
        extensions = ["js", "css"]

    if path_base.is_dir():
        result = {ext: path_base / f"{name}.{ext}" for ext in extensions}
    elif path_base.suffix:
        ext = path_base.suffix.replace(".", "")
        result = {ext: path_base}
    else:
        result = {ext: path_base.with_suffix(f".{ext}") for ext in extensions}

    if minified:
        result = {
            ext: path.with_suffix(".min" + path.suffix) for ext, path in result.items()
        }

    return result


def read_component_contents(kv_paths):
    return {k: path.read_text() for k, path in kv_paths.items()}


def get_language_dependencies(components, name):
    dependencies = components["languages"][name].get("require")

    if isinstance(dependencies, list):
        return dependencies
    elif isinstance(dependencies, str):
        return [dependencies]
    else:
        return []


def make_header(prism_path, url):
    version = get_prism_version(prism_path)
    header = f"/* PrismJS {version}\n{url} */"
    return {"js": header, "css": header}


def make_core(prism_path, components, minified):
    kv_paths = paths_of_component(prism_path, "core", components, "core", minified)
    return read_component_contents(kv_paths)


def make_theme(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "themes", components, name, minified)
    return read_component_contents(kv_paths)


def make_language(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "languages", components, name, minified)
    return read_component_contents(kv_paths)


def make_languages(prism_path, components, names, minified):
    names_with_dependencies = sum(
        ([*get_language_dependencies(components, name), name] for name in names), []
    )

    seen = set()
    names_with_dependencies = [
        x for x in names_with_dependencies if not (x in seen or seen.add(x))
    ]

    kv_code = [
        make_language(prism_path, components, name, minified)
        for name in names_with_dependencies
    ]

    return kv_code


def make_plugin(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "plugins", components, name, minified)
    return read_component_contents(kv_paths)


def make_plugins(prism_path, components, names, minified):
    kv_code = [make_plugin(prism_path, components, name, minified) for name in names]
    return kv_code


def make_code(prism_path, url, minified):
    components = get_prism_components(prism_path)
    configuration = parse_prism_configuration(url)
    theme_name = configuration["themes"][0]
    code = [
        make_header(prism_path, url),
        make_core(prism_path, components, minified),
        make_theme(prism_path, components, theme_name, minified),
    ]

    if configuration.get("languages"):
        code.extend(
            make_languages(prism_path, components, configuration["languages"], minified)
        )

    if configuration.get("plugins"):
        code.extend(
            make_plugins(prism_path, components, configuration["plugins"], minified)
        )

    return code


def join_code(kv_code):
    result = {"js": "", "css": ""}

    for row in kv_code:
        for key, code in row.items():
            result[key] += code
            result[key] += "\n"

    return result


def write_code(kv_code, js_out, css_out):
    code = join_code(kv_code)

    with js_out.open("w") as f:
        f.write(code["js"])
        print(f"written {js_out}")

    with css_out.open("w") as f:
        f.write(code["css"])
        print(f"written {css_out}")


def parse_args():
    # fmt: off
    parser = argparse.ArgumentParser()
    parser.add_argument("url", help="configured prism download url")
    parser.add_argument("--dir", type=Path, default=Path("."), help="prism repo directory")
    parser.add_argument("--minify", default=True, action=argparse.BooleanOptionalAction, help="use minified files",)
    parser.add_argument("--js-out", type=Path, default=Path("prism.js"), help="JS output file path")
    parser.add_argument("--css-out", type=Path, default=Path("prism.css"), help="CSS output file path")
    # fmt: on
    args = parser.parse_args()
    return args


def main():
    args = parse_args()
    code = make_code(args.dir, args.url, args.minify)
    write_code(code, args.js_out, args.css_out)


if __name__ == "__main__":
    main()
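A quick sanity check of the URL-fragment parsing that parse_prism_configuration() above relies on; the download link here is a shortened, made-up example of a prismjs.com URL:

```python
from urllib.parse import urlparse, parse_qsl

# prismjs.com encodes the build configuration in the #fragment, with '+' separating values
url = "https://prismjs.com/download.html#themes=prism-funky&languages=markup+css+python&plugins=line-numbers"
frag = urlparse(url).fragment
cfg = {k: v.split(" ") for k, v in parse_qsl(frag)}  # '+' decodes to spaces, then split
print(cfg)
# {'themes': ['prism-funky'], 'languages': ['markup', 'css', 'python'], 'plugins': ['line-numbers']}
```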
scripts/deps-docker/genprism.sh (new executable file, 66 lines)
@@ -0,0 +1,66 @@
#!/bin/bash
set -e

langs=(
    markup
    css
    clike
    javascript
    autohotkey
    bash
    basic
    batch
    c
    csharp
    cpp
    cmake
    diff
    docker
    elixir
    glsl
    go
    ini
    java
    json
    kotlin
    latex
    less
    lisp
    lua
    makefile
    matlab
    moonscript
    nim
    objectivec
    perl
    powershell
    python
    r
    jsx
    ruby
    rust
    sass
    scss
    sql
    swift
    systemd
    toml
    typescript
    vbnet
    verilog
    vhdl
    yaml
    zig
)

slangs="${langs[*]}"
slangs="${slangs// /+}"

for theme in prism-funky prism ; do
    u="https://prismjs.com/download.html#themes=$theme&languages=$slangs&plugins=line-highlight+line-numbers+autolinker"
    echo "$u"
    ./genprism.py --dir prism-$1 --js-out prism.js --css-out $theme.css "$u"
done

mv prism-funky.css prismd.css
mv prismd.css prism.css prism.js /z/dist/
@@ -1,5 +1,5 @@
 diff --git a/src/Lexer.js b/src/Lexer.js
-adds linetracking to marked.js v4.2.3;
+adds linetracking to marked.js v4.3.0;
 add data-ln="%d" to most tags, %d is the source markdown line
 --- a/src/Lexer.js
 +++ b/src/Lexer.js
@@ -206,7 +206,6 @@ index a22a2bc..884ad66 100644
      // Run any renderer extensions
      if (this.options.extensions && this.options.extensions.renderers && this.options.extensions.renderers[token.type]) {
 diff --git a/src/Renderer.js b/src/Renderer.js
-index 7c36a75..aa1a53a 100644
 --- a/src/Renderer.js
 +++ b/src/Renderer.js
 @@ -11,6 +11,12 @@ export class Renderer {
@@ -290,10 +289,9 @@ index 7c36a75..aa1a53a 100644
      if (title) {
        out += ` title="${title}"`;
 diff --git a/src/Tokenizer.js b/src/Tokenizer.js
-index e8a69b6..2cc772b 100644
 --- a/src/Tokenizer.js
 +++ b/src/Tokenizer.js
-@@ -312,4 +312,7 @@ export class Tokenizer {
+@@ -333,4 +333,7 @@ export class Tokenizer {
      const l = list.items.length;

 +    // each nested list gets +1 ahead; this hack makes every listgroup -1 but atleast it doesn't get infinitely bad

@@ -1,4 +1,5 @@
 diff --git a/src/Lexer.js b/src/Lexer.js
+strip some features
 --- a/src/Lexer.js
 +++ b/src/Lexer.js
 @@ -7,5 +7,5 @@ import { repeatString } from './helpers.js';
@@ -56,7 +57,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
 diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 --- a/src/Tokenizer.js
 +++ b/src/Tokenizer.js
-@@ -352,14 +352,7 @@ export class Tokenizer {
+@@ -367,14 +367,7 @@ export class Tokenizer {
        type: 'html',
        raw: cap[0],
 -      pre: !this.options.sanitizer
@@ -72,7 +73,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 -    }
      return token;
    }
-@@ -502,15 +495,9 @@ export class Tokenizer {
+@@ -517,15 +510,9 @@ export class Tokenizer {

      return {
 -      type: this.options.sanitize
@@ -90,7 +91,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 +      text: cap[0]
      };
    }
-@@ -699,10 +686,10 @@ export class Tokenizer {
+@@ -714,10 +701,10 @@ export class Tokenizer {
    }

 -  autolink(src, mangle) {
@@ -103,7 +104,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 +        text = escape(cap[1]);
          href = 'mailto:' + text;
        } else {
-@@ -727,10 +714,10 @@ export class Tokenizer {
+@@ -742,10 +729,10 @@ export class Tokenizer {
    }

 -  url(src, mangle) {
@@ -116,7 +117,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 +        text = escape(cap[0]);
          href = 'mailto:' + text;
        } else {
-@@ -764,12 +751,12 @@ export class Tokenizer {
+@@ -779,12 +766,12 @@ export class Tokenizer {
    }

 -  inlineText(src, smartypants) {
@@ -135,8 +136,8 @@ diff --git a/src/defaults.js b/src/defaults.js
 --- a/src/defaults.js
 +++ b/src/defaults.js
-@@ -10,11 +10,7 @@ export function getDefaults() {
-     highlight: null,
+@@ -11,11 +11,7 @@ export function getDefaults() {
+     hooks: null,
      langPrefix: 'language-',
 -    mangle: true,
      pedantic: false,
@@ -170,7 +171,7 @@ diff --git a/src/helpers.js b/src/helpers.js
 +export function cleanUrl(base, href) {
    if (base && !originIndependentUrl.test(href)) {
      href = resolveUrl(base, href);
-@@ -250,10 +237,4 @@ export function findClosingBracket(str, b) {
+@@ -233,10 +220,4 @@ export function findClosingBracket(str, b) {
    }

 -export function checkSanitizeDeprecation(opt) {
@@ -185,30 +186,25 @@ diff --git a/src/marked.js b/src/marked.js
 --- a/src/marked.js
 +++ b/src/marked.js
 @@ -7,5 +7,4 @@ import { Slugger } from './Slugger.js';
+import { Hooks } from './Hooks.js';
 import {
-  merge,
 -  checkSanitizeDeprecation,
   escape
 } from './helpers.js';
-@@ -35,5 +34,4 @@ export function marked(src, opt, callback) {
+@@ -18,5 +17,5 @@ import {
+function onError(silent, async, callback) {
-  opt = merge({}, marked.defaults, opt || {});
+  return (e) => {
--  checkSanitizeDeprecation(opt);
-
-  if (callback) {
-@@ -318,5 +316,4 @@ marked.parseInline = function(src, opt) {
-
-  opt = merge({}, marked.defaults, opt || {});
--  checkSanitizeDeprecation(opt);
-
-  try {
-@@ -327,5 +324,5 @@ marked.parseInline = function(src, opt) {
-    return Parser.parseInline(tokens, opt);
-  } catch (e) {
 -    e.message += '\nPlease report this to https://github.com/markedjs/marked.';
 +    e.message += '\nmake issue @ https://github.com/9001/copyparty';
-    if (opt.silent) {
-      return '<p>An error occurred:</p><pre>'
+    if (silent) {
+@@ -65,6 +64,4 @@ function parseMarkdown(lexer, parser) {
+  }
+
+-  checkSanitizeDeprecation(opt);
+-
+  if (opt.hooks) {
+    opt.hooks.options = opt;
 diff --git a/test/bench.js b/test/bench.js
 --- a/test/bench.js
 +++ b/test/bench.js
@@ -250,70 +246,70 @@ diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
 diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
 --- a/test/unit/Lexer-spec.js
 +++ b/test/unit/Lexer-spec.js
-@@ -712,5 +712,5 @@ paragraph
+@@ -794,5 +794,5 @@ paragraph
    });

 -  it('sanitize', () => {
 +  /*it('sanitize', () => {
      expectTokens({
        md: '<div>html</div>',
-@@ -730,5 +730,5 @@ paragraph
+@@ -812,5 +812,5 @@ paragraph
        ]
      });
 -  });
 +  });*/
  });

-@@ -810,5 +810,5 @@ paragraph
+@@ -892,5 +892,5 @@ paragraph
    });

 -  it('html sanitize', () => {
 +  /*it('html sanitize', () => {
      expectInlineTokens({
        md: '<div>html</div>',
-@@ -818,5 +818,5 @@ paragraph
+@@ -900,5 +900,5 @@ paragraph
        ]
      });
 -  });
 +  });*/

    it('link', () => {
-@@ -1129,5 +1129,5 @@ paragraph
+@@ -1211,5 +1211,5 @@ paragraph
    });

 -  it('autolink mangle email', () => {
 +  /*it('autolink mangle email', () => {
      expectInlineTokens({
        md: '<test@example.com>',
-@@ -1149,5 +1149,5 @@ paragraph
+@@ -1231,5 +1231,5 @@ paragraph
        ]
      });
 -  });
 +  });*/

    it('url', () => {
-@@ -1186,5 +1186,5 @@ paragraph
+@@ -1268,5 +1268,5 @@ paragraph
    });

 -  it('url mangle email', () => {
 +  /*it('url mangle email', () => {
      expectInlineTokens({
        md: 'test@example.com',
-@@ -1206,5 +1206,5 @@ paragraph
+@@ -1288,5 +1288,5 @@ paragraph
        ]
      });
 -  });
 +  });*/
  });

-@@ -1222,5 +1222,5 @@ paragraph
+@@ -1304,5 +1304,5 @@ paragraph
    });

 -  describe('smartypants', () => {
 +  /*describe('smartypants', () => {
      it('single quotes', () => {
        expectInlineTokens({
-@@ -1292,5 +1292,5 @@ paragraph
+@@ -1374,5 +1374,5 @@ paragraph
    });
    });
 -  });

@@ -40,4 +40,10 @@ update_arch_pkgbuild() {
 	rm -rf x
 }

+update_nixos_pin() {
+	( cd $self/../contrib/package/nix/copyparty;
+	./update.py $self/../dist/copyparty-sfx.py )
+}
+
 update_arch_pkgbuild
+update_nixos_pin

@@ -1,9 +1,9 @@
 d5510a24cb5e15d6d30677335bbc7624c319b371c0513981843dc51d9b3a1e027661096dfcfc540634222bb2634be6db55bf95185b30133cb884f1e47652cf53 altgraph-0.17.3-py2.py3-none-any.whl
 eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
 17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
-85a041cc95cf493f5e2ebc2ca406d2718735e43951988810dc448d29e9ee0bcdb1ca19e0c22243441f45633969af8027469f29f6288f6830c724a3fa38886e5c pyinstaller-5.8.0-py3-none-win32.whl
-adf0d23a98da38056de25e07e68921739173efc70fb9bf3f68d8c7c3d0d092e09efa69d35c0c9ecc990bc3c5fa62038227ef480ed06ddfaf05353f6e468f5dca pyinstaller-5.8.0-py3-none-win_amd64.whl
-01d7f8125966ed30389a879ba69d2c1fd3212bafad3fb485317580bcb9f489e8b901c4d325f6cb8a52986838ba6d44d3852e62b27c1f1d5a576899821cc0ae02 pyinstaller_hooks_contrib-2023.0-py2.py3-none-any.whl
+d68c78bc83f4f48c604912b2d1ca4772b0e6ed676cd2eb439411e0a74d63fe215aac93dd9dab04ed341909a4a6a1efc13ec982516e3cb0fc7c355055e63d9178 pyinstaller-5.10.1-py3-none-win32.whl
+fe62705893c86eeb2d5b841da8debe05dedda98364dec190b487e718caad8a8735503bf93739a7a27ea793a835bf976fb919ceec1424b8fc550b936bae4a54e9 pyinstaller-5.10.1-py3-none-win_amd64.whl
+61c543983ff67e2bdff94d2d6198023679437363db8c660fa81683aff87c5928cd800720488e18d09be89fe45d6ab99be3ccb912cb2e03e2bca385b4338e1e42 pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
 132a5380f33a245f2e744413a0e1090bc42b7356376de5121397cec5976b04b79f7c9ebe28af222c9c7b01461f7d7920810d220e337694727e0d7cd9e91fa667 pywin32_ctypes-0.2.0-py2.py3-none-any.whl
 3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
 4b6e9ae967a769fe32be8cf0bc0d5a213b138d1e0344e97656d08a3d15578d81c06c45b334c872009db2db8f39db0c77c94ff6c35168d5e13801917667c08678 upx-4.0.2-win32.zip
@@ -26,5 +26,5 @@ ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a250675
 00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
 b1db6f5a79fc15391547643e5973cf5946c0acfa6febb68bc90fc3f66369681100cc100f32dd04256dcefa510e7864c718515a436a4af3a10fe205c413c7e693 MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl
 4a20aeb52d4fde6aabcba05ee261595eeb5482c72ee27332690f34dd6e7a49c0b3ba3813202ac15c9d21e29f1cd803f2e79ccc1c45ec314fcd0a937016bcbc56 mutagen-1.46.0-py3-none-any.whl
-ea152624499966615ee74f2aefed27da528785e1215f46d61e79c5290bb8105fd98e9948938efbca9cd19e2f1dd48c9e712b4f30a4148a0ed5d1ff2dff77106e Pillow-9.4.0-cp311-cp311-win_amd64.whl
-2b04b196f1115f42375e623a35edeb71565dfd090416b22510ec0270fefe86f7d397a98aabbe9ebfe3f6a355fe25c487a4875d4252027d0a61ccb64cacd7631d python-3.11.2-amd64.exe
+78414808cb9a5fa74e7b23360b8f46147952530e3cc78a3ad4b80be3e26598080537ac691a1be1f35b7428a22c1f65a6adf45986da2752fbe9d9819d77a58bf8 Pillow-9.5.0-cp311-cp311-win_amd64.whl
+4b7711b950858f459d47145b88ccde659279c6af47144d58a1c54ea2ce4b80ec43eb7f69c68f12f8f6bc54c86a44e77441993257f7ad43aab364655de5c51bb1 python-3.11.2-amd64.exe

@@ -17,15 +17,15 @@ uname -s | grep NT-10 && w10=1 || {
 fns=(
     altgraph-0.17.3-py2.py3-none-any.whl
     pefile-2023.2.7-py3-none-any.whl
-    pyinstaller-5.8.0-py3-none-win_amd64.whl
-    pyinstaller_hooks_contrib-2023.0-py2.py3-none-any.whl
+    pyinstaller-5.10.1-py3-none-win_amd64.whl
+    pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
     pywin32_ctypes-0.2.0-py2.py3-none-any.whl
     upx-4.0.2-win32.zip
 )
 [ $w10 ] && fns+=(
     mutagen-1.46.0-py3-none-any.whl
     Pillow-9.4.0-cp311-cp311-win_amd64.whl
-    python-3.11.2-amd64.exe
+    python-3.11.3-amd64.exe
 }
 [ $w7 ] && fns+=(
     certifi-2022.12.7-py3-none-any.whl
@@ -43,12 +43,12 @@ fns=(
 )
 [ $w7x64 ] && fns+=(
     windows6.1-kb2533623-x64.msu
-    pyinstaller-5.8.0-py3-none-win_amd64.whl
+    pyinstaller-5.10.1-py3-none-win_amd64.whl
     python-3.7.9-amd64.exe
 )
 [ $w7x32 ] && fns+=(
     windows6.1-kb2533623-x86.msu
-    pyinstaller-5.8.0-py3-none-win32.whl
+    pyinstaller-5.10.1-py3-none-win32.whl
     python-3.7.9.exe
 )
 dl() { curl -fkLOC- "$1" && return 0; echo "$1"; return 1; }

@@ -16,7 +16,9 @@ cat $f | awk '
 	h=0
 	};
 	};
-/^#/{s=1;pr()} /^#* *(install on android|dev env setup|just the sfx|complete release|optional gpl stuff)|`$/{s=0}
+/^#/{s=1;rs=0;pr()}
+/^#* *(nix package)/{rs=1}
+/^#* *(install on android|dev env setup|just the sfx|complete release|optional gpl stuff|nixos module)|`$/{s=rs}
 /^#/{
 	lv=length($1);
 	sub(/[^ ]+ /,"");