Compare commits

86 commits:

42099baeff 2459965ca8 6acf436573 f217e1ce71 418000aee3 dbbba9625b 397bc92fbc 6e615dcd03
9ac5908b33 50912480b9 24b9b8319d b0f4f0b653 05bbd41c4b 8f5f8a3cda c8938fc033 1550350e05
5cc190c026 d6a0a738ce f5fe3678ee f2a7925387 fa953ced52 f0000d9861 4e67516719 29db7a6270
852499e296 f1775fd51c 4bb306932a 2a37e81bd8 6a312ca856 e7f3e475a2 854ba0ec06 209b49d771
949baae539 5f4ea27586 099cc97247 592b7d6315 0880bf55a1 4cbffec0ec cc355417d4 e2bc573e61
41c0376177 c01cad091e eb349f339c 24d8caaf3e 5ac2c20959 bb72e6bf30 d8142e866a 7b7979fd61
749616d09d 5485c6d7ca b7aea38d77 0ecd9f99e6 ca04a00662 8a09601be8 1fe0d4693e bba8a3c6bc
e3d7f0c7d5 be7bb71bbc e0c4829ec6 5af1575329 884f966b86 f6c6fbc223 b0cc396bca ae463518f6
2be2e9a0d8 e405fddf74 c269b0dd91 8c3211263a bf04e7c089 c7c6e48b1a 974ca773be 9270c2df19
b39ff92f34 7454167f78 5ceb3a962f 52bd5642da c39c93725f d00f0b9fa7 01cfc70982 e6aec189bd
c98fff1647 0009e31bd3 db95e880b2 e69fea4a59 4360800a6e b179e2b031
.github/pull_request_template.md (vendored, new file, 2 lines)
@@ -0,0 +1,2 @@
+Please include the following text somewhere in this PR description:
+This PR complies with the DCO; https://developercertificate.org/
.gitignore (vendored, 6 lines changed)
@@ -21,6 +21,9 @@ copyparty.egg-info/
 # winmerge
 *.bak
 
+# apple pls
+.DS_Store
+
 # derived
 copyparty/res/COPYING.txt
 copyparty/web/deps/
@@ -34,3 +37,6 @@ up.*.txt
 .hist/
 scripts/docker/*.out
 scripts/docker/*.err
+
+# nix build output link
+result
README.md (288 lines changed)
@@ -1,29 +1,16 @@
-# ⇆🎉 copyparty
+# 💾🎉 copyparty
 
-* portable file sharing hub (py2/py3) [(on PyPI)](https://pypi.org/project/copyparty/)
-* MIT-Licensed, 2019-05-26, ed @ irc.rizon.net
-
-## summary
-
-turn your phone or raspi into a portable file server with resumable uploads/downloads using *any* web browser
-
-* server only needs Python (`2.7` or `3.3+`), all dependencies optional
-* browse/upload with [IE4](#browser-support) / netscape4.0 on win3.11 (heh)
-* protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
-
-try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
+turn almost any device into a file server with resumable uploads/downloads using [*any*](#browser-support) web browser
+
+* server only needs Python (2 or 3), all dependencies optional
+* 🔌 protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
+* 📱 [android app](#android-app) // [iPhone shortcuts](#ios-shortcuts)
+
+👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
 
 📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
 
+
+## get the app
+
+<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
+
+(the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet)
+
 
 ## readme toc
 
 * top
@@ -52,6 +39,8 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
 * [self-destruct](#self-destruct) - uploads can be given a lifetime
 * [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
 * [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
+* [media player](#media-player) - plays almost every audio format there is
+* [audio equalizer](#audio-equalizer) - bass boosted
 * [markdown viewer](#markdown-viewer) - and there are *two* editors
 * [other tricks](#other-tricks)
 * [searching](#searching) - search by size, date, path/name, mp3-tags, ...
@@ -80,18 +69,24 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
 * [themes](#themes)
 * [complete examples](#complete-examples)
 * [reverse-proxy](#reverse-proxy) - running copyparty next to other websites
+* [nix package](#nix-package) - `nix profile install github:9001/copyparty`
+* [nixos module](#nixos-module)
 * [browser support](#browser-support) - TLDR: yes
 * [client examples](#client-examples) - interact with copyparty using non-browser clients
+* [folder sync](#folder-sync) - sync folders to/from copyparty
 * [mount as drive](#mount-as-drive) - a remote copyparty server as a local filesystem
+* [android app](#android-app) - upload to copyparty with one tap
+* [iOS shortcuts](#iOS-shortcuts) - there is no iPhone app, but
 * [performance](#performance) - defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
 * [client-side](#client-side) - when uploading files
 * [security](#security) - some notes on hardening
 * [gotchas](#gotchas) - behavior that might be unexpected
 * [cors](#cors) - cross-site request config
+* [https](#https) - both HTTP and HTTPS are accepted
 * [recovering from crashes](#recovering-from-crashes)
 * [client crashes](#client-crashes)
 * [frefox wsod](#frefox-wsod) - firefox 87 can crash during uploads
-* [HTTP API](#HTTP-API) - see [devnotes](#./docs/devnotes.md#http-api)
+* [HTTP API](#HTTP-API) - see [devnotes](./docs/devnotes.md#http-api)
 * [dependencies](#dependencies) - mandatory deps
 * [optional dependencies](#optional-dependencies) - install these to enable bonus features
 * [optional gpl stuff](#optional-gpl-stuff)
@@ -108,6 +103,8 @@ just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/
 
 * or install through pypi (python3 only): `python3 -m pip install --user -U copyparty`
 * or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
+* or [install through nix](#nix-package), or [on NixOS](#nixos-module)
+* or if you are on android, [install copyparty in termux](#install-on-android)
 * or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too
 * docker has all deps built-in, so skip this step:
 
@@ -126,6 +123,8 @@ enable thumbnails (images/audio/video), media indexing, and audio transcoding by
 
 running copyparty without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes)
 
+or see [complete windows example](./docs/examples/windows.md)
+
 some recommended options:
 * `-e2dsa` enables general [file indexing](#file-indexing)
 * `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen)
@@ -138,10 +137,11 @@ some recommended options:
 
 you may also want these, especially on servers:
 
-* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service
+* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service (see guide inside)
 * [contrib/systemd/prisonparty.service](contrib/systemd/prisonparty.service) to run it in a chroot (for extra security)
 * [contrib/rc/copyparty](contrib/rc/copyparty) to run copyparty on FreeBSD
 * [contrib/nginx/copyparty.conf](contrib/nginx/copyparty.conf) to [reverse-proxy](#reverse-proxy) behind nginx (for better https)
+* [nixos module](#nixos-module) to run copyparty on NixOS hosts
 
 and remember to open the ports you want; here's a complete example including every feature copyparty has to offer:
 ```
@@ -165,14 +165,18 @@ firewall-cmd --reload
 * ☑ [smb/cifs server](#smb-server)
 * ☑ [qr-code](#qr-code) for quick access
 * ☑ [upnp / zeroconf / mdns / ssdp](#zeroconf)
+* ☑ [event hooks](#event-hooks) / script runner
+* ☑ [reverse-proxy support](https://github.com/9001/copyparty#reverse-proxy)
 * upload
 * ☑ basic: plain multipart, ie6 support
 * ☑ [up2k](#uploading): js, resumable, multithreaded
 * unaffected by cloudflare's max-upload-size (100 MiB)
 * ☑ stash: simple PUT filedropper
+* ☑ filename randomizer
+* ☑ write-only folders
 * ☑ [unpost](#unpost): undo/delete accidental uploads
 * ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
-* ☑ symlink/discard existing files (content-matching)
+* ☑ symlink/discard duplicates (content-matching)
 * download
 * ☑ single files in browser
 * ☑ [folders as zip / tar files](#zip-downloads)
@@ -193,10 +197,15 @@ firewall-cmd --reload
 * ☑ [locate files by contents](#file-search)
 * ☑ search by name/path/date/size
 * ☑ [search by ID3-tags etc.](#searching)
+* client support
+* ☑ [folder sync](#folder-sync)
+* ☑ [curl-friendly](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png)
 * markdown
 * ☑ [viewer](#markdown-viewer)
 * ☑ editor (sure why not)
 
+PS: something missing? post any crazy ideas you've got as a [feature request](https://github.com/9001/copyparty/issues/new?assignees=9001&labels=enhancement&template=feature_request.md) or [discussion](https://github.com/9001/copyparty/discussions/new?category=ideas) 🤙
 
 ## testimonials
 
@@ -240,6 +249,9 @@ browser-specific:
 server-os-specific:
 * RHEL8 / Rocky8: you can run copyparty using `/usr/libexec/platform-python`
 
+server notes:
+* pypy is supported but regular cpython is faster if you enable the database
+
 
 # bugs
 
@@ -265,7 +277,7 @@ server-os-specific:
 
 * iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
 * *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume
-* "future" because `AudioContext` is broken in the current iOS version (15.1), maybe one day...
+* "future" because `AudioContext` can't maintain a stable playback speed in the current iOS version (15.7), maybe one day...
 
 * Windows: folders cannot be accessed if the name ends with `.`
 * python or windows bug
@@ -513,11 +525,14 @@ up2k has several advantages:
 * much higher speeds than ftp/scp/tarpipe on some internet connections (mainly american ones) thanks to parallel connections
 * the last-modified timestamp of the file is preserved
 
+> it is perfectly safe to restart / upgrade copyparty while someone is uploading to it!
+> all known up2k clients will resume just fine 💪
+
 see [up2k](#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
 
 
 
-**protip:** you can avoid scaring away users with [contrib/plugins/minimal-up2k.html](contrib/plugins/minimal-up2k.html) which makes it look [much simpler](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png)
+**protip:** you can avoid scaring away users with [contrib/plugins/minimal-up2k.js](contrib/plugins/minimal-up2k.js) which makes it look [much simpler](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png)
 
 **protip:** if you enable `favicon` in the `[⚙️] settings` tab (by typing something into the textbox), the icon in the browser tab will indicate upload progress -- also, the `[🔔]` and/or `[🔊]` switches enable visible and/or audible notifications on upload completion
 
@@ -642,12 +657,63 @@ or a mix of both:
 the metadata keys you can use in the format field are the ones in the file-browser table header (whatever is collected with `-mte` and `-mtp`)
 
 
+## media player
+
+plays almost every audio format there is (if the server has FFmpeg installed for on-demand transcoding)
+
+the following audio formats are usually always playable, even without FFmpeg: `aac|flac|m4a|mp3|ogg|opus|wav`
+
+some hilights:
+* OS integration; control playback from your phone's lockscreen ([windows](https://user-images.githubusercontent.com/241032/233213022-298a98ba-721a-4cf1-a3d4-f62634bc53d5.png) // [iOS](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) // [android](https://user-images.githubusercontent.com/241032/233212311-a7368590-08c7-4f9f-a1af-48ccf3f36fad.png))
+* shows the audio waveform in the seekbar
+* not perfectly gapless but can get really close (see settings + eq below); good enough to enjoy gapless albums as intended
+
+click the `play` link next to an audio file, or copy the link target to [share it](https://a.ocv.me/pub/demo/music/Ubiktune%20-%20SOUNDSHOCK%202%20-%20FM%20FUNK%20TERRROR!!/#af-1fbfba61&t=18) (optionally with a timestamp to start playing from, like that example does)
+
+open the `[🎺]` media-player-settings tab to configure it,
+* switches:
+* `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
+* `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
+* `[~s]` toggles the seekbar waveform display
+* `[/np]` enables buttons to copy the now-playing info as an irc message
+* `[os-ctl]` makes it possible to control audio playback from the lockscreen of your device (enables [mediasession](https://developer.mozilla.org/en-US/docs/Web/API/MediaSession))
+* `[seek]` allows seeking with lockscreen controls (buggy on some devices)
+* `[art]` shows album art on the lockscreen
+* `[🎯]` keeps the playing song scrolled into view (good when using the player as a taskbar dock)
+* `[⟎]` shrinks the playback controls
+* playback mode:
+* `[loop]` keeps looping the folder
+* `[next]` plays into the next folder
+* transcode:
+* `[flac]` convers `flac` and `wav` files into opus
+* `[aac]` converts `aac` and `m4a` files into opus
+* `[oth]` converts all other known formats into opus
+* `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
+* "tint" reduces the contrast of the playback bar
+
+
+### audio equalizer
+
+bass boosted
+
+can also boost the volume in general, or increase/decrease stereo width (like [crossfeed](https://www.foobar2000.org/components/view/foo_dsp_meiercf) just worse)
+
+has the convenient side-effect of reducing the pause between songs, so gapless albums play better with the eq enabled (just make it flat)
+
+
 ## markdown viewer
 
 and there are *two* editors
 
 
 
+there is a built-in extension for inline clickable thumbnails;
+* enable it by adding `<!-- th -->` somewhere in the doc
+* add thumbnails with `!th[l](your.jpg)` where `l` means left-align (`r` = right-align)
+* a single line with `---` clears the float / inlining
+* in the case of README.md being displayed below a file listing, thumbnails will open in the gallery viewer
+
+other notes,
 * the document preview has a max-width which is the same as an A4 paper when printed
 
 
@@ -1071,6 +1137,8 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
 
 ## complete examples
 
+* [running on windows](./docs/examples/windows.md)
+
 * read-only music server
 `python copyparty-sfx.py -v /mnt/nas/music:/music:r -e2dsa -e2ts --no-robots --force-js --theme 2`
 
@@ -1086,19 +1154,125 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
 
 ## reverse-proxy
 
-running copyparty next to other websites hosted on an existing webserver such as nginx or apache
+running copyparty next to other websites hosted on an existing webserver such as nginx, caddy, or apache
 
 you can either:
 * give copyparty its own domain or subdomain (recommended)
 * or do location-based proxying, using `--rp-loc=/stuff` to tell copyparty where it is mounted -- has a slight performance cost and higher chance of bugs
 * if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below
 
+some reverse proxies (such as [Caddy](https://caddyserver.com/)) can automatically obtain a valid https/tls certificate for you, and some support HTTP/2 and QUIC which could be a nice speed boost
+
 example webserver configs:
 
 * [nginx config](contrib/nginx/copyparty.conf) -- entire domain/subdomain
 * [apache2 config](contrib/apache/copyparty.conf) -- location-based
 
 
+## nix package
+
+`nix profile install github:9001/copyparty`
+
+requires a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of nix
+
+some recommended dependencies are enabled by default; [override the package](https://github.com/9001/copyparty/blob/hovudstraum/contrib/package/nix/copyparty/default.nix#L3-L22) if you want to add/remove some features/deps
+
+`ffmpeg-full` was chosen over `ffmpeg-headless` mainly because we need `withWebp` (and `withOpenmpt` is also nice) and being able to use a cached build felt more important than optimizing for size at the time -- PRs welcome if you disagree 👍
+
+
+## nixos module
+
+for this setup, you will need a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of NixOS.
+
+```nix
+{
+  # add copyparty flake to your inputs
+  inputs.copyparty.url = "github:9001/copyparty";
+
+  # ensure that copyparty is an allowed argument to the outputs function
+  outputs = { self, nixpkgs, copyparty }: {
+    nixosConfigurations.yourHostName = nixpkgs.lib.nixosSystem {
+      modules = [
+        # load the copyparty NixOS module
+        copyparty.nixosModules.default
+        ({ pkgs, ... }: {
+          # add the copyparty overlay to expose the package to the module
+          nixpkgs.overlays = [ copyparty.overlays.default ];
+          # (optional) install the package globally
+          environment.systemPackages = [ pkgs.copyparty ];
+          # configure the copyparty module
+          services.copyparty.enable = true;
+        })
+      ];
+    };
+  };
+}
+```
+
+copyparty on NixOS is configured via `services.copyparty` options, for example:
+```nix
+services.copyparty = {
+  enable = true;
+  # directly maps to values in the [global] section of the copyparty config.
+  # see `copyparty --help` for available options
+  settings = {
+    i = "0.0.0.0";
+    # use lists to set multiple values
+    p = [ 3210 3211 ];
+    # use booleans to set binary flags
+    no-reload = true;
+    # using 'false' will do nothing and omit the value when generating a config
+    ignored-flag = false;
+  };
+
+  # create users
+  accounts = {
+    # specify the account name as the key
+    ed = {
+      # provide the path to a file containing the password, keeping it out of /nix/store
+      # must be readable by the copyparty service user
+      passwordFile = "/run/keys/copyparty/ed_password";
+    };
+    # or do both in one go
+    k.passwordFile = "/run/keys/copyparty/k_password";
+  };
+
+  # create a volume
+  volumes = {
+    # create a volume at "/" (the webroot), which will
+    "/" = {
+      # share the contents of "/srv/copyparty"
+      path = "/srv/copyparty";
+      # see `copyparty --help-accounts` for available options
+      access = {
+        # everyone gets read-access, but
+        r = "*";
+        # users "ed" and "k" get read-write
+        rw = [ "ed" "k" ];
+      };
+      # see `copyparty --help-flags` for available options
+      flags = {
+        # "fk" enables filekeys (necessary for upget permission) (4 chars long)
+        fk = 4;
+        # scan for new files every 60sec
+        scan = 60;
+        # volflag "e2d" enables the uploads database
+        e2d = true;
+        # "d2t" disables multimedia parsers (in case the uploads are malicious)
+        d2t = true;
+        # skips hashing file contents if path matches *.iso
+        nohash = "\.iso$";
+      };
+    };
+  };
+  # you may increase the open file limit for the process
+  openFilesLimit = 8192;
+};
+```
+
+the passwordFile at /run/keys/copyparty/ could for example be generated by [agenix](https://github.com/ryantm/agenix), or you could just dump it in the nix store instead if that's acceptable
 
 
 # browser support
 
 TLDR: yes
@@ -1170,7 +1344,7 @@ interact with copyparty using non-browser clients
 * `(printf 'PUT / HTTP/1.1\r\n\r\n'; cat movie.mkv) >/dev/tcp/127.0.0.1/3923`
 
 * python: [up2k.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) is a command-line up2k client [(webm)](https://ocv.me/stuff/u2cli.webm)
-* file uploads, file-search, folder sync, autoresume of aborted/broken uploads
+* file uploads, file-search, [folder sync](#folder-sync), autoresume of aborted/broken uploads
 * can be downloaded from copyparty: controlpanel -> connect -> [up2k.py](http://127.0.0.1:3923/.cpr/a/up2k.py)
 * see [./bin/README.md#up2kpy](bin/README.md#up2kpy)
 
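(Aside, not part of the patch: the `PUT` one-liner in the hunk above maps directly onto a few lines of Python. This is a minimal sketch only; the address, filename, and `hunter2` password are placeholders, and the `PW` header is the password mechanism mentioned in the next hunk's context.)

```python
# sketch of copyparty's "simple PUT filedropper" upload, assuming a local
# instance on 127.0.0.1:3923 with a writable root -- all values are placeholders
import urllib.request

with open("movie.mkv", "rb") as f:
    req = urllib.request.Request(
        "http://127.0.0.1:3923/movie.mkv",  # PUT to a full path to pick the filename yourself
        data=f.read(),                      # fine for a sketch; stream in chunks for huge files
        method="PUT",
        headers={"PW": "hunter2"},          # password header, as described below
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode("utf-8", "replace"))
```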
@@ -1191,17 +1365,27 @@ you can provide passwords using header `PW: hunter2`, cookie `cppwd=hunter2`, ur
 NOTE: curl will not send the original filename if you use `-T` combined with url-params! Also, make sure to always leave a trailing slash in URLs unless you want to override the filename
 
 
+## folder sync
+
+sync folders to/from copyparty
+
+the commandline uploader [up2k.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy) with `--dr` is the best way to sync a folder to copyparty; verifies checksums and does files in parallel, and deletes unexpected files on the server after upload has finished which makes file-renames really cheap (it'll rename serverside and skip uploading)
+
+alternatively there is [rclone](./docs/rclone.md) which allows for bidirectional sync and is *way* more flexible (stream files straight from sftp/s3/gcs to copyparty, ...), although there is no integrity check and it won't work with files over 100 MiB if copyparty is behind cloudflare
+
+* starting from rclone v1.63 (currently [in beta](https://beta.rclone.org/?filter=latest)), rclone will also be faster than up2k.py
+
+
 ## mount as drive
 
 a remote copyparty server as a local filesystem; go to the control-panel and click `connect` to see a list of commands to do that
 
 alternatively, some alternatives roughly sorted by speed (unreproducible benchmark), best first:
 
-* [rclone-http](./docs/rclone.md) (25s), read-only
+* [rclone-webdav](./docs/rclone.md) (25s), read/WRITE ([v1.63-beta](https://beta.rclone.org/?filter=latest))
+* [rclone-http](./docs/rclone.md) (26s), read-only
+* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
 * [rclone-ftp](./docs/rclone.md) (47s), read/WRITE
-* [rclone-webdav](./docs/rclone.md) (51s), read/WRITE
-* copyparty-1.5.0's webdav server is faster than rclone-1.60.0 (69s)
-* [partyfuse.py](./bin/#partyfusepy) (71s), read-only
 * davfs2 (103s), read/WRITE, *very fast* on small files
 * [win10-webdav](#webdav-server) (138s), read/WRITE
 * [win10-smb2](#smb-server) (387s), read/WRITE
@@ -1209,6 +1393,27 @@ alternatively, some alternatives roughly sorted by speed (unreproducible benchma
 most clients will fail to mount the root of a copyparty server unless there is a root volume (so you get the admin-panel instead of a browser when accessing it) -- in that case, mount a specific volume instead
 
 
+# android app
+
+upload to copyparty with one tap
+
+<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
+
+the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet
+
+if you want to run the copyparty server on your android device, see [install on android](#install-on-android)
+
+
+# iOS shortcuts
+
+there is no iPhone app, but the following shortcuts are almost as good:
+
+* [upload to copyparty](https://www.icloud.com/shortcuts/41e98dd985cb4d3bb433222bc1e9e770) ([offline](https://github.com/9001/copyparty/raw/hovudstraum/contrib/ios/upload-to-copyparty.shortcut)) ([png](https://user-images.githubusercontent.com/241032/226118053-78623554-b0ed-482e-98e4-6d57ada58ea4.png)) based on the [original](https://www.icloud.com/shortcuts/ab415d5b4de3467b9ce6f151b439a5d7) by [Daedren](https://github.com/Daedren) (thx!)
+* can strip exif, upload files, pics, vids, links, clipboard
+* can download links and rehost the target file on copyparty (see first comment inside the shortcut)
+* pics become lowres if you share from gallery to shortcut, so better to launch the shortcut and pick stuff from there
+
+
 # performance
 
 defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
@@ -1220,11 +1425,13 @@ below are some tweaks roughly ordered by usefulness:
 * `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
 * `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
 * `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
-* `-j` enables multiprocessing (actual multithreading) and can make copyparty perform better in cpu-intensive workloads, for example:
+* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
-* huge amount of short-lived connections
+* lots of connections (many users or heavy clients)
 * simultaneous downloads and uploads saturating a 20gbps connection
 
 ...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
+* using [pypy](https://www.pypy.org/) instead of [cpython](https://www.python.org/) *can* be 70% faster for some workloads, but slower for many others
+* and pypy can sometimes crash on startup with `-j0` (TODO make issue)
 
 
 ## client-side
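(Aside, not part of the patch: a quick sanity check of the `20+80/numCores` latency estimate quoted in the hunk above, as plain arithmetic.)

```python
# worst-case request latency with -j0, relative to a single-process copyparty,
# using the 20+80/numCores percent estimate from the README hunk above
for cores in (1, 2, 4, 8, 16):
    print(f"{cores:>2} cores -> {20 + 80 / cores:5.1f} % of single-process latency")
```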
@@ -1304,6 +1511,13 @@ by default, except for `GET` and `HEAD` operations, all requests must either:
 cors can be configured with `--acao` and `--acam`, or the protections entirely disabled with `--allow-csrf`
 
 
+## https
+
+both HTTP and HTTPS are accepted by default, but letting a [reverse proxy](#reverse-proxy) handle the https/tls/ssl would be better (probably more secure by default)
+
+copyparty doesn't speak HTTP/2 or QUIC, so using a reverse proxy would solve that as well
+
+
 # recovering from crashes
 
 ## client crashes
@@ -1326,7 +1540,7 @@ however you can hit `F12` in the up2k tab and use the devtools to see how far yo
 
 # HTTP API
 
-see [devnotes](#./docs/devnotes.md#http-api)
+see [devnotes](./docs/devnotes.md#http-api)
 
 
 # dependencies
@@ -1371,14 +1585,14 @@ these are standalone programs and will never be imported / evaluated by copypart
 
 the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
 
-you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](#./docs/devnotes.md#sfx-repack)
+you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](./docs/devnotes.md#sfx-repack)
 
 
 ## copyparty.exe
 
 download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (win8+) or [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) (win7+)
 
-
+
 
 can be convenient on machines where installing python is problematic, however is **not recommended** -- if possible, please use **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** instead
 
@@ -1386,9 +1600,9 @@ can be convenient on machines where installing python is problematic, however is
 
 * on win8 it needs [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145), on win10 it just works
 
-* dangerous: [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is compatible with windows7, which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and should never be exposed to the internet (LAN is fine)
+* dangerous: [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is compatible with [windows7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png), which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and should never be exposed to the internet (LAN is fine)
 
-* dangerous and deprecated: [copyparty64.exe](https://github.com/9001/copyparty/releases/download/v1.6.5/copyparty64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
+* dangerous and deprecated: [copyparty-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
 
 meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) instead relies on your system python which gives better performance and will stay safe as long as you keep your python install up-to-date
 
@@ -10,6 +10,7 @@ run copyparty with `--help-hooks` for usage details / hook type explanations (xb
 # after upload
 * [notify.py](notify.py) shows a desktop notification ([example](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png))
 * [notify2.py](notify2.py) uses the json API to show more context
+* [image-noexif.py](image-noexif.py) removes image exif by overwriting / directly editing the uploaded file
 * [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
 * [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable
 
@@ -13,9 +13,15 @@ example usage as global config:
 --xau f,t5,j,bin/hooks/discord-announce.py
 
 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xau=f,t5,j,bin/hooks/discord-announce.py
+-v srv/inc:inc:r:rw,ed:c,xau=f,t5,j,bin/hooks/discord-announce.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)
 
 parameters explained,
+xbu = execute after upload
 f = fork; don't wait for it to finish
 t5 = timeout if it's still running after 5 sec
 j = provide upload information as json; not just the filename
@@ -30,6 +36,7 @@ then use this to design your message: https://discohook.org/
 
 def main():
     WEBHOOK = "https://discord.com/api/webhooks/1234/base64"
+    WEBHOOK = "https://discord.com/api/webhooks/1066830390280597718/M1TDD110hQA-meRLMRhdurych8iyG35LDoI1YhzbrjGP--BXNZodZFczNVwK4Ce7Yme5"
 
     # read info from copyparty
     inf = json.loads(sys.argv[1])
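(Aside, not part of the patch: these hunks show the general hook contract, namely that copyparty runs the script with the upload info in `argv[1]`, JSON-encoded when the `j` flag is set. A minimal, hypothetical hook along those lines could look like the sketch below; the `"ap"` field name is an assumption, check `--help-hooks` for the real field names.)

```python
#!/usr/bin/env python3
# minimal xau (execute-after-upload) hook sketch; expects the "j" flag so that
# argv[1] is a JSON blob describing the upload, as in discord-announce.py above
import json
import sys


def main():
    inf = json.loads(sys.argv[1])        # upload info from copyparty
    print("new upload:", inf.get("ap"))  # "ap" (absolute path) is assumed; see --help-hooks


if __name__ == "__main__":
    main()
```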
bin/hooks/image-noexif.py (new executable file, 72 lines)
@@ -0,0 +1,72 @@
+#!/usr/bin/env python3
+
+import os
+import sys
+import subprocess as sp
+
+
+_ = r"""
+remove exif tags from uploaded images; the eventhook edition of
+https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py
+
+dependencies:
+exiftool / perl-Image-ExifTool
+
+being an upload hook, this will take effect after upload completion
+but before copyparty has hashed/indexed the file, which means that
+copyparty will never index the original file, so deduplication will
+not work as expected... which is mostly OK but ehhh
+
+note: modifies the file in-place, so don't set the `f` (fork) flag
+
+example usages; either as global config (all volumes) or as volflag:
+--xau bin/hooks/image-noexif.py
+-v srv/inc:inc:r:rw,ed:c,xau=bin/hooks/image-noexif.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+explained:
+share fs-path srv/inc at /inc (readable by all, read-write for user ed)
+running this xau (execute-after-upload) plugin for all uploaded files
+"""
+
+
+# filetypes to process; ignores everything else
+EXTS = ("jpg", "jpeg", "avif", "heif", "heic")
+
+
+try:
+    from copyparty.util import fsenc
+except:
+
+    def fsenc(p):
+        return p.encode("utf-8")
+
+
+def main():
+    fp = sys.argv[1]
+    ext = fp.lower().split(".")[-1]
+    if ext not in EXTS:
+        return
+
+    cwd, fn = os.path.split(fp)
+    os.chdir(cwd)
+    f1 = fsenc(fn)
+    cmd = [
+        b"exiftool",
+        b"-exif:all=",
+        b"-iptc:all=",
+        b"-xmp:all=",
+        b"-P",
+        b"-overwrite_original",
+        b"--",
+        f1,
+    ]
+    sp.check_output(cmd)
+    print("image-noexif: stripped")
+
+
+if __name__ == "__main__":
+    try:
+        main()
+    except:
+        pass
@@ -17,8 +17,12 @@ depdencies:
 
 example usages; either as global config (all volumes) or as volflag:
 --xau f,bin/hooks/notify.py
--v srv/inc:inc:c,xau=f,bin/hooks/notify.py
+-v srv/inc:inc:r:rw,ed:c,xau=f,bin/hooks/notify.py
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)
 
 parameters explained,
 xau = execute after upload
@@ -15,9 +15,13 @@ and also supports --xm (notify on 📟 message)
 example usages; either as global config (all volumes) or as volflag:
 --xm f,j,bin/hooks/notify2.py
 --xau f,j,bin/hooks/notify2.py
--v srv/inc:inc:c,xm=f,j,bin/hooks/notify2.py
+-v srv/inc:inc:r:rw,ed:c,xm=f,j,bin/hooks/notify2.py
--v srv/inc:inc:c,xau=f,j,bin/hooks/notify2.py
+-v srv/inc:inc:r:rw,ed:c,xau=f,j,bin/hooks/notify2.py
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads / msgs with the params listed below)
 
 parameters explained,
 xau = execute after upload
@@ -10,7 +10,12 @@ example usage as global config:
 --xbu c,bin/hooks/reject-extension.py
 
 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xbu=c,bin/hooks/reject-extension.py
+-v srv/inc:inc:r:rw,ed:c,xbu=c,bin/hooks/reject-extension.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)
 
 parameters explained,
 xbu = execute before upload
@@ -17,7 +17,12 @@ example usage as global config:
 --xau c,bin/hooks/reject-mimetype.py
 
 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xau=c,bin/hooks/reject-mimetype.py
+-v srv/inc:inc:r:rw,ed:c,xau=c,bin/hooks/reject-mimetype.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all uploads with the params listed below)
 
 parameters explained,
 xau = execute after upload
@@ -15,9 +15,15 @@ example usage as global config:
 --xm f,j,t3600,bin/hooks/wget.py
 
 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xm=f,j,t3600,bin/hooks/wget.py
+-v srv/inc:inc:r:rw,ed:c,xm=f,j,t3600,bin/hooks/wget.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on all messages with the params listed below)
 
 parameters explained,
+xm = execute on message-to-server-log
 f = fork so it doesn't block uploads
 j = provide message information as json; not just the text
 c3 = mute all output
@@ -18,7 +18,12 @@ example usage as global config:
 --xiu i5,j,bin/hooks/xiu-sha.py
 
 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xiu=i5,j,bin/hooks/xiu-sha.py
+-v srv/inc:inc:r:rw,ed:c,xiu=i5,j,bin/hooks/xiu-sha.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on batches of uploads with the params listed below)
 
 parameters explained,
 xiu = execute after uploads...
@@ -15,7 +15,12 @@ example usage as global config:
 --xiu i1,j,bin/hooks/xiu.py
 
 example usage as a volflag (per-volume config):
--v srv/inc:inc:c,xiu=i1,j,bin/hooks/xiu.py
+-v srv/inc:inc:r:rw,ed:c,xiu=i1,j,bin/hooks/xiu.py
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+(share filesystem-path srv/inc as volume /inc,
+readable by everyone, read-write for user 'ed',
+running this plugin on batches of uploads with the params listed below)
 
 parameters explained,
 xiu = execute after uploads...
@@ -31,7 +31,7 @@ run [`install-deps.sh`](install-deps.sh) to build/install most dependencies requ
 *alternatively* (or preferably) use packages from your distro instead, then you'll need at least these:
 
 * from distro: `numpy vamp-plugin-sdk beatroot-vamp mixxx-keyfinder ffmpeg`
-* from pypy: `keyfinder vamp`
+* from pip: `keyfinder vamp`
 
 
 # usage from copyparty
@@ -16,6 +16,10 @@ dep: ffmpeg
 """
 
 
+# save beat timestamps to ".beats/filename.txt"
+SAVE = False
+
+
 def det(tf):
     # fmt: off
     sp.check_call([
@@ -23,12 +27,11 @@ def det(tf):
         b"-nostdin",
         b"-hide_banner",
         b"-v", b"fatal",
-        b"-ss", b"13",
         b"-y", b"-i", fsenc(sys.argv[1]),
         b"-map", b"0:a:0",
         b"-ac", b"1",
         b"-ar", b"22050",
-        b"-t", b"300",
+        b"-t", b"360",
        b"-f", b"f32le",
         fsenc(tf)
     ])
@@ -47,10 +50,29 @@ def det(tf):
         print(c["list"][0]["label"].split(" ")[0])
         return

     # throws if detection failed:
-    bpm = float(cl[-1]["timestamp"] - cl[1]["timestamp"])
-    bpm = round(60 * ((len(cl) - 1) / bpm), 2)
-    print(f"{bpm:.2f}")
+    beats = [float(x["timestamp"]) for x in cl]
+    bds = [b - a for a, b in zip(beats, beats[1:])]
+    bds.sort()
+    n0 = int(len(bds) * 0.2)
+    n1 = int(len(bds) * 0.75) + 1
+    bds = bds[n0:n1]
+    bpm = sum(bds)
+    bpm = round(60 * (len(bds) / bpm), 2)
+    print(f"{bpm:.2f}")
+
+    if SAVE:
+        fdir, fname = os.path.split(sys.argv[1])
+        bdir = os.path.join(fdir, ".beats")
+        try:
+            os.mkdir(fsenc(bdir))
+        except:
+            pass
+
+        fp = os.path.join(bdir, fname) + ".txt"
+        with open(fsenc(fp), "wb") as f:
+            txt = "\n".join([f"{x:.2f}" for x in beats])
+            f.write(txt.encode("utf-8"))


 def main():
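In other words, the BPM is no longer taken from the first-to-last beat distance; instead the gaps between consecutive beats are sorted and only the middle band (roughly the 20th to 75th percentile) is averaged, which makes a few missed or doubled beats much less damaging. A rough standalone illustration of that math, using made-up timestamps instead of vamp output:

# illustrative only -- same trimmed-mean idea as the diff above
beats = [0.5, 1.0, 1.52, 2.0, 2.49, 3.0, 3.51, 4.0]

bds = [b - a for a, b in zip(beats, beats[1:])]  # inter-beat gaps
bds.sort()
n0 = int(len(bds) * 0.2)       # drop the shortest ~20%
n1 = int(len(bds) * 0.75) + 1  # and the longest ~25%
bds = bds[n0:n1]
bpm = round(60 * (len(bds) / sum(bds)), 2)
print(f"{bpm:.2f}")  # 120.00 for this input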
@@ -4,8 +4,9 @@ set -e
 # runs copyparty (or any other program really) in a chroot
 #
 # assumption: these directories, and everything within, are owned by root
-sysdirs=( /bin /lib /lib32 /lib64 /sbin /usr /etc/alternatives )
+sysdirs=(); for v in /bin /lib /lib32 /lib64 /sbin /usr /etc/alternatives ; do
+	[ -e $v ] && sysdirs+=($v)
+done

 # error-handler
 help() { cat <<'EOF'
@@ -38,7 +39,7 @@ while true; do
 	v="$1"; shift
 	[ "$v" = -- ] && break  # end of volumes
 	[ "$#" -eq 0 ] && break  # invalid usage
-	vols+=( "$(realpath "$v")" )
+	vols+=( "$(realpath "$v" || echo "$v")" )
 done
 pybin="$1"; shift
 pybin="$(command -v "$pybin")"
@@ -82,7 +83,7 @@ jail="${jail%/}"
 printf '%s\n' "${sysdirs[@]}" "${vols[@]}" | sed -r 's`/$``' | LC_ALL=C sort | uniq |
 while IFS= read -r v; do
 	[ -e "$v" ] || {
-		# printf '\033[1;31mfolder does not exist:\033[0m %s\n' "/$v"
+		printf '\033[1;31mfolder does not exist:\033[0m %s\n' "$v"
 		continue
 	}
 	i1=$(stat -c%D.%i "$v" 2>/dev/null || echo a)
@@ -117,6 +118,15 @@ mkdir -p "$jail/tmp"
 chmod 777 "$jail/tmp"


+# create a dev
+(cd $jail; mkdir -p dev; cd dev
+[ -e null    ] || mknod -m 666 null    c 1 3
+[ -e zero    ] || mknod -m 666 zero    c 1 5
+[ -e random  ] || mknod -m 444 random  c 1 8
+[ -e urandom ] || mknod -m 444 urandom c 1 9
+)
+
+
 # run copyparty
 export HOME=$(getent passwd $uid | cut -d: -f6)
 export USER=$(getent passwd $uid | cut -d: -f1)
bin/up2k.py (123 lines changed)

@@ -1,9 +1,12 @@
 #!/usr/bin/env python3
 from __future__ import print_function, unicode_literals

+S_VERSION = "1.6"
+S_BUILD_DT = "2023-04-20"
+
 """
 up2k.py: upload to copyparty
-2023-01-13, v1.2, ed <irc.rizon.net>, MIT-Licensed
+2021, ed <irc.rizon.net>, MIT-Licensed
 https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py

 - dependencies: requests
@@ -24,6 +27,8 @@ import platform
 import threading
 import datetime

+EXE = sys.executable.endswith("exe")
+
 try:
     import argparse
 except:
@@ -34,7 +39,9 @@ except:
 try:
     import requests
 except ImportError:
-    if sys.version_info > (2, 7):
+    if EXE:
+        raise
+    elif sys.version_info > (2, 7):
         m = "\nERROR: need 'requests'; please run this command:\n {0} -m pip install --user requests\n"
     else:
         m = "requests/2.18.4 urllib3/1.23 chardet/3.0.4 certifi/2020.4.5.1 idna/2.7"
@@ -245,7 +252,13 @@ def eprint(*a, **ka):


 def flushing_print(*a, **ka):
-    _print(*a, **ka)
+    try:
+        _print(*a, **ka)
+    except:
+        v = " ".join(str(x) for x in a)
+        v = v.encode("ascii", "replace").decode("ascii")
+        _print(v, **ka)
+
     if "flush" not in ka:
         sys.stdout.flush()

@@ -372,6 +385,23 @@ def walkdir(err, top, seen):
 def walkdirs(err, tops):
     """recursive statdir for a list of tops, yields [top, relpath, stat]"""
     sep = "{0}".format(os.sep).encode("ascii")
+    if not VT100:
+        za = []
+        for td in tops:
+            try:
+                ap = os.path.abspath(os.path.realpath(td))
+                if td[-1:] in (b"\\", b"/"):
+                    ap += sep
+            except:
+                # maybe cpython #88013 (ok)
+                ap = td
+
+            za.append(ap)
+
+        za = [x if x.startswith(b"\\\\") else b"\\\\?\\" + x for x in za]
+        za = [x.replace(b"/", b"\\") for x in za]
+        tops = za
+
     for top in tops:
         isdir = os.path.isdir(top)
         if top[-1:] == sep:
@@ -520,7 +550,11 @@ def handshake(ar, file, search):
         except Exception as ex:
             em = str(ex).split("SSLError(")[-1].split("\nURL: ")[0].strip()

-        if sc == 422 or "<pre>partial upload exists at a different" in txt:
+        if (
+            sc == 422
+            or "<pre>partial upload exists at a different" in txt
+            or "<pre>source file busy; please try again" in txt
+        ):
             file.recheck = True
             return [], False
         elif sc == 409 or "<pre>upload rejected, file already exists" in txt:
@@ -552,8 +586,8 @@ def handshake(ar, file, search):
     return r["hash"], r["sprs"]


-def upload(file, cid, pw):
-    # type: (File, str, str) -> None
+def upload(file, cid, pw, stats):
+    # type: (File, str, str, str) -> None
     """upload one specific chunk, `cid` (a chunk-hash)"""

     headers = {
@@ -561,6 +595,10 @@ def upload(file, cid, pw):
         "X-Up2k-Wark": file.wark,
         "Content-Type": "application/octet-stream",
     }
+
+    if stats:
+        headers["X-Up2k-Stat"] = stats
+
     if pw:
         headers["Cookie"] = "=".join(["cppwd", pw])

@@ -615,6 +653,7 @@ class Ctl(object):
         return nfiles, nbytes

     def __init__(self, ar, stats=None):
+        self.ok = False
         self.ar = ar
         self.stats = stats or self._scan()
         if not self.stats:
@@ -629,6 +668,8 @@ class Ctl(object):
         req_ses.verify = ar.te

         self.filegen = walkdirs([], ar.files)
+        self.recheck = []  # type: list[File]
+
         if ar.safe:
             self._safe()
         else:
@@ -647,11 +688,11 @@ class Ctl(object):
         self.t0 = time.time()
         self.t0_up = None
         self.spd = None
+        self.eta = "99:99:99"

         self.mutex = threading.Lock()
         self.q_handshake = Queue()  # type: Queue[File]
         self.q_upload = Queue()  # type: Queue[tuple[File, str]]
-        self.recheck = []  # type: list[File]

         self.st_hash = [None, "(idle, starting...)"]  # type: tuple[File, int]
         self.st_up = [None, "(idle, starting...)"]  # type: tuple[File, int]
@@ -660,6 +701,8 @@ class Ctl(object):

             self._fancy()

+        self.ok = True
+
     def _safe(self):
         """minimal basic slow boring fallback codepath"""
         search = self.ar.s
@@ -693,7 +736,8 @@ class Ctl(object):
                 ncs = len(hs)
                 for nc, cid in enumerate(hs):
                     print("  {0} up {1}".format(ncs - nc, cid))
-                    upload(file, cid, self.ar.a)
+                    stats = "{0}/0/0/{1}".format(nf, self.nfiles - nf)
+                    upload(file, cid, self.ar.a, stats)

                 print("  ok!")
                 if file.recheck:
@@ -768,12 +812,12 @@ class Ctl(object):
             eta = (self.nbytes - self.up_b) / (spd + 1)

             spd = humansize(spd)
-            eta = str(datetime.timedelta(seconds=int(eta)))
+            self.eta = str(datetime.timedelta(seconds=int(eta)))
             sleft = humansize(self.nbytes - self.up_b)
             nleft = self.nfiles - self.up_f
             tail = "\033[K\033[u" if VT100 and not self.ar.ns else "\r"

-            t = "{0} eta @ {1}/s, {2}, {3}# left".format(eta, spd, sleft, nleft)
+            t = "{0} eta @ {1}/s, {2}, {3}# left".format(self.eta, spd, sleft, nleft)
             eprint(txt + "\033]0;{0}\033\\\r{0}{1}".format(t, tail))

         if not self.recheck:
@@ -811,7 +855,7 @@ class Ctl(object):
             zb += quotep(rd.replace(b"\\", b"/"))
             r = req_ses.get(zb + b"?ls&dots", headers=headers)
             if not r:
-                raise Exception("HTTP {}".format(r.status_code))
+                raise Exception("HTTP {0}".format(r.status_code))

             j = r.json()
             for f in j["dirs"] + j["files"]:
@@ -886,6 +930,9 @@ class Ctl(object):
                 self.handshaker_busy += 1

             upath = file.abs.decode("utf-8", "replace")
+            if not VT100:
+                upath = upath[4:]
+
             hs, sprs = handshake(self.ar, file, search)
             if search:
                 if hs:
@@ -951,11 +998,23 @@ class Ctl(object):
                 self.uploader_busy += 1
                 self.t0_up = self.t0_up or time.time()

+            zs = "{0}/{1}/{2}/{3} {4}/{5} {6}"
+            stats = zs.format(
+                self.up_f,
+                len(self.recheck),
+                self.uploader_busy,
+                self.nfiles - self.up_f,
+                int(self.nbytes / (1024 * 1024)),
+                int((self.nbytes - self.up_b) / (1024 * 1024)),
+                self.eta,
+            )
+
             file, cid = task
             try:
-                upload(file, cid, self.ar.a)
-            except:
-                eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
+                upload(file, cid, self.ar.a, stats)
+            except Exception as ex:
+                t = "upload failed, retrying: {0} #{1} ({2})\n"
+                eprint(t.format(file.name, cid[:8], ex))
                 # handshake will fix it

             with self.mutex:
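With those fields filled in, the X-Up2k-Stat header value ends up looking roughly like `3/0/2/45 1024/800 0:02:11` -- finished/recheck/busy/remaining file counts, then total/remaining MiB, then the ETA. These numbers are purely illustrative; only the field order is taken from the format string above.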
@@ -989,8 +1048,15 @@ def main():
     cores = (os.cpu_count() if hasattr(os, "cpu_count") else 0) or 2
     hcores = min(cores, 3)  # 4% faster than 4+ on py3.9 @ r5-4500U

+    ver = "{0}, v{1}".format(S_BUILD_DT, S_VERSION)
+    if "--version" in sys.argv:
+        print(ver)
+        return
+
+    sys.argv = [x for x in sys.argv if x != "--ws"]
+
     # fmt: off
-    ap = app = argparse.ArgumentParser(formatter_class=APF, epilog="""
+    ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog="""
 NOTE:
 source file/folder selection uses rsync syntax, meaning that:
   "foo" uploads the entire folder to URL/foo/
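After this change a quick sanity-check such as `python3 up2k.py --version` prints the build date and version (here `2023-04-20, v1.6`) and exits before the argument parser is even built, since the flag is checked directly against sys.argv.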
@@ -1003,10 +1069,10 @@ source file/folder selection uses rsync syntax, meaning that:
     ap.add_argument("-a", metavar="PASSWORD", help="password or $filepath")
     ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
     ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
+    ap.add_argument("--version", action="store_true", help="show version and exit")

     ap = app.add_argument_group("compatibility")
     ap.add_argument("--cls", action="store_true", help="clear screen before start")
-    ap.add_argument("--ws", action="store_true", help="copyparty is running on windows; wait before deleting files after uploading")

     ap = app.add_argument_group("folder sync")
     ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
@@ -1026,7 +1092,16 @@ source file/folder selection uses rsync syntax, meaning that:
     ap.add_argument("-td", action="store_true", help="disable certificate check")
     # fmt: on

-    ar = app.parse_args()
+    try:
+        ar = app.parse_args()
+    finally:
+        if EXE and not sys.argv[1:]:
+            print("*** hit enter to exit ***")
+            try:
+                input()
+            except:
+                pass
+
     if ar.drd:
         ar.dr = True

@@ -1040,7 +1115,7 @@ source file/folder selection uses rsync syntax, meaning that:

     ar.files = [
         os.path.abspath(os.path.realpath(x.encode("utf-8")))
-        + (x[-1:] if x[-1:] == os.sep else "").encode("utf-8")
+        + (x[-1:] if x[-1:] in ("\\", "/") else "").encode("utf-8")
         for x in ar.files
     ]

@@ -1050,7 +1125,7 @@ source file/folder selection uses rsync syntax, meaning that:

     if ar.a and ar.a.startswith("$"):
         fn = ar.a[1:]
-        print("reading password from file [{}]".format(fn))
+        print("reading password from file [{0}]".format(fn))
         with open(fn, "rb") as f:
             ar.a = f.read().decode("utf-8").strip()

@@ -1059,15 +1134,13 @@ source file/folder selection uses rsync syntax, meaning that:

     ctl = Ctl(ar)

-    if ar.dr and not ar.drd:
+    if ar.dr and not ar.drd and ctl.ok:
         print("\npass 2/2: delete")
-        if getattr(ctl, "up_br") and ar.ws:
-            # wait for up2k to mtime if there was uploads
-            time.sleep(4)
-
         ar.drd = True
         ar.z = True
-        Ctl(ar, ctl.stats)
+        ctl = Ctl(ar, ctl.stats)
+
+    sys.exit(0 if ctl.ok else 1)


 if __name__ == "__main__":
@@ -3,7 +3,7 @@

 <head>
 	<meta charset="utf-8">
-	<title>⇆🎉 redirect</title>
+	<title>💾🎉 redirect</title>
 	<meta http-equiv="X-UA-Compatible" content="IE=edge">
 	<style>

contrib/ios/upload-to-copyparty.shortcut (new binary file, not shown)
@@ -39,3 +39,9 @@ server {
 		proxy_set_header   Connection "Keep-Alive";
 	}
 }
+
+# default client_max_body_size (1M) blocks uploads larger than 256 MiB
+client_max_body_size 1024M;
+client_header_timeout 610m;
+client_body_timeout 610m;
+send_timeout 610m;
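The 610-minute timeouts are presumably there so that very slow uploads and downloads through the reverse proxy are not cut off by nginx's 60-second defaults; the max body size matters for the same reason the comment states.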
contrib/nixos/modules/copyparty.nix (new file, 281 lines)

@@ -0,0 +1,281 @@
|
{ config, pkgs, lib, ... }:
|
||||||
|
|
||||||
|
with lib;
|
||||||
|
|
||||||
|
let
|
||||||
|
mkKeyValue = key: value:
|
||||||
|
if value == true then
|
||||||
|
# sets with a true boolean value are coerced to just the key name
|
||||||
|
key
|
||||||
|
else if value == false then
|
||||||
|
# or omitted completely when false
|
||||||
|
""
|
||||||
|
else
|
||||||
|
(generators.mkKeyValueDefault { inherit mkValueString; } ": " key value);
|
||||||
|
|
||||||
|
mkAttrsString = value: (generators.toKeyValue { inherit mkKeyValue; } value);
|
||||||
|
|
||||||
|
mkValueString = value:
|
||||||
|
if isList value then
|
||||||
|
(concatStringsSep ", " (map mkValueString value))
|
||||||
|
else if isAttrs value then
|
||||||
|
"\n" + (mkAttrsString value)
|
||||||
|
else
|
||||||
|
(generators.mkValueStringDefault { } value);
|
||||||
|
|
||||||
|
mkSectionName = value: "[" + (escape [ "[" "]" ] value) + "]";
|
||||||
|
|
||||||
|
mkSection = name: attrs: ''
|
||||||
|
${mkSectionName name}
|
||||||
|
${mkAttrsString attrs}
|
||||||
|
'';
|
||||||
|
|
||||||
|
mkVolume = name: attrs: ''
|
||||||
|
${mkSectionName name}
|
||||||
|
${attrs.path}
|
||||||
|
${mkAttrsString {
|
||||||
|
accs = attrs.access;
|
||||||
|
flags = attrs.flags;
|
||||||
|
}}
|
||||||
|
'';
|
||||||
|
|
||||||
|
passwordPlaceholder = name: "{{password-${name}}}";
|
||||||
|
|
||||||
|
accountsWithPlaceholders = mapAttrs (name: attrs: passwordPlaceholder name);
|
||||||
|
|
||||||
|
configStr = ''
|
||||||
|
${mkSection "global" cfg.settings}
|
||||||
|
${mkSection "accounts" (accountsWithPlaceholders cfg.accounts)}
|
||||||
|
${concatStringsSep "\n" (mapAttrsToList mkVolume cfg.volumes)}
|
||||||
|
'';
|
||||||
|
|
||||||
|
name = "copyparty";
|
||||||
|
cfg = config.services.copyparty;
|
||||||
|
configFile = pkgs.writeText "${name}.conf" configStr;
|
||||||
|
runtimeConfigPath = "/run/${name}/${name}.conf";
|
||||||
|
home = "/var/lib/${name}";
|
||||||
|
defaultShareDir = "${home}/data";
|
||||||
|
in {
|
||||||
|
options.services.copyparty = {
|
||||||
|
enable = mkEnableOption "web-based file manager";
|
||||||
|
|
||||||
|
package = mkOption {
|
||||||
|
type = types.package;
|
||||||
|
default = pkgs.copyparty;
|
||||||
|
defaultText = "pkgs.copyparty";
|
||||||
|
description = ''
|
||||||
|
Package of the application to run, exposed for overriding purposes.
|
||||||
|
'';
|
||||||
|
};
|
||||||
|
|
||||||
|
openFilesLimit = mkOption {
|
||||||
|
default = 4096;
|
||||||
|
type = types.either types.int types.str;
|
||||||
|
description = "Number of files to allow copyparty to open.";
|
||||||
|
};
|
||||||
|
|
||||||
|
settings = mkOption {
|
||||||
|
type = types.attrs;
|
||||||
|
description = ''
|
||||||
|
Global settings to apply.
|
||||||
|
Directly maps to values in the [global] section of the copyparty config.
|
||||||
|
See `${getExe cfg.package} --help` for more details.
|
||||||
|
'';
|
||||||
|
default = {
|
||||||
|
i = "127.0.0.1";
|
||||||
|
no-reload = true;
|
||||||
|
};
|
||||||
|
example = literalExpression ''
|
||||||
|
{
|
||||||
|
i = "0.0.0.0";
|
||||||
|
no-reload = true;
|
||||||
|
}
|
||||||
|
'';
|
||||||
|
};
|
||||||
|
|
||||||
|
accounts = mkOption {
|
||||||
|
type = types.attrsOf (types.submodule ({ ... }: {
|
||||||
|
options = {
|
||||||
|
passwordFile = mkOption {
|
||||||
|
type = types.str;
|
||||||
|
description = ''
|
||||||
|
Runtime file path to a file containing the user password.
|
||||||
|
Must be readable by the copyparty user.
|
||||||
|
'';
|
||||||
|
example = "/run/keys/copyparty/ed";
|
||||||
|
};
|
||||||
|
};
|
||||||
|
}));
|
||||||
|
description = ''
|
||||||
|
A set of copyparty accounts to create.
|
||||||
|
'';
|
||||||
|
default = { };
|
||||||
|
example = literalExpression ''
|
||||||
|
{
|
||||||
|
ed.passwordFile = "/run/keys/copyparty/ed";
|
||||||
|
};
|
||||||
|
'';
|
||||||
|
};
|
||||||
|
|
||||||
|
volumes = mkOption {
|
||||||
|
type = types.attrsOf (types.submodule ({ ... }: {
|
||||||
|
options = {
|
||||||
|
path = mkOption {
|
||||||
|
type = types.str;
|
||||||
|
description = ''
|
||||||
|
Path of a directory to share.
|
||||||
|
'';
|
||||||
|
};
|
||||||
|
access = mkOption {
|
||||||
|
type = types.attrs;
|
||||||
|
description = ''
|
||||||
|
Attribute list of permissions and the users to apply them to.
|
||||||
|
|
||||||
|
The key must be a string containing any combination of allowed permission:
|
||||||
|
"r" (read): list folder contents, download files
|
||||||
|
"w" (write): upload files; need "r" to see the uploads
|
||||||
|
"m" (move): move files and folders; need "w" at destination
|
||||||
|
"d" (delete): permanently delete files and folders
|
||||||
|
"g" (get): download files, but cannot see folder contents
|
||||||
|
"G" (upget): "get", but can see filekeys of their own uploads
|
||||||
|
|
||||||
|
For example: "rwmd"
|
||||||
|
|
||||||
|
The value must be one of:
|
||||||
|
an account name, defined in `accounts`
|
||||||
|
a list of account names
|
||||||
|
"*", which means "any account"
|
||||||
|
'';
|
||||||
|
example = literalExpression ''
|
||||||
|
{
|
||||||
|
# wG = write-upget = see your own uploads only
|
||||||
|
wG = "*";
|
||||||
|
# read-write-modify-delete for users "ed" and "k"
|
||||||
|
rwmd = ["ed" "k"];
|
||||||
|
};
|
||||||
|
'';
|
||||||
|
};
|
||||||
|
flags = mkOption {
|
||||||
|
type = types.attrs;
|
||||||
|
description = ''
|
||||||
|
Attribute list of volume flags to apply.
|
||||||
|
See `${getExe cfg.package} --help-flags` for more details.
|
||||||
|
'';
|
||||||
|
example = literalExpression ''
|
||||||
|
{
|
||||||
|
# "fk" enables filekeys (necessary for upget permission) (4 chars long)
|
||||||
|
fk = 4;
|
||||||
|
# scan for new files every 60sec
|
||||||
|
scan = 60;
|
||||||
|
# volflag "e2d" enables the uploads database
|
||||||
|
e2d = true;
|
||||||
|
# "d2t" disables multimedia parsers (in case the uploads are malicious)
|
||||||
|
d2t = true;
|
||||||
|
# skips hashing file contents if path matches *.iso
|
||||||
|
nohash = "\.iso$";
|
||||||
|
};
|
||||||
|
'';
|
||||||
|
default = { };
|
||||||
|
};
|
||||||
|
};
|
||||||
|
}));
|
||||||
|
description = "A set of copyparty volumes to create";
|
||||||
|
default = {
|
||||||
|
"/" = {
|
||||||
|
path = defaultShareDir;
|
||||||
|
access = { r = "*"; };
|
||||||
|
};
|
||||||
|
};
|
||||||
|
example = literalExpression ''
|
||||||
|
{
|
||||||
|
"/" = {
|
||||||
|
path = ${defaultShareDir};
|
||||||
|
access = {
|
||||||
|
# wG = write-upget = see your own uploads only
|
||||||
|
wG = "*";
|
||||||
|
# read-write-modify-delete for users "ed" and "k"
|
||||||
|
rwmd = ["ed" "k"];
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
'';
|
||||||
|
};
|
||||||
|
};
|
||||||
|
|
||||||
|
config = mkIf cfg.enable {
|
||||||
|
systemd.services.copyparty = {
|
||||||
|
description = "http file sharing hub";
|
||||||
|
wantedBy = [ "multi-user.target" ];
|
||||||
|
|
||||||
|
environment = {
|
||||||
|
PYTHONUNBUFFERED = "true";
|
||||||
|
XDG_CONFIG_HOME = "${home}/.config";
|
||||||
|
};
|
||||||
|
|
||||||
|
preStart = let
|
||||||
|
replaceSecretCommand = name: attrs:
|
||||||
|
"${getExe pkgs.replace-secret} '${
|
||||||
|
passwordPlaceholder name
|
||||||
|
}' '${attrs.passwordFile}' ${runtimeConfigPath}";
|
||||||
|
in ''
|
||||||
|
set -euo pipefail
|
||||||
|
install -m 600 ${configFile} ${runtimeConfigPath}
|
||||||
|
${concatStringsSep "\n"
|
||||||
|
(mapAttrsToList replaceSecretCommand cfg.accounts)}
|
||||||
|
'';
|
||||||
|
|
||||||
|
serviceConfig = {
|
||||||
|
Type = "simple";
|
||||||
|
ExecStart = "${getExe cfg.package} -c ${runtimeConfigPath}";
|
||||||
|
|
||||||
|
# Hardening options
|
||||||
|
User = "copyparty";
|
||||||
|
Group = "copyparty";
|
||||||
|
RuntimeDirectory = name;
|
||||||
|
RuntimeDirectoryMode = "0700";
|
||||||
|
StateDirectory = [ name "${name}/data" "${name}/.config" ];
|
||||||
|
StateDirectoryMode = "0700";
|
||||||
|
WorkingDirectory = home;
|
||||||
|
TemporaryFileSystem = "/:ro";
|
||||||
|
BindReadOnlyPaths = [
|
||||||
|
"/nix/store"
|
||||||
|
"-/etc/resolv.conf"
|
||||||
|
"-/etc/nsswitch.conf"
|
||||||
|
"-/etc/hosts"
|
||||||
|
"-/etc/localtime"
|
||||||
|
] ++ (mapAttrsToList (k: v: "-${v.passwordFile}") cfg.accounts);
|
||||||
|
BindPaths = [ home ] ++ (mapAttrsToList (k: v: v.path) cfg.volumes);
|
||||||
|
# Would re-mount paths ignored by temporary root
|
||||||
|
#ProtectSystem = "strict";
|
||||||
|
ProtectHome = true;
|
||||||
|
PrivateTmp = true;
|
||||||
|
PrivateDevices = true;
|
||||||
|
ProtectKernelTunables = true;
|
||||||
|
ProtectControlGroups = true;
|
||||||
|
RestrictSUIDSGID = true;
|
||||||
|
PrivateMounts = true;
|
||||||
|
ProtectKernelModules = true;
|
||||||
|
ProtectKernelLogs = true;
|
||||||
|
ProtectHostname = true;
|
||||||
|
ProtectClock = true;
|
||||||
|
ProtectProc = "invisible";
|
||||||
|
ProcSubset = "pid";
|
||||||
|
RestrictNamespaces = true;
|
||||||
|
RemoveIPC = true;
|
||||||
|
UMask = "0077";
|
||||||
|
LimitNOFILE = cfg.openFilesLimit;
|
||||||
|
NoNewPrivileges = true;
|
||||||
|
LockPersonality = true;
|
||||||
|
RestrictRealtime = true;
|
||||||
|
};
|
||||||
|
};
|
||||||
|
|
||||||
|
users.groups.copyparty = { };
|
||||||
|
users.users.copyparty = {
|
||||||
|
description = "Service user for copyparty";
|
||||||
|
group = "copyparty";
|
||||||
|
home = home;
|
||||||
|
isSystemUser = true;
|
||||||
|
};
|
||||||
|
};
|
||||||
|
}
|
||||||
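In practical terms, once this module file is imported into a NixOS configuration, the service is switched on with services.copyparty.enable and configured through the settings, accounts and volumes options defined above; account passwords are read from each passwordFile at service start (via replace-secret in preStart), so they never end up in the world-readable nix store.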
@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.6.5"
+pkgver="1.6.11"
 pkgrel=1
 pkgdesc="Portable file sharing hub"
 arch=("any")
@@ -26,12 +26,12 @@ source=("${url}/releases/download/v${pkgver}/${pkgname}-sfx.py"
         "https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/LICENSE"
 )
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("947d3f191f96f6a9e451bbcb35c5582ba210d81cfdc92dfa9ab0390dbecf26ee"
+sha256sums=("d096e33ab666ef45213899dd3a10735f62b5441339cb7374f93b232d0b6c8d34"
             "b8565eba5e64dedba1cf6c7aac7e31c5a731ed7153d6810288a28f00a36c28b2"
             "f65c207e0670f9d78ad2e399bda18d5502ff30d2ac79e0e7fc48e7fbdc39afdc"
             "c4f396b083c9ec02ad50b52412c84d2a82be7f079b2d016e1c9fad22d68285ff"
             "dba701de9fd584405917e923ea1e59dbb249b96ef23bad479cf4e42740b774c8"
-            "746971e95817c54445ce7f9c8406822dffc814cd5eb8113abd36dd472fd677d7"
+            "8e89d281483e22d11d111bed540652af35b66af6f14f49faae7b959f6cdc6475"
             "cb2ce3d6277bf2f5a82ecf336cc44963bc6490bcf496ffbd75fc9e21abaa75f3"
)

contrib/package/nix/copyparty/default.nix (new file, 55 lines)

{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, pillow, pyvips, ffmpeg, mutagen,

# create thumbnails with Pillow; faster than FFmpeg / MediaProcessing
withThumbnails ? true,

# create thumbnails with PyVIPS; even faster, uses more memory
# -- can be combined with Pillow to support more filetypes
withFastThumbnails ? false,

# enable FFmpeg; thumbnails for most filetypes (also video and audio), extract audio metadata, transcode audio to opus
# -- possibly dangerous if you allow anonymous uploads, since FFmpeg has a huge attack surface
# -- can be combined with Thumbnails and/or FastThumbnails, since FFmpeg is slower than both
withMediaProcessing ? true,

# if MediaProcessing is not enabled, you probably want this instead (less accurate, but much safer and faster)
withBasicAudioMetadata ? false,

# enable FTPS support in the FTP server
withFTPS ? false,

# samba/cifs server; dangerous and buggy, enable if you really need it
withSMB ? false,

}:

let
  pinData = lib.importJSON ./pin.json;
  pyEnv = python.withPackages (ps:
    with ps; [
      jinja2
    ]
    ++ lib.optional withSMB impacket
    ++ lib.optional withFTPS pyopenssl
    ++ lib.optional withThumbnails pillow
    ++ lib.optional withFastThumbnails pyvips
    ++ lib.optional withMediaProcessing ffmpeg
    ++ lib.optional withBasicAudioMetadata mutagen
  );
in stdenv.mkDerivation {
  pname = "copyparty";
  version = pinData.version;
  src = fetchurl {
    url = pinData.url;
    hash = pinData.hash;
  };
  buildInputs = [ makeWrapper ];
  dontUnpack = true;
  dontBuild = true;
  installPhase = ''
    install -Dm755 $src $out/share/copyparty-sfx.py
    makeWrapper ${pyEnv.interpreter} $out/bin/copyparty \
      --set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
      --add-flags "$out/share/copyparty-sfx.py"
  '';
}
contrib/package/nix/copyparty/pin.json (new file, 5 lines)

{
    "url": "https://github.com/9001/copyparty/releases/download/v1.6.11/copyparty-sfx.py",
    "version": "1.6.11",
    "hash": "sha256-0JbjOrZm70UhOJndOhBzX2K1RBM5y3N0+TsjLQtsjTQ="
}
contrib/package/nix/copyparty/update.py (new executable file, 77 lines)

#!/usr/bin/env python3

# Update the Nix package pin
#
# Usage: ./update.sh [PATH]
# When the [PATH] is not set, it will fetch the latest release from the repo.
# With [PATH] set, it will hash the given file and generate the URL,
# base on the version contained within the file

import base64
import json
import hashlib
import sys
import re
from pathlib import Path

OUTPUT_FILE = Path("pin.json")
TARGET_ASSET = "copyparty-sfx.py"
HASH_TYPE = "sha256"
LATEST_RELEASE_URL = "https://api.github.com/repos/9001/copyparty/releases/latest"
DOWNLOAD_URL = lambda version: f"https://github.com/9001/copyparty/releases/download/v{version}/{TARGET_ASSET}"


def get_formatted_hash(binary):
    hasher = hashlib.new("sha256")
    hasher.update(binary)
    asset_hash = hasher.digest()
    encoded_hash = base64.b64encode(asset_hash).decode("ascii")
    return f"{HASH_TYPE}-{encoded_hash}"


def version_from_sfx(binary):
    result = re.search(b'^VER = "(.*)"$', binary, re.MULTILINE)
    if result:
        return result.groups(1)[0].decode("ascii")

    raise ValueError("version not found in provided file")


def remote_release_pin():
    import requests

    response = requests.get(LATEST_RELEASE_URL).json()
    version = response["tag_name"].lstrip("v")
    asset_info = [a for a in response["assets"] if a["name"] == TARGET_ASSET][0]
    download_url = asset_info["browser_download_url"]
    asset = requests.get(download_url)
    formatted_hash = get_formatted_hash(asset.content)

    result = {"url": download_url, "version": version, "hash": formatted_hash}
    return result


def local_release_pin(path):
    asset = path.read_bytes()
    version = version_from_sfx(asset)
    download_url = DOWNLOAD_URL(version)
    formatted_hash = get_formatted_hash(asset)

    result = {"url": download_url, "version": version, "hash": formatted_hash}
    return result


def main():
    if len(sys.argv) > 1:
        asset_path = Path(sys.argv[1])
        result = local_release_pin(asset_path)
    else:
        result = remote_release_pin()

    print(result)
    json_result = json.dumps(result, indent=4)
    OUTPUT_FILE.write_text(json_result)


if __name__ == "__main__":
    main()
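In short: running the script with no arguments queries the GitHub releases API for the latest copyparty-sfx.py and rewrites pin.json accordingly, while passing a path to an already-downloaded copyparty-sfx.py pins that exact file instead, scraping the version from its VER string and hashing the local bytes.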
contrib/plugins/rave.js (new file, 208 lines)

@@ -0,0 +1,208 @@
|
/* untz untz untz untz */
|
||||||
|
|
||||||
|
(function () {
|
||||||
|
|
||||||
|
var can, ctx, W, H, fft, buf, bars, barw, pv,
|
||||||
|
hue = 0,
|
||||||
|
ibeat = 0,
|
||||||
|
beats = [9001],
|
||||||
|
beats_url = '',
|
||||||
|
uofs = 0,
|
||||||
|
ops = ebi('ops'),
|
||||||
|
raving = false,
|
||||||
|
recalc = 0,
|
||||||
|
cdown = 0,
|
||||||
|
FC = 0.9,
|
||||||
|
css = `<style>
|
||||||
|
|
||||||
|
#fft {
|
||||||
|
position: fixed;
|
||||||
|
top: 0;
|
||||||
|
left: 0;
|
||||||
|
z-index: -1;
|
||||||
|
}
|
||||||
|
body {
|
||||||
|
box-shadow: inset 0 0 0 white;
|
||||||
|
}
|
||||||
|
#ops>a,
|
||||||
|
#path>a {
|
||||||
|
display: inline-block;
|
||||||
|
}
|
||||||
|
/*
|
||||||
|
body.untz {
|
||||||
|
animation: untz-body 200ms ease-out;
|
||||||
|
}
|
||||||
|
@keyframes untz-body {
|
||||||
|
0% {inset 0 0 20em white}
|
||||||
|
100% {inset 0 0 0 white}
|
||||||
|
}
|
||||||
|
*/
|
||||||
|
:root, html.a, html.b, html.c, html.d, html.e {
|
||||||
|
--row-alt: rgba(48,52,78,0.2);
|
||||||
|
}
|
||||||
|
#files td {
|
||||||
|
background: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
</style>`;
|
||||||
|
|
||||||
|
QS('body').appendChild(mknod('div', null, css));
|
||||||
|
|
||||||
|
function rave_load() {
|
||||||
|
console.log('rave_load');
|
||||||
|
can = mknod('canvas', 'fft');
|
||||||
|
QS('body').appendChild(can);
|
||||||
|
ctx = can.getContext('2d');
|
||||||
|
|
||||||
|
fft = new AnalyserNode(actx, {
|
||||||
|
"fftSize": 2048,
|
||||||
|
"maxDecibels": 0,
|
||||||
|
"smoothingTimeConstant": 0.7,
|
||||||
|
});
|
||||||
|
ibeat = 0;
|
||||||
|
beats = [9001];
|
||||||
|
buf = new Uint8Array(fft.frequencyBinCount);
|
||||||
|
bars = buf.length * FC;
|
||||||
|
afilt.filters.push(fft);
|
||||||
|
if (!raving) {
|
||||||
|
raving = true;
|
||||||
|
raver();
|
||||||
|
}
|
||||||
|
beats_url = mp.au.src.split('?')[0].replace(/(.*\/)(.*)/, '$1.beats/$2.txt');
|
||||||
|
console.log("reading beats from", beats_url);
|
||||||
|
var xhr = new XHR();
|
||||||
|
xhr.open('GET', beats_url, true);
|
||||||
|
xhr.onload = readbeats;
|
||||||
|
xhr.url = beats_url;
|
||||||
|
xhr.send();
|
||||||
|
}
|
||||||
|
|
||||||
|
function rave_unload() {
|
||||||
|
qsr('#fft');
|
||||||
|
can = null;
|
||||||
|
}
|
||||||
|
|
||||||
|
function readbeats() {
|
||||||
|
if (this.url != beats_url)
|
||||||
|
return console.log('old beats??', this.url, beats_url);
|
||||||
|
|
||||||
|
var sbeats = this.responseText.replace(/\r/g, '').split(/\n/g);
|
||||||
|
if (sbeats.length < 3)
|
||||||
|
return;
|
||||||
|
|
||||||
|
beats = [];
|
||||||
|
for (var a = 0; a < sbeats.length; a++)
|
||||||
|
beats.push(parseFloat(sbeats[a]));
|
||||||
|
|
||||||
|
var end = beats.slice(-2),
|
||||||
|
t = end[1],
|
||||||
|
d = t - end[0];
|
||||||
|
|
||||||
|
while (d > 0.1 && t < 1200)
|
||||||
|
beats.push(t += d);
|
||||||
|
}
|
||||||
|
|
||||||
|
function hrand() {
|
||||||
|
return Math.random() - 0.5;
|
||||||
|
}
|
||||||
|
|
||||||
|
function raver() {
|
||||||
|
if (!can) {
|
||||||
|
raving = false;
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
requestAnimationFrame(raver);
|
||||||
|
if (!mp || !mp.au || mp.au.paused)
|
||||||
|
return;
|
||||||
|
|
||||||
|
if (--uofs >= 0) {
|
||||||
|
document.body.style.marginLeft = hrand() * uofs + 'px';
|
||||||
|
ebi('tree').style.marginLeft = hrand() * uofs + 'px';
|
||||||
|
for (var a of QSA('#ops>a, #path>a, #pctl>a'))
|
||||||
|
a.style.transform = 'translate(' + hrand() * uofs * 1 + 'px, ' + hrand() * uofs * 0.7 + 'px) rotate(' + Math.random() * uofs * 0.7 + 'deg)'
|
||||||
|
}
|
||||||
|
|
||||||
|
if (--recalc < 0) {
|
||||||
|
recalc = 60;
|
||||||
|
var tree = ebi('tree'),
|
||||||
|
x = tree.style.display == 'none' ? 0 : tree.offsetWidth;
|
||||||
|
|
||||||
|
//W = can.width = window.innerWidth - x;
|
||||||
|
//H = can.height = window.innerHeight;
|
||||||
|
//H = ebi('widget').offsetTop;
|
||||||
|
W = can.width = bars;
|
||||||
|
H = can.height = 512;
|
||||||
|
barw = 1; //parseInt(0.8 + W / bars);
|
||||||
|
can.style.left = x + 'px';
|
||||||
|
can.style.width = (window.innerWidth - x) + 'px';
|
||||||
|
can.style.height = ebi('widget').offsetTop + 'px';
|
||||||
|
}
|
||||||
|
|
||||||
|
//if (--cdown == 1)
|
||||||
|
// clmod(ops, 'untz');
|
||||||
|
|
||||||
|
fft.getByteFrequencyData(buf);
|
||||||
|
|
||||||
|
var imax = 0, vmax = 0;
|
||||||
|
for (var a = 10; a < 50; a++)
|
||||||
|
if (vmax < buf[a]) {
|
||||||
|
vmax = buf[a];
|
||||||
|
imax = a;
|
||||||
|
}
|
||||||
|
|
||||||
|
hue = hue * 0.93 + imax * 0.07;
|
||||||
|
|
||||||
|
ctx.fillStyle = 'rgba(0,0,0,0)';
|
||||||
|
ctx.fillRect(0, 0, W, H);
|
||||||
|
ctx.clearRect(0, 0, W, H);
|
||||||
|
ctx.fillStyle = 'hsla(' + (hue * 2.5) + ',100%,50%,0.7)';
|
||||||
|
|
||||||
|
var x = 0, mul = (H / 256) * 0.5;
|
||||||
|
for (var a = 0; a < buf.length * FC; a++) {
|
||||||
|
var v = buf[a] * mul * (1 + 0.69 * a / buf.length);
|
||||||
|
ctx.fillRect(x, H - v, barw, v);
|
||||||
|
x += barw;
|
||||||
|
}
|
||||||
|
|
||||||
|
var t = mp.au.currentTime + 0.05;
|
||||||
|
|
||||||
|
if (ibeat >= beats.length || beats[ibeat] > t)
|
||||||
|
return;
|
||||||
|
|
||||||
|
while (ibeat < beats.length && beats[ibeat++] < t)
|
||||||
|
continue;
|
||||||
|
|
||||||
|
return untz();
|
||||||
|
|
||||||
|
var cv = 0;
|
||||||
|
for (var a = 0; a < 128; a++)
|
||||||
|
cv += buf[a];
|
||||||
|
|
||||||
|
if (cv - pv > 1000) {
|
||||||
|
console.log(pv, cv, cv - pv);
|
||||||
|
if (cdown < 0) {
|
||||||
|
clmod(ops, 'untz', 1);
|
||||||
|
cdown = 20;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
pv = cv;
|
||||||
|
}
|
||||||
|
|
||||||
|
function untz() {
|
||||||
|
console.log('untz');
|
||||||
|
uofs = 14;
|
||||||
|
document.body.animate([
|
||||||
|
{ boxShadow: 'inset 0 0 1em #f0c' },
|
||||||
|
{ boxShadow: 'inset 0 0 20em #f0c', offset: 0.2 },
|
||||||
|
{ boxShadow: 'inset 0 0 0 #f0c' },
|
||||||
|
], { duration: 200, iterations: 1 });
|
||||||
|
}
|
||||||
|
|
||||||
|
afilt.plugs.push({
|
||||||
|
"en": true,
|
||||||
|
"load": rave_load,
|
||||||
|
"unload": rave_unload
|
||||||
|
});
|
||||||
|
|
||||||
|
})();
|
||||||
@@ -2,12 +2,16 @@
 # and share '/mnt' with anonymous read+write
 #
 # installation:
-#   cp -pv copyparty.service /etc/systemd/system
-#   restorecon -vr /etc/systemd/system/copyparty.service
+#   wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py -O /usr/local/bin/copyparty-sfx.py
+#   cp -pv copyparty.service /etc/systemd/system/
+#   restorecon -vr /etc/systemd/system/copyparty.service  # on fedora/rhel
 #   firewall-cmd --permanent --add-port={80,443,3923}/tcp  # --zone=libvirt
 #   firewall-cmd --reload
 #   systemctl daemon-reload && systemctl enable --now copyparty
 #
+# if it fails to start, first check this: systemctl status copyparty
+# then try starting it while viewing logs: journalctl -fan 100
+#
 # you may want to:
 #   change "User=cpp" and "/home/cpp/" to another user
 #   remove the nft lines to only listen on port 3923
@@ -44,7 +48,7 @@ ExecReload=/bin/kill -s USR1 $MAINPID
 User=cpp
 Environment=XDG_CONFIG_HOME=/home/cpp/.config

-# setup forwarding from ports 80 and 443 to port 3923
+# OPTIONAL: setup forwarding from ports 80 and 443 to port 3923
 ExecStartPre=+/bin/bash -c 'nft -n -a list table nat | awk "/ to :3923 /{print\$NF}" | xargs -rL1 nft delete rule nat prerouting handle; true'
 ExecStartPre=+nft add table ip nat
 ExecStartPre=+nft -- add chain ip nat prerouting { type nat hook prerouting priority -100 \; }
@@ -758,7 +758,7 @@ def add_ftp(ap):

 def add_webdav(ap):
     ap2 = ap.add_argument_group('WebDAV options')
-    ap2.add_argument("--daw", action="store_true", help="enable full write support. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)")
+    ap2.add_argument("--daw", action="store_true", help="enable full write support, even if client may not be webdav. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)")
     ap2.add_argument("--dav-inf", action="store_true", help="allow depth:infinite requests (recursive file listing); extremely server-heavy but required for spec compliance -- luckily few clients rely on this")
     ap2.add_argument("--dav-mac", action="store_true", help="disable apple-garbage filter -- allow macos to create junk files (._* and .DS_Store, .Spotlight-*, .fseventsd, .Trashes, .AppleDouble, __MACOS)")

@@ -874,7 +874,7 @@ def add_thumbnail(ap):
     ap2.add_argument("--th-poke", metavar="SEC", type=int, default=300, help="activity labeling cooldown -- avoids doing keepalive pokes (updating the mtime) on thumbnail folders more often than SEC seconds")
     ap2.add_argument("--th-clean", metavar="SEC", type=int, default=43200, help="cleanup interval; 0=disabled")
     ap2.add_argument("--th-maxage", metavar="SEC", type=int, default=604800, help="max folder age -- folders which haven't been poked for longer than --th-poke seconds will get deleted every --th-clean seconds")
-    ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for")
+    ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for; case-insensitive if -e2d")
     # https://pillow.readthedocs.io/en/stable/handbook/image-file-formats.html
     # https://github.com/libvips/libvips
     # ffmpeg -hide_banner -demuxers | awk '/^ D /{print$2}' | while IFS= read -r x; do ffmpeg -hide_banner -h demuxer=$x; done | grep -E '^Demuxer |extensions:'
@@ -900,8 +900,8 @@ def add_db_general(ap, hcores):
     ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash")
     ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
     ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume data (db, thumbs) (volflag=hist)")
-    ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching paths during e2ds folder scans (volflag=nohash)")
-    ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching paths during e2ds folder scans (volflag=noidx)")
+    ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
+    ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
     ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
     ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
     ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice (volflag=noforget)")
@@ -946,6 +946,7 @@ def add_ui(ap, retry):
     ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
     ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
     ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
+    ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI)")
     ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
     ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
     ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")
@@ -1,8 +1,8 @@
 # coding: utf-8

-VERSION = (1, 6, 6)
+VERSION = (1, 6, 12)
 CODENAME = "cors k"
-BUILD_DT = (2023, 2, 26)
+BUILD_DT = (2023, 4, 20)

 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -356,7 +356,8 @@ class VFS(object):
         flags = {k: v for k, v in self.flags.items()}
         hist = flags.get("hist")
         if hist and hist != "-":
-            flags["hist"] = "{}/{}".format(hist.rstrip("/"), name)
+            zs = "{}/{}".format(hist.rstrip("/"), name)
+            flags["hist"] = os.path.expanduser(zs) if zs.startswith("~") else zs

         return flags

@@ -1101,6 +1102,9 @@ class AuthSrv(object):
             if vflag == "-":
                 pass
             elif vflag:
+                if vflag.startswith("~"):
+                    vflag = os.path.expanduser(vflag)
+
                 vol.histpath = uncyg(vflag) if WINDOWS else vflag
             elif self.args.hist:
                 for nch in range(len(hid)):
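Both changes lean on the same standard-library behaviour, so a hist path like `~/.cache/copyparty` now resolves to the service user's home directory while absolute paths pass through untouched. A tiny illustration (plain stdlib, not copyparty code; the expanded path shown is just an example):

import os.path

# only "~"-prefixed paths are rewritten; absolute paths are returned as-is
print(os.path.expanduser("~/.cache/copyparty"))  # e.g. /home/ed/.cache/copyparty
print(os.path.expanduser("/srv/hist"))           # unchanged: /srv/hist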
@@ -14,8 +14,8 @@ from pyftpdlib.handlers import FTPHandler
 from pyftpdlib.servers import FTPServer

 from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
-from .bos import bos
 from .authsrv import VFS
+from .bos import bos
 from .util import (
     Daemon,
     Pebkac,
@@ -129,6 +129,7 @@ class FtpFs(AbstractedFS):

     def die(self, msg):
         self.h.die(msg)
+        raise Exception()

     def v2a(
         self,
@@ -218,7 +219,7 @@ class FtpFs(AbstractedFS):

     def mkdir(self, path: str) -> None:
         ap = self.rv2a(path, w=True)[0]
-        bos.mkdir(ap)
+        bos.makedirs(ap)  # filezilla expects this

     def listdir(self, path: str) -> list[str]:
         vpath = join(self.cwd, path).lstrip("/")
@@ -264,9 +264,10 @@ class HttpCli(object):
|
|||||||
self.is_https = (
|
self.is_https = (
|
||||||
self.headers.get("x-forwarded-proto", "").lower() == "https" or self.tls
|
self.headers.get("x-forwarded-proto", "").lower() == "https" or self.tls
|
||||||
)
|
)
|
||||||
self.host = self.headers.get("host") or "{}:{}".format(
|
self.host = self.headers.get("host") or ""
|
||||||
*list(self.s.getsockname()[:2])
|
if not self.host:
|
||||||
)
|
zs = "{}:{}".format(*list(self.s.getsockname()[:2]))
|
||||||
|
self.host = zs[7:] if zs.startswith("::ffff:") else zs
|
||||||
|
|
||||||
n = self.args.rproxy
|
n = self.args.rproxy
|
||||||
if n:
|
if n:
|
||||||
@@ -778,8 +779,8 @@ class HttpCli(object):
|
|||||||
if "k304" in self.uparam:
|
if "k304" in self.uparam:
|
||||||
return self.set_k304()
|
return self.set_k304()
|
||||||
|
|
||||||
if "am_js" in self.uparam:
|
if "setck" in self.uparam:
|
||||||
return self.set_am_js()
|
return self.setck()
|
||||||
|
|
||||||
if "reset" in self.uparam:
|
if "reset" in self.uparam:
|
||||||
return self.set_cfg_reset()
|
return self.set_cfg_reset()
|
||||||
@@ -865,7 +866,17 @@ class HttpCli(object):
|
|||||||
vn, rem = self.asrv.vfs.get(self.vpath, self.uname, True, False, err=401)
|
vn, rem = self.asrv.vfs.get(self.vpath, self.uname, True, False, err=401)
|
||||||
depth = self.headers.get("depth", "infinity").lower()
|
depth = self.headers.get("depth", "infinity").lower()
|
||||||
|
|
||||||
if depth == "infinity":
|
try:
|
||||||
|
topdir = {"vp": "", "st": bos.stat(vn.canonical(rem))}
|
||||||
|
except OSError as ex:
|
||||||
|
if ex.errno != errno.ENOENT:
|
||||||
|
raise
|
||||||
|
raise Pebkac(404)
|
||||||
|
|
||||||
|
if not stat.S_ISDIR(topdir["st"].st_mode):
|
||||||
|
fgen = []
|
||||||
|
|
||||||
|
elif depth == "infinity":
|
||||||
if not self.args.dav_inf:
|
if not self.args.dav_inf:
|
||||||
self.log("client wants --dav-inf", 3)
|
self.log("client wants --dav-inf", 3)
|
||||||
zb = b'<?xml version="1.0" encoding="utf-8"?>\n<D:error xmlns:D="DAV:"><D:propfind-finite-depth/></D:error>'
|
zb = b'<?xml version="1.0" encoding="utf-8"?>\n<D:error xmlns:D="DAV:"><D:propfind-finite-depth/></D:error>'
|
||||||
@@ -904,13 +915,6 @@ class HttpCli(object):
|
|||||||
t2 = " or 'infinity'" if self.args.dav_inf else ""
|
t2 = " or 'infinity'" if self.args.dav_inf else ""
|
||||||
raise Pebkac(412, t.format(depth, t2))
|
raise Pebkac(412, t.format(depth, t2))
|
||||||
|
|
||||||
try:
|
|
||||||
topdir = {"vp": "", "st": os.stat(vn.canonical(rem))}
|
|
||||||
except OSError as ex:
|
|
||||||
if ex.errno != errno.ENOENT:
|
|
||||||
raise
|
|
||||||
raise Pebkac(404)
|
|
||||||
|
|
||||||
fgen = itertools.chain([topdir], fgen) # type: ignore
|
fgen = itertools.chain([topdir], fgen) # type: ignore
|
||||||
vtop = vjoin(self.args.R, vjoin(vn.vpath, rem))
|
vtop = vjoin(self.args.R, vjoin(vn.vpath, rem))
|
||||||
|
|
||||||
@@ -1112,7 +1116,14 @@ class HttpCli(object):
|
|||||||
if self.do_log:
|
if self.do_log:
|
||||||
self.log("MKCOL " + self.req)
|
self.log("MKCOL " + self.req)
|
||||||
|
|
||||||
return self._mkdir(self.vpath)
|
try:
|
||||||
|
return self._mkdir(self.vpath, True)
|
||||||
|
except Pebkac as ex:
|
||||||
|
if ex.code >= 500:
|
||||||
|
raise
|
||||||
|
|
||||||
|
self.reply(b"", ex.code)
|
||||||
|
return True
|
||||||
|
|
||||||
def handle_move(self) -> bool:
|
def handle_move(self) -> bool:
|
||||||
dst = self.headers["destination"]
|
dst = self.headers["destination"]
|
||||||
@@ -1426,9 +1437,9 @@ class HttpCli(object):
|
|||||||
self.log(t, 1)
|
self.log(t, 1)
|
||||||
raise Pebkac(403, t)
|
raise Pebkac(403, t)
|
||||||
|
|
||||||
if is_put and not (self.args.no_dav or self.args.nw):
|
if is_put and not (self.args.no_dav or self.args.nw) and bos.path.exists(path):
|
||||||
# allow overwrite if...
|
# allow overwrite if...
|
||||||
# * volflag 'daw' is set
|
# * volflag 'daw' is set, or client is definitely webdav
|
||||||
# * and account has delete-access
|
# * and account has delete-access
|
||||||
# or...
|
# or...
|
||||||
# * file exists, is empty, sufficiently new
|
# * file exists, is empty, sufficiently new
|
||||||
@@ -1438,9 +1449,11 @@ class HttpCli(object):
|
|||||||
if self.args.dotpart:
|
if self.args.dotpart:
|
||||||
tnam = "." + tnam
|
tnam = "." + tnam
|
||||||
|
|
||||||
if (vfs.flags.get("daw") and self.can_delete) or (
|
if (
|
||||||
|
self.can_delete
|
||||||
|
and (vfs.flags.get("daw") or "x-oc-mtime" in self.headers)
|
||||||
|
) or (
|
||||||
not bos.path.exists(os.path.join(fdir, tnam))
|
not bos.path.exists(os.path.join(fdir, tnam))
|
||||||
and bos.path.exists(path)
|
|
||||||
and not bos.path.getsize(path)
|
and not bos.path.getsize(path)
|
||||||
and bos.path.getmtime(path) >= time.time() - self.args.blank_wt
|
and bos.path.getmtime(path) >= time.time() - self.args.blank_wt
|
||||||
):
|
):
|
||||||
@@ -1464,6 +1477,16 @@ class HttpCli(object):
|
|||||||
if self.args.nw:
|
if self.args.nw:
|
||||||
return post_sz, sha_hex, sha_b64, remains, path, ""
|
return post_sz, sha_hex, sha_b64, remains, path, ""
|
||||||
|
|
||||||
|
at = mt = time.time() - lifetime
|
||||||
|
cli_mt = self.headers.get("x-oc-mtime")
|
||||||
|
if cli_mt:
|
||||||
|
try:
|
||||||
|
mt = int(cli_mt)
|
||||||
|
times = (int(time.time()), mt)
|
||||||
|
bos.utime(path, times, False)
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
if nameless and "magic" in vfs.flags:
|
if nameless and "magic" in vfs.flags:
|
||||||
try:
|
try:
|
||||||
ext = self.conn.hsrv.magician.ext(path)
|
ext = self.conn.hsrv.magician.ext(path)
|
||||||
@@ -1486,7 +1509,6 @@ class HttpCli(object):
|
|||||||
fn = fn2
|
fn = fn2
|
||||||
path = path2
|
path = path2
|
||||||
|
|
||||||
at = time.time() - lifetime
|
|
||||||
if xau and not runhook(
|
if xau and not runhook(
|
||||||
self.log,
|
self.log,
|
||||||
xau,
|
xau,
|
||||||
@@ -1494,7 +1516,7 @@ class HttpCli(object):
|
|||||||
self.vpath,
|
self.vpath,
|
||||||
self.host,
|
self.host,
|
||||||
self.uname,
|
self.uname,
|
||||||
at,
|
mt,
|
||||||
post_sz,
|
post_sz,
|
||||||
self.ip,
|
self.ip,
|
||||||
at,
|
at,
|
||||||
@@ -1554,6 +1576,11 @@ class HttpCli(object):
|
|||||||
t = "{}\n{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56], url)
|
t = "{}\n{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56], url)
|
||||||
|
|
||||||
h = {"Location": url} if is_put and url else {}
|
h = {"Location": url} if is_put and url else {}
|
||||||
|
|
||||||
|
if "x-oc-mtime" in self.headers:
|
||||||
|
h["X-OC-MTime"] = "accepted"
|
||||||
|
t = "" # some webdav clients expect/prefer this
|
||||||
|
|
||||||
self.reply(t.encode("utf-8"), 201, headers=h)
|
self.reply(t.encode("utf-8"), 201, headers=h)
|
||||||
return True
|
return True
|
||||||
|
|
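Side note on the `x-oc-mtime` hunks above: together they let a plain HTTP/WebDAV PUT carry the file's original modification time, which the server applies with `utime` and acknowledges by echoing `X-OC-MTime: accepted`. A rough sketch of how a client could exercise this; the host, port, path and timestamp below are illustration-only assumptions, not anything defined by this diff:

```python
# minimal sketch: PUT a file and ask the server to keep its original mtime
# via the nextcloud-style X-OC-MTime request header (a unix timestamp);
# host/port/path are placeholders for a local test instance
import urllib.request

req = urllib.request.Request(
    "http://127.0.0.1:3923/inc/hello.txt",
    data=b"hello world\n",
    method="PUT",
    headers={"X-OC-MTime": "1682000000"},
)
with urllib.request.urlopen(req) as r:
    # on success the server replies 201 and echoes X-OC-MTime: accepted
    print(r.status, r.headers.get("X-OC-MTime"))
```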
@@ -1714,7 +1741,7 @@ class HttpCli(object):
 except:
     raise Pebkac(500, min_ex())

-x = self.conn.hsrv.broker.ask("up2k.handle_json", body)
+x = self.conn.hsrv.broker.ask("up2k.handle_json", body, self.u2fh.aps)
 ret = x.get()
 if self.is_vproxied:
     if "purl" in ret:

@@ -1727,7 +1754,7 @@ class HttpCli(object):

 def handle_search(self, body: dict[str, Any]) -> bool:
     idx = self.conn.get_u2idx()
-    if not hasattr(idx, "p_end"):
+    if not idx or not hasattr(idx, "p_end"):
         raise Pebkac(500, "sqlite3 is not available on the server; cannot search")

     vols = []

@@ -1757,12 +1784,13 @@ class HttpCli(object):
 hits = idx.fsearch(vols, body)
 msg: Any = repr(hits)
 taglist: list[str] = []
+trunc = False
 else:
     # search by query params
     q = body["q"]
     n = body.get("n", self.args.srch_hits)
     self.log("qj: {} |{}|".format(q, n))
-    hits, taglist = idx.search(vols, q, n)
+    hits, taglist, trunc = idx.search(vols, q, n)
     msg = len(hits)

 idx.p_end = time.time()

@@ -1782,7 +1810,8 @@ class HttpCli(object):
 for hit in hits:
     hit["rp"] = self.args.RS + hit["rp"]

-r = json.dumps({"hits": hits, "tag_order": order}).encode("utf-8")
+rj = {"hits": hits, "tag_order": order, "trunc": trunc}
+r = json.dumps(rj).encode("utf-8")
 self.reply(r, mime="application/json")
 return True

@@ -1884,17 +1913,10 @@ class HttpCli(object):
 with self.mutex:
     self.u2fh.close(path)

-# windows cant rename open files
-if ANYWIN and path != fin_path and not self.args.nw:
-    self.conn.hsrv.broker.ask("up2k.finish_upload", ptop, wark).get()
+if not num_left and not self.args.nw:
+    self.conn.hsrv.broker.ask(
+        "up2k.finish_upload", ptop, wark, self.u2fh.aps
+    ).get()
-if not ANYWIN and not num_left:
-    times = (int(time.time()), int(lastmod))
-    self.log("no more chunks, setting times {}".format(times))
-    try:
-        bos.utime(fin_path, times)
-    except:
-        self.log("failed to utime ({}, {})".format(fin_path, times))

 cinf = self.headers.get("x-up2k-stat", "")

@@ -1959,7 +1981,7 @@ class HttpCli(object):
 sanitized = sanitize_fn(new_dir, "", [])
 return self._mkdir(vjoin(self.vpath, sanitized))

-def _mkdir(self, vpath: str) -> bool:
+def _mkdir(self, vpath: str, dav: bool = False) -> bool:
     nullwrite = self.args.nw
     vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
     self._assert_safe_rem(rem)

@@ -1985,7 +2007,12 @@ class HttpCli(object):
 raise Pebkac(500, min_ex())

 self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1])
-self.redirect(vpath, status=201)
+
+if dav:
+    self.reply(b"", 201)
+else:
+    self.redirect(vpath, status=201)
+
 return True

 def handle_new_md(self) -> bool:

@@ -2529,8 +2556,12 @@ class HttpCli(object):
 hrange = self.headers.get("range")

 # let's not support 206 with compression
-if do_send and not is_compressed and hrange and file_sz:
+# and multirange / multipart is also not-impl (mostly because calculating contentlength is a pain)
+if do_send and not is_compressed and hrange and file_sz and "," not in hrange:
     try:
+        if not hrange.lower().startswith("bytes"):
+            raise Exception()
+
         a, b = hrange.split("=", 1)[1].split("-")

         if a.strip():

@@ -2903,15 +2934,16 @@ class HttpCli(object):
 self.redirect("", "?h#cc")
 return True

-def set_am_js(self) -> bool:
-    v = "n" if self.uparam["am_js"] == "n" else "y"
-    ck = gencookie("js", v, self.args.R, False, 86400 * 299)
+def setck(self) -> bool:
+    k, v = self.uparam["setck"].split("=", 1)
+    t = None if v == "" else 86400 * 299
+    ck = gencookie(k, v, self.args.R, False, t)
     self.out_headerlist.append(("Set-Cookie", ck))
-    self.reply(b"promoted\n")
+    self.reply(b"o7\n")
     return True

 def set_cfg_reset(self) -> bool:
-    for k in ("k304", "js", "cppwd", "cppws"):
+    for k in ("k304", "js", "idxh", "cppwd", "cppws"):
         cookie = gencookie(k, "x", self.args.R, False, None)
         self.out_headerlist.append(("Set-Cookie", cookie))

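The `setck` handler above replaces the old single-purpose `?am_js` endpoint with a generic cookie setter: the query value is split on the first `=` into a cookie name and value, an empty value clears the cookie, and the server answers with a plain `o7`. As a hedged illustration (host and port are assumptions for a local test instance, not part of this diff), the `idxh` preference introduced later in this changeset could be toggled like so:

```python
# minimal sketch: flip a UI preference cookie through the new generic ?setck
# endpoint; "idxh=y" asks the web UI to serve index.html instead of listings
# (host/port below are placeholders for illustration only)
import urllib.request

with urllib.request.urlopen("http://127.0.0.1:3923/?setck=idxh=y") as r:
    print(r.read())                      # expect b"o7\n"
    print(r.headers.get("Set-Cookie"))   # the idxh=y cookie being set
```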
||||||
@@ -3047,7 +3079,7 @@ class HttpCli(object):
|
|||||||
raise Pebkac(403, "the unpost feature is disabled in server config")
|
raise Pebkac(403, "the unpost feature is disabled in server config")
|
||||||
|
|
||||||
idx = self.conn.get_u2idx()
|
idx = self.conn.get_u2idx()
|
||||||
if not hasattr(idx, "p_end"):
|
if not idx or not hasattr(idx, "p_end"):
|
||||||
raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
|
raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
|
||||||
|
|
||||||
filt = self.uparam.get("filter")
|
filt = self.uparam.get("filter")
|
||||||
@@ -3255,23 +3287,48 @@ class HttpCli(object):
|
|||||||
):
|
):
|
||||||
raise Pebkac(403)
|
raise Pebkac(403)
|
||||||
|
|
||||||
|
e2d = "e2d" in vn.flags
|
||||||
|
e2t = "e2t" in vn.flags
|
||||||
|
|
||||||
self.html_head = vn.flags.get("html_head", "")
|
self.html_head = vn.flags.get("html_head", "")
|
||||||
if vn.flags.get("norobots"):
|
if vn.flags.get("norobots") or "b" in self.uparam:
|
||||||
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
|
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
|
||||||
else:
|
else:
|
||||||
self.out_headers.pop("X-Robots-Tag", None)
|
self.out_headers.pop("X-Robots-Tag", None)
|
||||||
|
|
||||||
is_dir = stat.S_ISDIR(st.st_mode)
|
is_dir = stat.S_ISDIR(st.st_mode)
|
||||||
|
icur = None
|
||||||
|
if is_dir and (e2t or e2d):
|
||||||
|
idx = self.conn.get_u2idx()
|
||||||
|
if idx and hasattr(idx, "p_end"):
|
||||||
|
icur = idx.get_cur(dbv.realpath)
|
||||||
|
|
||||||
if self.can_read:
|
if self.can_read:
|
||||||
th_fmt = self.uparam.get("th")
|
th_fmt = self.uparam.get("th")
|
||||||
if th_fmt is not None:
|
if th_fmt is not None:
|
||||||
if is_dir:
|
if is_dir:
|
||||||
for fn in self.args.th_covers.split(","):
|
vrem = vrem.rstrip("/")
|
||||||
fp = os.path.join(abspath, fn)
|
if icur and vrem:
|
||||||
if bos.path.exists(fp):
|
q = "select fn from cv where rd=? and dn=?"
|
||||||
vrem = "{}/{}".format(vrem.rstrip("/"), fn).strip("/")
|
crd, cdn = vrem.rsplit("/", 1) if "/" in vrem else ("", vrem)
|
||||||
is_dir = False
|
# no mojibake support:
|
||||||
break
|
try:
|
||||||
|
cfn = icur.execute(q, (crd, cdn)).fetchone()
|
||||||
|
if cfn:
|
||||||
|
fn = cfn[0]
|
||||||
|
fp = os.path.join(abspath, fn)
|
||||||
|
if bos.path.exists(fp):
|
||||||
|
vrem = "{}/{}".format(vrem, fn).strip("/")
|
||||||
|
is_dir = False
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
else:
|
||||||
|
for fn in self.args.th_covers:
|
||||||
|
fp = os.path.join(abspath, fn)
|
||||||
|
if bos.path.exists(fp):
|
||||||
|
vrem = "{}/{}".format(vrem, fn).strip("/")
|
||||||
|
is_dir = False
|
||||||
|
break
|
||||||
|
|
||||||
if is_dir:
|
if is_dir:
|
||||||
return self.tx_ico("a.folder")
|
return self.tx_ico("a.folder")
|
||||||
@@ -3378,8 +3435,8 @@ class HttpCli(object):
|
|||||||
"taglist": [],
|
"taglist": [],
|
||||||
"srvinf": srv_infot,
|
"srvinf": srv_infot,
|
||||||
"acct": self.uname,
|
"acct": self.uname,
|
||||||
"idx": ("e2d" in vn.flags),
|
"idx": e2d,
|
||||||
"itag": ("e2t" in vn.flags),
|
"itag": e2t,
|
||||||
"lifetime": vn.flags.get("lifetime") or 0,
|
"lifetime": vn.flags.get("lifetime") or 0,
|
||||||
"frand": bool(vn.flags.get("rand")),
|
"frand": bool(vn.flags.get("rand")),
|
||||||
"perms": perms,
|
"perms": perms,
|
||||||
@@ -3398,8 +3455,8 @@ class HttpCli(object):
|
|||||||
"taglist": [],
|
"taglist": [],
|
||||||
"def_hcols": [],
|
"def_hcols": [],
|
||||||
"have_emp": self.args.emp,
|
"have_emp": self.args.emp,
|
||||||
"have_up2k_idx": ("e2d" in vn.flags),
|
"have_up2k_idx": e2d,
|
||||||
"have_tags_idx": ("e2t" in vn.flags),
|
"have_tags_idx": e2t,
|
||||||
"have_acode": (not self.args.no_acode),
|
"have_acode": (not self.args.no_acode),
|
||||||
"have_mv": (not self.args.no_mv),
|
"have_mv": (not self.args.no_mv),
|
||||||
"have_del": (not self.args.no_del),
|
"have_del": (not self.args.no_del),
|
||||||
@@ -3411,11 +3468,12 @@ class HttpCli(object):
|
|||||||
"url_suf": url_suf,
|
"url_suf": url_suf,
|
||||||
"logues": logues,
|
"logues": logues,
|
||||||
"readme": readme,
|
"readme": readme,
|
||||||
"title": html_escape(self.vpath, crlf=True) or "⇆🎉",
|
"title": html_escape(self.vpath, crlf=True) or "💾🎉",
|
||||||
"srv_info": srv_infot,
|
"srv_info": srv_infot,
|
||||||
"dtheme": self.args.theme,
|
"dtheme": self.args.theme,
|
||||||
"themes": self.args.themes,
|
"themes": self.args.themes,
|
||||||
"turbolvl": self.args.turbo,
|
"turbolvl": self.args.turbo,
|
||||||
|
"idxh": int(self.args.ih),
|
||||||
"u2sort": self.args.u2sort,
|
"u2sort": self.args.u2sort,
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -3475,11 +3533,6 @@ class HttpCli(object):
|
|||||||
if not self.args.ed or "dots" not in self.uparam:
|
if not self.args.ed or "dots" not in self.uparam:
|
||||||
ls_names = exclude_dotfiles(ls_names)
|
ls_names = exclude_dotfiles(ls_names)
|
||||||
|
|
||||||
icur = None
|
|
||||||
if "e2t" in vn.flags:
|
|
||||||
idx = self.conn.get_u2idx()
|
|
||||||
icur = idx.get_cur(dbv.realpath)
|
|
||||||
|
|
||||||
add_fk = vn.flags.get("fk")
|
add_fk = vn.flags.get("fk")
|
||||||
|
|
||||||
dirs = []
|
dirs = []
|
||||||
@@ -3554,6 +3607,20 @@ class HttpCli(object):
|
|||||||
files.append(item)
|
files.append(item)
|
||||||
item["rd"] = rem
|
item["rd"] = rem
|
||||||
|
|
||||||
|
if (
|
||||||
|
self.cookies.get("idxh") == "y"
|
||||||
|
and "ls" not in self.uparam
|
||||||
|
and "v" not in self.uparam
|
||||||
|
):
|
||||||
|
idx_html = set(["index.htm", "index.html"])
|
||||||
|
for item in files:
|
||||||
|
if item["name"] in idx_html:
|
||||||
|
# do full resolve in case of shadowed file
|
||||||
|
vp = vjoin(self.vpath.split("?")[0], item["name"])
|
||||||
|
vn, rem = self.asrv.vfs.get(vp, self.uname, True, False)
|
||||||
|
ap = vn.canonical(rem)
|
||||||
|
return self.tx_file(ap) # is no-cache
|
||||||
|
|
||||||
tagset: set[str] = set()
|
tagset: set[str] = set()
|
||||||
for fe in files:
|
for fe in files:
|
||||||
fn = fe["name"]
|
fn = fe["name"]
|
||||||
|
|||||||
@@ -103,11 +103,12 @@ class HttpConn(object):
|
|||||||
def log(self, msg: str, c: Union[int, str] = 0) -> None:
|
def log(self, msg: str, c: Union[int, str] = 0) -> None:
|
||||||
self.log_func(self.log_src, msg, c)
|
self.log_func(self.log_src, msg, c)
|
||||||
|
|
||||||
def get_u2idx(self) -> U2idx:
|
def get_u2idx(self) -> Optional[U2idx]:
|
||||||
# one u2idx per tcp connection;
|
# grab from a pool of u2idx instances;
|
||||||
# sqlite3 fully parallelizes under python threads
|
# sqlite3 fully parallelizes under python threads
|
||||||
|
# but avoid running out of FDs by creating too many
|
||||||
if not self.u2idx:
|
if not self.u2idx:
|
||||||
self.u2idx = U2idx(self)
|
self.u2idx = self.hsrv.get_u2idx(str(self.addr))
|
||||||
|
|
||||||
return self.u2idx
|
return self.u2idx
|
||||||
|
|
||||||
@@ -215,3 +216,7 @@ class HttpConn(object):
|
|||||||
self.cli = HttpCli(self)
|
self.cli = HttpCli(self)
|
||||||
if not self.cli.run():
|
if not self.cli.run():
|
||||||
return
|
return
|
||||||
|
|
||||||
|
if self.u2idx:
|
||||||
|
self.hsrv.put_u2idx(str(self.addr), self.u2idx)
|
||||||
|
self.u2idx = None
|
||||||
|
|||||||
@@ -11,7 +11,7 @@ import time
|
|||||||
|
|
||||||
import queue
|
import queue
|
||||||
|
|
||||||
from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, EnvParams
|
from .__init__ import ANYWIN, CORES, EXE, MACOS, TYPE_CHECKING, EnvParams
|
||||||
|
|
||||||
try:
|
try:
|
||||||
MNFE = ModuleNotFoundError
|
MNFE = ModuleNotFoundError
|
||||||
@@ -40,6 +40,7 @@ except MNFE:
|
|||||||
|
|
||||||
from .bos import bos
|
from .bos import bos
|
||||||
from .httpconn import HttpConn
|
from .httpconn import HttpConn
|
||||||
|
from .u2idx import U2idx
|
||||||
from .util import (
|
from .util import (
|
||||||
E_SCK,
|
E_SCK,
|
||||||
FHC,
|
FHC,
|
||||||
@@ -111,6 +112,9 @@ class HttpSrv(object):
|
|||||||
self.cb_ts = 0.0
|
self.cb_ts = 0.0
|
||||||
self.cb_v = ""
|
self.cb_v = ""
|
||||||
|
|
||||||
|
self.u2idx_free: dict[str, U2idx] = {}
|
||||||
|
self.u2idx_n = 0
|
||||||
|
|
||||||
env = jinja2.Environment()
|
env = jinja2.Environment()
|
||||||
env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
|
env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
|
||||||
jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"]
|
jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"]
|
||||||
@@ -445,6 +449,9 @@ class HttpSrv(object):
|
|||||||
self.clients.remove(cli)
|
self.clients.remove(cli)
|
||||||
self.ncli -= 1
|
self.ncli -= 1
|
||||||
|
|
||||||
|
if cli.u2idx:
|
||||||
|
self.put_u2idx(str(addr), cli.u2idx)
|
||||||
|
|
||||||
def cachebuster(self) -> str:
|
def cachebuster(self) -> str:
|
||||||
if time.time() - self.cb_ts < 1:
|
if time.time() - self.cb_ts < 1:
|
||||||
return self.cb_v
|
return self.cb_v
|
||||||
@@ -466,3 +473,31 @@ class HttpSrv(object):
|
|||||||
self.cb_v = v.decode("ascii")[-4:]
|
self.cb_v = v.decode("ascii")[-4:]
|
||||||
self.cb_ts = time.time()
|
self.cb_ts = time.time()
|
||||||
return self.cb_v
|
return self.cb_v
|
||||||
|
|
||||||
|
def get_u2idx(self, ident: str) -> Optional[U2idx]:
|
||||||
|
utab = self.u2idx_free
|
||||||
|
for _ in range(100): # 5/0.05 = 5sec
|
||||||
|
with self.mutex:
|
||||||
|
if utab:
|
||||||
|
if ident in utab:
|
||||||
|
return utab.pop(ident)
|
||||||
|
|
||||||
|
return utab.pop(list(utab.keys())[0])
|
||||||
|
|
||||||
|
if self.u2idx_n < CORES:
|
||||||
|
self.u2idx_n += 1
|
||||||
|
return U2idx(self)
|
||||||
|
|
||||||
|
time.sleep(0.05)
|
||||||
|
# not using conditional waits, on a hunch that
|
||||||
|
# average performance will be faster like this
|
||||||
|
# since most servers won't be fully saturated
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
def put_u2idx(self, ident: str, u2idx: U2idx) -> None:
|
||||||
|
with self.mutex:
|
||||||
|
while ident in self.u2idx_free:
|
||||||
|
ident += "a"
|
||||||
|
|
||||||
|
self.u2idx_free[ident] = u2idx
|
||||||
|
|||||||
@@ -128,6 +128,9 @@ class SvcHub(object):
|
|||||||
args.no_robots = True
|
args.no_robots = True
|
||||||
args.force_js = True
|
args.force_js = True
|
||||||
|
|
||||||
|
if not self._process_config():
|
||||||
|
raise Exception("bad config")
|
||||||
|
|
||||||
self.log = self._log_disabled if args.q else self._log_enabled
|
self.log = self._log_disabled if args.q else self._log_enabled
|
||||||
if args.lo:
|
if args.lo:
|
||||||
self._setup_logfile(printed)
|
self._setup_logfile(printed)
|
||||||
@@ -149,12 +152,9 @@ class SvcHub(object):
|
|||||||
self.log("root", t.format(args.j))
|
self.log("root", t.format(args.j))
|
||||||
|
|
||||||
if not args.no_fpool and args.j != 1:
|
if not args.no_fpool and args.j != 1:
|
||||||
t = "WARNING: --use-fpool combined with multithreading is untested and can probably cause undefined behavior"
|
t = "WARNING: ignoring --use-fpool because multithreading (-j{}) is enabled"
|
||||||
if ANYWIN:
|
self.log("root", t.format(args.j), c=3)
|
||||||
t = 'windows cannot do multithreading without --no-fpool, so enabling that -- note that upload performance will suffer if you have microsoft defender "real-time protection" enabled, so you probably want to use -j 1 instead'
|
args.no_fpool = True
|
||||||
args.no_fpool = True
|
|
||||||
|
|
||||||
self.log("root", t, c=3)
|
|
||||||
|
|
||||||
bri = "zy"[args.theme % 2 :][:1]
|
bri = "zy"[args.theme % 2 :][:1]
|
||||||
ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]
|
ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]
|
||||||
@@ -180,9 +180,6 @@ class SvcHub(object):
|
|||||||
|
|
||||||
self.log("root", "max clients: {}".format(self.args.nc))
|
self.log("root", "max clients: {}".format(self.args.nc))
|
||||||
|
|
||||||
if not self._process_config():
|
|
||||||
raise Exception("bad config")
|
|
||||||
|
|
||||||
self.tcpsrv = TcpSrv(self)
|
self.tcpsrv = TcpSrv(self)
|
||||||
self.up2k = Up2k(self)
|
self.up2k = Up2k(self)
|
||||||
|
|
||||||
@@ -351,6 +348,21 @@ class SvcHub(object):
|
|||||||
if al.rsp_jtr:
|
if al.rsp_jtr:
|
||||||
al.rsp_slp = 0.000001
|
al.rsp_slp = 0.000001
|
||||||
|
|
||||||
|
al.th_covers = set(al.th_covers.split(","))
|
||||||
|
|
||||||
|
for k in "c".split(" "):
|
||||||
|
vl = getattr(al, k)
|
||||||
|
if not vl:
|
||||||
|
continue
|
||||||
|
|
||||||
|
vl = [os.path.expanduser(x) if x.startswith("~") else x for x in vl]
|
||||||
|
setattr(al, k, vl)
|
||||||
|
|
||||||
|
for k in "lo hist ssl_log".split(" "):
|
||||||
|
vs = getattr(al, k)
|
||||||
|
if vs and vs.startswith("~"):
|
||||||
|
setattr(al, k, os.path.expanduser(vs))
|
||||||
|
|
||||||
return True
|
return True
|
||||||
|
|
||||||
def _setlimits(self) -> None:
|
def _setlimits(self) -> None:
|
||||||
@@ -405,6 +417,7 @@ class SvcHub(object):
|
|||||||
|
|
||||||
def _setup_logfile(self, printed: str) -> None:
|
def _setup_logfile(self, printed: str) -> None:
|
||||||
base_fn = fn = sel_fn = self._logname()
|
base_fn = fn = sel_fn = self._logname()
|
||||||
|
do_xz = fn.lower().endswith(".xz")
|
||||||
if fn != self.args.lo:
|
if fn != self.args.lo:
|
||||||
ctr = 0
|
ctr = 0
|
||||||
# yup this is a race; if started sufficiently concurrently, two
|
# yup this is a race; if started sufficiently concurrently, two
|
||||||
@@ -416,7 +429,7 @@ class SvcHub(object):
|
|||||||
fn = sel_fn
|
fn = sel_fn
|
||||||
|
|
||||||
try:
|
try:
|
||||||
if fn.lower().endswith(".xz"):
|
if do_xz:
|
||||||
import lzma
|
import lzma
|
||||||
|
|
||||||
lh = lzma.open(fn, "wt", encoding="utf-8", errors="replace", preset=0)
|
lh = lzma.open(fn, "wt", encoding="utf-8", errors="replace", preset=0)
|
||||||
|
|||||||
@@ -322,7 +322,7 @@ class TcpSrv(object):
|
|||||||
if k not in netdevs:
|
if k not in netdevs:
|
||||||
removed = "{} = {}".format(k, v)
|
removed = "{} = {}".format(k, v)
|
||||||
|
|
||||||
t = "network change detected:\n added {}\nremoved {}"
|
t = "network change detected:\n added {}\033[0;33m\nremoved {}"
|
||||||
self.log("tcpsrv", t.format(added, removed), 3)
|
self.log("tcpsrv", t.format(added, removed), 3)
|
||||||
self.netdevs = netdevs
|
self.netdevs = netdevs
|
||||||
self._distribute_netdevs()
|
self._distribute_netdevs()
|
||||||
|
|||||||
@@ -16,10 +16,10 @@ from .__init__ import ANYWIN, TYPE_CHECKING
|
|||||||
from .bos import bos
|
from .bos import bos
|
||||||
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
|
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
|
||||||
from .util import (
|
from .util import (
|
||||||
|
FFMPEG_URL,
|
||||||
BytesIO,
|
BytesIO,
|
||||||
Cooldown,
|
Cooldown,
|
||||||
Daemon,
|
Daemon,
|
||||||
FFMPEG_URL,
|
|
||||||
Pebkac,
|
Pebkac,
|
||||||
afsenc,
|
afsenc,
|
||||||
fsenc,
|
fsenc,
|
||||||
@@ -561,12 +561,19 @@ class ThumbSrv(object):
|
|||||||
if "ac" not in ret:
|
if "ac" not in ret:
|
||||||
raise Exception("not audio")
|
raise Exception("not audio")
|
||||||
|
|
||||||
|
try:
|
||||||
|
dur = ret[".dur"][1]
|
||||||
|
except:
|
||||||
|
dur = 0
|
||||||
|
|
||||||
src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus"
|
src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus"
|
||||||
want_caf = tpath.endswith(".caf")
|
want_caf = tpath.endswith(".caf")
|
||||||
tmp_opus = tpath
|
tmp_opus = tpath
|
||||||
if want_caf:
|
if want_caf:
|
||||||
tmp_opus = tpath.rsplit(".", 1)[0] + ".opus"
|
tmp_opus = tpath.rsplit(".", 1)[0] + ".opus"
|
||||||
|
|
||||||
|
caf_src = abspath if src_opus else tmp_opus
|
||||||
|
|
||||||
if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)):
|
if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)):
|
||||||
# fmt: off
|
# fmt: off
|
||||||
cmd = [
|
cmd = [
|
||||||
@@ -584,7 +591,32 @@ class ThumbSrv(object):
|
|||||||
# fmt: on
|
# fmt: on
|
||||||
self._run_ff(cmd)
|
self._run_ff(cmd)
|
||||||
|
|
||||||
if want_caf:
|
# iOS fails to play some "insufficiently complex" files
|
||||||
|
# (average file shorter than 8 seconds), so of course we
|
||||||
|
# fix that by mixing in some inaudible pink noise :^)
|
||||||
|
# 6.3 sec seems like the cutoff so lets do 7, and
|
||||||
|
# 7 sec of psyqui-musou.opus @ 3:50 is 174 KiB
|
||||||
|
if want_caf and (dur < 20 or bos.path.getsize(caf_src) < 256 * 1024):
|
||||||
|
# fmt: off
|
||||||
|
cmd = [
|
||||||
|
b"ffmpeg",
|
||||||
|
b"-nostdin",
|
||||||
|
b"-v", b"error",
|
||||||
|
b"-hide_banner",
|
||||||
|
b"-i", fsenc(abspath),
|
||||||
|
b"-filter_complex", b"anoisesrc=a=0.001:d=7:c=pink,asplit[l][r]; [l][r]amerge[s]; [0:a:0][s]amix",
|
||||||
|
b"-map_metadata", b"-1",
|
||||||
|
b"-ac", b"2",
|
||||||
|
b"-c:a", b"libopus",
|
||||||
|
b"-b:a", b"128k",
|
||||||
|
b"-f", b"caf",
|
||||||
|
fsenc(tpath)
|
||||||
|
]
|
||||||
|
# fmt: on
|
||||||
|
self._run_ff(cmd)
|
||||||
|
|
||||||
|
elif want_caf:
|
||||||
|
# simple remux should be safe
|
||||||
# fmt: off
|
# fmt: off
|
||||||
cmd = [
|
cmd = [
|
||||||
b"ffmpeg",
|
b"ffmpeg",
|
||||||
|
|||||||
@@ -34,14 +34,14 @@ if True: # pylint: disable=using-constant-test
|
|||||||
from typing import Any, Optional, Union
|
from typing import Any, Optional, Union
|
||||||
|
|
||||||
if TYPE_CHECKING:
|
if TYPE_CHECKING:
|
||||||
from .httpconn import HttpConn
|
from .httpsrv import HttpSrv
|
||||||
|
|
||||||
|
|
||||||
class U2idx(object):
|
class U2idx(object):
|
||||||
def __init__(self, conn: "HttpConn") -> None:
|
def __init__(self, hsrv: "HttpSrv") -> None:
|
||||||
self.log_func = conn.log_func
|
self.log_func = hsrv.log
|
||||||
self.asrv = conn.asrv
|
self.asrv = hsrv.asrv
|
||||||
self.args = conn.args
|
self.args = hsrv.args
|
||||||
self.timeout = self.args.srch_time
|
self.timeout = self.args.srch_time
|
||||||
|
|
||||||
if not HAVE_SQLITE3:
|
if not HAVE_SQLITE3:
|
||||||
@@ -51,7 +51,7 @@ class U2idx(object):
|
|||||||
self.active_id = ""
|
self.active_id = ""
|
||||||
self.active_cur: Optional["sqlite3.Cursor"] = None
|
self.active_cur: Optional["sqlite3.Cursor"] = None
|
||||||
self.cur: dict[str, "sqlite3.Cursor"] = {}
|
self.cur: dict[str, "sqlite3.Cursor"] = {}
|
||||||
self.mem_cur = sqlite3.connect(":memory:").cursor()
|
self.mem_cur = sqlite3.connect(":memory:", check_same_thread=False).cursor()
|
||||||
self.mem_cur.execute(r"create table a (b text)")
|
self.mem_cur.execute(r"create table a (b text)")
|
||||||
|
|
||||||
self.p_end = 0.0
|
self.p_end = 0.0
|
||||||
@@ -101,7 +101,8 @@ class U2idx(object):
|
|||||||
uri = ""
|
uri = ""
|
||||||
try:
|
try:
|
||||||
uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri())
|
uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri())
|
||||||
cur = sqlite3.connect(uri, 2, uri=True).cursor()
|
db = sqlite3.connect(uri, 2, uri=True, check_same_thread=False)
|
||||||
|
cur = db.cursor()
|
||||||
cur.execute('pragma table_info("up")').fetchone()
|
cur.execute('pragma table_info("up")').fetchone()
|
||||||
self.log("ro: {}".format(db_path))
|
self.log("ro: {}".format(db_path))
|
||||||
except:
|
except:
|
||||||
@@ -112,7 +113,7 @@ class U2idx(object):
|
|||||||
if not cur:
|
if not cur:
|
||||||
# on windows, this steals the write-lock from up2k.deferred_init --
|
# on windows, this steals the write-lock from up2k.deferred_init --
|
||||||
# seen on win 10.0.17763.2686, py 3.10.4, sqlite 3.37.2
|
# seen on win 10.0.17763.2686, py 3.10.4, sqlite 3.37.2
|
||||||
cur = sqlite3.connect(db_path, 2).cursor()
|
cur = sqlite3.connect(db_path, 2, check_same_thread=False).cursor()
|
||||||
self.log("opened {}".format(db_path))
|
self.log("opened {}".format(db_path))
|
||||||
|
|
||||||
self.cur[ptop] = cur
|
self.cur[ptop] = cur
|
||||||
@@ -120,10 +121,10 @@ class U2idx(object):
|
|||||||
|
|
||||||
def search(
|
def search(
|
||||||
self, vols: list[tuple[str, str, dict[str, Any]]], uq: str, lim: int
|
self, vols: list[tuple[str, str, dict[str, Any]]], uq: str, lim: int
|
||||||
) -> tuple[list[dict[str, Any]], list[str]]:
|
) -> tuple[list[dict[str, Any]], list[str], bool]:
|
||||||
"""search by query params"""
|
"""search by query params"""
|
||||||
if not HAVE_SQLITE3:
|
if not HAVE_SQLITE3:
|
||||||
return [], []
|
return [], [], False
|
||||||
|
|
||||||
q = ""
|
q = ""
|
||||||
v: Union[str, int] = ""
|
v: Union[str, int] = ""
|
||||||
@@ -275,7 +276,7 @@ class U2idx(object):
|
|||||||
have_up: bool,
|
have_up: bool,
|
||||||
have_mt: bool,
|
have_mt: bool,
|
||||||
lim: int,
|
lim: int,
|
||||||
) -> tuple[list[dict[str, Any]], list[str]]:
|
) -> tuple[list[dict[str, Any]], list[str], bool]:
|
||||||
done_flag: list[bool] = []
|
done_flag: list[bool] = []
|
||||||
self.active_id = "{:.6f}_{}".format(
|
self.active_id = "{:.6f}_{}".format(
|
||||||
time.time(), threading.current_thread().ident
|
time.time(), threading.current_thread().ident
|
||||||
@@ -316,9 +317,6 @@ class U2idx(object):
|
|||||||
c = cur.execute(uq, tuple(vuv))
|
c = cur.execute(uq, tuple(vuv))
|
||||||
for hit in c:
|
for hit in c:
|
||||||
w, ts, sz, rd, fn, ip, at = hit[:7]
|
w, ts, sz, rd, fn, ip, at = hit[:7]
|
||||||
lim -= 1
|
|
||||||
if lim < 0:
|
|
||||||
break
|
|
||||||
|
|
||||||
if rd.startswith("//") or fn.startswith("//"):
|
if rd.startswith("//") or fn.startswith("//"):
|
||||||
rd, fn = s3dec(rd, fn)
|
rd, fn = s3dec(rd, fn)
|
||||||
@@ -346,6 +344,10 @@ class U2idx(object):
|
|||||||
)[:fk]
|
)[:fk]
|
||||||
)
|
)
|
||||||
|
|
||||||
|
lim -= 1
|
||||||
|
if lim < 0:
|
||||||
|
break
|
||||||
|
|
||||||
seen_rps.add(rp)
|
seen_rps.add(rp)
|
||||||
sret.append({"ts": int(ts), "sz": sz, "rp": rp + suf, "w": w[:16]})
|
sret.append({"ts": int(ts), "sz": sz, "rp": rp + suf, "w": w[:16]})
|
||||||
|
|
||||||
@@ -368,7 +370,7 @@ class U2idx(object):
|
|||||||
|
|
||||||
ret.sort(key=itemgetter("rp"))
|
ret.sort(key=itemgetter("rp"))
|
||||||
|
|
||||||
return ret, list(taglist.keys())
|
return ret, list(taglist.keys()), lim < 0
|
||||||
|
|
||||||
def terminator(self, identifier: str, done_flag: list[bool]) -> None:
|
def terminator(self, identifier: str, done_flag: list[bool]) -> None:
|
||||||
for _ in range(self.timeout):
|
for _ in range(self.timeout):
|
||||||
|
|||||||
@@ -73,6 +73,9 @@ if True: # pylint: disable=using-constant-test
|
|||||||
if TYPE_CHECKING:
|
if TYPE_CHECKING:
|
||||||
from .svchub import SvcHub
|
from .svchub import SvcHub
|
||||||
|
|
||||||
|
zs = "avif,avifs,bmp,gif,heic,heics,heif,heifs,ico,j2p,j2k,jp2,jpeg,jpg,jpx,png,tga,tif,tiff,webp"
|
||||||
|
CV_EXTS = set(zs.split(","))
|
||||||
|
|
||||||
|
|
||||||
class Dbw(object):
|
class Dbw(object):
|
||||||
def __init__(self, c: "sqlite3.Cursor", n: int, t: float) -> None:
|
def __init__(self, c: "sqlite3.Cursor", n: int, t: float) -> None:
|
||||||
@@ -124,6 +127,7 @@ class Up2k(object):
|
|||||||
self.droppable: dict[str, list[str]] = {}
|
self.droppable: dict[str, list[str]] = {}
|
||||||
self.volstate: dict[str, str] = {}
|
self.volstate: dict[str, str] = {}
|
||||||
self.vol_act: dict[str, float] = {}
|
self.vol_act: dict[str, float] = {}
|
||||||
|
self.busy_aps: set[str] = set()
|
||||||
self.dupesched: dict[str, list[tuple[str, str, float]]] = {}
|
self.dupesched: dict[str, list[tuple[str, str, float]]] = {}
|
||||||
self.snap_persist_interval = 300 # persist unfinished index every 5 min
|
self.snap_persist_interval = 300 # persist unfinished index every 5 min
|
||||||
self.snap_discard_interval = 21600 # drop unfinished after 6 hours inactivity
|
self.snap_discard_interval = 21600 # drop unfinished after 6 hours inactivity
|
||||||
@@ -161,12 +165,6 @@ class Up2k(object):
|
|||||||
t = "could not initialize sqlite3, will use in-memory registry only"
|
t = "could not initialize sqlite3, will use in-memory registry only"
|
||||||
self.log(t, 3)
|
self.log(t, 3)
|
||||||
|
|
||||||
if ANYWIN:
|
|
||||||
# usually fails to set lastmod too quickly
|
|
||||||
self.lastmod_q: list[tuple[str, int, tuple[int, int], bool]] = []
|
|
||||||
self.lastmod_q2 = self.lastmod_q[:]
|
|
||||||
Daemon(self._lastmodder, "up2k-lastmod")
|
|
||||||
|
|
||||||
self.fstab = Fstab(self.log_func)
|
self.fstab = Fstab(self.log_func)
|
||||||
self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
|
self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
|
||||||
|
|
||||||
@@ -463,11 +461,9 @@ class Up2k(object):
|
|||||||
q = "select * from up where substr(w,1,16)=? and +rd=? and +fn=?"
|
q = "select * from up where substr(w,1,16)=? and +rd=? and +fn=?"
|
||||||
ups = []
|
ups = []
|
||||||
for wrf in wrfs:
|
for wrf in wrfs:
|
||||||
try:
|
up = cur.execute(q, wrf).fetchone()
|
||||||
# almost definitely exists; don't care if it doesn't
|
if up:
|
||||||
ups.append(cur.execute(q, wrf).fetchone())
|
ups.append(up)
|
||||||
except:
|
|
||||||
pass
|
|
||||||
|
|
||||||
# t1 = time.time()
|
# t1 = time.time()
|
||||||
# self.log("mapped {} warks in {:.3f} sec".format(len(wrfs), t1 - t0))
|
# self.log("mapped {} warks in {:.3f} sec".format(len(wrfs), t1 - t0))
|
||||||
@@ -952,6 +948,7 @@ class Up2k(object):
|
|||||||
unreg: list[str] = []
|
unreg: list[str] = []
|
||||||
files: list[tuple[int, int, str]] = []
|
files: list[tuple[int, int, str]] = []
|
||||||
fat32 = True
|
fat32 = True
|
||||||
|
cv = ""
|
||||||
|
|
||||||
assert self.pp and self.mem_cur
|
assert self.pp and self.mem_cur
|
||||||
self.pp.msg = "a{} {}".format(self.pp.n, cdir)
|
self.pp.msg = "a{} {}".format(self.pp.n, cdir)
|
||||||
@@ -1014,6 +1011,12 @@ class Up2k(object):
|
|||||||
continue
|
continue
|
||||||
|
|
||||||
files.append((sz, lmod, iname))
|
files.append((sz, lmod, iname))
|
||||||
|
liname = iname.lower()
|
||||||
|
if sz and (
|
||||||
|
iname in self.args.th_covers
|
||||||
|
or (not cv and liname.rsplit(".", 1)[-1] in CV_EXTS)
|
||||||
|
):
|
||||||
|
cv = iname
|
||||||
|
|
||||||
# folder of 1000 files = ~1 MiB RAM best-case (tiny filenames);
|
# folder of 1000 files = ~1 MiB RAM best-case (tiny filenames);
|
||||||
# free up stuff we're done with before dhashing
|
# free up stuff we're done with before dhashing
|
||||||
@@ -1026,9 +1029,10 @@ class Up2k(object):
|
|||||||
zh = hashlib.sha1()
|
zh = hashlib.sha1()
|
||||||
_ = [zh.update(str(x).encode("utf-8", "replace")) for x in files]
|
_ = [zh.update(str(x).encode("utf-8", "replace")) for x in files]
|
||||||
|
|
||||||
|
zh.update(cv.encode("utf-8", "replace"))
|
||||||
zh.update(spack(b"<d", cst.st_mtime))
|
zh.update(spack(b"<d", cst.st_mtime))
|
||||||
dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
|
dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
|
||||||
sql = "select d from dh where d = ? and h = ?"
|
sql = "select d from dh where d=? and +h=?"
|
||||||
try:
|
try:
|
||||||
c = db.c.execute(sql, (rd, dhash))
|
c = db.c.execute(sql, (rd, dhash))
|
||||||
drd = rd
|
drd = rd
|
||||||
@@ -1039,6 +1043,18 @@ class Up2k(object):
|
|||||||
if c.fetchone():
|
if c.fetchone():
|
||||||
return ret
|
return ret
|
||||||
|
|
||||||
|
if cv and rd:
|
||||||
|
# mojibake not supported (for performance / simplicity):
|
||||||
|
try:
|
||||||
|
q = "select * from cv where rd=? and dn=? and +fn=?"
|
||||||
|
crd, cdn = rd.rsplit("/", 1) if "/" in rd else ("", rd)
|
||||||
|
if not db.c.execute(q, (crd, cdn, cv)).fetchone():
|
||||||
|
db.c.execute("delete from cv where rd=? and dn=?", (crd, cdn))
|
||||||
|
db.c.execute("insert into cv values (?,?,?)", (crd, cdn, cv))
|
||||||
|
db.n += 1
|
||||||
|
except Exception as ex:
|
||||||
|
self.log("cover {}/{} failed: {}".format(rd, cv, ex), 6)
|
||||||
|
|
||||||
seen_files = set([x[2] for x in files]) # for dropcheck
|
seen_files = set([x[2] for x in files]) # for dropcheck
|
||||||
for sz, lmod, fn in files:
|
for sz, lmod, fn in files:
|
||||||
if self.stop:
|
if self.stop:
|
||||||
@@ -1235,6 +1251,22 @@ class Up2k(object):
|
|||||||
if n_rm2:
|
if n_rm2:
|
||||||
self.log("forgetting {} shadowed deleted files".format(n_rm2))
|
self.log("forgetting {} shadowed deleted files".format(n_rm2))
|
||||||
|
|
||||||
|
c2.connection.commit()
|
||||||
|
|
||||||
|
# then covers
|
||||||
|
n_rm3 = 0
|
||||||
|
qu = "select 1 from up where rd=? and +fn=? limit 1"
|
||||||
|
q = "delete from cv where rd=? and dn=? and +fn=?"
|
||||||
|
for crd, cdn, fn in cur.execute("select * from cv"):
|
||||||
|
urd = vjoin(crd, cdn)
|
||||||
|
if not c2.execute(qu, (urd, fn)).fetchone():
|
||||||
|
c2.execute(q, (crd, cdn, fn))
|
||||||
|
n_rm3 += 1
|
||||||
|
|
||||||
|
if n_rm3:
|
||||||
|
self.log("forgetting {} deleted covers".format(n_rm3))
|
||||||
|
|
||||||
|
c2.connection.commit()
|
||||||
c2.close()
|
c2.close()
|
||||||
return n_rm + n_rm2
|
return n_rm + n_rm2
|
||||||
|
|
||||||
@@ -1284,9 +1316,9 @@ class Up2k(object):
|
|||||||
|
|
||||||
w, drd, dfn = zb[:-1].decode("utf-8").split("\x00")
|
w, drd, dfn = zb[:-1].decode("utf-8").split("\x00")
|
||||||
with self.mutex:
|
with self.mutex:
|
||||||
q = "select mt, sz from up where w = ? and rd = ? and fn = ?"
|
q = "select mt, sz from up where rd=? and fn=? and +w=?"
|
||||||
try:
|
try:
|
||||||
mt, sz = cur.execute(q, (w, drd, dfn)).fetchone()
|
mt, sz = cur.execute(q, (drd, dfn, w)).fetchone()
|
||||||
except:
|
except:
|
||||||
# file moved/deleted since spooling
|
# file moved/deleted since spooling
|
||||||
continue
|
continue
|
||||||
@@ -1387,6 +1419,7 @@ class Up2k(object):
|
|||||||
cur, _ = reg
|
cur, _ = reg
|
||||||
self._set_tagscan(cur, True)
|
self._set_tagscan(cur, True)
|
||||||
cur.execute("delete from dh")
|
cur.execute("delete from dh")
|
||||||
|
cur.execute("delete from cv")
|
||||||
cur.connection.commit()
|
cur.connection.commit()
|
||||||
|
|
||||||
def _set_tagscan(self, cur: "sqlite3.Cursor", need: bool) -> bool:
|
def _set_tagscan(self, cur: "sqlite3.Cursor", need: bool) -> bool:
|
||||||
@@ -1967,6 +2000,7 @@ class Up2k(object):
|
|||||||
|
|
||||||
if ver == DB_VER:
|
if ver == DB_VER:
|
||||||
try:
|
try:
|
||||||
|
self._add_cv_tab(cur)
|
||||||
self._add_xiu_tab(cur)
|
self._add_xiu_tab(cur)
|
||||||
self._add_dhash_tab(cur)
|
self._add_dhash_tab(cur)
|
||||||
except:
|
except:
|
||||||
@@ -2062,6 +2096,7 @@ class Up2k(object):
|
|||||||
|
|
||||||
self._add_dhash_tab(cur)
|
self._add_dhash_tab(cur)
|
||||||
self._add_xiu_tab(cur)
|
self._add_xiu_tab(cur)
|
||||||
|
self._add_cv_tab(cur)
|
||||||
self.log("created DB at {}".format(db_path))
|
self.log("created DB at {}".format(db_path))
|
||||||
return cur
|
return cur
|
||||||
|
|
||||||
@@ -2110,12 +2145,34 @@ class Up2k(object):
|
|||||||
|
|
||||||
cur.connection.commit()
|
cur.connection.commit()
|
||||||
|
|
||||||
|
def _add_cv_tab(self, cur: "sqlite3.Cursor") -> None:
|
||||||
|
# v5b -> v5c
|
||||||
|
try:
|
||||||
|
cur.execute("select rd, dn, fn from cv limit 1").fetchone()
|
||||||
|
return
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
|
for cmd in [
|
||||||
|
r"create table cv (rd text, dn text, fn text)",
|
||||||
|
r"create index cv_i on cv(rd, dn)",
|
||||||
|
]:
|
||||||
|
cur.execute(cmd)
|
||||||
|
|
||||||
|
try:
|
||||||
|
cur.execute("delete from dh")
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
|
cur.connection.commit()
|
||||||
|
|
||||||
def _job_volchk(self, cj: dict[str, Any]) -> None:
|
def _job_volchk(self, cj: dict[str, Any]) -> None:
|
||||||
if not self.register_vpath(cj["ptop"], cj["vcfg"]):
|
if not self.register_vpath(cj["ptop"], cj["vcfg"]):
|
||||||
if cj["ptop"] not in self.registry:
|
if cj["ptop"] not in self.registry:
|
||||||
raise Pebkac(410, "location unavailable")
|
raise Pebkac(410, "location unavailable")
|
||||||
|
|
||||||
def handle_json(self, cj: dict[str, Any]) -> dict[str, Any]:
|
def handle_json(self, cj: dict[str, Any], busy_aps: set[str]) -> dict[str, Any]:
|
||||||
|
self.busy_aps = busy_aps
|
||||||
try:
|
try:
|
||||||
# bit expensive; 3.9=10x 3.11=2x
|
# bit expensive; 3.9=10x 3.11=2x
|
||||||
if self.mutex.acquire(timeout=10):
|
if self.mutex.acquire(timeout=10):
|
||||||
@@ -2166,7 +2223,7 @@ class Up2k(object):
|
|||||||
q = r"select * from up where w = ?"
|
q = r"select * from up where w = ?"
|
||||||
argv = [wark]
|
argv = [wark]
|
||||||
else:
|
else:
|
||||||
q = r"select * from up where substr(w,1,16) = ? and w = ?"
|
q = r"select * from up where substr(w,1,16)=? and +w=?"
|
||||||
argv = [wark[:16], wark]
|
argv = [wark[:16], wark]
|
||||||
|
|
||||||
c2 = cur.execute(q, tuple(argv))
|
c2 = cur.execute(q, tuple(argv))
|
||||||
@@ -2289,6 +2346,22 @@ class Up2k(object):
|
|||||||
else:
|
else:
|
||||||
# symlink to the client-provided name,
|
# symlink to the client-provided name,
|
||||||
# returning the previous upload info
|
# returning the previous upload info
|
||||||
|
psrc = src + ".PARTIAL"
|
||||||
|
if self.args.dotpart:
|
||||||
|
m = re.match(r"(.*[\\/])(.*)", psrc)
|
||||||
|
if m: # always true but...
|
||||||
|
zs1, zs2 = m.groups()
|
||||||
|
psrc = zs1 + "." + zs2
|
||||||
|
|
||||||
|
if (
|
||||||
|
src in self.busy_aps
|
||||||
|
or psrc in self.busy_aps
|
||||||
|
or (wark in reg and "done" not in reg[wark])
|
||||||
|
):
|
||||||
|
raise Pebkac(
|
||||||
|
422, "source file busy; please try again later"
|
||||||
|
)
|
||||||
|
|
||||||
job = deepcopy(job)
|
job = deepcopy(job)
|
||||||
job["wark"] = wark
|
job["wark"] = wark
|
||||||
job["at"] = cj.get("at") or time.time()
|
job["at"] = cj.get("at") or time.time()
|
||||||
@@ -2332,7 +2405,7 @@ class Up2k(object):
|
|||||||
if not n4g:
|
if not n4g:
|
||||||
raise
|
raise
|
||||||
|
|
||||||
if cur:
|
if cur and not self.args.nw:
|
||||||
zs = "prel name lmod size ptop vtop wark host user addr at"
|
zs = "prel name lmod size ptop vtop wark host user addr at"
|
||||||
a = [job[x] for x in zs.split()]
|
a = [job[x] for x in zs.split()]
|
||||||
self.db_add(cur, vfs.flags, *a)
|
self.db_add(cur, vfs.flags, *a)
|
||||||
@@ -2507,10 +2580,7 @@ class Up2k(object):
|
|||||||
|
|
||||||
if lmod and (not linked or SYMTIME):
|
if lmod and (not linked or SYMTIME):
|
||||||
times = (int(time.time()), int(lmod))
|
times = (int(time.time()), int(lmod))
|
||||||
if ANYWIN:
|
bos.utime(dst, times, False)
|
||||||
self.lastmod_q.append((dst, 0, times, False))
|
|
||||||
else:
|
|
||||||
bos.utime(dst, times, False)
|
|
||||||
|
|
||||||
def handle_chunk(
|
def handle_chunk(
|
||||||
self, ptop: str, wark: str, chash: str
|
self, ptop: str, wark: str, chash: str
|
||||||
@@ -2591,13 +2661,10 @@ class Up2k(object):
|
|||||||
self.regdrop(ptop, wark)
|
self.regdrop(ptop, wark)
|
||||||
return ret, dst
|
return ret, dst
|
||||||
|
|
||||||
# windows cant rename open files
|
|
||||||
if not ANYWIN or src == dst:
|
|
||||||
self._finish_upload(ptop, wark)
|
|
||||||
|
|
||||||
return ret, dst
|
return ret, dst
|
||||||
|
|
||||||
def finish_upload(self, ptop: str, wark: str) -> None:
|
def finish_upload(self, ptop: str, wark: str, busy_aps: set[str]) -> None:
|
||||||
|
self.busy_aps = busy_aps
|
||||||
with self.mutex:
|
with self.mutex:
|
||||||
self._finish_upload(ptop, wark)
|
self._finish_upload(ptop, wark)
|
||||||
|
|
||||||
@@ -2610,6 +2677,10 @@ class Up2k(object):
|
|||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
raise Pebkac(500, "finish_upload, wark, " + repr(ex))
|
raise Pebkac(500, "finish_upload, wark, " + repr(ex))
|
||||||
|
|
||||||
|
if job["need"]:
|
||||||
|
t = "finish_upload {} with remaining chunks {}"
|
||||||
|
raise Pebkac(500, t.format(wark, job["need"]))
|
||||||
|
|
||||||
# self.log("--- " + wark + " " + dst + " finish_upload atomic " + dst, 4)
|
# self.log("--- " + wark + " " + dst + " finish_upload atomic " + dst, 4)
|
||||||
atomic_move(src, dst)
|
atomic_move(src, dst)
|
||||||
|
|
||||||
@@ -2617,14 +2688,15 @@ class Up2k(object):
|
|||||||
vflags = self.flags[ptop]
|
vflags = self.flags[ptop]
|
||||||
|
|
||||||
times = (int(time.time()), int(job["lmod"]))
|
times = (int(time.time()), int(job["lmod"]))
|
||||||
if ANYWIN:
|
self.log(
|
||||||
z1 = (dst, job["size"], times, job["sprs"])
|
"no more chunks, setting times {} ({}) on {}".format(
|
||||||
self.lastmod_q.append(z1)
|
times, bos.path.getsize(dst), dst
|
||||||
elif not job["hash"]:
|
)
|
||||||
try:
|
)
|
||||||
bos.utime(dst, times)
|
try:
|
||||||
except:
|
bos.utime(dst, times)
|
||||||
pass
|
except:
|
||||||
|
self.log("failed to utime ({}, {})".format(dst, times))
|
||||||
|
|
||||||
zs = "prel name lmod size ptop vtop wark host user addr"
|
zs = "prel name lmod size ptop vtop wark host user addr"
|
||||||
z2 = [job[x] for x in zs.split()]
|
z2 = [job[x] for x in zs.split()]
|
||||||
@@ -2645,6 +2717,7 @@ class Up2k(object):
|
|||||||
if self.idx_wark(vflags, *z2):
|
if self.idx_wark(vflags, *z2):
|
||||||
del self.registry[ptop][wark]
|
del self.registry[ptop][wark]
|
||||||
else:
|
else:
|
||||||
|
self.registry[ptop][wark]["done"] = 1
|
||||||
self.regdrop(ptop, wark)
|
self.regdrop(ptop, wark)
|
||||||
|
|
||||||
if wake_sr:
|
if wake_sr:
|
||||||
@@ -2814,6 +2887,16 @@ class Up2k(object):
|
|||||||
with self.rescan_cond:
|
with self.rescan_cond:
|
||||||
self.rescan_cond.notify_all()
|
self.rescan_cond.notify_all()
|
||||||
|
|
||||||
|
if rd and sz and fn.lower() in self.args.th_covers:
|
||||||
|
# wasteful; db_add will re-index actual covers
|
||||||
|
# but that won't catch existing files
|
||||||
|
crd, cdn = rd.rsplit("/", 1) if "/" in rd else ("", rd)
|
||||||
|
try:
|
||||||
|
db.execute("delete from cv where rd=? and dn=?", (crd, cdn))
|
||||||
|
db.execute("insert into cv values (?,?,?)", (crd, cdn, fn))
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
def handle_rm(self, uname: str, ip: str, vpaths: list[str], lim: list[int]) -> str:
|
def handle_rm(self, uname: str, ip: str, vpaths: list[str], lim: list[int]) -> str:
|
||||||
n_files = 0
|
n_files = 0
|
||||||
ok = {}
|
ok = {}
|
||||||
@@ -3217,9 +3300,16 @@ class Up2k(object):
|
|||||||
"""
|
"""
|
||||||
dupes = []
|
dupes = []
|
||||||
sabs = djoin(sptop, srem)
|
sabs = djoin(sptop, srem)
|
||||||
q = "select rd, fn from up where substr(w,1,16)=? and w=?"
|
|
||||||
|
if self.no_expr_idx:
|
||||||
|
q = r"select rd, fn from up where w = ?"
|
||||||
|
argv = (wark,)
|
||||||
|
else:
|
||||||
|
q = r"select rd, fn from up where substr(w,1,16)=? and +w=?"
|
||||||
|
argv = (wark[:16], wark)
|
||||||
|
|
||||||
for ptop, cur in self.cur.items():
|
for ptop, cur in self.cur.items():
|
||||||
for rd, fn in cur.execute(q, (wark[:16], wark)):
|
for rd, fn in cur.execute(q, argv):
|
||||||
if rd.startswith("//") or fn.startswith("//"):
|
if rd.startswith("//") or fn.startswith("//"):
|
||||||
rd, fn = s3dec(rd, fn)
|
rd, fn = s3dec(rd, fn)
|
||||||
|
|
||||||
@@ -3428,27 +3518,6 @@ class Up2k(object):
|
|||||||
if not job["hash"]:
|
if not job["hash"]:
|
||||||
self._finish_upload(job["ptop"], job["wark"])
|
self._finish_upload(job["ptop"], job["wark"])
|
||||||
|
|
||||||
def _lastmodder(self) -> None:
|
|
||||||
while True:
|
|
||||||
ready = self.lastmod_q2
|
|
||||||
self.lastmod_q2 = self.lastmod_q
|
|
||||||
self.lastmod_q = []
|
|
||||||
|
|
||||||
time.sleep(1)
|
|
||||||
for path, sz, times, sparse in ready:
|
|
||||||
self.log("lmod: setting times {} on {}".format(times, path))
|
|
||||||
try:
|
|
||||||
bos.utime(path, times, False)
|
|
||||||
except:
|
|
||||||
t = "lmod: failed to utime ({}, {}):\n{}"
|
|
||||||
self.log(t.format(path, times, min_ex()))
|
|
||||||
|
|
||||||
if sparse and self.args.sparse and self.args.sparse * 1024 * 1024 <= sz:
|
|
||||||
try:
|
|
||||||
sp.check_call(["fsutil", "sparse", "setflag", path, "0"])
|
|
||||||
except:
|
|
||||||
self.log("could not unsparse [{}]".format(path), 3)
|
|
||||||
|
|
||||||
def _snapshot(self) -> None:
|
def _snapshot(self) -> None:
|
||||||
slp = self.snap_persist_interval
|
slp = self.snap_persist_interval
|
||||||
while True:
|
while True:
|
||||||
|
|||||||
@@ -668,6 +668,7 @@ class FHC(object):
|
|||||||
|
|
||||||
def __init__(self) -> None:
|
def __init__(self) -> None:
|
||||||
self.cache: dict[str, FHC.CE] = {}
|
self.cache: dict[str, FHC.CE] = {}
|
||||||
|
self.aps: set[str] = set()
|
||||||
|
|
||||||
def close(self, path: str) -> None:
|
def close(self, path: str) -> None:
|
||||||
try:
|
try:
|
||||||
@@ -679,6 +680,7 @@ class FHC(object):
|
|||||||
fh.close()
|
fh.close()
|
||||||
|
|
||||||
del self.cache[path]
|
del self.cache[path]
|
||||||
|
self.aps.remove(path)
|
||||||
|
|
||||||
def clean(self) -> None:
|
def clean(self) -> None:
|
||||||
if not self.cache:
|
if not self.cache:
|
||||||
@@ -699,6 +701,7 @@ class FHC(object):
|
|||||||
return self.cache[path].fhs.pop()
|
return self.cache[path].fhs.pop()
|
||||||
|
|
||||||
def put(self, path: str, fh: typing.BinaryIO) -> None:
|
def put(self, path: str, fh: typing.BinaryIO) -> None:
|
||||||
|
self.aps.add(path)
|
||||||
try:
|
try:
|
||||||
ce = self.cache[path]
|
ce = self.cache[path]
|
||||||
ce.fhs.append(fh)
|
ce.fhs.append(fh)
|
||||||
|
|||||||
@@ -27,8 +27,8 @@ window.baguetteBox = (function () {
|
|||||||
isOverlayVisible = false,
|
isOverlayVisible = false,
|
||||||
touch = {}, // start-pos
|
touch = {}, // start-pos
|
||||||
touchFlag = false, // busy
|
touchFlag = false, // busy
|
||||||
re_i = /.+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
|
re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
|
||||||
re_v = /.+\.(webm|mkv|mp4)(\?|$)/i,
|
re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
|
||||||
anims = ['slideIn', 'fadeIn', 'none'],
|
anims = ['slideIn', 'fadeIn', 'none'],
|
||||||
data = {}, // all galleries
|
data = {}, // all galleries
|
||||||
imagesElements = [],
|
imagesElements = [],
|
||||||
@@ -580,6 +580,7 @@ window.baguetteBox = (function () {
|
|||||||
function hideOverlay(e) {
|
function hideOverlay(e) {
|
||||||
ev(e);
|
ev(e);
|
||||||
playvid(false);
|
playvid(false);
|
||||||
|
removeFromCache('#files');
|
||||||
if (options.noScrollbars) {
|
if (options.noScrollbars) {
|
||||||
document.documentElement.style.overflowY = 'auto';
|
document.documentElement.style.overflowY = 'auto';
|
||||||
document.body.style.overflowY = 'auto';
|
document.body.style.overflowY = 'auto';
|
||||||
@@ -812,10 +813,16 @@ window.baguetteBox = (function () {
|
|||||||
}
|
}
|
||||||
|
|
||||||
function vid() {
|
function vid() {
|
||||||
|
if (currentIndex >= imagesElements.length)
|
||||||
|
return;
|
||||||
|
|
||||||
return imagesElements[currentIndex].querySelector('video');
|
return imagesElements[currentIndex].querySelector('video');
|
||||||
}
|
}
|
||||||
|
|
||||||
function vidimg() {
|
function vidimg() {
|
||||||
|
if (currentIndex >= imagesElements.length)
|
||||||
|
return;
|
||||||
|
|
||||||
return imagesElements[currentIndex].querySelector('img, video');
|
return imagesElements[currentIndex].querySelector('img, video');
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|||||||
@@ -93,6 +93,7 @@
 --g-fsel-bg: #d39;
 --g-fsel-b1: #f4a;
 --g-fsel-ts: #804;
+--g-dfg: var(--srv-3);
 --g-fg: var(--a-hil);
 --g-bg: var(--bg-u2);
 --g-b1: var(--bg-u4);

@@ -327,6 +328,7 @@ html.c {
 }
 html.cz {
 --bgg: var(--bg-u2);
+--srv-3: #fff;
 }
 html.cy {
 --fg: #fff;

@@ -354,6 +356,7 @@ html.cy {
 --chk-fg: #fd0;

 --srv-1: #f00;
+--srv-3: #fff;
 --op-aa-bg: #fff;

 --u2-b1-bg: #f00;

@@ -793,6 +796,8 @@ html.y #path a:hover {
 }
 .logue {
 padding: .2em 0;
+position: relative;
+z-index: 1;
 }
 .logue.hidden,
 .logue:empty {

@@ -964,6 +969,9 @@ html.y #path a:hover {
 #ggrid>a.dir:before {
 content: '📂';
 }
+#ggrid>a.dir>span {
+color: var(--g-dfg);
+}
 #ggrid>a.au:before {
 content: '💾';
 }

@@ -1010,6 +1018,9 @@ html.np_open #ggrid>a.au:before {
 background: var(--g-sel-bg);
 border-color: var(--g-sel-b1);
 }
+#ggrid>a.sel>span {
+color: var(--g-sel-fg);
+}
 #ggrid>a.sel,
 #ggrid>a[tt].sel {
 border-top: 1px solid var(--g-fsel-b1);

@@ -1063,6 +1074,9 @@ html.np_open #ggrid>a.au:before {
 background: var(--bg-d3);
 box-shadow: -.2em .2em 0 var(--f-sel-sh), -.2em -.2em 0 var(--f-sel-sh);
 }
+#player {
+display: none;
+}
 #widget {
 position: fixed;
 font-size: 1.4em;
@@ -155,6 +155,7 @@
 sb_lg = "{{ sb_lg }}",
 lifetime = {{ lifetime }},
 turbolvl = {{ turbolvl }},
+idxh = {{ idxh }},
 frand = {{ frand|tojson }},
 u2sort = "{{ u2sort }}",
 have_emp = {{ have_emp|tojson }},

@@ -193,6 +193,7 @@ var Ls = {
 "ct_dots": "show hidden files (if server permits)",
 "ct_dir1st": "sort folders before files",
 "ct_readme": "show README.md in folder listings",
+"ct_idxh": "show index.html instead of folder listing",
 "ct_sbars": "show scrollbars",

 "cut_turbo": "the yolo button, you probably DO NOT want to enable this:$N$Nuse this if you were uploading a huge amount of files and had to restart for some reason, and want to continue the upload ASAP$N$Nthis replaces the hash-check with a simple <em>"does this have the same filesize on the server?"</em> so if the file contents are different it will NOT be uploaded$N$Nyou should turn this off when the upload is done, and then "upload" the same files again to let the client verify them",

@@ -259,6 +260,10 @@ var Ls = {
 "mm_e404": "Could not play audio; error 404: File not found.",
 "mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
 "mm_e5xx": "Could not play audio; server error ",
+"mm_nof": "not finding any more audio files nearby",
+"mm_hnf": "that song no longer exists",
+
+"im_hnf": "that image no longer exists",

 "f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
 "f_bigtxt": "this file is {0} MiB large -- really view as text?",

@@ -648,6 +653,7 @@ var Ls = {
 "ct_dots": "vis skjulte filer (gitt at serveren tillater det)",
 "ct_dir1st": "sorter slik at mapper kommer foran filer",
 "ct_readme": "vis README.md nedenfor filene",
+"ct_idxh": "vis index.html istedenfor fil-liste",
 "ct_sbars": "vis rullgardiner / skrollefelt",

 "cut_turbo": "forenklet befaring ved opplastning; bør sannsynlig <em>ikke</em> skrus på:$N$Nnyttig dersom du var midt i en svær opplastning som måtte restartes av en eller annen grunn, og du vil komme igang igjen så raskt som overhodet mulig.$N$Nnår denne er skrudd på så forenkles befaringen kraftig; istedenfor å utføre en trygg sjekk på om filene finnes på serveren i god stand, så sjekkes kun om <em>filstørrelsen</em> stemmer. Så dersom en korrupt fil skulle befinne seg på serveren allerede, på samme sted med samme størrelse og navn, så blir det <em>ikke oppdaget</em>.$N$Ndet anbefales å kun benytte denne funksjonen for å komme seg raskt igjennom selve opplastningen, for så å skru den av, og til slutt "laste opp" de samme filene én gang til -- slik at integriteten kan verifiseres",

@@ -714,6 +720,10 @@ var Ls = {
 "mm_e404": "Avspilling feilet: Fil ikke funnet.",
 "mm_e403": "Avspilling feilet: Tilgang nektet.\n\nKanskje du ble logget ut?\nPrøv å trykk F5 for å laste siden på nytt.",
 "mm_e5xx": "Avspilling feilet: ",
+"mm_nof": "finner ikke flere sanger i nærheten",
+"mm_hnf": "sangen finnes ikke lenger",
+
+"im_hnf": "bildet finnes ikke lenger",

 "f_chide": 'dette vil skjule kolonnen «{0}»\n\nfanen for "andre innstillinger" lar deg vise kolonnen igjen',
 "f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
@@ -964,6 +974,17 @@ ebi('widget').innerHTML = (
 ' <canvas id="pvol" width="288" height="38"></canvas>' +
 ' <canvas id="barpos"></canvas>' +
 ' <canvas id="barbuf"></canvas>' +
+'</div>' +
+'<div id="np_inf">' +
+' <img id="np_img"></span>' +
+' <span id="np_url"></span>' +
+' <span id="np_circle"></span>' +
+' <span id="np_album"></span>' +
+' <span id="np_tn"></span>' +
+' <span id="np_artist"></span>' +
+' <span id="np_title"></span>' +
+' <span id="np_pos"></span>' +
+' <span id="np_dur"></span>' +
 '</div>'
 );

@@ -1075,6 +1096,7 @@ ebi('op_cfg').innerHTML = (
 ' <a id="dotfiles" class="tgl btn" href="#" tt="' + L.ct_dots + '">dotfiles</a>\n' +
 ' <a id="dir1st" class="tgl btn" href="#" tt="' + L.ct_dir1st + '">📁 first</a>\n' +
 ' <a id="ireadme" class="tgl btn" href="#" tt="' + L.ct_readme + '">📜 readme</a>\n' +
+' <a id="idxh" class="tgl btn" href="#" tt="' + L.ct_idxh + '">htm</a>\n' +
 ' <a id="sbars" class="tgl btn" href="#" tt="' + L.ct_sbars + '">⟊</a>\n' +
 ' </div>\n' +
 '</div>\n' +

@@ -1270,6 +1292,7 @@ function set_files_html(html) {


 var ACtx = window.AudioContext || window.webkitAudioContext,
+noih = /[?&]v\b/.exec('' + location),
 hash0 = location.hash,
 mp;

@@ -1312,6 +1335,7 @@ var mpl = (function () {
 var r = {
 "pb_mode": (sread('pb_mode') || 'next').split('-')[0],
 "os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
+'traversals': 0,
 };
 bcfg_bind(r, 'preload', 'au_preload', true);
 bcfg_bind(r, 'fullpre', 'au_fullpre', false);
@@ -1392,10 +1416,17 @@ var mpl = (function () {
 };

 r.pp = function () {
+var adur, apos, playing = mp.au && !mp.au.paused;
+
+clmod(ebi('np_inf'), 'playing', playing);
+
+if (mp.au && isNum(adur = mp.au.duration) && isNum(apos = mp.au.currentTime) && apos >= 0)
+ebi('np_pos').textContent = s2ms(apos);
+
 if (!r.os_ctl)
 return;

-navigator.mediaSession.playbackState = mp.au && !mp.au.paused ? "playing" : "paused";
+navigator.mediaSession.playbackState = playing ? "playing" : "paused";
 };

 function setaufollow() {

@@ -1410,9 +1441,10 @@ var mpl = (function () {
 var np = get_np()[0],
 fns = np.file.split(' - '),
 artist = (np.circle && np.circle != np.artist ? np.circle + ' // ' : '') + (np.artist || (fns.length > 1 ? fns[0] : '')),
-tags = {
+title = np.title || fns.pop(),
-title: np.title || fns.pop()
+cover = '',
-};
+pcover = '',
+tags = { title: title };

 if (artist)
 tags.artist = artist;

@@ -1433,18 +1465,28 @@ var mpl = (function () {

 if (cover) {
 cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j';
+pcover = cover;
+
 var pwd = get_pwd();
 if (pwd)
-cover += '&pw=' + uricom_enc(pwd);
+pcover += '&pw=' + uricom_enc(pwd);

-tags.artwork = [{ "src": cover, type: "image/jpeg" }];
+tags.artwork = [{ "src": pcover, type: "image/jpeg" }];
 }
 }

+ebi('np_circle').textContent = np.circle || '';
+ebi('np_album').textContent = np.album || '';
+ebi('np_tn').textContent = np['.tn'] || '';
+ebi('np_artist').textContent = np.artist || (fns.length > 1 ? fns[0] : '');
+ebi('np_title').textContent = np.title || '';
+ebi('np_dur').textContent = np['.dur'] || '';
+ebi('np_url').textContent = get_vpath() + np.file.split('?')[0];
+ebi('np_img').setAttribute('src', cover); // dont give last.fm the pwd
+
 navigator.mediaSession.metadata = new MediaMetadata(tags);
-navigator.mediaSession.setActionHandler('play', playpause);
+navigator.mediaSession.setActionHandler('play', mplay);
-navigator.mediaSession.setActionHandler('pause', playpause);
+navigator.mediaSession.setActionHandler('pause', mpause);
 navigator.mediaSession.setActionHandler('seekbackward', r.os_seek ? function () { seek_au_rel(-10); } : null);
 navigator.mediaSession.setActionHandler('seekforward', r.os_seek ? function () { seek_au_rel(10); } : null);
 navigator.mediaSession.setActionHandler('previoustrack', prev_song);
@@ -1454,11 +1496,17 @@ var mpl = (function () {
 r.announce = announce;

 r.stop = function () {
-if (!r.os_ctl || !navigator.mediaSession.metadata)
+if (!r.os_ctl)
 return;

 navigator.mediaSession.metadata = null;
 navigator.mediaSession.playbackState = "paused";

+var hs = 'play pause seekbackward seekforward previoustrack nexttrack'.split(/ /g);
+for (var a = 0; a < hs.length; a++)
+navigator.mediaSession.setActionHandler(hs[a], null);
+
+navigator.mediaSession.setPositionState();
 };

 r.unbuffer = function (url) {

@@ -1498,6 +1546,7 @@ function MPlayer() {
 r.au2 = null;
 r.tracks = {};
 r.order = [];
+r.cd_pause = 0;

 var re_audio = have_acode && mpl.ac_oth ? re_au_all : re_au_native,
 trs = QSA('#files tbody tr');

@@ -1554,6 +1603,7 @@ function MPlayer() {
 r.ftid = -1;
 r.ftimer = null;
 r.fade_in = function () {
+r.nopause();
 r.fvol = 0;
 r.fdir = 0.025 * r.vol * (CHROME ? 1.5 : 1);
 if (r.au) {

@@ -1586,9 +1636,9 @@ function MPlayer() {
 r.au.pause();
 mpl.pp();

-var t = mp.au.currentTime - 0.8;
+var t = r.au.currentTime - 0.8;
 if (isNum(t))
-mp.au.currentTime = Math.max(t, 0);
+r.au.currentTime = Math.max(t, 0);
 }
 else if (r.fvol > r.vol)
 r.fvol = r.vol;

@@ -1634,8 +1684,14 @@ function MPlayer() {
 drop();
 });

-mp.au2.preload = "auto";
+r.nopause();
-mp.au2.src = mp.au2.rsrc = url;
+r.au2.onloadeddata = r.au2.onloadedmetadata = r.nopause;
+r.au2.preload = "auto";
+r.au2.src = r.au2.rsrc = url;
+};
+
+r.nopause = function () {
+r.cd_pause = Date.now();
 };
 }
@@ -1774,6 +1830,7 @@ function glossy_grad(can, h, s, l) {
 var pbar = (function () {
 var r = {},
 bau = null,
+html_txt = 'a',
 lastmove = 0,
 mousepos = 0,
 gradh = -1,

@@ -1943,6 +2000,16 @@ var pbar = (function () {
 pctx.strokeText(t2, xt2, yt);
 pctx.fillText(t1, xt1, yt);
 pctx.fillText(t2, xt2, yt);
+
+if (w && html_txt != t2) {
+ebi('np_pos').textContent = html_txt = t2;
+if (mpl.os_ctl)
+navigator.mediaSession.setPositionState({
+'duration': adur,
+'position': apos,
+'playbackRate': 1
+});
+}
 };

 window.addEventListener('resize', r.onresize);

@@ -2072,6 +2139,7 @@ function seek_au_sec(seek) {
 if (!isNum(seek))
 return;

+mp.nopause();
 mp.au.currentTime = seek;

 if (mp.au.paused)

@@ -2093,7 +2161,15 @@ function song_skip(n) {
 }
 function next_song(e) {
 ev(e);
-return song_skip(1);
+if (mp.order.length) {
+mpl.traversals = 0;
+return song_skip(1);
+}
+if (mpl.traversals++ < 5) {
+treectl.ls_cb = next_song;
+return tree_neigh(1);
+}
+toast.inf(10, L.mm_nof);
 }
 function prev_song(e) {
 ev(e);
@@ -2136,6 +2212,25 @@ function playpause(e) {
 };


+function mplay(e) {
+if (mp.au && !mp.au.paused)
+return;
+
+playpause(e);
+}
+
+
+function mpause(e) {
+if (mp.cd_pause > Date.now() - 100)
+return;
+
+if (mp.au && mp.au.paused)
+return;
+
+playpause(e);
+}


 // hook up the widget buttons
 (function () {
 ebi('bplay').onclick = playpause;
@@ -2281,12 +2376,14 @@ function start_actx() {
 }
 }

-var audio_eq = (function () {
+var afilt = (function () {
 var r = {
-"en": false,
+"eqen": false,
 "bands": [31.25, 62.5, 125, 250, 500, 1000, 2000, 4000, 8000, 16000],
 "gains": [4, 3, 2, 1, 0, 0, 1, 2, 3, 4],
 "filters": [],
+"filterskip": [],
+"plugs": [],
 "amp": 0,
 "chw": 1,
 "last_au": null,

@@ -2375,6 +2472,10 @@ var audio_eq = (function () {
 r.filters[a].disconnect();

 r.filters = [];
+r.filterskip = [];
+
+for (var a = 0; a < r.plugs.length; a++)
+r.plugs[a].unload();
+
 if (!mp)
 return;
@@ -2392,18 +2493,34 @@ var audio_eq = (function () {
 if (!actx)
 bcfg_set('au_eq', false);

-if (!actx || !mp.au || (!r.en && !mp.acs))
+var plug = false;
+for (var a = 0; a < r.plugs.length; a++)
+if (r.plugs[a].en)
+plug = true;
+
+if (!actx || !mp.au || (!r.eqen && !plug && !mp.acs))
 return;

 r.stop();
 mp.au.id = mp.au.id || Date.now();
 mp.acs = r.acst[mp.au.id] = r.acst[mp.au.id] || actx.createMediaElementSource(mp.au);

-if (!r.en) {
+if (r.eqen)
-mp.acs.connect(actx.destination);
+add_eq();
-return;
-}

+for (var a = 0; a < r.plugs.length; a++)
+if (r.plugs[a].en)
+r.plugs[a].load();
+
+for (var a = 0; a < r.filters.length; a++)
+if (!has(r.filterskip, a))
+r.filters[a].connect(a ? r.filters[a - 1] : actx.destination);
+
+mp.acs.connect(r.filters.length ?
+r.filters[r.filters.length - 1] : actx.destination);
+}
+
+function add_eq() {
 var min, max;
 min = max = r.gains[0];
 for (var a = 1; a < r.gains.length; a++) {
@@ -2434,9 +2551,6 @@ var audio_eq = (function () {
 fi.gain.value = r.amp + 0.94; // +.137 dB measured; now -.25 dB and almost bitperfect
 r.filters.push(fi);

-for (var a = r.filters.length - 1; a >= 0; a--)
-r.filters[a].connect(a > 0 ? r.filters[a - 1] : actx.destination);

 if (Math.round(r.chw * 25) != 25) {
 var split = actx.createChannelSplitter(2),
 merge = actx.createChannelMerger(2),

@@ -2461,11 +2575,10 @@ var audio_eq = (function () {
 split.connect(lg2, 0);
 split.connect(rg1, 1);
 split.connect(rg2, 1);
+r.filterskip.push(r.filters.length);
 r.filters.push(split);
 mp.acs.channelCountMode = 'explicit';
 }

-mp.acs.connect(r.filters[r.filters.length - 1]);
 }

 function eq_step(e) {

@@ -2560,7 +2673,7 @@ var audio_eq = (function () {
 txt[a].onkeydown = eq_keydown;
 }

-bcfg_bind(r, 'en', 'au_eq', false, r.apply);
+bcfg_bind(r, 'eqen', 'au_eq', false, r.apply);

 r.draw();
 return r;
@@ -2582,7 +2695,7 @@ function play(tid, is_ev, seek) {
 if ((tn + '').indexOf('f-') === 0) {
 tn = mp.order.indexOf(tn);
 if (tn < 0)
-return;
+return toast.warn(10, L.mm_hnf);
 }

 if (tn >= mp.order.length) {

@@ -2642,7 +2755,7 @@ function play(tid, is_ev, seek) {
 else
 mp.au.src = mp.au.rsrc = url;

-audio_eq.apply();
+afilt.apply();

 setTimeout(function () {
 mpl.unbuffer(url);

@@ -2665,6 +2778,7 @@ function play(tid, is_ev, seek) {
 scroll2playing();

 try {
+mp.nopause();
 mp.au.play();
 if (mp.au.paused)
 autoplay_blocked(seek);

@@ -2850,6 +2964,9 @@ function eval_hash() {
 clearInterval(t);
 baguetteBox.urltime(ts);
 var im = QS('#ggrid a[ref="' + id + '"]');
+if (!im)
+return toast.warn(10, L.im_hnf);
+
 im.click();
 im.scrollIntoView();
 }, 50);
@@ -4126,6 +4243,21 @@ var thegrid = (function () {
 ev(e);
 }

+r.imshow = function (url) {
+var sel = '#ggrid>a'
+if (!thegrid.en) {
+thegrid.bagit('#files');
+sel = '#files a[id]';
+}
+var ims = QSA(sel);
+for (var a = 0, aa = ims.length; a < aa; a++) {
+var iu = ims[a].getAttribute('href').split('?')[0].split('/').slice(-1)[0];
+if (iu == url)
+return ims[a].click();
+}
+baguetteBox.hide();
+};
+
 r.loadsel = function () {
 if (r.dirty)
 return;

@@ -4265,19 +4397,19 @@ var thegrid = (function () {
 }

 r.dirty = false;
-r.bagit();
+r.bagit('#ggrid');
 r.loadsel();
 setTimeout(r.tippen, 20);
 }

-r.bagit = function () {
+r.bagit = function (isrc) {
 if (!window.baguetteBox)
 return;

 if (r.bbox)
 baguetteBox.destroy();

-r.bbox = baguetteBox.run('#ggrid', {
+r.bbox = baguetteBox.run(isrc, {
 captions: function (g) {
 var idx = -1,
 h = '' + g;
@@ -4852,7 +4984,7 @@ document.onkeydown = function (e) {

 var html = mk_files_header(tagord), seen = {};
 html.push('<tbody>');
-html.push('<tr class="srch_hdr"><td>-</td><td><a href="#" id="unsearch"><big style="font-weight:bold">[❌] ' + L.sl_close + '</big></a> -- ' + L.sl_hits.format(res.hits.length) + (res.hits.length == cap ? ' -- <a href="#" id="moar">' + L.sl_moar + '</a>' : '') + '</td></tr>');
+html.push('<tr class="srch_hdr"><td>-</td><td><a href="#" id="unsearch"><big style="font-weight:bold">[❌] ' + L.sl_close + '</big></a> -- ' + L.sl_hits.format(res.hits.length) + (res.trunc ? ' -- <a href="#" id="moar">' + L.sl_moar + '</a>' : '') + '</td></tr>');

 for (var a = 0; a < res.hits.length; a++) {
 var r = res.hits[a],
@@ -4966,6 +5098,7 @@ var treectl = (function () {
 treesz = clamp(icfg_get('treesz', 16), 10, 50);

 bcfg_bind(r, 'ireadme', 'ireadme', true);
+bcfg_bind(r, 'idxh', 'idxh', idxh, setidxh);
 bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
 bcfg_bind(r, 'dots', 'dotfiles', false, function (v) {
 r.goto(get_evpath());

@@ -4990,6 +5123,16 @@ var treectl = (function () {
 }
 setwrap(r.wtree);

+function setidxh(v) {
+if (!v == !/\bidxh=y\b/.exec('' + document.cookie))
+return;
+
+var xhr = new XHR();
+xhr.open('GET', SR + '/?setck=idxh=' + (v ? 'y' : 'n'), true);
+xhr.send();
+}
+setidxh(r.idxh);
+
 r.entree = function (e, nostore) {
 ev(e);
 entreed = true;

@@ -5418,6 +5561,9 @@ var treectl = (function () {
 return;
 }

+if (r.chk_index_html(this.top, res))
+return;
+
 for (var a = 0; a < res.files.length; a++)
 if (res.files[a].tags === undefined)
 res.files[a].tags = {};

@@ -5465,6 +5611,17 @@ var treectl = (function () {
 }
 }

+r.chk_index_html = function (top, res) {
+if (!r.idxh || !res || !res.files || noih)
+return;
+
+for (var a = 0; a < res.files.length; a++)
+if (/^index.html?(\?|$)/i.exec(res.files[a].href)) {
+window.location = vjoin(top, res.files[a].href);
+return true;
+}
+};
+
 r.gentab = function (top, res) {
 var nodes = res.dirs.concat(res.files),
 html = mk_files_header(res.taglist),
@@ -5585,14 +5742,18 @@ var treectl = (function () {
 qsr('#bbsw');
 if (ls0 === null) {
 var xhr = new XHR();
-xhr.open('GET', SR + '/?am_js', true);
+xhr.open('GET', SR + '/?setck=js=y', true);
 xhr.send();

 r.ls_cb = showfile.addlinks;
 return r.reqls(get_evpath(), false);
 }

-r.gentab(get_evpath(), ls0);
+var top = get_evpath();
+if (r.chk_index_html(top, ls0))
+return;
+
+r.gentab(top, ls0);
 pbar.onresize();
 vbar.onresize();
 showfile.addlinks();
@@ -6621,22 +6782,27 @@ var globalcss = (function () {

 var dcs = document.styleSheets;
 for (var a = 0; a < dcs.length; a++) {
-var base = dcs[a].href,
-ds = dcs[a].cssRules;
-if (!base)
-continue;
-base = base.replace(/[^/]+$/, '');
-for (var b = 0; b < ds.length; b++) {
-var css = ds[b].cssText.split(/\burl\(/g);
-ret += css[0];
-for (var c = 1; c < css.length; c++) {
-var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : '';
-ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
-css[c].slice(delim ? 1 : 0);
-}
-ret += '\n';
+var ds, base = '';
+try {
+base = dcs[a].href;
+if (!base)
+continue;
+
+ds = dcs[a].cssRules;
+base = base.replace(/[^/]+$/, '');
+for (var b = 0; b < ds.length; b++) {
+var css = ds[b].cssText.split(/\burl\(/g);
+ret += css[0];
+for (var c = 1; c < css.length; c++) {
+var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : '';
+ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
+css[c].slice(delim ? 1 : 0);
+}
+ret += '\n';
+}
+catch (ex) {
+console.log('could not read css', a, base);
 }
 }
 return ret;
@@ -6722,6 +6888,7 @@ function show_md(md, name, div, url, depth) {

 els[a].setAttribute('href', '#md-' + href.slice(1));
 }
+md_th_set();
 set_tabindex();
 var hash = location.hash;
 if (hash.startsWith('#md-'))

@@ -6800,6 +6967,7 @@ function sandbox(tgt, rules, cls, html) {
 'window.onblur=function(){say("ilost #' + tid + '")};' +
 'var el="' + want + '"&&ebi("' + want + '");' +
 'if(el)say("iscroll #' + tid + ' "+el.offsetTop);' +
+'md_th_set();' +
 (cls == 'mdo' && md_plug.post ?
 'const x={' + md_plug.post + '};' +
 'if(x.render)x.render(ebi("b"));' +

@@ -6832,6 +7000,9 @@ window.addEventListener("message", function (e) {
 else if (t[0] == 'igot' || t[0] == 'ilost') {
 clmod(QS(t[1] + '>iframe'), 'focus', t[0] == 'igot');
 }
+else if (t[0] == 'imshow') {
+thegrid.imshow(e.data.slice(7));
+}
 } catch (ex) {
 console.log('msg-err: ' + ex);
 }
@@ -7106,8 +7277,8 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {

 function reload_mp() {
 if (mp && mp.au) {
-if (audio_eq)
+if (afilt)
-audio_eq.stop();
+afilt.stop();

 mp.au.pause();
 mp.au = null;

@@ -7119,8 +7290,8 @@ function reload_mp() {
 plays[a].parentNode.innerHTML = '-';

 mp = new MPlayer();
-if (audio_eq)
+if (afilt)
-audio_eq.acst = {};
+afilt.acst = {};

 setTimeout(pbar.onresize, 1);
 }
@@ -231,11 +231,11 @@ function convert_markdown(md_text, dest_dom) {
 var nodes = md_dom.getElementsByTagName('a');
 for (var a = nodes.length - 1; a >= 0; a--) {
 var href = nodes[a].getAttribute('href');
-var txt = nodes[a].textContent;
+var txt = nodes[a].innerHTML;

 if (!txt)
 nodes[a].textContent = href;
-else if (href !== txt)
+else if (href !== txt && !nodes[a].className)
 nodes[a].className = 'vis';
 }
@@ -43,10 +43,9 @@
 <h1>WebDAV</h1>

 <div class="os win">
-<p><em>note: rclone-FTP is a bit faster, so {% if args.ftp or args.ftps %}try that first{% else %}consider enabling FTP in server settings{% endif %}</em></p>
 <p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p>
 <pre>
-rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
+rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
 </pre>
 {% if s %}

@@ -64,9 +63,14 @@
 yum install davfs2
 {% if accs %}printf '%s\n' <b>{{ pw }}</b> k | {% endif %}mount -t davfs -ouid=1000 http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b>
 </pre>
-<p>or you can use rclone instead, which is much slower but doesn't require root:</p>
+<p>make it automount on boot:</p>
 <pre>
-rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
+printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets
+printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab
+</pre>
+<p>or you can use rclone instead, which is much slower but doesn't require root (plus it keeps lastmodified on upload):</p>
+<pre>
+rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
 rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
 </pre>
 {% if s %}
@@ -451,6 +451,20 @@ html.y textarea:focus {
 padding: .2em .5em;
 border: .12em solid #aaa;
 }
+.mdo .mdth,
+.mdo .mdthl,
+.mdo .mdthr {
+margin: .5em .5em .5em 0;
+}
+.mdthl {
+float: left;
+}
+.mdthr {
+float: right;
+}
+hr {
+clear: both;
+}

 @media screen {
 .mdo {
@@ -114,10 +114,10 @@ function up2k_flagbus() {
 do_take(now);
 return;
 }
-if (flag.owner && now - flag.owner[1] > 5000) {
+if (flag.owner && now - flag.owner[1] > 12000) {
 flag.owner = null;
 }
-if (flag.wants && now - flag.wants[1] > 5000) {
+if (flag.wants && now - flag.wants[1] > 12000) {
 flag.wants = null;
 }
 if (!flag.owner && !flag.wants) {

@@ -772,6 +772,7 @@ function fsearch_explain(n) {

 function up2k_init(subtle) {
 var r = {
+"tact": Date.now(),
 "init_deps": init_deps,
 "set_fsearch": set_fsearch,
 "gotallfiles": [gotallfiles] // hooks
@@ -1647,8 +1648,14 @@ function up2k_init(subtle) {
 running = true;
 while (true) {
 var now = Date.now(),
+blocktime = now - r.tact,
 is_busy = st.car < st.files.length;
+
+if (blocktime > 2500)
+console.log('main thread blocked for ' + blocktime);
+
+r.tact = now;

 if (was_busy && !is_busy) {
 for (var a = 0; a < st.files.length; a++) {
 var t = st.files[a];

@@ -1788,6 +1795,15 @@ function up2k_init(subtle) {
 })();

 function uptoast() {
+if (st.busy.handshake.length)
+return;
+
+for (var a = 0; a < st.files.length; a++) {
+var t = st.files[a];
+if (t.want_recheck && !t.rechecks)
+return;
+}
+
 var sr = uc.fsearch,
 ok = pvis.ctr.ok,
 ng = pvis.ctr.ng,
@@ -2043,6 +2059,8 @@ function up2k_init(subtle) {
 nbusy++;
 reading++;
 nchunk++;
+if (Date.now() - up2k.tact > 1500)
+tasker();
 }

 function onmsg(d) {

@@ -2373,16 +2391,17 @@ function up2k_init(subtle) {
 }

 var err_pend = rsp.indexOf('partial upload exists at a different') + 1,
+err_srcb = rsp.indexOf('source file busy; please try again') + 1,
 err_plug = rsp.indexOf('upload blocked by x') + 1,
 err_dupe = rsp.indexOf('upload rejected, file already exists') + 1;

-if (err_pend || err_plug || err_dupe) {
+if (err_pend || err_srcb || err_plug || err_dupe) {
 err = rsp;
 ofs = err.indexOf('\n/');
 if (ofs !== -1) {
 err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' ');
 }
-if (!t.rechecks && err_pend) {
+if (!t.rechecks && (err_pend || err_srcb)) {
 t.rechecks = 0;
 t.want_recheck = true;
 }
@@ -112,12 +112,13 @@ if ((document.location + '').indexOf(',rej,') + 1)

 try {
 console.hist = [];
+var CMAXHIST = 100;
 var hook = function (t) {
 var orig = console[t].bind(console),
 cfun = function () {
 console.hist.push(Date.now() + ' ' + t + ': ' + Array.from(arguments).join(', '));
-if (console.hist.length > 100)
+if (console.hist.length > CMAXHIST)
-console.hist = console.hist.slice(50);
+console.hist = console.hist.slice(CMAXHIST / 2);

 orig.apply(console, arguments);
 };
@@ -631,6 +632,29 @@ function vsplit(vp) {
 }


+function vjoin(p1, p2) {
+if (!p1)
+p1 = '';
+
+if (!p2)
+p2 = '';
+
+if (p1.endsWith('/'))
+p1 = p1.slice(0, -1);
+
+if (p2.startsWith('/'))
+p2 = p2.slice(1);
+
+if (!p1)
+return p2;
+
+if (!p2)
+return p1;
+
+return p1 + '/' + p2;
+}
+
+
 function uricom_enc(txt, do_fb_enc) {
 try {
 return encodeURIComponent(txt);
@@ -1182,13 +1206,13 @@ var tt = (function () {
 r.th.style.top = (e.pageY + 12 * sy) + 'px';
 };

-if (IPHONE) {
+if (TOUCH) {
 var f1 = r.show,
 f2 = r.hide,
 q = [];

 // if an onclick-handler creates a new timer,
-// iOS 13.1.2 delays the entire handler by up to 401ms,
+// webkits delay the entire handler by up to 401ms,
 // win by using a shared timer instead

 timer.add(function () {
@@ -1578,6 +1602,14 @@ function load_md_plug(md_text, plug_type, defer) {
 if (defer)
 md_plug[plug_type] = null;

+if (plug_type == 'pre')
+try {
+md_text = md_thumbs(md_text);
+}
+catch (ex) {
+toast.warn(30, '' + ex);
+}
+
 if (!have_emp)
 return md_text;
@@ -1618,6 +1650,47 @@ function load_md_plug(md_text, plug_type, defer) {

 return md;
 }
+function md_thumbs(md) {
+if (!/(^|\n)<!-- th -->/.exec(md))
+return md;
+
+// `!th[flags](some.jpg)`
+// flags: nothing or "l" or "r"
+
+md = md.split(/!th\[/g);
+for (var a = 1; a < md.length; a++) {
+if (!/^[^\]!()]*\]\([^\][!()]+\)/.exec(md[a])) {
+md[a] = '!th[' + md[a];
+continue;
+}
+
+var o1 = md[a].indexOf(']('),
+o2 = md[a].indexOf(')', o1),
+alt = md[a].slice(0, o1),
+flags = alt.split(','),
+url = md[a].slice(o1 + 2, o2),
+float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : '';
+
+if (!/[?&]cache/.exec(url))
+url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=i';
+
+md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1);
+}
+return md.join('');
+}
+function md_th_set() {
+var els = QSA('.mdth');
+for (var a = 0, aa = els.length; a < aa; a++)
+els[a].onclick = md_th_click;
+}
+function md_th_click(e) {
+ev(e);
+var url = this.getAttribute('href').split('?')[0];
+if (window.sb_md)
+window.parent.postMessage("imshow " + url, "*");
+else
+thegrid.imshow(url);
+}


 var svg_decl = '<?xml version="1.0" encoding="UTF-8"?>\n';
@@ -1,3 +1,185 @@
|
|||||||
|
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2023-0401-2112 `v1.6.11` not joke
|
||||||
|
|
||||||
|
## new features
|
||||||
|
* new event-hook: [exif stripper](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/image-noexif.py)
|
||||||
|
* [markdown thumbnails](https://a.ocv.me/pub/demo/pics-vids/README.md?v) -- see [readme](https://github.com/9001/copyparty#markdown-viewer)
|
||||||
|
* soon: support for [web-scrobbler](https://github.com/web-scrobbler/web-scrobbler/) - the [Last.fm](https://www.last.fm/user/tripflag) browser extension
|
||||||
|
* will update here + readme with more info when [the v3](https://github.com/web-scrobbler/web-scrobbler/projects/5) is out
|
||||||
|
|
||||||
|
## bugfixes
|
||||||
|
* more sqlite query-planner twiddling
|
||||||
|
* deleting files is MUCH faster now, and uploads / bootup might be a bit better too
|
||||||
|
* webdav optimizations / compliance
|
||||||
|
* should make some webdav clients run faster than before
|
||||||
|
* in very related news, the webdav-client in [rclone](https://github.com/rclone/rclone/) v1.63 ([currently beta](https://beta.rclone.org/?filter=latest)) will be ***FAST!***
|
||||||
|
* does cool stuff such as [bidirectional sync](https://github.com/9001/copyparty#folder-sync) between copyparty and a local folder
|
||||||
|
* [bpm detector](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/audio-bpm.py) is a bit more accurate
|
||||||
|
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) / commandline uploader: better error messages if something goes wrong
|
||||||
|
* readme rendering could fail in firefox if certain addons were installed (not sure which)
|
||||||
|
* event-hooks: more accurate usage examples
|
||||||
|
|
||||||
|
## other changes
|
||||||
|
* @chinponya automated the prismjs build step (thx!)
|
||||||
|
* updated some js deps (markedjs, codemirror)
|
||||||
|
* copyparty.exe: updated Pillow to 9.5.0
|
||||||
|
* and finally [the joke](https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/rave.js) (looks [like this](https://cd.ocv.me/b/d2/d21/#af-9b927c42))
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2023-0320-2156 `v1.6.10` rclone sync
|
||||||
|
|
||||||
|
## new features
|
||||||
|
* [iPhone "app"](https://github.com/9001/copyparty#ios-shortcuts) (upload shortcut) -- thanks @Daedren !
|
||||||
|
* can strip exif, upload files, pics, vids, links, clipboard
|
||||||
|
* can download links and rehost the target file on your server
|
||||||
|
* support `rclone sync` to [sync folders](https://github.com/9001/copyparty#folder-sync) to/from copyparty
|
||||||
|
* let webdav clients set lastmodified times during upload
|
||||||
|
* let webdav clients replace files during upload
|
||||||
|
|
||||||
|
## bugfixes
|
||||||
|
* [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh): FFmpeg transcoding was slow because there was no `/dev/urandom`
|
||||||
|
* iphones would fail to play *some* songs (low-bitrate and/or shorter than ~7 seconds)
|
||||||
|
* due to either an iOS bug or an FFmpeg bug in the caf remuxing idk
|
||||||
|
* fixed by mixing in white noise into songs if an iPhone asks for them
|
||||||
|
* small correction in the docker readme regarding rootless podman
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2023-0316-2106 `v1.6.9` index.html
|
||||||
|
|
||||||
|
## new features
|
||||||
|
* option to show `index.html` instead of the folder listing
|
||||||
|
* arg `--ih` makes it default-enabled
|
||||||
|
* clients can enable/disable it in the `[⚙️]` settings tab
|
||||||
|
* url-param `?v` skips it for a particular folder
|
||||||
|
* faster folder-thumbnail validation on startup (mostly on conventional HDDs)
|
||||||
|
|
||||||
|
## bugfixes
|
||||||
|
* "load more" button didn't always show up when search results got truncated
|
||||||
|
* ux: tooltips could block buttons on android
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2023-0312-1610 `v1.6.8` folder thumbs
|
||||||
|
|
||||||
|
* read-only demo server at https://a.ocv.me/pub/demo/
|
||||||
|
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) ╱ [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) ╱ [client testbed](https://cd.ocv.me/b/)
|
||||||
|
|
||||||
|
## new features
|
||||||
|
* folder thumbnails are indexed in the db
|
||||||
|
* now supports non-lowercase names (`Cover.jpg`, `Folder.JPG`)
|
||||||
|
* folders without a specific cover/folder image will show the first pic inside
|
||||||
|
* when audio playback continues into an empty folder, keep trying for a bit
|
||||||
|
* add no-index hints (google etc) in basic-browser HTML (`?b`, `?b=u`)
|
||||||
|
* [commandline uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) supports long filenames on win7
|
||||||
|
|
||||||
|
## bugfixes
|
||||||
|
* rotated logfiles didn't get xz compressed
|
||||||
|
* image-gallery links pointing to a deleted image shows an error instead of a crashpage
|
||||||
|
|
||||||
|
## other changes
|
||||||
|
* folder thumbnails have purple text to differentiate from files
|
||||||
|
* `copyparty32.exe` starts 30% faster (but is 6% larger)
|
||||||
|
|
||||||
|
----
|
||||||
|
|
||||||
|
# what to download?
|
||||||
|
| download link | is it good? | description |
|
||||||
|
| -- | -- | -- |
|
||||||
|
| **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** | ✅ the best 👍 | runs anywhere! only needs python |
|
||||||
|
| [a docker image](https://github.com/9001/copyparty/blob/hovudstraum/scripts/docker/README.md) | it's ok | good if you prefer docker 🐋 |
|
||||||
|
| [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) | ⚠️ [acceptable](https://github.com/9001/copyparty#copypartyexe) | for [win8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png) or later; built-in thumbnailer |
|
||||||
|
| [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe) | ⚠️ acceptable | [CLI uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) as a win7+ exe ([video](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm)) |
|
||||||
|
| [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) | ⛔️ [dangerous](https://github.com/9001/copyparty#copypartyexe) | for [win7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png) -- never expose to the internet! |
|
||||||
|
| [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) | ⛔️ dangerous | runs on [64bit WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png), otherwise useless |
|
||||||
|
|
||||||
|
* except for [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe), all of the options above are equivalent
|
||||||
|
* the zip and tar.gz files below are just source code
|
||||||
|
* python packages are available at [PyPI](https://pypi.org/project/copyparty/#files)
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
|
||||||
|
# 2023-0305-2018 `v1.6.7` fix no-dedup + add up2k.exe
|
||||||
|
|
||||||
|
## new features
|
||||||
|
* controlpanel-connect: add example for webdav automount
|
||||||
|
|
||||||
|
## bugfixes
|
||||||
|
* fix a race which, in worst case (but unlikely on linux), **could cause data loss**
|
||||||
|
* could only happen if `--no-dedup` or volflag `copydupes` was set (**not** default)
|
||||||
|
* if two identical files were uploaded at the same time, there was a small chance that one of the files would become empty
|
||||||
|
* check if you were affected by doing a search for zero-byte files using either of the following:
|
||||||
|
* https://127.0.0.1:3923/#q=size%20%3D%200
|
||||||
|
* `find -type f -size 0`
|
||||||
|
* let me know if you lost something important and had logging enabled!
|
||||||
|
* ftp: mkdir can do multiple levels at once (support filezilla)
|
||||||
|
* fix flickering toast on upload finish
|
||||||
|
* `[💤]` (upload-baton) could disengage if chrome decides to pause the background tab for 10sec (which it sometimes does)
|
||||||
|
|
||||||
|
----
|
||||||
|
|
||||||
|
## introducing [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe)
|
||||||
|
|
||||||
|
the commandline up2k upload / filesearch client, now as a standalone windows exe
|
||||||
|
* based on python 3.7 so it runs on 32bit windows7 or anything newer
|
||||||
|
* *no https support* (saves space + the python3.7 openssl is getting old)
|
||||||
|
* built from b39ff92f34e3fca389c78109d20d5454af761f8e so it can do long filepaths and mojibake
|
||||||
|
|
||||||
|
----
|
||||||
|
|
||||||
|
⭐️ **you probably want [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) below;**
|
||||||
|
the exe is [not recommended](https://github.com/9001/copyparty#copypartyexe) for longterm use
|
||||||
|
and the zip and tar.gz files are source code
|
||||||
|
(python packages are available at [PyPI](https://pypi.org/project/copyparty/#files))
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

# 2023-0226-2030 `v1.6.6` r 2 0 0

two hundred releases wow
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) ╱ [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) ╱ [client testbed](https://cd.ocv.me/b/)
* currently fighting a ground fault so the demo server will be unreliable for a while

## new features
* more docker containers! now runs on x64, x32, aarch64, armhf, ppc64, s390x
  * pls let me know if you actually run copyparty on an IBM mainframe 👍
* new [event hook](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks) type `xiu` runs just once for all recent uploads
  * example hook [xiu-sha.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/xiu-sha.py) generates sha512 checksum files
* new arg `--rsp-jtr` simulates connection jitter
* copyparty.exe integrity selftest
* ux:
  * return to previous page after logging in
  * show a warning on the login page if you're not using https
  * freebsd: detect `fetch` and return the [colorful sortable plaintext](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png) listing

## bugfixes
* permit replacing empty files only during a `--blank-wt` grace period
* lifetimes: keep upload-time when a size/mtime change triggers a reindex
* during cleanup after an unlink, never rmdir the entire volume
* rescan button in the controlpanel required volumes to be e2ds
* dupes could get indexed with the wrong mtime
  * only affected the search index; the filesystem got the right one
* ux: search results could include the same hit twice in case of overlapping volumes
* ux: upload UI would remain expanded permanently after visiting a huge tab
* ftp: return proper error messages when client does something illegal
* ie11: support the back button

## other changes
* [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) replaces copyparty64.exe -- now built for 64-bit windows 10
  * **on win10 it just works** -- on win8 it needs [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145) -- no win7 support
  * has the latest security patches, but sfx.py is still better for long-term use
  * has pillow and mutagen; can make thumbnails and parse/index media
* [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is the old win7-compatible, dangerously-insecure edition

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

# 2023-0212-1411 `v1.6.5` windows smb fix + win10.exe
@@ -4,6 +4,7 @@
* [future plans](#future-plans) - some improvement ideas
* [design](#design)
  * [up2k](#up2k) - quick outline of the up2k protocol
  * [why not tus](#why-not-tus) - I didn't know about [tus](https://tus.io/)
  * [why chunk-hashes](#why-chunk-hashes) - a single sha512 would be better, right?
* [http api](#http-api)
  * [read](#read)
@@ -66,6 +67,13 @@ regarding the frequent server log message during uploads;
* on this http connection, `2.77 GiB` transferred, `102.9 MiB/s` average, `948` chunks handled
* client says `4` uploads OK, `0` failed, `3` busy, `1` queued, `10042 MiB` total size, `7198 MiB` and `00:01:09` left

## why not tus

I didn't know about [tus](https://tus.io/) when I made this, but:
* up2k has the advantage that it supports parallel uploading of non-contiguous chunks straight into the final file -- [tus does a merge at the end](https://tus.io/protocols/resumable-upload.html#concatenation) which is slow and taxing on the server HDD / filesystem (unless i'm misunderstanding)
* up2k has the slight disadvantage of requiring the client to hash the entire file before an upload can begin, but this has the benefit of immediately skipping duplicate files
  * and the hashing happens in a separate thread anyways so it's usually not a bottleneck

## why chunk-hashes

a single sha512 would be better, right?
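as a rough illustration of what "chunk-hashes" means in practice -- this is **not** the exact chunksize, hash truncation or encoding that up2k uses, just a minimal sketch of hashing a file in independent pieces so each piece can be uploaded, verified, and deduped on its own:

```python
# minimal sketch; 16 MiB chunks and plain hex sha512 are assumptions
# for the example, the real up2k client picks its own chunksize/encoding
import hashlib

CHUNK = 16 * 1024 * 1024

def chunk_hashes(path):
    """hash a file in fixed-size chunks so the chunks can be
    uploaded in any order and verified/skipped individually"""
    ret = []
    with open(path, "rb") as f:
        while True:
            buf = f.read(CHUNK)
            if not buf:
                break
            ret.append(hashlib.sha512(buf).hexdigest())
    return ret
```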
4
docs/examples/README.md
Normal file
@@ -0,0 +1,4 @@
copyparty server config examples

[windows.md](windows.md) -- running copyparty as a service on windows

116
docs/examples/windows.md
Normal file
@@ -0,0 +1,116 @@
# running copyparty on windows

this is a complete example / quickstart for running copyparty on windows, optionally as a service (autostart on boot)

you will definitely need either [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (comfy, portable, more features) or [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) (smaller, safer)

* if you decided to grab `copyparty-sfx.py` instead of the exe you will also need to install the ["Latest Python 3 Release"](https://www.python.org/downloads/windows/)

then you probably want to download [FFmpeg](https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip) and put `ffmpeg.exe` and `ffprobe.exe` in your PATH (so for example `C:\Windows\System32\`) -- this enables thumbnails, audio transcoding, and making music metadata searchable


## the config file

open up notepad and save the following as `c:\users\you\documents\party.conf` (for example)

```yaml
[global]
  lo: c:\users\you\logs\cpp-%Y-%m%d.xz  # log to file
  e2dsa, e2ts, no-dedup, z              # sets 4 flags; see expl.
  p: 80, 443                            # listen on ports 80 and 443, not 3923
  theme: 2                              # default theme: protonmail-monokai
  lang: nor                             # default language: viking

[accounts]                # usernames and passwords
  kevin: shangalabangala  # kevin's password

[/]               # create a volume available at /
  c:\pub          # sharing this filesystem location
  accs:           # and set permissions:
    r: *          # everyone can read/download files,
    rwmd: kevin   # kevin can read/write/move/delete

[/inc]            # create another volume at /inc
  c:\pub\inc      # sharing this filesystem location
  accs:           # permissions:
    w: *          # everyone can upload, but not browse
    rwmd: kevin   # kevin is admin here too

[/music]          # and a third volume at /music
  ~/music         # which shares c:\users\you\music
  accs:
    r: *
    rwmd: kevin
```

### config explained: [global]

the `[global]` section accepts any config parameters you can see when running copyparty (either the exe or the sfx.py) with `--help`, so this is the same as running copyparty with arguments `--lo c:\users\you\logs\copyparty-%Y-%m%d.xz -e2dsa -e2ts --no-dedup -z -p 80,443 --theme 2 --lang nor` (the full command is repeated as a copy-pasteable example below this list)

* `lo: c:\users\you\logs\cpp-%Y-%m%d.xz` writes compressed logs (the compression will make them delayed)
  * sorry that `~/logs/` doesn't work currently, good oversight
* `e2dsa` enables the upload deduplicator and file indexer, which enables searching
* `e2ts` enables music metadata indexing, making albums / titles etc. searchable too
* `no-dedup` writes full dupes to disk instead of symlinking, since lots of windows software doesn't handle symlinks well
  * but the improved upload speed from `e2dsa` is not affected
* `z` enables zeroconf, making the server available at `http://HOSTNAME.local/` from any other machine in the LAN
* `p: 80,443` listens on the ports `80` and `443` instead of the default `3923`
* `lang: nor` sets default language to viking
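for reference, the equivalent commandline in full (same settings as the `[global]` section above, just without a config file):

```batch
copyparty.exe --lo c:\users\you\logs\cpp-%Y-%m%d.xz -e2dsa -e2ts --no-dedup -z -p 80,443 --theme 2 --lang nor
```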
### config explained: [accounts]

the `[accounts]` section defines all the user accounts, which can then be referenced when granting people access to the different volumes


### config explained: volumes

then we create three volumes, one at `/`, one at `/inc`, and one at `/music`
* `/` and `/music` are readable without requiring people to login (`r: *`) but you need to login as kevin to write/move/delete files (`rwmd: kevin`)
* anyone can upload to `/inc` but you must be logged in as kevin to see the files inside


## run copyparty

to test your config it's best to just run copyparty in a console to watch the output:

```batch
copyparty.exe -c party.conf
```

or if you wanna use `copyparty-sfx.py` instead of the exe (understandable),

```batch
%localappdata%\programs\python\python311\python.exe copyparty-sfx.py -c party.conf
```

(please adjust `python311` to match the python version you installed, i'm not good enough at windows to make that bit generic)

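alternatively, if the python.org installer also gave you the `py` launcher (it does unless you unticked it during install), this should work without caring about the exact version folder:

```batch
py -3 copyparty-sfx.py -c party.conf
```
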
## run it as a service

to run this as a service you need [NSSM](https://nssm.cc/ci/nssm-2.24-101-g897c7ad.zip), so put the exe somewhere in your PATH

then either do this for `copyparty.exe`:
```batch
nssm install cpp %homedrive%%homepath%\downloads\copyparty.exe -c %homedrive%%homepath%\documents\party.conf
```

or do this for `copyparty-sfx.py`:
```batch
nssm install cpp %localappdata%\programs\python\python311\python.exe %homedrive%%homepath%\downloads\copyparty-sfx.py -c %homedrive%%homepath%\documents\party.conf
```

then after creating the service, modify it so it runs with your own windows account (so file permissions don't get wonky and paths expand as expected):
```batch
nssm set cpp ObjectName .\yourAccountName yourWindowsPassword
nssm start cpp
```

and that's it, all good

if it doesn't start, enable stderr logging so you can see what went wrong:
```batch
nssm set cpp AppStderr %homedrive%%homepath%\logs\cppsvc.err
nssm set cpp AppStderrCreationDisposition 2
```
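and if you later want to check on the service or get rid of it, nssm can do that too (assuming the service name `cpp` from above):

```batch
nssm status cpp
nssm remove cpp confirm
```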
@@ -14,6 +14,10 @@ when server is on another machine (1gbit LAN),

# creating the config file

the copyparty "connect" page at `/?hc` (so for example http://127.0.0.1:3923/?hc) will generate commands to autoconfigure rclone for your server

**if you prefer to configure rclone manually, continue reading:**

replace `hunter2` with your password, or remove the `hunter2` lines if you allow anonymous access

@@ -22,14 +26,16 @@ replace `hunter2` with your password, or remove the `hunter2` lines if you allow
(
echo [cpp-rw]
echo type = webdav
echo vendor = owncloud
echo url = http://127.0.0.1:3923/
echo headers = Cookie,cppwd=hunter2
echo pacer_min_sleep = 0.01ms
echo(
echo [cpp-ro]
echo type = http
echo url = http://127.0.0.1:3923/
echo headers = Cookie,cppwd=hunter2
echo pacer_min_sleep = 0.01ms
) > %userprofile%\.config\rclone\rclone.conf
```

@@ -41,14 +47,16 @@ also install the windows dependencies: [winfsp](https://github.com/billziss-gh/w
cat > ~/.config/rclone/rclone.conf <<'EOF'
[cpp-rw]
type = webdav
vendor = owncloud
url = http://127.0.0.1:3923/
headers = Cookie,cppwd=hunter2
pacer_min_sleep = 0.01ms

[cpp-ro]
type = http
url = http://127.0.0.1:3923/
headers = Cookie,cppwd=hunter2
pacer_min_sleep = 0.01ms
EOF
```

@@ -62,6 +70,15 @@ rclone.exe mount --vfs-cache-mode writes --vfs-cache-max-age 5s --attr-timeout 5
```


# sync folders to/from copyparty

note that the up2k client [up2k.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy) (available on the "connect" page of your copyparty server) does uploads much faster and safer, but rclone is bidirectional and more ubiquitous

```
rclone sync /usr/share/icons/ cpp-rw:fds/
```

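and since it's bidirectional, pulling from the server works the same way with the arguments flipped (a sketch, using the same `cpp-rw` remote configured above):

```
rclone sync cpp-rw:fds/ /usr/share/icons/
```
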
# use rclone as server too, replacing copyparty

feels out of place but is too good not to mention
@@ -70,3 +87,26 @@ feels out of place but is too good not to mention
rclone.exe serve http --read-only .
rclone.exe serve webdav .
```


# devnotes

copyparty supports and expects [the following](https://github.com/rclone/rclone/blob/46484022b08f8756050aa45505ea0db23e62df8b/backend/webdav/webdav.go#L575-L578) from rclone,

```go
case "owncloud":
	f.canStream = true
	f.precision = time.Second
	f.useOCMtime = true
	f.hasOCMD5 = true
	f.hasOCSHA1 = true
```

notably,
* `useOCMtime` enables the `x-oc-mtime` header to retain mtime of uploads from rclone
* `canStream` is supported but not required by us
* `hasOCMD5` / `hasOCSHA1` is conveniently dontcare on both ends

there's a scary comment mentioning PROPSET of lastmodified which is not something we wish to support

* and if `vendor=owncloud` ever stops working, try `vendor=fastmail` instead
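a quick way to poke at the mtime-retention without rclone -- just a sketch, assuming the `x-oc-mtime` value is a unix timestamp in seconds (the owncloud convention) and reusing the `cppwd` cookie from the rclone configs above:

```
curl -T some.flac -H "X-OC-MTime: $(stat -c %Y some.flac)" \
  -H "Cookie: cppwd=hunter2" http://127.0.0.1:3923/some.flac
```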
375
docs/versus.md
@@ -17,6 +17,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
### ...in reviews:
* ✅ = advantages over copyparty
* 💾 = what copyparty offers as an alternative
* 🔵 = similarities
* ⚠️ = disadvantages (something copyparty does "better")

@@ -46,6 +47,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [kodbox](#kodbox)
* [filebrowser](#filebrowser)
* [filegator](#filegator)
* [sftpgo](#sftpgo)
* [updog](#updog)
* [goshs](#goshs)
* [gimme-that](#gimme-that)
@@ -53,6 +55,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [linx](#linx)
* [h5ai](#h5ai)
* [autoindex](#autoindex)
* [miniserve](#miniserve)
* [briefly considered](#briefly-considered)

@@ -89,6 +92,7 @@ the softwares,
* `i` = [kodbox](https://github.com/kalcaddle/kodbox)
* `j` = [filebrowser](https://github.com/filebrowser/filebrowser)
* `k` = [filegator](https://github.com/filegator/filegator)
* `l` = [sftpgo](https://github.com/drakkan/sftpgo)

some softwares not in the matrixes,
* [updog](#updog)
@@ -96,6 +100,9 @@ some softwares not in the matrixes,
* [gimme-that](#gimmethat)
* [ass](#ass)
* [linx](#linx)
* [h5ai](#h5ai)
* [autoindex](#autoindex)
* [miniserve](#miniserve)

symbol legend,
* `█` = absolutely
@@ -106,62 +113,64 @@ symbol legend,

## general

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | ╱ | █ | █ | █ | | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | ╱ |
| runs on iOS | ╱ | | | | | ╱ | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | ╱ | █ | █ | █ | █ |
| runs on Linux | █ | ╱ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ |
| zero setup, just go | █ | █ | █ | | | ╱ | █ | | | █ | | ╱ |
| android app | ╱ | | | █ | █ | | | | | | | |
| iOS app | ╱ | | | █ | █ | | | | | | | |

* `zero setup` = you can get a mostly working setup by just launching the app, without having to install any software or configure whatever
* `a`/copyparty remarks:
  * no gui for server settings; only for client-side stuff
  * can theoretically run on iOS / iPads using [iSH](https://ish.app/), but only the iPad will offer sufficient multitasking i think
  * [android app](https://f-droid.org/en/packages/me.ocv.partyup/) is for uploading only
  * no iOS app but has [shortcuts](https://github.com/9001/copyparty#ios-shortcuts) for easy uploading
* `b`/hfs2 runs on linux through wine
* `f`/rclone must be started with the command `rclone serve webdav .` or similar
* `h`/chibisafe has undocumented windows support
* `l`/sftpgo must be launched with a command


## file transfer

*the thing that copyparty is actually kinda good at*

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | █ | | █ | | █ | █ | ╱ | █ |
| download folder as tar | █ | | | | | | | | | █ | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | |
| resumable uploads | █ | | | | | | | | █ | | █ | ╱ |
| upload segmenting | █ | | | | | | | █ | █ | | █ | ╱ |
| upload acceleration | █ | | | | | | | | █ | | █ | |
| upload verification | █ | | | █ | █ | | | | █ | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | ╱ |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | █ |
| upload rules | ╱ | ╱ | ╱ | ╱ | ╱ | | | ╱ | ╱ | | ╱ | ╱ |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | | █ |
| ┗ max filesize | █ | | | | | | | █ | | | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | ╱ |
| ┗ max file age | █ | | | | | | | | █ | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | ╱ |
| ┗ compress before write | █ | | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | █ | █ | | | |
| ┗ mimetype reject-list | ╱ | | | | | | | | • | ╱ | | ╱ |
| ┗ extension reject-list | ╱ | | | | | | | █ | • | ╱ | | ╱ |
| checksums provided | | | | █ | █ | | | | █ | ╱ | | |
| cloud storage backend | ╱ | ╱ | ╱ | █ | █ | █ | ╱ | | | ╱ | █ | █ |

* `upload segmenting` = files are sliced into chunks, making it possible to upload files larger than 100 MiB on cloudflare for example

@@ -178,25 +187,29 @@ symbol legend,
* can provide checksums for single files on request
* can probably do extension/mimetype rejection similar to copyparty
* `k`/filegator download-as-zip is not streaming; it creates the full zipfile before download can start
* `l`/sftpgo:
  * resumable/segmented uploads only over SFTP, not over HTTP
  * upload rules are totals only, not over time
  * can probably do extension/mimetype rejection similar to copyparty


## protocols and client support

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ |
| serve ftp | █ | | | | | █ | | | | | | █ |
| serve ftps | █ | | | | | █ | | | | | | █ |
| serve sftp | | | | | | █ | | | | | | █ |
| serve smb/cifs | ╱ | | | | | █ | | | | | | |
| serve dlna | | | | | | █ | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ |
| zeroconf | █ | | | | | | | | | | | |
| supports netscape 4 | ╱ | | | | | █ | | | | | • | |
| ...internet explorer 6 | ╱ | █ | | █ | | █ | | | | | • | |
| mojibake filenames | █ | | | • | • | █ | █ | • | • | • | | ╱ |
| undecodable filenames | █ | | | • | • | █ | | • | • | | | ╱ |

* `webdav` = protocol convenient for mounting a remote server as a local filesystem; see zeroconf:
* `zeroconf` = the server announces itself on the LAN, [automatically appearing](https://user-images.githubusercontent.com/241032/215344737-0eae8d98-9496-4256-9aa8-cd2f6971810d.png) on other zeroconf-capable devices
@@ -206,57 +219,62 @@ symbol legend,
* `a`/copyparty remarks:
  * extremely minimal samba/cifs server
  * netscape 4 / ie6 support is mostly listed as a joke altho some people have actually found it useful ([ie4 tho](https://user-images.githubusercontent.com/241032/118192791-fb31fe00-b446-11eb-9647-898ea8efc1f7.png))
* `l`/sftpgo translates mojibake filenames into valid utf-8 (information loss)


## server configuration

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| config from cmd args | █ | | | | | █ | █ | | | █ | | ╱ |
| config files | █ | █ | █ | ╱ | ╱ | █ | | █ | | █ | • | ╱ |
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | |
| same-port http / https | █ | | | | | | | | | | | |
| listen multiple ports | █ | | | | | | | | | | | █ |
| virtual file system | █ | █ | █ | | | | █ | | | | | █ |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | |

* `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead
* `l`/sftpgo:
  * config: users must be added through gui / api calls


## server capabilities

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| per-account chroot | | | | | | | | | | | | █ |
| single-sign-on | | | | █ | █ | | | | • | | | |
| token auth | | | | █ | █ | | | █ | | | | |
| 2fa | | | | █ | █ | | | | | | | █ |
| per-volume permissions | █ | █ | █ | █ | █ | █ | █ | | █ | █ | ╱ | █ |
| per-folder permissions | ╱ | | | █ | █ | | █ | | █ | █ | ╱ | █ |
| per-file permissions | | | | █ | █ | | █ | | █ | | | |
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | |
| unmap subfolders | █ | | | | | | █ | | | █ | ╱ | • |
| index.html blocks list | | | | | | | █ | | | • | | |
| write-only folders | █ | | | | | | | | | | █ | █ |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | |
| file encryption | | | | █ | █ | █ | | | | | | █ |
| file indexing | █ | | █ | █ | █ | | | █ | █ | █ | | |
| ┗ per-volume db | █ | | • | • | • | | | • | • | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | |
| ┗ db stored out-of-tree | █ | | █ | █ | █ | | | • | • | █ | | |
| ┗ existing file tree | █ | | █ | | | | | | | █ | | |
| file action event hooks | █ | | | | | | | | | █ | | █ |
| one-way folder sync | █ | | | █ | █ | █ | | | | | | |
| full sync | | | | █ | █ | | | | | | | |
| speed throttle | | █ | █ | | | █ | | | █ | | | █ |
| anti-bruteforce | █ | █ | █ | █ | █ | | | | • | | | █ |
| dyndns updater | | █ | | | | | | | | | | |
| self-updater | | | █ | | | | | | | | | |
| log rotation | █ | | █ | █ | █ | | | • | █ | | | █ |
| upload tracking / log | █ | █ | • | █ | █ | | | █ | █ | | | ╱ |
| curl-friendly ls | █ | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | █ | █ | • | | | | |

* `unmap subfolders` = "shadowing"; mounting a local folder in the middle of an existing filesystem tree in order to disable access below that path
* `files stored as-is` = uploaded files are trivially readable from the server HDD, not sliced into chunks or in weird folder structures or anything like that
@@ -277,49 +295,52 @@ symbol legend,
* `k`/filegator remarks:
  * `per-* permissions` -- can limit a user to one folder and its subfolders
  * `unmap subfolders` -- can globally filter a list of paths
* `l`/sftpgo:
  * `file action event hooks` also include on-download triggers
  * `upload tracking / log` in main logfile


## client features

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | |
| themes | █ | █ | | █ | | | | | █ | | | |
| directory tree nav | █ | ╱ | | | █ | | | | █ | | ╱ | |
| multi-column sorting | █ | | | | | | | | | | | |
| thumbnails | █ | | | ╱ | ╱ | | | █ | █ | ╱ | | |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | |
| ┗ audio spectrograms | █ | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | ╱ | | |
| ┗ gapless playback | █ | | | | | | | | • | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | |
| ┗ video transcoding | | | | | | | | | █ | | | |
| audio BPM detector | █ | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | ╱ | |
| search by date / size | █ | | | | █ | | | █ | █ | | | |
| search by bpm / key | █ | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | |
| search in file contents | | | | █ | █ | | | | █ | | | |
| search by custom parser | █ | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | |
| create directories | █ | | | █ | █ | ╱ | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | |
| markdown viewer | █ | | | | █ | | | | █ | ╱ | ╱ | |
| markdown editor | █ | | | | █ | | | | █ | ╱ | ╱ | |
| readme.md in listing | █ | | | █ | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | ╱ | █ | | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | |
| delete files | █ | █ | | █ | █ | ╱ | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | |

* `single-page app` = multitasking; possible to continue navigating while uploading
* `audio player » os-integration` = use the [lockscreen](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) or [media hotkeys](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) to play/pause, prev/next song
@@ -335,20 +356,21 @@ symbol legend,

## integration

| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | ╱ | | ╱ |
| discord | █ | | | | | | | | | ╱ | | ╱ |
| ┗ announce uploads | █ | | | | | | | | | | | ╱ |
| ┗ custom embeds | | | | | | | | | | | | ╱ |
| sharex | █ | | | █ | | █ | ╱ | █ | | | | |
| flameshot | | | | | | █ | | | | | | |

* sharex `╱` = yes, but does not provide example sharex config
* `a`/copyparty remarks:
  * `OS alert on upload` available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/notify.py)
  * `discord » announce uploads` available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/discord-announce.py)
* `j`/filebrowser can probably pull those off with command runners similar to copyparty
* `l`/sftpgo has nothing built-in but is very extensible


## another matrix
@@ -366,6 +388,7 @@ symbol legend,
| kodbox | php | ░ gpl3 | 92 MB |
| filebrowser | go | █ apl2 | 20 MB |
| filegator | php | █ mit | • |
| sftpgo | go | ‼ agpl | 44 MB |
| updog | python | █ mit | 17 MB |
| goshs | go | █ mit | 11 MB |
| gimme-that | python | █ mit | 4.8 MB |
@@ -379,6 +402,7 @@ symbol legend,
# reviews

* ✅ are advantages over copyparty
* 💾 are what copyparty offers as an alternative
* 🔵 are similarities
* ⚠️ are disadvantages (something copyparty does "better")

@@ -412,10 +436,11 @@ symbol legend,
* ⚠️ http/webdav only; no ftp, zeroconf
* ⚠️ less awesome music player
* ⚠️ doesn't run on android or ipads
* ⚠️ AGPL licensed
* ✅ great ui/ux
* ✅ config gui
* ✅ apps (android / iphone)
  * 💾 android upload-only app + iPhone upload shortcut
* ✅ more granular permissions (per-file)
* ✅ search: fulltext indexing of file contents
* ✅ webauthn passwordless authentication
@@ -430,10 +455,11 @@ symbol legend,
* ⚠️ http/webdav only; no ftp, zeroconf
* ⚠️ less awesome music player
* ⚠️ doesn't run on android or ipads
* ⚠️ AGPL licensed
* ✅ great ui/ux
* ✅ config gui
* ✅ apps (android / iphone)
  * 💾 android upload-only app + iPhone upload shortcut
* ✅ more granular permissions (per-file)
* ✅ search: fulltext indexing of file contents

@@ -467,7 +493,7 @@ symbol legend,
* ✅ searchable image tags; delete by tag
* ✅ browser extension to upload files to the server
* ✅ reject uploads by file extension
  * 💾 can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* ✅ token auth (api keys)

## [kodbox](https://github.com/kalcaddle/kodbox)
@@ -512,6 +538,30 @@ symbol legend,
* ⚠️ doesn't support crazy filenames
* ⚠️ limited file search

## [sftpgo](https://github.com/drakkan/sftpgo)
* go; cross-platform (windows, linux, mac)
* ⚠️ http uploads not resumable / accelerated / integrity-checked
  * ⚠️ on cloudflare: max upload size 100 MiB
* 🔵 sftp uploads are resumable
* ⚠️ web UI is very minimal + a bit slow
* ⚠️ no thumbnails / image viewer / audio player
* ⚠️ basic file manager (no cut/paste/move)
* ⚠️ no filesystem indexing / search
* ⚠️ doesn't run on phones, tablets
* ⚠️ no zeroconf (mdns/ssdp)
* ⚠️ AGPL licensed
* 🔵 ftp, ftps, webdav
* ✅ sftp server
* ✅ settings gui
* ✅ acme (automatic tls certs)
  * 💾 relies on caddy/certbot/acme.sh
* ✅ at-rest encryption
  * 💾 relies on LUKS/BitLocker
* ✅ can use S3/GCS as storage backend
  * 💾 relies on rclone-mount
* ✅ on-download event hook (otherwise same as copyparty)
* ✅ more extensive permissions control

## [updog](https://github.com/sc0tfree/updog)
* python; cross-platform
* basic directory listing with upload feature
@@ -526,7 +576,7 @@ symbol legend,
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ✅ cool clipboard widget
  * 💾 the markdown editor is an ok substitute
* 🔵 read-only and upload-only modes (same as copyparty's write-only)
* 🔵 https, webdav, but no ftp

@@ -538,7 +588,7 @@ symbol legend,
* ⚠️ weird folder structure for uploads
* ✅ clamav antivirus check on upload! neat
* 🔵 optional max-filesize, os-notification on uploads
  * 💾 os-notification available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/notify.py)

## [ass](https://github.com/tycrek/ass)
* nodejs; recommends docker
@@ -549,12 +599,13 @@ symbol legend,
* ⚠️ on cloudflare: max upload size 100 MiB
* ✅ token auth
* ✅ gps metadata stripping
  * 💾 possible with [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py)
* ✅ discord integration (custom embeds, upload webhook)
  * 💾 [upload webhook plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/discord-announce.py)
* ✅ reject uploads by mimetype
  * 💾 can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* ✅ can use S3 as storage backend
  * 💾 relies on rclone-mount
* ✅ custom 404 pages

## [linx](https://github.com/ZizzyDizzyMC/linx-server/)
@@ -564,12 +615,13 @@ symbol legend,
* 🔵 some of its unique features have been added to copyparty as former linx users have migrated
  * file expiration timers, filename randomization
* ✅ password-protected files
  * 💾 password-protected folders + filekeys to skip the folder password seem to cover most usecases
* ✅ file deletion keys
* ✅ download files as torrents
* ✅ remote uploads (send a link to the server and it downloads it)
  * 💾 available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/wget.py)
* ✅ can use S3 as storage backend
  * 💾 relies on rclone-mount

## [h5ai](https://larsjung.de/h5ai/)
* ⚠️ read only; no upload/move/delete
@@ -581,7 +633,16 @@ symbol legend,
## [autoindex](https://github.com/nielsAD/autoindex)
* ⚠️ read only; no upload/move/delete
* ✅ directory cache for faster browsing of cloud storage
  * 💾 local index/cache for recursive search (names/attrs/tags), but not for browsing

## [miniserve](https://github.com/svenstaro/miniserve)
* rust; cross-platform (windows, linux, mac)
* ⚠️ uploads not resumable / accelerated / integrity-checked
  * ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ no thumbnails / image viewer / audio player / file manager
* ⚠️ no filesystem indexing / search
* 🔵 upload, tar/zip download, qr-code
* ✅ faster at loading huge folders
|
|
||||||
|
|
||||||
# briefly considered
|
# briefly considered
|
||||||
|
|||||||
42
flake.lock
generated
Normal file
42
flake.lock
generated
Normal file
@@ -0,0 +1,42 @@
|
|||||||
|
{
|
||||||
|
"nodes": {
|
||||||
|
"flake-utils": {
|
||||||
|
"locked": {
|
||||||
|
"lastModified": 1678901627,
|
||||||
|
"narHash": "sha256-U02riOqrKKzwjsxc/400XnElV+UtPUQWpANPlyazjH0=",
|
||||||
|
"owner": "numtide",
|
||||||
|
"repo": "flake-utils",
|
||||||
|
"rev": "93a2b84fc4b70d9e089d029deacc3583435c2ed6",
|
||||||
|
"type": "github"
|
||||||
|
},
|
||||||
|
"original": {
|
||||||
|
"owner": "numtide",
|
||||||
|
"repo": "flake-utils",
|
||||||
|
"type": "github"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nixpkgs": {
|
||||||
|
"locked": {
|
||||||
|
"lastModified": 1680334310,
|
||||||
|
"narHash": "sha256-ISWz16oGxBhF7wqAxefMPwFag6SlsA9up8muV79V9ck=",
|
||||||
|
"owner": "NixOS",
|
||||||
|
"repo": "nixpkgs",
|
||||||
|
"rev": "884e3b68be02ff9d61a042bc9bd9dd2a358f95da",
|
||||||
|
"type": "github"
|
||||||
|
},
|
||||||
|
"original": {
|
||||||
|
"id": "nixpkgs",
|
||||||
|
"ref": "nixos-22.11",
|
||||||
|
"type": "indirect"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"root": {
|
||||||
|
"inputs": {
|
||||||
|
"flake-utils": "flake-utils",
|
||||||
|
"nixpkgs": "nixpkgs"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"root": "root",
|
||||||
|
"version": 7
|
||||||
|
}
|
||||||
28  flake.nix  Normal file
@@ -0,0 +1,28 @@
{
  inputs = {
    nixpkgs.url = "nixpkgs/nixos-22.11";
    flake-utils.url = "github:numtide/flake-utils";
  };

  outputs = { self, nixpkgs, flake-utils }:
    {
      nixosModules.default = ./contrib/nixos/modules/copyparty.nix;
      overlays.default = self: super: {
        copyparty =
          self.python3.pkgs.callPackage ./contrib/package/nix/copyparty {
            ffmpeg = self.ffmpeg-full;
          };
      };
    } // flake-utils.lib.eachDefaultSystem (system:
      let
        pkgs = import nixpkgs {
          inherit system;
          overlays = [ self.overlays.default ];
        };
      in {
        packages = {
          inherit (pkgs) copyparty;
          default = self.packages.${system}.copyparty;
        };
      });
}
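Not part of the diff, but for orientation: with the flake above in place, building or test-running the package from a repo checkout would typically look like the sketch below. It assumes nix 2.4+ with flakes enabled, and the copyparty flags after `--` are just example arguments, not anything this changeset prescribes.

# build the default package (the copyparty attribute from the overlay)
nix build .#copyparty

# or launch it directly, passing ordinary copyparty flags after "--"
nix run . -- -p 3923 -v /srv/pub:pub:r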
@@ -3,12 +3,21 @@ FROM alpine:3.16
 WORKDIR /z
 ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
     ver_hashwasm=4.9.0 \
-    ver_marked=4.2.5 \
+    ver_marked=4.3.0 \
     ver_mde=2.18.0 \
-    ver_codemirror=5.65.11 \
+    ver_codemirror=5.65.12 \
     ver_fontawesome=5.13.0 \
+    ver_prism=1.29.0 \
     ver_zopfli=1.0.3

+# versioncheck:
+# https://github.com/markedjs/marked/releases
+# https://github.com/Ionaru/easy-markdown-editor/tags
+# https://github.com/codemirror/codemirror5/releases
+# https://github.com/Daninet/hash-wasm/releases
+# https://github.com/openpgpjs/asmcrypto.js
+# https://github.com/google/zopfli/tags
+
 # download;
 # the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap

@@ -22,6 +31,7 @@ RUN mkdir -p /z/dist/no-pk \
  && wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
  && wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
  && wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
+ && wget https://github.com/PrismJS/prism/archive/refs/tags/v$ver_prism.tar.gz -O prism.tgz \
  && (mkdir hash-wasm \
  && cd hash-wasm \
  && unzip ../hash-wasm.zip) \

@@ -39,14 +49,11 @@ RUN mkdir -p /z/dist/no-pk \
  && cd easy-markdown-editor* \
  && npm install \
  && npm i gulp-cli -g ) \
+ && tar -xf prism.tgz \
  && unzip fontawesome.zip \
  && tar -xf zopfli.tgz

-# todo
-# https://prismjs.com/download.html#themes=prism-funky&languages=markup+css+clike+javascript+autohotkey+bash+basic+batch+c+csharp+cpp+cmake+diff+docker+go+ini+java+json+kotlin+latex+less+lisp+lua+makefile+objectivec+perl+powershell+python+r+jsx+ruby+rust+sass+scss+sql+swift+systemd+toml+typescript+vbnet+verilog+vhdl+yaml&plugins=line-highlight+line-numbers+autolinker

 # build fonttools (which needs zopfli)
 RUN tar -xf zopfli.tgz \
  && cd zopfli* \

@@ -121,6 +128,12 @@ COPY shiftbase.py /z
 RUN /bin/ash /z/mini-fa.sh

+# build prismjs
+COPY genprism.py /z
+COPY genprism.sh /z
+RUN ./genprism.sh $ver_prism

 # compress
 COPY zopfli.makefile /z/dist/Makefile
 RUN cd /z/dist \
@@ -1,10 +1,9 @@
 self := $(dir $(abspath $(lastword $(MAKEFILE_LIST))))
 vend := $(self)/../../copyparty/web/deps

+# prefers podman-docker (optionally rootless) over actual docker/moby

 all:
-	-service docker start
-	-systemctl start docker
 	docker build -t build-copyparty-deps .

 	rm -rf $(vend)
@@ -14,6 +13,7 @@ all:
 	docker run --rm -i build-copyparty-deps:latest | \
 		tar -xvC $(vend) --strip-components=1

+	touch $(vend)/__init__.py
 	chown -R `stat $(self) -c %u:%g` $(vend)

 purge:
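Not spelled out in the diff, but for context: with podman-docker (or plain docker) installed, regenerating the vendored web dependencies is simply the default target of the Makefile above, roughly:

# rebuilds copyparty/web/deps/ inside the build container;
# works with rootless podman now that the service/systemctl calls are gone
cd scripts/deps-docker
make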
198  scripts/deps-docker/genprism.py  Executable file
@@ -0,0 +1,198 @@
#!/usr/bin/env python3

# author: @chinponya


import argparse
import json
from pathlib import Path
from urllib.parse import urlparse, parse_qsl


def read_json(path):
    return json.loads(path.read_text())


def get_prism_version(prism_path):
    package_json_path = prism_path / "package.json"
    package_json = read_json(package_json_path)
    return package_json["version"]


def get_prism_components(prism_path):
    components_json_path = prism_path / "components.json"
    components_json = read_json(components_json_path)
    return components_json


def parse_prism_configuration(url_str):
    url = urlparse(url_str)
    # prism.com uses a non-standard query string-like encoding
    query = {k: v.split(" ") for k, v in parse_qsl(url.fragment)}
    return query


def paths_of_component(prism_path, kind, components, name, minified):
    component = components[kind][name]
    meta = components[kind]["meta"]
    path_format = meta["path"]
    path_base = prism_path / path_format.replace("{id}", name)

    if isinstance(component, str):
        # 'core' component has a different shape, so we convert it to be consistent
        component = {"title": component}

    if meta.get("noCSS") or component.get("noCSS"):
        extensions = ["js"]
    elif kind == "themes":
        extensions = ["css"]
    else:
        extensions = ["js", "css"]

    if path_base.is_dir():
        result = {ext: path_base / f"{name}.{ext}" for ext in extensions}
    elif path_base.suffix:
        ext = path_base.suffix.replace(".", "")
        result = {ext: path_base}
    else:
        result = {ext: path_base.with_suffix(f".{ext}") for ext in extensions}

    if minified:
        result = {
            ext: path.with_suffix(".min" + path.suffix) for ext, path in result.items()
        }

    return result


def read_component_contents(kv_paths):
    return {k: path.read_text() for k, path in kv_paths.items()}


def get_language_dependencies(components, name):
    dependencies = components["languages"][name].get("require")

    if isinstance(dependencies, list):
        return dependencies
    elif isinstance(dependencies, str):
        return [dependencies]
    else:
        return []


def make_header(prism_path, url):
    version = get_prism_version(prism_path)
    header = f"/* PrismJS {version}\n{url} */"
    return {"js": header, "css": header}


def make_core(prism_path, components, minified):
    kv_paths = paths_of_component(prism_path, "core", components, "core", minified)
    return read_component_contents(kv_paths)


def make_theme(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "themes", components, name, minified)
    return read_component_contents(kv_paths)


def make_language(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "languages", components, name, minified)
    return read_component_contents(kv_paths)


def make_languages(prism_path, components, names, minified):
    names_with_dependencies = sum(
        ([*get_language_dependencies(components, name), name] for name in names), []
    )

    seen = set()
    names_with_dependencies = [
        x for x in names_with_dependencies if not (x in seen or seen.add(x))
    ]

    kv_code = [
        make_language(prism_path, components, name, minified)
        for name in names_with_dependencies
    ]

    return kv_code


def make_plugin(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "plugins", components, name, minified)
    return read_component_contents(kv_paths)


def make_plugins(prism_path, components, names, minified):
    kv_code = [make_plugin(prism_path, components, name, minified) for name in names]
    return kv_code


def make_code(prism_path, url, minified):
    components = get_prism_components(prism_path)
    configuration = parse_prism_configuration(url)
    theme_name = configuration["themes"][0]
    code = [
        make_header(prism_path, url),
        make_core(prism_path, components, minified),
        make_theme(prism_path, components, theme_name, minified),
    ]

    if configuration.get("languages"):
        code.extend(
            make_languages(prism_path, components, configuration["languages"], minified)
        )

    if configuration.get("plugins"):
        code.extend(
            make_plugins(prism_path, components, configuration["plugins"], minified)
        )

    return code


def join_code(kv_code):
    result = {"js": "", "css": ""}

    for row in kv_code:
        for key, code in row.items():
            result[key] += code
            result[key] += "\n"

    return result


def write_code(kv_code, js_out, css_out):
    code = join_code(kv_code)

    with js_out.open("w") as f:
        f.write(code["js"])
        print(f"written {js_out}")

    with css_out.open("w") as f:
        f.write(code["css"])
        print(f"written {css_out}")


def parse_args():
    # fmt: off
    parser = argparse.ArgumentParser()
    parser.add_argument("url", help="configured prism download url")
    parser.add_argument("--dir", type=Path, default=Path("."), help="prism repo directory")
    parser.add_argument("--minify", default=True, action=argparse.BooleanOptionalAction, help="use minified files",)
    parser.add_argument("--js-out", type=Path, default=Path("prism.js"), help="JS output file path")
    parser.add_argument("--css-out", type=Path, default=Path("prism.css"), help="CSS output file path")
    # fmt: on
    args = parser.parse_args()
    return args


def main():
    args = parse_args()
    code = make_code(args.dir, args.url, args.minify)
    write_code(code, args.js_out, args.css_out)


if __name__ == "__main__":
    main()
66  scripts/deps-docker/genprism.sh  Executable file
@@ -0,0 +1,66 @@
#!/bin/bash
set -e

langs=(
    markup
    css
    clike
    javascript
    autohotkey
    bash
    basic
    batch
    c
    csharp
    cpp
    cmake
    diff
    docker
    elixir
    glsl
    go
    ini
    java
    json
    kotlin
    latex
    less
    lisp
    lua
    makefile
    matlab
    moonscript
    nim
    objectivec
    perl
    powershell
    python
    r
    jsx
    ruby
    rust
    sass
    scss
    sql
    swift
    systemd
    toml
    typescript
    vbnet
    verilog
    vhdl
    yaml
    zig
)

slangs="${langs[*]}"
slangs="${slangs// /+}"

for theme in prism-funky prism ; do
    u="https://prismjs.com/download.html#themes=$theme&languages=$slangs&plugins=line-highlight+line-numbers+autolinker"
    echo "$u"
    ./genprism.py --dir prism-$1 --js-out prism.js --css-out $theme.css "$u"
done

mv prism-funky.css prismd.css
mv prismd.css prism.css prism.js /z/dist/
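For reference (not part of the diff): genprism.py can also be driven by hand outside the build container. A hypothetical invocation against a local prism checkout would look like the sketch below, using the same fragment-encoded download URL that genprism.sh constructs above; `--no-minify` is simply the negated form that argparse's BooleanOptionalAction generates for the `--minify` flag.

# illustrative only; the checkout directory name and the shortened
# language list are placeholders, not values used by the build
./genprism.py --dir prism-1.29.0 --no-minify \
    --js-out prism.js --css-out prism.css \
    "https://prismjs.com/download.html#themes=prism&languages=markup+css+clike+javascript&plugins=line-numbers"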
@@ -1,5 +1,5 @@
|
|||||||
diff --git a/src/Lexer.js b/src/Lexer.js
|
diff --git a/src/Lexer.js b/src/Lexer.js
|
||||||
adds linetracking to marked.js v4.2.3;
|
adds linetracking to marked.js v4.3.0;
|
||||||
add data-ln="%d" to most tags, %d is the source markdown line
|
add data-ln="%d" to most tags, %d is the source markdown line
|
||||||
--- a/src/Lexer.js
|
--- a/src/Lexer.js
|
||||||
+++ b/src/Lexer.js
|
+++ b/src/Lexer.js
|
||||||
@@ -206,7 +206,6 @@ index a22a2bc..884ad66 100644
|
|||||||
// Run any renderer extensions
|
// Run any renderer extensions
|
||||||
if (this.options.extensions && this.options.extensions.renderers && this.options.extensions.renderers[token.type]) {
|
if (this.options.extensions && this.options.extensions.renderers && this.options.extensions.renderers[token.type]) {
|
||||||
diff --git a/src/Renderer.js b/src/Renderer.js
|
diff --git a/src/Renderer.js b/src/Renderer.js
|
||||||
index 7c36a75..aa1a53a 100644
|
|
||||||
--- a/src/Renderer.js
|
--- a/src/Renderer.js
|
||||||
+++ b/src/Renderer.js
|
+++ b/src/Renderer.js
|
||||||
@@ -11,6 +11,12 @@ export class Renderer {
|
@@ -11,6 +11,12 @@ export class Renderer {
|
||||||
@@ -290,10 +289,9 @@ index 7c36a75..aa1a53a 100644
|
|||||||
if (title) {
|
if (title) {
|
||||||
out += ` title="${title}"`;
|
out += ` title="${title}"`;
|
||||||
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||||
index e8a69b6..2cc772b 100644
|
|
||||||
--- a/src/Tokenizer.js
|
--- a/src/Tokenizer.js
|
||||||
+++ b/src/Tokenizer.js
|
+++ b/src/Tokenizer.js
|
||||||
@@ -312,4 +312,7 @@ export class Tokenizer {
|
@@ -333,4 +333,7 @@ export class Tokenizer {
|
||||||
const l = list.items.length;
|
const l = list.items.length;
|
||||||
|
|
||||||
+ // each nested list gets +1 ahead; this hack makes every listgroup -1 but atleast it doesn't get infinitely bad
|
+ // each nested list gets +1 ahead; this hack makes every listgroup -1 but atleast it doesn't get infinitely bad
|
||||||
|
|||||||
@@ -1,4 +1,5 @@
|
|||||||
diff --git a/src/Lexer.js b/src/Lexer.js
|
diff --git a/src/Lexer.js b/src/Lexer.js
|
||||||
|
strip some features
|
||||||
--- a/src/Lexer.js
|
--- a/src/Lexer.js
|
||||||
+++ b/src/Lexer.js
|
+++ b/src/Lexer.js
|
||||||
@@ -7,5 +7,5 @@ import { repeatString } from './helpers.js';
|
@@ -7,5 +7,5 @@ import { repeatString } from './helpers.js';
|
||||||
@@ -56,7 +57,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
|
|||||||
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||||
--- a/src/Tokenizer.js
|
--- a/src/Tokenizer.js
|
||||||
+++ b/src/Tokenizer.js
|
+++ b/src/Tokenizer.js
|
||||||
@@ -352,14 +352,7 @@ export class Tokenizer {
|
@@ -367,14 +367,7 @@ export class Tokenizer {
|
||||||
type: 'html',
|
type: 'html',
|
||||||
raw: cap[0],
|
raw: cap[0],
|
||||||
- pre: !this.options.sanitizer
|
- pre: !this.options.sanitizer
|
||||||
@@ -72,7 +73,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
|||||||
- }
|
- }
|
||||||
return token;
|
return token;
|
||||||
}
|
}
|
||||||
@@ -502,15 +495,9 @@ export class Tokenizer {
|
@@ -517,15 +510,9 @@ export class Tokenizer {
|
||||||
|
|
||||||
return {
|
return {
|
||||||
- type: this.options.sanitize
|
- type: this.options.sanitize
|
||||||
@@ -90,7 +91,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
|||||||
+ text: cap[0]
|
+ text: cap[0]
|
||||||
};
|
};
|
||||||
}
|
}
|
||||||
@@ -699,10 +686,10 @@ export class Tokenizer {
|
@@ -714,10 +701,10 @@ export class Tokenizer {
|
||||||
}
|
}
|
||||||
|
|
||||||
- autolink(src, mangle) {
|
- autolink(src, mangle) {
|
||||||
@@ -103,7 +104,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
|||||||
+ text = escape(cap[1]);
|
+ text = escape(cap[1]);
|
||||||
href = 'mailto:' + text;
|
href = 'mailto:' + text;
|
||||||
} else {
|
} else {
|
||||||
@@ -727,10 +714,10 @@ export class Tokenizer {
|
@@ -742,10 +729,10 @@ export class Tokenizer {
|
||||||
}
|
}
|
||||||
|
|
||||||
- url(src, mangle) {
|
- url(src, mangle) {
|
||||||
@@ -116,7 +117,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
|||||||
+ text = escape(cap[0]);
|
+ text = escape(cap[0]);
|
||||||
href = 'mailto:' + text;
|
href = 'mailto:' + text;
|
||||||
} else {
|
} else {
|
||||||
@@ -764,12 +751,12 @@ export class Tokenizer {
|
@@ -779,12 +766,12 @@ export class Tokenizer {
|
||||||
}
|
}
|
||||||
|
|
||||||
- inlineText(src, smartypants) {
|
- inlineText(src, smartypants) {
|
||||||
@@ -135,8 +136,8 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
|||||||
diff --git a/src/defaults.js b/src/defaults.js
|
diff --git a/src/defaults.js b/src/defaults.js
|
||||||
--- a/src/defaults.js
|
--- a/src/defaults.js
|
||||||
+++ b/src/defaults.js
|
+++ b/src/defaults.js
|
||||||
@@ -10,11 +10,7 @@ export function getDefaults() {
|
@@ -11,11 +11,7 @@ export function getDefaults() {
|
||||||
highlight: null,
|
hooks: null,
|
||||||
langPrefix: 'language-',
|
langPrefix: 'language-',
|
||||||
- mangle: true,
|
- mangle: true,
|
||||||
pedantic: false,
|
pedantic: false,
|
||||||
@@ -170,7 +171,7 @@ diff --git a/src/helpers.js b/src/helpers.js
|
|||||||
+export function cleanUrl(base, href) {
|
+export function cleanUrl(base, href) {
|
||||||
if (base && !originIndependentUrl.test(href)) {
|
if (base && !originIndependentUrl.test(href)) {
|
||||||
href = resolveUrl(base, href);
|
href = resolveUrl(base, href);
|
||||||
@@ -250,10 +237,4 @@ export function findClosingBracket(str, b) {
|
@@ -233,10 +220,4 @@ export function findClosingBracket(str, b) {
|
||||||
}
|
}
|
||||||
|
|
||||||
-export function checkSanitizeDeprecation(opt) {
|
-export function checkSanitizeDeprecation(opt) {
|
||||||
@@ -185,30 +186,25 @@ diff --git a/src/marked.js b/src/marked.js
|
|||||||
--- a/src/marked.js
|
--- a/src/marked.js
|
||||||
+++ b/src/marked.js
|
+++ b/src/marked.js
|
||||||
@@ -7,5 +7,4 @@ import { Slugger } from './Slugger.js';
|
@@ -7,5 +7,4 @@ import { Slugger } from './Slugger.js';
|
||||||
|
import { Hooks } from './Hooks.js';
|
||||||
import {
|
import {
|
||||||
merge,
|
|
||||||
- checkSanitizeDeprecation,
|
- checkSanitizeDeprecation,
|
||||||
escape
|
escape
|
||||||
} from './helpers.js';
|
} from './helpers.js';
|
||||||
@@ -35,5 +34,4 @@ export function marked(src, opt, callback) {
|
@@ -18,5 +17,5 @@ import {
|
||||||
|
function onError(silent, async, callback) {
|
||||||
opt = merge({}, marked.defaults, opt || {});
|
return (e) => {
|
||||||
- checkSanitizeDeprecation(opt);
|
|
||||||
|
|
||||||
if (callback) {
|
|
||||||
@@ -318,5 +316,4 @@ marked.parseInline = function(src, opt) {
|
|
||||||
|
|
||||||
opt = merge({}, marked.defaults, opt || {});
|
|
||||||
- checkSanitizeDeprecation(opt);
|
|
||||||
|
|
||||||
try {
|
|
||||||
@@ -327,5 +324,5 @@ marked.parseInline = function(src, opt) {
|
|
||||||
return Parser.parseInline(tokens, opt);
|
|
||||||
} catch (e) {
|
|
||||||
- e.message += '\nPlease report this to https://github.com/markedjs/marked.';
|
- e.message += '\nPlease report this to https://github.com/markedjs/marked.';
|
||||||
+ e.message += '\nmake issue @ https://github.com/9001/copyparty';
|
+ e.message += '\nmake issue @ https://github.com/9001/copyparty';
|
||||||
if (opt.silent) {
|
|
||||||
return '<p>An error occurred:</p><pre>'
|
if (silent) {
|
||||||
|
@@ -65,6 +64,4 @@ function parseMarkdown(lexer, parser) {
|
||||||
|
}
|
||||||
|
|
||||||
|
- checkSanitizeDeprecation(opt);
|
||||||
|
-
|
||||||
|
if (opt.hooks) {
|
||||||
|
opt.hooks.options = opt;
|
||||||
diff --git a/test/bench.js b/test/bench.js
|
diff --git a/test/bench.js b/test/bench.js
|
||||||
--- a/test/bench.js
|
--- a/test/bench.js
|
||||||
+++ b/test/bench.js
|
+++ b/test/bench.js
|
||||||
@@ -250,70 +246,70 @@ diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
|
|||||||
diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
|
diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
|
||||||
--- a/test/unit/Lexer-spec.js
|
--- a/test/unit/Lexer-spec.js
|
||||||
+++ b/test/unit/Lexer-spec.js
|
+++ b/test/unit/Lexer-spec.js
|
||||||
@@ -712,5 +712,5 @@ paragraph
|
@@ -794,5 +794,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('sanitize', () => {
|
- it('sanitize', () => {
|
||||||
+ /*it('sanitize', () => {
|
+ /*it('sanitize', () => {
|
||||||
expectTokens({
|
expectTokens({
|
||||||
md: '<div>html</div>',
|
md: '<div>html</div>',
|
||||||
@@ -730,5 +730,5 @@ paragraph
|
@@ -812,5 +812,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
});
|
});
|
||||||
|
|
||||||
@@ -810,5 +810,5 @@ paragraph
|
@@ -892,5 +892,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('html sanitize', () => {
|
- it('html sanitize', () => {
|
||||||
+ /*it('html sanitize', () => {
|
+ /*it('html sanitize', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
md: '<div>html</div>',
|
md: '<div>html</div>',
|
||||||
@@ -818,5 +818,5 @@ paragraph
|
@@ -900,5 +900,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
|
|
||||||
it('link', () => {
|
it('link', () => {
|
||||||
@@ -1129,5 +1129,5 @@ paragraph
|
@@ -1211,5 +1211,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('autolink mangle email', () => {
|
- it('autolink mangle email', () => {
|
||||||
+ /*it('autolink mangle email', () => {
|
+ /*it('autolink mangle email', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
md: '<test@example.com>',
|
md: '<test@example.com>',
|
||||||
@@ -1149,5 +1149,5 @@ paragraph
|
@@ -1231,5 +1231,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
|
|
||||||
it('url', () => {
|
it('url', () => {
|
||||||
@@ -1186,5 +1186,5 @@ paragraph
|
@@ -1268,5 +1268,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('url mangle email', () => {
|
- it('url mangle email', () => {
|
||||||
+ /*it('url mangle email', () => {
|
+ /*it('url mangle email', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
md: 'test@example.com',
|
md: 'test@example.com',
|
||||||
@@ -1206,5 +1206,5 @@ paragraph
|
@@ -1288,5 +1288,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
});
|
});
|
||||||
|
|
||||||
@@ -1222,5 +1222,5 @@ paragraph
|
@@ -1304,5 +1304,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- describe('smartypants', () => {
|
- describe('smartypants', () => {
|
||||||
+ /*describe('smartypants', () => {
|
+ /*describe('smartypants', () => {
|
||||||
it('single quotes', () => {
|
it('single quotes', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
@@ -1292,5 +1292,5 @@ paragraph
|
@@ -1374,5 +1374,5 @@ paragraph
|
||||||
});
|
});
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
|
|||||||
@@ -14,6 +14,7 @@ docker run --rm -it -u 1000 -p 3923:3923 -v /mnt/nas:/w -v $PWD/cfgdir:/cfg copy
 * `/cfg` is an optional folder with zero or more config files (*.conf) to load
 * `copyparty/ac` is the recommended [image edition](#editions)
 * you can download the image from github instead by replacing `copyparty/ac` with `ghcr.io/9001/copyparty-ac`
+* if you are using rootless podman, remove `-u 1000`

 i'm unfamiliar with docker-compose and alternatives so let me know if this section could be better 🙏
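A minimal sketch of the rootless-podman variant of the run command from the hunk header above (same example paths and port; this is an illustration rather than an officially documented invocation):

# rootless podman maps the container user for you,
# so the explicit -u 1000 from the docker example is dropped
podman run --rm -it -p 3923:3923 -v /mnt/nas:/w -v "$PWD/cfgdir":/cfg copyparty/ac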
@@ -43,11 +43,11 @@ filt=
 [ $purge ] && filt='NR>1{print$3}'
 [ $filt ] && {
 	[ $purge ] && {
-		podman kill $(podman ps -q)
-		podman rm $(podman ps -qa)
+		podman kill $(podman ps -q) || true
+		podman rm $(podman ps -qa) || true
 	}
 	podman rmi -f $(podman images -a --history | awk "$filt") || true
-	podman rmi $(podman images -a --history | awk '/^<none>.*<none>.*-tmp:/{print$3}')
+	podman rmi $(podman images -a --history | awk '/^<none>.*<none>.*-tmp:/{print$3}') || true
 }

 [ $pull ] && {
@@ -40,4 +40,10 @@ update_arch_pkgbuild() {
 	rm -rf x
 }

+update_nixos_pin() {
+	( cd $self/../contrib/package/nix/copyparty;
+	./update.py $self/../dist/copyparty-sfx.py )
+}
+
 update_arch_pkgbuild
+update_nixos_pin
@@ -1,7 +1,9 @@
-builds a fully standalone copyparty.exe compatible with 32bit win7-sp1 and later
+builds copyparty32.exe, fully standalone, compatible with 32bit win7-sp1 and later

 requires a win7 vm which has never been connected to the internet and a host-only network with the linux host at 192.168.123.1

+copyparty.exe is built by a win10-ltsc-2021 vm with similar setup
+
 first-time setup steps in notes.txt

 run build.sh in the vm to fetch src + compile + push a new exe to the linux host for manual publishing
@@ -9,9 +9,14 @@ tee build2.sh | cmp build.sh && rm build2.sh || {
 	[[ $r =~ [yY] ]] && mv build{2,}.sh && exec ./build.sh
 }

-uname -s | grep WOW64 && m= || m=32
+[ -e up2k.sh ] && ./up2k.sh
+
+uname -s | grep WOW64 && m=64 || m=32
 uname -s | grep NT-10 && w10=1 || w7=1
 [ $w7 ] && pyv=37 || pyv=311
+esuf=
+[ $w7 ] && [ $m = 32 ] && esuf=32
+[ $w7 ] && [ $m = 64 ] && esuf=-winpe64
+
 appd=$(cygpath.exe "$APPDATA")
 spkgs=$appd/Python/Python$pyv/site-packages

@@ -57,6 +62,7 @@ read a b c d _ < <(
 	sed -r 's/[^0-9]+//;s/[" )]//g;s/[-,]/ /g;s/$/ 0/'
 )
 sed -r 's/1,2,3,0/'$a,$b,$c,$d'/;s/1\.2\.3/'$a.$b.$c/ <loader.rc >loader.rc2
+sed -ri s/copyparty.exe/copyparty$esuf.exe/ loader.rc2

 excl=(
 	copyparty.broker_mp

@@ -101,4 +107,4 @@ base64 | head -c12 >> dist/copyparty.exe

 dist/copyparty.exe --version

-curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$m.exe
+curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$esuf.exe
@@ -1,12 +1,18 @@
|
|||||||
d5510a24cb5e15d6d30677335bbc7624c319b371c0513981843dc51d9b3a1e027661096dfcfc540634222bb2634be6db55bf95185b30133cb884f1e47652cf53 altgraph-0.17.3-py2.py3-none-any.whl
|
d5510a24cb5e15d6d30677335bbc7624c319b371c0513981843dc51d9b3a1e027661096dfcfc540634222bb2634be6db55bf95185b30133cb884f1e47652cf53 altgraph-0.17.3-py2.py3-none-any.whl
|
||||||
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
|
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
|
||||||
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
|
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
|
||||||
85a041cc95cf493f5e2ebc2ca406d2718735e43951988810dc448d29e9ee0bcdb1ca19e0c22243441f45633969af8027469f29f6288f6830c724a3fa38886e5c pyinstaller-5.8.0-py3-none-win32.whl
|
d68c78bc83f4f48c604912b2d1ca4772b0e6ed676cd2eb439411e0a74d63fe215aac93dd9dab04ed341909a4a6a1efc13ec982516e3cb0fc7c355055e63d9178 pyinstaller-5.10.1-py3-none-win32.whl
|
||||||
adf0d23a98da38056de25e07e68921739173efc70fb9bf3f68d8c7c3d0d092e09efa69d35c0c9ecc990bc3c5fa62038227ef480ed06ddfaf05353f6e468f5dca pyinstaller-5.8.0-py3-none-win_amd64.whl
|
fe62705893c86eeb2d5b841da8debe05dedda98364dec190b487e718caad8a8735503bf93739a7a27ea793a835bf976fb919ceec1424b8fc550b936bae4a54e9 pyinstaller-5.10.1-py3-none-win_amd64.whl
|
||||||
01d7f8125966ed30389a879ba69d2c1fd3212bafad3fb485317580bcb9f489e8b901c4d325f6cb8a52986838ba6d44d3852e62b27c1f1d5a576899821cc0ae02 pyinstaller_hooks_contrib-2023.0-py2.py3-none-any.whl
|
61c543983ff67e2bdff94d2d6198023679437363db8c660fa81683aff87c5928cd800720488e18d09be89fe45d6ab99be3ccb912cb2e03e2bca385b4338e1e42 pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
|
||||||
132a5380f33a245f2e744413a0e1090bc42b7356376de5121397cec5976b04b79f7c9ebe28af222c9c7b01461f7d7920810d220e337694727e0d7cd9e91fa667 pywin32_ctypes-0.2.0-py2.py3-none-any.whl
|
132a5380f33a245f2e744413a0e1090bc42b7356376de5121397cec5976b04b79f7c9ebe28af222c9c7b01461f7d7920810d220e337694727e0d7cd9e91fa667 pywin32_ctypes-0.2.0-py2.py3-none-any.whl
|
||||||
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
|
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
|
||||||
4b6e9ae967a769fe32be8cf0bc0d5a213b138d1e0344e97656d08a3d15578d81c06c45b334c872009db2db8f39db0c77c94ff6c35168d5e13801917667c08678 upx-4.0.2-win32.zip
|
4b6e9ae967a769fe32be8cf0bc0d5a213b138d1e0344e97656d08a3d15578d81c06c45b334c872009db2db8f39db0c77c94ff6c35168d5e13801917667c08678 upx-4.0.2-win32.zip
|
||||||
|
# up2k (win7)
|
||||||
|
a7d259277af4948bf960682bc9fb45a44b9ae9a19763c8a7c313cef4aa9ec2d447d843e4a7c409e9312c8c8f863a24487a8ee4ffa6891e9b1c4e111bb4723861 certifi-2022.12.7-py3-none-any.whl
|
||||||
|
2822c0dae180b1c8cfb7a70c8c00bad62af9afdbb18b656236680def9d3f1fcdcb8ef5eb64fc3b4c934385cd175ad5992a2284bcba78a243130de75b2d1650db charset_normalizer-3.1.0-cp37-cp37m-win32.whl
|
||||||
|
ffdd45326f4e91c02714f7a944cbcc2fdd09299f709cfa8aec0892053eef0134fb80d9ba3790afd319538a86feb619037cbf533e2f5939cb56b35bb17f56c858 idna-3.4-py3-none-any.whl
|
||||||
|
220e0e122d5851aaccf633224dd7fbd3ba8c8d2720944d8019d6a276ed818d83e3426fe21807f22d673b5428f19fcf9a6b4e645f69bbecd967c568bb6aeb7c8d requests-2.28.2-py3-none-any.whl
|
||||||
|
8770011f4ad1fe40a3062e6cdf1fda431530c59ee7de3fc5f8c57db54bfdb71c3aa220ca0e0bb1874fc6700e9ebb57defbae54ac84938bc9ad8f074910106681 urllib3-1.26.14-py2.py3-none-any.whl
|
||||||
# win7
|
# win7
|
||||||
91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz
|
91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz
|
||||||
c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl
|
c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl
|
||||||
@@ -20,5 +26,5 @@ ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a250675
|
|||||||
00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
|
00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
|
||||||
b1db6f5a79fc15391547643e5973cf5946c0acfa6febb68bc90fc3f66369681100cc100f32dd04256dcefa510e7864c718515a436a4af3a10fe205c413c7e693 MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl
|
b1db6f5a79fc15391547643e5973cf5946c0acfa6febb68bc90fc3f66369681100cc100f32dd04256dcefa510e7864c718515a436a4af3a10fe205c413c7e693 MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl
|
||||||
4a20aeb52d4fde6aabcba05ee261595eeb5482c72ee27332690f34dd6e7a49c0b3ba3813202ac15c9d21e29f1cd803f2e79ccc1c45ec314fcd0a937016bcbc56 mutagen-1.46.0-py3-none-any.whl
|
4a20aeb52d4fde6aabcba05ee261595eeb5482c72ee27332690f34dd6e7a49c0b3ba3813202ac15c9d21e29f1cd803f2e79ccc1c45ec314fcd0a937016bcbc56 mutagen-1.46.0-py3-none-any.whl
|
||||||
ea152624499966615ee74f2aefed27da528785e1215f46d61e79c5290bb8105fd98e9948938efbca9cd19e2f1dd48c9e712b4f30a4148a0ed5d1ff2dff77106e Pillow-9.4.0-cp311-cp311-win_amd64.whl
|
78414808cb9a5fa74e7b23360b8f46147952530e3cc78a3ad4b80be3e26598080537ac691a1be1f35b7428a22c1f65a6adf45986da2752fbe9d9819d77a58bf8 Pillow-9.5.0-cp311-cp311-win_amd64.whl
|
||||||
2b04b196f1115f42375e623a35edeb71565dfd090416b22510ec0270fefe86f7d397a98aabbe9ebfe3f6a355fe25c487a4875d4252027d0a61ccb64cacd7631d python-3.11.2-amd64.exe
|
4b7711b950858f459d47145b88ccde659279c6af47144d58a1c54ea2ce4b80ec43eb7f69c68f12f8f6bc54c86a44e77441993257f7ad43aab364655de5c51bb1 python-3.11.2-amd64.exe
|
||||||
|
|||||||
@@ -13,6 +13,13 @@ https://pypi.org/project/MarkupSafe/#files
 https://pypi.org/project/mutagen/#files
 https://pypi.org/project/Pillow/#files

+# up2k (win7) additionals
+https://pypi.org/project/certifi/#files
+https://pypi.org/project/charset-normalizer/#files  # cp37-cp37m-win32.whl
+https://pypi.org/project/idna/#files
+https://pypi.org/project/requests/#files
+https://pypi.org/project/urllib3/#files
+
 # win7 additionals
 https://pypi.org/project/future/#files
 https://pypi.org/project/importlib-metadata/#files
@@ -1,8 +1,10 @@
 #!/bin/bash
 set -e

+genico() {
+
 # imagemagick png compression is broken, use pillow instead
-convert ~/AndroidStudioProjects/PartyUP/metadata/en-US/images/icon.png a.bmp
+convert $1 a.bmp

 #convert a.bmp -trim -resize '48x48!' -strip a.png
 python3 <<'EOF'

@@ -17,11 +19,15 @@ EOF
 pngquant --strip --quality 30 a.png
 mv a-*.png a.png

-python3 <<'EOF'
+python3 <<EOF
 from PIL import Image
-Image.open('a.png').save('loader.ico',sizes=[(48,48)])
+Image.open('a.png').save('$2',sizes=[(48,48)])
 EOF

 rm a.{bmp,png}
+}
+
+genico ~/AndroidStudioProjects/PartyUP/metadata/en-US/images/icon.png loader.ico
+genico https://raw.githubusercontent.com/googlefonts/noto-emoji/main/png/512/emoji_u1f680.png up2k.ico
 ls -al
-exit 0
@@ -30,6 +30,9 @@ if possible, for performance and security reasons, please use this instead:
 https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
 """

+if sys.maxsize > 2 ** 32:
+    v = v.replace("32-bit", "64-bit")
+
 try:
     print(v.replace("\n", "\n▒▌ ")[1:] + "\n")
 except:
@@ -16,7 +16,7 @@ VSVersionInfo(
       StringTable(
         '000004b0',
         [StringStruct('CompanyName', 'ocv.me'),
-        StringStruct('FileDescription', 'copyparty'),
+        StringStruct('FileDescription', 'copyparty file server'),
         StringStruct('FileVersion', '1.2.3'),
         StringStruct('InternalName', 'copyparty'),
        StringStruct('LegalCopyright', '2019, ed'),
@@ -17,16 +17,23 @@ uname -s | grep NT-10 && w10=1 || {
|
|||||||
fns=(
|
fns=(
|
||||||
altgraph-0.17.3-py2.py3-none-any.whl
|
altgraph-0.17.3-py2.py3-none-any.whl
|
||||||
pefile-2023.2.7-py3-none-any.whl
|
pefile-2023.2.7-py3-none-any.whl
|
||||||
pyinstaller-5.8.0-py3-none-win_amd64.whl
|
pyinstaller-5.10.1-py3-none-win_amd64.whl
|
||||||
pyinstaller_hooks_contrib-2023.0-py2.py3-none-any.whl
|
pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
|
||||||
pywin32_ctypes-0.2.0-py2.py3-none-any.whl
|
pywin32_ctypes-0.2.0-py2.py3-none-any.whl
|
||||||
upx-4.0.2-win32.zip
|
upx-4.0.2-win32.zip
|
||||||
)
|
)
|
||||||
[ $w10 ] && fns+=(
|
[ $w10 ] && fns+=(
|
||||||
mutagen-1.46.0-py3-none-any.whl
|
mutagen-1.46.0-py3-none-any.whl
|
||||||
Pillow-9.4.0-cp311-cp311-win_amd64.whl
|
Pillow-9.4.0-cp311-cp311-win_amd64.whl
|
||||||
python-3.11.2-amd64.exe
|
python-3.11.3-amd64.exe
|
||||||
}
|
}
|
||||||
|
[ $w7 ] && fns+=(
|
||||||
|
certifi-2022.12.7-py3-none-any.whl
|
||||||
|
chardet-5.1.0-py3-none-any.whl
|
||||||
|
idna-3.4-py3-none-any.whl
|
||||||
|
requests-2.28.2-py3-none-any.whl
|
||||||
|
urllib3-1.26.14-py2.py3-none-any.whl
|
||||||
|
)
|
||||||
[ $w7 ] && fns+=(
|
[ $w7 ] && fns+=(
|
||||||
future-0.18.2.tar.gz
|
future-0.18.2.tar.gz
|
||||||
importlib_metadata-5.0.0-py3-none-any.whl
|
importlib_metadata-5.0.0-py3-none-any.whl
|
||||||
@@ -36,12 +43,12 @@ fns=(
|
|||||||
)
|
)
|
||||||
[ $w7x64 ] && fns+=(
|
[ $w7x64 ] && fns+=(
|
||||||
windows6.1-kb2533623-x64.msu
|
windows6.1-kb2533623-x64.msu
|
||||||
pyinstaller-5.8.0-py3-none-win_amd64.whl
|
pyinstaller-5.10.1-py3-none-win_amd64.whl
|
||||||
python-3.7.9-amd64.exe
|
python-3.7.9-amd64.exe
|
||||||
)
|
)
|
||||||
[ $w7x32 ] && fns+=(
|
[ $w7x32 ] && fns+=(
|
||||||
windows6.1-kb2533623-x86.msu
|
windows6.1-kb2533623-x86.msu
|
||||||
pyinstaller-5.8.0-py3-none-win32.whl
|
pyinstaller-5.10.1-py3-none-win32.whl
|
||||||
python-3.7.9.exe
|
python-3.7.9.exe
|
||||||
)
|
)
|
||||||
dl() { curl -fkLOC- "$1" && return 0; echo "$1"; return 1; }
|
dl() { curl -fkLOC- "$1" && return 0; echo "$1"; return 1; }
|
||||||
@@ -58,28 +65,32 @@ manually install:
|
|||||||
|
|
||||||
===[ copy-paste into git-bash ]================================
|
===[ copy-paste into git-bash ]================================
|
||||||
uname -s | grep NT-10 && w10=1 || w7=1
|
uname -s | grep NT-10 && w10=1 || w7=1
|
||||||
|
[ $w7 ] && pyv=37 || pyv=311
|
||||||
|
appd=$(cygpath.exe "$APPDATA")
|
||||||
cd ~/Downloads &&
|
cd ~/Downloads &&
|
||||||
unzip upx-*-win32.zip &&
|
unzip upx-*-win32.zip &&
|
||||||
mv upx-*/upx.exe . &&
|
mv upx-*/upx.exe . &&
|
||||||
python -m ensurepip &&
|
python -m ensurepip &&
|
||||||
python -m pip install --user -U pip-*.whl &&
|
python -m pip install --user -U pip-*.whl &&
|
||||||
{ [ $w7 ] || python -m pip install --user -U mutagen-*.whl Pillow-*.whl; } &&
|
{ [ $w7 ] || python -m pip install --user -U mutagen-*.whl Pillow-*.whl; } &&
|
||||||
|
{ [ $w10 ] || python -m pip install --user -U {requests,urllib3,charset_normalizer,certifi,idna}-*.whl; } &&
|
||||||
{ [ $w10 ] || python -m pip install --user -U future-*.tar.gz importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
|
{ [ $w10 ] || python -m pip install --user -U future-*.tar.gz importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
|
||||||
python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl &&
|
python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl &&
|
||||||
|
sed -ri 's/--lzma/--best/' $appd/Python/Python$pyv/site-packages/pyinstaller/building/utils.py &&
|
||||||
|
curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&
|
||||||
|
python uncomment.py $(for d in $appd/Python/Python$pyv/site-packages/{requests,urllib3,charset_normalizer,certifi,idna}; do find $d -name \*.py; done) &&
|
||||||
cd &&
|
cd &&
|
||||||
rm -f build.sh &&
|
rm -f build.sh &&
|
||||||
curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/build.sh &&
|
curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/build.sh &&
|
||||||
|
curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.sh &&
|
||||||
echo ok
|
echo ok
|
||||||
# python -m pip install --user -U Pillow-9.2.0-cp37-cp37m-win32.whl
|
# python -m pip install --user -U Pillow-9.2.0-cp37-cp37m-win32.whl
|
||||||
# sed -ri 's/, bestopt, /]+bestopt+[/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
|
# sed -ri 's/, bestopt, /]+bestopt+[/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
|
||||||
# sed -ri 's/(^\s+bestopt = ).*/\1["--best","--lzma","--ultra-brute"]/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
|
# sed -ri 's/(^\s+bestopt = ).*/\1["--best","--lzma","--ultra-brute"]/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
|
||||||
|
|
||||||
===[ win10: copy-paste into git-bash ]=========================
|
===[ win10: copy-paste into git-bash ]=========================
|
||||||
appd=$(cygpath.exe "$APPDATA")
|
|
||||||
curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&
|
|
||||||
#for f in $appd/Python/Python311/site-packages/mutagen/*.py; do awk -i inplace '/^\s*def _?(save|write)/{sub(/d.*/," ");s=$0;ns=length(s)} ns&&/[^ ]/&&substr($0,0,ns)!=s{ns=0} !ns' "$f"; done &&
|
#for f in $appd/Python/Python311/site-packages/mutagen/*.py; do awk -i inplace '/^\s*def _?(save|write)/{sub(/d.*/," ");s=$0;ns=length(s)} ns&&/[^ ]/&&substr($0,0,ns)!=s{ns=0} !ns' "$f"; done &&
|
||||||
python uncomment.py $appd/Python/Python311/site-packages/{mutagen,PIL,jinja2,markupsafe}/*.py &&
|
python uncomment.py $appd/Python/Python311/site-packages/{mutagen,PIL,jinja2,markupsafe}/*.py &&
|
||||||
sed -ri 's/--lzma/--best/' $APPDATA/Python/Python311/site-packages/pyinstaller/building/utils.py &&
|
|
||||||
echo ok
|
echo ok
|
||||||
|
|
||||||
|
|
||||||
|
|||||||
29  scripts/pyinstaller/up2k.rc  Normal file
@@ -0,0 +1,29 @@
# UTF-8
VSVersionInfo(
  ffi=FixedFileInfo(
    filevers=(1,2,3,0),
    prodvers=(1,2,3,0),
    mask=0x3f,
    flags=0x0,
    OS=0x4,
    fileType=0x1,
    subtype=0x0,
    date=(0, 0)
    ),
  kids=[
    StringFileInfo(
      [
      StringTable(
        '000004b0',
        [StringStruct('CompanyName', 'ocv.me'),
        StringStruct('FileDescription', 'copyparty uploader / filesearch command'),
        StringStruct('FileVersion', '1.2.3'),
        StringStruct('InternalName', 'up2k'),
        StringStruct('LegalCopyright', '2019, ed'),
        StringStruct('OriginalFilename', 'up2k.exe'),
        StringStruct('ProductName', 'copyparty up2k client'),
        StringStruct('ProductVersion', '1.2.3')])
      ]),
    VarFileInfo([VarStruct('Translation', [0, 1200])])
  ]
)
48  scripts/pyinstaller/up2k.sh  Normal file
@@ -0,0 +1,48 @@
#!/bin/bash
set -e

curl -k https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.sh |
tee up2k2.sh | cmp up2k.sh && rm up2k2.sh || {
    [ -s up2k2.sh ] || exit 1
    echo "new up2k script; upgrade y/n:"
    while true; do read -u1 -n1 -r r; [[ $r =~ [yYnN] ]] && break; done
    [[ $r =~ [yY] ]] && mv up2k{2,}.sh && exec ./up2k.sh
}

uname -s | grep -E 'WOW64|NT-10' && echo need win7-32 && exit 1

dl() { curl -fkLO "$1"; }
cd ~/Downloads

dl https://192.168.123.1:3923/cpp/bin/up2k.py
dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.ico
dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.rc
dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.spec

# $LOCALAPPDATA/programs/python/python37-32/python -m pip install --user -U pyinstaller requests

grep -E '^from .ssl_ import' $APPDATA/python/python37/site-packages/urllib3/util/proxy.py && {
    echo golfing
    echo > $APPDATA/python/python37/site-packages/requests/certs.py
    sed -ri 's/^(DEFAULT_CA_BUNDLE_PATH = ).*/\1""/' $APPDATA/python/python37/site-packages/requests/utils.py
    sed -ri '/^import zipfile$/d' $APPDATA/python/python37/site-packages/requests/utils.py
    sed -ri 's/"idna"//' $APPDATA/python/python37/site-packages/requests/packages.py
    sed -ri 's/import charset_normalizer.*/pass/' $APPDATA/python/python37/site-packages/requests/compat.py
    sed -ri 's/raise.*charset_normalizer.*/pass/' $APPDATA/python/python37/site-packages/requests/__init__.py
    sed -ri 's/import charset_normalizer.*//' $APPDATA/python/python37/site-packages/requests/packages.py
    sed -ri 's/chardet.__name__/"\\roll\\tide"/' $APPDATA/python/python37/site-packages/requests/packages.py
    sed -ri 's/chardet,//' $APPDATA/python/python37/site-packages/requests/models.py
    for n in util/__init__.py connection.py; do awk -i inplace '/^from (\.util)?\.ssl_ /{s=1} !s; /^\)/{s=0}' $APPDATA/python/python37/site-packages/urllib3/$n; done
    sed -ri 's/^from .ssl_ import .*//' $APPDATA/python/python37/site-packages/urllib3/util/proxy.py
    echo golfed
}

read a b _ < <(awk -F\" '/^S_VERSION =/{$0=$2;sub(/\./," ");print}' < up2k.py)
sed -r 's/1,2,3,0/'$a,$b,0,0'/;s/1\.2\.3/'$a.$b.0/ <up2k.rc >up2k.rc2

#python uncomment.py up2k.py
$APPDATA/python/python37/scripts/pyinstaller -y --clean --upx-dir=. up2k.spec

./dist/up2k.exe --version

curl -fkT dist/up2k.exe -HPW:wark https://192.168.123.1:3923/
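Not in the diff, but for orientation: the frozen binary is used the same way as the plain bin/up2k.py client it wraps. A rough, hypothetical session against a local copyparty instance is shown below; the server URL and the upload path are placeholders, and the exact flags should be checked with the binary's own help output rather than taken from here.

# sanity-check the frozen binary (this line also appears in up2k.sh above)
./dist/up2k.exe --version

# hypothetical upload run against a local copyparty instance
./dist/up2k.exe http://127.0.0.1:3923/inc/ some-folder/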
78  scripts/pyinstaller/up2k.spec  Normal file
@@ -0,0 +1,78 @@
# -*- mode: python ; coding: utf-8 -*-


block_cipher = None


a = Analysis(
    ['up2k.py'],
    pathex=[],
    binaries=[],
    datas=[],
    hiddenimports=[],
    hookspath=[],
    hooksconfig={},
    runtime_hooks=[],
    excludes=[
        'ftplib',
        'lzma',
        'pickle',
        'ssl',
        'tarfile',
        'bz2',
        'zipfile',
        'tracemalloc',
        'zlib',
        'urllib3.util.ssl_',
        'urllib3.contrib.pyopenssl',
        'urllib3.contrib.socks',
        'certifi',
        'idna',
        'chardet',
        'charset_normalizer',
        'email.contentmanager',
        'email.policy',
        'encodings.zlib_codec',
        'encodings.base64_codec',
        'encodings.bz2_codec',
        'encodings.charmap',
        'encodings.hex_codec',
        'encodings.palmos',
        'encodings.punycode',
        'encodings.rot_13',
    ],
    win_no_prefer_redirects=False,
    win_private_assemblies=False,
    cipher=block_cipher,
    noarchive=False,
)

# this is the only change to the autogenerated specfile:
xdll = ["libcrypto-1_1.dll"]
a.binaries = TOC([x for x in a.binaries if x[0] not in xdll])

pyz = PYZ(a.pure, a.zipped_data, cipher=block_cipher)

exe = EXE(
    pyz,
    a.scripts,
    a.binaries,
    a.zipfiles,
    a.datas,
    [],
    name='up2k',
    debug=False,
    bootloader_ignore_signals=False,
    strip=False,
    upx=True,
    upx_exclude=[],
    runtime_tmpdir=None,
    console=True,
    disable_windowed_traceback=False,
    argv_emulation=False,
    target_arch=None,
    codesign_identity=None,
    entitlements_file=None,
    version='up2k.rc2',
    icon=['up2k.ico'],
)
14  scripts/pyinstaller/up2k.spec.sh  Normal file
@@ -0,0 +1,14 @@
#!/bin/bash
set -e

# grep '">encodings.cp' C:/Users/ed/dev/copyparty/bin/dist/xref-up2k.html | sed -r 's/.*encodings.cp//;s/<.*//' | sort -n | uniq | tr '\n' ,
# grep -i encodings -A1 build/up2k/xref-up2k.html | sed -r 's/.*(Missing|Excluded)Module.*//' | grep moduletype -B1 | grep -v moduletype

ex=(
    ftplib lzma pickle ssl tarfile bz2 zipfile tracemalloc zlib
    urllib3.util.ssl_ urllib3.contrib.pyopenssl urllib3.contrib.socks certifi idna chardet charset_normalizer
    email.contentmanager email.policy
    encodings.{zlib_codec,base64_codec,bz2_codec,charmap,hex_codec,palmos,punycode,rot_13}
);
cex=(); for a in "${ex[@]}"; do cex+=(--exclude "$a"); done
$APPDATA/python/python37/scripts/pyi-makespec --version-file up2k.rc2 -i up2k.ico -n up2k -c -F up2k.py "${cex[@]}"
@@ -16,7 +16,9 @@ cat $f | awk '
 		h=0
 	};
 };
-/^#/{s=1;pr()} /^#* *(install on android|dev env setup|just the sfx|complete release|optional gpl stuff)|`$/{s=0}
+/^#/{s=1;rs=0;pr()}
+/^#* *(nix package)/{rs=1}
+/^#* *(install on android|dev env setup|just the sfx|complete release|optional gpl stuff|nixos module)|`$/{s=rs}
 /^#/{
 	lv=length($1);
 	sub(/[^ ]+ /,"");
@@ -98,7 +98,7 @@ class Cfg(Namespace):
     def __init__(self, a=None, v=None, c=None):
         ka = {}

-        ex = "daw dav_inf dav_mac dotsrch e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp ed emp force_js getmod hardlink ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_thumb no_vthumb no_zip nrand nw rand vc xdev xlink xvol"
+        ex = "daw dav_inf dav_mac dotsrch e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp ed emp force_js getmod hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_thumb no_vthumb no_zip nrand nw rand vc xdev xlink xvol"
         ka.update(**{k: False for k in ex.split()})

         ex = "dotpart no_rescan no_sendfile no_voldump plain_ip"