Compare commits

...

62 Commits

Author SHA1 Message Date
ed
42099baeff v1.6.12 2023-04-20 21:41:47 +00:00
ed
2459965ca8 u2cli: dont enter delete stage if something failed 2023-04-20 20:40:09 +00:00
ed
6acf436573 u2idx pool instead of per-socket;
prevents running out of FDs thanks to thousands of sqlite3 sessions
and neatly sidesteps what could possibly be a race in python's
sqlite3 bindings where it sometimes forgets to close the fd
2023-04-20 20:36:13 +00:00
ed
f217e1ce71 correctly ignore multirange requests 2023-04-20 19:14:38 +00:00
ed
418000aee3 explain tus incompatibility + update docs 2023-04-19 21:46:33 +00:00
ed
dbbba9625b nix: make deps optional + update docs 2023-04-17 13:17:53 +02:00
Chinpo Nya
397bc92fbc rewrite the nix module config with nix options 2023-04-17 00:26:57 +02:00
Chinpo Nya
6e615dcd03 fix: remove ffmpeg from python env build inputs 2023-04-17 00:26:57 +02:00
Chinpo Nya
9ac5908b33 refactor: remove unnecessary use of 'rec' 2023-04-17 00:26:57 +02:00
Chinpo Nya
50912480b9 automate nix package updates 2023-04-17 00:26:57 +02:00
Chinpo Nya
24b9b8319d nix/nixos documentation 2023-04-17 00:26:57 +02:00
Chinpo Nya
b0f4f0b653 nixos module 2023-04-17 00:26:57 +02:00
Chinpo Nya
05bbd41c4b nix package 2023-04-17 00:26:57 +02:00
ed
8f5f8a3cda expand userhomes everywhere:
* -c
* -lo
* --hist
* hist volflag
* --ssl-log
2023-04-14 18:55:19 +02:00
ed
c8938fc033 fix ipv4 location header on dualstack 2023-04-14 14:06:44 +02:00
ed
1550350e05 update docs (performance tips, windows example) 2023-04-13 21:36:55 +00:00
ed
5cc190c026 better 2023-04-12 22:09:46 +00:00
ed
d6a0a738ce add windows example + update docs + some cosmetics 2023-04-12 22:06:44 +00:00
ed
f5fe3678ee more safari-on-touchbar-macbook workarounds:
* safari invokes pause on the mediasession
   whenever any Audio loads a new src (preload)

* ...and on some(?) seeks
2023-04-07 23:04:01 +02:00
ed
f2a7925387 avoid safari bugs on touchbar macbooks:
* songs would play backwards
* playback started immediately on folder change
2023-04-07 12:38:37 +02:00
ed
fa953ced52 update archpkg to 1.6.11 2023-04-01 22:59:20 +00:00
ed
f0000d9861 v1.6.11 2023-04-01 21:12:54 +00:00
ed
4e67516719 last.fm web-scrobbler support 2023-04-01 21:02:03 +00:00
ed
29db7a6270 deps: automate prismjs build 2023-04-01 17:46:42 +00:00
ed
852499e296 dont panic in case of extension-injected css 2023-04-01 16:08:45 +00:00
ed
f1775fd51c update deps 2023-04-01 15:15:53 +00:00
ed
4bb306932a update systemd notes 2023-04-01 10:32:12 +00:00
ed
2a37e81bd8 add rclone optimization, closes #21 2023-04-01 10:21:21 +00:00
ed
6a312ca856 something dumb 2023-04-01 00:16:30 +00:00
ed
e7f3e475a2 more accurate bpm detector 2023-03-31 21:20:37 +00:00
ed
854ba0ec06 add audio filter plugin thing 2023-03-31 20:20:28 +00:00
ed
209b49d771 remind sqlite we have indexes 2023-03-30 21:45:58 +00:00
ed
949baae539 integrate markdown thumbs with image gallery 2023-03-30 21:21:21 +00:00
ed
5f4ea27586 new hook: exif stripper 2023-03-26 22:19:15 +00:00
ed
099cc97247 hooks: more correct usage examples 2023-03-26 22:18:48 +00:00
ed
592b7d6315 gdi js 2023-03-26 02:06:49 +00:00
ed
0880bf55a1 markdown thumbnails 2023-03-26 01:53:41 +00:00
ed
4cbffec0ec u2cli: show more errors + drop --ws (does nothing) 2023-03-23 23:47:41 +00:00
ed
cc355417d4 update docs 2023-03-23 23:37:45 +00:00
ed
e2bc573e61 webdav correctness:
* generally respond without body
   (rclone likes this)
* don't connection:close on most mkcol errors
2023-03-23 23:25:00 +00:00
ed
41c0376177 update archpkg to 1.6.10 2023-03-20 23:37:20 +00:00
ed
c01cad091e v1.6.10 2023-03-20 21:56:31 +00:00
ed
eb349f339c update foldersync / rclone docs 2023-03-20 21:54:08 +00:00
ed
24d8caaf3e switch rclone to owncloud mode so it sends lastmod 2023-03-20 21:45:52 +00:00
ed
5ac2c20959 basic support for rclone sync 2023-03-20 21:17:53 +00:00
ed
bb72e6bf30 support propfind of files (not just dirs) 2023-03-20 20:58:51 +00:00
ed
d8142e866a accept last-modified from owncloud webdav extension 2023-03-20 20:28:26 +00:00
ed
7b7979fd61 add sftpgo to comparison + update docs 2023-03-19 21:45:35 +00:00
ed
749616d09d help iOS understand short audio files 2023-03-19 20:03:35 +00:00
ed
5485c6d7ca prisonparty: FFmpeg runs faster with /dev/urandom 2023-03-19 18:32:35 +00:00
ed
b7aea38d77 add iOS uploader (mk.ii) 2023-03-18 18:38:37 +00:00
ed
0ecd9f99e6 update archpkg to 1.6.9 2023-03-16 22:34:09 +00:00
ed
ca04a00662 v1.6.9 2023-03-16 21:06:18 +00:00
ed
8a09601be8 url-param ?v disables index.html 2023-03-16 20:52:43 +00:00
ed
1fe0d4693e fix logues bleeding into navpane 2023-03-16 20:23:01 +00:00
ed
bba8a3c6bc fix truncated search results 2023-03-16 20:12:13 +00:00
ed
e3d7f0c7d5 add tooltip delay to android too 2023-03-16 19:48:44 +00:00
ed
be7bb71bbc add option to show index.html instead of listing 2023-03-16 19:41:33 +00:00
ed
e0c4829ec6 verify covers against db instead of fs 2023-03-15 19:48:43 +00:00
ed
5af1575329 readme: ideas welcome w 2023-03-14 22:24:43 +00:00
ed
884f966b86 update archpkg to 1.6.8 2023-03-12 18:55:02 +00:00
ed
f6c6fbc223 fix exe builder 2023-03-12 18:54:16 +00:00
66 changed files with 2520 additions and 426 deletions

6
.gitignore vendored
View File

@@ -21,6 +21,9 @@ copyparty.egg-info/
# winmerge # winmerge
*.bak *.bak
# apple pls
.DS_Store
# derived # derived
copyparty/res/COPYING.txt copyparty/res/COPYING.txt
copyparty/web/deps/ copyparty/web/deps/
@@ -34,3 +37,6 @@ up.*.txt
.hist/ .hist/
scripts/docker/*.out scripts/docker/*.out
scripts/docker/*.err scripts/docker/*.err
# nix build output link
result

272
README.md
View File

@@ -1,29 +1,16 @@
# 💾🎉 copyparty # 💾🎉 copyparty
* portable file sharing hub (py2/py3) [(on PyPI)](https://pypi.org/project/copyparty/) turn almost any device into a file server with resumable uploads/downloads using [*any*](#browser-support) web browser
* MIT-Licensed, 2019-05-26, ed @ irc.rizon.net
* server only needs Python (2 or 3), all dependencies optional
* 🔌 protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
* 📱 [android app](#android-app) // [iPhone shortcuts](#ios-shortcuts)
## summary 👉 **[Get started](#quickstart)!** or visit the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
turn your phone or raspi into a portable file server with resumable uploads/downloads using *any* web browser
* server only needs Python (`2.7` or `3.3+`), all dependencies optional
* browse/upload with [IE4](#browser-support) / netscape4.0 on win3.11 (heh)
* protocols: [http](#the-browser) // [ftp](#ftp-server) // [webdav](#webdav-server) // [smb/cifs](#smb-server)
try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running from a basement in finland
📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer) 📷 **screenshots:** [browser](#the-browser) // [upload](#uploading) // [unpost](#unpost) // [thumbnails](#thumbnails) // [search](#searching) // [fsearch](#file-search) // [zip-DL](#zip-downloads) // [md-viewer](#markdown-viewer)
## get the app
<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
(the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet)
## readme toc ## readme toc
* top * top
@@ -52,6 +39,8 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [self-destruct](#self-destruct) - uploads can be given a lifetime * [self-destruct](#self-destruct) - uploads can be given a lifetime
* [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission) * [file manager](#file-manager) - cut/paste, rename, and delete files/folders (if you have permission)
* [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI * [batch rename](#batch-rename) - select some files and press `F2` to bring up the rename UI
* [media player](#media-player) - plays almost every audio format there is
* [audio equalizer](#audio-equalizer) - bass boosted
* [markdown viewer](#markdown-viewer) - and there are *two* editors * [markdown viewer](#markdown-viewer) - and there are *two* editors
* [other tricks](#other-tricks) * [other tricks](#other-tricks)
* [searching](#searching) - search by size, date, path/name, mp3-tags, ... * [searching](#searching) - search by size, date, path/name, mp3-tags, ...
@@ -80,18 +69,24 @@ try the **[read-only demo server](https://a.ocv.me/pub/demo/)** 👀 running fro
* [themes](#themes) * [themes](#themes)
* [complete examples](#complete-examples) * [complete examples](#complete-examples)
* [reverse-proxy](#reverse-proxy) - running copyparty next to other websites * [reverse-proxy](#reverse-proxy) - running copyparty next to other websites
* [nix package](#nix-package) - `nix profile install github:9001/copyparty`
* [nixos module](#nixos-module)
* [browser support](#browser-support) - TLDR: yes * [browser support](#browser-support) - TLDR: yes
* [client examples](#client-examples) - interact with copyparty using non-browser clients * [client examples](#client-examples) - interact with copyparty using non-browser clients
* [folder sync](#folder-sync) - sync folders to/from copyparty
* [mount as drive](#mount-as-drive) - a remote copyparty server as a local filesystem * [mount as drive](#mount-as-drive) - a remote copyparty server as a local filesystem
* [android app](#android-app) - upload to copyparty with one tap
* [iOS shortcuts](#iOS-shortcuts) - there is no iPhone app, but
* [performance](#performance) - defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload * [performance](#performance) - defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
* [client-side](#client-side) - when uploading files * [client-side](#client-side) - when uploading files
* [security](#security) - some notes on hardening * [security](#security) - some notes on hardening
* [gotchas](#gotchas) - behavior that might be unexpected * [gotchas](#gotchas) - behavior that might be unexpected
* [cors](#cors) - cross-site request config * [cors](#cors) - cross-site request config
* [https](#https) - both HTTP and HTTPS are accepted
* [recovering from crashes](#recovering-from-crashes) * [recovering from crashes](#recovering-from-crashes)
* [client crashes](#client-crashes) * [client crashes](#client-crashes)
* [frefox wsod](#frefox-wsod) - firefox 87 can crash during uploads * [frefox wsod](#frefox-wsod) - firefox 87 can crash during uploads
* [HTTP API](#HTTP-API) - see [devnotes](#./docs/devnotes.md#http-api) * [HTTP API](#HTTP-API) - see [devnotes](./docs/devnotes.md#http-api)
* [dependencies](#dependencies) - mandatory deps * [dependencies](#dependencies) - mandatory deps
* [optional dependencies](#optional-dependencies) - install these to enable bonus features * [optional dependencies](#optional-dependencies) - install these to enable bonus features
* [optional gpl stuff](#optional-gpl-stuff) * [optional gpl stuff](#optional-gpl-stuff)
@@ -108,6 +103,8 @@ just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/
* or install through pypi (python3 only): `python3 -m pip install --user -U copyparty` * or install through pypi (python3 only): `python3 -m pip install --user -U copyparty`
* or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead * or if you cannot install python, you can use [copyparty.exe](#copypartyexe) instead
* or [install through nix](#nix-package), or [on NixOS](#nixos-module)
* or if you are on android, [install copyparty in termux](#install-on-android)
* or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too * or if you prefer to [use docker](./scripts/docker/) 🐋 you can do that too
* docker has all deps built-in, so skip this step: * docker has all deps built-in, so skip this step:
@@ -126,6 +123,8 @@ enable thumbnails (images/audio/video), media indexing, and audio transcoding by
running copyparty without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes) running copyparty without arguments (for example doubleclicking it on Windows) will give everyone read/write access to the current folder; you may want [accounts and volumes](#accounts-and-volumes)
or see [complete windows example](./docs/examples/windows.md)
some recommended options: some recommended options:
* `-e2dsa` enables general [file indexing](#file-indexing) * `-e2dsa` enables general [file indexing](#file-indexing)
* `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen) * `-e2ts` enables audio metadata indexing (needs either FFprobe or Mutagen)
@@ -138,10 +137,11 @@ some recommended options:
you may also want these, especially on servers: you may also want these, especially on servers:
* [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service * [contrib/systemd/copyparty.service](contrib/systemd/copyparty.service) to run copyparty as a systemd service (see guide inside)
* [contrib/systemd/prisonparty.service](contrib/systemd/prisonparty.service) to run it in a chroot (for extra security) * [contrib/systemd/prisonparty.service](contrib/systemd/prisonparty.service) to run it in a chroot (for extra security)
* [contrib/rc/copyparty](contrib/rc/copyparty) to run copyparty on FreeBSD * [contrib/rc/copyparty](contrib/rc/copyparty) to run copyparty on FreeBSD
* [contrib/nginx/copyparty.conf](contrib/nginx/copyparty.conf) to [reverse-proxy](#reverse-proxy) behind nginx (for better https) * [contrib/nginx/copyparty.conf](contrib/nginx/copyparty.conf) to [reverse-proxy](#reverse-proxy) behind nginx (for better https)
* [nixos module](#nixos-module) to run copyparty on NixOS hosts
and remember to open the ports you want; here's a complete example including every feature copyparty has to offer: and remember to open the ports you want; here's a complete example including every feature copyparty has to offer:
``` ```
@@ -165,14 +165,18 @@ firewall-cmd --reload
* ☑ [smb/cifs server](#smb-server) * ☑ [smb/cifs server](#smb-server)
* ☑ [qr-code](#qr-code) for quick access * ☑ [qr-code](#qr-code) for quick access
* ☑ [upnp / zeroconf / mdns / ssdp](#zeroconf) * ☑ [upnp / zeroconf / mdns / ssdp](#zeroconf)
* ☑ [event hooks](#event-hooks) / script runner
* ☑ [reverse-proxy support](https://github.com/9001/copyparty#reverse-proxy)
* upload * upload
* ☑ basic: plain multipart, ie6 support * ☑ basic: plain multipart, ie6 support
* ☑ [up2k](#uploading): js, resumable, multithreaded * ☑ [up2k](#uploading): js, resumable, multithreaded
* unaffected by cloudflare's max-upload-size (100 MiB) * unaffected by cloudflare's max-upload-size (100 MiB)
* ☑ stash: simple PUT filedropper * ☑ stash: simple PUT filedropper
* ☑ filename randomizer
* ☑ write-only folders
* ☑ [unpost](#unpost): undo/delete accidental uploads * ☑ [unpost](#unpost): undo/delete accidental uploads
* ☑ [self-destruct](#self-destruct) (specified server-side or client-side) * ☑ [self-destruct](#self-destruct) (specified server-side or client-side)
* ☑ symlink/discard existing files (content-matching) * ☑ symlink/discard duplicates (content-matching)
* download * download
* ☑ single files in browser * ☑ single files in browser
* ☑ [folders as zip / tar files](#zip-downloads) * ☑ [folders as zip / tar files](#zip-downloads)
@@ -193,10 +197,15 @@ firewall-cmd --reload
* ☑ [locate files by contents](#file-search) * ☑ [locate files by contents](#file-search)
* ☑ search by name/path/date/size * ☑ search by name/path/date/size
* ☑ [search by ID3-tags etc.](#searching) * ☑ [search by ID3-tags etc.](#searching)
* client support
* ☑ [folder sync](#folder-sync)
* ☑ [curl-friendly](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png)
* markdown * markdown
* ☑ [viewer](#markdown-viewer) * ☑ [viewer](#markdown-viewer)
* ☑ editor (sure why not) * ☑ editor (sure why not)
PS: something missing? post any crazy ideas you've got as a [feature request](https://github.com/9001/copyparty/issues/new?assignees=9001&labels=enhancement&template=feature_request.md) or [discussion](https://github.com/9001/copyparty/discussions/new?category=ideas) 🤙
## testimonials ## testimonials
@@ -268,7 +277,7 @@ server notes:
* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11) * iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
* *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume * *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume
* "future" because `AudioContext` is broken in the current iOS version (15.1), maybe one day... * "future" because `AudioContext` can't maintain a stable playback speed in the current iOS version (15.7), maybe one day...
* Windows: folders cannot be accessed if the name ends with `.` * Windows: folders cannot be accessed if the name ends with `.`
* python or windows bug * python or windows bug
@@ -648,12 +657,63 @@ or a mix of both:
the metadata keys you can use in the format field are the ones in the file-browser table header (whatever is collected with `-mte` and `-mtp`) the metadata keys you can use in the format field are the ones in the file-browser table header (whatever is collected with `-mte` and `-mtp`)
## media player
plays almost every audio format there is (if the server has FFmpeg installed for on-demand transcoding)
the following audio formats are usually always playable, even without FFmpeg: `aac|flac|m4a|mp3|ogg|opus|wav`
some highlights:
* OS integration; control playback from your phone's lockscreen ([windows](https://user-images.githubusercontent.com/241032/233213022-298a98ba-721a-4cf1-a3d4-f62634bc53d5.png) // [iOS](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) // [android](https://user-images.githubusercontent.com/241032/233212311-a7368590-08c7-4f9f-a1af-48ccf3f36fad.png))
* shows the audio waveform in the seekbar
* not perfectly gapless but can get really close (see settings + eq below); good enough to enjoy gapless albums as intended
click the `play` link next to an audio file, or copy the link target to [share it](https://a.ocv.me/pub/demo/music/Ubiktune%20-%20SOUNDSHOCK%202%20-%20FM%20FUNK%20TERRROR!!/#af-1fbfba61&t=18) (optionally with a timestamp to start playing from, like that example does)
open the `[🎺]` media-player-settings tab to configure it,
* switches:
* `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
* `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
* `[~s]` toggles the seekbar waveform display
* `[/np]` enables buttons to copy the now-playing info as an irc message
* `[os-ctl]` makes it possible to control audio playback from the lockscreen of your device (enables [mediasession](https://developer.mozilla.org/en-US/docs/Web/API/MediaSession))
* `[seek]` allows seeking with lockscreen controls (buggy on some devices)
* `[art]` shows album art on the lockscreen
* `[🎯]` keeps the playing song scrolled into view (good when using the player as a taskbar dock)
* `[⟎]` shrinks the playback controls
* playback mode:
* `[loop]` keeps looping the folder
* `[next]` plays into the next folder
* transcode:
* `[flac]` converts `flac` and `wav` files into opus
* `[aac]` converts `aac` and `m4a` files into opus
* `[oth]` converts all other known formats into opus
* `aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|m4a|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|tak|tta|ulaw|wav|wma|wv|xm|xpk`
* "tint" reduces the contrast of the playback bar
### audio equalizer
bass boosted
can also boost the volume in general, or increase/decrease stereo width (like [crossfeed](https://www.foobar2000.org/components/view/foo_dsp_meiercf) just worse)
has the convenient side-effect of reducing the pause between songs, so gapless albums play better with the eq enabled (just make it flat)
## markdown viewer ## markdown viewer
and there are *two* editors and there are *two* editors
![copyparty-md-read-fs8](https://user-images.githubusercontent.com/241032/115978057-66419080-a57d-11eb-8539-d2be843991aa.png) ![copyparty-md-read-fs8](https://user-images.githubusercontent.com/241032/115978057-66419080-a57d-11eb-8539-d2be843991aa.png)
there is a built-in extension for inline clickable thumbnails;
* enable it by adding `<!-- th -->` somewhere in the doc
* add thumbnails with `!th[l](your.jpg)` where `l` means left-align (`r` = right-align)
* a single line with `---` clears the float / inlining
* in the case of README.md being displayed below a file listing, thumbnails will open in the gallery viewer
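for example, a small sketch of a doc using the extension (image names are placeholders):

```md
<!-- th -->

!th[l](front.jpg) a paragraph which wraps around the left-aligned thumbnail

---

!th[r](back.jpg) and this one wraps around a right-aligned thumbnail
```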
other notes,
* the document preview has a max-width which is the same as an A4 paper when printed * the document preview has a max-width which is the same as an A4 paper when printed
@@ -1077,6 +1137,8 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
## complete examples ## complete examples
* [running on windows](./docs/examples/windows.md)
* read-only music server * read-only music server
`python copyparty-sfx.py -v /mnt/nas/music:/music:r -e2dsa -e2ts --no-robots --force-js --theme 2` `python copyparty-sfx.py -v /mnt/nas/music:/music:r -e2dsa -e2ts --no-robots --force-js --theme 2`
@@ -1092,19 +1154,125 @@ see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where
## reverse-proxy ## reverse-proxy
running copyparty next to other websites hosted on an existing webserver such as nginx or apache running copyparty next to other websites hosted on an existing webserver such as nginx, caddy, or apache
you can either: you can either:
* give copyparty its own domain or subdomain (recommended) * give copyparty its own domain or subdomain (recommended)
* or do location-based proxying, using `--rp-loc=/stuff` to tell copyparty where it is mounted -- has a slight performance cost and higher chance of bugs * or do location-based proxying, using `--rp-loc=/stuff` to tell copyparty where it is mounted -- has a slight performance cost and higher chance of bugs
* if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below * if copyparty says `incorrect --rp-loc or webserver config; expected vpath starting with [...]` it's likely because the webserver is stripping away the proxy location from the request URLs -- see the `ProxyPass` in the apache example below
some reverse proxies (such as [Caddy](https://caddyserver.com/)) can automatically obtain a valid https/tls certificate for you, and some support HTTP/2 and QUIC which could be a nice speed boost
example webserver configs: example webserver configs:
* [nginx config](contrib/nginx/copyparty.conf) -- entire domain/subdomain * [nginx config](contrib/nginx/copyparty.conf) -- entire domain/subdomain
* [apache2 config](contrib/apache/copyparty.conf) -- location-based * [apache2 config](contrib/apache/copyparty.conf) -- location-based
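on the copyparty side, a minimal sketch of the location-based variant (assuming the webserver forwards `/stuff` to copyparty on its default port 3923):

```
# only listen on localhost and tell copyparty it is mounted at /stuff
python3 copyparty-sfx.py -i 127.0.0.1 -p 3923 --rp-loc=/stuff
```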
## nix package
`nix profile install github:9001/copyparty`
requires a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of nix
some recommended dependencies are enabled by default; [override the package](https://github.com/9001/copyparty/blob/hovudstraum/contrib/package/nix/copyparty/default.nix#L3-L22) if you want to add/remove some features/deps
`ffmpeg-full` was chosen over `ffmpeg-headless` mainly because we need `withWebp` (and `withOpenmpt` is also nice) and being able to use a cached build felt more important than optimizing for size at the time -- PRs welcome if you disagree 👍
## nixos module
for this setup, you will need a [flake-enabled](https://nixos.wiki/wiki/Flakes) installation of NixOS.
```nix
{
# add copyparty flake to your inputs
inputs.copyparty.url = "github:9001/copyparty";
# ensure that copyparty is an allowed argument to the outputs function
outputs = { self, nixpkgs, copyparty }: {
nixosConfigurations.yourHostName = nixpkgs.lib.nixosSystem {
modules = [
# load the copyparty NixOS module
copyparty.nixosModules.default
({ pkgs, ... }: {
# add the copyparty overlay to expose the package to the module
nixpkgs.overlays = [ copyparty.overlays.default ];
# (optional) install the package globally
environment.systemPackages = [ pkgs.copyparty ];
# configure the copyparty module
services.copyparty.enable = true;
})
];
};
};
}
```
copyparty on NixOS is configured via `services.copyparty` options, for example:
```nix
services.copyparty = {
enable = true;
# directly maps to values in the [global] section of the copyparty config.
# see `copyparty --help` for available options
settings = {
i = "0.0.0.0";
# use lists to set multiple values
p = [ 3210 3211 ];
# use booleans to set binary flags
no-reload = true;
# using 'false' will do nothing and omit the value when generating a config
ignored-flag = false;
};
# create users
accounts = {
# specify the account name as the key
ed = {
# provide the path to a file containing the password, keeping it out of /nix/store
# must be readable by the copyparty service user
passwordFile = "/run/keys/copyparty/ed_password";
};
# or do both in one go
k.passwordFile = "/run/keys/copyparty/k_password";
};
# create a volume
volumes = {
# create a volume at "/" (the webroot), which will
"/" = {
# share the contents of "/srv/copyparty"
path = "/srv/copyparty";
# see `copyparty --help-accounts` for available options
access = {
# everyone gets read-access, but
r = "*";
# users "ed" and "k" get read-write
rw = [ "ed" "k" ];
};
# see `copyparty --help-flags` for available options
flags = {
# "fk" enables filekeys (necessary for upget permission) (4 chars long)
fk = 4;
# scan for new files every 60sec
scan = 60;
# volflag "e2d" enables the uploads database
e2d = true;
# "d2t" disables multimedia parsers (in case the uploads are malicious)
d2t = true;
# skips hashing file contents if path matches *.iso
nohash = "\.iso$";
};
};
};
# you may increase the open file limit for the process
openFilesLimit = 8192;
};
```
the passwordFile at /run/keys/copyparty/ could for example be generated by [agenix](https://github.com/ryantm/agenix), or you could just dump it in the nix store instead if that's acceptable
# browser support # browser support
TLDR: yes TLDR: yes
@@ -1176,7 +1344,7 @@ interact with copyparty using non-browser clients
* `(printf 'PUT / HTTP/1.1\r\n\r\n'; cat movie.mkv) >/dev/tcp/127.0.0.1/3923` * `(printf 'PUT / HTTP/1.1\r\n\r\n'; cat movie.mkv) >/dev/tcp/127.0.0.1/3923`
* python: [up2k.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) is a command-line up2k client [(webm)](https://ocv.me/stuff/u2cli.webm) * python: [up2k.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) is a command-line up2k client [(webm)](https://ocv.me/stuff/u2cli.webm)
* file uploads, file-search, folder sync, autoresume of aborted/broken uploads * file uploads, file-search, [folder sync](#folder-sync), autoresume of aborted/broken uploads
* can be downloaded from copyparty: controlpanel -> connect -> [up2k.py](http://127.0.0.1:3923/.cpr/a/up2k.py) * can be downloaded from copyparty: controlpanel -> connect -> [up2k.py](http://127.0.0.1:3923/.cpr/a/up2k.py)
* see [./bin/README.md#up2kpy](bin/README.md#up2kpy) * see [./bin/README.md#up2kpy](bin/README.md#up2kpy)
@@ -1197,17 +1365,27 @@ you can provide passwords using header `PW: hunter2`, cookie `cppwd=hunter2`, ur
NOTE: curl will not send the original filename if you use `-T` combined with url-params! Also, make sure to always leave a trailing slash in URLs unless you want to override the filename NOTE: curl will not send the original filename if you use `-T` combined with url-params! Also, make sure to always leave a trailing slash in URLs unless you want to override the filename
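for example, a rough sketch of a curl upload (the `/inc/` volume and the password are placeholders):

```
# upload file.bin into /inc/ keeping its filename (note the trailing slash); password goes in the PW header
curl -T file.bin -H 'PW: hunter2' http://127.0.0.1:3923/inc/
```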
## folder sync
sync folders to/from copyparty
the commandline uploader [up2k.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy) with `--dr` is the best way to sync a folder to copyparty; it verifies checksums, uploads files in parallel, and deletes unexpected files on the server once the upload has finished, which makes file-renames really cheap (it'll rename serverside and skip uploading)
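a rough sketch of such a sync (server URL, password and folder are placeholders; see `up2k.py --help` for the exact flags):

```
# mirror ./music into the /music volume, deleting serverside files that no longer exist locally
python3 up2k.py -a hunter2 --dr http://127.0.0.1:3923/music/ ./music/
```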
alternatively there is [rclone](./docs/rclone.md) which allows for bidirectional sync and is *way* more flexible (stream files straight from sftp/s3/gcs to copyparty, ...), although there is no integrity check and it won't work with files over 100 MiB if copyparty is behind cloudflare
* starting from rclone v1.63 (currently [in beta](https://beta.rclone.org/?filter=latest)), rclone will also be faster than up2k.py
## mount as drive ## mount as drive
a remote copyparty server as a local filesystem; go to the control-panel and click `connect` to see a list of commands to do that a remote copyparty server as a local filesystem; go to the control-panel and click `connect` to see a list of commands to do that
alternatively, some alternatives roughly sorted by speed (unreproducible benchmark), best first: alternatively, some alternatives roughly sorted by speed (unreproducible benchmark), best first:
* [rclone-http](./docs/rclone.md) (25s), read-only * [rclone-webdav](./docs/rclone.md) (25s), read/WRITE ([v1.63-beta](https://beta.rclone.org/?filter=latest))
* [rclone-http](./docs/rclone.md) (26s), read-only
* [partyfuse.py](./bin/#partyfusepy) (35s), read-only
* [rclone-ftp](./docs/rclone.md) (47s), read/WRITE * [rclone-ftp](./docs/rclone.md) (47s), read/WRITE
* [rclone-webdav](./docs/rclone.md) (51s), read/WRITE
* copyparty-1.5.0's webdav server is faster than rclone-1.60.0 (69s)
* [partyfuse.py](./bin/#partyfusepy) (71s), read-only
* davfs2 (103s), read/WRITE, *very fast* on small files * davfs2 (103s), read/WRITE, *very fast* on small files
* [win10-webdav](#webdav-server) (138s), read/WRITE * [win10-webdav](#webdav-server) (138s), read/WRITE
* [win10-smb2](#smb-server) (387s), read/WRITE * [win10-smb2](#smb-server) (387s), read/WRITE
@@ -1215,6 +1393,27 @@ alternatively, some alternatives roughly sorted by speed (unreproducible benchma
most clients will fail to mount the root of a copyparty server unless there is a root volume (so you get the admin-panel instead of a browser when accessing it) -- in that case, mount a specific volume instead most clients will fail to mount the root of a copyparty server unless there is a root volume (so you get the admin-panel instead of a browser when accessing it) -- in that case, mount a specific volume instead
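as a sketch of the rclone-webdav option, assuming a webdav remote named `cpp` has already been configured as described in [./docs/rclone.md](./docs/rclone.md):

```
# mount the copyparty webdav remote as a local filesystem
rclone mount --vfs-cache-mode writes cpp: /mnt/copyparty
```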
# android app
upload to copyparty with one tap
<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet
if you want to run the copyparty server on your android device, see [install on android](#install-on-android)
# iOS shortcuts
there is no iPhone app, but the following shortcuts are almost as good:
* [upload to copyparty](https://www.icloud.com/shortcuts/41e98dd985cb4d3bb433222bc1e9e770) ([offline](https://github.com/9001/copyparty/raw/hovudstraum/contrib/ios/upload-to-copyparty.shortcut)) ([png](https://user-images.githubusercontent.com/241032/226118053-78623554-b0ed-482e-98e4-6d57ada58ea4.png)) based on the [original](https://www.icloud.com/shortcuts/ab415d5b4de3467b9ce6f151b439a5d7) by [Daedren](https://github.com/Daedren) (thx!)
* can strip exif, upload files, pics, vids, links, clipboard
* can download links and rehost the target file on copyparty (see first comment inside the shortcut)
* pics become lowres if you share from gallery to shortcut, so better to launch the shortcut and pick stuff from there
# performance # performance
defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload defaults are usually fine - expect `8 GiB/s` download, `1 GiB/s` upload
@@ -1226,11 +1425,13 @@ below are some tweaks roughly ordered by usefulness:
* `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set * `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable * `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger) * `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
* `-j` enables multiprocessing (actual multithreading) and can make copyparty perform better in cpu-intensive workloads, for example: * `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
* huge amount of short-lived connections * lots of connections (many users or heavy clients)
* simultaneous downloads and uploads saturating a 20gbps connection * simultaneous downloads and uploads saturating a 20gbps connection
...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u ...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
* using [pypy](https://www.pypy.org/) instead of [cpython](https://www.python.org/) *can* be 70% faster for some workloads, but slower for many others
* and pypy can sometimes crash on startup with `-j0` (TODO make issue)
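a sketch combining a few of the tweaks above (paths are placeholders):

```
# read-only music volume with file/tag indexing, db on a fast ssd, no content-hashing, multiprocessing enabled
python3 copyparty-sfx.py -v /mnt/nas/music:/music:r -e2dsa -e2ts --hist /ssd/copyparty-hist --no-hash . -j0
```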
## client-side ## client-side
@@ -1310,6 +1511,13 @@ by default, except for `GET` and `HEAD` operations, all requests must either:
cors can be configured with `--acao` and `--acam`, or the protections entirely disabled with `--allow-csrf` cors can be configured with `--acao` and `--acam`, or the protections entirely disabled with `--allow-csrf`
## https
both HTTP and HTTPS are accepted by default, but letting a [reverse proxy](#reverse-proxy) handle the https/tls/ssl would be better (probably more secure by default)
copyparty doesn't speak HTTP/2 or QUIC, so using a reverse proxy would solve that as well
# recovering from crashes # recovering from crashes
## client crashes ## client crashes
@@ -1332,7 +1540,7 @@ however you can hit `F12` in the up2k tab and use the devtools to see how far yo
# HTTP API # HTTP API
see [devnotes](#./docs/devnotes.md#http-api) see [devnotes](./docs/devnotes.md#http-api)
# dependencies # dependencies
@@ -1377,7 +1585,7 @@ these are standalone programs and will never be imported / evaluated by copypart
the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](#./docs/devnotes.md#sfx-repack) you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack](./docs/devnotes.md#sfx-repack)
## copyparty.exe ## copyparty.exe

View File

@@ -10,6 +10,7 @@ run copyparty with `--help-hooks` for usage details / hook type explanations (xb
# after upload # after upload
* [notify.py](notify.py) shows a desktop notification ([example](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png)) * [notify.py](notify.py) shows a desktop notification ([example](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png))
* [notify2.py](notify2.py) uses the json API to show more context * [notify2.py](notify2.py) uses the json API to show more context
* [image-noexif.py](image-noexif.py) removes image exif by overwriting / directly editing the uploaded file
* [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png)) * [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
* [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable * [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable

View File

@@ -13,9 +13,15 @@ example usage as global config:
--xau f,t5,j,bin/hooks/discord-announce.py --xau f,t5,j,bin/hooks/discord-announce.py
example usage as a volflag (per-volume config): example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xau=f,t5,j,bin/hooks/discord-announce.py -v srv/inc:inc:r:rw,ed:c,xau=f,t5,j,bin/hooks/discord-announce.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained, parameters explained,
xau = execute after upload
f = fork; don't wait for it to finish f = fork; don't wait for it to finish
t5 = timeout if it's still running after 5 sec t5 = timeout if it's still running after 5 sec
j = provide upload information as json; not just the filename j = provide upload information as json; not just the filename
@@ -30,6 +36,7 @@ then use this to design your message: https://discohook.org/
def main(): def main():
WEBHOOK = "https://discord.com/api/webhooks/1234/base64" WEBHOOK = "https://discord.com/api/webhooks/1234/base64"
WEBHOOK = "https://discord.com/api/webhooks/1066830390280597718/M1TDD110hQA-meRLMRhdurych8iyG35LDoI1YhzbrjGP--BXNZodZFczNVwK4Ce7Yme5"
# read info from copyparty # read info from copyparty
inf = json.loads(sys.argv[1]) inf = json.loads(sys.argv[1])

72
bin/hooks/image-noexif.py Executable file
View File

@@ -0,0 +1,72 @@
#!/usr/bin/env python3
import os
import sys
import subprocess as sp
_ = r"""
remove exif tags from uploaded images; the eventhook edition of
https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py
dependencies:
exiftool / perl-Image-ExifTool
being an upload hook, this will take effect after upload completion
but before copyparty has hashed/indexed the file, which means that
copyparty will never index the original file, so deduplication will
not work as expected... which is mostly OK but ehhh
note: modifies the file in-place, so don't set the `f` (fork) flag
example usages; either as global config (all volumes) or as volflag:
--xau bin/hooks/image-noexif.py
-v srv/inc:inc:r:rw,ed:c,xau=bin/hooks/image-noexif.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
explained:
share fs-path srv/inc at /inc (readable by all, read-write for user ed)
running this xau (execute-after-upload) plugin for all uploaded files
"""
# filetypes to process; ignores everything else
EXTS = ("jpg", "jpeg", "avif", "heif", "heic")
try:
from copyparty.util import fsenc
except:
def fsenc(p):
return p.encode("utf-8")
def main():
fp = sys.argv[1]
ext = fp.lower().split(".")[-1]
if ext not in EXTS:
return
cwd, fn = os.path.split(fp)
os.chdir(cwd)
f1 = fsenc(fn)
cmd = [
b"exiftool",
b"-exif:all=",
b"-iptc:all=",
b"-xmp:all=",
b"-P",
b"-overwrite_original",
b"--",
f1,
]
sp.check_output(cmd)
print("image-noexif: stripped")
if __name__ == "__main__":
try:
main()
except:
pass

View File

@@ -17,8 +17,12 @@ depdencies:
example usages; either as global config (all volumes) or as volflag: example usages; either as global config (all volumes) or as volflag:
--xau f,bin/hooks/notify.py --xau f,bin/hooks/notify.py
-v srv/inc:inc:c,xau=f,bin/hooks/notify.py -v srv/inc:inc:r:rw,ed:c,xau=f,bin/hooks/notify.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained, parameters explained,
xau = execute after upload xau = execute after upload

View File

@@ -15,9 +15,13 @@ and also supports --xm (notify on 📟 message)
example usages; either as global config (all volumes) or as volflag: example usages; either as global config (all volumes) or as volflag:
--xm f,j,bin/hooks/notify2.py --xm f,j,bin/hooks/notify2.py
--xau f,j,bin/hooks/notify2.py --xau f,j,bin/hooks/notify2.py
-v srv/inc:inc:c,xm=f,j,bin/hooks/notify2.py -v srv/inc:inc:r:rw,ed:c,xm=f,j,bin/hooks/notify2.py
-v srv/inc:inc:c,xau=f,j,bin/hooks/notify2.py -v srv/inc:inc:r:rw,ed:c,xau=f,j,bin/hooks/notify2.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads / msgs with the params listed below)
parameters explained, parameters explained,
xau = execute after upload xau = execute after upload

View File

@@ -10,7 +10,12 @@ example usage as global config:
--xbu c,bin/hooks/reject-extension.py --xbu c,bin/hooks/reject-extension.py
example usage as a volflag (per-volume config): example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xbu=c,bin/hooks/reject-extension.py -v srv/inc:inc:r:rw,ed:c,xbu=c,bin/hooks/reject-extension.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained, parameters explained,
xbu = execute before upload xbu = execute before upload

View File

@@ -17,7 +17,12 @@ example usage as global config:
--xau c,bin/hooks/reject-mimetype.py --xau c,bin/hooks/reject-mimetype.py
example usage as a volflag (per-volume config): example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xau=c,bin/hooks/reject-mimetype.py -v srv/inc:inc:r:rw,ed:c,xau=c,bin/hooks/reject-mimetype.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all uploads with the params listed below)
parameters explained, parameters explained,
xau = execute after upload xau = execute after upload

View File

@@ -15,9 +15,15 @@ example usage as global config:
--xm f,j,t3600,bin/hooks/wget.py --xm f,j,t3600,bin/hooks/wget.py
example usage as a volflag (per-volume config): example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xm=f,j,t3600,bin/hooks/wget.py -v srv/inc:inc:r:rw,ed:c,xm=f,j,t3600,bin/hooks/wget.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on all messages with the params listed below)
parameters explained, parameters explained,
xm = execute on message-to-server-log
f = fork so it doesn't block uploads f = fork so it doesn't block uploads
j = provide message information as json; not just the text j = provide message information as json; not just the text
c3 = mute all output c3 = mute all output

View File

@@ -18,7 +18,12 @@ example usage as global config:
--xiu i5,j,bin/hooks/xiu-sha.py --xiu i5,j,bin/hooks/xiu-sha.py
example usage as a volflag (per-volume config): example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xiu=i5,j,bin/hooks/xiu-sha.py -v srv/inc:inc:r:rw,ed:c,xiu=i5,j,bin/hooks/xiu-sha.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on batches of uploads with the params listed below)
parameters explained, parameters explained,
xiu = execute after uploads... xiu = execute after uploads...

View File

@@ -15,7 +15,12 @@ example usage as global config:
--xiu i1,j,bin/hooks/xiu.py --xiu i1,j,bin/hooks/xiu.py
example usage as a volflag (per-volume config): example usage as a volflag (per-volume config):
-v srv/inc:inc:c,xiu=i1,j,bin/hooks/xiu.py -v srv/inc:inc:r:rw,ed:c,xiu=i1,j,bin/hooks/xiu.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^
(share filesystem-path srv/inc as volume /inc,
readable by everyone, read-write for user 'ed',
running this plugin on batches of uploads with the params listed below)
parameters explained, parameters explained,
xiu = execute after uploads... xiu = execute after uploads...

View File

@@ -16,6 +16,10 @@ dep: ffmpeg
""" """
# save beat timestamps to ".beats/filename.txt"
SAVE = False
def det(tf): def det(tf):
# fmt: off # fmt: off
sp.check_call([ sp.check_call([
@@ -23,12 +27,11 @@ def det(tf):
b"-nostdin", b"-nostdin",
b"-hide_banner", b"-hide_banner",
b"-v", b"fatal", b"-v", b"fatal",
b"-ss", b"13",
b"-y", b"-i", fsenc(sys.argv[1]), b"-y", b"-i", fsenc(sys.argv[1]),
b"-map", b"0:a:0", b"-map", b"0:a:0",
b"-ac", b"1", b"-ac", b"1",
b"-ar", b"22050", b"-ar", b"22050",
b"-t", b"300", b"-t", b"360",
b"-f", b"f32le", b"-f", b"f32le",
fsenc(tf) fsenc(tf)
]) ])
@@ -47,10 +50,29 @@ def det(tf):
print(c["list"][0]["label"].split(" ")[0]) print(c["list"][0]["label"].split(" ")[0])
return return
# throws if detection failed: # throws if detection failed:
bpm = float(cl[-1]["timestamp"] - cl[1]["timestamp"]) beats = [float(x["timestamp"]) for x in cl]
bpm = round(60 * ((len(cl) - 1) / bpm), 2) bds = [b - a for a, b in zip(beats, beats[1:])]
print(f"{bpm:.2f}") bds.sort()
n0 = int(len(bds) * 0.2)
n1 = int(len(bds) * 0.75) + 1
bds = bds[n0:n1]
bpm = sum(bds)
bpm = round(60 * (len(bds) / bpm), 2)
print(f"{bpm:.2f}")
if SAVE:
fdir, fname = os.path.split(sys.argv[1])
bdir = os.path.join(fdir, ".beats")
try:
os.mkdir(fsenc(bdir))
except:
pass
fp = os.path.join(bdir, fname) + ".txt"
with open(fsenc(fp), "wb") as f:
txt = "\n".join([f"{x:.2f}" for x in beats])
f.write(txt.encode("utf-8"))
def main(): def main():

View File

@@ -118,6 +118,15 @@ mkdir -p "$jail/tmp"
chmod 777 "$jail/tmp" chmod 777 "$jail/tmp"
# create a dev
(cd $jail; mkdir -p dev; cd dev
[ -e null ] || mknod -m 666 null c 1 3
[ -e zero ] || mknod -m 666 zero c 1 5
[ -e random ] || mknod -m 444 random c 1 8
[ -e urandom ] || mknod -m 444 urandom c 1 9
)
# run copyparty # run copyparty
export HOME=$(getent passwd $uid | cut -d: -f6) export HOME=$(getent passwd $uid | cut -d: -f6)
export USER=$(getent passwd $uid | cut -d: -f1) export USER=$(getent passwd $uid | cut -d: -f1)

View File

@@ -1,8 +1,8 @@
#!/usr/bin/env python3 #!/usr/bin/env python3
from __future__ import print_function, unicode_literals from __future__ import print_function, unicode_literals
S_VERSION = "1.5" S_VERSION = "1.6"
S_BUILD_DT = "2023-03-12" S_BUILD_DT = "2023-04-20"
""" """
up2k.py: upload to copyparty up2k.py: upload to copyparty
@@ -653,6 +653,7 @@ class Ctl(object):
return nfiles, nbytes return nfiles, nbytes
def __init__(self, ar, stats=None): def __init__(self, ar, stats=None):
self.ok = False
self.ar = ar self.ar = ar
self.stats = stats or self._scan() self.stats = stats or self._scan()
if not self.stats: if not self.stats:
@@ -700,6 +701,8 @@ class Ctl(object):
self._fancy() self._fancy()
self.ok = True
def _safe(self): def _safe(self):
"""minimal basic slow boring fallback codepath""" """minimal basic slow boring fallback codepath"""
search = self.ar.s search = self.ar.s
@@ -1009,8 +1012,9 @@ class Ctl(object):
file, cid = task file, cid = task
try: try:
upload(file, cid, self.ar.a, stats) upload(file, cid, self.ar.a, stats)
except: except Exception as ex:
eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8])) t = "upload failed, retrying: {0} #{1} ({2})\n"
eprint(t.format(file.name, cid[:8], ex))
# handshake will fix it # handshake will fix it
with self.mutex: with self.mutex:
@@ -1049,6 +1053,8 @@ def main():
print(ver) print(ver)
return return
sys.argv = [x for x in sys.argv if x != "--ws"]
# fmt: off # fmt: off
ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog=""" ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog="""
NOTE: NOTE:
@@ -1067,7 +1073,6 @@ source file/folder selection uses rsync syntax, meaning that:
ap = app.add_argument_group("compatibility") ap = app.add_argument_group("compatibility")
ap.add_argument("--cls", action="store_true", help="clear screen before start") ap.add_argument("--cls", action="store_true", help="clear screen before start")
ap.add_argument("--ws", action="store_true", help="copyparty is running on windows; wait before deleting files after uploading")
ap = app.add_argument_group("folder sync") ap = app.add_argument_group("folder sync")
ap.add_argument("--dl", action="store_true", help="delete local files after uploading") ap.add_argument("--dl", action="store_true", help="delete local files after uploading")
@@ -1129,15 +1134,13 @@ source file/folder selection uses rsync syntax, meaning that:
ctl = Ctl(ar) ctl = Ctl(ar)
if ar.dr and not ar.drd: if ar.dr and not ar.drd and ctl.ok:
print("\npass 2/2: delete") print("\npass 2/2: delete")
if getattr(ctl, "up_br") and ar.ws:
# wait for up2k to mtime if there was uploads
time.sleep(4)
ar.drd = True ar.drd = True
ar.z = True ar.z = True
Ctl(ar, ctl.stats) ctl = Ctl(ar, ctl.stats)
sys.exit(0 if ctl.ok else 1)
if __name__ == "__main__": if __name__ == "__main__":

Binary file not shown.

View File

@@ -39,3 +39,9 @@ server {
proxy_set_header Connection "Keep-Alive"; proxy_set_header Connection "Keep-Alive";
} }
} }
# default client_max_body_size (1M) blocks uploads larger than 256 MiB
client_max_body_size 1024M;
client_header_timeout 610m;
client_body_timeout 610m;
send_timeout 610m;

View File

@@ -0,0 +1,281 @@
{ config, pkgs, lib, ... }:
with lib;
let
mkKeyValue = key: value:
if value == true then
# sets with a true boolean value are coerced to just the key name
key
else if value == false then
# or omitted completely when false
""
else
(generators.mkKeyValueDefault { inherit mkValueString; } ": " key value);
mkAttrsString = value: (generators.toKeyValue { inherit mkKeyValue; } value);
mkValueString = value:
if isList value then
(concatStringsSep ", " (map mkValueString value))
else if isAttrs value then
"\n" + (mkAttrsString value)
else
(generators.mkValueStringDefault { } value);
mkSectionName = value: "[" + (escape [ "[" "]" ] value) + "]";
mkSection = name: attrs: ''
${mkSectionName name}
${mkAttrsString attrs}
'';
mkVolume = name: attrs: ''
${mkSectionName name}
${attrs.path}
${mkAttrsString {
accs = attrs.access;
flags = attrs.flags;
}}
'';
passwordPlaceholder = name: "{{password-${name}}}";
accountsWithPlaceholders = mapAttrs (name: attrs: passwordPlaceholder name);
configStr = ''
${mkSection "global" cfg.settings}
${mkSection "accounts" (accountsWithPlaceholders cfg.accounts)}
${concatStringsSep "\n" (mapAttrsToList mkVolume cfg.volumes)}
'';
name = "copyparty";
cfg = config.services.copyparty;
configFile = pkgs.writeText "${name}.conf" configStr;
runtimeConfigPath = "/run/${name}/${name}.conf";
home = "/var/lib/${name}";
defaultShareDir = "${home}/data";
in {
options.services.copyparty = {
enable = mkEnableOption "web-based file manager";
package = mkOption {
type = types.package;
default = pkgs.copyparty;
defaultText = "pkgs.copyparty";
description = ''
Package of the application to run, exposed for overriding purposes.
'';
};
openFilesLimit = mkOption {
default = 4096;
type = types.either types.int types.str;
description = "Number of files to allow copyparty to open.";
};
settings = mkOption {
type = types.attrs;
description = ''
Global settings to apply.
Directly maps to values in the [global] section of the copyparty config.
See `${getExe cfg.package} --help` for more details.
'';
default = {
i = "127.0.0.1";
no-reload = true;
};
example = literalExpression ''
{
i = "0.0.0.0";
no-reload = true;
}
'';
};
accounts = mkOption {
type = types.attrsOf (types.submodule ({ ... }: {
options = {
passwordFile = mkOption {
type = types.str;
description = ''
Runtime file path to a file containing the user password.
Must be readable by the copyparty user.
'';
example = "/run/keys/copyparty/ed";
};
};
}));
description = ''
A set of copyparty accounts to create.
'';
default = { };
example = literalExpression ''
{
ed.passwordFile = "/run/keys/copyparty/ed";
};
'';
};
volumes = mkOption {
type = types.attrsOf (types.submodule ({ ... }: {
options = {
path = mkOption {
type = types.str;
description = ''
Path of a directory to share.
'';
};
access = mkOption {
type = types.attrs;
description = ''
Attribute list of permissions and the users to apply them to.
The key must be a string containing any combination of allowed permission:
"r" (read): list folder contents, download files
"w" (write): upload files; need "r" to see the uploads
"m" (move): move files and folders; need "w" at destination
"d" (delete): permanently delete files and folders
"g" (get): download files, but cannot see folder contents
"G" (upget): "get", but can see filekeys of their own uploads
For example: "rwmd"
The value must be one of:
an account name, defined in `accounts`
a list of account names
"*", which means "any account"
'';
example = literalExpression ''
{
# wG = write-upget = see your own uploads only
wG = "*";
# read-write-modify-delete for users "ed" and "k"
rwmd = ["ed" "k"];
};
'';
};
flags = mkOption {
type = types.attrs;
description = ''
Attribute list of volume flags to apply.
See `${getExe cfg.package} --help-flags` for more details.
'';
example = literalExpression ''
{
# "fk" enables filekeys (necessary for upget permission) (4 chars long)
fk = 4;
# scan for new files every 60sec
scan = 60;
# volflag "e2d" enables the uploads database
e2d = true;
# "d2t" disables multimedia parsers (in case the uploads are malicious)
d2t = true;
# skips hashing file contents if path matches *.iso
nohash = "\.iso$";
};
'';
default = { };
};
};
}));
description = "A set of copyparty volumes to create";
default = {
"/" = {
path = defaultShareDir;
access = { r = "*"; };
};
};
example = literalExpression ''
{
"/" = {
path = ${defaultShareDir};
access = {
# wG = write-upget = see your own uploads only
wG = "*";
# read-write-modify-delete for users "ed" and "k"
rwmd = ["ed" "k"];
};
};
};
'';
};
};
config = mkIf cfg.enable {
systemd.services.copyparty = {
description = "http file sharing hub";
wantedBy = [ "multi-user.target" ];
environment = {
PYTHONUNBUFFERED = "true";
XDG_CONFIG_HOME = "${home}/.config";
};
preStart = let
replaceSecretCommand = name: attrs:
"${getExe pkgs.replace-secret} '${
passwordPlaceholder name
}' '${attrs.passwordFile}' ${runtimeConfigPath}";
in ''
set -euo pipefail
install -m 600 ${configFile} ${runtimeConfigPath}
${concatStringsSep "\n"
(mapAttrsToList replaceSecretCommand cfg.accounts)}
'';
serviceConfig = {
Type = "simple";
ExecStart = "${getExe cfg.package} -c ${runtimeConfigPath}";
# Hardening options
User = "copyparty";
Group = "copyparty";
RuntimeDirectory = name;
RuntimeDirectoryMode = "0700";
StateDirectory = [ name "${name}/data" "${name}/.config" ];
StateDirectoryMode = "0700";
WorkingDirectory = home;
TemporaryFileSystem = "/:ro";
BindReadOnlyPaths = [
"/nix/store"
"-/etc/resolv.conf"
"-/etc/nsswitch.conf"
"-/etc/hosts"
"-/etc/localtime"
] ++ (mapAttrsToList (k: v: "-${v.passwordFile}") cfg.accounts);
BindPaths = [ home ] ++ (mapAttrsToList (k: v: v.path) cfg.volumes);
# Would re-mount paths ignored by temporary root
#ProtectSystem = "strict";
ProtectHome = true;
PrivateTmp = true;
PrivateDevices = true;
ProtectKernelTunables = true;
ProtectControlGroups = true;
RestrictSUIDSGID = true;
PrivateMounts = true;
ProtectKernelModules = true;
ProtectKernelLogs = true;
ProtectHostname = true;
ProtectClock = true;
ProtectProc = "invisible";
ProcSubset = "pid";
RestrictNamespaces = true;
RemoveIPC = true;
UMask = "0077";
LimitNOFILE = cfg.openFilesLimit;
NoNewPrivileges = true;
LockPersonality = true;
RestrictRealtime = true;
};
};
users.groups.copyparty = { };
users.users.copyparty = {
description = "Service user for copyparty";
group = "copyparty";
home = home;
isSystemUser = true;
};
};
}

View File

@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe> # Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty pkgname=copyparty
pkgver="1.6.7" pkgver="1.6.11"
pkgrel=1 pkgrel=1
pkgdesc="Portable file sharing hub" pkgdesc="Portable file sharing hub"
arch=("any") arch=("any")
@@ -26,12 +26,12 @@ source=("${url}/releases/download/v${pkgver}/${pkgname}-sfx.py"
"https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/LICENSE" "https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/LICENSE"
) )
backup=("etc/${pkgname}.d/init" ) backup=("etc/${pkgname}.d/init" )
sha256sums=("3fb40a631e9decf0073db06aab6fd8d743de91f4ddb82a65164d39d53e0b413f" sha256sums=("d096e33ab666ef45213899dd3a10735f62b5441339cb7374f93b232d0b6c8d34"
"b8565eba5e64dedba1cf6c7aac7e31c5a731ed7153d6810288a28f00a36c28b2" "b8565eba5e64dedba1cf6c7aac7e31c5a731ed7153d6810288a28f00a36c28b2"
"f65c207e0670f9d78ad2e399bda18d5502ff30d2ac79e0e7fc48e7fbdc39afdc" "f65c207e0670f9d78ad2e399bda18d5502ff30d2ac79e0e7fc48e7fbdc39afdc"
"c4f396b083c9ec02ad50b52412c84d2a82be7f079b2d016e1c9fad22d68285ff" "c4f396b083c9ec02ad50b52412c84d2a82be7f079b2d016e1c9fad22d68285ff"
"dba701de9fd584405917e923ea1e59dbb249b96ef23bad479cf4e42740b774c8" "dba701de9fd584405917e923ea1e59dbb249b96ef23bad479cf4e42740b774c8"
"23054bb206153a1ed34038accaf490b8068f9c856e423c2f2595b148b40c0a0c" "8e89d281483e22d11d111bed540652af35b66af6f14f49faae7b959f6cdc6475"
"cb2ce3d6277bf2f5a82ecf336cc44963bc6490bcf496ffbd75fc9e21abaa75f3" "cb2ce3d6277bf2f5a82ecf336cc44963bc6490bcf496ffbd75fc9e21abaa75f3"
) )

View File

@@ -0,0 +1,55 @@
{ lib, stdenv, makeWrapper, fetchurl, utillinux, python, jinja2, impacket, pyftpdlib, pyopenssl, pillow, pyvips, ffmpeg, mutagen,
# create thumbnails with Pillow; faster than FFmpeg / MediaProcessing
withThumbnails ? true,
# create thumbnails with PyVIPS; even faster, uses more memory
# -- can be combined with Pillow to support more filetypes
withFastThumbnails ? false,
# enable FFmpeg; thumbnails for most filetypes (also video and audio), extract audio metadata, transcode audio to opus
# -- possibly dangerous if you allow anonymous uploads, since FFmpeg has a huge attack surface
# -- can be combined with Thumbnails and/or FastThumbnails, since FFmpeg is slower than both
withMediaProcessing ? true,
# if MediaProcessing is not enabled, you probably want this instead (less accurate, but much safer and faster)
withBasicAudioMetadata ? false,
# enable FTPS support in the FTP server
withFTPS ? false,
# samba/cifs server; dangerous and buggy, enable if you really need it
withSMB ? false,
}:
let
pinData = lib.importJSON ./pin.json;
pyEnv = python.withPackages (ps:
with ps; [
jinja2
]
++ lib.optional withSMB impacket
++ lib.optional withFTPS pyopenssl
++ lib.optional withThumbnails pillow
++ lib.optional withFastThumbnails pyvips
++ lib.optional withMediaProcessing ffmpeg
++ lib.optional withBasicAudioMetadata mutagen
);
in stdenv.mkDerivation {
pname = "copyparty";
version = pinData.version;
src = fetchurl {
url = pinData.url;
hash = pinData.hash;
};
buildInputs = [ makeWrapper ];
dontUnpack = true;
dontBuild = true;
installPhase = ''
install -Dm755 $src $out/share/copyparty-sfx.py
makeWrapper ${pyEnv.interpreter} $out/bin/copyparty \
--set PATH '${lib.makeBinPath ([ utillinux ] ++ lib.optional withMediaProcessing ffmpeg)}:$PATH' \
--add-flags "$out/share/copyparty-sfx.py"
'';
}

View File

@@ -0,0 +1,5 @@
{
"url": "https://github.com/9001/copyparty/releases/download/v1.6.11/copyparty-sfx.py",
"version": "1.6.11",
"hash": "sha256-0JbjOrZm70UhOJndOhBzX2K1RBM5y3N0+TsjLQtsjTQ="
}

View File

@@ -0,0 +1,77 @@
#!/usr/bin/env python3
# Update the Nix package pin
#
# Usage: ./update.sh [PATH]
# When [PATH] is not set, it will fetch the latest release from the repo.
# With [PATH] set, it will hash the given file and generate the URL,
# based on the version contained within the file

import base64
import json
import hashlib
import sys
import re
from pathlib import Path

OUTPUT_FILE = Path("pin.json")
TARGET_ASSET = "copyparty-sfx.py"
HASH_TYPE = "sha256"
LATEST_RELEASE_URL = "https://api.github.com/repos/9001/copyparty/releases/latest"
DOWNLOAD_URL = lambda version: f"https://github.com/9001/copyparty/releases/download/v{version}/{TARGET_ASSET}"


def get_formatted_hash(binary):
    hasher = hashlib.new("sha256")
    hasher.update(binary)
    asset_hash = hasher.digest()
    encoded_hash = base64.b64encode(asset_hash).decode("ascii")
    return f"{HASH_TYPE}-{encoded_hash}"


def version_from_sfx(binary):
    result = re.search(b'^VER = "(.*)"$', binary, re.MULTILINE)
    if result:
        return result.groups(1)[0].decode("ascii")
    raise ValueError("version not found in provided file")


def remote_release_pin():
    import requests

    response = requests.get(LATEST_RELEASE_URL).json()
    version = response["tag_name"].lstrip("v")
    asset_info = [a for a in response["assets"] if a["name"] == TARGET_ASSET][0]
    download_url = asset_info["browser_download_url"]
    asset = requests.get(download_url)
    formatted_hash = get_formatted_hash(asset.content)
    result = {"url": download_url, "version": version, "hash": formatted_hash}
    return result


def local_release_pin(path):
    asset = path.read_bytes()
    version = version_from_sfx(asset)
    download_url = DOWNLOAD_URL(version)
    formatted_hash = get_formatted_hash(asset)
    result = {"url": download_url, "version": version, "hash": formatted_hash}
    return result


def main():
    if len(sys.argv) > 1:
        asset_path = Path(sys.argv[1])
        result = local_release_pin(asset_path)
    else:
        result = remote_release_pin()
    print(result)
    json_result = json.dumps(result, indent=4)
    OUTPUT_FILE.write_text(json_result)


if __name__ == "__main__":
    main()
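
The `hash` field written by this script follows Nix's SRI convention: the string `sha256-` followed by the standard-base64 sha256 digest, matching what `fetchurl` expects against `pin.json`. A minimal sketch of checking a locally downloaded `copyparty-sfx.py` against the pin (the local file paths are illustrative assumptions):

import base64
import hashlib
import json
from pathlib import Path

# hypothetical local paths; adjust to wherever the pin and asset actually live
pin = json.loads(Path("pin.json").read_text())
asset = Path("copyparty-sfx.py").read_bytes()  # e.g. downloaded from pin["url"]

# recompute the SRI-style hash the same way the updater does:
# sha256 digest, standard base64, prefixed with the hash type
digest = hashlib.sha256(asset).digest()
sri = "sha256-" + base64.b64encode(digest).decode("ascii")

if sri == pin["hash"]:
    print(f"ok: matches pinned v{pin['version']}")
else:
    print(f"mismatch: computed {sri}, pinned {pin['hash']}")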

contrib/plugins/rave.js (new file, 208 lines)
View File

@@ -0,0 +1,208 @@
/* untz untz untz untz */
(function () {
var can, ctx, W, H, fft, buf, bars, barw, pv,
hue = 0,
ibeat = 0,
beats = [9001],
beats_url = '',
uofs = 0,
ops = ebi('ops'),
raving = false,
recalc = 0,
cdown = 0,
FC = 0.9,
css = `<style>
#fft {
position: fixed;
top: 0;
left: 0;
z-index: -1;
}
body {
box-shadow: inset 0 0 0 white;
}
#ops>a,
#path>a {
display: inline-block;
}
/*
body.untz {
animation: untz-body 200ms ease-out;
}
@keyframes untz-body {
0% {inset 0 0 20em white}
100% {inset 0 0 0 white}
}
*/
:root, html.a, html.b, html.c, html.d, html.e {
--row-alt: rgba(48,52,78,0.2);
}
#files td {
background: none;
}
</style>`;
QS('body').appendChild(mknod('div', null, css));
function rave_load() {
console.log('rave_load');
can = mknod('canvas', 'fft');
QS('body').appendChild(can);
ctx = can.getContext('2d');
fft = new AnalyserNode(actx, {
"fftSize": 2048,
"maxDecibels": 0,
"smoothingTimeConstant": 0.7,
});
ibeat = 0;
beats = [9001];
buf = new Uint8Array(fft.frequencyBinCount);
bars = buf.length * FC;
afilt.filters.push(fft);
if (!raving) {
raving = true;
raver();
}
beats_url = mp.au.src.split('?')[0].replace(/(.*\/)(.*)/, '$1.beats/$2.txt');
console.log("reading beats from", beats_url);
var xhr = new XHR();
xhr.open('GET', beats_url, true);
xhr.onload = readbeats;
xhr.url = beats_url;
xhr.send();
}
function rave_unload() {
qsr('#fft');
can = null;
}
function readbeats() {
if (this.url != beats_url)
return console.log('old beats??', this.url, beats_url);
var sbeats = this.responseText.replace(/\r/g, '').split(/\n/g);
if (sbeats.length < 3)
return;
beats = [];
for (var a = 0; a < sbeats.length; a++)
beats.push(parseFloat(sbeats[a]));
var end = beats.slice(-2),
t = end[1],
d = t - end[0];
while (d > 0.1 && t < 1200)
beats.push(t += d);
}
function hrand() {
return Math.random() - 0.5;
}
function raver() {
if (!can) {
raving = false;
return;
}
requestAnimationFrame(raver);
if (!mp || !mp.au || mp.au.paused)
return;
if (--uofs >= 0) {
document.body.style.marginLeft = hrand() * uofs + 'px';
ebi('tree').style.marginLeft = hrand() * uofs + 'px';
for (var a of QSA('#ops>a, #path>a, #pctl>a'))
a.style.transform = 'translate(' + hrand() * uofs * 1 + 'px, ' + hrand() * uofs * 0.7 + 'px) rotate(' + Math.random() * uofs * 0.7 + 'deg)'
}
if (--recalc < 0) {
recalc = 60;
var tree = ebi('tree'),
x = tree.style.display == 'none' ? 0 : tree.offsetWidth;
//W = can.width = window.innerWidth - x;
//H = can.height = window.innerHeight;
//H = ebi('widget').offsetTop;
W = can.width = bars;
H = can.height = 512;
barw = 1; //parseInt(0.8 + W / bars);
can.style.left = x + 'px';
can.style.width = (window.innerWidth - x) + 'px';
can.style.height = ebi('widget').offsetTop + 'px';
}
//if (--cdown == 1)
// clmod(ops, 'untz');
fft.getByteFrequencyData(buf);
var imax = 0, vmax = 0;
for (var a = 10; a < 50; a++)
if (vmax < buf[a]) {
vmax = buf[a];
imax = a;
}
hue = hue * 0.93 + imax * 0.07;
ctx.fillStyle = 'rgba(0,0,0,0)';
ctx.fillRect(0, 0, W, H);
ctx.clearRect(0, 0, W, H);
ctx.fillStyle = 'hsla(' + (hue * 2.5) + ',100%,50%,0.7)';
var x = 0, mul = (H / 256) * 0.5;
for (var a = 0; a < buf.length * FC; a++) {
var v = buf[a] * mul * (1 + 0.69 * a / buf.length);
ctx.fillRect(x, H - v, barw, v);
x += barw;
}
var t = mp.au.currentTime + 0.05;
if (ibeat >= beats.length || beats[ibeat] > t)
return;
while (ibeat < beats.length && beats[ibeat++] < t)
continue;
return untz();
var cv = 0;
for (var a = 0; a < 128; a++)
cv += buf[a];
if (cv - pv > 1000) {
console.log(pv, cv, cv - pv);
if (cdown < 0) {
clmod(ops, 'untz', 1);
cdown = 20;
}
}
pv = cv;
}
function untz() {
console.log('untz');
uofs = 14;
document.body.animate([
{ boxShadow: 'inset 0 0 1em #f0c' },
{ boxShadow: 'inset 0 0 20em #f0c', offset: 0.2 },
{ boxShadow: 'inset 0 0 0 #f0c' },
], { duration: 200, iterations: 1 });
}
afilt.plugs.push({
"en": true,
"load": rave_load,
"unload": rave_unload
});
})();
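
The plugin's readbeats() above expects a sidecar at `<dir>/.beats/<filename>.txt` containing one beat timestamp (in seconds) per line, with at least three lines. As a rough illustration only — the format is inferred from readbeats(), and the tempo/offset values below are placeholders, not how copyparty's bpm detector actually produces these files — a sketch of writing such a sidecar from a known constant tempo:

import sys
from pathlib import Path

# placeholder values for illustration; a real generator would detect these per track
bpm = 128.0       # assumed constant tempo
offset = 0.37     # seconds until the first beat
duration = 240.0  # track length in seconds

song = Path(sys.argv[1])                       # e.g. music/untz.opus
out = song.parent / ".beats" / (song.name + ".txt")
out.parent.mkdir(exist_ok=True)

# one beat timestamp per line, as parsed by readbeats()
step = 60.0 / bpm
beats = []
t = offset
while t < duration:
    beats.append(f"{t:.3f}")
    t += step

out.write_text("\n".join(beats) + "\n")
print(f"wrote {len(beats)} beats to {out}")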

View File

@@ -2,12 +2,16 @@
# and share '/mnt' with anonymous read+write # and share '/mnt' with anonymous read+write
# #
# installation: # installation:
# cp -pv copyparty.service /etc/systemd/system # wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py -O /usr/local/bin/copyparty-sfx.py
# restorecon -vr /etc/systemd/system/copyparty.service # cp -pv copyparty.service /etc/systemd/system/
# restorecon -vr /etc/systemd/system/copyparty.service # on fedora/rhel
# firewall-cmd --permanent --add-port={80,443,3923}/tcp # --zone=libvirt # firewall-cmd --permanent --add-port={80,443,3923}/tcp # --zone=libvirt
# firewall-cmd --reload # firewall-cmd --reload
# systemctl daemon-reload && systemctl enable --now copyparty # systemctl daemon-reload && systemctl enable --now copyparty
# #
# if it fails to start, first check this: systemctl status copyparty
# then try starting it while viewing logs: journalctl -fan 100
#
# you may want to: # you may want to:
# change "User=cpp" and "/home/cpp/" to another user # change "User=cpp" and "/home/cpp/" to another user
# remove the nft lines to only listen on port 3923 # remove the nft lines to only listen on port 3923
@@ -44,7 +48,7 @@ ExecReload=/bin/kill -s USR1 $MAINPID
User=cpp User=cpp
Environment=XDG_CONFIG_HOME=/home/cpp/.config Environment=XDG_CONFIG_HOME=/home/cpp/.config
# setup forwarding from ports 80 and 443 to port 3923 # OPTIONAL: setup forwarding from ports 80 and 443 to port 3923
ExecStartPre=+/bin/bash -c 'nft -n -a list table nat | awk "/ to :3923 /{print\$NF}" | xargs -rL1 nft delete rule nat prerouting handle; true' ExecStartPre=+/bin/bash -c 'nft -n -a list table nat | awk "/ to :3923 /{print\$NF}" | xargs -rL1 nft delete rule nat prerouting handle; true'
ExecStartPre=+nft add table ip nat ExecStartPre=+nft add table ip nat
ExecStartPre=+nft -- add chain ip nat prerouting { type nat hook prerouting priority -100 \; } ExecStartPre=+nft -- add chain ip nat prerouting { type nat hook prerouting priority -100 \; }

View File

@@ -758,7 +758,7 @@ def add_ftp(ap):
def add_webdav(ap): def add_webdav(ap):
ap2 = ap.add_argument_group('WebDAV options') ap2 = ap.add_argument_group('WebDAV options')
ap2.add_argument("--daw", action="store_true", help="enable full write support. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)") ap2.add_argument("--daw", action="store_true", help="enable full write support, even if client may not be webdav. \033[1;31mWARNING:\033[0m This has side-effects -- PUT-operations will now \033[1;31mOVERWRITE\033[0m existing files, rather than inventing new filenames to avoid loss of data. You might want to instead set this as a volflag where needed. By not setting this flag, uploaded files can get written to a filename which the client does not expect (which might be okay, depending on client)")
ap2.add_argument("--dav-inf", action="store_true", help="allow depth:infinite requests (recursive file listing); extremely server-heavy but required for spec compliance -- luckily few clients rely on this") ap2.add_argument("--dav-inf", action="store_true", help="allow depth:infinite requests (recursive file listing); extremely server-heavy but required for spec compliance -- luckily few clients rely on this")
ap2.add_argument("--dav-mac", action="store_true", help="disable apple-garbage filter -- allow macos to create junk files (._* and .DS_Store, .Spotlight-*, .fseventsd, .Trashes, .AppleDouble, __MACOS)") ap2.add_argument("--dav-mac", action="store_true", help="disable apple-garbage filter -- allow macos to create junk files (._* and .DS_Store, .Spotlight-*, .fseventsd, .Trashes, .AppleDouble, __MACOS)")
@@ -900,8 +900,8 @@ def add_db_general(ap, hcores):
ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash") ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash")
ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty") ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume data (db, thumbs) (volflag=hist)") ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume data (db, thumbs) (volflag=hist)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching paths during e2ds folder scans (volflag=nohash)") ap2.add_argument("--no-hash", metavar="PTN", type=u, help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching paths during e2ds folder scans (volflag=noidx)") ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower") ap2.add_argument("--no-dhash", action="store_true", help="disable rescan acceleration; do full database integrity check -- makes the db ~5%% smaller and bootup/rescans 3~10x slower")
ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)") ap2.add_argument("--re-dhash", action="store_true", help="rebuild the cache if it gets out of sync (for example crash on startup during metadata scanning)")
ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice (volflag=noforget)") ap2.add_argument("--no-forget", action="store_true", help="never forget indexed files, even when deleted from disk -- makes it impossible to ever upload the same file twice (volflag=noforget)")
@@ -946,6 +946,7 @@ def add_ui(ap, retry):
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include") ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include") ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages") ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI)")
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext") ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)") ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents") ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")

View File

@@ -1,8 +1,8 @@
# coding: utf-8 # coding: utf-8
VERSION = (1, 6, 8) VERSION = (1, 6, 12)
CODENAME = "cors k" CODENAME = "cors k"
BUILD_DT = (2023, 3, 12) BUILD_DT = (2023, 4, 20)
S_VERSION = ".".join(map(str, VERSION)) S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT) S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -356,7 +356,8 @@ class VFS(object):
flags = {k: v for k, v in self.flags.items()} flags = {k: v for k, v in self.flags.items()}
hist = flags.get("hist") hist = flags.get("hist")
if hist and hist != "-": if hist and hist != "-":
flags["hist"] = "{}/{}".format(hist.rstrip("/"), name) zs = "{}/{}".format(hist.rstrip("/"), name)
flags["hist"] = os.path.expanduser(zs) if zs.startswith("~") else zs
return flags return flags
@@ -1101,6 +1102,9 @@ class AuthSrv(object):
if vflag == "-": if vflag == "-":
pass pass
elif vflag: elif vflag:
if vflag.startswith("~"):
vflag = os.path.expanduser(vflag)
vol.histpath = uncyg(vflag) if WINDOWS else vflag vol.histpath = uncyg(vflag) if WINDOWS else vflag
elif self.args.hist: elif self.args.hist:
for nch in range(len(hid)): for nch in range(len(hid)):

View File

@@ -14,8 +14,8 @@ from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer from pyftpdlib.servers import FTPServer
from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E from .__init__ import ANYWIN, PY2, TYPE_CHECKING, E
from .bos import bos
from .authsrv import VFS from .authsrv import VFS
from .bos import bos
from .util import ( from .util import (
Daemon, Daemon,
Pebkac, Pebkac,
@@ -129,6 +129,7 @@ class FtpFs(AbstractedFS):
def die(self, msg): def die(self, msg):
self.h.die(msg) self.h.die(msg)
raise Exception()
def v2a( def v2a(
self, self,

View File

@@ -264,9 +264,10 @@ class HttpCli(object):
self.is_https = ( self.is_https = (
self.headers.get("x-forwarded-proto", "").lower() == "https" or self.tls self.headers.get("x-forwarded-proto", "").lower() == "https" or self.tls
) )
self.host = self.headers.get("host") or "{}:{}".format( self.host = self.headers.get("host") or ""
*list(self.s.getsockname()[:2]) if not self.host:
) zs = "{}:{}".format(*list(self.s.getsockname()[:2]))
self.host = zs[7:] if zs.startswith("::ffff:") else zs
n = self.args.rproxy n = self.args.rproxy
if n: if n:
@@ -778,8 +779,8 @@ class HttpCli(object):
if "k304" in self.uparam: if "k304" in self.uparam:
return self.set_k304() return self.set_k304()
if "am_js" in self.uparam: if "setck" in self.uparam:
return self.set_am_js() return self.setck()
if "reset" in self.uparam: if "reset" in self.uparam:
return self.set_cfg_reset() return self.set_cfg_reset()
@@ -865,7 +866,17 @@ class HttpCli(object):
vn, rem = self.asrv.vfs.get(self.vpath, self.uname, True, False, err=401) vn, rem = self.asrv.vfs.get(self.vpath, self.uname, True, False, err=401)
depth = self.headers.get("depth", "infinity").lower() depth = self.headers.get("depth", "infinity").lower()
if depth == "infinity": try:
topdir = {"vp": "", "st": bos.stat(vn.canonical(rem))}
except OSError as ex:
if ex.errno != errno.ENOENT:
raise
raise Pebkac(404)
if not stat.S_ISDIR(topdir["st"].st_mode):
fgen = []
elif depth == "infinity":
if not self.args.dav_inf: if not self.args.dav_inf:
self.log("client wants --dav-inf", 3) self.log("client wants --dav-inf", 3)
zb = b'<?xml version="1.0" encoding="utf-8"?>\n<D:error xmlns:D="DAV:"><D:propfind-finite-depth/></D:error>' zb = b'<?xml version="1.0" encoding="utf-8"?>\n<D:error xmlns:D="DAV:"><D:propfind-finite-depth/></D:error>'
@@ -904,13 +915,6 @@ class HttpCli(object):
t2 = " or 'infinity'" if self.args.dav_inf else "" t2 = " or 'infinity'" if self.args.dav_inf else ""
raise Pebkac(412, t.format(depth, t2)) raise Pebkac(412, t.format(depth, t2))
try:
topdir = {"vp": "", "st": os.stat(vn.canonical(rem))}
except OSError as ex:
if ex.errno != errno.ENOENT:
raise
raise Pebkac(404)
fgen = itertools.chain([topdir], fgen) # type: ignore fgen = itertools.chain([topdir], fgen) # type: ignore
vtop = vjoin(self.args.R, vjoin(vn.vpath, rem)) vtop = vjoin(self.args.R, vjoin(vn.vpath, rem))
@@ -1112,7 +1116,14 @@ class HttpCli(object):
if self.do_log: if self.do_log:
self.log("MKCOL " + self.req) self.log("MKCOL " + self.req)
return self._mkdir(self.vpath) try:
return self._mkdir(self.vpath, True)
except Pebkac as ex:
if ex.code >= 500:
raise
self.reply(b"", ex.code)
return True
def handle_move(self) -> bool: def handle_move(self) -> bool:
dst = self.headers["destination"] dst = self.headers["destination"]
@@ -1426,9 +1437,9 @@ class HttpCli(object):
self.log(t, 1) self.log(t, 1)
raise Pebkac(403, t) raise Pebkac(403, t)
if is_put and not (self.args.no_dav or self.args.nw): if is_put and not (self.args.no_dav or self.args.nw) and bos.path.exists(path):
# allow overwrite if... # allow overwrite if...
# * volflag 'daw' is set # * volflag 'daw' is set, or client is definitely webdav
# * and account has delete-access # * and account has delete-access
# or... # or...
# * file exists, is empty, sufficiently new # * file exists, is empty, sufficiently new
@@ -1438,9 +1449,11 @@ class HttpCli(object):
if self.args.dotpart: if self.args.dotpart:
tnam = "." + tnam tnam = "." + tnam
if (vfs.flags.get("daw") and self.can_delete) or ( if (
self.can_delete
and (vfs.flags.get("daw") or "x-oc-mtime" in self.headers)
) or (
not bos.path.exists(os.path.join(fdir, tnam)) not bos.path.exists(os.path.join(fdir, tnam))
and bos.path.exists(path)
and not bos.path.getsize(path) and not bos.path.getsize(path)
and bos.path.getmtime(path) >= time.time() - self.args.blank_wt and bos.path.getmtime(path) >= time.time() - self.args.blank_wt
): ):
@@ -1464,6 +1477,16 @@ class HttpCli(object):
if self.args.nw: if self.args.nw:
return post_sz, sha_hex, sha_b64, remains, path, "" return post_sz, sha_hex, sha_b64, remains, path, ""
at = mt = time.time() - lifetime
cli_mt = self.headers.get("x-oc-mtime")
if cli_mt:
try:
mt = int(cli_mt)
times = (int(time.time()), mt)
bos.utime(path, times, False)
except:
pass
if nameless and "magic" in vfs.flags: if nameless and "magic" in vfs.flags:
try: try:
ext = self.conn.hsrv.magician.ext(path) ext = self.conn.hsrv.magician.ext(path)
@@ -1486,7 +1509,6 @@ class HttpCli(object):
fn = fn2 fn = fn2
path = path2 path = path2
at = time.time() - lifetime
if xau and not runhook( if xau and not runhook(
self.log, self.log,
xau, xau,
@@ -1494,7 +1516,7 @@ class HttpCli(object):
self.vpath, self.vpath,
self.host, self.host,
self.uname, self.uname,
at, mt,
post_sz, post_sz,
self.ip, self.ip,
at, at,
@@ -1554,6 +1576,11 @@ class HttpCli(object):
t = "{}\n{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56], url) t = "{}\n{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56], url)
h = {"Location": url} if is_put and url else {} h = {"Location": url} if is_put and url else {}
if "x-oc-mtime" in self.headers:
h["X-OC-MTime"] = "accepted"
t = "" # some webdav clients expect/prefer this
self.reply(t.encode("utf-8"), 201, headers=h) self.reply(t.encode("utf-8"), 201, headers=h)
return True return True
@@ -1727,7 +1754,7 @@ class HttpCli(object):
def handle_search(self, body: dict[str, Any]) -> bool: def handle_search(self, body: dict[str, Any]) -> bool:
idx = self.conn.get_u2idx() idx = self.conn.get_u2idx()
if not hasattr(idx, "p_end"): if not idx or not hasattr(idx, "p_end"):
raise Pebkac(500, "sqlite3 is not available on the server; cannot search") raise Pebkac(500, "sqlite3 is not available on the server; cannot search")
vols = [] vols = []
@@ -1757,12 +1784,13 @@ class HttpCli(object):
hits = idx.fsearch(vols, body) hits = idx.fsearch(vols, body)
msg: Any = repr(hits) msg: Any = repr(hits)
taglist: list[str] = [] taglist: list[str] = []
trunc = False
else: else:
# search by query params # search by query params
q = body["q"] q = body["q"]
n = body.get("n", self.args.srch_hits) n = body.get("n", self.args.srch_hits)
self.log("qj: {} |{}|".format(q, n)) self.log("qj: {} |{}|".format(q, n))
hits, taglist = idx.search(vols, q, n) hits, taglist, trunc = idx.search(vols, q, n)
msg = len(hits) msg = len(hits)
idx.p_end = time.time() idx.p_end = time.time()
@@ -1782,7 +1810,8 @@ class HttpCli(object):
for hit in hits: for hit in hits:
hit["rp"] = self.args.RS + hit["rp"] hit["rp"] = self.args.RS + hit["rp"]
r = json.dumps({"hits": hits, "tag_order": order}).encode("utf-8") rj = {"hits": hits, "tag_order": order, "trunc": trunc}
r = json.dumps(rj).encode("utf-8")
self.reply(r, mime="application/json") self.reply(r, mime="application/json")
return True return True
@@ -1952,7 +1981,7 @@ class HttpCli(object):
sanitized = sanitize_fn(new_dir, "", []) sanitized = sanitize_fn(new_dir, "", [])
return self._mkdir(vjoin(self.vpath, sanitized)) return self._mkdir(vjoin(self.vpath, sanitized))
def _mkdir(self, vpath: str) -> bool: def _mkdir(self, vpath: str, dav: bool = False) -> bool:
nullwrite = self.args.nw nullwrite = self.args.nw
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True) vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
self._assert_safe_rem(rem) self._assert_safe_rem(rem)
@@ -1978,7 +2007,12 @@ class HttpCli(object):
raise Pebkac(500, min_ex()) raise Pebkac(500, min_ex())
self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1]) self.out_headers["X-New-Dir"] = quotep(vpath.split("/")[-1])
self.redirect(vpath, status=201)
if dav:
self.reply(b"", 201)
else:
self.redirect(vpath, status=201)
return True return True
def handle_new_md(self) -> bool: def handle_new_md(self) -> bool:
@@ -2522,8 +2556,12 @@ class HttpCli(object):
hrange = self.headers.get("range") hrange = self.headers.get("range")
# let's not support 206 with compression # let's not support 206 with compression
if do_send and not is_compressed and hrange and file_sz: # and multirange / multipart is also not-impl (mostly because calculating contentlength is a pain)
if do_send and not is_compressed and hrange and file_sz and "," not in hrange:
try: try:
if not hrange.lower().startswith("bytes"):
raise Exception()
a, b = hrange.split("=", 1)[1].split("-") a, b = hrange.split("=", 1)[1].split("-")
if a.strip(): if a.strip():
@@ -2896,15 +2934,16 @@ class HttpCli(object):
self.redirect("", "?h#cc") self.redirect("", "?h#cc")
return True return True
def set_am_js(self) -> bool: def setck(self) -> bool:
v = "n" if self.uparam["am_js"] == "n" else "y" k, v = self.uparam["setck"].split("=", 1)
ck = gencookie("js", v, self.args.R, False, 86400 * 299) t = None if v == "" else 86400 * 299
ck = gencookie(k, v, self.args.R, False, t)
self.out_headerlist.append(("Set-Cookie", ck)) self.out_headerlist.append(("Set-Cookie", ck))
self.reply(b"promoted\n") self.reply(b"o7\n")
return True return True
def set_cfg_reset(self) -> bool: def set_cfg_reset(self) -> bool:
for k in ("k304", "js", "cppwd", "cppws"): for k in ("k304", "js", "idxh", "cppwd", "cppws"):
cookie = gencookie(k, "x", self.args.R, False, None) cookie = gencookie(k, "x", self.args.R, False, None)
self.out_headerlist.append(("Set-Cookie", cookie)) self.out_headerlist.append(("Set-Cookie", cookie))
@@ -3040,7 +3079,7 @@ class HttpCli(object):
raise Pebkac(403, "the unpost feature is disabled in server config") raise Pebkac(403, "the unpost feature is disabled in server config")
idx = self.conn.get_u2idx() idx = self.conn.get_u2idx()
if not hasattr(idx, "p_end"): if not idx or not hasattr(idx, "p_end"):
raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost") raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
filt = self.uparam.get("filter") filt = self.uparam.get("filter")
@@ -3259,9 +3298,10 @@ class HttpCli(object):
is_dir = stat.S_ISDIR(st.st_mode) is_dir = stat.S_ISDIR(st.st_mode)
icur = None icur = None
if e2t or (e2d and is_dir): if is_dir and (e2t or e2d):
idx = self.conn.get_u2idx() idx = self.conn.get_u2idx()
icur = idx.get_cur(dbv.realpath) if idx and hasattr(idx, "p_end"):
icur = idx.get_cur(dbv.realpath)
if self.can_read: if self.can_read:
th_fmt = self.uparam.get("th") th_fmt = self.uparam.get("th")
@@ -3433,6 +3473,7 @@ class HttpCli(object):
"dtheme": self.args.theme, "dtheme": self.args.theme,
"themes": self.args.themes, "themes": self.args.themes,
"turbolvl": self.args.turbo, "turbolvl": self.args.turbo,
"idxh": int(self.args.ih),
"u2sort": self.args.u2sort, "u2sort": self.args.u2sort,
} }
@@ -3566,6 +3607,20 @@ class HttpCli(object):
files.append(item) files.append(item)
item["rd"] = rem item["rd"] = rem
if (
self.cookies.get("idxh") == "y"
and "ls" not in self.uparam
and "v" not in self.uparam
):
idx_html = set(["index.htm", "index.html"])
for item in files:
if item["name"] in idx_html:
# do full resolve in case of shadowed file
vp = vjoin(self.vpath.split("?")[0], item["name"])
vn, rem = self.asrv.vfs.get(vp, self.uname, True, False)
ap = vn.canonical(rem)
return self.tx_file(ap) # is no-cache
tagset: set[str] = set() tagset: set[str] = set()
for fe in files: for fe in files:
fn = fe["name"] fn = fe["name"]
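
As seen in this hunk, the old `?am_js` parameter is replaced by a generic `?setck=key=value` cookie setter, and the new `idxh` cookie (set to "y") makes folder listings return index.html when present. A minimal sketch of flipping that preference from a script, assuming a default copyparty instance listening on port 3923 and an anonymously readable volume:

import requests

base = "http://127.0.0.1:3923"
s = requests.Session()

# enable "show index.html instead of folder listing" for this session
r = s.get(base + "/?setck=idxh=y")
print(r.status_code, r.text.strip())  # the handler replies "o7"

# an empty value should clear the cookie again (same mechanism as ?reset)
s.get(base + "/?setck=idxh=")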

View File

@@ -103,11 +103,12 @@ class HttpConn(object):
def log(self, msg: str, c: Union[int, str] = 0) -> None: def log(self, msg: str, c: Union[int, str] = 0) -> None:
self.log_func(self.log_src, msg, c) self.log_func(self.log_src, msg, c)
def get_u2idx(self) -> U2idx: def get_u2idx(self) -> Optional[U2idx]:
# one u2idx per tcp connection; # grab from a pool of u2idx instances;
# sqlite3 fully parallelizes under python threads # sqlite3 fully parallelizes under python threads
# but avoid running out of FDs by creating too many
if not self.u2idx: if not self.u2idx:
self.u2idx = U2idx(self) self.u2idx = self.hsrv.get_u2idx(str(self.addr))
return self.u2idx return self.u2idx
@@ -215,3 +216,7 @@ class HttpConn(object):
self.cli = HttpCli(self) self.cli = HttpCli(self)
if not self.cli.run(): if not self.cli.run():
return return
if self.u2idx:
self.hsrv.put_u2idx(str(self.addr), self.u2idx)
self.u2idx = None

View File

@@ -11,7 +11,7 @@ import time
import queue import queue
from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, EnvParams from .__init__ import ANYWIN, CORES, EXE, MACOS, TYPE_CHECKING, EnvParams
try: try:
MNFE = ModuleNotFoundError MNFE = ModuleNotFoundError
@@ -40,6 +40,7 @@ except MNFE:
from .bos import bos from .bos import bos
from .httpconn import HttpConn from .httpconn import HttpConn
from .u2idx import U2idx
from .util import ( from .util import (
E_SCK, E_SCK,
FHC, FHC,
@@ -111,6 +112,9 @@ class HttpSrv(object):
self.cb_ts = 0.0 self.cb_ts = 0.0
self.cb_v = "" self.cb_v = ""
self.u2idx_free: dict[str, U2idx] = {}
self.u2idx_n = 0
env = jinja2.Environment() env = jinja2.Environment()
env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web")) env.loader = jinja2.FileSystemLoader(os.path.join(self.E.mod, "web"))
jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"] jn = ["splash", "svcs", "browser", "browser2", "msg", "md", "mde", "cf"]
@@ -445,6 +449,9 @@ class HttpSrv(object):
self.clients.remove(cli) self.clients.remove(cli)
self.ncli -= 1 self.ncli -= 1
if cli.u2idx:
self.put_u2idx(str(addr), cli.u2idx)
def cachebuster(self) -> str: def cachebuster(self) -> str:
if time.time() - self.cb_ts < 1: if time.time() - self.cb_ts < 1:
return self.cb_v return self.cb_v
@@ -466,3 +473,31 @@ class HttpSrv(object):
self.cb_v = v.decode("ascii")[-4:] self.cb_v = v.decode("ascii")[-4:]
self.cb_ts = time.time() self.cb_ts = time.time()
return self.cb_v return self.cb_v
def get_u2idx(self, ident: str) -> Optional[U2idx]:
utab = self.u2idx_free
for _ in range(100): # 5/0.05 = 5sec
with self.mutex:
if utab:
if ident in utab:
return utab.pop(ident)
return utab.pop(list(utab.keys())[0])
if self.u2idx_n < CORES:
self.u2idx_n += 1
return U2idx(self)
time.sleep(0.05)
# not using conditional waits, on a hunch that
# average performance will be faster like this
# since most servers won't be fully saturated
return None
def put_u2idx(self, ident: str, u2idx: U2idx) -> None:
with self.mutex:
while ident in self.u2idx_free:
ident += "a"
self.u2idx_free[ident] = u2idx

View File

@@ -128,6 +128,9 @@ class SvcHub(object):
args.no_robots = True args.no_robots = True
args.force_js = True args.force_js = True
if not self._process_config():
raise Exception("bad config")
self.log = self._log_disabled if args.q else self._log_enabled self.log = self._log_disabled if args.q else self._log_enabled
if args.lo: if args.lo:
self._setup_logfile(printed) self._setup_logfile(printed)
@@ -177,9 +180,6 @@ class SvcHub(object):
self.log("root", "max clients: {}".format(self.args.nc)) self.log("root", "max clients: {}".format(self.args.nc))
if not self._process_config():
raise Exception("bad config")
self.tcpsrv = TcpSrv(self) self.tcpsrv = TcpSrv(self)
self.up2k = Up2k(self) self.up2k = Up2k(self)
@@ -350,6 +350,19 @@ class SvcHub(object):
al.th_covers = set(al.th_covers.split(",")) al.th_covers = set(al.th_covers.split(","))
for k in "c".split(" "):
vl = getattr(al, k)
if not vl:
continue
vl = [os.path.expanduser(x) if x.startswith("~") else x for x in vl]
setattr(al, k, vl)
for k in "lo hist ssl_log".split(" "):
vs = getattr(al, k)
if vs and vs.startswith("~"):
setattr(al, k, os.path.expanduser(vs))
return True return True
def _setlimits(self) -> None: def _setlimits(self) -> None:

View File

@@ -322,7 +322,7 @@ class TcpSrv(object):
if k not in netdevs: if k not in netdevs:
removed = "{} = {}".format(k, v) removed = "{} = {}".format(k, v)
t = "network change detected:\n added {}\nremoved {}" t = "network change detected:\n added {}\033[0;33m\nremoved {}"
self.log("tcpsrv", t.format(added, removed), 3) self.log("tcpsrv", t.format(added, removed), 3)
self.netdevs = netdevs self.netdevs = netdevs
self._distribute_netdevs() self._distribute_netdevs()

View File

@@ -16,10 +16,10 @@ from .__init__ import ANYWIN, TYPE_CHECKING
from .bos import bos from .bos import bos
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
from .util import ( from .util import (
FFMPEG_URL,
BytesIO, BytesIO,
Cooldown, Cooldown,
Daemon, Daemon,
FFMPEG_URL,
Pebkac, Pebkac,
afsenc, afsenc,
fsenc, fsenc,
@@ -561,12 +561,19 @@ class ThumbSrv(object):
if "ac" not in ret: if "ac" not in ret:
raise Exception("not audio") raise Exception("not audio")
try:
dur = ret[".dur"][1]
except:
dur = 0
src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus" src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus"
want_caf = tpath.endswith(".caf") want_caf = tpath.endswith(".caf")
tmp_opus = tpath tmp_opus = tpath
if want_caf: if want_caf:
tmp_opus = tpath.rsplit(".", 1)[0] + ".opus" tmp_opus = tpath.rsplit(".", 1)[0] + ".opus"
caf_src = abspath if src_opus else tmp_opus
if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)): if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)):
# fmt: off # fmt: off
cmd = [ cmd = [
@@ -584,7 +591,32 @@ class ThumbSrv(object):
# fmt: on # fmt: on
self._run_ff(cmd) self._run_ff(cmd)
if want_caf: # iOS fails to play some "insufficiently complex" files
# (average file shorter than 8 seconds), so of course we
# fix that by mixing in some inaudible pink noise :^)
# 6.3 sec seems like the cutoff so lets do 7, and
# 7 sec of psyqui-musou.opus @ 3:50 is 174 KiB
if want_caf and (dur < 20 or bos.path.getsize(caf_src) < 256 * 1024):
# fmt: off
cmd = [
b"ffmpeg",
b"-nostdin",
b"-v", b"error",
b"-hide_banner",
b"-i", fsenc(abspath),
b"-filter_complex", b"anoisesrc=a=0.001:d=7:c=pink,asplit[l][r]; [l][r]amerge[s]; [0:a:0][s]amix",
b"-map_metadata", b"-1",
b"-ac", b"2",
b"-c:a", b"libopus",
b"-b:a", b"128k",
b"-f", b"caf",
fsenc(tpath)
]
# fmt: on
self._run_ff(cmd)
elif want_caf:
# simple remux should be safe
# fmt: off # fmt: off
cmd = [ cmd = [
b"ffmpeg", b"ffmpeg",

View File

@@ -34,14 +34,14 @@ if True: # pylint: disable=using-constant-test
from typing import Any, Optional, Union from typing import Any, Optional, Union
if TYPE_CHECKING: if TYPE_CHECKING:
from .httpconn import HttpConn from .httpsrv import HttpSrv
class U2idx(object): class U2idx(object):
def __init__(self, conn: "HttpConn") -> None: def __init__(self, hsrv: "HttpSrv") -> None:
self.log_func = conn.log_func self.log_func = hsrv.log
self.asrv = conn.asrv self.asrv = hsrv.asrv
self.args = conn.args self.args = hsrv.args
self.timeout = self.args.srch_time self.timeout = self.args.srch_time
if not HAVE_SQLITE3: if not HAVE_SQLITE3:
@@ -51,7 +51,7 @@ class U2idx(object):
self.active_id = "" self.active_id = ""
self.active_cur: Optional["sqlite3.Cursor"] = None self.active_cur: Optional["sqlite3.Cursor"] = None
self.cur: dict[str, "sqlite3.Cursor"] = {} self.cur: dict[str, "sqlite3.Cursor"] = {}
self.mem_cur = sqlite3.connect(":memory:").cursor() self.mem_cur = sqlite3.connect(":memory:", check_same_thread=False).cursor()
self.mem_cur.execute(r"create table a (b text)") self.mem_cur.execute(r"create table a (b text)")
self.p_end = 0.0 self.p_end = 0.0
@@ -101,7 +101,8 @@ class U2idx(object):
uri = "" uri = ""
try: try:
uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri()) uri = "{}?mode=ro&nolock=1".format(Path(db_path).as_uri())
cur = sqlite3.connect(uri, 2, uri=True).cursor() db = sqlite3.connect(uri, 2, uri=True, check_same_thread=False)
cur = db.cursor()
cur.execute('pragma table_info("up")').fetchone() cur.execute('pragma table_info("up")').fetchone()
self.log("ro: {}".format(db_path)) self.log("ro: {}".format(db_path))
except: except:
@@ -112,7 +113,7 @@ class U2idx(object):
if not cur: if not cur:
# on windows, this steals the write-lock from up2k.deferred_init -- # on windows, this steals the write-lock from up2k.deferred_init --
# seen on win 10.0.17763.2686, py 3.10.4, sqlite 3.37.2 # seen on win 10.0.17763.2686, py 3.10.4, sqlite 3.37.2
cur = sqlite3.connect(db_path, 2).cursor() cur = sqlite3.connect(db_path, 2, check_same_thread=False).cursor()
self.log("opened {}".format(db_path)) self.log("opened {}".format(db_path))
self.cur[ptop] = cur self.cur[ptop] = cur
@@ -120,10 +121,10 @@ class U2idx(object):
def search( def search(
self, vols: list[tuple[str, str, dict[str, Any]]], uq: str, lim: int self, vols: list[tuple[str, str, dict[str, Any]]], uq: str, lim: int
) -> tuple[list[dict[str, Any]], list[str]]: ) -> tuple[list[dict[str, Any]], list[str], bool]:
"""search by query params""" """search by query params"""
if not HAVE_SQLITE3: if not HAVE_SQLITE3:
return [], [] return [], [], False
q = "" q = ""
v: Union[str, int] = "" v: Union[str, int] = ""
@@ -275,7 +276,7 @@ class U2idx(object):
have_up: bool, have_up: bool,
have_mt: bool, have_mt: bool,
lim: int, lim: int,
) -> tuple[list[dict[str, Any]], list[str]]: ) -> tuple[list[dict[str, Any]], list[str], bool]:
done_flag: list[bool] = [] done_flag: list[bool] = []
self.active_id = "{:.6f}_{}".format( self.active_id = "{:.6f}_{}".format(
time.time(), threading.current_thread().ident time.time(), threading.current_thread().ident
@@ -316,9 +317,6 @@ class U2idx(object):
c = cur.execute(uq, tuple(vuv)) c = cur.execute(uq, tuple(vuv))
for hit in c: for hit in c:
w, ts, sz, rd, fn, ip, at = hit[:7] w, ts, sz, rd, fn, ip, at = hit[:7]
lim -= 1
if lim < 0:
break
if rd.startswith("//") or fn.startswith("//"): if rd.startswith("//") or fn.startswith("//"):
rd, fn = s3dec(rd, fn) rd, fn = s3dec(rd, fn)
@@ -346,6 +344,10 @@ class U2idx(object):
)[:fk] )[:fk]
) )
lim -= 1
if lim < 0:
break
seen_rps.add(rp) seen_rps.add(rp)
sret.append({"ts": int(ts), "sz": sz, "rp": rp + suf, "w": w[:16]}) sret.append({"ts": int(ts), "sz": sz, "rp": rp + suf, "w": w[:16]})
@@ -368,7 +370,7 @@ class U2idx(object):
ret.sort(key=itemgetter("rp")) ret.sort(key=itemgetter("rp"))
return ret, list(taglist.keys()) return ret, list(taglist.keys()), lim < 0
def terminator(self, identifier: str, done_flag: list[bool]) -> None: def terminator(self, identifier: str, done_flag: list[bool]) -> None:
for _ in range(self.timeout): for _ in range(self.timeout):

View File

@@ -1032,7 +1032,7 @@ class Up2k(object):
zh.update(cv.encode("utf-8", "replace")) zh.update(cv.encode("utf-8", "replace"))
zh.update(spack(b"<d", cst.st_mtime)) zh.update(spack(b"<d", cst.st_mtime))
dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii") dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
sql = "select d from dh where d = ? and h = ?" sql = "select d from dh where d=? and +h=?"
try: try:
c = db.c.execute(sql, (rd, dhash)) c = db.c.execute(sql, (rd, dhash))
drd = rd drd = rd
@@ -1251,12 +1251,15 @@ class Up2k(object):
if n_rm2: if n_rm2:
self.log("forgetting {} shadowed deleted files".format(n_rm2)) self.log("forgetting {} shadowed deleted files".format(n_rm2))
c2.connection.commit()
# then covers # then covers
n_rm3 = 0 n_rm3 = 0
qu = "select 1 from up where rd=? and +fn=? limit 1"
q = "delete from cv where rd=? and dn=? and +fn=?" q = "delete from cv where rd=? and dn=? and +fn=?"
for crd, cdn, fn in cur.execute("select * from cv"): for crd, cdn, fn in cur.execute("select * from cv"):
ap = os.path.join(top, crd, cdn, fn) urd = vjoin(crd, cdn)
if not bos.path.exists(ap): if not c2.execute(qu, (urd, fn)).fetchone():
c2.execute(q, (crd, cdn, fn)) c2.execute(q, (crd, cdn, fn))
n_rm3 += 1 n_rm3 += 1
@@ -1313,9 +1316,9 @@ class Up2k(object):
w, drd, dfn = zb[:-1].decode("utf-8").split("\x00") w, drd, dfn = zb[:-1].decode("utf-8").split("\x00")
with self.mutex: with self.mutex:
q = "select mt, sz from up where w = ? and rd = ? and fn = ?" q = "select mt, sz from up where rd=? and fn=? and +w=?"
try: try:
mt, sz = cur.execute(q, (w, drd, dfn)).fetchone() mt, sz = cur.execute(q, (drd, dfn, w)).fetchone()
except: except:
# file moved/deleted since spooling # file moved/deleted since spooling
continue continue
@@ -2220,7 +2223,7 @@ class Up2k(object):
q = r"select * from up where w = ?" q = r"select * from up where w = ?"
argv = [wark] argv = [wark]
else: else:
q = r"select * from up where substr(w,1,16) = ? and w = ?" q = r"select * from up where substr(w,1,16)=? and +w=?"
argv = [wark[:16], wark] argv = [wark[:16], wark]
c2 = cur.execute(q, tuple(argv)) c2 = cur.execute(q, tuple(argv))
@@ -3297,9 +3300,16 @@ class Up2k(object):
""" """
dupes = [] dupes = []
sabs = djoin(sptop, srem) sabs = djoin(sptop, srem)
q = "select rd, fn from up where substr(w,1,16)=? and w=?"
if self.no_expr_idx:
q = r"select rd, fn from up where w = ?"
argv = (wark,)
else:
q = r"select rd, fn from up where substr(w,1,16)=? and +w=?"
argv = (wark[:16], wark)
for ptop, cur in self.cur.items(): for ptop, cur in self.cur.items():
for rd, fn in cur.execute(q, (wark[:16], wark)): for rd, fn in cur.execute(q, argv):
if rd.startswith("//") or fn.startswith("//"): if rd.startswith("//") or fn.startswith("//"):
rd, fn = s3dec(rd, fn) rd, fn = s3dec(rd, fn)

View File

@@ -27,8 +27,8 @@ window.baguetteBox = (function () {
isOverlayVisible = false, isOverlayVisible = false,
touch = {}, // start-pos touch = {}, // start-pos
touchFlag = false, // busy touchFlag = false, // busy
re_i = /.+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i, re_i = /^[^?]+\.(a?png|avif|bmp|gif|heif|jpe?g|jfif|svg|webp)(\?|$)/i,
re_v = /.+\.(webm|mkv|mp4)(\?|$)/i, re_v = /^[^?]+\.(webm|mkv|mp4)(\?|$)/i,
anims = ['slideIn', 'fadeIn', 'none'], anims = ['slideIn', 'fadeIn', 'none'],
data = {}, // all galleries data = {}, // all galleries
imagesElements = [], imagesElements = [],
@@ -580,6 +580,7 @@ window.baguetteBox = (function () {
function hideOverlay(e) { function hideOverlay(e) {
ev(e); ev(e);
playvid(false); playvid(false);
removeFromCache('#files');
if (options.noScrollbars) { if (options.noScrollbars) {
document.documentElement.style.overflowY = 'auto'; document.documentElement.style.overflowY = 'auto';
document.body.style.overflowY = 'auto'; document.body.style.overflowY = 'auto';
@@ -812,10 +813,16 @@ window.baguetteBox = (function () {
} }
function vid() { function vid() {
if (currentIndex >= imagesElements.length)
return;
return imagesElements[currentIndex].querySelector('video'); return imagesElements[currentIndex].querySelector('video');
} }
function vidimg() { function vidimg() {
if (currentIndex >= imagesElements.length)
return;
return imagesElements[currentIndex].querySelector('img, video'); return imagesElements[currentIndex].querySelector('img, video');
} }

View File

@@ -796,6 +796,8 @@ html.y #path a:hover {
} }
.logue { .logue {
padding: .2em 0; padding: .2em 0;
position: relative;
z-index: 1;
} }
.logue.hidden, .logue.hidden,
.logue:empty { .logue:empty {
@@ -1072,6 +1074,9 @@ html.np_open #ggrid>a.au:before {
background: var(--bg-d3); background: var(--bg-d3);
box-shadow: -.2em .2em 0 var(--f-sel-sh), -.2em -.2em 0 var(--f-sel-sh); box-shadow: -.2em .2em 0 var(--f-sel-sh), -.2em -.2em 0 var(--f-sel-sh);
} }
#player {
display: none;
}
#widget { #widget {
position: fixed; position: fixed;
font-size: 1.4em; font-size: 1.4em;

View File

@@ -155,6 +155,7 @@
sb_lg = "{{ sb_lg }}", sb_lg = "{{ sb_lg }}",
lifetime = {{ lifetime }}, lifetime = {{ lifetime }},
turbolvl = {{ turbolvl }}, turbolvl = {{ turbolvl }},
idxh = {{ idxh }},
frand = {{ frand|tojson }}, frand = {{ frand|tojson }},
u2sort = "{{ u2sort }}", u2sort = "{{ u2sort }}",
have_emp = {{ have_emp|tojson }}, have_emp = {{ have_emp|tojson }},

View File

@@ -193,6 +193,7 @@ var Ls = {
"ct_dots": "show hidden files (if server permits)", "ct_dots": "show hidden files (if server permits)",
"ct_dir1st": "sort folders before files", "ct_dir1st": "sort folders before files",
"ct_readme": "show README.md in folder listings", "ct_readme": "show README.md in folder listings",
"ct_idxh": "show index.html instead of folder listing",
"ct_sbars": "show scrollbars", "ct_sbars": "show scrollbars",
"cut_turbo": "the yolo button, you probably DO NOT want to enable this:$N$Nuse this if you were uploading a huge amount of files and had to restart for some reason, and want to continue the upload ASAP$N$Nthis replaces the hash-check with a simple <em>&quot;does this have the same filesize on the server?&quot;</em> so if the file contents are different it will NOT be uploaded$N$Nyou should turn this off when the upload is done, and then &quot;upload&quot; the same files again to let the client verify them", "cut_turbo": "the yolo button, you probably DO NOT want to enable this:$N$Nuse this if you were uploading a huge amount of files and had to restart for some reason, and want to continue the upload ASAP$N$Nthis replaces the hash-check with a simple <em>&quot;does this have the same filesize on the server?&quot;</em> so if the file contents are different it will NOT be uploaded$N$Nyou should turn this off when the upload is done, and then &quot;upload&quot; the same files again to let the client verify them",
@@ -652,6 +653,7 @@ var Ls = {
"ct_dots": "vis skjulte filer (gitt at serveren tillater det)", "ct_dots": "vis skjulte filer (gitt at serveren tillater det)",
"ct_dir1st": "sorter slik at mapper kommer foran filer", "ct_dir1st": "sorter slik at mapper kommer foran filer",
"ct_readme": "vis README.md nedenfor filene", "ct_readme": "vis README.md nedenfor filene",
"ct_idxh": "vis index.html istedenfor fil-liste",
"ct_sbars": "vis rullgardiner / skrollefelt", "ct_sbars": "vis rullgardiner / skrollefelt",
"cut_turbo": "forenklet befaring ved opplastning; bør sannsynlig <em>ikke</em> skrus på:$N$Nnyttig dersom du var midt i en svær opplastning som måtte restartes av en eller annen grunn, og du vil komme igang igjen så raskt som overhodet mulig.$N$Nnår denne er skrudd på så forenkles befaringen kraftig; istedenfor å utføre en trygg sjekk på om filene finnes på serveren i god stand, så sjekkes kun om <em>filstørrelsen</em> stemmer. Så dersom en korrupt fil skulle befinne seg på serveren allerede, på samme sted med samme størrelse og navn, så blir det <em>ikke oppdaget</em>.$N$Ndet anbefales å kun benytte denne funksjonen for å komme seg raskt igjennom selve opplastningen, for så å skru den av, og til slutt &quot;laste opp&quot; de samme filene én gang til -- slik at integriteten kan verifiseres", "cut_turbo": "forenklet befaring ved opplastning; bør sannsynlig <em>ikke</em> skrus på:$N$Nnyttig dersom du var midt i en svær opplastning som måtte restartes av en eller annen grunn, og du vil komme igang igjen så raskt som overhodet mulig.$N$Nnår denne er skrudd på så forenkles befaringen kraftig; istedenfor å utføre en trygg sjekk på om filene finnes på serveren i god stand, så sjekkes kun om <em>filstørrelsen</em> stemmer. Så dersom en korrupt fil skulle befinne seg på serveren allerede, på samme sted med samme størrelse og navn, så blir det <em>ikke oppdaget</em>.$N$Ndet anbefales å kun benytte denne funksjonen for å komme seg raskt igjennom selve opplastningen, for så å skru den av, og til slutt &quot;laste opp&quot; de samme filene én gang til -- slik at integriteten kan verifiseres",
@@ -972,6 +974,17 @@ ebi('widget').innerHTML = (
' <canvas id="pvol" width="288" height="38"></canvas>' + ' <canvas id="pvol" width="288" height="38"></canvas>' +
' <canvas id="barpos"></canvas>' + ' <canvas id="barpos"></canvas>' +
' <canvas id="barbuf"></canvas>' + ' <canvas id="barbuf"></canvas>' +
'</div>' +
'<div id="np_inf">' +
' <img id="np_img"></span>' +
' <span id="np_url"></span>' +
' <span id="np_circle"></span>' +
' <span id="np_album"></span>' +
' <span id="np_tn"></span>' +
' <span id="np_artist"></span>' +
' <span id="np_title"></span>' +
' <span id="np_pos"></span>' +
' <span id="np_dur"></span>' +
'</div>' '</div>'
); );
@@ -1083,6 +1096,7 @@ ebi('op_cfg').innerHTML = (
' <a id="dotfiles" class="tgl btn" href="#" tt="' + L.ct_dots + '">dotfiles</a>\n' + ' <a id="dotfiles" class="tgl btn" href="#" tt="' + L.ct_dots + '">dotfiles</a>\n' +
' <a id="dir1st" class="tgl btn" href="#" tt="' + L.ct_dir1st + '">📁 first</a>\n' + ' <a id="dir1st" class="tgl btn" href="#" tt="' + L.ct_dir1st + '">📁 first</a>\n' +
' <a id="ireadme" class="tgl btn" href="#" tt="' + L.ct_readme + '">📜 readme</a>\n' + ' <a id="ireadme" class="tgl btn" href="#" tt="' + L.ct_readme + '">📜 readme</a>\n' +
' <a id="idxh" class="tgl btn" href="#" tt="' + L.ct_idxh + '">htm</a>\n' +
' <a id="sbars" class="tgl btn" href="#" tt="' + L.ct_sbars + '">⟊</a>\n' + ' <a id="sbars" class="tgl btn" href="#" tt="' + L.ct_sbars + '">⟊</a>\n' +
' </div>\n' + ' </div>\n' +
'</div>\n' + '</div>\n' +
@@ -1278,6 +1292,7 @@ function set_files_html(html) {
var ACtx = window.AudioContext || window.webkitAudioContext, var ACtx = window.AudioContext || window.webkitAudioContext,
noih = /[?&]v\b/.exec('' + location),
hash0 = location.hash, hash0 = location.hash,
mp; mp;
@@ -1401,10 +1416,17 @@ var mpl = (function () {
}; };
r.pp = function () { r.pp = function () {
var adur, apos, playing = mp.au && !mp.au.paused;
clmod(ebi('np_inf'), 'playing', playing);
if (mp.au && isNum(adur = mp.au.duration) && isNum(apos = mp.au.currentTime) && apos >= 0)
ebi('np_pos').textContent = s2ms(apos);
if (!r.os_ctl) if (!r.os_ctl)
return; return;
navigator.mediaSession.playbackState = mp.au && !mp.au.paused ? "playing" : "paused"; navigator.mediaSession.playbackState = playing ? "playing" : "paused";
}; };
function setaufollow() { function setaufollow() {
@@ -1419,9 +1441,10 @@ var mpl = (function () {
var np = get_np()[0], var np = get_np()[0],
fns = np.file.split(' - '), fns = np.file.split(' - '),
artist = (np.circle && np.circle != np.artist ? np.circle + ' // ' : '') + (np.artist || (fns.length > 1 ? fns[0] : '')), artist = (np.circle && np.circle != np.artist ? np.circle + ' // ' : '') + (np.artist || (fns.length > 1 ? fns[0] : '')),
tags = { title = np.title || fns.pop(),
title: np.title || fns.pop() cover = '',
}; pcover = '',
tags = { title: title };
if (artist) if (artist)
tags.artist = artist; tags.artist = artist;
@@ -1442,18 +1465,28 @@ var mpl = (function () {
if (cover) { if (cover) {
cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j'; cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j';
pcover = cover;
var pwd = get_pwd(); var pwd = get_pwd();
if (pwd) if (pwd)
cover += '&pw=' + uricom_enc(pwd); pcover += '&pw=' + uricom_enc(pwd);
tags.artwork = [{ "src": cover, type: "image/jpeg" }]; tags.artwork = [{ "src": pcover, type: "image/jpeg" }];
} }
} }
ebi('np_circle').textContent = np.circle || '';
ebi('np_album').textContent = np.album || '';
ebi('np_tn').textContent = np['.tn'] || '';
ebi('np_artist').textContent = np.artist || (fns.length > 1 ? fns[0] : '');
ebi('np_title').textContent = np.title || '';
ebi('np_dur').textContent = np['.dur'] || '';
ebi('np_url').textContent = get_vpath() + np.file.split('?')[0];
ebi('np_img').setAttribute('src', cover); // dont give last.fm the pwd
navigator.mediaSession.metadata = new MediaMetadata(tags); navigator.mediaSession.metadata = new MediaMetadata(tags);
navigator.mediaSession.setActionHandler('play', playpause); navigator.mediaSession.setActionHandler('play', mplay);
navigator.mediaSession.setActionHandler('pause', playpause); navigator.mediaSession.setActionHandler('pause', mpause);
navigator.mediaSession.setActionHandler('seekbackward', r.os_seek ? function () { seek_au_rel(-10); } : null); navigator.mediaSession.setActionHandler('seekbackward', r.os_seek ? function () { seek_au_rel(-10); } : null);
navigator.mediaSession.setActionHandler('seekforward', r.os_seek ? function () { seek_au_rel(10); } : null); navigator.mediaSession.setActionHandler('seekforward', r.os_seek ? function () { seek_au_rel(10); } : null);
navigator.mediaSession.setActionHandler('previoustrack', prev_song); navigator.mediaSession.setActionHandler('previoustrack', prev_song);
@@ -1463,11 +1496,17 @@ var mpl = (function () {
r.announce = announce; r.announce = announce;
r.stop = function () { r.stop = function () {
if (!r.os_ctl || !navigator.mediaSession.metadata) if (!r.os_ctl)
return; return;
navigator.mediaSession.metadata = null; navigator.mediaSession.metadata = null;
navigator.mediaSession.playbackState = "paused"; navigator.mediaSession.playbackState = "paused";
var hs = 'play pause seekbackward seekforward previoustrack nexttrack'.split(/ /g);
for (var a = 0; a < hs.length; a++)
navigator.mediaSession.setActionHandler(hs[a], null);
navigator.mediaSession.setPositionState();
}; };
r.unbuffer = function (url) { r.unbuffer = function (url) {
@@ -1507,6 +1546,7 @@ function MPlayer() {
r.au2 = null; r.au2 = null;
r.tracks = {}; r.tracks = {};
r.order = []; r.order = [];
r.cd_pause = 0;
var re_audio = have_acode && mpl.ac_oth ? re_au_all : re_au_native, var re_audio = have_acode && mpl.ac_oth ? re_au_all : re_au_native,
trs = QSA('#files tbody tr'); trs = QSA('#files tbody tr');
@@ -1563,6 +1603,7 @@ function MPlayer() {
r.ftid = -1; r.ftid = -1;
r.ftimer = null; r.ftimer = null;
r.fade_in = function () { r.fade_in = function () {
r.nopause();
r.fvol = 0; r.fvol = 0;
r.fdir = 0.025 * r.vol * (CHROME ? 1.5 : 1); r.fdir = 0.025 * r.vol * (CHROME ? 1.5 : 1);
if (r.au) { if (r.au) {
@@ -1595,9 +1636,9 @@ function MPlayer() {
r.au.pause(); r.au.pause();
mpl.pp(); mpl.pp();
var t = mp.au.currentTime - 0.8; var t = r.au.currentTime - 0.8;
if (isNum(t)) if (isNum(t))
mp.au.currentTime = Math.max(t, 0); r.au.currentTime = Math.max(t, 0);
} }
else if (r.fvol > r.vol) else if (r.fvol > r.vol)
r.fvol = r.vol; r.fvol = r.vol;
@@ -1643,8 +1684,14 @@ function MPlayer() {
drop(); drop();
}); });
mp.au2.preload = "auto"; r.nopause();
mp.au2.src = mp.au2.rsrc = url; r.au2.onloadeddata = r.au2.onloadedmetadata = r.nopause;
r.au2.preload = "auto";
r.au2.src = r.au2.rsrc = url;
};
r.nopause = function () {
r.cd_pause = Date.now();
}; };
} }
@@ -1783,6 +1830,7 @@ function glossy_grad(can, h, s, l) {
var pbar = (function () { var pbar = (function () {
var r = {}, var r = {},
bau = null, bau = null,
html_txt = 'a',
lastmove = 0, lastmove = 0,
mousepos = 0, mousepos = 0,
gradh = -1, gradh = -1,
@@ -1952,6 +2000,16 @@ var pbar = (function () {
pctx.strokeText(t2, xt2, yt); pctx.strokeText(t2, xt2, yt);
pctx.fillText(t1, xt1, yt); pctx.fillText(t1, xt1, yt);
pctx.fillText(t2, xt2, yt); pctx.fillText(t2, xt2, yt);
if (w && html_txt != t2) {
ebi('np_pos').textContent = html_txt = t2;
if (mpl.os_ctl)
navigator.mediaSession.setPositionState({
'duration': adur,
'position': apos,
'playbackRate': 1
});
}
}; };
window.addEventListener('resize', r.onresize); window.addEventListener('resize', r.onresize);
@@ -2081,6 +2139,7 @@ function seek_au_sec(seek) {
if (!isNum(seek)) if (!isNum(seek))
return; return;
mp.nopause();
mp.au.currentTime = seek; mp.au.currentTime = seek;
if (mp.au.paused) if (mp.au.paused)
@@ -2153,6 +2212,25 @@ function playpause(e) {
}; };
function mplay(e) {
if (mp.au && !mp.au.paused)
return;
playpause(e);
}
function mpause(e) {
if (mp.cd_pause > Date.now() - 100)
return;
if (mp.au && mp.au.paused)
return;
playpause(e);
}
// hook up the widget buttons // hook up the widget buttons
(function () { (function () {
ebi('bplay').onclick = playpause; ebi('bplay').onclick = playpause;
@@ -2298,12 +2376,14 @@ function start_actx() {
} }
} }
var audio_eq = (function () { var afilt = (function () {
var r = { var r = {
"en": false, "eqen": false,
"bands": [31.25, 62.5, 125, 250, 500, 1000, 2000, 4000, 8000, 16000], "bands": [31.25, 62.5, 125, 250, 500, 1000, 2000, 4000, 8000, 16000],
"gains": [4, 3, 2, 1, 0, 0, 1, 2, 3, 4], "gains": [4, 3, 2, 1, 0, 0, 1, 2, 3, 4],
"filters": [], "filters": [],
"filterskip": [],
"plugs": [],
"amp": 0, "amp": 0,
"chw": 1, "chw": 1,
"last_au": null, "last_au": null,
@@ -2392,6 +2472,10 @@ var audio_eq = (function () {
r.filters[a].disconnect(); r.filters[a].disconnect();
r.filters = []; r.filters = [];
r.filterskip = [];
for (var a = 0; a < r.plugs.length; a++)
r.plugs[a].unload();
if (!mp) if (!mp)
return; return;
@@ -2409,18 +2493,34 @@ var audio_eq = (function () {
if (!actx) if (!actx)
bcfg_set('au_eq', false); bcfg_set('au_eq', false);
if (!actx || !mp.au || (!r.en && !mp.acs)) var plug = false;
for (var a = 0; a < r.plugs.length; a++)
if (r.plugs[a].en)
plug = true;
if (!actx || !mp.au || (!r.eqen && !plug && !mp.acs))
return; return;
r.stop(); r.stop();
mp.au.id = mp.au.id || Date.now(); mp.au.id = mp.au.id || Date.now();
mp.acs = r.acst[mp.au.id] = r.acst[mp.au.id] || actx.createMediaElementSource(mp.au); mp.acs = r.acst[mp.au.id] = r.acst[mp.au.id] || actx.createMediaElementSource(mp.au);
if (!r.en) { if (r.eqen)
mp.acs.connect(actx.destination); add_eq();
return;
}
for (var a = 0; a < r.plugs.length; a++)
if (r.plugs[a].en)
r.plugs[a].load();
for (var a = 0; a < r.filters.length; a++)
if (!has(r.filterskip, a))
r.filters[a].connect(a ? r.filters[a - 1] : actx.destination);
mp.acs.connect(r.filters.length ?
r.filters[r.filters.length - 1] : actx.destination);
}
function add_eq() {
var min, max; var min, max;
min = max = r.gains[0]; min = max = r.gains[0];
for (var a = 1; a < r.gains.length; a++) { for (var a = 1; a < r.gains.length; a++) {
@@ -2451,9 +2551,6 @@ var audio_eq = (function () {
fi.gain.value = r.amp + 0.94; // +.137 dB measured; now -.25 dB and almost bitperfect fi.gain.value = r.amp + 0.94; // +.137 dB measured; now -.25 dB and almost bitperfect
r.filters.push(fi); r.filters.push(fi);
for (var a = r.filters.length - 1; a >= 0; a--)
r.filters[a].connect(a > 0 ? r.filters[a - 1] : actx.destination);
if (Math.round(r.chw * 25) != 25) { if (Math.round(r.chw * 25) != 25) {
var split = actx.createChannelSplitter(2), var split = actx.createChannelSplitter(2),
merge = actx.createChannelMerger(2), merge = actx.createChannelMerger(2),
@@ -2478,11 +2575,10 @@ var audio_eq = (function () {
split.connect(lg2, 0); split.connect(lg2, 0);
split.connect(rg1, 1); split.connect(rg1, 1);
split.connect(rg2, 1); split.connect(rg2, 1);
r.filterskip.push(r.filters.length);
r.filters.push(split); r.filters.push(split);
mp.acs.channelCountMode = 'explicit'; mp.acs.channelCountMode = 'explicit';
} }
mp.acs.connect(r.filters[r.filters.length - 1]);
} }
function eq_step(e) { function eq_step(e) {
@@ -2577,7 +2673,7 @@ var audio_eq = (function () {
txt[a].onkeydown = eq_keydown; txt[a].onkeydown = eq_keydown;
} }
bcfg_bind(r, 'en', 'au_eq', false, r.apply); bcfg_bind(r, 'eqen', 'au_eq', false, r.apply);
r.draw(); r.draw();
return r; return r;
@@ -2659,7 +2755,7 @@ function play(tid, is_ev, seek) {
else else
mp.au.src = mp.au.rsrc = url; mp.au.src = mp.au.rsrc = url;
audio_eq.apply(); afilt.apply();
setTimeout(function () { setTimeout(function () {
mpl.unbuffer(url); mpl.unbuffer(url);
@@ -2682,6 +2778,7 @@ function play(tid, is_ev, seek) {
scroll2playing(); scroll2playing();
try { try {
mp.nopause();
mp.au.play(); mp.au.play();
if (mp.au.paused) if (mp.au.paused)
autoplay_blocked(seek); autoplay_blocked(seek);
@@ -4146,6 +4243,21 @@ var thegrid = (function () {
ev(e); ev(e);
} }
r.imshow = function (url) {
var sel = '#ggrid>a'
if (!thegrid.en) {
thegrid.bagit('#files');
sel = '#files a[id]';
}
var ims = QSA(sel);
for (var a = 0, aa = ims.length; a < aa; a++) {
var iu = ims[a].getAttribute('href').split('?')[0].split('/').slice(-1)[0];
if (iu == url)
return ims[a].click();
}
baguetteBox.hide();
};
r.loadsel = function () { r.loadsel = function () {
if (r.dirty) if (r.dirty)
return; return;
@@ -4285,19 +4397,19 @@ var thegrid = (function () {
} }
r.dirty = false; r.dirty = false;
r.bagit(); r.bagit('#ggrid');
r.loadsel(); r.loadsel();
setTimeout(r.tippen, 20); setTimeout(r.tippen, 20);
} }
r.bagit = function () { r.bagit = function (isrc) {
if (!window.baguetteBox) if (!window.baguetteBox)
return; return;
if (r.bbox) if (r.bbox)
baguetteBox.destroy(); baguetteBox.destroy();
r.bbox = baguetteBox.run('#ggrid', { r.bbox = baguetteBox.run(isrc, {
captions: function (g) { captions: function (g) {
var idx = -1, var idx = -1,
h = '' + g; h = '' + g;
@@ -4872,7 +4984,7 @@ document.onkeydown = function (e) {
var html = mk_files_header(tagord), seen = {}; var html = mk_files_header(tagord), seen = {};
html.push('<tbody>'); html.push('<tbody>');
html.push('<tr class="srch_hdr"><td>-</td><td><a href="#" id="unsearch"><big style="font-weight:bold">[❌] ' + L.sl_close + '</big></a> -- ' + L.sl_hits.format(res.hits.length) + (res.hits.length == cap ? ' -- <a href="#" id="moar">' + L.sl_moar + '</a>' : '') + '</td></tr>'); html.push('<tr class="srch_hdr"><td>-</td><td><a href="#" id="unsearch"><big style="font-weight:bold">[❌] ' + L.sl_close + '</big></a> -- ' + L.sl_hits.format(res.hits.length) + (res.trunc ? ' -- <a href="#" id="moar">' + L.sl_moar + '</a>' : '') + '</td></tr>');
for (var a = 0; a < res.hits.length; a++) { for (var a = 0; a < res.hits.length; a++) {
var r = res.hits[a], var r = res.hits[a],
@@ -4986,6 +5098,7 @@ var treectl = (function () {
treesz = clamp(icfg_get('treesz', 16), 10, 50); treesz = clamp(icfg_get('treesz', 16), 10, 50);
bcfg_bind(r, 'ireadme', 'ireadme', true); bcfg_bind(r, 'ireadme', 'ireadme', true);
bcfg_bind(r, 'idxh', 'idxh', idxh, setidxh);
bcfg_bind(r, 'dyn', 'dyntree', true, onresize); bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
bcfg_bind(r, 'dots', 'dotfiles', false, function (v) { bcfg_bind(r, 'dots', 'dotfiles', false, function (v) {
r.goto(get_evpath()); r.goto(get_evpath());
@@ -5010,6 +5123,16 @@ var treectl = (function () {
} }
setwrap(r.wtree); setwrap(r.wtree);
function setidxh(v) {
if (!v == !/\bidxh=y\b/.exec('' + document.cookie))
return;
var xhr = new XHR();
xhr.open('GET', SR + '/?setck=idxh=' + (v ? 'y' : 'n'), true);
xhr.send();
}
setidxh(r.idxh);
r.entree = function (e, nostore) { r.entree = function (e, nostore) {
ev(e); ev(e);
entreed = true; entreed = true;
@@ -5438,6 +5561,9 @@ var treectl = (function () {
return; return;
} }
if (r.chk_index_html(this.top, res))
return;
for (var a = 0; a < res.files.length; a++) for (var a = 0; a < res.files.length; a++)
if (res.files[a].tags === undefined) if (res.files[a].tags === undefined)
res.files[a].tags = {}; res.files[a].tags = {};
@@ -5485,6 +5611,17 @@ var treectl = (function () {
} }
} }
r.chk_index_html = function (top, res) {
if (!r.idxh || !res || !res.files || noih)
return;
for (var a = 0; a < res.files.length; a++)
if (/^index.html?(\?|$)/i.exec(res.files[a].href)) {
window.location = vjoin(top, res.files[a].href);
return true;
}
};
r.gentab = function (top, res) { r.gentab = function (top, res) {
var nodes = res.dirs.concat(res.files), var nodes = res.dirs.concat(res.files),
html = mk_files_header(res.taglist), html = mk_files_header(res.taglist),
@@ -5605,14 +5742,18 @@ var treectl = (function () {
qsr('#bbsw'); qsr('#bbsw');
if (ls0 === null) { if (ls0 === null) {
var xhr = new XHR(); var xhr = new XHR();
xhr.open('GET', SR + '/?am_js', true); xhr.open('GET', SR + '/?setck=js=y', true);
xhr.send(); xhr.send();
r.ls_cb = showfile.addlinks; r.ls_cb = showfile.addlinks;
return r.reqls(get_evpath(), false); return r.reqls(get_evpath(), false);
} }
r.gentab(get_evpath(), ls0); var top = get_evpath();
if (r.chk_index_html(top, ls0))
return;
r.gentab(top, ls0);
pbar.onresize(); pbar.onresize();
vbar.onresize(); vbar.onresize();
showfile.addlinks(); showfile.addlinks();
@@ -6641,22 +6782,27 @@ var globalcss = (function () {
var dcs = document.styleSheets; var dcs = document.styleSheets;
for (var a = 0; a < dcs.length; a++) { for (var a = 0; a < dcs.length; a++) {
var base = dcs[a].href, var ds, base = '';
try {
base = dcs[a].href;
if (!base)
continue;
ds = dcs[a].cssRules; ds = dcs[a].cssRules;
base = base.replace(/[^/]+$/, '');
if (!base) for (var b = 0; b < ds.length; b++) {
continue; var css = ds[b].cssText.split(/\burl\(/g);
ret += css[0];
base = base.replace(/[^/]+$/, ''); for (var c = 1; c < css.length; c++) {
for (var b = 0; b < ds.length; b++) { var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : '';
var css = ds[b].cssText.split(/\burl\(/g); ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
ret += css[0]; css[c].slice(delim ? 1 : 0);
for (var c = 1; c < css.length; c++) { }
var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : ''; ret += '\n';
ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) +
css[c].slice(delim ? 1 : 0);
} }
ret += '\n'; }
catch (ex) {
console.log('could not read css', a, base);
} }
} }
return ret; return ret;
@@ -6742,6 +6888,7 @@ function show_md(md, name, div, url, depth) {
els[a].setAttribute('href', '#md-' + href.slice(1)); els[a].setAttribute('href', '#md-' + href.slice(1));
} }
md_th_set();
set_tabindex(); set_tabindex();
var hash = location.hash; var hash = location.hash;
if (hash.startsWith('#md-')) if (hash.startsWith('#md-'))
@@ -6820,6 +6967,7 @@ function sandbox(tgt, rules, cls, html) {
'window.onblur=function(){say("ilost #' + tid + '")};' + 'window.onblur=function(){say("ilost #' + tid + '")};' +
'var el="' + want + '"&&ebi("' + want + '");' + 'var el="' + want + '"&&ebi("' + want + '");' +
'if(el)say("iscroll #' + tid + ' "+el.offsetTop);' + 'if(el)say("iscroll #' + tid + ' "+el.offsetTop);' +
'md_th_set();' +
(cls == 'mdo' && md_plug.post ? (cls == 'mdo' && md_plug.post ?
'const x={' + md_plug.post + '};' + 'const x={' + md_plug.post + '};' +
'if(x.render)x.render(ebi("b"));' + 'if(x.render)x.render(ebi("b"));' +
@@ -6852,6 +7000,9 @@ window.addEventListener("message", function (e) {
else if (t[0] == 'igot' || t[0] == 'ilost') { else if (t[0] == 'igot' || t[0] == 'ilost') {
clmod(QS(t[1] + '>iframe'), 'focus', t[0] == 'igot'); clmod(QS(t[1] + '>iframe'), 'focus', t[0] == 'igot');
} }
else if (t[0] == 'imshow') {
thegrid.imshow(e.data.slice(7));
}
} catch (ex) { } catch (ex) {
console.log('msg-err: ' + ex); console.log('msg-err: ' + ex);
} }
@@ -7126,8 +7277,8 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
function reload_mp() { function reload_mp() {
if (mp && mp.au) { if (mp && mp.au) {
if (audio_eq) if (afilt)
audio_eq.stop(); afilt.stop();
mp.au.pause(); mp.au.pause();
mp.au = null; mp.au = null;
@@ -7139,8 +7290,8 @@ function reload_mp() {
plays[a].parentNode.innerHTML = '-'; plays[a].parentNode.innerHTML = '-';
mp = new MPlayer(); mp = new MPlayer();
if (audio_eq) if (afilt)
audio_eq.acst = {}; afilt.acst = {};
setTimeout(pbar.onresize, 1); setTimeout(pbar.onresize, 1);
} }


@@ -231,11 +231,11 @@ function convert_markdown(md_text, dest_dom) {
var nodes = md_dom.getElementsByTagName('a'); var nodes = md_dom.getElementsByTagName('a');
for (var a = nodes.length - 1; a >= 0; a--) { for (var a = nodes.length - 1; a >= 0; a--) {
var href = nodes[a].getAttribute('href'); var href = nodes[a].getAttribute('href');
var txt = nodes[a].textContent; var txt = nodes[a].innerHTML;
if (!txt) if (!txt)
nodes[a].textContent = href; nodes[a].textContent = href;
else if (href !== txt) else if (href !== txt && !nodes[a].className)
nodes[a].className = 'vis'; nodes[a].className = 'vis';
} }


@@ -43,10 +43,9 @@
<h1>WebDAV</h1> <h1>WebDAV</h1>
<div class="os win"> <div class="os win">
<p><em>note: rclone-FTP is a bit faster, so {% if args.ftp or args.ftps %}try that first{% else %}consider enabling FTP in server settings{% endif %}</em></p>
<p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p> <p>if you can, install <a href="https://winfsp.dev/rel/">winfsp</a>+<a href="https://downloads.rclone.org/rclone-current-windows-amd64.zip">rclone</a> and then paste this in cmd:</p>
<pre> <pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %} rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b> rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>W:</b>
</pre> </pre>
{% if s %} {% if s %}
@@ -69,9 +68,9 @@
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets
printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab
</pre> </pre>
<p>or you can use rclone instead, which is much slower but doesn't require root:</p> <p>or you can use rclone instead, which is much slower but doesn't require root (plus it keeps lastmodified on upload):</p>
<pre> <pre>
rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %} rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=owncloud pacer_min_sleep=0.01ms{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b> rclone mount --vfs-cache-mode writes --dir-cache-time 5s {{ aname }}-dav:{{ rvp }} <b>mp</b>
</pre> </pre>
{% if s %} {% if s %}


@@ -451,6 +451,20 @@ html.y textarea:focus {
padding: .2em .5em; padding: .2em .5em;
border: .12em solid #aaa; border: .12em solid #aaa;
} }
.mdo .mdth,
.mdo .mdthl,
.mdo .mdthr {
margin: .5em .5em .5em 0;
}
.mdthl {
float: left;
}
.mdthr {
float: right;
}
hr {
clear: both;
}
@media screen { @media screen {
.mdo { .mdo {


@@ -632,6 +632,29 @@ function vsplit(vp) {
} }
function vjoin(p1, p2) {
if (!p1)
p1 = '';
if (!p2)
p2 = '';
if (p1.endsWith('/'))
p1 = p1.slice(0, -1);
if (p2.startsWith('/'))
p2 = p2.slice(1);
if (!p1)
return p2;
if (!p2)
return p1;
return p1 + '/' + p2;
}
function uricom_enc(txt, do_fb_enc) { function uricom_enc(txt, do_fb_enc) {
try { try {
return encodeURIComponent(txt); return encodeURIComponent(txt);
@@ -1183,13 +1206,13 @@ var tt = (function () {
r.th.style.top = (e.pageY + 12 * sy) + 'px'; r.th.style.top = (e.pageY + 12 * sy) + 'px';
}; };
if (IPHONE) { if (TOUCH) {
var f1 = r.show, var f1 = r.show,
f2 = r.hide, f2 = r.hide,
q = []; q = [];
// if an onclick-handler creates a new timer, // if an onclick-handler creates a new timer,
// iOS 13.1.2 delays the entire handler by up to 401ms, // webkits delay the entire handler by up to 401ms,
// win by using a shared timer instead // win by using a shared timer instead
timer.add(function () { timer.add(function () {
@@ -1579,6 +1602,14 @@ function load_md_plug(md_text, plug_type, defer) {
if (defer) if (defer)
md_plug[plug_type] = null; md_plug[plug_type] = null;
if (plug_type == 'pre')
try {
md_text = md_thumbs(md_text);
}
catch (ex) {
toast.warn(30, '' + ex);
}
if (!have_emp) if (!have_emp)
return md_text; return md_text;
@@ -1619,6 +1650,47 @@ function load_md_plug(md_text, plug_type, defer) {
return md; return md;
} }
function md_thumbs(md) {
if (!/(^|\n)<!-- th -->/.exec(md))
return md;
// `!th[flags](some.jpg)`
// flags: nothing or "l" or "r"
md = md.split(/!th\[/g);
for (var a = 1; a < md.length; a++) {
if (!/^[^\]!()]*\]\([^\][!()]+\)/.exec(md[a])) {
md[a] = '!th[' + md[a];
continue;
}
var o1 = md[a].indexOf(']('),
o2 = md[a].indexOf(')', o1),
alt = md[a].slice(0, o1),
flags = alt.split(','),
url = md[a].slice(o1 + 2, o2),
float = has(flags, 'l') ? 'left' : has(flags, 'r') ? 'right' : '';
if (!/[?&]cache/.exec(url))
url += (url.indexOf('?') < 0 ? '?' : '&') + 'cache=i';
md[a] = '<a href="' + url + '" class="mdth mdth' + float.slice(0, 1) + '"><img src="' + url + '&th=w" alt="' + alt + '" /></a>' + md[a].slice(o2 + 1);
}
return md.join('');
}
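// usage illustration (hypothetical input): a readme must opt in with a
// "<!-- th -->" line; every !th[flags](pic.jpg) then becomes a linked
// thumbnail, and flags "l" / "r" float it left / right:
//   md_thumbs('<!-- th -->\n!th[r](beach.jpg)\nsome text\n')
//   // -> '<!-- th -->\n<a href="beach.jpg?cache=i" class="mdth mdthr">' +
//   //    '<img src="beach.jpg?cache=i&th=w" alt="r" /></a>\nsome text\n'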
function md_th_set() {
var els = QSA('.mdth');
for (var a = 0, aa = els.length; a < aa; a++)
els[a].onclick = md_th_click;
}
function md_th_click(e) {
ev(e);
var url = this.getAttribute('href').split('?')[0];
if (window.sb_md)
window.parent.postMessage("imshow " + url, "*");
else
thegrid.imshow(url);
}
var svg_decl = '<?xml version="1.0" encoding="UTF-8"?>\n'; var svg_decl = '<?xml version="1.0" encoding="UTF-8"?>\n';


@@ -1,3 +1,108 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0401-2112 `v1.6.11` not joke
## new features
* new event-hook: [exif stripper](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/image-noexif.py)
* [markdown thumbnails](https://a.ocv.me/pub/demo/pics-vids/README.md?v) -- see [readme](https://github.com/9001/copyparty#markdown-viewer)
* soon: support for [web-scrobbler](https://github.com/web-scrobbler/web-scrobbler/) - the [Last.fm](https://www.last.fm/user/tripflag) browser extension
* will update here + readme with more info when [the v3](https://github.com/web-scrobbler/web-scrobbler/projects/5) is out
## bugfixes
* more sqlite query-planner twiddling
* deleting files is MUCH faster now, and uploads / bootup might be a bit better too
* webdav optimizations / compliance
* should make some webdav clients run faster than before
* in very related news, the webdav-client in [rclone](https://github.com/rclone/rclone/) v1.63 ([currently beta](https://beta.rclone.org/?filter=latest)) will be ***FAST!***
* does cool stuff such as [bidirectional sync](https://github.com/9001/copyparty#folder-sync) between copyparty and a local folder
* [bpm detector](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/audio-bpm.py) is a bit more accurate
* [u2cli](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) / commandline uploader: better error messages if something goes wrong
* readme rendering could fail in firefox if certain addons were installed (not sure which)
* event-hooks: more accurate usage examples
## other changes
* @chinponya automated the prismjs build step (thx!)
* updated some js deps (markedjs, codemirror)
* copyparty.exe: updated Pillow to 9.5.0
* and finally [the joke](https://github.com/9001/copyparty/blob/hovudstraum/contrib/plugins/rave.js) (looks [like this](https://cd.ocv.me/b/d2/d21/#af-9b927c42))
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0320-2156 `v1.6.10` rclone sync
## new features
* [iPhone "app"](https://github.com/9001/copyparty#ios-shortcuts) (upload shortcut) -- thanks @Daedren !
* can strip exif, upload files, pics, vids, links, clipboard
* can download links and rehost the target file on your server
* support `rclone sync` to [sync folders](https://github.com/9001/copyparty#folder-sync) to/from copyparty
* let webdav clients set lastmodified times during upload
* let webdav clients replace files during upload
## bugfixes
* [prisonparty](https://github.com/9001/copyparty/blob/hovudstraum/bin/prisonparty.sh): FFmpeg transcoding was slow because there was no `/dev/urandom`
* iphones would fail to play *some* songs (low-bitrate and/or shorter than ~7 seconds)
* due to either an iOS bug or an FFmpeg bug in the caf remuxing idk
* fixed by mixing in white noise into songs if an iPhone asks for them
* small correction in the docker readme regarding rootless podman
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0316-2106 `v1.6.9` index.html
## new features
* option to show `index.html` instead of the folder listing
* arg `--ih` makes it default-enabled
* clients can enable/disable it in the `[⚙️]` settings tab
* url-param `?v` skips it for a particular folder
* faster folder-thumbnail validation on startup (mostly on conventional HDDs)
## bugfixes
* "load more" button didn't always show up when search results got truncated
* ux: tooltips could block buttons on android
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0312-1610 `v1.6.8` folder thumbs
* read-only demo server at https://a.ocv.me/pub/demo/
* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) [client testbed](https://cd.ocv.me/b/)
## new features
* folder thumbnails are indexed in the db
* now supports non-lowercase names (`Cover.jpg`, `Folder.JPG`)
* folders without a specific cover/folder image will show the first pic inside
* when audio playback continues into an empty folder, keep trying for a bit
* add no-index hints (google etc) in basic-browser HTML (`?b`, `?b=u`)
* [commandline uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) supports long filenames on win7
## bugfixes
* rotated logfiles didn't get xz compressed
* image-gallery links pointing to a deleted image show an error instead of a crashpage
## other changes
* folder thumbnails have purple text to differentiate from files
* `copyparty32.exe` starts 30% faster (but is 6% larger)
----
# what to download?
| download link | is it good? | description |
| -- | -- | -- |
| **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** | ✅ the best 👍 | runs anywhere! only needs python |
| [a docker image](https://github.com/9001/copyparty/blob/hovudstraum/scripts/docker/README.md) | it's ok | good if you prefer docker 🐋 |
| [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) | ⚠️ [acceptable](https://github.com/9001/copyparty#copypartyexe) | for [win8](https://user-images.githubusercontent.com/241032/221445946-1e328e56-8c5b-44a9-8b9f-dee84d942535.png) or later; built-in thumbnailer |
| [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe) | ⚠️ acceptable | [CLI uploader](https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py) as a win7+ exe ([video](https://a.ocv.me/pub/demo/pics-vids/u2cli.webm)) |
| [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) | ⛔️ [dangerous](https://github.com/9001/copyparty#copypartyexe) | for [win7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png) -- never expose to the internet! |
| [cpp-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) | ⛔️ dangerous | runs on [64bit WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png), otherwise useless |
* except for [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe), all of the options above are equivalent
* the zip and tar.gz files below are just source code
* python packages are available at [PyPI](https://pypi.org/project/copyparty/#files)
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀ ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0305-2018 `v1.6.7` fix no-dedup + add up2k.exe # 2023-0305-2018 `v1.6.7` fix no-dedup + add up2k.exe


@@ -4,6 +4,7 @@
* [future plans](#future-plans) - some improvement ideas * [future plans](#future-plans) - some improvement ideas
* [design](#design) * [design](#design)
* [up2k](#up2k) - quick outline of the up2k protocol * [up2k](#up2k) - quick outline of the up2k protocol
* [why not tus](#why-not-tus) - I didn't know about [tus](https://tus.io/)
* [why chunk-hashes](#why-chunk-hashes) - a single sha512 would be better, right? * [why chunk-hashes](#why-chunk-hashes) - a single sha512 would be better, right?
* [http api](#http-api) * [http api](#http-api)
* [read](#read) * [read](#read)
@@ -66,6 +67,13 @@ regarding the frequent server log message during uploads;
* on this http connection, `2.77 GiB` transferred, `102.9 MiB/s` average, `948` chunks handled * on this http connection, `2.77 GiB` transferred, `102.9 MiB/s` average, `948` chunks handled
* client says `4` uploads OK, `0` failed, `3` busy, `1` queued, `10042 MiB` total size, `7198 MiB` and `00:01:09` left * client says `4` uploads OK, `0` failed, `3` busy, `1` queued, `10042 MiB` total size, `7198 MiB` and `00:01:09` left
## why not tus
I didn't know about [tus](https://tus.io/) when I made this, but:
* up2k has the advantage that it supports parallel uploading of non-contiguous chunks straight into the final file -- [tus does a merge at the end](https://tus.io/protocols/resumable-upload.html#concatenation) which is slow and taxing on the server HDD / filesystem (unless i'm misunderstanding)
* up2k has the slight disadvantage of requiring the client to hash the entire file before an upload can begin, but this has the benefit of immediately skipping duplicate files
* and the hashing happens in a separate thread anyways so it's usually not a bottleneck
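to make the chunking idea concrete, here is a minimal browser-side sketch of hashing a file per chunk before upload -- not the actual up2k client; the 8 MiB chunk size and the base64 hash encoding are illustrative assumptions:
```js
// hash a File in fixed-size chunks; the server can then request any missing
// chunks in any order and write them straight into the final file, and files
// whose hashes are already known can be skipped (dedup) without uploading
async function hashChunks(file, chunkSize = 8 * 1024 * 1024) {
    const hashes = [];
    for (let ofs = 0; ofs < file.size; ofs += chunkSize) {
        const buf = await file.slice(ofs, ofs + chunkSize).arrayBuffer();
        const digest = await crypto.subtle.digest('SHA-512', buf);
        hashes.push(btoa(String.fromCharCode(...new Uint8Array(digest))));
    }
    return hashes; // announced to the server before any chunk data is sent
}
```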
## why chunk-hashes ## why chunk-hashes
a single sha512 would be better, right? a single sha512 would be better, right?

docs/examples/README.md (new file, 4 lines)

@@ -0,0 +1,4 @@
copyparty server config examples
[windows.md](windows.md) -- running copyparty as a service on windows

docs/examples/windows.md (new file, 116 lines)

@@ -0,0 +1,116 @@
# running copyparty on windows
this is a complete example / quickstart for running copyparty on windows, optionally as a service (autostart on boot)
you will definitely need either [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (comfy, portable, more features) or [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) (smaller, safer)
* if you decided to grab `copyparty-sfx.py` instead of the exe you will also need to install the ["Latest Python 3 Release"](https://www.python.org/downloads/windows/)
then you probably want to download [FFmpeg](https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip) and put `ffmpeg.exe` and `ffprobe.exe` in your PATH (so for example `C:\Windows\System32\`) -- this enables thumbnails, audio transcoding, and making music metadata searchable
## the config file
open up notepad and save the following as `c:\users\you\documents\party.conf` (for example)
```yaml
[global]
lo: c:\users\you\logs\cpp-%Y-%m%d.xz # log to file
e2dsa, e2ts, no-dedup, z # sets 4 flags; see expl.
p: 80, 443 # listen on ports 80 and 443, not 3923
theme: 2 # default theme: protonmail-monokai
lang: nor # default language: viking
[accounts] # usernames and passwords
kevin: shangalabangala # kevin's password
[/] # create a volume available at /
c:\pub # sharing this filesystem location
accs: # and set permissions:
r: * # everyone can read/download files,
rwmd: kevin # kevin can read/write/move/delete
[/inc] # create another volume at /inc
c:\pub\inc # sharing this filesystem location
accs: # permissions:
w: * # everyone can upload, but not browse
rwmd: kevin # kevin is admin here too
[/music] # and a third volume at /music
~/music # which shares c:\users\you\music
accs:
r: *
rwmd: kevin
```
### config explained: [global]
the `[global]` section accepts any config parameters you can see when running copyparty (either the exe or the sfx.py) with `--help`, so this is the same as running copyparty with arguments `--lo c:\users\you\logs\cpp-%Y-%m%d.xz -e2dsa -e2ts --no-dedup -z -p 80,443 --theme 2 --lang nor`
* `lo: c:\users\you\logs\cpp-%Y-%m%d.xz` writes compressed logs (the compression will make them delayed)
* sorry that `~/logs/` doesn't work currently, good oversight
* `e2dsa` enables the upload deduplicator and file indexer, which enables searching
* `e2ts` enables music metadata indexing, making albums / titles etc. searchable too
* `no-dedup` writes full dupes to disk instead of symlinking, since lots of windows software doesn't handle symlinks well
* but the improved upload speed from `e2dsa` is not affected
* `z` enables zeroconf, making the server available at `http://HOSTNAME.local/` from any other machine in the LAN
* `p: 80,443` listens on the ports `80` and `443` instead of the default `3923`
* `lang: nor` sets default language to viking
### config explained: [accounts]
the `[accounts]` section defines all the user accounts, which can then be referenced when granting people access to the different volumes
### config explained: volumes
then we create three volumes, one at `/`, one at `/inc`, and one at `/music`
* `/` and `/music` are readable without requiring people to login (`r: *`) but you need to login as kevin to write/move/delete files (`rwmd: kevin`)
* anyone can upload to `/inc` but you must be logged in as kevin to see the files inside
## run copyparty
to test your config it's best to just run copyparty in a console to watch the output:
```batch
copyparty.exe -c party.conf
```
or if you wanna use `copyparty-sfx.py` instead of the exe (understandable),
```batch
%localappdata%\programs\python\python311\python.exe copyparty-sfx.py -c party.conf
```
(please adjust `python311` to match the python version you installed, i'm not good enough at windows to make that bit generic)
## run it as a service
to run this as a service you need [NSSM](https://nssm.cc/ci/nssm-2.24-101-g897c7ad.zip), so put the exe somewhere in your PATH
then either do this for `copyparty.exe`:
```batch
nssm install cpp %homedrive%%homepath%\downloads\copyparty.exe -c %homedrive%%homepath%\documents\party.conf
```
or do this for `copyparty-sfx.py`:
```batch
nssm install cpp %localappdata%\programs\python\python311\python.exe %homedrive%%homepath%\downloads\copyparty-sfx.py -c %homedrive%%homepath%\documents\party.conf
```
then after creating the service, modify it so it runs with your own windows account (so file permissions don't get wonky and paths expand as expected):
```batch
nssm set cpp ObjectName .\yourAccountName yourWindowsPassword
nssm start cpp
```
and that's it, all good
if it doesn't start, enable stderr logging so you can see what went wrong:
```batch
nssm set cpp AppStderr %homedrive%%homepath%\logs\cppsvc.err
nssm set cpp AppStderrCreationDisposition 2
```


@@ -14,6 +14,10 @@ when server is on another machine (1gbit LAN),
# creating the config file # creating the config file
the copyparty "connect" page at `/?hc` (so for example http://127.0.0.1:3923/?hc) will generate commands to autoconfigure rclone for your server
**if you prefer to configure rclone manually, continue reading:**
replace `hunter2` with your password, or remove the `hunter2` lines if you allow anonymous access replace `hunter2` with your password, or remove the `hunter2` lines if you allow anonymous access
@@ -22,14 +26,16 @@ replace `hunter2` with your password, or remove the `hunter2` lines if you allow
( (
echo [cpp-rw] echo [cpp-rw]
echo type = webdav echo type = webdav
echo vendor = other echo vendor = owncloud
echo url = http://127.0.0.1:3923/ echo url = http://127.0.0.1:3923/
echo headers = Cookie,cppwd=hunter2 echo headers = Cookie,cppwd=hunter2
echo pacer_min_sleep = 0.01ms
echo( echo(
echo [cpp-ro] echo [cpp-ro]
echo type = http echo type = http
echo url = http://127.0.0.1:3923/ echo url = http://127.0.0.1:3923/
echo headers = Cookie,cppwd=hunter2 echo headers = Cookie,cppwd=hunter2
echo pacer_min_sleep = 0.01ms
) > %userprofile%\.config\rclone\rclone.conf ) > %userprofile%\.config\rclone\rclone.conf
``` ```
@@ -41,14 +47,16 @@ also install the windows dependencies: [winfsp](https://github.com/billziss-gh/w
cat > ~/.config/rclone/rclone.conf <<'EOF' cat > ~/.config/rclone/rclone.conf <<'EOF'
[cpp-rw] [cpp-rw]
type = webdav type = webdav
vendor = other vendor = owncloud
url = http://127.0.0.1:3923/ url = http://127.0.0.1:3923/
headers = Cookie,cppwd=hunter2 headers = Cookie,cppwd=hunter2
pacer_min_sleep = 0.01ms
[cpp-ro] [cpp-ro]
type = http type = http
url = http://127.0.0.1:3923/ url = http://127.0.0.1:3923/
headers = Cookie,cppwd=hunter2 headers = Cookie,cppwd=hunter2
pacer_min_sleep = 0.01ms
EOF EOF
``` ```
@@ -62,6 +70,15 @@ rclone.exe mount --vfs-cache-mode writes --vfs-cache-max-age 5s --attr-timeout 5
``` ```
# sync folders to/from copyparty
note that the up2k client [up2k.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#up2kpy) (available on the "connect" page of your copyparty server) does uploads much faster and safer, but rclone is bidirectional and more ubiquitous
```
rclone sync /usr/share/icons/ cpp-rw:fds/
```
# use rclone as server too, replacing copyparty # use rclone as server too, replacing copyparty
feels out of place but is too good not to mention feels out of place but is too good not to mention
@@ -70,3 +87,26 @@ feels out of place but is too good not to mention
rclone.exe serve http --read-only . rclone.exe serve http --read-only .
rclone.exe serve webdav . rclone.exe serve webdav .
``` ```
# devnotes
copyparty supports and expects [the following](https://github.com/rclone/rclone/blob/46484022b08f8756050aa45505ea0db23e62df8b/backend/webdav/webdav.go#L575-L578) from rclone,
```go
case "owncloud":
f.canStream = true
f.precision = time.Second
f.useOCMtime = true
f.hasOCMD5 = true
f.hasOCSHA1 = true
```
notably,
* `useOCMtime` enables the `x-oc-mtime` header to retain mtime of uploads from rclone
* `canStream` is supported but not required by us
* `hasOCMD5` / `hasOCSHA1` is conveniently dontcare on both ends
there's a scary comment mentioning PROPSET of lastmodified which is not something we wish to support
and if `vendor=owncloud` ever stops working, try `vendor=fastmail` instead
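for reference, the effect of `useOCMtime` boils down to one extra request header on uploads; a rough illustration (not rclone's actual implementation, and the epoch-seconds value format is an assumption based on owncloud's convention):
```js
// PUT a file with the x-oc-mtime header so the server keeps the file's
// original last-modified time instead of the time of upload
const mtime = Math.floor(Date.now() / 1000);  // unix timestamp in seconds
fetch('http://127.0.0.1:3923/somefolder/hello.txt', {
    method: 'PUT',
    headers: { 'x-oc-mtime': '' + mtime },
    body: new Blob(['hello world'])
});
```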


@@ -17,6 +17,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
### ...in reviews: ### ...in reviews:
* ✅ = advantages over copyparty * ✅ = advantages over copyparty
* 💾 = what copyparty offers as an alternative
* 🔵 = similarities * 🔵 = similarities
* ⚠️ = disadvantages (something copyparty does "better") * ⚠️ = disadvantages (something copyparty does "better")
@@ -46,6 +47,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [kodbox](#kodbox) * [kodbox](#kodbox)
* [filebrowser](#filebrowser) * [filebrowser](#filebrowser)
* [filegator](#filegator) * [filegator](#filegator)
* [sftpgo](#sftpgo)
* [updog](#updog) * [updog](#updog)
* [goshs](#goshs) * [goshs](#goshs)
* [gimme-that](#gimme-that) * [gimme-that](#gimme-that)
@@ -53,6 +55,7 @@ currently up to date with [awesome-selfhosted](https://github.com/awesome-selfho
* [linx](#linx) * [linx](#linx)
* [h5ai](#h5ai) * [h5ai](#h5ai)
* [autoindex](#autoindex) * [autoindex](#autoindex)
* [miniserve](#miniserve)
* [briefly considered](#briefly-considered) * [briefly considered](#briefly-considered)
@@ -89,6 +92,7 @@ the softwares,
* `i` = [kodbox](https://github.com/kalcaddle/kodbox) * `i` = [kodbox](https://github.com/kalcaddle/kodbox)
* `j` = [filebrowser](https://github.com/filebrowser/filebrowser) * `j` = [filebrowser](https://github.com/filebrowser/filebrowser)
* `k` = [filegator](https://github.com/filegator/filegator) * `k` = [filegator](https://github.com/filegator/filegator)
* `l` = [sftpgo](https://github.com/drakkan/sftpgo)
some softwares not in the matrixes, some softwares not in the matrixes,
* [updog](#updog) * [updog](#updog)
@@ -96,6 +100,9 @@ some softwares not in the matrixes,
* [gimme-that](#gimmethat) * [gimme-that](#gimmethat)
* [ass](#ass) * [ass](#ass)
* [linx](#linx) * [linx](#linx)
* [h5ai](#h5ai)
* [autoindex](#autoindex)
* [miniserve](#miniserve)
symbol legend, symbol legend,
* `█` = absolutely * `█` = absolutely
@@ -106,62 +113,64 @@ symbol legend,
## general ## general
| feature / software | a | b | c | d | e | f | g | h | i | j | k | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ | | intuitive UX | | | █ | █ | █ | | █ | █ | █ | █ | █ | █ |
| config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | | config GUI | | █ | █ | █ | █ | | | █ | █ | █ | | █ |
| good documentation | | | | █ | █ | █ | █ | | | █ | █ | | good documentation | | | | █ | █ | █ | █ | | | █ | █ | |
| runs on iOS | | | | | | | | | | | | | runs on iOS | | | | | | | | | | | | |
| runs on Android | █ | | | | | █ | | | | | | | runs on Android | █ | | | | | █ | | | | | | |
| runs on WinXP | █ | █ | | | | █ | | | | | | | runs on WinXP | █ | █ | | | | █ | | | | | | |
| runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ | | runs on Windows | █ | █ | █ | █ | █ | █ | █ | | █ | █ | █ | █ |
| runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | | runs on Linux | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | | runs on Macos | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | | runs on FreeBSD | █ | | | • | █ | █ | █ | • | █ | █ | | █ |
| portable binary | █ | █ | █ | | | █ | █ | | | █ | | | portable binary | █ | █ | █ | | | █ | █ | | | █ | | █ |
| zero setup, just go | █ | █ | █ | | | | █ | | | █ | | | zero setup, just go | █ | █ | █ | | | | █ | | | █ | | |
| android app | | | | █ | █ | | | | | | | | android app | | | | █ | █ | | | | | | | |
| iOS app | | | | █ | █ | | | | | | | | iOS app | | | | █ | █ | | | | | | | |
* `zero setup` = you can get a mostly working setup by just launching the app, without having to install any software or configure whatever * `zero setup` = you can get a mostly working setup by just launching the app, without having to install any software or configure whatever
* `a`/copyparty remarks: * `a`/copyparty remarks:
* no gui for server settings; only for client-side stuff * no gui for server settings; only for client-side stuff
* can theoretically run on iOS / iPads using [iSH](https://ish.app/), but only the iPad will offer sufficient multitasking i think * can theoretically run on iOS / iPads using [iSH](https://ish.app/), but only the iPad will offer sufficient multitasking i think
* [android app](https://f-droid.org/en/packages/me.ocv.partyup/) is for uploading only * [android app](https://f-droid.org/en/packages/me.ocv.partyup/) is for uploading only
* no iOS app but has [shortcuts](https://github.com/9001/copyparty#ios-shortcuts) for easy uploading
* `b`/hfs2 runs on linux through wine * `b`/hfs2 runs on linux through wine
* `f`/rclone must be started with the command `rclone serve webdav .` or similar * `f`/rclone must be started with the command `rclone serve webdav .` or similar
* `h`/chibisafe has undocumented windows support * `h`/chibisafe has undocumented windows support
* `i`/sftpgo must be launched with a command
## file transfer ## file transfer
*the thing that copyparty is actually kinda good at* *the thing that copyparty is actually kinda good at*
| feature / software | a | b | c | d | e | f | g | h | i | j | k | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| download folder as zip | █ | █ | █ | █ | █ | | █ | | █ | █ | | | download folder as zip | █ | █ | █ | █ | █ | | █ | | █ | █ | | █ |
| download folder as tar | █ | | | | | | | | | █ | | | download folder as tar | █ | | | | | | | | | █ | | |
| upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | upload | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| parallel uploads | █ | | | █ | █ | | • | | █ | | █ | | parallel uploads | █ | | | █ | █ | | • | | █ | | █ | |
| resumable uploads | █ | | | | | | | | █ | | █ | | resumable uploads | █ | | | | | | | | █ | | █ | |
| upload segmenting | █ | | | | | | | █ | █ | | █ | | upload segmenting | █ | | | | | | | █ | █ | | █ | |
| upload acceleration | █ | | | | | | | | █ | | █ | | upload acceleration | █ | | | | | | | | █ | | █ | |
| upload verification | █ | | | █ | █ | | | | █ | | | | upload verification | █ | | | █ | █ | | | | █ | | | |
| upload deduplication | █ | | | | █ | | | | █ | | | | upload deduplication | █ | | | | █ | | | | █ | | | |
| upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | | upload a 999 TiB file | █ | | | | █ | █ | • | | █ | | █ | |
| keep last-modified time | █ | | | █ | █ | █ | | | | | | | keep last-modified time | █ | | | █ | █ | █ | | | | | | █ |
| upload rules | | | | | | | | | | | | | upload rules | | | | | | | | | | | | |
| ┗ max disk usage | █ | █ | | | █ | | | | █ | | | | ┗ max disk usage | █ | █ | | | █ | | | | █ | | | █ |
| ┗ max filesize | █ | | | | | | | █ | | | █ | | ┗ max filesize | █ | | | | | | | █ | | | █ | █ |
| ┗ max items in folder | █ | | | | | | | | | | | | ┗ max items in folder | █ | | | | | | | | | | | |
| ┗ max file age | █ | | | | | | | | █ | | | | ┗ max file age | █ | | | | | | | | █ | | | |
| ┗ max uploads over time | █ | | | | | | | | | | | | ┗ max uploads over time | █ | | | | | | | | | | | |
| ┗ compress before write | █ | | | | | | | | | | | | ┗ compress before write | █ | | | | | | | | | | | |
| ┗ randomize filename | █ | | | | | | | █ | █ | | | | ┗ randomize filename | █ | | | | | | | █ | █ | | | |
| ┗ mimetype reject-list | | | | | | | | | • | | | | ┗ mimetype reject-list | | | | | | | | | • | | | |
| ┗ extension reject-list | | | | | | | | █ | • | | | | ┗ extension reject-list | | | | | | | | █ | • | | | |
| checksums provided | | | | █ | █ | | | | █ | | | | checksums provided | | | | █ | █ | | | | █ | | | |
| cloud storage backend | | | | █ | █ | █ | | | | | █ | | cloud storage backend | | | | █ | █ | █ | | | | | █ | █ |
* `upload segmenting` = files are sliced into chunks, making it possible to upload files larger than 100 MiB on cloudflare for example * `upload segmenting` = files are sliced into chunks, making it possible to upload files larger than 100 MiB on cloudflare for example
@@ -178,25 +187,29 @@ symbol legend,
* can provide checksums for single files on request * can provide checksums for single files on request
* can probably do extension/mimetype rejection similar to copyparty * can probably do extension/mimetype rejection similar to copyparty
* `k`/filegator download-as-zip is not streaming; it creates the full zipfile before download can start * `k`/filegator download-as-zip is not streaming; it creates the full zipfile before download can start
* `l`/sftpgo:
* resumable/segmented uploads only over SFTP, not over HTTP
* upload rules are totals only, not over time
* can probably do extension/mimetype rejection similar to copyparty
## protocols and client support ## protocols and client support
| feature / software | a | b | c | d | e | f | g | h | i | j | k | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | | serve https | █ | | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| serve webdav | █ | | | █ | █ | █ | █ | | █ | | | | serve webdav | █ | | | █ | █ | █ | █ | | █ | | | █ |
| serve ftp | █ | | | | | █ | | | | | | | serve ftp | █ | | | | | █ | | | | | | █ |
| serve ftps | █ | | | | | █ | | | | | | | serve ftps | █ | | | | | █ | | | | | | █ |
| serve sftp | | | | | | █ | | | | | | | serve sftp | | | | | | █ | | | | | | █ |
| serve smb/cifs | | | | | | █ | | | | | | | serve smb/cifs | | | | | | █ | | | | | | |
| serve dlna | | | | | | █ | | | | | | | serve dlna | | | | | | █ | | | | | | |
| listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | | listen on unix-socket | | | | █ | █ | | █ | █ | █ | | █ | █ |
| zeroconf | █ | | | | | | | | | | | | zeroconf | █ | | | | | | | | | | | |
| supports netscape 4 | | | | | | █ | | | | | • | | supports netscape 4 | | | | | | █ | | | | | • | |
| ...internet explorer 6 | | █ | | █ | | █ | | | | | • | | ...internet explorer 6 | | █ | | █ | | █ | | | | | • | |
| mojibake filenames | █ | | | • | • | █ | █ | • | • | • | | | mojibake filenames | █ | | | • | • | █ | █ | • | • | • | | |
| undecodable filenames | █ | | | • | • | █ | | • | • | | | | undecodable filenames | █ | | | • | • | █ | | • | • | | | |
* `webdav` = protocol convenient for mounting a remote server as a local filesystem; see zeroconf: * `webdav` = protocol convenient for mounting a remote server as a local filesystem; see zeroconf:
* `zeroconf` = the server announces itself on the LAN, [automatically appearing](https://user-images.githubusercontent.com/241032/215344737-0eae8d98-9496-4256-9aa8-cd2f6971810d.png) on other zeroconf-capable devices * `zeroconf` = the server announces itself on the LAN, [automatically appearing](https://user-images.githubusercontent.com/241032/215344737-0eae8d98-9496-4256-9aa8-cd2f6971810d.png) on other zeroconf-capable devices
@@ -206,57 +219,62 @@ symbol legend,
* `a`/copyparty remarks: * `a`/copyparty remarks:
* extremely minimal samba/cifs server * extremely minimal samba/cifs server
* netscape 4 / ie6 support is mostly listed as a joke altho some people have actually found it useful ([ie4 tho](https://user-images.githubusercontent.com/241032/118192791-fb31fe00-b446-11eb-9647-898ea8efc1f7.png)) * netscape 4 / ie6 support is mostly listed as a joke altho some people have actually found it useful ([ie4 tho](https://user-images.githubusercontent.com/241032/118192791-fb31fe00-b446-11eb-9647-898ea8efc1f7.png))
* `l`/sftpgo translates mojibake filenames into valid utf-8 (information loss)
## server configuration ## server configuration
| feature / software | a | b | c | d | e | f | g | h | i | j | k | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| config from cmd args | █ | | | | | █ | █ | | | █ | | | config from cmd args | █ | | | | | █ | █ | | | █ | | |
| config files | █ | █ | █ | | | █ | | █ | | █ | • | | config files | █ | █ | █ | | | █ | | █ | | █ | • | |
| runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | | runtime config reload | █ | █ | █ | | | | | █ | █ | █ | █ | |
| same-port http / https | █ | | | | | | | | | | | | same-port http / https | █ | | | | | | | | | | | |
| listen multiple ports | █ | | | | | | | | | | | | listen multiple ports | █ | | | | | | | | | | | █ |
| virtual file system | █ | █ | █ | | | | █ | | | | | | virtual file system | █ | █ | █ | | | | █ | | | | | █ |
| reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | | reverse-proxy ok | █ | | █ | █ | █ | █ | █ | █ | • | • | • | █ |
| folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | | folder-rproxy ok | █ | | | | █ | █ | | • | • | • | • | |
* `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead * `folder-rproxy` = reverse-proxying without dedicating an entire (sub)domain, using a subfolder instead
* `l`/sftpgo:
* config: users must be added through gui / api calls
## server capabilities ## server capabilities
| feature / software | a | b | c | d | e | f | g | h | i | j | k | | feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | | ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | | accounts | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ | █ |
| single-sign-on | | | | | | | | | • | | | | per-account chroot | | | | | | | | | | | | |
| token auth | | | | █ | █ | | | | | | | | single-sign-on | | | | █ | █ | | | | • | | | |
| per-volume permissions | | | | █ | █ | | █ | | | | | | token auth | | | | █ | █ | | | █ | | | | |
| per-folder permissions | | | | █ | █ | | | | █ | █ | | | 2fa | | | | █ | █ | | | | | | | █ |
| per-file permissions | | | | █ | █ | | █ | | █ | | | | per-volume permissions | | | █ | █ | █ | █ | █ | | █ | █ | | |
| per-file passwords | | | | █ | █ | | █ | | █ | | | | per-folder permissions | | | | █ | █ | | █ | | █ | | | █ |
| unmap subfolders | | | | | | | █ | | | █ | | | per-file permissions | | | | █ | █ | | | | █ | | | |
| index.html blocks list | | | | | | | █ | | | • | | | per-file passwords | | | | █ | █ | | █ | | █ | | | |
| write-only folders | █ | | | | | | | | | | | | unmap subfolders | █ | | | | | | | | | █ | | |
| files stored as-is | █ | | | | | █ | | | | | | | index.html blocks list | | | | | | | █ | | | | | |
| file versioning | | | | █ | █ | | | | | | | | write-only folders | █ | | | | | | | | | | █ | █ |
| file encryption | | | | █ | █ | █ | | | | | | | files stored as-is | █ | █ | | █ | | █ | █ | | | | | |
| file indexing | █ | | | █ | █ | | | | | | | | file versioning | | | | █ | █ | | | | | | | |
| ┗ per-volume db | | | | | | | | | | | | | file encryption | | | | | | | | | | | | |
| ┗ db stored in folder | █ | | | | | | | • | • | █ | | | file indexing | █ | | | █ | █ | | | | █ | █ | | |
| ┗ db stored out-of-tree | █ | | | | | | | • | • | | | | ┗ per-volume db | █ | | | | | | | • | • | | | |
| ┗ existing file tree | █ | | █ | | | | | | | | | | ┗ db stored in folder | █ | | | | | | | • | • | █ | | |
| file action event hooks | █ | | | | | | | | | █ | | | ┗ db stored out-of-tree | █ | | █ | █ | | | | | • | █ | | |
| one-way folder sync | █ | | | | | | | | | | | | ┗ existing file tree | █ | | | | | | | | | █ | | |
| full sync | | | | █ | █ | | | | | | | | file action event hooks | █ | | | | | | | | | █ | | █ |
| speed throttle | | █ | █ | | | █ | | | | | | | one-way folder sync | █ | | | █ | █ | █ | | | | | | |
| anti-bruteforce | █ | █ | | █ | █ | | | | | | | | full sync | | | | █ | █ | | | | | | | |
| dyndns updater | | █ | | | | | | | | | | | speed throttle | | █ | | | | | | | █ | | | |
| self-updater | | | █ | | | | | | | | | | anti-bruteforce | █ | █ | █ | █ | █ | | | | | | | |
| log rotation | █ | | | | | | | • | █ | | | | dyndns updater | | █ | | | | | | | | | | |
| upload tracking / log | █ | █ | | | | | | | | | | | self-updater | | | █ | | | | | | | | | |
| curl-friendly ls | █ | | | | | | | | | | | | log rotation | █ | | █ | █ | | | | | | | | |
| curl-friendly upload | █ | | | | | | █ | | | | | | upload tracking / log | █ | | | █ | █ | | | █ | | | | |
| curl-friendly ls | █ | | | | | | | | | | | |
| curl-friendly upload | █ | | | | | █ | █ | • | | | | |
* `unmap subfolders` = "shadowing"; mounting a local folder in the middle of an existing filesystem tree in order to disable access below that path * `unmap subfolders` = "shadowing"; mounting a local folder in the middle of an existing filesystem tree in order to disable access below that path
* `files stored as-is` = uploaded files are trivially readable from the server HDD, not sliced into chunks or in weird folder structures or anything like that * `files stored as-is` = uploaded files are trivially readable from the server HDD, not sliced into chunks or in weird folder structures or anything like that
@@ -277,49 +295,52 @@ symbol legend,
* `k`/filegator remarks: * `k`/filegator remarks:
* `per-* permissions` -- can limit a user to one folder and its subfolders * `per-* permissions` -- can limit a user to one folder and its subfolders
* `unmap subfolders` -- can globally filter a list of paths * `unmap subfolders` -- can globally filter a list of paths
* `l`/sftpgo:
* `file action event hooks` also include on-download triggers
* `upload tracking / log` in main logfile
## client features
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ---------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| single-page app | █ | | █ | █ | █ | | | █ | █ | █ | █ | |
| themes | █ | █ | | █ | | | | | █ | | | |
| directory tree nav | █ | | | | █ | | | | █ | | | |
| multi-column sorting | █ | | | | | | | | | | | |
| thumbnails | █ | | | | | | | █ | █ | | | |
| ┗ image thumbnails | █ | | | █ | █ | | | █ | █ | █ | | |
| ┗ video thumbnails | █ | | | █ | █ | | | | █ | | | |
| ┗ audio spectrograms | █ | | | | | | | | | | | |
| audio player | █ | | | █ | █ | | | | █ | | | |
| ┗ gapless playback | █ | | | | | | | | • | | | |
| ┗ audio equalizer | █ | | | | | | | | | | | |
| ┗ waveform seekbar | █ | | | | | | | | | | | |
| ┗ OS integration | █ | | | | | | | | | | | |
| ┗ transcode to lossy | █ | | | | | | | | | | | |
| video player | █ | | | █ | █ | | | | █ | █ | | |
| ┗ video transcoding | | | | | | | | | █ | | | |
| audio BPM detector | █ | | | | | | | | | | | |
| audio key detector | █ | | | | | | | | | | | |
| search by path / name | █ | █ | █ | █ | █ | | █ | | █ | █ | | |
| search by date / size | █ | | | | █ | | | █ | █ | | | |
| search by bpm / key | █ | | | | | | | | | | | |
| search by custom tags | | | | | | | | █ | █ | | | |
| search in file contents | | | | █ | █ | | | | █ | | | |
| search by custom parser | █ | | | | | | | | | | | |
| find local file | █ | | | | | | | | | | | |
| undo recent uploads | █ | | | | | | | | | | | |
| create directories | █ | | | █ | █ | | █ | █ | █ | █ | █ | █ |
| image viewer | █ | | | █ | █ | | | | █ | █ | █ | |
| markdown viewer | █ | | | | █ | | | | █ | | | |
| markdown editor | █ | | | | █ | | | | █ | | | |
| readme.md in listing | █ | | | █ | | | | | | | | |
| rename files | █ | █ | █ | █ | █ | | █ | | █ | █ | █ | █ |
| batch rename | █ | | | | | | | | █ | | | |
| cut / paste files | █ | █ | | █ | █ | | | | █ | | | |
| move files | █ | █ | | █ | █ | | █ | | █ | █ | █ | |
| delete files | █ | █ | | █ | █ | | █ | █ | █ | █ | █ | █ |
| copy files | | | | | █ | | | | █ | █ | █ | |
* `single-page app` = multitasking; possible to continue navigating while uploading
* `audio player » os-integration` = use the [lockscreen](https://user-images.githubusercontent.com/241032/142711926-0700be6c-3e31-47b3-9928-53722221f722.png) or [media hotkeys](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) to play/pause, prev/next song
@@ -335,20 +356,21 @@ symbol legend,
## integration
| feature / software | a | b | c | d | e | f | g | h | i | j | k | l |
| ----------------------- | - | - | - | - | - | - | - | - | - | - | - | - |
| OS alert on upload | █ | | | | | | | | | | | |
| discord | █ | | | | | | | | | | | |
| ┗ announce uploads | █ | | | | | | | | | | | |
| ┗ custom embeds | | | | | | | | | | | | |
| sharex | █ | | | █ | | █ | | █ | | | | |
| flameshot | | | | | | █ | | | | | | |
* sharex `` = yes, but does not provide example sharex config
* `a`/copyparty remarks:
* `OS alert on upload` available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/notify.py)
* `discord » announce uploads` available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/discord-announce.py)
* `j`/filebrowser can probably pull those off with command runners similar to copyparty
* `l`/sftpgo has nothing built-in but is very extensible
## another matrix
@@ -366,6 +388,7 @@ symbol legend,
| kodbox | php | ░ gpl3 | 92 MB |
| filebrowser | go | █ apl2 | 20 MB |
| filegator | php | █ mit | • |
| sftpgo | go | ‼ agpl | 44 MB |
| updog | python | █ mit | 17 MB |
| goshs | go | █ mit | 11 MB |
| gimme-that | python | █ mit | 4.8 MB |
@@ -379,6 +402,7 @@ symbol legend,
# reviews
* ✅ are advantages over copyparty
* 💾 are what copyparty offers as an alternative
* 🔵 are similarities
* ⚠️ are disadvantages (something copyparty does "better")
@@ -412,10 +436,11 @@ symbol legend,
* ⚠️ http/webdav only; no ftp, zeroconf
* ⚠️ less awesome music player
* ⚠️ doesn't run on android or ipads
* ⚠️ AGPL licensed
* ✅ great ui/ux
* ✅ config gui
* ✅ apps (android / iphone)
* 💾 android upload-only app + iPhone upload shortcut
* ✅ more granular permissions (per-file)
* ✅ search: fulltext indexing of file contents
* ✅ webauthn passwordless authentication
@@ -430,10 +455,11 @@ symbol legend,
* ⚠️ http/webdav only; no ftp, zeroconf
* ⚠️ less awesome music player
* ⚠️ doesn't run on android or ipads
* ⚠️ AGPL licensed
* ✅ great ui/ux
* ✅ config gui
* ✅ apps (android / iphone)
* 💾 android upload-only app + iPhone upload shortcut
* ✅ more granular permissions (per-file)
* ✅ search: fulltext indexing of file contents
@@ -467,7 +493,7 @@ symbol legend,
* ✅ searchable image tags; delete by tag
* ✅ browser extension to upload files to the server
* ✅ reject uploads by file extension
* 💾 can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* ✅ token auth (api keys)
## [kodbox](https://github.com/kalcaddle/kodbox)
@@ -512,6 +538,30 @@ symbol legend,
* ⚠️ doesn't support crazy filenames
* ⚠️ limited file search
## [sftpgo](https://github.com/drakkan/sftpgo)
* go; cross-platform (windows, linux, mac)
* ⚠️ http uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* 🔵 sftp uploads are resumable
* ⚠️ web UI is very minimal + a bit slow
* ⚠️ no thumbnails / image viewer / audio player
* ⚠️ basic file manager (no cut/paste/move)
* ⚠️ no filesystem indexing / search
* ⚠️ doesn't run on phones, tablets
* ⚠️ no zeroconf (mdns/ssdp)
* ⚠️ AGPL licensed
* 🔵 ftp, ftps, webdav
* ✅ sftp server
* ✅ settings gui
* ✅ acme (automatic tls certs)
* 💾 relies on caddy/certbot/acme.sh
* ✅ at-rest encryption
* 💾 relies on LUKS/BitLocker
* ✅ can use S3/GCS as storage backend
* 💾 relies on rclone-mount (see the sketch below this list)
* ✅ on-download event hook (otherwise same as copyparty)
* ✅ more extensive permissions control
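
For the `relies on rclone-mount` note above, a rough sketch of the idea (untested; the remote name, bucket, paths and permission flags are illustrative): rclone exposes the S3 bucket as a local folder, and copyparty simply serves that folder.

```bash
# mount an S3 bucket locally (assumes an rclone remote named "s3" is already configured)
rclone mount s3:my-bucket /mnt/s3 --vfs-cache-mode writes &

# serve the mountpoint; -v maps a local path to a URL path with permissions (rw = anyone may read/write)
python3 copyparty-sfx.py -v /mnt/s3:/s3:rw
```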
## [updog](https://github.com/sc0tfree/updog)
* python; cross-platform
* basic directory listing with upload feature
@@ -526,7 +576,7 @@ symbol legend,
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ✅ cool clipboard widget
* 💾 the markdown editor is an ok substitute
* 🔵 read-only and upload-only modes (same as copyparty's write-only)
* 🔵 https, webdav, but no ftp
@@ -538,7 +588,7 @@ symbol legend,
* ⚠️ weird folder structure for uploads
* ✅ clamav antivirus check on upload! neat
* 🔵 optional max-filesize, os-notification on uploads
* 💾 os-notification available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/notify.py)
## [ass](https://github.com/tycrek/ass)
* nodejs; recommends docker
@@ -549,12 +599,13 @@ symbol legend,
* ⚠️ on cloudflare: max upload size 100 MiB
* ✅ token auth
* ✅ gps metadata stripping
* 💾 possible with [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/image-noexif.py)
* ✅ discord integration (custom embeds, upload webhook)
* 💾 [upload webhook plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/discord-announce.py)
* ✅ reject uploads by mimetype
* 💾 can reject uploads [by extension](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-extension.py) or [mimetype](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reject-mimetype.py) using plugins
* ✅ can use S3 as storage backend
* 💾 relies on rclone-mount
* ✅ custom 404 pages
## [linx](https://github.com/ZizzyDizzyMC/linx-server/)
@@ -564,12 +615,13 @@ symbol legend,
* 🔵 some of its unique features have been added to copyparty as former linx users have migrated
* file expiration timers, filename randomization
* ✅ password-protected files
* 💾 password-protected folders + filekeys to skip the folder password seem to cover most usecases
* ✅ file deletion keys
* ✅ download files as torrents
* ✅ remote uploads (send a link to the server and it downloads it)
* 💾 available as [a plugin](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/wget.py)
* ✅ can use S3 as storage backend
* 💾 relies on rclone-mount
## [h5ai](https://larsjung.de/h5ai/)
* ⚠️ read only; no upload/move/delete
@@ -581,7 +633,16 @@ symbol legend,
## [autoindex](https://github.com/nielsAD/autoindex)
* ⚠️ read only; no upload/move/delete
* ✅ directory cache for faster browsing of cloud storage
* 💾 local index/cache for recursive search (names/attrs/tags), but not for browsing
## [miniserve](https://github.com/svenstaro/miniserve)
* rust; cross-platform (windows, linux, mac)
* ⚠️ uploads not resumable / accelerated / integrity-checked
* ⚠️ on cloudflare: max upload size 100 MiB
* ⚠️ no thumbnails / image viewer / audio player / file manager
* ⚠️ no filesystem indexing / search
* 🔵 upload, tar/zip download, qr-code
* ✅ faster at loading huge folders
# briefly considered

flake.lock

@@ -0,0 +1,42 @@
{
"nodes": {
"flake-utils": {
"locked": {
"lastModified": 1678901627,
"narHash": "sha256-U02riOqrKKzwjsxc/400XnElV+UtPUQWpANPlyazjH0=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "93a2b84fc4b70d9e089d029deacc3583435c2ed6",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1680334310,
"narHash": "sha256-ISWz16oGxBhF7wqAxefMPwFag6SlsA9up8muV79V9ck=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "884e3b68be02ff9d61a042bc9bd9dd2a358f95da",
"type": "github"
},
"original": {
"id": "nixpkgs",
"ref": "nixos-22.11",
"type": "indirect"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
}
},
"root": "root",
"version": 7
}

flake.nix

@@ -0,0 +1,28 @@
{
  inputs = {
    nixpkgs.url = "nixpkgs/nixos-22.11";
    flake-utils.url = "github:numtide/flake-utils";
  };
  outputs = { self, nixpkgs, flake-utils }:
    {
      nixosModules.default = ./contrib/nixos/modules/copyparty.nix;
      overlays.default = self: super: {
        copyparty =
          self.python3.pkgs.callPackage ./contrib/package/nix/copyparty {
            ffmpeg = self.ffmpeg-full;
          };
      };
    } // flake-utils.lib.eachDefaultSystem (system:
      let
        pkgs = import nixpkgs {
          inherit system;
          overlays = [ self.overlays.default ];
        };
      in {
        packages = {
          inherit (pkgs) copyparty;
          default = self.packages.${system}.copyparty;
        };
      });
}
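
Assuming the flake above is checked out locally, its outputs can be consumed roughly like this (a sketch based on the attribute names shown above; `nix run` additionally assumes the package exposes a default executable):

```bash
# build the copyparty package defined by the flake
nix build .#copyparty

# or run it straight from the checkout
nix run .#copyparty -- --help
```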


@@ -3,12 +3,21 @@ FROM alpine:3.16
WORKDIR /z
ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
ver_hashwasm=4.9.0 \
ver_marked=4.3.0 \
ver_mde=2.18.0 \
ver_codemirror=5.65.12 \
ver_fontawesome=5.13.0 \
ver_prism=1.29.0 \
ver_zopfli=1.0.3
# versioncheck:
# https://github.com/markedjs/marked/releases
# https://github.com/Ionaru/easy-markdown-editor/tags
# https://github.com/codemirror/codemirror5/releases
# https://github.com/Daninet/hash-wasm/releases
# https://github.com/openpgpjs/asmcrypto.js
# https://github.com/google/zopfli/tags
# download;
# the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
@@ -22,6 +31,7 @@ RUN mkdir -p /z/dist/no-pk \
&& wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
&& wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
&& wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
&& wget https://github.com/PrismJS/prism/archive/refs/tags/v$ver_prism.tar.gz -O prism.tgz \
&& (mkdir hash-wasm \
&& cd hash-wasm \
&& unzip ../hash-wasm.zip) \
@@ -39,14 +49,11 @@ RUN mkdir -p /z/dist/no-pk \
&& cd easy-markdown-editor* \
&& npm install \
&& npm i gulp-cli -g ) \
&& tar -xf prism.tgz \
&& unzip fontawesome.zip \
&& tar -xf zopfli.tgz
# todo
# https://prismjs.com/download.html#themes=prism-funky&languages=markup+css+clike+javascript+autohotkey+bash+basic+batch+c+csharp+cpp+cmake+diff+docker+go+ini+java+json+kotlin+latex+less+lisp+lua+makefile+objectivec+perl+powershell+python+r+jsx+ruby+rust+sass+scss+sql+swift+systemd+toml+typescript+vbnet+verilog+vhdl+yaml&plugins=line-highlight+line-numbers+autolinker
# build fonttools (which needs zopfli)
RUN tar -xf zopfli.tgz \
&& cd zopfli* \
@@ -121,6 +128,12 @@ COPY shiftbase.py /z
RUN /bin/ash /z/mini-fa.sh
# build prismjs
COPY genprism.py /z
COPY genprism.sh /z
RUN ./genprism.sh $ver_prism
# compress
COPY zopfli.makefile /z/dist/Makefile
RUN cd /z/dist \


@@ -1,10 +1,9 @@
self := $(dir $(abspath $(lastword $(MAKEFILE_LIST))))
vend := $(self)/../../copyparty/web/deps
# prefers podman-docker (optionally rootless) over actual docker/moby
all:
-service docker start
-systemctl start docker
docker build -t build-copyparty-deps .
rm -rf $(vend)
@@ -14,6 +13,7 @@ all:
docker run --rm -i build-copyparty-deps:latest | \
tar -xvC $(vend) --strip-components=1
touch $(vend)/__init__.py
chown -R `stat $(self) -c %u:%g` $(vend)
purge:

scripts/deps-docker/genprism.py

@@ -0,0 +1,198 @@
#!/usr/bin/env python3
# author: @chinponya
import argparse
import json
from pathlib import Path
from urllib.parse import urlparse, parse_qsl


def read_json(path):
    return json.loads(path.read_text())


def get_prism_version(prism_path):
    package_json_path = prism_path / "package.json"
    package_json = read_json(package_json_path)
    return package_json["version"]


def get_prism_components(prism_path):
    components_json_path = prism_path / "components.json"
    components_json = read_json(components_json_path)
    return components_json


def parse_prism_configuration(url_str):
    url = urlparse(url_str)
    # prism.com uses a non-standard query string-like encoding
    query = {k: v.split(" ") for k, v in parse_qsl(url.fragment)}
    return query


def paths_of_component(prism_path, kind, components, name, minified):
    component = components[kind][name]
    meta = components[kind]["meta"]
    path_format = meta["path"]
    path_base = prism_path / path_format.replace("{id}", name)

    if isinstance(component, str):
        # 'core' component has a different shape, so we convert it to be consistent
        component = {"title": component}

    if meta.get("noCSS") or component.get("noCSS"):
        extensions = ["js"]
    elif kind == "themes":
        extensions = ["css"]
    else:
        extensions = ["js", "css"]

    if path_base.is_dir():
        result = {ext: path_base / f"{name}.{ext}" for ext in extensions}
    elif path_base.suffix:
        ext = path_base.suffix.replace(".", "")
        result = {ext: path_base}
    else:
        result = {ext: path_base.with_suffix(f".{ext}") for ext in extensions}

    if minified:
        result = {
            ext: path.with_suffix(".min" + path.suffix) for ext, path in result.items()
        }

    return result


def read_component_contents(kv_paths):
    return {k: path.read_text() for k, path in kv_paths.items()}


def get_language_dependencies(components, name):
    dependencies = components["languages"][name].get("require")
    if isinstance(dependencies, list):
        return dependencies
    elif isinstance(dependencies, str):
        return [dependencies]
    else:
        return []


def make_header(prism_path, url):
    version = get_prism_version(prism_path)
    header = f"/* PrismJS {version}\n{url} */"
    return {"js": header, "css": header}


def make_core(prism_path, components, minified):
    kv_paths = paths_of_component(prism_path, "core", components, "core", minified)
    return read_component_contents(kv_paths)


def make_theme(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "themes", components, name, minified)
    return read_component_contents(kv_paths)


def make_language(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "languages", components, name, minified)
    return read_component_contents(kv_paths)


def make_languages(prism_path, components, names, minified):
    names_with_dependencies = sum(
        ([*get_language_dependencies(components, name), name] for name in names), []
    )
    seen = set()
    names_with_dependencies = [
        x for x in names_with_dependencies if not (x in seen or seen.add(x))
    ]
    kv_code = [
        make_language(prism_path, components, name, minified)
        for name in names_with_dependencies
    ]
    return kv_code


def make_plugin(prism_path, components, name, minified):
    kv_paths = paths_of_component(prism_path, "plugins", components, name, minified)
    return read_component_contents(kv_paths)


def make_plugins(prism_path, components, names, minified):
    kv_code = [make_plugin(prism_path, components, name, minified) for name in names]
    return kv_code


def make_code(prism_path, url, minified):
    components = get_prism_components(prism_path)
    configuration = parse_prism_configuration(url)
    theme_name = configuration["themes"][0]
    code = [
        make_header(prism_path, url),
        make_core(prism_path, components, minified),
        make_theme(prism_path, components, theme_name, minified),
    ]
    if configuration.get("languages"):
        code.extend(
            make_languages(prism_path, components, configuration["languages"], minified)
        )
    if configuration.get("plugins"):
        code.extend(
            make_plugins(prism_path, components, configuration["plugins"], minified)
        )
    return code


def join_code(kv_code):
    result = {"js": "", "css": ""}
    for row in kv_code:
        for key, code in row.items():
            result[key] += code
            result[key] += "\n"
    return result


def write_code(kv_code, js_out, css_out):
    code = join_code(kv_code)
    with js_out.open("w") as f:
        f.write(code["js"])
        print(f"written {js_out}")
    with css_out.open("w") as f:
        f.write(code["css"])
        print(f"written {css_out}")


def parse_args():
    # fmt: off
    parser = argparse.ArgumentParser()
    parser.add_argument("url", help="configured prism download url")
    parser.add_argument("--dir", type=Path, default=Path("."), help="prism repo directory")
    parser.add_argument("--minify", default=True, action=argparse.BooleanOptionalAction, help="use minified files",)
    parser.add_argument("--js-out", type=Path, default=Path("prism.js"), help="JS output file path")
    parser.add_argument("--css-out", type=Path, default=Path("prism.css"), help="CSS output file path")
    # fmt: on
    args = parser.parse_args()
    return args


def main():
    args = parse_args()
    code = make_code(args.dir, args.url, args.minify)
    write_code(code, args.js_out, args.css_out)


if __name__ == "__main__":
    main()
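
For reference, a hypothetical standalone invocation of the script above (the docker build drives it through genprism.sh below); the URL is whatever configuration was picked on the prism download page:

```bash
./genprism.py --dir prism-1.29.0 \
  "https://prismjs.com/download.html#themes=prism&languages=markup+css+javascript&plugins=line-numbers"
```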

scripts/deps-docker/genprism.sh

@@ -0,0 +1,66 @@
#!/bin/bash
set -e
langs=(
markup
css
clike
javascript
autohotkey
bash
basic
batch
c
csharp
cpp
cmake
diff
docker
elixir
glsl
go
ini
java
json
kotlin
latex
less
lisp
lua
makefile
matlab
moonscript
nim
objectivec
perl
powershell
python
r
jsx
ruby
rust
sass
scss
sql
swift
systemd
toml
typescript
vbnet
verilog
vhdl
yaml
zig
)
slangs="${langs[*]}"
slangs="${slangs// /+}"
for theme in prism-funky prism ; do
u="https://prismjs.com/download.html#themes=$theme&languages=$slangs&plugins=line-highlight+line-numbers+autolinker"
echo "$u"
./genprism.py --dir prism-$1 --js-out prism.js --css-out $theme.css "$u"
done
mv prism-funky.css prismd.css
mv prismd.css prism.css prism.js /z/dist/


@@ -1,5 +1,5 @@
diff --git a/src/Lexer.js b/src/Lexer.js
adds linetracking to marked.js v4.3.0;
add data-ln="%d" to most tags, %d is the source markdown line
--- a/src/Lexer.js
+++ b/src/Lexer.js
@@ -206,7 +206,6 @@ index a22a2bc..884ad66 100644
// Run any renderer extensions
if (this.options.extensions && this.options.extensions.renderers && this.options.extensions.renderers[token.type]) {
diff --git a/src/Renderer.js b/src/Renderer.js
index 7c36a75..aa1a53a 100644
--- a/src/Renderer.js
+++ b/src/Renderer.js
@@ -11,6 +11,12 @@ export class Renderer {
@@ -290,10 +289,9 @@ index 7c36a75..aa1a53a 100644
if (title) {
out += ` title="${title}"`;
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
index e8a69b6..2cc772b 100644
--- a/src/Tokenizer.js
+++ b/src/Tokenizer.js
@@ -333,4 +333,7 @@ export class Tokenizer {
const l = list.items.length;
+ // each nested list gets +1 ahead; this hack makes every listgroup -1 but atleast it doesn't get infinitely bad


@@ -1,4 +1,5 @@
diff --git a/src/Lexer.js b/src/Lexer.js
strip some features
--- a/src/Lexer.js
+++ b/src/Lexer.js
@@ -7,5 +7,5 @@ import { repeatString } from './helpers.js';
@@ -56,7 +57,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
--- a/src/Tokenizer.js
+++ b/src/Tokenizer.js
@@ -367,14 +367,7 @@ export class Tokenizer {
type: 'html',
raw: cap[0],
- pre: !this.options.sanitizer
@@ -72,7 +73,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
- }
return token;
}
@@ -517,15 +510,9 @@ export class Tokenizer {
return {
- type: this.options.sanitize
@@ -90,7 +91,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
+ text: cap[0]
};
}
@@ -714,10 +701,10 @@ export class Tokenizer {
}
- autolink(src, mangle) {
@@ -103,7 +104,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
+ text = escape(cap[1]);
href = 'mailto:' + text;
} else {
@@ -742,10 +729,10 @@ export class Tokenizer {
}
- url(src, mangle) {
@@ -116,7 +117,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
+ text = escape(cap[0]);
href = 'mailto:' + text;
} else {
@@ -779,12 +766,12 @@ export class Tokenizer {
}
- inlineText(src, smartypants) {
@@ -135,8 +136,8 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
diff --git a/src/defaults.js b/src/defaults.js
--- a/src/defaults.js
+++ b/src/defaults.js
@@ -11,11 +11,7 @@ export function getDefaults() {
hooks: null,
langPrefix: 'language-',
- mangle: true,
pedantic: false,
@@ -170,7 +171,7 @@ diff --git a/src/helpers.js b/src/helpers.js
+export function cleanUrl(base, href) {
if (base && !originIndependentUrl.test(href)) {
href = resolveUrl(base, href);
@@ -233,10 +220,4 @@ export function findClosingBracket(str, b) {
}
-export function checkSanitizeDeprecation(opt) {
@@ -185,30 +186,25 @@ diff --git a/src/marked.js b/src/marked.js
--- a/src/marked.js
+++ b/src/marked.js
@@ -7,5 +7,4 @@ import { Slugger } from './Slugger.js';
import { Hooks } from './Hooks.js';
import {
merge,
- checkSanitizeDeprecation,
escape
} from './helpers.js';
@@ -18,5 +17,5 @@ import {
function onError(silent, async, callback) {
return (e) => {
- checkSanitizeDeprecation(opt);
if (callback) {
@@ -318,5 +316,4 @@ marked.parseInline = function(src, opt) {
opt = merge({}, marked.defaults, opt || {});
- checkSanitizeDeprecation(opt);
try {
@@ -327,5 +324,5 @@ marked.parseInline = function(src, opt) {
return Parser.parseInline(tokens, opt);
} catch (e) {
- e.message += '\nPlease report this to https://github.com/markedjs/marked.';
+ e.message += '\nmake issue @ https://github.com/9001/copyparty';
if (opt.silent) {
if (silent) {
@@ -65,6 +64,4 @@ function parseMarkdown(lexer, parser) {
}
- checkSanitizeDeprecation(opt);
-
if (opt.hooks) {
opt.hooks.options = opt;
diff --git a/test/bench.js b/test/bench.js
--- a/test/bench.js
+++ b/test/bench.js
@@ -250,70 +246,70 @@ diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
--- a/test/unit/Lexer-spec.js
+++ b/test/unit/Lexer-spec.js
@@ -794,5 +794,5 @@ paragraph
});
- it('sanitize', () => {
+ /*it('sanitize', () => {
expectTokens({
md: '<div>html</div>',
@@ -812,5 +812,5 @@ paragraph
]
});
- });
+ });*/
});
@@ -892,5 +892,5 @@ paragraph
});
- it('html sanitize', () => {
+ /*it('html sanitize', () => {
expectInlineTokens({
md: '<div>html</div>',
@@ -900,5 +900,5 @@ paragraph
]
});
- });
+ });*/
it('link', () => {
@@ -1211,5 +1211,5 @@ paragraph
});
- it('autolink mangle email', () => {
+ /*it('autolink mangle email', () => {
expectInlineTokens({
md: '<test@example.com>',
@@ -1231,5 +1231,5 @@ paragraph
]
});
- });
+ });*/
it('url', () => {
@@ -1268,5 +1268,5 @@ paragraph
});
- it('url mangle email', () => {
+ /*it('url mangle email', () => {
expectInlineTokens({
md: 'test@example.com',
@@ -1288,5 +1288,5 @@ paragraph
]
});
- });
+ });*/
});
@@ -1304,5 +1304,5 @@ paragraph
});
- describe('smartypants', () => {
+ /*describe('smartypants', () => {
it('single quotes', () => {
expectInlineTokens({
@@ -1374,5 +1374,5 @@ paragraph
});
});
- });


@@ -14,6 +14,7 @@ docker run --rm -it -u 1000 -p 3923:3923 -v /mnt/nas:/w -v $PWD/cfgdir:/cfg copy
* `/cfg` is an optional folder with zero or more config files (*.conf) to load
* `copyparty/ac` is the recommended [image edition](#editions)
* you can download the image from github instead by replacing `copyparty/ac` with `ghcr.io/9001/copyparty-ac`
* if you are using rootless podman, remove `-u 1000`
i'm unfamiliar with docker-compose and alternatives so let me know if this section could be better 🙏
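
Putting the notes above together, a rootless-podman equivalent of the docker example would look something like this (untested; same flags as the docker command, just without `-u 1000`):

```bash
podman run --rm -it -p 3923:3923 -v /mnt/nas:/w -v $PWD/cfgdir:/cfg copyparty/ac
```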


@@ -40,4 +40,10 @@ update_arch_pkgbuild() {
rm -rf x
}
update_nixos_pin() {
( cd $self/../contrib/package/nix/copyparty;
./update.py $self/../dist/copyparty-sfx.py )
}
update_arch_pkgbuild
update_nixos_pin


@@ -9,11 +9,14 @@ tee build2.sh | cmp build.sh && rm build2.sh || {
[[ $r =~ [yY] ]] && mv build{2,}.sh && exec ./build.sh
}
[ -e up2k.sh ] && ./up2k.sh
uname -s | grep WOW64 && m=64 || m=32
uname -s | grep NT-10 && w10=1 || w7=1
[ $w7 ] && pyv=37 || pyv=311
esuf=
[ $w7 ] && [ $m = 32 ] && esuf=32
[ $w7 ] && [ $m = 64 ] && esuf=-winpe64
appd=$(cygpath.exe "$APPDATA")
spkgs=$appd/Python/Python$pyv/site-packages
@@ -59,8 +62,7 @@ read a b c d _ < <(
sed -r 's/[^0-9]+//;s/[" )]//g;s/[-,]/ /g;s/$/ 0/'
)
sed -r 's/1,2,3,0/'$a,$b,$c,$d'/;s/1\.2\.3/'$a.$b.$c/ <loader.rc >loader.rc2
[ $m ] && sed -ri s/copyparty.exe/copyparty$esuf.exe/ loader.rc2
sed -ri 's/copyparty.exe/copyparty32.exe/' loader.rc2
excl=(
copyparty.broker_mp
@@ -105,4 +107,4 @@ base64 | head -c12 >> dist/copyparty.exe
dist/copyparty.exe --version
curl -fkT dist/copyparty.exe -b cppwd=wark https://192.168.123.1:3923/copyparty$esuf.exe


@@ -1,9 +1,9 @@
d5510a24cb5e15d6d30677335bbc7624c319b371c0513981843dc51d9b3a1e027661096dfcfc540634222bb2634be6db55bf95185b30133cb884f1e47652cf53 altgraph-0.17.3-py2.py3-none-any.whl
eda6c38fc4d813fee897e969ff9ecc5acc613df755ae63df0392217bbd67408b5c1f6c676f2bf5497b772a3eb4e1a360e1245e1c16ee83f0af555f1ab82c3977 Git-2.39.1-32-bit.exe
17ce52ba50692a9d964f57a23ac163fb74c77fdeb2ca988a6d439ae1fe91955ff43730c073af97a7b3223093ffea3479a996b9b50ee7fba0869247a56f74baa6 pefile-2023.2.7-py3-none-any.whl
d68c78bc83f4f48c604912b2d1ca4772b0e6ed676cd2eb439411e0a74d63fe215aac93dd9dab04ed341909a4a6a1efc13ec982516e3cb0fc7c355055e63d9178 pyinstaller-5.10.1-py3-none-win32.whl
fe62705893c86eeb2d5b841da8debe05dedda98364dec190b487e718caad8a8735503bf93739a7a27ea793a835bf976fb919ceec1424b8fc550b936bae4a54e9 pyinstaller-5.10.1-py3-none-win_amd64.whl
61c543983ff67e2bdff94d2d6198023679437363db8c660fa81683aff87c5928cd800720488e18d09be89fe45d6ab99be3ccb912cb2e03e2bca385b4338e1e42 pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
132a5380f33a245f2e744413a0e1090bc42b7356376de5121397cec5976b04b79f7c9ebe28af222c9c7b01461f7d7920810d220e337694727e0d7cd9e91fa667 pywin32_ctypes-0.2.0-py2.py3-none-any.whl
3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
4b6e9ae967a769fe32be8cf0bc0d5a213b138d1e0344e97656d08a3d15578d81c06c45b334c872009db2db8f39db0c77c94ff6c35168d5e13801917667c08678 upx-4.0.2-win32.zip
@@ -26,5 +26,5 @@ ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a250675
00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
b1db6f5a79fc15391547643e5973cf5946c0acfa6febb68bc90fc3f66369681100cc100f32dd04256dcefa510e7864c718515a436a4af3a10fe205c413c7e693 MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl
4a20aeb52d4fde6aabcba05ee261595eeb5482c72ee27332690f34dd6e7a49c0b3ba3813202ac15c9d21e29f1cd803f2e79ccc1c45ec314fcd0a937016bcbc56 mutagen-1.46.0-py3-none-any.whl
78414808cb9a5fa74e7b23360b8f46147952530e3cc78a3ad4b80be3e26598080537ac691a1be1f35b7428a22c1f65a6adf45986da2752fbe9d9819d77a58bf8 Pillow-9.5.0-cp311-cp311-win_amd64.whl
4b7711b950858f459d47145b88ccde659279c6af47144d58a1c54ea2ce4b80ec43eb7f69c68f12f8f6bc54c86a44e77441993257f7ad43aab364655de5c51bb1 python-3.11.2-amd64.exe


@@ -30,6 +30,9 @@ if possible, for performance and security reasons, please use this instead:
https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
"""
if sys.maxsize > 2 ** 32:
v = v.replace("32-bit", "64-bit")
try:
print(v.replace("\n", "\n▒▌ ")[1:] + "\n")
except:


@@ -17,15 +17,15 @@ uname -s | grep NT-10 && w10=1 || {
fns=(
altgraph-0.17.3-py2.py3-none-any.whl
pefile-2023.2.7-py3-none-any.whl
pyinstaller-5.10.1-py3-none-win_amd64.whl
pyinstaller_hooks_contrib-2023.2-py2.py3-none-any.whl
pywin32_ctypes-0.2.0-py2.py3-none-any.whl
upx-4.0.2-win32.zip
)
[ $w10 ] && fns+=(
mutagen-1.46.0-py3-none-any.whl
Pillow-9.4.0-cp311-cp311-win_amd64.whl
python-3.11.3-amd64.exe
}
[ $w7 ] && fns+=(
certifi-2022.12.7-py3-none-any.whl
@@ -43,12 +43,12 @@ fns=(
)
[ $w7x64 ] && fns+=(
windows6.1-kb2533623-x64.msu
pyinstaller-5.10.1-py3-none-win_amd64.whl
python-3.7.9-amd64.exe
)
[ $w7x32 ] && fns+=(
windows6.1-kb2533623-x86.msu
pyinstaller-5.10.1-py3-none-win32.whl
python-3.7.9.exe
)
dl() { curl -fkLOC- "$1" && return 0; echo "$1"; return 1; }


@@ -16,7 +16,9 @@ cat $f | awk '
h=0
};
};
/^#/{s=1;rs=0;pr()}
/^#* *(nix package)/{rs=1}
/^#* *(install on android|dev env setup|just the sfx|complete release|optional gpl stuff|nixos module)|`$/{s=rs}
/^#/{
lv=length($1);
sub(/[^ ]+ /,"");


@@ -98,7 +98,7 @@ class Cfg(Namespace):
def __init__(self, a=None, v=None, c=None):
ka = {}
ex = "daw dav_inf dav_mac dotsrch e2d e2ds e2dsa e2t e2ts e2tsr e2v e2vu e2vp ed emp force_js getmod hardlink ih ihead magic never_symlink nid nih no_acode no_athumb no_dav no_dedup no_del no_dupe no_logues no_mv no_readme no_robots no_sb_md no_sb_lg no_scandir no_thumb no_vthumb no_zip nrand nw rand vc xdev xlink xvol"
ka.update(**{k: False for k in ex.split()})
ex = "dotpart no_rescan no_sendfile no_voldump plain_ip"