Compare commits


53 Commits

Author SHA1 Message Date
ed
fff45552da v1.17.0 2025-04-26 21:49:09 +00:00
ed
95157d02c9 ie11 can't sandbox; add minimal fallback 2025-04-26 20:14:23 +00:00
ed
3090c74832 ie11: fix debounce-untint;
css 'unset' appeared in chr41, ff27

dom.closest appeared in chr41, ff35
2025-04-26 19:57:59 +00:00
ed
4195762d2a playlist: when lacking perms, s/edit/view/ 2025-04-26 19:28:12 +00:00
ed
dc3b7a2720 reduce --th-ram-max floor;
helps avoid oom in a vm with 512 MiB ram
2025-04-26 19:06:32 +00:00
ed
ad200f2b97 add ui for creating playlists 2025-04-26 00:19:41 +00:00
ed
897f9d328d audioplayer: load and play m3u8 playlists 2025-04-25 22:33:00 +00:00
ed
efbe34f29d readme: mention basic-auth behavior 2025-04-25 18:57:12 +00:00
ed
dbfc899d79 pw-hash tweaks (#159):
* do not take lock on shares-db / sessions-db when running with
   `--ah-gen` or `--ah-cli` (allows a 2nd instance for that purpose)

* add options to print effective salt for ah/fk/dk; useful for nixos
   and other usecases where config is derived or otherwise opaque
2025-04-25 18:12:35 +00:00
ed
74fb4b0cb8 fix --u2j helptext:
* mention potential hdd-bottleneck from big values
* most browsers enforce a max-value of 6 (c354a38b)
* chunk-stitching (132a8350) made this less important;
   still beneficial, but only to a point
2025-04-24 20:51:45 +00:00
ed
68e7000275 update pkgs to 1.16.21 2025-04-20 19:19:35 +00:00
ed
38c2dcce3e v1.16.21 2025-04-20 18:36:32 +00:00
ed
5b3a5fe76b show warning on ctrl-a in lazyloaded folders 2025-04-20 13:33:01 +00:00
ed
d5a9bd80b2 docker: hide healthcheck from logs 2025-04-20 12:26:56 +00:00
ed
71c5565949 add button to loop/repeat music; closes #156 2025-04-20 11:45:48 +00:00
ed
db33d68d42 zip-download: eagerly 64bit data-descriptors; closes #155
this avoids a false-positive in the info-zip unzip zipbomb detector.

unfortunately,

* now impossible to extract large (4 GiB) zipfiles using old software
   (WinXP, macos 10.12)

* now less viable to stream download-as-zip into a zipfile unpacker
   (please use download-as-tar for that purpose)

context:

the zipfile specification (APPNOTE.TXT) is slightly ambiguous as to when
data-descriptor (0x504b0708) filesize-fields change from 32bit to 64bit;
both copyparty and libarchive independently made the same interpretation
that this is only when the local header is zip64, AND the size-fields
are both 0xFFFFFFFF. This makes sense because the data descriptor is
only necessary when that particular file-to-be-added exceeds 4 GiB,
and/or when the crc32 is not known ahead of time.

another interpretation, seen in an early version of the patchset
to fix CVE-2019-13232 (zip-bombs) in the info-zip unzip command,
believes the only requirement is that the local header is zip64.

in many linux distributions, the unzip command would thus fail on
zipfiles created by copyparty, since they (by default) satisfy
the three requirements to hit the zipbomb false-positive:

* total filesize exceeds 4 GiB, and...
* a mix of regular (32bit) and zip64 entries, and...
* streaming-mode zipfile (not made with ?zip=crc)

this issue no longer exists in a more recent version of that patchset,
https://github.com/madler/unzip/commit/af0d07f95809653b
but this fix has not yet made it into most linux distros
2025-04-17 18:52:47 +00:00
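for illustration, a minimal sketch (not copyparty's actual code) of the two data-descriptor layouts in question; the signature and field widths follow APPNOTE.TXT:

```python
import struct

DD_SIG = 0x08074B50  # the "PK\x07\x08" data-descriptor signature

def dd32(crc: int, csize: int, usize: int) -> bytes:
    # legacy form: 32-bit size fields, only safe below 4 GiB
    return struct.pack("<LLLL", DD_SIG, crc, csize, usize)

def dd64(crc: int, csize: int, usize: int) -> bytes:
    # zip64 form: 64-bit size fields; per this commit,
    # written eagerly for every streamed entry
    return struct.pack("<LLQQ", DD_SIG, crc, csize, usize)
```

note that the two forms are not self-describing -- an unpacker must guess which one follows each file's data, which is exactly the spec ambiguity described above.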
ed
e1c20c7a18 readme: mention bootable flashdrive / cdrom 2025-04-17 18:45:50 +00:00
ed
d3f1b45ce3 update pkgs to 1.16.20 2025-04-13 22:32:06 +00:00
ed
c7aa1a3558 v1.16.20 2025-04-13 21:51:39 +00:00
ed
7b2bd6da83 fix sorting of japanese folders
directory-tree sidebar did not sort correctly for non-ascii names

also fix a natural-sort bug; it only took effect for the
initial folder load, and not when changing the sort-order

also, natural-sort will now apply to all non-numeric fields,
not just the filename like before
2025-04-13 21:11:07 +00:00
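the natural-sort idea can be sketched like this (a rough illustration, not the actual web-ui code, which additionally needs locale-aware comparison for the non-ascii case):

```python
import re

def natkey(s: str):
    # split into digit / non-digit runs; numbers compare numerically,
    # everything else compares case-insensitively
    return [(0, int(t)) if t.isdigit() else (1, t.casefold())
            for t in re.split(r"(\d+)", s)]

names = ["Track10.opus", "track2.opus", "track1.opus"]
print(sorted(names, key=natkey))
# plain string sorting would put Track10 first
```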
ed
2bd955ba9f race-the-beam: improve phrasing 2025-04-13 18:51:45 +00:00
ed
98dcaee210 workaround ffmpeg-bug 10797
reduces ram usage from 1534 to 230 MiB when generating spectrograms
of files which are decoded by libopenmpt, so most s3xmodit formats
2025-04-13 18:51:35 +00:00
ed
361aebf877 warn on zeroconf with uds-only 2025-04-13 16:38:29 +00:00
ed
ffc1610980 dont crash if qrcode + mdns + uds 2025-04-13 16:11:36 +00:00
ed
233075aee7 ctrl-a selects all files in gridview too 2025-04-13 16:09:49 +00:00
ed
d1a4d335df increase treenav scroll-margins
was too small in deep folders, and/or long foldernames
2025-04-13 16:09:14 +00:00
ed
96acbd3593 cleanup
* remove cpr bonk (deadcode)
* remove get_vpath (wasteful)
2025-04-13 16:08:44 +00:00
thaddeus kuah
4b876dd133 full lowercase on login button to match the page
Signed-off-by: thaddeus kuah <tk@tkkr.dev>
2025-04-11 23:56:51 +02:00
ed
a06c5eb048 new xau hook: podcast-normalizer.py 2025-04-09 19:44:13 +00:00
ed
c9cdc3e1c1 update pkgs to 1.16.19 2025-04-08 21:52:43 +00:00
ed
c0becc6418 v1.16.19 2025-04-08 21:32:51 +00:00
ed
b17ccc38ee prefer XDG_CONFIG_HOME on all platforms
convenient escape-hatch
2025-04-08 19:23:14 +00:00
ed
acfaacbd46 enforce single-instance for session/shares db
use file-locking to detect and prevent misconfigurations
which could lead to subtle unexpected behavior
2025-04-08 19:08:12 +00:00
ed
8e0364efad if this is wrong i blame suzubrah for playing entirely too hype music at 6am in the fkn morning
improve shares/session-db smoketests and error semantics
2025-04-08 05:42:21 +00:00
ed
e3043004ba improve u2ow phrasing 2025-04-07 20:48:43 +00:00
ed
b2aaf40a3e speedgolf
in some envs (unsure which), importlib.resources is an
expensive import; drop it when we know it's useless
2025-04-07 20:34:55 +00:00
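the pattern is just a deferred import; a minimal sketch (function and parameter names here are made up):

```python
import os

def read_asset(base_dir: str, pkg: str, name: str) -> bytes:
    # cheap filesystem path first; only pay for the expensive
    # importlib.resources import when the files are not on disk
    # (e.g. when running from a zipapp/sfx bundle)
    fp = os.path.join(base_dir, name)
    if os.path.exists(fp):
        with open(fp, "rb") as f:
            return f.read()
    from importlib.resources import files  # deferred on purpose
    return files(pkg).joinpath(name).read_bytes()
```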
ed
21db8833dc tests: fix for f9954bc4e5 2025-04-07 18:59:43 +00:00
ed
ec14c3944e fix DeprecationWarning: Accessing argon2.__version__ is deprecated and will be removed in a future release. Use importlib.metadata directly to query for structlog's packaging metadata. 2025-04-07 18:51:13 +00:00
ed
20920e844f svg newlines + fix cleaner warning:
* support newlines in svg files;
  * `--error--\ncheck\nserver\nlog`
  * `upload\nonly`

* thumbnails of files with lastmodified year 1601 would
   make the cleaner print a harmless but annoying warning
2025-04-07 18:47:20 +00:00
ed
f9954bc4e5 smoketest fs-access when transcoding
the thumbnailer / audio transcoder could return misleading errors
if the operation fails due to insufficient filesystem permissions

try reading a few bytes from the file and bail early if it fails,
and detect/log unwritable output folders for thumbnails

also fixes http-response to only return svg-formatted errors
if the initial request expects a picture in response, not audio
2025-04-07 18:41:37 +00:00
thaddeus kuah
d450f61534 Apply custom fonts to buttons and input fields (#152)
* set custom font for inputs and buttons

Signed-off-by: Thaddeus Kuah <tk@tkkr.dev>
2025-04-06 19:15:10 +00:00
ed
2b50fc2010 fix mkdir in symlinked folders; closes #151
remove an overly careful safety-check which would refuse creating
directories if the location was outside of the volume's base-path

it is safe to trust `rem` due to `vpath = undot(vpath)` and
a similar check being performed inside `vfs.get` as well,
so this served no purpose
2025-04-06 09:18:40 +00:00
ed
c2034f7bc5 add GoogleOther to bad-crawlers list 2025-04-01 21:29:58 +02:00
ed
cec3bee020 forbid all use of LLM / AI when writing code 2025-03-31 17:25:56 +00:00
ed
e1b9ac631f separate histpath and dbpath options (#149)
the up2k databases are, by default, stored in a `.hist` subfolder
inside each volume, next to thumbnails and transcoded audio

add a new option for storing the databases in a separate location,
making it possible to tune the underlying filesystem for optimal
performance characteristics

the `--hist` global-option and `hist` volflag still behave like
before, but `--dbpath` and volflag `dbpath` will override the
histpath for the up2k-db and up2k-snap exclusively
2025-03-30 16:08:28 +00:00
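for example (paths are made up; flag names as introduced by this commit):

```
# per-volume .hist stays where it is, but all up2k databases
# land on a filesystem tuned for sqlite:
copyparty -v ~/music::r -e2d --dbpath /srv/fast-ssd/copyparty-db
```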
ed
19ee64e5e3 clarify that all dependencies are optional (#149) 2025-03-30 13:30:52 +00:00
ed
4f397b9b5b add zfs-tune (#149) 2025-03-30 13:30:15 +00:00
ed
71775dcccb mention mimalloc 2025-03-30 13:17:12 +00:00
ed
b383c08cc3 add review from ixbt forums 2025-03-29 13:57:35 +00:00
ed
fc88341820 add option to store markdown backups elsewhere
`--md-hist` / volflag `md_hist` specifies where to put old
versions of markdown files when edited using the web-ui;

* `s` = create `.hist` subfolder next to the markdown file
   (the default, both previously and now)

* `v` = use the volume's hist-path, either according to
   `--hist` or the `hist` volflag. NOTE: old versions
   will not be retrievable through the web-ui

* `n` = nope / disabled; overwrite without backup
2025-03-26 20:07:35 +00:00
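a hedged example in config-file form (volume and username made up), keeping markdown backups in the volume's hist-path:

```
[/notes]
  srv/notes
  accs:
    rw: ed
  flags:
    md_hist: v
```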
ed
43bbd566d7 mention mimalloc in docker-compose examples (thx thad) 2025-03-24 23:19:17 +00:00
ed
e1dea7ef3e dangit 2025-03-23 23:28:05 +00:00
ed
de2fedd2cd update pkgs to 1.16.18 2025-03-23 23:04:53 +00:00
45 changed files with 1322 additions and 231 deletions


@@ -1,8 +1,21 @@
* **found a bug?** [create an issue!](https://github.com/9001/copyparty/issues) or let me know in the [discord](https://discord.gg/25J8CdTT6G) :>
* **fixed a bug?** create a PR or post a patch! big thx in advance :>
* **have a cool idea?** let's discuss it! anywhere's fine, you choose.
really tho, send a PR or an issue or whatever, all appreciated, anything goes, just behave aight 👍👍 but please:
# do not use AI / LLM when writing code
copyparty is 100% organic, free-range, human-written software!
> ⚠ you are now entering a no-copilot zone
the *only* place where LLM/AI *may* be accepted is for [localization](https://github.com/9001/copyparty/tree/hovudstraum/docs/rice#translations) if you are fluent and have confirmed that the translation is accurate.
sorry for the harsh tone, but this is important to me 🙏
but to be more specific,
# contribution ideas


@@ -50,6 +50,8 @@ turn almost any device into a file server with resumable uploads/downloads using
* [rss feeds](#rss-feeds) - monitor a folder with your RSS reader
* [recent uploads](#recent-uploads) - list all recent uploads
* [media player](#media-player) - plays almost every audio format there is
* [playlists](#playlists) - create and play [m3u8](https://en.wikipedia.org/wiki/M3U) playlists
* [creating a playlist](#creating-a-playlist) - with a standalone mediaplayer or copyparty
* [audio equalizer](#audio-equalizer) - and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
* [fix unreliable playback on android](#fix-unreliable-playback-on-android) - due to phone / app settings
* [markdown viewer](#markdown-viewer) - and there are *two* editors
@@ -147,6 +149,7 @@ just run **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/
* or if you are on android, [install copyparty in termux](#install-on-android)
* or maybe you have a [synology nas / dsm](./docs/synology-dsm.md)
* or if your computer is messed up and nothing else works, [try the pyz](#zipapp)
* or if your OS is dead, give the [bootable flashdrive / cd-rom](https://a.ocv.me/pub/stuff/edcd001/enterprise-edition/) a spin
* or if you don't trust copyparty yet and want to isolate it a little, then...
* ...maybe [prisonparty](./bin/prisonparty.sh) to create a tiny [chroot](https://wiki.archlinux.org/title/Chroot) (very portable),
* ...or [bubbleparty](./bin/bubbleparty.sh) to wrap it in [bubblewrap](https://github.com/containers/bubblewrap) (much better)
@@ -250,6 +253,7 @@ also see [comparison to similar software](./docs/versus.md)
* ☑ file manager (cut/paste, delete, [batch-rename](#batch-rename))
* ☑ audio player (with [OS media controls](https://user-images.githubusercontent.com/241032/215347492-b4250797-6c90-4e09-9a4c-721edf2fb15c.png) and opus/mp3 transcoding)
* ☑ play video files as audio (converted on server)
* ☑ create and play [m3u8 playlists](#playlists)
* ☑ image gallery with webm player
* ☑ textfile browser with syntax hilighting
* ☑ [thumbnails](#thumbnails)
@@ -281,6 +285,8 @@ small collection of user feedback
`good enough`, `surprisingly correct`, `certified good software`, `just works`, `why`, `wow this is better than nextcloud`
* the UI is simply terrible. If I were to describe it in detail, I wouldn't be able to stay within the bounds of decency
# motivations
@@ -300,6 +306,8 @@ project goals / philosophy
* adaptable, malleable, hackable
* no build steps; modify the js/python without needing node.js or anything like that
becoming rich is specifically *not* a motivation, but if you wanna donate then see my [github profile](https://github.com/9001) regarding donations for my FOSS stuff in general (also THANKS!)
## notes
@@ -329,7 +337,8 @@ roughly sorted by chance of encounter
* `--th-ff-jpg` may fix video thumbnails on some FFmpeg versions (macos, some linux)
* `--th-ff-swr` may fix audio thumbnails on some FFmpeg versions
* if the `up2k.db` (filesystem index) is on a samba-share or network disk, you'll get unpredictable behavior if the share is disconnected for a bit
* use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db and thumbnails on a local disk instead
* or, if you only want to move the db (and not the thumbnails), then use `--dbpath` or the `dbpath` volflag
* all volumes must exist / be available on startup; up2k (mtp especially) gets funky otherwise
* probably more, pls let me know
@@ -382,7 +391,8 @@ same order here too
* this is an msys2 bug, the regular windows edition of python is fine
* VirtualBox: sqlite throws `Disk I/O Error` when running in a VM and the up2k database is in a vboxsf
* use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db and thumbnails inside the vm instead
* or, if you only want to move the db (and not the thumbnails), then use `--dbpath` or the `dbpath` volflag
* also happens on mergerfs, so put the db elsewhere
* Ubuntu: dragging files from certain folders into firefox or chrome is impossible
@@ -726,6 +736,7 @@ select which type of archive you want in the `[⚙️] config` tab:
* `up2k.db` and `dir.txt` are always excluded
* bsdtar supports streaming unzipping: `curl foo?zip | bsdtar -xv`
* good, because copyparty's zip is faster than tar on small files
* but `?tar` is better for large files, especially if the total exceeds 4 GiB
* `zip_crc` will take longer to download since the server has to read each file twice
* this is only to support MS-DOS PKZIP v2.04g (october 1993) and older
* how are you accessing copyparty actually
@@ -804,6 +815,8 @@ if you are resuming a massive upload and want to skip hashing the files which al
if the server is behind a proxy which imposes a request-size limit, you can configure up2k to sneak below the limit with server-option `--u2sz` (the default is 96 MiB to support Cloudflare)
if you want to replace existing files on the server with new uploads by default, run with `--u2ow 2` (only works if users have the delete-permission, and can still be disabled with `🛡️` in the UI)
### file-search
@@ -1028,11 +1041,13 @@ click the `play` link next to an audio file, or copy the link target to [share i
open the `[🎺]` media-player-settings tab to configure it,
* "switches":
* `[🔁]` repeats one single song forever
* `[🔀]` shuffles the files inside each folder
* `[preload]` starts loading the next track when it's about to end, reduces the silence between songs
* `[full]` does a full preload by downloading the entire next file; good for unreliable connections, bad for slow connections
* `[~s]` toggles the seekbar waveform display
* `[/np]` enables buttons to copy the now-playing info as an irc message
* `[📻]` enables buttons to create an [m3u playlist](#playlists) with the selected songs
* `[os-ctl]` makes it possible to control audio playback from the lockscreen of your device (enables [mediasession](https://developer.mozilla.org/en-US/docs/Web/API/MediaSession))
* `[seek]` allows seeking with lockscreen controls (buggy on some devices)
* `[art]` shows album art on the lockscreen
@@ -1051,11 +1066,39 @@ open the `[🎺]` media-player-settings tab to configure it,
* "transcode to":
* `[opus]` produces an `opus` whenever transcoding is necessary (the best choice on Android and PCs)
* `[awo]` is `opus` in a `weba` file, good for iPhones (iOS 17.5 and newer) but Apple is still fixing some state-confusion bugs as of iOS 18.2.1
* `[caf]` is `opus` in a `caf` file, good for iPhones (iOS 11 through 17), technically unsupported by Apple but works for the most part
* `[mp3]` -- the myth, the legend, the undying master of mediocre sound quality that definitely works everywhere
* "tint" reduces the contrast of the playback bar
### playlists
create and play [m3u8](https://en.wikipedia.org/wiki/M3U) playlists -- see example [text](https://a.ocv.me/pub/demo/music/?doc=example-playlist.m3u) and [player](https://a.ocv.me/pub/demo/music/#m3u=example-playlist.m3u)
click a file with the extension `m3u` or `m3u8` (for example `mixtape.m3u` or `touhou.m3u8` ) and you get two choices: Play / Edit
playlists can include songs across folders anywhere on the server, but filekeys/dirkeys are NOT supported, so the listener must have read-access or get-access to the files
### creating a playlist
with a standalone mediaplayer or copyparty
you can use foobar2000, deadbeef, just about any standalone player should work -- but you might need to edit the filepaths in the playlist so they fit with the server-URLs
alternatively, you can create the playlist using copyparty itself:
* open the `[🎺]` media-player-settings tab and enable the `[📻]` create-playlist feature -- this adds two new buttons in the bottom-right tray, `[📻add]` and `[📻copy]` which appear when you listen to music, or when you select a few audiofiles
* click the `📻add` button while a song is playing (or when you've selected some songs) and they'll be added to "the list" (you can't see it yet)
* at any time, click `📻copy` to send the playlist to your clipboard
* you can then continue adding more songs if you'd like
* if you want to wipe the playlist and start from scratch, just refresh the page
* create a new textfile, name it `something.m3u` and paste the playlist there
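a minimal playlist could look like this (the paths are made up; they must be server-paths the listener can read):

```
#EXTM3U
#EXTINF:210,Artist - Some Song
/music/album1/01 some song.opus
#EXTINF:184,Other Artist - Another Song
/music/album2/07 another song.flac
```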
### audio equalizer
and [dynamic range compressor](https://en.wikipedia.org/wiki/Dynamic_range_compression)
@@ -1592,6 +1635,8 @@ copyparty creates a subfolder named `.hist` inside each volume where it stores t
this can instead be kept in a single place using the `--hist` argument, or the `hist=` volflag, or a mix of both:
* `--hist ~/.cache/copyparty -v ~/music::r:c,hist=-` sets `~/.cache/copyparty` as the default place to put volume info, but `~/music` gets the regular `.hist` subfolder (`-` restores default behavior)
by default, the per-volume `up2k.db` sqlite3-database for `-e2d` and `-e2t` is stored next to the thumbnails according to the `--hist` option, but the global-option `--dbpath` and/or volflag `dbpath` can be used to put the database somewhere else
note:
* putting the hist-folders on an SSD is strongly recommended for performance
* markdown edits are always stored in a local `.hist` subdirectory
@@ -2127,7 +2172,9 @@ buggy feature? rip it out by setting any of the following environment variables
| env-var | what it does |
| -------------------- | ------------ |
| `PRTY_NO_DB_LOCK` | do not lock session/shares-databases for exclusive access |
| `PRTY_NO_IFADDR` | disable ip/nic discovery by poking into your OS with ctypes |
| `PRTY_NO_IMPRESO` | do not try to load js/css files using `importlib.resources` |
| `PRTY_NO_IPV6` | disable some ipv6 support (should not be necessary since windows 2000) |
| `PRTY_NO_LZMA` | disable streaming xz compression of incoming uploads |
| `PRTY_NO_MP` | disable all use of the python `multiprocessing` module (actual multithreading, cpu-count for parsers/thumbnailers) |
@@ -2373,6 +2420,8 @@ copyparty returns a truncated sha512sum of your PUT/POST as base64; you can gene
you can provide passwords using header `PW: hunter2`, cookie `cppwd=hunter2`, url-param `?pw=hunter2`, or with basic-authentication (either as the username or password)
> for basic-authentication, all of the following are accepted: `password` / `whatever:password` / `password:whatever` (the username is ignored)
NOTE: curl will not send the original filename if you use `-T` combined with url-params! Also, make sure to always leave a trailing slash in URLs unless you want to override the filename
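for example, with curl (all equivalent; the server URL is made up):

```
curl -u whatever:hunter2 https://example.com/
curl -u hunter2:whatever https://example.com/
curl -H 'PW: hunter2' https://example.com/
```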
@@ -2445,6 +2494,8 @@ below are some tweaks roughly ordered by usefulness:
* `--no-hash .` when indexing a network-disk if you don't care about the actual filehashes and only want the names/tags searchable
* if your volumes are on a network-disk such as NFS / SMB / s3, specifying larger values for `--iobuf` and/or `--s-rd-sz` and/or `--s-wr-sz` may help; try setting all of them to `524288` or `1048576` or `4194304`
* `--no-htp --hash-mt=0 --mtag-mt=1 --th-mt=1` minimizes the number of threads; can help in some eccentric environments (like the vscode debugger)
* when running on AlpineLinux or other musl-based distro, try mimalloc for higher performance (and twice as much RAM usage); `apk add mimalloc2` and run copyparty with env-var `LD_PRELOAD=/usr/lib/libmimalloc-secure.so.2`
* note that mimalloc requires special care when combined with prisonparty and/or bubbleparty/bubblewrap; you must give it access to `/proc` and `/sys` otherwise you'll encounter issues with FFmpeg (audio transcoding, thumbnails)
* `-j0` enables multiprocessing (actual multithreading), can reduce latency to `20+80/numCores` percent and generally improve performance in cpu-intensive workloads, for example:
* lots of connections (many users or heavy clients)
* simultaneous downloads and uploads saturating a 20gbps connection


@@ -14,6 +14,8 @@ run copyparty with `--help-hooks` for usage details / hook type explanations (xm
* [discord-announce.py](discord-announce.py) announces new uploads on discord using webhooks ([example](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png))
* [reject-mimetype.py](reject-mimetype.py) rejects uploads unless the mimetype is acceptable
* [into-the-cache-it-goes.py](into-the-cache-it-goes.py) avoids bugs in caching proxies by immediately downloading each file that is uploaded
* [podcast-normalizer.py](podcast-normalizer.py) creates a second file with dynamic-range-compression whenever an audio file is uploaded
* good example of the `idx` [hook effect](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#hook-effects) to tell copyparty about additional files to scan/index
# upload batches
@@ -25,6 +27,7 @@ these are `--xiu` hooks; unlike `xbu` and `xau` (which get executed on every sin
# before upload
* [reject-extension.py](reject-extension.py) rejects uploads if they match a list of file extensions
* [reloc-by-ext.py](reloc-by-ext.py) redirects an upload to another destination based on the file extension
* good example of the `reloc` [hook effect](https://github.com/9001/copyparty/blob/hovudstraum/docs/devnotes.md#hook-effects)
# on message

bin/hooks/podcast-normalizer.py Executable file

@@ -0,0 +1,121 @@
#!/usr/bin/env python3

import json
import os
import sys
import subprocess as sp

_ = r"""
sends all uploaded audio files through an aggressive
dynamic-range-compressor to even out the volume levels

dependencies:
  ffmpeg

being an xau hook, this gets eXecuted After Upload completion
but before copyparty has started hashing/indexing the file, so
we'll create a second normalized copy in a subfolder and tell
copyparty to hash/index that additional file as well

example usage as global config:
  -e2d -e2t --xau j,c1,bin/hooks/podcast-normalizer.py

parameters explained,
  e2d/e2t = enable database and metadata indexing
  xau = execute after upload
  j = this hook needs upload information as json (not just the filename)
  c1 = this hook returns json on stdout, so tell copyparty to read that

example usage as a volflag (per-volume config):
  -v srv/inc/pods:inc/pods:r:rw,ed:c,xau=j,c1,bin/hooks/podcast-normalizer.py
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  (share fs-path srv/inc/pods at URL /inc/pods,
   readable by all, read-write for user ed,
   running this xau (exec-after-upload) plugin for all uploaded files)

example usage as a volflag in a copyparty config file:
  [/inc/pods]
    srv/inc/pods
    accs:
      r: *
      rw: ed
    flags:
      e2d  # enables file indexing
      e2t  # metadata tags too
      xau: j,c1,bin/hooks/podcast-normalizer.py
"""

########################################################################
### CONFIG

# filetypes to process; ignores everything else
EXTS = "mp3 flac ogg opus m4a aac wav wma"

# the name of the subdir to put the normalized files in
SUBDIR = "normalized"

########################################################################


# try to enable support for crazy filenames
try:
    from copyparty.util import fsenc
except:
    def fsenc(p):
        return p.encode("utf-8")


def main():
    # read info from copyparty
    inf = json.loads(sys.argv[1])
    vpath = inf["vp"]
    abspath = inf["ap"]

    # check if the file-extension is on the to-be-processed list
    ext = abspath.lower().split(".")[-1]
    if ext not in EXTS.split():
        return

    # jump into the folder where the file was uploaded
    # and create the subfolder to place the normalized copy inside
    dirpath, filename = os.path.split(abspath)
    os.chdir(fsenc(dirpath))
    os.makedirs(SUBDIR, exist_ok=True)

    # the input and output filenames to give ffmpeg
    fname_in = fsenc(f"./{filename}")
    fname_out = fsenc(f"{SUBDIR}/{filename}.opus")

    # fmt: off
    # create and run the ffmpeg command
    cmd = [
        b"ffmpeg",
        b"-nostdin",
        b"-hide_banner",
        b"-i", fname_in,
        b"-af", b"dynaudnorm=f=100:g=9",  # the normalizer config
        b"-c:a", b"libopus",
        b"-b:a", b"128k",
        fname_out,
    ]
    # fmt: on
    sp.check_output(cmd)

    # and finally, tell copyparty about the new file
    # so it appears in the database and rss-feed:
    vpath = f"{SUBDIR}/{filename}.opus"
    print(json.dumps({"idx": {"vp": [vpath]}}))

    # (it's fine to give it a relative path like that; it gets
    #  resolved relative to the folder the file was uploaded into)


if __name__ == "__main__":
    try:
        main()
    except Exception as ex:
        print("podcast-normalizer failed; %r" % (ex,))
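The hook's contract is small: because of the `j` flag, copyparty passes the upload info as a JSON object in `argv[1]` (with `vp`/`ap` as seen above), and because of `c1` it reads an `idx` JSON reply from stdout. A minimal sketch of that control flow, with the ffmpeg step stubbed out (the `fake_hook` name and the sample paths are made up for illustration):

```python
import json

def fake_hook(argv_json: str) -> str:
    """mimic the hook: gate on extension, emit the idx-json reply"""
    inf = json.loads(argv_json)
    abspath = inf["ap"]
    ext = abspath.lower().split(".")[-1]
    if ext not in "mp3 flac ogg opus m4a aac wav wma".split():
        return ""  # not audio; print nothing so copyparty does nothing
    filename = abspath.replace("\\", "/").split("/")[-1]
    # relative vpath of the (hypothetical) normalized copy
    return json.dumps({"idx": {"vp": ["normalized/%s.opus" % filename]}})

print(fake_hook(json.dumps({"vp": "inc/pods/ep1.mp3", "ap": "/srv/inc/pods/ep1.mp3"})))
```

An empty reply is as important as the `idx` one: staying silent for non-audio files keeps copyparty from indexing a file that was never created.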


@@ -50,6 +50,9 @@
* give a 3rd argument to install it to your copyparty config
* systemd service at [`systemd/cfssl.service`](systemd/cfssl.service)
### [`zfs-tune.py`](zfs-tune.py)
* tunes databases for better performance when stored on a zfs filesystem; also see the [openzfs docs](https://openzfs.github.io/openzfs-docs/Performance%20and%20Tuning/Workload%20Tuning.html#database-workloads), specifically the SQLite subsection
# OS integration
init-scripts to start copyparty as a service
* [`systemd/copyparty.service`](systemd/copyparty.service) runs the sfx normally


@@ -1,6 +1,6 @@
# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.16.17" pkgver="1.16.21"
pkgrel=1
pkgdesc="File server with accelerated resumable uploads, dedup, WebDAV, FTP, TFTP, zeroconf, media indexer, thumbnails++"
arch=("any")
@@ -22,7 +22,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag
)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("6dba0df650bfa6c47ebffcd0c9ef450b49dd998b87265778470799f7cdcd6b00") sha256sums=("2e416e18dc854c65643b8aaedca56e0a5c5a03b0c3d45b7ff3f68daa38d8e9c6")
build() {
cd "${srcdir}/${pkgname}-${pkgver}"


@@ -1,5 +1,5 @@
{
"url": "https://github.com/9001/copyparty/releases/download/v1.16.17/copyparty-sfx.py", "url": "https://github.com/9001/copyparty/releases/download/v1.16.21/copyparty-sfx.py",
"version": "1.16.17", "version": "1.16.21",
"hash": "sha256-D3hz4tr0/Qb8ySZvhI/eKTUvONbmb8RbwzTEHMWpA6o=" "hash": "sha256-+/f4g8J2Mv0l6ChXzbNJ84G8LeB+mP1UfkWzQxizd/g="
}

contrib/zfs-tune.py Executable file

@@ -0,0 +1,107 @@
#!/usr/bin/env python3

import os
import sqlite3
import sys
import traceback

"""
when the up2k-database is stored on a zfs volume, this may give
slightly higher performance (actual gains not measured yet)

NOTE: must be applied in combination with the related advice in the openzfs documentation;
https://openzfs.github.io/openzfs-docs/Performance%20and%20Tuning/Workload%20Tuning.html#database-workloads
and see specifically the SQLite subsection

it is assumed that all databases are stored in a single location,
for example with `--hist /var/store/hists`

three alternatives for running this script:

1. copy it into /var/store/hists and run "python3 zfs-tune.py s"
   (s = modify all databases below folder containing script)

2. cd into /var/store/hists and run "python3 ~/zfs-tune.py w"
   (w = modify all databases below current working directory)

3. python3 ~/zfs-tune.py /var/store/hists

if you use docker, run copyparty with `--hist /cfg/hists`, copy this script into /cfg, and run this:
podman run --rm -it --entrypoint /usr/bin/python3 ghcr.io/9001/copyparty-ac /cfg/zfs-tune.py s
"""

PAGESIZE = 65536


# borrowed from copyparty; short efficient stacktrace for errors
def min_ex(max_lines: int = 8, reverse: bool = False) -> str:
    et, ev, tb = sys.exc_info()
    stb = traceback.extract_tb(tb) if tb else traceback.extract_stack()[:-1]
    fmt = "%s:%d <%s>: %s"
    ex = [fmt % (fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in stb]
    if et or ev or tb:
        ex.append("[%s] %s" % (et.__name__ if et else "(anonymous)", ev))
    return "\n".join(ex[-max_lines:][:: -1 if reverse else 1])


def set_pagesize(db_path):
    try:
        # check current page_size
        with sqlite3.connect(db_path) as db:
            v = db.execute("pragma page_size").fetchone()[0]
            if v == PAGESIZE:
                print(" `-- OK")
                return

        # https://www.sqlite.org/pragma.html#pragma_page_size
        # `- disable wal; set pagesize; vacuum
        #    (copyparty will reenable wal if necessary)

        with sqlite3.connect(db_path) as db:
            db.execute("pragma journal_mode=delete")
            db.commit()

        with sqlite3.connect(db_path) as db:
            db.execute(f"pragma page_size = {PAGESIZE}")
            db.execute("vacuum")

        print(" `-- new pagesize OK")

    except Exception:
        err = min_ex().replace("\n", "\n -- ")
        print(f"FAILED: {db_path}\n -- {err}")


def main():
    top = os.path.dirname(os.path.abspath(__file__))
    cwd = os.path.abspath(os.getcwd())
    try:
        x = sys.argv[1]
    except:
        print(f"""
this script takes one mandatory argument:
specify 's' to start recursing from folder containing this script file ({top})
specify 'w' to start recursing from the current working directory ({cwd})
specify a path to start recursing from there
""")
        sys.exit(1)

    if x.lower() == "w":
        top = cwd
    elif x.lower() != "s":
        top = x

    for dirpath, dirs, files in os.walk(top):
        for fname in files:
            if not fname.endswith(".db"):
                continue
            db_path = os.path.join(dirpath, fname)
            print(db_path)
            set_pagesize(db_path)


if __name__ == "__main__":
    main()
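The delete-journal/page_size/vacuum sequence in `set_pagesize` can be verified on a throwaway database; SQLite only applies a new `page_size` on a vacuum of a non-WAL database, which is why the script disables WAL first. A self-contained check (temp-file path is arbitrary):

```python
import os
import sqlite3
import tempfile

PAGESIZE = 65536

# build a throwaway db in wal-mode, with whatever the default page size is
fd, db_path = tempfile.mkstemp(suffix=".db")
os.close(fd)
with sqlite3.connect(db_path) as db:
    db.execute("pragma journal_mode=wal")
    db.execute("create table t(v)")
    db.execute("insert into t values (1)")

# same sequence as set_pagesize(): leave wal, set the page size, vacuum
with sqlite3.connect(db_path) as db:
    db.execute("pragma journal_mode=delete")
    db.commit()
with sqlite3.connect(db_path) as db:
    db.execute("pragma page_size = %d" % PAGESIZE)
    db.execute("vacuum")

with sqlite3.connect(db_path) as db:
    final = db.execute("pragma page_size").fetchone()[0]
print(final)  # 65536
os.remove(db_path)
```

Copyparty reenables WAL on its own at startup, so leaving the database in delete-journal mode afterwards is harmless.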


@@ -228,7 +228,23 @@ def init_E(EE: EnvParams) -> None:
if E.mod.endswith("__init__"):
E.mod = os.path.dirname(E.mod)
if sys.platform == "win32": try:
p = os.environ.get("XDG_CONFIG_HOME")
if not p:
raise Exception()
if p.startswith("~"):
p = os.path.expanduser(p)
p = os.path.abspath(os.path.realpath(p))
p = os.path.join(p, "copyparty")
if not os.path.isdir(p):
os.mkdir(p)
os.listdir(p)
except:
p = ""
if p:
E.cfg = p
elif sys.platform == "win32":
bdir = os.environ.get("APPDATA") or os.environ.get("TEMP") or "."
E.cfg = os.path.normpath(bdir + "/copyparty")
elif sys.platform == "darwin":
@@ -1009,9 +1025,9 @@ def add_upload(ap):
ap2.add_argument("--df", metavar="GiB", type=u, default="0", help="ensure \033[33mGiB\033[0m free disk space by rejecting upload requests; assumes gigabytes unless a unit suffix is given: [\033[32m256m\033[0m], [\033[32m4\033[0m], [\033[32m2T\033[0m] (volflag=df)")
ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="windows-only: minimum size of incoming uploads through up2k before they are made into sparse files")
ap2.add_argument("--turbo", metavar="LVL", type=int, default=0, help="configure turbo-mode in up2k client; [\033[32m-1\033[0m] = forbidden/always-off, [\033[32m0\033[0m] = default-off and warn if enabled, [\033[32m1\033[0m] = default-off, [\033[32m2\033[0m] = on, [\033[32m3\033[0m] = on and disable datecheck")
ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good for low-latency (same-country) connections, 4-8 for android clients, 16 for cross-atlantic (max=64)") ap2.add_argument("--u2j", metavar="JOBS", type=int, default=2, help="web-client: number of file chunks to upload in parallel; 1 or 2 is good when latency is low (same-country), 2~4 for android-clients, 2~6 for cross-atlantic. Max is 6 in most browsers. Big values increase network-speed but may reduce HDD-speed")
ap2.add_argument("--u2sz", metavar="N,N,N", type=u, default="1,64,96", help="web-client: default upload chunksize (MiB); sets \033[33mmin,default,max\033[0m in the settings gui. Each HTTP POST will aim for \033[33mdefault\033[0m, and never exceed \033[33mmax\033[0m. Cloudflare max is 96. Big values are good for cross-atlantic but may increase HDD fragmentation on some FS. Disable this optimization with [\033[32m1,1,1\033[0m]")
ap2.add_argument("--u2ow", metavar="NUM", type=int, default=0, help="web-client: default setting for when to overwrite existing files; [\033[32m0\033[0m]=never, [\033[32m1\033[0m]=if-client-newer, [\033[32m2\033[0m]=always (volflag=u2ow)") ap2.add_argument("--u2ow", metavar="NUM", type=int, default=0, help="web-client: default setting for when to replace/overwrite existing files; [\033[32m0\033[0m]=never, [\033[32m1\033[0m]=if-client-newer, [\033[32m2\033[0m]=always (volflag=u2ow)")
ap2.add_argument("--u2sort", metavar="TXT", type=u, default="s", help="upload order; [\033[32ms\033[0m]=smallest-first, [\033[32mn\033[0m]=alphabetical, [\033[32mfs\033[0m]=force-s, [\033[32mfn\033[0m]=force-n -- alphabetical is a bit slower on fiber/LAN but makes it easier to eyeball if everything went fine")
ap2.add_argument("--write-uplog", action="store_true", help="write POST reports to textfiles in working-directory")
@@ -1292,6 +1308,9 @@ def add_salt(ap, fk_salt, dk_salt, ah_salt):
ap2.add_argument("--fk-salt", metavar="SALT", type=u, default=fk_salt, help="per-file accesskey salt; used to generate unpredictable URLs for hidden files")
ap2.add_argument("--dk-salt", metavar="SALT", type=u, default=dk_salt, help="per-directory accesskey salt; used to generate unpredictable URLs to share folders with users who only have the 'get' permission")
ap2.add_argument("--warksalt", metavar="SALT", type=u, default="hunter2", help="up2k file-hash salt; serves no purpose, no reason to change this (but delete all databases if you do)")
ap2.add_argument("--show-ah-salt", action="store_true", help="on startup, print the effective value of \033[33m--ah-salt\033[0m (the autogenerated value in $XDG_CONFIG_HOME unless otherwise specified)")
ap2.add_argument("--show-fk-salt", action="store_true", help="on startup, print the effective value of \033[33m--fk-salt\033[0m (the autogenerated value in $XDG_CONFIG_HOME unless otherwise specified)")
ap2.add_argument("--show-dk-salt", action="store_true", help="on startup, print the effective value of \033[33m--dk-salt\033[0m (the autogenerated value in $XDG_CONFIG_HOME unless otherwise specified)")
def add_shutdown(ap):
@@ -1333,7 +1352,7 @@ def add_admin(ap):
def add_thumbnail(ap):
th_ram = (RAM_AVAIL or RAM_TOTAL or 9) * 0.6
th_ram = int(max(min(th_ram, 6), 1) * 10) / 10 th_ram = int(max(min(th_ram, 6), 0.3) * 10) / 10
ap2 = ap.add_argument_group('thumbnail options')
ap2.add_argument("--no-thumb", action="store_true", help="disable all thumbnails (volflag=dthumb)")
ap2.add_argument("--no-vthumb", action="store_true", help="disable video thumbnails (volflag=dvthumb)")
@@ -1361,6 +1380,7 @@ def add_thumbnail(ap):
ap2.add_argument("--th-r-ffi", metavar="T,T", type=u, default="apng,avif,avifs,bmp,cbz,dds,dib,fit,fits,fts,gif,hdr,heic,heics,heif,heifs,icns,ico,jp2,jpeg,jpg,jpx,jxl,pbm,pcx,pfm,pgm,png,pnm,ppm,psd,qoi,sgi,tga,tif,tiff,webp,xbm,xpm", help="image formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffv", metavar="T,T", type=u, default="3gp,asf,av1,avc,avi,flv,h264,h265,hevc,m4v,mjpeg,mjpg,mkv,mov,mp4,mpeg,mpeg2,mpegts,mpg,mpg2,mts,nut,ogm,ogv,rm,ts,vob,webm,wmv", help="video formats to decode using ffmpeg")
ap2.add_argument("--th-r-ffa", metavar="T,T", type=u, default="aac,ac3,aif,aiff,alac,alaw,amr,apac,ape,au,bonk,dfpwm,dts,flac,gsm,ilbc,it,itgz,itxz,itz,m4a,mdgz,mdxz,mdz,mo3,mod,mp2,mp3,mpc,mptm,mt2,mulaw,ogg,okt,opus,ra,s3m,s3gz,s3xz,s3z,tak,tta,ulaw,wav,wma,wv,xm,xmgz,xmxz,xmz,xpk", help="audio formats to decode using ffmpeg")
ap2.add_argument("--th-spec-cnv", metavar="T,T", type=u, default="it,itgz,itxz,itz,mdgz,mdxz,mdz,mo3,mod,s3m,s3gz,s3xz,s3z,xm,xmgz,xmxz,xmz,xpk", help="audio formats which provoke https://trac.ffmpeg.org/ticket/10797 (huge ram usage for s3xmodit spectrograms)")
ap2.add_argument("--au-unpk", metavar="E=F.C", type=u, default="mdz=mod.zip, mdgz=mod.gz, mdxz=mod.xz, s3z=s3m.zip, s3gz=s3m.gz, s3xz=s3m.xz, xmz=xm.zip, xmgz=xm.gz, xmxz=xm.xz, itz=it.zip, itgz=it.gz, itxz=it.xz, cbz=jpg.cbz", help="audio/image formats to decompress before passing to ffmpeg")
@@ -1393,6 +1413,7 @@ def add_db_general(ap, hcores):
ap2.add_argument("-e2vu", action="store_true", help="on hash mismatch: update the database with the new hash")
ap2.add_argument("-e2vp", action="store_true", help="on hash mismatch: panic and quit copyparty")
ap2.add_argument("--hist", metavar="PATH", type=u, default="", help="where to store volume data (db, thumbs); default is a folder named \".hist\" inside each volume (volflag=hist)")
ap2.add_argument("--dbpath", metavar="PATH", type=u, default="", help="override where the volume databases are to be placed; default is the same as \033[33m--hist\033[0m (volflag=dbpath)")
ap2.add_argument("--no-hash", metavar="PTN", type=u, default="", help="regex: disable hashing of matching absolute-filesystem-paths during e2ds folder scans (volflag=nohash)")
ap2.add_argument("--no-idx", metavar="PTN", type=u, default=noidx, help="regex: disable indexing of matching absolute-filesystem-paths during e2ds folder scans (volflag=noidx)")
ap2.add_argument("--no-dirsz", action="store_true", help="do not show total recursive size of folders in listings, show inode size instead; slightly faster (volflag=nodirsz)")
@@ -1431,6 +1452,7 @@ def add_db_metadata(ap):
def add_txt(ap):
ap2 = ap.add_argument_group('textfile options')
ap2.add_argument("--md-hist", metavar="TXT", type=u, default="s", help="where to store old version of markdown files; [\033[32ms\033[0m]=subfolder, [\033[32mv\033[0m]=volume-histpath, [\033[32mn\033[0m]=nope/disabled (volflag=md_hist)")
ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="the textfile editor will check for serverside changes every \033[33mSEC\033[0m seconds")
ap2.add_argument("-emp", action="store_true", help="enable markdown plugins -- neat but dangerous, big XSS risk")
ap2.add_argument("--exp", action="store_true", help="enable textfile expansion -- replace {{self.ip}} and such; see \033[33m--help-exp\033[0m (volflag=exp)")


@@ -1,8 +1,8 @@
# coding: utf-8
VERSION = (1, 16, 18) VERSION = (1, 17, 0)
CODENAME = "COPYparty" CODENAME = "mixtape.m3u"
BUILD_DT = (2025, 3, 23) BUILD_DT = (2025, 4, 26)
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)


@@ -360,6 +360,7 @@ class VFS(object):
self.badcfg1 = False
self.nodes: dict[str, VFS] = {} # child nodes
self.histtab: dict[str, str] = {} # all realpath->histpath
self.dbpaths: dict[str, str] = {} # all realpath->dbpath
self.dbv: Optional[VFS] = None # closest full/non-jump parent
self.lim: Optional[Lim] = None # upload limits; only set for dbv
self.shr_src: Optional[tuple[VFS, str]] = None # source vfs+rem of a share
@@ -381,12 +382,13 @@ class VFS(object):
rp = realpath + ("" if realpath.endswith(os.sep) else os.sep)
vp = vpath + ("/" if vpath else "")
self.histpath = os.path.join(realpath, ".hist") # db / thumbcache
self.dbpath = self.histpath
self.all_vols = {vpath: self} # flattened recursive
self.all_nodes = {vpath: self} # also jumpvols/shares
self.all_aps = [(rp, self)]
self.all_vps = [(vp, self)]
else:
self.histpath = "" self.histpath = self.dbpath = ""
self.all_vols = {}
self.all_nodes = {}
self.all_aps = []
@@ -461,17 +463,23 @@ class VFS(object):
def _copy_flags(self, name: str) -> dict[str, Any]:
flags = {k: v for k, v in self.flags.items()}
hist = flags.get("hist")
if hist and hist != "-":
zs = "{}/{}".format(hist.rstrip("/"), name)
flags["hist"] = os.path.expandvars(os.path.expanduser(zs))
dbp = flags.get("dbpath")
if dbp and dbp != "-":
zs = "{}/{}".format(dbp.rstrip("/"), name)
flags["dbpath"] = os.path.expandvars(os.path.expanduser(zs))
return flags
def bubble_flags(self) -> None:
if self.dbv:
for k, v in self.dbv.flags.items():
if k not in ["hist"]: if k not in ("hist", "dbpath"):
self.flags[k] = v
for n in self.nodes.values():
@@ -1759,7 +1767,7 @@ class AuthSrv(object):
pass
elif vflag:
vflag = os.path.expandvars(os.path.expanduser(vflag))
vol.histpath = uncyg(vflag) if WINDOWS else vflag vol.histpath = vol.dbpath = uncyg(vflag) if WINDOWS else vflag
elif self.args.hist:
for nch in range(len(hid)):
hpath = os.path.join(self.args.hist, hid[: nch + 1])
@@ -1780,12 +1788,45 @@ class AuthSrv(object):
with open(powner, "wb") as f:
f.write(me)
vol.histpath = hpath vol.histpath = vol.dbpath = hpath
break
vol.histpath = absreal(vol.histpath)
for vol in vfs.all_vols.values():
hid = self.hid_cache[vol.realpath]
vflag = vol.flags.get("dbpath")
if vflag == "-":
pass
elif vflag:
vflag = os.path.expandvars(os.path.expanduser(vflag))
vol.dbpath = uncyg(vflag) if WINDOWS else vflag
elif self.args.dbpath:
for nch in range(len(hid)):
hpath = os.path.join(self.args.dbpath, hid[: nch + 1])
bos.makedirs(hpath)
powner = os.path.join(hpath, "owner.txt")
try:
with open(powner, "rb") as f:
owner = f.read().rstrip()
except:
owner = None
me = afsenc(vol.realpath).rstrip()
if owner not in [None, me]:
continue
if owner is None:
with open(powner, "wb") as f:
f.write(me)
vol.dbpath = hpath
break
vol.dbpath = absreal(vol.dbpath)
if vol.dbv:
if bos.path.exists(os.path.join(vol.histpath, "up2k.db")): if bos.path.exists(os.path.join(vol.dbpath, "up2k.db")):
promote.append(vol)
vol.dbv = None
else:
@@ -1800,9 +1841,7 @@ class AuthSrv(object):
"\n the following jump-volumes were generated to assist the vfs.\n As they contain a database (probably from v0.11.11 or older),\n they are promoted to full volumes:"
]
for vol in promote:
ta.append( ta.append(" /%s (%s) (%s)" % (vol.vpath, vol.realpath, vol.dbpath))
" /{} ({}) ({})".format(vol.vpath, vol.realpath, vol.histpath)
)
self.log("\n\n".join(ta) + "\n", c=3)
@@ -1813,13 +1852,27 @@ class AuthSrv(object):
is_shr = shr and zv.vpath.split("/")[0] == shr
if histp and not is_shr and histp in rhisttab:
zv2 = rhisttab[histp]
t = "invalid config; multiple volumes share the same histpath (database location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]" t = "invalid config; multiple volumes share the same histpath (database+thumbnails location):\n histpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
t = t % (histp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
self.log(t, 1)
raise Exception(t)
rhisttab[histp] = zv
vfs.histtab[zv.realpath] = histp
rdbpaths = {}
vfs.dbpaths = {}
for zv in vfs.all_vols.values():
dbp = zv.dbpath
is_shr = shr and zv.vpath.split("/")[0] == shr
if dbp and not is_shr and dbp in rdbpaths:
zv2 = rdbpaths[dbp]
t = "invalid config; multiple volumes share the same dbpath (database location):\n dbpath: %s\n volume 1: /%s [%s]\n volume 2: %s [%s]"
t = t % (dbp, zv2.vpath, zv2.realpath, zv.vpath, zv.realpath)
self.log(t, 1)
raise Exception(t)
rdbpaths[dbp] = zv
vfs.dbpaths[zv.realpath] = dbp
for vol in vfs.all_vols.values():
use = False
for k in ["zipmaxn", "zipmaxs"]:


@@ -83,6 +83,7 @@ def vf_vmap() -> dict[str, str]:
"md_sbf",
"lg_sba",
"md_sba",
"md_hist",
"nrand",
"u2ow",
"og_desc",
@@ -204,6 +205,7 @@ flagcats = {
"d2v": "disables file verification, overrides -e2v*",
"d2d": "disables all database stuff, overrides -e2*",
"hist=/tmp/cdb": "puts thumbnails and indexes at that location",
"dbpath=/tmp/cdb": "puts indexes at that location",
"scan=60": "scan for new files every 60sec, same as --re-maxage",
"nohash=\\.iso$": "skips hashing file contents if path matches *.iso",
"noidx=\\.iso$": "fully ignores the contents at paths matching *.iso",
@@ -291,6 +293,7 @@ flagcats = {
"og_ua": "if defined: only send OG html if useragent matches this regex",
},
"textfiles": {
"md_hist": "where to put markdown backups; s=subfolder, v=volHist, n=nope",
"exp": "enable textfile expansion; see --help-exp",
"exp_md": "placeholders to expand in markdown files; see --help",
"exp_lg": "placeholders to expand in prologue/epilogue; see --help",


@@ -57,6 +57,7 @@ from .util import (
UnrecvEOF,
WrongPostKey,
absreal,
afsenc,
alltrace,
atomic_move,
b64dec,
@@ -1205,11 +1206,6 @@ class HttpCli(object):
else:
return self.tx_res(res_path)
if res_path != undot(res_path):
t = "malicious user; attempted path traversal; req(%r) vp(%r) => %r"
self.log(t % (self.req, "/" + self.vpath, res_path), 1)
self.cbonk(self.conn.hsrv.gmal, self.req, "trav", "path traversal")
self.tx_404()
return False
@@ -2998,9 +2994,6 @@ class HttpCli(object):
vfs, rem = self.asrv.vfs.get(vpath, self.uname, False, True)
rem = sanitize_vpath(rem, "/")
fn = vfs.canonical(rem)
if not fn.startswith(vfs.realpath):
self.log("invalid mkdir %r %r" % (self.gctx, vpath), 1)
raise Pebkac(422)
if not nullwrite:
fdir = os.path.dirname(fn)
@@ -3501,6 +3494,7 @@ class HttpCli(object):
fp = os.path.join(fp, fn)
rem = "{}/{}".format(rp, fn).strip("/")
dbv, vrem = vfs.get_dbv(rem)
if not rem.endswith(".md") and not self.can_delete:
raise Pebkac(400, "only markdown pls")
@@ -3555,13 +3549,27 @@ class HttpCli(object):
mdir, mfile = os.path.split(fp)
fname, fext = mfile.rsplit(".", 1) if "." in mfile else (mfile, "md")
mfile2 = "{}.{:.3f}.{}".format(fname, srv_lastmod, fext)
try:
dp = ""
hist_cfg = dbv.flags["md_hist"]
if hist_cfg == "v":
vrd = vsplit(vrem)[0]
zb = hashlib.sha512(afsenc(vrd)).digest()
zs = ub64enc(zb).decode("ascii")[:24].lower()
dp = "%s/md/%s/%s/%s" % (dbv.histpath, zs[:2], zs[2:4], zs)
self.log("moving old version to %s/%s" % (dp, mfile2))
if bos.makedirs(dp):
with open(os.path.join(dp, "dir.txt"), "wb") as f:
f.write(afsenc(vrd))
elif hist_cfg == "s":
dp = os.path.join(mdir, ".hist")
try:
bos.mkdir(dp)
hidedir(dp)
except:
pass
wrename(self.log, fp, os.path.join(mdir, ".hist", mfile2), vfs.flags) if dp:
wrename(self.log, fp, os.path.join(dp, mfile2), vfs.flags)
assert self.parser.gen # !rm
p_field, _, p_data = next(self.parser.gen)
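The `md_hist == "v"` branch above derives a stable per-directory bucket under the volume histpath: sha512 of the directory's vpath, base64-encoded, truncated to 24 lowercase chars, with the first two 2-char slices reused as sharding subfolders. A standalone sketch of that derivation, assuming copyparty's `afsenc` is roughly a utf-8 encode and `ub64enc` is urlsafe base64 (both are internal helpers, so this is an approximation):

```python
import base64
import hashlib

def md_hist_path(histpath: str, vrd: str) -> str:
    # assumption: afsenc ~ utf-8 encode, ub64enc ~ urlsafe base64
    zb = hashlib.sha512(vrd.encode("utf-8")).digest()
    zs = base64.urlsafe_b64encode(zb).decode("ascii")[:24].lower()
    # shard into <histpath>/md/<aa>/<bb>/<full-24-char-key>
    return "%s/md/%s/%s/%s" % (histpath, zs[:2], zs[2:4], zs)

p = md_hist_path("/var/hists/abc", "inc/pods")  # sample paths, made up
print(p)
```

The `dir.txt` written next to the backups then maps the opaque hash folder back to the original directory name.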
@@ -3634,13 +3642,12 @@ class HttpCli(object):
wunlink(self.log, fp, vfs.flags)
raise Pebkac(403, t)
vfs, rem = vfs.get_dbv(rem)
self.conn.hsrv.broker.say( self.conn.hsrv.broker.say(
"up2k.hash_file", "up2k.hash_file",
vfs.realpath, dbv.realpath,
vfs.vpath, dbv.vpath,
vfs.flags, dbv.flags,
vsplit(rem)[0], vsplit(vrem)[0],
fn, fn,
self.ip, self.ip,
new_lastmod, new_lastmod,
@@ -4239,6 +4246,7 @@ class HttpCli(object):
 self.log(t % (data_end / M, lower / M, upper / M), 6)
 with self.u2mutex:
     if data_end > self.u2fh.aps.get(ap_data, data_end):
+        fhs: Optional[set[typing.BinaryIO]] = None
         try:
             fhs = self.u2fh.cache[ap_data].all_fhs
             for fh in fhs:

@@ -4246,7 +4254,11 @@ class HttpCli(object):
             self.u2fh.aps[ap_data] = data_end
             self.log("pipe: flushed %d up2k-FDs" % (len(fhs),))
         except Exception as ex:
-            self.log("pipe: u2fh flush failed: %r" % (ex,))
+            if fhs is None:
+                err = "file is not being written to right now"
+            else:
+                err = repr(ex)
+            self.log("pipe: u2fh flush failed: " + err)

 if lower >= data_end:
     if data_end:

@@ -4870,7 +4882,7 @@ class HttpCli(object):
 self.reply(pt.encode("utf-8"), status=rc)
 return True

-if "th" in self.ouparam:
+if "th" in self.ouparam and str(self.ouparam["th"])[:1] in "jw":
     return self.tx_svg("e" + pt[:3])
 # most webdav clients will not send credentials until they

@@ -5798,7 +5810,13 @@ class HttpCli(object):
 thp = None
 if self.thumbcli and not nothumb:
+    try:
         thp = self.thumbcli.get(dbv, vrem, int(st.st_mtime), th_fmt)
+    except Pebkac as ex:
+        if ex.code == 500 and th_fmt[:1] in "jw":
+            self.log("failed to convert [%s]:\n%s" % (abspath, ex), 3)
+            return self.tx_svg("--error--\ncheck\nserver\nlog")
+        raise

 if thp:
     return self.tx_file(thp)

@@ -6020,9 +6038,11 @@ class HttpCli(object):
 # check for old versions of files,
 # [num-backups, most-recent, hist-path]
 hist: dict[str, tuple[int, float, str]] = {}
+try:
+    if vf["md_hist"] != "s":
+        raise Exception()
     histdir = os.path.join(fsroot, ".hist")
     ptn = RE_MDV
-try:
     for hfn in bos.listdir(histdir):
         m = ptn.match(hfn)
         if not m:


@@ -94,10 +94,21 @@ class Ico(object):
 <?xml version="1.0" encoding="UTF-8"?>
 <svg version="1.1" viewBox="0 0 100 {}" xmlns="http://www.w3.org/2000/svg"><g>
 <rect width="100%" height="100%" fill="#{}" />
-<text x="50%" y="50%" dominant-baseline="middle" text-anchor="middle" xml:space="preserve"
+<text x="50%" y="{}" dominant-baseline="middle" text-anchor="middle" xml:space="preserve"
  fill="#{}" font-family="monospace" font-size="14px" style="letter-spacing:.5px">{}</text>
 </g></svg>
 """
-svg = svg.format(h, c[:6], c[6:], html_escape(ext, True))
+txt = html_escape(ext, True)
+if "\n" in txt:
+    lines = txt.split("\n")
+    n = len(lines)
+    y = "20%" if n == 2 else "10%" if n == 3 else "0"
+    zs = '<tspan x="50%%" dy="1.2em">%s</tspan>'
+    txt = "".join([zs % (x,) for x in lines])
+else:
+    y = "50%"
+svg = svg.format(h, c[:6], y, c[6:], txt)
 return "image/svg+xml", svg.encode("utf-8")
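Review note: the change above lets icon labels span several lines by shifting the `<text>` y-offset and emitting one `<tspan>` per line. A minimal sketch of just the layout logic (the function name is hypothetical; `html.escape` stands in for copyparty's `html_escape`):

```python
from html import escape

def svg_text_lines(txt: str) -> tuple[str, str]:
    # returns (y-offset, inner markup) for the svg <text> element;
    # multi-line labels start higher up and advance 1.2em per line
    txt = escape(txt, quote=True)
    if "\n" not in txt:
        return "50%", txt
    lines = txt.split("\n")
    n = len(lines)
    y = "20%" if n == 2 else "10%" if n == 3 else "0"
    body = "".join('<tspan x="50%%" dy="1.2em">%s</tspan>' % (s,) for s in lines)
    return y, body
```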


@@ -15,7 +15,7 @@ try:
     raise Exception()
     HAVE_ARGON2 = True
-    from argon2 import __version__ as argon2ver
+    from argon2 import exceptions as argon2ex
 except:
     HAVE_ARGON2 = False


@@ -64,6 +64,7 @@ from .util import (
     expat_ver,
     gzip,
     load_ipu,
+    lock_file,
     min_ex,
     mp,
     odfusion,

@@ -73,6 +74,9 @@ from .util import (
     ub64enc,
 )

+if HAVE_SQLITE3:
+    import sqlite3
+
 if TYPE_CHECKING:
     try:
         from .mdns import MDNS

@@ -84,6 +88,10 @@ if PY2:
     range = xrange  # type: ignore

+VER_SESSION_DB = 1
+VER_SHARES_DB = 2
+
 class SvcHub(object):
     """
     Hosts all services which cannot be parallelized due to reliance on monolithic resources.
@@ -186,8 +194,14 @@ class SvcHub(object):
 if not args.use_fpool and args.j != 1:
     args.no_fpool = True
-    t = "multithreading enabled with -j {}, so disabling fpool -- this can reduce upload performance on some filesystems"
-    self.log("root", t.format(args.j))
+    t = "multithreading enabled with -j {}, so disabling fpool -- this can reduce upload performance on some filesystems, and make some antivirus-softwares "
+    c = 0
+    if ANYWIN:
+        t += "(especially Microsoft Defender) stress your CPU and HDD severely during big uploads"
+        c = 3
+    else:
+        t += "consume more resources (CPU/HDD) than normal"
+    self.log("root", t.format(args.j), c)

 if not args.no_fpool and args.j != 1:
     t = "WARNING: ignoring --use-fpool because multithreading (-j{}) is enabled"
@@ -239,6 +253,14 @@ class SvcHub(object):
 setattr(args, "ipu_iu", iu)
 setattr(args, "ipu_nm", nm)

+for zs in "ah_salt fk_salt dk_salt".split():
+    if getattr(args, "show_%s" % (zs,)):
+        self.log("root", "effective %s is %s" % (zs, getattr(args, zs)))
+
+if args.ah_cli or args.ah_gen:
+    args.no_ses = True
+    args.shr = ""
+
 if not self.args.no_ses:
     self.setup_session_db()
@@ -406,25 +428,49 @@ class SvcHub(object):
     self.log("root", t, 3)
     return

-import sqlite3
+assert sqlite3  # type: ignore  # !rm
+
+# policy:
+# the sessions-db is whatever, if something looks broken then just nuke it
+
+create = True
 db_path = self.args.ses_db
-self.log("root", "opening sessions-db %s" % (db_path,))
-for n in range(2):
+db_lock = db_path + ".lock"
+try:
+    create = not os.path.getsize(db_path)
+except:
+    create = True
+zs = "creating new" if create else "opening"
+self.log("root", "%s sessions-db %s" % (zs, db_path))
+
+for tries in range(2):
+    sver = 0
     try:
         db = sqlite3.connect(db_path)
         cur = db.cursor()
         try:
+            zs = "select v from kv where k='sver'"
+            sver = cur.execute(zs).fetchall()[0][0]
+            if sver > VER_SESSION_DB:
+                zs = "this version of copyparty only understands session-db v%d and older; the db is v%d"
+                raise Exception(zs % (VER_SESSION_DB, sver))
             cur.execute("select count(*) from us").fetchone()
-            create = False
-            break
         except:
-            pass
-    except Exception as ex:
-        if n:
+            if sver:
                 raise
-        t = "sessions-db corrupt; deleting and recreating: %r"
+            sver = 1
+            self._create_session_db(cur)
+        err = self._verify_session_db(cur, sver, db_path)
+        if err:
+            tries = 99
+            self.args.no_ses = True
+            self.log("root", err, 3)
+        break
+    except Exception as ex:
+        if tries or sver > VER_SESSION_DB:
+            raise
+        t = "sessions-db is unusable; deleting and recreating: %r"
         self.log("root", t % (ex,), 3)
         try:
             cur.close()  # type: ignore
@@ -434,8 +480,13 @@ class SvcHub(object):
         try:
             db.close()  # type: ignore
         except:
             pass
+        try:
+            os.unlink(db_lock)
+        except:
+            pass
         os.unlink(db_path)
+def _create_session_db(self, cur: "sqlite3.Cursor") -> None:
     sch = [
         r"create table kv (k text, v int)",
         r"create table us (un text, si text, t0 int)",

@@ -445,17 +496,44 @@ class SvcHub(object):
         r"create index us_t0 on us(t0)",
         r"insert into kv values ('sver', 1)",
     ]
-    assert db  # type: ignore  # !rm
-    assert cur  # type: ignore  # !rm
-    if create:
     for cmd in sch:
         cur.execute(cmd)
     self.log("root", "created new sessions-db")
-    db.commit()

+def _verify_session_db(self, cur: "sqlite3.Cursor", sver: int, db_path: str) -> str:
+    # ensure writable (maybe owned by other user)
+    db = cur.connection
+
+    try:
+        zil = cur.execute("select v from kv where k='pid'").fetchall()
+        if len(zil) > 1:
+            raise Exception()
+        owner = zil[0][0]
+    except:
+        owner = 0
+
+    if not lock_file(db_path + ".lock"):
+        t = "the sessions-db [%s] is already in use by another copyparty instance (pid:%d). This is not supported; please provide another database with --ses-db or give this copyparty-instance its entirely separate config-folder by setting another path in the XDG_CONFIG_HOME env-var. You can also disable this safeguard by setting env-var PRTY_NO_DB_LOCK=1. Will now disable sessions and instead use plaintext passwords in cookies."
+        return t % (db_path, owner)
+
+    vars = (("pid", os.getpid()), ("ts", int(time.time() * 1000)))
+    if owner:
+        # wear-estimate: 2 cells; offsets 0x10, 0x50, 0x19720
+        for k, v in vars:
+            cur.execute("update kv set v=? where k=?", (v, k))
+    else:
+        # wear-estimate: 3~4 cells; offsets 0x10, 0x50, 0x19180, 0x19710, 0x36000, 0x360b0, 0x36b90
+        for k, v in vars:
+            cur.execute("insert into kv values(?, ?)", (k, v))
+    if sver < VER_SESSION_DB:
+        cur.execute("delete from kv where k='sver'")
+        cur.execute("insert into kv values('sver',?)", (VER_SESSION_DB,))
+    db.commit()
     cur.close()
     db.close()
+    return ""
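Review note: the open-or-nuke logic above follows a common sqlite schema-version pattern: read `sver` from a kv table, refuse anything newer than this build understands, and bootstrap a fresh schema when the tables are missing. A minimal self-contained sketch (names are mine, schema trimmed to the session tables shown above):

```python
import sqlite3

VER = 1  # highest schema version this build understands

def open_or_create(db: sqlite3.Connection) -> int:
    """Return the schema version, creating a v1 schema if the db is empty."""
    cur = db.cursor()
    try:
        sver = cur.execute("select v from kv where k='sver'").fetchall()[0][0]
        if sver > VER:
            # a newer copyparty wrote this db; bail instead of corrupting it
            raise Exception("db is v%d, this build only understands <= v%d" % (sver, VER))
        cur.execute("select count(*) from us").fetchone()  # sanity-check tables
    except sqlite3.OperationalError:
        # missing tables; bootstrap a fresh schema
        cur.execute("create table kv (k text, v int)")
        cur.execute("create table us (un text, si text, t0 int)")
        cur.execute("insert into kv values ('sver', ?)", (VER,))
        db.commit()
        sver = VER
    return sver
```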
 def setup_share_db(self) -> None:
     al = self.args

@@ -464,7 +542,7 @@ class SvcHub(object):
     al.shr = ""
     return

-import sqlite3
+assert sqlite3  # type: ignore  # !rm

 al.shr = al.shr.strip("/")
 if "/" in al.shr or not al.shr:
@@ -475,34 +553,48 @@ class SvcHub(object):
 al.shr = "/%s/" % (al.shr,)
 al.shr1 = al.shr[1:]

-create = True
-modified = False
+# policy:
+# the shares-db is important, so panic if something is wrong
+
 db_path = self.args.shr_db
-self.log("root", "opening shares-db %s" % (db_path,))
-for n in range(2):
+db_lock = db_path + ".lock"
+try:
+    create = not os.path.getsize(db_path)
+except:
+    create = True
+zs = "creating new" if create else "opening"
+self.log("root", "%s shares-db %s" % (zs, db_path))
+
+sver = 0
 try:
     db = sqlite3.connect(db_path)
     cur = db.cursor()
-    try:
+    if not create:
+        zs = "select v from kv where k='sver'"
+        sver = cur.execute(zs).fetchall()[0][0]
+        if sver > VER_SHARES_DB:
+            zs = "this version of copyparty only understands shares-db v%d and older; the db is v%d"
+            raise Exception(zs % (VER_SHARES_DB, sver))
         cur.execute("select count(*) from sh").fetchone()
-        create = False
-        break
-    except:
-        pass
 except Exception as ex:
-    if n:
-        raise
-    t = "shares-db corrupt; deleting and recreating: %r"
-    self.log("root", t % (ex,), 3)
-    try:
-        cur.close()  # type: ignore
-    except:
-        pass
-    try:
-        db.close()  # type: ignore
-    except:
-        pass
-    os.unlink(db_path)
+    t = "could not open shares-db; will now panic...\nthe following database must be repaired or deleted before you can launch copyparty:\n%s\n\nERROR: %s\n\nadditional details:\n%s\n"
+    self.log("root", t % (db_path, ex, min_ex()), 1)
+    raise
+
+try:
+    zil = cur.execute("select v from kv where k='pid'").fetchall()
+    if len(zil) > 1:
+        raise Exception()
+    owner = zil[0][0]
+except:
+    owner = 0
+
+if not lock_file(db_lock):
+    t = "the shares-db [%s] is already in use by another copyparty instance (pid:%d). This is not supported; please provide another database with --shr-db or give this copyparty-instance its entirely separate config-folder by setting another path in the XDG_CONFIG_HOME env-var. You can also disable this safeguard by setting env-var PRTY_NO_DB_LOCK=1. Will now panic."
+    t = t % (db_path, owner)
+    self.log("root", t, 1)
+    raise Exception(t)
 sch1 = [
     r"create table kv (k text, v int)",

@@ -514,34 +606,37 @@ class SvcHub(object):
     r"create index sf_k on sf(k)",
     r"create index sh_k on sh(k)",
     r"create index sh_t1 on sh(t1)",
-    r"insert into kv values ('sver', 2)",
 ]

 assert db  # type: ignore  # !rm
 assert cur  # type: ignore  # !rm

-if create:
-    dver = 2
-    modified = True
+if not sver:
+    sver = VER_SHARES_DB
     for cmd in sch1 + sch2:
         cur.execute(cmd)
     self.log("root", "created new shares-db")
-else:
-    (dver,) = cur.execute("select v from kv where k = 'sver'").fetchall()[0]

-if dver == 1:
-    modified = True
+if sver == 1:
     for cmd in sch2:
         cur.execute(cmd)
     cur.execute("update sh set st = 0")
     self.log("root", "shares-db schema upgrade ok")

-if modified:
-    for cmd in [
-        r"delete from kv where k = 'sver'",
-        r"insert into kv values ('sver', %d)" % (2,),
-    ]:
-        cur.execute(cmd)
-    db.commit()
+if sver < VER_SHARES_DB:
+    cur.execute("delete from kv where k='sver'")
+    cur.execute("insert into kv values('sver',?)", (VER_SHARES_DB,))

+vars = (("pid", os.getpid()), ("ts", int(time.time() * 1000)))
+if owner:
+    # wear-estimate: same as sessions-db
+    for k, v in vars:
+        cur.execute("update kv set v=? where k=?", (v, k))
+else:
+    for k, v in vars:
+        cur.execute("insert into kv values(?, ?)", (k, v))
+db.commit()

 cur.close()
 db.close()
@@ -679,10 +774,11 @@ class SvcHub(object):
     t += ", "
 t += "\033[0mNG: \033[35m" + sng
-t += "\033[0m, see --deps"
-self.log("dependencies", t, 6)
+t += "\033[0m, see --deps (this is fine btw)"
+self.log("optional-dependencies", t, 6)

 def _check_env(self) -> None:
+    al = self.args
     try:
         files = os.listdir(E.cfg)
     except:
@@ -699,6 +795,21 @@ class SvcHub(object):
 if self.args.bauth_last:
     self.log("root", "WARNING: ignoring --bauth-last due to --no-bauth", 3)

+have_tcp = False
+for zs in al.i:
+    if not zs.startswith("unix:"):
+        have_tcp = True
+if not have_tcp:
+    zb = False
+    zs = "z zm zm4 zm6 zmv zmvv zs zsv zv"
+    for zs in zs.split():
+        if getattr(al, zs, False):
+            setattr(al, zs, False)
+            zb = True
+    if zb:
+        t = "only listening on unix-sockets; cannot enable zeroconf/mdns/ssdp as requested"
+        self.log("root", t, 3)
+
 if not self.args.no_dav:
     from .dxml import DXML_OK
@@ -763,7 +874,7 @@ class SvcHub(object):
     vl = [os.path.expandvars(os.path.expanduser(x)) for x in vl]
     setattr(al, k, vl)

-for k in "lo hist ssl_log".split(" "):
+for k in "lo hist dbpath ssl_log".split(" "):
     vs = getattr(al, k)
     if vs:
         vs = os.path.expandvars(os.path.expanduser(vs))


@@ -54,6 +54,7 @@ def gen_fdesc(sz: int, crc32: int, z64: bool) -> bytes:
 def gen_hdr(
     h_pos: Optional[int],
+    z64: bool,
     fn: str,
     sz: int,
     lastmod: int,

@@ -70,7 +71,6 @@ def gen_hdr(
 # appnote 4.5 / zip 3.0 (2008) / unzip 6.0 (2009) says to add z64
 # extinfo for values which exceed H, but that becomes an off-by-one
 # (can't tell if it was clamped or exactly maxval), make it obvious
-z64 = sz >= 0xFFFFFFFF
 z64v = [sz, sz] if z64 else []
 if h_pos and h_pos >= 0xFFFFFFFF:
     # central, also consider ptr to original header

@@ -244,6 +244,7 @@ class StreamZip(StreamArc):
 sz = st.st_size
 ts = st.st_mtime
+h_pos = self.pos

 crc = 0
 if self.pre_crc:

@@ -252,8 +253,12 @@ class StreamZip(StreamArc):
     crc &= 0xFFFFFFFF

-h_pos = self.pos
-buf = gen_hdr(None, name, sz, ts, self.utf8, crc, self.pre_crc)
+# some unzip-programs expect a 64bit data-descriptor
+# even if the only 32bit-exceeding value is the offset,
+# so force that by placeholdering the filesize too
+z64 = h_pos >= 0xFFFFFFFF or sz >= 0xFFFFFFFF
+
+buf = gen_hdr(None, z64, name, sz, ts, self.utf8, crc, self.pre_crc)
 yield self._ct(buf)

 for buf in yieldfile(src, self.args.iobuf):

@@ -266,8 +271,6 @@ class StreamZip(StreamArc):
 self.items.append((name, sz, ts, crc, h_pos))

-z64 = sz >= 4 * 1024 * 1024 * 1024
-
 if z64 or not self.pre_crc:
     buf = gen_fdesc(sz, crc, z64)
     yield self._ct(buf)

@@ -306,7 +309,8 @@ class StreamZip(StreamArc):
 cdir_pos = self.pos
 for name, sz, ts, crc, h_pos in self.items:
-    buf = gen_hdr(h_pos, name, sz, ts, self.utf8, crc, self.pre_crc)
+    z64 = h_pos >= 0xFFFFFFFF or sz >= 0xFFFFFFFF
+    buf = gen_hdr(h_pos, z64, name, sz, ts, self.utf8, crc, self.pre_crc)
     mbuf += self._ct(buf)
     if len(mbuf) >= 16384:
         yield mbuf
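Review note: the zip64 decision introduced here is a single predicate applied to both the local header and the central directory, so both agree on field widths. A sketch (the function name is mine):

```python
MAX32 = 0xFFFFFFFF

def needs_zip64(h_pos: int, sz: int) -> bool:
    # the 64bit data-descriptor is emitted eagerly: even when only the
    # header offset exceeds 32 bits, the size fields are widened too,
    # so unzip-programs that expect matching widths (and zipbomb
    # detectors like info-zip's) stay happy
    return h_pos >= MAX32 or sz >= MAX32
```

Note the `>=` rather than `>`: a value of exactly 0xFFFFFFFF is ambiguous with the "clamped" sentinel, so it is pushed into the zip64 extinfo as well.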


@@ -566,7 +566,7 @@ class TcpSrv(object):
     ip = None

 ips = list(t1) + list(t2)
 qri = self.args.qri
-if self.args.zm and not qri:
+if self.args.zm and not qri and ips:
     name = self.args.name + ".local"
     t1[name] = next(v for v in (t1 or t2).values())
     ips = [name] + ips


@@ -1,13 +1,15 @@
 # coding: utf-8
 from __future__ import print_function, unicode_literals

+import errno
 import os
+import stat

 from .__init__ import TYPE_CHECKING
 from .authsrv import VFS
 from .bos import bos
 from .th_srv import EXTS_AC, HAVE_WEBP, thumb_path
-from .util import Cooldown
+from .util import Cooldown, Pebkac

 if True:  # pylint: disable=using-constant-test
     from typing import Optional, Union

@@ -16,6 +18,9 @@ if TYPE_CHECKING:
     from .httpsrv import HttpSrv

+IOERROR = "reading the file was denied by the server os; either due to filesystem permissions, selinux, apparmor, or similar:\n%r"
+
 class ThumbCli(object):
     def __init__(self, hsrv: "HttpSrv") -> None:
         self.broker = hsrv.broker

@@ -124,7 +129,7 @@ class ThumbCli(object):
 tpath = thumb_path(histpath, rem, mtime, fmt, self.fmt_ffa)
 tpaths = [tpath]

-if fmt == "w":
+if fmt[:1] == "w":
     # also check for jpg (maybe webp is unavailable)
     tpaths.append(tpath.rsplit(".", 1)[0] + ".jpg")

@@ -157,8 +162,22 @@ class ThumbCli(object):
 if abort:
     return None

-if not bos.path.getsize(os.path.join(ptop, rem)):
+ap = os.path.join(ptop, rem)
+try:
+    st = bos.stat(ap)
+    if not st.st_size or not stat.S_ISREG(st.st_mode):
         return None
+    with open(ap, "rb", 4) as f:
+        if not f.read(4):
+            raise Exception()
+except OSError as ex:
+    if ex.errno == errno.ENOENT:
+        raise Pebkac(404)
+    else:
+        raise Pebkac(500, IOERROR % (ex,))
+except Exception as ex:
+    raise Pebkac(500, IOERROR % (ex,))

 x = self.broker.ask("thumbsrv.get", ptop, rem, mtime, fmt)
 return x.get()  # type: ignore
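Review note: the probe added above distinguishes "no thumbnail possible" (empty or non-regular file, return None), "file is gone" (404), and "the OS denied the read" (500). A standalone sketch of the same mapping, with a stand-in `HttpErr` for copyparty's `Pebkac`:

```python
import errno
import os
import stat

class HttpErr(Exception):  # stand-in for copyparty's Pebkac
    def __init__(self, code: int, msg: str = ""):
        super().__init__(msg)
        self.code = code

def probe_thumb_src(ap: str) -> bool:
    """True if ap looks convertible; False if a thumbnail is pointless."""
    try:
        st = os.stat(ap)
        if not st.st_size or not stat.S_ISREG(st.st_mode):
            return False  # empty file, fifo, device-node, ...
        with open(ap, "rb") as f:
            if not f.read(4):
                raise Exception("empty read")
    except OSError as ex:
        if ex.errno == errno.ENOENT:
            raise HttpErr(404)
        raise HttpErr(500, "io denied: %r" % (ex,))
    except Exception as ex:
        raise HttpErr(500, "io denied: %r" % (ex,))
    return True
```

Doing the tiny read up-front surfaces permission/selinux/apparmor denials as a clear 500 instead of a mysterious conversion failure later.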


@@ -4,8 +4,10 @@ from __future__ import print_function, unicode_literals
 import hashlib
 import logging
 import os
+import re
 import shutil
 import subprocess as sp
+import tempfile
 import threading
 import time

@@ -18,6 +20,7 @@ from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, au_unpk, ffprobe
 from .util import BytesIO  # type: ignore
 from .util import (
     FFMPEG_URL,
+    VF_CAREFUL,
     Cooldown,
     Daemon,
     afsenc,

@@ -48,6 +51,10 @@ HAVE_WEBP = False
 EXTS_TH = set(["jpg", "webp", "png"])
 EXTS_AC = set(["opus", "owa", "caf", "mp3"])

+EXTS_SPEC_SAFE = set("aif aiff flac mp3 opus wav".split())
+
+PTN_TS = re.compile("^-?[0-9a-f]{8,10}$")
+
 try:
     if os.environ.get("PRTY_NO_PIL"):

@@ -163,12 +170,15 @@ class ThumbSrv(object):
     self.mutex = threading.Lock()
     self.busy: dict[str, list[threading.Condition]] = {}
+    self.untemp: dict[str, list[str]] = {}
     self.ram: dict[str, float] = {}
     self.memcond = threading.Condition(self.mutex)
     self.stopping = False
     self.rm_nullthumbs = True  # forget failed conversions on startup
     self.nthr = max(1, self.args.th_mt)
+    self.exts_spec_unsafe = set(self.args.th_spec_cnv.split(","))

     self.q: Queue[Optional[tuple[str, str, str, VFS]]] = Queue(self.nthr * 4)
     for n in range(self.nthr):
         Daemon(self.worker, "thumb-{}-{}".format(n, self.nthr))
@@ -385,8 +395,12 @@ class ThumbSrv(object):
     self.log(msg, c)
     if getattr(ex, "returncode", 0) != 321:
         if fun == funs[-1]:
+            try:
                 with open(ttpath, "wb") as _:
                     pass
+            except Exception as ex:
+                t = "failed to create the file [%s]: %r"
+                self.log(t % (ttpath, ex), 3)
     else:
         # ffmpeg may spawn empty files on windows
         try:

@@ -399,13 +413,24 @@ class ThumbSrv(object):
 try:
     wrename(self.log, ttpath, tpath, vn.flags)
-except:
+except Exception as ex:
+    if not os.path.exists(tpath):
+        t = "failed to move [%s] to [%s]: %r"
+        self.log(t % (ttpath, tpath, ex), 3)
     pass

+untemp = []
 with self.mutex:
     subs = self.busy[tpath]
     del self.busy[tpath]
     self.ram.pop(ttpath, None)
+    untemp = self.untemp.pop(ttpath, None) or []
+
+for ap in untemp:
+    try:
+        wunlink(self.log, ap, VF_CAREFUL)
+    except:
+        pass

 for x in subs:
     with x:
@@ -659,15 +684,43 @@ class ThumbSrv(object):
 if "ac" not in ret:
     raise Exception("not audio")

+fext = abspath.split(".")[-1].lower()
+
 # https://trac.ffmpeg.org/ticket/10797
 # expect 1 GiB every 600 seconds when duration is tricky;
 # simple filetypes are generally safer so let's special-case those
-safe = ("flac", "wav", "aif", "aiff", "opus")
-coeff = 1800 if abspath.split(".")[-1].lower() in safe else 600
-dur = ret[".dur"][1] if ".dur" in ret else 300
+coeff = 1800 if fext in EXTS_SPEC_SAFE else 600
+dur = ret[".dur"][1] if ".dur" in ret else 900
 need = 0.2 + dur / coeff
 self.wait4ram(need, tpath)

+infile = abspath
+if dur >= 900 or fext in self.exts_spec_unsafe:
+    with tempfile.NamedTemporaryFile(suffix=".spec.flac", delete=False) as f:
+        f.write(b"h")
+        infile = f.name
+    try:
+        self.untemp[tpath].append(infile)
+    except:
+        self.untemp[tpath] = [infile]
+    # fmt: off
+    cmd = [
+        b"ffmpeg",
+        b"-nostdin",
+        b"-v", b"error",
+        b"-hide_banner",
+        b"-i", fsenc(abspath),
+        b"-map", b"0:a:0",
+        b"-ac", b"1",
+        b"-ar", b"48000",
+        b"-sample_fmt", b"s16",
+        b"-t", b"900",
+        b"-y", fsenc(infile),
+    ]
+    # fmt: on
+    self._run_ff(cmd, vn)
+
 fc = "[0:a:0]aresample=48000{},showspectrumpic=s="
 if "3" in fmt:
     fc += "1280x1024,crop=1420:1056:70:48[o]"
@@ -687,7 +740,7 @@ class ThumbSrv(object):
     b"-nostdin",
     b"-v", b"error",
     b"-hide_banner",
-    b"-i", fsenc(abspath),
+    b"-i", fsenc(infile),
     b"-filter_complex", fc.encode("utf-8"),
     b"-map", b"[o]",
     b"-frames:v", b"1",

@@ -991,6 +1044,8 @@ class ThumbSrv(object):
 # thumb file
 try:
     b64, ts, ext = f.split(".")
+    if len(ts) > 8 and PTN_TS.match(ts):
+        ts = "yeahokay"
     if len(b64) != 24 or len(ts) != 8 or ext not in exts:
         raise Exception()
 except:


@@ -134,9 +134,9 @@ class U2idx(object):
 assert sqlite3  # type: ignore  # !rm

 ptop = vn.realpath
-histpath = self.asrv.vfs.histtab.get(ptop)
+histpath = self.asrv.vfs.dbpaths.get(ptop)
 if not histpath:
-    self.log("no histpath for %r" % (ptop,))
+    self.log("no dbpath for %r" % (ptop,))
     return None

 db_path = os.path.join(histpath, "up2k.db")


@@ -94,7 +94,7 @@ VF_AFFECTS_INDEXING = set(zsg.split(" "))
 SBUSY = "cannot receive uploads right now;\nserver busy with %s.\nPlease wait; the client will retry..."

-HINT_HISTPATH = "you could try moving the database to another location (preferably an SSD or NVME drive) using either the --hist argument (global option for all volumes), or the hist volflag (just for this volume)"
+HINT_HISTPATH = "you could try moving the database to another location (preferably an SSD or NVME drive) using either the --hist argument (global option for all volumes), or the hist volflag (just for this volume), or, if you want to keep the thumbnails in the current location and only move the database itself, then use --dbpath or volflag dbpath"

 NULLSTAT = os.stat_result((0, -1, -1, 0, 0, 0, 0, 0, 0, 0))

@@ -1096,9 +1096,9 @@ class Up2k(object):
     self, ptop: str, flags: dict[str, Any]
 ) -> Optional[tuple["sqlite3.Cursor", str]]:
     """mutex(main,reg) me"""
-    histpath = self.vfs.histtab.get(ptop)
+    histpath = self.vfs.dbpaths.get(ptop)
     if not histpath:
-        self.log("no histpath for %r" % (ptop,))
+        self.log("no dbpath for %r" % (ptop,))
         return None

     db_path = os.path.join(histpath, "up2k.db")

@@ -1344,12 +1344,15 @@ class Up2k(object):
     ]
     excl += [absreal(x) for x in excl]
     excl += list(self.vfs.histtab.values())
+    excl += list(self.vfs.dbpaths.values())
     if WINDOWS:
         excl = [x.replace("/", "\\") for x in excl]
     else:
         # ~/.wine/dosdevices/z:/ and such
         excl.extend(("/dev", "/proc", "/run", "/sys"))

+    excl = list({k: 1 for k in excl})
+
     if self.args.re_dirsz:
         db.c.execute("delete from ds")
         db.n += 1

@@ -5102,7 +5105,7 @@ class Up2k(object):
 def _snap_reg(self, ptop: str, reg: dict[str, dict[str, Any]]) -> None:
     now = time.time()
-    histpath = self.vfs.histtab.get(ptop)
+    histpath = self.vfs.dbpaths.get(ptop)
     if not histpath:
         return


@@ -114,8 +114,14 @@ IP6ALL = "0:0:0:0:0:0:0:0"
 try:
-    import ctypes
     import fcntl
+
+    HAVE_FCNTL = True
+except:
+    HAVE_FCNTL = False
+
+try:
+    import ctypes
     import termios
 except:
     pass
@@ -246,7 +252,7 @@ SYMTIME = PY36 and os.utime in os.supports_follow_symlinks
 META_NOBOTS = '<meta name="robots" content="noindex, nofollow">\n'

 # smart enough to understand javascript while also ignoring rel="nofollow"
-BAD_BOTS = r"Barkrowler|bingbot|BLEXBot|Googlebot|GPTBot|PetalBot|SeekportBot|SemrushBot|YandexBot"
+BAD_BOTS = r"Barkrowler|bingbot|BLEXBot|Googlebot|GoogleOther|GPTBot|PetalBot|SeekportBot|SemrushBot|YandexBot"

 FFMPEG_URL = "https://www.gyan.dev/ffmpeg/builds/ffmpeg-git-full.7z"

@@ -466,6 +472,8 @@ FN_EMB = set([".prologue.html", ".epilogue.html", "readme.md", "preadme.md"])
 def read_ram() -> tuple[float, float]:
+    # NOTE: apparently no need to consider /sys/fs/cgroup/memory.max
+    # (cgroups2) since the limit is synced to /proc/meminfo
     a = b = 0
     try:
         with open("/proc/meminfo", "rb", 0x10000) as f:
@@ -1546,6 +1554,12 @@ def vol_san(vols: list["VFS"], txt: bytes) -> bytes:
     txt = txt.replace(bap.replace(b"\\", b"\\\\"), bvp)
     txt = txt.replace(bhp.replace(b"\\", b"\\\\"), bvph)

+    if vol.histpath != vol.dbpath:
+        bdp = vol.dbpath.encode("utf-8")
+        bdph = b"$db(/" + bvp + b")"
+        txt = txt.replace(bdp, bdph)
+        txt = txt.replace(bdp.replace(b"\\", b"\\\\"), bdph)
+
 if txt != txt0:
     txt += b"\r\nNOTE: filepaths sanitized; see serverlog for correct values"
@@ -3934,8 +3948,75 @@ def hidedir(dp) -> None:
         pass

+_flocks = {}
+
+def _lock_file_noop(ap: str) -> bool:
+    return True
+
+def _lock_file_ioctl(ap: str) -> bool:
+    assert fcntl  # type: ignore  # !rm
+    try:
+        fd = _flocks.pop(ap)
+        os.close(fd)
+    except:
+        pass
+
+    fd = os.open(ap, os.O_RDWR | os.O_CREAT, 438)
+
+    # NOTE: the fcntl.lockf identifier is (pid,node);
+    # the lock will be dropped if os.close(os.open(ap))
+    # is performed anywhere else in this thread
+    try:
+        fcntl.lockf(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
+        _flocks[ap] = fd
+        return True
+    except Exception as ex:
+        eno = getattr(ex, "errno", -1)
+        try:
+            os.close(fd)
+        except:
+            pass
+        if eno in (errno.EAGAIN, errno.EACCES):
+            return False
+        print("WARNING: unexpected errno %d from fcntl.lockf; %r" % (eno, ex))
+        return True
+
+def _lock_file_windows(ap: str) -> bool:
+    try:
+        import msvcrt
+
+        try:
+            fd = _flocks.pop(ap)
+            os.close(fd)
+        except:
+            pass
+
+        fd = os.open(ap, os.O_RDWR | os.O_CREAT, 438)
+        msvcrt.locking(fd, msvcrt.LK_NBLCK, 1)
+        return True
+    except Exception as ex:
+        eno = getattr(ex, "errno", -1)
+        if eno == errno.EACCES:
+            return False
+        print("WARNING: unexpected errno %d from msvcrt.locking; %r" % (eno, ex))
+        return True
+
+if os.environ.get("PRTY_NO_DB_LOCK"):
+    lock_file = _lock_file_noop
+elif ANYWIN:
+    lock_file = _lock_file_windows
+elif HAVE_FCNTL:
+    lock_file = _lock_file_ioctl
+else:
+    lock_file = _lock_file_noop
+
 try:
-    if sys.version_info < (3, 10):
+    if sys.version_info < (3, 10) or os.environ.get("PRTY_NO_IMPRESO"):
         # py3.8 doesn't have .files
         # py3.9 has broken .is_file
         raise ImportError()


@@ -1151,10 +1151,10 @@ html.y #widget.open {
     background: #fff;
     background: var(--bg-u3);
 }
-#wfs, #wfm, #wzip, #wnp {
+#wfs, #wfm, #wzip, #wnp, #wm3u {
     display: none;
 }
-#wfs, #wzip, #wnp {
+#wfs, #wzip, #wnp, #wm3u {
     margin-right: .2em;
     padding-right: .2em;
     border: 1px solid var(--bg-u5);
@@ -1175,6 +1175,7 @@ html.y #widget.open {
     line-height: 1em;
 }
 #wtoggle.sel #wzip,
+#wtoggle.m3u #wm3u,
 #wtoggle.np #wnp {
     display: inline-block;
 }
@@ -1183,6 +1184,7 @@ html.y #widget.open {
 }
 #wfm a,
 #wnp a,
+#wm3u a,
 #wzip a {
     font-size: .5em;
     padding: 0 .3em;
@@ -1190,6 +1192,10 @@ html.y #widget.open {
     position: relative;
     display: inline-block;
 }
+#wm3u a {
+    margin: -.2em .1em;
+    font-size: .45em;
+}
 #wfs {
     font-size: .36em;
     text-align: right;
@@ -1198,6 +1204,7 @@ html.y #widget.open {
     border-width: 0 .25em 0 0;
 }
 #wfm span,
+#wm3u span,
 #wnp span {
     font-size: .6em;
     display: block;
@@ -1205,6 +1212,10 @@ html.y #widget.open {
 #wnp span {
     font-size: .7em;
 }
+#wm3u span {
+    font-size: .77em;
+    padding-top: .2em;
+}
 #wfm a:not(.en) {
     opacity: .3;
     color: var(--fm-off);


@@ -144,6 +144,8 @@ var Ls = {
     "wt_seldl": "download selection as separate files$NHotkey: Y",
     "wt_npirc": "copy irc-formatted track info",
     "wt_nptxt": "copy plaintext track info",
+    "wt_m3ua": "add to m3u playlist (click <code>📻copy</code> later)",
+    "wt_m3uc": "copy m3u playlist to clipboard",
     "wt_grid": "toggle grid / list view$NHotkey: G",
     "wt_prev": "previous track$NHotkey: J",
     "wt_play": "play / pause$NHotkey: P",
@@ -271,6 +273,7 @@ var Ls = {
     "ml_eq": "audio equalizer",
     "ml_drc": "dynamic range compressor",
+    "mt_loop": "loop/repeat one song\">🔁",
     "mt_shuf": "shuffle the songs in each folder\">🔀",
     "mt_aplay": "autoplay if there is a song-ID in the link you clicked to access the server$N$Ndisabling this will also stop the page URL from being updated with song-IDs when playing music, to prevent autoplay if these settings are lost but the URL remains\">a▶",
     "mt_preload": "start loading the next song near the end for gapless playback\">preload",
@@ -279,6 +282,7 @@ var Ls = {
     "mt_fau": "on phones, prevent music from stopping if the next song doesn't preload fast enough (can make tags display glitchy)\">☕️",
     "mt_waves": "waveform seekbar:$Nshow audio amplitude in the scrubber\">~s",
     "mt_npclip": "show buttons for clipboarding the currently playing song\">/np",
+    "mt_m3u_c": "show buttons for clipboarding the$Nselected songs as m3u8 playlist entries\">📻",
     "mt_octl": "os integration (media hotkeys / osd)\">os-ctl",
     "mt_oseek": "allow seeking through os integration$N$Nnote: on some devices (iPhones),$Nthis replaces the next-song button\">seek",
     "mt_oscv": "show album cover in osd\">art",
@@ -304,6 +308,7 @@ var Ls = {
     "mb_play": "play",
     "mm_hashplay": "play this audio file?",
+    "mm_m3u": "press <code>Enter/OK</code> to Play\npress <code>ESC/Cancel</code> to Edit",
     "mp_breq": "need firefox 82+ or chrome 73+ or iOS 15+",
     "mm_bload": "now loading...",
     "mm_bconv": "converting to {0}, please wait...",
@@ -316,6 +321,7 @@ var Ls = {
     "mm_eunk": "Unknown Errol",
     "mm_e404": "Could not play audio; error 404: File not found.",
     "mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
+    "mm_e500": "Could not play audio; error 500: Check server logs.",
     "mm_e5xx": "Could not play audio; server error ",
     "mm_nof": "not finding any more audio files nearby",
     "mm_prescan": "Looking for music to play next...",
@@ -330,6 +336,7 @@ var Ls = {
     "f_bigtxt": "this file is {0} MiB large -- really view as text?",
     "fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
     "fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
+    "f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",
     "f_dls": 'the file links in the current folder have\nbeen changed into download links',
@@ -432,6 +439,10 @@ var Ls = {
     "tvt_sel": "select file &nbsp; ( for cut / copy / delete / ... )$NHotkey: S\">sel",
     "tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
+    "m3u_add1": "song added to m3u playlist",
+    "m3u_addn": "{0} songs added to m3u playlist",
+    "m3u_clip": "m3u playlist now copied to clipboard\n\nyou should create a new textfile named something.m3u and paste the playlist in that document; this will make it playable",
     "gt_vau": "don't show videos, just play the audio\">🎧",
     "gt_msel": "enable file selection; ctrl-click a file to override$N$N&lt;em&gt;when active: doubleclick a file / folder to open it&lt;/em&gt;$N$NHotkey: S\">multiselect",
     "gt_crop": "center-crop thumbnails\">crop",
@@ -747,6 +758,8 @@ var Ls = {
     "wt_seldl": "last ned de valgte filene$NSnarvei: Y",
     "wt_npirc": "kopiér sang-info (irc-formatert)",
     "wt_nptxt": "kopiér sang-info",
+    "wt_m3ua": "legg til sang i m3u-spilleliste$N(husk å klikke på <code>📻copy</code> senere)",
+    "wt_m3uc": "kopiér m3u-spillelisten til utklippstavlen",
     "wt_grid": "bytt mellom ikoner og listevisning$NSnarvei: G",
     "wt_prev": "forrige sang$NSnarvei: J",
     "wt_play": "play / pause$NSnarvei: P",
@@ -874,6 +887,7 @@ var Ls = {
     "ml_eq": "audio equalizer (tonejustering)",
     "ml_drc": "compressor (volum-utjevning)",
+    "mt_loop": "spill den samme sangen om og om igjen\">🔁",
     "mt_shuf": "sangene i hver mappe$Nspilles i tilfeldig rekkefølge\">🔀",
     "mt_aplay": "forsøk å starte avspilling hvis linken du klikket på for å åpne nettsiden inneholder en sang-ID$N$Nhvis denne deaktiveres så vil heller ikke nettside-URLen bli oppdatert med sang-ID'er når musikk spilles, i tilfelle innstillingene skulle gå tapt og nettsiden lastes på ny\">a▶",
     "mt_preload": "hent ned litt av neste sang i forkant,$Nslik at pausen i overgangen blir mindre\">forles",
@@ -882,6 +896,7 @@ var Ls = {
     "mt_fau": "for telefoner: forhindre at avspilling stopper hvis nettet er for tregt til å laste neste sang i tide. Hvis påskrudd, kan forårsake at sang-info ikke vises korrekt i OS'et\">☕️",
     "mt_waves": "waveform seekbar:$Nvis volumkurve i avspillingsfeltet\">~s",
     "mt_npclip": "vis knapper for å kopiere info om sangen du hører på\">/np",
+    "mt_m3u_c": "vis knapper for å kopiere de valgte$Nsangene som innslag i en m3u8 spilleliste\">📻",
     "mt_octl": "integrering med operativsystemet (fjernkontroll, info-skjerm)\">os-ctl",
     "mt_oseek": "tillat spoling med fjernkontroll$N$Nmerk: på noen enheter (iPhones) så vil$Ndette erstatte knappen for neste sang\">spoling",
     "mt_oscv": "vis album-cover på infoskjermen\">bilde",
@@ -907,6 +922,7 @@ var Ls = {
     "mb_play": "lytt",
     "mm_hashplay": "spill denne sangen?",
+    "mm_m3u": "trykk <code>Enter/OK</code> for å spille\ntrykk <code>ESC/Avbryt</code> for å redigere",
     "mp_breq": "krever firefox 82+, chrome 73+, eller iOS 15+",
     "mm_bload": "laster inn...",
     "mm_bconv": "konverterer til {0}, vent litt...",
@@ -919,6 +935,7 @@ var Ls = {
     "mm_eunk": "Ukjent feil",
     "mm_e404": "Avspilling feilet: Fil ikke funnet.",
     "mm_e403": "Avspilling feilet: Tilgang nektet.\n\nKanskje du ble logget ut?\nPrøv å trykk F5 for å laste siden på nytt.",
+    "mm_e500": "Avspilling feilet: Rusk i maskineriet, sjekk serverloggen.",
     "mm_e5xx": "Avspilling feilet: ",
     "mm_nof": "finner ikke flere sanger i nærheten",
     "mm_prescan": "Leter etter neste sang...",
@@ -933,6 +950,7 @@ var Ls = {
     "f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
     "fbd_more": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_more">vis {2}</a> eller <a href="#" id="bd_all">vis alle</a></div>',
     "fbd_all": '<div id="blazy">viser <code>{0}</code> av <code>{1}</code> filer; <a href="#" id="bd_all">vis alle</a></div>',
+    "f_anota": "kun {0} av totalt {1} elementer ble markert;\nfor å velge alt må du bla til bunnen av mappen først",
     "f_dls": 'linkene i denne mappen er nå\nomgjort til nedlastningsknapper',
@@ -1035,6 +1053,10 @@ var Ls = {
     "tvt_sel": "markér filen &nbsp; ( for utklipp / sletting / ... )$NSnarvei: S\">merk",
     "tvt_edit": "redigér filen$NSnarvei: E\">✏️ endre",
+    "m3u_add1": "sangen ble lagt til i m3u-spillelisten",
+    "m3u_addn": "{0} sanger ble lagt til i m3u-spillelisten",
+    "m3u_clip": "m3u-spillelisten ble kopiert til utklippstavlen\n\nneste steg er å opprette et tekstdokument med filnavn som slutter på <code>.m3u</code> og lime inn spillelisten der",
     "gt_vau": "ikke vis videofiler, bare spill lyden\">🎧",
     "gt_msel": "markér filer istedenfor å åpne dem; ctrl-klikk filer for å overstyre$N$N&lt;em&gt;når aktiv: dobbelklikk en fil / mappe for å åpne&lt;/em&gt;$N$NSnarvei: S\">markering",
     "gt_crop": "beskjær ikonene så de passer bedre\">✂",
@@ -1477,6 +1499,7 @@ var Ls = {
     "ml_eq": "音频均衡器",
     "ml_drc": "动态范围压缩器",
+    "mt_loop": "循环播放当前的歌曲\">🔁", //m
     "mt_shuf": "在每个文件夹中随机播放歌曲\">🔀",
     "mt_aplay": "如果链接中有歌曲 ID则自动播放,禁用此选项将停止在播放音乐时更新页面 URL 中的歌曲 ID以防止在设置丢失但 URL 保留时自动播放\">自动播放▶",
     "mt_preload": "在歌曲快结束时开始加载下一首歌,以实现无缝播放\">预加载",
@@ -1522,6 +1545,7 @@ var Ls = {
     "mm_eunk": "未知错误",
     "mm_e404": "无法播放音频;错误 404文件未找到。",
     "mm_e403": "无法播放音频;错误 403访问被拒绝。\n\n尝试按 F5 重新加载,也许你已被注销",
+    "mm_e500": "无法播放音频;错误 500检查服务器日志。", //m
     "mm_e5xx": "无法播放音频;服务器错误",
     "mm_nof": "附近找不到更多音频文件",
     "mm_prescan": "正在寻找下一首音乐...",
@@ -1883,6 +1907,9 @@ ebi('widget').innerHTML = (
     '</span><span id="wnp"><a' +
     ' href="#" id="npirc" tt="' + L.wt_npirc + '">📋<span>irc</span></a><a' +
     ' href="#" id="nptxt" tt="' + L.wt_nptxt + '">📋<span>txt</span></a>' +
+    '</span><span id="wm3u"><a' +
+    ' href="#" id="m3ua" tt="' + L.wt_m3ua + '">📻<span>add</span></a><a' +
+    ' href="#" id="m3uc" tt="' + L.wt_m3uc + '">📻<span>copy</span></a>' +
     '</span><a' +
     ' href="#" id="wtgrid" tt="' + L.wt_grid + '">田</a><a' +
     ' href="#" id="wtico">♫</a>' +
@@ -2289,6 +2316,7 @@ var mpl = (function () {
     ebi('op_player').innerHTML = (
         '<div><h3>' + L.cl_opts + '</h3><div>' +
+        '<a href="#" class="tgl btn" id="au_loop" tt="' + L.mt_loop + '</a>' +
         '<a href="#" class="tgl btn" id="au_shuf" tt="' + L.mt_shuf + '</a>' +
         '<a href="#" class="tgl btn" id="au_aplay" tt="' + L.mt_aplay + '</a>' +
         '<a href="#" class="tgl btn" id="au_preload" tt="' + L.mt_preload + '</a>' +
@@ -2297,6 +2325,7 @@ var mpl = (function () {
         '<a href="#" class="tgl btn" id="au_fau" tt="' + L.mt_fau + '</a>' +
         '<a href="#" class="tgl btn" id="au_waves" tt="' + L.mt_waves + '</a>' +
         '<a href="#" class="tgl btn" id="au_npclip" tt="' + L.mt_npclip + '</a>' +
+        '<a href="#" class="tgl btn" id="au_m3u_c" tt="' + L.mt_m3u_c + '</a>' +
         '<a href="#" class="tgl btn" id="au_os_ctl" tt="' + L.mt_octl + '</a>' +
         '<a href="#" class="tgl btn" id="au_os_seek" tt="' + L.mt_oseek + '</a>' +
         '<a href="#" class="tgl btn" id="au_osd_cv" tt="' + L.mt_oscv + '</a>' +
@@ -2339,7 +2368,12 @@ var mpl = (function () {
         "pb_mode": (sread('pb_mode', ['loop', 'next']) || 'next').split('-')[0],
         "os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
         'traversals': 0,
+        'm3ut': '#EXTM3U\n',
     };

+    bcfg_bind(r, 'loop', 'au_loop', false, function (v) {
+        if (mp.au)
+            mp.au.loop = v;
+    });
     bcfg_bind(r, 'shuf', 'au_shuf', false, function () {
         mp.read_order(); // don't bind
     });
@@ -2364,6 +2398,9 @@ var mpl = (function () {
     bcfg_bind(r, 'clip', 'au_npclip', false, function (v) {
         clmod(ebi('wtoggle'), 'np', v && mp.au);
     });
+    bcfg_bind(r, 'm3uen', 'au_m3u_c', false, function (v) {
+        clmod(ebi('wtoggle'), 'm3u', v && (mp.au || msel.getsel().length));
+    });
     bcfg_bind(r, 'follow', 'au_follow', false, setaufollow);
     bcfg_bind(r, 'ac_flac', 'ac_flac', true);
     bcfg_bind(r, 'ac_aac', 'ac_aac', false);
@@ -2552,7 +2589,7 @@ var mpl = (function () {
     ebi('np_artist').textContent = np.artist || (fns.length > 1 ? fns[0] : '');
     ebi('np_title').textContent = np.title || '';
     ebi('np_dur').textContent = np['.dur'] || '';
-    ebi('np_url').textContent = get_vpath() + np.file.split('?')[0];
+    ebi('np_url').textContent = uricom_dec(get_evpath()) + np.file.split('?')[0];
     if (!MOBILE && cover)
         ebi('np_img').setAttribute('src', cover);
     else
@@ -2617,6 +2654,7 @@ if (can_owa && APPLE && / OS ([1-9]|1[0-7])_/.test(UA))
     mpl.init_ac2();

+var re_m3u = /\.(m3u8?)$/i;
 var re_au_native = (can_ogg || have_acode) ? /\.(aac|flac|m4a|mp3|ogg|opus|wav)$/i : /\.(aac|flac|m4a|mp3|wav)$/i,
     re_au_all = /\.(aac|ac3|aif|aiff|alac|alaw|amr|ape|au|dfpwm|dts|flac|gsm|it|itgz|itxz|itz|m4a|mdgz|mdxz|mdz|mo3|mod|mp2|mp3|mpc|mptm|mt2|mulaw|ogg|okt|opus|ra|s3m|s3gz|s3xz|s3z|tak|tta|ulaw|wav|wma|wv|xm|xmgz|xmxz|xmz|xpk|3gp|asf|avi|flv|m4v|mkv|mov|mp4|mpeg|mpeg2|mpegts|mpg|mpg2|nut|ogm|ogv|rm|ts|vob|webm|wmv)$/i;
@@ -2643,9 +2681,9 @@ function MPlayer() {
         link = link[link.length - 1];

         var url = link.getAttribute('href'),
-            m = re_audio.exec(url.split('?')[0]);
+            fn = url.split('?')[0];

-        if (m) {
+        if (re_audio.exec(fn)) {
             var tid = link.getAttribute('id');
             r.order.push(tid);
             r.tracks[tid] = url;
@@ -2653,6 +2691,11 @@ function MPlayer() {
             ebi('a' + tid).onclick = ev_play;
             clmod(trs[a], 'au', 1);
         }
+        else if (re_m3u.exec(fn)) {
+            var tid = link.getAttribute('id');
+            tds[0].innerHTML = '<a id="a' + tid + '" href="#a' + tid + '" class="play">' + L.mb_play + '</a></td>';
+            ebi('a' + tid).onclick = ev_load_m3u;
+        }
     }

     r.vol = clamp(fcfg_get('vol', IPHONE ? 1 : dvol / 100), 0, 1);
@@ -2818,6 +2861,14 @@ function MPlayer() {
         r.fau.loop = true;
         r.fau.play();
     };

+    r.set_ev = function () {
+        mp.au.onended = evau_end;
+        mp.au.onerror = evau_error;
+        mp.au.onprogress = pbar.drawpos;
+        mp.au.onplaying = mpui.progress_updater;
+        mp.au.onloadeddata = mp.au.onloadedmetadata = mp.nopause;
+    };
 }
@@ -2859,6 +2910,8 @@ var widget = (function () {
         wtico = ebi('wtico'),
         nptxt = ebi('nptxt'),
         npirc = ebi('npirc'),
+        m3ua = ebi('m3ua'),
+        m3uc = ebi('m3uc'),
         touchmode = false,
         was_paused = true;
@@ -2917,6 +2970,49 @@ var widget = (function () {
             toast.ok(1, L.clipped, null, 'top');
         });
     };
+    m3ua.onclick = function (e) {
+        ev(e);
+        var el,
+            files = [],
+            sel = msel.getsel();
+
+        for (var a = 0; a < sel.length; a++) {
+            el = ebi(sel[a].id).closest('tr');
+            if (clgot(el, 'au'))
+                files.push(el);
+        }
+        el = QS('#files tr.play');
+        if (!sel.length && el)
+            files.push(el);
+
+        for (var a = 0; a < files.length; a++) {
+            var md = ft2dict(files[a])[0],
+                dur = md['.dur'] || '1',
+                tag = '';
+
+            if (md.artist && md.title)
+                tag = md.artist + ' - ' + md.title;
+            else if (md.artist)
+                tag = md.artist + ' - ' + md.file;
+            else if (md.title)
+                tag = md.title;
+
+            if (dur.indexOf(':') > 0) {
+                dur = dur.split(':');
+                dur = 60 * parseInt(dur[0]) + parseInt(dur[1]);
+            }
+            else dur = parseInt(dur);
+
+            mpl.m3ut += '#EXTINF:' + dur + ',' + tag + '\n' + uricom_dec(get_evpath()) + md.file + '\n';
+        }
+        toast.ok(2, files.length == 1 ? L.m3u_add1 : L.m3u_addn.format(files.length), null, 'top');
+    };
+    m3uc.onclick = function (e) {
+        ev(e);
+        cliptxt(mpl.m3ut, function () {
+            toast.ok(15, L.m3u_clip, null, 'top');
+        });
+    };

     r.set(sread('au_open') == 1);
     setTimeout(function () {
         clmod(widget, 'anim', 1);
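The duration/tag handling in `m3ua.onclick` maps per-track metadata to one `#EXTINF` playlist line; a Python restatement for illustration (function name and arguments are mine, not part of the diff):

```python
def extinf(dur, artist, title, fname, basedir):
    # tag preference: artist+title, then artist+filename, then title alone
    if artist and title:
        tag = artist + ' - ' + title
    elif artist:
        tag = artist + ' - ' + fname
    else:
        tag = title or ''
    # durations arrive as "m:ss" or as plain seconds; EXTINF wants seconds
    if ':' in dur:
        m, s = dur.split(':')[:2]
        secs = 60 * int(m) + int(s)
    else:
        secs = int(dur)
    return '#EXTINF:%d,%s\n%s%s\n' % (secs, tag, basedir, fname)
```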
@@ -4055,10 +4151,7 @@ function play(tid, is_ev, seek) {
     else {
         mp.au = new Audio();
         mp.au2 = new Audio();
-        mp.au.onerror = evau_error;
-        mp.au.onprogress = pbar.drawpos;
-        mp.au.onplaying = mpui.progress_updater;
-        mp.au.onended = next_song;
+        mp.set_ev();
         widget.open();
     }
     mp.init_fau();
@@ -4071,13 +4164,9 @@ function play(tid, is_ev, seek) {
     var t = mp.au;
     mp.au = mp.au2;
     mp.au2 = t;
-    t.onerror = t.onprogress = t.onended = null;
+    t.onerror = t.onprogress = t.onended = t.loop = null;
     t.ld = 0; //owa
-    mp.au.onerror = evau_error;
-    mp.au.onprogress = pbar.drawpos;
-    mp.au.onplaying = mpui.progress_updater;
-    mp.au.onloadeddata = mp.au.onloadedmetadata = mp.nopause;
-    mp.au.onended = next_song;
+    mp.set_ev();
     t = mp.au.currentTime;
     if (isNum(t) && t > 0.1)
         mp.au.currentTime = 0;
@@ -4107,6 +4196,7 @@ function play(tid, is_ev, seek) {
     clmod(ebi(oid), 'act', 1);
     clmod(ebi(oid).closest('tr'), 'play', 1);
     clmod(ebi('wtoggle'), 'np', mpl.clip);
+    clmod(ebi('wtoggle'), 'm3u', mpl.m3uen);
     if (thegrid)
         thegrid.loadsel();
@@ -4115,6 +4205,7 @@ function play(tid, is_ev, seek) {
     try {
         mp.nopause();
+        mp.au.loop = mpl.loop;
         if (mpl.aplay || is_ev !== -1)
             mp.au.play();
@@ -4159,6 +4250,15 @@ function scroll2playing() {
 }

+function evau_end(e) {
+    if (!mpl.loop)
+        return next_song(e);
+
+    ev(e);
+    mp.au.currentTime = 0;
+    mp.au.play();
+}
+
 // event from the audio object if something breaks
 function evau_error(e) {
     var err = '',
@@ -4202,6 +4302,7 @@ function evau_error(e) {
     }

     var em = '' + eplaya.error.message,
         mfile = '\n\nFile: «' + uricom_dec(eplaya.src.split('/').pop()) + '»',
+        e500 = L.mm_e500,
         e404 = L.mm_e404,
         e403 = L.mm_e403;
@@ -4214,6 +4315,9 @@ function evau_error(e) {
     if (em.startsWith('404: '))
         err = e404;

+    if (em.startsWith('500: '))
+        err = e500;
+
     toast.warn(15, esc(basenames(err + mfile)));
     console.log(basenames(err + mfile));
@@ -4225,7 +4329,9 @@ function evau_error(e) {
         if (this.status < 400)
             return;

-        err = this.status == 403 ? e403 : this.status == 404 ? e404 :
+        err = this.status == 403 ? e403 :
+            this.status == 404 ? e404 :
+            this.status == 500 ? e500 :
             L.mm_e5xx + this.status;

         toast.warn(15, esc(basenames(err + mfile)));
@@ -4365,6 +4471,11 @@ function eval_hash() {
         goto(v.slice(3));
         return;
     }
+    if (v.startsWith("#m3u=")) {
+        load_m3u(v.slice(5));
+        return;
+    }
 }
@@ -4424,7 +4535,8 @@ function eval_hash() {
 function read_dsort(txt) {
     dnsort = dnsort ? 1 : 0;
-    clmod(ebi('nsort'), 'on', (sread('nsort') || dnsort) == 1);
+    ENATSORT = NATSORT && (sread('nsort') || dnsort) == 1;
+    clmod(ebi('nsort'), 'on', ENATSORT);
     try {
         var zt = (('' + txt).trim() || 'href').split(/,+/g);
         dsort = [];
@@ -4470,9 +4582,6 @@ function sortfiles(nodes) {
     sopts = sopts && sopts.length ? sopts : jcp(dsort);

-    var collator = !clgot(ebi('nsort'), 'on') ? null :
-        new Intl.Collator([], {numeric: true});
-
     try {
         var is_srch = false;
         if (nodes[0]['rp']) {
@@ -4524,8 +4633,9 @@ function sortfiles(nodes) {
         }
         if (v2 === undefined) return 1 * rev;

-        var ret = rev * (typ == 'int' ? (v1 - v2) : collator ?
-            collator.compare(v1, v2) : v1.localeCompare(v2));
+        var ret = rev * (typ == 'int' ? (v1 - v2) :
+            ENATSORT ? NATSORT.compare(v1, v2) :
+            v1.localeCompare(v2));

         if (ret === 0)
             ret = onodes.indexOf(n1) - onodes.indexOf(n2);
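`NATSORT` here appears to be a shared `Intl.Collator([], {numeric: true})`, replacing the per-call `collator` that the old code built inside `sortfiles`. A rough Python equivalent of that numeric-aware ("natural") comparison, purely for illustration:

```python
import re

def natkey(s):
    # split into digit / non-digit runs; digit runs compare as integers,
    # so "track2" sorts before "track10" instead of after it
    return [int(t) if t.isdigit() else t.lower() for t in re.split(r'(\d+)', s)]
```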
@@ -4707,6 +4817,7 @@ var fileman = (function () {
         clmod(bshr, 'hide', hshr);
         clmod(ebi('wfm'), 'act', QS('#wfm a.en:not(.hide)'));
+        clmod(ebi('wtoggle'), 'm3u', mpl.m3uen && (nsel || (mp && mp.au)));

         var wfs = ebi('wfs'), h = '';
         try {
@@ -5963,7 +6074,8 @@ var showfile = (function () {
     };

     r.mktree = function () {
-        var html = ['<li class="bn">' + L.tv_lst + '<br />' + linksplit(get_vpath()).join('<span>/</span>') + '</li>'];
+        var crumbs = linksplit(get_evpath()).join('<span>/</span>'),
+            html = ['<li class="bn">' + L.tv_lst + '<br />' + crumbs + '</li>'];
         for (var a = 0; a < r.files.length; a++) {
             var file = r.files[a];
             html.push('<li><a href="?doc=' +
@@ -6538,8 +6650,8 @@ function tree_scrolltoo(q) {
     var ctr = ebi('tree'),
         em = parseFloat(getComputedStyle(act).fontSize),
         top = act.offsetTop + ul.offsetTop,
-        min = top - 11 * em,
-        max = top - (ctr.offsetHeight - 10 * em);
+        min = top - 20 * em,
+        max = top - (ctr.offsetHeight - 16 * em);

     if (ctr.scrollTop > min)
         ctr.scrollTop = Math.floor(min);
@@ -6710,7 +6822,8 @@ var ahotkeys = function (e) {
         return ebi('griden').click();
     }

-    if ((aet == 'tr' || aet == 'td') && ae.closest('#files')) {
+    var in_ftab = (aet == 'tr' || aet == 'td') && ae.closest('#files');
+    if (in_ftab) {
         var d = '', rem = 0;
         if (aet == 'td') ae = ae.closest('tr'); //ie11
         if (k == 'ArrowUp' || k == 'Up') d = 'previous';
@@ -6727,12 +6840,19 @@ var ahotkeys = function (e) {
             msel.selui();
             return ev(e);
         }
+    }
+    if (in_ftab || !aet || (ae && ae.closest('#ggrid'))) {
         if ((k == 'KeyA' || k == 'a') && ctrl(e)) {
-            var sel = msel.getsel(),
+            var ntot = treectl.lsc.files.length + treectl.lsc.dirs.length,
+                sel = msel.getsel(),
                 all = msel.getall();

             msel.evsel(e, sel.length < all.length);
             msel.origin_id(null);
+
+            if (ntot > all.length)
+                toast.warn(10, L.f_anota.format(all.length, ntot), L.f_anota);
+            else if (toast.tag == L.f_anota)
+                toast.hide();
+
             return ev(e);
         }
     }
@@ -6862,7 +6982,7 @@ var ahotkeys = function (e) {
 // search
-(function () {
+var search_ui = (function () {
     var sconf = [
         [
             L.s_sz,
@@ -6897,7 +7017,8 @@ var ahotkeys = function (e) {
] ]
]; ];
var trs = [], var r = {},
trs = [],
orig_url = null, orig_url = null,
orig_html = null, orig_html = null,
cap = 125; cap = 125;
@@ -7104,13 +7225,19 @@ var ahotkeys = function (e) {
         search_in_progress = 0;
         srch_msg(false, '');
-        var res = JSON.parse(this.responseText),
-            tagord = res.tag_order;
+        var res = JSON.parse(this.responseText);
+        r.render(res, this, true);
+    }
+    r.render = function (res, xhr, sort) {
+        var tagord = res.tag_order;
+        srch_msg(false, '');
+        if (sort)
             sortfiles(res.hits);
         var ofiles = ebi('files');
-        if (ofiles.getAttribute('ts') > this.ts)
+        if (xhr && ofiles.getAttribute('ts') > xhr.ts)
             return;
         treectl.hide();
@@ -7162,19 +7289,21 @@ var ahotkeys = function (e) {
         }
         ofiles = set_files_html(html.join('\n'));
-        ofiles.setAttribute("ts", this.ts);
-        ofiles.setAttribute("q_raw", this.q_raw);
+        ofiles.setAttribute("ts", xhr ? xhr.ts : 1);
+        ofiles.setAttribute("q_raw", xhr ? xhr.q_raw : 'playlist');
         set_vq();
         mukey.render();
         reload_browser();
         filecols.set_style(['File Name']);
-        sethash('q=' + uricom_enc(this.q_raw));
+        if (xhr)
+            sethash('q=' + uricom_enc(xhr.q_raw));
         ebi('unsearch').onclick = unsearch;
         var m = ebi('moar');
         if (m)
             m.onclick = moar;
-    }
+    };
     function unsearch(e) {
         ev(e);
@@ -7191,9 +7320,98 @@ var ahotkeys = function (e) {
         cap *= 2;
         do_search();
     }
+    return r;
 })();
+function ev_load_m3u(e) {
+    ev(e);
+    var id = this.getAttribute('id').slice(1),
+        url = ebi(id).getAttribute('href').split('?')[0];
+    modal.confirm(L.mm_m3u,
+        function () { load_m3u(url); },
+        function () {
+            if (has(perms, 'write') && has(perms, 'delete'))
+                window.location = url + '?edit';
+            else
+                showfile.show(url);
+        }
+    );
+    return false;
+}
+function load_m3u(url) {
+    var xhr = new XHR();
+    xhr.open('GET', url, true);
+    xhr.onload = render_m3u;
+    xhr.url = url;
+    xhr.send();
+    return false;
+}
+function render_m3u() {
+    if (!xhrchk(this, L.tv_xe1, L.tv_xe2))
+        return;
+    var evp = get_evpath(),
+        m3u = this.responseText,
+        xtd = m3u.slice(0, 12).indexOf('#EXTM3U') + 1,
+        lines = m3u.replace(/\r/g, '\n').split('\n'),
+        dur = 1,
+        artist = '',
+        title = '',
+        ret = {'hits': [], 'tag_order': ['artist', 'title', '.dur'], 'trunc': false};
+    for (var a = 0; a < lines.length; a++) {
+        var ln = lines[a].trim();
+        if (xtd && ln.startsWith('#')) {
+            var m = /^#EXTINF:([0-9]+)[, ](.*)/.exec(ln);
+            if (m) {
+                dur = m[1];
+                title = m[2];
+                var ofs = title.indexOf(' - ');
+                if (ofs > 0) {
+                    artist = title.slice(0, ofs);
+                    title = title.slice(ofs + 3);
+                }
+            }
+            continue;
+        }
+        if (ln.indexOf('.') < 0)
+            continue;
+        var n = ret.hits.length + 1,
+            url = ln;
+        if (url.indexOf(':\\')) // C:\
+            url = url.split(/\\/g).pop();
+        url = url.replace(/\\/g, '/');
+        url = uricom_enc(url).replace(/%2f/gi, '/')
+        if (!url.startsWith('/'))
+            url = vjoin(evp, url);
+        ret.hits.push({
+            "ts": 946684800 + n,
+            "sz": 100000 + n,
+            "rp": url,
+            "tags": {".dur": dur, "artist": artist, "title": title}
+        });
+        dur = 1;
+        artist = title = '';
+    }
+    search_ui.render(ret, null, false);
+    sethash('m3u=' + this.url.split('?')[0].split('/').pop());
+    goto();
+    var el = QS('#files>tbody>tr.au>td>a.play');
+    if (el)
+        el.click();
+}
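The `#EXTINF` handling in `render_m3u` above can be exercised standalone. This uses the same regex and artist/title split as the diff; the sample playlist line is invented:

```javascript
// "#EXTINF:<seconds>,<Artist - Title>" -- duration, then the display name;
// the first " - " separates artist from title, as in render_m3u.
var ln = "#EXTINF:217,Seeland - Hanging Gardens";
var m = /^#EXTINF:([0-9]+)[, ](.*)/.exec(ln);
var dur = m[1], title = m[2], artist = "";
var ofs = title.indexOf(" - ");
if (ofs > 0) {
    artist = title.slice(0, ofs);
    title = title.slice(ofs + 3);
}
// dur "217", artist "Seeland", title "Hanging Gardens"
```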
 function aligngriditems() {
     if (!treectl)
         return;
@@ -7258,6 +7476,7 @@ var treectl = (function () {
         treesz = clamp(icfg_get('treesz', 16), 10, 50);
     var resort = function () {
+        ENATSORT = NATSORT && clgot(ebi('nsort'), 'on');
         treectl.gentab(get_evpath(), treectl.lsc);
     };
     bcfg_bind(r, 'ireadme', 'ireadme', true);
@@ -7586,8 +7805,8 @@ var treectl = (function () {
     };
     function reload_tree() {
-        var cdir = r.nextdir || get_vpath(),
-            cevp = get_evpath(),
+        var cevp = get_evpath(),
+            cdir = r.nextdir || uricom_dec(cevp),
             links = QSA('#treeul a+a'),
             nowrap = QS('#tree.nowrap') && QS('#hovertree.on'),
             act = null;
@@ -8094,7 +8313,7 @@ var treectl = (function () {
     document.documentElement.scrollLeft = 0;
     setTimeout(function () {
         r.gentab(get_evpath(), r.lsc);
-        ebi('wrap').style.opacity = 'unset';
+        ebi('wrap').style.opacity = CLOSEST ? 'unset' : 1;
     }, 1);
 };
@@ -8130,9 +8349,16 @@ var treectl = (function () {
     }
     delete res['a'];
     var keys = Object.keys(res);
-    keys.sort(function (a, b) { return a.localeCompare(b); });
+    for (var a = 0; a < keys.length; a++)
+        keys[a] = [uricom_dec(keys[a]), keys[a]];
+    if (ENATSORT)
+        keys.sort(function (a, b) { return NATSORT.compare(a[0], b[0]); });
+    else
+        keys.sort(function (a, b) { return a[0].localeCompare(b[0]); });
     for (var a = 0; a < keys.length; a++) {
-        var kk = keys[a],
+        var kk = keys[a][1],
             m = /(\?k=[^\n]+)/.exec(kk),
             kdk = m ? m[1] : '',
             ks = kk.replace(kdk, '').slice(1),
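The navpane fix above sorts on decoded names while keeping the encoded href for lookup. The decorate-sort-undecorate shape can be sketched like this (folder names invented; `decodeURIComponent` stands in for `uricom_dec`):

```javascript
// Percent-encoded hrefs sort by their raw bytes ("%2010" < "%202"), so
// decode first, sort on the decoded form, then keep the original key.
var keys = ["/dir%2010/", "/dir%202/"];
var pairs = keys.map(function (k) { return [decodeURIComponent(k), k]; });
var natsort = new Intl.Collator([], { numeric: true });
pairs.sort(function (a, b) { return natsort.compare(a[0], b[0]); });
var sorted = pairs.map(function (p) { return p[1]; });
// sorted: ["/dir%202/", "/dir%2010/"] -- "dir 2" before "dir 10"
```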
@@ -8240,7 +8466,7 @@ var wfp_debounce = (function () {
     if (--r.n <= 0) {
         r.n = 0;
         clearTimeout(r.t);
-        ebi('wfp').style.opacity = 'unset';
+        ebi('wfp').style.opacity = CLOSEST ? 'unset' : 1;
     }
 };
 r.reset = function () {
@@ -9411,6 +9637,11 @@ function sandbox(tgt, rules, allow, cls, html) {
         clmod(tgt, 'sb');
         return false;
     }
+    if (!CLOSEST) {
+        tgt.textContent = html;
+        clmod(tgt, 'sb');
+        return false;
+    }
     clmod(tgt, 'sb', 1);
     var tid = tgt.getAttribute('id'),
@@ -9478,7 +9709,7 @@ window.addEventListener("message", function (e) {
         el.parentNode.removeChild(el.previousSibling);
         el.style.height = (parseInt(t[2]) + SBH) + 'px';
-        el.style.visibility = 'unset';
+        el.style.visibility = CLOSEST ? 'unset' : 'block';
         wfp_debounce.show();
     }
     else if (t[0] == 'iscroll') {
@@ -9772,7 +10003,7 @@ function wintitle(txt, noname) {
     if (s_name && !noname)
         txt = s_name + ' ' + txt;
-    txt += get_vpath().slice(1, -1).split('/').pop();
+    txt += uricom_dec(get_evpath()).slice(1, -1).split('/').pop();
     document.title = txt;
 }

View File

@@ -122,7 +122,7 @@
 <input type="hidden" id="la" name="act" value="login" />
 <input type="password" id="lp" name="cppwd" placeholder=" password" />
 <input type="hidden" name="uhash" id="uhash" value="x" />
-<input type="submit" id="ls" value="Login" />
+<input type="submit" id="ls" value="login" />
 {% if chpw %}
 <a id="x" href="#">change password</a>
 {% endif %}

View File

@@ -381,6 +381,9 @@ html.y .btn:focus {
     box-shadow: 0 .1em .2em #037 inset;
     outline: #037 solid .1em;
 }
+input, button {
+    font-family: var(--font-main), sans-serif;
+}
 input[type="submit"] {
     cursor: pointer;
 }

View File

@@ -1415,7 +1415,7 @@ function up2k_init(subtle) {
     if (FIREFOX && good_files.length > 3000)
         msg.push(L.u_ff_many + "\n\n");
-    msg.push(L.u_asku.format(good_files.length, esc(get_vpath())) + '<ul>');
+    msg.push(L.u_asku.format(good_files.length, esc(uricom_dec(get_evpath()))) + '<ul>');
     for (var a = 0, aa = Math.min(20, good_files.length); a < aa; a++)
         msg.push('<li>' + esc(good_files[a][1]) + '</li>');

View File

@@ -364,7 +364,8 @@ if (!Element.prototype.matches)
         Element.prototype.mozMatchesSelector ||
         Element.prototype.webkitMatchesSelector;
-if (!Element.prototype.closest)
+var CLOSEST = !!Element.prototype.closest;
+if (!CLOSEST)
     Element.prototype.closest = function (s) {
         var el = this;
         do {
@@ -461,6 +462,13 @@ function namesan(txt, win, fslash) {
 }
+var NATSORT, ENATSORT;
+try {
+    NATSORT = new Intl.Collator([], {numeric: true});
+}
+catch (ex) { }
 var crctab = (function () {
     var c, tab = [];
     for (var n = 0; n < 256; n++) {
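The `NATSORT` pattern introduced above constructs the collator defensively (old engines such as IE11's predecessors lack `Intl`) and falls back to plain compares when it is missing. A minimal standalone sketch, with invented file names:

```javascript
// Numeric-aware natural sort via Intl.Collator; the try/catch mirrors the
// diff so engines without Intl simply leave NATSORT undefined.
var NATSORT;
try { NATSORT = new Intl.Collator([], { numeric: true }); } catch (ex) { }

var names = ["track10.mp3", "track2.mp3", "track1.mp3"];
if (NATSORT)
    names.sort(function (a, b) { return NATSORT.compare(a, b); });
// with numeric:true, "track2" sorts before "track10"
```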
@@ -614,6 +622,33 @@ function showsort(tab) {
         }
     }
 }
+function st_cmp_num(a, b) {
+    a = a[0];
+    b = b[0];
+    return (
+        a === null ? -1 :
+        b === null ? 1 :
+        (a - b)
+    );
+}
+function st_cmp_nat(a, b) {
+    a = a[0];
+    b = b[0];
+    return (
+        a === null ? -1 :
+        b === null ? 1 :
+        NATSORT.compare(a, b)
+    );
+}
+function st_cmp_gen(a, b) {
+    a = a[0];
+    b = b[0];
+    return (
+        a === null ? -1 :
+        b === null ? 1 :
+        a.localeCompare(b)
+    );
+}
 function sortTable(table, col, cb) {
     var tb = table.tBodies[0],
         th = table.tHead.rows[0].cells,
@@ -659,19 +694,17 @@ function sortTable(table, col, cb) {
         }
         vl.push([v, a]);
     }
-    vl.sort(function (a, b) {
-        a = a[0];
-        b = b[0];
-        if (a === null)
-            return -1;
-        if (b === null)
-            return 1;
-        if (stype == 'int') {
-            return reverse * (a - b);
-        }
-        return reverse * (a.localeCompare(b));
-    });
+    if (stype == 'int')
+        vl.sort(st_cmp_num);
+    else if (ENATSORT)
+        vl.sort(st_cmp_nat);
+    else
+        vl.sort(st_cmp_gen);
+    if (reverse < 0)
+        vl.reverse();
 if (sread('dir1st') !== '0') {
     var r1 = [], r2 = [];
     for (var i = 0; i < tr.length; i++) {
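The refactor above sorts ascending with one shared comparator and reverses afterwards, instead of folding the direction into the comparator via `reverse *`. One observable difference: missing values now end up at the opposite end in descending order. A small sketch with `st_cmp_num` from the diff and invented rows:

```javascript
// Rows are [value, index] pairs; null means the cell has no sortable value.
function st_cmp_num(a, b) {
    a = a[0];
    b = b[0];
    return a === null ? -1 : b === null ? 1 : (a - b);
}
var vl = [[3, "c"], [null, "x"], [1, "a"], [2, "b"]],
    reverse = -1; // descending
vl.sort(st_cmp_num);   // ascending: null, 1, 2, 3
if (reverse < 0)
    vl.reverse();      // descending: 3, 2, 1, null -- missing values sink
```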
@@ -857,11 +890,6 @@ function get_evpath() {
 }
-function get_vpath() {
-    return uricom_dec(get_evpath());
-}
 function noq_href(el) {
     return el.getAttribute('href').split('?')[0];
 }

View File

@@ -1,3 +1,120 @@
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0420-1836 `v1.16.21` unzip-compat
a couple guys have been asking if I accept donations -- thanks a lot!! added a few options on [my github page](https://github.com/9001/) :>
## 🧪 new features
* #156 add button to loop/repeat music 71c55659
## 🩹 bugfixes
* #155 download-as-zip: increase compatibility with the unix `unzip` command db33d68d
* this unfortunately reduces support for huge zipfiles on old software (WinXP and such)
* and makes it less safe to stream zips into unzippers, so use tar.gz instead
* and is perhaps not even a copyparty bug; see commit-message for the full story
## 🔧 other changes
* show warning on Ctrl-A in lazy-loaded folders 5b3a5fe7
* docker: hide keepalive pings from logs d5a9bd80
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0413-2151 `v1.16.20` all sorted
## 🧪 new features
* when enabled, natural-sort will now also apply to tags, not just filenames 7b2bd6da
## 🩹 bugfixes
* some sorting-related stuff 7b2bd6da
* folders with non-ascii names would sort incorrectly in the navpane/sidebar
* natural-sort didn't apply correctly after changing the sort order
* workaround [ffmpeg-bug 10797](https://trac.ffmpeg.org/ticket/10797) 98dcaee2
* reduces ram usage from 1534 to 230 MiB when generating spectrograms of s3xmodit songs (amiga chiptunes)
* disable mdns if only listening on uds (unix-sockets) ffc16109 361aebf8
## 🔧 other changes
* hotkey CTRL-A will now select all files in gridview 233075ae
* and it toggles (just like in list-view) so try pressing it again
* copyparty.exe: upgrade to pillow v11.2.1 c7aa1a35
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0408-2132 `v1.16.19` GHOST
did you know that every song named `GHOST` is a banger? it's true! [ghost](https://www.youtube.com/watch?v=NoUAwC4yiAw) // [ghost](https://www.youtube.com/watch?v=IKKar5SS29E) // [ghost](https://www.youtube.com/watch?v=tFSFlgm_tsw)
## 🧪 new features
* option to store markdown backups out-of-volume fc883418
* the default is still a subfolder named `.hist` next to the markdown file
* `--md-hist v` puts them in the volume's hist-folder instead
* `--md-hist n` disables markdown-backups entirely
* #149 option to store the volume sqlite databases at a custom location outside the hist-folder e1b9ac63
* new option `--dbpath` works like `--hist` but it only moves the database file, not the thumbnails
* they can be combined, in which case `--hist` is applied to thumbnails, `--dbpath` to the db
* useful when you're squeezing every last drop of performance out of your filesystem (see the issue)
* actively prevent sharing certain databases (sessions/shares) between multiple copyparty instances acfaacbd
* an errormessage was added to explain some different alternatives for doing this safely
* for example by setting `XDG_CONFIG_HOME` which now works on all platforms b17ccc38
## 🩹 bugfixes
* #151 mkdir did not work in locations outside the volume root (via symlinks) 2b50fc20
* improve the ui feedback when trying to play an audio file which failed to transcode f9954bc4
* also helps with server-filesystem issues, including image-thumbs
## 🔧 other changes
* #152 custom fonts are also applied to textboxes and buttons (thx @thaddeuskkr) d450f615
* be more careful with the shares-db 8e0364ef
* be less careful with the sessions-db 8e0364ef
* update deps c0becc64
* web: dompurify
* copyparty.exe: python 3.12.10
* rephrase `-j0` warning on windows to also mention that Microsoft Defender will freak out c0becc64
* #149 add [a script](https://github.com/9001/copyparty/tree/hovudstraum/contrib#zfs-tunepy) to optimize the sqlite databases for storage on zfs 4f397b9b
* block `GoogleOther` (another recalcitrant bot) from zip-downloads c2034f7b
* update [contributing.md](https://github.com/9001/copyparty/blob/hovudstraum/CONTRIBUTING.md) with a section regarding LLM/AI-written code cec3bee0
* the [helptext](https://ocv.me/copyparty/helptext.html) will also be uploaded to each github release from now on, [permalink](https://github.com/9001/copyparty/releases/latest/download/helptext.html)
* add review from ixbt forums b383c08c
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0323-2216 `v1.16.18` zlib-ng
## 🧪 new features
* prefer zlib-ng when available 57a56073
* download-as-tar-gz becomes 2.5x faster
* default-enabled in docker-images
* not enabled in copyparty.exe yet; coming in a future python version
* docker: add mimalloc (optional, default-disabled) de2c9788
* gives twice the speed, and twice the ram usage
## 🩹 bugfixes
* small up2k glitch 3c90cec0
## 🔧 other changes
* rename logues/readmes when uploaded with write-only access 2525d594
* since they are used as helptext when viewing the page
* try to block google and other bad bots from `?doc` and `?zip` 99f63adf
* apparently `rel="nofollow"` means nothing these days
### the docker images for this release were built from e1dea7ef
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2025-0316-2002 `v1.16.17` boot2party

View File

@@ -281,8 +281,11 @@ on writing your own [hooks](../README.md#event-hooks)
 hooks can cause intentional side-effects, such as redirecting an upload into another location, or creating+indexing additional files, or deleting existing files, by returning json on stdout
 * `reloc` can redirect uploads before/after uploading has finished, based on filename, extension, file contents, uploader ip/name etc.
+  * example: [reloc-by-ext](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reloc-by-ext.py)
 * `idx` informs copyparty about a new file to index as a consequence of this upload
+  * example: [podcast-normalizer.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/podcast-normalizer.py)
 * `del` tells copyparty to delete an unrelated file by vpath
+  * example: ( ´・ω・) nyoro~n
 for these to take effect, the hook must be defined with the `c1` flag; see example [reloc-by-ext](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/reloc-by-ext.py)

View File

@@ -11,9 +11,14 @@ services:
- ./:/cfg:z - ./:/cfg:z
- /path/to/your/fileshare/top/folder:/w:z - /path/to/your/fileshare/top/folder:/w:z
# enabling mimalloc by replacing "NOPE" with "2" will make some stuff twice as fast, but everything will use twice as much ram:
environment:
LD_PRELOAD: /usr/lib/libmimalloc-secure.so.NOPE
stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal
healthcheck: healthcheck:
test: ["CMD-SHELL", "wget --spider -q 127.0.0.1:3923/?reset"] # hide it from logs with "/._" so it matches the default --lf-url filter
test: ["CMD-SHELL", "wget --spider -q 127.0.0.1:3923/?reset=/._"]
interval: 1m interval: 1m
timeout: 2s timeout: 2s
retries: 5 retries: 5

View File

@@ -23,6 +23,9 @@ services:
- 'traefik.http.routers.copyparty.tls=true' - 'traefik.http.routers.copyparty.tls=true'
- 'traefik.http.routers.copyparty.middlewares=authelia@docker' - 'traefik.http.routers.copyparty.middlewares=authelia@docker'
stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal
environment:
LD_PRELOAD: /usr/lib/libmimalloc-secure.so.NOPE
# enable mimalloc by replacing "NOPE" with "2" for a nice speed-boost (will use twice as much ram)
authelia: authelia:
image: authelia/authelia:v4.38.0-beta3 # the config files in the authelia folder use the new syntax image: authelia/authelia:v4.38.0-beta3 # the config files in the authelia folder use the new syntax

View File

@@ -22,13 +22,10 @@ services:
- 'traefik.http.routers.fs.rule=Host(`fs.example.com`)' - 'traefik.http.routers.fs.rule=Host(`fs.example.com`)'
- 'traefik.http.routers.fs.entrypoints=http' - 'traefik.http.routers.fs.entrypoints=http'
#- 'traefik.http.routers.fs.middlewares=authelia@docker' # TODO: ??? #- 'traefik.http.routers.fs.middlewares=authelia@docker' # TODO: ???
healthcheck:
test: ["CMD-SHELL", "wget --spider -q 127.0.0.1:3923/?reset"]
interval: 1m
timeout: 2s
retries: 5
start_period: 15s
stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal stop_grace_period: 15s # thumbnailer is allowed to continue finishing up for 10s after the shutdown signal
environment:
LD_PRELOAD: /usr/lib/libmimalloc-secure.so.NOPE
# enable mimalloc by replacing "NOPE" with "2" for a nice speed-boost (will use twice as much ram)
traefik: traefik:
image: traefik:v2.11 image: traefik:v2.11

View File

@@ -33,12 +33,6 @@ if you are introducing a new ttf/woff font, don't forget to declare the font its
 }
 ```
-and because textboxes don't inherit fonts by default, you can force it like this:
-```css
-input[type=text], input[type=submit], input[type=button] { font-family: var(--font-main) }
-```
 and if you want to have a monospace font in the fancy markdown editor, do this:
 ```css

View File

@@ -3,7 +3,7 @@ WORKDIR /z
 ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
     ver_hashwasm=4.12.0 \
     ver_marked=4.3.0 \
-    ver_dompf=3.2.4 \
+    ver_dompf=3.2.5 \
     ver_mde=2.18.0 \
     ver_codemirror=5.65.18 \
     ver_fontawesome=5.13.0 \

View File

@@ -2,7 +2,7 @@
 set -ex
 # use zlib-ng if available
-f=/z/base/zlib_ng-0.5.1-cp312-cp312-linux_$(uname -m).whl
+f=/z/base/zlib_ng-0.5.1-cp312-cp312-linux_$(cat /etc/apk/arch).whl
 [ "$1" != min ] && [ -e $f ] && {
     apk add -t .bd !pyc py3-pip
     rm -f /usr/lib/python3*/EXTERNALLY-MANAGED
@@ -32,6 +32,9 @@ rm -rf \
     /tmp/pe-* /z/copyparty-sfx.py \
     ensurepip pydoc_data turtle.py turtledemo lib2to3
+# speedhack
+sed -ri 's/os.environ.get\("PRTY_NO_IMPRESO"\)/"1"/' /usr/lib/python3.*/site-packages/copyparty/util.py
 # drop bytecode
 find / -xdev -name __pycache__ -print0 | xargs -0 rm -rf
@@ -65,7 +68,11 @@ for n in $(seq 1 200); do sleep 0.2
 done
 [ -z "$v" ] && echo SNAAAAAKE && exit 1
-wget -O- http://${v/ /:}/?tar=gz:1 | tar -xzO top/innvikler.sh | cmp innvikler.sh
+for n in $(seq 1 200); do sleep 0.2
+    wget -O- http://${v/ /:}/?tar=gz:1 >tf && break
+done
+tar -xzO top/innvikler.sh <tf | cmp innvikler.sh
+rm tf
 kill $pid; wait $pid

View File

@@ -237,6 +237,8 @@ necho() {
tar -zxf $f tar -zxf $f
mv partftpy-*/partftpy . mv partftpy-*/partftpy .
rm -rf partftpy-* partftpy/bin rm -rf partftpy-* partftpy/bin
#(cd partftpy && "$pybin" ../../scripts/strip_hints/a.py; rm uh) # dont need the full thing, just this:
sed -ri 's/from typing import TYPE_CHECKING$/TYPE_CHECKING = False/' partftpy/TftpShared.py
necho collecting python-magic necho collecting python-magic
v=0.4.27 v=0.4.27

View File

@@ -27,7 +27,7 @@ ac96786e5d35882e0c5b724794329c9125c2b86ae7847f17acfc49f0d294312c6afc1c3f248655de
 6df21f0da408a89f6504417c7cdf9aaafe4ed88cfa13e9b8fa8414f604c0401f885a04bbad0484dc51a29284af5d1548e33c6cc6bfb9896d9992c1b1074f332d MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl
 8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
 0203ec2551c4836696cfab0b2c9fff603352f03fa36e7476e2e1ca7ec57a3a0c24bd791fcd92f342bf817f0887854d9f072e0271c643de4b313d8c9569ba8813 packaging-24.1-py3-none-any.whl
-12d7921dc7dfd8a4b0ea0fa2bae8f1354fcdd59ece3d7f4e075aed631f9ba791dc142c70b1ccd1e6287c43139df1db26bd57a7a217c8da3a77326036495cdb57 pillow-11.1.0-cp312-cp312-win_amd64.whl
+c9051daaf34ec934962c743a5ac2dbe55a9b0cababb693a8cde0001d24d4a50b67bd534d714d935def6ca7b898ec0a352e58bd9ccdce01c54eaf2281b18e478d pillow-11.2.1-cp312-cp312-win_amd64.whl
 f0463895e9aee97f31a2003323de235fed1b26289766dc0837261e3f4a594a31162b69e9adbb0e9a31e2e2d4b5f25c762ed1669553df7dc89a8ba4f85d297873 pyinstaller-6.11.1-py3-none-win_amd64.whl
 d550a0a14428386945533de2220c4c2e37c0c890fc51a600f626c6ca90a32d39572c121ec04c157ba3a8d6601cb021f8433d871b5c562a3d342c804fffec90c1 pyinstaller_hooks_contrib-2024.11-py3-none-any.whl
-17b64ff6744004a05d475c8f6de3e48286db4069afad4cae690f83b3555f8e35ceafb210eeba69a11e983d0da3001099de284b6696ed0f1bf9cd791938a7f2cd python-3.12.9-amd64.exe
+4f9a4d9f65c93e2d851e2674057343a9599f30f5dc582ffca485522237d4fcf43653b3d393ed5eb11e518c4ba93714a07134bbb13a97d421cce211e1da34682e python-3.12.10-amd64.exe

View File

@@ -38,10 +38,10 @@ fns=(
     MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl
     mutagen-1.47.0-py3-none-any.whl
     packaging-24.1-py3-none-any.whl
-    pillow-11.1.0-cp312-cp312-win_amd64.whl
+    pillow-11.2.1-cp312-cp312-win_amd64.whl
     pyinstaller-6.10.0-py3-none-win_amd64.whl
     pyinstaller_hooks_contrib-2024.8-py3-none-any.whl
-    python-3.12.9-amd64.exe
+    python-3.12.10-amd64.exe
 )
 [ $w7 ] && fns+=(
     future-1.0.0-py3-none-any.whl

View File

@@ -413,6 +413,9 @@ def run_i(ld):
     for x in ld:
         sys.path.insert(0, x)
+    e = os.environ
+    e["PRTY_NO_IMPRESO"] = "1"
     from copyparty.__main__ import main as p
     p()

View File

@@ -84,6 +84,9 @@ def uh2(fp):
         if " # !rm" in ln:
             continue
+        if ln.endswith("TYPE_CHECKING"):
+            ln = ln.replace("from typing import TYPE_CHECKING", "TYPE_CHECKING = False")
         lns.append(ln)
     cs = "\n".join(lns)

View File

@@ -357,6 +357,7 @@ var tl_browser = {
     "ml_eq": "audio equalizer",
     "ml_drc": "dynamic range compressor",
+    "mt_loop": "loop/repeat one song\">🔁",
     "mt_shuf": "shuffle the songs in each folder\">🔀",
     "mt_aplay": "autoplay if there is a song-ID in the link you clicked to access the server$N$Ndisabling this will also stop the page URL from being updated with song-IDs when playing music, to prevent autoplay if these settings are lost but the URL remains\">a▶",
     "mt_preload": "start loading the next song near the end for gapless playback\">preload",
@@ -402,6 +403,7 @@ var tl_browser = {
     "mm_eunk": "Unknown Errol",
     "mm_e404": "Could not play audio; error 404: File not found.",
     "mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
+    "mm_e500": "Could not play audio; error 500: Check server logs.",
     "mm_e5xx": "Could not play audio; server error ",
     "mm_nof": "not finding any more audio files nearby",
     "mm_prescan": "Looking for music to play next...",
@@ -416,6 +418,7 @@ var tl_browser = {
     "f_bigtxt": "this file is {0} MiB large -- really view as text?",
     "fbd_more": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_more">show {2}</a> or <a href="#" id="bd_all">show all</a></div>',
     "fbd_all": '<div id="blazy">showing <code>{0}</code> of <code>{1}</code> files; <a href="#" id="bd_all">show all</a></div>',
+    "f_anota": "only {0} of the {1} items were selected;\nto select the full folder, first scroll to the bottom",
     "f_dls": 'the file links in the current folder have\nbeen changed into download links',

View File

@@ -371,7 +371,7 @@ class TestDots(unittest.TestCase):
         return " ".join(tar)
     def curl(self, url, uname, binary=False, req=b""):
-        req = req or hdr(url, uname)
+        req = req or hdr(url.replace("th=x", "th=j"), uname)
         conn = self.conn.setbuf(req)
         HttpCli(conn).run()
         if binary:

View File

@@ -135,7 +135,7 @@ class Cfg(Namespace):
         ex = "dav_inf dedup dotpart dotsrch hook_v no_dhash no_fastboot no_fpool no_htp no_rescan no_sendfile no_ses no_snap no_up_list no_voldump re_dhash plain_ip"
         ka.update(**{k: True for k in ex.split()})
-        ex = "ah_cli ah_gen css_browser hist ipu js_browser js_other mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua ua_nodoc ua_nozip"
+        ex = "ah_cli ah_gen css_browser dbpath hist ipu js_browser js_other mime mimes no_forget no_hash no_idx nonsus_urls og_tpl og_ua ua_nodoc ua_nozip"
         ka.update(**{k: None for k in ex.split()})
         ex = "hash_mt hsortn safe_dedup srch_time u2abort u2j u2sz"
@@ -174,6 +174,7 @@ class Cfg(Namespace):
             lang="eng",
             log_badpwd=1,
             logout=573,
+            md_hist="s",
             mte={"a": True},
             mth={},
             mtp=[],