Compare commits (78 commits)

3ba0cc20f1, dd28de1796, 9eecc9e19a, 6530cb6b05, 41ce613379, 5e2785caba, d7cc000976, 50d8ff95ae, b2de1459b6, f0ffbea0b2, 199ccca0fe, 1d9b355743, f0437fbb07, abc404a5b7, 04b9e21330, 1044aa071b, 4c3192c8cc, 689e77a025, 3bd89403d2, b4800d9bcb, 05485e8539, 0e03dc0868, 352b1ed10a, 0db1244d04, ece08b8179, b8945ae233, dcaf7b0a20, f982cdc178, b265e59834, 4a843a6624, 241ef5b99d, f39f575a9c, 1521307f1e, dd122111e6, 00c177fa74, f6c7e49eb8, 1a8dc3d18a, 38a163a09a, 8f031246d2, 8f3d97dde7, 4acaf24d65, 9a8dbbbcf8, a3efc4c726, 0278bf328f, 17ddd96cc6, 0e82e79aea, 30f124c061, e19d90fcfc, 184bbdd23d, 30b50aec95, c3c3d81db1, 49b7231283, edbedcdad3, e4ae5f74e6, 2c7ffe08d7, 3ca46bae46, 7e82aaf843, 315bd71adf, 2c612c9aeb, 36aee085f7, d01bb69a9c, c9b1c48c72, aea3843cf2, 131b6f4b9a, 6efb8b735a, 223b7af2ce, e72c2a6982, dd9b93970e, e4c7cd81a9, 12b3a62586, 2da3bdcd47, c1dccbe0ba, 9629fcde68, cae436b566, 01714700ae, 51e6c4852b, b206c5d64e, 62c3272351
README.md (60 changes)
```diff
@@ -52,7 +52,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
 * [compress uploads](#compress-uploads) - files can be autocompressed on upload
 * [database location](#database-location) - in-volume (`.hist/up2k.db`, default) or somewhere else
 * [metadata from audio files](#metadata-from-audio-files) - set `-e2t` to index tags on upload
-* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags
+* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
 * [upload events](#upload-events) - trigger a script/program on each upload
 * [complete examples](#complete-examples)
 * [browser support](#browser-support) - TLDR: yes
@@ -78,6 +78,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
 * [sfx](#sfx) - there are two self-contained "binaries"
 * [sfx repack](#sfx-repack) - reduce the size of an sfx by removing features
 * [install on android](#install-on-android)
+* [reporting bugs](#reporting-bugs) - ideas for context to include in bug reports
 * [building](#building)
 * [dev env setup](#dev-env-setup)
 * [just the sfx](#just-the-sfx)
@@ -161,12 +162,13 @@ feature summary
 * ☑ file manager (cut/paste, delete, [batch-rename](#batch-rename))
 * ☑ audio player (with OS media controls and opus transcoding)
 * ☑ image gallery with webm player
+* ☑ textfile browser with syntax hilighting
 * ☑ [thumbnails](#thumbnails)
   * ☑ ...of images using Pillow
   * ☑ ...of videos using FFmpeg
+  * ☑ ...of audio (spectrograms) using FFmpeg
   * ☑ cache eviction (max-age; maybe max-size eventually)
 * ☑ SPA (browse while uploading)
-  * if you use the navpane to navigate, not folders in the file list
 * server indexing
   * ☑ [locate files by contents](#file-search)
   * ☑ search by name/path/date/size
@@ -233,6 +235,10 @@ some improvement ideas
 
 ## not my bugs
 
+* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
+  * *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume
+  * "future" because `AudioContext` is broken in the current iOS version (15.1), maybe one day...
+
 * Windows: folders cannot be accessed if the name ends with `.`
   * python or windows bug
 
@@ -249,6 +255,7 @@ some improvement ideas
 
 * is it possible to block read-access to folders unless you know the exact URL for a particular file inside?
   * yes, using the [`g` permission](#accounts-and-volumes), see the examples there
+  * you can also do this with linux filesystem permissions; `chmod 111 music` will make it possible to access files and folders inside the `music` folder but not list the immediate contents -- also works with other software, not just copyparty
 
 * can I make copyparty download a file to my server if I give it a URL?
   * not officially, but there is a [terrible hack](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/wget.py) which makes it possible
@@ -317,6 +324,7 @@ the browser has the following hotkeys (always qwerty)
 * `V` toggle folders / textfiles in the navpane
 * `G` toggle list / [grid view](#thumbnails)
 * `T` toggle thumbnails / icons
+* `ESC` close various things
 * `ctrl-X` cut selected files/folders
 * `ctrl-V` paste
 * `F2` [rename](#batch-rename) selected file/folder
@@ -368,9 +376,13 @@ switching between breadcrumbs or navpane
 
 click the `🌲` or pressing the `B` hotkey to toggle between breadcrumbs path (default), or a navpane (tree-browser sidebar thing)
 
-* `[-]` and `[+]` (or hotkeys `A`/`D`) adjust the size
-* `[v]` jumps to the currently open folder
+* `[+]` and `[-]` (or hotkeys `A`/`D`) adjust the size
+* `[🎯]` jumps to the currently open folder
+* `[📃]` toggles between showing folders and textfiles
+* `[📌]` shows the name of all parent folders in a docked panel
 * `[a]` toggles automatic widening as you go deeper
+* `[↵]` toggles wordwrap
+* `[👀]` show full name on hover (if wordwrap is off)
 
 
 ## thumbnails
@@ -386,6 +398,7 @@ audio files are covnerted into spectrograms using FFmpeg unless you `--no-athumb
 images with the following names (see `--th-covers`) become the thumbnail of the folder they're in: `folder.png`, `folder.jpg`, `cover.png`, `cover.jpg`
 
 in the grid/thumbnail view, if the audio player panel is open, songs will start playing when clicked
+* indicated by the audio files having the ▶ icon instead of 💾
 
 
 ## zip downloads
@@ -563,6 +576,8 @@ and there are *two* editors
 
 * you can link a particular timestamp in an audio file by adding it to the URL, such as `&20` / `&20s` / `&1m20` / `&t=1:20` after the `.../#af-c8960dab`
 
+* enabling the audio equalizer can help make gapless albums fully gapless in some browsers (chrome), so consider leaving it on with all the values at zero
+
 * get a plaintext file listing by adding `?ls=t` to a URL, or a compact colored one with `?ls=v` (for unix terminals)
 
 * if you are using media hotkeys to switch songs and are getting tired of seeing the OSD popup which Windows doesn't let you disable, consider https://ocv.me/dev/?media-osd-bgone.ps1
@@ -611,10 +626,12 @@ through arguments:
 * `-e2ts` also scans for tags in all files that don't have tags yet
 * `-e2tsr` also deletes all existing tags, doing a full reindex
 
-the same arguments can be set as volume flags, in addition to `d2d` and `d2t` for disabling:
+the same arguments can be set as volume flags, in addition to `d2d`, `d2ds`, `d2t`, `d2ts` for disabling:
 * `-v ~/music::r:c,e2dsa,e2tsr` does a full reindex of everything on startup
 * `-v ~/music::r:c,d2d` disables **all** indexing, even if any `-e2*` are on
 * `-v ~/music::r:c,d2t` disables all `-e2t*` (tags), does not affect `-e2d*`
+* `-v ~/music::r:c,d2ds` disables on-boot scans; only index new uploads
+* `-v ~/music::r:c,d2ts` same except only affecting tags
 
 note:
 * the parser can finally handle `c,e2dsa,e2tsr` so you no longer have to `c,e2dsa:c,e2tsr`
@@ -669,6 +686,12 @@ things to note,
 * the files will be indexed after compression, so dupe-detection and file-search will not work as expected
 
 some examples,
+* `-v inc:inc:w:c,pk=xz,0`
+  folder named inc, shared at inc, write-only for everyone, forces xz compression at level 0
+* `-v inc:inc:w:c,pk`
+  same write-only inc, but forces gz compression (default) instead of xz
+* `-v inc:inc:w:c,gz`
+  allows (but does not force) gz compression if client uploads to `/inc?pk` or `/inc?gz` or `/inc?gz=4`
 
 
 ## database location
@@ -713,7 +736,7 @@ see the beautiful mess of a dictionary in [mtag.py](https://github.com/9001/copy
 
 ## file parser plugins
 
-provide custom parsers to index additional tags
+provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
 
 copyparty can invoke external programs to collect additional metadata for files using `mtp` (either as argument or volume flag), there is a default timeout of 30sec
 
@@ -784,7 +807,7 @@ TLDR: yes
 * internet explorer 6 to 8 behave the same
 * firefox 52 and chrome 49 are the final winxp versions
 * `*1` yes, but extremely slow (ie10: `1 MiB/s`, ie11: `270 KiB/s`)
-* `*3` using a wasm decoder which consumes a bit more power
+* `*3` iOS 11 and newer, opus only, and requires FFmpeg on the server
 
 quick summary of more eccentric web-browsers trying to view a directory index:
 
@@ -834,7 +857,7 @@ copyparty returns a truncated sha512sum of your PUT/POST as base64; you can gene
 b512(){ printf "$((sha512sum||shasum -a512)|sed -E 's/ .*//;s/(..)/\\x\1/g')"|base64|tr '+/' '-_'|head -c44;}
 b512 <movie.mkv
 
-you can provide passwords using cookie 'cppwd=hunter2', as a url query `?pw=hunter2`, or with basic-authentication (either as the username or password)
+you can provide passwords using cookie `cppwd=hunter2`, as a url query `?pw=hunter2`, or with basic-authentication (either as the username or password)
 
 
 # up2k
```
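The `b512` shell one-liner in the hunk above (truncated sha512 of the upload, urlsafe base64, first 44 characters) can also be sketched in Python; this is an illustrative equivalent, not code from the repo:

```python
import base64
import hashlib


def b512(path):
    # sha512 the file, base64 it with the urlsafe alphabet
    # (same as `tr '+/' '-_'`), keep the first 44 characters
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return base64.urlsafe_b64encode(h.digest()).decode("ascii")[:44]
```

Usage mirrors the shell version: `b512("movie.mkv")` should match the checksum the server returns for that upload.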
````diff
@@ -973,6 +996,7 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
 | GET | `?txt=iso-8859-1` | ...with specific charset |
 | GET | `?th` | get image/video at URL as thumbnail |
 | GET | `?th=opus` | convert audio file to 128kbps opus |
+| GET | `?th=caf` | ...in the iOS-proprietary container |
 
 | method | body | result |
 |--|--|--|
@@ -1068,13 +1092,11 @@ pls note that `copyparty-sfx.sh` will fail if you rename `copyparty-sfx.py` to `copyp
 reduce the size of an sfx by removing features
 
 if you don't need all the features, you can repack the sfx and save a bunch of space; all you need is an sfx and a copy of this repo (nothing else to download or build, except if you're on windows then you need msys2 or WSL)
-* `584k` size of original sfx.py as of v1.1.0
-* `392k` after `./scripts/make-sfx.sh re no-ogv`
-* `310k` after `./scripts/make-sfx.sh re no-ogv no-cm`
-* `269k` after `./scripts/make-sfx.sh re no-ogv no-cm no-hl`
+* `393k` size of original sfx.py as of v1.1.3
+* `310k` after `./scripts/make-sfx.sh re no-cm`
+* `269k` after `./scripts/make-sfx.sh re no-cm no-hl`
 
 the features you can opt to drop are
-* `ogv`.js, the opus/vorbis decoder which is needed by apple devices to play foss audio files, saves ~192k
 * `cm`/easymde, the "fancy" markdown editor, saves ~82k
 * `hl`, prism, the syntax hilighter, saves ~41k
 * `fnt`, source-code-pro, the monospace font, saves ~9k
@@ -1082,7 +1104,7 @@ the features you can opt to drop are
 
 for the `re`pack to work, first run one of the sfx'es once to unpack it
 
-**note:** you can also just download and run [scripts/copyparty-repack.sh](scripts/copyparty-repack.sh) -- this will grab the latest copyparty release from github and do a `no-ogv no-cm` repack; works on linux/macos (and windows with msys2 or WSL)
+**note:** you can also just download and run [scripts/copyparty-repack.sh](scripts/copyparty-repack.sh) -- this will grab the latest copyparty release from github and do a few repacks; works on linux/macos (and windows with msys2 or WSL)
 
 
 # install on android
@@ -1096,6 +1118,16 @@ echo $?
 after the initial setup, you can launch copyparty at any time by running `copyparty` anywhere in Termux
 
 
+# reporting bugs
+
+ideas for context to include in bug reports
+
+if something broke during an upload (replacing FILENAME with a part of the filename that broke):
+```
+journalctl -aS '48 hour ago' -u copyparty | grep -C10 FILENAME | tee bug.log
+```
+
+
 # building
 
 ## dev env setup
````
copyparty-fuseb.py

```diff
@@ -11,14 +11,18 @@ import re
 import os
 import sys
 import time
+import json
 import stat
 import errno
 import struct
+import codecs
+import platform
 import threading
 import http.client # py2: httplib
 import urllib.parse
 from datetime import datetime
 from urllib.parse import quote_from_bytes as quote
+from urllib.parse import unquote_to_bytes as unquote
 
 try:
     import fuse
@@ -38,7 +42,7 @@ except:
 mount a copyparty server (local or remote) as a filesystem
 
 usage:
-  python ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,url=http://192.168.1.69:3923 /mnt/nas
+  python ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,pw=wark,url=http://192.168.1.69:3923 /mnt/nas
 
 dependencies:
   sudo apk add fuse-dev python3-dev
@@ -50,6 +54,10 @@ fork of copyparty-fuse.py based on fuse-python which
 """
 
 
+WINDOWS = sys.platform == "win32"
+MACOS = platform.system() == "Darwin"
+
+
 def threadless_log(msg):
     print(msg + "\n", end="")
 
@@ -93,6 +101,41 @@ def html_dec(txt):
     )
 
 
+def register_wtf8():
+    def wtf8_enc(text):
+        return str(text).encode("utf-8", "surrogateescape"), len(text)
+
+    def wtf8_dec(binary):
+        return bytes(binary).decode("utf-8", "surrogateescape"), len(binary)
+
+    def wtf8_search(encoding_name):
+        return codecs.CodecInfo(wtf8_enc, wtf8_dec, name="wtf-8")
+
+    codecs.register(wtf8_search)
+
+
+bad_good = {}
+good_bad = {}
+
+
+def enwin(txt):
+    return "".join([bad_good.get(x, x) for x in txt])
+
+    for bad, good in bad_good.items():
+        txt = txt.replace(bad, good)
+
+    return txt
+
+
+def dewin(txt):
+    return "".join([good_bad.get(x, x) for x in txt])
+
+    for bad, good in bad_good.items():
+        txt = txt.replace(good, bad)
+
+    return txt
+
+
 class CacheNode(object):
     def __init__(self, tag, data):
         self.tag = tag
```
```diff
@@ -115,8 +158,9 @@ class Stat(fuse.Stat):
 
 
 class Gateway(object):
-    def __init__(self, base_url):
+    def __init__(self, base_url, pw):
         self.base_url = base_url
+        self.pw = pw
 
         ui = urllib.parse.urlparse(base_url)
         self.web_root = ui.path.strip("/")
@@ -135,8 +179,7 @@ class Gateway(object):
         self.conns = {}
 
     def quotep(self, path):
-        # TODO: mojibake support
-        path = path.encode("utf-8", "ignore")
+        path = path.encode("wtf-8")
         return quote(path, safe="/")
 
     def getconn(self, tid=None):
@@ -159,20 +202,29 @@ class Gateway(object):
         except:
             pass
 
-    def sendreq(self, *args, **kwargs):
+    def sendreq(self, *args, **ka):
         tid = get_tid()
+        if self.pw:
+            ck = "cppwd=" + self.pw
+            try:
+                ka["headers"]["Cookie"] = ck
+            except:
+                ka["headers"] = {"Cookie": ck}
         try:
             c = self.getconn(tid)
-            c.request(*list(args), **kwargs)
+            c.request(*list(args), **ka)
             return c.getresponse()
         except:
             self.closeconn(tid)
             c = self.getconn(tid)
-            c.request(*list(args), **kwargs)
+            c.request(*list(args), **ka)
             return c.getresponse()
 
     def listdir(self, path):
-        web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?dots"
+        if bad_good:
+            path = dewin(path)
+
+        web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?dots&ls"
         r = self.sendreq("GET", web_path)
         if r.status != 200:
             self.closeconn()
@@ -182,9 +234,12 @@ class Gateway(object):
             )
         )
 
-        return self.parse_html(r)
+        return self.parse_jls(r)
 
     def download_file_range(self, path, ofs1, ofs2):
+        if bad_good:
+            path = dewin(path)
+
         web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?raw"
         hdr_range = "bytes={}-{}".format(ofs1, ofs2 - 1)
         log("downloading {}".format(hdr_range))
@@ -200,40 +255,27 @@ class Gateway(object):
 
         return r.read()
 
-    def parse_html(self, datasrc):
-        ret = []
-        remainder = b""
-        ptn = re.compile(
-            r"^<tr><td>(-|DIR)</td><td><a [^>]+>([^<]+)</a></td><td>([^<]+)</td><td>([^<]+)</td></tr>$"
-        )
-
+    def parse_jls(self, datasrc):
+        rsp = b""
         while True:
-            buf = remainder + datasrc.read(4096)
-            # print('[{}]'.format(buf.decode('utf-8')))
+            buf = datasrc.read(1024 * 32)
             if not buf:
                 break
 
-            remainder = b""
-            endpos = buf.rfind(b"\n")
-            if endpos >= 0:
-                remainder = buf[endpos + 1 :]
-                buf = buf[:endpos]
-
-            lines = buf.decode("utf-8").split("\n")
-            for line in lines:
-                m = ptn.match(line)
-                if not m:
-                    # print(line)
-                    continue
-
-                ftype, fname, fsize, fdate = m.groups()
-                fname = html_dec(fname)
-                ts = datetime.strptime(fdate, "%Y-%m-%d %H:%M:%S").timestamp()
-                sz = int(fsize)
-                if ftype == "-":
-                    ret.append([fname, self.stat_file(ts, sz), 0])
-                else:
-                    ret.append([fname, self.stat_dir(ts, sz), 0])
+            rsp += buf
+
+        rsp = json.loads(rsp.decode("utf-8"))
+        ret = []
+        for statfun, nodes in [
+            [self.stat_dir, rsp["dirs"]],
+            [self.stat_file, rsp["files"]],
+        ]:
+            for n in nodes:
+                fname = unquote(n["href"].split("?")[0]).rstrip(b"/").decode("wtf-8")
+                if bad_good:
+                    fname = enwin(fname)
+
+                ret.append([fname, statfun(n["ts"], n["sz"]), 0])
 
         return ret
 
```
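The rewritten `parse_jls` above trades HTML scraping for the JSON emitted by `?ls`. A minimal standalone sketch of the same idea, with the fuse stat helpers left out and the field names (`dirs`, `files`, `href`, `ts`, `sz`) taken from the diff:

```python
import json


def parse_jls(data):
    # data: the JSON body returned for `GET /path?ls`
    # returns (name, is_dir, mtime, size) tuples; the real code feeds
    # ts/sz into stat_dir/stat_file to build fuse stat structs instead
    rsp = json.loads(data)
    ret = []
    for is_dir, nodes in ((True, rsp["dirs"]), (False, rsp["files"])):
        for n in nodes:
            name = n["href"].split("?")[0].rstrip("/")
            ret.append((name, is_dir, n["ts"], n["sz"]))
    return ret
```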
```diff
@@ -262,6 +304,7 @@ class CPPF(Fuse):
         Fuse.__init__(self, *args, **kwargs)
 
         self.url = None
+        self.pw = None
 
         self.dircache = []
         self.dircache_mtx = threading.Lock()
@@ -271,7 +314,7 @@ class CPPF(Fuse):
 
     def init2(self):
         # TODO figure out how python-fuse wanted this to go
-        self.gw = Gateway(self.url)  # .decode('utf-8'))
+        self.gw = Gateway(self.url, self.pw)  # .decode('utf-8'))
         info("up")
 
     def clean_dircache(self):
@@ -536,6 +579,8 @@ class CPPF(Fuse):
 
     def getattr(self, path):
         log("getattr [{}]".format(path))
+        if WINDOWS:
+            path = enwin(path)  # windows occasionally decodes f0xx to xx
 
         path = path.strip("/")
         try:
@@ -568,9 +613,25 @@ class CPPF(Fuse):
 
 def main():
     time.strptime("19970815", "%Y%m%d")  # python#7980
+    register_wtf8()
+    if WINDOWS:
+        os.system("rem")
+
+        for ch in '<>:"\\|?*':
+            # microsoft maps illegal characters to f0xx
+            # (e000 to f8ff is basic-plane private-use)
+            bad_good[ch] = chr(ord(ch) + 0xF000)
+
+        for n in range(0, 0x100):
+            # map surrogateescape to another private-use area
+            bad_good[chr(n + 0xDC00)] = chr(n + 0xF100)
+
+        for k, v in bad_good.items():
+            good_bad[v] = k
 
     server = CPPF()
     server.parser.add_option(mountopt="url", metavar="BASE_URL", default=None)
+    server.parser.add_option(mountopt="pw", metavar="PASSWORD", default=None)
     server.parse(values=server, errex=1)
     if not server.url or not str(server.url).startswith("http"):
         print("\nerror:")
@@ -578,7 +639,7 @@ def main():
     print(" need argument: mount-path")
     print("example:")
     print(
-        " ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,url=http://192.168.1.69:3923 /mnt/nas"
+        " ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,pw=wark,url=http://192.168.1.69:3923 /mnt/nas"
     )
     sys.exit(1)
 
```
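The windows-only remapping added to `main()` above can be exercised on its own; this sketch rebuilds the same `bad_good` / `good_bad` tables and round-trips a filename through the private-use area:

```python
# windows maps illegal filename characters to U+F0xx (e000..f8ff is
# basic-plane private-use), and surrogateescape bytes (U+DCxx) get
# parked at U+F1xx so they survive the trip through the windows API
bad_good = {}
good_bad = {}

for ch in '<>:"\\|?*':
    bad_good[ch] = chr(ord(ch) + 0xF000)

for n in range(0, 0x100):
    bad_good[chr(n + 0xDC00)] = chr(n + 0xF100)

for k, v in bad_good.items():
    good_bad[v] = k


def enwin(txt):
    # server-side name -> windows-safe name
    return "".join([bad_good.get(x, x) for x in txt])


def dewin(txt):
    # windows-safe name -> original name
    return "".join([good_bad.get(x, x) for x in txt])
```

`dewin(enwin(name))` is the identity for any name, and names without illegal characters pass through untouched.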
bin/mtag/README.md

```diff
@@ -6,9 +6,13 @@ some of these rely on libraries which are not MIT-compatible
 
 * [audio-bpm.py](./audio-bpm.py) detects the BPM of music using the BeatRoot Vamp Plugin; imports GPL2
 * [audio-key.py](./audio-key.py) detects the melodic key of music using the Mixxx fork of keyfinder; imports GPL3
-* [media-hash.py](./media-hash.py) generates checksums for audio and video streams; uses FFmpeg (LGPL or GPL)
 
-these do not have any problematic dependencies:
+these invoke standalone programs which are GPL or similar, so is legally fine for most purposes:
+
+* [media-hash.py](./media-hash.py) generates checksums for audio and video streams; uses FFmpeg (LGPL or GPL)
+* [image-noexif.py](./image-noexif.py) removes exif tags from images; uses exiftool (GPLv1 or artistic-license)
+
+these do not have any problematic dependencies at all:
 
 * [cksum.py](./cksum.py) computes various checksums
 * [exe.py](./exe.py) grabs metadata from .exe and .dll files (example for retrieving multiple tags with one parser)
```
bin/mtag/audio-bpm.py

```diff
@@ -19,18 +19,18 @@ dep: ffmpeg
 def det(tf):
     # fmt: off
     sp.check_call([
-        "ffmpeg",
-        "-nostdin",
-        "-hide_banner",
-        "-v", "fatal",
-        "-ss", "13",
-        "-y", "-i", fsenc(sys.argv[1]),
-        "-map", "0:a:0",
-        "-ac", "1",
-        "-ar", "22050",
-        "-t", "300",
-        "-f", "f32le",
-        tf
+        b"ffmpeg",
+        b"-nostdin",
+        b"-hide_banner",
+        b"-v", b"fatal",
+        b"-ss", b"13",
+        b"-y", b"-i", fsenc(sys.argv[1]),
+        b"-map", b"0:a:0",
+        b"-ac", b"1",
+        b"-ar", b"22050",
+        b"-t", b"300",
+        b"-f", b"f32le",
+        fsenc(tf)
     ])
     # fmt: on
 
```
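The hunk above switches every ffmpeg argv element to bytes and wraps paths in `fsenc`, so filenames that are not valid unicode can still be passed to the subprocess. The actual `fsenc` helper lives in the plugin and is not shown in this diff; a plausible stand-in is just the filesystem-encoding escape:

```python
import os


def fsenc(path):
    # assumption: behaves like os.fsencode, i.e. encode with the
    # filesystem encoding using surrogateescape so undecodable
    # names round-trip through bytes unchanged
    return os.fsencode(path)
```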
@@ -23,15 +23,15 @@ dep: ffmpeg
 def det(tf):
     # fmt: off
     sp.check_call([
-        "ffmpeg",
-        "-nostdin",
-        "-hide_banner",
-        "-v", "fatal",
-        "-y", "-i", fsenc(sys.argv[1]),
-        "-map", "0:a:0",
-        "-t", "300",
-        "-sample_fmt", "s16",
-        tf
+        b"ffmpeg",
+        b"-nostdin",
+        b"-hide_banner",
+        b"-v", b"fatal",
+        b"-y", b"-i", fsenc(sys.argv[1]),
+        b"-map", b"0:a:0",
+        b"-t", b"300",
+        b"-sample_fmt", b"s16",
+        fsenc(tf)
     ])
     # fmt: on
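The two hunks above switch every ffmpeg argument from `str` to `bytes` and wrap the output path in `fsenc()`. A minimal sketch of why, with a stand-in `fsenc` (an assumption modeled on the fallback used elsewhere in `bin/mtag`, not the exact `copyparty.util` implementation): a bytes argv lets filenames that are not valid unicode pass through to the subprocess unchanged.

```python
import sys

def fsenc(p):
    # stand-in for copyparty.util.fsenc (an assumption, not the exact
    # implementation): encode a path to bytes with surrogateescape so
    # undecodable filenames round-trip intact (PEP 383)
    return p.encode(sys.getfilesystemencoding(), "surrogateescape")

# a filename carrying a lone surrogate (as produced when listing a
# non-utf8 filesystem) survives as bytes, whereas a mixed str/bytes
# argv can fail when the subprocess module re-encodes it
weird = "tra\udcffck.flac"
argv = [b"ffmpeg", b"-nostdin", b"-i", fsenc(weird)]
```

Passing `fsenc(tf)` for the output path keeps the whole argv uniformly bytes, which is also what python 2 expects.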
93
bin/mtag/image-noexif.py
Normal file
@@ -0,0 +1,93 @@
+#!/usr/bin/env python3
+
+"""
+remove exif tags from uploaded images
+
+dependencies:
+exiftool
+
+about:
+creates a "noexif" subfolder and puts exif-stripped copies of each image there,
+the reason for the subfolder is to avoid issues with the up2k.db / deduplication:
+
+if the original image is modified in-place, then copyparty will keep the original
+hash in up2k.db for a while (until the next volume rescan), so if the image is
+reuploaded after a rescan then the upload will be renamed and kept as a dupe
+
+alternatively you could switch the logic around, making a copy of the original
+image into a subfolder named "exif" and modify the original in-place, but then
+up2k.db will be out of sync until the next rescan, so any additional uploads
+of the same image will get symlinked (deduplicated) to the modified copy
+instead of the original in "exif"
+
+or maybe delete the original image after processing, that would kinda work too
+
+example copyparty config to use this:
+-v/mnt/nas/pics:pics:rwmd,ed:c,e2ts,mte=+noexif:c,mtp=noexif=ejpg,ejpeg,ad,bin/mtag/image-noexif.py
+
+explained:
+for realpath /mnt/nas/pics (served at /pics) with read-write-modify-delete for ed,
+enable file analysis on upload (e2ts),
+append "noexif" to the list of known tags (mte),
+and use mtp plugin "bin/mtag/image-noexif.py" to provide that tag,
+do this on all uploads with the file extension "jpg" or "jpeg",
+ad = parse file regardless if FFmpeg thinks it is audio or not
+
+PS: this requires e2ts to be functional,
+meaning you need to do at least one of these:
+* apt install ffmpeg
+* pip3 install mutagen
+and your python must have sqlite3 support compiled in
+"""
+
+
+import os
+import sys
+import time
+import filecmp
+import subprocess as sp
+
+try:
+    from copyparty.util import fsenc
+except:
+
+    def fsenc(p):
+        return p.encode("utf-8")
+
+
+def main():
+    cwd, fn = os.path.split(sys.argv[1])
+    if os.path.basename(cwd) == "noexif":
+        return
+
+    os.chdir(cwd)
+    f1 = fsenc(fn)
+    f2 = os.path.join(b"noexif", f1)
+    cmd = [
+        b"exiftool",
+        b"-exif:all=",
+        b"-iptc:all=",
+        b"-xmp:all=",
+        b"-P",
+        b"-o",
+        b"noexif/",
+        b"--",
+        f1,
+    ]
+    sp.check_output(cmd)
+    if not os.path.exists(f2):
+        print("failed")
+        return
+
+    if filecmp.cmp(f1, f2, shallow=False):
+        print("clean")
+    else:
+        print("exif")
+
+    # lastmod = os.path.getmtime(f1)
+    # times = (int(time.time()), int(lastmod))
+    # os.utime(f2, times)
+
+
+if __name__ == "__main__":
+    main()
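The new plugin's clean/exif verdict rests on `filecmp.cmp` with `shallow=False`: if the exiftool output is byte-identical to the input, nothing was stripped. A tiny standalone demo of that comparison, using temp files in place of exiftool output:

```python
import filecmp
import os
import tempfile

d = tempfile.mkdtemp()
a = os.path.join(d, "a.jpg")
b = os.path.join(d, "b.jpg")
for p in (a, b):
    with open(p, "wb") as f:
        f.write(b"\xff\xd8 jpeg-ish payload")

# shallow=False compares actual file contents, not just os.stat()
# signatures, so identical bytes mean no metadata was removed
same = filecmp.cmp(a, b, shallow=False)
print("clean" if same else "exif")
```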
@@ -13,7 +13,7 @@ try:
 except:

     def fsenc(p):
-        return p
+        return p.encode("utf-8")


 """
@@ -24,13 +24,13 @@ dep: ffmpeg
 def det():
     # fmt: off
     cmd = [
-        "ffmpeg",
-        "-nostdin",
-        "-hide_banner",
-        "-v", "fatal",
-        "-i", fsenc(sys.argv[1]),
-        "-f", "framemd5",
-        "-"
+        b"ffmpeg",
+        b"-nostdin",
+        b"-hide_banner",
+        b"-v", b"fatal",
+        b"-i", fsenc(sys.argv[1]),
+        b"-f", b"framemd5",
+        b"-"
     ]
     # fmt: on
70
bin/up2k.py
@@ -3,7 +3,7 @@ from __future__ import print_function, unicode_literals

 """
 up2k.py: upload to copyparty
-2021-10-31, v0.11, ed <irc.rizon.net>, MIT-Licensed
+2021-11-28, v0.13, ed <irc.rizon.net>, MIT-Licensed
 https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py

 - dependencies: requests
@@ -224,29 +224,47 @@ class CTermsize(object):
 ss = CTermsize()


-def statdir(top):
+def _scd(err, top):
     """non-recursive listing of directory contents, along with stat() info"""
-    if hasattr(os, "scandir"):
-        with os.scandir(top) as dh:
-            for fh in dh:
-                yield [os.path.join(top, fh.name), fh.stat()]
-    else:
-        for name in os.listdir(top):
-            abspath = os.path.join(top, name)
-            yield [abspath, os.stat(abspath)]
+    with os.scandir(top) as dh:
+        for fh in dh:
+            abspath = os.path.join(top, fh.name)
+            try:
+                yield [abspath, fh.stat()]
+            except:
+                err.append(abspath)
+
+
+def _lsd(err, top):
+    """non-recursive listing of directory contents, along with stat() info"""
+    for name in os.listdir(top):
+        abspath = os.path.join(top, name)
+        try:
+            yield [abspath, os.stat(abspath)]
+        except:
+            err.append(abspath)
+
+
+if hasattr(os, "scandir"):
+    statdir = _scd
+else:
+    statdir = _lsd


-def walkdir(top):
+def walkdir(err, top):
     """recursive statdir"""
-    for ap, inf in sorted(statdir(top)):
+    for ap, inf in sorted(statdir(err, top)):
         if stat.S_ISDIR(inf.st_mode):
-            for x in walkdir(ap):
-                yield x
+            try:
+                for x in walkdir(err, ap):
+                    yield x
+            except:
+                err.append(ap)
         else:
             yield ap, inf


-def walkdirs(tops):
+def walkdirs(err, tops):
     """recursive statdir for a list of tops, yields [top, relpath, stat]"""
     sep = "{0}".format(os.sep).encode("ascii")
     for top in tops:
@@ -256,7 +274,7 @@ def walkdirs(tops):
         stop = os.path.dirname(top)

         if os.path.isdir(top):
-            for ap, inf in walkdir(top):
+            for ap, inf in walkdir(err, top):
                 yield stop, ap[len(stop) :].lstrip(sep), inf
         else:
             d, n = top.rsplit(sep, 1)
@@ -372,7 +390,7 @@ def handshake(req_ses, url, file, pw, search):
             r = req_ses.post(url, headers=headers, json=req)
             break
         except:
-            eprint("handshake failed, retry...\n")
+            eprint("handshake failed, retrying: {0}\n".format(file.name))
             time.sleep(1)

     try:
@@ -446,10 +464,21 @@ class Ctl(object):

         nfiles = 0
         nbytes = 0
-        for _, _, inf in walkdirs(ar.files):
+        err = []
+        for _, _, inf in walkdirs(err, ar.files):
             nfiles += 1
             nbytes += inf.st_size

+        if err:
+            eprint("\n# failed to access {0} paths:\n".format(len(err)))
+            for x in err:
+                eprint(x.decode("utf-8", "replace") + "\n")
+
+            eprint("^ failed to access those {0} paths ^\n\n".format(len(err)))
+            if not ar.ok:
+                eprint("aborting because --ok is not set\n")
+                return
+
         eprint("found {0} files, {1}\n\n".format(nfiles, humansize(nbytes)))
         self.nfiles = nfiles
         self.nbytes = nbytes
@@ -460,7 +489,7 @@ class Ctl(object):
         if ar.te:
             req_ses.verify = ar.te

-        self.filegen = walkdirs(ar.files)
+        self.filegen = walkdirs([], ar.files)
         if ar.safe:
             self.safe()
         else:
@@ -476,7 +505,7 @@ class Ctl(object):
             print("{0} {1}\n hash...".format(self.nfiles - nf, upath))
             get_hashlist(file, None)

-            burl = self.ar.url[:8] + self.ar.url[8:].split("/")[0] + "/"
+            burl = self.ar.url[:12] + self.ar.url[8:].split("/")[0] + "/"
             while True:
                 print(" hs...")
                 hs = handshake(req_ses, self.ar.url, file, self.ar.a, search)
@@ -744,7 +773,7 @@ class Ctl(object):
             try:
                 upload(req_ses, file, cid, self.ar.a)
             except:
-                eprint("upload failed, retry...\n")
+                eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
                 pass  # handshake will fix it

         with self.mutex:
@@ -783,6 +812,7 @@ source file/folder selection uses rsync syntax, meaning that:
     ap.add_argument("files", type=unicode, nargs="+", help="files and/or folders to process")
     ap.add_argument("-a", metavar="PASSWORD", help="password")
     ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
+    ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
     ap = app.add_argument_group('performance tweaks')
     ap.add_argument("-j", type=int, metavar="THREADS", default=4, help="parallel connections")
     ap.add_argument("-nh", action="store_true", help="disable hashing while uploading")
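The up2k.py rewrite above threads an `err` list through the directory walk so unreadable paths are collected and reported at the end instead of aborting the scan. A condensed standalone sketch of the same pattern (a hypothetical simplified version using `os.listdir`, not the exact up2k code):

```python
import os
import stat
import tempfile

def walkdir(err, top):
    """recursively yield (abspath, stat) pairs; paths that cannot
    be listed or stat()ed are appended to err instead of raised"""
    try:
        names = sorted(os.listdir(top))
    except OSError:
        err.append(top)
        return

    for name in names:
        ap = os.path.join(top, name)
        try:
            st = os.stat(ap)
        except OSError:
            err.append(ap)
            continue

        if stat.S_ISDIR(st.st_mode):
            for x in walkdir(err, ap):
                yield x
        else:
            yield ap, st

# usage: scan a folder, keeping inaccessible paths instead of crashing
err = []
root = tempfile.mkdtemp()
open(os.path.join(root, "f.txt"), "w").close()
files = list(walkdir(err, root))
```

The caller can then decide, as the `--ok` flag above does, whether a non-empty `err` is fatal or merely worth printing.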
@@ -13,7 +13,7 @@

 upstream cpp {
     server 127.0.0.1:3923;
-    keepalive 120;
+    keepalive 1;
 }
 server {
     listen 443 ssl;
@@ -25,26 +25,34 @@ ANYWIN = WINDOWS or sys.platform in ["msys"]
 MACOS = platform.system() == "Darwin"


-def get_unix_home():
-    try:
-        v = os.environ["XDG_CONFIG_HOME"]
-        if not v:
-            raise Exception()
-        ret = os.path.normpath(v)
-        os.listdir(ret)
-        return ret
-    except:
-        pass
-
-    try:
-        v = os.path.expanduser("~/.config")
-        if v.startswith("~"):
-            raise Exception()
-        ret = os.path.normpath(v)
-        os.listdir(ret)
-        return ret
-    except:
-        return "/tmp"
+def get_unixdir():
+    paths = [
+        (os.environ.get, "XDG_CONFIG_HOME"),
+        (os.path.expanduser, "~/.config"),
+        (os.environ.get, "TMPDIR"),
+        (os.environ.get, "TEMP"),
+        (os.environ.get, "TMP"),
+        (unicode, "/tmp"),
+    ]
+    for chk in [os.listdir, os.mkdir]:
+        for pf, pa in paths:
+            try:
+                p = pf(pa)
+                # print(chk.__name__, p, pa)
+                if not p or p.startswith("~"):
+                    continue
+
+                p = os.path.normpath(p)
+                chk(p)
+                p = os.path.join(p, "copyparty")
+                if not os.path.isdir(p):
+                    os.mkdir(p)
+
+                return p
+            except:
+                pass
+
+    raise Exception("could not find a writable path for config")


 class EnvParams(object):
@@ -59,7 +67,7 @@ class EnvParams(object):
         elif sys.platform == "darwin":
             self.cfg = os.path.expanduser("~/Library/Preferences/copyparty")
         else:
-            self.cfg = get_unix_home() + "/copyparty"
+            self.cfg = get_unixdir()

         self.cfg = self.cfg.replace("\\", "/")
         try:
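`get_unixdir()` above probes the candidates in two passes: the first pass (`os.listdir`) only accepts parents that already exist, the second (`os.mkdir`) is allowed to create them. A simplified sketch of the probe loop with the candidate list passed in so it is testable (a hypothetical helper; the real function hardcodes XDG_CONFIG_HOME, ~/.config, TMPDIR, TEMP, TMP and /tmp):

```python
import os
import tempfile

def pick_unixdir(candidates, appname="copyparty"):
    # pass 1 requires the parent to already exist (os.listdir succeeds);
    # pass 2 tolerates creating it (os.mkdir on an existing dir raises,
    # which the except swallows, so existing parents still qualify)
    for chk in [os.listdir, os.mkdir]:
        for p in candidates:
            try:
                if not p or p.startswith("~"):
                    continue  # unset env var or failed ~ expansion

                p = os.path.normpath(p)
                chk(p)  # probe the parent
                p = os.path.join(p, appname)
                if not os.path.isdir(p):
                    os.mkdir(p)

                return p
            except OSError:
                pass

    raise Exception("could not find a writable path for config")

# usage: None models an unset env var, then a real writable dir
base = tempfile.mkdtemp()
cfg = pick_unixdir([None, base])
```

Preferring existing parents first keeps the config in XDG locations when possible and only falls back to tmp dirs when nothing else is usable.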
@@ -23,7 +23,7 @@ from textwrap import dedent
 from .__init__ import E, WINDOWS, ANYWIN, VT100, PY2, unicode
 from .__version__ import S_VERSION, S_BUILD_DT, CODENAME
 from .svchub import SvcHub
-from .util import py_desc, align_tab, IMPLICATIONS, ansi_re
+from .util import py_desc, align_tab, IMPLICATIONS, ansi_re, min_ex
 from .authsrv import re_vol

 HAVE_SSL = True
@@ -222,6 +222,54 @@ def sighandler(sig=None, frame=None):
     print("\n".join(msg))


+def disable_quickedit():
+    import ctypes
+    import atexit
+    from ctypes import wintypes
+
+    def ecb(ok, fun, args):
+        if not ok:
+            err = ctypes.get_last_error()
+            if err:
+                raise ctypes.WinError(err)
+        return args
+
+    k32 = ctypes.WinDLL("kernel32", use_last_error=True)
+    if PY2:
+        wintypes.LPDWORD = ctypes.POINTER(wintypes.DWORD)
+
+    k32.GetStdHandle.errcheck = ecb
+    k32.GetConsoleMode.errcheck = ecb
+    k32.SetConsoleMode.errcheck = ecb
+    k32.GetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.LPDWORD)
+    k32.SetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.DWORD)
+
+    def cmode(out, mode=None):
+        h = k32.GetStdHandle(-11 if out else -10)
+        if mode:
+            return k32.SetConsoleMode(h, mode)
+
+        mode = wintypes.DWORD()
+        k32.GetConsoleMode(h, ctypes.byref(mode))
+        return mode.value
+
+    # disable quickedit
+    mode = orig_in = cmode(False)
+    quickedit = 0x40
+    extended = 0x80
+    mask = quickedit + extended
+    if mode & mask != extended:
+        atexit.register(cmode, False, orig_in)
+        cmode(False, mode & ~mask | extended)
+
+    # enable colors in case the os.system("rem") trick ever stops working
+    if VT100:
+        mode = orig_out = cmode(True)
+        if mode & 4 != 4:
+            atexit.register(cmode, True, orig_out)
+            cmode(True, mode | 4)
+
+
 def run_argparse(argv, formatter):
     ap = argparse.ArgumentParser(
         formatter_class=formatter,
@@ -302,6 +350,8 @@ def run_argparse(argv, formatter):

 \033[0mdatabase, general:
 \033[36me2d\033[35m sets -e2d (all -e2* args can be set using ce2* volflags)
+\033[36md2ts\033[35m disables metadata collection for existing files
+\033[36md2ds\033[35m disables onboot indexing, overrides -e2ds*
 \033[36md2t\033[35m disables metadata collection, overrides -e2t*
 \033[36md2d\033[35m disables all database stuff, overrides -e2*
 \033[36mnohash=\\.iso$\033[35m skips hashing file contents if path matches *.iso
@@ -368,6 +418,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("-emp", action="store_true", help="enable markdown plugins")
     ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="md-editor mod-chk rate")
     ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-forms; examples: [stash], [save,get]")
+    ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="window title, for example '$ip-10.1.2.' or '$ip-'")

     ap2 = ap.add_argument_group('upload options')
     ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads")
@@ -376,6 +427,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("--no-fpool", action="store_true", help="disable file-handle pooling -- instead, repeatedly close and reopen files during upload")
     ap2.add_argument("--use-fpool", action="store_true", help="force file-handle pooling, even if copyparty thinks you're better off without")
     ap2.add_argument("--no-symlink", action="store_true", help="duplicate file contents instead")
+    ap2.add_argument("--reg-cap", metavar="N", type=int, default=9000, help="max number of uploads to keep in memory when running without -e2d")

     ap2 = ap.add_argument_group('network options')
     ap2.add_argument("-i", metavar="IP", type=u, default="0.0.0.0", help="ip to bind (comma-sep.)")
@@ -383,6 +435,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; 0 = tcp, 1 = origin (first x-fwd), 2 = cloudflare, 3 = nginx, -1 = closest proxy")
     ap2.add_argument("--s-wr-sz", metavar="B", type=int, default=256*1024, help="socket write size in bytes")
     ap2.add_argument("--s-wr-slp", metavar="SEC", type=float, default=0, help="socket write delay in seconds")
+    ap2.add_argument("--rsp-slp", metavar="SEC", type=float, default=0, help="response delay in seconds")

     ap2 = ap.add_argument_group('SSL/TLS options')
     ap2.add_argument("--http-only", action="store_true", help="disable ssl/tls")
@@ -394,6 +447,7 @@ def run_argparse(argv, formatter):

     ap2 = ap.add_argument_group('opt-outs')
     ap2.add_argument("-nw", action="store_true", help="disable writes (benchmark)")
+    ap2.add_argument("--keep-qem", action="store_true", help="do not disable quick-edit-mode on windows")
     ap2.add_argument("--no-del", action="store_true", help="disable delete operations")
     ap2.add_argument("--no-mv", action="store_true", help="disable move/rename operations")
     ap2.add_argument("-nih", action="store_true", help="no info hostname")
@@ -435,6 +489,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("--no-vthumb", action="store_true", help="disable video thumbnails")
     ap2.add_argument("--th-size", metavar="WxH", default="320x256", help="thumbnail res")
     ap2.add_argument("--th-mt", metavar="CORES", type=int, default=cores, help="num cpu cores to use for generating thumbnails")
+    ap2.add_argument("--th-convt", metavar="SEC", type=int, default=60, help="conversion timeout in seconds")
     ap2.add_argument("--th-no-crop", action="store_true", help="dynamic height; show full image")
     ap2.add_argument("--th-no-jpg", action="store_true", help="disable jpg output")
     ap2.add_argument("--th-no-webp", action="store_true", help="disable webp output")
@@ -477,6 +532,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
     ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
     ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
+    ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")

     ap2 = ap.add_argument_group('debug options')
     ap2.add_argument("--no-sendfile", action="store_true", help="disable sendfile")
@@ -519,7 +575,7 @@ def main(argv=None):
     if HAVE_SSL:
         ensure_cert()

-    for k, v in zip(argv, argv[1:]):
+    for k, v in zip(argv[1:], argv[2:]):
         if k == "-c":
             supp = args_from_cfg(v)
             argv.extend(supp)
@@ -547,6 +603,15 @@ def main(argv=None):
     except AssertionError:
         al = run_argparse(argv, Dodge11874)

+    if WINDOWS and not al.keep_qem:
+        try:
+            disable_quickedit()
+        except:
+            print("\nfailed to disable quick-edit-mode:\n" + min_ex() + "\n")
+
+    if not VT100:
+        al.wintitle = ""
+
     nstrs = []
     anymod = False
     for ostr in al.v or []:
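The `main()` fix above changes `zip(argv, argv[1:])` to `zip(argv[1:], argv[2:])` so the flag/value pairing skips `argv[0]` (the program name). A minimal sketch of that pairing as a hypothetical standalone helper (not copyparty's actual API):

```python
def cfg_files(argv):
    # pair each argument with its successor, starting after argv[0],
    # and collect the value following every "-c" flag
    return [v for k, v in zip(argv[1:], argv[2:]) if k == "-c"]

# the program name at argv[0] is never treated as a flag
found = cfg_files(["copyparty", "-c", "a.conf", "-v", ".::r", "-c", "b.conf"])
```

With the old `zip(argv, argv[1:])`, a program invoked under a name like `-c` would have misread its first real argument as a config path.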
@@ -1,8 +1,8 @@
 # coding: utf-8

-VERSION = (1, 1, 2)
+VERSION = (1, 1, 10)
 CODENAME = "opus"
-BUILD_DT = (2021, 11, 12)
+BUILD_DT = (2021, 12, 16)

 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -926,6 +926,14 @@ class AuthSrv(object):
                 vol.flags["d2t"] = True
                 vol.flags = {k: v for k, v in vol.flags.items() if not k.startswith(rm)}

+            # d2ds drops all onboot scans for a volume
+            for grp, rm in [["d2ds", "e2ds"], ["d2ts", "e2ts"]]:
+                if not vol.flags.get(grp, False):
+                    continue
+
+                vol.flags["d2ts"] = True
+                vol.flags = {k: v for k, v in vol.flags.items() if not k.startswith(rm)}
+
             # mt* needs e2t so drop those too
             for grp, rm in [["e2t", "mt"]]:
                 if vol.flags.get(grp, False):
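The d2ds/d2ts handling above clears every volflag whose name starts with the matching e2* prefix. The stripping itself is just a prefix-filtered dict comprehension, sketched here with made-up flag names:

```python
def drop_flags(flags, rm):
    # keep only the flags whose name does not start with the
    # prefix being disabled (e.g. rm="e2ds" kills e2ds and e2dsa)
    return {k: v for k, v in flags.items() if not k.startswith(rm)}

flags = {"e2ds": True, "e2dsa": True, "e2t": True, "d2ds": True}
remaining = sorted(drop_flags(flags, "e2ds"))
```

Prefix matching is what lets one d2* flag override a whole family of e2* variants at once.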
@@ -2,7 +2,7 @@
 from __future__ import print_function, unicode_literals

 import os
-from ..util import fsenc, fsdec
+from ..util import fsenc, fsdec, SYMTIME
 from . import path


@@ -55,5 +55,8 @@ def unlink(p):
     return os.unlink(fsenc(p))


-def utime(p, times=None):
-    return os.utime(fsenc(p), times)
+def utime(p, times=None, follow_symlinks=True):
+    if SYMTIME:
+        return os.utime(fsenc(p), times, follow_symlinks=follow_symlinks)
+    else:
+        return os.utime(fsenc(p), times)
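The `utime` wrapper above only forwards `follow_symlinks` when `SYMTIME` says the platform supports it. The stdlib capability check for that is `os.utime in os.supports_follow_symlinks`; a sketch under that assumption (SYMTIME's exact definition in `copyparty.util` may differ, e.g. it may also gate on the python version):

```python
import os
import tempfile

# whether os.utime accepts follow_symlinks on this platform
SYMTIME = os.utime in os.supports_follow_symlinks

def utime(p, times=None, follow_symlinks=True):
    # forward follow_symlinks only where supported; otherwise fall
    # back to the classic call, which always follows the link
    if SYMTIME:
        return os.utime(p, times, follow_symlinks=follow_symlinks)
    return os.utime(p, times)

# usage: pin a file's atime/mtime to a fixed timestamp
fd, path = tempfile.mkstemp()
os.close(fd)
utime(path, (1000000000, 1000000000))
```

This matters for setting timestamps on the symlinks that up2k's deduplication creates, without dereferencing them.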
@@ -2,7 +2,7 @@
 from __future__ import print_function, unicode_literals

 import os
-from ..util import fsenc, fsdec
+from ..util import fsenc, fsdec, SYMTIME


 def abspath(p):
@@ -13,8 +13,11 @@ def exists(p):
     return os.path.exists(fsenc(p))


-def getmtime(p):
-    return os.path.getmtime(fsenc(p))
+def getmtime(p, follow_symlinks=True):
+    if not follow_symlinks and SYMTIME:
+        return os.lstat(fsenc(p)).st_mtime
+    else:
+        return os.path.getmtime(fsenc(p))


 def getsize(p):
@@ -60,6 +60,7 @@ class HttpCli(object):
|
|||||||
self.bufsz = 1024 * 32
|
self.bufsz = 1024 * 32
|
||||||
self.hint = None
|
self.hint = None
|
||||||
self.trailing_slash = True
|
self.trailing_slash = True
|
||||||
|
self.out_headerlist = []
|
||||||
self.out_headers = {
|
self.out_headers = {
|
||||||
"Access-Control-Allow-Origin": "*",
|
"Access-Control-Allow-Origin": "*",
|
||||||
"Cache-Control": "no-store; max-age=0",
|
"Cache-Control": "no-store; max-age=0",
|
||||||
@@ -91,6 +92,7 @@ class HttpCli(object):
|
|||||||
tpl = self.conn.hsrv.j2[name]
|
tpl = self.conn.hsrv.j2[name]
|
||||||
if ka:
|
if ka:
|
||||||
ka["ts"] = self.conn.hsrv.cachebuster()
|
ka["ts"] = self.conn.hsrv.cachebuster()
|
||||||
|
ka["svcname"] = self.args.doctitle
|
||||||
return tpl.render(**ka)
|
return tpl.render(**ka)
|
||||||
|
|
||||||
return tpl
|
return tpl
|
||||||
@@ -126,7 +128,8 @@ class HttpCli(object):
|
|||||||
self.loud_reply(unicode(ex), status=ex.code, volsan=True)
|
self.loud_reply(unicode(ex), status=ex.code, volsan=True)
|
||||||
return self.keepalive
|
return self.keepalive
|
||||||
|
|
||||||
# time.sleep(0.4)
|
if self.args.rsp_slp:
|
||||||
|
time.sleep(self.args.rsp_slp)
|
||||||
|
|
||||||
# normalize incoming headers to lowercase;
|
# normalize incoming headers to lowercase;
|
||||||
# outgoing headers however are Correct-Case
|
         # outgoing headers however are Correct-Case

@@ -225,10 +228,10 @@ class HttpCli(object):
         self.gvol = self.asrv.vfs.aget[self.uname]

         if pwd and "pw" in self.ouparam and pwd != cookies.get("cppwd"):
-            self.out_headers["Set-Cookie"] = self.get_pwd_cookie(pwd)[0]
+            self.out_headerlist.append(("Set-Cookie", self.get_pwd_cookie(pwd)[0]))

-        ua = self.headers.get("user-agent", "")
-        self.is_rclone = ua.startswith("rclone/")
+        self.ua = self.headers.get("user-agent", "")
+        self.is_rclone = self.ua.startswith("rclone/")
         if self.is_rclone:
             uparam["raw"] = False
             uparam["dots"] = False
@@ -283,12 +286,19 @@ class HttpCli(object):
         n = "604800" if cache == "i" else cache or "69"
         self.out_headers["Cache-Control"] = "max-age=" + n

+    def k304(self):
+        k304 = self.cookies.get("k304")
+        return k304 == "y" or ("; Trident/" in self.ua and not k304)
+
     def send_headers(self, length, status=200, mime=None, headers=None):
         response = ["{} {} {}".format(self.http_ver, status, HTTPCODE[status])]

         if length is not None:
             response.append("Content-Length: " + unicode(length))

+        if status == 304 and self.k304():
+            self.keepalive = False
+
         # close if unknown length, otherwise take client's preference
         response.append("Connection: " + ("Keep-Alive" if self.keepalive else "Close"))

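The `k304` helper added above decides whether a 304 response should also drop the connection: an explicit `k304=y` cookie opts in, and old Internet Explorer (any UA carrying a Trident token) gets it by default unless the user opted out. A standalone sketch of the same decision, outside the class (function names here are illustrative, not copyparty's):

```python
def k304_enabled(cookies, ua):
    # explicit opt-in ("y"), or default-on for Trident-based browsers
    # unless the cookie carries an explicit opt-out value
    k304 = cookies.get("k304")
    return k304 == "y" or ("; Trident/" in ua and not k304)


def keepalive_after_304(cookies, ua):
    # send_headers() sets keepalive = False on a 304 when the toggle is active
    return not k304_enabled(cookies, ua)
```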
@@ -302,7 +312,7 @@ class HttpCli(object):

         self.out_headers["Content-Type"] = mime

-        for k, v in self.out_headers.items():
+        for k, v in list(self.out_headers.items()) + self.out_headerlist:
             response.append("{}: {}".format(k, v))

         try:
@@ -428,6 +438,15 @@ class HttpCli(object):
         if "ups" in self.uparam:
             return self.tx_ups()

+        if "k304" in self.uparam:
+            return self.set_k304()
+
+        if "am_js" in self.uparam:
+            return self.set_am_js()
+
+        if "reset" in self.uparam:
+            return self.set_cfg_reset()
+
         if "h" in self.uparam:
             return self.tx_mounts()

@@ -505,7 +524,7 @@ class HttpCli(object):
             return self.handle_stash()

         if "save" in opt:
-            post_sz, _, _, path = self.dump_to_file()
+            post_sz, _, _, _, path = self.dump_to_file()
             self.log("urlform: {} bytes, {}".format(post_sz, path))
         elif "print" in opt:
             reader, _ = self.get_body_reader()
@@ -590,8 +609,8 @@ class HttpCli(object):
         alg = alg or "gz"  # def.pk
         try:
             # config-forced opts
-            alg, lv = pk.split(",")
-            lv[alg] = int(lv)
+            alg, nlv = pk.split(",")
+            lv[alg] = int(nlv)
         except:
             pass

@@ -621,7 +640,7 @@ class HttpCli(object):
         with ren_open(fn, *open_a, **params) as f:
             f, fn = f["orz"]
             path = os.path.join(fdir, fn)
-            post_sz, _, sha_b64 = hashcopy(reader, f)
+            post_sz, sha_hex, sha_b64 = hashcopy(reader, f)

         if lim:
             lim.nup(self.ip)
@@ -645,13 +664,14 @@ class HttpCli(object):
             time.time(),
         )

-        return post_sz, sha_b64, remains, path
+        return post_sz, sha_hex, sha_b64, remains, path

     def handle_stash(self):
-        post_sz, sha_b64, remains, path = self.dump_to_file()
+        post_sz, sha_hex, sha_b64, remains, path = self.dump_to_file()
         spd = self._spd(post_sz)
         self.log("{} wrote {}/{} bytes to {}".format(spd, post_sz, remains, path))
-        self.reply("{}\n{}\n".format(post_sz, sha_b64).encode("utf-8"))
+        m = "{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56])
+        self.reply(m.encode("utf-8"))
         return True

     def _spd(self, nbytes, add=True):
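`handle_stash` now replies with three lines: byte count, the base64 digest, and the first 56 hex characters of the SHA-512 (i.e. a SHA-512/224-style truncation, matching the length-extension note elsewhere in this diff). A sketch of the reply construction; `digest_pair` is a hypothetical stand-in for copyparty's `hashcopy`, whose exact base64 truncation may differ:

```python
import base64
import hashlib


def digest_pair(data):
    # size, full hex digest, and a urlsafe-b64 form of the hash
    # (simplified stand-in for util.hashcopy's return values)
    h = hashlib.sha512(data)
    sha_b64 = base64.urlsafe_b64encode(h.digest()).decode("ascii")
    return len(data), h.hexdigest(), sha_b64


def stash_reply(data):
    post_sz, sha_hex, sha_b64 = digest_pair(data)
    # 56 hex chars = 224 bits; truncating the digest guards
    # against length-extension on the raw sha512
    return "{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56])
```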
@@ -783,6 +803,10 @@ class HttpCli(object):
         return True

     def handle_search(self, body):
+        idx = self.conn.get_u2idx()
+        if not hasattr(idx, "p_end"):
+            raise Pebkac(500, "sqlite3 is not available on the server; cannot search")
+
         vols = []
         seen = {}
         for vtop in self.rvol:
@@ -794,7 +818,6 @@ class HttpCli(object):
             seen[vfs] = True
             vols.append([vfs.vpath, vfs.realpath, vfs.flags])

-        idx = self.conn.get_u2idx()
         t0 = time.time()
         if idx.p_end:
             penalty = 0.7
@@ -854,63 +877,63 @@ class HttpCli(object):
         response = x.get()
         chunksize, cstart, path, lastmod = response

-        if self.args.nw:
-            path = os.devnull
-
-        if remains > chunksize:
-            raise Pebkac(400, "your chunk is too big to fit")
-
-        self.log("writing {} #{} @{} len {}".format(path, chash, cstart, remains))
-
-        reader = read_socket(self.sr, remains)
-
-        f = None
-        fpool = not self.args.no_fpool
-        if fpool:
-            with self.mutex:
-                try:
-                    f = self.u2fh.pop(path)
-                except:
-                    pass
-
-        f = f or open(fsenc(path), "rb+", 512 * 1024)
-
-        try:
-            f.seek(cstart[0])
-            post_sz, _, sha_b64 = hashcopy(reader, f)
-
-            if sha_b64 != chash:
-                raise Pebkac(
-                    400,
-                    "your chunk got corrupted somehow (received {} bytes); expected vs received hash:\n{}\n{}".format(
-                        post_sz, chash, sha_b64
-                    ),
-                )
-
-            if len(cstart) > 1 and path != os.devnull:
-                self.log(
-                    "clone {} to {}".format(
-                        cstart[0], " & ".join(unicode(x) for x in cstart[1:])
-                    )
-                )
-                ofs = 0
-                while ofs < chunksize:
-                    bufsz = min(chunksize - ofs, 4 * 1024 * 1024)
-                    f.seek(cstart[0] + ofs)
-                    buf = f.read(bufsz)
-                    for wofs in cstart[1:]:
-                        f.seek(wofs + ofs)
-                        f.write(buf)
-
-                    ofs += len(buf)
-
-                self.log("clone {} done".format(cstart[0]))
-        finally:
-            if not fpool:
-                f.close()
-            else:
-                with self.mutex:
-                    self.u2fh.put(path, f)
+        try:
+            if self.args.nw:
+                path = os.devnull
+
+            if remains > chunksize:
+                raise Pebkac(400, "your chunk is too big to fit")
+
+            self.log("writing {} #{} @{} len {}".format(path, chash, cstart, remains))
+
+            reader = read_socket(self.sr, remains)
+
+            f = None
+            fpool = not self.args.no_fpool
+            if fpool:
+                with self.mutex:
+                    try:
+                        f = self.u2fh.pop(path)
+                    except:
+                        pass
+
+            f = f or open(fsenc(path), "rb+", 512 * 1024)
+
+            try:
+                f.seek(cstart[0])
+                post_sz, _, sha_b64 = hashcopy(reader, f)
+
+                if sha_b64 != chash:
+                    m = "your chunk got corrupted somehow (received {} bytes); expected vs received hash:\n{}\n{}"
+                    raise Pebkac(400, m.format(post_sz, chash, sha_b64))
+
+                if len(cstart) > 1 and path != os.devnull:
+                    self.log(
+                        "clone {} to {}".format(
+                            cstart[0], " & ".join(unicode(x) for x in cstart[1:])
+                        )
+                    )
+                    ofs = 0
+                    while ofs < chunksize:
+                        bufsz = min(chunksize - ofs, 4 * 1024 * 1024)
+                        f.seek(cstart[0] + ofs)
+                        buf = f.read(bufsz)
+                        for wofs in cstart[1:]:
+                            f.seek(wofs + ofs)
+                            f.write(buf)
+
+                        ofs += len(buf)
+
+                    self.log("clone {} done".format(cstart[0]))
+            finally:
+                if not fpool:
+                    f.close()
+                else:
+                    with self.mutex:
+                        self.u2fh.put(path, f)
+        finally:
+            x = self.conn.hsrv.broker.put(True, "up2k.release_chunk", ptop, wark, chash)
+            x.get()  # block client until released

         x = self.conn.hsrv.broker.put(True, "up2k.confirm_chunk", ptop, wark, chash)
         x = x.get()
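The clone branch above copies the freshly written chunk from its first offset to every duplicate offset, in bounded buffers so a huge chunk never lands in memory at once. The same loop, extracted over any seekable binary file object (function name is illustrative):

```python
def clone_chunk(f, cstart, chunksize, bufsz=4 * 1024 * 1024):
    # copy the chunk stored at cstart[0] to every other offset in cstart,
    # reading and writing at most bufsz bytes per iteration
    ofs = 0
    while ofs < chunksize:
        n = min(chunksize - ofs, bufsz)
        f.seek(cstart[0] + ofs)
        buf = f.read(n)
        for wofs in cstart[1:]:
            f.seek(wofs + ofs)
            f.write(buf)
        ofs += len(buf)
```

With an in-memory file, cloning a 3-byte chunk from offset 0 to offsets 4 and 8 fills all three slots.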
@@ -957,15 +980,13 @@ class HttpCli(object):
     def get_pwd_cookie(self, pwd):
         if pwd in self.asrv.iacct:
             msg = "login ok"
-            dt = datetime.utcfromtimestamp(time.time() + 60 * 60 * 24 * 365)
-            exp = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
+            dur = 60 * 60 * 24 * 365
         else:
             msg = "naw dude"
             pwd = "x"  # nosec
-            exp = "Fri, 15 Aug 1997 01:00:00 GMT"
+            dur = None

-        ck = "cppwd={}; Path=/; Expires={}; SameSite=Lax".format(pwd, exp)
-        return [ck, msg]
+        return [gencookie("cppwd", pwd, dur), msg]

     def handle_mkdir(self):
         new_dir = self.parser.require("name", 512)
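`get_pwd_cookie` now delegates cookie assembly to a shared `gencookie(name, value, dur)` helper, where `dur` is a lifetime in seconds and `None` yields an already-expired cookie so the browser deletes it. A reimplementation assumed from the strings the old code produced (the real helper lives in copyparty's util module and may differ in detail):

```python
import time
from datetime import datetime, timedelta


def gencookie(k, v, dur):
    if dur:
        dt = datetime(1970, 1, 1) + timedelta(seconds=time.time() + dur)
        exp = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
    else:
        # a date in the past tells the browser to drop the cookie
        exp = "Fri, 15 Aug 1997 01:00:00 GMT"

    return "{}={}; Path=/; Expires={}; SameSite=Lax".format(k, v, exp)
```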
@@ -1073,7 +1094,7 @@ class HttpCli(object):
             f, fname = f["orz"]
             abspath = os.path.join(fdir, fname)
             self.log("writing to {}".format(abspath))
-            sz, sha512_hex, _ = hashcopy(p_data, f)
+            sz, sha_hex, sha_b64 = hashcopy(p_data, f)
             if sz == 0:
                 raise Pebkac(400, "empty files in post")

@@ -1086,7 +1107,7 @@ class HttpCli(object):
                 bos.unlink(abspath)
                 raise

-            files.append([sz, sha512_hex, p_file, fname, abspath])
+            files.append([sz, sha_hex, sha_b64, p_file, fname, abspath])
             dbv, vrem = vfs.get_dbv(rem)
             self.conn.hsrv.broker.put(
                 False,
@@ -1138,7 +1159,7 @@ class HttpCli(object):
             jmsg["error"] = errmsg
             errmsg = "ERROR: " + errmsg

-        for sz, sha512, ofn, lfn, ap in files:
+        for sz, sha_hex, sha_b64, ofn, lfn, ap in files:
             vsuf = ""
             if self.can_read and "fk" in vfs.flags:
                 vsuf = "?k=" + gen_filekey(
@@ -1149,8 +1170,13 @@ class HttpCli(object):
                 )[: vfs.flags["fk"]]

             vpath = "{}/{}".format(upload_vpath, lfn).strip("/")
-            msg += 'sha512: {} // {} bytes // <a href="/{}">{}</a> {}\n'.format(
-                sha512[:56], sz, quotep(vpath) + vsuf, html_escape(ofn, crlf=True), vsuf
+            msg += 'sha512: {} // {} // {} bytes // <a href="/{}">{}</a> {}\n'.format(
+                sha_hex[:56],
+                sha_b64,
+                sz,
+                quotep(vpath) + vsuf,
+                html_escape(ofn, crlf=True),
+                vsuf,
             )
             # truncated SHA-512 prevents length extension attacks;
             # using SHA-512/224, optionally SHA-512/256 = :64
@@ -1160,7 +1186,8 @@ class HttpCli(object):
                     self.headers.get("host", "copyparty"),
                     vpath + vsuf,
                 ),
-                "sha512": sha512[:56],
+                "sha512": sha_hex[:56],
+                "sha_b64": sha_b64,
                 "sz": sz,
                 "fn": lfn,
                 "fn_orig": ofn,
@@ -1340,6 +1367,9 @@ class HttpCli(object):
             try:
                 fs_path = req_path + ext
                 st = bos.stat(fs_path)
+                if stat.S_ISDIR(st.st_mode):
+                    continue
+
                 file_ts = max(file_ts, st.st_mtime)
                 editions[ext or "plain"] = [fs_path, st.st_size]
             except:
@@ -1378,8 +1408,7 @@ class HttpCli(object):
         if "gzip" not in supported_editions:
             decompress = True
         else:
-            ua = self.headers.get("user-agent", "")
-            if re.match(r"MSIE [4-6]\.", ua) and " SV1" not in ua:
+            if re.match(r"MSIE [4-6]\.", self.ua) and " SV1" not in self.ua:
                 decompress = True

         if not decompress:
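The gzip-edition fallback above now reuses the cached `self.ua` instead of re-reading the header. The predicate itself targets MSIE 4-6 without the " SV1" token (added by Windows XP SP2), which is the population that mishandles gzip responses; as a standalone check:

```python
import re


def needs_decompress(ua):
    # IE 4-6 without the XP SP2 "SV1" marker cannot be trusted with
    # gzip-encoded bodies, so serve them the decompressed edition
    return bool(re.match(r"MSIE [4-6]\.", ua) and " SV1" not in ua)
```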
@@ -1486,11 +1515,12 @@ class HttpCli(object):
         with open_func(*open_args) as f:
             sendfun = sendfile_kern if use_sendfile else sendfile_py
             remains = sendfun(
-                lower, upper, f, self.s, self.args.s_wr_sz, self.args.s_wr_slp
+                self.log, lower, upper, f, self.s, self.args.s_wr_sz, self.args.s_wr_slp
             )

         if remains > 0:
             logmsg += " \033[31m" + unicode(upper - remains) + "\033[0m"
+            self.keepalive = False

         spd = self._spd((upper - lower) - remains)
         if self.do_log:
@@ -1692,10 +1722,28 @@ class HttpCli(object):
             tagq=vs["tagq"],
             mtpq=vs["mtpq"],
             url_suf=suf,
+            k304=self.k304(),
         )
         self.reply(html.encode("utf-8"))
         return True

+    def set_k304(self):
+        ck = gencookie("k304", self.uparam["k304"], 60 * 60 * 24 * 365)
+        self.out_headerlist.append(("Set-Cookie", ck))
+        self.redirect("", "?h#cc")
+
+    def set_am_js(self):
+        v = "n" if self.uparam["am_js"] == "n" else "y"
+        ck = gencookie("js", v, 60 * 60 * 24 * 365)
+        self.out_headerlist.append(("Set-Cookie", ck))
+        self.reply(b"promoted\n")
+
+    def set_cfg_reset(self):
+        for k in ("k304", "js", "cppwd"):
+            self.out_headerlist.append(("Set-Cookie", gencookie(k, "x", None)))
+
+        self.redirect("", "?h#cc")
+
     def tx_404(self, is_403=False):
         if self.args.vague_403:
             m = '<h1>404 not found ┐( ´ -`)┌</h1><p>or maybe you don\'t have access -- try logging in or <a href="/?h">go home</a></p>'
@@ -1812,13 +1860,16 @@ class HttpCli(object):
         if not self.args.unpost:
             raise Pebkac(400, "the unpost feature is disabled in server config")

+        idx = self.conn.get_u2idx()
+        if not hasattr(idx, "p_end"):
+            raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
+
         filt = self.uparam.get("filter")
         lm = "ups [{}]".format(filt)
         self.log(lm)

         ret = []
         t0 = time.time()
-        idx = self.conn.get_u2idx()
         lim = time.time() - self.args.unpost
         for vol in self.asrv.vfs.all_vols.values():
             cur = idx.get_cur(vol.realpath)
@@ -1912,6 +1963,13 @@ class HttpCli(object):
         fmt = "{{}} {{:{},}} {{}}"
         nfmt = "{:,}"

+        for x in dirs:
+            n = x["name"] + "/"
+            if arg == "v":
+                n = "\033[94m" + n
+
+            x["name"] = n
+
         fmt = fmt.format(len(nfmt.format(biggest)))
         ret = [
             "# {}: {}".format(x, ls[x])
@@ -2050,6 +2108,7 @@ class HttpCli(object):

         url_suf = self.urlq({}, [])
         is_ls = "ls" in self.uparam
+        is_js = self.cookies.get("js") == "y"

         tpl = "browser"
         if "b" in self.uparam:
@@ -2078,6 +2137,7 @@ class HttpCli(object):
             "taglist": [],
             "srvinf": srv_info,
             "acct": self.uname,
+            "idx": ("e2d" in vn.flags),
             "perms": perms,
             "logues": logues,
             "readme": readme,
@@ -2086,6 +2146,7 @@ class HttpCli(object):
             "vdir": quotep(self.vpath),
             "vpnodes": vpnodes,
             "files": [],
+            "ls0": None,
             "acct": self.uname,
             "perms": json.dumps(perms),
             "taglist": [],
@@ -2166,7 +2227,7 @@ class HttpCli(object):
         for fn in vfs_ls:
             base = ""
             href = fn
-            if not is_ls and not self.trailing_slash and vpath:
+            if not is_ls and not is_js and not self.trailing_slash and vpath:
                 base = "/" + vpath + "/"
                 href = base + fn

@@ -2309,7 +2370,12 @@ class HttpCli(object):

         dirs.sort(key=itemgetter("name"))

-        j2a["files"] = dirs + files
+        if is_js:
+            j2a["ls0"] = {"dirs": dirs, "files": files, "taglist": taglist}
+            j2a["files"] = []
+        else:
+            j2a["files"] = dirs + files
+
         j2a["logues"] = logues
         j2a["taglist"] = taglist
         j2a["txt_ext"] = self.args.textfiles.replace(",", " ")
@@ -8,7 +8,7 @@ import shutil
 import subprocess as sp

 from .__init__ import PY2, WINDOWS, unicode
-from .util import fsenc, fsdec, uncyg, REKOBO_LKEY
+from .util import fsenc, fsdec, uncyg, runcmd, REKOBO_LKEY
 from .bos import bos


@@ -73,7 +73,7 @@ class MParser(object):
         raise Exception()


-def ffprobe(abspath):
+def ffprobe(abspath, timeout=10):
     cmd = [
         b"ffprobe",
         b"-hide_banner",
@@ -82,10 +82,8 @@ def ffprobe(abspath):
         b"--",
         fsenc(abspath),
     ]
-    p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
-    r = p.communicate()
-    txt = r[0].decode("utf-8", "replace")
-    return parse_ffprobe(txt)
+    rc = runcmd(cmd, timeout=timeout)
+    return parse_ffprobe(rc[1])


 def parse_ffprobe(txt):
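`ffprobe` now routes through `runcmd(cmd, timeout=...)` from util so a hung probe cannot wedge the tag-scanner thread. A minimal sketch of such a helper built on `subprocess.run`; the real `util.runcmd` is not shown in this diff, so its exact return shape here is an assumption (exit code, stdout, stderr):

```python
import subprocess


def runcmd(cmd, timeout=None):
    # returns (exit-code, stdout, stderr); the child is killed on timeout
    try:
        p = subprocess.run(cmd, capture_output=True, timeout=timeout)
        return (
            p.returncode,
            p.stdout.decode("utf-8", "replace"),
            p.stderr.decode("utf-8", "replace"),
        )
    except subprocess.TimeoutExpired:
        return -1, "", "timeout"
```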
@@ -420,7 +418,8 @@ class MTag(object):

         try:
             md = mutagen.File(fsenc(abspath), easy=True)
-            x = md.info.length
+            if not md.info.length and not md.info.codec:
+                raise Exception()
         except Exception as ex:
             return self.get_ffprobe(abspath) if self.can_ffprobe else {}

@@ -302,6 +302,10 @@ class SvcHub(object):
             print("nailed it", end="")
             ret = self.retcode
         finally:
+            if self.args.wintitle:
+                print("\033]0;\033\\", file=sys.stderr, end="")
+                sys.stderr.flush()
+
             print("\033[0m")
             if self.logf:
                 self.logf.close()
@@ -397,7 +401,6 @@ class SvcHub(object):

     def check_mp_enable(self):
         if self.args.j == 1:
-            self.log("svchub", "multiprocessing disabled by argument -j 1")
             return False

         if mp.cpu_count() <= 1:
@@ -2,9 +2,10 @@
 from __future__ import print_function, unicode_literals

 import re
+import sys
 import socket

-from .__init__ import MACOS, ANYWIN
+from .__init__ import MACOS, ANYWIN, unicode
 from .util import chkcmd


@@ -54,6 +55,8 @@ class TcpSrv(object):
             eps[x] = "external"

         msgs = []
+        title_tab = {}
+        title_vars = [x[1:] for x in self.args.wintitle.split(" ") if x.startswith("$")]
         m = "available @ http://{}:{}/ (\033[33m{}\033[0m)"
         for ip, desc in sorted(eps.items(), key=lambda x: x[1]):
             for port in sorted(self.args.p):
@@ -62,11 +65,39 @@ class TcpSrv(object):

                 msgs.append(m.format(ip, port, desc))

+                if not self.args.wintitle:
+                    continue
+
+                if port in [80, 443]:
+                    ep = ip
+                else:
+                    ep = "{}:{}".format(ip, port)
+
+                hits = []
+                if "pub" in title_vars and "external" in unicode(desc):
+                    hits.append(("pub", ep))
+
+                if "pub" in title_vars or "all" in title_vars:
+                    hits.append(("all", ep))
+
+                for var in title_vars:
+                    if var.startswith("ip-") and ep.startswith(var[3:]):
+                        hits.append((var, ep))
+
+                for tk, tv in hits:
+                    try:
+                        title_tab[tk][tv] = 1
+                    except:
+                        title_tab[tk] = {tv: 1}
+
         if msgs:
             msgs[-1] += "\n"
             for m in msgs:
                 self.log("tcpsrv", m)

+        if self.args.wintitle:
+            self._set_wintitle(title_tab)
+
     def _listen(self, ip, port):
         srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
         srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
@@ -232,3 +263,26 @@ class TcpSrv(object):
                 eps[default_route] = desc

         return eps
+
+    def _set_wintitle(self, vars):
+        vars["all"] = vars.get("all", {"Local-Only": 1})
+        vars["pub"] = vars.get("pub", vars["all"])
+
+        vars2 = {}
+        for k, eps in vars.items():
+            vars2[k] = {
+                ep: 1
+                for ep in eps.keys()
+                if ":" not in ep or ep.split(":")[0] not in eps
+            }
+
+        title = ""
+        vars = vars2
+        for p in self.args.wintitle.split(" "):
+            if p.startswith("$"):
+                p = " and ".join(sorted(vars.get(p[1:], {"(None)": 1}).keys()))
+
+            title += "{} ".format(p)
+
+        print("\033]0;{}\033\\".format(title), file=sys.stderr, end="")
+        sys.stderr.flush()
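Inside `_set_wintitle`, each endpoint set is first deduplicated: an `ip:port` entry is dropped when the bare `ip` is also present (it already covers the default port case). That filtering step in isolation, as a plain function:

```python
def dedup_eps(eps):
    # drop "ip:port" entries whose bare "ip" is also in the set;
    # eps is a dict used as an ordered set, matching the diff
    return {
        ep: 1
        for ep in eps
        if ":" not in ep or ep.split(":")[0] not in eps
    }
```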
@@ -30,7 +30,7 @@ class ThumbCli(object):
         if is_vid and self.args.no_vthumb:
             return None

-        want_opus = fmt == "opus"
+        want_opus = fmt in ("opus", "caf")
         is_au = ext in FMT_FFA
         if is_au:
             if want_opus:
@@ -90,7 +90,7 @@ def thumb_path(histpath, rem, mtime, fmt):
     h = hashlib.sha512(fsenc(fn)).digest()
     fn = base64.urlsafe_b64encode(h).decode("ascii")[:24]

-    if fmt == "opus":
+    if fmt in ("opus", "caf"):
         cat = "ac"
     else:
         fmt = "webp" if fmt == "w" else "jpg"
@@ -216,7 +216,7 @@ class ThumbSrv(object):
         elif ext in FMT_FFV:
             fun = self.conv_ffmpeg
         elif ext in FMT_FFA:
-            if tpath.endswith(".opus"):
+            if tpath.endswith(".opus") or tpath.endswith(".caf"):
                 fun = self.conv_opus
             else:
                 fun = self.conv_spec
@@ -349,7 +349,7 @@ class ThumbSrv(object):

     def _run_ff(self, cmd):
         # self.log((b" ".join(cmd)).decode("utf-8"))
-        ret, sout, serr = runcmd(cmd)
+        ret, sout, serr = runcmd(cmd, timeout=self.args.th_convt)
         if ret != 0:
             m = "FFmpeg failed (probably a corrupt video file):\n"
             m += "\n".join(["ff: {}".format(x) for x in serr.split("\n")])
@@ -406,21 +406,45 @@ class ThumbSrv(object):
         if "ac" not in ret:
             raise Exception("not audio")

-        # fmt: off
-        cmd = [
-            b"ffmpeg",
-            b"-nostdin",
-            b"-v", b"error",
-            b"-hide_banner",
-            b"-i", fsenc(abspath),
-            b"-map", b"0:a:0",
-            b"-c:a", b"libopus",
-            b"-b:a", b"128k",
-            fsenc(tpath)
-        ]
-        # fmt: on
-
-        self._run_ff(cmd)
+        src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus"
+        want_caf = tpath.endswith(".caf")
+        tmp_opus = tpath
+        if want_caf:
+            tmp_opus = tpath.rsplit(".", 1)[0] + ".opus"
+
+        if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)):
+            # fmt: off
+            cmd = [
+                b"ffmpeg",
+                b"-nostdin",
+                b"-v", b"error",
+                b"-hide_banner",
+                b"-i", fsenc(abspath),
+                b"-map_metadata", b"-1",
+                b"-map", b"0:a:0",
+                b"-c:a", b"libopus",
+                b"-b:a", b"128k",
+                fsenc(tmp_opus)
+            ]
+            # fmt: on
+            self._run_ff(cmd)
+
+        if want_caf:
+            # fmt: off
+            cmd = [
+                b"ffmpeg",
+                b"-nostdin",
+                b"-v", b"error",
+                b"-hide_banner",
+                b"-i", fsenc(abspath if src_opus else tmp_opus),
+                b"-map_metadata", b"-1",
+                b"-map", b"0:a:0",
+                b"-c:a", b"copy",
+                b"-f", b"caf",
+                fsenc(tpath)
+            ]
+            # fmt: on
+            self._run_ff(cmd)

     def poke(self, tdir):
         if not self.poke_cd.poke(tdir):
@@ -461,7 +485,7 @@ class ThumbSrv(object):
         thumbpath = os.path.join(histpath, cat)

         # self.log("cln {}".format(thumbpath))
-        exts = ["jpg", "webp"] if cat == "th" else ["opus"]
+        exts = ["jpg", "webp"] if cat == "th" else ["opus", "caf"]
         maxage = getattr(self.args, cat + "_maxage")
         now = time.time()
         prev_b64 = None
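`conv_opus` now runs up to two ffmpeg passes: transcode to opus unless the source already is opus (or a cached `.opus` exists), then stream-copy the opus audio into a CAF container when a `.caf` target is requested. A sketch that only assembles the commands so the branching is visible, with simplified string arguments and no ffmpeg invocation (function name and argument shape are illustrative):

```python
def plan_caf(abspath, tpath, src_is_opus, have_tmp_opus=False):
    # returns the list of ffmpeg commands conv_opus would run
    cmds = []
    want_caf = tpath.endswith(".caf")
    tmp_opus = tpath.rsplit(".", 1)[0] + ".opus" if want_caf else tpath

    if not want_caf or (not src_is_opus and not have_tmp_opus):
        # pass 1: encode the first audio stream to opus
        cmds.append(["ffmpeg", "-i", abspath,
                     "-c:a", "libopus", "-b:a", "128k", tmp_opus])

    if want_caf:
        # pass 2: remux (codec copy) into a caf container for iOS Safari
        src = abspath if src_is_opus else tmp_opus
        cmds.append(["ffmpeg", "-i", src, "-c:a", "copy", "-f", "caf", tpath])

    return cmds
```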
@@ -117,7 +117,16 @@ class U2idx(object):
                 if ok:
                     continue

-                v, uq = (uq + " ").split(" ", 1)
+                if uq.startswith('"'):
+                    v, uq = uq[1:].split('"', 1)
+                    while v.endswith("\\"):
+                        v2, uq = uq.split('"', 1)
+                        v = v[:-1] + '"' + v2
+                    uq = uq.strip()
+                else:
+                    v, uq = (uq + " ").split(" ", 1)
+                    v = v.replace('\\"', '"')

                 if is_key:
                     is_key = False
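The hunk above teaches the search-query tokenizer about double-quoted values with `\"` escapes inside them. The same tokenizer step as a free function, mirroring the logic outside the class:

```python
def next_token(uq):
    # returns (value, rest-of-query); quoted values may contain
    # spaces and backslash-escaped double quotes
    if uq.startswith('"'):
        v, uq = uq[1:].split('"', 1)
        while v.endswith("\\"):
            # the quote we split on was escaped; keep consuming
            v2, uq = uq.split('"', 1)
            v = v[:-1] + '"' + v2
        uq = uq.strip()
    else:
        v, uq = (uq + " ").split(" ", 1)
        v = v.replace('\\"', '"')
    return v, uq
```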
@@ -21,6 +21,7 @@ from .util import (
     Pebkac,
     Queue,
     ProgressPrinter,
+    SYMTIME,
     fsdec,
     fsenc,
     absreal,
@@ -73,6 +74,7 @@ class Up2k(object):
         self.need_rescan = {}
         self.dupesched = {}
         self.registry = {}
+        self.droppable = {}
         self.entags = {}
         self.flags = {}
         self.cur = {}
@@ -125,11 +127,11 @@ class Up2k(object):
         all_vols = self.asrv.vfs.all_vols
         have_e2d = self.init_indexes(all_vols)

-        if have_e2d:
-            thr = threading.Thread(target=self._snapshot, name="up2k-snapshot")
-            thr.daemon = True
-            thr.start()
+        thr = threading.Thread(target=self._snapshot, name="up2k-snapshot")
+        thr.daemon = True
+        thr.start()

+        if have_e2d:
             thr = threading.Thread(target=self._hasher, name="up2k-hasher")
             thr.daemon = True
             thr.start()
@@ -295,7 +297,8 @@ class Up2k(object):
     def _vis_reg_progress(self, reg):
         ret = []
         for _, job in reg.items():
-            ret.append(self._vis_job_progress(job))
+            if job["need"]:
+                ret.append(self._vis_job_progress(job))

         return ret

@@ -483,26 +486,41 @@ class Up2k(object):
             self.log("/{} {}".format(vpath, " ".join(sorted(a))), "35")

         reg = {}
+        drp = None
         path = os.path.join(histpath, "up2k.snap")
-        if "e2d" in flags and bos.path.exists(path):
+        if bos.path.exists(path):
             with gzip.GzipFile(path, "rb") as f:
                 j = f.read().decode("utf-8")

             reg2 = json.loads(j)
+            try:
+                drp = reg2["droppable"]
+                reg2 = reg2["registry"]
+            except:
+                pass
+
             for k, job in reg2.items():
                 path = os.path.join(job["ptop"], job["prel"], job["name"])
                 if bos.path.exists(path):
                     reg[k] = job
                     job["poke"] = time.time()
+                    job["busy"] = {}
                 else:
                     self.log("ign deleted file in snap: [{}]".format(path))

-            m = "loaded snap {} |{}|".format(path, len(reg.keys()))
+            if drp is None:
+                drp = [k for k, v in reg.items() if not v.get("need", [])]
+            else:
+                drp = [x for x in drp if x in reg]
+
+            m = "loaded snap {} |{}| ({})".format(path, len(reg.keys()), len(drp or []))
             m = [m] + self._vis_reg_progress(reg)
             self.log("\n".join(m))

         self.flags[ptop] = flags
         self.registry[ptop] = reg
+        self.droppable[ptop] = drp or []
+        self.regdrop(ptop, None)
         if not HAVE_SQLITE3 or "e2d" not in flags or "d2d" in flags:
             return None
@@ -1256,6 +1274,7 @@ class Up2k(object):
             "at": at,
             "hash": [],
             "need": [],
+            "busy": {},
         }

         if job and wark in reg:
@@ -1289,11 +1308,14 @@ class Up2k(object):
                 err = "partial upload exists at a different location; please resume uploading here instead:\n"
                 err += "/" + quotep(vsrc) + " "

-                dupe = [cj["prel"], cj["name"]]
-                try:
-                    self.dupesched[src].append(dupe)
-                except:
-                    self.dupesched[src] = [dupe]
+                # registry is size-constrained + can only contain one unique wark;
+                # let want_recheck trigger symlink (if still in reg) or reupload
+                if cur:
+                    dupe = [cj["prel"], cj["name"], cj["lmod"]]
+                    try:
+                        self.dupesched[src].append(dupe)
+                    except:
+                        self.dupesched[src] = [dupe]

                 raise Pebkac(400, err)
@@ -1314,7 +1336,7 @@ class Up2k(object):
             dst = os.path.join(job["ptop"], job["prel"], job["name"])
             if not self.args.nw:
                 bos.unlink(dst)  # TODO ed pls
-                self._symlink(src, dst)
+                self._symlink(src, dst, lmod=cj["lmod"])

             if cur:
                 a = [cj[x] for x in "prel name lmod size addr".split()]
@@ -1338,6 +1360,7 @@ class Up2k(object):
                 "t0": now,
                 "hash": deepcopy(cj["hash"]),
                 "need": [],
+                "busy": {},
             }
             # client-provided, sanitized by _get_wark: name, size, lmod
             for k in [
@@ -1385,13 +1408,14 @@ class Up2k(object):
         with ren_open(fname, "wb", fdir=fdir, suffix=suffix) as f:
             return f["orz"][1]

-    def _symlink(self, src, dst, verbose=True):
+    def _symlink(self, src, dst, verbose=True, lmod=None):
         if verbose:
             self.log("linking dupe:\n {0}\n {1}".format(src, dst))

         if self.args.nw:
             return

+        linked = False
         try:
             if self.args.no_symlink:
                 raise Exception("disabled in config")
@@ -1422,10 +1446,18 @@ class Up2k(object):
                 hops = len(ndst[nc:]) - 1
                 lsrc = "../" * hops + "/".join(lsrc)
             os.symlink(fsenc(lsrc), fsenc(ldst))
+            linked = True
         except Exception as ex:
             self.log("cannot symlink; creating copy: " + repr(ex))
             shutil.copy2(fsenc(src), fsenc(dst))

+        if lmod and (not linked or SYMTIME):
+            times = (int(time.time()), int(lmod))
+            if ANYWIN:
+                self.lastmod_q.put([dst, 0, times])
+            else:
+                bos.utime(dst, times, False)
+
     def handle_chunk(self, ptop, wark, chash):
         with self.mutex:
             job = self.registry[ptop].get(wark)
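The `lmod` plumbing above makes symlinked dupes keep the uploader-provided last-modified time; where the platform can stamp the link itself (the new `SYMTIME` constant), `bos.utime(..., False)` writes the times without following the link. A standalone sketch of that idea using plain `os.utime` (the paths and the `lmod` value are made up for illustration, and this is not the actual copyparty code):

```python
import os
import tempfile
import time

# can this platform set timestamps on a symlink itself?
SYMTIME = os.utime in os.supports_follow_symlinks

d = tempfile.mkdtemp()
src = os.path.join(d, "orig.bin")
dst = os.path.join(d, "dupe.bin")
with open(src, "wb") as f:
    f.write(b"data")

# the dupe is a symlink to the first upload of the same content
os.symlink(src, dst)

lmod = 1000000000  # hypothetical client-provided mtime (unix seconds)
if SYMTIME:
    # atime = now, mtime = client value; do not follow the link
    os.utime(dst, (int(time.time()), lmod), follow_symlinks=False)
```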
@@ -1444,6 +1476,14 @@ class Up2k(object):
             if not nchunk:
                 raise Pebkac(400, "unknown chunk")

+            if chash in job["busy"]:
+                nh = len(job["hash"])
+                idx = job["hash"].index(chash)
+                m = "that chunk is already being written to:\n {}\n {} {}/{}\n {}"
+                raise Pebkac(400, m.format(wark, chash, idx, nh, job["name"]))
+
+            job["busy"][chash] = 1
+
             job["poke"] = time.time()

             chunksize = up2k_chunksize(job["size"])
@@ -1453,6 +1493,14 @@ class Up2k(object):

         return [chunksize, ofs, path, job["lmod"]]

+    def release_chunk(self, ptop, wark, chash):
+        with self.mutex:
+            job = self.registry[ptop].get(wark)
+            if job:
+                job["busy"].pop(chash, None)
+
+        return [True]
+
     def confirm_chunk(self, ptop, wark, chash):
         with self.mutex:
             try:
@@ -1463,6 +1511,8 @@ class Up2k(object):
             except Exception as ex:
                 return "confirm_chunk, wark, " + repr(ex)

+            job["busy"].pop(chash, None)
+
             try:
                 job["need"].remove(chash)
             except Exception as ex:
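Taken together, these hunks give each upload a `busy` dict so any chunk hash has at most one writer at a time: `handle_chunk` rejects a chash that is already busy, and `release_chunk` / `confirm_chunk` clear it again (the pop uses a default so a double-release is harmless). A minimal standalone sketch of the same guard; the class and method names are hypothetical, not copyparty's:

```python
class ChunkGuard(object):
    """reject concurrent writers for the same chunk hash"""

    def __init__(self):
        self.busy = {}

    def acquire(self, chash):
        if chash in self.busy:
            # second uploader tried to PUT the same chunk
            raise ValueError("chunk is already being written to: " + chash)
        self.busy[chash] = 1

    def release(self, chash):
        # pop with a default so releasing twice is a no-op
        self.busy.pop(chash, None)


g = ChunkGuard()
g.acquire("c1")  # first writer wins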
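```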
@@ -1473,7 +1523,7 @@ class Up2k(object):
             return ret, src

         if self.args.nw:
-            # del self.registry[ptop][wark]
+            self.regdrop(ptop, wark)
             return ret, dst

         # windows cant rename open files
@@ -1505,21 +1555,21 @@ class Up2k(object):
         a = [job[x] for x in "ptop wark prel name lmod size addr".split()]
         a += [job.get("at") or time.time()]
         if self.idx_wark(*a):
-            # self.log("pop " + wark + " " + dst + " finish_upload idx_wark", 4)
             del self.registry[ptop][wark]
-            # in-memory registry is reserved for unfinished uploads
+        else:
+            self.regdrop(ptop, wark)

         dupes = self.dupesched.pop(dst, [])
         if not dupes:
             return

         cur = self.cur.get(ptop)
-        for rd, fn in dupes:
+        for rd, fn, lmod in dupes:
             d2 = os.path.join(ptop, rd, fn)
             if os.path.exists(d2):
                 continue

-            self._symlink(dst, d2)
+            self._symlink(dst, d2, lmod=lmod)
             if cur:
                 self.db_rm(cur, rd, fn)
                 self.db_add(cur, wark, rd, fn, *a[-4:])
@@ -1527,6 +1577,21 @@ class Up2k(object):
         if cur:
             cur.connection.commit()

+    def regdrop(self, ptop, wark):
+        t = self.droppable[ptop]
+        if wark:
+            t.append(wark)
+
+        if len(t) <= self.args.reg_cap:
+            return
+
+        n = len(t) - int(self.args.reg_cap / 2)
+        m = "up2k-registry [{}] has {} droppables; discarding {}"
+        self.log(m.format(ptop, len(t), n))
+        for k in t[:n]:
+            self.registry[ptop].pop(k, None)
+        self.droppable[ptop] = t[n:]
+
     def idx_wark(self, ptop, wark, rd, fn, lmod, sz, ip, at):
         cur = self.cur.get(ptop)
         if not cur:
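The new `regdrop` helper keeps the in-memory registry bounded: finished ("droppable") uploads accumulate until `reg_cap` is exceeded, then the oldest entries are discarded down to half the cap. A standalone sketch of that eviction policy (function and argument names are hypothetical; only the halving rule is taken from the hunk above):

```python
def drop_finished(registry, droppable, reg_cap):
    """evict the oldest finished uploads once the cap is exceeded"""
    if len(droppable) <= reg_cap:
        return registry, droppable

    # same policy as regdrop: shrink the backlog down to half the cap
    n = len(droppable) - reg_cap // 2
    for wark in droppable[:n]:
        registry.pop(wark, None)

    return registry, droppable[n:]


# six finished uploads against a cap of 4: the four oldest get dropped
reg = {w: {} for w in "abcdef"}
reg, drp = drop_finished(reg, list("abcdef"), 4)
```

Evicting down to half the cap (rather than to the cap exactly) means the eviction pass runs rarely instead of on every new finished upload.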
@@ -1721,8 +1786,9 @@ class Up2k(object):
             dlabs = absreal(sabs)
             m = "moving symlink from [{}] to [{}], target [{}]"
             self.log(m.format(sabs, dabs, dlabs))
-            os.unlink(sabs)
-            self._symlink(dlabs, dabs, False)
+            mt = bos.path.getmtime(sabs, False)
+            bos.unlink(sabs)
+            self._symlink(dlabs, dabs, False, lmod=mt)

             # folders are too scary, schedule rescan of both vols
             self.need_rescan[svn.vpath] = 1
@@ -1852,25 +1918,30 @@ class Up2k(object):
                 slabs = list(sorted(links.keys()))[0]
                 ptop, rem = links.pop(slabs)
                 self.log("linkswap [{}] and [{}]".format(sabs, slabs))
+                mt = bos.path.getmtime(slabs, False)
                 bos.unlink(slabs)
                 bos.rename(sabs, slabs)
+                bos.utime(slabs, (int(time.time()), int(mt)), False)
                 self._symlink(slabs, sabs, False)
                 full[slabs] = [ptop, rem]
+                sabs = slabs

             if not dabs:
                 dabs = list(sorted(full.keys()))[0]

             for alink in links.keys():
+                lmod = None
                 try:
                     if alink != sabs and absreal(alink) != sabs:
                         continue

                     self.log("relinking [{}] to [{}]".format(alink, dabs))
+                    lmod = bos.path.getmtime(alink, False)
                     bos.unlink(alink)
                 except:
                     pass

-                self._symlink(dabs, alink, False)
+                self._symlink(dabs, alink, False, lmod=lmod)

             return len(full) + len(links)
@@ -1976,7 +2047,7 @@ class Up2k(object):
             for path, sz, times in ready:
                 self.log("lmod: setting times {} on {}".format(times, path))
                 try:
-                    bos.utime(path, times)
+                    bos.utime(path, times, False)
                 except:
                     self.log("lmod: failed to utime ({}, {})".format(path, times))
@@ -2042,7 +2113,8 @@ class Up2k(object):
                 bos.makedirs(histpath)

             path2 = "{}.{}".format(path, os.getpid())
-            j = json.dumps(reg, indent=2, sort_keys=True).encode("utf-8")
+            body = {"droppable": self.droppable[ptop], "registry": reg}
+            j = json.dumps(body, indent=2, sort_keys=True).encode("utf-8")
             with gzip.GzipFile(path2, "wb") as f:
                 f.write(j)

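The snapshot writer now wraps the registry in a `{"droppable": ..., "registry": ...}` object, and the loader earlier in this diff falls back gracefully when reading a pre-wrapper snapshot. A minimal sketch of the format as a gzip+json roundtrip (the file contents here are invented for illustration):

```python
import gzip
import json
import os
import tempfile

# one gzipped json object holding both the droppable warks and the registry
snap = {"droppable": ["w1"], "registry": {"w1": {"name": "a.bin", "need": []}}}

path = os.path.join(tempfile.mkdtemp(), "up2k.snap")
with gzip.GzipFile(path, "wb") as f:
    f.write(json.dumps(snap, indent=2, sort_keys=True).encode("utf-8"))

# reading mirrors the loader: unwrap if the new keys exist,
# otherwise treat the whole object as a bare registry
with gzip.GzipFile(path, "rb") as f:
    reg2 = json.loads(f.read().decode("utf-8"))

try:
    drp = reg2["droppable"]
    reg2 = reg2["registry"]
except (KeyError, TypeError):
    drp = None
```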
@@ -67,8 +67,9 @@ if WINDOWS and PY2:
     FS_ENCODING = "utf-8"


+SYMTIME = sys.version_info >= (3, 6) and os.supports_follow_symlinks
+
 HTTP_TS_FMT = "%a, %d %b %Y %H:%M:%S GMT"

 HTTPCODE = {
     200: "OK",
@@ -104,6 +105,7 @@ MIMES = {
     "txt": "text/plain",
     "js": "text/javascript",
     "opus": "audio/ogg; codecs=opus",
+    "caf": "audio/x-caf",
     "mp3": "audio/mpeg",
     "m4a": "audio/mp4",
     "jpg": "image/jpeg",
@@ -821,6 +823,17 @@ def gen_filekey(salt, fspath, fsize, inode):
     ).decode("ascii")


+def gencookie(k, v, dur):
+    v = v.replace(";", "")
+    if dur:
+        dt = datetime.utcfromtimestamp(time.time() + dur)
+        exp = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
+    else:
+        exp = "Fri, 15 Aug 1997 01:00:00 GMT"
+
+    return "{}={}; Path=/; Expires={}; SameSite=Lax".format(k, v, exp)
+
+
 def humansize(sz, terse=False):
     for unit in ["B", "KiB", "MiB", "GiB", "TiB"]:
         if sz < 1024:
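The new `gencookie` helper builds a Set-Cookie value whose lifetime is controlled by `dur`: a positive duration becomes an HTTP-date `Expires` attribute, while no duration yields a date in the past so the browser deletes the cookie; semicolons are stripped from the value so it cannot smuggle in extra cookie attributes. A self-contained copy for experimentation (the strftime pattern matches the `HTTP_TS_FMT` constant above):

```python
import time
from datetime import datetime


def gencookie(k, v, dur):
    # strip ";" so the value cannot inject additional cookie attributes
    v = v.replace(";", "")
    if dur:
        dt = datetime.utcfromtimestamp(time.time() + dur)
        exp = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
    else:
        # a date in the past makes the browser discard the cookie
        exp = "Fri, 15 Aug 1997 01:00:00 GMT"

    return "{}={}; Path=/; Expires={}; SameSite=Lax".format(k, v, exp)


# attempted attribute injection in the value is neutralized
c = gencookie("cppwd", "hunter2;HttpOnly", None)
```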
@@ -1164,7 +1177,7 @@ def hashcopy(fin, fout):
     return tlen, hashobj.hexdigest(), digest_b64


-def sendfile_py(lower, upper, f, s, bufsz, slp):
+def sendfile_py(log, lower, upper, f, s, bufsz, slp):
     remains = upper - lower
     f.seek(lower)
     while remains > 0:
@@ -1184,17 +1197,24 @@ def sendfile_py(lower, upper, f, s, bufsz, slp):
     return 0


-def sendfile_kern(lower, upper, f, s, bufsz, slp):
+def sendfile_kern(log, lower, upper, f, s, bufsz, slp):
     out_fd = s.fileno()
     in_fd = f.fileno()
     ofs = lower
+    stuck = None
     while ofs < upper:
+        stuck = stuck or time.time()
         try:
             req = min(2 ** 30, upper - ofs)
             select.select([], [out_fd], [], 10)
             n = os.sendfile(out_fd, in_fd, ofs, req)
+            stuck = None
         except Exception as ex:
-            # print("sendfile: " + repr(ex))
+            d = time.time() - stuck
+            log("sendfile stuck for {:.3f} sec: {!r}".format(d, ex))
+            if d < 3600 and ex.errno == 11:  # eagain
+                continue
+
             n = 0

         if n <= 0:
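`sendfile_kern` now tracks how long a transfer has been stalled: EAGAIN (errno 11) from `os.sendfile` is retried for up to an hour, while any other error (or a stall longer than that) aborts the transfer. A tiny sketch of just the retry predicate; the function name and signature are hypothetical:

```python
import errno


def should_retry(ex, stuck_since, now, limit=3600):
    """retry only transient EAGAIN stalls, and only below the time limit"""
    d = now - stuck_since
    return d < limit and getattr(ex, "errno", None) == errno.EAGAIN


# EAGAIN shortly after the stall began: keep trying
e = OSError(errno.EAGAIN, "resource temporarily unavailable")
```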
@@ -1312,9 +1332,17 @@ def guess_mime(url, fallback="application/octet-stream"):
    return ret


-def runcmd(argv):
+def runcmd(argv, timeout=None):
     p = sp.Popen(argv, stdout=sp.PIPE, stderr=sp.PIPE)
-    stdout, stderr = p.communicate()
+    if not timeout or PY2:
+        stdout, stderr = p.communicate()
+    else:
+        try:
+            stdout, stderr = p.communicate(timeout=timeout)
+        except sp.TimeoutExpired:
+            p.kill()
+            stdout, stderr = p.communicate()
+
     stdout = stdout.decode("utf-8", "replace")
     stderr = stderr.decode("utf-8", "replace")
     return [p.returncode, stdout, stderr]

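`runcmd` gains an optional `timeout`: on py3, `communicate(timeout=...)` raises `subprocess.TimeoutExpired`, after which the child is killed and `communicate()` is called once more to reap it and collect whatever output it produced. A runnable sketch of the py3 path, dropping the PY2 branch (the demo command using `sys.executable` is my own, not from the diff):

```python
import subprocess as sp
import sys


def runcmd(argv, timeout=None):
    p = sp.Popen(argv, stdout=sp.PIPE, stderr=sp.PIPE)
    if not timeout:
        stdout, stderr = p.communicate()
    else:
        try:
            stdout, stderr = p.communicate(timeout=timeout)
        except sp.TimeoutExpired:
            # kill the child, then reap it (and its output) with a
            # second communicate() so no zombie is left behind
            p.kill()
            stdout, stderr = p.communicate()

    return [
        p.returncode,
        stdout.decode("utf-8", "replace"),
        stderr.decode("utf-8", "replace"),
    ]


rc, so, se = runcmd([sys.executable, "-c", "print('ok')"])
```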
@@ -44,8 +44,9 @@ pre, code, tt, #doc, #doc>code {
     margin-left: -.7em;
 }
 #files {
-    border-spacing: 0;
     z-index: 1;
+    top: -.3em;
+    border-spacing: 0;
     position: relative;
 }
 #files tbody a {
@@ -72,17 +73,38 @@ a, #files tbody div a:last-child {
 }
 #files thead {
     position: sticky;
-    top: 0;
+    top: -1px;
 }
 #files thead a {
     color: #999;
     font-weight: normal;
 }
+.s0:after,
+.s1:after {
+    content: '⌄';
+    margin-left: -.1em;
+}
+.s0r:after,
+.s1r:after {
+    content: '⌃';
+    margin-left: -.1em;
+}
+.s0:after,
+.s0r:after {
+    color: #fb0;
+}
+.s1:after,
+.s1r:after {
+    color: #d09;
+}
+#files thead th:after {
+    margin-right: -.7em;
+}
 #files tbody tr:hover td {
     background: #1c1c1c;
 }
 #files thead th {
-    padding: 0 .3em .3em .3em;
+    padding: .3em;
     border-bottom: 1px solid #444;
     cursor: pointer;
 }
@@ -239,6 +261,8 @@ html.light #ggrid>a[tt].sel {
 #files tbody tr.sel:hover td,
 #files tbody tr.sel:focus td,
 #ggrid>a.sel:hover,
+#ggrid>a.sel:focus,
+html.light #ggrid>a.sel:focus,
 html.light #ggrid>a.sel:hover {
     color: #fff;
     background: #d39;
@@ -294,6 +318,8 @@ html.light #ggrid>a.sel {
     width: 100%;
     z-index: 3;
     touch-action: none;
+}
+#widget.anim {
     transition: bottom 0.15s;
 }
 #widget.open {
@@ -461,7 +487,7 @@ html.light #wfm a:not(.en) {
     width: calc(100% - 10.5em);
     background: rgba(0,0,0,0.2);
 }
-@media (min-width: 80em) {
+@media (min-width: 70em) {
     #barpos,
     #barbuf {
         width: calc(100% - 21em);
@@ -653,7 +679,7 @@ input.eq_gain {
 #wrap {
     margin: 1.8em 1.5em 0 1.5em;
     min-height: 70vh;
-    padding-bottom: 5em;
+    padding-bottom: 7em;
 }
 #tree {
     display: none;
@@ -876,13 +902,15 @@ html.light #tree.nowrap .ntree a+a:hover {
 }
 #thumbs,
 #au_fullpre,
+#au_os_seek,
 #au_osd_cv,
 #u2tdate {
     opacity: .3;
 }
 #griden.on+#thumbs,
 #au_preload.on+#au_fullpre,
-#au_os_ctl.on+#au_osd_cv,
+#au_os_ctl.on+#au_os_seek,
+#au_os_ctl.on+#au_os_seek+#au_osd_cv,
 #u2turbo.on+#u2tdate {
     opacity: 1;
 }
@@ -945,6 +973,12 @@ html.light .ghead {
 #ggrid>a.dir:before {
     content: '📂';
 }
+#ggrid>a.au:before {
+    content: '💾';
+}
+html.np_open #ggrid>a.au:before {
+    content: '▶';
+}
 #ggrid>a:before {
     display: block;
     position: absolute;
@@ -954,6 +988,12 @@ html.light .ghead {
     background: linear-gradient(135deg,rgba(255,255,255,0) 50%,rgba(255,255,255,0.2));
     border-radius: .3em;
     font-size: 2em;
+    transition: font-size .15s, margin .15s;
+}
+#ggrid>a:focus:before,
+#ggrid>a:hover:before {
+    font-size: 2.5em;
+    margin: -.2em;
 }
 #op_unpost {
     padding: 1em;
@@ -1023,7 +1063,6 @@ html.light #rui {
     font-size: 1.5em;
 }
 #doc {
-    background: none;
     overflow: visible;
     margin: -1em 0 .5em 0;
     padding: 1em 0 1em 0;
@@ -1062,7 +1101,7 @@ html.light #doc .line-highlight {
 #docul li {
     margin: 0;
 }
-#tree #docul a {
+#tree #docul li+li a {
     display: block;
 }
 #seldoc.sel {
@@ -1111,6 +1150,7 @@ a.btn,


 html,
+#doc,
 #rui,
 #files td,
 #files thead th,
@@ -1158,6 +1198,7 @@ html,
 #ggrid>a[tt] {
     background: linear-gradient(135deg, #2c2c2c 95%, #444 95%);
 }
+#ggrid>a:focus,
 #ggrid>a:hover {
     background: #383838;
     border-color: #555;
@@ -1171,6 +1212,7 @@ html.light #ggrid>a {
 html.light #ggrid>a[tt] {
     background: linear-gradient(135deg, #f7f7f7 95%, #ccc 95%);
 }
+html.light #ggrid>a:focus,
 html.light #ggrid>a:hover {
     background: #fff;
     border-color: #ccc;
@@ -1190,6 +1232,7 @@ html.light {
 html.light #ops,
 html.light .opbox,
 html.light #path,
+html.light #doc,
 html.light #srch_form,
 html.light .ghead,
 html.light #u2etas {
@@ -1267,6 +1310,14 @@ html.light #ops a,
 html.light #files tbody div a:last-child {
     color: #06a;
 }
+html.light .s0:after,
+html.light .s0r:after {
+    color: #059;
+}
+html.light .s1:after,
+html.light .s1r:after {
+    color: #f5d;
+}
 html.light #files thead th {
     background: #eaeaea;
     border-color: #ccc;
@@ -1373,6 +1424,7 @@ html.light .opview input[type="text"] {
     border-color: #38d;
 }
 html.light #u2tab a>span,
+html.light #docul .bn a>span,
 html.light #files td div span {
     color: #000;
 }
@@ -1663,8 +1715,6 @@ html.light #bbox-overlay figcaption a {

 #op_up2k {
     padding: 0 1em 1em 1em;
-    min-height: 0;
-    transition: min-height .2s;
 }
 #drops {
     display: none;
@@ -1829,13 +1879,18 @@ html.light #u2err.err {
 #u2notbtn * {
     line-height: 1.3em;
 }
+#u2tabw {
+    min-height: 0;
+    transition: min-height .2s;
+    margin: 3em 0;
+}
 #u2tab {
     border-collapse: collapse;
-    margin: 3em auto;
     width: calc(100% - 2em);
     max-width: 100em;
+    margin: 0 auto;
 }
-#op_up2k.srch #u2tab {
+#op_up2k.srch #u2tabf {
     max-width: none;
 }
 #u2tab td {
@@ -2095,6 +2150,7 @@ html.light #u2foot .warn span {
     border-color: #d06;
 }
 #u2tab a>span,
+#docul .bn a>span,
 #unpost a>span {
     font-weight: bold;
     font-style: italic;
@@ -143,7 +143,8 @@
 		have_zip = {{ have_zip|tojson }},
 		txt_ext = "{{ txt_ext }}",
 		{% if no_prism %}no_prism = 1,{% endif %}
-		readme = {{ readme|tojson }};
+		readme = {{ readme|tojson }},
+		ls0 = {{ ls0|tojson }};

 	document.documentElement.setAttribute("class", localStorage.lightmode == 1 ? "light" : "dark");
 </script>
File diff suppressed because it is too large
@@ -10,7 +10,7 @@
 {%- endif %}
 </head>
 <body>
-	<div id="mn">navbar</div>
+	<div id="mn"></div>
 	<div id="mh">
 		<a id="lightswitch" href="#">go dark</a>
 		<a id="navtoggle" href="#">hide nav</a>
@@ -39,20 +39,14 @@ var md_plug = {};

 // add navbar
 (function () {
-    var n = document.location + '';
-    n = n.substr(n.indexOf('//') + 2).split('?')[0].split('/');
-    n[0] = 'top';
-    var loc = [];
-    var nav = [];
-    for (var a = 0; a < n.length; a++) {
-        if (a > 0)
-            loc.push(n[a]);
-
-        var dec = esc(uricom_dec(n[a])[0]);
-
-        nav.push('<a href="/' + loc.join('/') + '">' + dec + '</a>');
+    var parts = get_evpath().split('/'), link = '', o;
+    for (var a = 0, aa = parts.length - 2; a <= aa; a++) {
+        link += parts[a] + (a < aa ? '/' : '');
+        o = mknod('a');
+        o.setAttribute('href', link);
+        o.textContent = uricom_dec(parts[a])[0] || 'top';
+        dom_nav.appendChild(o);
     }
-    dom_nav.innerHTML = nav.join('');
 })();
@@ -256,7 +250,7 @@ function convert_markdown(md_text, dest_dom) {
     Object.assign(marked_opts, ext[0]);

     try {
-        var md_html = marked(md_text, marked_opts);
+        var md_html = marked.parse(md_text, marked_opts);
     }
     catch (ex) {
         if (ext)
@@ -3,7 +3,7 @@

 <head>
 	<meta charset="utf-8">
-	<title>copyparty</title>
+	<title>{{ svcname }}</title>
 	<meta http-equiv="X-UA-Compatible" content="IE=edge">
 	<meta name="viewport" content="width=device-width, initial-scale=0.8">
 	<link rel="stylesheet" media="screen" href="/.cpr/msg.css?_={{ ts }}">
@@ -38,7 +38,8 @@ a+a {
     margin: -.2em 0 0 .5em;
 }
 .logout,
-.btns a {
+.btns a,
+a.r {
     color: #c04;
     border-color: #c7a;
 }
|
|||||||
margin-top: .3em;
|
margin-top: .3em;
|
||||||
text-align: right;
|
text-align: right;
|
||||||
}
|
}
|
||||||
|
blockquote {
|
||||||
|
margin: 0 0 1.6em .6em;
|
||||||
|
padding: .7em 1em 0 1em;
|
||||||
|
border-left: .3em solid rgba(128,128,128,0.5);
|
||||||
|
border-radius: 0 0 0 .25em;
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
html.dark,
|
html.dark,
|
||||||
@@ -96,7 +103,8 @@ html.dark a {
|
|||||||
border-color: #37a;
|
border-color: #37a;
|
||||||
}
|
}
|
||||||
html.dark .logout,
|
html.dark .logout,
|
||||||
html.dark .btns a {
|
html.dark .btns a,
|
||||||
|
html.dark a.r {
|
||||||
background: #804;
|
background: #804;
|
||||||
border-color: #c28;
|
border-color: #c28;
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -3,7 +3,7 @@

 <head>
 	<meta charset="utf-8">
-	<title>copyparty</title>
+	<title>{{ svcname }}</title>
 	<meta http-equiv="X-UA-Compatible" content="IE=edge">
 	<meta name="viewport" content="width=device-width, initial-scale=0.8">
 	<link rel="stylesheet" media="screen" href="/.cpr/splash.css?_={{ ts }}">
|
|||||||
</ul>
|
</ul>
|
||||||
{%- endif %}
|
{%- endif %}
|
||||||
|
|
||||||
|
<h1 id="cc">client config:</h1>
|
||||||
|
<ul>
|
||||||
|
{% if k304 %}
|
||||||
|
<li><a href="/?k304=n">disable k304</a> (currently enabled)
|
||||||
|
{%- else %}
|
||||||
|
<li><a href="/?k304=y" class="r">enable k304</a> (currently disabled)
|
||||||
|
{% endif %}
|
||||||
|
<blockquote>enabling this will disconnect your client on every HTTP 304, which can prevent some buggy browsers/proxies from getting stuck (suddenly not being able to load pages), <em>but</em> it will also make things slower in general</blockquote></li>
|
||||||
|
|
||||||
|
<li><a href="/?reset" class="r" onclick="localStorage.clear();return true">reset client settings</a></li>
|
||||||
|
</ul>
|
||||||
|
|
||||||
<h1>login for more:</h1>
|
<h1>login for more:</h1>
|
||||||
<ul>
|
<ul>
|
||||||
<form method="post" enctype="multipart/form-data" action="/{{ qvpath }}">
|
<form method="post" enctype="multipart/form-data" action="/{{ qvpath }}">
|
||||||
@@ -84,8 +96,7 @@
|
|||||||
<a href="#" id="repl">π</a>
|
<a href="#" id="repl">π</a>
|
||||||
<script>
|
<script>
|
||||||
|
|
||||||
if (localStorage.lightmode != 1)
|
document.documentElement.setAttribute("class", localStorage.lightmode == 1 ? "light" : "dark");
|
||||||
document.documentElement.setAttribute("class", "dark");
|
|
||||||
|
|
||||||
</script>
|
</script>
|
||||||
<script src="/.cpr/util.js?_={{ ts }}"></script>
|
<script src="/.cpr/util.js?_={{ ts }}"></script>
|
||||||
|
@@ -116,6 +116,20 @@ html {
 #toast.err #toastc {
     background: #d06;
 }
+#tth {
+    color: #fff;
+    background: #111;
+    font-size: .9em;
+    padding: 0 .26em;
+    line-height: .97em;
+    border-radius: 1em;
+    position: absolute;
+    display: none;
+}
+#tth.act {
+    display: block;
+    z-index: 9001;
+}
 #tt.b {
     padding: 0 2em;
     border-radius: .5em;

@@ -159,6 +173,10 @@ html.light #tt code {
 html.light #tt em {
     color: #d38;
 }
+html.light #tth {
+    color: #000;
+    background: #fff;
+}
 #modal {
     position: fixed;
     overflow: auto;

@@ -336,6 +354,13 @@ html.light textarea:focus {
 }
 .mdo ul,
 .mdo ol {
+    padding-left: 1em;
+}
+.mdo ul ul,
+.mdo ul ol,
+.mdo ol ul,
+.mdo ol ol {
+    padding-left: 2em;
     border-left: .3em solid #ddd;
 }
 .mdo ul>li,
@@ -525,13 +525,15 @@ function Donut(uc, st) {
     }
 
     r.on = function (ya) {
-        r.fc = 99;
+        r.fc = r.tc = 99;
         r.eta = null;
         r.base = pos();
         optab.innerHTML = ya ? svg() : optab.getAttribute('ico');
         el = QS('#ops a .donut');
-        if (!ya)
+        if (!ya) {
             favico.upd();
+            wintitle();
+        }
     };
     r.do = function () {
         if (!el)

@@ -541,6 +543,11 @@ function Donut(uc, st) {
             v = pos() - r.base,
             ofs = el.style.strokeDashoffset = o - o * v / t;
 
+        if (++r.tc >= 10) {
+            wintitle(f2f(v * 100 / t, 1) + '%, ' + r.eta + 's, ', true);
+            r.tc = 0;
+        }
+
         if (favico.txt) {
             if (++r.fc < 10 && r.eta && r.eta > 99)
                 return;

@@ -728,7 +735,6 @@ function up2k_init(subtle) {
         if (++nenters <= 0)
             nenters = 1;
 
-        //console.log(nenters, Date.now(), 'enter', this, e.target);
         if (onover.bind(this)(e))
             return true;
 

@@ -750,12 +756,19 @@ function up2k_init(subtle) {
         ebi('up_dz').setAttribute('err', mup || '');
         ebi('srch_dz').setAttribute('err', msr || '');
     }
+    function onoverb(e) {
+        // zones are alive; disable cuo2duo branch
+        document.body.ondragover = document.body.ondrop = null;
+        return onover.bind(this)(e);
+    }
     function onover(e) {
         try {
             var ok = false, dt = e.dataTransfer.types;
             for (var a = 0; a < dt.length; a++)
                 if (dt[a] == 'Files')
                     ok = true;
+                else if (dt[a] == 'text/uri-list')
+                    return true;
 
             if (!ok)
                 return true;

@@ -781,17 +794,20 @@ function up2k_init(subtle) {
             clmod(ebi('drops'), 'vis');
             clmod(ebi('up_dz'), 'hl');
             clmod(ebi('srch_dz'), 'hl');
+            // cuo2duo:
+            document.body.ondragover = onover;
+            document.body.ondrop = gotfile;
         }
-
-        //console.log(nenters, Date.now(), 'leave', this, e && e.target);
     }
     document.body.ondragenter = ondrag;
     document.body.ondragleave = offdrag;
+    document.body.ondragover = onover;
+    document.body.ondrop = gotfile;
 
     var drops = [ebi('up_dz'), ebi('srch_dz')];
     for (var a = 0; a < 2; a++) {
         drops[a].ondragenter = ondrag;
-        drops[a].ondragover = onover;
+        drops[a].ondragover = onoverb;
         drops[a].ondragleave = offdrag;
         drops[a].ondrop = gotfile;
     }

@@ -801,7 +817,10 @@ function up2k_init(subtle) {
         ev(e);
         nenters = 0;
         offdrag.bind(this)();
-        var dz = (this && this.getAttribute('id'));
+        var dz = this && this.getAttribute('id');
+        if (!dz && e && e.clientY)
+            // cuo2duo fallback
+            dz = e.clientY < window.innerHeight / 2 ? 'up_dz' : 'srch_dz';
 
         var err = this.getAttribute('err');
         if (err)

@@ -1069,7 +1088,7 @@ function up2k_init(subtle) {
     }
     more_one_file();
 
-    var etaref = 0, etaskip = 0, op_minh = 0;
+    var etaref = 0, etaskip = 0, utw_minh = 0;
     function etafun() {
         var nhash = st.busy.head.length + st.busy.hash.length + st.todo.head.length + st.todo.hash.length,
             nsend = st.busy.upload.length + st.todo.upload.length,

@@ -1082,13 +1101,10 @@ function up2k_init(subtle) {
 
         //ebi('acc_info').innerHTML = humantime(st.time.busy) + ' ' + f2f(now / 1000, 1);
 
-        var op = ebi('op_up2k'),
-            uff = ebi('u2footfoot'),
-            minh = QS('#op_up2k.act') ? Math.max(op_minh, uff.offsetTop + uff.offsetHeight - op.offsetTop + 32) : 0;
-
-        if (minh > op_minh || !op_minh) {
-            op_minh = minh;
-            op.style.minHeight = op_minh + 'px';
+        var minh = QS('#op_up2k.act') && st.is_busy ? Math.max(utw_minh, ebi('u2tab').offsetHeight + 32) : 0;
+        if (utw_minh < minh || !utw_minh) {
+            utw_minh = minh;
+            ebi('u2tabw').style.minHeight = utw_minh + 'px';
         }
 
         if (!nhash)

@@ -1211,15 +1227,16 @@ function up2k_init(subtle) {
         running = true;
         while (true) {
             var now = Date.now(),
-                is_busy = 0 !=
-                    st.todo.head.length +
-                    st.todo.hash.length +
-                    st.todo.handshake.length +
-                    st.todo.upload.length +
-                    st.busy.head.length +
-                    st.busy.hash.length +
-                    st.busy.handshake.length +
-                    st.busy.upload.length;
+                oldest_active = Math.min( // gzip take the wheel
+                    st.todo.head.length ? st.todo.head[0].n : st.files.length,
+                    st.todo.hash.length ? st.todo.hash[0].n : st.files.length,
+                    st.todo.upload.length ? st.todo.upload[0].nfile : st.files.length,
+                    st.todo.handshake.length ? st.todo.handshake[0].n : st.files.length,
+                    st.busy.head.length ? st.busy.head[0].n : st.files.length,
+                    st.busy.hash.length ? st.busy.hash[0].n : st.files.length,
+                    st.busy.upload.length ? st.busy.upload[0].nfile : st.files.length,
+                    st.busy.handshake.length ? st.busy.handshake[0].n : st.files.length),
+                is_busy = oldest_active < st.files.length;
 
             if (was_busy && !is_busy) {
                 for (var a = 0; a < st.files.length; a++) {

@@ -1239,7 +1256,7 @@ function up2k_init(subtle) {
             }
 
             if (was_busy != is_busy) {
-                was_busy = is_busy;
+                st.is_busy = was_busy = is_busy;
 
                 window[(is_busy ? "add" : "remove") +
                     "EventListener"]("beforeunload", warn_uploader_busy);

@@ -1268,7 +1285,7 @@ function up2k_init(subtle) {
 
                 timer.rm(etafun);
                 timer.rm(donut.do);
-                op_minh = 0;
+                utw_minh = 0;
             }
             else {
                 timer.add(donut.do);

@@ -1320,7 +1337,8 @@ function up2k_init(subtle) {
             }
 
             if (st.todo.head.length &&
-                st.busy.head.length < parallel_uploads) {
+                st.busy.head.length < parallel_uploads &&
+                (!is_busy || st.todo.head[0].n - oldest_active < parallel_uploads * 2)) {
                 exec_head();
                 mou_ikkai = true;
             }

@@ -1467,7 +1485,8 @@ function up2k_init(subtle) {
                 err.indexOf('NotFoundError') !== -1 // macos-firefox permissions
             ) {
                 pvis.seth(t.n, 1, 'OS-error');
-                pvis.seth(t.n, 2, err);
+                pvis.seth(t.n, 2, err + ' @ ' + car);
+                console.log('OS-error', reader.error, '@', car);
                 handled = true;
             }
 

@@ -1843,7 +1862,8 @@ function up2k_init(subtle) {
                 st.bytes.uploaded += cdr - car;
                 t.bytes_uploaded += cdr - car;
             }
-            else if (txt.indexOf('already got that') !== -1) {
+            else if (txt.indexOf('already got that') + 1 ||
+                txt.indexOf('already being written') + 1) {
                 console.log("ignoring dupe-segment error", t);
             }
             else {

@@ -1851,6 +1871,9 @@ function up2k_init(subtle) {
                     xhr.status, t.name) + (txt || "no further information"));
                 return;
             }
+            orz2(xhr);
+        }
+        function orz2(xhr) {
             apop(st.busy.upload, upt);
             apop(t.postlist, npart);
             if (!t.postlist.length) {

@@ -1872,9 +1895,11 @@ function up2k_init(subtle) {
             if (crashed)
                 return;
 
-            toast.err(9.98, "failed to upload a chunk,\n" + tries + " retries so far -- retrying in 10sec\n\n" + t.name);
+            if (!toast.visible)
+                toast.warn(9.98, "failed to upload a chunk;\nprobably harmless, continuing\n\n" + t.name);
+
             console.log('chunkpit onerror,', ++tries, t);
-            setTimeout(do_send, 10 * 1000);
+            orz2(xhr);
         };
         xhr.open('POST', t.purl, true);
         xhr.setRequestHeader("X-Up2k-Hash", t.hash[npart]);

@@ -2014,7 +2039,7 @@ function up2k_init(subtle) {
             new_state = true;
             fixed = true;
         }
-        if (!has(perms, 'read')) {
+        if (!has(perms, 'read') || !have_up2k_idx) {
            new_state = false;
            fixed = true;
         }

@@ -2089,7 +2114,7 @@ function up2k_init(subtle) {
     if (parallel_uploads < 1)
         bumpthread(1);
 
-    return { "init_deps": init_deps, "set_fsearch": set_fsearch, "ui": pvis }
+    return { "init_deps": init_deps, "set_fsearch": set_fsearch, "ui": pvis, "st": st, "uc": uc }
 }
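The `oldest_active` rewrite above replaces a simple are-any-queues-nonempty busy check with the index of the oldest file that still has pending work, so the scheduler can hold back HEAD requests for files far ahead of the one currently uploading. A standalone sketch of that selection (hypothetical `oldestActive` helper; the real code reads `.n`/`.nfile` fields off the queue entries rather than bare indices):

```javascript
// find the lowest file-index that still has queued work;
// an empty queue contributes nfiles, meaning "no constraint"
function oldestActive(queues, nfiles) {
    var idxs = queues.map(function (q) {
        return q.length ? q[0] : nfiles;
    });
    return Math.min.apply(Math, idxs);
}

// 3 files total; hashing has reached file 2, uploads are still on file 1,
// so the oldest unfinished file is 1 and the uploader counts as busy
var oldest = oldestActive([[2], [1]], 3);
var isBusy = oldest < 3;
```

`is_busy` then falls out for free: the minimum only reaches `st.files.length` once every queue is drained.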
@@ -7,8 +7,7 @@ if (!window['console'])
 
 
 var is_touch = 'ontouchstart' in window,
-    IPHONE = /iPhone|iPad|iPod/i.test(navigator.userAgent),
-    ANDROID = /android/i.test(navigator.userAgent),
+    IPHONE = is_touch && /iPhone|iPad|iPod/i.test(navigator.userAgent),
     WINDOWS = navigator.platform ? navigator.platform == 'Win32' : /Windows/.test(navigator.userAgent);
 
 

@@ -181,6 +180,7 @@ function ignex(all) {
     if (!all)
         window.onerror = vis_exh;
 }
+window.onerror = vis_exh;
 
 
 function noop() { }

@@ -286,15 +286,19 @@ function crc32(str) {
 
 
 function clmod(el, cls, add) {
+    if (!el)
+        return false;
+
     if (el.classList) {
         var have = el.classList.contains(cls);
         if (add == 't')
             add = !have;
 
-        if (add != have)
-            el.classList[add ? 'add' : 'remove'](cls);
+        if (!add == !have)
+            return false;
 
-        return;
+        el.classList[add ? 'add' : 'remove'](cls);
+        return true;
     }
 
     var re = new RegExp('\\s*\\b' + cls + '\\s*\\b', 'g'),

@@ -305,12 +309,18 @@ function clmod(el, cls, add) {
 
     var n2 = n1.replace(re, ' ') + (add ? ' ' + cls : '');
 
-    if (n1 != n2)
-        el.className = n2;
+    if (!n1 == !n2)
+        return false;
+
+    el.className = n2;
+    return true;
 }
 
 
 function clgot(el, cls) {
+    if (!el)
+        return;
+
     if (el.classList)
         return el.classList.contains(cls);
 

@@ -319,14 +329,45 @@ function clgot(el, cls) {
 }
 
 
+function showsort(tab) {
+    var v, vn, v1, v2, th = tab.tHead,
+        sopts = jread('fsort', [["href", 1, ""]]);
+
+    th && (th = th.rows[0]) && (th = th.cells);
+
+    for (var a = sopts.length - 1; a >= 0; a--) {
+        if (!sopts[a][0])
+            continue;
+
+        v2 = v1;
+        v1 = sopts[a];
+    }
+
+    v = [v1, v2];
+    vn = [v1 ? v1[0] : '', v2 ? v2[0] : ''];
+
+    var ga = QSA('#ghead a[s]');
+    for (var a = 0; a < ga.length; a++)
+        ga[a].className = '';
+
+    for (var a = 0; a < th.length; a++) {
+        var n = vn.indexOf(th[a].getAttribute('name')),
+            cl = n < 0 ? ' ' : ' s' + n + (v[n][1] > 0 ? ' ' : 'r ');
+
+        th[a].className = th[a].className.replace(/ *s[01]r? */, ' ') + cl;
+        if (n + 1) {
+            ga = QS('#ghead a[s="' + vn[n] + '"]');
+            if (ga)
+                ga.className = cl;
+        }
+    }
+}
 function sortTable(table, col, cb) {
     var tb = table.tBodies[0],
         th = table.tHead.rows[0].cells,
         tr = Array.prototype.slice.call(tb.rows, 0),
-        i, reverse = th[col].className.indexOf('sort1') !== -1 ? -1 : 1;
-    for (var a = 0, thl = th.length; a < thl; a++)
-        th[a].className = th[a].className.replace(/ *sort-?1 */, " ");
-    th[col].className += ' sort' + reverse;
+        i, reverse = /s0[^r]/.exec(th[col].className + ' ') ? -1 : 1;
     var stype = th[col].getAttribute('sort');
     try {
         var nrules = [], rules = jread("fsort", []);

@@ -344,6 +385,7 @@ function sortTable(table, col, cb) {
                 break;
         }
         jwrite("fsort", nrules);
+        try { showsort(table); } catch (ex) { }
     }
     catch (ex) {
         console.log("failed to persist sort rules, resetting: " + ex);

@@ -392,7 +434,7 @@ function makeSortable(table, cb) {
 }
 
 
-function linksplit(rp) {
+function linksplit(rp, id) {
     var ret = [],
         apath = '/',
         q = null;

@@ -422,8 +464,13 @@
         vlink = vlink.slice(0, -1) + '<span>/</span>';
     }
 
-    if (!rp && q)
-        link += q;
+    if (!rp) {
+        if (q)
+            link += q;
+
+        if (id)
+            link += '" id="' + id;
+    }
+
     ret.push('<a href="' + apath + link + '">' + vlink + '</a>');
     apath += link;

@@ -782,13 +829,18 @@ var timer = (function () {
 var tt = (function () {
     var r = {
         "tt": mknod("div"),
+        "th": mknod("div"),
         "en": true,
         "el": null,
-        "skip": false
+        "skip": false,
+        "lvis": 0
     };
 
+    r.th.innerHTML = '?';
     r.tt.setAttribute('id', 'tt');
+    r.th.setAttribute('id', 'tth');
     document.body.appendChild(r.tt);
+    document.body.appendChild(r.th);
 
     var prev = null;
     r.cshow = function () {

@@ -798,11 +850,25 @@ var tt = (function () {
         prev = this;
     };
 
-    r.show = function () {
-        if (r.skip) {
-            r.skip = false;
-            return;
-        }
+    var tev;
+    r.dshow = function (e) {
+        clearTimeout(tev);
+        if (!r.getmsg(this))
+            return;
+
+        if (Date.now() - r.lvis < 400)
+            return r.show.bind(this)();
+
+        tev = setTimeout(r.show.bind(this), 800);
+        if (is_touch)
+            return;
+
+        this.addEventListener('mousemove', r.move);
+        clmod(r.th, 'act', 1);
+        r.move(e);
+    };
+
+    r.getmsg = function (el) {
         if (QS('body.bbox-open'))
             return;
 

@@ -810,7 +876,16 @@ var tt = (function () {
         if (cfg !== null && cfg != '1')
             return;
 
-        var msg = this.getAttribute('tt');
+        return el.getAttribute('tt');
+    };
+
+    r.show = function () {
+        clearTimeout(tev);
+        if (r.skip) {
+            r.skip = false;
+            return;
+        }
+        var msg = r.getmsg(this);
         if (!msg)
             return;
 

@@ -824,6 +899,7 @@ var tt = (function () {
         if (dir.indexOf('u') + 1) top = false;
         if (dir.indexOf('d') + 1) top = true;
 
+        clmod(r.th, 'act');
         clmod(r.tt, 'b', big);
         r.tt.style.left = '0';
         r.tt.style.top = '0';

@@ -849,14 +925,27 @@ var tt = (function () {
 
     r.hide = function (e) {
         ev(e);
+        clearTimeout(tev);
         window.removeEventListener('scroll', r.hide);
-        clmod(r.tt, 'show');
         clmod(r.tt, 'b');
+        clmod(r.th, 'act');
+        if (clmod(r.tt, 'show'))
+            r.lvis = Date.now();
+
         if (r.el)
             r.el.removeEventListener('mouseleave', r.hide);
+
+        if (e && e.target)
+            e.target.removeEventListener('mousemove', r.move);
     };
 
-    if (is_touch && IPHONE) {
+    r.move = function (e) {
+        r.th.style.left = (e.pageX + 12) + 'px';
+        r.th.style.top = (e.pageY + 12) + 'px';
+    };
+
+    if (IPHONE) {
         var f1 = r.show,
             f2 = r.hide,
             q = [];

@@ -882,14 +971,14 @@ var tt = (function () {
 
     r.att = function (ctr) {
         var _cshow = r.en ? r.cshow : null,
-            _show = r.en ? r.show : null,
+            _dshow = r.en ? r.dshow : null,
             _hide = r.en ? r.hide : null,
             o = ctr.querySelectorAll('*[tt]');
 
         for (var a = o.length - 1; a >= 0; a--) {
             o[a].onfocus = _cshow;
             o[a].onblur = _hide;
-            o[a].onmouseenter = _show;
+            o[a].onmouseenter = _dshow;
             o[a].onmouseleave = _hide;
         }
         r.hide();
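The rewritten `clmod` now reports whether it actually changed the element's class list, which the tooltip code above relies on (`if (clmod(r.tt, 'show')) r.lvis = Date.now();`). A minimal string-only sketch of that contract (hypothetical `clmodStr`, no DOM involved; `add` may be truthy, falsy, or `'t'` for toggle, same as the patched util.js):

```javascript
// toggle/add/remove cls in a className string;
// returns [newClassName, changed] instead of mutating an element
function clmodStr(className, cls, add) {
    var have = (' ' + className + ' ').indexOf(' ' + cls + ' ') !== -1;
    if (add === 't')
        add = !have;  // 't' means toggle

    if (!add === !have)
        return [className, false];  // already in the desired state

    var n = add
        ? (className + ' ' + cls).trim()
        : className.split(/\s+/).filter(function (c) { return c !== cls; }).join(' ');

    return [n, true];
}
```

The early `return false` when nothing changes is what lets callers distinguish "tooltip was visible and is now hidden" from "hide was a no-op".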
@@ -23,6 +23,15 @@ point `--css-browser` to one of these by URL:
 
 
 
+# utilities
+
+## [`multisearch.html`](multisearch.html)
+* takes a list of filenames of youtube rips, grabs the youtube-id of each file, and does a search on the server for those
+* use it by putting it somewhere on the server and opening it as an html page
+* also serves as an extendable template for other specific search behaviors
+
+
+
 # other stuff
 
 ## [`rclone.md`](rclone.md)
124
docs/multisearch.html
Normal file
124
docs/multisearch.html
Normal file
@@ -0,0 +1,124 @@
|
|||||||
|
<!DOCTYPE html><html lang="en"><head>
|
||||||
|
<meta charset="utf-8">
|
||||||
|
<title>multisearch</title>
|
||||||
|
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||||
|
<style>
|
||||||
|
|
||||||
|
html, body {
|
||||||
|
margin: 0;
|
||||||
|
padding: 0;
|
||||||
|
color: #ddd;
|
||||||
|
background: #222;
|
||||||
|
font-family: sans-serif;
|
||||||
|
}
|
||||||
|
body {
|
||||||
|
padding: 1em;
|
||||||
|
}
|
||||||
|
a {
|
||||||
|
color: #fc5;
|
||||||
|
}
|
||||||
|
ul {
|
||||||
|
line-height: 1.5em;
|
||||||
|
}
|
||||||
|
code {
|
||||||
|
color: #fc5;
|
||||||
|
border: 1px solid #444;
|
||||||
|
padding: .1em .2em;
|
||||||
|
font-family: sans-serif, sans-serif;
|
||||||
|
}
|
||||||
|
#src {
|
||||||
|
display: block;
|
||||||
|
width: calc(100% - 1em);
|
||||||
|
padding: .5em;
|
||||||
|
margin: 0;
|
||||||
|
}
|
||||||
|
td {
|
||||||
|
padding-left: 1em;
|
||||||
|
}
|
||||||
|
.hit,
|
||||||
|
.miss {
|
||||||
|
font-weight: bold;
|
||||||
|
padding-left: 0;
|
||||||
|
padding-top: 1em;
|
||||||
|
}
|
||||||
|
.hit {color: #af0;}
|
||||||
|
.miss {color: #f0c;}
|
||||||
|
.hit:before {content: '✅';}
|
||||||
|
.miss:before {content: '❌';}
|
||||||
|
|
||||||
|
</style></head><body>
|
||||||
|
<ul>
|
||||||
|
<li>paste a list of filenames (youtube rips) below and hit search</li>
|
||||||
|
<li>it will grab the youtube-id from the filenames and search for each id</li>
|
||||||
|
<li>filenames must be like <code>-YTID.webm</code> (youtube-dl style) or <code>[YTID].webm</code> (ytdlp style)</li>
|
||||||
|
</ul>
|
||||||
|
<textarea id="src"></textarea>
|
||||||
|
<button id="go">search</button>
|
||||||
|
<div id="res"></div>
|
||||||
|
<script>
|
||||||
|
|
||||||
|
var ebi = document.getElementById.bind(document);
|
||||||
|
function esc(txt) {
|
||||||
|
return txt.replace(/[&"<>]/g, function (c) {
|
||||||
|
return {
|
||||||
|
'&': '&',
|
||||||
|
'"': '"',
|
||||||
|
'<': '<',
|
||||||
|
'>': '>'
|
||||||
|
}[c];
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
ebi('go').onclick = async function() {
|
||||||
|
var queries = [];
|
||||||
|
for (var ln of ebi('src').value.split(/\n/g)) {
|
||||||
|
// filter the list of input files,
|
||||||
|
// only keeping youtube videos,
|
||||||
|
// meaning the filename ends with either
|
||||||
|
// [YOUTUBEID].EXTENSION or
|
||||||
|
// -YOUTUBEID.EXTENSION
|
||||||
|
var m = /[[-]([0-9a-zA-Z_-]{11})\]?\.(mp4|webm|mkv)$/.exec(ln);
|
||||||
|
if (!m || !(m = m[1]))
|
||||||
|
continue;
|
||||||
|
|
||||||
|
// create a search query for each line: name like *youtubeid*
|
||||||
|
queries.push([ln, `name like *${m}*`]);
|
||||||
|
}
|
||||||
|
|
||||||
|
var a = 0, html = ['<table>'], hits = [], misses = [];
|
||||||
|
for (var [fn, q] of queries) {
|
||||||
|
var r = await fetch('/?srch', {
|
||||||
|
method: 'POST',
|
||||||
|
body: JSON.stringify({'q': q})
|
||||||
|
});
|
||||||
|
r = await r.json();
|
||||||
|
|
||||||
|
var cl, tab2;
|
||||||
|
if (r.hits.length) {
|
||||||
|
tab2 = hits;
|
||||||
|
cl = 'hit';
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
tab2 = misses;
|
||||||
|
cl = 'miss';
|
||||||
|
}
|
||||||
|
var h = `<tr><td class="${cl}" colspan="9">${esc(fn)}</td></tr>`;
|
||||||
|
tab2.push(h);
|
||||||
|
html.push(h);
|
||||||
|
for (var h of r.hits) {
|
||||||
|
var link = `<a href="/${h.rp}">${esc(decodeURIComponent(h.rp))}</a>`;
|
||||||
|
html.push(`<tr><td>${h.sz}</td><td>${link}</td></tr>`);
|
||||||
|
}
|
||||||
|
ebi('res').innerHTML = `searching, ${++a} / ${queries.length} done, ${hits.length} hits, ${misses.length} miss`;
|
||||||
|
}
|
||||||
|
html.push('<tr><td><h1>hits:</h1></td></tr>');
|
||||||
|
html = html.concat(hits);
|
||||||
|
|
||||||
|
html.push('<tr><td><h1>miss:</h1></td></tr>');
|
||||||
|
html = html.concat(misses);
|
||||||
|
|
||||||
|
html.push('</table>');
|
||||||
|
ebi('res').innerHTML = html.join('\n');
|
||||||
|
};
|
||||||
|
|
||||||
|
</script></body></html>
|
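The filename filter in multisearch.html hinges on one regex: an 11-character youtube-id preceded by `-` (youtube-dl) or `[` (yt-dlp), immediately before the video extension. To illustrate what it accepts (same pattern; the sample filenames are made up):

```javascript
// the id must sit right before the extension, prefixed by '-' or '[';
// anything else (including shorter "id-like" runs) is skipped
var re = /[[-]([0-9a-zA-Z_-]{11})\]?\.(mp4|webm|mkv)$/;

var m1 = re.exec('some talk-dQw4w9WgXcQ.webm');   // youtube-dl style
var m2 = re.exec('some talk [dQw4w9WgXcQ].mkv');  // yt-dlp style
var m3 = re.exec('unrelated-recording.webm');     // no 11-char id -> null
```

Because the match is anchored with `$`, an id anywhere else in the name is ignored, which keeps false positives on ordinary hyphenated filenames rare.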
@@ -80,6 +80,12 @@ shab64() { sp=$1; f="$2"; v=0; sz=$(stat -c%s "$f"); while true; do w=$((v+sp*10
 command -v gdate && date() { gdate "$@"; }; while true; do t=$(date +%s.%N); (time wget http://127.0.0.1:3923/?ls -qO- | jq -C '.files[]|{sz:.sz,ta:.tags.artist,tb:.tags.".bpm"}|del(.[]|select(.==null))' | awk -F\" '/"/{t[$2]++} END {for (k in t){v=t[k];p=sprintf("%" (v+1) "s",v);gsub(/ /,"#",p);printf "\033[36m%s\033[33m%s ",k,p}}') 2>&1 | awk -v ts=$t 'NR==1{t1=$0} NR==2{sub(/.*0m/,"");sub(/s$/,"");t2=$0;c=2; if(t2>0.3){c=3} if(t2>0.8){c=1} } END{sub(/[0-9]{6}$/,"",ts);printf "%s \033[3%dm%s %s\033[0m\n",ts,c,t2,t1}'; sleep 0.1 || break; done
 
 
+##
+## track an up2k upload and print all chunks in file-order
+
+grep '"name": "2021-07-18 02-17-59.mkv"' fug.log | head -n 1 | sed -r 's/.*"hash": \[//; s/\].*//' | tr '"' '\n' | grep -E '^[a-zA-Z0-9_-]{44}$' | while IFS= read -r cid; do cat -n fug.log | grep -vF '"purl": "' | grep -- "$cid"; echo; done | stdbuf -oL tr '\t' ' ' | while IFS=' ' read -r ln _ _ _ _ _ ts ip port msg; do [ -z "$msg" ] && echo && continue; printf '%6s [%s] [%s] %s\n' $ln "$ts" "$ip $port" "$msg"; read -r ln _ _ _ _ _ ts ip port msg < <(cat -n fug.log | tail -n +$((ln+1)) | grep -F "$ip $port" | head -n 1); printf '%6s [%s] [%s] %s\n' $ln "$ts" "$ip $port" "$msg"; done
+
+
 ##
 ## js oneliners
 

@@ -185,8 +191,13 @@ about:config >> devtools.debugger.prefs-schema-version = -1
 git pull; git reset --hard origin/HEAD && git log --format=format:"%H %ai %d" --decorate=full > ../revs && cat ../{util,browser,up2k}.js >../vr && cat ../revs | while read -r rev extra; do (git reset --hard $rev >/dev/null 2>/dev/null && dsz=$(cat copyparty/web/{util,browser,up2k}.js >../vg 2>/dev/null && diff -wNarU0 ../{vg,vr} | wc -c) && printf '%s %6s %s\n' "$rev" $dsz "$extra") </dev/null; done
 
 # download all sfx versions
-curl https://api.github.com/repos/9001/copyparty/releases?per_page=100 | jq -r '.[] | .tag_name + " " + .name' | tr -d '\r' | while read v t; do fn="copyparty $v $t.py"; [ -e "$fn" ] || curl https://github.com/9001/copyparty/releases/download/$v/copyparty-sfx.py -Lo "$fn"; done
+curl https://api.github.com/repos/9001/copyparty/releases?per_page=100 | jq -r '.[] | .tag_name + " " + .name' | tr -d '\r' | while read v t; do fn="$(printf '%s\n' "copyparty $v $t.py" | tr / -)"; [ -e "$fn" ] || curl https://github.com/9001/copyparty/releases/download/$v/copyparty-sfx.py -Lo "$fn"; done
+
+# push to multiple git remotes
+git config -l | grep '^remote'
+git remote add all git@github.com:9001/copyparty.git
+git remote set-url --add --push all git@gitlab.com:9001/copyparty.git
+git remote set-url --add --push all git@github.com:9001/copyparty.git
 
 ##
 ## http 206
@@ -140,10 +140,10 @@ repack() {
 }
 
 repack sfx-full "re gz no-sh"
-repack sfx-ent "re no-dd no-ogv"
-repack sfx-ent "re no-dd no-ogv gz no-sh"
-repack sfx-lite "re no-dd no-ogv no-cm no-hl"
-repack sfx-lite "re no-dd no-ogv no-cm no-hl gz no-sh"
+repack sfx-ent "re no-dd"
+repack sfx-ent "re no-dd gz no-sh"
+repack sfx-lite "re no-dd no-cm no-hl"
+repack sfx-lite "re no-dd no-cm no-hl gz no-sh"
 
 
 # move fuse and up2k clients into copyparty-extras/,
@@ -1,11 +1,10 @@
-FROM alpine:3.14
+FROM alpine:3.15
 WORKDIR /z
 ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
 ver_hashwasm=4.9.0 \
-ver_marked=3.0.4 \
-ver_ogvjs=1.8.4 \
+ver_marked=4.0.6 \
 ver_mde=2.15.0 \
-ver_codemirror=5.62.3 \
+ver_codemirror=5.64.0 \
 ver_fontawesome=5.13.0 \
 ver_zopfli=1.0.3

@@ -15,7 +14,6 @@ ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
 RUN mkdir -p /z/dist/no-pk \
 && wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
 && apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev brotli py3-brotli \
-&& wget https://github.com/brion/ogv.js/releases/download/$ver_ogvjs/ogvjs-$ver_ogvjs.zip -O ogvjs.zip \
 && wget https://github.com/openpgpjs/asmcrypto.js/archive/$ver_asmcrypto.tar.gz -O asmcrypto.tgz \
 && wget https://github.com/markedjs/marked/archive/v$ver_marked.tar.gz -O marked.tgz \
 && wget https://github.com/Ionaru/easy-markdown-editor/archive/$ver_mde.tar.gz -O mde.tgz \
@@ -23,7 +21,6 @@ RUN mkdir -p /z/dist/no-pk \
 && wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
 && wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
 && wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
-&& unzip ogvjs.zip \
 && (mkdir hash-wasm \
 && cd hash-wasm \
 && unzip ../hash-wasm.zip) \
@@ -77,21 +74,6 @@ RUN cd hash-wasm \
 && mv sha512.umd.min.js /z/dist/sha512.hw.js


-# build ogvjs
-RUN cd ogvjs-$ver_ogvjs \
-&& cp -pv \
-ogv-worker-audio.js \
-ogv-demuxer-ogg-wasm.js \
-ogv-demuxer-ogg-wasm.wasm \
-ogv-decoder-audio-opus-wasm.js \
-ogv-decoder-audio-opus-wasm.wasm \
-ogv-decoder-audio-vorbis-wasm.js \
-ogv-decoder-audio-vorbis-wasm.wasm \
-/z/dist \
-&& cp -pv \
-ogv-es2017.js /z/dist/ogv.js
-
-
 # build marked
 COPY marked.patch /z/
 COPY marked-ln.patch /z/
@@ -100,7 +82,6 @@ RUN cd marked-$ver_marked \
 && patch -p1 < /z/marked.patch \
 && npm run build \
 && cp -pv marked.min.js /z/dist/marked.js \
-&& cp -pv lib/marked.js /z/dist/marked.full.js \
 && mkdir -p /z/nodepkgs \
 && ln -s $(pwd) /z/nodepkgs/marked
 # && npm run test \
@@ -116,8 +97,10 @@ RUN cd CodeMirror-$ver_codemirror \


 # build easymde
+COPY easymde-marked6.patch /z/
 COPY easymde.patch /z/
 RUN cd easy-markdown-editor-$ver_mde \
+&& patch -p1 < /z/easymde-marked6.patch \
 && patch -p1 < /z/easymde.patch \
 && sed -ri 's`https://registry.npmjs.org/marked/-/marked-[0-9\.]+.tgz`file:/z/nodepkgs/marked`' package-lock.json \
 && sed -ri 's`("marked": ")[^"]+`\1file:/z/nodepkgs/marked`' ./package.json \
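the build above layers two patches onto the marked/easymde sources with `patch -p1` before `npm run build`; a minimal, self-contained sketch of that `patch -p1` mechanic on a throwaway tree (the file names here are hypothetical, not the real sources):

```shell
# scratch directory standing in for an unpacked source tree
d=$(mktemp -d) && cd "$d"
mkdir src && printf 'hello\n' > src/a.txt

# a one-hunk patch; -p1 strips the leading a/ b/ path component,
# same as the /z/*.patch files applied in the dockerfile
cat > fix.patch <<'EOF'
--- a/src/a.txt
+++ b/src/a.txt
@@ -1 +1 @@
-hello
+world
EOF

patch -p1 < fix.patch
cat src/a.txt   # world
```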
scripts/deps-docker/easymde-marked6.patch (new file, 12 lines)
@@ -0,0 +1,12 @@
+diff --git a/src/js/easymde.js b/src/js/easymde.js
+--- a/src/js/easymde.js
++++ b/src/js/easymde.js
+@@ -1962,7 +1962,7 @@ EasyMDE.prototype.markdown = function (text) {
+ marked.setOptions(markedOptions);
+
+ // Convert the markdown to HTML
+- var htmlText = marked(text);
++ var htmlText = marked.parse(text);
+
+ // Sanitize HTML
+ if (this.options.renderingConfig && typeof this.options.renderingConfig.sanitizerFunction === 'function') {
@@ -1,15 +1,15 @@
 diff --git a/src/Lexer.js b/src/Lexer.js
-adds linetracking to marked.js v3.0.4;
+adds linetracking to marked.js v4.0.6;
 add data-ln="%d" to most tags, %d is the source markdown line
 --- a/src/Lexer.js
 +++ b/src/Lexer.js
 @@ -50,4 +50,5 @@ function mangle(text) {
-module.exports = class Lexer {
+export class Lexer {
 constructor(options) {
 + this.ln = 1; // like most editors, start couting from 1
 this.tokens = [];
 this.tokens.links = Object.create(null);
-@@ -127,4 +128,15 @@ module.exports = class Lexer {
+@@ -127,4 +128,15 @@ export class Lexer {
 }

 + set_ln(token, ln = this.ln) {
@@ -25,7 +25,7 @@ add data-ln="%d" to most tags, %d is the source markdown line
 +
 /**
 * Lexing
-@@ -134,7 +146,11 @@ module.exports = class Lexer {
+@@ -134,7 +146,11 @@ export class Lexer {
 src = src.replace(/^ +$/gm, '');
 }
 - let token, lastToken, cutSrc, lastParagraphClipped;
@@ -38,105 +38,105 @@ add data-ln="%d" to most tags, %d is the source markdown line
 +
 if (this.options.extensions
 && this.options.extensions.block
-@@ -142,4 +158,5 @@ module.exports = class Lexer {
+@@ -142,4 +158,5 @@ export class Lexer {
 if (token = extTokenizer.call({ lexer: this }, src, tokens)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 return true;
-@@ -153,4 +170,5 @@ module.exports = class Lexer {
+@@ -153,4 +170,5 @@ export class Lexer {
 if (token = this.tokenizer.space(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln); // is \n if not type
 if (token.type) {
 tokens.push(token);
-@@ -162,4 +180,5 @@ module.exports = class Lexer {
+@@ -162,4 +180,5 @@ export class Lexer {
 if (token = this.tokenizer.code(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 lastToken = tokens[tokens.length - 1];
 // An indented code block cannot interrupt a paragraph.
-@@ -177,4 +196,5 @@ module.exports = class Lexer {
+@@ -177,4 +196,5 @@ export class Lexer {
 if (token = this.tokenizer.fences(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -184,4 +204,5 @@ module.exports = class Lexer {
+@@ -184,4 +204,5 @@ export class Lexer {
 if (token = this.tokenizer.heading(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -191,4 +212,5 @@ module.exports = class Lexer {
+@@ -191,4 +212,5 @@ export class Lexer {
 if (token = this.tokenizer.hr(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -198,4 +220,5 @@ module.exports = class Lexer {
+@@ -198,4 +220,5 @@ export class Lexer {
 if (token = this.tokenizer.blockquote(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -205,4 +228,5 @@ module.exports = class Lexer {
+@@ -205,4 +228,5 @@ export class Lexer {
 if (token = this.tokenizer.list(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -212,4 +236,5 @@ module.exports = class Lexer {
+@@ -212,4 +236,5 @@ export class Lexer {
 if (token = this.tokenizer.html(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -219,4 +244,5 @@ module.exports = class Lexer {
+@@ -219,4 +244,5 @@ export class Lexer {
 if (token = this.tokenizer.def(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 lastToken = tokens[tokens.length - 1];
 if (lastToken && (lastToken.type === 'paragraph' || lastToken.type === 'text')) {
-@@ -236,4 +262,5 @@ module.exports = class Lexer {
+@@ -236,4 +262,5 @@ export class Lexer {
 if (token = this.tokenizer.table(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -243,4 +270,5 @@ module.exports = class Lexer {
+@@ -243,4 +270,5 @@ export class Lexer {
 if (token = this.tokenizer.lheading(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 tokens.push(token);
 continue;
-@@ -263,4 +291,5 @@ module.exports = class Lexer {
+@@ -263,4 +291,5 @@ export class Lexer {
 }
 if (this.state.top && (token = this.tokenizer.paragraph(cutSrc))) {
 + this.set_ln(token, ln);
 lastToken = tokens[tokens.length - 1];
 if (lastParagraphClipped && lastToken.type === 'paragraph') {
-@@ -280,4 +309,6 @@ module.exports = class Lexer {
+@@ -280,4 +309,6 @@ export class Lexer {
 if (token = this.tokenizer.text(src)) {
 src = src.substring(token.raw.length);
 + this.set_ln(token, ln);
 + this.ln++;
 lastToken = tokens[tokens.length - 1];
 if (lastToken && lastToken.type === 'text') {
-@@ -355,4 +386,5 @@ module.exports = class Lexer {
+@@ -355,4 +386,5 @@ export class Lexer {
 if (token = extTokenizer.call({ lexer: this }, src, tokens)) {
 src = src.substring(token.raw.length);
 + this.ln = token.ln || this.ln;
 tokens.push(token);
 return true;
-@@ -420,4 +452,6 @@ module.exports = class Lexer {
+@@ -420,4 +452,6 @@ export class Lexer {
 if (token = this.tokenizer.br(src)) {
 src = src.substring(token.raw.length);
 + // no need to reset (no more blockTokens anyways)
 + token.ln = this.ln++;
 tokens.push(token);
 continue;
-@@ -462,4 +496,5 @@ module.exports = class Lexer {
+@@ -462,4 +496,5 @@ export class Lexer {
 if (token = this.tokenizer.inlineText(cutSrc, smartypants)) {
 src = src.substring(token.raw.length);
 + this.ln = token.ln || this.ln;
@@ -145,13 +145,13 @@ add data-ln="%d" to most tags, %d is the source markdown line
 diff --git a/src/Parser.js b/src/Parser.js
 --- a/src/Parser.js
 +++ b/src/Parser.js
-@@ -18,4 +18,5 @@ module.exports = class Parser {
+@@ -18,4 +18,5 @@ export class Parser {
 this.textRenderer = new TextRenderer();
 this.slugger = new Slugger();
 + this.ln = 0; // error indicator; should always be set >=1 from tokens
 }

-@@ -64,4 +65,8 @@ module.exports = class Parser {
+@@ -64,4 +65,8 @@ export class Parser {
 for (i = 0; i < l; i++) {
 token = tokens[i];
 + // take line-numbers from tokens whenever possible
@@ -160,7 +160,7 @@ diff --git a/src/Parser.js b/src/Parser.js
 + this.renderer.tag_ln(this.ln);

 // Run any renderer extensions
-@@ -124,7 +129,10 @@ module.exports = class Parser {
+@@ -124,7 +129,10 @@ export class Parser {
 }

 - body += this.renderer.tablerow(cell);
@@ -173,7 +173,7 @@ diff --git a/src/Parser.js b/src/Parser.js
 + out += this.renderer.tag_ln(token.ln).table(header, body);
 continue;
 }
-@@ -167,8 +175,12 @@ module.exports = class Parser {
+@@ -167,8 +175,12 @@ export class Parser {

 itemBody += this.parse(item.tokens, loose);
 - body += this.renderer.listitem(itemBody, task, checked);
@@ -188,7 +188,7 @@ diff --git a/src/Parser.js b/src/Parser.js
 + out += this.renderer.tag_ln(token.ln).list(body, ordered, start);
 continue;
 }
-@@ -179,5 +191,6 @@ module.exports = class Parser {
+@@ -179,5 +191,6 @@ export class Parser {
 }
 case 'paragraph': {
 - out += this.renderer.paragraph(this.parseInline(token.tokens));
@@ -196,7 +196,7 @@ diff --git a/src/Parser.js b/src/Parser.js
 + out += this.renderer.tag_ln(token.ln).paragraph(t);
 continue;
 }
-@@ -221,4 +234,7 @@ module.exports = class Parser {
+@@ -221,4 +234,7 @@ export class Parser {
 token = tokens[i];

 + // another thing that only affects <br/> and other inlines
@@ -207,7 +207,7 @@ diff --git a/src/Parser.js b/src/Parser.js
 diff --git a/src/Renderer.js b/src/Renderer.js
 --- a/src/Renderer.js
 +++ b/src/Renderer.js
-@@ -11,6 +11,12 @@ module.exports = class Renderer {
+@@ -11,6 +11,12 @@ export class Renderer {
 constructor(options) {
 this.options = options || defaults;
 + this.ln = "";
@@ -220,7 +220,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
 +
 code(code, infostring, escaped) {
 const lang = (infostring || '').match(/\S*/)[0];
-@@ -26,10 +32,10 @@ module.exports = class Renderer {
+@@ -26,10 +32,10 @@ export class Renderer {

 if (!lang) {
 - return '<pre><code>'
@@ -233,55 +233,55 @@ diff --git a/src/Renderer.js b/src/Renderer.js
 + return '<pre' + this.ln + '><code class="'
 + this.options.langPrefix
 + escape(lang, true)
-@@ -40,5 +46,5 @@ module.exports = class Renderer {
+@@ -40,5 +46,5 @@ export class Renderer {

 blockquote(quote) {
 - return '<blockquote>\n' + quote + '</blockquote>\n';
 + return '<blockquote' + this.ln + '>\n' + quote + '</blockquote>\n';
 }

-@@ -51,4 +57,5 @@ module.exports = class Renderer {
+@@ -51,4 +57,5 @@ export class Renderer {
 return '<h'
 + level
 + + this.ln
 + ' id="'
 + this.options.headerPrefix
-@@ -61,5 +68,5 @@ module.exports = class Renderer {
+@@ -61,5 +68,5 @@ export class Renderer {
 }
 // ignore IDs
 - return '<h' + level + '>' + text + '</h' + level + '>\n';
 + return '<h' + level + this.ln + '>' + text + '</h' + level + '>\n';
 }

-@@ -75,5 +82,5 @@ module.exports = class Renderer {
+@@ -75,5 +82,5 @@ export class Renderer {

 listitem(text) {
 - return '<li>' + text + '</li>\n';
 + return '<li' + this.ln + '>' + text + '</li>\n';
 }

-@@ -87,5 +94,5 @@ module.exports = class Renderer {
+@@ -87,5 +94,5 @@ export class Renderer {

 paragraph(text) {
 - return '<p>' + text + '</p>\n';
 + return '<p' + this.ln + '>' + text + '</p>\n';
 }

-@@ -102,5 +109,5 @@ module.exports = class Renderer {
+@@ -102,5 +109,5 @@ export class Renderer {

 tablerow(content) {
 - return '<tr>\n' + content + '</tr>\n';
 + return '<tr' + this.ln + '>\n' + content + '</tr>\n';
 }

-@@ -127,5 +134,5 @@ module.exports = class Renderer {
+@@ -127,5 +134,5 @@ export class Renderer {

 br() {
 - return this.options.xhtml ? '<br/>' : '<br>';
 + return this.options.xhtml ? '<br' + this.ln + '/>' : '<br' + this.ln + '>';
 }

-@@ -153,5 +160,5 @@ module.exports = class Renderer {
+@@ -153,5 +160,5 @@ export class Renderer {
 }

 - let out = '<img src="' + href + '" alt="' + text + '"';
@@ -291,7 +291,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
 diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 --- a/src/Tokenizer.js
 +++ b/src/Tokenizer.js
-@@ -301,4 +301,7 @@ module.exports = class Tokenizer {
+@@ -297,4 +297,7 @@ export class Tokenizer {
 const l = list.items.length;

 + // each nested list gets +1 ahead; this hack makes every listgroup -1 but atleast it doesn't get infinitely bad
@@ -1,7 +1,7 @@
 diff --git a/src/Lexer.js b/src/Lexer.js
 --- a/src/Lexer.js
 +++ b/src/Lexer.js
-@@ -6,5 +6,5 @@ const { repeatString } = require('./helpers.js');
+@@ -6,5 +6,5 @@ import { repeatString } from './helpers.js';
 /**
 * smartypants text replacement
 - */
@@ -15,21 +15,21 @@ diff --git a/src/Lexer.js b/src/Lexer.js
 + *
 function mangle(text) {
 let out = '',
-@@ -465,5 +465,5 @@ module.exports = class Lexer {
+@@ -466,5 +466,5 @@ export class Lexer {

 // autolink
 - if (token = this.tokenizer.autolink(src, mangle)) {
 + if (token = this.tokenizer.autolink(src)) {
 src = src.substring(token.raw.length);
 tokens.push(token);
-@@ -472,5 +472,5 @@ module.exports = class Lexer {
+@@ -473,5 +473,5 @@ export class Lexer {

 // url (gfm)
 - if (!this.state.inLink && (token = this.tokenizer.url(src, mangle))) {
 + if (!this.state.inLink && (token = this.tokenizer.url(src))) {
 src = src.substring(token.raw.length);
 tokens.push(token);
-@@ -493,5 +493,5 @@ module.exports = class Lexer {
+@@ -494,5 +494,5 @@ export class Lexer {
 }
 }
 - if (token = this.tokenizer.inlineText(cutSrc, smartypants)) {
@@ -39,14 +39,14 @@ diff --git a/src/Lexer.js b/src/Lexer.js
 diff --git a/src/Renderer.js b/src/Renderer.js
 --- a/src/Renderer.js
 +++ b/src/Renderer.js
-@@ -142,5 +142,5 @@ module.exports = class Renderer {
+@@ -142,5 +142,5 @@ export class Renderer {

 link(href, title, text) {
 - href = cleanUrl(this.options.sanitize, this.options.baseUrl, href);
 + href = cleanUrl(this.options.baseUrl, href);
 if (href === null) {
 return text;
-@@ -155,5 +155,5 @@ module.exports = class Renderer {
+@@ -155,5 +155,5 @@ export class Renderer {

 image(href, title, text) {
 - href = cleanUrl(this.options.sanitize, this.options.baseUrl, href);
@@ -56,7 +56,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
 diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 --- a/src/Tokenizer.js
 +++ b/src/Tokenizer.js
-@@ -321,14 +321,7 @@ module.exports = class Tokenizer {
+@@ -320,14 +320,7 @@ export class Tokenizer {
 type: 'html',
 raw: cap[0],
 - pre: !this.options.sanitizer
@@ -72,7 +72,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 - }
 return token;
 }
-@@ -477,15 +470,9 @@ module.exports = class Tokenizer {
+@@ -476,15 +469,9 @@ export class Tokenizer {

 return {
 - type: this.options.sanitize
@@ -90,7 +90,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 + text: cap[0]
 };
 }
-@@ -672,10 +659,10 @@ module.exports = class Tokenizer {
+@@ -671,10 +658,10 @@ export class Tokenizer {
 }

 - autolink(src, mangle) {
@@ -103,7 +103,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 + text = escape(cap[1]);
 href = 'mailto:' + text;
 } else {
-@@ -700,10 +687,10 @@ module.exports = class Tokenizer {
+@@ -699,10 +686,10 @@ export class Tokenizer {
 }

 - url(src, mangle) {
@@ -116,7 +116,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 + text = escape(cap[0]);
 href = 'mailto:' + text;
 } else {
-@@ -737,12 +724,12 @@ module.exports = class Tokenizer {
+@@ -736,12 +723,12 @@ export class Tokenizer {
 }

 - inlineText(src, smartypants) {
@@ -135,7 +135,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
 diff --git a/src/defaults.js b/src/defaults.js
 --- a/src/defaults.js
 +++ b/src/defaults.js
-@@ -9,12 +9,8 @@ function getDefaults() {
+@@ -9,12 +9,8 @@ export function getDefaults() {
 highlight: null,
 langPrefix: 'language-',
 - mangle: true,
@@ -151,10 +151,10 @@ diff --git a/src/defaults.js b/src/defaults.js
 diff --git a/src/helpers.js b/src/helpers.js
 --- a/src/helpers.js
 +++ b/src/helpers.js
-@@ -64,18 +64,5 @@ function edit(regex, opt) {
+@@ -64,18 +64,5 @@ export function edit(regex, opt) {
 const nonWordAndColonTest = /[^\w:]/g;
 const originIndependentUrl = /^$|^[a-z][a-z0-9+.-]*:|^[?#]/i;
--function cleanUrl(sanitize, base, href) {
+-export function cleanUrl(sanitize, base, href) {
 - if (sanitize) {
 - let prot;
 - try {
@@ -168,36 +168,30 @@ diff --git a/src/helpers.js b/src/helpers.js
 - return null;
 - }
 - }
-+function cleanUrl(base, href) {
++export function cleanUrl(base, href) {
 if (base && !originIndependentUrl.test(href)) {
 href = resolveUrl(base, href);
-@@ -227,10 +214,4 @@ function findClosingBracket(str, b) {
+@@ -227,10 +214,4 @@ export function findClosingBracket(str, b) {
 }

--function checkSanitizeDeprecation(opt) {
+-export function checkSanitizeDeprecation(opt) {
 - if (opt && opt.sanitize && !opt.silent) {
 - console.warn('marked(): sanitize and sanitizer parameters are deprecated since version 0.7.0, should not be used and will be removed in the future. Read more here: https://marked.js.org/#/USING_ADVANCED.md#options');
 - }
 -}
 -
 // copied from https://stackoverflow.com/a/5450113/806777
-function repeatString(pattern, count) {
+export function repeatString(pattern, count) {
-@@ -260,5 +241,4 @@ module.exports = {
-rtrim,
-findClosingBracket,
-- checkSanitizeDeprecation,
-repeatString
-};
 diff --git a/src/marked.js b/src/marked.js
 --- a/src/marked.js
 +++ b/src/marked.js
-@@ -7,5 +7,4 @@ const Slugger = require('./Slugger.js');
+@@ -7,5 +7,4 @@ import { Slugger } from './Slugger.js';
-const {
+import {
 merge,
 - checkSanitizeDeprecation,
 escape
-} = require('./helpers.js');
+} from './helpers.js';
-@@ -35,5 +34,4 @@ function marked(src, opt, callback) {
+@@ -35,5 +34,4 @@ export function marked(src, opt, callback) {

 opt = merge({}, marked.defaults, opt || {});
 - checkSanitizeDeprecation(opt);
@@ -219,37 +213,37 @@ diff --git a/src/marked.js b/src/marked.js
 diff --git a/test/bench.js b/test/bench.js
 --- a/test/bench.js
 +++ b/test/bench.js
|
+++ b/test/bench.js
|
||||||
@@ -33,5 +33,4 @@ async function runBench(options) {
|
@@ -37,5 +37,4 @@ export async function runBench(options) {
|
||||||
breaks: false,
|
breaks: false,
|
||||||
pedantic: false,
|
pedantic: false,
|
||||||
- sanitize: false,
|
- sanitize: false,
|
||||||
smartLists: false
|
smartLists: false
|
||||||
});
|
});
|
||||||
@@ -45,5 +44,4 @@ async function runBench(options) {
|
@@ -49,5 +48,4 @@ export async function runBench(options) {
|
||||||
breaks: false,
|
breaks: false,
|
||||||
pedantic: false,
|
pedantic: false,
|
||||||
- sanitize: false,
|
- sanitize: false,
|
||||||
smartLists: false
|
smartLists: false
|
||||||
});
|
});
|
||||||
@@ -58,5 +56,4 @@ async function runBench(options) {
|
@@ -62,5 +60,4 @@ export async function runBench(options) {
|
||||||
breaks: false,
|
breaks: false,
|
||||||
pedantic: false,
|
pedantic: false,
|
||||||
- sanitize: false,
|
- sanitize: false,
|
||||||
smartLists: false
|
smartLists: false
|
||||||
});
|
});
|
||||||
@@ -70,5 +67,4 @@ async function runBench(options) {
|
@@ -74,5 +71,4 @@ export async function runBench(options) {
|
||||||
breaks: false,
|
breaks: false,
|
||||||
pedantic: false,
|
pedantic: false,
|
||||||
- sanitize: false,
|
- sanitize: false,
|
||||||
smartLists: false
|
smartLists: false
|
||||||
});
|
});
|
||||||
@@ -83,5 +79,4 @@ async function runBench(options) {
|
@@ -87,5 +83,4 @@ export async function runBench(options) {
|
||||||
breaks: false,
|
breaks: false,
|
||||||
pedantic: true,
|
pedantic: true,
|
||||||
- sanitize: false,
|
- sanitize: false,
|
||||||
smartLists: false
|
smartLists: false
|
||||||
});
|
});
|
||||||
@@ -95,5 +90,4 @@ async function runBench(options) {
|
@@ -99,5 +94,4 @@ export async function runBench(options) {
|
||||||
breaks: false,
|
breaks: false,
|
||||||
pedantic: true,
|
pedantic: true,
|
||||||
- sanitize: false,
|
- sanitize: false,
|
||||||
@@ -258,7 +252,7 @@ diff --git a/test/bench.js b/test/bench.js
|
|||||||
diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
|
diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
|
||||||
--- a/test/specs/run-spec.js
|
--- a/test/specs/run-spec.js
|
||||||
+++ b/test/specs/run-spec.js
|
+++ b/test/specs/run-spec.js
|
||||||
@@ -22,9 +22,4 @@ function runSpecs(title, dir, showCompletionTable, options) {
|
@@ -25,9 +25,4 @@ function runSpecs(title, dir, showCompletionTable, options) {
|
||||||
}
|
}
|
||||||
|
|
||||||
- if (spec.options.sanitizer) {
|
- if (spec.options.sanitizer) {
|
||||||
@@ -268,77 +262,77 @@ diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
|
|||||||
-
|
-
|
||||||
(spec.only ? fit : (spec.skip ? xit : it))('should ' + passFail + example, async() => {
|
(spec.only ? fit : (spec.skip ? xit : it))('should ' + passFail + example, async() => {
|
||||||
const before = process.hrtime();
|
const before = process.hrtime();
|
||||||
@@ -53,3 +48,2 @@ runSpecs('Original', './original', false, { gfm: false, pedantic: true });
|
@@ -56,3 +51,2 @@ runSpecs('Original', './original', false, { gfm: false, pedantic: true });
|
||||||
runSpecs('New', './new');
|
runSpecs('New', './new');
|
||||||
runSpecs('ReDOS', './redos');
|
runSpecs('ReDOS', './redos');
|
||||||
-runSpecs('Security', './security', false, { silent: true }); // silent - do not show deprecation warning
|
-runSpecs('Security', './security', false, { silent: true }); // silent - do not show deprecation warning
|
||||||
diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
|
diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
|
||||||
--- a/test/unit/Lexer-spec.js
|
--- a/test/unit/Lexer-spec.js
|
||||||
+++ b/test/unit/Lexer-spec.js
|
+++ b/test/unit/Lexer-spec.js
|
||||||
@@ -589,5 +589,5 @@ paragraph
|
@@ -635,5 +635,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('sanitize', () => {
|
- it('sanitize', () => {
|
||||||
+ /*it('sanitize', () => {
|
+ /*it('sanitize', () => {
|
||||||
expectTokens({
|
expectTokens({
|
||||||
md: '<div>html</div>',
|
md: '<div>html</div>',
|
||||||
@@ -607,5 +607,5 @@ paragraph
|
@@ -653,5 +653,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
});
|
});
|
||||||
|
|
||||||
@@ -652,5 +652,5 @@ paragraph
|
@@ -698,5 +698,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('html sanitize', () => {
|
- it('html sanitize', () => {
|
||||||
+ /*it('html sanitize', () => {
|
+ /*it('html sanitize', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
md: '<div>html</div>',
|
md: '<div>html</div>',
|
||||||
@@ -660,5 +660,5 @@ paragraph
|
@@ -706,5 +706,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
|
|
||||||
it('link', () => {
|
it('link', () => {
|
||||||
@@ -971,5 +971,5 @@ paragraph
|
@@ -1017,5 +1017,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('autolink mangle email', () => {
|
- it('autolink mangle email', () => {
|
||||||
+ /*it('autolink mangle email', () => {
|
+ /*it('autolink mangle email', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
md: '<test@example.com>',
|
md: '<test@example.com>',
|
||||||
@@ -991,5 +991,5 @@ paragraph
|
@@ -1037,5 +1037,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
|
|
||||||
it('url', () => {
|
it('url', () => {
|
||||||
@@ -1028,5 +1028,5 @@ paragraph
|
@@ -1074,5 +1074,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- it('url mangle email', () => {
|
- it('url mangle email', () => {
|
||||||
+ /*it('url mangle email', () => {
|
+ /*it('url mangle email', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
md: 'test@example.com',
|
md: 'test@example.com',
|
||||||
@@ -1048,5 +1048,5 @@ paragraph
|
@@ -1094,5 +1094,5 @@ paragraph
|
||||||
]
|
]
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
+ });*/
|
+ });*/
|
||||||
});
|
});
|
||||||
|
|
||||||
@@ -1064,5 +1064,5 @@ paragraph
|
@@ -1110,5 +1110,5 @@ paragraph
|
||||||
});
|
});
|
||||||
|
|
||||||
- describe('smartypants', () => {
|
- describe('smartypants', () => {
|
||||||
+ /*describe('smartypants', () => {
|
+ /*describe('smartypants', () => {
|
||||||
it('single quotes', () => {
|
it('single quotes', () => {
|
||||||
expectInlineTokens({
|
expectInlineTokens({
|
||||||
@@ -1134,5 +1134,5 @@ paragraph
|
@@ -1180,5 +1180,5 @@ paragraph
|
||||||
});
|
});
|
||||||
});
|
});
|
||||||
- });
|
- });
|
||||||
|
|||||||
@@ -86,8 +86,6 @@ function have() {
 python -c "import $1; $1; $1.__version__"
 }

-mv copyparty/web/deps/marked.full.js.gz srv/ || true
-
 . buildenv/bin/activate
 have setuptools
 have wheel
@@ -16,9 +16,6 @@ help() { exec cat <<'EOF'
 #
 # `no-sh` makes just the python sfx, skips the sh/unix sfx
 #
-# `no-ogv` saves ~192k by removing the opus/vorbis audio codecs
-# (only affects apple devices; everything else has native support)
-#
 # `no-cm` saves ~82k by removing easymde/codemirror
 # (the fancy markdown editor)
 #
@@ -75,7 +72,6 @@ while [ ! -z "$1" ]; do
 clean) clean=1 ; ;;
 re) repack=1 ; ;;
 gz) use_gz=1 ; ;;
-no-ogv) no_ogv=1 ; ;;
 no-fnt) no_fnt=1 ; ;;
 no-hl) no_hl=1 ; ;;
 no-dd) no_dd=1 ; ;;
@@ -218,9 +214,6 @@ cat have | while IFS= read -r x; do
 done
 rm have

-[ $no_ogv ] &&
-rm -rf copyparty/web/deps/{dynamicaudio,ogv}*
-
 [ $no_cm ] && {
 rm -rf copyparty/web/mde.* copyparty/web/deps/easymde*
 echo h > copyparty/web/mde.html
@@ -35,8 +35,6 @@ ver="$1"
 exit 1
 }

-mv copyparty/web/deps/marked.full.js.gz srv/ || true
-
 mkdir -p dist
 zip_path="$(pwd)/dist/copyparty-$ver.zip"
 tgz_path="$(pwd)/dist/copyparty-$ver.tar.gz"
@@ -7,8 +7,9 @@ v=$1
 printf '%s\n' "$v" | grep -qE '^[0-9\.]+$' || exit 1
 grep -E "(${v//./, })" ../copyparty/__version__.py || exit 1

+git push all
 git tag v$v
-git push origin --tags
+git push all --tags

 rm -rf ../dist

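The release-script hunk above switches from `git push origin --tags` to pushing a remote named `all`. The diff does not show how that remote is configured; a common setup (sketched here with placeholder URLs, not copyparty's actual remotes) is a single remote carrying several push URLs, so one `git push all` fans out to every mirror:

```shell
# hypothetical mirror setup behind "git push all": one fetch URL,
# multiple push URLs on a remote named "all" (URLs are placeholders)
cd "$(mktemp -d)"
git init -q repo && cd repo
git remote add all https://example.com/upstream/copyparty
git remote set-url --add --push all https://example.com/mirror1/copyparty
git remote set-url --add --push all https://example.com/mirror2/copyparty
git remote -v  # one (fetch) line, two (push) lines for "all"
```

Once push URLs are added explicitly, the original fetch URL is no longer used for pushing, so `git remote -v` lists it under `(fetch)` only.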
@@ -49,14 +49,6 @@ copyparty/web/deps/easymde.js,
 copyparty/web/deps/marked.js,
 copyparty/web/deps/mini-fa.css,
 copyparty/web/deps/mini-fa.woff,
-copyparty/web/deps/ogv-decoder-audio-opus-wasm.js,
-copyparty/web/deps/ogv-decoder-audio-opus-wasm.wasm,
-copyparty/web/deps/ogv-decoder-audio-vorbis-wasm.js,
-copyparty/web/deps/ogv-decoder-audio-vorbis-wasm.wasm,
-copyparty/web/deps/ogv-demuxer-ogg-wasm.js,
-copyparty/web/deps/ogv-demuxer-ogg-wasm.wasm,
-copyparty/web/deps/ogv-worker-audio.js,
-copyparty/web/deps/ogv.js,
 copyparty/web/deps/prism.js,
 copyparty/web/deps/prism.css,
 copyparty/web/deps/prismd.css,
@@ -29,6 +29,9 @@ class Cfg(Namespace):
 v=v or [],
 c=c,
 rproxy=0,
+rsp_slp=0,
+s_wr_slp=0,
+s_wr_sz=512 * 1024,
 ed=False,
 nw=False,
 unpost=600,
@@ -48,6 +51,7 @@ class Cfg(Namespace):
 mte="a",
 mth="",
 textfiles="",
+doctitle="",
 hist=None,
 no_idx=None,
 no_hash=None,
@@ -23,6 +23,7 @@ class Cfg(Namespace):
 "mtp": [],
 "mte": "a",
 "mth": "",
+"doctitle": "",
 "hist": None,
 "no_idx": None,
 "no_hash": None,
@@ -31,6 +32,9 @@ class Cfg(Namespace):
 "no_voldump": True,
 "re_maxage": 0,
 "rproxy": 0,
+"rsp_slp": 0,
+"s_wr_slp": 0,
+"s_wr_sz": 512 * 1024,
 }
 ex.update(ex2)
 super(Cfg, self).__init__(a=a or [], v=v or [], c=c, **ex)
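The two test hunks above extend a `Namespace`-based config stub with defaults for `doctitle`, `rsp_slp`, `s_wr_slp`, and `s_wr_sz`. A minimal sketch of that pattern, reproducing only the fields visible in the diff (the full class and the meaning of the sleep/size knobs are assumptions, not shown in the diff):

```python
from argparse import Namespace

# sketch of the Cfg test stub the hunks above extend; defaults merged
# with per-test overrides before being handed to Namespace
class Cfg(Namespace):
    def __init__(self, a=None, v=None, c=None, **ex2):
        ex = {
            "doctitle": "",
            "rproxy": 0,
            "rsp_slp": 0,           # assumed: artificial response delay
            "s_wr_slp": 0,          # assumed: delay between socket writes
            "s_wr_sz": 512 * 1024,  # assumed: socket write chunk size
        }
        ex.update(ex2)  # per-test overrides win over the defaults
        super(Cfg, self).__init__(a=a or [], v=v or [], c=c, **ex)

cfg = Cfg(s_wr_slp=0.1)
print(cfg.s_wr_sz)  # 524288
```

Keeping the defaults in one dict means each new server option added in the hunks above is a one-line change to the test stub.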