Compare commits

35 Commits

| Author | SHA1 | Date |
|---|---|---|
| | ec4daacf9e | |
| | f3e8308718 | |
| | 515ac5d941 | |
| | 954c7e7e50 | |
| | 67ff57f3a3 | |
| | c10c70c1e5 | |
| | 04592a98d2 | |
| | c9c4aac6cf | |
| | 8b2c7586ce | |
| | 32e22dfe84 | |
| | d70b885722 | |
| | ac6c4b13f5 | |
| | ececdad22d | |
| | bf659781b0 | |
| | 2c6bb195a4 | |
| | c032cd08b3 | |
| | 39e7a7a231 | |
| | 6e14cd2c39 | |
| | aab3baaea7 | |
| | b8453c3b4f | |
| | 6ce0e2cd5b | |
| | 76beaae7f2 | |
| | c1a7f9edbe | |
| | b5f2fe2f0a | |
| | 98a90d49cb | |
| | f55e982cb5 | |
| | 686c7defeb | |
| | 0b1e483c53 | |
| | 457d7df129 | |
| | ce776a547c | |
| | ded0567cbf | |
| | c9cac83d09 | |
| | 4fbe6b01a8 | |
| | ee9585264e | |
| | c9ffead7bf | |
README.md (48 changed lines)
@@ -20,7 +20,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down

<a href="https://f-droid.org/packages/me.ocv.partyup/"><img src="https://ocv.me/fdroid.png" alt="Get it on F-Droid" height="50" /> '' <img src="https://img.shields.io/f-droid/v/me.ocv.partyup.svg" alt="f-droid version info" /></a> '' <a href="https://github.com/9001/party-up"><img src="https://img.shields.io/github/release/9001/party-up.svg?logo=github" alt="github version info" /></a>
(basic upload client, nothing fancy yet)
(the app is **NOT** the full copyparty server! just a basic upload client, nothing fancy yet)

## readme toc

@@ -54,6 +54,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
* [other tricks](#other-tricks)
* [searching](#searching) - search by size, date, path/name, mp3-tags, ...
* [server config](#server-config) - using arguments or config files, or a mix of both
* [ftp-server](#ftp-server) - an FTP server can be started using `--ftp 3921`
* [file indexing](#file-indexing)
* [upload rules](#upload-rules) - set upload rules using volume flags
* [compress uploads](#compress-uploads) - files can be autocompressed on upload

@@ -61,6 +62,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
* [metadata from audio files](#metadata-from-audio-files) - set `-e2t` to index tags on upload
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
* [upload events](#upload-events) - trigger a script/program on each upload
* [hiding from google](#hiding-from-google) - tell search engines you dont wanna be indexed
* [complete examples](#complete-examples)
* [browser support](#browser-support) - TLDR: yes
* [client examples](#client-examples) - interact with copyparty using non-browser clients

@@ -82,7 +84,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
* [optional dependencies](#optional-dependencies) - install these to enable bonus features
* [install recommended deps](#install-recommended-deps)
* [optional gpl stuff](#optional-gpl-stuff)
* [sfx](#sfx) - there are two self-contained "binaries"
* [sfx](#sfx) - the self-contained "binary"
* [sfx repack](#sfx-repack) - reduce the size of an sfx by removing features
* [install on android](#install-on-android)
* [reporting bugs](#reporting-bugs) - ideas for context to include in bug reports

@@ -154,6 +156,7 @@ feature summary
* ☑ multiprocessing (actual multithreading)
* ☑ volumes (mountpoints)
* ☑ [accounts](#accounts-and-volumes)
* ☑ [ftp-server](#ftp-server)
* upload
* ☑ basic: plain multipart, ie6 support
* ☑ [up2k](#uploading): js, resumable, multithreaded

@@ -237,6 +240,8 @@ some improvement ideas

## general bugs

* Windows: if the up2k db is on a samba-share or network disk, you'll get unpredictable behavior if the share is disconnected for a bit
* use `--hist` or the `hist` volflag (`-v [...]:c,hist=/tmp/foo`) to place the db on a local disk instead
* all volumes must exist / be available on startup; up2k (mtp especially) gets funky otherwise
* probably more, pls let me know
@@ -621,6 +626,18 @@ using arguments or config files, or a mix of both:
* or click the `[reload cfg]` button in the control-panel when logged in as admin


## ftp-server

an FTP server can be started using `--ftp 3921`, and/or `--ftps` for explicit TLS (ftpes)

* based on [pyftpdlib](https://github.com/giampaolo/pyftpdlib)
* needs a dedicated port (cannot share with the HTTP/HTTPS API)
* uploads are not resumable -- delete and restart if necessary
* runs in active mode by default, you probably want `--ftp-pr 12000-13000`
* if you enable both `ftp` and `ftps`, the port-range will be divided in half
* some older software (filezilla on debian-stable) cannot passive-mode with TLS
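For reference, a minimal sketch of enabling the new FTP listener alongside the HTTP API, using only the flags introduced in this diff (ports and the sfx filename are placeholders, adjust to taste):

```sh
# plaintext FTP on 3921, explicit-TLS FTP on 3990,
# plus a passive-mode port range which gets split between the two
python3 copyparty-sfx.py --ftp 3921 --ftps 3990 --ftp-pr 12000-13000
```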
## file indexing

file indexing relies on two database tables, the up2k filetree (`-e2d`) and the metadata tags (`-e2t`), stored in `.hist/up2k.db`. Configuration can be done through arguments, volume flags, or a mix of both.
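As a hedged example of that mix (paths are made up for illustration), the general-bugs note above about keeping the db off network shares combines with the indexing switches like so:

```sh
# index the filetree and tags, but keep up2k.db on a local disk
python3 copyparty-sfx.py -e2d -e2t -v /mnt/nas/music::r:c,hist=/tmp/foo
```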
@@ -776,6 +793,17 @@ and it will occupy the parsing threads, so fork anything expensive, or if you wa
if this becomes popular maybe there should be a less janky way to do it actually


## hiding from google

tell search engines you dont wanna be indexed, either using the good old [robots.txt](https://www.robotstxt.org/robotstxt.html) or through copyparty settings:

* `--no-robots` adds HTTP (`X-Robots-Tag`) and HTML (`<meta>`) headers with `noindex, nofollow` globally
* volume-flag `[...]:c,norobots` does the same thing for that single volume
* volume-flag `[...]:c,robots` ALLOWS search-engine crawling for that volume, even if `--no-robots` is set globally

also, `--force-js` disables the plain HTML folder listing, making things harder to parse for search engines
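A sketch of how those settings combine (volume paths invented for illustration): everything below is served with noindex headers and json-only listings, except the one volume that opts back in with the `robots` volflag:

```sh
# global noindex + no plain-HTML listings, but allow crawlers into /pub
python3 copyparty-sfx.py --no-robots --force-js \
  -v /srv/private:private:r \
  -v /srv/pub:pub:r:c,robots
```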
## complete examples

* read-only music server with bpm and key scanning

@@ -1059,6 +1087,10 @@ mandatory deps:

install these to enable bonus features

enable ftp-server:
* for just plaintext FTP, `pyftpdlib` (is built into the SFX)
* with TLS encryption, `pyftpdlib pyopenssl`

enable music tags:
* either `mutagen` (fast, pure-python, skips a few tags, makes copyparty GPL? idk)
* or `ffprobe` (20x slower, more accurate, possibly dangerous depending on your distro and users)

@@ -1085,13 +1117,7 @@ these are standalone programs and will never be imported / evaluated by copypart

# sfx

there are two self-contained "binaries":
* [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) -- pure python, works everywhere, **recommended**
* [copyparty-sfx.sh](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.sh) -- smaller, but only for linux and macos, kinda deprecated

launch either of them (**use sfx.py on systemd**) and it'll unpack and run copyparty, assuming you have python installed of course

pls note that `copyparty-sfx.sh` will fail if you rename `copyparty-sfx.py` to `copyparty.py` and keep it in the same folder because `sys.path` is funky
the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
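Tying the two hunks above together, a plausible first run of the remaining sfx edition could look like this (the pip line is only needed for the optional FTP and tagging features listed under optional dependencies):

```sh
# grab the sfx, add the optional deps, then launch it
wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
python3 -m pip install --user pyftpdlib pyopenssl mutagen
python3 copyparty-sfx.py
```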
## sfx repack

@@ -1166,8 +1192,8 @@ mv /tmp/pe-copyparty/copyparty/web/deps/ copyparty/web/deps/
then build the sfx using any of the following examples:

```sh
./scripts/make-sfx.sh # both python and sh editions
./scripts/make-sfx.sh no-sh gz # just python with gzip
./scripts/make-sfx.sh # regular edition
./scripts/make-sfx.sh gz no-cm # gzip-compressed + no fancy markdown editor
```
@@ -4,8 +4,8 @@ set -e

# install dependencies for audio-*.py
#
# linux/alpine: requires {python3,ffmpeg,fftw}-dev py3-{wheel,pip} py3-numpy{,-dev} vamp-sdk-dev patchelf cmake
# linux/debian: requires libav{codec,device,filter,format,resample,util}-dev {libfftw3,python3}-dev python3-{numpy,pip} vamp-{plugin-sdk,examples} patchelf cmake
# linux/alpine: requires gcc g++ make cmake patchelf {python3,ffmpeg,fftw,libsndfile}-dev py3-{wheel,pip} py3-numpy{,-dev}
# linux/debian: requires libav{codec,device,filter,format,resample,util}-dev {libfftw3,python3,libsndfile1}-dev python3-{numpy,pip} vamp-{plugin-sdk,examples} patchelf cmake
# win64: requires msys2-mingw64 environment
# macos: requires macports
#
@@ -101,8 +101,11 @@ export -f dl_files
|
||||
|
||||
|
||||
github_tarball() {
|
||||
rm -rf g
|
||||
mkdir g
|
||||
cd g
|
||||
dl_text "$1" |
|
||||
tee json |
|
||||
tee ../json |
|
||||
(
|
||||
# prefer jq if available
|
||||
jq -r '.tarball_url' ||
|
||||
@@ -111,8 +114,11 @@ github_tarball() {
|
||||
awk -F\" '/"tarball_url": "/ {print$4}'
|
||||
) |
|
||||
tee /dev/stderr |
|
||||
head -n 1 |
|
||||
tr -d '\r' | tr '\n' '\0' |
|
||||
xargs -0 bash -c 'dl_files "$@"' _
|
||||
mv * ../tgz
|
||||
cd ..
|
||||
}
|
||||
|
||||
|
||||
@@ -127,6 +133,7 @@ gitlab_tarball() {
|
||||
tr \" '\n' | grep -E '\.tar\.gz$' | head -n 1
|
||||
) |
|
||||
tee /dev/stderr |
|
||||
head -n 1 |
|
||||
tr -d '\r' | tr '\n' '\0' |
|
||||
tee links |
|
||||
xargs -0 bash -c 'dl_files "$@"' _
|
||||
@@ -138,10 +145,17 @@ install_keyfinder() {
|
||||
# use msys2 in mingw-w64 mode
|
||||
# pacman -S --needed mingw-w64-x86_64-{ffmpeg,python}
|
||||
|
||||
github_tarball https://api.github.com/repos/mixxxdj/libkeyfinder/releases/latest
|
||||
[ -e $HOME/pe/keyfinder ] && {
|
||||
echo found a keyfinder build in ~/pe, skipping
|
||||
return
|
||||
}
|
||||
|
||||
tar -xf mixxxdj-libkeyfinder-*
|
||||
rm -- *.tar.gz
|
||||
cd "$td"
|
||||
github_tarball https://api.github.com/repos/mixxxdj/libkeyfinder/releases/latest
|
||||
ls -al
|
||||
|
||||
tar -xf tgz
|
||||
rm tgz
|
||||
cd mixxxdj-libkeyfinder*
|
||||
|
||||
h="$HOME"
|
||||
@@ -208,6 +222,22 @@ install_vamp() {
|
||||
|
||||
$pybin -m pip install --user vamp
|
||||
|
||||
cd "$td"
|
||||
echo '#include <vamp-sdk/Plugin.h>' | gcc -x c -c -o /dev/null - || [ -e ~/pe/vamp-sdk ] || {
|
||||
printf '\033[33mcould not find the vamp-sdk, building from source\033[0m\n'
|
||||
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/2588/vamp-plugin-sdk-2.9.0.tar.gz)
|
||||
sha512sum -c <(
|
||||
echo "7ef7f837d19a08048b059e0da408373a7964ced452b290fae40b85d6d70ca9000bcfb3302cd0b4dc76cf2a848528456f78c1ce1ee0c402228d812bd347b6983b -"
|
||||
) <vamp-plugin-sdk-2.9.0.tar.gz
|
||||
tar -xf vamp-plugin-sdk-2.9.0.tar.gz
|
||||
rm -- *.tar.gz
|
||||
ls -al
|
||||
cd vamp-plugin-sdk-*
|
||||
./configure --prefix=$HOME/pe/vamp-sdk
|
||||
make -j1 install
|
||||
}
|
||||
|
||||
cd "$td"
|
||||
have_beatroot || {
|
||||
printf '\033[33mcould not find the vamp beatroot plugin, building from source\033[0m\n'
|
||||
(dl_files yolo https://code.soundsoftware.ac.uk/attachments/download/885/beatroot-vamp-v1.0.tar.gz)
|
||||
@@ -215,8 +245,11 @@ install_vamp() {
|
||||
echo "1f444d1d58ccf565c0adfe99f1a1aa62789e19f5071e46857e2adfbc9d453037bc1c4dcb039b02c16240e9b97f444aaff3afb625c86aa2470233e711f55b6874 -"
|
||||
) <beatroot-vamp-v1.0.tar.gz
|
||||
tar -xf beatroot-vamp-v1.0.tar.gz
|
||||
rm -- *.tar.gz
|
||||
cd beatroot-vamp-v1.0
|
||||
make -f Makefile.linux -j4
|
||||
[ -e ~/pe/vamp-sdk ] &&
|
||||
sed -ri 's`^(CFLAGS :=.*)`\1 -I'$HOME'/pe/vamp-sdk/include`' Makefile.linux
|
||||
make -f Makefile.linux -j4 LDFLAGS=-L$HOME/pe/vamp-sdk/lib
|
||||
# /home/ed/vamp /home/ed/.vamp /usr/local/lib/vamp
|
||||
mkdir ~/vamp
|
||||
cp -pv beatroot-vamp.* ~/vamp/
|
||||
@@ -230,6 +263,7 @@ install_vamp() {
|
||||
|
||||
# not in use because it kinda segfaults, also no windows support
|
||||
install_soundtouch() {
|
||||
cd "$td"
|
||||
gitlab_tarball https://gitlab.com/api/v4/projects/soundtouch%2Fsoundtouch/releases
|
||||
|
||||
tar -xvf soundtouch-*
|
||||
|
||||
bin/prisonparty.sh (67 changed lines, Normal file → Executable file)
@@ -11,10 +11,16 @@ sysdirs=( /bin /lib /lib32 /lib64 /sbin /usr )
help() { cat <<'EOF'

usage:
./prisonparty.sh <ROOTDIR> <UID> <GID> [VOLDIR [VOLDIR...]] -- copyparty-sfx.py [...]"
./prisonparty.sh <ROOTDIR> <UID> <GID> [VOLDIR [VOLDIR...]] -- python3 copyparty-sfx.py [...]"

example:
./prisonparty.sh /var/lib/copyparty-jail 1000 1000 /mnt/nas/music -- copyparty-sfx.py -v /mnt/nas/music::rwmd"
./prisonparty.sh /var/lib/copyparty-jail 1000 1000 /mnt/nas/music -- python3 copyparty-sfx.py -v /mnt/nas/music::rwmd"

example for running straight from source (instead of using an sfx):
PYTHONPATH=$PWD ./prisonparty.sh /var/lib/copyparty-jail 1000 1000 /mnt/nas/music -- python3 -um copyparty -v /mnt/nas/music::rwmd"

note that if you have python modules installed as --user (such as bpm/key detectors),
you should add /home/foo/.local as a VOLDIR

EOF
exit 1
@@ -35,10 +41,20 @@ while true; do
|
||||
vols+=( "$(realpath "$v")" )
|
||||
done
|
||||
pybin="$1"; shift
|
||||
pybin="$(realpath "$pybin")"
|
||||
pybin="$(command -v "$pybin")"
|
||||
pyarg=
|
||||
while true; do
|
||||
v="$1"
|
||||
[ "${v:0:1}" = - ] || break
|
||||
pyarg="$pyarg $v"
|
||||
shift
|
||||
done
|
||||
cpp="$1"; shift
|
||||
cpp="$(realpath "$cpp")"
|
||||
cppdir="$(dirname "$cpp")"
|
||||
[ -d "$cpp" ] && cppdir="$PWD" || {
|
||||
# sfx, not module
|
||||
cpp="$(realpath "$cpp")"
|
||||
cppdir="$(dirname "$cpp")"
|
||||
}
|
||||
trap - EXIT
|
||||
|
||||
|
||||
@@ -60,11 +76,10 @@ echo
|
||||
|
||||
# remove any trailing slashes
|
||||
jail="${jail%/}"
|
||||
cppdir="${cppdir%/}"
|
||||
|
||||
|
||||
# bind-mount system directories and volumes
|
||||
printf '%s\n' "${sysdirs[@]}" "${vols[@]}" | LC_ALL=C sort |
|
||||
printf '%s\n' "${sysdirs[@]}" "${vols[@]}" | sed -r 's`/$``' | LC_ALL=C sort | uniq |
|
||||
while IFS= read -r v; do
|
||||
[ -e "$v" ] || {
|
||||
# printf '\033[1;31mfolder does not exist:\033[0m %s\n' "/$v"
|
||||
@@ -72,6 +87,7 @@ while IFS= read -r v; do
|
||||
}
|
||||
i1=$(stat -c%D.%i "$v" 2>/dev/null || echo a)
|
||||
i2=$(stat -c%D.%i "$jail$v" 2>/dev/null || echo b)
|
||||
# echo "v [$v] i1 [$i1] i2 [$i2]"
|
||||
[ $i1 = $i2 ] && continue
|
||||
|
||||
mkdir -p "$jail$v"
|
||||
@@ -79,21 +95,34 @@ while IFS= read -r v; do
|
||||
done
|
||||
|
||||
|
||||
cln() {
|
||||
rv=$?
|
||||
# cleanup if not in use
|
||||
lsof "$jail" | grep -qF "$jail" &&
|
||||
echo "chroot is in use, will not cleanup" ||
|
||||
{
|
||||
mount | grep -F " on $jail" |
|
||||
awk '{sub(/ type .*/,"");sub(/.* on /,"");print}' |
|
||||
LC_ALL=C sort -r | tee /dev/stderr | tr '\n' '\0' | xargs -r0 umount
|
||||
}
|
||||
exit $rv
|
||||
}
|
||||
trap cln EXIT
|
||||
|
||||
|
||||
# create a tmp
|
||||
mkdir -p "$jail/tmp"
|
||||
chmod 777 "$jail/tmp"
|
||||
|
||||
|
||||
# run copyparty
|
||||
/sbin/chroot --userspec=$uid:$gid "$jail" "$pybin" "$cpp" "$@" && rv=0 || rv=$?
|
||||
|
||||
|
||||
# cleanup if not in use
|
||||
lsof "$jail" | grep -qF "$jail" &&
|
||||
echo "chroot is in use, will not cleanup" ||
|
||||
{
|
||||
mount | grep -qF " on $jail" |
|
||||
awk '{sub(/ type .*/,"");sub(/.* on /,"");print}' |
|
||||
LC_ALL=C sort -r | tee /dev/stderr | tr '\n' '\0' | xargs -r0 umount
|
||||
}
|
||||
exit $rv
|
||||
export HOME=$(getent passwd $uid | cut -d: -f6)
|
||||
export USER=$(getent passwd $uid | cut -d: -f1)
|
||||
export LOGNAME="$USER"
|
||||
#echo "pybin [$pybin]"
|
||||
#echo "pyarg [$pyarg]"
|
||||
#echo "cpp [$cpp]"
|
||||
chroot --userspec=$uid:$gid "$jail" "$pybin" $pyarg "$cpp" "$@" &
|
||||
p=$!
|
||||
trap 'kill $p' INT TERM
|
||||
wait
|
||||
|
||||
@@ -445,6 +445,13 @@ def run_argparse(argv, formatter):
ap2.add_argument("--ssl-dbg", action="store_true", help="dump some tls info")
ap2.add_argument("--ssl-log", metavar="PATH", type=u, help="log master secrets")

ap2 = ap.add_argument_group('FTP options')
ap2.add_argument("--ftp", metavar="PORT", type=int, help="enable FTP server on PORT, for example 3921")
ap2.add_argument("--ftps", metavar="PORT", type=int, help="enable FTPS server on PORT, for example 3990")
ap2.add_argument("--ftp-dbg", action="store_true", help="enable debug logging")
ap2.add_argument("--ftp-nat", metavar="ADDR", type=u, help="the NAT address to use for passive connections")
ap2.add_argument("--ftp-pr", metavar="P-P", type=u, help="the range of TCP ports to use for passive connections, for example 12000-13000")

ap2 = ap.add_argument_group('opt-outs')
ap2.add_argument("-nw", action="store_true", help="disable writes (benchmark)")
ap2.add_argument("--keep-qem", action="store_true", help="do not disable quick-edit-mode on windows")

@@ -464,6 +471,8 @@ def run_argparse(argv, formatter):
ap2.add_argument("--no-logues", action="store_true", help="disable rendering .prologue/.epilogue.html into directory listings")
ap2.add_argument("--no-readme", action="store_true", help="disable rendering readme.md into directory listings")
ap2.add_argument("--vague-403", action="store_true", help="send 404 instead of 403 (security through ambiguity, very enterprise)")
ap2.add_argument("--force-js", action="store_true", help="don't send HTML folder listings, force clients to use the embedded json instead")
ap2.add_argument("--no-robots", action="store_true", help="adds http and html headers asking search engines to not index anything")
ap2 = ap.add_argument_group('yolo options')
|
||||
ap2.add_argument("--ign-ebind", action="store_true", help="continue running even if it's impossible to listen on some of the requested endpoints")
|
||||
@@ -513,6 +522,7 @@ def run_argparse(argv, formatter):
|
||||
ap2.add_argument("--no-idx", metavar="PTN", type=u, help="regex: disable indexing of matching paths during e2ds folder scans")
|
||||
ap2.add_argument("--re-maxage", metavar="SEC", type=int, default=0, help="disk rescan volume interval, 0=off, can be set per-volume with the 'scan' volflag")
|
||||
ap2.add_argument("--srch-time", metavar="SEC", type=int, default=30, help="search deadline")
|
||||
ap2.add_argument("--srch-hits", metavar="N", type=int, default=1000, help="max search results")
|
||||
|
||||
ap2 = ap.add_argument_group('metadata db options')
|
||||
ap2.add_argument("-e2t", action="store_true", help="enable metadata indexing")
|
||||
@@ -531,6 +541,7 @@ def run_argparse(argv, formatter):
|
||||
ap2 = ap.add_argument_group('ui options')
|
||||
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
|
||||
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
|
||||
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
|
||||
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
|
||||
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")
|
||||
|
||||
|
||||
@@ -1,8 +1,8 @@
# coding: utf-8

VERSION = (1, 1, 11)
CODENAME = "opus"
BUILD_DT = (2022, 1, 14)
VERSION = (1, 2, 2)
CODENAME = "ftp btw"
BUILD_DT = (2022, 3, 20)

S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -14,6 +14,7 @@ from datetime import datetime
|
||||
from .__init__ import WINDOWS
|
||||
from .util import (
|
||||
IMPLICATIONS,
|
||||
META_NOBOTS,
|
||||
uncyg,
|
||||
undot,
|
||||
unhumanize,
|
||||
@@ -394,6 +395,13 @@ class VFS(object):
|
||||
if ok:
|
||||
virt_vis[name] = vn2
|
||||
|
||||
if ".hist" in abspath:
|
||||
p = abspath.replace("\\", "/") if WINDOWS else abspath
|
||||
if p.endswith("/.hist"):
|
||||
real = [x for x in real if not x[0].startswith("up2k.")]
|
||||
elif "/.hist/th/" in p:
|
||||
real = [x for x in real if not x[0].endswith("dir.txt")]
|
||||
|
||||
return [abspath, real, virt_vis]
|
||||
|
||||
def walk(self, rel, rem, seen, uname, permsets, dots, scandir, lstat):
|
||||
@@ -444,10 +452,6 @@ class VFS(object):
|
||||
if flt:
|
||||
flt = {k: True for k in flt}
|
||||
|
||||
f1 = "{0}.hist{0}up2k.".format(os.sep)
|
||||
f2a = os.sep + "dir.txt"
|
||||
f2b = "{0}.hist{0}".format(os.sep)
|
||||
|
||||
# if multiselect: add all items to archive root
|
||||
# if single folder: the folder itself is the top-level item
|
||||
folder = "" if flt else (vrem.split("/")[-1] or "top")
|
||||
@@ -483,13 +487,6 @@ class VFS(object):
|
||||
for x in rm:
|
||||
del vd[x]
|
||||
|
||||
# up2k filetring based on actual abspath
|
||||
files = [
|
||||
x
|
||||
for x in files
|
||||
if f1 not in x[1] and (not x[1].endswith(f2a) or f2b not in x[1])
|
||||
]
|
||||
|
||||
for f in [{"vp": v, "ap": a, "st": n[1]} for v, a, n in files]:
|
||||
yield f
|
||||
|
||||
@@ -865,6 +862,19 @@ class AuthSrv(object):
|
||||
if use:
|
||||
vol.lim = lim
|
||||
|
||||
if self.args.no_robots:
|
||||
for vol in vfs.all_vols.values():
|
||||
# volflag "robots" overrides global "norobots", allowing indexing by search engines for this vol
|
||||
if not vol.flags.get("robots"):
|
||||
vol.flags["norobots"] = True
|
||||
|
||||
for vol in vfs.all_vols.values():
|
||||
h = [vol.flags.get("html_head", self.args.html_head)]
|
||||
if vol.flags.get("norobots"):
|
||||
h.insert(0, META_NOBOTS)
|
||||
|
||||
vol.flags["html_head"] = "\n".join([x for x in h if x])
|
||||
|
||||
for vol in vfs.all_vols.values():
|
||||
fk = vol.flags.get("fk")
|
||||
if fk:
|
||||
|
||||
@@ -18,10 +18,6 @@ def listdir(p="."):
|
||||
return [fsdec(x) for x in os.listdir(fsenc(p))]
|
||||
|
||||
|
||||
def lstat(p):
|
||||
return os.lstat(fsenc(p))
|
||||
|
||||
|
||||
def makedirs(name, mode=0o755, exist_ok=True):
|
||||
bname = fsenc(name)
|
||||
try:
|
||||
@@ -60,3 +56,12 @@ def utime(p, times=None, follow_symlinks=True):
|
||||
return os.utime(fsenc(p), times, follow_symlinks=follow_symlinks)
|
||||
else:
|
||||
return os.utime(fsenc(p), times)
|
||||
|
||||
|
||||
if hasattr(os, "lstat"):
|
||||
|
||||
def lstat(p):
|
||||
return os.lstat(fsenc(p))
|
||||
|
||||
else:
|
||||
lstat = stat
|
||||
|
||||
@@ -36,5 +36,9 @@ def islink(p):
|
||||
return os.path.islink(fsenc(p))
|
||||
|
||||
|
||||
def lexists(p):
|
||||
return os.path.lexists(fsenc(p))
|
||||
|
||||
|
||||
def realpath(p):
|
||||
return fsdec(os.path.realpath(fsenc(p)))
|
||||
|
||||
copyparty/ftpd.py (374 changed lines, Normal file)
@@ -0,0 +1,374 @@
|
||||
# coding: utf-8
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import os
|
||||
import sys
|
||||
import stat
|
||||
import time
|
||||
import logging
|
||||
import threading
|
||||
|
||||
from .__init__ import E, PY2
|
||||
from .util import Pebkac, fsenc, exclude_dotfiles
|
||||
from .bos import bos
|
||||
|
||||
try:
|
||||
from pyftpdlib.ioloop import IOLoop
|
||||
except ImportError:
|
||||
p = os.path.join(E.mod, "vend")
|
||||
print("loading asynchat from " + p)
|
||||
sys.path.append(p)
|
||||
from pyftpdlib.ioloop import IOLoop
|
||||
|
||||
from pyftpdlib.authorizers import DummyAuthorizer, AuthenticationFailed
|
||||
from pyftpdlib.filesystems import AbstractedFS, FilesystemError
|
||||
from pyftpdlib.handlers import FTPHandler
|
||||
from pyftpdlib.servers import FTPServer
|
||||
from pyftpdlib.log import config_logging
|
||||
|
||||
|
||||
try:
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .svchub import SvcHub
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
|
||||
class FtpAuth(DummyAuthorizer):
|
||||
def __init__(self):
|
||||
super(FtpAuth, self).__init__()
|
||||
self.hub = None # type: SvcHub
|
||||
|
||||
def validate_authentication(self, username, password, handler):
|
||||
asrv = self.hub.asrv
|
||||
if username == "anonymous":
|
||||
password = ""
|
||||
|
||||
uname = "*"
|
||||
if password:
|
||||
uname = asrv.iacct.get(password, None)
|
||||
|
||||
handler.username = uname
|
||||
|
||||
if password and not uname:
|
||||
raise AuthenticationFailed("Authentication failed.")
|
||||
|
||||
def get_home_dir(self, username):
|
||||
return "/"
|
||||
|
||||
def has_user(self, username):
|
||||
asrv = self.hub.asrv
|
||||
return username in asrv.acct
|
||||
|
||||
def has_perm(self, username, perm, path=None):
|
||||
return True # handled at filesystem layer
|
||||
|
||||
def get_perms(self, username):
|
||||
return "elradfmwMT"
|
||||
|
||||
def get_msg_login(self, username):
|
||||
return "sup {}".format(username)
|
||||
|
||||
def get_msg_quit(self, username):
|
||||
return "cya"
|
||||
|
||||
|
||||
class FtpFs(AbstractedFS):
|
||||
def __init__(self, root, cmd_channel):
|
||||
self.h = self.cmd_channel = cmd_channel # type: FTPHandler
|
||||
self.hub = cmd_channel.hub # type: SvcHub
|
||||
self.args = cmd_channel.args
|
||||
|
||||
self.uname = self.hub.asrv.iacct.get(cmd_channel.password, "*")
|
||||
|
||||
self.cwd = "/" # pyftpdlib convention of leading slash
|
||||
self.root = "/var/lib/empty"
|
||||
|
||||
self.listdirinfo = self.listdir
|
||||
self.chdir(".")
|
||||
|
||||
def v2a(self, vpath, r=False, w=False, m=False, d=False):
|
||||
try:
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, r, w, m, d)
|
||||
if not vfs.realpath:
|
||||
raise FilesystemError("no filesystem mounted at this path")
|
||||
|
||||
return os.path.join(vfs.realpath, rem)
|
||||
except Pebkac as ex:
|
||||
raise FilesystemError(str(ex))
|
||||
|
||||
def rv2a(self, vpath, r=False, w=False, m=False, d=False):
|
||||
return self.v2a(os.path.join(self.cwd, vpath), r, w, m, d)
|
||||
|
||||
def ftp2fs(self, ftppath):
|
||||
# return self.v2a(ftppath)
|
||||
return ftppath # self.cwd must be vpath
|
||||
|
||||
def fs2ftp(self, fspath):
|
||||
# raise NotImplementedError()
|
||||
return fspath
|
||||
|
||||
def validpath(self, path):
|
||||
if "/.hist/" in path:
|
||||
if "/up2k." in path or path.endswith("/dir.txt"):
|
||||
raise FilesystemError("access to this file is forbidden")
|
||||
|
||||
return True
|
||||
|
||||
def open(self, filename, mode):
|
||||
r = "r" in mode
|
||||
w = "w" in mode or "a" in mode or "+" in mode
|
||||
|
||||
ap = self.rv2a(filename, r, w)
|
||||
if w and bos.path.exists(ap):
|
||||
raise FilesystemError("cannot open existing file for writing")
|
||||
|
||||
self.validpath(ap)
|
||||
return open(fsenc(ap), mode)
|
||||
|
||||
def chdir(self, path):
|
||||
self.cwd = join(self.cwd, path)
|
||||
x = self.hub.asrv.vfs.can_access(self.cwd.lstrip("/"), self.h.username)
|
||||
self.can_read, self.can_write, self.can_move, self.can_delete, self.can_get = x
|
||||
|
||||
def mkdir(self, path):
|
||||
ap = self.rv2a(path, w=True)
|
||||
bos.mkdir(ap)
|
||||
|
||||
def listdir(self, path):
|
||||
vpath = join(self.cwd, path).lstrip("/")
|
||||
try:
|
||||
vfs, rem = self.hub.asrv.vfs.get(vpath, self.uname, True, False)
|
||||
|
||||
fsroot, vfs_ls, vfs_virt = vfs.ls(
|
||||
rem, self.uname, not self.args.no_scandir, [[True], [False, True]]
|
||||
)
|
||||
vfs_ls = [x[0] for x in vfs_ls]
|
||||
vfs_ls.extend(vfs_virt.keys())
|
||||
|
||||
if not self.args.ed:
|
||||
vfs_ls = exclude_dotfiles(vfs_ls)
|
||||
|
||||
vfs_ls.sort()
|
||||
return vfs_ls
|
||||
except Exception as ex:
|
||||
if vpath:
|
||||
# display write-only folders as empty
|
||||
return []
|
||||
|
||||
# return list of volumes
|
||||
r = {x.split("/")[0]: 1 for x in self.hub.asrv.vfs.all_vols.keys()}
|
||||
return list(sorted(list(r.keys())))
|
||||
|
||||
def rmdir(self, path):
|
||||
ap = self.rv2a(path, d=True)
|
||||
bos.rmdir(ap)
|
||||
|
||||
def remove(self, path):
|
||||
if self.args.no_del:
|
||||
raise FilesystemError("the delete feature is disabled in server config")
|
||||
|
||||
vp = join(self.cwd, path).lstrip("/")
|
||||
x = self.hub.broker.put(
|
||||
True, "up2k.handle_rm", self.uname, self.h.remote_ip, [vp]
|
||||
)
|
||||
|
||||
try:
|
||||
x.get()
|
||||
except Exception as ex:
|
||||
raise FilesystemError(str(ex))
|
||||
|
||||
def rename(self, src, dst):
|
||||
if not self.can_move:
|
||||
raise FilesystemError("not allowed for user " + self.h.username)
|
||||
|
||||
if self.args.no_mv:
|
||||
m = "the rename/move feature is disabled in server config"
|
||||
raise FilesystemError(m)
|
||||
|
||||
svp = join(self.cwd, src).lstrip("/")
|
||||
dvp = join(self.cwd, dst).lstrip("/")
|
||||
x = self.hub.broker.put(True, "up2k.handle_mv", self.uname, svp, dvp)
|
||||
try:
|
||||
x.get()
|
||||
except Exception as ex:
|
||||
raise FilesystemError(str(ex))
|
||||
|
||||
def chmod(self, path, mode):
|
||||
pass
|
||||
|
||||
def stat(self, path):
|
||||
try:
|
||||
ap = self.rv2a(path, r=True)
|
||||
return bos.stat(ap)
|
||||
except:
|
||||
ap = self.rv2a(path)
|
||||
st = bos.stat(ap)
|
||||
if not stat.S_ISDIR(st.st_mode):
|
||||
raise
|
||||
|
||||
return st
|
||||
|
||||
def utime(self, path, timeval):
|
||||
ap = self.rv2a(path, w=True)
|
||||
return bos.utime(ap, (timeval, timeval))
|
||||
|
||||
def lstat(self, path):
|
||||
ap = self.rv2a(path)
|
||||
return bos.lstat(ap)
|
||||
|
||||
def isfile(self, path):
|
||||
st = self.stat(path)
|
||||
return stat.S_ISREG(st.st_mode)
|
||||
|
||||
def islink(self, path):
|
||||
ap = self.rv2a(path)
|
||||
return bos.path.islink(ap)
|
||||
|
||||
def isdir(self, path):
|
||||
try:
|
||||
st = self.stat(path)
|
||||
return stat.S_ISDIR(st.st_mode)
|
||||
except:
|
||||
return True
|
||||
|
||||
def getsize(self, path):
|
||||
ap = self.rv2a(path)
|
||||
return bos.path.getsize(ap)
|
||||
|
||||
def getmtime(self, path):
|
||||
ap = self.rv2a(path)
|
||||
return bos.path.getmtime(ap)
|
||||
|
||||
def realpath(self, path):
|
||||
return path
|
||||
|
||||
def lexists(self, path):
|
||||
ap = self.rv2a(path)
|
||||
return bos.path.lexists(ap)
|
||||
|
||||
def get_user_by_uid(self, uid):
|
||||
return "root"
|
||||
|
||||
def get_group_by_uid(self, gid):
|
||||
return "root"
|
||||
|
||||
|
||||
class FtpHandler(FTPHandler):
|
||||
abstracted_fs = FtpFs
|
||||
|
||||
def __init__(self, conn, server, ioloop=None):
|
||||
if PY2:
|
||||
FTPHandler.__init__(self, conn, server, ioloop)
|
||||
else:
|
||||
super(FtpHandler, self).__init__(conn, server, ioloop)
|
||||
|
||||
# abspath->vpath mapping to resolve log_transfer paths
|
||||
self.vfs_map = {}
|
||||
|
||||
def ftp_STOR(self, file, mode="w"):
|
||||
vp = join(self.fs.cwd, file).lstrip("/")
|
||||
ap = self.fs.v2a(vp)
|
||||
self.vfs_map[ap] = vp
|
||||
# print("ftp_STOR: {} {} => {}".format(vp, mode, ap))
|
||||
ret = FTPHandler.ftp_STOR(self, file, mode)
|
||||
# print("ftp_STOR: {} {} OK".format(vp, mode))
|
||||
return ret
|
||||
|
||||
def log_transfer(self, cmd, filename, receive, completed, elapsed, bytes):
|
||||
ap = filename.decode("utf-8", "replace")
|
||||
vp = self.vfs_map.pop(ap, None)
|
||||
# print("xfer_end: {} => {}".format(ap, vp))
|
||||
if vp:
|
||||
vp, fn = os.path.split(vp)
|
||||
vfs, rem = self.hub.asrv.vfs.get(vp, self.username, False, True)
|
||||
vfs, rem = vfs.get_dbv(rem)
|
||||
self.hub.broker.put(
|
||||
False,
|
||||
"up2k.hash_file",
|
||||
vfs.realpath,
|
||||
vfs.flags,
|
||||
rem,
|
||||
fn,
|
||||
self.remote_ip,
|
||||
time.time(),
|
||||
)
|
||||
|
||||
return FTPHandler.log_transfer(
|
||||
self, cmd, filename, receive, completed, elapsed, bytes
|
||||
)
|
||||
|
||||
|
||||
try:
|
||||
from pyftpdlib.handlers import TLS_FTPHandler
|
||||
|
||||
class SftpHandler(FtpHandler, TLS_FTPHandler):
|
||||
pass
|
||||
|
||||
except:
|
||||
pass
|
||||
|
||||
|
||||
class Ftpd(object):
|
||||
def __init__(self, hub):
|
||||
self.hub = hub
|
||||
self.args = hub.args
|
||||
|
||||
hs = []
|
||||
if self.args.ftp:
|
||||
hs.append([FtpHandler, self.args.ftp])
|
||||
if self.args.ftps:
|
||||
try:
|
||||
h = SftpHandler
|
||||
except:
|
||||
m = "\nftps requires pyopenssl;\nplease run the following:\n\n {} -m pip install --user pyopenssl\n"
|
||||
print(m.format(sys.executable))
|
||||
sys.exit(1)
|
||||
|
||||
h.certfile = os.path.join(E.cfg, "cert.pem")
|
||||
h.tls_control_required = True
|
||||
h.tls_data_required = True
|
||||
|
||||
hs.append([h, self.args.ftps])
|
||||
|
||||
for h in hs:
|
||||
h, lp = h
|
||||
h.hub = hub
|
||||
h.args = hub.args
|
||||
h.authorizer = FtpAuth()
|
||||
h.authorizer.hub = hub
|
||||
|
||||
if self.args.ftp_pr:
|
||||
p1, p2 = [int(x) for x in self.args.ftp_pr.split("-")]
|
||||
if self.args.ftp and self.args.ftps:
|
||||
# divide port range in half
|
||||
d = int((p2 - p1) / 2)
|
||||
if lp == self.args.ftp:
|
||||
p2 = p1 + d
|
||||
else:
|
||||
p1 += d + 1
|
||||
|
||||
h.passive_ports = list(range(p1, p2 + 1))
|
||||
|
||||
if self.args.ftp_nat:
|
||||
h.masquerade_address = self.args.ftp_nat
|
||||
|
||||
if self.args.ftp_dbg:
|
||||
config_logging(level=logging.DEBUG)
|
||||
|
||||
ioloop = IOLoop()
|
||||
for ip in self.args.i:
|
||||
for h, lp in hs:
|
||||
FTPServer((ip, int(lp)), h, ioloop)
|
||||
|
||||
t = threading.Thread(target=ioloop.loop)
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
|
||||
def join(p1, p2):
|
||||
w = os.path.join(p1, p2.replace("\\", "/"))
|
||||
return os.path.normpath(w).replace("\\", "/")
|
||||
@@ -65,6 +65,11 @@ class HttpCli(object):
|
||||
"Access-Control-Allow-Origin": "*",
|
||||
"Cache-Control": "no-store; max-age=0",
|
||||
}
|
||||
h = self.args.html_head
|
||||
if self.args.no_robots:
|
||||
h = META_NOBOTS + (("\n" + h) if h else "")
|
||||
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
|
||||
self.html_head = h
|
||||
|
||||
def log(self, msg, c=0):
|
||||
ptn = self.asrv.re_pwd
|
||||
@@ -93,6 +98,7 @@ class HttpCli(object):
|
||||
if ka:
|
||||
ka["ts"] = self.conn.hsrv.cachebuster()
|
||||
ka["svcname"] = self.args.doctitle
|
||||
ka["html_head"] = self.html_head
|
||||
return tpl.render(**ka)
|
||||
|
||||
return tpl
|
||||
@@ -640,7 +646,7 @@ class HttpCli(object):
|
||||
with ren_open(fn, *open_a, **params) as f:
|
||||
f, fn = f["orz"]
|
||||
path = os.path.join(fdir, fn)
|
||||
post_sz, sha_hex, sha_b64 = hashcopy(reader, f)
|
||||
post_sz, sha_hex, sha_b64 = hashcopy(reader, f, self.args.s_wr_slp)
|
||||
|
||||
if lim:
|
||||
lim.nup(self.ip)
|
||||
@@ -666,6 +672,7 @@ class HttpCli(object):
|
||||
time.time(),
|
||||
)
|
||||
|
||||
vsuf = ""
|
||||
if self.can_read and "fk" in vfs.flags:
|
||||
vsuf = "?k=" + gen_filekey(
|
||||
self.args.fk_salt,
|
||||
@@ -688,7 +695,8 @@ class HttpCli(object):
|
||||
def handle_stash(self):
|
||||
post_sz, sha_hex, sha_b64, remains, path, url = self.dump_to_file()
|
||||
spd = self._spd(post_sz)
|
||||
self.log("{} wrote {}/{} bytes to {}".format(spd, post_sz, remains, path))
|
||||
m = "{} wrote {}/{} bytes to {} # {}"
|
||||
self.log(m.format(spd, post_sz, remains, path, sha_b64[:28])) # 21
|
||||
m = "{}\n{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56], url)
|
||||
self.reply(m.encode("utf-8"))
|
||||
return True
|
||||
@@ -920,7 +928,7 @@ class HttpCli(object):
|
||||
|
||||
try:
|
||||
f.seek(cstart[0])
|
||||
post_sz, _, sha_b64 = hashcopy(reader, f)
|
||||
post_sz, _, sha_b64 = hashcopy(reader, f, self.args.s_wr_slp)
|
||||
|
||||
if sha_b64 != chash:
|
||||
m = "your chunk got corrupted somehow (received {} bytes); expected vs received hash:\n{}\n{}"
|
||||
@@ -1113,7 +1121,7 @@ class HttpCli(object):
|
||||
f, fname = f["orz"]
|
||||
abspath = os.path.join(fdir, fname)
|
||||
self.log("writing to {}".format(abspath))
|
||||
sz, sha_hex, sha_b64 = hashcopy(p_data, f)
|
||||
sz, sha_hex, sha_b64 = hashcopy(p_data, f, self.args.s_wr_slp)
|
||||
if sz == 0:
|
||||
raise Pebkac(400, "empty files in post")
|
||||
|
||||
@@ -1331,7 +1339,7 @@ class HttpCli(object):
|
||||
raise Pebkac(400, "expected body, got {}".format(p_field))
|
||||
|
||||
with open(fsenc(fp), "wb", 512 * 1024) as f:
|
||||
sz, sha512, _ = hashcopy(p_data, f)
|
||||
sz, sha512, _ = hashcopy(p_data, f, self.args.s_wr_slp)
|
||||
|
||||
if lim:
|
||||
lim.nup(self.ip)
|
||||
@@ -1675,13 +1683,15 @@ class HttpCli(object):
|
||||
|
||||
boundary = "\roll\tide"
|
||||
targs = {
|
||||
"ts": self.conn.hsrv.cachebuster(),
|
||||
"svcname": self.args.doctitle,
|
||||
"html_head": self.html_head,
|
||||
"edit": "edit" in self.uparam,
|
||||
"title": html_escape(self.vpath, crlf=True),
|
||||
"lastmod": int(ts_md * 1000),
|
||||
"md_plug": "true" if self.args.emp else "false",
|
||||
"md_chk_rate": self.args.mcr,
|
||||
"md": boundary,
|
||||
"ts": self.conn.hsrv.cachebuster(),
|
||||
"arg_base": arg_base,
|
||||
}
|
||||
html = template.render(**targs).encode("utf-8", "replace")
|
||||
@@ -1730,6 +1740,31 @@ class HttpCli(object):
|
||||
vstate = {}
|
||||
vs = {"scanning": None, "hashq": None, "tagq": None, "mtpq": None}
|
||||
|
||||
if self.uparam.get("ls") in ["v", "t", "txt"]:
|
||||
if self.uname == "*":
|
||||
txt = "howdy stranger (you're not logged in)"
|
||||
else:
|
||||
txt = "welcome back {}".format(self.uname)
|
||||
|
||||
if vstate:
|
||||
txt += "\nstatus:"
|
||||
for k in ["scanning", "hashq", "tagq", "mtpq"]:
|
||||
txt += " {}({})".format(k, vs[k])
|
||||
|
||||
if rvol:
|
||||
txt += "\nyou can browse:"
|
||||
for v in rvol:
|
||||
txt += "\n " + v
|
||||
|
||||
if wvol:
|
||||
txt += "\nyou can upload to:"
|
||||
for v in wvol:
|
||||
txt += "\n " + v
|
||||
|
||||
txt = txt.encode("utf-8", "replace") + b"\n"
|
||||
self.reply(txt, mime="text/plain; charset=utf-8")
|
||||
return True
|
||||
|
||||
html = self.j2(
|
||||
"splash",
|
||||
this=self,
|
||||
@@ -2039,6 +2074,12 @@ class HttpCli(object):
|
||||
):
|
||||
raise Pebkac(403)
|
||||
|
||||
self.html_head = vn.flags.get("html_head", "")
|
||||
if vn.flags.get("norobots"):
|
||||
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
|
||||
else:
|
||||
self.out_headers.pop("X-Robots-Tag", None)
|
||||
|
||||
is_dir = stat.S_ISDIR(st.st_mode)
|
||||
if self.can_read:
|
||||
th_fmt = self.uparam.get("th")
|
||||
@@ -2129,7 +2170,7 @@ class HttpCli(object):
|
||||
|
||||
url_suf = self.urlq({}, [])
|
||||
is_ls = "ls" in self.uparam
|
||||
is_js = self.cookies.get("js") == "y"
|
||||
is_js = self.args.force_js or self.cookies.get("js") == "y"
|
||||
|
||||
tpl = "browser"
|
||||
if "b" in self.uparam:
|
||||
@@ -2232,10 +2273,6 @@ class HttpCli(object):
|
||||
if not self.args.ed or "dots" not in self.uparam:
|
||||
vfs_ls = exclude_dotfiles(vfs_ls)
|
||||
|
||||
hidden = []
|
||||
if rem == ".hist":
|
||||
hidden = ["up2k."]
|
||||
|
||||
icur = None
|
||||
if "e2t" in vn.flags:
|
||||
idx = self.conn.get_u2idx()
|
||||
@@ -2254,8 +2291,6 @@ class HttpCli(object):
|
||||
|
||||
if fn in vfs_virt:
|
||||
fspath = vfs_virt[fn].realpath
|
||||
elif hidden and any(fn.startswith(x) for x in hidden):
|
||||
continue
|
||||
else:
|
||||
fspath = fsroot + "/" + fn
|
||||
|
||||
@@ -2372,7 +2407,7 @@ class HttpCli(object):
|
||||
|
||||
doc = self.uparam.get("doc") if self.can_read else None
|
||||
if doc:
|
||||
doc = unquotep(doc.replace("+", " "))
|
||||
doc = unquotep(doc.replace("+", " ").split("?")[0])
|
||||
j2a["docname"] = doc
|
||||
if next((x for x in files if x["name"] == doc), None):
|
||||
with open(os.path.join(abspath, doc), "rb") as f:
|
||||
|
||||
@@ -105,6 +105,11 @@ class SvcHub(object):
|
||||
|
||||
args.th_poke = min(args.th_poke, args.th_maxage, args.ac_maxage)
|
||||
|
||||
if args.ftp or args.ftps:
|
||||
from .ftpd import Ftpd
|
||||
|
||||
self.ftpd = Ftpd(self)
|
||||
|
||||
# decide which worker impl to use
|
||||
if self.check_mp_enable():
|
||||
from .broker_mp import BrokerMp as Broker
|
||||
@@ -357,7 +362,7 @@ class SvcHub(object):
|
||||
src = ansi_re.sub("", src)
|
||||
elif c:
|
||||
if isinstance(c, int):
|
||||
msg = "\033[3{}m{}".format(c, msg)
|
||||
msg = "\033[3{}m{}\033[0m".format(c, msg)
|
||||
elif "\033" not in c:
|
||||
msg = "\033[{}m{}\033[0m".format(c, msg)
|
||||
else:
|
||||
|
||||
@@ -57,13 +57,19 @@ class TcpSrv(object):
|
||||
msgs = []
|
||||
title_tab = {}
|
||||
title_vars = [x[1:] for x in self.args.wintitle.split(" ") if x.startswith("$")]
|
||||
m = "available @ http://{}:{}/ (\033[33m{}\033[0m)"
|
||||
m = "available @ {}://{}:{}/ (\033[33m{}\033[0m)"
|
||||
for ip, desc in sorted(eps.items(), key=lambda x: x[1]):
|
||||
for port in sorted(self.args.p):
|
||||
if port not in ok.get(ip, ok.get("0.0.0.0", [])):
|
||||
continue
|
||||
|
||||
msgs.append(m.format(ip, port, desc))
|
||||
proto = " http"
|
||||
if self.args.http_only:
|
||||
pass
|
||||
elif self.args.https_only or port == 443:
|
||||
proto = "https"
|
||||
|
||||
msgs.append(m.format(proto, ip, port, desc))
|
||||
|
||||
if not self.args.wintitle:
|
||||
continue
|
||||
@@ -144,10 +150,15 @@ class TcpSrv(object):
|
||||
return eps
|
||||
|
||||
r = re.compile(r"^\s+inet ([^ ]+)/.* (.*)")
|
||||
ri = re.compile(r"^\s*[0-9]+\s*:.*")
|
||||
up = False
|
||||
for ln in txt.split("\n"):
|
||||
if ri.match(ln):
|
||||
up = "UP" in re.split("[>,< ]", ln)
|
||||
|
||||
try:
|
||||
ip, dev = r.match(ln.rstrip()).groups()
|
||||
eps[ip] = dev
|
||||
eps[ip] = dev + ("" if up else ", \033[31mLINK-DOWN")
|
||||
except:
|
||||
pass
|
||||
|
||||
@@ -177,6 +188,7 @@ class TcpSrv(object):
|
||||
|
||||
def ips_windows_ipconfig(self):
|
||||
eps = {}
|
||||
offs = {}
|
||||
try:
|
||||
txt, _ = chkcmd(["ipconfig"])
|
||||
except:
|
||||
@@ -184,18 +196,29 @@ class TcpSrv(object):
|
||||
|
||||
rdev = re.compile(r"(^[^ ].*):$")
|
||||
rip = re.compile(r"^ +IPv?4? [^:]+: *([0-9\.]{7,15})$")
|
||||
roff = re.compile(r".*: Media disconnected$")
|
||||
dev = None
|
||||
for ln in txt.replace("\r", "").split("\n"):
|
||||
m = rdev.match(ln)
|
||||
if m:
|
||||
if dev and dev not in eps.values():
|
||||
offs[dev] = 1
|
||||
|
||||
dev = m.group(1).split(" adapter ", 1)[-1]
|
||||
|
||||
if dev and roff.match(ln):
|
||||
offs[dev] = 1
|
||||
dev = None
|
||||
|
||||
m = rip.match(ln)
|
||||
if m and dev:
|
||||
eps[m.group(1)] = dev
|
||||
dev = None
|
||||
|
||||
return eps
|
||||
if dev and dev not in eps.values():
|
||||
offs[dev] = 1
|
||||
|
||||
return eps, offs
|
||||
|
||||
def ips_windows_netsh(self):
|
||||
eps = {}
|
||||
@@ -215,7 +238,6 @@ class TcpSrv(object):
|
||||
m = rip.match(ln)
|
||||
if m and dev:
|
||||
eps[m.group(1)] = dev
|
||||
dev = None
|
||||
|
||||
return eps
|
||||
|
||||
@@ -223,8 +245,11 @@ class TcpSrv(object):
|
||||
if MACOS:
|
||||
eps = self.ips_macos()
|
||||
elif ANYWIN:
|
||||
eps = self.ips_windows_ipconfig() # sees more interfaces
|
||||
eps, off = self.ips_windows_ipconfig() # sees more interfaces + link state
|
||||
eps.update(self.ips_windows_netsh()) # has better names
|
||||
for k, v in eps.items():
|
||||
if v in off:
|
||||
eps[k] += ", \033[31mLINK-DOWN"
|
||||
else:
|
||||
eps = self.ips_linux()
|
||||
|
||||
|
||||
@@ -51,11 +51,11 @@ class U2idx(object):
|
||||
fhash = body["hash"]
|
||||
wark = up2k_wark_from_hashlist(self.args.salt, fsize, fhash)
|
||||
|
||||
uq = "where substr(w,1,16) = ? and w = ?"
|
||||
uq = "substr(w,1,16) = ? and w = ?"
|
||||
uv = [wark[:16], wark]
|
||||
|
||||
try:
|
||||
return self.run_query(vols, uq, uv)[0]
|
||||
return self.run_query(vols, uq, uv, True, False)[0]
|
||||
except:
|
||||
raise Pebkac(500, min_ex())
|
||||
|
||||
@@ -87,17 +87,16 @@ class U2idx(object):
|
||||
|
||||
q = ""
|
||||
va = []
|
||||
joins = ""
|
||||
have_up = False # query has up.* operands
|
||||
have_mt = False
|
||||
is_key = True
|
||||
is_size = False
|
||||
is_date = False
|
||||
field_end = "" # closing parenthesis or whatever
|
||||
kw_key = ["(", ")", "and ", "or ", "not "]
|
||||
kw_val = ["==", "=", "!=", ">", ">=", "<", "<=", "like "]
|
||||
ptn_mt = re.compile(r"^\.?[a-z_-]+$")
|
||||
mt_ctr = 0
|
||||
mt_keycmp = "substr(up.w,1,16)"
|
||||
mt_keycmp2 = None
|
||||
ptn_lc = re.compile(r" (mt[0-9]+\.v) ([=<!>]+) \? $")
|
||||
ptn_lc = re.compile(r" (mt\.v) ([=<!>]+) \? \) $")
|
||||
ptn_lcv = re.compile(r"[a-zA-Z]")
|
||||
|
||||
while True:
|
||||
@@ -133,28 +132,31 @@ class U2idx(object):
|
||||
if v == "size":
|
||||
v = "up.sz"
|
||||
is_size = True
|
||||
have_up = True
|
||||
|
||||
elif v == "date":
|
||||
v = "up.mt"
|
||||
is_date = True
|
||||
have_up = True
|
||||
|
||||
elif v == "path":
|
||||
v = "up.rd"
|
||||
v = "trim(?||up.rd,'/')"
|
||||
va.append("\nrd")
|
||||
have_up = True
|
||||
|
||||
elif v == "name":
|
||||
v = "up.fn"
|
||||
have_up = True
|
||||
|
||||
elif v == "tags" or ptn_mt.match(v):
|
||||
mt_ctr += 1
|
||||
mt_keycmp2 = "mt{}.w".format(mt_ctr)
|
||||
joins += "inner join mt mt{} on {} = {} ".format(
|
||||
mt_ctr, mt_keycmp, mt_keycmp2
|
||||
)
|
||||
mt_keycmp = mt_keycmp2
|
||||
have_mt = True
|
||||
field_end = ") "
|
||||
if v == "tags":
|
||||
v = "mt{0}.v".format(mt_ctr)
|
||||
vq = "mt.v"
|
||||
else:
|
||||
v = "+mt{0}.k = '{1}' and mt{0}.v".format(mt_ctr, v)
|
||||
vq = "+mt.k = '{}' and mt.v".format(v)
|
||||
|
||||
v = "exists(select 1 from mt where mt.w = mtw and " + vq
|
||||
|
||||
else:
|
||||
raise Pebkac(400, "invalid key [" + v + "]")
|
||||
@@ -200,6 +202,10 @@ class U2idx(object):
|
||||
va.append(v)
|
||||
is_key = True
|
||||
|
||||
if field_end:
|
||||
q += field_end
|
||||
field_end = ""
|
||||
|
||||
# lowercase tag searches
|
||||
m = ptn_lc.search(q)
|
||||
if not m or not ptn_lcv.search(unicode(v)):
|
||||
@@ -211,16 +217,16 @@ class U2idx(object):
|
||||
|
||||
field, oper = m.groups()
|
||||
if oper in ["=", "=="]:
|
||||
q += " {} like ? ".format(field)
|
||||
q += " {} like ? ) ".format(field)
|
||||
else:
|
||||
q += " lower({}) {} ? ".format(field, oper)
|
||||
q += " lower({}) {} ? ) ".format(field, oper)
|
||||
|
||||
try:
|
||||
return self.run_query(vols, joins + "where " + q, va)
|
||||
return self.run_query(vols, q, va, have_up, have_mt)
|
||||
except Exception as ex:
|
||||
raise Pebkac(500, repr(ex))
|
||||
|
||||
def run_query(self, vols, uq, uv):
|
||||
def run_query(self, vols, uq, uv, have_up, have_mt):
|
||||
done_flag = []
|
||||
self.active_id = "{:.6f}_{}".format(
|
||||
time.time(), threading.current_thread().ident
|
||||
@@ -237,16 +243,19 @@ class U2idx(object):
|
||||
thr.start()
|
||||
|
||||
if not uq or not uv:
|
||||
q = "select * from up"
|
||||
v = ()
|
||||
uq = "select * from up"
|
||||
uv = ()
|
||||
elif have_mt:
|
||||
uq = "select up.*, substr(up.w,1,16) mtw from up where " + uq
|
||||
uv = tuple(uv)
|
||||
else:
|
||||
q = "select up.* from up " + uq
|
||||
v = tuple(uv)
|
||||
uq = "select up.* from up where " + uq
|
||||
uv = tuple(uv)
|
||||
|
||||
self.log("qs: {!r} {!r}".format(q, v))
|
||||
self.log("qs: {!r} {!r}".format(uq, uv))
|
||||
|
||||
ret = []
|
||||
lim = 1000
|
||||
lim = int(self.args.srch_hits)
|
||||
taglist = {}
|
||||
for (vtop, ptop, flags) in vols:
|
||||
cur = self.get_cur(ptop)
|
||||
@@ -255,11 +264,19 @@ class U2idx(object):
|
||||
|
||||
self.active_cur = cur
|
||||
|
||||
vuv = []
|
||||
for v in uv:
|
||||
if v == "\nrd":
|
||||
v = vtop + "/"
|
||||
|
||||
vuv.append(v)
|
||||
vuv = tuple(vuv)
|
||||
|
||||
sret = []
|
||||
fk = flags.get("fk")
|
||||
c = cur.execute(q, v)
|
||||
c = cur.execute(uq, vuv)
|
||||
for hit in c:
|
||||
w, ts, sz, rd, fn, ip, at = hit
|
||||
w, ts, sz, rd, fn, ip, at = hit[:7]
|
||||
lim -= 1
|
||||
if lim <= 0:
|
||||
break
|
||||
|
||||
@@ -1137,9 +1137,9 @@ class Up2k(object):
|
||||
m = "database is version {}, this copyparty only supports versions <= {}"
|
||||
raise Exception(m.format(ver, DB_VER))
|
||||
|
||||
msg = "creating new DB (old is bad); backup: {}"
|
||||
msg = "creating new DB (old is bad); backup: "
|
||||
if ver:
|
||||
msg = "creating new DB (too old to upgrade); backup: {}"
|
||||
msg = "creating new DB (too old to upgrade); backup: "
|
||||
|
||||
cur = self._backup_db(db_path, cur, ver, msg)
|
||||
db = cur.connection
|
||||
|
||||
@@ -71,6 +71,8 @@ SYMTIME = sys.version_info >= (3, 6) and os.supports_follow_symlinks
|
||||
|
||||
HTTP_TS_FMT = "%a, %d %b %Y %H:%M:%S GMT"
|
||||
|
||||
META_NOBOTS = '<meta name="robots" content="noindex, nofollow">'
|
||||
|
||||
HTTPCODE = {
|
||||
200: "OK",
|
||||
204: "No Content",
|
||||
@@ -1164,13 +1166,15 @@ def yieldfile(fn):
|
||||
yield buf
|
||||
|
||||
|
||||
def hashcopy(fin, fout):
|
||||
def hashcopy(fin, fout, slp=0):
|
||||
hashobj = hashlib.sha512()
|
||||
tlen = 0
|
||||
for buf in fin:
|
||||
tlen += len(buf)
|
||||
hashobj.update(buf)
|
||||
fout.write(buf)
|
||||
if slp:
|
||||
time.sleep(slp)
|
||||
|
||||
digest = hashobj.digest()[:33]
|
||||
digest_b64 = base64.urlsafe_b64encode(digest).decode("utf-8")
|
||||
|
||||
@@ -37,7 +37,7 @@ pre, code, tt, #doc, #doc>code {
|
||||
display: inline-block;
|
||||
padding: .35em .5em .2em .5em;
|
||||
border-radius: 0 .3em .3em 0;
|
||||
margin: 1.3em 0 0 0;
|
||||
margin: 1.3em 0 -.2em 0;
|
||||
font-size: 1.4em;
|
||||
}
|
||||
#path #entree {
|
||||
@@ -51,7 +51,8 @@ pre, code, tt, #doc, #doc>code {
|
||||
}
|
||||
#files tbody a {
|
||||
display: block;
|
||||
padding: .3em 0;
|
||||
padding: .5em 0;
|
||||
margin: -.3em 0;
|
||||
scroll-margin-top: 45vh;
|
||||
}
|
||||
#files tr {
|
||||
@@ -110,7 +111,7 @@ a, #files tbody div a:last-child {
|
||||
}
|
||||
#files td {
|
||||
margin: 0;
|
||||
padding: .1em .5em;
|
||||
padding: .3em .5em;
|
||||
border-left: 1px solid #3c3c3c;
|
||||
}
|
||||
#files td+td+td {
|
||||
@@ -234,8 +235,8 @@ a, #files tbody div a:last-child {
|
||||
}
|
||||
#files tbody a.play {
|
||||
color: #e70;
|
||||
padding: .2em;
|
||||
margin: -.2em;
|
||||
padding: .3em;
|
||||
margin: -.3em;
|
||||
}
|
||||
#files tbody a.play.act {
|
||||
color: #720;
|
||||
|
||||
@@ -6,6 +6,7 @@
|
||||
<title>⇆🎉 {{ title }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
{{ html_head }}
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/ui.css?_={{ ts }}">
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/browser.css?_={{ ts }}">
|
||||
{%- if css %}
|
||||
|
||||
@@ -2389,6 +2389,7 @@ var showfile = (function () {
|
||||
continue;
|
||||
|
||||
td.innerHTML = '<a href="#" class="doc bri" hl="' + link.id + '">-txt-</a>';
|
||||
td.getElementsByTagName('a')[0].setAttribute('href', '?doc=' + fn);
|
||||
}
|
||||
r.mktree();
|
||||
if (em) {
|
||||
@@ -2425,6 +2426,7 @@ var showfile = (function () {
|
||||
lnh = doc[1],
|
||||
txt = doc[2],
|
||||
name = url.split('/').pop(),
|
||||
tname = uricom_dec(name)[0],
|
||||
lang = r.getlang(name),
|
||||
is_md = lang == 'md';
|
||||
|
||||
@@ -2471,13 +2473,14 @@ var showfile = (function () {
|
||||
wr.style.display = '';
|
||||
set_tabindex();
|
||||
|
||||
wintitle(tname + ' \u2014 ');
|
||||
document.documentElement.scrollTop = 0;
|
||||
var hfun = no_push ? hist_replace : hist_push;
|
||||
hfun(get_evpath() + '?doc=' + url.split('/').pop());
|
||||
|
||||
qsr('#docname');
|
||||
el = mknod('span');
|
||||
el.textContent = uricom_dec(name)[0];
|
||||
el.textContent = tname;
|
||||
el.setAttribute('id', 'docname');
|
||||
ebi('path').appendChild(el);
|
||||
|
||||
@@ -2612,6 +2615,7 @@ var thegrid = (function () {
|
||||
return;
|
||||
|
||||
hist_push(get_evpath());
|
||||
wintitle();
|
||||
}
|
||||
|
||||
var vis = has(perms, "read");
|
||||
@@ -3025,7 +3029,7 @@ document.onkeydown = function (e) {
|
||||
}
|
||||
}
|
||||
|
||||
if (ae.closest('pre')) {
|
||||
if (ae && ae.closest('pre')) {
|
||||
if (k == 'KeyA' && ctrl(e)) {
|
||||
var sel = document.getSelection(),
|
||||
ran = document.createRange();
|
||||
@@ -3219,7 +3223,7 @@ document.onkeydown = function (e) {
|
||||
clearTimeout(defer_timeout);
|
||||
clearTimeout(search_timeout);
|
||||
search_timeout = setTimeout(do_search,
|
||||
v && v.length < (is_touch ? 4 : 3) ? 600 : 200);
|
||||
v && v.length < (is_touch ? 4 : 3) ? 1000 : 500);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -3289,10 +3293,10 @@ document.onkeydown = function (e) {
|
||||
}
|
||||
|
||||
if (k == 'path' || k == 'name' || k == 'tags') {
|
||||
var not = ' ';
|
||||
var not = '';
|
||||
if (tv.slice(0, 1) == '-') {
|
||||
tv = tv.slice(1);
|
||||
not = ' not ';
|
||||
not = 'not ';
|
||||
}
|
||||
|
||||
if (tv.slice(0, 1) == '^') {
|
||||
@@ -3313,7 +3317,7 @@ document.onkeydown = function (e) {
|
||||
tv = '"' + tv + '"';
|
||||
}
|
||||
|
||||
q += k + not + 'like ' + tv;
|
||||
q += not + k + ' like ' + tv;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -3361,6 +3365,7 @@ document.onkeydown = function (e) {
|
||||
return;
|
||||
|
||||
treectl.hide();
|
||||
thegrid.setvis(true);
|
||||
|
||||
var html = mk_files_header(tagord), seen = {};
|
||||
html.push('<tbody>');
|
||||
@@ -3925,7 +3930,7 @@ var treectl = (function () {
|
||||
showfile.files.push({ 'id': id, 'name': fname });
|
||||
|
||||
if (tn.lead == '-')
|
||||
tn.lead = '<a href="#" class="doc' + (lang ? ' bri' : '') +
|
||||
tn.lead = '<a href="?doc=' + tn.href + '" class="doc' + (lang ? ' bri' : '') +
|
||||
'" hl="' + id + '" name="' + hname + '">-txt-</a>';
|
||||
|
||||
var ln = ['<tr><td>' + tn.lead + '</td><td sortv="' + sortv +
|
||||
@@ -5029,6 +5034,9 @@ ebi('path').onclick = function (e) {
|
||||
|
||||
|
||||
ebi('files').onclick = ebi('docul').onclick = function (e) {
|
||||
if (ctrl(e))
|
||||
return true;
|
||||
|
||||
var tgt = e.target.closest('a[id]');
|
||||
if (tgt && tgt.getAttribute('id').indexOf('f-') === 0 && tgt.textContent.endsWith('/')) {
|
||||
var el = treectl.find(tgt.textContent.slice(0, -1));
|
||||
|
||||
@@ -6,6 +6,7 @@
|
||||
<title>{{ title }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
{{ html_head }}
|
||||
<style>
|
||||
html{font-family:sans-serif}
|
||||
td{border:1px solid #999;border-width:1px 1px 0 0;padding:0 5px}
|
||||
|
||||
@@ -3,6 +3,7 @@
|
||||
<title>📝🎉 {{ title }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.7">
|
||||
{{ html_head }}
|
||||
<link rel="stylesheet" href="/.cpr/ui.css?_={{ ts }}">
|
||||
<link rel="stylesheet" href="/.cpr/md.css?_={{ ts }}">
|
||||
{%- if edit %}
|
||||
|
||||
@@ -3,6 +3,7 @@
|
||||
<title>📝🎉 {{ title }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.7">
|
||||
{{ html_head }}
|
||||
<link rel="stylesheet" href="/.cpr/ui.css?_={{ ts }}">
|
||||
<link rel="stylesheet" href="/.cpr/mde.css?_={{ ts }}">
|
||||
<link rel="stylesheet" href="/.cpr/deps/mini-fa.css?_={{ ts }}">
|
||||
|
||||
@@ -6,6 +6,7 @@
|
||||
<title>{{ svcname }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
{{ html_head }}
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/msg.css?_={{ ts }}">
|
||||
</head>
|
||||
|
||||
|
||||
@@ -6,6 +6,7 @@
|
||||
<title>{{ svcname }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
{{ html_head }}
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/splash.css?_={{ ts }}">
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/ui.css?_={{ ts }}">
|
||||
</head>
|
||||
|
||||
@@ -3,6 +3,12 @@ echo not a script
|
||||
exit 1
|
||||
|
||||
|
||||
##
|
||||
## add index.html banners
|
||||
|
||||
find -name index.html | sed -r 's/index.html$//' | while IFS= read -r dir; do f="$dir/.prologue.html"; [ -e "$f" ] || echo '<h1><a href="index.html">open index.html</a></h1>' >"$f"; done
|
||||
|
||||
|
||||
##
## delete all partial uploads
## (supports linux/macos, probably windows+msys2)

@@ -95,6 +101,7 @@ var t=[]; var b=document.location.href.split('#')[0].slice(0, -1); document.quer
# debug md-editor line tracking
var s=mknod('style');s.innerHTML='*[data-ln]:before {content:attr(data-ln)!important;color:#f0c;background:#000;position:absolute;left:-1.5em;font-size:1rem}';document.head.appendChild(s);


##
## bash oneliners

@@ -199,6 +206,7 @@ git remote add all git@github.com:9001/copyparty.git
git remote set-url --add --push all git@gitlab.com:9001/copyparty.git
git remote set-url --add --push all git@github.com:9001/copyparty.git


##
## http 206

@@ -12,21 +12,18 @@ set -e
#
# output summary (filesizes and contents):
#
# 535672 copyparty-extras/sfx-full/copyparty-sfx.sh
# 550760 copyparty-extras/sfx-full/copyparty-sfx.py
# `- original unmodified sfx from github
#
# 572923 copyparty-extras/sfx-full/copyparty-sfx-gz.py
# `- unmodified but recompressed from bzip2 to gzip
#
# 341792 copyparty-extras/sfx-ent/copyparty-sfx.sh
# 353975 copyparty-extras/sfx-ent/copyparty-sfx.py
# 376934 copyparty-extras/sfx-ent/copyparty-sfx-gz.py
# `- removed iOS ogg/opus/vorbis audio decoder,
# removed the audio tray mouse cursor,
# "enterprise edition"
#
# 259288 copyparty-extras/sfx-lite/copyparty-sfx.sh
# 270004 copyparty-extras/sfx-lite/copyparty-sfx.py
# 293159 copyparty-extras/sfx-lite/copyparty-sfx-gz.py
# `- also removed the codemirror markdown editor
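
a quick, hedged way to verify the numbers above after a repack run (assuming the copyparty-extras layout shown in the summary), using the same `wc -c` size check the build scripts use elsewhere:

wc -c copyparty-extras/sfx-*/copyparty-sfx* | sort -n
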
@@ -81,7 +78,7 @@ cache="$od/.copyparty-repack.cache"
# fallback to awk (sorry)
awk -F\" '/"browser_download_url".*(\.tar\.gz|-sfx\.)/ {print$4}'
) |
grep -E '(sfx\.(sh|py)|tar\.gz)$' |
grep -E '(sfx\.py|tar\.gz)$' |
tee /dev/stderr |
tr -d '\r' | tr '\n' '\0' |
xargs -0 bash -c 'dl_files "$@"' _

@@ -139,11 +136,11 @@ repack() {
)
}

repack sfx-full "re gz no-sh"
repack sfx-full "re gz"
repack sfx-ent "re no-dd"
repack sfx-ent "re no-dd gz no-sh"
repack sfx-ent "re no-dd gz"
repack sfx-lite "re no-dd no-cm no-hl"
repack sfx-lite "re no-dd no-cm no-hl gz no-sh"
repack sfx-lite "re no-dd no-cm no-hl gz"


# move fuse and up2k clients into copyparty-extras/,
@@ -2,15 +2,15 @@ FROM alpine:3.15
WORKDIR /z
ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
ver_hashwasm=4.9.0 \
ver_marked=4.0.10 \
ver_mde=2.15.0 \
ver_codemirror=5.64.0 \
ver_marked=4.0.12 \
ver_mde=2.16.1 \
ver_codemirror=5.65.2 \
ver_fontawesome=5.13.0 \
ver_zopfli=1.0.3


# download;
# the scp url is latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
# the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
RUN mkdir -p /z/dist/no-pk \
&& wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
&& apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev brotli py3-brotli \
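
a hedged helper for future bumps of the npm packages pinned above; these are standard npm registry queries, not part of the build itself:

npm view marked version && npm view easymde version && npm view codemirror version
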
@@ -97,15 +97,14 @@ RUN cd CodeMirror-$ver_codemirror \
|
||||
|
||||
|
||||
# build easymde
|
||||
COPY easymde-marked6.patch /z/
|
||||
COPY easymde.patch /z/
|
||||
RUN cd easy-markdown-editor-$ver_mde \
|
||||
&& patch -p1 < /z/easymde-marked6.patch \
|
||||
&& patch -p1 < /z/easymde.patch \
|
||||
&& sed -ri 's`https://registry.npmjs.org/marked/-/marked-[0-9\.]+.tgz`file:/z/nodepkgs/marked`' package-lock.json \
|
||||
&& sed -ri 's`https://registry.npmjs.org/codemirror/-/codemirror-[0-9\.]+.tgz`file:/z/nodepkgs/codemirror`' package-lock.json \
|
||||
&& sed -ri 's`("marked": ")[^"]+`\1file:/z/nodepkgs/marked`' ./package.json \
|
||||
&& sed -ri 's`("codemirror": ")[^"]+`\1file:/z/nodepkgs/codemirror`' ./package.json \
|
||||
&& sed -ri 's`^var marked = require\(.marked/lib/marked.\);$`var marked = window.marked;`' src/js/easymde.js \
|
||||
&& sed -ri 's`^var marked = require\(.marked.\).marked;$`var marked = window.marked;`' src/js/easymde.js \
|
||||
&& npm install
|
||||
|
||||
COPY easymde-ln.patch /z/
|
||||
@@ -119,6 +118,7 @@ RUN cd easy-markdown-editor-$ver_mde \
|
||||
# build fontawesome and scp
|
||||
COPY mini-fa.sh /z
|
||||
COPY mini-fa.css /z
|
||||
COPY shiftbase.py /z
|
||||
RUN /bin/ash /z/mini-fa.sh
|
||||
|
||||
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
diff -NarU2 codemirror-5.59.3-orig/mode/gfm/gfm.js codemirror-5.59.3/mode/gfm/gfm.js
|
||||
--- codemirror-5.59.3-orig/mode/gfm/gfm.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/mode/gfm/gfm.js 2021-02-21 20:42:02.166174775 +0000
|
||||
diff -wNarU2 codemirror-5.65.1-orig/mode/gfm/gfm.js codemirror-5.65.1/mode/gfm/gfm.js
|
||||
--- codemirror-5.65.1-orig/mode/gfm/gfm.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/mode/gfm/gfm.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -97,5 +97,5 @@
|
||||
}
|
||||
}
|
||||
@@ -15,9 +15,9 @@ diff -NarU2 codemirror-5.59.3-orig/mode/gfm/gfm.js codemirror-5.59.3/mode/gfm/gf
|
||||
+ }*/
|
||||
stream.next();
|
||||
return null;
|
||||
diff -NarU2 codemirror-5.59.3-orig/mode/meta.js codemirror-5.59.3/mode/meta.js
|
||||
--- codemirror-5.59.3-orig/mode/meta.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/mode/meta.js 2021-02-21 20:42:54.798742821 +0000
|
||||
diff -wNarU2 codemirror-5.65.1-orig/mode/meta.js codemirror-5.65.1/mode/meta.js
|
||||
--- codemirror-5.65.1-orig/mode/meta.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/mode/meta.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -13,4 +13,5 @@
|
||||
|
||||
CodeMirror.modeInfo = [
|
||||
@@ -62,10 +62,10 @@ diff -NarU2 codemirror-5.59.3-orig/mode/meta.js codemirror-5.59.3/mode/meta.js
|
||||
+ */
|
||||
];
|
||||
// Ensure all modes have a mime property for backwards compatibility
|
||||
diff -NarU2 codemirror-5.59.3-orig/src/display/selection.js codemirror-5.59.3/src/display/selection.js
|
||||
--- codemirror-5.59.3-orig/src/display/selection.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/src/display/selection.js 2021-02-21 20:44:14.860894328 +0000
|
||||
@@ -84,29 +84,21 @@
|
||||
diff -wNarU2 codemirror-5.65.1-orig/src/display/selection.js codemirror-5.65.1/src/display/selection.js
|
||||
--- codemirror-5.65.1-orig/src/display/selection.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/src/display/selection.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -96,29 +96,21 @@
|
||||
let order = getOrder(lineObj, doc.direction)
|
||||
iterateBidiSections(order, fromArg || 0, toArg == null ? lineLen : toArg, (from, to, dir, i) => {
|
||||
- let ltr = dir == "ltr"
|
||||
@@ -105,24 +105,24 @@ diff -NarU2 codemirror-5.59.3-orig/src/display/selection.js codemirror-5.59.3/sr
|
||||
+ botRight = openEnd && last ? rightSide : toPos.right
|
||||
add(topLeft, fromPos.top, topRight - topLeft, fromPos.bottom)
|
||||
if (fromPos.bottom < toPos.top) add(leftSide, fromPos.bottom, null, toPos.top)
|
||||
diff -NarU2 codemirror-5.59.3-orig/src/input/ContentEditableInput.js codemirror-5.59.3/src/input/ContentEditableInput.js
|
||||
--- codemirror-5.59.3-orig/src/input/ContentEditableInput.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/src/input/ContentEditableInput.js 2021-02-21 20:44:33.273953867 +0000
|
||||
@@ -399,4 +399,5 @@
|
||||
diff -wNarU2 codemirror-5.65.1-orig/src/input/ContentEditableInput.js codemirror-5.65.1/src/input/ContentEditableInput.js
|
||||
--- codemirror-5.65.1-orig/src/input/ContentEditableInput.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/src/input/ContentEditableInput.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -400,4 +400,5 @@
|
||||
let info = mapFromLineView(view, line, pos.line)
|
||||
|
||||
+ /*
|
||||
let order = getOrder(line, cm.doc.direction), side = "left"
|
||||
if (order) {
|
||||
@@ -404,4 +405,5 @@
|
||||
@@ -405,4 +406,5 @@
|
||||
side = partPos % 2 ? "right" : "left"
|
||||
}
|
||||
+ */
|
||||
let result = nodeAndOffsetInLineMap(info.map, pos.ch, side)
|
||||
result.offset = result.collapse == "right" ? result.end : result.start
|
||||
diff -NarU2 codemirror-5.59.3-orig/src/input/movement.js codemirror-5.59.3/src/input/movement.js
|
||||
--- codemirror-5.59.3-orig/src/input/movement.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/src/input/movement.js 2021-02-21 20:45:12.763093671 +0000
|
||||
diff -wNarU2 codemirror-5.65.1-orig/src/input/movement.js codemirror-5.65.1/src/input/movement.js
|
||||
--- codemirror-5.65.1-orig/src/input/movement.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/src/input/movement.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -15,4 +15,5 @@
|
||||
|
||||
export function endOfLine(visually, cm, lineObj, lineNo, dir) {
|
||||
@@ -146,9 +146,16 @@ diff -NarU2 codemirror-5.59.3-orig/src/input/movement.js codemirror-5.59.3/src/i
|
||||
return null
|
||||
+ */
|
||||
}
|
||||
diff -NarU2 codemirror-5.59.3-orig/src/line/line_data.js codemirror-5.59.3/src/line/line_data.js
|
||||
--- codemirror-5.59.3-orig/src/line/line_data.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/src/line/line_data.js 2021-02-21 20:45:36.472549599 +0000
|
||||
diff -wNarU2 codemirror-5.65.1-orig/src/line/line_data.js codemirror-5.65.1/src/line/line_data.js
|
||||
--- codemirror-5.65.1-orig/src/line/line_data.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/src/line/line_data.js 2022-02-09 22:54:11.542722046 +0100
|
||||
@@ -3,5 +3,5 @@
|
||||
import { elt, eltP, joinClasses } from "../util/dom.js"
|
||||
import { eventMixin, signal } from "../util/event.js"
|
||||
-import { hasBadBidiRects, zeroWidthElement } from "../util/feature_detection.js"
|
||||
+import { zeroWidthElement } from "../util/feature_detection.js"
|
||||
import { lst, spaceStr } from "../util/misc.js"
|
||||
|
||||
@@ -79,6 +79,6 @@
|
||||
// Optionally wire in some hacks into the token-rendering
|
||||
// algorithm, to deal with browser quirks.
|
||||
@@ -158,10 +165,10 @@ diff -NarU2 codemirror-5.59.3-orig/src/line/line_data.js codemirror-5.59.3/src/l
|
||||
+ // builder.addToken = buildTokenBadBidi(builder.addToken, order)
|
||||
builder.map = []
|
||||
let allowFrontierUpdate = lineView != cm.display.externalMeasured && lineNo(line)
|
||||
diff -NarU2 codemirror-5.59.3-orig/src/measurement/position_measurement.js codemirror-5.59.3/src/measurement/position_measurement.js
|
||||
--- codemirror-5.59.3-orig/src/measurement/position_measurement.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/src/measurement/position_measurement.js 2021-02-21 20:50:52.372945293 +0000
|
||||
@@ -380,5 +380,6 @@
|
||||
diff -wNarU2 codemirror-5.65.1-orig/src/measurement/position_measurement.js codemirror-5.65.1/src/measurement/position_measurement.js
|
||||
--- codemirror-5.65.1-orig/src/measurement/position_measurement.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/src/measurement/position_measurement.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -382,5 +382,6 @@
|
||||
sticky = "after"
|
||||
}
|
||||
- if (!order) return get(sticky == "before" ? ch - 1 : ch, sticky == "before")
|
||||
@@ -169,39 +176,39 @@ diff -NarU2 codemirror-5.59.3-orig/src/measurement/position_measurement.js codem
|
||||
+ /*
|
||||
|
||||
function getBidi(ch, partPos, invert) {
|
||||
@@ -391,4 +392,5 @@
|
||||
@@ -393,4 +394,5 @@
|
||||
if (other != null) val.other = getBidi(ch, other, sticky != "before")
|
||||
return val
|
||||
+ */
|
||||
}
|
||||
|
||||
@@ -468,4 +470,5 @@
|
||||
@@ -470,4 +472,5 @@
|
||||
let begin = 0, end = lineObj.text.length, ltr = true
|
||||
|
||||
+ /*
|
||||
let order = getOrder(lineObj, cm.doc.direction)
|
||||
// If the line isn't plain left-to-right text, first figure out
|
||||
@@ -482,4 +485,5 @@
|
||||
@@ -484,4 +487,5 @@
|
||||
end = ltr ? part.to : part.from - 1
|
||||
}
|
||||
+ */
|
||||
|
||||
// A binary search to find the first character whose bounding box
|
||||
@@ -526,4 +530,5 @@
|
||||
@@ -528,4 +532,5 @@
|
||||
}
|
||||
|
||||
+/*
|
||||
function coordsBidiPart(cm, lineObj, lineNo, preparedMeasure, order, x, y) {
|
||||
// Bidi parts are sorted left-to-right, and in a non-line-wrapping
|
||||
@@ -580,4 +585,5 @@
|
||||
@@ -582,4 +587,5 @@
|
||||
return part
|
||||
}
|
||||
+*/
|
||||
|
||||
let measureText
|
||||
diff -NarU2 codemirror-5.59.3-orig/src/util/bidi.js codemirror-5.59.3/src/util/bidi.js
|
||||
--- codemirror-5.59.3-orig/src/util/bidi.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/src/util/bidi.js 2021-02-21 20:52:18.168092225 +0000
|
||||
diff -wNarU2 codemirror-5.65.1-orig/src/util/bidi.js codemirror-5.65.1/src/util/bidi.js
|
||||
--- codemirror-5.65.1-orig/src/util/bidi.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/src/util/bidi.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -4,5 +4,5 @@
|
||||
|
||||
export function iterateBidiSections(order, from, to, f) {
|
||||
@@ -259,9 +266,9 @@ diff -NarU2 codemirror-5.59.3-orig/src/util/bidi.js codemirror-5.59.3/src/util/b
|
||||
- return order
|
||||
+ return false;
|
||||
}
|
||||
diff -NarU2 codemirror-5.59.3-orig/src/util/feature_detection.js codemirror-5.59.3/src/util/feature_detection.js
|
||||
--- codemirror-5.59.3-orig/src/util/feature_detection.js 2021-02-20 21:24:57.000000000 +0000
|
||||
+++ codemirror-5.59.3/src/util/feature_detection.js 2021-02-21 20:49:22.191269270 +0000
|
||||
diff -wNarU2 codemirror-5.65.1-orig/src/util/feature_detection.js codemirror-5.65.1/src/util/feature_detection.js
|
||||
--- codemirror-5.65.1-orig/src/util/feature_detection.js 2022-01-20 13:06:23.000000000 +0100
|
||||
+++ codemirror-5.65.1/src/util/feature_detection.js 2022-02-09 22:50:18.145862052 +0100
|
||||
@@ -25,4 +25,5 @@
|
||||
}
|
||||
|
||||
|
||||
@@ -1,52 +1,52 @@
|
||||
diff -NarU2 easy-markdown-editor-2.14.0-orig/gulpfile.js easy-markdown-editor-2.14.0/gulpfile.js
|
||||
--- easy-markdown-editor-2.14.0-orig/gulpfile.js 2021-02-14 12:11:48.000000000 +0000
|
||||
+++ easy-markdown-editor-2.14.0/gulpfile.js 2021-02-21 20:55:37.134701007 +0000
|
||||
diff -wNarU2 easy-markdown-editor-2.16.1-orig/gulpfile.js easy-markdown-editor-2.16.1/gulpfile.js
|
||||
--- easy-markdown-editor-2.16.1-orig/gulpfile.js 2022-01-14 23:27:44.000000000 +0100
|
||||
+++ easy-markdown-editor-2.16.1/gulpfile.js 2022-02-09 23:06:01.694592535 +0100
|
||||
@@ -25,5 +25,4 @@
|
||||
'./node_modules/codemirror/lib/codemirror.css',
|
||||
'./src/css/*.css',
|
||||
- './node_modules/codemirror-spell-checker/src/css/spell-checker.css',
|
||||
];
|
||||
|
||||
diff -NarU2 easy-markdown-editor-2.14.0-orig/package.json easy-markdown-editor-2.14.0/package.json
|
||||
--- easy-markdown-editor-2.14.0-orig/package.json 2021-02-14 12:11:48.000000000 +0000
|
||||
+++ easy-markdown-editor-2.14.0/package.json 2021-02-21 20:55:47.761190082 +0000
|
||||
@@ -21,5 +21,4 @@
|
||||
"dependencies": {
|
||||
"codemirror": "^5.59.2",
|
||||
diff -wNarU2 easy-markdown-editor-2.16.1-orig/package.json easy-markdown-editor-2.16.1/package.json
|
||||
--- easy-markdown-editor-2.16.1-orig/package.json 2022-01-14 23:27:44.000000000 +0100
|
||||
+++ easy-markdown-editor-2.16.1/package.json 2022-02-09 23:06:24.778501888 +0100
|
||||
@@ -23,5 +23,4 @@
|
||||
"@types/marked": "^4.0.1",
|
||||
"codemirror": "^5.63.1",
|
||||
- "codemirror-spell-checker": "1.1.2",
|
||||
"marked": "^2.0.0"
|
||||
"marked": "^4.0.10"
|
||||
},
|
||||
diff -NarU2 easy-markdown-editor-2.14.0-orig/src/js/easymde.js easy-markdown-editor-2.14.0/src/js/easymde.js
|
||||
--- easy-markdown-editor-2.14.0-orig/src/js/easymde.js 2021-02-14 12:11:48.000000000 +0000
|
||||
+++ easy-markdown-editor-2.14.0/src/js/easymde.js 2021-02-21 20:57:09.143171536 +0000
|
||||
diff -wNarU2 easy-markdown-editor-2.16.1-orig/src/js/easymde.js easy-markdown-editor-2.16.1/src/js/easymde.js
|
||||
--- easy-markdown-editor-2.16.1-orig/src/js/easymde.js 2022-01-14 23:27:44.000000000 +0100
|
||||
+++ easy-markdown-editor-2.16.1/src/js/easymde.js 2022-02-09 23:07:21.203131415 +0100
|
||||
@@ -12,5 +12,4 @@
|
||||
require('codemirror/mode/gfm/gfm.js');
|
||||
require('codemirror/mode/xml/xml.js');
|
||||
-var CodeMirrorSpellChecker = require('codemirror-spell-checker');
|
||||
var marked = require('marked/lib/marked');
|
||||
var marked = require('marked').marked;
|
||||
|
||||
@@ -1762,9 +1761,4 @@
|
||||
@@ -1816,9 +1815,4 @@
|
||||
options.autosave.uniqueId = options.autosave.unique_id;
|
||||
|
||||
- // If overlay mode is specified and combine is not provided, default it to true
|
||||
- if (options.overlayMode && options.overlayMode.combine === undefined) {
|
||||
- options.overlayMode.combine = true;
|
||||
- options.overlayMode.combine = true;
|
||||
- }
|
||||
-
|
||||
// Update this options
|
||||
this.options = options;
|
||||
@@ -2003,28 +1997,7 @@
|
||||
@@ -2057,34 +2051,7 @@
|
||||
var mode, backdrop;
|
||||
|
||||
- // CodeMirror overlay mode
|
||||
- if (options.overlayMode) {
|
||||
- CodeMirror.defineMode('overlay-mode', function(config) {
|
||||
- return CodeMirror.overlayMode(CodeMirror.getMode(config, options.spellChecker !== false ? 'spell-checker' : 'gfm'), options.overlayMode.mode, options.overlayMode.combine);
|
||||
- });
|
||||
- CodeMirror.defineMode('overlay-mode', function (config) {
|
||||
- return CodeMirror.overlayMode(CodeMirror.getMode(config, options.spellChecker !== false ? 'spell-checker' : 'gfm'), options.overlayMode.mode, options.overlayMode.combine);
|
||||
- });
|
||||
-
|
||||
- mode = 'overlay-mode';
|
||||
- backdrop = options.parsingConfig;
|
||||
- backdrop.gitHubSpice = false;
|
||||
- mode = 'overlay-mode';
|
||||
- backdrop = options.parsingConfig;
|
||||
- backdrop.gitHubSpice = false;
|
||||
- } else {
|
||||
mode = options.parsingConfig;
|
||||
mode.name = 'gfm';
|
||||
@@ -58,31 +58,35 @@ diff -NarU2 easy-markdown-editor-2.14.0-orig/src/js/easymde.js easy-markdown-edi
|
||||
- backdrop.name = 'gfm';
|
||||
- backdrop.gitHubSpice = false;
|
||||
-
|
||||
- CodeMirrorSpellChecker({
|
||||
- codeMirrorInstance: CodeMirror,
|
||||
- });
|
||||
- if (typeof options.spellChecker === 'function') {
|
||||
- options.spellChecker({
|
||||
- codeMirrorInstance: CodeMirror,
|
||||
- });
|
||||
- } else {
|
||||
- CodeMirrorSpellChecker({
|
||||
- codeMirrorInstance: CodeMirror,
|
||||
- });
|
||||
- }
|
||||
- }
|
||||
|
||||
// eslint-disable-next-line no-unused-vars
|
||||
diff -NarU2 easy-markdown-editor-2.14.0-orig/types/easymde.d.ts easy-markdown-editor-2.14.0/types/easymde.d.ts
|
||||
--- easy-markdown-editor-2.14.0-orig/types/easymde.d.ts 2021-02-14 12:11:48.000000000 +0000
|
||||
+++ easy-markdown-editor-2.14.0/types/easymde.d.ts 2021-02-21 20:57:42.492620979 +0000
|
||||
@@ -160,9 +160,4 @@
|
||||
diff -wNarU2 easy-markdown-editor-2.16.1-orig/types/easymde.d.ts easy-markdown-editor-2.16.1/types/easymde.d.ts
|
||||
--- easy-markdown-editor-2.16.1-orig/types/easymde.d.ts 2022-01-14 23:27:44.000000000 +0100
|
||||
+++ easy-markdown-editor-2.16.1/types/easymde.d.ts 2022-02-09 23:07:55.427605243 +0100
|
||||
@@ -167,9 +167,4 @@
|
||||
}
|
||||
|
||||
- interface OverlayModeOptions {
|
||||
- mode: CodeMirror.Mode<any>
|
||||
- combine?: boolean
|
||||
- mode: CodeMirror.Mode<any>;
|
||||
- combine?: boolean;
|
||||
- }
|
||||
-
|
||||
interface Options {
|
||||
autoDownloadFontAwesome?: boolean;
|
||||
@@ -214,7 +209,5 @@
|
||||
interface SpellCheckerOptions {
|
||||
codeMirrorInstance: CodeMirror.Editor;
|
||||
@@ -229,6 +224,4 @@
|
||||
syncSideBySidePreviewScroll?: boolean;
|
||||
|
||||
promptTexts?: PromptTexts;
|
||||
- syncSideBySidePreviewScroll?: boolean;
|
||||
- overlayMode?: OverlayModeOptions;
|
||||
-
|
||||
- overlayMode?: OverlayModeOptions
|
||||
+ syncSideBySidePreviewScroll?: boolean
|
||||
direction?: 'ltr' | 'rtl';
|
||||
}
|
||||
}
|
||||
|
||||
@@ -29,3 +29,10 @@ pyftsubset "$orig_woff" --unicodes-file=/z/icon.list --no-ignore-missing-unicode

# scp is easier, just want basic latin
pyftsubset /z/scp.woff2 --unicodes="20-7e,ab,b7,bb,2022" --no-ignore-missing-unicodes --flavor=woff2 --output-file=/z/dist/no-pk/scp.woff2 --verbose

exit 0

# kinda works but ruins hinting on windows, just use the old version of the font which has correct baseline
python3 shiftbase.py /z/dist/no-pk/scp.woff2
cd /z/dist/no-pk/
mv scp.woff2.woff2 scp.woff2
27
scripts/deps-docker/shiftbase.py
Executable file
@@ -0,0 +1,27 @@
#!/usr/bin/env python3

import sys
from fontTools.ttLib import TTFont, newTable


def main():
    woff = sys.argv[1]
    font = TTFont(woff)
    print(repr(font["hhea"].__dict__))
    print(repr(font["OS/2"].__dict__))
    # font["hhea"].ascent = round(base_asc * mul)
    # font["hhea"].descent = round(base_desc * mul)
    # font["OS/2"].usWinAscent = round(base_asc * mul)
    font["OS/2"].usWinDescent = round(font["OS/2"].usWinDescent * 1.1)
    font["OS/2"].sTypoDescender = round(font["OS/2"].sTypoDescender * 1.1)

    try:
        del font["post"].mapping["Delta#1"]
    except:
        pass

    font.save(woff + ".woff2")


if __name__ == "__main__":
    main()
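
rough usage sketch for the new script (assuming fonttools and its woff2 support are installed, e.g. python3 -m pip install fonttools brotli); the adjusted font is written next to the input as scp.woff2.woff2, which mini-fa.sh then renames back:

python3 shiftbase.py scp.woff2
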
@@ -14,8 +14,6 @@ help() { exec cat <<'EOF'
#
# `gz` creates a gzip-compressed python sfx instead of bzip2
#
# `no-sh` makes just the python sfx, skips the sh/unix sfx
#
# `no-cm` saves ~82k by removing easymde/codemirror
# (the fancy markdown editor)
#

@@ -64,8 +62,6 @@ pybin=$(command -v python3 || command -v python) || {
}

use_gz=
do_sh=1
do_py=1
zopf=2560
while [ ! -z "$1" ]; do
case $1 in

@@ -76,8 +72,6 @@ while [ ! -z "$1" ]; do
no-hl) no_hl=1 ; ;;
no-dd) no_dd=1 ; ;;
no-cm) no_cm=1 ; ;;
no-sh) do_sh= ; ;;
no-py) do_py= ; ;;
fast) zopf=100 ; ;;
*) help ; ;;
esac
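
a hedged example combining the options documented above -- `re` reuses files from a previous build, `no-cm` drops easymde/codemirror, `gz` switches the python sfx from bzip2 to gzip, and `fast` lowers the zopfli effort (zopf=100 instead of 2560):

./make-sfx.sh re no-cm gz fast
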
@@ -107,7 +101,7 @@ tmpdir="$(
[ $repack ] && {
old="$tmpdir/pe-copyparty"
echo "repack of files in $old"
cp -pR "$old/"*{dep-j2,copyparty} .
cp -pR "$old/"*{dep-j2,dep-ftp,copyparty} .
}

[ $repack ] || {

@@ -134,6 +128,27 @@ tmpdir="$(
mkdir dep-j2/
mv {markupsafe,jinja2} dep-j2/

echo collecting pyftpdlib
f="../build/pyftpdlib-1.5.6.tar.gz"
[ -e "$f" ] ||
(url=https://github.com/giampaolo/pyftpdlib/archive/refs/tags/release-1.5.6.tar.gz;
wget -O$f "$url" || curl -L "$url" >$f)

tar -zxf $f
mv pyftpdlib-release-*/pyftpdlib .
rm -rf pyftpdlib-release-* pyftpdlib/test

mkdir dep-ftp/
mv pyftpdlib dep-ftp/

echo collecting asyncore, asynchat
for n in asyncore.py asynchat.py; do
f=../build/$n
[ -e "$f" ] ||
(url=https://raw.githubusercontent.com/python/cpython/c4d45ee670c09d4f6da709df072ec80cb7dfad22/Lib/$n;
wget -O$f "$url" || curl -L "$url" >$f)
done

# msys2 tar is bad, make the best of it
echo collecting source
[ $clean ] && {

@@ -144,6 +159,12 @@ tmpdir="$(
(cd .. && tar -cf tar copyparty) && tar -xf ../tar
}
rm -f ../tar

# insert asynchat
mkdir copyparty/vend
for n in asyncore.py asynchat.py; do
awk 'NR<4||NR>27;NR==4{print"# license: https://opensource.org/licenses/ISC\n"}' ../build/$n >copyparty/vend/$n
done
}
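
a hedged sanity-check of the vendoring step above (assuming it is run from the build tmpdir after the loop has run): the vendored copy should start with the first three lines of the stdlib module, followed by the injected license pointer:

head -n 5 copyparty/vend/asynchat.py
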
|
||||
ver=
|
||||
@@ -245,7 +266,7 @@ rm have
|
||||
find | grep -E '\.py$' |
|
||||
grep -vE '__version__' |
|
||||
tr '\n' '\0' |
|
||||
xargs -0 $pybin ../scripts/uncomment.py
|
||||
xargs -0 "$pybin" ../scripts/uncomment.py
|
||||
|
||||
f=dep-j2/jinja2/constants.py
|
||||
awk '/^LOREM_IPSUM_WORDS/{o=1;print "LOREM_IPSUM_WORDS = u\"a\"";next} !o; /"""/{o=0}' <$f >t
|
||||
@@ -331,11 +352,18 @@ nf=$(ls -1 "$zdir"/arc.* | wc -l)
|
||||
|
||||
|
||||
echo gen tarlist
|
||||
for d in copyparty dep-j2; do find $d -type f; done |
|
||||
for d in copyparty dep-j2 dep-ftp; do find $d -type f; done |
|
||||
sed -r 's/(.*)\.(.*)/\2 \1/' | LC_ALL=C sort |
|
||||
sed -r 's/([^ ]*) (.*)/\2.\1/' | grep -vE '/list1?$' > list1
|
||||
|
||||
(grep -vE '\.(gz|br)$' list1; grep -E '\.(gz|br)$' list1 | shuf) >list || true
|
||||
for n in {1..50}; do
|
||||
(grep -vE '\.(gz|br)$' list1; grep -E '\.(gz|br)$' list1 | shuf) >list || true
|
||||
s=$(md5sum list | cut -c-16)
|
||||
grep -q $s "$zdir/h" && continue
|
||||
echo $s >> "$zdir/h"
|
||||
break
|
||||
done
|
||||
[ $n -eq 50 ] && exit
|
||||
|
||||
echo creating tar
|
||||
args=(--owner=1000 --group=1000)
|
||||
@@ -350,41 +378,27 @@ pe=bz2
|
||||
|
||||
echo compressing tar
|
||||
# detect best level; bzip2 -7 is usually better than -9
|
||||
[ $do_py ] && { for n in {2..9}; do cp tar t.$n; $pc -$n t.$n & done; wait; mv -v $(ls -1S t.*.$pe | tail -n 1) tar.bz2; }
|
||||
[ $do_sh ] && { for n in {2..9}; do cp tar t.$n; xz -ze$n t.$n & done; wait; mv -v $(ls -1S t.*.xz | tail -n 1) tar.xz; }
|
||||
for n in {2..9}; do cp tar t.$n; $pc -$n t.$n & done; wait; mv -v $(ls -1S t.*.$pe | tail -n 1) tar.bz2
|
||||
rm t.* || true
|
||||
exts=()
|
||||
|
||||
|
||||
[ $do_sh ] && {
|
||||
exts+=(.sh)
|
||||
echo creating unix sfx
|
||||
(
|
||||
sed "s/PACK_TS/$ts/; s/PACK_HTS/$hts/; s/CPP_VER/$ver/" <../scripts/sfx.sh |
|
||||
grep -E '^sfx_eof$' -B 9001;
|
||||
cat tar.xz
|
||||
) >$sfx_out.sh
|
||||
echo creating sfx
|
||||
|
||||
py=../scripts/sfx.py
|
||||
suf=
|
||||
[ $use_gz ] && {
|
||||
sed -r 's/"r:bz2"/"r:gz"/' <$py >$py.t
|
||||
py=$py.t
|
||||
suf=-gz
|
||||
}
|
||||
|
||||
"$pybin" $py --sfx-make tar.bz2 $ver $ts
|
||||
mv sfx.out $sfx_out$suf.py
|
||||
|
||||
[ $do_py ] && {
|
||||
echo creating generic sfx
|
||||
|
||||
py=../scripts/sfx.py
|
||||
suf=
|
||||
[ $use_gz ] && {
|
||||
sed -r 's/"r:bz2"/"r:gz"/' <$py >$py.t
|
||||
py=$py.t
|
||||
suf=-gz
|
||||
}
|
||||
|
||||
$pybin $py --sfx-make tar.bz2 $ver $ts
|
||||
mv sfx.out $sfx_out$suf.py
|
||||
|
||||
exts+=($suf.py)
|
||||
[ $use_gz ] &&
|
||||
rm $py
|
||||
}
|
||||
exts+=($suf.py)
|
||||
[ $use_gz ] &&
|
||||
rm $py
|
||||
|
||||
|
||||
chmod 755 $sfx_out*
|
||||
@@ -395,4 +409,4 @@ for ext in ${exts[@]}; do
|
||||
done
|
||||
|
||||
# apk add bash python3 tar xz bzip2
|
||||
# while true; do ./make-sfx.sh; for f in ..//dist/copyparty-sfx.{sh,py}; do mv $f $f.$(wc -c <$f | awk '{print$1}'); done; done
|
||||
# while true; do ./make-sfx.sh; f=../dist/copyparty-sfx.py; mv $f $f.$(wc -c <$f | awk '{print$1}'); done
|
||||
|
||||
@@ -4,34 +4,31 @@ set -e
|
||||
cd ~/dev/copyparty/scripts
|
||||
|
||||
v=$1
|
||||
printf '%s\n' "$v" | grep -qE '^[0-9\.]+$' || exit 1
|
||||
grep -E "(${v//./, })" ../copyparty/__version__.py || exit 1
|
||||
|
||||
git push all
|
||||
git tag v$v
|
||||
git push all --tags
|
||||
[ "$v" = sfx ] || {
|
||||
printf '%s\n' "$v" | grep -qE '^[0-9\.]+$' || exit 1
|
||||
grep -E "(${v//./, })" ../copyparty/__version__.py || exit 1
|
||||
|
||||
rm -rf ../dist
|
||||
git push all
|
||||
git tag v$v
|
||||
git push all --tags
|
||||
|
||||
./make-pypi-release.sh u
|
||||
(cd .. && python3 ./setup.py clean2)
|
||||
rm -rf ../dist
|
||||
|
||||
./make-tgz-release.sh $v
|
||||
./make-pypi-release.sh u
|
||||
(cd .. && python3 ./setup.py clean2)
|
||||
|
||||
./make-tgz-release.sh $v
|
||||
}
|
||||
|
||||
rm -f ../dist/copyparty-sfx.*
|
||||
./make-sfx.sh no-sh
|
||||
../dist/copyparty-sfx.py -h
|
||||
f=../dist/copyparty-sfx.py
|
||||
./make-sfx.sh
|
||||
$f -h
|
||||
|
||||
ar=
|
||||
while true; do
|
||||
for ((a=0; a<100; a++)); do
|
||||
for f in ../dist/copyparty-sfx.{py,sh}; do
|
||||
[ -e $f ] || continue;
|
||||
mv $f $f.$(wc -c <$f | awk '{print$1}')
|
||||
done
|
||||
./make-sfx.sh re $ar
|
||||
done
|
||||
ar=no-sh
|
||||
mv $f $f.$(wc -c <$f | awk '{print$1}')
|
||||
./make-sfx.sh re $ar
|
||||
done
|
||||
|
||||
# git tag -d v$v; git push --delete origin v$v
|
||||
|
||||
@@ -11,6 +11,7 @@ copyparty/broker_mp.py,
copyparty/broker_mpw.py,
copyparty/broker_thr.py,
copyparty/broker_util.py,
copyparty/ftpd.py,
copyparty/httpcli.py,
copyparty/httpconn.py,
copyparty/httpsrv.py,

@@ -31,6 +32,9 @@ copyparty/th_srv.py,
copyparty/u2idx.py,
copyparty/up2k.py,
copyparty/util.py,
copyparty/vend,
copyparty/vend/asynchat.py,
copyparty/vend/asyncore.py,
copyparty/web,
copyparty/web/baguettebox.js,
copyparty/web/browser.css,
@@ -368,8 +368,9 @@ def confirm(rv):
sys.exit(rv or 1)


def run(tmp, j2):
def run(tmp, j2, ftp):
msg("jinja2:", j2 or "bundled")
msg("pyftpd:", ftp or "bundled")
msg("sfxdir:", tmp)
msg()

@@ -387,9 +388,8 @@ def run(tmp, j2):
t.daemon = True
t.start()

ld = [tmp, os.path.join(tmp, "dep-j2")]
if j2:
del ld[-1]
ld = (("", ""), (j2, "dep-j2"), (ftp, "dep-ftp"))
ld = [os.path.join(tmp, b) for a, b in ld if not a]

if any([re.match(r"^-.*j[0-9]", x) for x in sys.argv]):
run_s(ld)

@@ -462,7 +462,12 @@ def main():
j2 = None

try:
run(tmp, j2)
from pyftpdlib.__init__ import __ver__ as ftp
except:
ftp = None

try:
run(tmp, j2, ftp)
except SystemExit as ex:
c = ex.code
if c not in [0, -15]:
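
a hedged illustration of the new fallback above: if a system-wide pyftpdlib is importable its version is reported, otherwise the sfx keeps the bundled dep-ftp copy on the load path:

python3 -c "
try:
    from pyftpdlib import __ver__
    print('system pyftpdlib', __ver__)
except Exception:
    print('no system pyftpdlib; the bundled dep-ftp copy will be used')
"
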
7
setup.py

@@ -112,7 +112,12 @@ args = {
"data_files": data_files,
"packages": find_packages(),
"install_requires": ["jinja2"],
"extras_require": {"thumbnails": ["Pillow"], "audiotags": ["mutagen"]},
"extras_require": {
"thumbnails": ["Pillow"],
"audiotags": ["mutagen"],
"ftpd": ["pyftpdlib"],
"ftps": ["pyopenssl"],
},
"entry_points": {"console_scripts": ["copyparty = copyparty.__main__:main"]},
"scripts": ["bin/copyparty-fuse.py", "bin/up2k.py"],
"cmdclass": {"clean2": clean2},
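
a hedged usage note for the new extras: installing from a source checkout with them enabled pulls in pyftpdlib and pyopenssl automatically:

python3 -m pip install -e ".[ftpd,ftps]"
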
@@ -52,9 +52,12 @@ class Cfg(Namespace):
|
||||
mth="",
|
||||
textfiles="",
|
||||
doctitle="",
|
||||
html_head="",
|
||||
hist=None,
|
||||
no_idx=None,
|
||||
no_hash=None,
|
||||
force_js=False,
|
||||
no_robots=False,
|
||||
js_browser=None,
|
||||
css_browser=None,
|
||||
**{k: False for k in "e2d e2ds e2dsa e2t e2ts e2tsr no_acode".split()}
|
||||
|
||||
@@ -17,13 +17,14 @@ from copyparty import util
|
||||
|
||||
class Cfg(Namespace):
|
||||
def __init__(self, a=None, v=None, c=None):
|
||||
ex = "nw e2d e2ds e2dsa e2t e2ts e2tsr no_logues no_readme no_acode"
|
||||
ex = "nw e2d e2ds e2dsa e2t e2ts e2tsr no_logues no_readme no_acode force_js no_robots"
|
||||
ex = {k: False for k in ex.split()}
|
||||
ex2 = {
|
||||
"mtp": [],
|
||||
"mte": "a",
|
||||
"mth": "",
|
||||
"doctitle": "",
|
||||
"html_head": "",
|
||||
"hist": None,
|
||||
"no_idx": None,
|
||||
"no_hash": None,
|
||||
|
||||
@@ -109,6 +109,9 @@ class VSock(object):
|
||||
self._reply += buf
|
||||
return len(buf)
|
||||
|
||||
def getsockname(self):
|
||||
return ("a", 1)
|
||||
|
||||
|
||||
class VHttpSrv(object):
|
||||
def __init__(self):
|
||||
|
||||