Compare commits


52 Commits

Author SHA1 Message Date
ed dcaf7b0a20 v1.1.5 2021-12-04 03:33:57 +01:00
ed f982cdc178 spa gridview 2021-12-04 03:31:12 +01:00
ed b265e59834 spa filetab 2021-12-04 03:25:28 +01:00
ed 4a843a6624 unflicker navpane + add client state escape hatch 2021-12-04 02:46:00 +01:00
ed 241ef5b99d preserve mtimes when juggling symlinks 2021-12-04 01:58:04 +01:00
ed f39f575a9c sort-order indicators 2021-12-03 23:53:41 +01:00
ed 1521307f1e use preferred sort on initial render, fixes #8 2021-12-03 02:07:08 +01:00
ed dd122111e6 v1.1.4 2021-11-28 04:22:05 +01:00
ed 00c177fa74 show upload eta in window title 2021-11-28 04:05:16 +01:00
ed f6c7e49eb8 u2cli: better error messages 2021-11-28 03:38:57 +01:00
ed 1a8dc3d18a add workaround for #7 after all since it was trivial 2021-11-28 00:12:19 +01:00
ed 38a163a09a better dropzone for extremely slow browsers 2021-11-28 00:11:21 +01:00
ed 8f031246d2 disable windows quickedit to avoid accidental lockups 2021-11-27 21:43:19 +01:00
ed 8f3d97dde7 indicate onclick action for audio files in grid view 2021-11-24 22:10:59 +01:00
ed 4acaf24d65 remember if media controls were open or not 2021-11-24 21:49:41 +01:00
ed 9a8dbbbcf8 another accesskey fix 2021-11-22 21:57:29 +01:00
ed a3efc4c726 encode quoted queries into raw 2021-11-22 21:53:23 +01:00
ed 0278bf328f support raw-queries with quotes 2021-11-22 20:59:07 +01:00
ed 17ddd96cc6 up2k list wasnt centered anymore 2021-11-21 22:44:11 +01:00
ed 0e82e79aea mention the eq fixing gapless albums 2021-11-20 19:33:56 +01:00
ed 30f124c061 fix forcing compression levels 2021-11-20 18:51:15 +01:00
ed e19d90fcfc add missing examples 2021-11-20 18:50:55 +01:00
ed 184bbdd23d legalese rephrasing 2021-11-20 17:58:37 +01:00
ed 30b50aec95 mention mtp readme 2021-11-20 17:51:49 +01:00
ed c3c3d81db1 add mtp plugin for exif stripping 2021-11-20 17:45:56 +01:00
ed 49b7231283 fix mojibake support in misc mtp plugins 2021-11-20 17:33:24 +01:00
ed edbedcdad3 v1.1.3 2021-11-20 02:27:09 +01:00
ed e4ae5f74e6 add tooltip indicator 2021-11-20 01:47:16 +01:00
ed 2c7ffe08d7 include sha512 as both hex and b64 in responses 2021-11-20 01:03:32 +01:00
ed 3ca46bae46 good oneliner 2021-11-20 00:20:34 +01:00
ed 7e82aaf843 simplify/improve up2k ui debounce 2021-11-20 00:03:15 +01:00
ed 315bd71adf limit turbo runahead 2021-11-20 00:01:14 +01:00
ed 2c612c9aeb ux 2021-11-19 21:31:05 +01:00
ed 36aee085f7 add timeouts to FFmpeg things 2021-11-16 22:22:09 +01:00
ed d01bb69a9c u2cli: option to ignore inaccessible files 2021-11-16 21:53:00 +01:00
ed c9b1c48c72 sizelimit registry + persist without e2d 2021-11-16 21:31:24 +01:00
ed aea3843cf2 this is just noise 2021-11-16 21:28:50 +01:00
ed 131b6f4b9a workaround chrome rendering bug 2021-11-16 21:28:36 +01:00
ed 6efb8b735a better handling of python builds without sqlite3 2021-11-16 01:13:04 +01:00
ed 223b7af2ce more iOS jank 2021-11-16 00:05:35 +01:00
ed e72c2a6982 add fastpath for using the eq as a pure gain control 2021-11-15 23:19:43 +01:00
ed dd9b93970e autoenable aac transcoding when codec missing 2021-11-15 23:18:52 +01:00
ed e4c7cd81a9 update readme 2021-11-15 20:28:53 +01:00
ed 12b3a62586 fix dumb mistakes 2021-11-15 20:13:16 +01:00
ed 2da3bdcd47 delay tooltips, fix #6 2021-11-15 03:56:17 +01:00
ed c1dccbe0ba trick iphones into preloading natively 2021-11-15 03:01:11 +01:00
ed 9629fcde68 optionally enable seeking through os controls 2021-11-15 02:47:42 +01:00
ed cae436b566 add client-option to disconnect on HTTP 304 2021-11-15 02:45:18 +01:00
ed 01714700ae more gapless fixes 2021-11-14 20:25:28 +01:00
ed 51e6c4852b retire ogvjs 2021-11-14 19:28:44 +01:00
ed b206c5d64e handle multiple simultaneous uploads of the same file 2021-11-14 15:03:11 +01:00
ed 62c3272351 add option to simulate latency 2021-11-14 15:01:20 +01:00
34 changed files with 1175 additions and 638 deletions

View File

@@ -52,7 +52,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
 * [compress uploads](#compress-uploads) - files can be autocompressed on upload
 * [database location](#database-location) - in-volume (`.hist/up2k.db`, default) or somewhere else
 * [metadata from audio files](#metadata-from-audio-files) - set `-e2t` to index tags on upload
-* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags
+* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
 * [upload events](#upload-events) - trigger a script/program on each upload
 * [complete examples](#complete-examples)
 * [browser support](#browser-support) - TLDR: yes
@@ -78,6 +78,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
 * [sfx](#sfx) - there are two self-contained "binaries"
 * [sfx repack](#sfx-repack) - reduce the size of an sfx by removing features
 * [install on android](#install-on-android)
+* [reporting bugs](#reporting-bugs) - ideas for context to include in bug reports
 * [building](#building)
 * [dev env setup](#dev-env-setup)
 * [just the sfx](#just-the-sfx)
@@ -161,9 +162,11 @@ feature summary
 * ☑ file manager (cut/paste, delete, [batch-rename](#batch-rename))
 * ☑ audio player (with OS media controls and opus transcoding)
 * ☑ image gallery with webm player
+* ☑ textfile browser with syntax hilighting
 * ☑ [thumbnails](#thumbnails)
   * ☑ ...of images using Pillow
   * ☑ ...of videos using FFmpeg
+  * ☑ ...of audio (spectrograms) using FFmpeg
 * ☑ cache eviction (max-age; maybe max-size eventually)
 * ☑ SPA (browse while uploading)
   * if you use the navpane to navigate, not folders in the file list
@@ -233,6 +236,10 @@ some improvement ideas
 ## not my bugs
+* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
+  * *future workaround:* enable the equalizer, make it all-zero, and set a negative boost to reduce the volume
+  * "future" because `AudioContext` is broken in the current iOS version (15.1), maybe one day...
 * Windows: folders cannot be accessed if the name ends with `.`
   * python or windows bug
@@ -249,6 +256,7 @@ some improvement ideas
 * is it possible to block read-access to folders unless you know the exact URL for a particular file inside?
   * yes, using the [`g` permission](#accounts-and-volumes), see the examples there
+  * you can also do this with linux filesystem permissions; `chmod 111 music` will make it possible to access files and folders inside the `music` folder but not list the immediate contents -- also works with other software, not just copyparty
 * can I make copyparty download a file to my server if I give it a URL?
   * not officially, but there is a [terrible hack](https://github.com/9001/copyparty/blob/hovudstraum/bin/mtag/wget.py) which makes it possible
@@ -563,6 +571,8 @@ and there are *two* editors
 * you can link a particular timestamp in an audio file by adding it to the URL, such as `&20` / `&20s` / `&1m20` / `&t=1:20` after the `.../#af-c8960dab`
+* enabling the audio equalizer can help make gapless albums fully gapless in some browsers (chrome), so consider leaving it on with all the values at zero
 * get a plaintext file listing by adding `?ls=t` to a URL, or a compact colored one with `?ls=v` (for unix terminals)
 * if you are using media hotkeys to switch songs and are getting tired of seeing the OSD popup which Windows doesn't let you disable, consider https://ocv.me/dev/?media-osd-bgone.ps1
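As an aside, the `?ls=t` listing mentioned in that hunk is easy to script against; a minimal sketch (server address and folder are hypothetical placeholders):

```
# fetch a plaintext directory listing via the documented ?ls=t param
import requests

print(requests.get("http://127.0.0.1:3923/music/?ls=t").text)
```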
@@ -669,6 +679,12 @@ things to note,
 * the files will be indexed after compression, so dupe-detection and file-search will not work as expected
 some examples,
+* `-v inc:inc:w:c,pk=xz,0`
+  folder named inc, shared at inc, write-only for everyone, forces xz compression at level 0
+* `-v inc:inc:w:c,pk`
+  same write-only inc, but forces gz compression (default) instead of xz
+* `-v inc:inc:w:c,gz`
+  allows (but does not force) gz compression if client uploads to `/inc?pk` or `/inc?gz` or `/inc?gz=4`
 ## database location
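To illustrate the `?gz` volflag examples above: copyparty accepts plain HTTP PUT uploads, so a client can opt into compression per request. A hedged sketch (server address is hypothetical; the three-line reply format matches the handle_stash change further down in this diff):

```
# PUT a file into the write-only /inc volume, asking for gzip level 4;
# the server replies with size, sha512-b64, and truncated sha512-hex
import requests

with open("notes.txt", "rb") as f:
    r = requests.put("http://127.0.0.1:3923/inc/notes.txt?gz=4", data=f)
print(r.text)
```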
@@ -713,7 +729,7 @@ see the beautiful mess of a dictionary in [mtag.py](https://github.com/9001/copy
 ## file parser plugins
-provide custom parsers to index additional tags
+provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
 copyparty can invoke external programs to collect additional metadata for files using `mtp` (either as argument or volume flag), there is a default timeout of 30sec
@@ -784,7 +800,7 @@ TLDR: yes
 * internet explorer 6 to 8 behave the same
 * firefox 52 and chrome 49 are the final winxp versions
 * `*1` yes, but extremely slow (ie10: `1 MiB/s`, ie11: `270 KiB/s`)
-* `*3` using a wasm decoder which consumes a bit more power
+* `*3` iOS 11 and newer, opus only, and requires FFmpeg on the server
 quick summary of more eccentric web-browsers trying to view a directory index:
@@ -973,6 +989,7 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`
 | GET | `?txt=iso-8859-1` | ...with specific charset |
 | GET | `?th` | get image/video at URL as thumbnail |
 | GET | `?th=opus` | convert audio file to 128kbps opus |
+| GET | `?th=caf` | ...in the iOS-proprietary container |
 | method | body | result |
 |--|--|--|
@@ -1068,13 +1085,11 @@ pls note that `copyparty-sfx.sh` will fail if you rename `copyparty-sfx.py` to `
 reduce the size of an sfx by removing features
 if you don't need all the features, you can repack the sfx and save a bunch of space; all you need is an sfx and a copy of this repo (nothing else to download or build, except if you're on windows then you need msys2 or WSL)
-* `584k` size of original sfx.py as of v1.1.0
-* `392k` after `./scripts/make-sfx.sh re no-ogv`
-* `310k` after `./scripts/make-sfx.sh re no-ogv no-cm`
-* `269k` after `./scripts/make-sfx.sh re no-ogv no-cm no-hl`
+* `393k` size of original sfx.py as of v1.1.3
+* `310k` after `./scripts/make-sfx.sh re no-cm`
+* `269k` after `./scripts/make-sfx.sh re no-cm no-hl`
 the features you can opt to drop are
-* `ogv`.js, the opus/vorbis decoder which is needed by apple devices to play foss audio files, saves ~192k
 * `cm`/easymde, the "fancy" markdown editor, saves ~82k
 * `hl`, prism, the syntax hilighter, saves ~41k
 * `fnt`, source-code-pro, the monospace font, saves ~9k
@@ -1082,7 +1097,7 @@ the features you can opt to drop are
 for the `re`pack to work, first run one of the sfx'es once to unpack it
-**note:** you can also just download and run [scripts/copyparty-repack.sh](scripts/copyparty-repack.sh) -- this will grab the latest copyparty release from github and do a `no-ogv no-cm` repack; works on linux/macos (and windows with msys2 or WSL)
+**note:** you can also just download and run [scripts/copyparty-repack.sh](scripts/copyparty-repack.sh) -- this will grab the latest copyparty release from github and do a few repacks; works on linux/macos (and windows with msys2 or WSL)
 # install on android
@@ -1096,6 +1111,16 @@ echo $?
 after the initial setup, you can launch copyparty at any time by running `copyparty` anywhere in Termux
+# reporting bugs
+ideas for context to include in bug reports
+if something broke during an upload (replacing FILENAME with a part of the filename that broke):
+```
+journalctl -aS '48 hour ago' -u copyparty | grep -C10 FILENAME | tee bug.log
+```
 # building
 ## dev env setup

View File

@@ -6,9 +6,13 @@ some of these rely on libraries which are not MIT-compatible
 * [audio-bpm.py](./audio-bpm.py) detects the BPM of music using the BeatRoot Vamp Plugin; imports GPL2
 * [audio-key.py](./audio-key.py) detects the melodic key of music using the Mixxx fork of keyfinder; imports GPL3
-* [media-hash.py](./media-hash.py) generates checksums for audio and video streams; uses FFmpeg (LGPL or GPL)
-these do not have any problematic dependencies:
+these invoke standalone programs which are GPL or similar, so is legally fine for most purposes:
+* [media-hash.py](./media-hash.py) generates checksums for audio and video streams; uses FFmpeg (LGPL or GPL)
+* [image-noexif.py](./image-noexif.py) removes exif tags from images; uses exiftool (GPLv1 or artistic-license)
+these do not have any problematic dependencies at all:
 * [cksum.py](./cksum.py) computes various checksums
 * [exe.py](./exe.py) grabs metadata from .exe and .dll files (example for retrieving multiple tags with one parser)

View File

@@ -19,18 +19,18 @@ dep: ffmpeg
 def det(tf):
     # fmt: off
     sp.check_call([
-        "ffmpeg",
-        "-nostdin",
-        "-hide_banner",
-        "-v", "fatal",
-        "-ss", "13",
-        "-y", "-i", fsenc(sys.argv[1]),
-        "-map", "0:a:0",
-        "-ac", "1",
-        "-ar", "22050",
-        "-t", "300",
-        "-f", "f32le",
-        tf
+        b"ffmpeg",
+        b"-nostdin",
+        b"-hide_banner",
+        b"-v", b"fatal",
+        b"-ss", b"13",
+        b"-y", b"-i", fsenc(sys.argv[1]),
+        b"-map", b"0:a:0",
+        b"-ac", b"1",
+        b"-ar", b"22050",
+        b"-t", b"300",
+        b"-f", b"f32le",
+        fsenc(tf)
     ])
     # fmt: on

View File

@@ -23,15 +23,15 @@ dep: ffmpeg
 def det(tf):
     # fmt: off
     sp.check_call([
-        "ffmpeg",
-        "-nostdin",
-        "-hide_banner",
-        "-v", "fatal",
-        "-y", "-i", fsenc(sys.argv[1]),
-        "-map", "0:a:0",
-        "-t", "300",
-        "-sample_fmt", "s16",
-        tf
+        b"ffmpeg",
+        b"-nostdin",
+        b"-hide_banner",
+        b"-v", b"fatal",
+        b"-y", b"-i", fsenc(sys.argv[1]),
+        b"-map", b"0:a:0",
+        b"-t", b"300",
+        b"-sample_fmt", b"s16",
+        fsenc(tf)
     ])
     # fmt: on
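The str-to-bytes churn in these two plugins is about filenames that are not valid UTF-8: passing every argv element as bytes keeps subprocess consistent either way. A rough sketch of what `copyparty.util.fsenc` presumably amounts to (an assumption; the plugins themselves fall back to plain `p.encode("utf-8")` when copyparty is not importable):

```
import sys

def fsenc(p):
    # encode a path to bytes, round-tripping undecodable filesystem
    # names via surrogateescape instead of crashing on them
    return p.encode(sys.getfilesystemencoding(), "surrogateescape")
```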

bin/mtag/image-noexif.py (new file, 93 lines)
View File

@@ -0,0 +1,93 @@
+#!/usr/bin/env python3
+
+"""
+remove exif tags from uploaded images
+
+dependencies:
+  exiftool
+
+about:
+  creates a "noexif" subfolder and puts exif-stripped copies of each image there,
+  the reason for the subfolder is to avoid issues with the up2k.db / deduplication:
+
+  if the original image is modified in-place, then copyparty will keep the original
+  hash in up2k.db for a while (until the next volume rescan), so if the image is
+  reuploaded after a rescan then the upload will be renamed and kept as a dupe
+
+  alternatively you could switch the logic around, making a copy of the original
+  image into a subfolder named "exif" and modify the original in-place, but then
+  up2k.db will be out of sync until the next rescan, so any additional uploads
+  of the same image will get symlinked (deduplicated) to the modified copy
+  instead of the original in "exif"
+
+  or maybe delete the original image after processing, that would kinda work too
+
+example copyparty config to use this:
+  -v/mnt/nas/pics:pics:rwmd,ed:c,e2ts,mte=+noexif:c,mtp=noexif=ejpg,ejpeg,ad,bin/mtag/image-noexif.py
+
+explained:
+  for realpath /mnt/nas/pics (served at /pics) with read-write-modify-delete for ed,
+  enable file analysis on upload (e2ts),
+  append "noexif" to the list of known tags (mtp),
+  and use mtp plugin "bin/mtag/image-noexif.py" to provide that tag,
+  do this on all uploads with the file extension "jpg" or "jpeg",
+  ad = parse file regardless if FFmpeg thinks it is audio or not
+
+PS: this requires e2ts to be functional,
+meaning you need to do at least one of these:
+ * apt install ffmpeg
+ * pip3 install mutagen
+and your python must have sqlite3 support compiled in
+"""
+
+
+import os
+import sys
+import time
+import filecmp
+import subprocess as sp
+
+try:
+    from copyparty.util import fsenc
+except:
+
+    def fsenc(p):
+        return p.encode("utf-8")
+
+
+def main():
+    cwd, fn = os.path.split(sys.argv[1])
+    if os.path.basename(cwd) == "noexif":
+        return
+
+    os.chdir(cwd)
+    f1 = fsenc(fn)
+    f2 = os.path.join(b"noexif", f1)
+    cmd = [
+        b"exiftool",
+        b"-exif:all=",
+        b"-iptc:all=",
+        b"-xmp:all=",
+        b"-P",
+        b"-o",
+        b"noexif/",
+        b"--",
+        f1,
+    ]
+    sp.check_output(cmd)
+    if not os.path.exists(f2):
+        print("failed")
+        return
+
+    if filecmp.cmp(f1, f2, shallow=False):
+        print("clean")
+    else:
+        print("exif")
+
+    # lastmod = os.path.getmtime(f1)
+    # times = (int(time.time()), int(lastmod))
+    # os.utime(f2, times)
+
+
+if __name__ == "__main__":
+    main()

View File

@@ -13,7 +13,7 @@ try:
 except:
     def fsenc(p):
-        return p
+        return p.encode("utf-8")
 """
@@ -24,13 +24,13 @@ dep: ffmpeg
 def det():
     # fmt: off
     cmd = [
-        "ffmpeg",
-        "-nostdin",
-        "-hide_banner",
-        "-v", "fatal",
-        "-i", fsenc(sys.argv[1]),
-        "-f", "framemd5",
-        "-"
+        b"ffmpeg",
+        b"-nostdin",
+        b"-hide_banner",
+        b"-v", b"fatal",
+        b"-i", fsenc(sys.argv[1]),
+        b"-f", b"framemd5",
+        b"-"
     ]
     # fmt: on

View File

@@ -3,7 +3,7 @@ from __future__ import print_function, unicode_literals
 """
 up2k.py: upload to copyparty
-2021-10-31, v0.11, ed <irc.rizon.net>, MIT-Licensed
+2021-11-28, v0.13, ed <irc.rizon.net>, MIT-Licensed
 https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py
 - dependencies: requests
@@ -224,29 +224,47 @@ class CTermsize(object):
 ss = CTermsize()
-def statdir(top):
-    """non-recursive listing of directory contents, along with stat() info"""
-    if hasattr(os, "scandir"):
-        with os.scandir(top) as dh:
-            for fh in dh:
-                yield [os.path.join(top, fh.name), fh.stat()]
-    else:
-        for name in os.listdir(top):
-            abspath = os.path.join(top, name)
-            yield [abspath, os.stat(abspath)]
+def _scd(err, top):
+    """non-recursive listing of directory contents, along with stat() info"""
+    with os.scandir(top) as dh:
+        for fh in dh:
+            abspath = os.path.join(top, fh.name)
+            try:
+                yield [abspath, fh.stat()]
+            except:
+                err.append(abspath)
+def _lsd(err, top):
+    """non-recursive listing of directory contents, along with stat() info"""
+    for name in os.listdir(top):
+        abspath = os.path.join(top, name)
+        try:
+            yield [abspath, os.stat(abspath)]
+        except:
+            err.append(abspath)
+if hasattr(os, "scandir"):
+    statdir = _scd
+else:
+    statdir = _lsd
-def walkdir(top):
+def walkdir(err, top):
     """recursive statdir"""
-    for ap, inf in sorted(statdir(top)):
+    for ap, inf in sorted(statdir(err, top)):
         if stat.S_ISDIR(inf.st_mode):
-            for x in walkdir(ap):
-                yield x
+            try:
+                for x in walkdir(err, ap):
+                    yield x
+            except:
+                err.append(ap)
         else:
             yield ap, inf
-def walkdirs(tops):
+def walkdirs(err, tops):
     """recursive statdir for a list of tops, yields [top, relpath, stat]"""
     sep = "{0}".format(os.sep).encode("ascii")
     for top in tops:
@@ -256,7 +274,7 @@ def walkdirs(tops):
         stop = os.path.dirname(top)
         if os.path.isdir(top):
-            for ap, inf in walkdir(top):
+            for ap, inf in walkdir(err, top):
                 yield stop, ap[len(stop) :].lstrip(sep), inf
         else:
             d, n = top.rsplit(sep, 1)
@@ -372,7 +390,7 @@ def handshake(req_ses, url, file, pw, search):
             r = req_ses.post(url, headers=headers, json=req)
             break
         except:
-            eprint("handshake failed, retry...\n")
+            eprint("handshake failed, retrying: {0}\n".format(file.name))
             time.sleep(1)
     try:
@@ -446,10 +464,21 @@ class Ctl(object):
         nfiles = 0
         nbytes = 0
-        for _, _, inf in walkdirs(ar.files):
+        err = []
+        for _, _, inf in walkdirs(err, ar.files):
             nfiles += 1
             nbytes += inf.st_size
+        if err:
+            eprint("\n# failed to access {0} paths:\n".format(len(err)))
+            for x in err:
+                eprint(x.decode("utf-8", "replace") + "\n")
+            eprint("^ failed to access those {0} paths ^\n\n".format(len(err)))
+            if not ar.ok:
+                eprint("aborting because --ok is not set\n")
+                return
         eprint("found {0} files, {1}\n\n".format(nfiles, humansize(nbytes)))
         self.nfiles = nfiles
         self.nbytes = nbytes
@@ -460,7 +489,7 @@ class Ctl(object):
         if ar.te:
             req_ses.verify = ar.te
-        self.filegen = walkdirs(ar.files)
+        self.filegen = walkdirs([], ar.files)
         if ar.safe:
             self.safe()
         else:
@@ -476,7 +505,7 @@ class Ctl(object):
            print("{0} {1}\n hash...".format(self.nfiles - nf, upath))
            get_hashlist(file, None)
-           burl = self.ar.url[:8] + self.ar.url[8:].split("/")[0] + "/"
+           burl = self.ar.url[:12] + self.ar.url[8:].split("/")[0] + "/"
            while True:
                print(" hs...")
                hs = handshake(req_ses, self.ar.url, file, self.ar.a, search)
@@ -744,7 +773,7 @@ class Ctl(object):
            try:
                upload(req_ses, file, cid, self.ar.a)
            except:
-               eprint("upload failed, retry...\n")
+               eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
                pass  # handshake will fix it
        with self.mutex:
@@ -783,6 +812,7 @@ source file/folder selection uses rsync syntax, meaning that:
    ap.add_argument("files", type=unicode, nargs="+", help="files and/or folders to process")
    ap.add_argument("-a", metavar="PASSWORD", help="password")
    ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
+   ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
    ap = app.add_argument_group("performance tweaks")
    ap.add_argument("-j", type=int, metavar="THREADS", default=4, help="parallel connections")
    ap.add_argument("-nh", action="store_true", help="disable hashing while uploading")

View File

@@ -13,7 +13,7 @@
 upstream cpp {
     server 127.0.0.1:3923;
-    keepalive 120;
+    keepalive 1;
 }
 server {
     listen 443 ssl;

View File

@@ -23,7 +23,7 @@ from textwrap import dedent
 from .__init__ import E, WINDOWS, ANYWIN, VT100, PY2, unicode
 from .__version__ import S_VERSION, S_BUILD_DT, CODENAME
 from .svchub import SvcHub
-from .util import py_desc, align_tab, IMPLICATIONS, ansi_re
+from .util import py_desc, align_tab, IMPLICATIONS, ansi_re, min_ex
 from .authsrv import re_vol
 HAVE_SSL = True
@@ -222,6 +222,54 @@ def sighandler(sig=None, frame=None):
     print("\n".join(msg))
+def disable_quickedit():
+    import ctypes
+    import atexit
+    from ctypes import wintypes
+
+    def ecb(ok, fun, args):
+        if not ok:
+            err = ctypes.get_last_error()
+            if err:
+                raise ctypes.WinError(err)
+        return args
+
+    k32 = ctypes.WinDLL("kernel32", use_last_error=True)
+    if PY2:
+        wintypes.LPDWORD = ctypes.POINTER(wintypes.DWORD)
+
+    k32.GetStdHandle.errcheck = ecb
+    k32.GetConsoleMode.errcheck = ecb
+    k32.SetConsoleMode.errcheck = ecb
+    k32.GetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.LPDWORD)
+    k32.SetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.DWORD)
+
+    def cmode(out, mode=None):
+        h = k32.GetStdHandle(-11 if out else -10)
+        if mode:
+            return k32.SetConsoleMode(h, mode)
+
+        mode = wintypes.DWORD()
+        k32.GetConsoleMode(h, ctypes.byref(mode))
+        return mode.value
+
+    # disable quickedit
+    mode = orig_in = cmode(False)
+    quickedit = 0x40
+    extended = 0x80
+    mask = quickedit + extended
+    if mode & mask != extended:
+        atexit.register(cmode, False, orig_in)
+        cmode(False, mode & ~mask | extended)
+
+    # enable colors in case the os.system("rem") trick ever stops working
+    if VT100:
+        mode = orig_out = cmode(True)
+        if mode & 4 != 4:
+            atexit.register(cmode, True, orig_out)
+            cmode(True, mode | 4)
 def run_argparse(argv, formatter):
     ap = argparse.ArgumentParser(
         formatter_class=formatter,
@@ -376,6 +424,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("--no-fpool", action="store_true", help="disable file-handle pooling -- instead, repeatedly close and reopen files during upload")
     ap2.add_argument("--use-fpool", action="store_true", help="force file-handle pooling, even if copyparty thinks you're better off without")
     ap2.add_argument("--no-symlink", action="store_true", help="duplicate file contents instead")
+    ap2.add_argument("--reg-cap", metavar="N", type=int, default=9000, help="max number of uploads to keep in memory when running without -e2d")
     ap2 = ap.add_argument_group('network options')
     ap2.add_argument("-i", metavar="IP", type=u, default="0.0.0.0", help="ip to bind (comma-sep.)")
@@ -383,6 +432,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; 0 = tcp, 1 = origin (first x-fwd), 2 = cloudflare, 3 = nginx, -1 = closest proxy")
     ap2.add_argument("--s-wr-sz", metavar="B", type=int, default=256*1024, help="socket write size in bytes")
     ap2.add_argument("--s-wr-slp", metavar="SEC", type=float, default=0, help="socket write delay in seconds")
+    ap2.add_argument("--rsp-slp", metavar="SEC", type=float, default=0, help="response delay in seconds")
     ap2 = ap.add_argument_group('SSL/TLS options')
     ap2.add_argument("--http-only", action="store_true", help="disable ssl/tls")
@@ -394,6 +444,7 @@ def run_argparse(argv, formatter):
     ap2 = ap.add_argument_group('opt-outs')
     ap2.add_argument("-nw", action="store_true", help="disable writes (benchmark)")
+    ap2.add_argument("--keep-qem", action="store_true", help="do not disable quick-edit-mode on windows")
     ap2.add_argument("--no-del", action="store_true", help="disable delete operations")
     ap2.add_argument("--no-mv", action="store_true", help="disable move/rename operations")
     ap2.add_argument("-nih", action="store_true", help="no info hostname")
@@ -435,6 +486,7 @@ def run_argparse(argv, formatter):
     ap2.add_argument("--no-vthumb", action="store_true", help="disable video thumbnails")
     ap2.add_argument("--th-size", metavar="WxH", default="320x256", help="thumbnail res")
     ap2.add_argument("--th-mt", metavar="CORES", type=int, default=cores, help="num cpu cores to use for generating thumbnails")
+    ap2.add_argument("--th-convt", metavar="SEC", type=int, default=60, help="conversion timeout in seconds")
     ap2.add_argument("--th-no-crop", action="store_true", help="dynamic height; show full image")
     ap2.add_argument("--th-no-jpg", action="store_true", help="disable jpg output")
     ap2.add_argument("--th-no-webp", action="store_true", help="disable webp output")
@@ -519,7 +571,7 @@ def main(argv=None):
     if HAVE_SSL:
         ensure_cert()
-    for k, v in zip(argv, argv[1:]):
+    for k, v in zip(argv[1:], argv[2:]):
         if k == "-c":
             supp = args_from_cfg(v)
             argv.extend(supp)
@@ -547,6 +599,12 @@ def main(argv=None):
     except AssertionError:
         al = run_argparse(argv, Dodge11874)
+    if WINDOWS and not al.keep_qem:
+        try:
+            disable_quickedit()
+        except:
+            print("\nfailed to disable quick-edit-mode:\n" + min_ex() + "\n")
     nstrs = []
     anymod = False
     for ostr in al.v or []:

View File

@@ -1,8 +1,8 @@
 # coding: utf-8
-VERSION = (1, 1, 2)
+VERSION = (1, 1, 5)
 CODENAME = "opus"
-BUILD_DT = (2021, 11, 12)
+BUILD_DT = (2021, 12, 4)
 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -2,7 +2,7 @@
 from __future__ import print_function, unicode_literals
 import os
-from ..util import fsenc, fsdec
+from ..util import fsenc, fsdec, SYMTIME
 from . import path
@@ -55,5 +55,8 @@ def unlink(p):
     return os.unlink(fsenc(p))
-def utime(p, times=None):
-    return os.utime(fsenc(p), times)
+def utime(p, times=None, follow_symlinks=True):
+    if SYMTIME:
+        return os.utime(fsenc(p), times, follow_symlinks=follow_symlinks)
+    else:
+        return os.utime(fsenc(p), times)

View File

@@ -2,7 +2,7 @@
 from __future__ import print_function, unicode_literals
 import os
-from ..util import fsenc, fsdec
+from ..util import fsenc, fsdec, SYMTIME
 def abspath(p):
@@ -13,8 +13,11 @@ def exists(p):
     return os.path.exists(fsenc(p))
-def getmtime(p):
-    return os.path.getmtime(fsenc(p))
+def getmtime(p, follow_symlinks=True):
+    if not follow_symlinks and SYMTIME:
+        return os.lstat(fsenc(p)).st_mtime
+    else:
+        return os.path.getmtime(fsenc(p))
 def getsize(p):
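Both wrappers gate on `SYMTIME`, which is imported from `copyparty.util` but never shown in this diff; presumably it is a capability flag along these lines (an assumption based on how it is used):

```
import os
import sys

# True when this platform/python can set and read timestamps on a symlink
# itself (follow_symlinks=False) instead of the file it points at
SYMTIME = sys.version_info >= (3, 6) and os.utime in os.supports_follow_symlinks
```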

View File

@@ -60,6 +60,7 @@ class HttpCli(object):
         self.bufsz = 1024 * 32
         self.hint = None
         self.trailing_slash = True
+        self.out_headerlist = []
         self.out_headers = {
             "Access-Control-Allow-Origin": "*",
             "Cache-Control": "no-store; max-age=0",
@@ -126,7 +127,8 @@ class HttpCli(object):
             self.loud_reply(unicode(ex), status=ex.code, volsan=True)
             return self.keepalive
-        # time.sleep(0.4)
+        if self.args.rsp_slp:
+            time.sleep(self.args.rsp_slp)
         # normalize incoming headers to lowercase;
         # outgoing headers however are Correct-Case
@@ -225,10 +227,10 @@ class HttpCli(object):
             self.gvol = self.asrv.vfs.aget[self.uname]
         if pwd and "pw" in self.ouparam and pwd != cookies.get("cppwd"):
-            self.out_headers["Set-Cookie"] = self.get_pwd_cookie(pwd)[0]
+            self.out_headerlist.append(("Set-Cookie", self.get_pwd_cookie(pwd)[0]))
-        ua = self.headers.get("user-agent", "")
-        self.is_rclone = ua.startswith("rclone/")
+        self.ua = self.headers.get("user-agent", "")
+        self.is_rclone = self.ua.startswith("rclone/")
         if self.is_rclone:
             uparam["raw"] = False
             uparam["dots"] = False
@@ -283,12 +285,19 @@ class HttpCli(object):
         n = "604800" if cache == "i" else cache or "69"
         self.out_headers["Cache-Control"] = "max-age=" + n
+    def k304(self):
+        k304 = self.cookies.get("k304")
+        return k304 == "y" or ("; Trident/" in self.ua and not k304)
     def send_headers(self, length, status=200, mime=None, headers=None):
         response = ["{} {} {}".format(self.http_ver, status, HTTPCODE[status])]
         if length is not None:
             response.append("Content-Length: " + unicode(length))
+        if status == 304 and self.k304():
+            self.keepalive = False
         # close if unknown length, otherwise take client's preference
         response.append("Connection: " + ("Keep-Alive" if self.keepalive else "Close"))
@@ -302,7 +311,7 @@ class HttpCli(object):
             self.out_headers["Content-Type"] = mime
-        for k, v in self.out_headers.items():
+        for k, v in list(self.out_headers.items()) + self.out_headerlist:
             response.append("{}: {}".format(k, v))
         try:
@@ -428,6 +437,15 @@ class HttpCli(object):
         if "ups" in self.uparam:
             return self.tx_ups()
+        if "k304" in self.uparam:
+            return self.set_k304()
+
+        if "am_js" in self.uparam:
+            return self.set_am_js()
+
+        if "reset" in self.uparam:
+            return self.set_cfg_reset()
         if "h" in self.uparam:
             return self.tx_mounts()
@@ -505,7 +523,7 @@ class HttpCli(object):
             return self.handle_stash()
         if "save" in opt:
-            post_sz, _, _, path = self.dump_to_file()
+            post_sz, _, _, _, path = self.dump_to_file()
            self.log("urlform: {} bytes, {}".format(post_sz, path))
         elif "print" in opt:
             reader, _ = self.get_body_reader()
@@ -590,8 +608,8 @@ class HttpCli(object):
             alg = alg or "gz"  # def.pk
             try:
                 # config-forced opts
-                alg, lv = pk.split(",")
-                lv[alg] = int(lv)
+                alg, nlv = pk.split(",")
+                lv[alg] = int(nlv)
             except:
                 pass
@@ -621,7 +639,7 @@ class HttpCli(object):
         with ren_open(fn, *open_a, **params) as f:
             f, fn = f["orz"]
             path = os.path.join(fdir, fn)
-            post_sz, _, sha_b64 = hashcopy(reader, f)
+            post_sz, sha_hex, sha_b64 = hashcopy(reader, f)
         if lim:
             lim.nup(self.ip)
@@ -645,13 +663,14 @@ class HttpCli(object):
             time.time(),
         )
-        return post_sz, sha_b64, remains, path
+        return post_sz, sha_hex, sha_b64, remains, path
     def handle_stash(self):
-        post_sz, sha_b64, remains, path = self.dump_to_file()
+        post_sz, sha_hex, sha_b64, remains, path = self.dump_to_file()
         spd = self._spd(post_sz)
         self.log("{} wrote {}/{} bytes to {}".format(spd, post_sz, remains, path))
-        self.reply("{}\n{}\n".format(post_sz, sha_b64).encode("utf-8"))
+        m = "{}\n{}\n{}\n".format(post_sz, sha_b64, sha_hex[:56])
+        self.reply(m.encode("utf-8"))
         return True
     def _spd(self, nbytes, add=True):
@@ -783,6 +802,10 @@ class HttpCli(object):
         return True
     def handle_search(self, body):
+        idx = self.conn.get_u2idx()
+        if not hasattr(idx, "p_end"):
+            raise Pebkac(500, "sqlite3 is not available on the server; cannot search")
+
         vols = []
         seen = {}
         for vtop in self.rvol:
@@ -794,7 +817,6 @@ class HttpCli(object):
             seen[vfs] = True
             vols.append([vfs.vpath, vfs.realpath, vfs.flags])
-        idx = self.conn.get_u2idx()
         t0 = time.time()
         if idx.p_end:
             penalty = 0.7
@@ -854,63 +876,63 @@
         response = x.get()
         chunksize, cstart, path, lastmod = response
-        if self.args.nw:
-            path = os.devnull
-
-        if remains > chunksize:
-            raise Pebkac(400, "your chunk is too big to fit")
-
-        self.log("writing {} #{} @{} len {}".format(path, chash, cstart, remains))
-
-        reader = read_socket(self.sr, remains)
-
-        f = None
-        fpool = not self.args.no_fpool
-        if fpool:
-            with self.mutex:
-                try:
-                    f = self.u2fh.pop(path)
-                except:
-                    pass
-
-        f = f or open(fsenc(path), "rb+", 512 * 1024)
-
         try:
-            f.seek(cstart[0])
-            post_sz, _, sha_b64 = hashcopy(reader, f)
-            if sha_b64 != chash:
-                raise Pebkac(
-                    400,
-                    "your chunk got corrupted somehow (received {} bytes); expected vs received hash:\n{}\n{}".format(
-                        post_sz, chash, sha_b64
-                    ),
-                )
-
-            if len(cstart) > 1 and path != os.devnull:
-                self.log(
-                    "clone {} to {}".format(
-                        cstart[0], " & ".join(unicode(x) for x in cstart[1:])
-                    )
-                )
-                ofs = 0
-                while ofs < chunksize:
-                    bufsz = min(chunksize - ofs, 4 * 1024 * 1024)
-                    f.seek(cstart[0] + ofs)
-                    buf = f.read(bufsz)
-                    for wofs in cstart[1:]:
-                        f.seek(wofs + ofs)
-                        f.write(buf)
-
-                    ofs += len(buf)
-
-                self.log("clone {} done".format(cstart[0]))
-        finally:
-            if not fpool:
-                f.close()
-            else:
-                with self.mutex:
-                    self.u2fh.put(path, f)
+            if self.args.nw:
+                path = os.devnull
+
+            if remains > chunksize:
+                raise Pebkac(400, "your chunk is too big to fit")
+
+            self.log("writing {} #{} @{} len {}".format(path, chash, cstart, remains))
+
+            reader = read_socket(self.sr, remains)
+
+            f = None
+            fpool = not self.args.no_fpool
+            if fpool:
+                with self.mutex:
+                    try:
+                        f = self.u2fh.pop(path)
+                    except:
+                        pass
+
+            f = f or open(fsenc(path), "rb+", 512 * 1024)
+
+            try:
+                f.seek(cstart[0])
+                post_sz, _, sha_b64 = hashcopy(reader, f)
+
+                if sha_b64 != chash:
+                    m = "your chunk got corrupted somehow (received {} bytes); expected vs received hash:\n{}\n{}"
+                    raise Pebkac(400, m.format(post_sz, chash, sha_b64))
+
+                if len(cstart) > 1 and path != os.devnull:
+                    self.log(
+                        "clone {} to {}".format(
+                            cstart[0], " & ".join(unicode(x) for x in cstart[1:])
+                        )
+                    )
+                    ofs = 0
+                    while ofs < chunksize:
+                        bufsz = min(chunksize - ofs, 4 * 1024 * 1024)
+                        f.seek(cstart[0] + ofs)
+                        buf = f.read(bufsz)
+                        for wofs in cstart[1:]:
+                            f.seek(wofs + ofs)
+                            f.write(buf)
+
+                        ofs += len(buf)
+
+                    self.log("clone {} done".format(cstart[0]))
+            finally:
+                if not fpool:
+                    f.close()
+                else:
+                    with self.mutex:
+                        self.u2fh.put(path, f)
+        finally:
+            x = self.conn.hsrv.broker.put(True, "up2k.release_chunk", ptop, wark, chash)
+            x.get()  # block client until released
         x = self.conn.hsrv.broker.put(True, "up2k.confirm_chunk", ptop, wark, chash)
         x = x.get()
@@ -957,15 +979,13 @@
     def get_pwd_cookie(self, pwd):
         if pwd in self.asrv.iacct:
             msg = "login ok"
-            dt = datetime.utcfromtimestamp(time.time() + 60 * 60 * 24 * 365)
-            exp = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
+            dur = 60 * 60 * 24 * 365
         else:
             msg = "naw dude"
             pwd = "x"  # nosec
-            exp = "Fri, 15 Aug 1997 01:00:00 GMT"
+            dur = None
-        ck = "cppwd={}; Path=/; Expires={}; SameSite=Lax".format(pwd, exp)
-        return [ck, msg]
+        return [gencookie("cppwd", pwd, dur), msg]
     def handle_mkdir(self):
         new_dir = self.parser.require("name", 512)
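This refactor (and the set_k304 / set_cfg_reset handlers further down) calls a `gencookie()` helper that the diff never shows; judging from the call sites and the cookie string the old code built, it is presumably something like:

```
from datetime import datetime, timedelta

def gencookie(k, v, dur):
    # dur is a lifetime in seconds; None expires the cookie immediately,
    # matching the old hardcoded "Fri, 15 Aug 1997" logout date
    if dur:
        exp = (datetime.utcnow() + timedelta(seconds=dur)).strftime(
            "%a, %d %b %Y %H:%M:%S GMT"
        )
    else:
        exp = "Fri, 15 Aug 1997 01:00:00 GMT"

    return "{}={}; Path=/; Expires={}; SameSite=Lax".format(k, v, exp)
```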
@@ -1073,7 +1093,7 @@
             f, fname = f["orz"]
             abspath = os.path.join(fdir, fname)
             self.log("writing to {}".format(abspath))
-            sz, sha512_hex, _ = hashcopy(p_data, f)
+            sz, sha_hex, sha_b64 = hashcopy(p_data, f)
             if sz == 0:
                 raise Pebkac(400, "empty files in post")
@@ -1086,7 +1106,7 @@
                 bos.unlink(abspath)
                 raise
-            files.append([sz, sha512_hex, p_file, fname, abspath])
+            files.append([sz, sha_hex, sha_b64, p_file, fname, abspath])
             dbv, vrem = vfs.get_dbv(rem)
             self.conn.hsrv.broker.put(
                 False,
@@ -1138,7 +1158,7 @@
             jmsg["error"] = errmsg
             errmsg = "ERROR: " + errmsg
-        for sz, sha512, ofn, lfn, ap in files:
+        for sz, sha_hex, sha_b64, ofn, lfn, ap in files:
             vsuf = ""
             if self.can_read and "fk" in vfs.flags:
                 vsuf = "?k=" + gen_filekey(
@@ -1149,8 +1169,13 @@
                 )[: vfs.flags["fk"]]
             vpath = "{}/{}".format(upload_vpath, lfn).strip("/")
-            msg += 'sha512: {} // {} bytes // <a href="/{}">{}</a> {}\n'.format(
-                sha512[:56], sz, quotep(vpath) + vsuf, html_escape(ofn, crlf=True), vsuf
+            msg += 'sha512: {} // {} // {} bytes // <a href="/{}">{}</a> {}\n'.format(
+                sha_hex[:56],
+                sha_b64,
+                sz,
+                quotep(vpath) + vsuf,
+                html_escape(ofn, crlf=True),
+                vsuf,
             )
             # truncated SHA-512 prevents length extension attacks;
             # using SHA-512/224, optionally SHA-512/256 = :64
@@ -1160,7 +1185,8 @@
                     self.headers.get("host", "copyparty"),
                     vpath + vsuf,
                 ),
-                "sha512": sha512[:56],
+                "sha512": sha_hex[:56],
+                "sha_b64": sha_b64,
                 "sz": sz,
                 "fn": lfn,
                 "fn_orig": ofn,
@@ -1378,8 +1404,7 @@
             if "gzip" not in supported_editions:
                 decompress = True
             else:
-                ua = self.headers.get("user-agent", "")
-                if re.match(r"MSIE [4-6]\.", ua) and " SV1" not in ua:
+                if re.match(r"MSIE [4-6]\.", self.ua) and " SV1" not in self.ua:
                     decompress = True
         if not decompress:
@@ -1692,10 +1717,28 @@
             tagq=vs["tagq"],
             mtpq=vs["mtpq"],
             url_suf=suf,
+            k304=self.k304(),
         )
         self.reply(html.encode("utf-8"))
         return True
+    def set_k304(self):
+        ck = gencookie("k304", self.uparam["k304"], 60 * 60 * 24 * 365)
+        self.out_headerlist.append(("Set-Cookie", ck))
+        self.redirect("", "?h#cc")
+
+    def set_am_js(self):
+        v = "n" if self.uparam["am_js"] == "n" else "y"
+        ck = gencookie("js", v, 60 * 60 * 24 * 365)
+        self.out_headerlist.append(("Set-Cookie", ck))
+        self.reply(b"promoted\n")
+
+    def set_cfg_reset(self):
+        for k in ("k304", "js", "cppwd"):
+            self.out_headerlist.append(("Set-Cookie", gencookie(k, "x", None)))
+
+        self.redirect("", "?h#cc")
     def tx_404(self, is_403=False):
         if self.args.vague_403:
             m = '<h1>404 not found &nbsp;┐( ´ -`)┌</h1><p>or maybe you don\'t have access -- try logging in or <a href="/?h">go home</a></p>'
@@ -1812,13 +1855,16 @@
         if not self.args.unpost:
             raise Pebkac(400, "the unpost feature is disabled in server config")
+        idx = self.conn.get_u2idx()
+        if not hasattr(idx, "p_end"):
+            raise Pebkac(500, "sqlite3 is not available on the server; cannot unpost")
+
         filt = self.uparam.get("filter")
         lm = "ups [{}]".format(filt)
         self.log(lm)
         ret = []
         t0 = time.time()
-        idx = self.conn.get_u2idx()
         lim = time.time() - self.args.unpost
         for vol in self.asrv.vfs.all_vols.values():
             cur = idx.get_cur(vol.realpath)
@@ -2086,6 +2132,7 @@
             "vdir": quotep(self.vpath),
             "vpnodes": vpnodes,
             "files": [],
+            "ls0": None,
             "acct": self.uname,
             "perms": json.dumps(perms),
             "taglist": [],
@@ -2309,7 +2356,12 @@
         dirs.sort(key=itemgetter("name"))
-        j2a["files"] = dirs + files
+        if self.cookies.get("js") == "y":
+            j2a["ls0"] = {"dirs": dirs, "files": files, "taglist": taglist}
+            j2a["files"] = []
+        else:
+            j2a["files"] = dirs + files
         j2a["logues"] = logues
         j2a["taglist"] = taglist
         j2a["txt_ext"] = self.args.textfiles.replace(",", " ")

View File

@@ -8,7 +8,7 @@ import shutil
 import subprocess as sp
 from .__init__ import PY2, WINDOWS, unicode
-from .util import fsenc, fsdec, uncyg, REKOBO_LKEY
+from .util import fsenc, fsdec, uncyg, runcmd, REKOBO_LKEY
 from .bos import bos
@@ -73,7 +73,7 @@ class MParser(object):
         raise Exception()
-def ffprobe(abspath):
+def ffprobe(abspath, timeout=10):
     cmd = [
         b"ffprobe",
         b"-hide_banner",
@@ -82,10 +82,8 @@ def ffprobe(abspath):
         b"--",
         fsenc(abspath),
     ]
-    p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
-    r = p.communicate()
-    txt = r[0].decode("utf-8", "replace")
-    return parse_ffprobe(txt)
+    rc = runcmd(cmd, timeout=timeout)
+    return parse_ffprobe(rc[1])
 def parse_ffprobe(txt):
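The `runcmd()` helper lives in copyparty/util.py and is not part of this diff; from the call sites (ffprobe uses `rc[1]` as stdout text, `_run_ff` unpacks `(ret, sout, serr)`) it presumably looks roughly like this hedged sketch:

```
import subprocess as sp

def runcmd(argv, timeout=None):
    # run a command, kill it if it exceeds the timeout,
    # and return (returncode, stdout-text, stderr-text)
    p = sp.Popen(argv, stdout=sp.PIPE, stderr=sp.PIPE)
    try:
        sout, serr = p.communicate(timeout=timeout)
    except sp.TimeoutExpired:
        p.kill()
        sout, serr = p.communicate()

    return (
        p.returncode,
        sout.decode("utf-8", "replace"),
        serr.decode("utf-8", "replace"),
    )
```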

View File

@@ -397,7 +397,6 @@ class SvcHub(object):
     def check_mp_enable(self):
         if self.args.j == 1:
-            self.log("svchub", "multiprocessing disabled by argument -j 1")
             return False
         if mp.cpu_count() <= 1:

View File

@@ -30,7 +30,7 @@ class ThumbCli(object):
         if is_vid and self.args.no_vthumb:
             return None
-        want_opus = fmt == "opus"
+        want_opus = fmt in ("opus", "caf")
         is_au = ext in FMT_FFA
         if is_au:
             if want_opus:

View File

@@ -90,7 +90,7 @@ def thumb_path(histpath, rem, mtime, fmt):
     h = hashlib.sha512(fsenc(fn)).digest()
     fn = base64.urlsafe_b64encode(h).decode("ascii")[:24]
-    if fmt == "opus":
+    if fmt in ("opus", "caf"):
         cat = "ac"
     else:
         fmt = "webp" if fmt == "w" else "jpg"
@@ -216,7 +216,7 @@ class ThumbSrv(object):
             elif ext in FMT_FFV:
                 fun = self.conv_ffmpeg
             elif ext in FMT_FFA:
-                if tpath.endswith(".opus"):
+                if tpath.endswith(".opus") or tpath.endswith(".caf"):
                     fun = self.conv_opus
                 else:
                     fun = self.conv_spec
@@ -349,7 +349,7 @@ class ThumbSrv(object):
     def _run_ff(self, cmd):
         # self.log((b" ".join(cmd)).decode("utf-8"))
-        ret, sout, serr = runcmd(cmd)
+        ret, sout, serr = runcmd(cmd, timeout=self.args.th_convt)
         if ret != 0:
             m = "FFmpeg failed (probably a corrupt video file):\n"
             m += "\n".join(["ff: {}".format(x) for x in serr.split("\n")])
@@ -406,21 +406,45 @@ class ThumbSrv(object):
         if "ac" not in ret:
             raise Exception("not audio")
-        # fmt: off
-        cmd = [
-            b"ffmpeg",
-            b"-nostdin",
-            b"-v", b"error",
-            b"-hide_banner",
-            b"-i", fsenc(abspath),
-            b"-map", b"0:a:0",
-            b"-c:a", b"libopus",
-            b"-b:a", b"128k",
-            fsenc(tpath)
-        ]
-        # fmt: on
-        self._run_ff(cmd)
+        src_opus = abspath.lower().endswith(".opus") or ret["ac"][1] == "opus"
+        want_caf = tpath.endswith(".caf")
+        tmp_opus = tpath
+        if want_caf:
+            tmp_opus = tpath.rsplit(".", 1)[0] + ".opus"
+
+        if not want_caf or (not src_opus and not bos.path.isfile(tmp_opus)):
+            # fmt: off
+            cmd = [
+                b"ffmpeg",
+                b"-nostdin",
+                b"-v", b"error",
+                b"-hide_banner",
+                b"-i", fsenc(abspath),
+                b"-map_metadata", b"-1",
+                b"-map", b"0:a:0",
+                b"-c:a", b"libopus",
+                b"-b:a", b"128k",
+                fsenc(tmp_opus)
+            ]
+            # fmt: on
+            self._run_ff(cmd)
+
+        if want_caf:
+            # fmt: off
+            cmd = [
+                b"ffmpeg",
+                b"-nostdin",
+                b"-v", b"error",
+                b"-hide_banner",
+                b"-i", fsenc(abspath if src_opus else tmp_opus),
+                b"-map_metadata", b"-1",
+                b"-map", b"0:a:0",
+                b"-c:a", b"copy",
+                b"-f", b"caf",
+                fsenc(tpath)
+            ]
+            # fmt: on
+            self._run_ff(cmd)
     def poke(self, tdir):
         if not self.poke_cd.poke(tdir):
@@ -461,7 +485,7 @@ class ThumbSrv(object):
             thumbpath = os.path.join(histpath, cat)
             # self.log("cln {}".format(thumbpath))
-            exts = ["jpg", "webp"] if cat == "th" else ["opus"]
+            exts = ["jpg", "webp"] if cat == "th" else ["opus", "caf"]
             maxage = getattr(self.args, cat + "_maxage")
             now = time.time()
             prev_b64 = None
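With the server side in place, grabbing the iOS-friendly transcode is just the documented `?th=caf` URL param (host and filename are hypothetical):

```
import requests

# ask copyparty to transcode an audio file to 128kbps opus in a CAF container
r = requests.get("http://127.0.0.1:3923/music/song.flac", params={"th": "caf"})
with open("song.caf", "wb") as f:
    f.write(r.content)
```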

View File

@@ -117,7 +117,16 @@ class U2idx(object):
             if ok:
                 continue
-            v, uq = (uq + " ").split(" ", 1)
+            if uq.startswith('"'):
+                v, uq = uq[1:].split('"', 1)
+                while v.endswith("\\"):
+                    v2, uq = uq.split('"', 1)
+                    v = v[:-1] + '"' + v2
+                uq = uq.strip()
+            else:
+                v, uq = (uq + " ").split(" ", 1)
+                v = v.replace('\\"', '"')
             if is_key:
                 is_key = False
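For reference, the same quote-aware tokenizer pulled out as a standalone function, plus a hand-traced example (not a copyparty API, just the logic from the hunk above):

```
def next_token(uq):
    # pop one search term off the query; double-quoted terms may
    # contain spaces, and \" escapes a literal quote inside them
    if uq.startswith('"'):
        v, uq = uq[1:].split('"', 1)
        while v.endswith("\\"):
            v2, uq = uq.split('"', 1)
            v = v[:-1] + '"' + v2
        uq = uq.strip()
    else:
        v, uq = (uq + " ").split(" ", 1)
        v = v.replace('\\"', '"')

    return v, uq

print(next_token('"foo \\"bar\\"" baz'))  # ('foo "bar"', 'baz')
```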

View File

@@ -21,6 +21,7 @@ from .util import (
Pebkac, Pebkac,
Queue, Queue,
ProgressPrinter, ProgressPrinter,
SYMTIME,
fsdec, fsdec,
fsenc, fsenc,
absreal, absreal,
@@ -73,6 +74,7 @@ class Up2k(object):
self.need_rescan = {} self.need_rescan = {}
self.dupesched = {} self.dupesched = {}
self.registry = {} self.registry = {}
self.droppable = {}
self.entags = {} self.entags = {}
self.flags = {} self.flags = {}
self.cur = {} self.cur = {}
@@ -125,11 +127,11 @@ class Up2k(object):
all_vols = self.asrv.vfs.all_vols all_vols = self.asrv.vfs.all_vols
have_e2d = self.init_indexes(all_vols) have_e2d = self.init_indexes(all_vols)
if have_e2d: thr = threading.Thread(target=self._snapshot, name="up2k-snapshot")
thr = threading.Thread(target=self._snapshot, name="up2k-snapshot") thr.daemon = True
thr.daemon = True thr.start()
thr.start()
if have_e2d:
thr = threading.Thread(target=self._hasher, name="up2k-hasher") thr = threading.Thread(target=self._hasher, name="up2k-hasher")
thr.daemon = True thr.daemon = True
thr.start() thr.start()
@@ -295,7 +297,8 @@ class Up2k(object):
def _vis_reg_progress(self, reg): def _vis_reg_progress(self, reg):
ret = [] ret = []
for _, job in reg.items(): for _, job in reg.items():
ret.append(self._vis_job_progress(job)) if job["need"]:
ret.append(self._vis_job_progress(job))
return ret return ret
@@ -483,26 +486,41 @@ class Up2k(object):
             self.log("/{} {}".format(vpath, " ".join(sorted(a))), "35")

         reg = {}
+        drp = None
         path = os.path.join(histpath, "up2k.snap")
-        if "e2d" in flags and bos.path.exists(path):
+        if bos.path.exists(path):
             with gzip.GzipFile(path, "rb") as f:
                 j = f.read().decode("utf-8")

             reg2 = json.loads(j)
+            try:
+                drp = reg2["droppable"]
+                reg2 = reg2["registry"]
+            except:
+                pass
+
             for k, job in reg2.items():
                 path = os.path.join(job["ptop"], job["prel"], job["name"])
                 if bos.path.exists(path):
                     reg[k] = job
                     job["poke"] = time.time()
+                    job["busy"] = {}
                 else:
                     self.log("ign deleted file in snap: [{}]".format(path))

-            m = "loaded snap {} |{}|".format(path, len(reg.keys()))
+            if drp is None:
+                drp = [k for k, v in reg.items() if not v.get("need", [])]
+            else:
+                drp = [x for x in drp if x in reg]
+
+            m = "loaded snap {} |{}| ({})".format(path, len(reg.keys()), len(drp or []))
             m = [m] + self._vis_reg_progress(reg)
             self.log("\n".join(m))

         self.flags[ptop] = flags
         self.registry[ptop] = reg
+        self.droppable[ptop] = drp or []
+        self.regdrop(ptop, None)
         if not HAVE_SQLITE3 or "e2d" not in flags or "d2d" in flags:
             return None
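With this change the up2k.snap snapshot is loaded even without e2d, and finished-but-unindexed uploads are tracked in a "droppable" list. A minimal sketch of reading such a snapshot while tolerating both the old flat layout and the new wrapped one (the function name is hypothetical):

    import gzip, json

    def load_snap(path):
        """returns (registry, droppable); droppable is None for old-style snaps"""
        with gzip.GzipFile(path, "rb") as f:
            snap = json.loads(f.read().decode("utf-8"))
        try:
            # new layout: {"droppable": [...], "registry": {...}}
            return snap["registry"], snap["droppable"]
        except KeyError:
            # old layout: the file is the registry itself
            return snap, None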
@@ -1256,6 +1274,7 @@ class Up2k(object):
             "at": at,
             "hash": [],
             "need": [],
+            "busy": {},
         }

         if job and wark in reg:
@@ -1289,7 +1308,7 @@ class Up2k(object):
                 err = "partial upload exists at a different location; please resume uploading here instead:\n"
                 err += "/" + quotep(vsrc) + " "

-                dupe = [cj["prel"], cj["name"]]
+                dupe = [cj["prel"], cj["name"], cj["lmod"]]
                 try:
                     self.dupesched[src].append(dupe)
                 except:
@@ -1314,7 +1333,7 @@ class Up2k(object):
                     dst = os.path.join(job["ptop"], job["prel"], job["name"])
                     if not self.args.nw:
                         bos.unlink(dst)  # TODO ed pls
-                        self._symlink(src, dst)
+                        self._symlink(src, dst, lmod=cj["lmod"])

                 if cur:
                     a = [cj[x] for x in "prel name lmod size addr".split()]
@@ -1338,6 +1357,7 @@ class Up2k(object):
                 "t0": now,
                 "hash": deepcopy(cj["hash"]),
                 "need": [],
+                "busy": {},
             }

             # client-provided, sanitized by _get_wark: name, size, lmod
             for k in [
@@ -1385,13 +1405,14 @@ class Up2k(object):
         with ren_open(fname, "wb", fdir=fdir, suffix=suffix) as f:
             return f["orz"][1]

-    def _symlink(self, src, dst, verbose=True):
+    def _symlink(self, src, dst, verbose=True, lmod=None):
         if verbose:
             self.log("linking dupe:\n  {0}\n  {1}".format(src, dst))

         if self.args.nw:
             return

+        linked = False
         try:
             if self.args.no_symlink:
                 raise Exception("disabled in config")
@@ -1422,10 +1443,18 @@ class Up2k(object):
                 hops = len(ndst[nc:]) - 1
                 lsrc = "../" * hops + "/".join(lsrc)
             os.symlink(fsenc(lsrc), fsenc(ldst))
+            linked = True
         except Exception as ex:
             self.log("cannot symlink; creating copy: " + repr(ex))
             shutil.copy2(fsenc(src), fsenc(dst))

+        if lmod and (not linked or SYMTIME):
+            times = (int(time.time()), int(lmod))
+            if ANYWIN:
+                self.lastmod_q.put([dst, 0, times])
+            else:
+                bos.utime(dst, times, False)
+
     def handle_chunk(self, ptop, wark, chash):
         with self.mutex:
             job = self.registry[ptop].get(wark)
@@ -1444,6 +1473,14 @@ class Up2k(object):
             if not nchunk:
                 raise Pebkac(400, "unknown chunk")

+            if chash in job["busy"]:
+                nh = len(job["hash"])
+                idx = job["hash"].index(chash)
+                m = "that chunk is already being written to:\n {}\n {} {}/{}\n {}"
+                raise Pebkac(400, m.format(wark, chash, idx, nh, job["name"]))
+
+            job["busy"][chash] = 1
+
         job["poke"] = time.time()

         chunksize = up2k_chunksize(job["size"])
@@ -1453,6 +1490,14 @@ class Up2k(object):
         return [chunksize, ofs, path, job["lmod"]]

+    def release_chunk(self, ptop, wark, chash):
+        with self.mutex:
+            job = self.registry[ptop].get(wark)
+            if job:
+                job["busy"].pop(chash, None)
+
+        return [True]
+
     def confirm_chunk(self, ptop, wark, chash):
         with self.mutex:
             try:
@@ -1463,6 +1508,8 @@ class Up2k(object):
             except Exception as ex:
                 return "confirm_chunk, wark, " + repr(ex)

+            job["busy"].pop(chash, None)
+
             try:
                 job["need"].remove(chash)
             except Exception as ex:
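handle_chunk now marks each chunk as busy while a writer holds it, so a duplicate POST for the same chunk is rejected instead of racing on the same file offset; release_chunk and confirm_chunk clear the marker. The claim/release pattern in isolation (class name hypothetical):

    import threading

    class ChunkGuard(object):
        """at most one active writer per chunk hash"""
        def __init__(self):
            self.mutex = threading.Lock()
            self.busy = {}

        def claim(self, chash):
            with self.mutex:
                if chash in self.busy:
                    raise Exception("that chunk is already being written to")
                self.busy[chash] = 1

        def release(self, chash):
            with self.mutex:
                # pop with default: releasing twice is harmless
                self.busy.pop(chash, None)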
@@ -1473,7 +1520,7 @@ class Up2k(object):
                 return ret, src

             if self.args.nw:
-                # del self.registry[ptop][wark]
+                self.regdrop(ptop, wark)
                 return ret, dst

             # windows cant rename open files
@@ -1505,21 +1552,21 @@ class Up2k(object):
         a = [job[x] for x in "ptop wark prel name lmod size addr".split()]
         a += [job.get("at") or time.time()]
         if self.idx_wark(*a):
-            # self.log("pop " + wark + " " + dst + " finish_upload idx_wark", 4)
             del self.registry[ptop][wark]
-            # in-memory registry is reserved for unfinished uploads
+        else:
+            self.regdrop(ptop, wark)

         dupes = self.dupesched.pop(dst, [])
         if not dupes:
             return

         cur = self.cur.get(ptop)
-        for rd, fn in dupes:
+        for rd, fn, lmod in dupes:
             d2 = os.path.join(ptop, rd, fn)
             if os.path.exists(d2):
                 continue

-            self._symlink(dst, d2)
+            self._symlink(dst, d2, lmod=lmod)
             if cur:
                 self.db_rm(cur, rd, fn)
                 self.db_add(cur, wark, rd, fn, *a[-4:])
@@ -1527,6 +1574,21 @@ class Up2k(object):
         if cur:
             cur.connection.commit()

+    def regdrop(self, ptop, wark):
+        t = self.droppable[ptop]
+        if wark:
+            t.append(wark)
+
+        if len(t) <= self.args.reg_cap:
+            return
+
+        n = len(t) - int(self.args.reg_cap / 2)
+        m = "up2k-registry [{}] has {} droppables; discarding {}"
+        self.log(m.format(ptop, len(t), n))
+        for k in t[:n]:
+            self.registry[ptop].pop(k, None)
+        self.droppable[ptop] = t[n:]
+
     def idx_wark(self, ptop, wark, rd, fn, lmod, sz, ip, at):
         cur = self.cur.get(ptop)
         if not cur:
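regdrop bounds the in-memory registry: finished warks accumulate in the droppable list, and once it grows past --reg-cap, everything except the newest reg_cap/2 entries is evicted. The trim arithmetic with an example cap (reg_cap=8 is illustrative, not the default):

    reg_cap = 8  # example value; the real cap comes from args.reg_cap
    droppable = ["w%d" % (n,) for n in range(11)]  # 11 finished warks

    if len(droppable) > reg_cap:
        n = len(droppable) - int(reg_cap / 2)  # evict all but the newest half-cap
        evicted, droppable = droppable[:n], droppable[n:]
        print(len(evicted), droppable)  # 7 ['w7', 'w8', 'w9', 'w10']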
@@ -1721,8 +1783,9 @@ class Up2k(object):
             dlabs = absreal(sabs)
             m = "moving symlink from [{}] to [{}], target [{}]"
             self.log(m.format(sabs, dabs, dlabs))
-            os.unlink(sabs)
-            self._symlink(dlabs, dabs, False)
+            mt = bos.path.getmtime(sabs, False)
+            bos.unlink(sabs)
+            self._symlink(dlabs, dabs, False, lmod=mt)

             # folders are too scary, schedule rescan of both vols
             self.need_rescan[svn.vpath] = 1
@@ -1852,25 +1915,30 @@ class Up2k(object):
                 slabs = list(sorted(links.keys()))[0]
                 ptop, rem = links.pop(slabs)
                 self.log("linkswap [{}] and [{}]".format(sabs, slabs))
+                mt = bos.path.getmtime(slabs, False)
                 bos.unlink(slabs)
                 bos.rename(sabs, slabs)
+                bos.utime(slabs, (int(time.time()), int(mt)), False)
                 self._symlink(slabs, sabs, False)
                 full[slabs] = [ptop, rem]
+                sabs = slabs

             if not dabs:
                 dabs = list(sorted(full.keys()))[0]

             for alink in links.keys():
+                lmod = None
                 try:
                     if alink != sabs and absreal(alink) != sabs:
                         continue

                     self.log("relinking [{}] to [{}]".format(alink, dabs))
+                    lmod = bos.path.getmtime(alink, False)
                     bos.unlink(alink)
                 except:
                     pass

-                self._symlink(dabs, alink, False)
+                self._symlink(dabs, alink, False, lmod=lmod)

         return len(full) + len(links)
@@ -1976,7 +2044,7 @@ class Up2k(object):
             for path, sz, times in ready:
                 self.log("lmod: setting times {} on {}".format(times, path))
                 try:
-                    bos.utime(path, times)
+                    bos.utime(path, times, False)
                 except:
                     self.log("lmod: failed to utime ({}, {})".format(path, times))
@@ -2042,7 +2110,8 @@ class Up2k(object):
             bos.makedirs(histpath)

         path2 = "{}.{}".format(path, os.getpid())
-        j = json.dumps(reg, indent=2, sort_keys=True).encode("utf-8")
+        body = {"droppable": self.droppable[ptop], "registry": reg}
+        j = json.dumps(body, indent=2, sort_keys=True).encode("utf-8")
         with gzip.GzipFile(path2, "wb") as f:
             f.write(j)


@@ -67,8 +67,9 @@ if WINDOWS and PY2:
     FS_ENCODING = "utf-8"


+SYMTIME = sys.version_info >= (3, 6) and os.supports_follow_symlinks
 HTTP_TS_FMT = "%a, %d %b %Y %H:%M:%S GMT"

 HTTPCODE = {
     200: "OK",
@@ -104,6 +105,7 @@ MIMES = {
     "txt": "text/plain",
     "js": "text/javascript",
     "opus": "audio/ogg; codecs=opus",
+    "caf": "audio/x-caf",
     "mp3": "audio/mpeg",
     "m4a": "audio/mp4",
     "jpg": "image/jpeg",
@@ -821,6 +823,17 @@ def gen_filekey(salt, fspath, fsize, inode):
     ).decode("ascii")


+def gencookie(k, v, dur):
+    v = v.replace(";", "")
+    if dur:
+        dt = datetime.utcfromtimestamp(time.time() + dur)
+        exp = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
+    else:
+        exp = "Fri, 15 Aug 1997 01:00:00 GMT"
+
+    return "{}={}; Path=/; Expires={}; SameSite=Lax".format(k, v, exp)
+
+
 def humansize(sz, terse=False):
     for unit in ["B", "KiB", "MiB", "GiB", "TiB"]:
         if sz < 1024:
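gencookie hand-rolls the Set-Cookie value: a duration yields an Expires that far in the future, while no duration yields a date in 1997, which makes browsers drop the cookie. A runnable usage sketch, restating the definition from the hunk above:

    import time
    from datetime import datetime

    def gencookie(k, v, dur):
        v = v.replace(";", "")
        if dur:
            dt = datetime.utcfromtimestamp(time.time() + dur)
            exp = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
        else:
            exp = "Fri, 15 Aug 1997 01:00:00 GMT"
        return "{}={}; Path=/; Expires={}; SameSite=Lax".format(k, v, exp)

    print(gencookie("k304", "y", 60 * 60 * 24))  # lives for one day
    print(gencookie("k304", "x", None))
    # k304=x; Path=/; Expires=Fri, 15 Aug 1997 01:00:00 GMT; SameSite=Lax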
@@ -1312,9 +1325,17 @@ def guess_mime(url, fallback="application/octet-stream"):
     return ret


-def runcmd(argv):
+def runcmd(argv, timeout=None):
     p = sp.Popen(argv, stdout=sp.PIPE, stderr=sp.PIPE)
-    stdout, stderr = p.communicate()
+    if not timeout or PY2:
+        stdout, stderr = p.communicate()
+    else:
+        try:
+            stdout, stderr = p.communicate(timeout=timeout)
+        except sp.TimeoutExpired:
+            p.kill()
+            stdout, stderr = p.communicate()
+
     stdout = stdout.decode("utf-8", "replace")
     stderr = stderr.decode("utf-8", "replace")
     return [p.returncode, stdout, stderr]
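The timeout path uses communicate(timeout=) and, on expiry, kills the child and calls communicate() again to reap it; PY2's communicate has no timeout keyword, hence the guard. A py3-only condensed sketch plus an example call (the ffprobe argv is illustrative):

    import subprocess as sp

    def runcmd3(argv, timeout=None):  # hypothetical py3-only variant
        p = sp.Popen(argv, stdout=sp.PIPE, stderr=sp.PIPE)
        try:
            so, se = p.communicate(timeout=timeout)
        except sp.TimeoutExpired:
            p.kill()
            so, se = p.communicate()  # reap the killed child
        return [p.returncode, so.decode("utf-8", "replace"), se.decode("utf-8", "replace")]

    rc, so, se = runcmd3(["ffprobe", "-version"], timeout=10)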


@@ -44,8 +44,9 @@ pre, code, tt, #doc, #doc>code {
     margin-left: -.7em;
 }
 #files {
-    border-spacing: 0;
     z-index: 1;
+    top: -.3em;
+    border-spacing: 0;
     position: relative;
 }
 #files tbody a {
@@ -72,17 +73,38 @@ a, #files tbody div a:last-child {
 }
 #files thead {
     position: sticky;
-    top: 0;
+    top: -1px;
 }
 #files thead a {
     color: #999;
     font-weight: normal;
 }
+.s0:after,
+.s1:after {
+    content: '⌄';
+    margin-left: -.1em;
+}
+.s0r:after,
+.s1r:after {
+    content: '⌃';
+    margin-left: -.1em;
+}
+.s0:after,
+.s0r:after {
+    color: #fb0;
+}
+.s1:after,
+.s1r:after {
+    color: #d09;
+}
+#files thead th:after {
+    margin-right: -.7em;
+}
 #files tbody tr:hover td {
     background: #1c1c1c;
 }
 #files thead th {
-    padding: 0 .3em .3em .3em;
+    padding: .3em;
     border-bottom: 1px solid #444;
     cursor: pointer;
 }
@@ -239,6 +261,8 @@ html.light #ggrid>a[tt].sel {
 #files tbody tr.sel:hover td,
 #files tbody tr.sel:focus td,
 #ggrid>a.sel:hover,
+#ggrid>a.sel:focus,
+html.light #ggrid>a.sel:focus,
 html.light #ggrid>a.sel:hover {
     color: #fff;
     background: #d39;
@@ -294,6 +318,8 @@ html.light #ggrid>a.sel {
     width: 100%;
     z-index: 3;
     touch-action: none;
+}
+#widget.anim {
     transition: bottom 0.15s;
 }
 #widget.open {
#widget.open { #widget.open {
@@ -876,13 +902,15 @@ html.light #tree.nowrap .ntree a+a:hover {
 }
 #thumbs,
 #au_fullpre,
+#au_os_seek,
 #au_osd_cv,
 #u2tdate {
     opacity: .3;
 }
 #griden.on+#thumbs,
 #au_preload.on+#au_fullpre,
-#au_os_ctl.on+#au_osd_cv,
+#au_os_ctl.on+#au_os_seek,
+#au_os_ctl.on+#au_os_seek+#au_osd_cv,
 #u2turbo.on+#u2tdate {
     opacity: 1;
 }
@@ -945,6 +973,12 @@ html.light .ghead {
 #ggrid>a.dir:before {
     content: '📂';
 }
+#ggrid>a.au:before {
+    content: '💾';
+}
+html.np_open #ggrid>a.au:before {
+    content: '▶';
+}
 #ggrid>a:before {
     display: block;
     position: absolute;
@@ -954,6 +988,12 @@ html.light .ghead {
     background: linear-gradient(135deg,rgba(255,255,255,0) 50%,rgba(255,255,255,0.2));
     border-radius: .3em;
     font-size: 2em;
+    transition: font-size .15s, margin .15s;
+}
+#ggrid>a:focus:before,
+#ggrid>a:hover:before {
+    font-size: 2.5em;
+    margin: -.2em;
 }
 #op_unpost {
     padding: 1em;
#op_unpost { #op_unpost {
padding: 1em; padding: 1em;
@@ -1023,7 +1063,6 @@ html.light #rui {
     font-size: 1.5em;
 }
 #doc {
-    background: none;
     overflow: visible;
     margin: -1em 0 .5em 0;
     padding: 1em 0 1em 0;
@@ -1111,6 +1150,7 @@ a.btn,
 html,
+#doc,
 #rui,
 #files td,
 #files thead th,
@@ -1158,6 +1198,7 @@ html,
 #ggrid>a[tt] {
     background: linear-gradient(135deg, #2c2c2c 95%, #444 95%);
 }
+#ggrid>a:focus,
 #ggrid>a:hover {
     background: #383838;
     border-color: #555;
@@ -1171,6 +1212,7 @@ html.light #ggrid>a {
 html.light #ggrid>a[tt] {
     background: linear-gradient(135deg, #f7f7f7 95%, #ccc 95%);
 }
+html.light #ggrid>a:focus,
 html.light #ggrid>a:hover {
     background: #fff;
     border-color: #ccc;
@@ -1190,6 +1232,7 @@ html.light {
 html.light #ops,
 html.light .opbox,
 html.light #path,
+html.light #doc,
 html.light #srch_form,
 html.light .ghead,
 html.light #u2etas {
@@ -1267,6 +1310,14 @@ html.light #ops a,
 html.light #files tbody div a:last-child {
     color: #06a;
 }
+html.light .s0:after,
+html.light .s0r:after {
+    color: #059;
+}
+html.light .s1:after,
+html.light .s1r:after {
+    color: #f5d;
+}
 html.light #files thead th {
     background: #eaeaea;
     border-color: #ccc;
@@ -1663,8 +1714,6 @@ html.light #bbox-overlay figcaption a {
 #op_up2k {
     padding: 0 1em 1em 1em;
-    min-height: 0;
-    transition: min-height .2s;
 }
 #drops {
     display: none;
@@ -1829,13 +1878,18 @@ html.light #u2err.err {
 #u2notbtn * {
     line-height: 1.3em;
 }
+#u2tabw {
+    min-height: 0;
+    transition: min-height .2s;
+    margin: 3em 0;
+}
 #u2tab {
     border-collapse: collapse;
-    margin: 3em auto;
     width: calc(100% - 2em);
     max-width: 100em;
+    margin: 0 auto;
 }
-#op_up2k.srch #u2tab {
+#op_up2k.srch #u2tabf {
     max-width: none;
 }
 #u2tab td {


@@ -143,7 +143,8 @@
     have_zip = {{ have_zip|tojson }},
     txt_ext = "{{ txt_ext }}",
     {% if no_prism %}no_prism = 1,{% endif %}
-    readme = {{ readme|tojson }};
+    readme = {{ readme|tojson }},
+    ls0 = {{ ls0|tojson }};

 document.documentElement.setAttribute("class", localStorage.lightmode == 1 ? "light" : "dark");
 </script>

(file diff suppressed because it is too large)


@@ -38,7 +38,8 @@ a+a {
     margin: -.2em 0 0 .5em;
 }
 .logout,
-.btns a {
+.btns a,
+a.r {
     color: #c04;
     border-color: #c7a;
 }
@@ -79,6 +80,12 @@ table {
     margin-top: .3em;
     text-align: right;
 }
+blockquote {
+    margin: 0 0 1.6em .6em;
+    padding: .7em 1em 0 1em;
+    border-left: .3em solid rgba(128,128,128,0.5);
+    border-radius: 0 0 0 .25em;
+}
 html.dark,
@@ -96,7 +103,8 @@ html.dark a {
     border-color: #37a;
 }
 html.dark .logout,
-html.dark .btns a {
+html.dark .btns a,
+html.dark a.r {
     background: #804;
     border-color: #c28;
 }


@@ -72,6 +72,18 @@
         </ul>
         {%- endif %}

+        <h1 id="cc">client config:</h1>
+        <ul>
+            {% if k304 %}
+            <li><a href="/?k304=n">disable k304</a> (currently enabled)
+            {%- else %}
+            <li><a href="/?k304=y" class="r">enable k304</a> (currently disabled)
+            {% endif %}
+            <blockquote>enabling this will disconnect your client on every HTTP 304, which can prevent some buggy browsers/proxies from getting stuck (suddenly not being able to load pages), <em>but</em> it will also make things slower in general</blockquote></li>
+
+            <li><a href="/?reset" class="r" onclick="localStorage.clear();return true">reset client settings</a></li>
+        </ul>
+
         <h1>login for more:</h1>
         <ul>
             <form method="post" enctype="multipart/form-data" action="/{{ qvpath }}">
@@ -84,8 +96,7 @@
 <a href="#" id="repl">π</a>
 <script>

-if (localStorage.lightmode != 1)
-    document.documentElement.setAttribute("class", "dark");
+document.documentElement.setAttribute("class", localStorage.lightmode == 1 ? "light" : "dark");

 </script>
 <script src="/.cpr/util.js?_={{ ts }}"></script>


@@ -116,6 +116,20 @@ html {
 #toast.err #toastc {
     background: #d06;
 }
+#tth {
+    color: #fff;
+    background: #111;
+    font-size: .9em;
+    padding: 0 .26em;
+    line-height: .97em;
+    border-radius: 1em;
+    position: absolute;
+    display: none;
+}
+#tth.act {
+    display: block;
+    z-index: 9001;
+}
 #tt.b {
     padding: 0 2em;
     border-radius: .5em;
@@ -159,6 +173,10 @@ html.light #tt code {
 html.light #tt em {
     color: #d38;
 }
+html.light #tth {
+    color: #000;
+    background: #fff;
+}
 #modal {
     position: fixed;
     overflow: auto;


@@ -525,13 +525,15 @@ function Donut(uc, st) {
     }
     r.on = function (ya) {
-        r.fc = 99;
+        r.fc = r.tc = 99;
         r.eta = null;
        r.base = pos();
         optab.innerHTML = ya ? svg() : optab.getAttribute('ico');
         el = QS('#ops a .donut');
-        if (!ya)
+        if (!ya) {
             favico.upd();
+            wintitle();
+        }
     };
     r.do = function () {
         if (!el)
@@ -541,6 +543,11 @@ function Donut(uc, st) {
             v = pos() - r.base,
             ofs = el.style.strokeDashoffset = o - o * v / t;

+        if (++r.tc >= 10) {
+            wintitle(f2f(v * 100 / t, 1) + '%, ' + r.eta + 's, ', true);
+            r.tc = 0;
+        }
+
         if (favico.txt) {
             if (++r.fc < 10 && r.eta && r.eta > 99)
                 return;
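The donut now mirrors upload progress into the window title, rate-limited by r.tc so only every 10th timer tick rewrites the title. The same every-Nth throttle as a python sketch (names hypothetical):

    class Throttle(object):
        """fire once every `every` ticks; primed so the first tick fires"""
        def __init__(self, every=10):
            self.every = every
            self.tc = 99

        def tick(self):
            self.tc += 1
            if self.tc >= self.every:
                self.tc = 0
                return True
            return False

    th = Throttle()
    print([n for n in range(25) if th.tick()])  # [0, 10, 20]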
@@ -728,7 +735,6 @@ function up2k_init(subtle) {
         if (++nenters <= 0)
             nenters = 1;

-        //console.log(nenters, Date.now(), 'enter', this, e.target);
         if (onover.bind(this)(e))
             return true;
@@ -750,12 +756,19 @@ function up2k_init(subtle) {
         ebi('up_dz').setAttribute('err', mup || '');
         ebi('srch_dz').setAttribute('err', msr || '');
     }
+    function onoverb(e) {
+        // zones are alive; disable cuo2duo branch
+        document.body.ondragover = document.body.ondrop = null;
+        return onover.bind(this)(e);
+    }
     function onover(e) {
         try {
             var ok = false, dt = e.dataTransfer.types;
             for (var a = 0; a < dt.length; a++)
                 if (dt[a] == 'Files')
                     ok = true;
+                else if (dt[a] == 'text/uri-list')
+                    return true;

             if (!ok)
                 return true;
@@ -781,17 +794,20 @@ function up2k_init(subtle) {
             clmod(ebi('drops'), 'vis');
             clmod(ebi('up_dz'), 'hl');
             clmod(ebi('srch_dz'), 'hl');
+            // cuo2duo:
+            document.body.ondragover = onover;
+            document.body.ondrop = gotfile;
         }
-        //console.log(nenters, Date.now(), 'leave', this, e && e.target);
     }
     document.body.ondragenter = ondrag;
     document.body.ondragleave = offdrag;
+    document.body.ondragover = onover;
+    document.body.ondrop = gotfile;

     var drops = [ebi('up_dz'), ebi('srch_dz')];
     for (var a = 0; a < 2; a++) {
         drops[a].ondragenter = ondrag;
-        drops[a].ondragover = onover;
+        drops[a].ondragover = onoverb;
         drops[a].ondragleave = offdrag;
         drops[a].ondrop = gotfile;
     }
@@ -801,7 +817,10 @@ function up2k_init(subtle) {
         ev(e);
         nenters = 0;
         offdrag.bind(this)();
-        var dz = (this && this.getAttribute('id'));
+        var dz = this && this.getAttribute('id');
+        if (!dz && e && e.clientY)
+            // cuo2duo fallback
+            dz = e.clientY < window.innerHeight / 2 ? 'up_dz' : 'srch_dz';

         var err = this.getAttribute('err');
         if (err)
@@ -1069,7 +1088,7 @@ function up2k_init(subtle) {
     }
     more_one_file();

-    var etaref = 0, etaskip = 0, op_minh = 0;
+    var etaref = 0, etaskip = 0, utw_minh = 0;
     function etafun() {
         var nhash = st.busy.head.length + st.busy.hash.length + st.todo.head.length + st.todo.hash.length,
             nsend = st.busy.upload.length + st.todo.upload.length,
@@ -1082,13 +1101,10 @@ function up2k_init(subtle) {
         //ebi('acc_info').innerHTML = humantime(st.time.busy) + ' ' + f2f(now / 1000, 1);

-        var op = ebi('op_up2k'),
-            uff = ebi('u2footfoot'),
-            minh = QS('#op_up2k.act') ? Math.max(op_minh, uff.offsetTop + uff.offsetHeight - op.offsetTop + 32) : 0;
+        var minh = QS('#op_up2k.act') && st.is_busy ? Math.max(utw_minh, ebi('u2tab').offsetHeight + 32) : 0;

-        if (minh > op_minh || !op_minh) {
-            op_minh = minh;
-            op.style.minHeight = op_minh + 'px';
+        if (utw_minh < minh || !utw_minh) {
+            utw_minh = minh;
+            ebi('u2tabw').style.minHeight = utw_minh + 'px';
         }

         if (!nhash)
@@ -1211,15 +1227,16 @@ function up2k_init(subtle) {
         running = true;
         while (true) {
             var now = Date.now(),
-                is_busy = 0 !=
-                    st.todo.head.length +
-                    st.todo.hash.length +
-                    st.todo.handshake.length +
-                    st.todo.upload.length +
-                    st.busy.head.length +
-                    st.busy.hash.length +
-                    st.busy.handshake.length +
-                    st.busy.upload.length;
+                oldest_active = Math.min(  // gzip take the wheel
+                    st.todo.head.length ? st.todo.head[0].n : st.files.length,
+                    st.todo.hash.length ? st.todo.hash[0].n : st.files.length,
+                    st.todo.upload.length ? st.todo.upload[0].nfile : st.files.length,
+                    st.todo.handshake.length ? st.todo.handshake[0].n : st.files.length,
+                    st.busy.head.length ? st.busy.head[0].n : st.files.length,
+                    st.busy.hash.length ? st.busy.hash[0].n : st.files.length,
+                    st.busy.upload.length ? st.busy.upload[0].nfile : st.files.length,
+                    st.busy.handshake.length ? st.busy.handshake[0].n : st.files.length),
+                is_busy = oldest_active < st.files.length;

             if (was_busy && !is_busy) {
                 for (var a = 0; a < st.files.length; a++) {
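is_busy is now derived from the index of the oldest file that still has work queued or in flight; when every queue is empty the min falls through to st.files.length, meaning idle. This also gives the head-stage a runahead horizon (see the parallel_uploads * 2 check further down). The same computation modeled in python:

    def oldest_active(queues, nfiles):
        """lowest file index with pending work, or nfiles when fully idle"""
        return min([q[0] for q in queues if q] + [nfiles])

    stages = [[2, 5], [], [4], []]  # per-stage queues of file indices
    n = oldest_active(stages, 9)
    print(n, n < 9)  # 2 True  -> uploader is busy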
@@ -1239,7 +1256,7 @@ function up2k_init(subtle) {
             }

             if (was_busy != is_busy) {
-                was_busy = is_busy;
+                st.is_busy = was_busy = is_busy;

                 window[(is_busy ? "add" : "remove") +
                     "EventListener"]("beforeunload", warn_uploader_busy);
@@ -1268,7 +1285,7 @@ function up2k_init(subtle) {
                 timer.rm(etafun);
                 timer.rm(donut.do);
-                op_minh = 0;
+                utw_minh = 0;
             }
             else {
                 timer.add(donut.do);
@@ -1320,7 +1337,8 @@ function up2k_init(subtle) {
             }

             if (st.todo.head.length &&
-                st.busy.head.length < parallel_uploads) {
+                st.busy.head.length < parallel_uploads &&
+                (!is_busy || st.todo.head[0].n - oldest_active < parallel_uploads * 2)) {
                 exec_head();
                 mou_ikkai = true;
             }
@@ -1843,7 +1861,8 @@ function up2k_init(subtle) {
                 st.bytes.uploaded += cdr - car;
                 t.bytes_uploaded += cdr - car;
             }
-            else if (txt.indexOf('already got that') !== -1) {
+            else if (txt.indexOf('already got that') + 1 ||
+                txt.indexOf('already being written') + 1) {
                 console.log("ignoring dupe-segment error", t);
             }
             else {
@@ -1851,6 +1870,9 @@ function up2k_init(subtle) {
                     xhr.status, t.name) + (txt || "no further information"));
                 return;
             }
+            orz2(xhr);
+        }
+        function orz2(xhr) {
             apop(st.busy.upload, upt);
             apop(t.postlist, npart);
             if (!t.postlist.length) {
@@ -1872,9 +1894,11 @@ function up2k_init(subtle) {
             if (crashed)
                 return;

-            toast.err(9.98, "failed to upload a chunk,\n" + tries + " retries so far -- retrying in 10sec\n\n" + t.name);
+            if (!toast.visible)
+                toast.warn(9.98, "failed to upload a chunk;\nprobably harmless, continuing\n\n" + t.name);
+
             console.log('chunkpit onerror,', ++tries, t);
-            setTimeout(do_send, 10 * 1000);
+            orz2(xhr);
         };
         xhr.open('POST', t.purl, true);
         xhr.setRequestHeader("X-Up2k-Hash", t.hash[npart]);


@@ -7,8 +7,7 @@ if (!window['console'])

 var is_touch = 'ontouchstart' in window,
-    IPHONE = /iPhone|iPad|iPod/i.test(navigator.userAgent),
-    ANDROID = /android/i.test(navigator.userAgent),
+    IPHONE = is_touch && /iPhone|iPad|iPod/i.test(navigator.userAgent),
     WINDOWS = navigator.platform ? navigator.platform == 'Win32' : /Windows/.test(navigator.userAgent);
@@ -181,6 +180,7 @@ function ignex(all) {
     if (!all)
         window.onerror = vis_exh;
 }
+window.onerror = vis_exh;

 function noop() { }
@@ -286,15 +286,19 @@ function crc32(str) {
 function clmod(el, cls, add) {
+    if (!el)
+        return false;
+
     if (el.classList) {
         var have = el.classList.contains(cls);
         if (add == 't')
             add = !have;

-        if (add != have)
-            el.classList[add ? 'add' : 'remove'](cls);
-        return;
+        if (!add == !have)
+            return false;
+
+        el.classList[add ? 'add' : 'remove'](cls);
+        return true;
     }

     var re = new RegExp('\\s*\\b' + cls + '\\s*\\b', 'g'),
@@ -305,12 +309,18 @@ function clmod(el, cls, add) {
     var n2 = n1.replace(re, ' ') + (add ? ' ' + cls : '');

-    if (n1 != n2)
-        el.className = n2;
+    if (!n1 == !n2)
+        return false;
+
+    el.className = n2;
+    return true;
 }

 function clgot(el, cls) {
+    if (!el)
+        return;
+
     if (el.classList)
         return el.classList.contains(cls);
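clmod now returns whether it actually changed anything (and tolerates missing elements), which lets callers such as the tooltip's hide-handler detect "the tooltip really was visible just now". The same contract modeled on a python set:

    def clmod(classes, cls, add=None):
        """python model of the new contract: mutate only on change, report it"""
        if classes is None:
            return False
        have = cls in classes
        if add == "t":
            add = not have
        if bool(add) == have:
            return False  # already in the requested state
        (classes.add if add else classes.discard)(cls)
        return True

    c = {"show"}
    print(clmod(c, "show"))       # True  (removed it)
    print(clmod(c, "show"))       # False (nothing to do)
    print(clmod(c, "show", "t"))  # True  (toggled back on)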
@@ -319,14 +329,45 @@
 }

+function showsort(tab) {
+    var v, vn, v1, v2, th = tab.tHead,
+        sopts = jread('fsort', [["href", 1, ""]]);
+
+    th && (th = th.rows[0]) && (th = th.cells);
+
+    for (var a = sopts.length - 1; a >= 0; a--) {
+        if (!sopts[a][0])
+            continue;
+
+        v2 = v1;
+        v1 = sopts[a];
+    }
+
+    v = [v1, v2];
+    vn = [v1 ? v1[0] : '', v2 ? v2[0] : ''];
+
+    var ga = QSA('#ghead a[s]');
+    for (var a = 0; a < ga.length; a++)
+        ga[a].className = '';
+
+    for (var a = 0; a < th.length; a++) {
+        var n = vn.indexOf(th[a].getAttribute('name')),
+            cl = n < 0 ? ' ' : ' s' + n + (v[n][1] > 0 ? ' ' : 'r ');
+
+        th[a].className = th[a].className.replace(/ *s[01]r? */, ' ') + cl;
+        if (n + 1) {
+            ga = QS('#ghead a[s="' + vn[n] + '"]');
+            if (ga)
+                ga.className = cl;
+        }
+    }
+}
+
 function sortTable(table, col, cb) {
     var tb = table.tBodies[0],
         th = table.tHead.rows[0].cells,
         tr = Array.prototype.slice.call(tb.rows, 0),
-        i, reverse = th[col].className.indexOf('sort1') !== -1 ? -1 : 1;
+        i, reverse = /s0[^r]/.exec(th[col].className + ' ') ? -1 : 1;

-    for (var a = 0, thl = th.length; a < thl; a++)
-        th[a].className = th[a].className.replace(/ *sort-?1 */, " ");
-
-    th[col].className += ' sort' + reverse;
     var stype = th[col].getAttribute('sort');
     try {
         var nrules = [], rules = jread("fsort", []);
@@ -344,6 +385,7 @@ function sortTable(table, col, cb) {
                 break;
             }

         jwrite("fsort", nrules);
+        try { showsort(table); } catch (ex) { }
     }
     catch (ex) {
         console.log("failed to persist sort rules, resetting: " + ex);
@@ -782,13 +824,18 @@ var timer = (function () {
 var tt = (function () {
     var r = {
         "tt": mknod("div"),
+        "th": mknod("div"),
         "en": true,
         "el": null,
-        "skip": false
+        "skip": false,
+        "lvis": 0
     };

+    r.th.innerHTML = '?';
     r.tt.setAttribute('id', 'tt');
+    r.th.setAttribute('id', 'tth');
     document.body.appendChild(r.tt);
+    document.body.appendChild(r.th);

     var prev = null;
     r.cshow = function () {
@@ -798,11 +845,25 @@ var tt = (function () {
         prev = this;
     };

-    r.show = function () {
-        if (r.skip) {
-            r.skip = false;
+    var tev;
+    r.dshow = function (e) {
+        clearTimeout(tev);
+        if (!r.getmsg(this))
             return;
-        }

+        if (Date.now() - r.lvis < 400)
+            return r.show.bind(this)();
+
+        tev = setTimeout(r.show.bind(this), 800);
+        if (is_touch)
+            return;
+
+        this.addEventListener('mousemove', r.move);
+        clmod(r.th, 'act', 1);
+        r.move(e);
+    };
+
+    r.getmsg = function (el) {
         if (QS('body.bbox-open'))
             return;
@@ -810,7 +871,16 @@ var tt = (function () {
         if (cfg !== null && cfg != '1')
             return;

-        var msg = this.getAttribute('tt');
+        return el.getAttribute('tt');
+    };
+
+    r.show = function () {
+        clearTimeout(tev);
+        if (r.skip) {
+            r.skip = false;
+            return;
+        }
+
+        var msg = r.getmsg(this);
         if (!msg)
             return;
@@ -824,6 +894,7 @@ var tt = (function () {
         if (dir.indexOf('u') + 1) top = false;
         if (dir.indexOf('d') + 1) top = true;

+        clmod(r.th, 'act');
         clmod(r.tt, 'b', big);
         r.tt.style.left = '0';
         r.tt.style.top = '0';
@@ -849,14 +920,27 @@ var tt = (function () {
     r.hide = function (e) {
         ev(e);
+        clearTimeout(tev);
         window.removeEventListener('scroll', r.hide);
-        clmod(r.tt, 'show');
         clmod(r.tt, 'b');
+        clmod(r.th, 'act');
+        if (clmod(r.tt, 'show'))
+            r.lvis = Date.now();

         if (r.el)
             r.el.removeEventListener('mouseleave', r.hide);
+
+        if (e && e.target)
+            e.target.removeEventListener('mousemove', r.move);
     };

-    if (is_touch && IPHONE) {
+    r.move = function (e) {
+        r.th.style.left = (e.pageX + 12) + 'px';
+        r.th.style.top = (e.pageY + 12) + 'px';
+    };
+
+    if (IPHONE) {
         var f1 = r.show,
             f2 = r.hide,
             q = [];
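The delayed-tooltip logic above: hovering normally waits 800ms before showing, but if some tooltip was visible within the last 400ms (r.lvis) the next one shows immediately, so sweeping across a toolbar feels instant. The timing rule as a python sketch:

    import time

    lvis = 0.0  # wall-time when a tooltip was last visible

    def hover_delay():
        """seconds to wait before showing the next tooltip"""
        return 0.0 if time.time() - lvis < 0.4 else 0.8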
@@ -882,14 +966,14 @@ var tt = (function () {
     r.att = function (ctr) {
         var _cshow = r.en ? r.cshow : null,
-            _show = r.en ? r.show : null,
+            _dshow = r.en ? r.dshow : null,
             _hide = r.en ? r.hide : null,
             o = ctr.querySelectorAll('*[tt]');

         for (var a = o.length - 1; a >= 0; a--) {
             o[a].onfocus = _cshow;
             o[a].onblur = _hide;
-            o[a].onmouseenter = _show;
+            o[a].onmouseenter = _dshow;
             o[a].onmouseleave = _hide;
         }

         r.hide();


@@ -80,6 +80,12 @@ shab64() { sp=$1; f="$2"; v=0; sz=$(stat -c%s "$f"); while true; do w=$((v+sp*10
 command -v gdate && date() { gdate "$@"; }; while true; do t=$(date +%s.%N); (time wget http://127.0.0.1:3923/?ls -qO- | jq -C '.files[]|{sz:.sz,ta:.tags.artist,tb:.tags.".bpm"}|del(.[]|select(.==null))' | awk -F\" '/"/{t[$2]++} END {for (k in t){v=t[k];p=sprintf("%" (v+1) "s",v);gsub(/ /,"#",p);printf "\033[36m%s\033[33m%s ",k,p}}') 2>&1 | awk -v ts=$t 'NR==1{t1=$0} NR==2{sub(/.*0m/,"");sub(/s$/,"");t2=$0;c=2; if(t2>0.3){c=3} if(t2>0.8){c=1} } END{sub(/[0-9]{6}$/,"",ts);printf "%s \033[3%dm%s %s\033[0m\n",ts,c,t2,t1}'; sleep 0.1 || break; done

+##
+## track an up2k upload and print all chunks in file-order
+
+grep '"name": "2021-07-18 02-17-59.mkv"' fug.log | head -n 1 | sed -r 's/.*"hash": \[//; s/\].*//' | tr '"' '\n' | grep -E '^[a-zA-Z0-9_-]{44}$' | while IFS= read -r cid; do cat -n fug.log | grep -vF '"purl": "' | grep -- "$cid"; echo; done | stdbuf -oL tr '\t' ' ' | while IFS=' ' read -r ln _ _ _ _ _ ts ip port msg; do [ -z "$msg" ] && echo && continue; printf '%6s [%s] [%s] %s\n' $ln "$ts" "$ip $port" "$msg"; read -r ln _ _ _ _ _ ts ip port msg < <(cat -n fug.log | tail -n +$((ln+1)) | grep -F "$ip $port" | head -n 1); printf '%6s [%s] [%s] %s\n' $ln "$ts" "$ip $port" "$msg"; done
+
 ##
 ## js oneliners
@@ -185,8 +191,13 @@ about:config >> devtools.debugger.prefs-schema-version = -1
 git pull; git reset --hard origin/HEAD && git log --format=format:"%H %ai %d" --decorate=full > ../revs && cat ../{util,browser,up2k}.js >../vr && cat ../revs | while read -r rev extra; do (git reset --hard $rev >/dev/null 2>/dev/null && dsz=$(cat copyparty/web/{util,browser,up2k}.js >../vg 2>/dev/null && diff -wNarU0 ../{vg,vr} | wc -c) && printf '%s %6s %s\n' "$rev" $dsz "$extra") </dev/null; done

 # download all sfx versions
-curl https://api.github.com/repos/9001/copyparty/releases?per_page=100 | jq -r '.[] | .tag_name + " " + .name' | tr -d '\r' | while read v t; do fn="copyparty $v $t.py"; [ -e "$fn" ] || curl https://github.com/9001/copyparty/releases/download/$v/copyparty-sfx.py -Lo "$fn"; done
+curl https://api.github.com/repos/9001/copyparty/releases?per_page=100 | jq -r '.[] | .tag_name + " " + .name' | tr -d '\r' | while read v t; do fn="$(printf '%s\n' "copyparty $v $t.py" | tr / -)"; [ -e "$fn" ] || curl https://github.com/9001/copyparty/releases/download/$v/copyparty-sfx.py -Lo "$fn"; done

+# push to multiple git remotes
+git config -l | grep '^remote'
+git remote add all git@github.com:9001/copyparty.git
+git remote set-url --add --push all git@gitlab.com:9001/copyparty.git
+git remote set-url --add --push all git@github.com:9001/copyparty.git
+
 ##
 ## http 206


@@ -140,10 +140,10 @@ repack() {
 }

 repack sfx-full "re gz no-sh"
-repack sfx-ent "re no-dd no-ogv"
-repack sfx-ent "re no-dd no-ogv gz no-sh"
-repack sfx-lite "re no-dd no-ogv no-cm no-hl"
-repack sfx-lite "re no-dd no-ogv no-cm no-hl gz no-sh"
+repack sfx-ent "re no-dd"
+repack sfx-ent "re no-dd gz no-sh"
+repack sfx-lite "re no-dd no-cm no-hl"
+repack sfx-lite "re no-dd no-cm no-hl gz no-sh"

 # move fuse and up2k clients into copyparty-extras/,


@@ -3,7 +3,6 @@ WORKDIR /z
 ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
     ver_hashwasm=4.9.0 \
     ver_marked=3.0.4 \
-    ver_ogvjs=1.8.4 \
     ver_mde=2.15.0 \
     ver_codemirror=5.62.3 \
     ver_fontawesome=5.13.0 \
@@ -15,7 +14,6 @@ ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
 RUN mkdir -p /z/dist/no-pk \
     && wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
     && apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev brotli py3-brotli \
-    && wget https://github.com/brion/ogv.js/releases/download/$ver_ogvjs/ogvjs-$ver_ogvjs.zip -O ogvjs.zip \
     && wget https://github.com/openpgpjs/asmcrypto.js/archive/$ver_asmcrypto.tar.gz -O asmcrypto.tgz \
     && wget https://github.com/markedjs/marked/archive/v$ver_marked.tar.gz -O marked.tgz \
     && wget https://github.com/Ionaru/easy-markdown-editor/archive/$ver_mde.tar.gz -O mde.tgz \
@@ -23,7 +21,6 @@ RUN mkdir -p /z/dist/no-pk \
     && wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
     && wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
     && wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
-    && unzip ogvjs.zip \
     && (mkdir hash-wasm \
         && cd hash-wasm \
         && unzip ../hash-wasm.zip) \
@@ -77,21 +74,6 @@ RUN cd hash-wasm \
     && mv sha512.umd.min.js /z/dist/sha512.hw.js


-# build ogvjs
-RUN cd ogvjs-$ver_ogvjs \
-    && cp -pv \
-        ogv-worker-audio.js \
-        ogv-demuxer-ogg-wasm.js \
-        ogv-demuxer-ogg-wasm.wasm \
-        ogv-decoder-audio-opus-wasm.js \
-        ogv-decoder-audio-opus-wasm.wasm \
-        ogv-decoder-audio-vorbis-wasm.js \
-        ogv-decoder-audio-vorbis-wasm.wasm \
-        /z/dist \
-    && cp -pv \
-        ogv-es2017.js /z/dist/ogv.js
-
-
 # build marked
 COPY marked.patch /z/
 COPY marked-ln.patch /z/


@@ -16,9 +16,6 @@ help() { exec cat <<'EOF'
 #
 # `no-sh` makes just the python sfx, skips the sh/unix sfx
 #
-# `no-ogv` saves ~192k by removing the opus/vorbis audio codecs
-#   (only affects apple devices; everything else has native support)
-#
 # `no-cm` saves ~82k by removing easymde/codemirror
 #   (the fancy markdown editor)
 #
@@ -75,7 +72,6 @@ while [ ! -z "$1" ]; do
         clean)  clean=1  ; ;;
         re)     repack=1 ; ;;
         gz)     use_gz=1 ; ;;
-        no-ogv) no_ogv=1 ; ;;
         no-fnt) no_fnt=1 ; ;;
         no-hl)  no_hl=1  ; ;;
         no-dd)  no_dd=1  ; ;;
@@ -218,9 +214,6 @@ cat have | while IFS= read -r x; do
 done
 rm have

-[ $no_ogv ] &&
-    rm -rf copyparty/web/deps/{dynamicaudio,ogv}*
-
 [ $no_cm ] && {
     rm -rf copyparty/web/mde.* copyparty/web/deps/easymde*
     echo h > copyparty/web/mde.html


@@ -7,8 +7,9 @@ v=$1
 printf '%s\n' "$v" | grep -qE '^[0-9\.]+$' || exit 1
 grep -E "(${v//./, })" ../copyparty/__version__.py || exit 1

+git push all
 git tag v$v
-git push origin --tags
+git push all --tags

 rm -rf ../dist


@@ -49,14 +49,6 @@ copyparty/web/deps/easymde.js,
 copyparty/web/deps/marked.js,
 copyparty/web/deps/mini-fa.css,
 copyparty/web/deps/mini-fa.woff,
-copyparty/web/deps/ogv-decoder-audio-opus-wasm.js,
-copyparty/web/deps/ogv-decoder-audio-opus-wasm.wasm,
-copyparty/web/deps/ogv-decoder-audio-vorbis-wasm.js,
-copyparty/web/deps/ogv-decoder-audio-vorbis-wasm.wasm,
-copyparty/web/deps/ogv-demuxer-ogg-wasm.js,
-copyparty/web/deps/ogv-demuxer-ogg-wasm.wasm,
-copyparty/web/deps/ogv-worker-audio.js,
-copyparty/web/deps/ogv.js,
 copyparty/web/deps/prism.js,
 copyparty/web/deps/prism.css,
 copyparty/web/deps/prismd.css,