Compare commits
120 Commits
Commit SHAs:

016dba4ca9 39c7ef305f 849c1dc848 61414014fe 578a915884 eacafb8a63
4446760f74 6da2a083f9 8837c8f822 bac301ed66 061db3906d fd7df5c952
a270019147 55e0209901 2b255fbbed 8a2345a0fb bfa9f535aa f757623ad8
3c7465e268 108665fc4f ed519c9138 2dd2e2c57e 6c3a976222 80cc26bd95
970fb84fd8 20cbcf6931 8fcde2a579 b32d1f8ad3 03513e0cb1 e041a2b197
d7d625be2a 4121266678 22971a6be4 efbf8d7e0d 397396ea4a e59b077c21
4bc39f3084 21c3570786 2f85c1fb18 1e27a4c2df 456f575637 51546c9e64
83b4b70ef4 a5120d4f6f c95941e14f 0dd531149d 67da1b5219 919bd16437
ecead109ab 765294c263 d6b5351207 a2009bcc6b 12709a8a0a c055baefd2
56522599b5 664f53b75d 87200d9f10 5c3d0b6520 bd49979f4a 7e606cdd9f
8b4b7fa794 05345ddf8b 66adb470ad e15c8fd146 0f09b98a39 b4d6f4e24d
3217fa625b e719ff8a47 9fcf528d45 1ddbf5a158 64bf4574b0 5649d26077
92f923effe 0d46d548b9 062df3f0c3 789fb53b8e 351db5a18f aabbd271c8
aae8e0171e 45827a2458 726030296f 6659ab3881 c6a103609e c6b3f035e5
2b0a7e378e b75ce909c8 229c3f5dab ec73094506 c7650c9326 d94c6d4e72
3cc8760733 a2f6973495 f8648fa651 177aa038df e0a14ec881 9366512f2f
ea38b8041a f1870daf0d 9722441aad 9d014087f4 83b4038b85 1e0a448feb
fb81de3b36 aa4f352301 f1a1c2ea45 6249bd4163 2579dc64ce 356512270a
bed27f2b43 54013d861b ec100210dc 3ab1acf32c 8c28266418 7f8b8dcb92
6dd39811d4 35e2138e3e 239b4e9fe6 2fcd0e7e72 357347ce3a 36dc1107fb
79 README.md
@@ -20,8 +20,10 @@ turn your phone or raspi into a portable file server with resumable uploads/down

* top
* [quickstart](#quickstart)
* [on debian](#on-debian)
* [notes](#notes)
* [status](#status)
* [testimonials](#testimonials)
* [bugs](#bugs)
* [general bugs](#general-bugs)
* [not my bugs](#not-my-bugs)

@@ -44,6 +46,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down

* [browser support](#browser-support)
* [client examples](#client-examples)
* [up2k](#up2k)
* [performance](#performance)
* [dependencies](#dependencies)
* [optional dependencies](#optional-dependencies)
* [install recommended deps](#install-recommended-deps)

@@ -68,6 +71,7 @@ some recommended options:

* `-e2dsa` enables general file indexing, see [search configuration](#search-configuration)
* `-e2ts` enables audio metadata indexing (needs either FFprobe or mutagen), see [optional dependencies](#optional-dependencies)
* `-v /mnt/music:/music:r:afoo -a foo:bar` shares `/mnt/music` as `/music`, `r`eadable by anyone, with user `foo` as `a`dmin (read/write), password `bar`
  * the syntax is `-v src:dst:perm:perm:...` so local-path, url-path, and one or more permissions to set
  * replace `:r:afoo` with `:rfoo` to only make the folder readable by `foo` and nobody else
  * in addition to `r`ead and `a`dmin, `w`rite makes a folder write-only, so cannot list/access files in it
* `--ls '**,*,ln,p,r'` to crash on startup if any of the volumes contain a symlink which point outside the volume, as that could give users unintended access
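For illustration, a minimal sketch of how the options above might be combined into a single invocation; the flags are the ones documented above, while the `copyparty` entry-point name and the example account/paths are assumptions:

```sh
# hypothetical invocation combining the recommended options above;
# entry-point name, account, and paths are placeholders
copyparty \
  -e2dsa -e2ts \
  -a foo:bar \
  -v /mnt/music:/music:r:afoo \
  --ls '**,*,ln,p,r'
```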
@@ -77,6 +81,19 @@ you may also want these, especially on servers:

* [contrib/nginx/copyparty.conf](contrib/nginx/copyparty.conf) to reverse-proxy behind nginx (for better https)


### on debian

recommended steps to enable audio metadata and thumbnails (from images and videos):

* as root, run the following:
  `apt install python3 python3-pip python3-dev ffmpeg`

* then, as the user which will be running copyparty (so hopefully not root), run this:
  `python3 -m pip install --user -U Pillow pillow-avif-plugin`

(skipped `pyheif-pillow-opener` because apparently debian is too old to build it)


## notes

general:

@@ -111,7 +128,7 @@ summary: all planned features work! now please enjoy the bloatening

* ☑ FUSE client (read-only)
* browser
  * ☑ tree-view
  * ☑ audio player
  * ☑ audio player (with OS media controls)
  * ☑ thumbnails
    * ☑ images using Pillow
    * ☑ videos using FFmpeg

@@ -128,6 +145,13 @@ summary: all planned features work! now please enjoy the bloatening

* ☑ editor (sure why not)


## testimonials

small collection of user feedback

`good enough`, `surprisingly correct`, `certified good software`, `just works`, `why`


# bugs

* Windows: python 3.7 and older cannot read tags with ffprobe, so use mutagen or upgrade

@@ -139,6 +163,8 @@ summary: all planned features work! now please enjoy the bloatening

* all volumes must exist / be available on startup; up2k (mtp especially) gets funky otherwise
  * cannot mount something at `/d1/d2/d3` unless `d2` exists inside `d1`
* dupe files will not have metadata (audio tags etc) displayed in the file listing
  * because they don't get `up` entries in the db (probably best fix) and `tx_browser` does not `lstat`
* probably more, pls let me know


## not my bugs
@@ -168,25 +194,34 @@ summary: all planned features work! now please enjoy the bloatening

## hotkeys

the browser has the following hotkeys

* `B` toggle breadcrumbs / directory tree
* `I/K` prev/next folder
* `P` parent folder
* `M` parent folder
* `G` toggle list / grid view
* `T` toggle thumbnails / icons
* when playing audio:
  * `0..9` jump to 10%..90%
  * `U/O` skip 10sec back/forward
  * `J/L` prev/next song
  * `M` play/pause (also starts playing the folder)
  * `U/O` skip 10sec back/forward
  * `0..9` jump to 10%..90%
  * `P` play/pause (also starts playing the folder)
* when viewing images / playing videos:
  * `J/L, Left/Right` prev/next file
  * `Home/End` first/last file
  * `U/O` skip 10sec back/forward
  * `P/K/Space` play/pause video
  * `Esc` close viewer
* when tree-sidebar is open:
  * `A/D` adjust tree width
* in the grid view:
  * `S` toggle multiselect
  * `A/D` zoom
  * shift+`A/D` zoom


## tree-mode

by default there's a breadcrumbs path; you can replace this with a tree-browser sidebar thing by clicking the 🌲
by default there's a breadcrumbs path; you can replace this with a tree-browser sidebar thing by clicking the `🌲` or pressing the `B` hotkey

click `[-]` and `[+]` to adjust the size, and the `[a]` toggles if the tree should widen dynamically as you go deeper or stay fixed-size
click `[-]` and `[+]` (or hotkeys `A`/`D`) to adjust the size, and the `[a]` toggles if the tree should widen dynamically as you go deeper or stay fixed-size


## thumbnails

@@ -197,6 +232,8 @@ it does static images with Pillow and uses FFmpeg for video files, so you may wa

images named `folder.jpg` and `folder.png` become the thumbnail of the folder they're in

in the grid/thumbnail view, if the audio player panel is open, songs will start playing when clicked


## zip downloads
@@ -280,6 +317,8 @@ up2k has saved a few uploads from becoming corrupted in-transfer already; caught

* you can link a particular timestamp in an audio file by adding it to the URL, such as `&20` / `&20s` / `&1m20` / `&t=1:20` after the `.../#af-c8960dab`

* if you are using media hotkeys to switch songs and are getting tired of seeing the OSD popup which Windows doesn't let you disable, consider https://ocv.me/dev/?media-osd-bgone.ps1


# searching

@@ -462,6 +501,23 @@ quick outline of the up2k protocol, see [uploading](#uploading) for the web-clie

* client does another handshake with the hashlist; server replies with OK or a list of chunks to reupload


# performance

defaults are good for most cases, don't mind the `cannot efficiently use multiple CPU cores` message, it's very unlikely to be a problem

below are some tweaks roughly ordered by usefulness:

* `-q` disables logging and can help a bunch, even when combined with `-lo` to redirect logs to file
* `--http-only` or `--https-only` (unless you want to support both protocols) will reduce the delay before a new connection is established
* `--hist` pointing to a fast location (ssd) will make directory listings and searches faster when `-e2d` or `-e2t` is set
* `--no-hash` when indexing a networked disk if you don't care about the actual filehashes and only want the names/tags searchable
* `-j` enables multiprocessing (actual multithreading) and can make copyparty perform better in cpu-intensive workloads, for example:
  * huge amount of short-lived connections
  * really heavy traffic (downloads/uploads)

  ...however it adds an overhead to internal communication so it might be a net loss, see if it works 4 u
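As a rough sketch, the tweaks above might be combined like this; the flags are the ones listed above, while the entry-point name, log path, history location, and core count are assumptions chosen for illustration:

```sh
# hypothetical example combining the performance flags above;
# entry-point name, paths, and core count are placeholders
copyparty -q -lo cpp-%Y-%m%d-%H%M%S.txt.xz \
  --http-only \
  --hist /mnt/ssd/copyparty-hist \
  --no-hash \
  -j 4
```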
# dependencies

* `jinja2` (is built into the SFX)

@@ -591,13 +647,15 @@ in the `scripts` folder:

roughly sorted by priority

* readme.md as epilogue
* single sha512 across all up2k chunks? maybe
* reduce up2k roundtrips
  * start from a chunk index and just go
  * terminate client on bad data
* logging to file

discarded ideas

* single sha512 across all up2k chunks?
  * crypto.subtle cannot into streaming, would have to use hashwasm, expensive
* separate sqlite table per tag
  * performance fixed by skipping some indexes (`+mt.k`)
* audio fingerprinting

@@ -612,3 +670,6 @@ discarded ideas

* nah
* look into android thumbnail cache file format
  * absolutely not
* indexedDB for hashes, cfg enable/clear/sz, 2gb avail, ~9k for 1g, ~4k for 100m, 500k items before autoeviction
* blank hashlist when up-ok to skip handshake
  * too many confusing side-effects
@@ -48,15 +48,16 @@ you could replace winfsp with [dokan](https://github.com/dokan-dev/dokany/releas


# [`dbtool.py`](dbtool.py)

upgrade utility which can show db info and help transfer data between databases, for example when a new version of copyparty recommends to wipe the DB and reindex because it now collects additional metadata during analysis, but you have some really expensive `-mtp` parsers and want to copy over the tags from the old db
upgrade utility which can show db info and help transfer data between databases, for example when a new version of copyparty is incompatible with the old DB and automatically rebuilds the DB from scratch, but you have some really expensive `-mtp` parsers and want to copy over the tags from the old db

for that example (upgrading to v0.11.0), first move the old db aside, launch copyparty, let it rebuild the db until the point where it starts running mtp (colored messages as it adds the mtp tags), then CTRL-C and patch in the old mtp tags from the old db instead
for that example (upgrading to v0.11.20), first launch the new version of copyparty like usual, let it make a backup of the old db and rebuild the new db until the point where it starts running mtp (colored messages as it adds the mtp tags), that's when you hit CTRL-C and patch in the old mtp tags from the old db instead

so assuming you have `-mtp` parsers to provide the tags `key` and `.bpm`:

```
~/bin/dbtool.py -ls up2k.db
~/bin/dbtool.py -src up2k.db.v0.10.22 up2k.db -cmp
~/bin/dbtool.py -src up2k.db.v0.10.22 up2k.db -rm-mtp-flag -copy key
~/bin/dbtool.py -src up2k.db.v0.10.22 up2k.db -rm-mtp-flag -copy .bpm -vac
cd /mnt/nas/music/.hist
~/src/copyparty/bin/dbtool.py -ls up2k.db
~/src/copyparty/bin/dbtool.py -src up2k.*.v3 up2k.db -cmp
~/src/copyparty/bin/dbtool.py -src up2k.*.v3 up2k.db -rm-mtp-flag -copy key
~/src/copyparty/bin/dbtool.py -src up2k.*.v3 up2k.db -rm-mtp-flag -copy .bpm -vac
```
@@ -345,7 +345,7 @@ class Gateway(object):
|
||||
except:
|
||||
pass
|
||||
|
||||
def sendreq(self, *args, headers={}, **kwargs):
|
||||
def sendreq(self, meth, path, headers, **kwargs):
|
||||
if self.password:
|
||||
headers["Cookie"] = "=".join(["cppwd", self.password])
|
||||
|
||||
@@ -354,21 +354,21 @@ class Gateway(object):
|
||||
if c.rx_path:
|
||||
raise Exception()
|
||||
|
||||
c.request(*list(args), headers=headers, **kwargs)
|
||||
c.request(meth, path, headers=headers, **kwargs)
|
||||
c.rx = c.getresponse()
|
||||
return c
|
||||
except:
|
||||
tid = threading.current_thread().ident
|
||||
dbg(
|
||||
"\033[1;37;44mbad conn {:x}\n {}\n {}\033[0m".format(
|
||||
tid, " ".join(str(x) for x in args), c.rx_path if c else "(null)"
|
||||
"\033[1;37;44mbad conn {:x}\n {} {}\n {}\033[0m".format(
|
||||
tid, meth, path, c.rx_path if c else "(null)"
|
||||
)
|
||||
)
|
||||
|
||||
self.closeconn(c)
|
||||
c = self.getconn()
|
||||
try:
|
||||
c.request(*list(args), headers=headers, **kwargs)
|
||||
c.request(meth, path, headers=headers, **kwargs)
|
||||
c.rx = c.getresponse()
|
||||
return c
|
||||
except:
|
||||
@@ -386,7 +386,7 @@ class Gateway(object):
|
||||
path = dewin(path)
|
||||
|
||||
web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?dots"
|
||||
c = self.sendreq("GET", web_path)
|
||||
c = self.sendreq("GET", web_path, {})
|
||||
if c.rx.status != 200:
|
||||
self.closeconn(c)
|
||||
log(
|
||||
@@ -440,7 +440,7 @@ class Gateway(object):
|
||||
)
|
||||
)
|
||||
|
||||
c = self.sendreq("GET", web_path, headers={"Range": hdr_range})
|
||||
c = self.sendreq("GET", web_path, {"Range": hdr_range})
|
||||
if c.rx.status != http.client.PARTIAL_CONTENT:
|
||||
self.closeconn(c)
|
||||
raise Exception(
|
||||
|
||||
@@ -54,10 +54,13 @@ MACOS = platform.system() == "Darwin"
|
||||
info = log = dbg = None
|
||||
|
||||
|
||||
print("{} v{} @ {}".format(
|
||||
platform.python_implementation(),
|
||||
".".join([str(x) for x in sys.version_info]),
|
||||
sys.executable))
|
||||
print(
|
||||
"{} v{} @ {}".format(
|
||||
platform.python_implementation(),
|
||||
".".join([str(x) for x in sys.version_info]),
|
||||
sys.executable,
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
try:
|
||||
@@ -299,14 +302,14 @@ class Gateway(object):
|
||||
except:
|
||||
pass
|
||||
|
||||
def sendreq(self, *args, headers={}, **kwargs):
|
||||
def sendreq(self, meth, path, headers, **kwargs):
|
||||
tid = get_tid()
|
||||
if self.password:
|
||||
headers["Cookie"] = "=".join(["cppwd", self.password])
|
||||
|
||||
try:
|
||||
c = self.getconn(tid)
|
||||
c.request(*list(args), headers=headers, **kwargs)
|
||||
c.request(meth, path, headers=headers, **kwargs)
|
||||
return c.getresponse()
|
||||
except:
|
||||
dbg("bad conn")
|
||||
@@ -314,7 +317,7 @@ class Gateway(object):
|
||||
self.closeconn(tid)
|
||||
try:
|
||||
c = self.getconn(tid)
|
||||
c.request(*list(args), headers=headers, **kwargs)
|
||||
c.request(meth, path, headers=headers, **kwargs)
|
||||
return c.getresponse()
|
||||
except:
|
||||
info("http connection failed:\n" + traceback.format_exc())
|
||||
@@ -331,7 +334,7 @@ class Gateway(object):
|
||||
path = dewin(path)
|
||||
|
||||
web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?dots&ls"
|
||||
r = self.sendreq("GET", web_path)
|
||||
r = self.sendreq("GET", web_path, {})
|
||||
if r.status != 200:
|
||||
self.closeconn()
|
||||
log(
|
||||
@@ -368,7 +371,7 @@ class Gateway(object):
|
||||
)
|
||||
)
|
||||
|
||||
r = self.sendreq("GET", web_path, headers={"Range": hdr_range})
|
||||
r = self.sendreq("GET", web_path, {"Range": hdr_range})
|
||||
if r.status != http.client.PARTIAL_CONTENT:
|
||||
self.closeconn()
|
||||
raise Exception(
|
||||
|
||||
@@ -60,7 +60,7 @@ def main():
|
||||
try:
|
||||
det(tf)
|
||||
except:
|
||||
pass
|
||||
pass # mute
|
||||
finally:
|
||||
os.unlink(tf)
|
||||
|
||||
|
||||
123 bin/mtag/audio-key-slicing.py (executable file)
@@ -0,0 +1,123 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import re
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
import subprocess as sp
|
||||
|
||||
import keyfinder
|
||||
|
||||
from copyparty.util import fsenc
|
||||
|
||||
"""
|
||||
dep: github/mixxxdj/libkeyfinder
|
||||
dep: pypi/keyfinder
|
||||
dep: ffmpeg
|
||||
|
||||
note: this is a janky edition of the regular audio-key.py,
|
||||
slicing the files at 20sec intervals and keeping 5sec from each,
|
||||
surprisingly accurate but still garbage (446 ok, 69 bad, 13% miss)
|
||||
|
||||
it is fast tho
|
||||
"""
|
||||
|
||||
|
||||
def get_duration():
|
||||
# TODO provide ffprobe tags to mtp as json
|
||||
|
||||
# fmt: off
|
||||
dur = sp.check_output([
|
||||
"ffprobe",
|
||||
"-hide_banner",
|
||||
"-v", "fatal",
|
||||
"-show_streams",
|
||||
"-show_format",
|
||||
fsenc(sys.argv[1])
|
||||
])
|
||||
# fmt: on
|
||||
|
||||
dur = dur.decode("ascii", "replace").split("\n")
|
||||
dur = [x.split("=")[1] for x in dur if x.startswith("duration=")]
|
||||
dur = [float(x) for x in dur if re.match(r"^[0-9\.,]+$", x)]
|
||||
return list(sorted(dur))[-1] if dur else None
|
||||
|
||||
|
||||
def get_segs(dur):
|
||||
# keep first 5s of each 20s,
|
||||
# keep entire last segment
|
||||
ofs = 0
|
||||
segs = []
|
||||
while True:
|
||||
seg = [ofs, 5]
|
||||
segs.append(seg)
|
||||
if dur - ofs < 20:
|
||||
seg[-1] = int(dur - seg[0])
|
||||
break
|
||||
|
||||
ofs += 20
|
||||
|
||||
return segs
|
||||
|
||||
|
||||
def slice(tf):
|
||||
dur = get_duration()
|
||||
dur = min(dur, 600) # max 10min
|
||||
segs = get_segs(dur)
|
||||
|
||||
# fmt: off
|
||||
cmd = [
|
||||
"ffmpeg",
|
||||
"-nostdin",
|
||||
"-hide_banner",
|
||||
"-v", "fatal",
|
||||
"-y"
|
||||
]
|
||||
|
||||
for seg in segs:
|
||||
cmd.extend([
|
||||
"-ss", str(seg[0]),
|
||||
"-i", fsenc(sys.argv[1])
|
||||
])
|
||||
|
||||
filt = ""
|
||||
for n, seg in enumerate(segs):
|
||||
filt += "[{}:a:0]atrim=duration={}[a{}]; ".format(n, seg[1], n)
|
||||
|
||||
prev = "a0"
|
||||
for n in range(1, len(segs)):
|
||||
nxt = "b{}".format(n)
|
||||
filt += "[{}][a{}]acrossfade=d=0.5[{}]; ".format(prev, n, nxt)
|
||||
prev = nxt
|
||||
|
||||
cmd.extend([
|
||||
"-filter_complex", filt[:-2],
|
||||
"-map", "[{}]".format(nxt),
|
||||
"-sample_fmt", "s16",
|
||||
tf
|
||||
])
|
||||
# fmt: on
|
||||
|
||||
# print(cmd)
|
||||
sp.check_call(cmd)
|
||||
|
||||
|
||||
def det(tf):
|
||||
slice(tf)
|
||||
print(keyfinder.key(tf).camelot())
|
||||
|
||||
|
||||
def main():
|
||||
with tempfile.NamedTemporaryFile(suffix=".flac", delete=False) as f:
|
||||
f.write(b"h")
|
||||
tf = f.name
|
||||
|
||||
try:
|
||||
det(tf)
|
||||
finally:
|
||||
os.unlink(tf)
|
||||
pass
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
@@ -1,18 +1,54 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
import subprocess as sp
|
||||
import keyfinder
|
||||
|
||||
from copyparty.util import fsenc
|
||||
|
||||
"""
|
||||
dep: github/mixxxdj/libkeyfinder
|
||||
dep: pypi/keyfinder
|
||||
dep: ffmpeg
|
||||
|
||||
note: cannot fsenc
|
||||
"""
|
||||
|
||||
|
||||
try:
|
||||
print(keyfinder.key(sys.argv[1]).camelot())
|
||||
except:
|
||||
pass
|
||||
# tried trimming the first/last 5th, bad idea,
|
||||
# misdetects 9a law field (Sphere Caliber) as 10b,
|
||||
# obvious when mixing 9a ghostly parapara ship
|
||||
|
||||
|
||||
def det(tf):
|
||||
# fmt: off
|
||||
sp.check_call([
|
||||
"ffmpeg",
|
||||
"-nostdin",
|
||||
"-hide_banner",
|
||||
"-v", "fatal",
|
||||
"-y", "-i", fsenc(sys.argv[1]),
|
||||
"-t", "300",
|
||||
"-sample_fmt", "s16",
|
||||
tf
|
||||
])
|
||||
# fmt: on
|
||||
|
||||
print(keyfinder.key(tf).camelot())
|
||||
|
||||
|
||||
def main():
|
||||
with tempfile.NamedTemporaryFile(suffix=".flac", delete=False) as f:
|
||||
f.write(b"h")
|
||||
tf = f.name
|
||||
|
||||
try:
|
||||
det(tf)
|
||||
except:
|
||||
pass # mute
|
||||
finally:
|
||||
os.unlink(tf)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
|
||||
@@ -1,7 +1,15 @@

# when running copyparty behind a reverse-proxy,
# make sure that copyparty allows at least as many clients as the proxy does,
# so run copyparty with -nc 512 if your nginx has the default limits
# (worker_processes 1, worker_connections 512)
# when running copyparty behind a reverse proxy,
# the following arguments are recommended:
#
# -nc 512        important, see next paragraph
# --http-only    lower latency on initial connection
# -i 127.0.0.1   only accept connections from nginx
#
# -nc must match or exceed the webserver's max number of concurrent clients;
# nginx default is 512 (worker_processes 1, worker_connections 512)
#
# you may also consider adding -j0 for CPU-intensive configurations
# (not that i can really think of any good examples)

upstream cpp {
    server 127.0.0.1:3923;
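A minimal sketch of the copyparty side of this setup, using the arguments recommended in the comment above (the `copyparty` entry-point name is an assumption):

```sh
# copyparty behind nginx, per the recommendations above;
# entry-point name is a placeholder
copyparty -nc 512 --http-only -i 127.0.0.1
```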
@@ -9,6 +9,9 @@ import os
|
||||
PY2 = sys.version_info[0] == 2
|
||||
if PY2:
|
||||
sys.dont_write_bytecode = True
|
||||
unicode = unicode
|
||||
else:
|
||||
unicode = str
|
||||
|
||||
WINDOWS = False
|
||||
if platform.system() == "Windows":
|
||||
|
||||
@@ -20,10 +20,10 @@ import threading
|
||||
import traceback
|
||||
from textwrap import dedent
|
||||
|
||||
from .__init__ import E, WINDOWS, VT100, PY2
|
||||
from .__init__ import E, WINDOWS, VT100, PY2, unicode
|
||||
from .__version__ import S_VERSION, S_BUILD_DT, CODENAME
|
||||
from .svchub import SvcHub
|
||||
from .util import py_desc, align_tab, IMPLICATIONS, alltrace
|
||||
from .util import py_desc, align_tab, IMPLICATIONS
|
||||
|
||||
HAVE_SSL = True
|
||||
try:
|
||||
@@ -31,6 +31,8 @@ try:
|
||||
except:
|
||||
HAVE_SSL = False
|
||||
|
||||
printed = ""
|
||||
|
||||
|
||||
class RiceFormatter(argparse.HelpFormatter):
|
||||
def _get_help_string(self, action):
|
||||
@@ -61,8 +63,15 @@ class Dodge11874(RiceFormatter):
|
||||
super(Dodge11874, self).__init__(*args, **kwargs)
|
||||
|
||||
|
||||
def lprint(*a, **ka):
|
||||
global printed
|
||||
|
||||
printed += " ".join(unicode(x) for x in a) + ka.get("end", "\n")
|
||||
print(*a, **ka)
|
||||
|
||||
|
||||
def warn(msg):
|
||||
print("\033[1mwarning:\033[0;33m {}\033[0m\n".format(msg))
|
||||
lprint("\033[1mwarning:\033[0;33m {}\033[0m\n".format(msg))
|
||||
|
||||
|
||||
def ensure_locale():
|
||||
@@ -73,7 +82,7 @@ def ensure_locale():
|
||||
]:
|
||||
try:
|
||||
locale.setlocale(locale.LC_ALL, x)
|
||||
print("Locale:", x)
|
||||
lprint("Locale:", x)
|
||||
break
|
||||
except:
|
||||
continue
|
||||
@@ -94,7 +103,7 @@ def ensure_cert():
|
||||
|
||||
try:
|
||||
if filecmp.cmp(cert_cfg, cert_insec):
|
||||
print(
|
||||
lprint(
|
||||
"\033[33m using default TLS certificate; https will be insecure."
|
||||
+ "\033[36m\n certificate location: {}\033[0m\n".format(cert_cfg)
|
||||
)
|
||||
@@ -123,7 +132,7 @@ def configure_ssl_ver(al):
|
||||
if "help" in sslver:
|
||||
avail = [terse_sslver(x[6:]) for x in flags]
|
||||
avail = " ".join(sorted(avail) + ["all"])
|
||||
print("\navailable ssl/tls versions:\n " + avail)
|
||||
lprint("\navailable ssl/tls versions:\n " + avail)
|
||||
sys.exit(0)
|
||||
|
||||
al.ssl_flags_en = 0
|
||||
@@ -143,7 +152,7 @@ def configure_ssl_ver(al):
|
||||
|
||||
for k in ["ssl_flags_en", "ssl_flags_de"]:
|
||||
num = getattr(al, k)
|
||||
print("{}: {:8x} ({})".format(k, num, num))
|
||||
lprint("{}: {:8x} ({})".format(k, num, num))
|
||||
|
||||
# think i need that beer now
|
||||
|
||||
@@ -160,13 +169,13 @@ def configure_ssl_ciphers(al):
|
||||
try:
|
||||
ctx.set_ciphers(al.ciphers)
|
||||
except:
|
||||
print("\n\033[1;31mfailed to set ciphers\033[0m\n")
|
||||
lprint("\n\033[1;31mfailed to set ciphers\033[0m\n")
|
||||
|
||||
if not hasattr(ctx, "get_ciphers"):
|
||||
print("cannot read cipher list: openssl or python too old")
|
||||
lprint("cannot read cipher list: openssl or python too old")
|
||||
else:
|
||||
ciphers = [x["description"] for x in ctx.get_ciphers()]
|
||||
print("\n ".join(["\nenabled ciphers:"] + align_tab(ciphers) + [""]))
|
||||
lprint("\n ".join(["\nenabled ciphers:"] + align_tab(ciphers) + [""]))
|
||||
|
||||
if is_help:
|
||||
sys.exit(0)
|
||||
@@ -182,16 +191,6 @@ def sighandler(sig=None, frame=None):
|
||||
print("\n".join(msg))
|
||||
|
||||
|
||||
def stackmon(fp, ival):
|
||||
ctr = 0
|
||||
while True:
|
||||
ctr += 1
|
||||
time.sleep(ival)
|
||||
st = "{}, {}\n{}".format(ctr, time.time(), alltrace())
|
||||
with open(fp, "wb") as f:
|
||||
f.write(st.encode("utf-8", "replace"))
|
||||
|
||||
|
||||
def run_argparse(argv, formatter):
|
||||
ap = argparse.ArgumentParser(
|
||||
formatter_class=formatter,
|
||||
@@ -249,30 +248,32 @@ def run_argparse(argv, formatter):
|
||||
),
|
||||
)
|
||||
# fmt: off
|
||||
ap.add_argument("-c", metavar="PATH", type=str, action="append", help="add config file")
|
||||
ap.add_argument("-nc", metavar="NUM", type=int, default=64, help="max num clients")
|
||||
ap.add_argument("-j", metavar="CORES", type=int, default=1, help="max num cpu cores")
|
||||
ap.add_argument("-a", metavar="ACCT", type=str, action="append", help="add account, USER:PASS; example [ed:wark")
|
||||
ap.add_argument("-v", metavar="VOL", type=str, action="append", help="add volume, SRC:DST:FLAG; example [.::r], [/mnt/nas/music:/music:r:aed")
|
||||
ap.add_argument("-ed", action="store_true", help="enable ?dots")
|
||||
ap.add_argument("-emp", action="store_true", help="enable markdown plugins")
|
||||
ap.add_argument("-mcr", metavar="SEC", type=int, default=60, help="md-editor mod-chk rate")
|
||||
ap.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads")
|
||||
ap.add_argument("--sparse", metavar="MiB", type=int, default=4, help="up2k min.size threshold (mswin-only)")
|
||||
ap.add_argument("--urlform", metavar="MODE", type=str, default="print,get", help="how to handle url-forms; examples: [stash], [save,get]")
|
||||
u = unicode
|
||||
ap2 = ap.add_argument_group('general options')
|
||||
ap2.add_argument("-c", metavar="PATH", type=u, action="append", help="add config file")
|
||||
ap2.add_argument("-nc", metavar="NUM", type=int, default=64, help="max num clients")
|
||||
ap2.add_argument("-j", metavar="CORES", type=int, default=1, help="max num cpu cores")
|
||||
ap2.add_argument("-a", metavar="ACCT", type=u, action="append", help="add account, USER:PASS; example [ed:wark")
|
||||
ap2.add_argument("-v", metavar="VOL", type=u, action="append", help="add volume, SRC:DST:FLAG; example [.::r], [/mnt/nas/music:/music:r:aed")
|
||||
ap2.add_argument("-ed", action="store_true", help="enable ?dots")
|
||||
ap2.add_argument("-emp", action="store_true", help="enable markdown plugins")
|
||||
ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="md-editor mod-chk rate")
|
||||
ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads")
|
||||
ap2.add_argument("--sparse", metavar="MiB", type=int, default=4, help="up2k min.size threshold (mswin-only)")
|
||||
ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-forms; examples: [stash], [save,get]")
|
||||
|
||||
ap2 = ap.add_argument_group('network options')
|
||||
ap2.add_argument("-i", metavar="IP", type=str, default="0.0.0.0", help="ip to bind (comma-sep.)")
|
||||
ap2.add_argument("-p", metavar="PORT", type=str, default="3923", help="ports to bind (comma/range)")
|
||||
ap2.add_argument("-i", metavar="IP", type=u, default="0.0.0.0", help="ip to bind (comma-sep.)")
|
||||
ap2.add_argument("-p", metavar="PORT", type=u, default="3923", help="ports to bind (comma/range)")
|
||||
ap2.add_argument("--rproxy", metavar="DEPTH", type=int, default=1, help="which ip to keep; 0 = tcp, 1 = origin (first x-fwd), 2 = cloudflare, 3 = nginx, -1 = closest proxy")
|
||||
|
||||
ap2 = ap.add_argument_group('SSL/TLS options')
|
||||
ap2.add_argument("--http-only", action="store_true", help="disable ssl/tls")
|
||||
ap2.add_argument("--https-only", action="store_true", help="disable plaintext")
|
||||
ap2.add_argument("--ssl-ver", metavar="LIST", type=str, help="set allowed ssl/tls versions; [help] shows available versions; default is what your python version considers safe")
|
||||
ap2.add_argument("--ciphers", metavar="LIST", help="set allowed ssl/tls ciphers; [help] shows available ciphers")
|
||||
ap2.add_argument("--ssl-ver", metavar="LIST", type=u, help="set allowed ssl/tls versions; [help] shows available versions; default is what your python version considers safe")
|
||||
ap2.add_argument("--ciphers", metavar="LIST", type=u, help="set allowed ssl/tls ciphers; [help] shows available ciphers")
|
||||
ap2.add_argument("--ssl-dbg", action="store_true", help="dump some tls info")
|
||||
ap2.add_argument("--ssl-log", metavar="PATH", help="log master secrets")
|
||||
ap2.add_argument("--ssl-log", metavar="PATH", type=u, help="log master secrets")
|
||||
|
||||
ap2 = ap.add_argument_group('opt-outs')
|
||||
ap2.add_argument("-nw", action="store_true", help="disable writes (benchmark)")
|
||||
@@ -281,14 +282,16 @@ def run_argparse(argv, formatter):
|
||||
ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
|
||||
|
||||
ap2 = ap.add_argument_group('safety options')
|
||||
ap2.add_argument("--ls", metavar="U[,V[,F]]", help="scan all volumes; arguments USER,VOL,FLAGS; example [**,*,ln,p,r]")
|
||||
ap2.add_argument("--salt", type=str, default="hunter2", help="up2k file-hash salt")
|
||||
ap2.add_argument("--ls", metavar="U[,V[,F]]", type=u, help="scan all volumes; arguments USER,VOL,FLAGS; example [**,*,ln,p,r]")
|
||||
ap2.add_argument("--salt", type=u, default="hunter2", help="up2k file-hash salt")
|
||||
|
||||
ap2 = ap.add_argument_group('logging options')
|
||||
ap2.add_argument("-q", action="store_true", help="quiet")
|
||||
ap2.add_argument("-lo", metavar="PATH", type=u, help="logfile, example: cpp-%%Y-%%m%%d-%%H%%M%%S.txt.xz")
|
||||
ap2.add_argument("--log-conn", action="store_true", help="print tcp-server msgs")
|
||||
ap2.add_argument("--ihead", metavar="HEADER", action='append', help="dump incoming header")
|
||||
ap2.add_argument("--lf-url", metavar="RE", type=str, default=r"^/\.cpr/|\?th=[wj]$", help="dont log URLs matching")
|
||||
ap2.add_argument("--log-htp", action="store_true", help="print http-server threadpool scaling")
|
||||
ap2.add_argument("--ihead", metavar="HEADER", type=u, action='append', help="dump incoming header")
|
||||
ap2.add_argument("--lf-url", metavar="RE", type=u, default=r"^/\.cpr/|\?th=[wj]$", help="dont log URLs matching")
|
||||
|
||||
ap2 = ap.add_argument_group('admin panel options')
|
||||
ap2.add_argument("--no-rescan", action="store_true", help="disable ?scan (volume reindexing)")
|
||||
@@ -303,8 +306,9 @@ def run_argparse(argv, formatter):
|
||||
ap2.add_argument("--th-no-webp", action="store_true", help="disable webp output")
|
||||
ap2.add_argument("--th-ff-jpg", action="store_true", help="force jpg for video thumbs")
|
||||
ap2.add_argument("--th-poke", metavar="SEC", type=int, default=300, help="activity labeling cooldown")
|
||||
ap2.add_argument("--th-clean", metavar="SEC", type=int, default=43200, help="cleanup interval")
|
||||
ap2.add_argument("--th-clean", metavar="SEC", type=int, default=43200, help="cleanup interval; 0=disabled")
|
||||
ap2.add_argument("--th-maxage", metavar="SEC", type=int, default=604800, help="max folder age")
|
||||
ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat for")
|
||||
|
||||
ap2 = ap.add_argument_group('database options')
|
||||
ap2.add_argument("-e2d", action="store_true", help="enable up2k database")
|
||||
@@ -313,24 +317,26 @@ def run_argparse(argv, formatter):
|
||||
ap2.add_argument("-e2t", action="store_true", help="enable metadata indexing")
|
||||
ap2.add_argument("-e2ts", action="store_true", help="enable metadata scanner, sets -e2t")
|
||||
ap2.add_argument("-e2tsr", action="store_true", help="rescan all metadata, sets -e2ts")
|
||||
ap2.add_argument("--hist", metavar="PATH", type=str, help="where to store volume state")
|
||||
ap2.add_argument("--hist", metavar="PATH", type=u, help="where to store volume state")
|
||||
ap2.add_argument("--no-hash", action="store_true", help="disable hashing during e2ds folder scans")
|
||||
ap2.add_argument("--no-mutagen", action="store_true", help="use ffprobe for tags instead")
|
||||
ap2.add_argument("--no-mtag-mt", action="store_true", help="disable tag-read parallelism")
|
||||
ap2.add_argument("-mtm", metavar="M=t,t,t", action="append", type=str, help="add/replace metadata mapping")
|
||||
ap2.add_argument("-mte", metavar="M,M,M", type=str, help="tags to index/display (comma-sep.)",
|
||||
ap2.add_argument("-mtm", metavar="M=t,t,t", type=u, action="append", help="add/replace metadata mapping")
|
||||
ap2.add_argument("-mte", metavar="M,M,M", type=u, help="tags to index/display (comma-sep.)",
|
||||
default="circle,album,.tn,artist,title,.bpm,key,.dur,.q,.vq,.aq,ac,vc,res,.fps")
|
||||
ap2.add_argument("-mtp", metavar="M=[f,]bin", action="append", type=str, help="read tag M using bin")
|
||||
ap2.add_argument("-mtp", metavar="M=[f,]bin", type=u, action="append", help="read tag M using bin")
|
||||
ap2.add_argument("--srch-time", metavar="SEC", type=int, default=30, help="search deadline")
|
||||
|
||||
ap2 = ap.add_argument_group('appearance options')
|
||||
ap2.add_argument("--css-browser", metavar="L", help="URL to additional CSS to include")
|
||||
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
|
||||
|
||||
ap2 = ap.add_argument_group('debug options')
|
||||
ap2.add_argument("--no-sendfile", action="store_true", help="disable sendfile")
|
||||
ap2.add_argument("--no-scandir", action="store_true", help="disable scandir")
|
||||
ap2.add_argument("--no-fastboot", action="store_true", help="wait for up2k indexing")
|
||||
ap2.add_argument("--stackmon", metavar="P,S", help="write stacktrace to Path every S second")
|
||||
ap2.add_argument("--no-htp", action="store_true", help="disable httpserver threadpool, create threads as-needed instead")
|
||||
ap2.add_argument("--stackmon", metavar="P,S", type=u, help="write stacktrace to Path every S second")
|
||||
ap2.add_argument("--log-thrs", metavar="SEC", type=float, help="list active threads every SEC")
|
||||
|
||||
return ap.parse_args(args=argv[1:])
|
||||
# fmt: on
|
||||
@@ -347,7 +353,7 @@ def main(argv=None):
|
||||
desc = py_desc().replace("[", "\033[1;30m[")
|
||||
|
||||
f = '\033[36mcopyparty v{} "\033[35m{}\033[36m" ({})\n{}\033[0m\n'
|
||||
print(f.format(S_VERSION, CODENAME, S_BUILD_DT, desc))
|
||||
lprint(f.format(S_VERSION, CODENAME, S_BUILD_DT, desc))
|
||||
|
||||
ensure_locale()
|
||||
if HAVE_SSL:
|
||||
@@ -361,7 +367,7 @@ def main(argv=None):
|
||||
continue
|
||||
|
||||
msg = "\033[1;31mWARNING:\033[0;1m\n {} \033[0;33mwas replaced with\033[0;1m {} \033[0;33mand will be removed\n\033[0m"
|
||||
print(msg.format(dk, nk))
|
||||
lprint(msg.format(dk, nk))
|
||||
argv[idx] = nk
|
||||
time.sleep(2)
|
||||
|
||||
@@ -370,16 +376,6 @@ def main(argv=None):
|
||||
except AssertionError:
|
||||
al = run_argparse(argv, Dodge11874)
|
||||
|
||||
if al.stackmon:
|
||||
fp, f = al.stackmon.rsplit(",", 1)
|
||||
f = int(f)
|
||||
t = threading.Thread(
|
||||
target=stackmon,
|
||||
args=(fp, f),
|
||||
)
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
# propagate implications
|
||||
for k1, k2 in IMPLICATIONS:
|
||||
if getattr(al, k1):
|
||||
@@ -410,12 +406,12 @@ def main(argv=None):
|
||||
+ " (if you crash with codec errors then that is why)"
|
||||
)
|
||||
|
||||
if WINDOWS and sys.version_info < (3, 6):
|
||||
if sys.version_info < (3, 6):
|
||||
al.no_scandir = True
|
||||
|
||||
# signal.signal(signal.SIGINT, sighandler)
|
||||
|
||||
SvcHub(al).run()
|
||||
SvcHub(al, argv, printed).run()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
|
||||
@@ -1,8 +1,8 @@

# coding: utf-8

VERSION = (0, 11, 20)
VERSION = (0, 11, 38)
CODENAME = "the grid"
BUILD_DT = (2021, 6, 20)
BUILD_DT = (2021, 7, 13)

S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -10,13 +10,14 @@ import hashlib
|
||||
import threading
|
||||
|
||||
from .__init__ import WINDOWS
|
||||
from .util import IMPLICATIONS, uncyg, undot, Pebkac, fsdec, fsenc, statdir, nuprint
|
||||
from .util import IMPLICATIONS, uncyg, undot, Pebkac, fsdec, fsenc, statdir
|
||||
|
||||
|
||||
class VFS(object):
|
||||
"""single level in the virtual fs"""
|
||||
|
||||
def __init__(self, realpath, vpath, uread=[], uwrite=[], uadm=[], flags={}):
|
||||
def __init__(self, log, realpath, vpath, uread, uwrite, uadm, flags):
|
||||
self.log = log
|
||||
self.realpath = realpath # absolute path on host filesystem
|
||||
self.vpath = vpath # absolute path in the virtual filesystem
|
||||
self.uread = uread # users who can read this
|
||||
@@ -62,6 +63,7 @@ class VFS(object):
|
||||
return self.nodes[name].add(src, dst)
|
||||
|
||||
vn = VFS(
|
||||
self.log,
|
||||
os.path.join(self.realpath, name) if self.realpath else None,
|
||||
"{}/{}".format(self.vpath, name).lstrip("/"),
|
||||
self.uread,
|
||||
@@ -79,7 +81,7 @@ class VFS(object):
|
||||
|
||||
# leaf does not exist; create and keep permissions blank
|
||||
vp = "{}/{}".format(self.vpath, dst).lstrip("/")
|
||||
vn = VFS(src, vp)
|
||||
vn = VFS(self.log, src, vp, [], [], [], {})
|
||||
vn.dbv = self.dbv or self
|
||||
self.nodes[dst] = vn
|
||||
return vn
|
||||
@@ -181,7 +183,7 @@ class VFS(object):
|
||||
"""return user-readable [fsdir,real,virt] items at vpath"""
|
||||
virt_vis = {} # nodes readable by user
|
||||
abspath = self.canonical(rem)
|
||||
real = list(statdir(nuprint, scandir, lstat, abspath))
|
||||
real = list(statdir(self.log, scandir, lstat, abspath))
|
||||
real.sort()
|
||||
if not rem:
|
||||
for name, vn2 in sorted(self.nodes.items()):
|
||||
@@ -208,8 +210,13 @@ class VFS(object):
|
||||
rem, uname, scandir, incl_wo=False, lstat=lstat
|
||||
)
|
||||
|
||||
if seen and not fsroot.startswith(seen[-1]) and fsroot in seen:
|
||||
print("bailing from symlink loop,\n {}\n {}".format(seen[-1], fsroot))
|
||||
if (
|
||||
seen
|
||||
and (not fsroot.startswith(seen[-1]) or fsroot == seen[-1])
|
||||
and fsroot in seen
|
||||
):
|
||||
m = "bailing from symlink loop,\n prev: {}\n curr: {}\n from: {}/{}"
|
||||
self.log("vfs.walk", m.format(seen[-1], fsroot, self.vpath, rem), 3)
|
||||
return
|
||||
|
||||
seen = seen[:] + [fsroot]
|
||||
@@ -242,6 +249,10 @@ class VFS(object):
|
||||
if flt:
|
||||
flt = {k: True for k in flt}
|
||||
|
||||
f1 = "{0}.hist{0}up2k.".format(os.sep)
|
||||
f2a = os.sep + "dir.txt"
|
||||
f2b = "{0}.hist{0}".format(os.sep)
|
||||
|
||||
for vpath, apath, files, rd, vd in self.walk(
|
||||
"", vrem, [], uname, dots, scandir, False
|
||||
):
|
||||
@@ -275,7 +286,11 @@ class VFS(object):
|
||||
del vd[x]
|
||||
|
||||
# up2k filetring based on actual abspath
|
||||
files = [x for x in files if "{0}.hist{0}up2k.".format(os.sep) not in x[1]]
|
||||
files = [
|
||||
x
|
||||
for x in files
|
||||
if f1 not in x[1] and (not x[1].endswith(f2a) or f2b not in x[1])
|
||||
]
|
||||
|
||||
for f in [{"vp": v, "ap": a, "st": n[1]} for v, a, n in files]:
|
||||
yield f
|
||||
@@ -466,7 +481,7 @@ class AuthSrv(object):
|
||||
)
|
||||
except:
|
||||
m = "\n\033[1;31m\nerror in config file {} on line {}:\n\033[0m"
|
||||
print(m.format(cfg_fn, self.line_ctr))
|
||||
self.log(m.format(cfg_fn, self.line_ctr), 1)
|
||||
raise
|
||||
|
||||
# case-insensitive; normalize
|
||||
@@ -482,10 +497,10 @@ class AuthSrv(object):
|
||||
|
||||
if not mount:
|
||||
# -h says our defaults are CWD at root and read/write for everyone
|
||||
vfs = VFS(os.path.abspath("."), "", ["*"], ["*"])
|
||||
vfs = VFS(self.log_func, os.path.abspath("."), "", ["*"], ["*"], ["*"], {})
|
||||
elif "" not in mount:
|
||||
# there's volumes but no root; make root inaccessible
|
||||
vfs = VFS(None, "")
|
||||
vfs = VFS(self.log_func, None, "", [], [], [], {})
|
||||
vfs.flags["d2d"] = True
|
||||
|
||||
maxdepth = 0
|
||||
@@ -497,7 +512,13 @@ class AuthSrv(object):
|
||||
if dst == "":
|
||||
# rootfs was mapped; fully replaces the default CWD vfs
|
||||
vfs = VFS(
|
||||
mount[dst], dst, mread[dst], mwrite[dst], madm[dst], mflags[dst]
|
||||
self.log_func,
|
||||
mount[dst],
|
||||
dst,
|
||||
mread[dst],
|
||||
mwrite[dst],
|
||||
madm[dst],
|
||||
mflags[dst],
|
||||
)
|
||||
continue
|
||||
|
||||
@@ -693,6 +714,11 @@ class AuthSrv(object):
|
||||
self.user = user
|
||||
self.iuser = {v: k for k, v in user.items()}
|
||||
|
||||
self.re_pwd = None
|
||||
pwds = [re.escape(x) for x in self.iuser.keys()]
|
||||
if pwds:
|
||||
self.re_pwd = re.compile("=(" + "|".join(pwds) + ")([]&; ]|$)")
|
||||
|
||||
# import pprint
|
||||
# pprint.pprint({"usr": user, "rd": mread, "wr": mwrite, "mnt": mount})
|
||||
|
||||
@@ -775,7 +801,7 @@ class AuthSrv(object):
|
||||
msg = [x[1] for x in files]
|
||||
|
||||
if msg:
|
||||
nuprint("\n".join(msg))
|
||||
self.log("\n" + "\n".join(msg))
|
||||
|
||||
if n_bads and flag_p:
|
||||
raise Exception("found symlink leaving volume, and strict is set")
|
||||
|
||||
@@ -4,17 +4,11 @@ from __future__ import print_function, unicode_literals
|
||||
import time
|
||||
import threading
|
||||
|
||||
from .__init__ import PY2, WINDOWS, VT100
|
||||
from .broker_util import try_exec
|
||||
from .broker_mpw import MpWorker
|
||||
from .util import mp
|
||||
|
||||
|
||||
if PY2 and not WINDOWS:
|
||||
from multiprocessing.reduction import ForkingPickler
|
||||
from StringIO import StringIO as MemesIO # pylint: disable=import-error
|
||||
|
||||
|
||||
class BrokerMp(object):
|
||||
"""external api; manages MpWorkers"""
|
||||
|
||||
@@ -33,19 +27,17 @@ class BrokerMp(object):
|
||||
cores = mp.cpu_count()
|
||||
|
||||
self.log("broker", "booting {} subprocesses".format(cores))
|
||||
for n in range(cores):
|
||||
for n in range(1, cores + 1):
|
||||
q_pend = mp.Queue(1)
|
||||
q_yield = mp.Queue(64)
|
||||
|
||||
proc = mp.Process(target=MpWorker, args=(q_pend, q_yield, self.args, n))
|
||||
proc.q_pend = q_pend
|
||||
proc.q_yield = q_yield
|
||||
proc.nid = n
|
||||
proc.clients = {}
|
||||
proc.workload = 0
|
||||
|
||||
thr = threading.Thread(
|
||||
target=self.collector, args=(proc,), name="mp-collector"
|
||||
target=self.collector, args=(proc,), name="mp-sink-{}".format(n)
|
||||
)
|
||||
thr.daemon = True
|
||||
thr.start()
|
||||
@@ -53,13 +45,6 @@ class BrokerMp(object):
|
||||
self.procs.append(proc)
|
||||
proc.start()
|
||||
|
||||
if not self.args.q:
|
||||
thr = threading.Thread(
|
||||
target=self.debug_load_balancer, name="mp-dbg-loadbalancer"
|
||||
)
|
||||
thr.daemon = True
|
||||
thr.start()
|
||||
|
||||
def shutdown(self):
|
||||
self.log("broker", "shutting down")
|
||||
for n, proc in enumerate(self.procs):
|
||||
@@ -89,20 +74,6 @@ class BrokerMp(object):
|
||||
if dest == "log":
|
||||
self.log(*args)
|
||||
|
||||
elif dest == "workload":
|
||||
with self.mutex:
|
||||
proc.workload = args[0]
|
||||
|
||||
elif dest == "httpdrop":
|
||||
addr = args[0]
|
||||
|
||||
with self.mutex:
|
||||
del proc.clients[addr]
|
||||
if not proc.clients:
|
||||
proc.workload = 0
|
||||
|
||||
self.hub.tcpsrv.num_clients.add(-1)
|
||||
|
||||
elif dest == "retq":
|
||||
# response from previous ipc call
|
||||
with self.retpend_mutex:
|
||||
@@ -128,38 +99,9 @@ class BrokerMp(object):
|
||||
returns a Queue object which eventually contains the response if want_retval
|
||||
(not-impl here since nothing uses it yet)
|
||||
"""
|
||||
if dest == "httpconn":
|
||||
sck, addr = args
|
||||
sck2 = sck
|
||||
if PY2:
|
||||
buf = MemesIO()
|
||||
ForkingPickler(buf).dump(sck)
|
||||
sck2 = buf.getvalue()
|
||||
|
||||
proc = sorted(self.procs, key=lambda x: x.workload)[0]
|
||||
proc.q_pend.put([0, dest, [sck2, addr]])
|
||||
|
||||
with self.mutex:
|
||||
proc.clients[addr] = 50
|
||||
proc.workload += 50
|
||||
if dest == "listen":
|
||||
for p in self.procs:
|
||||
p.q_pend.put([0, dest, [args[0], len(self.procs)]])
|
||||
|
||||
else:
|
||||
raise Exception("what is " + str(dest))
|
||||
|
||||
def debug_load_balancer(self):
|
||||
fmt = "\033[1m{}\033[0;36m{:4}\033[0m "
|
||||
if not VT100:
|
||||
fmt = "({}{:4})"
|
||||
|
||||
last = ""
|
||||
while self.procs:
|
||||
msg = ""
|
||||
for proc in self.procs:
|
||||
msg += fmt.format(len(proc.clients), proc.workload)
|
||||
|
||||
if msg != last:
|
||||
last = msg
|
||||
with self.hub.log_mutex:
|
||||
print(msg)
|
||||
|
||||
time.sleep(0.1)
|
||||
|
||||
@@ -3,18 +3,13 @@ from __future__ import print_function, unicode_literals
|
||||
from copyparty.authsrv import AuthSrv
|
||||
|
||||
import sys
|
||||
import time
|
||||
import signal
|
||||
import threading
|
||||
|
||||
from .__init__ import PY2, WINDOWS
|
||||
from .broker_util import ExceptionalQueue
|
||||
from .httpsrv import HttpSrv
|
||||
from .util import FAKE_MP
|
||||
|
||||
if PY2 and not WINDOWS:
|
||||
import pickle # nosec
|
||||
|
||||
|
||||
class MpWorker(object):
|
||||
"""one single mp instance"""
|
||||
@@ -25,10 +20,11 @@ class MpWorker(object):
|
||||
self.args = args
|
||||
self.n = n
|
||||
|
||||
self.log = self._log_disabled if args.q and not args.lo else self._log_enabled
|
||||
|
||||
self.retpend = {}
|
||||
self.retpend_mutex = threading.Lock()
|
||||
self.mutex = threading.Lock()
|
||||
self.workload_thr_alive = False
|
||||
|
||||
# we inherited signal_handler from parent,
|
||||
# replace it with something harmless
|
||||
@@ -39,8 +35,7 @@ class MpWorker(object):
|
||||
self.asrv = AuthSrv(args, None, False)
|
||||
|
||||
# instantiate all services here (TODO: inheritance?)
|
||||
self.httpsrv = HttpSrv(self, True)
|
||||
self.httpsrv.disconnect_func = self.httpdrop
|
||||
self.httpsrv = HttpSrv(self, n)
|
||||
|
||||
# on winxp and some other platforms,
|
||||
# use thr.join() to block all signals
|
||||
@@ -53,15 +48,15 @@ class MpWorker(object):
|
||||
# print('k')
|
||||
pass
|
||||
|
||||
def log(self, src, msg, c=0):
|
||||
def _log_enabled(self, src, msg, c=0):
|
||||
self.q_yield.put([0, "log", [src, msg, c]])
|
||||
|
||||
def _log_disabled(self, src, msg, c=0):
|
||||
pass
|
||||
|
||||
def logw(self, msg, c=0):
|
||||
self.log("mp{}".format(self.n), msg, c)
|
||||
|
||||
def httpdrop(self, addr):
|
||||
self.q_yield.put([0, "httpdrop", [addr]])
|
||||
|
||||
def main(self):
|
||||
while True:
|
||||
retq_id, dest, args = self.q_pend.get()
|
||||
@@ -73,24 +68,8 @@ class MpWorker(object):
|
||||
sys.exit(0)
|
||||
return
|
||||
|
||||
elif dest == "httpconn":
|
||||
sck, addr = args
|
||||
if PY2:
|
||||
sck = pickle.loads(sck) # nosec
|
||||
|
||||
if self.args.log_conn:
|
||||
self.log("%s %s" % addr, "|%sC-qpop" % ("-" * 4,), c="1;30")
|
||||
|
||||
self.httpsrv.accept(sck, addr)
|
||||
|
||||
with self.mutex:
|
||||
if not self.workload_thr_alive:
|
||||
self.workload_thr_alive = True
|
||||
thr = threading.Thread(
|
||||
target=self.thr_workload, name="mpw-workload"
|
||||
)
|
||||
thr.daemon = True
|
||||
thr.start()
|
||||
elif dest == "listen":
|
||||
self.httpsrv.listen(args[0], args[1])
|
||||
|
||||
elif dest == "retq":
|
||||
# response from previous ipc call
|
||||
@@ -114,16 +93,3 @@ class MpWorker(object):
|
||||
|
||||
self.q_yield.put([retq_id, dest, args])
|
||||
return retq
|
||||
|
||||
def thr_workload(self):
|
||||
"""announce workloads to MpSrv (the mp controller / loadbalancer)"""
|
||||
# avoid locking in extract_filedata by tracking difference here
|
||||
while True:
|
||||
time.sleep(0.2)
|
||||
with self.mutex:
|
||||
if self.httpsrv.num_clients() == 0:
|
||||
# no clients rn, termiante thread
|
||||
self.workload_thr_alive = False
|
||||
return
|
||||
|
||||
self.q_yield.put([0, "workload", [self.httpsrv.workload]])
|
||||
|
||||
@@ -3,7 +3,6 @@ from __future__ import print_function, unicode_literals
|
||||
|
||||
import threading
|
||||
|
||||
from .authsrv import AuthSrv
|
||||
from .httpsrv import HttpSrv
|
||||
from .broker_util import ExceptionalQueue, try_exec
|
||||
|
||||
@@ -20,8 +19,7 @@ class BrokerThr(object):
|
||||
self.mutex = threading.Lock()
|
||||
|
||||
# instantiate all services here (TODO: inheritance?)
|
||||
self.httpsrv = HttpSrv(self)
|
||||
self.httpsrv.disconnect_func = self.httpdrop
|
||||
self.httpsrv = HttpSrv(self, None)
|
||||
|
||||
def shutdown(self):
|
||||
# self.log("broker", "shutting down")
|
||||
@@ -29,12 +27,8 @@ class BrokerThr(object):
|
||||
pass
|
||||
|
||||
def put(self, want_retval, dest, *args):
|
||||
if dest == "httpconn":
|
||||
sck, addr = args
|
||||
if self.args.log_conn:
|
||||
self.log("%s %s" % addr, "|%sC-qpop" % ("-" * 4,), c="1;30")
|
||||
|
||||
self.httpsrv.accept(sck, addr)
|
||||
if dest == "listen":
|
||||
self.httpsrv.listen(args[0], 1)
|
||||
|
||||
else:
|
||||
# new ipc invoking managed service in hub
|
||||
@@ -51,6 +45,3 @@ class BrokerThr(object):
|
||||
retq = ExceptionalQueue(1)
|
||||
retq.put(rv)
|
||||
return retq
|
||||
|
||||
def httpdrop(self, addr):
|
||||
self.hub.tcpsrv.num_clients.add(-1)
|
||||
|
||||
@@ -10,19 +10,15 @@ import json
|
||||
import string
|
||||
import socket
|
||||
import ctypes
|
||||
import traceback
|
||||
from datetime import datetime
|
||||
import calendar
|
||||
|
||||
from .__init__ import E, PY2, WINDOWS, ANYWIN
|
||||
from .__init__ import E, PY2, WINDOWS, ANYWIN, unicode
|
||||
from .util import * # noqa # pylint: disable=unused-wildcard-import
|
||||
from .authsrv import AuthSrv
|
||||
from .szip import StreamZip
|
||||
from .star import StreamTar
|
||||
|
||||
if not PY2:
|
||||
unicode = str
|
||||
|
||||
|
||||
NO_CACHE = {"Cache-Control": "no-cache"}
|
||||
NO_STORE = {"Cache-Control": "no-store; max-age=0"}
|
||||
@@ -41,7 +37,6 @@ class HttpCli(object):
|
||||
self.ip = conn.addr[0]
|
||||
self.addr = conn.addr # type: tuple[str, int]
|
||||
self.args = conn.args
|
||||
self.is_mp = conn.is_mp
|
||||
self.asrv = conn.asrv # type: AuthSrv
|
||||
self.ico = conn.ico
|
||||
self.thumbcli = conn.thumbcli
|
||||
@@ -50,12 +45,21 @@ class HttpCli(object):
|
||||
self.tls = hasattr(self.s, "cipher")
|
||||
|
||||
self.bufsz = 1024 * 32
|
||||
self.hint = None
|
||||
self.absolute_urls = False
|
||||
self.out_headers = {"Access-Control-Allow-Origin": "*"}
|
||||
|
||||
def log(self, msg, c=0):
|
||||
ptn = self.asrv.re_pwd
|
||||
if ptn and ptn.search(msg):
|
||||
msg = ptn.sub(self.unpwd, msg)
|
||||
|
||||
self.log_func(self.log_src, msg, c)
|
||||
|
||||
def unpwd(self, m):
|
||||
a, b = m.groups()
|
||||
return "=\033[7m {} \033[27m{}".format(self.asrv.iuser[a], b)
|
||||
|
||||
def _check_nonfatal(self, ex):
|
||||
return ex.code < 400 or ex.code in [404, 429]
|
||||
|
||||
@@ -64,14 +68,19 @@ class HttpCli(object):
|
||||
if rem.startswith("/") or rem.startswith("../") or "/../" in rem:
|
||||
raise Exception("that was close")
|
||||
|
||||
def j2(self, name, **kwargs):
|
||||
def j2(self, name, **ka):
|
||||
tpl = self.conn.hsrv.j2[name]
|
||||
return tpl.render(**kwargs) if kwargs else tpl
|
||||
if ka:
|
||||
ka["ts"] = self.conn.hsrv.cachebuster()
|
||||
return tpl.render(**ka)
|
||||
|
||||
return tpl
|
||||
|
||||
def run(self):
|
||||
"""returns true if connection can be reused"""
|
||||
self.keepalive = False
|
||||
self.headers = {}
|
||||
self.hint = None
|
||||
try:
|
||||
headerlines = read_header(self.sr)
|
||||
if not headerlines:
|
||||
@@ -85,9 +94,13 @@ class HttpCli(object):
|
||||
try:
|
||||
self.mode, self.req, self.http_ver = headerlines[0].split(" ")
|
||||
except:
|
||||
raise Pebkac(400, "bad headers:\n" + "\n".join(headerlines))
|
||||
msg = " ]\n#[ ".join(headerlines)
|
||||
raise Pebkac(400, "bad headers:\n#[ " + msg + " ]")
|
||||
|
||||
except Pebkac as ex:
|
||||
self.mode = "GET"
|
||||
self.req = "[junk]"
|
||||
self.http_ver = "HTTP/1.1"
|
||||
# self.log("pebkac at httpcli.run #1: " + repr(ex))
|
||||
self.keepalive = self._check_nonfatal(ex)
|
||||
self.loud_reply(unicode(ex), status=ex.code)
|
||||
@@ -130,6 +143,9 @@ class HttpCli(object):
|
||||
if v is not None:
|
||||
self.log("[H] {}: \033[33m[{}]".format(k, v), 6)
|
||||
|
||||
if "&" in self.req and "?" not in self.req:
|
||||
self.hint = "did you mean '?' instead of '&'"
|
||||
|
||||
# split req into vpath + uparam
|
||||
uparam = {}
|
||||
if "?" not in self.req:
|
||||
@@ -169,6 +185,9 @@ class HttpCli(object):
|
||||
self.rvol, self.wvol, self.avol = [[], [], []]
|
||||
self.asrv.vfs.user_tree(self.uname, self.rvol, self.wvol, self.avol)
|
||||
|
||||
if pwd and "pw" in self.ouparam and pwd != cookies.get("cppwd"):
|
||||
self.out_headers["Set-Cookie"] = self.get_pwd_cookie(pwd)[0]
|
||||
|
||||
ua = self.headers.get("user-agent", "")
|
||||
self.is_rclone = ua.startswith("rclone/")
|
||||
if self.is_rclone:
|
||||
@@ -199,12 +218,15 @@ class HttpCli(object):
|
||||
|
||||
self.log("{}\033[0m, {}".format(str(ex), self.vpath), 3)
|
||||
msg = "<pre>{}\r\nURL: {}\r\n".format(str(ex), self.vpath)
|
||||
if self.hint:
|
||||
msg += "hint: {}\r\n".format(self.hint)
|
||||
|
||||
self.reply(msg.encode("utf-8", "replace"), status=ex.code)
|
||||
return self.keepalive
|
||||
except Pebkac:
|
||||
return False
|
||||
|
||||
def send_headers(self, length, status=200, mime=None, headers={}):
|
||||
def send_headers(self, length, status=200, mime=None, headers=None):
|
||||
response = ["{} {} {}".format(self.http_ver, status, HTTPCODE[status])]
|
||||
|
||||
if length is not None:
|
||||
@@ -214,7 +236,8 @@ class HttpCli(object):
|
||||
response.append("Connection: " + ("Keep-Alive" if self.keepalive else "Close"))
|
||||
|
||||
# headers{} overrides anything set previously
|
||||
self.out_headers.update(headers)
|
||||
if headers:
|
||||
self.out_headers.update(headers)
|
||||
|
||||
# default to utf8 html if no content-type is set
|
||||
if not mime:
|
||||
@@ -231,7 +254,7 @@ class HttpCli(object):
|
||||
except:
|
||||
raise Pebkac(400, "client d/c while replying headers")
|
||||
|
||||
def reply(self, body, status=200, mime=None, headers={}):
|
||||
def reply(self, body, status=200, mime=None, headers=None):
|
||||
# TODO something to reply with user-supplied values safely
|
||||
self.send_headers(len(body), status, mime, headers)
|
||||
|
||||
@@ -247,7 +270,7 @@ class HttpCli(object):
|
||||
self.log(body.rstrip())
|
||||
self.reply(b"<pre>" + body.encode("utf-8") + b"\r\n", *list(args), **kwargs)
|
||||
|
||||
def urlq(self, add={}, rm=[]):
|
||||
def urlq(self, add, rm):
|
||||
"""
|
||||
generates url query based on uparam (b, pw, all others)
|
||||
removing anything in rm, adding pairs in add
|
||||
@@ -319,6 +342,9 @@ class HttpCli(object):
|
||||
if "tree" in self.uparam:
|
||||
return self.tx_tree()
|
||||
|
||||
if "stack" in self.uparam:
|
||||
return self.tx_stack()
|
||||
|
||||
# conditional redirect to single volumes
|
||||
if self.vpath == "" and not self.ouparam:
|
||||
nread = len(self.rvol)
|
||||
@@ -348,9 +374,6 @@ class HttpCli(object):
|
||||
if "scan" in self.uparam:
|
||||
return self.scanvol()
|
||||
|
||||
if "stack" in self.uparam:
|
||||
return self.tx_stack()
|
||||
|
||||
return self.tx_browser()
|
||||
|
||||
def handle_options(self):
|
||||
@@ -456,15 +479,17 @@ class HttpCli(object):
|
||||
addr = self.ip.replace(":", ".")
|
||||
fn = "put-{:.6f}-{}.bin".format(time.time(), addr)
|
||||
path = os.path.join(fdir, fn)
|
||||
if self.args.nw:
|
||||
path = os.devnull
|
||||
|
||||
with open(fsenc(path), "wb", 512 * 1024) as f:
|
||||
post_sz, _, sha_b64 = hashcopy(self.conn, reader, f)
|
||||
post_sz, _, sha_b64 = hashcopy(reader, f)
|
||||
|
||||
vfs, vrem = vfs.get_dbv(rem)
|
||||
|
||||
self.conn.hsrv.broker.put(
|
||||
False, "up2k.hash_file", vfs.realpath, vfs.flags, vrem, fn
|
||||
)
|
||||
if not self.args.nw:
|
||||
vfs, vrem = vfs.get_dbv(rem)
|
||||
self.conn.hsrv.broker.put(
|
||||
False, "up2k.hash_file", vfs.realpath, vfs.flags, vrem, fn
|
||||
)
|
||||
|
||||
return post_sz, sha_b64, remains, path
|
||||
|
||||
@@ -481,7 +506,7 @@ class HttpCli(object):
|
||||
|
||||
spd1 = get_spd(nbytes, self.t0)
|
||||
spd2 = get_spd(self.conn.nbyte, self.conn.t0)
|
||||
return spd1 + " " + spd2
|
||||
return "{} {} n{}".format(spd1, spd2, self.conn.nreq)
|
||||
|
||||
def handle_post_multipart(self):
|
||||
self.parser = MultipartParser(self.log, self.sr, self.headers)
|
||||
@@ -585,13 +610,14 @@ class HttpCli(object):
os.makedirs(fsenc(dst))
except OSError as ex:
self.log("makedirs failed [{}]".format(dst))
if ex.errno == 13:
raise Pebkac(500, "the server OS denied write-access")
if not os.path.isdir(fsenc(dst)):
if ex.errno == 13:
raise Pebkac(500, "the server OS denied write-access")

if ex.errno == 17:
raise Pebkac(400, "some file got your folder name")
if ex.errno == 17:
raise Pebkac(400, "some file got your folder name")

raise Pebkac(500, min_ex())
raise Pebkac(500, min_ex())
except:
raise Pebkac(500, min_ex())

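The errno checks above map filesystem failures onto HTTP-style errors and tolerate a directory that already exists; a rough standalone sketch of the same idea, with the hypothetical `HttpError` standing in for copyparty's `Pebkac`:

```python
import errno
import os

class HttpError(Exception):  # hypothetical stand-in for Pebkac
    def __init__(self, code, msg):
        super().__init__(msg)
        self.code = code

def makedirs_or_http_error(path):
    try:
        os.makedirs(path)
    except OSError as ex:
        if os.path.isdir(path):
            return  # lost a race against a concurrent request; not an error
        if ex.errno == errno.EACCES:  # 13
            raise HttpError(500, "the server OS denied write-access")
        if ex.errno == errno.EEXIST:  # 17: a file is squatting on the name
            raise HttpError(400, "some file got your folder name")
        raise HttpError(500, repr(ex))
```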
@@ -623,7 +649,7 @@ class HttpCli(object):
|
||||
penalty = 0.7
|
||||
t_idle = t0 - idx.p_end
|
||||
if idx.p_dur > 0.7 and t_idle < penalty:
|
||||
m = "rate-limit ({:.1f} sec), cost {:.2f}, idle {:.2f}"
|
||||
m = "rate-limit {:.1f} sec, cost {:.2f}, idle {:.2f}"
|
||||
raise Pebkac(429, m.format(penalty, idx.p_dur, t_idle))
|
||||
|
||||
if "srch" in body:
|
||||
@@ -689,7 +715,7 @@ class HttpCli(object):
|
||||
|
||||
with open(fsenc(path), "rb+", 512 * 1024) as f:
|
||||
f.seek(cstart[0])
|
||||
post_sz, _, sha_b64 = hashcopy(self.conn, reader, f)
|
||||
post_sz, _, sha_b64 = hashcopy(reader, f)
|
||||
|
||||
if sha_b64 != chash:
|
||||
raise Pebkac(
|
||||
@@ -743,6 +769,12 @@ class HttpCli(object):
pwd = self.parser.require("cppwd", 64)
self.parser.drop()

ck, msg = self.get_pwd_cookie(pwd)
html = self.j2("msg", h1=msg, h2='<a href="/">ack</a>', redir="/")
self.reply(html.encode("utf-8"), headers={"Set-Cookie": ck})
return True

def get_pwd_cookie(self, pwd):
if pwd in self.asrv.iuser:
msg = "login ok"
dt = datetime.utcfromtimestamp(time.time() + 60 * 60 * 24 * 365)
@@ -753,9 +785,7 @@ class HttpCli(object):
exp = "Fri, 15 Aug 1997 01:00:00 GMT"

ck = "cppwd={}; Path=/; Expires={}; SameSite=Lax".format(pwd, exp)
html = self.j2("msg", h1=msg, h2='<a href="/">ack</a>', redir="/")
self.reply(html.encode("utf-8"), headers={"Set-Cookie": ck})
return True
return [ck, msg]

def handle_mkdir(self):
|
||||
new_dir = self.parser.require("name", 512)
|
||||
@@ -765,7 +795,7 @@ class HttpCli(object):
|
||||
vfs, rem = self.asrv.vfs.get(self.vpath, self.uname, False, True)
|
||||
self._assert_safe_rem(rem)
|
||||
|
||||
sanitized = sanitize_fn(new_dir)
|
||||
sanitized = sanitize_fn(new_dir, "", [])
|
||||
|
||||
if not nullwrite:
|
||||
fdir = os.path.join(vfs.realpath, rem)
|
||||
@@ -802,7 +832,7 @@ class HttpCli(object):
|
||||
if not new_file.endswith(".md"):
|
||||
new_file += ".md"
|
||||
|
||||
sanitized = sanitize_fn(new_file)
|
||||
sanitized = sanitize_fn(new_file, "", [])
|
||||
|
||||
if not nullwrite:
|
||||
fdir = os.path.join(vfs.realpath, rem)
|
||||
@@ -835,7 +865,7 @@ class HttpCli(object):
|
||||
if p_file and not nullwrite:
|
||||
fdir = os.path.join(vfs.realpath, rem)
|
||||
fname = sanitize_fn(
|
||||
p_file, bad=[".prologue.html", ".epilogue.html"]
|
||||
p_file, "", [".prologue.html", ".epilogue.html"]
|
||||
)
|
||||
|
||||
if not os.path.isdir(fsenc(fdir)):
|
||||
@@ -852,7 +882,7 @@ class HttpCli(object):
|
||||
with ren_open(fname, "wb", 512 * 1024, **open_args) as f:
|
||||
f, fname = f["orz"]
|
||||
self.log("writing to {}/{}".format(fdir, fname))
|
||||
sz, sha512_hex, _ = hashcopy(self.conn, p_data, f)
|
||||
sz, sha512_hex, _ = hashcopy(p_data, f)
|
||||
if sz == 0:
|
||||
raise Pebkac(400, "empty files in post")
|
||||
|
||||
@@ -1035,7 +1065,7 @@ class HttpCli(object):
|
||||
raise Pebkac(400, "expected body, got {}".format(p_field))
|
||||
|
||||
with open(fsenc(fp), "wb", 512 * 1024) as f:
|
||||
sz, sha512, _ = hashcopy(self.conn, p_data, f)
|
||||
sz, sha512, _ = hashcopy(p_data, f)
|
||||
|
||||
new_lastmod = os.stat(fsenc(fp)).st_mtime
|
||||
new_lastmod3 = int(new_lastmod * 1000)
|
||||
@@ -1225,8 +1255,7 @@ class HttpCli(object):
|
||||
if use_sendfile:
|
||||
remains = sendfile_kern(lower, upper, f, self.s)
|
||||
else:
|
||||
actor = self.conn if self.is_mp else None
|
||||
remains = sendfile_py(lower, upper, f, self.s, actor)
|
||||
remains = sendfile_py(lower, upper, f, self.s)
|
||||
|
||||
if remains > 0:
|
||||
logmsg += " \033[31m" + unicode(upper - remains) + "\033[0m"
|
||||
@@ -1283,7 +1312,7 @@ class HttpCli(object):
|
||||
|
||||
fgen = vn.zipgen(rem, items, self.uname, dots, not self.args.no_scandir)
|
||||
# for f in fgen: print(repr({k: f[k] for k in ["vp", "ap"]}))
|
||||
bgen = packer(fgen, utf8="utf" in uarg, pre_crc="crc" in uarg)
|
||||
bgen = packer(self.log, fgen, utf8="utf" in uarg, pre_crc="crc" in uarg)
|
||||
bsent = 0
|
||||
for buf in bgen.gen():
|
||||
if not buf:
|
||||
@@ -1305,7 +1334,7 @@ class HttpCli(object):
|
||||
ext = "folder"
|
||||
exact = True
|
||||
|
||||
bad = re.compile(r"[](){}/[]|^[0-9_-]*$")
|
||||
bad = re.compile(r"[](){}/ []|^[0-9_-]*$")
|
||||
n = ext.split(".")[::-1]
|
||||
if not exact:
|
||||
n = n[:-1]
|
||||
@@ -1361,6 +1390,7 @@ class HttpCli(object):
|
||||
"md_plug": "true" if self.args.emp else "false",
|
||||
"md_chk_rate": self.args.mcr,
|
||||
"md": boundary,
|
||||
"ts": self.conn.hsrv.cachebuster(),
|
||||
}
|
||||
html = template.render(**targs).encode("utf-8", "replace")
|
||||
html = html.split(boundary.encode("utf-8"))
|
||||
@@ -1393,7 +1423,7 @@ class HttpCli(object):
|
||||
return True
|
||||
|
||||
def tx_mounts(self):
|
||||
suf = self.urlq(rm=["h"])
|
||||
suf = self.urlq({}, ["h"])
|
||||
rvol, wvol, avol = [
|
||||
[("/" + x).rstrip("/") + "/" for x in y]
|
||||
for y in [self.rvol, self.wvol, self.avol]
|
||||
@@ -1443,7 +1473,7 @@ class HttpCli(object):
|
||||
raise Pebkac(500, x)
|
||||
|
||||
def tx_stack(self):
|
||||
if not self.readable or not self.writable:
|
||||
if not self.avol:
|
||||
raise Pebkac(403, "not admin")
|
||||
|
||||
if self.args.no_stack:
|
||||
@@ -1533,14 +1563,16 @@ class HttpCli(object):
|
||||
raise Pebkac(404)
|
||||
|
||||
if self.readable:
|
||||
if rem.startswith(".hist/up2k."):
|
||||
if rem.startswith(".hist/up2k.") or (
|
||||
rem.endswith("/dir.txt") and rem.startswith(".hist/th/")
|
||||
):
|
||||
raise Pebkac(403)
|
||||
|
||||
is_dir = stat.S_ISDIR(st.st_mode)
|
||||
th_fmt = self.uparam.get("th")
|
||||
if th_fmt is not None:
|
||||
if is_dir:
|
||||
for fn in ["folder.png", "folder.jpg"]:
|
||||
for fn in self.args.th_covers.split(","):
|
||||
fp = os.path.join(abspath, fn)
|
||||
if os.path.exists(fp):
|
||||
vrem = "{}/{}".format(vrem.rstrip("/"), fn)
|
||||
@@ -1602,9 +1634,8 @@ class HttpCli(object):
|
||||
if self.writable:
|
||||
perms.append("write")
|
||||
|
||||
url_suf = self.urlq()
|
||||
url_suf = self.urlq({}, [])
|
||||
is_ls = "ls" in self.uparam
|
||||
ts = "" # "?{}".format(time.time())
|
||||
|
||||
tpl = "browser"
|
||||
if "b" in self.uparam:
|
||||
@@ -1629,7 +1660,6 @@ class HttpCli(object):
|
||||
"vdir": quotep(self.vpath),
|
||||
"vpnodes": vpnodes,
|
||||
"files": [],
|
||||
"ts": ts,
|
||||
"perms": json.dumps(perms),
|
||||
"taglist": [],
|
||||
"tag_order": [],
|
||||
@@ -1765,28 +1795,44 @@ class HttpCli(object):
|
||||
fn = f["name"]
|
||||
rd = f["rd"]
|
||||
del f["rd"]
|
||||
if icur:
|
||||
if vn != dbv:
|
||||
_, rd = vn.get_dbv(rd)
|
||||
if not icur:
|
||||
break
|
||||
|
||||
if vn != dbv:
|
||||
_, rd = vn.get_dbv(rd)
|
||||
|
||||
q = "select w from up where rd = ? and fn = ?"
|
||||
r = None
|
||||
try:
|
||||
r = icur.execute(q, (rd, fn)).fetchone()
|
||||
except Exception as ex:
|
||||
if "database is locked" in str(ex):
|
||||
break
|
||||
|
||||
q = "select w from up where rd = ? and fn = ?"
|
||||
try:
|
||||
r = icur.execute(q, (rd, fn)).fetchone()
|
||||
except:
|
||||
args = s3enc(idx.mem_cur, rd, fn)
|
||||
r = icur.execute(q, args).fetchone()
|
||||
except:
|
||||
m = "tag list error, {}/{}\n{}"
|
||||
self.log(m.format(rd, fn, min_ex()))
|
||||
break
|
||||
|
||||
tags = {}
|
||||
f["tags"] = tags
|
||||
tags = {}
|
||||
f["tags"] = tags
|
||||
|
||||
if not r:
|
||||
continue
|
||||
if not r:
|
||||
continue
|
||||
|
||||
w = r[0][:16]
|
||||
q = "select k, v from mt where w = ? and k != 'x'"
|
||||
w = r[0][:16]
|
||||
q = "select k, v from mt where w = ? and k != 'x'"
|
||||
try:
|
||||
for k, v in icur.execute(q, (w,)):
|
||||
taglist[k] = True
|
||||
tags[k] = v
|
||||
except:
|
||||
m = "tag read error, {}/{} [{}]:\n{}"
|
||||
self.log(m.format(rd, fn, w, min_ex()))
|
||||
break
|
||||
|
||||
if icur:
|
||||
taglist = [k for k in vn.flags.get("mte", "").split(",") if k in taglist]
|
||||
|
||||
@@ -3,7 +3,6 @@ from __future__ import print_function, unicode_literals
|
||||
|
||||
import re
|
||||
import os
|
||||
import sys
|
||||
import time
|
||||
import socket
|
||||
|
||||
@@ -35,7 +34,6 @@ class HttpConn(object):
|
||||
|
||||
self.args = hsrv.args
|
||||
self.asrv = hsrv.asrv
|
||||
self.is_mp = hsrv.is_mp
|
||||
self.cert_path = hsrv.cert_path
|
||||
|
||||
enth = HAVE_PIL and not self.args.no_thumb
|
||||
@@ -44,8 +42,8 @@ class HttpConn(object):
|
||||
|
||||
self.t0 = time.time()
|
||||
self.stopping = False
|
||||
self.nreq = 0
|
||||
self.nbyte = 0
|
||||
self.workload = 0
|
||||
self.u2idx = None
|
||||
self.log_func = hsrv.log
|
||||
self.lf_url = re.compile(self.args.lf_url) if self.args.lf_url else None
|
||||
@@ -184,11 +182,7 @@ class HttpConn(object):
|
||||
self.sr = Unrecv(self.s)
|
||||
|
||||
while not self.stopping:
|
||||
if self.is_mp:
|
||||
self.workload += 50
|
||||
if self.workload >= 2 ** 31:
|
||||
self.workload = 100
|
||||
|
||||
self.nreq += 1
|
||||
cli = HttpCli(self)
|
||||
if not cli.run():
|
||||
return
|
||||
|
||||
@@ -4,6 +4,8 @@ from __future__ import print_function, unicode_literals
|
||||
import os
|
||||
import sys
|
||||
import time
|
||||
import math
|
||||
import base64
|
||||
import socket
|
||||
import threading
|
||||
|
||||
@@ -24,10 +26,15 @@ except ImportError:
|
||||
)
|
||||
sys.exit(1)
|
||||
|
||||
from .__init__ import E, MACOS
|
||||
from .authsrv import AuthSrv
|
||||
from .__init__ import E, PY2, MACOS
|
||||
from .util import spack, min_ex, start_stackmon, start_log_thrs
|
||||
from .httpconn import HttpConn
|
||||
|
||||
if PY2:
|
||||
import Queue as queue
|
||||
else:
|
||||
import queue
|
||||
|
||||
|
||||
class HttpSrv(object):
|
||||
"""
|
||||
@@ -35,19 +42,28 @@ class HttpSrv(object):
|
||||
relying on MpSrv for performance (HttpSrv is just plain threads)
|
||||
"""
|
||||
|
||||
def __init__(self, broker, is_mp=False):
|
||||
def __init__(self, broker, nid):
|
||||
self.broker = broker
|
||||
self.is_mp = is_mp
|
||||
self.nid = nid
|
||||
self.args = broker.args
|
||||
self.log = broker.log
|
||||
self.asrv = broker.asrv
|
||||
|
||||
self.disconnect_func = None
|
||||
self.name = "httpsrv" + ("-n{}-i{:x}".format(nid, os.getpid()) if nid else "")
|
||||
self.mutex = threading.Lock()
|
||||
self.stopping = False
|
||||
|
||||
self.clients = {}
|
||||
self.workload = 0
|
||||
self.workload_thr_alive = False
|
||||
self.tp_nthr = 0 # actual
|
||||
self.tp_ncli = 0 # fading
|
||||
self.tp_time = None # latest worker collect
|
||||
self.tp_q = None if self.args.no_htp else queue.LifoQueue()
|
||||
|
||||
self.srvs = []
|
||||
self.ncli = 0 # exact
|
||||
self.clients = {} # laggy
|
||||
self.nclimax = 0
|
||||
self.cb_ts = 0
|
||||
self.cb_v = 0
|
||||
|
||||
env = jinja2.Environment()
|
||||
env.loader = jinja2.FileSystemLoader(os.path.join(E.mod, "web"))
|
||||
@@ -62,24 +78,155 @@ class HttpSrv(object):
|
||||
else:
|
||||
self.cert_path = None
|
||||
|
||||
if self.tp_q:
|
||||
self.start_threads(4)
|
||||
|
||||
name = "httpsrv-scaler" + ("-{}".format(nid) if nid else "")
|
||||
t = threading.Thread(target=self.thr_scaler, name=name)
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
if nid:
|
||||
if self.args.stackmon:
|
||||
start_stackmon(self.args.stackmon, nid)
|
||||
|
||||
if self.args.log_thrs:
|
||||
start_log_thrs(self.log, self.args.log_thrs, nid)
|
||||
|
||||
def start_threads(self, n):
|
||||
self.tp_nthr += n
|
||||
if self.args.log_htp:
|
||||
self.log(self.name, "workers += {} = {}".format(n, self.tp_nthr), 6)
|
||||
|
||||
for _ in range(n):
|
||||
thr = threading.Thread(
|
||||
target=self.thr_poolw,
|
||||
name=self.name + "-poolw",
|
||||
)
|
||||
thr.daemon = True
|
||||
thr.start()
|
||||
|
||||
def stop_threads(self, n):
|
||||
self.tp_nthr -= n
|
||||
if self.args.log_htp:
|
||||
self.log(self.name, "workers -= {} = {}".format(n, self.tp_nthr), 6)
|
||||
|
||||
for _ in range(n):
|
||||
self.tp_q.put(None)
|
||||
|
||||
def thr_scaler(self):
|
||||
while True:
|
||||
time.sleep(2 if self.tp_ncli else 30)
|
||||
with self.mutex:
|
||||
self.tp_ncli = max(self.ncli, self.tp_ncli - 2)
|
||||
if self.tp_nthr > self.tp_ncli + 8:
|
||||
self.stop_threads(4)
|
||||
|
||||
def listen(self, sck, nlisteners):
|
||||
ip, port = sck.getsockname()
|
||||
self.srvs.append(sck)
|
||||
self.nclimax = math.ceil(self.args.nc * 1.0 / nlisteners)
|
||||
t = threading.Thread(
|
||||
target=self.thr_listen,
|
||||
args=(sck,),
|
||||
name="httpsrv-n{}-listen-{}-{}".format(self.nid or "0", ip, port),
|
||||
)
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
def thr_listen(self, srv_sck):
|
||||
"""listens on a shared tcp server"""
|
||||
ip, port = srv_sck.getsockname()
|
||||
fno = srv_sck.fileno()
|
||||
msg = "subscribed @ {}:{} f{}".format(ip, port, fno)
|
||||
self.log(self.name, msg)
|
||||
while not self.stopping:
|
||||
if self.args.log_conn:
|
||||
self.log(self.name, "|%sC-ncli" % ("-" * 1,), c="1;30")
|
||||
|
||||
if self.ncli >= self.nclimax:
|
||||
self.log(self.name, "at connection limit; waiting", 3)
|
||||
while self.ncli >= self.nclimax:
|
||||
time.sleep(0.1)
|
||||
|
||||
if self.args.log_conn:
|
||||
self.log(self.name, "|%sC-acc1" % ("-" * 2,), c="1;30")
|
||||
|
||||
try:
|
||||
sck, addr = srv_sck.accept()
|
||||
except (OSError, socket.error) as ex:
|
||||
self.log(self.name, "accept({}): {}".format(fno, ex), c=6)
|
||||
time.sleep(0.02)
|
||||
continue
|
||||
|
||||
if self.args.log_conn:
|
||||
m = "|{}C-acc2 \033[0;36m{} \033[3{}m{}".format(
|
||||
"-" * 3, ip, port % 8, port
|
||||
)
|
||||
self.log("%s %s" % addr, m, c="1;30")
|
||||
|
||||
self.accept(sck, addr)
|
||||
|
||||
def accept(self, sck, addr):
|
||||
"""takes an incoming tcp connection and creates a thread to handle it"""
|
||||
if self.args.log_conn:
|
||||
self.log("%s %s" % addr, "|%sC-cthr" % ("-" * 5,), c="1;30")
|
||||
now = time.time()
|
||||
|
||||
if self.tp_time and now - self.tp_time > 300:
|
||||
self.tp_q = None
|
||||
|
||||
if self.tp_q:
|
||||
self.tp_q.put((sck, addr))
|
||||
with self.mutex:
|
||||
self.ncli += 1
|
||||
self.tp_time = self.tp_time or now
|
||||
self.tp_ncli = max(self.tp_ncli, self.ncli + 1)
|
||||
if self.tp_nthr < self.ncli + 4:
|
||||
self.start_threads(8)
|
||||
return
|
||||
|
||||
if not self.args.no_htp:
|
||||
m = "looks like the httpserver threadpool died; please make an issue on github and tell me the story of how you pulled that off, thanks and dog bless\n"
|
||||
self.log(self.name, m, 1)
|
||||
|
||||
with self.mutex:
|
||||
self.ncli += 1
|
||||
|
||||
thr = threading.Thread(
|
||||
target=self.thr_client,
|
||||
args=(sck, addr),
|
||||
name="httpsrv-{}-{}".format(addr[0].split(".", 2)[-1][-6:], addr[1]),
|
||||
name="httpconn-{}-{}".format(addr[0].split(".", 2)[-1][-6:], addr[1]),
|
||||
)
|
||||
thr.daemon = True
|
||||
thr.start()
|
||||
|
||||
def num_clients(self):
|
||||
with self.mutex:
|
||||
return len(self.clients)
|
||||
def thr_poolw(self):
|
||||
while True:
|
||||
task = self.tp_q.get()
|
||||
if not task:
|
||||
break
|
||||
|
||||
with self.mutex:
|
||||
self.tp_time = None
|
||||
|
||||
try:
|
||||
sck, addr = task
|
||||
me = threading.current_thread()
|
||||
me.name = "httpconn-{}-{}".format(
|
||||
addr[0].split(".", 2)[-1][-6:], addr[1]
|
||||
)
|
||||
self.thr_client(sck, addr)
|
||||
me.name = self.name + "-poolw"
|
||||
except:
|
||||
self.log(self.name, "thr_client: " + min_ex(), 3)
|
||||
|
||||
def shutdown(self):
|
||||
self.stopping = True
|
||||
for srv in self.srvs:
|
||||
try:
|
||||
srv.close()
|
||||
except:
|
||||
pass
|
||||
|
||||
clients = list(self.clients.keys())
|
||||
for cli in clients:
|
||||
try:
|
||||
@@ -87,7 +234,14 @@ class HttpSrv(object):
|
||||
except:
|
||||
pass
|
||||
|
||||
self.log("httpsrv-n", "ok bye")
|
||||
if self.tp_q:
|
||||
self.stop_threads(self.tp_nthr)
|
||||
for _ in range(10):
|
||||
time.sleep(0.05)
|
||||
if self.tp_q.empty():
|
||||
break
|
||||
|
||||
self.log(self.name, "ok bye")
|
||||
|
||||
def thr_client(self, sck, addr):
|
||||
"""thread managing one tcp client"""
|
||||
@@ -97,25 +251,15 @@ class HttpSrv(object):
|
||||
with self.mutex:
|
||||
self.clients[cli] = 0
|
||||
|
||||
if self.is_mp:
|
||||
self.workload += 50
|
||||
if not self.workload_thr_alive:
|
||||
self.workload_thr_alive = True
|
||||
thr = threading.Thread(
|
||||
target=self.thr_workload, name="httpsrv-workload"
|
||||
)
|
||||
thr.daemon = True
|
||||
thr.start()
|
||||
|
||||
fno = sck.fileno()
|
||||
try:
|
||||
if self.args.log_conn:
|
||||
self.log("%s %s" % addr, "|%sC-crun" % ("-" * 6,), c="1;30")
|
||||
self.log("%s %s" % addr, "|%sC-crun" % ("-" * 4,), c="1;30")
|
||||
|
||||
cli.run()
|
||||
|
||||
except (OSError, socket.error) as ex:
|
||||
if ex.errno not in [10038, 10054, 107, 57, 9]:
|
||||
if ex.errno not in [10038, 10054, 107, 57, 49, 9]:
|
||||
self.log(
|
||||
"%s %s" % addr,
|
||||
"run({}): {}".format(fno, ex),
|
||||
@@ -125,7 +269,7 @@ class HttpSrv(object):
|
||||
finally:
|
||||
sck = cli.s
|
||||
if self.args.log_conn:
|
||||
self.log("%s %s" % addr, "|%sC-cdone" % ("-" * 7,), c="1;30")
|
||||
self.log("%s %s" % addr, "|%sC-cdone" % ("-" * 5,), c="1;30")
|
||||
|
||||
try:
|
||||
fno = sck.fileno()
|
||||
@@ -138,42 +282,37 @@ class HttpSrv(object):
|
||||
"shut({}): {}".format(fno, ex),
|
||||
c="1;30",
|
||||
)
|
||||
if ex.errno not in [10038, 10054, 107, 57, 9]:
|
||||
if ex.errno not in [10038, 10054, 107, 57, 49, 9]:
|
||||
# 10038 No longer considered a socket
|
||||
# 10054 Forcibly closed by remote
|
||||
# 107 Transport endpoint not connected
|
||||
# 57 Socket is not connected
|
||||
# 49 Can't assign requested address (wifi down)
|
||||
# 9 Bad file descriptor
|
||||
raise
|
||||
finally:
|
||||
with self.mutex:
|
||||
del self.clients[cli]
|
||||
self.ncli -= 1
|
||||
|
||||
if self.disconnect_func:
|
||||
self.disconnect_func(addr) # pylint: disable=not-callable
|
||||
def cachebuster(self):
|
||||
if time.time() - self.cb_ts < 1:
|
||||
return self.cb_v
|
||||
|
||||
def thr_workload(self):
|
||||
"""indicates the python interpreter workload caused by this HttpSrv"""
|
||||
# avoid locking in extract_filedata by tracking difference here
|
||||
while True:
|
||||
time.sleep(0.2)
|
||||
with self.mutex:
|
||||
if not self.clients:
|
||||
# no clients rn, terminate thread
|
||||
self.workload_thr_alive = False
|
||||
self.workload = 0
|
||||
return
|
||||
with self.mutex:
|
||||
if time.time() - self.cb_ts < 1:
|
||||
return self.cb_v
|
||||
|
||||
total = 0
|
||||
with self.mutex:
|
||||
for cli in self.clients.keys():
|
||||
now = cli.workload
|
||||
delta = now - self.clients[cli]
|
||||
if delta < 0:
|
||||
# was reset in HttpCli to prevent overflow
|
||||
delta = now
|
||||
v = E.t0
|
||||
try:
|
||||
with os.scandir(os.path.join(E.mod, "web")) as dh:
|
||||
for fh in dh:
|
||||
inf = fh.stat(follow_symlinks=False)
|
||||
v = max(v, inf.st_mtime)
|
||||
except:
|
||||
pass
|
||||
|
||||
total += delta
|
||||
self.clients[cli] = now
|
||||
|
||||
self.workload = total
|
||||
v = base64.urlsafe_b64encode(spack(b">xxL", int(v)))
|
||||
self.cb_v = v.decode("ascii")[-4:]
|
||||
self.cb_ts = time.time()
|
||||
return self.cb_v
|
||||
|
||||
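The `cachebuster` added above derives a short token from the newest mtime under the bundled `web/` directory and caches it for one second; a simplified, non-thread-safe sketch of the same idea (the real one also uses `E.t0` as a floor, `os.scandir`, and a mutex):

```python
import base64
import os
import struct
import time

_cb = {"ts": 0.0, "v": ""}

def cachebuster(webroot):
    # newest mtime under webroot, packed big-endian, shortened to a
    # 4-char urlsafe-base64 token, recomputed at most once per second
    if time.time() - _cb["ts"] < 1:
        return _cb["v"]

    v = 0.0
    for fn in os.listdir(webroot):
        v = max(v, os.stat(os.path.join(webroot, fn)).st_mtime)

    tok = base64.urlsafe_b64encode(struct.pack(">xxL", int(v)))
    _cb["v"] = tok.decode("ascii")[-4:]
    _cb["ts"] = time.time()
    return _cb["v"]
```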
@@ -7,12 +7,9 @@ import json
|
||||
import shutil
|
||||
import subprocess as sp
|
||||
|
||||
from .__init__ import PY2, WINDOWS
|
||||
from .__init__ import PY2, WINDOWS, unicode
|
||||
from .util import fsenc, fsdec, uncyg, REKOBO_LKEY
|
||||
|
||||
if not PY2:
|
||||
unicode = str
|
||||
|
||||
|
||||
def have_ff(cmd):
|
||||
if PY2:
|
||||
@@ -115,6 +112,19 @@ def parse_ffprobe(txt):
|
||||
ret = {} # processed
|
||||
md = {} # raw tags
|
||||
|
||||
is_audio = fmt.get("format_name") in ["mp3", "ogg", "flac", "wav"]
|
||||
if fmt.get("filename", "").split(".")[-1].lower() in ["m4a", "aac"]:
|
||||
is_audio = True
|
||||
|
||||
# if audio file, ensure audio stream appears first
|
||||
if (
|
||||
is_audio
|
||||
and len(streams) > 2
|
||||
and streams[1].get("codec_type") != "audio"
|
||||
and streams[2].get("codec_type") == "audio"
|
||||
):
|
||||
streams = [fmt, streams[2], streams[1]] + streams[3:]
|
||||
|
||||
have = {}
|
||||
for strm in streams:
|
||||
typ = strm.get("codec_type")
|
||||
@@ -134,9 +144,7 @@ def parse_ffprobe(txt):
|
||||
]
|
||||
|
||||
if typ == "video":
|
||||
if strm.get("DISPOSITION:attached_pic") == "1" or fmt.get(
|
||||
"format_name"
|
||||
) in ["mp3", "ogg", "flac"]:
|
||||
if strm.get("DISPOSITION:attached_pic") == "1" or is_audio:
|
||||
continue
|
||||
|
||||
kvm = [
|
||||
@@ -180,7 +188,7 @@ def parse_ffprobe(txt):
|
||||
|
||||
k = k[4:].strip()
|
||||
v = v.strip()
|
||||
if k and v:
|
||||
if k and v and k not in md:
|
||||
md[k] = [v]
|
||||
|
||||
for k in [".q", ".vq", ".aq"]:
|
||||
|
||||
@@ -33,10 +33,11 @@ class QFile(object):
|
||||
class StreamTar(object):
|
||||
"""construct in-memory tar file from the given path"""
|
||||
|
||||
def __init__(self, fgen, **kwargs):
|
||||
def __init__(self, log, fgen, **kwargs):
|
||||
self.ci = 0
|
||||
self.co = 0
|
||||
self.qfile = QFile()
|
||||
self.log = log
|
||||
self.fgen = fgen
|
||||
self.errf = None
|
||||
|
||||
@@ -91,7 +92,8 @@ class StreamTar(object):
|
||||
errors.append([f["vp"], repr(ex)])
|
||||
|
||||
if errors:
|
||||
self.errf = errdesc(errors)
|
||||
self.errf, txt = errdesc(errors)
|
||||
self.log("\n".join(([repr(self.errf)] + txt[1:])))
|
||||
self.ser(self.errf)
|
||||
|
||||
self.tar.close()
|
||||
|
||||
@@ -25,4 +25,4 @@ def errdesc(errors):
|
||||
"vp": "archive-errors-{}.txt".format(dt),
|
||||
"ap": tf_path,
|
||||
"st": os.stat(tf_path),
|
||||
}
|
||||
}, report
|
||||
|
||||
@@ -5,12 +5,13 @@ import re
|
||||
import os
|
||||
import sys
|
||||
import time
|
||||
import shlex
|
||||
import threading
|
||||
from datetime import datetime, timedelta
|
||||
import calendar
|
||||
|
||||
from .__init__ import PY2, WINDOWS, MACOS, VT100
|
||||
from .util import mp
|
||||
from .__init__ import E, PY2, WINDOWS, MACOS, VT100
|
||||
from .util import mp, start_log_thrs, start_stackmon
|
||||
from .authsrv import AuthSrv
|
||||
from .tcpsrv import TcpSrv
|
||||
from .up2k import Up2k
|
||||
@@ -28,14 +29,24 @@ class SvcHub(object):
|
||||
put() can return a queue (if want_reply=True) which has a blocking get() with the response.
|
||||
"""
|
||||
|
||||
def __init__(self, args):
|
||||
def __init__(self, args, argv, printed):
|
||||
self.args = args
|
||||
self.argv = argv
|
||||
self.logf = None
|
||||
|
||||
self.ansi_re = re.compile("\033\\[[^m]*m")
|
||||
self.log_mutex = threading.Lock()
|
||||
self.next_day = 0
|
||||
|
||||
self.log = self._log_disabled if args.q else self._log_enabled
|
||||
if args.lo:
|
||||
self._setup_logfile(printed)
|
||||
|
||||
if args.stackmon:
|
||||
start_stackmon(args.stackmon, 0)
|
||||
|
||||
if args.log_thrs:
|
||||
start_log_thrs(self.log, args.log_thrs, 0)
|
||||
|
||||
# initiate all services to manage
|
||||
self.asrv = AuthSrv(self.args, self.log, False)
|
||||
@@ -69,6 +80,52 @@ class SvcHub(object):
|
||||
|
||||
self.broker = Broker(self)
|
||||
|
||||
def _logname(self):
|
||||
dt = datetime.utcfromtimestamp(time.time())
|
||||
fn = self.args.lo
|
||||
for fs in "YmdHMS":
|
||||
fs = "%" + fs
|
||||
if fs in fn:
|
||||
fn = fn.replace(fs, dt.strftime(fs))
|
||||
|
||||
return fn
|
||||
|
||||
def _setup_logfile(self, printed):
|
||||
base_fn = fn = sel_fn = self._logname()
|
||||
if fn != self.args.lo:
|
||||
ctr = 0
|
||||
# yup this is a race; if started sufficiently concurrently, two
|
||||
# copyparties can grab the same logfile (considered and ignored)
|
||||
while os.path.exists(sel_fn):
|
||||
ctr += 1
|
||||
sel_fn = "{}.{}".format(fn, ctr)
|
||||
|
||||
fn = sel_fn
|
||||
|
||||
try:
|
||||
import lzma
|
||||
|
||||
lh = lzma.open(fn, "wt", encoding="utf-8", errors="replace", preset=0)
|
||||
|
||||
except:
|
||||
import codecs
|
||||
|
||||
lh = codecs.open(fn, "w", encoding="utf-8", errors="replace")
|
||||
|
||||
lh.base_fn = base_fn
|
||||
|
||||
argv = [sys.executable] + self.argv
|
||||
if hasattr(shlex, "quote"):
|
||||
argv = [shlex.quote(x) for x in argv]
|
||||
else:
|
||||
argv = ['"{}"'.format(x) for x in argv]
|
||||
|
||||
msg = "[+] opened logfile [{}]\n".format(fn)
|
||||
printed += msg
|
||||
lh.write("t0: {:.3f}\nargv: {}\n\n{}".format(E.t0, " ".join(argv), printed))
|
||||
self.logf = lh
|
||||
print(msg, end="")
|
||||
|
||||
def run(self):
|
||||
thr = threading.Thread(target=self.tcpsrv.run, name="svchub-main")
|
||||
thr.daemon = True
|
||||
@@ -99,9 +156,36 @@ class SvcHub(object):
|
||||
print("nailed it", end="")
|
||||
finally:
|
||||
print("\033[0m")
|
||||
if self.logf:
|
||||
self.logf.close()
|
||||
|
||||
def _log_disabled(self, src, msg, c=0):
|
||||
pass
|
||||
if not self.logf:
|
||||
return
|
||||
|
||||
with self.log_mutex:
|
||||
ts = datetime.utcfromtimestamp(time.time())
|
||||
ts = ts.strftime("%Y-%m%d-%H%M%S.%f")[:-3]
|
||||
self.logf.write("@{} [{}] {}\n".format(ts, src, msg))
|
||||
|
||||
now = time.time()
|
||||
if now >= self.next_day:
|
||||
self._set_next_day()
|
||||
|
||||
def _set_next_day(self):
if self.next_day and self.logf and self.logf.base_fn != self._logname():
self.logf.close()
self._setup_logfile("")

dt = datetime.utcfromtimestamp(time.time())

# unix timestamp of next 00:00:00 (leap-seconds safe)
day_now = dt.day
while dt.day == day_now:
dt += timedelta(hours=12)

dt = dt.replace(hour=0, minute=0, second=0)
self.next_day = calendar.timegm(dt.utctimetuple())

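`_set_next_day` finds the next UTC midnight by stepping forward in 12-hour increments until the day changes, which sidesteps manual calendar math; the same calculation as a standalone function:

```python
import calendar
import time
from datetime import datetime, timedelta

def next_utc_midnight(now=None):
    # step in 12h increments until the (UTC) day flips, then clamp to 00:00:00
    dt = datetime.utcfromtimestamp(now if now is not None else time.time())
    day_now = dt.day
    while dt.day == day_now:
        dt += timedelta(hours=12)
    dt = dt.replace(hour=0, minute=0, second=0, microsecond=0)
    return calendar.timegm(dt.utctimetuple())

print(next_utc_midnight(0))  # 86400
```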
def _log_enabled(self, src, msg, c=0):
|
||||
"""handles logging from all components"""
|
||||
@@ -110,14 +194,7 @@ class SvcHub(object):
|
||||
if now >= self.next_day:
|
||||
dt = datetime.utcfromtimestamp(now)
|
||||
print("\033[36m{}\033[0m\n".format(dt.strftime("%Y-%m-%d")), end="")
|
||||
|
||||
# unix timestamp of next 00:00:00 (leap-seconds safe)
|
||||
day_now = dt.day
|
||||
while dt.day == day_now:
|
||||
dt += timedelta(hours=12)
|
||||
|
||||
dt = dt.replace(hour=0, minute=0, second=0)
|
||||
self.next_day = calendar.timegm(dt.utctimetuple())
|
||||
self._set_next_day()
|
||||
|
||||
fmt = "\033[36m{} \033[33m{:21} \033[0m{}\n"
|
||||
if not VT100:
|
||||
@@ -144,20 +221,20 @@ class SvcHub(object):
|
||||
except:
|
||||
print(msg.encode("ascii", "replace").decode(), end="")
|
||||
|
||||
if self.logf:
|
||||
self.logf.write(msg)
|
||||
|
||||
def check_mp_support(self):
|
||||
vmin = sys.version_info[1]
|
||||
if WINDOWS:
|
||||
msg = "need python 3.3 or newer for multiprocessing;"
|
||||
if PY2:
|
||||
# py2 pickler doesn't support winsock
|
||||
return msg
|
||||
elif vmin < 3:
|
||||
if PY2 or vmin < 3:
|
||||
return msg
|
||||
elif MACOS:
|
||||
return "multiprocessing is wonky on mac osx;"
|
||||
else:
|
||||
msg = "need python 2.7 or 3.3+ for multiprocessing;"
|
||||
if not PY2 and vmin < 3:
|
||||
msg = "need python 3.3+ for multiprocessing;"
|
||||
if PY2 or vmin < 3:
|
||||
return msg
|
||||
|
||||
try:
|
||||
@@ -189,5 +266,5 @@ class SvcHub(object):
|
||||
if not err:
|
||||
return True
|
||||
else:
|
||||
self.log("root", err)
|
||||
self.log("svchub", err)
|
||||
return False
|
||||
|
||||
@@ -4,15 +4,14 @@ from __future__ import print_function, unicode_literals
|
||||
import os
|
||||
import time
|
||||
import zlib
|
||||
import struct
|
||||
from datetime import datetime
|
||||
|
||||
from .sutil import errdesc
|
||||
from .util import yieldfile, sanitize_fn
|
||||
from .util import yieldfile, sanitize_fn, spack, sunpack
|
||||
|
||||
|
||||
def dostime2unix(buf):
|
||||
t, d = struct.unpack("<HH", buf)
|
||||
t, d = sunpack(b"<HH", buf)
|
||||
|
||||
ts = (t & 0x1F) * 2
|
||||
tm = (t >> 5) & 0x3F
|
||||
@@ -36,13 +35,13 @@ def unixtime2dos(ts):
|
||||
|
||||
bd = ((dy - 1980) << 9) + (dm << 5) + dd
|
||||
bt = (th << 11) + (tm << 5) + ts // 2
|
||||
return struct.pack("<HH", bt, bd)
|
||||
return spack(b"<HH", bt, bd)
|
||||
|
||||
|
||||
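`unixtime2dos` above packs a timestamp into the two 16-bit DOS fields; a hedged standalone version using `time.gmtime` (the original's exact field extraction is not shown in this hunk, only the bit layout):

```python
import struct
import time

def unixtime2dos_sketch(ts):
    # DOS timestamp layout: date = (year-1980)<<9 | month<<5 | day,
    # time = hour<<11 | minute<<5 | second//2, packed as two little-endian u16
    t = time.gmtime(ts)
    bd = ((t.tm_year - 1980) << 9) + (t.tm_mon << 5) + t.tm_mday
    bt = (t.tm_hour << 11) + (t.tm_min << 5) + t.tm_sec // 2
    return struct.pack("<HH", bt, bd)

print(unixtime2dos_sketch(time.time()).hex())
```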
def gen_fdesc(sz, crc32, z64):
|
||||
ret = b"\x50\x4b\x07\x08"
|
||||
fmt = "<LQQ" if z64 else "<LLL"
|
||||
ret += struct.pack(fmt, crc32, sz, sz)
|
||||
fmt = b"<LQQ" if z64 else b"<LLL"
|
||||
ret += spack(fmt, crc32, sz, sz)
|
||||
return ret
|
||||
|
||||
|
||||
@@ -66,7 +65,7 @@ def gen_hdr(h_pos, fn, sz, lastmod, utf8, crc32, pre_crc):
|
||||
req_ver = b"\x2d\x00" if z64 else b"\x0a\x00"
|
||||
|
||||
if crc32:
|
||||
crc32 = struct.pack("<L", crc32)
|
||||
crc32 = spack(b"<L", crc32)
|
||||
else:
|
||||
crc32 = b"\x00" * 4
|
||||
|
||||
@@ -87,14 +86,14 @@ def gen_hdr(h_pos, fn, sz, lastmod, utf8, crc32, pre_crc):
|
||||
# however infozip does actual sz and it even works on winxp
|
||||
# (same reasoning for z64 extradata later)
|
||||
vsz = 0xFFFFFFFF if z64 else sz
|
||||
ret += struct.pack("<LL", vsz, vsz)
|
||||
ret += spack(b"<LL", vsz, vsz)
|
||||
|
||||
# windows support (the "?" replace below too)
|
||||
fn = sanitize_fn(fn, ok="/")
|
||||
fn = sanitize_fn(fn, "/", [])
|
||||
bfn = fn.encode("utf-8" if utf8 else "cp437", "replace").replace(b"?", b"_")
|
||||
|
||||
z64_len = len(z64v) * 8 + 4 if z64v else 0
|
||||
ret += struct.pack("<HH", len(bfn), z64_len)
|
||||
ret += spack(b"<HH", len(bfn), z64_len)
|
||||
|
||||
if h_pos is not None:
|
||||
# 2b comment, 2b diskno
|
||||
@@ -106,12 +105,12 @@ def gen_hdr(h_pos, fn, sz, lastmod, utf8, crc32, pre_crc):
|
||||
ret += b"\x01\x00\x00\x00\xa4\x81"
|
||||
|
||||
# 4b local-header-ofs
|
||||
ret += struct.pack("<L", min(h_pos, 0xFFFFFFFF))
|
||||
ret += spack(b"<L", min(h_pos, 0xFFFFFFFF))
|
||||
|
||||
ret += bfn
|
||||
|
||||
if z64v:
|
||||
ret += struct.pack("<HH" + "Q" * len(z64v), 1, len(z64v) * 8, *z64v)
|
||||
ret += spack(b"<HH" + b"Q" * len(z64v), 1, len(z64v) * 8, *z64v)
|
||||
|
||||
return ret
|
||||
|
||||
@@ -136,7 +135,7 @@ def gen_ecdr(items, cdir_pos, cdir_end):
|
||||
need_64 = nitems == 0xFFFF or 0xFFFFFFFF in [csz, cpos]
|
||||
|
||||
# 2b tnfiles, 2b dnfiles, 4b dir sz, 4b dir pos
|
||||
ret += struct.pack("<HHLL", nitems, nitems, csz, cpos)
|
||||
ret += spack(b"<HHLL", nitems, nitems, csz, cpos)
|
||||
|
||||
# 2b comment length
|
||||
ret += b"\x00\x00"
|
||||
@@ -163,7 +162,7 @@ def gen_ecdr64(items, cdir_pos, cdir_end):
|
||||
|
||||
# 8b tnfiles, 8b dnfiles, 8b dir sz, 8b dir pos
|
||||
cdir_sz = cdir_end - cdir_pos
|
||||
ret += struct.pack("<QQQQ", len(items), len(items), cdir_sz, cdir_pos)
|
||||
ret += spack(b"<QQQQ", len(items), len(items), cdir_sz, cdir_pos)
|
||||
|
||||
return ret
|
||||
|
||||
@@ -178,13 +177,14 @@ def gen_ecdr64_loc(ecdr64_pos):
|
||||
ret = b"\x50\x4b\x06\x07"
|
||||
|
||||
# 4b cdisk, 8b start of ecdr64, 4b ndisks
|
||||
ret += struct.pack("<LQL", 0, ecdr64_pos, 1)
|
||||
ret += spack(b"<LQL", 0, ecdr64_pos, 1)
|
||||
|
||||
return ret
|
||||
|
||||
|
||||
class StreamZip(object):
|
||||
def __init__(self, fgen, utf8=False, pre_crc=False):
|
||||
def __init__(self, log, fgen, utf8=False, pre_crc=False):
|
||||
self.log = log
|
||||
self.fgen = fgen
|
||||
self.utf8 = utf8
|
||||
self.pre_crc = pre_crc
|
||||
@@ -247,8 +247,8 @@ class StreamZip(object):
|
||||
errors.append([f["vp"], repr(ex)])
|
||||
|
||||
if errors:
|
||||
errf = errdesc(errors)
|
||||
print(repr(errf))
|
||||
errf, txt = errdesc(errors)
|
||||
self.log("\n".join(([repr(errf)] + txt[1:])))
|
||||
for x in self.ser(errf):
|
||||
yield x
|
||||
|
||||
|
||||
@@ -2,11 +2,9 @@
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import re
|
||||
import time
|
||||
import socket
|
||||
import select
|
||||
|
||||
from .util import chkcmd, Counter
|
||||
from .util import chkcmd
|
||||
|
||||
|
||||
class TcpSrv(object):
|
||||
@@ -20,7 +18,6 @@ class TcpSrv(object):
|
||||
self.args = hub.args
|
||||
self.log = hub.log
|
||||
|
||||
self.num_clients = Counter()
|
||||
self.stopping = False
|
||||
|
||||
ip = "127.0.0.1"
|
||||
@@ -66,44 +63,13 @@ class TcpSrv(object):
|
||||
for srv in self.srv:
|
||||
srv.listen(self.args.nc)
|
||||
ip, port = srv.getsockname()
|
||||
self.log("tcpsrv", "listening @ {0}:{1}".format(ip, port))
|
||||
fno = srv.fileno()
|
||||
msg = "listening @ {}:{} f{}".format(ip, port, fno)
|
||||
self.log("tcpsrv", msg)
|
||||
if self.args.q:
|
||||
print(msg)
|
||||
|
||||
while not self.stopping:
|
||||
if self.args.log_conn:
|
||||
self.log("tcpsrv", "|%sC-ncli" % ("-" * 1,), c="1;30")
|
||||
|
||||
if self.num_clients.v >= self.args.nc:
|
||||
time.sleep(0.1)
|
||||
continue
|
||||
|
||||
if self.args.log_conn:
|
||||
self.log("tcpsrv", "|%sC-acc1" % ("-" * 2,), c="1;30")
|
||||
|
||||
try:
|
||||
# macos throws bad-fd
|
||||
ready, _, _ = select.select(self.srv, [], [])
|
||||
except:
|
||||
ready = []
|
||||
if not self.stopping:
|
||||
raise
|
||||
|
||||
for srv in ready:
|
||||
if self.stopping:
|
||||
break
|
||||
|
||||
sck, addr = srv.accept()
|
||||
sip, sport = srv.getsockname()
|
||||
if self.args.log_conn:
|
||||
self.log(
|
||||
"%s %s" % addr,
|
||||
"|{}C-acc2 \033[0;36m{} \033[3{}m{}".format(
|
||||
"-" * 3, sip, sport % 8, sport
|
||||
),
|
||||
c="1;30",
|
||||
)
|
||||
|
||||
self.num_clients.add()
|
||||
self.hub.broker.put(False, "httpconn", sck, addr)
|
||||
self.hub.broker.put(False, "listen", srv)
|
||||
|
||||
def shutdown(self):
|
||||
self.stopping = True
|
||||
|
||||
@@ -9,15 +9,11 @@ import hashlib
|
||||
import threading
|
||||
import subprocess as sp
|
||||
|
||||
from .__init__ import PY2
|
||||
from .__init__ import PY2, unicode
|
||||
from .util import fsenc, runcmd, Queue, Cooldown, BytesIO, min_ex
|
||||
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE, ffprobe
|
||||
|
||||
|
||||
if not PY2:
|
||||
unicode = str
|
||||
|
||||
|
||||
HAVE_PIL = False
|
||||
HAVE_HEIF = False
|
||||
HAVE_AVIF = False
|
||||
@@ -53,7 +49,7 @@ except:
|
||||
# https://pillow.readthedocs.io/en/stable/handbook/image-file-formats.html
|
||||
# ffmpeg -formats
|
||||
FMT_PIL = "bmp dib gif icns ico jpg jpeg jp2 jpx pcx png pbm pgm ppm pnm sgi tga tif tiff webp xbm dds xpm"
|
||||
FMT_FF = "av1 asf avi flv m4v mkv mjpeg mjpg mpg mpeg mpg2 mpeg2 h264 avc h265 hevc mov 3gp mp4 ts mpegts nut ogv ogm rm vob webm wmv"
|
||||
FMT_FF = "av1 asf avi flv m4v mkv mjpeg mjpg mpg mpeg mpg2 mpeg2 h264 avc mts h265 hevc mov 3gp mp4 ts mpegts nut ogv ogm rm vob webm wmv"
|
||||
|
||||
if HAVE_HEIF:
|
||||
FMT_PIL += " heif heifs heic heics"
|
||||
@@ -134,9 +130,10 @@ class ThumbSrv(object):
|
||||
msg += ", ".join(missing)
|
||||
self.log(msg, c=3)
|
||||
|
||||
t = threading.Thread(target=self.cleaner, name="thumb-cleaner")
|
||||
t.daemon = True
|
||||
t.start()
|
||||
if self.args.th_clean:
|
||||
t = threading.Thread(target=self.cleaner, name="thumb-cleaner")
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
def log(self, msg, c=0):
|
||||
self.log_func("thumb", msg, c)
|
||||
@@ -263,7 +260,7 @@ class ThumbSrv(object):
|
||||
pass # default q = 75
|
||||
|
||||
if im.mode not in fmts:
|
||||
print("conv {}".format(im.mode))
|
||||
# print("conv {}".format(im.mode))
|
||||
im = im.convert("RGB")
|
||||
|
||||
im.save(tpath, quality=40, method=6)
|
||||
|
||||
@@ -26,7 +26,7 @@ class U2idx(object):
|
||||
self.timeout = self.args.srch_time
|
||||
|
||||
if not HAVE_SQLITE3:
|
||||
self.log("could not load sqlite3; searchign wqill be disabled")
|
||||
self.log("your python does not have sqlite3; searching will be disabled")
|
||||
return
|
||||
|
||||
self.cur = {}
|
||||
@@ -57,6 +57,9 @@ class U2idx(object):
|
||||
raise Pebkac(500, min_ex())
|
||||
|
||||
def get_cur(self, ptop):
|
||||
if not HAVE_SQLITE3:
|
||||
return None
|
||||
|
||||
cur = self.cur.get(ptop)
|
||||
if cur:
|
||||
return cur
|
||||
@@ -66,7 +69,7 @@ class U2idx(object):
|
||||
if not os.path.exists(db_path):
|
||||
return None
|
||||
|
||||
cur = sqlite3.connect(db_path).cursor()
|
||||
cur = sqlite3.connect(db_path, 2).cursor()
|
||||
self.cur[ptop] = cur
|
||||
return cur
|
||||
|
||||
|
||||
@@ -103,13 +103,15 @@ class Up2k(object):
|
||||
self.deferred_init()
|
||||
else:
|
||||
t = threading.Thread(
|
||||
target=self.deferred_init,
|
||||
name="up2k-deferred-init",
|
||||
target=self.deferred_init, name="up2k-deferred-init", args=(0.5,)
|
||||
)
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
def deferred_init(self):
|
||||
def deferred_init(self, wait=0):
|
||||
if wait:
|
||||
time.sleep(wait)
|
||||
|
||||
all_vols = self.asrv.vfs.all_vols
|
||||
have_e2d = self.init_indexes(all_vols)
|
||||
|
||||
@@ -193,7 +195,7 @@ class Up2k(object):
|
||||
|
||||
return True, ret
|
||||
|
||||
def init_indexes(self, all_vols, scan_vols=[]):
|
||||
def init_indexes(self, all_vols, scan_vols=None):
|
||||
self.pp = ProgressPrinter()
|
||||
vols = all_vols.values()
|
||||
t0 = time.time()
|
||||
@@ -342,7 +344,15 @@ class Up2k(object):
|
||||
for k, v in flags.items()
|
||||
]
|
||||
if a:
|
||||
self.log(" ".join(sorted(a)) + "\033[0m")
|
||||
vpath = "?"
|
||||
for k, v in self.asrv.vfs.all_vols.items():
|
||||
if v.realpath == ptop:
|
||||
vpath = k
|
||||
|
||||
if vpath:
|
||||
vpath += "/"
|
||||
|
||||
self.log("/{} {}".format(vpath, " ".join(sorted(a))), "35")
|
||||
|
||||
reg = {}
|
||||
path = os.path.join(histpath, "up2k.snap")
|
||||
@@ -401,7 +411,7 @@ class Up2k(object):
|
||||
if WINDOWS:
|
||||
excl = [x.replace("/", "\\") for x in excl]
|
||||
|
||||
n_add = self._build_dir(dbw, top, set(excl), top, nohash)
|
||||
n_add = self._build_dir(dbw, top, set(excl), top, nohash, [])
|
||||
n_rm = self._drop_lost(dbw[0], top)
|
||||
if dbw[1]:
|
||||
self.log("commit {} new files".format(dbw[1]))
|
||||
@@ -409,11 +419,25 @@ class Up2k(object):

return True, n_add or n_rm or do_vac

def _build_dir(self, dbw, top, excl, cdir, nohash):
def _build_dir(self, dbw, top, excl, cdir, nohash, seen):
rcdir = cdir
if not ANYWIN:
try:
# a bit expensive but worth
rcdir = os.path.realpath(cdir)
except:
pass

if rcdir in seen:
m = "bailing from symlink loop,\n prev: {}\n curr: {}\n from: {}"
self.log(m.format(seen[-1], rcdir, cdir), 3)
return 0

seen = seen + [cdir]
self.pp.msg = "a{} {}".format(self.pp.n, cdir)
histpath = self.asrv.vfs.histtab[top]
ret = 0
g = statdir(self.log, not self.args.no_scandir, False, cdir)
g = statdir(self.log_func, not self.args.no_scandir, False, cdir)
for iname, inf in sorted(g):
abspath = os.path.join(cdir, iname)
lmod = int(inf.st_mtime)
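`_build_dir` now carries a `seen` list of directories already on the current descent path so a symlink cycle cannot recurse forever; a standalone sketch of that guard using `os.path.realpath` (simplified, not the project's actual walker):

```python
import os

def walk_no_loops(cdir, seen=()):
    # refuse to descend into a directory whose realpath is already on the
    # current path; same idea as the `seen` guard above
    rc = os.path.realpath(cdir)
    if rc in seen:
        return
    seen = seen + (rc,)
    for name in sorted(os.listdir(cdir)):
        ap = os.path.join(cdir, name)
        if os.path.isdir(ap):
            for x in walk_no_loops(ap, seen):
                yield x
        else:
            yield ap
```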
@@ -422,7 +446,7 @@ class Up2k(object):
|
||||
if abspath in excl or abspath == histpath:
|
||||
continue
|
||||
# self.log(" dir: {}".format(abspath))
|
||||
ret += self._build_dir(dbw, top, excl, abspath, nohash)
|
||||
ret += self._build_dir(dbw, top, excl, abspath, nohash, seen)
|
||||
else:
|
||||
# self.log("file: {}".format(abspath))
|
||||
rp = abspath[len(top) + 1 :]
|
||||
@@ -653,7 +677,7 @@ class Up2k(object):
|
||||
try:
|
||||
parser = MParser(parser)
|
||||
except:
|
||||
self.log("invalid argument: " + parser, 1)
|
||||
self.log("invalid argument (could not find program): " + parser, 1)
|
||||
return
|
||||
|
||||
for tag in entags:
|
||||
@@ -901,7 +925,7 @@ class Up2k(object):
|
||||
except:
|
||||
self.log("WARN: could not list files; DB corrupt?\n" + min_ex())
|
||||
|
||||
elif ver > DB_VER:
|
||||
if (ver or 0) > DB_VER:
|
||||
m = "database is version {}, this copyparty only supports versions <= {}"
|
||||
raise Exception(m.format(ver, DB_VER))
|
||||
|
||||
@@ -967,7 +991,7 @@ class Up2k(object):
|
||||
if cj["ptop"] not in self.registry:
|
||||
raise Pebkac(410, "location unavailable")
|
||||
|
||||
cj["name"] = sanitize_fn(cj["name"], bad=[".prologue.html", ".epilogue.html"])
|
||||
cj["name"] = sanitize_fn(cj["name"], "", [".prologue.html", ".epilogue.html"])
|
||||
cj["poke"] = time.time()
|
||||
wark = self._get_wark(cj)
|
||||
now = time.time()
|
||||
@@ -1019,7 +1043,8 @@ class Up2k(object):
|
||||
break
|
||||
except:
|
||||
# missing; restart
|
||||
job = None
|
||||
if not self.args.nw:
|
||||
job = None
|
||||
break
|
||||
else:
|
||||
# file contents match, but not the path
|
||||
@@ -1046,8 +1071,9 @@ class Up2k(object):
|
||||
pdir = os.path.join(cj["ptop"], cj["prel"])
|
||||
job["name"] = self._untaken(pdir, cj["name"], now, cj["addr"])
|
||||
dst = os.path.join(job["ptop"], job["prel"], job["name"])
|
||||
os.unlink(fsenc(dst)) # TODO ed pls
|
||||
self._symlink(src, dst)
|
||||
if not self.args.nw:
|
||||
os.unlink(fsenc(dst)) # TODO ed pls
|
||||
self._symlink(src, dst)
|
||||
|
||||
if not job:
|
||||
job = {
|
||||
@@ -1089,6 +1115,9 @@ class Up2k(object):
|
||||
}
|
||||
|
||||
def _untaken(self, fdir, fname, ts, ip):
|
||||
if self.args.nw:
|
||||
return fname
|
||||
|
||||
# TODO broker which avoid this race and
|
||||
# provides a new filename if taken (same as bup)
|
||||
suffix = ".{:.6f}-{}".format(ts, ip)
|
||||
@@ -1098,6 +1127,9 @@ class Up2k(object):
|
||||
def _symlink(self, src, dst):
|
||||
# TODO store this in linktab so we never delete src if there are links to it
|
||||
self.log("linking dupe:\n {0}\n {1}".format(src, dst))
|
||||
if self.args.nw:
|
||||
return
|
||||
|
||||
try:
|
||||
lsrc = src
|
||||
ldst = dst
|
||||
@@ -1175,6 +1207,10 @@ class Up2k(object):
|
||||
if ret > 0:
|
||||
return ret, src
|
||||
|
||||
if self.args.nw:
|
||||
# del self.registry[ptop][wark]
|
||||
return ret, dst
|
||||
|
||||
atomic_move(src, dst)
|
||||
|
||||
if ANYWIN:
|
||||
@@ -1284,6 +1320,10 @@ class Up2k(object):
|
||||
if self.args.dotpart:
|
||||
tnam = "." + tnam
|
||||
|
||||
if self.args.nw:
|
||||
job["tnam"] = tnam
|
||||
return
|
||||
|
||||
suffix = ".{:.6f}-{}".format(job["t0"], job["addr"])
|
||||
with ren_open(tnam, "wb", fdir=pdir, suffix=suffix) as f:
|
||||
f, job["tnam"] = f["orz"]
|
||||
|
||||
@@ -16,6 +16,7 @@ import mimetypes
|
||||
import contextlib
|
||||
import subprocess as sp # nosec
|
||||
from datetime import datetime
|
||||
from collections import Counter
|
||||
|
||||
from .__init__ import PY2, WINDOWS, ANYWIN
|
||||
from .stolen import surrogateescape
|
||||
@@ -42,6 +43,20 @@ else:
from Queue import Queue # pylint: disable=import-error,no-name-in-module
from StringIO import StringIO as BytesIO


try:
struct.unpack(b">i", b"idgi")
spack = struct.pack
sunpack = struct.unpack
except:

def spack(f, *a, **ka):
return struct.pack(f.decode("ascii"), *a, **ka)

def sunpack(f, *a, **ka):
return struct.unpack(f.decode("ascii"), *a, **ka)


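The `spack`/`sunpack` shim probes at import time whether this interpreter's `struct` accepts bytes format strings and, if not, falls back to decoding the format to ASCII first; a self-contained sketch of the same probe with a round-trip check (assumes format strings are always ASCII):

```python
import struct

try:
    # probe: does this interpreter accept bytes format strings?
    struct.unpack(b">i", b"\x00\x00\x00\x01")
    spack, sunpack = struct.pack, struct.unpack
except (TypeError, struct.error):
    def spack(fmt, *a):
        return struct.pack(fmt.decode("ascii"), *a)

    def sunpack(fmt, *a):
        return struct.unpack(fmt.decode("ascii"), *a)

print(sunpack(b"<HH", spack(b"<HH", 1980, 42)))  # (1980, 42)
```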
surrogateescape.register_surrogateescape()
|
||||
FS_ENCODING = sys.getfilesystemencoding()
|
||||
if WINDOWS and PY2:
|
||||
@@ -123,20 +138,6 @@ REKOBO_KEY = {
|
||||
REKOBO_LKEY = {k.lower(): v for k, v in REKOBO_KEY.items()}
|
||||
|
||||
|
||||
class Counter(object):
|
||||
def __init__(self, v=0):
|
||||
self.v = v
|
||||
self.mutex = threading.Lock()
|
||||
|
||||
def add(self, delta=1):
|
||||
with self.mutex:
|
||||
self.v += delta
|
||||
|
||||
def set(self, absval):
|
||||
with self.mutex:
|
||||
self.v = absval
|
||||
|
||||
|
||||
class Cooldown(object):
|
||||
def __init__(self, maxage):
|
||||
self.maxage = maxage
|
||||
@@ -231,7 +232,7 @@ def nuprint(msg):
|
||||
|
||||
def rice_tid():
|
||||
tid = threading.current_thread().ident
|
||||
c = struct.unpack(b"B" * 5, struct.pack(b">Q", tid)[-5:])
|
||||
c = sunpack(b"B" * 5, spack(b">Q", tid)[-5:])
|
||||
return "".join("\033[1;37;48;5;{}m{:02x}".format(x, x) for x in c) + "\033[0m"
|
||||
|
||||
|
||||
@@ -282,15 +283,69 @@ def alltrace():
|
||||
return "\n".join(rret + bret)
|
||||
|
||||
|
||||
def start_stackmon(arg_str, nid):
|
||||
suffix = "-{}".format(nid) if nid else ""
|
||||
fp, f = arg_str.rsplit(",", 1)
|
||||
f = int(f)
|
||||
t = threading.Thread(
|
||||
target=stackmon,
|
||||
args=(fp, f, suffix),
|
||||
name="stackmon" + suffix,
|
||||
)
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
|
||||
def stackmon(fp, ival, suffix):
|
||||
ctr = 0
|
||||
while True:
|
||||
ctr += 1
|
||||
time.sleep(ival)
|
||||
st = "{}, {}\n{}".format(ctr, time.time(), alltrace())
|
||||
with open(fp + suffix, "wb") as f:
|
||||
f.write(st.encode("utf-8", "replace"))
|
||||
|
||||
|
||||
def start_log_thrs(logger, ival, nid):
|
||||
ival = int(ival)
|
||||
tname = lname = "log-thrs"
|
||||
if nid:
|
||||
tname = "logthr-n{}-i{:x}".format(nid, os.getpid())
|
||||
lname = tname[3:]
|
||||
|
||||
t = threading.Thread(
|
||||
target=log_thrs,
|
||||
args=(logger, ival, lname),
|
||||
name=tname,
|
||||
)
|
||||
t.daemon = True
|
||||
t.start()
|
||||
|
||||
|
||||
def log_thrs(log, ival, name):
|
||||
while True:
|
||||
time.sleep(ival)
|
||||
tv = [x.name for x in threading.enumerate()]
|
||||
tv = [
|
||||
x.split("-")[0]
|
||||
if x.startswith("httpconn-") or x.startswith("thumb-")
|
||||
else "listen"
|
||||
if "-listen-" in x
|
||||
else x
|
||||
for x in tv
|
||||
if not x.startswith("pydevd.")
|
||||
]
|
||||
tv = ["{}\033[36m{}".format(v, k) for k, v in sorted(Counter(tv).items())]
|
||||
log(name, "\033[0m \033[33m".join(tv), 3)
|
||||
|
||||
|
||||
def min_ex():
|
||||
et, ev, tb = sys.exc_info()
|
||||
tb = traceback.extract_tb(tb, 2)
|
||||
ex = [
|
||||
"{} @ {} <{}>: {}".format(fp.split(os.sep)[-1], ln, fun, txt)
|
||||
for fp, ln, fun, txt in tb
|
||||
]
|
||||
ex.append("{}: {}".format(et.__name__, ev))
|
||||
return "\n".join(ex)
|
||||
tb = traceback.extract_tb(tb)
|
||||
fmt = "{} @ {} <{}>: {}"
|
||||
ex = [fmt.format(fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in tb]
|
||||
ex.append("[{}] {}".format(et.__name__, ev))
|
||||
return "\n".join(ex[-8:])
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
@@ -674,7 +729,7 @@ def undot(path):
return "/".join(ret)


def sanitize_fn(fn, ok="", bad=[]):
def sanitize_fn(fn, ok, bad):
if "/" not in ok:
fn = fn.replace("\\", "/").split("/")[-1]

@@ -904,16 +959,10 @@ def yieldfile(fn):
yield buf


def hashcopy(actor, fin, fout):
is_mp = actor.is_mp
def hashcopy(fin, fout):
hashobj = hashlib.sha512()
tlen = 0
for buf in fin:
if is_mp:
actor.workload += 1
if actor.workload > 2 ** 31:
actor.workload = 100

tlen += len(buf)
hashobj.update(buf)
fout.write(buf)
@@ -924,15 +973,10 @@ def hashcopy(actor, fin, fout):
return tlen, hashobj.hexdigest(), digest_b64


def sendfile_py(lower, upper, f, s, actor=None):
def sendfile_py(lower, upper, f, s):
remains = upper - lower
f.seek(lower)
while remains > 0:
if actor:
actor.workload += 1
if actor.workload > 2 ** 31:
actor.workload = 100

# time.sleep(0.01)
buf = f.read(min(1024 * 32, remains))
if not buf:
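The slimmed-down `hashcopy` streams chunks straight from the reader into the file while hashing; a minimal sketch of that copy-and-digest pattern (the exact base64 flavour of the returned digest is an assumption, not taken from the source):

```python
import base64
import hashlib
import io

def hashcopy_sketch(fin, fout):
    # stream chunks into fout while hashing; returns (size, hex, b64)
    h = hashlib.sha512()
    tlen = 0
    for buf in fin:
        tlen += len(buf)
        h.update(buf)
        fout.write(buf)
    b64 = base64.urlsafe_b64encode(h.digest()).decode("ascii")  # assumed flavour
    return tlen, h.hexdigest(), b64

src, dst = io.BytesIO(b"hello world"), io.BytesIO()
print(hashcopy_sketch(iter(lambda: src.read(4), b""), dst)[0])  # 11
```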
@@ -979,8 +1023,7 @@ def statdir(logger, scandir, lstat, top):
|
||||
try:
|
||||
yield [fsdec(fh.name), fh.stat(follow_symlinks=not lstat)]
|
||||
except Exception as ex:
|
||||
msg = "scan-stat: \033[36m{} @ {}"
|
||||
logger(msg.format(repr(ex), fsdec(fh.path)))
|
||||
logger(src, "[s] {} @ {}".format(repr(ex), fsdec(fh.path)), 6)
|
||||
else:
|
||||
src = "listdir"
|
||||
fun = os.lstat if lstat else os.stat
|
||||
@@ -989,11 +1032,10 @@ def statdir(logger, scandir, lstat, top):
|
||||
try:
|
||||
yield [fsdec(name), fun(abspath)]
|
||||
except Exception as ex:
|
||||
msg = "list-stat: \033[36m{} @ {}"
|
||||
logger(msg.format(repr(ex), fsdec(abspath)))
|
||||
logger(src, "[s] {} @ {}".format(repr(ex), fsdec(abspath)), 6)
|
||||
|
||||
except Exception as ex:
|
||||
logger("{}: \033[31m{} @ {}".format(src, repr(ex), top))
|
||||
logger(src, "{} @ {}".format(repr(ex), top), 1)
|
||||
|
||||
|
||||
def unescape_cookie(orig):
|
||||
@@ -1030,7 +1072,13 @@ def guess_mime(url, fallback="application/octet-stream"):
except:
return fallback

return MIMES.get(ext) or mimetypes.guess_type(url)[0] or fallback
ret = MIMES.get(ext) or mimetypes.guess_type(url)[0] or fallback

if ";" not in ret:
if ret.startswith("text/") or ret.endswith("/javascript"):
ret += "; charset=UTF-8"

return ret

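The `guess_mime` change above tacks a UTF-8 charset onto text-like types; a trimmed-down sketch without copyparty's `MIMES` override table:

```python
import mimetypes

def guess_mime_sketch(url, fallback="application/octet-stream"):
    ret = mimetypes.guess_type(url)[0] or fallback  # no MIMES override table here
    if ";" not in ret and (ret.startswith("text/") or ret.endswith("/javascript")):
        ret += "; charset=UTF-8"
    return ret

print(guess_mime_sketch("a.html"), guess_mime_sketch("a.bin"))
```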
def runcmd(*argv):
|
||||
@@ -1064,10 +1112,7 @@ def gzip_orig_sz(fn):
|
||||
with open(fsenc(fn), "rb") as f:
|
||||
f.seek(-4, 2)
|
||||
rv = f.read(4)
|
||||
try:
|
||||
return struct.unpack(b"I", rv)[0]
|
||||
except:
|
||||
return struct.unpack("I", rv)[0]
|
||||
return sunpack(b"I", rv)[0]
|
||||
|
||||
|
||||
def py_desc():
|
||||
|
||||
@@ -28,10 +28,16 @@ window.baguetteBox = (function () {
|
||||
isOverlayVisible = false,
|
||||
touch = {}, // start-pos
|
||||
touchFlag = false, // busy
|
||||
regex = /.+\.(gif|jpe?g|png|webp)/i,
|
||||
re_i = /.+\.(gif|jpe?g|png|webp)(\?|$)/i,
|
||||
re_v = /.+\.(webm|mp4)(\?|$)/i,
|
||||
data = {}, // all galleries
|
||||
imagesElements = [],
|
||||
documentLastFocus = null;
|
||||
documentLastFocus = null,
|
||||
isFullscreen = false;
|
||||
|
||||
var onFSC = function (e) {
|
||||
isFullscreen = !!document.fullscreenElement;
|
||||
};
|
||||
|
||||
var overlayClickHandler = function (event) {
|
||||
if (event.target.id.indexOf('baguette-img') !== -1) {
|
||||
@@ -96,10 +102,6 @@ window.baguetteBox = (function () {
|
||||
data[selector] = selectorData;
|
||||
|
||||
[].forEach.call(galleryNodeList, function (galleryElement) {
|
||||
if (userOptions && userOptions.filter) {
|
||||
regex = userOptions.filter;
|
||||
}
|
||||
|
||||
var tagsNodeList = [];
|
||||
if (galleryElement.tagName === 'A') {
|
||||
tagsNodeList = [galleryElement];
|
||||
@@ -109,7 +111,7 @@ window.baguetteBox = (function () {
|
||||
|
||||
tagsNodeList = [].filter.call(tagsNodeList, function (element) {
|
||||
if (element.className.indexOf(userOptions && userOptions.ignoreClass) === -1) {
|
||||
return regex.test(element.href);
|
||||
return re_i.test(element.href) || re_v.test(element.href);
|
||||
}
|
||||
});
|
||||
if (tagsNodeList.length === 0) {
|
||||
@@ -119,7 +121,7 @@ window.baguetteBox = (function () {
|
||||
var gallery = [];
|
||||
[].forEach.call(tagsNodeList, function (imageElement, imageIndex) {
|
||||
var imageElementClickHandler = function (event) {
|
||||
if (event && event.ctrlKey)
|
||||
if (event && (event.ctrlKey || event.metaKey))
|
||||
return true;
|
||||
|
||||
event.preventDefault ? event.preventDefault() : event.returnValue = false;
|
||||
@@ -209,24 +211,46 @@ window.baguetteBox = (function () {
|
||||
bindEvents();
|
||||
}
|
||||
|
||||
function keyDownHandler(event) {
|
||||
switch (event.keyCode) {
|
||||
case 37: // Left
|
||||
showPreviousImage();
|
||||
break;
|
||||
case 39: // Right
|
||||
showNextImage();
|
||||
break;
|
||||
case 27: // Esc
|
||||
hideOverlay();
|
||||
break;
|
||||
case 36: // Home
|
||||
showFirstImage(event);
|
||||
break;
|
||||
case 35: // End
|
||||
showLastImage(event);
|
||||
break;
|
||||
}
|
||||
function keyDownHandler(e) {
if (e.ctrlKey || e.altKey || e.metaKey || e.isComposing)
return;

var k = e.code + '';

if (k == "ArrowLeft" || k == "KeyJ")
showPreviousImage();
else if (k == "ArrowRight" || k == "KeyL")
showNextImage();
else if (k == "Escape")
hideOverlay();
else if (k == "Home")
showFirstImage(e);
else if (k == "End")
showLastImage(e);
else if (k == "Space" || k == "KeyP" || k == "KeyK")
playpause();
else if (k == "KeyU" || k == "KeyO")
relseek(k == "KeyU" ? -10 : 10);
else if (k == "KeyM" && vid())
vid().muted = !vid().muted;
else if (k == "KeyF")
try {
if (isFullscreen)
document.exitFullscreen();
else
vid().requestFullscreen();
}
catch (ex) { }
}

function keyUpHandler(e) {
if (e.ctrlKey || e.altKey || e.metaKey || e.isComposing)
return;

var k = e.code + '';

if (k == "Space")
ev(e);
}

var passiveSupp = false;
@@ -325,6 +349,8 @@ window.baguetteBox = (function () {
}

bind(document, 'keydown', keyDownHandler);
bind(document, 'keyup', keyUpHandler);
bind(document, 'fullscreenchange', onFSC);
currentIndex = chosenImageIndex;
touch = {
count: 0,
@@ -366,6 +392,7 @@ window.baguetteBox = (function () {

function hideOverlay(e) {
ev(e);
playvid(false);
if (options.noScrollbars) {
document.documentElement.style.overflowY = 'auto';
document.body.style.overflowY = 'auto';
@@ -375,6 +402,8 @@ window.baguetteBox = (function () {
}

unbind(document, 'keydown', keyDownHandler);
unbind(document, 'keyup', keyUpHandler);
unbind(document, 'fullscreenchange', onFSC);
// Fade out and hide the overlay
overlay.className = '';
setTimeout(function () {
@@ -398,8 +427,8 @@ window.baguetteBox = (function () {
return; // out-of-bounds or gallery dirty
}

if (imageContainer.getElementsByTagName('img')[0]) {
// image is loaded, cb and bail
if (imageContainer.querySelector('img, video')) {
// was loaded, cb and bail
if (callback) {
callback();
}
@@ -408,7 +437,7 @@ window.baguetteBox = (function () {

var imageElement = galleryItem.imageElement,
imageSrc = imageElement.href,
thumbnailElement = imageElement.getElementsByTagName('img')[0],
thumbnailElement = imageElement.querySelector('img, video'),
imageCaption = typeof options.captions === 'function' ?
options.captions.call(currentGallery, imageElement) :
imageElement.getAttribute('data-caption') || imageElement.title;
@@ -428,16 +457,20 @@ window.baguetteBox = (function () {
}
imageContainer.appendChild(figure);

var image = mknod('img');
image.onload = function () {
var is_vid = re_v.test(imageSrc),
image = mknod(is_vid ? 'video' : 'img');

clmod(imageContainer, 'vid', is_vid);

image.addEventListener(is_vid ? 'loadedmetadata' : 'load', function () {
// Remove loader element
var spinner = document.querySelector('#baguette-img-' + index + ' .baguetteBox-spinner');
figure.removeChild(spinner);
if (!options.async && callback) {
if (!options.async && callback)
callback();
}
};
});
image.setAttribute('src', imageSrc);
image.setAttribute('controls', 'controls');
image.alt = thumbnailElement ? thumbnailElement.alt || '' : '';
if (options.titleTag && imageCaption) {
image.title = imageCaption;
@@ -498,6 +531,7 @@ window.baguetteBox = (function () {
return false;
}

playvid(false);
currentIndex = index;
loadImage(currentIndex, function () {
preloadNext(currentIndex);
@@ -512,6 +546,26 @@ window.baguetteBox = (function () {
return true;
}

function vid() {
return imagesElements[currentIndex].querySelector('video');
}

function playvid(play) {
if (vid())
vid()[play ? 'play' : 'pause']();
}

function playpause() {
var v = vid();
if (v)
v[v.paused ? "play" : "pause"]();
}

function relseek(sec) {
if (vid())
vid().currentTime += sec;
}

/**
* Triggers the bounce animation
* @param {('left'|'right')} direction - Direction of the movement
@@ -534,6 +588,8 @@ window.baguetteBox = (function () {
} else {
slider.style.transform = 'translate3d(' + offset + ',0,0)';
}
playvid(false);
playvid(true);
}

function preloadNext(index) {
@@ -566,6 +622,7 @@ window.baguetteBox = (function () {
unbindEvents();
clearCachedData();
unbind(document, 'keydown', keyDownHandler);
unbind(document, 'keyup', keyUpHandler);
document.getElementsByTagName('body')[0].removeChild(ebi('baguetteBox-overlay'));
data = {};
currentGallery = [];
@@ -577,6 +634,8 @@ window.baguetteBox = (function () {
show: show,
showNext: showNextImage,
showPrevious: showPreviousImage,
relseek: relseek,
playpause: playpause,
hide: hideOverlay,
destroy: destroyPlugin
};
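
The hunks above extend the baguetteBox lightbox from images to videos. As a condensed, standalone sketch of that pattern (the extension regex and helper names here are illustrative assumptions, not copyparty's actual code):

```js
// illustrative sketch only -- not part of the diff above
var re_v = /\.(webm|mp4|m4v)(\?|$)/i;   // hypothetical list of video extensions

function makeViewer(src) {
    var is_vid = re_v.test(src),
        el = document.createElement(is_vid ? 'video' : 'img');

    // videos signal readiness via 'loadedmetadata', images via 'load'
    el.addEventListener(is_vid ? 'loadedmetadata' : 'load', function () {
        console.log('ready:', src);   // the real code removes the spinner here
    });

    if (is_vid)
        el.setAttribute('controls', 'controls');

    el.src = src;
    return el;
}
```
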
@@ -29,10 +29,10 @@ body {
|
||||
position: fixed;
|
||||
max-width: 34em;
|
||||
background: #222;
|
||||
border: 0 solid #555;
|
||||
border: 0 solid #777;
|
||||
overflow: hidden;
|
||||
margin-top: 1em;
|
||||
padding: 0 1em;
|
||||
padding: 0 1.3em;
|
||||
height: 0;
|
||||
opacity: .1;
|
||||
transition: opacity 0.14s, height 0.14s, padding 0.14s;
|
||||
@@ -40,19 +40,31 @@ body {
|
||||
border-radius: .4em;
|
||||
z-index: 9001;
|
||||
}
|
||||
#tt.b {
|
||||
padding: 0 2em;
|
||||
border-radius: .5em;
|
||||
box-shadow: 0 .2em 1em #000;
|
||||
}
|
||||
#tt.show {
|
||||
padding: 1em;
|
||||
padding: 1em 1.3em;
|
||||
border-width: .4em 0;
|
||||
height: auto;
|
||||
border-width: .2em 0;
|
||||
opacity: 1;
|
||||
}
|
||||
#tt.show.b {
|
||||
padding: 1.5em 2em;
|
||||
border-width: .5em 0;
|
||||
}
|
||||
#tt code {
|
||||
background: #3c3c3c;
|
||||
padding: .2em .3em;
|
||||
padding: .1em .3em;
|
||||
border-top: 1px solid #777;
|
||||
border-radius: .3em;
|
||||
font-family: monospace, monospace;
|
||||
line-height: 2em;
|
||||
line-height: 1.7em;
|
||||
}
|
||||
#tt em {
|
||||
color: #f6a;
|
||||
}
|
||||
#path,
|
||||
#path * {
|
||||
@@ -607,7 +619,7 @@ input.eq_gain {
|
||||
#srch_q {
|
||||
white-space: pre;
|
||||
color: #f80;
|
||||
height: 1em;
|
||||
min-height: 1em;
|
||||
margin: .2em 0 -1em 1.6em;
|
||||
}
|
||||
#tq_raw {
|
||||
@@ -811,10 +823,14 @@ input.eq_gain {
|
||||
padding: 0;
|
||||
border-bottom: 1px solid #555;
|
||||
}
|
||||
#thumbs {
|
||||
#thumbs,
|
||||
#au_osd_cv,
|
||||
#u2tdate {
|
||||
opacity: .3;
|
||||
}
|
||||
#griden.on+#thumbs {
|
||||
#griden.on+#thumbs,
|
||||
#au_os_ctl.on+#au_osd_cv,
|
||||
#u2turbo.on+#u2tdate {
|
||||
opacity: 1;
|
||||
}
|
||||
#ghead {
|
||||
@@ -919,13 +935,16 @@ html.light {
|
||||
}
|
||||
html.light #tt {
|
||||
background: #fff;
|
||||
border-color: #888;
|
||||
border-color: #888 #000 #777 #000;
|
||||
box-shadow: 0 .3em 1em rgba(0,0,0,0.4);
|
||||
}
|
||||
html.light #tt code {
|
||||
background: #060;
|
||||
color: #fff;
|
||||
}
|
||||
html.light #tt em {
|
||||
color: #d38;
|
||||
}
|
||||
html.light #ops,
|
||||
html.light .opbox,
|
||||
html.light #srch_form {
|
||||
@@ -1155,22 +1174,27 @@ html.light #tree::-webkit-scrollbar {
|
||||
margin: 0;
|
||||
height: 100%;
|
||||
}
|
||||
#baguetteBox-overlay .full-image img {
|
||||
#baguetteBox-overlay .full-image img,
|
||||
#baguetteBox-overlay .full-image video {
|
||||
display: inline-block;
|
||||
width: auto;
|
||||
height: auto;
|
||||
max-height: 100%;
|
||||
max-width: 100%;
|
||||
max-height: 100%;
|
||||
max-height: calc(100% - 1.4em);
|
||||
margin-bottom: 1.4em;
|
||||
vertical-align: middle;
|
||||
box-shadow: 0 0 8px rgba(0, 0, 0, 0.6);
|
||||
}
|
||||
#baguetteBox-overlay .full-image video {
|
||||
background: #333;
|
||||
}
|
||||
#baguetteBox-overlay .full-image figcaption {
|
||||
display: block;
|
||||
position: absolute;
|
||||
bottom: 0;
|
||||
position: fixed;
|
||||
bottom: .1em;
|
||||
width: 100%;
|
||||
text-align: center;
|
||||
line-height: 1.8;
|
||||
white-space: normal;
|
||||
color: #ccc;
|
||||
}
|
||||
|
||||
@@ -6,10 +6,10 @@
|
||||
<title>⇆🎉 {{ title }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/browser.css{{ ts }}">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/upload.css{{ ts }}">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/browser.css?_={{ ts }}">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/upload.css?_={{ ts }}">
|
||||
{%- if css %}
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="{{ css }}{{ ts }}">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="{{ css }}?_={{ ts }}">
|
||||
{%- endif %}
|
||||
</head>
|
||||
|
||||
@@ -110,7 +110,7 @@
|
||||
|
||||
<div id="epi" class="logue">{{ logues[1] }}</div>
|
||||
|
||||
<h2><a href="?h">control-panel</a></h2>
|
||||
<h2><a href="/?h">control-panel</a></h2>
|
||||
|
||||
</div>
|
||||
|
||||
@@ -127,9 +127,9 @@
|
||||
have_tags_idx = {{ have_tags_idx|tojson }},
|
||||
have_zip = {{ have_zip|tojson }};
|
||||
</script>
|
||||
<script src="/.cpr/util.js{{ ts }}"></script>
|
||||
<script src="/.cpr/browser.js{{ ts }}"></script>
|
||||
<script src="/.cpr/up2k.js{{ ts }}"></script>
|
||||
<script src="/.cpr/util.js?_={{ ts }}"></script>
|
||||
<script src="/.cpr/browser.js?_={{ ts }}"></script>
|
||||
<script src="/.cpr/up2k.js?_={{ ts }}"></script>
|
||||
</body>
|
||||
|
||||
</html>
|
||||
|
||||
@@ -78,7 +78,7 @@ ebi('op_up2k').innerHTML = (
|
||||
' <tr>\n' +
|
||||
' <td>\n' +
|
||||
' <a href="#" id="nthread_sub">–</a><input\n' +
|
||||
' class="txtbox" id="nthread" value="2"/><a\n' +
|
||||
' class="txtbox" id="nthread" value="2" tt="pause uploads by setting it to 0"/><a\n' +
|
||||
' href="#" id="nthread_add">+</a><br /> \n' +
|
||||
' </td>\n' +
|
||||
' </tr>\n' +
|
||||
@@ -133,6 +133,13 @@ ebi('op_cfg').innerHTML = (
|
||||
(have_zip ? (
|
||||
'<div><h3>folder download</h3><div id="arc_fmt"></div></div>\n'
|
||||
) : '') +
|
||||
'<div>\n' +
|
||||
' <h3>up2k switches</h3>\n' +
|
||||
' <div>\n' +
|
||||
' <a id="u2turbo" class="tgl btn ttb" href="#" tt="the yolo button, you probably DO NOT want to enable this:$N$Nuse this if you were uploading a huge number of files and had to restart for some reason, and want to continue the upload ASAP$N$Nthis replaces the hash-check with a simple <em>"does this have the same filesize on the server?"</em> so if the file contents are different it will NOT be uploaded$N$Nyou should turn this off when the upload is done, and then "upload" the same files again to let the client verify them">turbo</a>\n' +
|
||||
' <a id="u2tdate" class="tgl btn ttb" href="#" tt="has no effect unless the turbo button is enabled$N$Nreduces the yolo factor by a tiny amount; checks whether the file timestamps on the server match yours$N$Nshould <em>theoretically</em> catch most unfinished/corrupted uploads, but is not a substitute for doing a verification pass with turbo disabled afterwards">date-chk</a>\n' +
|
||||
' </div>\n' +
|
||||
'</div>\n' +
|
||||
'<div><h3>key notation</h3><div id="key_notation"></div></div>\n' +
|
||||
'<div class="fill"><h3>hidden columns</h3><div id="hcols"></div></div>'
|
||||
);
|
||||
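
The turbo / date-chk tooltips above describe a shortcut where the client skips hashing and only asks the server whether a file of the same size (and, optionally, a close-enough timestamp) already exists. A minimal sketch of that check, using fetch() for brevity (an assumption; the actual client does this with XMLHttpRequest in exec_head() further down):

```js
// illustrative sketch only; in the real client the url would be t.purl + t.name
function probably_uploaded(file, url, datechk) {
    return fetch(url, { method: 'HEAD' }).then(function (res) {
        if (res.status !== 200)
            return false;                 // nothing on the server yet

        if (+res.headers.get('Content-Length') !== file.size)
            return false;                 // size differs -> must upload

        if (!datechk)
            return true;                  // plain turbo: size match is enough

        // date-chk: tolerate ~2 seconds of clock / filesystem skew
        var srv_ts = new Date(res.headers.get('Last-Modified')) / 1000;
        return Math.abs(srv_ts - file.lastModified / 1000) < 2;
    });
}
```
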
@@ -222,10 +229,14 @@ var have_webp = null;
|
||||
|
||||
|
||||
var mpl = (function () {
|
||||
var have_mctl = 'mediaSession' in navigator && window.MediaMetadata;
|
||||
|
||||
ebi('op_player').innerHTML = (
|
||||
'<div><h3>switches</h3><div>' +
|
||||
'<a href="#" class="tgl btn" id="au_preload" tt="start loading the next song near the end for gapless playback">preload</a>' +
|
||||
'<a href="#" class="tgl btn" id="au_npclip" tt="show buttons for clipboarding the currently playing song">/np clip</a>' +
|
||||
'<a href="#" class="tgl btn" id="au_os_ctl" tt="os integration (media hotkeys / osd)">os-ctl</a>' +
|
||||
'<a href="#" class="tgl btn" id="au_osd_cv" tt="show album cover in osd">osd-cv</a>' +
|
||||
'</div></div>' +
|
||||
|
||||
'<div><h3>playback mode</h3><div id="pb_mode">' +
|
||||
@@ -233,12 +244,18 @@ var mpl = (function () {
|
||||
'<a href="#" class="tgl btn" tt="load the next folder and continue">📂 next-folder</a>' +
|
||||
'</div></div>' +
|
||||
|
||||
'<div><h3>tint</h3><div>' +
|
||||
'<input type="text" id="pb_tint" size="3" value="0" tt="background level (0-100) on the seekbar$Nto make buffering less distracting" />' +
|
||||
'</div></div>' +
|
||||
|
||||
'<div><h3>audio equalizer</h3><div id="audio_eq"></div></div>');
|
||||
|
||||
var r = {
|
||||
"pb_mode": sread('pb_mode') || 'loop-folder',
|
||||
"preload": bcfg_get('au_preload', true),
|
||||
"clip": bcfg_get('au_npclip', false)
|
||||
"clip": bcfg_get('au_npclip', false),
|
||||
"os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
|
||||
"osd_cv": bcfg_get('au_osd_cv', true),
|
||||
};
|
||||
|
||||
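
The settings object above seeds each switch from bcfg_get(), so the toggles survive page reloads. A rough sketch of the idea behind those helpers, assuming a plain localStorage flag (the real bcfg helpers presumably also keep the toggle button's appearance in sync, which is omitted here):

```js
// illustrative sketch only -- not the actual bcfg_get/bcfg_set implementation
function bool_get(name, fallback) {
    var v = localStorage.getItem(name);
    return v === null ? fallback : v == '1';
}

function bool_set(name, val) {
    localStorage.setItem(name, val ? '1' : '0');
}

var osd_cv = bool_get('au_osd_cv', true);   // "osd-cv" defaults to on, like the diff
bool_set('au_osd_cv', !osd_cv);             // flipping it persists across reloads
```
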
ebi('au_preload').onclick = function (e) {
|
||||
@@ -254,6 +271,20 @@ var mpl = (function () {
|
||||
clmod(ebi('wtoggle'), 'np', r.clip && mp.au);
|
||||
};
|
||||
|
||||
ebi('au_os_ctl').onclick = function (e) {
|
||||
ev(e);
|
||||
r.os_ctl = !r.os_ctl && have_mctl;
|
||||
bcfg_set('au_os_ctl', r.os_ctl);
|
||||
if (!have_mctl)
|
||||
alert('need firefox 82+ or chrome 73+');
|
||||
};
|
||||
|
||||
ebi('au_osd_cv').onclick = function (e) {
|
||||
ev(e);
|
||||
r.osd_cv = !r.osd_cv;
|
||||
bcfg_set('au_osd_cv', r.osd_cv);
|
||||
};
|
||||
|
||||
function draw_pb_mode() {
|
||||
var btns = QSA('#pb_mode>a');
|
||||
for (var a = 0, aa = btns.length; a < aa; a++) {
|
||||
@@ -270,20 +301,98 @@ var mpl = (function () {
|
||||
draw_pb_mode();
|
||||
}
|
||||
|
||||
function set_tint() {
|
||||
var tint = icfg_get('pb_tint', 0);
|
||||
if (!tint)
|
||||
ebi('barbuf').style.removeProperty('background');
|
||||
else
|
||||
ebi('barbuf').style.background = 'rgba(126,163,75,' + (tint / 100.0) + ')';
|
||||
}
|
||||
ebi('pb_tint').oninput = function (e) {
|
||||
swrite('pb_tint', this.value);
|
||||
set_tint();
|
||||
};
|
||||
set_tint();
|
||||
|
||||
r.pp = function () {
|
||||
if (!r.os_ctl)
|
||||
return;
|
||||
|
||||
navigator.mediaSession.playbackState = mp.au && !mp.au.paused ? "playing" : "paused";
|
||||
};
|
||||
|
||||
r.announce = function () {
|
||||
if (!r.os_ctl)
|
||||
return;
|
||||
|
||||
var np = get_np()[0],
|
||||
fns = np.file.split(' - '),
|
||||
artist = (np.circle ? np.circle + ' // ' : '') + (np.artist || (fns.length > 1 ? fns[0] : '')),
|
||||
tags = {
|
||||
title: np.title || fns.slice(-1)[0]
|
||||
};
|
||||
|
||||
if (artist)
|
||||
tags.artist = artist;
|
||||
|
||||
if (np.album)
|
||||
tags.album = np.album;
|
||||
|
||||
if (r.osd_cv) {
|
||||
var files = QSA("#files tr>td:nth-child(2)>a[id]"),
|
||||
cover = null;
|
||||
|
||||
for (var a = 0, aa = files.length; a < aa; a++) {
|
||||
if (/^(cover|folder)\.(jpe?g|png|gif)$/.test(files[a].textContent)) {
|
||||
cover = files[a].getAttribute('href');
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (cover) {
|
||||
cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j';
|
||||
|
||||
var pwd = get_pwd();
|
||||
if (pwd)
|
||||
cover += '&pw=' + uricom_enc(pwd);
|
||||
|
||||
tags.artwork = [{ "src": cover, type: "image/jpeg" }];
|
||||
}
|
||||
}
|
||||
|
||||
navigator.mediaSession.metadata = new MediaMetadata(tags);
|
||||
navigator.mediaSession.setActionHandler('play', playpause);
|
||||
navigator.mediaSession.setActionHandler('pause', playpause);
|
||||
navigator.mediaSession.setActionHandler('seekbackward', function () { seek_au_rel(-10); });
|
||||
navigator.mediaSession.setActionHandler('seekforward', function () { seek_au_rel(10); });
|
||||
navigator.mediaSession.setActionHandler('previoustrack', prev_song);
|
||||
navigator.mediaSession.setActionHandler('nexttrack', next_song);
|
||||
r.pp();
|
||||
};
|
||||
|
||||
r.stop = function () {
|
||||
if (!r.os_ctl || !navigator.mediaSession.metadata)
|
||||
return;
|
||||
|
||||
navigator.mediaSession.metadata = null;
|
||||
navigator.mediaSession.playbackState = "paused";
|
||||
};
|
||||
|
||||
return r;
|
||||
})();
|
||||
|
||||
|
||||
// extract songs + add play column
|
||||
function MPlayer() {
|
||||
this.id = Date.now();
|
||||
this.au = null;
|
||||
this.au_native = null;
|
||||
this.au_native2 = null;
|
||||
this.au_ogvjs = null;
|
||||
this.au_ogvjs2 = null;
|
||||
this.tracks = {};
|
||||
this.order = [];
|
||||
var r = this;
|
||||
r.id = Date.now();
|
||||
r.au = null;
|
||||
r.au_native = null;
|
||||
r.au_native2 = null;
|
||||
r.au_ogvjs = null;
|
||||
r.au_ogvjs2 = null;
|
||||
r.tracks = {};
|
||||
r.order = [];
|
||||
|
||||
var re_audio = /\.(opus|ogg|m4a|aac|mp3|wav|flac)$/i,
|
||||
trs = QSA('#files tbody tr');
|
||||
@@ -298,32 +407,33 @@ function MPlayer() {
|
||||
|
||||
if (m) {
|
||||
var tid = link.getAttribute('id');
|
||||
this.order.push(tid);
|
||||
this.tracks[tid] = url;
|
||||
r.order.push(tid);
|
||||
r.tracks[tid] = url;
|
||||
tds[0].innerHTML = '<a id="a' + tid + '" href="#a' + tid + '" class="play">play</a></td>';
|
||||
ebi('a' + tid).onclick = ev_play;
|
||||
}
|
||||
}
|
||||
|
||||
this.vol = sread('vol');
|
||||
if (this.vol !== null)
|
||||
this.vol = parseFloat(this.vol);
|
||||
r.vol = sread('vol');
|
||||
if (r.vol !== null)
|
||||
r.vol = parseFloat(r.vol);
|
||||
else
|
||||
this.vol = 0.5;
|
||||
r.vol = 0.5;
|
||||
|
||||
this.expvol = function () {
|
||||
return 0.5 * this.vol + 0.5 * this.vol * this.vol;
|
||||
r.expvol = function (v) {
|
||||
return 0.5 * v + 0.5 * v * v;
|
||||
};
|
||||
|
||||
this.setvol = function (vol) {
|
||||
this.vol = Math.max(Math.min(vol, 1), 0);
|
||||
r.setvol = function (vol) {
|
||||
r.vol = Math.max(Math.min(vol, 1), 0);
|
||||
swrite('vol', vol);
|
||||
r.stopfade(true);
|
||||
|
||||
if (this.au)
|
||||
this.au.volume = this.expvol();
|
||||
if (r.au)
|
||||
r.au.volume = r.expvol(r.vol);
|
||||
};
|
||||
|
||||
this.read_order = function () {
|
||||
r.read_order = function () {
|
||||
var order = [],
|
||||
links = QSA('#files>tbody>tr>td:nth-child(1)>a');
|
||||
|
||||
@@ -334,24 +444,71 @@ function MPlayer() {
|
||||
|
||||
order.push(tid.slice(1));
|
||||
}
|
||||
this.order = order;
|
||||
r.order = order;
|
||||
};
|
||||
|
||||
this.preload = function (url) {
|
||||
r.fdir = 0;
|
||||
r.fvol = -1;
|
||||
r.ftid = -1;
|
||||
r.ftimer = null;
|
||||
r.fade_in = function () {
|
||||
r.fvol = 0;
|
||||
r.fdir = 0.025;
|
||||
if (r.au) {
|
||||
r.ftid = r.au.tid;
|
||||
r.au.play();
|
||||
mpl.pp();
|
||||
fader();
|
||||
}
|
||||
};
|
||||
r.fade_out = function () {
|
||||
r.fvol = r.vol;
|
||||
r.fdir = -0.05;
|
||||
r.ftid = r.au.tid;
|
||||
fader();
|
||||
};
|
||||
r.stopfade = function (hard) {
|
||||
clearTimeout(r.ftimer);
|
||||
if (hard)
|
||||
r.ftid = -1;
|
||||
}
|
||||
function fader() {
|
||||
r.stopfade();
|
||||
if (!r.au || r.au.tid !== r.ftid)
|
||||
return;
|
||||
|
||||
var done = true;
|
||||
r.fvol += r.fdir;
|
||||
if (r.fvol < 0) {
|
||||
r.fvol = 0;
|
||||
r.au.pause();
|
||||
mpl.pp();
|
||||
}
|
||||
else if (r.fvol > r.vol)
|
||||
r.fvol = r.vol;
|
||||
else
|
||||
done = false;
|
||||
|
||||
r.au.volume = r.expvol(r.fvol);
|
||||
if (!done)
|
||||
setTimeout(fader, 10);
|
||||
}
|
||||
|
||||
r.preload = function (url) {
|
||||
var au = null;
|
||||
if (need_ogv_for(url)) {
|
||||
au = mp.au_ogvjs2;
|
||||
if (!au && window['OGVPlayer']) {
|
||||
au = new OGVPlayer();
|
||||
au.preload = "auto";
|
||||
this.au_ogvjs2 = au;
|
||||
r.au_ogvjs2 = au;
|
||||
}
|
||||
} else {
|
||||
au = mp.au_native2;
|
||||
if (!au) {
|
||||
au = new Audio();
|
||||
au.preload = "auto";
|
||||
this.au_native2 = au;
|
||||
r.au_native2 = au;
|
||||
}
|
||||
}
|
||||
if (au) {
|
||||
@@ -365,39 +522,62 @@ var mp = new MPlayer();
|
||||
makeSortable(ebi('files'), mp.read_order.bind(mp));
|
||||
|
||||
|
||||
function get_np() {
|
||||
var th = ebi('files').tHead.rows[0].cells,
|
||||
tr = QS('#files tr.play').cells,
|
||||
rv = [],
|
||||
ra = [],
|
||||
rt = {};
|
||||
|
||||
for (var a = 1, aa = th.length; a < aa; a++) {
|
||||
var tv = tr[a].textContent,
|
||||
tk = a == 1 ? 'file' : th[a].getAttribute('name').split('/').slice(-1)[0],
|
||||
vis = th[a].className.indexOf('min') === -1;
|
||||
|
||||
if (!tv)
|
||||
continue;
|
||||
|
||||
(vis ? rv : ra).push(tk);
|
||||
rt[tk] = tv;
|
||||
}
|
||||
return [rt, rv, ra];
|
||||
};
|
||||
|
||||
|
||||
// toggle player widget
|
||||
var widget = (function () {
|
||||
var ret = {},
|
||||
var r = {},
|
||||
widget = ebi('widget'),
|
||||
wtico = ebi('wtico'),
|
||||
nptxt = ebi('nptxt'),
|
||||
npirc = ebi('npirc'),
|
||||
touchmode = false,
|
||||
side_open = false,
|
||||
was_paused = true;
|
||||
|
||||
ret.open = function () {
|
||||
if (side_open)
|
||||
r.is_open = false;
|
||||
|
||||
r.open = function () {
|
||||
if (r.is_open)
|
||||
return false;
|
||||
|
||||
widget.className = 'open';
|
||||
side_open = true;
|
||||
r.is_open = true;
|
||||
return true;
|
||||
};
|
||||
ret.close = function () {
|
||||
if (!side_open)
|
||||
r.close = function () {
|
||||
if (!r.is_open)
|
||||
return false;
|
||||
|
||||
widget.className = '';
|
||||
side_open = false;
|
||||
r.is_open = false;
|
||||
return true;
|
||||
};
|
||||
ret.toggle = function (e) {
|
||||
ret.open() || ret.close();
|
||||
r.toggle = function (e) {
|
||||
r.open() || r.close();
|
||||
ev(e);
|
||||
return false;
|
||||
};
|
||||
ret.paused = function (paused) {
|
||||
r.paused = function (paused) {
|
||||
if (was_paused != paused) {
|
||||
was_paused = paused;
|
||||
ebi('bplay').innerHTML = paused ? '▶' : '⏸';
|
||||
@@ -405,28 +585,22 @@ var widget = (function () {
|
||||
};
|
||||
wtico.onclick = function (e) {
|
||||
if (!touchmode)
|
||||
ret.toggle(e);
|
||||
r.toggle(e);
|
||||
|
||||
return false;
|
||||
};
|
||||
npirc.onclick = nptxt.onclick = function (e) {
|
||||
ev(e);
|
||||
var th = ebi('files').tHead.rows[0].cells,
|
||||
tr = QS('#files tr.play').cells,
|
||||
irc = this.getAttribute('id') == 'npirc',
|
||||
var irc = this.getAttribute('id') == 'npirc',
|
||||
ck = irc ? '06' : '',
|
||||
cv = irc ? '07' : '',
|
||||
m = ck + 'np: ';
|
||||
m = ck + 'np: ',
|
||||
npr = get_np(),
|
||||
npk = npr[1],
|
||||
np = npr[0];
|
||||
|
||||
for (var a = 1, aa = th.length; a < aa; a++) {
|
||||
if (th[a].className.indexOf('min') !== -1)
|
||||
continue;
|
||||
|
||||
var tv = tr[a].textContent,
|
||||
tk = a == 1 ? '' : th[a].getAttribute('name').split('/').slice(-1)[0];
|
||||
|
||||
m += tk + '(' + cv + tv + ck + ') // ';
|
||||
}
|
||||
for (var a = 0; a < npk.length; a++)
|
||||
m += (npk[a] == 'file' ? '' : npk[a]) + '(' + cv + np[npk[a]] + ck + ') // ';
|
||||
|
||||
m += '[' + cv + s2ms(mp.au.currentTime) + ck + '/' + cv + s2ms(mp.au.duration) + ck + ']';
|
||||
|
||||
@@ -442,7 +616,7 @@ var widget = (function () {
|
||||
document.body.removeChild(o);
|
||||
}, 500);
|
||||
};
|
||||
return ret;
|
||||
return r;
|
||||
})();
|
||||
|
||||
|
||||
@@ -488,12 +662,15 @@ var pbar = (function () {
|
||||
}
|
||||
|
||||
r.drawbuf = function () {
|
||||
var bc = r.buf,
|
||||
bctx = bc.ctx;
|
||||
|
||||
bctx.clearRect(0, 0, bc.w, bc.h);
|
||||
|
||||
if (!mp.au)
|
||||
return;
|
||||
|
||||
var bc = r.buf,
|
||||
bctx = bc.ctx,
|
||||
sm = bc.w * 1.0 / mp.au.duration,
|
||||
var sm = bc.w * 1.0 / mp.au.duration,
|
||||
gk = bc.h + '' + light;
|
||||
|
||||
if (gradh != gk) {
|
||||
@@ -501,7 +678,6 @@ var pbar = (function () {
|
||||
grad = glossy_grad(bc, 85, [35, 40, 37, 35], light ? [45, 56, 50, 45] : [42, 51, 47, 42]);
|
||||
}
|
||||
bctx.fillStyle = grad;
|
||||
bctx.clearRect(0, 0, bc.w, bc.h);
|
||||
for (var a = 0; a < mp.au.buffered.length; a++) {
|
||||
var x1 = sm * mp.au.buffered.start(a),
|
||||
x2 = sm * mp.au.buffered.end(a);
|
||||
@@ -511,15 +687,17 @@ var pbar = (function () {
|
||||
};
|
||||
|
||||
r.drawpos = function () {
|
||||
var bc = r.buf,
|
||||
pc = r.pos,
|
||||
pctx = pc.ctx;
|
||||
|
||||
pctx.clearRect(0, 0, pc.w, pc.h);
|
||||
|
||||
if (!mp.au || isNaN(mp.au.duration) || isNaN(mp.au.currentTime))
|
||||
return; // not-init || unsupp-codec
|
||||
|
||||
var bc = r.buf,
|
||||
pc = r.pos,
|
||||
pctx = pc.ctx,
|
||||
sm = bc.w * 1.0 / mp.au.duration;
|
||||
var sm = bc.w * 1.0 / mp.au.duration;
|
||||
|
||||
pctx.clearRect(0, 0, pc.w, pc.h);
|
||||
pctx.fillStyle = light ? 'rgba(0,64,0,0.15)' : 'rgba(204,255,128,0.15)';
|
||||
for (var p = 1, mins = mp.au.duration / 10; p <= mins; p++)
|
||||
pctx.fillRect(Math.floor(sm * p * 10), 0, 2, pc.h);
|
||||
@@ -635,6 +813,11 @@ function seek_au_mul(mul) {
|
||||
seek_au_sec(mp.au.duration * mul);
|
||||
}
|
||||
|
||||
function seek_au_rel(sec) {
|
||||
if (mp.au)
|
||||
seek_au_sec(mp.au.currentTime + sec);
|
||||
}
|
||||
|
||||
function seek_au_sec(seek) {
|
||||
if (!mp.au)
|
||||
return;
|
||||
@@ -645,9 +828,8 @@ function seek_au_sec(seek) {
|
||||
|
||||
mp.au.currentTime = seek;
|
||||
|
||||
// ogv.js breaks on .play() during playback
|
||||
if (mp.au === mp.au_native)
|
||||
mp.au.play();
|
||||
if (mp.au.paused)
|
||||
mp.fade_in();
|
||||
|
||||
mpui.progress_updater();
|
||||
}
|
||||
@@ -669,22 +851,29 @@ function next_song(e) {
|
||||
}
|
||||
function prev_song(e) {
|
||||
ev(e);
|
||||
|
||||
if (mp.au && !mp.au.paused && mp.au.currentTime > 3)
|
||||
return seek_au_sec(0);
|
||||
|
||||
return song_skip(-1);
|
||||
}
|
||||
|
||||
|
||||
function playpause(e) {
|
||||
// must be event-chain
|
||||
ev(e);
|
||||
if (mp.au) {
|
||||
if (mp.au.paused)
|
||||
mp.au.play();
|
||||
mp.fade_in();
|
||||
else
|
||||
mp.au.pause();
|
||||
mp.fade_out();
|
||||
|
||||
mpui.progress_updater();
|
||||
}
|
||||
else
|
||||
play(0);
|
||||
play(0, true);
|
||||
|
||||
mpl.pp();
|
||||
};
|
||||
|
||||
|
||||
@@ -693,9 +882,13 @@ function playpause(e) {
|
||||
ebi('bplay').onclick = playpause;
|
||||
ebi('bprev').onclick = prev_song;
|
||||
ebi('bnext').onclick = next_song;
|
||||
ebi('barpos').onclick = function (e) {
|
||||
|
||||
var bar = ebi('barpos');
|
||||
|
||||
bar.onclick = function (e) {
|
||||
if (!mp.au) {
|
||||
return play(0);
|
||||
play(0, true);
|
||||
return mp.fade_in();
|
||||
}
|
||||
|
||||
var rect = pbar.buf.can.getBoundingClientRect(),
|
||||
@@ -703,6 +896,19 @@ function playpause(e) {
|
||||
|
||||
seek_au_mul(x * 1.0 / rect.width);
|
||||
};
|
||||
|
||||
if (!is_touch)
|
||||
bar.onwheel = function (e) {
|
||||
var dist = Math.sign(e.deltaY) * 10;
|
||||
if (Math.abs(e.deltaY) < 30 && !e.deltaMode)
|
||||
dist = e.deltaY;
|
||||
|
||||
if (!dist || !mp.au)
|
||||
return true;
|
||||
|
||||
seek_au_rel(dist);
|
||||
ev(e);
|
||||
};
|
||||
})();
|
||||
|
||||
|
||||
@@ -770,7 +976,12 @@ var mpui = (function () {
|
||||
// event from play button next to a file in the list
|
||||
function ev_play(e) {
|
||||
ev(e);
|
||||
play(this.getAttribute('id').slice(1));
|
||||
|
||||
var fade = !mp.au || mp.au.paused;
|
||||
play(this.getAttribute('id').slice(1), true);
|
||||
if (fade)
|
||||
mp.fade_in();
|
||||
|
||||
return false;
|
||||
}
|
||||
|
||||
@@ -1009,13 +1220,18 @@ var audio_eq = (function () {
|
||||
|
||||
|
||||
// plays the tid'th audio file on the page
|
||||
function play(tid, seek, call_depth) {
|
||||
function play(tid, is_ev, seek, call_depth) {
|
||||
if (mp.order.length == 0)
|
||||
return console.log('no audio found wait what');
|
||||
|
||||
mp.stopfade(true);
|
||||
|
||||
var tn = tid;
|
||||
if ((tn + '').indexOf('f-') === 0)
|
||||
if ((tn + '').indexOf('f-') === 0) {
|
||||
tn = mp.order.indexOf(tn);
|
||||
if (tn < 0)
|
||||
return;
|
||||
}
|
||||
|
||||
if (tn >= mp.order.length) {
|
||||
if (mpl.pb_mode == 'loop-folder') {
|
||||
@@ -1054,7 +1270,7 @@ function play(tid, seek, call_depth) {
|
||||
}
|
||||
else if (window['OGVPlayer']) {
|
||||
mp.au = mp.au_ogvjs = new OGVPlayer();
|
||||
attempt_play = false;
|
||||
attempt_play = is_ev;
|
||||
mp.au.addEventListener('error', evau_error, true);
|
||||
mp.au.addEventListener('progress', pbar.drawpos);
|
||||
mp.au.addEventListener('ended', next_song);
|
||||
@@ -1067,7 +1283,7 @@ function play(tid, seek, call_depth) {
|
||||
show_modal('<h1>loading ogv.js</h1><h2>thanks apple</h2>');
|
||||
|
||||
import_js('/.cpr/deps/ogv.js', function () {
|
||||
play(tid, seek, 1);
|
||||
play(tid, false, seek, 1);
|
||||
});
|
||||
|
||||
return;
|
||||
@@ -1088,7 +1304,7 @@ function play(tid, seek, call_depth) {
|
||||
|
||||
mp.au.tid = tid;
|
||||
mp.au.src = url + (url.indexOf('?') < 0 ? '?cache' : '&cache');
|
||||
mp.au.volume = mp.expvol();
|
||||
mp.au.volume = mp.expvol(mp.vol);
|
||||
var oid = 'a' + tid;
|
||||
setclass(oid, 'play act');
|
||||
var trs = ebi('files').getElementsByTagName('tbody')[0].getElementsByTagName('tr');
|
||||
@@ -1124,6 +1340,7 @@ function play(tid, seek, call_depth) {
|
||||
|
||||
mpui.progress_updater();
|
||||
pbar.drawbuf();
|
||||
mpl.announce();
|
||||
return true;
|
||||
}
|
||||
catch (ex) {
|
||||
@@ -1178,7 +1395,8 @@ function show_modal(html) {
|
||||
|
||||
|
||||
// hide fullscreen message
|
||||
function unblocked() {
|
||||
function unblocked(e) {
|
||||
ev(e);
|
||||
var dom = ebi('blocked');
|
||||
if (dom)
|
||||
dom.parentNode.removeChild(dom);
|
||||
@@ -1193,26 +1411,25 @@ function autoplay_blocked(seek) {
|
||||
|
||||
var go = ebi('blk_go'),
|
||||
na = ebi('blk_na'),
|
||||
fn = mp.tracks[mp.au.tid].split(/\//).pop();
|
||||
tid = mp.au.tid,
|
||||
fn = mp.tracks[tid].split(/\//).pop();
|
||||
|
||||
fn = uricom_dec(fn.replace(/\+/g, ' '))[0];
|
||||
|
||||
go.textContent = 'Play "' + fn + '"';
|
||||
go.onclick = function (e) {
|
||||
if (e) e.preventDefault();
|
||||
unblocked();
|
||||
mp.au.play();
|
||||
if (seek)
|
||||
seek_au_sec(seek);
|
||||
else
|
||||
mpui.progress_updater();
|
||||
unblocked(e);
|
||||
// chrome 91 may permanently taint on a failed play()
|
||||
// depending on win10 settings or something? idk
|
||||
mp.au_native = mp.au_ogvjs = null;
|
||||
play(tid, true, seek);
|
||||
mp.fade_in();
|
||||
};
|
||||
na.onclick = unblocked;
|
||||
}
|
||||
|
||||
|
||||
// autoplay linked track
|
||||
(function () {
|
||||
function play_linked() {
|
||||
var v = location.hash;
|
||||
if (v && v.indexOf('#af-') === 0) {
|
||||
var id = v.slice(2).split('&');
|
||||
@@ -1226,9 +1443,9 @@ function autoplay_blocked(seek) {
|
||||
if (!m)
|
||||
return play(id[0]);
|
||||
|
||||
return play(id[0], parseInt(m[1] || 0) * 60 + parseInt(m[2] || 0));
|
||||
return play(id[0], false, parseInt(m[1] || 0) * 60 + parseInt(m[2] || 0));
|
||||
}
|
||||
})();
|
||||
};
|
||||
|
||||
|
||||
var thegrid = (function () {
|
||||
@@ -1334,35 +1551,57 @@ var thegrid = (function () {
|
||||
}
|
||||
setsz();
|
||||
|
||||
function seltgl(e) {
|
||||
if (e && e.ctrlKey)
|
||||
function gclick(e) {
|
||||
if (e && (e.ctrlKey || e.metaKey))
|
||||
return true;
|
||||
|
||||
ev(e);
|
||||
var oth = ebi(this.getAttribute('ref')),
|
||||
td = oth.parentNode.nextSibling,
|
||||
href = this.getAttribute('href'),
|
||||
aplay = ebi('a' + oth.getAttribute('id')),
|
||||
is_img = /\.(gif|jpe?g|png|webp)(\?|$)/i.test(href),
|
||||
in_tree = null,
|
||||
have_sel = QS('#files tr.sel'),
|
||||
td = oth.closest('td').nextSibling,
|
||||
tr = td.parentNode;
|
||||
|
||||
td.click();
|
||||
this.setAttribute('class', tr.getAttribute('class'));
|
||||
}
|
||||
if (/\/(\?|$)/.test(href)) {
|
||||
var ta = QSA('#treeul a.hl+ul>li>a+a'),
|
||||
txt = oth.textContent.slice(0, -1);
|
||||
|
||||
function bgopen(e) {
|
||||
for (var a = 0, aa = ta.length; a < aa; a++) {
|
||||
if (ta[a].textContent == txt) {
|
||||
in_tree = ta[a];
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (r.sel) {
|
||||
td.click();
|
||||
this.setAttribute('class', tr.getAttribute('class'));
|
||||
}
|
||||
else if (widget.is_open && aplay)
|
||||
aplay.click();
|
||||
|
||||
else if (in_tree && !have_sel)
|
||||
in_tree.click();
|
||||
|
||||
else if (!is_img && have_sel)
|
||||
window.open(href, '_blank');
|
||||
|
||||
else return true;
|
||||
ev(e);
|
||||
var url = this.getAttribute('href');
|
||||
window.open(url, '_blank');
|
||||
}
|
||||
|
||||
r.loadsel = function () {
|
||||
if (r.dirty)
|
||||
return;
|
||||
|
||||
var ths = QSA('#ggrid>a'),
|
||||
have_sel = !!QS('#files tr.sel');
|
||||
var ths = QSA('#ggrid>a');
|
||||
|
||||
for (var a = 0, aa = ths.length; a < aa; a++) {
|
||||
ths[a].onclick = r.sel ? seltgl : have_sel ? bgopen : null;
|
||||
ths[a].setAttribute('class', ebi(ths[a].getAttribute('ref')).parentNode.parentNode.getAttribute('class'));
|
||||
var tr = ebi(ths[a].getAttribute('ref')).closest('tr');
|
||||
ths[a].setAttribute('class', tr.getAttribute('class'));
|
||||
}
|
||||
var uns = QS('#ggrid a[ref="unsearch"]');
|
||||
if (uns)
|
||||
@@ -1393,6 +1632,8 @@ var thegrid = (function () {
|
||||
|
||||
if (r.thumbs) {
|
||||
ihref += (ihref.indexOf('?') === -1 ? '?' : '&') + 'th=' + (have_webp ? 'w' : 'j');
|
||||
if (href == "#")
|
||||
ihref = '/.cpr/ico/⏏️';
|
||||
}
|
||||
else if (isdir) {
|
||||
ihref = '/.cpr/ico/folder';
|
||||
@@ -1420,6 +1661,11 @@ var thegrid = (function () {
|
||||
ihref + '" /><span' + ac + '>' + ao.innerHTML + '</span></a>');
|
||||
}
|
||||
ebi('ggrid').innerHTML = html.join('\n');
|
||||
|
||||
var ths = QSA('#ggrid>a');
|
||||
for (var a = 0, aa = ths.length; a < aa; a++)
|
||||
ths[a].onclick = gclick;
|
||||
|
||||
r.dirty = false;
|
||||
r.bagit();
|
||||
r.loadsel();
|
||||
@@ -1507,26 +1753,32 @@ document.onkeydown = function (e) {
|
||||
if (!document.activeElement || document.activeElement != document.body && document.activeElement.nodeName.toLowerCase() != 'a')
|
||||
return;
|
||||
|
||||
if (e.ctrlKey || e.altKey || e.shiftKey || e.metaKey || e.isComposing)
|
||||
if (e.ctrlKey || e.altKey || e.metaKey || e.isComposing)
|
||||
return;
|
||||
|
||||
var k = e.code + '', pos = -1, n;
|
||||
|
||||
if (e.shiftKey && k != 'KeyA' && k != 'KeyD')
|
||||
return;
|
||||
|
||||
var k = (e.code + ''), pos = -1;
|
||||
if (k.indexOf('Digit') === 0)
|
||||
pos = parseInt(k.slice(-1)) * 0.1;
|
||||
|
||||
if (pos !== -1)
|
||||
return seek_au_mul(pos);
|
||||
return seek_au_mul(pos) || true;
|
||||
|
||||
var n = k == 'KeyJ' ? -1 : k == 'KeyL' ? 1 : 0;
|
||||
if (n !== 0)
|
||||
return song_skip(n);
|
||||
if (k == 'KeyJ')
|
||||
return prev_song() || true;
|
||||
|
||||
if (k == 'KeyL')
|
||||
return next_song() || true;
|
||||
|
||||
if (k == 'KeyP')
|
||||
return playpause();
|
||||
return playpause() || true;
|
||||
|
||||
n = k == 'KeyU' ? -10 : k == 'KeyO' ? 10 : 0;
|
||||
if (n !== 0)
|
||||
return mp.au ? seek_au_sec(mp.au.currentTime + n) : true;
|
||||
return seek_au_rel(n) || true;
|
||||
|
||||
n = k == 'KeyI' ? -1 : k == 'KeyK' ? 1 : 0;
|
||||
if (n !== 0)
|
||||
@@ -1536,7 +1788,6 @@ document.onkeydown = function (e) {
|
||||
return tree_up();
|
||||
|
||||
if (k == 'KeyB')
|
||||
//return treectl.hidden ? treectl.show() : treectl.hide();
|
||||
return treectl.hidden ? treectl.entree() : treectl.detree();
|
||||
|
||||
if (k == 'KeyG')
|
||||
@@ -1545,6 +1796,14 @@ document.onkeydown = function (e) {
|
||||
if (k == 'KeyT')
|
||||
return ebi('thumbs').click();
|
||||
|
||||
if (!treectl.hidden && (!e.shiftKey || !thegrid.en)) {
|
||||
if (k == 'KeyA')
|
||||
return QS('#twig').click();
|
||||
|
||||
if (k == 'KeyD')
|
||||
return QS('#twobytwo').click();
|
||||
}
|
||||
|
||||
if (thegrid.en) {
|
||||
if (k == 'KeyS')
|
||||
return ebi('gridsel').click();
|
||||
@@ -1627,6 +1886,7 @@ document.onkeydown = function (e) {
|
||||
}
|
||||
|
||||
var search_timeout,
|
||||
defer_timeout,
|
||||
search_in_progress = 0;
|
||||
|
||||
function ev_search_input() {
|
||||
@@ -1641,9 +1901,29 @@ document.onkeydown = function (e) {
|
||||
if (id != "q_raw")
|
||||
encode_query();
|
||||
|
||||
clearTimeout(search_timeout);
|
||||
if (Date.now() - search_in_progress > 30 * 1000)
|
||||
set_vq();
|
||||
|
||||
clearTimeout(defer_timeout);
|
||||
defer_timeout = setTimeout(try_search, 2000);
|
||||
try_search();
|
||||
}
|
||||
|
||||
function try_search() {
|
||||
if (Date.now() - search_in_progress > 30 * 1000) {
|
||||
clearTimeout(defer_timeout);
|
||||
clearTimeout(search_timeout);
|
||||
search_timeout = setTimeout(do_search, 200);
|
||||
}
|
||||
}
|
||||
|
||||
function set_vq() {
|
||||
if (search_in_progress)
|
||||
return;
|
||||
|
||||
var q = ebi('q_raw').value,
|
||||
vq = ebi('files').getAttribute('q_raw');
|
||||
|
||||
srch_msg(false, (q == vq) ? '' : 'search results below are from a previous query:\n ' + (vq ? vq : '(*)'));
|
||||
}
|
||||
|
||||
function encode_query() {
|
||||
@@ -1713,7 +1993,8 @@ document.onkeydown = function (e) {
|
||||
xhr.setRequestHeader('Content-Type', 'text/plain');
|
||||
xhr.onreadystatechange = xhr_search_results;
|
||||
xhr.ts = Date.now();
|
||||
xhr.send(JSON.stringify({ "q": ebi('q_raw').value }));
|
||||
xhr.q_raw = ebi('q_raw').value;
|
||||
xhr.send(JSON.stringify({ "q": xhr.q_raw }));
|
||||
}
|
||||
|
||||
function xhr_search_results() {
|
||||
@@ -1784,6 +2065,8 @@ document.onkeydown = function (e) {
|
||||
|
||||
ofiles.innerHTML = html.join('\n');
|
||||
ofiles.setAttribute("ts", this.ts);
|
||||
ofiles.setAttribute("q_raw", this.q_raw);
|
||||
set_vq();
|
||||
mukey.render();
|
||||
reload_browser();
|
||||
filecols.set_style(['File Name']);
|
||||
@@ -1795,6 +2078,7 @@ document.onkeydown = function (e) {
|
||||
ev(e);
|
||||
treectl.show();
|
||||
ebi('files').innerHTML = orig_html;
|
||||
ebi('files').removeAttribute('q_raw');
|
||||
orig_html = null;
|
||||
msel.render();
|
||||
reload_browser();
|
||||
@@ -2013,6 +2297,9 @@ var treectl = (function () {
|
||||
}
|
||||
|
||||
function treego(e) {
|
||||
if (e && (e.ctrlKey || e.metaKey))
|
||||
return true;
|
||||
|
||||
ev(e);
|
||||
if (this.getAttribute('class') == 'hl' &&
|
||||
this.previousSibling.textContent == '-') {
|
||||
@@ -2743,8 +3030,10 @@ function reload_mp() {
|
||||
mp.au.pause();
|
||||
mp.au = null;
|
||||
}
|
||||
mpl.stop();
|
||||
widget.close();
|
||||
mp = new MPlayer();
|
||||
setTimeout(pbar.onresize, 1);
|
||||
}
|
||||
|
||||
|
||||
@@ -2788,3 +3077,4 @@ function reload_browser(not_mp) {
|
||||
reload_browser(true);
|
||||
mukey.render();
|
||||
msel.render();
|
||||
play_linked();
|
||||
|
||||
@@ -54,7 +54,7 @@
|
||||
<div>{{ logues[1] }}</div><br />
|
||||
{%- endif %}
|
||||
|
||||
<h2><a href="{{ url_suf }}{{ url_suf and '&' or '?' }}h">control-panel</a></h2>
|
||||
<h2><a href="/{{ url_suf }}{{ url_suf and '&' or '?' }}h">control-panel</a></h2>
|
||||
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,9 +3,9 @@
|
||||
<title>📝🎉 {{ title }}</title> <!-- 📜 -->
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.7">
|
||||
<link href="/.cpr/md.css" rel="stylesheet">
|
||||
<link href="/.cpr/md.css?_={{ ts }}" rel="stylesheet">
|
||||
{%- if edit %}
|
||||
<link href="/.cpr/md2.css" rel="stylesheet">
|
||||
<link href="/.cpr/md2.css?_={{ ts }}" rel="stylesheet">
|
||||
{%- endif %}
|
||||
</head>
|
||||
<body>
|
||||
@@ -146,10 +146,10 @@ var md_opt = {
|
||||
})();
|
||||
|
||||
</script>
|
||||
<script src="/.cpr/util.js"></script>
|
||||
<script src="/.cpr/deps/marked.js"></script>
|
||||
<script src="/.cpr/md.js"></script>
|
||||
<script src="/.cpr/util.js?_={{ ts }}"></script>
|
||||
<script src="/.cpr/deps/marked.js?_={{ ts }}"></script>
|
||||
<script src="/.cpr/md.js?_={{ ts }}"></script>
|
||||
{%- if edit %}
|
||||
<script src="/.cpr/md2.js"></script>
|
||||
<script src="/.cpr/md2.js?_={{ ts }}"></script>
|
||||
{%- endif %}
|
||||
</body></html>
|
||||
|
||||
@@ -3,9 +3,9 @@
|
||||
<title>📝🎉 {{ title }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.7">
|
||||
<link href="/.cpr/mde.css" rel="stylesheet">
|
||||
<link href="/.cpr/deps/mini-fa.css" rel="stylesheet">
|
||||
<link href="/.cpr/deps/easymde.css" rel="stylesheet">
|
||||
<link href="/.cpr/mde.css?_={{ ts }}" rel="stylesheet">
|
||||
<link href="/.cpr/deps/mini-fa.css?_={{ ts }}" rel="stylesheet">
|
||||
<link href="/.cpr/deps/easymde.css?_={{ ts }}" rel="stylesheet">
|
||||
</head>
|
||||
<body>
|
||||
<div id="mw">
|
||||
@@ -43,7 +43,7 @@ var lightswitch = (function () {
|
||||
})();
|
||||
|
||||
</script>
|
||||
<script src="/.cpr/util.js"></script>
|
||||
<script src="/.cpr/deps/easymde.js"></script>
|
||||
<script src="/.cpr/mde.js"></script>
|
||||
<script src="/.cpr/util.js?_={{ ts }}"></script>
|
||||
<script src="/.cpr/deps/easymde.js?_={{ ts }}"></script>
|
||||
<script src="/.cpr/mde.js?_={{ ts }}"></script>
|
||||
</body></html>
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
<title>copyparty</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/msg.css">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/msg.css?_={{ ts }}">
|
||||
</head>
|
||||
|
||||
<body>
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
<title>copyparty</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/splash.css">
|
||||
<link rel="stylesheet" type="text/css" media="screen" href="/.cpr/splash.css?_={{ ts }}">
|
||||
</head>
|
||||
|
||||
<body>
|
||||
@@ -35,7 +35,7 @@
|
||||
</table>
|
||||
</td></tr></table>
|
||||
<div class="btns">
|
||||
<a href="{{ avol[0] }}?stack">dump stack</a>
|
||||
<a href="/?stack">dump stack</a>
|
||||
</div>
|
||||
{%- endif %}
|
||||
|
||||
|
||||
@@ -1,7 +1,5 @@
|
||||
"use strict";
|
||||
|
||||
window.onerror = vis_exh;
|
||||
|
||||
|
||||
function goto_up2k() {
|
||||
if (up2k === false)
|
||||
@@ -16,17 +14,19 @@ function goto_up2k() {
|
||||
|
||||
// chrome requires https to use crypto.subtle,
|
||||
// usually it's undefined but some chromes throw on invoke
|
||||
var up2k = null;
|
||||
var sha_js = window.WebAssembly ? 'hw' : 'ac'; // ff53,c57,sa11
|
||||
var up2k = null,
|
||||
sha_js = window.WebAssembly ? 'hw' : 'ac', // ff53,c57,sa11
|
||||
m = 'will use ' + sha_js + ' instead of native sha512 due to';
|
||||
|
||||
try {
|
||||
var cf = crypto.subtle || crypto.webkitSubtle;
|
||||
cf.digest('SHA-512', new Uint8Array(1)).then(
|
||||
function (x) { console.log('sha-ok'); up2k = up2k_init(cf); },
|
||||
function (x) { console.log('sha-ng:', x); up2k = up2k_init(false); }
|
||||
function (x) { console.log(m, x); up2k = up2k_init(false); }
|
||||
);
|
||||
}
|
||||
catch (ex) {
|
||||
console.log('sha-na:', ex);
|
||||
console.log(m, ex);
|
||||
try {
|
||||
up2k = up2k_init(false);
|
||||
}
|
||||
@@ -142,7 +142,7 @@ function U2pvis(act, btns) {
|
||||
this.tail = -1;
|
||||
this.wsz = 3;
|
||||
|
||||
this.addfile = function (entry, sz) {
|
||||
this.addfile = function (entry, sz, draw) {
|
||||
this.tab.push({
|
||||
"hn": entry[0],
|
||||
"ht": entry[1],
|
||||
@@ -156,6 +156,9 @@ function U2pvis(act, btns) {
|
||||
"bd0": 0 // upload start
|
||||
});
|
||||
this.ctr["q"]++;
|
||||
if (!draw)
|
||||
return;
|
||||
|
||||
this.drawcard("q");
|
||||
if (this.act == "q") {
|
||||
this.addrow(this.tab.length - 1);
|
||||
@@ -222,7 +225,7 @@ function U2pvis(act, btns) {
|
||||
this.hashed = function (fobj) {
|
||||
var fo = this.tab[fobj.n],
|
||||
nb = fo.bt * (++fo.nh / fo.cb.length),
|
||||
p = this.perc(nb, 0, fobj.size, fobj.t1);
|
||||
p = this.perc(nb, 0, fobj.size, fobj.t_hashing);
|
||||
|
||||
fo.hp = '{0}%, {1}, {2} MB/s'.format(
|
||||
p[0].toFixed(2), p[1], p[2].toFixed(2)
|
||||
@@ -245,7 +248,7 @@ function U2pvis(act, btns) {
|
||||
fo.cb[nchunk] = cbd;
|
||||
fo.bd += delta;
|
||||
|
||||
var p = this.perc(fo.bd, fo.bd0, fo.bt, fobj.t3);
|
||||
var p = this.perc(fo.bd, fo.bd0, fo.bt, fobj.t_uploading);
|
||||
fo.hp = '{0}%, {1}, {2} MB/s'.format(
|
||||
p[0].toFixed(2), p[1], p[2].toFixed(2)
|
||||
);
|
||||
@@ -256,6 +259,41 @@ function U2pvis(act, btns) {
|
||||
var obj = ebi('f{0}p'.format(fobj.n)),
|
||||
o1 = p[0] - 2, o2 = p[0] - 0.1, o3 = p[0];
|
||||
|
||||
if (!obj) { //} || true) {
|
||||
var msg = [
|
||||
"act", this.act,
|
||||
"in", fo.in,
|
||||
"is_act", this.is_act(fo.in),
|
||||
"head", this.head,
|
||||
"tail", this.tail,
|
||||
"nfile", fobj.n,
|
||||
"name", fobj.name,
|
||||
"sz", fobj.size,
|
||||
"bytesDelta", delta,
|
||||
"bytesDone", fo.bd,
|
||||
],
|
||||
m2 = '',
|
||||
ds = QSA("#u2tab>tbody>tr>td:first-child>a:last-child");
|
||||
|
||||
for (var a = 0; a < msg.length; a += 2)
|
||||
m2 += msg[a] + '=' + msg[a + 1] + ', ';
|
||||
|
||||
console.log(m2);
|
||||
|
||||
for (var a = 0, aa = ds.length; a < aa; a++) {
|
||||
var id = ds[a].parentNode.getAttribute('id').slice(1, -1);
|
||||
console.log("dom %d/%d = [%s] in(%s) is_act(%s) %s",
|
||||
a, aa, id, this.tab[id].in, this.is_act(fo.in), ds[a].textContent);
|
||||
}
|
||||
|
||||
for (var a = 0, aa = this.tab.length; a < aa; a++)
|
||||
if (this.is_act(this.tab[a].in))
|
||||
console.log("tab %d/%d = sz %s", a, aa, this.tab[a].bt);
|
||||
|
||||
console.log("a");
|
||||
throw 42;
|
||||
}
|
||||
|
||||
obj.innerHTML = fo.hp;
|
||||
obj.style.color = '#fff';
|
||||
obj.style.background = 'linear-gradient(90deg, #050, #270 ' + o1 + '%, #4b0 ' + o2 + '%, #333 ' + o3 + '%, #333 99%, #777)';
|
||||
@@ -270,26 +308,35 @@ function U2pvis(act, btns) {
|
||||
throw 42;
|
||||
}
|
||||
|
||||
//console.log("oldcat %s %d, newcat %s %d, head=%d, tail=%d, file=%d, act.old=%s, act.new=%s, bz_act=%s",
|
||||
// oldcat, this.ctr[oldcat],
|
||||
// newcat, this.ctr[newcat],
|
||||
// this.head, this.tail, nfile,
|
||||
// this.is_act(oldcat), this.is_act(newcat), bz_act);
|
||||
|
||||
fo.in = newcat;
|
||||
this.ctr[oldcat]--;
|
||||
this.ctr[newcat]++;
|
||||
this.drawcard(oldcat);
|
||||
this.drawcard(newcat);
|
||||
if (this.is_act(newcat)) {
|
||||
this.tail++;
|
||||
this.tail = Math.max(this.tail, nfile + 1);
|
||||
if (!ebi('f' + nfile))
|
||||
this.addrow(nfile);
|
||||
}
|
||||
else if (this.is_act(oldcat)) {
|
||||
this.head++;
|
||||
while (this.head < Math.min(this.tab.length, this.tail) && this.precard[this.tab[this.head].in])
|
||||
this.head++;
|
||||
|
||||
if (!bz_act) {
|
||||
var tr = ebi("f" + nfile);
|
||||
tr.parentNode.removeChild(tr);
|
||||
}
|
||||
}
|
||||
if (bz_act) {
|
||||
else return;
|
||||
|
||||
if (bz_act)
|
||||
this.bzw();
|
||||
}
|
||||
};
|
||||
|
||||
this.bzw = function () {
|
||||
@@ -303,7 +350,8 @@ function U2pvis(act, btns) {
|
||||
|
||||
while (this.head - first > this.wsz) {
|
||||
var obj = ebi('f' + (first++));
|
||||
obj.parentNode.removeChild(obj);
|
||||
if (obj)
|
||||
obj.parentNode.removeChild(obj);
|
||||
}
|
||||
while (last - this.tail < this.wsz && last < this.tab.length - 2) {
|
||||
var obj = ebi('f' + (++last));
|
||||
@@ -336,6 +384,8 @@ function U2pvis(act, btns) {
|
||||
|
||||
this.changecard = function (card) {
|
||||
this.act = card;
|
||||
this.precard = has(["ok", "ng", "done"], this.act) ? {} : this.act == "bz" ? { "ok": 1, "ng": 1 } : { "ok": 1, "ng": 1, "bz": 1 };
|
||||
this.postcard = has(["ok", "ng", "done"], this.act) ? { "bz": 1, "q": 1 } : this.act == "bz" ? { "q": 1 } : {};
|
||||
this.head = -1;
|
||||
this.tail = -1;
|
||||
var html = [];
|
||||
@@ -350,9 +400,23 @@ function U2pvis(act, btns) {
|
||||
}
|
||||
}
|
||||
if (this.head == -1) {
|
||||
this.head = this.tab.length;
|
||||
this.tail = this.head - 1;
|
||||
for (var a = 0; a < this.tab.length; a++) {
|
||||
var rt = this.tab[a].in;
|
||||
if (this.precard[rt]) {
|
||||
this.head = a + 1;
|
||||
this.tail = a;
|
||||
}
|
||||
else if (this.postcard[rt]) {
|
||||
this.head = a;
|
||||
this.tail = a - 1;
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (this.head < 0)
|
||||
this.head = 0;
|
||||
|
||||
if (card == "bz") {
|
||||
for (var a = this.head - 1; a >= this.head - this.wsz && a >= 0; a--) {
|
||||
html.unshift(this.genrow(a, true).replace(/><td>/, "><td>a "));
|
||||
@@ -399,6 +463,8 @@ function U2pvis(act, btns) {
|
||||
that.changecard(newtab);
|
||||
};
|
||||
}
|
||||
|
||||
this.changecard(this.act);
|
||||
}
|
||||
|
||||
|
||||
@@ -495,17 +561,21 @@ function up2k_init(subtle) {
|
||||
ask_up = bcfg_get('ask_up', true),
|
||||
flag_en = bcfg_get('flag_en', false),
|
||||
fsearch = bcfg_get('fsearch', false),
|
||||
turbo = bcfg_get('u2turbo', false),
|
||||
datechk = bcfg_get('u2tdate', true),
|
||||
fdom_ctr = 0,
|
||||
min_filebuf = 0;
|
||||
|
||||
var st = {
|
||||
"files": [],
|
||||
"todo": {
|
||||
"head": [],
|
||||
"hash": [],
|
||||
"handshake": [],
|
||||
"upload": []
|
||||
},
|
||||
"busy": {
|
||||
"head": [],
|
||||
"hash": [],
|
||||
"handshake": [],
|
||||
"upload": []
|
||||
@@ -516,6 +586,15 @@ function up2k_init(subtle) {
|
||||
}
|
||||
};
|
||||
|
||||
function push_t(arr, t) {
|
||||
var sort = arr.length && arr[arr.length - 1].n > t.n;
|
||||
arr.push(t);
|
||||
if (sort)
|
||||
arr.sort(function (a, b) {
|
||||
return a.n < b.n ? -1 : 1;
|
||||
});
|
||||
}
|
||||
|
||||
var pvis = new U2pvis("bz", '#u2cards');
|
||||
|
||||
var bobslice = null;
|
||||
@@ -559,7 +638,7 @@ function up2k_init(subtle) {
|
||||
}
|
||||
else files = e.target.files;
|
||||
|
||||
if (!files || files.length == 0)
|
||||
if (!files || !files.length)
|
||||
return alert('no files selected??');
|
||||
|
||||
more_one_file();
|
||||
@@ -598,14 +677,50 @@ function up2k_init(subtle) {
|
||||
}
|
||||
}
|
||||
|
||||
function read_dirs(rd, pf, dirs, good, bad) {
|
||||
function rd_flatten(pf, dirs) {
|
||||
var ret = jcp(pf);
|
||||
for (var a = 0; a < dirs.length; a++)
|
||||
ret.push(dirs.fullPath || '');
|
||||
|
||||
ret.sort();
|
||||
return ret;
|
||||
}
|
||||
|
||||
var rd_missing_ref = [];
|
||||
function read_dirs(rd, pf, dirs, good, bad, spins) {
|
||||
spins = spins || 0;
|
||||
if (++spins == 5)
|
||||
rd_missing_ref = rd_flatten(pf, dirs);
|
||||
|
||||
if (spins == 200) {
|
||||
var missing = rd_flatten(pf, dirs),
|
||||
match = rd_missing_ref.length == missing.length,
|
||||
aa = match ? missing.length : 0;
|
||||
|
||||
missing.sort();
|
||||
for (var a = 0; a < aa; a++)
|
||||
if (rd_missing_ref[a] != missing[a])
|
||||
match = false;
|
||||
|
||||
if (match) {
|
||||
var msg = ['directory iterator got stuck on the following {0} items; good chance your browser is about to spinlock:'.format(missing.length)];
|
||||
for (var a = 0; a < Math.min(20, missing.length); a++)
|
||||
msg.push(missing[a]);
|
||||
|
||||
alert(msg.join('\n-- '));
|
||||
dirs = [];
|
||||
pf = [];
|
||||
}
|
||||
spins = 0;
|
||||
}
|
||||
|
||||
if (!dirs.length) {
|
||||
if (!pf.length)
|
||||
return gotallfiles(good, bad);
|
||||
|
||||
console.log("retry pf, " + pf.length);
|
||||
setTimeout(function () {
|
||||
read_dirs(rd, pf, dirs, good, bad);
|
||||
read_dirs(rd, pf, dirs, good, bad, spins);
|
||||
}, 50);
|
||||
return;
|
||||
}
|
||||
@@ -626,8 +741,7 @@ function up2k_init(subtle) {
|
||||
|
||||
pf.push(name);
|
||||
dn.file(function (fobj) {
|
||||
var idx = pf.indexOf(name);
|
||||
pf.splice(idx, 1);
|
||||
apop(pf, name);
|
||||
try {
|
||||
if (fobj.size > 0) {
|
||||
good.push([fobj, name]);
|
||||
@@ -645,12 +759,12 @@ function up2k_init(subtle) {
|
||||
dirs.shift();
|
||||
rd = null;
|
||||
}
|
||||
return read_dirs(rd, pf, dirs, good, bad);
|
||||
return read_dirs(rd, pf, dirs, good, bad, spins);
|
||||
});
|
||||
}
|
||||
|
||||
function gotallfiles(good_files, bad_files) {
|
||||
if (bad_files.length > 0) {
|
||||
if (bad_files.length) {
|
||||
var ntot = bad_files.length + good_files.length,
|
||||
msg = 'These {0} files (of {1} total) were skipped because they are empty:\n'.format(bad_files.length, ntot);
|
||||
|
||||
@@ -670,40 +784,52 @@ function up2k_init(subtle) {
|
||||
if (ask_up && !fsearch && !confirm(msg.join('\n')))
|
||||
return;
|
||||
|
||||
var seen = {},
|
||||
evpath = get_evpath(),
|
||||
draw_each = good_files.length < 50;
|
||||
|
||||
for (var a = 0; a < st.files.length; a++)
|
||||
seen[st.files[a].name + '\n' + st.files[a].size] = 1;
|
||||
|
||||
for (var a = 0; a < good_files.length; a++) {
|
||||
var fobj = good_files[a][0],
|
||||
now = Date.now(),
|
||||
lmod = fobj.lastModified || now;
|
||||
|
||||
var entry = {
|
||||
"n": parseInt(st.files.length.toString()),
|
||||
"n": st.files.length,
|
||||
"t0": now,
|
||||
"fobj": fobj,
|
||||
"name": good_files[a][1],
|
||||
"size": fobj.size,
|
||||
"lmod": lmod / 1000,
|
||||
"purl": get_evpath(),
|
||||
"purl": evpath,
|
||||
"done": false,
|
||||
"hash": []
|
||||
};
|
||||
},
|
||||
key = entry.name + '\n' + entry.size;
|
||||
|
||||
var skip = false;
|
||||
for (var b = 0; b < st.files.length; b++)
|
||||
if (entry.name == st.files[b].name &&
|
||||
entry.size == st.files[b].size)
|
||||
skip = true;
|
||||
|
||||
if (skip)
|
||||
if (seen[key])
|
||||
continue;
|
||||
|
||||
seen[key] = 1;
|
||||
|
||||
pvis.addfile([
|
||||
fsearch ? esc(entry.name) : linksplit(
|
||||
uricom_dec(entry.purl)[0] + entry.name).join(' '),
|
||||
'📐 hash',
|
||||
''
|
||||
], fobj.size);
|
||||
], fobj.size, draw_each);
|
||||
|
||||
st.files.push(entry);
|
||||
st.todo.hash.push(entry);
|
||||
if (turbo)
|
||||
push_t(st.todo.head, entry);
|
||||
else
|
||||
push_t(st.todo.hash, entry);
|
||||
}
|
||||
if (!draw_each) {
|
||||
pvis.drawcard("q");
|
||||
pvis.changecard(pvis.act);
|
||||
}
|
||||
}
|
||||
ebi('u2btn').addEventListener('drop', gotfile, false);
|
||||
@@ -739,10 +865,35 @@ function up2k_init(subtle) {
|
||||
//
|
||||
|
||||
function handshakes_permitted() {
|
||||
var lim = multitask ? 1 : 0;
|
||||
return lim >=
|
||||
if (!st.todo.handshake.length)
|
||||
return true;
|
||||
|
||||
var t = st.todo.handshake[0],
|
||||
cd = t.cooldown;
|
||||
|
||||
if (cd && cd - Date.now() > 0)
|
||||
return false;
|
||||
|
||||
// keepalive or verify
|
||||
if (t.keepalive ||
|
||||
t.t_uploaded)
|
||||
return true;
|
||||
|
||||
if (parallel_uploads <
|
||||
st.busy.handshake.length)
|
||||
return false;
|
||||
|
||||
if (st.busy.handshake.length)
|
||||
for (var n = t.n - 1; n >= t.n - parallel_uploads && n >= 0; n--)
|
||||
if (st.files[n].t_uploading)
|
||||
return false;
|
||||
|
||||
if ((multitask ? 1 : 0) <
|
||||
st.todo.upload.length +
|
||||
st.busy.upload.length;
|
||||
st.busy.upload.length)
|
||||
return false;
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
function hashing_permitted() {
|
||||
@@ -773,14 +924,17 @@ function up2k_init(subtle) {
|
||||
|
||||
clearTimeout(tto);
|
||||
running = true;
|
||||
while (true) {
|
||||
var is_busy = 0 !=
|
||||
st.todo.hash.length +
|
||||
st.todo.handshake.length +
|
||||
st.todo.upload.length +
|
||||
st.busy.hash.length +
|
||||
st.busy.handshake.length +
|
||||
st.busy.upload.length;
|
||||
while (window['vis_exh']) {
|
||||
var now = Date.now(),
|
||||
is_busy = 0 !=
|
||||
st.todo.head.length +
|
||||
st.todo.hash.length +
|
||||
st.todo.handshake.length +
|
||||
st.todo.upload.length +
|
||||
st.busy.head.length +
|
||||
st.busy.hash.length +
|
||||
st.busy.handshake.length +
|
||||
st.busy.upload.length;
|
||||
|
||||
if (was_busy != is_busy) {
|
||||
was_busy = is_busy;
|
||||
@@ -791,7 +945,6 @@ function up2k_init(subtle) {
|
||||
|
||||
if (flag) {
|
||||
if (is_busy) {
|
||||
var now = Date.now();
|
||||
flag.take(now);
|
||||
if (!flag.ours)
|
||||
return defer();
|
||||
@@ -803,43 +956,52 @@ function up2k_init(subtle) {
|
||||
|
||||
var mou_ikkai = false;
|
||||
|
||||
if (st.busy.handshake.length > 0 &&
|
||||
st.busy.handshake[0].busied < Date.now() - 30 * 1000
|
||||
if (st.busy.handshake.length &&
|
||||
st.busy.handshake[0].t_busied < now - 30 * 1000
|
||||
) {
|
||||
console.log("retrying stuck handshake");
|
||||
var t = st.busy.handshake.shift();
|
||||
st.todo.handshake.unshift(t);
|
||||
}
|
||||
|
||||
if (st.todo.handshake.length > 0 &&
|
||||
st.busy.handshake.length == 0 && (
|
||||
st.todo.handshake[0].t4 || (
|
||||
handshakes_permitted() &&
|
||||
st.busy.upload.length < parallel_uploads
|
||||
)
|
||||
)
|
||||
) {
|
||||
exec_handshake();
|
||||
var nprev = -1;
|
||||
for (var a = 0; a < st.todo.upload.length; a++) {
|
||||
var nf = st.todo.upload[a].nfile;
|
||||
if (nprev == nf)
|
||||
continue;
|
||||
|
||||
nprev = nf;
|
||||
var t = st.files[nf];
|
||||
if (now - t.t_busied > 1000 * 30 &&
|
||||
now - t.t_handshake > 1000 * (21600 - 1800)
|
||||
) {
|
||||
apop(st.todo.handshake, t);
|
||||
st.todo.handshake.unshift(t);
|
||||
t.keepalive = true;
|
||||
}
|
||||
}
|
||||
|
||||
if (st.todo.head.length &&
|
||||
st.busy.head.length < parallel_uploads) {
|
||||
exec_head();
|
||||
mou_ikkai = true;
|
||||
}
|
||||
|
||||
if (handshakes_permitted() &&
|
||||
st.todo.handshake.length > 0 &&
|
||||
st.busy.handshake.length == 0 &&
|
||||
st.busy.upload.length < parallel_uploads) {
|
||||
st.todo.handshake.length) {
|
||||
exec_handshake();
|
||||
mou_ikkai = true;
|
||||
}
|
||||
|
||||
if (st.todo.upload.length > 0 &&
|
||||
if (st.todo.upload.length &&
|
||||
st.busy.upload.length < parallel_uploads) {
|
||||
exec_upload();
|
||||
mou_ikkai = true;
|
||||
}
|
||||
|
||||
if (hashing_permitted() &&
|
||||
st.todo.hash.length > 0 &&
|
||||
st.busy.hash.length == 0) {
|
||||
st.todo.hash.length &&
|
||||
!st.busy.hash.length) {
|
||||
exec_hash();
|
||||
mou_ikkai = true;
|
||||
}
|
||||
@@ -946,7 +1108,7 @@ function up2k_init(subtle) {

bpend += cdr - car;

reader.onload = function (e) {
function orz(e) {
if (!min_filebuf && nch == 1) {
min_filebuf = 1;
var td = Date.now() - t0;
@@ -956,9 +1118,30 @@ function up2k_init(subtle) {
}
}
hash_calc(nch, e.target.result);
}
reader.onload = function (e) {
try { orz(e); } catch (ex) { vis_exh(ex + '', '', '', '', ex); }
};
reader.onerror = function () {
alert('y o u b r o k e i t\nerror: ' + reader.error);
var err = reader.error + '';
var handled = false;

if (err.indexOf('NotReadableError') !== -1 || // win10-chrome defender
err.indexOf('NotFoundError') !== -1 // macos-firefox permissions
) {
pvis.seth(t.n, 1, 'OS-error');
pvis.seth(t.n, 2, err);
handled = true;
}

if (handled) {
pvis.move(t.n, 'ng');
apop(st.busy.hash, t);
st.bytes.uploaded += t.size;
return tasker();
}

alert('y o u b r o k e i t\nfile: ' + t.name + '\nerror: ' + err);
};
reader.readAsArrayBuffer(
bobslice.call(t.fobj, car, cdr));
@@ -986,15 +1169,15 @@ function up2k_init(subtle) {
t.hash.push(hashtab[a]);
}

t.t2 = Date.now();
t.t_hashed = Date.now();
if (t.n == 0 && window.location.hash == '#dbg') {
var spd = (t.size / ((t.t2 - t.t1) / 1000.)) / (1024 * 1024.);
alert('{0} ms, {1} MB/s\n'.format(t.t2 - t.t1, spd.toFixed(3)) + t.hash.join('\n'));
var spd = (t.size / ((t.t_hashed - t.t_hashing) / 1000.)) / (1024 * 1024.);
alert('{0} ms, {1} MB/s\n'.format(t.t_hashed - t.t_hashing, spd.toFixed(3)) + t.hash.join('\n'));
}

pvis.seth(t.n, 2, 'hashing done');
pvis.seth(t.n, 1, '📦 wait');
st.busy.hash.splice(st.busy.hash.indexOf(t), 1);
apop(st.busy.hash, t);
st.todo.handshake.push(t);
tasker();
};
@@ -1017,10 +1200,57 @@ function up2k_init(subtle) {
}, 1);
};

t.t1 = Date.now();
t.t_hashing = Date.now();
segm_next();
}

/////
////
/// head
//

function exec_head() {
var t = st.todo.head.shift();
st.busy.head.push(t);

var xhr = new XMLHttpRequest();
xhr.onerror = function () {
console.log('head onerror, retrying', t);
apop(st.busy.head, t);
st.todo.head.unshift(t);
tasker();
};
function orz(e) {
var ok = false;
if (xhr.status == 200) {
var srv_sz = xhr.getResponseHeader('Content-Length'),
srv_ts = xhr.getResponseHeader('Last-Modified');

ok = t.size == srv_sz;
if (ok && datechk) {
srv_ts = new Date(srv_ts) / 1000;
ok = Math.abs(srv_ts - t.lmod) < 2;
}
}
apop(st.busy.head, t);
if (!ok)
return push_t(st.todo.hash, t);

t.done = true;
st.bytes.hashed += t.size;
st.bytes.uploaded += t.size;
pvis.seth(t.n, 1, 'YOLO');
pvis.seth(t.n, 2, "turbo'd");
pvis.move(t.n, 'ok');
};
xhr.onload = function (e) {
try { orz(e); } catch (ex) { vis_exh(ex + '', '', '', '', ex); }
};

xhr.open('HEAD', t.purl + t.name, true);
xhr.send();
}
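The HEAD-based turbo check above trusts a file as already uploaded when the server reports the same byte size and, if date-checking is enabled, a Last-Modified timestamp within two seconds of the local file's mtime. A standalone sketch of that comparison with made-up values (the two-second slack is presumably there to absorb the one-second resolution of HTTP dates and coarse filesystem timestamps):

```js
// standalone sketch of the size/date comparison used by exec_head (values are made up)
var srv_sz = '1048576';                                        // Content-Length header (string)
var srv_ts = new Date('Sat, 01 May 2021 12:00:01 GMT') / 1000; // Last-Modified header, unix seconds
var t = { size: 1048576, lmod: 1619870400 };                   // local size + mtime from the file picker

var ok = t.size == srv_sz;                 // loose == on purpose: the header value is a string
if (ok)
    ok = Math.abs(srv_ts - t.lmod) < 2;    // accept up to 2s of timestamp drift

console.log(ok ? 'skip (assume already uploaded)' : 'hash and handshake as usual');
```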

/////
////
/// handshake
@@ -1028,30 +1258,41 @@ function up2k_init(subtle) {

function exec_handshake() {
var t = st.todo.handshake.shift(),
keepalive = t.keepalive,
me = Date.now();

st.busy.handshake.push(t);
t.busied = me;
t.keepalive = undefined;
t.t_busied = me;

if (keepalive)
console.log("sending keepalive handshake", t);

var xhr = new XMLHttpRequest();
xhr.onerror = function () {
if (t.busied != me) {
if (t.t_busied != me) {
console.log('zombie handshake onerror,', t);
return;
}
console.log('handshake onerror, retrying');
st.busy.handshake.splice(st.busy.handshake.indexOf(t), 1);
console.log('handshake onerror, retrying', t);
apop(st.busy.handshake, t);
st.todo.handshake.unshift(t);
t.keepalive = keepalive;
tasker();
};
xhr.onload = function (e) {
if (t.busied != me) {
function orz(e) {
if (t.t_busied != me) {
console.log('zombie handshake onload,', t);
return;
}
if (xhr.status == 200) {
var response = JSON.parse(xhr.responseText);
t.t_handshake = Date.now();
if (keepalive) {
apop(st.busy.handshake, t);
return;
}

var response = JSON.parse(xhr.responseText);
if (!response.name) {
var msg = '',
smsg = '';
@@ -1075,7 +1316,7 @@ function up2k_init(subtle) {
pvis.seth(t.n, 2, msg);
pvis.seth(t.n, 1, smsg);
pvis.move(t.n, smsg == '404' ? 'ng' : 'ok');
st.busy.handshake.splice(st.busy.handshake.indexOf(t), 1);
apop(st.busy.handshake, t);
st.bytes.uploaded += t.size;
t.done = true;
tasker();
@@ -1084,6 +1325,7 @@ function up2k_init(subtle) {

if (response.name !== t.name) {
// file exists; server renamed us
console.log("server-rename [" + t.name + "] to [" + response.name + "]");
t.name = response.name;
pvis.seth(t.n, 0, linksplit(t.purl + t.name).join(' '));
}
@@ -1116,31 +1358,41 @@ function up2k_init(subtle) {
var done = true,
msg = '🎷🐛';

if (t.postlist.length > 0) {
if (t.postlist.length) {
var arr = st.todo.upload,
sort = arr.length && arr[arr.length - 1].nfile > t.n;

for (var a = 0; a < t.postlist.length; a++)
st.todo.upload.push({
arr.push({
'nfile': t.n,
'npart': t.postlist[a]
});

msg = 'uploading';
done = false;

if (sort)
arr.sort(function (a, b) {
return a.nfile < b.nfile ? -1 :
/* */ a.nfile > b.nfile ? 1 :
a.npart < b.npart ? -1 : 1;
});
}
pvis.seth(t.n, 1, msg);
st.busy.handshake.splice(st.busy.handshake.indexOf(t), 1);
apop(st.busy.handshake, t);

if (done) {
t.done = true;
st.bytes.uploaded += t.size - t.bytes_uploaded;
var spd1 = (t.size / ((t.t2 - t.t1) / 1000.)) / (1024 * 1024.),
spd2 = (t.size / ((t.t4 - t.t3) / 1000.)) / (1024 * 1024.);
var spd1 = (t.size / ((t.t_hashed - t.t_hashing) / 1000.)) / (1024 * 1024.),
spd2 = (t.size / ((t.t_uploaded - t.t_uploading) / 1000.)) / (1024 * 1024.);

pvis.seth(t.n, 2, 'hash {0}, up {1} MB/s'.format(
spd1.toFixed(2), spd2.toFixed(2)));

pvis.move(t.n, 'ok');
}
else t.t4 = undefined;
else t.t_uploaded = undefined;

tasker();
}
@@ -1155,6 +1407,15 @@ function up2k_init(subtle) {
if (rsp.indexOf('<pre>') === 0)
rsp = rsp.slice(5);

if (rsp.indexOf('rate-limit ') !== -1) {
var penalty = rsp.replace(/.*rate-limit /, "").split(' ')[0];
console.log("rate-limit: " + penalty);
t.cooldown = Date.now() + parseFloat(penalty) * 1000;
apop(st.busy.handshake, t);
st.todo.handshake.unshift(t);
return;
}

st.bytes.uploaded += t.size;
if (rsp.indexOf('partial upload exists') !== -1 ||
rsp.indexOf('file already exists') !== -1) {
@@ -1169,7 +1430,7 @@ function up2k_init(subtle) {
pvis.seth(t.n, 2, err);
pvis.move(t.n, 'ng');

st.busy.handshake.splice(st.busy.handshake.indexOf(t), 1);
apop(st.busy.handshake, t);
tasker();
return;
}
@@ -1179,6 +1440,9 @@ function up2k_init(subtle) {
(xhr.responseText && xhr.responseText) ||
"no further information"));
}
}
xhr.onload = function (e) {
try { orz(e); } catch (ex) { vis_exh(ex + '', '', '', '', ex); }
};

var req = {
@@ -1207,8 +1471,8 @@ function up2k_init(subtle) {
var npart = upt.npart,
t = st.files[upt.nfile];

if (!t.t3)
t.t3 = Date.now();
if (!t.t_uploading)
t.t_uploading = Date.now();

pvis.seth(t.n, 1, "🚀 send");
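One detail that is easy to miss in the handshake hunk further up: when a handshake returns chunks for a file with a lower index than the tail of the upload queue, the queue is re-sorted by (nfile, npart) so chunks stay grouped per file and earlier files finish first. A tiny standalone run of that comparator, with a made-up queue:

```js
// worked example of the (nfile, npart) ordering applied after a handshake (queue contents are made up)
var arr = [
    { nfile: 2, npart: 0 },
    { nfile: 1, npart: 3 },
    { nfile: 1, npart: 1 }
];
arr.sort(function (a, b) {
    return a.nfile < b.nfile ? -1 :
    /*  */ a.nfile > b.nfile ? 1 :
           a.npart < b.npart ? -1 : 1;
});
console.log(JSON.stringify(arr));
// [{"nfile":1,"npart":1},{"nfile":1,"npart":3},{"nfile":2,"npart":0}]
```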
@@ -1219,40 +1483,56 @@ function up2k_init(subtle) {
if (cdr >= t.size)
cdr = t.size;

var xhr = new XMLHttpRequest();
xhr.upload.onprogress = function (xev) {
pvis.prog(t, npart, xev.loaded);
};
xhr.onload = function (xev) {
function orz(xhr) {
var txt = ((xhr.response && xhr.response.err) || xhr.responseText) + '';
if (xhr.status == 200) {
pvis.prog(t, npart, cdr - car);
st.bytes.uploaded += cdr - car;
t.bytes_uploaded += cdr - car;
st.busy.upload.splice(st.busy.upload.indexOf(upt), 1);
t.postlist.splice(t.postlist.indexOf(npart), 1);
if (t.postlist.length == 0) {
t.t4 = Date.now();
pvis.seth(t.n, 1, 'verifying');
st.todo.handshake.unshift(t);
}
tasker();
}
else
else if (txt.indexOf('already got that') !== -1) {
console.log("ignoring dupe-segment error", t);
}
else {
alert("server broke; cu-err {0} on file [{1}]:\n".format(
xhr.status, t.name) + (
(xhr.response && xhr.response.err) ||
(xhr.responseText && xhr.responseText) ||
"no further information"));
};
xhr.open('POST', t.purl + 'chunkpit.php', true);
xhr.setRequestHeader("X-Up2k-Hash", t.hash[npart]);
xhr.setRequestHeader("X-Up2k-Wark", t.wark);
xhr.setRequestHeader('Content-Type', 'application/octet-stream');
if (xhr.overrideMimeType)
xhr.overrideMimeType('Content-Type', 'application/octet-stream');
xhr.status, t.name) + (txt || "no further information"));
return;
}
apop(st.busy.upload, upt);
apop(t.postlist, npart);
if (!t.postlist.length) {
t.t_uploaded = Date.now();
pvis.seth(t.n, 1, 'verifying');
st.todo.handshake.unshift(t);
}
tasker();
}
function do_send() {
var xhr = new XMLHttpRequest();
xhr.upload.onprogress = function (xev) {
pvis.prog(t, npart, xev.loaded);
};
xhr.onload = function (xev) {
try { orz(xhr); } catch (ex) { vis_exh(ex + '', '', '', '', ex); }
};
xhr.onerror = function (xev) {
if (!window['vis_exh'])
return;

xhr.responseType = 'text';
xhr.send(bobslice.call(t.fobj, car, cdr));
console.log('chunkpit onerror, retrying', t);
do_send();
};
xhr.open('POST', t.purl + 'chunkpit.php', true);
xhr.setRequestHeader("X-Up2k-Hash", t.hash[npart]);
xhr.setRequestHeader("X-Up2k-Wark", t.wark);
xhr.setRequestHeader('Content-Type', 'application/octet-stream');
if (xhr.overrideMimeType)
xhr.overrideMimeType('Content-Type', 'application/octet-stream');

xhr.responseType = 'text';
xhr.send(bobslice.call(t.fobj, car, cdr));
}
do_send();
}

/////
@@ -1292,6 +1572,17 @@ function up2k_init(subtle) {
}
tt.init();

function bumpthread2(e) {
if (e.ctrlKey || e.altKey || e.metaKey || e.isComposing)
return;

if (e.code == 'ArrowUp')
bumpthread(1);

if (e.code == 'ArrowDown')
bumpthread(-1);
}

function bumpthread(dir) {
try {
dir.stopPropagation();
@@ -1302,7 +1593,7 @@ function up2k_init(subtle) {
if (dir.target) {
clmod(obj, 'err', 1);
var v = Math.floor(parseInt(obj.value));
if (v < 1 || v > 8 || v !== v)
if (v < 0 || v > 64 || v !== v)
return;

parallel_uploads = v;
@@ -1313,11 +1604,11 @@ function up2k_init(subtle) {

parallel_uploads += dir;

if (parallel_uploads < 1)
parallel_uploads = 1;
if (parallel_uploads < 0)
parallel_uploads = 0;

if (parallel_uploads > 8)
parallel_uploads = 8;
if (parallel_uploads > 16)
parallel_uploads = 16;

obj.value = parallel_uploads;
bumpthread({ "target": 1 })
@@ -1337,6 +1628,35 @@ function up2k_init(subtle) {
set_fsearch(!fsearch);
}

function tgl_turbo() {
turbo = !turbo;
bcfg_set('u2turbo', turbo);
draw_turbo();
}

function tgl_datechk() {
datechk = !datechk;
bcfg_set('u2tdate', datechk);
}

function draw_turbo() {
var msgu = '<p class="warn">WARNING: turbo enabled, <span> client may not detect and resume incomplete uploads; see turbo-button tooltip</span></p>',
msgs = '<p class="warn">WARNING: turbo enabled, <span> search may give false-positives; see turbo-button tooltip</span></p>',
msg = fsearch ? msgs : msgu,
omsg = fsearch ? msgu : msgs,
html = ebi('u2foot').innerHTML,
ohtml = html;

if (turbo && html.indexOf(msg) === -1)
html = html.replace(omsg, '') + msg;
else if (!turbo)
html = html.replace(msgu, '').replace(msgs, '');

if (html !== ohtml)
ebi('u2foot').innerHTML = html;
}
draw_turbo();

function set_fsearch(new_state) {
var fixed = false;
@@ -1374,6 +1694,7 @@ function up2k_init(subtle) {
}
catch (ex) { }

draw_turbo();
onresize();
}
@@ -1413,10 +1734,13 @@ function up2k_init(subtle) {
bumpthread(-1);
};

ebi('nthread').onkeydown = bumpthread2;
ebi('nthread').addEventListener('input', bumpthread, false);
ebi('multitask').addEventListener('click', tgl_multitask, false);
ebi('ask_up').addEventListener('click', tgl_ask_up, false);
ebi('flag_en').addEventListener('click', tgl_flag_en, false);
ebi('u2turbo').addEventListener('click', tgl_turbo, false);
ebi('u2tdate').addEventListener('click', tgl_datechk, false);
var o = ebi('fsearch');
if (o)
o.addEventListener('click', tgl_fsearch, false);
@@ -1426,7 +1750,10 @@ function up2k_init(subtle) {
nodes[a].addEventListener('touchend', nop, false);

set_fsearch();
bumpthread({ "target": 1 })
bumpthread({ "target": 1 });
if (parallel_uploads < 1)
bumpthread(1);

return { "init_deps": init_deps, "set_fsearch": set_fsearch }
}
@@ -215,9 +215,31 @@
color: #fff;
font-style: italic;
}
#u2foot .warn {
font-size: 1.3em;
padding: .5em .8em;
margin: 1em -.6em;
color: #f74;
background: #322;
border: 1px solid #633;
border-width: .1em 0;
text-align: center;
}
#u2foot .warn span {
color: #f86;
}
html.light #u2foot .warn {
color: #b00;
background: #fca;
border-color: #f70;
}
html.light #u2foot .warn span {
color: #930;
}
#u2foot span {
color: #999;
font-size: .9em;
font-weight: normal;
}
#u2footfoot {
margin-bottom: -1em;
@@ -11,16 +11,6 @@ var is_touch = 'ontouchstart' in window,


// error handler for mobile devices
function hcroak(msg) {
document.body.innerHTML = msg;
window.onerror = undefined;
throw 'fatal_err';
}
function croak(msg) {
document.body.textContent = msg;
window.onerror = undefined;
throw msg;
}
function esc(txt) {
return txt.replace(/[&"<>]/g, function (c) {
return {
@@ -32,9 +22,12 @@ function esc(txt) {
});
}
function vis_exh(msg, url, lineNo, columnNo, error) {
if (!window.onerror)
return;

window.onerror = undefined;
window['vis_exh'] = null;
var html = ['<h1>you hit a bug!</h1><p>please screenshot this error and send me a copy arigathanks gozaimuch (ed/irc.rizon.net or ed#2644)</p><p>',
var html = ['<h1>you hit a bug!</h1><p>please send me a screenshot arigathanks gozaimuch: <code>ed/irc.rizon.net</code> or <code>ed#2644</code><br /> (and if you can, press F12 and include the "Console" tab in the screenshot too)</p><p>',
esc(String(msg)), '</p><p>', esc(url + ' @' + lineNo + ':' + columnNo), '</p>'];

if (error) {
@@ -44,9 +37,14 @@ function vis_exh(msg, url, lineNo, columnNo, error) {
html.push('<h2>' + find[a] + '</h2>' +
esc(String(error[find[a]])).replace(/\n/g, '<br />\n'));
}
document.body.style.fontSize = '0.8em';
document.body.style.padding = '0 1em 1em 1em';
hcroak(html.join('\n'));
html.push('<p style="border-top:1px solid #999;margin-top:1.5em;font-size:1.4em">if you are stuck here, try to <a href="#" onclick="localStorage.clear();location.reload();">reset copyparty settings</a></p>');
document.body.innerHTML = html.join('\n');

var s = mknod('style');
s.innerHTML = 'body{background:#333;color:#ddd;font-family:sans-serif;font-size:0.8em;padding:0 1em 1em 1em} code{color:#bf7;background:#222;padding:.1em;margin:.2em;font-size:1.1em;font-family:monospace,monospace} *{line-height:1.5em}';
document.head.appendChild(s);

throw 'fatal_err';
}

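For orientation, the hunk above only shows the body of the global error reporter; how it gets wired up is not visible in this diff, so the following is an assumed, minimal sketch of the pattern the up2k.js hunks rely on (registering it as `window.onerror`, exposing it as `window['vis_exh']`, and the manual try/catch forwarding seen in the xhr handlers):

```js
// assumed wiring for a vis_exh-style reporter; the real registration lives elsewhere in util.js
function vis_exh(msg, url, lineNo, columnNo, error) {
    console.log('bug report:', msg, url + ' @' + lineNo + ':' + columnNo, error);
}
window.onerror = vis_exh;        // uncaught exceptions anywhere on the page
window['vis_exh'] = vis_exh;     // up2k.js checks this flag before doing its own retry/reporting

// manual forwarding, same shape as the xhr onload handlers in the up2k.js diff:
try { JSON.parse('not json'); } catch (ex) { vis_exh(ex + '', '', '', '', ex); }
```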
@@ -67,6 +65,9 @@ function ev(e) {
if (e.stopPropagation)
e.stopPropagation();

if (e.stopImmediatePropagation)
e.stopImmediatePropagation();

e.returnValue = false;
return e;
}
@@ -359,6 +360,15 @@ function get_vpath() {
}


function get_pwd() {
var pwd = ('; ' + document.cookie).split('; cppwd=');
if (pwd.length < 2)
return null;

return pwd[1].split(';')[0];
}


function unix2iso(ts) {
return new Date(ts * 1000).toISOString().replace("T", " ").slice(0, -5);
}
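As a quick sanity check of the cookie parsing in `get_pwd()` above, here is the same expression run against a hypothetical cookie string instead of `document.cookie`:

```js
// worked example of the get_pwd() parse above; the cookie value is made up
var cookie = 'theme=dark; cppwd=hunter2; lang=en';
var pwd = ('; ' + cookie).split('; cppwd=');                 // ['; theme=dark', 'hunter2; lang=en']
console.log(pwd.length < 2 ? null : pwd[1].split(';')[0]);   // "hunter2"
```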
@@ -380,6 +390,18 @@ function has(haystack, needle) {
}


function apop(arr, v) {
var ofs = arr.indexOf(v);
if (ofs !== -1)
arr.splice(ofs, 1);
}


function jcp(obj) {
return JSON.parse(JSON.stringify(obj));
}


function sread(key) {
if (window.localStorage)
return localStorage.getItem(key);
@@ -493,8 +515,10 @@ var tt = (function () {

var pos = this.getBoundingClientRect(),
left = pos.left < window.innerWidth / 2,
top = pos.top < window.innerHeight / 2;
top = pos.top < window.innerHeight / 2,
big = this.className.indexOf(' ttb') !== -1;

clmod(r.tt, 'b', big);
r.tt.style.top = top ? pos.bottom + 'px' : 'auto';
r.tt.style.bottom = top ? 'auto' : (window.innerHeight - pos.top) + 'px';
r.tt.style.left = left ? pos.left + 'px' : 'auto';
@@ -15,11 +15,6 @@
}
#ggrid>a[href$="/"]:before {
content: '📂';
display: block;
position: absolute;
margin: -.1em -.4em;
text-shadow: 0 0 .1em #000;
font-size: 2em;
}

@@ -27,8 +22,11 @@
#ggrid>a:before {
display: block;
position: absolute;
margin: -.1em -.4em;
padding: .3em 0;
margin: -.4em;
text-shadow: 0 0 .1em #000;
background: linear-gradient(135deg,rgba(255,255,255,0) 50%,rgba(255,255,255,0.2));
border-radius: .3em;
font-size: 2em;
}

51
docs/hls.html
Normal file
@@ -0,0 +1,51 @@
<!DOCTYPE html><html lang="en"><head>
<meta charset="utf-8">
<title>hls-test</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
</head><body>

<video id="vid" controls></video>
<script src="hls.light.js"></script>
<script>

var video = document.getElementById('vid');
var hls = new Hls({
debug: true,
autoStartLoad: false
});
hls.loadSource('live/v.m3u8');
hls.attachMedia(video);
hls.on(Hls.Events.MANIFEST_PARSED, function() {
hls.startLoad(0);
});
hls.on(Hls.Events.MEDIA_ATTACHED, function() {
video.muted = true;
video.play();
});

/*
general good news:
- doesn't need fixed-length segments; ok to let x264 pick optimal keyframes and slice on those
- hls.js polls the m3u8 for new segments, scales the duration accordingly, seeking works great
- the sfx will grow by 66 KiB since that's how small hls.js can get, wait that's not good

# vod, creates m3u8 at the end, fixed keyframes, v bad
ffmpeg -hide_banner -threads 0 -flags -global_header -i ..\CowboyBebopMovie-OP1.webm -vf scale=1280:-4,format=yuv420p -ac 2 -c:a libopus -b:a 128k -c:v libx264 -preset slow -crf 24 -maxrate:v 5M -bufsize:v 10M -g 120 -keyint_min 120 -sc_threshold 0 -hls_time 4 -hls_playlist_type vod -hls_segment_filename v%05d.ts v.m3u8

# live, updates m3u8 as it goes, dynamic keyframes, streamable with hls.js
ffmpeg -hide_banner -threads 0 -flags -global_header -i ..\..\CowboyBebopMovie-OP1.webm -vf scale=1280:-4,format=yuv420p -ac 2 -c:a libopus -b:a 128k -c:v libx264 -preset slow -crf 24 -maxrate:v 5M -bufsize:v 10M -f segment -segment_list v.m3u8 -segment_format mpegts -segment_list_flags live v%05d.ts

# fmp4 (fragmented mp4), doesn't work with hls.js, gets duration 149:07:51 (536871s), probably the tkhd/mdhd 0xffffffff (timebase 8000? ok)
ffmpeg -re -hide_banner -threads 0 -flags +cgop -i ..\..\CowboyBebopMovie-OP1.webm -vf scale=1280:-4,format=yuv420p -ac 2 -c:a libopus -b:a 128k -c:v libx264 -preset slow -crf 24 -maxrate:v 5M -bufsize:v 10M -f segment -segment_list v.m3u8 -segment_format fmp4 -segment_list_flags live v%05d.mp4

# try 2, works, uses tempfiles for m3u8 updates, good, 6% smaller
ffmpeg -re -hide_banner -threads 0 -flags +cgop -i ..\..\CowboyBebopMovie-OP1.webm -vf scale=1280:-4,format=yuv420p -ac 2 -c:a libopus -b:a 128k -c:v libx264 -preset slow -crf 24 -maxrate:v 5M -bufsize:v 10M -f hls -hls_segment_type fmp4 -hls_list_size 0 -hls_segment_filename v%05d.mp4 v.m3u8

more notes
- adding -hls_flags single_file makes duration wack during playback (for both fmp4 and ts), ok once finalized and refreshed, gives no size reduction anyways
- bebop op has good keyframe spacing for testing hls.js, in particular it hops one seg back and immediately resumes if it hits eof with the explicit hls.startLoad(0); otherwise it jumps into the middle of a seg and becomes art
- can probably -c:v copy most of the time, is there a way to check for cgop? todo

*/
</script>
</body></html>
@@ -103,6 +103,15 @@ cat warks | while IFS= read -r x; do sqlite3 up2k.db "delete from mt where w = '
# dump all dbs
find -iname up2k.db | while IFS= read -r x; do sqlite3 "$x" 'select substr(w,1,12), rd, fn from up' | sed -r 's/\|/ \| /g' | while IFS= read -r y; do printf '%s | %s\n' "$x" "$y"; done; done

# unschedule mtp scan for all files somewhere under "enc/"
sqlite3 -readonly up2k.db 'select substr(up.w,1,16) from up inner join mt on mt.w = substr(up.w,1,16) where rd like "enc/%" and +mt.k = "t:mtp"' > keys; awk '{printf "delete from mt where w = \"%s\" and +k = \"t:mtp\";\n", $0}' <keys | tee /dev/stderr | sqlite3 up2k.db

# compare metadata key "key" between two databases
sqlite3 -readonly up2k.db.key-full 'select w, v from mt where k = "key" order by w' > k1; sqlite3 -readonly up2k.db 'select w, v from mt where k = "key" order by w' > k2; ok=0; ng=0; while IFS='|' read w k2; do k1="$(grep -E "^$w" k1 | sed -r 's/.*\|//')"; [ "$k1" = "$k2" ] && ok=$((ok+1)) || { ng=$((ng+1)); printf '%3s %3s %s\n' "$k1" "$k2" "$(sqlite3 -readonly up2k.db.key-full "select * from up where substr(w,1,16) = '$w'" | sed -r 's/\|/ | /g')"; }; done < <(cat k2); echo "match $ok diff $ng"

# actually this is much better
sqlite3 -readonly up2k.db.key-full 'select w, v from mt where k = "key" order by w' > k1; sqlite3 -readonly up2k.db 'select mt.w, mt.v, up.rd, up.fn from mt inner join up on mt.w = substr(up.w,1,16) where mt.k = "key" order by up.rd, up.fn' > k2; ok=0; ng=0; while IFS='|' read w k2 path; do k1="$(grep -E "^$w" k1 | sed -r 's/.*\|//')"; [ "$k1" = "$k2" ] && ok=$((ok+1)) || { ng=$((ng+1)); printf '%3s %3s %s\n' "$k1" "$k2" "$path"; }; done < <(cat k2); echo "match $ok diff $ng"


##
## media
@@ -157,7 +166,7 @@ dbg.asyncStore.pendingBreakpoints = {}
about:config >> devtools.debugger.prefs-schema-version = -1

# determine server version
git reset --hard origin/HEAD && git log --format=format:"%H %ai %d" --decorate=full > /dev/shm/revs && cat /dev/shm/revs | while read -r rev extra; do (git reset --hard $rev >/dev/null 2>/dev/null && dsz=$(cat copyparty/web/{util,browser,up2k}.js 2>/dev/null | diff -wNarU0 - <(cat /mnt/Users/ed/Downloads/ref/{util,browser,up2k}.js) | wc -c) && printf '%s %6s %s\n' "$rev" $dsz "$extra") </dev/null; done
git pull; git reset --hard origin/HEAD && git log --format=format:"%H %ai %d" --decorate=full > ../revs && cat ../{util,browser}.js >../vr && cat ../revs | while read -r rev extra; do (git reset --hard $rev >/dev/null 2>/dev/null && dsz=$(cat copyparty/web/{util,browser}.js >../vg 2>/dev/null && diff -wNarU0 ../{vg,vr} | wc -c) && printf '%s %6s %s\n' "$rev" $dsz "$extra") </dev/null; done


##
@@ -200,3 +209,4 @@ mk() { rm -rf /tmp/foo; sudo -u ed bash -c 'mkdir /tmp/foo; echo hi > /tmp/foo/b
mk && t0="$(date)" && while true; do date -s "$(date '+ 1 hour')"; systemd-tmpfiles --clean; ls -1 /tmp | grep foo || break; done; echo "$t0"
mk && sudo -u ed flock /tmp/foo sleep 40 & sleep 1; ps aux | grep -E 'sleep 40$' && t0="$(date)" && for n in {1..40}; do date -s "$(date '+ 1 day')"; systemd-tmpfiles --clean; ls -1 /tmp | grep foo || break; done; echo "$t0"
mk && t0="$(date)" && for n in {1..40}; do date -s "$(date '+ 1 day')"; systemd-tmpfiles --clean; ls -1 /tmp | grep foo || break; tar -cf/dev/null /tmp/foo; done; echo "$t0"
@@ -6,10 +6,10 @@ import re, os, sys, time, shutil, signal, threading, tarfile, hashlib, platform,
import subprocess as sp

"""
run me with any version of python, i will unpack and run copyparty
pls don't edit this file with a text editor,
it breaks the compressed stuff at the end

(but please don't edit this file with a text editor
since that would probably corrupt the binary stuff at the end)
run me with any version of python, i will unpack and run copyparty

there's zero binaries! just plaintext python scripts all the way down
so you can easily unpack the archive and inspect it for shady stuff

@@ -23,13 +23,14 @@ def hdr(query):


class Cfg(Namespace):
def __init__(self, a=[], v=[], c=None):
def __init__(self, a=None, v=None, c=None):
super(Cfg, self).__init__(
a=a,
v=v,
a=a or [],
v=v or [],
c=c,
rproxy=0,
ed=False,
nw=False,
no_zip=False,
no_scandir=False,
no_sendfile=True,
@@ -16,8 +16,8 @@ from copyparty import util


class Cfg(Namespace):
def __init__(self, a=[], v=[], c=None):
ex = {k: False for k in "e2d e2ds e2dsa e2t e2ts e2tsr".split()}
def __init__(self, a=None, v=None, c=None):
ex = {k: False for k in "nw e2d e2ds e2dsa e2t e2ts e2tsr".split()}
ex2 = {
"mtp": [],
"mte": "a",
@@ -27,7 +27,7 @@ class Cfg(Namespace):
"rproxy": 0,
}
ex.update(ex2)
super(Cfg, self).__init__(a=a, v=v, c=c, **ex)
super(Cfg, self).__init__(a=a or [], v=v or [], c=c, **ex)


class TestVFS(unittest.TestCase):
@@ -108,6 +108,9 @@ class VHttpSrv(object):
aliases = ["splash", "browser", "browser2", "msg", "md", "mde"]
self.j2 = {x: J2_FILES for x in aliases}

def cachebuster(self):
return "a"


class VHttpConn(object):
def __init__(self, args, asrv, log, buf):
@@ -121,8 +124,8 @@ class VHttpConn(object):
self.log_src = "a"
self.lf_url = None
self.hsrv = VHttpSrv()
self.nreq = 0
self.nbyte = 0
self.workload = 0
self.ico = None
self.thumbcli = None
self.t0 = time.time()
Block a user