Compare commits

46 Commits

| SHA1 |
|---|
| d7cc000976 |
| 50d8ff95ae |
| b2de1459b6 |
| f0ffbea0b2 |
| 199ccca0fe |
| 1d9b355743 |
| f0437fbb07 |
| abc404a5b7 |
| 04b9e21330 |
| 1044aa071b |
| 4c3192c8cc |
| 689e77a025 |
| 3bd89403d2 |
| b4800d9bcb |
| 05485e8539 |
| 0e03dc0868 |
| 352b1ed10a |
| 0db1244d04 |
| ece08b8179 |
| b8945ae233 |
| dcaf7b0a20 |
| f982cdc178 |
| b265e59834 |
| 4a843a6624 |
| 241ef5b99d |
| f39f575a9c |
| 1521307f1e |
| dd122111e6 |
| 00c177fa74 |
| f6c7e49eb8 |
| 1a8dc3d18a |
| 38a163a09a |
| 8f031246d2 |
| 8f3d97dde7 |
| 4acaf24d65 |
| 9a8dbbbcf8 |
| a3efc4c726 |
| 0278bf328f |
| 17ddd96cc6 |
| 0e82e79aea |
| 30f124c061 |
| e19d90fcfc |
| 184bbdd23d |
| 30b50aec95 |
| c3c3d81db1 |
| 49b7231283 |

README.md (29 changes)
@@ -52,7 +52,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
* [compress uploads](#compress-uploads) - files can be autocompressed on upload
* [database location](#database-location) - in-volume (`.hist/up2k.db`, default) or somewhere else
* [metadata from audio files](#metadata-from-audio-files) - set `-e2t` to index tags on upload
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
* [upload events](#upload-events) - trigger a script/program on each upload
* [complete examples](#complete-examples)
* [browser support](#browser-support) - TLDR: yes
@@ -169,7 +169,6 @@ feature summary
* ☑ ...of audio (spectrograms) using FFmpeg
* ☑ cache eviction (max-age; maybe max-size eventually)
* ☑ SPA (browse while uploading)
  * if you use the navpane to navigate, not folders in the file list
* server indexing
  * ☑ [locate files by contents](#file-search)
  * ☑ search by name/path/date/size
@@ -325,6 +324,7 @@ the browser has the following hotkeys (always qwerty)
* `V` toggle folders / textfiles in the navpane
* `G` toggle list / [grid view](#thumbnails)
* `T` toggle thumbnails / icons
* `ESC` close various things
* `ctrl-X` cut selected files/folders
* `ctrl-V` paste
* `F2` [rename](#batch-rename) selected file/folder
@@ -376,9 +376,13 @@ switching between breadcrumbs or navpane

click the `🌲` or pressing the `B` hotkey to toggle between breadcrumbs path (default), or a navpane (tree-browser sidebar thing)

* `[-]` and `[+]` (or hotkeys `A`/`D`) adjust the size
* `[v]` jumps to the currently open folder
* `[+]` and `[-]` (or hotkeys `A`/`D`) adjust the size
* `[🎯]` jumps to the currently open folder
* `[📃]` toggles between showing folders and textfiles
* `[📌]` shows the name of all parent folders in a docked panel
* `[a]` toggles automatic widening as you go deeper
* `[↵]` toggles wordwrap
* `[👀]` show full name on hover (if wordwrap is off)

## thumbnails
@@ -394,6 +398,7 @@ audio files are covnerted into spectrograms using FFmpeg unless you `--no-athumb
images with the following names (see `--th-covers`) become the thumbnail of the folder they're in: `folder.png`, `folder.jpg`, `cover.png`, `cover.jpg`

in the grid/thumbnail view, if the audio player panel is open, songs will start playing when clicked
* indicated by the audio files having the ▶ icon instead of 💾

## zip downloads
@@ -571,6 +576,8 @@ and there are *two* editors

* you can link a particular timestamp in an audio file by adding it to the URL, such as `&20` / `&20s` / `&1m20` / `&t=1:20` after the `.../#af-c8960dab`

* enabling the audio equalizer can help make gapless albums fully gapless in some browsers (chrome), so consider leaving it on with all the values at zero

* get a plaintext file listing by adding `?ls=t` to a URL, or a compact colored one with `?ls=v` (for unix terminals)

* if you are using media hotkeys to switch songs and are getting tired of seeing the OSD popup which Windows doesn't let you disable, consider https://ocv.me/dev/?media-osd-bgone.ps1
@@ -619,10 +626,12 @@ through arguments:
* `-e2ts` also scans for tags in all files that don't have tags yet
* `-e2tsr` also deletes all existing tags, doing a full reindex

the same arguments can be set as volume flags, in addition to `d2d` and `d2t` for disabling:
the same arguments can be set as volume flags, in addition to `d2d`, `d2ds`, `d2t`, `d2ts` for disabling:
* `-v ~/music::r:c,e2dsa,e2tsr` does a full reindex of everything on startup
* `-v ~/music::r:c,d2d` disables **all** indexing, even if any `-e2*` are on
* `-v ~/music::r:c,d2t` disables all `-e2t*` (tags), does not affect `-e2d*`
* `-v ~/music::r:c,d2ds` disables on-boot scans; only index new uploads
* `-v ~/music::r:c,d2ts` same except only affecting tags

note:
* the parser can finally handle `c,e2dsa,e2tsr` so you no longer have to `c,e2dsa:c,e2tsr`
@@ -677,6 +686,12 @@ things to note,
* the files will be indexed after compression, so dupe-detection and file-search will not work as expected

some examples,
* `-v inc:inc:w:c,pk=xz,0`
  folder named inc, shared at inc, write-only for everyone, forces xz compression at level 0
* `-v inc:inc:w:c,pk`
  same write-only inc, but forces gz compression (default) instead of xz
* `-v inc:inc:w:c,gz`
  allows (but does not force) gz compression if client uploads to `/inc?pk` or `/inc?gz` or `/inc?gz=4`
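
As an illustration of the volflags above (a sketch, not part of this changeset): an upload into the write-only `inc` volume could request server-side gz compression through the query string; the host/port and filename here are assumptions, and `requests` is the same dependency that `bin/up2k.py` already uses.

```python
# hypothetical client-side upload into /inc, which has c,gz in the examples above
import requests

with open("backup.tar", "rb") as f:
    r = requests.put(
        "http://192.168.1.69:3923/inc/backup.tar",  # assumed host, matching the fuse examples
        params={"gz": "4"},                          # ask for gz level 4, like /inc?gz=4
        headers={"Cookie": "cppwd=hunter2"},         # password cookie, described further down
        data=f,
    )
print(r.status_code, r.text)  # reply includes a truncated sha512 of the upload (see below)
```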
## database location
@@ -721,7 +736,7 @@ see the beautiful mess of a dictionary in [mtag.py](https://github.com/9001/copy

## file parser plugins

provide custom parsers to index additional tags
provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)

copyparty can invoke external programs to collect additional metadata for files using `mtp` (either as argument or volume flag), there is a default timeout of 30sec
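
To illustrate the `mtp` hook, here is a minimal sketch of a custom parser (hypothetical, not one of the bundled plugins): copyparty runs it with the uploaded file as the only argument and indexes whatever it prints, the same pattern as `bin/mtag/image-noexif.py` further down.

```python
#!/usr/bin/env python3
# hypothetical single-tag parser: prints the file size, which copyparty would store
# under whatever tag name the volflag maps it to, for example
#   -v ~/pics::r:c,e2ts,mte=+bytes:c,mtp=bytes=ad,bin/mtag/file-bytes.py
import os
import sys


def main():
    print(os.path.getsize(sys.argv[1]))


if __name__ == "__main__":
    main()
```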
@@ -842,7 +857,7 @@ copyparty returns a truncated sha512sum of your PUT/POST as base64; you can gene
b512(){ printf "$((sha512sum||shasum -a512)|sed -E 's/ .*//;s/(..)/\\x\1/g')"|base64|tr '+/' '-_'|head -c44;}
b512 <movie.mkv

you can provide passwords using cookie 'cppwd=hunter2', as a url query `?pw=hunter2`, or with basic-authentication (either as the username or password)
you can provide passwords using cookie `cppwd=hunter2`, as a url query `?pw=hunter2`, or with basic-authentication (either as the username or password)
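
For reference, a rough Python equivalent of the `b512` shell helper above (an illustration, not part of the change): hash the file with sha512, base64-encode the raw digest with the url-safe alphabet, and keep the first 44 characters.

```python
import base64
import hashlib


def b512(path):
    # sha512 the file in chunks, then urlsafe-base64 the raw digest and truncate
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for buf in iter(lambda: f.read(64 * 1024), b""):
            h.update(buf)
    return base64.urlsafe_b64encode(h.digest()).decode()[:44]


print(b512("movie.mkv"))
```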
# up2k

copyparty-fuseb.py

@@ -11,14 +11,18 @@ import re
import os
import sys
import time
import json
import stat
import errno
import struct
import codecs
import platform
import threading
import http.client  # py2: httplib
import urllib.parse
from datetime import datetime
from urllib.parse import quote_from_bytes as quote
from urllib.parse import unquote_to_bytes as unquote

try:
    import fuse
@@ -38,7 +42,7 @@ except:
mount a copyparty server (local or remote) as a filesystem

usage:
  python ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,url=http://192.168.1.69:3923 /mnt/nas
  python ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,pw=wark,url=http://192.168.1.69:3923 /mnt/nas

dependencies:
  sudo apk add fuse-dev python3-dev
@@ -50,6 +54,10 @@ fork of copyparty-fuse.py based on fuse-python which
"""


WINDOWS = sys.platform == "win32"
MACOS = platform.system() == "Darwin"


def threadless_log(msg):
    print(msg + "\n", end="")


@@ -93,6 +101,41 @@ def html_dec(txt):
    )


def register_wtf8():
    def wtf8_enc(text):
        return str(text).encode("utf-8", "surrogateescape"), len(text)

    def wtf8_dec(binary):
        return bytes(binary).decode("utf-8", "surrogateescape"), len(binary)

    def wtf8_search(encoding_name):
        return codecs.CodecInfo(wtf8_enc, wtf8_dec, name="wtf-8")

    codecs.register(wtf8_search)


bad_good = {}
good_bad = {}


def enwin(txt):
    return "".join([bad_good.get(x, x) for x in txt])

    for bad, good in bad_good.items():
        txt = txt.replace(bad, good)

    return txt


def dewin(txt):
    return "".join([good_bad.get(x, x) for x in txt])

    for bad, good in bad_good.items():
        txt = txt.replace(good, bad)

    return txt


class CacheNode(object):
    def __init__(self, tag, data):
        self.tag = tag
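
A small usage sketch (an assumption for illustration, not part of the diff) of the `wtf-8` codec registered above: it lets filenames that are not valid utf-8 survive the round-trip through the fuse layer by mapping undecodable bytes to surrogates.

```python
# after register_wtf8() has run:
name = b"bad\xffname".decode("wtf-8")          # the stray 0xff byte becomes \udcff
assert name.encode("wtf-8") == b"bad\xffname"  # and encodes back to the same bytes
```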
@@ -115,8 +158,9 @@ class Stat(fuse.Stat):


class Gateway(object):
    def __init__(self, base_url):
    def __init__(self, base_url, pw):
        self.base_url = base_url
        self.pw = pw

        ui = urllib.parse.urlparse(base_url)
        self.web_root = ui.path.strip("/")
@@ -135,8 +179,7 @@ class Gateway(object):
        self.conns = {}

    def quotep(self, path):
        # TODO: mojibake support
        path = path.encode("utf-8", "ignore")
        path = path.encode("wtf-8")
        return quote(path, safe="/")

    def getconn(self, tid=None):
@@ -159,20 +202,29 @@ class Gateway(object):
        except:
            pass

    def sendreq(self, *args, **kwargs):
    def sendreq(self, *args, **ka):
        tid = get_tid()
        if self.pw:
            ck = "cppwd=" + self.pw
            try:
                ka["headers"]["Cookie"] = ck
            except:
                ka["headers"] = {"Cookie": ck}
        try:
            c = self.getconn(tid)
            c.request(*list(args), **kwargs)
            c.request(*list(args), **ka)
            return c.getresponse()
        except:
            self.closeconn(tid)
            c = self.getconn(tid)
            c.request(*list(args), **kwargs)
            c.request(*list(args), **ka)
            return c.getresponse()

    def listdir(self, path):
        web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?dots"
        if bad_good:
            path = dewin(path)

        web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?dots&ls"
        r = self.sendreq("GET", web_path)
        if r.status != 200:
            self.closeconn()
@@ -182,9 +234,12 @@ class Gateway(object):
            )
        )

        return self.parse_html(r)
        return self.parse_jls(r)

    def download_file_range(self, path, ofs1, ofs2):
        if bad_good:
            path = dewin(path)

        web_path = self.quotep("/" + "/".join([self.web_root, path])) + "?raw"
        hdr_range = "bytes={}-{}".format(ofs1, ofs2 - 1)
        log("downloading {}".format(hdr_range))
@@ -200,40 +255,27 @@ class Gateway(object):

        return r.read()

    def parse_html(self, datasrc):
        ret = []
        remainder = b""
        ptn = re.compile(
            r"^<tr><td>(-|DIR)</td><td><a [^>]+>([^<]+)</a></td><td>([^<]+)</td><td>([^<]+)</td></tr>$"
        )

    def parse_jls(self, datasrc):
        rsp = b""
        while True:
            buf = remainder + datasrc.read(4096)
            # print('[{}]'.format(buf.decode('utf-8')))
            buf = datasrc.read(1024 * 32)
            if not buf:
                break

            remainder = b""
            endpos = buf.rfind(b"\n")
            if endpos >= 0:
                remainder = buf[endpos + 1 :]
                buf = buf[:endpos]
            rsp += buf

            lines = buf.decode("utf-8").split("\n")
            for line in lines:
                m = ptn.match(line)
                if not m:
                    # print(line)
                    continue
        rsp = json.loads(rsp.decode("utf-8"))
        ret = []
        for statfun, nodes in [
            [self.stat_dir, rsp["dirs"]],
            [self.stat_file, rsp["files"]],
        ]:
            for n in nodes:
                fname = unquote(n["href"].split("?")[0]).rstrip(b"/").decode("wtf-8")
                if bad_good:
                    fname = enwin(fname)

                ftype, fname, fsize, fdate = m.groups()
                fname = html_dec(fname)
                ts = datetime.strptime(fdate, "%Y-%m-%d %H:%M:%S").timestamp()
                sz = int(fsize)
                if ftype == "-":
                    ret.append([fname, self.stat_file(ts, sz), 0])
                else:
                    ret.append([fname, self.stat_dir(ts, sz), 0])
                ret.append([fname, statfun(n["ts"], n["sz"]), 0])

        return ret

@@ -262,6 +304,7 @@ class CPPF(Fuse):
        Fuse.__init__(self, *args, **kwargs)

        self.url = None
        self.pw = None

        self.dircache = []
        self.dircache_mtx = threading.Lock()
@@ -271,7 +314,7 @@ class CPPF(Fuse):

    def init2(self):
        # TODO figure out how python-fuse wanted this to go
        self.gw = Gateway(self.url)  # .decode('utf-8'))
        self.gw = Gateway(self.url, self.pw)  # .decode('utf-8'))
        info("up")

    def clean_dircache(self):
@@ -536,6 +579,8 @@ class CPPF(Fuse):

    def getattr(self, path):
        log("getattr [{}]".format(path))
        if WINDOWS:
            path = enwin(path)  # windows occasionally decodes f0xx to xx

        path = path.strip("/")
        try:
@@ -568,9 +613,25 @@ class CPPF(Fuse):


def main():
    time.strptime("19970815", "%Y%m%d")  # python#7980
    register_wtf8()
    if WINDOWS:
        os.system("rem")

        for ch in '<>:"\\|?*':
            # microsoft maps illegal characters to f0xx
            # (e000 to f8ff is basic-plane private-use)
            bad_good[ch] = chr(ord(ch) + 0xF000)

        for n in range(0, 0x100):
            # map surrogateescape to another private-use area
            bad_good[chr(n + 0xDC00)] = chr(n + 0xF100)

        for k, v in bad_good.items():
            good_bad[v] = k

    server = CPPF()
    server.parser.add_option(mountopt="url", metavar="BASE_URL", default=None)
    server.parser.add_option(mountopt="pw", metavar="PASSWORD", default=None)
    server.parse(values=server, errex=1)
    if not server.url or not str(server.url).startswith("http"):
        print("\nerror:")
@@ -578,7 +639,7 @@ def main():
        print(" need argument: mount-path")
        print("example:")
        print(
            " ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,url=http://192.168.1.69:3923 /mnt/nas"
            " ./copyparty-fuseb.py -f -o allow_other,auto_unmount,nonempty,pw=wark,url=http://192.168.1.69:3923 /mnt/nas"
        )
        sys.exit(1)
@@ -6,9 +6,13 @@ some of these rely on libraries which are not MIT-compatible

* [audio-bpm.py](./audio-bpm.py) detects the BPM of music using the BeatRoot Vamp Plugin; imports GPL2
* [audio-key.py](./audio-key.py) detects the melodic key of music using the Mixxx fork of keyfinder; imports GPL3
* [media-hash.py](./media-hash.py) generates checksums for audio and video streams; uses FFmpeg (LGPL or GPL)

these do not have any problematic dependencies:
these invoke standalone programs which are GPL or similar, so is legally fine for most purposes:

* [media-hash.py](./media-hash.py) generates checksums for audio and video streams; uses FFmpeg (LGPL or GPL)
* [image-noexif.py](./image-noexif.py) removes exif tags from images; uses exiftool (GPLv1 or artistic-license)

these do not have any problematic dependencies at all:

* [cksum.py](./cksum.py) computes various checksums
* [exe.py](./exe.py) grabs metadata from .exe and .dll files (example for retrieving multiple tags with one parser)
@@ -19,18 +19,18 @@ dep: ffmpeg
def det(tf):
    # fmt: off
    sp.check_call([
        "ffmpeg",
        "-nostdin",
        "-hide_banner",
        "-v", "fatal",
        "-ss", "13",
        "-y", "-i", fsenc(sys.argv[1]),
        "-map", "0:a:0",
        "-ac", "1",
        "-ar", "22050",
        "-t", "300",
        "-f", "f32le",
        tf
        b"ffmpeg",
        b"-nostdin",
        b"-hide_banner",
        b"-v", b"fatal",
        b"-ss", b"13",
        b"-y", b"-i", fsenc(sys.argv[1]),
        b"-map", b"0:a:0",
        b"-ac", b"1",
        b"-ar", b"22050",
        b"-t", b"300",
        b"-f", b"f32le",
        fsenc(tf)
    ])
    # fmt: on

@@ -23,15 +23,15 @@ dep: ffmpeg
def det(tf):
    # fmt: off
    sp.check_call([
        "ffmpeg",
        "-nostdin",
        "-hide_banner",
        "-v", "fatal",
        "-y", "-i", fsenc(sys.argv[1]),
        "-map", "0:a:0",
        "-t", "300",
        "-sample_fmt", "s16",
        tf
        b"ffmpeg",
        b"-nostdin",
        b"-hide_banner",
        b"-v", b"fatal",
        b"-y", b"-i", fsenc(sys.argv[1]),
        b"-map", b"0:a:0",
        b"-t", b"300",
        b"-sample_fmt", b"s16",
        fsenc(tf)
    ])
    # fmt: on
bin/mtag/image-noexif.py (new file, 93 lines)

@@ -0,0 +1,93 @@
#!/usr/bin/env python3

"""
remove exif tags from uploaded images

dependencies:
  exiftool

about:
  creates a "noexif" subfolder and puts exif-stripped copies of each image there,
  the reason for the subfolder is to avoid issues with the up2k.db / deduplication:

  if the original image is modified in-place, then copyparty will keep the original
  hash in up2k.db for a while (until the next volume rescan), so if the image is
  reuploaded after a rescan then the upload will be renamed and kept as a dupe

  alternatively you could switch the logic around, making a copy of the original
  image into a subfolder named "exif" and modify the original in-place, but then
  up2k.db will be out of sync until the next rescan, so any additional uploads
  of the same image will get symlinked (deduplicated) to the modified copy
  instead of the original in "exif"

  or maybe delete the original image after processing, that would kinda work too

example copyparty config to use this:
  -v/mnt/nas/pics:pics:rwmd,ed:c,e2ts,mte=+noexif:c,mtp=noexif=ejpg,ejpeg,ad,bin/mtag/image-noexif.py

explained:
  for realpath /mnt/nas/pics (served at /pics) with read-write-modify-delete for ed,
  enable file analysis on upload (e2ts),
  append "noexif" to the list of known tags (mtp),
  and use mtp plugin "bin/mtag/image-noexif.py" to provide that tag,
  do this on all uploads with the file extension "jpg" or "jpeg",
  ad = parse file regardless if FFmpeg thinks it is audio or not

PS: this requires e2ts to be functional,
  meaning you need to do at least one of these:
   * apt install ffmpeg
   * pip3 install mutagen
  and your python must have sqlite3 support compiled in
"""


import os
import sys
import time
import filecmp
import subprocess as sp

try:
    from copyparty.util import fsenc
except:

    def fsenc(p):
        return p.encode("utf-8")


def main():
    cwd, fn = os.path.split(sys.argv[1])
    if os.path.basename(cwd) == "noexif":
        return

    os.chdir(cwd)
    f1 = fsenc(fn)
    f2 = os.path.join(b"noexif", f1)
    cmd = [
        b"exiftool",
        b"-exif:all=",
        b"-iptc:all=",
        b"-xmp:all=",
        b"-P",
        b"-o",
        b"noexif/",
        b"--",
        f1,
    ]
    sp.check_output(cmd)
    if not os.path.exists(f2):
        print("failed")
        return

    if filecmp.cmp(f1, f2, shallow=False):
        print("clean")
    else:
        print("exif")

    # lastmod = os.path.getmtime(f1)
    # times = (int(time.time()), int(lastmod))
    # os.utime(f2, times)


if __name__ == "__main__":
    main()
@@ -13,7 +13,7 @@ try:
except:

    def fsenc(p):
        return p
        return p.encode("utf-8")


"""
@@ -24,13 +24,13 @@ dep: ffmpeg
def det():
    # fmt: off
    cmd = [
        "ffmpeg",
        "-nostdin",
        "-hide_banner",
        "-v", "fatal",
        "-i", fsenc(sys.argv[1]),
        "-f", "framemd5",
        "-"
        b"ffmpeg",
        b"-nostdin",
        b"-hide_banner",
        b"-v", b"fatal",
        b"-i", fsenc(sys.argv[1]),
        b"-f", b"framemd5",
        b"-"
    ]
    # fmt: on
bin/up2k.py (12 changes)

@@ -3,7 +3,7 @@ from __future__ import print_function, unicode_literals

"""
up2k.py: upload to copyparty
2021-11-16, v0.12, ed <irc.rizon.net>, MIT-Licensed
2021-11-28, v0.13, ed <irc.rizon.net>, MIT-Licensed
https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py

- dependencies: requests
@@ -390,7 +390,7 @@ def handshake(req_ses, url, file, pw, search):
            r = req_ses.post(url, headers=headers, json=req)
            break
        except:
            eprint("handshake failed, retry...\n")
            eprint("handshake failed, retrying: {0}\n".format(file.name))
            time.sleep(1)

    try:
@@ -470,11 +470,11 @@ class Ctl(object):
            nbytes += inf.st_size

        if err:
            eprint("\n# failed to access {} paths:\n".format(len(err)))
            eprint("\n# failed to access {0} paths:\n".format(len(err)))
            for x in err:
                eprint(x.decode("utf-8", "replace") + "\n")

            eprint("^ failed to access those {} paths ^\n\n".format(len(err)))
            eprint("^ failed to access those {0} paths ^\n\n".format(len(err)))
            if not ar.ok:
                eprint("aborting because --ok is not set\n")
                return
@@ -505,7 +505,7 @@ class Ctl(object):
            print("{0} {1}\n hash...".format(self.nfiles - nf, upath))
            get_hashlist(file, None)

            burl = self.ar.url[:8] + self.ar.url[8:].split("/")[0] + "/"
            burl = self.ar.url[:12] + self.ar.url[8:].split("/")[0] + "/"
            while True:
                print(" hs...")
                hs = handshake(req_ses, self.ar.url, file, self.ar.a, search)
@@ -773,7 +773,7 @@ class Ctl(object):
            try:
                upload(req_ses, file, cid, self.ar.a)
            except:
                eprint("upload failed, retry...\n")
                eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
                pass  # handshake will fix it

            with self.mutex:
@@ -13,7 +13,7 @@

upstream cpp {
    server 127.0.0.1:3923;
    keepalive 120;
    keepalive 1;
}
server {
    listen 443 ssl;
@@ -25,26 +25,34 @@ ANYWIN = WINDOWS or sys.platform in ["msys"]
MACOS = platform.system() == "Darwin"


def get_unix_home():
    try:
        v = os.environ["XDG_CONFIG_HOME"]
        if not v:
            raise Exception()
        ret = os.path.normpath(v)
        os.listdir(ret)
        return ret
    except:
        pass
def get_unixdir():
    paths = [
        (os.environ.get, "XDG_CONFIG_HOME"),
        (os.path.expanduser, "~/.config"),
        (os.environ.get, "TMPDIR"),
        (os.environ.get, "TEMP"),
        (os.environ.get, "TMP"),
        (unicode, "/tmp"),
    ]
    for chk in [os.listdir, os.mkdir]:
        for pf, pa in paths:
            try:
                p = pf(pa)
                # print(chk.__name__, p, pa)
                if not p or p.startswith("~"):
                    continue

    try:
        v = os.path.expanduser("~/.config")
        if v.startswith("~"):
            raise Exception()
        ret = os.path.normpath(v)
        os.listdir(ret)
        return ret
    except:
        return "/tmp"
                p = os.path.normpath(p)
                chk(p)
                p = os.path.join(p, "copyparty")
                if not os.path.isdir(p):
                    os.mkdir(p)

                return p
            except:
                pass

    raise Exception("could not find a writable path for config")


class EnvParams(object):
@@ -59,7 +67,7 @@ class EnvParams(object):
        elif sys.platform == "darwin":
            self.cfg = os.path.expanduser("~/Library/Preferences/copyparty")
        else:
            self.cfg = get_unix_home() + "/copyparty"
            self.cfg = get_unixdir()

        self.cfg = self.cfg.replace("\\", "/")
        try:
@@ -23,7 +23,7 @@ from textwrap import dedent
from .__init__ import E, WINDOWS, ANYWIN, VT100, PY2, unicode
from .__version__ import S_VERSION, S_BUILD_DT, CODENAME
from .svchub import SvcHub
from .util import py_desc, align_tab, IMPLICATIONS, ansi_re
from .util import py_desc, align_tab, IMPLICATIONS, ansi_re, min_ex
from .authsrv import re_vol

HAVE_SSL = True
@@ -222,6 +222,54 @@ def sighandler(sig=None, frame=None):
    print("\n".join(msg))


def disable_quickedit():
    import ctypes
    import atexit
    from ctypes import wintypes

    def ecb(ok, fun, args):
        if not ok:
            err = ctypes.get_last_error()
            if err:
                raise ctypes.WinError(err)
        return args

    k32 = ctypes.WinDLL("kernel32", use_last_error=True)
    if PY2:
        wintypes.LPDWORD = ctypes.POINTER(wintypes.DWORD)

    k32.GetStdHandle.errcheck = ecb
    k32.GetConsoleMode.errcheck = ecb
    k32.SetConsoleMode.errcheck = ecb
    k32.GetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.LPDWORD)
    k32.SetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.DWORD)

    def cmode(out, mode=None):
        h = k32.GetStdHandle(-11 if out else -10)
        if mode:
            return k32.SetConsoleMode(h, mode)

        mode = wintypes.DWORD()
        k32.GetConsoleMode(h, ctypes.byref(mode))
        return mode.value

    # disable quickedit
    mode = orig_in = cmode(False)
    quickedit = 0x40
    extended = 0x80
    mask = quickedit + extended
    if mode & mask != extended:
        atexit.register(cmode, False, orig_in)
        cmode(False, mode & ~mask | extended)

    # enable colors in case the os.system("rem") trick ever stops working
    if VT100:
        mode = orig_out = cmode(True)
        if mode & 4 != 4:
            atexit.register(cmode, True, orig_out)
            cmode(True, mode | 4)


def run_argparse(argv, formatter):
    ap = argparse.ArgumentParser(
        formatter_class=formatter,
@@ -302,6 +350,8 @@ def run_argparse(argv, formatter):

\033[0mdatabase, general:
\033[36me2d\033[35m sets -e2d (all -e2* args can be set using ce2* volflags)
\033[36md2ts\033[35m disables metadata collection for existing files
\033[36md2ds\033[35m disables onboot indexing, overrides -e2ds*
\033[36md2t\033[35m disables metadata collection, overrides -e2t*
\033[36md2d\033[35m disables all database stuff, overrides -e2*
\033[36mnohash=\\.iso$\033[35m skips hashing file contents if path matches *.iso
@@ -368,6 +418,7 @@ def run_argparse(argv, formatter):
    ap2.add_argument("-emp", action="store_true", help="enable markdown plugins")
    ap2.add_argument("-mcr", metavar="SEC", type=int, default=60, help="md-editor mod-chk rate")
    ap2.add_argument("--urlform", metavar="MODE", type=u, default="print,get", help="how to handle url-forms; examples: [stash], [save,get]")
    ap2.add_argument("--wintitle", metavar="TXT", type=u, default="cpp @ $pub", help="window title, for example '$ip-10.1.2.' or '$ip-'")

    ap2 = ap.add_argument_group('upload options')
    ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads")
@@ -396,6 +447,7 @@ def run_argparse(argv, formatter):

    ap2 = ap.add_argument_group('opt-outs')
    ap2.add_argument("-nw", action="store_true", help="disable writes (benchmark)")
    ap2.add_argument("--keep-qem", action="store_true", help="do not disable quick-edit-mode on windows")
    ap2.add_argument("--no-del", action="store_true", help="disable delete operations")
    ap2.add_argument("--no-mv", action="store_true", help="disable move/rename operations")
    ap2.add_argument("-nih", action="store_true", help="no info hostname")
@@ -480,6 +532,7 @@ def run_argparse(argv, formatter):
    ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
    ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
    ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
    ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")

    ap2 = ap.add_argument_group('debug options')
    ap2.add_argument("--no-sendfile", action="store_true", help="disable sendfile")
@@ -550,6 +603,15 @@ def main(argv=None):
    except AssertionError:
        al = run_argparse(argv, Dodge11874)

    if WINDOWS and not al.keep_qem:
        try:
            disable_quickedit()
        except:
            print("\nfailed to disable quick-edit-mode:\n" + min_ex() + "\n")

    if not VT100:
        al.wintitle = ""

    nstrs = []
    anymod = False
    for ostr in al.v or []:
@@ -1,8 +1,8 @@
# coding: utf-8

VERSION = (1, 1, 3)
VERSION = (1, 1, 8)
CODENAME = "opus"
BUILD_DT = (2021, 11, 20)
BUILD_DT = (2021, 12, 10)

S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -926,6 +926,14 @@ class AuthSrv(object):
                vol.flags["d2t"] = True
                vol.flags = {k: v for k, v in vol.flags.items() if not k.startswith(rm)}

            # d2ds drops all onboot scans for a volume
            for grp, rm in [["d2ds", "e2ds"], ["d2ts", "e2ts"]]:
                if not vol.flags.get(grp, False):
                    continue

                vol.flags["d2ts"] = True
                vol.flags = {k: v for k, v in vol.flags.items() if not k.startswith(rm)}

            # mt* needs e2t so drop those too
            for grp, rm in [["e2t", "mt"]]:
                if vol.flags.get(grp, False):
@@ -2,7 +2,7 @@
from __future__ import print_function, unicode_literals

import os
from ..util import fsenc, fsdec
from ..util import fsenc, fsdec, SYMTIME
from . import path


@@ -55,5 +55,8 @@ def unlink(p):
    return os.unlink(fsenc(p))


def utime(p, times=None):
    return os.utime(fsenc(p), times)
def utime(p, times=None, follow_symlinks=True):
    if SYMTIME:
        return os.utime(fsenc(p), times, follow_symlinks=follow_symlinks)
    else:
        return os.utime(fsenc(p), times)
@@ -2,7 +2,7 @@
from __future__ import print_function, unicode_literals

import os
from ..util import fsenc, fsdec
from ..util import fsenc, fsdec, SYMTIME


def abspath(p):
@@ -13,8 +13,11 @@ def exists(p):
    return os.path.exists(fsenc(p))


def getmtime(p):
    return os.path.getmtime(fsenc(p))
def getmtime(p, follow_symlinks=True):
    if not follow_symlinks and SYMTIME:
        return os.lstat(fsenc(p)).st_mtime
    else:
        return os.path.getmtime(fsenc(p))


def getsize(p):
@@ -60,6 +60,7 @@ class HttpCli(object):
        self.bufsz = 1024 * 32
        self.hint = None
        self.trailing_slash = True
        self.out_headerlist = []
        self.out_headers = {
            "Access-Control-Allow-Origin": "*",
            "Cache-Control": "no-store; max-age=0",
@@ -91,6 +92,7 @@ class HttpCli(object):
        tpl = self.conn.hsrv.j2[name]
        if ka:
            ka["ts"] = self.conn.hsrv.cachebuster()
            ka["svcname"] = self.args.doctitle
            return tpl.render(**ka)

        return tpl
@@ -226,7 +228,7 @@ class HttpCli(object):
            self.gvol = self.asrv.vfs.aget[self.uname]

        if pwd and "pw" in self.ouparam and pwd != cookies.get("cppwd"):
            self.out_headers["Set-Cookie"] = self.get_pwd_cookie(pwd)[0]
            self.out_headerlist.append(("Set-Cookie", self.get_pwd_cookie(pwd)[0]))

        self.ua = self.headers.get("user-agent", "")
        self.is_rclone = self.ua.startswith("rclone/")
@@ -285,7 +287,7 @@ class HttpCli(object):
            self.out_headers["Cache-Control"] = "max-age=" + n

    def k304(self):
        k304 = self.cookies.get("k304", "")
        k304 = self.cookies.get("k304")
        return k304 == "y" or ("; Trident/" in self.ua and not k304)

    def send_headers(self, length, status=200, mime=None, headers=None):
@@ -310,7 +312,7 @@ class HttpCli(object):

        self.out_headers["Content-Type"] = mime

        for k, v in self.out_headers.items():
        for k, v in list(self.out_headers.items()) + self.out_headerlist:
            response.append("{}: {}".format(k, v))

        try:
@@ -439,6 +441,12 @@ class HttpCli(object):
        if "k304" in self.uparam:
            return self.set_k304()

        if "am_js" in self.uparam:
            return self.set_am_js()

        if "reset" in self.uparam:
            return self.set_cfg_reset()

        if "h" in self.uparam:
            return self.tx_mounts()

@@ -601,8 +609,8 @@ class HttpCli(object):
            alg = alg or "gz"  # def.pk
            try:
                # config-forced opts
                alg, lv = pk.split(",")
                lv[alg] = int(lv)
                alg, nlv = pk.split(",")
                lv[alg] = int(nlv)
            except:
                pass

@@ -1359,6 +1367,9 @@ class HttpCli(object):
            try:
                fs_path = req_path + ext
                st = bos.stat(fs_path)
                if stat.S_ISDIR(st.st_mode):
                    continue

                file_ts = max(file_ts, st.st_mtime)
                editions[ext or "plain"] = [fs_path, st.st_size]
            except:
@@ -1717,7 +1728,19 @@ class HttpCli(object):

    def set_k304(self):
        ck = gencookie("k304", self.uparam["k304"], 60 * 60 * 24 * 365)
        self.out_headers["Set-Cookie"] = ck
        self.out_headerlist.append(("Set-Cookie", ck))
        self.redirect("", "?h#cc")

    def set_am_js(self):
        v = "n" if self.uparam["am_js"] == "n" else "y"
        ck = gencookie("js", v, 60 * 60 * 24 * 365)
        self.out_headerlist.append(("Set-Cookie", ck))
        self.reply(b"promoted\n")

    def set_cfg_reset(self):
        for k in ("k304", "js", "cppwd"):
            self.out_headerlist.append(("Set-Cookie", gencookie(k, "x", None)))

        self.redirect("", "?h#cc")

    def tx_404(self, is_403=False):
@@ -1939,6 +1962,13 @@ class HttpCli(object):
            fmt = "{{}} {{:{},}} {{}}"
            nfmt = "{:,}"

            for x in dirs:
                n = x["name"] + "/"
                if arg == "v":
                    n = "\033[94m" + n

                x["name"] = n

            fmt = fmt.format(len(nfmt.format(biggest)))
            ret = [
                "# {}: {}".format(x, ls[x])
@@ -2077,6 +2107,7 @@ class HttpCli(object):

        url_suf = self.urlq({}, [])
        is_ls = "ls" in self.uparam
        is_js = self.cookies.get("js") == "y"

        tpl = "browser"
        if "b" in self.uparam:
@@ -2105,6 +2136,7 @@ class HttpCli(object):
            "taglist": [],
            "srvinf": srv_info,
            "acct": self.uname,
            "idx": ("e2d" in vn.flags),
            "perms": perms,
            "logues": logues,
            "readme": readme,
@@ -2113,6 +2145,7 @@ class HttpCli(object):
            "vdir": quotep(self.vpath),
            "vpnodes": vpnodes,
            "files": [],
            "ls0": None,
            "acct": self.uname,
            "perms": json.dumps(perms),
            "taglist": [],
@@ -2193,7 +2226,7 @@ class HttpCli(object):
        for fn in vfs_ls:
            base = ""
            href = fn
            if not is_ls and not self.trailing_slash and vpath:
            if not is_ls and not is_js and not self.trailing_slash and vpath:
                base = "/" + vpath + "/"
                href = base + fn

@@ -2336,7 +2369,12 @@ class HttpCli(object):

        dirs.sort(key=itemgetter("name"))

        j2a["files"] = dirs + files
        if is_js:
            j2a["ls0"] = {"dirs": dirs, "files": files, "taglist": taglist}
            j2a["files"] = []
        else:
            j2a["files"] = dirs + files

        j2a["logues"] = logues
        j2a["taglist"] = taglist
        j2a["txt_ext"] = self.args.textfiles.replace(",", " ")
@@ -302,6 +302,10 @@ class SvcHub(object):
            print("nailed it", end="")
            ret = self.retcode
        finally:
            if self.args.wintitle:
                print("\033]0;\033\\", file=sys.stderr, end="")
                sys.stderr.flush()

            print("\033[0m")
            if self.logf:
                self.logf.close()
@@ -2,9 +2,10 @@
from __future__ import print_function, unicode_literals

import re
import sys
import socket

from .__init__ import MACOS, ANYWIN
from .__init__ import MACOS, ANYWIN, unicode
from .util import chkcmd


@@ -54,6 +55,8 @@ class TcpSrv(object):
                eps[x] = "external"

        msgs = []
        title_tab = {}
        title_vars = [x[1:] for x in self.args.wintitle.split(" ") if x.startswith("$")]
        m = "available @ http://{}:{}/ (\033[33m{}\033[0m)"
        for ip, desc in sorted(eps.items(), key=lambda x: x[1]):
            for port in sorted(self.args.p):
@@ -62,11 +65,39 @@ class TcpSrv(object):

                msgs.append(m.format(ip, port, desc))

                if not self.args.wintitle:
                    continue

                if port in [80, 443]:
                    ep = ip
                else:
                    ep = "{}:{}".format(ip, port)

                hits = []
                if "pub" in title_vars and "external" in unicode(desc):
                    hits.append(("pub", ep))

                if "pub" in title_vars or "all" in title_vars:
                    hits.append(("all", ep))

                for var in title_vars:
                    if var.startswith("ip-") and ep.startswith(var[3:]):
                        hits.append((var, ep))

                for tk, tv in hits:
                    try:
                        title_tab[tk][tv] = 1
                    except:
                        title_tab[tk] = {tv: 1}

        if msgs:
            msgs[-1] += "\n"
            for m in msgs:
                self.log("tcpsrv", m)

        if self.args.wintitle:
            self._set_wintitle(title_tab)

    def _listen(self, ip, port):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
@@ -232,3 +263,26 @@ class TcpSrv(object):
            eps[default_route] = desc

        return eps

    def _set_wintitle(self, vars):
        vars["all"] = vars.get("all", {"Local-Only": 1})
        vars["pub"] = vars.get("pub", vars["all"])

        vars2 = {}
        for k, eps in vars.items():
            vars2[k] = {
                ep: 1
                for ep in eps.keys()
                if ":" not in ep or ep.split(":")[0] not in eps
            }

        title = ""
        vars = vars2
        for p in self.args.wintitle.split(" "):
            if p.startswith("$"):
                p = " and ".join(sorted(vars.get(p[1:], {"(None)": 1}).keys()))

            title += "{} ".format(p)

        print("\033]0;{}\033\\".format(title), file=sys.stderr, end="")
        sys.stderr.flush()
@@ -117,7 +117,16 @@ class U2idx(object):
            if ok:
                continue

            v, uq = (uq + " ").split(" ", 1)
            if uq.startswith('"'):
                v, uq = uq[1:].split('"', 1)
                while v.endswith("\\"):
                    v2, uq = uq.split('"', 1)
                    v = v[:-1] + '"' + v2
                uq = uq.strip()
            else:
                v, uq = (uq + " ").split(" ", 1)
                v = v.replace('\\"', '"')

            if is_key:
                is_key = False
@@ -21,6 +21,7 @@ from .util import (
    Pebkac,
    Queue,
    ProgressPrinter,
    SYMTIME,
    fsdec,
    fsenc,
    absreal,
@@ -1307,11 +1308,14 @@ class Up2k(object):
                err = "partial upload exists at a different location; please resume uploading here instead:\n"
                err += "/" + quotep(vsrc) + " "

                dupe = [cj["prel"], cj["name"]]
                try:
                    self.dupesched[src].append(dupe)
                except:
                    self.dupesched[src] = [dupe]
                # registry is size-constrained + can only contain one unique wark;
                # let want_recheck trigger symlink (if still in reg) or reupload
                if cur:
                    dupe = [cj["prel"], cj["name"], cj["lmod"]]
                    try:
                        self.dupesched[src].append(dupe)
                    except:
                        self.dupesched[src] = [dupe]

                raise Pebkac(400, err)

@@ -1332,7 +1336,7 @@ class Up2k(object):
            dst = os.path.join(job["ptop"], job["prel"], job["name"])
            if not self.args.nw:
                bos.unlink(dst)  # TODO ed pls
                self._symlink(src, dst)
                self._symlink(src, dst, lmod=cj["lmod"])

            if cur:
                a = [cj[x] for x in "prel name lmod size addr".split()]
@@ -1404,13 +1408,14 @@ class Up2k(object):
        with ren_open(fname, "wb", fdir=fdir, suffix=suffix) as f:
            return f["orz"][1]

    def _symlink(self, src, dst, verbose=True):
    def _symlink(self, src, dst, verbose=True, lmod=None):
        if verbose:
            self.log("linking dupe:\n  {0}\n  {1}".format(src, dst))

        if self.args.nw:
            return

        linked = False
        try:
            if self.args.no_symlink:
                raise Exception("disabled in config")
@@ -1441,10 +1446,18 @@ class Up2k(object):
                hops = len(ndst[nc:]) - 1
                lsrc = "../" * hops + "/".join(lsrc)
            os.symlink(fsenc(lsrc), fsenc(ldst))
            linked = True
        except Exception as ex:
            self.log("cannot symlink; creating copy: " + repr(ex))
            shutil.copy2(fsenc(src), fsenc(dst))

        if lmod and (not linked or SYMTIME):
            times = (int(time.time()), int(lmod))
            if ANYWIN:
                self.lastmod_q.put([dst, 0, times])
            else:
                bos.utime(dst, times, False)

    def handle_chunk(self, ptop, wark, chash):
        with self.mutex:
            job = self.registry[ptop].get(wark)
@@ -1551,12 +1564,12 @@ class Up2k(object):
            return

        cur = self.cur.get(ptop)
        for rd, fn in dupes:
        for rd, fn, lmod in dupes:
            d2 = os.path.join(ptop, rd, fn)
            if os.path.exists(d2):
                continue

            self._symlink(dst, d2)
            self._symlink(dst, d2, lmod=lmod)
            if cur:
                self.db_rm(cur, rd, fn)
                self.db_add(cur, wark, rd, fn, *a[-4:])
@@ -1773,8 +1786,9 @@ class Up2k(object):
            dlabs = absreal(sabs)
            m = "moving symlink from [{}] to [{}], target [{}]"
            self.log(m.format(sabs, dabs, dlabs))
            os.unlink(sabs)
            self._symlink(dlabs, dabs, False)
            mt = bos.path.getmtime(sabs, False)
            bos.unlink(sabs)
            self._symlink(dlabs, dabs, False, lmod=mt)

            # folders are too scary, schedule rescan of both vols
            self.need_rescan[svn.vpath] = 1
@@ -1904,25 +1918,30 @@ class Up2k(object):
            slabs = list(sorted(links.keys()))[0]
            ptop, rem = links.pop(slabs)
            self.log("linkswap [{}] and [{}]".format(sabs, slabs))
            mt = bos.path.getmtime(slabs, False)
            bos.unlink(slabs)
            bos.rename(sabs, slabs)
            bos.utime(slabs, (int(time.time()), int(mt)), False)
            self._symlink(slabs, sabs, False)
            full[slabs] = [ptop, rem]
            sabs = slabs

        if not dabs:
            dabs = list(sorted(full.keys()))[0]

        for alink in links.keys():
            lmod = None
            try:
                if alink != sabs and absreal(alink) != sabs:
                    continue

                self.log("relinking [{}] to [{}]".format(alink, dabs))
                lmod = bos.path.getmtime(alink, False)
                bos.unlink(alink)
            except:
                pass

            self._symlink(dabs, alink, False)
            self._symlink(dabs, alink, False, lmod=lmod)

        return len(full) + len(links)

@@ -2028,7 +2047,7 @@ class Up2k(object):
        for path, sz, times in ready:
            self.log("lmod: setting times {} on {}".format(times, path))
            try:
                bos.utime(path, times)
                bos.utime(path, times, False)
            except:
                self.log("lmod: failed to utime ({}, {})".format(path, times))
@@ -67,8 +67,9 @@ if WINDOWS and PY2:
    FS_ENCODING = "utf-8"


HTTP_TS_FMT = "%a, %d %b %Y %H:%M:%S GMT"
SYMTIME = sys.version_info >= (3, 6) and os.supports_follow_symlinks

HTTP_TS_FMT = "%a, %d %b %Y %H:%M:%S GMT"

HTTPCODE = {
    200: "OK",
@@ -79,6 +79,27 @@ a, #files tbody div a:last-child {
  color: #999;
  font-weight: normal;
}
.s0:after,
.s1:after {
  content: '⌄';
  margin-left: -.1em;
}
.s0r:after,
.s1r:after {
  content: '⌃';
  margin-left: -.1em;
}
.s0:after,
.s0r:after {
  color: #fb0;
}
.s1:after,
.s1r:after {
  color: #d09;
}
#files thead th:after {
  margin-right: -.7em;
}
#files tbody tr:hover td {
  background: #1c1c1c;
}
@@ -240,6 +261,8 @@ html.light #ggrid>a[tt].sel {
#files tbody tr.sel:hover td,
#files tbody tr.sel:focus td,
#ggrid>a.sel:hover,
#ggrid>a.sel:focus,
html.light #ggrid>a.sel:focus,
html.light #ggrid>a.sel:hover {
  color: #fff;
  background: #d39;
@@ -295,6 +318,8 @@ html.light #ggrid>a.sel {
  width: 100%;
  z-index: 3;
  touch-action: none;
}
#widget.anim {
  transition: bottom 0.15s;
}
#widget.open {
@@ -462,7 +487,7 @@ html.light #wfm a:not(.en) {
  width: calc(100% - 10.5em);
  background: rgba(0,0,0,0.2);
}
@media (min-width: 80em) {
@media (min-width: 70em) {
  #barpos,
  #barbuf {
    width: calc(100% - 21em);
@@ -654,7 +679,7 @@ input.eq_gain {
#wrap {
  margin: 1.8em 1.5em 0 1.5em;
  min-height: 70vh;
  padding-bottom: 5em;
  padding-bottom: 7em;
}
#tree {
  display: none;
@@ -948,6 +973,12 @@ html.light .ghead {
#ggrid>a.dir:before {
  content: '📂';
}
#ggrid>a.au:before {
  content: '💾';
}
html.np_open #ggrid>a.au:before {
  content: '▶';
}
#ggrid>a:before {
  display: block;
  position: absolute;
@@ -957,6 +988,12 @@ html.light .ghead {
  background: linear-gradient(135deg,rgba(255,255,255,0) 50%,rgba(255,255,255,0.2));
  border-radius: .3em;
  font-size: 2em;
  transition: font-size .15s, margin .15s;
}
#ggrid>a:focus:before,
#ggrid>a:hover:before {
  font-size: 2.5em;
  margin: -.2em;
}
#op_unpost {
  padding: 1em;
@@ -1026,7 +1063,6 @@ html.light #rui {
  font-size: 1.5em;
}
#doc {
  background: none;
  overflow: visible;
  margin: -1em 0 .5em 0;
  padding: 1em 0 1em 0;
@@ -1065,7 +1101,7 @@ html.light #doc .line-highlight {
#docul li {
  margin: 0;
}
#tree #docul a {
#tree #docul li+li a {
  display: block;
}
#seldoc.sel {
@@ -1114,6 +1150,7 @@ a.btn,


html,
#doc,
#rui,
#files td,
#files thead th,
@@ -1161,6 +1198,7 @@ html,
#ggrid>a[tt] {
  background: linear-gradient(135deg, #2c2c2c 95%, #444 95%);
}
#ggrid>a:focus,
#ggrid>a:hover {
  background: #383838;
  border-color: #555;
@@ -1174,6 +1212,7 @@ html.light #ggrid>a {
html.light #ggrid>a[tt] {
  background: linear-gradient(135deg, #f7f7f7 95%, #ccc 95%);
}
html.light #ggrid>a:focus,
html.light #ggrid>a:hover {
  background: #fff;
  border-color: #ccc;
@@ -1193,6 +1232,7 @@ html.light {
html.light #ops,
html.light .opbox,
html.light #path,
html.light #doc,
html.light #srch_form,
html.light .ghead,
html.light #u2etas {
@@ -1270,6 +1310,14 @@ html.light #ops a,
html.light #files tbody div a:last-child {
  color: #06a;
}
html.light .s0:after,
html.light .s0r:after {
  color: #059;
}
html.light .s1:after,
html.light .s1r:after {
  color: #f5d;
}
html.light #files thead th {
  background: #eaeaea;
  border-color: #ccc;
@@ -1376,6 +1424,7 @@ html.light .opview input[type="text"] {
  border-color: #38d;
}
html.light #u2tab a>span,
html.light #docul .bn a>span,
html.light #files td div span {
  color: #000;
}
@@ -1833,12 +1882,13 @@ html.light #u2err.err {
#u2tabw {
  min-height: 0;
  transition: min-height .2s;
  margin: 3em auto;
  margin: 3em 0;
}
#u2tab {
  border-collapse: collapse;
  width: calc(100% - 2em);
  max-width: 100em;
  margin: 0 auto;
}
#op_up2k.srch #u2tabf {
  max-width: none;
@@ -2100,6 +2150,7 @@ html.light #u2foot .warn span {
  border-color: #d06;
}
#u2tab a>span,
#docul .bn a>span,
#unpost a>span {
  font-weight: bold;
  font-style: italic;
@@ -143,7 +143,8 @@
    have_zip = {{ have_zip|tojson }},
    txt_ext = "{{ txt_ext }}",
    {% if no_prism %}no_prism = 1,{% endif %}
    readme = {{ readme|tojson }};
    readme = {{ readme|tojson }},
    ls0 = {{ ls0|tojson }};

document.documentElement.setAttribute("class", localStorage.lightmode == 1 ? "light" : "dark");
</script>
@@ -8,13 +8,9 @@ function dbg(msg) {
// toolbar
ebi('ops').innerHTML = (
    '<a href="#" data-dest="" tt="close submenu">--</a>\n' +
    (have_up2k_idx ? (
        '<a href="#" data-perm="read" data-dest="search" tt="search for files by attributes, path/name, music tags, or any combination of those$N$N<code>foo bar</code> = must contain both foo and bar,$N<code>foo -bar</code> = must contain foo but not bar,$N<code>^yana .opus$</code> = must start with yana and have the opus extension">🔎</a>\n' +
        (have_del && have_unpost ? '<a href="#" data-dest="unpost" tt="unpost: delete your recent uploads">🧯</a>\n' : '') +
        '<a href="#" data-dest="up2k" tt="up2k: upload files (if you have write-access) or toggle into the search-mode to see if they exist somewhere on the server">🚀</a>\n'
    ) : (
        '<a href="#" data-perm="write" data-dest="up2k" tt="up2k: upload files with resume support (close your browser and drop the same files in later)">🚀</a>\n'
    )) +
    '<a href="#" data-perm="read" data-dep="idx" data-dest="search" tt="search for files by attributes, path/name, music tags, or any combination of those$N$N<code>foo bar</code> = must contain both foo and bar,$N<code>foo -bar</code> = must contain foo but not bar,$N<code>^yana .opus$</code> = start with yana and be an opus file$N<code>"try unite"</code> = contain exactly «try unite»">🔎</a>\n' +
    (have_del && have_unpost ? '<a href="#" data-dest="unpost" data-dep="idx" tt="unpost: delete your recent uploads">🧯</a>\n' : '') +
    '<a href="#" data-dest="up2k">🚀</a>\n' +
    '<a href="#" data-perm="write" data-dest="bup" tt="bup: basic uploader, even supports netscape 4.0">🎈</a>\n' +
    '<a href="#" data-perm="write" data-dest="mkdir" tt="mkdir: create a new directory">📂</a>\n' +
    '<a href="#" data-perm="read write" data-dest="new_md" tt="new-md: create a new markdown document">📝</a>\n' +
@@ -68,12 +64,10 @@ ebi('op_up2k').innerHTML = (
    ' <input type="checkbox" id="ask_up" />\n' +
    ' <label for="ask_up" tt="ask for confirmation before upload starts">💭</label>\n' +
    ' </td>\n' +
    (have_up2k_idx ? (
    ' <td class="c" data-perm="read" rowspan="2">\n' +
    ' <input type="checkbox" id="fsearch" />\n' +
    ' <label for="fsearch" tt="don\'t actually upload, instead check if the files already $N exist on the server (will scan all folders you can read)">🔎</label>\n' +
    ' </td>\n'
    ) : '') +
    ' <td class="c" data-perm="read" data-dep="idx" rowspan="2">\n' +
    ' <input type="checkbox" id="fsearch" />\n' +
    ' <label for="fsearch" tt="don\'t actually upload, instead check if the files already $N exist on the server (will scan all folders you can read)">🔎</label>\n' +
    ' </td>\n' +
    ' <td data-perm="read" rowspan="2" id="u2btn_cw"></td>\n' +
    ' <td data-perm="read" rowspan="2" id="u2c3w"></td>\n' +
    ' </tr>\n' +
@@ -157,6 +151,7 @@ ebi('op_cfg').innerHTML = (
    ' <a id="thumbs" class="tgl btn" href="#" tt="in icon view, toggle icons or thumbnails$NHotkey: T">🖼️ thumbs</a>\n' +
    ' <a id="dotfiles" class="tgl btn" href="#" tt="show hidden files (if server permits)">dotfiles</a>\n' +
    ' <a id="ireadme" class="tgl btn" href="#" tt="show README.md in folder listings">📜 readme</a>\n' +
    ' <a id="spafiles" class="tgl btn" href="#" tt="speedboost when not using the navpane;$Nturn it off if things arent loading somehow">spa</a>\n' +
    ' </div>\n' +
    '</div>\n' +
    (have_zip ? (
@@ -306,7 +301,9 @@ function set_files_html(html) {


var ACtx = window.AudioContext || window.webkitAudioContext,
    actx = ACtx && new ACtx();
    actx = ACtx && new ACtx(),
    hash0 = location.hash,
    mp;


var mpl = (function () {
@@ -539,6 +536,7 @@ function MPlayer() {
            r.tracks[tid] = url;
            tds[0].innerHTML = '<a id="a' + tid + '" href="#a' + tid + '" class="play">play</a></td>';
            ebi('a' + tid).onclick = ev_play;
            clmod(trs[a], 'au', 1);
        }
    }
@@ -651,14 +649,11 @@ function MPlayer() {
    };
}

addcrc();
var mp = new MPlayer();
makeSortable(ebi('files'), mp.read_order.bind(mp));


function ft2dict(tr) {
    var th = ebi('files').tHead.rows[0].cells,
        rv = [],
        rh = [],
        ra = [],
        rt = {};

@@ -670,10 +665,11 @@ function ft2dict(tr) {
        if (!tv)
            continue;

        (vis ? rv : ra).push(tk);
        (vis ? rv : rh).push(tk);
        ra.push(tk);
        rt[tk] = tv;
    }
    return [rt, rv, ra];
    return [rt, rv, rh, ra];
}


@@ -693,24 +689,19 @@ var widget = (function () {
        touchmode = false,
        was_paused = true;

    r.is_open = false;

    r.open = function () {
        if (r.is_open)
            return false;

        clmod(document.documentElement, 'np_open', 1);
        widget.className = 'open';
        r.is_open = true;
        return true;
        return r.set(true);
    };
    r.close = function () {
        if (!r.is_open)
        return r.set(false);
    };
    r.set = function (is_open) {
        if (r.is_open == is_open)
            return false;

        clmod(document.documentElement, 'np_open');
        widget.className = '';
        r.is_open = false;
        clmod(document.documentElement, 'np_open', is_open);
        widget.className = is_open ? 'open' : '';
        bcfg_set('au_open', r.is_open = is_open);
        return true;
    };
    r.toggle = function (e) {
@@ -757,6 +748,10 @@ var widget = (function () {
            document.body.removeChild(o);
        }, 500);
    };
    r.set(sread('au_open') == 1);
    setTimeout(function () {
        clmod(ebi('widget'), 'anim', 1);
    }, 10);
    return r;
})();

@@ -808,7 +803,7 @@ var pbar = (function () {

        bctx.clearRect(0, 0, bc.w, bc.h);

        if (!mp.au)
        if (!mp || !mp.au)
            return;

        var sm = bc.w * 1.0 / mp.au.duration,
@@ -835,7 +830,7 @@ var pbar = (function () {

        pctx.clearRect(0, 0, pc.w, pc.h);

        if (!mp.au || isNaN(adur = mp.au.duration) || isNaN(apos = mp.au.currentTime) || apos < 0 || adur < apos)
        if (!mp || !mp.au || isNaN(adur = mp.au.duration) || isNaN(apos = mp.au.currentTime) || apos < 0 || adur < apos)
            return; // not-init || unsupp-codec

        var sm = bc.w * 1.0 / adur;
@@ -906,6 +901,9 @@ var vbar = (function () {
    }

    r.draw = function () {
        if (!mp)
            return;

        var gh = h + '' + light;
        if (gradh != gh) {
            gradh = gh;
@@ -1132,7 +1130,6 @@ var mpui = (function () {
        if (mp.au.paused)
            timer.rm(updater_impl);
    }
    r.progress_updater();
    return r;
})();

@@ -1343,7 +1340,7 @@ var audio_eq = (function () {
    }

    var html = ['<table><tr><td rowspan="4">',
        '<a id="au_eq" class="tgl btn" href="#" tt="enables the equalizer and gain control;$Nboost 0 = unmodified 100% volume">enable</a></td>'],
        '<a id="au_eq" class="tgl btn" href="#" tt="enables the equalizer and gain control;$Nboost 0 = unmodified 100% volume$N$Nenabling the equalizer makes gapless albums fully gapless, so leave it on with all the values at zero if you care about that">enable</a></td>'],
        h2 = [], h3 = [], h4 = [];

    var vs = [];
@@ -1484,7 +1481,7 @@ function play(tid, is_ev, seek, call_depth) {
        seek_au_sec(seek);
    }

    if (!seek) {
    if (!seek && !ebi('unsearch')) {
        var o = ebi(oid);
        o.setAttribute('id', 'thx_js');
        sethash(oid);
@@ -1492,7 +1489,8 @@ function play(tid, is_ev, seek, call_depth) {
    }

    mpui.progress_updater();
    pbar.drawbuf();
    pbar.onresize();
    vbar.onresize();
    mpl.announce();
    return true;
}
@@ -1567,7 +1565,8 @@ function autoplay_blocked(seek) {


function eval_hash() {
    var v = location.hash;
    var v = hash0;
    hash0 = null;
    if (!v)
        return;

@@ -1929,10 +1928,6 @@ var fileman = (function () {
    (function (a) {
        f[a].inew.onkeydown = function (e) {
            rn_ok(a, true);

            if (e.key == 'Escape')
                return rn_cancel();

            if (e.key == 'Enter')
                return rn_apply();
        };
@@ -2482,7 +2477,7 @@ var showfile = (function () {
    }

    r.mktree = function () {
        var html = ['<li class="bn">list of textfiles in<br />' + esc(get_vpath()) + '</li>'];
        var html = ['<li class="bn">list of textfiles in<br />' + linksplit(get_vpath()).join('') + '</li>'];
        for (var a = 0; a < r.files.length; a++) {
            var file = r.files[a];
            html.push('<li><a href="#" hl="' + file.id +
@@ -2559,9 +2554,9 @@ var thegrid = (function () {
            '<a href="#" class="btn" z="1.2" tt="Hotkey: shift-D">+</a></span> <span>chop: ' +
            '<a href="#" class="btn" l="-1" tt="truncate filenames more (show less)">–</a> ' +
            '<a href="#" class="btn" l="1" tt="truncate filenames less (show more)">+</a></span> <span>sort by: ' +
            '<a href="#" s="href">name</a>, ' +
            '<a href="#" s="sz">size</a>, ' +
            '<a href="#" s="ts">date</a>, ' +
            '<a href="#" s="href">name</a> ' +
            '<a href="#" s="sz">size</a> ' +
|
||||
'<a href="#" s="ts">date</a> ' +
|
||||
'<a href="#" s="ext">type</a>' +
|
||||
'</span></div>' +
|
||||
'<div id="ggrid"></div>'
|
||||
@@ -2675,14 +2670,12 @@ var thegrid = (function () {
|
||||
href = noq_href(this),
|
||||
aplay = ebi('a' + oth.getAttribute('id')),
|
||||
is_img = /\.(gif|jpe?g|png|webp|webm|mp4)(\?|$)/i.test(href),
|
||||
in_tree = null,
|
||||
is_dir = href.endsWith('/'),
|
||||
in_tree = is_dir && treectl.find(oth.textContent.slice(0, -1)),
|
||||
have_sel = QS('#files tr.sel'),
|
||||
td = oth.closest('td').nextSibling,
|
||||
tr = td.parentNode;
|
||||
|
||||
if (href.endsWith('/'))
|
||||
in_tree = treectl.find(oth.textContent.slice(0, -1));
|
||||
|
||||
if (r.sel && !dbl) {
|
||||
td.click();
|
||||
clmod(this, 'sel', clgot(tr, 'sel'));
|
||||
@@ -2693,6 +2686,9 @@ var thegrid = (function () {
|
||||
else if (in_tree && !have_sel)
|
||||
in_tree.click();
|
||||
|
||||
else if (is_dir && !have_sel && treectl.spa)
|
||||
treectl.reqls(href, true, true);
|
||||
|
||||
else if (!is_img && have_sel)
|
||||
window.open(href, '_blank');
|
||||
|
||||
@@ -2867,10 +2863,6 @@ var thegrid = (function () {
|
||||
import_js('/.cpr/baguettebox.js', r.bagit);
|
||||
}, 1);
|
||||
|
||||
if (r.en) {
|
||||
loadgrid();
|
||||
}
|
||||
|
||||
return r;
|
||||
})();
|
||||
|
||||
@@ -2968,12 +2960,12 @@ document.onkeydown = function (e) {
|
||||
if (k == 'Escape') {
|
||||
ae && ae.blur();
|
||||
|
||||
if (ebi('rn_cancel'))
|
||||
return ebi('rn_cancel').click();
|
||||
|
||||
if (QS('.opview.act'))
|
||||
return QS('#ops>a').click();
|
||||
|
||||
if (QS('#unsearch'))
|
||||
return QS('#unsearch').click();
|
||||
|
||||
if (widget.is_open)
|
||||
return widget.close();
|
||||
|
||||
@@ -2983,6 +2975,9 @@ document.onkeydown = function (e) {
|
||||
if (!treectl.hidden)
|
||||
return treectl.detree();
|
||||
|
||||
if (QS('#unsearch'))
|
||||
return QS('#unsearch').click();
|
||||
|
||||
if (thegrid.en)
|
||||
return ebi('griden').click();
|
||||
}
|
||||
@@ -3234,7 +3229,34 @@ document.onkeydown = function (e) {
|
||||
for (var b = 1; b < sconf[a].length; b++) {
|
||||
var k = sconf[a][b][0],
|
||||
chk = 'srch_' + k + 'c',
|
||||
tvs = ebi('srch_' + k + 'v').value.split(/ +/g);
|
||||
vs = ebi('srch_' + k + 'v').value,
|
||||
tvs = [];
|
||||
|
||||
if (k == 'name')
|
||||
console.log('a');
|
||||
while (vs) {
|
||||
vs = vs.trim();
|
||||
if (!vs)
|
||||
break;
|
||||
|
||||
var v = '';
|
||||
if (vs.startsWith('"')) {
|
||||
var vp = vs.slice(1).split(/"(.*)/);
|
||||
v = vp[0];
|
||||
vs = vp[1] || '';
|
||||
while (v.endsWith('\\')) {
|
||||
vp = vs.split(/"(.*)/);
|
||||
v = v.slice(0, -1) + '"' + vp[0];
|
||||
vs = vp[1] || '';
|
||||
}
|
||||
}
|
||||
else {
|
||||
var vp = vs.split(/ +(.*)/);
|
||||
v = vp[0].replace(/\\"/g, '"');
|
||||
vs = vp[1] || '';
|
||||
}
|
||||
tvs.push(v);
|
||||
}
|
||||
|
||||
if (!ebi(chk).checked)
|
||||
continue;
|
||||
@@ -3277,6 +3299,10 @@ document.onkeydown = function (e) {
|
||||
tv += '*';
|
||||
}
|
||||
|
||||
if (tv.indexOf(' ') + 1) {
|
||||
tv = '"' + tv + '"';
|
||||
}
|
||||
|
||||
q += k + not + 'like ' + tv;
|
||||
}
|
||||
}
|
||||
@@ -3326,7 +3352,7 @@ document.onkeydown = function (e) {
|
||||
|
||||
treectl.hide();
|
||||
|
||||
var html = mk_files_header(tagord);
|
||||
var html = mk_files_header(tagord), seen = {};
|
||||
html.push('<tbody>');
|
||||
html.push('<tr><td>-</td><td colspan="42"><a href="#" id="unsearch"><big style="font-weight:bold">[❌] close search results</big></a></td></tr>');
|
||||
for (var a = 0; a < res.hits.length; a++) {
|
||||
@@ -3334,14 +3360,19 @@ document.onkeydown = function (e) {
|
||||
ts = parseInt(r.ts),
|
||||
sz = esc(r.sz + ''),
|
||||
rp = esc(uricom_dec(r.rp + '')[0]),
|
||||
ext = rp.lastIndexOf('.') > 0 ? rp.split('.').pop() : '%',
|
||||
links = linksplit(r.rp + '');
|
||||
ext = rp.lastIndexOf('.') > 0 ? rp.split('.').pop().split('?')[0] : '%',
|
||||
id = 'f-' + ('00000000' + crc32(rp)).slice(-8);
|
||||
|
||||
while (seen[id])
|
||||
id += 'a';
|
||||
seen[id] = 1;
|
||||
|
||||
if (ext.length > 8)
|
||||
ext = '%';
|
||||
|
||||
links = links.join('');
|
||||
var nodes = ['<tr><td>-</td><td><div>' + links + '</div>', sz];
|
||||
var links = linksplit(r.rp + '', id).join(''),
|
||||
nodes = ['<tr><td>-</td><td><div>' + links + '</div>', sz];
|
||||
|
||||
for (var b = 0; b < tagord.length; b++) {
|
||||
var k = tagord[b],
|
||||
v = r.tags[k] || "";
|
||||
@@ -3403,6 +3434,7 @@ var treectl = (function () {
|
||||
mentered = null,
|
||||
treesz = clamp(icfg_get('treesz', 16), 10, 50);
|
||||
|
||||
bcfg_bind(r, 'spa', 'spafiles', true);
|
||||
bcfg_bind(r, 'ireadme', 'ireadme', true);
|
||||
bcfg_bind(r, 'dyn', 'dyntree', true, onresize);
|
||||
bcfg_bind(r, 'dots', 'dotfiles', false, function (v) {
|
||||
@@ -3410,7 +3442,7 @@ var treectl = (function () {
});
setwrap(bcfg_bind(r, 'wtree', 'wraptree', true, setwrap));
setwrap(bcfg_bind(r, 'parpane', 'parpane', true, onscroll));
bcfg_bind(r, 'htree', 'hovertree', true, reload_tree);
bcfg_bind(r, 'htree', 'hovertree', false, reload_tree);

function setwrap(v) {
clmod(ebi('tree'), 'nowrap', !v);
@@ -3583,6 +3615,7 @@ var treectl = (function () {
|
||||
if (!QS(q))
|
||||
break;
|
||||
}
|
||||
nq = Math.max(nq, get_evpath().split('/').length - 2);
|
||||
var iw = (treesz + Math.max(0, nq)),
|
||||
w = iw + 'em',
|
||||
w2 = (iw + 2) + 'em';
|
||||
@@ -3605,7 +3638,7 @@ var treectl = (function () {
|
||||
|
||||
r.goto = function (url, push) {
|
||||
get_tree("", url, true);
|
||||
reqls(url, push, true);
|
||||
r.reqls(url, push, true);
|
||||
};
|
||||
|
||||
function get_tree(top, dst, rst) {
|
||||
@@ -3774,12 +3807,12 @@ var treectl = (function () {
|
||||
treegrow.call(this.previousSibling, e);
|
||||
return;
|
||||
}
|
||||
reqls(this.getAttribute('href'), true);
|
||||
r.reqls(this.getAttribute('href'), true);
|
||||
r.dir_cb = tree_scrollto;
|
||||
thegrid.setvis(true);
|
||||
}
|
||||
|
||||
function reqls(url, hpush, no_tree) {
|
||||
r.reqls = function (url, hpush, no_tree) {
|
||||
var xhr = new XMLHttpRequest();
|
||||
xhr.top = url;
|
||||
xhr.hpush = hpush;
|
||||
@@ -3834,8 +3867,35 @@ var treectl = (function () {
|
||||
|
||||
ebi('srv_info').innerHTML = '<span>' + res.srvinf + '</span>';
|
||||
|
||||
var top = this.top,
|
||||
nodes = res.dirs.concat(res.files),
|
||||
if (this.hpush && !showfile.active())
|
||||
hist_push(this.top);
|
||||
|
||||
r.gentab(this.top, res);
|
||||
|
||||
acct = res.acct;
|
||||
have_up2k_idx = res.idx;
|
||||
apply_perms(res.perms);
|
||||
despin('#files');
|
||||
despin('#gfiles');
|
||||
|
||||
ebi('pro').innerHTML = res.logues ? res.logues[0] || "" : "";
|
||||
ebi('epi').innerHTML = res.logues ? res.logues[1] || "" : "";
|
||||
|
||||
clmod(ebi('epi'), 'mdo');
|
||||
if (res.readme)
|
||||
show_readme(res.readme);
|
||||
|
||||
wintitle();
|
||||
var fun = r.ls_cb;
|
||||
if (fun) {
|
||||
r.ls_cb = null;
|
||||
fun();
|
||||
}
|
||||
eval_hash();
|
||||
}
|
||||
|
||||
r.gentab = function (top, res) {
|
||||
var nodes = res.dirs.concat(res.files),
|
||||
html = mk_files_header(res.taglist),
|
||||
seen = {};
|
||||
|
||||
@@ -3851,7 +3911,7 @@ var treectl = (function () {
|
||||
id = 'f-' + ('00000000' + crc32(fname)).slice(-8),
|
||||
lang = showfile.getlang(fname);
|
||||
|
||||
while (seen[id])
|
||||
while (seen[id]) // ejyefs ev69gg y9j8sg .opus
|
||||
id += 'a';
|
||||
seen[id] = 1;
|
||||
|
||||
@@ -3883,37 +3943,33 @@ var treectl = (function () {
|
||||
html = html.join('\n');
|
||||
set_files_html(html);
|
||||
|
||||
if (this.hpush && !showfile.active())
|
||||
hist_push(this.top);
|
||||
|
||||
acct = res.acct;
|
||||
apply_perms(res.perms);
|
||||
despin('#files');
|
||||
despin('#gfiles');
|
||||
|
||||
ebi('pro').innerHTML = res.logues ? res.logues[0] || "" : "";
|
||||
ebi('epi').innerHTML = res.logues ? res.logues[1] || "" : "";
|
||||
|
||||
clmod(ebi('epi'), 'mdo');
|
||||
if (res.readme)
|
||||
show_readme(res.readme);
|
||||
|
||||
document.title = '⇆🎉 ' + uricom_dec(document.location.pathname.slice(1, -1))[0];
|
||||
|
||||
filecols.set_style();
|
||||
showfile.mktree();
|
||||
mukey.render();
|
||||
reload_tree();
|
||||
reload_browser();
|
||||
tree_scrollto();
|
||||
|
||||
var fun = r.ls_cb;
|
||||
if (fun) {
|
||||
r.ls_cb = null;
|
||||
fun();
|
||||
}
|
||||
}
|
||||
|
||||
r.hydrate = function () {
|
||||
if (ls0 === null) {
|
||||
var xhr = new XMLHttpRequest();
|
||||
xhr.open('GET', '/?am_js', true);
|
||||
xhr.send();
|
||||
|
||||
return r.reqls(get_evpath(), false, true);
|
||||
}
|
||||
|
||||
r.gentab(get_evpath(), ls0);
|
||||
reload_browser();
|
||||
pbar.onresize();
|
||||
vbar.onresize();
|
||||
mukey.render();
|
||||
showfile.addlinks();
|
||||
thegrid.setdirty();
|
||||
setTimeout(eval_hash, 1);
|
||||
};
|
||||
|
||||
function parsetree(res, top) {
|
||||
var ret = '';
|
||||
for (var a = 0; a < res.a.length; a++) {
|
||||
@@ -4008,6 +4064,17 @@ function despin(sel) {
|
||||
function apply_perms(newperms) {
|
||||
perms = newperms || [];
|
||||
|
||||
var a = QS('#ops a[data-dest="up2k"]');
|
||||
if (have_up2k_idx) {
|
||||
a.removeAttribute('data-perm');
|
||||
a.setAttribute('tt', 'up2k: upload files (if you have write-access) or toggle into the search-mode to see if they exist somewhere on the server');
|
||||
}
|
||||
else {
|
||||
a.setAttribute('data-perm', 'write');
|
||||
a.setAttribute('tt', 'up2k: upload files with resume support (close your browser and drop the same files in later)');
|
||||
}
|
||||
tt.att(QS('#ops'));
|
||||
|
||||
var axs = [],
|
||||
aclass = '>',
|
||||
chk = ['read', 'write', 'move', 'delete', 'get'];
|
||||
@@ -4037,6 +4104,12 @@ function apply_perms(newperms) {
|
||||
o[a].style.display = display;
|
||||
}
|
||||
|
||||
var o = QSA('#ops>a[data-dep], #u2conf td[data-dep]');
|
||||
for (var a = 0; a < o.length; a++)
|
||||
o[a].style.display = (
|
||||
o[a].getAttribute('data-dep') != 'idx' || have_up2k_idx
|
||||
) ? '' : 'none';
|
||||
|
||||
var act = QS('#ops>a.act');
|
||||
if (act && act.style.display === 'none')
|
||||
goto();
|
||||
@@ -4374,27 +4447,6 @@ var mukey = (function () {
|
||||
})();
|
||||
|
||||
|
||||
function addcrc() {
|
||||
var links = QSA(
|
||||
'#files>tbody>tr>td:first-child+td>' + (
|
||||
ebi('unsearch') ? 'div>a:last-child' : 'a'));
|
||||
|
||||
var seen = {}; // ejyefs ev69gg y9j8sg .opus
|
||||
for (var a = 0, aa = links.length; a < aa; a++) {
|
||||
var id = links[a].getAttribute('id');
|
||||
if (!id) {
|
||||
var crc = crc32(links[a].textContent || links[a].innerText);
|
||||
id = 'f-' + ('00000000' + crc).slice(-8);
|
||||
while (seen[id])
|
||||
id += 'a';
|
||||
|
||||
links[a].setAttribute('id', id);
|
||||
}
|
||||
seen[id] = 1;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
var light;
|
||||
(function () {
|
||||
function freshen() {
|
||||
@@ -4726,7 +4778,7 @@ function show_md(md, name, div, url, depth) {
|
||||
|
||||
try {
|
||||
clmod(div, 'mdo', 1);
|
||||
div.innerHTML = marked(md, {
|
||||
div.innerHTML = marked.parse(md, {
|
||||
headerPrefix: 'md-',
|
||||
breaks: true,
|
||||
gfm: true
|
||||
@@ -4948,15 +5000,35 @@ function goto_unpost(e) {
|
||||
}
|
||||
|
||||
|
||||
function wintitle(txt) {
|
||||
document.title = (txt ? txt : '') + get_vpath().slice(1, -1).split('/').pop();
|
||||
}
|
||||
|
||||
|
||||
ebi('path').onclick = function (e) {
|
||||
var a = e.target.closest('a[href]');
|
||||
if (!treectl.spa || !a || !(a = a.getAttribute('href') + '') || !a.endsWith('/'))
|
||||
return;
|
||||
|
||||
thegrid.setvis(true);
|
||||
treectl.reqls(a, true, true);
|
||||
return ev(e);
|
||||
};
|
||||
|
||||
|
||||
ebi('files').onclick = ebi('docul').onclick = function (e) {
|
||||
var tgt = e.target.closest('a[id]');
|
||||
if (tgt && tgt.getAttribute('id').indexOf('f-') === 0 && tgt.textContent.endsWith('/')) {
|
||||
var el = treectl.find(tgt.textContent.slice(0, -1));
|
||||
if (!el)
|
||||
return;
|
||||
|
||||
el.click();
|
||||
return ev(e);
|
||||
if (el) {
|
||||
el.click();
|
||||
return ev(e);
|
||||
}
|
||||
if (treectl.spa) {
|
||||
treectl.reqls(tgt.getAttribute('href'), true, true);
|
||||
return ev(e);
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
tgt = e.target.closest('a[hl]');
|
||||
@@ -4964,7 +5036,14 @@ ebi('files').onclick = ebi('docul').onclick = function (e) {
|
||||
showfile.show(noq_href(ebi(tgt.getAttribute('hl'))), tgt.getAttribute('lang'));
|
||||
return ev(e);
|
||||
}
|
||||
}
|
||||
|
||||
tgt = e.target.closest('a');
|
||||
if (tgt && tgt.closest('li.bn')) {
|
||||
thegrid.setvis(true);
|
||||
treectl.goto(tgt.getAttribute('href'), true);
|
||||
return ev(e);
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
function reload_mp() {
|
||||
@@ -4972,33 +5051,33 @@ function reload_mp() {
|
||||
audio_eq.stop();
|
||||
mp.au.pause();
|
||||
mp.au = null;
|
||||
mpl.unbuffer();
|
||||
}
|
||||
mpl.stop();
|
||||
widget.close();
|
||||
var plays = QSA('tr>td:first-child>a.play');
|
||||
for (var a = plays.length - 1; a >= 0; a--)
|
||||
plays[a].parentNode.innerHTML = '-';
|
||||
|
||||
mpl.unbuffer();
|
||||
mp = new MPlayer();
|
||||
audio_eq.acst = {};
|
||||
setTimeout(pbar.onresize, 1);
|
||||
}
|
||||
|
||||
|
||||
function reload_browser(not_mp) {
|
||||
function reload_browser() {
|
||||
filecols.set_style();
|
||||
|
||||
var parts = get_evpath().split('/'),
|
||||
rm = QSA('#path>a+a+a');
|
||||
rm = QSA('#path>a+a+a'),
|
||||
ftab = ebi('files'),
|
||||
link = '/', o;
|
||||
|
||||
for (a = rm.length - 1; a >= 0; a--)
|
||||
rm[a].parentNode.removeChild(rm[a]);
|
||||
|
||||
var link = '/';
|
||||
for (var a = 1; a < parts.length - 1; a++) {
|
||||
link += parts[a] + '/';
|
||||
var o = mknod('a');
|
||||
o = mknod('a');
|
||||
o.setAttribute('href', link);
|
||||
o.textContent = uricom_dec(parts[a])[0];
|
||||
ebi('path').appendChild(o);
|
||||
@@ -5012,11 +5091,9 @@ function reload_browser(not_mp) {
|
||||
oo[a].textContent = hsz;
|
||||
}
|
||||
|
||||
if (!not_mp) {
|
||||
addcrc();
|
||||
reload_mp();
|
||||
makeSortable(ebi('files'), mp.read_order.bind(mp));
|
||||
}
|
||||
reload_mp();
|
||||
try { showsort(ftab); } catch (ex) { }
|
||||
makeSortable(ftab, mp.read_order.bind(mp));
|
||||
|
||||
for (var a = 0; a < 2; a++)
|
||||
clmod(ebi(a ? 'pro' : 'epi'), 'hidden', ebi('unsearch'));
|
||||
@@ -5027,7 +5104,4 @@ function reload_browser(not_mp) {
|
||||
thegrid.setdirty();
|
||||
msel.render();
|
||||
}
|
||||
reload_browser(true);
|
||||
showfile.addlinks();
|
||||
mukey.render();
|
||||
setTimeout(eval_hash, 1);
|
||||
treectl.hydrate();
|
||||
|
||||
@@ -10,7 +10,7 @@
{%- endif %}
</head>
<body>
<div id="mn">navbar</div>
<div id="mn"></div>
<div id="mh">
<a id="lightswitch" href="#">go dark</a>
<a id="navtoggle" href="#">hide nav</a>

@@ -39,20 +39,14 @@ var md_plug = {};
|
||||
|
||||
// add navbar
|
||||
(function () {
|
||||
var n = document.location + '';
|
||||
n = n.substr(n.indexOf('//') + 2).split('?')[0].split('/');
|
||||
n[0] = 'top';
|
||||
var loc = [];
|
||||
var nav = [];
|
||||
for (var a = 0; a < n.length; a++) {
|
||||
if (a > 0)
|
||||
loc.push(n[a]);
|
||||
|
||||
var dec = esc(uricom_dec(n[a])[0]);
|
||||
|
||||
nav.push('<a href="/' + loc.join('/') + '">' + dec + '</a>');
|
||||
var parts = get_evpath().split('/'), link = '', o;
|
||||
for (var a = 0, aa = parts.length - 2; a <= aa; a++) {
|
||||
link += parts[a] + (a < aa ? '/' : '');
|
||||
o = mknod('a');
|
||||
o.setAttribute('href', link);
|
||||
o.textContent = uricom_dec(parts[a])[0] || 'top';
|
||||
dom_nav.appendChild(o);
|
||||
}
|
||||
dom_nav.innerHTML = nav.join('');
|
||||
})();
|
||||
|
||||
|
||||
@@ -256,7 +250,7 @@ function convert_markdown(md_text, dest_dom) {
|
||||
Object.assign(marked_opts, ext[0]);
|
||||
|
||||
try {
|
||||
var md_html = marked(md_text, marked_opts);
|
||||
var md_html = marked.parse(md_text, marked_opts);
|
||||
}
|
||||
catch (ex) {
|
||||
if (ext)
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>copyparty</title>
|
||||
<title>{{ svcname }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/msg.css?_={{ ts }}">
|
||||
|
||||
@@ -81,9 +81,10 @@ table {
|
||||
text-align: right;
|
||||
}
|
||||
blockquote {
|
||||
margin: 0 0 0 .6em;
|
||||
padding: .7em 1em;
|
||||
margin: 0 0 1.6em .6em;
|
||||
padding: .7em 1em 0 1em;
|
||||
border-left: .3em solid rgba(128,128,128,0.5);
|
||||
border-radius: 0 0 0 .25em;
|
||||
}
|
||||
|
||||
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>copyparty</title>
|
||||
<title>{{ svcname }}</title>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=edge">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=0.8">
|
||||
<link rel="stylesheet" media="screen" href="/.cpr/splash.css?_={{ ts }}">
|
||||
@@ -75,11 +75,13 @@
<h1 id="cc">client config:</h1>
<ul>
{% if k304 %}
<li><a href="/?k304=n" class="r">disable k304</a> (currently enabled)
<li><a href="/?k304=n">disable k304</a> (currently enabled)
{%- else %}
<li><a href="/?k304=y">enable k304</a> (currently disabled)
<li><a href="/?k304=y" class="r">enable k304</a> (currently disabled)
{% endif %}
<blockquote>enabling this will disconnect your client on every HTTP 304, which can prevent some buggy browsers/proxies from getting stuck (suddenly not being able to load pages), <em>but</em> it will also make things slower in general</blockquote></li>

<li><a href="/?reset" class="r" onclick="localStorage.clear();return true">reset client settings</a></li>
</ul>

<h1>login for more:</h1>

@@ -128,6 +128,7 @@ html {
|
||||
}
|
||||
#tth.act {
|
||||
display: block;
|
||||
z-index: 9001;
|
||||
}
|
||||
#tt.b {
|
||||
padding: 0 2em;
|
||||
@@ -353,6 +354,13 @@ html.light textarea:focus {
|
||||
}
|
||||
.mdo ul,
|
||||
.mdo ol {
|
||||
padding-left: 1em;
|
||||
}
|
||||
.mdo ul ul,
|
||||
.mdo ul ol,
|
||||
.mdo ol ul,
|
||||
.mdo ol ol {
|
||||
padding-left: 2em;
|
||||
border-left: .3em solid #ddd;
|
||||
}
|
||||
.mdo ul>li,
|
||||
|
||||
@@ -525,13 +525,15 @@ function Donut(uc, st) {
|
||||
}
|
||||
|
||||
r.on = function (ya) {
|
||||
r.fc = 99;
|
||||
r.fc = r.tc = 99;
|
||||
r.eta = null;
|
||||
r.base = pos();
|
||||
optab.innerHTML = ya ? svg() : optab.getAttribute('ico');
|
||||
el = QS('#ops a .donut');
|
||||
if (!ya)
|
||||
if (!ya) {
|
||||
favico.upd();
|
||||
wintitle();
|
||||
}
|
||||
};
|
||||
r.do = function () {
|
||||
if (!el)
|
||||
@@ -541,6 +543,11 @@ function Donut(uc, st) {
|
||||
v = pos() - r.base,
|
||||
ofs = el.style.strokeDashoffset = o - o * v / t;
|
||||
|
||||
if (++r.tc >= 10) {
|
||||
wintitle(f2f(v * 100 / t, 1) + '%, ' + r.eta + 's, ', true);
|
||||
r.tc = 0;
|
||||
}
|
||||
|
||||
if (favico.txt) {
|
||||
if (++r.fc < 10 && r.eta && r.eta > 99)
|
||||
return;
|
||||
@@ -728,7 +735,6 @@ function up2k_init(subtle) {
|
||||
if (++nenters <= 0)
|
||||
nenters = 1;
|
||||
|
||||
//console.log(nenters, Date.now(), 'enter', this, e.target);
|
||||
if (onover.bind(this)(e))
|
||||
return true;
|
||||
|
||||
@@ -750,12 +756,19 @@ function up2k_init(subtle) {
|
||||
ebi('up_dz').setAttribute('err', mup || '');
|
||||
ebi('srch_dz').setAttribute('err', msr || '');
|
||||
}
|
||||
function onoverb(e) {
|
||||
// zones are alive; disable cuo2duo branch
|
||||
document.body.ondragover = document.body.ondrop = null;
|
||||
return onover.bind(this)(e);
|
||||
}
|
||||
function onover(e) {
|
||||
try {
|
||||
var ok = false, dt = e.dataTransfer.types;
|
||||
for (var a = 0; a < dt.length; a++)
|
||||
if (dt[a] == 'Files')
|
||||
ok = true;
|
||||
else if (dt[a] == 'text/uri-list')
|
||||
return true;
|
||||
|
||||
if (!ok)
|
||||
return true;
|
||||
@@ -781,17 +794,20 @@ function up2k_init(subtle) {
|
||||
clmod(ebi('drops'), 'vis');
|
||||
clmod(ebi('up_dz'), 'hl');
|
||||
clmod(ebi('srch_dz'), 'hl');
|
||||
// cuo2duo:
|
||||
document.body.ondragover = onover;
|
||||
document.body.ondrop = gotfile;
|
||||
}
|
||||
|
||||
//console.log(nenters, Date.now(), 'leave', this, e && e.target);
|
||||
}
|
||||
document.body.ondragenter = ondrag;
|
||||
document.body.ondragleave = offdrag;
|
||||
document.body.ondragover = onover;
|
||||
document.body.ondrop = gotfile;
|
||||
|
||||
var drops = [ebi('up_dz'), ebi('srch_dz')];
|
||||
for (var a = 0; a < 2; a++) {
|
||||
drops[a].ondragenter = ondrag;
|
||||
drops[a].ondragover = onover;
|
||||
drops[a].ondragover = onoverb;
|
||||
drops[a].ondragleave = offdrag;
|
||||
drops[a].ondrop = gotfile;
|
||||
}
|
||||
@@ -801,7 +817,10 @@ function up2k_init(subtle) {
|
||||
ev(e);
|
||||
nenters = 0;
|
||||
offdrag.bind(this)();
|
||||
var dz = (this && this.getAttribute('id'));
|
||||
var dz = this && this.getAttribute('id');
|
||||
if (!dz && e && e.clientY)
|
||||
// cuo2duo fallback
|
||||
dz = e.clientY < window.innerHeight / 2 ? 'up_dz' : 'srch_dz';
|
||||
|
||||
var err = this.getAttribute('err');
|
||||
if (err)
|
||||
@@ -1466,7 +1485,8 @@ function up2k_init(subtle) {
|
||||
err.indexOf('NotFoundError') !== -1 // macos-firefox permissions
|
||||
) {
|
||||
pvis.seth(t.n, 1, 'OS-error');
|
||||
pvis.seth(t.n, 2, err);
|
||||
pvis.seth(t.n, 2, err + ' @ ' + car);
|
||||
console.log('OS-error', reader.error, '@', car);
|
||||
handled = true;
|
||||
}
|
||||
|
||||
@@ -2019,7 +2039,7 @@ function up2k_init(subtle) {
|
||||
new_state = true;
|
||||
fixed = true;
|
||||
}
|
||||
if (!has(perms, 'read')) {
|
||||
if (!has(perms, 'read') || !have_up2k_idx) {
|
||||
new_state = false;
|
||||
fixed = true;
|
||||
}
|
||||
@@ -2094,7 +2114,7 @@ function up2k_init(subtle) {
|
||||
if (parallel_uploads < 1)
|
||||
bumpthread(1);
|
||||
|
||||
return { "init_deps": init_deps, "set_fsearch": set_fsearch, "ui": pvis }
|
||||
return { "init_deps": init_deps, "set_fsearch": set_fsearch, "ui": pvis, "st": st, "uc": uc }
|
||||
}
|
||||
|
||||
|
||||
|
||||
@@ -329,14 +329,45 @@ function clgot(el, cls) {
|
||||
}
|
||||
|
||||
|
||||
function showsort(tab) {
|
||||
var v, vn, v1, v2, th = tab.tHead,
|
||||
sopts = jread('fsort', [["href", 1, ""]]);
|
||||
|
||||
th && (th = th.rows[0]) && (th = th.cells);
|
||||
|
||||
for (var a = sopts.length - 1; a >= 0; a--) {
|
||||
if (!sopts[a][0])
|
||||
continue;
|
||||
|
||||
v2 = v1;
|
||||
v1 = sopts[a];
|
||||
}
|
||||
|
||||
v = [v1, v2];
|
||||
vn = [v1 ? v1[0] : '', v2 ? v2[0] : ''];
|
||||
|
||||
var ga = QSA('#ghead a[s]');
|
||||
for (var a = 0; a < ga.length; a++)
|
||||
ga[a].className = '';
|
||||
|
||||
for (var a = 0; a < th.length; a++) {
|
||||
var n = vn.indexOf(th[a].getAttribute('name')),
|
||||
cl = n < 0 ? ' ' : ' s' + n + (v[n][1] > 0 ? ' ' : 'r ');
|
||||
|
||||
th[a].className = th[a].className.replace(/ *s[01]r? */, ' ') + cl;
|
||||
if (n + 1) {
|
||||
ga = QS('#ghead a[s="' + vn[n] + '"]');
|
||||
if (ga)
|
||||
ga.className = cl;
|
||||
}
|
||||
}
|
||||
}
|
||||
function sortTable(table, col, cb) {
|
||||
var tb = table.tBodies[0],
|
||||
th = table.tHead.rows[0].cells,
|
||||
tr = Array.prototype.slice.call(tb.rows, 0),
|
||||
i, reverse = th[col].className.indexOf('sort1') !== -1 ? -1 : 1;
|
||||
for (var a = 0, thl = th.length; a < thl; a++)
|
||||
th[a].className = th[a].className.replace(/ *sort-?1 */, " ");
|
||||
th[col].className += ' sort' + reverse;
|
||||
i, reverse = /s0[^r]/.exec(th[col].className + ' ') ? -1 : 1;
|
||||
|
||||
var stype = th[col].getAttribute('sort');
|
||||
try {
|
||||
var nrules = [], rules = jread("fsort", []);
|
||||
@@ -354,6 +385,7 @@ function sortTable(table, col, cb) {
|
||||
break;
|
||||
}
|
||||
jwrite("fsort", nrules);
|
||||
try { showsort(table); } catch (ex) { }
|
||||
}
|
||||
catch (ex) {
|
||||
console.log("failed to persist sort rules, resetting: " + ex);
|
||||
@@ -402,7 +434,7 @@ function makeSortable(table, cb) {
}


function linksplit(rp) {
function linksplit(rp, id) {
var ret = [],
apath = '/',
q = null;
@@ -432,8 +464,13 @@ function linksplit(rp) {
|
||||
vlink = vlink.slice(0, -1) + '<span>/</span>';
|
||||
}
|
||||
|
||||
if (!rp && q)
|
||||
link += q;
|
||||
if (!rp) {
|
||||
if (q)
|
||||
link += q;
|
||||
|
||||
if (id)
|
||||
link += '" id="' + id;
|
||||
}
|
||||
|
||||
ret.push('<a href="' + apath + link + '">' + vlink + '</a>');
|
||||
apath += link;
|
||||
|
||||
@@ -193,6 +193,11 @@ git pull; git reset --hard origin/HEAD && git log --format=format:"%H %ai %d" --
# download all sfx versions
curl https://api.github.com/repos/9001/copyparty/releases?per_page=100 | jq -r '.[] | .tag_name + " " + .name' | tr -d '\r' | while read v t; do fn="$(printf '%s\n' "copyparty $v $t.py" | tr / -)"; [ -e "$fn" ] || curl https://github.com/9001/copyparty/releases/download/$v/copyparty-sfx.py -Lo "$fn"; done

# push to multiple git remotes
git config -l | grep '^remote'
git remote add all git@github.com:9001/copyparty.git
git remote set-url --add --push all git@gitlab.com:9001/copyparty.git
git remote set-url --add --push all git@github.com:9001/copyparty.git

##
## http 206

@@ -1,10 +1,10 @@
FROM alpine:3.14
FROM alpine:3.15
WORKDIR /z
ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
ver_hashwasm=4.9.0 \
ver_marked=3.0.4 \
ver_marked=4.0.6 \
ver_mde=2.15.0 \
ver_codemirror=5.62.3 \
ver_codemirror=5.64.0 \
ver_fontawesome=5.13.0 \
ver_zopfli=1.0.3

@@ -82,7 +82,6 @@ RUN cd marked-$ver_marked \
|
||||
&& patch -p1 < /z/marked.patch \
|
||||
&& npm run build \
|
||||
&& cp -pv marked.min.js /z/dist/marked.js \
|
||||
&& cp -pv lib/marked.js /z/dist/marked.full.js \
|
||||
&& mkdir -p /z/nodepkgs \
|
||||
&& ln -s $(pwd) /z/nodepkgs/marked
|
||||
# && npm run test \
|
||||
@@ -98,8 +97,10 @@ RUN cd CodeMirror-$ver_codemirror \
|
||||
|
||||
|
||||
# build easymde
|
||||
COPY easymde-marked6.patch /z/
|
||||
COPY easymde.patch /z/
|
||||
RUN cd easy-markdown-editor-$ver_mde \
|
||||
&& patch -p1 < /z/easymde-marked6.patch \
|
||||
&& patch -p1 < /z/easymde.patch \
|
||||
&& sed -ri 's`https://registry.npmjs.org/marked/-/marked-[0-9\.]+.tgz`file:/z/nodepkgs/marked`' package-lock.json \
|
||||
&& sed -ri 's`("marked": ")[^"]+`\1file:/z/nodepkgs/marked`' ./package.json \
|
||||
|
||||
12
scripts/deps-docker/easymde-marked6.patch
Normal file
@@ -0,0 +1,12 @@
diff --git a/src/js/easymde.js b/src/js/easymde.js
--- a/src/js/easymde.js
+++ b/src/js/easymde.js
@@ -1962,7 +1962,7 @@ EasyMDE.prototype.markdown = function (text) {
marked.setOptions(markedOptions);

// Convert the markdown to HTML
- var htmlText = marked(text);
+ var htmlText = marked.parse(text);

// Sanitize HTML
if (this.options.renderingConfig && typeof this.options.renderingConfig.sanitizerFunction === 'function') {
@@ -1,15 +1,15 @@
|
||||
diff --git a/src/Lexer.js b/src/Lexer.js
|
||||
adds linetracking to marked.js v3.0.4;
|
||||
adds linetracking to marked.js v4.0.6;
|
||||
add data-ln="%d" to most tags, %d is the source markdown line
|
||||
--- a/src/Lexer.js
|
||||
+++ b/src/Lexer.js
|
||||
@@ -50,4 +50,5 @@ function mangle(text) {
|
||||
module.exports = class Lexer {
|
||||
export class Lexer {
|
||||
constructor(options) {
|
||||
+ this.ln = 1; // like most editors, start couting from 1
|
||||
this.tokens = [];
|
||||
this.tokens.links = Object.create(null);
|
||||
@@ -127,4 +128,15 @@ module.exports = class Lexer {
|
||||
@@ -127,4 +128,15 @@ export class Lexer {
|
||||
}
|
||||
|
||||
+ set_ln(token, ln = this.ln) {
|
||||
@@ -25,7 +25,7 @@ add data-ln="%d" to most tags, %d is the source markdown line
|
||||
+
|
||||
/**
|
||||
* Lexing
|
||||
@@ -134,7 +146,11 @@ module.exports = class Lexer {
|
||||
@@ -134,7 +146,11 @@ export class Lexer {
|
||||
src = src.replace(/^ +$/gm, '');
|
||||
}
|
||||
- let token, lastToken, cutSrc, lastParagraphClipped;
|
||||
@@ -38,105 +38,105 @@ add data-ln="%d" to most tags, %d is the source markdown line
|
||||
+
|
||||
if (this.options.extensions
|
||||
&& this.options.extensions.block
|
||||
@@ -142,4 +158,5 @@ module.exports = class Lexer {
|
||||
@@ -142,4 +158,5 @@ export class Lexer {
|
||||
if (token = extTokenizer.call({ lexer: this }, src, tokens)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
return true;
|
||||
@@ -153,4 +170,5 @@ module.exports = class Lexer {
|
||||
@@ -153,4 +170,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.space(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln); // is \n if not type
|
||||
if (token.type) {
|
||||
tokens.push(token);
|
||||
@@ -162,4 +180,5 @@ module.exports = class Lexer {
|
||||
@@ -162,4 +180,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.code(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
lastToken = tokens[tokens.length - 1];
|
||||
// An indented code block cannot interrupt a paragraph.
|
||||
@@ -177,4 +196,5 @@ module.exports = class Lexer {
|
||||
@@ -177,4 +196,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.fences(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -184,4 +204,5 @@ module.exports = class Lexer {
|
||||
@@ -184,4 +204,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.heading(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -191,4 +212,5 @@ module.exports = class Lexer {
|
||||
@@ -191,4 +212,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.hr(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -198,4 +220,5 @@ module.exports = class Lexer {
|
||||
@@ -198,4 +220,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.blockquote(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -205,4 +228,5 @@ module.exports = class Lexer {
|
||||
@@ -205,4 +228,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.list(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -212,4 +236,5 @@ module.exports = class Lexer {
|
||||
@@ -212,4 +236,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.html(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -219,4 +244,5 @@ module.exports = class Lexer {
|
||||
@@ -219,4 +244,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.def(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
lastToken = tokens[tokens.length - 1];
|
||||
if (lastToken && (lastToken.type === 'paragraph' || lastToken.type === 'text')) {
|
||||
@@ -236,4 +262,5 @@ module.exports = class Lexer {
|
||||
@@ -236,4 +262,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.table(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -243,4 +270,5 @@ module.exports = class Lexer {
|
||||
@@ -243,4 +270,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.lheading(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -263,4 +291,5 @@ module.exports = class Lexer {
|
||||
@@ -263,4 +291,5 @@ export class Lexer {
|
||||
}
|
||||
if (this.state.top && (token = this.tokenizer.paragraph(cutSrc))) {
|
||||
+ this.set_ln(token, ln);
|
||||
lastToken = tokens[tokens.length - 1];
|
||||
if (lastParagraphClipped && lastToken.type === 'paragraph') {
|
||||
@@ -280,4 +309,6 @@ module.exports = class Lexer {
|
||||
@@ -280,4 +309,6 @@ export class Lexer {
|
||||
if (token = this.tokenizer.text(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.set_ln(token, ln);
|
||||
+ this.ln++;
|
||||
lastToken = tokens[tokens.length - 1];
|
||||
if (lastToken && lastToken.type === 'text') {
|
||||
@@ -355,4 +386,5 @@ module.exports = class Lexer {
|
||||
@@ -355,4 +386,5 @@ export class Lexer {
|
||||
if (token = extTokenizer.call({ lexer: this }, src, tokens)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.ln = token.ln || this.ln;
|
||||
tokens.push(token);
|
||||
return true;
|
||||
@@ -420,4 +452,6 @@ module.exports = class Lexer {
|
||||
@@ -420,4 +452,6 @@ export class Lexer {
|
||||
if (token = this.tokenizer.br(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ // no need to reset (no more blockTokens anyways)
|
||||
+ token.ln = this.ln++;
|
||||
tokens.push(token);
|
||||
continue;
|
||||
@@ -462,4 +496,5 @@ module.exports = class Lexer {
|
||||
@@ -462,4 +496,5 @@ export class Lexer {
|
||||
if (token = this.tokenizer.inlineText(cutSrc, smartypants)) {
|
||||
src = src.substring(token.raw.length);
|
||||
+ this.ln = token.ln || this.ln;
|
||||
@@ -145,13 +145,13 @@ add data-ln="%d" to most tags, %d is the source markdown line
|
||||
diff --git a/src/Parser.js b/src/Parser.js
|
||||
--- a/src/Parser.js
|
||||
+++ b/src/Parser.js
|
||||
@@ -18,4 +18,5 @@ module.exports = class Parser {
|
||||
@@ -18,4 +18,5 @@ export class Parser {
|
||||
this.textRenderer = new TextRenderer();
|
||||
this.slugger = new Slugger();
|
||||
+ this.ln = 0; // error indicator; should always be set >=1 from tokens
|
||||
}
|
||||
|
||||
@@ -64,4 +65,8 @@ module.exports = class Parser {
|
||||
@@ -64,4 +65,8 @@ export class Parser {
|
||||
for (i = 0; i < l; i++) {
|
||||
token = tokens[i];
|
||||
+ // take line-numbers from tokens whenever possible
|
||||
@@ -160,7 +160,7 @@ diff --git a/src/Parser.js b/src/Parser.js
|
||||
+ this.renderer.tag_ln(this.ln);
|
||||
|
||||
// Run any renderer extensions
|
||||
@@ -124,7 +129,10 @@ module.exports = class Parser {
|
||||
@@ -124,7 +129,10 @@ export class Parser {
|
||||
}
|
||||
|
||||
- body += this.renderer.tablerow(cell);
|
||||
@@ -173,7 +173,7 @@ diff --git a/src/Parser.js b/src/Parser.js
|
||||
+ out += this.renderer.tag_ln(token.ln).table(header, body);
|
||||
continue;
|
||||
}
|
||||
@@ -167,8 +175,12 @@ module.exports = class Parser {
|
||||
@@ -167,8 +175,12 @@ export class Parser {
|
||||
|
||||
itemBody += this.parse(item.tokens, loose);
|
||||
- body += this.renderer.listitem(itemBody, task, checked);
|
||||
@@ -188,7 +188,7 @@ diff --git a/src/Parser.js b/src/Parser.js
|
||||
+ out += this.renderer.tag_ln(token.ln).list(body, ordered, start);
|
||||
continue;
|
||||
}
|
||||
@@ -179,5 +191,6 @@ module.exports = class Parser {
|
||||
@@ -179,5 +191,6 @@ export class Parser {
|
||||
}
|
||||
case 'paragraph': {
|
||||
- out += this.renderer.paragraph(this.parseInline(token.tokens));
|
||||
@@ -196,7 +196,7 @@ diff --git a/src/Parser.js b/src/Parser.js
|
||||
+ out += this.renderer.tag_ln(token.ln).paragraph(t);
|
||||
continue;
|
||||
}
|
||||
@@ -221,4 +234,7 @@ module.exports = class Parser {
|
||||
@@ -221,4 +234,7 @@ export class Parser {
|
||||
token = tokens[i];
|
||||
|
||||
+ // another thing that only affects <br/> and other inlines
|
||||
@@ -207,7 +207,7 @@ diff --git a/src/Parser.js b/src/Parser.js
|
||||
diff --git a/src/Renderer.js b/src/Renderer.js
|
||||
--- a/src/Renderer.js
|
||||
+++ b/src/Renderer.js
|
||||
@@ -11,6 +11,12 @@ module.exports = class Renderer {
|
||||
@@ -11,6 +11,12 @@ export class Renderer {
|
||||
constructor(options) {
|
||||
this.options = options || defaults;
|
||||
+ this.ln = "";
|
||||
@@ -220,7 +220,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
|
||||
+
|
||||
code(code, infostring, escaped) {
|
||||
const lang = (infostring || '').match(/\S*/)[0];
|
||||
@@ -26,10 +32,10 @@ module.exports = class Renderer {
|
||||
@@ -26,10 +32,10 @@ export class Renderer {
|
||||
|
||||
if (!lang) {
|
||||
- return '<pre><code>'
|
||||
@@ -233,55 +233,55 @@ diff --git a/src/Renderer.js b/src/Renderer.js
|
||||
+ return '<pre' + this.ln + '><code class="'
|
||||
+ this.options.langPrefix
|
||||
+ escape(lang, true)
|
||||
@@ -40,5 +46,5 @@ module.exports = class Renderer {
|
||||
@@ -40,5 +46,5 @@ export class Renderer {
|
||||
|
||||
blockquote(quote) {
|
||||
- return '<blockquote>\n' + quote + '</blockquote>\n';
|
||||
+ return '<blockquote' + this.ln + '>\n' + quote + '</blockquote>\n';
|
||||
}
|
||||
|
||||
@@ -51,4 +57,5 @@ module.exports = class Renderer {
|
||||
@@ -51,4 +57,5 @@ export class Renderer {
|
||||
return '<h'
|
||||
+ level
|
||||
+ + this.ln
|
||||
+ ' id="'
|
||||
+ this.options.headerPrefix
|
||||
@@ -61,5 +68,5 @@ module.exports = class Renderer {
|
||||
@@ -61,5 +68,5 @@ export class Renderer {
|
||||
}
|
||||
// ignore IDs
|
||||
- return '<h' + level + '>' + text + '</h' + level + '>\n';
|
||||
+ return '<h' + level + this.ln + '>' + text + '</h' + level + '>\n';
|
||||
}
|
||||
|
||||
@@ -75,5 +82,5 @@ module.exports = class Renderer {
|
||||
@@ -75,5 +82,5 @@ export class Renderer {
|
||||
|
||||
listitem(text) {
|
||||
- return '<li>' + text + '</li>\n';
|
||||
+ return '<li' + this.ln + '>' + text + '</li>\n';
|
||||
}
|
||||
|
||||
@@ -87,5 +94,5 @@ module.exports = class Renderer {
|
||||
@@ -87,5 +94,5 @@ export class Renderer {
|
||||
|
||||
paragraph(text) {
|
||||
- return '<p>' + text + '</p>\n';
|
||||
+ return '<p' + this.ln + '>' + text + '</p>\n';
|
||||
}
|
||||
|
||||
@@ -102,5 +109,5 @@ module.exports = class Renderer {
|
||||
@@ -102,5 +109,5 @@ export class Renderer {
|
||||
|
||||
tablerow(content) {
|
||||
- return '<tr>\n' + content + '</tr>\n';
|
||||
+ return '<tr' + this.ln + '>\n' + content + '</tr>\n';
|
||||
}
|
||||
|
||||
@@ -127,5 +134,5 @@ module.exports = class Renderer {
|
||||
@@ -127,5 +134,5 @@ export class Renderer {
|
||||
|
||||
br() {
|
||||
- return this.options.xhtml ? '<br/>' : '<br>';
|
||||
+ return this.options.xhtml ? '<br' + this.ln + '/>' : '<br' + this.ln + '>';
|
||||
}
|
||||
|
||||
@@ -153,5 +160,5 @@ module.exports = class Renderer {
|
||||
@@ -153,5 +160,5 @@ export class Renderer {
|
||||
}
|
||||
|
||||
- let out = '<img src="' + href + '" alt="' + text + '"';
|
||||
@@ -291,7 +291,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
|
||||
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||
--- a/src/Tokenizer.js
|
||||
+++ b/src/Tokenizer.js
|
||||
@@ -301,4 +301,7 @@ module.exports = class Tokenizer {
|
||||
@@ -297,4 +297,7 @@ export class Tokenizer {
|
||||
const l = list.items.length;
|
||||
|
||||
+ // each nested list gets +1 ahead; this hack makes every listgroup -1 but atleast it doesn't get infinitely bad
|
||||
|
||||
@@ -1,7 +1,7 @@
diff --git a/src/Lexer.js b/src/Lexer.js
--- a/src/Lexer.js
+++ b/src/Lexer.js
@@ -6,5 +6,5 @@ const { repeatString } = require('./helpers.js');
@@ -6,5 +6,5 @@ import { repeatString } from './helpers.js';
/**
* smartypants text replacement
- */
@@ -15,21 +15,21 @@ diff --git a/src/Lexer.js b/src/Lexer.js
+ *
function mangle(text) {
let out = '',
@@ -465,5 +465,5 @@ module.exports = class Lexer {
|
||||
@@ -466,5 +466,5 @@ export class Lexer {
|
||||
|
||||
// autolink
|
||||
- if (token = this.tokenizer.autolink(src, mangle)) {
|
||||
+ if (token = this.tokenizer.autolink(src)) {
|
||||
src = src.substring(token.raw.length);
|
||||
tokens.push(token);
|
||||
@@ -472,5 +472,5 @@ module.exports = class Lexer {
|
||||
@@ -473,5 +473,5 @@ export class Lexer {
|
||||
|
||||
// url (gfm)
|
||||
- if (!this.state.inLink && (token = this.tokenizer.url(src, mangle))) {
|
||||
+ if (!this.state.inLink && (token = this.tokenizer.url(src))) {
|
||||
src = src.substring(token.raw.length);
|
||||
tokens.push(token);
|
||||
@@ -493,5 +493,5 @@ module.exports = class Lexer {
|
||||
@@ -494,5 +494,5 @@ export class Lexer {
|
||||
}
|
||||
}
|
||||
- if (token = this.tokenizer.inlineText(cutSrc, smartypants)) {
|
||||
@@ -39,14 +39,14 @@ diff --git a/src/Lexer.js b/src/Lexer.js
|
||||
diff --git a/src/Renderer.js b/src/Renderer.js
|
||||
--- a/src/Renderer.js
|
||||
+++ b/src/Renderer.js
|
||||
@@ -142,5 +142,5 @@ module.exports = class Renderer {
|
||||
@@ -142,5 +142,5 @@ export class Renderer {
|
||||
|
||||
link(href, title, text) {
|
||||
- href = cleanUrl(this.options.sanitize, this.options.baseUrl, href);
|
||||
+ href = cleanUrl(this.options.baseUrl, href);
|
||||
if (href === null) {
|
||||
return text;
|
||||
@@ -155,5 +155,5 @@ module.exports = class Renderer {
|
||||
@@ -155,5 +155,5 @@ export class Renderer {
|
||||
|
||||
image(href, title, text) {
|
||||
- href = cleanUrl(this.options.sanitize, this.options.baseUrl, href);
|
||||
@@ -56,7 +56,7 @@ diff --git a/src/Renderer.js b/src/Renderer.js
|
||||
diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||
--- a/src/Tokenizer.js
|
||||
+++ b/src/Tokenizer.js
|
||||
@@ -321,14 +321,7 @@ module.exports = class Tokenizer {
|
||||
@@ -320,14 +320,7 @@ export class Tokenizer {
|
||||
type: 'html',
|
||||
raw: cap[0],
|
||||
- pre: !this.options.sanitizer
|
||||
@@ -72,7 +72,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||
- }
|
||||
return token;
|
||||
}
|
||||
@@ -477,15 +470,9 @@ module.exports = class Tokenizer {
|
||||
@@ -476,15 +469,9 @@ export class Tokenizer {
|
||||
|
||||
return {
|
||||
- type: this.options.sanitize
|
||||
@@ -90,7 +90,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||
+ text: cap[0]
|
||||
};
|
||||
}
|
||||
@@ -672,10 +659,10 @@ module.exports = class Tokenizer {
|
||||
@@ -671,10 +658,10 @@ export class Tokenizer {
|
||||
}
|
||||
|
||||
- autolink(src, mangle) {
|
||||
@@ -103,7 +103,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||
+ text = escape(cap[1]);
|
||||
href = 'mailto:' + text;
|
||||
} else {
|
||||
@@ -700,10 +687,10 @@ module.exports = class Tokenizer {
|
||||
@@ -699,10 +686,10 @@ export class Tokenizer {
|
||||
}
|
||||
|
||||
- url(src, mangle) {
|
||||
@@ -116,7 +116,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||
+ text = escape(cap[0]);
|
||||
href = 'mailto:' + text;
|
||||
} else {
|
||||
@@ -737,12 +724,12 @@ module.exports = class Tokenizer {
|
||||
@@ -736,12 +723,12 @@ export class Tokenizer {
|
||||
}
|
||||
|
||||
- inlineText(src, smartypants) {
|
||||
@@ -135,7 +135,7 @@ diff --git a/src/Tokenizer.js b/src/Tokenizer.js
|
||||
diff --git a/src/defaults.js b/src/defaults.js
|
||||
--- a/src/defaults.js
|
||||
+++ b/src/defaults.js
|
||||
@@ -9,12 +9,8 @@ function getDefaults() {
|
||||
@@ -9,12 +9,8 @@ export function getDefaults() {
|
||||
highlight: null,
|
||||
langPrefix: 'language-',
|
||||
- mangle: true,
|
||||
@@ -151,10 +151,10 @@ diff --git a/src/defaults.js b/src/defaults.js
|
||||
diff --git a/src/helpers.js b/src/helpers.js
|
||||
--- a/src/helpers.js
|
||||
+++ b/src/helpers.js
|
||||
@@ -64,18 +64,5 @@ function edit(regex, opt) {
|
||||
@@ -64,18 +64,5 @@ export function edit(regex, opt) {
|
||||
const nonWordAndColonTest = /[^\w:]/g;
|
||||
const originIndependentUrl = /^$|^[a-z][a-z0-9+.-]*:|^[?#]/i;
|
||||
-function cleanUrl(sanitize, base, href) {
|
||||
-export function cleanUrl(sanitize, base, href) {
|
||||
- if (sanitize) {
|
||||
- let prot;
|
||||
- try {
|
||||
@@ -168,36 +168,30 @@ diff --git a/src/helpers.js b/src/helpers.js
|
||||
- return null;
|
||||
- }
|
||||
- }
|
||||
+function cleanUrl(base, href) {
|
||||
+export function cleanUrl(base, href) {
|
||||
if (base && !originIndependentUrl.test(href)) {
|
||||
href = resolveUrl(base, href);
|
||||
@@ -227,10 +214,4 @@ function findClosingBracket(str, b) {
|
||||
@@ -227,10 +214,4 @@ export function findClosingBracket(str, b) {
|
||||
}
|
||||
|
||||
-function checkSanitizeDeprecation(opt) {
|
||||
-export function checkSanitizeDeprecation(opt) {
|
||||
- if (opt && opt.sanitize && !opt.silent) {
|
||||
- console.warn('marked(): sanitize and sanitizer parameters are deprecated since version 0.7.0, should not be used and will be removed in the future. Read more here: https://marked.js.org/#/USING_ADVANCED.md#options');
|
||||
- }
|
||||
-}
|
||||
-
|
||||
// copied from https://stackoverflow.com/a/5450113/806777
|
||||
function repeatString(pattern, count) {
|
||||
@@ -260,5 +241,4 @@ module.exports = {
|
||||
rtrim,
|
||||
findClosingBracket,
|
||||
- checkSanitizeDeprecation,
|
||||
repeatString
|
||||
};
|
||||
export function repeatString(pattern, count) {
|
||||
diff --git a/src/marked.js b/src/marked.js
|
||||
--- a/src/marked.js
|
||||
+++ b/src/marked.js
|
||||
@@ -7,5 +7,4 @@ const Slugger = require('./Slugger.js');
|
||||
const {
|
||||
@@ -7,5 +7,4 @@ import { Slugger } from './Slugger.js';
|
||||
import {
|
||||
merge,
|
||||
- checkSanitizeDeprecation,
|
||||
escape
|
||||
} = require('./helpers.js');
|
||||
@@ -35,5 +34,4 @@ function marked(src, opt, callback) {
|
||||
} from './helpers.js';
|
||||
@@ -35,5 +34,4 @@ export function marked(src, opt, callback) {
|
||||
|
||||
opt = merge({}, marked.defaults, opt || {});
|
||||
- checkSanitizeDeprecation(opt);
|
||||
@@ -219,37 +213,37 @@ diff --git a/src/marked.js b/src/marked.js
|
||||
diff --git a/test/bench.js b/test/bench.js
|
||||
--- a/test/bench.js
|
||||
+++ b/test/bench.js
|
||||
@@ -33,5 +33,4 @@ async function runBench(options) {
|
||||
@@ -37,5 +37,4 @@ export async function runBench(options) {
|
||||
breaks: false,
|
||||
pedantic: false,
|
||||
- sanitize: false,
|
||||
smartLists: false
|
||||
});
|
||||
@@ -45,5 +44,4 @@ async function runBench(options) {
|
||||
@@ -49,5 +48,4 @@ export async function runBench(options) {
|
||||
breaks: false,
|
||||
pedantic: false,
|
||||
- sanitize: false,
|
||||
smartLists: false
|
||||
});
|
||||
@@ -58,5 +56,4 @@ async function runBench(options) {
|
||||
@@ -62,5 +60,4 @@ export async function runBench(options) {
|
||||
breaks: false,
|
||||
pedantic: false,
|
||||
- sanitize: false,
|
||||
smartLists: false
|
||||
});
|
||||
@@ -70,5 +67,4 @@ async function runBench(options) {
|
||||
@@ -74,5 +71,4 @@ export async function runBench(options) {
|
||||
breaks: false,
|
||||
pedantic: false,
|
||||
- sanitize: false,
|
||||
smartLists: false
|
||||
});
|
||||
@@ -83,5 +79,4 @@ async function runBench(options) {
|
||||
@@ -87,5 +83,4 @@ export async function runBench(options) {
|
||||
breaks: false,
|
||||
pedantic: true,
|
||||
- sanitize: false,
|
||||
smartLists: false
|
||||
});
|
||||
@@ -95,5 +90,4 @@ async function runBench(options) {
|
||||
@@ -99,5 +94,4 @@ export async function runBench(options) {
|
||||
breaks: false,
|
||||
pedantic: true,
|
||||
- sanitize: false,
|
||||
@@ -258,7 +252,7 @@ diff --git a/test/bench.js b/test/bench.js
|
||||
diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
|
||||
--- a/test/specs/run-spec.js
|
||||
+++ b/test/specs/run-spec.js
|
||||
@@ -22,9 +22,4 @@ function runSpecs(title, dir, showCompletionTable, options) {
|
||||
@@ -25,9 +25,4 @@ function runSpecs(title, dir, showCompletionTable, options) {
|
||||
}
|
||||
|
||||
- if (spec.options.sanitizer) {
|
||||
@@ -268,77 +262,77 @@ diff --git a/test/specs/run-spec.js b/test/specs/run-spec.js
|
||||
-
|
||||
(spec.only ? fit : (spec.skip ? xit : it))('should ' + passFail + example, async() => {
|
||||
const before = process.hrtime();
|
||||
@@ -53,3 +48,2 @@ runSpecs('Original', './original', false, { gfm: false, pedantic: true });
|
||||
@@ -56,3 +51,2 @@ runSpecs('Original', './original', false, { gfm: false, pedantic: true });
|
||||
runSpecs('New', './new');
|
||||
runSpecs('ReDOS', './redos');
|
||||
-runSpecs('Security', './security', false, { silent: true }); // silent - do not show deprecation warning
|
||||
diff --git a/test/unit/Lexer-spec.js b/test/unit/Lexer-spec.js
|
||||
--- a/test/unit/Lexer-spec.js
|
||||
+++ b/test/unit/Lexer-spec.js
|
||||
@@ -589,5 +589,5 @@ paragraph
|
||||
@@ -635,5 +635,5 @@ paragraph
|
||||
});
|
||||
|
||||
- it('sanitize', () => {
|
||||
+ /*it('sanitize', () => {
|
||||
expectTokens({
|
||||
md: '<div>html</div>',
|
||||
@@ -607,5 +607,5 @@ paragraph
|
||||
@@ -653,5 +653,5 @@ paragraph
|
||||
]
|
||||
});
|
||||
- });
|
||||
+ });*/
|
||||
});
|
||||
|
||||
@@ -652,5 +652,5 @@ paragraph
|
||||
@@ -698,5 +698,5 @@ paragraph
|
||||
});
|
||||
|
||||
- it('html sanitize', () => {
|
||||
+ /*it('html sanitize', () => {
|
||||
expectInlineTokens({
|
||||
md: '<div>html</div>',
|
||||
@@ -660,5 +660,5 @@ paragraph
|
||||
@@ -706,5 +706,5 @@ paragraph
|
||||
]
|
||||
});
|
||||
- });
|
||||
+ });*/
|
||||
|
||||
it('link', () => {
|
||||
@@ -971,5 +971,5 @@ paragraph
|
||||
@@ -1017,5 +1017,5 @@ paragraph
|
||||
});
|
||||
|
||||
- it('autolink mangle email', () => {
|
||||
+ /*it('autolink mangle email', () => {
|
||||
expectInlineTokens({
|
||||
md: '<test@example.com>',
|
||||
@@ -991,5 +991,5 @@ paragraph
|
||||
@@ -1037,5 +1037,5 @@ paragraph
|
||||
]
|
||||
});
|
||||
- });
|
||||
+ });*/
|
||||
|
||||
it('url', () => {
|
||||
@@ -1028,5 +1028,5 @@ paragraph
|
||||
@@ -1074,5 +1074,5 @@ paragraph
|
||||
});
|
||||
|
||||
- it('url mangle email', () => {
|
||||
+ /*it('url mangle email', () => {
|
||||
expectInlineTokens({
|
||||
md: 'test@example.com',
|
||||
@@ -1048,5 +1048,5 @@ paragraph
|
||||
@@ -1094,5 +1094,5 @@ paragraph
|
||||
]
|
||||
});
|
||||
- });
|
||||
+ });*/
|
||||
});
|
||||
|
||||
@@ -1064,5 +1064,5 @@ paragraph
|
||||
@@ -1110,5 +1110,5 @@ paragraph
|
||||
});
|
||||
|
||||
- describe('smartypants', () => {
|
||||
+ /*describe('smartypants', () => {
|
||||
it('single quotes', () => {
|
||||
expectInlineTokens({
|
||||
@@ -1134,5 +1134,5 @@ paragraph
|
||||
@@ -1180,5 +1180,5 @@ paragraph
|
||||
});
|
||||
});
|
||||
- });
|
||||
|
||||
@@ -86,8 +86,6 @@ function have() {
|
||||
python -c "import $1; $1; $1.__version__"
|
||||
}
|
||||
|
||||
mv copyparty/web/deps/marked.full.js.gz srv/ || true
|
||||
|
||||
. buildenv/bin/activate
|
||||
have setuptools
|
||||
have wheel
|
||||
|
||||
@@ -35,8 +35,6 @@ ver="$1"
|
||||
exit 1
|
||||
}
|
||||
|
||||
mv copyparty/web/deps/marked.full.js.gz srv/ || true
|
||||
|
||||
mkdir -p dist
|
||||
zip_path="$(pwd)/dist/copyparty-$ver.zip"
|
||||
tgz_path="$(pwd)/dist/copyparty-$ver.tar.gz"
|
||||
|
||||
@@ -7,8 +7,9 @@ v=$1
printf '%s\n' "$v" | grep -qE '^[0-9\.]+$' || exit 1
grep -E "(${v//./, })" ../copyparty/__version__.py || exit 1

git push all
git tag v$v
git push origin --tags
git push all --tags

rm -rf ../dist

@@ -29,6 +29,9 @@ class Cfg(Namespace):
|
||||
v=v or [],
|
||||
c=c,
|
||||
rproxy=0,
|
||||
rsp_slp=0,
|
||||
s_wr_slp=0,
|
||||
s_wr_sz=512 * 1024,
|
||||
ed=False,
|
||||
nw=False,
|
||||
unpost=600,
|
||||
@@ -48,6 +51,7 @@ class Cfg(Namespace):
|
||||
mte="a",
|
||||
mth="",
|
||||
textfiles="",
|
||||
doctitle="",
|
||||
hist=None,
|
||||
no_idx=None,
|
||||
no_hash=None,
|
||||
|
||||
@@ -23,6 +23,7 @@ class Cfg(Namespace):
|
||||
"mtp": [],
|
||||
"mte": "a",
|
||||
"mth": "",
|
||||
"doctitle": "",
|
||||
"hist": None,
|
||||
"no_idx": None,
|
||||
"no_hash": None,
|
||||
@@ -31,6 +32,9 @@ class Cfg(Namespace):
|
||||
"no_voldump": True,
|
||||
"re_maxage": 0,
|
||||
"rproxy": 0,
|
||||
"rsp_slp": 0,
|
||||
"s_wr_slp": 0,
|
||||
"s_wr_sz": 512 * 1024,
|
||||
}
|
||||
ex.update(ex2)
|
||||
super(Cfg, self).__init__(a=a or [], v=v or [], c=c, **ex)
|