Compare commits

24 commits (SHA1):

b0cc396bca, ae463518f6, 2be2e9a0d8, e405fddf74, c269b0dd91, 8c3211263a, bf04e7c089, c7c6e48b1a, 974ca773be, 9270c2df19, b39ff92f34, 7454167f78, 5ceb3a962f, 52bd5642da, c39c93725f, d00f0b9fa7, 01cfc70982, e6aec189bd, c98fff1647, 0009e31bd3, db95e880b2, e69fea4a59, 4360800a6e, b179e2b031
.github/pull_request_template.md  (new file, 2 lines)

@@ -0,0 +1,2 @@
+Please include the following text somewhere in this PR description:
+This PR complies with the DCO; https://developercertificate.org/
README.md  (16 changed lines)

@@ -1,4 +1,4 @@
-# ⇆🎉 copyparty
+# 💾🎉 copyparty
 
 * portable file sharing hub (py2/py3) [(on PyPI)](https://pypi.org/project/copyparty/)
 * MIT-Licensed, 2019-05-26, ed @ irc.rizon.net

@@ -240,6 +240,9 @@ browser-specific:
 server-os-specific:
 * RHEL8 / Rocky8: you can run copyparty using `/usr/libexec/platform-python`
 
+server notes:
+* pypy is supported but regular cpython is faster if you enable the database
+
 
 # bugs
 

@@ -513,11 +516,14 @@ up2k has several advantages:
 * much higher speeds than ftp/scp/tarpipe on some internet connections (mainly american ones) thanks to parallel connections
 * the last-modified timestamp of the file is preserved
 
+> it is perfectly safe to restart / upgrade copyparty while someone is uploading to it!
+> all known up2k clients will resume just fine 💪
+
 see [up2k](#up2k) for details on how it works, or watch a [demo video](https://a.ocv.me/pub/demo/pics-vids/#gf-0f6f5c0d)
 
 
 
-**protip:** you can avoid scaring away users with [contrib/plugins/minimal-up2k.html](contrib/plugins/minimal-up2k.html) which makes it look [much simpler](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png)
+**protip:** you can avoid scaring away users with [contrib/plugins/minimal-up2k.js](contrib/plugins/minimal-up2k.js) which makes it look [much simpler](https://user-images.githubusercontent.com/241032/118311195-dd6ca380-b4ef-11eb-86f3-75a3ff2e1332.png)
 
 **protip:** if you enable `favicon` in the `[⚙️] settings` tab (by typing something into the textbox), the icon in the browser tab will indicate upload progress -- also, the `[🔔]` and/or `[🔊]` switches enable visible and/or audible notifications on upload completion
 

@@ -1378,7 +1384,7 @@ you can reduce the sfx size by repacking it; see [./docs/devnotes.md#sfx-repack]
 
 download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (win8+) or [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) (win7+)
 
 
 
 can be convenient on machines where installing python is problematic, however is **not recommended** -- if possible, please use **[copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py)** instead
 

@@ -1386,9 +1392,9 @@ can be convenient on machines where installing python is problematic, however is
 
 * on win8 it needs [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145), on win10 it just works
 
-* dangerous: [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is compatible with windows7, which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and should never be exposed to the internet (LAN is fine)
+* dangerous: [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is compatible with [windows7](https://user-images.githubusercontent.com/241032/221445944-ae85d1f4-d351-4837-b130-82cab57d6cca.png), which means it uses an ancient copy of python (3.7.9) which cannot be upgraded and should never be exposed to the internet (LAN is fine)
 
-* dangerous and deprecated: [copyparty64.exe](https://github.com/9001/copyparty/releases/download/v1.6.5/copyparty64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
+* dangerous and deprecated: [copyparty-winpe64.exe](https://github.com/9001/copyparty/releases/download/v1.6.8/copyparty-winpe64.exe) lets you [run copyparty in WinPE](https://user-images.githubusercontent.com/241032/205454984-e6b550df-3c49-486d-9267-1614078dd0dd.png) and is otherwise completely useless
 
 meanwhile [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) instead relies on your system python which gives better performance and will stay safe as long as you keep your python install up-to-date
 
@@ -31,7 +31,7 @@ run [`install-deps.sh`](install-deps.sh) to build/install most dependencies requ
 *alternatively* (or preferably) use packages from your distro instead, then you'll need at least these:
 
 * from distro: `numpy vamp-plugin-sdk beatroot-vamp mixxx-keyfinder ffmpeg`
-* from pypy: `keyfinder vamp`
+* from pip: `keyfinder vamp`
 
 
 # usage from copyparty
@@ -4,8 +4,9 @@ set -e
 # runs copyparty (or any other program really) in a chroot
 #
 # assumption: these directories, and everything within, are owned by root
-sysdirs=( /bin /lib /lib32 /lib64 /sbin /usr /etc/alternatives )
+sysdirs=(); for v in /bin /lib /lib32 /lib64 /sbin /usr /etc/alternatives ; do
+	[ -e $v ] && sysdirs+=($v)
+done
 
 # error-handler
 help() { cat <<'EOF'

@@ -38,7 +39,7 @@ while true; do
 	v="$1"; shift
 	[ "$v" = -- ] && break  # end of volumes
 	[ "$#" -eq 0 ] && break  # invalid usage
-	vols+=( "$(realpath "$v")" )
+	vols+=( "$(realpath "$v" || echo "$v")" )
 done
 pybin="$1"; shift
 pybin="$(command -v "$pybin")"

@@ -82,7 +83,7 @@ jail="${jail%/}"
 printf '%s\n' "${sysdirs[@]}" "${vols[@]}" | sed -r 's`/$``' | LC_ALL=C sort | uniq |
 while IFS= read -r v; do
 	[ -e "$v" ] || {
-		# printf '\033[1;31mfolder does not exist:\033[0m %s\n' "/$v"
+		printf '\033[1;31mfolder does not exist:\033[0m %s\n' "$v"
 		continue
 	}
 	i1=$(stat -c%D.%i "$v" 2>/dev/null || echo a)
bin/up2k.py  (102 changed lines)

@@ -1,9 +1,12 @@
 #!/usr/bin/env python3
 from __future__ import print_function, unicode_literals
 
+S_VERSION = "1.5"
+S_BUILD_DT = "2023-03-12"
+
 """
 up2k.py: upload to copyparty
-2023-01-13, v1.2, ed <irc.rizon.net>, MIT-Licensed
+2021, ed <irc.rizon.net>, MIT-Licensed
 https://github.com/9001/copyparty/blob/hovudstraum/bin/up2k.py
 
 - dependencies: requests

@@ -24,6 +27,8 @@ import platform
 import threading
 import datetime
 
+EXE = sys.executable.endswith("exe")
+
 try:
     import argparse
 except:

@@ -34,7 +39,9 @@ except:
 try:
     import requests
 except ImportError:
-    if sys.version_info > (2, 7):
+    if EXE:
+        raise
+    elif sys.version_info > (2, 7):
         m = "\nERROR: need 'requests'; please run this command:\n {0} -m pip install --user requests\n"
     else:
         m = "requests/2.18.4 urllib3/1.23 chardet/3.0.4 certifi/2020.4.5.1 idna/2.7"

@@ -245,7 +252,13 @@ def eprint(*a, **ka):
 
 
 def flushing_print(*a, **ka):
-    _print(*a, **ka)
+    try:
+        _print(*a, **ka)
+    except:
+        v = " ".join(str(x) for x in a)
+        v = v.encode("ascii", "replace").decode("ascii")
+        _print(v, **ka)
+
     if "flush" not in ka:
         sys.stdout.flush()
 

@@ -372,6 +385,23 @@ def walkdir(err, top, seen):
 def walkdirs(err, tops):
     """recursive statdir for a list of tops, yields [top, relpath, stat]"""
     sep = "{0}".format(os.sep).encode("ascii")
+    if not VT100:
+        za = []
+        for td in tops:
+            try:
+                ap = os.path.abspath(os.path.realpath(td))
+                if td[-1:] in (b"\\", b"/"):
+                    ap += sep
+            except:
+                # maybe cpython #88013 (ok)
+                ap = td
+
+            za.append(ap)
+
+        za = [x if x.startswith(b"\\\\") else b"\\\\?\\" + x for x in za]
+        za = [x.replace(b"/", b"\\") for x in za]
+        tops = za
+
     for top in tops:
         isdir = os.path.isdir(top)
         if top[-1:] == sep:
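The `walkdirs()` change above rewrites each source path into Windows extended-length form when the `VT100` flag is false (which appears to mean "running on Windows"). A hedged, standalone illustration of the same path trick follows; the helper name is mine, not from the diff:

```python
import os

def to_extended_length_path(path: bytes) -> bytes:
    # illustrative helper, not part of the diff: same idea as the
    # walkdirs() hunk above -- prefixing an absolute path with \\?\
    # lifts the legacy 260-character MAX_PATH limit on Windows,
    # provided the path is absolute, backslash-separated, and not
    # already a UNC / extended-length path
    ap = os.path.abspath(os.path.realpath(path))
    ap = ap.replace(b"/", b"\\")
    if not ap.startswith(b"\\\\"):
        ap = b"\\\\?\\" + ap
    return ap
```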
@@ -520,7 +550,11 @@ def handshake(ar, file, search):
     except Exception as ex:
         em = str(ex).split("SSLError(")[-1].split("\nURL: ")[0].strip()
 
-    if sc == 422 or "<pre>partial upload exists at a different" in txt:
+    if (
+        sc == 422
+        or "<pre>partial upload exists at a different" in txt
+        or "<pre>source file busy; please try again" in txt
+    ):
         file.recheck = True
         return [], False
     elif sc == 409 or "<pre>upload rejected, file already exists" in txt:

@@ -552,8 +586,8 @@ def handshake(ar, file, search):
     return r["hash"], r["sprs"]
 
 
-def upload(file, cid, pw):
-    # type: (File, str, str) -> None
+def upload(file, cid, pw, stats):
+    # type: (File, str, str, str) -> None
     """upload one specific chunk, `cid` (a chunk-hash)"""
 
     headers = {

@@ -561,6 +595,10 @@ def upload(file, cid, pw):
         "X-Up2k-Wark": file.wark,
         "Content-Type": "application/octet-stream",
     }
 
+    if stats:
+        headers["X-Up2k-Stat"] = stats
+
     if pw:
         headers["Cookie"] = "=".join(["cppwd", pw])
 

@@ -629,6 +667,8 @@ class Ctl(object):
         req_ses.verify = ar.te
 
         self.filegen = walkdirs([], ar.files)
+        self.recheck = []  # type: list[File]
+
         if ar.safe:
             self._safe()
         else:

@@ -647,11 +687,11 @@ class Ctl(object):
         self.t0 = time.time()
         self.t0_up = None
         self.spd = None
+        self.eta = "99:99:99"
 
         self.mutex = threading.Lock()
         self.q_handshake = Queue()  # type: Queue[File]
         self.q_upload = Queue()  # type: Queue[tuple[File, str]]
-        self.recheck = []  # type: list[File]
 
         self.st_hash = [None, "(idle, starting...)"]  # type: tuple[File, int]
         self.st_up = [None, "(idle, starting...)"]  # type: tuple[File, int]

@@ -693,7 +733,8 @@ class Ctl(object):
             ncs = len(hs)
             for nc, cid in enumerate(hs):
                 print("  {0} up {1}".format(ncs - nc, cid))
-                upload(file, cid, self.ar.a)
+                stats = "{0}/0/0/{1}".format(nf, self.nfiles - nf)
+                upload(file, cid, self.ar.a, stats)
 
             print("  ok!")
             if file.recheck:

@@ -768,12 +809,12 @@ class Ctl(object):
             eta = (self.nbytes - self.up_b) / (spd + 1)
 
             spd = humansize(spd)
-            eta = str(datetime.timedelta(seconds=int(eta)))
+            self.eta = str(datetime.timedelta(seconds=int(eta)))
             sleft = humansize(self.nbytes - self.up_b)
             nleft = self.nfiles - self.up_f
             tail = "\033[K\033[u" if VT100 and not self.ar.ns else "\r"
 
-            t = "{0} eta @ {1}/s, {2}, {3}# left".format(eta, spd, sleft, nleft)
+            t = "{0} eta @ {1}/s, {2}, {3}# left".format(self.eta, spd, sleft, nleft)
             eprint(txt + "\033]0;{0}\033\\\r{0}{1}".format(t, tail))
 
         if not self.recheck:

@@ -811,7 +852,7 @@ class Ctl(object):
             zb += quotep(rd.replace(b"\\", b"/"))
             r = req_ses.get(zb + b"?ls&dots", headers=headers)
             if not r:
-                raise Exception("HTTP {}".format(r.status_code))
+                raise Exception("HTTP {0}".format(r.status_code))
 
             j = r.json()
             for f in j["dirs"] + j["files"]:

@@ -886,6 +927,9 @@ class Ctl(object):
             self.handshaker_busy += 1
 
             upath = file.abs.decode("utf-8", "replace")
+            if not VT100:
+                upath = upath[4:]
+
             hs, sprs = handshake(self.ar, file, search)
             if search:
                 if hs:

@@ -951,9 +995,20 @@ class Ctl(object):
                 self.uploader_busy += 1
                 self.t0_up = self.t0_up or time.time()
 
+            zs = "{0}/{1}/{2}/{3} {4}/{5} {6}"
+            stats = zs.format(
+                self.up_f,
+                len(self.recheck),
+                self.uploader_busy,
+                self.nfiles - self.up_f,
+                int(self.nbytes / (1024 * 1024)),
+                int((self.nbytes - self.up_b) / (1024 * 1024)),
+                self.eta,
+            )
+
             file, cid = task
             try:
-                upload(file, cid, self.ar.a)
+                upload(file, cid, self.ar.a, stats)
             except:
                 eprint("upload failed, retrying: {0} #{1}\n".format(file.name, cid[:8]))
                 # handshake will fix it
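The hunks above make the client attach an `X-Up2k-Stat` header to every chunk upload. The layout is only implied by the format string, so the sketch below is an interpretation; the field names are guesses based on the variable names in the diff:

```python
def fmt_up2k_stat(files_done, rechecks, busy_uploaders, files_left, mib_total, mib_left, eta):
    # same "{0}/{1}/{2}/{3} {4}/{5} {6}" template as in Ctl above;
    # the field meanings are inferred and may not be exact
    return "{0}/{1}/{2}/{3} {4}/{5} {6}".format(
        files_done, rechecks, busy_uploaders, files_left, mib_total, mib_left, eta
    )

# e.g. fmt_up2k_stat(3, 0, 2, 17, 5120, 4100, "0:41:09")
#   -> "3/0/2/17 5120/4100 0:41:09"
# which upload() then sends as:  headers["X-Up2k-Stat"] = stats
```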
@@ -989,8 +1044,13 @@ def main():
     cores = (os.cpu_count() if hasattr(os, "cpu_count") else 0) or 2
     hcores = min(cores, 3)  # 4% faster than 4+ on py3.9 @ r5-4500U
 
+    ver = "{0}, v{1}".format(S_BUILD_DT, S_VERSION)
+    if "--version" in sys.argv:
+        print(ver)
+        return
+
     # fmt: off
-    ap = app = argparse.ArgumentParser(formatter_class=APF, epilog="""
+    ap = app = argparse.ArgumentParser(formatter_class=APF, description="copyparty up2k uploader / filesearch tool, " + ver, epilog="""
 NOTE:
 source file/folder selection uses rsync syntax, meaning that:
   "foo" uploads the entire folder to URL/foo/

@@ -1003,6 +1063,7 @@ source file/folder selection uses rsync syntax, meaning that:
     ap.add_argument("-a", metavar="PASSWORD", help="password or $filepath")
     ap.add_argument("-s", action="store_true", help="file-search (disables upload)")
     ap.add_argument("--ok", action="store_true", help="continue even if some local files are inaccessible")
+    ap.add_argument("--version", action="store_true", help="show version and exit")
 
     ap = app.add_argument_group("compatibility")
     ap.add_argument("--cls", action="store_true", help="clear screen before start")

@@ -1026,7 +1087,16 @@ source file/folder selection uses rsync syntax, meaning that:
     ap.add_argument("-td", action="store_true", help="disable certificate check")
     # fmt: on
 
-    ar = app.parse_args()
+    try:
+        ar = app.parse_args()
+    finally:
+        if EXE and not sys.argv[1:]:
+            print("*** hit enter to exit ***")
+            try:
+                input()
+            except:
+                pass
+
     if ar.drd:
         ar.dr = True
 

@@ -1040,7 +1110,7 @@ source file/folder selection uses rsync syntax, meaning that:
 
     ar.files = [
         os.path.abspath(os.path.realpath(x.encode("utf-8")))
-        + (x[-1:] if x[-1:] == os.sep else "").encode("utf-8")
+        + (x[-1:] if x[-1:] in ("\\", "/") else "").encode("utf-8")
         for x in ar.files
     ]
 

@@ -1050,7 +1120,7 @@ source file/folder selection uses rsync syntax, meaning that:
 
     if ar.a and ar.a.startswith("$"):
         fn = ar.a[1:]
-        print("reading password from file [{}]".format(fn))
+        print("reading password from file [{0}]".format(fn))
        with open(fn, "rb") as f:
            ar.a = f.read().decode("utf-8").strip()
 
@@ -3,7 +3,7 @@
 
 <head>
 	<meta charset="utf-8">
-	<title>⇆🎉 redirect</title>
+	<title>💾🎉 redirect</title>
 	<meta http-equiv="X-UA-Compatible" content="IE=edge">
 	<style>
 
@@ -1,6 +1,6 @@
 # Maintainer: icxes <dev.null@need.moe>
 pkgname=copyparty
-pkgver="1.6.5"
+pkgver="1.6.7"
 pkgrel=1
 pkgdesc="Portable file sharing hub"
 arch=("any")

@@ -26,12 +26,12 @@ source=("${url}/releases/download/v${pkgver}/${pkgname}-sfx.py"
         "https://raw.githubusercontent.com/9001/${pkgname}/v${pkgver}/LICENSE"
 )
 backup=("etc/${pkgname}.d/init" )
-sha256sums=("947d3f191f96f6a9e451bbcb35c5582ba210d81cfdc92dfa9ab0390dbecf26ee"
+sha256sums=("3fb40a631e9decf0073db06aab6fd8d743de91f4ddb82a65164d39d53e0b413f"
            "b8565eba5e64dedba1cf6c7aac7e31c5a731ed7153d6810288a28f00a36c28b2"
            "f65c207e0670f9d78ad2e399bda18d5502ff30d2ac79e0e7fc48e7fbdc39afdc"
            "c4f396b083c9ec02ad50b52412c84d2a82be7f079b2d016e1c9fad22d68285ff"
            "dba701de9fd584405917e923ea1e59dbb249b96ef23bad479cf4e42740b774c8"
-           "746971e95817c54445ce7f9c8406822dffc814cd5eb8113abd36dd472fd677d7"
+           "23054bb206153a1ed34038accaf490b8068f9c856e423c2f2595b148b40c0a0c"
            "cb2ce3d6277bf2f5a82ecf336cc44963bc6490bcf496ffbd75fc9e21abaa75f3"
 )
 
@@ -874,7 +874,7 @@ def add_thumbnail(ap):
     ap2.add_argument("--th-poke", metavar="SEC", type=int, default=300, help="activity labeling cooldown -- avoids doing keepalive pokes (updating the mtime) on thumbnail folders more often than SEC seconds")
     ap2.add_argument("--th-clean", metavar="SEC", type=int, default=43200, help="cleanup interval; 0=disabled")
     ap2.add_argument("--th-maxage", metavar="SEC", type=int, default=604800, help="max folder age -- folders which haven't been poked for longer than --th-poke seconds will get deleted every --th-clean seconds")
-    ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for")
+    ap2.add_argument("--th-covers", metavar="N,N", type=u, default="folder.png,folder.jpg,cover.png,cover.jpg", help="folder thumbnails to stat/look for; case-insensitive if -e2d")
     # https://pillow.readthedocs.io/en/stable/handbook/image-file-formats.html
     # https://github.com/libvips/libvips
     # ffmpeg -hide_banner -demuxers | awk '/^ D /{print$2}' | while IFS= read -r x; do ffmpeg -hide_banner -h demuxer=$x; done | grep -E '^Demuxer |extensions:'
@@ -1,8 +1,8 @@
 # coding: utf-8
 
-VERSION = (1, 6, 6)
+VERSION = (1, 6, 8)
 CODENAME = "cors k"
-BUILD_DT = (2023, 2, 26)
+BUILD_DT = (2023, 3, 12)
 
 S_VERSION = ".".join(map(str, VERSION))
 S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
@@ -218,7 +218,7 @@ class FtpFs(AbstractedFS):
 
     def mkdir(self, path: str) -> None:
         ap = self.rv2a(path, w=True)[0]
-        bos.mkdir(ap)
+        bos.makedirs(ap)  # filezilla expects this
 
     def listdir(self, path: str) -> list[str]:
         vpath = join(self.cwd, path).lstrip("/")
@@ -1714,7 +1714,7 @@ class HttpCli(object):
         except:
             raise Pebkac(500, min_ex())
 
-        x = self.conn.hsrv.broker.ask("up2k.handle_json", body)
+        x = self.conn.hsrv.broker.ask("up2k.handle_json", body, self.u2fh.aps)
         ret = x.get()
         if self.is_vproxied:
             if "purl" in ret:

@@ -1884,17 +1884,10 @@ class HttpCli(object):
             with self.mutex:
                 self.u2fh.close(path)
 
-        # windows cant rename open files
-        if ANYWIN and path != fin_path and not self.args.nw:
-            self.conn.hsrv.broker.ask("up2k.finish_upload", ptop, wark).get()
-
-        if not ANYWIN and not num_left:
-            times = (int(time.time()), int(lastmod))
-            self.log("no more chunks, setting times {}".format(times))
-            try:
-                bos.utime(fin_path, times)
-            except:
-                self.log("failed to utime ({}, {})".format(fin_path, times))
+        if not num_left and not self.args.nw:
+            self.conn.hsrv.broker.ask(
+                "up2k.finish_upload", ptop, wark, self.u2fh.aps
+            ).get()
 
         cinf = self.headers.get("x-up2k-stat", "")
 

@@ -3255,23 +3248,47 @@ class HttpCli(object):
         ):
             raise Pebkac(403)
 
+        e2d = "e2d" in vn.flags
+        e2t = "e2t" in vn.flags
+
         self.html_head = vn.flags.get("html_head", "")
-        if vn.flags.get("norobots"):
+        if vn.flags.get("norobots") or "b" in self.uparam:
             self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
         else:
             self.out_headers.pop("X-Robots-Tag", None)
 
         is_dir = stat.S_ISDIR(st.st_mode)
+        icur = None
+        if e2t or (e2d and is_dir):
+            idx = self.conn.get_u2idx()
+            icur = idx.get_cur(dbv.realpath)
+
         if self.can_read:
             th_fmt = self.uparam.get("th")
             if th_fmt is not None:
                 if is_dir:
-                    for fn in self.args.th_covers.split(","):
-                        fp = os.path.join(abspath, fn)
-                        if bos.path.exists(fp):
-                            vrem = "{}/{}".format(vrem.rstrip("/"), fn).strip("/")
-                            is_dir = False
-                            break
+                    vrem = vrem.rstrip("/")
+                    if icur and vrem:
+                        q = "select fn from cv where rd=? and dn=?"
+                        crd, cdn = vrem.rsplit("/", 1) if "/" in vrem else ("", vrem)
+                        # no mojibake support:
+                        try:
+                            cfn = icur.execute(q, (crd, cdn)).fetchone()
+                            if cfn:
+                                fn = cfn[0]
+                                fp = os.path.join(abspath, fn)
+                                if bos.path.exists(fp):
+                                    vrem = "{}/{}".format(vrem, fn).strip("/")
+                                    is_dir = False
+                        except:
+                            pass
+                    else:
+                        for fn in self.args.th_covers:
+                            fp = os.path.join(abspath, fn)
+                            if bos.path.exists(fp):
+                                vrem = "{}/{}".format(vrem, fn).strip("/")
+                                is_dir = False
+                                break
 
                 if is_dir:
                     return self.tx_ico("a.folder")

@@ -3378,8 +3395,8 @@ class HttpCli(object):
             "taglist": [],
             "srvinf": srv_infot,
             "acct": self.uname,
-            "idx": ("e2d" in vn.flags),
-            "itag": ("e2t" in vn.flags),
+            "idx": e2d,
+            "itag": e2t,
             "lifetime": vn.flags.get("lifetime") or 0,
             "frand": bool(vn.flags.get("rand")),
             "perms": perms,

@@ -3398,8 +3415,8 @@ class HttpCli(object):
             "taglist": [],
             "def_hcols": [],
             "have_emp": self.args.emp,
-            "have_up2k_idx": ("e2d" in vn.flags),
-            "have_tags_idx": ("e2t" in vn.flags),
+            "have_up2k_idx": e2d,
+            "have_tags_idx": e2t,
             "have_acode": (not self.args.no_acode),
             "have_mv": (not self.args.no_mv),
             "have_del": (not self.args.no_del),

@@ -3411,7 +3428,7 @@ class HttpCli(object):
             "url_suf": url_suf,
             "logues": logues,
             "readme": readme,
-            "title": html_escape(self.vpath, crlf=True) or "⇆🎉",
+            "title": html_escape(self.vpath, crlf=True) or "💾🎉",
             "srv_info": srv_infot,
             "dtheme": self.args.theme,
             "themes": self.args.themes,

@@ -3475,11 +3492,6 @@ class HttpCli(object):
         if not self.args.ed or "dots" not in self.uparam:
             ls_names = exclude_dotfiles(ls_names)
 
-        icur = None
-        if "e2t" in vn.flags:
-            idx = self.conn.get_u2idx()
-            icur = idx.get_cur(dbv.realpath)
-
         add_fk = vn.flags.get("fk")
 
         dirs = []
@@ -149,12 +149,9 @@ class SvcHub(object):
             self.log("root", t.format(args.j))
 
         if not args.no_fpool and args.j != 1:
-            t = "WARNING: --use-fpool combined with multithreading is untested and can probably cause undefined behavior"
-            if ANYWIN:
-                t = 'windows cannot do multithreading without --no-fpool, so enabling that -- note that upload performance will suffer if you have microsoft defender "real-time protection" enabled, so you probably want to use -j 1 instead'
-                args.no_fpool = True
-
-            self.log("root", t, c=3)
+            t = "WARNING: ignoring --use-fpool because multithreading (-j{}) is enabled"
+            self.log("root", t.format(args.j), c=3)
+            args.no_fpool = True
 
         bri = "zy"[args.theme % 2 :][:1]
         ch = "abcdefghijklmnopqrstuvwx"[int(args.theme / 2)]

@@ -351,6 +348,8 @@ class SvcHub(object):
         if al.rsp_jtr:
             al.rsp_slp = 0.000001
 
+        al.th_covers = set(al.th_covers.split(","))
+
         return True
 
     def _setlimits(self) -> None:

@@ -405,6 +404,7 @@ class SvcHub(object):
 
     def _setup_logfile(self, printed: str) -> None:
         base_fn = fn = sel_fn = self._logname()
+        do_xz = fn.lower().endswith(".xz")
         if fn != self.args.lo:
             ctr = 0
             # yup this is a race; if started sufficiently concurrently, two

@@ -416,7 +416,7 @@ class SvcHub(object):
             fn = sel_fn
 
         try:
-            if fn.lower().endswith(".xz"):
+            if do_xz:
                 import lzma
 
                 lh = lzma.open(fn, "wt", encoding="utf-8", errors="replace", preset=0)
@@ -73,6 +73,9 @@ if True:  # pylint: disable=using-constant-test
 if TYPE_CHECKING:
     from .svchub import SvcHub
 
+zs = "avif,avifs,bmp,gif,heic,heics,heif,heifs,ico,j2p,j2k,jp2,jpeg,jpg,jpx,png,tga,tif,tiff,webp"
+CV_EXTS = set(zs.split(","))
+
 
 class Dbw(object):
     def __init__(self, c: "sqlite3.Cursor", n: int, t: float) -> None:

@@ -124,6 +127,7 @@ class Up2k(object):
         self.droppable: dict[str, list[str]] = {}
         self.volstate: dict[str, str] = {}
         self.vol_act: dict[str, float] = {}
+        self.busy_aps: set[str] = set()
         self.dupesched: dict[str, list[tuple[str, str, float]]] = {}
         self.snap_persist_interval = 300  # persist unfinished index every 5 min
         self.snap_discard_interval = 21600  # drop unfinished after 6 hours inactivity

@@ -161,12 +165,6 @@ class Up2k(object):
             t = "could not initialize sqlite3, will use in-memory registry only"
             self.log(t, 3)
 
-        if ANYWIN:
-            # usually fails to set lastmod too quickly
-            self.lastmod_q: list[tuple[str, int, tuple[int, int], bool]] = []
-            self.lastmod_q2 = self.lastmod_q[:]
-            Daemon(self._lastmodder, "up2k-lastmod")
-
         self.fstab = Fstab(self.log_func)
         self.gen_fk = self._gen_fk if self.args.log_fk else gen_filekey
 

@@ -463,11 +461,9 @@ class Up2k(object):
         q = "select * from up where substr(w,1,16)=? and +rd=? and +fn=?"
         ups = []
         for wrf in wrfs:
-            try:
-                # almost definitely exists; don't care if it doesn't
-                ups.append(cur.execute(q, wrf).fetchone())
-            except:
-                pass
+            up = cur.execute(q, wrf).fetchone()
+            if up:
+                ups.append(up)
 
         # t1 = time.time()
         # self.log("mapped {} warks in {:.3f} sec".format(len(wrfs), t1 - t0))

@@ -952,6 +948,7 @@ class Up2k(object):
         unreg: list[str] = []
         files: list[tuple[int, int, str]] = []
         fat32 = True
+        cv = ""
 
         assert self.pp and self.mem_cur
         self.pp.msg = "a{} {}".format(self.pp.n, cdir)

@@ -1014,6 +1011,12 @@ class Up2k(object):
                 continue
 
             files.append((sz, lmod, iname))
+            liname = iname.lower()
+            if sz and (
+                iname in self.args.th_covers
+                or (not cv and liname.rsplit(".", 1)[-1] in CV_EXTS)
+            ):
+                cv = iname
 
         # folder of 1000 files = ~1 MiB RAM best-case (tiny filenames);
         # free up stuff we're done with before dhashing

@@ -1026,6 +1029,7 @@ class Up2k(object):
             zh = hashlib.sha1()
             _ = [zh.update(str(x).encode("utf-8", "replace")) for x in files]
 
+            zh.update(cv.encode("utf-8", "replace"))
             zh.update(spack(b"<d", cst.st_mtime))
             dhash = base64.urlsafe_b64encode(zh.digest()[:12]).decode("ascii")
             sql = "select d from dh where d = ? and h = ?"

@@ -1039,6 +1043,18 @@ class Up2k(object):
             if c.fetchone():
                 return ret
 
+        if cv and rd:
+            # mojibake not supported (for performance / simplicity):
+            try:
+                q = "select * from cv where rd=? and dn=? and +fn=?"
+                crd, cdn = rd.rsplit("/", 1) if "/" in rd else ("", rd)
+                if not db.c.execute(q, (crd, cdn, cv)).fetchone():
+                    db.c.execute("delete from cv where rd=? and dn=?", (crd, cdn))
+                    db.c.execute("insert into cv values (?,?,?)", (crd, cdn, cv))
+                    db.n += 1
+            except Exception as ex:
+                self.log("cover {}/{} failed: {}".format(rd, cv, ex), 6)
+
         seen_files = set([x[2] for x in files])  # for dropcheck
         for sz, lmod, fn in files:
             if self.stop:

@@ -1235,6 +1251,19 @@ class Up2k(object):
         if n_rm2:
             self.log("forgetting {} shadowed deleted files".format(n_rm2))
 
+        # then covers
+        n_rm3 = 0
+        q = "delete from cv where rd=? and dn=? and +fn=?"
+        for crd, cdn, fn in cur.execute("select * from cv"):
+            ap = os.path.join(top, crd, cdn, fn)
+            if not bos.path.exists(ap):
+                c2.execute(q, (crd, cdn, fn))
+                n_rm3 += 1
+
+        if n_rm3:
+            self.log("forgetting {} deleted covers".format(n_rm3))
+
+        c2.connection.commit()
         c2.close()
         return n_rm + n_rm2
 

@@ -1387,6 +1416,7 @@ class Up2k(object):
         cur, _ = reg
         self._set_tagscan(cur, True)
         cur.execute("delete from dh")
+        cur.execute("delete from cv")
         cur.connection.commit()
 
     def _set_tagscan(self, cur: "sqlite3.Cursor", need: bool) -> bool:

@@ -1967,6 +1997,7 @@ class Up2k(object):
 
         if ver == DB_VER:
             try:
+                self._add_cv_tab(cur)
                 self._add_xiu_tab(cur)
                 self._add_dhash_tab(cur)
             except:

@@ -2062,6 +2093,7 @@ class Up2k(object):
 
         self._add_dhash_tab(cur)
         self._add_xiu_tab(cur)
+        self._add_cv_tab(cur)
         self.log("created DB at {}".format(db_path))
         return cur
 

@@ -2110,12 +2142,34 @@ class Up2k(object):
 
         cur.connection.commit()
 
+    def _add_cv_tab(self, cur: "sqlite3.Cursor") -> None:
+        # v5b -> v5c
+        try:
+            cur.execute("select rd, dn, fn from cv limit 1").fetchone()
+            return
+        except:
+            pass
+
+        for cmd in [
+            r"create table cv (rd text, dn text, fn text)",
+            r"create index cv_i on cv(rd, dn)",
+        ]:
+            cur.execute(cmd)
+
+        try:
+            cur.execute("delete from dh")
+        except:
+            pass
+
+        cur.connection.commit()
+
     def _job_volchk(self, cj: dict[str, Any]) -> None:
         if not self.register_vpath(cj["ptop"], cj["vcfg"]):
             if cj["ptop"] not in self.registry:
                 raise Pebkac(410, "location unavailable")
 
-    def handle_json(self, cj: dict[str, Any]) -> dict[str, Any]:
+    def handle_json(self, cj: dict[str, Any], busy_aps: set[str]) -> dict[str, Any]:
+        self.busy_aps = busy_aps
         try:
             # bit expensive; 3.9=10x 3.11=2x
             if self.mutex.acquire(timeout=10):
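The `_add_cv_tab()` migration above introduces a small `cv` table that caches, per folder, which file should serve as the folder's cover thumbnail. A rough standalone sketch of how such a table is populated and queried; only the schema and SQL statements are taken from the hunks, the sample rows are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# schema from _add_cv_tab(): parent path, directory name, cover filename
cur.execute("create table cv (rd text, dn text, fn text)")
cur.execute("create index cv_i on cv(rd, dn)")

# indexing pass: remember the cover picked for music/album1 (invented row)
cur.execute("insert into cv values (?,?,?)", ("music", "album1", "folder.jpg"))

# folder-thumbnail request: one indexed lookup instead of stat()ing
# every candidate filename inside the directory
row = cur.execute(
    "select fn from cv where rd=? and dn=?", ("music", "album1")
).fetchone()
print(row[0])  # -> folder.jpg
```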
@@ -2289,6 +2343,22 @@ class Up2k(object):
                 else:
                     # symlink to the client-provided name,
                     # returning the previous upload info
+                    psrc = src + ".PARTIAL"
+                    if self.args.dotpart:
+                        m = re.match(r"(.*[\\/])(.*)", psrc)
+                        if m:  # always true but...
+                            zs1, zs2 = m.groups()
+                            psrc = zs1 + "." + zs2
+
+                    if (
+                        src in self.busy_aps
+                        or psrc in self.busy_aps
+                        or (wark in reg and "done" not in reg[wark])
+                    ):
+                        raise Pebkac(
+                            422, "source file busy; please try again later"
+                        )
+
                 job = deepcopy(job)
                 job["wark"] = wark
                 job["at"] = cj.get("at") or time.time()

@@ -2332,7 +2402,7 @@ class Up2k(object):
                     if not n4g:
                         raise
 
-            if cur:
+            if cur and not self.args.nw:
                 zs = "prel name lmod size ptop vtop wark host user addr at"
                 a = [job[x] for x in zs.split()]
                 self.db_add(cur, vfs.flags, *a)

@@ -2507,10 +2577,7 @@ class Up2k(object):
 
         if lmod and (not linked or SYMTIME):
             times = (int(time.time()), int(lmod))
-            if ANYWIN:
-                self.lastmod_q.append((dst, 0, times, False))
-            else:
-                bos.utime(dst, times, False)
+            bos.utime(dst, times, False)
 
     def handle_chunk(
         self, ptop: str, wark: str, chash: str

@@ -2591,13 +2658,10 @@ class Up2k(object):
                 self.regdrop(ptop, wark)
                 return ret, dst
 
-        # windows cant rename open files
-        if not ANYWIN or src == dst:
-            self._finish_upload(ptop, wark)
-
         return ret, dst
 
-    def finish_upload(self, ptop: str, wark: str) -> None:
+    def finish_upload(self, ptop: str, wark: str, busy_aps: set[str]) -> None:
+        self.busy_aps = busy_aps
         with self.mutex:
             self._finish_upload(ptop, wark)
 

@@ -2610,6 +2674,10 @@ class Up2k(object):
         except Exception as ex:
             raise Pebkac(500, "finish_upload, wark, " + repr(ex))
 
+        if job["need"]:
+            t = "finish_upload {} with remaining chunks {}"
+            raise Pebkac(500, t.format(wark, job["need"]))
+
         # self.log("--- " + wark + "  " + dst + "  finish_upload atomic " + dst, 4)
         atomic_move(src, dst)
 

@@ -2617,14 +2685,15 @@ class Up2k(object):
         vflags = self.flags[ptop]
 
         times = (int(time.time()), int(job["lmod"]))
-        if ANYWIN:
-            z1 = (dst, job["size"], times, job["sprs"])
-            self.lastmod_q.append(z1)
-        elif not job["hash"]:
-            try:
-                bos.utime(dst, times)
-            except:
-                pass
+        self.log(
+            "no more chunks, setting times {} ({}) on {}".format(
+                times, bos.path.getsize(dst), dst
+            )
+        )
+        try:
+            bos.utime(dst, times)
+        except:
+            self.log("failed to utime ({}, {})".format(dst, times))
 
         zs = "prel name lmod size ptop vtop wark host user addr"
         z2 = [job[x] for x in zs.split()]

@@ -2645,6 +2714,7 @@ class Up2k(object):
         if self.idx_wark(vflags, *z2):
             del self.registry[ptop][wark]
         else:
+            self.registry[ptop][wark]["done"] = 1
             self.regdrop(ptop, wark)
 
         if wake_sr:

@@ -2814,6 +2884,16 @@ class Up2k(object):
             with self.rescan_cond:
                 self.rescan_cond.notify_all()
 
+            if rd and sz and fn.lower() in self.args.th_covers:
+                # wasteful; db_add will re-index actual covers
+                # but that won't catch existing files
+                crd, cdn = rd.rsplit("/", 1) if "/" in rd else ("", rd)
+                try:
+                    db.execute("delete from cv where rd=? and dn=?", (crd, cdn))
+                    db.execute("insert into cv values (?,?,?)", (crd, cdn, fn))
+                except:
+                    pass
+
     def handle_rm(self, uname: str, ip: str, vpaths: list[str], lim: list[int]) -> str:
         n_files = 0
         ok = {}

@@ -3428,27 +3508,6 @@ class Up2k(object):
         if not job["hash"]:
             self._finish_upload(job["ptop"], job["wark"])
 
-    def _lastmodder(self) -> None:
-        while True:
-            ready = self.lastmod_q2
-            self.lastmod_q2 = self.lastmod_q
-            self.lastmod_q = []
-
-            time.sleep(1)
-            for path, sz, times, sparse in ready:
-                self.log("lmod: setting times {} on {}".format(times, path))
-                try:
-                    bos.utime(path, times, False)
-                except:
-                    t = "lmod: failed to utime ({}, {}):\n{}"
-                    self.log(t.format(path, times, min_ex()))
-
-                if sparse and self.args.sparse and self.args.sparse * 1024 * 1024 <= sz:
-                    try:
-                        sp.check_call(["fsutil", "sparse", "setflag", path, "0"])
-                    except:
-                        self.log("could not unsparse [{}]".format(path), 3)
-
     def _snapshot(self) -> None:
         slp = self.snap_persist_interval
         while True:
@@ -668,6 +668,7 @@ class FHC(object):
 
     def __init__(self) -> None:
         self.cache: dict[str, FHC.CE] = {}
+        self.aps: set[str] = set()
 
     def close(self, path: str) -> None:
         try:

@@ -679,6 +680,7 @@ class FHC(object):
             fh.close()
 
         del self.cache[path]
+        self.aps.remove(path)
 
     def clean(self) -> None:
         if not self.cache:

@@ -699,6 +701,7 @@ class FHC(object):
         return self.cache[path].fhs.pop()
 
     def put(self, path: str, fh: typing.BinaryIO) -> None:
+        self.aps.add(path)
        try:
            ce = self.cache[path]
            ce.fhs.append(fh)
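Combined with the `busy_aps` plumbing in the earlier hunks, the intent appears to be that the file-handle cache now also exposes the set of absolute paths which currently have open write handles, so the up2k logic can answer 422 "source file busy" instead of deduplicating against a half-written upload. A toy sketch of that pattern; the class and method names here are illustrative only, not the project's API:

```python
from __future__ import annotations

class HandleCache:
    """Toy illustration: pool open write-handles per path and expose
    the set of busy paths for other components to consult."""

    def __init__(self) -> None:
        self.cache: dict[str, list] = {}
        self.aps: set[str] = set()  # absolute paths with open handles

    def put(self, path: str, fh) -> None:
        self.aps.add(path)
        self.cache.setdefault(path, []).append(fh)

    def close(self, path: str) -> None:
        for fh in self.cache.pop(path, []):
            fh.close()
        self.aps.discard(path)

# a dedup / indexing job would then skip anything in cache.aps, which is
# what handle_json() and finish_upload() use busy_aps for in the hunks above
```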
@@ -93,6 +93,7 @@
 	--g-fsel-bg: #d39;
 	--g-fsel-b1: #f4a;
 	--g-fsel-ts: #804;
+	--g-dfg: var(--srv-3);
 	--g-fg: var(--a-hil);
 	--g-bg: var(--bg-u2);
 	--g-b1: var(--bg-u4);

@@ -327,6 +328,7 @@ html.c {
 }
 html.cz {
 	--bgg: var(--bg-u2);
+	--srv-3: #fff;
 }
 html.cy {
 	--fg: #fff;

@@ -354,6 +356,7 @@ html.cy {
 	--chk-fg: #fd0;
 
 	--srv-1: #f00;
+	--srv-3: #fff;
 	--op-aa-bg: #fff;
 
 	--u2-b1-bg: #f00;

@@ -964,6 +967,9 @@ html.y #path a:hover {
 #ggrid>a.dir:before {
 	content: '📂';
 }
+#ggrid>a.dir>span {
+	color: var(--g-dfg);
+}
 #ggrid>a.au:before {
 	content: '💾';
 }

@@ -1010,6 +1016,9 @@ html.np_open #ggrid>a.au:before {
 	background: var(--g-sel-bg);
 	border-color: var(--g-sel-b1);
 }
+#ggrid>a.sel>span {
+	color: var(--g-sel-fg);
+}
 #ggrid>a.sel,
 #ggrid>a[tt].sel {
 	border-top: 1px solid var(--g-fsel-b1);
@@ -259,6 +259,10 @@ var Ls = {
|
|||||||
"mm_e404": "Could not play audio; error 404: File not found.",
|
"mm_e404": "Could not play audio; error 404: File not found.",
|
||||||
"mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
|
"mm_e403": "Could not play audio; error 403: Access denied.\n\nTry pressing F5 to reload, maybe you got logged out",
|
||||||
"mm_e5xx": "Could not play audio; server error ",
|
"mm_e5xx": "Could not play audio; server error ",
|
||||||
|
"mm_nof": "not finding any more audio files nearby",
|
||||||
|
"mm_hnf": "that song no longer exists",
|
||||||
|
|
||||||
|
"im_hnf": "that image no longer exists",
|
||||||
|
|
||||||
"f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
|
"f_chide": 'this will hide the column «{0}»\n\nyou can unhide columns in the settings tab',
|
||||||
"f_bigtxt": "this file is {0} MiB large -- really view as text?",
|
"f_bigtxt": "this file is {0} MiB large -- really view as text?",
|
||||||
@@ -714,6 +718,10 @@ var Ls = {
|
|||||||
"mm_e404": "Avspilling feilet: Fil ikke funnet.",
|
"mm_e404": "Avspilling feilet: Fil ikke funnet.",
|
||||||
"mm_e403": "Avspilling feilet: Tilgang nektet.\n\nKanskje du ble logget ut?\nPrøv å trykk F5 for å laste siden på nytt.",
|
"mm_e403": "Avspilling feilet: Tilgang nektet.\n\nKanskje du ble logget ut?\nPrøv å trykk F5 for å laste siden på nytt.",
|
||||||
"mm_e5xx": "Avspilling feilet: ",
|
"mm_e5xx": "Avspilling feilet: ",
|
||||||
|
"mm_nof": "finner ikke flere sanger i nærheten",
|
||||||
|
"mm_hnf": "sangen finnes ikke lenger",
|
||||||
|
|
||||||
|
"im_hnf": "bildet finnes ikke lenger",
|
||||||
|
|
||||||
"f_chide": 'dette vil skjule kolonnen «{0}»\n\nfanen for "andre innstillinger" lar deg vise kolonnen igjen',
|
"f_chide": 'dette vil skjule kolonnen «{0}»\n\nfanen for "andre innstillinger" lar deg vise kolonnen igjen',
|
||||||
"f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
|
"f_bigtxt": "denne filen er hele {0} MiB -- vis som tekst?",
|
||||||
@@ -1312,6 +1320,7 @@ var mpl = (function () {
|
|||||||
var r = {
|
var r = {
|
||||||
"pb_mode": (sread('pb_mode') || 'next').split('-')[0],
|
"pb_mode": (sread('pb_mode') || 'next').split('-')[0],
|
||||||
"os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
|
"os_ctl": bcfg_get('au_os_ctl', have_mctl) && have_mctl,
|
||||||
|
'traversals': 0,
|
||||||
};
|
};
|
||||||
bcfg_bind(r, 'preload', 'au_preload', true);
|
bcfg_bind(r, 'preload', 'au_preload', true);
|
||||||
bcfg_bind(r, 'fullpre', 'au_fullpre', false);
|
bcfg_bind(r, 'fullpre', 'au_fullpre', false);
|
||||||
@@ -2093,7 +2102,15 @@ function song_skip(n) {
 }
 function next_song(e) {
     ev(e);
-    return song_skip(1);
+    if (mp.order.length) {
+        mpl.traversals = 0;
+        return song_skip(1);
+    }
+    if (mpl.traversals++ < 5) {
+        treectl.ls_cb = next_song;
+        return tree_neigh(1);
+    }
+    toast.inf(10, L.mm_nof);
 }
 function prev_song(e) {
     ev(e);
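The interesting part of this hunk is the new `next_song`: instead of giving up at the end of the current folder, it now hops to a neighboring folder in the tree (`tree_neigh(1)`), re-runs itself once that listing has loaded (`treectl.ls_cb = next_song`), and only shows the new `mm_nof` toast after five fruitless hops. A self-contained sketch of the same bounded-retry idea; the `folders`, `idx` and `ls` names below are made up for illustration and are not copyparty internals:

```js
// Illustrative sketch only -- not the copyparty implementation.
// "folders" stands in for the directory tree; ls(cb) simulates an async listing.
var folders = [[], [], ['song3.mp3']],  // first two folders have no audio
    idx = 0,
    traversals = 0;

function ls(cb) { setTimeout(cb, 10); }  // pretend-async folder listing

function nextSong() {
    var tracks = folders[idx] || [];
    if (tracks.length) {
        traversals = 0;                  // found music; reset the hop budget
        return console.log('playing', tracks[0]);
    }
    if (traversals++ < 5) {
        idx++;                           // hop to the neighboring folder...
        return ls(nextSong);             // ...and retry once its listing "arrives"
    }
    console.log('not finding any more audio files nearby');
}

nextSong();
```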
@@ -2582,7 +2599,7 @@ function play(tid, is_ev, seek) {
     if ((tn + '').indexOf('f-') === 0) {
         tn = mp.order.indexOf(tn);
         if (tn < 0)
-            return;
+            return toast.warn(10, L.mm_hnf);
     }
 
     if (tn >= mp.order.length) {
@@ -2850,6 +2867,9 @@ function eval_hash() {
         clearInterval(t);
         baguetteBox.urltime(ts);
         var im = QS('#ggrid a[ref="' + id + '"]');
+        if (!im)
+            return toast.warn(10, L.im_hnf);
+
         im.click();
         im.scrollIntoView();
     }, 50);
@@ -64,6 +64,11 @@
 yum install davfs2
 {% if accs %}printf '%s\n' <b>{{ pw }}</b> k | {% endif %}mount -t davfs -ouid=1000 http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b>
 </pre>
+<p>make it automount on boot:</p>
+<pre>
+printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>{{ pw }}</b> k" >> /etc/davfs2/secrets
+printf '%s\n' "http{{ s }}://{{ ep }}/{{ rvp }} <b>mp</b> davfs rw,user,uid=1000,noauto 0 0" >> /etc/fstab
+</pre>
 <p>or you can use rclone instead, which is much slower but doesn't require root:</p>
 <pre>
 rclone config create {{ aname }}-dav webdav url=http{{ s }}://{{ rip }}{{ hport }} vendor=other{% if accs %} user=k pass=<b>{{ pw }}</b>{% endif %}
@@ -114,10 +114,10 @@ function up2k_flagbus() {
             do_take(now);
             return;
         }
-        if (flag.owner && now - flag.owner[1] > 5000) {
+        if (flag.owner && now - flag.owner[1] > 12000) {
             flag.owner = null;
         }
-        if (flag.wants && now - flag.wants[1] > 5000) {
+        if (flag.wants && now - flag.wants[1] > 12000) {
             flag.wants = null;
         }
         if (!flag.owner && !flag.wants) {
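Context for the 5000 → 12000 change: the up2k "flag bus" backs the `[💤]` upload-baton, letting multiple open tabs agree that only one of them hashes and uploads at a time. The owning tab keeps refreshing a timestamp, and the other tabs only steal the baton once that stamp looks stale; since chrome can freeze a background tab for about 10 seconds, the old 5-second threshold made the baton look abandoned (the `[💤]` bug in the changelog further down), so the staleness limit is raised to 12 seconds. A much-simplified sketch of that kind of stale-owner takeover using plain localStorage, not the actual bus implementation:

```js
// Simplified cross-tab baton sketch (assumption: plain localStorage, not the real up2k bus).
var TAB_ID = Math.random().toString(36).slice(2),
    STALE_MS = 12000;  // matches the new threshold in the diff above

function tryTakeBaton() {
    var raw = localStorage.getItem('upload-baton'),
        owner = raw ? JSON.parse(raw) : null,
        now = Date.now();

    if (owner && owner.id !== TAB_ID && now - owner.ts < STALE_MS)
        return false;  // someone else holds it and is still heartbeating

    // either unclaimed, ours already, or stale -- (re)claim it
    localStorage.setItem('upload-baton', JSON.stringify({ id: TAB_ID, ts: now }));
    return true;
}

// heartbeat while we hold it; other tabs poll tryTakeBaton() on the same interval
setInterval(tryTakeBaton, 1000);
```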
@@ -772,6 +772,7 @@ function fsearch_explain(n) {
 
 function up2k_init(subtle) {
     var r = {
+        "tact": Date.now(),
         "init_deps": init_deps,
         "set_fsearch": set_fsearch,
         "gotallfiles": [gotallfiles] // hooks
@@ -1647,8 +1648,14 @@ function up2k_init(subtle) {
         running = true;
         while (true) {
             var now = Date.now(),
+                blocktime = now - r.tact,
                 is_busy = st.car < st.files.length;
 
+            if (blocktime > 2500)
+                console.log('main thread blocked for ' + blocktime);
+
+            r.tact = now;
+
             if (was_busy && !is_busy) {
                 for (var a = 0; a < st.files.length; a++) {
                     var t = st.files[a];
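`r.tact` records when the scheduler loop last got to run, so comparing it against the current time on the next pass reveals that the main thread was blocked (e.g. by hashing a huge chunk) and for how long. The same watchdog pattern in isolation, as a rough sketch reusing the thresholds from the diff:

```js
// Stand-alone main-thread stall watchdog (illustrative sketch).
var tact = Date.now();

setInterval(function () {
    var now = Date.now(),
        blocktime = now - tact;

    // with a 100ms interval, a gap above 2.5s means something blocked the event loop
    if (blocktime > 2500)
        console.log('main thread blocked for ' + blocktime);

    tact = now;
}, 100);
```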
@@ -1788,6 +1795,15 @@ function up2k_init(subtle) {
     })();
 
     function uptoast() {
+        if (st.busy.handshake.length)
+            return;
+
+        for (var a = 0; a < st.files.length; a++) {
+            var t = st.files[a];
+            if (t.want_recheck && !t.rechecks)
+                return;
+        }
+
         var sr = uc.fsearch,
             ok = pvis.ctr.ok,
             ng = pvis.ctr.ng,
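This is the flickering-toast fix mentioned in the changelog below: the summary toast now waits until no handshakes are in flight and no file is still marked for an automatic recheck, so a file that bounced with a temporary error no longer produces a premature failure/success toast. The guard boils down to something like this sketch (the `st` shape is simplified):

```js
// Sketch of the "hold the summary toast until the queue is truly drained" guard.
function uptoastReady(st) {
    if (st.busy.handshake.length)
        return false;                        // still talking to the server

    for (var a = 0; a < st.files.length; a++)
        if (st.files[a].want_recheck && !st.files[a].rechecks)
            return false;                    // a retry is scheduled but has not run yet

    return true;
}

console.log(uptoastReady({ busy: { handshake: [] }, files: [{ want_recheck: false }] }));  // true
```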
@@ -2043,6 +2059,8 @@ function up2k_init(subtle) {
             nbusy++;
             reading++;
             nchunk++;
+            if (Date.now() - up2k.tact > 1500)
+                tasker();
         }
 
         function onmsg(d) {
@@ -2373,16 +2391,17 @@ function up2k_init(subtle) {
         }
 
         var err_pend = rsp.indexOf('partial upload exists at a different') + 1,
+            err_srcb = rsp.indexOf('source file busy; please try again') + 1,
             err_plug = rsp.indexOf('upload blocked by x') + 1,
             err_dupe = rsp.indexOf('upload rejected, file already exists') + 1;
 
-        if (err_pend || err_plug || err_dupe) {
+        if (err_pend || err_srcb || err_plug || err_dupe) {
             err = rsp;
             ofs = err.indexOf('\n/');
             if (ofs !== -1) {
                 err = err.slice(0, ofs + 1) + linksplit(err.slice(ofs + 2).trimEnd()).join(' ');
             }
-            if (!t.rechecks && err_pend) {
+            if (!t.rechecks && (err_pend || err_srcb)) {
                 t.rechecks = 0;
                 t.want_recheck = true;
             }
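Together with the `uptoast` guard above, this makes "source file busy; please try again" behave like the existing partial-upload case: the file is flagged `want_recheck` and retried quietly instead of being reported as a failure. A stripped-down sketch of the response triage, with the logic simplified to three buckets (the real handler keeps more distinctions):

```js
// Rough sketch of the handshake error triage (strings follow the diff, logic simplified).
function classify(rsp) {
    var retryable = [
        'partial upload exists at a different',
        'source file busy; please try again'
    ];
    for (var a = 0; a < retryable.length; a++)
        if (rsp.indexOf(retryable[a]) + 1)
            return 'recheck';   // flag want_recheck and try again later

    if (rsp.indexOf('upload rejected, file already exists') + 1)
        return 'dupe';          // permanent; surface the message as-is

    return 'fatal';
}

console.log(classify('409: source file busy; please try again later'));  // "recheck"
```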
@@ -112,12 +112,13 @@ if ((document.location + '').indexOf(',rej,') + 1)
 
 try {
     console.hist = [];
+    var CMAXHIST = 100;
     var hook = function (t) {
         var orig = console[t].bind(console),
             cfun = function () {
                 console.hist.push(Date.now() + ' ' + t + ': ' + Array.from(arguments).join(', '));
-                if (console.hist.length > 100)
-                    console.hist = console.hist.slice(50);
+                if (console.hist.length > CMAXHIST)
+                    console.hist = console.hist.slice(CMAXHIST / 2);
 
                 orig.apply(console, arguments);
             };
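The console hook feeds copyparty's crash-reporter, so it keeps a rolling history of everything logged; the cap is now the named constant `CMAXHIST` instead of two magic numbers. The hook pattern on its own, as a minimal sketch without the surrounding error handling:

```js
// Minimal console-history hook with a rolling cap (illustrative sketch).
console.hist = [];
var CMAXHIST = 100;

['log', 'warn', 'error'].forEach(function (t) {
    var orig = console[t].bind(console);
    console[t] = function () {
        console.hist.push(Date.now() + ' ' + t + ': ' + Array.from(arguments).join(', '));
        if (console.hist.length > CMAXHIST)
            console.hist = console.hist.slice(CMAXHIST / 2);  // drop the older half
        orig.apply(console, arguments);
    };
});
```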
@@ -1,3 +1,80 @@
+▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
+# 2023-0305-2018 `v1.6.7` fix no-dedup + add up2k.exe
+
+## new features
+* controlpanel-connect: add example for webdav automount
+
+## bugfixes
+* fix a race which, in worst case (but unlikely on linux), **could cause data loss**
+  * could only happen if `--no-dedup` or volflag `copydupes` was set (**not** default)
+  * if two identical files were uploaded at the same time, there was a small chance that one of the files would become empty
+  * check if you were affected by doing a search for zero-byte files using either of the following:
+    * https://127.0.0.1:3923/#q=size%20%3D%200
+    * `find -type f -size 0`
+  * let me know if you lost something important and had logging enabled!
+* ftp: mkdir can do multiple levels at once (support filezilla)
+* fix flickering toast on upload finish
+  * `[💤]` (upload-baton) could disengage if chrome decides to pause the background tab for 10sec (which it sometimes does)
+
+----
+
+## introducing [up2k.exe](https://github.com/9001/copyparty/releases/latest/download/up2k.exe)
+
+the commandline up2k upload / filesearch client, now as a standalone windows exe
+* based on python 3.7 so it runs on 32bit windows7 or anything newer
+* *no https support* (saves space + the python3.7 openssl is getting old)
+* built from b39ff92f34e3fca389c78109d20d5454af761f8e so it can do long filepaths and mojibake
+
+----
+
+⭐️ **you probably want [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) below;**
+the exe is [not recommended](https://github.com/9001/copyparty#copypartyexe) for longterm use
+and the zip and tar.gz files are source code
+(python packages are available at [PyPI](https://pypi.org/project/copyparty/#files))
+
+
+
+▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
+# 2023-0226-2030 `v1.6.6` r 2 0 0
+
+two hundred releases wow
+* read-only demo server at https://a.ocv.me/pub/demo/
+* [docker image](https://github.com/9001/copyparty/tree/hovudstraum/scripts/docker) ╱ [similar software](https://github.com/9001/copyparty/blob/hovudstraum/docs/versus.md) ╱ [client testbed](https://cd.ocv.me/b/)
+* currently fighting a ground fault so the demo server will be unreliable for a while
+
+## new features
+* more docker containers! now runs on x64, x32, aarch64, armhf, ppc64, s390x
+  * pls let me know if you actually run copyparty on an IBM mainframe 👍
+* new [event hook](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks) type `xiu` runs just once for all recent uploads
+  * example hook [xiu-sha.py](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/xiu-sha.py) generates sha512 checksum files
+* new arg `--rsp-jtr` simulates connection jitter
+* copyparty.exe integrity selftest
+* ux:
+  * return to previous page after logging in
+  * show a warning on the login page if you're not using https
+* freebsd: detect `fetch` and return the [colorful sortable plaintext](https://user-images.githubusercontent.com/241032/215322619-ea5fd606-3654-40ad-94ee-2bc058647bb2.png) listing
+
+## bugfixes
+* permit replacing empty files only during a `--blank-wt` grace period
+* lifetimes: keep upload-time when a size/mtime change triggers a reindex
+* during cleanup after an unlink, never rmdir the entire volume
+* rescan button in the controlpanel required volumes to be e2ds
+* dupes could get indexed with the wrong mtime
+  * only affected the search index; the filesystem got the right one
+* ux: search results could include the same hit twice in case of overlapping volumes
+* ux: upload UI would remain expanded permanently after visiting a huge tab
+* ftp: return proper error messages when client does something illegal
+* ie11: support the back button
+
+## other changes
+* [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) replaces copyparty64.exe -- now built for 64-bit windows 10
+  * **on win10 it just works** -- on win8 it needs [vc redist 2015](https://www.microsoft.com/en-us/download/details.aspx?id=48145) -- no win7 support
+  * has the latest security patches, but sfx.py is still better for long-term use
+  * has pillow and mutagen; can make thumbnails and parse/index media
+* [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) is the old win7-compatible, dangerously-insecure edition
+
+
+
 ▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
 # 2023-0212-1411 `v1.6.5` windows smb fix + win10.exe
 
@@ -43,11 +43,11 @@ filt=
 [ $purge ] && filt='NR>1{print$3}'
 [ $filt ] && {
     [ $purge ] && {
-        podman kill $(podman ps -q)
-        podman rm $(podman ps -qa)
+        podman kill $(podman ps -q) || true
+        podman rm $(podman ps -qa) || true
     }
     podman rmi -f $(podman images -a --history | awk "$filt") || true
-    podman rmi $(podman images -a --history | awk '/^<none>.*<none>.*-tmp:/{print$3}')
+    podman rmi $(podman images -a --history | awk '/^<none>.*<none>.*-tmp:/{print$3}') || true
 }
 
 [ $pull ] && {
@@ -1,7 +1,9 @@
-builds a fully standalone copyparty.exe compatible with 32bit win7-sp1 and later
+builds copyparty32.exe, fully standalone, compatible with 32bit win7-sp1 and later
 
 requires a win7 vm which has never been connected to the internet and a host-only network with the linux host at 192.168.123.1
 
+copyparty.exe is built by a win10-ltsc-2021 vm with similar setup
+
 first-time setup steps in notes.txt
 
 run build.sh in the vm to fetch src + compile + push a new exe to the linux host for manual publishing
@@ -9,6 +9,8 @@ tee build2.sh | cmp build.sh && rm build2.sh || {
     [[ $r =~ [yY] ]] && mv build{2,}.sh && exec ./build.sh
 }
 
+./up2k.sh
+
 uname -s | grep WOW64 && m= || m=32
 uname -s | grep NT-10 && w10=1 || w7=1
 [ $w7 ] && pyv=37 || pyv=311
@@ -57,6 +59,8 @@ read a b c d _ < <(
     sed -r 's/[^0-9]+//;s/[" )]//g;s/[-,]/ /g;s/$/ 0/'
 )
 sed -r 's/1,2,3,0/'$a,$b,$c,$d'/;s/1\.2\.3/'$a.$b.$c/ <loader.rc >loader.rc2
+[ $m ] &&
+    sed -ri 's/copyparty.exe/copyparty32.exe/' loader.rc2
 
 excl=(
     copyparty.broker_mp
@@ -7,6 +7,12 @@ adf0d23a98da38056de25e07e68921739173efc70fb9bf3f68d8c7c3d0d092e09efa69d35c0c9ecc
 132a5380f33a245f2e744413a0e1090bc42b7356376de5121397cec5976b04b79f7c9ebe28af222c9c7b01461f7d7920810d220e337694727e0d7cd9e91fa667 pywin32_ctypes-0.2.0-py2.py3-none-any.whl
 3c5adf0a36516d284a2ede363051edc1bcc9df925c5a8a9fa2e03cab579dd8d847fdad42f7fd5ba35992e08234c97d2dbfec40a9d12eec61c8dc03758f2bd88e typing_extensions-4.4.0-py3-none-any.whl
 4b6e9ae967a769fe32be8cf0bc0d5a213b138d1e0344e97656d08a3d15578d81c06c45b334c872009db2db8f39db0c77c94ff6c35168d5e13801917667c08678 upx-4.0.2-win32.zip
+# up2k (win7)
+a7d259277af4948bf960682bc9fb45a44b9ae9a19763c8a7c313cef4aa9ec2d447d843e4a7c409e9312c8c8f863a24487a8ee4ffa6891e9b1c4e111bb4723861 certifi-2022.12.7-py3-none-any.whl
+2822c0dae180b1c8cfb7a70c8c00bad62af9afdbb18b656236680def9d3f1fcdcb8ef5eb64fc3b4c934385cd175ad5992a2284bcba78a243130de75b2d1650db charset_normalizer-3.1.0-cp37-cp37m-win32.whl
+ffdd45326f4e91c02714f7a944cbcc2fdd09299f709cfa8aec0892053eef0134fb80d9ba3790afd319538a86feb619037cbf533e2f5939cb56b35bb17f56c858 idna-3.4-py3-none-any.whl
+220e0e122d5851aaccf633224dd7fbd3ba8c8d2720944d8019d6a276ed818d83e3426fe21807f22d673b5428f19fcf9a6b4e645f69bbecd967c568bb6aeb7c8d requests-2.28.2-py3-none-any.whl
+8770011f4ad1fe40a3062e6cdf1fda431530c59ee7de3fc5f8c57db54bfdb71c3aa220ca0e0bb1874fc6700e9ebb57defbae54ac84938bc9ad8f074910106681 urllib3-1.26.14-py2.py3-none-any.whl
 # win7
 91c025f7d94bcdf93df838fab67053165a414fc84e8496f92ecbb910dd55f6b6af5e360bbd051444066880c5a6877e75157bd95e150ead46e5c605930dfc50f2 future-0.18.2.tar.gz
 c06b3295d1d0b0f0a6f9a6cd0be861b9b643b4a5ea37857f0bd41c45deaf27bb927b71922dab74e633e43d75d04a9bd0d1c4ad875569740b0f2a98dd2bfa5113 importlib_metadata-5.0.0-py3-none-any.whl
@@ -13,6 +13,13 @@ https://pypi.org/project/MarkupSafe/#files
 https://pypi.org/project/mutagen/#files
 https://pypi.org/project/Pillow/#files
 
+# up2k (win7) additionals
+https://pypi.org/project/certifi/#files
+https://pypi.org/project/charset-normalizer/#files # cp37-cp37m-win32.whl
+https://pypi.org/project/idna/#files
+https://pypi.org/project/requests/#files
+https://pypi.org/project/urllib3/#files
+
 # win7 additionals
 https://pypi.org/project/future/#files
 https://pypi.org/project/importlib-metadata/#files
@@ -1,8 +1,10 @@
 #!/bin/bash
 set -e
 
+genico() {
+
 # imagemagick png compression is broken, use pillow instead
-convert ~/AndroidStudioProjects/PartyUP/metadata/en-US/images/icon.png a.bmp
+convert $1 a.bmp
 
 #convert a.bmp -trim -resize '48x48!' -strip a.png
 python3 <<'EOF'
@@ -17,11 +19,15 @@ EOF
 pngquant --strip --quality 30 a.png
 mv a-*.png a.png
 
-python3 <<'EOF'
+python3 <<EOF
 from PIL import Image
-Image.open('a.png').save('loader.ico',sizes=[(48,48)])
+Image.open('a.png').save('$2',sizes=[(48,48)])
 EOF
 
 rm a.{bmp,png}
+}
+
+
+genico ~/AndroidStudioProjects/PartyUP/metadata/en-US/images/icon.png loader.ico
+genico https://raw.githubusercontent.com/googlefonts/noto-emoji/main/png/512/emoji_u1f680.png up2k.ico
 ls -al
-exit 0
@@ -16,7 +16,7 @@ VSVersionInfo(
     StringTable(
       '000004b0',
       [StringStruct('CompanyName', 'ocv.me'),
-      StringStruct('FileDescription', 'copyparty'),
+      StringStruct('FileDescription', 'copyparty file server'),
       StringStruct('FileVersion', '1.2.3'),
       StringStruct('InternalName', 'copyparty'),
       StringStruct('LegalCopyright', '2019, ed'),
@@ -27,6 +27,13 @@ fns=(
     Pillow-9.4.0-cp311-cp311-win_amd64.whl
     python-3.11.2-amd64.exe
 }
+[ $w7 ] && fns+=(
+    certifi-2022.12.7-py3-none-any.whl
+    chardet-5.1.0-py3-none-any.whl
+    idna-3.4-py3-none-any.whl
+    requests-2.28.2-py3-none-any.whl
+    urllib3-1.26.14-py2.py3-none-any.whl
+)
 [ $w7 ] && fns+=(
     future-0.18.2.tar.gz
     importlib_metadata-5.0.0-py3-none-any.whl
@@ -58,28 +65,32 @@ manually install:
 
 ===[ copy-paste into git-bash ]================================
 uname -s | grep NT-10 && w10=1 || w7=1
+[ $w7 ] && pyv=37 || pyv=311
+appd=$(cygpath.exe "$APPDATA")
 cd ~/Downloads &&
 unzip upx-*-win32.zip &&
 mv upx-*/upx.exe . &&
 python -m ensurepip &&
 python -m pip install --user -U pip-*.whl &&
 { [ $w7 ] || python -m pip install --user -U mutagen-*.whl Pillow-*.whl; } &&
+{ [ $w10 ] || python -m pip install --user -U {requests,urllib3,charset_normalizer,certifi,idna}-*.whl; } &&
 { [ $w10 ] || python -m pip install --user -U future-*.tar.gz importlib_metadata-*.whl typing_extensions-*.whl zipp-*.whl; } &&
 python -m pip install --user -U pyinstaller-*.whl pefile-*.whl pywin32_ctypes-*.whl pyinstaller_hooks_contrib-*.whl altgraph-*.whl &&
+sed -ri 's/--lzma/--best/' $appd/Python/Python$pyv/site-packages/pyinstaller/building/utils.py &&
+curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&
+python uncomment.py $(for d in $appd/Python/Python$pyv/site-packages/{requests,urllib3,charset_normalizer,certifi,idna}; do find $d -name \*.py; done) &&
 cd &&
 rm -f build.sh &&
 curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/build.sh &&
+curl -fkLO https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.sh &&
 echo ok
 # python -m pip install --user -U Pillow-9.2.0-cp37-cp37m-win32.whl
 # sed -ri 's/, bestopt, /]+bestopt+[/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
 # sed -ri 's/(^\s+bestopt = ).*/\1["--best","--lzma","--ultra-brute"]/' $APPDATA/Python/Python37/site-packages/pyinstaller/building/utils.py
 
 ===[ win10: copy-paste into git-bash ]=========================
-appd=$(cygpath.exe "$APPDATA")
-curl -fkLO https://192.168.123.1:3923/cpp/scripts/uncomment.py &&
 #for f in $appd/Python/Python311/site-packages/mutagen/*.py; do awk -i inplace '/^\s*def _?(save|write)/{sub(/d.*/," ");s=$0;ns=length(s)} ns&&/[^ ]/&&substr($0,0,ns)!=s{ns=0} !ns' "$f"; done &&
 python uncomment.py $appd/Python/Python311/site-packages/{mutagen,PIL,jinja2,markupsafe}/*.py &&
-sed -ri 's/--lzma/--best/' $APPDATA/Python/Python311/site-packages/pyinstaller/building/utils.py &&
 echo ok
 
 
29 scripts/pyinstaller/up2k.rc Normal file
@@ -0,0 +1,29 @@
+# UTF-8
+VSVersionInfo(
+  ffi=FixedFileInfo(
+    filevers=(1,2,3,0),
+    prodvers=(1,2,3,0),
+    mask=0x3f,
+    flags=0x0,
+    OS=0x4,
+    fileType=0x1,
+    subtype=0x0,
+    date=(0, 0)
+  ),
+  kids=[
+    StringFileInfo(
+      [
+        StringTable(
+          '000004b0',
+          [StringStruct('CompanyName', 'ocv.me'),
+          StringStruct('FileDescription', 'copyparty uploader / filesearch command'),
+          StringStruct('FileVersion', '1.2.3'),
+          StringStruct('InternalName', 'up2k'),
+          StringStruct('LegalCopyright', '2019, ed'),
+          StringStruct('OriginalFilename', 'up2k.exe'),
+          StringStruct('ProductName', 'copyparty up2k client'),
+          StringStruct('ProductVersion', '1.2.3')])
+        ]),
+    VarFileInfo([VarStruct('Translation', [0, 1200])])
+  ]
+)
48 scripts/pyinstaller/up2k.sh Normal file
@@ -0,0 +1,48 @@
+#!/bin/bash
+set -e
+
+curl -k https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.sh |
+tee up2k2.sh | cmp up2k.sh && rm up2k2.sh || {
+    [ -s up2k2.sh ] || exit 1
+    echo "new up2k script; upgrade y/n:"
+    while true; do read -u1 -n1 -r r; [[ $r =~ [yYnN] ]] && break; done
+    [[ $r =~ [yY] ]] && mv up2k{2,}.sh && exec ./up2k.sh
+}
+
+uname -s | grep -E 'WOW64|NT-10' && echo need win7-32 && exit 1
+
+dl() { curl -fkLO "$1"; }
+cd ~/Downloads
+
+dl https://192.168.123.1:3923/cpp/bin/up2k.py
+dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.ico
+dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.rc
+dl https://192.168.123.1:3923/cpp/scripts/pyinstaller/up2k.spec
+
+# $LOCALAPPDATA/programs/python/python37-32/python -m pip install --user -U pyinstaller requests
+
+grep -E '^from .ssl_ import' $APPDATA/python/python37/site-packages/urllib3/util/proxy.py && {
+    echo golfing
+    echo > $APPDATA/python/python37/site-packages/requests/certs.py
+    sed -ri 's/^(DEFAULT_CA_BUNDLE_PATH = ).*/\1""/' $APPDATA/python/python37/site-packages/requests/utils.py
+    sed -ri '/^import zipfile$/d' $APPDATA/python/python37/site-packages/requests/utils.py
+    sed -ri 's/"idna"//' $APPDATA/python/python37/site-packages/requests/packages.py
+    sed -ri 's/import charset_normalizer.*/pass/' $APPDATA/python/python37/site-packages/requests/compat.py
+    sed -ri 's/raise.*charset_normalizer.*/pass/' $APPDATA/python/python37/site-packages/requests/__init__.py
+    sed -ri 's/import charset_normalizer.*//' $APPDATA/python/python37/site-packages/requests/packages.py
+    sed -ri 's/chardet.__name__/"\\roll\\tide"/' $APPDATA/python/python37/site-packages/requests/packages.py
+    sed -ri 's/chardet,//' $APPDATA/python/python37/site-packages/requests/models.py
+    for n in util/__init__.py connection.py; do awk -i inplace '/^from (\.util)?\.ssl_ /{s=1} !s; /^\)/{s=0}' $APPDATA/python/python37/site-packages/urllib3/$n; done
+    sed -ri 's/^from .ssl_ import .*//' $APPDATA/python/python37/site-packages/urllib3/util/proxy.py
+    echo golfed
+}
+
+read a b _ < <(awk -F\" '/^S_VERSION =/{$0=$2;sub(/\./," ");print}' < up2k.py)
+sed -r 's/1,2,3,0/'$a,$b,0,0'/;s/1\.2\.3/'$a.$b.0/ <up2k.rc >up2k.rc2
+
+#python uncomment.py up2k.py
+$APPDATA/python/python37/scripts/pyinstaller -y --clean --upx-dir=. up2k.spec
+
+./dist/up2k.exe --version
+
+curl -fkT dist/up2k.exe -HPW:wark https://192.168.123.1:3923/
78 scripts/pyinstaller/up2k.spec Normal file
@@ -0,0 +1,78 @@
+# -*- mode: python ; coding: utf-8 -*-
+
+
+block_cipher = None
+
+
+a = Analysis(
+    ['up2k.py'],
+    pathex=[],
+    binaries=[],
+    datas=[],
+    hiddenimports=[],
+    hookspath=[],
+    hooksconfig={},
+    runtime_hooks=[],
+    excludes=[
+        'ftplib',
+        'lzma',
+        'pickle',
+        'ssl',
+        'tarfile',
+        'bz2',
+        'zipfile',
+        'tracemalloc',
+        'zlib',
+        'urllib3.util.ssl_',
+        'urllib3.contrib.pyopenssl',
+        'urllib3.contrib.socks',
+        'certifi',
+        'idna',
+        'chardet',
+        'charset_normalizer',
+        'email.contentmanager',
+        'email.policy',
+        'encodings.zlib_codec',
+        'encodings.base64_codec',
+        'encodings.bz2_codec',
+        'encodings.charmap',
+        'encodings.hex_codec',
+        'encodings.palmos',
+        'encodings.punycode',
+        'encodings.rot_13',
+    ],
+    win_no_prefer_redirects=False,
+    win_private_assemblies=False,
+    cipher=block_cipher,
+    noarchive=False,
+)
+
+# this is the only change to the autogenerated specfile:
+xdll = ["libcrypto-1_1.dll"]
+a.binaries = TOC([x for x in a.binaries if x[0] not in xdll])
+
+pyz = PYZ(a.pure, a.zipped_data, cipher=block_cipher)
+
+exe = EXE(
+    pyz,
+    a.scripts,
+    a.binaries,
+    a.zipfiles,
+    a.datas,
+    [],
+    name='up2k',
+    debug=False,
+    bootloader_ignore_signals=False,
+    strip=False,
+    upx=True,
+    upx_exclude=[],
+    runtime_tmpdir=None,
+    console=True,
+    disable_windowed_traceback=False,
+    argv_emulation=False,
+    target_arch=None,
+    codesign_identity=None,
+    entitlements_file=None,
+    version='up2k.rc2',
+    icon=['up2k.ico'],
+)
14 scripts/pyinstaller/up2k.spec.sh Normal file
@@ -0,0 +1,14 @@
+#!/bin/bash
+set -e
+
+# grep '">encodings.cp' C:/Users/ed/dev/copyparty/bin/dist/xref-up2k.html | sed -r 's/.*encodings.cp//;s/<.*//' | sort -n | uniq | tr '\n' ,
+# grep -i encodings -A1 build/up2k/xref-up2k.html | sed -r 's/.*(Missing|Excluded)Module.*//' | grep moduletype -B1 | grep -v moduletype
+
+ex=(
+    ftplib lzma pickle ssl tarfile bz2 zipfile tracemalloc zlib
+    urllib3.util.ssl_ urllib3.contrib.pyopenssl urllib3.contrib.socks certifi idna chardet charset_normalizer
+    email.contentmanager email.policy
+    encodings.{zlib_codec,base64_codec,bz2_codec,charmap,hex_codec,palmos,punycode,rot_13}
+);
+cex=(); for a in "${ex[@]}"; do cex+=(--exclude "$a"); done
+$APPDATA/python/python37/scripts/pyi-makespec --version-file up2k.rc2 -i up2k.ico -n up2k -c -F up2k.py "${cex[@]}"