Compare commits

...

22 Commits

Author SHA1 Message Date
ed 96ceccd12a v1.2.3 2022-03-24 02:35:53 +01:00
ed 87994fe006 retry failed uploads with backoff 2022-03-24 02:29:59 +01:00
ed fa12c81a03 zip-download files older than 1980-01-01 2022-03-24 01:31:50 +01:00
ed 344ce63455 basic-browser is implicitly not js 2022-03-21 01:20:47 +01:00
ed ec4daacf9e v1.2.2 2022-03-20 06:15:57 +01:00
ed f3e8308718 eh, better as volflags 2022-03-20 05:45:07 +01:00
ed 515ac5d941 show textfile name in document title 2022-03-20 03:40:21 +01:00
ed 954c7e7e50 add option to request noindex from crawlers 2022-03-20 03:23:42 +01:00
ed 67ff57f3a3 add option to disable html folder listings 2022-03-20 02:45:53 +01:00
ed c10c70c1e5 misc 2022-03-04 21:30:31 +01:00
ed 04592a98d2 include all IPs + link status in server url listing 2022-03-04 21:29:28 +01:00
ed c9c4aac6cf v1.2.1 2022-03-03 01:26:29 +01:00
ed 8b2c7586ce minimal py2 support for ftpd 2022-03-03 01:18:01 +01:00
ed 32e22dfe84 vendor asynchat for pyftpdlib 2022-03-03 01:16:52 +01:00
ed d70b885722 failed attempt at upgrading scp 2022-03-03 00:17:03 +01:00
ed ac6c4b13f5 add plaintext volume listing 2022-03-02 21:20:19 +01:00
ed ececdad22d and increase debounce a bit 2022-03-02 01:56:05 +01:00
ed bf659781b0 try some more spacing 2022-03-02 01:49:15 +01:00
ed 2c6bb195a4 search: get rid of inner-joins to fix -tags 2022-03-02 00:35:04 +01:00
ed c032cd08b3 prisonparty: clean exit on sigterm/int 2022-02-27 20:07:28 +01:00
ed 39e7a7a231 sfx: prefer system pyftpdlib if available 2022-02-13 21:00:13 +01:00
ed 6e14cd2c39 graduate copyparty-sfx.sh 2022-02-13 20:44:03 +01:00
35 changed files with 348 additions and 151 deletions

View File

@@ -62,6 +62,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
* [metadata from audio files](#metadata-from-audio-files) - set `-e2t` to index tags on upload
* [file parser plugins](#file-parser-plugins) - provide custom parsers to index additional tags, also see [./bin/mtag/README.md](./bin/mtag/README.md)
* [upload events](#upload-events) - trigger a script/program on each upload
* [hiding from google](#hiding-from-google) - tell search engines you dont wanna be indexed
* [complete examples](#complete-examples)
* [browser support](#browser-support) - TLDR: yes
* [client examples](#client-examples) - interact with copyparty using non-browser clients
@@ -83,7 +84,7 @@ turn your phone or raspi into a portable file server with resumable uploads/down
* [optional dependencies](#optional-dependencies) - install these to enable bonus features
* [install recommended deps](#install-recommended-deps)
* [optional gpl stuff](#optional-gpl-stuff)
* [sfx](#sfx) - there are two self-contained "binaries"
* [sfx](#sfx) - the self-contained "binary"
* [sfx repack](#sfx-repack) - reduce the size of an sfx by removing features
* [install on android](#install-on-android)
* [reporting bugs](#reporting-bugs) - ideas for context to include in bug reports
@@ -792,6 +793,17 @@ and it will occupy the parsing threads, so fork anything expensive, or if you wa
if this becomes popular maybe there should be a less janky way to do it actually
## hiding from google
tell search engines you dont wanna be indexed, either using the good old [robots.txt](https://www.robotstxt.org/robotstxt.html) or through copyparty settings:
* `--no-robots` adds HTTP (`X-Robots-Tag`) and HTML (`<meta>`) headers with `noindex, nofollow` globally
* volume-flag `[...]:c,norobots` does the same thing for that single volume
* volume-flag `[...]:c,robots` ALLOWS search-engine crawling for that volume, even if `--no-robots` is set globally
also, `--force-js` disables the plain HTML folder listing, making things harder to parse for search engines
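
a minimal invocation combining the switches above (not part of this diff; the paths, volume names and permissions are made up for illustration):

```sh
# global noindex for everything, except one public volume which explicitly re-allows crawlers
python3 copyparty-sfx.py --no-robots --force-js \
  -v /srv/private:private:r \
  -v /srv/public:public:r:c,robots
```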
## complete examples
* read-only music server with bpm and key scanning
@@ -1075,6 +1087,10 @@ mandatory deps:
install these to enable bonus features
enable ftp-server:
* for just plaintext FTP, `pyftpdlib` (is built into the SFX)
* with TLS encryption, `pyftpdlib pyopenssl`
enable music tags:
* either `mutagen` (fast, pure-python, skips a few tags, makes copyparty GPL? idk)
* or `ffprobe` (20x slower, more accurate, possibly dangerous depending on your distro and users)
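
the optional deps above are plain pip packages (ffprobe instead comes from your distro's ffmpeg package); an illustrative install, not part of this diff:

```sh
python3 -m pip install --user pyftpdlib pyopenssl   # ftp-server with TLS support
python3 -m pip install --user mutagen               # fast music-tag indexing
# ffprobe: apt install ffmpeg / apk add ffmpeg / etc
```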
@@ -1101,13 +1117,7 @@ these are standalone programs and will never be imported / evaluated by copypart
# sfx
there are two self-contained "binaries":
* [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) -- pure python, works everywhere, **recommended**
* [copyparty-sfx.sh](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.sh) -- smaller, but only for linux and macos, kinda deprecated
launch either of them (**use sfx.py on systemd**) and it'll unpack and run copyparty, assuming you have python installed of course
pls note that `copyparty-sfx.sh` will fail if you rename `copyparty-sfx.py` to `copyparty.py` and keep it in the same folder because `sys.path` is funky
the self-contained "binary" [copyparty-sfx.py](https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py) will unpack itself and run copyparty, assuming you have python installed of course
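
so the quickstart is a single download plus a single command (illustrative; the volume argument below is an example, not from this diff):

```sh
wget https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py
python3 copyparty-sfx.py -v .::r   # share the current folder read-only
```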
## sfx repack
@@ -1182,8 +1192,8 @@ mv /tmp/pe-copyparty/copyparty/web/deps/ copyparty/web/deps/
then build the sfx using any of the following examples:
```sh
./scripts/make-sfx.sh # both python and sh editions
./scripts/make-sfx.sh no-sh gz # just python with gzip
./scripts/make-sfx.sh # regular edition
./scripts/make-sfx.sh gz no-cm # gzip-compressed + no fancy markdown editor
```

View File

@@ -4,8 +4,8 @@ set -e
# install dependencies for audio-*.py
#
# linux/alpine: requires {python3,ffmpeg,fftw}-dev py3-{wheel,pip} py3-numpy{,-dev} patchelf cmake
# linux/debian: requires libav{codec,device,filter,format,resample,util}-dev {libfftw3,python3}-dev python3-{numpy,pip} vamp-{plugin-sdk,examples} patchelf cmake
# linux/alpine: requires gcc g++ make cmake patchelf {python3,ffmpeg,fftw,libsndfile}-dev py3-{wheel,pip} py3-numpy{,-dev}
# linux/debian: requires libav{codec,device,filter,format,resample,util}-dev {libfftw3,python3,libsndfile1}-dev python3-{numpy,pip} vamp-{plugin-sdk,examples} patchelf cmake
# win64: requires msys2-mingw64 environment
# macos: requires macports
#

View File

@@ -122,5 +122,7 @@ export LOGNAME="$USER"
#echo "pybin [$pybin]"
#echo "pyarg [$pyarg]"
#echo "cpp [$cpp]"
chroot --userspec=$uid:$gid "$jail" "$pybin" $pyarg "$cpp" "$@"
chroot --userspec=$uid:$gid "$jail" "$pybin" $pyarg "$cpp" "$@" &
p=$!
trap 'kill $p' INT TERM
wait
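
the change above is the usual shell recipe for letting SIGTERM/SIGINT reach a child process: background it, remember its pid, kill it from a trap, then wait. the generic pattern (illustrative, not copyparty-specific):

```sh
some-long-running-command &
p=$!
trap 'kill $p' INT TERM   # forward ctrl-c / systemd stop to the child
wait "$p"
```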

View File

@@ -471,6 +471,8 @@ def run_argparse(argv, formatter):
ap2.add_argument("--no-logues", action="store_true", help="disable rendering .prologue/.epilogue.html into directory listings")
ap2.add_argument("--no-readme", action="store_true", help="disable rendering readme.md into directory listings")
ap2.add_argument("--vague-403", action="store_true", help="send 404 instead of 403 (security through ambiguity, very enterprise)")
ap2.add_argument("--force-js", action="store_true", help="don't send HTML folder listings, force clients to use the embedded json instead")
ap2.add_argument("--no-robots", action="store_true", help="adds http and html headers asking search engines to not index anything")
ap2 = ap.add_argument_group('yolo options')
ap2.add_argument("--ign-ebind", action="store_true", help="continue running even if it's impossible to listen on some of the requested endpoints")
@@ -539,6 +541,7 @@ def run_argparse(argv, formatter):
ap2 = ap.add_argument_group('ui options')
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty", help="title / service-name to show in html documents")

View File

@@ -1,8 +1,8 @@
# coding: utf-8
VERSION = (1, 2, 0)
VERSION = (1, 2, 3)
CODENAME = "ftp btw"
BUILD_DT = (2022, 2, 13)
BUILD_DT = (2022, 3, 24)
S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -14,6 +14,7 @@ from datetime import datetime
from .__init__ import WINDOWS
from .util import (
IMPLICATIONS,
META_NOBOTS,
uncyg,
undot,
unhumanize,
@@ -861,6 +862,19 @@ class AuthSrv(object):
if use:
vol.lim = lim
if self.args.no_robots:
for vol in vfs.all_vols.values():
# volflag "robots" overrides global "norobots", allowing indexing by search engines for this vol
if not vol.flags.get("robots"):
vol.flags["norobots"] = True
for vol in vfs.all_vols.values():
h = [vol.flags.get("html_head", self.args.html_head)]
if vol.flags.get("norobots"):
h.insert(0, META_NOBOTS)
vol.flags["html_head"] = "\n".join([x for x in h if x])
for vol in vfs.all_vols.values():
fk = vol.flags.get("fk")
if fk:

View File

@@ -7,20 +7,33 @@ import stat
import time
import logging
import threading
from typing import TYPE_CHECKING
from .__init__ import E, PY2
from .util import Pebkac, fsenc, exclude_dotfiles
from .bos import bos
try:
from pyftpdlib.ioloop import IOLoop
except ImportError:
p = os.path.join(E.mod, "vend")
print("loading asynchat from " + p)
sys.path.append(p)
from pyftpdlib.ioloop import IOLoop
from pyftpdlib.authorizers import DummyAuthorizer, AuthenticationFailed
from pyftpdlib.filesystems import AbstractedFS, FilesystemError
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer
from pyftpdlib.ioloop import IOLoop
from pyftpdlib.log import config_logging
from .__init__ import E
from .util import Pebkac, fsenc, exclude_dotfiles
from .bos import bos
if TYPE_CHECKING:
from .svchub import SvcHub
try:
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from .svchub import SvcHub
except ImportError:
pass
class FtpAuth(DummyAuthorizer):
@@ -248,7 +261,10 @@ class FtpHandler(FTPHandler):
abstracted_fs = FtpFs
def __init__(self, conn, server, ioloop=None):
super(FtpHandler, self).__init__(conn, server, ioloop)
if PY2:
FTPHandler.__init__(self, conn, server, ioloop)
else:
super(FtpHandler, self).__init__(conn, server, ioloop)
# abspath->vpath mapping to resolve log_transfer paths
self.vfs_map = {}

View File

@@ -65,6 +65,11 @@ class HttpCli(object):
"Access-Control-Allow-Origin": "*",
"Cache-Control": "no-store; max-age=0",
}
h = self.args.html_head
if self.args.no_robots:
h = META_NOBOTS + (("\n" + h) if h else "")
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
self.html_head = h
def log(self, msg, c=0):
ptn = self.asrv.re_pwd
@@ -93,6 +98,7 @@ class HttpCli(object):
if ka:
ka["ts"] = self.conn.hsrv.cachebuster()
ka["svcname"] = self.args.doctitle
ka["html_head"] = self.html_head
return tpl.render(**ka)
return tpl
@@ -1677,13 +1683,15 @@ class HttpCli(object):
boundary = "\roll\tide"
targs = {
"ts": self.conn.hsrv.cachebuster(),
"svcname": self.args.doctitle,
"html_head": self.html_head,
"edit": "edit" in self.uparam,
"title": html_escape(self.vpath, crlf=True),
"lastmod": int(ts_md * 1000),
"md_plug": "true" if self.args.emp else "false",
"md_chk_rate": self.args.mcr,
"md": boundary,
"ts": self.conn.hsrv.cachebuster(),
"arg_base": arg_base,
}
html = template.render(**targs).encode("utf-8", "replace")
@@ -1732,6 +1740,31 @@ class HttpCli(object):
vstate = {}
vs = {"scanning": None, "hashq": None, "tagq": None, "mtpq": None}
if self.uparam.get("ls") in ["v", "t", "txt"]:
if self.uname == "*":
txt = "howdy stranger (you're not logged in)"
else:
txt = "welcome back {}".format(self.uname)
if vstate:
txt += "\nstatus:"
for k in ["scanning", "hashq", "tagq", "mtpq"]:
txt += " {}({})".format(k, vs[k])
if rvol:
txt += "\nyou can browse:"
for v in rvol:
txt += "\n " + v
if wvol:
txt += "\nyou can upload to:"
for v in wvol:
txt += "\n " + v
txt = txt.encode("utf-8", "replace") + b"\n"
self.reply(txt, mime="text/plain; charset=utf-8")
return True
html = self.j2(
"splash",
this=self,
@@ -2041,6 +2074,12 @@ class HttpCli(object):
):
raise Pebkac(403)
self.html_head = vn.flags.get("html_head", "")
if vn.flags.get("norobots"):
self.out_headers["X-Robots-Tag"] = "noindex, nofollow"
else:
self.out_headers.pop("X-Robots-Tag", None)
is_dir = stat.S_ISDIR(st.st_mode)
if self.can_read:
th_fmt = self.uparam.get("th")
@@ -2131,11 +2170,12 @@ class HttpCli(object):
url_suf = self.urlq({}, [])
is_ls = "ls" in self.uparam
is_js = self.cookies.get("js") == "y"
is_js = self.args.force_js or self.cookies.get("js") == "y"
tpl = "browser"
if "b" in self.uparam:
tpl = "browser2"
is_js = False
logues = ["", ""]
if not self.args.no_logues:
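
the new plaintext control-panel listing added above can be fetched with any http client; a sketch (host/port are the copyparty defaults, and the query parameter is assumed from the diff; depending on your volumes you may need the control-panel url `/?h` instead):

```sh
# anonymous request; logged-in users also get their readable/writable volumes
curl 'http://127.0.0.1:3923/?ls=txt'
```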

View File

@@ -5,7 +5,7 @@ import tarfile
import threading
from .sutil import errdesc
from .util import Queue, fsenc
from .util import Queue, fsenc, min_ex
from .bos import bos
@@ -88,8 +88,9 @@ class StreamTar(object):
try:
self.ser(f)
except Exception as ex:
errors.append([f["vp"], repr(ex)])
except Exception:
ex = min_ex(5, True).replace("\n", "\n-- ")
errors.append([f["vp"], ex])
if errors:
self.errf, txt = errdesc(errors)

View File

@@ -362,7 +362,7 @@ class SvcHub(object):
src = ansi_re.sub("", src)
elif c:
if isinstance(c, int):
msg = "\033[3{}m{}".format(c, msg)
msg = "\033[3{}m{}\033[0m".format(c, msg)
elif "\033" not in c:
msg = "\033[{}m{}\033[0m".format(c, msg)
else:

View File

@@ -1,13 +1,12 @@
# coding: utf-8
from __future__ import print_function, unicode_literals
import os
import time
import zlib
from datetime import datetime
from .sutil import errdesc
from .util import yieldfile, sanitize_fn, spack, sunpack
from .util import yieldfile, sanitize_fn, spack, sunpack, min_ex
from .bos import bos
@@ -36,7 +35,10 @@ def unixtime2dos(ts):
bd = ((dy - 1980) << 9) + (dm << 5) + dd
bt = (th << 11) + (tm << 5) + ts // 2
return spack(b"<HH", bt, bd)
try:
return spack(b"<HH", bt, bd)
except:
return b"\x00\x00\x21\x00"
def gen_fdesc(sz, crc32, z64):
@@ -244,8 +246,9 @@ class StreamZip(object):
try:
for x in self.ser(f):
yield x
except Exception as ex:
errors.append([f["vp"], repr(ex)])
except Exception:
ex = min_ex(5, True).replace("\n", "\n-- ")
errors.append([f["vp"], ex])
if errors:
errf, txt = errdesc(errors)
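
context for the `unixtime2dos` guard above: the DOS date word stores the year as a 7-bit offset from 1980, so any mtime before 1980-01-01 goes negative and `<HH` refuses to pack it; the new fallback bytes decode to exactly 1980-01-01 00:00. a quick check of that arithmetic (illustrative only):

```sh
python3 -c '
dy, dm, dd = 1979, 6, 1
print(((dy - 1980) << 9) + (dm << 5) + dd)            # -319: cannot be packed as an unsigned short
bd = 0x0021                                           # the fallback date word
print(1980 + (bd >> 9), (bd >> 5) & 0xf, bd & 0x1f)   # 1980 1 1
'
```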

View File

@@ -57,13 +57,19 @@ class TcpSrv(object):
msgs = []
title_tab = {}
title_vars = [x[1:] for x in self.args.wintitle.split(" ") if x.startswith("$")]
m = "available @ http://{}:{}/ (\033[33m{}\033[0m)"
m = "available @ {}://{}:{}/ (\033[33m{}\033[0m)"
for ip, desc in sorted(eps.items(), key=lambda x: x[1]):
for port in sorted(self.args.p):
if port not in ok.get(ip, ok.get("0.0.0.0", [])):
continue
msgs.append(m.format(ip, port, desc))
proto = " http"
if self.args.http_only:
pass
elif self.args.https_only or port == 443:
proto = "https"
msgs.append(m.format(proto, ip, port, desc))
if not self.args.wintitle:
continue
@@ -144,10 +150,15 @@ class TcpSrv(object):
return eps
r = re.compile(r"^\s+inet ([^ ]+)/.* (.*)")
ri = re.compile(r"^\s*[0-9]+\s*:.*")
up = False
for ln in txt.split("\n"):
if ri.match(ln):
up = "UP" in re.split("[>,< ]", ln)
try:
ip, dev = r.match(ln.rstrip()).groups()
eps[ip] = dev
eps[ip] = dev + ("" if up else ", \033[31mLINK-DOWN")
except:
pass
@@ -177,6 +188,7 @@ class TcpSrv(object):
def ips_windows_ipconfig(self):
eps = {}
offs = {}
try:
txt, _ = chkcmd(["ipconfig"])
except:
@@ -184,18 +196,29 @@ class TcpSrv(object):
rdev = re.compile(r"(^[^ ].*):$")
rip = re.compile(r"^ +IPv?4? [^:]+: *([0-9\.]{7,15})$")
roff = re.compile(r".*: Media disconnected$")
dev = None
for ln in txt.replace("\r", "").split("\n"):
m = rdev.match(ln)
if m:
if dev and dev not in eps.values():
offs[dev] = 1
dev = m.group(1).split(" adapter ", 1)[-1]
if dev and roff.match(ln):
offs[dev] = 1
dev = None
m = rip.match(ln)
if m and dev:
eps[m.group(1)] = dev
dev = None
return eps
if dev and dev not in eps.values():
offs[dev] = 1
return eps, offs
def ips_windows_netsh(self):
eps = {}
@@ -215,7 +238,6 @@ class TcpSrv(object):
m = rip.match(ln)
if m and dev:
eps[m.group(1)] = dev
dev = None
return eps
@@ -223,8 +245,11 @@ class TcpSrv(object):
if MACOS:
eps = self.ips_macos()
elif ANYWIN:
eps = self.ips_windows_ipconfig() # sees more interfaces
eps, off = self.ips_windows_ipconfig() # sees more interfaces + link state
eps.update(self.ips_windows_netsh()) # has better names
for k, v in eps.items():
if v in off:
eps[k] += ", \033[31mLINK-DOWN"
else:
eps = self.ips_linux()

View File

@@ -51,11 +51,11 @@ class U2idx(object):
fhash = body["hash"]
wark = up2k_wark_from_hashlist(self.args.salt, fsize, fhash)
uq = "where substr(w,1,16) = ? and w = ?"
uq = "substr(w,1,16) = ? and w = ?"
uv = [wark[:16], wark]
try:
return self.run_query(vols, uq, uv)[0]
return self.run_query(vols, uq, uv, True, False)[0]
except:
raise Pebkac(500, min_ex())
@@ -87,17 +87,16 @@ class U2idx(object):
q = ""
va = []
joins = ""
have_up = False # query has up.* operands
have_mt = False
is_key = True
is_size = False
is_date = False
field_end = "" # closing parenthesis or whatever
kw_key = ["(", ")", "and ", "or ", "not "]
kw_val = ["==", "=", "!=", ">", ">=", "<", "<=", "like "]
ptn_mt = re.compile(r"^\.?[a-z_-]+$")
mt_ctr = 0
mt_keycmp = "substr(up.w,1,16)"
mt_keycmp2 = None
ptn_lc = re.compile(r" (mt[0-9]+\.v) ([=<!>]+) \? $")
ptn_lc = re.compile(r" (mt\.v) ([=<!>]+) \? \) $")
ptn_lcv = re.compile(r"[a-zA-Z]")
while True:
@@ -133,29 +132,31 @@ class U2idx(object):
if v == "size":
v = "up.sz"
is_size = True
have_up = True
elif v == "date":
v = "up.mt"
is_date = True
have_up = True
elif v == "path":
v = "trim(?||up.rd,'/')"
va.append("\nrd")
have_up = True
elif v == "name":
v = "up.fn"
have_up = True
elif v == "tags" or ptn_mt.match(v):
mt_ctr += 1
mt_keycmp2 = "mt{}.w".format(mt_ctr)
joins += "inner join mt mt{} on {} = {} ".format(
mt_ctr, mt_keycmp, mt_keycmp2
)
mt_keycmp = mt_keycmp2
have_mt = True
field_end = ") "
if v == "tags":
v = "mt{0}.v".format(mt_ctr)
vq = "mt.v"
else:
v = "+mt{0}.k = '{1}' and mt{0}.v".format(mt_ctr, v)
vq = "+mt.k = '{}' and mt.v".format(v)
v = "exists(select 1 from mt where mt.w = mtw and " + vq
else:
raise Pebkac(400, "invalid key [" + v + "]")
@@ -201,6 +202,10 @@ class U2idx(object):
va.append(v)
is_key = True
if field_end:
q += field_end
field_end = ""
# lowercase tag searches
m = ptn_lc.search(q)
if not m or not ptn_lcv.search(unicode(v)):
@@ -212,16 +217,16 @@ class U2idx(object):
field, oper = m.groups()
if oper in ["=", "=="]:
q += " {} like ? ".format(field)
q += " {} like ? ) ".format(field)
else:
q += " lower({}) {} ? ".format(field, oper)
q += " lower({}) {} ? ) ".format(field, oper)
try:
return self.run_query(vols, joins + "where " + q, va)
return self.run_query(vols, q, va, have_up, have_mt)
except Exception as ex:
raise Pebkac(500, repr(ex))
def run_query(self, vols, uq, uv):
def run_query(self, vols, uq, uv, have_up, have_mt):
done_flag = []
self.active_id = "{:.6f}_{}".format(
time.time(), threading.current_thread().ident
@@ -240,8 +245,11 @@ class U2idx(object):
if not uq or not uv:
uq = "select * from up"
uv = ()
elif have_mt:
uq = "select up.*, substr(up.w,1,16) mtw from up where " + uq
uv = tuple(uv)
else:
uq = "select up.* from up " + uq
uq = "select up.* from up where " + uq
uv = tuple(uv)
self.log("qs: {!r} {!r}".format(uq, uv))
@@ -268,7 +276,7 @@ class U2idx(object):
fk = flags.get("fk")
c = cur.execute(uq, vuv)
for hit in c:
w, ts, sz, rd, fn, ip, at = hit
w, ts, sz, rd, fn, ip, at = hit[:7]
lim -= 1
if lim <= 0:
break
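
the search rewrite above replaces the per-tag inner joins with correlated subqueries, which is what makes negated tag terms (`-tags`) work; roughly, the generated SQL changes shape like this (a sketch of the query structure, not the literal strings copyparty builds):

```sql
-- before: one inner join per tag term; files without the tag are dropped by the join,
-- so a negated term can never match them
select up.* from up
inner join mt mt1 on substr(up.w,1,16) = mt1.w
where +mt1.k = 'artist' and mt1.v like ?

-- after: correlated exists() against the mtw alias, so "not" behaves as expected
select up.*, substr(up.w,1,16) mtw from up
where not exists(select 1 from mt where mt.w = mtw and +mt.k = 'artist' and mt.v like ?)
```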

View File

@@ -1137,9 +1137,9 @@ class Up2k(object):
m = "database is version {}, this copyparty only supports versions <= {}"
raise Exception(m.format(ver, DB_VER))
msg = "creating new DB (old is bad); backup: {}"
msg = "creating new DB (old is bad); backup: "
if ver:
msg = "creating new DB (too old to upgrade); backup: {}"
msg = "creating new DB (too old to upgrade); backup: "
cur = self._backup_db(db_path, cur, ver, msg)
db = cur.connection

View File

@@ -71,6 +71,8 @@ SYMTIME = sys.version_info >= (3, 6) and os.supports_follow_symlinks
HTTP_TS_FMT = "%a, %d %b %Y %H:%M:%S GMT"
META_NOBOTS = '<meta name="robots" content="noindex, nofollow">'
HTTPCODE = {
200: "OK",
204: "No Content",
@@ -483,13 +485,13 @@ def vol_san(vols, txt):
return txt
def min_ex():
def min_ex(max_lines=8, reverse=False):
et, ev, tb = sys.exc_info()
tb = traceback.extract_tb(tb)
fmt = "{} @ {} <{}>: {}"
ex = [fmt.format(fp.split(os.sep)[-1], ln, fun, txt) for fp, ln, fun, txt in tb]
ex.append("[{}] {}".format(et.__name__, ev))
return "\n".join(ex[-8:])
return "\n".join(ex[-max_lines:][:: -1 if reverse else 1])
@contextlib.contextmanager

View File

@@ -37,7 +37,7 @@ pre, code, tt, #doc, #doc>code {
display: inline-block;
padding: .35em .5em .2em .5em;
border-radius: 0 .3em .3em 0;
margin: 1.3em 0 0 0;
margin: 1.3em 0 -.2em 0;
font-size: 1.4em;
}
#path #entree {
@@ -51,7 +51,8 @@ pre, code, tt, #doc, #doc>code {
}
#files tbody a {
display: block;
padding: .3em 0;
padding: .5em 0;
margin: -.3em 0;
scroll-margin-top: 45vh;
}
#files tr {
@@ -110,7 +111,7 @@ a, #files tbody div a:last-child {
}
#files td {
margin: 0;
padding: .1em .5em;
padding: .3em .5em;
border-left: 1px solid #3c3c3c;
}
#files td+td+td {
@@ -234,8 +235,8 @@ a, #files tbody div a:last-child {
}
#files tbody a.play {
color: #e70;
padding: .2em;
margin: -.2em;
padding: .3em;
margin: -.3em;
}
#files tbody a.play.act {
color: #720;

View File

@@ -6,6 +6,7 @@
<title>⇆🎉 {{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
{{ html_head }}
<link rel="stylesheet" media="screen" href="/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="/.cpr/browser.css?_={{ ts }}">
{%- if css %}

View File

@@ -2426,6 +2426,7 @@ var showfile = (function () {
lnh = doc[1],
txt = doc[2],
name = url.split('/').pop(),
tname = uricom_dec(name)[0],
lang = r.getlang(name),
is_md = lang == 'md';
@@ -2472,13 +2473,14 @@ var showfile = (function () {
wr.style.display = '';
set_tabindex();
wintitle(tname + ' \u2014 ');
document.documentElement.scrollTop = 0;
var hfun = no_push ? hist_replace : hist_push;
hfun(get_evpath() + '?doc=' + url.split('/').pop());
qsr('#docname');
el = mknod('span');
el.textContent = uricom_dec(name)[0];
el.textContent = tname;
el.setAttribute('id', 'docname');
ebi('path').appendChild(el);
@@ -2613,6 +2615,7 @@ var thegrid = (function () {
return;
hist_push(get_evpath());
wintitle();
}
var vis = has(perms, "read");
@@ -3220,7 +3223,7 @@ document.onkeydown = function (e) {
clearTimeout(defer_timeout);
clearTimeout(search_timeout);
search_timeout = setTimeout(do_search,
v && v.length < (is_touch ? 4 : 3) ? 600 : 200);
v && v.length < (is_touch ? 4 : 3) ? 1000 : 500);
}
}
@@ -3290,10 +3293,10 @@ document.onkeydown = function (e) {
}
if (k == 'path' || k == 'name' || k == 'tags') {
var not = ' ';
var not = '';
if (tv.slice(0, 1) == '-') {
tv = tv.slice(1);
not = ' not ';
not = 'not ';
}
if (tv.slice(0, 1) == '^') {
@@ -3314,7 +3317,7 @@ document.onkeydown = function (e) {
tv = '"' + tv + '"';
}
q += k + not + 'like ' + tv;
q += not + k + ' like ' + tv;
}
}
}

View File

@@ -6,6 +6,7 @@
<title>{{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
{{ html_head }}
<style>
html{font-family:sans-serif}
td{border:1px solid #999;border-width:1px 1px 0 0;padding:0 5px}

View File

@@ -3,6 +3,7 @@
<title>📝🎉 {{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.7">
{{ html_head }}
<link rel="stylesheet" href="/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" href="/.cpr/md.css?_={{ ts }}">
{%- if edit %}

View File

@@ -3,6 +3,7 @@
<title>📝🎉 {{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.7">
{{ html_head }}
<link rel="stylesheet" href="/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" href="/.cpr/mde.css?_={{ ts }}">
<link rel="stylesheet" href="/.cpr/deps/mini-fa.css?_={{ ts }}">

View File

@@ -6,6 +6,7 @@
<title>{{ svcname }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
{{ html_head }}
<link rel="stylesheet" media="screen" href="/.cpr/msg.css?_={{ ts }}">
</head>

View File

@@ -6,6 +6,7 @@
<title>{{ svcname }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8">
{{ html_head }}
<link rel="stylesheet" media="screen" href="/.cpr/splash.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="/.cpr/ui.css?_={{ ts }}">
</head>

View File

@@ -1173,7 +1173,7 @@ function up2k_init(subtle) {
var t = st.todo.handshake[0],
cd = t.cooldown;
if (cd && cd - Date.now() > 0)
if (cd && cd > Date.now())
return false;
// keepalive or verify
@@ -1370,6 +1370,14 @@ function up2k_init(subtle) {
return taskerd;
})();
function chill(t) {
var now = Date.now();
if ((t.coolmul || 0) < 2 || now - t.cooldown < t.coolmul * 700)
t.coolmul = Math.min((t.coolmul || 0.5) * 2, 32);
t.cooldown = Math.max(t.cooldown || 1, Date.now() + t.coolmul * 1000);
}
/////
////
/// hashing
@@ -1756,8 +1764,12 @@ function up2k_init(subtle) {
pvis.move(t.n, 'ok');
}
else t.t_uploaded = undefined;
else {
if (t.t_uploaded)
chill(t);
t.t_uploaded = undefined;
}
tasker();
}
else {
@@ -1869,7 +1881,8 @@ function up2k_init(subtle) {
else {
toast.err(0, "server broke; cu-err {0} on file [{1}]:\n".format(
xhr.status, t.name) + (txt || "no further information"));
return;
chill(t);
}
orz2(xhr);
}

View File

@@ -3,6 +3,12 @@ echo not a script
exit 1
##
## add index.html banners
find -name index.html | sed -r 's/index.html$//' | while IFS= read -r dir; do f="$dir/.prologue.html"; [ -e "$f" ] || echo '<h1><a href="index.html">open index.html</a></h1>' >"$f"; done
##
## delete all partial uploads
## (supports linux/macos, probably windows+msys2)
@@ -95,6 +101,7 @@ var t=[]; var b=document.location.href.split('#')[0].slice(0, -1); document.quer
# debug md-editor line tracking
var s=mknod('style');s.innerHTML='*[data-ln]:before {content:attr(data-ln)!important;color:#f0c;background:#000;position:absolute;left:-1.5em;font-size:1rem}';document.head.appendChild(s);
##
## bash oneliners
@@ -199,6 +206,7 @@ git remote add all git@github.com:9001/copyparty.git
git remote set-url --add --push all git@gitlab.com:9001/copyparty.git
git remote set-url --add --push all git@github.com:9001/copyparty.git
##
## http 206

View File

@@ -12,21 +12,18 @@ set -e
#
# output summary (filesizes and contents):
#
# 535672 copyparty-extras/sfx-full/copyparty-sfx.sh
# 550760 copyparty-extras/sfx-full/copyparty-sfx.py
# `- original unmodified sfx from github
#
# 572923 copyparty-extras/sfx-full/copyparty-sfx-gz.py
# `- unmodified but recompressed from bzip2 to gzip
#
# 341792 copyparty-extras/sfx-ent/copyparty-sfx.sh
# 353975 copyparty-extras/sfx-ent/copyparty-sfx.py
# 376934 copyparty-extras/sfx-ent/copyparty-sfx-gz.py
# `- removed iOS ogg/opus/vorbis audio decoder,
# removed the audio tray mouse cursor,
# "enterprise edition"
#
# 259288 copyparty-extras/sfx-lite/copyparty-sfx.sh
# 270004 copyparty-extras/sfx-lite/copyparty-sfx.py
# 293159 copyparty-extras/sfx-lite/copyparty-sfx-gz.py
# `- also removed the codemirror markdown editor
@@ -81,7 +78,7 @@ cache="$od/.copyparty-repack.cache"
# fallback to awk (sorry)
awk -F\" '/"browser_download_url".*(\.tar\.gz|-sfx\.)/ {print$4}'
) |
grep -E '(sfx\.(sh|py)|tar\.gz)$' |
grep -E '(sfx\.py|tar\.gz)$' |
tee /dev/stderr |
tr -d '\r' | tr '\n' '\0' |
xargs -0 bash -c 'dl_files "$@"' _
@@ -139,11 +136,11 @@ repack() {
)
}
repack sfx-full "re gz no-sh"
repack sfx-full "re gz"
repack sfx-ent "re no-dd"
repack sfx-ent "re no-dd gz no-sh"
repack sfx-ent "re no-dd gz"
repack sfx-lite "re no-dd no-cm no-hl"
repack sfx-lite "re no-dd no-cm no-hl gz no-sh"
repack sfx-lite "re no-dd no-cm no-hl gz"
# move fuse and up2k clients into copyparty-extras/,

View File

@@ -4,13 +4,13 @@ ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
ver_hashwasm=4.9.0 \
ver_marked=4.0.12 \
ver_mde=2.16.1 \
ver_codemirror=5.65.1 \
ver_codemirror=5.65.2 \
ver_fontawesome=5.13.0 \
ver_zopfli=1.0.3
# download;
# the scp url is latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
# the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
RUN mkdir -p /z/dist/no-pk \
&& wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
&& apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev brotli py3-brotli \
@@ -118,6 +118,7 @@ RUN cd easy-markdown-editor-$ver_mde \
# build fontawesome and scp
COPY mini-fa.sh /z
COPY mini-fa.css /z
COPY shiftbase.py /z
RUN /bin/ash /z/mini-fa.sh

View File

@@ -29,3 +29,10 @@ pyftsubset "$orig_woff" --unicodes-file=/z/icon.list --no-ignore-missing-unicode
# scp is easier, just want basic latin
pyftsubset /z/scp.woff2 --unicodes="20-7e,ab,b7,bb,2022" --no-ignore-missing-unicodes --flavor=woff2 --output-file=/z/dist/no-pk/scp.woff2 --verbose
exit 0
# kinda works but ruins hinting on windows, just use the old version of the font which has correct baseline
python3 shiftbase.py /z/dist/no-pk/scp.woff2
cd /z/dist/no-pk/
mv scp.woff2.woff2 scp.woff2

View File

@@ -0,0 +1,27 @@
#!/usr/bin/env python3
import sys
from fontTools.ttLib import TTFont, newTable
def main():
woff = sys.argv[1]
font = TTFont(woff)
print(repr(font["hhea"].__dict__))
print(repr(font["OS/2"].__dict__))
# font["hhea"].ascent = round(base_asc * mul)
# font["hhea"].descent = round(base_desc * mul)
# font["OS/2"].usWinAscent = round(base_asc * mul)
font["OS/2"].usWinDescent = round(font["OS/2"].usWinDescent * 1.1)
font["OS/2"].sTypoDescender = round(font["OS/2"].sTypoDescender * 1.1)
try:
del font["post"].mapping["Delta#1"]
except:
pass
font.save(woff + ".woff2")
if __name__ == "__main__":
main()

View File

@@ -14,8 +14,6 @@ help() { exec cat <<'EOF'
#
# `gz` creates a gzip-compressed python sfx instead of bzip2
#
# `no-sh` makes just the python sfx, skips the sh/unix sfx
#
# `no-cm` saves ~82k by removing easymde/codemirror
# (the fancy markdown editor)
#
@@ -64,8 +62,6 @@ pybin=$(command -v python3 || command -v python) || {
}
use_gz=
do_sh=1
do_py=1
zopf=2560
while [ ! -z "$1" ]; do
case $1 in
@@ -76,8 +72,6 @@ while [ ! -z "$1" ]; do
no-hl) no_hl=1 ; ;;
no-dd) no_dd=1 ; ;;
no-cm) no_cm=1 ; ;;
no-sh) do_sh= ; ;;
no-py) do_py= ; ;;
fast) zopf=100 ; ;;
*) help ; ;;
esac
@@ -147,6 +141,14 @@ tmpdir="$(
mkdir dep-ftp/
mv pyftpdlib dep-ftp/
echo collecting asyncore, asynchat
for n in asyncore.py asynchat.py; do
f=../build/$n
[ -e "$f" ] ||
(url=https://raw.githubusercontent.com/python/cpython/c4d45ee670c09d4f6da709df072ec80cb7dfad22/Lib/$n;
wget -O$f "$url" || curl -L "$url" >$f)
done
# msys2 tar is bad, make the best of it
echo collecting source
[ $clean ] && {
@@ -157,6 +159,12 @@ tmpdir="$(
(cd .. && tar -cf tar copyparty) && tar -xf ../tar
}
rm -f ../tar
# insert asynchat
mkdir copyparty/vend
for n in asyncore.py asynchat.py; do
awk 'NR<4||NR>27;NR==4{print"# license: https://opensource.org/licenses/ISC\n"}' ../build/$n >copyparty/vend/$n
done
}
ver=
@@ -258,7 +266,7 @@ rm have
find | grep -E '\.py$' |
grep -vE '__version__' |
tr '\n' '\0' |
xargs -0 $pybin ../scripts/uncomment.py
xargs -0 "$pybin" ../scripts/uncomment.py
f=dep-j2/jinja2/constants.py
awk '/^LOREM_IPSUM_WORDS/{o=1;print "LOREM_IPSUM_WORDS = u\"a\"";next} !o; /"""/{o=0}' <$f >t
@@ -348,7 +356,14 @@ for d in copyparty dep-j2 dep-ftp; do find $d -type f; done |
sed -r 's/(.*)\.(.*)/\2 \1/' | LC_ALL=C sort |
sed -r 's/([^ ]*) (.*)/\2.\1/' | grep -vE '/list1?$' > list1
(grep -vE '\.(gz|br)$' list1; grep -E '\.(gz|br)$' list1 | shuf) >list || true
for n in {1..50}; do
(grep -vE '\.(gz|br)$' list1; grep -E '\.(gz|br)$' list1 | shuf) >list || true
s=$(md5sum list | cut -c-16)
grep -q $s "$zdir/h" && continue
echo $s >> "$zdir/h"
break
done
[ $n -eq 50 ] && exit
echo creating tar
args=(--owner=1000 --group=1000)
@@ -363,41 +378,27 @@ pe=bz2
echo compressing tar
# detect best level; bzip2 -7 is usually better than -9
[ $do_py ] && { for n in {2..9}; do cp tar t.$n; $pc -$n t.$n & done; wait; mv -v $(ls -1S t.*.$pe | tail -n 1) tar.bz2; }
[ $do_sh ] && { for n in {2..9}; do cp tar t.$n; xz -ze$n t.$n & done; wait; mv -v $(ls -1S t.*.xz | tail -n 1) tar.xz; }
for n in {2..9}; do cp tar t.$n; $pc -$n t.$n & done; wait; mv -v $(ls -1S t.*.$pe | tail -n 1) tar.bz2
rm t.* || true
exts=()
[ $do_sh ] && {
exts+=(.sh)
echo creating unix sfx
(
sed "s/PACK_TS/$ts/; s/PACK_HTS/$hts/; s/CPP_VER/$ver/" <../scripts/sfx.sh |
grep -E '^sfx_eof$' -B 9001;
cat tar.xz
) >$sfx_out.sh
echo creating sfx
py=../scripts/sfx.py
suf=
[ $use_gz ] && {
sed -r 's/"r:bz2"/"r:gz"/' <$py >$py.t
py=$py.t
suf=-gz
}
"$pybin" $py --sfx-make tar.bz2 $ver $ts
mv sfx.out $sfx_out$suf.py
[ $do_py ] && {
echo creating generic sfx
py=../scripts/sfx.py
suf=
[ $use_gz ] && {
sed -r 's/"r:bz2"/"r:gz"/' <$py >$py.t
py=$py.t
suf=-gz
}
$pybin $py --sfx-make tar.bz2 $ver $ts
mv sfx.out $sfx_out$suf.py
exts+=($suf.py)
[ $use_gz ] &&
rm $py
}
exts+=($suf.py)
[ $use_gz ] &&
rm $py
chmod 755 $sfx_out*
@@ -408,4 +409,4 @@ for ext in ${exts[@]}; do
done
# apk add bash python3 tar xz bzip2
# while true; do ./make-sfx.sh; for f in ..//dist/copyparty-sfx.{sh,py}; do mv $f $f.$(wc -c <$f | awk '{print$1}'); done; done
# while true; do ./make-sfx.sh; f=../dist/copyparty-sfx.py; mv $f $f.$(wc -c <$f | awk '{print$1}'); done

View File

@@ -4,34 +4,31 @@ set -e
cd ~/dev/copyparty/scripts
v=$1
printf '%s\n' "$v" | grep -qE '^[0-9\.]+$' || exit 1
grep -E "(${v//./, })" ../copyparty/__version__.py || exit 1
git push all
git tag v$v
git push all --tags
[ "$v" = sfx ] || {
printf '%s\n' "$v" | grep -qE '^[0-9\.]+$' || exit 1
grep -E "(${v//./, })" ../copyparty/__version__.py || exit 1
rm -rf ../dist
git push all
git tag v$v
git push all --tags
./make-pypi-release.sh u
(cd .. && python3 ./setup.py clean2)
rm -rf ../dist
./make-tgz-release.sh $v
./make-pypi-release.sh u
(cd .. && python3 ./setup.py clean2)
./make-tgz-release.sh $v
}
rm -f ../dist/copyparty-sfx.*
./make-sfx.sh no-sh
../dist/copyparty-sfx.py -h
f=../dist/copyparty-sfx.py
./make-sfx.sh
$f -h
ar=
while true; do
for ((a=0; a<100; a++)); do
for f in ../dist/copyparty-sfx.{py,sh}; do
[ -e $f ] || continue;
mv $f $f.$(wc -c <$f | awk '{print$1}')
done
./make-sfx.sh re $ar
done
ar=no-sh
mv $f $f.$(wc -c <$f | awk '{print$1}')
./make-sfx.sh re $ar
done
# git tag -d v$v; git push --delete origin v$v

View File

@@ -32,6 +32,9 @@ copyparty/th_srv.py,
copyparty/u2idx.py,
copyparty/up2k.py,
copyparty/util.py,
copyparty/vend,
copyparty/vend/asynchat.py,
copyparty/vend/asyncore.py,
copyparty/web,
copyparty/web/baguettebox.js,
copyparty/web/browser.css,

View File

@@ -368,8 +368,9 @@ def confirm(rv):
sys.exit(rv or 1)
def run(tmp, j2):
def run(tmp, j2, ftp):
msg("jinja2:", j2 or "bundled")
msg("pyftpd:", ftp or "bundled")
msg("sfxdir:", tmp)
msg()
@@ -387,9 +388,8 @@ def run(tmp, j2):
t.daemon = True
t.start()
ld = [os.path.join(tmp, x) for x in ["", "dep-ftp", "dep-j2"]]
if j2:
del ld[-1]
ld = (("", ""), (j2, "dep-j2"), (ftp, "dep-ftp"))
ld = [os.path.join(tmp, b) for a, b in ld if not a]
if any([re.match(r"^-.*j[0-9]", x) for x in sys.argv]):
run_s(ld)
@@ -462,7 +462,12 @@ def main():
j2 = None
try:
run(tmp, j2)
from pyftpdlib.__init__ import __ver__ as ftp
except:
ftp = None
try:
run(tmp, j2, ftp)
except SystemExit as ex:
c = ex.code
if c not in [0, -15]:

View File

@@ -52,9 +52,12 @@ class Cfg(Namespace):
mth="",
textfiles="",
doctitle="",
html_head="",
hist=None,
no_idx=None,
no_hash=None,
force_js=False,
no_robots=False,
js_browser=None,
css_browser=None,
**{k: False for k in "e2d e2ds e2dsa e2t e2ts e2tsr no_acode".split()}

View File

@@ -17,13 +17,14 @@ from copyparty import util
class Cfg(Namespace):
def __init__(self, a=None, v=None, c=None):
ex = "nw e2d e2ds e2dsa e2t e2ts e2tsr no_logues no_readme no_acode"
ex = "nw e2d e2ds e2dsa e2t e2ts e2tsr no_logues no_readme no_acode force_js no_robots"
ex = {k: False for k in ex.split()}
ex2 = {
"mtp": [],
"mte": "a",
"mth": "",
"doctitle": "",
"html_head": "",
"hist": None,
"no_idx": None,
"no_hash": None,