Compare commits

17 Commits

| Author | SHA1 | Date |
|---|---|---|
| | a996a09bba | |
| | 18c763ac08 | |
| | 3d9fb753ba | |
| | 714fd1811a | |
| | 4364581705 | |
| | ba02c9cc12 | |
| | 11eefaf968 | |
| | 5a968f9e47 | |
| | 6420c4bd03 | |
| | 0f9877201b | |
| | 9ba2dec9b2 | |
| | ae9cfea939 | |
| | cadaeeeace | |
| | 767696185b | |
| | c1efd227b7 | |
| | a50d0563c3 | |
| | e5641ddd16 | |

README.md | 23
@@ -287,6 +287,10 @@ server notes:

* iPhones: the volume control doesn't work because [apple doesn't want it to](https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW11)
  * `AudioContext` will probably never be a viable workaround as apple introduces new issues faster than they fix current ones
* iPhones: the preload feature (in the media-player-options tab) can cause a tiny audio glitch 20sec before the end of each song, but disabling it may cause worse iOS bugs to appear instead
  * just a hunch, but disabling preloading may cause playback to stop entirely, or possibly mess with bluetooth speakers
  * tried to add a tooltip regarding this but looks like apple broke my tooltips
* Windows: folders cannot be accessed if the name ends with `.`
  * python or windows bug
@@ -347,6 +351,7 @@ permissions:

* `d` (delete): delete files/folders
* `g` (get): only download files, cannot see folder contents or zip/tar
* `G` (upget): same as `g` except uploaders get to see their own filekeys (see `fk` in examples below)
* `h` (html): same as `g` except folders return their index.html, and filekeys are not necessary for index.html
* `a` (admin): can see uploader IPs, config-reload

examples:
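for instance, a rough sketch of how these letters can be combined into a volume definition on the command line; the accounts, password and path below are made up, so check `--help` for the exact syntax:

```bash
# hypothetical: "ed" gets full read/write/move/delete access,
# "kim" can only fetch files she has direct links to (g),
# and everyone else is read-only
python3 copyparty-sfx.py -a ed:hunter2 -a kim:wark \
  -v /srv/pub:pub:r:rwmd,ed:g,kim
```

the same letters are what `accs:` entries in a config file expect (`rwmdgGha: user1, user2, ...`), as seen further down in this changeset.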
@@ -505,10 +510,16 @@ select which type of archive you want in the `[⚙️] config` tab:

| name | url-suffix | description |
|--|--|--|
| `tar` | `?tar` | plain gnutar, works great with `curl \| tar -xv` |
| `pax` | `?tar=pax` | pax-format tar, futureproof, not as fast |
| `tgz` | `?tar=gz` | gzip compressed gnu-tar (slow), for `curl \| tar -xvz` |
| `txz` | `?tar=xz` | gnu-tar with xz / lzma compression (v.slow) |
| `zip` | `?zip=utf8` | works everywhere, glitchy filenames on win7 and older |
| `zip_dos` | `?zip` | traditional cp437 (no unicode) to fix glitchy filenames |
| `zip_crc` | `?zip=crc` | cp437 with crc32 computed early for truly ancient software |

* gzip default level is `3` (0=fast, 9=best), change with `?tar=gz:9`
* xz default level is `1` (0=fast, 9=best), change with `?tar=xz:9`
* bz2 default level is `2` (1=fast, 9=best), change with `?tar=bz2:9`
* hidden files (dotfiles) are excluded unless `-ed`
  * `up2k.db` and `dir.txt` are always excluded
* `zip_crc` will take longer to download since the server has to read each file twice
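to illustrate, assuming a copyparty instance on `127.0.0.1:3923` with a folder named `music` (both placeholders), the url-suffixes above can be used straight from the terminal:

```bash
# stream the folder as plain gnu-tar and unpack it on the fly
curl "http://127.0.0.1:3923/music/?tar" | tar -xv

# or save it as a level-9 xz tarball for archiving
curl "http://127.0.0.1:3923/music/?tar=xz:9" -o music.tar.xz
```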
@@ -893,15 +904,13 @@ unsafe, slow, not recommended for wan, enable with `--smb` for read-only or `--

click the [connect](http://127.0.0.1:3923/?hc) button in the control-panel to see connection instructions for windows, linux, macos

dependencies: `python3 -m pip install --user -U impacket==0.10.0`
dependencies: `python3 -m pip install --user -U impacket==0.11.0`
* newer versions of impacket will hopefully work just fine but there is monkeypatching so maybe not

some **BIG WARNINGS** specific to SMB/CIFS, in decreasing importance:
* not entirely confident that read-only is read-only
* the smb backend is not fully integrated with vfs, meaning there could be security issues (path traversal). Please use `--smb-port` (see below) and [prisonparty](./bin/prisonparty.sh)
* account passwords work per-volume as expected, but account permissions are coalesced; all accounts have read-access to all volumes, and if a single account has write-access to some volume then all other accounts also do
  * if no accounts have write-access to a specific volume, or if `--smbw` is not set, then writing to that volume from smb *should* be impossible
  * will be fixed once [impacket v0.11.0](https://github.com/SecureAuthCorp/impacket/commit/d923c00f75d54b972bca573a211a82f09b55261a) is released
* account passwords work per-volume as expected, and so do account permissions (read/write/move/delete), but `--smbw` must be given to allow write-access from smb
* [shadowing](#shadowing) probably works as expected but no guarantees

and some minor issues,

@@ -912,7 +921,7 @@ and some minor issues,

* win10 onwards does not allow connecting anonymously / without accounts
* on windows, creating a new file through rightclick --> new --> textfile throws an error due to impacket limitations -- hit OK and F5 to get your file
* python3 only
* slow
* slow (the builtin webdav support in windows is 5x faster, and rclone-webdav is 30x faster)

known client bugs:
* on win7 only, `--smb1` is much faster than smb2 (default) because it keeps rescanning folders on smb2
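as a hedged example (not from the readme itself), mounting the share from a linux box might look something like this; `A` is the single share copyparty exposes (mapping to the volume root), and the address is a placeholder:

```bash
# assumes copyparty is running with --smb on the default port 445
sudo mount -t cifs //192.168.1.5/A /mnt/party -o guest,vers=2.0
```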
@@ -1631,6 +1640,8 @@ other misc notes:

* combine this with volflag `c,fk` to generate filekeys (per-file accesskeys); users who have full read-access will then see URLs with `?k=...` appended to the end, and `g` users must provide that URL including the correct key to avoid a 404
  * the default filekey entropy is fairly small so give `--fk-salt` around 30 characters if you want filekeys longer than 16 chars
  * permissions `wG` lets users upload files and receive their own filekeys, still without being able to see other uploads
* permission `h` instead of `r` makes copyparty behave like a traditional webserver with directory listing/index disabled, returning index.html instead
  * compatibility with filekeys: index.html itself can be retrieved without the correct filekey, but all other files are protected
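a hedged sketch of the `wG` + filekey combination described above (path, port and the resulting key are made up):

```bash
# anyone may upload (w) and see filekeys of their own uploads (G);
# the fk volflag enables 6-character filekeys on this volume
python3 copyparty-sfx.py -v /srv/drop:drop:wG:c,fk=6

# an uploader then gets back a link along these lines, which works
# even without read-access to the rest of the folder:
#   http://127.0.0.1:3923/drop/report.pdf?k=a1b2c3
```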

## gotchas
@@ -1734,7 +1745,7 @@ enable [thumbnails](#thumbnails) of...

* **JPEG XL pictures:** `pyvips` or `ffmpeg`

enable [smb](#smb-server) support (**not** recommended):
* `impacket==0.10.0`
* `impacket==0.11.0`

`pyvips` gives higher quality thumbnails than `Pillow` and is 320% faster, using 270% more ram: `sudo apt install libvips42 && python3 -m pip install --user -U pyvips`
@@ -138,7 +138,8 @@ in {

"d" (delete): permanently delete files and folders
"g" (get): download files, but cannot see folder contents
"G" (upget): "get", but can see filekeys of their own uploads
"a" (upget): can see uploader IPs, config-reload
"h" (html): "get", but folders return their index.html
"a" (admin): can see uploader IPs, config-reload

For example: "rwmd"
@@ -1,6 +1,6 @@

# Maintainer: icxes <dev.null@need.moe>
pkgname=copyparty
pkgver="1.9.2"
pkgver="1.9.4"
pkgrel=1
pkgdesc="Portable file sharing hub"
arch=("any")

@@ -20,7 +20,7 @@ optdepends=("ffmpeg: thumbnails for videos, images (slower) and audio, music tag

)
source=("https://github.com/9001/${pkgname}/releases/download/v${pkgver}/${pkgname}-${pkgver}.tar.gz")
backup=("etc/${pkgname}.d/init" )
sha256sums=("d9509c0cf7d325124ce40637258574f62e628f06f74a61b07168fe4afa4ef489")
sha256sums=("c327ac35deaa5e6cc86b3b1a251cc78517be5578c37c4dff90e98465ced82abc")

build() {
  cd "${srcdir}/${pkgname}-${pkgver}"
@@ -1,5 +1,5 @@

{
  "url": "https://github.com/9001/copyparty/releases/download/v1.9.2/copyparty-sfx.py",
  "version": "1.9.2",
  "hash": "sha256-Y5sreREKvuHmKZy6kGk/oBHfKTq8b3SZGGWm9u2Iw9E="
  "url": "https://github.com/9001/copyparty/releases/download/v1.9.4/copyparty-sfx.py",
  "version": "1.9.4",
  "hash": "sha256-17dBvr6uUzaQtzunZUX84BLTWCGe1Hucz9b48BKowHs="
}
@@ -492,6 +492,7 @@ def get_sects():
|
||||
"d" (delete): permanently delete files and folders
|
||||
"g" (get): download files, but cannot see folder contents
|
||||
"G" (upget): "get", but can see filekeys of their own uploads
|
||||
"h" (html): "get", but folders return their index.html
|
||||
"a" (admin): can see uploader IPs, config-reload
|
||||
|
||||
too many volflags to list here, see --help-flags
|
||||
@@ -931,12 +932,13 @@ def add_webdav(ap):
|
||||
|
||||
def add_smb(ap):
|
||||
ap2 = ap.add_argument_group('SMB/CIFS options')
|
||||
ap2.add_argument("--smb", action="store_true", help="enable smb (read-only) -- this requires running copyparty as root on linux and macos unless --smb-port is set above 1024 and your OS does port-forwarding from 445 to that.\n\033[1;31mWARNING:\033[0m this protocol is dangerous! Never expose to the internet. Account permissions are coalesced; if one account has write-access to a volume, then all accounts do.")
|
||||
ap2.add_argument("--smb", action="store_true", help="enable smb (read-only) -- this requires running copyparty as root on linux and macos unless --smb-port is set above 1024 and your OS does port-forwarding from 445 to that.\n\033[1;31mWARNING:\033[0m this protocol is dangerous! Never expose to the internet!")
|
||||
ap2.add_argument("--smbw", action="store_true", help="enable write support (please dont)")
|
||||
ap2.add_argument("--smb1", action="store_true", help="disable SMBv2, only enable SMBv1 (CIFS)")
|
||||
ap2.add_argument("--smb-port", metavar="PORT", type=int, default=445, help="port to listen on -- if you change this value, you must NAT from TCP:445 to this port using iptables or similar")
|
||||
ap2.add_argument("--smb-nwa-1", action="store_true", help="disable impacket#1433 workaround (truncate directory listings to 64kB)")
|
||||
ap2.add_argument("--smb-nwa-2", action="store_true", help="disable impacket workaround for filecopy globs")
|
||||
ap2.add_argument("--smba", action="store_true", help="small performance boost: disable per-account permissions, enables account coalescing instead (if one user has write/delete-access, then everyone does)")
|
||||
ap2.add_argument("--smbv", action="store_true", help="verbose")
|
||||
ap2.add_argument("--smbvv", action="store_true", help="verboser")
|
||||
ap2.add_argument("--smbvvv", action="store_true", help="verbosest")
|
||||
@@ -989,6 +991,7 @@ def add_optouts(ap):
|
||||
ap2.add_argument("-nid", action="store_true", help="no info disk-usage -- don't show in UI")
|
||||
ap2.add_argument("-nb", action="store_true", help="no powered-by-copyparty branding in UI")
|
||||
ap2.add_argument("--no-zip", action="store_true", help="disable download as zip/tar")
|
||||
ap2.add_argument("--no-tarcmp", action="store_true", help="disable download as compressed tar (?tar=gz, ?tar=bz2, ?tar=xz, ?tar=gz:9, ...)")
|
||||
ap2.add_argument("--no-lifetime", action="store_true", help="disable automatic deletion of uploads after a certain time (as specified by the 'lifetime' volflag)")
|
||||
|
||||
|
||||
@@ -1149,7 +1152,7 @@ def add_ui(ap, retry):
|
||||
ap2.add_argument("--js-browser", metavar="L", type=u, help="URL to additional JS to include")
|
||||
ap2.add_argument("--css-browser", metavar="L", type=u, help="URL to additional CSS to include")
|
||||
ap2.add_argument("--html-head", metavar="TXT", type=u, default="", help="text to append to the <head> of all HTML pages")
|
||||
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI)")
|
||||
ap2.add_argument("--ih", action="store_true", help="if a folder contains index.html, show that instead of the directory listing by default (can be changed in the client settings UI, or add ?v to URL for override)")
|
||||
ap2.add_argument("--textfiles", metavar="CSV", type=u, default="txt,nfo,diz,cue,readme", help="file extensions to present as plaintext")
|
||||
ap2.add_argument("--txt-max", metavar="KiB", type=int, default=64, help="max size of embedded textfiles on ?doc= (anything bigger will be lazy-loaded by JS)")
|
||||
ap2.add_argument("--doctitle", metavar="TXT", type=u, default="copyparty @ --name", help="title / service-name to show in html documents")
|
||||
@@ -1397,7 +1400,7 @@ def main(argv: Optional[list[str]] = None) -> None:
|
||||
if re.match("c[^,]", opt):
|
||||
mod = True
|
||||
na.append("c," + opt[1:])
|
||||
elif re.sub("^[rwmdgGa]*", "", opt) and "," not in opt:
|
||||
elif re.sub("^[rwmdgGha]*", "", opt) and "," not in opt:
|
||||
mod = True
|
||||
perm = opt[0]
|
||||
na.append(perm + "," + opt[1:])
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
# coding: utf-8
|
||||
|
||||
VERSION = (1, 9, 3)
|
||||
VERSION = (1, 9, 5)
|
||||
CODENAME = "prometheable"
|
||||
BUILD_DT = (2023, 8, 31)
|
||||
BUILD_DT = (2023, 9, 9)
|
||||
|
||||
S_VERSION = ".".join(map(str, VERSION))
|
||||
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)
|
||||
|
||||
@@ -67,6 +67,7 @@ class AXS(object):
|
||||
udel: Optional[Union[list[str], set[str]]] = None,
|
||||
uget: Optional[Union[list[str], set[str]]] = None,
|
||||
upget: Optional[Union[list[str], set[str]]] = None,
|
||||
uhtml: Optional[Union[list[str], set[str]]] = None,
|
||||
uadmin: Optional[Union[list[str], set[str]]] = None,
|
||||
) -> None:
|
||||
self.uread: set[str] = set(uread or [])
|
||||
@@ -75,10 +76,11 @@ class AXS(object):
|
||||
self.udel: set[str] = set(udel or [])
|
||||
self.uget: set[str] = set(uget or [])
|
||||
self.upget: set[str] = set(upget or [])
|
||||
self.uhtml: set[str] = set(uhtml or [])
|
||||
self.uadmin: set[str] = set(uadmin or [])
|
||||
|
||||
def __repr__(self) -> str:
|
||||
ks = "uread uwrite umove udel uget upget uadmin".split()
|
||||
ks = "uread uwrite umove udel uget upget uhtml uadmin".split()
|
||||
return "AXS(%s)" % (", ".join("%s=%r" % (k, self.__dict__[k]) for k in ks),)
|
||||
|
||||
|
||||
@@ -329,6 +331,7 @@ class VFS(object):
|
||||
self.adel: dict[str, list[str]] = {}
|
||||
self.aget: dict[str, list[str]] = {}
|
||||
self.apget: dict[str, list[str]] = {}
|
||||
self.ahtml: dict[str, list[str]] = {}
|
||||
self.aadmin: dict[str, list[str]] = {}
|
||||
|
||||
if realpath:
|
||||
@@ -456,6 +459,7 @@ class VFS(object):
|
||||
uname in c.upget or "*" in c.upget,
|
||||
uname in c.uadmin or "*" in c.uadmin,
|
||||
)
|
||||
# skip uhtml because it's rarely needed
|
||||
|
||||
def get(
|
||||
self,
|
||||
@@ -488,7 +492,7 @@ class VFS(object):
|
||||
(will_get, c.uget, "get"),
|
||||
]:
|
||||
if req and (uname not in d and "*" not in d) and uname != LEELOO_DALLAS:
|
||||
if self.log and err != 999:
|
||||
if vpath != cvpath and vpath != "." and self.log:
|
||||
ap = vn.canonical(rem)
|
||||
t = "{} has no {} in [{}] => [{}] => [{}]"
|
||||
self.log("vfs", t.format(uname, msg, vpath, cvpath, ap), 6)
|
||||
@@ -955,7 +959,7 @@ class AuthSrv(object):
|
||||
try:
|
||||
self._l(ln, 5, "volume access config:")
|
||||
sk, sv = ln.split(":")
|
||||
if re.sub("[rwmdgGa]", "", sk) or not sk:
|
||||
if re.sub("[rwmdgGha]", "", sk) or not sk:
|
||||
err = "invalid accs permissions list; "
|
||||
raise Exception(err)
|
||||
if " " in re.sub(", *", "", sv).strip():
|
||||
@@ -964,7 +968,7 @@ class AuthSrv(object):
|
||||
self._read_vol_str(sk, sv.replace(" ", ""), daxs[vp], mflags[vp])
|
||||
continue
|
||||
except:
|
||||
err += "accs entries must be 'rwmdgGa: user1, user2, ...'"
|
||||
err += "accs entries must be 'rwmdgGha: user1, user2, ...'"
|
||||
raise Exception(err + SBADCFG)
|
||||
|
||||
if cat == catf:
|
||||
@@ -1000,7 +1004,7 @@ class AuthSrv(object):
|
||||
def _read_vol_str(
|
||||
self, lvl: str, uname: str, axs: AXS, flags: dict[str, Any]
|
||||
) -> None:
|
||||
if lvl.strip("crwmdgGa"):
|
||||
if lvl.strip("crwmdgGha"):
|
||||
raise Exception("invalid volflag: {},{}".format(lvl, uname))
|
||||
|
||||
if lvl == "c":
|
||||
@@ -1029,10 +1033,12 @@ class AuthSrv(object):
|
||||
("w", axs.uwrite),
|
||||
("m", axs.umove),
|
||||
("d", axs.udel),
|
||||
("a", axs.uadmin),
|
||||
("h", axs.uhtml),
|
||||
("h", axs.uget),
|
||||
("g", axs.uget),
|
||||
("G", axs.uget),
|
||||
("G", axs.upget),
|
||||
("a", axs.uadmin),
|
||||
]: # b bb bbb
|
||||
if ch in lvl:
|
||||
if un == "*":
|
||||
@@ -1105,7 +1111,7 @@ class AuthSrv(object):
|
||||
|
||||
if self.args.v:
|
||||
# list of src:dst:permset:permset:...
|
||||
# permset is <rwmdgGa>[,username][,username] or <c>,<flag>[=args]
|
||||
# permset is <rwmdgGha>[,username][,username] or <c>,<flag>[=args]
|
||||
for v_str in self.args.v:
|
||||
m = re_vol.match(v_str)
|
||||
if not m:
|
||||
@@ -1194,7 +1200,7 @@ class AuthSrv(object):
|
||||
vol.all_vps.sort(key=lambda x: len(x[0]), reverse=True)
|
||||
vol.root = vfs
|
||||
|
||||
for perm in "read write move del get pget admin".split():
|
||||
for perm in "read write move del get pget html admin".split():
|
||||
axs_key = "u" + perm
|
||||
unames = ["*"] + list(acct.keys())
|
||||
umap: dict[str, list[str]] = {x: [] for x in unames}
|
||||
@@ -1216,6 +1222,7 @@ class AuthSrv(object):
|
||||
axs.udel,
|
||||
axs.uget,
|
||||
axs.upget,
|
||||
axs.uhtml,
|
||||
axs.uadmin,
|
||||
]:
|
||||
for usr in d:
|
||||
@@ -1637,6 +1644,7 @@ class AuthSrv(object):
|
||||
["delete", "udel"],
|
||||
[" get", "uget"],
|
||||
[" upget", "upget"],
|
||||
[" html", "uhtml"],
|
||||
["uadmin", "uadmin"],
|
||||
]:
|
||||
u = list(sorted(getattr(zv.axs, attr)))
|
||||
@@ -1675,7 +1683,7 @@ class AuthSrv(object):
|
||||
self.log(t.format(zv.realpath), c=1)
|
||||
|
||||
try:
|
||||
zv, _ = vfs.get("/", "*", False, True, err=999)
|
||||
zv, _ = vfs.get("", "*", False, True, err=999)
|
||||
if self.warn_anonwrite and os.getcwd() == zv.realpath:
|
||||
t = "anyone can write to the current directory: {}\n"
|
||||
self.log(t.format(zv.realpath), c=1)
|
||||
@@ -1804,6 +1812,7 @@ class AuthSrv(object):
|
||||
vc.udel,
|
||||
vc.uget,
|
||||
vc.upget,
|
||||
vc.uhtml,
|
||||
vc.uadmin,
|
||||
]
|
||||
self.log(t.format(*vs))
|
||||
@@ -1945,6 +1954,7 @@ class AuthSrv(object):
|
||||
"d": "udel",
|
||||
"g": "uget",
|
||||
"G": "upget",
|
||||
"h": "uhtml",
|
||||
"a": "uadmin",
|
||||
}
|
||||
users = {}
|
||||
@@ -2148,7 +2158,7 @@ def upgrade_cfg_fmt(
|
||||
else:
|
||||
sn = sn.replace(",", ", ")
|
||||
ret.append(" " + sn)
|
||||
elif sn[:1] in "rwmdgGa":
|
||||
elif sn[:1] in "rwmdgGha":
|
||||
if cat != catx:
|
||||
cat = catx
|
||||
ret.append(cat)
|
||||
|
||||
@@ -62,6 +62,8 @@ permdescs = {
|
||||
"d": "delete; permanently delete files and folders",
|
||||
"g": "get; download files, but cannot see folder contents",
|
||||
"G": 'upget; same as "g" but can see filekeys of their own uploads',
|
||||
"h": 'html; same as "g" but folders return their index.html',
|
||||
"a": "admin; can see uploader IPs, config-reload",
|
||||
}
|
||||
|
||||
|
||||
|
||||
@@ -629,7 +629,17 @@ class HttpCli(object):
|
||||
headers: Optional[dict[str, str]] = None,
|
||||
volsan: bool = False,
|
||||
) -> bytes:
|
||||
if status > 400 and status in (403, 404, 422):
|
||||
if (
|
||||
status > 400
|
||||
and status in (403, 404, 422)
|
||||
and (
|
||||
status != 422
|
||||
or (
|
||||
not body.startswith(b"<pre>partial upload exists")
|
||||
and not body.startswith(b"<pre>source file busy")
|
||||
)
|
||||
)
|
||||
):
|
||||
if status == 404:
|
||||
g = self.conn.hsrv.g404
|
||||
elif status == 403:
|
||||
@@ -2186,7 +2196,8 @@ class HttpCli(object):
|
||||
vfs, rem = self.asrv.vfs.get(self.vpath, self.uname, False, True)
|
||||
self._assert_safe_rem(rem)
|
||||
|
||||
if not new_file.endswith(".md"):
|
||||
ext = "" if "." not in new_file else new_file.split(".")[-1]
|
||||
if not ext or len(ext) > 5 or not self.can_delete:
|
||||
new_file += ".md"
|
||||
|
||||
sanitized = sanitize_fn(new_file, "", [])
|
||||
@@ -2519,7 +2530,7 @@ class HttpCli(object):
|
||||
fp = os.path.join(fp, fn)
|
||||
rem = "{}/{}".format(rp, fn).strip("/")
|
||||
|
||||
if not rem.endswith(".md"):
|
||||
if not rem.endswith(".md") and not self.can_delete:
|
||||
raise Pebkac(400, "only markdown pls")
|
||||
|
||||
if nullwrite:
|
||||
@@ -2570,7 +2581,8 @@ class HttpCli(object):
|
||||
return True
|
||||
|
||||
mdir, mfile = os.path.split(fp)
|
||||
mfile2 = "{}.{:.3f}.md".format(mfile[:-3], srv_lastmod)
|
||||
fname, fext = mfile.rsplit(".", 1) if "." in mfile else (mfile, "md")
|
||||
mfile2 = "{}.{:.3f}.{}".format(fname, srv_lastmod, fext)
|
||||
try:
|
||||
dp = os.path.join(mdir, ".hist")
|
||||
bos.mkdir(dp)
|
||||
@@ -2895,12 +2907,26 @@ class HttpCli(object):
|
||||
logmsg = "{:4} {} ".format("", self.req)
|
||||
self.keepalive = False
|
||||
|
||||
cancmp = not self.args.no_tarcmp
|
||||
|
||||
if fmt == "tar":
|
||||
mime = "application/x-tar"
|
||||
packer: Type[StreamArc] = StreamTar
|
||||
if cancmp and "gz" in uarg:
|
||||
mime = "application/gzip"
|
||||
ext = "tar.gz"
|
||||
elif cancmp and "bz2" in uarg:
|
||||
mime = "application/x-bzip"
|
||||
ext = "tar.bz2"
|
||||
elif cancmp and "xz" in uarg:
|
||||
mime = "application/x-xz"
|
||||
ext = "tar.xz"
|
||||
else:
|
||||
mime = "application/x-tar"
|
||||
ext = "tar"
|
||||
else:
|
||||
mime = "application/zip"
|
||||
packer = StreamZip
|
||||
ext = "zip"
|
||||
|
||||
fn = items[0] if items and items[0] else self.vpath
|
||||
if fn:
|
||||
@@ -2925,7 +2951,7 @@ class HttpCli(object):
|
||||
ufn = b"".join(zbl).decode("ascii")
|
||||
|
||||
cdis = "attachment; filename=\"{}.{}\"; filename*=UTF-8''{}.{}"
|
||||
cdis = cdis.format(afn, fmt, ufn, fmt)
|
||||
cdis = cdis.format(afn, ext, ufn, ext)
|
||||
self.log(cdis)
|
||||
self.send_headers(None, mime=mime, headers={"Content-Disposition": cdis})
|
||||
|
||||
@@ -2943,7 +2969,13 @@ class HttpCli(object):
|
||||
self.log("transcoding to [{}]".format(cfmt))
|
||||
fgen = gfilter(fgen, self.thumbcli, self.uname, vpath, cfmt)
|
||||
|
||||
bgen = packer(self.log, fgen, utf8="utf" in uarg, pre_crc="crc" in uarg)
|
||||
bgen = packer(
|
||||
self.log,
|
||||
fgen,
|
||||
utf8="utf" in uarg,
|
||||
pre_crc="crc" in uarg,
|
||||
cmp=uarg if cancmp or uarg == "pax" else "",
|
||||
)
|
||||
bsent = 0
|
||||
for buf in bgen.gen():
|
||||
if not buf:
|
||||
@@ -3587,6 +3619,7 @@ class HttpCli(object):
|
||||
self.out_headers.pop("X-Robots-Tag", None)
|
||||
|
||||
is_dir = stat.S_ISDIR(st.st_mode)
|
||||
fk_pass = False
|
||||
icur = None
|
||||
if is_dir and (e2t or e2d):
|
||||
idx = self.conn.get_u2idx()
|
||||
@@ -3635,8 +3668,38 @@ class HttpCli(object):
|
||||
|
||||
return self.tx_ico(rem)
|
||||
|
||||
elif self.can_get and self.avn:
|
||||
axs = self.avn.axs
|
||||
if self.uname not in axs.uhtml and "*" not in axs.uhtml:
|
||||
pass
|
||||
elif is_dir:
|
||||
for fn in ("index.htm", "index.html"):
|
||||
ap2 = os.path.join(abspath, fn)
|
||||
try:
|
||||
st2 = bos.stat(ap2)
|
||||
except:
|
||||
continue
|
||||
|
||||
# might as well be extra careful
|
||||
if not stat.S_ISREG(st2.st_mode):
|
||||
continue
|
||||
|
||||
if not self.trailing_slash:
|
||||
return self.redirect(
|
||||
self.vpath + "/", flavor="redirecting to", use302=True
|
||||
)
|
||||
|
||||
fk_pass = True
|
||||
is_dir = False
|
||||
rem = vjoin(rem, fn)
|
||||
vrem = vjoin(vrem, fn)
|
||||
abspath = ap2
|
||||
break
|
||||
elif self.vpath.rsplit("/", 1)[1] in ("index.htm", "index.html"):
|
||||
fk_pass = True
|
||||
|
||||
if not is_dir and (self.can_read or self.can_get):
|
||||
if not self.can_read and "fk" in vn.flags:
|
||||
if not self.can_read and not fk_pass and "fk" in vn.flags:
|
||||
correct = self.gen_fk(
|
||||
self.args.fk_salt, abspath, st.st_size, 0 if ANYWIN else st.st_ino
|
||||
)[: vn.flags["fk"]]
|
||||
@@ -3646,7 +3709,7 @@ class HttpCli(object):
|
||||
return self.tx_404()
|
||||
|
||||
if (
|
||||
abspath.endswith(".md")
|
||||
(abspath.endswith(".md") or self.can_delete)
|
||||
and "nohtml" not in vn.flags
|
||||
and (
|
||||
"v" in self.uparam
|
||||
@@ -3780,10 +3843,14 @@ class HttpCli(object):
|
||||
}
|
||||
|
||||
if self.args.js_browser:
|
||||
j2a["js"] = self.args.js_browser
|
||||
zs = self.args.js_browser
|
||||
zs += "&" if "?" in zs else "?"
|
||||
j2a["js"] = zs
|
||||
|
||||
if self.args.css_browser:
|
||||
j2a["css"] = self.args.css_browser
|
||||
zs = self.args.css_browser
|
||||
zs += "&" if "?" in zs else "?"
|
||||
j2a["css"] = zs
|
||||
|
||||
if not self.conn.hsrv.prism:
|
||||
j2a["no_prism"] = True
|
||||
|
||||
@@ -32,6 +32,8 @@ class SMB(object):
|
||||
self.asrv = hub.asrv
|
||||
self.log = hub.log
|
||||
self.files: dict[int, tuple[float, str]] = {}
|
||||
self.noacc = self.args.smba
|
||||
self.accs = not self.args.smba
|
||||
|
||||
lg.setLevel(logging.DEBUG if self.args.smbvvv else logging.INFO)
|
||||
for x in ["impacket", "impacket.smbserver"]:
|
||||
@@ -94,6 +96,14 @@ class SMB(object):
|
||||
|
||||
port = int(self.args.smb_port)
|
||||
srv = smbserver.SimpleSMBServer(listenAddress=ip, listenPort=port)
|
||||
try:
|
||||
if self.accs:
|
||||
srv.setAuthCallback(self._auth_cb)
|
||||
except:
|
||||
self.accs = False
|
||||
self.noacc = True
|
||||
t = "impacket too old; access permissions will not work! all accounts are admin!"
|
||||
self.log("smb", t, 1)
|
||||
|
||||
ro = "no" if self.args.smbw else "yes" # (does nothing)
|
||||
srv.addShare("A", "/", readOnly=ro)
|
||||
@@ -119,24 +129,73 @@ class SMB(object):
|
||||
def start(self) -> None:
|
||||
Daemon(self.srv.start)
|
||||
|
||||
def _v2a(self, caller: str, vpath: str, *a: Any) -> tuple[VFS, str]:
|
||||
def _auth_cb(self, *a, **ka):
|
||||
debug("auth-result: %s %s", a, ka)
|
||||
conndata = ka["connData"]
|
||||
auth_ok = conndata["Authenticated"]
|
||||
uname = ka["user_name"] if auth_ok else "*"
|
||||
uname = self.asrv.iacct.get(uname, uname) or "*"
|
||||
oldname = conndata.get("partygoer", "*") or "*"
|
||||
cli_ip = conndata["ClientIP"]
|
||||
cli_hn = ka["host_name"]
|
||||
if uname != "*":
|
||||
conndata["partygoer"] = uname
|
||||
info("client %s [%s] authed as %s", cli_ip, cli_hn, uname)
|
||||
elif oldname != "*":
|
||||
info("client %s [%s] keeping old auth as %s", cli_ip, cli_hn, oldname)
|
||||
elif auth_ok:
|
||||
info("client %s [%s] authed as [*] (anon)", cli_ip, cli_hn)
|
||||
else:
|
||||
info("client %s [%s] rejected", cli_ip, cli_hn)
|
||||
|
||||
def _uname(self) -> str:
|
||||
if self.noacc:
|
||||
return LEELOO_DALLAS
|
||||
|
||||
try:
|
||||
# you found it! my single worst bit of code so far
|
||||
# (if you can think of a better way to track users through impacket i'm all ears)
|
||||
cf0 = inspect.currentframe().f_back.f_back
|
||||
cf = cf0.f_back
|
||||
for n in range(3):
|
||||
cl = cf.f_locals
|
||||
if "connData" in cl:
|
||||
return cl["connData"]["partygoer"]
|
||||
cf = cf.f_back
|
||||
except:
|
||||
warning(
|
||||
"nyoron... %s <<-- %s <<-- %s <<-- %s",
|
||||
cf0.f_code.co_name,
|
||||
cf0.f_back.f_code.co_name,
|
||||
cf0.f_back.f_back.f_code.co_name,
|
||||
cf0.f_back.f_back.f_back.f_code.co_name,
|
||||
)
|
||||
return "*"
|
||||
|
||||
def _v2a(
|
||||
self, caller: str, vpath: str, *a: Any, uname="", perms=None
|
||||
) -> tuple[VFS, str]:
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
# cf = inspect.currentframe().f_back
|
||||
# c1 = cf.f_back.f_code.co_name
|
||||
# c2 = cf.f_code.co_name
|
||||
debug('%s("%s", %s)\033[K\033[0m', caller, vpath, str(a))
|
||||
if not uname:
|
||||
uname = self._uname()
|
||||
if not perms:
|
||||
perms = [True, True]
|
||||
|
||||
# TODO find a way to grab `identity` in smbComSessionSetupAndX and smb2SessionSetup
|
||||
vfs, rem = self.asrv.vfs.get(vpath, LEELOO_DALLAS, True, True)
|
||||
debug('%s("%s", %s) %s @%s\033[K\033[0m', caller, vpath, str(a), perms, uname)
|
||||
vfs, rem = self.asrv.vfs.get(vpath, uname, *perms)
|
||||
return vfs, vfs.canonical(rem)
|
||||
|
||||
def _listdir(self, vpath: str, *a: Any, **ka: Any) -> list[str]:
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
# caller = inspect.currentframe().f_back.f_code.co_name
|
||||
debug('listdir("%s", %s)\033[K\033[0m', vpath, str(a))
|
||||
vfs, rem = self.asrv.vfs.get(vpath, LEELOO_DALLAS, False, False)
|
||||
uname = self._uname()
|
||||
# debug('listdir("%s", %s) @%s\033[K\033[0m', vpath, str(a), uname)
|
||||
vfs, rem = self.asrv.vfs.get(vpath, uname, False, False)
|
||||
_, vfs_ls, vfs_virt = vfs.ls(
|
||||
rem, LEELOO_DALLAS, not self.args.no_scandir, [[False, False]]
|
||||
rem, uname, not self.args.no_scandir, [[False, False]]
|
||||
)
|
||||
dirs = [x[0] for x in vfs_ls if stat.S_ISDIR(x[1].st_mode)]
|
||||
fils = [x[0] for x in vfs_ls if x[0] not in dirs]
|
||||
@@ -149,8 +208,8 @@ class SMB(object):
|
||||
sz = 112 * 2 # ['.', '..']
|
||||
for n, fn in enumerate(ls):
|
||||
if sz >= 64000:
|
||||
t = "listing only %d of %d files (%d byte); see impacket#1433"
|
||||
warning(t, n, len(ls), sz)
|
||||
t = "listing only %d of %d files (%d byte) in /%s; see impacket#1433"
|
||||
warning(t, n, len(ls), sz, vpath)
|
||||
break
|
||||
|
||||
nsz = len(fn.encode("utf-16", "replace"))
|
||||
@@ -171,10 +230,12 @@ class SMB(object):
|
||||
if wr and not self.args.smbw:
|
||||
yeet("blocked write (no --smbw): " + vpath)
|
||||
|
||||
vfs, ap = self._v2a("open", vpath, *a)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a("open", vpath, *a, uname=uname, perms=[True, wr])
|
||||
if wr:
|
||||
if not vfs.axs.uwrite:
|
||||
yeet("blocked write (no-write-acc): " + vpath)
|
||||
t = "blocked write (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs.axs.uwrite, vpath, uname))
|
||||
|
||||
xbu = vfs.flags.get("xbu")
|
||||
if xbu and not runhook(
|
||||
@@ -204,7 +265,7 @@ class SMB(object):
|
||||
|
||||
_, vp = self.files.pop(fd)
|
||||
vp, fn = os.path.split(vp)
|
||||
vfs, rem = self.hub.asrv.vfs.get(vp, LEELOO_DALLAS, False, True)
|
||||
vfs, rem = self.hub.asrv.vfs.get(vp, self._uname(), False, True)
|
||||
vfs, rem = vfs.get_dbv(rem)
|
||||
self.hub.up2k.hash_file(
|
||||
vfs.realpath,
|
||||
@@ -224,15 +285,18 @@ class SMB(object):
|
||||
vp1 = vp1.lstrip("/")
|
||||
vp2 = vp2.lstrip("/")
|
||||
|
||||
vfs2, ap2 = self._v2a("rename", vp2, vp1)
|
||||
uname = self._uname()
|
||||
vfs2, ap2 = self._v2a("rename", vp2, vp1, uname=uname)
|
||||
if not vfs2.axs.uwrite:
|
||||
yeet("blocked rename (no-write-acc): " + vp2)
|
||||
t = "blocked write (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs2.axs.uwrite, vp2, uname))
|
||||
|
||||
vfs1, _ = self.asrv.vfs.get(vp1, LEELOO_DALLAS, True, True)
|
||||
vfs1, _ = self.asrv.vfs.get(vp1, uname, True, True, True)
|
||||
if not vfs1.axs.umove:
|
||||
yeet("blocked rename (no-move-acc): " + vp1)
|
||||
t = "blocked rename (no-move-acc %s): /%s @%s"
|
||||
yeet(t % (vfs1.axs.umove, vp1, uname))
|
||||
|
||||
self.hub.up2k.handle_mv(LEELOO_DALLAS, vp1, vp2)
|
||||
self.hub.up2k.handle_mv(uname, vp1, vp2)
|
||||
try:
|
||||
bos.makedirs(ap2)
|
||||
except:
|
||||
@@ -242,52 +306,74 @@ class SMB(object):
|
||||
if not self.args.smbw:
|
||||
yeet("blocked mkdir (no --smbw): " + vpath)
|
||||
|
||||
vfs, ap = self._v2a("mkdir", vpath)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a("mkdir", vpath, uname=uname)
|
||||
if not vfs.axs.uwrite:
|
||||
yeet("blocked mkdir (no-write-acc): " + vpath)
|
||||
t = "blocked mkdir (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs.axs.uwrite, vpath, uname))
|
||||
|
||||
return bos.mkdir(ap)
|
||||
|
||||
def _stat(self, vpath: str, *a: Any, **ka: Any) -> os.stat_result:
|
||||
return bos.stat(self._v2a("stat", vpath, *a)[1], *a, **ka)
|
||||
try:
|
||||
ap = self._v2a("stat", vpath, *a, perms=[True, False])[1]
|
||||
ret = bos.stat(ap, *a, **ka)
|
||||
# debug(" `-stat:ok")
|
||||
return ret
|
||||
except:
|
||||
# white lie: windows freaks out if we raise due to an offline volume
|
||||
# debug(" `-stat:NOPE (faking a directory)")
|
||||
ts = int(time.time())
|
||||
return os.stat_result((16877, -1, -1, 1, 1000, 1000, 8, ts, ts, ts))
|
||||
|
||||
def _unlink(self, vpath: str) -> None:
|
||||
if not self.args.smbw:
|
||||
yeet("blocked delete (no --smbw): " + vpath)
|
||||
|
||||
# return bos.unlink(self._v2a("stat", vpath, *a)[1])
|
||||
vfs, ap = self._v2a("delete", vpath)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a(
|
||||
"delete", vpath, uname=uname, perms=[True, False, False, True]
|
||||
)
|
||||
if not vfs.axs.udel:
|
||||
yeet("blocked delete (no-del-acc): " + vpath)
|
||||
|
||||
vpath = vpath.replace("\\", "/").lstrip("/")
|
||||
self.hub.up2k.handle_rm(LEELOO_DALLAS, "1.7.6.2", [vpath], [], False)
|
||||
self.hub.up2k.handle_rm(uname, "1.7.6.2", [vpath], [], False)
|
||||
|
||||
def _utime(self, vpath: str, times: tuple[float, float]) -> None:
|
||||
if not self.args.smbw:
|
||||
yeet("blocked utime (no --smbw): " + vpath)
|
||||
|
||||
vfs, ap = self._v2a("utime", vpath)
|
||||
uname = self._uname()
|
||||
vfs, ap = self._v2a("utime", vpath, uname=uname)
|
||||
if not vfs.axs.uwrite:
|
||||
yeet("blocked utime (no-write-acc): " + vpath)
|
||||
t = "blocked utime (no-write-acc %s): /%s @%s"
|
||||
yeet(t % (vfs.axs.uwrite, vpath, uname))
|
||||
|
||||
return bos.utime(ap, times)
|
||||
|
||||
def _p_exists(self, vpath: str) -> bool:
|
||||
# ap = "?"
|
||||
try:
|
||||
bos.stat(self._v2a("p.exists", vpath)[1])
|
||||
ap = self._v2a("p.exists", vpath, perms=[True, False])[1]
|
||||
bos.stat(ap)
|
||||
# debug(" `-exists((%s)->(%s)):ok", vpath, ap)
|
||||
return True
|
||||
except:
|
||||
# debug(" `-exists((%s)->(%s)):NOPE", vpath, ap)
|
||||
return False
|
||||
|
||||
def _p_getsize(self, vpath: str) -> int:
|
||||
st = bos.stat(self._v2a("p.getsize", vpath)[1])
|
||||
st = bos.stat(self._v2a("p.getsize", vpath, perms=[True, False])[1])
|
||||
return st.st_size
|
||||
|
||||
def _p_isdir(self, vpath: str) -> bool:
|
||||
try:
|
||||
st = bos.stat(self._v2a("p.isdir", vpath)[1])
|
||||
return stat.S_ISDIR(st.st_mode)
|
||||
st = bos.stat(self._v2a("p.isdir", vpath, perms=[True, False])[1])
|
||||
ret = stat.S_ISDIR(st.st_mode)
|
||||
# debug(" `-isdir:%s:%s", st.st_mode, ret)
|
||||
return ret
|
||||
except:
|
||||
return False
|
||||
|
||||
|
||||
@@ -214,7 +214,7 @@ CONFIGID.UPNP.ORG: 1
|
||||
srv.sck.sendto(zb, addr[:2])
|
||||
|
||||
if cip not in self.txc.c:
|
||||
self.log("{} [{}] --> {}".format(srv.name, srv.ip, cip), "6")
|
||||
self.log("{} [{}] --> {}".format(srv.name, srv.ip, cip), 6)
|
||||
|
||||
self.txc.add(cip)
|
||||
self.txc.cln()
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
# coding: utf-8
|
||||
from __future__ import print_function, unicode_literals
|
||||
|
||||
import re
|
||||
import stat
|
||||
import tarfile
|
||||
|
||||
@@ -44,6 +45,7 @@ class StreamTar(StreamArc):
|
||||
self,
|
||||
log: "NamedLogger",
|
||||
fgen: Generator[dict[str, Any], None, None],
|
||||
cmp: str = "",
|
||||
**kwargs: Any
|
||||
):
|
||||
super(StreamTar, self).__init__(log, fgen)
|
||||
@@ -53,10 +55,36 @@ class StreamTar(StreamArc):
|
||||
self.qfile = QFile()
|
||||
self.errf: dict[str, Any] = {}
|
||||
|
||||
# python 3.8 changed to PAX_FORMAT as default,
|
||||
# waste of space and don't care about the new features
|
||||
# python 3.8 changed to PAX_FORMAT as default;
|
||||
# slower, bigger, and no particular advantage
|
||||
fmt = tarfile.GNU_FORMAT
|
||||
self.tar = tarfile.open(fileobj=self.qfile, mode="w|", format=fmt) # type: ignore
|
||||
if "pax" in cmp:
|
||||
# unless a client asks for it (currently
|
||||
# gnu-tar has wider support than pax-tar)
|
||||
fmt = tarfile.PAX_FORMAT
|
||||
cmp = re.sub(r"[^a-z0-9]*pax[^a-z0-9]*", "", cmp)
|
||||
|
||||
try:
|
||||
cmp, lv = cmp.replace(":", ",").split(",")
|
||||
lv = int(lv)
|
||||
except:
|
||||
lv = None
|
||||
|
||||
arg = {"name": None, "fileobj": self.qfile, "mode": "w", "format": fmt}
|
||||
if cmp == "gz":
|
||||
fun = tarfile.TarFile.gzopen
|
||||
arg["compresslevel"] = lv if lv is not None else 3
|
||||
elif cmp == "bz2":
|
||||
fun = tarfile.TarFile.bz2open
|
||||
arg["compresslevel"] = lv if lv is not None else 2
|
||||
elif cmp == "xz":
|
||||
fun = tarfile.TarFile.xzopen
|
||||
arg["preset"] = lv if lv is not None else 1
|
||||
else:
|
||||
fun = tarfile.open
|
||||
arg["mode"] = "w|"
|
||||
|
||||
self.tar = fun(**arg)
|
||||
|
||||
Daemon(self._gen, "star-gen")
|
||||
|
||||
|
||||
@@ -221,6 +221,7 @@ class StreamZip(StreamArc):
|
||||
fgen: Generator[dict[str, Any], None, None],
|
||||
utf8: bool = False,
|
||||
pre_crc: bool = False,
|
||||
**kwargs: Any
|
||||
) -> None:
|
||||
super(StreamZip, self).__init__(log, fgen)
|
||||
|
||||
@@ -275,6 +276,7 @@ class StreamZip(StreamArc):
|
||||
def gen(self) -> Generator[bytes, None, None]:
|
||||
errf: dict[str, Any] = {}
|
||||
errors = []
|
||||
mbuf = b""
|
||||
try:
|
||||
for f in self.fgen:
|
||||
if "err" in f:
|
||||
@@ -283,13 +285,20 @@ class StreamZip(StreamArc):
|
||||
|
||||
try:
|
||||
for x in self.ser(f):
|
||||
yield x
|
||||
mbuf += x
|
||||
if len(mbuf) >= 16384:
|
||||
yield mbuf
|
||||
mbuf = b""
|
||||
except GeneratorExit:
|
||||
raise
|
||||
except:
|
||||
ex = min_ex(5, True).replace("\n", "\n-- ")
|
||||
errors.append((f["vp"], ex))
|
||||
|
||||
if mbuf:
|
||||
yield mbuf
|
||||
mbuf = b""
|
||||
|
||||
if errors:
|
||||
errf, txt = errdesc(errors)
|
||||
self.log("\n".join(([repr(errf)] + txt[1:])))
|
||||
@@ -299,20 +308,23 @@ class StreamZip(StreamArc):
|
||||
cdir_pos = self.pos
|
||||
for name, sz, ts, crc, h_pos in self.items:
|
||||
buf = gen_hdr(h_pos, name, sz, ts, self.utf8, crc, self.pre_crc)
|
||||
yield self._ct(buf)
|
||||
mbuf += self._ct(buf)
|
||||
if len(mbuf) >= 16384:
|
||||
yield mbuf
|
||||
mbuf = b""
|
||||
cdir_end = self.pos
|
||||
|
||||
_, need_64 = gen_ecdr(self.items, cdir_pos, cdir_end)
|
||||
if need_64:
|
||||
ecdir64_pos = self.pos
|
||||
buf = gen_ecdr64(self.items, cdir_pos, cdir_end)
|
||||
yield self._ct(buf)
|
||||
mbuf += self._ct(buf)
|
||||
|
||||
buf = gen_ecdr64_loc(ecdir64_pos)
|
||||
yield self._ct(buf)
|
||||
mbuf += self._ct(buf)
|
||||
|
||||
ecdr, _ = gen_ecdr(self.items, cdir_pos, cdir_end)
|
||||
yield self._ct(ecdr)
|
||||
yield mbuf + self._ct(ecdr)
|
||||
finally:
|
||||
if errf:
|
||||
bos.unlink(errf["ap"])
|
||||
|
||||
@@ -1050,7 +1050,7 @@ class Up2k(object):
|
||||
if WINDOWS:
|
||||
rd = rd.replace("\\", "/").strip("/")
|
||||
|
||||
g = statdir(self.log_func, not self.args.no_scandir, False, cdir)
|
||||
g = statdir(self.log_func, not self.args.no_scandir, True, cdir)
|
||||
gl = sorted(g)
|
||||
partials = set([x[0] for x in gl if "PARTIAL" in x[0]])
|
||||
for iname, inf in gl:
|
||||
@@ -1065,6 +1065,12 @@ class Up2k(object):
|
||||
continue
|
||||
|
||||
lmod = int(inf.st_mtime)
|
||||
if stat.S_ISLNK(inf.st_mode):
|
||||
try:
|
||||
inf = bos.stat(abspath)
|
||||
except:
|
||||
continue
|
||||
|
||||
sz = inf.st_size
|
||||
if fat32 and not ffat and inf.st_mtime % 2:
|
||||
fat32 = False
|
||||
@@ -1445,9 +1451,11 @@ class Up2k(object):
|
||||
pf = "v{}, {:.0f}+".format(n_left, b_left / 1024 / 1024)
|
||||
self.pp.msg = pf + abspath
|
||||
|
||||
st = bos.stat(abspath)
|
||||
# throws on broken symlinks (always did)
|
||||
stl = bos.lstat(abspath)
|
||||
st = bos.stat(abspath) if stat.S_ISLNK(stl.st_mode) else stl
|
||||
mt2 = int(stl.st_mtime)
|
||||
sz2 = st.st_size
|
||||
mt2 = int(st.st_mtime)
|
||||
|
||||
if nohash or not sz2:
|
||||
w2 = up2k_wark_from_metadata(self.salt, sz2, mt2, rd, fn)
|
||||
@@ -1469,6 +1477,13 @@ class Up2k(object):
|
||||
if w == w2:
|
||||
continue
|
||||
|
||||
# symlink mtime was inconsistent before v1.9.4; check if that's it
|
||||
if st != stl and (nohash or not sz2):
|
||||
mt2b = int(st.st_mtime)
|
||||
w2b = up2k_wark_from_metadata(self.salt, sz2, mt2b, rd, fn)
|
||||
if w == w2b:
|
||||
continue
|
||||
|
||||
rewark.append((drd, dfn, w2, sz2, mt2))
|
||||
|
||||
t = "hash mismatch: {}\n db: {} ({} byte, {})\n fs: {} ({} byte, {})"
|
||||
|
||||
@@ -2125,7 +2125,7 @@ def list_ips() -> list[str]:
|
||||
def yieldfile(fn: str) -> Generator[bytes, None, None]:
|
||||
with open(fsenc(fn), "rb", 512 * 1024) as f:
|
||||
while True:
|
||||
buf = f.read(64 * 1024)
|
||||
buf = f.read(128 * 1024)
|
||||
if not buf:
|
||||
break
|
||||
|
||||
|
||||
@@ -261,7 +261,7 @@ window.baguetteBox = (function () {
|
||||
setloop(1);
|
||||
else if (k == "BracketRight")
|
||||
setloop(2);
|
||||
else if (e.shiftKey)
|
||||
else if (e.shiftKey && k != 'KeyR')
|
||||
return;
|
||||
else if (k == "ArrowLeft" || k == "KeyJ")
|
||||
showPreviousImage();
|
||||
|
||||
@@ -493,6 +493,7 @@ html.dz {
|
||||
--err-ts: #500;
|
||||
|
||||
text-shadow: none;
|
||||
font-family: 'scp', monospace, monospace;
|
||||
}
|
||||
html.dy {
|
||||
--fg: #000;
|
||||
@@ -1397,6 +1398,9 @@ input[type="checkbox"]:checked+label {
|
||||
color: #0e0;
|
||||
color: var(--a);
|
||||
}
|
||||
html.dz input {
|
||||
font-family: 'scp', monospace, monospace;
|
||||
}
|
||||
.opwide div>span>input+label {
|
||||
padding: .3em 0 .3em .3em;
|
||||
margin: 0 0 0 -.3em;
|
||||
@@ -1577,6 +1581,7 @@ html.cz .btn {
|
||||
border-bottom: .2em solid #709;
|
||||
}
|
||||
html.dz .btn {
|
||||
font-size: 1em;
|
||||
box-shadow: 0 0 0 .1em #080 inset;
|
||||
}
|
||||
html.dz .tgl.btn.on {
|
||||
@@ -2766,6 +2771,9 @@ html.c .opbox,
|
||||
html.a .opbox {
|
||||
margin: 1.5em 0 0 0;
|
||||
}
|
||||
html.dz .opview input.i {
|
||||
width: calc(100% - 17em);
|
||||
}
|
||||
html.c #tree,
|
||||
html.c #treeh,
|
||||
html.a #tree,
|
||||
@@ -2818,6 +2826,9 @@ html.a #u2btn {
|
||||
html.ay #u2btn {
|
||||
box-shadow: .4em .4em 0 #ccc;
|
||||
}
|
||||
html.dz #u2btn {
|
||||
letter-spacing: -.033em;
|
||||
}
|
||||
html.c #u2conf.ww #u2btn,
|
||||
html.a #u2conf.ww #u2btn {
|
||||
margin: -2em .5em -3em 0;
|
||||
|
||||
@@ -11,7 +11,7 @@
|
||||
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
|
||||
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/browser.css?_={{ ts }}">
|
||||
{%- if css %}
|
||||
<link rel="stylesheet" media="screen" href="{{ css }}?_={{ ts }}">
|
||||
<link rel="stylesheet" media="screen" href="{{ css }}_={{ ts }}">
|
||||
{%- endif %}
|
||||
</head>
|
||||
|
||||
@@ -173,7 +173,7 @@
|
||||
<script src="{{ r }}/.cpr/browser.js?_={{ ts }}"></script>
|
||||
<script src="{{ r }}/.cpr/up2k.js?_={{ ts }}"></script>
|
||||
{%- if js %}
|
||||
<script src="{{ js }}?_={{ ts }}"></script>
|
||||
<script src="{{ js }}_={{ ts }}"></script>
|
||||
{%- endif %}
|
||||
</body>
|
||||
|
||||
|
||||
@@ -80,6 +80,7 @@ var Ls = {
|
||||
"textfile-viewer",
|
||||
["I/K", "prev/next file"],
|
||||
["M", "close textfile"],
|
||||
["E", "edit textfile"],
|
||||
["S", "select file (for cut/rename)"],
|
||||
]
|
||||
],
|
||||
@@ -327,6 +328,7 @@ var Ls = {
|
||||
"tvt_prev": "show previous document$NHotkey: i\">⬆ prev",
|
||||
"tvt_next": "show next document$NHotkey: K\">⬇ next",
|
||||
"tvt_sel": "select file ( for cut / delete / ... )$NHotkey: S\">sel",
|
||||
"tvt_edit": "open file in text editor$NHotkey: E\">✏️ edit",
|
||||
|
||||
"gt_msel": "enable file selection; ctrl-click a file to override$N$N<em>when active: doubleclick a file / folder to open it</em>$N$NHotkey: S\">multiselect",
|
||||
"gt_zoom": "zoom",
|
||||
@@ -375,7 +377,10 @@ var Ls = {
|
||||
"fu_xe1": "failed to load unpost list from server:\n\nerror ",
|
||||
"fu_xe2": "404: File not found??",
|
||||
|
||||
"fz_tar": "plain gnutar file (linux / mac)",
|
||||
"fz_tar": "uncompressed gnu-tar file (linux / mac)",
|
||||
"fz_pax": "uncompressed pax-format tar (slower)",
|
||||
"fz_targz": "gnu-tar with gzip level 3 compression$N$Nthis is usually very slow, so$Nuse uncompressed tar instead",
|
||||
"fz_tarxz": "gnu-tar with xz level 1 compression$N$Nthis is usually very slow, so$Nuse uncompressed tar instead",
|
||||
"fz_zip8": "zip with utf8 filenames (maybe wonky on windows 7 and older)",
|
||||
"fz_zipd": "zip with traditional cp437 filenames, for really old software",
|
||||
"fz_zipc": "cp437 with crc32 computed early,$Nfor MS-DOS PKZIP v2.04g (october 1993)$N(takes longer to process before download can start)",
|
||||
@@ -544,6 +549,7 @@ var Ls = {
|
||||
"dokumentviser",
|
||||
["I/K", "forr./neste fil"],
|
||||
["M", "lukk tekstdokument"],
|
||||
["E", "rediger tekstdokument"]
|
||||
["S", "velg fil (for F2/ctrl-x/...)"]
|
||||
]
|
||||
],
|
||||
@@ -786,11 +792,12 @@ var Ls = {
|
||||
"tv_xe1": "kunne ikke laste tekstfil:\n\nfeil ",
|
||||
"tv_xe2": "404, Fil ikke funnet",
|
||||
"tv_lst": "tekstfiler i mappen",
|
||||
"tvt_close": "gå tilbake til mappen$NSnarvei: M\">❌ close",
|
||||
"tvt_close": "gå tilbake til mappen$NSnarvei: M\">❌ lukk",
|
||||
"tvt_dl": "last ned denne filen\">💾 last ned",
|
||||
"tvt_prev": "vis forrige dokument$NSnarvei: i\">⬆ prev",
|
||||
"tvt_next": "vis neste dokument$NSnarvei: K\">⬇ next",
|
||||
"tvt_sel": "markér filen ( for utklipp / sletting / ... )$NSnarvei: S\">sel",
|
||||
"tvt_prev": "vis forrige dokument$NSnarvei: i\">⬆ forr.",
|
||||
"tvt_next": "vis neste dokument$NSnarvei: K\">⬇ neste",
|
||||
"tvt_sel": "markér filen ( for utklipp / sletting / ... )$NSnarvei: S\">merk",
|
||||
"tvt_edit": "redigér filen$NSnarvei: E\">✏️ endre",
|
||||
|
||||
"gt_msel": "markér filer istedenfor å åpne dem; ctrl-klikk filer for å overstyre$N$N<em>når aktiv: dobbelklikk en fil / mappe for å åpne</em>$N$NSnarvei: S\">markering",
|
||||
"gt_zoom": "zoom",
|
||||
@@ -840,6 +847,9 @@ var Ls = {
|
||||
"fu_xe2": "404: Filen finnes ikke??",
|
||||
|
||||
"fz_tar": "ukomprimert gnu-tar arkiv, for linux og mac",
|
||||
"fz_pax": "ukomprimert pax-tar arkiv, litt tregere",
|
||||
"fz_targz": "gnu-tar pakket med gzip (nivå 3)$N$NNB: denne er veldig treg;$Nukomprimert tar er bedre",
|
||||
"fz_tarxz": "gnu-tar pakket med xz (nivå 1)$N$NNB: denne er veldig treg;$Nukomprimert tar er bedre",
|
||||
"fz_zip8": "zip med filnavn i utf8 (noe problematisk på windows 7 og eldre)",
|
||||
"fz_zipd": "zip med filnavn i cp437, for høggamle maskiner",
|
||||
"fz_zipc": "cp437 med tidlig crc32,$Nfor MS-DOS PKZIP v2.04g (oktober 1993)$N(øker behandlingstid på server)",
|
||||
@@ -3049,7 +3059,7 @@ function eval_hash() {
|
||||
goto('search');
|
||||
var i = ebi('q_raw');
|
||||
i.value = uricom_dec(v.slice(3));
|
||||
return i.oninput();
|
||||
return i.onkeydown({ 'key': 'Enter' });
|
||||
}
|
||||
|
||||
if (v.indexOf('#v=') === 0) {
|
||||
@@ -4015,6 +4025,8 @@ var showfile = (function () {
|
||||
|
||||
ebi('files').style.display = ebi('gfiles').style.display = ebi('lazy').style.display = ebi('pro').style.display = ebi('epi').style.display = 'none';
|
||||
ebi('dldoc').setAttribute('href', url);
|
||||
ebi('editdoc').setAttribute('href', url + (url.indexOf('?') > 0 ? '&' : '?') + 'edit');
|
||||
ebi('editdoc').style.display = (has(perms, 'write') && (is_md || has(perms, 'delete'))) ? '' : 'none';
|
||||
|
||||
var wr = ebi('bdoc'),
|
||||
defer = !Prism.highlightElement;
|
||||
@@ -4026,7 +4038,7 @@ var showfile = (function () {
|
||||
|
||||
el = el || QS('#doc>code');
|
||||
Prism.highlightElement(el);
|
||||
if (el.className == 'language-ans')
|
||||
if (el.className == 'language-ans' || (!lang && /\x1b\[[0-9;]{0,16}m/.exec(txt.slice(0, 4096))))
|
||||
r.ansify(el);
|
||||
}
|
||||
catch (ex) { }
|
||||
@@ -4185,6 +4197,7 @@ var showfile = (function () {
|
||||
'<a href="#" class="btn" id="prevdoc" tt="' + L.tvt_prev + '</a>\n' +
|
||||
'<a href="#" class="btn" id="nextdoc" tt="' + L.tvt_next + '</a>\n' +
|
||||
'<a href="#" class="btn" id="seldoc" tt="' + L.tvt_sel + '</a>\n' +
|
||||
'<a href="#" class="btn" id="editdoc" tt="' + L.tvt_edit + '</a>\n' +
|
||||
'</div>'
|
||||
);
|
||||
ebi('xdoc').onclick = function () {
|
||||
@@ -4887,6 +4900,8 @@ document.onkeydown = function (e) {
|
||||
if (showfile.active()) {
|
||||
if (k == 'KeyS')
|
||||
showfile.tglsel();
|
||||
if (k == 'KeyE' && ebi('editdoc').style.display != 'none')
|
||||
ebi('editdoc').click();
|
||||
}
|
||||
};
|
||||
|
||||
@@ -6672,6 +6687,9 @@ var arcfmt = (function () {
|
||||
var html = [],
|
||||
fmts = [
|
||||
["tar", "tar", L.fz_tar],
|
||||
["pax", "tar=pax", L.fz_pax],
|
||||
["tgz", "tar=gz", L.fz_targz],
|
||||
["txz", "tar=xz", L.fz_tarxz],
|
||||
["zip", "zip=utf8", L.fz_zip8],
|
||||
["zip_dos", "zip", L.fz_zipd],
|
||||
["zip_crc", "zip=crc", L.fz_zipc]
|
||||
@@ -6701,7 +6719,7 @@ var arcfmt = (function () {
|
||||
|
||||
for (var a = 0, aa = tds.length; a < aa; a++) {
|
||||
var o = tds[a], txt = o.textContent, href = o.getAttribute('href');
|
||||
if (txt != 'tar' && txt != 'zip')
|
||||
if (!/^(zip|tar|pax|tgz|txz)$/.exec(txt))
|
||||
continue;
|
||||
|
||||
var ofs = href.lastIndexOf('?');
|
||||
|
||||
@@ -1,3 +1,51 @@

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0902-0018 `v1.9.4` yes symlink times

hello! it's been a while, an entire day even...

## new features
* download folder as tar.gz, tar.bz2, tar.xz
  * single-threaded, so extremely slow, but nice for easily compressed data or challenged networks
  * append `?tar=gz`, `?tar=bz2` or `?tar=xz` to a folder URL to do it
  * default compression levels are gz:3, bz2:2, xz:1; override with `?tar=gz:9`

## bugfixes
* c1efd227b7377144a5760bc6cff64f4e87b626d9 symlink-deduplicated files got indexed with the wrong last-modified timestamp
  * mostly inconsequential; would cause the dupe's uploader-ip to be forgotten on the next server restart since it would reindex to "fix" the timestamp
* when linking [a search query](https://a.ocv.me/pub/#q=tags%20like%20soundsho*) it loads the results faster

## other changes
* update readme to mention that iPhones and iPads dislike the preload feature and respond by glitching the audio a bit when a song is exactly 20 seconds away from ending, and how it's probably a bad idea to disable preloading since i bet it's load-bearing against other iOS bugs
* speaking of iPhones and iPads, the [previous version](https://github.com/9001/copyparty/releases/tag/v1.9.3) should have fixed album playback on those


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0831-2211 `v1.9.3` iOS and http fixes

## new features
* iPhones and iPads are now able to...
  * 9986136dfb2364edb35aa9fbb87410641c6d6af3 play entire albums while the screen is off without the music randomly stopping
    * apple keeps breaking AudioContext in new and interesting ways; time to give up (no more equalizer)
  * 1c0d978979a703edeb792e552b18d3b7695b2d90 perform search queries and execute js code
    * by translating [smart-quotes](https://stackoverflow.com/questions/48678359/ios-11-safari-html-disable-smart-punctuation) into regular `'` and `"` characters
* python 3.12 support
  * technically a bugfix since it was added [a year ago](https://github.com/9001/copyparty/commit/32e22dfe84d5e0b13914b4d0e15c1b8c9725a76d) way before the first py3.12 alpha was released but turns out i botched it, oh well
* filter error messages so they never include the filesystem path where copyparty's python files reside
* print more context in server logs if someone hits an unexpected permission-denied

## bugfixes
found some iffy stuff combing over the code but, as far as I can tell, luckily none of these were dangerous:
* URL normalization was a bit funky, but it appears everything access-control-related was unaffected
* some url parameters were double-decoded, causing the unpost filtering and file renaming to fail if the values contained `%`
* clients could cause the server to return an invalid cache-control header, but newlines and control-characters got rejected correctly
* minor cosmetics / qol fixes:
  * reduced flickering on page load in chrome
  * fixed some console spam in search results
  * markdown documents now have the same line-height in directory listings and the editor


▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
# 2023-0826-2116 `v1.9.2` bigger hammer
@@ -125,8 +125,14 @@ authenticate using header `Cookie: cppwd=foo` or url param `&pw=foo`

| GET | `?b` | list files/folders at URL as simplified HTML |
| GET | `?tree=.` | list one level of subdirectories inside URL |
| GET | `?tree` | list one level of subdirectories for each level until URL |
| GET | `?tar` | download everything below URL as a tar file |
| GET | `?zip=utf-8` | download everything below URL as a zip file |
| GET | `?tar` | download everything below URL as a gnu-tar file |
| GET | `?tar=gz:9` | ...as a gzip-level-9 gnu-tar file |
| GET | `?tar=xz:9` | ...as an xz-level-9 gnu-tar file |
| GET | `?tar=pax` | ...as a pax-tar file |
| GET | `?tar=pax,xz` | ...as an xz-level-1 pax-tar file |
| GET | `?zip=utf-8` | ...as a zip file |
| GET | `?zip` | ...as a WinXP-compatible zip file |
| GET | `?zip=crc` | ...as an MSDOS-compatible zip file |
| GET | `?ups` | show recent uploads from your IP |
| GET | `?ups&filter=f` | ...where URL contains `f` |
| GET | `?mime=foo` | specify return mimetype `foo` |
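a hedged example combining these url parameters with the `?pw=` authentication mentioned above (address, folder and password are placeholders):

```bash
# fetch a password-protected folder as an xz-compressed pax-tar
curl "http://127.0.0.1:3923/backups/?tar=pax,xz&pw=hunter2" -o backups.tar.xz
```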
@@ -255,7 +255,7 @@ symbol legend,

| per-file permissions | | | | █ | █ | | █ | | █ | | | |
| per-file passwords | █ | | | █ | █ | | █ | | █ | | | |
| unmap subfolders | █ | | | | | | █ | | | █ | ╱ | • |
| index.html blocks list | | | | | | | █ | | | • | | |
| index.html blocks list | ╱ | | | | | | █ | | | • | | |
| write-only folders | █ | | | | | | | | | | █ | █ |
| files stored as-is | █ | █ | █ | █ | | █ | █ | | | █ | █ | █ |
| file versioning | | | | █ | █ | | | | | | | |

@@ -291,6 +291,7 @@ symbol legend,

* one-way folder sync from local to server can be done efficiently with [u2c.py](https://github.com/9001/copyparty/tree/hovudstraum/bin#u2cpy), or with webdav and conventional rsync
* can hot-reload config files (with just a few exceptions)
* can set per-folder permissions if that folder is made into a separate volume, so there is configuration overhead
* `index.html` on its own does not prevent directory listing, but permission `h` (instead of `r`) enforces index.html to be returned instead of folder contents
* [event hooks](https://github.com/9001/copyparty/tree/hovudstraum/bin/hooks) ([discord](https://user-images.githubusercontent.com/241032/215304439-1c1cb3c8-ec6f-4c17-9f27-81f969b1811a.png), [desktop](https://user-images.githubusercontent.com/241032/215335767-9c91ed24-d36e-4b6b-9766-fb95d12d163f.png)) inspired by filebrowser, as well as the more complex [media parser](https://github.com/9001/copyparty/tree/hovudstraum/bin/mtag) alternative
* upload history can be visualized using [partyjournal](https://github.com/9001/copyparty/blob/hovudstraum/bin/partyjournal.py)
* `k`/filegator remarks:
@@ -25,6 +25,6 @@ ba91ab0518c61eff13e5612d9e6b532940813f6b56e6ed81ea6c7c4d45acee4d98136a383a250675
|
||||
# win10
|
||||
00558cca2e0ac813d404252f6e5aeacb50546822ecb5d0570228b8ddd29d94e059fbeb6b90393dee5abcddaca1370aca784dc9b095cbb74e980b3c024767fb24 Jinja2-3.1.2-py3-none-any.whl
|
||||
7f8f4daa4f4f2dbf24cdd534b2952ee3fba6334eb42b37465ccda3aa1cccc3d6204aa6bfffb8a83bf42ec59c702b5b5247d4c8ee0d4df906334ae53072ef8c4c MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl
|
||||
4a20aeb52d4fde6aabcba05ee261595eeb5482c72ee27332690f34dd6e7a49c0b3ba3813202ac15c9d21e29f1cd803f2e79ccc1c45ec314fcd0a937016bcbc56 mutagen-1.46.0-py3-none-any.whl
|
||||
8a6e2b13a2ec4ef914a5d62aad3db6464d45e525a82e07f6051ed10474eae959069e165dba011aefb8207cdfd55391d73d6f06362c7eb247b08763106709526e mutagen-1.47.0-py3-none-any.whl
|
||||
926d408a886059a75cf12706fa061146f9f042b27fb6e65be7d49f398ed23fb0227639d84804586ac014c6bcf7d08cd86a09c1a20793d341aa0802d3d32a546b Pillow-10.0.0-cp311-cp311-win_amd64.whl
|
||||
c86bbeacad3ae3c7bde747f5b4f09c11eced841add14e79ec4a064e5e29ebca35460e543ba735b11bfb882837d5ff4371ce64492d28d096b4686233c9a8cda6d python-3.11.5-amd64.exe
|
||||
|
||||
@@ -17,13 +17,13 @@ uname -s | grep NT-10 && w10=1 || {

fns=(
  altgraph-0.17.3-py2.py3-none-any.whl
  pefile-2023.2.7-py3-none-any.whl
  pyinstaller-5.13.1-py3-none-win_amd64.whl
  pyinstaller-5.13.2-py3-none-win_amd64.whl
  pyinstaller_hooks_contrib-2023.7-py2.py3-none-any.whl
  pywin32_ctypes-0.2.2-py3-none-any.whl
  upx-4.1.0-win32.zip
)
[ $w10 ] && fns+=(
  mutagen-1.46.0-py3-none-any.whl
  mutagen-1.47.0-py3-none-any.whl
  Pillow-9.4.0-cp311-cp311-win_amd64.whl
  python-3.11.3-amd64.exe
}

@@ -47,7 +47,7 @@ fns=(

)
[ $w7x32 ] && fns+=(
  windows6.1-kb2533623-x86.msu
  pyinstaller-5.13.1-py3-none-win32.whl
  pyinstaller-5.13.2-py3-none-win32.whl
  python-3.7.9.exe
)
dl() { curl -fkLOC- "$1" && return 0; echo "$1"; return 1; }