## About The Pull Request

Replaces the asset subsystem's spritesheet generator with a rust-based implementation (https://github.com/tgstation/rust-g/pull/160). This is a rough port of https://github.com/BeeStation/BeeStation-Hornet/pull/10404, but it includes fixes for some cases I didn't catch that apply on TG. (FWIW, we've been using this system on prod for over a year and encountered no major issues.)

### TG MAINTAINER NOTE

### Batched Spritesheets

`/datum/asset/spritesheet_batched`: A version of the spritesheet system that collects a list of `/datum/universal_icon`s and sends them off to rustg asynchronously. Generation also runs on another thread, so the game doesn't block during `realize_spritesheet()`. The rust generation is about 10x faster at actual icon generation, but the biggest perk of the batched spritesheets is the caching system.

This PR notably does not convert a few things to the new spritesheet generator:

- Species and antagonist icons in the preferences view, because they use getFlatIcon ~~which can't be converted to universal icons~~.
  - Yes, this is still a *massive* cost to init, unfortunately. On Bee, I actually enabled the 'legacy' cache on prod and development, which you can see in my PR. That's why I added the 'clear cache' verb and the `unregister()` procs: they can force a regeneration at runtime. I decided not to port this, since I think it would be detrimental to the large number of contributors here.
  - It is *technically* possible to port parts of this to the uni_icon system by making a uni_icon version of getFlatIcon. However, some overlays use runtime-generated icons which are ~~completely unparseable to IconForge, since they're stored in the RSC and don't exist as files anywhere~~. This is most noticeable with things like hair (which blend additively with the hair mask on the server, thus making them invisible to `get_flat_uni_icon`).
  - It also doesn't help that species and antag icons will still need to generate a bunch of dummies and delete them just to verify cache validity.
- It is actually possible to write the RSC icons to the filesystem (using fcopy) and reference them in IconForge. However, I'm going to wait on doing this until I port my GAGS implementation, because it requires GAGS to exist on the filesystem as well.

#### Caching

IconForge generates a cache based on the set of icons used, all transform operations applied, and the source DMIs of each icon used within the spritesheet. It can compare the hashes and invalidate the cache automatically if any of these change. This means we can enable caching on development with no downsides: if anything changes, the cache invalidates itself. Caching has a mean cost of ~5ms and saves a lot of time compared to generating the spritesheet, even with rust's faster generation. The main downside is that the cache still requires building the list of icons and their transforms, then JSON-encoding it to send to rustg.

Here's an abbreviated example of a cache JSON. All of these need to match for the cache to be valid. `input_hash` contains the transform definitions for all the sprites in the spritesheet, so if the input to IconForge changes, that hash catches it. The `sizes` and `sprites` are loaded into DM.

```json
{
	"input_hash": "99f1bc67d590e000",
	"dmi_hashes": {
		"icons/ui/achievements/achievements.dmi": "771200c75da11c62"
	},
	"sizes": [ "76x76" ],
	"sprites": {
		"achievement-rustascend": {
			"size_id": "76x76",
			"position": 1
		}
	},
	"rustg_version": "3.6.0",
	"dm_version": 1
}
```

### Universal Icons

Universal icons are just a collection of a DMI, an icon state, and any icon transformation procs you apply (blends, crops, scales). They can be converted to DM icons via `to_icon()`. I've included an implementation of GAGS that produces universal icons, allowing GAGS items to be converted into them.
IconForge can read universal icons and add them to spritesheets. It's basically just a wrapper that reimplements BYOND icon procs.

### Other Stuff

Converts some uses of md5asfile within legacy spritesheets to use rustg_hash_file instead, improving the performance of their generation.

Fixes lizard body markings not showing in previews, and re-adds eyes to the ethereal color preview. This is a side effect of IconForge having *much* better error handling than DM icon procs: invalid input errors loudly instead of silently doing nothing.

Changes the CSS used in legacy spritesheet generation to split `background: url(...) no-repeat` into separate properties. This is necessary for WebView2, as IE treats these properties differently: adding `background-color` to an icon object (as seen in the R&D console) won't work if you don't split these out.

Deletes unused spritesheets and their associated icons (condiments spritesheet, old PDA spritesheet).

## Why It's Good For The Game

If you press "Character Setup", the 10-13 seconds of lag is now approximately 0.5-2 seconds.

Tracy profile showing the time spent on get_asset_datum. I pressed the preferences button during init on both branches. Do note that this was run with a smart cache HIT, so no generation occurred.

Much lower worst-case for /datum/asset/New (which includes `create_spritesheets()` and `register()`).

Here's a look at the internal costs from rustg - as you can see, `generate_spritesheet()` is very fast:

### Comparison for a single spritesheet - chat spritesheet

**Before**

**After**

## Changelog

🆑
fix: Fixed lizard body markings and ethereal feature previews in the preference menu missing some overlays.
refactor: Optimized spritesheet asset generation using rustg IconForge, greatly reducing post-initialization lag as well as init times and server computation.
config: Added 'smart' asset caching, for batched rustg IconForge spritesheets.
It is persistent and suitable for use on local, with automatic invalidation.
add: Added admin verbs - Debug -> Clear Smart/Legacy Asset Cache for spritesheets.
fix: Fixed R&D console icons breaking on WebView2/516.
/🆑
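As a sketch of the cache-validation flow described in the Caching section above, here is a minimal Python analogue. All function and parameter names here are illustrative; the real checks happen inside rustg's IconForge, not in DM or Python.

```python
import json

def load_cached_spritesheet(cache_json, current_input_hash, current_dmi_hashes,
                            rustg_version, dm_version):
    """Decide whether a cached spritesheet is still valid.

    Every field of the cache JSON must match the current state; any
    mismatch means something changed, so the cache invalidates itself
    and the spritesheet must be regenerated.
    """
    cache = json.loads(cache_json)
    if cache.get("rustg_version") != rustg_version:
        return None  # rustg was updated: regenerate
    if cache.get("dm_version") != dm_version:
        return None  # DM-side cache format changed: regenerate
    if cache.get("input_hash") != current_input_hash:
        return None  # icon set or transform definitions changed: regenerate
    if cache.get("dmi_hashes") != current_dmi_hashes:
        return None  # a source .dmi file changed on disk: regenerate
    # Cache hit: hand `sizes` and `sprites` back to the caller (loaded into DM).
    return {"sizes": cache["sizes"], "sprites": cache["sprites"]}
```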
/**
 * For FTP requests. (i.e. downloading runtime logs.)
 *
 * However it'd be ok to use for accessing attack logs and such too, which are even laggier.
 */
GLOBAL_VAR_INIT(fileaccess_timer, 0)

/client/proc/browse_files(root_type=BROWSE_ROOT_ALL_LOGS, max_iterations=10, list/valid_extensions=list("txt", "log", "htm", "html", "gz", "json"))
	// wow why was this ever a parameter
	var/root = "data/logs/"
	switch(root_type)
		if(BROWSE_ROOT_ALL_LOGS)
			root = "data/logs/"
		if(BROWSE_ROOT_CURRENT_LOGS)
			root = "[GLOB.log_directory]/"
	var/path = root

	for(var/i in 1 to max_iterations)
		var/list/choices = flist(path)
		if(path != root)
			choices.Insert(1, "/")
		choices = sort_list(choices) + "Download Folder"

		var/choice = input(src, "Choose a file to access:", "Download", null) as null|anything in choices
		switch(choice)
			if(null)
				return
			if("/")
				path = root
				continue
			if("Download Folder")
				var/list/comp_flist = flist(path)
				var/confirmation = input(src, "Are you SURE you want to download all the files in this folder? (This will open [length(comp_flist)] prompt[length(comp_flist) == 1 ? "" : "s"])", "Confirmation") in list("Yes", "No")
				if(confirmation != "Yes")
					continue
				for(var/file in comp_flist)
					src << ftp(path + file)
				return
		path += choice

		if(copytext_char(path, -1) != "/") // didn't choose a directory, no need to iterate again
			break

	var/extensions
	for(var/i in valid_extensions)
		if(extensions)
			extensions += "|"
		extensions += "[i]"
	var/regex/valid_ext = new("\\.([extensions])$", "i")
	if(!fexists(path) || !valid_ext.Find(path))
		to_chat(src, "<font color='red'>Error: browse_files(): File not found/Invalid file([path]).</font>")
		return

	return path

#define FTPDELAY 200 // 200 tick delay to discourage spam
#define ADMIN_FTPDELAY_MODIFIER 0.5 // Admins get to spam files faster since we ~trust~ them!

/*
 * This proc is a failsafe to prevent spamming of file requests.
 * It is just a timer that only permits a download every [FTPDELAY] ticks.
 * This can be changed by modifying FTPDELAY's value above.
 *
 * PLEASE USE RESPONSIBLY, some log files can reach sizes of 4MB!
 */
/client/proc/file_spam_check()
	var/time_to_wait = GLOB.fileaccess_timer - world.time
	if(time_to_wait > 0)
		to_chat(src, "<font color='red'>Error: file_spam_check(): Spam. Please wait [DisplayTimeText(time_to_wait)].</font>")
		return TRUE
	var/delay = FTPDELAY
	if(holder)
		delay *= ADMIN_FTPDELAY_MODIFIER
	GLOB.fileaccess_timer = world.time + delay
	return FALSE

#undef FTPDELAY
#undef ADMIN_FTPDELAY_MODIFIER

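The global-cooldown pattern used by file_spam_check above (a single "next allowed time" variable, with a shorter delay for admins) can be sketched in Python. Names and the use of seconds instead of BYOND ticks are illustrative only:

```python
import time

# Global "next allowed" timestamp, standing in for GLOB.fileaccess_timer.
FTP_DELAY = 2.0          # seconds between downloads (stand-in for FTPDELAY ticks)
ADMIN_MODIFIER = 0.5     # admins wait half as long, mirroring ADMIN_FTPDELAY_MODIFIER
_next_allowed = 0.0

def file_spam_check(is_admin, now=None):
    """Return True if the request is spam (still on cooldown), False if allowed.

    On an allowed request, arm the cooldown for the next caller.
    """
    global _next_allowed
    if now is None:
        now = time.monotonic()
    if now < _next_allowed:
        return True  # too soon: reject and leave the timer untouched
    delay = FTP_DELAY * (ADMIN_MODIFIER if is_admin else 1.0)
    _next_allowed = now + delay
    return False
```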
/**
 * Takes a directory and returns every file within every sub directory.
 * If extensions_filter is provided, then only files that end in that extension are given back.
 * If extensions_filter is a list, any file that matches at least one entry is given back.
 */
/proc/pathwalk(path, extensions_filter)
	var/list/jobs = list(path)
	var/list/filenames = list()

	while(jobs.len)
		var/current_dir = pop(jobs)
		var/list/new_filenames = flist(current_dir)
		for(var/new_filename in new_filenames)
			// if the filename ends in "/" it is a directory, append it to the job queue
			if(findtext(new_filename, "/", -1))
				jobs += "[current_dir][new_filename]"
				continue
			// filename extension filtering
			if(extensions_filter)
				if(islist(extensions_filter))
					for(var/allowed_extension in extensions_filter)
						if(endswith(new_filename, allowed_extension))
							filenames += "[current_dir][new_filename]"
							break
				else if(endswith(new_filename, extensions_filter))
					filenames += "[current_dir][new_filename]"
			else
				filenames += "[current_dir][new_filename]"
	return filenames

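The same iterative traversal (a work queue of directories instead of recursion, with optional suffix filtering) can be sketched in Python using only the standard library:

```python
import os

def pathwalk(path, extensions_filter=None):
    """Collect every file under `path`, loosely mirroring /proc/pathwalk.

    `extensions_filter` may be a single suffix string or a list of
    suffixes; files matching at least one entry are returned.
    """
    jobs = [path]           # directories left to scan, like DM's `jobs` list
    filenames = []
    while jobs:
        current_dir = jobs.pop()
        for name in os.listdir(current_dir):
            full = os.path.join(current_dir, name)
            if os.path.isdir(full):
                jobs.append(full)  # directory: queue it for a later pass
                continue
            if extensions_filter:
                suffixes = (extensions_filter if isinstance(extensions_filter, list)
                            else [extensions_filter])
                if any(name.endswith(suffix) for suffix in suffixes):
                    filenames.append(full)
            else:
                filenames.append(full)
    return filenames
```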
/proc/pathflatten(path)
	return replacetext(path, "/", "_")

/// Returns the md5 of a file at a given path.
/proc/md5filepath(path)
	. = md5(file(path))

/// Save file as an external file then md5 it.
/// Used because md5ing files stored in the rsc sometimes gives incorrect md5 results.
/// https://www.byond.com/forum/post/2611357
/proc/md5asfile(file)
	var/static/notch = 0
	// it's important this code can handle md5filepath sleeping instead of hard blocking, if it's converted to use rust_g.
	var/filename = "tmp/md5asfile.[world.realtime].[world.timeofday].[world.time].[world.tick_usage].[notch]"
	notch = WRAP(notch + 1, 0, 2 ** 15)
	fcopy(file, filename)
	. = md5filepath(filename)
	fdel(filename)

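The copy-hash-delete shape of md5asfile (write to a unique temporary path, hash that copy, then clean up) can be sketched in Python. Note the copy step is only needed in BYOND because of the RSC quirk linked above; in Python it is purely illustrative, and all names here are hypothetical:

```python
import hashlib
import os
import shutil
import tempfile

def md5_as_file(source_path):
    """Copy a file to a throwaway temp path, md5 the copy, then delete it.

    Loosely mirrors md5asfile's flow: unique temp name (mkstemp plays the
    role of the world.time/notch filename), fcopy, md5filepath, fdel.
    """
    fd, tmp_name = tempfile.mkstemp(prefix="md5asfile.")
    os.close(fd)
    try:
        shutil.copyfile(source_path, tmp_name)
        digest = hashlib.md5()
        with open(tmp_name, "rb") as handle:
            # Hash in chunks so large log files don't load into memory at once.
            for chunk in iter(lambda: handle.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()
    finally:
        os.remove(tmp_name)  # equivalent of fdel(filename)
```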
/**
 * Sanitizes the name of each node in the path.
 *
 * In case you are wondering when to use this proc and when to use SANITIZE_FILENAME:
 *
 * You use SANITIZE_FILENAME to sanitize the name of a file [e.g. example.txt]
 *
 * You use sanitize_filepath to sanitize the path of a file [e.g. root/node/example.txt]
 *
 * If you use SANITIZE_FILENAME to sanitize a file path, things will break.
 */
/proc/sanitize_filepath(path)
	. = ""
	var/delimiter = "/" // Very much intentionally hardcoded
	var/list/all_nodes = splittext(path, delimiter)
	for(var/node in all_nodes)
		if(.)
			. += delimiter // Add the delimiter before each successive node.
		. += SANITIZE_FILENAME(node)
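The split/sanitize-each-node/rejoin idea can be sketched in Python. SANITIZE_FILENAME is a DM macro whose exact rules live elsewhere, so the stand-in below (a conservative character whitelist) is hypothetical, as is everything else here:

```python
import re

def sanitize_filename(node):
    """Hypothetical stand-in for DM's SANITIZE_FILENAME macro:
    keep only a conservative whitelist of filename characters."""
    return re.sub(r"[^A-Za-z0-9_.\-]", "", node)

def sanitize_filepath(path):
    """Sanitize each node of a path separately, preserving the "/"
    separators, loosely mirroring /proc/sanitize_filepath."""
    delimiter = "/"  # very much intentionally hardcoded
    return delimiter.join(sanitize_filename(node) for node in path.split(delimiter))
```

Sanitizing per node is what keeps the separators intact: running the whole path through the filename sanitizer would strip the `/` characters and flatten the path.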