It's a one-time download shared by all websites that use the Prompt API.
A bigger issue is that the models most standard PCs can run are both tiny and slow. I was going to try using the Prompt API to rewrite the text of (Infocom) text adventures on the fly, but for many PCs this is currently too slow to be feasible.
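For reference, a minimal sketch of what that on-the-fly rewriting might look like with the Prompt API. The `LanguageModel` global, `availability()`, `create()`, and `prompt()` names follow the current Chrome explainer, but the API is experimental and may change; the noir-tone prompt is just an illustrative example. Feature detection matters because only Chromium builds with the feature enabled expose it, and the first `create()` call can trigger the one-time model download mentioned above:

```javascript
// Hedged sketch of Chrome's experimental Prompt API.
// Returns null when the API isn't available (non-Chromium browsers, Node, etc.).
async function rewriteLine(line) {
  if (!("LanguageModel" in globalThis)) {
    return null; // Prompt API not exposed in this runtime
  }
  const availability = await LanguageModel.availability();
  if (availability === "unavailable") {
    return null; // device/browser can't run the built-in model
  }
  // May kick off the one-time shared model download on first use.
  const session = await LanguageModel.create();
  return session.prompt(
    `Rewrite this text-adventure line in a noir tone: ${line}`
  );
}

rewriteLine("You are standing in an open field west of a white house.")
  .then((out) => console.log(out ?? "Prompt API not available"));
```

Even when available, each `prompt()` call is a full local inference pass, which is where the per-line latency problem on slower PCs comes from.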
On a Mac, at least, the "correct combination of buttons" is trivial and easy to remember, even for someone like me who rarely uses em-dashes. (But I want to start using them more, because I'm sick to death of people treating the em-dash as a scarlet letter.)
On Linux, with a compose key, it's <compose><-><-><.> (at least with the settings I have, I don't think I overrode that one). "⸻" is even more fun. You can even make your own sequences, e.g. I've got <compose><O><h><m> for "Ω", and <compose><m><u> for "μ", very handy for electrical stuff like "160μA at 1.8V needs a resistance of 11.25kΩ, dissipating 288μW".
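Custom sequences like those usually live in `~/.XCompose`. A sketch of what the entries above might look like, assuming an X11/IBus setup that reads that file (keysym names like `Multi_key` and `minus` come from the standard X11 keysym tables; your input method may need a restart to pick up changes):

```
# ~/.XCompose — user-defined compose sequences (illustrative)
include "%L"   # keep the locale's default compose table

# the em-dash sequence <compose><-><-><.> is already in most default tables
<Multi_key> <O> <h> <m>  : "Ω"   # Greek capital omega
<Multi_key> <m> <u>      : "μ"   # Greek small mu
```

The `include "%L"` line matters: without it, defining any custom sequence replaces the whole default table instead of extending it.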
There's still a lot of jank. On iOS you can only do this with Safari, and even then you lose actual Safari niceties like the traditional browser UI. And for some reason it opens links in actual Safari, even if they point at the same app. Unless you have a single-page app that does nothing, this is not a viable route.
A nice bonus is that sysadmin tasks tend to be light in terms of token usage, which is very convenient given the increasingly strict usage limits these days.
By this point? Absolutely. They still get stuck in rabbit holes and go down the wrong path sometimes, so it's not fully fire-and-forget, but if you aren't taking advantage of LLMs to perform generic sysadmin drudgery, you're wasting time that could be better spent elsewhere.
The internet of 20 years ago was awash with info for running dedicated servers, fragmented and badly written in places, but it was all there. I can absolutely believe LLMs would enable more people to find that knowledge more easily.