I’m beautiful and tough like a diamond…or beef jerky in a ball gown.

  • 25 Posts
  • 246 Comments
Joined 8 months ago
Cake day: July 15th, 2025



  • The world is just as fucked up as it ever was. The only difference now is that every fucked-up thing that ever happens anywhere is getting pushed to your always-on doomscroll device in real time with people attaching their mostly ignorant opinions to it.

    This is where the “touch grass” advice comes into play. In broad terms, the real world is not nearly the hellhole social media portrays it to be.


  • I get what you’re saying, and the “individual carbon footprint” is often used to shift blame onto regular people just living their lives, but we do still have a carbon footprint. It may be a tiny, rodent-sized footprint compared to the Kaiju-sized ones of big industries, but our actions and choices do have an effect (especially collectively).

    I just don’t like dismissing the individual carbon footprint as total propaganda because it’s not wrong (though I acknowledge it is abused). Dismissing it like that just puts out a defeatist “nothing I do matters” message when our individual choices do matter and add up.

    Can you live a totally carbon-neutral life in the modern age? No, probably not. But we also shouldn’t throw the baby out with the bathwater and do nothing.


  • Disclaimer: All of my LLM experience is with local models in Ollama on extremely modest hardware (an old laptop with Nvidia graphics), so I can’t speak to the technical reasons the context window isn’t infinite, or at least larger, on the big players’ models. My understanding is that the context window is basically the model’s short-term memory. In humans, short-term memory is also fairly limited in capacity. But unlike humans, the LLM can’t really see (or hold) the big picture in its mind.

    But yeah, everything you said is correct. Expanding on that, if you try to get it to generate something long-form, such as a novel, it’s basically just generating endless chapters, using the previous chapter (or however much of the history fits into its context window) as the reference for the next. That means, at minimum, the result will be full of plot holes and will never reach a conclusion unless it’s explicitly directed to wrap things up. And even then, given the limited context window, the ending will be based on only the previous chapter or two, so it will be full of plot holes too.

    It’s funny because I recently found an old backup drive from high school with some half-written Jurassic Park fan fiction on it, so I tasked an LLM with fleshing it out, mostly for shits and giggles. The result is pure slop that seems like it’s building to something and ultimately goes nowhere. The other funny thing is that it reads almost exactly like a season of Camp Cretaceous / Chaos Theory (the animated kids’ JP series), and I now fully believe those are also LLM-generated.
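The chapter-by-chapter effect described above can be sketched as a toy simulation. Nothing here is a real LLM call; `generate_chapter`, `CONTEXT_WINDOW`, and the window size are all made up for illustration:

```python
# Toy simulation of sliding-window long-form generation.
# Everything older than the context window silently falls off.
CONTEXT_WINDOW = 200  # pretend the model can only "see" 200 characters

def generate_chapter(visible_history: str, chapter_num: int) -> str:
    # A real LLM would condition on visible_history; here we just record
    # how much of the story the "model" could actually see.
    return (f"Chapter {chapter_num} (written while seeing only the last "
            f"{len(visible_history)} characters of the story so far)\n")

story = ""
for n in range(1, 6):
    visible = story[-CONTEXT_WINDOW:]  # truncate: the rest is forgotten
    story += generate_chapter(visible, n)

print(story)
```

By chapter 4 the visible history has hit the cap, so every later chapter is written from the same fixed-size slice of recent text, which is exactly why the plot drifts and never converges.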

  • I remember the individual was frustrated because without the app, you couldn’t modify anything on the device…which seems very bad.

    Sadly, that’s how this one is, but the defaults are sane enough that you could get by without it. I didn’t buy it for the features that need the cloud service, so if those become unavailable, that’s fine for my use case. The consolation is that all the configuration can be done with just the app over Bluetooth.

    I’m hoping someone reverse engineers the Bluetooth protocol. I did an HCI dump while it was connecting and streaming data, but I can’t make the slightest bit of sense of it.