• 2 Posts
  • 61 Comments
Joined 7 months ago
Cake day: January 25th, 2024

  • If you’ve got a VPS at your disposal, many of the homepage dashboards I’ve tried over the years have some amount of caching that makes them quite fast, or even lets them operate offline (“Homer”, for one, required me to deeply purge my cache because it would still appear when my site was offline…despite having been replaced long ago! 😂). Or, if you wanted to roll your own static HTML page, you can absolutely add a Service Worker for your own offline caching.

    That’s where I’m at now. I use a custom static HTML page with a Service Worker as my homepage and new-tab page on all my devices. This page is a bouncer: it checks whether I’m at home (or whether my local dashboard is offline) and either redirects me to the local homepage, which has all my homelab services on it, or, if that fails, tells me I might be abroad or offline and lists a few public websites.
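
    A rough sketch of that “bouncer” check, assuming a probe-with-timeout approach — the URLs, the timeout, and the function name are all illustrative placeholders, not the real setup:

```javascript
// Bouncer sketch: probe a hypothetical local dashboard with a short timeout.
// If it answers, bounce there; otherwise fall back to a public-site list.
const LOCAL_DASHBOARD = "http://dashboard.local/"; // hypothetical homelab URL
const FALLBACK_SITES = ["https://example.com/", "https://example.org/"];

async function bounce(timeoutMs = 1500) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    await fetch(LOCAL_DASHBOARD, { mode: "no-cors", signal: controller.signal });
    if (typeof window !== "undefined") window.location.replace(LOCAL_DASHBOARD);
    return "local";
  } catch {
    return "fallback"; // offline or away from home: render FALLBACK_SITES instead
  } finally {
    clearTimeout(timer);
  }
}
```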

    And yes, this works offline or over a shitty connection. Essentially, the service worker quickly serves the cached page from browser storage, then takes its time checking for the live version. If it gets one, it updates the cache; if not, enjoy the offline version.
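
    That pattern is usually called stale-while-revalidate; a minimal sketch, assuming a single-page cache (the cache name and helper function are made up for illustration):

```javascript
// sw.js — minimal stale-while-revalidate sketch of the behavior described
// above. The cache name and helper are illustrative, not the actual setup.
const CACHE_NAME = "homepage-v1";

// Serve the cached copy immediately, then refresh the cache in the
// background. If the network fails, the cached copy is all we return.
async function staleWhileRevalidate(request, cache) {
  const cached = await cache.match(request);
  const network = Promise.resolve()
    .then(() => fetch(request))
    .then((response) => {
      if (response && response.ok) cache.put(request, response.clone());
      return response;
    })
    .catch(() => cached); // offline: fall back to whatever we had
  return cached || network; // the cached copy wins if it exists
}

// Register the handler only inside a real worker context.
if (typeof self !== "undefined" && typeof caches !== "undefined") {
  self.addEventListener("fetch", (event) => {
    event.respondWith(
      caches.open(CACHE_NAME).then((cache) =>
        staleWhileRevalidate(event.request, cache)
      )
    );
  });
}
```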

  • Yeah, I can see more of this happening as demand for quality products increases.

    Things that don’t need replacing don’t bring in more money year over year, which means companies have to keep coming up with other excuses for you to buy a new one just to stay above water.

    Any time purchases reach critical mass and almost everyone has bought the “last gizmo you’ll ever need”, they’ll have to release the last-last gizmo you’ll ever need.

    A one-time-purchase “forever mouse” would just mean that once sales drop they need to release the forever-ever mouse, now with an extra button; then, when that one peaks, the forever-and-ever mouse, with one more button than that.

    Or they’ll hit a ceiling and go the way of Instant Pot.

    It feels like a choice between rental (this) or rental with extra e-waste (any time you replace a cheaply made or planned-obsolescence product), and it sucks.


  • I mean, it kinda makes sense. Especially in this day and age, an appeal is the final say, not the court ruling (it feels like everything gets appealed). So this way, the place where that happens is the highest court in the state. The final ruling is about whether the highest non-appeals court did it right, not about the original issue.

    Or, put another way: if you tell me the highest court in the land has made a decision, I would expect that to be the end of it. But it’s not; from the moment the verdict is read, lawyers are preparing an appeal. Therefore, whatever court takes the appeal makes the true final decision. Why not, then, make that the highest court in the land and better reflect its role?



  • Is there a list anywhere of this and other settings and features that could or should be changed to improve Firefox privacy?

    Other than that, I’m not sure I’m really going to jump ship. I think I’m getting too old for the “clunkiness” that comes with trying to use third-party/self-hosted alternatives to replace features that ultimately break the privacy angle, or with adding them to barebones privacy-focused browsers. Containers and profile/bookmark syncing, for example. But if there’s a list of switches I can flip to turn off the most egregious things, that would be good for today.
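
    Not a full list, but a few commonly cited switches can be set in a `user.js` file in the Firefox profile folder. Pref names are version-dependent, so verify each one in `about:config` before relying on it:

```javascript
// user.js — a hedged starting point, not a complete hardening list.
user_pref("toolkit.telemetry.enabled", false);                // disable telemetry
user_pref("datareporting.healthreport.uploadEnabled", false); // no health-report uploads
user_pref("app.shield.optoutstudies.enabled", false);         // opt out of Shield studies
user_pref("privacy.trackingprotection.enabled", true);        // enhanced tracking protection
```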


  • Gas stations attached mechanic shops, and then convenience stores, even though you don’t spend a lot of time refueling.

    Charging centers simply need to do the same. Or restaurants etc need to invest in charging stations.

    If you go on a long trip and need to charge, why not spend that time also meeting your personal needs for food and a bathroom? By the time you finish a meal at the attached full-service restaurant, both your car and your passengers will be fully refueled.

    Even though I don’t personally own an EV, just a hybrid, that much became obvious as soon as my local grocery store added a couple of charging spots. Only a couple, but it’s so obvious that the answer to long charging times is simply to have something else to do.


  • You would go for a Raspberry Pi when you need something it was invented for.

    Putting a computer on your motorcycle, robot, or solar-powered RV. Super-small spaces, very-low-power situations, or direct GPIO control.

    A MiniMicro will run laps around a Pi for general compute, but you can’t run it off a cell-phone battery pack. People only associate Pis with general compute because of the push to sell them as affordable school computers; not because they were awesome at it, but because they were cheap and just barely enough.


  • PassingThrough@lemmy.world to Memes@lemmy.ml · Get rich quick
    2 months ago

    Forgive me, I’m no AI expert, so I can’t fully relate tokens-per-second measurements to the average query Siri might handle, but I will say this:

    Even in your article, only the largest model ran at 8 tokens/sec; the others ran much faster, and none of them were optimized for a task, just benchmarking.

    Would it be impossible for Apple to run an optimized model specific to expected mobile tasks, and to leverage their own hardware more efficiently than we can, to meet their needs?

    I imagine they cut out most worldly knowledge and use a lightweight model, which is why there is still a need to hand some requests off to ChatGPT or to Apple’s servers. Would this let them trim Siri down to perform well enough on phones for most requests? They also advertised launching AI on M1/M2 chip devices, which are not M3 Max either…
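
    As a rough sanity check on those numbers — the reply length here is an assumption for illustration, not a figure from the article:

```javascript
// Back-of-the-envelope generation latency. 8 t/s is the slowest benchmark
// figure cited above; 30 tokens is an assumed short, Siri-style reply.
const tokensPerSecond = 8;
const replyTokens = 30;
const seconds = replyTokens / tokensPerSecond;
console.log(seconds); // → 3.75 seconds of generation at the worst-case rate
```

    So even the slowest cited model would answer a short query in a few seconds, and a task-optimized model would presumably do better.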


  • PassingThrough@lemmy.world to Memes@lemmy.ml · Get rich quick
    edited · 2 months ago

    Onboard AI chips will allow this to be local.

    Phones do not have the power to ~~~

    Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer? Gotta have those latest and greatest chips.

    It will be fun to see how it all shakes out. If the AI can’t run most queries on the phone, with all this advertising of local processing… there’ll be one hell of a lawsuit coming up.

    EDIT: Finished looking for what I thought I remembered…

    Additionally, Siri has been locally processed since iOS 15.

    https://www.macrumors.com/how-to/use-on-device-siri-iphone-ipad/