maegul (he/they)

A little bit of neuroscience and a little bit of computing

  • 55 Posts
  • 1.51K Comments
Joined 2 years ago
Cake day: January 19th, 2023



  • It’s definitely an interesting and relevant idea I think! A major flaw here is the lack of ability for communities to establish themselves as discrete spaces separate from the doomscrolling crowd.

    A problem with the fediverse as a whole IMO, as community building is what it should be focusing on.

    Generally decentralisation makes things like this difficult, AFAIU. Lemmy has things like private and local-only communities in the works that will get you there. But then discovery becomes a problem, which probably requires some additional features too.



  • Oh yea, I hear you.

    What your point does though is open up the discussion about whether enforcement makes financial sense in isolation. And once you open that door, the whole thing becomes uncomfortable for a lot of people who are stuck in a simple black-and-white justice mentality, where “do what you’re supposed to, pay what they charge, or be punished” is all there is to making the world work well. You know, “law and order” types.

    You’re trying to talk about incentives. For many though that’s a very dangerous slippery slope. So I’m trying to get ahead of that and wonder if the end of that slippery slope is actually a demonstrably good thing.


  • I remember hearing rumours during the rollout that tech employees were found asking for help on forums in ways that weren’t promising for the health and talent of the people building it.

    But yea, it’s the embarrassment of this sort of stuff that must be masking the real financials of PT and how viable a free system would be.


  • Yea I’ve kept track of how often I’ve encountered inspectors, and most of the time it’d be worth it to not buy a ticket or not tap on. Sometimes though I’ve noticed an increase in the number of inspectors that would definitely shift the equation. Also train stations with gates complicate the matter.

    I don’t know if it’s out there, but I’d personally like to know how the finances come out for making PT free. You obviously lose the fare revenue, but you also drop all of the overhead of paying for inspectors and for all of the ticketing infrastructure. I also wonder if the part that makes the finances work is all the fines collected, which would be pretty fucking shithouse if true.
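    To put rough numbers on that, here’s a quick back-of-the-envelope sketch (the fare, fine and inspection rate below are made-up placeholders, not real figures for any network):

    ```python
    # Rough expected-cost comparison: always paying the fare vs never paying.
    # All numbers are placeholder assumptions; swap in your own fare, fine
    # amount and how often you actually run into inspectors.

    fare = 5.50              # assumed cost of a single trip
    fine = 300.00            # assumed penalty if caught without a ticket
    inspection_rate = 0.01   # assumed chance any given trip gets inspected

    cost_paying = fare                      # cost per trip if you always pay
    cost_evading = inspection_rate * fine   # expected cost per trip if you never pay

    print(f"Paying:  ${cost_paying:.2f} per trip")
    print(f"Evading: ${cost_evading:.2f} per trip (expected)")

    # Evading stops being "worth it" once inspection_rate * fine exceeds the fare.
    print(f"Break-even inspection rate: {fare / fine:.1%}")
    ```

    With those placeholder numbers, evading only stops paying off once you’re inspected on more than roughly 1 trip in 55, which is why the number of inspectors you actually see (and gated stations) shifts the equation so much.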


  • The catch is that the whole system is effectively centralised on BlueSky’s backend services (basically the relay). So while the protocol may be standardised and open, and implemented with decentralised components, they’ll control the core service, which means they can unilaterally decide to introduce profitable things like ads and charging for features.

    The promise of the system though is that it provides for various levels of independence that can all connect to each other, so people with different needs and capabilities can all find their spot in the ecosystem. Whether that happens is a big question. Generally I’d say I’m optimistic about the ideas and architecture, but unsure about whether the community around it will get it to what I think it should be.



  • I think for python tooling the choice is Python Vs Rust. C isn’t in the mix either.

    That seems fair. Though I recall Mamba making headway (at least in the anaconda/conda space) and it is a C++ project. AFAIU, its underlying internals have now been folded into conda, which would mean a fairly popular, and arguably successful, portion of the tooling ecosystem (I tended to reach for conda and recommend the same to many) is reliant on a C++ foundation.

    On the whole, I imagine this is a good thing, as the biggest issue Conda had was performance when trying to resolve environments and package versions.

    So, including C++ as part of C (which is probably fair for the purposes of this discussion), I don’t think C is out of the mix either. Should there ever be a push to fold something into core python, using C would probably come back into the picture too.


    I think there’s a survivor bias going on here.

    Your survivorship bias point on rust makes a lot of sense … there’s certainly some pushback against its evangelists and that’s fair (as someone who’s learnt the language a bit). Though I think it’s fair to point out that the success stories, survivorship or not, are still worth noting.

    But it seems we probably come back to whether fundamental tooling should be done in Python or a more performant stack. And I think we just disagree here. I want the tooling to “just work” and work well, and personally I don’t hold nearly as much interest in being able to contribute to it as I do for any other Python project. If that can be done in Python, all the better, but I’m personally not convinced (my experience with conda, while it was a pure Python project, is informative for me here).

    Personally I think Python should have paid more attention to both built-in tooling (again, I think it’s important to point out how much of this is simply Guido’s “I don’t want to do that”, which probably wouldn’t be tolerated these days) and built-in options for more performance (by maybe taking PyPy and JIT-ing more seriously).

    Maybe the GIL-less work and more performant python tricks coming down the line will make your argument more compelling to people like me.

    (Thanks very much for the chat BTW, I personally appreciate your perspective as much as I’m arguing with you)


  • Yep! And that’s likely the lesson to take from it for Python in general: the general utility of a singular foundation that the rest of the ecosystem can be built out from.

    Even that it’s compiled is kinda beside the point. There could have been a single Python tool written in Python and bundled with its own Python runtime. But Guido never wanted to do project and package management, and so it’s been left as the one battery definitely not included.


  • I feel like this is conflating two questions now.

    1. Whether to use a non-Python language where appropriate
    2. Whether to use rust over C, which is already heavily used and fundamental in the ecosystem (I think we can put Cython and Fortran to the side)

    I think these questions are mostly independent.

    If the chief criterion is accessibility to the Python user base, issue 2 isn’t a problem IMO. One could argue, as does @[email protected] in this thread, that in fact rust provides benefits along these lines that C doesn’t. Rust being influenced by Python adds weight to that. Either way though, people like and want to program in rust and have delivered marked successes so far in the Python ecosystem (as eraclito cites). It’s still a new-ish language, but if the core issue is C vs Rust, it’s probably best to address it on those terms.


  • Fair, but at some point the “dream” breaks down. Python itself is written in C, and plenty of packages, some vital, rely on C or Cython (or Fortran), and now more and more on rust. So why not the tooling that’s used all the time, doing some hard work, often in build/testing cycles?

    If Guido had included packaging and project management in the standard library from ages ago, with parts written in C, no one would bat an eyelid over whether users could contribute to that part of the system. Instead, they’d celebrate the “batteries included”, “ease of use” and “zen”-like achievements of the language.
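    To make that concrete (a quick sketch of my own, not anything from the thread): the standard library is already full of C that people happily use every day without ever wondering whether they could patch it:

    ```python
    # A couple of "batteries included" stdlib modules that are C under the hood.
    import sqlite3   # bindings to the C SQLite library
    import zlib      # compression/checksums, implemented in C

    # Nobody calling these worries about which language they'd need to contribute in.
    print(zlib.crc32(b"batteries included"))

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE demo (answer INTEGER)")
    con.execute("INSERT INTO demo VALUES (42)")
    print(con.execute("SELECT answer FROM demo").fetchone())
    con.close()
    ```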

    Somewhere in Simon’s blog post he links to a blog post by Armin on this point, which is that the aim is to “win”, to make a singular tool that is better than all the others and which becomes the standard that everyone uses so that the language can move on from this era of chaos. With that motive, the ability for everyday users to contribute is no longer a priority.


  • Cool to see so many peeps on the Fedi!

    While I haven’t used uv (been kinda out of Python for a while), and I understand the concerns some have, the Python community getting concerned about good package/project management tooling certainly says something about how much senior Python devs have gotten used to their ecosystem. Somewhat ditto for the concern over using a more performant language for fundamental tooling (rather than pursuing the dream of writing everything in Python, which is now surely dead).

    So Simon is probably right in saying (in agreement with others):

    while the risk of corporate capture for a crucial aspect of the Python packaging and onboarding ecosystem is a legitimate concern, the amount of progress that has been made here in a relatively short time combined with the open license and quality of the underlying code keeps me optimistic that uv will be a net positive for Python overall

    Concerns over maintainability, should Astral go down, may best be addressed by learning rust and establishing best practices around writing Python tooling in compiled languages, to ensure future maintainability and composability.





  • Not a stock market person or anything at all … but NVIDIA’s stock has been oscillating since July and has been falling for about 2 weeks (see Yahoo Finance).

    What are the chances that this is the investors getting cold feet about the AI hype? There were open reports from some major banks/investors about a month or so ago raising questions about the business models (right?). I’ve seen a business/analysis report on AI that, despite trying to trumpet it, actually contained data on growing uncertainty about its capability from those actually trying to implement, deploy and use it.

    I’d wager that the situation right now is full of a lot of tension, with plenty of conflicting opinions from different groups of people, almost none of whom actually know much about generative AI/LLMs, and all of whom have different and competing stakes and interests.



  • Yea, this highlights a fundamental tension I think: sometimes, perhaps oftentimes, the point of doing something is the doing itself, not the result.

    Tech is hyper focused on removing the “doing” and reproducing the result. Now that it’s trying to put itself into the “thinking” part of human work, this tension is making itself unavoidable.

    I think we can all take it as a given that we don’t want to hand total control to machines, simply because of accountability issues. Which means we want a human “in the loop” to ensure things stay sensible. But the ability of that human to keep things sensible requires skills, experience and insight. And all of the focus our education system now has on grades and certificates has led us astray into thinking that the practice and experience doesn’t mean that much. In a way the labour market and employers are relevant here in their insistence on experience (to the point of absurdity sometimes).

    Bottom line is that we humans are doing machines, and we learn through practice and experience, in ways I suspect are much closer to building intuitions. Being stuck on a problem, being confused and getting things wrong are all part of this experience. Making it easier to get the right answer is not making education better. LLMs likely have no good role to play in education, and I wouldn’t be surprised if banning them outright, in what may become a harshly fought battle, isn’t too far away.

    All that being said, I also think LLMs raise questions about what it is we’re doing with our education and tests and whether the simple response to their existence is to conclude that anything an LLM can easily do well isn’t worth assessing. Of course, as I’ve said above, that’s likely manifestly rubbish … building up an intelligent and capable human likely requires getting them to do things an LLM could easily do. But the question still stands I think about whether we need to also find a way to focus more on the less mechanical parts of human intelligence and education.