• 0 Posts
  • 722 Comments
Joined 1 year ago
Cake day: June 10th, 2023


  • The thing is that there is no middle ground: the terminal you described would either have the problems of a GUI (limited interactivity) or the problems of a CLI (being unintuitive).

    If you add a button to do something, you've removed the option to do different things; if you ask for text, you've lost the intuitiveness of a button. If you present fewer options you might not cover all cases; if you present more, it's a regular terminal.

    It seems that the real issue is that you need a quick way to redo commands you've run before, and a good way to discover what options you have. I have two CLI solutions for you.

    1. If you press Ctrl+R and start typing, the shell searches your history backwards and shows the most recent command that matches; press Enter to run it, or Ctrl+R again to cycle through older matches.

    2. If you install zsh and grml-zsh-config and switch to zsh, you get a very powerful completion system that lets you press Tab to auto-complete most parameters of common commands.
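    Conceptually, the history search in (1) boils down to "scan the history from newest to oldest and return the first entry containing what you typed". A toy sketch (illustrative only; real shells do this incrementally as you type):

    ```python
    def reverse_search(history, fragment):
        """Return the most recent command containing `fragment`, or None.

        Mimics the core idea of Ctrl+R reverse history search: walk the
        history from newest to oldest and stop at the first match.
        """
        for command in reversed(history):
            if fragment in command:
                return command
        return None

    history = [
        "ls -la",
        "docker compose up -d",
        "git status",
        "docker ps",
    ]

    print(reverse_search(history, "docker"))  # prints "docker ps" (the newest match)
    ```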





  • Some answers address some of your questions, I would like to address something else you said:

    For programs they will always end up on / and I cant install them on another partition (dont know why)

    Programs can certainly be installed wherever you want them to be, but package managers have their own folder structure to follow. Installing a program is not a magical thing; it just means having the binary somewhere on your computer, and that binary having access to the libraries it needs. If you downloaded a binary, it is TECHNICALLY installed on your computer.

    The other important part is being able to run that program. For this there is an environment variable called PATH, which tells Linux where to look for binaries and in what order. So if your PATH is /home/sarmale/.bin:/media/NTFS/binaries:/bin:/usr/bin and you try to run, for example, firefox, it will try /home/sarmale/.bin/firefox; if that doesn't exist, it moves on to the next directory, and the next, until it finds the binary or runs out of options.
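    That lookup order is simple enough to sketch. A minimal, illustrative version of what the shell does with PATH (the real shell also handles hashing, builtins, etc.):

    ```python
    import os

    def resolve(command, path_var):
        """Return the first executable match for `command` along `path_var`.

        Directories are tried left to right, exactly like the shell's
        PATH search; returns None if nothing matches ("command not found").
        """
        for directory in path_var.split(":"):
            candidate = os.path.join(directory, command)
            if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
                return candidate
        return None

    # Hypothetical lookup mirroring the PATH example above; the result
    # depends on which of these directories actually contain a "firefox".
    resolve("firefox", "/home/sarmale/.bin:/media/NTFS/binaries:/bin:/usr/bin")
    ```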

    Also, in Linux you can mount partitions anywhere, and you can create links from one place to another. So nothing prevents you from mounting your NTFS drive at /NTFS or even /home/sarmale/Shared.


  • You are correct, but you missed one important point, or rather made one important wrong assumption: you don't simulate a 1:1 version of your universe.

    It's impossible to simulate a universe the size of your own, but you can simulate smaller universes, or, more accurately, simpler ones. Think of videogames: you don't need to simulate everything, you just simulate some things, while the rest is a static image until you get close. The cool thing about this hypothetical scenario is that you can think about how a simulated universe might differ from a real one, i.e. what shortcuts we could take to make our computers able to simulate a complex universe (even if smaller than ours).

    For starters, you don't simulate everything. Instead of every particle being a particle, which would be prohibitively expensive, particles smaller than a certain size don't really exist; instead you have a function that tells you where they are when you need them. Simulating every electron would be a lot of work, but if you can run a function that tells you where each one is at a given frame of the simulation, you can act accordingly without actually simulating them. This would cause weird behavior inside the simulation, such as electrons popping in and out of existence and teleporting across gaps smaller than the radius of your spawn_electron function, which in turn would impose a limit on the size of transistors inside that universe. It would also mean that when you fire electrons through a double slit they interact with one another, because they're just a function until they hit something; but if you try to measure which slit they go through, they're forced to collapse before that point, and so they don't interact. And that's all okay, because you care about macro stuff (otherwise you wouldn't be simulating an entire universe).
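    The shortcut being described is essentially lazy evaluation: derive a particle's position on demand from a pure function of (id, frame) instead of storing and updating per-particle state. A toy sketch, with entirely made-up constants (the name spawn_electron is taken from the hypothetical above):

    ```python
    import math

    def spawn_electron(electron_id, frame):
        """Return a pseudo-position for an electron at a given frame.

        No state is stored anywhere: the same (id, frame) always yields
        the same position, so nothing needs to be simulated in between
        the frames where an electron is actually observed.
        """
        angle = electron_id * 2.399963 + frame * 0.01  # arbitrary constants
        radius = 1.0 + (electron_id % 7)
        return (radius * math.cos(angle), radius * math.sin(angle))

    # Querying frame 100 directly, without simulating frames 0..99:
    spawn_electron(3, 100)
    ```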

    Another interesting thing is that you probably have several computers working on it, and you don't really want loading screens or anything like that, so instead you impose a maximum speed inside the simulation. That way, whenever something moves from one area of the simulation to the next, it takes enough time for everything to be "ready". It helps if you simulate a universe where gravity is not strong enough to cause a crunch (or your computers will all freeze trying to process it). So your simulated universe might have large empty spaces that don't need much computational power, and because traveling through them takes long enough, it's easy to sync the handoff from one server to the next. If, on the other hand, the maximum speed were infinite, objects could teleport from one server to the next, causing a freeze on those two and leaving them out of sync with the rest.

    And that's the cool thing about thinking about how a simulated universe would work: our universe is weird as fuck, and a lot of that weirdness looks like the kind of weirdness that would be introduced by someone trying to run their simulation cheaper.




  • Because my experience is always the exact opposite of yours. Windows has never been convenient for me; it always does random shit, and stuff just suddenly stops working because fuck you, that's why. For example, I have a Windows computer at work to build and test the games I work on. This week it decided it won't use more than 20% of the CPU when building the latest game. There's no other bottleneck: temperature is stable at 60°C, the disks have space, and, most importantly, other games compile just fine; it's only the one I'm currently working on that doesn't. It's not an issue in the code either, since I'm the only person in the company experiencing this. And, this is the important part, I can't do anything about it, because no one knows why Windows decided to do that, so there's nothing anyone can do. On Linux, when you have an issue, there's an explanation, and someone with enough experience will find it quickly; on Windows you can be the world's expert and the OS will still just decide to nope the fuck out.



  • Overcooked 1 is a Unity game released in 2016. Unity only started offering Linux build support as an experimental feature at the end of 2015, so it's very likely that the version of Unity they used to make Overcooked didn't have Linux support.

    The real question is why Overcooked! All You Can Eat doesn't have Linux support.

    Edit: I forgot to say, I don't think it's weird that Paradox supports Linux; they made their engine Linux-compatible a while back, so offering support now is trivial. And I always remember the Reddit post in which a dev explained that Linux users are like a dedicated QA team hahaha



  • First of all, this discussion doesn't matter: Bitcoin will never regain the popularity it could have had if it had increased the block size back in 2017, when it became unusable. I remember buying stuff with Bitcoin; I remember mining and using that money to buy games on Steam, pay my electricity bill, etc. Those days are gone and won't come back. After all the backlash Valve got from the unusability of Bitcoin, they'll never re-enable it. Bitcoin fucked all of crypto adoption back then, and I don't think the market will ever forget (if it would, it would have migrated to another coin already).

    The $1 you say you pay is too much. Imagine if every time you used your card an extra $1 was charged; you would never use that as a currency. The problem is that you're not thinking of Bitcoin as a currency, and I don't blame you, since that hasn't been the vision for Bitcoin in a long while, but originally that was the goal.

    You keep saying that BCH is an attempt at centralization; would you care to explain how that would work? Imagine if, back when the BCH hard fork happened, all of the miners had sided with it (they had the option and chose not to); how would that be more centralized? You do know you can mine BCH with any miner, right? Just like with BTC, you don't need to use Bitcoin-core; the protocol is open source and there are plenty of implementations.

    As for the botting attacks: I was there when they supposedly happened, and I was banned from /r/Bitcoin for saying the same thing I'm telling you now. We weren't bots; that was the excuse the mods there used to ban everyone who disagreed with keeping blocks small. I'm not going to put my hand in the fire for him, I don't know the guy and for all I care he can go fuck himself, but not everyone who was accused of being a bot was one. Lots of us were just people saying "don't you see how not increasing the block is harming Bitcoin? Don't you see hardware prices have decreased dramatically since the 1MB block size was introduced? Don't you see companies are jumping ship and we're losing all of the momentum?" It sounds like bots because lots of people were pointing out the same stuff, but the thing is that we were pointing out things as we were seeing them in the real world.

    That's when /r/BTC started gaining popularity, because being banned from /r/Bitcoin was almost a rite of passage, and curiously, on /r/BTC no one was banned for voicing either side of the argument. People might get downvoted, but that's just the system working; banning people is censorship, and it's a clear-cut way of showing you've run out of arguments and are just looking to create a bubble where everyone agrees with you.

    Nope, that time is gone. Hard forks are not risky at all; Bitcoin has had several in the past, and no one made a fuss about them back then. Soft forks are just as dangerous, and BTC has had several of those too. The difference is that a soft fork is backwards compatible (old nodes still accept the new blocks) whereas a hard fork is not; in both cases miners are forced to migrate or they'll be throwing away money.

    You are correct, there is no demand or usage for either of the Bitcoins anymore; there was back then, there isn't now. BCH scales much, much higher than BTC, so if adoption had migrated, it would have been prepared. And the important thing is that no one was saying we don't need side chains; it was more of "we need the main chain to be usable NOW, we can scale indefinitely later, but we need to fix this NOW or Bitcoin adoption will be irreparably gone". Unfortunately it was too little too late, and the companies that had already jumped ship weren't going to risk coming back.


  • Yes and no. Historically Bitcoin had no limit on transactions, but someone realized a while back that, because transactions were really cheap, rich people could flood the network with transactions and prevent legitimate people from using it. Therefore a limit on the size of a block was imposed, which in turn meant that Bitcoin could only process about 7 transactions per second on average.
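    For reference, the ~7 tx/s figure falls out of some back-of-the-envelope arithmetic. The numbers below are the commonly cited ballpark values (1 MB block cap, ~250-byte average transaction, ~10-minute block interval), not exact protocol data:

    ```python
    # Rough math behind the "~7 transactions per second" figure.
    block_size_bytes = 1_000_000   # 1 MB block size limit
    avg_tx_bytes = 250             # ballpark average transaction size
    block_interval_s = 600         # one block every ~10 minutes

    tx_per_block = block_size_bytes / avg_tx_bytes    # ≈ 4000 tx per block
    tx_per_second = tx_per_block / block_interval_s   # ≈ 6.7 tx/s

    print(f"{tx_per_second:.1f} tx/s")  # prints "6.7 tx/s", usually rounded up to 7
    ```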

    At the time that was great: if someone tried to attack the network they would quickly run out of space in the block, and people wanting to make transactions only had to pay a little bit more to ensure theirs got approved, which in turn made the attack more expensive and eventually useless.

    However, in 2017 Bitcoin gained popularity, and this caused an issue: actual users were hitting that 7-transactions-per-second barrier among themselves, and so users began competing with each other for space on the chain. And when that happens, your coffee or VPS payment is going to carry a much lower fee than what someone trading hundreds or thousands of dollars can afford. So essentially Bitcoin became usable only for large trades (at the time you needed to pay $50 for a transaction).

    Since that became prohibitive for most users, people stopped using it, and because of the long confirmation times faced by those who still tried, stores stopped accepting it (some people don't know this, but Steam and Microsoft used to accept Bitcoin payments).

    You might think the solution is simple: just increase the limit, like the guy who originally imposed it said should be done. But that created a shitstorm of people claiming it would bring all hell on earth, and that the proper solution was to build a secondary chain on top of Bitcoin so you only needed to pay the high fees once (this is what the other reply to my comment is mentioning). The problem is that for miners, small blocks with higher transaction fees are waaaay more profitable, so they sided with the fuck-the-user mentality (and some even say they paid people to promote that solution).

    So, long story short, Bitcoin is not currently usable for day-to-day purchases. Transaction prices fluctuate a lot: currently it's "cheap" at $1.5 per transaction, but it regularly goes to $10 or $20 and even went over $100 this year. So it's no longer a currency, since no one would pay those prices to use it as one. Bitcoin is now more of an investment than cash; it's closer to gold in that you could use it as cash, but it's inconvenient and you'll end up losing money.


  • The Lightning Network is bound to fail because it suffers from the very problem Bitcoin set out to solve, i.e. it only works through centralized hubs. Think about how a new user would come to Bitcoin if the LN were in full effect. They would need to spend hundreds to open a channel to a centralized hub (the more centralized the better, so it connects to more places) to ensure they have the funds and the connectivity to use the network. Because channels can't be increased, if they ever need more funds than the channel holds, they'll need to close it and open a new one. And because channels might be used by third parties when routing through the network, if they spend double that amount to create two channels to two centralized hubs, they risk having their funds drift from one channel to the other and having to take the long route for their own transactions.

    Also, the idea that running a node is more expensive because of larger blocks is mostly nonsense:

    1. The only nodes that matter are mining nodes; even if your validating node finds an issue, it has no power to do anything about it.

    2. There is such a thing as pruning old blocks to reclaim space.

    3. The whole point of Bitcoin is that you don't need to validate transactions, because mining nodes have an incentive to stay honest, and the regulators are other mining nodes who stand to gain from others' dishonesty by mining the same block they did. And again, mining nodes require lots of hardware; some extra HDD space for old data is cheap in comparison.

    4. Why don't you feel the need to validate the LN? Why validate only the on-chain stuff but trust the LN? Surely you'd want your validator node to process all of the LN transactions to ensure they're valid, no? Or do you trust that the LN works? And if you do, why not trust that the technology the LN is built on top of also works?

    Not to mention that even in poor countries the cost of a hard drive that can hold years of data is lower than a couple of transactions on the main Bitcoin network during peak times, and a lot cheaper than opening an LN channel that's worth using. For example, earlier this year the average transaction fee peaked at $123 (https://ycharts.com/indicators/bitcoin_average_transaction_fee), so if someone wanted to make a transaction that day on the Bitcoin network, they would need to spend $123 extra. That's enough to buy a used 8TB HDD, which should hold the entire BCH blockchain, for the cost of a single transaction.

    Long story short:

    • LN has design flaws and doesn't work
    • Even if the LN worked, on-chain transactions are better: less limited and more secure
    • Even if those design flaws could be addressed, the main chain will still become unusable due to the high cost of entry unless blocks get increased
    • If blocks never get increased, popularity will be the death of Bitcoin like it was in 2017; if someone needs to throw away hundreds of dollars just to get into the network, they'll never do it. And because of RBF, people who don't pay whatever the current tx fee is can get scammed.
    • Validator nodes serve no purpose
    • Even if they did, running a validator node on BCH is cheaper than using BTC
    • Even if validator nodes made some difference, the LN would also need one, and the volume of transactions there would push the requirements of a node higher than what BCH would need (since besides regular transactions you would also need to validate on- and off-channel transactions).



  • But also, as a general rule, places don't let you spend over 30% of your income on rent. The £60k you mentioned for a couple works out to £4,186 per month after tax, so the maximum rent for that hypothetical couple is £1,256, which wouldn't let you rent a house like the one you showed anyway, so the comparison is pointless. The house where a couple earning £60k lives can absolutely be bought for close to £1 million (if not less). Whoever is living in a 4-bedroom house like the one you pointed out earns a lot more than £60k, and so they can finance the rest.
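    The arithmetic behind that rent cap, assuming (as the figures imply) that £4,186 is the couple's combined monthly take-home on £60k gross and that the common "30% of income" affordability rule applies:

    ```python
    # Rent-cap arithmetic from the comment above; the take-home figure
    # is the one quoted there, not an exact tax calculation.
    monthly_income = 4186      # £/month, combined take-home for the couple
    rent_cap_ratio = 0.30      # the common "30% of income" rule

    max_rent = monthly_income * rent_cap_ratio
    print(f"£{max_rent:.0f}")  # prints "£1256"
    ```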