Lemmy account of natanox@chaos.social

  • 0 Posts
  • 9 Comments
Joined 6 months ago
Cake day: October 7th, 2024


  • I’m currently looking into this as well. As far as my research goes right now, I’ll probably go for 2x AMD Instinct MI50. Each of them offers equivalent to slightly higher performance than a P40, but usually only 16 GB of VRAM (if you’re super lucky you might get one with 32 GB; those are usually not labeled as such, though, and are probably binned MI60s). With two of them you get 32 GB of VRAM and quite a lot of performance for, right now, 200 € per card. Alternatively, you should be able to run quantized models on a single card as well.

    If you don’t mind running ROCm instead of CUDA, this seems like good bang for the buck. Alternatively, you might look into AMD’s new line of “AI” SoCs (for example Framework’s Desktop computer). They seem to be really good as well, and depending on your use case might be more useful than an equally priced 4090.
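    To get a feel for whether a quantized model fits on one 16 GB MI50 or needs both cards, a rough back-of-the-envelope estimate helps. The sketch below is a minimal, hypothetical calculation (the function name and the 10% overhead fudge factor for KV cache and activations are my assumptions, not from any tool):

    ```python
    def quantized_model_gib(params_billion, bits_per_weight, overhead_frac=0.1):
        """Rough VRAM estimate in GiB for a quantized model's weights,
        plus an assumed fudge factor for KV cache and activations."""
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * (1 + overhead_frac) / 2**30

    # A 13B model at 4-bit: ~6.7 GiB -> comfortable on a single 16 GB card
    print(round(quantized_model_gib(13, 4), 1))
    # A 70B model at 4-bit: ~35.9 GiB -> too big even for 2x 16 GB
    print(round(quantized_model_gib(70, 4), 1))
    ```

    Real memory use varies with context length and runtime, so treat this as an order-of-magnitude check, not a guarantee.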

  • This. I wholeheartedly concur that corporate “social” media is fucking shit up like hell. However, as long as people are socially stuck there, they will stick around, and as long as that is the case, people who need to make a living will have to roll out stuff and communicate with people there.

    In fact, this is exactly the kind of tool a friend and I looked for but couldn’t find as FOSS until now, although we’d definitely need support for the Mastodon API / Lemmy.

    One may argue about which platforms to choose (depending on what you want to show people or whom you want to reach). But completely abandoning garbage platforms ain’t possible right now, especially for small businesses that need to be seen somewhere.