It was already discussed and bashed here: https://lemmy.world/post/23704373
The user in the screenshot is not the OP.
After trying a couple of keyboards (mostly AnySoftKeyboard and TypeWise [proprietary]) I have settled on Unexpected Keyboard, because it makes it easy to switch layouts to get to the Japanese keyboard and the TTS button.
Depends on the threat model.
NOAA and others get underfunded or see a change of management and have to shut down open access to their data.
or
Data becomes illegal to possess and the feds start knocking on the Internet Archive's doors.
or
The Internet Archive does something stupid and gets sued/DDoSed.
Only in one very unlikely scenario would it become unavailable because of recent events. Still, redundancy would be good regardless of recent stuff.
They have an automated VM that downloads stuff in a distributed manner and uploads it to archive.org.
a) On mobile it doesn't work nearly as well as on desktop.
b) It crashed my Firefox on desktop.
10/10 will read it again
Edit:
This is an ad that you 100% missed
I think compute per watt and idle power consumption matter more than raw maximum compute power.
Anki for vocab
Textbook for grammar
Immersion for everything else
Also input and output are two different skills.
I live in Poland, and in big cities that's not an issue, but in cities other than Warsaw and Kraków you still need to check when the bus will depart. In rural areas reliability is an issue: if for some reason the bus doesn't come, the next one is in two hours.
Trains are good enough but often delayed, mostly by 5 to 15 minutes, but there is always the odd 24-hour one that gets talked about at parties.
Trams, trolleybuses, metro and cycling paths are good enough. There are bicycle rental networks that work in a pinch.
I mostly use them to unwind emotionally. Sometimes I will write down what is happening in my life, in case I want to reconstruct the timeline of events from a past season.
I also put in daily tasks that I want to do, plus a field for something I am proud of doing today.
Also, I am not diligent enough to do it daily. I treat it more like a “Today I …” than some chore to tick a checkbox.
Found the spreadsheet https://goo.gl/z8nt3A
And the source: https://www.hardwareluxx.de/community/threads/die-sparsamsten-systeme-30w-idle.1007101/
Still, you can calculate how much you would save from a 2 W power reduction by selling this one and buying a different NAS.
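As a rough illustration (the electricity price below is just an assumption, plug in your local rate):

```python
# Rough yearly cost of a constant 2 W difference in idle draw.
power_diff_w = 2              # watts saved
hours_per_year = 24 * 365     # always-on NAS
price_per_kwh = 0.25          # assumed EUR/kWh, replace with your rate

kwh_per_year = power_diff_w * hours_per_year / 1000
print(f"{kwh_per_year:.1f} kWh/year "
      f"-> {kwh_per_year * price_per_kwh:.2f} EUR/year")
```

That comes out to roughly 17.5 kWh, a few euros per year, so swapping hardware for 2 W alone would take a long time to pay for itself.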
You can reduce the time the disks keep spinning after the last access to 5-15 minutes for better power saving.
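On a Linux NAS this can be done with hdparm's spin-down timeout, for example. A minimal sketch, assuming the data disk is /dev/sda (a placeholder, adjust to your setup) and the script runs as root:

```python
import subprocess

# hdparm -S takes the timeout in units of 5 seconds (for values 1-240),
# so 120 means 10 minutes of inactivity before the disk spins down.
subprocess.run(["hdparm", "-S", "120", "/dev/sda"], check=True)
```

Whether the drives actually spin down also depends on nothing else touching them (monitoring, indexing jobs, etc.).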
Maybe you are looking at the wrong thing. The idle state of the CPU and motherboard controllers matters more than spun-down HDDs.
I saw a spreadsheet somewhere with a lot of CPU + motherboard combinations and their idle power consumption, for ultra-low-energy NAS optimisation.
Still, AI misalignment is a real issue. I just don't remember which model was studied and found to be misaligned.
I read some headline saying that some of these models just pick up on the age of the patient and the quality of the machine taking the images.
A modem translates the fiber / DSL signal onto twisted-pair (Ethernet) cable.
An access point translates twisted pair into Wi-Fi.
I think you are looking for an all-in-one router.
For AI/ML workloads, VRAM is king.
As you are starting out, something older with lots of VRAM would be better than something faster with less VRAM at the same price.
The 4060 Ti is a good baseline to compare against, as it has a 16 GB variant.
The “minimum” VRAM for ML is around 10 GB, and the more the better; less VRAM can be usable, but with sacrifices in speed and quality.
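As a rough back-of-the-envelope illustration of why (the model sizes and precisions below are just examples), the weights alone need roughly parameter count times bytes per parameter:

```python
# Very rough VRAM estimate for just the model weights; activations,
# KV cache and framework overhead add more on top.
def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

for params in (7, 13):                                # illustrative sizes
    for label, bytes_pp in (("fp16", 2), ("4-bit", 0.5)):
        print(f"{params}B @ {label}: ~{weight_vram_gb(params, bytes_pp):.1f} GB")
```

A 7B model in fp16 already needs around 13 GB just for its weights, which is why 10-16 GB is a practical floor and quantisation helps so much.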
If you still like this stuff after a couple of months, you could sell the GPU you bought and swap it for a 4090 super.
For AMD, support is confusing: there is no official ROCm support for mid-range GPUs on Linux, but someone said it works anyway.
There is the new ZLUDA, which enables running CUDA workloads on ROCm:
https://www.xda-developers.com/nvidia-cuda-amd-zluda/
I don't have enough info to recommend AMD cards.
https://rocm.docs.amd.com/projects/radeon/en/latest/docs/compatibility.html
https://rocblas.readthedocs.io/en/rocm-6.0.0/about/compatibility/linux-support.html
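If you do try an AMD card, one helpful detail is that the ROCm build of PyTorch exposes the HIP backend through the regular torch.cuda API, so a quick sanity check looks the same as on Nvidia. A minimal sketch, assuming a ROCm (or CUDA) build of PyTorch is installed:

```python
import torch

# On ROCm builds of PyTorch the HIP backend is reported through torch.cuda,
# so the same check works for both AMD and Nvidia cards.
if torch.cuda.is_available():
    print("GPU compute available:", torch.cuda.get_device_name(0))
    print("Built against HIP/ROCm:", torch.version.hip is not None)
else:
    print("No usable GPU found; check the driver / ROCm / CUDA installation.")
```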
Yes, on four consumer-grade cards.
If I want a mid-range GPU with compute support on Linux, my only option is Nvidia.
Let's go back to handing in resumes in person.