Domino on Linux/Unix, Troubleshooting, Best Practices, Tips and more ...


Black Friday - Getting some new AI ready hardware

Daniel Nashed – 30 November 2024 00:24:19

AI functionality requires specific hardware resources. It's pretty clear that you can't avoid looking into NVIDIA GPUs.
But there is also new Intel hardware that helps with AI workloads. My tests on my new 12th Gen Intel(R) Core(TM) i9-12900HK based machine show good performance with its 14 cores and 20 threads.
It's always a combination of GPU and CPU, so the new test notebook will be interesting to test.

My ThinkPad has an NVIDIA T1000 card, which is already quite OK for local tests.

The smaller Hetzner GPU server has an NVIDIA RTX™ 4000 SFF Ada Generation with 20 GB of VRAM.


But I need some local test hardware to compile and run AI projects on Windows and Linux.
The main project that many other projects, including Ollama, are built on is llama.cpp:
https://github.com/ggerganov/llama.cpp
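
For a quick local test, a minimal sketch using the llama-cpp-python bindings could look like this, assuming the package was built with CUDA support and a GGUF model file is available (the model path is only a placeholder):

```python
# Minimal sketch using the llama-cpp-python bindings (pip install llama-cpp-python).
# Assumes a CUDA-enabled build and a locally downloaded GGUF model file.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-model.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,   # offload all layers to the GPU (0 = CPU only)
    n_ctx=4096,        # context window size
)

output = llm("Explain in one sentence what llama.cpp is.", max_tokens=64)
print(output["choices"][0]["text"])
```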

The run-time aspect is also interesting. On Docker and Kubernetes you need the right drivers and runtime support to make GPUs available inside containers.
Hypervisors like Proxmox also support mapping GPUs into VMs.
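
A quick way to check whether a container or VM actually sees the GPU is to query the NVIDIA driver via NVML. The sketch below assumes the nvidia-ml-py (pynvml) package and a working driver setup inside the container or VM:

```python
# GPU visibility check via NVML (pip install nvidia-ml-py).
# Useful inside a container or VM to confirm driver and device mapping work.
import pynvml

def as_str(value):
    # Older pynvml versions return bytes, newer ones return str.
    return value.decode() if isinstance(value, bytes) else value

pynvml.nvmlInit()
print("Driver version:", as_str(pynvml.nvmlSystemGetDriverVersion()))

for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = as_str(pynvml.nvmlDeviceGetName(handle))
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"GPU {i}: {name}, {mem.total // (1024**2)} MB total memory")

pynvml.nvmlShutdown()
```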

A gaming notebook looks like the most reasonable hardware for a local AI lab environment.
Black Friday is a great opportunity to get some cool new hardware.

The NVIDIA RTX™ 4060 comes with 8 GB of VRAM and decent performance.
The latest Intel CPUs support modern instruction sets like AVX2, which llama.cpp takes advantage of (see the sketch below).
Ordered and looking forward to getting my hands on it ..
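
To see which SIMD extensions a CPU reports (llama.cpp builds optimized kernels for AVX2 and friends when they are available), a small Linux-only sketch reading /proc/cpuinfo could look like this:

```python
# Check which SIMD extensions the CPU reports (Linux only, reads /proc/cpuinfo).
def cpu_flags() -> set[str]:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for ext in ("avx", "avx2", "avx512f", "f16c", "fma"):
    print(f"{ext:8s} {'yes' if ext in flags else 'no'}")
```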


https://www.asus.com/laptops/for-gaming/tuf-gaming/asus-tuf-gaming-f17-2023/

Image:Black Friday - Getting some new AI ready hardware


https://github.com/ollama/ollama/blob/main/docs/gpu.md

Image:Black Friday - Getting some new AI ready hardware
