What hardware is recommended for adequate performance and low power consumption? I’ve seen the Mac mini recommended a few times. Would the 16 GB model be enough, or do I need the 24 GB model? What about the Framework Desktop? It should be very good as well, but what will it consume at idle?

Hardware decisions are a complete nightmare, because the best models will surely change the day after you decide. For coding/text, bigger, better models matter more than fast, dumber models, IMO.
16 GB: RTX 5060 Ti
20 GB: RX 7900 XT
24 GB: RX 7900 XTX
For GPUs, I’d decide based on $/GB. They will all be fast enough. Nvidia is generally faster, so it’s my pick at 16 GB if there are a lot of options within $100 of each other.
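The $/GB comparison is trivial to script. A minimal sketch, with placeholder prices (check current street prices, these are just for illustration):

```python
# Rank GPUs by dollars per GB of VRAM.
# Prices here are illustrative assumptions, not current quotes.
gpus = {
    "RTX 5060 Ti 16GB": (480, 16),   # (price USD, VRAM GB)
    "RX 7900 XT 20GB":  (650, 20),
    "RX 7900 XTX 24GB": (850, 24),
}

# Sort cheapest-per-GB first and print the ratio for each card.
for name, (price, gb) in sorted(gpus.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: ${price / gb:.1f}/GB")
```

With these made-up numbers the 5060 Ti wins at $30/GB, but the ranking flips easily with sale prices, which is the point of checking the ratio rather than the sticker.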
64–128 GB: AMD iGPU or ARM Mac (the latter has much better performance). LPDDR speeds are increasing next year, and maybe LPDDR GPUs are coming next year too. Nvidia Spark and Thor are coming soon; wait for benchmarks on both.
256 GB: M4 Max. The 4-DIMM-slot 8700G is the only PC desktop option with an iGPU. An older Threadripper/Xeon running CPU-only could also work.
512 GB: M3 Ultra
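To map those memory tiers to model sizes, the usual back-of-envelope math is weights (parameters × bits ÷ 8) plus some headroom for KV cache and runtime. A rough sketch, where the 4-bit quant and ~20% overhead are ballpark assumptions:

```python
def model_gb(params_b: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Approximate RAM/VRAM footprint of a quantized model in GB:
    params (billions) * bits/8 bytes per weight, plus ~20% for
    KV cache and runtime overhead. Ballpark only."""
    return params_b * bits / 8 * overhead

for p in (8, 32, 70, 123, 405):
    print(f"{p}B @ 4-bit: ~{model_gb(p):.0f} GB")
```

So at 4-bit, a 16–24 GB GPU covers up to roughly 32B, 128 GB covers the 70–123B class, and the 256–512 GB tiers are for the 400B+ monsters. Longer contexts blow past the 20% overhead, so treat it as a floor.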
Even if you need high memory, software design may evolve to support dual cooperating models, or you may end up wanting a GPU for retraining/fine-tuning.
You can also have multiple computers with shared mouse/keyboard software (switching based on which monitor the mouse is on) and cascading network connections, but giant models are pretty slow when scaled this way. Other routes: GPUs in older hardware, or a minimal 128 GB Linux box dedicated to LLMs only (with an exclusive gaming mode as its other role), as opposed to your 300-browser-tab main computer.
For me, I could expand a recent 7835HS with 128 GB DDR5, and/or add an OCuLink GPU, and/or get a Mac/Strix Halo or another fairly expensive computer as a dedicated “HPC/AI expansion platform”. Or wait for better options.