12 points by Igor_Wiwi 2 days ago | 2 comments
I built localLLLM: a small community project for running local models.

Live: https://locallllm.fly.dev

The goal is simple: if someone has a model + OS + GPU + RAM combination, they should get steps that actually work (ideally a one-liner).

I need help populating and validating guides.

If you run local models, please submit one working recipe (or report what failed). Would love to hear general feedback as well!

modinfo 2 days ago
This is a nice idea! But open-source it, and instead of saving data to a db, just save it as markdown on GitHub; then everyone can send PRs to edit/add instructions. More freedom and simpler.
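A guide stored that way might look something like this (a hypothetical layout with YAML front matter, not an actual localLLLM guide):

```markdown
---
model: llama3:8b
os: linux
gpu: nvidia
min_ram_gb: 8
---

## One-liner

    ollama run llama3:8b

## Notes

Fails below 8 GB RAM; try a smaller quantization instead.
```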
doc_ick 2 days ago
Agreed; otherwise it sounds like op is crowdsourcing just to avoid doing the manual effort themselves.

*Edit: op has been vibe coding all over; couldn't they just vibe-code the guides themselves without human input?