How to convert Hugging Face Transformers weights to an Ollama model
The end result: an LLM from the Hugging Face Hub running locally with the Ollama backend.

Prerequisites: an Ubuntu/Debian VM with an NVIDIA GPU, conda, git, etc. Not sure if a GPU is needed for this pa...
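The overall flow can be sketched as follows, assuming llama.cpp's conversion script is used to produce a GGUF file that Ollama can load. The model name, directory paths, and tag are illustrative placeholders, not specifics from this guide:

```shell
# Clone llama.cpp, which ships the HF-to-GGUF conversion script
git clone https://github.com/ggerganov/llama.cpp
pip install -r llama.cpp/requirements.txt

# Download the weights from the Hugging Face Hub (model name is illustrative)
huggingface-cli download TinyLlama/TinyLlama-1.1B-Chat-v1.0 --local-dir ./hf-model

# Convert the Transformers checkpoint to a single GGUF file
python llama.cpp/convert_hf_to_gguf.py ./hf-model --outfile ./model.gguf

# Point an Ollama Modelfile at the GGUF and register it as a local model
echo 'FROM ./model.gguf' > Modelfile
ollama create my-model -f Modelfile
ollama run my-model
```

The conversion itself runs on the CPU; a GPU only matters later, when Ollama serves the model.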