AI starter pack — useful links for developing your own GPT models
Jul 2, 2023
In this post I have collected useful resources that are a good starting point for your own explorations of AI. I will update this collection as I find new interesting resources.
READY-TO-USE MODELS AND FRAMEWORKS
- GPT4All — an open-source ecosystem designed to train and deploy powerful, customized large language models that run locally on consumer-grade CPUs. A number of models are available on the site, along with a comparison of their performance results.
- nomic-ai/gpt4all repository on GitHub
- Hugging Face Hub — a platform for sharing pre-trained AI models and collaborating on resources related to AI and natural language processing (NLP)
- llama.cpp — a C/C++ port of LLaMA inference. It makes it possible to run LLaMA on a Raspberry Pi or even a smartphone.
- alpaca.cpp — a C/C++ port of Alpaca (forked from llama.cpp)
RUNNING MODELS ON RASPBERRY PI
- LLaMA on Raspberry Pi (based on llama.cpp)
- AlpacaPi
TRAINING YOUR OWN MODELS
GitHub projects
- tloen/alpaca-lora — Instruct-tune LLaMA on consumer hardware
- lxe/simple-llm-finetuner — Simple UI for LLM Model Finetuning
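To give a feel for the data format these instruct-tuning projects work with, here is a minimal sketch of the Alpaca-style prompt template. The exact wording follows the Stanford Alpaca project as I understand it; individual forks (e.g. alpaca-lora) may phrase it slightly differently, so check the repository you actually train with:

```python
# Minimal sketch of the Alpaca-style prompt template used when preparing
# instruct-tuning examples. Wording is an assumption based on the Stanford
# Alpaca repo; verify against the fork you are training with.

def build_prompt(instruction: str, input_text: str = "") -> str:
    """Format one training example as an Alpaca-style prompt."""
    if input_text:
        # Variant for examples that include additional context ("input").
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Variant for instruction-only examples.
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("Summarize the following text.", "LLaMA is a family of LLMs."))
```

During training, the model's expected answer is appended after the `### Response:` marker; at inference time, generation is cut off at the same marker.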
GUI FOR LOCAL AI
- GPT4All — desktop client for GPT4All
GitHub projects
- ido-pluto/catai — GUI for the LLaMA model
- cocktailpeanut/dalai — GUI for the LLaMA model
- Atome-FE/llama-node — GUI for the LLaMA model; supports llama/alpaca/gpt4all/vicuna/rwkv
- text-generation-webui — gradio web UI for running LLaMA, llama.cpp, GPT-J, Pythia, OPT, and GALACTICA
- nat/openplayground — playground UI, including history, parameter tuning, keyboard shortcuts, and logprobs
ARTICLES AND REPORTS
- ANAND, Yuvanesh, et al. GPT4All: Training an assistant-style chatbot with large-scale data distillation from GPT-3.5-Turbo. GitHub, 2023.
- CHIANG, Wei-Lin, et al. [Vicuna: An open-source chatbot impressing GPT-4 with 90%* ChatGPT quality](https://lmsys.org/blog/2023-03-30-vicuna/). vicuna.lmsys.org, 2023.
- DING, Ning, et al. Enhancing Chat Language Models by Scaling High-quality Instructional Conversations. arXiv preprint arXiv:2305.14233, 2023.
- HAO, Karen. Training a single AI model can emit as much carbon as five cars in their lifetimes. MIT Technology Review, 2019.
- LI, Yunxiang, et al. ChatDoctor: A Medical Chat Model Fine-Tuned on a Large Language Model Meta-AI (LLaMA) Using Medical Domain Knowledge. Cureus, 2023, 15.6.
- STRUBELL, Emma; GANESH, Ananya; MCCALLUM, Andrew. Energy and policy considerations for deep learning in NLP. arXiv preprint arXiv:1906.02243, 2019.
- TAORI, Rohan, et al. [Alpaca: A strong, replicable instruction-following model](https://crfm.stanford.edu/2023/03/13/alpaca.html). Stanford Center for Research on Foundation Models, 2023.
- TOUVRON, Hugo, et al. Llama: Open and efficient foundation language models. arXiv preprint arXiv:2302.13971, 2023.
OTHER USEFUL RESOURCES
- badgooooor/localai-vscode-plugin — extension for attaching a LocalAI instance to VSCode
- Mooler0410/LLMsPracticalGuide — A curated list of practical guide resources of LLMs (LLMs Tree, Examples, Papers)
If you like this article and would be interested to read the next ones, the best way to support my work is to become a Medium member using this link:
If you are already a member and want to support this work, just follow me on Medium.