Generative AI workflows and hacks 2025
2024-03-23 — 2025-10-02
Wherein the year of local LLMs is chronicled, Jan is installed for offline use, Ollama and Simon Willison’s CLI tricks are employed for autonomous inference and Mac acceleration, and DeepSeek reshapes costs.
I’ll try to synthesize LLM research elsewhere. This is where I keep ephemeral notes and links, continuing my habit from 2024.
1 The year I finally install local LLMs
1.1 …using Jan
Jan - Open-source ChatGPT alternative that runs offline
Jan is a full product suite that offers an alternative to Big AI:
- Jan Desktop: macOS, Windows, and Linux apps with offline mode
- Jan Web: Jan in the browser, a direct alternative to chatgpt.com
- Jan Mobile: iOS and Android apps (Coming Soon)
- Jan Server: deploy locally, in your cloud, or on-prem
- Jan Models: Open-source models optimized for deep research, tool use, and reasoning
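Jan can also expose a local, OpenAI-compatible API server, so ordinary client code can talk to a machine you control. A minimal sketch, assuming the API server is enabled in Jan’s settings and that the port and model id below (both assumptions, not taken from Jan’s docs here) match your setup:

```python
# Minimal sketch: query Jan's local OpenAI-compatible server.
# Assumes the API server is enabled in Jan's settings; the port and model id
# below are placeholders -- check what your Jan install actually reports.
import requests

resp = requests.post(
    "http://localhost:1337/v1/chat/completions",  # assumed default address
    json={
        "model": "llama3.2-3b-instruct",  # placeholder model id
        "messages": [{"role": "user", "content": "Summarise this note in one sentence."}],
        "temperature": 0.2,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```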
1.2 …via Ollama for autonomous LLM inference
Useful guides:
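For reference, a minimal sketch of hitting a locally running Ollama server over its REST API from Python; the model name is a placeholder for whatever you have pulled with `ollama pull`:

```python
# Minimal sketch: one-shot generation against a local Ollama server.
# Assumes `ollama serve` is running and the model below has been pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",   # placeholder; use any model you have pulled
        "prompt": "Explain GGUF quantisation in two sentences.",
        "stream": False,       # return one JSON object instead of a stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```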
1.3 …via Simon Willison
Simon develops LLM: a CLI utility and Python library for interacting with Large Language Models.
If you like command-line wizardry, this is neat.
His project includes powerful tricks, such as acceleration on Apple Macs, plus stunts like this one: pillow.md
Together these produce an incredible CLI text processing pipeline. Watch out — it can get pretty expensive.
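The same tool works as a Python library, not just a shell pipeline. A minimal sketch, assuming `llm` is installed and a key (or a local-model plugin) is configured; the model alias here is illustrative:

```python
# Minimal sketch of the `llm` Python API: pick a model, prompt it, read the text.
# Assumes `pip install llm` plus a configured API key or local-model plugin.
import llm

model = llm.get_model("gpt-4o-mini")  # illustrative alias; any installed model works
response = model.prompt("Rewrite this in plain English: 'We leverage synergies.'")
print(response.text())
```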
See also
1.4 …via LM Studio
A GUI option:
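LM Studio can also serve whatever model it has loaded over an OpenAI-compatible local endpoint, so existing client code mostly just needs a new base URL. A hedged sketch, assuming the local server is started and the default port below is correct for your install:

```python
# Minimal sketch: point the OpenAI Python client at LM Studio's local server.
# Assumes the server is started in LM Studio (port 1234 is an assumed default)
# and a model is loaded; the api_key value is ignored by the local server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

completion = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio routes to the loaded model
    messages=[{"role": "user", "content": "Give me one reason to run LLMs locally."}],
)
print(completion.choices[0].message.content)
```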
1.5 Virtually via proxy
2 Moar automation
OpenInterpreter/open-interpreter: A natural language interface for computers.
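A hedged sketch of driving it from Python rather than the interactive CLI, assuming `pip install open-interpreter`; wiring it to a local model is left to its docs:

```python
# Minimal sketch of Open Interpreter's Python API.
# Assumes `pip install open-interpreter`; it executes real code on your machine.
from interpreter import interpreter

interpreter.auto_run = False  # keep the confirmation prompt before running code
interpreter.chat("List the five largest files in my home directory.")
```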
3 What happened previously
4 Economics
5 DeepSeek and the cheapening of LLMs
DeepSeek deep dive:
- The impact of competition and DeepSeek on Nvidia
- deepseek-ai/awesome-deepseek-integration
- GRPO Trainer
- DeepSeek V3 and the cost of frontier AI models
- DeepSeek FAQ – Stratechery by Ben Thompson
- 2. Implement Reward Function for Dataset — veRL documentation
- deepseek-ai/DeepSeek-R1
- huggingface/open-r1: Fully open reproduction of DeepSeek-R1
6 Using AI Agents
I seem to be doing a lot of that.
See AI Agents for more.