
Calling an LLM to Implement a Local Side Coding Tool


Please note that all opinions are those of the author.


Last Updated On: 2026-02-15 08:01:34 -0500

I just posted this to Mastodon but it feels useful enough for the future that I want to capture it here.

This feels like a damn useful article to have for future reference:

It is an overview of how to use an LLM API to implement a local coding tool. It isn’t about “how the LLM itself works” but about how you can wrap calls to its API into a working tool. It is very clear and easy to understand.
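To make the pattern concrete, here is a minimal sketch of the idea (not the article’s code): a tiny command-line tool that sends a request to an LLM chat-completion API and saves the generated code to a local file. It assumes the OpenAI Python client with an `OPENAI_API_KEY` in the environment; the model name and file name are illustrative.

```python
# Minimal sketch: wrap an LLM chat-completion call in a local CLI tool.
# Assumes the OpenAI Python client and OPENAI_API_KEY in the environment.
import sys
from openai import OpenAI


def generate_script(task: str) -> str:
    """Ask the LLM for code that accomplishes the given task."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": "You are a coding assistant. Reply with code only, no prose.",
            },
            {"role": "user", "content": task},
        ],
    )
    return resp.choices[0].message.content


if __name__ == "__main__":
    # Take the task from the command line, or fall back to a demo prompt.
    task = " ".join(sys.argv[1:]) or "Write a Python script that prints the current date."
    code = generate_script(task)
    with open("generated_script.py", "w") as f:
        f.write(code)
    print("Wrote generated_script.py")
```

The point is how little glue is needed: the “tool” is just a prompt, one API call, and something done locally with the result.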

I can see this being useful for any kind of “embed an LLM into a computing resource that does something with a generated resource locally” project. Just as an example, I could see it being used with carpentry-specific prompts to, say, generate a set of cut instructions for something like roof joists and produce a PDF for you. This is something that ChatGPT is weirdly good at; I do less geometry for my amateur carpentry these days than I used to.
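A hypothetical sketch of that carpentry idea, under the same assumptions as above (OpenAI Python client) plus the reportlab library for PDF output; the prompt wording and model name are made up for illustration:

```python
# Hypothetical sketch: a domain-specific prompt asks the LLM for cut
# instructions, and the reply is written to a local PDF with reportlab.
from openai import OpenAI
from reportlab.lib.pagesizes import letter
from reportlab.pdfgen import canvas

PROMPT = (
    "I am framing a small shed roof with a 4:12 pitch and a 10-foot span. "
    "List the rafter cut lengths and angles as numbered steps, one per line."
)


def fetch_instructions(prompt: str) -> str:
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


def write_pdf(text: str, path: str = "cut_instructions.pdf") -> None:
    c = canvas.Canvas(path, pagesize=letter)
    y = 750  # start near the top of the page
    for line in text.splitlines():
        c.drawString(72, y, line)  # 1-inch left margin
        y -= 14
        if y < 72:  # simple page break
            c.showPage()
            y = 750
    c.save()


if __name__ == "__main__":
    write_pdf(fetch_instructions(PROMPT))
```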

Sidebar: I may take a shot at implementing the article’s tool in Ruby instead of the Python it uses.