
Dion Almaer

Software, Development, Products


AI Native Dev

Pools of Extraction: How I Hack on Software Projects with LLMs

September 23, 2025

I’ve started to notice some patterns in how I work on software projects.

Specs as mini-pools

I often sit down and just write a spec. Even if I never build it, the spec itself is a kind of temporary pool: a place where scattered ideas turn into something concrete enough to extract from later.

When I say “write a spec”, I actually start with a “brief”, and the spec fleshes out over time. AI helps build out the spec through back and forth, and by reverse-extracting changes from a system. I’m excited for more tools that will help with extraction and syncing. If we can get there, specs can become the source of truth.

It’s important that we get there. I was talking to a friend recently who works on spec-driven systems, and we kept returning to the analogy of building a house.

Imagine meeting with your architect and having a conversation about what you want to build. It starts high level, and the architect comes back with options that you iterate on together. The architect fleshes out actual designs, and you can communicate around the nice 3D models vs. the blueprints.

Now imagine meeting with the GC and just riffing on what you want built. Obviously crazy. Instead, things get written down and everyone aligns around the same shared sources.

We also have inspectors to verify that the build matches the design and follows all of the building codes. Yes, we have building codes that are shared sources of truth and constraints.

Right now we are still relying on a lot of “talk to the builder and they go off and try to build what you just said” with chat based agents. Over time we will add more shared sources, will have better verification, and this will enable us to scale what we build.

Faster cheaper smaller models

We often think about how specifying things well leads to higher quality output from models. This is true. What is often underappreciated is that it also means you can use a variety of models to do the job.

Asking a SOTA galaxy brain that knows about Genghis Khan, every language, and on and on is often overkill. You can make hundreds or even thousands of “flash” calls for the price of one larger call. Thus, having the right knowledge available can allow a small model to give you what you need quickly and cheaply… especially if a fix loop is applied.
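A fix loop can be sketched in a few lines. This is a minimal illustration, not any particular product’s implementation: `generate` and `verify` are stand-ins for your cheap model call and your checker (a compiler, test suite, or linter), and the demo stubs are hypothetical.

```python
# Minimal sketch of a "fix loop": call a cheap model, verify the output,
# and feed failures back into the prompt until it passes or we give up.

def fix_loop(generate, verify, prompt, max_attempts=3):
    feedback = ""
    for attempt in range(1, max_attempts + 1):
        output = generate(prompt + feedback)
        ok, error = verify(output)
        if ok:
            return output, attempt
        # Append the verifier's error so the next cheap call can self-correct.
        feedback = f"\n\nPrevious attempt failed: {error}. Please fix it."
    return None, max_attempts

# Demo with stubs: this toy "model" succeeds only once it sees feedback.
def stub_generate(prompt):
    return "fixed" if "failed" in prompt else "broken"

def stub_verify(output):
    return (output == "fixed", "output was broken")

result, attempts = fix_loop(stub_generate, stub_verify, "write the thing")
print(result, attempts)  # fixed 2
```

Because the loop only needs a pass/fail signal, even a small model that is wrong on the first try often converges in a couple of cheap iterations.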

Throwaway projects

I spin up tiny projects all the time, just to try something out. Most of them don’t live very long, but that’s the point—they’re cheap experiments that give me a quick pool of learning before I move on.

Shopify’s Tobi published a repo that helps make this easy:

I'm constantly starting tests and demos of Rails apps and other tools. This new try tool from @tobi looks exactly what I didn't even realize that I needed! https://t.co/n8ahi05Utz

— DHH (@dhh) August 20, 2025

Repo extractions

Sometimes I yank code out of a larger repo, or glue pieces together from a few repos, and build something smaller and more focused. By narrowing the scope, I create a pool that’s easier to wade into.

Because LLMs are so good at taking different information and translating it for your needs, this can happen in interesting ways. Where I would have looked for more abstractions and shared libraries in the past, it’s murkier now that I can easily create exactly what I need.

It turns out, all of these are me making temporary pools of extraction. And once I noticed that pattern in my own work, I started seeing it everywhere—especially in how LLMs and even our own brains operate.


Pools of Extraction in LLMs

LLMs have two big pools to fish from:

  • The latent pool: everything baked into their weights from training. Big, deep, and fuzzy.
  • The contextual pool: whatever you feed them right now—repos, docs, examples. Sharp, specific, but fleeting.

If you ask a model a question without any context, it dives into the murky depths of its latent pool and hopes to surface something useful. Hand it your relevant code repo, and suddenly it’s swimming in a nice, clear backyard pool with everything it needs right there.

That repo context might not live beyond the session, but while it’s there, it massively increases the odds of good output.
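Filling the contextual pool is mostly mechanical: gather the relevant files and put them in front of the question. A small sketch, where the repo is stubbed as a dict of paths to contents and the prompt template is illustrative:

```python
# Sketch of filling the "contextual pool": prepend the relevant repo
# files to the question instead of relying on the latent pool alone.

def build_contextual_prompt(question, files):
    context = "\n\n".join(
        f"--- {path} ---\n{body}" for path, body in sorted(files.items())
    )
    return (
        "Use the following repo context to answer.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

repo = {
    "pool.py": "def drain(): ...",
    "README.md": "A tiny pools demo.",
}
prompt = build_contextual_prompt("What does drain() do?", repo)
```

The prompt evaporates when the session ends, but while it exists the model is swimming in that clear backyard pool rather than the latent depths.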


Pools of Extraction in Humans

Humans work in a parallel but messier way. If someone asks me about a library I used years ago, I might dredge up the answer from long-term memory. But if I’ve got my old notes or an example in front of me? Way easier.

Psychologists call these retrieval cues—they reactivate the right pathways so memory reconstructs itself. And unlike LLMs, every time I do recall something, I actually change the memory. My brain’s wiring shifts a little. With LLMs, nothing changes in the weights—context pools evaporate when the session closes.


Conclusion: Build Better Pools

Specs, toy projects, repo extractions—these are all ways I give myself better pools to draw from. LLMs need the same treatment: without context they flail, with it they shine. Humans too: notes, cues, and scaffolds make the difference between blank stares and “aha!”

So now when I’m stuck, I don’t just hope the big pool will have the answer. I ask myself: what’s the smallest pool I can create right now to make extraction easier?

It’s all open book tests from here.

That’s when things click.

Stitch Prompt: A CLI for Design Variety

September 9, 2025

I love how you can have an idea and often make it real in short order. I have been exploring how to make sure that when I design my frontend I get as much variety as possible. After all, it’s all about taste!

When I see folks prompt for app UI, 99% of the time they ignore prompting for the style they want. They rush to say what they want the app to do, which is obviously important, but giving the models more information on the style can bring you gold.

I have been collecting styles, and making it easy to pick one to explore… where the tool will fill out the right information to pass in to the models.

I wrapped this in a simple CLI, stitch-prompt, which takes:

  • A simple spec that has info on what you are building
  • A style to try, from a curated list
  • An optional prompt that packages it all together

I have noticed that once I get into the habit… I start to ask for a variety of styles and look at them to get a feel for what I fancy for the particular app. Sometimes my mood and the app itself have me in a minimalistic style; at other times I go for more “fun”. With the flick of a --all flag you get prompts for all styles to grab. It’s been delightful to put these into Stitch and see what comes out the other side:
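The core idea can be sketched in a few lines. This is a hypothetical illustration of what a tool like stitch-prompt does, not its actual code: the style list and the prompt template here are made up for the example.

```python
# Hypothetical sketch: combine a short app spec with a curated style
# description into one design prompt, with an --all-like helper that
# produces a prompt per style.

STYLES = {
    "minimal": "Clean whitespace, muted palette, few accent colors.",
    "playful": "Rounded shapes, bold colors, friendly micro-interactions.",
}

def stitch_prompt(spec, style):
    guidance = STYLES[style]
    return (
        "Design screens for the following app.\n\n"
        f"App: {spec}\n\n"
        f"Style ({style}): {guidance}"
    )

def all_styles(spec):
    # The --all behavior: one prompt per curated style.
    return {name: stitch_prompt(spec, name) for name in STYLES}

prompts = all_styles("A nanny calendar app")
```

Since the output is just text, each generated prompt can be pasted into Stitch, or any image-capable model, to compare the styles side by side.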

It’s been delightful to take a series of Stitch screens and put them side by side with different styles:

And, since these are just prompts, you can fire them up with any LLM that speaks images and put them together:

We are in a world where generation is cheap, so make sure to use that fact and do a lot more curation!

We are embracing this in Stitch and hope you give it a try! We have a lot coming.

Stitch: A Tasteful Idea

August 12, 2025

Bringing Taste to Intelligence

My oldest son has been a Star Wars fan, where I am more into Star Trek… with The Next Generation being the GOAT. Like many, Commander Data holds a particular spot in my heart… the age-old story of Pinocchio, where the non-human teaches the most about humanity.

As I rewatch the episodes of ST:TNG, and see Data grow, it’s hard not to make the obvious parallels to our newest incarnation of AI. We have new LLMs from various frontier labs, and every time we fire them up it’s like booting up Data for the first time. Great intelligence sits within the weights. Mapped experience lies within. Yet it is hard to know what experience is lacking, and it’s obvious to see the importance of onboarding.

Rise of the Taste Makers

People are finally starting to realise how few truly great designers there are, and how disproportionately valuable they are.

— Benji Taylor (@benjitaylor) August 4, 2025

Great designers are valuable indeed. They not only have great skill, but they have taste. If we do this right, we will be able to amplify their taste.

I hire an interior designer with taste, and I enjoy the process of working with them to curate an experience that I will love in my home. I may not be able to come up with the design, but I have opinions on what I want, and I know what I like when I see it.

Every home you enter has a personal feel, something unique. I love this and want the same for many of my computing experiences. There are familiar building blocks, great for usability, but I don’t want a world where every UI looks the same. That’s boring and soulless.

I loved the early Web for its character, as strange as some of it is. The fashion evolved quickly, as we worked out what worked… and explored a new space. There is still so much to explore, and I want new AI design tools to help me, through their own skill… and through connecting me to taste makers.

Enter Stitch

sneak peek at what's next for Stitch. a lot is coming pic.twitter.com/6mA5vs6GVU

— Stitch by Google (@stitchbygoogle) August 6, 2025

One such tool caught my attention: Stitch, a product from Google Labs. The results seemed different… more diverse. I got to meet the team behind it, part of a group focused on the future of software development, where I found some friends from the past thriving. I found myself so excited about the mission that I worked to join them… and they have kindly given me that opportunity.

To pull this off we need to weave models deeply trained on great design with a user experience that lets users wield that power. We have the seeds, with so many ideas and experiments to run and learn from.

This week we shipped a new version of Stitch that contains many performance and reliability updates, and an overhaul to the main window that gives you the freedom of an infinite canvas to play in. It’s a pleasure to use, and is a foundation we are building on.

I wanted to use designs from Stitch as context for Jules as it built out a nanny calendar app for me. Here's what I did. Demo video at the end!

First, I asked Stitch to make the designs for me. 🧵 pic.twitter.com/ItNb4jwpOE

— Kath Korevec (@simpsoka) August 11, 2025

You can see it in action with Kath’s demo that weaves together the latest Stitch with Jules, a sister project to Stitch in labs, which just had an epic week of launches as it came out of beta. Jules is an asynchronous coding agent that you can work with to massively scale your development efforts.

Taste Goes Beyond Design

I realized today why there’s so much churn in the js community.

— Aaron Boodman (@aboodman) July 21, 2025

There are so many areas of taste. Stitch cares about design and frontend, and Jules also cares about the taste in building great software.

We all have our own taste. A developer who picks Vue.js does so in part because of the taste of Evan You and team. A die-hard Clojure fan follows the taste of Rich Hickey. Open source has long been a spot for taste makers to share, and for the community to gather.

There is taste in all things code and software projects, and we can create layers that allow those who excel in accessibility to shine through more of us, as with performance, composition, security, and on and on.

Manifesting Ideas

One of the reasons I am so very excited about the revolution we are all currently part of is its potential to take away friction so that ideas can become real. What was once expensive is on the path to becoming very cheap. Marrying taste with intelligence will give us a sea of experiences that are delightful to use.

If you want to fight for a more utopian future vs. a dystopian one, we are hiring across the board. If you love building the future of computing with like-minded folk, please reach out!

/fin
