
Building in Public

How I Apply Spec-Driven AI Coding

LLMs code better when they focus on a single task at a time instead of trying to solve multiple issues in your codebase at once. Carl Rannaberg recently introduced a plan-based AI coding workflow in his article "My current AI coding workflow": a planner phase first creates a task plan for the feature you are developing, and an executor phase then works through the plan, generating code task by task. I used the method for a while and liked it a lot.

Now there's a new kid on the block, Kiro.dev from AWS, which goes even further: its planner first creates a requirements spec, then a design, and only after that the task list. As I'm still on Kiro's waitlist, I applied the methodology as a unified workflow across all the coding assistants at my disposal: Cursor, Claude, and Gemini.
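The planner/executor split can be sketched in a few lines of Python. This is a minimal illustration of the idea, not Kiro's or Carl's actual implementation; the plan format (a Markdown checklist) and the `run_agent` callable are assumptions standing in for however you prompt your coding assistant:

```python
import re

def parse_tasks(plan_md: str) -> list[str]:
    """Extract the unchecked items ("- [ ] ...") from a Markdown task plan."""
    return re.findall(r"^- \[ \] (.+)$", plan_md, flags=re.MULTILINE)

def execute_plan(plan_md: str, run_agent) -> list[str]:
    """Feed the executor one task at a time, in plan order.

    run_agent is a callable that sends a single focused task to the
    coding assistant; already-checked items ("- [x]") are skipped.
    """
    done = []
    for task in parse_tasks(plan_md):
        run_agent(task)  # one narrow prompt per task, per the workflow
        done.append(task)
    return done
```

The point of the structure is visible even in this toy version: the executor never sees the whole feature at once, only the next unchecked line of the plan.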

I've put the framework up on GitHub at https://github.com/andreskull/spec-driven-ai-coding

A Practical Guide to Deploying Dagster Hybrid on GCP

In my previous post, "How I Built a Modern Data Orchestration Layer for Finfluencers.trade", I explained why I chose Dagster for my project — a platform that transforms financial content into structured, auditable data. The core challenge I wanted to solve was creating complete traceability and lineage for every piece of data, from raw podcast transcripts and article text to final predictions on the website.

This post dives into the how — a detailed, technical guide for setting up a Dagster Hybrid deployment on Google Cloud Platform (GCP).

How I Built a Traceable Data Pipeline for Finfluencers.trade

The core of my project, Finfluencers.trade, is turning a chaotic stream of public content—podcasts, videos, and tweets—into a structured, auditable database of financial predictions. To do this reliably, I needed a rock-solid system for managing how data moves and transforms.

This post explains how I set up that system—my data pipeline's orchestration—from day one.

Securing a Newsletter Subscription Form on a Static MkDocs Site

Adding a newsletter subscription form to a static site seems straightforward—until you consider security implications. In this post, I'll share how I implemented a secure newsletter subscription for my MkDocs-based blog using Cloudflare Turnstile and a serverless API.

For the Finfluencers Trade blog, built with MkDocs (a static site generator), I wanted a simple way for readers to subscribe to a newsletter for updates. Static sites can't run server-side code directly, so handling form submissions requires a different approach than dynamic sites (like WordPress or Node.js apps). I needed a solution that was secure, reliable, respected user privacy, and didn't require me to manage a backend just for email signups.
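The server side of that setup boils down to one check: the serverless function forwards the client's Turnstile token to Cloudflare's `siteverify` endpoint and only accepts the signup if Cloudflare confirms it. A minimal sketch in Python, assuming a generic serverless handler; the injectable `post` parameter is mine, added purely so the check is testable offline, and my real handler's names differ:

```python
import json
import urllib.parse
import urllib.request

# Cloudflare's server-side verification endpoint for Turnstile tokens.
SITEVERIFY_URL = "https://challenges.cloudflare.com/turnstile/v0/siteverify"

def _default_post(url: str, data: bytes) -> dict:
    """POST form-encoded data and decode the JSON reply."""
    req = urllib.request.Request(url, data=data, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def verify_turnstile(token: str, secret: str, post=_default_post) -> bool:
    """Return True only if Cloudflare validates the client-side token."""
    payload = urllib.parse.urlencode(
        {"secret": secret, "response": token}
    ).encode()
    result = post(SITEVERIFY_URL, payload)
    return bool(result.get("success"))
```

Only after `verify_turnstile` returns True does the function pass the email address on to the mailing-list provider, so bots that never solved the challenge never reach the list.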

Why I Chose MkDocs for the Blog

Building Finfluencers Trade is a journey, and I'm committed to documenting it openly – the technical challenges, research findings, and strategic pivots. This blog is central to that "Building in Public" approach.

Documenting this journey requires a blog that's easy to manage and integrates with my development workflow. After considering different options, I chose MkDocs with the Material theme because its focus on Markdown, static site generation, and developer experience perfectly matched my needs for the Finfluencers Trade blog.