
LLM Cost and Latency Notes: using caching where it actually pays off

I turned implicit processes into explicit operating rules: defining owners, acceptance tests, and lightweight runbooks so teams can move confidently and recover quickly.

The friction I kept seeing was simple: we can ship quickly and still lose reliability when ownership stays fuzzy.

Instead of adding more moving parts, I tested a single-path implementation before introducing alternatives.

April is where Q2 intentions either become systems or remain slideware.

What I changed today

  • I documented one decision that usually lives in hallway conversations.
  • I removed one optional branch that only added maintenance burden.
  • I aligned a technical decision with a business-facing success metric.

What I want to keep doing

Delivery speed held while ambiguity dropped, and that is a win in real teams. When assumptions are visible, teams move faster with fewer expensive surprises.

Tomorrow’s focus

Stress-test this with less-ideal inputs and see where it bends.

Michael John Peña

Senior Data Engineer based in Sydney. Writing about data, cloud, and technology.