Engineering

We don't sell someone else's product.

PhotoCast and FieldCast are tools we built because the off-the-shelf options weren't fast enough. Here's how the pipeline works, end to end.

PhotoCast — media days

The kids see their photo before the next pose.

Each athlete gets a QR card at check-in. Cameras feed photos to our edge node over FTP. PhotoCast reads the QR from inside each frame, looks up the player, and pushes the photo to a per-player iPad — typically within a few seconds of the shutter.

01 · Camera: FTP push from Z8 / Sony body
02 · QR scan: player lookup against roster
03 · Per-player session: DuckDB upsert + thumbnail
04 · iPad live view: WebSocket push to the kiosk

FieldCast — live events

Parents see the game while it's still being played.

Cameras upload to R2 over LTE the moment we shoot. Cloudflare Queues pick up each frame, an AI step enhances the image and tags content, a face-matching pass associates the photo with rostered athletes, and the result is published to a public gallery.

01 · Camera: FTP → R2 from sideline edge node
02 · AI restyle + tag: Cloudflare Queue → GPU processor
03 · Face match: roster-aware identity tagging
04 · Live gallery: live.durkin.photos, real-time

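One frame's trip through the consumer side of that queue can be sketched as a single pure function. Everything here is an illustrative stand-in: `processFrame`, the stubbed `tagImage` and `matchFaces` stages, and the `GalleryItem` shape are assumptions, not the real GPU pipeline.

```typescript
// Hypothetical shape of a queued frame and its published gallery record.
type FrameMsg = { key: string; eventId: string };
type GalleryItem = {
  key: string;
  tags: string[];
  playerIds: string[];
  publishedAt: number;
};

// Stand-ins for the GPU restyle/tag step and the roster-aware face match.
const tagImage = (_key: string): string[] => ["action", "outdoor"];
const matchFaces = (_key: string, roster: string[]): string[] => roster.slice(0, 1);

// One queue message in, one publishable gallery item out.
function processFrame(msg: FrameMsg, roster: string[], now: number): GalleryItem {
  return {
    key: msg.key,
    tags: tagImage(msg.key),
    playerIds: matchFaces(msg.key, roster),
    publishedAt: now,
  };
}
```

Keeping each stage a pure transform over the frame record is what lets the queue retry a failed frame without touching anything already published.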
Built in-house

Yes, we run the data center too.

Cloudflare edge

Workers, R2, D1, Queues. Image proxies, signed URLs, and the live gallery all live at the edge for low-latency delivery worldwide.
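The signed URLs mentioned above follow a standard pattern: an HMAC over the path and an expiry, checked at the edge before serving. This sketch assumes a generic HMAC-SHA256 scheme; the parameter names (`exp`, `sig`) and secret handling are illustrative, not the production scheme.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Illustrative expiring signed URL: HMAC over "path:expiry".
function signUrl(secret: string, path: string, expiresAt: number): string {
  const sig = createHmac("sha256", secret)
    .update(`${path}:${expiresAt}`)
    .digest("hex");
  return `${path}?exp=${expiresAt}&sig=${sig}`;
}

// Edge-side check: reject expired or tampered URLs before touching R2.
function verifyUrl(secret: string, url: string, now: number): boolean {
  const [path, query] = url.split("?");
  const params = new URLSearchParams(query);
  const exp = Number(params.get("exp"));
  const sig = params.get("sig") ?? "";
  if (!exp || now > exp) return false; // expired
  const expected = createHmac("sha256", secret)
    .update(`${path}:${exp}`)
    .digest("hex");
  return (
    sig.length === expected.length &&
    timingSafeEqual(Buffer.from(sig), Buffer.from(expected))
  );
}
```

Because verification needs only the shared secret, it runs in a Worker with no origin round-trip, which is what keeps gallery delivery fast worldwide.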

AI image processing

Custom restyling and tagging pipeline running on GPU at our studio. Output is consistently graded across every frame of an event.
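"Consistently graded" means one set of event-level parameters applied to every frame, rather than per-frame auto adjustments. The gamma curve below is a deliberately simple stand-in for the real restyle model; the names and the `Grade` shape are assumptions for illustration.

```typescript
// One shared grade per event, applied identically to every frame.
type Grade = { gamma: number };

// Map an 8-bit channel value through the event's shared tone curve.
function gradeChannel(v: number, grade: Grade): number {
  return Math.round(255 * Math.pow(v / 255, grade.gamma));
}

function gradeFrame(pixels: number[], grade: Grade): number[] {
  return pixels.map((v) => gradeChannel(v, grade));
}
```

The design choice is the point: because the grade is fixed per event, two frames shot minutes apart come out matching, and the gallery reads as one body of work.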

Operator-built

The software, the cameras, the field crew — all the same team. When something breaks, the person who built it is on the call.

Want this on your sideline?

Schedule a call — we'll talk through the event and the pipeline.