Work

Full case studies with outcome-first summaries, stack, and proof. Use filters to narrow by type of work, or open the impact record for a recruiter-friendly overview.


Analyticado

Centralised analytics hub with ~100 monthly active users

Owned initiative

Summary

Analyticado made trusted analytics easier to find. Dashboards, metadata, and usage signals sit in one searchable hub, supporting roughly 100 monthly active users and roughly a tenfold lift in engagement compared with the previous scattered entry points.

Context

Internal / Enablement

What I built

Analyticado became the organisation's main place to find analytics: Looker dashboards and other outputs in one searchable site. Custom search, integration with Looker's APIs, embedded charts, and screens built for clarity lifted analytics use roughly tenfold, with about 100 active users each month making it the default place to discover dashboards and reports. Plain language and a clear layout supported adoption. Pipelines load Looker metadata, curated spreadsheets, and usage events; database design and internal privacy rules governed what metadata could be shown.

Skills demonstrated

  • Used storytelling and UX clarity to help stakeholders understand and adopt analytical content
  • Managed expectations by aligning site functionality with stakeholder needs
  • Designed interactive, intuitive visual interfaces that highlighted trends and improved findability
  • Applied visual design principles (layout, hierarchy, typography) to enhance usability
  • Built ingestion and transformation processes pulling Looker metadata, curated sheets, and analytics events
  • Demonstrated familiarity with JSON and CSV formats when handling Looker and Google Analytics data
  • Designed normalised BigQuery tables (looker_users, looker_user_activity, looker_explore_fields, looker_dashboard_elements) with proper entity separation (a minimal sketch follows this list)
  • Applied 3NF principles by separating user data from activity data, linked via user_id
  • Applied code management and CI/CD principles in Python-based pipelines
  • Used structured code patterns and reusable modules for maintainability
  • Ensured metadata usage aligned with privacy rules and internal content policies
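
A minimal sketch of the normalised layout named above, focusing on the user/activity split, is shown below using the google-cloud-bigquery client. The dataset name and every column other than user_id are illustrative assumptions, not the project's actual schema.

```python
# Sketch of the 3NF split: user attributes live in looker_users, behavioural
# events in looker_user_activity, linked via user_id. Dataset name and columns
# beyond user_id are assumptions for illustration.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

DDL = """
CREATE TABLE IF NOT EXISTS analytics.looker_users (
  user_id INT64 NOT NULL,
  email   STRING,
  role    STRING
);
CREATE TABLE IF NOT EXISTS analytics.looker_user_activity (
  user_id      INT64 NOT NULL,  -- references looker_users.user_id
  dashboard_id INT64,
  viewed_at    TIMESTAMP
);
"""

for statement in DDL.split(";"):
    if statement.strip():
        client.query(statement).result()  # run each DDL statement in turn
```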

Impact

  • ~100 monthly active users
  • 10x increase in analytics engagement
  • One-stop search for all analytics resources
  • Shaped landing pages with analytics stakeholders so discovery matched real behaviour

Technologies

Next.js · React · Looker API · TypeScript · BigQuery

K-AI - Conversational Analytics Assistant

One chat for sourced answers and conversational analytics

Owned initiative

Summary

K-AI gives teams one place for sourced answers from internal docs and the dashboard catalogue, plus conversational analytics when the question is really about numbers. Routing is tuned so people land on trustworthy explores, long answers run reliably behind corporate proxies, and analytics owners can tighten behaviour before wider rollout.

Context

Internal / Enablement

What I built

Delivered a production chat assistant on the company analytics website. For general questions it searches internal documentation and dashboard-catalogue material and answers with citations. When the question is really about numbers or metrics, it hands off to Looker's conversational analytics in the right context. Combines keyword and meaning-based search, AI-powered routing when a question could belong to more than one business area, and careful matching so users do not land on the wrong dashboard. Long answers run as background jobs with polling so sessions stay reliable behind corporate proxies. Uses standard cloud APIs for orchestration and optional storage for job history. Rolled out in stages, with configuration so behaviour could be tightened over time.
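
The long-answer path can be sketched as a jobs-plus-polling service. The endpoint names and the in-memory store below are assumptions for illustration; the real service persisted job history to BigQuery and generated answers with Vertex AI rather than the placeholder step here.

```python
# Minimal jobs-plus-polling sketch: POST starts a background job, GET polls it,
# so no request has to stay open long enough for a proxy to kill it.
import uuid
from fastapi import BackgroundTasks, FastAPI, HTTPException

app = FastAPI()
jobs: dict[str, dict] = {}  # in-memory store; the real service used BigQuery

def run_assistant(job_id: str, question: str) -> None:
    # Placeholder for the slow path: routing, retrieval, LLM generation.
    jobs[job_id] = {"status": "done", "answer": f"(answer for: {question})"}

@app.post("/ask")
def ask(question: str, background_tasks: BackgroundTasks) -> dict:
    job_id = uuid.uuid4().hex
    jobs[job_id] = {"status": "running", "answer": None}
    background_tasks.add_task(run_assistant, job_id, question)
    return {"job_id": job_id}  # client polls instead of holding a connection open

@app.get("/jobs/{job_id}")
def poll(job_id: str) -> dict:
    if job_id not in jobs:
        raise HTTPException(status_code=404, detail="unknown job")
    return jobs[job_id]
```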

Skills demonstrated

  • Built ingestion and embedding pipelines for documentation, Looker-derived metadata, and curated knowledge into LanceDB
  • Implemented jobs plus polling and optional BigQuery job persistence for long-running assistant responses
  • Integrated embedded Looker Conversational Analytics alongside knowledge cards on the company analytics portal
  • Implemented source citations and traceability from chat responses back to catalogue and doc chunks
  • Delivered FastAPI orchestration with modular Looker conversational and routing components
  • Implemented BigQuery-backed chat history and preferences with scalable patterns
  • Used Vertex AI (Gemini) for routing, question refinement, and answer generation over hybrid LanceDB retrieval (a toy illustration follows this list)
  • Applied explore-first planning, stream-aware catalogue filtering, and metadata-driven agent selection for Looker CA
  • Leveraged BigQuery snapshots for agents, explores, fields, optional field usage weighting, and dashboard elements
  • Structured indexed knowledge for fast hybrid retrieval across sources
  • Unified natural-language access so users get cited answers or open the right CA thread instead of guessing at explores
  • Aligned routing behaviour with operational domains (for example warehouse versus last mile) through clarification and gates
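
The hybrid retrieval named above can be illustrated with a toy scoring function that blends keyword overlap with vector similarity. This shows the idea only; the production system used LanceDB's indexes rather than this brute-force loop, and the alpha weighting is an assumption.

```python
# Toy hybrid retrieval: combine keyword overlap with cosine similarity.
import numpy as np

def keyword_score(query: str, text: str) -> float:
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def hybrid_rank(query, query_vec, docs, doc_vecs, alpha=0.5):
    # alpha balances exact keyword evidence against semantic similarity
    scored = [
        (doc, alpha * keyword_score(query, doc) + (1 - alpha) * cosine(query_vec, vec))
        for doc, vec in zip(docs, doc_vecs)
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```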

Impact

  • One assistant for sourced answers from docs and conversational analytics for metric questions
  • Question routing tuned so people hit the right topic and avoid misleading chart picks
  • Rollout supported by config and pipelines to load docs and internal knowledge
  • Iteration with analytics and portal owners before wider release

Technologies

FastAPI · Vertex AI · LanceDB · BigQuery · Looker API · Next.js · GCP

Capture - Mobile Application

Cross-platform photo sharing app on iOS & Android

Owned initiative

Summary

Independently shipped Capture on iOS and Android: shared photos surface after a delay, with subscriptions, one-time purchases, and ads configured end to end. Public metrics on this site show acquisition and retention moving as the product ships.

Context

Personal / Product

What I built

Independently designed, built, and launched a cross-platform mobile app (Flutter) for delayed-reveal photo sharing. Owned the concept, branding, build, monetisation model, and app-store presentation. Backend on Supabase (database, sign-in, file storage, access rules); push notifications, in-app purchases, and ads wired in. UI designed in Figma. Launched January 2026 with roughly 100 downloads and ongoing growth.
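
The delayed-reveal read path can be sketched with Supabase's Python client for illustration (the app itself uses the Dart client). The table and column names ("memories", "reveal_at") are hypothetical, and row-level security still applies server-side.

```python
# Sketch: fetch only photos whose reveal time has passed. Table and column
# names are hypothetical; URL and key are placeholders.
from datetime import datetime, timezone
from supabase import create_client

SUPABASE_URL = "https://<project>.supabase.co"  # placeholder
SUPABASE_ANON_KEY = "<anon-key>"                # placeholder

client = create_client(SUPABASE_URL, SUPABASE_ANON_KEY)

now = datetime.now(timezone.utc).isoformat()
revealed = (
    client.table("memories")
    .select("id, photo_path, reveal_at")
    .lte("reveal_at", now)  # hide photos still in their delay window
    .execute()
)
print(revealed.data)
```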

Skills demonstrated

  • Built cross-platform app (iOS & Android) using Flutter, Supabase, Firebase
  • Implemented state management (Riverpod), backend architecture, and monetisation
  • Defined business model, feature set, and UX from first principles
  • Created marketing content and managed App Store Optimisation strategy
  • Designed data schema for users, memories, and subscriptions
  • Structured RLS policies and storage organisation

Impact

  • Launched on both app stores (January 2026)
  • ~100 downloads with ongoing growth
  • Full monetisation: subscriptions, one-time purchases, ads

Technologies

Flutter · Dart · Supabase · PostgreSQL · Firebase · Figma

Availability Breakdown

Used data to decide when delivery routes could open

Partner / stakeholder priorities

Summary

Checked real booking behaviour against assumptions and showed where demand was missed. Explained delivery-slot and area-setup ideas clearly to leaders. Built charts that highlighted patterns and delivery areas that were too large or poorly sized.

Context

Last Mile

What I built

Checked real customer booking behaviour against what planners assumed, and showed demand was not captured as well as people thought. Gave leadership confidence to automate when delivery routes open and close. Explained delivery-slot timing, area setup, and routing ideas in plain language to executives. Turned very large datasets into a short list of insights. Used diagnostic analysis to find why slots failed. Built efficient SQL over operational tables, plus charts that highlighted patterns, odd spikes, and delivery areas that were too large or poorly drawn. Automated refresh with spreadsheet scripts. Kept privacy and data accuracy in scope.
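
The descriptive pass over booking data might look like the sketch below: slot fill by hour of day and fill rate by delivery zone. All column names (slot_start, zone_id, booked, capacity) are hypothetical, not the operational schema.

```python
# EDA sketch: time patterns plus per-zone fill rates to flag zones that are
# too large (chronically full) or poorly drawn (chronically empty).
import pandas as pd

bookings = pd.read_csv("bookings.csv", parse_dates=["slot_start"])  # hypothetical extract

# Time pattern: share of slots booked by hour of day.
by_hour = bookings.groupby(bookings["slot_start"].dt.hour)["booked"].mean()
print(by_hour)

# Zone diagnosis: fill rate per delivery zone.
zone_fill = (
    bookings.groupby("zone_id")
    .agg(booked=("booked", "sum"), capacity=("capacity", "sum"))
    .assign(fill_rate=lambda d: d["booked"] / d["capacity"])
    .sort_values("fill_rate")
)
print(zone_fill.head())
```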

Skills demonstrated

  • Explained complex availability concepts to executives and non-technical stakeholders
  • Continuously aligned outputs with stakeholder questions
  • Challenged assumptions by validating availability behaviour with evidence
  • Used synthesis techniques to turn millions of data points into digestible insights
  • Performed diagnostic analysis to identify root causes
  • Used descriptive statistics (EDA, distributions, time patterns)
  • Built optimised SQL queries to extract availability inputs
  • Used aggregation and filtering for analysis-ready data
  • Created visual tools highlighting patterns, anomalies, and zone issues
  • Used design principles for non-technical user interpretability
  • Automated reporting workflows using Apps Script
  • Integrated multiple sources for comprehensive narratives

Impact

  • Stakeholder confidence for automated route-release decisions
  • Revealed zone-sizing and demand-capture issues

Technologies

SQL · Looker · Apps Script

Bulky promotion cost-to-serve analysis

Whether big sales on heavy items actually made money after real costs

Partner / stakeholder priorities

Summary

Compared baseline and promotion weeks for high-volume bottled water and multipack soda across partner FCs, pulling together orders, delivery cost per tote (driver, fuel, totes per route), tote fill from bulky volume, opportunity cost versus average basket profit, and incremental route pressure. Estimated a profit impact on the order of $420K at the largest site over a seven-day window, roughly $560K combined across two FCs versus baseline, and network-scale exposure on the order of $1M once fully costed. Built partner-facing slides aimed at commercial choices (guardrails, thresholds), not headline sales alone. Presented the deck directly to the client: methodology walkthrough, Q&A, and alignment between commercial and fulfilment stakeholders.

Context

Retail fulfilment / Commercial

What I built

Built a full-cost view of large promotions on heavy, space-hungry categories for online grocery during a secondment to the US. Compared baseline weeks to promotion weeks across partner fulfilment sites for bottled water and multipack soda, so higher demand could be read alongside the full cost picture rather than headline sales alone: delivery cost per shopping tote (driver wage and hours per route, fuel split across totes per order), tote fill driven by bulky cube, warehouse handling, and opportunity cost when bulky lines displaced profit from a typical basket mix. In one representative seven-day window, the executive view pointed to roughly $420K estimated profit impact at the largest analysed FC; combining two FCs in the model yielded roughly $560K estimated downside versus baseline once both categories were fully costed. The deck also framed network-scale exposure on the order of $1M once fully costed. Split costs across unloading, picking, packing, loading, and delivery; documented assumptions and overlapping promotions where relevant. Delivered partner-facing slides with recommendations: contribution thresholds after fulfilment and opportunity cost, tighter mechanics or caps for high-space low-margin SKUs, and FC-specific review where economics differ. Presented the deck directly to the client, walking through methodology, handling Q&A, and aligning commercial and fulfilment stakeholders on next steps. Outcome: leadership could approve promotions on profit impact, not lift alone.
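
The cost mechanics described above reduce to a few relationships, sketched below. The function and parameter names are illustrative; the model's actual inputs came from operational data, not these placeholders.

```python
# Sketch of the full-cost mechanics: per-tote delivery cost, opportunity cost
# from displaced basket mix, and the resulting promo profit impact.

def delivery_cost_per_tote(driver_wage_per_hr: float, route_hours: float,
                           fuel_cost_per_route: float, totes_per_route: int) -> float:
    # Driver time and fuel for a route, split across the totes it carries.
    return (driver_wage_per_hr * route_hours + fuel_cost_per_route) / totes_per_route

def opportunity_cost(bulky_totes: int, avg_basket_profit_per_tote: float,
                     bulky_profit_per_tote: float) -> float:
    # Profit forgone when bulky cube fills tote capacity that an average
    # basket mix would otherwise have earned.
    return bulky_totes * max(avg_basket_profit_per_tote - bulky_profit_per_tote, 0.0)

def promo_profit_impact(incremental_margin: float, extra_delivery_cost: float,
                        handling_cost: float, opp_cost: float) -> float:
    # Positive means the promotion made money after full costing.
    return incremental_margin - extra_delivery_cost - handling_cost - opp_cost
```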

Skills demonstrated

  • Quantified trade-offs between promotional volume and contribution after end-to-end fulfilment cost, with site-level estimates on the order of $420K (single FC, seven-day window) and ~$560K across two FCs versus baseline
  • Integrated operational and commercial inputs from multiple feeds into analysable grain
  • Translated operational mechanics into commercial implications for partner stakeholders
  • Facilitated client-facing review: live presentation, questions on methodology, and alignment across commercial and ops leaders
  • Delivered a slide-based narrative suitable for executive and partner review
  • Modelled opportunity cost from tote capacity allocated to bulky versus average mix, alongside higher delivery pressure when promo volume spiked
  • Structured period-over-period comparisons with demand, orders, and profitability lenses
  • Stress-tested conclusions across sites and categories with explicit caveat handling
  • Validated roll-ups against business definitions for orders, units, and cost allocation
  • Moved from assumptions through evidence to recommendations usable in approval workflows
  • Used visual hierarchy so conclusions remained traceable to assumptions and calculations

Impact

  • ~$420K estimated site-level impact (7-day bulky beverage window)
  • ~$560K combined two-FC downside versus baseline once water and soda were fully costed
  • Network-scale exposure on the order of $1M, plus reusable guardrails for future promos
  • Direct client presentation with Q&A; shared understanding on promotion profitability across stakeholders

Technologies

SQL · BigQuery · Spreadsheet modelling

Cost to Serve Model

~$1M annual savings through operational optimisation

Owned initiative

Summary

Model that spotted where fresh groceries cost too much to pick and deliver for a large grocery partner. Clear recommendations, major annual savings, and a version the client's team could run themselves. Presented the savings case and model logic to partner stakeholders to drive adoption.

Context

CFC Operations

What I built

Developed the Cost to Serve model that found where fresh groceries cost too much to pick and deliver for a major grocery partner. Recommendations led to about $1M a year in savings. Built a version the client's team could run themselves; operations leadership adopted it. Presented the savings case and model mechanics to partner stakeholders so adoption was grounded in trust and clarity. Used SQL and structured analysis to trace costs to their underlying drivers and explain causes in plain terms. The model became the default reference when discussing fulfilment cost.

Skills demonstrated

  • Identified and quantified cost inefficiencies in fulfilment operations
  • Structured insights for operations leadership decision-making
  • Built queries to extract cost drivers across fulfilment and operational tables
  • Communicated cost model structure and findings to client stakeholders
  • Ensured client self-service version adhered to data integrity and access controls
  • Delivered actionable recommendations leading to ~$1M annual savings
  • Analysed operational data to identify root causes of cost variance
  • Integrated multiple sources for comprehensive cost narratives
  • Aligned model outputs with operational decision-making needs
  • Documented model assumptions and data lineage

Impact

  • ~$1M annual savings for the partner
  • Client self-service capability
  • Model adopted across operations leadership

Technologies

SQL · BigQuery · Looker · Google Sheets

Geomapping Availability

Maps showed delivery-area and planning problems

Owned initiative

Summary

Interactive maps for booking patterns and problem areas. Filters for time period, site, and slot-planning region. Surfaced issues such as oversized delivery zones.

Context

Last Mile

What I built

Spotted that leadership could not see booking problems clearly on a map and built interactive maps for availability patterns and weak delivery areas. Showed oversized zones and fed into better route timing and slot planning. Filters for time range, site, and planning region. Surfaced issues such as slots disappearing at peak times. Grew from early notebook experiments to multi-partner tools. Structured data so locations and zones could be compared fairly.
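
The map layer can be sketched with folium, which renders in Colab. The file and column names are illustrative assumptions, and the 95% threshold is a placeholder, not the analysis's actual cut-off.

```python
# Sketch: plot delivery zones coloured by fill rate to surface saturated or
# oversized areas on an interactive map.
import folium
import pandas as pd

zones = pd.read_csv("zone_availability.csv")  # hypothetical: zone_id, lat, lon, fill_rate

m = folium.Map(location=[51.5, -0.1], zoom_start=10)  # placeholder centre
for _, z in zones.iterrows():
    folium.CircleMarker(
        location=[z["lat"], z["lon"]],
        radius=8,
        color="red" if z["fill_rate"] > 0.95 else "green",  # flag saturated zones
        tooltip=f"zone {z['zone_id']}: {z['fill_rate']:.0%} booked",
    ).add_to(m)
m.save("availability_map.html")  # open in a browser or display inline in Colab
```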

Skills demonstrated

  • Built scalable solutions from Colab scripts to multi-partner tools
  • Identified critical visibility gap and designed solution for operational needs
  • Used spatial visualisation to frame discussions around route release and planning
  • Built interactive map-based visualisations for availability patterns
  • Created dynamic filtering across time, sites, and slot planning areas
  • Synthesised availability data to uncover operational flaws
  • Identified patterns enabling root cause analysis
  • Integrated multiple data sources for spatial analysis
  • Structured data to support spatial relationships and zone analysis

Impact

  • Exposed zone-sizing issues
  • Informed route-release and slot-planning improvements

Technologies

SQL · Python · Google Colab · Looker

In the Bag

Clearer view of orders and staffing needs

Partner / stakeholder priorities

Summary

Live breakdown of orders and how much labour they imply. Shaped views around what operations asked for. Same-day monitoring for day-to-day decisions.

Context

CFC Operations

What I built

Built live views of how orders break down and how much labour they imply, for partners and internal teams. Helped tighten day-to-day staffing with same-day monitoring. Matched the logic to how operations actually decide things. Traced what drove live order volumes and labour needs. Efficient SQL for live and historical figures with quality checks. Charts that stay easy to read under pressure. Automated refresh via spreadsheet scripts.
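
The order-to-labour translation might look like the sketch below. The pick rates and fixed handling figure are illustrative parameters, not the operational values.

```python
# Sketch: convert an order's unit counts into implied labour hours using
# per-regime pick rates plus a fixed handling allowance per order.
PICK_RATE_UNITS_PER_HR = {"ambient": 180, "chilled": 140, "frozen": 100}  # assumed
FIXED_HANDLING_HRS_PER_ORDER = 0.05  # assumed

def implied_labour_hours(orders: list[dict]) -> float:
    # Each order maps temperature regime to units, e.g.
    # {"ambient": 30, "chilled": 12, "frozen": 5}.
    hours = 0.0
    for order in orders:
        for regime, units in order.items():
            hours += units / PICK_RATE_UNITS_PER_HR[regime]
        hours += FIXED_HANDLING_HRS_PER_ORDER
    return hours

print(implied_labour_hours([{"ambient": 30, "chilled": 12, "frozen": 5}]))
```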

Skills demonstrated

  • Communicated insights clearly to executives and partner teams
  • Structured insights around stakeholder questions
  • Used domain expertise to align order breakdown logic with operational needs
  • Identified underlying drivers of live order values and labour demand
  • Wrote efficient SQL queries for live and historic values at scale
  • Implemented quality checks and error handling within workflows
  • Created visualisations for real-time performance monitoring
  • Highlighted trends and anomalies for fast operational decisions
  • Automated data refreshes and workflows using Apps Script

Impact

  • Enabled tighter labour management
  • Real-time performance visibility for partners

Technologies

SQL · Looker · Apps Script