Engineering Case Study

The Matter & Gas Platform

A serverless AI product analysis platform that evaluates startup ideas and transitions qualified users into structured project workspaces for collaboration and advisory work.

Last updated: March 2026

Context

Problem

Early-stage founders frequently pursue software ideas without understanding:

  • Technical complexity
  • Operational risks
  • Infrastructure requirements
  • AI feasibility

The goal of this system was to create a platform that could:

  • Analyze a proposed product concept using LLM reasoning
  • Produce a structured feasibility report in real time
  • Convert promising leads into authenticated project workspaces

This required bridging anonymous AI interaction with persistent project infrastructure, while maintaining controlled onboarding and operational oversight.

Architecture

The platform uses a serverless AWS architecture designed for low operational overhead and controlled user onboarding.

User
  ↓
Next.js Frontend (Diagnostic UI · Workspace · Admin)
  ↓
Lambda API Layer (AI Analysis: Bedrock + SSE · Messaging: AppSync · Booking System · Admin Operations)
  ↓
DynamoDB (Projects · Conversations · Bookings · AnalyticsAggregate)

Frontend

Next.js / React application providing:

  • Interactive AI diagnostic interface
  • Progressive rendering of analysis results streamed from backend
  • Authenticated workspace environment (/lobby)

Real-time AI analysis is streamed to the client using Server-Sent Events (SSE) with progressive rendering, cancellation support, and structured error signaling.
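The client side of this streaming flow can be sketched roughly as below. The event names ("delta", "error") and endpoint shape are assumptions for illustration, not the platform's actual wire protocol; the parser handles partial frames so chunks can split mid-event.

```typescript
// Minimal SSE frame parser. Event names ("delta", "error") are
// illustrative assumptions, not the platform's actual protocol.
interface SSEEvent {
  event: string;
  data: string;
}

function parseSSE(buffer: string): { events: SSEEvent[]; rest: string } {
  const events: SSEEvent[] = [];
  const frames = buffer.split("\n\n");
  const rest = frames.pop() ?? ""; // incomplete trailing frame, kept for the next chunk
  for (const frame of frames) {
    let event = "message";
    const data: string[] = [];
    for (const line of frame.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) data.push(line.slice(5).trim());
    }
    if (data.length) events.push({ event, data: data.join("\n") });
  }
  return { events, rest };
}

// Client loop sketch: progressive rendering plus cancellation via AbortSignal.
async function streamAnalysis(
  url: string,
  onDelta: (text: string) => void,
  signal: AbortSignal,
): Promise<void> {
  const res = await fetch(url, { signal }); // aborting the signal cancels the stream
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const { events, rest } = parseSSE(buffer);
    buffer = rest;
    for (const ev of events) {
      if (ev.event === "error") throw new Error(ev.data); // structured error signaling
      if (ev.event === "delta") onDelta(ev.data);         // progressive rendering
    }
  }
}
```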

Application Layer

The backend consists of 17 AWS Lambda functions responsible for:

  • API endpoints
  • Cognito authentication triggers
  • AppSync resolver functions
  • Scheduled operational jobs
  • DynamoDB stream processors

Core responsibilities include:

  • AI analysis execution
  • Project workspace management
  • Messaging and collaboration workflows
  • Booking and scheduling operations
  • Administrative tooling

Data Layer

Application state is stored in DynamoDB using a deliberately denormalized model optimized for query patterns. Key models include:

  • Projects
  • Conversation messages
  • Conversation summaries
  • Conversation read-state tracking
  • Booking records
  • Contact submissions
  • Analytics aggregates

AI analysis results are stored directly in the DynamoDB Project record (analysisJson).
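A sketch of that storage shape is below. Attribute names other than analysisJson are assumptions about the Project model.

```typescript
// Sketch of embedding an analysis result in the Project record.
// Attribute names other than analysisJson are assumptions.
interface ProjectItem {
  projectId: string;
  ownerId: string;
  analysisJson: string; // full AI feasibility report, serialized once
  updatedAt: string;
}

function buildProjectItem(projectId: string, ownerId: string, analysis: unknown): ProjectItem {
  return {
    projectId,
    ownerId,
    // Storing the report as one serialized attribute keeps reads to a
    // single GetItem call, at the cost of DynamoDB's 400 KB item limit.
    analysisJson: JSON.stringify(analysis),
    updatedAt: new Date().toISOString(),
  };
}
```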

Infrastructure

Infrastructure is deployed using Amplify Gen 2 (CDK-backed) and includes:

  • AWS Lambda for application logic
  • DynamoDB for application state
  • Cognito authentication
  • AppSync real-time messaging subscriptions
  • EventBridge scheduled jobs
  • SES transactional email
  • CloudFront + WAF edge protection
  • S3 artifact storage utilities

System Snapshot

Frontend: Next.js / React
Backend: 17 Lambda functions
Data: DynamoDB (denormalized)
Realtime: SSE + AppSync subscriptions
Auth: Cognito EMAIL_OTP + PreSignUp gate
Analytics: EventBridge → AnalyticsAggregate + 4 resolvers
Admin: 9 pages + cascade delete (10+ tables)
Edge: CloudFront + WAF

Trade-offs

Key Engineering Decisions

Several architectural decisions shaped the system. Each involved deliberate trade-offs between complexity, reliability, and user experience.

Real-Time AI Streaming (SSE)

AI analysis responses are streamed to the client using Server-Sent Events rather than background jobs.

Benefits:

  • Progressive rendering of analysis output
  • Simplified backend architecture
  • Improved user feedback during long model runs

The streaming protocol supports abort handling and structured error events.

Passwordless Authentication

Authentication uses Cognito EMAIL_OTP passwordless login.

Benefits:

  • Reduced signup friction
  • Simpler account recovery
  • No password management overhead

Signup is gated through a whitelist approval model enforced via a Cognito PreSignUp trigger.
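A minimal sketch of that gate is below. The hardcoded whitelist is an assumption; the real system would load approved emails from a data store. Throwing from a Cognito PreSignUp trigger rejects the signup attempt.

```typescript
// Sketch of a Cognito PreSignUp trigger enforcing a whitelist.
// The hardcoded Set is an illustrative assumption.
interface PreSignUpEvent {
  request: { userAttributes: { email?: string } };
}

const approved = new Set(["founder@example.com"]);

function preSignUp(event: PreSignUpEvent): PreSignUpEvent {
  const email = event.request.userAttributes.email?.toLowerCase();
  if (!email || !approved.has(email)) {
    // Throwing from a PreSignUp trigger rejects the signup attempt.
    throw new Error("Signup is restricted to approved members.");
  }
  return event; // returning the event lets signup proceed
}
```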

Demo-to-Authenticated Transition

Users can run the AI diagnostic anonymously. Results are cached locally in the browser.

After signup, the workspace automatically converts the cached analysis into a persistent project record.

Anonymous exploration → Authenticated collaboration
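The handoff can be sketched as a pure mapping from the browser cache to a project record. The cached-object shape and field names here are assumptions for illustration.

```typescript
// Sketch of the demo-to-authenticated handoff: a locally cached
// analysis becomes a persistent project after signup.
// The cached-object shape and field names are assumptions.
interface CachedAnalysis {
  idea: string;
  analysis: unknown;
  cachedAt: string;
}

function promoteCachedAnalysis(
  cached: CachedAnalysis | null,
  ownerId: string,
): { ownerId: string; title: string; analysisJson: string } | null {
  if (!cached) return null; // nothing to promote: the user signed up cold
  return {
    ownerId,
    title: cached.idea,
    analysisJson: JSON.stringify(cached.analysis),
  };
}
```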

Real-Time Messaging Architecture

Messaging uses AppSync subscriptions for real-time updates. The system includes:

  • Subscription buffering during fetch cycles
  • ULID-based message identifiers
  • Client-side deduplication logic

Conversation read state is tracked per user and per project to support accurate unread detection.
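The deduplication step above can be sketched as follows. Because ULIDs are lexicographically sortable, merging by id also restores chronological order; the message shape here is an assumption.

```typescript
// Sketch of client-side message deduplication. ULIDs sort
// lexicographically in creation order, so an id sort doubles as a
// chronological sort. The message shape is an assumption.
interface Message {
  id: string; // ULID
  body: string;
}

function mergeMessages(existing: Message[], incoming: Message[]): Message[] {
  const byId = new Map<string, Message>();
  // Later copies win, so a buffered subscription event can update a
  // message already present from the fetch cycle.
  for (const m of [...existing, ...incoming]) byId.set(m.id, m);
  return [...byId.values()].sort((a, b) => a.id.localeCompare(b.id));
}
```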

Race-Safe Scheduling

Booking operations use DynamoDB conditional writes:

attribute_not_exists(slotId)

This guarantees slot exclusivity without requiring transactions.
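A sketch of the write parameters is below. The table and attribute names are assumptions; the point is the ConditionExpression, which DynamoDB evaluates atomically with the write so two concurrent bookings for the same slot cannot both succeed.

```typescript
// Sketch of the booking write as plain PutItem parameters.
// Table and attribute names are assumptions.
function bookSlotParams(slotId: string, userId: string) {
  return {
    TableName: "Bookings",
    Item: {
      slotId: { S: slotId },
      userId: { S: userId },
      bookedAt: { S: new Date().toISOString() },
    },
    // Evaluated atomically with the write: the put fails with a
    // ConditionalCheckFailedException if the slot item already exists.
    ConditionExpression: "attribute_not_exists(slotId)",
  };
}
```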

Prompt Versioning

AI prompts are versioned with an ACTIVE pointer system allowing:

  • Reproducible outputs
  • Controlled prompt iteration
  • Safe system evolution without redeploying infrastructure
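The ACTIVE pointer idea can be sketched as a small resolver. The storage shape (a version map plus a pointer field) is an assumption about how the registry is laid out.

```typescript
// Sketch of an ACTIVE-pointer prompt registry. The storage shape
// (version map + pointer field) is an assumption.
interface PromptStore {
  versions: Record<string, string>; // versionId -> prompt text
  activeVersion: string;            // the ACTIVE pointer
}

function resolveActivePrompt(store: PromptStore): { version: string; text: string } {
  const text = store.versions[store.activeVersion];
  if (text === undefined) {
    throw new Error(`ACTIVE pointer '${store.activeVersion}' has no prompt`);
  }
  // Returning the version alongside the text lets each AI run record
  // exactly which prompt produced it, keeping outputs reproducible.
  return { version: store.activeVersion, text };
}
```

Iterating a prompt then becomes: write a new version, flip the pointer, and roll back by flipping it again, with no infrastructure redeploy.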

Token usage estimation utilities allow cost analysis of inference workloads.
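A minimal sketch of such an estimator is below. The 4-characters-per-token heuristic and the price figure are illustrative assumptions, not Bedrock's actual tokenizer or pricing.

```typescript
// Rough token estimator for cost analysis. The chars/4 heuristic and
// the default price are illustrative assumptions, not Bedrock's
// tokenizer or pricing.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function estimateCostUSD(
  inputText: string,
  outputText: string,
  pricePer1kTokens = 0.003, // assumed blended rate
): number {
  const tokens = estimateTokens(inputText) + estimateTokens(outputText);
  return (tokens / 1000) * pricePer1kTokens;
}
```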

Behind the scenes

Operational Infrastructure

Scheduled Jobs

EventBridge scheduled tasks run hourly to:

  • Send booking reminder emails (24 hours before appointments)
  • Aggregate operational metrics into the AnalyticsAggregate table
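The reminder pass above can be sketched as a filter over upcoming bookings. The field names and the hour-wide selection window (matching the hourly schedule) are assumptions.

```typescript
// Sketch of the hourly reminder pass: select bookings starting roughly
// 24 hours from now that haven't been reminded. Field names and the
// hour-wide window are assumptions.
interface Booking {
  slotId: string;
  startsAt: number; // epoch ms
  reminderSent: boolean;
}

function bookingsDueForReminder(bookings: Booking[], now: number): Booking[] {
  const dayMs = 24 * 60 * 60 * 1000;
  const hourMs = 60 * 60 * 1000;
  // The window is one hour wide because the job runs hourly; each
  // booking falls into exactly one run's window.
  return bookings.filter(
    (b) =>
      !b.reminderSent &&
      b.startsAt - now >= dayMs - hourMs &&
      b.startsAt - now < dayMs,
  );
}
```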

Analytics Pipeline

Administrative analytics are powered by scheduled aggregation jobs, a DynamoDB analytics model, and AppSync query resolvers serving:

  • Summary metrics
  • Funnel analysis
  • Time-series views
  • Categorical breakdowns

Abuse Protection

The AI diagnostic endpoint includes IP-based rate limiting implemented in shared Lambda utilities.
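The per-IP logic can be sketched as a fixed-window counter. This in-memory version is illustrative only: across concurrent Lambda instances the real utilities would need shared state (for example a DynamoDB counter), and the limit and window values here are assumptions.

```typescript
// In-memory fixed-window rate limiter sketch. Illustrative only: real
// cross-instance limiting needs shared state. Limit/window are assumptions.
class IpRateLimiter {
  private hits = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit = 5, private windowMs = 60_000) {}

  allow(ip: string, now = Date.now()): boolean {
    const entry = this.hits.get(ip);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request, or the previous window expired: start a new one.
      this.hits.set(ip, { windowStart: now, count: 1 });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```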

Administrative System

The platform includes an internal administrative interface consisting of 9 operational pages, including:

  • Member control center with per-user detail views
  • CRM-style contact notes
  • Per-user project, booking, and AI run history
  • Inbox and conversation management
  • Scheduling management
  • Analytics dashboards
  • AI run audit logs
  • Cascade member deletion touching 10+ table types
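One way to sketch the cascade delete is as an explicit plan built before any row is touched. The table names here are illustrative stand-ins, not the actual schema.

```typescript
// Sketch of cascade member deletion as an explicit plan: enumerate
// every table holding rows for the user before deleting anything.
// Table names are illustrative; the real system touches 10+ table types.
function cascadeDeletePlan(userId: string): { table: string; key: Record<string, string> }[] {
  const childTables = [
    "Projects",
    "ConversationMessages",
    "ConversationReadState",
    "Bookings",
    "ContactSubmissions",
  ];
  // Child records go first, member record last, so a failed run can be
  // retried without leaving an orphaned user behind.
  return [
    ...childTables.map((table) => ({ table, key: { userId } })),
    { table: "Members", key: { userId } },
  ];
}
```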

Retrospective

Lessons Learned

If you're evaluating AI-driven product infrastructure, or exploring what production-grade serverless systems look like in practice, we're happy to talk.