Phase 1: Supabase Schema Additions — Implementation Plan
For agentic workers: REQUIRED: Use superpowers:subagent-driven-development (if subagents available) or superpowers:executing-plans to implement this plan. Steps use checkbox (`- [ ]`) syntax for tracking.
Goal: Add the 6 missing tables (notifications, notification_preferences, device_tokens, badges, user_badges, recommendations_cache), new indexes, RLS policies, and triggers to the Supabase schema — completing the foundation for the Supabase-only architecture.
Architecture: Single Supabase migration file following existing conventions (YYYYMMDD_description.sql). All changes are additive (CREATE TABLE IF NOT EXISTS, ADD COLUMN IF NOT EXISTS) — zero risk of breaking existing functionality. Uses SECURITY DEFINER helper functions for admin RLS.
Tech Stack: PostgreSQL 17, Supabase, PostGIS (existing), pg_trgm (trigram indexes for fuzzy search)
Spec: flow_docs/docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md
File Structure
| Action | File | Purpose |
|---|---|---|
| Create | flow_backend/supabase/migrations/20260330000001_notifications_badges_recommendations.sql | Main migration: 6 new tables, RLS, triggers, indexes |
| Create | flow_backend/supabase/migrations/20260330000002_fulltext_search_indexes.sql | Full-text search indexes for events and venues |
| Create | flow_backend/supabase/migrations/20260330000003_connection_depth_column.sql | Add connection_depth to existing connections table |
| Create | flow_backend/supabase/migrations/20260330000004_cleanup_deprecated_tables.sql | Drop followers table if unused |
| Create | flow_backend/supabase/migrations/20260330000005_seed_badges.sql | Seed initial badge definitions (20 rows) |
| Modify | flow_docs/docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md | Update Phase 1 status to “completed” |
Chunk 1: Notifications & Device Tokens Tables
Task 1: Create the notifications migration file
Files:

- Create: flow_backend/supabase/migrations/20260330000001_notifications_badges_recommendations.sql

- Step 1: Create the migration file with header and notifications table
-- =============================================================================
-- Migration: notifications_badges_recommendations
-- Date: 2026-03-30
-- Purpose: Add notification system, badge/achievement system, and recommendation
-- cache tables. Part of the Supabase-only architecture migration.
-- See: docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md
-- =============================================================================
-- =============================================================================
-- 1. NOTIFICATIONS TABLE
-- =============================================================================
CREATE TABLE IF NOT EXISTS public.notifications (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
user_id UUID NOT NULL REFERENCES public.profiles(id) ON DELETE CASCADE,
type TEXT NOT NULL,
title TEXT,
body TEXT,
data JSONB DEFAULT '{}'::jsonb,
read BOOLEAN DEFAULT false,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
COMMENT ON TABLE public.notifications IS 'User notifications (push, in-app, email). Triggered by Edge Functions.';
COMMENT ON COLUMN public.notifications.type IS 'Notification type: event_rsvp, friend_request, message, crew_invite, badge_earned, system';
COMMENT ON COLUMN public.notifications.data IS 'Payload with contextual data (event_id, user_id, etc.)';
-- Trigger: auto-update updated_at
DROP TRIGGER IF EXISTS update_notifications_updated_at ON public.notifications;
CREATE TRIGGER update_notifications_updated_at
BEFORE UPDATE ON public.notifications
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
-- Indexes
CREATE INDEX IF NOT EXISTS idx_notifications_user_read
ON public.notifications (user_id, read, created_at DESC);
CREATE INDEX IF NOT EXISTS idx_notifications_type
ON public.notifications (type, created_at DESC);
-- RLS
ALTER TABLE public.notifications ENABLE ROW LEVEL SECURITY;
DROP POLICY IF EXISTS "Users read own notifications" ON public.notifications;
CREATE POLICY "Users read own notifications" ON public.notifications
FOR SELECT TO authenticated
USING (auth.uid() = user_id);
DROP POLICY IF EXISTS "Users mark own as read" ON public.notifications;
CREATE POLICY "Users mark own as read" ON public.notifications
FOR UPDATE TO authenticated
USING (auth.uid() = user_id)
WITH CHECK (auth.uid() = user_id);
DROP POLICY IF EXISTS "Users delete own notifications" ON public.notifications;
CREATE POLICY "Users delete own notifications" ON public.notifications
FOR DELETE TO authenticated
USING (auth.uid() = user_id);
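As a sketch of what these policies permit (the UUID below is a placeholder), a signed-in client can read and acknowledge its own notifications without filtering on user_id, since RLS injects that constraint and the idx_notifications_user_read index still applies:

```sql
-- Illustrative client queries under the policies above (RLS scopes rows to auth.uid()).
-- Fetch the latest unread notifications:
SELECT id, type, title, body, data, created_at
FROM public.notifications
WHERE read = false
ORDER BY created_at DESC
LIMIT 20;

-- Mark one as read (placeholder UUID):
UPDATE public.notifications
SET read = true
WHERE id = '00000000-0000-0000-0000-000000000001';
```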
-- INSERT: Only via Edge Functions using service_role (bypasses RLS)
-- No INSERT policy for authenticated users — prevents notification spoofing

- Step 2: Add notification_preferences table to same file
-- =============================================================================
-- 2. NOTIFICATION PREFERENCES TABLE
-- =============================================================================
CREATE TABLE IF NOT EXISTS public.notification_preferences (
user_id UUID PRIMARY KEY REFERENCES public.profiles(id) ON DELETE CASCADE,
push_enabled BOOLEAN DEFAULT true,
email_enabled BOOLEAN DEFAULT true,
quiet_hours_start TIME,
quiet_hours_end TIME,
disabled_types TEXT[] DEFAULT '{}',
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
COMMENT ON TABLE public.notification_preferences IS 'Per-user notification settings. Controls push, email, quiet hours.';
COMMENT ON COLUMN public.notification_preferences.disabled_types IS 'Array of notification types the user has opted out of';
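Before sending, an Edge Function could consult these preferences. A minimal sketch, where the $1 parameter and the 'message' type are illustrative; quiet-hours windows that cross midnight would need the inverted check noted in the comment:

```sql
-- Sketch: should we push a 'message' notification to user $1 right now?
SELECT push_enabled
       AND NOT ('message' = ANY(disabled_types))
       AND (quiet_hours_start IS NULL
            OR quiet_hours_end IS NULL
            -- NOTE: for windows crossing midnight (start > end), invert:
            -- NOT (localtime >= quiet_hours_start OR localtime < quiet_hours_end)
            OR localtime NOT BETWEEN quiet_hours_start AND quiet_hours_end)
       AS can_push
FROM public.notification_preferences
WHERE user_id = $1;
```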
-- Trigger: auto-update updated_at
DROP TRIGGER IF EXISTS update_notification_preferences_updated_at ON public.notification_preferences;
CREATE TRIGGER update_notification_preferences_updated_at
BEFORE UPDATE ON public.notification_preferences
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
-- RLS
ALTER TABLE public.notification_preferences ENABLE ROW LEVEL SECURITY;
DROP POLICY IF EXISTS "Users manage own prefs" ON public.notification_preferences;
CREATE POLICY "Users manage own prefs" ON public.notification_preferences
FOR ALL TO authenticated
USING (auth.uid() = user_id)
WITH CHECK (auth.uid() = user_id);

- Step 3: Add device_tokens table to same file
-- =============================================================================
-- 3. DEVICE TOKENS TABLE
-- =============================================================================
CREATE TABLE IF NOT EXISTS public.device_tokens (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
user_id UUID NOT NULL REFERENCES public.profiles(id) ON DELETE CASCADE,
token TEXT NOT NULL,
platform TEXT NOT NULL CHECK (platform IN ('ios', 'android', 'web')),
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
UNIQUE(user_id, token)
);
COMMENT ON TABLE public.device_tokens IS 'FCM device tokens for push notifications. One user can have multiple devices.';
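Because the FOR ALL policy below lets users manage their own rows, token registration can be a simple idempotent upsert from the client (the token value is a placeholder):

```sql
-- Idempotent token registration from a signed-in client (placeholder token).
INSERT INTO public.device_tokens (user_id, token, platform)
VALUES (auth.uid(), 'fcm-token-example', 'android')
ON CONFLICT (user_id, token) DO NOTHING;
```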
-- Index for lookups by user
CREATE INDEX IF NOT EXISTS idx_device_tokens_user
ON public.device_tokens (user_id);
-- RLS
ALTER TABLE public.device_tokens ENABLE ROW LEVEL SECURITY;
DROP POLICY IF EXISTS "Users manage own tokens" ON public.device_tokens;
CREATE POLICY "Users manage own tokens" ON public.device_tokens
FOR ALL TO authenticated
USING (auth.uid() = user_id)
WITH CHECK (auth.uid() = user_id);

- Step 4: Commit notification tables
cd /c/Users/elia-/Documents/flowproject/flow_backend
git add supabase/migrations/20260330000001_notifications_badges_recommendations.sql
git commit -m "feat(db): add notifications, preferences, device_tokens tables
Part of Supabase-only migration Phase 1.
Adds notification system tables with RLS, triggers, and indexes.
See: docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md"

Chunk 2: Badges & Recommendations Tables
Task 2: Add badges system tables
Files:

- Modify: flow_backend/supabase/migrations/20260330000001_notifications_badges_recommendations.sql

- Step 1: Append badges table to migration file
-- =============================================================================
-- 4. BADGES TABLE (Achievement Catalog)
-- =============================================================================
CREATE TABLE IF NOT EXISTS public.badges (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
name TEXT NOT NULL UNIQUE,
description TEXT,
icon_url TEXT,
category TEXT NOT NULL CHECK (category IN ('explorer', 'social', 'organizer', 'veteran')),
criteria JSONB DEFAULT '{}'::jsonb,
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
COMMENT ON TABLE public.badges IS 'Badge definitions. Categories: explorer (discovery), social (connections), organizer (events), veteran (longevity).';
COMMENT ON COLUMN public.badges.criteria IS 'JSON criteria for automatic badge award (e.g., {"events_attended": 10})';
-- RLS: Public read, admin-only write
ALTER TABLE public.badges ENABLE ROW LEVEL SECURITY;
DROP POLICY IF EXISTS "Public badge catalog" ON public.badges;
CREATE POLICY "Public badge catalog" ON public.badges
FOR SELECT USING (true);
DROP POLICY IF EXISTS "Admins manage badges" ON public.badges;
CREATE POLICY "Admins manage badges" ON public.badges
FOR ALL
USING (public.is_admin());
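The policy above assumes the public.is_admin() SECURITY DEFINER helper mentioned in the Architecture note already exists. If it does not, a minimal sketch; the profiles.role column is an assumption, so match it to however the project actually flags admins:

```sql
-- Sketch only: assumes admins are flagged via a profiles.role column.
CREATE OR REPLACE FUNCTION public.is_admin()
RETURNS BOOLEAN
LANGUAGE sql
STABLE
SECURITY DEFINER
SET search_path = public
AS $$
  SELECT EXISTS (
    SELECT 1
    FROM public.profiles
    WHERE id = auth.uid()
      AND role = 'admin'
  );
$$;
```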
-- =============================================================================
-- 5. USER BADGES TABLE (Earned Badges Junction)
-- =============================================================================
CREATE TABLE IF NOT EXISTS public.user_badges (
user_id UUID NOT NULL REFERENCES public.profiles(id) ON DELETE CASCADE,
badge_id UUID NOT NULL REFERENCES public.badges(id) ON DELETE CASCADE,
earned_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
PRIMARY KEY (user_id, badge_id)
);
COMMENT ON TABLE public.user_badges IS 'Junction table tracking which badges each user has earned.';
-- Index for user badge lookups
CREATE INDEX IF NOT EXISTS idx_user_badges_user
ON public.user_badges (user_id);
-- RLS: Public read (show on profiles), write only via Edge Functions (service_role)
ALTER TABLE public.user_badges ENABLE ROW LEVEL SECURITY;
DROP POLICY IF EXISTS "Public badge display" ON public.user_badges;
CREATE POLICY "Public badge display" ON public.user_badges
FOR SELECT USING (true);
-- INSERT/UPDATE/DELETE: Only via Edge Functions using service_role (bypasses RLS)

- Step 2: Append recommendations_cache table
-- =============================================================================
-- 6. RECOMMENDATIONS CACHE TABLE
-- =============================================================================
CREATE TABLE IF NOT EXISTS public.recommendations_cache (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
user_id UUID NOT NULL REFERENCES public.profiles(id) ON DELETE CASCADE,
event_id UUID NOT NULL REFERENCES public.events(id) ON DELETE CASCADE,
score NUMERIC(5,4) NOT NULL,
factors JSONB DEFAULT '{}'::jsonb,
computed_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
UNIQUE(user_id, event_id)
);
COMMENT ON TABLE public.recommendations_cache IS 'Pre-computed event recommendations per user. Refreshed every 6h by Edge Function.';
COMMENT ON COLUMN public.recommendations_cache.score IS 'Recommendation score 0.0000-1.0000. Higher = more relevant.';
COMMENT ON COLUMN public.recommendations_cache.factors IS 'Score breakdown: {"distance": 0.8, "category": 0.9, "friends": 0.5}';
-- Indexes
CREATE INDEX IF NOT EXISTS idx_recommendations_user_score
ON public.recommendations_cache (user_id, score DESC);
CREATE INDEX IF NOT EXISTS idx_recommendations_computed
ON public.recommendations_cache (computed_at);
-- RLS: Users read own, write only via Edge Functions (service_role)
ALTER TABLE public.recommendations_cache ENABLE ROW LEVEL SECURITY;
DROP POLICY IF EXISTS "Users read own recommendations" ON public.recommendations_cache;
CREATE POLICY "Users read own recommendations" ON public.recommendations_cache
FOR SELECT TO authenticated
USING (auth.uid() = user_id);
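Under this policy a client can pull its own ranked feed directly; an illustrative read, which idx_recommendations_user_score supports once RLS adds the user_id predicate:

```sql
-- Top recommendations for the signed-in user (RLS already filters by user_id).
SELECT event_id, score, factors
FROM public.recommendations_cache
ORDER BY score DESC
LIMIT 10;
```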
-- INSERT/UPDATE/DELETE: Only via Edge Functions using service_role (bypasses RLS)

- Step 3: Commit badges and recommendations
cd /c/Users/elia-/Documents/flowproject/flow_backend
git add supabase/migrations/20260330000001_notifications_badges_recommendations.sql
git commit -m "feat(db): add badges, user_badges, recommendations_cache tables
Completes the 6 new tables for Supabase-only migration.
Badges for gamification, recommendations_cache for AI scoring.
See: docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md"

Chunk 3: Full-Text Search & Schema Enhancements
Task 3: Add full-text search indexes
Files:

- Create: flow_backend/supabase/migrations/20260330000002_fulltext_search_indexes.sql

- Step 1: Create full-text search migration
-- =============================================================================
-- Migration: fulltext_search_indexes
-- Date: 2026-03-30
-- Purpose: Add PostgreSQL full-text search indexes on events and venues.
-- Replaces planned Elasticsearch dependency with built-in PostgreSQL FTS.
-- Uses 'simple' language config for multi-language support (Italian + English).
-- =============================================================================
-- Enable pg_trgm extension for fuzzy/similarity matching
CREATE EXTENSION IF NOT EXISTS pg_trgm;
-- =============================================================================
-- 1. EVENTS FULL-TEXT SEARCH
-- =============================================================================
-- GIN index for full-text search on title + description
CREATE INDEX IF NOT EXISTS idx_events_fts
ON public.events
USING GIN (to_tsvector('simple', coalesce(title, '') || ' ' || coalesce(description, '')));
-- Trigram index for fuzzy matching (typo tolerance)
CREATE INDEX IF NOT EXISTS idx_events_title_trgm
ON public.events
USING GIN (title gin_trgm_ops);
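Note that a query can only use idx_events_fts if its tsvector expression matches the index expression verbatim. A hedged example combining both indexes, where 'techno'/'tecno' are sample search terms:

```sql
-- Keyword match via the FTS index plus typo-tolerant match via the trigram index.
SELECT id, title
FROM public.events
WHERE to_tsvector('simple', coalesce(title, '') || ' ' || coalesce(description, ''))
      @@ plainto_tsquery('simple', 'techno')
   OR title % 'tecno'                      -- pg_trgm similarity operator
ORDER BY similarity(title, 'tecno') DESC
LIMIT 20;
```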
-- =============================================================================
-- 2. VENUES FULL-TEXT SEARCH
-- =============================================================================
CREATE INDEX IF NOT EXISTS idx_venues_fts
ON public.venues
USING GIN (to_tsvector('simple', coalesce(name, '') || ' ' || coalesce(description, '')));
CREATE INDEX IF NOT EXISTS idx_venues_name_trgm
ON public.venues
USING GIN (name gin_trgm_ops);
-- =============================================================================
-- 3. PROFILES SEARCH (username, name)
-- =============================================================================
CREATE INDEX IF NOT EXISTS idx_profiles_username_trgm
ON public.profiles
USING GIN (username gin_trgm_ops);
CREATE INDEX IF NOT EXISTS idx_profiles_name_trgm
ON public.profiles
USING GIN (first_name gin_trgm_ops);

- Step 2: Commit
cd /c/Users/elia-/Documents/flowproject/flow_backend
git add supabase/migrations/20260330000002_fulltext_search_indexes.sql
git commit -m "feat(db): add full-text search indexes for events, venues, profiles
Uses PostgreSQL FTS (tsvector) + pg_trgm for fuzzy matching.
Replaces Elasticsearch dependency with zero-cost built-in search.
See: docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md"

Task 4: Add connection_depth column
Files:

- Create: flow_backend/supabase/migrations/20260330000003_connection_depth_column.sql

- Step 1: Create connection_depth migration
-- =============================================================================
-- Migration: connection_depth_column
-- Date: 2026-03-30
-- Purpose: Add connection_depth to connections table for tiered friendships.
-- Depth auto-calculated by Edge Function based on shared event attendance.
-- 0=acquaintance, 1=friend (1-2 shared events), 2=buddy (3-5), 3=crew (6+)
-- =============================================================================
ALTER TABLE public.connections
ADD COLUMN IF NOT EXISTS connection_depth INT DEFAULT 0;
COMMENT ON COLUMN public.connections.connection_depth IS 'Friendship tier: 0=acquaintance, 1=friend, 2=buddy, 3=crew. Auto-updated by compute-connection-depth Edge Function.';
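The Edge Function's depth calculation is out of scope for this migration, but the tier mapping can be sketched in SQL. The event_attendees table name is an assumption; substitute the project's actual attendance table:

```sql
-- Sketch: shared-event count between two users ($1, $2) mapped to a tier.
SELECT CASE
         WHEN shared >= 6 THEN 3  -- crew
         WHEN shared >= 3 THEN 2  -- buddy
         WHEN shared >= 1 THEN 1  -- friend
         ELSE 0                   -- acquaintance
       END AS connection_depth
FROM (
  SELECT count(*) AS shared
  FROM public.event_attendees a
  JOIN public.event_attendees b USING (event_id)
  WHERE a.user_id = $1
    AND b.user_id = $2
) s;
```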
-- Add last_read_at to chat_participants if missing (for unread message counts)
ALTER TABLE public.chat_participants
ADD COLUMN IF NOT EXISTS last_read_at TIMESTAMP WITH TIME ZONE DEFAULT NOW();

- Step 2: Commit
cd /c/Users/elia-/Documents/flowproject/flow_backend
git add supabase/migrations/20260330000003_connection_depth_column.sql
git commit -m "feat(db): add connection_depth to connections, last_read_at to chat_participants
Tiered friendship depth (0-3) auto-calculated from shared events.
last_read_at enables unread message count queries.
See: docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md"

Task 5: Clean up deprecated followers table
Files:

- Create: flow_backend/supabase/migrations/20260330000004_cleanup_deprecated_tables.sql

- Step 1: Check if followers table is referenced in code
cd /c/Users/elia-/Documents/flowproject
grep -r "followers" flow_mobile/lib/ --include="*.dart" -l
grep -r "'followers'" flow_backend/web/ --include="*.ts" --include="*.tsx" -l

Expected: No references (mobile uses user_follows, not followers). If references are found, skip this task and note it for later.
- Step 2: Create cleanup migration (only if no references found)
-- =============================================================================
-- Migration: cleanup_deprecated_tables
-- Date: 2026-03-30
-- Purpose: Drop deprecated `followers` table. Superseded by `user_follows`.
-- Verified: no code references in flow_mobile or flow_backend/web.
-- =============================================================================
DROP TABLE IF EXISTS public.followers CASCADE;

- Step 3: Commit
cd /c/Users/elia-/Documents/flowproject/flow_backend
git add supabase/migrations/20260330000004_cleanup_deprecated_tables.sql
git commit -m "chore(db): drop deprecated followers table (superseded by user_follows)
Verified no code references remain. Part of schema cleanup for Supabase-only migration."

Chunk 4: Seed Badge Definitions
Task 6: Seed initial badge data
Files:

- Create: flow_backend/supabase/migrations/20260330000005_seed_badges.sql

- Step 1: Create badge seed migration
-- =============================================================================
-- Migration: seed_badges
-- Date: 2026-03-30
-- Purpose: Insert initial badge definitions for the gamification system.
-- Categories: explorer (discovery), social (connections), organizer (events), veteran (longevity)
-- =============================================================================
INSERT INTO public.badges (name, description, category, criteria) VALUES
-- Explorer badges
('first_event', 'Attended your first event', 'explorer', '{"events_attended": 1}'),
('event_explorer', 'Attended 10 events', 'explorer', '{"events_attended": 10}'),
('event_veteran', 'Attended 50 events', 'explorer', '{"events_attended": 50}'),
('city_explorer', 'Attended events in 5 different venues', 'explorer', '{"unique_venues": 5}'),
('night_owl', 'Attended 10 events starting after 22:00', 'explorer', '{"night_events": 10}'),
-- Social badges
('social_butterfly', 'Made 10 connections', 'social', '{"connections": 10}'),
('crew_founder', 'Created your first crew', 'social', '{"crews_created": 1}'),
('squad_leader', 'Created 5 squads', 'social', '{"squads_created": 5}'),
('connector', 'Made 50 connections', 'social', '{"connections": 50}'),
('chat_starter', 'Started 10 conversations', 'social', '{"chats_created": 10}'),
-- Organizer badges
('first_organizer', 'Created your first event', 'organizer', '{"events_created": 1}'),
('event_pro', 'Created 10 events', 'organizer', '{"events_created": 10}'),
('crowd_puller', 'Had 100+ attendees at an event', 'organizer', '{"max_attendees": 100}'),
('five_star', 'Received a 5-star average rating', 'organizer', '{"avg_rating": 5.0}'),
('sellout', 'Sold out all tickets for an event', 'organizer', '{"sellout_events": 1}'),
-- Veteran badges
('early_adopter', 'Joined during the beta period', 'veteran', '{"joined_before": "2026-12-31"}'),
('one_month', 'Active for 1 month', 'veteran', '{"days_active": 30}'),
('six_months', 'Active for 6 months', 'veteran', '{"days_active": 180}'),
('one_year', 'Active for 1 year', 'veteran', '{"days_active": 365}'),
('karma_master', 'Achieved a karma score of 4.5+', 'veteran', '{"karma_score": 4.5}')
ON CONFLICT (name) DO NOTHING;

- Step 2: Commit
cd /c/Users/elia-/Documents/flowproject/flow_backend
git add supabase/migrations/20260330000005_seed_badges.sql
git commit -m "feat(db): seed 20 badge definitions across 4 categories
Explorer (5), Social (5), Organizer (5), Veteran (5).
Criteria stored as JSONB for flexible evaluation by Edge Functions.
See: docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md"

Chunk 5: Verification
Task 7: Verify migrations apply cleanly
- Step 1: Check Supabase CLI is available
npx supabase --version

Expected: Version number (e.g., 1.x.x)
- Step 2: Run migration dry-run (if local Supabase is running)
cd /c/Users/elia-/Documents/flowproject/flow_backend
npx supabase db diff --local

This shows what SQL would be applied. Review it for errors.
- Step 3: Apply migrations to remote Supabase (if linked)
cd /c/Users/elia-/Documents/flowproject/flow_backend
npx supabase db push

Expected: All 5 new migrations applied successfully.
- Step 4: Verify tables exist in Supabase dashboard
Check that these tables appear in the Supabase dashboard:
- notifications — with user_id, type, title, body, data, read, timestamps
- notification_preferences — with user_id PK, push_enabled, email_enabled, etc.
- device_tokens — with user_id, token, platform
- badges — with 20 seeded rows
- user_badges — empty junction table
- recommendations_cache — empty cache table
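Equivalently, the table-presence checks can be run in the SQL editor:

```sql
-- Expect 6 rows:
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public'
  AND table_name IN ('notifications', 'notification_preferences', 'device_tokens',
                     'badges', 'user_badges', 'recommendations_cache')
ORDER BY table_name;

-- Expect 20 after the seed migration:
SELECT count(*) FROM public.badges;
```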
Also verify:

- connections.connection_depth column exists (default 0)
- chat_participants.last_read_at column exists
- Full-text search indexes exist on events and venues
- followers table is dropped (if it was)

- Step 5: Update spec document
In flow_docs/docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md, update:
Phase 1: Schema & RLS — ✅ COMPLETED (2026-03-30)
- Step 6: Final commit
cd /c/Users/elia-/Documents/flowproject
git add flow_docs/docs/superpowers/specs/2026-03-29-supabase-only-migration-design.md
git commit -m "docs: mark Phase 1 (Schema & RLS) as completed"

Summary
| Task | What | Files | Est. |
|---|---|---|---|
| 1 | Notification tables (3) + RLS + triggers | Migration 001 | 3 min |
| 2 | Badge tables (2) + recommendations_cache + RLS | Migration 001 | 3 min |
| 3 | Full-text search indexes (pg_trgm) | Migration 002 | 2 min |
| 4 | connection_depth + last_read_at columns | Migration 003 | 2 min |
| 5 | Drop deprecated followers table | Migration 004 | 2 min |
| 6 | Seed 20 badge definitions | Migration 005 | 2 min |
| 7 | Verify all migrations apply | Manual check | 5 min |
Total: ~20 minutes