# Clone & enter the repo
git clone https://github.com/pgflow-dev/ai-web-scraper.git
cd ai-web-scraper
# Copy the example environment file
cp supabase/functions/.env.example supabase/functions/.env
# Edit supabase/functions/.env and add your OpenAI API key (the AI steps need it):
# OPENAI_API_KEY=sk-...
# Start the local Supabase stack, apply the migrations, and serve the edge functions
npx supabase@2.22.12 start
npx supabase@2.22.12 migration up --local
npx supabase@2.22.12 functions serve
# In a second terminal:
curl -X POST http://127.0.0.1:54321/functions/v1/analyze_website_worker
The curl request boots the worker; it stays alive and polls for jobs.
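For orientation, the worker function itself is typically just a thin wrapper around pgflow's EdgeWorker. A minimal sketch of what supabase/functions/analyze_website_worker/index.ts might look like, assuming the EdgeWorker entry point and an illustrative flow module path (check the repo for the real import specifiers):

// Hypothetical sketch of the worker entry point; adjust the imports to the
// project's import map or npm: specifiers.
import { EdgeWorker } from 'npm:@pgflow/edge-worker';
import AnalyzeWebsite from '../_flows/analyze_website.ts';

// Start a long-lived worker that polls for queued step tasks and executes them.
EdgeWorker.start(AnalyzeWebsite);

With the worker polling, kick off a run from SQL (for example in the local Studio SQL editor or via psql):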
select * from pgflow.start_flow(
  flow_slug => 'analyzeWebsite',
  input => '{"url":"https://supabase.com"}'
);
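If you'd rather trigger runs from application code instead of SQL, here is a hedged sketch using supabase-js; it assumes the pgflow schema is exposed through the Data API (listed under [api] schemas in supabase/config.toml), otherwise wrap start_flow in a function in the public schema:

// Hypothetical client-side trigger; the URL and key are local-dev placeholders.
import { createClient } from 'npm:@supabase/supabase-js';

const supabase = createClient(
  'http://127.0.0.1:54321',
  Deno.env.get('SUPABASE_ANON_KEY') ?? '<anon-key>',
);

// Call pgflow.start_flow through PostgREST with the same arguments as above.
const { data, error } = await supabase
  .schema('pgflow')
  .rpc('start_flow', {
    flow_slug: 'analyzeWebsite',
    input: { url: 'https://supabase.com' },
  });
if (error) console.error(error);
else console.log('run started:', data);

Either way, once the run finishes you can inspect the results: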
select * from websites; -- scraped data
select * from pgflow.runs; -- run history
That’s it – scrape, summarize, tag, store!
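Under the hood, those four stages map onto a pgflow flow definition. As a rough sketch of the shape only (based on pgflow's DSL; the step slugs and placeholder bodies are illustrative, not the repo's actual code):

// Illustrative flow shape; the real steps scrape the page, call OpenAI for
// the summary and tags, and insert a row into the websites table.
import { Flow } from 'npm:@pgflow/dsl';

type Input = { url: string };

export default new Flow<Input>({ slug: 'analyzeWebsite' })
  // Scrape: fetch the raw page content.
  .step({ slug: 'website' }, async ({ run }) => ({
    content: await (await fetch(run.url)).text(),
  }))
  // Summarize: condense the content (placeholder instead of an OpenAI call).
  .step({ slug: 'summary', dependsOn: ['website'] }, async ({ website }) =>
    website.content.slice(0, 200),
  )
  // Tag: classify the content (placeholder instead of an OpenAI call).
  .step({ slug: 'tags', dependsOn: ['website'] }, async () => ['placeholder'])
  // Store: combine the outputs (the real flow writes to the websites table).
  .step(
    { slug: 'saveToDb', dependsOn: ['summary', 'tags'] },
    async ({ run, summary, tags }) => ({ url: run.url, summary, tags }),
  );

In this sketch, summary and tags both depend only on website, so they can run in parallel before saveToDb combines their outputs.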