Project Overview
We are building a Next.js + Node.js + Vercel analytics SaaS for affiliates. Users connect affiliate networks via API or CSV upload; we normalise their performance stats and provide dashboards, GEO heatmaps, anonymised and aggregated network benchmarks, and alerts.
The core challenge (and priority) is building a clean, reliable data ingestion and aggregation system that scales as more users and networks connect.
What You’ll Work On
You will lead the backend/data engineering work for ingestion and analytics, including:
- Design the data model in Postgres for raw imports, normalised metrics, and aggregated reporting
- Build API integrations (auth where needed, scheduled pulls, pagination, rate limits)
- Build a CSV import pipeline (validation, mapping, deduplication, error handling, audit trail)
- Implement background jobs for ingestion and processing (cron plus queues/workers) with retries, backoff, and idempotency
- Build the aggregation layer for dashboards/heatmaps and anonymised benchmarks (privacy-safe rules for small sample sizes)
- Expose clean, performance-focused API endpoints for the Next.js app
- Add logging/monitoring and basic automated tests for pipeline reliability
- Ensure sensible handling of sensitive data (secrets, access tokens, and user isolation)
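To illustrate the deduplication and idempotency requirements above, here is a minimal sketch (names like `RawRow`, `dedupKey`, and `ingest` are illustrative, not a prescribed design). In production the store would be a Postgres table with `INSERT ... ON CONFLICT (dedup_key) DO UPDATE`; a `Map` stands in for it here:

```typescript
import { createHash } from "node:crypto";

interface RawRow {
  source: string;     // network or CSV batch identifier
  date: string;       // reporting date, ISO yyyy-mm-dd
  campaignId: string;
  clicks: number;
  revenue: number;
}

// Deterministic key per logical record: re-importing the same CSV
// or API page yields the same key, so duplicates collapse.
function dedupKey(row: RawRow): string {
  return createHash("sha256")
    .update(`${row.source}|${row.date}|${row.campaignId}`)
    .digest("hex");
}

// Idempotent upsert: safe to retry after a partial failure.
function ingest(store: Map<string, RawRow>, rows: RawRow[]): number {
  let inserted = 0;
  for (const row of rows) {
    const key = dedupKey(row);
    if (!store.has(key)) inserted++;
    store.set(key, row); // last write wins
  }
  return inserted; // count of genuinely new records
}
```

Retrying the same batch inserts zero new rows, which is the property the pipeline needs for safe retries.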
Current Stack / Preferences
- Next.js (App Router) + Node.js runtime
- Vercel deployment
- Postgres (hosted provider is flexible)
- ORM: Prisma or Drizzle (open to your recommendation)
- Background jobs: cron + queue/worker approach (open to your recommendation)
Required Experience (Must Have)
- Strong Node.js/TypeScript backend experience (production SaaS preferred)
- Deep Postgres skills (schema design, indexing, query optimisation, migrations)
- Real background job experience (queues/workers, cron scheduling, retries, idempotency)
- Proven experience integrating third-party APIs and handling messy/partial data
- Ability to design systems that are clean, organised, and maintainable
Nice to Have
- Experience deploying Node/Next.js systems on Vercel or serverless environments
- Experience building analytics/aggregation systems (materialized views, rollups, caching strategies)
- Familiarity with privacy-safe aggregation (minimum sample thresholds, anonymisation rules)
- Experience with affiliate platforms, iGaming, or performance marketing analytics
- Observability tooling (Sentry, OpenTelemetry, structured logging)
Engagement
- Contract role (remote)
- Start with an initial scope focused on the ingestion + aggregation MVP, with potential for ongoing work
- Please confirm you are comfortable with the milestone-based budget and timeline below
- Deliverables are defined by the milestone acceptance criteria below
What Success Looks Like (Deliverables)
- Clear backend architecture for ingestion, processing, and aggregation
- Working pipeline for CSV import and at least one API integration (with a pattern to add more)
- Normalised metric layer (consistent definitions across sources)
- Aggregated tables/endpoints powering dashboards + GEO heatmap
- Foundation for anonymised benchmark calculations
- Clean code structure, basic tests, and logging
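As a rough illustration of the anonymised-benchmark foundation above: one common privacy-safe rule is to suppress any benchmark cell backed by too few participants. The threshold and the `Cell` shape here are assumptions for the sketch, not part of the brief:

```typescript
// One row of a benchmark aggregate, e.g. per GEO.
interface Cell {
  geo: string;
  advertisers: number; // distinct participants behind this cell
  avgEpc: number;      // aggregated metric (earnings per click)
}

// Hypothetical k-anonymity-style floor; the real value would be
// decided during the privacy-rules milestone.
const MIN_SAMPLE = 5;

// Cells with too few distinct contributors are dropped so a single
// participant cannot be inferred from the published benchmark.
function safeBenchmark(cells: Cell[]): Cell[] {
  return cells.filter((c) => c.advertisers >= MIN_SAMPLE);
}
```

The same filter would apply to any small GEO/brand slice before it reaches the dashboard.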
How to Apply
Please send:
- A short intro and 1–3 relevant projects you’ve shipped (links if possible)
- Your preferred stack for Postgres + jobs (Prisma/Drizzle, cron/queues, ETL approach)
- A brief outline of how you would design ingestion + deduplication + retries for API and CSV sources
Screening Questions (Answer briefly)
- Describe a pipeline you built. How did you handle retries, rate limits, and duplicate imports?
- What’s your preferred approach for background jobs in a Next.js/Vercel setup?
- How would you prevent anonymised benchmarks from leaking data in small GEO/brand sample sizes?
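For the retry question above, one generic pattern a queue worker might use is exponential backoff with full jitter; the delays and attempt counts here are illustrative, and any real worker would also distinguish retryable from fatal errors:

```typescript
// Retry an async task with exponential backoff + full jitter.
// maxAttempts and baseMs are sketch defaults, not recommendations.
async function withRetry<T>(
  task: () => Promise<T>,
  maxAttempts = 3,
  baseMs = 200,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await task();
    } catch (err) {
      lastErr = err;
      // Full jitter: sleep a random duration in [0, base * 2^attempt).
      const delay = Math.random() * baseMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastErr; // exhausted all attempts
}
```

Combined with idempotent writes, this makes a failed API pull safe to re-run without double-counting.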
We are optimising for correctness and reliability over flashy UI. The data pipeline is the constraint. Please include one example of a data pipeline you shipped in production and what broke first.
---------------------------------
**See attached PDF for Milestones and detailed project overview**
Budget
- Timeline: preferably within 3 months (Milestones 1 to 5 delivered on a rolling basis)
- Payment: milestone-based, €1,200 per milestone (5 milestones)
- Total budget: €6,000
Milestone payments are released as milestones are completed and accepted, not strictly one per month. Some milestones may be delivered in the same month depending on progress.
---------------------------------
Preferred applicants: Senior backend/data engineers with proven production experience in Node.js/TypeScript, Postgres, and background job systems (data pipelines, ETL, ingestion, rollups).
More ongoing work available after this project for the right candidate.