Location: Remote (Pakistan Preferred)
Reports to: Founder / Technical Lead
Project Type: Contract (4–6 Weeks)
Work Type: Remote
⸻
About
We are building a high-scale, production-grade Google Ads data engine to power a large outbound sales infrastructure.
This system will:
- Collect businesses actively running Google Ads (mobile-first)
- Extract phone numbers from call ads and call extensions
- Normalize and deduplicate data
- Store structured lead records in PostgreSQL
- Export automated daily CSV feeds for VICIdial
- Scale to 20,000–50,000+ API requests per day
We are seeking a hands-on, experienced Backend Engineer who has already built similar high-volume data ingestion or SERP-based automation systems in production environments.
This is neither an entry-level role nor a basic scraping task; we require prior real-world experience with scalable, API-driven data systems.
⸻
Role Overview
As Senior Backend Engineer — Data Pipeline, you will design and implement a scalable, mobile-first SERP automation system.
You will own:
- API integration
- Worker-based processing architecture
- Deduplication logic
- Database schema design
- System reliability under scale
- Automation and export workflows
This is a backend-intensive engineering role focused on performance, scalability, and clean architecture.
⸻
Key Responsibilities
Data Pipeline Development (40%)
- Build scalable Node.js worker-based data collectors
- Integrate SERP APIs (SerpApi / DataForSEO)
- Implement concurrency control and rate limiting
- Design retry mechanisms and exponential backoff strategies (see the sketch after this list)
- Ensure stability at 20,000+ daily API requests
- Optimize performance for high-throughput ingestion
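For reference, a minimal TypeScript sketch of the retry-with-exponential-backoff and bounded-concurrency pattern this work involves; `withRetry`, `mapWithConcurrency`, and `fetchSerpPage` are illustrative names, not a prescribed API:

```ts
/** Retry an async operation with exponential backoff plus random jitter. */
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 500,
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxAttempts) throw err;
      // Backoff schedule: 500ms, 1s, 2s, 4s, ... plus up to 250ms of jitter.
      const delay = baseDelayMs * 2 ** (attempt - 1) + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

/** Run tasks with a fixed concurrency cap (a simple worker-pool pattern). */
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  task: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    async () => {
      while (next < items.length) {
        const i = next++; // safe: index claimed synchronously per event-loop turn
        results[i] = await task(items[i]);
      }
    },
  );
  await Promise.all(workers);
  return results;
}

// Example: run 10,000 SERP queries, 10 in flight at a time, each with retry.
// `fetchSerpPage` is a hypothetical wrapper around SerpApi/DataForSEO.
// await mapWithConcurrency(queries, 10, (q) => withRetry(() => fetchSerpPage(q)));
```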
⸻
Database Architecture & Deduplication (30%)
- Design PostgreSQL schema for:
  - Leads
  - Query history
  - Suppression lists
- Implement phone- and domain-based deduplication logic (see the sketch after this list)
- Optimize indexing and query performance
- Maintain freshness tracking and query rotation logic
- Prevent duplicate or stale data exports
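For reference, a minimal sketch of database-level deduplication on the phone + domain pair; table and column names are assumptions, since the real schema is itself a deliverable:

```ts
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// One-time DDL: the unique index makes Postgres itself enforce deduplication,
// so concurrent workers can never insert the same (phone, domain) pair twice.
const ddl = `
  CREATE TABLE IF NOT EXISTS leads (
    id            BIGSERIAL PRIMARY KEY,
    business_name TEXT NOT NULL,
    phone_e164    TEXT NOT NULL,  -- normalized, e.g. +14155550123
    domain        TEXT NOT NULL,  -- normalized, e.g. example.com
    first_seen_at TIMESTAMPTZ NOT NULL DEFAULT now(),
    last_seen_at  TIMESTAMPTZ NOT NULL DEFAULT now()
  );
  CREATE UNIQUE INDEX IF NOT EXISTS leads_phone_domain_uniq
    ON leads (phone_e164, domain);
`;

/** Insert a lead; a duplicate only refreshes its freshness timestamp. */
async function upsertLead(name: string, phone: string, domain: string): Promise<void> {
  await pool.query(
    `INSERT INTO leads (business_name, phone_e164, domain)
     VALUES ($1, $2, $3)
     ON CONFLICT (phone_e164, domain)
       DO UPDATE SET last_seen_at = now()`,
    [name, phone, domain],
  );
}

// await pool.query(ddl); // run once at startup
// await upsertLead("Acme Plumbing", "+14155550123", "acmeplumbing.com");
```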
⸻
Automation & Infrastructure (20%)
- Implement Redis + BullMQ worker queues
- Set up scheduled jobs (cron-based execution)
- Automate the daily CSV export for VICIdial (see the sketch after this list)
- Deploy backend system to VPS/cloud
- Configure process managers (PM2 or equivalent)
- Implement structured logging and monitoring
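For reference, a minimal sketch of a cron-scheduled BullMQ job that writes the daily VICIdial CSV; the queue name, schedule, file path, and column layout are assumptions (VICIdial list formats are configurable):

```ts
import { Queue, Worker } from "bullmq";
import { writeFile } from "node:fs/promises";

const connection = { host: "127.0.0.1", port: 6379 };

// Hypothetical helper: pulls leads that have not yet been exported.
declare function fetchFreshLeads(): Promise<
  { id: number; phone: string; firstName: string; lastName: string }[]
>;

// Repeatable job: enqueue the export every day at 06:00 server time.
// (Recent BullMQ versions take the cron expression as `repeat.pattern`;
// older releases used `repeat.cron`.)
const exportQueue = new Queue("daily-export", { connection });
await exportQueue.add("vicidial-csv", {}, { repeat: { pattern: "0 6 * * *" } });

new Worker(
  "daily-export",
  async () => {
    const leads = await fetchFreshLeads();
    const header = "phone_number,first_name,last_name,vendor_lead_code";
    const rows = leads.map((l) => `${l.phone},${l.firstName},${l.lastName},${l.id}`);
    // A production export would also escape fields and mark leads as exported.
    await writeFile(`/exports/vicidial-${Date.now()}.csv`, [header, ...rows].join("\n"));
  },
  { connection },
);
```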
⸻
Code Quality & Scalability (10%)
- Build a modular, maintainable backend architecture
- Document system flow and data structure
- Ensure robust error handling and graceful recovery (see the shutdown sketch after this list)
- Maintain performance under high load
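For reference, a minimal sketch of graceful shutdown for a worker process; `worker` and `pool` stand in for the app's BullMQ worker and PostgreSQL pool:

```ts
import type { Worker } from "bullmq";
import type { Pool } from "pg";

declare const worker: Worker; // the app's BullMQ worker instance
declare const pool: Pool;     // the app's pg connection pool

// On SIGTERM (e.g. a PM2 reload or deploy), finish in-flight jobs and
// close connections cleanly so no lead is lost or half-processed.
process.on("SIGTERM", async () => {
  await worker.close(); // stop taking new jobs, wait for in-flight ones
  await pool.end();     // flush and close database connections
  process.exit(0);
});
```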
⸻
Required Qualifications & Experience (Must-Have)
- 4+ years professional backend development experience
- Strong expertise in Node.js (async, API integration, error handling)
- Advanced PostgreSQL knowledge (schema design, indexing, optimization)
- Experience integrating high-volume third-party APIs
- Experience with worker queues (BullMQ, Redis, or similar)
- Proven experience handling 10,000+ API requests per day
- Experience building scalable data ingestion or scraping systems
- Strong understanding of concurrency, rate limiting, and retry logic
- Experience deploying backend systems independently
⸻
Mandatory Prior Experience Requirement
Applicants must have previously built at least one of the following:
- SERP-based data collection engine
- Large-scale scraping or crawling pipeline
- Marketing lead automation system
- Distributed worker-based ingestion system
- High-volume API automation backend
You must be able to clearly explain:
- How you handled rate limits
- How you prevented duplicates
- How you structured your database
- How you scaled worker processes
- How you handled production failures
Applications without demonstrated relevant experience will not be considered.
⸻
Preferred (Nice-to-Have)
- Experience with SerpApi or DataForSEO
- Experience with marketing automation systems
- Experience with dialer integrations (VICIdial)
- Docker and basic DevOps knowledge
- Experience handling US-based marketing data
⸻
Technical Stack
- Node.js
- PostgreSQL
- Redis + BullMQ
- SERP API integration
- Cron scheduling
- CSV automation
- VPS/cloud deployment
⸻
Project Deliverables
- Fully functional mobile-first SERP data engine
- Reliable deduplication and suppression system
- Automated daily VICIdial export
- Stable worker-based architecture
- Scalable configuration for 20,000–50,000 API requests per day
- Documentation of architecture
⸻
Project Timeline
Estimated Duration: 4–6 Weeks
Start Date: Immediate
⸻
Compensation
Project-Based Contract: PKR 200,000–350,000
(Compensation based on experience and portfolio)
Potential long-term engagement based on performance.
⸻
Application Process
To apply, include:
1. Description of a similar system you built
2. Approximate request volume handled per day
3. Architecture overview (Queue, DB, retry logic)
4. Tech stack used
Shortlisted candidates will be given a brief architecture discussion before final selection.