Part Time
TBA
TBD
Apr 6, 2026
Hey,
Looking for someone sharp who can help extract data from a web platform.
At the moment:
• The platform shows hundreds of results (500+)
• But I can only access or copy around 20–30 records at a time
• There are multiple pages, but data isn’t easily exportable
I need someone who can:
• Extract all available data, not just what’s visible on screen
• Work with login-based platforms (authenticated sessions)
• Handle pagination / lazy loading / dynamic content
• Deliver clean data into Excel or Google Sheets
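To give a sense of the scope, the pagination-plus-export step above can be sketched in a few lines of Python. This is a minimal illustration, not the actual solution: `fetch_page` is a hypothetical callable standing in for whatever the real platform's paginated endpoint or scrape routine turns out to be, and the page size of ~25 matches the 20–30 records visible at a time.

```python
import csv

def fetch_all_records(fetch_page, page_size=25):
    """Walk pages until one comes back short, collecting every record.

    fetch_page(page_number) -> list of dicts; hypothetical callable
    wrapping the platform's real pagination mechanism.
    """
    records = []
    page = 1
    while True:
        batch = fetch_page(page)
        records.extend(batch)
        if len(batch) < page_size:  # short page => no more results
            break
        page += 1
    return records

def write_csv(records, path):
    """Write the aggregated records to a CSV that Excel or Sheets can open."""
    if not records:
        return
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)
```

The loop-until-short-page pattern is what makes the deliverable the *full* dataset rather than the 20–30 records visible on screen.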
________________________________________
What you’ll likely be working with:
• JavaScript-heavy web apps
• Infinite scroll or paginated results
• Browser DevTools / Network tab
• API extraction (if available)
• Or DOM scraping if needed
________________________________________
Ideal experience:
• Web scraping tools (Python, Selenium, Puppeteer, etc.)
• Experience extracting from platforms that don’t allow easy export
• Understanding of XHR / API calls / JSON responses
• Able to work efficiently without breaking the platform
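As a rough example of the XHR/JSON skill above: once a candidate spots the right request in the Network tab, the captured response usually needs flattening before it fits a spreadsheet. The sketch below assumes a payload with a top-level `results` list and one level of nested objects; both the key name and the shapes are assumptions, since the real payload has to be inspected in DevTools first.

```python
import json

def flatten_response(raw_json, list_key="results"):
    """Pull the record list out of a captured XHR response body and
    flatten one level of nesting so each record becomes a flat dict
    ready for a CSV row. `list_key` and the nested shapes are
    assumptions about the payload, not the platform's actual schema.
    """
    payload = json.loads(raw_json)
    rows = []
    for item in payload.get(list_key, []):
        row = {}
        for key, value in item.items():
            if isinstance(value, dict):  # e.g. {"owner": {"name": ...}}
                for sub_key, sub_value in value.items():
                    row[f"{key}_{sub_key}"] = sub_value
            else:
                row[key] = value
        rows.append(row)
    return rows
```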
________________________________________
Deliverables:
• Full dataset (not partial pages)
• Clean, structured format (CSV / Excel)
• Ideally repeatable process if this becomes ongoing
________________________________________
Notes:
• This is not a basic copy-paste job
• Looking for someone who understands how to get around data limitations properly
• If you’ve done similar work before, mention it briefly