Senior Analytics Engineer / Data Engineer for E-commerce BI Project (BigQuery / dbt / Airbyte)


TYPE OF WORK

Part Time

SALARY

$6.00 - 10.00 /hr

HOURS PER WEEK

TBD

DATE UPDATED

Mar 30, 2026

JOB OVERVIEW

Hi there!

I run a growing e-commerce business where around 90% of revenue and day-to-day operations are Amazon-focused. I also sell on eBay and via our own brand website, but Amazon is the core engine.

I am looking for an Analytics Engineer / Data Engineer to take over an existing BI project for my e-commerce business. My current BI freelancer has been absent for an extended period due to illness, so I am now looking for someone who can step in, understand the current setup, and complete the project.

This is not a project starting from zero. A large part of the foundation is already built. This is a hands-on build role for someone who can combine technical execution with practical business thinking. After the initial project is completed, there will be ongoing work, maintenance, and new projects. The intention is for the right person to become an embedded part of my company, not just a temporary freelancer.

CURRENT PROJECT STATUS

Milestone 1 is already approximately 80% complete.

What is already done:
• Amazon APIs are connected through Airbyte
• dbt is set up and running on my local TrueNAS system
• Historical Amazon data has been cleaned and imported into BigQuery
• The warehouse foundation is largely in place

What still needs to be done first:
• Import the remaining Amazon ads data
• Finish an accurate P&L model
• Validate outputs against the current business logic

So I am looking for someone who can take over the project, not someone who only wants to start from zero.

BUDGET AND LONG-TERM OPPORTUNITY

For the initial project, I am offering the same pay structure already agreed for this build (see the per-milestone budgets below). A $500 contingency buffer is available for extra work, if needed and agreed.

After the initial project is finished, there will be ongoing maintenance and new BI projects.

For that longer-term work, the hourly rate will be in the range of $6 to $10 USD/hour depending on experience, reliability, and ownership.

I am not looking for just a temporary freelancer. I want the right person to become an embedded part of my company. As the business grows, the plan is to increase hours, responsibility, ownership, and pay.

CURRENT STACK

• Airbyte for ingestion
• Google BigQuery as warehouse
• dbt for transformations and KPI logic
• Looker Studio or similar for dashboards
• Google Sheets only for manual inputs where needed

PROJECT DESCRIPTION AND MILESTONES

Milestone 1: Foundation + Amazon Replacement

Timeline: ~3–4 weeks
Budget allocation: $400

• Channel-agnostic warehouse foundation
• Amazon fully implemented end-to-end (sales, ads, fees, core profit KPIs)
• dbt models as the single source of truth
• Full replacement of my current Amazon Google Sheets logic
• Side-by-side validation against existing Sheets
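To give a sense of the shape of this foundation: the idea is that every channel maps into one shared orders schema at the dbt mart layer. The sketch below is purely illustrative (the model and column names are my assumptions, not the project's actual naming):

```sql
-- models/marts/fct_orders.sql (illustrative names only)
-- Channel-agnostic orders fact: Amazon is the first source mapped in,
-- and later channels reuse exactly the same columns.
select
    'amazon'                 as channel,       -- constant per source model
    order_id,
    order_date,
    sku,
    quantity,
    item_revenue,
    referral_fee + fba_fee   as channel_fees,  -- Amazon-specific fees collapsed into one field
    currency
from {{ ref('stg_amazon__orders') }}
```

Validation then means comparing aggregates from this model side-by-side with the existing Google Sheets figures before switching over.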

Milestone 2: Multi-Channel Ingestion

Timeline: ~2–3 weeks
Budget allocation: $300

• WooCommerce ingestion (orders, refunds, products)
• eBay ingestion (orders, fees, refunds)
• All channels mapped into the same schema as Amazon
• KPIs reused from Milestone 1 (no duplicate logic, no redesign later)
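Because all channels land in the same schema, combining them should be trivial. A minimal sketch of what I have in mind (model names are hypothetical):

```sql
-- models/marts/fct_orders.sql (hypothetical)
-- Each channel is mapped to the shared schema in its own intermediate
-- model, so the cross-channel fact is a plain UNION ALL, with no
-- per-channel KPI logic duplicated downstream.
select * from {{ ref('int_amazon__orders_mapped') }}
union all
select * from {{ ref('int_woocommerce__orders_mapped') }}
union all
select * from {{ ref('int_ebay__orders_mapped') }}
```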

Milestone 3: Cross-Channel KPIs and Dashboards

Timeline: ~1–2 weeks
Budget allocation: $200

• Cross-channel KPI tables (Amazon vs Woo vs eBay)
• Looker Studio dashboards built on curated dbt models
• No business logic in dashboards

Milestone 4: Product Matching, Currency and Handover

Timeline: ~1 week
Budget allocation: $100

• Product matching across Amazon, WooCommerce, and eBay
• Currency handling implemented in the warehouse
• Lightweight documentation and final walkthrough
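"Currency handling in the warehouse" means conversion happens once in dbt, never in dashboards. A rough sketch, assuming a daily exchange-rate table exists (table and column names are illustrative):

```sql
-- models/marts/fct_orders_usd.sql (illustrative)
-- Join each order to the exchange rate for its currency and date,
-- so every downstream KPI reads a single converted revenue column.
select
    o.*,
    o.item_revenue * fx.rate_to_usd as item_revenue_usd
from {{ ref('fct_orders') }} o
left join {{ ref('stg_exchange_rates') }} fx
    on  o.currency = fx.currency
    and date(o.order_date) = fx.rate_date
```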

The goal is that by the end of Milestone 2, Amazon, WooCommerce, and eBay all feed into the same warehouse model, with KPIs defined once and reusable for future platforms without rework.

WHAT I'M LOOKING FOR
• Strong SQL skills
• Strong data modeling skills
• Experience with BigQuery
• Experience with dbt
• Experience with ELT / ETL pipelines
• Able to take over and improve an existing setup
• Practical and cost-conscious
• Able to explain things clearly
• Reliable and organized

NICE TO HAVE
• E-commerce analytics experience
• Amazon data experience
• Ads data and P&L experience
• Looker Studio experience
• Experience replacing spreadsheet reporting with a proper warehouse setup

NOT A FIT IF
• You are a VA or data-entry profile
• You only build dashboards
• You put business logic inside BI tools or spreadsheets
• You prefer over-engineered or expensive SaaS-heavy setups
• You only want a very short freelance project

CONFIDENTIALITY
You will have access to confidential company data, including sales data, financial data, advertising data, product data, and internal metrics.

All data, models, transformations, dashboards, and project outputs created during this work remain the exclusive property of my company.

TO APPLY, PLEASE ANSWER
1. What experience do you have with BigQuery, dbt, and Airbyte?
2. Have you ever taken over a partially completed BI or data project before?
3. How would you approach reviewing and completing an existing setup like this?
4. What is your expected timeline for finishing the remaining work?
5. What hourly rate would you want for ongoing work after the initial project?

Please keep your reply clear and concise.

Thanks in advance for your application.

Cheers,
Friso
