
Full Stack Engineer

Experience Level: Mid-level. Experience Length: 4 years.

Job description & requirements


About Waz

Waz is building the verification infrastructure for physical commerce in emerging markets. Every transaction that flows through our platform creates a verified data point that has never existed before in this market — what was ordered, what arrived, what it weighed, what was paid. We are building the intelligence layer that turns those verified transactions into pricing benchmarks, vendor performance scores, and eventually the credit infrastructure that unlocks financing for vendors across the continent. This role is at the core of that intelligence layer.


The Role

We are looking for a Full-Time Full Stack Engineer who thinks in data systems. You will build the pricing intelligence infrastructure that makes Waz genuinely defensible — the data ingestion pipelines, the SKU normalization layer, the benchmark calculation engine, the vendor scoring infrastructure, and the algorithms that turn verified transaction data into actionable intelligence. You will also contribute to the web platform where your data surfaces to users. This is not a pure data science role: you will write production code that ships to real customers. What makes you exceptional, though, is how you think about data quality, data architecture, and what the data actually means.


What You Will Do

  • Build and maintain data scraping pipelines against Nigerian food market price sources
  • Build resilient scrapers that detect front-end changes and alert rather than silently fail
  • Design and implement the SKU normalization layer — using an LLM to map inconsistent item names from external sources and customer POs to canonical SKUs in our internal database
  • Build the confidence scoring system that rates the quality and consistency of price data across sources
  • Design and implement the unit conversion rules layer — handling yield ratios, packaging conversions, and live-to-dressed protein factors specific to Nigerian market procurement
  • Build the benchmark calculation engine — computing interquartile ranges, rolling averages, and reference prices from verified transaction data and external market data
  • Build the vendor scoring infrastructure — defining and computing vendor performance metrics from verified GRN data including fulfillment rate, price consistency, delivery accuracy, and anomaly flags
  • Design and implement algorithms for anomaly detection — flagging vendor pricing patterns, user-vendor exclusivity patterns, and suspicious procurement behavior
  • Build the Pricing Intelligence module on the web dashboard — search interface, SKU results view, range visualization, confidence display
  • Contribute to backend API development using our existing stack — Hasura, AWS Lambda, Node.js
  • Work with the data engineer on pricing intelligence to ensure the pipeline feeds cleanly into the product layer
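One of the bullets above asks for scrapers that detect front-end changes and alert rather than silently fail. A hedged sketch of that pattern, using only the standard library — the `class="price"` selector, parser, and alert hook are invented for illustration and are not part of Waz's actual stack:

```python
from html.parser import HTMLParser

class PriceTableParser(HTMLParser):
    """Collects text from elements carrying class="price" (an assumed selector)."""

    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self._in_price = True

    def handle_data(self, data):
        if self._in_price and data.strip():
            self.prices.append(data.strip())
            self._in_price = False

class ScraperDriftError(RuntimeError):
    """Raised when the page no longer matches the expected structure."""

def extract_prices(html, alert=print):
    parser = PriceTableParser()
    parser.feed(html)
    if not parser.prices:
        # Zero matches almost certainly means the front end changed, not that
        # the market is empty — alert loudly instead of returning [] and letting
        # downstream jobs treat the silence as "no data today".
        alert("price selector returned 0 nodes - possible front-end change")
        raise ScraperDriftError("selector drift detected")
    return parser.prices
```

The key design choice is that zero matches raises, so a silent site redesign can never masquerade as an empty result.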


What We Are Looking For

  • Minimum four years of engineering experience with at least two years specifically working with data pipelines, scraping infrastructure, or data product development
  • Strong Python for data engineering work — scraping, pipeline orchestration, data transformation
  • Strong JavaScript or TypeScript for web product work
  • Experience building and maintaining web scrapers at production scale — you have dealt with anti-scraping measures, DOM changes, rate limiting, and proxy management
  • Familiarity with common anti-scraping infrastructure — Cloudflare, Akamai, or equivalent — and practical experience working around it without violating terms of service
  • Experience with XPath and CSS selectors for DOM parsing and scraper construction
  • Familiarity with proxy management services and rotating IP infrastructure for scraper reliability
  • Experience with AWS CloudWatch or equivalent for pipeline monitoring, alerting, and logging — you instrument your pipelines so failures are visible immediately
  • Experience with orchestration tools such as Airflow, Prefect, or equivalent for managing pipeline scheduling and dependencies
  • Experience working with LLM APIs — you have built production systems that use OpenAI, Anthropic, or equivalent models for classification or normalization tasks
  • SQL proficiency at an advanced level — you design schemas with data integrity in mind and your queries are optimised
  • Experience building scoring or ranking algorithms from real transaction data — vendor scoring, credit scoring, reliability metrics, or equivalent
  • Understanding of statistical concepts relevant to benchmark construction — percentiles, interquartile ranges, confidence intervals, outlier detection
  • Comfort working across the stack when needed — you are not blocked by frontend work and you do not hand off problems at the API boundary
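To make the statistics bullet above concrete — percentiles, interquartile ranges, outlier detection — here is a minimal sketch of a benchmark calculation over verified unit prices. The function name, input shape, minimum-sample guard, and the 1.5×IQR Tukey fence are illustrative assumptions, not Waz's implementation:

```python
from statistics import quantiles

def price_benchmark(prices):
    """Compute reference-price statistics from a list of verified unit prices.

    Illustrative only: the output fields and thresholds are assumptions.
    """
    if len(prices) < 4:
        return None  # too few observations for a meaningful IQR
    q1, median, q3 = quantiles(sorted(prices), n=4)
    iqr = q3 - q1
    return {
        "median": median,
        "iqr_low": q1,
        "iqr_high": q3,
        # observations outside the Tukey fences are candidate outliers
        "outliers": [p for p in prices
                     if p < q1 - 1.5 * iqr or p > q3 + 1.5 * iqr],
    }
```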


What Makes You Exceptional for This Role

  • You have built a data product that non-technical users trusted enough to make financial decisions from — and you understood the weight of that responsibility
  • You have experience with Nigerian or African market data specifically — you understand the messiness of informal market price data, inconsistent units, and multilingual item naming
  • You have designed a vendor or supplier scoring system from transaction data and you can speak to the specific choices you made in defining the metrics
  • You have caught a data quality problem that would have produced misleading output to users, and you fixed it at the architecture level, not with a patch
  • You think about what the data means before you think about how to compute it
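For a sense of what "the specific choices you made in defining the metrics" means in the vendor-scoring bullet above, a deliberately tiny sketch; the GRN field names, the 2% price-match tolerance, and the 60/40 weighting are all invented for illustration:

```python
def vendor_score(grns, weights=(0.6, 0.4)):
    """Toy vendor score on a 0-100 scale from GRN records.

    Each record is assumed to look like:
    {"ordered_qty": ..., "received_qty": ..., "po_price": ..., "invoice_price": ...}
    """
    w_fulfil, w_price = weights
    # fulfillment rate: mean received/ordered, capped so over-delivery cannot inflate it
    fulfilment = sum(min(g["received_qty"] / g["ordered_qty"], 1.0)
                     for g in grns) / len(grns)
    # price consistency: share of lines where the invoice matched the PO within 2%
    consistent = sum(abs(g["invoice_price"] - g["po_price"]) / g["po_price"] <= 0.02
                     for g in grns) / len(grns)
    return round(100 * (w_fulfil * fulfilment + w_price * consistent), 1)
```

Every number here is a design decision a candidate should be able to defend: capping fulfillment at 1.0, choosing a tolerance rather than exact price matching, and how the two components are weighted.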


What We Offer

  • Ownership of the intelligence and data layer
  • Direct collaboration with the founding team on product and data strategy
  • Remote
  • Salary ranges from ₦2,000,000 to ₦3,500,000, depending on experience



Job applications are closed.