
Who We Are

Stablecoins are beginning to reshape the global FX market, where more than $10 trillion trades every day. Hibachi is building the exchange designed for that shift.

We are building a modern central limit order book for global currencies with transparent prices, direct access to liquidity, and infrastructure designed for continuous global markets. Our goal is to open FX trading beyond the traditional interbank system and create a venue where global money can move freely.

We are a small team of engineers and traders who have built market infrastructure at Tower Research, Citadel, Coinbase, and Bloomberg. We care deeply about performance, correctness, and building systems that operate at global scale.

Hibachi is backed by Dragonfly Capital, Electric Capital, Coinbase Ventures, and Circle Ventures.




About The Technology

Hibachi runs a high-performance, off-chain central limit order book built for fast, private trading and deep liquidity. Zero-knowledge proofs allow anyone to verify the exchange’s solvency on-chain without revealing user positions. The result is transparent infrastructure built for global markets.
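For context, a central limit order book matches incoming orders against resting orders by price, then by time priority. The toy sketch below illustrates that matching rule only; it is not Hibachi's engine, and every name in it is invented for illustration:

```python
import heapq
from collections import namedtuple

# Illustrative price-time priority order book. Bids are a max-heap
# (negated price); asks are a min-heap. Ties break on arrival sequence.
Order = namedtuple("Order", "order_id side price qty")

class OrderBook:
    def __init__(self):
        self._seq = 0
        self.bids = []  # entries: (-price, seq, Order)
        self.asks = []  # entries: (price, seq, Order)

    def submit(self, order):
        """Match against resting liquidity; rest any unfilled remainder."""
        fills = []
        book = self.asks if order.side == "buy" else self.bids
        qty = order.qty
        while qty > 0 and book:
            key, seq, resting = book[0]
            crosses = (order.price >= resting.price) if order.side == "buy" \
                      else (order.price <= resting.price)
            if not crosses:
                break
            take = min(qty, resting.qty)
            fills.append((resting.order_id, resting.price, take))
            qty -= take
            if take == resting.qty:
                heapq.heappop(book)
            else:
                # Same sort key, so the heap invariant is preserved.
                book[0] = (key, seq, resting._replace(qty=resting.qty - take))
        if qty > 0:
            self._seq += 1
            if order.side == "buy":
                heapq.heappush(self.bids, (-order.price, self._seq, order._replace(qty=qty)))
            else:
                heapq.heappush(self.asks, (order.price, self._seq, order._replace(qty=qty)))
        return fills
```

A buy at 101 against asks resting at 100 and 101 fills the cheaper ask first, then the same-priced one in arrival order.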



The Role

We are seeking a Data Engineer with broad expertise in data modeling, advanced SQL, ETL/ELT development, and CDC (Change Data Capture). You will design and maintain end-to-end data solutions, covering batch and streaming ingestion, data warehousing with Apache Iceberg, and CDC with AWS DMS (or similar tools). This role requires strong communication skills to ensure data initiatives align with and drive business objectives.




You’ll Be Responsible for:

  • Data Pipeline Development: Architect, build, and maintain batch and streaming data pipelines using PySpark, AWS Glue, and Airflow. Implement Change Data Capture (CDC) with AWS DMS (or comparable tools) to capture incremental updates from source systems.
  • Data Modeling & Architecture: Design modular, reusable, and scalable data models adhering to best practices. Work with an Iceberg-backed data warehouse solution for performant storage, queries, and transformations. Ensure consistent data definitions and governance using tools like the AWS Glue Data Catalog.
  • ETL/ELT: Manage ETL/ELT pipelines, ensuring efficient data ingestion, cleansing, and aggregation. Monitor and debug performance bottlenecks, applying tuning techniques where necessary.
  • Data Visualization & Analytics: Develop QuickSight dashboards (or similar BI tools) to surface actionable insights for stakeholders.
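The CDC work above centers on one core operation: folding a stream of insert/update/delete change events into the current state of a table. A minimal in-memory sketch of that fold, loosely in the spirit of the `I`/`U`/`D` operation codes DMS-style tools emit (the event field names here are invented):

```python
# Apply CDC change events to a snapshot keyed by primary key.
# Event shape is hypothetical: {"op": "I"|"U"|"D", "key": ..., "row": {...}}.
def apply_cdc(snapshot, events):
    """Fold insert/update/delete events into a copy of the snapshot."""
    table = dict(snapshot)
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("I", "U"):
            table[key] = event["row"]   # upsert: latest row image wins
        elif op == "D":
            table.pop(key, None)        # delete is idempotent
        else:
            raise ValueError(f"unknown op {op!r}")
    return table
```

In a real pipeline the same merge runs at scale, e.g. as a `MERGE INTO` against an Iceberg table, but the upsert/delete semantics are the same.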


You’ll Need to Have:

  • Bachelor’s or Master’s Degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • 2+ years of hands-on experience with PySpark for batch and streaming pipelines. Familiarity with streaming ecosystems (Kafka, Kinesis, Spark Structured Streaming).
  • Strong proficiency in AWS Glue, Apache Airflow, and Apache Iceberg.
  • Experience with AWS DMS or other CDC tools to manage real-time or near-real-time data ingestion.
  • Advanced SQL knowledge, including performance tuning and complex transformations.
  • Proven background in data modeling and data architecture best practices (data warehouse/data lake).
  • Experience with BI platforms (QuickSight, Tableau, Power BI, etc.) for dashboard development.
  • Understanding of testing frameworks (e.g., Pytest) for data pipelines, unit testing, and QA processes.
  • Excellent communication skills, with an ability to bridge technical and business requirements.
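As one concrete instance of the advanced SQL called for above, CDC pipelines routinely deduplicate a change log down to the latest row per key with a window function. A self-contained sketch using Python's built-in sqlite3 (table and column names are invented; the same pattern works in Spark SQL or Athena):

```python
import sqlite3

# Keep only the most recent change per primary key using ROW_NUMBER().
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE changes (id INTEGER, val TEXT, ts INTEGER);
    INSERT INTO changes VALUES
        (1, 'a', 10), (1, 'b', 20),
        (2, 'x', 15);
""")
rows = conn.execute("""
    SELECT id, val FROM (
        SELECT id, val,
               ROW_NUMBER() OVER (PARTITION BY id ORDER BY ts DESC) AS rn
        FROM changes
    ) WHERE rn = 1
    ORDER BY id;
""").fetchall()
# rows → [(1, 'b'), (2, 'x')]
```

Partitioning by the key and ordering by the change timestamp descending makes `rn = 1` select exactly the latest image of each row.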


We’d Love to See:

  • Background in trading, HFT, or capital markets infrastructure