| Company | Location | Salary |
|---|---|---|
| Ethereum Foundation | Remote | |
| Blockaid | Tel Aviv, Israel | $88k - $100k |
| Stellar | Remote | $160k - $205k |
| Okx | Remote | $240k - $360k |
| Mysten Labs | United States | $150k - $225k |
| MoonPay | Krakow, Poland | $122k - $123k |
| Bitso | Latin America | $98k - $103k |
| Bitso | Latin America | $86k - $102k |
| Zinnia | India | $98k - $115k |
| Mysten Labs | United States | $160k - $240k |
| Binance | Asia | |
| MoonPay | Lisbon, Portugal | $84k - $109k |
| Lume Finance | Remote | $84k - $150k |
| Genies | Remote | $72k - $158k |
| CleanSpark | Las Vegas, NV, United States | $65k - $70k |
Data Engineer, AI and Automation
What You’ll Be Doing
- Design and architect a data store and indexing infrastructure
- Build and maintain the context-engineering API that powers workflow agents
- Build and maintain data pipelines to track critical Ethereum ecosystem metrics
- Work on smaller automation tasks as well as larger-scale projects
- Prototype tools and dashboards to make ecosystem data more accessible internally
- Help define standards and best practices for measuring ecosystem progress
- Where possible, contribute back to open-source data initiatives across the Ethereum ecosystem
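The data-store and pipeline responsibilities above can be sketched with a minimal batch-style metrics store. This is an illustrative sketch only: the table layout, the `active_validators` metric name, and the sample values are assumptions for the example, not the Foundation's actual schema.

```python
import sqlite3

def init_store(conn):
    # One narrow table: (metric, day, value) -- a minimal metrics store
    conn.execute(
        "CREATE TABLE IF NOT EXISTS metrics ("
        "  metric TEXT NOT NULL,"
        "  day    TEXT NOT NULL,"
        "  value  REAL NOT NULL,"
        "  PRIMARY KEY (metric, day)"
        ")"
    )

def record(conn, metric, day, value):
    # Idempotent upsert, so re-running a batch job is safe
    conn.execute(
        "INSERT INTO metrics (metric, day, value) VALUES (?, ?, ?) "
        "ON CONFLICT(metric, day) DO UPDATE SET value = excluded.value",
        (metric, day, value),
    )

def latest(conn, metric):
    # Most recent (day, value) pair for a metric, or None if unseen
    return conn.execute(
        "SELECT day, value FROM metrics WHERE metric = ? "
        "ORDER BY day DESC LIMIT 1",
        (metric,),
    ).fetchone()

conn = sqlite3.connect(":memory:")
init_store(conn)
record(conn, "active_validators", "2024-05-01", 1_000_000)  # hypothetical values
record(conn, "active_validators", "2024-05-02", 1_000_500)
print(latest(conn, "active_validators"))
```

The idempotent upsert is the key design choice here: a daily batch job can be re-run after a failure without duplicating rows.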
What We Look For
- Experience designing and maintaining data pipelines (batch or streaming)
- Experience applying GenAI/Agentic solutions in a data engineering context
- Familiarity with on-chain data and smart contract logic (Solidity / Rust)
- Experience with orchestration tools like Airflow or Dagster
- Comfort working with external APIs, datasets, and custom scrapers/indexers
- Ability to move from ambiguous ideas to structured metrics and reliable systems
- Familiarity with building data pipelines on AWS (e.g. S3) and other cloud platforms and services
- Familiarity with platforms like Dune, The Graph, Token Terminal
- Comfort with dashboarding and visualization, both via external platforms (e.g. Dune) and self-hosted tools (e.g. Plotly)
- Deep interest in Ethereum and public infrastructure
- Interest in collaborating on open-source software and supporting public data infrastructure
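As one concrete flavor of the on-chain-data work listed above, here is a sketch that decodes a raw ERC-20 `Transfer` log with plain string slicing. Only the `Transfer` topic hash is a real constant; the sample addresses and amount are made up for illustration.

```python
# keccak256("Transfer(address,address,uint256)") -- the standard ERC-20 event topic
TRANSFER_TOPIC = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"

def decode_transfer(log):
    topics, data = log["topics"], log["data"]
    if topics[0].lower() != TRANSFER_TOPIC:
        raise ValueError("not a Transfer event")
    # Indexed address params are left-padded to 32 bytes; keep the low 20 bytes
    sender = "0x" + topics[1][-40:]
    recipient = "0x" + topics[2][-40:]
    value = int(data, 16)  # uint256 amount, hex-encoded in the data field
    return {"from": sender, "to": recipient, "value": value}

# Hypothetical sample log, built to match the ABI encoding layout
FROM_ADDR = "ab" * 20  # made-up 20-byte address
TO_ADDR = "cd" * 20    # made-up 20-byte address
sample = {
    "topics": [
        TRANSFER_TOPIC,
        "0x" + FROM_ADDR.rjust(64, "0"),
        "0x" + TO_ADDR.rjust(64, "0"),
    ],
    "data": "0x" + hex(10**18)[2:].rjust(64, "0"),  # 1 token at 18 decimals
}
print(decode_transfer(sample))
```

In practice a platform like Dune or The Graph does this decoding for you; the point of the sketch is that raw logs are just fixed-width hex fields, which matters when you build custom indexers.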
What does a data scientist in web3 do?
A data scientist in web3 focuses on data from the web-based technologies and applications that make up the broader web3 ecosystem.
This can include data from decentralized applications (dApps), blockchain networks, and other distributed and decentralized systems.
In general, a data scientist in web3 uses data analysis and machine learning techniques to help organizations and individuals understand, interpret, and make decisions based on the data these systems generate.
Some specific tasks that a data scientist in web3 might be involved in include developing predictive models, conducting research, and creating data visualizations.