Company | Location | Salary
---|---|---
Lume Finance | Remote | $84k - $150k
Genies | Remote | $72k - $158k
CleanSpark | Las Vegas, NV, United States | $65k - $70k
Zinnia | India | $98k - $115k
Launchpadtechnologiesinc | LatAm | $86k - $109k
Inmobi | Remote | $91k - $110k
Kraken | United States | $127k - $203k
Gsrmarkets | Remote | $80k - $95k
Copperco | Remote | $90k - $164k
Bitpanda | Vienna, Austria | $81k - $84k
Blockchain | Remote | $84k - $109k
Binance | Taipei, Taiwan |
OpenTag | Sofia, Bulgaria | $84k - $89k
Whatnot | Remote | $84k - $165k
Copperco | Remote | $112k - $156k
Role Overview:
We are looking for a Data Engineer who can efficiently fetch, structure, and manage blockchain data from The Graph and other sources. You will be responsible for setting up ETL pipelines, transforming raw blockchain data into structured formats, and making it accessible for backend APIs and internal analytics. Your work will empower our backend developers to build efficient APIs and enable meaningful data visualizations for users.
Key Responsibilities:
- Fetch and process blockchain data from The Graph and other sources.
- Design and implement ETL (Extract, Transform, Load) pipelines to structure raw data.
- Store and optimize data for efficient querying and API consumption.
- Identify patterns, trends, and insights from blockchain data to enhance analytics.
- Ensure data integrity, consistency, and performance in a scalable architecture.
- Work closely with the backend team to provide well-structured data for APIs.
- Optimize database performance for real-time and historical analytics.
- Implement caching, indexing, and aggregation strategies for large-scale data processing.
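The fetch-transform-load flow described above can be sketched in Python. The GraphQL query, the entity and field names (`swaps`, `amountUSD`), and the sample payload are illustrative assumptions rather than a real subgraph schema, and the HTTP fetch step is replaced by a canned response so the sketch stays self-contained:

```python
import json
import sqlite3

# Hypothetical GraphQL query against a Graph subgraph endpoint;
# entity and field names are illustrative, not a real subgraph schema.
SWAPS_QUERY = """
{
  swaps(first: 100, orderBy: timestamp, orderDirection: desc) {
    id
    timestamp
    amountUSD
  }
}
"""

# Sample payload in the shape The Graph returns ({"data": {...}}),
# standing in for the HTTP response during the Extract step.
RAW_RESPONSE = json.dumps({
    "data": {
        "swaps": [
            {"id": "0xabc-1", "timestamp": "1700000000", "amountUSD": "1523.40"},
            {"id": "0xdef-2", "timestamp": "1700000060", "amountUSD": "87.25"},
        ]
    }
})

def transform(raw: str) -> list:
    """Transform: parse the JSON payload and coerce the string fields
    into typed rows ready for loading."""
    swaps = json.loads(raw)["data"]["swaps"]
    return [(s["id"], int(s["timestamp"]), float(s["amountUSD"])) for s in swaps]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: upsert rows and index the timestamp column so
    time-range queries stay fast as the table grows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS swaps ("
        "id TEXT PRIMARY KEY, ts INTEGER, amount_usd REAL)"
    )
    conn.execute("CREATE INDEX IF NOT EXISTS idx_swaps_ts ON swaps (ts)")
    conn.executemany("INSERT OR REPLACE INTO swaps VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
rows = transform(RAW_RESPONSE)
load(rows, conn)
total = conn.execute("SELECT SUM(amount_usd) FROM swaps").fetchone()[0]
print(f"{len(rows)} swaps loaded, ${total:.2f} total volume")
```

A production pipeline would swap the canned response for a POST of `SWAPS_QUERY` to the subgraph endpoint and paginate with `first`/`skip`, but the transform and load stages keep the same shape.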
Required Skills & Experience:
- Strong experience with The Graph (GraphQL queries).
- Expertise in ETL processes and data pipeline management.
- Experience with data modeling and structuring for analytics & API consumption.
- Proficiency in Python, Node.js, or Rust for data processing.
- Knowledge of database management (SQL, NoSQL, time-series DBs, or data lakes).
- Ability to detect patterns and trends in blockchain transaction data.
- Experience handling large-scale datasets efficiently.
Nice to Have:
- Experience with blockchain indexing beyond The Graph (custom indexers, RPC data extraction).
- Prior experience in financial or DeFi data analytics.
- Familiarity with machine learning for pattern detection in financial transactions.
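As a minimal illustration of the kind of pattern detection meant above, the sketch below flags transactions whose USD amount deviates far from the sample mean. It uses a simple z-score cutoff rather than actual machine learning, and the amounts are made-up sample data:

```python
from statistics import mean, stdev

# Made-up transaction amounts in USD; one obvious anomaly among routine values.
amounts = [120.0, 95.5, 110.2, 98.7, 5000.0, 105.3, 102.8]

# Flag any amount more than two standard deviations from the mean --
# a crude stand-in for model-based anomaly detection.
mu, sigma = mean(amounts), stdev(amounts)
outliers = [a for a in amounts if abs(a - mu) > 2 * sigma]
print(outliers)  # -> [5000.0]
```

A real pipeline would replace the z-score rule with a trained model and score transactions in a streaming or batch job, but the interface (amounts in, flagged anomalies out) is the same.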
What does a data scientist in web3 do?
A data scientist in web3 focuses on data generated by the technologies and applications that make up the web3 ecosystem. This can include data from decentralized applications (DApps), blockchain networks, and other distributed and decentralized systems. In general, a data scientist in web3 is responsible for using data analysis and machine learning techniques to help organizations and individuals understand, interpret, and make decisions based on the data these systems generate. Typical tasks include developing predictive models, conducting research, and creating data visualizations.