| Job Position | Company | Posted | Location | Salary | Tags |
|---|---|---|---|---|---|
| | FalconX | | Remote | $103k - $165k | |
| | OKX | | Singapore, Singapore | $103k - $117k | |
| | OKX | | Singapore, Singapore | $90k - $180k | |
| | OKX | | Singapore, Singapore | $90k - $180k | |
| | Syndica | | Houston, TX, United States | | |
| | Goldman Sachs | | New York, NY, United States | $115k - $180k | |
| | Find Satoshi Lab | | Remote | $36k - $54k | |
| | ChainSafe Systems | | New York, NY, United States | $84k | |
| | Hyperbolic Labs | | Irvine, CA, United States | $81k - $150k | |
| | Coinbase | | Remote | $185k | |
| | Coinbase | | Remote | $211k - $249k | |
| | BitGo | | Palo Alto, CA, United States | $185k - $235k | |
| | BitGo | | Toronto, Canada | $80k - $105k | |
| | Nomos | | Warsaw, Poland | $90k - $148k | |
| | Gauntlet | | Remote | $160k - $180k | |
FalconX
$103k - $165k estimated
This job is closed
What you’ll be working on:
- Provide technical and thought leadership for Data Engineering and Business Intelligence
- Create, implement, and operate the strategy for robust, scalable data pipelines for business intelligence and machine learning
- Develop and maintain the core data framework and key infrastructure
- Design the data warehouse and data models for efficient, cost-effective reporting
- Define and implement data governance processes covering data discovery, lineage, access control, and quality assurance
Skills you'll need:
- Degree in Computer Science or a related field, or equivalent professional experience
- 3+ years of strong experience with data transformation and ETL on large data sets using open technologies such as Spark, SQL, and Python
- 3+ years of complex SQL, with strong knowledge of SQL optimization and an understanding of logical and physical execution plans
- At least 1 year working in an AWS environment; familiarity with related technologies such as MySQL, Redis caching, and messaging tools like Kafka/SQS
- Experience with advanced data lake and data warehouse concepts and data modeling (relational, dimensional, internet-scale logs)
- Knowledge of Python, Spark (batch/streaming), Spark SQL, and PySpark
- Proficiency in at least one object-oriented programming language: Python, Java, or C++
- Effective craftsmanship in building, testing, and optimizing ETL/feature/metric pipelines
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation
- A data-oriented mindset
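The skills above center on ETL and metric pipelines with quality gates. As a rough illustration of that kind of work, here is a toy extract-transform-load pass in plain Python (no Spark dependency); the record fields (`symbol`, `price_usd`, `qty`) are hypothetical and not taken from the posting:

```python
from collections import defaultdict

# Extract: raw string records as they might arrive from an upstream source.
raw_trades = [
    {"symbol": "BTC", "price_usd": "64000.5", "qty": "0.25"},
    {"symbol": "ETH", "price_usd": "3100.0",  "qty": "2.0"},
    {"symbol": "BTC", "price_usd": "63950.0", "qty": "0.10"},
    {"symbol": "ETH", "price_usd": "bad",     "qty": "1.0"},  # malformed row
]

def transform(records):
    """Cast types and drop rows that fail validation (a simple quality gate)."""
    clean = []
    for r in records:
        try:
            clean.append({
                "symbol": r["symbol"],
                "price_usd": float(r["price_usd"]),
                "qty": float(r["qty"]),
            })
        except ValueError:
            continue  # skip rows with unparseable numeric fields
    return clean

def load(records):
    """Aggregate notional volume per symbol (a minimal metric pipeline)."""
    totals = defaultdict(float)
    for r in records:
        totals[r["symbol"]] += r["price_usd"] * r["qty"]
    return dict(totals)

volumes = load(transform(raw_trades))
```

In a production setting the same extract/transform/load shape would typically run as Spark (batch or streaming) jobs over a data lake, with the validation step feeding a data-quality report rather than silently dropping rows.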