Company | Location | Salary |
---|---|---|
Binance | Taipei, Taiwan | |
Binance | Taipei, Taiwan | |
Chainalysis | London, United Kingdom | $165k - $250k |
Genies | Remote | $84k - $156k |
Blockchain | Remote | $106k - $107k |
Circle | U.S. - California, United States | $172k - $212k |
Swissquote | Switzerland | $147k - $153k |
Crypto.com | Singapore, Singapore | $129k - $149k |
Binance | Singapore, Singapore | |
Binance | Singapore, Singapore | |
Launchpad Technologies Inc | Latam | $98k - $114k |
GSR Markets | London, United Kingdom | $80k - $85k |
Keyrock | Singapore, Singapore | $72k - $90k |
Bitpanda | Barcelona, Spain | $122k - $150k |
OP Labs | Remote | $200k - $260k |
Senior Data Engineer (Data Warehouse) - Web3
Responsibilities
- Architect and implement a flexible, scalable data warehouse aligned with company specifications and business requirements, accelerating delivery and reducing redundant development.
- Design, develop, test, deploy and monitor data models and ETL jobs; rapidly troubleshoot complex issues and optimize calculation logic and pipeline performance.
- Lead data governance initiatives by building and maintaining metadata management and data quality monitoring systems.
- Foster technical team growth through mentorship, knowledge sharing and continuous improvement of collective skills.
Requirements
- 5+ years of hands-on experience designing and developing data lakes and data warehouse solutions.
- Deep expertise in data warehouse modeling and governance, including dimensional modeling, information factory (data vault) methodologies and “one data” principles.
- Proficiency in at least one of Java, Scala or Python, plus strong Hive & Spark SQL programming skills.
- Practical experience with OLAP engines (e.g., Apache Kylin, Impala, Presto, Druid).
- Proven track record in building high-throughput batch pipelines on Big Data platforms.
- Familiarity with core Big Data technologies (Hadoop, Hive, Spark, Delta Lake, Hudi, Presto, HBase, Kafka, Zookeeper, Airflow, Elasticsearch, Redis).
- AWS Big Data service experience is a plus.
- Strong analytical and system-design capabilities, with a clear understanding of business requirements and ability to abstract and architect solutions.
- Collaborative mindset, skilled at building partnerships across teams and stakeholders.
- Preferred: Experience managing petabyte-scale data in Internet-scale environments and resolving critical production incidents.
- Bilingual English/Mandarin proficiency is required to coordinate with overseas partners and stakeholders.
What does a Java developer in web3 do?
A Java developer in web3 is typically focused on building applications in the Java programming language within the web3 technology stack. Web3 is a collective term for the next generation of decentralized, blockchain-based technologies aimed at creating a more open and secure internet. In this context, a Java developer writes code that interacts with web3 components such as decentralized applications (DApps) and smart contracts to create new tools and services that run on the blockchain. This often involves working with platforms such as Ethereum, a popular blockchain whose smart contracts are typically written in Solidity; Java applications interact with it through client libraries such as web3j, as well as other web3 technologies and frameworks.
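At the protocol level, that interaction is plain JSON-RPC 2.0 over HTTP: the Java program POSTs a small JSON request to an Ethereum node and decodes the 0x-prefixed hex values in the reply. Below is a minimal sketch in stock Java, with no web3 library; the class and method names are illustrative, and in practice a library like web3j wraps these details.

```java
import java.math.BigInteger;

public class Web3Rpc {
    // Build a minimal JSON-RPC 2.0 request body for a parameterless
    // Ethereum node method, e.g. eth_blockNumber.
    static String rpcRequest(String method, int id) {
        return String.format(
            "{\"jsonrpc\":\"2.0\",\"method\":\"%s\",\"params\":[],\"id\":%d}",
            method, id);
    }

    // Ethereum nodes return quantities as 0x-prefixed hex strings.
    static BigInteger parseQuantity(String hex) {
        return new BigInteger(hex.substring(2), 16);
    }

    public static void main(String[] args) {
        // The request body would be POSTed to a node endpoint
        // (e.g. with java.net.http.HttpClient); endpoint omitted here.
        System.out.println(rpcRequest("eth_blockNumber", 1));
        // Decode a sample "result" field from the node's response.
        System.out.println(parseQuantity("0x10d4f")); // 68943
    }
}
```

A library such as web3j performs the same request/response cycle but adds typed wrappers, ABI encoding for smart-contract calls, and transaction signing.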