Scala Jobs in Web3
508 jobs found
Company | Location | Salary
---|---|---
Foundry | Bengaluru, India | $120k - $144k
Gemini | Remote | $120k - $150k
Gemini | Remote | $172k - $215k
Ledger | London, United Kingdom | $87k - $150k
Circle - Referrals | Remote | $147k - $195k
Circle - Referrals | Remote | $120k - $162k
Ledger | London, United Kingdom | $87k - $100k
Kraken | United States | $76k - $90k
Ripple | Lausanne, Switzerland | $18k - $81k
Coins.ph | Shanghai, China | $103k - $117k
Circle | Boston, MA, United States | $120k - $162k
Figure Markets | San Francisco, CA, United States | $102k - $128k
Copper.co | Remote | $72k - $87k
SADA India | Thiruvananthapuram, India | $103k - $117k
Copper.co | Remote | $36k - $117k
This position is based at Foundry in India.
DESCRIPTION: The Data Engineer role is an exciting opportunity for a motivated and passionate individual to join the team at Foundry, a subsidiary of Digital Currency Group, the blockchain industry's most prolific and active investor. Reporting to the Engineering Manager, the Data Engineer will be a key technical leader responsible for designing, implementing, and optimizing our data infrastructure and pipelines.
This is a rare invitation to join a small, highly professional entrepreneurial group, with the backing of the most established player in the fast-growing crypto space.
PRIMARY RESPONSIBILITIES: The role focuses on maintaining the data infrastructure and pipelines. You will work closely with our data analysts to understand data requirements and ensure the availability, quality, and reliability of our data assets.
WHAT YOU WILL DO:
- Architect and lead the development of robust and scalable data collection infrastructure, covering various protocols (a minimal collector sketch follows this list).
- Oversee the collection, research, and analysis of operational and blockchain data.
- Drive initiatives to enhance data system performance and reliability.
- Innovate and optimize internal data practices, addressing gaps and inefficiencies.
- Ensure impeccable data accuracy and completeness during transfer and loading into enterprise repositories.
- Spearhead the establishment and maintenance of security and integrity controls.
- Formulate and monitor data warehouse management policies, procedures, and standards.
- Lead cross-functional project teams in analyzing business requirements, conducting complex data analysis, and supporting user acceptance testing for diverse data needs.
- Expertly organize and interpret data trends, uncovering patterns and insights.
- Collaborate closely with crypto product managers, business analysts, and engineers to fulfill data requirements.
- Develop advanced scripts to automate complex accounting and data processes.
- Lead scripting and scraping projects for data modeling preparation.
- Optimize and refine data architecture for seamless retrieval.
- Enhance data resiliency, redundancy, and recovery mechanisms.
- Develop cutting-edge tools for identifying and documenting public data sources.
- Analyze and interpret operational data from various infrastructure systems, collaborating closely with the Infrastructure Team.
- Spearhead the design and management of comprehensive data lakes and data marts for infrastructure data.
- Develop and implement advanced data solutions while driving infrastructure information gathering initiatives.
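For a flavor of the data-collection work described above, here is a minimal sketch of a block collector, assuming a standard Ethereum-style JSON-RPC endpoint. The endpoint URL, output path, and function names are placeholders for illustration, not Foundry's actual stack.

```python
"""Minimal sketch of a block collector, assuming a standard Ethereum-style
JSON-RPC endpoint. The endpoint URL and output path are placeholders."""
import json

import requests

RPC_URL = "https://rpc.example.com"  # placeholder endpoint, not a real node
OUT_PATH = "blocks.jsonl"            # placeholder local staging file


def rpc_call(method: str, params: list):
    """Issue one JSON-RPC 2.0 request and return its result field."""
    resp = requests.post(
        RPC_URL,
        json={"jsonrpc": "2.0", "id": 1, "method": method, "params": params},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["result"]


def collect_latest_blocks(n: int = 5) -> None:
    """Fetch the n most recent blocks and append them, one JSON doc per line."""
    latest = int(rpc_call("eth_blockNumber", []), 16)
    with open(OUT_PATH, "a", encoding="utf-8") as out:
        for number in range(latest - n + 1, latest + 1):
            # False = block headers only, without full transaction objects
            block = rpc_call("eth_getBlockByNumber", [hex(number), False])
            out.write(json.dumps(block) + "\n")


if __name__ == "__main__":
    collect_latest_blocks()
```

In production, a collector like this would write to a durable staging area (for example, S3 or a Snowflake stage) rather than a local file, and would checkpoint the last block processed so reruns are idempotent.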
Minimum Qualifications; Knowledge, Skills and Abilities:
- Bachelor's degree in computer science, data engineering, information technology, or a related field.
- 5+ years of hands-on experience in data engineering, with a track record of delivering complex data solutions, or equivalent experience.
- Expertise in programming languages such as Python, Java, or Scala. Python preferred.
- Deep knowledge of database systems (SQL, NoSQL) and advanced data modeling concepts.
- Proficiency in big data technologies (e.g., Hadoop, Spark, Kafka) and distributed computing.
- Strong experience with cloud platforms (AWS preferred) and containerization technologies (Docker preferred).
- Strong experience working with Snowflake data warehouse.
- Strong experience with data transformation tools like dbt (data build tool).
- Strong proficiency in job scheduling and orchestration tools such as Prefect, Dagster, or Airflow (see the orchestration sketch after this list).
- Familiarity with data integration tools like Fivetran, Stitch, Hevo, Airbyte.
- Experience with Terraform for managing data infrastructure.
- Excellent problem-solving skills, attention to detail, and ability to lead technical projects.
- Effective communication and collaboration skills, with the ability to work with cross-functional teams.
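The orchestration and dbt experience called out above typically come together in a pattern like the following: a scheduled DAG that lands raw data, then rebuilds warehouse models over it. This is a minimal sketch assuming Airflow 2.4+; the DAG id, script path, and dbt project directory are hypothetical.

```python
"""Minimal sketch of an extract-then-transform pipeline, assuming Airflow 2.4+.
The DAG id, script path, and dbt project directory are hypothetical."""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="blockchain_data_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    # Step 1: land raw chain data (e.g., a collector like the one sketched above).
    extract = BashOperator(
        task_id="extract_blocks",
        bash_command="python /opt/pipelines/collect_blocks.py",  # placeholder path
    )
    # Step 2: rebuild warehouse models from the newly staged data with dbt.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",  # placeholder project
    )
    extract >> transform  # dbt runs only after extraction succeeds
```

Prefect or Dagster would express the same dependency with flows and tasks or with assets, respectively; the choice mostly affects how scheduling, retries, and lineage are modeled.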
Preferred Qualifications; Knowledge, Skills and Abilities:
- Certifications such as AWS Certified Data Analytics, Google Cloud Professional Data Engineer, or a comparable Microsoft data engineering certification.
- Experience with advanced data warehousing solutions like Amazon Redshift, Google BigQuery, or Snowflake, as well as expertise in data warehousing best practices.
- Familiarity with machine learning and artificial intelligence concepts and tools.
- Expertise in data governance frameworks and data compliance regulations, such as GDPR or HIPAA.
- Proficiency in DevOps practices, continuous integration, and continuous deployment (CI/CD) pipelines.
- Prior experience working in a startup environment, with an understanding of the fast-paced and dynamic nature of startups.
- Experience working with global teams.