| Company | Location | Salary |
|---|---|---|
| Shakepay | Montreal, Canada | $91k - $156k |
| Blockdaemon | San Francisco, CA, United States | $30k - $90k |
| MetaMask | Australia | $45k - $75k |
| NaturalMotion / Zynga | Remote | $105k - $111k |
| Provably Fair | Romania | $32k - $64k |
| Binance | Singapore, Singapore | |
| BitSight | Boston, MA, United States | $30k - $60k |
| OKX | Singapore, Singapore | $32k - $64k |
| Zerion | Remote | $105k - $111k |
| Cryptio | Paris, France | $63k - $90k |
| The Block | Remote | $91k - $156k |
Data Engineer (Remote - Canada)
You will:
- Build ELT/ETL data pipelines responsible for data ingestion and integration from various sources (batch and streaming) into Shakepay’s internal data platform, and work closely with our Core & Internal Systems engineering teams to incorporate changes from those sources into our downstream processes
- Design, deploy, and maintain the architecture that supports those pipelines: task orchestration, CI/CD pipelines for data processing, data cataloging & documentation, and data monitoring & observability
- Work closely with data analysts & scientists to understand their needs and support their projects
- Help administer the data platform: user administration, data access policies, and performance/cost optimization
- Translate business requirements into architectural designs with estimates
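The batch side of the pipeline work described above can be sketched as a toy ELT flow. This is a hedged illustration only: the in-memory "warehouse", table names, record fields, and transform logic are all invented for the example, not Shakepay's actual stack (which, per the posting, centers on Snowflake and AWS).

```python
from datetime import datetime, timezone

# Toy "warehouse": table name -> list of rows. A real pipeline would load
# into a cloud warehouse such as Snowflake instead of a dict.
warehouse: dict[str, list[dict]] = {}

def extract(source_records: list[dict]) -> list[dict]:
    """Pull raw records from an upstream source (batch ingestion)."""
    return list(source_records)

def load(table: str, rows: list[dict]) -> None:
    """Land raw rows in the warehouse before transforming (the E and L of ELT)."""
    warehouse.setdefault(table, []).extend(rows)

def transform(source: str, target: str) -> None:
    """Derive a cleaned downstream table from the raw one (the T of ELT)."""
    cleaned = [
        {**row, "loaded_at": datetime.now(timezone.utc).isoformat()}
        for row in warehouse.get(source, [])
        if row.get("amount_cad") is not None  # drop incomplete records
    ]
    warehouse[target] = cleaned

# Illustrative upstream events; the record with a missing amount is filtered out.
events = [
    {"id": 1, "amount_cad": 10.45},
    {"id": 2, "amount_cad": None},
    {"id": 3, "amount_cad": 3.20},
]
load("raw_events", extract(events))
transform("raw_events", "clean_events")
print([r["id"] for r in warehouse["clean_events"]])  # -> [1, 3]
```

The extract/load/transform split mirrors the "load raw first, transform downstream" shape that warehouse-centric (ELT) pipelines and tools like dbt encourage.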
Must have:
- Strong written and verbal communication skills and comfort operating in a remote work context
- 4+ years of experience writing Python code in a data context
- 4+ years of experience working with cloud data warehouses (we use Snowflake)
- 3+ years of cloud architecture experience in a data context (our main cloud provider is AWS)
- 3+ years of experience with task orchestration systems in a data context (e.g., Apache Airflow, Prefect, Dagster)
- Advanced SQL skills
- Experience with Docker
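The core idea behind the orchestration tools named above (Airflow, Prefect, Dagster) is running tasks in dependency order. As a standard-library sketch of just that idea (the task names and DAG are invented for illustration; this uses none of those tools' actual APIs):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# A tiny task DAG: each key runs only after the tasks it depends on finish,
# which is the scheduling guarantee an orchestrator provides.
dag = {
    "extract": set(),
    "load": {"extract"},
    "transform": {"load"},
    "quality_check": {"transform"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # -> ['extract', 'load', 'transform', 'quality_check']
```

Real orchestrators layer scheduling, retries, backfills, and observability on top of this same dependency-ordering core.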
Nice to have:
- Familiarity with the principles of the modern data stack and analytics engineering frameworks like dbt
- Experience with Infrastructure-as-code frameworks like Terraform
- Some Kubernetes experience
- Working experience in crypto or fintech
What you get:
- Potentially life-changing stock options. We believe everyone at Shakepay should have the financial upside for building a generational company
- Remote-friendly work environment: work from anywhere in Canada. If you're in Montreal, you can work from the office
- Generous vacation time: we think time off is essential, and highly encourage it
- Personal development: we're here to help you define and hit your personal career goals so that you can get where you want to be
- Continued learning: every Shaker gets a yearly budget to spend on learning
- Employer-covered group insurance: health, dental, paramedical, disability and travel coverage to ensure you're at your best
- Get paid in Bitcoin: choose to take a percentage of your salary in the hardest, soundest money the world has ever known
- A collaborative and friendly team: we succeed together and we have fun doing it
- MacBook: company-issued laptop to make sure you're doing your best work
- Equipment stipend: every Shaker receives a stipend to use toward setting up their home office
What does a data scientist in web3 do?
A data scientist in web3 focuses on data from the web-based technologies and applications that make up the broader web3 ecosystem.
This can include data from decentralized applications (DApps), blockchain networks, and other distributed and decentralized systems.
In general, a data scientist in web3 uses data analysis and machine learning techniques to help organizations and individuals understand, interpret, and make decisions based on the data these systems generate.
Some specific tasks that a data scientist in web3 might be involved in include developing predictive models, conducting research, and creating data visualizations.
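A small taste of that work: descriptive aggregation over on-chain records often precedes any modeling. The sketch below uses entirely hypothetical transfer records (the field names and addresses are illustrative, not from any real blockchain API):

```python
from statistics import mean

# Hypothetical on-chain transfer records; fields are invented for illustration.
transfers = [
    {"block": 100, "sender": "0xabc", "value_eth": 1.5},
    {"block": 100, "sender": "0xdef", "value_eth": 0.2},
    {"block": 101, "sender": "0xabc", "value_eth": 3.0},
]

# Per-address totals: the kind of simple aggregate a web3 data scientist
# might compute before building predictive models or visualizations.
totals: dict[str, float] = {}
for t in transfers:
    totals[t["sender"]] = totals.get(t["sender"], 0.0) + t["value_eth"]

print(totals)  # -> {'0xabc': 4.5, '0xdef': 0.2}
print(mean(t["value_eth"] for t in transfers))  # average transfer size
```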