Remote Data Science Jobs in Web3

578 jobs found

Company                   Location                            Salary
Crypto.com                Remote                              $30k - $80k
Archimed                  United States                       $70k - $150k
Virtually Human Studio    United States                       $105k - $111k
Virtually Human Studio    United States                       $63k - $75k
Virtually Human Studio    APAC                                $63k - $75k
Finoa GmbH                Berlin, Germany                     $45k - $60k
Nuri                      Berlin, Germany                     $40k - $62k
Kronos Research           Remote                              $28k - $72k
Archimed                  San Francisco, CA, United States    $80k - $150k
Gnosis                    Berlin, Germany                     N/A
Ultra                     Tallinn, Estonia                    $105k - $111k
Injective Labs            Remote                              $91k - $156k
Palm NFT Studio, Inc.     Van, Turkey                         $36k - $90k
Minds                     Remote                              $45k - $75k
Nuri                      Berlin, Germany                     $40k - $62k

Data Engineer (Backend Team)

Crypto.com
$30k - $80k estimated

This job is closed

About Crypto.com

Crypto.com was founded in 2016 on a simple belief: it's a basic human right for everyone to control their money, data and identity. With more than 10 million users on its platform today, Crypto.com provides a powerful alternative to traditional financial services, turning its vision of "cryptocurrency in every wallet" into reality, one customer at a time. Crypto.com is built on a solid foundation of security, privacy and compliance, and is the first cryptocurrency company in the world to attain ISO/IEC 27001:2013 and PCI:DSS 3.2.1 Level 1 compliance. Crypto.com is headquartered in Singapore with a 3,000+ strong team. For more information, please visit www.crypto.com.

About the role

The data engineering team builds the big data platform and continually improves our data pipelines. We are eager to hire talented engineers to help achieve team and company ambitions.

Responsibilities

  • Work with teams to build and continually evolve the data models and data flows that enable data-driven decision-making
  • Design and implement big data infrastructure and pipelines
  • Apply the data governance framework, including data management, the operating model, and data policies and standards
  • Identify shared data needs across the company, understand the specific requirements, and build efficient, scalable data pipelines that enable data-driven decisions company-wide
  • Keep latency low and bridge the gap between our source systems and our enterprise data warehouse by refactoring and optimizing our core data pipeline jobs (a sketch of such a job follows this list)
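
To make the last responsibility concrete, here is a minimal sketch of an incremental warehouse-load job in PySpark. All table names, columns, and the watermark value are hypothetical, invented for illustration; they do not describe Crypto.com's actual systems.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical incremental load from a source table into a Hive warehouse.
spark = (
    SparkSession.builder
    .appName("orders_incremental_load")
    .enableHiveSupport()
    .getOrCreate()
)

# Read only rows newer than the last successful load. In practice the
# watermark would come from a metadata table; it is hard-coded here.
last_loaded = "2022-01-01 00:00:00"
orders = (
    spark.table("source_db.orders")          # invented source table
    .where(F.col("updated_at") > F.lit(last_loaded))
)

# Light transformation: conform column names and stamp the load time.
conformed = (
    orders
    .withColumnRenamed("updated_at", "source_updated_at")
    .withColumn("dw_loaded_at", F.current_timestamp())
)

# Append only the new slice into a date-partitioned warehouse table.
(
    conformed
    .withColumn("load_date", F.to_date("source_updated_at"))
    .write.mode("append")
    .partitionBy("load_date")
    .saveAsTable("dw.fact_orders")           # invented target table
)
```

Appending only rows newer than the last watermark, rather than reloading whole tables, is one common way to keep the latency between source systems and the warehouse down.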

Skills Required

  • 2 to 5 years of experience in data and analytics engineering, working with distributed technologies (Hadoop, Spark, Hive, Kafka)
  • Experience with scripting and programming languages such as Python, Scala, or Java
  • Knowledge of data modeling, data warehousing, and ETL pipelines (a toy example follows this list)
  • Familiarity with database and data warehouse technologies (Postgres, Oracle, MySQL, Hive, etc.)
  • Experience with visualization tools (Tableau, IBM Cognos, etc.)
  • Experience with Docker and Kubernetes
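
As a rough illustration of the extract-transform-load pattern these requirements refer to, the sketch below runs a complete ETL pass using Python's standard-library sqlite3 as a stand-in for Postgres or Hive. The trades table and its columns are invented for the example.

```python
import sqlite3

# Stand-in "source" and "warehouse" databases, both in memory.
src = sqlite3.connect(":memory:")
dw = sqlite3.connect(":memory:")

# Seed the source with a few invented rows.
src.executescript("""
    CREATE TABLE trades (id INTEGER, symbol TEXT, amount_usd REAL);
    INSERT INTO trades VALUES
        (1, 'BTC', 100.0), (2, 'ETH', 250.0), (3, 'BTC', 50.0);
""")

# Extract and transform: aggregate raw trades into a per-symbol fact.
rows = src.execute(
    "SELECT symbol, SUM(amount_usd), COUNT(*) FROM trades GROUP BY symbol"
).fetchall()

# Load the transformed rows into the warehouse table.
dw.execute(
    "CREATE TABLE fact_trades (symbol TEXT, total_usd REAL, trade_count INTEGER)"
)
dw.executemany("INSERT INTO fact_trades VALUES (?, ?, ?)", rows)
dw.commit()

print(dw.execute("SELECT * FROM fact_trades ORDER BY symbol").fetchall())
# [('BTC', 150.0, 2), ('ETH', 250.0, 1)]
```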

Skills Preferred

  • Experience with real-time data processing
  • Experience with Airflow or similar scheduling tools (a minimal DAG sketch follows this list)
  • Experience with cloud infrastructure
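
For context on the scheduling point, here is a minimal Airflow DAG that chains two pipeline steps. The DAG id, schedule, and spark-submit commands are placeholders for illustration, not part of the actual role.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical hourly pipeline: task ids, scripts, and schedule are invented.
with DAG(
    dag_id="core_pipeline_hourly",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = BashOperator(
        task_id="extract_increment",
        bash_command="spark-submit extract_job.py",
    )
    load = BashOperator(
        task_id="load_warehouse",
        bash_command="spark-submit load_job.py",
    )
    extract >> load  # load runs only after extract succeeds
```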

What we offer

  • An attractive compensation package for work in a cutting-edge area of fintech
  • Huge responsibility from day one: be the owner of your own learning curve; the possibilities are limitless and depend on you
  • A very dynamic environment as part of an international team
  • Involvement in developing brand-new products from scratch with the latest technologies, alongside a passionate and talented team