| Company | Location | Salary |
|---|---|---|
| Heretic | San Francisco, CA, United States | $105k - $150k |
| Edge & Node | Remote | $81k - $150k |
| Marathon Digital Holdings | United States | $75k - $84k |
| CoinList | San Francisco, CA, United States | $150k - $207k |
| CSGOEmpire | Remote | $90k - $94k |
| Binance | Asia | |
| CoinList | San Francisco, CA, United States | $122k - $180k |
| Tokemak | Remote | $200k - $300k |
| Coinbase | United States | $140k - $165k |
| Douro Labs | Remote | $84k - $90k |
| Uniswap | New York, NY, United States | $153k - $187k |
| Rarible | Lisbon, Portugal | $75k - $84k |
| Luno | Cape Town, South Africa | $84k - $84k |
| Swan | Remote | $72k - $100k |
| Uniswap Labs | New York, NY, United States | $153k - $187k |
Senior Data Engineer (Stealth PortCo)
Responsibilities
- AI Model Training Pipeline Management
  - Design, develop, and maintain scalable, efficient data pipelines for training AI models
  - Collaborate with AI engineers and company management to understand requirements and implement solutions that improve model performance
  - Implement data versioning, tracking, and monitoring systems to ensure the quality and historical preservation of training data
- Business Analytics Platform
  - Lead the management and optimization of business analytics platforms, enabling stakeholders to derive actionable insights from diverse datasets, including but not limited to behavioral, transactional, and survey data
  - Collaborate with cross-functional teams to gather business requirements and translate them into data engineering solutions
  - Ensure data consistency, accuracy, and reliability within the business analytics platforms
- Data Governance and Security
  - Enforce data governance policies and security measures to protect sensitive information
  - Collaborate with the security team to implement and maintain data privacy standards and compliance
- Documentation and Knowledge Sharing
  - Create and maintain comprehensive documentation for data pipelines, ensuring knowledge transfer within the team
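The pipeline-management and data-versioning responsibilities above can be sketched in a few lines. This is a minimal, illustrative example only, not the company's actual stack: all names (`clean`, `version_tag`, `run_pipeline`) and the sample records are hypothetical.

```python
# Minimal sketch of a versioned training-data pipeline step.
# Everything here is illustrative; no names come from the posting.
import hashlib
import json
from datetime import datetime, timezone

def clean(records):
    """Drop rows missing required fields and normalize text."""
    return [
        {"text": r["text"].strip().lower(), "label": r["label"]}
        for r in records
        if r.get("text") and r.get("label") is not None
    ]

def version_tag(records):
    """Content-hash the cleaned dataset so every training run can be
    tied back to the exact data it saw (versioning/tracking)."""
    payload = json.dumps(records, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:12]

def run_pipeline(raw_records):
    """Clean the raw records and emit a manifest for monitoring."""
    cleaned = clean(raw_records)
    manifest = {
        "version": version_tag(cleaned),
        "row_count": len(cleaned),
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
    return cleaned, manifest

raw = [
    {"text": "  Some example ", "label": 1},
    {"text": "", "label": 0},          # dropped: empty text
    {"text": "another", "label": None},  # dropped: missing label
]
cleaned, manifest = run_pipeline(raw)
print(manifest["row_count"])  # 1
```

Storing the manifest alongside each dataset snapshot is one common way to get the "historical preservation of training data" the posting asks for.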
Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5 years' experience as a Data Engineer with a focus on data pipelines, cloud data solutions, analytics databases, visualization tools, and business analytics platforms. Direct experience with AI model pipelines is a plus, but not required.
- Strong proficiency in SQL, Python, and Linux. Comfortable with, or able to learn, light JavaScript for tool configuration.
- Experience working with large data sets, both structured and unstructured. Experience converting unstructured JSON and other data types into structured data in a scalable, automated manner.
- Understanding of machine learning model types and generative AI concepts is a plus.
- Experience with cloud-based data and image management platforms such as BigQuery, Snowflake, Redshift, and S3.
- In-depth knowledge of database systems, data warehousing, and ETL and data orchestration processes.
- Comfortable with, or willing to learn, setting up and managing cloud-based visualization tools such as Tableau or Looker and connecting them to analytical and relational data sources.
- An entrepreneurial, ownership mindset with a bias for action; prepared to move quickly to outpace competitors.
- Strong interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.
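One qualification above asks for converting unstructured JSON into structured data in a scalable, automated way. A common building block for that is recursive flattening into warehouse-friendly columns; the sketch below assumes made-up field names and is not taken from the posting.

```python
# Hedged sketch: flattening nested JSON into flat rows with
# dot-separated column names, ready to load into an analytics table.
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into a single flat row."""
    row = {}
    for key, value in obj.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=f"{col}."))
        else:
            row[col] = value
    return row

raw = json.loads('{"user": {"id": 42, "wallet": {"chain": "ethereum"}}, "amount": 1.5}')
print(flatten(raw))
# {'user.id': 42, 'user.wallet.chain': 'ethereum', 'amount': 1.5}
```

Run over a stream of events, this turns arbitrarily nested payloads into consistent columns that SQL engines like BigQuery or Snowflake (both named in the posting) can query directly.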
What does a data scientist in web3 do?
A data scientist in web3 is a type of data scientist who focuses on working with data related to the development of web-based technologies and applications that are part of the larger web3 ecosystem.
This can include working with data from decentralized applications (DApps), blockchain networks, and other types of distributed and decentralized systems.
In general, a data scientist in web3 is responsible for using data analysis and machine learning techniques to help organizations and individuals understand, interpret, and make decisions based on the data generated by these systems.
Some specific tasks that a data scientist in web3 might be involved in include developing predictive models, conducting research, and creating data visualizations.
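A small taste of the analysis work described above: aggregating (mock) on-chain transfer data into a daily summary. The records here are invented for illustration; no real blockchain data or API is involved.

```python
# Illustrative only: summarizing mock transfer records by day,
# the kind of aggregation a web3 data scientist might start from.
from collections import defaultdict

transfers = [
    {"day": "2024-01-01", "value_eth": 2.0},
    {"day": "2024-01-01", "value_eth": 0.5},
    {"day": "2024-01-02", "value_eth": 1.25},
]

# Sum transfer value per day.
daily_volume = defaultdict(float)
for t in transfers:
    daily_volume[t["day"]] += t["value_eth"]

print(dict(daily_volume))
# {'2024-01-01': 2.5, '2024-01-02': 1.25}
```

In practice the same aggregation would run over data pulled from a node or an indexer, and the resulting series would feed the predictive models and visualizations mentioned above.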