| Job Position | Company | Posted | Location | Salary | Tags |
|---|---|---|---|---|---|
| | Merkle Science | | Bangalore, India | $81k - $168k | |
| | Plai Labs | | Los Angeles, CA, United States | $81k - $100k | |
| | Plai Labs | | Los Angeles, CA, United States | $90k - $100k | |
| | Limit Break | | United States | $174k - $205k | |
| | Polymer Labs | | Madrid, Spain | $51k - $65k | |
| | Polymer Labs | | Remote | $90k - $100k | |
| | Chainkemists | | Remote | $72k - $75k | |
| | Coins.ph | | Manila, Philippines | $72k - $100k | |
| | Coins.ph | | APAC | $72k - $100k | |
| | Coins.ph | | Shanghai, China | | |
| | Trustana | | Singapore, Singapore | $45k - $72k | |
| | OP Labs | | Remote | $53k - $75k | |
| | Shakepay | | Montreal, Canada | $63k - $66k | |
| | Elwood Technologies | | Remote | | |
| | Elwood Technologies | | Remote | | |
This job is closed
Lead Data Engineer
💥 What will you do?
- Create and maintain optimal data pipeline architecture for our workloads. This includes building highly resilient architecture for both streaming and batch ETL processes.
- Have a good understanding of data structures used by public blockchains to store data. A significant portion of our data pipelines parse blockchain data and store them in our data warehouses.
- Assemble large, complex data sets that meet functional / non-functional business requirements. This includes expanding the scope of our data-mining efforts by building data pipelines to crawl data from the dark-web, open-web, third party data sources.
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organisation.
- Work closely with a team of frontend and backend engineers, product managers, and analysts to render data onto our products.
- Implement algorithms to transform raw data into useful information.
- Build, manage, and deploy AI/ML workflows.
- Lead technical architecture, design, and best practices for the team.
- Establish operability as a core design principle and follow it in practice.
- Apply site reliability concepts such as RCAs, postmortems, runbooks, and related automation.
- Apply strong DevSecOps principles.
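As a concrete illustration of the responsibilities above, here is a minimal sketch of a batch ETL step that parses raw blockchain block data into flat, warehouse-ready transaction rows. The block and transaction schema here is hypothetical (loosely modeled on Ethereum-style RPC responses), for illustration only, not the actual Merkle Science pipeline.

```python
# Minimal sketch: transform one raw block (JSON) into flat transaction rows.
# The schema is a hypothetical, Ethereum-like example for illustration.
import json
from datetime import datetime, timezone

def parse_block(raw_json: str) -> list[dict]:
    """Parse a raw block payload into warehouse-ready transaction records."""
    block = json.loads(raw_json)
    rows = []
    for tx in block.get("transactions", []):
        rows.append({
            "block_number": block["number"],
            "block_time": datetime.fromtimestamp(
                block["timestamp"], tz=timezone.utc
            ).isoformat(),
            "tx_hash": tx["hash"],
            "from_addr": tx["from"],
            "to_addr": tx.get("to"),            # None for contract creation
            "value_wei": int(tx["value"], 16),  # hex-encoded, as in JSON-RPC
        })
    return rows

# Example input: one block containing a single 1-ETH transfer.
raw = json.dumps({
    "number": 17_000_000,
    "timestamp": 1_680_000_000,
    "transactions": [
        {"hash": "0xabc", "from": "0x1", "to": "0x2",
         "value": "0xde0b6b3a7640000"},
    ],
})
rows = parse_block(raw)
print(rows[0]["value_wei"])  # → 1000000000000000000 (1 ETH in wei)
```

In a real pipeline this function would run as a task inside an orchestrator such as Airflow or Beam, with the resulting rows loaded into the warehouse.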
🙋 What are we looking for?
- 6+ years of relevant experience as a Big Data Engineer.
- Experience in building data pipelines (ETL/ELT) using open-source tools such as Apache Airflow, Apache Beam, and Spark.
- Experience in building real-time streaming pipelines using Kafka or Pub/Sub.
- Experience in building and maintaining OLAP and OLTP data warehouses.
- Good understanding of Python and Bash scripting, plus basic cloud platform skills (GCP or AWS).
- A working knowledge of Docker is a plus.
- Problem-solving aptitude.
- Analytical mind with business acumen.
- Excellent communication skills.
👀 What process do we follow? (<2 weeks)
- Application: We will keep it simple. You can apply directly through our job portal. All we ask for is a Resume. Additional Portfolio links such as Github, Medium or a Personal website are welcome.
- Screening: We will screen your profile and get back with a decision within a week.
- Interviews: We will have two rounds of interviews.
- Round one (30mins) will focus on getting to know each other better and identifying if this could work for both of us.
- Round two (60mins) is a technical round where we will review your prior experience and discuss how you would build systems to solve a problem we will introduce on call.
- Meet the Team: Culture fit is essential for both you and us, so we always go the extra mile: you will meet two other colleagues on the team you would be working with. This is your chance to ask about the stack, the culture, and anything else you want to know before committing to a new role.
- Offer Rollout: If all looks well, we will open a bottle of champagne.
What is the salary of a Docker professional?
Docker is a containerization technology that is widely used in the software development industry. The salary of a Docker professional varies with company, location, industry, years of experience, and job position, and it typically increases with experience and with additional skills in related technologies such as Kubernetes or cloud computing.
Here are some estimates for the average salaries of Docker-related job positions in the United States based on data from various sources:
- DevOps Engineer with Docker skills: The average salary for a DevOps Engineer with Docker skills in the US is around $115,000 to $150,000 per year.
- Docker Engineer: The average salary for a Docker Engineer in the US is around $110,000 to $140,000 per year.
- Docker Architect: The average salary for a Docker Architect in the US is around $130,000 to $170,000 per year.
- Docker Administrator: The average salary for a Docker Administrator in the US is around $95,000 to $120,000 per year.