This job is closed
Ripple’s Enterprise Data Management & Analytics team is building scalable data infrastructure that lets the company grow smoothly and safely. As a DevOps Engineer on our data platform team, you will be responsible for the setup, deployment, maintenance, and continuous monitoring of data-intensive applications. Your work will directly support the operation of production software while also enabling the developer experience for data engineers and data scientists. You will bring a software engineering approach that strengthens our culture of ownership, reliability, trust, and observability across the ever-increasing scope of our Data Platform.
WHAT YOU'LL DO:
- Maintain and develop services that support our data-driven analytics framework
- Architect, deploy, and maintain Ripple’s multi-region, multi-provider service platforms, with an emphasis on security and resiliency
- Design and develop tools for automation, monitoring, and instrumentation to reduce operational friction and increase engineering efficiency
- Create solutions for unique technical challenges faced by Ripple’s data infrastructure, engineering, and ML teams: secret management, geographic failover, data replication, availability and platform resiliency, streaming technologies, API services, etc.
- Create and automate new and existing platform and application lifecycle services, leveraging data to converge on declared states with minimal human interaction
- Collaborate with data engineering to ensure code is production-ready
- Work closely with developers, data scientists, and first-level support teams
- Provide occasional after-hours on-call support for urgent critical issues, working with first-level support teams
- Help lead the adoption of DevOps-first principles within the organization
- Research promising new tools and technologies, and push the team to experiment and evolve
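The “converge on declared states” bullet above describes a reconcile loop, the pattern behind tools such as Kubernetes controllers and Terraform plans. A minimal illustrative sketch of the idea (the service names and states below are hypothetical, not Ripple’s actual tooling):

```python
# Minimal reconcile-loop sketch: diff declared state against observed state
# and emit only the actions needed to converge, with no human in the loop.

def reconcile(declared: dict, observed: dict) -> list:
    """Return the (action, name, spec) steps that move observed to declared."""
    actions = []
    for name, spec in declared.items():
        if name not in observed:
            actions.append(("create", name, spec))   # missing entirely
        elif observed[name] != spec:
            actions.append(("update", name, spec))   # drifted from spec
    for name in observed:
        if name not in declared:
            actions.append(("delete", name, None))   # no longer declared
    return actions

# Hypothetical declared vs. observed platform state:
declared = {"etl-worker": {"replicas": 3}, "api-gateway": {"replicas": 2}}
observed = {"etl-worker": {"replicas": 1}, "legacy-job": {"replicas": 1}}

for action in reconcile(declared, observed):
    print(action)
```

Real controllers run this loop continuously, so manual changes and failures are automatically driven back to the declared state.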
 
WHAT WE ARE LOOKING FOR:
- 5+ years of software development and operations experience
- 4+ years of DevOps experience in multi-tenant, highly scalable, highly available environments on GCP and AWS
- 3+ years of experience with Kubernetes and infrastructure provisioning tools such as Terraform and CloudFormation
- 3+ years of experience with AWS, Docker containers, and container orchestration (Kubernetes, EKS, etc.)
- 4+ years of experience in cloud platform administration (preferably AWS or GCP)
- Solid development background with Go, Python, Java, or C++
- Experience designing and operating large-scale, multi-region data infrastructure platforms
- Experience developing APIs and SDKs
- Experience with data-relevant AWS services such as RDS, S3, EMR, Kinesis, DynamoDB, and Lambda (or equivalents from GCP or Azure)
- Experience running containerized workloads in production, especially batch workloads
- Experience with container schedulers and runtimes such as Docker, rkt, or OCI running on Kubernetes, Rancher, or Mesos
- Experience building deployment pipelines leveraging common CI/CD tools
- Experience with Infrastructure-as-Code (e.g. Terraform, CloudFormation)
- Experience with real-time telemetry and tracing tools such as Jaeger and Prometheus, and with the ELK stack (Elasticsearch, Logstash, Kibana, Beats)
- Experience in these or similar tools/technologies: Shell, Bash, Python, Java, Git, Jenkins, Maven, Gradle, Kubernetes, Helm, AWS (EKS, EC2, IAM), Splunk, Prometheus
- Experience tuning and scaling Apache Kafka producers/consumers and Spark structured streaming applications
- Experience setting up LDAP, RBAC, and service mesh/API gateways for data services
- Security awareness, with an emphasis on designing for security best practices and for IT security, GDPR, and SOC 2 requirements
- Experience deploying and supporting data-specific infrastructure and tooling such as Kafka/Kinesis, Kafka Connect/KSQL, SQL engines (Presto/Athena), workflow schedulers (Airflow/Luigi), Memcached/Elasticsearch, service APIs, and streaming applications (Kafka Streams/Spark Streaming)
- Familiarity with modern MPP columnar data warehouse platforms such as Redshift, Snowflake, Greenplum, ClickHouse, or BigQuery (we use Snowflake)
- Familiarity with Spark and the Hadoop ecosystem is a bonus
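Tuning and scaling Kafka consumers, mentioned in the requirements above, usually starts with simple capacity arithmetic: each partition is read by at most one consumer per group, so partitions cap parallelism, and the required consumer count follows from throughput. A rough back-of-the-envelope helper (the rates below are illustrative assumptions, not measured figures):

```python
import math

def consumers_needed(msgs_per_sec: int, per_consumer_rate: int, partitions: int) -> int:
    """Estimate a Kafka consumer-group size: enough consumers to absorb the
    message rate, but never more than the partition count, because extra
    consumers in a group sit idle."""
    needed = math.ceil(msgs_per_sec / per_consumer_rate)
    return min(needed, partitions)

# Illustrative: 50k msgs/s, ~8k msgs/s per consumer, 12-partition topic.
print(consumers_needed(50_000, 8_000, 12))  # 7
```

When the partition count is the binding constraint (e.g. the same load on a 4-partition topic), the fix is repartitioning the topic rather than adding consumers.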
 
OUR TECH STACK AT A GLANCE:
- Amazon Web Services
- GCP
- Kubernetes and Docker
- GitLab
- Linux
- Terraform
- Java, JavaScript, and Python
- Airflow, dbt, BigQuery
- Hadoop/Impala/Hue
- Kinesis/Spark Streaming
 
WHAT WE OFFER:
- The chance to work in a fast-paced start-up environment with experienced industry leaders
 - A learning environment where you can dive deep into the latest technologies and make an impact
 - Competitive salary and equity
 - 100% paid medical and dental insurance, and 95% paid vision insurance, starting on your first day
 - 401k (with match), commuter benefits
 - Industry-leading parental leave policies
 - Generous wellness reimbursement and weekly onsite programs
 - Flexible vacation policy - work with your manager to take time off when you need it
 - Employee giving match
 - Modern office in San Francisco’s Financial District
 - Fully-stocked kitchen with organic snacks, beverages, and coffee drinks
 - Weekly company meeting with ask-me-anything sessions
 - Team outings to sports games, happy hours, game nights and more!
 
What is a 401(k) plan?
A 401(k) plan is a type of employer-sponsored retirement savings plan that allows workers to save and invest for retirement on a tax-deferred basis.
Contributions to a 401(k) plan are made through payroll deductions and are generally invested in a variety of financial instruments, such as stocks, bonds, and mutual funds.
The tax-deferred nature of the 401(k) plan means that the money you contribute is not subject to income tax until you withdraw it in retirement, which can help you save more for retirement.
Many employers also offer matching contributions to their employees' 401(k) plans, which can help boost your retirement savings even more.
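As a concrete illustration of how an employer match boosts savings, here is the arithmetic for a hypothetical plan (the contribution and match terms below are made-up examples, not this employer’s actual plan):

```python
def annual_401k_savings(salary: float, contribution_pct: float,
                        match_pct: float, match_cap_pct: float) -> float:
    """Employee contribution plus employer match, where the employer matches
    match_pct of the employee's contribution on at most match_cap_pct of salary."""
    employee = salary * contribution_pct
    matched_base = min(employee, salary * match_cap_pct)  # match applies only up to the cap
    employer = matched_base * match_pct
    return employee + employer

# Hypothetical: $100k salary, 6% employee contribution,
# 50% employer match on the first 6% of salary.
print(annual_401k_savings(100_000, 0.06, 0.50, 0.06))  # 9000.0
```

In this example the employee contributes $6,000 and the employer adds $3,000, so $9,000 goes into the account for the year; contributions above the cap still grow tax-deferred but earn no additional match.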