Polygon Jobs in Web3
507 jobs found
Company | Location | Salary |
---|---|---|
CertiK | Remote | $102k - $190k |
Solana US | United States | $185k |
Infinite Reality | San Francisco, CA, United States | $45k - $75k |
Mynaswap | Los Angeles, CA, United States | $54k - $70k |
Ankr | London, United Kingdom | $63k - $100k |
CertiK | Remote | |
CertiK | Remote | |
CertiK | Remote | $51k - $70k |
CertiK | Remote | $60k - $80k |
CertiK | Remote | $102k - $190k |
CertiK | Remote | $102k - $190k |
CertiK | Remote | |
CertiK | Remote | $100k - $140k |
CertiK | Remote | $102k - $180k |
Chainstack | Remote | $62k - $82k |
This job is closed
Data Platform Engineer - Matrix
Responsibilities
- Manage critical data platform services for the data lifecycle, including ingestion, transformation, management, delivery, and analytics.
- Build tools and libraries that ease new pipeline development and reduce delivery time.
- Work towards increasing the robustness and fault tolerance of the platform as it processes large amounts of data.
- Implement best practices and design patterns for different data solutions.
- Define tools and techniques to improve overall data observability and discoverability.
- Troubleshoot and remediate issues with the data platform services and data pipelines.
- Collaborate on cross-team Data Platform initiatives.
- Track and execute continuous improvements.
Required Qualifications
- Experience with data warehouse/data lake platforms such as Snowflake, Redshift, or BigQuery.
- Experience with database technologies such as Redis, Kafka, MongoDB, PostgreSQL, and MySQL.
- Experience with data engineering tools such as Airflow, dbt, etc. (see the sketch after this list).
- Familiarity with data observability and data governance.
- Programming experience in Bash, Python, Golang, C++, or Java, and experience with data query and manipulation in SQL.
- Proclivity for automation and DevOps practices and tools such as Git and Terraform.
- Broad exposure to at least one cloud platform: AWS, Google Cloud, or Azure.
- Minimum of a BS degree in CS or a related field; MS or PhD preferred.
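As a rough illustration of the pipeline tooling named above, here is a minimal Airflow sketch (TaskFlow API, recent Airflow 2.x) in Python. It is not the employer's actual pipeline: the DAG name, the placeholder rows, and the `analytics.events_daily` target are hypothetical, and a real job would pull from and write to actual systems.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def events_pipeline():
    """Hypothetical ingest -> transform -> deliver pipeline."""

    @task
    def extract() -> list[dict]:
        # Placeholder: in practice this would pull raw events from an API or queue.
        return [{"user": "alice", "amount": 10}, {"user": "bob", "amount": 25}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder transformation: drop empty records and add a derived field.
        return [{**r, "amount_usd": r["amount"] * 1.0} for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load step: in practice, write to a warehouse such as Snowflake or BigQuery.
        print(f"loading {len(rows)} rows into analytics.events_daily")

    load(transform(extract()))


events_pipeline()
```

Wrapping the extract/transform/load steps as tasks is what gives the platform team retries, scheduling, and observability for free, which is the point of the tooling listed in the qualifications.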
Preferred Qualifications
- Experience with data science tools and platforms such as Jupyter notebooks, Databricks, SageMaker, and TensorFlow.
- Understanding of Linux.
- Experience working with monitoring and logging tools: Prometheus/DataDog, ELK, Grafana.
- Familiarity with the open-source software community.
- Experience in startups or with blockchain/smart contracts/DeFi.
What is Polygon crypto used for?
Polygon (formerly known as Matic Network) is a Layer 2 scaling solution that aims to provide faster and cheaper transactions for Ethereum-based decentralized applications (dapps). It is a framework for building and connecting Ethereum-compatible blockchain networks, and it uses a Proof of Stake (PoS) consensus mechanism to validate transactions. Polygon also supports the Ethereum Virtual Machine (EVM), which means that developers can use the same tools and programming languages they already know to build dapps on Polygon (illustrated in the sketch below).
One of Polygon's key features is interoperability between different blockchain networks: dapps built on Polygon can interact with other dapps and networks, making it easier for developers to create more complex applications. In short, Polygon is a Layer 2 scaling solution that offers faster and cheaper transactions for Ethereum-based dapps while also enabling interoperability between different blockchain networks.
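As a small illustration of that EVM compatibility, the sketch below points standard Ethereum tooling (web3.py v6) at a Polygon endpoint without changing any calls. The public RPC URL and the example address are assumptions for demonstration, not endorsements of a particular provider.

```python
from web3 import Web3

# Assumed public RPC endpoint; any Polygon PoS JSON-RPC provider works the same way.
POLYGON_RPC = "https://polygon-rpc.com"

w3 = Web3(Web3.HTTPProvider(POLYGON_RPC))

if not w3.is_connected():
    raise SystemExit("Could not reach the Polygon RPC endpoint")

# Because Polygon is EVM-compatible, the same web3.py calls used on Ethereum apply here.
print("chain id:", w3.eth.chain_id)          # 137 on Polygon PoS mainnet
print("latest block:", w3.eth.block_number)

# Balances are returned in wei of the native token (MATIC/POL instead of ETH).
addr = Web3.to_checksum_address("0x0000000000000000000000000000000000001010")  # example address
print("balance:", Web3.from_wei(w3.eth.get_balance(addr), "ether"))
```

The same script pointed at an Ethereum RPC endpoint behaves identically, which is the practical meaning of EVM compatibility for developers.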