Blockchain Jobs in United States

12,147 jobs found

Company | Location | Salary
Tether | San Francisco, CA, United States | $100k - $500k
Tether | New York, NY, United States | $100k - $500k
Proof of Play | United States | $60k - $72k
Polygon Labs | United States | $72k - $150k
Phantom | United States | $200k - $238k
Groma | Boston, MA, United States | $57k - $72k
Groma | Boston, MA, United States | $90k - $110k
Grayscale Investments | Stamford, CT, United States | $73k - $102k
Figure | St. Louis, MO, United States | $117k - $122k
Figure | Reno, NV, United States | $105k - $132k
Figment | New York, NY, United States | $120k - $150k
Kraken | United States | $110k - $220k
Chainlink Labs | United States | $90k - $110k
Polymarket | New York, NY, United States | $74k - $120k
Polymarket | New York, NY, United States | $82k - $115k

Tether
San Francisco, CA, United States
$100k - $500k

Join Tether and Shape the Future of Digital Finance

At Tether, we’re not just building products; we’re pioneering a global financial revolution. Our cutting-edge solutions empower businesses—from exchanges and wallets to payment processors and ATMs—to seamlessly integrate reserve-backed tokens across blockchains. By harnessing the power of blockchain technology, Tether enables you to store, send, and receive digital tokens instantly, securely, and globally, all at a fraction of the cost. Transparency is the bedrock of everything we do, ensuring trust in every transaction.

Innovate with Tether

Tether Finance: Our innovative product suite features the world’s most trusted stablecoin, USDT, relied upon by hundreds of millions worldwide, alongside pioneering digital asset tokenization services.

But that’s just the beginning:

Tether Power: Driving sustainable growth, our energy solutions optimize excess power for Bitcoin mining using eco-friendly practices in state-of-the-art, geo-diverse facilities.

Tether Data: Fueling breakthroughs in AI and peer-to-peer technology, we reduce infrastructure costs and enhance global communications with cutting-edge solutions like KEET, our flagship app that redefines secure and private data sharing.

Tether Education: Democratizing access to top-tier digital learning, we empower individuals to thrive in the digital and gig economies, driving global growth and opportunity.

Tether Evolution: At the intersection of technology and human potential, we are pushing the boundaries of what is possible, crafting a future where innovation and human capabilities merge in powerful, unprecedented ways.

Why Join Us?

Our team is a global talent powerhouse, working remotely from every corner of the world. If you’re passionate about making a mark in the fintech space, this is your opportunity to collaborate with some of the brightest minds, pushing boundaries and setting new standards. We’ve grown fast, stayed lean, and secured our place as a leader in the industry.

If you have excellent English communication skills and are ready to contribute to the most innovative platform on the planet, Tether is the place for you.

Are you ready to be part of the future?

About the job:

As a member of the AI model team, you will drive innovation in architecture development for cutting-edge models of various scales, including small, large, and multi-modal systems. Your work will enhance intelligence, improve efficiency, and introduce new capabilities to advance the field.

You bring deep expertise in LLM architectures and a strong grasp of pre-training optimization, paired with a hands-on, research-driven approach. Your mission is to explore and implement novel techniques and algorithms that lead to groundbreaking advancements: curating data, strengthening baselines, and identifying and resolving pre-training bottlenecks to push the limits of AI performance.

Responsibilities:

  • Pre-train AI models on large, distributed servers equipped with thousands of NVIDIA GPUs.

  • Design, prototype, and scale innovative architectures to enhance model intelligence.

  • Independently and collaboratively execute experiments, analyze results, and refine methodologies for optimal performance.

  • Investigate, debug, and improve both model efficiency and computational performance.

  • Contribute to the advancement of training systems to ensure seamless scalability and efficiency on target platforms.


Requirements:

  • A degree in Computer Science or a related field, ideally a PhD in NLP, Machine Learning, or a related area, complemented by a solid track record in AI R&D (with strong publications at A* conferences).

  • Hands-on experience contributing to large-scale LLM training runs on large, distributed servers equipped with thousands of NVIDIA GPUs, ensuring scalability and impactful advancements in model performance.

  • Familiarity and practical experience with large-scale, distributed training frameworks, libraries and tools.

  • Deep knowledge of state-of-the-art transformer and non-transformer modifications aimed at enhancing intelligence, efficiency and scalability.

  • Strong expertise in PyTorch and Hugging Face libraries with practical experience in model development, continual pretraining, and deployment.
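
Below is a minimal, illustrative sketch of the kind of continual-pretraining workflow the PyTorch and Hugging Face requirement refers to. The base checkpoint, dataset, and hyperparameters are placeholder assumptions chosen for brevity, not details from this posting; a real run at the scale described above would use a much larger model and corpus and be distributed across many GPUs.

```python
# Illustrative continual-pretraining sketch with PyTorch + Hugging Face Transformers.
# All specifics below (gpt2, wikitext, batch size, learning rate) are placeholder
# assumptions for a small runnable example, not details taken from the job posting.
import torch
from torch.utils.data import DataLoader
from transformers import AutoModelForCausalLM, AutoTokenizer
from datasets import load_dataset

checkpoint = "gpt2"  # placeholder base model to continue pretraining from
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(checkpoint)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device).train()

# A tiny public corpus stands in for a curated pre-training dataset.
raw = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512, padding="max_length")

tokenized = raw.map(tokenize, batched=True, remove_columns=raw.column_names)
tokenized.set_format(type="torch", columns=["input_ids", "attention_mask"])
loader = DataLoader(tokenized, batch_size=4, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

for step, batch in enumerate(loader):
    input_ids = batch["input_ids"].to(device)
    attention_mask = batch["attention_mask"].to(device)
    labels = input_ids.clone()
    labels[attention_mask == 0] = -100  # ignore padding tokens in the causal-LM loss
    loss = model(input_ids=input_ids, attention_mask=attention_mask, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    if step >= 10:  # keep the sketch short; real pre-training runs train far longer
        break
```

At production scale, the same loop would typically be wrapped in a distributed training framework (for example FSDP, DeepSpeed, or Megatron-style parallelism) rather than run on a single device.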