Company | Location | Salary |
---|---|---|
Ripple | San Francisco, CA, United States | $32k - $60k |
Ripple | New York, NY, United States | $91k - $105k |
Kronos Research | Taipei, Taiwan | $81k - $84k |
BitGo | Nashville, TN, United States | $81k - $84k |
Ava Labs | New York, NY, United States | $11k - $75k |
Ava Labs | San Francisco, CA, United States | $11k - $75k |
Ava Labs | Miami, FL, United States | $11k - $75k |
Ava Labs | New York, NY, United States | $11k - $75k |
Ava Labs | New York, NY, United States | $36k - $66k |
Ava Labs | Miami, FL, United States | $36k - $66k |
Ava Labs | San Francisco, CA, United States | $36k - $66k |
Ava Labs | New York, NY, United States | $36k - $66k |
BCGDV | Los Angeles, CA, United States | $98k - $108k |
Genesis Global Trading, Inc. | New York, NY, United States | $87k - $90k |
Infstones | Remote | $36k - $45k |
This job is closed
What You’ll Do:
- Build and maintain the infrastructure that powers a data-driven culture at Ripple, helping its distributed financial technology outperform current banking infrastructure by driving down costs, increasing processing speeds, and delivering end-to-end visibility into payment fees, timing, and delivery.
- Review specifications, production schedules, and process flows to improve the methods applied in Ripple’s services and technologies, and use advanced analytical methods to solve complex issues.
- Support Ripple’s internal and external-facing data APIs and applications built on distributed systems, distributed data stores, data pipelines, and other tools in cloud services environments.
- Build systems and services that abstract the underlying compute engines, letting users focus on business and application logic through higher-level programming models built on distributed processing engines.
- Build data pipelines and tools that keep pace with the growth of data and its consumers, using stream processing frameworks (a brief sketch follows this list).
- Build scalable backend services and data pipelines.
- Identify and analyze requirements and use cases from multiple internal teams, including finance, compliance, analytics, data science, and engineering, drawing on database internals, database design, SQL, and database programming.
- Work with other technical leads to design solutions for the requirements.
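As an illustration of the stream-processing work described above (not an excerpt from Ripple’s codebase), the sketch below consumes events with Apache Kafka, one of the frameworks named in the requirements further down; the broker address, consumer group, and topic name are hypothetical placeholders.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PaymentEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address and consumer group; real values depend on the deployment.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "payment-visibility");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Hypothetical topic carrying raw payment events.
            consumer.subscribe(List.of("payment-events"));
            while (true) { // a real service would add a shutdown hook to exit this loop
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Downstream steps (enrichment, fee/timing metrics, loading into a
                    // warehouse such as Redshift or BigQuery) would hang off this loop.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```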
What We’re Looking For:
Must have a Bachelor’s degree in Computer Science, Operations Research, or a related field plus 4 years of software development experience in distributed systems and distributed processing compute engines; or a Master’s degree in Computer Science, Operations Research, or a related field plus 2 years of software development experience in distributed systems and distributed processing compute engines.
Of the required experience, must have two years of experience in two of the following (which may be gained concurrently): AWS (Redshift or Kinesis); GCP (BigTable, BigQuery, or Pub/Sub); Hadoop; Kafka; Beam; Storm; Flink; or, Spark.
Of the required experience, must have one year of experience in two of the following (which may be gained concurrently): Python; Java; Node.js; or, Unix.
To apply, please email a resume to: [email protected]. Applications must reference Job # SWE00 to be considered.
#LI-DNI
WHAT WE OFFER:
- The chance to work in a fast-paced start-up environment with experienced industry leaders
- A learning environment where you can dive deep into the latest technologies and make an impact
- Competitive salary and equity
- 100% paid medical and dental and 95% paid vision insurance for employees starting on your first day
- 401k (with match), commuter benefits
- Industry-leading parental leave policies
- Generous wellness reimbursement and weekly onsite programs
- Flexible vacation policy - work with your manager to take time off when you need it
- Employee giving match
- Modern office in San Francisco’s Financial District
- Fully-stocked kitchen with organic snacks, beverages, and coffee drinks
- Weekly company meeting with an ask-me-anything session
- Team outings to sports games, happy hours, game nights and more!
What does a Java developer in web3 do?
A Java developer in web3 would likely focus on building applications in the Java programming language within the web3 technology stack. Web3 is a collective term for the next generation of decentralized, blockchain-based technologies aimed at creating a more open and secure internet. In this context, a Java developer would write code that interacts with web3 components such as decentralized applications (DApps) and smart contracts to create new tools and services that run on the blockchain. This could involve working with platforms such as Ethereum, typically through Java libraries like web3j or Java-based clients such as Hyperledger Besu, as well as other web3 technologies and frameworks.
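For illustration only, here is a minimal sketch of how a Java developer might talk to an Ethereum node using the web3j library; the RPC endpoint URL and the zero address are placeholders rather than real values.

```java
import java.math.BigInteger;

import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.DefaultBlockParameterName;
import org.web3j.protocol.http.HttpService;
import org.web3j.utils.Convert;

public class Web3QuickStart {
    public static void main(String[] args) throws Exception {
        // Connect to an Ethereum JSON-RPC endpoint (placeholder URL).
        Web3j web3 = Web3j.build(new HttpService("https://example-rpc.invalid"));

        // Read the latest block number.
        BigInteger blockNumber = web3.ethBlockNumber().send().getBlockNumber();
        System.out.println("Latest block: " + blockNumber);

        // Read the balance of a placeholder address in wei, then convert to ether.
        String address = "0x0000000000000000000000000000000000000000";
        BigInteger wei = web3.ethGetBalance(address, DefaultBlockParameterName.LATEST)
                .send()
                .getBalance();
        System.out.println("Balance (ETH): " + Convert.fromWei(wei.toString(), Convert.Unit.ETHER));

        web3.shutdown();
    }
}
```

web3j wraps Ethereum’s JSON-RPC interface, so the same pattern extends from simple reads like these to sending transactions and calling smart contracts from Java.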