Data Science Web3 Jobs in United States
477 jobs found
| Company | Location | Salary |
|---|---|---|
| Braintrust | San Francisco, CA, United States | — |
| DeFiner Inc | Minneapolis, MN, United States | $10k - $33k |
| Gemini | United States | $91k - $96k |
| Braintrust | San Francisco, CA, United States | — |
| PolySign & Standard Custody | Denver, CO, United States | $78k - $100k |
| Cogo Labs | United States | $45k - $76k |
| IBM | San Francisco, CA, United States | $73k - $152k |
| Gauntlet | New York, NY, United States | $32k - $64k |
| Ember Fund | Los Angeles, CA, United States | $32k - $64k |
| Pagoda | San Francisco, CA, United States | $72k - $75k |
| Blockchain.com | San Francisco, CA, United States | — |
| Anonym | Palo Alto, CA, United States | — |
| Genies | Los Angeles, CA, United States | $63k - $75k |
| Pagoda | San Francisco, CA, United States | $54k - $80k |
| Thirdwave | San Francisco, CA, United States | $63k - $100k |
- JOB TYPE: Freelance, Contract Position (no agencies/C2C - see notes below)
- LOCATION: Remote - Work from anywhere
- HOURLY RANGE: Our client is looking to pay $70 – $90/hr
- ESTIMATED DURATION: 20 hrs/wk - Long-term
THE OPPORTUNITY
Ideal Experience, Competencies, and Required Skills:
- Hands-on familiarity and expertise with the Google Cloud data environment (resources, storage, and tools), including but not limited to BigQuery, Pub/Sub, Compute Engine, Cloud Functions, Cloud Storage, the AI/ML stack, and Terraform
- Excellent SQL competency
- Excellent written and oral communication skills
- Excellent organizational skills
- Excellent critical thinking skills
- Detail oriented
- Ability to learn rapidly about new technologies and methods
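As a hedged illustration of the SQL competency called for above (this snippet is not part of the posting), here is a minimal ad-hoc aggregation query of the kind a GCP data engineer might run against BigQuery, sketched against an in-memory SQLite database so it is self-contained. The `events` table and its columns are hypothetical.

```python
import sqlite3

# Hypothetical events table standing in for a BigQuery dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, action TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "swap", 120.0), ("u1", "stake", 40.0), ("u2", "swap", 75.5)],
)

# Ad-hoc request: total amount per action, largest total first.
rows = conn.execute(
    """
    SELECT action, COUNT(*) AS n, SUM(amount) AS total
    FROM events
    GROUP BY action
    ORDER BY total DESC
    """
).fetchall()

for action, n, total in rows:
    print(action, n, total)
```

The same GROUP BY/ORDER BY shape carries over to BigQuery Standard SQL; only the client library and table references change.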
Responsibilities:
- Administer and oversee Princeton's Google Cloud Platform and ecosystem
- Function as our GCP administrator: maintain access and resources, and manage systems
- Coordinate with internal and offshore engineering and software development teams
- Support development projects and team resources by deploying resources within GCP environments, administering and promoting changes (CI/CD), and overseeing access and security
- Administer and manage data pipeline and scraper requests
- Query databases and data models for ad-hoc requests
- Maintain and monitor databases, and create new databases and table structures
- Process, cleanse, and verify the integrity of structured and unstructured data used for analysis
- Identify rich data sources, join them with other, potentially incomplete data sources, and clean the resulting set
- Present information using data visualization techniques
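The cleansing and joining duties above can be sketched as follows. This is an illustrative snippet, not the employer's codebase; the sources and field names (`user_id`, `email`, `balance`) are hypothetical.

```python
# Two potentially incomplete sources to be joined on user_id.
profiles = [
    {"user_id": "u1", "email": "a@example.com"},
    {"user_id": "u2", "email": None},      # incomplete record
]
balances = [
    {"user_id": "u1", "balance": 120.0},
    {"user_id": "u3", "balance": 9.5},     # no matching profile
]

by_id = {p["user_id"]: p for p in profiles}

cleaned = []
for row in balances:
    profile = by_id.get(row["user_id"])
    # Verify integrity: drop rows with no matching profile
    # or with any missing field value.
    if profile and all(v is not None for v in {**profile, **row}.values()):
        cleaned.append({**profile, **row})

print(cleaned)
```

Only the `u1` row survives both the join and the integrity check; in practice the same pattern scales up to a pandas merge or a BigQuery JOIN with NULL filtering.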
Apply Now!