Data Science Careers at Braintrust
There are 9 Web3 Jobs at Braintrust
| Job Position | Company | Posted | Location | Salary | Tags |
| --- | --- | --- | --- | --- | --- |
| | Braintrust | | San Francisco, CA, United States | | |
| | Braintrust | | San Francisco, CA, United States | | |
| | Braintrust | | San Francisco, CA, United States | | |
| | Braintrust | | San Francisco, CA, United States | | |
| | Braintrust | | San Francisco, CA, United States | | |
| | Braintrust | | San Francisco, CA, United States | $91k - $96k | |
| | Braintrust | | San Francisco, CA, United States | $32k - $76k | |
| | Braintrust | | San Francisco, CA, United States | $30k - $67k | |
| | Braintrust | | San Francisco, CA, United States | $91k - $156k | |
This job is closed
- JOB TYPE: Freelance, Contract Position (no agencies/C2C - see notes below)
- LOCATION: Remote - Work from anywhere
- HOURLY RANGE: Our client is looking to pay $70 – $90/hr
- ESTIMATED DURATION: 20 hrs/wk - Long-term
THE OPPORTUNITY
Ideal Experience and Competencies / Required Skills:
- Hands-on familiarity and expertise with the Google Cloud data environment (resources, storage, and tools), including but not limited to BigQuery, Pub/Sub, Compute Engine, Cloud Functions, Cloud Storage, the AI/ML stack, and Terraform
- Excellent SQL competency
- Excellent written and oral communication skills
- Excellent organizational skills
- Excellent critical thinking skills
- Detail oriented
- Ability to learn rapidly about new technologies and methods
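As a rough illustration of the kind of SQL competency the role calls for, here is a minimal sketch using Python's built-in sqlite3 module in place of BigQuery; the table name, columns, and data are hypothetical, not from the posting:

```python
import sqlite3

# In-memory database standing in for a cloud dataset (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, event TEXT, amount REAL);
    INSERT INTO events VALUES
        (1, 'purchase', 30.0),
        (1, 'purchase', 20.0),
        (2, 'refund',  -10.0),
        (2, 'purchase', 50.0);
""")

# Aggregate spend per user and keep only net-positive totals,
# ordered from highest to lowest.
rows = conn.execute("""
    SELECT user_id, SUM(amount) AS total
    FROM events
    GROUP BY user_id
    HAVING total > 0
    ORDER BY total DESC
""").fetchall()

for user_id, total in rows:
    print(user_id, total)
```

The same GROUP BY / HAVING pattern carries over directly to BigQuery's standard SQL dialect.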
Responsibilities:
- Administration and oversight of Princeton's Google Cloud platform and ecosystem
- Function as our GCP administrator; maintain access and resources, and manage the system
- Coordinate with internal and offshore engineering and software development teams
- Support development projects and team resources by deploying resources within GCP environments, administering and promoting changes (CI/CD), and overseeing access and security
- Administer and manage data pipeline and scraper requests
- Query databases and data models for ad-hoc requests
- Database maintenance and monitoring, and creation of new databases and table structures
- Process, cleanse, and verify the integrity of structured / unstructured data used for analysis
- Identify rich data sources, join them with other, potentially incomplete data sources, and clean the resulting set
- Present information using data visualization techniques
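The "join incomplete sources and clean the result" responsibility above can be sketched in plain Python; all names, fields, and records here are hypothetical placeholders, not part of the client's actual pipeline:

```python
# Hypothetical sketch: join two incomplete data sources on a shared key
# and cleanse the merged records.
users = {101: {"name": "Ana"}, 102: {"name": "Ben"}, 103: {"name": None}}
orders = [
    {"user_id": 101, "total": "42.50"},
    {"user_id": 103, "total": " 7.00 "},   # user record is incomplete
    {"user_id": 999, "total": "bad"},      # no matching user, malformed total
]

def clean(records, users):
    """Join orders to users, coerce totals to float, drop unusable rows."""
    out = []
    for rec in records:
        user = users.get(rec["user_id"])
        if user is None or user["name"] is None:
            continue  # incomplete: skip rows we cannot attribute to a user
        try:
            total = float(rec["total"].strip())
        except ValueError:
            continue  # cleanse: drop malformed amounts
        out.append({"user": user["name"], "total": total})
    return out

print(clean(orders, users))
```

In a real GCP deployment this logic would more likely live in a Cloud Function or a BigQuery transformation, but the join-then-validate shape is the same.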
Apply Now!