0xnotme

Lead Blockchain Engineer

As a seasoned Lead Blockchain Engineer specializing in Machine Learning and Cloud Engineering, I have spent over 10 years in tech, developing a diverse skill set across AWS, Google Cloud Platform, Python, Node.js, and the Web3 stack. In my current role building a cutting-edge DeFi platform, I work on secure cross-chain messaging protocols and bring strategic insight into distributed consensus systems and protocol incentives.

My professional journey includes significant achievements at Squidrouter and at Intellify in Sydney, where I led the development of groundbreaking cross-chain technology and transformative big data and machine learning projects. These experiences have honed my technical skills and strengthened my leadership in driving project efficiency and team development.

Career Aspirations:
I am seeking roles that challenge me in blockchain protocol design and implementation, and I am keen to lead a team towards pioneering solutions in blockchain technology. My goal is to make a significant industry impact by innovating on blockchain protocol challenges and providing strategic leadership in a dynamic environment.


Experience: 3 years

Yearly salary: $184,000

Hourly rate: $125

Nationality: 🇩🇪 Germany

Residency: 🇦🇺 Australia


Experience

Principal Data and Machine Learning Architect
Intellify
2021 - 2022
Leadership:
- Actively mentored team members, nurturing a culture of continuous learning and professional growth, which led to a 10% improvement in team engagement and retention rates and fostered a supportive, collaborative work environment.
- Spearheaded regular knowledge-sharing sessions and workshops on the latest data engineering trends, keeping the team's methodologies and skill sets aligned with industry best practices.
- Provided personalized one-on-one mentorship and career development guidance, directly contributing to the advancement of 3 team members to senior roles and strengthening our organizational leadership and technical expertise.

Project: Machine Learning Platform
- Architected and delivered a dockerized machine learning solution, optimizing resource utilization and scalability.
- Implemented AWS Managed Airflow and AWS Batch for automated container management, significantly improving operational efficiency (see the sketch after this entry).
- Automated a forecasting solution to replace an error-prone email-based workflow, improving the accuracy and reliability of our predictive models.

Project: Big Data Platform
- Designed and implemented a Redshift-based data platform, laying the foundation for scalable, efficient big data handling.
- Developed a dbt pipeline to automate data import workloads for critical business reporting, significantly reducing manual effort and error margins and improving reporting accuracy and timeliness.
- Led the design and delivery of a data mart for sophisticated business reporting, providing actionable insights and driving data-driven decision-making across the organization.
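To illustrate the kind of Airflow and AWS Batch orchestration used on the Machine Learning Platform, here is a minimal sketch; the DAG, queue, job-definition, and command names are illustrative placeholders rather than the actual production configuration.

    # Minimal sketch (assumes Airflow 2.x and boto3): a daily DAG that submits a
    # dockerized forecasting job to AWS Batch. All resource names are placeholders.
    from datetime import datetime

    import boto3
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def submit_forecasting_job(**_):
        """Submit the containerized forecasting workload to AWS Batch."""
        batch = boto3.client("batch")
        response = batch.submit_job(
            jobName="forecasting-train",
            jobQueue="ml-platform-queue",          # hypothetical job queue
            jobDefinition="forecasting-train:1",   # hypothetical job definition
            containerOverrides={"command": ["python", "train.py"]},
        )
        return response["jobId"]


    with DAG(
        dag_id="forecasting_pipeline",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="submit_forecasting_job", python_callable=submit_forecasting_job)
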
Engineering Lead - Machine Learning and Cloud Data
Foxtel
2020 - 2021
Overall Responsibilities:
- Spearheaded the data engineering team at Foxtel, focusing on Analytics and Finance.
- Implemented advanced data strategies that transformed data handling and reporting capabilities, leading to a 30% improvement in query runtime efficiency.
- Drove key initiatives for data integration and warehousing, enhancing data accessibility and reliability in support of critical business decisions and financial planning.
- Trained and mentored the data engineering team.
- Collaborated closely with the Analytics and Finance departments to align data engineering efforts with business goals, improving data-driven decision-making and operational efficiency by enabling self-service model deployment.

Project: Viewing Data Insights Data Analytics Platform (React, Looker, BigQuery) (Delivery and Engineering Lead)
- Managed the delivery team and stakeholders, ensuring seamless project execution and communication.
- Designed a robust GCP architecture for the back-end infrastructure, optimizing performance and scalability.
- Implemented automated clustering of customer viewing segments, improving customer data analysis and segmentation accuracy.
- Developed and supported the data engineering behind comprehensive reporting and interactive dashboarding, facilitating better insights and user engagement.

Project: MLOps Pipeline - MLflow-based Automation for Unsupervised Models (Delivery and Engineering Lead)
- Led the design and management of an MLOps pipeline built on MLflow-based automation for unsupervised training and scoring models, streamlining machine learning workflows and improving model accuracy (see the sketch after this entry).

Project: Feature Store for Machine Learning and Data Analytics Platform (Airflow, BigQuery) (Delivery and Engineering Lead)
- Directed the delivery team and stakeholder engagement, ensuring strategic alignment and successful project outcomes.
- Architected a feature store for machine learning models on tabular data, enhancing model performance and data utilization.
- Devised and implemented an Airflow and dbt-based pipeline on GCP with BigQuery as the storage back-end, significantly improving data processing efficiency and scalability.
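To give a flavor of the MLflow-based automation for unsupervised models, here is a minimal sketch; the experiment name, feature file, and cluster count are illustrative assumptions rather than the real pipeline configuration.

    # Minimal sketch: train an unsupervised clustering model and log parameters,
    # metrics, and the model artifact to MLflow. All names below are placeholders.
    import mlflow
    import mlflow.sklearn
    import pandas as pd
    from sklearn.cluster import KMeans

    mlflow.set_experiment("viewing-segments")  # hypothetical experiment name

    features = pd.read_parquet("viewing_features.parquet")  # placeholder feature table

    with mlflow.start_run(run_name="kmeans-segmentation"):
        n_clusters = 8
        model = KMeans(n_clusters=n_clusters, random_state=42).fit(features)

        mlflow.log_param("n_clusters", n_clusters)
        mlflow.log_metric("inertia", model.inertia_)
        mlflow.sklearn.log_model(model, artifact_path="model")

In the production pipeline, a scoring step would typically load the logged model by run ID and write segment assignments back to BigQuery; that part is omitted here.
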
Lead Consultant
DiUS
2017 - 2020
Overall Responsibilities:
As a Lead Consultant, I've strategically steered complex consulting projects, delivering innovative solutions and driving significant client satisfaction. My responsibilities spanned a broad range of activities crucial to the firm's success and growth:
- Talent Acquisition and Team Building: Spearheaded the hiring and onboarding of new talent, selecting individuals with exceptional skills who also align with the company's culture and values. This focus on building a diverse, dynamic team significantly enhanced our collective expertise and collaborative potential.
- Stakeholder Management: Managed relationships with key stakeholders, ensuring their needs and expectations were not just met but exceeded, through regular, transparent communication and by translating complex technical concepts into accessible language, fostering trust and long-term partnerships.
- Internal Learning and Development: Initiated and led internal learning initiatives, including workshops, training sessions, and knowledge-sharing meetups, keeping the team's skills current with industry trends and sustaining an environment of continuous professional growth and innovation.

Project: Datarock - Image Analysis for Mining & Exploration Platform (Technical Project Lead)
- Engineered a production model versioning solution, ensuring streamlined model updates and consistency.
- Constructed a serverless image processing pipeline to run TensorFlow and PyTorch dockerized workloads at scale, significantly improving processing throughput and efficiency.
- Technologies: AWS SageMaker, Fargate, AWS Lambda, TensorFlow, PyTorch, Python

Project: Anglo American - Visual Search on Rock Texture (Technical Project Lead)
- Spearheaded the productization of machine learning workloads using container technology on Azure Cloud, optimizing deployment and scalability.
- Developed an advanced feature extraction system for geological imagery and built a similarity search service using the k-nearest neighbor algorithm on vector data, greatly enhancing data analysis capabilities.
- Technologies: Azure Cloud Storage, Azure Container Instances, Docker, Python

Project: Datarock - Image Analysis for Mining & Exploration Platform (Machine Learning Engineer)
- Implemented instance segmentation on geological images, enhancing detail and accuracy in image analysis.
- Automated and deployed dockerized machine learning models, boosting efficiency and model reliability.
- Conducted model development in TensorFlow using Jupyter Notebooks, advancing our machine learning methodologies.
- Technologies: AWS EC2, Python, TensorFlow, Mask R-CNN

Project: NIB Health Funds - Automated Claim Processing Using Image Recognition (Architect / Data Engineer)
- Conceptualized and developed a serverless claims processing solution, integrating ML and OCR to streamline claims handling.
- Architected a serverless framework for efficient claim processing, leveraging AWS Lambda and an OCR solution based on Tesseract (see the sketch after this entry).
- Successfully productionized text extraction for key claim information, markedly improving the accuracy and speed of claim processing.
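To illustrate the Lambda-plus-Tesseract OCR step from the NIB claims project, here is a minimal sketch; the S3 event trigger and response shape are standard, but the bucket handling is simplified and downstream claim processing is omitted.

    # Minimal sketch: an AWS Lambda handler that pulls an uploaded claim image from S3
    # and extracts its text with Tesseract (via pytesseract). Error handling and
    # key-field parsing are left out.
    import io

    import boto3
    import pytesseract
    from PIL import Image

    s3 = boto3.client("s3")


    def handler(event, context):
        """Extract raw text from a newly uploaded claim image."""
        record = event["Records"][0]["s3"]  # assumes an S3 put-event trigger
        bucket = record["bucket"]["name"]
        key = record["object"]["key"]

        obj = s3.get_object(Bucket=bucket, Key=key)
        image = Image.open(io.BytesIO(obj["Body"].read()))

        text = pytesseract.image_to_string(image)
        return {"bucket": bucket, "key": key, "text": text}
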

Skills

architecture
aws
machine-learning
node
python
tech-lead
solidity
english
german