Key accountabilities
- Drive technical decisions and lead projects end-to-end from planning, to execution, to deployment.
- Mentor junior engineers on technical topics, engineering best practices, and soft skills.
- Possess an in-depth understanding of data structures and data governance
- Apply fundamental knowledge of modern cloud computing platforms and concepts
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional/non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Architect and design the data platform required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Cloud technologies
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data platform needs
- Able to work in a fast-paced, multi-site, team-oriented environment
- High level of attention to detail
- Strong work ethic and a make-it-happen attitude
- Ability to work and communicate effectively with both internal and external teams.
- Strong organizational and multitasking skills with ability to balance competing priorities
Experience
- Reports to the Head of Data Operations
- Bachelor’s degree in computer science, engineering, mathematics, or a related technical discipline.
- Exceptionally high standards for clean code and architecture.
- 8+ years of experience building large-scale back-end systems.
- 8+ years of strong understanding of database fundamentals, including familiarity with the architecture of at least one common database technology such as SQL Server, Postgres, Cassandra, or MySQL.
- Advanced working knowledge of SQL, including experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
- 2+ years of experience with cloud concepts: virtualization technologies, IaaS, PaaS, SaaS, high availability (HA), distributed systems, and cloud delivery models.
- 6+ years of experience with data warehousing architecture and data modeling.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Experience manipulating, processing, and extracting data from various source systems.
- 2+ years of knowledge of cloud security policies and infrastructure deployments in enterprise-wide environments.
- Experience supporting and working with cross-functional teams in a dynamic environment
- Possesses strong organizational and time management skills, driving tasks to completion.
- Able to work independently with minimum supervision.
- Ability to quickly learn, understand, and work with new emerging technologies, methodologies and solutions.
- Exceptional verbal and written communication skills with the ability to effectively communicate with a diverse group of customers, partners, and colleagues
- Experience with big data tools: Spark, Kafka, etc. and ETL tools: DataStage, Oracle Data Integrator, Talend, Informatica, etc.
- Scripting experience with JavaScript, PySpark, Python, T-SQL, or other similar languages.
- Must be self-directed, highly organized and adaptable with the ability to effectively multi-task as needed
- Familiar with current and emerging technologies and has the willingness to investigate and suggest new technology to meet business needs.
- Proven experience in ETL and data processing, including knowing how to transform data to meet business goals.
- Experience with CI/CD tools and concepts.