Data Engineer Lead

Job Type:

Full time

Location:

Remote

As part of the Data and Analytics division, you will work with a team of talented engineers who drive data engineering, data platforms, analytics, and reporting/visualization for stakeholders to improve the company's efficiency, customer experience, revenue, and profitability. You will analyze financial data points from different systems, build data pipelines, and transform data into actionable intelligence. You will build data/insight products for use cases such as fraud checks, credit scoring, and transactional anomaly detection.


At Ayoconnect, we are on a multi-year journey to industrialize and monetize our data/insight products and analytics capabilities in the fintech space. In this role, you will:

  • Gather and translate user requirements, assess gaps, build roadmaps/architectures, and develop end-to-end data pipelines.

  • Work with analysts on large, complex datasets to build data products, ensuring data quality and availability.

  • Identify gaps and implement solutions for data security, quality, reliability, monitoring, and process automation.

  • Collaborate with cross-functional teams to source new data, develop schema requirements, and maintain metadata.

  • Design the right architecture for the complex business problems we are solving.

  • Build applications that follow coding standards, with appropriate unit tests, integration tests, and deployment scripts.

  • Collaborate with leads to explore existing systems, determine areas of complexity and potential risks to successful implementation, and learn the application capabilities and overall business flow.

  • Analyze existing data pipelines and SQL queries for performance improvements.

  • Take ownership of assigned tasks and execute them in an agile, ambiguous environment.

  • Contribute to continual improvement by suggesting enhancements to the user interface or software architecture, or by proposing new technology.

  • Mentor the team and be ready to shoulder ad-hoc responsibilities as needed.

Job Requirements

  • A degree in Computer Science, Information Technology, or Computer Engineering.

  • Four or more years of experience creating end-to-end data engineering pipelines using cloud technology.

  • GCP certification in data engineering, or any other relevant certification.

  • Strong database querying skills, including advanced SQL and stored procedures, on MySQL, MariaDB, Oracle, MongoDB, etc.

  • Knowledge of Big Data and AI/ML architectures, solutions, frameworks, and trends, with the ability to design an architecture and facilitate its development.

  • Exposure to and hands-on experience with NoSQL databases such as Redis, Cassandra, etc.

  • Two or more years of strong Python experience building data pipelines.

  • Experience designing, building, and deploying production-level data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, NiFi, Oozie, Splunk, etc.).

  • Experience with streaming technologies such as Spark Streaming, IBM Streams, Flink, Dataflow, etc.

  • Experience with messaging technologies such as Kafka, Pulsar, RabbitMQ, etc.

  • Experience building monitoring, logging, and tracing for data pipelines/platforms using open-source technology.

  • Experience leveraging and managing CI/CD toolchain products such as Jira, Stash, Git, Bitbucket, Artifactory, and Jenkins in data engineering projects.

  • Exposure to and hands-on experience with GCP tools and capabilities for data engineering.

  • Experience with DevOps methodologies.

  • Experience using GCP Cloud Functions, Cloud Run, Memorystore, APIs, etc.

  • Experience in agile projects and familiarity with the agile practices and tools used in development (Jira, Confluence, etc.).

  • Hands-on contributor to the architecture, design, and development of projects.

  • Good communication and interpersonal skills.

  • Able to clearly articulate ideas and influence stakeholders through strong presentation, interpersonal, verbal, and written skills.

  • Ability to work in an agile, ambiguous environment and effectively juggle competing priorities.

  • Willingness to proactively go the extra mile with minimal assistance.

Language Requirement:

English & Bahasa Indonesia

Work Timezone:

Remote
