A client wants to improve the operability and performance of its data services across multiple IT platforms: a data lake, a data mart, a data warehouse, a BI environment, and Data Science. Although these platforms are mature and already in production, they require orchestration tooling to ensure operational performance.
Mission: In collaboration with the data architect, data experts, and the product manager/owner, you will help operate the orchestration platform, built on Apache Airflow, and implement its technical roadmap. You will develop and provide expert support for platform features and DevOps tooling around the product, with a focus on observability, performance management, and continuous improvement of the orchestration service.
Activities:
- Participate in the development cycle of the Airflow product.
- Implement/develop tooling and platform features as required, in collaboration with the Data Architect and data experts.
- Contribute to the development of the Directed Acyclic Graphs (DAGs) needed for the smooth operation of the data lake and data mart platforms (AWS and Snowflake).
- Provide technical support to end-users during platform usage.
- Ensure code delivery and deployment.
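For illustration, a DAG of the kind described above might resemble the following minimal sketch. All names (DAG id, task ids, schedule) are hypothetical, the task bodies are placeholders, and it assumes Apache Airflow 2.x; real DAGs for these platforms would typically use the AWS and Snowflake provider operators instead of plain Python callables:

```python
# Minimal sketch of an Airflow 2.x DAG (hypothetical names; placeholder tasks).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_datalake():
    # Placeholder: land raw data in the data lake (e.g., S3).
    pass


def load_to_datamart():
    # Placeholder: load curated data into the data mart (e.g., Snowflake).
    pass


with DAG(
    dag_id="datalake_to_datamart",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_datalake)
    load = PythonOperator(task_id="load", python_callable=load_to_datamart)

    extract >> load  # "load" runs only after "extract" succeeds
```

The `>>` operator declares the task dependency that makes the graph acyclic and directed, which is what the scheduler uses to order execution.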
Operate the platform in production (run mode):
- Provide Level 2 expert support for production and participate in crisis management.
- Contribute to the product Knowledge Base.
- Report IT performance, real-time health status, and key operational risks to the Airflow platform manager.
Contribute to the Data Orchestration community:
- Act as a Key User for platform users: BI, business projects, Data Science, DataLake, etc.
- Participate in regular meetings with users to exchange best practices in usage and development.
Transfer knowledge to the offshore support team:
- Produce necessary documentation for knowledge transfer: tutorials, operation documents, etc.
- Conduct knowledge transfer sessions through product demos.
- Support the offshore support team in initial developments.
- Assist the support team in incident/problem resolution.
Deliverables:
- Code and production deployment of DAGs in the product.
- Unit test reports for implemented DAGs.
- Contribution to platform evolution proposal documents.
- Implementation schedule for deliveries/roadmap via JIRA.
- Documentation of development best practices in Confluence.
- Updates to incident and request tickets in JIRA.
- Support Knowledge Base (in ServiceNow and Confluence).
- Committee support materials and meeting summaries.
Required Skills:
- Cloud/data engineering on AWS.
- Process orchestration: Airflow.
- ETL/ELT principles: dbt.
- Agile Scrum/Kanban methodology.
- Knowledge of the data management ecosystem (Data Engineer profile).
- DevOps concepts.
- Observability of Information Systems.
- English (required).
- French (highly appreciated).
- Soft skills:
  - Proactive.
  - Autonomous.
  - Dynamic.
  - Collaborative/communicative (an asset).
- Knowledge of the following tools:
  - Airflow (mandatory).
  - AWS.
  - Snowflake.
  - dbt.
  - JIRA/Confluence.
  - Jenkins.
  - GitLab.
  - Terraform.
  - Kubernetes/containers.
  - Elasticsearch/Kibana (an asset).