Responsible for designing, developing, and supporting real-time streaming data pipelines using Ab Initio Continuous Flows.
Design and develop Ab Initio Continuous Flow graphs to process real-time and near-real-time streaming data.
Implement event-driven data pipelines that ingest data from sources such as Kafka, MQ, sockets, and file streams, and deliver it to downstream targets.
Knowledge of integrating downstream databases, APIs, and data warehouses.
Experience implementing high availability, resilience, and fault tolerance in continuous flows.
Expertise in Ab Initio GDE, EME (Technical Repository), Conduct>It, and the Ab Initio web interface.
Working experience with Oracle, Teradata, or any database appliance preferred.
Knowledge of Kafka architecture, Kafka Streams (KStreams), and KSQL.
Knowledge of publish/subscribe-based ETL models.
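The publish/subscribe ETL model above can be sketched in miniature. This is an illustrative, in-memory example only (the broker class, topic names, and transform logic are all made up for clarity, not Ab Initio or Kafka APIs): a transform step subscribes to a raw topic and republishes enriched records to a downstream topic, decoupling producers from consumers.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class MiniBroker:
    """Toy in-memory publish/subscribe broker (illustration of the pattern only)."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Register a handler; the publisher never needs to know who consumes.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # Fan the message out to every subscriber of the topic.
        for handler in self._subscribers[topic]:
            handler(message)

broker = MiniBroker()
loaded: List[dict] = []  # stand-in for a database/warehouse load step

def transform_and_load(msg: dict) -> None:
    # The "T" of ETL: enrich the record, then republish it downstream.
    enriched = {**msg, "amount_usd": msg["amount"] * 1.0}  # placeholder FX rate
    broker.publish("orders.enriched", enriched)

broker.subscribe("orders.raw", transform_and_load)
broker.subscribe("orders.enriched", loaded.append)

broker.publish("orders.raw", {"order_id": 1, "amount": 42})
```

In a real continuous flow the broker would be Kafka or MQ rather than an in-process object, but the decoupling shown here is the core of the model.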
Key Responsibilities
Interact with business users and analysts to gather requirements
Analyze, design, develop and implement new applications and solutions
Participate in project meetings with all stakeholders
Identify opportunities for performance improvement and implement them
Develop and use reusable tools
Groom junior associates and contribute to their all-round growth
CI/CD concepts and the ability to create pipelines in Jenkins, TeamCity, and Ab Initio EME
Coding: Python, Java
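The CI/CD requirement above can be illustrated with a minimal declarative Jenkins pipeline. This is a sketch under assumptions: the stage names and the `./build.sh` / `./deploy.sh` scripts are placeholders, not a prescribed Ab Initio deployment flow.

```groovy
// Minimal declarative Jenkinsfile sketch (stage names and scripts are hypothetical).
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm   // pull the graph/project sources from SCM
            }
        }
        stage('Build') {
            steps {
                sh './build.sh'   // placeholder for the project's build step
            }
        }
        stage('Deploy') {
            steps {
                sh './deploy.sh'  // placeholder for promotion to the target environment
            }
        }
    }
}
```

A TeamCity build chain or EME promotion script would express the same checkout-build-deploy stages in its own configuration format.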