✨ About The Role
- As a Data Platform Engineer, you will be responsible for designing, developing, maintaining, and troubleshooting ETL processes.
- You will implement data processing pipelines using Kafka for real-time data streaming.
- The role involves collaborating with product managers, data analysts, and other stakeholders to gather requirements and translate them into technical specifications.
- You will oversee code reviews and ensure best practices in coding and data handling.
- Staying up-to-date with emerging trends and technologies in big data is crucial for recommending improvements to architecture and processes.
⚡ Requirements
- The ideal candidate will have a bachelor's degree in Computer Science, Engineering, or a related field, with a master's degree preferred.
- A minimum of 5 years of software engineering experience is required.
- Proficiency in Node.js and Java programming, along with experience in related frameworks, is essential.
- Strong analytical skills and problem-solving abilities are necessary for success in this role.
- A solid understanding of CI/CD principles and experience with APIs and SDKs is important.