✨ About The Role
- As a Data Platform Engineer, you will be responsible for designing, developing, and maintaining big data architecture.
- You will implement data processing pipelines using Kafka for real-time data streaming (see the sketch after this list).
- You will optimize and manage search capabilities using Elastic technologies.
- You will collaborate with product managers, data analysts, and other stakeholders to gather requirements and translate them into technical specifications.
- You will oversee code reviews and ensure that best practices in coding and data handling are followed.
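
To illustrate the kind of streaming work this role involves, here is a minimal sketch of a Kafka consumer in Java (the primary language listed in the requirements). The broker address, consumer group, and topic name are illustrative placeholders, not details from this posting.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class EventStreamConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed broker address and consumer group; replace with real cluster settings.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "data-platform-pipeline");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // "events" is a hypothetical topic name used only for this sketch.
            consumer.subscribe(List.of("events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Placeholder for pipeline logic: transform, enrich, or index each event.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```

In a real pipeline, the placeholder processing step would hand each record to downstream stages, for example transformation jobs or indexing into Elasticsearch.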
⚡ Requirements
- The ideal candidate will have a Bachelor's degree in Computer Science, Engineering, or a related field, with a Master's degree preferred.
- A minimum of 5 years of software engineering experience is required, with at least 3 years focused on big data technologies.
- Proficiency in Java programming and experience with related frameworks are essential for success in this role.
- Strong analytical skills and excellent problem-solving abilities are crucial for troubleshooting and optimizing data systems.
- A solid understanding of CI/CD principles and experience working with APIs and SDKs will be beneficial.