Senior Java Developer (Data Integration & Distributed Systems)
Location: Hyderabad, Telangana, India
Experience: 5+ Years
Employment Type: Full-Time
About the Role:
We are seeking an experienced Senior Java Developer with expertise in data integration and distributed systems to join our innovative team in Hyderabad. The ideal candidate will have a strong background in building scalable, secure microservices and managing data pipelines using modern cloud technologies. If you are passionate about designing robust solutions and mentoring others, this is your opportunity to make an impact!
Core Responsibilities:
- Design and develop robust data integration solutions and enterprise integration patterns using Apache Kafka and Apache Camel.
- Implement message routing, transformation, and mediation using Apache Camel DSL.
- Build and maintain secure RESTful web services using Spring Framework.
- Implement Change Data Capture (CDC) patterns for real-time data synchronization.
- Design and maintain Avro schemas for data serialization and evolution.
- Implement and optimize microservices architecture patterns.
- Create and maintain CI/CD pipelines for automated testing and deployment.
- Write Infrastructure as Code using Terraform for AWS resource provisioning.
- Containerize applications using Docker and manage deployments on Amazon EKS.
- Collaborate with cross-functional teams to design and implement scalable solutions.
- Mentor junior developers and contribute to technical decision-making.
Required Technical Skills:
- 5+ years of experience in Java development with strong proficiency in Java 8+.
- Apache Camel Expertise:
- Enterprise Integration Patterns implementation.
- Camel DSL (Java, XML, and YAML).
- Component development and customization.
- Route testing and debugging.
- Performance tuning and optimization.
- Integration with Spring Boot.
- Message transformation and routing.
- Error handling and monitoring.
- Deep understanding of Spring Framework (Spring Boot, Spring Security, Spring Cloud).
- Extensive experience with Apache Kafka for building event-driven architectures.
- Experience in Change Data Capture (CDC) tools and patterns.
- Proficiency with Schema Registry (Apache Avro) and data serialization.
- Strong knowledge of RESTful API design and implementation.
- Hands-on experience with:
- AWS services and cloud architecture patterns.
- Infrastructure as Code using Terraform.
- Docker containerization and Kubernetes (Amazon EKS).
- CI/CD tools and methodologies.
- Git version control and branching strategies.
Preferred Qualifications:
- Experience with:
- Kafka Streams and KSQL.
- Broader Spring ecosystem libraries.
- AWS service mesh implementations.
- Monitoring and observability tools for metrics collection and visualization (e.g., Prometheus, Grafana).
- Test-driven development (TDD).
- Confluent Platform and its components.
- Code quality measurement and improvement tools such as SonarQube, and peer code review via GitHub pull request workflows.
- Knowledge of:
- Microservices security patterns.
- OAuth 2.0 and JWT authentication.
- Event sourcing, CQRS, and other distributed systems design patterns.
- Data governance and compliance requirements.
Location:
- Candidates must be based in Hyderabad.
Additional Notes:
- Immediate joiners preferred.
- Please include a link to your portfolio or GitHub profile showcasing relevant projects.
- Shortlisted candidates will be required to complete a technical assessment.
How to Apply:
Interested candidates are requested to submit their updated resume through the application form on our website, including the following details:
- Current CTC
- Expected CTC
- Notice Period
- Links to relevant projects or portfolio
Join our team and contribute to building cutting-edge, scalable solutions that drive innovation!
We are an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.