Job Title: Senior Engineer – Python, Data Engineering & Backend
Experience: 3–4+ years
Location: Hyderabad (WFO)
About the Role:
We are looking for a Senior Engineer with strong expertise in Python, data engineering, and backend technologies, who can design and implement scalable architectures while working closely with stakeholders and clients. This role requires proficiency in cloud infrastructure across AWS and Azure, and the ability to bridge technical execution with business needs.
Key Responsibilities:
Design, develop, and optimize backend systems and data pipelines using Python and modern data engineering tools.
Architect and implement scalable, reliable, and secure systems capable of handling large volumes of data and traffic.
Work with AWS and Azure to deploy, manage, and optimize cloud-based solutions.
Collaborate with stakeholders and clients to gather requirements, define scope, and break down tasks into actionable deliverables.
Translate complex business needs into technical solutions and ensure alignment with overall product goals.
Lead and mentor junior team members when needed, fostering best practices in code quality and architecture.
Participate in Agile ceremonies, ensuring smooth sprint planning, backlog grooming, and delivery tracking.
Troubleshoot, debug, and resolve performance bottlenecks across systems.
Required Skills & Experience:
3–4+ years of professional experience in backend development and data engineering.
Strong Python programming skills, with experience in frameworks like FastAPI, Django, or Flask.
Solid understanding of data pipelines, ETL processes, and database systems (SQL and NoSQL).
Proven experience in designing and implementing scalable architectures.
Hands-on experience with cloud services: AWS, Azure, and GCP (at least two in depth).
Knowledge of containerization (Docker, Kubernetes) and CI/CD workflows.
Strong communication skills to interact with stakeholders, clients, and cross-functional teams.
Experience in requirement gathering, scope definition, and task breakdown.
Familiarity with Agile/Scrum methodologies.
Nice-to-Have:
Experience with big data frameworks (Spark, Beam, etc.).
Exposure to microservices architecture.
Knowledge of infrastructure-as-code tools like Terraform or CloudFormation.
Why Join Us?
Work on impactful projects with diverse data and AI challenges.
Collaborate with a cross-functional, highly skilled team.
Opportunity to work across multiple cloud platforms and modern tech stacks.