Python Pyspark Developer
Job Description:
Location: Dmart (AVENUE E COMMERCE LIMITED), Delphi Building, Hiranandani Garden, Powai.
Exp: 4-5+ Years
(compensation includes a 7% variable component)
Work Mode: WFO (work from office), all 5 days.
Notice period: immediate joiners, up to 15 days max.
MUST Skills: Python, a framework (Django/Flask), PySpark, Kafka, and a database (RDBMS/NoSQL; any is fine)
Interviews: Round 1 virtual; a face-to-face interview is mandatory afterward.
Education: Only BE/BTech, or BCA + MCA. No education or employment overlaps. No career gaps (if any, not more than 3-6 months).
Role Overview:
We are looking for an experienced Senior Python & Spark Developer with strong expertise in Python (Django, Flask) and PySpark for large-scale data processing. The ideal candidate will be a self-starter who can independently own and deliver end-to-end tasks, while also guiding junior team members.
Key Responsibilities:
- Design, develop, and maintain scalable data-driven and event-driven applications.
- Develop backend services using Python (Django, Flask) and integrate them with distributed data processing pipelines using PySpark.
- Work independently on assigned tasks and take complete ownership from design to deployment.
- Collaborate with cross-functional teams for requirements gathering, design discussions, and delivery planning.
- Guide and mentor junior developers in coding best practices and problem-solving.
- Work with RDBMS and NoSQL databases to design and optimize storage solutions.
- Integrate applications with messaging services like Kafka and MQ for event-driven architectures.
- Ensure smooth deployments through CI/CD pipelines using Jenkins and Docker.
Core Skills & Technologies:
- Languages: Python 3
- Frameworks: Apache Spark (PySpark), Django, Flask
- Databases: PostgreSQL (RDBMS), Cassandra, MongoDB (NoSQL)
- Messaging: Kafka, MQ
- Architecture: Event-Driven, Data-Driven
- CI/CD Tools: Jenkins, Docker
- Monitoring Tools: ELK, Prometheus, Grafana
Good to Have:
- Knowledge of Data Lake and Data Warehouse concepts.
- Exposure to large-scale distributed systems.
- Exposure to any cloud provider (GCP, AWS, Azure).
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
- 5+ years of professional software development experience.
- At least 2 years of hands-on experience with PySpark for big data processing.
- Strong backend development experience with Django and Flask.
- Proven ability to work independently and take end-to-end ownership of tasks.
- Strong problem-solving, communication, and mentoring skills.
Company Profile
Since 2003, we have been making a difference in businesses around the world through profound listening, --- thinking, and innovation.
--- isn’t just an organization, it’s a platform where people thrive, leadership is nurtured, and business is built on a strong foundation of shared responsibility.
Apply Now
- Interested candidates are requested to apply for this job.
- Recruiters will evaluate your candidature and get in touch with you.