Experience: 5+ Years
Type: Full Time
Location: Bhubaneswar
Notice Period: Immediate or within 15 days
Budget: 10-13 LPA
Technology: IT
Mandatory Skills:
- Python
- Data analytics using Pandas DataFrames
- AWS
Good to Have Skills:
- Experience with additional AWS services
Responsibilities:
- Develop and maintain data analytics solutions using Python, with a strong emphasis on Pandas for data manipulation and analysis.
- Design, implement, and optimize data pipelines on AWS to support various data processing and analytics tasks.
- Collaborate with cross-functional teams to understand data requirements and deliver effective analytics solutions.
- Work with large datasets to perform data cleaning, transformation, and analysis, ensuring data quality and consistency (a brief illustrative sketch follows this list).
- Implement best practices for data storage, retrieval, and security on AWS.
- Develop, test, and deploy scalable and efficient code to process and analyze data.
- Automate data workflows and processes to enhance efficiency and reduce manual effort.
- Stay updated with the latest industry trends and technologies in data analytics and cloud computing to continuously improve solutions.
- Troubleshoot and resolve issues related to data processing and analytics pipelines.
- Provide technical guidance and support to junior developers and other team members.
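As a rough illustration of the Pandas-based cleaning and transformation work described above, a minimal sketch follows; the file name and column names are hypothetical and stand in for whatever datasets the role actually involves.

```python
import pandas as pd

# Load a raw dataset (file name and columns are hypothetical).
df = pd.read_csv("sales_raw.csv", parse_dates=["order_date"])

# Basic cleaning: drop exact duplicates and rows missing key fields.
df = df.drop_duplicates()
df = df.dropna(subset=["order_id", "amount"])

# Normalise types and text columns for consistency.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df["region"] = df["region"].str.strip().str.title()

# Simple transformation/analysis: monthly revenue per region.
monthly = (
    df.groupby([pd.Grouper(key="order_date", freq="MS"), "region"])["amount"]
      .sum()
      .reset_index(name="revenue")
)

print(monthly.head())
```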
Qualifications:
- Proven experience as a Python Developer, with a focus on data analytics.
- Strong proficiency in using Pandas for data manipulation and analysis.
- Hands-on experience with AWS, including services such as S3, Lambda, EC2, RDS, and others (a short S3 sketch follows this list).
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment and communicate effectively with technical and non-technical stakeholders.
- Strong understanding of data processing, ETL pipelines, and data warehousing concepts.
- Familiarity with best practices for code versioning, testing, and deployment.
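As a rough illustration of the hands-on AWS work mentioned above, a minimal sketch of reading a CSV from S3 into Pandas and writing the processed result back follows; the bucket and key names are hypothetical placeholders.

```python
import io

import boto3
import pandas as pd

# Bucket and object keys are hypothetical placeholders.
BUCKET = "example-analytics-bucket"
RAW_KEY = "raw/sales_raw.csv"
PROCESSED_KEY = "processed/sales_clean.csv"

s3 = boto3.client("s3")

# Read a CSV object from S3 into a DataFrame.
obj = s3.get_object(Bucket=BUCKET, Key=RAW_KEY)
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# ... clean and transform the data with Pandas as needed ...

# Write the processed result back to S3.
buffer = io.StringIO()
df.to_csv(buffer, index=False)
s3.put_object(Bucket=BUCKET, Key=PROCESSED_KEY, Body=buffer.getvalue())
```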
Nice to Have:
- Experience with additional AWS services and capabilities.
- Knowledge of machine learning frameworks and libraries.
- Familiarity with other data processing and analytics tools and technologies.
- Understanding of containerization and orchestration tools like Docker and Kubernetes.