Altysys - Data Engineer - Python/PySpark (3-6 yrs)
posted 3+ weeks ago
Flexible timing
Job Title : Data Engineer
Location : Pune / Gurgaon / Hyderabad / Bhopal / Indore, or Remote
Employment Type : Full-time / Contract
Experience Level : 3-6 years
No. of Positions : 8
Job Summary :
We are seeking a skilled and detail-oriented Data Engineer to join our growing team. The ideal candidate will have strong experience working with large-scale data pipelines and possess expertise in Python, PySpark, SQL, Spark SQL, and Databricks. You will play a key role in designing, building, and optimizing scalable data solutions that power analytics and business insights.
Key Responsibilities :
- Design, develop, and maintain robust and scalable data pipelines using PySpark and SQL.
- Work on data extraction, transformation, and loading (ETL) from a wide variety of data sources.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Optimize data workflows for performance, scalability, and reliability in Databricks and Spark environments.
- Implement data quality and validation checks to ensure data integrity.
- Participate in code reviews and contribute to the continuous improvement of engineering practices.
Required Skills & Qualifications :
- Proficiency in Python and PySpark for building data pipelines.
- Strong understanding of SQL and Spark SQL for querying and manipulating data.
- Hands-on experience with Databricks and Spark-based distributed processing.
- Familiarity with cloud-based data platforms (e.g., Azure, AWS, or GCP) is a plus.
- Solid understanding of data warehousing concepts and best practices.
- Strong problem-solving and communication skills.
Preferred Qualifications :
- Experience with Azure cloud services, including Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, and Azure Machine Learning.
- Familiarity with Microsoft Fabric for modern data architecture.
- Experience with orchestration tools such as Apache Airflow.
- Exposure to Airbyte and dbt (Data Build Tool) for data integration and transformation.
- Hands-on experience with Python, SQL, Spark, and Databricks in enterprise environments.
- Experience working with Delta Lake or similar modern data lake architectures.
- Knowledge of CI/CD practices and data governance standards.
Functional Areas: Software/Testing/Networking