Data Engineer - Power BI/Tableau (4-8 yrs)
ALIQAN Technologies
posted 4 days ago
Job Title : Data Engineer (Microsoft Fabric & Lakehouse)
Location : Hybrid - Bangalore, India
Experience : 2 to 5 Years
Joining : Immediate
Hiring Process : One interview + One case study round
About the Role :
We are looking for a skilled Data Engineer with 2-5 years of experience to join our team. The ideal candidate will design and develop scalable, reusable, and efficient data pipelines using modern data engineering platforms such as Microsoft Fabric, PySpark, and Data Lakehouse architectures.
You will play a key role in integrating data from diverse sources, transforming it into actionable insights, and ensuring high standards of data governance and quality. This role requires a strong understanding of modern data architectures, pipeline observability, and performance optimization.
Key Responsibilities :
- Design and build robust data pipelines using Microsoft Fabric components including Pipelines, Notebooks (PySpark), Dataflows, and Lakehouse architecture.
- Ingest and transform data from a variety of sources such as cloud platforms (Azure, AWS), on-prem databases, SaaS platforms (e.g., Salesforce, Workday), and REST/OpenAPI-based APIs.
- Develop and maintain semantic models and define standardized KPIs for reporting and analytics in Power BI or equivalent BI tools.
- Implement and manage Delta Tables across bronze/silver/gold layers using Lakehouse medallion architecture within OneLake or equivalent environments.
- Apply metadata-driven design principles to support pipeline parameterization, reusability, and scalability.
- Monitor, debug, and optimize pipeline performance; implement logging, alerting, and observability mechanisms.
- Establish and enforce data governance policies including schema versioning, data lineage tracking, role-based access control (RBAC), and audit trail mechanisms.
- Perform data quality checks including null detection, duplicate handling, schema drift management, outlier identification, and Slowly Changing Dimensions (SCD) type management.
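To illustrate the kind of data-quality checks listed above (null detection and duplicate handling), here is a minimal plain-Python sketch over hypothetical records. The records, field names, and helper functions are illustrative assumptions, not part of this role's actual pipeline, which would typically use PySpark DataFrames instead.

```python
# Hypothetical sample records; one has a null field, one is an exact duplicate.
records = [
    {"id": 1, "name": "Asha", "salary": 50000},
    {"id": 2, "name": None, "salary": 60000},    # null 'name'
    {"id": 1, "name": "Asha", "salary": 50000},  # duplicate row
]

def find_nulls(rows):
    """Return (row_index, field) pairs where a value is missing."""
    return [(i, k) for i, row in enumerate(rows)
            for k, v in row.items() if v is None]

def drop_duplicates(rows):
    """Keep the first occurrence of each distinct row."""
    seen, out = set(), []
    for row in rows:
        key = tuple(sorted(row.items()))  # hashable row fingerprint
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out

print(find_nulls(records))            # [(1, 'name')]
print(len(drop_duplicates(records)))  # 2
```

In a PySpark pipeline the same checks would usually map onto DataFrame operations such as filtering on null columns and `dropDuplicates`, applied between the bronze and silver layers of the medallion architecture.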
Required Skills & Qualifications :
- 2 - 5 years of hands-on experience in Data Engineering or related fields.
- Solid understanding of data lake/lakehouse architectures, preferably with Microsoft Fabric or equivalent tools (e.g., Databricks, Snowflake, Azure Synapse).
- Strong experience with PySpark, SQL, and working with dataflows and notebooks.
- Exposure to BI tools like Power BI, Tableau, or equivalent for data consumption layers.
- Experience with Delta Lake or similar transactional storage layers.
- Familiarity with data ingestion from SaaS applications, APIs, and enterprise databases.
- Understanding of data governance, lineage, and RBAC principles.
- Strong analytical, problem-solving, and communication skills.
Nice to Have :
- Prior experience with Microsoft Fabric and OneLake platform.
- Knowledge of CI/CD practices in data engineering.
- Experience implementing monitoring/alerting tools for data pipelines.
Why Join Us ?
- Opportunity to work on cutting-edge data engineering solutions.
- Fast-paced, collaborative environment with a focus on innovation and learning.
- Exposure to end-to-end data product development and deployment cycles.
Functional Areas: Software/Testing/Networking