Data Architect - AWS/Python (8-14 yrs)
Move Forward Technologies
Posted 2 weeks ago
Key skills for the job
Mandatory Skills:
PostgreSQL, Redis, Apache Iceberg, Graph/Vector Databases, AWS Cloud Platform (including AWS services such as S3, EC2, RDS, Lambda, and EKS), Languages: Python or Golang
We are seeking a highly experienced Data Architect to design and implement robust, scalable, and high-performance data systems using modern technologies.
The ideal candidate should have deep expertise in database architecture, distributed systems, and data modeling, along with hands-on proficiency in Python or Golang.
You will be responsible for defining the architecture for structured, semi-structured, and vector-based data platforms across AWS infrastructure.
Responsibilities:
1. Data Architecture & Modeling:
a. Design normalized and denormalized data models for transactional and analytical workloads.
b. Define strategies for schema evolution and data versioning using Apache Iceberg or similar (see the Iceberg sketch after this list).
2. Database Systems Expertise:
a. Architect solutions using PostgreSQL for OLTP and analytical use cases.
b. Implement Redis for low-latency caching, real-time analytics, and streaming pipelines (see the cache-aside sketch after this list).
c. Design high-performance access patterns for relational, key-value, and time-series data.
3. Vector and Graph Data Architecture:
a. Integrate and manage graph/vector databases for AI/ML and recommendation engines.
b. Optimize vector search performance using indexes and hybrid search strategies (see the FAISS sketch after this list).
4. Cloud Infrastructure (AWS):
a. Deploy and scale data systems using AWS services such as S3, EC2, RDS, Lambda, and EKS.
b. Ensure high availability, durability, and disaster recovery of data infrastructure.
5. Data Governance & Security:
a. Implement best practices for data quality, lineage, privacy, and role-based access.
b. Establish backup, archival, and retention strategies for critical datasets.
6. Programming & Scripting:
a. Develop data ingestion, transformation, and orchestration workflows in Python or Golang (see the Airflow sketch after this list).
b. Write reusable modules and APIs for data access, quality checks, and job orchestration.
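As a rough illustration of the schema-evolution work in responsibility 1b, the sketch below adds a column to an Iceberg table with the pyiceberg client. The catalog name, table identifier, and new column are hypothetical placeholders, and a configured pyiceberg catalog is assumed.

```python
from pyiceberg.catalog import load_catalog
from pyiceberg.types import StringType

# "prod" and "analytics.events" are hypothetical names; assumes a catalog
# is already configured for pyiceberg (e.g., via ~/.pyiceberg.yaml).
catalog = load_catalog("prod")
table = catalog.load_table("analytics.events")

# Iceberg records the change as table metadata, so existing data files stay
# valid; readers resolve columns by field ID rather than by position.
with table.update_schema() as update:
    update.add_column("referrer", StringType(), doc="HTTP referrer, nullable")
```

Because the change is metadata-only, it avoids the full rewrite that the same operation would force on a plain Parquet layout.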
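For the low-latency caching in responsibility 2b, a common pattern is cache-aside with a TTL. This sketch uses redis-py; the endpoint, key scheme, and the stubbed database loader are all hypothetical.

```python
import json

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_profile_from_postgres(user_id: int) -> dict:
    # Stand-in for a real PostgreSQL query.
    return {"id": user_id, "name": "example"}

def get_user_profile(user_id: int) -> dict:
    """Serve from cache when possible; otherwise load, cache with a TTL, return."""
    key = f"user:profile:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    profile = load_profile_from_postgres(user_id)
    r.set(key, json.dumps(profile), ex=300)  # 5-minute TTL bounds staleness
    return profile
```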
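Responsibility 3b calls for index-driven vector search. The sketch below contrasts an exact FAISS index with an IVF index that probes only a few clusters per query; the dimension, corpus size, and nlist/nprobe values are illustrative, not tuned recommendations.

```python
import faiss
import numpy as np

d, n = 128, 10_000
rng = np.random.default_rng(0)
corpus = rng.random((n, d)).astype("float32")
queries = rng.random((5, d)).astype("float32")

# Exact (brute-force) baseline.
flat = faiss.IndexFlatL2(d)
flat.add(corpus)
exact_dist, exact_ids = flat.search(queries, 5)

# IVF index: cluster the corpus into 100 cells, then scan only 8 cells per
# query, trading a little recall for much lower latency at scale.
quantizer = faiss.IndexFlatL2(d)
ivf = faiss.IndexIVFFlat(quantizer, d, 100)
ivf.train(corpus)
ivf.add(corpus)
ivf.nprobe = 8
approx_dist, approx_ids = ivf.search(queries, 5)
```

A hybrid strategy would then merge these vector hits with keyword or metadata filters before final ranking.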
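Responsibility 6a pairs naturally with an orchestrator. Below is a minimal Airflow DAG (assuming a recent 2.x release, where the schedule argument replaced schedule_interval) that wires an ingestion step to a quality check; the DAG id and task bodies are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest() -> None:
    print("pull raw files from object storage")  # stand-in for real ingestion

def quality_check() -> None:
    print("validate row counts and null rates")  # stand-in for real checks

with DAG(
    dag_id="daily_ingest",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    check_task = PythonOperator(task_id="quality_check", python_callable=quality_check)
    ingest_task >> check_task  # quality check runs only after ingestion succeeds
```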
Requirements:
1. Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
2. 8-14 years of industry experience in data engineering, architecture, or backend system design.
3. Proven experience designing and implementing data platforms at scale on cloud infrastructure (preferably AWS).
4. Expertise in PostgreSQL: schema design, indexing, and performance tuning.
5. Experience with Redis for caching, session stores, or message queues.
6. Practical experience with Apache Iceberg for large-scale data storage and schema evolution.
7. Understanding of vector databases (e.g., FAISS, Weaviate, Pinecone).
8. Experience with graph databases (e.g., Neo4j, Amazon Neptune).
9. Proficiency in Python or Golang for data ingestion, scripting, and automation.
10. Familiarity with CI/CD pipelines, monitoring, and infrastructure-as-code (e.g., Terraform, CloudFormation).
11. Ability to design batch and streaming pipelines using tools such as Apache Kafka, Airflow, or AWS Glue.
12. Experience implementing encryption, data masking, and access controls.
13. Familiarity with GDPR, HIPAA, or similar data compliance standards.
14. Proficiency in query optimization, data partitioning, indexing strategies, and caching mechanisms (see the partitioning sketch after this list).
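As a sketch of the partitioning and indexing skills in item 14, the DDL below creates a range-partitioned events table with a composite index on one partition, executed through psycopg2. The DSN, table, and column names are hypothetical.

```python
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS events (
    event_id    bigint      NOT NULL,
    user_id     bigint      NOT NULL,
    occurred_at timestamptz NOT NULL,
    payload     jsonb
) PARTITION BY RANGE (occurred_at);

CREATE TABLE IF NOT EXISTS events_2025_01
    PARTITION OF events
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');

-- Composite index keeps per-user time-range scans on the hot partition cheap.
CREATE INDEX IF NOT EXISTS idx_events_2025_01_user_time
    ON events_2025_01 (user_id, occurred_at);
"""

with psycopg2.connect("dbname=analytics user=etl") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute(DDL)
```

Partitioning by time lets retention jobs drop whole partitions instead of running bulk DELETEs, which also keeps each partition's indexes small.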
Functional Areas: Software/Testing/Networking