Data Engineer - AI Model Development (0-3 yrs)
Dash Hire
posted 3+ weeks ago
Key skills for the job : Data Engineering, Python, Artificial Intelligence
Job Description :
We're looking for an experienced AI and Data Engineer to join our team and play a pivotal role in architecting and building scalable analytics pipelines and sophisticated AI engines. This role involves developing solutions capable of connecting and processing structured and unstructured data, creating actionable insights, and deploying machine learning models to support advanced analytics and AI-driven capabilities across our platform.
The core responsibilities for the job include the following :
Data Pipeline Development :
- Design and build scalable data pipelines capable of ingesting structured and unstructured data from various sources (APIs, databases, IoT sensors, etc.).
- Manage data extraction, transformation, loading (ETL), and integration processes.
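For illustration, a minimal sketch of such an ETL pipeline in Python, assuming a hypothetical JSON API endpoint, a local SQLite target, and illustrative column names (device_id, reading, timestamp) that are not part of the role description:

```python
import pandas as pd
import requests
from sqlalchemy import create_engine

API_URL = "https://example.com/api/sensor-readings"   # hypothetical source API
DB_URL = "sqlite:///analytics.db"                      # hypothetical warehouse

def extract() -> pd.DataFrame:
    """Pull raw JSON records from the source API."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean types, drop incomplete rows, derive a simple feature."""
    df = raw.dropna(subset=["device_id", "reading"])
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    df["reading_zscore"] = (df["reading"] - df["reading"].mean()) / df["reading"].std()
    return df

def load(df: pd.DataFrame) -> None:
    """Append the cleaned records to the warehouse table."""
    engine = create_engine(DB_URL)
    df.to_sql("sensor_readings", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract()))
```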
Data Exploration and Analysis :
- Conduct exploratory data analyses to uncover actionable insights.
- Collaborate with product teams to identify data-driven opportunities and refine business requirements.
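As a hedged example of what such an exploratory pass might look like (reusing the hypothetical sensor_readings table and columns from the pipeline sketch above):

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///analytics.db")       # hypothetical warehouse
df = pd.read_sql("SELECT * FROM sensor_readings", engine)
df["timestamp"] = pd.to_datetime(df["timestamp"])

# Basic profiling: shape, missing values, summary statistics.
print(df.shape)
print(df.isna().mean().sort_values(ascending=False))
print(df.describe())

# One possible actionable cut: per-device daily reading spread, which could
# flag unstable or faulty devices for the product team to investigate.
daily = (
    df.assign(day=df["timestamp"].dt.date)
      .groupby(["device_id", "day"])["reading"]
      .agg(["mean", "std", "count"])
      .reset_index()
)
print(daily.sort_values("std", ascending=False).head(10))
```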
AI Model Development :
- Design, develop, and deploy robust AI and machine learning models tailored to specific business needs (predictive analytics, NLP, computer vision).
- Optimize model performance through rigorous testing, hyperparameter tuning, and validation processes.
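A minimal scikit-learn sketch of the train/tune/validate loop described above; the dataset is a synthetic placeholder, not a real business dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for a real business dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Hyperparameter tuning via cross-validated grid search.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 10, 20]}
search = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=5, scoring="f1")
search.fit(X_train, y_train)

# Held-out validation of the best candidate.
print("Best params:", search.best_params_)
print(classification_report(y_test, search.best_estimator_.predict(X_test)))
```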
MLOps and Model Deployment :
- Leverage modern MLOps frameworks (MLflow, Kubeflow, Docker, Kubernetes) for continuous integration, deployment, monitoring, and updating of models.
- Ensure seamless integration of AI models within business systems through APIs and microservices.
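One way this could look in practice, sketched with MLflow for experiment tracking and FastAPI for exposing a model as a microservice; the experiment name, registry URI, and request schema are assumptions for illustration only:

```python
import mlflow
import mlflow.sklearn
from fastapi import FastAPI
from pydantic import BaseModel

# --- Tracking: log parameters, metrics, and the model artifact with MLflow ---
def log_run(model, params: dict, f1: float) -> None:
    mlflow.set_experiment("churn-model")             # hypothetical experiment name
    with mlflow.start_run():
        mlflow.log_params(params)
        mlflow.log_metric("f1", f1)
        mlflow.sklearn.log_model(model, "model")

# --- Serving: wrap a registered model in a small REST microservice ---
app = FastAPI()

class Features(BaseModel):
    values: list[float]                              # assumed flat feature vector

model = mlflow.sklearn.load_model("models:/churn-model/Production")  # assumed registry URI

@app.post("/predict")
def predict(features: Features) -> dict:
    prediction = model.predict([features.values])[0]
    return {"prediction": int(prediction)}
```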
System Integration and Scalability :
- Ensure data and AI systems are scalable, secure, and high-performing.
- Integrate analytics and AI solutions seamlessly with existing client systems and platforms.
Continuous Improvement :
- Monitor, maintain, and refine data pipelines and AI models to align with evolving business requirements and emerging technologies.
- Stay informed of industry advancements to enhance Fermi Dev's analytics and AI capabilities.
Requirements :
- Bachelor's degree in Computer Science, Data Science, AI, or related fields.
- 0-3 years of experience building data pipelines and deploying AI/ML solutions in a production environment.
- Demonstrable success in designing complex analytics and AI systems.
Technical Skills :
- Expertise in Python and data-centric libraries (TensorFlow, PyTorch, Pandas, Scikit-learn).
- Proficiency in big data tools (Apache Spark, Hadoop, Kafka) and cloud platforms (AWS, Azure, GCP).
- Strong experience in MLOps, containerization, and model lifecycle management.
Analytical and Communication Skills :
- Excellent ability to interpret complex datasets and convert insights into strategic business actions.
- Clear communicator capable of collaborating with interdisciplinary teams.
Preferred Qualifications :
- Prior experience in fintech, healthcare, manufacturing, logistics, or IT sectors.
- Advanced knowledge in NLP, computer vision, or deep learning methodologies.
- Active engagement with the AI and data science community (publications, conferences, open-source contributions).
Functional Areas: Software/Testing/Networking