I applied via Company Website and was interviewed in Dec 2024. There were 4 interview rounds.
The coding round was a Python challenge: the first question involved string processing and the second involved matrix processing (an illustrative sketch follows).
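The actual problems are not described, so the sketch below only illustrates the kind of string- and matrix-processing tasks such a round might cover; reverse_words and transpose are hypothetical examples, not the real interview questions.

```python
# Illustrative sketch only: these are hypothetical examples of typical
# string- and matrix-processing coding questions, not the actual problems asked.

def reverse_words(s: str) -> str:
    """Reverse the order of words in a sentence (hypothetical string task)."""
    return " ".join(reversed(s.split()))

def transpose(matrix: list[list[int]]) -> list[list[int]]:
    """Return the transpose of a rectangular matrix (hypothetical matrix task)."""
    return [list(row) for row in zip(*matrix)]

if __name__ == "__main__":
    print(reverse_words("senior data scientist interview"))  # interview scientist data senior
    print(transpose([[1, 2, 3], [4, 5, 6]]))                 # [[1, 4], [2, 5], [3, 6]]
```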
Bagging and boosting are ensemble learning techniques used to improve the performance of machine learning models by combining multiple weak learners.
Bagging (Bootstrap Aggregating) involves training multiple models independently on different subsets of the training data and combining their predictions through averaging or voting.
Boosting involves training multiple models sequentially, where each subsequent model corrects the errors made by the previous ones (a minimal code sketch follows).
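A minimal sketch, assuming scikit-learn; the synthetic dataset and the specific ensembles (BaggingClassifier, AdaBoostClassifier) are illustrative choices, not from the original answer.

```python
# Minimal sketch (assumes scikit-learn; the synthetic dataset is illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Bagging: independent learners trained on bootstrap samples, combined by voting.
bagging = BaggingClassifier(n_estimators=50, random_state=0)

# Boosting: learners trained sequentially, each focusing on previously misclassified samples.
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

print("bagging  mean CV accuracy:", cross_val_score(bagging, X, y, cv=5).mean())
print("boosting mean CV accuracy:", cross_val_score(boosting, X, y, cv=5).mean())
```

Bagging mainly reduces variance, while boosting mainly reduces bias, which is why the two can behave quite differently on the same data.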
To minimize overfitting, use techniques like cross-validation, regularization, and early stopping; to minimize underfitting, increase model complexity or gather more data.
Use cross-validation to evaluate model performance on different subsets of data
Apply regularization techniques like L1 or L2 regularization to penalize large coefficients
Implement early stopping to prevent the model from training for too long
Increase model complexity or gather more training data to address underfitting (a minimal sketch of these techniques follows).
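A minimal sketch of these techniques, assuming scikit-learn; the diabetes dataset and the specific estimators (Ridge for L2 regularization, gradient boosting with early stopping) are illustrative choices, not from the original answer.

```python
# Minimal sketch (assumes scikit-learn; the dataset and estimators are illustrative).
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

# Cross-validation: estimate generalization performance on held-out folds.
ridge = Ridge(alpha=1.0)  # L2 regularization penalizes large coefficients
print("Ridge mean CV R^2:", cross_val_score(ridge, X, y, cv=5).mean())

# Early stopping: stop adding boosting stages when the validation score stops improving.
gbr = GradientBoostingRegressor(
    n_estimators=500,
    validation_fraction=0.2,
    n_iter_no_change=10,   # stop if no improvement for 10 consecutive rounds
    random_state=0,
)
gbr.fit(X, y)
print("Boosting stages actually used:", gbr.n_estimators_)
```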
LLM use cases involve using latent linear models (for example PCA, factor analysis, or matrix factorization) for various data analysis tasks; a minimal sketch follows the points below.
LLM can be used for dimensionality reduction in high-dimensional data.
LLM can be used for clustering similar data points together.
LLM can be used for anomaly detection in datasets.
LLM can be applied in natural language processing tasks such as text classification.
LLM can be used in recommendation systems to predict user preferences.
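A minimal sketch of the dimensionality-reduction and anomaly-detection points, assuming scikit-learn and PCA as the latent linear model; the synthetic data and the reconstruction-error threshold are illustrative assumptions, not part of the original answer.

```python
# Minimal sketch (assumes scikit-learn and PCA as the latent linear model; data is synthetic).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))   # synthetic high-dimensional data
X[:5] += 8                       # a few obvious anomalies

# Dimensionality reduction: project onto a small number of latent components.
pca = PCA(n_components=3).fit(X)
Z = pca.transform(X)

# Anomaly detection: points poorly reconstructed from the latent space are flagged.
reconstruction = pca.inverse_transform(Z)
errors = np.linalg.norm(X - reconstruction, axis=1)
threshold = np.percentile(errors, 99)
print("flagged anomalies:", np.where(errors > threshold)[0])
```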
Identify AI projects from data analysis and outline steps for implementation.
Analyze the data to identify patterns or trends that suggest potential AI applications.
Consider projects like predictive analytics for customer behavior or anomaly detection in fraud detection.
Engage stakeholders to understand business needs and align AI projects with strategic goals.
Prototype a small-scale version of the AI solution to validate the approach before scaling up (a minimal prototype sketch follows).
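A minimal prototype sketch for the fraud anomaly-detection idea above, assuming scikit-learn's IsolationForest; the transaction features and contamination rate are made up purely for illustration.

```python
# Minimal prototype sketch (assumes scikit-learn; transaction features are synthetic,
# chosen only to illustrate the fraud anomaly-detection idea).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal_txns = rng.normal(loc=[50, 12], scale=[20, 4], size=(1000, 2))   # amount, hour of day
fraud_txns = rng.normal(loc=[900, 3], scale=[100, 1], size=(10, 2))     # unusually large, odd hours
X = np.vstack([normal_txns, fraud_txns])

model = IsolationForest(contamination=0.01, random_state=42).fit(X)
labels = model.predict(X)   # -1 = anomaly, 1 = normal
print("transactions flagged as anomalous:", int((labels == -1).sum()))
```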
Choosing the right model involves understanding the data, use case, and desired outcomes; a minimal comparison sketch follows the steps below.
1. Understand the problem: Is it classification, regression, clustering, etc.? Example: Predicting patient outcomes (classification).
2. Analyze the data: Check for missing values, data types, and distribution. Example: Use EDA to visualize patient demographics.
3. Select a model: Based on the problem type, choose a model. Example: U...
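A minimal sketch of step 3, assuming scikit-learn; the breast cancer dataset stands in for a patient-outcome classification problem, and the two candidate models are illustrative choices rather than the ones from the original answer.

```python
# Minimal sketch (assumes scikit-learn; dataset and candidate models are illustrative).
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Candidate models for a binary classification problem, compared on the same CV split.
candidates = {
    "logistic_regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in candidates.items():
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```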
I applied via LinkedIn and was interviewed in Apr 2023. There were 3 interview rounds.
Company fit is crucial for long-term success. Compensation expectations should align with industry standards and experience.
Research the company culture and values to ensure alignment with personal values and work style.
Understand the company's expectations for the role and how your skills and experience can meet or exceed them.
Discuss compensation openly and transparently, considering industry standards and your experience.
I applied via Campus Placement and was interviewed in Dec 2016. There were 6 interview rounds.
Discussed project applications at PayPal and basic ML concepts for data analysis; a minimal sketch of one such application follows the points below.
Utilizing customer transaction data to identify spending patterns and improve personalized marketing strategies.
Implementing machine learning algorithms to detect fraudulent transactions in real-time, enhancing security.
Analyzing user behavior on the PayPal platform to optimize user experience and streamline the payment process.
Using pred...
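A minimal sketch of the spending-pattern idea above, assuming scikit-learn's KMeans; the per-customer features are synthetic and purely illustrative.

```python
# Minimal sketch (assumes scikit-learn; the per-customer features are synthetic and
# only illustrate clustering spending patterns, one of the applications listed above).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Hypothetical per-customer features: monthly spend, transaction count, refund rate.
X = np.column_stack([
    rng.gamma(shape=2.0, scale=300.0, size=2000),
    rng.poisson(lam=20, size=2000),
    rng.beta(a=1, b=30, size=2000),
])

X_scaled = StandardScaler().fit_transform(X)
segments = KMeans(n_clusters=4, n_init=10, random_state=7).fit_predict(X_scaled)
print("customers per segment:", np.bincount(segments))
```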
I applied via Job Portal
I applied via Walk-in and was interviewed before Jul 2021. There were 2 interview rounds.
Basic maths, quantitative aptitude, and logic
I applied via Naukri.com and was interviewed in Jul 2021. There were 3 interview rounds.
I applied via Naukri.com and was interviewed in Mar 2021. There were 3 interview rounds.
I applied via LinkedIn and was interviewed before Mar 2022. There were 3 interview rounds.
An online test link was given; data structures questions were asked, including an array problem and a string problem.
A doubly linked list question was also asked (a minimal sketch of the structure follows).
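A minimal sketch of a doubly linked list in Python; the exact interview question is not given, so this only illustrates the data structure itself.

```python
# Minimal sketch of a doubly linked list (the actual interview question is not given).
class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, value):
        """Add a node at the tail in O(1)."""
        node = Node(value)
        if self.tail is None:
            self.head = self.tail = node
        else:
            node.prev = self.tail
            self.tail.next = node
            self.tail = node

    def to_list(self, reverse=False):
        """Walk the list forwards or backwards using the prev/next links."""
        values, current = [], (self.tail if reverse else self.head)
        while current is not None:
            values.append(current.value)
            current = current.prev if reverse else current.next
        return values

dll = DoublyLinkedList()
for v in [1, 2, 3]:
    dll.append(v)
print(dll.to_list())              # [1, 2, 3]
print(dll.to_list(reverse=True))  # [3, 2, 1]
```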
A Spring Boot API endpoint is a URL that exposes the functionality of a web service.
API endpoints are the entry points for the client to access the server's resources.
Spring Boot provides a simple and easy way to create RESTful APIs.
Endpoints can be secured using Spring Security.
Endpoints can be documented using Swagger or Spring REST Docs.
Examples: /users, /products, /orders
I applied via Recruitment Consultant and was interviewed in Aug 2021. There were 4 interview rounds.
Role | Salaries reported | Salary range
Senior Software Engineer | 1k salaries | ₹24.4 L/yr - ₹45 L/yr
Software Engineer | 346 salaries | ₹11.7 L/yr - ₹25 L/yr
Software Engineer2 | 338 salaries | ₹15 L/yr - ₹26.1 L/yr
Consultant | 199 salaries | ₹20 L/yr - ₹35 L/yr
Lead Software Engineer | 194 salaries | ₹35.7 L/yr - ₹63.2 L/yr