I applied via Referral and was interviewed in Mar 2021. There were 3 interview rounds.
Yes, lead conversion is possible even if some required fields are missing.
Conversion can proceed as long as the minimum fields Salesforce requires on the lead are present.
Missing fields can be filled in during the conversion process.
However, missing critical fields may leave the converted Account, Contact, or Opportunity records incomplete or inaccurate.
It is therefore important to prioritize data completeness and accuracy for effective lead conversion.
Yes, validation can be bypassed when a record is updated through a workflow field update.
In Salesforce, workflow field updates run after validation rules have already fired, so the new values are not re-validated.
This can introduce data inconsistencies and errors.
It is important to test workflows thoroughly and to ensure validation logic is enforced where it is needed.
I applied via Naukri.com and was interviewed in Mar 2021. There were 3 interview rounds.
I applied via Naukri.com and was interviewed before Apr 2021. There were 2 interview rounds.
I applied via Naukri.com and was interviewed in Jul 2021. There were 3 interview rounds.
Write Python code that produces the output string 'jareen arpohc' from the input string 'neeraj chopra'.
Split the input string into individual words
Reverse each word
Join the reversed words with a space in between
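The steps above can be sketched in a few lines of Python; slicing with `[::-1]` reverses each word:

```python
def reverse_words(s):
    # Split into words, reverse each word, rejoin with single spaces
    return " ".join(word[::-1] for word in s.split())

print(reverse_words("neeraj chopra"))  # → jareen arpohc
```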
Code to list even numbers from a given list of 10 numbers.
Loop through the list of numbers
Check if each number is even using the modulo operator
If the number is even, add it to a new list of even numbers
Return the new list of even numbers
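A straightforward Python version of the loop described above, using the modulo operator for the even check:

```python
def even_numbers(nums):
    evens = []
    for n in nums:
        if n % 2 == 0:  # even when the remainder on division by 2 is zero
            evens.append(n)
    return evens

print(even_numbers([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]))  # → [2, 4, 6, 8, 10]
```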
posted on 26 Jul 2022
I applied via Company Website and was interviewed before Jul 2021. There were 3 interview rounds.
Spark DAG (Directed Acyclic Graph) represents the execution plan for Spark jobs, optimizing data processing tasks.
A DAG is a graph structure that consists of nodes (operations) and edges (data flow) without cycles.
In Spark, when an action (like count or collect) is called, it triggers the creation of a DAG.
Each transformation (like map, filter) creates a new RDD, forming a lineage that Spark uses to optimize execution.
...
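The lazy transformation/action split can be illustrated without Spark itself. The toy class below is a plain-Python sketch, not Spark's API: transformations only record a lineage (here a simple chain, whereas a real Spark DAG can branch), and nothing executes until an action such as `collect` is called:

```python
class LazyRDD:
    """Toy stand-in for a Spark RDD: transformations record lineage only;
    an action walks the recorded plan and executes it. A minimal sketch."""

    def __init__(self, data, lineage=()):
        self._data = data
        self._lineage = lineage  # recorded transformations, executed lazily

    def map(self, f):
        # Transformation: returns a new "RDD" with an extended lineage
        return LazyRDD(self._data, self._lineage + (("map", f),))

    def filter(self, p):
        return LazyRDD(self._data, self._lineage + (("filter", p),))

    def collect(self):
        # Action: triggers execution of the whole recorded plan
        out = list(self._data)
        for op, fn in self._lineage:
            out = [fn(x) for x in out] if op == "map" else [x for x in out if fn(x)]
        return out


rdd = LazyRDD(range(1, 6)).map(lambda x: x * 2).filter(lambda x: x > 4)
print(rdd.collect())  # → [6, 8, 10]; nothing ran until collect() was called
```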
Data validation in Hive involves using built-in functions and custom scripts to ensure data accuracy and consistency.
Use built-in functions like IS NULL, IS NOT NULL, and COALESCE to check for missing or null values
Use regular expressions and pattern matching to validate data formats
Write custom scripts to perform more complex data validation tasks
Perform data profiling to identify potential data quality issues
Use data...
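As a concrete illustration of the null and format checks above, the snippet below builds two HiveQL validation queries as Python strings. The table and column names (`customers`, `email`) and the simplistic email pattern are hypothetical examples, not from the original answer:

```python
# Hypothetical table and column used only for illustration
table, column = "customers", "email"

# Count rows where the column is missing (Hive IS NULL check)
null_check = (
    f"SELECT COUNT(*) AS null_count FROM {table} "
    f"WHERE {column} IS NULL"
)

# RLIKE performs regex matching in Hive; this flags non-null values
# that do not match a very loose user@domain pattern
format_check = (
    f"SELECT COUNT(*) AS bad_format FROM {table} "
    f"WHERE {column} IS NOT NULL AND NOT {column} RLIKE '^[^@]+@[^@]+$'"
)

print(null_check)
print(format_check)
```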
I applied via Naukri.com and was interviewed before Feb 2022. There were 3 interview rounds.
I applied via Job Portal and was interviewed before Feb 2023. There was 1 interview round.
I appeared for an interview in Feb 2025, where I was asked the following questions.
Technical Lead | 41 salaries | ₹13.2 L/yr - ₹24 L/yr
Senior System Executive | 40 salaries | ₹5.8 L/yr - ₹12.8 L/yr
Senior Systems Engineer | 26 salaries | ₹6 L/yr - ₹17.5 L/yr
System Engineer | 22 salaries | ₹3.6 L/yr - ₹6.5 L/yr
Salesforce Developer | 18 salaries | ₹2.9 L/yr - ₹9.8 L/yr