Copy activity in Azure Data Factory (ADF) enables data movement and transformation between various data sources and destinations.
Data Movement: Copy activity allows you to move data from one location to another, such as from Azure Blob Storage to Azure SQL Database.
Data Transformation: You can transform data during the copy process using mapping data flows or by applying transformations in the source or sink.
Integ...
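As a hedged illustration, a Copy activity in ADF's JSON authoring format looks roughly like the sketch below, written as a TypeScript object for readability. The dataset, source/sink types, and column names are hypothetical, not taken from the interview.

```typescript
// Rough shape of an ADF Copy activity definition (illustrative only).
const copyActivity = {
  name: "CopyBlobToSql",
  type: "Copy",
  inputs: [{ referenceName: "BlobCsvDataset", type: "DatasetReference" }],
  outputs: [{ referenceName: "SqlTableDataset", type: "DatasetReference" }],
  typeProperties: {
    source: { type: "DelimitedTextSource" },
    sink: { type: "AzureSqlSink" },
    // Optional column mapping applied while the data is copied.
    translator: {
      type: "TabularTranslator",
      mappings: [{ source: { name: "cust_id" }, sink: { name: "CustomerId" } }],
    },
  },
};
```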
Developed comprehensive frameworks for project management and data analysis to enhance efficiency and decision-making.
Created a project management framework that integrates Agile and Waterfall methodologies, improving team collaboration and delivery timelines.
Developed a data analysis framework using Python and R, enabling real-time insights and predictive analytics for client projects.
Implemented a risk assessment...
Object-oriented principles focus on organizing software design around data, or objects, rather than functions and logic.
Encapsulation: Bundling data and methods that operate on the data within one unit (e.g., a class).
Inheritance: Mechanism to create a new class using properties and methods of an existing class (e.g., a 'Dog' class inheriting from an 'Animal' class).
Polymorphism: Ability to present the same interface for different underlying types, e.g., a 'Dog' overriding a method defined on 'Animal' (see the sketch after this list).
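A minimal TypeScript sketch of all three principles, using the Animal/Dog example from above (class and variable names are illustrative):

```typescript
class Animal {
  // Encapsulation: state (#name) and the methods that use it live in one
  // unit, and #name is not accessible from outside the class.
  #name: string;
  constructor(name: string) {
    this.#name = name;
  }
  get name(): string {
    return this.#name;
  }
  speak(): string {
    return `${this.#name} makes a sound`;
  }
}

// Inheritance: Dog reuses Animal's properties and methods.
class Dog extends Animal {
  // Polymorphism: same interface (speak), subclass-specific behavior.
  speak(): string {
    return `${this.name} barks`;
  }
}

const animals: Animal[] = [new Animal("Generic"), new Dog("Rex")];
animals.forEach((a) => console.log(a.speak()));
// -> "Generic makes a sound"
// -> "Rex barks"
```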
Node.js is single-threaded but uses an event-driven model for concurrency, allowing many operations to be in flight at once without blocking the thread.
Node.js uses an event loop to handle asynchronous operations, allowing it to manage multiple tasks without blocking.
I/O operations (like reading files or querying databases) are non-blocking, enabling other code to run while waiting for these tasks to complete.
Callbacks, Promises, and async/await are the standard patterns for consuming these asynchronous results (see the example below).
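As a small illustration (the file name is hypothetical), the timer below fires before the file read finishes, because the read is handed off to the event loop and never blocks the single thread:

```typescript
import { readFile } from "node:fs/promises";

async function main(): Promise<void> {
  // The timer callback runs while the read is still pending, showing
  // that the single thread stays free during non-blocking I/O.
  setTimeout(() => console.log("timer fired while I/O is pending"), 0);
  const data = await readFile("data.txt", "utf8"); // hypothetical file
  console.log(`read ${data.length} characters`);
}

main().catch(console.error);
```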
HDInsight is a cloud-based service in Azure that makes it easy to process big data using Apache Hadoop, Spark, and other tools.
HDInsight is a fully managed cloud service that makes it easy to process big data using open-source frameworks like Apache Hadoop, Spark, and more.
It allows you to create, scale, and monitor Hadoop clusters in Azure.
HDInsight integrates with Azure Data Factory to provide data orchestration...
Data copy in Azure can be performed using Azure Data Factory or Azure Storage Explorer.
Use Azure Data Factory to create data pipelines for copying data between various sources and destinations.
Use Azure Storage Explorer to manually copy data between Azure storage accounts.
Utilize Azure Blob Storage for storing the data to be copied; a programmatic sketch follows below.
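Beyond the portal tools, here is a hedged sketch of a programmatic server-side blob copy using the @azure/storage-blob SDK; the connection string, container names, and blob name are placeholders:

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

async function copyBlob(): Promise<void> {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING!,
  );
  const src = service.getContainerClient("source").getBlobClient("data.csv");
  const dst = service.getContainerClient("dest").getBlobClient("data.csv");
  // Kicks off a server-side copy and polls until it completes.
  const poller = await dst.beginCopyFromURL(src.url);
  await poller.pollUntilDone();
}

copyBlob().catch(console.error);
```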
Use a command line tool like cat to concatenate multiple CSV files into a single file
Use the cat command in the terminal to concatenate multiple CSV files into a single file
Navigate to the directory where the CSV files are located
Run the command 'cat file1.csv file2.csv > combined.csv' to merge file1.csv and file2.csv into a new file named combined.csv (see the header-row caveat below)
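One caveat: if each CSV carries a header row, plain `cat` will repeat the header in the merged output. A small Node.js sketch (file names are hypothetical) that keeps only the first file's header:

```typescript
import { readFileSync, writeFileSync } from "node:fs";

// Same merge as the cat command, but dropping the header row of every
// file after the first one.
const files = ["file1.csv", "file2.csv"];
const merged: string[] = [];
files.forEach((file, i) => {
  const lines = readFileSync(file, "utf8").trimEnd().split("\n");
  merged.push(...(i === 0 ? lines : lines.slice(1)));
});
writeFileSync("combined.csv", merged.join("\n") + "\n");
```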
Dynamic file ingestion in ADF uses pipeline and dataset parameters so that file paths and names are resolved at runtime rather than hard-coded.
Use parameters to specify the file path and name dynamically
Utilize expressions to dynamically generate file paths
Implement dynamic mapping data flows to handle different file structures; a parameterized-dataset sketch follows this list.
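A hedged sketch of a parameterized dataset in ADF's JSON authoring format, written as a TypeScript object for readability; the container and parameter names are illustrative:

```typescript
// Rough shape of a parameterized ADF dataset (illustrative only).
const dataset = {
  name: "DynamicCsvDataset",
  properties: {
    type: "DelimitedText",
    parameters: {
      folderPath: { type: "string" },
      fileName: { type: "string" },
    },
    typeProperties: {
      location: {
        type: "AzureBlobStorageLocation",
        container: "raw",
        // Expressions are evaluated at runtime, so one dataset can
        // ingest whatever path/name the pipeline passes in.
        folderPath: { value: "@dataset().folderPath", type: "Expression" },
        fileName: { value: "@dataset().fileName", type: "Expression" },
      },
    },
  },
};
```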
There are three types of Cloud: Public, Private, and Hybrid.
Public Cloud: Services provided by third-party providers over the internet.
Private Cloud: Services provided by a single organization for internal use.
Hybrid Cloud: Combination of public and private cloud services.
Examples: AWS, Microsoft Azure, Google Cloud Platform.
Public cloud examples: Dropbox, Gmail, Salesforce.
Private cloud examples: Bank of America,...
Our current project architecture follows a microservices approach.
We have divided our application into smaller, independent services.
Each service has its own database and communicates with other services through APIs.
We use Docker and Kubernetes for containerization and orchestration.
We also have a centralized configuration server for managing configurations.
We follow RESTful API design principles for communication between services; a minimal sketch of one such service follows.
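A minimal sketch of one such service in TypeScript with Express; the service names, route, and port are hypothetical, and the global `fetch` assumes Node 18+:

```typescript
import express from "express";

// Each service owns its own data store and talks to its peers over
// REST rather than sharing a database.
const app = express();

app.get("/orders/:id", async (req, res) => {
  // Fetch the owning user from the separate (hypothetical) user
  // service via its API instead of reading its database directly.
  const user = await fetch("http://user-service/users/42").then((r) => r.json());
  res.json({ orderId: req.params.id, user });
});

app.listen(3000, () => console.log("order-service listening on :3000"));
```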
I applied via a Recruitment Consultant and was interviewed in Nov 2024. There were 2 interview rounds.
I recently interviewed at Neudesic's Kochi office, where the first round was a written aptitude test. The test consisted of 15 questions, but surprisingly, one of the questions was incorrect. The instructors told us to disregard that particular question.
After the test, the answer sheets were exchanged among candidates for peer review. The instructors then collected the papers, but here's the puzzling part: they only considered papers with 9 or more correct answers, without even reviewing the rest.
I'm still unsure about the logic behind having candidates review each other's papers. I'm confident that I scored 9 or above, but unfortunately, I didn't clear the exam. It's possible that I made a mistake or the person reviewing my paper mischecked my answers.
Regardless, the experience was certainly memorable, though not in a positive way.
Medium-level coding questions.
I expect a collaborative environment that fosters growth, innovation, and the opportunity to work on impactful data projects.
Opportunities for professional development, such as workshops and training sessions.
A culture of collaboration where team members share knowledge and support each other.
Engagement in challenging projects that allow me to apply my skills and learn new technologies, like cloud data solutions.
Clear ...
It is an on-paper, MCQ-based test consisting of 15 questions of easy to medium difficulty, with a time limit of 20 minutes.
It's not a full-fledged coding round; you just need to write pseudocode on paper and explain it. Based on the explanation, they move you to the next question; if a candidate fails to explain the problem, they are out of the race.
I appeared for an interview in Dec 2024.
I appeared for an interview in Oct 2024, where I was asked the following questions.
I applied via Naukri.com and was interviewed in Jan 2024. There were 3 interview rounds.
It was a hands-on ADF interview.
Linked services are connections to external data sources in Azure Data Factory. Data sets are representations of data in those sources. Functions and stored procedures are used for data transformation.
Linked services are connections to external data sources such as databases, file systems, or APIs.
Data sets are representations of data in those sources, specifying the location, format, and schema of the data.
Functions and stored procedures are used for data transformation, invoked from pipeline activities; rough shapes of a linked service and a dataset are sketched below.
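As a hedged illustration, here are rough shapes of a linked service and a dataset in ADF's JSON authoring format, expressed as TypeScript objects; the names and connection string are placeholders:

```typescript
// A linked service is the connection to the external store.
const linkedService = {
  name: "AzureSqlLinkedService",
  properties: {
    type: "AzureSqlDatabase",
    typeProperties: {
      connectionString:
        "Server=tcp:<server>.database.windows.net;Database=<db>;",
    },
  },
};

// A dataset points at specific data within that connection and
// describes its location, format, and schema.
const tableDataset = {
  name: "CustomerTable",
  properties: {
    type: "AzureSqlTable",
    linkedServiceName: {
      referenceName: "AzureSqlLinkedService",
      type: "LinkedServiceReference",
    },
    schema: [{ name: "CustomerId", type: "int" }],
    typeProperties: { tableName: "dbo.Customer" },
  },
};
```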
I applied via Referral and was interviewed in Dec 2023. There were 2 interview rounds.
Swap 2 numbers without a 3rd variable; automate an application that contains alerts and tabs. Sketches of both are below.
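Quick sketches of both tasks in TypeScript. First, the arithmetic swap (with the idiomatic destructuring alternative):

```typescript
// Swap two numbers without a third variable.
let a = 5;
let b = 9;
a = a + b; // a = 14
b = a - b; // b = 5 (the old a)
a = a - b; // a = 9 (the old b)

// Idiomatic TS/JS alternative: a destructuring swap.
[a, b] = [b, a];
```

And for the alerts-and-tabs automation, a selenium-webdriver sketch; the URL and element id are hypothetical:

```typescript
import { Builder, By } from "selenium-webdriver";

async function run(): Promise<void> {
  const driver = await new Builder().forBrowser("chrome").build();
  try {
    await driver.get("https://example.com"); // hypothetical app URL
    // Trigger and accept a JavaScript alert.
    await driver.findElement(By.id("trigger-alert")).click();
    await driver.switchTo().alert().accept();
    // Switch to the most recently opened tab.
    const handles = await driver.getAllWindowHandles();
    await driver.switchTo().window(handles[handles.length - 1]);
  } finally {
    await driver.quit();
  }
}

run().catch(console.error);
```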
I applied via Naukri.com and was interviewed in Mar 2024. There was 1 interview round.
The duration of the Neudesic Technologies interview process can vary, but it typically takes less than 2 weeks to complete, based on 31 interview experiences.
Salaries at Neudesic Technologies:

| Designation | Salaries reported | Salary range |
| --- | --- | --- |
| Senior Consultant | 321 | ₹13.5 L/yr - ₹40 L/yr |
| Consultant | 153 | ₹5 L/yr - ₹20 L/yr |
| Senior Consultant 2 | 126 | ₹16.5 L/yr - ₹37.5 L/yr |
| Senior Consultant 1 | 113 | ₹10 L/yr - ₹27.5 L/yr |
| Consultant II | 101 | ₹6 L/yr - ₹20 L/yr |