Deep Learning Interview Questions
A perceptron is analogous to a neuron in the human brain. It receives inputs from various sources, applies a function to these inputs, and transforms them into an output.
 
A perceptron is mainly used to perform binary classification: it takes an input, computes a weighted sum of the input features, and passes the result through an activation function to produce the output.
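For illustration, a minimal NumPy sketch of a perceptron trained as a binary classifier on a toy dataset (the logical AND function); the learning rate, epoch count, and dataset here are arbitrary choices made only for the example:

```python
import numpy as np

def perceptron_predict(x, w, b):
    # weighted sum of inputs plus bias, passed through a step function
    return 1 if np.dot(w, x) + b >= 0 else 0

def perceptron_train(X, y, epochs=10, lr=0.1):
    # classic perceptron learning rule for binary classification
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            error = target - perceptron_predict(xi, w, b)
            w += lr * error * xi
            b += lr * error
    return w, b

# toy dataset: learn the logical AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
print([perceptron_predict(xi, w, b) for xi in X])  # [0, 0, 0, 1]
```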
Machine Learning is powerful enough to solve most problems. However, Deep Learning gains the upper hand when working with data that has a large number of dimensions, and it copes easily with very large data sets because deep models are built for that scale.
Differentiate between AI, Machine Learning and Deep Learning.
 
Artificial Intelligence is a technique that enables machines to mimic human behavior.
 
Machine Learning is a subset of AI that uses statistical methods to enable machines to improve with experience.
 
Deep Learning is a subset of ML that makes the computation of multi-layer neural networks feasible. It uses neural networks to simulate human-like decision making.
An activation function translates the inputs into outputs. It decides whether a neuron should be activated or not by calculating the weighted sum of the inputs and adding a bias to it. The purpose of the activation function is to introduce non-linearity into the output of a neuron.
 
Common activation functions include (a few are sketched in code after the list):
 
* Linear or Identity
* Unit or Binary Step
* Sigmoid or Logistic
* Tanh
* ReLU
* Softmax
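Several of these can be written down directly; here is a minimal NumPy sketch (the input vector is an arbitrary example):

```python
import numpy as np

def binary_step(x):
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

z = np.array([-2.0, 0.0, 3.0])
print(binary_step(z))  # [0. 1. 1.]
print(sigmoid(z))      # [0.119 0.5   0.953]
print(np.tanh(z))      # tanh is available directly in NumPy
print(relu(z))         # [0. 0. 3.]
print(softmax(z))      # probabilities that sum to 1
```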
Data visualisation libraries help in understanding complex ideas by using visual elements such as graphs, charts, maps and more. Visualisation tools help you recognise patterns, trends, outliers and more, making it possible to present your data according to the requirement. Popular data visualisation libraries include D3, React-Vis, Chart.js, vx, and more.
What is overfitting?
 
Overfitting is a type of modelling error that results in the failure to predict future observations effectively or to fit additional data into the existing model. It occurs when a function is fitted too closely to a limited set of data points and usually ends up with more parameters than the data can justify. It is common for huge data sets to contain some anomalies, so when such data is used for modelling, it can result in inaccuracies in the analysis.
Overfitting can be prevented by a few methods, namely:
 
Cross-validation: the initial training data is split into several mini train-test sets, and each mini set is used to tune the model.
 
Remove features: remove irrelevant features manually from the algorithm and use feature-selection heuristics to identify the important features.
 
Regularisation: this involves various ways of making your model simpler so that there is less room for error. Adding penalty parameters and pruning your decision tree are ways of doing that (see the sketch after this list).
 
Ensembling: these are machine learning techniques for combining multiple separate predictions. The most popular ensembling methods are bagging and boosting.
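As an illustration of regularisation only, here is a minimal sketch assuming TensorFlow/Keras and a hypothetical 20-feature binary classification problem; the layer sizes, dropout rate, and penalty strength are arbitrary example values:

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

# A small binary classifier with two common regularisation techniques:
# an L2 penalty on the weights and a dropout layer.
model = tf.keras.Sequential([
    layers.Input(shape=(20,)),                                # 20 input features (assumed)
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),   # L2 weight penalty
    layers.Dropout(0.5),                                      # drop 50% of units during training
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```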
Commonly used deep learning frameworks and tools include:
 
* Keras
* TensorFlow
* PyTorch
* Theano
* CNTK
* Caffe2
* MXNet
Supervised deep learning algorithms include:
 
* Artificial Neural Network (ANN)
* Perceptron (single and multi-layer)
* Convolutional Neural Network (CNN)
* Recurrent Neural Network (RNN)


Unsupervised deep learning algorithms include:
 
* Autoencoders
* Self-Organizing Maps (SOMs)
* Boltzmann Machine
* Generative Adversarial Networks (GANs)
Single Layer Perceptron
 
The single layer perceptron was the first neural model proposed. The local memory of the neuron consists of a vector of weights. Its computation is performed by summing the elements of the input vector, each multiplied by the corresponding element of the weight vector. The resulting value is then passed to an activation function, which produces the output.
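For illustration, a short sketch of that computation with made-up numbers (the input, weight, and bias values here are arbitrary, chosen only to show the weighted sum followed by an activation):

```python
import numpy as np

x = np.array([0.5, -1.0, 2.0])   # input vector (example values)
w = np.array([0.4, 0.3, 0.1])    # weight vector (example values)
b = 0.2                          # bias (example value)

weighted_sum = np.dot(w, x) + b               # 0.2 - 0.3 + 0.2 + 0.2 = 0.3
output = 1.0 / (1.0 + np.exp(-weighted_sum))  # sigmoid activation
print(weighted_sum, output)                   # 0.3, about 0.574
```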