Deep learning is a subfield of machine learning inspired by the human brain. It uses large artificial neural networks, sometimes with more than 100 layers, for learning. Training requires extensive labeled data and substantial computing power. Problems requiring credit-assignment depth greater than 10 are sometimes described as very deep learning.
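As a minimal sketch of what "many layers" means in practice, the following NumPy forward pass stacks several linear layers with nonlinearities between them; the layer sizes and random weights are purely illustrative assumptions, not a real trained model.

```python
import numpy as np

def relu(x):
    # nonlinearity applied between layers
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
sizes = [8, 16, 16, 16, 4]  # input, three hidden layers, output (illustrative)
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    # pass the input through each hidden layer: linear map, then ReLU
    for W in weights[:-1]:
        x = relu(x @ W)
    return x @ weights[-1]  # final layer left linear

y = forward(rng.standard_normal(8))
print(y.shape)  # (4,)
```

Depth here is simply the number of stacked layers; a "very deep" network would chain many more such transformations.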
Holds a PhD in Mechanical Engineering from Florida International University. Research focuses on haptics, teleoperation, and robot controller design. Has published numerous research articles in Robotica and Mechanism and Machine Theory.
Classification predicts the class of a given data point using supervised learning. Training data helps the model learn the relationship between inputs and classes. Common applications include spam detection and medical diagnosis.
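A toy sketch of supervised classification, using a nearest-centroid rule on made-up two-feature "spam" data (the features, labels, and classifier choice are all illustrative assumptions):

```python
import numpy as np

# Labeled training data: each row is a data point; 1 = spam, 0 = not spam
X_train = np.array([[5.0, 1.0], [4.0, 2.0], [1.0, 5.0], [0.5, 4.0]])
y_train = np.array([1, 1, 0, 0])

# "Training": learn one centroid per class from the labeled examples
centroids = {c: X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}

def predict(x):
    # assign the class whose centroid is closest to the new point
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

print(predict(np.array([4.5, 1.5])))  # → 1 (classified as spam)
```

The training step captures the input-class relationship; prediction then maps an unseen point to the learned class structure.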
An activation function calculates a node's output from its inputs and weights. Nonlinear activation functions enable networks to solve complex problems with fewer nodes. Common modern functions include GELU, the logistic (sigmoid) function, ReLU, and softmax.
Hebbian theory explains synaptic plasticity: repeated and persistent stimulation of a postsynaptic cell by a presynaptic cell strengthens the connection between them. It is often summarized as "neurons that fire together, wire together." Donald Hebb introduced the theory in his 1949 book "The Organization of Behavior".
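The rule can be written as a weight update Δw = η · (presynaptic activity) · (postsynaptic activity). A minimal sketch, with a made-up learning rate and activity pattern:

```python
import numpy as np

eta = 0.1                          # learning rate (illustrative)
w = np.zeros(3)                    # synaptic weights, initially zero
pre = np.array([1.0, 0.0, 1.0])    # presynaptic activities
post = 1.0                         # postsynaptic cell fires with inputs 0 and 2

for _ in range(5):
    # Hebb's rule: only co-active synapses strengthen
    w += eta * pre * post

print(w)  # [0.5 0.  0.5]
```

The synapse whose presynaptic cell stays silent (index 1) never strengthens, which is exactly the "fire together, wire together" behavior.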
In machine learning, a tensor refers to a multidimensional array or a multilinear mapping. Data can be organized as tensors for analysis by neural networks or tensor methods. Tensor decomposition factorizes a data tensor into smaller component tensors.
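To illustrate the decomposition idea in reverse: a rank-1 tensor is the outer product of one vector per mode, and methods such as CP decomposition approximate a data tensor as a sum of such terms. A small NumPy sketch (all values illustrative):

```python
import numpy as np

# One factor vector per mode of a 3-way tensor
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0, 5.0])
c = np.array([6.0, 7.0])

# Rank-1 tensor: T[i, j, k] = a[i] * b[j] * c[k]
T = np.einsum('i,j,k->ijk', a, b, c)

print(T.shape)     # (2, 3, 2)
print(T[1, 2, 0])  # 2 * 5 * 6 = 60.0
```

Factorizing a large data tensor into a few such small vectors is what makes tensor methods compact: three short vectors here fully determine all twelve entries of T.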