
Neural Network

Definition

A computing system inspired by biological neural networks, consisting of interconnected nodes (neurons) organized in layers that process information and learn patterns from data.

Neural networks are the foundational architecture behind deep learning. They consist of an input layer, one or more hidden layers, and an output layer. Each connection between neurons carries a weight that is adjusted during training via backpropagation. Activation functions such as ReLU or sigmoid introduce non-linearity, allowing networks to learn complex relationships. By the universal approximation theorem, a network with a single hidden layer can approximate any continuous function given enough hidden units, but deeper networks learn hierarchical features more efficiently. Modern neural networks can have billions of parameters and are trained on massive datasets using specialized hardware such as GPUs and TPUs.
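The layer structure, weighted connections, non-linear activations, and backpropagation described above can be sketched in a few lines of NumPy. This is a minimal illustrative example, not a production implementation: the 2-8-1 architecture, sigmoid activations, learning rate, and iteration count are all assumed choices made for the sake of the sketch, which trains a one-hidden-layer network on the XOR problem.

```python
import numpy as np

# Illustrative sketch: a 2-8-1 feedforward network trained by
# backpropagation on XOR. All hyperparameters here are assumptions
# chosen for the example, not prescribed values.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8))  # input -> hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 1.0, (8, 1))  # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)        # hidden activations (non-linearity)
    return h, sigmoid(h @ W2 + b2)  # network output

lr = 1.0
_, out = forward(X)
initial_loss = np.mean((out - y) ** 2)

for _ in range(5000):
    h, out = forward(X)
    # Backward pass: chain rule through the sigmoid and the MSE loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates: nudge each weight to reduce the error
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

_, out = forward(X)
final_loss = np.mean((out - y) ** 2)
print(initial_loss, final_loss)  # training error shrinks over iterations
```

The hidden layer is what makes XOR learnable at all: XOR is not linearly separable, so a network with no hidden layer (a single weighted sum) cannot represent it, which is exactly the role the non-linear activation plays.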
