Observation about NNs

NNs are just a class of hierarchical compositions of mathematical functions -- functions with a large number of parameters.
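
To make the "hierarchical composition with many parameters" view concrete, here is a minimal sketch in plain NumPy; the layer sizes, the tanh nonlinearity, and the random parameter values are arbitrary illustrative choices, not anything canonical:

    # A minimal sketch (assuming NumPy is available): a two-layer NN written
    # explicitly as a composition of parameterized functions nn(x) = layer2(layer1(x)).
    import numpy as np

    rng = np.random.default_rng(0)

    # Parameters: weight matrices and bias vectors (random here, just for illustration).
    W1, b1 = rng.normal(size=(16, 2)), np.zeros(16)   # layer 1: R^2 -> R^16
    W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)    # layer 2: R^16 -> R^1

    def layer1(x):
        # Affine map followed by a nonlinearity (tanh).
        return np.tanh(W1 @ x + b1)

    def layer2(h):
        # Affine map only -- the output layer.
        return W2 @ h + b2

    def nn(x):
        # The whole network is just a hierarchical composition of the two layers.
        return layer2(layer1(x))

    print(nn(np.array([0.5, -1.0])))   # one scalar output, wrapped in a length-1 array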


Neural Networks (NNs) are just a very flexible class of mathematical functions. Think of them like N-dimensional NURBS, or like N-dimensional "clay" that you can shape into almost anything. What you make of it depends on your parametrization: do you use it like molding material to "photo-copy" existing behaviors, or as engineering material to make high-precision components? That depends on the fidelity of the network and on how copyable the behavior and its patterns are -- for example, you can't expect to directly 3D-print a washing machine with a low-fidelity 3D printer, and similarly, you can't expect a low-fidelity neural net to capture the behavior of the stock market... that would need separate modeling of each constituent mind that observes the market and feeds input into its collective behavior.

Looking at NNs as simply a class of mathematical functions, it makes sense to build intuition about when these functions are useful: for example, one may use NURBS to represent both standard geometric objects like lines, circles, ellipses, spheres, and tori, and free-form geometry like car bodies and human bodies. Similarly, we may use NNs when we need:

  • Function approximation, regression analysis, time series prediction, fitness approximation and modeling (a small sketch of this follows the list).
  • Classification, pattern and sequence recognition, novelty detection and sequential decision making.
  • Data processing, filtering, clustering, blind source separation and compression.
  • Robotics, including directing manipulators and prostheses.
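
For the first item (function approximation / regression), here is a minimal sketch, assuming NumPy and scikit-learn are installed; the target function sin(x), the hidden-layer sizes, and the noise level are all illustrative choices:

    # Fit a small NN to noisy samples of sin(x) and compare predictions to the truth.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Training data: noisy samples of a simple nonlinear function.
    X = rng.uniform(-np.pi, np.pi, size=(500, 1))
    y = np.sin(X).ravel() + rng.normal(scale=0.05, size=500)

    # Hyperparameters are illustrative, not tuned.
    model = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                         max_iter=5000, random_state=0)
    model.fit(X, y)

    # Columns: true sin(x) vs. the network's approximation at a few test points.
    X_test = np.linspace(-np.pi, np.pi, 9).reshape(-1, 1)
    print(np.column_stack([np.sin(X_test).ravel(), model.predict(X_test)]))

The same pattern -- sample the behavior, fit the network, check it on held-out points -- carries over to the other uses in the list.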

Think of NNs as smart clay for photo-copying behavior -- you use them when you need to capture a complex, non-trivial behavior that doesn't fall under known traditional models.

