A Simpler Alternative to Neural Nets

medium.com

2 points by hacksoi 2 years ago · 6 comments

Tomte 2 years ago

Yes, we know that there are simpler ways to approximate polynomials than neural networks.

But sure, if you think AGI means solving polynomials, you've done it!

  • hacksoiOP 2 years ago

    I made the direct comparison to neural nets because the method uses a very similar approach (weights as parameters, a cost function minimized via gradient descent) but is simpler (no layers, neurons, activation functions, etc.).

    I never stated "AGI means solving polynomials". Based on how far LLMs have come, function approximation seems to play a role in it.
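For concreteness, here is a minimal Python/NumPy sketch of the setup described above: the polynomial coefficients are the weights, the cost is mean squared error, and training is plain gradient descent with no layers, neurons, or activation functions. This is not the article's code; the toy data, degree, and learning rate are arbitrary choices for illustration.

    import numpy as np

    # Fit y ≈ w0 + w1*x + ... + wd*x^d by gradient descent on mean squared error.
    # The "weights" are just the polynomial coefficients; there are no layers,
    # neurons, or activation functions.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 200)
    y = np.sin(2 * x) + 0.05 * rng.normal(size=x.size)    # toy target

    degree = 5
    X = np.vander(x, degree + 1, increasing=True)         # columns: 1, x, x^2, ..., x^d
    w = np.zeros(degree + 1)                               # the "weights"
    lr = 0.1

    for step in range(5000):
        pred = X @ w
        grad = 2.0 * X.T @ (pred - y) / len(y)             # gradient of the MSE cost
        w -= lr * grad

    print("learned coefficients:", np.round(w, 3))
    print("final MSE:", np.mean((X @ w - y) ** 2))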

anonzzzies 2 years ago

Needs to show an LLM built with it, I guess.

  • gus_massa 2 years ago

    I agree. Expanding on your comment:

    I see two possible problems.

    The first is whether this method can express all the functions that a NN can express. High-order polynomials usually have huge spikes outside the region where they are fitted, whereas the functions learned by a NN usually give smoother interpolations. Those high exponents make me very worried.

    The second is whether it's possible to train them. I use Excel's solver to fit a lot of experimental data to theoretical formulas with few parameters. In my experience, it's important to guess initial values of the parameters that are close enough to the best values. Otherwise, gradient descent just gets stuck in a horrible local minimum that is completely unrelated to the solution you are looking for.

    In conclusion, it's important to show that this newly proposed method works well in practice on a few non-trivial problems that a NN can solve, or at least that it can solve some problems that a NN can't.
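To illustrate gus_massa's first concern (the spikes): fit a high-degree polynomial to samples on [-1, 1], then evaluate it just outside that interval. The target function, degree, and noise level in this NumPy sketch are arbitrary choices for illustration.

    import numpy as np

    # High-degree polynomial fits tend to blow up just outside the fitted
    # region, even when the underlying target stays small and smooth.
    def target(t):
        return 1 / (1 + 25 * t**2)                        # Runge-style function

    rng = np.random.default_rng(1)
    x = np.linspace(-1, 1, 40)
    y = target(x) + 0.01 * rng.normal(size=x.size)

    coeffs = np.polyfit(x, y, deg=15)                     # least-squares fit inside [-1, 1]

    for xq in (0.9, 1.0, 1.1, 1.3):
        print(f"x={xq:4}: target={target(xq):7.3f}  degree-15 fit={np.polyval(coeffs, xq):12.3f}")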
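And to illustrate the second concern (sensitivity to initial values), a small SciPy sketch in the spirit of fitting experimental data to a theoretical formula; curve_fit stands in for Excel's solver here, and the model, noise, and starting points are made up for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    # A formula with a nonlinear parameter (here a frequency) has many local
    # minima in the least-squares cost, so the result depends heavily on the
    # initial guess.
    def model(x, a, b):
        return a * np.sin(b * x)

    rng = np.random.default_rng(2)
    x = np.linspace(0, 10, 200)
    y = model(x, 2.0, 3.0) + 0.1 * rng.normal(size=x.size)   # "experimental" data

    good, _ = curve_fit(model, x, y, p0=[1.0, 2.8], maxfev=20000)  # start near the truth
    bad, _ = curve_fit(model, x, y, p0=[1.0, 0.5], maxfev=20000)   # start far from it

    print("good start ->", np.round(good, 2))   # typically recovers ~[2.0, 3.0]
    print("bad  start ->", np.round(bad, 2))    # typically lands on a wrong frequency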

  • hacksoiOP 2 years ago

    Yeah, people like to see cool things, can't blame them.
