Thread with 3 posts
sometimes math things turn out to be far simpler than you expect. for example, apparently one of the most popular activation functions in neural networks is just max(0,x)? that's what relu is. so i guess f(a,b,c) = max(0, a * 0.5 + b * 0.25 + c * 0.125) is a simple "neuron"? neat
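(a minimal sketch of that formula in python, with the weights 0.5 / 0.25 / 0.125 taken straight from the post; the function names are just for illustration)

```python
def relu(x):
    # relu really is just this: clamp negative values to zero
    return max(0.0, x)

def f(a, b, c):
    # a weighted sum fed through relu -- the toy "neuron" from the post
    return relu(a * 0.5 + b * 0.25 + c * 0.125)

print(f(1, 1, 1))    # 0.875
print(f(-2, 0, 0))   # 0.0 -- a negative weighted sum gets clamped to zero
```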
βͺit's like signed distance fields :oβ¬
βͺ(girl who only knows about sdf's, seeing literally any simple formula with a max() or min() in it) this is just like sdf'sβ¬