Tags: gorgonia/gorgonia
Clarify semantics (#458)

* Matching up with the tensor package's clarification of semantics
* Fixed some tests
* Ugh, one-liner mistake for softmax
* Allow NewTensor(g, 0, ...) to be equivalent to NewScalar (see the sketch after this list)
* Fixed all issues with tensordot
* Added go.mod for examples and tidied go.mod
* Fixed Barzilai-Borwein solver (much more needs to be done)
* Fixed YOLO test
* Updated example of keepDims (tensor.Dense had new formatting options)
* Miscellaneous stuff that needed to be updated (go.mod, GitHub Actions, etc.)
* Made the concurrent training examples smaller so GitHub Actions can run them
* go mod tidy
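The NewTensor/NewScalar equivalence can be checked with a short program. A minimal sketch, assuming the released gorgonia.NewScalar and gorgonia.NewTensor constructors and Node.IsScalar:

```go
package main

import (
	"fmt"

	"gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := gorgonia.NewGraph()

	// The usual way to declare a scalar node.
	a := gorgonia.NewScalar(g, tensor.Float64, gorgonia.WithName("a"))

	// With this change, asking for a 0-dimensional tensor yields the same thing.
	b := gorgonia.NewTensor(g, tensor.Float64, 0, gorgonia.WithName("b"))

	fmt.Println(a.IsScalar(), b.IsScalar()) // expected: true true
}
```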
Make reductionInferShape conservative to fix #384 (#411)

* Make reductionInferShape conservative to fix #384. reductionInferShape currently doesn't respect along; it aggressively squeezes dimensions. Not only does this affect normal tensor operations, it also sometimes breaks the backprop autodiff algorithm when the network contains BroadcastAdd, resulting in a crash when calling Grad(). The change strictly respects the along parameter (see the sketch after this list), e.g.:
  * (100, 1) along 0 reduces to shape (1) instead of ()
  * (1, 64, 1, 64) along 3 reduces to (1, 64, 1)
  * (64, 1, 3, 2) along (2, 3) reduces to (64, 1)
  Fixed unit tests.
* Remove inconsistent dimension for Sum op. After changing reductionType to subtract len(along) from the reduction op, SumOp's dimension needs to be adjusted to match. Otherwise SymDiff will crash for Sum in calcBroadcastShape.
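The conservative behaviour can be observed through a reduction such as Sum. A minimal sketch, assuming Sum's along arguments drive reductionInferShape as described above:

```go
package main

import (
	"fmt"

	"gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := gorgonia.NewGraph()

	// (1, 64, 1, 64) reduced along axis 3: the inferred shape should be
	// (1, 64, 1), not an aggressively squeezed version of it.
	x := gorgonia.NewTensor(g, tensor.Float64, 4,
		gorgonia.WithShape(1, 64, 1, 64), gorgonia.WithName("x"))

	s, err := gorgonia.Sum(x, 3)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(s.Shape()) // expected: (1, 64, 1)
}
```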
de-optimize Repeat (#389)

* WIP
* More work done
* Fixed cascading incorrect tests
* Simplified the repeat type; reinserted DoDiff
* Added PreallocDoer
* Fixed a few fundamental things in repeatOp; reshapeOp has been given a way to perform unsafe operations (see the sketch after this list)
* Updated go.mod
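The graph-level repeat op mirrors the tensor package's repeat semantics. A minimal sketch of that underlying behaviour, assuming gorgonia.org/tensor's Dense.Repeat API; the op internals changed in this PR are not shown here:

```go
package main

import (
	"fmt"

	"gorgonia.org/tensor"
)

func main() {
	// A (2, 2) matrix whose rows we repeat along axis 0.
	t := tensor.New(tensor.WithShape(2, 2),
		tensor.WithBacking([]float64{1, 2, 3, 4}))

	r, err := t.Repeat(0, 3) // repeat each row 3 times
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(r.Shape()) // expected: (6, 2)
	fmt.Println(r)
}
```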