Towards Understandable Neural Networks for High Level AI Tasks - Part 7

Relating tensor product representations to lambda-calculus, tree-adjoining grammars, other 'vector symbolic architectures', and the brain

Topics that will be discussed in this final lecture of the series are:
- programming tensor-product-representation-manipulating Gradient-Symbolic-Computation networks to perform function application in the lambda-calculus and tree adjunction (as in Tree-Adjoining Grammar)
- thereby demonstrating that GSC networks truly have complete symbol-processing (or 'algebraic') capabilities
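The core mechanism underlying such networks is the tensor product representation itself: symbols (fillers) are bound to structural positions (roles) by outer products, and a structure is the superposition of its bindings. Below is a minimal sketch, assuming orthonormal role vectors; the particular vectors and the left/right role names are illustrative choices, not taken from the lecture.

```python
import numpy as np

# Filler vectors for two symbols, A and B (arbitrary example values).
fillers = {"A": np.array([1.0, 0.0, 1.0]),
           "B": np.array([0.0, 1.0, 1.0])}

# Orthonormal role vectors for the left and right child of a tree node.
roles = {"left": np.array([1.0, 0.0]),
         "right": np.array([0.0, 1.0])}

# Bind each filler to its role with an outer product, then superpose:
# the sum is the tensor product representation of the pair (A, B).
tpr = (np.outer(fillers["A"], roles["left"])
       + np.outer(fillers["B"], roles["right"]))

# Unbinding: because the roles are orthonormal, contracting the TPR
# with a role vector recovers exactly the filler bound to that role.
recovered_left = tpr @ roles["left"]
recovered_right = tpr @ roles["right"]
print(np.allclose(recovered_left, fillers["A"]))   # True
print(np.allclose(recovered_right, fillers["B"]))  # True
```

Function application and tree adjunction can then be implemented as linear (or near-linear) maps acting on such representations, which is what lets a GSC network carry out symbolic computation in vector space.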