Programming language processing (analogous to natural language processing) is a hot research topic in the field of software engineering; it has also attracted growing interest in the artificial intelligence community.
To evaluate the proposed method, we constructed a CNN architecture for the image classification task with the CIFAR-10 dataset.
It is layered on top of two tree-based convolutional neural networks (TBCNNs), each of which recognizes the algorithm implemented by code written in a particular programming language.
Unlike the NTM, the Neural GPU is highly parallel, which makes it easier to train and more efficient to run.
Neural approaches to program synthesis and understanding have proliferated in the last few years; at the same time, graph-based neural networks have become a promising new tool.
Through a combination of advanced training techniques and neural network architectural components, it is now possible to create neural networks that can handle tabular data, images, text, and audio as both input and output.
We present a system to infer and execute a human-readable program from a real-world demonstration.
Robotics
This paper presents OptNet, a network architecture that integrates optimization problems (here, specifically in the form of quadratic programs) as individual layers in larger end-to-end trainable deep networks.
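The idea of a QP-as-layer can be made concrete for the simplest case. The sketch below (a hypothetical illustration, not OptNet's implementation) shows the forward pass of an equality-constrained quadratic program solved via its KKT linear system; OptNet handles general QPs with inequality constraints and obtains gradients by differentiating this same KKT system, which is omitted here.

```python
import numpy as np

def qp_layer_forward(Q, p, A, b):
    """Forward pass of a toy QP layer (equality constraints only):
        minimize  0.5 * z^T Q z + p^T z   subject to   A z = b
    Solved exactly via the KKT system
        [Q  A^T] [z ]   [-p]
        [A   0 ] [nu] = [ b]
    Assumes Q is positive definite and A has full row rank."""
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T],
                  [A, np.zeros((m, m))]])
    rhs = np.concatenate([-p, b])
    sol = np.linalg.solve(K, rhs)
    z, nu = sol[:n], sol[n:]      # primal solution and dual variables
    return z, nu

# Tiny example: project -p onto the hyperplane sum(z) = 1.
Q = np.eye(3)
p = np.array([0.2, -0.4, 0.1])
A = np.ones((1, 3))
b = np.array([1.0])
z, nu = qp_layer_forward(Q, p, A, b)
```

The solution satisfies both feasibility (`A z = b`) and stationarity (`Q z + p + A^T nu = 0`), which is exactly the structure OptNet differentiates through in the backward pass.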
Significance: This proof-of-concept study demonstrates the technical feasibility of using a smartwatch and a supervised machine learning approach to more easily monitor and assess at-home adherence to shoulder physiotherapy exercise protocols.
Human-Computer Interaction I.2.1
We propose a typesafe abstraction over tensors (i.e., multidimensional arrays) that exploits the type-level programming capabilities of Scala through heterogeneous lists (HLists), and showcase typesafe abstractions of common tensor operations and of neural layers such as convolutional or recurrent networks.
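To illustrate what such a shape-indexed tensor abstraction buys, here is a minimal runtime analogue in Python (all names hypothetical). In the Scala/HList approach the shape check below happens at the type level, so a dimension mismatch is a compile-time error; this sketch can only raise it at run time.

```python
import numpy as np

class Tensor:
    """Runtime analogue of a shape-indexed tensor: the shape is
    carried alongside the data and checked on every operation."""
    def __init__(self, data, shape):
        self.shape = tuple(shape)
        self.data = np.asarray(data).reshape(self.shape)

    def matmul(self, other):
        # The check that HList-based typing would perform statically:
        if self.shape[-1] != other.shape[0]:
            raise ValueError(f"shape mismatch: {self.shape} x {other.shape}")
        return Tensor(self.data @ other.data,
                      (self.shape[0], other.shape[1]))

x = Tensor(np.arange(6), (2, 3))
w = Tensor(np.arange(12), (3, 4))
y = x.matmul(w)   # well-typed: (2,3) x (3,4) -> (2,4)
```

Calling `w.matmul(x)` instead would raise a `ValueError`; the point of the type-level encoding is to reject exactly such calls before the program ever runs.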
Programming Languages