Difference between revisions of "User:Yangzhang"
Peterjalbert (talk | contribs) m (Creating user page for new user.)
(Redirected page to Yang Zhang)

#REDIRECT [[Yang Zhang]]
Hello, this is Yang Zhang. I am a CS PhD student.
Neural networks, in particular recurrent neural networks (RNNs), are now at the core of the leading approaches to language understanding tasks such as language modeling, machine translation and question answering. In ''Attention Is All You Need'' we introduce the Transformer, a novel neural network architecture based on a self-attention mechanism that we believe to be particularly well-suited for language understanding.
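The self-attention mechanism mentioned above can be sketched in a few lines. This is a minimal, hypothetical NumPy illustration (the weight matrices and dimensions are placeholders, not the paper's actual configuration): each position projects to a query, key, and value, and the output at each position is a softmax-weighted mix of all values.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model); project to queries, keys, values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise similarity, scaled
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted mix of values

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))                        # 4 tokens, width 8
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Unlike an RNN, nothing here is sequential: every position attends to every other position in one matrix product, which is what makes the architecture parallelizable.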
Latest revision as of 15:47, 19 September 2017
Redirect to: [[Yang Zhang]]