Before Change
>> Maxout(width, pieces=3), pad=0)
>> PositionEncode(1000, width)
>> flatten_add_lengths
>> MultiHeadedAttention(nM=width, nH=8)
>> with_getitem(0, LayerNorm(nO=width))
>> unflatten
>> with_flatten(Softmax(nr_tag))
)
After Change
>> Maxout(width, pieces=3), pad=0)
>> PositionEncode(1000, width)
>> flatten_add_lengths
>> Residual(MultiHeadedAttention(nM=width, nH=1))
>> unflatten
>> with_flatten(Softmax(nr_tag))
)
In pattern: SUPERPATTERN
Frequency: 3
Non-data size: 4
Instances

Project Name: explosion/thinc
Commit Name: fdc2749cceb52addb817f22ed5531535867cd29e
Time: 2019-06-10
Author: honnibal+gh@gmail.com
File Name: examples/attention_tagger.py
Class Name:
Method Name: main

Project Name: explosion/thinc
Commit Name: e07ec603e2163f7b8b75d9953ee616d1d2f47cdd
Time: 2019-06-09
Author: honnibal+gh@gmail.com
File Name: examples/attention_tagger.py
Class Name:
Method Name: main

Project Name: explosion/thinc
Commit Name: 1f8dd5463cd199ffa64ad98fe3274726ba7301bf
Time: 2019-06-10
Author: honnibal+gh@gmail.com
File Name: examples/attention_tagger.py
Class Name:
Method Name: main