Published January 24, 2024
def embedding(inputs,
              vocab_size,
              num_units,
              zero_pad=True,
              scale=True,
              scope="embedding",
              reuse=None):
    '''Embeds a given tensor.

    Args:
      inputs: A `Tensor` with type `int32` or `int64` containing the ids
        to be looked up in the lookup table.
      vocab_size: An int. Vocabulary size.
      num_units: An int. Number of embedding hidden units.
      zero_pad: A boolean. If True, all the values of the first row (id 0)
        are constant zeros.
      scale: A boolean. If True, the outputs are multiplied by sqrt(num_units).
      scope: Optional scope for `variable_scope`.
      reuse: Boolean, whether to reuse the weights of a previous layer
        by the same name.

    Returns:
      A `Tensor` with one more rank than `inputs`. The last dimension
      is `num_units`.

    For example,

    ```
    import tensorflow as tf

    inputs = tf.to_int32(tf.reshape(tf.range(2*3), (2, 3)))
    outputs = embedding(inputs, 6, 2, zero_pad=True)
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(outputs))
    >>
    [[[ 0.          0.        ]
      [ 0.09754146  0.67385566]
      [ 0.37864095 -0.35689294]]

     [[-1.01329422 -1.09939694]
      [ 0.7521342   0.38203377]
      [-0.04973143 -0.06210355]]]
    ```

    ```
    import tensorflow as tf

    inputs = tf.to_int32(tf.reshape(tf.range(2*3), (2, 3)))
    outputs = embedding(inputs, 6, 2, zero_pad=False)
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(outputs))
    >>
    [[[-0.19172323 -0.39159766]
      [-0.43212751 -0.66207761]
      [ 1.03452027 -0.26704335]]

     [[-0.11634696 -0.35983452]
      [ 0.50208133  0.53509563]
      [ 1.22204471 -0.96587461]]]
    ```
    '''
    with tf.variable_scope(scope, reuse=reuse):
        lookup_table = tf.get_variable('lookup_table',
                                       dtype=tf.float32,
                                       shape=[vocab_size, num_units],
                                       initializer=tf.contrib.layers.xavier_initializer())
        if zero_pad:
            # Replace row 0 (the padding id) with constant zeros.
            lookup_table = tf.concat((tf.zeros(shape=[1, num_units]),
                                      lookup_table[1:, :]), 0)
        outputs = tf.nn.embedding_lookup(lookup_table, inputs)

        if scale:
            outputs = outputs * (num_units ** 0.5)

    return outputs
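To see what `zero_pad` and `scale` actually do, independent of a TF1 session, here is a minimal NumPy sketch of the same logic. `embedding_np` is a hypothetical helper written for this illustration, not part of the original code; it uses a fixed all-ones table instead of a trained Xavier-initialized variable so the effect of each flag is easy to read off.

```python
import numpy as np

def embedding_np(inputs, lookup_table, zero_pad=True, scale=True):
    # Hypothetical NumPy re-implementation of the embedding() logic above.
    table = lookup_table.copy()
    num_units = table.shape[1]
    if zero_pad:
        # Force row 0 (the padding id) to constant zeros,
        # mirroring the tf.concat of a zero row with lookup_table[1:, :].
        table[0, :] = 0.0
    # Fancy indexing plays the role of tf.nn.embedding_lookup:
    # the result has one more rank than `inputs`.
    outputs = table[inputs]
    if scale:
        # Scale by sqrt(num_units), as in the Transformer paper.
        outputs = outputs * num_units ** 0.5
    return outputs

ids = np.arange(6).reshape(2, 3)                  # same ids as the docstring example
table = np.ones((6, 2), dtype=np.float32)         # toy table: every row is [1, 1]
out = embedding_np(ids, table)
print(out.shape)    # (2, 3, 2): one extra rank, last dim = num_units
print(out[0, 0])    # [0. 0.]: id 0 is zeroed out by zero_pad
print(out[0, 1])    # every other row becomes [sqrt(2), sqrt(2)] after scaling
```

With a constant table, the output makes the two flags visible at a glance: id 0 maps to zeros, and every other embedding is multiplied by sqrt(num_units) = sqrt(2).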
Title: Detailed code walkthrough of the Transformer machine-translation model. Source: http://www.roclinux.cn/b/1706054852a499929.html