Successfully fixing python\ops\seq2seq.py TypeError: ms_error() got an unexpected keyword argument 'labels'
Contents

Problem
Approach
Solution
Problem
Error location: contrib\legacy_seq2seq\python\ops\seq2seq.py, line 1098, in sequence_loss_by_example
TypeError: ms_error() got an unexpected keyword argument 'labels'
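
The message means that, at this call site, seq2seq.py passes the user-supplied loss function the keyword arguments labels= and logits=, while the custom ms_error was defined with differently named parameters, so Python rejects the call. A minimal sketch of the mismatch (the body and parameter names of ms_error below are assumptions for illustration; the real definition lives in the user's own code):

import tensorflow as tf

# Assumed custom loss: two positional parameters, neither named 'labels' nor 'logits'.
def ms_error(y_pre, y_target):
    return tf.square(tf.subtract(y_pre, y_target))   # per-example squared error

target = tf.constant([1.0, 2.0])
logit = tf.constant([1.5, 1.5])

ms_error(target, logit)                  # positional call: works
ms_error(labels=target, logits=logit)    # keyword call, as seq2seq.py makes it:
                                         # TypeError: ms_error() got an unexpected keyword argument 'labels'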
Approach

Check how the function is defined and used:
def sequence_loss_by_example(logits,
                             targets,
                             weights,
                             average_across_timesteps=True,
                             softmax_loss_function=None,
                             name=None):
  """Weighted cross-entropy loss for a sequence of logits (per example).

  Args:
    logits: List of 2D Tensors of shape [batch_size x num_decoder_symbols].
    targets: List of 1D batch-sized int32 Tensors of the same length as logits.
    weights: List of 1D batch-sized float-Tensors of the same length as logits.
    average_across_timesteps: If set, divide the returned cost by the total
      label weight.
    softmax_loss_function: Function (labels, logits) -> loss-batch
      to be used instead of the standard softmax (the default if this is None).
      **Note that to avoid confusion, it is required for the function to accept
      named arguments.**
    name: Optional name for this operation, default: "sequence_loss_by_example".

  Returns:
    1D batch-sized float Tensor: The log-perplexity for each sequence.

  Raises:
    ValueError: If len(logits) is different from len(targets) or len(weights).
  """
  if len(targets) != len(logits) or len(weights) != len(logits):
    raise ValueError("Lengths of logits, weights, and targets must be the same "
                     "%d, %d, %d." % (len(logits), len(weights), len(targets)))
  with ops.name_scope(name, "sequence_loss_by_example",
                      logits + targets + weights):
    log_perp_list = []
    for logit, target, weight in zip(logits, targets, weights):
      if softmax_loss_function is None:
        # TODO(irving,ebrevdo): This reshape is needed because
        # sequence_loss_by_example is called with scalars sometimes, which
        # violates our general scalar strictness policy.
        target = array_ops.reshape(target, [-1])
        crossent = nn_ops.sparse_softmax_cross_entropy_with_logits(
            labels=target, logits=logit)
      else:
        crossent = softmax_loss_function(target, logit)  # 190318 change: positional arguments instead of labels=/logits= keywords
      log_perp_list.append(crossent * weight)
    log_perps = math_ops.add_n(log_perp_list)
    if average_across_timesteps:
      total_size = math_ops.add_n(weights)
      total_size += 1e-12  # Just to avoid division by 0 for all-0 weights.
      log_perps /= total_size
  return log_perps
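
Note that the docstring above already warns that the softmax_loss_function callback "is required ... to accept named arguments" (labels, logits). So an equivalent way out, without touching TensorFlow's source, is to give the custom loss exactly those parameter names; this is only a sketch of that alternative, assuming the loss is a simple squared error:

import tensorflow as tf

# Alternative (not the fix applied below): rename the parameters so that the
# original keyword call softmax_loss_function(labels=target, logits=logit) succeeds.
def ms_error(labels, logits):
    return tf.square(tf.subtract(labels, logits))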
Solution
crossent = softmax_loss_function(labels=target, logits=logit)

Change it to

crossent = softmax_loss_function(target, logit)
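
To sanity-check the patched call, here is a small usage sketch (assuming TensorFlow 1.x, where tf.contrib.legacy_seq2seq is available, and the same assumed two-argument ms_error as above):

import tensorflow as tf

def ms_error(y_pre, y_target):                       # no 'labels'/'logits' keywords
    return tf.square(tf.subtract(y_pre, y_target))

n_steps, batch_size = 3, 4
pred    = [tf.random_normal([batch_size]) for _ in range(n_steps)]  # one 1-D tensor per time step
real    = [tf.random_normal([batch_size]) for _ in range(n_steps)]
weights = [tf.ones([batch_size]) for _ in range(n_steps)]

# With the positional call inside sequence_loss_by_example, the custom loss is
# invoked once per time step and no keyword mismatch occurs.
losses = tf.contrib.legacy_seq2seq.sequence_loss_by_example(
    pred, real, weights,
    average_across_timesteps=True,
    softmax_loss_function=ms_error)

with tf.Session() as sess:
    print(sess.run(losses))              # 1-D tensor of length batch_size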
Mission accomplished!