I. Basic functions 1.1 tf.reduce_sum(input_tensor, axis) Computes the sum of elements across dimensions of a tensor, i.e. sums along dimension axis. x = [[1, 1, 1], [1, 1, 1]], a tensor of rank 2 // With no axis it operates over all dimensions, which amounts to summing every element: tf.reduce_sum(x) ==> 6 // Operating on dimension 0, which in this example means summing by column (dimension 0): tf.reduce_sum(x, 0) ==> [2, 2,…
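A quick runnable sketch of the same axis behaviour (TF 1.x session style assumed, matching the era of the excerpts below):

import tensorflow as tf

x = tf.constant([[1, 1, 1], [1, 1, 1]])   # rank-2 tensor
total = tf.reduce_sum(x)                  # all elements -> 6
by_column = tf.reduce_sum(x, axis=0)      # along dimension 0 -> [2, 2, 2]
by_row = tf.reduce_sum(x, axis=1)         # along dimension 1 -> [3, 3]

with tf.Session() as sess:
    print(sess.run([total, by_column, by_row]))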
github: https://github.com/zle1992/Seq2Seq-Chatbot 1. Note that at the infer stage, the variables need to be reused. 2. If you are using the BeamSearchDecoder with a cell wrapped in AttentionWrapper, then you must ensure that: The encoder output has been tiled to beam_width via tf.contrib.seq…
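A minimal sketch of the tiling requirement mentioned above, assuming a TF 1.x graph; the tensor names (encoder_outputs, source_lengths) and the GRU/Bahdanau choices are illustrative, not the repo's exact variables:

import tensorflow as tf
from tensorflow.contrib import seq2seq

batch_size, max_time, hidden_size, beam_width = 4, 7, 32, 5
encoder_outputs = tf.placeholder(tf.float32, [batch_size, max_time, hidden_size])
source_lengths = tf.placeholder(tf.int32, [batch_size])

# Tile every encoder result beam_width times before the AttentionWrapper sees it.
tiled_outputs = seq2seq.tile_batch(encoder_outputs, multiplier=beam_width)
tiled_lengths = seq2seq.tile_batch(source_lengths, multiplier=beam_width)

attention = seq2seq.BahdanauAttention(num_units=hidden_size,
                                      memory=tiled_outputs,
                                      memory_sequence_length=tiled_lengths)
cell = seq2seq.AttentionWrapper(tf.nn.rnn_cell.GRUCell(hidden_size), attention)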
Convolutional Neural Networks (CNNs) are responsible for the major breakthroughs in image recognition made in the past few years. In this chapter we will cover: Implementing a Simpler CNN, Implementing an Advanced CNN, Retraining Existing CNN models, Ap…
Effective TensorFlow Table of Contents: TensorFlow Basics, Understanding static and dynamic shapes, Scopes and when to use them, Broadcasting the good and the ugly, Feeding data to TensorFlow, Take advantage of the overloaded operators, Understanding order…
fm_model is the model produced by libFM; model.ckpt is the model structure that can be served with TensorFlow Serving. Tested personally, the output is correct. Code:
import tensorflow as tf

# libFM model
def load_fm_model(file_name):
    state = ''
    fid = 0
    max_fid = 0
    w0 = 0.0
    wj = {}
    v = {}
    k = 0
    with open(file_name) as f:
        for line in f:
            line = lin…
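Once w0, wj and v are loaded, the FM score can be rebuilt as a TensorFlow graph. A minimal dense sketch (my own illustration, not the post's exact serving graph), using Rendle's reformulation of the pairwise term:

import tensorflow as tf

def fm_score(x, w0, w, V):
    # x: [batch, n_features], w: [n_features], V: [n_features, k]
    linear = w0 + tf.reduce_sum(x * w, axis=1)
    pairwise = 0.5 * tf.reduce_sum(
        tf.square(tf.matmul(x, V)) - tf.matmul(tf.square(x), tf.square(V)), axis=1)
    return linear + pairwise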
This post is a walkthrough of Google's Transformer model, written in the order of my own understanding. It also reads through Kyubyong's TensorFlow implementation; the code is at https://github.com/Kyubyong/transformer. I will not describe the inner workings of the Transformer in detail here; if you are not yet familiar with it, first read the paper "Attention is all you need" as well as the reference blogs I list, all of which are good explanations. Layer Normalization First comes Layer Norm…
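For reference, a minimal layer normalization sketch in the TF 1.x style used by the Kyubyong repo (the scope name and the gamma/beta initializers here are my own illustrative choices):

import tensorflow as tf

def layer_norm(inputs, epsilon=1e-8, scope="ln"):
    # Normalize over the last (feature) axis, then apply a learned scale and shift.
    with tf.variable_scope(scope):
        feature_shape = inputs.get_shape()[-1:]
        mean, variance = tf.nn.moments(inputs, axes=[-1], keep_dims=True)
        gamma = tf.get_variable("gamma", feature_shape, initializer=tf.ones_initializer())
        beta = tf.get_variable("beta", feature_shape, initializer=tf.zeros_initializer())
        normalized = (inputs - mean) / tf.sqrt(variance + epsilon)
        return gamma * normalized + beta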
iris:
# -*- coding: utf-8 -*-
# K-means with TensorFlow
#----------------------------------
#
# This script shows how to do k-means with TensorFlow
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from sklearn import dataset…
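The core of such a script is the assignment/update loop. A minimal sketch of one k-means iteration in TensorFlow (my own illustration, not the post's full iris script):

import tensorflow as tf

def kmeans_step(points, centroids, k):
    # points: [N, D], centroids: [k, D]
    dists = tf.reduce_sum(tf.square(tf.expand_dims(points, 1) -
                                    tf.expand_dims(centroids, 0)), axis=2)  # [N, k]
    assignments = tf.argmin(dists, axis=1)                                  # [N]
    new_centroids = tf.stack([
        tf.reduce_mean(tf.boolean_mask(points, tf.equal(assignments, c)), axis=0)
        for c in range(k)])
    return assignments, new_centroids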