https://www.tensorflow.org/api_docs/python/tf/contrib/layers/fully_connected

fully_connected:

1. Compute the output from the `weights` (a matrix multiply of the inputs).

2. Apply the `normalizer_fn` to that output (e.g. batch normalization).

3. Apply the `activation_fn` to the result (e.g. ReLU); a rough sketch of these three steps is shown below.
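In other words, the layer behaves roughly like the following raw-ops pipeline (a minimal TF 1.x sketch; the 128-in / 64-out shapes and the use of `batch_norm` here are just illustrative, not part of the source):

    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 128])          # [batch_size, depth]
    w = tf.get_variable('weights', [128, 64],
                        initializer=tf.contrib.layers.xavier_initializer())
    hidden = tf.matmul(x, w)                              # 1. linear transform by the weights
    hidden = tf.contrib.layers.batch_norm(hidden)         # 2. normalize (replaces the bias add)
    outputs = tf.nn.relu(hidden)                          # 3. activate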

Defined in tensorflow/contrib/layers/python/layers/layers.py.

def fully_connected(inputs,
                    num_outputs,
                    activation_fn=nn.relu,
                    normalizer_fn=None,
                    normalizer_params=None,
                    weights_initializer=initializers.xavier_initializer(),
                    weights_regularizer=None,
                    biases_initializer=init_ops.zeros_initializer(),
                    biases_regularizer=None,
                    reuse=None,
                    variables_collections=None,
                    outputs_collections=None,
                    trainable=True,
                    scope=None):
  """Adds a fully connected layer.

  `fully_connected` creates a variable called `weights`, representing a fully
  connected weight matrix, which is multiplied by the `inputs` to produce a
  `Tensor` of hidden units. If a `normalizer_fn` is provided (such as
  `batch_norm`), it is then applied. Otherwise, if `normalizer_fn` is
  None and a `biases_initializer` is provided then a `biases` variable would be
  created and added the hidden units. Finally, if `activation_fn` is not `None`,
  it is applied to the hidden units as well.

  Note: that if `inputs` have a rank greater than 2, then `inputs` is flattened
  prior to the initial matrix multiply by `weights`.

  Args:
    inputs: A tensor of at least rank 2 and static value for the last dimension;
      i.e. `[batch_size, depth]`, `[None, None, None, channels]`.
    num_outputs: Integer or long, the number of output units in the layer.
    activation_fn: Activation function. The default value is a ReLU function.
      Explicitly set it to None to skip it and maintain a linear activation.
    normalizer_fn: Normalization function to use instead of `biases`. If
      `normalizer_fn` is provided then `biases_initializer` and
      `biases_regularizer` are ignored and `biases` are not created nor added.
      default set to None for no normalizer function
    normalizer_params: Normalization function parameters.
    weights_initializer: An initializer for the weights.
    weights_regularizer: Optional regularizer for the weights.
    biases_initializer: An initializer for the biases. If None skip biases.
    biases_regularizer: Optional regularizer for the biases.
    reuse: Whether or not the layer and its variables should be reused. To be
      able to reuse the layer scope must be given.
    variables_collections: Optional list of collections for all the variables or
      a dictionary containing a different list of collections per variable.
    outputs_collections: Collection to add the outputs.
    trainable: If `True` also add variables to the graph collection
      `GraphKeys.TRAINABLE_VARIABLES` (see tf.Variable).
    scope: Optional scope for variable_scope.

  Returns:
    The tensor variable representing the result of the series of operations.

  Raises:
    ValueError: If x has rank less than 2 or if its last dimension is not set.
  """
  if not isinstance(num_outputs, six.integer_types):
    raise ValueError('num_outputs should be int or long, got %s.' %
                     (num_outputs,))

  layer_variable_getter = _build_variable_getter({
      'bias': 'biases',
      'kernel': 'weights'
  })

  with variable_scope.variable_scope(
      scope,
      'fully_connected', [inputs],
      reuse=reuse,
      custom_getter=layer_variable_getter) as sc:
    inputs = ops.convert_to_tensor(inputs)
    layer = core_layers.Dense(
        units=num_outputs,
        activation=None,
        use_bias=not normalizer_fn and biases_initializer,
        kernel_initializer=weights_initializer,
        bias_initializer=biases_initializer,
        kernel_regularizer=weights_regularizer,
        bias_regularizer=biases_regularizer,
        activity_regularizer=None,
        trainable=trainable,
        name=sc.name,
        dtype=inputs.dtype.base_dtype,
        _scope=sc,
        _reuse=reuse)
    outputs = layer.apply(inputs)

    # Add variables to collections.
    _add_variable_to_collections(layer.kernel, variables_collections, 'weights')
    if layer.bias is not None:
      _add_variable_to_collections(layer.bias, variables_collections, 'biases')

    # Apply normalizer function / layer.
    if normalizer_fn is not None:
      if not normalizer_params:
        normalizer_params = {}
      outputs = normalizer_fn(outputs, **normalizer_params)

    if activation_fn is not None:
      outputs = activation_fn(outputs)

    return utils.collect_named_outputs(outputs_collections, sc.name, outputs)
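For reference, a usage sketch of the API (assumes TF 1.x with `tf.contrib` available; the layer sizes and the `is_training` flag are illustrative choices, not from the source):

    import tensorflow as tf

    inputs = tf.placeholder(tf.float32, [None, 784])

    # Default: weights + biases + ReLU.
    h1 = tf.contrib.layers.fully_connected(inputs, num_outputs=256)

    # With a normalizer_fn: biases are skipped, batch_norm runs before the ReLU.
    h2 = tf.contrib.layers.fully_connected(
        h1,
        num_outputs=128,
        normalizer_fn=tf.contrib.layers.batch_norm,
        normalizer_params={'is_training': True})

    # Linear output layer: disable the activation explicitly.
    logits = tf.contrib.layers.fully_connected(h2, num_outputs=10, activation_fn=None)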
