Getting started with TensorFlow
Win10, CPU build: one line in the Anaconda prompt, pip install --upgrade tensorflow, and it's done. Ten thousand times easier to install than Caffe.
The GPU build didn't install. This laptop has no CUDA, and a desktop that does have CUDA 8.0 and cuDNN 5.1 installed reports the same error: a missing setuptools egg. (Judging from the traceback below, pip removed the old setuptools-27.2.0 egg while upgrading setuptools, but a stale reference to the egg was left behind.)
The console log:
(D:\Users\song\Anaconda3) C:\SPB_Data>python --version
Python 3.6.0 :: Anaconda 4.3.1 (64-bit)
(D:\Users\song\Anaconda3) C:\SPB_Data>pip -V
pip 9.0.1 from D:\Users\song\Anaconda3\lib\site-packages (python 3.6)
(D:\Users\song\Anaconda3) C:\SPB_Data>pip3 install --upgrade tensorflow-gpu
'pip3' is not recognized as an internal or external command, operable program or batch file.
(D:\Users\song\Anaconda3) C:\SPB_Data>pip install --upgrade tensorflow-gpu
Collecting tensorflow-gpu
Downloading tensorflow_gpu-1.3.0-cp36-cp36m-win_amd64.whl (60.0MB)
100% |████████████████████████████████| 60.0MB 16kB/s
Collecting numpy>=1.11.0 (from tensorflow-gpu)
Downloading numpy-1.13.3-cp36-none-win_amd64.whl (13.1MB)
100% |████████████████████████████████| 13.1MB 73kB/s
Collecting protobuf>=3.3.0 (from tensorflow-gpu)
Downloading protobuf-3.4.0-py2.py3-none-any.whl (375kB)
100% |████████████████████████████████| 378kB 667kB/s
Collecting wheel>=0.26 (from tensorflow-gpu)
Downloading wheel-0.30.0-py2.py3-none-any.whl (49kB)
100% |████████████████████████████████| 51kB 371kB/s
Collecting tensorflow-tensorboard<0.2.0,>=0.1.0 (from tensorflow-gpu)
Downloading tensorflow_tensorboard-0.1.8-py3-none-any.whl (1.6MB)
100% |████████████████████████████████| 1.6MB 413kB/s
Collecting six>=1.10.0 (from tensorflow-gpu)
Downloading six-1.11.0-py2.py3-none-any.whl
Collecting setuptools (from protobuf>=3.3.0->tensorflow-gpu)
Downloading setuptools-36.6.0-py2.py3-none-any.whl (481kB)
100% |████████████████████████████████| 481kB 734kB/s
Collecting bleach==1.5.0 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
Downloading bleach-1.5.0-py2.py3-none-any.whl
Collecting werkzeug>=0.11.10 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
Downloading Werkzeug-0.12.2-py2.py3-none-any.whl (312kB)
100% |████████████████████████████████| 317kB 1.7MB/s
Collecting html5lib==0.9999999 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
Downloading html5lib-0.9999999.tar.gz (889kB)
100% |████████████████████████████████| 890kB 502kB/s
Collecting markdown>=2.6.8 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
Downloading Markdown-2.6.9.tar.gz (271kB)
100% |████████████████████████████████| 276kB 687kB/s
Building wheels for collected packages: html5lib, markdown
Running setup.py bdist_wheel for html5lib ... done
Stored in directory: C:\Users\song\AppData\Local\pip\Cache\wheels\6f\85\6c\56b8e1292c6214c4eb73b9dda50f53e8e977bf65989373c962
Running setup.py bdist_wheel for markdown ... done
Stored in directory: C:\Users\song\AppData\Local\pip\Cache\wheels\bf\46\10\c93e17ae86ae3b3a919c7b39dad3b5ccf09aeb066419e5c1e5
Successfully built html5lib markdown
Installing collected packages: numpy, setuptools, six, protobuf, wheel, html5lib, bleach, werkzeug, markdown, tensorflow-tensorboard, tensorflow-gpu
Found existing installation: numpy 1.11.3
Uninstalling numpy-1.11.3:
Successfully uninstalled numpy-1.11.3
Found existing installation: setuptools 27.2.0
Uninstalling setuptools-27.2.0:
Successfully uninstalled setuptools-27.2.0
Found existing installation: six 1.10.0
DEPRECATION: Uninstalling a distutils installed project (six) has been deprecated and will be removed in a future version. This is due to the fact that uninstalling a distutils project will only partially uninstall the project.
Uninstalling six-1.10.0:
Successfully uninstalled six-1.10.0
Found existing installation: wheel 0.29.0
Uninstalling wheel-0.29.0:
Successfully uninstalled wheel-0.29.0
Found existing installation: Werkzeug 0.11.15
Uninstalling Werkzeug-0.11.15:
Successfully uninstalled Werkzeug-0.11.15
Successfully installed bleach-1.5.0 html5lib-0.9999999 markdown-2.6.9 numpy-1.13.3 protobuf-3.4.0 setuptools-36.6.0 six-1.11.0 tensorflow-gpu-1.3.0 tensorflow-tensorboard-0.1.8 werkzeug-0.12.2 wheel-0.30.0
Traceback (most recent call last):
File "D:\Users\song\Anaconda3\Scripts\pip-script.py", line 5, in <module>
sys.exit(pip.main())
File "D:\Users\song\Anaconda3\lib\site-packages\pip\__init__.py", line 249, in main
return command.main(cmd_args)
File "D:\Users\song\Anaconda3\lib\site-packages\pip\basecommand.py", line 252, in main
pip_version_check(session)
File "D:\Users\song\Anaconda3\lib\site-packages\pip\utils\outdated.py", line 102, in pip_version_check
installed_version = get_installed_version("pip")
File "D:\Users\song\Anaconda3\lib\site-packages\pip\utils\__init__.py", line 838, in get_installed_version
working_set = pkg_resources.WorkingSet()
File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 644, in __init__
self.add_entry(entry)
File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 700, in add_entry
for dist in find_distributions(entry, True):
File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1949, in find_eggs_in_zip
if metadata.has_metadata('PKG-INFO'):
File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1463, in has_metadata
return self.egg_info and self._has(self._fn(self.egg_info, name))
File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1823, in _has
return zip_path in self.zipinfo or zip_path in self._index()
File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1703, in zipinfo
return self._zip_manifests.load(self.loader.archive)
File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1643, in load
mtime = os.stat(path).st_mtime
FileNotFoundError: [WinError 2] The system cannot find the file specified.: 'D:\\Users\\song\\Anaconda3\\lib\\site-packages\\setuptools-27.2.0-py3.6.egg'
(D:\Users\song\Anaconda3) C:\SPB_Data>cd ..
(D:\Users\song\Anaconda3) C:\>cd ..
(D:\Users\song\Anaconda3) C:\>ls
'ls' is not recognized as an internal or external command, operable program or batch file.
(D:\Users\song\Anaconda3) C:\>python --version
Python 3.6.0 :: Anaconda 4.3.1 (64-bit)
(D:\Users\song\Anaconda3) C:\>nvcc -V
'nvcc' is not recognized as an internal or external command, operable program or batch file.
(D:\Users\song\Anaconda3) C:\>pip install --upgrade tensorflow
Collecting tensorflow
Downloading tensorflow-1.3.0-cp36-cp36m-win_amd64.whl (25.5MB)
100% |████████████████████████████████| 25.5MB 29kB/s
Requirement already up-to-date: protobuf>=3.3.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
Requirement already up-to-date: wheel>=0.26 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
Requirement already up-to-date: tensorflow-tensorboard<0.2.0,>=0.1.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
Requirement already up-to-date: six>=1.10.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
Requirement already up-to-date: numpy>=1.11.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
Requirement already up-to-date: setuptools in d:\users\song\anaconda3\lib\site-packages (from protobuf>=3.3.0->tensorflow)
Requirement already up-to-date: markdown>=2.6.8 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
Requirement already up-to-date: bleach==1.5.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
Requirement already up-to-date: html5lib==0.9999999 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
Requirement already up-to-date: werkzeug>=0.11.10 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
Installing collected packages: tensorflow
Successfully installed tensorflow-1.3.0
(D:\Users\song\Anaconda3) C:\>import tensorflow as tf
'import' is not recognized as an internal or external command, operable program or batch file.
(D:\Users\song\Anaconda3) C:\>python
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
>>> a = tf.random_normal((100,100))
>>> b = tf.random_normal((100,500))
>>> c=tf.matmul(a,b)
>>> sess=tf.InteractiveSession()
2017-10-29 20:46:03.615036: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-10-29 20:46:03.620666: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
>>> sess.run(c)
array([[ -2.01546478e+01, -1.21840429e+01, 8.52634966e-01, ...,
-1.93460350e+01, -1.17136412e+01, -2.81856956e+01],
[ -2.86180496e+00, 1.86777287e+01, 2.39728212e-01, ...,
1.65606441e+01, -8.35585117e+00, 1.21092701e+01],
[ -6.70668936e+00, -1.92020512e+00, -8.63678837e+00, ...,
1.19851971e+01, -1.95774388e+00, -3.46706104e+00],
...,
[ -6.20419502e+00, -1.58898029e+01, 1.47155542e+01, ...,
-6.35781908e+00, -7.09256840e+00, 1.04180880e+01],
[ -1.14867371e-03, -2.47349381e+00, 1.40450490e+00, ...,
1.87805653e+00, 7.70393276e+00, -1.11452806e+00],
[ -1.81114292e+01, 2.83652916e+01, 2.23067703e+01, ...,
4.72095060e+00, 2.01743245e+00, 9.46466255e+00]], dtype=float32)
>>> c
<tf.Tensor 'MatMul:0' shape=(100, 500) dtype=float32>
>>> print(c)
Tensor("MatMul:0", shape=(100, 500), dtype=float32)
>>> print(c.val)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'Tensor' object has no attribute 'val'
>>> print(c.eval())
[[ 7.44645548e+00 7.01777339e-01 -3.29522681e+00 ..., -4.11035490e+00
6.88585615e+00 -1.03243275e+01]
[ 1.74935007e+00 -8.06512642e+00 -8.94767094e+00 ..., -8.51691341e+00
-6.86603403e+00 9.46757889e+00]
[ -6.61030436e+00 5.86357307e+00 1.51259956e+01 ..., -9.53737926e+00
1.95381641e-02 1.16717541e+00]
...,
[ -5.34449625e+00 1.13798809e+00 1.34737101e+01 ..., 6.86746025e+00
3.37234330e+00 -9.16017354e-01]
[ -3.89829564e+00 1.19947767e+00 9.16424465e+00 ..., 7.61591375e-01
-1.70225441e-01 1.02892227e+01]
[ 1.97680518e-01 -1.99925423e+01 -9.40755844e+00 ..., 5.44214249e+00
1.52138865e+00 2.48984170e+00]]
>>> print(a)
Tensor("random_normal:0", shape=(100, 100), dtype=float32)
>>> sess=tf.InteractiveSession()
>>> print(sess.run(a))
[[-1.394485 -1.95048952 0.76553309 ..., -0.43924141 -1.21975422
0.60572529]
[ 0.34292024 0.86016667 -2.25437665 ..., 1.67957187 1.57846153
-1.53106809]
[ 0.08453497 0.59995687 -1.37805259 ..., -0.92989731 -0.07856822
-1.36062932]
...,
[-0.41187105 0.60689414 -0.44695681 ..., 0.51408201 -1.49676847
0.95741159]
[-1.01903558 -1.24220276 0.12283699 ..., 0.53144586 -0.2782338
0.34964591]
[ 0.27783027 0.5017578 -1.0619179 ..., 0.4974283 -0.04771407
0.48028085]]
>>> ls
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'ls' is not defined
>>> exit()
(D:\Users\song\Anaconda3) C:\>e:\
'e:\' is not recognized as an internal or external command, operable program or batch file.
(D:\Users\song\Anaconda3) C:\>cd e:\
(D:\Users\song\Anaconda3) C:\>python minst.py
  File "minst.py", line 16
SyntaxError: Non-UTF-8 code starting with '\xb0' in file minst.py on line 16, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details
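This is PEP 263 in action: Python 3 assumes a source file is UTF-8 unless line 1 or 2 declares another encoding, and the '\xb0' byte suggests the script was saved in a Windows ANSI encoding such as GBK. A minimal sketch of the behavior (an illustration, not the author's actual minst.py):

```python
# Python 3 assumes source is UTF-8 unless a coding comment on line 1 or 2
# says otherwise; these GBK bytes are invalid as UTF-8.
source = "x = '度'".encode('gbk')

try:
    compile(source, 'minst.py', 'exec')
    print('compiled without a declaration (unexpected)')
except (SyntaxError, ValueError) as e:
    print('rejected without a declaration:', type(e).__name__)

# Declaring the real encoding as the first line fixes it.
fixed = b'# -*- coding: gbk -*-\n' + source
ns = {}
exec(compile(fixed, 'minst.py', 'exec'), ns)
print('with the declaration, x =', ns['x'])
```

Re-saving the file as UTF-8 works just as well as adding the coding comment.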
(D:\Users\song\Anaconda3) C:\>python
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
>>> flags = tf.app.flags
>>> FLAGS = flags.FLAGS
>>> flags.DEFINE_string('data_dir', '/tmp/data/', 'Directory for storing data')
>>>
>>> mnist = input_data.read_data_sets(FLAGS.data_dir, one_hot=True)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'input_data' is not defined
>>> from __future__ import absolute_import
>>> from __future__ import division
>>> from __future__ import print_function
>>> from tensorflow.examples.tutorials.mnist import input_data
>>>
>>> mnist = input_data.read_data_sets(FLAGS.data_dir, one_hot=True)
Successfully downloaded train-images-idx3-ubyte.gz 9912422 bytes.
Extracting /tmp/data/train-images-idx3-ubyte.gz
Traceback (most recent call last):
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 1318, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "D:\Users\song\Anaconda3\lib\http\client.py", line 1239, in request
self._send_request(method, url, body, headers, encode_chunked)
File "D:\Users\song\Anaconda3\lib\http\client.py", line 1285, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "D:\Users\song\Anaconda3\lib\http\client.py", line 1234, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "D:\Users\song\Anaconda3\lib\http\client.py", line 1026, in _send_output
self.send(msg)
File "D:\Users\song\Anaconda3\lib\http\client.py", line 964, in send
self.connect()
File "D:\Users\song\Anaconda3\lib\http\client.py", line 1400, in connect
server_hostname=server_hostname)
File "D:\Users\song\Anaconda3\lib\ssl.py", line 401, in wrap_socket
_context=self, _session=session)
File "D:\Users\song\Anaconda3\lib\ssl.py", line 808, in __init__
self.do_handshake()
File "D:\Users\song\Anaconda3\lib\ssl.py", line 1061, in do_handshake
self._sslobj.do_handshake()
File "D:\Users\song\Anaconda3\lib\ssl.py", line 683, in do_handshake
self._sslobj.do_handshake()
ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:749)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\mnist.py", line 240, in read_data_sets
SOURCE_URL + TRAIN_LABELS)
File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\base.py", line 208, in maybe_download
temp_file_name, _ = urlretrieve_with_retry(source_url)
File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\base.py", line 165, in wrapped_fn
return fn(*args, **kwargs)
File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\base.py", line 190, in urlretrieve_with_retry
return urllib.request.urlretrieve(url, filename)
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 248, in urlretrieve
with contextlib.closing(urlopen(url, data)) as fp:
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 223, in urlopen
return opener.open(url, data, timeout)
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 526, in open
response = self._open(req, data)
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 544, in _open
'_open', req)
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 504, in _call_chain
result = func(*args)
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 1361, in https_open
context=self._context, check_hostname=self._check_hostname)
File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 1320, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:749)>
>>> import requests
(I then pasted an HTTPAdapter-based TLS workaround found online, but pasting the lines together with their ">>> " and "... " prompts included only produced a string of SyntaxErrors. Cleaned up, the snippet was:)
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager
import ssl

class MyAdapter(HTTPAdapter):
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       ssl_version=ssl.PROTOCOL_TLSv1)

s = requests.Session()
s.mount('https://', MyAdapter())
s.get('https://www.supercash.cz')   # <Response [200]>
>>> mnist = input_data.read_data_sets(FLAGS.data_dir, one_hot=True)
Extracting /tmp/data/train-images-idx3-ubyte.gz
Successfully downloaded train-labels-idx1-ubyte.gz 28881 bytes.
Extracting /tmp/data/train-labels-idx1-ubyte.gz
Successfully downloaded t10k-images-idx3-ubyte.gz 1648877 bytes.
Extracting /tmp/data/t10k-images-idx3-ubyte.gz
Successfully downloaded t10k-labels-idx1-ubyte.gz 4542 bytes.
Extracting /tmp/data/t10k-labels-idx1-ubyte.gz
>>> x = tf.placeholder(tf.float32, [None, 784]) # placeholders
>>> y = tf.placeholder(tf.float32, [None, 10])
>>> W = tf.Variable(tf.zeros([784, 10]))
>>> b = tf.Variable(tf.zeros([10]))
>>> a = tf.nn.softmax(tf.matmul(x, W) + b)
>>> cross_entropy = tf.reduce_mean(-tf.reduce_sum(y * tf.log(a), reduction_indices=[1])) # loss: cross-entropy
>>> optimizer = tf.train.GradientDescentOptimizer(0.5) # gradient descent, learning rate 0.5
>>> train = optimizer.minimize(cross_entropy) # objective: minimize the loss
>>>
>>> # Test trained model
... correct_prediction = tf.equal(tf.argmax(a, 1), tf.argmax(y, 1))
>>> accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
>>> sess = tf.InteractiveSession() # start an interactive session
2017-10-29 21:28:03.960497: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-10-29 21:28:03.968465: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
>>> tf.initialize_all_variables().run()
WARNING:tensorflow:From D:\Users\song\Anaconda3\lib\site-packages\tensorflow\python\util\tf_should_use.py:175: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.
Instructions for updating:
Use `tf.global_variables_initializer` instead.
>>> for i in range(1000):
... batch_xs, batch_ys = mnist.train.next_batch(100)
... train.run({x: batch_xs, y: batch_ys})
... print(sess.run(accuracy,feed_dict={x:mnist.test.images,y:mnist.test.labels}))
File "<stdin>", line 4
print(sess.run(accuracy,feed_dict={x:mnist.test.images,y:mnist.test.labels}))
^
SyntaxError: invalid syntax
>>> tf.initialize_all_variables().run()
WARNING:tensorflow:From D:\Users\song\Anaconda3\lib\site-packages\tensorflow\python\util\tf_should_use.py:175: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.
Instructions for updating:
Use `tf.global_variables_initializer` instead.
>>> tf.global_variables_initializer().run()
>>> for i in range(1000):
... batch_xs, batch_ys = mnist.train.next_batch(100)
... train.run({x: batch_xs, y: batch_ys})
...
>>> print(sess.run(accuracy,feed_dict={x:mnist.test.images,y:mnist.test.labels}))
0.9154
>>>
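For reference, the cross_entropy and accuracy ops in the session above are plain formulas; here they are sketched in NumPy on a made-up 2-class batch (in the real model a and y have shape (batch, 10)):

```python
import numpy as np

# Toy batch: a = predicted class probabilities, y = one-hot labels.
a = np.array([[0.8, 0.2],
              [0.3, 0.7]])
y = np.array([[1.0, 0.0],
              [0.0, 1.0]])

# Mirrors tf.reduce_mean(-tf.reduce_sum(y * tf.log(a), reduction_indices=[1]))
cross_entropy = np.mean(-np.sum(y * np.log(a), axis=1))

# Mirrors tf.reduce_mean(tf.cast(tf.equal(tf.argmax(a, 1), tf.argmax(y, 1)), tf.float32))
accuracy = np.mean(np.argmax(a, axis=1) == np.argmax(y, axis=1))

print(round(cross_entropy, 4))  # 0.2899: the mean of -ln(0.8) and -ln(0.7)
print(accuracy)                 # 1.0: both argmax predictions match the labels
```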
----------2018.02.10 Another Ubuntu server; TensorFlow already installed, python can import it--------------
python -c 'import tensorflow as tf; print(tf.__version__)'
or:
>>> tf.__version__
>>> tf.__path__
Using the TensorFlow-Slim App:
The README under models/research/slim on GitHub explains this in detail.
python download_and_convert_data.py --dataset_name=flowers --dataset_dir="/home/.../data/"
After a short wait the download finishes, and ls /home/.../data shows the converted files.
Create a new creatingTFSlimDataset.py containing:
import tensorflow as tf
from datasets import flowers

slim = tf.contrib.slim

# Selects the 'validation' split.
dataset = flowers.get_split('validation', "/home/.../data/")

# Creates a TF-Slim DataProvider which reads the dataset in the background
# during both training and testing.
provider = slim.dataset_data_provider.DatasetDataProvider(dataset)
[image, label] = provider.get(['image', 'label'])
Then run python creatingTFSlimDataset.py from the slim directory.
----------2018.02.11 MacBook Pro--------------
For macOS, install per the official site and tutorials. Each time you open a terminal, cd into the tf directory and run
source bin/activate
to enter the environment; run deactivate to leave it.
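That activate/deactivate routine is the standard virtualenv cycle. A sketch of the whole workflow, assuming the environment lives in a directory named tf as in these notes (python3 -m venv is the stdlib way to create one; the original setup may have used the virtualenv tool instead):

```shell
python3 -m venv tf                          # one-time: create the environment
. tf/bin/activate                           # enter it; the prompt gains a "(tf)" prefix
python -c 'import sys; print(sys.prefix)'   # now points inside tf/
deactivate                                  # leave the environment
```

Packages pip-installed while the environment is active land inside tf/ and don't touch the system Python.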
----------2018.02.11 Training a CNN on my own dataset----------
The code is entirely from here; write a Python script, paste the author's code in, and add at the top and the bottom, respectively:
import os
import numpy as np
import tensorflow as tf

if __name__ == '__main__':
    run_training()
Either delete the inputData, model, etc. references inside run_training(), or split the code into several .py files and import them.
Point run_training() at your own data. The author's code supports 2-class classification: create two directories named 0 and 1 and put the image files directly inside them, nothing else.
Running python xx.py in that directory kept failing with baffling errors:
2018-02-11 16:14:27.688087: W tensorflow/core/framework/op_kernel.cc:1188] Unimplemented: Cast float to string is not supported
2018-02-11 16:14:27.691410: W tensorflow/core/framework/op_kernel.cc:1188] Unimplemented: Cast float to string is not supported
2018-02-11 16:14:27.700514: W tensorflow/core/framework/op_kernel.cc:1188] Unimplemented: Cast float to string is not supported
2018-02-11 16:14:27.700547: E tensorflow/core/common_runtime/executor.cc:651] Executor failed to create kernel. Unimplemented: Cast float to string is not supported
[[Node: Cast = Cast[DstT=DT_STRING, SrcT=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](Cast/x)]]
tensorflow.python.framework.errors_impl.InvalidArgumentError: Expected image (JPEG, PNG, or GIF), got unknown format starting with '\000\000\000\001Bud1\000\000(\000\000\000\010\000'
Solved per a reference here: macOS drops a hidden .DS_Store file into every directory, so the for ... in os.listdir(filename+train_class) loop in get_files also reads in this file, which is not an image. To show and remove them:
ls -d .*      # show the directory's hidden files
rm .DS_Store  # delete it
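Instead of hand-deleting .DS_Store everywhere, the listing loop can simply skip hidden entries. A sketch (list_images is my name, standing in for the get_files function mentioned above; the 0/1 directory layout is the one these notes describe):

```python
import os

def list_images(root):
    """Collect (path, label) pairs from class directories named '0' and '1',
    skipping dotfiles such as macOS's .DS_Store."""
    pairs = []
    for cls in ('0', '1'):
        class_dir = os.path.join(root, cls)
        for name in sorted(os.listdir(class_dir)):
            if name.startswith('.'):        # .DS_Store and other hidden junk
                continue
            pairs.append((os.path.join(class_dir, name), int(cls)))
    return pairs
```

With that guard the image decoder never sees Finder metadata, so the "Expected image ... got unknown format" error cannot recur.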
After brute-force deleting every .DS_Store, it runs:
(tf) ...$ python selfDataTest.py
2018-02-11 16:29:32.737708: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.2 AVX AVX2 FMA
0
loss:0.693223297596 accuracy:0.5
1
loss:0.693074345589 accuracy:0.5
2
loss:0.697106897831 accuracy:0.25
3
loss:0.693120956421 accuracy:0.5
4
loss:0.693217039108 accuracy:0.5
(... the loss hovers around 0.693, which is ln 2, i.e. chance level for two classes, until roughly step 120, then finally starts to fall ...)
124
loss:0.236011490226 accuracy:1.0
125
loss:0.360839009285 accuracy:0.75
126
loss:0.160097658634 accuracy:1.0
(... noisy but trending down; batch accuracy is frequently 1.0 by the mid-200s ...)
290
loss:0.319148600101 accuracy:0.75
291
loss:0.000944297935348 accuracy:1.0
292
loss:0.000640460464638 accuracy:1.0
293
loss:0.00669733900577 accuracy:1.0
294
loss:0.00175015779678 accuracy:1.0
295
loss:0.0475143417716 accuracy:1.0
296
loss:0.00636913161725 accuracy:1.0
297
loss:0.00344254914671 accuracy:1.0
298
loss:0.629906773567 accuracy:0.75
299
loss:0.00485158292577 accuracy:1.0
300
loss:0.117860376835 accuracy:1.0
301
loss:2.2443985939 accuracy:0.75
302
loss:0.00151524401736 accuracy:1.0
303
loss:0.668887317181 accuracy:0.75
304
loss:0.341220498085 accuracy:0.75
305
loss:0.243527442217 accuracy:0.75
306
loss:0.109274975955 accuracy:1.0
307
loss:0.127818629146 accuracy:1.0
308
loss:0.0721819028258 accuracy:1.0
309
loss:0.0184937343001 accuracy:1.0
310
loss:0.820344865322 accuracy:0.5
311
loss:0.0684595555067 accuracy:1.0
312
loss:0.364878892899 accuracy:0.75
313
loss:0.119165182114 accuracy:1.0
314
loss:0.917512893677 accuracy:0.5
315
loss:0.208229511976 accuracy:0.75
316
loss:0.0379144325852 accuracy:1.0
317
loss:0.291262000799 accuracy:0.75
318
loss:1.70546030998 accuracy:0.5
319
loss:0.0183182619512 accuracy:1.0
320
loss:0.382932752371 accuracy:0.75
321
loss:0.163620784879 accuracy:1.0
322
loss:0.319008469582 accuracy:0.75
323
loss:0.088489279151 accuracy:1.0
324
loss:0.715149879456 accuracy:0.5
325
loss:0.0675266161561 accuracy:1.0
326
loss:0.916550815105 accuracy:0.5
327
loss:0.448634713888 accuracy:1.0
328
loss:0.271819204092 accuracy:0.75
329
loss:0.0831155627966 accuracy:1.0
330
loss:0.171018838882 accuracy:1.0
331
loss:0.0210947152227 accuracy:1.0
332
loss:0.331143260002 accuracy:0.75
333
loss:0.50136744976 accuracy:0.75
334
loss:0.156625300646 accuracy:1.0
335
loss:0.0159201174974 accuracy:1.0
336
loss:0.171763345599 accuracy:1.0
337
loss:0.317091315985 accuracy:0.75
338
loss:0.00742457062006 accuracy:1.0
339
loss:0.147552683949 accuracy:1.0
340
loss:0.265565574169 accuracy:0.75
341
loss:0.0794127807021 accuracy:1.0
342
loss:0.90516358614 accuracy:0.75
343
loss:0.0485695488751 accuracy:1.0
344
loss:0.929676651955 accuracy:0.75
345
loss:0.0915883779526 accuracy:1.0
346
loss:0.0149378413334 accuracy:1.0
347
loss:0.0227350518107 accuracy:1.0
348
loss:0.188080132008 accuracy:1.0
349
loss:0.0991646498442 accuracy:1.0
350
loss:0.0718017593026 accuracy:1.0
351
loss:1.19274258614 accuracy:0.5
352
loss:0.965473353863 accuracy:0.5
353
loss:0.259137153625 accuracy:0.75
354
loss:0.0660394281149 accuracy:1.0
355
loss:0.0636159256101 accuracy:1.0
356
loss:0.473960787058 accuracy:0.75
357
loss:0.0584978982806 accuracy:1.0
358
loss:0.225148662925 accuracy:1.0
359
loss:0.551927268505 accuracy:0.75
360
loss:0.129055544734 accuracy:1.0
361
loss:0.135725021362 accuracy:1.0
362
loss:0.05837514624 accuracy:1.0
363
loss:0.050028629601 accuracy:1.0
364
loss:0.0220219194889 accuracy:1.0
365
loss:0.563142418861 accuracy:0.75
366
loss:0.213800609112 accuracy:1.0
367
loss:0.0281376540661 accuracy:1.0
368
loss:1.20224881172 accuracy:0.75
369
loss:0.528139770031 accuracy:0.75
370
loss:0.124928534031 accuracy:1.0
371
loss:0.26053994894 accuracy:0.75
372
loss:0.200136646628 accuracy:0.75
373
loss:0.106237880886 accuracy:1.0
374
loss:0.317531168461 accuracy:1.0
375
loss:0.246357157826 accuracy:0.75
376
loss:0.161189392209 accuracy:1.0
377
loss:0.0400363244116 accuracy:1.0
378
loss:0.000115944123536 accuracy:1.0
379
loss:0.0736970975995 accuracy:1.0
380
loss:2.95828056335 accuracy:0.5
381
loss:0.0402479618788 accuracy:1.0
382
loss:0.27467161417 accuracy:1.0
383
loss:0.0441851988435 accuracy:1.0
384
loss:0.0222014114261 accuracy:1.0
385
loss:0.0845765322447 accuracy:1.0
386
loss:0.21609556675 accuracy:0.75
387
loss:0.305368185043 accuracy:0.75
388
loss:0.457645982504 accuracy:0.75
389
loss:0.479472994804 accuracy:0.75
390
loss:0.163302078843 accuracy:1.0
391
loss:0.436002552509 accuracy:0.75
392
loss:0.128151774406 accuracy:1.0
393
loss:0.258456408978 accuracy:0.75
394
loss:0.22227601707 accuracy:0.75
395
loss:0.0503372251987 accuracy:1.0
396
loss:0.02476574108 accuracy:1.0
397
loss:0.000495057029184 accuracy:1.0
398
loss:0.419431209564 accuracy:0.75
399
loss:0.279945731163 accuracy:1.0
400
loss:0.000864843954332 accuracy:1.0
401
loss:0.0879789367318 accuracy:1.0
402
loss:0.00543585978448 accuracy:1.0
403
loss:0.0035734588746 accuracy:1.0
404
loss:0.00278418860398 accuracy:1.0
405
loss:0.800966143608 accuracy:0.75
406
loss:0.0348575152457 accuracy:1.0
407
loss:0.217690259218 accuracy:0.75
408
loss:0.00130753079429 accuracy:1.0
409
loss:0.00162001827266 accuracy:1.0
410
loss:0.546540558338 accuracy:0.75
411
loss:0.443211138248 accuracy:0.75
412
loss:0.0923056006432 accuracy:1.0
413
loss:0.282079219818 accuracy:0.75
414
loss:0.304762452841 accuracy:0.75
415
loss:0.292380183935 accuracy:0.75
416
loss:0.028173699975 accuracy:1.0
417
loss:0.0553055480123 accuracy:1.0
418
loss:0.388806015253 accuracy:0.75
419
loss:0.256281733513 accuracy:0.75
420
loss:0.00459419749677 accuracy:1.0
421
loss:0.108316868544 accuracy:1.0
422
loss:0.00306999869645 accuracy:1.0
423
loss:0.185824766755 accuracy:0.75
424
loss:0.0356827452779 accuracy:1.0
425
loss:0.0110305007547 accuracy:1.0
426
loss:0.000118359719636 accuracy:1.0
427
loss:0.0264259390533 accuracy:1.0
428
loss:2.09415435791 accuracy:0.5
429
loss:0.405786812305 accuracy:0.5
430
loss:0.170478060842 accuracy:1.0
431
loss:0.153327018023 accuracy:1.0
432
loss:0.0670616924763 accuracy:1.0
433
loss:0.100017897785 accuracy:1.0
434
loss:0.803987801075 accuracy:0.75
435
loss:0.242291912436 accuracy:0.75
436
loss:0.887839794159 accuracy:0.75
437
loss:0.126330152154 accuracy:1.0
438
loss:0.495402723551 accuracy:0.5
439
loss:0.0176431145519 accuracy:1.0
440
loss:0.254504919052 accuracy:1.0
441
loss:0.0066742207855 accuracy:1.0
442
loss:0.103796347976 accuracy:1.0
443
loss:0.0256795622408 accuracy:1.0
444
loss:0.412333756685 accuracy:0.75
445
loss:0.0198563206941 accuracy:1.0
446
loss:0.0271796099842 accuracy:1.0
447
loss:0.00262259342708 accuracy:1.0
448
loss:0.679375708103 accuracy:0.75
449
loss:0.436676889658 accuracy:0.75
450
loss:0.133831515908 accuracy:1.0
451
loss:0.121498912573 accuracy:1.0
452
loss:0.033711925149 accuracy:1.0
453
loss:0.102268278599 accuracy:1.0
454
loss:0.00103223056067 accuracy:1.0
455
loss:0.128242060542 accuracy:1.0
456
loss:0.00504214642569 accuracy:1.0
457
loss:0.00237890915014 accuracy:1.0
458
loss:1.08625376225 accuracy:0.25
459
loss:0.030952764675 accuracy:1.0
460
loss:0.173320218921 accuracy:1.0
461
loss:0.121969670057 accuracy:1.0
462
loss:0.0947612226009 accuracy:1.0
463
loss:0.205078348517 accuracy:0.75
464
loss:0.00106444279663 accuracy:1.0
465
loss:0.34515401721 accuracy:0.75
466
loss:0.15998339653 accuracy:1.0
467
loss:0.00492420978844 accuracy:1.0
468
loss:0.0870720297098 accuracy:1.0
469
loss:2.09969067574 accuracy:0.5
470
loss:0.194903433323 accuracy:1.0
471
loss:0.242374703288 accuracy:1.0
472
loss:0.00174707639962 accuracy:1.0
473
loss:0.0663149431348 accuracy:1.0
474
loss:0.0415232479572 accuracy:1.0
475
loss:0.745410084724 accuracy:0.75
476
loss:0.72058993578 accuracy:0.75
477
loss:0.074091270566 accuracy:1.0
478
loss:0.0825443267822 accuracy:1.0
479
loss:0.0513244643807 accuracy:1.0
480
loss:0.0320774801075 accuracy:1.0
481
loss:0.0128127280623 accuracy:1.0
482
loss:0.0371737554669 accuracy:1.0
483
loss:0.276018559933 accuracy:0.75
484
loss:0.0172993671149 accuracy:1.0
485
loss:0.0301472023129 accuracy:1.0
486
loss:0.00649361917749 accuracy:1.0
487
loss:0.000473263178719 accuracy:1.0
488
loss:0.000434344052337 accuracy:1.0
489
loss:0.0177765209228 accuracy:1.0
490
loss:0.100023776293 accuracy:1.0
491
loss:0.00998072884977 accuracy:1.0
492
loss:0.178784310818 accuracy:0.75
493
loss:0.000287099683192 accuracy:1.0
494
loss:2.17384004593 accuracy:0.75
495
loss:0.125859886408 accuracy:1.0
496
loss:0.0469430424273 accuracy:1.0
497
loss:0.0470446236432 accuracy:1.0
498
loss:0.00149866973516 accuracy:1.0
499
loss:1.76050198078 accuracy:0.5
500
loss:0.223427206278 accuracy:1.0
501
loss:0.252842336893 accuracy:0.75
502
loss:0.688393950462 accuracy:0.75
503
loss:0.0202198959887 accuracy:1.0
504
loss:0.00671406136826 accuracy:1.0
505
loss:0.248940289021 accuracy:1.0
506
loss:0.274929821491 accuracy:0.75
507
loss:0.12192375958 accuracy:1.0
508
loss:0.529097795486 accuracy:0.75
509
loss:0.0117030935362 accuracy:1.0
510
loss:0.0703663975 accuracy:1.0
511
loss:0.00478047179058 accuracy:1.0
512
loss:0.0121546797454 accuracy:1.0
513
loss:0.208536297083 accuracy:1.0
514
loss:0.00334931351244 accuracy:1.0
515
loss:0.79892295599 accuracy:0.75
516
loss:1.14639115334 accuracy:0.75
517
loss:0.0293184090406 accuracy:1.0
518
loss:0.0145129384473 accuracy:1.0
519
loss:0.51245445013 accuracy:0.5
520
loss:0.163923382759 accuracy:1.0
521
loss:0.00152231776156 accuracy:1.0
522
loss:0.00467296224087 accuracy:1.0
523
loss:0.335566133261 accuracy:0.75
524
loss:0.565649867058 accuracy:0.75
525
loss:0.0779503583908 accuracy:1.0
526
loss:0.0503666475415 accuracy:1.0
527
loss:0.0936669185758 accuracy:1.0
528
loss:0.0114694610238 accuracy:1.0
529
loss:0.0113796535879 accuracy:1.0
530
loss:0.00210900465026 accuracy:1.0
531
loss:0.0697501897812 accuracy:1.0
532
loss:0.0413017123938 accuracy:1.0
533
loss:0.000223232258577 accuracy:1.0
534
loss:0.00237680179998 accuracy:1.0
535
loss:0.0935806557536 accuracy:1.0
536
loss:0.105601318181 accuracy:1.0
537
loss:2.22019316425e-05 accuracy:1.0
538
loss:0.604238510132 accuracy:0.75
539
loss:0.0422407202423 accuracy:1.0
540
loss:0.0232363473624 accuracy:1.0
541
loss:0.0315810516477 accuracy:1.0
542
loss:3.51061898982e-05 accuracy:1.0
543
loss:0.0173356998712 accuracy:1.0
544
loss:0.00834203884006 accuracy:1.0
545
loss:0.000342688814271 accuracy:1.0
546
loss:7.11309767212e-05 accuracy:1.0
547
loss:0.00906061194837 accuracy:1.0
548
loss:1.66892471043e-06 accuracy:1.0
549
loss:0.00172243604902 accuracy:1.0
550
loss:0.034824796021 accuracy:1.0
551
loss:1.22189294416e-06 accuracy:1.0
552
loss:0.00228166719899 accuracy:1.0
553
loss:1.75538408756 accuracy:0.75
554
loss:0.160510271788 accuracy:1.0
555
loss:0.00583411566913 accuracy:1.0
556
loss:0.0328364670277 accuracy:1.0
557
loss:0.865779876709 accuracy:0.75
558
loss:0.643167614937 accuracy:0.5
559
loss:2.28500294685 accuracy:0.0
560
loss:0.0093042999506 accuracy:1.0
561
loss:0.735183119774 accuracy:0.75
562
loss:0.0769147053361 accuracy:1.0
563
loss:0.0310892332345 accuracy:1.0
564
loss:0.0728826448321 accuracy:1.0
565
loss:0.178516685963 accuracy:1.0
566
loss:0.0103313624859 accuracy:1.0
567
loss:0.118710055947 accuracy:1.0
568
loss:0.074576176703 accuracy:1.0
569
loss:0.240194231272 accuracy:0.75
570
loss:0.0038958825171 accuracy:1.0
571
loss:0.000401474506361 accuracy:1.0
572
loss:0.0813326686621 accuracy:1.0
573
loss:0.0319667756557 accuracy:1.0
574
loss:0.0254385173321 accuracy:1.0
575
loss:0.00608881236985 accuracy:1.0
576
loss:0.0615266412497 accuracy:1.0
577
loss:0.00878894422203 accuracy:1.0
578
loss:0.00919084344059 accuracy:1.0
579
loss:0.0137438997626 accuracy:1.0
580
loss:5.85580492043e-05 accuracy:1.0
581
loss:0.950065612793 accuracy:0.75
582
loss:0.517662346363 accuracy:0.75
583
loss:0.0079373139888 accuracy:1.0
584
loss:0.199831828475 accuracy:0.75
585
loss:0.0586840547621 accuracy:1.0
586
loss:0.0635885223746 accuracy:1.0
587
loss:0.00248917890713 accuracy:1.0
588
loss:0.0176570080221 accuracy:1.0
589
loss:0.00802893098444 accuracy:1.0
590
loss:0.00644389400259 accuracy:1.0
591
loss:0.000337625970133 accuracy:1.0
592
loss:0.000656736374367 accuracy:1.0
593
loss:0.0069315279834 accuracy:1.0
594
loss:0.000192244129721 accuracy:1.0
595
loss:0.153810724616 accuracy:1.0
596
loss:0.509512066841 accuracy:0.75
597
loss:2.8454875946 accuracy:0.5
598
loss:0.121696084738 accuracy:1.0
599
loss:0.13493694365 accuracy:1.0
600
loss:0.0113169485703 accuracy:1.0
601
loss:0.143897026777 accuracy:1.0
602
loss:0.0995514839888 accuracy:1.0
603
loss:0.00416302261874 accuracy:1.0
604
loss:0.0498762577772 accuracy:1.0
605
loss:0.000733904773369 accuracy:1.0
606
loss:0.00432188156992 accuracy:1.0
607
loss:0.247714474797 accuracy:0.75
608
loss:0.0603492446244 accuracy:1.0
609
loss:0.00636652298272 accuracy:1.0
610
loss:3.8743002051e-07 accuracy:1.0
611
loss:0.000434571033111 accuracy:1.0
612
loss:0.000185367985978 accuracy:1.0
613
loss:1.27765703201 accuracy:0.5
614
loss:0.0464809089899 accuracy:1.0
615
loss:0.0682013481855 accuracy:1.0
616
loss:0.166923344135 accuracy:1.0
617
loss:0.00747666787356 accuracy:1.0
618
loss:0.000737957539968 accuracy:1.0
619
loss:0.147793710232 accuracy:1.0
620
loss:0.00622826628387 accuracy:1.0
621
loss:0.0026685774792 accuracy:1.0
622
loss:0.0266832802445 accuracy:1.0
623
loss:0.00111918640323 accuracy:1.0
624
loss:0.166999429464 accuracy:1.0
625
loss:0.00493326690048 accuracy:1.0
626
loss:0.148973792791 accuracy:1.0
627
loss:0.0164778511971 accuracy:1.0
628
loss:0.0263445004821 accuracy:1.0
629
loss:0.000971373054199 accuracy:1.0
630
loss:0.137379467487 accuracy:1.0
631
loss:0.000336995668476 accuracy:1.0
632
loss:0.000118585114251 accuracy:1.0
633
loss:0.194744035602 accuracy:0.75
634
loss:0.622318923473 accuracy:0.75
635
loss:0.0158670805395 accuracy:1.0
636
loss:0.00111870421097 accuracy:1.0
637
loss:0.00360449962318 accuracy:1.0
638
loss:0.123612225056 accuracy:1.0
639
loss:0.915646851063 accuracy:0.75
640
loss:0.00414372095838 accuracy:1.0
641
loss:0.00148615182843 accuracy:1.0
642
loss:0.139044344425 accuracy:1.0
643
loss:0.000594415760133 accuracy:1.0
644
loss:0.0548767484725 accuracy:1.0
645
loss:0.131095871329 accuracy:1.0
646
loss:0.0180732347071 accuracy:1.0
647
loss:0.0192443877459 accuracy:1.0
648
loss:0.002840944333 accuracy:1.0
649
loss:0.0817834958434 accuracy:1.0
650
loss:7.39887691452e-05 accuracy:1.0
651
loss:0.000455870700534 accuracy:1.0
652
loss:0.00230005686171 accuracy:1.0
653
loss:0.000108704443846 accuracy:1.0
654
loss:0.0797890126705 accuracy:1.0
655
loss:0.00503324298188 accuracy:1.0
656
loss:0.0720994323492 accuracy:1.0
657
loss:2.68220759381e-07 accuracy:1.0
658
loss:0.00216671684757 accuracy:1.0
659
loss:0.000182132818736 accuracy:1.0
660
loss:0.0201711002737 accuracy:1.0
661
loss:0.000373564369511 accuracy:1.0
662
loss:0.210333183408 accuracy:0.75
663
loss:0.434348583221 accuracy:0.75
664
loss:0.00160946941469 accuracy:1.0
665
loss:0.00168058788404 accuracy:1.0
666
loss:0.0156195694581 accuracy:1.0
667
loss:0.0179282538593 accuracy:1.0
668
loss:0.619975090027 accuracy:0.75
669
loss:0.0250529292971 accuracy:1.0
670
loss:1.1728490591 accuracy:0.75
671
loss:0.000638192694169 accuracy:1.0
672
loss:0.00298879598267 accuracy:1.0
673
loss:0.000486818666104 accuracy:1.0
674
loss:7.75988737587e-05 accuracy:1.0
675
loss:0.0147338835523 accuracy:1.0
676
loss:0.0939862802625 accuracy:1.0
677
loss:0.356457710266 accuracy:0.75
678
loss:0.0 accuracy:1.0
679
loss:0.00246878503822 accuracy:1.0
680
loss:0.00888949073851 accuracy:1.0
681
loss:0.170242756605 accuracy:1.0
682
loss:0.0192358512431 accuracy:1.0
683
loss:0.228889971972 accuracy:1.0
684
loss:0.0224611312151 accuracy:1.0
685
loss:0.645591199398 accuracy:0.75
686
loss:0.0435347706079 accuracy:1.0
687
loss:0.016888409853 accuracy:1.0
688
loss:0.000532899983227 accuracy:1.0
689
loss:0.0515907779336 accuracy:1.0
690
loss:0.0468752644956 accuracy:1.0
691
loss:7.48243910493e-05 accuracy:1.0
692
loss:0.000171366787981 accuracy:1.0
693
loss:0.134263500571 accuracy:1.0
694
loss:0.0109058497474 accuracy:1.0
695
loss:0.0117134619504 accuracy:1.0
696
loss:0.000636452401523 accuracy:1.0
697
loss:0.074299864471 accuracy:1.0
698
loss:0.0229991711676 accuracy:1.0
699
loss:1.19209028071e-06 accuracy:1.0
700
loss:5.01820904901e-05 accuracy:1.0
701
loss:0.000727334117983 accuracy:1.0
702
loss:6.25842085356e-06 accuracy:1.0
703
loss:0.00155220844317 accuracy:1.0
704
loss:2.98023206113e-08 accuracy:1.0
705
loss:0.0246029030532 accuracy:1.0
706
loss:0.00987655483186 accuracy:1.0
707
loss:0.00130767049268 accuracy:1.0
708
loss:8.22408910608e-05 accuracy:1.0
709
loss:1.19209261129e-07 accuracy:1.0
710
loss:8.04661510756e-07 accuracy:1.0
711
loss:0.00843916833401 accuracy:1.0
712
loss:0.710033893585 accuracy:0.75
713
loss:3.87427189708e-06 accuracy:1.0
714
loss:0.00279647787102 accuracy:1.0
715
loss:1.66902518272 accuracy:0.75
716
loss:4.8334699386e-05 accuracy:1.0
717
loss:0.000108164756966 accuracy:1.0
718
loss:0.000460023642518 accuracy:1.0
719
loss:0.014947255142 accuracy:1.0
720
loss:0.00103411846794 accuracy:1.0
721
loss:0.00142774963751 accuracy:1.0
722
loss:0.00128786324058 accuracy:1.0
723
loss:0.000406698818551 accuracy:1.0
724
loss:0.0186016894877 accuracy:1.0
725
loss:3.92168876715e-05 accuracy:1.0
726
loss:0.00641623232514 accuracy:1.0
727
loss:0.000120202676044 accuracy:1.0
728
loss:5.96040854361e-06 accuracy:1.0
729
loss:0.0003364807053 accuracy:1.0
730
loss:7.39286333555e-05 accuracy:1.0
731
loss:0.0154847707599 accuracy:1.0
732
loss:0.000457124668173 accuracy:1.0
733
loss:3.1618270441e-05 accuracy:1.0
734
loss:0.910873115063 accuracy:0.75
735
loss:0.0267107822001 accuracy:1.0
736
loss:0.39768460393 accuracy:0.75
737
loss:0.0669838786125 accuracy:1.0
738
loss:0.00644081551582 accuracy:1.0
739
loss:0.034500323236 accuracy:1.0
740
loss:0.00022785415058 accuracy:1.0
741
loss:0.0 accuracy:1.0
742
loss:0.000210024169064 accuracy:1.0
743
loss:0.00147695967462 accuracy:1.0
744
loss:0.0725145637989 accuracy:1.0
745
loss:0.029834413901 accuracy:1.0
746
loss:0.0220102537423 accuracy:1.0
747
loss:7.51805127948e-05 accuracy:1.0
748
loss:0.243395596743 accuracy:1.0
749
loss:0.0 accuracy:1.0
750
loss:0.000294363766443 accuracy:1.0
751
loss:0.000681870267726 accuracy:1.0
752
loss:0.0001109384757 accuracy:1.0
753
loss:0.00326775875874 accuracy:1.0
754
loss:0.000125748862047 accuracy:1.0
755
loss:0.0223309192806 accuracy:1.0
756
loss:2.08616171449e-07 accuracy:1.0
757
loss:0.00221180100925 accuracy:1.0
758
loss:0.000174014785443 accuracy:1.0
759
loss:1.72069203854 accuracy:0.5
760
loss:0.0 accuracy:1.0
761
loss:2.15550684929 accuracy:0.75
762
loss:0.518400728703 accuracy:0.75
763
loss:0.105010151863 accuracy:1.0
764
loss:0.00476733082905 accuracy:1.0
765
loss:0.618326127529 accuracy:0.75
766
loss:0.0568829216063 accuracy:1.0
767
loss:0.00282724341378 accuracy:1.0
768
loss:0.00655440147966 accuracy:1.0
769
loss:0.0208293218166 accuracy:1.0
770
loss:0.0136799290776 accuracy:1.0
771
loss:0.069712460041 accuracy:1.0
772
loss:0.000210100814002 accuracy:1.0
773
loss:0.00459663849324 accuracy:1.0
774
loss:0.000156323905685 accuracy:1.0
775
loss:0.00276682106778 accuracy:1.0
776
loss:0.000317671743687 accuracy:1.0
777
loss:0.00603116257116 accuracy:1.0
778
loss:0.000105627936136 accuracy:1.0
779
loss:0.0012082036119 accuracy:1.0
780
loss:0.00771681312472 accuracy:1.0
781
loss:0.000266665418167 accuracy:1.0
782
loss:0.000127759703901 accuracy:1.0
783
loss:0.0755270496011 accuracy:1.0
784
loss:0.16779884696 accuracy:1.0
785
loss:0.00140535703395 accuracy:1.0
786
loss:0.00015215415624 accuracy:1.0
787
loss:0.000368304987205 accuracy:1.0
788
loss:0.00157043302897 accuracy:1.0
789
loss:2.41397765421e-06 accuracy:1.0
790
loss:2.59279295278e-06 accuracy:1.0
791
loss:6.28824091109e-06 accuracy:1.0
792
loss:0.0221433527768 accuracy:1.0
793
loss:2.08616171449e-07 accuracy:1.0
794
loss:1.57948488777e-05 accuracy:1.0
795
loss:0.143929585814 accuracy:1.0
796
loss:1.9745169878 accuracy:0.75
797
loss:0.000270577816991 accuracy:1.0
798
loss:5.12140083313 accuracy:0.5
799
loss:0.00173324288335 accuracy:1.0
800
loss:1.19792962074 accuracy:0.75
801
loss:2.02655382964e-06 accuracy:1.0
802
loss:0.00112404092215 accuracy:1.0
803
loss:0.00845727138221 accuracy:1.0
804
loss:0.28252235055 accuracy:0.75
805
loss:0.0106558371335 accuracy:1.0
806
loss:0.014702941291 accuracy:1.0
807
loss:0.00130192749202 accuracy:1.0
808
loss:0.000347329594661 accuracy:1.0
809
loss:0.00703197019175 accuracy:1.0
810
loss:0.0106599340215 accuracy:1.0
811
loss:0.00392346037552 accuracy:1.0
812
loss:0.0465068034828 accuracy:1.0
813
loss:8.81706946529e-05 accuracy:1.0
814
loss:0.0458577163517 accuracy:1.0
815
loss:0.0467248111963 accuracy:1.0
816
loss:0.00091094506206 accuracy:1.0
817
loss:3.09035203827e-05 accuracy:1.0
818
loss:0.00139724835753 accuracy:1.0
819
loss:0.000333012023475 accuracy:1.0
820
loss:0.00710536446422 accuracy:1.0
821
loss:0.00560972420499 accuracy:1.0
822
loss:0.000250497396337 accuracy:1.0
823
loss:0.00380676612258 accuracy:1.0
824
loss:0.000113908354251 accuracy:1.0
825
loss:0.0343874841928 accuracy:1.0
826
loss:0.00282790628262 accuracy:1.0
827
loss:6.72862443025e-05 accuracy:1.0
828
loss:0.00147931976244 accuracy:1.0
829
loss:2.86683352897e-05 accuracy:1.0
830
loss:0.0245764218271 accuracy:1.0
831
loss:4.20210108132e-06 accuracy:1.0
832
loss:7.25034042262e-05 accuracy:1.0
833
loss:3.16184814437e-05 accuracy:1.0
834
loss:1.79704529728e-05 accuracy:1.0
835
loss:0.000492809806019 accuracy:1.0
836
loss:0.01179948356 accuracy:1.0
837
loss:0.0204155929387 accuracy:1.0
838
loss:0.0225518066436 accuracy:1.0
839
loss:0.000979462754913 accuracy:1.0
840
loss:0.0264807604253 accuracy:1.0
841
loss:0.000642011058517 accuracy:1.0
842
loss:9.98367613647e-06 accuracy:1.0
843
loss:0.000578560575377 accuracy:1.0
844
loss:0.00985639821738 accuracy:1.0
845
loss:0.000689651060384 accuracy:1.0
846
loss:0.264054358006 accuracy:0.75
847
loss:0.00211702845991 accuracy:1.0
848
loss:0.0 accuracy:1.0
849
loss:0.00793411303312 accuracy:1.0
850
loss:0.0133068496361 accuracy:1.0
851
loss:0.00019709445769 accuracy:1.0
852
loss:0.000726983242203 accuracy:1.0
853
loss:4.52993208455e-06 accuracy:1.0
854
loss:0.000351926224539 accuracy:1.0
855
loss:2.68220759381e-07 accuracy:1.0
856
loss:2.38418522258e-07 accuracy:1.0
857
loss:0.000103779944766 accuracy:1.0
858
loss:0.00327592715621 accuracy:1.0
859
loss:0.000106491184852 accuracy:1.0
860
loss:5.18235428899e-05 accuracy:1.0
861
loss:2.75658640021e-05 accuracy:1.0
862
loss:6.54671530356e-05 accuracy:1.0
863
loss:0.0223422013223 accuracy:1.0
864
loss:0.000272205128567 accuracy:1.0
865
loss:2.03243580472e-05 accuracy:1.0
866
loss:2.66121260211e-05 accuracy:1.0
867
loss:1.37090319186e-06 accuracy:1.0
868
loss:0.0945804268122 accuracy:1.0
869
loss:3.57627612857e-07 accuracy:1.0
870
loss:0.0 accuracy:1.0
871
loss:4.93477746204e-05 accuracy:1.0
872
loss:4.08288497056e-06 accuracy:1.0
873
loss:0.0 accuracy:1.0
874
loss:0.000123173100292 accuracy:1.0
875
loss:0.000122963945614 accuracy:1.0
876
loss:1.90734135685e-06 accuracy:1.0
877
loss:0.0 accuracy:1.0
878
loss:2.5754442215 accuracy:0.75
879
loss:7.56566732889e-05 accuracy:1.0
880
loss:0.0012678487692 accuracy:1.0
881
loss:0.000307852867991 accuracy:1.0
882
loss:1.25169458443e-06 accuracy:1.0
883
loss:0.00642994698137 accuracy:1.0
884
loss:0.000592304742895 accuracy:1.0
885
loss:0.000150688283611 accuracy:1.0
886
loss:0.00178344349843 accuracy:1.0
887
loss:0.259098112583 accuracy:0.75
888
loss:0.000131561449962 accuracy:1.0
889
loss:3.65051782865e-05 accuracy:1.0
890
loss:0.00213865935802 accuracy:1.0
891
loss:6.25848144864e-07 accuracy:1.0
892
loss:4.12733934354e-05 accuracy:1.0
893
loss:5.81375788897e-05 accuracy:1.0
894
loss:0.000273287500022 accuracy:1.0
895
loss:0.000152824548422 accuracy:1.0
896
loss:8.31473425933e-06 accuracy:1.0
897
loss:2.2857580916e-05 accuracy:1.0
898
loss:7.83792074799e-06 accuracy:1.0
899
loss:0.0115786949173 accuracy:1.0
900
loss:9.68728563748e-05 accuracy:1.0
901
loss:1.13248688649e-06 accuracy:1.0
902
loss:0.0496145673096 accuracy:1.0
903
loss:1.11756226033e-05 accuracy:1.0
904
loss:0.000203325893381 accuracy:1.0
905
loss:4.82794484924e-06 accuracy:1.0
906
loss:1.26060649563e-05 accuracy:1.0
907
loss:0.000413879693951 accuracy:1.0
908
loss:6.35007745586e-05 accuracy:1.0
909
loss:2.98023206113e-08 accuracy:1.0
910
loss:0.00323041388765 accuracy:1.0
911
loss:9.79712203844e-05 accuracy:1.0
912
loss:1.62848079205 accuracy:0.75
913
loss:0.0242614541203 accuracy:1.0
914
loss:0.0270170196891 accuracy:1.0
915
loss:0.298094958067 accuracy:0.75
916
loss:0.000197779576411 accuracy:1.0
917
loss:0.0 accuracy:1.0
918
loss:1.09183096886 accuracy:0.75
919
loss:0.059089269489 accuracy:1.0
920
loss:0.540388822556 accuracy:0.75
921
loss:0.00234878063202 accuracy:1.0
922
loss:0.00449673319235 accuracy:1.0
923
loss:0.00438120402396 accuracy:1.0
924
loss:0.000105101295048 accuracy:1.0
925
loss:0.00597113603726 accuracy:1.0
926
loss:0.000764504424296 accuracy:1.0
927
loss:0.000180775154149 accuracy:1.0
928
loss:0.000158698647283 accuracy:1.0
929
loss:3.12923202728e-06 accuracy:1.0
930
loss:0.0212600249797 accuracy:1.0
931
loss:0.00659835385159 accuracy:1.0
932
loss:0.181482896209 accuracy:0.75
933
loss:0.00233455142006 accuracy:1.0
934
loss:0.77560710907 accuracy:0.75
935
loss:2.85493988486e-05 accuracy:1.0
936
loss:1.73443495441e-05 accuracy:1.0
937
loss:1.39770290843e-05 accuracy:1.0
938
loss:0.0147224320099 accuracy:1.0
939
loss:0.0225931722671 accuracy:1.0
940
loss:0.000159510731464 accuracy:1.0
941
loss:0.0267392788082 accuracy:1.0
942
loss:1.6301364667e-05 accuracy:1.0
943
loss:0.00837118458003 accuracy:1.0
944
loss:1.40070710586e-06 accuracy:1.0
945
loss:1.78813877483e-07 accuracy:1.0
946
loss:1.35598693305e-05 accuracy:1.0
947
loss:3.6832956539e-05 accuracy:1.0
948
loss:2.1457603907e-06 accuracy:1.0
949
loss:6.25848201707e-07 accuracy:1.0
950
loss:6.67563244861e-06 accuracy:1.0
951
loss:0.000172200932866 accuracy:1.0
952
loss:2.45559112955e-05 accuracy:1.0
953
loss:0.28211170435 accuracy:0.75
954
loss:3.78486629415e-06 accuracy:1.0
955
loss:0.0 accuracy:1.0
956
loss:3.99325363105e-05 accuracy:1.0
957
loss:0.000164037759532 accuracy:1.0
958
loss:5.56377053726e-05 accuracy:1.0
959
loss:1.34110086947e-06 accuracy:1.0
960
loss:0.00719599006698 accuracy:1.0
961
loss:5.1224942581e-05 accuracy:1.0
962
loss:0.108724892139 accuracy:1.0
963
loss:0.216220155358 accuracy:0.75
964
loss:2.98023206113e-08 accuracy:1.0
965
loss:1.32020913952e-05 accuracy:1.0
966
loss:0.000549052318092 accuracy:1.0
967
loss:1.04307912352e-06 accuracy:1.0
968
loss:0.00227680965327 accuracy:1.0
969
loss:3.91273861169e-05 accuracy:1.0
970
loss:1.72852867308e-06 accuracy:1.0
971
loss:5.95975398028e-05 accuracy:1.0
972
loss:2.98023206113e-08 accuracy:1.0
973
loss:1.12845361233 accuracy:0.75
974
loss:0.00344220059924 accuracy:1.0
975
loss:1.78813891694e-07 accuracy:1.0
976
loss:0.00555469095707 accuracy:1.0
977
loss:0.0254180897027 accuracy:1.0
978
loss:0.00575304357335 accuracy:1.0
979
loss:8.87657224666e-05 accuracy:1.0
980
loss:0.333685457706 accuracy:0.75
981
loss:1.5411645174 accuracy:0.75
982
loss:0.00339213898405 accuracy:1.0
983
loss:0.00120935903396 accuracy:1.0
984
loss:0.000193331274204 accuracy:1.0
985
loss:0.00145972822793 accuracy:1.0
986
loss:0.0121605098248 accuracy:1.0
987
loss:0.000133721638122 accuracy:1.0
988
loss:5.54321013624e-06 accuracy:1.0
989
loss:0.0561847537756 accuracy:1.0
990
loss:2.12487684621e-05 accuracy:1.0
991
loss:0.00121289375238 accuracy:1.0
992
loss:7.45056922824e-07 accuracy:1.0
993
loss:0.000156591951963 accuracy:1.0
994
loss:0.00030597744626 accuracy:1.0
995
loss:0.00287289102562 accuracy:1.0
996
loss:0.00114007492084 accuracy:1.0
997
loss:0.000725160876755 accuracy:1.0
998
loss:1.57951808433e-06 accuracy:1.0
999
loss:0.00928597245365 accuracy:1.0
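The raw per-step loss above is very noisy (each accuracy value is a multiple of 0.25, consistent with a mini-batch size of 4), so the overall downward trend is easier to see with an exponential moving average. A minimal sketch, not part of the original training script — the sample values below are just a few truncated numbers from the log:

```python
def ema(losses, decay=0.9):
    """Exponentially smoothed view of a noisy per-step loss log."""
    smoothed, avg = [], None
    for loss in losses:
        # First step seeds the average; later steps blend in the new loss.
        avg = loss if avg is None else decay * avg + (1 - decay) * loss
        smoothed.append(avg)
    return smoothed

# A few raw values from the log: the smoothed curve damps the spikes.
raw = [1.13, 0.49, 1.42, 0.28, 0.09, 2.14, 0.17]
print(["%.3f" % s for s in ema(raw)])
```

TensorBoard applies the same kind of smoothing when it plots scalar summaries, which is why the dashboard curve looks far cleaner than this console log.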
---------- 2018-02-19: continuing training from the pretrained Inception v4 model ----------
Experiment notes:
Error 1:
InvalidArgumentError (see above for traceback): Assign requires shapes of both tensors to match. lhs shape= [5] rhs shape= [1001]
The official checkpoint was trained on 1000-class ImageNet (1001 outputs including the background class), so the final classification layer has to be retrained for my own data (5 classes).
Error 2:
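TF-slim's train_image_classifier.py handles this with the --checkpoint_exclude_scopes flag: variables under the excluded scopes are simply not restored from the checkpoint, so the [5] vs [1001] shape mismatch never occurs. The filtering logic amounts to a prefix match on variable names; a minimal TF-free sketch (the variable names below are hypothetical stand-ins for real Inception v4 scopes):

```python
def variables_to_restore(var_names, exclude_scopes):
    """Keep only variables whose name does not start with an excluded scope."""
    exclusions = [scope.strip() for scope in exclude_scopes.split(",")]
    return [name for name in var_names
            if not any(name.startswith(ex) for ex in exclusions)]

# Hypothetical variable names: the 1001-way logits layers are skipped,
# so only the backbone weights are restored from the checkpoint.
names = [
    "InceptionV4/Conv2d_1a_3x3/weights",
    "InceptionV4/Logits/Logits/weights",
    "InceptionV4/AuxLogits/Aux_logits/weights",
]
keep = variables_to_restore(names, "InceptionV4/Logits,InceptionV4/AuxLogits")
print(keep)  # only the Conv2d_1a_3x3 weights survive
```

In the actual script this corresponds to passing --checkpoint_exclude_scopes=InceptionV4/Logits,InceptionV4/AuxLogits alongside --checkpoint_path.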
InvalidArgumentError (see above for traceback): Assign requires shapes of both tensors to match. lhs shape= [5] rhs shape= [1001]
Because training runs on the CPU, in train_image_classifier.py change tf.app.flags.DEFINE_boolean('clone_on_cpu', False, 'Use CPUs to deploy clones.') from False to True.
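Editing the source is not strictly necessary: tf.app.flags parses the command line, so passing --clone_on_cpu=True to the script should override the default. The flag mechanism behaves much like argparse with a default value; a minimal sketch of that behavior (not TF's actual implementation):

```python
import argparse

def str2bool(v):
    # Boolean flags are passed as strings like "True"/"False" on the command line.
    return v.lower() in ("true", "1", "yes")

parser = argparse.ArgumentParser()
parser.add_argument("--clone_on_cpu", type=str2bool, default=False,
                    help="Use CPUs to deploy clones.")

print(parser.parse_args([]).clone_on_cpu)                      # False (default)
print(parser.parse_args(["--clone_on_cpu=True"]).clone_on_cpu)  # True (override)
```

Flipping the default in the source and overriding on the command line are equivalent; the override is just easier to undo when a GPU becomes available.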
Visualization:
After launching TensorBoard it prints "TensorBoard 1.5.1 at http://localhost:6006 (Press CTRL+C to quit)"; open http://localhost:6006 in a browser to view the training curves.