I0415 15:03:37.603461 27311 solver.cpp:42] Solver scaffolding done.
I0415 15:03:37.603549 27311 solver.cpp:247] Solving AlexNet
I0415 15:03:37.603559 27311 solver.cpp:248] Learning Rate Policy: step
I0415 15:03:37.749981 27311 solver.cpp:214] Iteration 0, loss = 5.45141
I0415 15:03:37.750030 27311 solver.cpp:229]     Train net output #0: loss = 5.45141 (* 1 = 5.45141 loss)
I0415 15:03:37.750048 27311 solver.cpp:489] Iteration 0, lr = 0.001
I0415 15:03:38.316994 27311 solver.cpp:214] Iteration 12, loss = 4.23865
I0415 15:03:38.317054 27311 solver.cpp:229]     Train net output #0: loss = 4.23865 (* 1 = 4.23865 loss)
I0415 15:03:38.317068 27311 solver.cpp:489] Iteration 12, lr = 0.001
I0415 15:03:38.920938 27311 solver.cpp:214] Iteration 24, loss = 2.49914
I0415 15:03:38.921000 27311 solver.cpp:229]     Train net output #0: loss = 2.49914 (* 1 = 2.49914 loss)
I0415 15:03:38.921016 27311 solver.cpp:489] Iteration 24, lr = 0.001
I0415 15:03:39.509793 27311 solver.cpp:214] Iteration 36, loss = 3.76504
I0415 15:03:39.509850 27311 solver.cpp:229]     Train net output #0: loss = 3.76504 (* 1 = 3.76504 loss)
I0415 15:03:39.509861 27311 solver.cpp:489] Iteration 36, lr = 0.001
I0415 15:03:40.080806 27311 solver.cpp:214] Iteration 48, loss = 3.74901
I0415 15:03:40.080862 27311 solver.cpp:229]     Train net output #0: loss = 3.74901 (* 1 = 3.74901 loss)
I0415 15:03:40.080878 27311 solver.cpp:489] Iteration 48, lr = 0.001
I0415 15:03:40.643797 27311 solver.cpp:214] Iteration 60, loss = 2.27091
I0415 15:03:40.643849 27311 solver.cpp:229]     Train net output #0: loss = 2.27091 (* 1 = 2.27091 loss)
I0415 15:03:40.643860 27311 solver.cpp:489] Iteration 60, lr = 0.001
I0415 15:03:41.217475 27311 solver.cpp:214] Iteration 72, loss = 2.67078
I0415 15:03:41.217541 27311 solver.cpp:229]     Train net output #0: loss = 2.67078 (* 1 = 2.67078 loss)
I0415 15:03:41.217561 27311 solver.cpp:489] Iteration 72, lr = 0.001
I0415 15:03:41.793390 27311 solver.cpp:214] Iteration 84, loss = 1.77313
I0415 15:03:41.793452 27311 solver.cpp:229]     Train net output #0: loss = 1.77313 (* 1 = 1.77313 loss)
I0415 15:03:41.793468 27311 solver.cpp:489] Iteration 84, lr = 0.001
I0415 15:03:42.362951 27311 solver.cpp:214] Iteration 96, loss = 3.49406
I0415 15:03:42.363004 27311 solver.cpp:229]     Train net output #0: loss = 3.49406 (* 1 = 3.49406 loss)
I0415 15:03:42.363025 27311 solver.cpp:489] Iteration 96, lr = 0.001
I0415 15:03:42.946568 27311 solver.cpp:214] Iteration 108, loss = 2.81601
I0415 15:03:42.946633 27311 solver.cpp:229]     Train net output #0: loss = 2.81601 (* 1 = 2.81601 loss)
I0415 15:03:42.946651 27311 solver.cpp:489] Iteration 108, lr = 0.001
I0415 15:03:43.524155 27311 solver.cpp:214] Iteration 120, loss = 2.85056
I0415 15:03:43.524247 27311 solver.cpp:229]     Train net output #0: loss = 2.85056 (* 1 = 2.85056 loss)
I0415 15:03:43.524265 27311 solver.cpp:489] Iteration 120, lr = 0.001
I0415 15:03:44.100580 27311 solver.cpp:214] Iteration 132, loss = 3.58945
I0415 15:03:44.100646 27311 solver.cpp:229]     Train net output #0: loss = 3.58945 (* 1 = 3.58945 loss)
I0415 15:03:44.100661 27311 solver.cpp:489] Iteration 132, lr = 0.001
F0415 15:03:44.536542 27311 math_functions.cpp:91] Check failed: error == cudaSuccess (4 vs. 0)  unspecified launch failure
*** Check failure stack trace: ***
    @     0x7f01dbd9ddaa  (unknown)
    @     0x7f01dbd9dce4  (unknown)
    @     0x7f01dbd9d6e6  (unknown)
    @     0x7f01dbda0687  (unknown)
    @     0x7f01dc1bb3f5  caffe::caffe_copy<>()
    @     0x7f01dc230232  caffe::BasePrefetchingDataLayer<>::Forward_gpu()
    @     0x7f01dc1d9d6f  caffe::Net<>::ForwardFromTo()
    @     0x7f01dc1da197  caffe::Net<>::ForwardPrefilled()
    @     0x7f01dc20cbe5  caffe::Solver<>::Step()
    @     0x7f01dc20d52f  caffe::Solver<>::Solve()
    @           0x406428  train()
    @           0x404961  main
    @     0x7f01db2afec5  (unknown)
    @           0x404f0d  (unknown)
    @              (nil)  (unknown)
Aborted
wangxiao@gtx-980:~/Downloads/lstm_caffe_master$

---------------------------------------------------------------------------------------------------------------------------------------

How do I fix this??? Looking for any suggestions...

Well, the only thing I want to say is: this is where amazing happens???

I restarted my PC, ran the code again, and it worked...
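For anyone hitting the same message: CUDA error 4 is cudaErrorLaunchFailure ("unspecified launch failure"), and the stack trace above shows it surfacing inside caffe_copy<>() while the data layer does its GPU prefetch copy. The snippet below is only a minimal sketch of the error-check pattern involved (a simplified stand-in for Caffe's CUDA_CHECK macro and for what caffe_copy boils down to on GPU data, not the actual Caffe source): the copy that reports the error is usually innocent, it just inherits a CUDA context that an earlier kernel launch already corrupted, which is presumably why a reboot (or a GPU/driver reset) cleared it here.

// check_copy.cu -- minimal sketch, assuming a simplified CUDA_CHECK-style macro.
// Build with: nvcc check_copy.cu -o check_copy
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

#define CUDA_CHECK(condition)                                                  \
  do {                                                                         \
    cudaError_t error = (condition);                                           \
    if (error != cudaSuccess) {                                                \
      fprintf(stderr, "Check failed: error == cudaSuccess (%d vs. 0)  %s\n",   \
              static_cast<int>(error), cudaGetErrorString(error));             \
      abort();                                                                 \
    }                                                                          \
  } while (0)

int main() {
  const int n = 1024;
  float *d_src = nullptr, *d_dst = nullptr;
  CUDA_CHECK(cudaMalloc(&d_src, n * sizeof(float)));
  CUDA_CHECK(cudaMalloc(&d_dst, n * sizeof(float)));
  // On GPU data, copying between blobs comes down to a device-to-device
  // cudaMemcpy like this one. If a previous kernel launch has crashed,
  // this call returns cudaErrorLaunchFailure (4) even though the copy
  // itself is fine -- the whole CUDA context is already dead.
  CUDA_CHECK(cudaMemcpy(d_dst, d_src, n * sizeof(float),
                        cudaMemcpyDeviceToDevice));
  CUDA_CHECK(cudaFree(d_src));
  CUDA_CHECK(cudaFree(d_dst));
  printf("device-to-device copy OK\n");
  return 0;
}

In other words, when you see this check fail at math_functions.cpp:91, the place to look is whatever ran on the GPU just before the copy (or the GPU/driver state itself), not the copy call in the log.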
