Why are very few schools involved in deep learning research? Why are they still hooked on to Bayesian methods?

First, this question assumes that every university should have a "deep learning" person.  Deep learning is mostly used in vision (and to a lesser extent NLP), and many universities don't have vision or NLP researchers, so they wouldn't have a deep learning researcher either.

One thing that people often forget is that academics have long careers (thanks to tenure, this is by design).  So if you hire a bunch of researchers now who do deep learning, they're going to be around for decades.  Academia tends to be conservative, so it's not going to stock up on deep learning researchers just because it's cool today.  If this were the norm, CS departments would be full of fuzzy logic researchers hired in the 90s.

There's nothing magical about deep learning.  It's one tool of many (including Bayesian methods, discriminative methods, etc.) you should have in your toolbox.  Departments try to hire bright people, not those who slavishly follow every fad.  Obviously, there will be more faculty who do deep learning in the near future.  (If Facebook, Google, and Baidu don't hire them all first, that is.)

That said, there are lots of folks working in this area.  Of the schools mentioned in the question, Noah Smith at UW and Katrin Erk at Texas work on it.  Other places (off the top of my head) that work in this area: UMass, JHU, Maryland, NYU, Montreal, Michigan, and TTI.  I'm more upset that Princeton and Caltech (where I did my PhD and undergrad) don't have professors in CS who do language research.  That's the bigger crime in my opinion, and it is correlated with their lack of deep learning folks.

Blatant self-promotion ... Colorado has three folks working in this area: me, Mike Mozer, and Jim Martin.

  
Cui Caihao, PhD Candidate in CS & IT

 
 
There is no conflict between these two methods: deep learning and Bayesian methods are both useful machine learning tools for solving real-world problems.  Deep learning allows computational models composed of multiple layers to learn representations of data with multiple levels of abstraction; in effect it is an automatic feature extractor, which can save a lot of feature-engineering effort and domain expertise.
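To make the "multiple levels of abstraction" idea concrete, here is a minimal sketch (assuming PyTorch is available; the layer sizes are arbitrary and purely illustrative) of a model whose stacked layers learn the features themselves instead of a human engineering them:

```python
import torch
import torch.nn as nn

# A tiny multi-layer model: each Linear + ReLU stage re-represents its input,
# so the features fed to the final layer are learned, not hand-designed.
model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),   # lower-level features
    nn.Linear(256, 64),  nn.ReLU(),   # higher-level combinations of those features
    nn.Linear(64, 10),                # class scores built on the learned features
)

x = torch.randn(32, 784)              # a batch of 32 flattened 28x28 inputs
scores = model(x)                     # shape: (32, 10)
print(scores.shape)
```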

Bayesian methods are also used in some parts of deep learning, such as Bayesian networks.  Some schools may look as though they are not involved in deep learning research, but they actually share the same knowledge base and philosophy in this area.  Anyone who is good at machine learning or statistical learning will feel no pressure doing research on deep learning.

Here is a paper about deep learning published last month in Nature: Page on nature.com.  The authors are famous around the world right now, and my friend, if you meet someone doing research in AI or ML who tells you they have never heard of one of them, you have an obligation to wake them up, LOL~

Here is a reply from  Yann LeCun | Facebook

  
Jane Lee, Data mining for businesses and manage...

 
 
I just want to quote Yann LeCun's answer on Facebook: https://www.facebook.com/yann.le...
The key ideas are: first, there is no opposition between "deep" and "Bayesian".  Second, it takes time to acquire the skills and talent needed to do professional deep learning research.

  
 
 
There was a big hype in the 80s around what we now call "shallow" neural networks.  I don't know why, but bio-inspired models in artificial intelligence seem to follow a cycle of popularity and discontent, whereas purely statistical methods seem less hyped but more constant in popularity.

Anyway, they are not so distant.  The basic component of Hinton's deep belief network is the restricted Boltzmann machine, which is a flavour of the Boltzmann machine, which is itself a probabilistic model.
Statistically speaking, you can always view the state of a neuron as conditioned on the state of its inputs.  The whole network state can be described in a probabilistic fashion.
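As a concrete illustration of that conditioning, here is a minimal sketch (plain NumPy, random toy parameters, sizes chosen only for illustration) of the conditional distributions in a binary restricted Boltzmann machine:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy RBM: 6 visible units, 3 hidden units, random parameters.
n_visible, n_hidden = 6, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # weights
b = np.zeros(n_visible)                                # visible biases
c = np.zeros(n_hidden)                                 # hidden biases

v = rng.integers(0, 2, size=n_visible)                 # a binary visible state

# Each hidden unit is a Bernoulli variable conditioned on the visible layer:
# P(h_j = 1 | v) = sigmoid(c_j + v . W[:, j])
p_h = sigmoid(c + v @ W)
h = (rng.random(n_hidden) < p_h).astype(int)           # sample hidden states

# Symmetrically, each visible unit is conditioned on the hidden layer:
# P(v_i = 1 | h) = sigmoid(b_i + W[i, :] . h)
p_v = sigmoid(b + W @ h)

print(p_h, h, p_v)
```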

What is universally important for artificial intelligence is linear algebra (vector spaces), calculus (gradient descent), and probability theory (Bayes).  Be worried only when these topics are neglected... :)
Also, I really see graph theory as a common feature of all advanced models in AI.
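For the calculus piece, here is a minimal, purely illustrative sketch (NumPy, made-up data) of gradient descent on a least-squares fit; training a neural network uses the same mechanism at much larger scale:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny least-squares problem: find w minimizing ||X w - y||^2.
# Linear algebra gives the model, calculus gives the gradient: 2 X^T (X w - y) / n.
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

w = np.zeros(3)
lr = 0.01
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of the mean squared error
    w -= lr * grad                          # gradient descent step

print(w)  # should end up close to [2.0, -1.0, 0.5]
```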

Piero,
PhD quitter who still loves neural models

  
 
 
I'm actually quite disturbed by the current use of the term.  It reminds me of all the "high level" stuff in the 1980s, which wasn't really high level in any particular absolute sense, just relatively high compared to what preceded it.  Now we have something being called "deep" just because it's a bit heavier than something else, and "learning" just because it's a fashionable word to use.  Why is everybody working toward a job in marketing these days?
