A guide written by a Facebook engineer.
Chapter 1
Interesting read, but you can skip it.
Chapter 2
2.1 Insertion sort - To be honest, you should probably know all major sorting algorithms, not just insertion sort. It's just basic knowledge and you never know when it can help.
2.2 Analysis of Algorithms - you can skip the small intro, but know the rest.
2.3 Designing algorithms - contains merge sort and its analysis as well as an overview of divide-and-conquer. Very important stuff, so worth a read.
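Since 2.3 is where divide-and-conquer first shows up, here's a minimal merge sort sketch (my own illustration, not the book's pseudocode; CLRS sorts in place with sentinels, while this returns a new list):

```python
# Minimal merge sort: split, recurse, merge. O(n lg n) time, O(n) extra space.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    # Merge the two sorted halves.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```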
Chapter 3
All of it. You have to know big-O notation and time complexity analysis, period.
Chapter 4
4.1 Maximum subarray problem - Can be worth your time. There are better solutions to this problem than divide and conquer, but it's good practice and the flow of logic may help shape how you think.
4.2 Strassen's algorithm - I really love this algorithm and was astounded at how cool it was the first time I saw it, but you can skip it for the interviews. It won't come up.
4.3 Substitution method - you won't be using this method in an interview, but you should know it since it's a basic tool for finding the time complexity of a recursive algorithm.
4.4 Recurrence tree method - same as 4.3
4.5 Master method - essential knowledge. You should know it, practice with it, and be able to use it in 3 seconds. This is the method you would use in an interview when analyzing a recursive algorithm that fits the form (the theorem is restated below, after the 4.6 note).
4.6 Proof of the master theorem - you can probably skip this, though it's good to read at least once so that you understand what you're doing with the master method.
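For reference, here is the recurrence form the master method covers (paraphrasing CLRS Theorem 4.1; check the book for the exact statement): for constants a >= 1 and b > 1 and a recurrence T(n) = a T(n/b) + f(n),

```latex
\begin{aligned}
&\text{1. } f(n) = O\!\big(n^{\log_b a - \varepsilon}\big) \text{ for some } \varepsilon > 0
  &&\Longrightarrow\quad T(n) = \Theta\!\big(n^{\log_b a}\big),\\
&\text{2. } f(n) = \Theta\!\big(n^{\log_b a}\big)
  &&\Longrightarrow\quad T(n) = \Theta\!\big(n^{\log_b a}\lg n\big),\\
&\text{3. } f(n) = \Omega\!\big(n^{\log_b a + \varepsilon}\big) \text{ and } a\,f(n/b) \le c\,f(n) \text{ for some } c < 1
  &&\Longrightarrow\quad T(n) = \Theta\!\big(f(n)\big).
\end{aligned}
```

Merge sort is the three-second example: a = b = 2 and f(n) = Θ(n) = Θ(n^(log_b a)), so case 2 gives Θ(n lg n).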
Chapter 5
I've never read this chapter, to be honest, but what I do know is that you need a basic grasp of probability in interviews, because there's a good chance it will come up. That said, as long as you know basic probability concepts and practice on probability-related interview problems (there are such problems, with solution explanations, in Elements of Programming Interviews, the book I recommend for interview prep), you can probably skip this chapter. From a cursory glance, it's more math than algorithms.
Chapter 6
6.1, 6.2, 6.3, 6.4, 6.5 - Heaps and heapsort. Check.
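In real code you'd usually reach for a library heap (heapq in Python), but interviews tend to expect the sift-down by hand. A minimal 0-indexed sketch of my own (the book's pseudocode is 1-indexed):

```python
# In-place heapsort with a max-heap: build-heap is O(n), the extraction loop is O(n lg n).
def sift_down(a, i, n):
    """Restore the max-heap property for the subtree rooted at i (heap size n)."""
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def heapsort(a):
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # build the heap bottom-up
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):       # repeatedly move the max to the end
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end)

data = [4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
heapsort(data)
print(data)  # [1, 2, 3, 4, 7, 8, 9, 10, 14, 16]
```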
Chapter 7
7.1, 7.2, 7.3 - Quicksort and its randomized version. Need-to-know concepts. I also recommend 7.4 (I was once asked in an interview to analyze a randomized algorithm at a high level), though the probability that you'll have to deal with something like 7.4 in an interview is pretty low, I'd guess.
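A minimal randomized quicksort sketch (Lomuto partition, as in 7.1/7.3, but in my own Python rather than the book's pseudocode); expected O(n lg n), worst case O(n^2):

```python
import random

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    # Randomized pivot: swap a random element into the pivot position.
    p = random.randint(lo, hi)
    a[p], a[hi] = a[hi], a[p]
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]        # pivot lands at its final position i
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)

data = [2, 8, 7, 1, 3, 5, 6, 4]
quicksort(data)
print(data)  # [1, 2, 3, 4, 5, 6, 7, 8]
```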
Chapter 8
8.1 - Lower bounds on sorting - Yes. Basic knowledge. It may be asked in a Google interview (unlikely, but I know of a case where it happened).
8.2 - Counting sort - Need-to-know in detail. It comes up in disguised forms (a short sketch follows after this chapter's notes).
8.3 - Radix sort - Yup. It's an easy algorithm anyway.
8.4 - Bucket sort - can skip.
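Here is the counting sort sketch promised under 8.2; the "disguised forms" are usually just "count occurrences into an array indexed by value". This is my own minimal version, not the book's pseudocode:

```python
# Stable counting sort for ints in [0, k]: O(n + k) time, O(n + k) space.
def counting_sort(a, k):
    count = [0] * (k + 1)
    for x in a:
        count[x] += 1
    for v in range(1, k + 1):        # prefix sums: count[v] = number of elements <= v
        count[v] += count[v - 1]
    out = [0] * len(a)
    for x in reversed(a):            # walking backwards keeps the sort stable
        count[x] -= 1
        out[count[x]] = x
    return out

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # [0, 0, 2, 2, 3, 3, 3, 5]
```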
Chapter 9
9.1 - Small section, worth a read.
9.2 - Selection in expected linear time - Very important: it's not common knowledge the way quicksort is, and yet it comes up often in interviews. I had to code the entire thing in an interview once (a sketch follows after this chapter's notes).
9.3 - Selection in worst-case linear time - Can skip. Just know that it's possible in worst-case linear time, because that might help somewhat.
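And the 9.2 sketch: randomized quickselect, which is roughly what I'd expect to have to code in that kind of interview (expected linear time, worst case quadratic; my own minimal version):

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element of a (k is 0-based). Modifies a in place."""
    lo, hi = 0, len(a) - 1
    while True:
        if lo == hi:
            return a[lo]
        # Lomuto partition around a random pivot.
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]
        pivot, i = a[hi], lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        if k == i:                     # pivot landed exactly at rank k
            return a[i]
        elif k < i:
            hi = i - 1
        else:
            lo = i + 1

print(quickselect([3, 2, 9, 0, 7, 5, 4, 8, 6, 1], 4))  # 4 (the 5th smallest)
```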
Chapter 10
10.1 - Stacks and queues - basic knowledge, definitely very important.
10.2 - Linked lists - same as 10.1
10.3 - Implementing pointers and objects - If you use C++ or Java, skip this. Otherwise I'm not sure.
10.4 - Representing rooted trees - Small section, worth a quick read.
Chapter 11
For hashing, I'd say the implementation isn't as important to know as it is for, say, linked lists, but you should definitely have an idea about it and, most importantly, know the expected and worst-case time complexities of search/insert/delete. Also know that, practically speaking, hash tables are very important data structures, and that the expected time complexity is what matters in the real world.
11.1 - Direct addressing - Just understand the idea.
11.2 - Hash tables - important.
11.3 - Hash functions - Worth having an idea about, but I wouldn't go too in-depth here. Just know a couple of examples of good and bad hash functions (and why they are good or bad).
11.4 - Open addressing - Worth having an idea about, but unlikely to come up.
11.5 - Perfect hashing - skip.
Chapter 12
12.1 - What is a binary search tree? - Yep.
12.2 - Querying a BST - Yep. All of it.
12.3 - Insertion/Deletion - Same as 12.2 (a small search/insert sketch follows after this chapter's notes).
12.4 - Randomly built BSTs - just know Theorem 12.4 (the expected height of a randomly built BST is O(lg n)) and have an idea of why it's true.
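A small sketch of the 12.2/12.3 essentials, search and insert (deletion is fiddlier and worth reading in the book); both operations are O(h), where h is the tree height:

```python
# A minimal (unbalanced) BST: search and insert only.
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    """Insert key and return the (possibly new) root."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def bst_search(root, key):
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root

root = None
for k in [8, 3, 10, 1, 6, 14]:
    root = bst_insert(root, k)
print(bst_search(root, 6) is not None)   # True
print(bst_search(root, 7) is not None)   # False
```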
Chapter 13
This one is easy. Know what a red-black tree is and what its worst-case height/insert/delete/find are. Read 13.1 and 13.2 and skip the rest. You will never be asked to do an RB-tree insert/delete unless the interviewer is "doing it wrong", or unless they want to see if you can re-derive the cases, in which case knowing them won't help much anyway (and I doubt this would happen). Also know that RB-trees are pretty space-efficient and that some C++ STL containers (e.g. std::map and std::set) are typically implemented as red-black trees.
Chapter 14
Might be worth skimming 14.2 just to know that you can augment data structures and why it might be helpful. Otherwise do one or two simple problems on augmenting data structures and you're set here. I'd skip 14.1 and 14.3.
Chapter 15
DP! Must-know.
15.1 - Rod-cutting. Standard DP problem, must-know.
15.2 - Matrix-chain multiplication - same as 15.1, though I don't particularly like the way this section is written (it's rare for me to say that about CLRS).
15.3 - Elements of DP - worth a read so that you understand DP properly, but I'd say it's less important than knowing what DP is (via the chapter introduction) and practicing on it (via the problems in this book and in interview preparation books).
15.4 - LCS - same as 15.1 (a short DP sketch follows after this chapter's notes).
15.5 - Optimal binary search trees - I've never read this section, so I can't argue for its importance, but I did fine without it.
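Here is the DP sketch promised under 15.4: bottom-up LCS with reconstruction, my own minimal version of the standard table-filling solution:

```python
# Longest common subsequence (CLRS 15.4 style): O(len(x) * len(y)) time and space.
def lcs(x, y):
    m, n = len(x), len(y)
    # dp[i][j] = length of the LCS of x[:i] and y[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # Reconstruct one LCS by walking back through the table.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return ''.join(reversed(out))

print(lcs("ABCBDAB", "BDCABA"))  # "BCBA" (length 4)
```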
Chapter 16
You should definitely know what a greedy algorithm is, so read the introduction for this chapter.
16.1 - An activity selection problem - Haven't read this in detail, but I'd say check it out, if not in-depth.
16.2 - Elements of the greedy strategy - same as 16.1
16.3 - Huffman codes - I'd say read the problem and the algorithm, but that's enough. I've seen interview questions where the answer is Huffman coding, but the question comes up in a "disguised form", so it won't be obvious (a small sketch follows after this chapter's notes).
16.4 - Matroids and greedy methods - I've never read this section, but I've done a lot of greedy problems during interview prep and this stuff never came up, so I'd say this section is irrelevant for the interview.
16.5 - Task-scheduling problem as a matroid - Same as 16.4.
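Here is the Huffman sketch promised under 16.3: build the tree greedily with a min-heap, then read the codes off the root-to-leaf paths. This is my own version; the tie-breaking counter is only there so heapq never compares tree nodes directly:

```python
import heapq

def huffman_codes(freq):
    """freq: dict of symbol -> frequency. Returns dict of symbol -> bit string."""
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)      # two lowest-frequency subtrees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):            # internal node: (left, right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                  # leaf: a symbol
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

# CLRS's running example: 'a' ends up with a 1-bit code, 'e' and 'f' with 4-bit codes.
print(huffman_codes({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}))
```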
Chapter 17
Okay, you should definitely know what amortized analysis is, but I've never read it from the book, and I feel it's a simple enough concept that you can just Google it and check a few examples of what it is, or understand it just by reading section 17.1 (a worked example follows after this chapter's notes). So:
17.1 - Aggregate analysis - read this, it explains the important stuff.
17.2, 17.3, 17.4 - Skip.
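If you'd rather see the idea than read the section, here is the aggregate method applied to the classic dynamic-array example (note that 17.1 itself uses a stack with multipop and a binary counter, not this one; the style of reasoning is the same). For n appends into an array that doubles its capacity whenever it fills up:

```latex
\underbrace{n}_{\text{one write per append}}
\;+\;
\underbrace{\sum_{i=0}^{\lfloor \lg n \rfloor} 2^{i}}_{\text{copying at each doubling}}
\;<\; n + 2n \;=\; 3n
\qquad\Longrightarrow\qquad
\text{amortized } O(1) \text{ per append.}
```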
Chapter 18
You should probably have an idea of what B-trees (and B+ trees) are; I've heard of cases where candidates were asked about them in a general sense (high-level questions about what they are and why they're awesome). But other than that, I'd skip this chapter.
Chapter 19
Fibonacci heaps - nope.
Chapter 20
van Emde Boas Trees - double, triple, and quadruple nope.
Chapter 21
Disjoint sets
Update:
I originally recommended skipping this section, but on reconsideration, I've noticed that it's actually more important than I originally thought. Thus, I recommend reading sections 21.1 and 21.2, while skipping the rest.
Union-find is somewhat important, and I've seen at least one problem which uses it, though that problem could also be solved using DFS and connected components. That said, I also believe it's not strictly necessary, because for interview purposes one can probably come up with a similar enough structure to solve a problem that requires union-find without knowing the material in this chapter. However, I believe it's worth a read, so that if a problem comes up whose intended solution is a union-find data structure, you don't spend interview time reinventing it and instead know it beforehand, which can be a real advantage. Still, I'd probably rank it as less important than most of the other material in this list, and even less important than some material that isn't in CLRS at all (tries, for example).
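If you do want it in your back pocket, the whole structure is small. A minimal sketch (union by rank plus path halving, a common shortcut for full path compression; not the book's exact pseudocode):

```python
# Disjoint-set union (union-find) over elements 0..n-1.
class DSU:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False                 # already in the same set
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra             # attach the shorter tree under the taller one
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True

dsu = DSU(5)
dsu.union(0, 1); dsu.union(3, 4)
print(dsu.find(0) == dsu.find(1))  # True
print(dsu.find(1) == dsu.find(3))  # False
```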
Okay, now graph algorithms. First read the introduction. Now, there's a lot to know here, so hang on.
Chapter 22
22.1 - Representations of graphs - Yes.
22.2 - BFS - Yes. After you do that, solve this problem: ACM-ICPC Live Archive - Kermit the Frog. The whole "state-space search using BFS" idea is an important concept that can be used to solve several interview problems (a generic sketch follows after this chapter's notes).
22.3 - DFS - Yes.
22.4 - Topological sort - Yes.
22.5 - Strongly connected components - much less likely to come up than the above 4, but still possible, so: Yes.
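Here is the generic "state-space search using BFS" pattern from the 22.2 note as a minimal sketch: fewest moves between two cells of a grid with walls. This is only the pattern, not the linked problem; the grid format and function are my own:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """grid: list of strings, '#' = wall. Returns the fewest steps from start to goal, or -1."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            return dist[(r, c)]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in dist):
                dist[(nr, nc)] = dist[(r, c)] + 1   # first visit = shortest distance
                queue.append((nr, nc))
    return -1

grid = ["..#.",
        ".#..",
        "...."]
print(shortest_path(grid, (0, 0), (0, 3)))  # 7
```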
Chapter 23
Minimum spanning trees - probably the least important graph algorithm, other than max flow (I mean for interview purposes, of course). I'd still say you should read it because it's such a well-known problem, but definitely give priority to the other things.
23.1 - Growing a MST - sort of, yes.
23.2 - Prim's and Kruskal's algorithms - sort of, yes.
Chapter 24
Shortest path algorithms are important, though maybe less so than BFS/DFS.
Read the introduction. You should, in general, read all introductions anyway, but this one's important (and long), so it warranted a special note.
24.1 Bellman-Ford - Know the algorithm and its proof of correctness.
24.2 Shortest paths in DAGs - definitely worth knowing, may come up, even more so than Bellman-Ford I'd say.
24.3 Dijkstra's algorithm - Yes. Of course. I've seen this come up multiple times (with slight variations), and I've even seen A* come up. A heap-based sketch follows after this chapter's notes.
24.4 Difference constraints and shortest paths - Skip.
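Here is the Dijkstra sketch promised under 24.3: the binary-heap version with lazy deletion, which is what I'd write in an interview (my own minimal version; weights must be non-negative, and the example graph is just an illustration):

```python
import heapq

def dijkstra(graph, source):
    """graph: dict of node -> list of (neighbor, weight), weights >= 0.
    Returns a dict of shortest distances from source. O((V + E) log V)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale entry left over from an earlier relaxation
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

graph = {
    's': [('t', 10), ('y', 5)],
    't': [('x', 1), ('y', 2)],
    'y': [('t', 3), ('x', 9), ('z', 2)],
    'x': [('z', 4)],
    'z': [('s', 7), ('x', 6)],
}
print(dijkstra(graph, 's'))  # {'s': 0, 't': 8, 'y': 5, 'x': 9, 'z': 7}
```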
Chapter 25
Read the intro as well.
25.1 - Matrix multiplication - I'd say skip. It might be possible for this to come up (a very, very slim chance, though), but the odds are so low in my view that it's probably not worth it. If you have some extra time, though, give it a read.
25.2 - Floyd-Warshall - Yep, worth knowing the algorithm, its time complexity, and when it works (which is for all weighted graphs except those with negative-weight cycles). Its code is something like five lines, so there's no reason not to know it; a sketch follows after this chapter's notes. The analysis might be a bit overkill, though.
25.3 - Johnson's algorithm - Skip.
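Here is the Floyd-Warshall sketch promised under 25.2; the triple loop really is about five lines (my own version, using an adjacency matrix with float('inf') for missing edges):

```python
# All-pairs shortest paths in O(V^3); handles negative edges but not negative-weight cycles.
INF = float('inf')

def floyd_warshall(dist):
    """dist: n x n matrix, dist[i][j] = weight of edge i->j (INF if absent, 0 on the diagonal).
    Updated in place to hold all-pairs shortest-path distances."""
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

d = [[0,   3,   INF, 7],
     [8,   0,   2,   INF],
     [5,   INF, 0,   1],
     [2,   INF, INF, 0]]
print(floyd_warshall(d))
# [[0, 3, 5, 6], [5, 0, 2, 3], [3, 6, 0, 1], [2, 5, 7, 0]]
```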
Chapter 26
Maximum flow - I've never heard of this coming up in an interview and I can't imagine why it would, so skip.
Chapters 27+
Most of this stuff is never going to come up, so it's easier for me to tell you what to actually read than what not to read. Here are a few selected topics from the book's Selected Topics part:
Chapter 31
Most of what you should learn from this chapter you can learn from practicing on interview problems from Elements of Programming Interviews (and your time is better spent doing that), so I'd say skip it all except Euclid's algorithm for the GCD, under section 31.2.
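Euclid's algorithm is short enough to include in full (iterative, rather than the book's recursive version):

```python
def gcd(a, b):
    """Euclid's algorithm: gcd(a, b) = gcd(b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

print(gcd(30, 21))  # 3
```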
Chapter 32
32.1 - Naive method - just read it quickly.
32.2 - Rabin-Karp - I'd say you should know this; the rolling-hash concept is very important and can be useful in many string- or search-related interview problems.
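A minimal Rabin-Karp sketch to make the rolling-hash idea concrete (my own version; the base and modulus are arbitrary choices, and real solutions often vary them to avoid adversarial collisions):

```python
# Rabin-Karp: roll a polynomial hash over the text, compare characters only on hash hits.
def rabin_karp(text, pattern, base=256, mod=1_000_000_007):
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    high = pow(base, m - 1, mod)           # weight of the character that rolls out
    p_hash = t_hash = 0
    for i in range(m):
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    hits = []
    for i in range(n - m + 1):
        if p_hash == t_hash and text[i:i + m] == pattern:
            hits.append(i)                 # verify to rule out a hash collision
        if i < n - m:
            # Roll the window: drop text[i], bring in text[i + m].
            t_hash = ((t_hash - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return hits

print(rabin_karp("abracadabra", "abra"))  # [0, 7]
```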
Appendices
A - Summations
Know the important summations for time complexity analysis.
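The ones I'd have at my fingertips:

```latex
\begin{gather*}
\sum_{i=1}^{n} i = \frac{n(n+1)}{2} = \Theta(n^2), \qquad
\sum_{i=0}^{n} x^i = \frac{x^{n+1} - 1}{x - 1} \quad (x \neq 1), \\
\sum_{i=1}^{n} \frac{1}{i} = \ln n + O(1), \qquad
\sum_{i=1}^{n} \lg i = \lg(n!) = \Theta(n \lg n).
\end{gather*}
```

The arithmetic series is why the naive double loop is Θ(n^2), and the geometric series is what makes doubling-based structures cheap.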
C - Counting and Probability
Give C.4 a read if you don't know the material. Bernoulli trials may come up in problems (not explicitly, but you might use them, specifically for the time analysis of questions that involve probability or coin flips).