Hadoop File Operations
Create a directory in HDFS - mkdir
The hadoop fs -mkdir command creates directories in HDFS, similar to the Unix mkdir command. It takes path URIs as arguments and creates those directories; the -p option creates any missing parent directories along the path.
Usage:
hadoop fs -mkdir [-p] <paths>
Examples:
hadoop fs -mkdir /user/hadoop/corejavaguru
hadoop fs -mkdir /user/hadoop/dir1 /user/hadoop/dir2
hadoop fs -mkdir -p /user/hadoop/corejavaguru/fscommands/demo
List the contents of a HDFS directory - ls
The ls command lists files and directories.
For a file, ls returns the file's stats in the following format:
permissions number_of_replicas userid groupid filesize modification_date modification_time filename
For a directory, it returns a list of its direct children, as in Unix. A directory is listed as:
permissions userid groupid modification_date modification_time dirname
Usage:
hadoop fs -ls <args>
Example:
hadoop fs -ls /user/hadoop/file1
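For illustration, listing a file might produce output like the following (the values here are made up and will differ on your cluster):
-rw-r--r--   3 hadoop supergroup       1366 2019-01-15 10:12 /user/hadoop/file1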
Upload a file into HDFS - put
The put command copies a single source, or multiple sources, from the local file system to the destination file system. It can also read input from stdin and write it to the destination file system (see the stdin example below). The different ways to use the put command are:
Usage:
hadoop fs -put <localsrc> ... <hdfs_dest_path>
Example:
hadoop fs -put /home/hadoop/Samplefile.txt /user/hadoop/dir3/
hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
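To read the input from stdin instead of a local file, pass - as the source:
hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile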
Download a file from HDFS - get
The hadoop get command copies files from HDFS to the local file system. The syntax of the get command is shown below:
Usage:
hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>
Example:
hadoop fs -get /user/hadoop/file localfile
hadoop fs -get hdfs://nn.example.com/user/hadoop/file localfile
See contents of a file in HDFS - cat
The cat command prints the contents of a file to stdout.
Usage:
hadoop fs -cat URI [URI ...]
Example:
hadoop fs -cat /user/hadoop/dir1/xyz.txt
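cat also accepts several paths in one call and prints their contents one after another, for example:
hadoop fs -cat /user/hadoop/file1 /user/hadoop/file2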
Copy a file from source to destination in HDFS - cp
The cp command copies files from a source to a destination within HDFS. Multiple sources are allowed, in which case the destination must be a directory.
Usage:
hadoop fs -cp [-f] [-p | -p[topax]] URI [URI ...] <dest>
Example:
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir
Copy a file from Local file system to HDFS - copyFromLocal
The hadoop copyFromLocal command copies a file from the local file system to HDFS. It is similar to the put command, except that the source is restricted to a local file reference.
Usage:
hadoop fs -copyFromLocal <localsrc> URI
Example:
hadoop fs -copyFromLocal /home/hadoop/xyz.txt /user/hadoop/xyz.txt
Copy a file from HDFS to Local file system - copyToLocal
The hadoop copyToLocal command copies a file from HDFS to the local file system. It is similar to the get command, except that the destination is restricted to a local file reference.
Usage:
hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
Example:
hadoop fs -copyToLocal /user/hadoop/xyz.txt /home/hadoop/xyz.txt
Move file from source to destination in HDFS - mv
Moves files from a source to a destination within HDFS. Multiple sources are allowed, in which case the destination must be a directory. Note: moving files across file systems is not permitted.
Usage:
hadoop fs -mv URI [URI ...] <dest>
Example:
hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2
hadoop fs -mv hdfs://nn.example.com/file1 hdfs://nn.example.com/file2 hdfs://nn.example.com/dir1
Remove a file or directory in HDFS - rm, rmdir
rm
Deletes the files specified as arguments. To remove a directory and its contents, pass the -r (or -R) option; an empty directory can also be removed with rmdir.
Usage:
hadoop fs -rm [-f] [-r |-R] [-skipTrash] URI [URI ...]
Example:
hadoop fs -rm hdfs://nn.example.com/file /user/hadoop/emptydir
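To remove a directory and everything under it, add the -r option (use with care, especially together with -skipTrash):
hadoop fs -rm -r /user/hadoop/dir1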
rmdir
Deletes the directories specified as arguments. A directory must be empty to be removed.
Usage:
hadoop fs -rmdir [--ignore-fail-on-non-empty] URI [URI ...]
Example:
hadoop fs -rmdir /user/hadoop/emptydir
Options: --ignore-fail-on-non-empty: When using wildcards, do not fail if a directory still contains files.
Display last few lines of a file in HDFS - tail
Displays the last kilobyte of the file to stdout.
Usage:
hadoop fs -tail [-f] URI
Example:
hadoop fs -tail /user/hadoop/demo.txt
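With the -f option, tail keeps running and prints new data as it is appended to the file, similar to Unix tail -f:
hadoop fs -tail -f /user/hadoop/demo.txt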
Print statistics about the file or directory in HDFS - stat
Use stat to print statistics about the file or directory at <path> in the specified format.
Usage:
hadoop fs -stat [format] <path> ...
Example:
hadoop fs -stat /user/hadoop/
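A format string can be passed as well; for example, %n prints the name and %b the size in bytes (run hadoop fs -help stat to see the full list of format specifiers supported by your version):
hadoop fs -stat "%n %b" /user/hadoop/dir1/xyz.txt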
Display the size of files and directories in HDFS - du
The du command displays the aggregate size of the files contained in a directory, or the size of a single file if the path is a file.
Usage :
hadoop fs -du [-s] [-h] URI [URI ...]
Example:
hadoop fs -du /user/hadoop/dir1/xyz.txt
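du also accepts -s to print an aggregate summary instead of per-file sizes, and -h to format sizes in a human-readable way:
hadoop fs -du -s -h /user/hadoop/dir1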
Change group of files in HDFS - chgrp
The hadoop chgrp shell command is used to change the group association of files. The user must be the owner of files, or else a super-user.
Usage:
hadoop fs -chgrp [-R] GROUP URI [URI ...]
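Example (hadoopgroup is a placeholder; use a group that actually exists on your cluster):
hadoop fs -chgrp -R hadoopgroup /user/hadoop/dir1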
Change the permissions of files in HDFS - chmod
The hadoop chmod command is used to change the permissions of files. The user must be the owner of the file, or else a super-user.
Usage:
hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI ...]
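Example (both octal and symbolic modes are accepted):
hadoop fs -chmod 755 /user/hadoop/file1
hadoop fs -chmod -R u+w /user/hadoop/dir1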
Change the owner of files in HDFS - chown
The hadoop chown command is used to change the ownership of files. The user must be a super-user.
Usage:
hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
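Example (hadoopuser and hadoopgroup are placeholder names):
hadoop fs -chown -R hadoopuser:hadoopgroup /user/hadoop/dir1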
Help for an individual HDFS command - usage
The command below returns the help for an individual command.
Usage:
hadoop fs -usage command
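Example (prints the usage line for the mkdir command):
hadoop fs -usage mkdir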