http://stackoverflow.com/questions/5882005/how-to-download-image-from-any-web-page-in-java

    import java.awt.Image;
    import java.io.IOException;
    import java.net.URL;
    import javax.imageio.ImageIO;

    Image image = null;
    try {
        URL url = new URL("http://www.yahoo.com/image_to_read.jpg");
        image = ImageIO.read(url);  // downloads and decodes the image (throws IOException)
    } catch (IOException e) {
        e.printStackTrace();  // handle the failure instead of swallowing it
    }
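
If you then want to write the decoded image back out, ImageIO can save it as well. A minimal sketch, assuming a JPEG target in the working directory (the file name is illustrative):

    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import java.net.URL;
    import javax.imageio.ImageIO;

    // Both calls throw IOException; ImageIO.read returns null if no reader handles the format.
    BufferedImage image = ImageIO.read(new URL("http://www.yahoo.com/image_to_read.jpg"));
    ImageIO.write(image, "jpg", new File("image_to_read.jpg"));

Note that this re-encodes the image rather than keeping the original bytes.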

See the javax.imageio package for more info. That approach uses the AWT image classes. Otherwise you could do:

    import java.io.BufferedInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.InputStream;
    import java.net.URL;

    URL url = new URL("http://www.yahoo.com/image_to_read.jpg");
    InputStream in = new BufferedInputStream(url.openStream());
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buf = new byte[1024];
    int n;
    while ((n = in.read(buf)) != -1) {  // copy the stream in 1 KB chunks
        out.write(buf, 0, n);
    }
    out.close();
    in.close();
    byte[] response = out.toByteArray();  // the raw image bytes

You may then want to save the image to disk:

    FileOutputStream fos = new FileOutputStream("C:/borrowed_image.jpg");
    fos.write(response);
    fos.close();
answered May 4 '11 at 10:32 by planetjones

For Java 7 and later, modify the code snippet to use the try-with-resources statement; see docs.oracle.com/javase/tutorial/essential/exceptions/… – Gonfi den Tschal, Jul 7 '13 at 1:18
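
For example, a minimal sketch of the byte-array download above rewritten with try-with-resources (the download helper method is just for illustration):

    import java.io.BufferedInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;

    public static byte[] download(String imageUrl) throws IOException {
        // Both streams are closed automatically, even if read() or write() throws.
        try (InputStream in = new BufferedInputStream(new URL(imageUrl).openStream());
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buf = new byte[1024];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }
    }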

You are looking for a web crawler. You can use JSoup to do this; here is a basic example.
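
A minimal sketch of that approach, assuming the jsoup library is on the classpath (the page URL, CSS selector, and downloads directory are illustrative, not from the original answer):

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import org.jsoup.Jsoup;
    import org.jsoup.nodes.Document;
    import org.jsoup.nodes.Element;

    public class ImageCrawler {
        public static void main(String[] args) throws IOException {
            // Parse the page and select every <img> element that has a src attribute.
            Document doc = Jsoup.connect("http://example.com/").get();
            for (Element img : doc.select("img[src]")) {
                String src = img.absUrl("src");  // resolves relative URLs against the page
                String name = src.substring(src.lastIndexOf('/') + 1);
                if (src.isEmpty() || name.isEmpty()) {
                    continue;  // skip images without a usable URL or file name
                }
                Path target = Paths.get("downloads", name);
                Files.createDirectories(target.getParent());
                try (InputStream in = new URL(src).openStream()) {
                    Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }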

answered May 4 '11 at 10:31 by Jigar Joshi

This works for me:

    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.URL;

    URL url = new URL("http://upload.wikimedia.org/wikipedia/commons/9/9c/Image-Porkeri_001.jpg");
    InputStream in = new BufferedInputStream(url.openStream());
    OutputStream out = new BufferedOutputStream(new FileOutputStream("Image-Porkeri_001.jpg"));
    int i;
    while ((i = in.read()) != -1) {  // copy one byte at a time (the streams are buffered)
        out.write(i);
    }
    in.close();
    out.close();
answered Apr 15 '14 at 21:11 by G.F.

If you want to save the image and you know its URL, you can do this:

    try (InputStream in = new URL("http://example.com/image.jpg").openStream()) {
        Files.copy(in, Paths.get("C:/File/To/Save/To/image.jpg"));
    }

You will also need to handle the IOExceptions which may be thrown.
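
For example, a minimal sketch of that approach with the exception handled and overwriting enabled (the file names are illustrative):

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class SaveImage {
        public static void main(String[] args) {
            try (InputStream in = new URL("http://example.com/image.jpg").openStream()) {
                // REPLACE_EXISTING avoids a FileAlreadyExistsException on repeated runs.
                Files.copy(in, Paths.get("image.jpg"), StandardCopyOption.REPLACE_EXISTING);
            } catch (IOException e) {
                System.err.println("Download failed: " + e.getMessage());
            }
        }
    }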

answered Sep 9 '15 at 6:15 by Alex

The following code downloads an image from a direct link and saves it to the project directory. Also note that it uses try-with-resources.

    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileNotFoundException;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.MalformedURLException;
    import java.net.URL;

    import org.apache.commons.io.FilenameUtils;  // Apache Commons IO, used to extract the file name

    public class ImageDownloader
    {
        public static void main(String[] arguments) throws IOException
        {
            downloadImage("https://upload.wikimedia.org/wikipedia/commons/7/73/Lion_waiting_in_Namibia.jpg",
                    new File("").getAbsolutePath());
        }

        public static void downloadImage(String sourceUrl, String targetDirectory)
                throws MalformedURLException, IOException, FileNotFoundException
        {
            URL imageUrl = new URL(sourceUrl);

            // Name the target file after the last path segment of the source URL.
            try (InputStream imageReader = new BufferedInputStream(imageUrl.openStream());
                 OutputStream imageWriter = new BufferedOutputStream(
                         new FileOutputStream(targetDirectory + File.separator
                                 + FilenameUtils.getName(sourceUrl))))
            {
                int readByte;
                while ((readByte = imageReader.read()) != -1)
                {
                    imageWriter.write(readByte);
                }
            }
        }
    }

