GeoSpark is a cluster computing system for processing large-scale spatial data. It extends Apache Spark with a set of out-of-the-box Spatial Resilient Distributed Datasets (SRDDs) that can efficiently load, process, analyze, and visualize large-scale spatial data across machines.

Prerequisites

  1. Windows and Spark
  2. IDEA
  3. GeoSpark supports both Java and Scala; Java is used here.

GeoSpark

See https://github.com/jiayuasu/GeoSparkTemplateProject and download the project to your local machine.

Building the GeoSpark-Viz Java project

  1. cd ./geospark-viz/java
  2. mvn clean install

Since the images generated from the sample data bundled with the project were not satisfactory, I parsed map.shp into polygon.csv and made a small change to the Java code:

  1. ConfFile= new FileInputStream(resourcePath+"babylon.polygon2.properties");
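
The code reads four keys from each properties file (inputLocation, offset, splitter, numPartitions). A babylon.polygon2.properties for the converted data might look like the following; the values are illustrative and depend on how polygon.csv was exported:

```properties
# babylon.polygon2.properties -- illustrative values, adjust to your export
inputLocation=data/polygon.csv
offset=0
splitter=wkt
numPartitions=5
```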



buildChoroplethMap counts the points falling inside each polygon to produce a choropleth (graduated-color) map; buildScatterPlot and buildHeatMap were modified to take the point data as input, producing a scatter plot and a heat map.
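
Conceptually, the choropleth join assigns to each polygon the number of points it contains. A minimal pure-Java sketch of that computation (no Spark or GeoSpark; java.awt.geom only — the real job distributes it with a partitioned R-tree join, as the full code below shows):

```java
import java.awt.geom.Path2D;
import java.awt.geom.Point2D;
import java.util.Arrays;
import java.util.List;

public class ChoroplethSketch {

    // What the choropleth join computes per polygon: how many points fall inside it.
    public static long countPointsInPolygon(Path2D polygon, List<? extends Point2D> points) {
        return points.stream().filter(polygon::contains).count();
    }

    public static void main(String[] args) {
        // Unit-square "polygon"
        Path2D square = new Path2D.Double();
        square.moveTo(0, 0);
        square.lineTo(1, 0);
        square.lineTo(1, 1);
        square.lineTo(0, 1);
        square.closePath();

        List<Point2D> points = Arrays.asList(
                new Point2D.Double(0.5, 0.5),  // inside
                new Point2D.Double(0.2, 0.8),  // inside
                new Point2D.Double(2.0, 2.0)); // outside

        System.out.println(countPointsInPolygon(square, points)); // prints 2
    }
}
```

The choropleth renderer then maps each polygon's count to a color intensity; GeoSpark's JoinQuery.SpatialJoinQueryCountByKey produces exactly these (Polygon, Long) pairs at cluster scale.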





Full code:

package example;

import com.vividsolutions.jts.geom.Envelope;
import com.vividsolutions.jts.geom.Polygon;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.serializer.KryoSerializer;
import org.apache.spark.storage.StorageLevel;
import org.datasyslab.geospark.enums.FileDataSplitter;
import org.datasyslab.geospark.enums.GridType;
import org.datasyslab.geospark.enums.IndexType;
import org.datasyslab.geospark.formatMapper.EarthdataHDFPointMapper;
import org.datasyslab.geospark.spatialOperator.JoinQuery;
import org.datasyslab.geospark.spatialRDD.PointRDD;
import org.datasyslab.geospark.spatialRDD.PolygonRDD;
import org.datasyslab.geospark.spatialRDD.RectangleRDD;
import org.datasyslab.geosparkviz.core.ImageGenerator;
import org.datasyslab.geosparkviz.core.ImageStitcher;
import org.datasyslab.geosparkviz.core.RasterOverlayOperator;
import org.datasyslab.geosparkviz.core.Serde.GeoSparkVizKryoRegistrator;
import org.datasyslab.geosparkviz.extension.visualizationEffect.ChoroplethMap;
import org.datasyslab.geosparkviz.extension.visualizationEffect.HeatMap;
import org.datasyslab.geosparkviz.extension.visualizationEffect.ScatterPlot;
import org.datasyslab.geosparkviz.utils.ColorizeOption;
import org.datasyslab.geosparkviz.utils.ImageType;

import java.awt.*;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

/**
 * The Class Example2019.
 */
public class Example2019 {

    /** The spark context. */
    static JavaSparkContext sparkContext;
    /** The prop. */
    static Properties prop;
    /** The Point input location. */
    static String PointInputLocation;
    /** The Point offset. */
    static Integer PointOffset;
    /** The Point splitter. */
    static FileDataSplitter PointSplitter;
    /** The Point num partitions. */
    static Integer PointNumPartitions;
    /** The Rectangle input location. */
    static String RectangleInputLocation;
    /** The Rectangle offset. */
    static Integer RectangleOffset;
    /** The Rectangle splitter. */
    static FileDataSplitter RectangleSplitter;
    /** The Rectangle num partitions. */
    static Integer RectangleNumPartitions;
    /** The Polygon input location. */
    static String PolygonInputLocation;
    /** The Polygon offset. */
    static Integer PolygonOffset;
    /** The Polygon splitter. */
    static FileDataSplitter PolygonSplitter;
    /** The Polygon num partitions. */
    static Integer PolygonNumPartitions;
    /** The Line string input location. */
    static String LineStringInputLocation;
    /** The Line string offset. */
    static Integer LineStringOffset;
    /** The Line string splitter. */
    static FileDataSplitter LineStringSplitter;
    /** The Line string num partitions. */
    static Integer LineStringNumPartitions;
    /** The US main land boundary. */
    static Envelope USMainLandBoundary;
    /** The earthdata input location. */
    static String earthdataInputLocation;
    /** The earthdata num partitions. */
    static Integer earthdataNumPartitions;
    /** The HDF increment. */
    static int HDFIncrement = 5;
    /** The HDF offset. */
    static int HDFOffset = 2;
    /** The HDF root group name. */
    static String HDFRootGroupName = "MOD_Swath_LST";
    /** The HDF data variable name. */
    static String HDFDataVariableName = "LST";
    /** The HDF data variable list. */
    static String[] HDFDataVariableList = {"LST", "QC", "Error_LST", "Emis_31", "Emis_32"};
    /** The HDF switch XY. */
    static boolean HDFswitchXY = true;
    /** The url prefix. */
    static String urlPrefix = "";

    /**
     * Builds the scatter plot.
     *
     * @param outputPath the output path
     * @return true, if successful
     */
    public static boolean buildScatterPlot(String outputPath)
    {
        try {
            PointRDD spatialRDD = new PointRDD(sparkContext, PointInputLocation, PointOffset, PointSplitter, false, PointNumPartitions, StorageLevel.MEMORY_ONLY());
            //PolygonRDD spatialRDD = new PolygonRDD(sparkContext, PolygonInputLocation, PolygonSplitter, false, PolygonNumPartitions, StorageLevel.MEMORY_ONLY());
            ScatterPlot visualizationOperator = new ScatterPlot(1000, 600, USMainLandBoundary, false);
            visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.GREEN, true);
            visualizationOperator.Visualize(sparkContext, spatialRDD);
            ImageGenerator imageGenerator = new ImageGenerator();
            imageGenerator.SaveRasterImageAsLocalFile(visualizationOperator.rasterImage, outputPath, ImageType.PNG);

            // visualizationOperator = new ScatterPlot(1000, 600, USMainLandBoundary, false, -1, -1, false, true);
            // visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.GREEN, true);
            // visualizationOperator.Visualize(sparkContext, spatialRDD);
            // imageGenerator = new ImageGenerator();
            // imageGenerator.SaveVectorImageAsLocalFile(visualizationOperator.vectorImage, outputPath, ImageType.SVG);
            //
            // visualizationOperator = new ScatterPlot(1000, 600, USMainLandBoundary, false, -1, -1, true, true);
            // visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.GREEN, true);
            // visualizationOperator.Visualize(sparkContext, spatialRDD);
            // imageGenerator = new ImageGenerator();
            // imageGenerator.SaveVectorImageAsLocalFile(visualizationOperator.distributedVectorImage, outputPath + "-distributed", ImageType.SVG);
        }
        catch (Exception e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

    /**
     * Builds the heat map.
     *
     * @param outputPath the output path
     * @return true, if successful
     */
    public static boolean buildHeatMap(String outputPath)
    {
        try {
            PointRDD spatialRDD = new PointRDD(sparkContext, PointInputLocation, PointOffset, PointSplitter, false, PointNumPartitions, StorageLevel.MEMORY_ONLY());
            HeatMap visualizationOperator = new HeatMap(1000, 600, USMainLandBoundary, false, 5);
            visualizationOperator.Visualize(sparkContext, spatialRDD);
            ImageGenerator imageGenerator = new ImageGenerator();
            imageGenerator.SaveRasterImageAsLocalFile(visualizationOperator.rasterImage, outputPath, ImageType.PNG);
        }
        catch (Exception e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

    /**
     * Builds the choropleth map.
     *
     * @param outputPath the output path
     * @return true, if successful
     */
    public static boolean buildChoroplethMap(String outputPath)
    {
        try {
            PointRDD spatialRDD = new PointRDD(sparkContext, PointInputLocation, PointOffset, PointSplitter, false, PointNumPartitions, StorageLevel.MEMORY_ONLY());
            PolygonRDD queryRDD = new PolygonRDD(sparkContext, PolygonInputLocation, PolygonSplitter, false, PolygonNumPartitions, StorageLevel.MEMORY_ONLY());
            spatialRDD.spatialPartitioning(GridType.RTREE);
            queryRDD.spatialPartitioning(spatialRDD.grids);
            spatialRDD.buildIndex(IndexType.RTREE, true);
            JavaPairRDD<Polygon, Long> joinResult = JoinQuery.SpatialJoinQueryCountByKey(spatialRDD, queryRDD, true, false);
            long start = System.currentTimeMillis();
            ChoroplethMap visualizationOperator = new ChoroplethMap(1000, 600, USMainLandBoundary, false);
            visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.RED, true);
            visualizationOperator.Visualize(sparkContext, joinResult);
            ScatterPlot frontImage = new ScatterPlot(1000, 600, USMainLandBoundary, false);
            frontImage.CustomizeColor(0, 0, 0, 255, Color.GREEN, true);
            frontImage.Visualize(sparkContext, queryRDD);
            RasterOverlayOperator overlayOperator = new RasterOverlayOperator(visualizationOperator.rasterImage);
            overlayOperator.JoinImage(frontImage.rasterImage);
            ImageGenerator imageGenerator = new ImageGenerator();
            //imageGenerator.SaveRasterImageAsLocalFile(frontImage.rasterImage, outputPath, ImageType.PNG);
            imageGenerator.SaveRasterImageAsLocalFile(overlayOperator.backRasterImage, outputPath, ImageType.PNG);
            //imageGenerator.SaveRasterImageAsLocalFile(visualizationOperator.distributedRasterImage, outputPath, ImageType.PNG);
            //ImageStitcher.stitchImagePartitionsFromLocalFile(outputPath, 1000, 600, 0, 4, 4);
            System.out.println("Choropleth map generated in " + (System.currentTimeMillis() - start) + " ms");
        }
        catch (Exception e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

    /**
     * Parallel filter render, no stitch.
     *
     * @param outputPath the output path
     * @return true, if successful
     */
    public static boolean parallelFilterRenderNoStitch(String outputPath)
    {
        try {
            PointRDD spatialRDD = new PointRDD(sparkContext, PointInputLocation, PointOffset, PointSplitter, false, PointNumPartitions, StorageLevel.MEMORY_ONLY());
            HeatMap visualizationOperator = new HeatMap(1000, 600, USMainLandBoundary, false, 2, 4, 4, true, true);
            visualizationOperator.Visualize(sparkContext, spatialRDD);
            ImageGenerator imageGenerator = new ImageGenerator();
            imageGenerator.SaveRasterImageAsLocalFile(visualizationOperator.distributedRasterImage, outputPath, ImageType.PNG);
        }
        catch (Exception e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

    /**
     * Parallel filter render and stitch.
     *
     * @param outputPath the output path
     * @return true, if successful
     */
    public static boolean parallelFilterRenderStitch(String outputPath)
    {
        try {
            PointRDD spatialRDD = new PointRDD(sparkContext, PointInputLocation, PointOffset, PointSplitter, false, PointNumPartitions, StorageLevel.MEMORY_ONLY());
            HeatMap visualizationOperator = new HeatMap(1000, 600, USMainLandBoundary, false, 2, 4, 4, true, true);
            visualizationOperator.Visualize(sparkContext, spatialRDD);
            ImageGenerator imageGenerator = new ImageGenerator();
            imageGenerator.SaveRasterImageAsLocalFile(visualizationOperator.distributedRasterImage, outputPath, ImageType.PNG);
            ImageStitcher.stitchImagePartitionsFromLocalFile(outputPath, 1000, 600, 0, 4, 4);
        }
        catch (Exception e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

    /**
     * Earthdata visualization.
     *
     * @param outputPath the output path
     * @return true, if successful
     */
    public static boolean earthdataVisualization(String outputPath)
    {
        try {
            EarthdataHDFPointMapper earthdataHDFPoint = new EarthdataHDFPointMapper(HDFIncrement, HDFOffset, HDFRootGroupName,
                    HDFDataVariableList, HDFDataVariableName, HDFswitchXY, urlPrefix);
            PointRDD spatialRDD = new PointRDD(sparkContext, earthdataInputLocation, earthdataNumPartitions, earthdataHDFPoint, StorageLevel.MEMORY_ONLY());
            ScatterPlot visualizationOperator = new ScatterPlot(1000, 600, spatialRDD.boundaryEnvelope, ColorizeOption.EARTHOBSERVATION, false, false);
            visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.BLUE, true);
            visualizationOperator.Visualize(sparkContext, spatialRDD);
            ImageGenerator imageGenerator = new ImageGenerator();
            imageGenerator.SaveRasterImageAsLocalFile(visualizationOperator.rasterImage, outputPath, ImageType.PNG);
        } catch (Exception e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }

    /**
     * The main method.
     *
     * @param args the arguments
     * @throws IOException Signals that an I/O exception has occurred.
     */
    public static void main(String[] args) throws IOException {
        long start = System.currentTimeMillis();
        Logger.getLogger("org").setLevel(Level.WARN);
        Logger.getLogger("akka").setLevel(Level.WARN);
        SparkConf sparkConf = new SparkConf().setAppName("GeoSparkVizDemo").setMaster("local[*]")
                .set("spark.serializer", KryoSerializer.class.getName())
                .set("spark.kryo.registrator", GeoSparkVizKryoRegistrator.class.getName());
        sparkContext = new JavaSparkContext(sparkConf);
        prop = new Properties();
        String resourcePath = "src/test/resources/";
        String demoOutputPath = "target/demo";
        FileInputStream ConfFile = new FileInputStream(resourcePath + "babylon.point.properties");
        prop.load(ConfFile);
        String scatterPlotOutputPath = System.getProperty("user.dir") + "/" + demoOutputPath + "/scatterplot";
        String heatMapOutputPath = System.getProperty("user.dir") + "/" + demoOutputPath + "/heatmap";
        String choroplethMapOutputPath = System.getProperty("user.dir") + "/" + demoOutputPath + "/choroplethmap";
        String parallelFilterRenderStitchOutputPath = System.getProperty("user.dir") + "/" + demoOutputPath + "/parallelfilterrenderstitchheatmap";
        String earthdataScatterPlotOutputPath = System.getProperty("user.dir") + "/" + demoOutputPath + "/earthdatascatterplot";
        PointInputLocation = System.getProperty("user.dir") + "/" + resourcePath + prop.getProperty("inputLocation");
        PointOffset = Integer.parseInt(prop.getProperty("offset"));
        PointSplitter = FileDataSplitter.getFileDataSplitter(prop.getProperty("splitter"));
        PointNumPartitions = Integer.parseInt(prop.getProperty("numPartitions"));
        ConfFile = new FileInputStream(resourcePath + "babylon.rectangle.properties");
        prop.load(ConfFile);
        RectangleInputLocation = System.getProperty("user.dir") + "/" + resourcePath + prop.getProperty("inputLocation");
        RectangleOffset = Integer.parseInt(prop.getProperty("offset"));
        RectangleSplitter = FileDataSplitter.getFileDataSplitter(prop.getProperty("splitter"));
        RectangleNumPartitions = Integer.parseInt(prop.getProperty("numPartitions"));
        ConfFile = new FileInputStream(resourcePath + "babylon.polygon2.properties");
        prop.load(ConfFile);
        PolygonInputLocation = System.getProperty("user.dir") + "/" + resourcePath + prop.getProperty("inputLocation");
        PolygonOffset = Integer.parseInt(prop.getProperty("offset"));
        PolygonSplitter = FileDataSplitter.getFileDataSplitter(prop.getProperty("splitter"));
        PolygonNumPartitions = Integer.parseInt(prop.getProperty("numPartitions"));
        ConfFile = new FileInputStream(resourcePath + "babylon.linestring.properties");
        prop.load(ConfFile);
        LineStringInputLocation = System.getProperty("user.dir") + "/" + resourcePath + prop.getProperty("inputLocation");
        LineStringOffset = Integer.parseInt(prop.getProperty("offset"));
        LineStringSplitter = FileDataSplitter.getFileDataSplitter(prop.getProperty("splitter"));
        LineStringNumPartitions = Integer.parseInt(prop.getProperty("numPartitions"));
        USMainLandBoundary = new Envelope(-126.790180, -64.630926, 24.863836, 50.000);
        earthdataInputLocation = System.getProperty("user.dir") + "/src/test/resources/modis/modis.csv";
        earthdataNumPartitions = 5;
        HDFIncrement = 5;
        HDFOffset = 2;
        HDFRootGroupName = "MOD_Swath_LST";
        HDFDataVariableName = "LST";
        HDFswitchXY = true;
        urlPrefix = System.getProperty("user.dir") + "/src/test/resources/modis/";
        if (buildScatterPlot(scatterPlotOutputPath) && buildHeatMap(heatMapOutputPath)
                && buildChoroplethMap(choroplethMapOutputPath) && parallelFilterRenderStitch(parallelFilterRenderStitchOutputPath + "-stitched")
                && parallelFilterRenderNoStitch(parallelFilterRenderStitchOutputPath) && earthdataVisualization(earthdataScatterPlotOutputPath))
        {
            System.out.println("All visualizations finished in " + (System.currentTimeMillis() - start) + " ms");
            System.out.println("All GeoSparkViz Demos have passed.");
        }
        else
        {
            System.out.println("GeoSparkViz Demos failed.");
        }
        sparkContext.stop();
    }
}

Project visualizations at a glance



