Hikari Connection Pool

Official HikariCP documentation: https://github.com/brettwooldridge/HikariCP

Maven dependency

The 8.x MySQL driver is what's generally used alongside it (the full POM later pulls in mysql-connector-java 8.0.19).

Maven repository page (the dependency below pins 3.4.2):

https://mvnrepository.com/artifact/com.zaxxer/HikariCP/3.4.5

<dependency>
  <groupId>com.zaxxer</groupId>
  <artifactId>HikariCP</artifactId>
  <version>3.4.2</version>
</dependency>

Obtaining a connection with the official hard-coded configuration

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.junit.Test;

import java.sql.Connection;

public class HikariTest {

    @Test
    public void hikariTest() throws Exception {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai");
        config.setUsername("root");
        config.setPassword("123456");
        config.addDataSourceProperty("cachePrepStmts", "true");
        config.addDataSourceProperty("prepStmtCacheSize", "300");
        config.addDataSourceProperty("prepStmtCacheSqlLimit", "2048");

        HikariDataSource dataSource = new HikariDataSource(config);
        Connection connection = dataSource.getConnection();
        System.out.println(connection);
        connection.close();
    }
}

Because no SLF4J logging binding is configured, a load-failure warning is printed here.
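If you want that warning to go away and see HikariCP's own log output, one minimal option (my addition, not part of the original post) is to add an SLF4J binding such as slf4j-simple; the version below is only an example, and any binding (logback, log4j, ...) works just as well:

<!-- assumed example: any SLF4J binding silences the warning -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.30</version>
</dependency>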

The hard-coded configuration above also sets the pool's PreparedStatement-related options:

- cachePrepStmts — enable PreparedStatement caching

- prepStmtCacheSize — cache up to 300 statements per connection

- prepStmtCacheSqlLimit — the longest SQL statement (in characters) that will be cached is 2048

Hikari can also read its configuration from a properties file.

However, the driver class name described in the official docs is confusing; I fiddled with it for quite a while without success.

Then I found this blog post: http://zetcode.com/articles/hikaricp/

In short: for MySQL you don't need to configure a driver class at all — just set the JDBC URL.

hikari.properties

jdbcUrl = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
dataSource.user = root
dataSource.password = 123456
# enable PreparedStatement caching
dataSource.cachePrepStmts = true
# number of PreparedStatements cached per connection
dataSource.prepStmtCacheSize = 256
# longest SQL statement (in characters) that will be cached
dataSource.prepStmtCacheSqlLimit = 512

Connection test

    @Test
    public void hikariTest2() throws Exception {
        final String configureFile = "src/main/resources/hikari.properties";
        HikariConfig configure = new HikariConfig(configureFile);
        HikariDataSource dataSource = new HikariDataSource(configure);
        Connection connection = dataSource.getConnection();
        System.out.println(connection);
        connection.close();
    }
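One thing to be aware of (my note, not from the original post): the path above is resolved against the working directory. If you would rather load the file from the classpath, HikariConfig also has a constructor that takes a java.util.Properties object, so a sketch like the following should work as well (method name is hypothetical; assumes java.util.Properties and java.io.InputStream are imported):

    @Test
    public void hikariTest3() throws Exception {
        // load hikari.properties from the classpath rather than a file-system path
        Properties props = new Properties();
        try (InputStream in = getClass().getClassLoader().getResourceAsStream("hikari.properties")) {
            props.load(in);
        }
        // HikariConfig(Properties) applies the same keys as the file-based constructor
        HikariDataSource dataSource = new HikariDataSource(new HikariConfig(props));
        Connection connection = dataSource.getConnection();
        System.out.println(connection);
        connection.close();
        dataSource.close();
    }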

Wrapping it in a utility class

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

public class JdbcHikariUtil {

    private static final DataSource dataSource =
            new HikariDataSource(new HikariConfig("src/main/resources/hikari.properties"));

    public static Connection getConnection(){
        try {
            return dataSource.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }
}
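A quick usage sketch of my own (assumes the usual org.junit.Test and java.sql imports). Since the pool hands out proxy connections, calling close() simply returns the connection to the pool, so try-with-resources works nicely:

public class JdbcHikariUtilTest {

    @Test
    public void getConnectionTest() throws Exception {
        // close() on a pooled connection returns it to the pool instead of closing the physical socket
        try (Connection connection = JdbcHikariUtil.getConnection();
             PreparedStatement ps = connection.prepareStatement("SELECT 1");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}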

Now let's build a single utility class that wraps every connection pool plus plain JDBC.

First, the Maven dependencies:

<!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
<dependency>
  <groupId>mysql</groupId>
  <artifactId>mysql-connector-java</artifactId>
  <version>8.0.19</version>
</dependency>

<!-- https://mvnrepository.com/artifact/junit/junit -->
<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.13</version>
  <scope>test</scope>
</dependency>

<!-- https://mvnrepository.com/artifact/com.mchange/c3p0 -->
<dependency>
  <groupId>com.mchange</groupId>
  <artifactId>c3p0</artifactId>
  <version>0.9.5.5</version>
</dependency>

<!-- https://mvnrepository.com/artifact/commons-dbutils/commons-dbutils -->
<dependency>
  <groupId>commons-dbutils</groupId>
  <artifactId>commons-dbutils</artifactId>
  <version>1.7</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-dbcp2 -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-dbcp2</artifactId>
  <version>2.7.0</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-pool2 -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-pool2</artifactId>
  <version>2.8.0</version>
</dependency>

<!-- https://mvnrepository.com/artifact/com.alibaba/druid -->
<dependency>
  <groupId>com.alibaba</groupId>
  <artifactId>druid</artifactId>
  <version>1.1.22</version>
</dependency>

<dependency>
  <groupId>com.zaxxer</groupId>
  <artifactId>HikariCP</artifactId>
  <version>3.4.2</version>
</dependency>

Next, the configuration for each pool.

Plain JDBC — jdbc.properties

driverClass = com.mysql.cj.jdbc.Driver
url = jdbc:mysql://localhost:3306/jdbc_db?serverTimezone=Asia/Shanghai
user = root
password = 123456

C3P0 — c3p0-config.xml

<?xml version="1.0" encoding="UTF-8" ?>
<c3p0-config>
  <!-- custom named configuration -->
  <named-config name="c3p0_xml_config">
    <!-- the four basic connection settings -->
    <property name="driverClass">com.mysql.cj.jdbc.Driver</property>
    <!-- for a local server the host can be omitted: jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai -->
    <property name="jdbcUrl">jdbc:mysql://localhost:3306/jdbc_db?serverTimezone=Asia/Shanghai</property>
    <property name="user">root</property>
    <property name="password">123456</property>

    <!-- pool management settings -->
    <!-- how many connections to acquire at a time when the pool runs out -->
    <property name="acquireIncrement">5</property>
    <!-- number of connections created at startup -->
    <property name="initialPoolSize">10</property>
    <!-- minimum number of pooled connections -->
    <property name="minPoolSize">10</property>
    <!-- maximum number of pooled connections; the pool never grows past this -->
    <property name="maxPoolSize">100</property>
    <!-- maximum number of cached PreparedStatements pool-wide -->
    <property name="maxStatements">50</property>
    <!-- maximum number of cached PreparedStatements per connection -->
    <property name="maxStatementsPerConnection">2</property>
  </named-config>
</c3p0-config>

DBCP — dbcp.properties

driverClassName = com.mysql.cj.jdbc.Driver
url = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
username = root
password = 123456

Druid — druid.properties (the keys are basically the same as DBCP's, so the settings can be copied over as-is)

driverClassName = com.mysql.cj.jdbc.Driver
url = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
username = root
password = 123456

Hikari — hikari.properties

jdbcUrl = jdbc:mysql:///jdbc_db?serverTimezone=Asia/Shanghai
dataSource.user = root
dataSource.password = 123456
# enable PreparedStatement caching
dataSource.cachePrepStmts = true
# number of PreparedStatements cached per connection
dataSource.prepStmtCacheSize = 256
# longest SQL statement (in characters) that will be cached
dataSource.prepStmtCacheSqlLimit = 512

The complete utility class

package cn.dai.util;

import com.alibaba.druid.pool.DruidDataSourceFactory;
import com.mchange.v2.c3p0.ComboPooledDataSource;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.apache.commons.dbcp2.BasicDataSourceFactory;

import javax.sql.DataSource;
import java.io.InputStream;
import java.sql.*;
import java.util.Properties;

/**
 * @author ArkD42
 * @file Jdbc
 * @create 2020 - 04 - 24 - 22:04
 */
public class CompleteJdbcUtils {

    private CompleteJdbcUtils(){}

    //private static String driverClass;
    private static String url;
    private static String user;
    private static String password;

    private static DataSource dataSourceFromDBCP;
    private static DataSource dataSourceFromDruid;
    private static final DataSource dataSourceFromC3P0 =
            new ComboPooledDataSource("c3p0_xml_config");
    private static final DataSource dataSourceFromHikari =
            new HikariDataSource(new HikariConfig("src/main/resources/hikari.properties"));

    static {
        InputStream originalJdbcStream = CompleteJdbcUtils.class.getClassLoader().getResourceAsStream("jdbc.properties");
        Properties originalJdbcProperties = new Properties();

        InputStream dbcpStream = CompleteJdbcUtils.class.getClassLoader().getResourceAsStream("dbcp.properties");
        Properties dbcpProperties = new Properties();

        InputStream druidStream = CompleteJdbcUtils.class.getClassLoader().getResourceAsStream("druid.properties");
        Properties druidProperties = new Properties();

        try {
            originalJdbcProperties.load(originalJdbcStream);
            //driverClass = originalJdbcProperties.getProperty("driverClass");
            url = originalJdbcProperties.getProperty("url");
            user = originalJdbcProperties.getProperty("user");
            password = originalJdbcProperties.getProperty("password");
            //Class.forName(driverClass);

            dbcpProperties.load(dbcpStream);
            dataSourceFromDBCP = BasicDataSourceFactory.createDataSource(dbcpProperties);

            druidProperties.load(druidStream);
            dataSourceFromDruid = DruidDataSourceFactory.createDataSource(druidProperties);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // plain JDBC
    public static Connection getConnectionByOriginalJdbc(){
        try {
            return DriverManager.getConnection(url, user, password);
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // C3P0
    public static Connection getConnectionByC3P0(){
        try {
            return dataSourceFromC3P0.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // DBCP
    public static Connection getConnectionByDBCP(){
        try {
            return dataSourceFromDBCP.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // Druid
    public static Connection getConnectionByDruid(){
        try {
            return dataSourceFromDruid.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // Hikari
    public static Connection getConnectionByHikari(){
        try {
            return dataSourceFromHikari.getConnection();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // resource cleanup
    public static void releaseResource(Connection connection, PreparedStatement preparedStatement, ResultSet resultSet){
        try {
            if (resultSet != null) resultSet.close();
            if (preparedStatement != null) preparedStatement.close();
            if (connection != null) connection.close();
        } catch (SQLException sqlException){
            sqlException.printStackTrace();
        }
    }
}

Test

    @Test
    public void te5() throws SQLException {
        Connection connectionByOriginalJdbc = CompleteJdbcUtils.getConnectionByOriginalJdbc();
        Connection connectionByC3P0 = CompleteJdbcUtils.getConnectionByC3P0();
        Connection connectionByDBCP = CompleteJdbcUtils.getConnectionByDBCP();
        Connection connectionByDruid = CompleteJdbcUtils.getConnectionByDruid();
        Connection connectionByHikari = CompleteJdbcUtils.getConnectionByHikari();

        Connection[] connections = new Connection[]{
                connectionByOriginalJdbc,
                connectionByC3P0,
                connectionByDBCP,
                connectionByDruid,
                connectionByHikari
        };

        for (Connection connection : connections) {
            System.out.println(connection);
            connection.close();
        }
    }

Some common helper methods that go with the complete utility class:

- insert / update / delete — update

- single-result query — queryOne

- multi-result query — queryToList

- a wrapper for injecting statement parameters

    // obtain a PreparedStatement
    public static PreparedStatement getPreparedStatement(Connection connection, String sql){
        try {
            return connection.prepareStatement(sql);
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return null;
    }

    // parameter injection
    public static void argumentsInject(PreparedStatement preparedStatement, Object[] args){
        for (int i = 0; i < args.length; i++) {
            try {
                preparedStatement.setObject(i + 1, args[i]);
            } catch (SQLException sqlException) {
                sqlException.printStackTrace();
            }
        }
    }

    // convert a date string (e.g. "1987-09-01") to java.sql.Date
    public static java.sql.Date parseToSqlDate(String patternTime){
        SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyy-MM-dd");
        java.util.Date date = null;
        try {
            date = simpleDateFormat.parse(patternTime);
        } catch (ParseException e) {
            e.printStackTrace();
        }
        return new java.sql.Date(date.getTime());
    }

    // insert / update / delete
    public static int update(Connection connection, String sql, Object[] args) {
        PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
        if (args != null) argumentsInject(preparedStatement, args);
        try {
            return preparedStatement.executeUpdate();
        } catch (SQLException sqlException) {
            sqlException.printStackTrace();
        }
        return 0;
    }

    // query mapped onto a bean list via reflection
    public static <T> List<T> queryToList(Connection connection, Class<T> tClass, String sql, Object[] args){
        PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
        if (args != null) argumentsInject(preparedStatement, args);
        ResultSet resultSet = null;
        try {
            resultSet = preparedStatement.executeQuery();
            ResultSetMetaData metaData = resultSet.getMetaData();
            int columnCount = metaData.getColumnCount();
            List<T> tList = new ArrayList<T>();
            while (resultSet.next()){
                T t = tClass.newInstance();
                for (int i = 0; i < columnCount; i++) {
                    Object columnValue = resultSet.getObject(i + 1);
                    String columnLabel = metaData.getColumnLabel(i + 1);
                    Field field = tClass.getDeclaredField(columnLabel);
                    field.setAccessible(true);
                    field.set(t, columnValue);
                }
                tList.add(t);
            }
            return tList;
        } catch (Exception e){
            e.printStackTrace();
        } finally {
            releaseResource(connection, preparedStatement, resultSet);
        }
        return null;
    }

    // query mapped onto a List of Maps
    public static List<Map<String, Object>> queryToList(Connection connection, String sql, Object[] args){
        PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
        if (args != null) argumentsInject(preparedStatement, args);
        ResultSet resultSet = null;
        try {
            resultSet = preparedStatement.executeQuery();
            ResultSetMetaData metaData = resultSet.getMetaData();
            int columnCount = metaData.getColumnCount();
            List<Map<String, Object>> mapList = new ArrayList<Map<String, Object>>();
            while (resultSet.next()){
                Map<String, Object> row = new HashMap<String, Object>();
                for (int i = 0; i < columnCount; i++) {
                    String columnLabel = metaData.getColumnLabel(i + 1);
                    Object columnValue = resultSet.getObject(i + 1);
                    row.put(columnLabel, columnValue);
                }
                mapList.add(row);
            }
            return mapList;
        } catch (Exception e){
            e.printStackTrace();
        } finally {
            releaseResource(connection, preparedStatement, resultSet);
        }
        return null;
    }

    // single-row query mapped onto a bean via reflection
    public static <T> T queryOne(Connection connection, Class<T> tClass, String sql, Object[] args){
        PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
        if (args != null) argumentsInject(preparedStatement, args);
        ResultSet resultSet = null;
        try {
            resultSet = preparedStatement.executeQuery();
            ResultSetMetaData metaData = resultSet.getMetaData();
            int columnCount = metaData.getColumnCount();
            if (resultSet.next()){
                T t = tClass.newInstance();
                for (int i = 0; i < columnCount; i++) {
                    Object columnValue = resultSet.getObject(i + 1);
                    String columnLabel = metaData.getColumnLabel(i + 1);
                    Field field = tClass.getDeclaredField(columnLabel);
                    field.setAccessible(true);
                    field.set(t, columnValue);
                }
                return t;
            }
        } catch (Exception e){
            e.printStackTrace();
        } finally {
            releaseResource(connection, preparedStatement, resultSet);
        }
        return null;
    }

    // single-row query mapped onto a Map
    public static Map<String, Object> queryOne(Connection connection, String sql, Object[] args){
        PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
        if (args != null) argumentsInject(preparedStatement, args);
        ResultSet resultSet = null;
        try {
            resultSet = preparedStatement.executeQuery();
            ResultSetMetaData metaData = resultSet.getMetaData();
            int columnCount = metaData.getColumnCount();
            if (resultSet.next()){
                Map<String, Object> map = new HashMap<String, Object>();
                for (int i = 0; i < columnCount; i++) {
                    Object columnValue = resultSet.getObject(i + 1);
                    String columnLabel = metaData.getColumnLabel(i + 1);
                    map.put(columnLabel, columnValue);
                }
                return map;
            }
        } catch (Exception e){
            e.printStackTrace();
        } finally {
            releaseResource(connection, preparedStatement, resultSet);
        }
        return null;
    }

    // generic method for single scalar values (aggregate functions)
    public <E> E getValue(Connection connection, String sql, Object[] args){
        PreparedStatement preparedStatement = getPreparedStatement(connection, sql);
        if (args != null) argumentsInject(preparedStatement, args);
        ResultSet resultSet = null;
        try {
            resultSet = preparedStatement.executeQuery();
            if (resultSet.next()) return (E) resultSet.getObject(1);
        } catch (Exception e){
            e.printStackTrace();
        } finally {
            CompleteJdbcUtils.releaseResource(connection, preparedStatement, resultSet);
        }
        return null;
    }
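To make the intended usage concrete, here is a small sketch of my own (not from the original post). The customers table and Customer bean are hypothetical, and the bean's field names must match the column labels for the reflective mapping to work; note that the query helpers release the connection they are given in their finally block, so each call gets a fresh one:

    @Test
    public void helperUsageSketch() throws Exception {
        // insert / update / delete -- update() leaves the connection open
        Connection connection = CompleteJdbcUtils.getConnectionByHikari();
        int rows = CompleteJdbcUtils.update(
                connection,
                "UPDATE customers SET email = ? WHERE id = ?",
                new Object[]{"ark@example.com", 1});
        System.out.println("affected rows: " + rows);
        connection.close(); // return it to the pool

        // single-row query mapped onto the hypothetical Customer bean;
        // queryOne releases the connection itself
        Customer customer = CompleteJdbcUtils.queryOne(
                CompleteJdbcUtils.getConnectionByHikari(), Customer.class,
                "SELECT id, name, email FROM customers WHERE id = ?",
                new Object[]{1});
        System.out.println(customer);

        // multi-row query mapped onto a List of Maps
        List<Map<String, Object>> all = CompleteJdbcUtils.queryToList(
                CompleteJdbcUtils.getConnectionByHikari(),
                "SELECT id, name, email FROM customers", null);
        all.forEach(System.out::println);
    }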
