update: reformat code and unify spacing

> System demo: [link](https://gitee.com/JavaLionLi/RuoYi-Vue-Plus/wikis/系统演示?sort_id=4836388)

| Feature | Technology | Documentation | Notes |
|---|---|---|---|
| Current framework | RuoYi-Vue-Plus | [RuoYi-Vue-Plus docs](https://gitee.com/JavaLionLi/RuoYi-Vue-Plus/wikis/pages) | Full rewrite and all-round upgrade of RuoYi-Vue (not compatible with the original framework) |
| satoken branch | RuoYi-Vue-Plus-satoken | [satoken branch](https://gitee.com/JavaLionLi/RuoYi-Vue-Plus/tree/satoken/) | High readability and extensibility (recommended) |
| Monolith branch | RuoYi-Vue-Plus-fast | [fast branch](https://gitee.com/JavaLionLi/RuoYi-Vue-Plus/tree/fast/) | Single-application structure |
| Vue3 branch | RuoYi-Vue-Plus-UI | [UI repository](https://gitee.com/JavaLionLi/RuoYi-Vue-Plus-UI) | Components not yet complete; for learning only |
| Original framework | RuoYi-Vue | [RuoYi-Vue website](http://ruoyi.vip/) | Needed features are synced back periodically |
| Front-end framework | Vue, Element UI | [Element UI website](https://element.eleme.cn/#/zh-CN) | |
| Back-end framework | SpringBoot | [SpringBoot website](https://spring.io/projects/spring-boot/#learn) | |
| Web container | Undertow | [Undertow website](https://undertow.io/) | High-performance container based on XNIO |
| Authentication framework | Spring Security, Jwt | [SpringSecurity website](https://spring.io/projects/spring-security#learn) | Multi-terminal authentication |
| Authentication framework | Sa-Token, Jwt | [Sa-Token website](https://sa-token.dev33.cn/) | Strongly decoupled, highly extensible |
| Relational database | MySQL | [MySQL website](https://dev.mysql.com/) | Targets 8.X, minimum 5.7 |
| Cache database | Redis | [Redis website](https://redis.io/) | Targets 6.X, minimum 4.X |
| Database framework | Mybatis-Plus | [Mybatis-Plus docs](https://baomidou.com/guide/) | Fast CRUD, higher development efficiency |
| Database framework | p6spy | [p6spy website](https://p6spy.readthedocs.io/) | Stronger SQL analysis |
| Multi-datasource framework | dynamic-datasource | [dynamic-ds docs](https://www.kancloud.cn/tracy5546/dynamic-datasource/content) | Supports master/slave setups and heterogeneous databases |
| Serialization framework | Jackson | [Jackson website](https://github.com/FasterXML/jackson) | Jackson used throughout, efficient and reliable |
| Redis client | Redisson | [Redisson docs](https://github.com/redisson/redisson/wiki/%E7%9B%AE%E5%BD%95) | Supports standalone and cluster configurations |
| Distributed rate limiting | Redisson | [Redisson docs](https://github.com/redisson/redisson/wiki/%E7%9B%AE%E5%BD%95) | Global, per-request-IP, and per-cluster-ID limits |
| Distributed queues | Redisson | [Redisson docs](https://github.com/redisson/redisson/wiki/%E7%9B%AE%E5%BD%95) | Plain, delayed, priority queues, etc. |
| Distributed locks | Lock4j | [Lock4j website](https://gitee.com/baomidou/lock4j) | Annotation locks, utility locks, and more |
| Distributed idempotency | Redisson | [Lock4j docs](https://gitee.com/baomidou/lock4j) | Blocks duplicate submissions |
| Distributed logging | TLog | [TLog docs](https://yomahub.com/tlog/docs) | Trace-linked logging, performance analysis, troubleshooting |
| Distributed task scheduling | Xxl-Job | [Xxl-Job website](https://www.xuxueli.com/xxl-job/) | High performance, high reliability, easy to extend |
| File storage | Minio | [Minio docs](https://docs.min.io/) | Local storage |
| File storage | Qiniu, Aliyun, Tencent | [OSS usage docs](https://gitee.com/JavaLionLi/RuoYi-Vue-Plus/wikis/pages?sort_id=4359146&doc_id=1469725) | Cloud storage |
| Monitoring framework | SpringBoot-Admin | [SpringBoot-Admin docs](https://codecentric.github.io/spring-boot-admin/current/) | Comprehensive service monitoring |
| Validation framework | Validation | [Validation docs](https://docs.jboss.org/hibernate/stable/validator/reference/en-US/html_single/) | Stronger, safer APIs; i18n support |
| Excel framework | Alibaba EasyExcel | [EasyExcel docs](https://www.yuque.com/easyexcel/doc/easyexcel) | Excellent performance, highly extensible |
| API documentation framework | Knife4j | [Knife4j docs](https://doc.xiaominfo.com/knife4j/documentation/) | Polished API documentation |
| Utility libraries | Hutool, Lombok | [Hutool docs](https://www.hutool.cn/docs/) | Less boilerplate, more safety |
| Code generator | MP- and Knife4j-compliant code | [Hutool docs](https://www.hutool.cn/docs/) | One-click generation of front-end and back-end code |
| Deployment | Docker | [Docker docs](https://docs.docker.com/) | Container orchestration, one-click deployment of business clusters |
| Internationalization | SpringMessage | [SpringMVC docs](https://docs.spring.io/spring-framework/docs/current/reference/html/web.html#mvc) | Spring's standard i18n approach |
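
The distributed rate limiting listed above is built on Redisson. As a rough, standalone sketch (this is not the framework's own annotation or wrapper, and the limiter key below is made up), a plain Redisson `RRateLimiter` enforces a cluster-wide budget like this:

```java
import org.redisson.Redisson;
import org.redisson.api.RRateLimiter;
import org.redisson.api.RateIntervalUnit;
import org.redisson.api.RateType;
import org.redisson.api.RedissonClient;

public class RateLimitSketch {
    public static void main(String[] args) {
        // default config assumes a local Redis at 127.0.0.1:6379
        RedissonClient redisson = Redisson.create();
        // "demo:rate" is a hypothetical key, not one the framework defines
        RRateLimiter limiter = redisson.getRateLimiter("demo:rate");
        // at most 10 permits per second, shared by every node (RateType.OVERALL)
        limiter.trySetRate(RateType.OVERALL, 10, 1, RateIntervalUnit.SECONDS);
        if (limiter.tryAcquire()) {
            System.out.println("request allowed");
        } else {
            System.out.println("request rejected by the shared limiter");
        }
        redisson.shutdown();
    }
}
```

Because the limiter state lives in Redis, every application node that acquires against the same key shares the same budget, which is what makes the limit "distributed".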

## Reference Documentation

## Demo Screenshots

<table border="1" cellpadding="1" cellspacing="1" style="width:500px">
	<tbody>
		<tr>
			<td><img src="https://oscimg.oschina.net/oscnet/up-972235bcbe3518dedd351ff0e2ee7d1031c.png" width="1920" /></td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-5e0097702fa91e2e36391de8127676a7fa1.png" width="1920" /></td>
		</tr>
		<tr>
			<td>
			<p><img src="https://oscimg.oschina.net/oscnet/up-e56e3828f48cd9886d88731766f06d5f3c1.png" width="1920" /></p>
			</td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-0715990ea1a9f254ec2138fcd063c1f556a.png" width="1920" /></td>
		</tr>
		<tr>
			<td><img src="https://oscimg.oschina.net/oscnet/up-eaf5417ccf921bb64abb959e3d8e290467f.png" width="1920" /></td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-fc285cf33095ebf8318de6999af0f473861.png" width="1920" /></td>
		</tr>
		<tr>
			<td><img src="https://oscimg.oschina.net/oscnet/up-60c83fd8bd61c29df6dbf47c88355e9c272.png" width="1920" /></td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-7f731948c8b73c7d90f67f9e1c7a534d5c3.png" width="1920" /></td>
		</tr>
		<tr>
			<td><img src="https://oscimg.oschina.net/oscnet/up-e4de89b5e2d20c52d3c3a47f9eb88eb8526.png" width="1920" /></td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-8791d823a508eb90e67c604f36f57491a67.png" width="1920" /></td>
		</tr>
		<tr>
			<td><img src="https://oscimg.oschina.net/oscnet/up-4589afd99982ead331785299b894174feb6.png" width="1920" /></td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-8ea177cdacaea20995daf2f596b15232561.png" width="1920" /></td>
		</tr>
		<tr>
			<td><img src="https://oscimg.oschina.net/oscnet/up-32d1d04c55c11f74c9129fbbc58399728c4.png" width="1920" /></td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-04fa118f7631b7ae6fd72299ca0a1430a63.png" width="1920" /></td>
		</tr>
		<tr>
			<td><img src="https://oscimg.oschina.net/oscnet/up-fe7e85b65827802bfaadf3acd42568b58c7.png" width="1920" /></td>
			<td><img src="https://oscimg.oschina.net/oscnet/up-eff2b02a54f8188022d8498cfe6af6fcc06.png" width="1920" /></td>
		</tr>
	</tbody>
</table>

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.ruoyi</groupId>
    <artifactId>ruoyi-vue-plus</artifactId>

import com.ruoyi.common.core.domain.AjaxResult;
import com.ruoyi.common.core.domain.entity.SysUser;
import com.ruoyi.common.core.domain.model.LoginUser;
import com.ruoyi.common.core.service.TokenService;
import com.ruoyi.common.enums.BusinessType;
import com.ruoyi.common.utils.SecurityUtils;
import com.ruoyi.common.utils.StringUtils;

public class SysProfileController extends BaseController {

    private final ISysUserService userService;
    private final TokenService tokenService;
    private final ISysOssService iSysOssService;

    /**

              value="%red(%d{yyyy-MM-dd HH:mm:ss}) %green([%thread]) %highlight(%-5level) %boldMagenta(%logger{36}%n) - %msg%n"/>
    <property name="log.pattern" value="%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n"/>

    <!-- Console output -->
    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="com.yomahub.tlog.core.enhance.logback.AspectLogbackEncoder">
            <pattern>${console.log.pattern}</pattern>
            <charset>utf-8</charset>
        </encoder>
    </appender>

    <!-- Console output -->
    <appender name="file_console" class="ch.qos.logback.core.rolling.RollingFileAppender">

        </filter>
    </appender>

    <!-- System log output -->
    <appender name="file_info" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${log.path}/sys-info.log</file>
        <!-- Rolling policy: create log files based on time -->
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file name pattern -->
            <fileNamePattern>${log.path}/sys-info.%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep at most 60 days of history -->
            <maxHistory>60</maxHistory>
        </rollingPolicy>
        <encoder class="com.yomahub.tlog.core.enhance.logback.AspectLogbackEncoder">
            <pattern>${log.pattern}</pattern>
        </encoder>
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <!-- Level to filter on -->
            <level>INFO</level>
            <!-- On match: accept (log it) -->
            <onMatch>ACCEPT</onMatch>
            <!-- On mismatch: deny (do not log) -->
            <onMismatch>DENY</onMismatch>
        </filter>
    </appender>

    <appender name="file_error" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${log.path}/sys-error.log</file>
        <!-- Rolling policy: create log files based on time -->
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file name pattern -->
            <fileNamePattern>${log.path}/sys-error.%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep at most 60 days of history -->
            <maxHistory>60</maxHistory>
        </rollingPolicy>
        <encoder class="com.yomahub.tlog.core.enhance.logback.AspectLogbackEncoder">
            <pattern>${log.pattern}</pattern>
        </encoder>
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <!-- Level to filter on -->
            <level>ERROR</level>
            <!-- On match: accept (log it) -->
            <onMatch>ACCEPT</onMatch>
            <!-- On mismatch: deny (do not log) -->
            <onMismatch>DENY</onMismatch>
        </filter>
    </appender>

    <!-- Log level for system modules -->
    <logger name="com.ruoyi" level="info" />
    <!-- Log level for Spring -->
    <logger name="org.springframework" level="warn" />

    <root level="info">
        <appender-ref ref="console" />
    </root>

    <!-- System operation logs -->
    <root level="info">
        <appender-ref ref="file_info" />
        <appender-ref ref="file_error" />

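With the appenders above, anything logged through SLF4J under `com.ruoyi` goes to the console appender, while `file_info` and `file_error` split records into `sys-info.log` and `sys-error.log` by level (assuming those appenders are attached to a logger or root, as at the end of the config). A minimal Lombok-based sketch; the class below is illustrative and not part of the project:

```java
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Service;

@Slf4j
@Service
public class LoggingDemoService {

    public void doWork() {
        // matched by the INFO LevelFilter -> sys-info.log (and the console)
        log.info("business step finished");
        // matched by the ERROR LevelFilter -> sys-error.log
        log.error("business step failed", new IllegalStateException("demo"));
    }
}
```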
     */
    @ApiOperation(value = "新增批量方法")
    @PostMapping("/add")
    // @DS("slave")
    public AjaxResult<Void> add() {
        List<TestDemo> list = new ArrayList<>();
        for (int i = 0; i < 1000; i++) {

     */
    @ApiOperation(value = "新增或更新批量方法")
    @PostMapping("/addOrUpdate")
    // @DS("slave")
    public AjaxResult<Void> addOrUpdate() {
        List<TestDemo> list = new ArrayList<>();
        for (int i = 0; i < 1000; i++) {

     */
    @ApiOperation(value = "删除批量方法")
    @DeleteMapping()
    // @DS("slave")
    public AjaxResult<Void> remove() {
        return toAjax(testDemoMapper.delete(new LambdaQueryWrapper<TestDemo>()
            .eq(TestDemo::getOrderNum, -1L)));

        List<TestDemoVo> list = iTestDemoService.queryList(bo);
        // test exporting snowflake ids
        // for (TestDemoVo vo : list) {
        //     vo.setId(1234567891234567893L);
        // }
        ExcelUtil.exportExcel(list, "测试单表", TestDemoVo.class, response);
    }

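The export above goes through the project's `ExcelUtil` wrapper; the underlying engine is Alibaba EasyExcel from the technology table. For comparison, a bare EasyExcel write to the same servlet response might look like the following sketch (the helper class and method are hypothetical; only `TestDemoVo` and the sheet name come from the code above):

```java
import com.alibaba.excel.EasyExcel;

import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.List;

public class EasyExcelSketch {

    public static void export(HttpServletResponse response, List<TestDemoVo> list) throws IOException {
        response.setContentType("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
        response.setCharacterEncoding("utf-8");
        // write the whole list as one sheet, columns derived from TestDemoVo fields/annotations
        EasyExcel.write(response.getOutputStream(), TestDemoVo.class)
            .sheet("测试单表")
            .doWrite(list);
    }
}
```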
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <artifactId>ruoyi-extend</artifactId>
        <groupId>com.ruoyi</groupId>
        <version>3.5.0</version>
    </parent>
    <artifactId>ruoyi-xxl-job-admin</artifactId>
    <packaging>jar</packaging>

    <properties>
        <mybatis-spring-boot-starter.version>2.1.4</mybatis-spring-boot-starter.version>
        <mysql-connector-java.version>8.0.23</mysql-connector-java.version>
    </properties>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-parent</artifactId>
                <version>${spring-boot.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>

        <!-- starter-web:spring-webmvc + autoconfigure + logback + yaml + tomcat -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <!-- starter-test:junit + spring-test + mockito -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

        <!-- freemarker-starter -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-freemarker</artifactId>
        </dependency>

        <!-- mail-starter -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-mail</artifactId>
        </dependency>

        <!-- starter-actuator -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-actuator</artifactId>
        </dependency>

        <!-- mybatis-starter:mybatis + mybatis-spring + hikari(default) -->
        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter</artifactId>
            <version>${mybatis-spring-boot-starter.version}</version>
        </dependency>
        <!-- mysql -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>${mysql-connector-java.version}</version>
        </dependency>

        <dependency>
            <groupId>de.codecentric</groupId>
            <artifactId>spring-boot-admin-starter-client</artifactId>
        </dependency>

        <!-- xxl-job-core -->
        <dependency>
            <groupId>com.xuxueli</groupId>
            <artifactId>xxl-job-core</artifactId>
        </dependency>

    </dependencies>

    <build>
        <finalName>${project.artifactId}</finalName>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <version>${spring-boot.version}</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>repackage</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <!-- docker -->
            <plugin>
                <groupId>com.spotify</groupId>
                <artifactId>docker-maven-plugin</artifactId>
                <version>${docker.plugin.version}</version>
                <configuration>
                    <!-- made of '[a-z0-9-_.]' -->
                    <imageName>${docker.namespace}/${project.artifactId}:${project.version}</imageName>
                    <dockerDirectory>${project.basedir}</dockerDirectory>
                    <dockerHost>${docker.registry.host}</dockerHost>
                    <registryUrl>${docker.registry.url}</registryUrl>
                    <serverId>${docker.registry.url}</serverId>
                    <resources>
                        <resource>
                            <targetPath>/</targetPath>
                            <directory>${project.build.directory}</directory>
                            <include>${project.build.finalName}.jar</include>
                        </resource>
                    </resources>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class XxlJobAdminApplication {

    public static void main(String[] args) {
        SpringApplication.run(XxlJobAdminApplication.class, args);
    }

}


/**
 * index controller
 *
 * @author xuxueli 2015-12-19 16:13:16
 */
@Controller
public class IndexController {

    @Resource
    private XxlJobService xxlJobService;
    @Resource
    private LoginService loginService;


    @RequestMapping("/")
    public String index(Model model) {

        Map<String, Object> dashboardMap = xxlJobService.dashboardInfo();
        model.addAllAttributes(dashboardMap);

        return "index";
    }

    @RequestMapping("/chartInfo")
    @ResponseBody
    public ReturnT<Map<String, Object>> chartInfo(Date startDate, Date endDate) {
        ReturnT<Map<String, Object>> chartInfo = xxlJobService.chartInfo(startDate, endDate);
        return chartInfo;
    }

    @RequestMapping("/toLogin")
    @PermissionLimit(limit = false)
    public ModelAndView toLogin(HttpServletRequest request, HttpServletResponse response, ModelAndView modelAndView) {
        if (loginService.ifLogin(request, response) != null) {
            modelAndView.setView(new RedirectView("/", true, false));
            return modelAndView;
        }
        return new ModelAndView("login");
    }

    @RequestMapping(value = "login", method = RequestMethod.POST)
    @ResponseBody
    @PermissionLimit(limit = false)
    public ReturnT<String> loginDo(HttpServletRequest request, HttpServletResponse response, String userName, String password, String ifRemember) {
        boolean ifRem = (ifRemember != null && ifRemember.trim().length() > 0 && "on".equals(ifRemember)) ? true : false;
        return loginService.login(request, response, userName, password, ifRem);
    }

    @RequestMapping(value = "logout", method = RequestMethod.POST)
    @ResponseBody
    @PermissionLimit(limit = false)
    public ReturnT<String> logout(HttpServletRequest request, HttpServletResponse response) {
        return loginService.logout(request, response);
    }

    @RequestMapping("/help")
    public String help() {

        /*if (!PermissionInterceptor.ifLogin(request)) {
            return "redirect:/toLogin";
        }*/

        return "help";
    }

    @InitBinder
    public void initBinder(WebDataBinder binder) {
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        dateFormat.setLenient(false);
        binder.registerCustomEditor(Date.class, new CustomDateEditor(dateFormat, true));
    }

}


/**
 * job code controller
 *
 * @author xuxueli 2015-12-19 16:13:16
 */
@Controller
@RequestMapping("/jobcode")
public class JobCodeController {

    @Resource
    private XxlJobInfoDao xxlJobInfoDao;
    @Resource
    private XxlJobLogGlueDao xxlJobLogGlueDao;

    @RequestMapping
    public String index(HttpServletRequest request, Model model, int jobId) {
        XxlJobInfo jobInfo = xxlJobInfoDao.loadById(jobId);
        List<XxlJobLogGlue> jobLogGlues = xxlJobLogGlueDao.findByJobId(jobId);

        if (jobInfo == null) {
            throw new RuntimeException(I18nUtil.getString("jobinfo_glue_jobid_unvalid"));
        }
        if (GlueTypeEnum.BEAN == GlueTypeEnum.match(jobInfo.getGlueType())) {
            throw new RuntimeException(I18nUtil.getString("jobinfo_glue_gluetype_unvalid"));
        }

        // valid permission
        JobInfoController.validPermission(request, jobInfo.getJobGroup());

        // glue type dictionary
        model.addAttribute("GlueTypeEnum", GlueTypeEnum.values());

        model.addAttribute("jobInfo", jobInfo);
        model.addAttribute("jobLogGlues", jobLogGlues);
        return "jobcode/jobcode.index";
    }

    @RequestMapping("/save")
    @ResponseBody
    public ReturnT<String> save(Model model, int id, String glueSource, String glueRemark) {
        // valid
        if (glueRemark == null) {
            return new ReturnT<String>(500, (I18nUtil.getString("system_please_input") + I18nUtil.getString("jobinfo_glue_remark")));
        }
        if (glueRemark.length() < 4 || glueRemark.length() > 100) {
            return new ReturnT<String>(500, I18nUtil.getString("jobinfo_glue_remark_limit"));
        }
        XxlJobInfo exists_jobInfo = xxlJobInfoDao.loadById(id);
        if (exists_jobInfo == null) {
            return new ReturnT<String>(500, I18nUtil.getString("jobinfo_glue_jobid_unvalid"));
        }

        // update new code
        exists_jobInfo.setGlueSource(glueSource);
        exists_jobInfo.setGlueRemark(glueRemark);
        exists_jobInfo.setGlueUpdatetime(new Date());

        exists_jobInfo.setUpdateTime(new Date());
        xxlJobInfoDao.update(exists_jobInfo);

        // log old code
        XxlJobLogGlue xxlJobLogGlue = new XxlJobLogGlue();
        xxlJobLogGlue.setJobId(exists_jobInfo.getId());
        xxlJobLogGlue.setGlueType(exists_jobInfo.getGlueType());
        xxlJobLogGlue.setGlueSource(glueSource);
        xxlJobLogGlue.setGlueRemark(glueRemark);

        xxlJobLogGlue.setAddTime(new Date());
        xxlJobLogGlue.setUpdateTime(new Date());
        xxlJobLogGlueDao.save(xxlJobLogGlue);

        // remove code backup more than 30
        xxlJobLogGlueDao.removeOld(exists_jobInfo.getId(), 30);

        return ReturnT.SUCCESS;
    }

}


/**
 * job group controller
 *
 * @author xuxueli 2016-10-02 20:52:56
 */
@Controller
@RequestMapping("/jobgroup")
public class JobGroupController {

    @Resource
    public XxlJobInfoDao xxlJobInfoDao;
    @Resource
    public XxlJobGroupDao xxlJobGroupDao;
    @Resource
    private XxlJobRegistryDao xxlJobRegistryDao;

    @RequestMapping
    public String index(Model model) {
        return "jobgroup/jobgroup.index";
    }

    @RequestMapping("/pageList")
    @ResponseBody
    public Map<String, Object> pageList(HttpServletRequest request,
                                        @RequestParam(required = false, defaultValue = "0") int start,
                                        @RequestParam(required = false, defaultValue = "10") int length,
                                        String appname, String title) {

        // page query
        List<XxlJobGroup> list = xxlJobGroupDao.pageList(start, length, appname, title);
        int list_count = xxlJobGroupDao.pageListCount(start, length, appname, title);

        // package result
        Map<String, Object> maps = new HashMap<String, Object>();
        maps.put("recordsTotal", list_count);       // total record count
        maps.put("recordsFiltered", list_count);    // filtered record count
        maps.put("data", list);                     // paged list
        return maps;
    }

    @RequestMapping("/save")
    @ResponseBody
    public ReturnT<String> save(XxlJobGroup xxlJobGroup) {

        // valid
        if (xxlJobGroup.getAppname() == null || xxlJobGroup.getAppname().trim().length() == 0) {
            return new ReturnT<String>(500, (I18nUtil.getString("system_please_input") + "AppName"));
        }
        if (xxlJobGroup.getAppname().length() < 4 || xxlJobGroup.getAppname().length() > 64) {
            return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_appname_length"));
        }
        if (xxlJobGroup.getAppname().contains(">") || xxlJobGroup.getAppname().contains("<")) {
            return new ReturnT<String>(500, "AppName" + I18nUtil.getString("system_unvalid"));
        }
        if (xxlJobGroup.getTitle() == null || xxlJobGroup.getTitle().trim().length() == 0) {
            return new ReturnT<String>(500, (I18nUtil.getString("system_please_input") + I18nUtil.getString("jobgroup_field_title")));
        }
        if (xxlJobGroup.getTitle().contains(">") || xxlJobGroup.getTitle().contains("<")) {
            return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_title") + I18nUtil.getString("system_unvalid"));
        }
        if (xxlJobGroup.getAddressType() != 0) {
            if (xxlJobGroup.getAddressList() == null || xxlJobGroup.getAddressList().trim().length() == 0) {
                return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_addressType_limit"));
            }
            if (xxlJobGroup.getAddressList().contains(">") || xxlJobGroup.getAddressList().contains("<")) {
                return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_registryList") + I18nUtil.getString("system_unvalid"));
            }

            String[] addresss = xxlJobGroup.getAddressList().split(",");
            for (String item : addresss) {
                if (item == null || item.trim().length() == 0) {
                    return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_registryList_unvalid"));
                }
            }
        }

        // process
        xxlJobGroup.setUpdateTime(new Date());

        int ret = xxlJobGroupDao.save(xxlJobGroup);
        return (ret > 0) ? ReturnT.SUCCESS : ReturnT.FAIL;
    }

    @RequestMapping("/update")
    @ResponseBody
    public ReturnT<String> update(XxlJobGroup xxlJobGroup) {
        // valid
        if (xxlJobGroup.getAppname() == null || xxlJobGroup.getAppname().trim().length() == 0) {
            return new ReturnT<String>(500, (I18nUtil.getString("system_please_input") + "AppName"));
        }
        if (xxlJobGroup.getAppname().length() < 4 || xxlJobGroup.getAppname().length() > 64) {
            return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_appname_length"));
        }
        if (xxlJobGroup.getTitle() == null || xxlJobGroup.getTitle().trim().length() == 0) {
            return new ReturnT<String>(500, (I18nUtil.getString("system_please_input") + I18nUtil.getString("jobgroup_field_title")));
        }
        if (xxlJobGroup.getAddressType() == 0) {
            // 0 = auto registration
            List<String> registryList = findRegistryByAppName(xxlJobGroup.getAppname());
            String addressListStr = null;
            if (registryList != null && !registryList.isEmpty()) {
                Collections.sort(registryList);
                addressListStr = "";
                for (String item : registryList) {
                    addressListStr += item + ",";
                }
                addressListStr = addressListStr.substring(0, addressListStr.length() - 1);
            }
            xxlJobGroup.setAddressList(addressListStr);
        } else {
            // 1 = manual entry
            if (xxlJobGroup.getAddressList() == null || xxlJobGroup.getAddressList().trim().length() == 0) {
                return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_addressType_limit"));
            }
            String[] addresss = xxlJobGroup.getAddressList().split(",");
            for (String item : addresss) {
                if (item == null || item.trim().length() == 0) {
                    return new ReturnT<String>(500, I18nUtil.getString("jobgroup_field_registryList_unvalid"));
                }
            }
        }

        // process
        xxlJobGroup.setUpdateTime(new Date());

        int ret = xxlJobGroupDao.update(xxlJobGroup);
        return (ret > 0) ? ReturnT.SUCCESS : ReturnT.FAIL;
    }

    private List<String> findRegistryByAppName(String appnameParam) {
        HashMap<String, List<String>> appAddressMap = new HashMap<String, List<String>>();
        List<XxlJobRegistry> list = xxlJobRegistryDao.findAll(RegistryConfig.DEAD_TIMEOUT, new Date());
        if (list != null) {
            for (XxlJobRegistry item : list) {
                if (RegistryConfig.RegistType.EXECUTOR.name().equals(item.getRegistryGroup())) {
                    String appname = item.getRegistryKey();
                    List<String> registryList = appAddressMap.get(appname);
                    if (registryList == null) {
                        registryList = new ArrayList<String>();
                    }

                    if (!registryList.contains(item.getRegistryValue())) {
                        registryList.add(item.getRegistryValue());
                    }
                    appAddressMap.put(appname, registryList);
                }
            }
        }
        return appAddressMap.get(appnameParam);
    }

    @RequestMapping("/remove")
    @ResponseBody
    public ReturnT<String> remove(int id) {

        // valid
        int count = xxlJobInfoDao.pageListCount(0, 10, id, -1, null, null, null);
        if (count > 0) {
            return new ReturnT<String>(500, I18nUtil.getString("jobgroup_del_limit_0"));
        }

        List<XxlJobGroup> allList = xxlJobGroupDao.findAll();
        if (allList.size() == 1) {
            return new ReturnT<String>(500, I18nUtil.getString("jobgroup_del_limit_1"));
        }

        int ret = xxlJobGroupDao.remove(id);
        return (ret > 0) ? ReturnT.SUCCESS : ReturnT.FAIL;
    }

    @RequestMapping("/loadById")
    @ResponseBody
    public ReturnT<XxlJobGroup> loadById(int id) {
        XxlJobGroup jobGroup = xxlJobGroupDao.load(id);
        return jobGroup != null ? new ReturnT<XxlJobGroup>(jobGroup) : new ReturnT<XxlJobGroup>(ReturnT.FAIL_CODE, null);
    }

}

| | | package com.xxl.job.admin.controller; |
| | | |
| | | import com.xxl.job.admin.core.cron.CronExpression; |
| | | import com.xxl.job.admin.core.exception.XxlJobException; |
| | | import com.xxl.job.admin.core.model.XxlJobGroup; |
| | | import com.xxl.job.admin.core.model.XxlJobInfo; |
| | |
| | | |
| | | import javax.annotation.Resource; |
| | | import javax.servlet.http.HttpServletRequest; |
| | | import java.text.ParseException; |
| | | import java.util.*; |
| | | |
| | | /** |
| | | * index controller |
| | | * |
| | | * @author xuxueli 2015-12-19 16:13:16 |
| | | */ |
| | | @Controller |
| | | @RequestMapping("/jobinfo") |
| | | public class JobInfoController { |
| | | private static Logger logger = LoggerFactory.getLogger(JobInfoController.class); |
| | | private static Logger logger = LoggerFactory.getLogger(JobInfoController.class); |
| | | |
| | | @Resource |
| | | private XxlJobGroupDao xxlJobGroupDao; |
| | | @Resource |
| | | private XxlJobService xxlJobService; |
| | | |
| | | @RequestMapping |
| | | public String index(HttpServletRequest request, Model model, @RequestParam(required = false, defaultValue = "-1") int jobGroup) { |
| | | @Resource |
| | | private XxlJobGroupDao xxlJobGroupDao; |
| | | @Resource |
| | | private XxlJobService xxlJobService; |
| | | |
| | | // 枚举-字典 |
| | | model.addAttribute("ExecutorRouteStrategyEnum", ExecutorRouteStrategyEnum.values()); // 路由策略-列表 |
| | | model.addAttribute("GlueTypeEnum", GlueTypeEnum.values()); // Glue类型-字典 |
| | | model.addAttribute("ExecutorBlockStrategyEnum", ExecutorBlockStrategyEnum.values()); // 阻塞处理策略-字典 |
| | | model.addAttribute("ScheduleTypeEnum", ScheduleTypeEnum.values()); // 调度类型 |
| | | model.addAttribute("MisfireStrategyEnum", MisfireStrategyEnum.values()); // 调度过期策略 |
| | | @RequestMapping |
| | | public String index(HttpServletRequest request, Model model, @RequestParam(required = false, defaultValue = "-1") int jobGroup) { |
| | | |
| | | // 执行器列表 |
| | | List<XxlJobGroup> jobGroupList_all = xxlJobGroupDao.findAll(); |
| | | // 枚举-字典 |
| | | model.addAttribute("ExecutorRouteStrategyEnum" , ExecutorRouteStrategyEnum.values()); // 路由策略-列表 |
| | | model.addAttribute("GlueTypeEnum" , GlueTypeEnum.values()); // Glue类型-字典 |
| | | model.addAttribute("ExecutorBlockStrategyEnum" , ExecutorBlockStrategyEnum.values()); // 阻塞处理策略-字典 |
| | | model.addAttribute("ScheduleTypeEnum" , ScheduleTypeEnum.values()); // 调度类型 |
| | | model.addAttribute("MisfireStrategyEnum" , MisfireStrategyEnum.values()); // 调度过期策略 |
| | | |
| | | // filter group |
| | | List<XxlJobGroup> jobGroupList = filterJobGroupByRole(request, jobGroupList_all); |
| | | if (jobGroupList==null || jobGroupList.size()==0) { |
| | | throw new XxlJobException(I18nUtil.getString("jobgroup_empty")); |
| | | } |
| | | // 执行器列表 |
| | | List<XxlJobGroup> jobGroupList_all = xxlJobGroupDao.findAll(); |
| | | |
| | | model.addAttribute("JobGroupList", jobGroupList); |
| | | model.addAttribute("jobGroup", jobGroup); |
| | | // filter group |
| | | List<XxlJobGroup> jobGroupList = filterJobGroupByRole(request, jobGroupList_all); |
| | | if (jobGroupList == null || jobGroupList.size() == 0) { |
| | | throw new XxlJobException(I18nUtil.getString("jobgroup_empty")); |
| | | } |
| | | |
| | | return "jobinfo/jobinfo.index"; |
| | | } |
| | | model.addAttribute("JobGroupList" , jobGroupList); |
| | | model.addAttribute("jobGroup" , jobGroup); |
| | | |
| | | public static List<XxlJobGroup> filterJobGroupByRole(HttpServletRequest request, List<XxlJobGroup> jobGroupList_all){ |
| | | List<XxlJobGroup> jobGroupList = new ArrayList<>(); |
| | | if (jobGroupList_all!=null && jobGroupList_all.size()>0) { |
| | | XxlJobUser loginUser = (XxlJobUser) request.getAttribute(LoginService.LOGIN_IDENTITY_KEY); |
| | | if (loginUser.getRole() == 1) { |
| | | jobGroupList = jobGroupList_all; |
| | | } else { |
| | | List<String> groupIdStrs = new ArrayList<>(); |
| | | if (loginUser.getPermission()!=null && loginUser.getPermission().trim().length()>0) { |
| | | groupIdStrs = Arrays.asList(loginUser.getPermission().trim().split(",")); |
| | | } |
| | | for (XxlJobGroup groupItem:jobGroupList_all) { |
| | | if (groupIdStrs.contains(String.valueOf(groupItem.getId()))) { |
| | | jobGroupList.add(groupItem); |
| | | } |
| | | } |
| | | } |
| | | } |
| | | return jobGroupList; |
| | | } |
| | | public static void validPermission(HttpServletRequest request, int jobGroup) { |
| | | XxlJobUser loginUser = (XxlJobUser) request.getAttribute(LoginService.LOGIN_IDENTITY_KEY); |
| | | if (!loginUser.validPermission(jobGroup)) { |
| | | throw new RuntimeException(I18nUtil.getString("system_permission_limit") + "[username="+ loginUser.getUsername() +"]"); |
| | | } |
| | | } |
| | | |
| | | @RequestMapping("/pageList") |
| | | @ResponseBody |
| | | public Map<String, Object> pageList(@RequestParam(required = false, defaultValue = "0") int start, |
| | | @RequestParam(required = false, defaultValue = "10") int length, |
| | | int jobGroup, int triggerStatus, String jobDesc, String executorHandler, String author) { |
| | | |
| | | return xxlJobService.pageList(start, length, jobGroup, triggerStatus, jobDesc, executorHandler, author); |
| | | } |
| | | |
| | | @RequestMapping("/add") |
| | | @ResponseBody |
| | | public ReturnT<String> add(XxlJobInfo jobInfo) { |
| | | return xxlJobService.add(jobInfo); |
| | | } |
| | | |
| | | @RequestMapping("/update") |
| | | @ResponseBody |
| | | public ReturnT<String> update(XxlJobInfo jobInfo) { |
| | | return xxlJobService.update(jobInfo); |
| | | } |
| | | |
| | | @RequestMapping("/remove") |
| | | @ResponseBody |
| | | public ReturnT<String> remove(int id) { |
| | | return xxlJobService.remove(id); |
| | | } |
| | | |
| | | @RequestMapping("/stop") |
| | | @ResponseBody |
| | | public ReturnT<String> pause(int id) { |
| | | return xxlJobService.stop(id); |
| | | } |
| | | |
| | | @RequestMapping("/start") |
| | | @ResponseBody |
| | | public ReturnT<String> start(int id) { |
| | | return xxlJobService.start(id); |
| | | } |
| | | |
| | | @RequestMapping("/trigger") |
| | | @ResponseBody |
| | | //@PermissionLimit(limit = false) |
| | | public ReturnT<String> triggerJob(int id, String executorParam, String addressList) { |
| | | // force cover job param |
| | | if (executorParam == null) { |
| | | executorParam = ""; |
| | | } |
| | | return "jobinfo/jobinfo.index"; |
| | | } |
| | | |
| | | JobTriggerPoolHelper.trigger(id, TriggerTypeEnum.MANUAL, -1, null, executorParam, addressList); |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | public static List<XxlJobGroup> filterJobGroupByRole(HttpServletRequest request, List<XxlJobGroup> jobGroupList_all) { |
| | | List<XxlJobGroup> jobGroupList = new ArrayList<>(); |
| | | if (jobGroupList_all != null && jobGroupList_all.size() > 0) { |
| | | XxlJobUser loginUser = (XxlJobUser) request.getAttribute(LoginService.LOGIN_IDENTITY_KEY); |
| | | if (loginUser.getRole() == 1) { |
| | | jobGroupList = jobGroupList_all; |
| | | } else { |
| | | List<String> groupIdStrs = new ArrayList<>(); |
| | | if (loginUser.getPermission() != null && loginUser.getPermission().trim().length() > 0) { |
| | | groupIdStrs = Arrays.asList(loginUser.getPermission().trim().split(",")); |
| | | } |
| | | for (XxlJobGroup groupItem : jobGroupList_all) { |
| | | if (groupIdStrs.contains(String.valueOf(groupItem.getId()))) { |
| | | jobGroupList.add(groupItem); |
| | | } |
| | | } |
| | | } |
| | | } |
| | | return jobGroupList; |
| | | } |
| | | |
| | | @RequestMapping("/nextTriggerTime") |
| | | @ResponseBody |
| | | public ReturnT<List<String>> nextTriggerTime(String scheduleType, String scheduleConf) { |
| | | public static void validPermission(HttpServletRequest request, int jobGroup) { |
| | | XxlJobUser loginUser = (XxlJobUser) request.getAttribute(LoginService.LOGIN_IDENTITY_KEY); |
| | | if (!loginUser.validPermission(jobGroup)) { |
| | | throw new RuntimeException(I18nUtil.getString("system_permission_limit") + "[username=" + loginUser.getUsername() + "]"); |
| | | } |
| | | } |
| | | |
| | | XxlJobInfo paramXxlJobInfo = new XxlJobInfo(); |
| | | paramXxlJobInfo.setScheduleType(scheduleType); |
| | | paramXxlJobInfo.setScheduleConf(scheduleConf); |
| | | @RequestMapping("/pageList") |
| | | @ResponseBody |
| | | public Map<String, Object> pageList(@RequestParam(required = false, defaultValue = "0") int start, |
| | | @RequestParam(required = false, defaultValue = "10") int length, |
| | | int jobGroup, int triggerStatus, String jobDesc, String executorHandler, String author) { |
| | | |
| | | List<String> result = new ArrayList<>(); |
| | | try { |
| | | Date lastTime = new Date(); |
| | | for (int i = 0; i < 5; i++) { |
| | | lastTime = JobScheduleHelper.generateNextValidTime(paramXxlJobInfo, lastTime); |
| | | if (lastTime != null) { |
| | | result.add(DateUtil.formatDateTime(lastTime)); |
| | | } else { |
| | | break; |
| | | } |
| | | } |
| | | } catch (Exception e) { |
| | | logger.error(e.getMessage(), e); |
| | | return new ReturnT<List<String>>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type")+I18nUtil.getString("system_unvalid")) + e.getMessage()); |
| | | } |
| | | return new ReturnT<List<String>>(result); |
| | | return xxlJobService.pageList(start, length, jobGroup, triggerStatus, jobDesc, executorHandler, author); |
| | | } |
| | | |
| | | } |
| | | |
| | | @RequestMapping("/add") |
| | | @ResponseBody |
| | | public ReturnT<String> add(XxlJobInfo jobInfo) { |
| | | return xxlJobService.add(jobInfo); |
| | | } |
| | | |
| | | @RequestMapping("/update") |
| | | @ResponseBody |
| | | public ReturnT<String> update(XxlJobInfo jobInfo) { |
| | | return xxlJobService.update(jobInfo); |
| | | } |
| | | |
| | | @RequestMapping("/remove") |
| | | @ResponseBody |
| | | public ReturnT<String> remove(int id) { |
| | | return xxlJobService.remove(id); |
| | | } |
| | | |
| | | @RequestMapping("/stop") |
| | | @ResponseBody |
| | | public ReturnT<String> pause(int id) { |
| | | return xxlJobService.stop(id); |
| | | } |
| | | |
| | | @RequestMapping("/start") |
| | | @ResponseBody |
| | | public ReturnT<String> start(int id) { |
| | | return xxlJobService.start(id); |
| | | } |
| | | |
| | | @RequestMapping("/trigger") |
| | | @ResponseBody |
| | | //@PermissionLimit(limit = false) |
| | | public ReturnT<String> triggerJob(int id, String executorParam, String addressList) { |
| | | // force cover job param |
| | | if (executorParam == null) { |
| | | executorParam = ""; |
| | | } |
| | | |
| | | JobTriggerPoolHelper.trigger(id, TriggerTypeEnum.MANUAL, -1, null, executorParam, addressList); |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | @RequestMapping("/nextTriggerTime") |
| | | @ResponseBody |
| | | public ReturnT<List<String>> nextTriggerTime(String scheduleType, String scheduleConf) { |
| | | |
| | | XxlJobInfo paramXxlJobInfo = new XxlJobInfo(); |
| | | paramXxlJobInfo.setScheduleType(scheduleType); |
| | | paramXxlJobInfo.setScheduleConf(scheduleConf); |
| | | |
| | | List<String> result = new ArrayList<>(); |
| | | try { |
| | | Date lastTime = new Date(); |
| | | for (int i = 0; i < 5; i++) { |
| | | lastTime = JobScheduleHelper.generateNextValidTime(paramXxlJobInfo, lastTime); |
| | | if (lastTime != null) { |
| | | result.add(DateUtil.formatDateTime(lastTime)); |
| | | } else { |
| | | break; |
| | | } |
| | | } |
| | | } catch (Exception e) { |
| | | logger.error(e.getMessage(), e); |
| | | return new ReturnT<List<String>>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid")) + e.getMessage()); |
| | | } |
| | | return new ReturnT<List<String>>(result); |
| | | |
| | | } |
| | | |
| | | } |
| | |
| | | package com.xxl.job.admin.controller; |
| | | |
| | | import com.xxl.job.admin.core.complete.XxlJobCompleter; |
| | | import com.xxl.job.admin.core.exception.XxlJobException; |
| | | import com.xxl.job.admin.core.model.XxlJobGroup; |
| | | import com.xxl.job.admin.core.model.XxlJobInfo; |
| | | import com.xxl.job.admin.core.model.XxlJobLog; |
| | |
| | | |
| | | /** |
| | | * index controller |
| | | * |
| | | * @author xuxueli 2015-12-19 16:13:16 |
| | | */ |
| | | @Controller |
| | | @RequestMapping("/joblog") |
| | | public class JobLogController { |
| | | private static Logger logger = LoggerFactory.getLogger(JobLogController.class); |
| | | |
| | | @Resource |
| | | private XxlJobGroupDao xxlJobGroupDao; |
| | | @Resource |
| | | public XxlJobInfoDao xxlJobInfoDao; |
| | | @Resource |
| | | public XxlJobLogDao xxlJobLogDao; |
| | | |
| | | @RequestMapping |
| | | public String index(HttpServletRequest request, Model model, @RequestParam(required = false, defaultValue = "0") Integer jobId) { |
| | | |
| | | // 执行器列表 |
| | | List<XxlJobGroup> jobGroupList_all = xxlJobGroupDao.findAll(); |
| | | |
| | | // filter group |
| | | List<XxlJobGroup> jobGroupList = JobInfoController.filterJobGroupByRole(request, jobGroupList_all); |
| | | if (jobGroupList == null || jobGroupList.size() == 0) { |
| | | throw new XxlJobException(I18nUtil.getString("jobgroup_empty")); |
| | | } |
| | | |
| | | model.addAttribute("JobGroupList", jobGroupList); |
| | | model.addAttribute("JobGroupList" , jobGroupList); |
| | | |
| | | // 任务 |
| | | if (jobId > 0) { |
| | | XxlJobInfo jobInfo = xxlJobInfoDao.loadById(jobId); |
| | | if (jobInfo == null) { |
| | | throw new RuntimeException(I18nUtil.getString("jobinfo_field_id") + I18nUtil.getString("system_unvalid")); |
| | | } |
| | | |
| | | model.addAttribute("jobInfo", jobInfo); |
| | | model.addAttribute("jobInfo" , jobInfo); |
| | | |
| | | // valid permission |
| | | JobInfoController.validPermission(request, jobInfo.getJobGroup()); |
| | | } |
| | | |
| | | return "joblog/joblog.index"; |
| | | } |
| | | return "joblog/joblog.index"; |
| | | } |
| | | |
| | | @RequestMapping("/getJobsByGroup") |
| | | @ResponseBody |
| | | public ReturnT<List<XxlJobInfo>> getJobsByGroup(int jobGroup){ |
| | | List<XxlJobInfo> list = xxlJobInfoDao.getJobsByGroup(jobGroup); |
| | | return new ReturnT<List<XxlJobInfo>>(list); |
| | | } |
| | | |
| | | @RequestMapping("/pageList") |
| | | @ResponseBody |
| | | public Map<String, Object> pageList(HttpServletRequest request, |
| | | @RequestParam(required = false, defaultValue = "0") int start, |
| | | @RequestParam(required = false, defaultValue = "10") int length, |
| | | int jobGroup, int jobId, int logStatus, String filterTime) { |
| | | @RequestMapping("/getJobsByGroup") |
| | | @ResponseBody |
| | | public ReturnT<List<XxlJobInfo>> getJobsByGroup(int jobGroup) { |
| | | List<XxlJobInfo> list = xxlJobInfoDao.getJobsByGroup(jobGroup); |
| | | return new ReturnT<List<XxlJobInfo>>(list); |
| | | } |
| | | |
| | | // valid permission |
| | | JobInfoController.validPermission(request, jobGroup); // 仅管理员支持查询全部;普通用户仅支持查询有权限的 jobGroup |
| | | |
| | | // parse param |
| | | Date triggerTimeStart = null; |
| | | Date triggerTimeEnd = null; |
| | | if (filterTime!=null && filterTime.trim().length()>0) { |
| | | String[] temp = filterTime.split(" - "); |
| | | if (temp.length == 2) { |
| | | triggerTimeStart = DateUtil.parseDateTime(temp[0]); |
| | | triggerTimeEnd = DateUtil.parseDateTime(temp[1]); |
| | | } |
| | | } |
| | | |
| | | // page query |
| | | List<XxlJobLog> list = xxlJobLogDao.pageList(start, length, jobGroup, jobId, triggerTimeStart, triggerTimeEnd, logStatus); |
| | | int list_count = xxlJobLogDao.pageListCount(start, length, jobGroup, jobId, triggerTimeStart, triggerTimeEnd, logStatus); |
| | | |
| | | // package result |
| | | Map<String, Object> maps = new HashMap<String, Object>(); |
| | | maps.put("recordsTotal", list_count); // 总记录数 |
| | | maps.put("recordsFiltered", list_count); // 过滤后的总记录数 |
| | | maps.put("data", list); // 分页列表 |
| | | return maps; |
| | | } |
| | | @RequestMapping("/pageList") |
| | | @ResponseBody |
| | | public Map<String, Object> pageList(HttpServletRequest request, |
| | | @RequestParam(required = false, defaultValue = "0") int start, |
| | | @RequestParam(required = false, defaultValue = "10") int length, |
| | | int jobGroup, int jobId, int logStatus, String filterTime) { |
| | | |
| | | @RequestMapping("/logDetailPage") |
| | | public String logDetailPage(int id, Model model){ |
| | | // valid permission |
| | | JobInfoController.validPermission(request, jobGroup); // 仅管理员支持查询全部;普通用户仅支持查询有权限的 jobGroup |
| | | |
| | | // base check |
| | | ReturnT<String> logStatue = ReturnT.SUCCESS; |
| | | XxlJobLog jobLog = xxlJobLogDao.load(id); |
| | | if (jobLog == null) { |
| | | // parse param |
| | | Date triggerTimeStart = null; |
| | | Date triggerTimeEnd = null; |
| | | if (filterTime != null && filterTime.trim().length() > 0) { |
| | | String[] temp = filterTime.split(" - "); |
| | | if (temp.length == 2) { |
| | | triggerTimeStart = DateUtil.parseDateTime(temp[0]); |
| | | triggerTimeEnd = DateUtil.parseDateTime(temp[1]); |
| | | } |
| | | } |
| | | |
| | | // page query |
| | | List<XxlJobLog> list = xxlJobLogDao.pageList(start, length, jobGroup, jobId, triggerTimeStart, triggerTimeEnd, logStatus); |
| | | int list_count = xxlJobLogDao.pageListCount(start, length, jobGroup, jobId, triggerTimeStart, triggerTimeEnd, logStatus); |
| | | |
| | | // package result |
| | | Map<String, Object> maps = new HashMap<String, Object>(); |
| | | maps.put("recordsTotal" , list_count); // 总记录数 |
| | | maps.put("recordsFiltered" , list_count); // 过滤后的总记录数 |
| | | maps.put("data" , list); // 分页列表 |
| | | return maps; |
| | | } |
| | | |
| | | @RequestMapping("/logDetailPage") |
| | | public String logDetailPage(int id, Model model) { |
| | | |
| | | // base check |
| | | ReturnT<String> logStatue = ReturnT.SUCCESS; |
| | | XxlJobLog jobLog = xxlJobLogDao.load(id); |
| | | if (jobLog == null) { |
| | | throw new RuntimeException(I18nUtil.getString("joblog_logid_unvalid")); |
| | | } |
| | | } |
| | | |
| | | model.addAttribute("triggerCode", jobLog.getTriggerCode()); |
| | | model.addAttribute("handleCode", jobLog.getHandleCode()); |
| | | model.addAttribute("executorAddress", jobLog.getExecutorAddress()); |
| | | model.addAttribute("triggerTime", jobLog.getTriggerTime().getTime()); |
| | | model.addAttribute("logId", jobLog.getId()); |
| | | return "joblog/joblog.detail"; |
| | | } |
| | | model.addAttribute("triggerCode" , jobLog.getTriggerCode()); |
| | | model.addAttribute("handleCode" , jobLog.getHandleCode()); |
| | | model.addAttribute("executorAddress" , jobLog.getExecutorAddress()); |
| | | model.addAttribute("triggerTime" , jobLog.getTriggerTime().getTime()); |
| | | model.addAttribute("logId" , jobLog.getId()); |
| | | return "joblog/joblog.detail"; |
| | | } |
| | | |
| | | @RequestMapping("/logDetailCat") |
| | | @ResponseBody |
| | | public ReturnT<LogResult> logDetailCat(String executorAddress, long triggerTime, long logId, int fromLineNum){ |
| | | try { |
| | | ExecutorBiz executorBiz = XxlJobScheduler.getExecutorBiz(executorAddress); |
| | | ReturnT<LogResult> logResult = executorBiz.log(new LogParam(triggerTime, logId, fromLineNum)); |
| | | @RequestMapping("/logDetailCat") |
| | | @ResponseBody |
| | | public ReturnT<LogResult> logDetailCat(String executorAddress, long triggerTime, long logId, int fromLineNum) { |
| | | try { |
| | | ExecutorBiz executorBiz = XxlJobScheduler.getExecutorBiz(executorAddress); |
| | | ReturnT<LogResult> logResult = executorBiz.log(new LogParam(triggerTime, logId, fromLineNum)); |
| | | |
| | | // is end |
| | | if (logResult.getContent() != null && logResult.getContent().getFromLineNum() > logResult.getContent().getToLineNum()) { |
| | | XxlJobLog jobLog = xxlJobLogDao.load(logId); |
| | | if (jobLog.getHandleCode() > 0) { |
| | | logResult.getContent().setEnd(true); |
| | | } |
| | | } |
| | | |
| | | return logResult; |
| | | } catch (Exception e) { |
| | | logger.error(e.getMessage(), e); |
| | | return new ReturnT<LogResult>(ReturnT.FAIL_CODE, e.getMessage()); |
| | | } |
| | | } |
| | | |
| | | @RequestMapping("/logKill") |
| | | @ResponseBody |
| | | public ReturnT<String> logKill(int id){ |
| | | // base check |
| | | XxlJobLog log = xxlJobLogDao.load(id); |
| | | XxlJobInfo jobInfo = xxlJobInfoDao.loadById(log.getJobId()); |
| | | if (jobInfo==null) { |
| | | return new ReturnT<String>(500, I18nUtil.getString("jobinfo_glue_jobid_unvalid")); |
| | | } |
| | | if (ReturnT.SUCCESS_CODE != log.getTriggerCode()) { |
| | | return new ReturnT<String>(500, I18nUtil.getString("joblog_kill_log_limit")); |
| | | } |
| | | @RequestMapping("/logKill") |
| | | @ResponseBody |
| | | public ReturnT<String> logKill(int id) { |
| | | // base check |
| | | XxlJobLog log = xxlJobLogDao.load(id); |
| | | XxlJobInfo jobInfo = xxlJobInfoDao.loadById(log.getJobId()); |
| | | if (jobInfo == null) { |
| | | return new ReturnT<String>(500, I18nUtil.getString("jobinfo_glue_jobid_unvalid")); |
| | | } |
| | | if (ReturnT.SUCCESS_CODE != log.getTriggerCode()) { |
| | | return new ReturnT<String>(500, I18nUtil.getString("joblog_kill_log_limit")); |
| | | } |
| | | |
| | | // request of kill |
| | | ReturnT<String> runResult = null; |
| | | try { |
| | | ExecutorBiz executorBiz = XxlJobScheduler.getExecutorBiz(log.getExecutorAddress()); |
| | | runResult = executorBiz.kill(new KillParam(jobInfo.getId())); |
| | | } catch (Exception e) { |
| | | logger.error(e.getMessage(), e); |
| | | runResult = new ReturnT<String>(500, e.getMessage()); |
| | | } |
| | | |
| | | if (ReturnT.SUCCESS_CODE == runResult.getCode()) { |
| | | log.setHandleCode(ReturnT.FAIL_CODE); |
| | | log.setHandleMsg(I18nUtil.getString("joblog_kill_log_byman") + ":" + (runResult.getMsg() != null ? runResult.getMsg() : "")); |
| | | log.setHandleTime(new Date()); |
| | | XxlJobCompleter.updateHandleInfoAndFinish(log); |
| | | return new ReturnT<String>(runResult.getMsg()); |
| | | } else { |
| | | return new ReturnT<String>(500, runResult.getMsg()); |
| | | } |
| | | } |
| | | |
| | | @RequestMapping("/clearLog") |
| | | @ResponseBody |
| | | public ReturnT<String> clearLog(int jobGroup, int jobId, int type){ |
| | | @RequestMapping("/clearLog") |
| | | @ResponseBody |
| | | public ReturnT<String> clearLog(int jobGroup, int jobId, int type) { |
| | | |
| | | Date clearBeforeTime = null; |
| | | int clearBeforeNum = 0; |
| | | if (type == 1) { |
| | | clearBeforeTime = DateUtil.addMonths(new Date(), -1); // 清理一个月之前日志数据 |
| | | } else if (type == 2) { |
| | | clearBeforeTime = DateUtil.addMonths(new Date(), -3); // 清理三个月之前日志数据 |
| | | } else if (type == 3) { |
| | | clearBeforeTime = DateUtil.addMonths(new Date(), -6); // 清理六个月之前日志数据 |
| | | } else if (type == 4) { |
| | | clearBeforeTime = DateUtil.addYears(new Date(), -1); // 清理一年之前日志数据 |
| | | } else if (type == 5) { |
| | | clearBeforeNum = 1000; // 清理一千条以前日志数据 |
| | | } else if (type == 6) { |
| | | clearBeforeNum = 10000; // 清理一万条以前日志数据 |
| | | } else if (type == 7) { |
| | | clearBeforeNum = 30000; // 清理三万条以前日志数据 |
| | | } else if (type == 8) { |
| | | clearBeforeNum = 100000; // 清理十万条以前日志数据 |
| | | } else if (type == 9) { |
| | | clearBeforeNum = 0; // 清理所有日志数据 |
| | | } else { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("joblog_clean_type_unvalid")); |
| | | } |
| | | |
| | | List<Long> logIds = null; |
| | | do { |
| | | logIds = xxlJobLogDao.findClearLogIds(jobGroup, jobId, clearBeforeTime, clearBeforeNum, 1000); |
| | | if (logIds != null && logIds.size() > 0) { |
| | | xxlJobLogDao.clearLog(logIds); |
| | | } |
| | | } while (logIds != null && logIds.size() > 0); |
| | | |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | } |
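| | | |
| | | // Editor's note: the pageList methods above return a DataTables-style paging map. Based only on the keys |
| | | // set above ("recordsTotal", "recordsFiltered", "data"), a response body looks roughly like the sketch |
| | | // below; the concrete values are illustrative, not taken from this changelog. |
| | | // { |
| | | //   "recordsTotal": 2,      // 总记录数 |
| | | //   "recordsFiltered": 2,   // 过滤后的总记录数 |
| | | //   "data": [ { "id": 1, "jobId": 5, "triggerCode": 200, "handleCode": 200, ... } ] |
| | | // } |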
| | |
| | | |
| | | // 执行器列表 |
| | | List<XxlJobGroup> groupList = xxlJobGroupDao.findAll(); |
| | | model.addAttribute("groupList", groupList); |
| | | model.addAttribute("groupList" , groupList); |
| | | |
| | | return "user/user.index"; |
| | | } |
| | |
| | | int list_count = xxlJobUserDao.pageListCount(start, length, username, role); |
| | | |
| | | // filter |
| | | if (list != null && list.size() > 0) { |
| | | for (XxlJobUser item : list) { |
| | | item.setPassword(null); |
| | | } |
| | | } |
| | | |
| | | // package result |
| | | Map<String, Object> maps = new HashMap<String, Object>(); |
| | | maps.put("recordsTotal", list_count); // 总记录数 |
| | | maps.put("recordsFiltered", list_count); // 过滤后的总记录数 |
| | | maps.put("data", list); // 分页列表 |
| | | maps.put("recordsTotal" , list_count); // 总记录数 |
| | | maps.put("recordsFiltered" , list_count); // 过滤后的总记录数 |
| | | maps.put("data" , list); // 分页列表 |
| | | return maps; |
| | | } |
| | | |
| | |
| | | |
| | | // valid username |
| | | if (!StringUtils.hasText(xxlJobUser.getUsername())) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_please_input")+I18nUtil.getString("user_username") ); |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_please_input") + I18nUtil.getString("user_username")); |
| | | } |
| | | xxlJobUser.setUsername(xxlJobUser.getUsername().trim()); |
| | | if (!(xxlJobUser.getUsername().length() >= 4 && xxlJobUser.getUsername().length() <= 20)) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_lengh_limit") + "[4-20]"); |
| | | } |
| | | // valid password |
| | | if (!StringUtils.hasText(xxlJobUser.getPassword())) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_please_input")+I18nUtil.getString("user_password") ); |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_please_input") + I18nUtil.getString("user_password")); |
| | | } |
| | | xxlJobUser.setPassword(xxlJobUser.getPassword().trim()); |
| | | if (!(xxlJobUser.getPassword().length() >= 4 && xxlJobUser.getPassword().length() <= 20)) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_lengh_limit") + "[4-20]"); |
| | | } |
| | | // md5 password |
| | | xxlJobUser.setPassword(DigestUtils.md5DigestAsHex(xxlJobUser.getPassword().getBytes())); |
| | |
| | | // check repeat |
| | | XxlJobUser existUser = xxlJobUserDao.loadByUserName(xxlJobUser.getUsername()); |
| | | if (existUser != null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("user_username_repeat") ); |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("user_username_repeat")); |
| | | } |
| | | |
| | | // write |
| | |
| | | // valid password |
| | | if (StringUtils.hasText(xxlJobUser.getPassword())) { |
| | | xxlJobUser.setPassword(xxlJobUser.getPassword().trim()); |
| | | if (!(xxlJobUser.getPassword().length() >= 4 && xxlJobUser.getPassword().length() <= 20)) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_lengh_limit") + "[4-20]"); |
| | | } |
| | | // md5 password |
| | | xxlJobUser.setPassword(DigestUtils.md5DigestAsHex(xxlJobUser.getPassword().getBytes())); |
| | |
| | | |
| | | @RequestMapping("/updatePwd") |
| | | @ResponseBody |
| | | public ReturnT<String> updatePwd(HttpServletRequest request, String password) { |
| | | |
| | | // valid password |
| | | if (password == null || password.trim().length() == 0) { |
| | | return new ReturnT<String>(ReturnT.FAIL.getCode(), "密码不可为空"); |
| | | } |
| | | password = password.trim(); |
| | | if (!(password.length() >= 4 && password.length() <= 20)) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, I18nUtil.getString("system_lengh_limit") + "[4-20]"); |
| | | } |
| | | |
| | | // md5 password |
| | |
| | | |
| | | /** |
| | | * 权限限制 |
| | | * |
| | | * @author xuxueli 2015-12-12 18:29:02 |
| | | */ |
| | | @Target(ElementType.METHOD) |
| | | @Retention(RetentionPolicy.RUNTIME) |
| | | public @interface PermissionLimit { |
| | | |
| | | /** |
| | | * 登录拦截 (默认拦截) |
| | | */ |
| | | boolean limit() default true; |
| | | |
| | | /** |
| | | * 要求管理员权限 |
| | | * |
| | | * @return |
| | | */ |
| | | boolean adminuser() default false; |
| | | |
| | | } |
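| | | |
| | | // Editor's note: a minimal usage sketch (not part of this changelog) showing how @PermissionLimit is |
| | | // consumed by PermissionInterceptor below. The controller class and mappings here are hypothetical; |
| | | // limit = false matches the commented-out usage on triggerJob above, and adminuser = true corresponds |
| | | // to the loginUser.getRole() != 1 check in the interceptor. |
| | | @Controller |
| | | @RequestMapping("/demo") |
| | | public class PermissionLimitUsageDemo { |
| | | |
| | | // skip the login check, e.g. for an open callback endpoint |
| | | @PermissionLimit(limit = false) |
| | | @RequestMapping("/open") |
| | | @ResponseBody |
| | | public ReturnT<String> open() { |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | // require an administrator (role == 1); other users hit system_permission_limit |
| | | @PermissionLimit(adminuser = true) |
| | | @RequestMapping("/adminOnly") |
| | | @ResponseBody |
| | | public ReturnT<String> adminOnly() { |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | } |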
| | |
| | | @Component |
| | | public class CookieInterceptor implements AsyncHandlerInterceptor { |
| | | |
| | | @Override |
| | | public void postHandle(HttpServletRequest request, HttpServletResponse response, Object handler, |
| | | ModelAndView modelAndView) throws Exception { |
| | | |
| | | // cookie |
| | | if (modelAndView != null && request.getCookies() != null && request.getCookies().length > 0) { |
| | | HashMap<String, Cookie> cookieMap = new HashMap<String, Cookie>(); |
| | | for (Cookie ck : request.getCookies()) { |
| | | cookieMap.put(ck.getName(), ck); |
| | | } |
| | | modelAndView.addObject("cookieMap", cookieMap); |
| | | } |
| | | |
| | | // static method |
| | | if (modelAndView != null) { |
| | | modelAndView.addObject("I18nUtil", FtlUtil.generateStaticModel(I18nUtil.class.getName())); |
| | | } |
| | | |
| | | AsyncHandlerInterceptor.super.postHandle(request, response, handler, modelAndView); |
| | | } |
| | | |
| | | } |
| | |
| | | @Component |
| | | public class PermissionInterceptor implements AsyncHandlerInterceptor { |
| | | |
| | | @Resource |
| | | private LoginService loginService; |
| | | |
| | | @Override |
| | | public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception { |
| | | |
| | | if (!(handler instanceof HandlerMethod)) { |
| | | return AsyncHandlerInterceptor.super.preHandle(request, response, handler); |
| | | } |
| | | |
| | | // if need login |
| | | boolean needLogin = true; |
| | | boolean needAdminuser = false; |
| | | HandlerMethod method = (HandlerMethod) handler; |
| | | PermissionLimit permission = method.getMethodAnnotation(PermissionLimit.class); |
| | | if (permission != null) { |
| | | needLogin = permission.limit(); |
| | | needAdminuser = permission.adminuser(); |
| | | } |
| | | |
| | | if (needLogin) { |
| | | XxlJobUser loginUser = loginService.ifLogin(request, response); |
| | | if (loginUser == null) { |
| | | response.setStatus(302); |
| | | response.setHeader("location", request.getContextPath() + "/toLogin"); |
| | | return false; |
| | | } |
| | | if (needAdminuser && loginUser.getRole() != 1) { |
| | | throw new RuntimeException(I18nUtil.getString("system_permission_limit")); |
| | | } |
| | | request.setAttribute(LoginService.LOGIN_IDENTITY_KEY, loginUser); |
| | | } |
| | | |
| | | return AsyncHandlerInterceptor.super.preHandle(request, response, handler); |
| | | } |
| | | |
| | | } |
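| | | |
| | | // Editor's note: a sketch (an assumption, not shown in this changelog) of how CookieInterceptor and |
| | | // PermissionInterceptor are typically registered with Spring MVC via a WebMvcConfigurer; the config |
| | | // class name is hypothetical. |
| | | @Configuration |
| | | public class InterceptorConfigSketch implements WebMvcConfigurer { |
| | | |
| | | @Resource |
| | | private PermissionInterceptor permissionInterceptor; |
| | | @Resource |
| | | private CookieInterceptor cookieInterceptor; |
| | | |
| | | @Override |
| | | public void addInterceptors(InterceptorRegistry registry) { |
| | | // login/permission check runs before every handler method |
| | | registry.addInterceptor(permissionInterceptor).addPathPatterns("/**"); |
| | | // cookie map and I18nUtil are exposed to every rendered view |
| | | registry.addInterceptor(cookieInterceptor).addPathPatterns("/**"); |
| | | } |
| | | } |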
| | |
| | | package com.xxl.job.admin.controller.resolver; |
| | | |
| | | import com.xxl.job.admin.core.exception.XxlJobException; |
| | | import com.xxl.job.admin.core.util.JacksonUtil; |
| | | import com.xxl.job.core.biz.model.ReturnT; |
| | | import org.slf4j.Logger; |
| | | import org.slf4j.LoggerFactory; |
| | | import org.springframework.stereotype.Component; |
| | |
| | | */ |
| | | @Component |
| | | public class WebExceptionResolver implements HandlerExceptionResolver { |
| | | private static transient Logger logger = LoggerFactory.getLogger(WebExceptionResolver.class); |
| | | |
| | | @Override |
| | | public ModelAndView resolveException(HttpServletRequest request, |
| | | HttpServletResponse response, Object handler, Exception ex) { |
| | | |
| | | if (!(ex instanceof XxlJobException)) { |
| | | logger.error("WebExceptionResolver:{}", ex); |
| | | } |
| | | |
| | | // if json |
| | | boolean isJson = false; |
| | | if (handler instanceof HandlerMethod) { |
| | | HandlerMethod method = (HandlerMethod) handler; |
| | | ResponseBody responseBody = method.getMethodAnnotation(ResponseBody.class); |
| | | if (responseBody != null) { |
| | | isJson = true; |
| | | } |
| | | } |
| | | |
| | | // error result |
| | | ReturnT<String> errorResult = new ReturnT<String>(ReturnT.FAIL_CODE, ex.toString().replaceAll("\n", "<br/>")); |
| | | |
| | | // response |
| | | ModelAndView mv = new ModelAndView(); |
| | | if (isJson) { |
| | | try { |
| | | response.setContentType("application/json;charset=utf-8"); |
| | | response.getWriter().print(JacksonUtil.writeValueAsString(errorResult)); |
| | | } catch (IOException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | return mv; |
| | | } else { |
| | | mv.addObject("exceptionMsg", errorResult.getMsg()); |
| | | mv.setViewName("/common/common.exception"); |
| | | return mv; |
| | | } |
| | | } |
| | | |
| | | } |
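| | | |
| | | // Editor's note: for handlers annotated with @ResponseBody the resolver above writes the ReturnT error |
| | | // as JSON, while plain page requests are routed to the /common/common.exception view. Assuming ReturnT's |
| | | // usual code/msg/content fields and FAIL_CODE = 500 (not restated in this changelog), the JSON body looks |
| | | // roughly like: |
| | | // {"code": 500, "msg": "java.lang.RuntimeException: ...<br/>...", "content": null} |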
| | | mv.addObject("exceptionMsg" , errorResult.getMsg()); |
| | | mv.setViewName("/common/common.exception"); |
| | | return mv; |
| | | } |
| | | } |
| | | |
| | | } |
| | |
| | | * @author xuxueli 2016-1-12 18:25:49 |
| | | */ |
| | | public class XxlJobInfo { |
| | | |
| | | private int id; // 主键ID |
| | | |
| | | private int jobGroup; // 执行器主键ID |
| | | private String jobDesc; |
| | | |
| | | private Date addTime; |
| | | private Date updateTime; |
| | | |
| | | private String author; // 负责人 |
| | | private String alarmEmail; // 报警邮件 |
| | | |
| | | private String scheduleType; // 调度类型 |
| | | private String scheduleConf; // 调度配置,值含义取决于调度类型 |
| | | private String misfireStrategy; // 调度过期策略 |
| | | |
| | | private String executorRouteStrategy; // 执行器路由策略 |
| | | private String executorHandler; // 执行器,任务Handler名称 |
| | | private String executorParam; // 执行器,任务参数 |
| | | private String executorBlockStrategy; // 阻塞处理策略 |
| | | private int executorTimeout; // 任务执行超时时间,单位秒 |
| | | private int executorFailRetryCount; // 失败重试次数 |
| | | |
| | | private String glueType; // GLUE类型 #com.xxl.job.core.glue.GlueTypeEnum |
| | | private String glueSource; // GLUE源代码 |
| | | private String glueRemark; // GLUE备注 |
| | | private Date glueUpdatetime; // GLUE更新时间 |
| | | |
| | | private String childJobId; // 子任务ID,多个逗号分隔 |
| | | |
| | | private int triggerStatus; // 调度状态:0-停止,1-运行 |
| | | private long triggerLastTime; // 上次调度时间 |
| | | private long triggerNextTime; // 下次调度时间 |
| | | |
| | | |
| | | public int getId() { |
| | | return id; |
| | | } |
| | | |
| | | public void setId(int id) { |
| | | this.id = id; |
| | | } |
| | | |
| | | public int getJobGroup() { |
| | | return jobGroup; |
| | | } |
| | | |
| | | public void setJobGroup(int jobGroup) { |
| | | this.jobGroup = jobGroup; |
| | | } |
| | | |
| | | public String getJobDesc() { |
| | | return jobDesc; |
| | | } |
| | | |
| | | public void setJobDesc(String jobDesc) { |
| | | this.jobDesc = jobDesc; |
| | | } |
| | | |
| | | public Date getAddTime() { |
| | | return addTime; |
| | | } |
| | | |
| | | public void setAddTime(Date addTime) { |
| | | this.addTime = addTime; |
| | | } |
| | | |
| | | public Date getUpdateTime() { |
| | | return updateTime; |
| | | } |
| | | |
| | | public void setUpdateTime(Date updateTime) { |
| | | this.updateTime = updateTime; |
| | | } |
| | | |
| | | public String getAuthor() { |
| | | return author; |
| | | } |
| | | |
| | | public void setAuthor(String author) { |
| | | this.author = author; |
| | | } |
| | | |
| | | public String getAlarmEmail() { |
| | | return alarmEmail; |
| | | } |
| | | |
| | | public void setAlarmEmail(String alarmEmail) { |
| | | this.alarmEmail = alarmEmail; |
| | | } |
| | | |
| | | public String getScheduleType() { |
| | | return scheduleType; |
| | | } |
| | | |
| | | public void setScheduleType(String scheduleType) { |
| | | this.scheduleType = scheduleType; |
| | | } |
| | | |
| | | public String getScheduleConf() { |
| | | return scheduleConf; |
| | | } |
| | | |
| | | public void setScheduleConf(String scheduleConf) { |
| | | this.scheduleConf = scheduleConf; |
| | | } |
| | | |
| | | public String getMisfireStrategy() { |
| | | return misfireStrategy; |
| | | } |
| | | |
| | | public void setMisfireStrategy(String misfireStrategy) { |
| | | this.misfireStrategy = misfireStrategy; |
| | | } |
| | | |
| | | public String getExecutorRouteStrategy() { |
| | | return executorRouteStrategy; |
| | | } |
| | | |
| | | public void setExecutorRouteStrategy(String executorRouteStrategy) { |
| | | this.executorRouteStrategy = executorRouteStrategy; |
| | | } |
| | | |
| | | public String getExecutorHandler() { |
| | | return executorHandler; |
| | | } |
| | | |
| | | public void setExecutorHandler(String executorHandler) { |
| | | this.executorHandler = executorHandler; |
| | | } |
| | | |
| | | public String getExecutorParam() { |
| | | return executorParam; |
| | | } |
| | | |
| | | public void setExecutorParam(String executorParam) { |
| | | this.executorParam = executorParam; |
| | | } |
| | | |
| | | public String getExecutorBlockStrategy() { |
| | | return executorBlockStrategy; |
| | | } |
| | | |
| | | public void setExecutorBlockStrategy(String executorBlockStrategy) { |
| | | this.executorBlockStrategy = executorBlockStrategy; |
| | | } |
| | | |
| | | public int getExecutorTimeout() { |
| | | return executorTimeout; |
| | | } |
| | | |
| | | public void setExecutorTimeout(int executorTimeout) { |
| | | this.executorTimeout = executorTimeout; |
| | | } |
| | | |
| | | public int getExecutorFailRetryCount() { |
| | | return executorFailRetryCount; |
| | | } |
| | | |
| | | public void setExecutorFailRetryCount(int executorFailRetryCount) { |
| | | this.executorFailRetryCount = executorFailRetryCount; |
| | | } |
| | | |
| | | public String getGlueType() { |
| | | return glueType; |
| | | } |
| | | |
| | | public void setGlueType(String glueType) { |
| | | this.glueType = glueType; |
| | | } |
| | | |
| | | public String getGlueSource() { |
| | | return glueSource; |
| | | } |
| | | |
| | | public void setGlueSource(String glueSource) { |
| | | this.glueSource = glueSource; |
| | | } |
| | | |
| | | public String getGlueRemark() { |
| | | return glueRemark; |
| | | } |
| | | |
| | | public void setGlueRemark(String glueRemark) { |
| | | this.glueRemark = glueRemark; |
| | | } |
| | | |
| | | public Date getGlueUpdatetime() { |
| | | return glueUpdatetime; |
| | | } |
| | | |
| | | public void setGlueUpdatetime(Date glueUpdatetime) { |
| | | this.glueUpdatetime = glueUpdatetime; |
| | | } |
| | | |
| | | public String getChildJobId() { |
| | | return childJobId; |
| | | } |
| | | |
| | | public void setChildJobId(String childJobId) { |
| | | this.childJobId = childJobId; |
| | | } |
| | | |
| | | public int getTriggerStatus() { |
| | | return triggerStatus; |
| | | } |
| | | |
| | | public void setTriggerStatus(int triggerStatus) { |
| | | this.triggerStatus = triggerStatus; |
| | | } |
| | | |
| | | public long getTriggerLastTime() { |
| | | return triggerLastTime; |
| | | } |
| | | |
| | | public void setTriggerLastTime(long triggerLastTime) { |
| | | this.triggerLastTime = triggerLastTime; |
| | | } |
| | | |
| | | public long getTriggerNextTime() { |
| | | return triggerNextTime; |
| | | } |
| | | |
| | | public void setTriggerNextTime(long triggerNextTime) { |
| | | this.triggerNextTime = triggerNextTime; |
| | | } |
| | | } |
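| | | |
| | | // Editor's note: illustrative pairing of scheduleType and scheduleConf (an assumption based on the field |
| | | // comment "值含义取决于调度类型" and the ScheduleTypeEnum referenced earlier, not stated in this changelog): |
| | | // scheduleType = "CRON"     -> scheduleConf holds a CRON expression, e.g. "0 0/5 * * * ?" |
| | | // scheduleType = "FIX_RATE" -> scheduleConf holds a fixed interval in seconds, e.g. "30" |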
| | |
| | | |
| | | /** |
| | | * xxl-job log, used to track trigger process |
| | | * |
| | | * @author xuxueli 2015-12-19 23:19:09 |
| | | */ |
| | | public class XxlJobLog { |
| | | |
| | | private long id; |
| | | |
| | | // job info |
| | | private int jobGroup; |
| | | private int jobId; |
| | | |
| | | // execute info |
| | | private String executorAddress; |
| | | private String executorHandler; |
| | | private String executorParam; |
| | | private String executorShardingParam; |
| | | private int executorFailRetryCount; |
| | | |
| | | // trigger info |
| | | private Date triggerTime; |
| | | private int triggerCode; |
| | | private String triggerMsg; |
| | | |
| | | // handle info |
| | | private Date handleTime; |
| | | private int handleCode; |
| | | private String handleMsg; |
| | | |
| | | // alarm info |
| | | private int alarmStatus; |
| | | |
| | | public long getId() { |
| | | return id; |
| | | } |
| | | |
| | | public void setId(long id) { |
| | | this.id = id; |
| | | } |
| | | |
| | | public int getJobGroup() { |
| | | return jobGroup; |
| | | } |
| | | |
| | | public void setJobGroup(int jobGroup) { |
| | | this.jobGroup = jobGroup; |
| | | } |
| | | |
| | | public int getJobId() { |
| | | return jobId; |
| | | } |
| | | |
| | | public void setJobId(int jobId) { |
| | | this.jobId = jobId; |
| | | } |
| | | |
| | | public String getExecutorAddress() { |
| | | return executorAddress; |
| | | } |
| | | |
| | | public void setExecutorAddress(String executorAddress) { |
| | | this.executorAddress = executorAddress; |
| | | } |
| | | |
| | | public String getExecutorHandler() { |
| | | return executorHandler; |
| | | } |
| | | |
| | | public void setExecutorHandler(String executorHandler) { |
| | | this.executorHandler = executorHandler; |
| | | } |
| | | |
| | | public String getExecutorParam() { |
| | | return executorParam; |
| | | } |
| | | |
| | | public void setExecutorParam(String executorParam) { |
| | | this.executorParam = executorParam; |
| | | } |
| | | |
| | | public String getExecutorShardingParam() { |
| | | return executorShardingParam; |
| | | } |
| | | |
| | | public void setExecutorShardingParam(String executorShardingParam) { |
| | | this.executorShardingParam = executorShardingParam; |
| | | } |
| | | |
| | | public int getExecutorFailRetryCount() { |
| | | return executorFailRetryCount; |
| | | } |
| | | |
| | | public void setExecutorFailRetryCount(int executorFailRetryCount) { |
| | | this.executorFailRetryCount = executorFailRetryCount; |
| | | } |
| | | |
| | | public Date getTriggerTime() { |
| | | return triggerTime; |
| | | } |
| | | |
| | | public void setTriggerTime(Date triggerTime) { |
| | | this.triggerTime = triggerTime; |
| | | } |
| | | |
| | | public int getTriggerCode() { |
| | | return triggerCode; |
| | | } |
| | | |
| | | public void setTriggerCode(int triggerCode) { |
| | | this.triggerCode = triggerCode; |
| | | } |
| | | |
| | | public String getTriggerMsg() { |
| | | return triggerMsg; |
| | | } |
| | | |
| | | public void setTriggerMsg(String triggerMsg) { |
| | | this.triggerMsg = triggerMsg; |
| | | } |
| | | |
| | | public Date getHandleTime() { |
| | | return handleTime; |
| | | } |
| | | |
| | | public void setHandleTime(Date handleTime) { |
| | | this.handleTime = handleTime; |
| | | } |
| | | |
| | | public int getHandleCode() { |
| | | return handleCode; |
| | | } |
| | | |
| | | public void setHandleCode(int handleCode) { |
| | | this.handleCode = handleCode; |
| | | } |
| | | |
| | | public String getHandleMsg() { |
| | | return handleMsg; |
| | | } |
| | | |
| | | public void setHandleMsg(String handleMsg) { |
| | | this.handleMsg = handleMsg; |
| | | } |
| | | |
| | | public int getAlarmStatus() { |
| | | return alarmStatus; |
| | | } |
| | | |
| | | public void setAlarmStatus(int alarmStatus) { |
| | | this.alarmStatus = alarmStatus; |
| | | } |
| | | |
| | | } |
| | |
| | | |
| | | /** |
| | | * xxl-job log for glue, used to track job code process |
| | | * |
| | | * @author xuxueli 2016-5-19 17:57:46 |
| | | */ |
| | | public class XxlJobLogGlue { |
| | | |
| | | private int id; |
| | | private int jobId; // 任务主键ID |
| | | private String glueType; // GLUE类型 #com.xxl.job.core.glue.GlueTypeEnum |
| | | private String glueSource; |
| | | private String glueRemark; |
| | | private Date addTime; |
| | | private Date updateTime; |
| | | |
| | | public int getId() { |
| | | return id; |
| | | } |
| | | |
| | | public void setId(int id) { |
| | | this.id = id; |
| | | } |
| | | |
| | | public int getJobId() { |
| | | return jobId; |
| | | } |
| | | |
| | | public void setJobId(int jobId) { |
| | | this.jobId = jobId; |
| | | } |
| | | |
| | | public String getGlueType() { |
| | | return glueType; |
| | | } |
| | | |
| | | public void setGlueType(String glueType) { |
| | | this.glueType = glueType; |
| | | } |
| | | |
| | | public String getGlueSource() { |
| | | return glueSource; |
| | | } |
| | | |
| | | public void setGlueSource(String glueSource) { |
| | | this.glueSource = glueSource; |
| | | } |
| | | |
| | | public String getGlueRemark() { |
| | | return glueRemark; |
| | | } |
| | | |
| | | public void setGlueRemark(String glueRemark) { |
| | | this.glueRemark = glueRemark; |
| | | } |
| | | |
| | | public Date getAddTime() { |
| | | return addTime; |
| | | } |
| | | |
| | | public void setAddTime(Date addTime) { |
| | | this.addTime = addTime; |
| | | } |
| | | |
| | | public Date getUpdateTime() { |
| | | return updateTime; |
| | | } |
| | | |
| | | public void setUpdateTime(Date updateTime) { |
| | | this.updateTime = updateTime; |
| | | } |
| | | |
| | | } |
| | |
| | | * @author xuxueli 2019-05-04 16:43:12 |
| | | */ |
| | | public class XxlJobUser { |
| | | |
| | | private int id; |
| | | private String username; // 账号 |
| | | private String password; // 密码 |
| | | private int role; // 角色:0-普通用户、1-管理员 |
| | | private String permission; // 权限:执行器ID列表,多个逗号分割 |
| | | |
| | | public int getId() { |
| | | return id; |
| | | } |
| | | |
| | | public void setId(int id) { |
| | | this.id = id; |
| | | } |
| | | |
| | | public String getUsername() { |
| | | return username; |
| | | } |
| | | |
| | | public void setUsername(String username) { |
| | | this.username = username; |
| | | } |
| | | |
| | | public String getPassword() { |
| | | return password; |
| | | } |
| | | |
| | | public void setPassword(String password) { |
| | | this.password = password; |
| | | } |
| | | |
| | | public int getRole() { |
| | | return role; |
| | | } |
| | | |
| | | public void setRole(int role) { |
| | | this.role = role; |
| | | } |
| | | |
| | | public String getPermission() { |
| | | return permission; |
| | | } |
| | | |
| | | public void setPermission(String permission) { |
| | | this.permission = permission; |
| | | } |
| | | |
| | | // plugin |
| | | public boolean validPermission(int jobGroup) { |
| | | if (this.role == 1) { |
| | | return true; |
| | | } else { |
| | | if (StringUtils.hasText(this.permission)) { |
| | | for (String permissionItem : this.permission.split(",")) { |
| | | if (String.valueOf(jobGroup).equals(permissionItem)) { |
| | | return true; |
| | | } |
| | | } |
| | | } |
| | | return false; |
| | | } |
| | | |
| | | } |
| | | |
| | | } |
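| | |
| | | A minimal usage sketch of validPermission above (the role value and job-group IDs are illustrative, not from the source): |
| | | |
| | | // role == 1 (admin) passes for every job group |
| | | XxlJobUser admin = new XxlJobUser(); |
| | | admin.setRole(1); |
| | | boolean adminOk = admin.validPermission(5); // true |
| | | |
| | | // a normal user only passes for job groups listed in the comma-separated permission string |
| | | XxlJobUser user = new XxlJobUser(); |
| | | user.setRole(0); |
| | | user.setPermission("2,7"); |
| | | boolean granted = user.validPermission(7); // true, "7" is in the list |
| | | boolean denied = user.validPermission(3); // false, not granted |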
| | |
| | | * @author xuxueli 2015-9-1 18:05:56 |
| | | */ |
| | | public class JobCompleteHelper { |
| | | private static Logger logger = LoggerFactory.getLogger(JobCompleteHelper.class); |
| | | |
| | | private static JobCompleteHelper instance = new JobCompleteHelper(); |
| | | |
| | | public static JobCompleteHelper getInstance() { |
| | | return instance; |
| | | } |
| | | |
| | | // ---------------------- monitor ---------------------- |
| | | |
| | | private ThreadPoolExecutor callbackThreadPool = null; |
| | | private Thread monitorThread; |
| | | private volatile boolean toStop = false; |
| | | |
| | | public void start() { |
| | | |
| | | // for callback |
| | | callbackThreadPool = new ThreadPoolExecutor( |
| | | 2, |
| | | 20, |
| | | 30L, |
| | | TimeUnit.SECONDS, |
| | | new LinkedBlockingQueue<Runnable>(3000), |
| | | new ThreadFactory() { |
| | | @Override |
| | | public Thread newThread(Runnable r) { |
| | | return new Thread(r, "xxl-job, admin JobLosedMonitorHelper-callbackThreadPool-" + r.hashCode()); |
| | | } |
| | | }, |
| | | new RejectedExecutionHandler() { |
| | | @Override |
| | | public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) { |
| | | r.run(); |
| | | logger.warn(">>>>>>>>>>> xxl-job, callback too fast, match threadpool rejected handler(run now)."); |
| | | } |
| | | }); |
| | | |
| | | // for monitor |
| | | monitorThread = new Thread(new Runnable() { |
| | | |
| | | @Override |
| | | public void run() { |
| | | |
| | | // wait for JobTriggerPoolHelper-init |
| | | try { |
| | | TimeUnit.MILLISECONDS.sleep(50); |
| | | } catch (InterruptedException e) { |
| | | if (!toStop) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | } |
| | | |
| | | // monitor |
| | | while (!toStop) { |
| | | try { |
| | | // 任务结果丢失处理:调度记录停留在 "运行中" 状态超过10min,且对应执行器心跳注册失败不在线,则将本地调度主动标记失败; |
| | | Date losedTime = DateUtil.addMinutes(new Date(), -10); |
| | | List<Long> losedJobIds = XxlJobAdminConfig.getAdminConfig().getXxlJobLogDao().findLostJobIds(losedTime); |
| | | |
| | | if (losedJobIds != null && losedJobIds.size() > 0) { |
| | | for (Long logId : losedJobIds) { |
| | | |
| | | XxlJobLog jobLog = new XxlJobLog(); |
| | | jobLog.setId(logId); |
| | | |
| | | jobLog.setHandleTime(new Date()); |
| | | jobLog.setHandleCode(ReturnT.FAIL_CODE); |
| | | jobLog.setHandleMsg(I18nUtil.getString("joblog_lost_fail")); |
| | | |
| | | XxlJobCompleter.updateHandleInfoAndFinish(jobLog); |
| | | } |
| | | |
| | | } |
| | | } catch (Exception e) { |
| | | if (!toStop) { |
| | | logger.error(">>>>>>>>>>> xxl-job, job fail monitor thread error:{}", e); |
| | | } |
| | | } |
| | | |
| | | try { |
| | | TimeUnit.SECONDS.sleep(60); |
| | | } catch (Exception e) { |
| | | if (!toStop) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | } |
| | | |
| | | } |
| | | |
| | | logger.info(">>>>>>>>>>> xxl-job, JobLosedMonitorHelper stop"); |
| | | |
| | | } |
| | | }); |
| | | monitorThread.setDaemon(true); |
| | | monitorThread.setName("xxl-job, admin JobLosedMonitorHelper"); |
| | | monitorThread.start(); |
| | | } |
| | | |
| | | public void toStop() { |
| | | toStop = true; |
| | | |
| | | // stop registryOrRemoveThreadPool |
| | | callbackThreadPool.shutdownNow(); |
| | | |
| | | // stop monitorThread (interrupt and wait) |
| | | monitorThread.interrupt(); |
| | | try { |
| | | monitorThread.join(); |
| | | } catch (InterruptedException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | } |
| | | |
| | | // ---------------------- helper ---------------------- |
| | | |
| | | public ReturnT<String> callback(List<HandleCallbackParam> callbackParamList) { |
| | | |
| | | callbackThreadPool.execute(new Runnable() { |
| | | @Override |
| | | public void run() { |
| | | for (HandleCallbackParam handleCallbackParam : callbackParamList) { |
| | | ReturnT<String> callbackResult = callback(handleCallbackParam); |
| | | logger.debug(">>>>>>>>> JobApiController.callback {}, handleCallbackParam={}, callbackResult={}", |
| | | (callbackResult.getCode() == ReturnT.SUCCESS_CODE ? "success" : "fail"), handleCallbackParam, callbackResult); |
| | | } |
| | | } |
| | | }); |
| | | |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | private ReturnT<String> callback(HandleCallbackParam handleCallbackParam) { |
| | | // valid log item |
| | | XxlJobLog log = XxlJobAdminConfig.getAdminConfig().getXxlJobLogDao().load(handleCallbackParam.getLogId()); |
| | | if (log == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, "log item not found."); |
| | | } |
| | | if (log.getHandleCode() > 0) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, "log repeate callback."); // avoid repeat callback, trigger child job etc |
| | | } |
| | | |
| | | // handle msg |
| | | StringBuffer handleMsg = new StringBuffer(); |
| | | if (log.getHandleMsg() != null) { |
| | | handleMsg.append(log.getHandleMsg()).append("<br>"); |
| | | } |
| | | if (handleCallbackParam.getHandleMsg() != null) { |
| | | handleMsg.append(handleCallbackParam.getHandleMsg()); |
| | | } |
| | | |
| | | // success, save log |
| | | log.setHandleTime(new Date()); |
| | | log.setHandleCode(handleCallbackParam.getHandleCode()); |
| | | log.setHandleMsg(handleMsg.toString()); |
| | | XxlJobCompleter.updateHandleInfoAndFinish(log); |
| | | |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | |
| | | } |
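| | |
| | | The custom RejectedExecutionHandler above degrades to "caller runs" when the 3000-slot callback queue is full. A hypothetical equivalent using the JDK built-in policy (the source keeps its own handler so it can also log a warning): |
| | | |
| | | ThreadPoolExecutor pool = new ThreadPoolExecutor( |
| | | 2, 20, 30L, TimeUnit.SECONDS, |
| | | new LinkedBlockingQueue<Runnable>(3000), |
| | | new ThreadPoolExecutor.CallerRunsPolicy()); // queue full -> the submitting thread runs the task itself |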
| | |
| | | * @author xuxueli 2015-9-1 18:05:56 |
| | | */ |
| | | public class JobFailMonitorHelper { |
| | | private static Logger logger = LoggerFactory.getLogger(JobFailMonitorHelper.class); |
| | | |
| | | private static JobFailMonitorHelper instance = new JobFailMonitorHelper(); |
| | | |
| | | public static JobFailMonitorHelper getInstance() { |
| | | return instance; |
| | | } |
| | | |
| | | // ---------------------- monitor ---------------------- |
| | | |
| | | private Thread monitorThread; |
| | | private volatile boolean toStop = false; |
| | | |
| | | public void start() { |
| | | monitorThread = new Thread(new Runnable() { |
| | | |
| | | @Override |
| | | public void run() { |
| | | |
| | | // monitor |
| | | while (!toStop) { |
| | | try { |
| | | |
| | | List<Long> failLogIds = XxlJobAdminConfig.getAdminConfig().getXxlJobLogDao().findFailJobLogIds(1000); |
| | | if (failLogIds != null && !failLogIds.isEmpty()) { |
| | | for (long failLogId : failLogIds) { |
| | | |
| | | // lock log |
| | | int lockRet = XxlJobAdminConfig.getAdminConfig().getXxlJobLogDao().updateAlarmStatus(failLogId, 0, -1); |
| | | if (lockRet < 1) { |
| | | continue; |
| | | } |
| | | XxlJobLog log = XxlJobAdminConfig.getAdminConfig().getXxlJobLogDao().load(failLogId); |
| | | XxlJobInfo info = XxlJobAdminConfig.getAdminConfig().getXxlJobInfoDao().loadById(log.getJobId()); |
| | | |
| | | // 1、fail retry monitor |
| | | if (log.getExecutorFailRetryCount() > 0) { |
| | | JobTriggerPoolHelper.trigger(log.getJobId(), TriggerTypeEnum.RETRY, (log.getExecutorFailRetryCount() - 1), log.getExecutorShardingParam(), log.getExecutorParam(), null); |
| | | String retryMsg = "<br><br><span style=\"color:#F39C12;\" > >>>>>>>>>>>" + I18nUtil.getString("jobconf_trigger_type_retry") + "<<<<<<<<<<< </span><br>"; |
| | | log.setTriggerMsg(log.getTriggerMsg() + retryMsg); |
| | | XxlJobAdminConfig.getAdminConfig().getXxlJobLogDao().updateTriggerInfo(log); |
| | | } |
| | | |
| | | // 2、fail alarm monitor |
| | | int newAlarmStatus = 0; // 告警状态:0-默认、-1=锁定状态、1-无需告警、2-告警成功、3-告警失败 |
| | | if (info != null && info.getAlarmEmail() != null && info.getAlarmEmail().trim().length() > 0) { |
| | | boolean alarmResult = XxlJobAdminConfig.getAdminConfig().getJobAlarmer().alarm(info, log); |
| | | newAlarmStatus = alarmResult ? 2 : 3; |
| | | } else { |
| | | newAlarmStatus = 1; |
| | | } |
| | | |
| | | XxlJobAdminConfig.getAdminConfig().getXxlJobLogDao().updateAlarmStatus(failLogId, -1, newAlarmStatus); |
| | | } |
| | | } |
| | | |
| | | } catch (Exception e) { |
| | | if (!toStop) { |
| | | logger.error(">>>>>>>>>>> xxl-job, job fail monitor thread error:{}", e); |
| | | } |
| | | } |
| | | |
| | | try { |
| | | TimeUnit.SECONDS.sleep(10); |
| | | } catch (Exception e) { |
| | | if (!toStop) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | } |
| | | |
| | | } |
| | | |
| | | logger.info(">>>>>>>>>>> xxl-job, job fail monitor thread stop"); |
| | | |
| | | } |
| | | }); |
| | | monitorThread.setDaemon(true); |
| | | monitorThread.setName("xxl-job, admin JobFailMonitorHelper"); |
| | | monitorThread.start(); |
| | | } |
| | | |
| | | public void toStop() { |
| | | toStop = true; |
| | | // interrupt and wait |
| | | monitorThread.interrupt(); |
| | | try { |
| | | monitorThread.join(); |
| | | } catch (InterruptedException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | } |
| | | |
| | | } |
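| | |
| | | The alarm_status transitions in the loop above follow the values documented in the inline comment; hypothetical constants (not part of the source) that spell them out: |
| | | |
| | | static final int ALARM_DEFAULT = 0;  // fresh fail log, not yet examined |
| | | static final int ALARM_LOCK = -1;    // claimed via updateAlarmStatus(failLogId, 0, -1) so only one scan handles it |
| | | static final int ALARM_NONE = 1;     // no alarm email configured, nothing to send |
| | | static final int ALARM_SUCCESS = 2;  // alarm sent successfully |
| | | static final int ALARM_FAIL = 3;     // alarm sending failed |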
| | |
| | | |
| | | /** |
| | | * job registry instance |
| | | * |
| | | * @author xuxueli 2016-10-02 19:10:24 |
| | | */ |
| | | public class JobRegistryHelper { |
| | | private static Logger logger = LoggerFactory.getLogger(JobRegistryHelper.class); |
| | | |
| | | private static JobRegistryHelper instance = new JobRegistryHelper(); |
| | | |
| | | public static JobRegistryHelper getInstance() { |
| | | return instance; |
| | | } |
| | | |
| | | private ThreadPoolExecutor registryOrRemoveThreadPool = null; |
| | | private Thread registryMonitorThread; |
| | | private volatile boolean toStop = false; |
| | | |
| | | public void start() { |
| | | |
| | | // for registry or remove |
| | | registryOrRemoveThreadPool = new ThreadPoolExecutor( |
| | | 2, |
| | | 10, |
| | | 30L, |
| | | TimeUnit.SECONDS, |
| | | new LinkedBlockingQueue<Runnable>(2000), |
| | | new ThreadFactory() { |
| | | @Override |
| | | public Thread newThread(Runnable r) { |
| | | return new Thread(r, "xxl-job, admin JobRegistryMonitorHelper-registryOrRemoveThreadPool-" + r.hashCode()); |
| | | } |
| | | }, |
| | | new RejectedExecutionHandler() { |
| | | @Override |
| | | public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) { |
| | | r.run(); |
| | | logger.warn(">>>>>>>>>>> xxl-job, registry or remove too fast, match threadpool rejected handler(run now)."); |
| | | } |
| | | }); |
| | | |
| | | // for monitor |
| | | registryMonitorThread = new Thread(new Runnable() { |
| | | @Override |
| | | public void run() { |
| | | while (!toStop) { |
| | | try { |
| | | // auto registry group |
| | | List<XxlJobGroup> groupList = XxlJobAdminConfig.getAdminConfig().getXxlJobGroupDao().findByAddressType(0); |
| | | if (groupList != null && !groupList.isEmpty()) { |
| | | |
| | | // remove dead address (admin/executor) |
| | | List<Integer> ids = XxlJobAdminConfig.getAdminConfig().getXxlJobRegistryDao().findDead(RegistryConfig.DEAD_TIMEOUT, new Date()); |
| | | if (ids != null && ids.size() > 0) { |
| | | XxlJobAdminConfig.getAdminConfig().getXxlJobRegistryDao().removeDead(ids); |
| | | } |
| | | |
| | | // fresh online address (admin/executor) |
| | | HashMap<String, List<String>> appAddressMap = new HashMap<String, List<String>>(); |
| | | List<XxlJobRegistry> list = XxlJobAdminConfig.getAdminConfig().getXxlJobRegistryDao().findAll(RegistryConfig.DEAD_TIMEOUT, new Date()); |
| | | if (list != null) { |
| | | for (XxlJobRegistry item : list) { |
| | | if (RegistryConfig.RegistType.EXECUTOR.name().equals(item.getRegistryGroup())) { |
| | | String appname = item.getRegistryKey(); |
| | | List<String> registryList = appAddressMap.get(appname); |
| | | if (registryList == null) { |
| | | registryList = new ArrayList<String>(); |
| | | } |
| | | |
| | | if (!registryList.contains(item.getRegistryValue())) { |
| | | registryList.add(item.getRegistryValue()); |
| | | } |
| | | appAddressMap.put(appname, registryList); |
| | | } |
| | | } |
| | | } |
| | | |
| | | // fresh group address |
| | | for (XxlJobGroup group : groupList) { |
| | | List<String> registryList = appAddressMap.get(group.getAppname()); |
| | | String addressListStr = null; |
| | | if (registryList != null && !registryList.isEmpty()) { |
| | | Collections.sort(registryList); |
| | | StringBuilder addressListSB = new StringBuilder(); |
| | | for (String item : registryList) { |
| | | addressListSB.append(item).append(","); |
| | | } |
| | | addressListStr = addressListSB.toString(); |
| | | addressListStr = addressListStr.substring(0, addressListStr.length() - 1); |
| | | } |
| | | group.setAddressList(addressListStr); |
| | | group.setUpdateTime(new Date()); |
| | | |
| | | XxlJobAdminConfig.getAdminConfig().getXxlJobGroupDao().update(group); |
| | | } |
| | | } |
| | | } catch (Exception e) { |
| | | if (!toStop) { |
| | | logger.error(">>>>>>>>>>> xxl-job, job registry monitor thread error:{}", e); |
| | | } |
| | | } |
| | | try { |
| | | TimeUnit.SECONDS.sleep(RegistryConfig.BEAT_TIMEOUT); |
| | | } catch (InterruptedException e) { |
| | | if (!toStop) { |
| | | logger.error(">>>>>>>>>>> xxl-job, job registry monitor thread error:{}", e); |
| | | } |
| | | } |
| | | } |
| | | logger.info(">>>>>>>>>>> xxl-job, job registry monitor thread stop"); |
| | | } |
| | | }); |
| | | registryMonitorThread.setDaemon(true); |
| | | registryMonitorThread.setName("xxl-job, admin JobRegistryMonitorHelper-registryMonitorThread"); |
| | | registryMonitorThread.start(); |
| | | } |
| | | |
| | | public void toStop() { |
| | | toStop = true; |
| | | |
| | | // stop registryOrRemoveThreadPool |
| | | registryOrRemoveThreadPool.shutdownNow(); |
| | | |
| | | // stop monitor (interrupt and wait) |
| | | registryMonitorThread.interrupt(); |
| | | try { |
| | | registryMonitorThread.join(); |
| | | } catch (InterruptedException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | } |
| | | |
| | | // ---------------------- helper ---------------------- |
| | | |
| | | public ReturnT<String> registry(RegistryParam registryParam) { |
| | | |
| | | // valid |
| | | if (!StringUtils.hasText(registryParam.getRegistryGroup()) |
| | | || !StringUtils.hasText(registryParam.getRegistryKey()) |
| | | || !StringUtils.hasText(registryParam.getRegistryValue())) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, "Illegal Argument."); |
| | | } |
| | | |
| | | // async execute |
| | | registryOrRemoveThreadPool.execute(new Runnable() { |
| | | @Override |
| | | public void run() { |
| | | int ret = XxlJobAdminConfig.getAdminConfig().getXxlJobRegistryDao().registryUpdate(registryParam.getRegistryGroup(), registryParam.getRegistryKey(), registryParam.getRegistryValue(), new Date()); |
| | | if (ret < 1) { |
| | | XxlJobAdminConfig.getAdminConfig().getXxlJobRegistryDao().registrySave(registryParam.getRegistryGroup(), registryParam.getRegistryKey(), registryParam.getRegistryValue(), new Date()); |
| | | |
| | | // fresh |
| | | freshGroupRegistryInfo(registryParam); |
| | | } |
| | | } |
| | | }); |
| | | |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | public ReturnT<String> registryRemove(RegistryParam registryParam) { |
| | | |
| | | // valid |
| | | if (!StringUtils.hasText(registryParam.getRegistryGroup()) |
| | | || !StringUtils.hasText(registryParam.getRegistryKey()) |
| | | || !StringUtils.hasText(registryParam.getRegistryValue())) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, "Illegal Argument."); |
| | | } |
| | | |
| | | // async execute |
| | | registryOrRemoveThreadPool.execute(new Runnable() { |
| | | @Override |
| | | public void run() { |
| | | int ret = XxlJobAdminConfig.getAdminConfig().getXxlJobRegistryDao().registryDelete(registryParam.getRegistryGroup(), registryParam.getRegistryKey(), registryParam.getRegistryValue()); |
| | | if (ret > 0) { |
| | | // fresh |
| | | freshGroupRegistryInfo(registryParam); |
| | | } |
| | | } |
| | | }); |
| | | |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | private void freshGroupRegistryInfo(RegistryParam registryParam) { |
| | | // Under consideration, prevent affecting core tables |
| | | } |
| | | |
| | | |
| | | } |
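| | |
| | | A minimal sketch of the executor-side heartbeat that feeds registry(...) above (the appname and address values are illustrative, and the three-argument RegistryParam constructor is assumed from xxl-job-core): |
| | | |
| | | RegistryParam param = new RegistryParam( |
| | | RegistryConfig.RegistType.EXECUTOR.name(), // registryGroup: "EXECUTOR" |
| | | "xxl-job-executor-sample",                 // registryKey: executor appname, grouped into appAddressMap |
| | | "http://192.168.1.10:9999/");              // registryValue: executor address, merged into group.addressList |
| | | JobRegistryHelper.getInstance().registry(param); // upsert the beat; the monitor thread refreshes group addresses |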
| | |
| | | private ThreadPoolExecutor fastTriggerPool = null; |
| | | private ThreadPoolExecutor slowTriggerPool = null; |
| | | |
| | | public void start() { |
| | | fastTriggerPool = new ThreadPoolExecutor( |
| | | 10, |
| | | XxlJobAdminConfig.getAdminConfig().getTriggerPoolFastMax(), |
| | | 60L, |
| | | TimeUnit.SECONDS, |
| | | new LinkedBlockingQueue<Runnable>(1000), |
| | | new ThreadFactory() { |
| | | @Override |
| | | public Thread newThread(Runnable r) { |
| | | return new Thread(r, "xxl-job, admin JobTriggerPoolHelper-fastTriggerPool-" + r.hashCode()); |
| | | } |
| | | }); |
| | | |
| | | slowTriggerPool = new ThreadPoolExecutor( |
| | | 10, |
| | | XxlJobAdminConfig.getAdminConfig().getTriggerPoolSlowMax(), |
| | | 60L, |
| | | TimeUnit.SECONDS, |
| | | new LinkedBlockingQueue<Runnable>(2000), |
| | | new ThreadFactory() { |
| | | @Override |
| | | public Thread newThread(Runnable r) { |
| | | return new Thread(r, "xxl-job, admin JobTriggerPoolHelper-slowTriggerPool-" + r.hashCode()); |
| | | } |
| | | }); |
| | | } |
| | | |
| | | |
| | |
| | | |
| | | |
| | | // job timeout count |
| | | private volatile long minTim = System.currentTimeMillis() / 60000; // ms > min |
| | | private volatile ConcurrentMap<Integer, AtomicInteger> jobTimeoutCountMap = new ConcurrentHashMap<>(); |
| | | |
| | | |
| | |
| | | // choose thread pool |
| | | ThreadPoolExecutor triggerPool_ = fastTriggerPool; |
| | | AtomicInteger jobTimeoutCount = jobTimeoutCountMap.get(jobId); |
| | | if (jobTimeoutCount != null && jobTimeoutCount.get() > 10) { // job-timeout 10 times in 1 min |
| | | triggerPool_ = slowTriggerPool; |
| | | } |
| | | |
| | |
| | | } finally { |
| | | |
| | | // check timeout-count-map |
| | | long minTim_now = System.currentTimeMillis() / 60000; |
| | | if (minTim != minTim_now) { |
| | | minTim = minTim_now; |
| | | jobTimeoutCountMap.clear(); |
| | | } |
| | | |
| | | // incr timeout-count-map |
| | | long cost = System.currentTimeMillis() - start; |
| | | if (cost > 500) { // job-timeout threshold 500ms |
| | | AtomicInteger timeoutCount = jobTimeoutCountMap.putIfAbsent(jobId, new AtomicInteger(1)); |
| | | if (timeoutCount != null) { |
| | |
| | | } |
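| | |
| | | A simplified sketch (hypothetical helper, not the full addTrigger method) of the routing decision shown above: |
| | | |
| | | private ThreadPoolExecutor choosePool(int jobId) { |
| | | AtomicInteger jobTimeoutCount = jobTimeoutCountMap.get(jobId); |
| | | // a job that crossed the 500 ms threshold more than 10 times within the current minute |
| | | // is treated as slow and routed to slowTriggerPool so it cannot starve fast jobs |
| | | return (jobTimeoutCount != null && jobTimeoutCount.get() > 10) ? slowTriggerPool : fastTriggerPool; |
| | | } |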
| | | |
| | | |
| | | |
| | | // ---------------------- helper ---------------------- |
| | | |
| | | private static JobTriggerPoolHelper helper = new JobTriggerPoolHelper(); |
| | |
| | | public static void toStart() { |
| | | helper.start(); |
| | | } |
| | | |
| | | public static void toStop() { |
| | | helper.stop(); |
| | | } |
| | |
| | | /** |
| | | * @param jobId |
| | | * @param triggerType |
| | | * @param failRetryCount >=0: use this param |
| | | * <0: use param from job info config |
| | | * @param executorShardingParam |
| | | * @param executorParam null: use job param |
| | | * not null: cover job param |
| | | */ |
| | | public static void trigger(int jobId, TriggerTypeEnum triggerType, int failRetryCount, String executorShardingParam, String executorParam, String addressList) { |
| | | helper.addTrigger(jobId, triggerType, failRetryCount, executorShardingParam, executorParam, addressList); |
| | |
| | | */ |
| | | public class CookieUtil { |
| | | |
| | | // 默认缓存时间,单位/秒, 2H |
| | | private static final int COOKIE_MAX_AGE = Integer.MAX_VALUE; |
| | | // 保存路径,根路径 |
| | | private static final String COOKIE_PATH = "/"; |
| | | |
| | | /** |
| | | * 保存 |
| | | * |
| | | * @param response |
| | | * @param key |
| | | * @param value |
| | | * @param ifRemember |
| | | */ |
| | | public static void set(HttpServletResponse response, String key, String value, boolean ifRemember) { |
| | | int age = ifRemember ? COOKIE_MAX_AGE : -1; |
| | | set(response, key, value, null, COOKIE_PATH, age, true); |
| | | } |
| | | |
| | | /** |
| | | * 保存 |
| | | * |
| | | * @param response |
| | | * @param key |
| | | * @param value |
| | | * @param maxAge |
| | | */ |
| | | private static void set(HttpServletResponse response, String key, String value, String domain, String path, int maxAge, boolean isHttpOnly) { |
| | | Cookie cookie = new Cookie(key, value); |
| | | if (domain != null) { |
| | | cookie.setDomain(domain); |
| | | } |
| | | cookie.setPath(path); |
| | | cookie.setMaxAge(maxAge); |
| | | cookie.setHttpOnly(isHttpOnly); |
| | | response.addCookie(cookie); |
| | | } |
| | | |
| | | /** |
| | | * 查询value |
| | | * |
| | | * @param request |
| | | * @param key |
| | | * @return |
| | | */ |
| | | public static String getValue(HttpServletRequest request, String key) { |
| | | Cookie cookie = get(request, key); |
| | | if (cookie != null) { |
| | | return cookie.getValue(); |
| | | } |
| | | return null; |
| | | } |
| | | |
| | | /** |
| | | * 查询Cookie |
| | | * |
| | | * @param request |
| | | * @param key |
| | | */ |
| | | private static Cookie get(HttpServletRequest request, String key) { |
| | | Cookie[] arr_cookie = request.getCookies(); |
| | | if (arr_cookie != null && arr_cookie.length > 0) { |
| | | for (Cookie cookie : arr_cookie) { |
| | | if (cookie.getName().equals(key)) { |
| | | return cookie; |
| | | } |
| | | } |
| | | } |
| | | return null; |
| | | } |
| | | |
| | | /** |
| | | * 删除Cookie |
| | | * |
| | | * @param request |
| | | * @param response |
| | | * @param key |
| | | */ |
| | | public static void remove(HttpServletRequest request, HttpServletResponse response, String key) { |
| | | Cookie cookie = get(request, key); |
| | | if (cookie != null) { |
| | | set(response, key, "", null, COOKIE_PATH, 0, true); |
| | | } |
| | | } |
| | | |
| | | } |
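| | |
| | | A minimal usage sketch of CookieUtil (the cookie name and the servlet handler method are hypothetical): |
| | | |
| | | public void rememberLogin(HttpServletRequest request, HttpServletResponse response, String token) { |
| | | // ifRemember=true keeps the cookie for COOKIE_MAX_AGE, false makes it a session cookie (maxAge -1) |
| | | CookieUtil.set(response, "login_identity", token, true); |
| | | |
| | | String stored = CookieUtil.getValue(request, "login_identity"); // null if the cookie is absent |
| | | if (stored == null) { |
| | | CookieUtil.remove(request, response, "login_identity"); // overwrites with an empty value and maxAge 0 |
| | | } |
| | | } |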
| | |
| | | |
| | | /** |
| | | * Jackson util |
| | | * |
| | | * <p> |
| | | * 1、obj need private and set/get; |
| | | * 2、do not support inner class; |
| | | * |
| | | * @author xuxueli 2015-9-25 18:02:56 |
| | | */ |
| | | public class JacksonUtil { |
| | | private static Logger logger = LoggerFactory.getLogger(JacksonUtil.class); |
| | | |
| | | private final static ObjectMapper objectMapper = new ObjectMapper(); |
| | | |
| | | public static ObjectMapper getInstance() { |
| | | return objectMapper; |
| | | } |
| | | |
| | | /** |
| | | * bean、array、List、Map --> json |
| | | * |
| | | * @param obj |
| | | * @return json string |
| | | * @throws Exception |
| | | */ |
| | | public static String writeValueAsString(Object obj) { |
| | | try { |
| | | return getInstance().writeValueAsString(obj); |
| | | } catch (JsonGenerationException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } catch (JsonMappingException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } catch (IOException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | return null; |
| | | } |
| | | |
| | | /** |
| | | * string --> bean、Map、List(array) |
| | | * |
| | | * @param jsonStr |
| | | * @param clazz |
| | | * @return obj |
| | | * @throws Exception |
| | | */ |
| | | public static <T> T readValue(String jsonStr, Class<T> clazz) { |
| | | try { |
| | | return getInstance().readValue(jsonStr, clazz); |
| | | } catch (JsonParseException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } catch (JsonMappingException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } catch (IOException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | return null; |
| | | } |
| | | |
| | | /** |
| | | * string --> List<Bean>... |
| | | * |
| | | * @param jsonStr |
| | | * @param parametrized |
| | | * @param parameterClasses |
| | | * @param <T> |
| | | * @return |
| | | */ |
| | | public static <T> T readValue(String jsonStr, Class<?> parametrized, Class<?>... parameterClasses) { |
| | | try { |
| | | JavaType javaType = getInstance().getTypeFactory().constructParametricType(parametrized, parameterClasses); |
| | | return getInstance().readValue(jsonStr, javaType); |
| | | } catch (JsonParseException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } catch (JsonMappingException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } catch (IOException e) { |
| | | logger.error(e.getMessage(), e); |
| | | } |
| | | return null; |
| | | } |
| | | } |
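| | |
| | | A minimal usage sketch of JacksonUtil (the map content is illustrative): |
| | | |
| | | Map<String, Object> data = new HashMap<String, Object>(); |
| | | data.put("code", 200); |
| | | data.put("msg", "ok"); |
| | | |
| | | String json = JacksonUtil.writeValueAsString(data);  // {"code":200,"msg":"ok"} |
| | | Map result = JacksonUtil.readValue(json, Map.class); // plain class binding |
| | | Map<String, Object> typed = JacksonUtil.readValue(json, Map.class, String.class, Object.class); // parametrized binding via constructParametricType |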
| | |
| | | |
| | | /** |
| | | * job info |
| | | * |
| | | * @author xuxueli 2016-1-12 18:03:45 |
| | | */ |
| | | @Mapper |
| | | public interface XxlJobInfoDao { |
| | | |
| | | public List<XxlJobInfo> pageList(@Param("offset") int offset, |
| | | @Param("pagesize") int pagesize, |
| | | @Param("jobGroup") int jobGroup, |
| | | @Param("triggerStatus") int triggerStatus, |
| | | @Param("jobDesc") String jobDesc, |
| | | @Param("executorHandler") String executorHandler, |
| | | @Param("author") String author); |
| | | |
| | | public int pageListCount(@Param("offset") int offset, |
| | | @Param("pagesize") int pagesize, |
| | | @Param("jobGroup") int jobGroup, |
| | | @Param("triggerStatus") int triggerStatus, |
| | | @Param("jobDesc") String jobDesc, |
| | | @Param("executorHandler") String executorHandler, |
| | | @Param("author") String author); |
| | | |
| | | public int save(XxlJobInfo info); |
| | | |
| | | public XxlJobInfo loadById(@Param("id") int id); |
| | | |
| | | public int update(XxlJobInfo xxlJobInfo); |
| | | |
| | | public int delete(@Param("id") long id); |
| | | |
| | | public List<XxlJobInfo> getJobsByGroup(@Param("jobGroup") int jobGroup); |
| | | |
| | | public int findAllCount(); |
| | | |
| | | public List<XxlJobInfo> scheduleJobQuery(@Param("maxNextTime") long maxNextTime, @Param("pagesize") int pagesize); |
| | | |
| | | public int scheduleUpdate(XxlJobInfo xxlJobInfo); |
| | | |
| | | |
| | | } |
| | |
| | | |
| | | /** |
| | | * job log |
| | | * |
| | | * @author xuxueli 2016-1-12 18:03:06 |
| | | */ |
| | | @Mapper |
| | | public interface XxlJobLogDao { |
| | | |
| | | // exist jobId not use jobGroup, not exist use jobGroup |
| | | public List<XxlJobLog> pageList(@Param("offset") int offset, |
| | | @Param("pagesize") int pagesize, |
| | | @Param("jobGroup") int jobGroup, |
| | | @Param("jobId") int jobId, |
| | | @Param("triggerTimeStart") Date triggerTimeStart, |
| | | @Param("triggerTimeEnd") Date triggerTimeEnd, |
| | | @Param("logStatus") int logStatus); |
| | | |
| | | public int pageListCount(@Param("offset") int offset, |
| | | @Param("pagesize") int pagesize, |
| | | @Param("jobGroup") int jobGroup, |
| | | @Param("jobId") int jobId, |
| | | @Param("triggerTimeStart") Date triggerTimeStart, |
| | | @Param("triggerTimeEnd") Date triggerTimeEnd, |
| | | @Param("logStatus") int logStatus); |
| | | |
| | | public XxlJobLog load(@Param("id") long id); |
| | | |
| | | public long save(XxlJobLog xxlJobLog); |
| | | |
| | | public int updateTriggerInfo(XxlJobLog xxlJobLog); |
| | | |
| | | public int updateHandleInfo(XxlJobLog xxlJobLog); |
| | | |
| | | public int delete(@Param("jobId") int jobId); |
| | | |
| | | public Map<String, Object> findLogReport(@Param("from") Date from, |
| | | @Param("to") Date to); |
| | | |
| | | public List<Long> findClearLogIds(@Param("jobGroup") int jobGroup, |
| | | @Param("jobId") int jobId, |
| | | @Param("clearBeforeTime") Date clearBeforeTime, |
| | | @Param("clearBeforeNum") int clearBeforeNum, |
| | | @Param("pagesize") int pagesize); |
| | | |
| | | public int clearLog(@Param("logIds") List<Long> logIds); |
| | | |
| | | public List<Long> findFailJobLogIds(@Param("pagesize") int pagesize); |
| | | |
| | | public int updateAlarmStatus(@Param("logId") long logId, |
| | | @Param("oldAlarmStatus") int oldAlarmStatus, |
| | | @Param("newAlarmStatus") int newAlarmStatus); |
| | | |
| | | public List<Long> findLostJobIds(@Param("losedTime") Date losedTime); |
| | | |
| | | } |
| | |
| | | |
| | | /** |
| | | * job log for glue |
| | | * |
| | | * @author xuxueli 2016-5-19 18:04:56 |
| | | */ |
| | | @Mapper |
| | | public interface XxlJobLogGlueDao { |
| | | |
| | | public int save(XxlJobLogGlue xxlJobLogGlue); |
| | | |
| | | public List<XxlJobLogGlue> findByJobId(@Param("jobId") int jobId); |
| | | |
| | | public int removeOld(@Param("jobId") int jobId, @Param("limit") int limit); |
| | | |
| | | public int deleteByJobId(@Param("jobId") int jobId); |
| | | |
| | | } |
| | |
| | | |
| | | /** |
| | | * job log |
| | | * |
| | | * @author xuxueli 2019-11-22 |
| | | */ |
| | | @Mapper |
| | | public interface XxlJobLogReportDao { |
| | | |
| | | public int save(XxlJobLogReport xxlJobLogReport); |
| | | |
| | | public int update(XxlJobLogReport xxlJobLogReport); |
| | | |
| | | public List<XxlJobLogReport> queryLogReport(@Param("triggerDayFrom") Date triggerDayFrom, |
| | | @Param("triggerDayTo") Date triggerDayTo); |
| | | |
| | | public XxlJobLogReport queryLogReportTotal(); |
| | | |
| | | } |
| | |
| | | import com.xxl.job.admin.core.model.XxlJobUser; |
| | | import org.apache.ibatis.annotations.Mapper; |
| | | import org.apache.ibatis.annotations.Param; |
| | | |
| | | import java.util.List; |
| | | |
| | | /** |
| | |
| | | @Mapper |
| | | public interface XxlJobUserDao { |
| | | |
| | | public List<XxlJobUser> pageList(@Param("offset") int offset, |
| | | @Param("pagesize") int pagesize, |
| | | @Param("username") String username, |
| | | @Param("role") int role); |
| | | |
| | | public int pageListCount(@Param("offset") int offset, |
| | | @Param("pagesize") int pagesize, |
| | | @Param("username") String username, |
| | | @Param("role") int role); |
| | | |
| | | public XxlJobUser loadByUserName(@Param("username") String username); |
| | | |
| | | public int save(XxlJobUser xxlJobUser); |
| | | |
| | | public int update(XxlJobUser xxlJobUser); |
| | | |
| | | public int delete(@Param("id") int id); |
| | | |
| | | } |
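| | | For context, loadByUserName is the lookup a login check builds on. A minimal sketch follows; it is illustrative only and assumes the stored password is an MD5 hex hash with a getPassword accessor on XxlJobUser (neither appears in the code above): |
| | | |
| | | // Illustrative login check built on XxlJobUserDao.loadByUserName. |
| | | public boolean validLogin(String username, String password) { |
| | | XxlJobUser user = xxlJobUserDao.loadByUserName(username); |
| | | if (user == null) { |
| | | return false; |
| | | } |
| | | // assumption: passwords are stored as MD5 hex (Spring's DigestUtils) |
| | | String passwordMd5 = DigestUtils.md5DigestAsHex(password.getBytes()); |
| | | return passwordMd5.equals(user.getPassword()); |
| | | } |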
| | |
| | | |
| | | /** |
| | | * core job action for xxl-job |
| | | * |
| | | * @author xuxueli 2016-5-28 15:30:33 |
| | | */ |
| | | public interface XxlJobService { |
| | | |
| | | /** |
| | | * page list |
| | | * |
| | | * @param start |
| | | * @param length |
| | | * @param jobGroup |
| | | * @param triggerStatus |
| | | * @param jobDesc |
| | | * @param executorHandler |
| | | * @param author |
| | | * @return |
| | | */ |
| | | public Map<String, Object> pageList(int start, int length, int jobGroup, int triggerStatus, String jobDesc, String executorHandler, String author); |
| | | |
| | | /** |
| | | * add job |
| | | * |
| | | * @param jobInfo |
| | | * @return |
| | | */ |
| | | public ReturnT<String> add(XxlJobInfo jobInfo); |
| | | |
| | | /** |
| | | * update job |
| | | * |
| | | * @param jobInfo |
| | | * @return |
| | | */ |
| | | public ReturnT<String> update(XxlJobInfo jobInfo); |
| | | |
| | | /** |
| | | * remove job |
| | | * |
| | | * @param id |
| | | * @return |
| | | */ |
| | | public ReturnT<String> remove(int id); |
| | | |
| | | /** |
| | | * start job |
| | | * |
| | | * @param id |
| | | * @return |
| | | */ |
| | | public ReturnT<String> start(int id); |
| | | |
| | | /** |
| | | * stop job |
| | | * |
| | | * @param id |
| | | * @return |
| | | */ |
| | | public ReturnT<String> stop(int id); |
| | | |
| | | /** |
| | | * dashboard info |
| | | * |
| | | * @return |
| | | */ |
| | | public Map<String, Object> dashboardInfo(); |
| | | |
| | | /** |
| | | * chart info |
| | | * |
| | | * @param startDate |
| | | * @param endDate |
| | | * @return |
| | | */ |
| | | public ReturnT<Map<String, Object>> chartInfo(Date startDate, Date endDate); |
| | | |
| | | } |
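| | | A minimal sketch of how a web layer might drive this interface follows. The controller class, mappings, and parameter choices are illustrative, not part of the original code; only the XxlJobService methods themselves come from the interface above. Passing triggerStatus = -1 and null text filters skips those conditions, as the pageList mapper XML further down shows. |
| | | |
| | | // Illustrative caller of XxlJobService; annotations assume Spring Web is on the classpath. |
| | | @RestController |
| | | @RequestMapping("/jobinfo") |
| | | public class JobInfoDemoController { |
| | | |
| | | @Resource |
| | | private XxlJobService xxlJobService; |
| | | |
| | | // page through the jobs of one executor group, ignoring status and text filters |
| | | @GetMapping("/pageList") |
| | | public Map<String, Object> pageList(int start, int length, int jobGroup) { |
| | | return xxlJobService.pageList(start, length, jobGroup, -1, null, null, null); |
| | | } |
| | | |
| | | // switch scheduling on or off for a single job id |
| | | @PostMapping("/start") |
| | | public ReturnT<String> start(int id) { |
| | | return xxlJobService.start(id); |
| | | } |
| | | |
| | | @PostMapping("/stop") |
| | | public ReturnT<String> stop(int id) { |
| | | return xxlJobService.stop(id); |
| | | } |
| | | } |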
| | |
| | | |
| | | /** |
| | | * core job action for xxl-job |
| | | * |
| | | * @author xuxueli 2016-5-28 15:30:33 |
| | | */ |
| | | @Service |
| | | public class XxlJobServiceImpl implements XxlJobService { |
| | | private static Logger logger = LoggerFactory.getLogger(XxlJobServiceImpl.class); |
| | | |
| | | @Resource |
| | | private XxlJobGroupDao xxlJobGroupDao; |
| | | @Resource |
| | | private XxlJobInfoDao xxlJobInfoDao; |
| | | @Resource |
| | | public XxlJobLogDao xxlJobLogDao; |
| | | @Resource |
| | | private XxlJobLogGlueDao xxlJobLogGlueDao; |
| | | @Resource |
| | | private XxlJobLogReportDao xxlJobLogReportDao; |
| | | |
| | | @Override |
| | | public Map<String, Object> pageList(int start, int length, int jobGroup, int triggerStatus, String jobDesc, String executorHandler, String author) { |
| | | |
| | | // page list |
| | | List<XxlJobInfo> list = xxlJobInfoDao.pageList(start, length, jobGroup, triggerStatus, jobDesc, executorHandler, author); |
| | | int list_count = xxlJobInfoDao.pageListCount(start, length, jobGroup, triggerStatus, jobDesc, executorHandler, author); |
| | | |
| | | // package result |
| | | Map<String, Object> maps = new HashMap<String, Object>(); |
| | | maps.put("recordsTotal", list_count); // 总记录数 |
| | | maps.put("recordsFiltered", list_count); // 过滤后的总记录数 |
| | | maps.put("data", list); // 分页列表 |
| | | return maps; |
| | | } |
| | | |
| | | @Override |
| | | public ReturnT<String> add(XxlJobInfo jobInfo) { |
| | | |
| | | // valid base |
| | | XxlJobGroup group = xxlJobGroupDao.load(jobInfo.getJobGroup()); |
| | | if (group == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("system_please_choose") + I18nUtil.getString("jobinfo_field_jobgroup"))); |
| | | } |
| | | if (jobInfo.getJobDesc() == null || jobInfo.getJobDesc().trim().length() == 0) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("system_please_input") + I18nUtil.getString("jobinfo_field_jobdesc"))); |
| | | } |
| | | if (jobInfo.getAuthor() == null || jobInfo.getAuthor().trim().length() == 0) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("system_please_input") + I18nUtil.getString("jobinfo_field_author"))); |
| | | } |
| | | |
| | | // valid trigger |
| | | ScheduleTypeEnum scheduleTypeEnum = ScheduleTypeEnum.match(jobInfo.getScheduleType(), null); |
| | | if (scheduleTypeEnum == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | if (scheduleTypeEnum == ScheduleTypeEnum.CRON) { |
| | | if (jobInfo.getScheduleConf() == null || !CronExpression.isValidExpression(jobInfo.getScheduleConf())) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, "Cron" + I18nUtil.getString("system_unvalid")); |
| | | } |
| | | } else if (scheduleTypeEnum == ScheduleTypeEnum.FIX_RATE/* || scheduleTypeEnum == ScheduleTypeEnum.FIX_DELAY*/) { |
| | | if (jobInfo.getScheduleConf() == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type"))); |
| | | } |
| | | try { |
| | | int fixSecond = Integer.valueOf(jobInfo.getScheduleConf()); |
| | | if (fixSecond < 1) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | } catch (Exception e) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | } |
| | | |
| | | // valid job |
| | | if (GlueTypeEnum.match(jobInfo.getGlueType()) == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_gluetype") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | if (GlueTypeEnum.BEAN == GlueTypeEnum.match(jobInfo.getGlueType()) && (jobInfo.getExecutorHandler() == null || jobInfo.getExecutorHandler().trim().length() == 0)) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("system_please_input") + "JobHandler")); |
| | | } |
| | | // 》fix "\r" in shell |
| | | if (GlueTypeEnum.GLUE_SHELL == GlueTypeEnum.match(jobInfo.getGlueType()) && jobInfo.getGlueSource() != null) { |
| | | jobInfo.setGlueSource(jobInfo.getGlueSource().replaceAll("\r", "")); |
| | | } |
| | | |
| | | // valid advanced |
| | | if (ExecutorRouteStrategyEnum.match(jobInfo.getExecutorRouteStrategy(), null) == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_executorRouteStrategy") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | if (MisfireStrategyEnum.match(jobInfo.getMisfireStrategy(), null) == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("misfire_strategy") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | if (ExecutorBlockStrategyEnum.match(jobInfo.getExecutorBlockStrategy(), null) == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_executorBlockStrategy") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | |
| | | // 》ChildJobId valid |
| | | if (jobInfo.getChildJobId() != null && jobInfo.getChildJobId().trim().length() > 0) { |
| | | String[] childJobIds = jobInfo.getChildJobId().split(","); |
| | | for (String childJobIdItem : childJobIds) { |
| | | if (childJobIdItem != null && childJobIdItem.trim().length() > 0 && isNumeric(childJobIdItem)) { |
| | | XxlJobInfo childJobInfo = xxlJobInfoDao.loadById(Integer.parseInt(childJobIdItem)); |
| | | if (childJobInfo == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, |
| | | MessageFormat.format((I18nUtil.getString("jobinfo_field_childJobId") + "({0})" + I18nUtil.getString("system_not_found")), childJobIdItem)); |
| | | } |
| | | } else { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, |
| | | MessageFormat.format((I18nUtil.getString("jobinfo_field_childJobId") + "({0})" + I18nUtil.getString("system_unvalid")), childJobIdItem)); |
| | | } |
| | | } |
| | | |
| | | // join , avoid "xxx,," |
| | | String temp = ""; |
| | | for (String item : childJobIds) { |
| | | temp += item + ","; |
| | | } |
| | | temp = temp.substring(0, temp.length() - 1); |
| | | |
| | | jobInfo.setChildJobId(temp); |
| | | } |
| | | |
| | | // add in db |
| | | jobInfo.setAddTime(new Date()); |
| | | jobInfo.setUpdateTime(new Date()); |
| | | jobInfo.setGlueUpdatetime(new Date()); |
| | | xxlJobInfoDao.save(jobInfo); |
| | | if (jobInfo.getId() < 1) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_add") + I18nUtil.getString("system_fail"))); |
| | | } |
| | | |
| | | return new ReturnT<String>(String.valueOf(jobInfo.getId())); |
| | | } |
| | | |
| | | private boolean isNumeric(String str) { |
| | | try { |
| | | int result = Integer.valueOf(str); |
| | | return true; |
| | | } catch (NumberFormatException e) { |
| | | return false; |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public ReturnT<String> update(XxlJobInfo jobInfo) { |
| | | |
| | | // valid base |
| | | if (jobInfo.getJobDesc() == null || jobInfo.getJobDesc().trim().length() == 0) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("system_please_input") + I18nUtil.getString("jobinfo_field_jobdesc"))); |
| | | } |
| | | if (jobInfo.getAuthor() == null || jobInfo.getAuthor().trim().length() == 0) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("system_please_input") + I18nUtil.getString("jobinfo_field_author"))); |
| | | } |
| | | |
| | | // valid trigger |
| | | ScheduleTypeEnum scheduleTypeEnum = ScheduleTypeEnum.match(jobInfo.getScheduleType(), null); |
| | | if (scheduleTypeEnum == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | if (scheduleTypeEnum == ScheduleTypeEnum.CRON) { |
| | | if (jobInfo.getScheduleConf() == null || !CronExpression.isValidExpression(jobInfo.getScheduleConf())) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, "Cron" + I18nUtil.getString("system_unvalid")); |
| | | } |
| | | } else if (scheduleTypeEnum == ScheduleTypeEnum.FIX_RATE /*|| scheduleTypeEnum == ScheduleTypeEnum.FIX_DELAY*/) { |
| | | if (jobInfo.getScheduleConf() == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | try { |
| | | int fixSecond = Integer.valueOf(jobInfo.getScheduleConf()); |
| | | if (fixSecond < 1) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | } catch (Exception e) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | } |
| | | |
| | | // valid advanced |
| | | if (ExecutorRouteStrategyEnum.match(jobInfo.getExecutorRouteStrategy(), null) == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_executorRouteStrategy") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | if (MisfireStrategyEnum.match(jobInfo.getMisfireStrategy(), null) == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("misfire_strategy") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | if (ExecutorBlockStrategyEnum.match(jobInfo.getExecutorBlockStrategy(), null) == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_executorBlockStrategy") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | |
| | | // 》ChildJobId valid |
| | | if (jobInfo.getChildJobId() != null && jobInfo.getChildJobId().trim().length() > 0) { |
| | | String[] childJobIds = jobInfo.getChildJobId().split(","); |
| | | for (String childJobIdItem : childJobIds) { |
| | | if (childJobIdItem != null && childJobIdItem.trim().length() > 0 && isNumeric(childJobIdItem)) { |
| | | XxlJobInfo childJobInfo = xxlJobInfoDao.loadById(Integer.parseInt(childJobIdItem)); |
| | | if (childJobInfo == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, |
| | | MessageFormat.format((I18nUtil.getString("jobinfo_field_childJobId") + "({0})" + I18nUtil.getString("system_not_found")), childJobIdItem)); |
| | | } |
| | | } else { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, |
| | | MessageFormat.format((I18nUtil.getString("jobinfo_field_childJobId") + "({0})" + I18nUtil.getString("system_unvalid")), childJobIdItem)); |
| | | } |
| | | } |
| | | |
| | | // join , avoid "xxx,," |
| | | String temp = ""; |
| | | for (String item : childJobIds) { |
| | | temp += item + ","; |
| | | } |
| | | temp = temp.substring(0, temp.length() - 1); |
| | | |
| | | jobInfo.setChildJobId(temp); |
| | | } |
| | | |
| | | // group valid |
| | | XxlJobGroup jobGroup = xxlJobGroupDao.load(jobInfo.getJobGroup()); |
| | | if (jobGroup == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_jobgroup") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | |
| | | // stage job info |
| | | XxlJobInfo exists_jobInfo = xxlJobInfoDao.loadById(jobInfo.getId()); |
| | | if (exists_jobInfo == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("jobinfo_field_id") + I18nUtil.getString("system_not_found"))); |
| | | } |
| | | |
| | | // next trigger time (5s后生效,避开预读周期) |
| | | long nextTriggerTime = exists_jobInfo.getTriggerNextTime(); |
| | | boolean scheduleDataNotChanged = jobInfo.getScheduleType().equals(exists_jobInfo.getScheduleType()) && jobInfo.getScheduleConf().equals(exists_jobInfo.getScheduleConf()); |
| | | if (exists_jobInfo.getTriggerStatus() == 1 && !scheduleDataNotChanged) { |
| | | try { |
| | | Date nextValidTime = JobScheduleHelper.generateNextValidTime(jobInfo, new Date(System.currentTimeMillis() + JobScheduleHelper.PRE_READ_MS)); |
| | | if (nextValidTime == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | nextTriggerTime = nextValidTime.getTime(); |
| | | } catch (Exception e) { |
| | | logger.error(e.getMessage(), e); |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | } |
| | | |
| | | exists_jobInfo.setJobGroup(jobInfo.getJobGroup()); |
| | | exists_jobInfo.setJobDesc(jobInfo.getJobDesc()); |
| | | exists_jobInfo.setAuthor(jobInfo.getAuthor()); |
| | | exists_jobInfo.setAlarmEmail(jobInfo.getAlarmEmail()); |
| | | exists_jobInfo.setScheduleType(jobInfo.getScheduleType()); |
| | | exists_jobInfo.setScheduleConf(jobInfo.getScheduleConf()); |
| | | exists_jobInfo.setMisfireStrategy(jobInfo.getMisfireStrategy()); |
| | | exists_jobInfo.setExecutorRouteStrategy(jobInfo.getExecutorRouteStrategy()); |
| | | exists_jobInfo.setExecutorHandler(jobInfo.getExecutorHandler()); |
| | | exists_jobInfo.setExecutorParam(jobInfo.getExecutorParam()); |
| | | exists_jobInfo.setExecutorBlockStrategy(jobInfo.getExecutorBlockStrategy()); |
| | | exists_jobInfo.setExecutorTimeout(jobInfo.getExecutorTimeout()); |
| | | exists_jobInfo.setExecutorFailRetryCount(jobInfo.getExecutorFailRetryCount()); |
| | | exists_jobInfo.setChildJobId(jobInfo.getChildJobId()); |
| | | exists_jobInfo.setTriggerNextTime(nextTriggerTime); |
| | | |
| | | exists_jobInfo.setUpdateTime(new Date()); |
| | | xxlJobInfoDao.update(exists_jobInfo); |
| | | |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | @Override |
| | | public ReturnT<String> remove(int id) { |
| | | XxlJobInfo xxlJobInfo = xxlJobInfoDao.loadById(id); |
| | | if (xxlJobInfo == null) { |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | xxlJobInfoDao.delete(id); |
| | | xxlJobLogDao.delete(id); |
| | | xxlJobLogGlueDao.deleteByJobId(id); |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | @Override |
| | | public ReturnT<String> start(int id) { |
| | | XxlJobInfo xxlJobInfo = xxlJobInfoDao.loadById(id); |
| | | |
| | | // valid |
| | | ScheduleTypeEnum scheduleTypeEnum = ScheduleTypeEnum.match(xxlJobInfo.getScheduleType(), ScheduleTypeEnum.NONE); |
| | | if (ScheduleTypeEnum.NONE == scheduleTypeEnum) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type_none_limit_start"))); |
| | | } |
| | | |
| | | // next trigger time (5s后生效,避开预读周期) |
| | | long nextTriggerTime = 0; |
| | | try { |
| | | Date nextValidTime = JobScheduleHelper.generateNextValidTime(xxlJobInfo, new Date(System.currentTimeMillis() + JobScheduleHelper.PRE_READ_MS)); |
| | | if (nextValidTime == null) { |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | nextTriggerTime = nextValidTime.getTime(); |
| | | } catch (Exception e) { |
| | | logger.error(e.getMessage(), e); |
| | | return new ReturnT<String>(ReturnT.FAIL_CODE, (I18nUtil.getString("schedule_type") + I18nUtil.getString("system_unvalid"))); |
| | | } |
| | | |
| | | xxlJobInfo.setTriggerStatus(1); |
| | | xxlJobInfo.setTriggerLastTime(0); |
| | | xxlJobInfo.setTriggerNextTime(nextTriggerTime); |
| | | |
| | | xxlJobInfo.setUpdateTime(new Date()); |
| | | xxlJobInfoDao.update(xxlJobInfo); |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | @Override |
| | | public ReturnT<String> stop(int id) { |
| | | XxlJobInfo xxlJobInfo = xxlJobInfoDao.loadById(id); |
| | | |
| | | xxlJobInfo.setTriggerStatus(0); |
| | | xxlJobInfo.setTriggerLastTime(0); |
| | | xxlJobInfo.setTriggerNextTime(0); |
| | | |
| | | xxlJobInfo.setUpdateTime(new Date()); |
| | | xxlJobInfoDao.update(xxlJobInfo); |
| | | return ReturnT.SUCCESS; |
| | | } |
| | | |
| | | @Override |
| | | public Map<String, Object> dashboardInfo() { |
| | | |
| | | int jobInfoCount = xxlJobInfoDao.findAllCount(); |
| | | int jobLogCount = 0; |
| | | int jobLogSuccessCount = 0; |
| | | XxlJobLogReport xxlJobLogReport = xxlJobLogReportDao.queryLogReportTotal(); |
| | | if (xxlJobLogReport != null) { |
| | | jobLogCount = xxlJobLogReport.getRunningCount() + xxlJobLogReport.getSucCount() + xxlJobLogReport.getFailCount(); |
| | | jobLogSuccessCount = xxlJobLogReport.getSucCount(); |
| | | } |
| | | |
| | | // executor count |
| | | Set<String> executorAddressSet = new HashSet<String>(); |
| | | List<XxlJobGroup> groupList = xxlJobGroupDao.findAll(); |
| | | |
| | | if (groupList != null && !groupList.isEmpty()) { |
| | | for (XxlJobGroup group : groupList) { |
| | | if (group.getRegistryList() != null && !group.getRegistryList().isEmpty()) { |
| | | executorAddressSet.addAll(group.getRegistryList()); |
| | | } |
| | | } |
| | | } |
| | | |
| | | int executorCount = executorAddressSet.size(); |
| | | |
| | | Map<String, Object> dashboardMap = new HashMap<String, Object>(); |
| | | dashboardMap.put("jobInfoCount", jobInfoCount); |
| | | dashboardMap.put("jobLogCount", jobLogCount); |
| | | dashboardMap.put("jobLogSuccessCount", jobLogSuccessCount); |
| | | dashboardMap.put("executorCount", executorCount); |
| | | return dashboardMap; |
| | | } |
| | | |
| | | @Override |
| | | public ReturnT<Map<String, Object>> chartInfo(Date startDate, Date endDate) { |
| | | |
| | | // process |
| | | List<String> triggerDayList = new ArrayList<String>(); |
| | | List<Integer> triggerDayCountRunningList = new ArrayList<Integer>(); |
| | | List<Integer> triggerDayCountSucList = new ArrayList<Integer>(); |
| | | List<Integer> triggerDayCountFailList = new ArrayList<Integer>(); |
| | | int triggerCountRunningTotal = 0; |
| | | int triggerCountSucTotal = 0; |
| | | int triggerCountFailTotal = 0; |
| | | |
| | | List<XxlJobLogReport> logReportList = xxlJobLogReportDao.queryLogReport(startDate, endDate); |
| | | |
| | | if (logReportList != null && logReportList.size() > 0) { |
| | | for (XxlJobLogReport item : logReportList) { |
| | | String day = DateUtil.formatDate(item.getTriggerDay()); |
| | | int triggerDayCountRunning = item.getRunningCount(); |
| | | int triggerDayCountSuc = item.getSucCount(); |
| | | int triggerDayCountFail = item.getFailCount(); |
| | | |
| | | triggerDayList.add(day); |
| | | triggerDayCountRunningList.add(triggerDayCountRunning); |
| | | triggerDayCountSucList.add(triggerDayCountSuc); |
| | | triggerDayCountFailList.add(triggerDayCountFail); |
| | | |
| | | triggerCountRunningTotal += triggerDayCountRunning; |
| | | triggerCountSucTotal += triggerDayCountSuc; |
| | | triggerCountFailTotal += triggerDayCountFail; |
| | | } |
| | | } else { |
| | | for (int i = -6; i <= 0; i++) { |
| | | triggerDayList.add(DateUtil.formatDate(DateUtil.addDays(new Date(), i))); |
| | | triggerDayCountRunningList.add(0); |
| | | triggerDayCountSucList.add(0); |
| | | triggerDayCountFailList.add(0); |
| | | } |
| | | } |
| | | |
| | | Map<String, Object> result = new HashMap<String, Object>(); |
| | | result.put("triggerDayList", triggerDayList); |
| | | result.put("triggerDayCountRunningList", triggerDayCountRunningList); |
| | | result.put("triggerDayCountSucList", triggerDayCountSucList); |
| | | result.put("triggerDayCountFailList", triggerDayCountFailList); |
| | | |
| | | result.put("triggerCountRunningTotal", triggerCountRunningTotal); |
| | | result.put("triggerCountSucTotal", triggerCountSucTotal); |
| | | result.put("triggerCountFailTotal", triggerCountFailTotal); |
| | | |
| | | return new ReturnT<Map<String, Object>>(result); |
| | | } |
| | | |
| | | } |
| | |
| | | <?xml version="1.0" encoding="UTF-8"?> |
| | | <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.xxl.job.admin.dao.XxlJobGroupDao"> |
| | | |
| | | <resultMap id="XxlJobGroup" type="com.xxl.job.admin.core.model.XxlJobGroup" > |
| | | <result column="id" property="id" /> |
| | | <result column="app_name" property="appname" /> |
| | | <result column="title" property="title" /> |
| | | <result column="address_type" property="addressType" /> |
| | | <result column="address_list" property="addressList" /> |
| | | <result column="update_time" property="updateTime" /> |
| | | </resultMap> |
| | | |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.app_name, |
| | | t.title, |
| | | t.address_type, |
| | | t.address_list, |
| | | t.update_time |
| | | </sql> |
| | | <resultMap id="XxlJobGroup" type="com.xxl.job.admin.core.model.XxlJobGroup" > |
| | | <result column="id" property="id" /> |
| | | <result column="app_name" property="appname" /> |
| | | <result column="title" property="title" /> |
| | | <result column="address_type" property="addressType" /> |
| | | <result column="address_list" property="addressList" /> |
| | | <result column="update_time" property="updateTime" /> |
| | | </resultMap> |
| | | |
| | | <select id="findAll" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | ORDER BY t.app_name, t.title, t.id ASC |
| | | </select> |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.app_name, |
| | | t.title, |
| | | t.address_type, |
| | | t.address_list, |
| | | t.update_time |
| | | </sql> |
| | | |
| | | <select id="findByAddressType" parameterType="java.lang.Integer" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | WHERE t.address_type = #{addressType} |
| | | ORDER BY t.app_name, t.title, t.id ASC |
| | | </select> |
| | | <select id="findAll" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | ORDER BY t.app_name, t.title, t.id ASC |
| | | </select> |
| | | |
| | | <insert id="save" parameterType="com.xxl.job.admin.core.model.XxlJobGroup" useGeneratedKeys="true" keyProperty="id" > |
| | | INSERT INTO xxl_job_group ( `app_name`, `title`, `address_type`, `address_list`, `update_time`) |
| | | values ( #{appname}, #{title}, #{addressType}, #{addressList}, #{updateTime} ); |
| | | </insert> |
| | | <select id="findByAddressType" parameterType="java.lang.Integer" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | WHERE t.address_type = #{addressType} |
| | | ORDER BY t.app_name, t.title, t.id ASC |
| | | </select> |
| | | |
| | | <update id="update" parameterType="com.xxl.job.admin.core.model.XxlJobGroup" > |
| | | UPDATE xxl_job_group |
| | | SET `app_name` = #{appname}, |
| | | `title` = #{title}, |
| | | `address_type` = #{addressType}, |
| | | `address_list` = #{addressList}, |
| | | `update_time` = #{updateTime} |
| | | WHERE id = #{id} |
| | | </update> |
| | | <insert id="save" parameterType="com.xxl.job.admin.core.model.XxlJobGroup" useGeneratedKeys="true" keyProperty="id" > |
| | | INSERT INTO xxl_job_group ( `app_name`, `title`, `address_type`, `address_list`, `update_time`) |
| | | values ( #{appname}, #{title}, #{addressType}, #{addressList}, #{updateTime} ); |
| | | </insert> |
| | | |
| | | <delete id="remove" parameterType="java.lang.Integer" > |
| | | DELETE FROM xxl_job_group |
| | | WHERE id = #{id} |
| | | </delete> |
| | | <update id="update" parameterType="com.xxl.job.admin.core.model.XxlJobGroup" > |
| | | UPDATE xxl_job_group |
| | | SET `app_name` = #{appname}, |
| | | `title` = #{title}, |
| | | `address_type` = #{addressType}, |
| | | `address_list` = #{addressList}, |
| | | `update_time` = #{updateTime} |
| | | WHERE id = #{id} |
| | | </update> |
| | | |
| | | <select id="load" parameterType="java.lang.Integer" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | WHERE t.id = #{id} |
| | | </select> |
| | | <delete id="remove" parameterType="java.lang.Integer" > |
| | | DELETE FROM xxl_job_group |
| | | WHERE id = #{id} |
| | | </delete> |
| | | |
| | | <select id="pageList" parameterType="java.util.HashMap" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="appname != null and appname != ''"> |
| | | AND t.app_name like CONCAT(CONCAT('%', #{appname}), '%') |
| | | </if> |
| | | <if test="title != null and title != ''"> |
| | | AND t.title like CONCAT(CONCAT('%', #{title}), '%') |
| | | </if> |
| | | </trim> |
| | | ORDER BY t.app_name, t.title, t.id ASC |
| | | LIMIT #{offset}, #{pagesize} |
| | | </select> |
| | | <select id="load" parameterType="java.lang.Integer" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | WHERE t.id = #{id} |
| | | </select> |
| | | |
| | | <select id="pageListCount" parameterType="java.util.HashMap" resultType="int"> |
| | | SELECT count(1) |
| | | FROM xxl_job_group AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="appname != null and appname != ''"> |
| | | AND t.app_name like CONCAT(CONCAT('%', #{appname}), '%') |
| | | </if> |
| | | <if test="title != null and title != ''"> |
| | | AND t.title like CONCAT(CONCAT('%', #{title}), '%') |
| | | </if> |
| | | </trim> |
| | | </select> |
| | | <select id="pageList" parameterType="java.util.HashMap" resultMap="XxlJobGroup"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_group AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="appname != null and appname != ''"> |
| | | AND t.app_name like CONCAT(CONCAT('%', #{appname}), '%') |
| | | </if> |
| | | <if test="title != null and title != ''"> |
| | | AND t.title like CONCAT(CONCAT('%', #{title}), '%') |
| | | </if> |
| | | </trim> |
| | | ORDER BY t.app_name, t.title, t.id ASC |
| | | LIMIT #{offset}, #{pagesize} |
| | | </select> |
| | | |
| | | </mapper> |
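| | | The #{offset}/#{pagesize}/#{appname}/#{title} placeholders in the statements above are bound from @Param-annotated arguments on the XxlJobGroupDao mapper interface. Assuming a pageList(offset, pagesize, appname, title) signature, as those parameter names suggest (the signature itself is not shown here), a call might look like the following illustrative snippet: |
| | | |
| | | // Illustrative only: first 20 executor groups whose app_name contains "demo"; a null title skips that filter. |
| | | List<XxlJobGroup> groups = xxlJobGroupDao.pageList(0, 20, "demo", null); |
| | | int total = xxlJobGroupDao.pageListCount(0, 20, "demo", null); |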
| | | <select id="pageListCount" parameterType="java.util.HashMap" resultType="int"> |
| | | SELECT count(1) |
| | | FROM xxl_job_group AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="appname != null and appname != ''"> |
| | | AND t.app_name like CONCAT(CONCAT('%', #{appname}), '%') |
| | | </if> |
| | | <if test="title != null and title != ''"> |
| | | AND t.title like CONCAT(CONCAT('%', #{title}), '%') |
| | | </if> |
| | | </trim> |
| | | </select> |
| | | |
| | | </mapper> |
| | |
| | | <?xml version="1.0" encoding="UTF-8"?> |
| | | <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.xxl.job.admin.dao.XxlJobInfoDao"> |
| | | |
| | | <resultMap id="XxlJobInfo" type="com.xxl.job.admin.core.model.XxlJobInfo" > |
| | | <result column="id" property="id" /> |
| | | <resultMap id="XxlJobInfo" type="com.xxl.job.admin.core.model.XxlJobInfo" > |
| | | <result column="id" property="id" /> |
| | | |
| | | <result column="job_group" property="jobGroup" /> |
| | | <result column="job_desc" property="jobDesc" /> |
| | | <result column="job_group" property="jobGroup" /> |
| | | <result column="job_desc" property="jobDesc" /> |
| | | |
| | | <result column="add_time" property="addTime" /> |
| | | <result column="update_time" property="updateTime" /> |
| | | <result column="add_time" property="addTime" /> |
| | | <result column="update_time" property="updateTime" /> |
| | | |
| | | <result column="author" property="author" /> |
| | | <result column="alarm_email" property="alarmEmail" /> |
| | | <result column="author" property="author" /> |
| | | <result column="alarm_email" property="alarmEmail" /> |
| | | |
| | | <result column="schedule_type" property="scheduleType" /> |
| | | <result column="schedule_conf" property="scheduleConf" /> |
| | | <result column="misfire_strategy" property="misfireStrategy" /> |
| | | <result column="schedule_type" property="scheduleType" /> |
| | | <result column="schedule_conf" property="scheduleConf" /> |
| | | <result column="misfire_strategy" property="misfireStrategy" /> |
| | | |
| | | <result column="executor_route_strategy" property="executorRouteStrategy" /> |
| | | <result column="executor_handler" property="executorHandler" /> |
| | | <result column="executor_param" property="executorParam" /> |
| | | <result column="executor_block_strategy" property="executorBlockStrategy" /> |
| | | <result column="executor_timeout" property="executorTimeout" /> |
| | | <result column="executor_fail_retry_count" property="executorFailRetryCount" /> |
| | | <result column="executor_route_strategy" property="executorRouteStrategy" /> |
| | | <result column="executor_handler" property="executorHandler" /> |
| | | <result column="executor_param" property="executorParam" /> |
| | | <result column="executor_block_strategy" property="executorBlockStrategy" /> |
| | | <result column="executor_timeout" property="executorTimeout" /> |
| | | <result column="executor_fail_retry_count" property="executorFailRetryCount" /> |
| | | |
| | | <result column="glue_type" property="glueType" /> |
| | | <result column="glue_source" property="glueSource" /> |
| | | <result column="glue_remark" property="glueRemark" /> |
| | | <result column="glue_updatetime" property="glueUpdatetime" /> |
| | | <result column="glue_type" property="glueType" /> |
| | | <result column="glue_source" property="glueSource" /> |
| | | <result column="glue_remark" property="glueRemark" /> |
| | | <result column="glue_updatetime" property="glueUpdatetime" /> |
| | | |
| | | <result column="child_jobid" property="childJobId" /> |
| | | <result column="child_jobid" property="childJobId" /> |
| | | |
| | | <result column="trigger_status" property="triggerStatus" /> |
| | | <result column="trigger_last_time" property="triggerLastTime" /> |
| | | <result column="trigger_next_time" property="triggerNextTime" /> |
| | | </resultMap> |
| | | <result column="trigger_status" property="triggerStatus" /> |
| | | <result column="trigger_last_time" property="triggerLastTime" /> |
| | | <result column="trigger_next_time" property="triggerNextTime" /> |
| | | </resultMap> |
| | | |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.job_group, |
| | | t.job_desc, |
| | | t.add_time, |
| | | t.update_time, |
| | | t.author, |
| | | t.alarm_email, |
| | | t.schedule_type, |
| | | t.schedule_conf, |
| | | t.misfire_strategy, |
| | | t.executor_route_strategy, |
| | | t.executor_handler, |
| | | t.executor_param, |
| | | t.executor_block_strategy, |
| | | t.executor_timeout, |
| | | t.executor_fail_retry_count, |
| | | t.glue_type, |
| | | t.glue_source, |
| | | t.glue_remark, |
| | | t.glue_updatetime, |
| | | t.child_jobid, |
| | | t.trigger_status, |
| | | t.trigger_last_time, |
| | | t.trigger_next_time |
| | | </sql> |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.job_group, |
| | | t.job_desc, |
| | | t.add_time, |
| | | t.update_time, |
| | | t.author, |
| | | t.alarm_email, |
| | | t.schedule_type, |
| | | t.schedule_conf, |
| | | t.misfire_strategy, |
| | | t.executor_route_strategy, |
| | | t.executor_handler, |
| | | t.executor_param, |
| | | t.executor_block_strategy, |
| | | t.executor_timeout, |
| | | t.executor_fail_retry_count, |
| | | t.glue_type, |
| | | t.glue_source, |
| | | t.glue_remark, |
| | | t.glue_updatetime, |
| | | t.child_jobid, |
| | | t.trigger_status, |
| | | t.trigger_last_time, |
| | | t.trigger_next_time |
| | | </sql> |
| | | |
| | | <select id="pageList" parameterType="java.util.HashMap" resultMap="XxlJobInfo"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_info AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobGroup gt 0"> |
| | | AND t.job_group = #{jobGroup} |
| | | </if> |
| | | <select id="pageList" parameterType="java.util.HashMap" resultMap="XxlJobInfo"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_info AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobGroup gt 0"> |
| | | AND t.job_group = #{jobGroup} |
| | | </if> |
| | | <if test="triggerStatus gte 0"> |
| | | AND t.trigger_status = #{triggerStatus} |
| | | </if> |
| | | <if test="jobDesc != null and jobDesc != ''"> |
| | | AND t.job_desc like CONCAT(CONCAT('%', #{jobDesc}), '%') |
| | | </if> |
| | | <if test="executorHandler != null and executorHandler != ''"> |
| | | AND t.executor_handler like CONCAT(CONCAT('%', #{executorHandler}), '%') |
| | | </if> |
| | | <if test="author != null and author != ''"> |
| | | AND t.author like CONCAT(CONCAT('%', #{author}), '%') |
| | | </if> |
| | | </trim> |
| | | ORDER BY id DESC |
| | | LIMIT #{offset}, #{pagesize} |
| | | </select> |
| | | <if test="jobDesc != null and jobDesc != ''"> |
| | | AND t.job_desc like CONCAT(CONCAT('%', #{jobDesc}), '%') |
| | | </if> |
| | | <if test="executorHandler != null and executorHandler != ''"> |
| | | AND t.executor_handler like CONCAT(CONCAT('%', #{executorHandler}), '%') |
| | | </if> |
| | | <if test="author != null and author != ''"> |
| | | AND t.author like CONCAT(CONCAT('%', #{author}), '%') |
| | | </if> |
| | | </trim> |
| | | ORDER BY id DESC |
| | | LIMIT #{offset}, #{pagesize} |
| | | </select> |
| | | |
| | | <select id="pageListCount" parameterType="java.util.HashMap" resultType="int"> |
| | | SELECT count(1) |
| | | FROM xxl_job_info AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobGroup gt 0"> |
| | | AND t.job_group = #{jobGroup} |
| | | </if> |
| | | <select id="pageListCount" parameterType="java.util.HashMap" resultType="int"> |
| | | SELECT count(1) |
| | | FROM xxl_job_info AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobGroup gt 0"> |
| | | AND t.job_group = #{jobGroup} |
| | | </if> |
| | | <if test="triggerStatus gte 0"> |
| | | AND t.trigger_status = #{triggerStatus} |
| | | </if> |
| | | <if test="jobDesc != null and jobDesc != ''"> |
| | | AND t.job_desc like CONCAT(CONCAT('%', #{jobDesc}), '%') |
| | | </if> |
| | | <if test="executorHandler != null and executorHandler != ''"> |
| | | AND t.executor_handler like CONCAT(CONCAT('%', #{executorHandler}), '%') |
| | | </if> |
| | | <if test="author != null and author != ''"> |
| | | AND t.author like CONCAT(CONCAT('%', #{author}), '%') |
| | | </if> |
| | | </trim> |
| | | </select> |
| | | |
| | | <insert id="save" parameterType="com.xxl.job.admin.core.model.XxlJobInfo" useGeneratedKeys="true" keyProperty="id" > |
| | | INSERT INTO xxl_job_info ( |
| | | job_group, |
| | | job_desc, |
| | | add_time, |
| | | update_time, |
| | | author, |
| | | alarm_email, |
| | | schedule_type, |
| | | schedule_conf, |
| | | misfire_strategy, |
| | | executor_route_strategy, |
| | | executor_handler, |
| | | executor_param, |
| | | executor_block_strategy, |
| | | executor_timeout, |
| | | executor_fail_retry_count, |
| | | glue_type, |
| | | glue_source, |
| | | glue_remark, |
| | | glue_updatetime, |
| | | child_jobid, |
| | | trigger_status, |
| | | trigger_last_time, |
| | | trigger_next_time |
| | | ) VALUES ( |
| | | #{jobGroup}, |
| | | #{jobDesc}, |
| | | #{addTime}, |
| | | #{updateTime}, |
| | | #{author}, |
| | | #{alarmEmail}, |
| | | #{scheduleType}, |
| | | #{scheduleConf}, |
| | | #{misfireStrategy}, |
| | | #{executorRouteStrategy}, |
| | | #{executorHandler}, |
| | | #{executorParam}, |
| | | #{executorBlockStrategy}, |
| | | #{executorTimeout}, |
| | | #{executorFailRetryCount}, |
| | | #{glueType}, |
| | | #{glueSource}, |
| | | #{glueRemark}, |
| | | #{glueUpdatetime}, |
| | | #{childJobId}, |
| | | #{triggerStatus}, |
| | | #{triggerLastTime}, |
| | | #{triggerNextTime} |
| | | ); |
| | | <!--<selectKey resultType="java.lang.Integer" order="AFTER" keyProperty="id"> |
| | | SELECT LAST_INSERT_ID() |
| | | /*SELECT @@IDENTITY AS id*/ |
| | | </selectKey>--> |
| | | </insert> |
| | | |
| | | <select id="loadById" parameterType="java.util.HashMap" resultMap="XxlJobInfo"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_info AS t |
| | | WHERE t.id = #{id} |
| | | </select> |
| | | |
| | | <update id="update" parameterType="com.xxl.job.admin.core.model.XxlJobInfo" > |
| | | UPDATE xxl_job_info |
| | | SET |
| | | job_group = #{jobGroup}, |
| | | job_desc = #{jobDesc}, |
| | | update_time = #{updateTime}, |
| | | author = #{author}, |
| | | alarm_email = #{alarmEmail}, |
| | | schedule_type = #{scheduleType}, |
| | | schedule_conf = #{scheduleConf}, |
| | | misfire_strategy = #{misfireStrategy}, |
| | | executor_route_strategy = #{executorRouteStrategy}, |
| | | executor_handler = #{executorHandler}, |
| | | executor_param = #{executorParam}, |
| | | executor_block_strategy = #{executorBlockStrategy}, |
| | | executor_timeout = #{executorTimeout}, |
| | | executor_fail_retry_count = #{executorFailRetryCount}, |
| | | glue_type = #{glueType}, |
| | | glue_source = #{glueSource}, |
| | | glue_remark = #{glueRemark}, |
| | | glue_updatetime = #{glueUpdatetime}, |
| | | child_jobid = #{childJobId}, |
| | | trigger_status = #{triggerStatus}, |
| | | trigger_last_time = #{triggerLastTime}, |
| | | trigger_next_time = #{triggerNextTime} |
| | | WHERE id = #{id} |
| | | </update> |
| | | |
| | | <delete id="delete" parameterType="java.util.HashMap"> |
| | | DELETE |
| | | FROM xxl_job_info |
| | | WHERE id = #{id} |
| | | </delete> |
| | | |
| | | <select id="getJobsByGroup" parameterType="java.util.HashMap" resultMap="XxlJobInfo"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_info AS t |
| | | WHERE t.job_group = #{jobGroup} |
| | | </select> |
| | | |
| | | <select id="findAllCount" resultType="int"> |
| | | SELECT count(1) |
| | | FROM xxl_job_info |
| | | </select> |
| | | |
| | | |
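| | | <!-- scheduleJobQuery: fetches jobs that are enabled (trigger_status = 1) and due on or before #{maxNextTime}, oldest id first, capped at #{pagesize} rows per scan. --> |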
| | | <select id="scheduleJobQuery" parameterType="java.util.HashMap" resultMap="XxlJobInfo"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_info AS t |
| | | WHERE t.trigger_status = 1 |
| | | and t.trigger_next_time <![CDATA[ <= ]]> #{maxNextTime} |
| | | ORDER BY id ASC |
| | | LIMIT #{pagesize} |
| | | </select> |
| | | |
| | | <update id="scheduleUpdate" parameterType="com.xxl.job.admin.core.model.XxlJobInfo" > |
| | | UPDATE xxl_job_info |
| | | SET |
| | | trigger_last_time = #{triggerLastTime}, |
| | | trigger_next_time = #{triggerNextTime}, |
| | | trigger_status = #{triggerStatus} |
| | | WHERE id = #{id} |
| | | </update> |
| | | |
| | | </mapper> |
| | |
| | | <?xml version="1.0" encoding="UTF-8"?> |
| | | <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.xxl.job.admin.dao.XxlJobLogGlueDao"> |
| | | |
| | | <resultMap id="XxlJobLogGlue" type="com.xxl.job.admin.core.model.XxlJobLogGlue" > |
| | | <result column="id" property="id" /> |
| | | <result column="job_id" property="jobId" /> |
| | | <result column="glue_type" property="glueType" /> |
| | | <result column="glue_source" property="glueSource" /> |
| | | <result column="glue_remark" property="glueRemark" /> |
| | | <result column="add_time" property="addTime" /> |
| | | <result column="update_time" property="updateTime" /> |
| | | </resultMap> |
| | | |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.job_id, |
| | | t.glue_type, |
| | | t.glue_source, |
| | | t.glue_remark, |
| | | t.add_time, |
| | | t.update_time |
| | | </sql> |
| | | |
| | | <insert id="save" parameterType="com.xxl.job.admin.core.model.XxlJobLogGlue" useGeneratedKeys="true" keyProperty="id" > |
| | | INSERT INTO xxl_job_logglue ( |
| | | `job_id`, |
| | | `glue_type`, |
| | | `glue_source`, |
| | | `glue_remark`, |
| | | `add_time`, |
| | | `update_time` |
| | | ) VALUES ( |
| | | #{jobId}, |
| | | #{glueType}, |
| | | #{glueSource}, |
| | | #{glueRemark}, |
| | | #{addTime}, |
| | | #{updateTime} |
| | | ); |
| | | <!--<selectKey resultType="java.lang.Integer" order="AFTER" keyProperty="id"> |
| | | SELECT LAST_INSERT_ID() |
| | | </selectKey>--> |
| | | </insert> |
| | | |
| | | <select id="findByJobId" parameterType="java.lang.Integer" resultMap="XxlJobLogGlue"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_logglue AS t |
| | | WHERE t.job_id = #{jobId} |
| | | ORDER BY id DESC |
| | | </select> |
| | | |
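| | | <!-- removeOld: keeps only the newest #{limit} GLUE versions of a job; the derived table t1 avoids MySQL's restriction on deleting from a table that the same statement also selects from. --> |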
| | | <delete id="removeOld" > |
| | | DELETE FROM xxl_job_logglue |
| | | WHERE id NOT in( |
| | | SELECT id FROM( |
| | | SELECT id FROM xxl_job_logglue |
| | | WHERE `job_id` = #{jobId} |
| | | ORDER BY update_time desc |
| | | LIMIT 0, #{limit} |
| | | ) t1 |
| | | ) AND `job_id` = #{jobId} |
| | | </delete> |
| | | |
| | | <delete id="deleteByJobId" parameterType="java.lang.Integer" > |
| | | DELETE FROM xxl_job_logglue |
| | | WHERE `job_id` = #{jobId} |
| | | </delete> |
| | | |
| | | </mapper> |
| | |
| | | <?xml version="1.0" encoding="UTF-8"?> |
| | | <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.xxl.job.admin.dao.XxlJobLogDao"> |
| | | |
| | | <resultMap id="XxlJobLog" type="com.xxl.job.admin.core.model.XxlJobLog" > |
| | | <result column="id" property="id" /> |
| | | |
| | | <result column="job_group" property="jobGroup" /> |
| | | <result column="job_id" property="jobId" /> |
| | | |
| | | <result column="executor_address" property="executorAddress" /> |
| | | <result column="executor_handler" property="executorHandler" /> |
| | | <result column="executor_param" property="executorParam" /> |
| | | <result column="executor_sharding_param" property="executorShardingParam" /> |
| | | <result column="executor_fail_retry_count" property="executorFailRetryCount" /> |
| | | |
| | | <result column="trigger_time" property="triggerTime" /> |
| | | <result column="trigger_code" property="triggerCode" /> |
| | | <result column="trigger_msg" property="triggerMsg" /> |
| | | |
| | | <result column="handle_time" property="handleTime" /> |
| | | <result column="handle_code" property="handleCode" /> |
| | | <result column="handle_msg" property="handleMsg" /> |
| | | |
| | | <result column="alarm_status" property="alarmStatus" /> |
| | | </resultMap> |
| | | |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.job_group, |
| | | t.job_id, |
| | | t.executor_address, |
| | | t.executor_handler, |
| | | t.executor_param, |
| | | t.executor_sharding_param, |
| | | t.executor_fail_retry_count, |
| | | t.trigger_time, |
| | | t.trigger_code, |
| | | t.trigger_msg, |
| | | t.handle_time, |
| | | t.handle_code, |
| | | t.handle_msg, |
| | | t.alarm_status |
| | | </sql> |
| | | |
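| | | <!-- pageList logStatus filter: 1 = handled successfully (handle_code = 200), 2 = failed (trigger or handle code outside 0/200), 3 = triggered but not yet handled (trigger_code = 200, handle_code = 0). --> |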
| | | <select id="pageList" resultMap="XxlJobLog"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_log AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobId==0 and jobGroup gt 0"> |
| | | AND t.job_group = #{jobGroup} |
| | | </if> |
| | | <if test="jobId gt 0"> |
| | | AND t.job_id = #{jobId} |
| | | </if> |
| | | <if test="triggerTimeStart != null"> |
| | | AND t.trigger_time <![CDATA[ >= ]]> #{triggerTimeStart} |
| | | </if> |
| | | <if test="triggerTimeEnd != null"> |
| | | AND t.trigger_time <![CDATA[ <= ]]> #{triggerTimeEnd} |
| | | </if> |
| | | <if test="logStatus == 1" > |
| | | AND t.handle_code = 200 |
| | | </if> |
| | | <if test="logStatus == 2" > |
| | | AND ( |
| | | t.trigger_code NOT IN (0, 200) OR |
| | | t.handle_code NOT IN (0, 200) |
| | | ) |
| | | </if> |
| | | <if test="logStatus == 3" > |
| | | AND t.trigger_code = 200 |
| | | AND t.handle_code = 0 |
| | | </if> |
| | | </trim> |
| | | ORDER BY t.trigger_time DESC |
| | | LIMIT #{offset}, #{pagesize} |
| | | </select> |
| | | |
| | | <select id="pageListCount" resultType="int"> |
| | | SELECT count(1) |
| | | FROM xxl_job_log AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobId==0 and jobGroup gt 0"> |
| | | AND t.job_group = #{jobGroup} |
| | | </if> |
| | | <if test="jobId gt 0"> |
| | | AND t.job_id = #{jobId} |
| | | </if> |
| | | <if test="triggerTimeStart != null"> |
| | | AND t.trigger_time <![CDATA[ >= ]]> #{triggerTimeStart} |
| | | </if> |
| | | <if test="triggerTimeEnd != null"> |
| | | AND t.trigger_time <![CDATA[ <= ]]> #{triggerTimeEnd} |
| | | </if> |
| | | <if test="logStatus == 1" > |
| | | AND t.handle_code = 200 |
| | | </if> |
| | | <if test="logStatus == 2" > |
| | | AND ( |
| | | t.trigger_code NOT IN (0, 200) OR |
| | | t.handle_code NOT IN (0, 200) |
| | | ) |
| | | </if> |
| | | <if test="logStatus == 3" > |
| | | AND t.trigger_code = 200 |
| | | AND t.handle_code = 0 |
| | | </if> |
| | | </trim> |
| | | </select> |
| | | |
| | | <select id="load" parameterType="java.lang.Long" resultMap="XxlJobLog"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_log AS t |
| | | WHERE t.id = #{id} |
| | | </select> |
| | | |
| | | <insert id="save" parameterType="com.xxl.job.admin.core.model.XxlJobLog" useGeneratedKeys="true" keyProperty="id" > |
| | | INSERT INTO xxl_job_log ( |
| | | `job_group`, |
| | | `job_id`, |
| | | `trigger_time`, |
| | | `trigger_code`, |
| | | `handle_code` |
| | | ) VALUES ( |
| | | #{jobGroup}, |
| | | #{jobId}, |
| | | #{triggerTime}, |
| | | #{triggerCode}, |
| | | #{handleCode} |
| | | ); |
| | | <!--<selectKey resultType="java.lang.Integer" order="AFTER" keyProperty="id"> |
| | | SELECT LAST_INSERT_ID() |
| | | </selectKey>--> |
| | | </insert> |
| | | |
| | | <update id="updateTriggerInfo" > |
| | | UPDATE xxl_job_log |
| | | SET |
| | | `trigger_time`= #{triggerTime}, |
| | | `trigger_code`= #{triggerCode}, |
| | | `trigger_msg`= #{triggerMsg}, |
| | | `executor_address`= #{executorAddress}, |
| | | `executor_handler`=#{executorHandler}, |
| | | `executor_param`= #{executorParam}, |
| | | `executor_sharding_param`= #{executorShardingParam}, |
| | | `executor_fail_retry_count`= #{executorFailRetryCount} |
| | | WHERE `id`= #{id} |
| | | </update> |
| | | |
| | | <update id="updateHandleInfo"> |
| | | UPDATE xxl_job_log |
| | | SET |
| | | `handle_time`= #{handleTime}, |
| | | `handle_code`= #{handleCode}, |
| | | `handle_msg`= #{handleMsg} |
| | | WHERE `id`= #{id} |
| | | </update> |
| | | |
| | | <delete id="delete" > |
| | | delete from xxl_job_log |
| | | WHERE job_id = #{jobId} |
| | | </delete> |
| | | |
| | | <!--<select id="triggerCountByDay" resultType="java.util.Map" > |
| | | SELECT |
| | | DATE_FORMAT(trigger_time,'%Y-%m-%d') triggerDay, |
| | | COUNT(handle_code) triggerDayCount, |
| | | SUM(CASE WHEN (trigger_code in (0, 200) and handle_code = 0) then 1 else 0 end) as triggerDayCountRunning, |
| | | SUM(CASE WHEN handle_code = 200 then 1 else 0 end) as triggerDayCountSuc |
| | | FROM xxl_job_log |
| | | WHERE trigger_time BETWEEN #{from} and #{to} |
| | | GROUP BY triggerDay |
| | | ORDER BY triggerDay |
| | | </select>--> |
| | | |
| | | <select id="findLogReport" resultType="java.util.Map" > |
| | | SELECT |
| | | COUNT(handle_code) triggerDayCount, |
| | | SUM(CASE WHEN (trigger_code in (0, 200) and handle_code = 0) then 1 else 0 end) as triggerDayCountRunning, |
| | | SUM(CASE WHEN handle_code = 200 then 1 else 0 end) as triggerDayCountSuc |
| | | FROM xxl_job_log |
| | | WHERE trigger_time BETWEEN #{from} and #{to} |
| | | </select> |
| | | |
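| | | <!-- findClearLogIds: collects log ids eligible for cleanup, either triggered before #{clearBeforeTime} or falling outside the newest #{clearBeforeNum} entries, returning at most #{pagesize} ids per batch. --> |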
| | | <select id="findClearLogIds" resultType="long" > |
| | | SELECT id FROM xxl_job_log |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobGroup gt 0"> |
| | | AND job_group = #{jobGroup} |
| | | </if> |
| | | <if test="jobId gt 0"> |
| | | AND job_id = #{jobId} |
| | | </if> |
| | | <if test="clearBeforeTime != null"> |
| | | AND trigger_time <![CDATA[ <= ]]> #{clearBeforeTime} |
| | | </if> |
| | | <if test="clearBeforeNum gt 0"> |
| | | AND id NOT in( |
| | | SELECT id FROM( |
| | | SELECT id FROM xxl_job_log AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="jobGroup gt 0"> |
| | | AND t.job_group = #{jobGroup} |
| | | </if> |
| | | <if test="jobId gt 0"> |
| | | AND t.job_id = #{jobId} |
| | | </if> |
| | | </trim> |
| | | ORDER BY t.trigger_time desc |
| | | LIMIT 0, #{clearBeforeNum} |
| | | ) t1 |
| | | ) |
| | | </if> |
| | | </trim> |
| | | order by id asc |
| | | LIMIT #{pagesize} |
| | | </select> |
| | | |
| | | <delete id="clearLog" > |
| | | delete from xxl_job_log |
| | | WHERE id in |
| | | <foreach collection="logIds" item="item" open="(" close=")" separator="," > |
| | | #{item} |
| | | </foreach> |
| | | </delete> |
| | | |
| | | <select id="findFailJobLogIds" resultType="long" > |
| | | SELECT id FROM `xxl_job_log` |
| | | WHERE !( |
| | | (trigger_code in (0, 200) and handle_code = 0) |
| | | OR |
| | | (handle_code = 200) |
| | | ) |
| | | AND `alarm_status` = 0 |
| | | ORDER BY id ASC |
| | | LIMIT #{pagesize} |
| | | </select> |
| | | <select id="findFailJobLogIds" resultType="long" > |
| | | SELECT id FROM `xxl_job_log` |
| | | WHERE !( |
| | | (trigger_code in (0, 200) and handle_code = 0) |
| | | OR |
| | | (handle_code = 200) |
| | | ) |
| | | AND `alarm_status` = 0 |
| | | ORDER BY id ASC |
| | | LIMIT #{pagesize} |
| | | </select> |
| | | |
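| | | <!-- updateAlarmStatus: compare-and-set style update; matching on the old alarm_status ensures only one caller claims a given log entry. --> |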
| | | <update id="updateAlarmStatus" > |
| | | UPDATE xxl_job_log |
| | | SET |
| | | `alarm_status` = #{newAlarmStatus} |
| | | WHERE `id`= #{logId} AND `alarm_status` = #{oldAlarmStatus} |
| | | </update> |
| | | |
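| | | <!-- findLostJobIds: the LEFT JOIN ... IS NULL anti-join picks running logs (trigger_code = 200, handle_code = 0) older than #{losedTime} whose executor_address is no longer present in xxl_job_registry; the commented query below is the equivalent NOT IN form. --> |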
| | | <select id="findLostJobIds" resultType="long" > |
| | | SELECT |
| | | t.id |
| | | FROM |
| | | xxl_job_log t |
| | | LEFT JOIN xxl_job_registry t2 ON t.executor_address = t2.registry_value |
| | | WHERE |
| | | t.trigger_code = 200 |
| | | AND t.handle_code = 0 |
| | | AND t.trigger_time <![CDATA[ <= ]]> #{losedTime} |
| | | AND t2.id IS NULL; |
| | | </select> |
| | | <!-- |
| | | SELECT t.id |
| | | FROM xxl_job_log AS t |
| | | WHERE t.trigger_code = 200 |
| | | and t.handle_code = 0 |
| | | and t.trigger_time <![CDATA[ <= ]]> #{losedTime} |
| | | and t.executor_address not in ( |
| | | SELECT t2.registry_value |
| | | FROM xxl_job_registry AS t2 |
| | | ) |
| | | --> |
| | | |
| | | </mapper> |
| | |
| | | <?xml version="1.0" encoding="UTF-8"?> |
| | | <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.xxl.job.admin.dao.XxlJobLogReportDao"> |
| | | |
| | | <resultMap id="XxlJobLogReport" type="com.xxl.job.admin.core.model.XxlJobLogReport" > |
| | | <result column="id" property="id" /> |
| | | <result column="trigger_day" property="triggerDay" /> |
| | | <result column="running_count" property="runningCount" /> |
| | | <result column="suc_count" property="sucCount" /> |
| | | <result column="fail_count" property="failCount" /> |
| | | </resultMap> |
| | | |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.trigger_day, |
| | | t.running_count, |
| | | t.suc_count, |
| | | t.fail_count |
| | | </sql> |
| | | |
| | | <insert id="save" parameterType="com.xxl.job.admin.core.model.XxlJobLogReport" useGeneratedKeys="true" keyProperty="id" > |
| | | INSERT INTO xxl_job_log_report ( |
| | | `trigger_day`, |
| | | `running_count`, |
| | | `suc_count`, |
| | | `fail_count` |
| | | ) VALUES ( |
| | | #{triggerDay}, |
| | | #{runningCount}, |
| | | #{sucCount}, |
| | | #{failCount} |
| | | ); |
| | | <!--<selectKey resultType="java.lang.Integer" order="AFTER" keyProperty="id"> |
| | | SELECT LAST_INSERT_ID() |
| | | </selectKey>--> |
| | | </insert> |
| | | |
| | | <update id="update" > |
| | | UPDATE xxl_job_log_report |
| | | SET `running_count` = #{runningCount}, |
| | | `suc_count` = #{sucCount}, |
| | | `fail_count` = #{failCount} |
| | | WHERE `trigger_day` = #{triggerDay} |
| | | </update> |
| | | |
| | | <select id="queryLogReport" resultMap="XxlJobLogReport"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_log_report AS t |
| | | WHERE t.trigger_day between #{triggerDayFrom} and #{triggerDayTo} |
| | | ORDER BY t.trigger_day ASC |
| | | </select> |
| | | |
| | | <select id="queryLogReportTotal" resultMap="XxlJobLogReport"> |
| | | SELECT |
| | | SUM(running_count) running_count, |
| | | SUM(suc_count) suc_count, |
| | | SUM(fail_count) fail_count |
| | | FROM xxl_job_log_report AS t |
| | | </select> |
| | | |
| | | </mapper> |
| | |
| | | <?xml version="1.0" encoding="UTF-8"?> |
| | | <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.xxl.job.admin.dao.XxlJobRegistryDao"> |
| | | |
| | | <resultMap id="XxlJobRegistry" type="com.xxl.job.admin.core.model.XxlJobRegistry" > |
| | | <result column="id" property="id" /> |
| | | <result column="registry_group" property="registryGroup" /> |
| | | <result column="registry_key" property="registryKey" /> |
| | | <result column="registry_value" property="registryValue" /> |
| | | <result column="update_time" property="updateTime" /> |
| | | </resultMap> |
| | | |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.registry_group, |
| | | t.registry_key, |
| | | t.registry_value, |
| | | t.update_time |
| | | </sql> |
| | | |
| | | <select id="findDead" parameterType="java.util.HashMap" resultType="java.lang.Integer" > |
| | | SELECT t.id |
| | | FROM xxl_job_registry AS t |
| | | WHERE t.update_time <![CDATA[ < ]]> DATE_ADD(#{nowTime},INTERVAL -#{timeout} SECOND) |
| | | </select> |
| | | |
| | | <delete id="removeDead" parameterType="java.lang.Integer" > |
| | | DELETE FROM xxl_job_registry |
| | | WHERE id in |
| | | <foreach collection="ids" item="item" open="(" close=")" separator="," > |
| | | #{item} |
| | | </foreach> |
| | | </delete> |
| | | |
| | | <select id="findAll" parameterType="java.util.HashMap" resultMap="XxlJobRegistry"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_registry AS t |
| | | WHERE t.update_time <![CDATA[ > ]]> DATE_ADD(#{nowTime},INTERVAL -#{timeout} SECOND) |
| | | </select> |
| | | |
| | | |
| | | |
| | | <update id="registryUpdate" > |
| | | UPDATE xxl_job_registry |
| | | SET `update_time` = #{updateTime} |
| | | WHERE `registry_group` = #{registryGroup} |
| | | AND `registry_key` = #{registryKey} |
| | | AND `registry_value` = #{registryValue} |
| | | </update> |
| | | |
| | | <insert id="registrySave" > |
| | | INSERT INTO xxl_job_registry( `registry_group` , `registry_key` , `registry_value`, `update_time`) |
| | | VALUES( #{registryGroup} , #{registryKey} , #{registryValue}, #{updateTime}) |
| | | </insert> |
| | | |
| | | <delete id="registryDelete" > |
| | | DELETE FROM xxl_job_registry |
| | | WHERE registry_group = #{registryGroup} |
| | | AND registry_key = #{registryKey} |
| | | AND registry_value = #{registryValue} |
| | | </delete> |
| | | |
| | | </mapper> |
| | |
| | | <?xml version="1.0" encoding="UTF-8"?> |
| | | <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.xxl.job.admin.dao.XxlJobUserDao"> |
| | | |
| | | <resultMap id="XxlJobUser" type="com.xxl.job.admin.core.model.XxlJobUser" > |
| | | <result column="id" property="id" /> |
| | | <result column="username" property="username" /> |
| | | <result column="password" property="password" /> |
| | | <result column="role" property="role" /> |
| | | <result column="permission" property="permission" /> |
| | | </resultMap> |
| | | |
| | | <sql id="Base_Column_List"> |
| | | t.id, |
| | | t.username, |
| | | t.password, |
| | | t.role, |
| | | t.permission |
| | | </sql> |
| | | |
| | | <select id="pageList" parameterType="java.util.HashMap" resultMap="XxlJobUser"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_user AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="username != null and username != ''"> |
| | | AND t.username like CONCAT(CONCAT('%', #{username}), '%') |
| | | </if> |
| | | <if test="role gt -1"> |
| | | AND t.role = #{role} |
| | | </if> |
| | | </trim> |
| | | ORDER BY username ASC |
| | | LIMIT #{offset}, #{pagesize} |
| | | </select> |
| | | |
| | | <select id="pageListCount" parameterType="java.util.HashMap" resultType="int"> |
| | | SELECT count(1) |
| | | FROM xxl_job_user AS t |
| | | <trim prefix="WHERE" prefixOverrides="AND | OR" > |
| | | <if test="username != null and username != ''"> |
| | | AND t.username like CONCAT(CONCAT('%', #{username}), '%') |
| | | </if> |
| | | <if test="role gt -1"> |
| | | AND t.role = #{role} |
| | | </if> |
| | | </trim> |
| | | </select> |
| | | |
| | | <select id="loadByUserName" parameterType="java.util.HashMap" resultMap="XxlJobUser"> |
| | | SELECT <include refid="Base_Column_List" /> |
| | | FROM xxl_job_user AS t |
| | | WHERE t.username = #{username} |
| | | </select> |
| | | |
| | | <insert id="save" parameterType="com.xxl.job.admin.core.model.XxlJobUser" useGeneratedKeys="true" keyProperty="id" > |
| | | INSERT INTO xxl_job_user ( |
| | | username, |
| | | password, |
| | | role, |
| | | permission |
| | | ) VALUES ( |
| | | #{username}, |
| | | #{password}, |
| | | #{role}, |
| | | #{permission} |
| | | ); |
| | | </insert> |
| | | |
| | | <update id="update" parameterType="com.xxl.job.admin.core.model.XxlJobUser" > |
| | | UPDATE xxl_job_user |
| | | SET |
| | | <if test="password != null and password != ''"> |
| | | password = #{password}, |
| | | </if> |
| | | role = #{role}, |
| | | permission = #{permission} |
| | | WHERE id = #{id} |
| | | </update> |
| | | |
| | | <delete id="delete" parameterType="java.util.HashMap"> |
| | | DELETE |
| | | FROM xxl_job_user |
| | | WHERE id = #{id} |
| | | </delete> |
| | | |
| | | </mapper> |
| | |
| | | $(function() { |
| | | |
| | | // init date tables |
| | | var userListTable = $("#user_list").dataTable({ |
| | | "deferRender": true, |
| | | "processing" : true, |
| | | "serverSide": true, |
| | | "ajax": { |
| | | url: base_url + "/user/pageList", |
| | | type:"post", |
| | | data : function ( d ) { |
| | | var obj = {}; |
| | | obj.username = $('#username').val(); |
| | | obj.role = $('#role').val(); |
| | | obj.start = d.start; |
| | | obj.length = d.length; |
| | | return obj; |
| | | } |
| | | }, |
| | | "searching": false, |
| | | "ordering": false, |
| | | //"scrollX": true, // scroll x,close self-adaption |
| | | "columns": [ |
| | | { |
| | | "data": 'id', |
| | | "visible" : false, |
| | | "width":'10%' |
| | | }, |
| | | { |
| | | "data": 'username', |
| | | "visible" : true, |
| | | "width":'20%' |
| | | }, |
| | | { |
| | | "data": 'password', |
| | | "visible" : false, |
| | | "width":'20%', |
| | | "render": function ( data, type, row ) { |
| | | return '*********'; |
| | | } |
| | | }, |
| | | { |
| | | "data": 'role', |
| | | "visible" : true, |
| | | "width":'10%', |
| | | "render": function ( data, type, row ) { |
| | | if (data == 1) { |
| | | return I18n.user_role_admin; |
| | | } else { |
| | | return I18n.user_role_normal; |
| | | } |
| | | } |
| | | }, |
| | | { |
| | | "data": 'permission', |
| | | "width":'10%', |
| | | "visible" : false |
| | | }, |
| | | { |
| | | "data": I18n.system_opt , |
| | | "width":'15%', |
| | | "render": function ( data, type, row ) { |
| | | return function(){ |
| | | // html |
| | | tableData['key'+row.id] = row; |
| | | var html = '<p id="'+ row.id +'" >'+ |
| | | '<button class="btn btn-warning btn-xs update" type="button">'+ I18n.system_opt_edit +'</button> '+ |
| | | '<button class="btn btn-danger btn-xs delete" type="button">'+ I18n.system_opt_del +'</button> '+ |
| | | '</p>'; |
| | | |
| | | return html; |
| | | }; |
| | | } |
| | | } |
| | | ], |
| | | "language" : { |
| | | "sProcessing" : I18n.dataTable_sProcessing , |
| | | "sLengthMenu" : I18n.dataTable_sLengthMenu , |
| | | "sZeroRecords" : I18n.dataTable_sZeroRecords , |
| | | "sInfo" : I18n.dataTable_sInfo , |
| | | "sInfoEmpty" : I18n.dataTable_sInfoEmpty , |
| | | "sInfoFiltered" : I18n.dataTable_sInfoFiltered , |
| | | "sInfoPostFix" : "", |
| | | "sSearch" : I18n.dataTable_sSearch , |
| | | "sUrl" : "", |
| | | "sEmptyTable" : I18n.dataTable_sEmptyTable , |
| | | "sLoadingRecords" : I18n.dataTable_sLoadingRecords , |
| | | "sInfoThousands" : ",", |
| | | "oPaginate" : { |
| | | "sFirst" : I18n.dataTable_sFirst , |
| | | "sPrevious" : I18n.dataTable_sPrevious , |
| | | "sNext" : I18n.dataTable_sNext , |
| | | "sLast" : I18n.dataTable_sLast |
| | | }, |
| | | "oAria" : { |
| | | "sSortAscending" : I18n.dataTable_sSortAscending , |
| | | "sSortDescending" : I18n.dataTable_sSortDescending |
| | | } |
| | | } |
| | | }); |
| | | |
| | | // table data |
| | | var tableData = {}; |
| | | |
| | | // search btn |
| | | $('#searchBtn').on('click', function(){ |
| | | userListTable.fnDraw(); |
| | | }); |
| | | |
| | | // job operate |
| | | $("#user_list").on('click', '.delete',function() { |
| | | var id = $(this).parent('p').attr("id"); |
| | | |
| | | layer.confirm( I18n.system_ok + I18n.system_opt_del + '?', { |
| | | icon: 3, |
| | | title: I18n.system_tips , |
| | | btn: [ I18n.system_ok, I18n.system_cancel ] |
| | | }, function(index){ |
| | | layer.close(index); |
| | | |
| | | $.ajax({ |
| | | type : 'POST', |
| | | url : base_url + "/user/remove", |
| | | data : { |
| | | "id" : id |
| | | }, |
| | | dataType : "json", |
| | | success : function(data){ |
| | | if (data.code == 200) { |
| | | layer.msg( I18n.system_success ); |
| | | userListTable.fnDraw(false); |
| | | } else { |
| | | layer.msg( data.msg || I18n.system_opt_del + I18n.system_fail ); |
| | | } |
| | | } |
| | | }); |
| | | }); |
| | | }); |
| | | |
| | | // add role |
| | | $("#addModal .form input[name=role]").change(function () { |
| | | var role = $(this).val(); |
| | | if (role == 1) { |
| | | $("#addModal .form input[name=permission]").parents('.form-group').hide(); |
| | | } else { |
| | | $("#addModal .form input[name=permission]").parents('.form-group').show(); |
| | | } |
| | | $("#addModal .form input[name='permission']").prop("checked",false); |
| | | }); |
| | | |
| | |
| | | return this.optional(element) || valid.test(value); |
| | | }, I18n.user_username_valid ); |
| | | |
| | | // add |
| | | $(".add").click(function(){ |
| | | $('#addModal').modal({backdrop: false, keyboard: false}).modal('show'); |
| | | }); |
| | | var addModalValidate = $("#addModal .form").validate({ |
| | | errorElement : 'span', |
| | | errorClass : 'help-block', |
| | | focusInvalid : true, |
| | | rules : { |
| | | username : { |
| | | required : true, |
| | | rangelength:[4, 20], |
| | | myValid01: true |
| | | }, |
| | | password : { |
| | | required : true, |
| | | rangelength:[4, 20] |
| | | } |
| | | }, |
| | | messages : { |
| | | username : { |
| | | required : I18n.system_please_input + I18n.user_username, |
| | | rangelength: I18n.system_lengh_limit + "[4-20]" |
| | | }, |
| | | password : { |
| | |
| | | rangelength: I18n.system_lengh_limit + "[4-20]" |
| | | } |
| | | }, |
| | | highlight : function(element) { |
| | | $(element).closest('.form-group').addClass('has-error'); |
| | | }, |
| | | success : function(label) { |
| | | label.closest('.form-group').removeClass('has-error'); |
| | | label.remove(); |
| | | }, |
| | | errorPlacement : function(error, element) { |
| | | element.parent('div').append(error); |
| | | }, |
| | | submitHandler : function(form) { |
| | | |
| | |
| | | permissionArr.push($(this).val()); |
| | | }); |
| | | |
| | | var paramData = { |
| | | "username": $("#addModal .form input[name=username]").val(), |
| | | "password": $("#addModal .form input[name=password]").val(), |
| | | "role": $("#addModal .form input[name=role]:checked").val(), |
| | | "permission": permissionArr.join(',') |
| | | }; |
| | | |
| | | $.post(base_url + "/user/add", paramData, function(data, status) { |
| | | if (data.code == "200") { |
| | | $('#addModal').modal('hide'); |
| | | |
| | | layer.msg( I18n.system_add_suc ); |
| | | userListTable.fnDraw(); |
| | | } else { |
| | | layer.open({ |
| | | title: I18n.system_tips , |
| | | btn: [ I18n.system_ok ], |
| | | content: (data.msg || I18n.system_add_fail), |
| | | icon: '2' |
| | | }); |
| | | } |
| | | }); |
| | | } |
| | | }); |
| | | $("#addModal").on('hide.bs.modal', function () { |
| | | $("#addModal .form")[0].reset(); |
| | | addModalValidate.resetForm(); |
| | | $("#addModal .form .form-group").removeClass("has-error"); |
| | | $(".remote_panel").show(); // remote |
| | | |
| | | $("#addModal .form input[name=permission]").parents('.form-group').show(); |
| | | }); |
| | | |
| | | // update role |
| | | $("#updateModal .form input[name=role]").change(function () { |
| | |
| | | $("#updateModal .form input[name='permission']").prop("checked",false); |
| | | }); |
| | | |
| | | // update |
| | | $("#user_list").on('click', '.update',function() { |
| | | |
| | | var id = $(this).parent('p').attr("id"); |
| | | var row = tableData['key'+id]; |
| | | |
| | | // base data |
| | | $("#updateModal .form input[name='id']").val( row.id ); |
| | | $("#updateModal .form input[name='username']").val( row.username ); |
| | | $("#updateModal .form input[name='password']").val( '' ); |
| | | $("#updateModal .form input[name='role'][value='"+ row.role +"']").click(); |
| | | var permissionArr = []; |
| | | if (row.permission) { |
| | | permissionArr = row.permission.split(","); |
| | | } |
| | | $("#updateModal .form input[name='permission']").each(function () { |
| | | if($.inArray($(this).val(), permissionArr) > -1) { |
| | | $(this).prop("checked",true); |
| | |
| | | } |
| | | }); |
| | | |
| | | // show |
| | | $('#updateModal').modal({backdrop: false, keyboard: false}).modal('show'); |
| | | }); |
| | | var updateModalValidate = $("#updateModal .form").validate({ |
| | | errorElement : 'span', |
| | | errorClass : 'help-block', |
| | | focusInvalid : true, |
| | | highlight : function(element) { |
| | | $(element).closest('.form-group').addClass('has-error'); |
| | | }, |
| | | success : function(label) { |
| | | label.closest('.form-group').removeClass('has-error'); |
| | | label.remove(); |
| | | }, |
| | | errorPlacement : function(error, element) { |
| | | element.parent('div').append(error); |
| | | }, |
| | | submitHandler : function(form) { |
| | | |
| | |
| | | }); |
| | | } |
| | | }); |
| | | $("#updateModal").on('hide.bs.modal', function () { |
| | | $("#updateModal .form")[0].reset(); |
| | | updateModalValidate.resetForm(); |
| | | $("#updateModal .form .form-group").removeClass("has-error"); |
| | | $(".remote_panel").show(); // remote |
| | | $(".remote_panel").show(); // remote |
| | | |
| | | $("#updateModal .form input[name=permission]").parents('.form-group').show(); |
| | | }); |
| | | |
| | | }); |
| | |
| | | PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.ruoyi.generator.mapper.GenTableColumnMapper"> |
| | | |
| | | |
| | | <resultMap type="GenTableColumn" id="GenTableColumnResult"> |
| | | <id property="columnId" column="column_id" /> |
| | | <result property="tableId" column="table_id" /> |
| | |
| | | <result property="updateBy" column="update_by" /> |
| | | <result property="updateTime" column="update_time" /> |
| | | </resultMap> |
| | | |
| | | <select id="selectDbTableColumnsByName" parameterType="String" resultMap="GenTableColumnResult"> |
| | | select column_name, (case when (is_nullable = 'no' <![CDATA[ && ]]> column_key != 'PRI') then '1' else null end) as is_required, (case when column_key = 'PRI' then '1' else '0' end) as is_pk, ordinal_position as sort, column_comment, (case when extra = 'auto_increment' then '1' else '0' end) as is_increment, column_type |
| | | from information_schema.columns where table_schema = (select database()) and table_name = (#{tableName}) |
| | | order by ordinal_position |
| | | </select> |
| | | |
| | | </mapper> |
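| | |
| | | For reference, the statement above is invoked through a MyBatis mapper interface in the namespace declared by this file. A minimal sketch of what that interface could look like (the entity and package names are assumptions, not taken from this file):
| | |
| | | // Hypothetical sketch of the mapper interface backing the XML statement above
| | | package com.ruoyi.generator.mapper;
| | |
| | | import java.util.List;
| | |
| | | import com.ruoyi.generator.domain.GenTableColumn;
| | |
| | | public interface GenTableColumnMapper {
| | |
| | |     // matches <select id="selectDbTableColumnsByName" parameterType="String" resultMap="GenTableColumnResult">
| | |     List<GenTableColumn> selectDbTableColumnsByName(String tableName);
| | | }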
| | | <select id="selectDbTableColumnsByName" parameterType="String" resultMap="GenTableColumnResult"> |
| | | select column_name, (case when (is_nullable = 'no' <![CDATA[ && ]]> column_key != 'PRI') then '1' else null end) as is_required, (case when column_key = 'PRI' then '1' else '0' end) as is_pk, ordinal_position as sort, column_comment, (case when extra = 'auto_increment' then '1' else '0' end) as is_increment, column_type |
| | | from information_schema.columns where table_schema = (select database()) and table_name = (#{tableName}) |
| | | order by ordinal_position |
| | | </select> |
| | | |
| | | </mapper> |
| | |
| | | "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> |
| | | <mapper namespace="com.ruoyi.generator.mapper.GenTableMapper"> |
| | | |
| | | <resultMap type="GenTable" id="GenTableResult"> |
| | | <id property="tableId" column="table_id" /> |
| | | <result property="tableName" column="table_name" /> |
| | | <result property="tableComment" column="table_comment" /> |
| | | <result property="subTableName" column="sub_table_name" /> |
| | | <result property="subTableFkName" column="sub_table_fk_name" /> |
| | | <result property="className" column="class_name" /> |
| | | <result property="tplCategory" column="tpl_category" /> |
| | | <result property="packageName" column="package_name" /> |
| | | <result property="moduleName" column="module_name" /> |
| | | <result property="businessName" column="business_name" /> |
| | | <result property="functionName" column="function_name" /> |
| | | <result property="functionAuthor" column="function_author" /> |
| | | <result property="genType" column="gen_type" /> |
| | | <result property="genPath" column="gen_path" /> |
| | | <result property="options" column="options" /> |
| | | <result property="createBy" column="create_by" /> |
| | | <result property="createTime" column="create_time" /> |
| | | <result property="updateBy" column="update_by" /> |
| | | <result property="updateTime" column="update_time" /> |
| | | <result property="remark" column="remark" /> |
| | | <collection property="columns" javaType="java.util.List" resultMap="GenTableColumnResult" /> |
| | | </resultMap> |
| | | <resultMap type="GenTable" id="GenTableResult"> |
| | | <id property="tableId" column="table_id" /> |
| | | <result property="tableName" column="table_name" /> |
| | | <result property="tableComment" column="table_comment" /> |
| | | <result property="subTableName" column="sub_table_name" /> |
| | | <result property="subTableFkName" column="sub_table_fk_name" /> |
| | | <result property="className" column="class_name" /> |
| | | <result property="tplCategory" column="tpl_category" /> |
| | | <result property="packageName" column="package_name" /> |
| | | <result property="moduleName" column="module_name" /> |
| | | <result property="businessName" column="business_name" /> |
| | | <result property="functionName" column="function_name" /> |
| | | <result property="functionAuthor" column="function_author" /> |
| | | <result property="genType" column="gen_type" /> |
| | | <result property="genPath" column="gen_path" /> |
| | | <result property="options" column="options" /> |
| | | <result property="createBy" column="create_by" /> |
| | | <result property="createTime" column="create_time" /> |
| | | <result property="updateBy" column="update_by" /> |
| | | <result property="updateTime" column="update_time" /> |
| | | <result property="remark" column="remark" /> |
| | | <collection property="columns" javaType="java.util.List" resultMap="GenTableColumnResult" /> |
| | | </resultMap> |
| | | |
| | | <resultMap type="GenTableColumn" id="GenTableColumnResult"> |
| | | <resultMap type="GenTableColumn" id="GenTableColumnResult"> |
| | | <id property="columnId" column="column_id" /> |
| | | <result property="tableId" column="table_id" /> |
| | | <result property="columnName" column="column_name" /> |
| | |
| | | <result property="updateTime" column="update_time" /> |
| | | </resultMap> |
| | | |
| | | <sql id="selectGenTableVo"> |
| | | <sql id="selectGenTableVo"> |
| | | select table_id, table_name, table_comment, sub_table_name, sub_table_fk_name, class_name, tpl_category, package_name, module_name, business_name, function_name, function_author, gen_type, gen_path, options, create_by, create_time, update_by, update_time, remark from gen_table |
| | | </sql> |
| | | |
| | | <select id="selectPageGenTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | <include refid="selectGenTableVo"/> |
| | | <where> |
| | | <if test="genTable.tableName != null and genTable.tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{genTable.tableName}, '%')) |
| | | </if> |
| | | <if test="genTable.tableComment != null and genTable.tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{genTable.tableComment}, '%')) |
| | | </if> |
| | | <if test="genTable.params.beginTime != null and genTable.params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{genTable.params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="genTable.params.endTime != null and genTable.params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') &lt;= date_format(#{genTable.params.endTime},'%y%m%d')
| | | </if> |
| | | </where> |
| | | </select> |
| | | <select id="selectPageGenTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | <include refid="selectGenTableVo"/> |
| | | <where> |
| | | <if test="genTable.tableName != null and genTable.tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{genTable.tableName}, '%')) |
| | | </if> |
| | | <if test="genTable.tableComment != null and genTable.tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{genTable.tableComment}, '%')) |
| | | </if> |
| | | <if test="genTable.params.beginTime != null and genTable.params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{genTable.params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="genTable.params.endTime != null and genTable.params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') <= date_format(#{genTable.params.endTime},'%y%m%d') |
| | | </if> |
| | | </where> |
| | | </select> |
| | | |
| | | <select id="selectPageDbTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_schema = (select database()) |
| | | AND table_name NOT LIKE 'xxl_job_%' AND table_name NOT LIKE 'gen_%' |
| | | AND table_name NOT IN (select table_name from gen_table) |
| | | <if test="genTable.tableName != null and genTable.tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{genTable.tableName}, '%')) |
| | | </if> |
| | | <if test="genTable.tableComment != null and genTable.tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{genTable.tableComment}, '%')) |
| | | </if> |
| | | <if test="genTable.params.beginTime != null and genTable.params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{genTable.params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="genTable.params.endTime != null and genTable.params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') &lt;= date_format(#{genTable.params.endTime},'%y%m%d')
| | | </if> |
| | | order by create_time desc |
| | | </select> |
| | | |
| | | |
| | | <select id="selectGenTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | <include refid="selectGenTableVo"/> |
| | | <where> |
| | | <if test="tableName != null and tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{tableName}, '%')) |
| | | </if> |
| | | <if test="tableComment != null and tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{tableComment}, '%')) |
| | | </if> |
| | | <if test="params.beginTime != null and params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="params.endTime != null and params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') &lt;= date_format(#{params.endTime},'%y%m%d')
| | | </if> |
| | | </where> |
| | | </select> |
| | | |
| | | <select id="selectDbTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_schema = (select database()) |
| | | AND table_name NOT LIKE 'xxl_job_%' AND table_name NOT LIKE 'gen_%' |
| | | AND table_name NOT IN (select table_name from gen_table) |
| | | <if test="tableName != null and tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{tableName}, '%')) |
| | | </if> |
| | | <if test="tableComment != null and tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{tableComment}, '%')) |
| | | </if> |
| | | <if test="params.beginTime != null and params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="params.endTime != null and params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') <= date_format(#{params.endTime},'%y%m%d') |
| | | </if> |
| | | <select id="selectPageDbTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_schema = (select database()) |
| | | AND table_name NOT LIKE 'xxl_job_%' AND table_name NOT LIKE 'gen_%' |
| | | AND table_name NOT IN (select table_name from gen_table) |
| | | <if test="genTable.tableName != null and genTable.tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{genTable.tableName}, '%')) |
| | | </if> |
| | | <if test="genTable.tableComment != null and genTable.tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{genTable.tableComment}, '%')) |
| | | </if> |
| | | <if test="genTable.params.beginTime != null and genTable.params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{genTable.params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="genTable.params.endTime != null and genTable.params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') <= date_format(#{genTable.params.endTime},'%y%m%d') |
| | | </if> |
| | | order by create_time desc |
| | | </select> |
| | | </select> |
| | | |
| | | <select id="selectDbTableListByNames" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_name NOT LIKE 'xxl_job_%' and table_name NOT LIKE 'gen_%' and table_schema = (select database()) |
| | | and table_name in |
| | | <foreach collection="array" item="name" open="(" separator="," close=")"> |
| | | #{name} |
| | | |
| | | <select id="selectGenTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | <include refid="selectGenTableVo"/> |
| | | <where> |
| | | <if test="tableName != null and tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{tableName}, '%')) |
| | | </if> |
| | | <if test="tableComment != null and tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{tableComment}, '%')) |
| | | </if> |
| | | <if test="params.beginTime != null and params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="params.endTime != null and params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') <= date_format(#{params.endTime},'%y%m%d') |
| | | </if> |
| | | </where> |
| | | </select> |
| | | |
| | | <select id="selectDbTableList" parameterType="GenTable" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_schema = (select database()) |
| | | AND table_name NOT LIKE 'xxl_job_%' AND table_name NOT LIKE 'gen_%' |
| | | AND table_name NOT IN (select table_name from gen_table) |
| | | <if test="tableName != null and tableName != ''"> |
| | | AND lower(table_name) like lower(concat('%', #{tableName}, '%')) |
| | | </if> |
| | | <if test="tableComment != null and tableComment != ''"> |
| | | AND lower(table_comment) like lower(concat('%', #{tableComment}, '%')) |
| | | </if> |
| | | <if test="params.beginTime != null and params.beginTime != ''"><!-- 开始时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') >= date_format(#{params.beginTime},'%y%m%d') |
| | | </if> |
| | | <if test="params.endTime != null and params.endTime != ''"><!-- 结束时间检索 --> |
| | | AND date_format(create_time,'%y%m%d') &lt;= date_format(#{params.endTime},'%y%m%d')
| | | </if> |
| | | order by create_time desc |
| | | </select> |
| | | |
| | | <select id="selectDbTableListByNames" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_name NOT LIKE 'xxl_job_%' and table_name NOT LIKE 'gen_%' and table_schema = (select database()) |
| | | and table_name in |
| | | <foreach collection="array" item="name" open="(" separator="," close=")"> |
| | | #{name} |
| | | </foreach> |
| | | </select> |
| | | |
| | | <select id="selectTableByName" parameterType="String" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_comment <![CDATA[ <> ]]> '' and table_schema = (select database()) |
| | | and table_name = #{tableName} |
| | | </select> |
| | | <select id="selectTableByName" parameterType="String" resultMap="GenTableResult"> |
| | | select table_name, table_comment, create_time, update_time from information_schema.tables |
| | | where table_comment <![CDATA[ <> ]]> '' and table_schema = (select database()) |
| | | and table_name = #{tableName} |
| | | </select> |
| | | |
| | | <select id="selectGenTableById" parameterType="Long" resultMap="GenTableResult"> |
| | | SELECT t.table_id, t.table_name, t.table_comment, t.sub_table_name, t.sub_table_fk_name, t.class_name, t.tpl_category, t.package_name, t.module_name, t.business_name, t.function_name, t.function_author, t.gen_type, t.gen_path, t.options, t.remark, |
| | | c.column_id, c.column_name, c.column_comment, c.column_type, c.java_type, c.java_field, c.is_pk, c.is_increment, c.is_required, c.is_insert, c.is_edit, c.is_list, c.is_query, c.query_type, c.html_type, c.dict_type, c.sort |
| | | FROM gen_table t |
| | | LEFT JOIN gen_table_column c ON t.table_id = c.table_id |
| | | where t.table_id = #{tableId} order by c.sort |
| | | </select> |
| | | <select id="selectGenTableById" parameterType="Long" resultMap="GenTableResult"> |
| | | SELECT t.table_id, t.table_name, t.table_comment, t.sub_table_name, t.sub_table_fk_name, t.class_name, t.tpl_category, t.package_name, t.module_name, t.business_name, t.function_name, t.function_author, t.gen_type, t.gen_path, t.options, t.remark, |
| | | c.column_id, c.column_name, c.column_comment, c.column_type, c.java_type, c.java_field, c.is_pk, c.is_increment, c.is_required, c.is_insert, c.is_edit, c.is_list, c.is_query, c.query_type, c.html_type, c.dict_type, c.sort |
| | | FROM gen_table t |
| | | LEFT JOIN gen_table_column c ON t.table_id = c.table_id |
| | | where t.table_id = #{tableId} order by c.sort |
| | | </select> |
| | | |
| | | <select id="selectGenTableByName" parameterType="String" resultMap="GenTableResult"> |
| | | SELECT t.table_id, t.table_name, t.table_comment, t.sub_table_name, t.sub_table_fk_name, t.class_name, t.tpl_category, t.package_name, t.module_name, t.business_name, t.function_name, t.function_author, t.gen_type, t.gen_path, t.options, t.remark, |
| | | c.column_id, c.column_name, c.column_comment, c.column_type, c.java_type, c.java_field, c.is_pk, c.is_increment, c.is_required, c.is_insert, c.is_edit, c.is_list, c.is_query, c.query_type, c.html_type, c.dict_type, c.sort |
| | | FROM gen_table t |
| | | LEFT JOIN gen_table_column c ON t.table_id = c.table_id |
| | | where t.table_name = #{tableName} order by c.sort |
| | | </select> |
| | | <select id="selectGenTableByName" parameterType="String" resultMap="GenTableResult"> |
| | | SELECT t.table_id, t.table_name, t.table_comment, t.sub_table_name, t.sub_table_fk_name, t.class_name, t.tpl_category, t.package_name, t.module_name, t.business_name, t.function_name, t.function_author, t.gen_type, t.gen_path, t.options, t.remark, |
| | | c.column_id, c.column_name, c.column_comment, c.column_type, c.java_type, c.java_field, c.is_pk, c.is_increment, c.is_required, c.is_insert, c.is_edit, c.is_list, c.is_query, c.query_type, c.html_type, c.dict_type, c.sort |
| | | FROM gen_table t |
| | | LEFT JOIN gen_table_column c ON t.table_id = c.table_id |
| | | where t.table_name = #{tableName} order by c.sort |
| | | </select> |
| | | |
| | | <select id="selectGenTableAll" parameterType="String" resultMap="GenTableResult"> |
| | | SELECT t.table_id, t.table_name, t.table_comment, t.sub_table_name, t.sub_table_fk_name, t.class_name, t.tpl_category, t.package_name, t.module_name, t.business_name, t.function_name, t.function_author, t.options, t.remark, |
| | | c.column_id, c.column_name, c.column_comment, c.column_type, c.java_type, c.java_field, c.is_pk, c.is_increment, c.is_required, c.is_insert, c.is_edit, c.is_list, c.is_query, c.query_type, c.html_type, c.dict_type, c.sort |
| | | FROM gen_table t |
| | | LEFT JOIN gen_table_column c ON t.table_id = c.table_id |
| | | order by c.sort |
| | | </select> |
| | | <select id="selectGenTableAll" parameterType="String" resultMap="GenTableResult"> |
| | | SELECT t.table_id, t.table_name, t.table_comment, t.sub_table_name, t.sub_table_fk_name, t.class_name, t.tpl_category, t.package_name, t.module_name, t.business_name, t.function_name, t.function_author, t.options, t.remark, |
| | | c.column_id, c.column_name, c.column_comment, c.column_type, c.java_type, c.java_field, c.is_pk, c.is_increment, c.is_required, c.is_insert, c.is_edit, c.is_list, c.is_query, c.query_type, c.html_type, c.dict_type, c.sort |
| | | FROM gen_table t |
| | | LEFT JOIN gen_table_column c ON t.table_id = c.table_id |
| | | order by c.sort |
| | | </select> |
| | | |
| | | </mapper> |
| | |
| | | */ |
| | | public class OssConstant { |
| | | |
| | | /** |
| | | * OSS模块KEY |
| | | */ |
| | | public static final String SYS_OSS_KEY = "sys_oss:"; |
| | | |
| | | /** |
| | | * 对象存储配置KEY |
| | | */ |
| | | public static final String OSS_CONFIG_KEY = "OssConfig"; |
| | | |
| | | /** |
| | | * 缓存配置KEY |
| | | */ |
| | | public static final String CACHE_CONFIG_KEY = SYS_OSS_KEY + OSS_CONFIG_KEY; |
| | | |
| | | /** |
| | | * 预览列表资源开关Key |
| | | */ |
| | | public static final String PEREVIEW_LIST_RESOURCE_KEY = "sys.oss.previewListResource"; |
| | | |
| | | /** |
| | | * 系统数据ids |
| | | */ |
| | | public static final List<Integer> SYSTEM_DATA_IDS = Arrays.asList(1, 2, 3, 4); |
| | | |
| | | } |
| | |
| | | @Accessors(chain = true) |
| | | public class UploadResult { |
| | | |
| | | /** |
| | | * 文件路径 |
| | | */ |
| | | private String url; |
| | | |
| | | /** |
| | | * 文件名 |
| | | */ |
| | | private String filename; |
| | | } |
| | |
| | | @AllArgsConstructor |
| | | public enum OssEnumd { |
| | | |
| | | /** |
| | | * 七牛云 |
| | | */ |
| | | QINIU("qiniu", QiniuOssStrategy.class), |
| | | |
| | | /** |
| | | * 阿里云 |
| | | */ |
| | | ALIYUN("aliyun", AliyunOssStrategy.class), |
| | | |
| | | /** |
| | | * 腾讯云 |
| | | */ |
| | | QCLOUD("qcloud", QcloudOssStrategy.class), |
| | | |
| | | /** |
| | | * minio |
| | | */ |
| | | MINIO("minio", MinioOssStrategy.class); |
| | | |
| | | private final String value; |
| | | |
| | | private final Class<?> serviceClass; |
| | | |
| | | public static Class<?> getServiceClass(String value) { |
| | | for (OssEnumd clazz : values()) { |
| | | if (clazz.getValue().equals(value)) { |
| | | return clazz.getServiceClass(); |
| | | } |
| | | } |
| | | return null; |
| | | } |
| | | |
| | | public static String getServiceName(String value) { |
| | | for (OssEnumd clazz : values()) { |
| | | if (clazz.getValue().equals(value)) { |
| | | return StringUtils.uncapitalize(clazz.getServiceClass().getSimpleName()); |
| | | } |
| | | } |
| | | return null; |
| | | } |
| | | |
| | | |
| | | } |
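| | |
| | | As a worked example of the two lookups above (derived from the enum constants shown, not additional project code): getServiceClass("minio") resolves to MinioOssStrategy.class, and getServiceName("minio") returns "minioOssStrategy", i.e. the uncapitalized simple class name.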
| | |
| | | @AllArgsConstructor |
| | | public enum PolicyType { |
| | | |
| | | /** |
| | | * 只读 |
| | | */ |
| | | READ("read-only"), |
| | | |
| | | /** |
| | | * 只写 |
| | | */ |
| | | WRITE("write-only"), |
| | | |
| | | /** |
| | | * 读写 |
| | | */ |
| | | READ_WRITE("read-write"); |
| | | |
| | | /** |
| | | * 类型 |
| | | */ |
| | | private final String type; |
| | | |
| | | } |
| | |
| | | */ |
| | | public class OssException extends RuntimeException { |
| | | |
| | | private static final long serialVersionUID = 1L; |
| | | |
| | | public OssException(String msg) { |
| | | super(msg); |
| | | } |
| | | |
| | | } |
| | |
| | | @Slf4j |
| | | public class OssFactory { |
| | | |
| | | /** |
| | | * 服务实例缓存 |
| | | */ |
| | | private static final Map<String, IOssStrategy> SERVICES = new ConcurrentHashMap<>(); |
| | | |
| | | /** |
| | | * 初始化工厂 |
| | |
| | | }); |
| | | } |
| | | |
| | | /** |
| | | * 获取默认实例 |
| | | */ |
| | | public static IOssStrategy instance() { |
| | | // 获取redis 默认类型 |
| | | String type = RedisUtils.getCacheObject(OssConstant.CACHE_CONFIG_KEY); |
| | | if (StringUtils.isEmpty(type)) { |
| | | throw new OssException("文件存储服务类型无法找到!"); |
| | | } |
| | | return instance(type); |
| | | } |
| | | |
| | | /** |
| | | * 根据类型获取实例 |
| | | */ |
| | | public static IOssStrategy instance(String type) { |
| | | IOssStrategy service = SERVICES.get(type); |
| | | if (service == null) { |
| | | refreshService(type); |
| | | service = SERVICES.get(type); |
| | | } |
| | | return service; |
| | | } |
| | | |
| | | private static void refreshService(String type) { |
| | | Object json = RedisUtils.getCacheObject(OssConstant.SYS_OSS_KEY + type); |
| | | OssProperties properties = JsonUtils.parseObject(json.toString(), OssProperties.class); |
| | | if (properties == null) { |
| | | throw new OssException("系统异常, '" + type + "'配置信息不存在!"); |
| | | } |
| | | // 获取redis配置信息 创建对象 并缓存 |
| | | IOssStrategy service = (IOssStrategy) ReflectUtils.newInstance(OssEnumd.getServiceClass(type)); |
| | | ((AbstractOssStrategy)service).init(properties); |
| | | SERVICES.put(type, service); |
| | | } |
| | | |
| | | } |
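| | |
| | | As a usage illustration only (not code from the project), a caller could resolve the currently configured strategy through OssFactory and upload through the IOssStrategy methods; the file suffix, content type, and the Lombok-generated getter on UploadResult are assumptions here:
| | |
| | | // Hypothetical caller-side sketch for OssFactory / IOssStrategy
| | | public class OssUploadExample {
| | |
| | |     public static String uploadPng(byte[] data) {
| | |         // resolve the strategy for the default type stored in Redis
| | |         IOssStrategy storage = OssFactory.instance();
| | |         // ".png" and "image/png" are example values
| | |         UploadResult result = storage.uploadSuffix(data, ".png", "image/png");
| | |         return result.getUrl();
| | |     }
| | | }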
| | |
| | | @Data |
| | | public class OssProperties { |
| | | |
| | | /** |
| | | * 域名 |
| | | */ |
| | | private String endpoint; |
| | | |
| | | /** |
| | | * 前缀 |
| | | */ |
| | | private String prefix; |
| | | |
| | | /** |
| | | * ACCESS_KEY |
| | | */ |
| | | private String accessKey; |
| | | |
| | | /** |
| | | * SECRET_KEY |
| | | */ |
| | | private String secretKey; |
| | | |
| | | /** |
| | | * 存储空间名 |
| | | */ |
| | | private String bucketName; |
| | | |
| | | /** |
| | | * 存储区域 |
| | | */ |
| | | private String region; |
| | | |
| | | /** |
| | | * 是否https(Y=是,N=否) |
| | | */ |
| | | private String isHttps; |
| | | |
| | | } |
| | |
| | | */ |
| | | public interface IOssStrategy { |
| | | |
| | | void createBucket(); |
| | | |
| | | /** |
| | | * 获取服务商类型 |
| | | */ |
| | | String getServiceType(); |
| | | |
| | | /** |
| | | * 文件上传 |
| | | * |
| | | * @param data 文件字节数组 |
| | | * @param path 文件路径,包含文件名 |
| | | * @return 返回http地址 |
| | | */ |
| | | UploadResult upload(byte[] data, String path, String contentType); |
| | | |
| | | /** |
| | | * 文件删除 |
| | | * |
| | | * @param path 文件路径,包含文件名 |
| | | */ |
| | | void delete(String path); |
| | | |
| | | /** |
| | | * 文件上传 |
| | | * |
| | | * @param data 文件字节数组 |
| | | * @param suffix 后缀 |
| | | * @return 返回http地址 |
| | | */ |
| | | UploadResult uploadSuffix(byte[] data, String suffix, String contentType); |
| | | |
| | | /** |
| | | * 文件上传 |
| | | * |
| | | * @param inputStream 字节流 |
| | | * @param path 文件路径,包含文件名 |
| | | * @return 返回http地址 |
| | | */ |
| | | UploadResult upload(InputStream inputStream, String path, String contentType); |
| | | |
| | | /** |
| | | * 文件上传 |
| | | * |
| | | * @param inputStream 字节流 |
| | | * @param suffix 后缀 |
| | | * @return 返回http地址 |
| | | */ |
| | | UploadResult uploadSuffix(InputStream inputStream, String suffix, String contentType); |
| | | |
| | | } |
| | |
| | | */ |
| | | public abstract class AbstractOssStrategy implements IOssStrategy { |
| | | |
| | | protected OssProperties properties; |
| | | |
| | | public abstract void init(OssProperties properties); |
| | | |
| | | @Override |
| | | public abstract void createBucket(); |
| | | |
| | | @Override |
| | | public abstract String getServiceType(); |
| | | |
| | | public String getPath(String prefix, String suffix) { |
| | | // 生成uuid |
| | | String uuid = IdUtil.fastSimpleUUID(); |
| | | // 文件路径 |
| | | String path = DateUtils.datePath() + "/" + uuid; |
| | | if (StringUtils.isNotBlank(prefix)) { |
| | | path = prefix + "/" + path; |
| | | } |
| | | return path + suffix; |
| | | } |
| | | |
| | | @Override |
| | | public abstract UploadResult upload(byte[] data, String path, String contentType); |
| | | |
| | | @Override |
| | | public abstract void delete(String path); |
| | | |
| | | @Override |
| | | public UploadResult upload(InputStream inputStream, String path, String contentType) { |
| | | byte[] data = IoUtil.readBytes(inputStream); |
| | | return this.upload(data, path, contentType); |
| | | } |
| | | |
| | | @Override |
| | | public abstract UploadResult uploadSuffix(byte[] data, String suffix, String contentType); |
| | | |
| | | @Override |
| | | public abstract UploadResult uploadSuffix(InputStream inputStream, String suffix, String contentType); |
| | | |
| | | public abstract String getEndpointLink(); |
| | | } |
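| | |
| | | A new storage provider would plug into this hierarchy by extending AbstractOssStrategy and registering a matching entry in OssEnumd so OssFactory can resolve it. A minimal skeleton, purely as an illustration (the "local" type and the class itself are hypothetical, and persistence logic is omitted):
| | |
| | | // Hypothetical provider skeleton; not part of the project sources
| | | public class LocalOssStrategy extends AbstractOssStrategy {
| | |
| | |     @Override
| | |     public void init(OssProperties properties) {
| | |         this.properties = properties;
| | |         createBucket();
| | |     }
| | |
| | |     @Override
| | |     public void createBucket() {
| | |         // nothing to create for this illustration
| | |     }
| | |
| | |     @Override
| | |     public String getServiceType() {
| | |         // would also require a corresponding OssEnumd constant to be resolvable
| | |         return "local";
| | |     }
| | |
| | |     @Override
| | |     public UploadResult upload(byte[] data, String path, String contentType) {
| | |         // a real implementation would persist the bytes before returning
| | |         return new UploadResult().setUrl(getEndpointLink() + "/" + path).setFilename(path);
| | |     }
| | |
| | |     @Override
| | |     public void delete(String path) {
| | |         // a real implementation would remove the object addressed by path
| | |     }
| | |
| | |     @Override
| | |     public UploadResult uploadSuffix(byte[] data, String suffix, String contentType) {
| | |         return upload(data, getPath(properties.getPrefix(), suffix), contentType);
| | |     }
| | |
| | |     @Override
| | |     public UploadResult uploadSuffix(java.io.InputStream inputStream, String suffix, String contentType) {
| | |         return upload(inputStream, getPath(properties.getPrefix(), suffix), contentType);
| | |     }
| | |
| | |     @Override
| | |     public String getEndpointLink() {
| | |         return properties.getEndpoint() + "/" + properties.getBucketName();
| | |     }
| | | }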
| | |
| | | */ |
| | | public class AliyunOssStrategy extends AbstractOssStrategy { |
| | | |
| | | private OSSClient client; |
| | | |
| | | @Override |
| | | public void init(OssProperties cloudStorageProperties) { |
| | | properties = cloudStorageProperties; |
| | | try { |
| | | ClientConfiguration configuration = new ClientConfiguration(); |
| | | DefaultCredentialProvider credentialProvider = new DefaultCredentialProvider( |
| | | properties.getAccessKey(), properties.getSecretKey()); |
| | | client = new OSSClient(properties.getEndpoint(), credentialProvider, configuration); |
| | | createBucket(); |
| | | } catch (Exception e) { |
| | | throw new OssException("阿里云存储配置错误! 请检查系统配置:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public void createBucket() { |
| | | try { |
| | | String bucketName = properties.getBucketName(); |
| | | if (client.doesBucketExist(bucketName)) { |
| | | return; |
| | | } |
| | | CreateBucketRequest createBucketRequest = new CreateBucketRequest(bucketName); |
| | | createBucketRequest.setCannedACL(CannedAccessControlList.PublicRead); |
| | | client.createBucket(createBucketRequest); |
| | | } catch (Exception e) { |
| | | throw new OssException("创建Bucket失败, 请核对阿里云配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public String getServiceType() { |
| | | return OssEnumd.ALIYUN.getValue(); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult upload(byte[] data, String path, String contentType) { |
| | | return upload(new ByteArrayInputStream(data), path, contentType); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult upload(InputStream inputStream, String path, String contentType) { |
| | | try { |
| | | ObjectMetadata metadata = new ObjectMetadata(); |
| | | metadata.setContentType(contentType); |
| | | client.putObject(new PutObjectRequest(properties.getBucketName(), path, inputStream, metadata)); |
| | | } catch (Exception e) { |
| | | throw new OssException("上传文件失败,请检查阿里云配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | return new UploadResult().setUrl(getEndpointLink() + "/" + path).setFilename(path); |
| | | } |
| | | |
| | | @Override |
| | | public void delete(String path) { |
| | | path = path.replace(getEndpointLink() + "/", ""); |
| | | try { |
| | | client.deleteObject(properties.getBucketName(), path); |
| | | } catch (Exception e) { |
| | | throw new OssException("上传文件失败,请检查阿里云配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(byte[] data, String suffix, String contentType) { |
| | | return upload(data, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(InputStream inputStream, String suffix, String contentType) { |
| | | return upload(inputStream, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public String getEndpointLink() { |
| | | String endpoint = properties.getEndpoint(); |
| | | StringBuilder sb = new StringBuilder(endpoint); |
| | | if (StringUtils.containsAnyIgnoreCase(endpoint, "http://")) { |
| | | sb.insert(7, properties.getBucketName() + "."); |
| | | } else if (StringUtils.containsAnyIgnoreCase(endpoint, "https://")) { |
| | | sb.insert(8, properties.getBucketName() + "."); |
| | | } else { |
| | | throw new OssException("Endpoint配置错误"); |
| | | } |
| | | return sb.toString(); |
| | | } |
| | | } |
| | |
| | | */ |
| | | public class MinioOssStrategy extends AbstractOssStrategy { |
| | | |
| | | private MinioClient minioClient; |
| | | |
| | | @Override |
| | | public void init(OssProperties cloudStorageProperties) { |
| | | properties = cloudStorageProperties; |
| | | try { |
| | | minioClient = MinioClient.builder() |
| | | .endpoint(properties.getEndpoint()) |
| | | .credentials(properties.getAccessKey(), properties.getSecretKey()) |
| | | .build(); |
| | | createBucket(); |
| | | } catch (Exception e) { |
| | | throw new OssException("Minio存储配置错误! 请检查系统配置:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public void createBucket() { |
| | | try { |
| | | String bucketName = properties.getBucketName(); |
| | | boolean exists = minioClient.bucketExists(BucketExistsArgs.builder().bucket(bucketName).build()); |
| | | if (exists) { |
| | | return; |
| | | } |
| | | // 不存在就创建桶 |
| | | minioClient.makeBucket(MakeBucketArgs.builder().bucket(bucketName).build()); |
| | | minioClient.setBucketPolicy(SetBucketPolicyArgs.builder() |
| | | .bucket(bucketName) |
| | | .config(getPolicy(bucketName, PolicyType.READ)) |
| | | .build()); |
| | | } catch (Exception e) { |
| | | throw new OssException("创建Bucket失败, 请核对Minio配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public String getServiceType() { |
| | | return OssEnumd.MINIO.getValue(); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult upload(byte[] data, String path, String contentType) { |
| | | return upload(new ByteArrayInputStream(data), path, contentType); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult upload(InputStream inputStream, String path, String contentType) { |
| | | try { |
| | | minioClient.putObject(PutObjectArgs.builder() |
| | | .bucket(properties.getBucketName()) |
| | | .object(path) |
| | | .contentType(StringUtils.blankToDefault(contentType, MediaType.APPLICATION_OCTET_STREAM_VALUE)) |
| | | .stream(inputStream, inputStream.available(), -1) |
| | | .build()); |
| | | } catch (Exception e) { |
| | | throw new OssException("上传文件失败,请核对Minio配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | return new UploadResult().setUrl(getEndpointLink() + "/" + path).setFilename(path); |
| | | } |
| | | |
| | | @Override |
| | | public void delete(String path) { |
| | | path = path.replace(getEndpointLink() + "/", ""); |
| | | try { |
| | | minioClient.removeObject(RemoveObjectArgs.builder() |
| | | .bucket(properties.getBucketName()) |
| | | .object(path) |
| | | .build()); |
| | | } catch (Exception e) { |
| | | throw new OssException(e.getMessage()); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(byte[] data, String suffix, String contentType) { |
| | | return upload(data, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(InputStream inputStream, String suffix, String contentType) { |
| | | return upload(inputStream, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public String getEndpointLink() { |
| | | return properties.getEndpoint() + "/" + properties.getBucketName(); |
| | | } |
| | | |
| | | private String getPolicy(String bucketName, PolicyType policyType) { |
| | | StringBuilder builder = new StringBuilder(); |
| | | builder.append("{\n"); |
| | | builder.append(" \"Statement\": [\n"); |
| | | builder.append(" {\n"); |
| | | builder.append(" \"Action\": [\n"); |
| | | if (policyType == PolicyType.WRITE) { |
| | | builder.append(" \"s3:GetBucketLocation\",\n"); |
| | | builder.append(" \"s3:ListBucketMultipartUploads\"\n"); |
| | | } else if (policyType == PolicyType.READ_WRITE) { |
| | | builder.append(" \"s3:GetBucketLocation\",\n"); |
| | | builder.append(" \"s3:ListBucket\",\n"); |
| | | builder.append(" \"s3:ListBucketMultipartUploads\"\n"); |
| | | } else { |
| | | builder.append(" \"s3:GetBucketLocation\"\n"); |
| | | } |
| | | builder.append(" ],\n"); |
| | | builder.append(" \"Effect\": \"Allow\",\n"); |
| | | builder.append(" \"Principal\": \"*\",\n"); |
| | | builder.append(" \"Resource\": \"arn:aws:s3:::"); |
| | | builder.append(bucketName); |
| | | builder.append("\"\n"); |
| | | builder.append(" },\n"); |
| | | if (PolicyType.READ.equals(policyType)) { |
| | | builder.append(" {\n"); |
| | | builder.append(" \"Action\": [\n"); |
| | | builder.append(" \"s3:ListBucket\"\n"); |
| | | builder.append(" ],\n"); |
| | | builder.append(" \"Effect\": \"Deny\",\n"); |
| | | builder.append(" \"Principal\": \"*\",\n"); |
| | | builder.append(" \"Resource\": \"arn:aws:s3:::"); |
| | | builder.append(bucketName); |
| | | builder.append("\"\n"); |
| | | builder.append(" },\n"); |
| | | } |
| | | builder.append(" {\n"); |
| | | builder.append(" \"Action\": "); |
| | | switch (policyType) { |
| | | case WRITE: |
| | | builder.append("[\n"); |
| | | builder.append(" \"s3:AbortMultipartUpload\",\n"); |
| | | builder.append(" \"s3:DeleteObject\",\n"); |
| | | builder.append(" \"s3:ListMultipartUploadParts\",\n"); |
| | | builder.append(" \"s3:PutObject\"\n"); |
| | | builder.append(" ],\n"); |
| | | break; |
| | | case READ_WRITE: |
| | | builder.append("[\n"); |
| | | builder.append(" \"s3:AbortMultipartUpload\",\n"); |
| | | builder.append(" \"s3:DeleteObject\",\n"); |
| | | builder.append(" \"s3:GetObject\",\n"); |
| | | builder.append(" \"s3:ListMultipartUploadParts\",\n"); |
| | | builder.append(" \"s3:PutObject\"\n"); |
| | | builder.append(" ],\n"); |
| | | break; |
| | | default: |
| | | builder.append("\"s3:GetObject\",\n"); |
| | | break; |
| | | } |
| | | builder.append(" \"Effect\": \"Allow\",\n"); |
| | | builder.append(" \"Principal\": \"*\",\n"); |
| | | builder.append(" \"Resource\": \"arn:aws:s3:::"); |
| | | builder.append(bucketName); |
| | | builder.append("/*\"\n"); |
| | | builder.append(" }\n"); |
| | | builder.append(" ],\n"); |
| | | builder.append(" \"Version\": \"2012-10-17\"\n"); |
| | | builder.append("}\n"); |
| | | return builder.toString(); |
| | | } |
| | | } |
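| | |
| | | For reference, getPolicy above assembles a standard S3 bucket policy (Version "2012-10-17"). For PolicyType.READ it emits three statements: Allow s3:GetBucketLocation on the bucket, Deny s3:ListBucket on the bucket, and Allow s3:GetObject on every object under the bucket (bucket/*).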
| | |
| | | */ |
| | | public class QcloudOssStrategy extends AbstractOssStrategy { |
| | | |
| | | private COSClient client; |
| | | |
| | | @Override |
| | | public void init(OssProperties cloudStorageProperties) { |
| | | properties = cloudStorageProperties; |
| | | try { |
| | | COSCredentials credentials = new BasicCOSCredentials( |
| | | properties.getAccessKey(), properties.getSecretKey()); |
| | | // 初始化客户端配置 |
| | | ClientConfig clientConfig = new ClientConfig(); |
| | | // 设置bucket所在的区域,华南:gz 华北:tj 华东:sh |
| | | clientConfig.setRegion(new Region(properties.getRegion())); |
| | | if ("Y".equals(properties.getIsHttps())) { |
| | | clientConfig.setHttpProtocol(HttpProtocol.https); |
| | | } else { |
| | | clientConfig.setHttpProtocol(HttpProtocol.http); |
| | | } |
| | | client = new COSClient(credentials, clientConfig); |
| | | createBucket(); |
| | | } catch (Exception e) { |
| | | throw new OssException("腾讯云存储配置错误! 请检查系统配置:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public void createBucket() { |
| | | try { |
| | | String bucketName = properties.getBucketName(); |
| | | if (client.doesBucketExist(bucketName)) { |
| | | return; |
| | | } |
| | | CreateBucketRequest createBucketRequest = new CreateBucketRequest(bucketName); |
| | | createBucketRequest.setCannedAcl(CannedAccessControlList.PublicRead); |
| | | client.createBucket(createBucketRequest); |
| | | } catch (Exception e) { |
| | | throw new OssException("创建Bucket失败, 请核对腾讯云配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public String getServiceType() { |
| | | return OssEnumd.QCLOUD.getValue(); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult upload(byte[] data, String path, String contentType) { |
| | | return upload(new ByteArrayInputStream(data), path, contentType); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult upload(InputStream inputStream, String path, String contentType) { |
| | | try { |
| | | ObjectMetadata metadata = new ObjectMetadata(); |
| | | metadata.setContentType(contentType); |
| | | client.putObject(new PutObjectRequest(properties.getBucketName(), path, inputStream, metadata)); |
| | | } catch (Exception e) { |
| | | throw new OssException("上传文件失败,请检查腾讯云配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | return new UploadResult().setUrl(getEndpointLink() + "/" + path).setFilename(path); |
| | | } |
| | | |
| | | @Override |
| | | public void delete(String path) { |
| | | path = path.replace(getEndpointLink() + "/", ""); |
| | | try { |
| | | client.deleteObject(new DeleteObjectRequest(properties.getBucketName(), path)); |
| | | } catch (Exception e) { |
| | | throw new OssException("上传文件失败,请检腾讯云查配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(byte[] data, String suffix, String contentType) { |
| | | return upload(data, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(InputStream inputStream, String suffix, String contentType) { |
| | | return upload(inputStream, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public String getEndpointLink() { |
| | | String endpoint = properties.getEndpoint(); |
| | | StringBuilder sb = new StringBuilder(endpoint); |
| | | if (StringUtils.containsAnyIgnoreCase(endpoint, "http://")) { |
| | | sb.insert(7, properties.getBucketName() + "."); |
| | | } else if (StringUtils.containsAnyIgnoreCase(endpoint, "https://")) { |
| | | sb.insert(8, properties.getBucketName() + "."); |
| | | } else { |
| | | throw new OssException("Endpoint配置错误"); |
| | | } |
| | | return sb.toString(); |
| | | } |
| | | } |
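| | | |
| | | getEndpointLink() above builds a virtual-hosted-style URL by inserting the bucket name right after the URL scheme. Below is a minimal, dependency-free sketch of that transformation; the endpoint and bucket values are made up, and startsWith stands in for the StringUtils helper only to keep the snippet self-contained. |
| | | |
| | | // Illustrative only: mirrors the scheme-offset insertion used by getEndpointLink() above. |
| | | public class EndpointLinkDemo { |
| | | static String endpointLink(String endpoint, String bucketName) { |
| | | StringBuilder sb = new StringBuilder(endpoint); |
| | | if (endpoint.startsWith("http://")) { |
| | | sb.insert(7, bucketName + "."); // position 7 = right after "http://" |
| | | } else if (endpoint.startsWith("https://")) { |
| | | sb.insert(8, bucketName + "."); // position 8 = right after "https://" |
| | | } else { |
| | | throw new IllegalArgumentException("Endpoint配置错误"); |
| | | } |
| | | return sb.toString(); |
| | | } |
| | | public static void main(String[] args) { |
| | | // e.g. https://cos.ap-shanghai.myqcloud.com + bucket "demo" -> https://demo.cos.ap-shanghai.myqcloud.com |
| | | System.out.println(endpointLink("https://cos.ap-shanghai.myqcloud.com", "demo")); |
| | | } |
| | | } |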
| | |
| | | */ |
| | | public class QiniuOssStrategy extends AbstractOssStrategy { |
| | | |
| | | private UploadManager uploadManager; |
| | | private BucketManager bucketManager; |
| | | private Auth auth; |
| | | |
| | | @Override |
| | | public void init(OssProperties cloudStorageProperties) { |
| | | properties = cloudStorageProperties; |
| | | try { |
| | | Configuration config = new Configuration(getRegion(properties.getRegion())); |
| | | // https设置 |
| | | config.useHttpsDomains = "Y".equals(properties.getIsHttps()); |
| | | uploadManager = new UploadManager(config); |
| | | auth = Auth.create(properties.getAccessKey(), properties.getSecretKey()); |
| | | String bucketName = properties.getBucketName(); |
| | | bucketManager = new BucketManager(auth, config); |
| | | |
| | | if (!ArrayUtil.contains(bucketManager.buckets(), bucketName)) { |
| | | bucketManager.createBucket(bucketName, properties.getRegion()); |
| | | } |
| | | } catch (Exception e) { |
| | | throw new OssException("七牛云存储配置错误! 请检查系统配置:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public void createBucket() { |
| | | try { |
| | | String bucketName = properties.getBucketName(); |
| | | if (ArrayUtil.contains(bucketManager.buckets(), bucketName)) { |
| | | return; |
| | | } |
| | | bucketManager.createBucket(bucketName, properties.getRegion()); |
| | | } catch (Exception e) { |
| | | throw new OssException("创建Bucket失败, 请核对七牛云配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public String getServiceType() { |
| | | return OssEnumd.QINIU.getValue(); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult upload(byte[] data, String path, String contentType) { |
| | | try { |
| | | String token = auth.uploadToken(properties.getBucketName()); |
| | | Response res = uploadManager.put(data, path, token, null, contentType, false); |
| | | if (!res.isOK()) { |
| | | throw new RuntimeException("上传七牛出错:" + res.error); |
| | | } |
| | | } catch (Exception e) { |
| | | throw new OssException("上传文件失败,请核对七牛配置信息:[" + e.getMessage() + "]"); |
| | | } |
| | | return new UploadResult().setUrl(getEndpointLink() + "/" + path).setFilename(path); |
| | | } |
| | | |
| | | @Override |
| | | public void delete(String path) { |
| | | try { |
| | | path = path.replace(getEndpointLink() + "/", ""); |
| | | Response res = bucketManager.delete(properties.getBucketName(), path); |
| | | if (!res.isOK()) { |
| | | throw new RuntimeException("删除七牛文件出错:" + res.error); |
| | | } |
| | | } catch (Exception e) { |
| | | throw new OssException(e.getMessage()); |
| | | } |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(byte[] data, String suffix, String contentType) { |
| | | return upload(data, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public UploadResult uploadSuffix(InputStream inputStream, String suffix, String contentType) { |
| | | return upload(inputStream, getPath(properties.getPrefix(), suffix), contentType); |
| | | } |
| | | |
| | | @Override |
| | | public String getEndpointLink() { |
| | | return properties.getEndpoint(); |
| | | } |
| | | |
| | | private Region getRegion(String region) { |
| | | switch (region) { |
| | | case "z0": |
| | | return Region.region0(); |
| | | case "z1": |
| | | return Region.region1(); |
| | | case "z2": |
| | | return Region.region2(); |
| | | case "na0": |
| | | return Region.regionNa0(); |
| | | case "as0": |
| | | return Region.regionAs0(); |
| | | default: |
| | | return Region.autoRegion(); |
| | | } |
| | | } |
| | | |
| | | } |
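| | | |
| | | delete() above accepts either a bare object key or the full URL returned by upload(), stripping the endpoint prefix before it calls the SDK. A small sketch of that normalization follows; the class name and the endpoint/path values are made up for illustration. |
| | | |
| | | // Illustrative only: mirrors how delete() reduces a stored URL back to the object key. |
| | | public class ObjectKeyDemo { |
| | | static String toObjectKey(String pathOrUrl, String endpointLink) { |
| | | // If a full URL was stored, drop "<endpointLink>/" to recover the key; a bare key passes through unchanged. |
| | | return pathOrUrl.replace(endpointLink + "/", ""); |
| | | } |
| | | public static void main(String[] args) { |
| | | String endpoint = "http://cdn.example.com"; |
| | | String stored = endpoint + "/2021/09/30/demo.png"; |
| | | System.out.println(toObjectKey(stored, endpoint)); // prints 2021/09/30/demo.png |
| | | } |
| | | } |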
| | |
| | | <select id="selectDeptList" parameterType="SysDept" resultMap="SysDeptResult"> |
| | | <include refid="selectDeptVo"/> |
| | | where d.del_flag = '0' |
| | | <if test="deptId != null and deptId != 0"> |
| | | AND dept_id = #{deptId} |
| | | </if> |
| | | <if test="deptId != null and deptId != 0"> |
| | | AND dept_id = #{deptId} |
| | | </if> |
| | | <if test="parentId != null and parentId != 0"> |
| | | AND parent_id = #{parentId} |
| | | </if> |
| | |
| | | <result property="orderNum" column="order_num"/> |
| | | <result property="path" column="path"/> |
| | | <result property="component" column="component"/> |
| | | <result property="query" column="query"/> |
| | | <result property="isFrame" column="is_frame"/> |
| | | <result property="query" column="query"/> |
| | | <result property="isFrame" column="is_frame"/> |
| | | <result property="isCache" column="is_cache"/> |
| | | <result property="menuType" column="menu_type"/> |
| | | <result property="visible" column="visible"/> |
| | |
| | | m.menu_name, |
| | | m.path, |
| | | m.component, |
| | | m.`query`, |
| | | m.visible, |
| | | m.status, |
| | | ifnull(m.perms, '') as perms, |
| | | m.is_frame, |
| | |
| | | m.menu_name, |
| | | m.path, |
| | | m.component, |
| | | m.`query`, |
| | | m.visible, |
| | | m.status, |
| | | ifnull(m.perms, '') as perms, |
| | |
| | | r.data_scope, |
| | | r.status as role_status |
| | | from sys_user u |
| | | left join sys_dept d on u.dept_id = d.dept_id |
| | | left join sys_user_role ur on u.user_id = ur.user_id |
| | | left join sys_role r on r.role_id = ur.role_id |
| | | </sql> |
| | | |
| | | <select id="selectPageUserList" parameterType="SysUser" resultMap="SysUserResult"> |
| | |
| | | <link rel="icon" href="<%= BASE_URL %>favicon.ico"> |
| | | <title><%= webpackConfig.name %></title> |
| | | <!--[if lt IE 11]><script>window.location.href='/html/ie.html';</script><![endif]--> |
| | | <style> |
| | | <style> |
| | | html, |
| | | body, |
| | | #app { |
| | |
| | | </head> |
| | | <body> |
| | | <div id="app"> |
| | | <div id="loader-wrapper"> |
| | | <div id="loader"></div> |
| | | <div class="loader-section section-left"></div> |
| | | <div class="loader-section section-right"></div> |
| | | <div class="load_title">正在加载系统资源,请耐心等待</div> |
| | | <div id="loader-wrapper"> |
| | | <div id="loader"></div> |
| | | <div class="loader-section section-left"></div> |
| | | <div class="loader-section section-right"></div> |
| | | <div class="load_title">正在加载系统资源,请耐心等待</div> |
| | | </div> |
| | | </div> |
| | | </div> |
| | | </body> |
| | | </html> |
| | |
| | | /** |
| | | * 通用css样式布局处理 |
| | | * Copyright (c) 2019 ruoyi |
| | | */ |
| | | |
| | | /** 基础通用 **/ |
| | | .pt5 { |
| | | padding-top: 5px; |
| | | } |
| | | |
| | | .pr5 { |
| | | padding-right: 5px; |
| | | } |
| | | |
| | | .pb5 { |
| | | padding-bottom: 5px; |
| | | } |
| | | |
| | | .mt5 { |
| | | margin-top: 5px; |
| | | } |
| | | |
| | | .mr5 { |
| | | margin-right: 5px; |
| | | } |
| | | |
| | | .mb5 { |
| | | margin-bottom: 5px; |
| | | } |
| | | |
| | | .mb8 { |
| | | margin-bottom: 8px; |
| | | } |
| | | |
| | | .ml5 { |
| | | margin-left: 5px; |
| | | } |
| | | |
| | | .mt10 { |
| | | margin-top: 10px; |
| | | } |
| | | |
| | | .mr10 { |
| | | margin-right: 10px; |
| | | } |
| | | |
| | | .mb10 { |
| | | margin-bottom: 10px; |
| | | } |
| | | |
| | | .ml0 { |
| | | margin-left: 10px; |
| | | } |
| | | |
| | | .mt20 { |
| | | margin-top: 20px; |
| | | } |
| | | |
| | | .mr20 { |
| | | margin-right: 20px; |
| | | } |
| | | |
| | | .mb20 { |
| | | margin-bottom: 20px; |
| | | } |
| | | |
| | | .m20 { |
| | | margin-left: 20px; |
| | | } |
| | | |
| | | .h1, .h2, .h3, .h4, .h5, .h6, h1, h2, h3, h4, h5, h6 { |
| | | font-family: inherit; |
| | | font-weight: 500; |
| | | line-height: 1.1; |
| | | color: inherit; |
| | | } |
| | | |
| | | .el-dialog:not(.is-fullscreen) { |
| | | margin-top: 6vh !important; |
| | | } |
| | | |
| | | .el-dialog__wrapper.scrollbar .el-dialog .el-dialog__body { |
| | | overflow: auto; |
| | | overflow-x: hidden; |
| | | max-height: 70vh; |
| | | padding: 10px 20px 0; |
| | | } |
| | | |
| | | .el-table { |
| | | .el-table__header-wrapper, .el-table__fixed-header-wrapper { |
| | | th { |
| | | word-break: break-word; |
| | | background-color: #f8f8f9; |
| | | color: #515a6e; |
| | | height: 40px; |
| | | font-size: 13px; |
| | | } |
| | | } |
| | | .el-table__body-wrapper { |
| | | .el-button [class*="el-icon-"] + span { |
| | | margin-left: 1px; |
| | | } |
| | | } |
| | | } |
| | | |
| | | /** 表单布局 **/ |
| | | .form-header { |
| | | font-size: 15px; |
| | | color: #6379bb; |
| | | border-bottom: 1px solid #ddd; |
| | | margin: 8px 10px 25px 10px; |
| | | padding-bottom: 5px; |
| | | } |
| | | |
| | | /** 表格布局 **/ |
| | | .pagination-container { |
| | | position: relative; |
| | | height: 25px; |
| | | margin-bottom: 10px; |
| | | margin-top: 15px; |
| | | padding: 10px 20px !important; |
| | | } |
| | | |
| | | /* tree border */ |
| | | .tree-border { |
| | | margin-top: 5px; |
| | | border: 1px solid #e5e6e7; |
| | | background: #FFFFFF none; |
| | | border-radius: 4px; |
| | | } |
| | | |
| | | .pagination-container .el-pagination { |
| | | right: 0; |
| | | position: absolute; |
| | | } |
| | | |
| | | @media (max-width: 768px) { |
| | | .pagination-container .el-pagination > .el-pagination__jump { |
| | | display: none !important; |
| | | } |
| | |
| | | } |
| | | |
| | | .el-table .fixed-width .el-button--mini { |
| | | padding-left: 0; |
| | | padding-right: 0; |
| | | width: inherit; |
| | | } |
| | | |
| | | /** 表格更多操作下拉样式 */ |
| | | .el-table .el-dropdown-link { |
| | | cursor: pointer; |
| | | color: #409EFF; |
| | | margin-left: 5px; |
| | | } |
| | | |
| | | .el-table .el-dropdown, .el-icon-arrow-down { |
| | | font-size: 12px; |
| | | } |
| | | |
| | | .el-tree-node__content > .el-checkbox { |
| | | margin-right: 8px; |
| | | } |
| | | |
| | | .list-group-striped > .list-group-item { |
| | | border-left: 0; |
| | | border-right: 0; |
| | | border-radius: 0; |
| | | padding-left: 0; |
| | | padding-right: 0; |
| | | } |
| | | |
| | | .list-group { |
| | | padding-left: 0px; |
| | | list-style: none; |
| | | } |
| | | |
| | | .list-group-item { |
| | | border-bottom: 1px solid #e7eaec; |
| | | border-top: 1px solid #e7eaec; |
| | | margin-bottom: -1px; |
| | | padding: 11px 0px; |
| | | font-size: 13px; |
| | | } |
| | | |
| | | .pull-right { |
| | | float: right !important; |
| | | } |
| | | |
| | | .el-card__header { |
| | | padding: 14px 15px 7px; |
| | | min-height: 40px; |
| | | } |
| | | |
| | | .el-card__body { |
| | | padding: 15px 20px 20px 20px; |
| | | } |
| | | |
| | | .card-box { |
| | | padding-right: 15px; |
| | | padding-left: 15px; |
| | | margin-bottom: 10px; |
| | | } |
| | | |
| | | /* button color */ |
| | |
| | | |
| | | /* text color */ |
| | | .text-navy { |
| | | color: #1ab394; |
| | | } |
| | | |
| | | .text-primary { |
| | | color: inherit; |
| | | } |
| | | |
| | | .text-success { |
| | | color: #1c84c6; |
| | | } |
| | | |
| | | .text-info { |
| | | color: #23c6c8; |
| | | } |
| | | |
| | | .text-warning { |
| | | color: #f8ac59; |
| | | } |
| | | |
| | | .text-danger { |
| | | color: #ed5565; |
| | | } |
| | | |
| | | .text-muted { |
| | | color: #888888; |
| | | } |
| | | |
| | | /* image */ |
| | | .img-circle { |
| | | border-radius: 50%; |
| | | } |
| | | |
| | | .img-lg { |
| | | width: 120px; |
| | | height: 120px; |
| | | } |
| | | |
| | | .avatar-upload-preview { |
| | | position: absolute; |
| | | top: 50%; |
| | | transform: translate(50%, -50%); |
| | | width: 200px; |
| | | height: 200px; |
| | | border-radius: 50%; |
| | | box-shadow: 0 0 4px #ccc; |
| | | overflow: hidden; |
| | | } |
| | | |
| | | /* 拖拽列样式 */ |
| | | .sortable-ghost { |
| | | opacity: .8; |
| | | color: #fff !important; |
| | | background: #42b983 !important; |
| | | } |
| | | |
| | | .top-right-btn { |
| | | position: relative; |
| | | float: right; |
| | | } |
| | |
| | | } |
| | | return routes; |
| | | }, |
| | | ishttp(url) { |
| | | return url.indexOf('http://') !== -1 || url.indexOf('https://') !== -1 |
| | | } |
| | | }, |
| | |
| | | * 分布式日志 TLog 支持跟踪链路日志记录、性能分析、链路排查<br/> |
| | | * 分布式任务调度 Xxl-Job 高性能 高可靠 易扩展<br/> |
| | | * 文件存储 Minio 本地存储<br/> |
| | | * 文件存储 七牛、阿里、腾讯 云存储<br/> |
| | | * 监控框架 SpringBoot-Admin 全方位服务监控<br/> |
| | | * 校验框架 Validation 增强接口安全性 严谨性<br/> |
| | | * Excel框架 Alibaba EasyExcel 性能优异 扩展性强<br/> |
| | |
| | | v-model="queryParams.ipaddr" |
| | | placeholder="请输入登录地址" |
| | | clearable |
| | | size="small" |
| | | size="small" |
| | | style="width: 240px;" |
| | | @keyup.enter.native="handleQuery" |
| | | /> |
| | |
| | | v-model="queryParams.userName" |
| | | placeholder="请输入用户名称" |
| | | clearable |
| | | size="small" |
| | | size="small" |
| | | style="width: 240px;" |
| | | @keyup.enter.native="handleQuery" |
| | | /> |
| | |
| | | @echo off |
| | | |
| | | rem jar平级目录 |
| | | set AppName=ruoyi-admin.jar |
| | | |
| | | rem JVM参数 |
| | | set JVM_OPTS="-Dname=%AppName% -Duser.timezone=Asia/Shanghai -Xms512m -Xmx1024m -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=512m -XX:+HeapDumpOnOutOfMemoryError -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:NewRatio=1 -XX:SurvivorRatio=30 -XX:+UseParallelGC -XX:+UseParallelOldGC" |
| | | |
| | | |
| | | ECHO. |
| | | ECHO. [1] 启动%AppName% |
| | | ECHO. [2] 关闭%AppName% |
| | | ECHO. [3] 重启%AppName% |
| | | ECHO. [4] 启动状态 %AppName% |
| | | ECHO. [5] 退 出 |
| | | ECHO. |
| | | |
| | | ECHO.请输入选择项目的序号: |
| | | set /p ID= |
| | | IF "%id%"=="1" GOTO start |
| | | IF "%id%"=="2" GOTO stop |
| | | IF "%id%"=="3" GOTO restart |
| | | IF "%id%"=="4" GOTO status |
| | | IF "%id%"=="5" EXIT |
| | | IF "%id%"=="1" GOTO start |
| | | IF "%id%"=="2" GOTO stop |
| | | IF "%id%"=="3" GOTO restart |
| | | IF "%id%"=="4" GOTO status |
| | | IF "%id%"=="5" EXIT |
| | | PAUSE |
| | | :start |
| | | for /f "usebackq tokens=1-2" %%a in (`jps -l ^| findstr %AppName%`) do ( |
| | | set pid=%%a |
| | | set image_name=%%b |
| | | ) |
| | | if defined pid ( |
| | | echo %image_name% is running |
| | | PAUSE |
| | | ) |
| | | |
| | | start javaw %JVM_OPTS% -jar %AppName% |
| | | |
| | | echo starting…… |
| | | echo Start %AppName% success... |
| | | goto:eof |
| | | |
| | | rem 函数stop通过jps命令查找pid并结束进程 |
| | | :stop |
| | | for /f "usebackq tokens=1-2" %%a in (`jps -l ^| findstr %AppName%`) do ( |
| | | set pid=%%a |
| | | set image_name=%%b |
| | | ) |
| | | if not defined pid (echo process %AppName% does not exists) else ( |
| | | echo prepare to kill %image_name% |
| | | echo start kill %pid% ... |
| | | rem 根据进程ID,kill进程 |
| | | taskkill /f /pid %pid% |
| | | ) |
| | | for /f "usebackq tokens=1-2" %%a in (`jps -l ^| findstr %AppName%`) do ( |
| | | set pid=%%a |
| | | set image_name=%%b |
| | | ) |
| | | if not defined pid (echo process %AppName% does not exists) else ( |
| | | echo prepare to kill %image_name% |
| | | echo start kill %pid% ... |
| | | rem ���ݽ���ID��kill���� |
| | | taskkill /f /pid %pid% |
| | | ) |
| | | goto:eof |
| | | :restart |
| | | call :stop |
| | | call :start |
| | | goto:eof |
| | | :status |
| | | for /f "usebackq tokens=1-2" %%a in (`jps -l ^| findstr %AppName%`) do ( |
| | | set pid=%%a |
| | | set image_name=%%b |
| | | ) |
| | | if not defined pid (echo process %AppName% is dead ) else ( |
| | | echo %image_name% is running |
| | | ) |
| | | for /f "usebackq tokens=1-2" %%a in (`jps -l ^| findstr %AppName%`) do ( |
| | | set pid=%%a |
| | | set image_name=%%b |
| | | ) |
| | | if not defined pid (echo process %AppName% is dead ) else ( |
| | | echo %image_name% is running |
| | | ) |
| | | goto:eof |
| | |
| | | { |
| | | PID=`ps -ef |grep java|grep $AppName|grep -v grep|awk '{print $2}'` |
| | | |
| | | if [ x"$PID" != x"" ]; then |
| | | echo "$AppName is running..." |
| | | else |
| | | nohup java $JVM_OPTS -jar $AppName > /dev/null 2>&1 & |
| | | echo "Start $AppName success..." |
| | | fi |
| | | if [ x"$PID" != x"" ]; then |
| | | echo "$AppName is running..." |
| | | else |
| | | nohup java $JVM_OPTS -jar $AppName > /dev/null 2>&1 & |
| | | echo "Start $AppName success..." |
| | | fi |
| | | } |
| | | |
| | | function stop() |
| | | { |
| | | echo "Stop $AppName" |
| | | |
| | | PID="" |
| | | query(){ |
| | | PID=`ps -ef |grep java|grep $AppName|grep -v grep|awk '{print $2}'` |
| | | } |
| | | PID="" |
| | | query(){ |
| | | PID=`ps -ef |grep java|grep $AppName|grep -v grep|awk '{print $2}'` |
| | | } |
| | | |
| | | query |
| | | if [ x"$PID" != x"" ]; then |
| | | kill -TERM $PID |
| | | echo "$AppName (pid:$PID) exiting..." |
| | | while [ x"$PID" != x"" ] |
| | | do |
| | | sleep 1 |
| | | query |
| | | done |
| | | echo "$AppName exited." |
| | | else |
| | | echo "$AppName already stopped." |
| | | fi |
| | | } |
| | | |
| | | function restart() |
| | |
| | | |
| | | #使用说明,用来提示输入参数 |
| | | usage() { |
| | | echo "Usage: sh 执行脚本.sh [port|mount|monitor|base|start|stop|stopall|rm|rmiNoneTag]" |
| | | exit 1 |
| | | echo "Usage: sh 执行脚本.sh [port|mount|monitor|base|start|stop|stopall|rm|rmiNoneTag]" |
| | | exit 1 |
| | | } |
| | | |
| | | #开启所需端口(生产环境不推荐开启) |
| | | port(){ |
| | | # mysql 端口 |
| | | firewall-cmd --add-port=3306/tcp --permanent |
| | | # redis 端口 |
| | | firewall-cmd --add-port=6379/tcp --permanent |
| | | # minio api 端口 |
| | | firewall-cmd --add-port=9000/tcp --permanent |
| | | # minio 控制台端口 |
| | | firewall-cmd --add-port=9001/tcp --permanent |
| | | # 监控中心端口 |
| | | firewall-cmd --add-port=9090/tcp --permanent |
| | | # 任务调度中心端口 |
| | | firewall-cmd --add-port=9100/tcp --permanent |
| | | # 重启防火墙 |
| | | service firewalld restart |
| | | } |
| | | |
| | | ##放置挂载文件 |
| | | mount(){ |
| | | #挂载 nginx 配置文件 |
| | | if test ! -f "/docker/nginx/conf/nginx.conf" ;then |
| | | mkdir -p /docker/nginx/conf |
| | | cp nginx/nginx.conf /docker/nginx/conf/nginx.conf |
| | | fi |
| | | #挂载 redis 配置文件 |
| | | if test ! -f "/docker/redis/conf/redis.conf" ;then |
| | | mkdir -p /docker/redis/conf |
| | | cp redis/redis.conf /docker/redis/conf/redis.conf |
| | | fi |
| | | } |
| | | |
| | | #启动基础模块 |
| | | base(){ |
| | | docker-compose up -d mysql nginx-web redis minio |
| | | } |
| | | |
| | | #启动监控模块 |
| | | monitor(){ |
| | | docker-compose up -d ruoyi-monitor-admin |
| | | } |
| | | |
| | | #启动程序模块 |
| | | start(){ |
| | | docker-compose up -d ruoyi-xxl-job-admin ruoyi-server1 ruoyi-server2 |
| | | } |
| | | |
| | | #停止程序模块 |
| | | stop(){ |
| | | docker-compose stop ruoyi-xxl-job-admin ruoyi-server1 ruoyi-server2 |
| | | } |
| | | |
| | | #关闭所有模块 |
| | | stopall(){ |
| | | docker-compose stop |
| | | } |
| | | |
| | | #删除所有模块 |
| | | rm(){ |
| | | docker-compose rm |
| | | } |
| | | |
| | | #删除Tag为空的镜像 |
| | | rmiNoneTag(){ |
| | | docker images|grep none|awk '{print $3}'|xargs docker rmi -f |
| | | } |
| | | |
| | | #根据输入参数,选择执行对应方法,不输入则执行使用说明 |
| | | case "$1" in |
| | | "port") |
| | | port |
| | | ;; |
| | | "mount") |
| | | mount |
| | | ;; |
| | | "base") |
| | | base |
| | | ;; |
| | | "monitor") |
| | | monitor |
| | | ;; |
| | | "start") |
| | | start |
| | | ;; |
| | | "stop") |
| | | stop |
| | | ;; |
| | | "stopall") |
| | | stopall |
| | | ;; |
| | | "rm") |
| | | rm |
| | | ;; |
| | | "rmiNoneTag") |
| | | rmiNoneTag |
| | | ;; |
| | | *) |
| | | usage |
| | | ;; |
| | | esac |
| | |
| | | |
| | | access_log /var/log/nginx/access.log main; |
| | | |
| | | upstream server { |
| | | ip_hash; |
| | | server 172.30.0.60:8080; |
| | | server 172.30.0.61:8080; |
| | | } |
| | | |
| | | upstream monitor-admin { |
| | | server 172.30.0.90:9090; |
| | |
| | | # return 200 '{"msg":"演示模式,不允许操作","code":500}'; |
| | | # } |
| | | |
| | | location / { |
| | | root /usr/share/nginx/html; |
| | | try_files $uri $uri/ /index.html; |
| | | index index.html index.htm; |
| | | } |
| | | |
| | | location /prod-api/ { |
| | | proxy_set_header Host $http_host; |
| | | proxy_set_header X-Real-IP $remote_addr; |
| | | proxy_set_header REMOTE-HOST $remote_addr; |
| | | proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; |
| | | proxy_pass http://server/; |
| | | } |
| | | |
| | | # https 会拦截内链所有的 http 请求 造成功能无法使用 |
| | | # 解决方案1 将 admin 服务 也配置成 https |
| | | # 解决方案2 将菜单配置为外链访问 走独立页面 http 访问 |
| | | location /admin/ { |
| | | proxy_set_header Host $http_host; |
| | | proxy_set_header X-Real-IP $remote_addr; |
| | | proxy_set_header REMOTE-HOST $remote_addr; |
| | | proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; |
| | | proxy_pass http://monitor-admin/admin/; |
| | | } |
| | | |
| | | # https 会拦截内链所有的 http 请求 造成功能无法使用 |
| | | # 解决方案1 将 xxljob 服务 也配置成 https |
| | |
| | | status char(1) default '0' comment '部门状态(0正常 1停用)', |
| | | del_flag char(1) default '0' comment '删除标志(0代表存在 2代表删除)', |
| | | create_by varchar(64) default '' comment '创建者', |
| | | create_time datetime comment '创建时间', |
| | | update_by varchar(64) default '' comment '更新者', |
| | | update_time datetime comment '更新时间', |
| | | primary key (dept_id) |
| | |
| | | status char(1) not null comment '状态(0正常 1停用)', |
| | | create_by varchar(64) default '' comment '创建者', |
| | | create_time datetime comment '创建时间', |
| | | update_by varchar(64) default '' comment '更新者', |
| | | update_time datetime comment '更新时间', |
| | | remark varchar(500) default null comment '备注', |
| | | primary key (post_id) |
| | |
| | | gen_path varchar(200) default '/' comment '生成路径(不填默认项目路径)', |
| | | options varchar(1000) comment '其它生成选项', |
| | | create_by varchar(64) default '' comment '创建者', |
| | | create_time datetime comment '创建时间', |
| | | update_by varchar(64) default '' comment '更新者', |
| | | update_time datetime comment '更新时间', |
| | | remark varchar(500) default null comment '备注', |
| | |
| | | dict_type varchar(200) default '' comment '字典类型', |
| | | sort int comment '排序', |
| | | create_by varchar(64) default '' comment '创建者', |
| | | create_time datetime comment '创建时间', |
| | | update_by varchar(64) default '' comment '更新者', |
| | | update_time datetime comment '更新时间', |
| | | primary key (column_id) |