
Commit b7d6711: optimize code (parent 8e8b64c)
File tree: 6 files changed (+257, -39 lines)


README.md

Lines changed: 73 additions & 37 deletions
@@ -8,40 +8,85 @@ Slightly Framework design to support Spring based java or Bigdata program.
 
 1.Introduction
 
-I.This project is based on Spring Framework and has four modules:
-|---------------------------------------------------------------------------------------------|
-| Module     | Description                                                                      |
-|---------------------------------------------------------------------------------------------|
-| Core       | the core classes: data access layer basic classes (model, dao, service), etc.   |
-|---------------------------------------------------------------------------------------------|
-| Comm       | FileSystem access tool (local/vfs); file formats csv/xml/json/avro/parquet/protobuf; |
-|            | compress formats gzip/bzip2/snappy/lzo/zip/lzma/lz4;                             |
-|            | read and write excel, read word or PowerPoint                                    |
-|---------------------------------------------------------------------------------------------|
-| Hadooptool | FileSystem access tool (hdfs); comm tool to access HDFS, Hbase, Hive, MongoDB, etc. |
-|---------------------------------------------------------------------------------------------|
-| Example    | springmvc config based and spring boot based Example                             |
-|---------------------------------------------------------------------------------------------|
-| Web        | struts1, struts2 and springmvc web components and required classes               |
-|---------------------------------------------------------------------------------------------|
-| Webui      | Spring Boot with Oauth2 Thymeleaf Example                                         |
-|---------------------------------------------------------------------------------------------|
-| Estool     | ElasticSearch Comm Query tool                                                     |
-|---------------------------------------------------------------------------------------------|
-| Tracer     | Zipkin Brave tracing; can trace all databases and record parameters              |
-|---------------------------------------------------------------------------------------------|
-
-II. Special feature
+I.This project is based on Spring Framework and has the modules below:
+|---------------------------------------------------------------------------------------------|
+| Module     | Description                                                                      |
+|---------------------------------------------------------------------------------------------|
+| Core       | the core classes: data access layer basic classes (model, dao, service), etc.   |
+|---------------------------------------------------------------------------------------------|
+| Comm       | FileSystem access tool (local/vfs); file formats csv/xml/json/avro/parquet/protobuf; |
+|            | compress formats gzip/bzip2/snappy/lzo/zip/lzma/lz4;                             |
+|            | read and write excel, read word or PowerPoint                                    |
+|---------------------------------------------------------------------------------------------|
+| Hadooptool | FileSystem access tool (hdfs); comm tool to access HDFS, Hbase, Hive, MongoDB, etc. |
+|---------------------------------------------------------------------------------------------|
+| Example    | springmvc config based and spring boot based Simple framework Example            |
+|---------------------------------------------------------------------------------------------|
+| Web        | struts1, struts2 and springmvc web components and required classes               |
+|---------------------------------------------------------------------------------------------|
+| Webui      | Spring Boot with Oauth2 Thymeleaf Example                                         |
+|---------------------------------------------------------------------------------------------|
+| Estool     | ElasticSearch Comm Query tool                                                     |
+|---------------------------------------------------------------------------------------------|
+| Tracer     | Zipkin Brave tracing; can trace all databases and record parameters              |
+|---------------------------------------------------------------------------------------------|
+
+
+It is available under the terms of either the Apache Software License 2.0 or the Eclipse Public License 1.0.
+
+2.Support Features
+I. Construct Simple Java FrameWork
+   includes sysrole usage and the relations between customers, privileges and roles;
+   1.xml based standard frame: see example/config-example
+   2.spring based standard frame: see example/boot-example
+
+II. Bigdata support
+   hadooptool:
+   HDFS tool: com.robin.hadoop.hdfs can access HDFS with kerberos security
+   Hbase tool: com.robin.hadoop.hbase hbase tool
+   Cassandra tool: CassandraUtils
+
+III. BigData common file format read/write tools support (including compress type support)
+   AVRO
+   PARQUET
+   ORC
+   PROTOBUF
+   CSV
+   XML
+   JSON
+   ARFF (weka format)
+
+IV. File storage and Cloud storage support
+   LocalFileSystem
+   Apache VFS
+   HDFS
+   Amazon S3
+   Aliyun OSS
+   Tencent COS
+   Apache Kafka
+   RabbitMQ
+
+V. Iterable and writable support integrating storage and file format
+   mix storage and file format to support cross-storage read/write
+
+VI. Spring cloud support
+   WebUI: a simple webui based on dhtmlxGrid 5.1 with spring boot native
+   related project: microservices (another of my projects)
+
+VII. Zipkin Integration
+   the tracer sub project aims to make all databases traceable and to record query parameters.
+
+VIII. Special feature
 a.A user defined xml Query config system, similar to mybatis, but easier to configure.
 b.Support user-defined annotations or jpa annotations in JdbcDao with ORM.
 c.BaseAnnotationService can access DB with minimal code, and use transactions with annotations.
 d.A common db access meta and util, can access all kinds of db.
 e.Spring cloud based WebUI
 f.support Hadoop platform
-
-It is available under the terms of either the Apache Software License 2.0 or the Eclipse Public License 1.0.
-
-2.Development
+g. Excel read/write utils support auto-merging columns and custom header definitions.
+
+
+3.Development
 
 I.Model Layer: Simple ORM tool, supports JAVA JPA or my BaseObject Annotation
 Demonstration: (supports Composite primary key)
@@ -221,14 +266,5 @@ Slightly Framework design to support Spring based java or Bigdata program.
 
 The above features aim to simplify the work of developing standard MVC java code.
 
-II. Bigdata support
-   hadooptool:
-   HDFS tool: com.robin.hadoop.hdfs can access HDFS with kerberos security
-   Hbase tool: com.robin.hadoop.hbase hbase tool
-   Cassandra tool: CassandraUtils
-
-III. Spring cloud support
-   WebUI: a simple webui based on dhtmlxGrid 5.1 with spring boot native
-   related project: microservices (another of my projects)
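The Development section above references a model-layer demonstration with composite primary key support that falls outside this hunk. As a rough, hedged sketch only: the entity and field names below are hypothetical, and standard JPA annotations are used rather than the project's own BaseObject annotations, so treat this as an illustration of the idea, not the project's actual example.

    import javax.persistence.Column;
    import javax.persistence.Embeddable;
    import javax.persistence.EmbeddedId;
    import javax.persistence.Entity;
    import javax.persistence.Table;
    import java.io.Serializable;

    @Embeddable
    class OrderItemId implements Serializable {
        private Long orderId;      // first part of the composite key
        private Long productId;    // second part of the composite key
        // a real embeddable id should also override equals() and hashCode()
    }

    @Entity
    @Table(name = "t_order_item")
    class OrderItem {
        @EmbeddedId
        private OrderItemId id;    // composite primary key
        @Column(name = "quantity")
        private Integer quantity;
    }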

core/src/main/java/com/robin/core/base/util/ResourceConst.java

Lines changed: 14 additions & 0 deletions
             return value;
         }
     }
+    public enum BOSPARAM{
+        ENDPOIN("endpoint"),
+        REGION("region"),
+        ACESSSKEYID("accessKeyId"),
+        SECURITYACCESSKEY("securityAccessKey");
+        private String value;
+        BOSPARAM(String value){
+            this.value=value;
+        }
+
+        public String getValue() {
+            return value;
+        }
+    }
     public enum OSSPARAM{
         ENDPOIN("endpoint"),
         REGION("region"),
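The new BOSPARAM keys mirror the existing OSSPARAM pattern and are exactly what the new BOSFileSystemAccessor.init(DataCollectionMeta) looks up in meta.getResourceCfgMap(). Below is a minimal sketch of assembling such a config map; the values are placeholders, and how the map is attached to a DataCollectionMeta is not shown in this commit, so that wiring is left as an assumption.

    import com.robin.core.base.util.ResourceConst;
    import java.util.HashMap;
    import java.util.Map;

    public class BosConfigSketch {
        // Builds the map that BOSFileSystemAccessor.init(meta) reads from meta.getResourceCfgMap().
        static Map<String, Object> bosConfig() {
            Map<String, Object> cfg = new HashMap<>();
            cfg.put(ResourceConst.BOSPARAM.ENDPOIN.getValue(), "https://bj.bcebos.com");          // placeholder endpoint
            cfg.put(ResourceConst.BOSPARAM.ACESSSKEYID.getValue(), "<accessKeyId>");              // placeholder credential
            cfg.put(ResourceConst.BOSPARAM.SECURITYACCESSKEY.getValue(), "<securityAccessKey>");  // placeholder credential
            cfg.put(ResourceConst.BUCKETNAME, "<bucket>");   // fallback bucket read by getBucketName()
            return cfg;
        }
    }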

hadooptool/pom.xml

Lines changed: 11 additions & 0 deletions
             <version>[7.16.0, 7.16.99]</version>
             <optional>true</optional>
         </dependency>
+        <dependency>
+            <groupId>com.baidubce</groupId>
+            <artifactId>bce-java-sdk</artifactId>
+            <version>0.10.353</version>
+            <exclusions>
+                <exclusion>
+                    <groupId>org.eclipse.paho</groupId>
+                    <artifactId>org.eclipse.paho.client.mqttv3</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
         <dependency>
             <groupId>io.minio</groupId>
             <artifactId>minio</artifactId>
hadooptool/src/main/java/com/robin/comm/fileaccess/fs/BOSFileSystemAccessor.java

Lines changed: 159 additions & 0 deletions
@@ -0,0 +1,159 @@
package com.robin.comm.fileaccess.fs;

import com.baidubce.auth.DefaultBceCredentials;
import com.baidubce.services.bos.BosClient;
import com.baidubce.services.bos.BosClientConfiguration;
import com.baidubce.services.bos.model.BosObject;
import com.robin.core.base.exception.MissingConfigException;
import com.robin.core.base.util.ResourceConst;
import com.robin.core.fileaccess.fs.AbstractFileSystemAccessor;
import com.robin.core.fileaccess.meta.DataCollectionMeta;
import lombok.Getter;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.tuple.Pair;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;

import java.io.*;

@Slf4j
@Getter
public class BOSFileSystemAccessor extends AbstractFileSystemAccessor {
    private String endpoint;
    private String accessKeyId;
    private String securityAccessKey;
    private String bucketName;
    private BosClient client;

    @Override
    public void init(DataCollectionMeta meta) {
        Assert.isTrue(!CollectionUtils.isEmpty(meta.getResourceCfgMap()), "config map is empty!");
        Assert.notNull(meta.getResourceCfgMap().get(ResourceConst.BOSPARAM.ENDPOIN.getValue()), "must provide endpoint");
        Assert.notNull(meta.getResourceCfgMap().get(ResourceConst.BOSPARAM.ACESSSKEYID.getValue()), "must provide accessKey");
        Assert.notNull(meta.getResourceCfgMap().get(ResourceConst.BOSPARAM.SECURITYACCESSKEY.getValue()), "must provide securityAccessKey");

        endpoint = meta.getResourceCfgMap().get(ResourceConst.BOSPARAM.ENDPOIN.getValue()).toString();
        accessKeyId = meta.getResourceCfgMap().get(ResourceConst.BOSPARAM.ACESSSKEYID.getValue()).toString();
        securityAccessKey = meta.getResourceCfgMap().get(ResourceConst.BOSPARAM.SECURITYACCESSKEY.getValue()).toString();
        BosClientConfiguration config = new BosClientConfiguration();
        config.setCredentials(new DefaultBceCredentials(accessKeyId, securityAccessKey));
        config.setEndpoint(endpoint);
        client = new BosClient(config);
    }

    public void init() {
        Assert.notNull(endpoint, "must provide endpoint");
        Assert.notNull(accessKeyId, "must provide accessKey");
        Assert.notNull(securityAccessKey, "must provide securityAccessKey");
        BosClientConfiguration config = new BosClientConfiguration();
        config.setCredentials(new DefaultBceCredentials(accessKeyId, securityAccessKey));
        config.setEndpoint(endpoint);
        client = new BosClient(config);
    }

    @Override
    public Pair<BufferedReader, InputStream> getInResourceByReader(DataCollectionMeta meta, String resourcePath) throws IOException {
        InputStream inputStream = getInputStreamByConfig(meta);
        return Pair.of(getReaderByPath(resourcePath, inputStream, meta.getEncode()), inputStream);
    }

    @Override
    public Pair<BufferedWriter, OutputStream> getOutResourceByWriter(DataCollectionMeta meta, String resourcePath) throws IOException {
        OutputStream outputStream = getOutputStream(meta);
        return Pair.of(getWriterByPath(meta.getPath(), outputStream, meta.getEncode()), outputStream);
    }

    @Override
    public OutputStream getRawOutputStream(DataCollectionMeta meta, String resourcePath) throws IOException {
        return getOutputStream(meta);
    }

    @Override
    public OutputStream getOutResourceByStream(DataCollectionMeta meta, String resourcePath) throws IOException {
        return getOutputStreamByPath(resourcePath, getOutputStream(meta));
    }

    @Override
    public InputStream getInResourceByStream(DataCollectionMeta meta, String resourcePath) throws IOException {
        InputStream inputStream = getInputStreamByConfig(meta);
        return getInputStreamByPath(resourcePath, inputStream);
    }

    @Override
    public InputStream getRawInputStream(DataCollectionMeta meta, String resourcePath) throws IOException {
        return getInputStreamByConfig(meta);
    }

    @Override
    public boolean exists(DataCollectionMeta meta, String resourcePath) throws IOException {
        String bucketName = getBucketName(meta);
        return client.doesObjectExist(bucketName, resourcePath);
    }

    @Override
    public long getInputStreamSize(DataCollectionMeta meta, String resourcePath) throws IOException {
        String bucketName = getBucketName(meta);
        if (exists(meta, resourcePath)) {
            BosObject object = client.getObject(bucketName, resourcePath);
            return object.getObjectMetadata().getContentLength();
        }
        return 0;
    }

    private InputStream getInputStreamByConfig(DataCollectionMeta meta) {
        String bucketName = getBucketName(meta);
        String objectName = meta.getPath();
        return getObject(bucketName, objectName);
    }

    private String getBucketName(DataCollectionMeta meta) {
        return !ObjectUtils.isEmpty(bucketName) ? bucketName : meta.getResourceCfgMap().get(ResourceConst.BUCKETNAME).toString();
    }

    private InputStream getObject(String bucketName, String objectName) {
        if (client.doesObjectExist(bucketName, objectName)) {
            BosObject object = client.getObject(bucketName, objectName);
            if (!ObjectUtils.isEmpty(object)) {
                return object.getObjectContent();
            } else {
                throw new RuntimeException("objectName " + objectName + " can not be fetched!");
            }
        } else {
            throw new MissingConfigException(" key " + objectName + " not in BOS bucket " + bucketName);
        }
    }

    public static class Builder {
        private BOSFileSystemAccessor accessor;

        public Builder() {
            accessor = new BOSFileSystemAccessor();
        }

        public static Builder builder() {
            return new Builder();
        }

        public Builder accessKeyId(String accessKeyId) {
            accessor.accessKeyId = accessKeyId;
            return this;
        }

        public Builder endpoint(String endPoint) {
            accessor.endpoint = endPoint;
            return this;
        }

        public Builder securityAccessKey(String securityAccessKey) {
            accessor.securityAccessKey = securityAccessKey;
            return this;
        }

        public Builder withMetaConfig(DataCollectionMeta meta) {
            accessor.init(meta);
            return this;
        }

        public Builder bucket(String bucketName) {
            accessor.bucketName = bucketName;
            return this;
        }

        public BOSFileSystemAccessor build() {
            if (ObjectUtils.isEmpty(accessor.getClient())) {
                accessor.init();
            }
            return accessor;
        }
    }
}
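For orientation, a minimal usage sketch of the Builder above: the endpoint, credentials, and object key are placeholders, and the DataCollectionMeta is assumed to be prepared elsewhere with its path and resource config map as the accessor expects.

    import com.robin.comm.fileaccess.fs.BOSFileSystemAccessor;
    import com.robin.core.fileaccess.meta.DataCollectionMeta;
    import java.io.InputStream;

    public class BosAccessorUsageSketch {
        static void readObject(DataCollectionMeta meta) throws Exception {
            // build() falls back to init() when no client was created via withMetaConfig().
            BOSFileSystemAccessor accessor = BOSFileSystemAccessor.Builder.builder()
                    .endpoint("https://bj.bcebos.com")         // placeholder endpoint
                    .accessKeyId("<accessKeyId>")              // placeholder credential
                    .securityAccessKey("<securityAccessKey>")  // placeholder credential
                    .bucket("<bucket>")
                    .build();

            String key = "path/to/object.csv";                 // placeholder object key
            if (accessor.exists(meta, key)) {
                try (InputStream in = accessor.getInResourceByStream(meta, key)) {
                    // consume the stream ...
                }
            }
        }
    }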

hadooptool/src/main/java/com/robin/comm/fileaccess/fs/OSSFileSystemAccessor.java

Lines changed: 0 additions & 1 deletion
     }
 
     private InputStream getInputStreamByConfig(DataCollectionMeta meta) {
-
         String bucketName= getBucketName(meta);
         String objectName= meta.getPath();
         return getObject(bucketName,objectName);

hadooptool/src/main/java/com/robin/comm/fileaccess/fs/S3FileSystemAccessor.java

Lines changed: 0 additions & 1 deletion
         }
     }
     public void init(){
-        Assert.notNull(region,"region name required!");
         Assert.notNull(accessKey,"accessKey name required!");
         Assert.notNull(secret,"secret name required!");
