An open data platform based on Apache Flink. Scaleph currently supports data integration with SeaTunnel on Flink.

Overview

scaleph


The Scaleph project features data integration, data development, job scheduling and orchestration, and aims to provide a one-stop data platform for developers and enterprises. Scaleph hopes to help people aggregate and analyze data, unlock its value, and profit from it.

Scaleph is driven by personal interest and evolves actively thanks to dedicated developers. The flowerfine organization is open and appreciates any help.

Features

  • Web UI with drag-and-drop data integration, backed by out-of-the-box connectors.
  • Flink job execution across multiple Flink versions, deployment modes, and resource providers; we developed flinkful to solve these problems.
  • Job version management.
  • Project-level configuration, dependency, and resource management.

Quick Start

To explore the Scaleph system, you need a running Scaleph application, which you can then interact with through Scaleph Admin.

Luckily, deploying Scaleph locally takes just three steps:

  • Make sure Docker is installed on your machine.
  • Clone the repository.
  • Use Docker Compose and the Scaleph Docker images to quickly install and run Scaleph.
git clone https://github.com/flowerfine/scaleph.git
cd scaleph/tools/docker/deploy
docker-compose up

Once all containers have started, the UI is ready to go at http://localhost!

Documentation

Coming soon...

Please refer to the wiki.

Build and Deployment

  • develop. This doc describes how to set up a local development environment for the Scaleph project.
  • checkstyle. The Scaleph project requires clean and robust code, which helps Scaleph go further and develop better.
  • build. This doc describes how to build the Scaleph project from source. Scaleph adopts Maven as its build system; for more information about building from source and deployment, please refer to build.
  • docker. As more applications run in containers on the cloud rather than on bare-metal machines, Scaleph provides its own images.
  • deployment. For different deployment purposes such as development, testing, or production, Scaleph makes its best effort to let people deploy the project locally, on Docker, and on Kubernetes.

RoadMap

features

  1. data ingress and egress.
    1. data integration in the Flink way. Scaleph features seatunnel, flink-cdc-connectors and other flink connectors.
    2. a web-ui friendly to newbies.
  2. data development
    1. udf + sql.
    2. support for multi-layer data warehouse development.
  3. job scheduling and orchestration

architectures

  1. cloud native
    1. container and kubernetes development and runtime environment.
      1. flink operator
      2. seatunnel operator
      3. scaleph operator
    2. java 17, quarkus.
  2. plugins. https://dubbo.apache.org/zh/docsv2.7/dev/principals/
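The plugin architecture referenced above (in the spirit of Dubbo's extension/SPI mechanism) can be sketched as a small connector-plugin registry. All names below (ConnectorPlugin, PluginRegistry, FakeSourcePlugin) are illustrative assumptions, not Scaleph's actual API:

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of an extensible connector-plugin registry.
// These names are assumptions for the example, not Scaleph's real API.
interface ConnectorPlugin {
    String name();                                 // unique plugin identifier, e.g. "FakeSource"
    String configure(Map<String, String> props);   // render the plugin's config (simplified)
}

final class PluginRegistry {
    private static final Map<String, ConnectorPlugin> PLUGINS = new ConcurrentHashMap<>();

    static void register(ConnectorPlugin plugin) {
        PLUGINS.put(plugin.name(), plugin);
    }

    static Optional<ConnectorPlugin> lookup(String name) {
        return Optional.ofNullable(PLUGINS.get(name));
    }
}

// One example plugin implementation.
final class FakeSourcePlugin implements ConnectorPlugin {
    public String name() { return "FakeSource"; }
    public String configure(Map<String, String> props) {
        return name() + " { result_table_name = \""
                + props.getOrDefault("result_table_name", "fake") + "\" }";
    }
}

public class PluginDemo {
    public static void main(String[] args) {
        PluginRegistry.register(new FakeSourcePlugin());
        ConnectorPlugin p = PluginRegistry.lookup("FakeSource").orElseThrow();
        // prints: FakeSource { result_table_name = "users" }
        System.out.println(p.configure(Map.of("result_table_name", "users")));
    }
}
```

In practice, registration would typically happen through a discovery mechanism such as java.util.ServiceLoader (META-INF/services entries) rather than explicit register() calls, which is the pattern the linked Dubbo SPI document describes.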

Contributing

For contributions, please refer to CONTRIBUTING.

Contact

License

Scaleph is licensed under the Apache License, Version 2.0; the link is here.

Comments
  • [Feature][scaleph-ui-react] more seatunnel connector support in new dag version


    Already searched before asking?

    • [X] I had searched in the feature and found no similar feature requirement.

    Usage Scenario

    more seatunnel connector support in new dag version

    • seatunnel connector v2 https://github.com/apache/incubator-seatunnel/tree/dev/docs/en/connector-v2
    • how to integrate a connector in scaleph

    Description

    source

    • [x] [Feature] [scaleph-plugins] Fake source #269
    • [x] #281
    • [x] [Feature] [scaleph-plugins] FtpFile source #275
    • [x] #282
    • [x] [Feature] [scaleph-plugins] HdfsFile source #271
    • [x] #283
    • [x] #284
    • [x] [Feature] [scaleph-plugins] Hudi source #262
    • [x] [Feature] [scaleph-plugins] Iceberg source #264
    • [x] #285
    • [x] [Feature] [scaleph-plugins] jdbc source
    • [x] #286
    • [x] [Feature] [scaleph-plugins] LocalFile source #266
    • [x] #287
    • [x] #288
    • [ ] #289
    • [x] #290
    • [x] #291
    • [x] #292

    sink

    • [ ] #293
    • [x] #294
    • [ ] #295
    • [x] #277
    • [x] #296
    • [x] #297
    • [x] #298
    • [x] #299
    • [x] #300
    • [x] [Feature] [scaleph-plugins] FtpFile sink #275
    • [x] #301
    • [x] [Feature] [scaleph-plugins] HdfsFile sink #275
    • [x] #302
    • [x] #303
    • [x] #304
    • [x] [Feature] [scaleph-plugins] Jdbc sink
    • [x] #305
    • [x] [Feature] [scaleph-plugins] LocalFile sink #266
    • [x] #306
    • [x] #307
    • [x] #308
    • [ ] #309
    • [x] #310
    • [x] #280
    • [x] #278
    • [x] #279

    Are you willing to submit a PR?

    • [X] Yes, I am willing to submit a PR!

    Code of Conduct

    feature 
    opened by gleiyu 16
  • [Feature][scaleph-ui-react] flink job management


    Purpose of this pull request

    1. remove version from flink_job table
    2. add projectid to flink job and clusters

    Brief change log

    (for example:)

    • Add datasource plugin and jdbc datasource plugin implementations
    • scaleph-ui datasource menu loads available datasource plugins automatically
    • seatunnel connector-jdbc job retrieves datasource from datasource plugin

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    opened by gleiyu 15
  • [Feature][scaleph-engine-seatunnel] support fake source v2 connector


    Purpose of this pull request

    support fake source v2 connector https://github.com/flowerfine/scaleph/issues/268

    Brief change log

    • support fake source source v2 connector

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    opened by lizu18xz 9
  • Feature react


    Purpose of this pull request

    Brief change log

    (for example:)

    • Add datasource plugin and jdbc datasource plugin implementations
    • scaleph-ui datasource menu loads available datasource plugins automatically
    • seatunnel connector-jdbc job retrieves datasource from datasource plugin

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    feature 
    opened by Dreamcreative 8
  • [Feature][scaleph-di] adapt to seatunnel connector v2 and xflow


    Purpose of this pull request

    1. remove DiJobStepAttrType
    2. job attributes setting

    Brief change log

    (for example:)

    • Add datasource plugin and jdbc datasource plugin implementations
    • scaleph-ui datasource menu loads available datasource plugins automatically
    • seatunnel connector-jdbc job retrieves datasource from datasource plugin

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    feature 
    opened by gleiyu 7
  • [Bug][scaleph-ui-react] function hasPrivilege json parse error


    When a user logs in for the first time, the hasPrivilege function in auth.ts throws a JSON parse error (#219).

    Purpose of this pull request

    Brief change log

    • fix auth.ts function hasPrivilege json parse error

    Check list

    • [x] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    bug 
    opened by hx23840 7
  • [Feature] refactor generate seatunnel config file with SeatunnelNativeFlinkPlugin


    Purpose of this pull request

    refactor generate seatunnel config file with SeatunnelNativeFlinkPlugin
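For context, a generated SeaTunnel job config of the kind such a plugin emits typically looks like the following minimal sketch. The connector names and options follow SeaTunnel's documented FakeSource/Console connectors; the exact output of Scaleph's generator may differ:

```hocon
env {
  execution.parallelism = 1
}

source {
  FakeSource {
    result_table_name = "fake"
  }
}

sink {
  Console {}
}
```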

    Brief change log

    (for example:)

    • Add datasource plugin and jdbc datasource plugin implementations
    • scaleph-ui datasource menu loads available datasource plugins automatically
    • seatunnel connector-jdbc job retrieves datasource from datasource plugin

    Check list

    • [x] Code changed are covered with tests, or it does not need tests for reason:
    • [x] If necessary, please update the documentation to describe the new feature.
    feature 
    opened by gleiyu 7
  • [Feature] [scaleph-ui-react] init xflow dag


    Purpose of this pull request

    Brief change log

    (for example:)

    • Add datasource plugin and jdbc datasource plugin implementations
    • scaleph-ui datasource menu loads available datasource plugins automatically
    • seatunnel connector-jdbc job retrieves datasource from datasource plugin

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    opened by gleiyu 6
  • [Feature][scaleph-plugin] Add clickhouse datasource and clickhouse-sink plugin


    Purpose of this pull request

    contribute #185

    Brief change log

    • add clickhouse sink connector

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    feature 
    opened by lizu18xz 6
  • [Feature][docker]change docker image build and push to unified Dockerfile


    Purpose of this pull request

    change docker image build and push to unified Dockerfile

    Brief change log

    • change docker image build and push to unified Dockerfile

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    opened by kalencaya 6
  • [Feature] react user login and interceptor handled


    Purpose of this pull request

    1. user login page
    2. token header interceptor
    3. error interceptor

    Brief change log

    (for example:)

    • Add datasource plugin and jdbc datasource plugin implementations
    • scaleph-ui datasource menu loads available datasource plugins automatically
    • seatunnel connector-jdbc job retrieves datasource from datasource plugin

    Check list

    • [ ] Code changed are covered with tests, or it does not need tests for reason:
    • [ ] If necessary, please update the documentation to describe the new feature.
    opened by gleiyu 6
  • [Bug] [Module Name] Data source page inaccessible


    Already searched before asking?

    • [X] I had searched in the issues and found no similar issues.

    Scaleph Version or Branch

    Scaleph:1.0.1-SNAPSHOT SeaTunnel:2.3.0

    What happened

    git clone https://github.com/flowerfine/scaleph.git
    cd scaleph/tools/docker/deploy/scaleph
    docker-compose up
    

    Visiting http://localhost and clicking the data source page shows "Something went wrong." Screenshot below:


    Error Exception

    scaleph-api       | 2023-01-05 14:30:33.251  INFO 7 [http-nio-8080-exec-10] .a.c.WebMvcConfig$AsyncWebLogInterceptor 147  行: [sys_admin] GET /scaleph/api/user/get/9eba6af528494ab88f9b46b1d7bbf31e
    scaleph-ui-react  | 172.19.0.1 - - [05/Jan/2023:06:30:33 +0000] "GET /api/user/get/9eba6af528494ab88f9b46b1d7bbf31e HTTP/1.1" 200 962 "http://localhost/dataSource" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:108.0) Gecko/20100101 Firefox/108.0"
    scaleph-api       | 2023-01-05 14:30:33.443  INFO 7 [http-nio-8080-exec-1] .a.c.WebMvcConfig$AsyncWebLogInterceptor 147  行: [sys_admin] GET /scaleph/api/msg/count
    scaleph-ui-react  | 172.19.0.1 - - [05/Jan/2023:06:30:33 +0000] "GET /api/msg/count HTTP/1.1" 200 1 "http://localhost/dataSource" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:108.0) Gecko/20100101 Firefox/108.0"
    scaleph-api       | 2023-01-05 14:30:33.444  INFO 7 [http-nio-8080-exec-9] .a.c.WebMvcConfig$AsyncWebLogInterceptor 147  行: [sys_admin] GET /scaleph/api/msg uri_params: [{"current":"1","isRead":"0","pageSize":"1000"}]
    scaleph-ui-react  | 172.19.0.1 - - [05/Jan/2023:06:30:33 +0000] "GET /api/msg?pageSize=1000&current=1&isRead=0 HTTP/1.1" 200 144 "http://localhost/dataSource" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:108.0) Gecko/20100101 Firefox/108.0"
    scaleph-api       | 2023-01-05 14:30:33.516  INFO 7 [http-nio-8080-exec-2] .a.c.WebMvcConfig$AsyncWebLogInterceptor 147  行: [sys_admin] GET /scaleph/api/ds/category/type
    scaleph-ui-react  | 172.19.0.1 - - [05/Jan/2023:06:30:33 +0000] "GET /api/ds/category/type HTTP/1.1" 200 8830 "http://localhost/dataSource" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:108.0) Gecko/20100101 Firefox/108.0"
    scaleph-ui-react  | 172.19.0.1 - - [05/Jan/2023:06:30:33 +0000] "GET /api/ds/info?current=1&pageSize=10 HTTP/1.1" 200 3114 "http://localhost/dataSource" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:108.0) Gecko/20100101 Firefox/108.0"
    scaleph-api       | 2023-01-05 14:30:33.562  INFO 7 [http-nio-8080-exec-4] .a.c.WebMvcConfig$AsyncWebLogInterceptor 147  行: [sys_admin] GET /scaleph/api/ds/info uri_params: [{"current":"1","pageSize":"10"}]
    
    bug 
    opened by ocean-zhc 0
  • [Bug] [scaleph-api] delete cluster instance error


    Already searched before asking?

    • [X] I had searched in the issues and found no similar issues.

    Scaleph Version or Branch

    dev

    What happened

    Deleting a Flink cluster instance raises an error, although the Flink cluster itself was created successfully.

    Error Exception

    2023-01-04 21:52:32.260 ERROR 3171 [http-nio-8080-exec-7] c.s.s.a.e.GlobalExceptionHandler         122  行: [sys_admin] DELETE /scaleph/api/flink/cluster-instance/1 
    
    java.lang.IllegalStateException: No ClusterClientFactory found. If you were targeting a Yarn cluster, please make sure to export the HADOOP_CLASSPATH environment variable or have hadoop in your classpath. For more information refer to the "Deployment" section of the official Apache Flink documentation.
    	at org.apache.flink.client.deployment.DefaultClusterClientServiceLoader.getClusterClientFactory(DefaultClusterClientServiceLoader.java:83)
    	at cn.sliew.flinkful.cli.base.util.FlinkUtil.createClientFactory(FlinkUtil.java:139)
    	at cn.sliew.flinkful.cli.base.util.FlinkUtil.retrieve(FlinkUtil.java:121)
    	at cn.sliew.scaleph.engine.flink.service.impl.WsFlinkServiceImpl.shutdown(WsFlinkServiceImpl.java:410)
    	at cn.sliew.scaleph.api.controller.ws.WsClusterInstanceController.shutdownCluster(WsClusterInstanceController.java:78)
    	at cn.sliew.scaleph.api.controller.ws.WsClusterInstanceController$$FastClassBySpringCGLIB$$edf17d80.invoke(<generated>)
    	at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
    	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:793)
    	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
    	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763)
    	at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:89)
    	at cn.sliew.scaleph.api.aspect.LogAspect.actionLogAround(LogAspect.java:104)
    	at jdk.internal.reflect.GeneratedMethodAccessor190.invoke(Unknown Source)
    	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    	at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethodWithGivenArgs(AbstractAspectJAdvice.java:634)
    	at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethod(AbstractAspectJAdvice.java:624)
    	at org.springframework.aop.aspectj.AspectJAroundAdvice.invoke(AspectJAroundAdvice.java:72)
    	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175)
    	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763)
    	at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:97)
    	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
    	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763)
    	at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:708)
    	at cn.sliew.scaleph.api.controller.ws.WsClusterInstanceController$$EnhancerBySpringCGLIB$$94abbe7f.shutdownCluster(<generated>)
    	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    	at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
    	at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150)
    	at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117)
    	at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895)
    	at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808)
    	at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87)
    	at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1071)
    	at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:964)
    	at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006)
    	at org.springframework.web.servlet.FrameworkServlet.doDelete(FrameworkServlet.java:931)
    	at javax.servlet.http.HttpServlet.service(HttpServlet.java:671)
    	at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883)
    	at javax.servlet.http.HttpServlet.service(HttpServlet.java:750)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at cn.sliew.scaleph.security.web.TokenFilter.doFilter(TokenFilter.java:85)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at cn.sliew.scaleph.api.config.WebMvcConfig$WebLogInterceptor.doFilterInternal(WebMvcConfig.java:123)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:337)
    	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:115)
    	at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:81)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:122)
    	at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:116)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:126)
    	at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:81)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:109)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:149)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at cn.sliew.scaleph.security.web.TokenFilter.doFilter(TokenFilter.java:85)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:103)
    	at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:89)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.header.HeaderWriterFilter.doHeadersAfter(HeaderWriterFilter.java:90)
    	at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:75)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:112)
    	at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:82)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:55)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.session.DisableEncodeUrlFilter.doFilterInternal(DisableEncodeUrlFilter.java:42)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:346)
    	at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:221)
    	at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:186)
    	at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:354)
    	at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:267)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:96)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201)
    	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
    	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
    	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
    	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:197)
    	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97)
    	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:541)
    	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135)
    	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)
    	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78)
    	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:360)
    	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:399)
    	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65)
    	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:890)
    	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1789)
    	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
    	at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191)
    	at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
    	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    	at java.base/java.lang.Thread.run(Thread.java:834)
    

    Screenshots


    Are you willing to submit PR?

    • [ ] Yes, I am willing to submit a PR!

    Code of Conduct

    bug 
    opened by huhan110 0
  • [Feature][scaleph-plugin-seatunnel-connectors] integrate more seatunnel v2 connectors on 2.3.0 release


    Already searched before asking?

    • [X] I had searched in the feature and found no similar feature requirement.

    Usage Scenario

    SeaTunnel will release 2.3.0 soon, and a lot of connectors will be added in this version; Scaleph is ready to integrate them.

    Description

    source

    • [ ] greenplum
    • [ ] phoenix
    • [ ] ossjindofile
    • [ ] sftpfile
    • [x] hdfsfile. add hdfs_site_path parameter
    • [ ] s3file. add hadoop_s3_properties parameter
    • [ ] kafka. add start_mode, start_mode.offsets, start_mode.timestamp, partition-discovery.interval-millis parameters
    • [x] pulsar. add poll.timeout parameter
    • [x] emailsink. email rename
    • [ ] http. add content_json and json_field parameters
    • [ ] jdbc. add partition_num, fetch_size parameters
    • [ ] redis. add mode, nodes, hash_key_parse_mode, user parameters
    • [ ] AmazonDynamodb
    • [ ] Cassandra
    • [ ] GoogleSheets
    • [ ] Lemlist
    • [ ] Klaviyo
    • [ ] OneSignal
    • [ ] Jira
    • [ ] Gitlab
    • [ ] Notion
    • [ ] RabbitMQ
    • [ ] OpenMldb
    • [ ] Maxcompute
    • [ ] MySQL-CDC

    sink

    • [ ] assert
    • [ ] clickhousefile
    • [ ] greenplum
    • [ ] phoenix
    • [x] localfile. add batch_size parameter
    • [ ] ossjindofile
    • [ ] sftpfile
    • [ ] kafka. remove partition_key, add partition_key_fields, format, field_delimiter parameters
    • [ ] jdbc. add table, primary_keys, support_upsert_by_query_primary_key_exist parameters
    • [ ] redis. add mode, nodes, user parameters
    • [ ] elasticsearch. add primary_keys, key_delimiter parameters
    • [ ] hive. remove partition_by, sink_columns, is_enable_transaction, save_mode, add table_name
    • [ ] AmazonDynamodb
    • [ ] Cassandra
    • [ ] StarRocks
    • [ ] MyHours
    • [ ] Slack
    • [ ] InfluxDB
    • [ ] Tablestore
    • [ ] RabbitMQ
    • [ ] Doris
    • [ ] Maxcompute
    • [ ] S3Redshift

    Are you willing to submit a PR?

    • [ ] Yes, I am willing to submit a PR!

    Code of Conduct

    feature 
    opened by kalencaya 0
  • [Feature][scaleph-engine-seatunnel] support flink cdc by scaleph v1 connector framework


    Already searched before asking?

    • [X] I had searched in the feature and found no similar feature requirement.

    Usage Scenario

    As the title describes, try to support Flink CDC through the Scaleph v1 connector framework. Flink has a prosperous ecosystem with many big data components, such as the Iceberg and Hudi data lakes and Flink CDC data integration. Contributors have spent much time and energy realizing such connectors, focusing on better performance and usability; the SeaTunnel v1 connector framework provides a simple way to apply such connectors in out-of-the-box data integration tools.

    Description

    support v1 seatunnel

    • [ ] support other data integration tools by dag
    • [ ] support seatunnel v1 engine

    connectors

    • [ ] flink cdc connector
    • [ ] iceberg connector
    • [ ] hudi connector
    • [ ] flink table store connector

    Are you willing to submit a PR?

    • [ ] Yes, I am willing to submit a PR!

    Code of Conduct

    feature 
    opened by kalencaya 0
  • [Feature][scaleph-engine-seatunnel] seatunnel schema service


    Already searched before asking?

    • [X] I had searched in the feature and found no similar feature requirement.

    Usage Scenario

    As many scenarios require developers to add a schema for input data, Scaleph tries to provide a unified schema service.

    Description

    https://open.taobao.com/doc.htm?docId=106556&docType=1
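
    A unified schema service needs a canonical way to describe fields. A hypothetical JSON sketch of one schema entry (field names and layout are illustrative, not an existing Scaleph format):

```json
{
  "schema": "ods_orders",
  "fields": [
    {"name": "order_id",     "type": "bigint",        "nullable": false, "comment": "primary key"},
    {"name": "order_amount", "type": "decimal(10,2)", "nullable": false, "comment": "amount in CNY"},
    {"name": "created_at",   "type": "timestamp(3)",  "nullable": true,  "comment": "order creation time"}
  ]
}
```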

    Are you willing to submit a PR?

    • [ ] Yes, I am willing to submit a PR!

    Code of Conduct

    feature 
    opened by kalencaya 1
  • [Feature][scaleph-engine-sql] flink catalog service

    [Feature][scaleph-engine-sql] flink catalog service

    Already searched before asking?

    • [X] I had searched in the feature and found no similar feature requirement.

    Usage Scenario

    Flink catalog service roadmap.

    Description

    catalog meta service

    • [ ] catalog type. list
    • [ ] catalog config. list, insert, update, delete

    catalog instance service

    • [ ] catalog. register jdbc-catalog
    • [ ] database. list, get, checkExists, create, drop, alter
    • [ ] table. list, get, checkExists, create, drop, alter, rename
    • [ ] view. list, get, checkExists, create, drop, alter, rename
    • [ ] partition. list, get, checkExists, create, drop, alter
    • [ ] function. list, get, checkExists, create, drop, alter
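
    The catalog instance operations above can be sketched as a plain Java service. This is a minimal in-memory sketch for illustration only; the class and method names are hypothetical, and a real implementation would delegate to Flink's `org.apache.flink.table.catalog.Catalog` (e.g. a registered `JdbcCatalog`):

```java
import java.util.*;

// In-memory stand-in for the catalog instance service: databases and tables only.
public class InMemoryCatalog {
    // database name -> ordered set of table names
    private final Map<String, Set<String>> databases = new HashMap<>();

    public List<String> listDatabases() {
        return new ArrayList<>(databases.keySet());
    }

    public boolean databaseExists(String name) {
        return databases.containsKey(name);
    }

    public void createDatabase(String name, boolean ignoreIfExists) {
        if (databases.containsKey(name)) {
            if (!ignoreIfExists) throw new IllegalStateException("database exists: " + name);
            return;
        }
        databases.put(name, new LinkedHashSet<>());
    }

    public void dropDatabase(String name, boolean ignoreIfNotExists) {
        if (databases.remove(name) == null && !ignoreIfNotExists) {
            throw new NoSuchElementException("no such database: " + name);
        }
    }

    public List<String> listTables(String database) {
        Set<String> tables = databases.get(database);
        if (tables == null) throw new NoSuchElementException("no such database: " + database);
        return new ArrayList<>(tables);
    }

    public void createTable(String database, String table) {
        Set<String> tables = databases.get(database);
        if (tables == null) throw new NoSuchElementException("no such database: " + database);
        if (!tables.add(table)) throw new IllegalStateException("table exists: " + table);
    }

    public void renameTable(String database, String oldName, String newName) {
        Set<String> tables = databases.get(database);
        if (tables == null || !tables.remove(oldName)) {
            throw new NoSuchElementException("no such table: " + oldName);
        }
        tables.add(newName);
    }
}
```

    The `boolean ignoreIfExists` / `ignoreIfNotExists` flags mirror the idempotent create/drop style of Flink's `Catalog` interface.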

    Are you willing to submit a PR?

    • [ ] Yes, I am willing to submit a PR!

    Code of Conduct

    feature 
    opened by kalencaya 0