Stream Processing and Complex Event Processing Engine

Siddhi Core Libraries


Siddhi is a cloud native Streaming and Complex Event Processing engine that understands Streaming SQL queries in order to capture events from diverse data sources, process them, detect complex conditions, and publish output to various endpoints in real time.

Siddhi Core Libraries contain the essential libraries needed for Siddhi execution, such as siddhi-core, siddhi-query-api, siddhi-query-compiler, and siddhi-annotations.

Overview

Siddhi can run as an embedded Java or Python library, as a microservice on bare metal, VMs, or Docker, and natively on Kubernetes.

Siddhi provides web-based graphical and textual tooling for development.

For information on Siddhi and its features, refer to the Siddhi Documentation.

Download

Download the Siddhi Core Libraries:

  • Versions 5.x and above with group id io.siddhi.* from here.
  • Versions 4.x and lower with group id org.wso2.siddhi.* from here.

Download Siddhi tooling and runtime distributions here.
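
As an illustration, the 5.x core libraries can be pulled in with Maven coordinates along the following lines (a sketch only; substitute the artifact and the version you need, e.g. the latest release listed further down this page):

    <dependency>
        <groupId>io.siddhi</groupId>
        <artifactId>siddhi-core</artifactId>
        <version>5.1.25</version>
    </dependency>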

Get Started!

Get started with Siddhi in a few minutes by following the Siddhi Quick Start Guide.

For more information on using Siddhi, refer to the Siddhi Documentation.
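
The following is a minimal sketch of embedding Siddhi 5.x as a Java library; the stream, query, and class names are illustrative, and the calls follow the io.siddhi.core API used throughout this page:

    import io.siddhi.core.SiddhiAppRuntime;
    import io.siddhi.core.SiddhiManager;
    import io.siddhi.core.event.Event;
    import io.siddhi.core.query.output.callback.QueryCallback;
    import io.siddhi.core.stream.input.InputHandler;
    import io.siddhi.core.util.EventPrinter;

    public class QuickStart {
        public static void main(String[] args) throws InterruptedException {
            // A Siddhi app with one stream definition and one filter query.
            String siddhiApp =
                    "define stream StockStream (symbol string, price float, volume long); " +
                    "@info(name = 'query1') " +
                    "from StockStream[price > 50] " +
                    "select symbol, price " +
                    "insert into HighPriceStream;";

            SiddhiManager siddhiManager = new SiddhiManager();
            SiddhiAppRuntime runtime = siddhiManager.createSiddhiAppRuntime(siddhiApp);

            // Print the events emitted by 'query1'.
            runtime.addCallback("query1", new QueryCallback() {
                @Override
                public void receive(long timestamp, Event[] inEvents, Event[] removeEvents) {
                    EventPrinter.print(timestamp, inEvents, removeEvents);
                }
            });

            runtime.start();
            InputHandler input = runtime.getInputHandler("StockStream");
            input.send(new Object[]{"WSO2", 55.6f, 100L});
            input.send(new Object[]{"IBM", 35.0f, 50L});

            Thread.sleep(500);          // allow processing to complete
            siddhiManager.shutdown();
        }
    }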

Latest API Docs

The latest API docs are for version 5.1.18.

Support and Contribution

You can reach out through the Slack channel, the Google mail group, and other channels. Please refer to the community contribution site for more information.

Comments
  • CDC for Teradata

    Description:

    Can we perform CDC using a Siddhi app for any RDBMS database that has a JDBC driver, such as Teradata, or are there any restrictions?

    Affected Siddhi Version:

    OS, DB, other environment details and versions:

    Steps to reproduce:

    Related Issues:

    type/question priority/high 
    opened by Sanket-Tantia 22
  • Create indexed CompareCollectionExpression when left and right parts are indexed attributes

    Purpose

    When using an indexed in-memory table and updating records in it, the performance decreases as the table size increases. The problem is related to the creation of the CompareCollectionExpression that the update operation on the in-memory table uses. The collection executor created by parsing the collection expression has an EXHAUSTIVE scope instead of an INDEXED_RESULT_SET scope, and thus any 'find' operation iterates over all elements of the table.

    Goals

    Fix #1088

    Approach

    All details are in Issue #1088

    Release note

    Use indexed CompareCollectionExpression when left and right parts are indexed attributes

    Documentation

    N/A

    Automation tests

    N/A

    Security checks

    N/A

    opened by debelyoo 13
  • siddhi window oom

    Description: I integrated Siddhi into Storm, but when using Siddhi's window.timeBatch method I found that the corresponding Storm bolt is often restarted. Monitoring where the program spends its time shows that org.wso2.siddhi.core.util.snapshot.SnapshotService.addSnapshotable(SnapshotService.java:78) takes up most of the resources and eventually causes an OOM. I used multiple time aggregations. Why does this happen?

    type/bug resolution/fixed 
    opened by 478682649 12
  • siddhi how to confirm the data availability?

    I have deployed a Siddhi cluster on Docker containers and restart the cluster frequently (at least once per day). My question is: when I restart a container, will data be lost or not?

    If data can be lost, what can I do to ensure data availability?
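
    One avenue worth noting: Siddhi can snapshot application state through a PersistenceStore and restore the last revision after a restart. The sketch below is illustrative only; InMemoryPersistenceStore is used for brevity, and a durable (file or database backed) store implementation would be needed to survive container restarts.

        import io.siddhi.core.SiddhiAppRuntime;
        import io.siddhi.core.SiddhiManager;
        import io.siddhi.core.util.persistence.InMemoryPersistenceStore;

        public class PersistenceSketch {
            public static void main(String[] args) {
                SiddhiManager siddhiManager = new SiddhiManager();
                // Illustration only: use a durable PersistenceStore to survive container restarts.
                siddhiManager.setPersistenceStore(new InMemoryPersistenceStore());

                String siddhiApp = "define stream InStream (val int); " +
                        "@info(name = 'q1') from InStream#window.length(10) select val insert into OutStream;";
                SiddhiAppRuntime runtime = siddhiManager.createSiddhiAppRuntime(siddhiApp);
                runtime.start();

                runtime.persist();              // snapshot the current state (call periodically)
                runtime.restoreLastRevision();  // after a restart, restore the last snapshot before sending events

                siddhiManager.shutdown();
            }
        }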

    type/question 
    opened by xywan89 11
  • High CPU cost by getLastEvent while add/remove event in ComplexEventChunk

    Description:

    Each add/insert method in ComplexEventChunk needs to call getLastEvent, which causes very high CPU usage because it has to traverse the whole ComplexEventChunk to find the last event. This has two problems:

    1. It makes add/insert O(n) in complexity.
    2. lastEvent.next may be lastEvent itself.

    https://github.com/siddhi-io/siddhi/blob/master/modules/siddhi-core/src/main/java/io/siddhi/core/event/ComplexEventChunk.java#L107-L109
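
    As an illustration of the reported problem, here is a hypothetical sketch (not the actual Siddhi code) of how caching a tail reference keeps appends O(1) instead of walking the whole chunk via getLastEvent():

        // Hypothetical sketch of a singly linked event chunk with a cached tail pointer.
        final class EventChunkSketch<E> {
            private Node<E> first;
            private Node<E> last;   // cached tail avoids the O(n) getLastEvent() walk

            void add(E event) {
                Node<E> node = new Node<>(event);
                if (first == null) {
                    first = node;
                } else {
                    last.next = node;
                }
                last = node;        // tail stays current without traversal
            }

            private static final class Node<E> {
                final E value;
                Node<E> next;
                Node(E value) { this.value = value; }
            }
        }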

    type/bug severity/major 
    opened by haoch 11
  • Is it possible to update query/rule without shutting down runtime?

    We want to use Siddhi to process streaming data for anomaly detection. The performance is great. The problem is that we need to allow users to add or update queries/rules dynamically. From the documentation / API / Google, I cannot find a good solution for such cases.
    In the stream processor, it seems you always have to shut down the old runtime and start a new one. If a large number of rules are running, this approach incurs a big overhead. Are there any better solutions?

    type/question 
    opened by hushengyue 10
  • using siddhi for video analysis

    Hi, I want to use Siddhi to do video analysis. Is it possible? I saw many tutorials about using streams, such as quote streams and so on, but nothing about streaming video. Do you have any sample for this?

    opened by micuentadecasa 10
  • Support for HoppingWindow e.g., 24-hour window moving at 1-minute intervals

    Description: Will there be support for Hopping Window? e.g., the ability to provide a window in which the events can occur, but it hops at a predefined amount of time.

    e.g., 24-hour window, hopping at 1 minute at a time.

    Hopping by 1 minute would allow you to receive events as soon as the criteria are met, before the 24-hour window ends. In my use case, I'm looking to detect whether multiple events happen within a 24-hour period, but I want to know when that happens.

    Suggested Labels: improvement

    opened by rburton 10
  • Siddhi external time

    Hello,

    I wanted to write a simple query with Siddhi which gives me back the hits in a batch. I could easily do it with Esper, but it seems I had some problems with Siddhi. Is there a way to use external time instead of internal time? I tried to use the External Time Window feature, but it can only be used for simple queries, so it is not enough for me.

    Here is the poc (ESPER):

            Configuration con = new Configuration();
            //con.addEventType("LoginType", beanClass);
            con.addEventType("LoginType", LoginType.class);
            EPServiceProvider epService = EPServiceProviderManager.getDefaultProvider(con);
            String expression = "select count(loginType), timeStamp, userName,loginType,systemType,eventID, count(eventID) as ecntid from LoginType.win:time_batch(5 sec) where eventID='254' and userName='maci' ";
            //expression = "select a.timeStamp, a.userName,a.loginType,a.systemType,count(a.eventID ) as ecntid from pattern[every a=LoginType(eventID='254' and count(a.eventID) > 3) where timer:within(5 sec)";
            //expression = "select o.timeStamp, o.userName,o.systemType,o.eventID, count(o.eventID), b.timeStamp, b.eventID from pattern[every o=LoginType -> (timer:interval(5 sec) and b=LoginType(eventID=o.eventID) )]";
    
            expression = "select * from LoginType.win:time_batch(5 sec)  " +
                    "   match_recognize ( " +
                    "   measures A.eventID as aEID, B.eventID as bEID , A.timeStamp as firstStamp, B.timeStamp as secondStamp " +
                    "   pattern ( A B ) " +
                    "   define " +
                    "   A as A.eventID = '254' and A.userName='maci' , " +
                    "   B as B.eventID = '255' and prev(B.userName) = A.userName " +
                    "  ) ";
    
            //expression = "select a.eventID,b.eventID,a.userName,b.userName,a.timeStamp,b.timeStamp from pattern[every a=LoginType(a.eventID='254') -> (timer:interval(5 sec) and b=LoginType(b.eventID='255'))]";
    
            /*
            expression = "select sorted(price desc).take(5) as highestprice " +
                    " from LoginType.win:time(5 min)  ";
            */
    
            EPStatement statement = epService.getEPAdministrator().createEPL(expression);
            MyListener listener = new MyListener();
            statement.addListener(listener);
    
            Calendar c = Calendar.getInstance();
            EPRuntime runtime = epService.getEPRuntime();
            runtime.sendEvent(new TimerControlEvent(TimerControlEvent.ClockType.CLOCK_EXTERNAL));
    
            long start = new Date().getTime();
            int k = 0;
            for (k = 0; k <= 100000000; k++) {
                c.add(Calendar.SECOND, 1);
                String eventID = "254";
                if (k % 3 == 0) {
                    eventID = "255";
                }
                runtime.sendEvent(new CurrentTimeEvent(c.getTime().getTime()));
                runtime.sendEvent(new LoginType(new Date(c.getTime().getTime()), "type1", "any name", eventID, "windows", new Random().nextInt(1000)));
    
            }
            long end = new Date().getTime();
            System.out.println(end - start);
    
    
    

    I tried to use a similar query, but I was not able to use the timestamps of my logs. Could you give me a similar example with Siddhi? Here are my attempts to create the above example with Siddhi:

        public void externalTimeWindowTest1() throws InterruptedException {
            SiddhiManager siddhiManager = new SiddhiManager();
            String cseEventStream = "define stream LoginEvents (myTime long, ip string, phone string,price int) ;";
            String query = "@info(name = 'query1') from LoginEvents#window.timeBatch(5 sec)  "
                    + "select myTime, phone, ip, price , max(price) as maxprice, min(price) as minprice, count(myTime) as cntip insert all events into OutPut ";
    
    
            /*String query = "@info(name='query1') from every a1 = LoginEvents  " +
                    "            -> b1 = LoginEvents[b1.ip == a1.ip ]#window.externalTime(b1.myTime,5 second)   " +
                    "       within 5 seconds  select a1.myTime, a1.phone, a1.ip, a1.price , max(a1.price) as maxprice, min(a1.price) as minprice, count(a1.myTime) as cntip insert current events into Output ";
            */
            final ExecutionPlanRuntime executionPlanRuntime = siddhiManager.createExecutionPlanRuntime(cseEventStream + query);
    
            executionPlanRuntime.addCallback("query1", new QueryCallback() {
    
                @Override
                public void receive(long timeStamp, Event[] inEvents, Event[] removeEvents) {
                    EventPrinter.print(inEvents);
                    System.out.println(new Date(timeStamp));
                    if (inEvents != null) {
                        System.out.println("======================== START ===============================");
                        for (Event e : inEvents) {
                            if (e.isExpired()) continue;
    
                            System.out.println("----------------------------");
                            System.out.println(new Date(e.getTimestamp()));
                            System.out.println("IP:" + e.getData(2));
                            System.out.println("Max price:" + e.getData(4));
                            System.out.println("Min price:" + e.getData(5));
                            System.out.println("IP siddhiCount:" + e.getData(6));
                            System.out.println("Expired :" + e.isExpired());
                            System.out.println("----------------------------");
                        }
                        System.out.println("======================== END  ===============================");
                    }
                }
            });
            executionPlanRuntime.start();
    
    
            new Thread(new Runnable() {
                @Override
                public void run() {
                    Calendar c = Calendar.getInstance();
                    c.add(Calendar.HOUR, 1);
                    c.add(Calendar.SECOND, 1);
                    InputHandler inputHandler = executionPlanRuntime.getInputHandler("LoginEvents");
                    int i = 0;
                    for (i = 0; i <= 1000; i++) {
                        c.add(Calendar.SECOND, 1);
                        try {
                            inputHandler.send(c.getTime().getTime(), new Object[]{c.getTime().getTime(), new String("192.10.1.1"), "1", new Random().nextInt(1000)});
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                        }
                    }
                }
            }).start();
    
            sleep(15000);
            executionPlanRuntime.shutdown();
            System.out.println("Done");
        }
    
    

    Version :

    HEAD detached at v3.0.1 (commit e18aaef), nothing to commit, working directory clean
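
    One possible direction (a sketch only, assuming a Siddhi release that ships the externalTimeBatch window): drive the batch window from the event's own timestamp attribute instead of wall-clock time. The stream mirrors the attempt above; the query and output stream names are illustrative.

        define stream LoginEvents (myTime long, ip string, phone string, price int);

        @info(name = 'query1')
        from LoginEvents#window.externalTimeBatch(myTime, 5 sec)
        select myTime, phone, ip, price, max(price) as maxprice, min(price) as minprice, count() as cntip
        insert all events into OutputStream;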

    opened by emraxxor 10
  • Siddhi App 56d572bd-caaa-41c5-928a-26813ae8bcd2 not stabilized for snapshot/restore, Active thread count is 1

    Description: Not stabilized for snapshot/restore, Active thread count is 1.
    Affected Siddhi Version: siddhi-core:5.1.7
    OS, DB, other environment details and versions: I use the jar built from the Flink-Siddhi integration (flink-siddhi.jar, https://github.com/haoch/flink-siddhi) and run it on Hadoop.
    Steps to reproduce: exception information:

        io.siddhi.core.exception.SiddhiAppRuntimeException: Siddhi App 47b2c172-51e7-45ff-b471-1ac16de23e5f not stabilized for snapshot/restore, Active thread count is 1
            at io.siddhi.core.util.snapshot.SnapshotService.waitForSystemStabilization(SnapshotService.java:768)
            at io.siddhi.core.util.snapshot.SnapshotService.fullSnapshot(SnapshotService.java:100)
            at io.siddhi.core.SiddhiAppRuntimeImpl.snapshot(SiddhiAppRuntimeImpl.java:674)
            at org.apache.flink.streaming.siddhi.operator.AbstractSiddhiOperator.checkpointSiddhiRuntimeState(AbstractSiddhiOperator.java:377)
            at org.apache.flink.streaming.siddhi.operator.AbstractSiddhiOperator.processElement(AbstractSiddhiOperator.java:220)
            at org.apache.flink.streaming.runtime.io.StreamInputProcessor.processInput(StreamInputProcessor.java:202)
            at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.run(OneInputStreamTask.java:105)
            at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:300)
            at org.apache.flink.runtime.taskmanager.Task.run(Task.java:711)
            at java.lang.Thread.run(Thread.java:745)

    Related Issues: What causes this problem and how can it be solved? Thank you.

    opened by guanyufen123 9
  • how can i use object as input and then use in the select block

    my siddhi content is below:

    @App:name("App_test") @App:description("App_Description") define stream cepInput(student object); define stream cepOutput(name string,age int); @info(name = "helloWorld1Query") from cepInput select student.name,student.age insert into cepOutput;

    the error info is below:

    org.wso2.siddhi.query.api.exception.AttributeNotExistException: Error on 'App_test' @ Line: 10. Position: 19, near 'student.name'. Cannot find attribute type as 'name' does not exist in 'cepInput'; define stream cepInput (student object)

    The student object has two attributes, name and age. I wanted to access those attributes in the select block using the xxx.xxx notation to get the attribute values.

    Does Siddhi have another way to implement this scenario? Thanks a lot.
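
    One possible approach (a sketch, assuming events enter through a source with the JSON mapper): map the nested JSON fields to flat stream attributes with JSONPath expressions in @attributes, so the query selects primitive attributes instead of an object. The source type and topic below are illustrative.

        @App:name("App_test")

        @source(type = 'inMemory', topic = 'students',
                @map(type = 'json',
                     @attributes(name = "$.student.name", age = "$.student.age")))
        define stream cepInput (name string, age int);

        define stream cepOutput (name string, age int);

        @info(name = "helloWorld1Query")
        from cepInput
        select name, age
        insert into cepOutput;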

    question 
    opened by maozhihui 9
  • How Pattern enables continuous monitoring

    Description:

    I want to raise an alarm after the temperature has been greater than or equal to 48 degrees for 30 seconds (at least 30 seconds). I did not find an API for this, only detecting event non-occurrence in Pattern. It is not really a non-occurrence scenario, and it feels strange to use not. Something like the following? within <time gap> defines the maximum time range, whereas what I need is a condition satisfied over a minimum time range.

    from every e1=OriginalStream[pointID == 1]
         -> not OriginalStream[e1.pointID == pointID and temp <= 48] for 30 sec
    select e1.pointID, 'Alarm!' as result
    insert into OutputStream;
    

    Affected Siddhi Version: v5.1

    OS, DB, other environment details and versions:
    Windows 10

    Steps to reproduce:

    Related Issues:

    opened by immno 0
  • Bump snakeyaml from 1.30 to 1.32

    Bumps snakeyaml from 1.30 to 1.32.

    Commits
    • b8239ec Add warning about untrusted data on landing page
    • 2853420 Merge remote-tracking branch 'origin/master'
    • 4b3d996 Merged master into format-2
    • 4081e08 Reformat with IntelliJ
    • 0305c04 Reformat tests with IntelliJ
    • fedd984 Reformat with IntelliJ
    • e5985fa Reformat tests with IntelliJ
    • ebad791 Move formatting to Maven profile
    • 72dfa9f Set the limit for incoming data to prevent a CVE report in NIST
    • 5e56066 Improve error message for too big document
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies java 
    opened by dependabot[bot] 0
  • Partitioned Distributed Sink encounter NullPointerException

    Description: Hello, I am new to Siddhi. When I follow the "Distributed Sink" section of the Query Guide, I can't make the demo run successfully.

    (1) Is my usage wrong?

    Affected Siddhi Version: siddhi-tooling-5.1.0

    OS, DB, other environment details and versions:
    CentOS Linux release 7.5.1804

    Steps to reproduce: I have a service which exposes 2 endpoints: (1) /api/demo/test/post (2) /api/demo/test/post2; both require a User entity.

    below is my siddhi settings:

    @App:name("HelloWorldPartitionApp")
    
    @source(type = 'http', receiver.url = "http://0.0.0.0:8008/cargo123", @map(type = 'json'))
    define stream CargoStream1 (name string,age int);
    
    @sink(type='http',method='POST',@map(type='json',validate.json='true',@payload("""{"userName":"{{weight}}","age":"{{age}}","mobile":"{{totalWeight}}"}""")),
        @distribution(strategy='partitioned', partitionKey='weight',
            @destination(publisher.url='http://192.168.1.39:9911/api/demo/test/post'),
            @destination(publisher.url='http://192.168.1.39:9911/api/demo/test/post2')))
    define stream OutputStream1(weight string,age int, totalWeight long);
    
    @info(name='HelloWorldPartitionQuery')
    from CargoStream1
    select name as weight, age, sum(age) as totalWeight
    insert into OutputStream1;
    

    my request body for cargo123:

    {
        "name": "Tom",
        "age": 28
    }
    

    Related Issues:

    [2022-10-18_15-40-38_886] ERROR {io.siddhi.core.stream.StreamJunction} - Error in 'HelloWorldPartitionApp' after consuming events from Stream 'OutputStream1', null. Hence, dropping event 'StreamEvent{ timestamp=1666078838886, beforeWindowData=null, onAfterWindowData=null, outputData=[Tom, 28, 28], type=CURRENT, next=null}' (Encoded) 
    
    java.lang.NullPointerException
    	at io.siddhi.extension.io.http.sink.HttpSink.sendRequest(HttpSink.java:841)
    	at io.siddhi.extension.io.http.sink.HttpSink.publish(HttpSink.java:618)
    	at io.siddhi.core.util.transport.SingleClientDistributedSink.publish(SingleClientDistributedSink.java:61)
    	at io.siddhi.core.stream.output.sink.distributed.DistributedTransport.publish(DistributedTransport.java:125)
    	at io.siddhi.core.stream.output.sink.Sink.publish(Sink.java:182)
    	at io.siddhi.extension.map.json.sinkmapper.JsonSinkMapper.mapAndSend(JsonSinkMapper.java:211)
    	at io.siddhi.core.stream.output.sink.SinkMapper.mapAndSend(SinkMapper.java:180)
    	at io.siddhi.core.stream.output.sink.SinkCallback.receive(SinkCallback.java:55)
    	at io.siddhi.core.stream.output.StreamCallback.receive(StreamCallback.java:100)
    	at io.siddhi.core.stream.StreamJunction.sendEvent(StreamJunction.java:176)
    	at io.siddhi.core.stream.StreamJunction$Publisher.send(StreamJunction.java:465)
    	at io.siddhi.core.query.output.callback.InsertIntoStreamCallback.send(InsertIntoStreamCallback.java:56)
    	at io.siddhi.core.query.output.ratelimit.OutputRateLimiter.sendToCallBacks(OutputRateLimiter.java:104)
    	at io.siddhi.core.query.output.ratelimit.PassThroughOutputRateLimiter.process(PassThroughOutputRateLimiter.java:44)
    	at io.siddhi.core.query.selector.QuerySelector.process(QuerySelector.java:97)
    	at io.siddhi.core.query.input.ProcessStreamReceiver.processAndClear(ProcessStreamReceiver.java:183)
    	at io.siddhi.core.query.input.ProcessStreamReceiver.process(ProcessStreamReceiver.java:90)
    	at io.siddhi.core.query.input.ProcessStreamReceiver.receive(ProcessStreamReceiver.java:128)
    	at io.siddhi.core.stream.StreamJunction.sendEvent(StreamJunction.java:199)
    	at io.siddhi.core.stream.StreamJunction$Publisher.send(StreamJunction.java:474)
    	at io.siddhi.core.stream.input.InputDistributor.send(InputDistributor.java:34)
    	at io.siddhi.core.stream.input.InputEntryValve.send(InputEntryValve.java:45)
    	at io.siddhi.core.stream.input.InputHandler.send(InputHandler.java:78)
    	at io.siddhi.core.stream.input.source.PassThroughSourceHandler.sendEvent(PassThroughSourceHandler.java:35)
    	at io.siddhi.core.stream.input.source.InputEventHandler.sendEvent(InputEventHandler.java:81)
    	at io.siddhi.extension.map.json.sourcemapper.JsonSourceMapper.mapAndProcess(JsonSourceMapper.java:234)
    	at io.siddhi.core.stream.input.source.SourceMapper.onEvent(SourceMapper.java:152)
    	at io.siddhi.core.stream.input.source.SourceMapper.onEvent(SourceMapper.java:118)
    	at io.siddhi.extension.io.http.source.HttpWorkerThread.run(HttpWorkerThread.java:62)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at java.lang.Thread.run(Thread.java:748)
    
    opened by GitHub-Yann 0
  • set timezone information to io.siddhi.core.trigger.CronTrigger#scheduleCronJob?

    Description:

    I want to add custom time zone information to the method io.siddhi.core.trigger.CronTrigger#scheduleCronJob.

    Affected Siddhi Version:

    OS, DB, other environment details and versions:

    Steps to reproduce:

    Related Issues:

    opened by networkboy 0
  • Pattern stream with logical condition "and" does not work correctly

    Description:

    @App:name('axx')
    define stream RegulatorStateChangeStream(id string, deviceID long,
    roomNo int, tempSet double, action string);
    define stream RoomKeyStream(id string, deviceID long, roomNo int,
        action string);
    
    @sink(type='log')
    define stream RegulatorActionStream(id1 string, id2 string, roomNo int, action string);
    
    from every (e1=RegulatorStateChangeStream[ action == 'on'] and e2=RoomKeyStream[ action == 'removed' ])
    select e1.id as id1, e2.id as id2, e1.roomNo,
        ifThenElse( e2 is null, 'none', 'stop' ) as action
    insert into RegulatorActionStream;
    

    If I send data as follows:

    regulatorStateChangeStreamHandler.send(new Object[] {"a", 10L, 5, 30, "on"});
    roomKeyStreamHandler.send(new Object[]{"b1",  10, 5, "removed"});
    

    I get one match, but if I reverse the order of sending the data, I do not get any match.

    Affected Siddhi Version: 5.x.x

    OS, DB, other environment details and versions:

    Steps to reproduce:

    Related Issues:

    opened by Allan-QLB 0
  • installation of extension in siddhi docker

    Hi, I am referring to this link https://siddhi-io.github.io/siddhi-io-kafka/ to install the extension. However, I am unable to find any extension installations in the Docker image. Can anyone advise me how to install it? Thank you.

    opened by lchunleo 0
Releases (v5.1.25)
  • v5.1.25(Nov 1, 2022)

    What's Changed

    • Updates to make the Siddhi repository buildable with Java 11.
    • Fixes related to aggregations.

    Complete Changes

    Please find the complete changes here

    Source code(tar.gz)
    Source code(zip)
  • v5.1.24(Mar 31, 2022)

    What's Changed

    • Fix latency/throughput tracker issues by @grainier in https://github.com/siddhi-io/siddhi/pull/1774
    • Fix latency tracker markOuts by @grainier in https://github.com/siddhi-io/siddhi/pull/1772
    • Fix aggregation event duplication with HA and state persistence by @senthuran16 in https://github.com/siddhi-io/siddhi/pull/1769
    • Bump versions of external dependencies by @AnuGayan in https://github.com/siddhi-io/siddhi/pull/1770

    Full Changelog: https://github.com/siddhi-io/siddhi/compare/v5.1.22...v5.1.24

    Source code(tar.gz)
    Source code(zip)
  • v5.1.21(Feb 15, 2022)

  • v5.1.20(Nov 25, 2021)

    New Features and Improvements

    • Improve IncrementalDataPurger logic and Persisted Aggregation by @AnuGayan in https://github.com/siddhi-io/siddhi/pull/1724
    • Fix invalid json error given for a nested Json message by @dilini-muthumala in https://github.com/siddhi-io/siddhi/pull/1723
    • Fix testcase failure by @dilini-muthumala in https://github.com/siddhi-io/siddhi/pull/1725
    • Improve incremental aggregation error handling by @AnuGayan in https://github.com/siddhi-io/siddhi/pull/1726
    • Improve persisted aggregations by introducing a queue implementation by @dnwick in https://github.com/siddhi-io/siddhi/pull/1728
    • Add debug logs by @dnwick in https://github.com/siddhi-io/siddhi/pull/1729
    • Fix the persisted aggregation SQL query. by @ashendes in https://github.com/siddhi-io/siddhi/pull/1738
    • Add runtime warnings for the deprecated extensions used in Siddhi Apps by @grainier in https://github.com/siddhi-io/siddhi/pull/1734
    • Improve persisted aggregation and logs by @AnuGayan in https://github.com/siddhi-io/siddhi/pull/1740
    • Add missing Siddhi keywords under 'keyword' section by @senthuran16 in https://github.com/siddhi-io/siddhi/pull/1748
    • Primary key violation during persisted aggregation fixed. by @ashendes in https://github.com/siddhi-io/siddhi/pull/1749
    • Fix failing test cases in CacheCornerCasesTest and CacheExpiryAndReloadTestCase by @senthuran16 in https://github.com/siddhi-io/siddhi/pull/1752
    • Fix reconnecting for Error Store by @dilini-muthumala in https://github.com/siddhi-io/siddhi/pull/1755

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.19(Mar 19, 2021)

  • v5.1.12(Jan 16, 2020)

    Overview

    Siddhi 5.1.12 release consists of a significant improvement to support expression based window expiry and many other bug fixes for Siddhi incremental aggregation and in-memory sink/source.

    New Features & Improvements

    • Siddhi sliding and batch windows which support event expiry based on an expression. (#1599)
    • Add support to remove stream and query callback dynamically. (#1594)

    Bug Fixes

    • Store query parsing points to wrong siddhi app position when printing error log. (#1588)
    • AGG_TIMESTAMP value is not applied for aggregation with persistence store. (#1593)
    • Error stacktrace is not printed when dropping the event at Sink. (#1595)
    • Add latch mechanism to pause/resume in in-memory source. (#1605)
    • Siddhi input manager is not thread safe. (#1604)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.11(Dec 13, 2019)

  • v5.1.10(Dec 6, 2019)

    Overview

    Siddhi 5.1.10 release consists of bug fixes related to the filter operation in table joins and minor documentation improvements.

    Features & Improvements

    • Improve session window documentation. (#1580)

    Bug Fixes

    • Filter does not work when joining with table. (#1570)
    • Fix possible NPE for invalid call sink and call-response configuration. (#1577)
    • Code reformatting fixes. (#1579)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.8(Nov 8, 2019)

    Overview

    Siddhi 5.1.8 release consists of bug fixes related to triggers, event mapping, and event chunking.

    Features & Improvements

    • Support mapping transport properties to any data type. (#1560)

    Bug Fixes

    • NPE being thrown intermittently while removing bundles in shutdown process. (#1541)
    • Avoid triggers to start before other elements. (#1543)
    • SiddhiQL.g4 grammar file not building with antlr. (#1550)
    • Event chunk breaks for scatter-gather data processing with joins. (#1559)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.7(Oct 7, 2019)

  • v5.1.6(Oct 6, 2019)

    Overview

    Siddhi 5.1.6 release consists of bug fixes related to Siddhi sink retry implementation, logging and aggregations. Most importantly, it contains a bug fix for the Siddhi extension loading issue in slow environments.

    Features & Improvements

    • Improve error with context when updating environment variable (#1523)

    Bug Fixes

    • Fix reconnection logic when publish() always throws a connection unavailable exception. (#1525)
    • Fix Siddhi extension loading issue in slow environments. (#1529)
    • Fix the inconsistent behaviour of aggregation optimization. (#1533)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.5(Sep 27, 2019)

    Overview

    Siddhi 5.1.5 release consists of a lot of bug fixes covering various execution parts of Siddhi; mainly it contains fixes related to in-memory event table, error handling, extension loading, and event synchronization.

    Features & Improvements

    • Improve logs for duplicate extension additions (#1521)
    • Code refactoring changes to rename store query to On-Demand query (#1506)

    Bug Fixes

    • Fix for NPE when using stream name to refer to attributes in aggregation join queries. (#1503)
    • Fix update or insert operation in InMemoryTable for EventChunks. (#1497) , (#1512)
    • Fix for extension loading issue in certain OS environments (slow environments) (#1507)
    • Bug fixes related to error handling in Triggers (#1515)
    • Stop running on-demand queries if the Siddhi app has shut down (#1515)
    • Fix input handler being silent when the Siddhi app is not running (throw an error when the input handler is used without starting the Siddhi App runtime) (#1518)
    • Fix synchronization issues in the BaseIncrementalValueStore class (#1520)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.4(Sep 10, 2019)

    Overview

    Siddhi 5.1.4 release consists of improvements related to @index annotation usage in stores and some dependency upgrades.

    Features & Improvements

    • Change the behavior of in-memory tables to support multiple '@index' annotations. (#1491)

    Bug Fixes

    • Fix NPE when count() AttributeFunction is used (#1485)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.3(Aug 29, 2019)

    Overview

    Siddhi 5.1.3 release consists of a lot of improvements for existing functionalities supported by Siddhi. These improvements add more value to the use cases such as throttling, continuous testing & integration and error handling.

    Features & Improvements

    • Introduce RESET processing mode to preserve memory optimization. (#1444)
    • Add support for a YAML Config Manager for easy setting of system properties in SiddhiManager through a YAML file (#1446)
    • Support to create a Sandbox SiddhiAppRuntime for testing purposes (#1451)
    • Improve convert function to provide message & cause for Throwable objects (#1463)
    • Support a way to retrieve the sink options and type at sink mapper. (#1473)
    • Support error handling (log/wait/fault-stream) when event sinks publish data asynchronously. (#1473)

    Bug Fixes

    • Fixes to TimeBatchWindow to process events in a streaming manner when it is enabled to send current events in streaming mode. This makes sure all having conditions are matched against the output, thereby allowing users to effectively implement throttling use cases with alert suppression. (#1441)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.2(Aug 6, 2019)

    Overview

    Siddhi 5.1.2 release consists of key improvements related to Siddhi patterns and template builders. Other than that, it contains various bug fixes for Siddhi error handling, among others.

    Highlights

    • There is an improvement done for Template Builder by removing Java Message Format dependency since it is causing some inconsistencies with performing custom mapping for float, double and long values. Due to this fix, there might be some differences (corrected proper output) in the output that you get for custom output mapping with Text, XML, JSON, and CSV. (#1431)
    • There is a behavioral change introduced with the improvements done in (#1421). When counting patterns such as e1=StockStream<2:8> are used and they are referred to without indexes, such as e1.price, Siddhi collects the price values from all the events in the counting pattern e1 and produces them as a list. Since a list is not native to Siddhi, the attribute will have object as its type. In older Siddhi versions, it would output the last matching event's attribute value.

    Features & Improvements

    • SiddhiManager permits user-defined data to be propagated throughout the stack (#1406)
    • API to check whether the Siddhi App is stateful or not (#1413)
    • Support outputting the events collected in counting-pattern as a list (#1421)
    • Support API docs having multiline code segments (#1430)
    • Improve TemplateBuilder & remove Java MessageFormat dependency (#1431)
    • Support pattern ‘every’ clause containing multiple state elements with within condition (#1435)

    Bug Fixes

    • Siddhi Error Handlers not getting engaged (#1419)
    • Incremental persistence to work on Windows Environment (https://github.com/siddhi-io/siddhi/pull/1421/commits/9c37b0d8fc8ce271551d4106bb20231334846f59)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.1(Aug 6, 2019)

    Overview

    Siddhi 5.1.1 release consists of improvements related to Siddhi store join query optimizations and various bug fixes related to patterns, aggregations, and more.

    Features & Improvements

    • Siddhi store join query optimizations (#1382)

    Bug Fixes

    • Log Rolling when aggregation query runs when Database is down (#1380)
    • Fix to avoid API changes introduced for Siddhi store implementation in Siddhi 5.1.0 (#1388)
    • Counting pattern issue with ‘every’ (#1392)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.1.0(Jul 5, 2019)

    Overview

    Siddhi 5.1.0 release consists of improvements to the error messages used for the no-parameter case when the paramOverload annotation is in place.

    Features & Improvements

    • Minor improvements related to error messages used for the no param case when paramOverload annotation is in place. (#1375)

    Complete Changes

    Please find the complete changes here

    Download

    The download links for the Siddhi core libs are as follows:

    Source code(tar.gz)
    Source code(zip)
  • v5.0.2(Jul 2, 2019)

    Overview

    Siddhi 5.0.2 release consists of various improvements in terms of automatic documentation generation, extension API validations and new theme update.

    Features & Improvements

    • Add supported Siddhi version information on extension API docs (#1366)
    • Add the origin jar information when generating documentation (#1365)

    Bug Fixes

    There are no bug fixes in this release

    Complete Changes

    Please find the complete changes here

    Source code(tar.gz)
    Source code(zip)
  • v5.0.1(Jun 29, 2019)

    Overview

    Siddhi 5.0.1 release consists of various improvements done for Siddhi aggregations, store implementation, and annotation based validations. It also includes some critical bug fixes done for incremental aggregation as well.

    Features & Improvements

    • Caching support for Siddhi Store (Event Table) implementation (#1319)
    • Incremental aggregation improvements such as efficient In-Memory data recreation, context-sensitive data retrieval and Inline data filtering at DB level (#491)
    • Validate extension specific parameters of SiddhiApp with the patterns specified by a new annotation @ParameterOverload in the extension class (#1368)

    Breaking API Changes

    • Backward incompatible changes for using stores (#1386)

    Bug Fixes

    Please find the bug fixes here

    Complete Changes

    Please find the complete changes here

    Source code(tar.gz)
    Source code(zip)
  • v5.0.0(Apr 15, 2019)

    The Siddhi team is excited to announce the Siddhi engine 5.0.0 release. Please find the major improvements and features introduced in this release below.

    Compatibility & Support

    • There are no syntax changes introduced in this release. If you are already using Siddhi 4.x.x and want to migrate to Siddhi 5.0.0, you can simply move with zero query changes.
    • Extension APIs have been improved/changed, thus custom extensions written for Siddhi 4.x.x no longer work in Siddhi 5.0.0. You have to migrate the extensions to Siddhi 5.0.0; the changes are straightforward, please refer to the Siddhi source code to get an understanding of them.

    Features & Improvements

    Bug Fixes

    Refer to the GitHub milestone issues to view bug fixes.


    Note: Please refer to the Siddhi distribution release to find the distributions of Siddhi runner and Siddhi tooling.

    Reporting Issues: Issues can be reported via the GitHub Issue Tracker.

    Contact us: The Siddhi-Dev Google Group is the main Siddhi project discussion forum for developers.

    Users can use the Siddhi-User Google Group to raise queries and get help with their use cases.

    StackOverflow can also be used to get support, and GitHub for issues and code repositories.

    Source code(tar.gz)
    Source code(zip)
  • v4.4.8(Feb 22, 2019)

    Release notes

    • Introducing fault streams, error and backpressure handling
    • Support for converting output events to key-value pairs at StreamCallback, i.e. Event to Map and Event[] to Map[].
    Source code(tar.gz)
    Source code(zip)
  • v4.3.0(Dec 13, 2018)

    The following API changes have been introduced in this release:

    • The following method in the org.wso2.siddhi.core.query.processor.stream.window.QueryableProcessor interface is deprecated.
    StreamEvent query(StateEvent matchingEvent, CompiledCondition compiledCondition,
                          CompiledSelection compiledSelection)
                throws ConnectionUnavailableException;
    

    The above method should be replaced with the following method:

        StreamEvent query(StateEvent matchingEvent, CompiledCondition compiledCondition,
                          CompiledSelection compiledSelection, Attribute[] outputAttributes)
                throws ConnectionUnavailableException;
    
    • As a result of the above change in QueryableProcessor, the following method in the org.wso2.siddhi.core.table.record.RecordTableHandler abstract class is also deprecated.
        public abstract Iterator<Object[]> query(long timestamp, Map<String, Object> parameterMap,
                                                 CompiledCondition compiledCondition,
                                                 CompiledSelection compiledSelection,
                                                 RecordTableHandlerCallback recordTableHandlerCallback)
                throws ConnectionUnavailableException;
    

    The above method will be replaced with the following method:

    public abstract Iterator<Object[]> query(long timestamp, Map<String, Object> parameterMap,
                                                 CompiledCondition compiledCondition,
                                                 CompiledSelection compiledSelection,
                                                 Attribute[] outputAttributes,
                                                 RecordTableHandlerCallback recordTableHandlerCallback)
                throws ConnectionUnavailableException;
    
    Source code(tar.gz)
    Source code(zip)
  • v4.0.0-M3(May 16, 2017)

  • v4.0.0-M1(Mar 29, 2017)

  • v3.1.0-beta2(Jul 9, 2016)

  • v3.0.5(May 5, 2016)

    Siddhi 3.0.5 is released with WSO2 CEP 4.1.0.

    To use Siddhi as a library, use the following maven dependency.

    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-query-api</artifactId>
        <version>3.0.5</version>
    </dependency>
    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-query-compiler</artifactId>
        <version>3.0.5</version>
    </dependency>
    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-core</artifactId>
        <version>3.0.5</version>
    </dependency>
    
    Source code(tar.gz)
    Source code(zip)
  • v3.0.4(Jan 11, 2016)

    Siddhi 3.0.4 is released with WSO2 DAS 3.0.1.

    To use Siddhi as a library, use the following maven dependency.

    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-query-api</artifactId>
        <version>3.0.4</version>
    </dependency>
    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-query-compiler</artifactId>
        <version>3.0.4</version>
    </dependency>
    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-core</artifactId>
        <version>3.0.4</version>
    </dependency>
    
    Source code(tar.gz)
    Source code(zip)
  • v3.0.3(Nov 2, 2015)

    Siddhi 3.0.3 is released with WSO2 DAS 3.0.0.

    To use Siddhi as a library, use the following maven dependency.

    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-query-api</artifactId>
        <version>3.0.3</version>
    </dependency>
    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-query-compiler</artifactId>
        <version>3.0.3</version>
    </dependency>
    <dependency>
        <groupId>org.wso2.siddhi</groupId>
        <artifactId>siddhi-core</artifactId>
        <version>3.0.3</version>
    </dependency>
    
    Source code(tar.gz)
    Source code(zip)
  • v3.0.2(Sep 26, 2015)

    Siddhi 3.0.2 is released with WSO2 CEP 4.0.0.

    Use the following Maven dependency.

        <dependency>
            <groupId>org.wso2.siddhi</groupId>
            <artifactId>siddhi-query-api</artifactId>
            <version>3.0.2</version>
        </dependency>
        <dependency>
            <groupId>org.wso2.siddhi</groupId>
            <artifactId>siddhi-query-compiler</artifactId>
            <version>3.0.2</version>
        </dependency>
        <dependency>
            <groupId>org.wso2.siddhi</groupId>
            <artifactId>siddhi-core</artifactId>
            <version>3.0.2</version>
        </dependency>
    
    Source code(tar.gz)
    Source code(zip)
  • v3.0.0-alpha(May 22, 2015)
