A simple, expressive web framework for Java. Spark has a Kotlin DSL: https://github.com/perwendel/spark-kotlin

Overview

Spark - a tiny web framework for Java 8

Spark 2.9.3 is out! Changeset

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.9.3</version>
</dependency>

Sponsor the project here https://github.com/sponsors/perwendel

For documentation please go to: http://sparkjava.com/documentation

For usage questions, please use Stack Overflow with the "spark-java" tag

Javadoc: http://javadoc.io/doc/com.sparkjava/spark-core

When committing to the project, please use the Spark formatter configuration from https://github.com/perwendel/spark/blob/master/config/spark_formatter_intellij.xml

Getting started

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.9.3</version>
</dependency>
import static spark.Spark.*;

public class HelloWorld {
    public static void main(String[] args) {
        get("/hello", (request, response) -> "Hello World!");
    }
}

View at: http://localhost:4567/hello

Check out and try the examples in the source code. You can also check out the javadoc. After getting the source from GitHub, run:

mvn javadoc:javadoc

The output is generated in target/site/apidocs

Examples

Simple example showing some basic functionality

import static spark.Spark.*;

/**
 * A simple example just showing some basic functionality
 */
public class SimpleExample {

    public static void main(String[] args) {

        //  port(5678); <- Uncomment this if you want spark to listen to port 5678 instead of the default 4567

        get("/hello", (request, response) -> "Hello World!");

        post("/hello", (request, response) ->
            "Hello World: " + request.body()
        );

        get("/private", (request, response) -> {
            response.status(401);
            return "Go Away!!!";
        });

        get("/users/:name", (request, response) -> "Selected user: " + request.params(":name"));

        get("/news/:section", (request, response) -> {
            response.type("text/xml");
            return "<?xml version=\"1.0\" encoding=\"UTF-8\"?><news>" + request.params("section") + "</news>";
        });

        get("/protected", (request, response) -> {
            halt(403, "I don't think so!!!");
            return null;
        });

        get("/redirect", (request, response) -> {
            response.redirect("/news/world");
            return null;
        });

        get("/", (request, response) -> "root");
    }
}
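
Route patterns such as `/users/:name` bind path segments to named parameters. How that kind of matching works can be sketched in plain Java; this is an illustrative sketch only, not Spark's actual matcher, and the `RouteMatchSketch` class and `match` helper are hypothetical names:

```java
import java.util.HashMap;
import java.util.Map;

public class RouteMatchSketch {

    // Match a pattern like "/users/:name" against a concrete path.
    // Returns the bound parameters, or null if the path does not match.
    static Map<String, String> match(String pattern, String path) {
        String[] p = pattern.split("/");
        String[] u = path.split("/");
        if (p.length != u.length) {
            return null;
        }
        Map<String, String> params = new HashMap<>();
        for (int i = 0; i < p.length; i++) {
            if (p[i].startsWith(":")) {
                params.put(p[i], u[i]); // segment starting with ':' captures the value
            } else if (!p[i].equals(u[i])) {
                return null; // literal segment must match exactly
            }
        }
        return params;
    }

    public static void main(String[] args) {
        System.out.println(match("/users/:name", "/users/alice").get(":name")); // alice
    }
}
```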

A simple CRUD example showing how to create, get, update and delete book resources

import static spark.Spark.*;

import java.util.HashMap;
import java.util.Map;
import java.util.Random;

/**
 * A simple CRUD example showing how to create, get, update and delete book resources.
 */
public class Books {

    /**
     * Map holding the books
     */
    private static Map<String, Book> books = new HashMap<String, Book>();

    public static void main(String[] args) {
        final Random random = new Random();

        // Creates a new book resource, will return the ID to the created resource
        // author and title are sent in the post body as x-www-form-urlencoded values e.g. author=Foo&title=Bar
        // you get them by using request.queryParams("valuename")
        post("/books", (request, response) -> {
            String author = request.queryParams("author");
            String title = request.queryParams("title");
            Book book = new Book(author, title);

            int id = random.nextInt(Integer.MAX_VALUE);
            books.put(String.valueOf(id), book);

            response.status(201); // 201 Created
            return id;
        });

        // Gets the book resource for the provided id
        get("/books/:id", (request, response) -> {
            Book book = books.get(request.params(":id"));
            if (book != null) {
                return "Title: " + book.getTitle() + ", Author: " + book.getAuthor();
            } else {
                response.status(404); // 404 Not found
                return "Book not found";
            }
        });

        // Updates the book resource for the provided id with new information
        // author and title are sent in the request body as x-www-form-urlencoded values e.g. author=Foo&title=Bar
        // you get them by using request.queryParams("valuename")
        put("/books/:id", (request, response) -> {
            String id = request.params(":id");
            Book book = books.get(id);
            if (book != null) {
                String newAuthor = request.queryParams("author");
                String newTitle = request.queryParams("title");
                if (newAuthor != null) {
                    book.setAuthor(newAuthor);
                }
                if (newTitle != null) {
                    book.setTitle(newTitle);
                }
                return "Book with id '" + id + "' updated";
            } else {
                response.status(404); // 404 Not found
                return "Book not found";
            }
        });

        // Deletes the book resource for the provided id
        delete("/books/:id", (request, response) -> {
            String id = request.params(":id");
            Book book = books.remove(id);
            if (book != null) {
                return "Book with id '" + id + "' deleted";
            } else {
                response.status(404); // 404 Not found
                return "Book not found";
            }
        });

        // Gets all available book resources (ids)
        get("/books", (request, response) -> String.join(" ", books.keySet()));
    }

    public static class Book {

        public String author, title;

        public Book(String author, String title) {
            this.author = author;
            this.title = title;
        }

        public String getAuthor() {
            return author;
        }

        public void setAuthor(String author) {
            this.author = author;
        }

        public String getTitle() {
            return title;
        }

        public void setTitle(String title) {
            this.title = title;
        }
    }
}
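
The `author=Foo&title=Bar` bodies that the CRUD example reads via `request.queryParams(...)` are standard `application/x-www-form-urlencoded` pairs. A dependency-free sketch of how such a body decodes; the `FormBodyDemo` class and `parse` helper are hypothetical names for illustration only:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.util.LinkedHashMap;
import java.util.Map;

public class FormBodyDemo {

    // Decode an application/x-www-form-urlencoded body into a name -> value map.
    static Map<String, String> parse(String body) throws UnsupportedEncodingException {
        Map<String, String> params = new LinkedHashMap<>();
        for (String pair : body.split("&")) {
            String[] kv = pair.split("=", 2);
            String name = URLDecoder.decode(kv[0], "UTF-8");
            String value = kv.length > 1 ? URLDecoder.decode(kv[1], "UTF-8") : "";
            params.put(name, value);
        }
        return params;
    }

    public static void main(String[] args) throws UnsupportedEncodingException {
        Map<String, String> params = parse("author=Jo%20Nesb%C3%B8&title=The%20Bat");
        System.out.println(params.get("author")); // Jo Nesbø
        System.out.println(params.get("title"));  // The Bat
    }
}
```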

Example showing a very simple (and stupid) authentication filter that is executed before all other resources

import static spark.Spark.*;

import java.util.HashMap;
import java.util.Map;

/**
 * Example showing a very simple (and stupid) authentication filter that is
 * executed before all other resources.
 *
 * When requesting the resource with e.g.
 *     http://localhost:4567/hello?user=some&password=guy
 * the filter will stop the execution and the client will get a 401 UNAUTHORIZED with the content 'You are not welcome here'
 *
 * When requesting the resource with e.g.
 *     http://localhost:4567/hello?user=foo&password=bar
 * the filter will accept the request and the request will continue to the /hello route.
 *
 * Note: There is a second "before filter" that adds a header to the response
 * Note: There is also an "after filter" that adds a header to the response
 */
public class FilterExample {

    private static Map<String, String> usernamePasswords = new HashMap<String, String>();

    public static void main(String[] args) {

        usernamePasswords.put("foo", "bar");
        usernamePasswords.put("admin", "admin");

        before((request, response) -> {
            String user = request.queryParams("user");
            String password = request.queryParams("password");

            String dbPassword = usernamePasswords.get(user);
            if (!(password != null && password.equals(dbPassword))) {
                halt(401, "You are not welcome here!!!");
            }
        });

        before("/hello", (request, response) -> response.header("Foo", "Set by second before filter"));

        get("/hello", (request, response) -> "Hello World!");

        after("/hello", (request, response) -> response.header("spark", "added by after-filter"));

        afterAfter("/hello", (request, response) -> response.header("finally", "executed even if an exception is thrown"));

        afterAfter((request, response) -> response.header("finally", "executed after any route even if an exception is thrown"));
    }
}

Example showing how to use attributes

import static spark.Spark.after;
import static spark.Spark.get;

/**
 * Example showing the use of attributes
 */
public class FilterExampleAttributes {

    public static void main(String[] args) {
        get("/hi", (request, response) -> {
            request.attribute("foo", "bar");
            return null;
        });

        after("/hi", (request, response) -> {
            for (String attr : request.attributes()) {
                System.out.println("attr: " + attr);
            }
        });

        after("/hi", (request, response) -> {
            Object foo = request.attribute("foo");
            response.body(asXml("foo", foo));
        });
    }

    private static String asXml(String name, Object value) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?><" + name +">" + value + "</"+ name + ">";
    }
}

Example showing how to serve static resources

import static spark.Spark.*;

public class StaticResources {

    public static void main(String[] args) {

        // Will serve all static files under "/public" on the classpath if the route isn't consumed by other routes.
        // When using Maven, the "/public" folder is assumed to be in "src/main/resources"
        staticFileLocation("/public");

        get("/hello", (request, response) -> "Hello World!");
    }
}

Example showing how to define content depending on accept type

import static spark.Spark.*;

public class JsonAcceptTypeExample {

    public static void main(String[] args) {

        // Running: curl -i -H "Accept: application/json" http://localhost:4567/hello returns the JSON message.
        // Running: curl -i -H "Accept: text/html" http://localhost:4567/hello returns an HTTP 404 error.
        get("/hello", "application/json", (request, response) -> "{\"message\": \"Hello World\"}");
    }
} 
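
Spark matches the route's accept type against the request's `Accept` header. A simplified sketch of that kind of check; this is illustrative only, the `AcceptSketch` class and `accepts` method are hypothetical, and Spark's real matching also handles q-values and subtype wildcards:

```java
public class AcceptSketch {

    // Return true if the Accept header lists the given content type or a */* wildcard.
    static boolean accepts(String acceptHeader, String contentType) {
        for (String part : acceptHeader.split(",")) {
            String mediaType = part.split(";")[0].trim(); // drop parameters like ;q=0.9
            if (mediaType.equals(contentType) || mediaType.equals("*/*")) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(accepts("application/json", "application/json"));              // true
        System.out.println(accepts("text/html,application/xhtml+xml", "application/json")); // false
    }
}
```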

Example showing how to render a view from a template. Note that we are using the ModelAndView class for setting the model object and the name/location of the template.

First we define a class which renders output for the chosen template engine, in this case FreeMarker.

import java.io.IOException;
import java.io.StringWriter;

import freemarker.template.Configuration;
import freemarker.template.Template;
import freemarker.template.TemplateException;

import spark.ModelAndView;
import spark.TemplateEngine;

public class FreeMarkerTemplateEngine extends TemplateEngine {

    private Configuration configuration;

    protected FreeMarkerTemplateEngine() {
        this.configuration = createFreemarkerConfiguration();
    }

    @Override
    public String render(ModelAndView modelAndView) {
        try {
            StringWriter stringWriter = new StringWriter();

            Template template = configuration.getTemplate(modelAndView.getViewName());
            template.process(modelAndView.getModel(), stringWriter);

            return stringWriter.toString();
        } catch (IOException e) {
            throw new IllegalArgumentException(e);
        } catch (TemplateException e) {
            throw new IllegalArgumentException(e);
        }
    }

    private Configuration createFreemarkerConfiguration() {
        Configuration retVal = new Configuration();
        retVal.setClassForTemplateLoading(FreeMarkerTemplateEngine.class, "freemarker");
        return retVal;
    }
}

Then we can use it to generate our content. Note how we set the model data and the view name: because we are using FreeMarker, the model is a Map and the view name is the template file name:

import static spark.Spark.get;
import static spark.Spark.modelAndView;

import java.util.HashMap;
import java.util.Map;

public class FreeMarkerExample {

    public static void main(String[] args) {

        get("/hello", (request, response) -> {
            Map<String, Object> attributes = new HashMap<>();
            attributes.put("message", "Hello FreeMarker World");

            // The hello.ftl file is located in directory:
            // src/test/resources/spark/examples/templateview/freemarker
            return modelAndView(attributes, "hello.ftl");
        }, new FreeMarkerTemplateEngine());
    }
}

Example of using Transformer.

First we define the transformer class, in this case a class which transforms an object to JSON using the Gson API.

import com.google.gson.Gson;

import spark.ResponseTransformer;

public class JsonTransformer implements ResponseTransformer {

    private Gson gson = new Gson();

    @Override
    public String render(Object model) {
        return gson.toJson(model);
    }
}

And then the code which returns a simple POJO to be transformed to JSON:

import static spark.Spark.get;

public class TransformerExample {

    public static void main(String[] args) {
        get("/hello", "application/json", (request, response) -> {
            return new MyMessage("Hello World");
        }, new JsonTransformer());
    }
}
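
The `MyMessage` POJO is not shown above; Gson simply serializes its fields. A dependency-free sketch of the same render contract, with a hand-rolled stand-in for the Gson call (the `TransformerSketch` class and its members are illustrative names, not part of Spark):

```java
public class TransformerSketch {

    // Minimal stand-in for the POJO used in TransformerExample.
    static class MyMessage {
        final String message;

        MyMessage(String message) {
            this.message = message;
        }
    }

    // Mimics what JsonTransformer.render would produce for this one-field POJO,
    // without pulling in the Gson dependency.
    static String render(MyMessage model) {
        return "{\"message\":\"" + model.message + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(render(new MyMessage("Hello World"))); // {"message":"Hello World"}
    }
}
```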

Debugging

See Spark-debug-tools as a separate module.

Comments
  • Creating result transformation modules mainly for RESTful architecture.

    Hello,

    I am using Spark as an education resource for my video tutorials. I have seen that Spark supports RESTful URLs, but does not support transforming entities to, for example, JSON (of course this could be done inside Route's handle method). Because RESTful web architecture, and also HATEOAS, is growing in popularity nowadays, I think it would be great if Spark supported this transformation natively. So I decided to download the Spark code and see what can be done. Before starting to code and creating a pull request, I would like to know whether there are philosophical reasons not to implement this natively. Let me explain my approach:

    The basic idea is creating a META-INF service interface, which will be in charge of transforming an object to its String representation.

    So instead of having:

    bodyContent = result.toString();
    

    we can have something like:

    if(serviceRegistered()) {
      bodyContent = getService().transform(result);
    } else {
      bodyContent = result.toString();
    }
    

    I think this approach keeps Spark a lightweight web container while also giving it the opportunity to be extended. Then we can create projects like spark-json, a jar that transforms objects to a JSON representation, spark-xml, which transforms objects to XML, and so on.

    What do you think? Does this approach make sense? If there is no objection I can implement it and have a version ready quickly.

    Alex.

    opened by lordofthejars 58
  • Changing Jetty settings through SparkJava

    I'm using SparkJava 2.2 which is using Jetty 9.0.2.

    I'm getting a "Form too large" exception which is thrown by Jetty. I know how to solve this problem if I were using Jetty directly:

    http://www.eclipse.org/jetty/documentation/current/setting-form-size.html

    I've already checked that setting env.properties has no effect. The debugger doesn't even stop at any breakpoint set within org.eclipse.jetty.server.handler.ContextHandler... Plus, when it stops at org.eclipse.jetty.server.Request breakpoints, the _context property is null.

    Is there any way to create context for Jetty requests? Is there any way to alter maxFormContentSize?

    Feature request Much wanted 
    opened by QMG-kazala 36
  • Can we have a non-static API?

    (Perhaps there is already a way to do this, that I have missed... but looking at spark.SparkBase it looks unlikely)

    The fact that all the methods (e.g. port(), get(), etc) are static, means that code using Spark is impossible to test, and Spark is no use in integration tests. Additionally, it prevents two servers being started in the same JVM (e.g. HTTPS/HTTP), which makes it unsuitable for quite a large number of use cases (e.g. HTTP-to-HTTPS redirection, for which I still need to run e.g. Apache).

    Would it be possible to expose a more traditional API that is based on instance methods? I don't think the fluidity would be lost substantially:

    Spark spark = new Spark();
    spark.port(8080);
    spark.get("/", this::homePage);
    etc.
    

    Thanks!

    Feature request 
    opened by gubby 25
  • Remove all or certain routes

    We have a need to remove all routes and certain routes (given a path and an optional HTTP method) from our class that derives from SparkBase without having to restart the Web server. To support this, SparkBase should expose the following methods:

    + removeRoute(path: String): boolean
    + removeRoute(path: String, httpMethod: HttpMethod): boolean
    + clearRoutes(): void
    

    These would then forward the request to SimpleRouteMatcher which needs to expose analogous methods.

    opened by travisspencer 24
  • Adding simple websocket feature

    I am planning to extend Spark for my own usage and, if it is good enough, share it with the community. I really need web sockets; they are essential. I do not have much experience with Jetty, so I need to understand some concepts.

    The class SparkServerImpl implements SparkServer and takes a Handler object.

    It seems that websockets in Jetty are just used like in this page: https://github.com/jetty-project/embedded-jetty-websocket-examples/blob/master/native-jetty-websocket-example/src/main/java/org/eclipse/jetty/demo/EventServer.java

    It is just another handler:

            ServletContextHandler context = new ServletContextHandler(ServletContextHandler.SESSIONS);
            context.setContextPath("/");
            server.setHandler(context);
    
            // Add a websocket to a specific path spec
            ServletHolder holderEvents = new ServletHolder("ws-events", EventServlet.class);
            context.addServlet(holderEvents, "/events/*");
    

    So is it okay if I add its websocket handler to the following part in the SparkServerImpl:

               List<Handler> handlersInList = new ArrayList<Handler>();
                handlersInList.add(myWebSocketHandler); <-----
                handlersInList.add(handler);
    
    opened by mustafaakin 23
  • Enhancement: concept of "finally" filter

    Original work from @gutobortolozzo on PR 406 adapted for current spark tree

    This updated PR shares the goal of PR #217 and #406 and fixes issues #195 and #324

    Feature request Much wanted 
    opened by MouettE-SC 22
  • Server shuts down when given port is in use

    Hello,

    we have a little problem with this code (JettySparkServer in method ignite()):

    } catch (Exception e) {
        logger.error("ignite failed", e);
        System.exit(100); // NOSONAR
    }

    System.exit is actually pretty bad here. We have no chance of stopping the server from shutting down.

    In our application we start a webserver and some other server. It is perfectly fine for us when the given port of the webserver is already in use, but System.exit shuts down the whole JVM and thus our other server as well.

    It would be nice if there were some workaround (like an exception handler which is configurable outside of Spark)

    Thanks in advance!

    Cheers,

    Bug/Feature request 
    opened by Prototype1 22
  • Remove statics/globals in favor of an instance-based approach

    Not all of the tests and examples have been migrated but I wanted to see if you had any comments on it first.

    The main reason for this change is to be able to easily run multiple Spark instances (e.g. one listening on localhost, another listening on 0.0.0.0) each with their own HTTP endpoints.

    I've kept a shim implementing the old interface so it should be completely backwards compatible with existing code. The only exception being the servlet filter interface which now takes the instance as an argument.

    opened by jrcamp 22
  • Instantiated API for Spark

    Hi again! #167 was out of date and I've created a new fork from the latest base code with the same changes. The goal here is to be able to use Spark with its default static API and by instantiating a SparkAPI instance.

    opened by ggalmazor 21
  • org.eclipse.jetty.websocket.api is not there

    _Please don't close this one unnecessarily_. I downloaded and added the Eclipse Jetty websocket jar to my project after seeing that it wasn't included in Spark. I am now following the chat tutorial and it says to import org.eclipse.jetty.websocket.api; I don't have that. Any help?

    Question (usage) Mystery 
    opened by AmitPr 20
  • Rename the project

    Everyone working with Spark has already faced an annoying problem: people frequently confuse our Spark with Apache Spark, and this might lead to some trouble in explaining to other people that the thing they're googling is not what we're talking about.

    Furthermore, googling simply for "spark" will give a lot of unrelated results.

    To disambiguate from Apache Spark when talking to other people, I frequently refer to Spark as "sparkjava" (that is the name of the site). However, Apache Spark is frequently used with Java, so this disambiguation is far from perfect. Also, now that Spark also works with Kotlin, calling it "sparkjava" has become a misleading misnomer.

    Personally, today was the third time I had trouble explaining to other people what Spark is and why it has nothing to do with Apache Spark. Today, a collaborator couldn't understand what my project had to do with big data (nothing at all; he was looking at Apache Spark) and was thinking about things that would be way more expensive in terms of money for my project, risking having it vetoed or heavily overbilled due to all that big data stuff. This led me into a stressful situation trying to clear up all that confusion.

    I really like the name Spark, but we must face the truth: Apache Spark is a bigger fish that holds that name, and even without them, the name "spark" is already very likely to be polluted in Google results. Spark's name is in fact hindering its dissemination, adoption and growth, so I propose that we change its name.

    I won't propose a specific name here yet because I have none; my purpose for now is just to raise the issue.

    opened by victorwss 19
  • Exception in thread "main" java.lang.NoClassDefFoundError: spark/Route

    I'm trying to run old software (from around 2015) that was made using Spark.

      Exception in thread "main" java.lang.NoClassDefFoundError: spark/Route
    	  at java.base/java.lang.Class.forName0(Native Method)
    	  at java.base/java.lang.Class.forName(Class.java:488)
    	  at java.base/java.lang.Class.forName(Class.java:467)
    	  at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:56)
      Caused by: java.lang.ClassNotFoundException: spark.Route
    	  at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:445)
    	  at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
    	  at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    	  ... 4 more
    
    opened by heav-4 0
  • Response body not possible (null) if HTTP status code is 304 (not modified).

    Hi there,

    I've come to notice that it is not possible to deliver a response body to a client when the HTTP status code is set to 304 in the Response object. Clients will not receive any content, even if the Route returns a String and/or the body field of the Response is set.

    The other 3xx codes seem to allow a response body though.

    Is there a reason why 304 has a special treatment regarding the response body?

    Kind regards -Ferdinand-

    opened by fsilkswan 0
  • Information + announcement + request :: I have made a Java 17 compatible version that runs on jetty 11 jars

    Hi team, I have just forked the project and made a version that is Java 17 compatible :-) It is here: https://github.com/nmondal/spark-11. I am planning to extend it anyway; I was wondering if anyone would be interested in merging this with the parent (this project), so that I can go on my merry way customizing this very cool project.

    opened by nmondal 0
  • Spark Java - strange behavior when uploading files

    In my project I want to upload files; here is the part of the code responsible for this:

            MultipartConfigElement multipartConfigElement =
                    new MultipartConfigElement(
                            "/tmp_files",
                            avatarSize,
                            avatarSize,
                            1024
                    );
    
            request.raw().setAttribute(
                    "org.eclipse.jetty.multipartConfig",
                    multipartConfigElement
            );
    
            Part uploadedFile = request.raw().getPart("file");
    
    

    And a request to upload a file using IntelliJ IDEA's HTTP client:

    POST http://localhost:8080/users/me/avatar
    Content-Type: multipart/form-data; boundary=abcd
    Authorization: Bearer {{authToken}}
    
    --abcd
    Content-Disposition: form-data; name="file"; filename="test.png"
    
    < /Users/user1/resources/test.png
    --abcd--
    
    

    where test.png is a regular picture. But when I try the upload, at this place in the code:

    Part uploadedFile = request.raw().getPart("file");
    

    I get an error:

    java.nio.file.NoSuchFileException: /tmp_files/MultiPart11851484240893602177
    	at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
    	at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
    	at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
    	at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
    	at java.base/java.nio.file.Files.newByteChannel(Files.java:375)
    	at java.base/java.nio.file.Files.createFile(Files.java:652)
    
    

    It can be assumed that this error is due to the fact that there are no write permissions at the root of the file system (I'm testing on macOS, under a regular user). But if I try to upload another file, which is just a zip file, then everything works.

    POST http://{{host}}/users/me/avatar
    Content-Type: multipart/form-data; boundary=abcd
    Authorization: Bearer {{authToken}}
    
    --abcd
    Content-Disposition: form-data; name="file"; filename="file123.zip"
    
    < /Users/18493151/develop/icandev/api-gateway/src/main/resources/file123.zip
    --abcd--
    

    and there is no exception on this line:

    Part uploadedFile = request.raw().getPart("file");
    

    Why is this happening? Why does the result depend on the file type?

    sparkjava version 2.9.4

    opened by sleepdan 1
Owner
Per Wendel