Writing REST APIs with SparkJava
In the previous recipes, we saw how to create a microservice using various frameworks such as Spring Boot, WildFly Swarm, and Dropwizard. This recipe is a little different in that we are going to create a self-managed RESTful API using a framework called SparkJava. Not to be confused with Apache Spark, SparkJava describes itself as a micro-framework for building web applications, and its HTTP API was inspired by Ruby's Sinatra framework. It is simple enough that bringing up an HTTP GET API requires fewer than ten lines of code. This makes SparkJava worth considering when you want to quickly build HTTP-based microservices.
To avoid confusion and dependency conflicts, we will create the SparkJava microservice as its own Maven project. This recipe is here to help you get started with SparkJava. When you build your production application, choose Spring Boot, WildFly Swarm, Dropwizard, or SparkJava based on your needs.
Getting ready
Similar to how we created other Maven projects, create a Maven JAR module with the groupId com.packt.microservices and name/artifactId geolocation-sparkjava. Feel free to use either your IDE or the command line. After the project is created, if you see that your project is using a Java version other than 1.8, follow the Creating a project template using STS and Maven recipe to change the Java version to 1.8. Perform a Maven update for the change to take effect.
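If you would rather use the command line, a minimal sketch looks like the following; the quickstart archetype and the compiler properties shown here are illustrative assumptions, not requirements of this recipe:

    mvn archetype:generate -DgroupId=com.packt.microservices \
        -DartifactId=geolocation-sparkjava \
        -DarchetypeArtifactId=maven-archetype-quickstart \
        -DinteractiveMode=false

To pin the module to Java 1.8, you can also add these properties to the generated pom.xml:

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>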
How to do it...
The first thing that you will need is the SparkJava dependency. Add the following snippet to your project's pom.xml file:

    <dependencies>
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.5</version>
        </dependency>
    </dependencies>
We now have to create the domain object and service classes. Follow the Writing microservices with WildFly Swarm recipe to create the following three files:

- com.packt.microservices.geolocation.GeoLocation.java
- com.packt.microservices.geolocation.GeoLocationService.java
- com.packt.microservices.geolocation.GeoLocationServiceImpl.java
Let's see what each of these classes does. The GeoLocation.java class is our domain object that holds the geolocation information. The GeoLocationService.java interface defines the service contract, which is then implemented by the GeoLocationServiceImpl.java class. If you take a look at GeoLocationServiceImpl.java, we are using a simple collection to store the GeoLocation domain objects. In a real-world scenario, you would persist these objects in a database, but to keep things simple, we will not go that far.
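If you do not have the WildFly Swarm recipe handy, the following is a minimal sketch of what these three files could look like. The field names are inferred from the JSON payloads used later in this recipe, and the in-memory list is only one possible implementation:

    // Each of the following lives in its own file under
    // src/main/java/com/packt/microservices/geolocation/
    package com.packt.microservices.geolocation;

    import java.util.ArrayList;
    import java.util.List;

    // GeoLocation.java - domain object holding a single geolocation reading
    public class GeoLocation {
        private double latitude;
        private double longitude;
        private String userId;
        private long timestamp;
        // getters and setters omitted for brevity
    }

    // GeoLocationService.java - the service contract
    public interface GeoLocationService {
        GeoLocation create(GeoLocation geolocation);
        List<GeoLocation> findAll();
    }

    // GeoLocationServiceImpl.java - stores GeoLocation objects in a simple collection
    public class GeoLocationServiceImpl implements GeoLocationService {
        private static final List<GeoLocation> geolocations = new ArrayList<>();

        @Override
        public GeoLocation create(GeoLocation geolocation) {
            geolocations.add(geolocation);
            return geolocation;
        }

        @Override
        public List<GeoLocation> findAll() {
            return geolocations;
        }
    }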
The next thing that SparkJava needs is a controller. If you are familiar with Spring MVC, you can think of this as the SparkJava equivalent of a Spring MVC controller. The controller defines a route for each URL pattern in your API. Follow these steps:
- Let's create our controller, com.packt.microservices.geolocation.GeoLocationController.java, with a stubbed-out GET API:

    package com.packt.microservices.geolocation;

    import static spark.Spark.*;

    public class GeoLocationController {
        public static void main(String[] args) {
            get("/geolocation", (req, resp) -> "[]");
        }
    }
- The quickest way to test this is by running this class as a Java application. If you get SLF4J errors in your console after you start the application, add the following Maven dependency to your pom.xml file and restart your application:

    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.21</version>
    </dependency>
The slf4j-simple dependency routes all the SLF4J log messages to the System.err stream.
- After the restart, your console logs should show the service starting up. From the logs, we can clearly see that the service is running on port 4567.
- Execute the following curl command in a terminal window to make sure our API is exposed. The response should be [], indicating that there are no geolocations yet:

    curl http://localhost:4567/geolocation
- Now let's finish building the APIs. We already have a very basic stubbed-out GET API. Let's introduce the service class to the controller and call the findAll method. Similarly, let's use the service's create method for POST API calls. Before we do that, we need to do one more thing: by default, SparkJava does not perform JSON serialization and deserialization. We will use a library called Gson to do that, so add the following dependency to your pom.xml file:

    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.7</version>
    </dependency>
- Now let's replace the main method of GeoLocationController.java with this (you will also need to import com.google.gson.Gson):

    public static void main(String[] args) {
        GeoLocationService service = new GeoLocationServiceImpl();
        Gson gson = new Gson();

        get("/geolocation", (req, resp) -> {
            return service.findAll();
        }, gson::toJson);

        post("/geolocation", (req, resp) -> {
            return service.create(gson.fromJson(req.body(), GeoLocation.class));
        }, gson::toJson);
    }
There is a lot happening here, so let's go through it one piece at a time:
- The get method now uses the service's findAll method.
- The third argument to the get method is a ResponseTransformer, which tells SparkJava how your response should be transformed before being sent to the client.
- Since ResponseTransformer is a functional interface with a single render method that turns the object returned by the route into a response body, we can pass a method reference to Gson's toJson method as the rendering logic here (see the expanded sketch after this list).
- The post method uses Gson to deserialize the request body into a GeoLocation object and passes it to the service's create method.
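To make the third argument more concrete, the gson::toJson method reference stands in for an explicit ResponseTransformer. The following is a rough hand-written equivalent; it is a sketch based on the spark.ResponseTransformer interface shipped with spark-core, and the class name is purely illustrative:

    package com.packt.microservices.geolocation;

    import static spark.Spark.get;

    import com.google.gson.Gson;
    import spark.ResponseTransformer;

    public class ExplicitTransformerExample {
        public static void main(String[] args) {
            GeoLocationService service = new GeoLocationServiceImpl();
            Gson gson = new Gson();

            // Spelling out what gson::toJson does when passed as a ResponseTransformer
            ResponseTransformer jsonTransformer = new ResponseTransformer() {
                @Override
                public String render(Object model) {
                    // model is whatever the route handler returned, e.g. a List<GeoLocation>
                    return gson.toJson(model);
                }
            };

            get("/geolocation", (req, resp) -> service.findAll(), jsonTransformer);
        }
    }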
We are now ready to test our application. Restart the application. Let's try to create two geolocations using the POST API and later retrieve them using the GET API:
- Execute the following cURL commands in your terminal one by one:
curl -H "Content-Type: application/json" -X POST -d '{"timestamp": 1468203975, "userId": "f1196aac-470e-11e6-beb8-9e71128cae77", "latitude": 41.803488, "longitude": -88.144040}' http://localhost:4567/geolocation
- This should give you an output similar to the following (pretty-printed for readability):

    {
      "latitude": 41.803488,
      "longitude": -88.14404,
      "userId": "f1196aac-470e-11e6-beb8-9e71128cae77",
      "timestamp": 1468203975
    }

- Next, execute the second cURL command:

    curl -H "Content-Type: application/json" -X POST -d '{"timestamp": 1468203975, "userId": "f1196aac-470e-11e6-beb8-9e71128cae77", "latitude": 9.568012, "longitude": 77.962444}' http://localhost:4567/geolocation
- This should give you an output like this (pretty-printed for readability):

    {
      "latitude": 9.568012,
      "longitude": 77.962444,
      "userId": "f1196aac-470e-11e6-beb8-9e71128cae77",
      "timestamp": 1468203975
    }
- To verify whether your entities were stored correctly, execute the following cURL command:
curl http://localhost:4567/geolocation
- It should give you output similar to the following (pretty-printed for readability):

    [
      {
        "latitude": 41.803488,
        "longitude": -88.14404,
        "userId": "f1196aac-470e-11e6-beb8-9e71128cae77",
        "timestamp": 1468203975
      },
      {
        "latitude": 9.568012,
        "longitude": 77.962444,
        "userId": "f1196aac-470e-11e6-beb8-9e71128cae77",
        "timestamp": 1468203975
      }
    ]
There are several other configurations that you can apply to SparkJava, such as changing the default port, using query parameters, and using path parameters. I'll leave those for you to experiment with.
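As a starting point for that experimentation, here is a brief sketch of what those configurations could look like. The route paths, parameter names, and class name below are illustrative, not part of the original recipe:

    package com.packt.microservices.geolocation;

    import static spark.Spark.get;
    import static spark.Spark.port;

    public class GeoLocationConfigExample {
        public static void main(String[] args) {
            // Change the default port (4567) to 8080; call this before declaring any routes
            port(8080);

            // Path parameter, e.g. GET /geolocation/user/f1196aac-470e-11e6-beb8-9e71128cae77
            get("/geolocation/user/:userId", (req, resp) -> {
                String userId = req.params(":userId");   // value taken from the URL path
                return "Looking up geolocations for user " + userId;
            });

            // Query parameter, e.g. GET /geolocation/search?userId=f1196aac-470e-11e6-beb8-9e71128cae77
            get("/geolocation/search", (req, resp) -> {
                String userId = req.queryParams("userId");   // null if not supplied
                return "Filtering geolocations by user " + userId;
            });
        }
    }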