
Implementing a service mashup with WSO2 API Manager

WSO2 API Manager is one of the leading open source API management platforms on the market. In Gartner's 2018 research it was identified as a leading "Visionary" vendor in this space. It supports full API lifecycle management, horizontal and vertical scalability, and on-premise, public cloud (SaaS), and managed (private) cloud deployments. In this article, I'm going to discuss how you can implement a service mashup (or service orchestration) with WSO2 API Manager within 10 minutes.
Let’s get started by downloading the WSO2 API Manager from the following link.

https://wso2.com/api-management/install/

Once you have downloaded the product, extract it to the desired location. Let's refer to the directory in which WSO2 API Manager is installed as "APIM_HOME". You can start the product by running the following command from within the APIM_HOME directory.

$ sh bin/wso2server.sh
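On Windows, the equivalent startup script is bin\wso2server.bat.
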
Before implementing the use case, let's understand the scenario using the image below.



In this scenario, we have two backend microservices, "Service1" and "Service2", which produce the following responses.

Service1 = {"id":200,"name":"IBM","price":234.34}

Service2 = {"name":"IBM","industry":"technology","CEO":"John Doe","Revenue":"23.5 billion USD"}

Now we need to mash up these two responses to produce a result similar to the one below.

{results: [{"id":200,"name":"IBM","price":234.34}, {"name":"IBM","industry":"technology","CEO":"John Doe","Revenue":"23.5 billion USD"}]}


Let's see how we can meet this requirement with WSO2 API Manager. You can log in to the Publisher portal and start creating an API as depicted below.
  • Create a new API by clicking the "Add API" button and then selecting the third option, "Design a new REST API". That will bring up the interface shown below, where you need to configure the API definition.

  • Once the above interface is filled with the values depicted above, click "Next: Implement". This will bring you to the interface where you configure the API implementation logic. Here you need to set the URL of "service1" as the endpoint URL and then select the "Enable Message Mediation" option to upload the service mashup logic, which is implemented as a custom mediation policy.

We are going to implement the service mashup logic as a custom mediation policy written in the Synapse mediation language (XML). The mashup logic is shown below.


mashupSeq.xml

<?xml version="1.0" encoding="UTF-8"?>
<sequence xmlns="http://ws.apache.org/ns/synapse" name="mashupSeq">
    <log level="full">
        <property name="STATUS" value="RESP-1"/>
    </log>
    <enrich>
        <source type="body" clone="true"/>
        <target type="property" property="response1"/>
    </enrich>
    <call>
        <endpoint>
            <http method="POST" uri-template="http://localhost:9091/service2"/>
        </endpoint>
    </call>
    <log level="full">
        <property name="STATUS" value="RESP-2"/>
    </log>
    <enrich>
        <source type="body" clone="true"/>
        <target type="property" property="response2"/>
    </enrich>
    <payloadFactory media-type="json">
        <format>{results: [result1:$1, result2:$2]}</format>
        <args>
            <arg xmlns:soapenv="http://www.w3.org/2003/05/soap-envelope" xmlns:ns3="http://org.apache.synapse/xsd" evaluator="xml" expression="$ctx:response1"/>
            <arg xmlns:soapenv="http://www.w3.org/2003/05/soap-envelope" xmlns:ns3="http://org.apache.synapse/xsd" evaluator="xml" expression="$ctx:response2"/>
        </args>
    </payloadFactory>
    <respond/>
</sequence>

In the above sequence, we save the response of the first endpoint call to a property called "response1", then call the second endpoint ("service2") and save its result in another property called "response2". After that, using the payloadFactory mediator, we mash up the two responses and build the final payload. Finally, the <respond> mediator sends the response back to the client.

You need to upload this custom sequence as the "outFlow" of this API, as depicted in the image above.
  • Once the mediation sequence is uploaded, click the "Next: Manage" button, select the "Unlimited" subscription tier, and click the "Save and Publish" button to publish the API, as depicted in the image below.



Now the API is created and published to the API Store. Next, let's start the backend services. These services are implemented in the Ballerina programming language; the source code of the two services is available in the following gists.

https://gist.github.com/chanakaudaya/bf38a28a6b6b43911d7f2b1a2c65951a

https://gist.github.com/chanakaudaya/714a1dac25beff34733fac4e1d86f3cf


Once these two services are started, they will be available at the following URLs.

http://localhost:9090/service1

http://localhost:9091/service2
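
Before invoking the API through the gateway, you can optionally verify that the backends respond as expected by calling them directly. This is just a quick sanity check, assuming the services accept a POST request with a JSON body, the same way the gateway forwards it:

curl -X POST -d '{"name":"WSO2"}' -H "Content-Type: application/json" http://localhost:9090/service1

curl -X POST -d '{"name":"WSO2"}' -H "Content-Type: application/json" http://localhost:9091/service2

Each call should return the corresponding JSON payload shown earlier.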
  • Let's log in to the API Store, subscribe to this API using the default application, and invoke the API using the generated access token.



Click on the "Applications" tab and generate an access token to consume the API.
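
If you prefer the command line, you can also obtain a token from the gateway's token endpoint using the consumer key and secret of your application. The sketch below assumes the default token endpoint at https://localhost:8243/token (adjust the port if you have applied a port offset); CONSUMER_KEY and CONSUMER_SECRET are placeholders for the values shown under the application's keys.

curl -k -u CONSUMER_KEY:CONSUMER_SECRET -d "grant_type=client_credentials" https://localhost:8243/token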




Now we have an access token to invoke the API. Let's send a cURL request to get the result.

curl -d "{\"name\":\"WSO2\"}" -H "Content-Type: application/json" -X POST -H "Authorization: Bearer d30d7e25-bb40-3e70-86b0-714f86784cd2" http://localhost:8281/mashup/v1

You will get the following result.

{results: [result1:{"id":200,"name":"IBM","price":234.34}, result2:{"name":"IBM","industry":"technology","CEO":"John Doe","Revenue":"23.5 billion USD"}]}


That's all. You can change how the results are mashed up to suit your own requirements by modifying the payloadFactory mediator, as sketched below.
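
For example, a minimal variation that wraps each response under a named key and emits valid JSON could look like the snippet below. The keys "quote" and "profile" are hypothetical; the sketch assumes the same "response1" and "response2" properties populated earlier in the sequence.

<!-- Hypothetical variation of the payloadFactory mediator: wraps the two saved
     responses under named JSON keys instead of an array. -->
<payloadFactory media-type="json">
    <format>{"quote": $1, "profile": $2}</format>
    <args>
        <arg evaluator="xml" expression="$ctx:response1"/>
        <arg evaluator="xml" expression="$ctx:response2"/>
    </args>
</payloadFactory>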
