Shay Shmeltzer


Connecting to Oracle Autonomous Transaction Processing (ATP) from Developer Cloud Service

Mon, 2018-12-10 18:10

The latest and greatest flavor of the Oracle Database in the cloud is Oracle Autonomous Transaction Processing (ATP). It is the Autonomous Database flavor that is optimized for OLTP (On-Line Transaction Processing) applications - the type that you and I usually work on.

One thing that is new in the world of ATP is the way you connect to the database. Connections leverage wallets to make sure that your data is secure even though you are connecting over the public internet. Here are instructions on how to get such a wallet file for your instance of ATP.

We introduced an enhancement to the latest version of Developer Cloud Service that allows you to connect to ATP from your CI/CD automation jobs. This can help you automate CI/CD for SQL scripts that you need to run against that DB.

As I mentioned in past blogs, DevCS has built-in support for the SQLcl utility, which lets you run SQL scripts against an Oracle database as part of your CI/CD chain. If you want the SQLcl utility in DevCS to connect to ATP, it will need access to your wallet.zip file. You can achieve this by uploading the file into your Git repository.

Then in your SQLcl configuration you'll specify the user/password as before, and point the field titled Credentials File to the wallet.zip file location. (In the screenshot below, the zip file is at the top of the Git repo connected to the build, so there is no need to add a path.) In the next field, titled Connection String, you specify the name used in the wallet's tnsnames.ora file to connect to the DB.

Now you can continue as usual and provide inline SQL or point to SQL files from your git repository.

ATP Connection Definition

 

Categories: Development

Adding Calculated Fields to Your Visual Builder UI

Tue, 2018-12-04 17:16

This is a quick blog to show two techniques for adding calculated fields to an Oracle Visual Builder application.

Both techniques do the calculation on the client side (in the browser). Keep in mind that you might want to consider doing the calculation on the back end of your application and have the calculated value delivered directly to your client - in some cases this results in better performance. But sometimes you don't have access to modify the back end, or you can't do calculations there, so here we go:

1. For simple calculations you can just use the value property of a field to do the calculation for you.

For example, if you need to know the yearly salary, you can take the value in a field and just add *12 to it.

You can also use this to calculate values from multiple fields - for example, [[$current.data.firstName + " " + $current.data.lastName]] will get you a field with the full name.

2. For more complex calculations you might need to write some logic to arrive at your calculated value - for example, if you have multiple if/then conditions. To do that, you can create a client-side JavaScript function in your page's JS section. Then you refer to the function from your UI component's value attribute using something like {{$functions.myFunc($current.data.salary)}}
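A minimal sketch of such a page-level function - the function name "salaryBand" and the salary thresholds here are made-up examples, not from this post (shown without the surrounding AMD define() wrapper for brevity):

```javascript
// Page module constructor, as generated for a VB page's JS section.
var PageModule = function PageModule() {};

// Multiple if/then conditions that would be awkward to express
// inline in a component's value attribute.
PageModule.prototype.salaryBand = function (salary) {
  if (salary >= 10000) {
    return 'High';
  } else if (salary >= 5000) {
    return 'Medium';
  }
  return 'Low';
};
```

The component's value attribute would then reference it as {{ $functions.salaryBand($current.data.salary) }}.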

As you'll see in the demo, if you switch to the code view of your application, the code editor in Oracle VB will give you code insight into the functions you have for your page, helping you eliminate coding errors.

Categories: Development

Filtering Data Providers with Compound Conditions in Visual Builder

Mon, 2018-11-19 13:53

I posted in the past a basic introduction to filtering lists in Visual Builder - showing how to use the filterCriterion to filter the records shown in a ServiceDataProvider. Since I recorded that video, a few things have changed, and I also saw several customers asking how they can use more complex conditions that involve more than one filter.

In the video below I show how to define a basic filter with the latest versions (note that in VB 18.4.1 you no longer need to surround the value with quotes ""), and then I show how to create a more complex condition that involves two filter criteria and set them to work with either an "or" or an "and" operator.

When you are using business components in Visual Builder, the filterCriterion is translated into a "q" parameter that is passed to the GET method (more about this q query parameter here). If you find that you are not getting the records you are expecting, check out the browser's Network tab to see what query parameter was passed in your call to the REST service (intro to this debugging technique here).

As you'll see, the filterCriterion contains an array of "criteria", so you can specify several of them. In the video I'm using an approach that Brian Fry showed me that gives you a more declarative way to populate the array - dragging and dropping multiple "criteria type" variables into the same array.

Note, however, that the important thing is what actually gets populated in the JSON file that defines the action. You should go into this view and verify that you have the right structure there. You can also directly manipulate that source to achieve the filter you need.

As you'll see in the video, there are some cases where the design time for this filterCriterion adds an entry into the JSON that might not match what you want (we are tracking this issue). So, as mentioned, if things don't work as expected, direct manipulation of the JSON might be required.
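The compound structure described above can be sketched as the following object - the attribute name ("state") and values are made-up examples here; check the JSON generated in your own app for the exact attribute names your service expects:

```javascript
// Sketch of a filterCriterion with two criteria joined by an "$or" operator.
// Swapping "$or" for "$and" requires all criteria to match instead of any.
var filterCriterion = {
  op: '$or',
  criteria: [
    { op: '$eq', attribute: 'state', value: 'CA' },
    { op: '$eq', attribute: 'state', value: 'WA' }
  ]
};
```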

Categories: Development

Oracle JET UI on Top of Oracle ADF With Visual Builder

Thu, 2018-11-15 13:22

At Oracle OpenWorld this year I did a session about the future of Oracle ADF, and one of the demos I did there was showing the powerful combination of Oracle ADF backend with a new Oracle JET UI layer and how Oracle Visual Builder makes this integration very simple.

While we have many happy Oracle ADF customers, we do hear from some of them about new UI requirements that might justify thinking about adopting a new UI architecture for some modules. These types of requirements align with an industry trend towards adopting a more client-centric UI architecture that leverages the power of JavaScript on the client. While ADF (which is more of a server-centric architecture) does let you leverage JavaScript on the client and provides hook points for that in ADF Faces, some customers prefer a more "purist" approach for new user interfaces that they are planning to build. Oracle's solution for such a UI architecture is based on Oracle JET - an open source set of libraries we developed and share with the community at http://oraclejet.org.

Oracle Visual Builder provides developers with a simpler approach to building Oracle JET based UIs - for both web and on-device mobile applications. Focusing on a visual UI design approach, it drastically reduces the amount of manual coding you need to do to create JET-based UIs.

UIs that you build in Visual Builder connect at the back to REST services, and this is where you can leverage Oracle ADF. In version 12 of JDeveloper we introduced the ability to publish ADF Business Components as REST services through a simple wizard. Note that out-of-the-box you get a very powerful set of services that support things like query by example, pagination, sorting and more. If you haven't explored this functionality already, check out the videos showing how to do it here, and this video covering cloud hosting these services.

Once you have this ADF-based REST services layer, you'll be glad to hear that Visual Builder has specific support to simplify consuming these REST services. Specifically, we understand the metadata descriptions that these REST services provide and can then create the service and endpoint mappings for you.

ADF Describe Dialog in Service Connection

You leverage our "Service from specification" dialog to add your ADF services to your Visual Builder app - and from that point on, it's quite simple to build new JET UIs accessing the data.

In the video below I show how simple it is to build a JET-based on-device mobile app that leverages a set of REST services that were created with Oracle JDeveloper 12. Check it out:

Categories: Development

Leveraging Snippets to Create Wiki Pages in Oracle Developer Cloud

Fri, 2018-11-09 12:53

Snippets are a feature of Oracle Developer Cloud Service that gives you a place to store reusable pieces of code as part of your project. These are the types of code snippets that you don't want as part of your core Git repository but still find useful. Snippets can be private to you or shared among your team.

One nice use for code snippets is the ability to quickly include them in a wiki page. This allows you, for example, to create a template for a wiki page and then quickly apply it to a new page that you create. Using the correct markup for your wiki page format (Confluence in the example in the video), you can create a collection of templates - for example, a template for a feature page, a template for a meeting-minutes page, etc. Then your team members can quickly create pages that conform to these templates.

In the video below I show you how to leverage this combination step by step.

Categories: Development

Adding Off Canvas Layout to a Visual Builder Application

Wed, 2018-11-07 15:18

Off Canvas layout is a common UI pattern for modern applications, especially on mobile devices. The concept is aimed at saving space on your page, allowing you to pop out a "drawer" of additional information. This helps reduce clutter on the main page while still providing access to important data when needed, without leaving the page context. You can see an example of the runtime behavior at the top of this post.

Oracle JET provides this type of "off-canvas" behavior as a built-in component, and they have a demo of it working as part of the cookbook here.

In the video below I show you how to add this to a Visual Builder application. As always, you can mostly just copy and paste code from the JET cookbook, but you need to handle some of the resource imports a little differently, and use the Visual Builder approach for adding your JavaScript function.

The code used in the video is:

Page source:

[The page source markup did not survive the feed conversion; only the visible labels remain: Menu, List, chart, Gifts.]

JavaScript Function in the page:

define(['ojs/ojcore'], function (oj) {
  'use strict';

  var PageModule = function PageModule() {};

  PageModule.prototype.showSide = function () {
    var offcanvas = {
      "selector": "#startDrawer",
      "content": "#mainContent",
      "edge": "start",
      "displayMode": "push",
      "size": "200px"
    };
    oj.OffcanvasUtils.open(offcanvas);
  };

  return PageModule;
});

and in your page JSON file add this import:

"oj-offCanvas": { "path": "ojs/ojoffcanvas" }
Categories: Development

Working with REST POST and Other Operations in Visual Builder

Fri, 2018-10-05 12:37

One of the strong features of Visual Builder Cloud Service is the ability to consume any REST service very easily. I have a video that shows how to work with REST services in a completely declarative way, but that video doesn't show what happens behind the scenes when you work with the quick starts. In addition, that video only shows the GET methods, and several threads on our community's discussion forum asked for help working with the other REST operations.

The demo video aims to give you better insight into working with REST operations, showing how to:

  • Add service endpoints for various REST operations
  • Create a GET form manually for retrieving single records
  • Create a POST form manually
    • Create types for the request and response parameters
    • Create variables based on those types
    • Call the POST operation, passing a variable as the body
  • Get the values returned from the POST to show in a page or notifications

A couple of notes:

In the video I use the free REST testing platform at https://jsonplaceholder.typicode.com
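Outside of Visual Builder, the same POST call can be sketched in plain JavaScript against that test platform. The helper names (buildCreateRequest, createPost) are mine, not from the post; the title/body/userId payload fields are jsonplaceholder's own:

```javascript
// Build the request options for a POST; kept as a pure function so the
// request shape can be inspected before any network call is made.
function buildCreateRequest(payload) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json; charset=UTF-8' },
    body: JSON.stringify(payload)
  };
}

// Hypothetical usage against the jsonplaceholder test API; the service
// echoes the created record back as JSON with a generated id.
function createPost(payload) {
  return fetch('https://jsonplaceholder.typicode.com/posts',
               buildCreateRequest(payload))
    .then(function (response) { return response.json(); });
}
```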

While I do everything here manually, you should be able to use the quick starts for creating a "create" form and map it to the POST operation - as long as you marked the specific endpoint as a "create" endpoint like I did in the demo.

If concepts such as types, variables, and action chains are new to you, I would highly recommend watching this video on the VBCS architecture and building blocks; it will help you better understand what VBCS is all about.

Categories: Development

Business Logic for Business Object in Visual Builder - Triggers, Object Functions, Groovy and More

Fri, 2018-09-14 18:15

The business objects that you create in Visual Builder Cloud Service (VBCS) are quite powerful. Not only can they store data, manage relationships, and give you a rich REST interface for interacting with them, they can also execute dedicated business logic that deals with the data.

If you click on the Business Rules section of a business object you'll see that you can create:

  • Triggers - allow you to react to data events such as insert, update, and delete on records.
  • Object and field Validators - allow you to make sure that data at the field or record level is correct.
  • Object Functions - a way to define "service methods" that encapsulate logic related to a business object. These functions can be invoked from various points in your application, and also from outside your app.

To code the logic in any of these locations, you leverage the Groovy language.

I wanted to show the power of some of the functionality you can achieve with these hook points for logic. The demo scenario below is based on a requirement we got from a customer: to be able to send an email with the details of all the child records that belong to a specific master record. Imagine a scenario where we have travel requests associated with specific airlines. When we go to delete an airline, we want to send an email that notifies someone about the travel requests that are going to be impacted by this change.

To achieve this I used an accessor - an object that helps you traverse relationships between two objects - to loop over the records and collect them.

In the video below you'll see a couple of important points:

  • Business object relationships and how to locate the name of an accessor
  • Using a Trigger Event to send an email
  • Passing an object function as a parameter to an email template
  • Coding Groovy in a business object

For those interested, the specific Groovy code I used is:

def children = TravelRequests; // accessor name to the child collection
def ret_val = "List of travel requests ";
if (!children.hasNext()) {
  return "no impact";
}
while (children.hasNext()) {
  def emprec = children.next();
  def name = emprec.name;
  ret_val = ret_val + " " + name;
}
return ret_val;

 

By the way - if, like me, you come from a background of using Oracle ADF Business Components, you might find many of the things we did here quite familiar. That's because we are leveraging Oracle ADF Business Components in this layer of Visual Builder Cloud Service. So looking up old Groovy tutorials and blogs about ADF BC might prove useful here too :-)
Categories: Development

Automating CI/CD for Docker with Oracle Cloud Infrastructure Registry and Developer Cloud Service

Tue, 2018-08-28 18:20

In recent releases, Developer Cloud Service (DevCS) has expanded to let you manage the full life cycle of infrastructure in addition to software. One area where we made things simpler is the management of CI/CD for Docker containers.

In this blog entry I'll take you through the basics of setting up a CI/CD chain that publishes Docker containers into the Oracle Cloud Infrastructure Registry (OCIR) - Oracle's cloud-hosted Docker registry. If you need a little tutorial on getting started with OCIR and Docker using the command line, you can use this one.

Here is a demo video showing you how to leverage DevCS to automate the publishing process of Docker images and hook it into the Git repository in DevCS:

A few notes to help you replicate the steps I take in the video:

You will need to configure a DevCS build server that has the Docker software on it to run your builds. You do this from the "Organization" menu under your user name (or get your org admin to do it for you). If you need tips, Abhinav shows you how in this blog entry.

In order to work with OCIR from DevCS you'll need an Oracle Compute user with an auth token generated for it - make sure you have this token available, as you'll need it when working from DevCS. (Note that this is separate from the password the user uses to log into the Oracle compute dashboard.)

User auth in OCI

Now that you have a user, it's time to start your DevCS work. The first thing you'll want to do is upload the code for a Docker image into your DevCS Git repository. In the video below you'll see that I'm using a very simple application, and that my Dockerfile is one of the 3 files I upload into the Git repository in DevCS. (Need some sample code? Pick up this Docker getting started tutorial.)

Once your code is in DevCS, the next step is to create a build job that pushes your code into the OCI Registry. Our job has three steps that leverage the following Docker commands:

  • Login - you'll need to use your tenantName/user as the username and the auth token as the password. Your registry URL depends on the region you are in (iad - Ashburn, phx - Phoenix, etc.). In my case it was iad.ocir.io.
  • Build - you'll want to specify a complete image name, again using your tenant/user/imageName. Also make sure to indicate where your Dockerfile is (or provide one in that step).
  • Push - here everything should already be filled out for you and ready to go.

Here is how the completed job looks:

Docker Build Job

Now run your build - it will build your image, and if it succeeds it will push it to the registry. Monitor the build's log in case you run into errors.

You can point the Docker registry section of DevCS to the OCIR registry - then you'll be able to see all the images you pushed there. Again, provide your tenant/user and auth token as the login info for the repository.

Docker Registry View in DevCS

Have fun automating your Docker publishing on the Oracle Cloud!

Categories: Development

Edit Form in a Popup with Oracle Visual Builder

Mon, 2018-08-06 11:51

In Visual Builder Cloud Service (VBCS) it is very easy to create a CRUD application where one page shows you a list of records, and clicking one of them takes you to another page where you edit the record. But what if instead you want to edit the record on the same page - or in a popup window when you click the record in the table?

This is what this blog is all about - the result looks like this:

Edit Popup

The video combines several techniques and tips, some of which I covered in detail in other blog entries. Since the video is a bit on the long side (14 min), here is a breakdown of what it shows and a way for you to skip to the parts that interest you:

Categories: Development

Tips and Tricks for List of Values in Visual Builder Cloud Service

Fri, 2018-08-03 17:39

While working on some customers' applications, I ran into a few performance and functionality tips related to lists of values in Visual Builder Cloud Service (VBCS). While it is very tempting to use the built-in quick start that binds a list to the results of a service call, in some cases you might want to take a different approach.

One reason is performance - some lists don't change very often, so it makes more sense to fetch them only once instead of on every entry into a page. VBCS offers additional scopes for variables, and in the demo below I show how to use an application-scoped variable to fetch a list of countries only once (starting at 1:15). I also show how to define an array that stores the values of the list in a way that lets you access them from other locations in your app, not just from the specific UI component.

The other scenario that the demo shows relates to situations where you need to get additional information about the record you selected in the list. For example, your list might show the code and label, but the record might contain additional meaningful fields. What if you need access to those values for the selected record?

In the demo below (6:40), I use a little JavaScript utility method that I add to the page to get the full details of the selected record from the list. The code used is (replace the two bold names with the id field and the value you want to return):

PageModule.prototype.FindThreeLetter = function(list,value) {
 return  list.find(record => record.alpha2_code === value).alpha3_code;
}

In the past, any array used for an LOV had to have "label" and "code" fields, but Oracle JET now allows you to set other fields to act in those roles. This is shown at 5:54 using the options-keys property of the list component - a combobox in my case.
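As a sketch, an array like the one below (reusing the country fields from the lookup utility above) can back the list directly, with an options-keys mapping saying which fields play the value and label roles. The exact mapping shape is an assumption here - check the JET documentation for your version:

```javascript
// Reference data whose fields are not named "code"/"label".
var countries = [
  { alpha2_code: 'US', alpha3_code: 'USA', name: 'United States' },
  { alpha2_code: 'FR', alpha3_code: 'FRA', name: 'France' }
];

// Mapping for the component's options-keys attribute, telling JET
// which field acts as the value and which as the label.
var optionsKeys = { value: 'alpha2_code', label: 'name' };

// The lookup utility from the post then works against the same array:
function findThreeLetter(list, value) {
  return list.find(function (record) {
    return record.alpha2_code === value;
  }).alpha3_code;
}
```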

Check it out:

Categories: Development

Implementing Master/Detail in Oracle Visual Builder Cloud Service

Wed, 2018-06-20 18:29

This is a quick demo that combines two techniques I showed in previous blogs - filtering lists, and accessing the value of a selected row in a table. Leveraging these two together, it's quite easy to create a page that has two tables on it - one is the parent and the other is the child; once you select a record in the parent, the child table updates to show only the related child records.

Here is a quick demo:

The two steps we take are:

  • Create an action flow on the change of the first-selected-row attribute of the table
  • In the flow, use the Assign Variable action to set the filterCriterion of the child table to check for the value selected in the master

As you can see - quite simple.
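The value assigned to the child table's filterCriterion in that second step can be sketched as the object below - the attribute name "departmentId" and the selected value are made-up examples standing in for your own master key field:

```javascript
// Key taken from the master table's first-selected-row (hypothetical value).
var selectedMasterId = 30;

// filterCriterion assigned to the child table's ServiceDataProvider:
// keep only child records whose departmentId matches the selected master.
var childFilterCriterion = {
  op: '$eq',
  attribute: 'departmentId',
  value: selectedMasterId
};
```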

 

Categories: Development

Creating Dependent/Cascading Select Lists with Visual Builder

Fri, 2018-06-01 17:13

A common requirement in applications is to have dependent lists (also known as cascading lists) - meaning the value selected in one place influences the values that can be selected in another place. For example, when you select a state, we'll only show you cities in that state in the city list.

In the short demo video below, I'm showing you how to implement this cascading lists solution with the new Visual Builder Cloud Service.

The solution is quite simple:

You catch the event of a value change in the first list, and in the action chain that is invoked you set a filterCriterion on the second list. (See this entry for a quick introduction to filterCriterion).

Since the list is connected to a ServiceDataProvider, there is no further action you need to take - the change to the SDP will be reflected in the UI component automatically.

Quick tips - make sure you reference the id of the column, and that your operators are properly defined and enclosed in double quotes.

 

Categories: Development

When Screen Scraping became API calling – Gathering Oracle OpenWorld Session Catalog with ...

Sun, 2018-05-20 03:16

A dataset with all sessions of the upcoming Oracle OpenWorld 2017 conference is nice to have – for experiments and demonstrations with many technologies. The session catalog is exposed at a website here.

With searching, filtering and scrolling, all available sessions can be inspected. If data is available in a browser, it can be retrieved programmatically and persisted locally, for example in a JSON document. A typical approach for this is web scraping: having a server-side program act like a browser, retrieve the HTML from the web site, and query the data from the response. This process is described, for example, in this article – https://codeburst.io/an-introduction-to-web-scraping-with-node-js-1045b55c63f7 – for Node and the Cheerio library.

However, server-side screen scraping of HTML will only be successful when the HTML is static. Dynamic HTML is constructed in the browser by executing JavaScript code that manipulates the browser DOM. If that is the mechanism behind a web site, server-side scraping is at the very least considerably more complex (as it requires the server to emulate a modern web browser to a large degree). Selenium has been used in such cases - to provide a server-side, programmatically accessible browser engine. Alternatively, screen scraping can also be performed inside the browser itself - as is supported, for example, by the Getsy library.

As you will find in this article - when server-side scraping fails, client-side scraping may be a much too complex solution. It is very well possible that the rich client web application is using a REST API that provides the data as a JSON document - an API that our server-side program can also easily leverage. That turned out to be the case for the OOW 2017 website - so instead of complex HTML parsing and server-side or even client-side scraping, the challenge at hand resolves to nothing more than a little bit of REST calling. Read the complete article here.

PaaS Partner Community

For regular information on business process management and integration, become a member of the SOA & BPM Partner Community. For registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

Blog Twitter LinkedIn Facebook Wiki

Technorati Tags: SOA Community,Oracle SOA,Oracle BPM,OPN,Jürgen Kress

Categories: Development

Solve digital transformation challenges using Oracle Cloud

Sun, 2018-05-20 03:15

 


Digital transformation is an omnipresent topic today, presenting many challenges as well as opportunities. Because of that, customers are asking how to deal with those challenges and how to take advantage of the opportunities. Frequently asked questions in this area are:

  • How can we modernize existing applications?
  • What are the key elements of a future-proof IT system architecture strategy?
  • How can the flexibility as well as the agility of the IT system landscape be ensured?

But from our experience there is no universal answer to these questions, since every customer has individual requirements and businesses. It is necessary to find pragmatic solutions that leverage existing best practices - it is not necessary to completely re-invent the wheel.

With our new poster "Four Pillars of Digitalization based on Oracle Cloud" (download it here), we try to deliver a set of harmonized reference models that we evolved based on our practical experience while conceiving modern, future-oriented solutions in the areas of modern application design, integrative architectures, modern infrastructure solutions, and analytical architectures. The guiding principle that forms the basis for our architectural thoughts is: Design for Change. If you want to learn more, you can refer to the corresponding ebook (find the ebook here; only available in German at the moment).

Usually the technological base for modern application architectures today is Cloud services, where the offerings of different vendors are constantly growing. Here it is important to know which Cloud services are the right ones to implement a specific use case. Our poster "Four Pillars of Digitalization based on Oracle Cloud" shows the respective Cloud services of our strategic partner Oracle that can be used to address specific challenges in the area of digitalization. Get the poster here.

 

Developer Partner Community

For regular information, become a member of the Developer Partner Community. Please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

Blog Twitter LinkedIn Forum Wiki

Technorati Tags: PaaS,Cloud,Middleware Update,WebLogic, WebLogic Community,Oracle,OPN,Jürgen Kress

Categories: Development

Oracle API Platform Cloud Service Overview by Rolando Carrasco

Sat, 2018-05-19 03:25


  • Oracle API Platform Cloud Services - API Design: the first video of a series showcasing the usage of Oracle API Platform Cloud Services (API Management, part 1 of 2).
  • Oracle API Cloud Services: the second video of the series, showcasing the usage of the brand new Oracle API Platform CS (part one of API Management).
  • Oracle API Platform Cloud Services - API Management part 2: the 3rd video of the series. Specifically, here we will see the second part of the API Management functionality, focused on Documentation.
  • Oracle API Platform CS - How to create an app: the 4th video of this series. In this video you will learn how to create an application.
  • Oracle API Platform Cloud Services - API Usage: the fifth video of this series. In this video I showcase how you interact with the APIs that are deployed in APIPCS.

 


Categories: Development

Why are Universal Cloud Credit and Bring Your Own License a great opportunity for Oracle Partners?

Sat, 2018-05-19 03:24

Oracle has simplified buying and consuming for PaaS and IaaS Cloud. Customers can now purchase Universal Cloud Credits. These universal cloud credits can be spent on any IaaS or PaaS service. Partners can start a PoC or project, e.g. with Application Container Cloud Service, and can add additional services when required, e.g. Chatbot Cloud Service. The customer can use the universal cloud credits for any available or even upcoming IaaS and PaaS services.

Thousands of customers use Oracle Fusion Middleware and Databases today. With Bring Your Own License they can easily move workloads to the cloud. As they already own the license, the customer needs to pay only a small uplift for the service portion of PaaS. This is a major opportunity for Oracle partners to offer services to these customers.

To learn more about Universal Cloud Credits and Bring Your Own License, attend the free on-demand training here.

 


Categories: Development

Event Hub Cloud Service. Hello world

Sat, 2018-05-19 00:46

A while back, I wrote a blog post about the Oracle Reference Architecture and the concepts of Schema on Read and Schema on Write. Schema on Read is well suited for a Data Lake, which may ingest any data as is, without any transformation, and preserve it for a long period of time.

At the same time you have two types of data - streaming data and batch data. Batch could be log files or RDBMS archives. Streaming data could be IoT or sensor events, or GoldenGate replication logs.

Apache Kafka is a very popular engine for acquiring streaming data. It has multiple advantages, like scalability, fault tolerance, and high throughput. Unfortunately, Kafka is hard to manage. Fortunately, the Cloud simplifies many routine operations. Oracle has three options for deploying Kafka in the Cloud:

1) Use Big Data Cloud Service, where you get a full Cloudera cluster and can deploy Apache Kafka as part of CDH.

2) Event Hub Cloud Service Dedicated. Here you have to specify server shapes and some other parameters, but the rest is done by the Cloud automagically.

3) Event Hub Cloud Service. This service is fully managed by Oracle; you don't even need to specify any compute shapes. The only things to decide are how long you need to store data in the topic and how many partitions you need (partitions = performance).
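To illustrate why the partition count matters, here is a small sketch of the general idea behind keyed-message routing: a partition is picked by hashing the message key modulo the number of partitions, so more partitions means more parallel read/write streams. Note this is illustrative only — Kafka's real default partitioner uses a murmur2 hash, and cksum below is just a stand-in for "some hash"; the key name is made up.

```shell
#!/bin/bash
# Illustrative sketch only: a keyed message lands on partition
# hash(key) % NUM_PARTITIONS. Kafka's default partitioner uses murmur2;
# cksum here is merely a stand-in hash for demonstration.
NUM_PARTITIONS=3
KEY="sensor-42"
HASH=$(printf '%s' "$KEY" | cksum | cut -d' ' -f1)
PARTITION=$(( HASH % NUM_PARTITIONS ))
echo "key=$KEY -> partition=$PARTITION"
```

Messages with the same key always land on the same partition, which is what preserves per-key ordering.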

Today, I'm going to tell you about the last option, the fully managed cloud service.

It's really easy to provision: just log in to your cloud account and choose the "Event Hub" cloud service.

After this, open the service console:

Next, click on "Create service":

Enter the parameters. The two key ones are Retention Period and Number of Partitions: the first defines how long messages are stored, the second defines performance for read and write operations.

Then click Next:

Confirm and wait a while (usually not more than a few minutes):

After a short while, you will be able to see the provisioned service:

 

 

Hello world flow.

Today I want to show a "Hello world" flow: how to produce (write) and consume (read) a message with Event Hub Cloud Service.

The flow is (step by step):

1) Obtain OAuth token

2) Produce message to a topic

3) Create consumer group

4) Subscribe to topic

5) Consume message

Now I'm going to show each step in some detail.

OAuth and Authentication token (Step 1)

To work with Event Hub Cloud Service you have to be familiar with the concepts of OAuth and OpenID. If you are not, you can watch this short video or go through this step-by-step tutorial.

In a couple of words, OAuth is a token-based authorization method (the token tells what you can access) used to restrict access to resources.

One of the main ideas is to decouple the user (a real human, the Resource Owner) from the application (the Client). The human knows the login and password, but the Client (application) will not use them every time it needs to reach the Resource Server (which holds some info or content). Instead, the application obtains an authorization token once and uses it when working with the Resource Server. This is a brief summary; here you may find a more detailed explanation of what OAuth is.

Obtain a token for an Event Hub Cloud Service client.

As you can understand, to get access to the Resource Server (read: Event Hub messages) you need to obtain an authorization token from the Authorization Server (read: IDCS). Here, I'd like to show the step-by-step flow for obtaining this token. I will start from the end and show the command (REST call) that you have to run to get the token:

#!/bin/bash
curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

As you can see, several parameters are required to obtain the OAuth token.

Let's take a look at where you can get them. Go to the service and click on the topic you want to work with; there you will find the IDCS Application. Click on it:

After clicking on it, you will be redirected to the IDCS Application page. You can find most of the credentials here. Click on Configuration:

On this page you will right away find the Client ID and Client Secret (think of them like a login and password):

 

Scroll down and find the section called Resources:

Click on it

and you will find another two variables that you need for the OAuth token: Scope and Primary Audience.

One more required parameter, IDCS_URL, you can find in your browser's address bar:

Now you have almost everything you need, except the login and password. These are your Oracle Cloud login and password (what you use when logging in to http://myservices.us.oraclecloud.com):

Now you have all the required credentials and are ready to write a script that automates all of this:

#!/bin/bash
export CLIENT_ID=7EA06D3A99D944A5ADCE6C64CCF5C2AC_APPID
export CLIENT_SECRET=0380f967-98d4-45e9-8f9a-45100f4638b2
export THEUSERNAME=john.dunbar
export THEPASSWORD=MyPassword
export SCOPE=/idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export PRIMARY_AUDIENCE=https://7EA06D3A99D944A5ADCE6C64CCF5C2AC.uscom-central-1.oraclecloud.com:443
export THESCOPE=$PRIMARY_AUDIENCE$SCOPE
export IDCS_URL=https://idcs-1d6cc7dae45b40a1b9ef42c7608b9afe.identity.oraclecloud.com

curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

After running this script, you will have a new file called access_token.json. The access_token field is what you need:

$ cat access_token.json {"access_token":"eyJ4NXQjUzI1NiI6InVUMy1YczRNZVZUZFhGbXFQX19GMFJsYmtoQjdCbXJBc3FtV2V4U2NQM3MiLCJ4NXQiOiJhQ25HQUpFSFdZdU9tQWhUMWR1dmFBVmpmd0UiLCJraWQiOiJTSUdOSU5HX0tFWSIsImFsZyI6IlJTMjU2In0.eyJ1c2VyX3R6IjoiQW1lcmljYVwvQ2hpY2FnbyIsInN1YiI6ImpvaG4uZHVuYmFyIiwidXNlcl9sb2NhbGUiOiJlbiIsInVzZXJfZGlzcGxheW5hbWUiOiJKb2huIER1bmJhciIsInVzZXIudGVuYW50Lm5hbWUiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwic3ViX21hcHBpbmdhdHRyIjoidXNlck5hbWUiLCJpc3MiOiJodHRwczpcL1wvaWRlbnRpdHkub3JhY2xlY2xvdWQuY29tXC8iLCJ0b2tfdHlwZSI6IkFUIiwidXNlcl90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsImNsaWVudF9pZCI6IjdFQTA2RDNBOTlEOTQ0QTVBRENFNkM2NENDRjVDMkFDX0FQUElEIiwiYXVkIjpbInVybjpvcGM6bGJhYXM6bG9naWNhbGd1aWQ9N0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMiLCJodHRwczpcL1wvN0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMudXNjb20tY2VudHJhbC0xLm9yYWNsZWNsb3VkLmNvbTo0NDMiXSwidXNlcl9pZCI6IjM1Yzk2YWUyNTZjOTRhNTQ5ZWU0NWUyMDJjZThlY2IxIiwic3ViX3R5cGUiOiJ1c2VyIiwic2NvcGUiOiJcL2lkY3MtMWQ2Y2M3ZGFlNDViNDBhMWI5ZWY0MmM3NjA4YjlhZmUtb2VodGVzdCIsImNsaWVudF90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsInVzZXJfbGFuZyI6ImVuIiwiZXhwIjoxNTI3Mjk5NjUyLCJpYXQiOjE1MjY2OTQ4NTIsImNsaWVudF9ndWlkIjoiZGVjN2E4ZGRhM2I4NDA1MDgzMjE4NWQ1MzZkNDdjYTAiLCJjbGllbnRfbmFtZSI6Ik9FSENTX29laHRlc3QiLCJ0ZW5hbnQiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwianRpIjoiMDkwYWI4ZGYtNjA0NC00OWRlLWFjMTEtOGE5ODIzYTEyNjI5In0.aNDRIM5Gv_fx8EZ54u4AXVNG9B_F8MuyXjQR-vdyHDyRFxTefwlR3gRsnpf0GwHPSJfZb56wEwOVLraRXz1vPHc7Gzk97tdYZ-Mrv7NjoLoxqQj-uGxwAvU3m8_T3ilHthvQ4t9tXPB5o7xPII-BoWa-CF4QC8480ThrBwbl1emTDtEpR9-4z4mm1Ps-rJ9L3BItGXWzNZ6PiNdVbuxCQaboWMQXJM9bSgTmWbAYURwqoyeD9gMw2JkwgNMSmljRnJ_yGRv5KAsaRguqyV-x-lyE9PyW9SiG4rM47t-lY-okMxzchDm8nco84J5XlpKp98kMcg65Ql5Y3TVYGNhTEg","token_type":"Bearer","expires_in":604800}
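As an aside, the access token above is a JWT: three base64url-encoded segments separated by dots, and the middle segment is a JSON payload you can inspect. Here is a sketch decoding the payload of a toy token (TOKEN_SAMPLE below is made up for illustration; real tokens may need '=' padding appended before base64 will accept them, and BSD base64 uses -D instead of -d).

```shell
#!/bin/bash
# Decode the payload (second dot-separated segment) of a toy JWT.
# TOKEN_SAMPLE is a fabricated example, not a real credential.
TOKEN_SAMPLE='eyJhbGciOiJSUzI1NiJ9.eyJzdWIiOiJqb2huLmR1bmJhciJ9.signature'
PAYLOAD_B64=$(printf '%s' "$TOKEN_SAMPLE" | cut -d. -f2)
PAYLOAD=$(printf '%s' "$PAYLOAD_B64" | base64 -d)
echo "$PAYLOAD"
```

This is handy for checking claims like the subject, audience, and expiry of the token you were issued.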

Create a shell variable for it:

#!/bin/bash
export TOKEN=`cat access_token.json | jq .access_token | sed 's/\"//g'`
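If jq happens not to be installed on your machine, a sed one-liner can pull the field out of this flat JSON shape. This is just a sketch (the sample file contents below are made up); for anything beyond a flat single-line response, a real JSON parser like jq is safer.

```shell
#!/bin/bash
# Extract access_token from a flat, single-line JSON file using only sed.
# The file written here is a fabricated sample standing in for the real
# access_token.json produced by the earlier curl call.
cat > access_token.json <<'EOF'
{"access_token":"abc123","token_type":"Bearer","expires_in":604800}
EOF
export TOKEN=$(sed 's/.*"access_token":"\([^"]*\)".*/\1/' access_token.json)
echo "$TOKEN"
```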

Well, now we have an authorization token and can work with our Resource Server (Event Hub Cloud Service).

Note: you can also check the documentation on how to obtain an OAuth token.

Produce Messages (Write data) to Kafka (Step 2)

The first thing we may want to do is produce messages (write data to the Kafka cluster). To make scripting easier, it's better to use environment variables for common resources. For this example, I'd recommend parametrizing the topic's endpoint, the topic name, and the content type. The content type is completely up to the developer, but you have to consume (read) the same format that you produce (write). The key parameter to define is the REST endpoint. Go to PSM, click on the topic name, and copy everything up to "restproxy":

You will also need the topic name, which you can take from the same window:

Now we can write a simple script to produce one message to Kafka:

#!/bin/bash
export OEHCS_ENDPOINT=https://oehtest-gse00014957.uscom-central-1.oraclecloud.com:443/restproxy
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export CONTENT_TYPE=application/vnd.kafka.json.v2+json

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"records":[{"value":{"foo":"bar"}}]}' \
  $OEHCS_ENDPOINT/topics/$TOPIC_NAME

If everything goes fine, the console will return something like:

{"offsets":[{"partition":1,"offset":8,"error_code":null,"error":null}],"key_schema_id":null,"value_schema_id":null}
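In a script you would typically check that response rather than eyeball it: each element of the offsets array carries an error_code, which is null on success. A minimal sketch, grepping a sample response (RESPONSE below is hard-coded to mirror the output shown above; a real script would capture curl's output):

```shell
#!/bin/bash
# Check whether a produce response reported an error. RESPONSE is a
# hard-coded sample mirroring the service output shown above.
RESPONSE='{"offsets":[{"partition":1,"offset":8,"error_code":null,"error":null}],"key_schema_id":null,"value_schema_id":null}'
if printf '%s' "$RESPONSE" | grep -q '"error_code":null'; then
  RESULT=ok
else
  RESULT=failed
fi
echo "$RESULT"
```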

Create Consumer Group (Step 3)

The first step in reading data from OEHCS is creating a consumer group. We will reuse the environment variables from the previous step, but just in case, I'll include them in this script:

#!/bin/bash
export OEHCS_ENDPOINT=https://oehtest-gse00014957.uscom-central-1.oraclecloud.com:443/restproxy
export CONTENT_TYPE=application/vnd.kafka.json.v2+json
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"format": "json", "auto.offset.reset": "earliest"}' \
  $OEHCS_ENDPOINT/consumers/oehcs-consumer-group \
  -o consumer_group.json

This script generates an output file containing variables that we will need to consume messages.

Subscribe to a topic (Step 4)

Now you are ready to subscribe to this topic (export the environment variables if you didn't do so before):

#!/bin/bash
export BASE_URI=`cat consumer_group.json | jq .base_uri | sed 's/\"//g'`
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest

curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  -d "{\"topics\": [\"$TOPIC_NAME\"]}" \
  $BASE_URI/subscription

If everything is fine, this request returns nothing.

Consume (Read) messages (Step 5)

Finally, we reach the last step: consuming messages.

Again, it's quite a simple curl request:

#!/bin/bash
export BASE_URI=`cat consumer_group.json | jq .base_uri | sed 's/\"//g'`
export H_ACCEPT=application/vnd.kafka.json.v2+json

curl -X GET \
  -H "Authorization: Bearer $TOKEN" \
  -H "Accept: $H_ACCEPT" \
  $BASE_URI/records

If everything works as it is supposed to, you will get output like:

[{"topic":"idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest","key":null,"value":{"foo":"bar"},"partition":1,"offset":17}]
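A single GET may well come back empty if no new messages have arrived yet, so a real consumer polls in a loop. Here is a sketch of that retry logic; fetch_records is a made-up stand-in for the curl GET $BASE_URI/records call above, wired to return an empty list on the first two polls and then a record, so the loop's behavior is visible without a live service.

```shell
#!/bin/bash
# Sketch of a consumer polling loop. fetch_records simulates the real
# curl GET $BASE_URI/records call: empty on the first two polls, then data.
fetch_records() {
  if [ "$1" -lt 3 ]; then
    echo '[]'
  else
    echo '[{"value":{"foo":"bar"}}]'
  fi
}

RECORDS='[]'
for attempt in 1 2 3 4 5; do
  RECORDS=$(fetch_records "$attempt")
  [ "$RECORDS" != "[]" ] && break
  # sleep 2   # a real poller would pause between requests
done
echo "$RECORDS"
```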

Conclusion

Today we saw how easy it is to create a fully managed Kafka topic in Event Hub Cloud Service, and we made our first steps with it: writing and reading a message. Kafka is a really popular message bus engine, but it's hard to manage. The cloud simplifies this and allows customers to concentrate on the development of their applications.

Here are some useful links:

1) If you are not familiar with REST APIs, I'd recommend going through this blog

2) There is an online tool that helps validate your curl requests

3) Here you can find some useful examples of producing and consuming messages

4) If you are not familiar with OAuth, here is a nice tutorial showing an end-to-end example

Categories: Development
