PeopleSoft Technology Blog


How to run PeopleTools 8.57 in the Cloud

Sat, 2018-09-22 01:13

Now that 8.57 is generally available to run in the cloud, I’ll outline the steps to get it up and running so you can take it for a spin. As announced in an earlier blog, PeopleTools 8.57 is initially released to run on the Oracle Cloud before it’s available for on-premises installs.  That means you’ll want to use PeopleSoft Cloud Manager 7.  There’s a good chance this is your first exposure to Cloud Manager, so it’s worth explaining what you’ll do.

The process is pretty simple.  First, you’ll get an Oracle Cloud Infrastructure (OCI) account.  Then you’ll install Cloud Manager 7 into that account.  Once installed, Cloud Manager 7 is used to download the 8.57 DPKs into your cloud repository.  Then you’ll use Cloud Manager 7 to upgrade a PUM image from 8.56 to 8.57 using the downloaded DPKs.  If you’ve been through a PeopleTools upgrade before, you’ll be amazed at how Cloud Manager automates the process.

The steps outlined below assume that you are new to OCI and Cloud Manager.  If you are already familiar with Cloud Manager and have it running in your tenancy, update your Cloud Manager instance using Interaction Hub Image 07 as described here, and skip to Step 4.

Step 1: Get your Oracle Cloud trial account.  If you are new to OCI and PeopleSoft Cloud Manager, please go through the links below to help you get started quickly.

Using the link given above, request a free account that gives you access to all Oracle Cloud services, with up to 3,500 hours or USD 300 in free credits (available in select countries, with limited validity). When your request is processed, you will be provisioned a tenancy in Oracle Cloud Infrastructure. Oracle will send you a welcome email with instructions for signing in to the console for the first time. There is no charge unless you choose to Upgrade to Paid from My Services in the console.

Step 2: Set up your OCI tenancy by creating users, policies, and networks.  Refer to the OCI documentation here for details, or get a quick overview in this blog.  After this step, your tenancy will be ready to deploy PeopleSoft Cloud Manager. If you are on OCI Classic, you can skip this step.

Step 3: Install and configure PeopleSoft Cloud Manager.  Follow the OCI install guide here. If you are on OCI Classic, follow the install guide here.

As part of this step, you will –

  • Download the Cloud Manager images
  • Upload images to your tenancy
  • Create a custom Microsoft Windows 2012 image
  • Spin up the Cloud Manager instance
  • Run bootstrap to install Cloud Manager
  • Configure Cloud Manager settings
  • Set up file server repository

Step 4: Subscribe to download channels.  The PeopleTools 8.57 download channel is now available in the unsubscribed channel list.  Navigate to Dashboard | Repository | Download Subscriptions and click the Unsubscribed tab. Subscribe to the new Tools_857_Linux channel using the related Actions menu. Also subscribe to any application download channels, for example HCM_92_Linux. Downloading a new PeopleTools version takes a while; wait for the status to indicate success.

Step 5: Provision a demo environment using the PUM Fulltier topology.  Create a new environment template to deploy an application that you downloaded in Step 4, using the provided PUM Fulltier topology in the template.  Then, using this newly created template, provision a new environment. If you already have a PUM environment deployed through Cloud Manager, you can skip this step.

Step 6: After the environment is provisioned, navigate to Environment Details | Upgrade PeopleTools.  On the right pane, you’ll see an option to select the PeopleTools 8.57 version that you downloaded.  Select the PeopleTools version you want to evaluate, and click Upgrade to begin the upgrade process.

You’ll be able to see the job details and the steps running on the same page.  Click on the ‘In progress’ status link to view the upgrade job status. 

After the upgrade is complete, click the status link for detailed upgrade job status.

The PeopleTools upgrade is now complete.  Log in to the upgraded environment’s PIA to evaluate the new features in PeopleTools 8.57.  For more information on PeopleTools 8.57, go to the PeopleTools 8.57 Documentation.

PeopleTools 8.57 is Available on the Oracle Cloud

Fri, 2018-09-21 15:17

We are pleased to announce that PeopleTools 8.57 is generally available for install and upgrade on the Oracle Cloud.  As we announced earlier, PeopleTools 8.57 will initially be available only on the Oracle Cloud.  We plan to make PeopleTools 8.57 available for on-premises downloads with the 8.57.04 CPU patch in January 2019.  

There are many exciting new features in PeopleTools 8.57, including:

  • The ability for end users to set up conditions in analytics that, if met, notify the user
  • Improvements to the way Related Content and Analytics are displayed
  • The ability to add custom fields to Fluid pages with minimal life-cycle impact
  • More capabilities for end-user personalization
  • Improved search that supports multi-facet selections
  • Easier-than-ever branding of the application with your corporate colors and graphics
  • Fluid page preview in Application Designer and an improved UI properties interface
  • End-to-end command-line support for life-cycle management processes
  • And much more!

You’ll want to get all the details and learn about the new features in 8.57.  A great place to start is the PeopleTools 8.57 Highlights Video posted on the PeopleSoft YouTube channel.  The highlights video gives you an overview of the new features and shows how to use them.

There is plenty more information about the release available today.  Here are links to other places you can go to learn more about 8.57:

In addition to PeopleTools 8.57, version 7 of PeopleSoft Cloud Manager is also being released today.  CM 7 is similar in functionality to CM 6, with additional support for PeopleTools 8.57.  If you currently use a version of Cloud Manager, you must upgrade to version 7 in order to install PT 8.57.

There are a lot of questions about how to get started using PeopleTools 8.57 and Cloud Manager 7.  Documentation and installation instructions are available on the Cloud Manager Home Page.

More information will be published over the next couple of weeks to help you get started with 8.57 in the cloud.  This will include blogs covering the details of the installation, a video that shows the complete process from creating a free trial account to running PT 8.57, and a detailed Spotlight Video that describes configuring OCI and Cloud Manager 7.

PeopleTools 8.57 is a significant milestone for Oracle, making it easier than ever for customers to use, maintain and run PeopleSoft Applications.

How to upload PeopleSoft Cloud Manager 6 Images to OCI

Mon, 2018-08-06 00:46

In a previous blog, we set up an OCI tenancy.  In this blog, let’s look at how to get the Cloud Manager images into your tenancy.  In OCI Classic, Cloud Manager images can easily be downloaded into your identity domain simply by going to Oracle Cloud Marketplace and using the Get App feature. Currently there is no similar capability for OCI. Instead, you must download the Cloud Manager OCI images to a local system and then upload them into an object storage bucket in your tenancy.

Although you cannot use the Get App feature in Oracle Cloud Marketplace to automatically download the Cloud Manager images into OCI, you still begin in Oracle Cloud Marketplace to locate the Cloud Manager images for OCI.

Locate and Download the Images from My Oracle Support

To begin, go to Oracle Cloud Marketplace, search for Cloud Manager, and select the listing for PeopleSoft Cloud Manager Image 06 for OCI, as shown in this example.

When you click on ‘Get App’ on this listing, you will be redirected to a My Oracle Support Knowledge Base document with links to the Cloud Manager Image 6.0 and a PeopleSoft Linux image.  The PeopleSoft Linux Image for Cloud Manager is an Oracle Linux 6.9 image that can be used to deploy PeopleSoft environments.  You can use this Linux image with no further changes to get started, or use it as a base image to create your own custom image by installing any packages that are required by your organization. 

You can download these images to any existing Linux or Microsoft Windows instance in OCI, or use your local on-premises system. To upload the images to your OCI tenancy, you must first install the OCI Command Line Interface (CLI).  Since there are multiple methods to install and configure the OCI CLI, let’s take a quick look at which one is best suited to uploading Cloud Manager images to OCI object storage.

Install OCI CLI

Using the automated CLI installer is the easiest option.  Let’s assume that we are using an on-premises Linux desktop to download the images.  You can also use a Microsoft Windows desktop or laptop with the same approach; there will be differences in the commands you need to run, which are clearly documented here.

To upload the images, you need to set up the OCI CLI on the Linux desktop where you downloaded them.  Log in to the Linux desktop and follow the instructions below to install the OCI CLI.  More details on installing the OCI CLI can be found in the Oracle Cloud documentation.

  1. Open a terminal.
  2. Run the installer script with the following command (all on one line).

bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

  3. Respond to the installer's prompts.
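Once the installer finishes, a quick sanity check confirms the CLI landed on your PATH (assuming you accepted the installer's default locations):

# Print the installed CLI version to verify the installation.
oci --version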

Configure OCI CLI

At this point, the OCI CLI is installed on your Linux desktop.  Now, you must configure it to work seamlessly with your OCI tenancy. 

To simplify configuration, use the setup dialog, which walks you through the first-time setup process step by step. Run the oci setup config command; it prompts you for the information required for the config file and for the API public/private keys, then generates an API key pair and creates the config file.

After setting up the OCI CLI, it is important to add the API public key to your user settings.  Log in to OCI web console and navigate to Identity > Users > <your user name> > Add Public Key.
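A minimal sketch of these two steps, assuming the default key location chosen by the setup dialog (adjust the path if you picked another one):

# Run the interactive first-time setup; it prompts for your user OCID, tenancy OCID,
# and region, then generates an API key pair and writes ~/.oci/config.
oci setup config
# Print the generated public key so you can paste it into the console under
# Identity > Users > <your user name> > Add Public Key.
cat ~/.oci/oci_api_key_public.pem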

Upload Cloud Manager Images to OCI

Now that the OCI CLI setup is complete, we can use the CLI to upload the Cloud Manager images downloaded earlier. Using the OCI web console, create an object storage bucket. You can then upload the downloaded images into this bucket using the OCI CLI.  The command to upload a file to an object storage bucket is shown below.  Modify it to suit your tenancy, and run it on the Linux desktop once for each image file.

oci os object put  -ns <tenancy_name>  -bn <bucket_name>  --file <cm_image_file> --name <destination_file_name>  --no-multipart
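For example, with hypothetical values for the tenancy namespace, bucket, and image file names (substitute your own):

# All names below are illustrative placeholders.
oci os object put -ns mytenancy -bn cm-images --file psft_cm_oci_6.tar.gz --name psft_cm_oci_6.tar.gz --no-multipart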

Both Cloud Manager images take a while to upload.  After the upload is complete, follow the process explained in the OBE for Installing Cloud Manager in OCI to import them as custom images into your tenancy.

How to Set Up OCI Tenancy for PeopleSoft Cloud Manager – Part II

Mon, 2018-07-02 03:43

In the previous blog on setting up tenancy, you learned that a Virtual Cloud Network (VCN) and subnets must be created as a prerequisite for PeopleSoft Cloud Manager Image 6. In Oracle Cloud Infrastructure (OCI), you get the flexibility to configure networking to mimic your on-premises environment.  Networking can be configured in many ways for PeopleSoft environments.  You can have separate subnets for demo, development, testing, pre-production, and production environments.  You can also group environments into two subnets, one for production and another for non-production environments.  Alternatively, you can create one subnet each for the database, the middle tier components (Application Server, Process Scheduler, and PIA), the PeopleSoft Client, Elasticsearch instances, and the load balancer.  Another option is to create only three subnets: one for the load balancer, a second for the database, and a third for the rest of the instance types.  Subnets can be either public or private.  You can choose to deploy all instances on a private subnet to secure them from the Internet, put them on a public network where they are accessible from the Internet, or put a few on private and a few on public subnets.  Complete flexibility!

In this blog, let’s take an example of a simple networking architecture. This example network has one VCN with one public subnet and one private subnet.  The public subnet (red dotted line in the illustration below) hosts the Cloud Manager, File Server, middle tier, PeopleSoft (Windows) Clients and Elasticsearch instances.  Database instances are hosted in a private subnet. 

 

The first step is to create a VCN using the OCI web console by selecting Networking, then Virtual Cloud Networks. Let’s call it MyVCN.

Note. To locate the commands mentioned here, click the Navigation Menu at the top left of the OCI console home page.

You can choose to create the default subnets by selecting the option Create Virtual Cloud Network Plus Related Resources, or select Create Virtual Cloud Network Only to create your own customized subnets.  Let’s select the latter option for the purposes of this blog. In the example below, a VCN is created with the CIDR 10.0.0.0/16.  You can use a CIDR that mimics your on-premises networking too. Read more on VCNs and subnets here to familiarize yourself with the concepts and management of networks.
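If you prefer scripting over the web console, a minimal sketch with the OCI CLI (the compartment OCID is a placeholder):

# Create the VCN from the CLI; replace the compartment OCID with your own.
oci network vcn create --compartment-id ocid1.compartment.oc1..exampleuniqueid --display-name MyVCN --cidr-block 10.0.0.0/16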

The next important part is creating security lists.  By default, every subnet has a security list.  You can customize the default security list or create your own.  In the sample network illustrated above, there are two security lists: one for the database subnet and another for the middle tier subnet. The security list for the database subnet can have rules as shown below.  The first rule indicates that the database instance can be accessed over the Internet on port 22 for SSH; the second indicates that all network connections originating from the middle tier subnet (10.0.1.0) are allowed to connect to all ports on instances running in the database subnet.

Now, let’s take a look at the middle tier subnet security list. Here is the summary of the rules:

  • Rule 1 – allow all connections to port 22 (SSH access)
  • Rule 2 – allow all connections to port range 8000 – 8200 (HTTP PIA access)
  • Rule 3 – allow all connections to port range 8443 – 8643 (HTTPS PIA access)
  • Rule 4 – allow all connections between instances within middle tier subnet
  • Rule 5 – allow all connections to port 3389 (RDP to Windows Client)

Also add egress rules for each subnet as per your requirements.  The example shown below allows connections to all destinations.
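As a sketch of how these rules look outside the console, here is one way to express the middle tier ingress rules above, plus a permissive egress rule, in the JSON format the OCI CLI accepts (all OCIDs are placeholders, and the middle tier subnet is assumed to be 10.0.1.0/24):

# Hypothetical sketch: create the middle tier security list from rule files.
cat > mt_ingress.json <<'EOF'
[
  {"protocol": "6", "source": "0.0.0.0/0",
   "tcpOptions": {"destinationPortRange": {"min": 22, "max": 22}}},
  {"protocol": "6", "source": "0.0.0.0/0",
   "tcpOptions": {"destinationPortRange": {"min": 8000, "max": 8200}}},
  {"protocol": "6", "source": "0.0.0.0/0",
   "tcpOptions": {"destinationPortRange": {"min": 8443, "max": 8643}}},
  {"protocol": "all", "source": "10.0.1.0/24"},
  {"protocol": "6", "source": "0.0.0.0/0",
   "tcpOptions": {"destinationPortRange": {"min": 3389, "max": 3389}}}
]
EOF
cat > mt_egress.json <<'EOF'
[
  {"protocol": "all", "destination": "0.0.0.0/0"}
]
EOF
# Replace the compartment and VCN OCIDs with your own.
oci network security-list create --compartment-id ocid1.compartment.oc1..exampleuniqueid --vcn-id ocid1.vcn.oc1..exampleuniqueid --display-name MT-Security-List --ingress-security-rules file://mt_ingress.json --egress-security-rules file://mt_egress.json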

Define an internet gateway as shown below.  Also add any route tables if required. More details are available in OCI documentation here.

It is important to ensure that the Cloud Manager and file server instances have access to all subnets on which they will provision new environments.  The NFS share on the file server where DPKs are stored is mounted on the instances being provisioned, so all NFS ports must be opened in the security lists. More details about the VCN, subnets, and security lists specific to Cloud Manager are available here.

The security rules in this blog are intentionally not very restrictive; they serve as examples to start with.  It is always recommended to restrict access to subnets by adding rules that allow only the required ports.  To learn more about securing access, refer to the OCI documentation here.

Next, create subnets in each Availability Domain.  In the example below, two subnets are created in the Availability Domain evQs:US-ASHBURN-AD-1.  One subnet is used for hosting database instances and the other hosts the remainder of the instances, such as the middle tier (running Application Server, PIA server, and Process Scheduler), Windows client, and Elasticsearch.  Create similar subnets in each of the availability domains; this way, you’ll have a uniform network layout across them.  Ensure that you select the right security list for each subnet.

With this, the tenancy is ready for you to set up PeopleSoft Cloud Manager.  You have set up a compartment, user, policies, VCN, subnets and security lists.  Learn more about installing Cloud Manager here and how to use it here.

How to Set up OCI Tenancy for PeopleSoft Cloud Manager – Part I

Mon, 2018-07-02 02:33

In our previous blog, we announced the release of the latest PeopleSoft Cloud Manager Image 6, which includes support for Oracle Cloud Infrastructure (OCI).  Cloud Manager for OCI is designed to work in a simple way.  Let’s go through the design aspects that must be considered in your planning.

  • Deploy all instances of a PeopleSoft environment in a single OCI Region
    Cloud Manager can deploy PeopleSoft environments in only one region, the one in which it is running.  When you sign in to Cloud Manager in a browser and access the Cloud Manager Infrastructure Settings page, you see this region listed as “Deployment Region.” When you create or configure an environment template in Cloud Manager, you will see all regions listed, but you must always specify the region chosen as Deployment Region on the Cloud Manager Infrastructure Settings page.
  • Deploy all instances of a PeopleSoft environment in a single Availability Domain (AD) in the chosen Deployment Region
    In environment templates, you can set the AD in which you want to deploy an environment.  All instances are deployed in the chosen AD.  Cloud Manager doesn’t provide the ability to choose different ADs for each instance of the same environment. For example, you will not be able to deploy a Database instance in one AD and middle tier components (Application Server, Process Scheduler, and PIA) or a PeopleSoft Client instance in another AD.
  • Deploy all instances of a PeopleSoft environment in a single compartment
    While defining environment templates, you get the option to choose the compartment in which an environment will be deployed.  All instances of an environment will always be deployed in the selected compartment.  You will not be allowed to deploy instances of the same environment across two or more compartments.  For example, you cannot deploy a Database instance in one compartment and middle tier instances or a PeopleSoft Client instance in another compartment.
  • Deploy the Cloud Manager instance, and all instances of a PeopleSoft environment, on a single VCN that is dedicated for PeopleSoft environments. 
    The Cloud Manager instance must be deployed in the same VCN as all instances of a PeopleSoft environment.  Under the dedicated VCN, you can create multiple subnets that can be used to isolate individual instances or production and non-production environments.

Let’s go through the requirements that satisfy the above design considerations.  To provision PeopleSoft environments in OCI using Cloud Manager, you’ll need to prepare your tenancy with a few prerequisite configurations.

Create a compartment

When you get access to your OCI tenancy, you’ll have only an administrator user.  You can find an overview of how to set up your tenancy here.

Note. To locate the commands mentioned here, click the Navigation Menu at the top left of the OCI console home page.

Let’s start by creating a compartment in which all PeopleSoft environments will be provisioned.  If you want to deploy your development, test, or production environments in separate compartments, create as many compartments as needed for the various environments.  For this blog, let’s assume we have created a compartment named nkpsft.

Create a cloud user for PeopleSoft

Sign in as the administrator user for OCI, and create a user for Cloud Manager with sufficient privileges to create and manage instances, the Virtual Cloud Network, and subnets.  This involves multiple steps, as outlined below.

First, create a Group for Cloud Manager users; for example, CM-Admin.

Next, create a policy, for example CM-Admin-Policy, to grant the Cloud Manager group access to the compartment nkpsft.
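In OCI, a policy is just a set of statements. A minimal sketch of this policy, both as a statement and as a CLI call (the compartment OCID is a placeholder):

# Statement granting the CM-Admin group full control over the nkpsft compartment:
#   Allow group CM-Admin to manage all-resources in compartment nkpsft
# The same policy created from the CLI; replace the compartment OCID with your own.
oci iam policy create --compartment-id ocid1.compartment.oc1..exampleuniqueid --name CM-Admin-Policy --description "Cloud Manager admin access to nkpsft" --statements '["Allow group CM-Admin to manage all-resources in compartment nkpsft"]'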

After that, create a user cloudmgr that will be used in Cloud Manager Settings.

Finally, add cloudmgr user to CM-Admin group.  With this, we have set up a compartment and a user with admin privileges to create and manage instances on OCI.

In our next blog, we’ll talk about the next important prerequisite: networking.  Cloud Manager Image 6 on OCI does not enforce a flat networking architecture, as it does in OCI Classic.  Your network admin can design and lay out the required networking architecture.  You can create your own networking layout, and Cloud Manager automation will use it to deploy PeopleSoft environments.  Remember to create the networking components in the same compartment that you created earlier, so that Cloud Manager has access to them.

 

Initial release of PeopleTools 8.57 will be in the Oracle Cloud

Thu, 2018-05-31 16:44

Oracle recently announced in a Tech Update that PeopleTools 8.57 will be generally available to all PeopleSoft customers for use first in the Oracle Cloud.  Shortly after, most likely with the third PeopleTools patch (8.57.03), it will be available for on-premises environments. This, understandably, has generated some questions, so I thought I’d take a few minutes to clear things up.

When we talk about running PeopleSoft in the cloud, we are talking about taking one or more of your existing PeopleSoft environments and, instead of running them in your data center, running them on hardware that you subscribe to and manage in the cloud.  In this case, the cloud is Oracle’s Infrastructure as a Service, also called Oracle Cloud Infrastructure (OCI).   When you move one of your environments to the cloud, it’s backed up to files, copied to a cloud repository, and provisioned from the repository.  We call that process ‘Lift and Shift’.  When it’s done, your application, with your data, your customizations, and your configurations, is running in the cloud.  Even though it’s in the cloud, you are responsible for maintaining it.  What has changed is the infrastructure – servers, storage, network – that the application is running on.

Just to be clear, there is no Software-as-a-Service (SaaS) version of PeopleSoft or PeopleTools, nor does Oracle have any plans to release one.   So, thinking it through, the 8.57 release of PeopleTools that we make available in the cloud is exactly the same as the version you will install in your on-premises environment.  Why, then, are we releasing it first in the cloud?

There are several reasons, but the most significant is to build awareness of the benefits of running PeopleSoft applications in the Oracle Cloud, and what that can mean to you.  We believe that in the long term, the best way to run PeopleSoft applications is to do so in the cloud.  To make it even better, one of the major initiatives over the past couple of years has been the release of PeopleSoft Cloud Manager.  With Cloud Manager, many of the processes that are time consuming or difficult, particularly around lifecycle management, have been improved or automated.  The PeopleTools upgrade, for instance, is automated.  Just choose one of your application images from your cloud repository and start it with the latest Tools version, and the app will be upgraded as part of the provisioning process.  It’s that easy.  And that’s just one example.

It’s pretty easy to take advantage of the initial releases of PeopleTools in the cloud.  In fact, as I write this there is a 30-day Trial Program that gives you free credits to try it out.  Be sure to follow the correct OBE when installing.  For more information, go to this link or talk to your Oracle account team.  There is also a PeopleSoft Key Concept page about running PeopleSoft on the Oracle Cloud and PeopleSoft Cloud Manager where you can get more information.  It only takes a small investment to try this out, and it could lead to major improvements in how you manage your applications.

When Screen Scraping became API calling – Gathering Oracle OpenWorld Session Catalog with ...

Sun, 2018-05-20 03:16

A dataset with all sessions of the upcoming Oracle OpenWorld 2017 conference is nice to have – for experiments and demonstrations with many technologies. The session catalog is exposed at a website here.

With searching, filtering, and scrolling, all available sessions can be inspected. If data is available in a browser, it can be retrieved programmatically and persisted locally, for example in a JSON document. A typical approach for this is web scraping: having a server-side program act like a browser, retrieve the HTML from the web site, and query the data from the response. This process is described, for example, in this article – https://codeburst.io/an-introduction-to-web-scraping-with-node-js-1045b55c63f7 – for Node and the Cheerio library.

However, server-side screen scraping of HTML will only be successful when the HTML is static. Dynamic HTML is constructed in the browser by executing JavaScript code that manipulates the browser DOM. If that is the mechanism behind a web site, server-side scraping is at the very least considerably more complex (as it requires the server to emulate a modern web browser to a large degree). Selenium has been used in such cases to provide a server-side, programmatically accessible browser engine. Alternatively, screen scraping can also be performed inside the browser itself, as is supported for example by the Getsy library.

As you will find in this article, when server-side scraping fails, client-side scraping may be much too complex a solution. It is very well possible that the rich client web application is using a REST API that provides the data as a JSON document, an API that our server-side program can also easily leverage. That turned out to be the case for the OOW 2017 website, so instead of complex HTML parsing and server-side or even client-side scraping, the challenge at hand resolves to nothing more than a little bit of REST calling. Read the complete article here.
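As a minimal sketch of that "little bit of REST calling" (the endpoint and JSON field names here are hypothetical, not the actual OOW API):

#!/bin/bash
# Fetch one page of a hypothetical session-catalog REST API and save the JSON.
curl -s "https://events.example.com/api/search?size=50&from=0" -H "Accept: application/json" -o sessions.json
# Extract session titles with jq (assumes a "sessions" array with "title" fields).
jq -r '.sessions[].title' sessions.json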

PaaS Partner Community

For regular information on business process management and integration, become a member of the SOA & BPM Partner Community. For registration, please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

Blog Twitter LinkedIn Facebook Wiki

Technorati Tags: SOA Community,Oracle SOA,Oracle BPM,OPN,Jürgen Kress

Solve digital transformation challenges using Oracle Cloud

Sun, 2018-05-20 03:15

 


Digital transformation is an omnipresent topic today, presenting many challenges as well as opportunities. Because of that, customers are asking how to deal with those challenges and how to take advantage of the opportunities. Frequently asked questions in this area are:

  • How can we modernize existing applications?
  • What are the key elements of a future-proof IT system architecture strategy?
  • How can the flexibility as well as the agility of the IT system landscape be ensured?

From our experience, there is no one-size-fits-all answer to these questions, since every customer has individual requirements and business needs. It is, however, possible to find pragmatic solutions that build on existing best practices; there is no need to completely reinvent the wheel.

With our new poster "Four Pillars of Digitalization based on Oracle Cloud" (download it here), we try to deliver a set of harmonized reference models that we evolved from our practical experience while conceiving modern, future-oriented solutions in the areas of modern application design, integrative architectures, modern infrastructure solutions, and analytical architectures. The guiding principle that underlies our architectural thinking is: Design for Change. If you want to learn more, refer to our corresponding ebook (find the ebook here; only available in German at the moment).

The technological base for modern application architectures today is usually built on cloud services, where the offerings of different vendors are constantly growing. Here it is important to know which cloud services are the right ones to implement a specific use case. Our poster "Four Pillars of Digitalization based on Oracle Cloud" shows the respective cloud services of our strategic partner Oracle that can be used to address specific challenges in the area of digitalization. Get the poster here.

 

Developer Partner Community

For regular information, become a member of the Developer Partner Community: please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

Blog Twitter LinkedIn Forum Wiki

Technorati Tags: PaaS,Cloud,Middleware Update,WebLogic, WebLogic Community,Oracle,OPN,Jürgen Kress

Oracle API Platform Cloud Service Overview by Rolando Carrasco

Sat, 2018-05-19 03:25


  • Oracle API Platform Cloud Services - API Design: the first video of the series, showcasing the usage of Oracle API Platform Cloud Services.
  • Oracle API Platform Cloud Services - API Management Part 1 of 2: the second video of the series, showing the usage of the brand new Oracle API Platform CS; this is part one of API Management.
  • Oracle API Platform Cloud Services - API Management Part 2: the third video of the series, covering the second part of the API Management functionality, focused on Documentation.
  • Oracle API Platform CS - How to create an app: the fourth video of the series, in which you will learn how to create an application.
  • Oracle API Platform Cloud Services - API Usage: the fifth video of the series, showcasing how to interact with the APIs that are deployed in APIPCS.

 


Why are Universal Cloud Credit and Bring Your Own License a great opportunity for Oracle Partners?

Sat, 2018-05-19 03:24

Oracle has simplified buying and consuming PaaS and IaaS cloud services. Customers can now purchase Universal Cloud Credits, which can be spent on any IaaS or PaaS service. Partners can start a PoC or project, e.g. with Application Container Cloud Service, and add additional services when required, e.g. Chatbot Cloud Service. The customer can use the universal cloud credits for any available or even upcoming IaaS and PaaS services.

Thousands of customers use Oracle Fusion Middleware and databases today. With Bring Your Own License, they can easily move workloads to the cloud. As they already own the license, the customer pays only a small uplift for the service portion of PaaS. This is a major opportunity for Oracle partners to offer services to these customers.

To learn more about Universal Cloud Credits and Bring Your Own License, attend the free on-demand training here.

 


Event Hub Cloud Service. Hello world

Sat, 2018-05-19 00:46

A while back, I wrote a blog about the Oracle Reference Architecture and the concepts of Schema on Read and Schema on Write. Schema on Read is well suited to a Data Lake, which may ingest any data as-is, without any transformation, and preserve it for a long period of time.

At the same time, you have two types of data: streaming and batch. Batch data could be log files or RDBMS archives. Streaming data could come from IoT devices, sensors, or GoldenGate replication logs.

Apache Kafka is a very popular engine for acquiring streaming data. It has multiple advantages, like scalability, fault tolerance, and high throughput. Unfortunately, Kafka is hard to manage. Fortunately, the cloud simplifies many routine operations. Oracle has three options for deploying Kafka in the cloud:

1) Use Big Data Cloud Service, where you get a full Cloudera cluster and can deploy Apache Kafka as part of CDH.

2) Event Hub Cloud Service Dedicated. Here you have to specify server shapes and some other parameters, but the rest is done by the cloud automagically.

3) Event Hub Cloud Service. This service is fully managed by Oracle; you don't even need to specify any compute shapes. The only things to do are to tell it how long you need to store data in the topic and how many partitions you need (partitions = performance).

Today, I'm going to tell you about the last option, the fully managed cloud service.

It's really easy to provision: just log in to your Cloud account and choose the "Event Hub" Cloud service.

After this, open the service console:

Next, click on "Create service":

Enter the parameters; the two key ones are Retention Period and Number of Partitions. The first defines how long messages will be stored; the second determines read and write performance.

Then click Next:

Confirm and wait a while (usually not more than a few minutes):

After a short while, you will be able to see the provisioned service:

 

 

Hello world flow.

Today I want to show a "Hello world" flow: how to produce (write) and consume (read) messages with Event Hub Cloud Service.

The flow is (step by step):

1) Obtain OAuth token

2) Produce message to a topic

3) Create consumer group

4) Subscribe to topic

5) Consume message

Now I'm going to show it in some details.

OAuth and Authentication token (Step 1)

To deal with Event Hub Cloud Service, you have to be familiar with the concepts of OAuth and OpenID. If you are not, you can watch this short video or go through this step-by-step tutorial.

In a few words, OAuth is a token-based authorization method (the token tells what I may access) used to restrict access to resources.

One of the main ideas is to decouple the user (a real human, the Resource Owner) from the Application (the Client). The user knows the login and password, but the Client (Application) does not use them every time it needs to reach the Resource Server (which holds the info or content). Instead, the Application obtains an authorization token once and uses it when working with the Resource Server. That's the brief version; here you may find a more detailed explanation of what OAuth is.

Obtain Token for Event Hub Cloud Service client.

As you can see, to get access to the Resource Server (read: Event Hub messages) you need to obtain an authorization token from the Authorization Server (read: IDCS). Here, I'd like to show the step-by-step flow for obtaining this token. I will start from the end and show the command (REST call) that you have to run to get the token:

#!/bin/bash
curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

As you can see, many parameters are required to obtain an OAuth token.

Let's take a look at where you can get them. Go to the service and click on the topic you want to work with; there you will find the IDCS Application. Click on it:

After clicking on it, you will be redirected to the IDCS Application page. Most of the credentials can be found here. Click on Configuration:

On this page you will find the Client ID and Client Secret right away (think of them as a login and password):

 

Scroll down and find the section called Resources:

Click on it, and you will find another two variables that you need for the OAuth token: Scope and Primary Audience.

One more required parameter, IDCS_URL, can be found in your browser:

Now you have almost everything you need, except the login and password. These are your Oracle Cloud login and password (what you use when logging in to http://myservices.us.oraclecloud.com):

Now you have all the required credentials and are ready to write a script that automates all of this:

#!/bin/bash
export CLIENT_ID=7EA06D3A99D944A5ADCE6C64CCF5C2AC_APPID
export CLIENT_SECRET=0380f967-98d4-45e9-8f9a-45100f4638b2
export THEUSERNAME=john.dunbar
export THEPASSWORD=MyPassword
export SCOPE=/idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export PRIMARY_AUDIENCE=https://7EA06D3A99D944A5ADCE6C64CCF5C2AC.uscom-central-1.oraclecloud.com:443
export THESCOPE=$PRIMARY_AUDIENCE$SCOPE
export IDCS_URL=https://idcs-1d6cc7dae45b40a1b9ef42c7608b9afe.identity.oraclecloud.com
curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

After running this script, you will have a new file called access_token.json. The access_token field is what you need:

$ cat access_token.json {"access_token":"eyJ4NXQjUzI1NiI6InVUMy1YczRNZVZUZFhGbXFQX19GMFJsYmtoQjdCbXJBc3FtV2V4U2NQM3MiLCJ4NXQiOiJhQ25HQUpFSFdZdU9tQWhUMWR1dmFBVmpmd0UiLCJraWQiOiJTSUdOSU5HX0tFWSIsImFsZyI6IlJTMjU2In0.eyJ1c2VyX3R6IjoiQW1lcmljYVwvQ2hpY2FnbyIsInN1YiI6ImpvaG4uZHVuYmFyIiwidXNlcl9sb2NhbGUiOiJlbiIsInVzZXJfZGlzcGxheW5hbWUiOiJKb2huIER1bmJhciIsInVzZXIudGVuYW50Lm5hbWUiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwic3ViX21hcHBpbmdhdHRyIjoidXNlck5hbWUiLCJpc3MiOiJodHRwczpcL1wvaWRlbnRpdHkub3JhY2xlY2xvdWQuY29tXC8iLCJ0b2tfdHlwZSI6IkFUIiwidXNlcl90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsImNsaWVudF9pZCI6IjdFQTA2RDNBOTlEOTQ0QTVBRENFNkM2NENDRjVDMkFDX0FQUElEIiwiYXVkIjpbInVybjpvcGM6bGJhYXM6bG9naWNhbGd1aWQ9N0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMiLCJodHRwczpcL1wvN0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMudXNjb20tY2VudHJhbC0xLm9yYWNsZWNsb3VkLmNvbTo0NDMiXSwidXNlcl9pZCI6IjM1Yzk2YWUyNTZjOTRhNTQ5ZWU0NWUyMDJjZThlY2IxIiwic3ViX3R5cGUiOiJ1c2VyIiwic2NvcGUiOiJcL2lkY3MtMWQ2Y2M3ZGFlNDViNDBhMWI5ZWY0MmM3NjA4YjlhZmUtb2VodGVzdCIsImNsaWVudF90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsInVzZXJfbGFuZyI6ImVuIiwiZXhwIjoxNTI3Mjk5NjUyLCJpYXQiOjE1MjY2OTQ4NTIsImNsaWVudF9ndWlkIjoiZGVjN2E4ZGRhM2I4NDA1MDgzMjE4NWQ1MzZkNDdjYTAiLCJjbGllbnRfbmFtZSI6Ik9FSENTX29laHRlc3QiLCJ0ZW5hbnQiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwianRpIjoiMDkwYWI4ZGYtNjA0NC00OWRlLWFjMTEtOGE5ODIzYTEyNjI5In0.aNDRIM5Gv_fx8EZ54u4AXVNG9B_F8MuyXjQR-vdyHDyRFxTefwlR3gRsnpf0GwHPSJfZb56wEwOVLraRXz1vPHc7Gzk97tdYZ-Mrv7NjoLoxqQj-uGxwAvU3m8_T3ilHthvQ4t9tXPB5o7xPII-BoWa-CF4QC8480ThrBwbl1emTDtEpR9-4z4mm1Ps-rJ9L3BItGXWzNZ6PiNdVbuxCQaboWMQXJM9bSgTmWbAYURwqoyeD9gMw2JkwgNMSmljRnJ_yGRv5KAsaRguqyV-x-lyE9PyW9SiG4rM47t-lY-okMxzchDm8nco84J5XlpKp98kMcg65Ql5Y3TVYGNhTEg","token_type":"Bearer","expires_in":604800}

Create a Linux variable for it:

#!/bin/bash
export TOKEN=`cat access_token.json |jq .access_token|sed 's/\"//g'`

Well, now we have an authorization token and can work with our Resource Server (Event Hub Cloud Service).

Note: you can also check the documentation on how to obtain an OAuth token.

Produce Messages (Write data) to Kafka (Step 2)

The first thing we may want to do is produce messages (write data to the Kafka cluster). To make scripting easier, it's best to use environment variables for common resources. For this example, I'd recommend parametrizing the topic's endpoint, the topic name, the accepted content type, and the content type. The content format is completely up to the developer, but you have to consume (read) the same format that you produce (write). The key parameter to define is the REST endpoint. Go to PSM, click on the topic name, and copy everything up to "restproxy":

You will also need the topic name, which you can take from the same window:

Now we can write a simple script to produce one message to Kafka:

#!/bin/bash
export OEHCS_ENDPOINT=https://oehtest-gse00014957.uscom-central-1.oraclecloud.com:443/restproxy
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export CONTENT_TYPE=application/vnd.kafka.json.v2+json
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"records":[{"value":{"foo":"bar"}}]}' \
  $OEHCS_ENDPOINT/topics/$TOPIC_NAME

If everything is fine, the Linux console will return something like:

{"offsets":[{"partition":1,"offset":8,"error_code":null,"error":null}],"key_schema_id":null,"value_schema_id":null}

Create Consumer Group (Step 3)

The first step in reading data from OEHCS is to create a consumer group. We will reuse environment variables from the previous step, but just in case, I'll include them in this script:

#!/bin/bash
export OEHCS_ENDPOINT=https://oehtest-gse00014957.uscom-central-1.oraclecloud.com:443/restproxy
export CONTENT_TYPE=application/vnd.kafka.json.v2+json
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"format": "json", "auto.offset.reset": "earliest"}' \
  $OEHCS_ENDPOINT/consumers/oehcs-consumer-group \
  -o consumer_group.json

This script generates an output file containing variables that we will need to consume messages.

Subscribe to a topic (Step 4)

Now you are ready to subscribe to this topic (export the environment variables if you didn't do so before):

#!/bin/bash
export BASE_URI=`cat consumer_group.json |jq .base_uri|sed 's/\"//g'`
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  -d "{\"topics\": [\"$TOPIC_NAME\"]}" \
  $BASE_URI/subscription

If everything is fine, this request returns nothing.

Consume (Read) messages (Step 5)

Finally, we approach the last step: consuming messages. Again, it's quite a simple curl request:

#!/bin/bash
export BASE_URI=`cat consumer_group.json |jq .base_uri|sed 's/\"//g'`
export H_ACCEPT=application/vnd.kafka.json.v2+json
curl -X GET \
  -H "Authorization: Bearer $TOKEN" \
  -H "Accept: $H_ACCEPT" \
  $BASE_URI/records

If everything works as it's supposed to, you will see output like:

[{"topic":"idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest","key":null,"value":{"foo":"bar"},"partition":1,"offset":17}]

Conclusion

Today we saw how easy it is to create a fully managed Kafka topic in Event Hub Cloud Service, and we took our first steps with it: writing and reading a message. Kafka is a really popular message bus engine, but it's hard to manage. The cloud simplifies this and allows customers to concentrate on the development of their applications.

Here I also want to share some useful links:

1) If you are not familiar with REST APIs, I'd recommend going through this blog.

2) There is an online tool that helps to validate your curl requests.

3) Here you can find some useful examples of producing and consuming messages.

4) If you are not familiar with OAuth, here is a nice tutorial that shows an end-to-end example.

Why Now Is the Time for ERP in the Cloud

Fri, 2018-05-18 20:20

“The movement to cloud is an inevitable destination; this is how computing will evolve over the next several years.” So said Oracle CEO Mark Hurd at Oracle OpenWorld 2017. Based on the results of new research, that inevitability is here, now.

In our first ERP Trends Report, we surveyed more than 400 finance and IT leaders. We found that 76% of respondents said they either have plans for ERP in the cloud or have made the move already. They are recognizing that waiting puts them at a disadvantage; the time to make the move is now.

The majority of respondents cited economic factors as the reason they made the leap, and it’s easy to see why: Nucleus Research recently published a report finding that cloud delivers 3.2x the return on investment (ROI) of on-premises systems, while the total cost of ownership (TCO) is 52% lower.

But even more surprising were the benefits realized once our survey respondents got to the cloud. An astonishing 81% cited “Staying current on technology” as the main benefit of moving to cloud ERP. With a regular cadence of innovation delivered by the cloud, it is easier for companies to quickly incorporate game-changing technologies into everyday business processes—technologies like artificial intelligence, machine learning, the Internet of Things (IoT), blockchain and more. In the cloud, the risk of running their businesses on obsolete technology drops to zero. It’s the last upgrade they will ever need.

“One of the key value propositions in engaging with Oracle and implementing the cloud solutions has been the value of keeping current with technology and technological developments,” said Mick Murray, CFO of Blue Shield of California. “In addition to robotics, we’re looking at machine learning and artificial intelligence, and how do we apply that across the enterprise.”

As new capabilities are rolled out, cloud subscribers like Blue Shield can take advantage of them immediately. This gives them the agility to be both responsive and predictive. Uncertainty is the new normal in business and managing amid uncertainty is a must. It’s no longer enough to be quick-to-change; competitive companies must also have reliable insight into how potential future scenarios could impact performance.

So, what does that mean in terms of daily operations? Basically, it means people using knowledge to make good decisions in a fast, productive, and highly automated manner at all levels of the business. Cloud systems provide the data integration and ongoing technology refresh to incorporate best practices and technology advances.

The cloud also makes it easier to integrate external sources of valuable, contextual knowledge that helps improve the accuracy of data models. This is important considering the scope of threats to sustainable operations for businesses with large, global footprints. Political, environmental, and economic factors across multiple regions could impact business, such as limited travel capabilities slowing down delivery of key supplies.

Business uncertainty is everywhere, and organizations must be able to say, “What is our plan if X happens? What is our plan if X, Y, and Z happen, but W doesn’t?” And this insight must come quickly. Business moves too fast for reports to take days to compile.

ERP Replacement Effort Is Not What It Used to Be

One final stone on the scale in favor of ERP cloud is that migrating does not have to be painful. Don’t let memories of past onsite replacements haunt you. With the right products and the right expertise behind them, cloud migrations happen quickly, cause minimal business disruption, and don’t require intense user training.

For example, Blue Shield of California had set aside $600,000 on change management for the adoption of cloud; in the end, they barely spent anything. Change adoption, they reported, happened quickly and seamlessly.

Considering the benefits for cost savings, elimination of technology obsolescence, and ease of adopting emerging technologies, it is becoming harder to justify waiting on migration to cloud ERP. Disruption is not an issue, and long-term cost savings are substantial. Most importantly, modernizing ERP is an opportunity to modernize the business and embed an ever-refreshing technology infrastructure that enables higher performance on multiple levels.

 

7 Machine Learning Best Practices

Fri, 2018-05-18 20:11

Netflix’s famous algorithm challenge awarded a million dollars to the best algorithm for predicting user ratings for films. But did you know that the winning algorithm was never implemented into a functional model?

Netflix reported that the results of the algorithm just didn’t seem to justify the engineering effort needed to bring them to a production environment. That’s one of the big problems with machine learning.

At your company, you can create the most elegant machine learning model anyone has ever seen. It just won’t matter if you never deploy and operationalize it. That's no easy feat, which is why we're presenting you with seven machine learning best practices.

Download your free ebook, "Demystifying Machine Learning"

At the most recent Data and Analytics Summit, we caught up with Charlie Berger, Senior Director of Product Management for Data Mining and Advanced Analytics, to find out more. This article is based on what he had to say.

Putting your model into practice might take longer than you think. A TDWI report found that 28% of respondents took three to five months to put their model into operational use. And almost 15% needed longer than nine months.

Graph on Machine Learning Operational Use

So what can you do to start deploying your machine learning faster?

We’ve laid out our tips here:

1. Don’t Forget to Actually Get Started

In the following points, we’re going to give you a list of different ways to ensure your machine learning models are used in the best way. But we’re starting out with the most important point of all.

The truth is that at this point in machine learning, many people never get started at all. This happens for many reasons. The technology is complicated, the buy-in perhaps isn’t there, or people are just trying too hard to get everything e-x-a-c-t-l-y right. So here’s Charlie’s recommendation:

Get started, even if you know that you’ll have to rebuild the model once a month. The learning you gain from this will be invaluable.

2. Start with a Business Problem Statement and Establish the Right Success Metrics

Starting with a business problem is a common machine learning best practice. But it’s common precisely because it’s so essential and yet many people de-prioritize it.

Think about this quote, “If I had an hour to solve a problem, I’d spend 55 minutes thinking about the problem and 5 minutes thinking about solutions.”

Now be sure that you’re applying it to your machine learning scenarios. Below, we have a list of poorly defined problem statements and examples of ways to define them in a more specific way.

Machine Learning Problem Statements

Think about what your definition of profitability is. For example, we recently talked to a nationwide chain of fast-casual restaurants that wanted to look at increasing their soft drink sales. In that case, we had to consider carefully the implications of defining the basket. Is the transaction a single meal, or six meals for a family? This matters because it affects how you will display the results. You’ll have to think about how to approach the problem and ultimately operationalize it.

Beyond establishing success metrics, you need to establish the right ones. Metrics will help you measure progress, but does improving the metric actually improve the end-user experience? For example, your traditional accuracy measures might encompass precision and squared error. But if you’re trying to create a model that measures price optimization for airlines, those don’t matter if your cost per purchase and overall purchases aren’t improving.

3. Don’t Move Your Data – Move the Algorithms

The Achilles heel in predictive modeling is that it’s a 2-step process. First you build the model, generally on sample data that can run in numbers ranging from the hundreds to the millions. And then, once the predictive model is built, data scientists have to apply it. However, much of that data resides in a database somewhere.

Let’s say you want data on all of the people in the US. That’s more than 300 million people. Where does that data reside? Probably in a database somewhere.

Where does your predictive model reside?

What usually happens is that people take all of their data out of the database so they can run their equations with their model. Then they have to import the results back into the database to make those predictions. That process takes hours and hours, days and days, reducing the efficacy of the models you’ve built.

However, running your equations inside the database has significant advantages. Running the equations through the kernel of the database takes a few seconds, versus the hours it would take to export your data. The database can do all of your math and build the model inside the database, too. This means one world for the data scientist and the database administrator.

By keeping your data within your database, Hadoop, or object storage, you can build models and score within the database, and use R packages with data-parallel invocations. This allows you to eliminate data duplication and separate analytical servers (by not moving data), and lets you score models, embed data prep, build models, and prepare data in just hours.
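As a hedged illustration of the idea (assuming an Oracle database, sqlplus access, and a hypothetical in-database model named churn_model; Oracle's PREDICTION and PREDICTION_PROBABILITY SQL functions do the scoring where the data lives):

#!/bin/bash
# Hypothetical sketch: score rows inside the database instead of exporting them.
# Connection string, table, and model name are all illustrative placeholders.
sqlplus -s analyst/secret@mydb <<'EOF'
SELECT cust_id,
       PREDICTION(churn_model USING *)             AS predicted_class,
       PREDICTION_PROBABILITY(churn_model USING *) AS probability
  FROM customers
 WHERE ROWNUM <= 10;
EOF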

4. Assemble the Right Data

As James Taylor with Neil Raden wrote in Smart Enough Systems, cataloging everything you have and deciding what data is important is the wrong way to go about things. The right way is to work backward from the solution, define the problem explicitly, and map out the data needed to populate the investigation and models.

And then, it’s time for some collaboration with other teams.

Machine Learning Collaboration Teams

Here’s where you can potentially start to get bogged down. So we will refer to point number 1, which says, “Don’t forget to actually get started.” At the same time, assembling the right data is very important to your success.

To figure out the right data to use to populate your investigation and models, you will want to talk to people in the three major areas: business domain, information technology, and data analysis.

Business domain—these are the people who know the business.

  • Marketing and sales
  • Customer service
  • Operations

Information technology—the people who have access to data.

  • Database administrators

Data Analysts—the people who know the data.

  • Statisticians
  • Data miners
  • Data scientists

You need their active participation. Without it, you’ll get comments like:

  • These leads are no good
  • That data is old
  • This model isn’t accurate enough
  • Why didn’t you use this data?

You’ve heard it all before.

5. Create New Derived Variables

You may think, I have all this data already at my fingertips. What more do I need?

But creating new derived variables can help you gain much more insightful information. For example, you might be trying to predict the number of newspapers and magazines sold the next day. Here’s the information you already have:

  • Brick-and-mortar store or kiosk
  • Sell lottery tickets?
  • Amount of the current lottery prize

Sure, you can make a guess based off that information. But if you’re able to first compare the amount of the current lottery prize versus the typical prize amounts, and then compare that derived variable against the variables you already have, you’ll have a much more accurate answer.

6. Consider the Issues and Test Before Launch

Ideally, you should be able to A/B test with two or more models when you start out. Not only will you know whether you’re doing it right, but you’ll also feel more confident knowing that you are.

But going beyond thorough testing, you should also have a plan in place for when things go wrong; for example, when your metrics start dropping. Several things go into this. You’ll need an alert of some sort to ensure that drops can be looked into ASAP. And when a VP comes into your office wanting to know what happened, you’re going to have to explain it to someone who likely doesn’t have an engineering background.

Then of course, there are the issues you need to plan for before launch. Complying with regulations is one of them. For example, let’s say you’re applying for an auto loan and are denied credit. Under the new regulations of GDPR, you have the right to know why. Of course, one of the problems with machine learning is that it can seem like a black box and even the engineers/data scientists can’t say why certain decisions have been made. However, certain companies will help you by ensuring your algorithms will give a prediction detail.

7. Deploy and Automate Enterprise-Wide

Once you deploy, it’s best to go beyond the data analyst or data scientist.

What we mean by that is: always, always think about how you can distribute predictions and actionable insights throughout the enterprise. Data is valuable because of where it is and when it’s available, not merely because it exists. You don’t want to be the one sitting in the ivory tower, occasionally sprinkling down insights. You want to be everywhere, with everyone asking for more insights. In short, you want to make sure you’re indispensable and extremely valuable.

Given that we all have only so much time, it’s easiest if you can automate this. Create dashboards. Incorporate these insights into enterprise applications. See if you can become part of customer touchpoints, like an ATM recognizing that a customer regularly withdraws $100 every Friday night and $500 after every payday.
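
One way to make scores consumable by dashboards and enterprise applications is a small scoring service. The sketch below uses Flask with a stubbed model; the framework, route, and payload shape are illustrative choices, not a prescribed architecture:

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/predict", methods=["POST"])
    def predict():
        payload = request.get_json() or {}
        # A real service would call model.predict(...) here; the score
        # is stubbed for illustration.
        score = 0.42
        return jsonify({"customer_id": payload.get("customer_id"),
                        "score": score})

    if __name__ == "__main__":
        app.run(port=8080)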

Conclusion

Here are the core ingredients of good machine learning. You need good data, or you’re nowhere. You need to put it somewhere, like a database or object storage. You need deep knowledge of the data and what to do with it, whether that’s creating new derived variables or picking the right algorithms to make use of them. Then you need to actually put the models to work, extract great insights, and spread them across the organization.

The hardest part is launching your machine learning project. We hope this article has helped you with the steps to success. If you have any other questions, or you’d like to see our machine learning software, feel free to contact us.

You can also refer back to some of the articles we’ve written on machine learning best practices and the challenges involved. Or download your free ebook, "Demystifying Machine Learning."


Announcing PeopleSoft Cloud Manager Support for Oracle Cloud Infrastructure

Fri, 2018-05-18 19:45

Oracle released PeopleSoft Cloud Manager in 2017 featuring in-depth automation to help accelerate adoption of Oracle Cloud (Classic) as an efficient deployment platform for PeopleSoft customers. With the excitement generated around Oracle Cloud Infrastructure (OCI)--a cloud designed for the enterprise customer--several customers and partners have been looking forward to taking advantage of the enhanced OCI with PeopleSoft Cloud Manager.  Oracle is pleased to announce Cloud Manager’s support for OCI beginning with today’s release of PeopleSoft Cloud Manager Version 6.

So, what is new and exciting in PeopleSoft Cloud Manager Version 6?  For the first time, there are two images provided: one for OCI, and the other for OCI Classic.  The Cloud Manager Image 6 for OCI supports a number of OCI features, including Regions, Virtual Cloud Networks, Subnets, Compute and DB System platforms.  With this image, instances will be provisioned on VM shapes.  Customers can lift and shift PeopleSoft environments from on-premises to OCI using the same approach they used for OCI Classic.

For PeopleSoft Cloud Manager on OCI Classic, we have enabled support for the lift and shift of on-premises databases encrypted with Oracle Transparent Data Encryption (TDE).  TDE offers another level of data security that customers are looking for as data is migrated to the cloud.  A ‘Clone to template’ option is also available for encrypted databases. 

The lift utility requires a few parameters for TDE so that the encrypted database may be packaged and lifted to the cloud.

During the shift process, the same parameters are required to deploy the lifted database.

Customers have also requested an enhancement to support non-Unicode databases for PeopleSoft environments.  PeopleSoft Cloud Manager Version 6 supports lift and shift of environments that use non-Unicode databases.  Unlike with Image 5, converting the on-premises database to Unicode is no longer required before using Cloud Manager’s Lift and Shift automation.

To get your hands on the new Cloud Manager images, go to the Oracle Marketplace and look for either the OCI-Classic image or the OCI image…or try both!   Be sure to review the documentation and additional important information mentioned in the Marketplace listings.

We are excited to combine the automation of provisioning and maintenance that PeopleSoft Cloud Manager provides with the robust benefits of Oracle Cloud Infrastructure.  By combining support for OCI with the additional features for non-Unicode databases and TDE-encrypted databases, we expect all customers to benefit from the latest Cloud Manager image, using whichever Oracle Cloud is right for them.

Stay tuned for additional information and more Cloud Manager features.  Now, off to the next image!


Emerging Tech Helps Progressive Companies Deliver Exceptional CX

Fri, 2018-05-18 19:18

It’s no secret that the art of delivering exceptional service to customers—whether they’re consumers or business buyers—is undergoing dramatic change. Customers routinely expect highly personalized experiences across all touchpoints, from marketing and sales to service and support. I call each of these engagements a moment of truth—because leaving customers feeling satisfied and valued at each touchpoint will have a direct bearing on their loyalty and future spending decisions.

This is why customer experience (CX) has become a strategic business imperative for modern companies. Organizations that provide effective, well-integrated CX across the entire customer journey achieved compound annual growth rates of 17%, versus the 3% growth rates logged by their peers who provided less-effective customer experiences, according to Forrester’s 2017 “Customer Experience Index.”

Fortunately, it’s becoming easier to enter the CX winner’s circle. AI, machine learning, IoT, behavioral analytics, and other innovations are helping progressive companies capitalize on internal and third-party data to deliver highly personalized communications, promotional offers, and service engagements.

How can companies fully leverage today’s tools to support exceptional CX? If they haven’t already done so, companies should start evolving away from cloud 1.0 infrastructures, where an amalgam of best-of-breed services runs various business units. These standalone cloud platforms might have initially provided quick on-ramps to modern capabilities, but now, many companies are paying a price for that expediency. Siloed data and workflows hinder the smooth sharing of customer information among departments. This hurts CX when a consumer who just purchased a high-end digital camera at a retail outlet, for example, webchats with that same company’s service department about a problem, and the service team has no idea this is a premium customer.

In contrast, cloud 2.0 is focused on achieving a holistic view of customers—thanks to simplified, well-integrated services that support each phase of the customer journey. Eliminating information silos benefits companies by giving employees all the information they need to provide a tailored experience for every customer.

Achieving modern CX requires the right vendor partnerships. That starts with evaluating cloud services according to how complete, integrated, and extensible the CX platform is for supporting the entire customer journey. One option is the Oracle Customer Experience Cloud (Oracle CX Cloud) suite, an integrated set of applications for the entire customer lifecycle. It’s complemented by native AI capabilities and Oracle Data Cloud, the world’s largest third-party data marketplace of consumer and business information, which manages anonymized information from more than a billion business and 5 billion consumer identifiers. This means that business leaders, besides understanding customers based on their direct interactions, can use Oracle Data Cloud for insights into social, web surfing, and buying habits at third-party sites and retailers and then apply AI to find profitable synergies.

As new disruptive technologies come to the market—whether that’s the mainstreaming of IoT or drones for business—companies will be under constant pressure to integrate these new capabilities to improve their CX strategies. Modern, integrated cloud services designed for CX don’t support just today’s innovations. With the right cloud choices, companies can continually evolve to meet tomorrow’s CX challenges.

(Photo of Des Cahill by Bob Adler, The Verbatim Agency)

5 Subjects Every Computer Science Student Should Learn

Fri, 2018-05-18 18:55

I was fortunate this year to attend the Association for Computing Machinery’s SIGCSE (Special Interest Group on Computer Science Education) conference, where there was a good deal of conversation about what a modern computer science curriculum should include.

Technology changes quickly and it can be difficult for academic programs to keep pace. Still, if computer science students are to contribute meaningfully to the field in either industry or research jobs, it’s critical that they learn modern computing skills. Here are five subjects I think every higher education institution should teach their undergraduate computer science majors:

1. Parallel Programming

The single, standalone server with one CPU has gone the way of the dodo bird, displaced by the cloud, server farms and multithreaded parallel processors. Yet colleges and universities are still mainly teaching their undergraduates sequential programming—programs that execute instructions one after the other—as they have for decades.

Modern computing environments and massive data sets demand not just that we process multiple instructions simultaneously across multiple servers (distributed computing), but also that programs be written to process multiple instructions simultaneously on multicore chips within multiple servers and devices.

Too often, parallel programming is relegated to a single chapter in a textbook, easily skipped when time in the semester runs short. To prepare students for high-performance computing, big data, machine learning, blockchain and more, we must teach them to both think and program in parallel.
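
As a small illustration of the mindset shift, here is a Python sketch that splits one job across CPU cores instead of running it sequentially; the prime-counting task is just a stand-in workload:

    from concurrent.futures import ProcessPoolExecutor

    def count_primes(bounds):
        lo, hi = bounds
        def is_prime(n):
            return n > 1 and all(n % d for d in range(2, int(n**0.5) + 1))
        return sum(is_prime(n) for n in range(lo, hi))

    if __name__ == "__main__":
        # Carve the range into chunks that run on separate cores.
        chunks = [(i, i + 250_000) for i in range(2, 1_000_002, 250_000)]
        with ProcessPoolExecutor() as pool:
            total = sum(pool.map(count_primes, chunks))
        print(total)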

2. Green Programming

With the ubiquity of battery-driven computers, energy efficiency is more important than ever. The more we ask our smart devices to do, the more energy they need to do it and the more quickly they exhaust their batteries. The same is true for massive server clusters, where fires related to energy consumption are not uncommon as we demand faster and faster processing of more and more data.

How you architect a software program directly affects how much energy is needed to execute the program, yet few undergraduate programs teach students about this relationship. In a fast-warming world, one in which we dream big dreams about all the ways artificial intelligence and high-performance computing will make our lives better, it is imperative that we write energy-optimized software. Students will not be able to do that if we don’t teach them how.
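
A toy example of that relationship: two programs that compute the same answer can do wildly different amounts of work, and the extra CPU cycles are wasted energy. (Real energy profiling uses hardware counters; elapsed time is only a rough proxy.)

    import timeit

    def sum_loop(n):          # O(n): ten million additions
        total = 0
        for i in range(n):
            total += i
        return total

    def sum_formula(n):       # O(1): one multiply and one divide
        return n * (n - 1) // 2

    assert sum_loop(10_000_000) == sum_formula(10_000_000)
    print(timeit.timeit(lambda: sum_loop(10_000_000), number=1))
    print(timeit.timeit(lambda: sum_formula(10_000_000), number=1))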

3. Collaborative Development

Academia persists in trying to measure what individual students know. In most programming classes, students start from a blank screen and write clean code independently or, less often, with a partner.

But this isn’t how software is engineered in the real world. Professional software engineers almost always start with someone else’s code and work collaboratively in large groups to modify, improve and correct that code, which is then integrated with code written by other engineers in other groups.

It’s common for software development groups to include people from different countries, in different time zones. Working effectively requires team members to communicate well in different languages and across different cultures. It also means that someone else needs to be able to look at your code and know what it does, so following formatting standards and providing clear commenting are critical.

However, in our desire to ensure that each student understands every programming concept and rule of syntax, we overlook opportunities to teach collaborative software development and help students develop critical professional skills.

4. Hardware Architecture

In the minds of most college students, IBM, Intel, and AMD—the inventors and developers of the multicore processor—are old news…old companies founded by old guys. Mobile applications are where the action is.

But mobile apps are driven by data, usually by a lot of data, and they won’t be of much use without the processors, databases and networks that power them.

Computing works and advances based on the entire system, from the power source to the user interface, and students will be more successful if they know how to open the box and “kick the tires.” They can then optimize for energy efficiency and write parallel code that makes use of new hardware architectures. They can manage caching, memory architecture and resource allocation issues. They can explain and explore quantum computing.

Computer science doesn’t stop at software or coding. Students need foundations in hardware architecture, too, including electrical engineering and physics. We need computer scientists who can test and push the boundaries of hardware just as much as they push what can be achieved with software.

5. Computer History and Ethics

Something I heard at the Turing 50th Anniversary celebration last summer has stuck with me: Computing is not neutral. It can be used for good or evil. It can be used to help people and it can be used to manipulate and harm them.

For several decades now, we have been making computing advances for the sake of computing, because what we can make computers do is cool, because the challenge of the next thing is too alluring to pass up, because there is money to be made if we can do “X.”

Just because we can do something with computing, however, doesn’t mean we should. Computing power is so great that we need policies to regulate and manage it, in order to protect and benefit people.

It’s important for students of computing to understand its history and to take courses grounded in ethics so they can make responsible decisions and guide others. They should know computing’s historical villains and heroes, its inventors and detractors, and how it has been used to benefit and hurt people. The old saw applies here: If we do not learn our history, we are doomed to repeat it.

Even in a crowded curriculum, we must ensure students are gaining the skills and knowledge they need to become technology innovators, business leaders and positive contributors to society in the coming decades. This list is only a starting point.

Alison Derbenwick Miller is vice president of Oracle Academy.

How Blockchain Will Disrupt the Insurance Industry

Fri, 2018-05-18 18:49

The insurance industry relies heavily on the notion of trust among transacting parties. For example, when you go to buy car insurance, you are asked for things like your zip code, name, age, daily mileage, and the make and model of your car. Other than, maybe, the make and model of your car, you can pretty much falsify the rest of the information for a better insurance quote. Underwriters trust that you are providing correct information, which is one of the many risks in the underwriting business.

Enterprise blockchain platforms, such as the one from Oracle, essentially enable trust-as-a-service in such interactions. Participants (insurer and insured) need to come together to do business, but they do not necessarily trust each other. Blockchain provides a scalable mechanism to securely and easily enable trust in such scenarios. Four key properties of blockchain enable trust-as-a-service:

  1. Transparency of the digital events and transactions it manages,
  2. Immutability of records stored on the blockchain, through append-only, time-stamped, and hashed records (see the sketch after this list),
  3. Security and assurance that records stored on the blockchain aren't compromised, through built-in consensus and encryption mechanisms,
  4. Privacy through cryptography.
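
To make the immutability property concrete, here is a toy append-only hash chain in Python. It illustrates the idea only; it is not Oracle's blockchain platform, which adds distributed consensus and encryption on top:

    import hashlib, json, time

    def append_block(chain, record):
        """Each entry embeds the hash of the previous one, so altering
        any record breaks every hash that follows it."""
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        block = {"ts": time.time(), "record": record, "prev": prev_hash}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        chain.append(block)

    chain = []
    append_block(chain, {"vehicle": "ABC-123", "miles": 34.2})
    append_block(chain, {"vehicle": "ABC-123", "miles": 12.7})
    print(chain[-1]["hash"])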

Blockchain can be a good solution for a number of insurance use cases such as:

  • Reducing fraud in underwriting and claims by validating data from customers and suppliers in the value chain
  • Reducing claims by offering tokenized incentives for safer driving, based on data captured from insured entities like motor vehicles
  • Enabling pay-per-mile insurance billing by keeping verifiable records of miles traveled
  • And, in the not-so-distant future, determining liability in an accident between two autonomous vehicles, using blockchain to keep timestamped, immutable records of the decisions made by each vehicle's deep-learning models right before the accident

Beyond these use cases, blockchain has the potential to eliminate intermediaries, improve the transparency of records, and do away with manual paperwork and error-prone processes, which together can deliver orders-of-magnitude improvements in operational efficiency for businesses. Of course, there are other types of insurance, such as healthcare, reinsurance, catastrophic-event insurance, and property and casualty insurance, each with its own flavor of use cases, but they would similarly benefit from blockchain to reduce risk and improve business efficiency.

There is no question that blockchain can potentially be a disruptive force in the insurance industry. It will, however, have to overcome legal and regulatory barriers before we see mass adoption among industry participants.

If you are working on an interesting project related to the use of blockchain in the insurance industry, feel free to get in touch by leaving a comment, or contact us through social media or your Oracle sales rep. We’d be glad to help you connect with our subject matter experts and with industry peers who may be working on similar use cases with Oracle. For more information on Oracle Blockchain, please visit the Oracle Blockchain home pages here and here.


Oracle's No-Cost Platinum-Level Support Is the New Baseline in the Cloud Market

Fri, 2018-05-18 18:36

Companies may give up their servers, storage, and entire data centers when they move to the cloud, but their need for support services doesn’t go away; it changes. Recognizing a growing need for enterprise-class support in the cloud, Oracle is making its Platinum-level support services available at no additional cost to all customers of Oracle Fusion software-as-a-service applications.

“Our objective is to put out a service capability that is simply the best—bar none,” said Oracle CEO Mark Hurd, in announcing that a range of support services would be available for Oracle Fusion enterprise resource planning, enterprise performance management, human capital management, supply chain, manufacturing, and sales and service cloud applications.

The SaaS support services include 24/7 rapid-response technical support, proactive technical monitoring, success planning, end-user adoption guidance, and education resources.

“Most of our customers are going to cloud,” Hurd said in a briefing with journalists at Oracle’s headquarters in Redwood Shores, California. As that happens, he said, “it’s important for someone in the industry, particularly an industry leader in these mission-critical applications, to take a position” on what level of service that transition demands.

“SaaS application support offerings need to become more agile and responsive,” Hurd added. “We need to provide our SaaS customers with everything they need for rapid, low-cost implementations and a successful rollout to their users.”

Catherine Blackmore, Oracle group vice president of North America Customer Success, said Oracle will also offer new advanced services, including dedicated support and certified expertise, for customers that need a higher level of support. “We have a shared interest in our customers’ success, so we’re going above and beyond to ensure our customers have everything they need to succeed,” she said.

Cloud Levels the Playing Field

Oracle also announced the names of first-time cloud customers and others that are expanding their use of Oracle Cloud services. They include Alsea, Broadcom, Exelon, Gonzaga University, Heineken Urban Polo, Providence St. Joseph Health, Sinclair Broadcast Group, and T-Mobile US.

In a Q&A with the journalists, Hurd was asked about his outlook for SaaS adoption outside of the United States and, in particular, in the Latin America region. He said modern cloud applications can be “game changing” for businesses in places where outdated software applications are still the norm.

“You don’t need armies of experts and system integrators,” Hurd said.

Oracle develops thousands of features that are made available regularly to its SaaS application customers. “That’s a feature stream you don’t have to manage from a data center that you don’t have to operate,” Hurd said.

Self-Driving Technology

Hurd pointed to Oracle’s development of autonomous technologies, including the recently introduced Oracle Autonomous Data Warehouse Cloud Service, as another big area of focus at the company. “It gets upgraded, optimized, secured, patched, and tuned, all automatically without any human intervention,” he said.

As the next step in the delivery of autonomous cloud services, Oracle announced the availability of three new Oracle Cloud Platform services with built-in artificial intelligence and machine learning algorithms: Oracle Autonomous Analytics Cloud, Oracle Autonomous Integration Cloud, and Oracle Autonomous Visual Builder Cloud.

Modern Customer Experience 2018 was Legendary

Fri, 2018-05-18 17:53

During his keynote at Modern Customer Experience 2018, Des Cahill, Head CX Evangelist, said that CX should stand for Continuous Experimentation. He encouraged 4,500 enthusiastic marketing, customer service, sales, and commerce professionals to try new strategies, take risks, strive to be remarkable, and triumph through sheer determination.

Casey Neistat echoed Des, challenging us to “do what you can’t,” while best-selling author Cheryl Strayed inspired us to look past our fears and be brave. “Courage isn’t success,” she reminded us, “it’s doing what’s hard regardless of the outcome.”

CX professionals today face numerous challenges: the relentless rise of customer expectations, the accelerating pace of innovation, evolving regulations like GDPR, demands to increase ROI, and the constant pressure to raise the bar. Modern Customer Experience not only inspired attendees to become the heroes of their organizations, it also armed them with the tools to do so.

If you missed Carolyne Matseshe-Crawford, VP of Fan Experience at Fanatics, talking about how her company’s culture pervades the entire customer experience, or Magen Hanrahan, VP of Product Marketing at Kraft Heinz, on her obsession with data-driven marketing tactics, give their sessions a watch. And don’t miss Comcast’s Executive VP and Chief Customer Experience Officer, Charlie Herrin, who wants to use artificial intelligence to build proactive customer experience and dialogue into Comcast’s products themselves.

The Modern Customer Experience X Room showcased CX innovation such as augmented and virtual reality, artificial intelligence, and the Internet of Things. But it wasn’t all just mock-ups and demos: a Mack truck, a Yamaha motorcycle, and an Elgin street sweeper were on display, showcasing how Oracle customers put innovation to use to create legendary customer experiences.

Attendees were able to let off some steam during morning yoga and group runs. They relived the ’90s with Weezer during CX Fest, and our Canine Heroes from xxxxx were a highlight of everyone’s day.

But don’t just take it from us. Here’s what a few of our attendees had to say about the event.

“Modern Customer Experience gives me the ability to learn about new products on the horizon, discuss challenges, connect with other MCX participants, learn best practices and understand we’re not alone in our journey.” – Matt Adams, Sales Cloud Manager, ArcBest

 “Modern Customer Experience really allows me to do my job more effectively. Without it, I don’t know where I would be! It’s the best conference of the year.” – Joshua Parker, Digital Marketing and Automation Manager, Rosetta Stone

We’re still soaking it all in. You can watch all the highlights from the Modern Customer Experience keynotes on YouTube and peruse the event’s photo slideshow. Don’t forget to share your images on social media with #ModernCX, and sign up for alerts when registration for Modern Customer Experience 2019 opens!

The Most Important Stop on Your Java Journey

Fri, 2018-05-18 14:52

Howdy, Pardner. Have you moseyed over to JavaRanch lately? Pull up a stool at the OCJA or OCJP Wall of Fame and tell your tale or peruse the tales of others. 

Ok - I'm not so great at the cowboy talk, but if you're serious about a Java career and haven't visited JavaRanch, you are missing out! 

JavaRanch, a self-proclaimed "friendly place for Java greenhorns [beginners]," was created in 1997 by Kathy Sierra, co-author of at least five Java guides for Oracle Press. The ranch was taken over in subsequent years by Paul Wheaton, who continues to run the site today.

In addition to a robust collection of discussion forums about all things Java, JavaRanch provides resources to learn and practice Java, book recommendations, and resources to create your first Java program and test your Java skills.

One of our favorite features of JavaRanch remains the Walls of Fame! This is where you can read the personal experiences of other candidates certified on Java. Learn from their processes and their mistakes. Be inspired by their accomplishments. Share your own experience. 

Visit the Oracle Certified Java Associate Wall of Fame

Visit the Oracle Certified Java Professional Wall of Fame

Get the latest Java Certification from Oracle

Oracle Certified Associate, Java SE 8 Programmer

Oracle Certified Professional, Java SE 8 Programmer

Oracle Certified Professional, Java SE 8 Programmer (upgrade from Java SE 7)

Oracle Certified Professional, Java SE 8 Programmer (upgrade from Java SE 6 and all prior versions)

Related Content

Test Your Java Knowledge With FREE Sample Questions

Program Your Future With Java
