Anthony Shorten

Oracle Blogs

Patches available for Internet Explorer 11 performance

Tue, 2018-08-07 21:45

A number of Oracle Utilities Customer Care and Billing customers have reported performance issues with Internet Explorer 11 in particular situations. After analysis, it was ascertained that the issue was within Internet Explorer itself. An article, Known UI Performance Issues on Internet Explorer 11 (Doc Id: 2430962.1), is available from My Oracle Support with an explanation of the issues and advice on the patches recommended to minimize the issue on affected versions.

It is highly recommended that you read the article and install the patches to minimize any issues with Internet Explorer 11.

Keep up to Date With Critical Patches

Wed, 2018-08-01 20:39

One of the most important recommendations I give to customers is to keep up to date with the latest patches, especially all the security patches, to improve performance and reduce risk.

Patches for Oracle WebLogic, Oracle Linux, Oracle Solaris and Oracle Database all apply to Oracle Utilities products; for more information, refer to the patch information for each of those products on My Oracle Support.

Using Groovy Whitepaper available

Sun, 2018-07-29 18:33

Groovy is an alternative language for building extensions to Oracle Utilities Application Framework based products for both on-premise and cloud implementations. For cloud implementations it is the preferred language, replacing the Java-based extensions typically available to on-premise implementations. The implementation of Groovy in the Oracle Utilities Application Framework extends the scripting object to allow Groovy scripts, Groovy includes and Groovy libraries to be implemented. This is all controlled using a whitelist to ensure that the code is appropriate for a cloud implementation.

A new whitepaper is available outlining the Groovy capability as well as some guidelines on how to use Groovy to extend Oracle Utilities products. It is available as Using Groovy Script in Oracle Utilities Applications (Doc Id: 2427512.1) from My Oracle Support.

New Oracle Utilities Testing Accelerator

Fri, 2018-06-29 14:02

I am pleased to announce the next chapter in automated testing solutions for Oracle Utilities products. In the past, some Oracle Utilities products have used Oracle Application Testing Suite with prebuilt content to provide an amazing functional and regression testing solution. Building upon that success, a new solution named the Oracle Utilities Testing Accelerator has been introduced: a new, optimized and focused solution for Oracle Utilities products.

The new solution has the following benefits:

  • Component Based. As with Oracle's other testing solutions, this new solution is based upon testing components and flows, with flow generation and databank support. Those capabilities were popular with our existing testing solution customers and exist in expanded forms in the new solution.
  • Comprehensive Content for Oracle Utilities. As with Oracle's other testing solutions, supported products provide pre-built content to significantly reduce the cost of adopting automation. In this solution, the number of products within the Oracle Utilities portfolio providing content has greatly expanded. This now includes on-premise products as well as our growing portfolio of cloud based solutions.
  • Self Contained Solution. The Oracle Utilities Testing Accelerator architecture has been simplified to allow customers to quickly deploy the product with the minimum of fuss and prerequisites.
  • Used by Product QA. The Oracle Utilities Product QA teams use this product on a daily basis to verify the Oracle Utilities products. This means that the content provided has been certified for use on supported Oracle Utilities products and reduces risk of adoption of automation.
  • Behavior-Driven Development Support. One of the most exciting capabilities introduced in this new solution is support for Behavior-Driven Development (BDD), which is popular with newer Agile based implementation approaches. One of the major goals of the new testing capability is to reduce rework when turning Agile process artifacts into test assets. This new capability introduces Machine Learning into the testing arena by generating test flows from the Gherkin syntax documentation produced by Agile approaches. A developer can reuse their Gherkin specifications to generate a flow quickly without the need for rework. As the capability uses Machine Learning, it can be corrected if the assumptions it makes are incorrect for the flow, and those corrections will be reused for any future flow generations.

  • Selenium Based. The Oracle Utilities Testing Accelerator uses a Selenium based scripting language for greater flexibility across the different channels supported by the Oracle Utilities products. The script is generated automatically and does not need any alteration to be executed correctly.
  • Data Independence. As with Oracle's other testing products, data is supported independently of the flows and components. This translates into greater flexibility and greater levels of reuse in automated testing. It is possible to change data at any time during the process to explore greater possibilities in testing.
  • Support for Flexible Deployments. Whilst the primary focus of the Oracle Utilities Testing Accelerator is functional and/or regression testing, it supports a range of flexible deployment scenarios.
  • Beyond Functional Testing. The Oracle Utilities Testing Accelerator is designed to be used for testing beyond just functional testing. It can be used to perform testing in flexible scenarios including:
    • Patch Testing. The Oracle Utilities Testing Accelerator can be used to assess the impact of product patches on business processes using the flows as a regression test.
    • Extension Release Testing. The Oracle Utilities Testing Accelerator can be used to assess the impact of releases of extensions from the Oracle Utilities SDK (via the migration tools in the SDK) or after a Configuration Migration Assistant (CMA) migration.
    • Sanity Testing. In the Oracle Cloud, the Oracle Utilities Testing Accelerator is being used to assess the state of a new instance of the product, including its availability and that the necessary data is set up, ensuring the instance is ready for use.
    • Cross Oracle Utilities Product Testing. The Oracle Utilities Testing Accelerator supports flows that cross Oracle Utilities product boundaries to model end to end processes when multiple Oracle Utilities products are involved.
    • Blue/Green Testing. In the Oracle Cloud, zero outage upgrades are a key part of the solution offering. The Oracle Utilities Testing Accelerator supports the concept of blue/green deployment testing, allowing multiple versions to be tested to facilitate smooth upgrade transitions.
  • Lower Skills Required. The Oracle Utilities Testing Accelerator has been designed with testing users in mind. Traditional automation involves recording with a scripting language that embeds the data and logic into a script, which a programmer can then alter to make it more flexible. The Oracle Utilities Testing Accelerator instead uses an orchestration metaphor that allows a lower skilled person, not necessarily a programmer, to build test flows and generate no-touch scripts for execution.
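To illustrate the BDD idea in the list above, the sketch below is purely illustrative and is not the product's Machine Learning implementation: it maps Gherkin steps to hypothetical component names via a simple keyword table, which stands in for the learned (and correctable) mapping the Accelerator maintains.

```python
# Illustrative sketch: turning Gherkin steps into an ordered test flow.
# The component names and keyword rules are hypothetical, not product APIs.

GHERKIN = """\
Feature: Create a new account
  Scenario: Add an account for an existing person
    Given an existing person
    When I create an account for that person
    Then the account is visible on the person's record
"""

# Stand-in for the learned step-to-component mapping; in the real tool,
# corrections to this mapping are reused for future flow generations.
KEYWORD_TO_COMPONENT = {
    "person": "C1-Person",
    "account": "C1-Account",
}

def generate_flow(feature_text):
    """Extract Given/When/Then/And steps and map each one to a component."""
    flow = []
    for line in feature_text.splitlines():
        step = line.strip()
        if step.split(" ", 1)[0] in ("Given", "When", "Then", "And"):
            component = next(
                (comp for key, comp in KEYWORD_TO_COMPONENT.items()
                 if key in step.lower()),
                "UNRESOLVED")   # a human would correct unresolved steps
            flow.append((step, component))
    return flow

for step, component in generate_flow(GHERKIN):
    print(f"{component:12} <- {step}")
```

The point of the sketch is the shape of the process (specification in, ordered component flow out), not the matching technique itself.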

An example of the Oracle Utilities Testing Accelerator Workbench:

New Architecture

The Oracle Utilities Testing Accelerator has been re-architected to be optimized for use with Oracle Utilities products:

  • Self Contained Solution. The new design is centered on simplicity; as far as possible, the solution works out of the box with minimal configuration.
  • Minimal Prerequisites. The Oracle Utilities Testing Accelerator only requires Java to execute and a database schema to store its data. Non-production allocations under existing Oracle Utilities product licenses are sufficient for this solution; no additional database licenses are required by default.
  • Runs on same platforms as Oracle Utilities applications. The solution is designed to run on the same operating system and database combinations supported with the Oracle Utilities products.

The architecture is simple:

UTA Architecture

  • Product Components. A library of components from the Product QA teams ready to use with the Oracle Utilities Testing Accelerator. You decide which libraries you want to enable.
  • Oracle Utilities Testing Accelerator Workbench. A web based design toolset to manage and orchestrate your test assets. Includes the following components:
    • Embedded Web Application Server. A preset simple configuration and runtime to house the workbench.
    • Testing Dashboard. A new home page outlining the state of the components and flows installed as well as notifications for any approvals and assets ready for use.
    • Component Manager. A Component Manager to allow you to add custom components and manage the components available for use in flows.
    • Flow Manager. A Flow Manager allowing testers to orchestrate flows and manage their lifecycle, including generation of Selenium assets for execution.
    • Script Management. A script manager used to generate scripts and databanks for flows.
    • Security. A role based model to support administration, development of components/flows and approvals of components/flows.
  • Oracle Utilities Testing Accelerator Schema. A set of database objects that can be stored in any edition of Oracle (PDB or non-PDB is supported) for storing assets and configuration.
  • Oracle Utilities Testing Accelerator Eclipse-based Plugin. An Oxygen compatible Eclipse plugin that executes the tests, including recording of performance and payloads for detailed test analysis.

New Content

The Oracle Utilities Testing Accelerator has expanded the number of products supported and now includes Oracle Utilities Application Framework based products and Cloud Services products. New content will be released on a regular basis to provide additional component coverage and a set of prebuilt flows that can be used across products.

Note: Refer to the release notes for supported Oracle Utilities products and assets provided.


The Oracle Utilities Testing Accelerator provides a comprehensive testing solution, optimized for Oracle Utilities products, with content provided by Oracle to allow implementations to realize lower cost and lower risk adoption of automated testing.

For more information about this solution, refer to the Oracle Utilities Testing Accelerator Overview and Frequently Asked Questions (Doc Id: 2014163.1) available from My Oracle Support.

Note: The Oracle Utilities Testing Accelerator is a replacement for the older Oracle Functional Testing Advanced Pack for Oracle Utilities. Customers on that product should migrate to this new platform. Utilities to convert any custom components from the Oracle Application Testing Suite platform are provided with this tool.

Updated Technical Best Practices

Tue, 2018-06-26 04:13

The Oracle Utilities Application Framework Technical Best Practices have been revamped and updated to reflect new advice, new versions and the cloud implementations of the Oracle Utilities Application Framework based products. The following changes have been made:

  • Formatting change. The whitepaper uses a new template for the content which is being rolled out across Oracle products.
  • Removed out of date advice. Advice that applied to older versions and is no longer appropriate has been removed from the document. This is ongoing work to keep the whitepaper current and optimal.
  • Added Configuration Migration Assistant advice. With the increased emphasis on the use of CMA, we have added a section outlining some techniques for optimizing the use of CMA in any implementation.
  • Added Optimization Techniques advice. With our cloud implementations, there are various techniques we use to reduce costs and risks on that platform. We have added a section outlining some common techniques that can be reused for on-premise implementations. This is based upon a series of talks given at customer forums over the last year or so.
  • Added Preparing Your Implementation for the Cloud advice. This is a new section outlining the various techniques that can be used to prepare an on-premise implementation for moving to the Oracle Utilities Cloud SaaS Services. This is based upon the same series of talks.

The new version of the whitepaper is available at Technical Best Practices (Doc Id: 560367.1) from My Oracle Support.

Oracle WebLogic 12.2.1.x Configuration Guide for Oracle Utilities available

Thu, 2018-06-21 19:06

A new whitepaper is now available for use with Oracle Utilities Application Framework based products that support Oracle WebLogic 12.2.1.x and above. The whitepaper walks through the setup of the domain using the Fusion Domain Templates instead of the templates supplied with the product. In future releases of the Oracle Utilities Application Framework, product-specific domain templates will not be supplied, as the Fusion Domain Templates take a more prominent role in deploying Oracle Utilities products.

The whitepaper covers the following topics:

  • Setting up the Domain for Oracle Utilities products
  • Additional Web Services configuration
  • Configuration of Global Flush functionality in Oracle WebLogic 12.2.1.x
  • Frequently asked installation questions

The whitepaper is available as Oracle WebLogic 12.2.1.x Configuration Guide (Doc Id: 2413918.1) from My Oracle Support.

Oracle Utilities and the Oracle Database In-Memory Option

Tue, 2018-05-29 20:09

A few years ago, Oracle introduced an In-Memory option for the database to optimize analytical style applications. In Oracle Database 12c and above, the In-Memory option has been enhanced to support other types of workloads. All Oracle Utilities products are now certified to use the Oracle In-Memory option, on Oracle Database 12c and above, to allow customers to optimize the operational and analytical aspects of the products.

The Oracle In-Memory option is a memory-based column store that co-exists with the existing caching schemes used within Oracle to deliver faster access speeds for complex queries across the products. It is transparent to the product code and can be implemented with a few simple changes to the database specifying the objects to store in memory. Once configured, the Oracle Cost Based Optimizer becomes aware of the data loaded into memory and adjusts the execution plan accordingly, delivering much better performance in almost all cases.

Only a few configuration changes need to be made:

  • Enable the In-Memory Option. The In-Memory capability is already in the database software (no relinking necessary) but it is disabled by default. After licensing the option, you enable it by setting the amount of the SGA you want to use for the In-Memory store. Remember to ensure that the SGA is large enough to cover the existing memory areas as well as the In-Memory store. This is done by setting a few database initialization parameters.
  • Enable Adaptive Plans. To tell the optimizer that you now want it to take the In-Memory option into account, you need to enable Adaptive Plans. This is flexible: you can turn off the In-Memory support without changing the In-Memory settings.
  • Decide the Objects to Load into Memory. Now that the In-Memory option is enabled, the next step is to decide what is actually loaded into memory. Oracle provides an In-Memory Advisor that analyzes workloads to make suggestions.
  • Alter Objects to Load into Memory. Create the SQL DDL statements that instruct the database to load the chosen objects into memory. These include priority and compression options for the objects to maximize the flexibility of the option. The In-Memory Advisor can be configured to generate these statements from its analysis.
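As a rough sketch of the steps above, the statements look broadly like the following. The store size, table name, priority and compression level are illustrative placeholders only; consult the whitepaper and the In-Memory Advisor output for values appropriate to your system.

```sql
-- Enable the In-Memory store (requires licensing the option first);
-- the size shown here is purely illustrative.
ALTER SYSTEM SET inmemory_size = 4G SCOPE=SPFILE;

-- Make the optimizer take the In-Memory store into account.
ALTER SYSTEM SET optimizer_inmemory_aware = TRUE;

-- Load a chosen object into memory with a priority and compression level
-- (ci_bill is an example table name, not a recommendation).
ALTER TABLE ci_bill INMEMORY PRIORITY HIGH MEMCOMPRESS FOR QUERY LOW;
```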

No changes to the code are necessary to use the option to speed up common queries in the products and analytical queries.

A new whitepaper, Implementing Oracle In-Memory Option (Doc Id: 2404696.1), has been published and is available from My Oracle Support. It outlines the details of this process as well as specific guidelines for implementing this option.

PS. The Oracle In-Memory Option has been significantly enhanced in Oracle Database 18c.


Data Management with Oracle Utilities products

Mon, 2018-05-28 19:46

One of the most common questions I receive is how to manage data volumes in the Oracle Utilities products. The Oracle Utilities products are designed to scale no matter how much data is present in the database, but obviously the cost of storing and managing large amounts of data is not optimal.

A few years ago we adopted the Information Lifecycle Management (ILM) capabilities of the Oracle Database as well as developed a unique spin on the management of data. Like biological life, data has a lifecycle. It is born when it is created, it has an active life while the business uses or manipulates it, it goes into retirement but is still accessible and eventually it dies when it is physically removed from the database. The length of that lifecycle will vary from data type to data type, implementation to implementation. The length of the life is dictated by its relevance to the business, company policies and even legal or government legislation.

The data management (ILM) capabilities of Oracle Utilities take this into account:

  • Data Retention Configuration. The business configures how long the active life of the individual data types are for their business. This defines what is called the Active Period. This is when the data needs to be in the database and accessible to the business for update and active use in their business.
  • ILM Eligibility Rules. Once the data retention period is reached, before the data can enter retirement, the system needs to know that anything outstanding, from a business perspective, has been completed. This is the major difference from most data management approaches. I hear DBAs saying that they would rather the data was simply deleted after a specific period. Whilst that would cover most situations, it would not cover a situation where the business is not finished with the data. Let's explain with an example. In Oracle Utilities Customer Care and Billing, customers are billed, and you can also record complaints against a bill if there is a dispute. Depending on the business rules and legal processes, an old bill may be in dispute, and you should not remove anything related to that bill until the complaint is resolved, regardless of its age. Legal issues can be drawn out for lots of reasons. If you use a retention rule only, then the data used in the complaint would potentially be lost. In the same situation, the base ILM Eligibility rules would detect something outstanding and bypass the applicable records. Remember, these rules are protecting the business and ensuring that the ILM solution adheres to the complex rules of the business.
  • ILM Features in the Database. Oracle, like a lot of vendors, introduced ILM features into the database to help with what I like to call "storage managing" the data. This provides a set of flexible options and features offering database administrators a full range of possibilities for their data management needs. Here are the capabilities (refer to the Database Administration Guide for details of each capability):
    • Partitioning. One of the most common capabilities is the Partitioning option. This allows a large table to be split up, storage wise, into parts or partitions using a partitioned tablespace. This breaks the table into manageable pieces and allows the database administrator to optimize the storage using hardware and/or software options. Some hardware vendors have inbuilt ILM facilities, and this option allows you to target specific data partitions to different hardware capabilities or just split the data into tranches (for example, to separate the retirement stages of data). Partitioning is also a valid option if you want to use hardware tiered-storage based solutions to save money. In this scenario you would put the less used data on cheaper storage (if you have it) to save costs. For partitioning advice, refer to the product DBA Guides, which outline the most common partitioning schemes used by customers.
    • Advanced Compression. One of the popular options is using the Advanced Compression option. This allows administrators to set compression rules against the database based upon data usage. The compression is transparent to the product, and compressed data can be co-located with uncompressed data with no special processing needed by the code. The compression covers a wide range of techniques, including CLOB compression as well as data compression. Customers using Oracle Exadata can also use Hybrid Columnar Compression (HCC) for hardware assisted compression for greater flexibility.
    • Heat Map. One of the features added in Oracle Database 12c and above to help DBAs is the Heat Map. This is a facility where the database tracks the usage patterns of the data in your database and gives you feedback on the activity of the individual rows in the database. This is an important tool as it helps the DBA identify which data is actually being used by the business, making it useful for determining what is important to optimize. It is even useful in the active period to determine which data can be safely compressed because it has reduced update activity against it, and it forms part of the autonomous capabilities of the database.
    • Automatic Data Optimization. Automatic Data Optimization (ADO) is a feature of the database that allows database administrators to implement rules to manage storage based upon various metrics, including the Heat Map. For example, the DBA can put in a rule that says if data in a specific table is not touched for X months then it should be compressed. The rules cover compression, partition movement, storage features and so on, and can be triggered by the Heat Map or any other valid metric (even SQL procedure code can be used).
    • Transportable Tablespaces. One of the most expensive things you can do in the database is issue a DELETE statement. To avoid doing this in bulk in any ILM based solution, Oracle offers the ability to use the Partitioning option and create a virtual trash bin via a transportable tablespace. Using ADO or other capabilities, you can move data into this tablespace and then, using basic commands, switch off the tablespace to do bulk removal quickly. An added advantage is that you can archive that tablespace and reconnect it later if needed.
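The interplay between the retention period and the eligibility rules described above can be sketched in a few lines. This is an illustrative model only, with hypothetical record fields and retention values; the product's actual eligibility algorithms are configured within the application.

```python
from datetime import date, timedelta

# Illustrative ILM eligibility sketch: a record past its retention period
# is still bypassed while business activity (e.g. a disputed bill) remains
# outstanding. Field names and the retention table are hypothetical.

RETENTION_DAYS = {"bill": 730}   # active period per data type (example only)

def is_eligible(record, today):
    """A record may leave the active period only when it is old enough
    AND nothing business-related is still outstanding against it."""
    age = today - record["created"]
    past_retention = age >= timedelta(days=RETENTION_DAYS[record["type"]])
    return past_retention and not record["has_open_complaint"]

today = date(2018, 6, 1)
old_clean    = {"type": "bill", "created": date(2015, 1, 1),
                "has_open_complaint": False}
old_disputed = {"type": "bill", "created": date(2015, 1, 1),
                "has_open_complaint": True}

print(is_eligible(old_clean, today))     # old and clean: may retire
print(is_eligible(old_disputed, today))  # bypassed: dispute outstanding
```

A retention-only rule would treat both records identically; the eligibility check is what protects the disputed bill.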

The Oracle Utilities ILM solution is comprehensive and flexible, combining a business aspect, where the business defines its retention and eligibility rules, with the various ILM capabilities of the database, where the database administrator can factor in their individual site's hardware and support policies. It is not as simple as just removing data in most cases, and the Oracle Utilities ILM solution reduces the risk of managing your data, taking into account both your business and storage needs.

For more information about the Oracle Utilities ILM solution, refer to the ILM Planning Guide (Doc Id: 1682436.1) available from My Oracle Support and read the product DBA Guides for product specific advice.

EMEA Edge Conference 2018

Wed, 2018-05-23 19:41

I will be attending the EMEA Oracle Utilities Edge Conference on 26 - 27 June 2018 in the Oracle London office. This year we are running an extended set of technical sessions around on-premise implementations and the Oracle Utilities Cloud Services. This forum is open to Oracle Utilities customers and Oracle Utilities partners.

The sessions mirror the technical sessions for the conference in the USA held earlier this year with the following topics:

  • Reducing Your Storage Costs Using Information Life-cycle Management. Maintaining storage and satisfying business data retention rules can be increasingly costly and challenging. Using the Oracle Information Life-cycle Management solution can help simplify your storage solution and harness the power of the hardware and software to reduce storage costs.
  • Integration using Inbound Web Services and REST. Integration with Oracle Utilities is a critical part of any implementation. The Oracle Utilities Application Framework has a range of facilities for integrating from and to other applications. This session will highlight all the facilities and where they are best suited to be used.
  • Optimizing Your Implementation. Implementations have a wide range of techniques available to implement successfully. This session will highlight a group of techniques that have been used by partners and our cloud implementations to reduce Total Cost Of Ownership.
  • Testing Your On-Premise and Cloud Implementations. Our Oracle testing solution is popular with on-premise implementations. This session will outline the current testing solution as well as our future plans for both on-premise and cloud implementations.
  • Securing Your Implementations. With the increase in cybersecurity and privacy concerns in the industry, a number of key security enhancements have been made available in the product to support simple or complex security setups for on-premise and cloud implementations.
  • Turbocharge Your Oracle Utilities Product Using the Oracle In-Memory Database Option. The Oracle Database In-Memory option allows both OLTP and analytics to run much faster using advanced techniques. This session will outline the capability and how it can be used in existing on-premise implementations to provide superior performance.
  • Developing Extensions using Groovy. Groovy has been added as a supported language for on-premise and cloud implementations. This session outlines the way that Groovy can be used in building extensions. Note: This session will be very technical in nature.
  • Ask Us Anything Session. Interaction with the customer and partner community is key to the Oracle Utilities product lines. This interactive session allows you (the customers and partners) to ask technical resources within Oracle Utilities the questions you would like answered. The session will also allow Oracle Utilities to discuss directions and poll the audience on key initiatives to help plan road maps.

Note: These sessions are not recorded or materials distributed outside this forum.

This year we have decided to not only discuss capabilities but also give an idea of how we use those facilities in our own cloud implementations to reduce our operating costs for you to use as a template for on-premise and hybrid implementations.

See you there if you are attending.

If you wish to attend, contact your Oracle Utilities local sales representative for details of the forum and the registration process.

Reflecting Changes in Business Objects in UI Tables with Visual Builder

Mon, 2018-05-21 13:14

While the quick start wizards in Visual Builder Cloud Service (VBCS) make it very easy to create tables and other UI components and bind them to business objects, it is good to understand what is going on behind the scenes, and what the wizards actually do. Knowing this will help you achieve things that we still don't have wizards for.

For example, let's suppose you created a business object and then created a UI table that shows the fields from that business object in your page. You probably used the "Add Data" quick start wizard to do that. But then you remembered that you need one more column added to your business object; however, after you add that column to the BO, you'll notice it is not automatically shown in the UI. That makes sense, since we don't want to automatically show all the fields of a BO in the UI.

But how do you add this new column to the UI?

The table's Add Data wizard will be disabled at this point - so is your only option to drop and recreate the UI table? Of course not!


If you look into the table properties you'll see it is based on a page level ServiceDataProvider (SDP for short) variable. This is a special type of object that the wizards create to represent collections. If you look at the variable, you'll see that it returns data using a specific type. Note that the type is defined at the flow level; if you look at the type definition you'll see where the fields that make up the object are defined.

Type Definition

It is very easy to add a new field here and modify the type to include the new column you added to the BO. Just make sure you are using the column's id, and not its title, when you define the new field in the items array.
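As a rough sketch (the exact shape of the flow-level type definition varies by VBCS version, and the field names here are hypothetical), the type's items array might look broadly like this after the change, where hireDate stands in for the new column's id:

```json
{
  "items": [
    {
      "id": "number",
      "firstName": "string",
      "lastName": "string",
      "hireDate": "string"
    }
  ]
}
```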

Now back in the UI you can easily modify the code of the table to add one more column that will be hooked up to this new field in the SDP that is based on the type.

Sounds complex? It really isn't - here is a 3 minute video showing the whole thing end to end:

As you can see, a little understanding of the way VBCS works makes it easy to go beyond the wizards and achieve anything.

European Privacy Requirements: Considerations for Retailers

Mon, 2018-05-21 11:52

When retailers throughout Europe adopt a new set of privacy and security regulations this week, it will be the first major revision of data protection guidelines in more than 20 years. The 2018 regulations address personal as well as financial data, and require that retailers use systems already designed to fulfill these protections by default.

In 1995, the European Commission adopted a Data Protection Directive that regulates the processing of personal data within the European Union. This gave rise to 27 different national data regulations, all of which remain intact today. In 2012, the EC announced that it would supersede these national regulations and unify data protection law across the EU by adopting a new set of requirements called the General Data Protection Regulation (GDPR).

The rules apply to any retailer selling to European consumers. The GDPR, which takes effect May 25, 2018, pertains to any company doing business in, or with citizens of, the European Union, and to both new and existing products and services. Organizations found to be in violation of the GDPR will face a steep penalty of 20 million euros or four percent of their gross annual revenue, whichever is greater.

Retailers Must Protect Consumers While Personalizing Offers

GDPR regulations will encompass personal as well as financial data, including much of the data found in a robust customer engagement system, CRM, or loyalty program. It also includes information not historically considered to be personal data: device IDs, IP addresses, log data, geolocation data, and, very likely, cookies.

For the majority of retailers relying on customer data to personalize offers, it is critically important to understand how to fulfill GDPR requirements and execute core retail, customer, and marketing operations. Developing an intimate relationship with consumers and delivering personalized offers means tapping into myriad data sources.

This can be done, but systems must be GDPR-compliant by design and by default. A key concept underlying the GDPR is Privacy by Design (PBD), which essentially stipulates that systems be designed to minimize the amount of personal data they collect. Beginning this week, Privacy by Design features become a regulatory requirement for both Oracle and our customers, and the GDPR stipulates that these protections are turned on by default.

Implementing Security Control Features

While the GDPR requires “appropriate security and confidentiality,” exact security controls are not specified. However, a number of security control features are discussed in the text and will likely be required for certain types of data or processing. Among them are multi-factor authentication for cloud services, customer-configurable IP whitelisting, granular access controls (by record, data element, data type, or logs), encryption, anonymization, and tokenization.

Other security controls likely to be required are “separation of duties” (a customer option requiring two people to perform certain administrative tasks); customer options for marking some fields as sensitive and restricted; limited access on the part of the data controller (i.e. Oracle) to customer information; displaying only a portion of a data field; and the permanent removal of portions of a data element.
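To make two of these controls concrete, the sketch below shows how displaying only a portion of a data field, and permanently removing portions of a data element, might look in code. This is an illustrative Python sketch, not an Oracle implementation; the function names and the masking policy are assumptions.

```python
def mask_field(value: str, visible: int = 4, mask_char: str = "*") -> str:
    """Display only the trailing portion of a data field (e.g. a card number)."""
    if len(value) <= visible:
        return mask_char * len(value)
    return mask_char * (len(value) - visible) + value[-visible:]


def redact_fields(record: dict, fields: set) -> dict:
    """Permanently remove named portions of a data element from a record."""
    return {k: v for k, v in record.items() if k not in fields}
```

For example, `mask_field("4111111111111111")` returns `"************1111"`, while `redact_fields` drops sensitive keys entirely rather than hiding them.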

Summary of Critical GDPR Requirements

The GDPR includes a number of recommendations and requirements governing users’ overall approach to data gathering and use. Among the more important are:

  • Minimization. Users are required to minimize the amount of data used, length of time it is stored, the number of people who have access to it, and the extent of that access.
  • Retention and purging. Data may be retained for only as long as reasonably necessary. This applies in particular to personal data, which should be processed only if the purpose of processing cannot reasonably be fulfilled by other means. Services must delete customer data on completion of the services.
  • Exports and portability. End users must be provided with copies of their data in a structured, commonly used digital format. Customers will be required to allow end users to send data directly to a competing service provider for some services.
  • Access, correction, and deletion. End users must be able to request access to, correction of, and deletion of the data they store in any service. Users may have a “right to be forgotten”—a right to have all their data erased.
  • Notice and consent. When information is collected, end-user notice and consent for data processing is generally required.
  • Backup and disaster recovery. Timely availability of end-user data must be ensured.
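As an illustration of how the retention and portability requirements above translate into code, here is a minimal Python sketch. The record layout, field names, and the 365-day retention window are assumptions for illustration, not prescriptions of the GDPR or of any Oracle service.

```python
import json
from datetime import datetime, timedelta, timezone


def purge_expired(records, retention_days=365, now=None):
    """Retention and purging: keep only records inside the retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [r for r in records if r["created"] >= cutoff]


def export_user_data(record):
    """Portability: serialize an end user's data in a structured, commonly used format."""
    portable = {k: v for k, v in record.items() if k != "created"}
    return json.dumps(portable, sort_keys=True)
```

The design choice worth noting is that purging is a filter applied on every pass rather than a one-off job, so retention limits hold by default.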

Are you prepared?

Oracle is prepared for the EU General Data Protection Regulation (GDPR) that was adopted by the European Parliament in April 2016 and will become effective on May 25, 2018. We welcome the positive changes it is expected to bring to our service offerings by providing a consistent and unified data protection regime for businesses across Europe. Oracle is committed to helping its customers address the GDPR’s new requirements that are relevant to our service offerings, including any applicable processor accountability requirements.

Our customers can rest assured that Oracle Retail’s omnichannel suite will empower them to continue delivering personalized customer experiences that meet complex global data privacy regulations. Contact Oracle Retail to learn more about Oracle systems, services, and GDPR compliance.

New Oracle E-Business Suite Person Data Removal Tool Now Available

Mon, 2018-05-21 10:27

Oracle is pleased to announce the availability of the Oracle E-Business Suite Person Data Removal Tool, designed to remove (obfuscate) data associated with people in E-Business Suite systems. Customers can apply the tool to select information in their E-Business Suite production systems to help address internal operational and external regulatory requirements, such as the EU General Data Protection Regulation (GDPR).

For more details, see:

DP World Extends Strategic Collaboration with Oracle to Accelerate Global Digital ...

Mon, 2018-05-21 09:56

Global trade enabler DP World has extended its partnership with Oracle to implement its digital transformation programme that supports its strategy to develop complementary sectors in the global supply chain such as industrial parks, free zones and logistics. 


Suhail Al Banna, Senior Vice President, DP World, Middle East and Africa Region; Arun Khehar, Senior Vice President – Business Applications, ECEMEA, Oracle; Mohammed Al Muallem, CEO and Managing Director, DP World, UAE Region and CEO, JAFZA.

The move follows an announcement by DP World earlier this year that it would use the Oracle Cloud Suite of Applications to drive business transformation. Oracle Consulting will now implement the full suite of Fusion Enterprise Resource Planning (ERP), Human Capital Management (HCM) and Enterprise Performance Management (EPM) Cloud solutions using its True Cloud methodology. The technology rollout across the Group has already started, with the Group’s UAE Region and Middle East and Africa Region the first to sign up.

Teo Chin Seng, Senior Vice President IT, DP World Group, said: “Our focus on building our digital capability follows our vision to become a digitised global trade enabler, and we are working to achieve a new level of operational efficiency while creating value for our stakeholders.”

Arun Khehar, Senior Vice President – Business Applications, ECEMEA, Oracle, said: “Following the recent announcement of our strategic partnership to help DP World drive its global digital transformation with our best-in-class Cloud Suite of Applications (SaaS), we are proud to extend our collaboration by leveraging the deep expertise of Oracle Consulting to drive this large scale project. We are confident that this strategic cloud deployment will help them deliver the next level of innovation and differentiation.”

The Oracle Consulting team is focused exclusively on Oracle Cloud solutions and staffed with more than 7,000 experts in 175 countries serving more than 20 million users to help organizations implement Oracle Cloud in an efficient and cost-effective manner.


Further press releases: Oracle Middle East Newsroom

If You Are Struggling With GDPR, Then You Are Not Alone

Mon, 2018-05-21 08:00

Well, it's only 5 days to go until the infamous GDPR deadline of 25th May 2018 and you can certainly see the activity accelerating.

You would have thought that, with the deadline so close, most organisations would be sitting back, relaxing, safe in the knowledge that they have had two years to prepare for GDPR and are therefore completely ready for it. It's true, some organisations are prepared and have spent the last 24 months working hard to meet the regulations. Sadly, there is also a significant proportion of companies who aren't quite ready. Some, because they have left it too late. Others, by choice.

Earlier this week I had the pleasure of being invited to sit on a panel discussing GDPR at Equinix's Innovation through Interconnection conference in London.

As with most panels, we had a very interesting discussion, talking about all aspects of GDPR including readiness, data sovereignty, healthcare, the role of Cloud, and the dreaded Brexit!

I have written before about GDPR, but this time I thought I would take a bit of time to summarise three of the more interesting discussion topics from the panel, particularly areas where I feel companies are struggling.

Are you including all of the right personal data?

There is a clear recognition that an organisation's customer data is in scope for GDPR. Indeed, my own personal email account has been inundated with opt-in consent emails from loads of companies, many of whom I had forgotten even had my data. Clearly, companies are making sure that they are addressing GDPR for their customers. However, I think there is a general concern that some organisations are missing some of the data, especially internal data, such as that of their employees. HR data is just as important when it comes to GDPR, yet I see some companies paying far less attention to this area than to their customers' data.

Does Cloud help or hinder GDPR compliance?

A lot was discussed on the panel around the use of cloud. Personally, I think that cloud can be a great enabler, taking away some of the responsibility and overhead of implementing security controls, processes, and procedures and allowing the Data Processor (the Cloud Service Provider) to bring all of their experience, skill and resources into delivering you a secure environment. Of course, the use of Cloud also changes the dynamic. As the Data Controller, an organisation still has plenty of its own responsibility, including that of the data itself. Therefore, putting your systems and data into the Cloud doesn't allow you to wash your hands of the responsibility. However, it does allow you to concentrate on your smaller, more focused areas of responsibility. You can read more about shared responsibility from Oracle's CISO, Gail Coury, in this article. Of course, you need to make sure you pick the right cloud service provider to partner with. I'm sure I must have mentioned before that Oracle does Cloud and does it extremely well.

What are the real challenges customers are facing with GDPR?

I talk to lots of customers about GDPR and my observations were acknowledged during the panel discussion. Subject access rights are causing lots of headaches. To put it simply, I think we can break GDPR down into two main areas: Information Security and Subject Access Rights. Organisations have been implementing Information Security for many years (to varying degrees), especially if they have been subject to other regulations such as PCI DSS, HIPAA, SOX, etc. However, whilst the UK Data Protection Act has always had principles around data subjects, GDPR really brings that front and centre. Implementing many of the principles associated with data subjects, i.e. me and you, can mean changes to applications, implementing new processes, identifying sources of data across an organisation, etc. None of this is proving simple.

On a similar theme, responding to subject access rights due to this spread of data across an organisation is worrying many company service desks, concerned that come 25th May, they will be inundated with requests they cannot fulfil in a timely manner.

Oh and of course, that's before you even get to paper-based and unstructured data, which is proving to be a whole new level of challenge.

I could continue, but the three areas above are some of the main topics I hear over and over again from the customers I talk to. Hopefully, everyone has realised that there is no silver bullet for achieving GDPR compliance, and, for those companies who won't be ready in five days' time, I hope you at least have a strong plan in place.

Experience, Not Conversion, is the Key to the Switching Economy

Mon, 2018-05-21 08:00

In a world increasingly defined by instant-gratification, the demand for positive and direct shopping experiences has risen exponentially. Today’s always-on customers are drawn to the most convenient products and services available. As a result, we are witnessing higher customer switching rates, with consumers focusing more on convenience than on branding, reputation, or even on price.  

In this switching economy – where information and services are always just a click away –  we tend to reach for what suits our needs in the shortest amount of time. This shift in decision making has made it harder than ever for businesses to build loyalty among their customers and to guarantee repeat purchases. According to recent research, only 1 in 5 consumers now consider it a hassle to switch between brands, while a third would rather shop for better deals than stay loyal to a single organization. 

What's Changed? 

The consumer mindset, for one. And the switching tools available to customers have also changed. Customers now have the ability to research extensively before they purchase, with access to reviews and price comparison sites often meaning that consumers don’t even make it to your website before being captured by a competitor. 

This poses a serious concern for those brands that have devoted their time – and marketing budgets – to building great customer experiences across their websites. 

Clearly this is not to say that on-site experiences aren’t important, but rather that they are only one part of the wider customer journey. In an environment as complex and fast moving as the switching economy, you must look to take a more omnichannel approach to experience, examining how your websites, mobile apps, customer service teams, external reviews and in-store experiences are all shaping the customers’ perceptions of your brand. 

What Still Needs to Change?

Only by getting to know your customers across all of these different channels can you future-proof your brand in the switching economy. To achieve this, you must establish a new set of metrics that go beyond website conversion. The days of conversion optimization being viewed as the secret sauce for competitive differentiation are over; now brands must recognize that high conversion rates are not necessarily synonymous with a great customer experience – or lifetime loyalty. 

Today, the real measure of success does not come from conversion, but from building a true understanding of your customers – across every touchpoint in the omnichannel journey. Through the rise of experience analytics, you finally have the tools and technologies needed to understand customers in this way, and to tailor all aspects of your brand to maximize convenience, encourage positive mindsets and pre-empt when your customers are planning to switch to a different brand. 

It is only through this additional layer of insight that businesses and brands will rebuild the notion of customer loyalty, and ultimately, overcome the challenges of the switching economy. 

Want to learn more about simplifying and improving the customer experience? Read Customer Experience Simplified: Deliver The Experience Your Customers Want to discover how to provide customer experiences that are managed as carefully as the product, the price, and the promotion of the marketing mix.

Customer Experience Simplified

See What Your Guests Think with Data Visualization

Mon, 2018-05-21 06:00

As we approach the end of May, thoughts of summer and vacations begin. Naturally, a key component is finding the best place to stay and often that means considering the hotel options at your chosen destination. But what’s the best way to decide? That’s where reading reviews is so important.   

And that brings us to the latest blog in the series of taking datasets from ‘less typical’ sources and analyzing them with Oracle Data Visualization. Here, we’ve pulled hotel reviews into a dataset and visualized it to see how we – the general public – rate the hotels we stay in.

Working with Ismail Syed, pre-sales intern, and Harry Snart, pre-sales consultant, both from Oracle UK, we ran the analysis and created visualizations. We decided to look at the most common words used in both positive and negative reviews, see how long each of them is – and work out which countries are the most discerning when they give their feedback. 

So, what are the main irritations when we go away? Conversely - what's making a good impression?

Words of discontent

First, we wanted to combine the most commonly used words in a positive review with those most likely used in a negative review. You can see these in the stacked bar chart below. Interestingly, 'room' and 'staff' both appear in the positive and negative comments list. However, there are far more positive reviews around staff than negative ones, and likewise a lot more negative reviews around the room than positive reviews.
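The counting step behind a chart like this is straightforward. Below is a minimal Python sketch of it; the stopword list and sample reviews are invented for illustration, and the actual analysis was done in Oracle Data Visualization.

```python
from collections import Counter

# Hypothetical stopword list; a real analysis would use a fuller one.
STOPWORDS = {"the", "a", "and", "was", "were", "very"}


def top_words(reviews, n=5):
    """Most common non-stopword terms across a set of reviews."""
    counts = Counter()
    for review in reviews:
        counts.update(w for w in review.lower().split() if w not in STOPWORDS)
    return counts.most_common(n)
```

Running `top_words` separately over the positive and negative review sets yields the two word lists that the stacked bar chart compares.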

It seems then, across the board, guests find customer service better than the standard of the rooms they receive – implying that an effective way to boost client retention would be to start by improving rooms. In particular, guests complained about the small size of the rooms. That’s a tough fix, but people were more upset about the standard of the beds, bathrooms and toilets, which can be updated more easily.

You’ll also notice 'breakfast' appears prominently in both the positive and negative word clouds – so a more achievable fix could be to start there. A bad breakfast can leave a bad taste, but a good one is obviously remembered. 

Who’ll give a good review?

Next, we wanted to see who the most complimentary reviewers were, by nationality. While North Americans, Australians and Kyrgyz (highlighted in green) tend to leave the most favorable reviews, hotels have a harder time impressing those from Madagascar, Nepal and Mali (in red). Europeans sit somewhere in the middle – except for Bosnia and Herzegovina, who like to leave an upbeat review.   

Next, we wanted to see who is the most verbose in their feedback – the negative reviewers or the positive reviewers – and which countries leave the longest posts.

Are shorter reviews sweeter?

Overall, negative reviews were slightly longer, but only by a small amount – contrary to the popular belief that we tend to ‘rant’ more when we’re perturbed about something. People from Trinidad and Tobago left the longest good reviews, at an average of 29 words. Those from Belarus, the USA and Canada followed as the wordiest positive reviewers. On the flip side, the Romanians, Swedish, Russians and Germans had a lot to say about their bad experiences – leaving an average of 22 words showing their displeasure.
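The word-count comparison behind these figures reduces to a per-country average of review lengths; a sketch of that calculation is below. The sample data is invented and the function name is an assumption.

```python
from statistics import mean


def avg_review_length(reviews_by_country):
    """Average review length, in words, for each country."""
    return {country: round(mean(len(r.split()) for r in reviews), 1)
            for country, reviews in reviews_by_country.items()}
```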

It's business, but also personal...

Clearly, data visualization doesn't need to be a tool just for the workplace; you can deploy it to gain insight into other aspects of life as well – including helping you prepare for some valuable time off.

If you’re an IT leader in your organization and need to enable insights for everyone across the business, you should consider a complete, connected and collaborative analytics platform like Oracle Analytics Cloud. Why not find out a bit more and get started for free?

If you’re simply interested in visual analysis of your own data, why not see what you can find out by taking a look at our short demo and signing up for an Oracle Data Visualization trial?

Either way, make sure you and your business take a vacation from spreadsheets and discover far more from your data through visualization.

HR today: right skills, right place, right time, right price

Mon, 2018-05-21 05:49

The only constant in today’s work environment is change. If you’re going to grow and stay competitive in this era of digital transformation, your business has to keep up—and HR must too.

A wide range of factors all mean that HR constantly has to grow and transform—changing demographics, new business models, economic uncertainty, evolving employee expectations, the bring-your-own-device revolution, increased automation, AI, the relentless search for cost savings, and more.

Things are different today. In the past, business change processes typically had a start and target end date, with specific deliverables that were defined in advance. Now change is open-ended, and its objectives evolve over time—based on the world as it is, rather than a set of assumptions. An agile model for transformation is therefore essential, along with a decision-making process that can survive constant change.

The fact is that people are still—and will always be—the most important part of any business, so HR has to be closely aligned to your overall business goals, delivering benefits to the whole organisation. Every move your HR team makes should be focused on how to deliver the right skills in the right place, at the right time and at the right price, to achieve your business’s goals.


Workforce planning

To manage your workforce effectively as the needs of your business change, you need to know what talent you have, where it’s located—and also what skills you are likely to need in the future. It’s much easier to fill skills gaps when you can see, or anticipate, them.


Deliver maximum value from your own people

And it’s much easier to do if you’ve already nurtured a culture of personal improvement. Giving people new opportunities to learn and develop, and a sense of control over their own careers, will help you maintain up-to-date skills within your business and also identify the best candidates—whether for promotion, relocation within the company or for specific roles. Moreover, it should enable them to, for example, pursue areas of personal interest, train for qualifications, or perhaps work flexibly—all of which will improve loyalty and morale.

You can also look for skills gaps that you absolutely must recruit externally to fill, and understand how best to do that, especially at short notice. What are the most cost-efficient and effective channels, for example? You might consider whether offshoring for skills is helpful, or maintaining a base of experienced temporary workers that you can call on.


Unknown unknowns

Yet these are all known gaps. Organisations now also have to consider recruiting people for unknown jobs too. Some estimates suggest that as many as two-thirds of primary school children will end up working in jobs that don’t yet exist. So what new roles are being created in your industry, and how are you selecting people that will be able to grow into them?


Maximise the value of your HR function

Your HR organisation must be capable of, and ready to, support these changes, and that means three things. First, the strategic workforce planning activities described above, supported by modern data and analytics. Next, HR has to provide the very best employee experience possible, enabling personal development and support. Finally, they need to be able to support the process of constant change itself, and move to a more agile way of operating.


Get the culture right

Creating and nurturing a strong culture is essential here, and that relies on close co-ordination between HR, line managers and employees. Having a core system of record on everyone’s roles and various skills supports all these objectives, and can help you to grow your business through the modern era of change.


Essential enablers for implementing a modern product strategy

Mon, 2018-05-21 05:49

Continuous improvement across your entire mix of products and services is essential to innovate and stay competitive nowadays. Digital disruption requires companies to transform, successfully manage a portfolio of profitable offerings, and deliver unprecedented levels of innovation and quality. But creating your product portfolio strategy is only the first part—four key best practices are necessary to successfully implement it.

New technologies—the Internet of Things (IoT), Big Data, Social Media, 3D printing, and digital collaboration and modelling tools—are creating powerful opportunities to innovate. Increasingly customer-centric propositions are being delivered ‘as-a-service’ via the cloud, with just-in-time fulfilment joining up multiple parts of the supply chain. Your products and services have to evolve continually to keep up, causing massive amounts of data to be generated that has to be fed back in to inform future development.


Common language

To minimise complexity, it’s essential that there is just one context for all communication. You therefore need a standardised—and well-understood—enterprise product record that acts as a common denominator for your business processes. And that means every last piece of information—from core service features to how your product uses IoT sensors; from business processes to your roadmap for innovation, and all other details—gets recorded in one place, in the same way, for every one of your products, from innovation through development to commercialisation.

That will make it far easier for you to collect and interpret product information; define service levels and deliver on them; support new business models, and manage the overall future design of your connected offerings. Moreover, it enables your product development methods to become more flexible, so they can be updated more frequently, enabled by innovations in your supply chain, supported more effectively by IT, and improved over time.
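To make the idea tangible, an enterprise product record can be pictured as one shared structure that every process reads and writes. This is a hypothetical Python sketch of the concept only; the field names are assumptions, not Oracle's schema.

```python
from dataclasses import dataclass, field


@dataclass
class ProductRecord:
    """One record, one shape, for every product, from innovation to commercialisation."""
    product_id: str
    core_features: list = field(default_factory=list)
    iot_sensors: list = field(default_factory=list)
    business_processes: dict = field(default_factory=dict)
    innovation_roadmap: list = field(default_factory=list)
    audit_trail: list = field(default_factory=list)  # decisions, supplier data, compliance evidence
```

Because every team appends to the same record, impact analyses and audit trails read from a single source rather than being reassembled per department.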


Greater quality control in the digital world…

By including form, fit and function rules—that describe the characteristics of your product, or part of it—within the product record, you add a vital layer of change control. It enables you to create a formal approvals process for quality assurance. For example, changes made in one area—whether to a product or part of it—may create problems in other areas. The form, fit and function rules force you to perform cross-functional impact analyses and ensure you’re aware of any consequences.

As part of this, you can run simulations with ‘digital twins’ to predict changes in performance and product behaviour before anything goes wrong. This obviously has major cost-saving implications, enabling far more to be understood at the drawing-board stage. Moreover, IoT applications can be leveraged to help product teams test and gather data from your connected assets or production facilities.


Transparency and effective communications

The enterprise product record should also contain a full audit trail of decisions about the product, including data from third parties, and from your supply chain. The objective is full traceability from the customer perspective—with evidence of regulatory compliance, provenance of preferred suppliers, and fully-auditable internal quality processes. Additionally, it’s often helpful to be able to prove the safety and quality of your product and processes, as that can be a key market differentiator. Powerful project management and social networking capabilities support the collaborative nature of the innovation process.


Lean and efficient

Overall, your innovation platform should be both lean and efficient, based on the continual iteration of the following key stages:

  • Ideation, where you capture, collaborate and analyse ideas
  • Proposal, where you create business cases and model potential features
  • Requirements, where you evaluate, collaborate and manage product needs
  • Concepts, where you accelerate product development and define structures
  • Portfolio analysis, where you revise and optimise your product investment
  • Integration, where you connect seamlessly with downstream ERP and supply chain processes


The result: Powerful ROI

Being able to innovate effectively in a digital supply chain delivers returns from both top-line growth—with increased revenues and market share—and reduced costs from improved safety, security, sustainability and fewer returns.



Cloud: Look before you leap—and discover unbelievable new agility

Mon, 2018-05-21 05:48

All around the world, finance teams are now fully embracing the cloud to simplify their operations. The heady allure of reduced costs, increased functionality, and other benefits are driving the migration. Yet what’s getting people really excited is the unexpected flush of new business agility they experience after they’ve made the change.

At long last, the cloud is becoming accepted as the default environment to simplify ERP and EPM. Fifty-six percent* of finance teams have already moved to the cloud—or will do so within the next year—and 24% more plan to move at some point soon.


Major cost benefits in the cloud

Businesses are making the change to enjoy a wide range of benefits. According to a recent survey by Oracle*, reducing costs is (predictably) the main motivation, with improved functionality in second place—and culture, timing and the ability to write off existing investments also key factors. The financial motivation breaks down into a desire to avoid infrastructure investment and on-premises upgrades, and also to achieve a lower total cost of ownership.

And Cloud is delivering on its promise in all these areas—across both ERP and EPM, 70% say they have experienced economic benefits after moving to the cloud.


Leap for joy at cloud agility

But the biggest overall benefit of moving to the cloud—quoted by 85% of those who have made the change—is staying current on technology. Moreover, 75% say that cloud improves usability, 71% say it increases flexibility and 68% say that it enables them to deploy faster. Financial gain is the top motivation for moving to the cloud, but that’s only the fourth-ranked advantage overall once there. It turns out that the main strengths of the cloud are in areas that help finance organisations improve business agility.

These are pretty amazing numbers. It would be unheard of, until fairly recently, for any decent-sized organisation to consider migrating its core ERP or EPM systems without a very, very good reason. Now, the majority of companies believe that the advantages of such a move—and specifically, moving to the cloud—overwhelm any downside.


The commercial imperative

Indeed, the benefits are increasingly viewed as a competitive necessity. Cloud eliminates the old cycle of new system launches every two or three years—replacing it with incremental upgrades several times each year, and easy, instant access to additional features and capabilities.

And that is, no doubt, what’s behind the figures above. Finance professionals have an increasingly strong appetite to experiment with and exploit the latest technologies. AI, robotic process automation, internet of things, intelligent bots, augmented reality and blockchain are all being evaluated and used by significant numbers of organisations.

They’re improving efficiency in their day-to-day operations, joining-up operating processes across their business and reducing manual effort (and human error) through increased automation. Moreover, AI is increasingly being applied to analytics to find answers to compelling new questions that were, themselves, previously unthinkable—providing powerful new strategic insights.

Finance organisations are becoming more agile—able to think smarter, work more flexibly, and act faster using the very latest technical capabilities.


But it’s only available via cloud-based ERP and EPM

Increasingly, all these advances are only being developed as part of cloud-based platforms. And more and more advanced features are filtering down to entry-level cloud solutions—at least in basic form—encouraging finance people everywhere to experiment with what’s possible. That means, if you’re not yet using these tools in the cloud, you’re most likely falling behind your competitors that are—and that applies both from the broader business perspective and from the internal operating competency viewpoint.

The cloud makes it simple to deploy, integrate and experiment with new capabilities, alongside whatever you may already have in place. It has become the new normal in finance. It seems like we’re now at a watershed moment where those that embrace the potential of cloud will accelerate away from those that do not, and potentially achieve unassailable new operating efficiencies.

The good news is that it’s easy to get started.  According to MIT Technology Review in a 2017 report, 86% of those making a transition to the cloud said the costs were in line with, or better than expected, and 87% said that the timeframe of transition to the cloud was in line with, or better than expected.


* Except where stated otherwise, all figures in this article are taken from ‘Combined ERP and EPM Cloud Trends for 2018’, Oracle, 2018.


You’ve got to start with the customer experience

Mon, 2018-05-21 05:47

Visionary business leader Steve Jobs once remarked: ‘You’ve got to start with the customer experience and work backwards to the technology.’ From someone who spent his life creating definitive customer experiences in technology itself, these words should carry some weight—and are as true today as ever.

The fact is that customer experience is a science, and relevance is its key goal. A powerful customer experience is essential to compete today. And relevance is what cuts through the noise of the market to actually make the connection with customers.


The fundamentals of success

For companies to transform their customer experience, they need to be able to streamline their processes and create innovative customer experiences. They also have to be able to deliver by connecting all their internal teams together so they always speak with one consistent voice.

But that’s only part of the story. Customers have real choice today. They’re inundated with similar messages to yours and are becoming increasingly discerning in their tastes.

Making yourself relevant depends on the strength of your offering and content, and the effectiveness of your audience targeting. It also depends on your technical capabilities. Many of your competitors will already be experimenting with powerful new technologies to increase loyalty and drive stronger margins.


The value of data

Learning to collect and use relevant customer data is essential. Data is the lifeblood of modern business—it’s the basis of being able to deliver any kind of personalised service on a large scale. Businesses need to use data to analyse behaviour, create profiles for potential new customers, build propositions around those target personas and then deliver a compelling experience. They also need to continually capture new data at every touchpoint to constantly improve their offerings.

Artificial intelligence (AI) and machine learning (ML) have a key role to play both in the analysis of the data and also in the automation of the customer experience. These technologies are developing at speed to enable us to improve our data analysis, pre-empt changing customer tastes and automate parts of service delivery.


More mature digital marketing

You can also now add all kinds of technologies straight out of sci-fi to the customer experience mix. The internet of things (IoT) is here, with connected devices providing help in all kinds of areas: keeping you on the right road, telling you when your vehicle needs maintenance, providing updates on your order status, delivering personal service wherever you are, and much more. These capabilities enable you to drive real transformation.

Moreover, intelligent bots are making it much easier to provide high-quality, cost-effective, round-the-clock customer support. They can handle a wide range of issues and use ML to improve their own performance over time.

Augmented reality makes it possible to add contextual information, based on your own products and services, to real-world moments. So if you're a car manufacturer, you might provide help with simple roadside repairs (e.g. changing a tyre) via a smartphone app.


Always omnichannel

Finally, whether at the pre-sale or delivery stage, your customer experience platform must give you the ability to deliver consistency at every touchpoint. Whatever the channel, time or context, your customers should experience your whole business as a single, coherent entity.

Indeed, as Michael Schrage wrote in the Harvard Business Review: 'Innovation is an investment in the capabilities and competencies of your customers. Your future depends on their future.' So get as close as possible to your customers to learn what they want today, and understand what experiences they're likely to want tomorrow. Work backwards from that, and use any technology that helps you deliver it.