Summary of Day 4 at Oracle Open World 2016

Late, but not forgotten, here is the summary of day four. It was too late yesterday, after the appreciation event, to write it all down.

Wednesday was a somewhat slow day for me as I attended only two sessions. Most of the day was reserved for meetings around my other activities in the OTN network, like moderation and the German ADF Community, which will soon relaunch its community page on OTN.

The first session was about testing web applications with Selenium, ‘Testing Java Web Applications with Selenium: A Cookbook’ by Jorge Hidalgo and Vicente Gonzalez Arellano, over at JavaOne. It turned out that the Selenium WebDriver for JDev ADF works better than the one shown in the demo in this session. The JDev WebDriver abstracts all the tricky stuff, like waiting for AJAX calls or finding the right component, away from the developer. This makes the job really easy. Summary: nothing new learned.
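For readers who have not tried this themselves, below is a minimal sketch in plain Selenium (Java) showing the kind of explicit waiting for AJAX round trips that the ADF-specific WebDriver hides from you. The URL and the component IDs are made up for illustration only; ADF generates its own client IDs at run time.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Plain Selenium test against a hypothetical ADF page. Every interaction needs an
// explicit wait for the partial page request to finish - exactly the boilerplate
// the ADF WebDriver extension abstracts away.
public class PlainSeleniumSketch {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("http://localhost:7101/MyAdfApp/faces/Welcome"); // hypothetical URL
            WebDriverWait wait = new WebDriverWait(driver, 10);

            // wait until the input rendered by a hypothetical af:inputText is visible, then type
            wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("pt1:it1::content")))
                .sendKeys("John");

            driver.findElement(By.id("pt1:b1")).click(); // hypothetical af:button

            // wait for the AJAX/PPR response to render the result component
            wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("pt1:ot1")));
        } finally {
            driver.quit();
        }
    }
}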


After a nice working lunch with my fellow OTN moderators and the Queen of Moderators I attended a session about developing applications with Oracle JET and ADFbc REST services, ‘Oracle Application Development Framework and Oracle JavaScript Extension Toolkit in the Cloud’ by Sherry Yu, Shray Bansal and Abhinav Shroff. This session was interesting as it used REST services generated from ADFbc. This kind of REST service offers many uses

ADFbc REST Services Usage

and allows some very nice features out of the box, like pageable collections, a rich set of metadata, lists of values, attribute types and validation, and resource discovery

ADFbc REST Services Functions

At run time you can tailor the payload by retrieving only the attributes you need, execute batch transactions, sort the results and get built-in security.

ADFbc Run Time Features

Simple queries can be added to the REST calls; these work like ‘Query by Example’ in ADF tables. This set of features allows for many different use cases

ADFbc REST Use Cases

like a back end for Oracle JET based applications, mobile-friendly UIs, integration with other services and a REST solution for SaaS.
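To make the payload tailoring and the query-by-example parameters more concrete, here is a small client sketch in Java. The host, the resource name and the exact query parameter names (fields, q, limit) are assumptions for illustration; check the ADF REST framework documentation for the parameters your version actually supports.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

// Hypothetical GET against an ADFbc REST resource: restrict the payload to two
// attributes, add a query-by-example style filter and limit the page size.
public class AdfRestClientSketch {
    public static void main(String[] args) throws Exception {
        String base = "http://example.com/rest/v1/Employees";              // assumed endpoint
        String filter = URLEncoder.encode("DepartmentId = 10", "UTF-8");   // assumed q syntax
        String query = "?fields=FirstName,LastName&limit=10&q=" + filter;

        HttpURLConnection con = (HttpURLConnection) new URL(base + query).openConnection();
        con.setRequestMethod("GET");
        con.setRequestProperty("Accept", "application/json");

        try (BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // raw JSON payload
            }
        }
    }
}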


The remaining part of the day I spent on multiple events like the OTN Blogger Meetup, the OTN Happy Hour and finally the Oracle Appreciation Event featuring Sting and Gwen Stefani.


Summary of Day 5 at Oracle Open World 2016

I started the day with a session on Alta UI, ‘Implementing Oracle’s New Alta UI Features’ by Richard Wright. Richard started by giving some reasoning about why Oracle developed the Alta UI. It was mainly because users demanded a more mobile-friendly UI. The biggest change that came with Alta UI is that the UI has to be built by thinking ‘mobile first’ and by designing the flow of operations around personas. Only then do you gain the full advantage of the Alta UI.

Transforming an older (legacy) application into a modern application using the Alta UI is not just migrating the skin. You have to redo the UI and design it mobile first. This means that you have to think about different device sizes, which in the end means that you have to design the application in a responsive manner.

With the old approach the page simply stretches on the device. This mostly does not work on small devices, as it forces the user to zoom into the right section to see the information. Because of the screen size, mobile friendly means that you try to visualize the information instead of e.g. showing the user a table. An image conveys information a human can take in more easily than data in a table.

For a developer this means that a list view should be preferred over a table, as a list view allows for a better responsive design.

In summary, you should

  • Leverage major UI updates as an opportunity
  • Verify actual users versus previously targeted users
  • Target UI for preferred user devices
  • Understand their most important artifacts and tasks

The next session of the day was ‘Cloud-Native Application Development with Oracle Application Container Cloud’ by Shaun Smith, Anand Kothari and Eric Jacobsen. This session was about the Oracle Application Container Cloud Service, which lets you run native Java SE or Node.js based applications in the cloud.

I already mentioned the ‘Cloud Native Architecture’ on day 2, along with the demands it places on application development and the tools to use, from Oracle’s point of view, to make this architecture work.

The Application Container Cloud should make such development simple: you just

  • Develop
  • Zip
  • Deploy

your application. This can be done on a polyglot platform using Java, PHP and Node.js, and later even Ruby and Java EE. It’s an open platform allowing you to run many applications. Oracle provides a Linux system and you can bring whatever you like.

All this runs on Docker containers. The only constraint is that the applications must be stateless, as the containers are built up and shut down on the fly to load-balance your application. This is done automatically, without you having to intervene.
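As a minimal sketch of what ‘Develop, Zip, Deploy’ could look like for the Java SE case, here is a tiny stateless HTTP service. The PORT environment variable is my assumption of how the platform tells the application where to listen; treat it, and the class name, as illustrative only.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

// Tiny, stateless Java SE application as it could be zipped and deployed to the
// Application Container Cloud. It keeps no local state and listens on the port
// handed to it via an environment variable (assumed to be PORT here).
public class HelloAccs {
    public static void main(String[] args) throws Exception {
        int port = Integer.parseInt(System.getenv().getOrDefault("PORT", "8080"));
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", exchange -> {
            byte[] body = "Hello from the Application Container Cloud".getBytes("UTF-8");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}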

Once your application runs, monitoring the JVM and the application’s performance is done via the cloud service. Patching, if needed, is done for you too. It’s not that you don’t know about it, it’s just a click on a button. If you don’t like a patch because it breaks your application, you can easily roll back the patch.


Final session of the day and OOW 2016 for me was ‘Using Docker with Continuous Delivery in Oracle Cloud‘ by Greg Stachnick and Mike Raab. This session talked about how Docker is used in the Oracle Container Cloud Service to allow agile, containerized development in the cloud.

The first part was about the Developer Cloud, which was covered in almost every session about the cloud.


The second part was about the Container Cloud Service and its base implementation StackEngine (a company Oracle bought at the end of last year).


Key features of the Container Cloud were shown next.

The session also showed the UI for setting up a service in a Docker container.

Changes made in the UI are reflected directly in a docker run script (which you can get on the same page). Spinning up a new container is a matter of two clicks.

Stacks are the equivalent of Docker Compose but have some add-ons, like the ability to add parameters to the containers.

In the end the Container Cloud Service is a flexible ‘bring your own container and run it in the cloud’ offering. Don’t forget to bring the needed licenses too 🙂

The product will be available within the next 12 months!


That was the OOW2016 for me. See you next year!


Summary of Day 3 at Oracle Open World 2016

I started with the (early) morning keynote, ‘Oracle OpenWorld Tuesday Morning Keynote’, hosted by Bhanu Murthy B. M., Safra Catz, Hon. Chief Minister Shri. Devendra Fadnavis and Thomas Kurian.

As the keynote and its content are covered all over the media already, I won’t add to this. One thing I would like to say, though: the ‘live’ demos did not really look live to me. Would you risk your ‘live’ demo going to hell because of some technical problem while Thomas Kurian is on stage?


Next on my list for today was ‘Agile Development and DevOps Done Even Faster with Oracle IaaS and PaaS‘ by Michael Lehmann, Suhas Uliyar and  Siddhartha Agarwal. This session talked about agile development in the cloud using IaaS, PaaS and Microservices together with DevOps tools like Docker.

First a Cloud Native Architecture was introduced:

Cloud Native Architecture

Multiple services working together to build the cloud native architecture:

Services for the Cloud Native Architecture

The practical part was a live on-stage sample showing how to build, deploy and manage a mobile-fronted, API-first autoscaling application, here a microservice built on Node.js. New is that you can use the Management Cloud Service to introspect the microservice and see how it runs in your environment. The just-built service is then consumed by another app (mobile, using MAX) to visualize the data.

The final dashboard built for the mobile app took only about 20 minutes to build and deploy:

Dashboard for the Mobile Application

and the final detailed architecture of the application:

Detailed Architecture


Next on my Cloud program was ‘Development Operations in the Cloud: A Use Case and Best Practices‘ by Greg Stachnick and Jeff Stephenson. They talked about best practices using the Cloud Services to develop applications from the modern DevOps point of view.

Modern DevOps

The case study was about the development of the Developers Cloud Service itself, neat!

Developers Cloud Service Outline

This is a big project which is running completely in the cloud. Here is an image that shows a code review screen (sorry for the poor quality)

Code Review

After the changes are accepted they are pushed back to the main line, triggering the next cycle in the continuous integration system. The typical cloud developer’s day looks like this:

Day in the Life of a Developer

and similarly there is a day in the life of a manager. To summarize these points:

Summary

This summary hits the nail on the head. I’ve been a contractor in many projects, always asking for more machines or more power. I would be happy if I could spin up another machine to do some testing instead of waiting for others to finish with the machine I need.


Before my day was over there were two sessions about ADF and JDeveloper to attend. First was Shay Shmeltzer with ‘Oracle Application Development Framework and Oracle JDeveloper: What’s New’, which revealed what’s coming up in the world of ADF and JDeveloper. Shay started with a short history of ADF and JDev


which is even longer if you count JBO, which started in 1999. Impressive. The session was mostly about features which are new in JDev 12.2.1 and JDev 12.2.1.1; both versions have been out for quite some time. So, nothing new for seasoned ADF developers at the beginning.

Not so well known are the ADF Business Components triggers, which are better known to Forms developers. They allow you to do things right before or after certain DB events fire.
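The triggers themselves were only shown declaratively on the slides; as a rough point of reference, the long-standing programmatic way to hook into the same DB events in ADF BC is to override doDML() in the entity implementation class. The sketch below assumes a hypothetical EmployeesImpl entity with a CreatedOn attribute.

import oracle.jbo.server.EntityImpl;
import oracle.jbo.server.TransactionEvent;

// Sketch only: react right before an INSERT is sent to the database, comparable
// to a Forms PRE-INSERT trigger. Entity and attribute names are hypothetical.
public class EmployeesImpl extends EntityImpl {

    @Override
    protected void doDML(int operation, TransactionEvent e) {
        if (operation == DML_INSERT) {
            setAttribute("CreatedOn", new java.sql.Timestamp(System.currentTimeMillis()));
        }
        super.doDML(operation, e);
    }
}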


ADF BC REST Services and REST DataControl are better known if you work in the cloud or with mobile applications:

Remote Regions were introduced with JDev 12.2.1 but needed a patch to make them run (fixed in 12.2.1.1):


Remote Task Flows:


UI stuff like responsive support through templates (tablet first), the Masonry Layout and match-media queries:


Lots of new and changed data visualization components:

and finally to sum things up, other enhancements behind the scenes:


For the future we can expect more and easier support for REST services and for writing Groovy code. The biggest change will be the integration of JET Composite Components into ADF pages. JET Composite Components are an equivalent to ADF Declarative Components: you can build components using other components and add properties to them to influence their behavior. Composite Components fire events which you can use to interact with them. I’m not sure how this will work, other than that in the end you have HTML. The bad thing is that there is not even a time frame for this. More details in the next section.

Anyway, ADF is not dead! There will be future development and enhancements in JDeveloper and ADF.


The final session for this long day was ‘Oracle Development Tools and Frameworks: Which One Is Right for You?’ by Shay Shmeltzer (again) and Denis Tyrell. As some of the features are not available at the moment, the ‘Safe Harbor’ statement comes into play. So if you see something which you don’t find in the available version, you have to patiently wait for it. No time frame was given 😦

Shay summarized the different frameworks ADF, MAF, JET and ABCS and pointed out their key features. As the frameworks are well known I’ll spare you most of the details. As promised, I’ll give more detail about the Oracle JET Composite Components.

Sample JET Composite Components

The key features of JET Composite Components and their basic structure were shown as well.

(Coming soon!) The composite components end up together in a Tenant Component Catalog, where the components can be filtered by their characteristics. This will later be extended so that components are available from different channels.


In the end there will be Project Visual Code, which provides a low-code environment.

Project Visual Code

After this deep dive into JET Composite Components, here is the summary of the session, which shows which development framework is suited for which kind of development:

At the end of the session Shay and Denis answered some questions which are noteworthy. I can’t remember all the questions but tried to summarize the key points from the answers:

  1. Oracle focuses on JET as the future development environment. Why? ADF is already feature-rich and developers don’t ask for much more.
  2. Developers want more client-side development. Demand for server-generated UI is going to decline.
  3. JET will get offline capabilities! This can’t be done easily with ADF.
  4. JET allows faster exchange of libraries. JavaScript developers tend to rewrite their UI faster than ADF developers (see yesterday’s summary, where Geertjan Wielenga made the same point).
  5. The Public Component Catalog is only public to a point. You have to submit components, which will then be vetted before other users can use them.
  6. The Cloud IDE (writing code in the cloud) will have JavaScript capabilities.
  7. ABCS (Application Builder Cloud Service) is not available on premises right now.
  8. For declarative JET development look at ABCS. ABCS allows you to get the underlying JET code (save as) so you can look at the code and change it, e.g. to use it elsewhere.

Summary of Day 2 at Oracle Open World 2016

I started day two by brushing up my knowledge of NetBeans and how to work with it. I attended a tutorial session about a JPA modeling tool (jpamodeler.github.io) by Geertjan Wielenga and Gaurav Gupta.

This tool, written by Gaurav Gupta, lets you work seamlessly with entities to create data models and the other way around. It’s a graphical tool to create complex entity relationship models and to generate code, like a REST API and an AngularJS 1 front end, from the model. A nicely done plugin.
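For readers who have not seen such a modeler in action: what it generates from the diagram is ordinary JPA entity code, roughly like the sketch below (entity and attribute names invented for this example).

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;

// Plain JPA entities of the kind a modeling tool generates from a diagram.
@Entity
public class Invoice {

    @Id
    @GeneratedValue
    private Long id;

    private String description;

    // an association drawn in the entity relationship diagram becomes a mapping
    @ManyToOne
    private Customer customer;

    public Long getId() { return id; }
    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }
    public Customer getCustomer() { return customer; }
    public void setCustomer(Customer customer) { this.customer = customer; }
}

@Entity
class Customer {

    @Id
    @GeneratedValue
    private Long id;

    private String name;
}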

Let’s hope we see more plugins like this one, now that NetBeans is being donated to the Apache Foundation.

The next session was again on NetBeans, this time ‘Ten Essential Building Blocks of JavaScript in the Enterprise’, again by Geertjan Wielenga. The session talked about building enterprise applications with a modern JavaScript front end. This trend is still picking up momentum in the industry; the room was packed.

Building blocks

You may ask ‘JavaScript in the enterprise?’, and this is what I thought at first too, but Geertjan showed why and how this can be done. He showed how to translate concepts such as ‘modularity’ and ‘loose coupling’ to JavaScript applications.

There are, however, things to think about, like not putting everything into the browser! There are applications which don’t need to be in the browser as they are very specific to only a couple of people (like air traffic control software). You can supplement such apps with new browser-based parts like notifications or reports.

New features should be based on HTML, CSS and JavaScript; avoid external plugins or Flash. CSS isn’t all about styling; CSS is modular today (about 50 modules). Some of the things you use CSS for can be accomplished better with JavaScript. One example: CSS media queries are used to show or hide elements in the DOM to get responsive designs, while with JavaScript you load and unload libraries from the DOM, reducing the size of the DOM. JavaScript can give better results than CSS media queries.

JavaScript Libraries

One other thing to do is to differentiate between frameworks and libraries (see picture) and what they are doing. The question to answer is whether to use frameworks or libraries to build your application. Using a framework locks you in with the vendor; libraries allow you to swap them out if you find a better solution. On the other hand you have to put everything together yourself, or use e.g. Oracle JET, which puts some well-known libraries together into a working framework but allows you to exchange the libraries for others.

A more debatable argument Geertjan made was about ‘Life in a volatile Ecosystem’:


which points out that you code the application just once, so maintainability isn’t the biggest concern; you will rewrite the app in a short time anyway.

After a short break to change venue I was back in the cloud again. Greg Stachnick talked about ‘A Cloud Platform for Developers: A Tour of Oracle Developer Cloud Service’, a more executive overview with a couple of demos. This was not new to me, so I’ll spare the details. The new info was about things to come in future versions, like the Cloud IDE, which allows you to change code directly in the Developer Cloud.

Cloud IDE

Also new will be the support of Docker Pipeline for continuous integration in a DevOps way:

Docker Pipeline

The downside is that Greg could not disclose an exact date. We are in the range of ‘next year’ 😦

The next session, again with a change of location included, was Shay Shmeltzer on the Application Builder Cloud Service:

Application Builder Cloud Service

In short, the Application Builder Cloud Service allows the citizen developer to create applications by abstracting most of the complexity away. Applications can be built from templates, and you can add your own templates to e.g. implement some corporate identity. Development is done using ready-to-use components (in the end Oracle JET is used for the UI of the application) from the component palette. You only have to drag the components to their place on the canvas defined in the template and you are done.

The data model is built from the entries you make while putting the application together. Once you deploy the application, the entries in the fields are stored in a DB. There is no administration of the DB required; it’s all done in the background. The downside right now is that you can only import/export data to the app using CSV files. Later versions can use REST and the Integration Cloud Service to get to corporate data.

To close the day I went over to the Oracle Applications User Experience Cloud Exchange, an event inside OOW16 showing the next generation of UX. There were some very interesting approaches to how UX can and will (?) work in the future. Some ideas are already implemented in the next versions of some cloud applications.

Smart Office, IoT and other more experimental interacting methods are very interesting for future UX. I had a great time there!


Summary of Day 1 at Oracle Open World 2016

Day one, Sunday 18th September 2016, is reserved for the Oracle User Forum, so it’s not the real start of OOW, which is Larry Ellison’s keynote on Sunday afternoon.

Nevertheless, Sunday is full of interesting sessions. I tend to pick a main theme for this day to get as much info as possible in the short time available.

This year I chose Microservices and the SOA Cloud Service.

The first session I attended was about design patterns for the SOA Cloud Service by Arturo Viveros and Ronald van Luttikhuizen. They boiled their experience from SOA projects down into well-known design patterns (Design Patterns) which you can use if you are tasked with the same problems. They identified patterns like ‘publish/subscribe’ or ‘mediator’ and showed how to implement them in the SOA Cloud Service and other cloud services.

This was really interesting and useful.

Next session was about ‘Tips for mastering SOA Cloud Service’ by Robert van Mölken. He gave practical tips about things you should do and things you should better not do when provisioning the service, running it and writing or composing applications for it. A session with lots of practical info.

To round up the Microservices/SOA theme I attended the session ‘Microservices and DevOps and Oracle Cloud: A Bright Future’ by Sai Janakiram Penumuru. Sai strongly advised using microservices together with the SOA Cloud Service.

The thing I remember best was the ‘definition’ of microservices he gave. He compared a microservice to Unix shell commands. A Unix shell command does one thing, but does it well. Multiple shell commands can be tied together to build something bigger (e.g. using the pipe symbol). This definition is intuitive 🙂

For the remainder of the session he talked about how DevOps works and how the different Oracle Cloud Services support this with the help of microservices.

The rest of the day was reserved for networking and the annual OOW ACE dinner. This year the event was held at the Chart House. Great food, great views!


Thanks to the ACE program for this great event.

DOAG DevCamp2016: Oracle Development Cloud Service Hands On (Part 4)

In part three of the series we completed the task of setting up the build system in the DCS for the AppsCloudUIKit application. This final part of the series concludes with setting up the ‘Continuous Integration’ part by deploying the application to the JCS.

Setting up the Continuous Integration to the JCS

The final task is to set up the continuous integration with the Java Cloud Service we use. This is done in the ‘Deploy’ tab, where we create a ‘New Configuration’.

We then fill in the needed data:

The Configuration name and the application name must match

Next we select a deployment target. Here we can choose between a JCS and an Application Container Cloud, depending on the type of application we develop. As we have created a web application using ADF, we select the Java Cloud Service.

We can decide which type of deployment we like: ‘On Demand’ or ‘Automatic’. Automatic means that after each build the deploy task is triggered. With the checkbox ‘Deploy stable builds only’ we tell the task to deploy successful builds only. If we choose ‘On Demand’ we can select the build we like to deploy.

To see the application running in the JCS we can use the URL AppsCloudUIKit (http://140.86.8.75/AppsCloudUIKit/faces/Welcome)

DOAG DevCamp2016: Oracle Development Cloud Service Hands On (Part 3)

In part two of the series we moved the source of the application to the DCS GIT repository and covered the agile development capabilities of the Oracle Development Cloud Service. In this part we show how to set up the continuous build service for the application.

To build the application in the DCS we have to create ANT build files or a Maven POM. We use ANT to build the project. To create a starting set of ANT build files we can use JDeveloper (New->Ant->BuildScript from Application). This will create two files, build.xml and build.properties. The build.xml file is the ANT build file and is not dependent on the environment. All environment-dependent variables are put in the build.properties file. A sample of the created build.properties looks like

#Mon Feb 15 21:29:41 CET 2016
oracle.commons=R\:\\Java\\12.1.3.0.0\\Oracle\\Middleware\\oracle_common
install.dir=R\:\\Java\\12.1.3.0.0\\Oracle\\Middleware
oracle.jdeveloper.ant.library=R\:\\Java\\12.1.3.0.0\\Oracle\\Middleware\\jdeveloper\\jdev\\/lib/ant-jdeveloper.jar
oracle.home=R\:\\Java\\12.1.3.0.0\\Oracle\\Middleware\\jdeveloper
oracle.jdeveloper.workspace.path=Q\:\\JavaDevCloud\\DOAGDEVCAMP2016\\AppsCloudUIKit\\AppsCloudUIKit.jws
oracle.jdeveloper.deploy.profile.name=*
oracle.jdeveloper.deploy.dir=Q\:\\JavaDevCloud\\DOAGDEVCAMP2016\\AppsCloudUIKit\\deploy
oracle.jdeveloper.ojdeploy.path=R\:\\Java\\12.1.3.0.0\\Oracle\\Middleware\\jdeveloper\\jdev\\bin\\ojdeploy.exe
oracle.jdeveloper.deploy.outputfile=Q\:\\JavaDevCloud\\DOAGDEVCAMP2016\\AppsCloudUIKit\\deploy\\${profile.name}

We see that the generated properties hold absolute paths to programs like ojdeploy and to the installation path of your local JDeveloper. We can use these two files to build the application locally without a problem. Working with the same properties, with their absolute paths, in the DCS will not work: the paths from the local machine simply do not exist on the server. We need to make some changes to the properties file.

The goal is to use the ANT build.xml and build.properties files in both environments with minimal changes; the less we need to change the better. JDeveloper did some good groundwork for us. The build.xml file has no dependency on the environment it’s running in, so the same build.xml file can be used in the local environment and on any server. All locations are either relative or are addressed with properties from the build.properties file.

The DCS offers two different environments (today), one for version 11.1.1.7.0 and one for version 12.1.3.0.0. The environments can be used via environment variables which are preset on the server running the DCS. There are two ORACLE_HOME variables predefined, env.ORACLE_HOME_11G7 and env.ORACLE_HOME_12C3. Depending on which JDeveloper version we use to develop an application and which runs on the server, we substitute the absolute path with one of the preset environment variables:

oracle.commons=R\:\\Java\\12.1.3.0.0\\Oracle\\Middleware\\oracle_common

changed to

oracle.commons=${env.MIDDLEWARE_HOME_12C3}/oracle_common

This we have to do with all absolute paths we find in the build.properties file. We don’t even have to know the real installation path on the DCS; all we do is use the predefined environment variables. Another thing to take care of is that the Hudson server which uses the build files runs on a Linux machine, whereas we normally use a Windows machine. In case you run your local development on a Linux machine too, you can skip the next paragraph.

Use the same build.xml and build.properties on Windows and Linux machines

There are some small but vital differences when running the development on Windows against running it on Linux.

  1. The path separator is different: ‘/’ on Linux, ‘\’ on Windows
  2. Executables on Windows have the suffix ‘.exe’

It would make sense to use the same build files on both machines. The path separator isn’t much of a problem, as the ANT tool handles both Windows and Linux paths without a problem. To make the path separator work on both machines we use the Linux version in the build.properties. This can be done with a simple search-and-replace job.

The build properties contain one call to an executable, ojdeploy. This is the packaging task JDeveloper uses to build JAR, WAR and EAR files. The name suggests that its use is to deploy something to a server, but that’s not what ojdeploy does. The problem is that Windows uses the suffix ‘.exe’ for executables, whereas Linux does not; here an executable just has a flag set in the file properties. One solution would be to rename ojdeploy on the Linux side to ‘ojdeploy.exe’, but for this we would need access to the server’s file system and the right to change this setting. Instead we use a property which we define for the ANT build job, ‘${env.EXEC_SUFFIX}’. The trick is to set the ‘${env.EXEC_SUFFIX}’ property to ‘.exe’ on a Windows system and to ‘’ (empty) on a Linux system.

On a Windows machine we define a system property for this


On a Linux local system we can use an export command. For the Hudson build we use an ant property

Optimizing the build.properties for reuse

A close inspection of the original build.properties file shows that we only need five properties to make the build.properties easy to handle for every project which we want to develop in the cloud and in a local environment.

  • workspace.name: holds the name of the workspace, in our case AppsCloudUIKit
  • project.viewcontroller.name: holds the name of the view controller project which is used to build the WAR file. The sample application uses the ‘DemoMaster’ project to create the WAR file.
  • project.deploy.folder: holds the folder where all artefacts are stored. This folder holds the final artefact, the EAR file.
  • oracle.jdeveloper.deploy.profile.name: The name of the application profile which builds the EAR file.
  • output.dir: holds the directory the build process uses to put the class files to.

The remaining properties don’t change as they use environment variables which will be set by the machine the build system is running on. In the cloud we have two environments (at the time of writing), one for 11g (11.1.1.7.1) and one for 12c (12.1.3). We use the 12.1.3 environment as our JCS uses 12.1.3 too.

The main environment variables are ‘${env.ORACLE_HOME_12C3}’, ‘${env.MIDDLEWARE_HOME_12C3}’, ‘${env.WORKSPACE}’ and finally ‘${env.EXEC_SUFFIX}’. All of them start with ‘env.’, which shows that they are read from the environment. To make ANT aware of the environment variables we have to add one line at the beginning of the generated build.xml:

<property environment="env" />

The final version can be seen below or downloaded at Ant build.properties

#Change the next properties to match your projects names
workspace.name=AppsCloudUIKit
project.viewcontroller.name=DemoMaster
project.deploy.folder=deploy
oracle.jdeveloper.deploy.profile.name=AppsCloudUIKit
output.dir=classes

# Don't change anything below!
oracle.home=${env.ORACLE_HOME_12C3}
oracle.commons=${env.MIDDLEWARE_HOME_12C3}/oracle_common
middleware.home=${env.MIDDLEWARE_HOME_12C3}
install.dir=${env.ORACLE_HOME_12C3}

#Flags
javac.deprecation=off
javac.nowarn=off
java.debug=on
project.workspace.file=${workspace.name}.jws
oracle.jdeveloper.ant.library=${oracle.home}/jdev/lib/ant-jdeveloper.jar
oracle.jdeveloper.workspace.path=${env.WORKSPACE}/${workspace.name}.jws
oracle.jdeveloper.project.name=${project.viewcontroller.name}
oracle.jdeveloper.deploy.dir=${env.WORKSPACE}/${project.deploy.folder}
oracle.jdeveloper.ojdeploy.path=${env.ORACLE_HOME_12C3}/jdev/bin/ojdeploy${env.EXEC_SUFFIX}
oracle.jdeveloper.deploy.outputfile=${env.WORKSPACE}/${project.deploy.folder}/${oracle.jdeveloper.deploy.profile.name} 

Setting up the Build Job in the DCS

The next task is to set up the build job in the DCS. For this we create a new job.

We give the job a name and can now select whether to create a new job or to copy an existing one. We select creating a new one.

Now we configure the build job by filling in the needed information. On the main tab we can describe the job, decide which JDK to use and set how many build job executions are saved. If you use a high number, the space on your virtual storage fills up quickly, as all output and all artefacts are stored. We set this to 10.

We skip the next tab ‘Build Parameters’ and move to the ‘Source Control’ tab

As we use Git, we set the radio button to Git and select the repository of the project we want to build. The ‘Advanced Repository Settings’ are set automatically. In the ‘Branch Specifier’ field we can tell which branch we want to check out. Next we tell the build system when to start working

Here we select that we want to poll the source control system every minute for changes. The schedule uses the Linux ‘cron’ syntax. We skip the ‘Environment’ tab and define one build step on the ‘Build Steps’ tab

All we need to do is execute the application’s build.xml target ‘deploy’. This will compile all other needed projects and create the needed JARs before finally creating the application’s EAR file. The remaining tabs we leave unchanged and save the job configuration. In the ‘Properties’ field we see the definition of the ${env.EXEC_SUFFIX} variable mentioned before.

The ‘Post Build’ tab defines what to do with the outcome of the build. Here we define that the generated artifacts which end up in the ‘deploy’ folder should be archived in GZIP format. If we don’t do this, the build runs, but as nothing is saved we can’t deploy the EAR file later. The remaining tab, ‘Advanced’, we skip, and we save the created job.

Now we can run the build job by clicking the ‘Build’ button. The job doesn’t start immediately but is queued first, then starts running.

… when the build starts we see the change in status.

We can look at the console output of the job by clicking the console icon of the running job. If you click on the console icon of a finished job you see the complete output.

Once the job has finished successfully the artifacts are shown with the build.

In the next and final part we see how to setup the ‘Continuous Integration’ part by deploying the application to the JCS.

DOAG DevCamp2016: Oracle Development Cloud Service Hands On (Part 2)

In part 1 of this series we talked about the Oracle Development Cloud Service (DCS) in general terms and what we plan to do. This part describes the migration of an application developed for an earlier version of JDeveloper to version 12.1.3 and how to move it into the cloud.

As a test case we use the sample application provided by the Rapid Development Kit which shows a sample on how to easily develop modern, scalable applications using the Alta UI. The image below shows the landing page of the application with the splash screen. The running application can be seen at http://140.86.8.75/AppsCloudUIKit/faces/Welcome

In Part 1 we already downloaded the source of the application, created the DCS project, assigned users to the project and initialized the GIT repository for the application in the DCS. The next step is to migrate the application which was designed using JDeveloper 11.1.1.9.0 to JDeveloper version 12.1.3 which we use in the DCS.

Before we start, we check out a new branch named ‘develop’ from the GIT repository. This allows us to work outside the ‘master’ branch. When we finish the migration we can merge the changes back to master. This resembles the Git Flow pattern (see ‘The Git Experience (Part 4)‘).

Migrating is as simple as opening the project in your local JDeveloper 12.1.3 and letting JDeveloper do an automatic migration. There are, however, some things which have to be changed in the sources, as JDeveloper can’t do them automatically.

  1. We check the libraries used in each of the projects of the AppsCloudUIKit workspace. Make sure that there are no red marked libraries as this would mean that the library is not available in the current defined libraries. If we see one of those (e.g. JSF1.2 which is JSF2.1 in 12.1.3) we need to find an equivalent library for 12.1.3 and choose this instead.
  2. We compile each project and correct any errors we find in the compile window. There are some warnings which we let go for the moment; they tell us that the UI uses some tags or components which have been deprecated in JDeveloper 12.1.3. The components are still available, but we should exchange them for the new components in the future. When we compile the projects we have to follow a specific order, the dependency order of the projects. There is a common project ‘UIKitCommon’ which is used in all other projects. This project holds the foundation of the application. Once the project compiles we have to create an ADF Library from it, which is used in the other projects. For this we right-click on the project and select ‘Deploy’->’adflibUIKitCommon…’ and follow the instructions.
  3. We need to setup the data used for the application. The application doesn’t use a DB in this version. All data is created and served via POJO Java classes. All of them reside in the ‘DemoData’ project. We compile this project and create an ADF library from it like we did for the ‘UIKitCommon’ project.
  4. We compile and deploy (to an ADF Library) the other projects in this order: ‘DemoCRM’, ‘DemoHCM’, ‘DemoFIN’ and finally ‘DemoMaster’. The ‘DemoMaster’ project creates an EAR file which can be deployed to a standalone server.

After this we can run the application in the local server integrated in JDeveloper and see if it works (see the image above). Once this is verified, we save all changes in the GIT repository and push them to the cloud-based remote GIT repository. This works like any other remote GIT repository, no difference in usage. After this, the landing page of the DCS project shows the trail of work as in the image below.

Using the collaboration features

One really nice thing about the DCS is the integrated collaboration features like a wiki page, an issue tracker like Jira and an agile board where we can plan sprints to track the progress of the project.

We create a wiki page to collect all decisions made during development, generating documentation this way. This will help members understand the project and how they are supposed to work with it. New members added to the project at a later point in time can use this wiki to understand the project and how to work with the team.

The image below shows the start wiki page of the project.

We add some basic information about the project. Later we add more info about how we changed the project and how to set up the build system.

The wiki supports cascading pages too. We add a page describing the build system to the project. This allows other team members to efficiently use the build system on the DCS. We talk about details of the build system and how to use it in the next part.

Agile Development

The DCS supports agile development. The ‘Agile’ tab opens a sprint planning view of the project. This is a very neat feature. Teams can use this to plan their tasks and track their progress. Here we can create issues (tasks, features or issues) which first end up in the backlog. We can create sprints and assign the issues to sprints.

We can work like in e.g. Jira: we drag issues from the backlog onto the sprint

to add the issue to the sprint

If you like you can change the agile board, e.g. add progress states

Finally we can start the sprint by defining the start and end date. Once a sprint is started we can look at the active sprint to see the tasks in their different states. This view allows drag & drop to make it easy to change the status of a task.

Once all tasks are finished we can complete the sprint.

A look at the ‘Issues’ tab shows the finished work.

All this works out of the box. As a teaser I added a couple of images of the DCS team features when they are integrated in JDeveloper 12.2.1.

Once the DCS supports JDeveloper 12.2.1, integration with the agile board and issue tracker will be as simple as logging into the DCS. No hassle setting up a team server and all the other needed software and adapters.

This concludes the second part of the series. The next part reveals details about the build system.

DOAG DevCamp2016: Oracle Development Cloud Service Hands On (Part 1)

At the end of February the DOAG held its annual DevCamp in Bonn, Germany. One big part of the DevCamp was a session, or better a couple of sessions, about the Oracle Developer Cloud Service and how to use it.

This part shows some general information about the Oracle Cloud. In the next part we show how to migrate an existing application to the cloud and how to use some of the available tools of the Development cloud.

The Developer Cloud Service (DCS) was introduced last year and became available to the public around OOW2015. It offers a whole toolset to allow development of applications in the cloud. The DCS is bundled with the Java Cloud Service (JCS), which is bundled with the Database Cloud Service (DBCS). There are a couple of other services, like the Storage Cloud Service, responsible for managing the disk storage needed, and the Compute Cloud Service, responsible for the security and firewall of all services used by a company. For more information see Fasten your seat belts: Flying the Oracle Development Cloud Service (1- Boarding).

All these services work together. If you have ever set up a working Oracle environment consisting of a DB, a WebLogic Server, a load balancer and the ADF Runtime, you know that this isn’t an easy task to accomplish. The good news is that this work is done automatically by Oracle when provisioning the different services. You as a user or company have to make some decisions, like which version of the DB you want to use, which version of WebLogic Server to install and how many CPUs to use for each service. You can later scale up the number of CPUs or managed servers you want to use in total for your system. All this is very flexible.

Why use the DCS?

Well, as mentioned before, setting up a development environment does take some time and hardware. Sometimes it’s hard to get the time from your admins to get the hardware and set up the software for the full environment for your development. This is one reason I see at my customers for not upgrading to newer software versions. The department has to buy the hardware and software licenses without knowing exactly which hardware parameters they will need later. Once the evaluation is finished you have the hardware and software in stock without knowing if you really need them. After all, it was only an evaluation.

Oracle Cloud Services (Platform as a Service, or PaaS for short) allow you to buy or lease the needed hardware and software to set up an evaluation stack. You can use the stack as long as you pay for it, and you can scale it up or down to your needs. As a sample you can start with a small-scale development environment (DB, WebLogic Server with one admin and one managed server) and later scale it up with a load balancer and multiple managed servers in a cluster.

Right now there are two different environments for development available to configure: 11g as JDev 11.1.1.7.0 and 12c as JDev 12.1.3. In a couple of weeks JDev 12.2.1 should be available too.

Depending on the size (RAM, storage or number of OCPUs) you can select ‘metered’ or ‘non-metered’ services. For pricing information see https://cloud.oracle.com/en_US/java?tabID=1385147650676 for a sample for the JCS.

Installation or configuration of the DCS is not part of this document. Oracle Developer Cloud Service smoothly and invisibly integrates your development environment with the latest versions of other services in Oracle Cloud, such as Oracle Java Cloud Service and Oracle Database Cloud Service.

Another big plus is the fully integrated development life cycle, which allows you to create and administer the configurations for code repositories, continuous integration, testing, building and deployment for all stages of the development.

Developer Cloud Hands On

The remainder of this document talks about the practical work with the DCS. We show which tools are provided and how to use them to set up a CI (Continuous Integration) environment. As a test case we use the sample application provided by the Rapid Development Kit, which shows how to easily develop modern, scalable applications using the Alta UI. The sample was developed by the Oracle Applications User Experience team to give developers a foundation to enhance the sample or use the code they like in their own applications. The image below shows the landing page of the application with the splash screen. The running application can be seen at http://140.86.8.75/AppsCloudUIKit/faces/Welcome

The application comes with a design guide as an e-book and hints on how to use and extend the sample. We start by downloading the source from the web page. The application was developed using JDev 11.1.1.9.0; in the DCS we use JDev version 12.1.3.

Before we start to use the DCS we copy the sources from the zip into a new empty directory. From this directory we start by first logging into the DCS and creating a new project, DevCamp16 in the image below.

The identity domain holds the service (multiple projects) and the members who can access the DCS using different roles. Once we work with one of the projects we can add members to it, but first we have to add them to the identity domain as users. For this document we are only using one administrator and multiple development users. The DCS administrator assigns new users to the DCS; the project administrator assigns members of the DCS to the project. A DCS member can be a member of zero or many projects in different roles. This way one DCS can be used for different projects, and members of one project can be excluded from others. A member of the identity domain only sees the projects he is a member of. This allows a fine-grained project landscape.

The project is the development environment we or the team use to develop an application. A project holds one or more GIT repositories to manage code, a Hudson build server to manage the builds of the software, an issue tracker (a kind of Jira), a wiki which we use to document decisions made during development, a deployment section which we use to implement a CI environment and finally an agile board where we can plan the development (add issues, a backlog and sprints).

The last tab we see is the ‘Administration’ tab, which allows one or more members to act as the administrators of the project. An administrator can add other members to the project, manage their rights and even remove the project with all its artifacts. The UI is modern and built using JET with the Alta UI design. The image below shows the administration page of the project.

The image below shows the GIT repository (or more than one if we add more) and the one Maven repository.

In the next tab we get an overview of the project. This is more or less empty, as not much work has been done yet.

The final tab holds the information about the members of the project.

This concludes part 1. In part 2 we talk about how to migrate the application from the RDK to the DCS. We learn how to set up the build system and integrate with the continuous integration service.

Developer Cloud Service: Continuous Integration with JDeveloper 12.2.1

The last blog showed that the Oracle Developer Cloud Service is now available for JDeveloper and ADF 12.2.1 (Developer Cloud Service with JDeveloper 12.2.1 available). The missing part is the connection of the DCS to the newly created JCS for version 12.2.1. This we show in this blog.

The groundwork of how to set up a build system for the DCS was shown in Fasten your seat belts: Flying the Oracle Development Cloud Service (3 – Take Off – ROTATE). We now have to find out which environment variables to use for the 12.2.1 installation. At the time I wrote the mentioned blog there were only environment variables for 11.1.1.7.1 and 12.1.3.0 available. Looking at the documentation Using Hudson Environment Variables we find that the variables

  • ORACLE_HOME_SOA_12_2_1=/opt/Oracle/MiddlewareSOA_12.2.1/jdeveloper
  • MIDDLEWARE_HOME_SOA_12_2_1=/opt/Oracle/MiddlewareSOA_12.2.1
  • WLS_HOME_SOA_12_2_1=/opt/Oracle/MiddlewareSOA_12.2.1/wlserver

are the right ones (and the only ones which point to 12.2.1). In the build.properties file (from the ‘… Take Off…’ blog) we exchange

# Don't change anything below!
oracle.home=${env.ORACLE_HOME_12C3}
oracle.commons=${env.MIDDLEWARE_HOME_12C3}/oracle_common
middleware.home=${env.MIDDLEWARE_HOME_12C3}
install.dir=${env.ORACLE_HOME_12C3} 

with

# Don't change anything below!
oracle.home=${env.ORACLE_HOME_SOA_12_2_1}
oracle.commons=${env.MIDDLEWARE_HOME_SOA_12_2_1}/oracle_common
middleware.home=${env.MIDDLEWARE_HOME_SOA_12_2_1}
install.dir=${env.ORACLE_HOME_SOA_12_2_1} 

This change will use JDeveloper 12.2.1 to run ojdeploy and configure the application to run on a WebLogic Server 12.2.1. This should do the trick, and we can use the DCS build system to create applications using ADF 12.2.1. As the application I used for the ‘Fasten your seat belts…’ blog series was pretty simple, I’d like to show the result using the application I used for a presentation at the DOAG DevCamp2016, named AppsCloudUIKit. You can read all about this application in a blog I wrote here: DOAG DevCamp2016.

The application was built using JDeveloper 11.1.1.9.0 and was migrated during the DevCamp to 12.1.3, which was the DCS version available at the time of the DevCamp. The first task is to migrate the source to 12.2.1 by creating a new branch in the GIT repository for the new 12.2.1 version.

We clone the repository and create a new branch 12_2_1 which we use to build the AppsCloudUIKit for 12.2.1. As we are now running JDeveloper 12.2.1 we can use the Team Server to get the sources from the DCS GIT repository.

But we can use any other GIT client to get them. As this is covered in other blogs I’ll skip the details here. In the end we have this branch tree,

where the green-marked local branch 12_2_1 is the one we are working on.

After changing the build.properties as shown above we can run the build using ANT on the local machine

By selecting the ‘deploy’ target.

The result is an EAR file in the deploy folder

Setting up the build job

Let’s check in the changes and set up the build in the DCS. Here are the steps for the build job:

With this we can build the application to get the result

Setting up the Deployment

The final task is to set up the deploy task to deploy the application on the JCS_12_2_1. When we select the ‘Deploy’ tab we see the existing deployment configuration for the 12.1.3 JCS.

For the JCS 12.2.1 we created a new JCS instance with a different IP (public). Before we can create a new configuration for the 12.2.1 JCS instance we have to allow the Hudson user access to the JCS. This process is described in detail at Deploying an Application from Oracle Developer Cloud Service to Oracle Java Cloud Service

It’s absolutely necessary to get the Oracle Developer Cloud Service SSH public key and add this key to the JCS 12.2.1 instance as an authorized key. Please follow the instructions given in the link above to do so.

After this is done we can create a new deployment configuration

Start filling in the dialog by giving the configuration a name. Next we create a new ‘Deployment Target’

In the dialog we fill in the public IP address of the new JCS 12.2.1 and select SSH Tunnel. The user name and password are the ones you selected when you created the JCS instance. Test the connection and close the dialog by clicking ‘Use Connection’.

Finally we can complete the Deployment dialog

We choose ‘On Demand’ here, which lets us specify which job/build and artifact to use. A click on ‘Save and Deploy’ closes the dialog and the artifact will be deployed to the JCS 12.2.1. The URL to open the application is AppsCloudUIKit 12.2.1.

And we should see