DOAG DevCamp2016: Oracle Development Cloud Service Hands On (Part 2)

In part 1 of this series we talked about the Oracle Development Cloud Service (DCS) in general terms and what we plan to do. This part describes the migration of an application developed for an earlier version of JDeveloper to version 12.1.3 and how to move it into the cloud.

As a test case we use the sample application provided by the Rapid Development Kit, which shows how to easily develop modern, scalable applications using the Alta UI. The image below shows the landing page of the application with the splash screen. The running application can be seen at http://140.86.8.75/AppsCloudUIKit/faces/Welcome

In Part 1 we already downloaded the source of the application, created the DCS project, assigned users to the project and initialized the GIT repository for the application in the DCS. The next step is to migrate the application which was designed using JDeveloper 11.1.1.9.0 to JDeveloper version 12.1.3 which we use in the DCS.

Before we start we check out a new branch named ‘develop’ from the GIT repository. This allows us to work outside the ‘master’ branch. When we finish the migration we can merge the changes back into the master branch. This resembles the GIT Flow pattern (see ‘The Git Experience (Part 4)‘).
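
A minimal sketch of the commands, run from the local clone (the branch name is the one used in this post):

    git checkout -b develop
    git push origin develop
    # later, when the migration is finished
    git checkout master
    git merge develop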

Migrating is as simple as opening the workspace in your local JDeveloper 12.1.3 and letting JDeveloper do an automatic migration. There are some things which have to be changed manually in the sources, as JDeveloper can’t do them automatically.

  1. We check the libraries used in each of the projects of the AppsCloudUIKit workspace. Make sure that there are no red-marked libraries, as this would mean that the library is not available among the currently defined libraries. If we see one of those (e.g. JSF 1.2, which is JSF 2.1 in 12.1.3) we need to find an equivalent library for 12.1.3 and choose this instead.
  2. We compile each project and correct any errors we find in the compile window. There are some warnings which we let go for the moment. They tell us that the UI uses some tags or components which have been deprecated in JDeveloper 12.1.3. The components are still available, but we should exchange them for the new components in the future. When we compile the projects we have to follow a specific order, the dependencies of the projects. There is a common project ‘UIKitCommon’ which is used in all other projects and holds the foundation of the application. Once the project compiles we have to create an ADF library from it which is used in the other projects. For this we right-click the project, select ‘Deploy’->’adflibUIKitCommon…’ and follow the instructions.
  3. We need to set up the data used by the application. The application doesn’t use a DB in this version. All data is created and served via POJO Java classes. All of them reside in the ‘DemoData’ project. We compile this project and create an ADF library from it like we did for the ‘UIKitCommon’ project.
  4. We compile and deploy (to an ADF library) the other projects in this order: ‘DemoCRM’, ‘DemoHCM’, ‘DemoFIN’ and finally ‘DemoMaster’. The ‘DemoMaster’ project creates an EAR file which can be deployed to a standalone server.

After this we can run the application in the local server integrated in JDeveloper and see if it works (see the image above). Once this is verified we save all changes in the GIT repository and push them to the cloud-based remote GIT repository. This is like working with any other remote GIT repository; there is no difference in usage. After this the landing page of the DCS project shows the trail of work as in the image below.

Using the collaboration features

One really nice thing about the DCS is the integrated collaboration features, like a wiki, an issue tracker similar to Jira, and an agile board where we can plan sprints and track the progress of the project.

We create a wiki page to collect all decisions made during development, generating documentation this way. This helps members understand the project and how they are supposed to work with it. New members added to the project at a later point in time can use this wiki to understand the project and how to work with the team.

The image below shows the start wiki page of the project

where we add some basic information about the project. Later we add more info about how we changed the project and how to set up the build system.

The wiki supports cascading pages too. We add a page describing the build system to the project. This allows other team members to efficiently use the build system on the DCS. We talk about details of the build system and how to use it in the next part.

Agile Development

The DCS supports agile development. The tab ‘Agile’ opens a sprint planning view of the project. This is a very neat feature. Teams can use it to plan their tasks and track their progress. Here we can create issues (tasks, features or defects) which first end up in the backlog. We can create sprints and assign the issues to sprints.

We can work as we would in e.g. Jira: we drag issues from the backlog onto the sprint

to add the issue to the sprint

If you like you can change the agile board, e.g. add progress states

Finally we can start the sprint by defining the start and end date. Once a sprint is started we can look at the active sprint to see the tasks in their different states. This view allows drag & drop to make it easy to change the status of a task.

Once all tasks are finished we can complete the sprint.

A look at the ‘Issues’ tab shows the finished work.

All this works out of the box. As a teaser I added a couple of images of the DCS team features as they appear when integrated in JDeveloper 12.2.1

When the DCS supports JDeveloper 12.2.1, the integration with the agile board and issue tracker is as simple as logging into the DCS. There is no hassle setting up a team server, all the other needed software and their adapters.

This concludes the second part of the series. The next part reveals details about the build system.

Navigating an af:table in pagination mode from a bean

A question on the JDeveloper and ADF OTN forum asked how to navigate to a specific page of an af:table in pagination mode. As of JDeveloper 11.1.1.7.0, ADF tables can be rendered in scroll mode or in pagination mode, where only a specific number of rows are visible in the table.

af:table in pagination mode

To navigate the pages there is a small navigation toolbar below the table which allows you to enter a page number or to navigate to the previous, next, first or last page.

The problem to solve is how to navigate the paginated table from within a Java bean.

The table doesn’t offer any navigation listeners or methods you can bind bean methods to. Luckily there is the RangeChangeEvent, one of the FacesEvents, which can be used to notify a component that a change in the range has taken place.

All we have to do to navigate the table in pagination mode is to calculate the needed parameters

  • oldStart: The previous start of this UIComponent’s selected range, inclusive
  • oldEnd: The previous end of this UIComponent’s selected range, exclusive
  • newStart: The new start of this UIComponent’s selected range, inclusive
  • newEnd: The new end of this UIComponent’s selected range, exclusive

We add an input field to the page which allows us to enter a page number, and a button which we use to call an action listener in a bean.
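
A minimal sketch of that page markup (the attribute binding name ‘gotopage1’ and the table id ‘t1’ match the bean code below; the managed bean name and the component ids are assumptions):

    <af:inputText label="Page" value="#{bindings.gotopage1.inputValue}" id="it1"/>
    <af:button text="Goto Page" id="b1" actionListener="#{PaginationBean.onGotoPage}"/>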

The running application looks like

Running application

Another button is used to calculate the index of the selected row in the whole rowset, the index on the page and the page number. The row index and the index of the row on the page are zero based, page numbers start with 1. Let’s look at the code:

public void onGotoPage(ActionEvent actionEvent) {
    BindingContainer bindingContainer = BindingContext.getCurrent().getCurrentBindingsEntry();
    // get number of the page to go to
    AttributeBinding attr = (AttributeBinding) bindingContainer.getControlBinding("gotopage1");
    Integer newPage = (Integer) attr.getInputValue();
    if (newPage == null) {
        return;
    }
    // page one starts at index 0, so subtract 1 from the page number
    newPage--;
    DCIteratorBinding iter = (DCIteratorBinding) bindingContainer.get("EmployeesView1Iterator");
    // calculate the old and new ranges for the RangeChangeEvent
    int range = iter.getRangeSize(); // note: the table and this code both take the page size from the iterator's RangeSize
    int oldStart = iter.getRangeStart();
    int oldEnd = oldStart + range;
    int newStart = newPage * range;
    int newEnd = newStart + range;
    // find the table
    UIViewRoot iViewRoot = FacesContext.getCurrentInstance().getViewRoot();
    UIComponent table = iViewRoot.findComponent("t1");
    // build the event and fire it
    RangeChangeEvent event = new RangeChangeEvent(table, oldStart, oldEnd, newStart, newEnd);
    ((RichTable) table).broadcast(event);
    // update the table
    AdfFacesContext.getCurrentInstance().addPartialTarget(table);
}

In lines 2-8 we get the new page number we want to navigate to. In lines 9-10 we subtract 1 from the given number, as the page is zero based internally. In line 11 we get the iterator which we need to get the range size and the start of the current range (lines 13-15). These values are oldStart and oldEnd. In lines 16-17 we calculate the new range start as the page to go to multiplied by the range size. The newEnd parameter is newStart plus the range size.
In lines 18-20 we get the table component on the page. Then we create the RangeChangeEvent and broadcast the event to the table component in lines 21-23. Finally we PPR the table to see the change in the UI.

To show how to calculate the other way around, to get from the selected row in a table to the index on the page, the page number and the index in the rowset, we added another button ‘GetPageOfSelectedRow’ which calls a listener in the same bean which builds a string with the needed information.

public void onGetCurrentPage(ActionEvent actionEvent) {
    BindingContainer bindingContainer = BindingContext.getCurrent().getCurrentBindingsEntry();
    DCIteratorBinding iter = (DCIteratorBinding) bindingContainer.get("EmployeesView1Iterator");
    // calculate index and page number. Index is zero based!
    int currentRowIndex = iter.getRowSetIterator().getCurrentRowIndex();
    _logger.info("CurrentRowIndex: " + currentRowIndex);
    int currentPage = currentRowIndex / iter.getRangeSize();
    currentPage++;
    _logger.info("Current Page:" + currentPage);
    int indexOnPage = (currentRowIndex % iter.getRangeSize());
    _logger.info("Current index on Page:" + indexOnPage);
    // get an ADF attributevalue from the ADF page definitions
    AttributeBinding attr = (AttributeBinding) bindingContainer.getControlBinding("selectedRow1");
    StringBuffer sb = new StringBuffer();
    sb.append("row index overall: ");
    sb.append(currentRowIndex);
    sb.append(" row index on page: ");
    sb.append(indexOnPage);
    sb.append(" Page: ");
    sb.append(currentPage);
    attr.setInputValue(sb.toString());
}

To get the index of the selected row in the whole rowset we need the iterator and get the RowSetIterator from it. The RowSetIterator method getCurrentRowIndex() returns the index of the current row (line 5). The current page is calculated by dividing the current index by the range size (line 7). The final information is the index of the selected row on the page, which is calculated as the current index modulo the range size (line 10). The rest of the listener builds a string out of this information and writes it to a pageDef variable which is referenced in an output field on the page.

<af:outputText value="#{bindings.selectedRow1.inputValue}" id="ot8" partialTriggers="b2"/>

Here are some images from the sample application.

The sample application was built using JDev 12.1.3 and uses the HR DB schema. The sample can be downloaded from GitHub.

Fasten your seat belts: Flying the Oracle Development Cloud Service (3 – Take Off – ROTATE)

The last part of the series, 3 – Take Off – V1, ended when we could build the application using ANT on the local machine. In this part we are going to try this on the Oracle Developer Cloud. Finally we should see how Continuous Integration and Continuous Delivery work in the cloud.

Note: I created a fresh set of ANT build scripts named ‘buildlocal.xml’ and ‘buildlocal.properties’ from the project to demonstrate the process. The original ones, named ‘build.xml’ and ‘build.properties’, are the final result which I didn’t want to revert. So when you create the ANT scripts yourself you can use the default names ‘build.xml’ and ‘build.properties’. When I talk about build files I now mean the ones named ‘buildlocal.*’.

Demo Build Files


For the same reason we create a new build job in the cloud named ADFTestBuild to show the steps to take. The final build job is named ADFCommunityFrkExt.
Demo Build Job

We already pushed the local build files to the remote repository. Let’s run the build in the cloud. First we log into the Oracle Developer Cloud as a team member, switch to the build tab and create a new build job (ADFTestBuild)


Note that we use JDK 7 to build the project. The Oracle Developer Cloud offers JDK 6-8 to work with. As we use JDev 12.1.3 we use JDK 7.
JDKs available


In the Source Code Management section we select the repository and branch to use for this job. The advanced section can be left blank as it’s filled in by the system when you save the job. There are more advanced options you can set, but they are not part of this post. All we need to remember:

Note: Builds are dependent on ONE branch.

Note: The build trigger defines that every minute the CI system checks the SCM for changes. If there are any, it schedules the build job for execution.

When we are finished with the feature we have to change the build job, or create a new one, which uses the master or default branch to build on. In our situation, where we implement the CI, we set the branch to the one we are working on, named ‘feature-setup-build’.

After saving the new job we can start it by clicking on the ‘Run Now’ button


Hm, the build did not work as it did on the local machine. This is shown by the icon in the first column of the job history table. To find out what went wrong we look at the output of the build by clicking the ‘console’ button in the last column of the table
Build output


In the first marked section we see the build file ‘buildlocal.xml’ which was used, and in the second marked section the error message. It looks like the build job can’t find the task ‘OJDeployAntTask’. A look into the buildlocal.xml file at line 40 reveals

   <taskdef name="ojdeploy" classname="oracle.jdeveloper.deploy.ant.OJDeployAntTask" uri="oraclelib:OJDeployAntTask"
             classpath="${oracle.jdeveloper.ant.library}"/>

where line 40 is the classpath in the above listing. This means that the variable “${oracle.jdeveloper.ant.library}” is not found. A look into the Oracle Developer Cloud documentation at Developing Oracle ADF Applications with Oracle Developer Cloud Service gives the needed information. We have to alter the build files:
1. add a line

<property environment="env"/>

to the build.xml file before loading the build.properties
2. change the build.properties file to use information from the now loaded environment
The second part is a bit confusing. From the link above we learn to set a variable as
oracle.home=${env.ORACLE_HOME}
which is a bit misleading. The problem is that the Developer Cloud offers two environments to the user, one for 11g and one for 12c. As we use the one for 12c we have to use a different setup, which can be found in the docs too at a different location: Using Hudson Environment Variables. The second link tells us to use
oracle.home=${env.ORACLE_HOME_12C3}

Note: Add <property environment="env"/> to your build.xml to load the environment of the server.

Note: Add
oracle.home=${env.ORACLE_HOME_12C3}
oracle.commons=${env.MIDDLEWARE_HOME_12C3}/oracle_common
middleware.home=${env.MIDDLEWARE_HOME_12C3}
install.dir=${env.ORACLE_HOME_12C3}
to the build.properties file to make use of the server's environment.

With this info we can make the needed changes. The resulting build.properties is

#Fri Jul 24 15:06:08 CEST 2015
#Change the next three properties to match your projects names
workspace.name=ADFCommunityFrkExt
workspace=${env.WORKSPACE}
project.viewcontroller.name=FrkExtModel
project.deploy.folder=deploy
oracle.jdeveloper.deploy.profile.name=adflibADFCommunityFrkExt
output.dir=classes

# Don't change anything below!
oracle.home=${env.ORACLE_HOME_12C3}
oracle.commons=${env.MIDDLEWARE_HOME_12C3}/oracle_common
middleware.home=${env.MIDDLEWARE_HOME_12C3}
install.dir=${env.ORACLE_HOME_12C3}

#Flags
javac.deprecation=off
javac.nowarn=off
java.debug=on

project.workspace.file=${workspace.name}.jws
oracle.jdeveloper.ant.library=${oracle.home}/jdev/lib/ant-jdeveloper.jar
oracle.jdeveloper.workspace.path=${workspace}/${workspace.name}.jws
oracle.jdeveloper.project.name=${project.viewcontroller.name}
oracle.jdeveloper.deploy.dir=${workspace}/${project.deploy.folder}
oracle.jdeveloper.ojdeploy.path=${env.ORACLE_HOME_12C3}/jdev/bin/ojdeploy${env.EXEC_SUFFIX}
oracle.jdeveloper.deploy.outputfile=${workspace}/${project.deploy.folder}/${oracle.jdeveloper.deploy.profile.name}

and the build.xml

<?xml version="1.0" encoding="UTF-8" ?>
<!--Ant buildfile generated by Oracle JDeveloper-->
<!--Generated Aug 22, 2015 3:15:37 PM-->
<project xmlns="antlib:org.apache.tools.ant" name="FrkExtModel" default="all" basedir=".">
  <property environment="env"/>
  <property file="build.properties"/>
  <path id="library.ADF.Model.Runtime">
    <pathelement location="${oracle.commons}/modules/oracle.idm_12.1.3/identitystore.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.model_12.1.3/adfm.jar"/>
    <pathelement location="${oracle.commons}/modules/groovy-all-2.1.6.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.model_12.1.3/adftransactionsdt.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.view_12.1.3/adf-dt-at-rt.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.model_12.1.3/adfdt_common.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.model_12.1.3/adflibrary.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.xdk_12.1.3/xmlparserv2.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.model_12.1.3/db-ca.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.model_12.1.3/jdev-cm.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.ldap_12.1.3/ojmisc.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.share_12.1.3/commons-el.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.share_12.1.3/jsp-el-api.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.share_12.1.3/oracle-el.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.security_12.1.3/adf-share-security.jar"/>
    <pathelement location="${oracle.commons}/modules/oracle.adf.security_12.1.3/adf-controller-security.jar"/>
    <pathelement location="${oracle.commons}/modules/javax.mail_2.0.0.0_1-4-4.jar"/>
  </path>
  <path id="classpath">
    <path refid="library.ADF.Model.Runtime"/>
  </path>
  <target name="init">
    <tstamp/>
    <mkdir dir="${output.dir}"/>
  </target>
  <target name="info">
    <echo level="info">build: env.ORACLE_HOME=${env.ORACLE_HOME_12C3}</echo>
    <echo level="info">build: env.WORKSPACE=${env.WORKSPACE}</echo>
    <echo level="info">build: workspace=${workspace}</echo>
    <echo level="info">build: install.dir=${env.ORACLE_HOME_12C3}</echo>
    <echo level="info">build: oracle.commons=${oracle.commons}</echo>
    <echo level="info">build: oracle.jdeveloper.ant.library=${oracle.jdeveloper.ant.library}</echo>
    <echo level="info">build: oracle.jdeveloper.ojdeploy.path=${oracle.jdeveloper.ojdeploy.path}</echo>
    <echo level="info">build: oracle.jdeveloper.deploy.dir=${oracle.jdeveloper.deploy.dir}</echo>
    <echo level="info">build: oracle.jdeveloper.deploy.profile.name=${oracle.jdeveloper.deploy.profile.name}</echo>
    <echo level="info">build: oracle.jdeveloper.workspace.path=${oracle.jdeveloper.workspace.path}</echo>
    <echo level="info">build: oracle.jdeveloper.deploy.outputfile=${oracle.jdeveloper.deploy.outputfile}</echo>
  </target>
  <target name="all" description="Build the project" depends="info,deploy,compile,copy"/>
  <target name="clean" description="Clean the project">
    <delete includeemptydirs="true" quiet="true">
      <fileset dir="${output.dir}" includes="**/*"/>
    </delete>
  </target>
  <target name="deploy" description="Deploy JDeveloper profiles" depends="init">
    <taskdef name="ojdeploy" classname="oracle.jdeveloper.deploy.ant.OJDeployAntTask" uri="oraclelib:OJDeployAntTask"
             classpath="${oracle.jdeveloper.ant.library}"/>
    <ora:ojdeploy xmlns:ora="oraclelib:OJDeployAntTask" executable="${oracle.jdeveloper.ojdeploy.path}"
                  ora:buildscript="${oracle.jdeveloper.deploy.dir}/ojdeploy-build.xml"
                  ora:statuslog="${oracle.jdeveloper.deploy.dir}/ojdeploy-statuslog.xml">
      <ora:deploy>
        <ora:parameter name="workspace" value="${oracle.jdeveloper.workspace.path}"/>
        <ora:parameter name="project" value="${oracle.jdeveloper.project.name}"/>
        <ora:parameter name="profile" value="${oracle.jdeveloper.deploy.profile.name}"/>
        <ora:parameter name="nocompile" value="false"/>
        <ora:parameter name="outputfile" value="${oracle.jdeveloper.deploy.outputfile}"/>
      </ora:deploy>
    </ora:ojdeploy>
  </target>
  <target name="compile" description="Compile Java source files" depends="init">
    <javac destdir="${output.dir}" classpathref="classpath" debug="${javac.debug}" nowarn="${javac.nowarn}"
           deprecation="${javac.deprecation}" encoding="UTF8" source="1.7" target="1.7" includeantruntime="false">
      <src path="src"/>
    </javac>
  </target>
  <target name="copy" description="Copy files to output directory" depends="init">
    <patternset id="copy.patterns">
      <include name="**/*.GIF"/>
      <include name="**/*.JPEG"/>
      <include name="**/*.JPG"/>
      <include name="**/*.PNG"/>
      <include name="**/*.cpx"/>
      <include name="**/*.dcx"/>
      <include name="**/*.ejx"/>
      <include name="**/*.gif"/>
      <include name="**/*.ini"/>
      <include name="**/*.jpeg"/>
      <include name="**/*.jpg"/>
      <include name="**/*.png"/>
      <include name="**/*.properties"/>
      <include name="**/*.sva"/>
      <include name="**/*.tag"/>
      <include name="**/*.tld"/>
      <include name="**/*.wsdl"/>
      <include name="**/*.xcfg"/>
      <include name="**/*.xlf"/>
      <include name="**/*.xml"/>
      <include name="**/*.xsd"/>
      <include name="**/*.xsl"/>
      <exclude name="build.xml"/>
    </patternset>
    <copy todir="${output.dir}">
      <fileset dir="src">
        <patternset refid="copy.patterns"/>
      </fileset>
      <fileset dir=".">
        <patternset refid="copy.patterns"/>
      </fileset>
    </copy>
  </target>
</project>

The files above are the original ones and can be run from the build console to get this

Great, we have now successfully enabled CI in the cloud for the ‘Framework Extension’ project. Well, there is something more to think about: can’t we use the same ANT build scripts on the local machine too?

Yes, we can but we have to make some adjustments for this.

Now that we read the environment from the server the ANT script is running on to set some of the variables, we need to set these environment variables on the local machine too. This can be done easily by altering the jdev start file (Linux) or using a batch file to first set the environment variables and then start JDev (Windows). Below is my changed jdev start script

#!/bin/bash

#=============================================================================
#  Launcher for Oracle JDeveloper 12c (12.1.2.0.0)
#=============================================================================

unset -v GNOME_DESKTOP_SESSION_ID
export MIDDLEWARE_HOME_12C3=/opt/jdev/12.1.3.0.0/Oracle/Middleware
export ORACLE_HOME_12C3=/opt/jdev/12.1.3.0.0/Oracle/Middleware/Oracle_Home/jdeveloper
export WORKSPACE=/data/development/ENTW_12.1.3.0.0
export EXEC_SUFFIX=
/opt/jdev/12.1.3.0.0/Oracle/Middleware/Oracle_Home/jdeveloper/jdev/bin/jdev $1

As you see, I set the environment variables which are later read through the build.xml file before starting JDeveloper.
The one line
export EXEC_SUFFIX=
needs special attention. It’s only necessary if you run JDev on different operating systems (Linux and Windows). The build file has one variable pointing to the ojdeploy executable
oracle.jdeveloper.ojdeploy.path=${env.ORACLE_HOME_12C3}/jdev/bin/ojdeploy${env.EXEC_SUFFIX}
Users of Windows need to add the suffix ‘.exe’ to this variable, as ojdeploy can’t be started otherwise under Windows.
The problem is that we can’t add it for Linux systems as they don’t know this suffix. The solution I found is to add ${env.EXEC_SUFFIX} to the executable and set it to an empty string for Linux systems. For Windows systems you have to set this environment variable to ‘.exe’. For this I use a batch file where I use
setx EXEC_SUFFIX .exe
before starting JDev. In the same batch I set the other variables too

setx ORACLE_HOME_12C3 r:\Java\12.1.3.0.0\Oracle\Middleware\jdeveloper
setx MIDDLEWARE_HOME_12C3 r:\Java\12.1.3.0.0\Oracle\Middleware
setx EXEC_SUFFIX .exe

Note: To make the build files work under Windows, Linux and OS X, add an environment variable defining the suffix for executable files.

One final trick is to set the workspace directory. The build.properties file has one more environment variable, workspace=${env.WORKSPACE}, which we need to set.
As the workspace isn’t fixed on a local machine, at least if you have more than one workspace, you can’t set this variable before you start JDev. This has to be done per workspace, when you change the workspace.
JDev has a solution for this in the ANT properties section

ANT Project Properties


You can choose from different variables JDev sets according to the workspace and project you are working with.
JDeveloper Variables

Note: Set the env.WORKSPACE environment variable in the ANT properties of the project.

This concludes this part of the series. In the next post we finish the feature ‘feature-setup-build’ by introducing the code review function of the Oracle Developer Cloud. This will be followed by a post about building a simple ADF application with a UI which you use to show the Continuous Delivery (CD) option of the Oracle Developer Cloud.

Fasten your seat belts: Flying the Oracle Development Cloud Service (3 – Take Off – V1)

In part three of the series about the Oracle Developer Cloud we start working on a project as a member of a team in the developer cloud.

Before starting a new project some basic ground has to be covered: what architecture and technology the project should use, as well as which package path to use. For the technology the decision is easy, as we want to use ADF. For the architecture we can choose one of the patterns outlined at ‘Angles in the architecture’.

A good starting point for every ADF project, regardless of the architectural pattern, is a framework extension project (see ‘Extending a Helping Hand’). So we start with this too.

As a developer can’t create a new repository in a cloud project, we have to do this as a user with admin rights.


The first thing to note is that you should create an empty repository (unmark the ‘Initialize repository with README file’ checkbox). If you initialize the repository with a README file, the developer can’t later just push his initial local version of the JDev workspace into the remote repository; the local repository has to be updated with the README file first.

Now that the remote, empty repository exists we switch roles and work as a developer. For this we use a different login as a user who only has developer rights in the Oracle Developer Cloud.
Before the developer uses the new repository he creates a new workspace or project. We create a workspace for the framework extension library.


Next we add the ‘ADF Model Runtime’ library to the project and then the framework extension classes to the workspace.
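
As an illustration, such framework extension classes are typically just empty subclasses of the matching ADF base classes. A minimal sketch (package and class names are assumptions, not the actual ones from the workspace):

    package de.hahn.blog.frkext.model.framework;

    import oracle.jbo.server.ApplicationModuleImpl;

    /**
     * Framework extension class all application modules should extend.
     * Global functionality can be added here later when needed.
     */
    public class FrkExtApplicationModuleImpl extends ApplicationModuleImpl {
    }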

Right now we don’t need to add or change any of the code in the created classes. If we later need to add some global functionality we come back to these classes. The next thing to do is to create an ADF library from the created classes

To make the new library available for other projects you can create a new file system connection using the same path we specified in the deployment descriptor

Later we come back to this step, as we’ll see that we have to change it a bit to make it work in the cloud. Right now we leave it as is, as this shows how you would normally do this in a regular project.
The next thing to do is to initialize a local GIT repository and push this to the Oracle Development Cloud repository as the initial master

and then push the local master branch to the Oracle Developer Cloud repository. For this we use the repository URL we get when we log into the cloud as the developer
Copy git repository address


Using this URL we push the local repository to the remote one
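
A rough sketch of the commands (the placeholder stands for the repository address copied from the project page):

    git remote add origin <repository URL copied from the cloud>
    git push origin master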

to finally see the changes in the cloud

More Decisions
With the basics covered we have to make another decision:
How to define the workflow for changes made to the project sources?
Should all team members work on the trunk (called the mainline) or should each member use a branch to work on (called a feature branch)? Both of these practices have their supporters and, naturally, opponents. The first is more CI-like per definition. Feature branches, on the other hand, are not CI by definition, as the code is not continuously integrated into the mainline. This dispute is not for this post and maybe not for this blog. Anyway, let’s start with feature branching.
This allows us to show a feature of the Oracle Developer Cloud, as it allows for code reviews, which are mostly used when you work with feature branches but can be used with the other practice too.

Feature: add build files
The feature we implement is to set up a build system for our framework extension project. We name the feature ‘feature-setup-build’


We learned in part 1 that the Oracle Developer Cloud provides a Continuous Integration (CI) server. We plan to use this CI server to build our library whenever the code changes. For this we need to use ANT or Maven as the build system. For this project we choose ANT and can now create the needed build.xml files from the project

To finish this part we add the new files to our local repository and then push them to the remote as a new branch.

We push the local changes to the remote repository in the cloud using the same branch name

We have not looked into the created build.xml or build.properties files; we just created them and pushed them into the repository. The question is: will they work?
Let’s try it on the local machine first.

Now we can run the ANT target ‘all’ which is the default one.
Well, as JDev 12.1.3 has somehow eliminated the ANT toolbar buttons, running ANT on a project is a bit cumbersome (hopefully the ANT build buttons are back with the next release)

OK, this works like a charm.

As this post is already very long, we split the take off into two parts, V1 and ROTATE. This concludes part V1. Next time we make the necessary changes to the build files to integrate them into the cloud's build system and start the CI process.

Note: for those who wonder about the terms V1 and ROTATE:
– V1 is the maximum speed at which an aircraft pilot may abort a takeoff without causing a runway overrun
– ROTATE or Vr is the speed of an aircraft at which the pilot initiates rotation to obtain the scheduled takeoff performance

Fasten your seat belts: Flying the Oracle Development Cloud Service (2 – Safety)

In this second part of the series we take a look at the safety features on board of our aircraft, named Oracle Developer Cloud Service.
As in a real aircraft, we don’t see all safety features available, as some are hard to show without blacking out most of the screen. We cover how the Developer Cloud looks for a new member of a project. Remember that a project in the cloud is not the same as a project in JDeveloper. For more info on this refer to Part 1 – Boarding.

We start with an administrative task of creating a new member for our Identity Domain in the Oracle Developer Cloud Service. This is necessary as only members to the Identity Domain can be members of a project in the Developer Cloud. The Identity Domain is the sandbox which holds all available (or licensed) cloud services. In our installation it contains the services we saw in the first part of the series.

Logged in as an administrator of the identity domain we can add a new user

Add a new member


Clicking on one of the marked links will open a couple of dialogs to fill in the new user’s data

In the first image we fill in the basic user data like name and e-mail address and the roles the user is assigned to. The e-mail address is significant, as the new member gets a nice mail with credentials he/she must use to verify the e-mail address and finish the account setup by changing the initial system-assigned password.
The possible roles a user can be assigned to can be seen on the left. We only assign the new member, a Java developer, the role ‘Developers Service User Role’. This is sufficient to work with the Oracle Developer Cloud Service as part of a development team. The other roles allow a user access to more administrative tasks and the other parts of the Oracle Development Cloud Service (DB, storage…).
Once the dialog is filled out an e-mail is sent to the new member, as well as to the manager of the user if this field is filled in.
New member e-mail


When the new member follows the link in the e-mail and logs in the first time, he or she has to change the password. This isn’t just changing the password: you also have to answer three questions which are used if you forget your password and need to reset it later. You should note down the answers carefully! The next step is to configure the user interface language and timezone.

Finally you get transferred to the landing page showing all available services from all identity domains the e-mail address is or was registered to.
As I used the e-mail address before to get a trial account, the landing page shows multiple identity domains. The one we are using in this post is marked with a red border. You can use the drop-down to show only one identity domain, which makes it less confusing.

Clicking on the Oracle Developer Cloud Service you are transferred to the ‘Welcome Page’ (last image of the gallery). At the moment you can only create a new project, but you don’t see any available projects. The reason for this is that the new member is not attached to an existing project. This has to be done by an ‘owner’ of the project. Only after this can a member access the project.
To add the new member to an existing project, we log into the Oracle Development Cloud Service as an owner of the project and add the new member to the project.

Administrator adds new member


The next time the user updates the ‘Welcome Page’ or logs in again he’ll see the project.

Clicking on the project shows the project’s home page with the project’s timeline and information about the git and maven repositories.
Project’s landing page


Now the new member can access the git repository information by clicking the menu button
Copy git repository address


With this information the member can clone the repository using JDeveloper 12.1.3

The member can now work locally with the project and make changes needed or assigned to him/her.

This concludes the 2nd part about safety and setting up members in the Oracle Developer Cloud Service.

In the next part we will introduce how to work with projects and how to setup projects for continuous integration (CI).

Export to Excel enhancements in JDeveloper 11.1.1.9.0 and JDeveloper 12.1.3

In the current JDeveloper versions 11.1.1.9.0 and 12.1.3 the af:exportCollectionActionListener got enhanced by options to filter the data to export.

Enhanced options of exportCollectionListener


The option this blog talks about is the marked one, the FilterMethod. The documentation at 12.12 Exporting Data from Table, Tree, or Tree Table does not reveal too much about how to use this FilterMethod.
The sample we build in this blog entry shows how the FilterMethod can be used to filter the data to be exported to excel.
In older versions of JDev you had to use a trick to filter the data which was downloaded from a table, see Validate Data before Export via af:exportCollectionActionListener or af:fileDownloadActionListener. The new property of the af:exportCollectionActionListener allows filtering the data without using the trick.
The sample just loads the employees table from the HR DB schema and shows it in a table on the screen. In the toolbar we add a button which has the af:exportCollectionActionListener attached.
Running application


Below is the page code of the toolbar holding the export button:

                            <f:facet name="toolbar">
                                <af:toolbar id="t2">
                                    <af:button text="Export to Excel" id="b1">
                                        <af:exportCollectionActionListener type="excelHTML" exportedId="t1" filename="emp.xsl" title="Export"
                                                                           filterMethod="#{ExportToExcelBean.exportCollectionFilter}"/>
                                    </af:button>
                                </af:toolbar>
                            </f:facet>

The filterMethod of the af:exportCollectionActionListener points to a bean method exportCollectionFilter in a request scoped bean ExportToExcelBean. The method gets called for each cell of the table which gets exported.

    /**
     * This method gets called for each cell which is to be exported.
     * It can be used to filter data to be exported. In this case salary values > 6000 are not exported
     * @param uIComponent component of the cell which gets to be exported
     * @param exportContext context of the exported data (holds e.g. file name, character set...)
     * @param formatHandler format to be exported
     * @return true if cell value is exported, false if not
     */
    public Boolean exportCollectionFilter(UIComponent uIComponent, ExportContext exportContext, FormatHandler formatHandler) {
        if (exportContext.isFirstInRow()) {
            count++;
            _logger.info("Start a new Row " + count);
        }
        _logger.info("Export Collection UIComponent: " + uIComponent.getId());
        if (uIComponent instanceof RichOutputText) {
            RichOutputText rot = (RichOutputText) uIComponent;
            Object val = rot.getValue();
            String headerText = "";
            UIComponent component = rot.getParent();
            if (component instanceof RichColumn) {
                RichColumn col = (RichColumn) component;
                headerText = col.getHeaderText();
            }
            StringBuilder sb = new StringBuilder();
            sb.append("Name: ");
            sb.append(headerText);
            sb.append(" Value: ");
            sb.append(val);
            _logger.info(sb.toString());
            // check if the salary is greater than 6000
            if ("Salary".equals(headerText)) {
                if (((BigDecimal) val).intValue() > 6000) {
                    // if yes return false so that the value isn't exported
                    _logger.info("Skip Vals > 6000");
                    return false;
                }
            }
        }

        return true;
    }

The method gets the uiComponent which represents the current cell to be exported, the ExportContext and the FormatHandler for the export. The ExportContext holds information about the filename, title, the used character set and status information about the rows and cells currently exported. The status can be used to find out if a new row just starts to be exported or if a cell is part of a span of cells. In the sample we use this information to print a log message for each row exported.
The FormatHandler is used to generate the document to be exported and the data in it. I did not find a way to use my own handler, and there is no documentation about how to use another handler, so we leave this as is for the moment.
In the sample method we want to filter the employee data in such a way that salaries greater than 6000 are not exported to the resulting file. As the method is called for each cell, the first thing to find out is which cell is currently used. In lines 15-29 we use the current UIComponent to find out which column we are in. In lines 31-37 we check the salary column. In case the salary value is greater than 6000 we return false, which triggers that the cell value is not exported. If the salary is below or equal to 6000 we return true and the cell value is exported.
Below we see the result we get if we export the table without the filterMethod set:

Exported table without filter


and the result with the filter method set:
Exported table with filter

You can download the sample application, which was built using JDeveloper 12.1.3 and the HR DB schema, from GitHub.

dvt:treemap showing node detail in popup

This post describes how to implement a dvt:treemap which shows an af:popup when the user clicks on a detail node in the map.
The documentation of the dvt:treemap component tells us that the dvt:treemapNode supports the af:showPopupBehavior tag and reacts on the ‘click’ and ‘mouseHover’ events.
This is part of the solution and allows us to begin implementing the use case. We add an af:showPopupBehavior to the nodes we want to show detail information for.

After creating a default Fusion Web Application which uses the HR DB schema, we begin with creating the data model for the model project. For this small sample the departments and employees tables will be sufficient.


The views are named according to their usage to make it easier to understand the model. This is all we need for the model.

Let’s start with the UI, which only consists of a single page. The page has a header part and a center part. In the center area we build the treemap by dragging the Departments from the data controls onto the page and dropping it as a treemap. After that, in the dialog, we specify the first level of the map to be the departmentId (which shows the department name as the label), and for the second level we choose the employeeId (which shows the last name of the employee as the label) from the employees. The whole process is shown in the gallery below.


The resulting treemap is very basic in its features, e.g. there is no legend, as you’ll see later.
In the next step we create an af:popup to show the node’s detail information. This process is outlined in the next gallery. We drag the popup component onto the page below the af:treemap component

One thing to take note of are the properties of the popup. First we set the content delivery to ‘lazyUncached’, which makes sure that the data is loaded every time the popup is opened. Otherwise we’ll see only the data from the first time the popup has been opened. Second change is to set the launcherVar to ‘source’. This is the variable name we later use to access the node data. Third change is to set the event context to ‘launcher’. This means that events delivered by the popup and its descendents are delivered in the context of the launch source.
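Putting these settings together, the popup might look roughly like this (the component ids, the managed bean name and its scope are assumptions; the attribute binding ‘nodeInfo1’ and the fetchListener are the ones shown below):

    <af:popup id="p1" contentDelivery="lazyUncached" launcherVar="source" eventContext="launcher"
              popupFetchListener="#{viewScope.TreemapBean.fetchListener}">
        <af:dialog id="d1" title="Node Detail" type="ok">
            <af:outputText value="#{bindings.nodeInfo1.inputValue}" id="ot1"/>
        </af:dialog>
    </af:popup>
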

For the treemap, for example, when an event is delivered ‘in context’ the data for the clicked node is made ‘current’ before the event listener is called, so if getRowData() is called on the collectionModel in the event listener it returns the data of the node that triggered the event. This is exactly what we need.

Finally we add a popupFetchListener to the popup which we use to get the data from the current node to a variable in the bindings. In the sample this variable ‘nodeInfo’ is defined in the variable iterator of the page and an attribute binding ‘nodeInfo1’ is added. More info on this can be found here.
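
A rough sketch of how such a variable and attribute binding typically look in the pageDef (the ids match the names used in this post; the exact generated markup may differ):

    <executables>
        <variableIterator id="variables">
            <variable Name="nodeInfo" Type="java.lang.String"/>
        </variableIterator>
    </executables>
    <bindings>
        <attributeValues IterBinding="variables" id="nodeInfo1">
            <AttrNames>
                <Item Value="nodeInfo"/>
            </AttrNames>
        </attributeValues>
    </bindings>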


The code below shows the popupFetchListener:

package de.hahn.blog.treemappopup.view.beans;

import javax.el.ELContext;
import javax.el.ExpressionFactory;

import javax.faces.application.Application;
import javax.faces.context.FacesContext;

import oracle.adf.model.BindingContext;
import oracle.adf.share.logging.ADFLogger;
import oracle.adf.view.rich.event.PopupFetchEvent;

import oracle.binding.AttributeBinding;
import oracle.binding.BindingContainer;


/**
 * Treemap handler bean
 * @author Timo Hahn
 */
public class TreemapBean {
    private static ADFLogger logger = ADFLogger.createADFLogger(TreemapBean.class);

    public TreemapBean() {
    }

    /**
     * listen to popup fetch.
     * @param popupFetchEvent event that triggered the fetch
     */
    public void fetchListener(PopupFetchEvent popupFetchEvent) {
        // retrieve node information 
        String lastName = (String) getValueFromExpression("#{source.currentRowData.lastName}");
        Integer id = (Integer) getValueFromExpression("#{source.currentRowData.EmployeeId}");
        //build info string
        String res = lastName + " id: " + id;
        logger.info("Information: " + res);
        // get the binding container
        BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();

        // get an ADF attributevalue from the ADF page definitions
        AttributeBinding attr = (AttributeBinding) bindings.getControlBinding("nodeInfo1");
        //set the value to it
        attr.setInputValue(res);
    }

    // get a value as object from an expression
    private Object getValueFromExpression(String name) {
        FacesContext facesCtx = FacesContext.getCurrentInstance();
        Application app = facesCtx.getApplication();
        ExpressionFactory elFactory = app.getExpressionFactory();
        ELContext elContext = facesCtx.getELContext();
        Object obj = elFactory.createValueExpression(elContext, name, Object.class).getValue(elContext);
        return obj;
    }
}

Finally we have to design the popup to show the node info from the attribute binding ‘nodeInfo1’. The popup uses a dialog with an af:outputText like

Show the node info in the popup


and set an af:showPopupBehavior to the node showing the employees
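
The node markup then looks roughly like this (a sketch; the popup id, the way it is referenced and the remaining node attributes depend on the generated page, and the triggerType used here is ‘click’, one of the two events the node supports):

    <dvt:treemapNode label="#{node.LastName}" ... >
        <af:showPopupBehavior popupId="::p1" triggerType="click"/>
    </dvt:treemapNode>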

Running the finished application brings up the treemap; not pretty, but enough to see this use case working. If we click on an employee node we see the popup with the last name of the employee and the employee id, the primary key of the selected row in the employees iterator.

You can download the sample application, which was built using JDeveloper 12.1.3 and the HR DB schema, from GitHub.

Initialize and execute af:quickQuery default criteria on page load

Recently a question on the OTN JDeveloper & ADF Space caught my interest. The question was how to initialize an af:quickQuery component with a parameter passed to a task flow on load of a page.
At first I thought that this would be a simple case of setting a property (InitialQueryOverwritten=true), as mentioned by Frank Nimphius in his article How-to query af:quickQuery on page load?, but after a short test it turned out that this setting only executes the query and can’t be used to initialize the criteria.

This blog is about a solution to this problem. The question can be divided into two smaller problems to solve. The first is to pass a parameter to a bounded task flow and use the passed parameter in the bounded task flow. The second problem is to initialize a default query attribute of an af:quickQuery component and execute the query.

Let’s have a look at the running application.

Start Page


On the start page the user can enter a parameter, which is used as input parameter on the second page, which holds a region (as bounded task flow) with the quick query component. Clicking on the ‘Go Query’ button passes the entered parameter to a pageFlowScope variable and navigates to the second page.
Page with initialized af:quickQuery


As we see, the passed parameter is visible in the quick query component and the table shows the corresponding data.

The first problem mentioned isn’t really one, as the solution is well documented. So passing a parameter from an af:inputText to a bounded task flow is only shown briefly here. The button on the start page uses an af:setPropertyListener to set the parameter to a pageFlowScope variable. On the second page the parameter is passed as input parameter to the bounded task flow which assembles the af:quickQuery.
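
A sketch of that button markup (the source expression, the navigation outcome and the ids are assumptions; the pageFlowScope variable ‘tpCityName’ is the one the bean later reads):

    <af:button text="Go Query" id="b1" action="goQuery">
        <af:setPropertyListener type="action" from="#{bindings.cityName.inputValue}"
                                to="#{pageFlowScope.tpCityName}"/>
    </af:button>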


The images above show the navigation between the two pages and the region (QuickQuery.jsf) which holds the af:quickQuery.

First Try
The first method I tried to initialize the af:quickQuery was to overwrite the QueryListener of the af:quickQuery component to set the parameter to the default search attribute. The already mentioned property InitialQueryOverwritten=true would then execute the query with the parameter set. This should show the right result in the table. As it turned out, if the property InitialQueryOverwritten is set to true, the QueryListener is not called on load of the page. So there is no chance to set the parameter which is passed to the bounded task flow.

Second Try
For the next try I used a method activity in the bounded task flow and tried to set the parameter from this method. This does not work, as the component is not present when the method is called as default activity in the task flow. You can set the parameter to the view object and filter the data after it; however, the property InitialQueryOverwritten=true then executes the default query again, this time without the parameter. If you set the property to false, you see the data, but the parameter is not set in the af:inputText component.

Final Try: Working solution
The working solution uses a trick which is a kind of lazy initialization of the component. For this we bind a property of the component to a bean and overwrite the getter method for the property. In the getter we check a private variable of the bean to see whether the component has already been initialized or not. In case the getter has already been called we just return the value for the property. In case the getter method is called the first time we initialize the component before returning the value of the property.

Let’s look at the af:quickQuery in the region:

                        <af:quickQuery label="Search" searchDesc="#{viewScope.QuickQueryBean.dummy}" id="qryId1"
                                       value="#{bindings.ImplicitViewCriteriaQuery.quickQueryDescriptor}"
                                       model="#{bindings.ImplicitViewCriteriaQuery.queryModel}"
                                       queryListener="#{bindings.ImplicitViewCriteriaQuery.processQuery}" binding="#{viewScope.QuickQueryBean.quickQuery}">
                            <f:facet name="end">
                                <af:commandLink text="Advanced" rendered="false" id="cl1"/>
                            </f:facet>
                        </af:quickQuery>

Two things to note are
1. the component is bound to the viewScope bean QuickQueryBean
2. the searchDesc property is bound to the same QuickQueryBean bean
The component is bound to the bean as a convenience to get the query descriptor easily in the initialization method. To make this safe we use a ComponentReference to store the component.

    private ComponentReference quickQuery;
    /**
     * setter for component to ComponentReference
     * @param quickQuery the component
     */
    public void setQuickQuery(RichQuickQuery quickQuery) {
        this.quickQuery = ComponentReference.newUIComponentReference(quickQuery);
    }

    /**
     * getter for the component from the component reference
     * @return
     */
    public RichQuickQuery getQuickQuery() {
        if (quickQuery != null) {
            return (RichQuickQuery) quickQuery.getComponent();
        }
        return null;
    }

For more information about this technique see Rules and Best Practices for JSF Component Binding in ADF

The lazy initialization is done by binding the searchDesc property to the QuickQueryBean. The trick is that the component has to call the getter for this property to get its value. In the getter in the bean

    /**
     * getter for a string value names dummy in EL
     * @return value of the dummy property
     */
    public String getDummy() {
        if (needInit) {
            needInit = false;
            initQuickQuery();
        }
        return "Search";
    }

we check a local variable ‘needInit’ which is set to true when the bean is created, which happens each time the page gets loaded. As the bean is in viewScope it is guaranteed that the bean is created each time the page is loaded and stays active as long as the page is visible.
The real work is done in the initQuickQuery() method:

    /**
     * Initialize the quickQuery component if a parameter tpCityName is found in the pageFlowScope. Once this is done, the pageFlowScope
     * variable tpCityName is set to null or removed.
     */
    public void initQuickQuery() {
        // get the pageFlowScope params
        AdfFacesContext adfFacesCtx = AdfFacesContext.getCurrentInstance();
        Map<String, Object> scopePageFlowScopeVar = adfFacesCtx.getPageFlowScope();
        String paramCity = (String) scopePageFlowScopeVar.get("tpCityName");
        if (paramCity != null && !paramCity.isEmpty()) {
            // get query descriptor (the component's value property)
            FilterableQueryDescriptor queryDescriptor = (FilterableQueryDescriptor) getQuickQuery().getValue();
            // get the current selected criterion (which should be set in the ImplicitViewCriteriaQuery in the pageDef)
            AttributeCriterion attributeCriterion = queryDescriptor.getCurrentCriterion();
            // get the attribute name and check if it's 'City'
            AttributeDescriptor attribute = attributeCriterion.getAttribute();
            String name = attribute.getName();
            // only set the parameter if the attribute matches the parameter
            if ("City".equalsIgnoreCase(name)) {
                attributeCriterion.setValue(paramCity);
                // remove value to allow a new one in the component
                scopePageFlowScopeVar.put("tpCityName", null);
                // set the parameter to the attributeCriterion
                QueryModel model = getQuickQuery().getModel();
                model.setCurrentDescriptor(queryDescriptor);
                // create a queryEvent and invoke it
                QueryEvent qe = new QueryEvent(getQuickQuery(), queryDescriptor);
                invokeMethodExpression("#{bindings.ImplicitViewCriteriaQuery.processQuery}", Object.class, QueryEvent.class, qe);
            }
        }
    }

In this method we check if a parameter named ‘tpCityName’ is present in the pageFlowScope (lines 8-10). If yes, the next check is whether the currently selected criterion is the one for the passed parameter, in this case ‘City’ (lines 11-19). Only if this test is positive is the value from the parameter set to the criterion (line 20), the pageFlowScope variable ‘tpCityName’ removed and the changed criterion set back to the query model (lines 21-25). Finally, to execute the af:quickQuery, we create a new QueryEvent and invoke it via an EL expression (lines 26-28).
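
The helper invokeMethodExpression is not shown above; a minimal sketch of how it could be implemented with standard JSF expression handling (javax.el and javax.faces.context classes), not the literal code from the sample:

    private Object invokeMethodExpression(String expr, Class<?> returnType, Class<?> argType, Object argument) {
        FacesContext fctx = FacesContext.getCurrentInstance();
        ELContext elContext = fctx.getELContext();
        ExpressionFactory exprFactory = fctx.getApplication().getExpressionFactory();
        // build a method expression from the EL string and invoke it with the QueryEvent as single argument
        MethodExpression methodExpression =
            exprFactory.createMethodExpression(elContext, expr, returnType, new Class[] { argType });
        return methodExpression.invoke(elContext, new Object[] { argument });
    }
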
The solution does not need to set the InitialQueryOverwritten property to true to run. The query is fired after setting the attribute via the QueryEvent. Here is an image of the af:quickQuery binding

Definition of the ImplicitViewCriteriaQuery

The sample needs the HR DB schema. You can download the code for the sample, which was built using JDeveloper 12.1.3, from GitHub. Please note that if you run the sample in your environment, you have to change the DB connection to the HR DB schema according to your environment.

JDev 12.1.3: Using Parent Action to Navigate Tabs of an af:panelTabbed from Inside a Region

This blog is based on a question in the OTN JDeveloper and ADF forum. The question was how to navigate from the currently selected tab to the next tab when the af:showDetailItem in the tab shows a region and the button used to navigate is inside that region.

We implement two cases. The first is the easy one, where the button to navigate is in the page holding the af:panelTabbed. The second uses a button inside a bounded task flow, which is shown in the af:showDetailItem of a tab, to navigate the af:panelTabbed.

We start by creating a new ‘ADF Fusion Web Application’ from the gallery. We only change the application name and the path of the application; otherwise we can just use the default values. The sample is simple and doesn’t need a model project or a connection to a DB. You can download the finished workspace using the link provided at the end of the post.

We skip the basic setup steps and go right into creating the start page which holds the af:panelTabbed. It has three af:showDetailItem components and an af:button to navigate the tabs directly from the page. This button implements the first use case.

Start Page with Outer Navigation

The button has a listener attached which is implemented in a viewScope bean ‘NavigateTabBean’. The listener implements the needed logic to navigate from the selected tab to the next tab. If the last tab is reached the first tab is selected.

    private static ADFLogger _logger = ADFLogger.createADFLogger(NavigateTabBean.class);
    private static final String PANELTAB = "pt1";

    /**
     * Event handler to navigate to the next tab in an af:panelTabbed
     * @param actionEvent event which called the listener
     */
    public void naviGateButtonAction(ActionEvent actionEvent) {
        UIComponent ui = getUIComponent(PANELTAB);
        if (ui == null) {
            _logger.info("PanelTab component not found!");
            return;
        }
        if (!(ui instanceof RichPanelTabbed)) {
            _logger.info("Component is not an af:panelTabbed");
            return;
        }

        RichPanelTabbed rpt = (RichPanelTabbed) ui;
        int childCount = rpt.getChildCount();
        List<UIComponent> children = rpt.getChildren();
        for (int ii = 0; ii < childCount; ii++) {
            UIComponent uiSDI = children.get(ii);
            if (uiSDI instanceof RichShowDetailItem) {
                RichShowDetailItem rsdi = (RichShowDetailItem) uiSDI;
                if (rsdi.isDisclosed()) {
                    //close current tab
                    rsdi.setDisclosed(false);
                    //calculate next tab to disclose as next_tab_index = (current_tab_index + 1) % number_of_tabs
                    int kk = ii + 1;
                    int jj = kk % childCount;
                    _logger.info("old disclosed tab: " + ii + " new disclosed tab: " + jj);
                    RichShowDetailItem newSDI = (RichShowDetailItem) children.get(jj);
                    //open new tab
                    newSDI.setDisclosed(true);
                    AdfFacesContext.getCurrentInstance().addPartialTarget(rpt);
                    return;
                }
            }
        }
    }

    // find a jsf component
    private UIComponent getUIComponent(String name) {
        FacesContext facesCtx = FacesContext.getCurrentInstance();
        return facesCtx.getViewRoot().findComponent(name);
    }

    public void nextTab() {
        naviGateButtonAction(null);
    }

The logic in the action listener first searches for the af:panelTabbed in the viewRoot and gets its number of children. Each child is an af:showDetailItem representing one tab. We then iterate over the child list and search for the currently disclosed tab. This tab is closed and the next tab in the list is disclosed. If the currently disclosed tab is the last one in the list, the first tab is disclosed (see the comments in the code section).

To implement the second use case, the one we really want to talk about, we first need to implement three bounded task flows which we later use as regions in the tabs.

Bounded Task Flow with Parent Action

The image shows the bounded task flow for one tab. The other bounded task flows are built the same way and just show different text. The reason for this is that you would normally use different regions, i.e. different task flows, in the tabs. We could have used a single bounded task flow with a parameter to change the text shown in the fragment; in the sample you’ll find this implemented for tabs 4 and 5.
The region is simple and only shows one fragment, which has a button to navigate to the next tab and a text to distinguish the regions when navigating. The whole magic is the parent action in the bounded task flow. This parent action executes the navigation case ‘nextTab’ in the parent task flow.
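In the task flow definition XML the parent action is only a small element. It could look like the following sketch, where the activity id is an assumption and ‘nextTab’ is the navigation case described above:

<parent-action id="parentActionNextTab">
  <root-outcome>nextTab</root-outcome>
</parent-action>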

Unbounded Task Flow with Start Page

In the image above we see the unbounded task flow which is the parent of the bounded task flows. Here a wildcard rule navigates to a method call activity ‘selectNextTab’ using the navigation case ‘nextTab’ that we entered in the parent action of the regions.
The method call activity calls ‘nextTab()’ in the managed bean from the code section above. All this method does is call the action listener that is also used by the af:button on the start page (Start.jsf). As the action listener needs an ActionEvent as parameter, which we don’t use in the code, we pass ‘null’ when we call the listener from the method call activity.
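In adfc-config.xml this corresponds to a method call activity plus a wildcard control flow rule. The following sketch shows the idea; the ids, the ‘done’ outcome and the routing back to the Start view are assumptions:

<method-call id="selectNextTab">
  <method>#{viewScope.NavigateTabBean.nextTab}</method>
  <outcome>
    <fixed-outcome>done</fixed-outcome>
  </outcome>
</method-call>
<control-flow-rule id="__1">
  <from-activity-id>*</from-activity-id>
  <control-flow-case id="__2">
    <from-outcome>nextTab</from-outcome>
    <to-activity-id>selectNextTab</to-activity-id>
  </control-flow-case>
</control-flow-rule>
<control-flow-rule id="__3">
  <from-activity-id>selectNextTab</from-activity-id>
  <control-flow-case id="__4">
    <from-outcome>done</from-outcome>
    <to-activity-id>Start</to-activity-id>
  </control-flow-case>
</control-flow-rule>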

This concludes the implementation. Here are some images from the running application

The sample application can be downloaded from the ADFEMG Sample Project.

A version of the software built with JDeveloper 11.1.1.7.0 can be downloaded from GitHub.

JDev 12.1.3: Use Default Activity Instead of the Deprecated Invoke Action

Since JDeveloper 12.1.3 the invoke action used in earlier versions has been deprecated. Users still using the old invoke action to load data on page load should migrate their code to use the default activity in a bounded task flow instead. This article describes how to use the executeWithParams method as the default activity in a bounded task flow (btf) to load data to be shown in a region. For this we implement a common use case.

Use Case:
In a text field the user enters a string which is used to look up data in the DB, and the result is shown as a table in a region.
For this we use the HR schema and build a look-up of locations by the name of the location’s city. On a page the user can enter the name, or part of the name, of a city into a text field. This input is sent as a parameter to a bounded task flow. The default activity of the btf calls a method in the view object which uses a view criteria to search for cities starting with the given input. In a second implementation the same technique is used, but a where clause is defined in the VO and the VO is executed via executeWithParams. The result of the search is displayed as a table in a region.

Implementation

Model Project:
We start by creating a new ‘Fusion Web Application’ and building a model project from the HR DB schema. Here we only use the LOCATIONS table, for which we create an entity object and a view object.
Now we create the view criteria which we use to find locations by part of the city name.

The next step is to create the Java class for the view object, including the methods to safely access the created bind variable. In this class we add a method which applies the created view criteria, and we expose this method in the client interface as well as the methods to access the bind variable.
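A minimal sketch of what this view object implementation class could look like; the class name, the view criteria name ‘LocationsByCityVC’ and the bind variable name ‘bCityName’ are assumptions for illustration:

import oracle.jbo.ViewCriteria;
import oracle.jbo.server.ViewObjectImpl;

public class LocationsViewImpl extends ViewObjectImpl {

    /**
     * Applies the view criteria to filter locations by (part of) a city name.
     */
    public void filterByCityName(String cityName) {
        // set the bind variable used by the view criteria
        setbCityName(cityName);
        // apply the named view criteria and re-execute the query
        ViewCriteria vc = getViewCriteria("LocationsByCityVC");
        applyViewCriteria(vc);
        executeQuery();
    }

    /**
     * Returns the value of bind variable bCityName.
     */
    public String getbCityName() {
        return (String) ensureVariableManager().getVariableValue("bCityName");
    }

    /**
     * Sets the value of bind variable bCityName.
     */
    public void setbCityName(String value) {
        ensureVariableManager().setVariableValue("bCityName", value);
    }
}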


Finally we have to make sure that the locations view object is part of the data model of the application module.
Resulting Application Module Data Model


Next we add another view object to the data model which we use to implement the use case a second time. This time we use the view criteria we defined in the view object LocationsView and select it as the default where clause.

ViewController Project:
We start implementing the view controller project by first adding a start page, ‘Start’, to the unbounded task flow in adfc-config.xml. For this page we use a quick layout (One Column, Header stretched).

After opening the page (which creates it) we add a third grid row to the panelGridLayout we got from the quick layout; this row will later hold the result table. In the first grid row we add a caption for the page, ‘Execute with param sample’; in the second grid row we add an af:inputText which holds the user’s input for the city name to search for.
The page source looks like this:

<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE html>
<f:view xmlns:f="http://java.sun.com/jsf/core" xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
    <af:document title="Start.jsf" id="d1">
        <af:form id="f1">
            <af:panelGridLayout id="pgl1">
                <af:gridRow height="50px" id="gr1">
                    <af:gridCell width="100%" halign="stretch" valign="stretch" id="gc1">
                        <!-- Header -->
                        <af:outputText value="ExecuteWithParams Test" id="ot1" inlineStyle="font-size:x-large;"/>
                    </af:gridCell>
                </af:gridRow>
                <af:gridRow height="50px" id="gr2">
                    <af:gridCell width="100%" halign="stretch" valign="stretch" id="gc2">
                        <!-- Content -->
                        <af:inputText label="City" id="it1" value="" autoSubmit="true"/>
                    </af:gridCell>
                </af:gridRow>
                <af:gridRow id="gr3">
                    <af:gridCell id="gc3">
                        <!-- REGION HERE -->
                    </af:gridCell>
                </af:gridRow>
            </af:panelGridLayout>
        </af:form>
    </af:document>
</f:view>

Now we create a pageDefinition for the page, where we define a variable and an attribute binding which hold the user’s input from the inputText we added to the grid row below the header.
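The relevant part of the pageDef could then look like the following sketch; the variable name is an assumption, while the binding id matches the EL used in the inputText below:

<executables>
  <variableIterator id="variables">
    <variable Type="java.lang.String" Name="searchCityName"/>
  </variableIterator>
</executables>
<bindings>
  <attributeValues IterBinding="variables" id="searchCityName1">
    <AttrNames>
      <Item Value="searchCityName"/>
    </AttrNames>
  </attributeValues>
</bindings>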


The final inputText looks like this:

<af:inputText label="City" id="it1" value="#{bindings.searchCityName1.inputValue}" autoSubmit="true"/>

As you see, we set the autoSubmit property to true as we don’t have (and don’t need) a button to submit the data to the binding layer.

The next task is to create a new bounded task flow with one input parameter, which is used to search for locations whose cities start with the value entered in the inputText component.

Once the bounded task flow is created we can drag the btf onto the start page, drop it into the gridCell in the third gridRow and wire the task flow parameter to the value we store in the variable iterator via the inputText.

Finally we make the region refresh whenever the input parameter of the task flow changes, by setting the refresh property of the region’s task flow binding to ‘ifNeeded’.
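In the pageDef this results in a task flow binding similar to the following sketch; the task flow path and the parameter name are assumptions, while the binding id matches the region in the page source below:

<taskFlow id="showlocatiobycitybtf1"
          taskFlowId="/WEB-INF/showlocatiobycitybtf.xml#showlocatiobycitybtf"
          activation="deferred"
          xmlns="http://xmlns.oracle.com/adf/controller/binding"
          Refresh="ifNeeded">
  <parameters>
    <parameter id="cityName" value="#{bindings.searchCityName1.inputValue}"/>
  </parameters>
</taskFlow>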
The final ‘Start’ page layout looks like this:

<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE html>
<f:view xmlns:f="http://java.sun.com/jsf/core" xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
    <af:document title="Start.jsf" id="d1">
        <af:form id="f1">
            <af:panelGridLayout id="pgl1">
                <af:gridRow height="50px" id="gr1">
                    <af:gridCell width="100%" halign="stretch" valign="stretch" id="gc1">
                        <!-- Header -->
                        <af:outputText value="ExecuteWithParams Test" id="ot1" inlineStyle="font-size:x-large;"/>
                    </af:gridCell>
                </af:gridRow>
                <af:gridRow height="50px" id="gr2">
                    <af:gridCell width="100%" halign="stretch" valign="stretch" id="gc2">
                        <!-- Content -->
                        <af:inputText label="City" id="it1" value="#{bindings.searchCityName1.inputValue}" autoSubmit="true"/>
                    </af:gridCell>
                </af:gridRow>
                <af:gridRow id="gr3">
                    <af:gridCell id="gc3">
                        <af:region value="#{bindings.showlocatiobycitybtf1.regionModel}" id="r1"/>
                    </af:gridCell>
                </af:gridRow>
            </af:panelGridLayout>
        </af:form>
    </af:document>
</f:view>

This concludes the first implementation and we can run the application.

The sample application can be downloaded from the ADFEMG Sample Project. It contains a second page (Start2) which uses the other view object (LocationsWithParamsView) inside the region. It’s built like the first version; the difference is that the default activity now is the executeWithParams operation from the VO’s operations instead of the self-implemented method in the VO. This way you spare yourself writing the method and exposing it in the client interface.
Be aware that the sample uses the HR DB schema and you have to change the connection information to point to your DB.