Developer Cloud Service: Continuous Integration with JDeveloper 12.2.1

The last blog showed that the Oracle Developer Cloud Service is now available for JDeveloper and ADF 12.2.1 (Developer Cloud Service with JDeveloper 12.2.1 available). The missing part is the connection of the DCS to the newly created JCS for version 12.2.1, which is what we show in this blog.

The groundwork, how to set up a build system for the DCS, has been shown in Fasten your seat belts: Flying the Oracle Development Cloud Service (3 – Take Off – ROTATE). We now have to find out which environment variables to use for the 12.2.1 installation. At the time I wrote the mentioned blog there were only environment variables for 11.1.1.7.1 and 12.1.3.0 available. Looking at the documentation Using Hudson Environment Variables we find that the variables

  • ORACLE_HOME_SOA_12_2_1=/opt/Oracle/MiddlewareSOA_12.2.1/jdeveloper
  • MIDDLEWARE_HOME_SOA_12_2_1=/opt/Oracle/MiddlewareSOA_12.2.1
  • WLS_HOME_SOA_12_2_1=/opt/Oracle/MiddlewareSOA_12.2.1/wlserver

are the right ones (and the only ones which point to 12.2.1). In the application.properties file (from the ‘…Take Off…’ blog) we exchange

# Don't change anything below!
oracle.home=${env.ORACLE_HOME_12C3}
oracle.commons=${env.MIDDLEWARE_HOME_12C3}/oracle_common
middleware.home=${env.MIDDLEWARE_HOME_12C3}
install.dir=${env.ORACLE_HOME_12C3}

with

# Don't change anything below!
oracle.home=${env.ORACLE_HOME_SOA_12_2_1}
oracle.commons=${env.MIDDLEWARE_HOME_SOA_12_2_1}/oracle_common
middleware.home=${env.MIDDLEWARE_HOME_SOA_12_2_1}
install.dir=${env.ORACLE_HOME_SOA_12_2_1} 

This change will use JDeveloper 12.2.1 to run ojdeploy and configure the application to run on a WebLogic Server 12.2.1. This should do the trick, and we can use the DCS build system to create applications using ADF 12.2.1. As the application I used for the ‘Fasten your seat belts…’ blog series was pretty simple, I'd like to show the result using the application I used for a presentation at the DOAG DevCamp2016, named AppsCloudUIKit. You can read all about this application in the blog DOAG DevCamp2016.

The application was built using JDeveloper 11.1.1.9.0 and was migrated during the DevCamp to 12.1.3, the DCS version available at the time of the DevCamp. The first task is to migrate the source to 12.2.1 by creating a new branch in the Git repository for the new 12.2.1 version.

We clone the repository and create a new branch 12_2_1 which we use to build the AppsCloudUIKit for 12.2.1. As we are now running JDeveloper 12.2.1 we can use the Team Server to get the sources from the DCS Git repository.

But we can use any other Git client to get it. As this is covered in other blogs I'll skip the details here. In the end we have this branch tree

where the local branch 12_2_1, marked in green, is the one we are working on.

After changing the application.properties as shown above we can run the build using Ant on the local machine

By selecting the ‘deploy’ target.
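
On the command line this boils down to a call like the following sketch (assuming the Ant build file from the ‘…Take Off…’ blog is named build.xml and lies in the workspace root):

ant -f build.xml deploy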

The result is an EAR file in the deploy folder

Setting up the build job

Let's check in the changes and set up the build in the DCS. Here are the steps for the build job.

With this we can build the application and get the result.

Setting up the Deployment

The final task is to set up the deploy task to deploy the application on the JCS_12_2_1. When we select the ‘Deploy’ tab we see the existing deployment configuration for the 12.1.3 JCS.

For the JCS 12.2.1 we created a new JCS instance with a different (public) IP. Before we can create a new configuration for the 12.2.1 JCS instance we have to allow the Hudson user access to the JCS. This process is described in detail in Deploying an Application from Oracle Developer Cloud Service to Oracle Java Cloud Service.

It's absolutely necessary to get the Oracle Developer Cloud Service SSH public key and add this key to the JCS 12.2.1 instance as an authorized key. Please follow the instructions given in the link above to do so.

After this is done we can create a new deployment configuration

Start filling in the dialog by giving the configuration a name. Next we create a new ‘Deployment Target’

In the dialog fill in the public IP address of the new JCS 12.2.1 and select SSH Tunnel. The user name and password are the ones you selected when you created the JCS instance. Test the connection and close the dialog by clicking ‘Use Connection’.

Finally we can complete the Deployment dialog

We choose ‘On Demand’ here, which lets us specify which job/build and artifact to use. A click on ‘Save and Deploy’ closes the dialog and the artifact will be deployed to the JCS 12.2.1. The URL to open the application is AppsCloudUIKit 12.2.1.

And we should see


Developer Cloud Service with JDeveloper 12.2.1 available

I almost missed that the Developer Cloud Service has been updated to 12.2.1. Great news, as we can now use JDeveloper 12.2.1 to access the agile capabilities like

  • Interact with Tasks/Issues in JDeveloper
  • Leverage the Team view in JDeveloper (tasks, builds, and code repositories)
  • Connect to DevCS and its projects from inside JDeveloper
  • Create Agile boards and manage sprints in Developer Cloud Service
  • Associate code commits with specific tasks
  • Monitor team activity in the Team Dashboard
  • Handle Git transactions

For more information about how JDeveloper and the DCS are integrated watch this video ‘Agile development with Oracle JDeveloper and Oracle Developer Cloud Service’.

This has been possible since last year. So, what's new?

New is that the JCS is also available in 12.2.1 and that we can use the whole continuous integration scenario. For this we have to configure a 12.2.1 JCS instance which can then be used for deployment. When we choose to create a new JCS instance we see the new wizard, which allows us to select a WebLogic Server 12c in version 12.2.1.

On the ‘Edition’ page we don’t find anything new so we skip it and go to the Details page where we specify the needed information for the service, database configuration, backup and the WebLogic user

After the confirmation page we create the new service, and after a short time we see the new service.

A look at the Enterprise Manager of the new service shows the new login page

and after logging in the new 12.2.1 Enterprise Manager

It looks modern and fresh. However, this is not what this blog is about. I installed my ADF Version Web Service BlogAdfVersionWS to check which ADF version is running in this instance. Selecting the modules we find the test point on the right side of the Web Service.

After selecting the test point we choose to run the ‘GetVersion’ service

and get

That's exactly what we expect when running ADF 12.2.1!

Next time we'll see how to change the build and deployment parts of the DCS to work with the JCS 12.2.1.

dvt:treemap showing node detail in popup

This post describes how to implement a dvt:treemap which shows an af:popup when the user clicks on a detail node in the map.
The documentation of the dvt:treemap component tells us that the dvt:treemapNode supports the af:showPopupBehavior tag and reacts to the ‘click’ and ‘mouseHover’ events.
This is part of the solution and allows us to begin implementing the use case. We add an af:showPopupBehavior to the nodes we want to show detail information for.

After creating a default Fusion Web Application which uses the HR DB schema, we begin with creating the data model for the model project. For this small sample the departments and employees tables will be sufficient.


The views are named according to their usage to make it easier to understand the model. This is all we need for the model.

Let's start with the UI, which only consists of a single page. The page has a header part and a center part. In the center area we build the treemap by dragging the Departments from the data controls onto the page and dropping it as a treemap. After that, in the dialog we specify the first level of the map to be the departmentId (which shows the department name as the label) and for the second level we choose the employeeId (which shows the last name of the employee as label) from the employees. The whole process is shown in the gallery below.


The resulting treemap is very basic in its features, e.g. there is no legend as you'll see later.
In the next step we create an af:popup to show the node's detail information. This process is outlined in the next gallery. We drag the popup component onto the page below the af:treemap component.

One thing to take note of are the properties of the popup. First we set the content delivery to ‘lazyUncached’, which makes sure that the data is loaded every time the popup is opened; otherwise we'd see only the data from the first time the popup was opened. The second change is to set the launcherVar to ‘source’. This is the variable name we later use to access the node data. The third change is to set the event context to ‘launcher’. This means that events delivered by the popup and its descendants are delivered in the context of the launch source.

For the treemap this means that when an event is delivered ‘in context’, the data for the clicked node is made ‘current’ before the event listener is called, so if getRowData() is called on the collectionModel in the event listener it returns the data of the node that triggered the event. This is exactly what we need.

Finally we add a popupFetchListener to the popup which we use to get the data from the current node into a variable in the bindings. In the sample this variable ‘nodeInfo’ is defined in the variable iterator of the page and an attribute binding ‘nodeInfo1’ is added. More info on this can be found here.
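
Putting these properties together, the popup definition could look like the following minimal sketch (the popup id, the managed bean name and the dialog content are assumptions):

<af:popup id="p1" contentDelivery="lazyUncached" launcherVar="source"
          eventContext="launcher" popupFetchListener="#{treemapBean.fetchListener}">
    <!-- af:dialog with the af:outputText showing the node info, see below -->
</af:popup>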


The code below shows the popupFetchListener:

package de.hahn.blog.treemappopup.view.beans;

import javax.el.ELContext;
import javax.el.ExpressionFactory;

import javax.faces.application.Application;
import javax.faces.context.FacesContext;

import oracle.adf.model.BindingContext;
import oracle.adf.share.logging.ADFLogger;
import oracle.adf.view.rich.event.PopupFetchEvent;

import oracle.binding.AttributeBinding;
import oracle.binding.BindingContainer;


/**
 * Treemap handler bean
 * @author Timo Hahn
 */
public class TreemapBean {
    private static ADFLogger logger = ADFLogger.createADFLogger(TreemapBean.class);

    public TreemapBean() {
    }

    /**
     * listen to popup fetch.
     * @param popupFetchEvent event that triggered the fetch
     */
    public void fetchListener(PopupFetchEvent popupFetchEvent) {
        // retrieve node information 
        String lastName = (String) getValueFromExpression("#{source.currentRowData.lastName}");
        Integer id = (Integer) getValueFromExpression("#{source.currentRowData.EmployeeId}");
        //build info string
        String res = lastName + " id: " + id;
        logger.info("Information: " + res);
        // get the binding container
        BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();

        // get an ADF attributevalue from the ADF page definitions
        AttributeBinding attr = (AttributeBinding) bindings.getControlBinding("nodeInfo1");
        //set the value to it
        attr.setInputValue(res);
    }

    // get a value as object from an expression
    private Object getValueFromExpression(String name) {
        FacesContext facesCtx = FacesContext.getCurrentInstance();
        Application app = facesCtx.getApplication();
        ExpressionFactory elFactory = app.getExpressionFactory();
        ELContext elContext = facesCtx.getELContext();
        Object obj = elFactory.createValueExpression(elContext, name, Object.class).getValue(elContext);
        return obj;
    }
}

Finally we have to design the popup to show the node info from the attribute binding ‘nodeInfo1’. The popup uses a dialog with an af:outputText like

Show the node info in the popup


and set an af:showPopupBehavior to the node showing the employees
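
A minimal sketch of this wiring (the component ids are assumptions; ‘click’ is one of the two trigger types the node supports):

<dvt:treemapNode id="tn2" label="#{node.LastName}">
    <af:showPopupBehavior popupId="::p1" triggerType="click"/>
</dvt:treemapNode>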

Running the finished application brings up the treemap; not pretty, but enough to see this use case working. If we click on an employee node we see the popup with the last name of the employee and the employee id, the primary key of the selected row in the employees iterator.

You can download the sample application, which was built using JDeveloper 12.1.3 and the HR DB schema, from GitHub.

How-to filter ADF bound tables by date range (JDeveloper 12.1.x)

Based on an older article from Frank Nimphius, How-to filter ADF bound tables by date range JDeveloper 11.1.1.4, I got an interesting question on the OTN JDeveloper & ADF forum asking why the solution provided in the article does not work in JDev 12c.

The solution from Frank's article is designed for JDev 11.1.1.4.0. Today's version of JDev is 12.1.3, where the solution does not seem to work. Migrating the source of the article and running it under JDev 12.1.3 indeed shows that filtering the employees records for a date range does not work at all. Entering dates into the filter and hitting enter to activate the filter does not filter the data in the table.
The reason for this was easily found by debugging the code. Set a breakpoint in the query listener which is set up in the table

<af:table value="#{bindings.allEmployees.collectionModel}" var="row" 
  rows="#{bindings.allEmployees.rangeSize}"
  emptyText="#{bindings.allEmployees.viewable ? 'No data to display.' : 'Access Denied.'}"
  fetchSize="#{bindings.allEmployees.rangeSize}" rowBandingInterval="0"
  filterModel="#{bindings.allEmployeesQuery.queryDescriptor}" filterVisible="true" 
  varStatus="vs" selectedRowKeys="#{bindings.allEmployees.collectionModel.selectedRow}"
  selectionListener="#{bindings.allEmployees.collectionModel.makeCurrent}" 
  rowSelection="single" id="t1" styleClass="AFStretchWidth"  partialTriggers="::cb1"
  queryListener="#{EmployeeQueryBean.onEmployeeQuery}">

As you can see it's pointing to a bean method ‘onEmployeeQuery’. A look into this method reveals that the method FilterableQueryDescriptor.getFilterCriteria() has been deprecated.

        FilterableQueryDescriptor fqd = (FilterableQueryDescriptor) queryEvent.getDescriptor();
        Map map = fqd.getFilterCriteria();

Instead of the deprecated method you should use the method FilterableQueryDescriptor.getFilterConjunctionCriterion() which now holds the map of parameters.

        FilterableQueryDescriptor fqd = (FilterableQueryDescriptor) queryEvent.getDescriptor();
        ConjunctionCriterion cc = fqd.getFilterConjunctionCriterion();
        Map<String, Criterion> criterionMap = cc.getCriterionMap();

When you set a breakpoint in this method and step through the code, you see that the values entered into the filter fields in the UI are not visible in the map, as they were in the version Frank describes in his article.

Criterion Map and old FilterCriteria Map


As you can see there are no map entries for the made-up variables ‘HireStartRange’ and ‘HireEndRange’. This is the reason the filter by date range does not work: there are simply no dates to filter the rows.

I'm not sure if this is a bug or a change in behavior made for a reason. Anyway, you can't simply add values to the map anymore.

The solution to fix the problem is simple. As you can't store additional values in the criterion map, you have to store the values entered by the user somewhere else. A valid storage area is the variables iterator each PageDef holds.
In one of my other blogs, Creating Variables and Attribute Bindings to Store Values Temporarily in the PageDef, I showed how to add temporary variables to this iterator.

Create two new variables of type oracle.jbo.domain.Date inside the variable iterator and name them ‘startDate’ and ‘endDate’. Then create attribute bindings for them, as sketched below.
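
In the PageDef this could look like the following sketch (ids and attributes are assumptions based on the binding names ‘startDate1’ and ‘endDate1’ used below):

<variableIterator id="variables">
    <variable Type="oracle.jbo.domain.Date" Name="startDate" IsQueriable="false"/>
    <variable Type="oracle.jbo.domain.Date" Name="endDate" IsQueriable="false"/>
</variableIterator>
...
<attributeValues IterBinding="variables" id="startDate1">
    <AttrNames>
        <Item Value="startDate"/>
    </AttrNames>
</attributeValues>
<attributeValues IterBinding="variables" id="endDate1">
    <AttrNames>
        <Item Value="endDate"/>
    </AttrNames>
</attributeValues>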
The final touch is to wire the new variables up in the HireDate filter for start range and end range:

                                    <af:column sortProperty="HireDate" filterable="true" sortable="true"
                                               headerText="#{bindings.allEmployees.hints.HireDate.label}" id="c1" width="277">
                                        <f:facet name="filter">
                                            <af:panelGroupLayout id="pgl2" layout="horizontal">
                                                <af:panelLabelAndMessage label="From: " id="plam1">
                                                    <af:inputDate id="id2" value="#{bindings.startDate1.inputValue}" clientComponent="false">
                                                        <af:convertDateTime pattern="#{bindings.allEmployees.hints.HireDate.format}"/>
                                                        <f:validator binding="#{bindings.HireDate.validator}"/>
                                                    </af:inputDate>
                                                </af:panelLabelAndMessage>
                                                <af:spacer width="5" height="5" id="s1"/>
                                                <af:panelLabelAndMessage label="To:" id="plam2">
                                                    <af:inputDate id="id3" value="#{bindings.endDate1.inputValue}" required="false" clientComponent="false">
                                                        <f:validator binding="#{bindings.HireDate.validator}"/>
                                                        <af:convertDateTime pattern="#{bindings.allEmployees.hints.HireDate.format}"/>
                                                    </af:inputDate>
                                                </af:panelLabelAndMessage>
                                            </af:panelGroupLayout>
                                        </f:facet>
                                        <af:inputDate value="#{row.bindings.HireDate.inputValue}" label="#{bindings.allEmployees.hints.HireDate.label}"
                                                      required="#{bindings.allEmployees.hints.HireDate.mandatory}"
                                                      shortDesc="#{bindings.allEmployees.hints.HireDate.tooltip}" id="id1" styleClass="AFStretchWidth">
                                            <f:validator binding="#{row.bindings.HireDate.validator}"/>
                                            <af:convertDateTime pattern="#{bindings.allEmployees.hints.HireDate.format}"/>
                                        </af:inputDate>
                                    </af:column>

The code above shows the new column for the HireDate and the new storage locations for the start of the range as ‘value="#{bindings.startDate1.inputValue}"’ and the end as ‘value="#{bindings.endDate1.inputValue}"’. Next we change the bean method which reads the filter values and calls the query:

    public void onEmployeeQuery(QueryEvent queryEvent) {
        //default EL string created when dragging the table
        //to the JSF page
        //#{bindings.allEmployeesQuery.processQuery}

        BindingContext bctx = BindingContext.getCurrent();
        DCBindingContainer bindings = (DCBindingContainer) bctx.getCurrentBindingsEntry();

        //access the method bindings to set the bind variables on the ViewCriteria
        OperationBinding rangeStartOperationBinding = bindings.getOperationBinding("setHireDateRangeStart");
        OperationBinding rangeEndOperationBinding = bindings.getOperationBinding("setHireDateRangeEnd");

        // get the start date and end date from the temporary variables
        AttributeBinding attr = (AttributeBinding) bindings.getControlBinding("startDate1");
        oracle.jbo.domain.Date sd = (oracle.jbo.domain.Date) attr.getInputValue();
        attr = (AttributeBinding) bindings.getControlBinding("endDate1");
        oracle.jbo.domain.Date ed = (oracle.jbo.domain.Date) attr.getInputValue();

        //set the start and end date of the range to search
        rangeStartOperationBinding.getParamsMap().put("value", sd);
        rangeEndOperationBinding.getParamsMap().put("value", ed);

        //set bind variable on the business service
        rangeStartOperationBinding.execute();
        rangeEndOperationBinding.execute();

        invokeMethodExpression("#{bindings.allEmployeesQuery.processQuery}", Object.class, QueryEvent.class, queryEvent);
    }

As you can see in the lines below the comment ‘get the start date and end date from the temporary variables’, we read the values from the newly created attribute bindings. After removing the unnecessary parts of the code, which tried to read the values from the map, the rest of the code remains as is.

Here is an image of the now working filter by date range

Filter Table by Date Range

Please note that if you run the sample in your environment, you have to change the DB connection to the HR DB schema according to your environment. You can download the changed code for the sample from GitHub.

Change Application Configuration at Run Time through a Properties File (Part 2)

In this second part of the mini series we look into the problem left over from part 1.

We left open the task of changing the location and name of the property file we read when a configuration property is needed by the application. You may ask why we need to change the path at all. The answer is that most administrators won't allow you to copy a file to an arbitrary location on a server; they normally don't allow access to a production server at all. You can ask them to put the configuration file in a location of their choice and then configure this path during deployment of the application. This is exactly what we do in this blog.

In the first part we finished the sample application, which reads a property file from a location we can set via a context parameter in web.xml. The question now is how to change this parameter during deployment of the application. The answer is to use a deployment plan.

Typically, deployment plans are created by developers along with the associated application files, then distributed to the administrator or another deployer, who updates the plan for a particular environment (such as staging, testing, or production). The deployment plan is stored outside of an application archive or exploded archive directory. We store the initial plan in the ViewControllers src/META-INF folder as BlogReadConfigFile_Plan.xml.

<?xml version='1.0' encoding='UTF-8'?>
<deployment-plan xmlns="http://xmlns.oracle.com/weblogic/deployment-plan" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:schemaLocation="http://xmlns.oracle.com/weblogic/deployment-plan http://xmlns.oracle.com/weblogic/deployment-plan/1.0/deployment-plan.xsd">
    <application-name>BlogReadConfigFile</application-name>
    <variable-definition>
        <variable>
            <name>ResourceFileLocation</name>
            <value>/tmp/readconfigfile.properties</value>
        </variable>
    </variable-definition>
    <module-override>
        <module-name>BlogReadConfigFile_BRCFViewController_webapp.war</module-name>
        <module-type>war</module-type>
        <module-descriptor external="false">
            <root-element>web-app</root-element>
            <uri>WEB-INF/web.xml</uri>
            <variable-assignment>
                <name>ResourceFileLocation</name>
                <xpath>/web-app/context-param/[param-name="de.hahn.blog.readconfigfile.FILENAME"]/param-value</xpath>
                <operation>replace</operation>
            </variable-assignment>
        </module-descriptor>
    </module-override>
</deployment-plan>

There are three notable parts in the plan. The first is the application name, which is the same as the one you set under application properties in the deployment descriptor.


The second section is the variable-definition section. Here we define variables which we can later use in other parts of the descriptor as references. Without a variable definition we can't change a thing in the plan.
We name our variable ‘ResourceFileLocation’ and set the value to any appropriate location we like. This location doesn't have to exist on the target server; we change it later anyway.
The third part is the module-override, where we specify which part of which descriptor included in the deployment we'd like to change.
It's essential to name all parts exactly as they are named in the descriptors in your application.

The module name is the name of the war file we build, and the type is war, as we build a war artifact. The root element describes which section of the deployed application we want to change, and the URI specifies exactly where the descriptor is located relative to its root.
The interesting part is the variable-assignment, where we specify the variable name defined in the variable-definition section and which element of the defined module we want to change. This is done with an XPath expression:

<xpath>/web-app/context-param/[param-name="de.hahn.blog.readconfigfile.FILENAME"]/param-value</xpath>

which describes the location of the context parameter in web.xml with the name “de.hahn.blog.readconfigfile.FILENAME” and sets its value to the variable's value.
The operation tells what we want to do. As we want to replace the value, we choose ‘replace’ as the operation.

To make the setup complete we have to specify the BlogReadConfigFile_Plan.xml in the application properties

EAR Deployment Profile Properties


For more info about how to use deployment plans refer to the documentation at How to Use Deployment Plans

We can now deploy the application as usual and run it from within JDeveloper. To show how it's done when you deploy the application to a standalone server, we first create the ear file, then start the integrated WLS in JDeveloper and open the admin console to deploy the application

Deploy the Application


which will create the ear file as we see in the log window

[06:33:59 PM] ----  Deployment started.  ----
[06:33:59 PM] Target platform is  (Weblogic 12.x).
[06:33:59 PM] Running dependency analysis...
[06:33:59 PM] Building...
[06:33:59 PM] Deploying 2 profiles...
[06:33:59 PM] Wrote Web Application Module to /data/development/ENTW_12.1.3.0.0/QT/BlogReadConfigFile/BRCFViewController/deploy/BlogReadConfigFile_BRCFViewController_webapp.war
[06:33:59 PM] Wrote Enterprise Application Module to /data/development/ENTW_12.1.3.0.0/QT/BlogReadConfigFile/deploy/BlogReadConfigFile.ear
[06:33:59 PM] Elapsed time for deployment:  1 second
[06:33:59 PM] ----  Deployment finished.  ----

Next step is to open the admin console and to deploy the ear file


If we run the application we see in the log window that the application tries to read the properties file.
Running Application


We now want to change the properties file the application is using. For this we change the location and name of the properties file we configured in the web.xml file and change the content of the properties file:

The UI has not changed. However, after you click the refresh properties button and look into the log output you see
Application Tries to Read Properties from New Location


Please note that the location of the properties file has changed. If we had the file at the position defined in the deployment plan, the application would have read the properties from there.

Change Application Configuration at Run Time through a Properties File (Part 1)

Often users ask how to change some configuration properties, e.g. a report definition file or the endpoint of a web service, at run time of the application. This is not an easy task, as such configuration is normally deployed together with the application as part of the ear file and therefore can't be changed easily.

There are different possible solutions, like providing an MBean which can then be used in the WebLogic Server admin console to change values. A sample of this approach can be found e.g. here: Creating Mbeans(JMX) in ADF Application and accessing them from jrockit mission control.

In this blog I show a different approach which uses a configuration file that can be changed externally. The values read from the file are properties (key value pairs). If we make changes to the file, they are reflected at run time without the need to restart the application. Keep in mind that this approach does not work well on a clustered system, as there are multiple servers with multiple file locations which would all have to change. One way to overcome this is to put the file in a location on a shared file system which can be accessed from all servers.

To implement this use case we first have to think about how we get the path to the configuration file and its name to load it at run time.

To get the path to the configuration file we use a context parameter which we define in the web.xml file. The reason for this is that we need to change this parameter depending on the system we deploy the application to. In addition, you can't predict where an administrator would like to put the configuration file.

Context Parameter in web.xml
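
For reference, the context parameter could look like the following sketch (the parameter name is the one referenced later by the deployment plan; the value is just a sample location):

<context-param>
    <param-name>de.hahn.blog.readconfigfile.FILENAME</param-name>
    <param-value>/tmp/readconfigfile.properties</param-value>
</context-param>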

To load the properties we can use Java's default Properties class, which loads properties from a stream. The bits and bytes can be found in the source of the workspace, which is available on GitHub (see link at the end of the post).

One thing to notice is that this post uses Apache Commons-IO version 2.4. This update makes one other change necessary in the weblogic-application.xml file.

  <prefer-application-packages>
    <package-name>org.apache.commons.io</package-name>
  </prefer-application-packages>

This entry allows the application to load the included commons-io jar before the older version already available in WebLogic Server 12.1.3.

After setting the parameter, let's write a sample page where we show some of the properties, then change the configuration, reset the properties and see the changes on the page.
For the implementation we use an application scope bean. The reason is that the configuration parameters should be available application wide; there is no need to keep this info per session. If you read the configuration file for every parameter you can use a request scope bean.

In a live system I would not recommend using this approach (reading the configuration file every time) because it produces a bottleneck by reading the file over and over again. Instead I would call the method periodically or as the result of a user action like a button click.

OK, let's create a configuration file in the WEB-INF folder, or if you like in any other folder. Once the file is created we copy it into a temporary folder on the system (/tmp on mine) and read it from there.
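
A sample of what such a configuration file could contain (a sketch; the actual key names behind PROPERTY_NAME and PROPERTY_VERSION are assumptions here):

# readconfigfile.properties - sample content
name=BlogReadConfigFile
version=V1.0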


Now we need an application scope bean

The bean has a getProperties method which checks if the properties have already been read; if not, it reads the context parameter from web.xml and loads the file from the given location.

    /**
     * This method safely gets the properties from a file if the file can be read, otherwise it returns an empty properties object
     * @return Properties object read from the file, or an empty properties object if the file was empty or could not be found
     */
    public Properties getProperties() {
        if (properties == null) {
            FileInputStream fileInputStream = null;
            try {
                // read context parameter
                String initParameter = FacesContext.getCurrentInstance().getExternalContext().getInitParameter(PROPERTYFILE_PARAMETER);
                logger.info("Read properties from " + initParameter);
                // try to read the file
                File file = FileUtils.getFile(initParameter);
                fileInputStream = FileUtils.openInputStream(file);
                properties = new Properties();
                properties.load(fileInputStream);
                logger.info(properties.size() + " properties successfully read.");
            } catch (IOException ioe) {
                logger.warning("Error: Property file could not be read!", ioe);
                properties = new Properties();
            } finally {
                if (fileInputStream != null)
                    try {
                        fileInputStream.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
            }
        }
        return properties;
    }

    /**
     * Reset the read properties so that the next try to read a property reads the file again
     */
    public void readPropertiesAgain() {
        logger.info("Reset properties!");
        properties = null;
    }

    /**
     * Method to return the version information of the configuration file.
     * This information is compiled from the keys PROPERTY_NAME and PROPERTY_VERSION
     * @return version information read from the file
     */
    public String getPropertyVersionInfo() {
        String property = getProperty(PROPERTY_NAME);
        String property_2 = getProperty(PROPERTY_VERSION);
        String res = "Unknown";
        if (property != null && property_2 != null) {
            res = property + " " + property_2;
        }
        logger.info("Propertyinfo: " + res);

        return res;
    }

The second method is used to reset the local storage of the properties, so that the next time a property is read the whole file will be read again. The third method is used to get the version information of the configuration file, which is built as the concatenation of two properties.

On a page we add a button to reset the properties in the application scoped bean. After this the configuration file will be read again.

<af:button text="Read Configuration again" id="b1"
 actionListener="#{ReadPropertyFileBean.rereadActionListener}"
 partialSubmit="true"/>
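
The rereadActionListener referenced by the button is not part of the listing above; a minimal sketch, assuming it simply delegates to readPropertiesAgain() (it needs an additional import of javax.faces.event.ActionEvent):

    /**
     * Action listener for the button above which resets the stored properties.
     * @param actionEvent event fired by the button (not used)
     */
    public void rereadActionListener(ActionEvent actionEvent) {
        // drop the cached properties; the next getProperties() call reads the file again
        readPropertiesAgain();
    }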

Finally we add an outputText component to the page which uses an EL expression to read the propertyVersionInfo from the application scoped bean ‘ReadPropertyFileBean’.

<af:outputText id="ot6" inlineStyle="font-size:small;" 
 value="#{ReadPropertyFileBean.propertyVersionInfo}"
 partialTriggers="b1"/>

When we run the application we see the initial screen like

Initial index.jsf


we change the configuration file
Changed Configuration File


and reset the properties via a click on the button
Changed index.jsf

This concludes part 1. In the final second part we'll see how to change the fixed path set as a context parameter in web.xml during deployment. This allows us to deliver a properties file together with the application but lets the administrator decide where to put it on the server.

The workspace for this sample can be downloaded from GitHub BlogReadConfigFile Release 1.0, or get the zipped workspace of this release 1.0.
The sample was built using JDeveloper 12.1.3 and uses the HR DB schema.

The Git Experience (Part 4)

In this part of the ‘Git Experience’ series we are looking at GitFlow. GitFlow is a branching model which helps you and your company structure your work in a way which is understandable and has proven its value in many projects. I don't want to copy all the information given in the link about GitFlow, but only use the image from the blog post:

GitFlow Model

What we see in this image is a timeline of development with releases, hot fixes, development and feature branches and how they work together. We like to use this model to structure the work of the development. The development team needs to set up the software for releases which are delivered to the customer or an internal server. Then there is the need to supply hot fixes if a release version has a major bug. Nevertheless development has to develop for the next release, probably breaking the task into smaller pieces we call features. To keep these features in our repository as well, we use feature branches which are merged back to the development branch once they are ready and tested.

The development branch is the mainline of the development. Feature branches as well as release branches are started from the development branch. Once a release is ready, the release branch is merged and tagged on the master branch and merged back to the development branch.

IMPORTANT: you never work directly on the master branch!

In the last part, ‘The Git Experience (Part 3)’, we started a new repository on GitHub which we use in this part to introduce GitFlow. There are several ways to do this. You can use the command line and execute the git commands from there (see the sketch below). Or you can use shell scripts which put multiple git commands together as a unit of work and call a script to e.g. start a new feature branch. The last and least complicated way is to use a tool which already has the scripts set up for you and gives you a nice GUI to work with.
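
For the command-line route, a minimal sketch using the git-flow extension (assuming the extension is installed; with its default prefixes the feature branch is named feature/Feature_1):

git flow init                      # answer the branch-name prompts, the defaults are fine
git flow feature start Feature_1   # branches off develop
git flow feature finish Feature_1  # merges back into develop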

We follow the last suggested way and use a tool with a graphical user interface. As JDeveloper does not support the usage of GitFlow with a GUI, we use an external tool like ‘SmartGit’ or ‘SourceTree’, which both come with a graphical user interface supporting GitFlow. For the remainder of this blog we use SmartGit, as it's available for Windows and Linux operating systems and free for non-commercial use.

Once we have started SmartGit we can add our local repository so it is shown in SmartGit. Don't confuse this with cloning a repository: cloning fetches a remote repository from a remote server and creates a local copy of it on your PC. As we already have the local repository on our machines, we just add the local repository.

For those of you who did not create the repository in the last part, you can clone it from my GitHub repository using ‘https://github.com/tompeez/BlogReadConfigFile.git’ as the url for the clone command.

After this, SmartGit looks the same as in the last image after adding the repository. You can now play with the SmartGit UI (or any other tool you are using). One thing I'd like to bring to attention is the ‘Log’ button. Clicking this button opens a new window which shows the timeline of all commits.

Right now we only see two nodes which were created during creation of the repository.

Let the fun begin: Introduce GitFlow to the project

Now that the local repository is shown in the tool of choice, let's introduce GitFlow to it. For this we click the GitFlow button, select the ‘Full’ radio button and leave the rest of the options as they are.

This will add another branch to the repository named ‘develop’

Repository after GitFlow Introduction

However, this new branch is not the current branch, as the ‘master’ branch is still marked current. Please also note the different color of the GitFlow button. In this shape it starts a hotfix, as the master branch is the current branch and all hot fixes are started from the master branch, or better, from a release tag on the master branch.

GitFlow Button: Start HotFix

We change this by double clicking the develop branch to get

and see that the GitFlow button changes to a different default action, ‘Start Feature’, as we are now working on the develop branch. Before we start our first feature we take a look at the GitHub remote repository:

GitHub Timeline

As we see, the remote repository still has only one branch, ‘master’. This is a lesson we have to learn fast: everything we do, we do locally. The remote repository doesn't know about our work until we tell it or push our changes to it.

Let’s push the ‘develop’ branch to the remote repository by clicking the Push button

The last thing to do is some housekeeping on the GitHub side. Here we set the ‘develop’ branch as the ‘default’ branch.

Now we are ready to start our first feature. Remember that the feature branch is only created in the local repository and not automatically pushed to the remote GitHub repository. If you want the feature branch to be visible in the remote repository you have to push it there after creating it.

We create a feature ‘Feature_1’ (you should choose a more meaningful name!):

As we see, the new branch is the current working branch. We now make some changes, e.g. adding a header above the table, and then look at the changes in the SmartGit and GitHub UI.

We add a panelGroupLayout to the top facet of the panelStretchLayout to add a header telling us what we see, and another text telling us that this was added with ‘Feature_1’. This is just for the time we are playing with the GitFlow features; we can safely remove this second text later.

As we see, all tools show the same changed files. The interesting thing is that we see a changed index.java file. The only change we made was to add something to the index.jsf file!

Well, this second change was not intended but is the result of a setting in JDeveloper which adds a property to a backing bean for every component we add to the index.jsf file. Before we remove this nonsense setting let’s save the changes to our repository and look at the different tools:

In the first image we see an interesting piece of info: ‘Outgoing (1)’. This means that SmartGit knows that this branch isn't connected to the remote repository and can't be seen there. This isn't strictly necessary, as Git is a distributed version control system, but other users can't get to this branch if the PC holding it isn't available (due to network restrictions or because it's offline).

After this the new branch is visible and tracked in GitHub. Now we can remove the unneeded and unwanted index.java backing bean.

Why is it there in the first place?

If we create a new view in a task flow we have the option to activate the automatic component binding to a backing bean in this dialog

Automatic Component Binding

Well, it's either a bug in JDev 12.1.3 or a saved configuration I made to investigate something else which uses automatic component binding. Let's assume the latter and remove this setting.

Now we have to remove line 76 in the index.jsf file, fix the bindings in index.jsf by replacing them (find: binding=\"#{backingBeanScope.backing_index..*\"), remove the bean from adfc-config.xml and finally remove the index.java file from the project.

Now we can compile the source, test the application and then save the changes in the repository. Don’t forget to push the changes to the remote repository!

If you use SmartGit to commit the changes you can commit and push in one command by clicking the ‘Commit & Push’ button in the dialog. The final timeline in SmartGit looks like

SmartGit Timeline

Time to wrap everything up. We made some changes and are now finished with our feature. We finish the feature in SmartGit by clicking the GitFlow button and following the dialog.

The second image shows the options we have when finishing a feature. Here we can decide to remove the feature branch completely or, as we do, keep it for later. As features are not used by GitFlow to build a release or hot fix on, there is generally no need to keep them after they are finished and merged back into the development branch.

The final thing to close the circle is to build a release from the current development branch.

We add a release note part to the README.MD file to make the release visible in the file too. Now we commit the change (not shown) and finish the release

In image 2 we can set the options we want to use to finish the release. The one we change from the default is that we'd like to keep the release branch so that we can see it in the timeline. This is not necessary, as you can't do anything with the branch (besides cherry-picking :)). The last image shows the SmartGit timeline where we see all commits and the different branches used. This should look like the GitFlow image we started this blog with.

This finishes part 4 of the blog. The repository (and its branches) can be cloned or downloaded from GitHub.

The Git Experience (Part 3)

Almost a year ago I posted the second part of my Git experience series (JDeveloper 11.1.1.6.0: The Git Experience (Part 2)). In that part I announced a third part which would handle working with remote Git repositories.

I totally forgot about this. Only after a question on the OTN JDeveloper and ADF forum did I look up the older blogs about Git. Since the last post a new JDev version, 12.1.3, has arrived, so I decided to use this new version to continue the series.

In this part we are talking about setting up a remote repository. As the remote Git server I use GitHub, which allows you to easily set up an account for non-commercial use without any cost. Please read all about the process on the GitHub Help web page.

Before we begin we have to talk about security when accessing the remote repository. There are multiple possible connection strategies. The more common ones are ssh and https; both are supported by GitHub, and https is the recommended one.

In this blog I use https to authenticate and work with GitHub; all you need to remember is your GitHub username and its password. I tested ssh too, but like https better as you don't have to handle the ssh keys on every PC you use (and I use up to 10 different machines).

I assume you have created an account on GitHub which you can use. If you don't want to create one yourself you can use the repository of this blog (link available at the end of the blog). Creating a new remote repository can't be done without an account. Part 4 of the series, which will be published in a couple of days, will talk about how to do real work with remote repositories and introduce GitFlow as a work template.

Let's start with a typical workflow. We want to implement a new application using JDev 12.1.3 and want to use Git as the version control system. As the development is done in multiple locations, the repository should be accessible to all team members 24/7. This excludes a repository on a local PC which others might not be able to reach (e.g. if it's down).

We generate a new ADF Fusion Web Application. I'll spare you the walk through this process. As the name of the new application workspace we choose BlogReadConfigFile; any other name you like is OK too.

Once the new application workspace is ready we'd like to put it under Git control. We first do this locally by using the ‘Team->Version Application…’ menu. See the gallery below for the whole process.

Notice the ‘.git’ folder which holds all information about the Git versions of the application on your local PC. The freshly created local repository can already be used to make changes and commit them to the local repository. Other team members can clone the repository from your PC if they have access to it.

As the new workspace is under version control we can and should open a new window in JDev via the menu ‘Window->Team->Version’ to get a view of the version information available in JDev:

Open Version Window

Expanded Version Window

As you'll notice, all remote nodes in the tree are empty. The local master branch is the current branch we are working on, visible through the green badge. Now we want to make this local repository available on a remote Git server, GitHub. The first problem is that JDev 12.1.3 (and all other versions I know of) doesn't support creating a remote repository. We can add existing remote repositories by right-clicking the ‘Remotes’ node and selecting ‘Add Remote…’

Add a Remote Repository

Add Remote Dialog

As you see, you can only add the URL of an existing remote repository but can't create one. So we have to create the remote repository on GitHub first. Read ‘Setup a new Repository’ to find out how to do this. Make sure you make it a public repository, and leave the check box to initialize the repository with a README.md file unchecked for the moment.

Create Repository in GitHub

You can add a default ‘.gitignore’ file and a license via this dialog too. We use a ‘.gitignore’ file from other JDev projects, as the default Java one doesn't know about some of the artifacts used by JDev. After finishing the dialog you get the information from GitHub on how to get data into the new repository.
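
As a rough sketch, such a ‘.gitignore’ could add JDev-specific entries like the following on top of the default Java one (the exact list is an assumption; adjust it to your projects):

# JDeveloper build output and deployment artifacts
classes/
deploy/
# merge/backup leftovers
*.orig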

How to add Data

If you decide to use ssh instead of https you can click the button on the left side of the URL and the info switches accordingly.

As we already have our local repository, we use the second method, ‘…or publish an existing repository from the command line’. For this we open a command shell and change to the location of the workspace (‘/data/development/ENTW_12.1.3.0.0/QT/BlogReadConfigFile’ on my machine). Copy the commands from the second option into the command shell. You are asked for your credentials to log into GitHub; after that, the local repository is pushed to GitHub.

Command Lines to Push Data to Repository
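
For reference, the commands GitHub shows for this option look like the following (using the repository URL from above; ‘master’ was the default branch name at the time):

git remote add origin https://github.com/tompeez/BlogReadConfigFile.git
git push -u origin master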

When we look at the GitHub web page we see the data from the workspace

Finished Remote Repository

As mentioned at the bottom of the page we should add a README.md file. This can be done directly on the web page, or you can add such a file on your local PC and push it to the repository. I like to do this through the web page as it copies the description from the creation of the repository into the file.

Add README.md

Now if we look at the Versions window in JDev we see that the remote repository is visible in JDev too.

Updated Version Window

JDev knows about the changes as they are all in the ‘.git’ folder in the workspace. Right now command line Git, JDev Git and third party tools like SmartGit, SourceTree or msysgit (or Tortoise Git on Windows) play well together. You can use them interchangeably.

As we have added the README.md file using the web page, our first action is to refresh the workspace to get the README.md file into our local repository. For this we use the menu ‘Team->Git->Pull…’, work through the wizard and check the file system after the work has finished.

This concludes this part 3 of the series. You can clone the repository from GitHub. Stay tuned for the next part where we introduce GitFlow and start working on the repository.

DOAG Development Conference, JDeveloper 12c New Features

Yesterday, 14th June 2012, the first DOAG Development Conference took place in Bonn, Germany. Among the many sessions, one hosted by Frank Nimphius revealed new features of the upcoming major release of JDeveloper 12c. As the version is still in development there is no guarantee that all the features he talked about will make it into 12c. Nevertheless I'd like to mention some of the features. The list I present here is not the full list, as there are way too many to mention them all.

    • GlassFish will be a supported server
    • Pretty URL: URLs of applications will be more human readable, aka we don’t see the state tokens any longer
    • IDE window system is based on NetBeans code
    • New wizards to create OSGi extensions. Most code to wire the extension up is generated; we only need to add code to some methods.
    • No need to deploy an extension: it can be tested and run right out of JDeveloper 12c
    • Full Maven Support with Hudson integration: ojdeploy plugin for Maven
    • Better REST service support
    • Groovy debugger: allows setting breakpoints in Groovy code and seeing the attributes like in the normal Java debugger
    • EL 2.2 support: methods with parameters possible
    • Write EL directly in the page without the need of an af:outputText
    • Faster page loading: use of CSS3, HTML5 and smaller JavaScript size allows faster page loading
    • JDev helps to (manually) convert jspx to jsf pages. You need to do it manually but get hints or work lists on what to do
    • Multi file upload component together with a nice looking progress window (popup)
    • ‘Placeholder’ watermarks for inputText components
    • New component to edit code: like the rich text editor but for code
    • Templates can not only be used, but copied into your project to allow easier manipulation
    • More help to find out which skin selectors a component uses
    • All button and link components are put together in one ‘magicButton’ component (thanks Chris for the name). The old components are still available.

There was plenty more which I couldn't write down in the short time. Let's see when DOAG provides the slides; I'll add more info then.