Short recap on OFM Summer Camps 2014

Last week the Oracle Fusion Middleware Summer Camps took place in Lisbon. More than 100 participants attended the event, learning a lot about the new features and enhancements arriving with the recently released FMW 12c. In four parallel tracks, the highlights of the new major release were presented to the attendees; hands-on labs allowed them to get a first impression of the new platform features and the markedly increased productivity delivered by the enhanced, consolidated tooling.

The four tracks had different focuses regarding the new features of the 12c release of the Oracle Middleware platform:

  • SOA 12c – focusing on application integration, including Oracle Managed File Transfer (MFT), and fast data with Oracle Event Processing (OEP)
  • BPM 12c – focusing on Business Process Management, the new enhanced Business Activity Monitoring (BAM) and Adaptive Case Management (ACM)
  • SOA/BPM 12c (Fast track) – Combined track, covering the most important enhancements and concepts with reference to SOA and BPM 12c
  • Mobile Application Framework (MAF) Hackathon – Development of mobile applications using the newly released MAF (formerly known as ADF mobile)

The main topics addressed by the new OFM 12c release are:

  • Cloud integration
  • Mobile integration
  • Developer’s performance
  • Industrial SOA

Cloud integration

Integrating cloud solutions into grown IT system landscapes is complex. With SOA Suite 12c, Oracle provides a coherent and simple approach for integrating enterprise applications with existing cloud solutions. To this end, new JCA-based cloud adapters, e.g. for integrating with Salesforce, as well as a Cloud SDK are available. Service Bus can be used in this context to take care of transformation and routing, and forms the backbone of a future-oriented, flexible and scalable cloud application architecture.

Mobile integration

Mobile-enablement of enterprise applications is a key requirement and a necessity for application acceptance today. The new JCA REST adapter can be used to easily REST-enable existing applications. In combination with Oracle MAF and Service Bus, Oracle provides a complete Mobile Suite, where seamless development of new mobile innovations can be done.

Developer’s performance

To enhance development performance, the new SOA and BPM Quickinstalls are introduced. Using these, developers can have a complete SOA or BPM environment installed in 15 minutes (see the blog post of my colleague). Furthermore, new debugging possibilities, different templating mechanisms (SOA project templates, custom activity templates, BPEL subprocesses and Service Bus pipeline templates) as well as JDeveloper as the single and only IDE deliver a greatly improved development experience.

Industrial SOA

Industrializing SOA is a main goal when starting a SOA initiative: transparent monitoring and management and a robust, scalable and performant platform are key to successfully implementing SOA-based applications and architectures. These points are addressed by the new OFM 12c release through the following features:

  • Lazy Composite Loading – Composites will be loaded on demand and not at platform startup
  • Modular Profiles – Different profiles are provided, enabling only the features currently needed (e.g. only BPEL)
  • Improved Error Hospital and Error Handling
  • Optimized Dehydration behaviour
  • Integrated Enterprise Scheduler (ESS)

Further main enhancements introduced with SOA and BPM Suite 12c were:

  • Oracle BPM Suite 12c: Definition of Business Architecture, including definition of Key Performance Indicators (KPI) and Key Risk Indicators (KRI) to provide an integral overview from a high-level perspective; ACM enhancements in the direction of predictive analytics
  • Oracle BAM 12c: Completely re-implemented in ADF; allows operational analytics based on the defined KPIs and KRIs
  • Oracle MFT: Managed File Transfer solution for transferring big files from a specified source to a defined target; integration with SOA/BPM Suite 12c can be done via the new JCA-based MFT adapters

Looking back, a great and very interesting week lies behind me, providing a bunch of new ideas and impressions of the new Fusion Middleware 12c release. I’m looking forward to using some of this great new stuff soon in real-world projects.

Special thanks to Jürgen Kress for the excellent organization of the event! I’m already looking forward to the next SOA Community event…

IT-Security (Part 7): WebLogic Server, Roles, Role Mapping and Configuring a Role Mapping Provider

Key words: IT-Security, WebLogic Server, Authorization, authorization process, Role Mapping, Roles and  XACML Role Mapping Provider

Let’s continue with the authorization topic. We discussed the authorization process and its main components, such as the WebLogic Security Framework and security providers. Now we look at the security provider’s subcomponents: role mapping and security policies.

The Role Mapping: Is access allowed?

Role Mapping providers help to determine whether a user has the adequate role to access a resource. With this role information, the Authorization provider can answer the “is access allowed?” question for WebLogic resources.[1]

The Role Mapping Process

Role mapping is the process whereby principals are dynamically mapped to security roles at runtime. As part of an authorization decision, the WebLogic Security Framework sends request parameters to the specific Role Mapping provider that is configured for a security realm. Figure 1 Role Mapping Process presents how the Role Mapping providers interact with the WebLogic Security Framework to create dynamic role associations. The result is a set of roles that apply to the principals stored in a subject at a given moment.[2]



Figure 1 Role Mapping Process

Let’s review each part again[3]:

  • The request parameters include information such as the subject of the request and the WebLogic resource being requested.
  • The Role Mapping provider contains a list of roles. For instance, if a security policy specifies that the requestor is entitled to a particular role, the role is added to the list of roles that are applicable to the subject.
  • In response, the WebLogic Security Framework receives the list of roles.
  • These roles can then be used to make authorization decisions for protected WebLogic resources, as well as for resource container and application code. I’m going to discuss that in part 9.

Configuring a Role Mapping Provider

WebLogic Server includes the XACML Role Mapping provider and the DefaultRoleMapper. In addition, you can use a custom Role Mapping provider in your security realm. By default, most configuration options for the XACML Role Mapping provider are already defined. However, you can set Role Mapping Deployment Enabled, which specifies whether this Role Mapping provider imports information from deployment descriptors for Web applications and EJBs into the security realm. This setting is enabled by default. In order to support Role Mapping Deployment Enabled, a Role Mapping provider must implement the DeployableRoleProvider SSPI. Roles are stored by the XACML Role Mapping provider in the embedded LDAP server.[4] The XACML Role Mapping provider is the standard Role Mapping provider for the WebLogic Security Framework. To configure a Role Mapping provider:

  • In the Change Center of the Administration Console, click Lock & Edit


Figure 2 Change Center

  • In the left pane, select Security Realms and click the name of the realm you are configuring.


Figure 3 Domain Structure: Click Security Realms



Figure 4 Summary of Security Realms


  • Select Providers > Role Mapping. The Role Mapping Providers table lists the Role Mapping providers configured in this security realm


Figure 5 myrealm: Role Mapping

  • Click New. The Create a New Role Mapping Provider page appears.


Figure 6 WebLogic Server default Role Mapping Provider: XACMLRoleMapper

  • In the Name field, enter a name for the Role Mapping provider. From the Type drop-down list, select the type of the Role Mapping provider (e.g. DefaultRoleMapper or XACMLRoleMapper) and click OK.


Figure 7 a New Role Mapping Provider: Default_1


  • Select Providers > Role Mapping and click the name of the new Role Mapping provider to complete its configuration.



Figure 8 Role Mapping Configuration

  • Optionally, under Configuration > Provider Specific, set Role Deployment Enabled if you want to store security roles that are created when you deploy a Web application or an Enterprise JavaBean (EJB) (See Figure 8 Role Mapping Configuration).
  • Click Save to save your changes.
  • In the Change Center, click Activate Changes and then restart WebLogic Server.

XACML Role Mapping Provider

As we discussed above, a WebLogic security realm is configured by default with the XACML Role Mapping provider. It implements XACML 2.0, the eXtensible Access Control Markup Language, which is the standard access control policy markup language. The WebLogic XACML Role Mapping provider data is saved as a .dat file, available e.g. at $Domain-Home/XACMLRoleMapper.dat, and has the following options (see Figure 8 Role Mapping Configuration):

  • Name: The name of your WebLogic XACML Role Mapping provider.
  • Description: The description of your WebLogic XACML Role Mapping provider.
  • Version: The version of your WebLogic XACML Role Mapping provider.
  • Role Deployment Enabled: Specifies whether this Role Mapping provider stores roles that are created while deploying a Web application or EJB.

You can see the file structure in the following example: XACMLRoleMapper.dat contains different users and groups. Each user is assigned particular roles, policies and associated resources. For example, you can see the description of the group and user “Administrators” below:


Figure 9 XACMLRoleMapper.dat: description of Group and User “Administrators”

As you can see, a policy contains a Description, a Target and a Rule. Each element is associated with different attributes; in this form an “authorization matrix” is prepared, which helps the application server decide about a user or a group. Continued…
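For illustration, the following is a rough sketch of what such a role-assignment policy looks like. The element names follow the XACML 2.0 policy schema described above (Policy with Description, Target and Rule), but the PolicyId and the attribute values here are placeholders of my own, not copied from an actual XACMLRoleMapper.dat:

```xml
<Policy PolicyId="urn:example:entitlement:role:Admin"
        RuleCombiningAlgId="urn:oasis:names:tc:xacml:1.0:rule-combining-algorithm:first-applicable">
  <Description>Role Admin granted to members of group Administrators</Description>
  <Target>
    <!-- restricts the policy to the role resource being decided on -->
  </Target>
  <Rule RuleId="grant-role" Effect="Permit">
    <!-- condition: the subject is a member of group "Administrators" -->
  </Rule>
</Policy>
```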


See also the last parts of IT-Security and Oracle Fusion Middleware:


[1] Oracle® Fusion Middleware Securing Oracle WebLogic Server 11g Release 1 (10.3.6), E13707-06

[2] Oracle® Fusion Middleware Understanding Security for Oracle WebLogic Server 11g Release 1 (10.3.6), E13710-06

[3] Oracle® Fusion Middleware Understanding Security for Oracle WebLogic Server 11g Release 1 (10.3.6), E13710-06

[4] Oracle® Fusion Middleware Securing Oracle WebLogic Server 11g Release 1 (10.3.6), E13707-06

Finding differences in two Open-Office-Writer documents

If you write documents and get feedback from different persons on different versions, it is a great pain to merge the documents and changes together. Microsoft Word has a compare functionality that works quite well, but the function to compare documents in Open Office Writer has never worked for me the way I expected.

Fortunately, OO stores documents as a zip file containing XML files. The main content of the document is in the file content.xml. After changing the extension of the OO Writer document to zip, it is possible to open the file with your favorite zip application and extract the content.xml file. If you do this for both versions, you can compare the two files with your favorite text compare tool and you will see … hmmm yes… thousands of changes. This happens especially if the documents have been edited with different versions of Open Office or Libre Office. Most of the changes are not relevant for your comparison.
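As a side note, the extraction step can also be scripted. Below is a sketch using only the JDK’s java.util.zip; the class name and file names are mine, not part of any tool mentioned here:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class OdtExtractor {

    // Pulls content.xml out of an OO/LibreOffice document,
    // which is just a zip archive with a different extension.
    public static void extractContent(Path odtFile, Path target) throws IOException {
        try (ZipFile zip = new ZipFile(odtFile.toFile())) {
            ZipEntry entry = zip.getEntry("content.xml");
            if (entry == null) {
                throw new IOException("content.xml not found in " + odtFile);
            }
            try (InputStream in = zip.getInputStream(entry)) {
                Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }
}
```

Run once per version, e.g. extracting documentV1.odt to contentV1.xml and documentV2.odt to contentV2.xml, and you have the two files to compare.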

So we would like to eliminate the changes we are not interested in, to get an overview of the real changes.

We will do this using Notepad++, the tool I use most for work. Additionally, for formatting the document, we need the XML Tools plugin. Both are free.

We open both versions of content.xml with Notepad++ and first run “Linearize XML” from XML Tools on both files.

In the next step we replace these six regular expressions with an empty string. This is done repeatedly until no further replacement is possible:

1. [a-zA-Z0-9\-]+:[a-zA-Z0-9\-]+="[^"]*"
2. <([a-zA-Z0-9\-]+:)?[a-zA-Z0-9\-]+\s*/>
3. <([a-zA-Z0-9\-]+:)?[a-zA-Z0-9\-]+\s*>\s*</([a-zA-Z0-9\-]+:)?[a-zA-Z0-9\-]+>
4. <text:changed\-region\s*>.*?<\/text:changed\-region>
5. <office:annotation\s*>.*?<\/office:annotation>
6. <text:bookmark-ref\s*>.*?<\/text:bookmark-ref>
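If you have to clean many documents, the same fixed-point replacement can be scripted instead of clicked. The following is a small sketch in plain Java (the class and method names are mine, not from any tool above) that applies the six expressions repeatedly until the text no longer changes:

```java
import java.util.List;
import java.util.regex.Pattern;

public class OoContentCleaner {

    // The six expressions from the post, compiled once. DOTALL lets the
    // region/annotation patterns match across embedded line breaks.
    private static final List<Pattern> PATTERNS = List.of(
        Pattern.compile("[a-zA-Z0-9\\-]+:[a-zA-Z0-9\\-]+=\"[^\"]*\""),
        Pattern.compile("<([a-zA-Z0-9\\-]+:)?[a-zA-Z0-9\\-]+\\s*/>"),
        Pattern.compile("<([a-zA-Z0-9\\-]+:)?[a-zA-Z0-9\\-]+\\s*>\\s*</([a-zA-Z0-9\\-]+:)?[a-zA-Z0-9\\-]+>"),
        Pattern.compile("<text:changed-region\\s*>.*?</text:changed-region>", Pattern.DOTALL),
        Pattern.compile("<office:annotation\\s*>.*?</office:annotation>", Pattern.DOTALL),
        Pattern.compile("<text:bookmark-ref\\s*>.*?</text:bookmark-ref>", Pattern.DOTALL));

    // Apply all six replacements repeatedly until a fixed point is reached.
    public static String clean(String xml) {
        String previous;
        do {
            previous = xml;
            for (Pattern p : PATTERNS) {
                xml = p.matcher(xml).replaceAll("");
            }
        } while (!xml.equals(previous));
        return xml;
    }

    public static void main(String[] args) {
        String sample = "<text:p text:style-name=\"P1\">Hello</text:p><text:s/>";
        System.out.println(clean(sample)); // prints <text:p >Hello</text:p>
    }
}
```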

Finally, we use the “Pretty print (libxml)” function of XML Tools to get the XML files formatted. Now it is possible to compare the two files with a tool for comparing text files, and you will see the real text changes.

Bernhard Mähr @ OPITZ-CONSULTING published at

Categories: English, Uncategorized

IT-Security (Part 6): WebLogic Server and Authorization

Key words: IT-Security, WebLogic Server, WebLogic Security Framework, Authorization, authorization process, Role Mapping, Roles, Adjudication Process, Security Service Provider Interfaces (SSPIs), Users, Groups, Principals and Subjects

We discussed authentication in Parts 4 and 5[1]; now let us focus on the authorization topic. Authorization is also known as access control and is used to answer questions such as: “What can you access?”, “Who has access to a WebLogic resource?”, “Is access allowed?” and, in general, “Who can do what?” In order to guarantee the integrity, confidentiality (privacy), and availability of resources, WebLogic restricts access to these resources. In other words, the authorization process is responsible for granting access to specific resources based on an authenticated user’s privileges.

Authorization: What can you access?

After a user is authenticated, the first question the system has to answer is: “What can you access?” In this sense, WebLogic Server has to determine which resources are available for a particular user, which is decided using the user’s security role and the security policy assigned to the requested WebLogic resource. A WebLogic resource is generally understood as a structured object used to represent an underlying WebLogic Server entity, which can be protected from unauthorized access using security roles and security policies. WebLogic resource implementations are available for[2]:

  • Administrative resources
  • Application resources
  • Common Object Model (COM) resources
  • Enterprise Information System (EIS) resources
  • Enterprise JavaBean (EJB) resources
  • Java Database Connectivity (JDBC) resources
  • Java Messaging Service (JMS) resources
  • Java Naming and Directory Interface (JNDI) resources
  • Server resources
  • Web application resources
  • Web service resources
  • Work Context resources

The Authorization Process

I’m going to explain the whole process in a top-down approach. First of all, we have to see what happens in the authorization process. Figure 1 Authorization Process[3] shows how the WebLogic Security Framework communicates with a particular security provider and its Authorization providers, respectively.



Figure 1 Authorization Process

If a user wants to use a protected resource, the request is received by the resource container that handles the type of WebLogic resource being requested (for example, the EJB container receives the request for an EJB resource). The container forwards the request and its parameters, including information such as the subject of the request and the WebLogic resource being requested, to the WebLogic Security Framework. The Role Mapping providers use the request parameters to compute a list of roles to which the subject making the request is entitled and pass the list of applicable roles back to the WebLogic Security Framework. Based on this information, the authorization decision is made: e.g. PERMIT and/or DENY. WebLogic Server also provides auditing to collect, store and distribute information about requests and their outcomes. It can happen that multiple Authorization providers are configured; for such cases an Adjudication provider is available. The WebLogic Security Framework delegates the job of merging any conflicts in the access decisions rendered by the Authorization providers to the Adjudication provider. It resolves the conflicts and sends a final decision (TRUE or FALSE) to the WebLogic Security Framework.[4]

WebLogic Security Framework

I mentioned the WebLogic Security Framework briefly in Parts 1 and 2[5]. Figure 2 WebLogic Security Service Architecture shows a high-level view of the WebLogic Security Framework. The framework comprises interfaces, classes, and exceptions, and provides a simplified application programming interface (API) that can be used by security and application developers to define security services. Within that context, the WebLogic Security Framework also acts as an intermediary between the WebLogic containers (Web and EJB), the resource containers, and the security providers[6].


Figure 2 WebLogic Security Service Architecture

The Security Service Provider Interfaces (SSPIs) can be used by developers and third-party vendors to develop security providers for the WebLogic Server environment[7].

Security Provider

Figure 1 Authorization Process presents the security provider as the next module; it provides security services to applications to protect WebLogic resources. A security provider consists of runtime classes and MBeans, which are created from SSPIs and/or MBean types. Security providers are either WebLogic security providers (provided with WebLogic Server) or custom security providers. You can use the security providers that are provided as part of the WebLogic Server product, purchase custom security providers from third-party security vendors, or develop your own custom security providers.


In order to complete the authorization process, role mapping within the security provider is necessary. Simply put, a role mapper maps a valid token to a WebLogic user. Before we focus on roles, I would like to clarify a few more terms.

Users, Groups, Principals and Subjects

A user is an entity that was authenticated by our security provider in the last steps (see Part 4 and 5 – Authentication Process[8]). A user can be a person, a software entity or another instance of WebLogic Server. As a result of authentication, a user is assigned an identity, or principal. A principal is an identity assigned to a user or group as a result of authentication and can consist of any number of users and groups. Principals are typically stored within subjects. Both users and groups can be used as principals by WebLogic Server.

Groups are logically ordered sets of users. Usually, group members have something in common. For example, a company may separate its IT-Department into two groups, Admins and Developers. In this form, it will be possible to define different levels of access to WebLogic resources, depending on their group membership. Managing groups is more efficient than managing large numbers of users individually. For example, an administrator can specify permissions for several users at one time by placing the users in a group, assigning the group to a security role, and then associating the security role with a WebLogic resource via a security policy. All user names and groups must be unique within a security realm[9].

Security Roles

A role is a dynamically computed privilege that is granted to users or groups based on specific conditions. The difference between groups and roles is that a group is a static identity that a server administrator assigns, while membership in a role is dynamically calculated based on data such as user name, group membership, or the time of day. Security roles are granted to individual users or to groups, and multiple roles can be used to create security policies for a WebLogic resource[10].

Like groups, security roles allow you to restrict access to WebLogic resources for several users at once. However, unlike groups, security roles[11]:

  • Are computed and granted to users or groups dynamically, based on conditions such as user name, group membership, or the time of day.
  • Can be scoped to specific WebLogic resources within a single application in a WebLogic Server domain (unlike groups, which are always scoped to an entire WebLogic Server domain).

Granting a security role to a user or a group confers the defined access privileges to that user or group, as long as the user or group is “in” the security role. Multiple users or groups can be granted a single security role. It can be summarized as follows:

Groups are static and defined on Domain level (coarse granularity) and Roles are dynamic and defined on Resource level (fine granularity). Continued…

See also the last parts of IT-Security and Oracle Fusion Middleware:


[1] See:


[2] Oracle® Fusion Middleware Understanding Security for Oracle WebLogic Server, 11g Release 1 (10.3.6), E13710-06

[3] Oracle® Fusion Middleware Securing Oracle WebLogic Server 11g Release 1 (10.3.6), E13707-06

[4] Oracle® Fusion Middleware Securing Oracle WebLogic Server 11g Release 1 (10.3.6), E13707-06

[5] See:


[6] See:

[7] See:

[8] See:


[9] See:

[10] See:

[11] See:

camunda BPM – Mocking subprocesses with BPMN Model API

A common way to call a reusable subprocess is to use a call activity in the BPMN 2.0 model. With a call activity, it is only necessary to add the process key of the subprocess and its version to the call activity properties; then the modeling can continue. Apart from this, it is possible to define process variables to pass between the main process and the subprocess.

But during unit testing, the main process and all subprocesses referenced by the defined process keys must exist in the process engine repository.

The easiest way to solve this problem is to replace the defined process key with the key of a mock process that exists in the repository. But it is not advisable to change a process model for testing purposes only: it takes time to undo these changes when the real subprocess is completed, and such changes could be forgotten because the model was already tested successfully.

Creating a mock process with the same process key as the real subprocess is not convenient if there are more than a few subprocesses, which is often the reality.

A handy alternative since version 7.1 of camunda BPM is the use of the BPMN Model API.
It makes it possible to create, edit and parse BPMN 2.0 models as pure Java code.

Let’s make an example

The following process model consists of a main process with two call activities.

Main Process with two Call Activities

To have a reusable solution, a helper method is created and used by the test.
It creates a model instance by using BPMN Model API and deploys it in the given process engine repository as shown below.

 /**
  * Create and deploy a process model with one logger delegate as service task.
  * @param origProcessKey
  * key to call
  * @param mockProcessName
  * process name
  * @param fileName
  * file name without extension
  */
 private void mockSubprocess(String origProcessKey, String mockProcessName,
 String fileName) {
 BpmnModelInstance modelInstance = Bpmn
 .createExecutableProcess(origProcessKey).name(mockProcessName)
 .startEvent().name("Start Point").serviceTask()
 .name("Log Something for Test")
 .camundaClass(MockLoggerDelegate.class.getName())
 .endEvent().name("End Point").done();
 repositoryService().createDeployment()
 .addModelInstance(fileName + ".bpmn", modelInstance).deploy();
 }

The primary goal of this test is to ensure that the main process ends successfully. Therefore a model instance for each call activity is created and deployed in the given repository. The main process is deployed via the @Deployment annotation. The following code snippet illustrates the implementation.

 @Test
 @Deployment(resources = "mainProcess.bpmn")
 public void shouldEnd() {

 // mock first sub process (the third argument is the file name;
 // the original values were lost in this listing, so these are placeholders)
 this.mockSubprocess("firstSubProcessKey", "Mocked First Sub Process",
 "mockedFirstSubProcess");

 // mock second sub process
 this.mockSubprocess("secondSubProcessKey", "Mocked Second Sub Process",
 "mockedSecondSubProcess");

 // start main process (placeholder process key)
 ProcessInstance mainInstance = runtimeService().startProcessInstanceByKey(
 "mainProcessKey");

 // verify that the main process has ended
 assertThat(mainInstance).isEnded();
 }


The created model instances look identical – each consists of a start event, a service task which references a delegate, and an end event. The following code snippet shows the simple implementation of the used delegate.

public class MockLoggerDelegate implements JavaDelegate {

 private final Logger LOGGER = Logger.getLogger(MockLoggerDelegate.class
 .getName());

 @Override
 public void execute(DelegateExecution execution) throws Exception {
 LOGGER.info("\n\n ..." + MockLoggerDelegate.class.getName()
 + " invoked by " + "processDefinitionId="
 + execution.getProcessDefinitionId() + ", activityId="
 + execution.getCurrentActivityId() + ", activityName='"
 + execution.getCurrentActivityName() + "'" + ", processInstanceId="
 + execution.getProcessInstanceId() + ", businessKey="
 + execution.getProcessBusinessKey() + ", executionId="
 + execution.getId() + " \n\n");
 }
}

Of course, it’s possible to individualize these mocks depending on your test case. For example, you could create a delegate for each subprocess which sets specific process variables. This example demonstrates only the capability of this solution.

Keep in mind that it is not recommended to replace your process models by using the BPMN Model API. But it is very useful for solving small problems in a simple way – just a few lines of Java code. After completing a subprocess, it is advisable to test the interaction with the main process, too.

And of course, do not forget to write automated integration tests ;-)

Oracle Application Testing Suite (OATS) – One tool for the whole testing process

One of the most challenging things, apart from test definition, during the process of testing an application is to keep an overview of the entire process, i.e. knowing the status of the current test progress.

Here you want to know (obviously there are more things):

  • what requirements need to be tested
  • what tests belong to them, i.e. their associations
  • the test status, i.e. how many tests have been executed so far, and which of them failed
  • the issues resulting from failed test executions (issue tracking)

For all that Oracle provides you the Oracle Application Testing Suite.

OATS – TestManager

This is the central tool that allows you to cover all aspects of the whole application testing process: defining test plans, adding your requirements, the whole issue tracking process, and adding tests, including JUnit tests via an existing ANT file and third-party tests via an executable file.

OATS also provides reports for the several categories right out of the box and the ability to export each report either to Microsoft Excel or HTML.

Obviously, via an administration tool you can control user access based on roles. And as a goody, the most important roles are already on board: Planner, QA Engineer, Developer, Tester, Read-Only and Full Access.

OATS – OpenScript

This is the development environment, based on Eclipse, where the scripts that can be chosen in the TestManager’s “add Test” functionality are developed.

OpenScript gives you the possibility of developing scripts for so-called functional testing, which is simply automated browser/GUI testing, as well as creating scripts for load testing of e.g. Oracle Forms, Oracle ADF applications, etc.

Furthermore, OATS comes with OpenScript add-ons for Mozilla Firefox (at least version 30 is not supported yet) and Microsoft Internet Explorer (IE 11 works fine on my machine) that give you the ability to record scripts from the development environment.

OATS – LoadTest

Within this part of OATS, on the one hand, you may define several scenarios under which the script should run, e.g. the number of concurrent users. You also have the possibility to run the tests simulating different browsers, such as Chrome, Firefox, MSIE, Safari, etc., as well as simulating different connection speeds.

As the second part of this tool, it gives you the possibility to gather server statistics, based on predefined metric profiles that exist for Oracle WebLogic and, on the database side, for SQL Server and Oracle Database.

Further Information

Download the latest version at:

Further information can be obtained here:


Categories: English, Oracle FMW

You’ve Got Mail: Inbound Email Processing in WLS/OSB integration scenarios

In an integration project, we are currently replacing an existing integration platform with Oracle Service Bus 11g. Different incoming and outgoing message formats and protocols (HTTP, FTP, SMTP, etc.) are used by the external partners of our customer and therefore have to be supported. With OSB this is no problem at all, but polling a MS Exchange server for new e-mails is simply not possible with OSB standard tooling. This is due to a bug in MS Exchange server, which advertises that it supports plain authentication for login, but it does not ([1], search for AUTH=PLAIN). So trying to access an Exchange inbox from a proxy service ends up with failures that cannot be worked around.

So we decided to implement a custom Java service that does the polling, because with plain Java the bug can be worked around by setting the corresponding JavaMail session parameters described in [1]. The challenge from an implementation perspective is that in a clustered environment a service is in general active on all cluster nodes, so parallel access and therefore multiple processing of one specific e-mail is possible. The service therefore has to be implemented as a WebLogic singleton service [2] to avoid this. A singleton service is physically deployed to the cluster and thus available on all nodes, but it is only active on one specific cluster node. In case of problems on the node where the service is active, it may be activated on another node in the cluster automatically, depending on the failover configuration of the cluster.
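As a sketch of the JavaMail workaround mentioned above, assuming IMAP over SSL: it boils down to setting session properties such as the following before creating the mail session (the property names are standard JavaMail ones; for plain IMAP the prefix would be mail.imap instead of mail.imaps, and the class name here is mine):

```java
import java.util.Properties;

public class ExchangeMailProps {

    // Session properties for the Exchange AUTH=PLAIN workaround:
    // disabling PLAIN forces JavaMail to pick another advertised mechanism.
    public static Properties exchangeSafeProps() {
        Properties props = new Properties();
        props.setProperty("mail.store.protocol", "imaps");
        props.setProperty("mail.imaps.auth.plain.disable", "true");
        return props;
    }

    public static void main(String[] args) {
        // These properties would be passed to javax.mail.Session.getInstance(props).
        System.out.println(ExchangeMailProps.exchangeSafeProps());
    }
}
```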

Basically, singleton services may be implemented in two different ways:

Standalone application

When implementing a singleton service as a standalone application, it has to be bundled as a JAR file and placed in the <DOMAIN_HOME>/lib folder. Dependent third-party libraries not provided by WebLogic must also be available in this folder, with a reference in the singleton JAR’s manifest. Afterwards the servers have to be restarted and the singleton service has to be registered in the cluster using the WebLogic Console.




Part of an enterprise application

When implementing a singleton service as part of an enterprise application, it has to be packaged inside an EAR file which is deployed to the cluster. The registration of the singleton with the cluster is done by adding an entry to weblogic-application.xml.
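That entry is a <singleton-service> element. A sketch for the mail service could look like this; the element names follow the weblogic-application.xml descriptor schema, while the name value is a free-form identifier chosen here for illustration:

```xml
<weblogic-application xmlns="http://xmlns.oracle.com/weblogic/weblogic-application">
  <!-- registers the class as a cluster-wide singleton -->
  <singleton-service>
    <class-name>com.opitzconsulting.mail.MailClientRunner</class-name>
    <name>MailClientRunner</name>
  </singleton-service>
</weblogic-application>
```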

Deploying a singleton service as part of an enterprise application is the more flexible and less invasive alternative regarding changes in the singleton implementation, because a simple redeployment of the application is sufficient. With the standalone variant, a server restart is needed whenever the singleton’s implementation logic changes. In our concrete scenario, we decided to implement the mail singleton service as part of an enterprise application.

After deploying the singleton application to the cluster, it will be activated on one of the cluster nodes and starts polling the specified email account. When stopping the server where the singleton service is currently active, it will be deactivated on this node and directly activated on another node. Observing the server logs shows this behaviour, thanks to corresponding log outputs in the singleton implementation’s activate() and deactivate() methods.


23:20:04.341 [[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailClientRunner - SingletonService MailClientRunner is initiated...
23:20:05.461 [[ACTIVE] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailClientRunner - SingletonService MailClientRunner is activated...
23:20:06.736 [[ACTIVE] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - FROM: ["Bernhardt, Sven" <>]
23:20:06.736 [[ACTIVE] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - SENT DATE: [Sat Jul 12 23:15:03 CEST 2014]
23:20:06.736 [[ACTIVE] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - SUBJECT: [Singleton Service Testmail]
23:20:07.001 [[ACTIVE] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - CONTENT: [Hello,

this is a test mail.


23:21:16.131 [[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailClientRunner - SingletonService MailClientRunner has been deactivated...

23:21:22.967 [[STANDBY] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailClientRunner - SingletonService MailClientRunner is activated...
23:21:24.220 [[STANDBY] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - FROM: ["Bernhardt, Sven" <>]
23:21:24.220 [[STANDBY] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - SENT DATE: [Sat Jul 12 23:15:03 CEST 2014]
23:21:24.220 [[STANDBY] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - SUBJECT: [Singleton Service Testmail]
23:21:24.481 [[STANDBY] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] INFO  MailReaderClient - CONTENT: [Hello,
this is a test mail.

Finally, let’s have a short look at the implementation of the singleton service:

package com.opitzconsulting.mail;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import weblogic.cluster.singleton.SingletonService;

public class MailClientRunner implements SingletonService {

 private static final Logger log = LoggerFactory.getLogger(MailClientRunner.class.getSimpleName());

 private MailReaderClient mailReaderClient;

 public MailClientRunner() {
 log.info("SingletonService MailClientRunner is initiated...");
 }

 @Override
 public void activate() {
 log.info("SingletonService MailClientRunner is activated...");
 mailReaderClient = new MailReaderClient();
 }

 @Override
 public void deactivate() {
 log.info("SingletonService MailClientRunner has been deactivated...");
 }
}
The interaction between Oracle Service Bus and the singleton mail service has been implemented using JMS queues. The mail service reads the mails, converts the content (CSV, XML) from the mail body or from attachments, creates a uniform message format which is independent of protocol as well as format, and enqueues it into the corresponding queues. From there, OSB dequeues the messages and does the further processing. The logic from this point on is the same as that used for other interfaces. With this implementation approach, combining the strengths of JEE and OSB, we created a flexible, maintainable and standards-based way to integrate inbound email processing into our final integration architecture.


