Thursday, January 27, 2011

How does Kerberos actually work in the HTTP world?

I got hit with an IM out of the blue this morning that kicked off a bunch of conversations about how Kerberos works with HTTP. As I was working my way around the org it became clear that lots of people say "and Kerberos happens here" and move on, but most didn't have a good understanding of how it actually worked "on the wire".

If you want my description, read on. If you want more authoritative info, check out Wikipedia's page on SPNEGO.
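As a small taste of the "on the wire" part: the server answers an unauthenticated request with a 401 and WWW-Authenticate: Negotiate, and the browser retries with an Authorization: Negotiate header carrying a base64-encoded GSS token. One practical trick when debugging is telling whether that token is really Kerberos or an NTLM fallback: raw NTLM tokens start with the literal "NTLMSSP\0" signature, while SPNEGO/Kerberos tokens are DER-encoded and start with byte 0x60. A minimal sketch (the class name is mine, not from any product):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class NegotiateSniffer {
    /**
     * Inspect the base64 payload of an "Authorization: Negotiate <token>"
     * header. Raw NTLM tokens begin with the literal "NTLMSSP\0" signature;
     * GSS-API/SPNEGO (Kerberos) tokens are DER-encoded and begin with 0x60.
     */
    public static String tokenKind(String base64Token) {
        byte[] raw = Base64.getDecoder().decode(base64Token);
        if (raw.length >= 8
                && new String(raw, 0, 7, StandardCharsets.US_ASCII).equals("NTLMSSP")
                && raw[7] == 0) {
            return "NTLM";
        }
        if (raw.length > 0 && (raw[0] & 0xFF) == 0x60) {
            return "SPNEGO/Kerberos";
        }
        return "unknown";
    }

    public static void main(String[] args) {
        // An NTLM type-1 message always starts with "NTLMSSP\0".
        byte[] ntlm = {'N', 'T', 'L', 'M', 'S', 'S', 'P', 0, 1, 0, 0, 0};
        System.out.println(tokenKind(Base64.getEncoder().encodeToString(ntlm))); // NTLM
    }
}
```

Seeing "NTLM" here usually means the browser or SPN configuration is wrong, because the KDC couldn't issue a service ticket for the host you browsed to.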

Tuesday, January 25, 2011

Do we have an OAM Web Gate for Microsoft .NET 4.0?

A question came in on one of the Oracle internal mailing lists:
One of their partners achieves a single sign-on with them by means of our CoreId/web gate product. Do we have a CoreID/web gate product that works with Microsoft .Net 4.0?
The short answer is that you don't need a .NET WebGate, so no, there isn't one.

The long answer is a bit more complicated.

Monday, January 24, 2011

Using the x.509 Attribute Sharing profile responsibly

I'm back, rested, and I've had some time to think about the crazy (clever?) OVD adapter I wrote for last week's PoC. You know, the one that lets you search for certdn=<the user's certificate DN> and have it make a SOAP call over to OIF to get the user info?

I've talked to a few people internally about how this thing works and at first everyone has had the same reaction - that's kinda cool. Then we get to talking about the fine print warning:
Before you go further, a warning: if you're going to try this at home, make sure you test it for scalability. It's not entirely clear that this will scale up to thousands of concurrent users. OAM and OVD will easily support that, as will OIF (as both the SP and IdP). But the entire architecture relies on SOAP calls over the Internet, and those are notoriously latency-heavy. As a result the initial access by a user will be relatively slow, and that could cause any number of issues. If a large percentage of your users visit for only a short time, those problems will be worse.
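If the SOAP latency to the IdP is the worry, one mitigation (purely my sketch, not something the PoC implements, and every name here is hypothetical) is to memoize the attribute lookup per certificate DN with a short TTL, so only the first request per user pays the round trip:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class TtlAttributeCache {
    private static class CacheEntry {
        final Map<String, String> attrs;
        final long expiresAt;
        CacheEntry(Map<String, String> attrs, long expiresAt) {
            this.attrs = attrs;
            this.expiresAt = expiresAt;
        }
    }

    private final ConcurrentHashMap<String, CacheEntry> cache = new ConcurrentHashMap<>();
    private final long ttlMillis;
    // Stands in for the expensive SOAP attribute query to the IdP.
    private final Function<String, Map<String, String>> fetcher;

    public TtlAttributeCache(long ttlMillis, Function<String, Map<String, String>> fetcher) {
        this.ttlMillis = ttlMillis;
        this.fetcher = fetcher;
    }

    public Map<String, String> attributesFor(String certDn) {
        long now = System.currentTimeMillis();
        CacheEntry e = cache.get(certDn);
        if (e == null || e.expiresAt < now) {
            // Only the first (or post-expiry) request pays the SOAP round trip.
            e = new CacheEntry(fetcher.apply(certDn), now + ttlMillis);
            cache.put(certDn, e);
        }
        return e.attrs;
    }
}
```

A short TTL keeps the IdP authoritative while turning "every LDAP search hits the Internet" into "the first search per user hits the Internet."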

Not everyone I've talked to is as concerned. At least not before seeing it run. Perhaps I'm a little more conservative about performance in large-scale deployments. Or maybe I've seen one too many super-clever solutions fall flat on their faces in the real world. But in any case I cooked this crazy idea up and am still not convinced it's a good one.

Read on for why I'm a little gun shy.

Thursday, January 20, 2011

Oracle Identity Federation (OIF) now supports OpenID 2.0

This news is a long time coming, but nonetheless very cool. With the most recent patch set, OIF now supports OpenID 2.0.

In addition, OIF now supports “custom actions”, by which you can create custom, site-specific operations that are executed during federated authentication at the service provider (consumer) and identity provider (producer).

You can read more about these recent OIF developments here:

The documentation for configuring OIF as an OpenID identity provider can be found here:

The documentation for configuring OIF as an OpenID service provider can be found here:

The chapter on custom actions can be found here:

Edit:  Looks like I posted right after Alex did an excellent post on OAM/OES integration.  If you missed his post check it out here:

Custom Identity Assertion for OAM-OES SSO integration

OAM and OES both provide mechanisms to protect web applications. Some customers have expressed the need to achieve SSO between these two Access Management solutions. This article describes how one would implement such an integration.
The solution mostly involves custom components on the OES side, so let's first list what's required from OAM, which is mostly configurable components provided out of the box. From OAM's standpoint, all that is needed is an Access Gate tied to the custom Identity Asserter that will be built for OES. To accomplish this, the Access Server SDK must be installed on the server where OES is installed: locate the Access Server SDK distribution that comes with OAM's installation packages and install it locally on the OES server. Then configure an Access Gate in OAM's Access System Console and run the ConfigureAccessGate command-line tool from the Access Server SDK access/oblix/ConfigureAccessGate/ directory. Make note of the installation directory of this Access Gate because you will have to provide it to your implementation of the SSPI Identity Assertion provider that will consume the SSO token from OAM (details follow). For details on how to configure OAM's Access Gate and install and configure the Access Server SDK, refer to the OAM product documentation.

So far what you have done is achieve connectivity between the OES server and OAM's Access Server. This is needed to validate the ObSSOCookie coming from OAM, which will be passed in the request to the OES WS-SSM as an identity assertion with a Token Type of ObSSOCookie; we will cover this in more detail later in this article.

Now we are ready to discuss the OES-side components. First and foremost, we need an SSPI Identity Assertion provider implementation. The code is presented below (thanks to Chris Johnson and Josh Bregman for providing the code for this solution):

import java.util.Arrays;
import java.util.Enumeration;

import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.login.AppConfigurationEntry;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import weblogic.management.security.ProviderMBean;
import weblogic.security.service.ContextHandler;
import weblogic.security.spi.AuthenticationProviderV2;
import weblogic.security.spi.IdentityAsserterV2;
import weblogic.security.spi.IdentityAssertionException;
import weblogic.security.spi.PrincipalValidator;
import weblogic.security.spi.SecurityServices;

import com.oblix.access.*;

public class OAMIdentityAsserterProviderImpl implements AuthenticationProviderV2, IdentityAsserterV2 {

    /**
     * The attribute of the DN used to identify the
     * user in WebLogic. example: cn
     */
    private String userAttribute;

    public AppConfigurationEntry getAssertionModuleConfiguration() {
        return null;
    }

    public IdentityAsserterV2 getIdentityAsserter() {
        return this;
    }

    public AppConfigurationEntry getLoginModuleConfiguration() {
        return null;
    }

    public PrincipalValidator getPrincipalValidator() {
        return null;
    }

    public String getDescription() {
        return "Identity Asserter for the OAM ObSSOCookie";
    }

    public void initialize(ProviderMBean mbean, SecurityServices arg1) {
        OAMIdentityAsserterMBean config = (OAMIdentityAsserterMBean)mbean;
        String installDir = config.getAccessGateSDKInstallDir();
        System.out.println("Initializing connection to OAM using "+installDir);
        try {
            // Point the Access SDK at the Access Gate configuration
            ObConfig.initialize(installDir);
        } catch (Throwable t) {
            throw new RuntimeException(t);
        }
        this.userAttribute = config.getNameAttribute();
    }

    public CallbackHandler assertIdentity(String arg0, Object arg1, ContextHandler ctxHandler) throws IdentityAssertionException {
        System.out.println("Asserting ID for "+arg0+" value="+arg1);
        String[] names = ctxHandler.getNames();
        System.out.println("Attributes: "+Arrays.asList(names));
        HttpServletRequest req = (HttpServletRequest)ctxHandler.getValue("HttpServletRequest");
        if (req != null) {
            // Dump the request headers for debugging
            Enumeration e = req.getHeaderNames();
            while (e.hasMoreElements()) {
                String header = (String)e.nextElement();
                String value = req.getHeader(header);
                System.out.println(header+": "+value);
            }
        }
        try {
            String token = null;
            if (arg1 instanceof byte[]) {
                token = new String((byte[])arg1);
            } else if (arg1 instanceof String) {
                token = (String)arg1;
            } else {
                System.out.println("Unknown token "+arg1.getClass().getName());
                throw new IdentityAssertionException("Unknown token type");
            }
            if (token.equals("loggedoutcontinue")) {
                HttpServletResponse resp = (HttpServletResponse)ctxHandler.getValue("HttpServletResponse");
                System.out.println("Sending redirect to "+req.getRequestURI());
                resp.sendRedirect(req.getRequestURI());
                throw new IdentityAssertionException("No Identity. Redirecting");
            }
            // Validate the ObSSOCookie by constructing a user session from it
            ObUserSession session = new ObUserSession(token);
            String userName = session.getUserIdentity();
            System.out.println("User: "+userName);
            return new ObSSOTokenCallbackHandler(session, userAttribute);
        } catch (IdentityAssertionException iae) {
            throw iae;
        } catch (Exception e) {
            throw new IdentityAssertionException(e.getMessage());
        }
    }

    public void shutdown() {
    }
}

This code works as follows. The first thing it does is retrieve the configuration directory of the Access Gate configured as described in the previous sections, when the Access Server SDK was installed. When you create a security provider in WebLogic, there is a utility (the MBean creation utility) that generates the provider as a managed bean with a set of properties that can be configured via the WebLogic console to supply configuration values needed by the provider. That is how it retrieves the installation directory of the Access Server SDK, and from it the configuration to connect to OAM's Access Server. The rest is just processing the ObSSOCookie. To consume the cookie, this implementation also includes a CallbackHandler that hands the user name from the validated OAM session back to WebLogic. The code for this handler is shown below:

import java.io.IOException;
import java.util.StringTokenizer;

import javax.security.auth.callback.Callback;
import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.callback.NameCallback;
import javax.security.auth.callback.UnsupportedCallbackException;

import com.oblix.access.ObUserSession;

public class ObSSOTokenCallbackHandler implements CallbackHandler {

    private ObUserSession session;
    private String userNameAttribute;

    public ObSSOTokenCallbackHandler(ObUserSession session, String userNameAttribute) {
        this.session = session;
        this.userNameAttribute = userNameAttribute;
    }

    public void handle(Callback[] callbacks) throws IOException, UnsupportedCallbackException {
        System.out.println("There are "+callbacks.length+" callbacks");
        for (int i = 0; i < callbacks.length; i++) {
            Callback c = callbacks[i];
            if (c instanceof NameCallback) {
                try {
                    System.out.println("Handling Name Callback for "+session.getUserIdentity()+" "+userNameAttribute);
                    NameCallback nc = (NameCallback)c;
                    // Supply the user name parsed out of the session's DN
                    nc.setName(getName());
                } catch (Exception e) {
                    throw new IOException(e.getMessage());
                }
            } else {
                System.out.println("Can't handle callback "+c.getClass().getName());
            }
        }
    }

    private String getName() throws Exception {
        // Walk the DN (e.g. "cn=jdoe,dc=example,dc=com") and return the
        // value following the configured attribute name.
        String dn = session.getUserIdentity();
        StringTokenizer parser = new StringTokenizer(dn, "=,");
        while (parser.hasMoreTokens()) {
            String token = parser.nextToken();
            if (token.equals(userNameAttribute)) {
                return parser.nextToken();
            }
        }
        throw new Exception("Couldn't find "+userNameAttribute+" in "+dn);
    }
}

The class above takes the ObUserSession created from the ObSSOCookie, retrieves the identity stored in it (a DN), and parses that DN to extract the user name, which will be used as the principal for the authenticated subject. Now, once all these components are in place, we need to tell OES when to call this provider, which is any time it sees a Token Type of ObSSOCookie. OES represents security tokens via implementations of the CredentialHolder interface, so we need to create an implementation of CredentialHolder as follows:

public class OAMSSOCookieIndentityHolderImpl implements CredentialHolder {

    private String m_cookie;

    // This should match the Token Type expected by the configured OAMIdentityAsserter
    public static final String m_Type = "ObSSOCookie";

    public void setCookie(String cookie) {
        this.m_cookie = cookie;
    }

    public String getCookie() {
        return this.m_cookie;
    }

    public Object getObject() {
        return getCookie();
    }

    public void setObject(Object cred) {
        this.m_cookie = (String)cred;
    }

    public String getType() {
        return m_Type;
    }

    public String getAsString() {
        return this.m_cookie;
    }
}

As you can see in the code above, this class's only purpose is to represent the ObSSOCookie as an acceptable token type. The next step is to configure the castor.xml files to allow OES to successfully deserialize the incoming token from the XML request submitted to the WS-SSM. This is accomplished by modifying the following files:
Open the file C:\bea\1032\ales32-ssm\webservice-ssm\lib\com\bea\security\ssmws\soap\.castor.xml and modify it as follows:
<class name=" ">
<map-to cst:xml="ObSSOCookie" />
<field name="cookie" type="java.lang.String" >
<bind-xml node="text"/>

Open the file C:\bea\1032\ales32-ssm\webservice-ssm\lib\com\bea\security\ssmws\credentials\.castor.xml and modify it as follows:
<class name=" OAMSSOCookieIndentityHolderImpl ">
<map-to cst:xml="ObSSOCookie" cst:ns-uri="" />
<field name="cookie" type="java.lang.String" >
<bind-xml node="text"/>

The next step is to test this. When making a Web Service method call to the WS-SSM, the SOAP message should contain an identity assertion element named ObSSOCookie; see the XML fragment below:
<soap:Envelope xmlns:soap="" xmlns:xsi="" xmlns:xsd="">
  <soap:Body>
    <isAccessAllowed xmlns="">
      <ObSSOCookie>34EFB3201BCA31F21A3E320A21B31D19</ObSSOCookie>
      <ResourceString>app/claim</ResourceString>
      <AuthorityName>ARME_RESOURCE_AUTHORITY</AuthorityName>
    </isAccessAllowed>
  </soap:Body>
</soap:Envelope>



The tag name needs to reflect exactly the name of the token configured in the Identity Assertion provider of the WebLogic security configuration, and also the configuration changes made in the castor.xml files.
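To make that coupling concrete, here is a small, hypothetical helper (my sketch, not part of the original solution; the class and method names are made up) that builds the isAccessAllowed body from the same constant the CredentialHolder declares, so the tag name can only drift in one place:

```java
public class SsmRequestBuilder {
    // Must match OAMSSOCookieIndentityHolderImpl.m_Type and the
    // cst:xml mappings in both castor.xml files.
    public static final String TOKEN_TAG = "ObSSOCookie";

    /** Build the body of an isAccessAllowed request for the WS-SSM. */
    public static String isAccessAllowedBody(String cookie, String resource, String authority) {
        return "<isAccessAllowed xmlns=\"\">"
             + "<" + TOKEN_TAG + ">" + cookie + "</" + TOKEN_TAG + ">"
             + "<ResourceString>" + resource + "</ResourceString>"
             + "<AuthorityName>" + authority + "</AuthorityName>"
             + "</isAccessAllowed>";
    }
}
```

If the constant, the MBean's Token Type, and the castor mappings ever disagree, the WS-SSM will silently fail to deserialize the assertion, which is the failure mode the paragraph above warns about.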

Monday, January 17, 2011

The (Windows) Natives Are Restless

From Brian: I'm adding this excellent post by Matt to our OAM 11g Academy series. To view the first post in the series, which will be updated throughout to contain links to the entire series, click here:
OAM 11g has the ability to do Windows Native Authentication (WNA) to give a Windows client desktop SSO to OAM-protected applications. This was possible in OAM 10g as well, but it required an IIS server to do the heavy lifting of getting the Kerberos ticket and authenticating the user. In 11g, Oracle does not require IIS to accomplish desktop SSO. WebLogic also has this capability via its SPNEGO Identity Asserter, but the OAM approach gives you SSO to any other OAM-protected application as well. The documentation is in chapter 7 of the Integration Guide.
For the krb5.conf file my example is (showing all the edited parts; there were other pieces that came with my Amazon images that I left alone):
[realms]
 IAM.COM = {
  kdc = ip-10-116-199-182.ec2.internal
  admin_server = ip-10-116-199-182.ec2.internal
  default_domain = IAM.COM
 }

[domain_realm]
 .iam.com = IAM.COM
 iam.com = IAM.COM
On the KDC (Active Directory), I created a user named “idam11g”. This is the WebGate host, what you’re going to use in the browser. I ran the following to create the keytab:
C:\>ktpass -princ HTTP/idam11g@IAM.COM -pass P@ssw0rd -mapuser idam11g -out c:\logs\keytab.service
Make sure that your User Login name in the Active Directory looks like “HTTP/idam11g”
Copy the keytab.service file to your OAM server.
Test on your OAM Server box that you can generate the Kerberos token by using the kinit command:
$ kinit HTTP/idam11g@IAM.COM -k -t /oracle/stage/wna/keytab.service
Now make the changes to OAM to tell it how to contact the KDC. Here is the oam-config.xml entry:
<Setting Name="KerberosModules" Type="htf:map">
  <Setting Name="6DBSE52C" Type="htf:map">
    <Setting Name="principal" Type="xsd:string">HTTP/idam11g@IAM.COM</Setting>
    <Setting Name="name" Type="xsd:string">Kerberos</Setting>
    <Setting Name="keytabfile" Type="xsd:string">/oracle/stage/wna/keytab.service</Setting>
    <Setting Name="krbconfigfile" Type="xsd:string">/etc/krb5.conf</Setting>
  </Setting>
</Setting>
NOTE: The docs are off on the contents of oam-config.xml. Also, if you make changes to the oam-config.xml, these changes may get reset if you make other changes in OAM through the console. I’ve heard of similar problems when doing the OAM-OAAM integration. I think you are better off making this edit through the console. You can configure this through the console by going to System Configuration (tab)->Authentication Modules->Kerberos Authentication Modules->Kerberos:

While in the oamconsole, you want to configure AD to be the Primary Identity Store. This is located at System Configuration (tab)->Data Sources->User Identity Stores.

A word about Role Mapping: The “OAM Administrator’s Role” field is looking for an existing group in AD. Users in that group will be able to login to oamconsole once AD becomes the primary identity store. Don’t forget to press the “Set as Primary” button. Once you do this, you should be able to authenticate to your default web page with AD users’ credentials.
I recommend creating a separate Authentication Policy for WNA with an OnAuthFailure redirect, so you can tell an OAM auth failure apart from failures for other reasons. I used the existing “KerbScheme” for the Authentication Scheme as-is, without editing.

I configured an 11g WebGate on 11g OHS via standard means.
For IE7, here is the process for setting up Integrated Windows Authentication on the client:
  • Select Tools, Internet Options.
  • Select the Security tab.
  • Ensure that your WebGate-protected OHS site is in the list of trusted “Sites”
  • Select Local intranet and click Custom Level....
  • In the Security Settings dialog box, scroll to the User Authentication section.
  • Select “Automatic logon only in Intranet zone”.
  • Click OK.
  • Select the Advanced tab.
  • Scroll to the Security section.
  • Make sure that Enable Integrated Windows Authentication option is checked and click OK.
  • If this option was not checked, restart the client.

Tools like ieHTTPHeaders can help you determine whether the browser is sending the Negotiate token. It should look something like:
GET /oam/CredCollectServlet/WNA?request_id=-3931587206375492112&error_code=OAM-1001&redirect_url=http%3A%2F%2Fip-10-124-122-41%3A7777%2Fwna%2Findex.html HTTP/1.1
Authorization: Negotiate
<Long encrypted string of about 1600 chars>
For OAM troubleshooting, you want to be able to turn on some tracing via WLST:
cd <MW_HOME>/Oracle_IDM1/common/bin
./wlst.sh
connect()
setLogLevel(logger="oracle.oam", level="TRACE:32", persist="0", target="oam_server1")
You can then check your diagnostics log at <IDM_DOMAIN>/servers/oam_server1/logs/oam_server1-diagnostic.log.
A successful transaction should look something like:
[2011-01-13T12:34:40.730-05:00] [oam_server1] [TRACE:16] [] [oracle.oam.controller] [tid: [ACTIVE].ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: ] [ecid: 0000Iq0SORD1rYspkgg8yZ1DBnSC00000R,0] [SRC_CLASS:] [APP: oam_server] [dcid: d03843071ed98d9b:6369d65b:12d806f7606:-8000-0000000000000015] [SRC_METHOD: createSubject] RETURN, Subject: [{ Subject: Subject:[[
Principal: username@IAM.COM
Principal: CN=User Name,cn=users,dc=iam,dc=com
Principal: \4f\81\08\d9\90\14\29\43\81\fd\2b\a2\59\c3\21\ab
, GroupsLoaded: false, UpdateSession: false, isAnonymous: false }], Subject Attrs: [String Map: {}], User Id: will.laase@IAM.COM, User DN: CN=Will Laase,cn=users,dc=iam,dc=com, GUID: \4f\81\08\d9\90\14\29\43\81\fd\2b\a2\59\c3\21\ab, Auth Level: 2, Auth Scheme: KerbScheme
The documentation talks about doing the same with Firefox, by browsing to about:config and setting:
network.negotiate-auth.trusted-uris = http://idam11g:7777
I was not able to get Firefox to work. If anyone can comment on a success path here, it would be appreciated.

My first custom OVD adapter

For a PoC I have been working on I wrote a (demo quality) OVD adapter that I thought might be interesting for others.

In my PoC I had OIF, OAM, OVD, DSEE, and OES. In the main use case a user comes to the SP with an x.509 certificate issued by an issuer known to the SP. The SP is expected to do the normal certificate authentication "stuff" (crypto operations, CRL & OCSP checking) and then make authorization decisions for URLs based upon the user's identity, including information about the user that the SP doesn't have in advance. To get that information the SP needs to make SAML attribute requests to an IdP.

Out of the box OAM and OIF support this use case via the Oracle Access Manager AuthZ Plug-in. That plug-in makes calls to OIF which in turn goes and gets the necessary attributes from the right IdP. This is great and answers most of the requirement.

The missing bit is that this customer wanted to go further... much, much further. They want to make very fine-grained authorization decisions deep down in their applications (by calling OES). And they want OES to make its decisions based on the attributes coming from the IdP.

My first plan was to write an equivalent to the OAM x.509 Attribute Sharing plug-in for OES. It should have taken me all of a couple of hours start to finish including deploying it in the PoC environment. Naturally I thought to myself "That's not all that interesting. Why don't you do something more clever?". I blame too much caffeine and inadequate sleep for that thought.

Everything in the environment (OAM, OES, WebLogic and a few other things) already makes LDAP calls to get user info and make authorization decisions. Why not hide all of the SAML stuff down in OVD? Then there's only one place that needs to talk to OIF.

Click through to see what I did.

Friday, January 14, 2011

WebLogic Domain Models for Installing the Oracle Identity Management Suite – Part 2

A couple days ago I wrote what I consider to be an important post about whether different Oracle middleware packages (or bundles) should be installed together in a single domain or installed in separate domains.

I’ve received a few questions asking about a logical extension of that topic: what about the individual products within one package? Should individual products within one package be installed together in a single domain (which is really the default behavior) or be spread across several domains? For example, if you are deploying the Identity Management package with OID, OVD, and OIF, should you install them all in one domain, or perhaps put OIF in one domain and OID in a separate domain?

There are a number of things to consider in answering this question.

Let’s begin by looking at the issues that led me to recommend that you not install multiple packages/bundles in a single domain.

The first issue was the risk of incompatibilities between the packages and the difficulty in dealing with such issues when they arise. I would have to say that this issue does not apply to multiple products within one package. After all, the package was explicitly developed and tested with the idea that all the products would be running in same domain.

The second issue I raised was the notion that deploying multiple packages to one domain could complicate patching and upgrading; even potentially leading to a situation where you will be kept from upgrading due to version incompatibilities. Again, since we are now talking about products within one package, there is less of a concern about patching and upgrades. However, since even a single product patch could include components that are common across the entire package, having all the products from a package in a single domain means that you should really test every product that you use in the domain before deploying the patch to production. I don’t see this as being a huge deal but it is something to consider.

The next consideration, which I did not address in my last post, is delegation of duty or purpose for a domain. Some customers segregate certain WLS domains for certain purposes. Often this is seen as a security practice, such as the case where a customer deploys all intranet apps to one domain, extranet apps to a second domain, and utility services to a third domain. If you are a customer that does this, you may see some products in a package as falling in a different category from others. One example is that many customers will see OID and OVD as “internal” or “utility” applications, whereas they might see OIF as an “external”, end-user-facing application. This might lead them to deploy these applications from the Identity Management package into separate domains.

The last consideration is to note that some of the integrations between products in a package only work if the products are installed in the same domain. Two examples of this are the OAM/OIM integration and the native integration between OAM and OAAM. If you want to use the integrated functionality offered by these packages, you have to deploy them in the same WLS domain.

Thursday, January 13, 2011

Behavior of the OVD "Entry" class

As part of a demo I'm doing this week I wrote a custom OVD adapter and (for various reasons) included my own logic to cache the results between LDAP calls. Along the way I discovered behavior that perplexed me and probably shortened my life by days thanks to how high my blood pressure went.

If you ever find yourself thinking things are going "just swell" on a PoC you should try using an API you've never seen before on a product you've barely used, but wait until the last day to start.

Yes, my own fault. And yes, it's humbling.

What I hope to save you from discovering the hard way is that OVD's class com.octetstring.vde.Entry can't be shared across calls into the directory.
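To illustrate the general hazard without the OVD API itself (the class below is a generic stand-in, not com.octetstring.vde.Entry): if the directory engine mutates or reuses an entry object after handing it to you, any cached reference to it silently changes underneath you. The fix is to snapshot the attributes you need into your own structures before caching:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class EntryCacheDemo {
    // Generic stand-in for a mutable directory entry; NOT the OVD class.
    static class MutableEntry {
        final Map<String, List<String>> attrs = new HashMap<>();
    }

    // Wrong: caching the live object means later mutation corrupts the cache.
    // Right: copy what you need into structures you own.
    static Map<String, List<String>> snapshot(MutableEntry e) {
        Map<String, List<String>> copy = new HashMap<>();
        e.attrs.forEach((k, v) -> copy.put(k, List.copyOf(v)));
        return copy;
    }

    public static void main(String[] args) {
        MutableEntry live = new MutableEntry();
        live.attrs.put("cn", List.of("alice"));
        Map<String, List<String>> cached = snapshot(live);
        live.attrs.put("cn", List.of("mallory")); // engine recycles the entry
        System.out.println(cached.get("cn")); // still [alice]
    }
}
```

Defensive copying costs a little memory per cached result, which is far cheaper than the hours spent chasing attribute values that change between calls.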

Wednesday, January 12, 2011

Important Topic: WebLogic Domain Models for installing the Oracle Identity Management Suite

As most of you know, the current 11g version of the Oracle Identity Management Suite runs on top of a tech stack based on WebLogic Server. In essence, when you install any package from the 11g Identity Management Suite, you end up deploying the applications from the package into a WLS domain.

There are 2 basic models of domain architecture for the Oracle IAM suite:

1) Install a given package into a new, self contained, WLS domain.

2) Extend an existing domain. In other words, deploy the apps from the Identity Management package you are installing into a domain that is also running other software. This model includes several variations based on what else you are running in the domain. One variation is installing the different Identity Management suite packages (namely the Identity Management package containing OID, OVD, and OIF; and the Identity and Access Management package containing OAM, OIM, and OAAM) into one domain. Another variation would be installing one of the Identity Management Suite packages into a domain running another Oracle product such as WebCenter or SOA Suite. Finally, one might even consider installing an Identity Management Suite package into a domain running custom application code.

Basic coverage of the topic can be found in the documentation here:

My Advice

It might seem like one or more of the scenarios listed in the “extend an existing domain” option above would make life simpler. In particular, I think that many people who use the entire Oracle Identity Management suite might be tempted to create a single monolithic “IAM domain” containing both major packages, such that OID, OVD, OIF, OAM, OAAM, and OIM would all run in the same domain.

However, I’m here to tell you that running multiple packages in one domain is most likely a bad way to go. The reason is that it introduces the risk that different components (libraries, jars, utilities) from the different packages that you are installing in the domain might be incompatible and cause problems across your entire domain.

I actually had this happen when I accidentally installed the IAM package (OAM) into a domain that was only running the PS1 version (instead of the PS2 version) of the IM package (OID and OVD). The IAM package is only certified to run in the same domain as the PS2 version of the IM package, and sure enough many things in the domain broke.

To make matters worse, there is little you can do outside of restoring a previous full OS backup to fix the situation if such an incompatibility occurs.

Beyond the risk of such an incompatibility, you also must consider how installing multiple packages into one domain reduces your flexibility to apply patches and upgrades. Let’s say a new patch set of the Identity Management package (OID and OVD) comes out. If you have installed the package into its own domain then you are good to go. However, if you have installed it into a domain with WebCenter or OAM/OIM then you must consider whether the patch set will be compatible with those other packages. So, you’ll have to open a support case to ask and if the answer is no then you might have to wait months for those other packages to catch up.

One exception to this warning is SOA Suite for OIM. SOA Suite is a prerequisite for OIM and as part of the stack that supports OIM should be installed in the same domain as OIM. However, if you use SOA suite for other purposes, you should consider setting up a separate install for running your own services, composites, BPEL processes etc.


While installing multiple Identity Management packages (or Fusion Middleware packages in general) into one domain might seem like a good idea, since it leaves you with fewer domains to manage, it is safer and better to install each Identity Management package into a separate, isolated domain.

To be clear, I am recommending that customers install the IAM package (OAM,OIM,OAAM) into a separate domain from the IDM package (OID,OVD,OIF).  I am not necessarily recommending that you install the various products inside of each package in separate domains.  If and when to do that is a topic for another discussion which I will try and post about soon.

Friday, January 7, 2011

Risky Business

Incorporating risk detection and mitigation capabilities into apps is becoming all the rage. There are plenty of real-life examples of cases where prevention of cyber-security threats and fraudsters might have kept governments and companies out of the news, and with more money in their accounts. These are just the events that we know about. There are probably many more that have been kept on the down-low or haven’t been detected yet. Oracle’s Adaptive Access Manager provides a risk engine that can help you model, detect and mitigate those threats.

The quickest way to start using the Adaptive Risk Manager (henceforth ARM) is to call the Strong Authenticator. This is done by browsing to http://<oaam_host>:14300/oaam_server. Log in with any username and the password ‘test’ and you should be ‘authenticated’. I put authenticated in quotes because OAAM doesn’t go against any directory service out of the box. At this point you should be able to browse to the ARM console and see your session information. Browse to http://<oaam_host>:14200/oaam_admin and double-click ‘Sessions’. You should see an entry in the table for the user you just authenticated as.

There are 3rd-party tools like ARMAutomator that provide extensive ARM testing, but for onesie-twosie testing and demoing, you can use this native-integration JSP I put together. To set up native integration in your own Java EE web app, do the following:

Copy the following jars to your client-side web app WEB-INF/lib:

- oaam_core.jar

- oaam_native_wrapper.jar (had to extract it from the native lib war)

- oaam_uio.jar

Copy bharosa_properties folder from /Oracle_IDM1/oaam/cli to /WEB-INF/classes

Update the JDBC URL in the bharosa_properties folder for your environment.

Include the following in your WEB-INF/classes folder and update the SOAP URL.






I cheated a bit and disabled the basic auth constraint on the server side. If you have a 10g setup with a SOAP keystore set up, you can leave this in place, but if you want to disable it, perform the following steps on the OAAM server:

- cd /Oracle_IDM1/oaam/oaam_server/ear

- back up oaam_server.ear

- jar xvf oaam_server.ear oaam_server.war

- jar xvf oaam_server.war WEB-INF/web.xml

- Edit web.xml and remove <security-constraint>...</security-constraint>

- jar uvf oaam_server.war WEB-INF

- jar uvf oaam_server.ear oaam_server.war

- Redeploy oaam_server.ear

If you haven’t upgraded to OAAM BP01 (p10022410), I would do that upgrade before removing this constraint. If done correctly, you should be able to browse the following URL without getting a Basic Auth prompt:


A very quick ‘n dirty way to see how the OAAM geo-spatial capability can work is to load the test data included with the tool. Quova, from what I’ve heard, is the top-tier service for IP geolocation. Here’s a perl script to convert those numbers to IP address format:

sub numToStr {
    my ($ipnum) = @_;
    my $z = $ipnum % 256;
    $ipnum >>= 8;
    my $y = $ipnum % 256;
    $ipnum >>= 8;
    my $x = $ipnum % 256;
    $ipnum >>= 8;
    my $w = $ipnum % 256;
    print "$w.$x.$y.$z\n";
    return "$w.$x.$y.$z";
}

print "IP Address is ";
numToStr($ARGV[0]);


Assuming you’ve imported the IP GeoLocation test data, you can take a value from /oaam/cli/test_data/test_MaxMindBlocks.csv and pass it into the tool.

perl 209868800

Credit to MaxMind for that little tool.
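If you'd rather keep the conversion inside the web app itself, the Perl helper above maps straight to Java. This is my own sketch (the class name is mine, not anything from OAAM):

```java
public class IpNumToStr {

    // Convert a MaxMind-style numeric IP (e.g. 209868800) to dotted-quad form,
    // peeling off one octet at a time from the low end, same as the Perl version.
    public static String numToStr(long ipnum) {
        long z = ipnum % 256;
        ipnum >>= 8;
        long y = ipnum % 256;
        ipnum >>= 8;
        long x = ipnum % 256;
        ipnum >>= 8;
        long w = ipnum % 256;
        return w + "." + x + "." + y + "." + z;
    }

    public static void main(String[] args) {
        long ipnum = args.length > 0 ? Long.parseLong(args[0]) : 209868800L;
        System.out.println("IP Address is " + numToStr(ipnum)); // prints: IP Address is 12.130.88.0
    }
}
```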

Here is the JSP I use to call ARM for post-authentication results:

<%@ page import="com.bharosa.client.BharosaHelper"%>
<%@ page import="com.bharosa.client.BharosaUtil" %>
<%@ page import="com.bharosa.client.BharosaSession"%>
<%@ page import="com.bharosa.vcrypt.common.util.VCryptServletUtil" %>
<%@ page import="com.bharosa.client.enums.BharosaEnumAction"%>
<%@ page import="com.bharosa.vcrypt.tracker.util.CookieSet" %>
<%@ page import="com.bharosa.client.enums.BharosaEnumAuthStatus" %>
<%@ page import="com.bharosa.common.util.StringUtil" %>
<%@ page import="java.util.Collections" %>
<%@ page import="java.util.List" %>
<%@include file="common_imports.jsp"%>
<%
String loginId = request.getParameter("loginId");

// Group name should come from the application based on which product or group the user belongs to.
String groupName = request.getParameter("groupName");
if (groupName == null || groupName.trim().length() == 0) {
    groupName = "Default";
}

if (loginId == null || loginId.trim().length() == 0) {
    String errorMessage = "Login Id was not found in the HTTP request.";
    out.println(errorMessage);
} else {
    loginId = loginId.trim();
    groupName = groupName.trim();

    BharosaHelper bharosaHelper = BharosaHelper.getInstance();
    BharosaSession bharosaSession = bharosaHelper.createNewBharosaSession();
    bharosaSession.setLocale(request.getLocale(), request.getLocales());

    String ipAddress = request.getParameter("ipAddr");
    if (!StringUtil.isEmpty(ipAddress)) {
        ipAddress = ipAddress.trim();
    } else {
        ipAddress = VCryptServletUtil.getRemoteIP(request);
    }

    // Set the client's timezone offset
    String clientOffsetStr = request.getParameter("clientOffset");

    String secureCookie = getCookie(request, "bharosa");
    Object[] browserFpObjects = VCryptServletUtil.getBrowserFingerPrint(request);
    String browserFp = (String) browserFpObjects[1];
    CookieSet cookieSet = bharosaHelper.fingerPrintBrowser(bharosaSession, bharosaSession.getRemoteIPAddr(), request.getRemoteHost(),
            BharosaEnumAuthStatus.PENDING, secureCookie, browserFp);
    if (cookieSet != null && cookieSet.getVCryptResponse().isSuccess()
            && cookieSet.getSecureCookie() != null) {
        setCookie(request, response, "bharosa", cookieSet.getSecureCookie());
    }

    // Run post-authentication rules
    BharosaEnumAction bharosaEnumAction = bharosaHelper.runPostAuthRules(bharosaSession, BharosaHelper.getHeaderContextMap(request));
    out.println("Action = " + bharosaEnumAction);
    if (bharosaEnumAction == BharosaEnumAction.CHALLENGE) {
        // forward to your challenge flow here
    } else if (bharosaEnumAction == BharosaEnumAction.REGISTER_USER) {
        session.setAttribute("isOptional", "false");
    } else if (bharosaEnumAction == BharosaEnumAction.REGISTER_QUESTIONS) {
        session.setAttribute("isOptional", "false");
    } else if (bharosaEnumAction == BharosaEnumAction.REGISTER_USER_OPTIONAL) {
        session.setAttribute("isOptional", "true");
    } else if (bharosaEnumAction == BharosaEnumAction.BLOCK) {
        // block the request here
    } else if (bharosaEnumAction == BharosaEnumAction.ALLOW) {
        // let the user continue into the application
    } else if (bharosaEnumAction == BharosaEnumAction.SYSTEM_ERROR) {
        // handle the error here
    }

    List POST_AUTH_RUNTIME_LIST = Collections.singletonList(new Integer(2));
    int riskScore = bharosaHelper.runRules(bharosaSession, POST_AUTH_RUNTIME_LIST, BharosaHelper.getHeaderContextMap(request)).getScore();
    out.println("Risk Score = " + riskScore);

    // Store the session object in the HTTP Session
    BharosaUtil.storeBharosaSession(session, bharosaSession);

    // Store the requestID in the session.
}
%>


Once this is in place, test the web app with a URL like:


This JSP simply displays the action and the risk score.

I typically start with the Phase 2 Post-Authentication Flow and add rules & conditions as necessary. You may need to change the policy from “linked users” to “all users” to get the policy to fire.

I added a rule for restricted countries. I disabled some of the KBA checks that were pre-existing for this policy.

This is the condition that is added to the rule:

Where “Axis of Evil” is a Country group (no relation to the Dixie Chicks) that contains my blacklist of restricted countries. Here is where we set the action and alert block, as well as the risk score:

Here is an example of the session detail for the request above:

You can see the alert that results from the condition evaluating to true, as well as the risk score and the action that result from the restricted country.

This is a very small set of the OAAM capabilities but this is intended as a bootstrap.

If you’re looking for a partner in this area, I recommend the folks at Integral who have a lot of experience with this product.

Another pattern that is interesting to me is to write an OES attribute retriever that calls ARM, and then write OES policies based on the risk score, i.e. DENY (…) if risk_score > 500. At some point I plan to take the sample that’s floating around and turn it into a tidy, re-usable attribute retriever.
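To make the shape of that idea concrete, here is a rough sketch. The class and method names are mine, the real OES AttributeRetriever SPI has a different signature, and the ARM call is stubbed out with canned scores; the point is just "retriever fetches the score, policy compares it to a threshold":

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch, not the real OES SPI: shows a retriever handing a
// "risk_score" attribute to the policy layer, which denies above a threshold.
public class RiskScoreRetriever {

    // Stand-in for a live call to ARM (e.g. bharosaHelper.runRules(...).getScore()).
    private final Map<String, Integer> scoresBySession = new HashMap<String, Integer>();

    public RiskScoreRetriever() {
        scoresBySession.put("session-1", 612); // pretend ARM scored this session as risky
        scoresBySession.put("session-2", 120); // and this one as low risk
    }

    // What the retriever would expose to OES as the risk_score attribute.
    public int getRiskScore(String sessionId) {
        Integer score = scoresBySession.get(sessionId);
        return score == null ? 0 : score;
    }

    // Equivalent of the policy: DENY (…) if risk_score > 500.
    public static boolean denied(int riskScore) {
        return riskScore > 500;
    }

    public static void main(String[] args) {
        RiskScoreRetriever r = new RiskScoreRetriever();
        System.out.println("session-1 denied? " + denied(r.getRiskScore("session-1")));
        System.out.println("session-2 denied? " + denied(r.getRiskScore("session-2")));
    }
}
```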

Thursday, January 6, 2011

Tell what patches are installed in your Oracle IAM 11g deployments using OPatch

People often want to know how they can tell what versions (including patches) of software they are running in their different environments. Fortunately, there is a simple standard way to get this information for all software that uses the Oracle Universal Installer. The method I speak of uses the OPatch utility that is standard for all Oracle products that use the universal installer. This includes most, if not all, of Fusion Middleware 11g and all of the 11g Identity Management packages.

OPatch is the utility used to patch Oracle products that utilize the Oracle Universal Installer. It is included in most product/package installs under that package’s ORACLE_HOME/OPatch.

Now, keep in mind that when you execute OPatch it will operate on the ORACLE_HOME that is set in the environment. So, if you are operating in an environment where you have multiple products/packages installed under one Fusion Middleware Home (FMW_HOME) you want to make sure that you have an ORACLE_HOME environment variable pointed to the product/package you are interested in patching or analyzing.

To get information on the version and patch levels of your Oracle product/packages run ‘opatch lsinventory’. Again, make sure that ORACLE_HOME is set to the product/package that you want the information for. If it is not set, then OPatch picks some default ORACLE_HOME from your FMW_HOME in a manner I have yet to figure out.

You can also use OPatch to tell you all the ORACLE_HOMEs that exist under your FMW_HOME by running ‘opatch lsinventory -all’.

The documentation for OPatch can be found in the Oracle Universal Installer and OPatch User Guide.

Tuesday, January 4, 2011

Good write-up on Oracle Internet Directory / Active Directory integration

One thing I intend to do more of is to provide links to other good resources on the internet for Oracle IAM and Fusion Middleware security.

Atul Kumar has written a bunch of good articles on his blog including this one on OID/AD

If you need a good reason to subscribe to our twitter feed @fusionsecexpert, we will probably start tweeting some links that we don't actually put up on the blog.

New on the blog for 2011

We at the Fusion Security Blog are looking forward to a great 2011.  Look for new topics and products to be added to the blog this year including coverage of Oracle Identity Manager (OIM), Oracle Adaptive Access Manager (OAAM), Oracle Identity Analytics (OIA), Oracle Identity Federation (OIF), and maybe some discussion of our new eSSO solution from Passlogix.  Also look for some new bloggers to join our site, providing you with some fresh perspective.
Finally, we are getting onboard the twitter bandwagon.  You can now follow us on twitter @fusionsecexpert.
Thanks for reading and here is hoping that all your Oracle Middleware projects in 2011 are wildly successful!

Monday, January 3, 2011

How do I secure my services?

I've been up early for a couple of days talking to a customer about how they should secure their services.

We started with the bit of advice "only make them as secure as you need to". In other words the "HelloWorld" service probably doesn't need an encrypted request with a signed SAML Assertion over a mutually authenticated SSL channel on a private, physically disconnected network segment. And the "launch nuclear weapon" service should probably be secured with something other than a simple username token.

Figuring out what sort of security your services require takes time and effort and lots of information. So I'm not going even try to give you a recommendation here. Instead I guess I'm going to tell you what I told them...

This customer has requests coming in from the Internet. Their users do NOT have individual certificates but do have a username and password. The plan, in as much as there is one, is to have OSB in the DMZ accept the SOAP request from the user and then route it to the real service behind the firewall. So the customer asked us what they should do to secure the services.

They tossed around a bunch of ideas including:

  • using an STS - the client would go to the STS and get a SAML assertion, then use that assertion to send the request to OSB.
  • using an STS with WS-Secure Conversation - same as above, but instead of just getting a SAML Assertion the client would do the more advanced session "stuff" that spec describes.
  • Publishing a certificate issuance SOAP endpoint. The SOAP client would call over to that service with a username and password and get a Certificate issued. The username and password would then be locked out and the Certificate would be the only way the user could authenticate after that point.
All of these solutions work - that is to say that they all provide some aspects of security and thus make the service more secure in some way. But they're all a bit heavy handed and require quite a bit of smarts on the client side. So we had a conversation about what they were actually trying to accomplish.

Turns out their requirements were that the data had to be transmitted securely - meaning nobody else could listen in and see the data. And they wanted to make sure that nobody could inject requests that didn't come from a real user into the flow. And that's it.

Those of you playing along at home know that one way SSL with the username and password in a WS-Security header meets those requirements. Plus it's really lightweight and simple to implement.
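On the wire, that is nothing more than a standard WS-Security UsernameToken carried over HTTPS. A minimal sketch of the SOAP header (the username and password values are placeholders):

```xml
<soap:Header>
  <wsse:Security
      xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
    <wsse:UsernameToken>
      <wsse:Username>jdoe</wsse:Username>
      <wsse:Password
          Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">
        secret
      </wsse:Password>
    </wsse:UsernameToken>
  </wsse:Security>
</soap:Header>
```

The one-way SSL channel is what protects the plaintext password in transit, which is why PasswordText is acceptable here.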

Sometimes simpler is better. And in this case turning all of the security levers up to 10 wouldn't have bought them anything but trouble.

So that's what they're going to go with - at least initially. In the future they'll revisit this solution and if things change they can always add new Proxy Services on the bus for other authentication methods.

Behind the bus was a whole 'nother story. They're going to use SAML assertions with the Sender Vouches confirmation method to pass identity down to the real services. But that's a story for another day.
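(If you want a tiny preview of that story: sender-vouches is just a subject confirmation method inside the assertion. In SAML 1.1 terms, with jdoe as a placeholder name, it looks like this:)

```xml
<saml:Subject>
  <saml:NameIdentifier>jdoe</saml:NameIdentifier>
  <saml:SubjectConfirmation>
    <saml:ConfirmationMethod>urn:oasis:names:tc:SAML:1.0:cm:sender-vouches</saml:ConfirmationMethod>
  </saml:SubjectConfirmation>
</saml:Subject>
```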