Ask Sawal

Discussion Forum

How to check dfc version?

2 Answer(s) Available
Answer # 1 #

Before I joined ArgonDigital this past year, I had never worked with Documentum. So my first task was to familiarize myself with the basics of the Documentum Foundation Classes (DFC). Documentum defines DFC as “a set of Java classes that make essentially all EDM server functionality available to client programs through a published set of interfaces.” It may also be described as the Object-Relational Mapper (ORM) for programmatically accessing and manipulating all of the objects in a docbase.

Whenever I’m learning a new application framework, I typically look for a “Quick Start” or “Getting Started” article either in the official documentation or by performing a Google search. If I’m unable to find one, then I dive straight into whatever documentation is provided with the intent of creating a set of programming exercises which will essentially become my own “Quick Start” guide going forward. Since a docbase is a persistent store, this meant I needed to familiarize myself with the basic CRUD functions: Create, Read, Update, and Delete. Furthermore, since Documentum is a document management system (to say the very least), I decided to explore the basic file system operations of the following:

Finally, since I was unable to locate that “Quick Start” guide I was searching for, I decided to capture what I learned in the article presented here.

The complete Eclipse project containing all sample code can be found HERE.

Setting up a basic DFC project in Eclipse was a straightforward task. After installing DFC 5.3 and creating a new Java project in Eclipse, perform the following steps:

After setting up my Eclipse project, I wanted to verify that I could programmatically access my test docbase. I wrote the following JUnit TestCase, which would serve as the base class for all subsequent TestCases I might write. Its main purpose is to authenticate to our test docbase and obtain an IDfSession object for our tests to use.

package com.ArgonDigital.dfc.test;

import junit.framework.TestCase;

import com.documentum.fc.client.DfClient;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.common.DfLoginInfo;
import com.documentum.fc.common.IDfLoginInfo;

public class Dfc5BaseTest extends TestCase {

    // TODO: refactor to pull from a properties file
    private static final String DOCBASE = "YOUR DOCBASE";
    private static final String USERNAME = "YOUR USERNAME";
    private static final String PASSWORD = "YOUR PASSWORD";

    private IDfSessionManager sessionMgr = null;
    protected IDfSession session = null;

    protected void setUp() throws Exception {
        super.setUp();
        IDfClient client = DfClient.getLocalClient();
        sessionMgr = client.newSessionManager();

        // Setup login details.
        IDfLoginInfo login = new DfLoginInfo();
        login.setUser(USERNAME);
        login.setPassword(PASSWORD);
        login.setDomain(null);
        sessionMgr.setIdentity(DOCBASE, login);

        session = sessionMgr.newSession(DOCBASE);
    }

    protected void tearDown() throws Exception {
        super.tearDown();
        if (session != null) {
            sessionMgr.release(session);
        }
    }

    protected void log(String message) {
        System.out.println(message);
    }
}

And then I tested my login code with the following subclass:

package com.ArgonDigital.dfc.test;

public class LoginTest extends Dfc5BaseTest {

    public void testLogin() throws Exception {
        // login happens in setUp(), so nothing to do here
        assertNotNull("session is null", session);
    }
}

There are a couple of important points regarding the above code samples:

Prior to DFC 5.x, there wasn’t an IDfSessionManager and the developer was required to call IDfSession.disconnect() whenever a session was no longer needed. However, the IDfSessionManager supports session pooling, so it is critical that any session acquired through a session manager is released through that session manager as well. Otherwise, bad things can happen and probably will. This is typical Object/Relational Mapper design, so those familiar with a similar persistence framework should find the transition rather painless.
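The acquire/release discipline can be sketched as follows. Since the DFC jars are not assumed to be on hand, the two DFC types are replaced here with tiny hypothetical stand-ins (a LIFO pool), so the shape of the pattern, including the try/finally, is runnable on its own; in real code the types are IDfSessionManager and IDfSession from com.documentum.fc.client.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class SessionPoolSketch {

    // HYPOTHETICAL stand-in for IDfSession
    static class Session {}

    // HYPOTHETICAL stand-in for a pooling IDfSessionManager
    static class SessionManager {
        private final Deque<Session> pool = new ArrayDeque<>();

        Session newSession() {
            // reuse a pooled session if one is available
            return pool.isEmpty() ? new Session() : pool.pop();
        }

        void release(Session session) {
            // return the session to the pool; do NOT disconnect it yourself
            pool.push(session);
        }

        int pooled() {
            return pool.size();
        }
    }

    public static void main(String[] args) {
        SessionManager sessionMgr = new SessionManager();
        Session session = sessionMgr.newSession();
        try {
            // ... work with the session ...
        } finally {
            // always release through the same manager that created it
            sessionMgr.release(session);
        }
        System.out.println("sessions back in pool: " + sessionMgr.pooled());
    }
}
```

Releasing through the manager is what lets the next newSession() call reuse the pooled session instead of opening a fresh connection.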

DFC’s object model for managing docbase objects is a deep and complex hierarchy, but we can get started with the basics by looking at only a small subset of these classes:

*Arrows represent object inheritance levels.

IDfClient
IDfSessionManager
IDfSession
IDfQuery
IDfTypedObject
    --> IDfCollection
    --> IDfPersistentObject
        --> IDfSysObject
            --> IDfFolder
            --> IDfDocument

We’ve already been introduced to IDfClient, IDfSessionManager, and IDfSession in the previous section. So what are the remaining classes used for? The DFC Javadoc describes them as follows:

We’ll get a better understanding once we see them in action, so let’s put them to use.

Finally, it’s time to do what we all love: Write code. Let’s revisit our chosen exercises:

I’ve created a single test case class, DfcCrudTest.java, with test methods present for each of our exercises. For some of our exercises, there turned out to be more than one viable way of accomplishing our goal. For example, to obtain a folder’s contents, you can perform a simple DQL query, or if you have a handle on the IDfFolder object, you can call the getContents(..) method on the folder object. To demonstrate this, I included both options within my testFolderContents() method.

Please keep in mind that these tests are written for clarity, not for optimal design.

package com.ArgonDigital.dfc.test;

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfDocument;
import com.documentum.fc.client.IDfFolder;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.common.IDfId;

public class DfcCrudTest extends Dfc5BaseTest {

    private static String DIR_NAME = "Subdir";
    private static String DIR_PATH = "/Temp/" + DIR_NAME;
    private static String FILE_NAME = "Getting Started with DFC and DQL.txt";
    private static String FILE_PATH = DIR_PATH + "/" + FILE_NAME;
    private static String DOC_AUTHOR = "Steve McMichael";

    private IDfFolder folder;
    private IDfDocument document;

    public void testSimpleDfc() throws Exception {
        initialize();
        // tests are order dependent
        createFolder();
        createFile();
        linkFileToFolder();
        modifyFile();
        fetchFolderContents();
        queryFiles();
        deleteFile();
        deleteFolder();
    }

    private void createFolder() throws Exception {
        log("** Testing folder creation");
        folder = (IDfFolder) session.newObject("dm_folder");
        folder.setObjectName(DIR_NAME);
        folder.link("/Temp");
        folder.save();
        log("created folder " + folder.getId("r_object_id"));
        assertEquals("unexpected folder path", DIR_PATH, folder.getFolderPath(0));
    }

    private void createFile() throws Exception {
        log("** Testing file creation");
        document = (IDfDocument) session.newObject("dm_document");
        document.setObjectName(FILE_NAME);
        document.setContentType("crtext");
        document.setFile("E:/clipboard.txt"); // add content to this dm_document
        document.save();
        log("created file " + document.getId("r_object_id"));
    }

    private void linkFileToFolder() throws Exception {
        log("** Testing file linking to folder");
        document.link(DIR_PATH);
        document.save();
        log(FILE_PATH);
        assertNotNull("unexpected folder path", session.getObjectByPath(FILE_PATH));
    }

    private void modifyFile() throws Exception {
        log("** Testing file modification");
        document.checkout();
        int numAuthors = document.getAuthorsCount();
        document.setAuthors(numAuthors, DOC_AUTHOR);
        //document.checkin(false, "Prevents promotion to CURRENT");
        document.checkin(false, null); // When a null version label is provided,
                                       // DFC automatically gives the new version
                                       // an implicit version label (1.1, 1.2, etc.)
                                       // and the symbolic label "CURRENT".
    }

    private void fetchFolderContents() throws Exception {
        log("** Testing folder contents");

        // (1) Fetch using IDfFolder object
        IDfFolder folder = session.getFolderByPath(DIR_PATH);
        assertNotNull("folder is null", folder);
        IDfCollection collection = null;
        IDfDocument doc = null;
        int count = 0;
        try {
            collection = folder.getContents("r_object_id");
            while (collection.next()) {
                count++;
                IDfId id = collection.getId("r_object_id");
                doc = (IDfDocument) session.getObject(id);
                log(id + ": " + doc.getObjectName());
            }
        } finally {
            // ALWAYS clean up your collections
            if (collection != null) {
                collection.close();
            }
        }
        assertEquals("wrong number of files in folder", 1, count);
        assertEquals("unexpected doc name", FILE_NAME, doc.getObjectName());

        // (2) Fetch using DQL folder(..)
        String dql = "SELECT r_object_id, object_name from dm_document where folder('" + DIR_PATH + "')";
        // Or we can fetch the contents of our folder and all of its subfolders using
        //
        //     folder('/Temp/Subdir', descend)
        //
        // But since we haven't added any subfolders, this will return the same set of dm_documents.
        //
        // String dql = "SELECT r_object_id, object_name from dm_document where folder('" + DIR_PATH + "', descend)";
        IDfQuery query = new DfQuery();
        query.setDQL(dql);
        collection = null;
        String docName = null;
        count = 0;
        try {
            collection = query.execute(session, IDfQuery.DF_READ_QUERY);
            while (collection.next()) {
                count++;
                String id = collection.getString("r_object_id");
                docName = collection.getString("object_name");
                log(id + ": " + docName);
            }
        } finally {
            // ALWAYS clean up your collections
            if (collection != null) {
                collection.close();
            }
        }
        assertEquals("wrong number of files in folder", 1, count);
        assertEquals("unexpected doc name", FILE_NAME, docName);
    }

    private void queryFiles() throws Exception {
        log("** Testing file query");

        // (1) load by path
        IDfDocument doc = (IDfDocument) session.getObjectByPath(FILE_PATH);
        assertNotNull("null doc returned", doc);
        assertEquals("unexpected doc name", FILE_NAME, doc.getObjectName());

        // (2) load by query
        // NOTE: authors is a "repeating attribute" in Documentum terminology,
        // meaning it is multi-valued. So we need to use the ANY DQL keyword here.
        doc = null;
        String dql = "SELECT r_object_id"
            + " FROM dm_document"
            + " WHERE object_name = '" + FILE_NAME + "'"
            + " AND ANY authors = '" + DOC_AUTHOR + "'";
        IDfQuery query = new DfQuery();
        query.setDQL(dql);
        IDfCollection collection = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            assertTrue("query did not return any results", collection.next());
            doc = (IDfDocument) session.getObject(collection.getId("r_object_id"));
        } finally {
            // ALWAYS clean up your collections
            if (collection != null) {
                collection.close();
            }
        }
        assertNotNull("null doc returned", doc);
        assertEquals("unexpected doc name", FILE_NAME, doc.getObjectName());
    }

    private void deleteFile() throws Exception {
        if (document != null) {
            log("** Testing file deletion");
            document.destroyAllVersions();
        }
    }

    private void deleteFolder() throws Exception {
        if (folder != null) {
            log("** Testing folder deletion");
            folder.destroyAllVersions();
        }
    }

    private void initialize() {
        // If something bad happened during the previous run, this will
        // make sure we're back in a good state for this test run.
        try {
            session.getObjectByPath(FILE_PATH).destroy();
        } catch (Exception e) {
            // ignore
        }
        try {
            session.getObjectByPath(DIR_PATH).destroy();
        } catch (Exception e) {
            // ignore
        }
    }
}

If you have your DFC Javadoc handy, then the above code sample should provide the details required to tie everything together.

However, there is one requirement I’d like to highlight. Whenever you execute a DQL query in DFC, an IDfCollection object is created as a handle to the query results, similar to a ResultSet in JDBC. This collection represents an open resource which must be closed. There are a limited number of collections available, and so it is imperative that collections be closed when they are no longer in use.
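That rule has a canonical shape. The sketch below uses a hypothetical stand-in for IDfCollection (the DFC jars are not assumed to be on the classpath here), but the try/finally structure is exactly the one used throughout the tests above: close() sits in the finally block so it runs even when iteration throws.

```java
public class CollectionCloseSketch {

    // HYPOTHETICAL stand-in for IDfCollection
    static class Collection {
        private boolean open = true;
        private int remaining = 2;   // pretend the query matched two rows

        boolean next() { return remaining-- > 0; }
        void close() { open = false; }
        boolean isOpen() { return open; }
    }

    // HYPOTHETICAL stand-in for IDfQuery.execute(..)
    static Collection execute() {
        return new Collection();
    }

    public static void main(String[] args) {
        Collection collection = null;
        try {
            collection = execute();
            while (collection.next()) {
                // ... read attributes from the current row ...
            }
        } finally {
            // ALWAYS close the collection, even if an exception was thrown above
            if (collection != null) {
                collection.close();
            }
        }
        System.out.println("collection open after use: " + collection.isOpen());
    }
}
```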

So the two best practices we’ve discussed regarding resource cleanup with DFC are:

1. Always release a session through the same session manager that created it.
2. Always close your IDfCollection objects, ideally in a finally block, as soon as you are done with them.

Hopefully, this is enough to get you started. There are numerous resources available which provide a deeper dive into some of the concepts presented here. To help you out, I’ve provided a short list of references for further reading. Enjoy!

Muhammet DiLeo
Master Electrician
Answer # 2 #

Whenever you install a CS patch or any other patch, it will probably ship with its own DFC libraries, simply because EMC fixed something in them or because a newer version was needed. Whenever you install D2, it will also have its own DFC libraries in the JMS and in the WAR files. The problem is that the DFC libraries are everywhere: each and every DFC client comes with its own DFC libraries when you install it, patch it, and so on. That is not a wrong approach in itself; it ensures that a component will work wherever it is installed and can always talk to the Content Server.

The problem is that the DFC libraries change with almost every patch, which makes it complicated to keep a clean environment. It has already happened to us that two different patches (CS and D2, for example), released on the exact same day, used different DFC versions, and you will see below another example coming from the same package. You can live with a server hosting five different DFC versions, but it also means that whenever a bug impacts one of your DFC libraries, it will be hard to fix, because you then need to deploy the next official patch, which is always a pain. Running several versions at the same time also multiplies the number of DFC issues that can affect you.

I’m not saying that you absolutely need to always use the latest DFC version, but if you can properly and quickly perform the appropriate testing, I believe it can bring you something. A few weeks ago, for example, one of the Application Teams we support had an issue with some search functionality in D2. It was actually caused by the DFC version bundled with D2 (DFC 7.2 P03, I think), and we solved it by simply using the DFC version coming from our Content Server (DFC 7.2 P05), which was only two patch levels above.

To quickly and efficiently see which versions of the DFC libraries you are using and where, you can use:

You can execute these commands on a Content Server, an Application Server (note: the dfc.jar files might be inside the D2/DA WAR files if you aren’t using exploded deployments), a Full Text Server, or any other Linux server for that matter. The commands handle spaces in the paths, even though normally you shouldn’t have any for the dfc files. To use them, just replace the placeholder with the base folder of your installation: $DOCUMENTUM for a Content Server, $XPLORE_HOME for a Full Text Server, and so on. Of course, you still need the proper permissions to see the files; otherwise executing the commands will be quite useless.
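The exact commands were lost from this post, but the idea is straightforward to reconstruct: dfc.jar bundles a small DfShowVersion class that prints the version of that exact jar, so walking the installation tree and invoking it on every dfc.jar found gives the full picture. A sketch under those assumptions (the function name list_dfc_versions is hypothetical; pass $DOCUMENTUM, $XPLORE_HOME, or whatever your base folder is):

```shell
#!/bin/bash
# Hypothetical helper: print the version of every dfc.jar under a base folder.
# DfShowVersion is a class shipped inside dfc.jar itself; running it with the
# jar on the classpath prints that jar's DFC version string.
list_dfc_versions() {
    # -print0 / read -d '' keeps paths containing spaces intact
    find "$1" -name dfc.jar -print0 | while IFS= read -r -d '' jar; do
        printf '%s ==> ' "$jar"
        java -cp "$jar" DfShowVersion
    done
}

# Usage (base folder depends on the product):
#   list_dfc_versions "$DOCUMENTUM"      # Content Server
#   list_dfc_versions "$XPLORE_HOME"     # Full Text Server
```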

A small example on a Content Server 7.3 (no patches are available yet), including xCP 2.3 P05 (the end-of-February 2017 patch, which is supposed to be for CS 7.3):

As you can see above, there are two different versions of the DFC library on this freshly installed Content Server: one coming from CS 7.3, which is therefore 7.3 P00 build 205, and another that is still 7.2 P21 build 184. This second version was put on the Content Server by the xCP 2.3 P05 installer. Using a 7.2 library on a 7.3 Content Server is a little bit ugly, but the good news is that both versions are fairly recent, since the two libraries were released at almost the same time (end of 2016 / beginning of 2017). So I don’t think it is a big problem here, even though, as soon as CS 7.3 P01 is out (normally at the end of this month), we will replace all dfc.jar files with the 7.3 P01 version.

Another example on a Full Text Server running xPlore 1.6 (same as before, no patches are available yet for xPlore 1.6), with one Primary Dsearch and two IndexAgents, for DOCBASE1 and DOCBASE2:

Do you see something strange here? Because I do! This is a completely new xPlore 1.6 server that has just been installed, and yet we have two different versions of the DFC libraries. It isn’t a difference in the minor version but a difference in the build number! As you can see above, the PrimaryDsearch uses DFC 7.3 P00 build 196, while all the other DFC libraries are 7.3 P00 build 205 (just like the Content Server). The reason is that each xPlore module (IA, Dsearch, and so on) is built by a different xPlore team, so the team that packages the Dsearch libraries isn’t the same one that packages the IndexAgent libraries.

Since there is a difference, it probably means the Dsearch team built their package some days or weeks before the other teams (IA, CS, and so on), and therefore the DFC libraries included in the Dsearch are older. Is that an issue or not? According to EMC it isn’t, but I wouldn’t be so categorical: if EMC built this library nine more times, it wasn’t for nothing. There must be a reason behind those builds, so not having the latest one seems a little risky to me. Since this is just a sandbox environment, I will most probably wait for P01 of xPlore 1.6, which will be released in a few days, and implement it to get an aligned DFC version across all components.

Have fun finding issues in the EMC releases :).

Nishi Bala
SIFTER