J2EE Build Tool Calms WebSphere Coding Chaos

Catalyst Systems is bringing a new level of build management to J2EE WebSphere developers and architects. See how Openmake 6.3, delivered as an Eclipse plug-in for WebSphere Studio 5.1, cuts the need for J2EE developers to hand-script makefiles, XML or even Java classes, using a blend of XML metadata, Apache Ant and build Best Practices.

Tags: Scripting, Developers, WebSphere, Openmake, Ant, Eclipse, XML


IDN Executive Overview
Among topics covered in this article are:

  • A review of how Catalyst's Openmake 6.3 eliminates many tedious scripting tasks for J2EE builds, including makefiles, XML scripting, and even custom Java classes.
  • An IDN "How-it-Works" interview with Catalyst CEO and Openmake creator Tracy Ragan.
    ======================================

    by Vance McCarthy
    Open Source firm Catalyst Systems is shipping a plug-in for the Eclipse-based WebSphere Studio 5.1 that promises to curb chaos in the lives of J2EE extreme programmers.

    Catalyst has been providing a wide variety of build management tools for more than 10 years. The company's latest release, Openmake 6.3, blends the Open Source power of Ant and XML metadata with the firm's Best Practices approach for creating and maintaining builds -- and makes it available as an easy-to-use Eclipse plug-in for WebSphere Studio developers.

    The result: Complex WebSphere apps that need components built and integrated from outside WebSphere Studio can be developed using a repeatable build method. In addition, Openmake eliminates the need to hard-code separate manual processes for performing builds outside the IDE.

    "The primary goal of Openmake is to remove the requirement for developers to perform any scripting at all -- no makefiles, no XML scripting for Ant and no special Java classes are needed to perform builds within the IDE or externally by a build master," said Tracy Ragan, Catalyst CEO.

    "Openmake 6.3 moves the Java build process a huge step forward by giving developers all the power and flexibility of Ant without anyone having to touch a single line of XML, allowing them to focus on what they were hired to do and what they do best -- write code."[For OET's full interview with Catalyst Systems CEO Tracy Ragan, see below.]

    Helping Cure 'Extreme Programming's' Extreme Headaches
    J2EE's extreme programming can require architects and developers to implement continuous integration of source code changes and builds -- outside their chosen IDE. The result can be a management nightmare, requiring staff to manage scheduled builds in parallel IDEs while maintaining multiple production releases at the same time.

    Openmake automatically gathers and coordinates WebSphere project-specific information from the WebSphere Project file to support an automated Ant Build process across WebSphere Projects and across the enterprise. Once installed, the Openmake plug-in monitors a developer's work activities.

    Any additions to the WebSphere project, such as changes to the Java build path or changes to project dependencies, are automatically recorded by Openmake and reflected in the Openmake Target Definition (TGT) Files. These TGT files are stored within the WebSphere Studio Workspace and can then be used inside or outside of the Eclipse IDE to perform project builds at any release level or state of a development lifecycle.
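
    The kind of project metadata involved here is itself plain XML. An Eclipse/WebSphere Studio .classpath file, for instance -- one of the standard project files that records the Java build path -- looks roughly like this (the entries below are hypothetical):

        <?xml version="1.0" encoding="UTF-8"?>
        <classpath>
            <classpathentry kind="src" path="src"/>
            <classpathentry kind="lib" path="lib/log4j.jar"/>
            <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
            <classpathentry kind="output" path="bin"/>
        </classpath>

    When entries like these change, Openmake picks up the change and updates the corresponding TGT file, rather than requiring the developer to edit a build script.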

    The Openmake Eclipse Benefits
    Openmake's Eclipse plug-in approach enables developers to completely automate the build process for enterprise applications without having to write a single Ant/XML script. Ant build tasks are derived dynamically from the WebSphere Studio project file. This provides a collaborative approach to the "build engineering process," enabling WebSphere developers to work through the WebSphere Eclipse IDE in a standard, uninterrupted way.

    The Openmake plug-in for Eclipse also provides:
    • A console for sophisticated real-time build logging, with no need to write a Build Listener Java class for reporting;

    • A Build Setup View that lets users initiate builds from within the Eclipse IDE for individual or multiple targets, whether WebSphere or non-WebSphere components; and

    • Advanced reporting options including Embedded Footprinting, Bill of Materials reports, a verbose logging mode and Dependency Impact Analysis.


    OET Interview with
    Tracy Ragan, CEO
    Catalyst Systems Inc.


    OET: Why do you think Ant can be so important to J2EE developers?

    Ragan: Ant is a very useful set of functions for building Java programs, whether they target WebSphere, BEA or straight Java; it really doesn't matter. Basically, Ant acts as a front end to the compiler, so you can tell Ant to use different versions of a Java compiler. There are lots of very useful functions you can use with Ant -- but there are also a lot of not-so-useful functions in Ant.

    OET: For instance?

    Ragan: For instance, XML is not a true scripting language; it's really a tag language. There are some functions that Ant had to create to make XML more like a Perl scripting language; for example, being able to create a directory, or copying a file from one directory to the next. These are things that you can do in a heartbeat with Perl, but you can't do in XML.
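
    To make the comparison concrete, the directory-creation and file-copy operations Ragan mentions are standard Ant tasks; in Ant's XML they look roughly like this (the paths shown are hypothetical):

        <!-- Roughly equivalent to one-line mkdir/cp commands in Perl or a shell -->
        <mkdir dir="build/reports"/>
        <copy file="config/app.properties" todir="build/classes"/>
        <copy todir="build/lib">
            <fileset dir="lib" includes="*.jar"/>
        </copy>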

    OET: Does that mean, in your view, that some folks who are tuning their J2EE app server are using Perl and don't want to?

    Ragan: I would say the opposite: Most people building Java apps are having to rely on XML, which is not a very intelligent scripting tool. What we're doing with Openmake is to allow them to use all of the useful Ant tasks, but the delivery of those Ant tasks is through Perl as opposed to XML.

    OET: It appears that in Openmake, you have this marriage of Ant and Java and Perl and XML and Make. Which one of those would you suggest a developer know the most about to get the most mileage from Openmake 6.3?

    Ragan: You don't have to know anything about any of them, actually -- and that's the beauty of Openmake. Developers need to understand how they want their particular file to be created, whether it's an EAR file or a shared Unix library. They need to know what source code goes into it, and they need to know what it is. They don't need to know anything else. Now, if they want to become super users of Openmake, the most useful language for them to know is Perl, because Perl is how we wrap the compiler. Our reusable code contains no application-specific file information in it; it's all written in Perl. I can write build types with these command scripts written in Perl to support EARs, WARs, and JARs, and anybody can use them.
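
    For comparison, a hand-written Ant equivalent of that kind of packaging step would look roughly like the fragment below (file and directory names are hypothetical); with Openmake, this is the XML nobody has to write or maintain:

        <!-- Hypothetical packaging steps using Ant's standard <war> and <ear> tasks -->
        <war destfile="dist/storefront.war" webxml="WebContent/WEB-INF/web.xml">
            <classes dir="build/classes"/>
            <lib dir="lib" includes="*.jar"/>
            <fileset dir="WebContent" excludes="WEB-INF/**"/>
        </war>

        <ear destfile="dist/storefront.ear" appxml="META-INF/application.xml">
            <fileset dir="dist" includes="*.war"/>
        </ear>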

    OET: And for Perl, do you have any recommendations?

    Ragan: We usually recommend ActiveState Perl. So the big deal for the WebSphere community, with Openmake, is that when you're a WebSphere developer, you rely heavily on the Eclipse IDE. When you want to build applications involving more than a few developers, you run into the standard enterprise rollout issues you'd face if you were writing straight C code, using your favorite editor and knowing all of the names of your source code.

    OET: Is it fair to say Openmake, in addition to automation, also provides some code practices "enforcement," at least when it comes to builds?

    Ragan: The J2EE standards really mean that developers need to build EARs and JARs in a certain way. Now you see tools like Maven and Ant -- they're attempting to assist developers in ensuring that builds are done in a certain way. Openmake has always done that. What's happening is that as these small teams become mission-critical teams, they're moving toward some accountability. Where in the past they just had their small Ant scripts and only had to answer to each other, now these teams have to answer to a larger group, maybe production management or a central CM team -- and those teams have to answer to auditing.

    OET: What would be an example of how you work with Eclipse/WebSphere Studio?

    Ragan: For example, all these issues around Sarbanes-Oxley, such as being able to verify that source matches the executable, become very apparent when you begin building larger, enterprise-level applications with these tools, which is really what IBM is selling WebSphere to do. But the problem is you build inside the IDE, and you can't automate builds and you can't grow builds beyond the IDE. So what developers are faced with is going outside the IDE. They can build in headless mode, which actually calls the Java compiler without bringing up the Eclipse IDE -- but it's only able to do limited steps. If they need to go beyond those limited steps, they call Ant to do that heavier lifting, and they have to code XML scripts to deliver Ant to their build process.

    What we do is fit in the middle: We say, "Don't bother. You have this fabulous environment where you are literally generating code. Don't stop and manually script a build solution for code that's being generated. It doesn't make sense." We simply interrogate the WebSphere project file and create what we call a target file for them, automatically. We take our target information and merge it with our metadata and our Perl script, and we generate what the developer would normally have to manually script. We have a plug-in for Eclipse, and we have a plug-in for WebSphere that has been validated by the WebSphere team as Ready for WebSphere.

    OET: Let's talk a few minutes about the future of Openmake. Are any of the features in Eclipse 3.0 something you either do or would like to take advantage of?

    Ragan: That's our next step. I looked at the Eclipse 3.0 platform and, in terms of the way we substitute for -- I guess that's the best word -- the JDT, there's not a whole lot we have to change. The Eclipse project file, in terms of the items we have to use when we're scanning, none of that has changed, either. So in [Openmake] 6.4, we'll validate against Eclipse 3.0, and at that point we'll be prepared for Rational, which is the new IBM certification for WebSphere. When we go through that process, it will be our 6.4 product with Eclipse 3.0, and it will be ready for Rational certification.

    OET: You seem to be doing great work for IBM developers. Do you get any sense that IBM corporate wants to get to know Catalyst better?

    Ragan: At the Rational user conference, they welcomed us with open arms. They were happy to see that there was a build tool out there that actually integrated, cleanly, underneath the covers of WebSphere. Because now that WebSphere is moving into the Rational division, they need to be able to answer to people in terms of how you verify what you're building with Ant… They need to be able to answer those questions.

    OET: So, in the end, do you expect that many Java/J2EE architects and developers out there will be stressed because they didn't handle their build functions right?

    Ragan: Well, what these Java developers are being asked to do is to provide a mechanism to match what has been checked into their CM tool to their executable. And these developers like to tinker. They like to tinker with scripts. And you might have someone who is constantly tinkering but will leave to work elsewhere.

    OET: Are you saying we have a version control issue here? J2EE professionals will tinker with some code set, but they won't take what they've tinkered with and check it back in?

    Ragan: It's literally impossible with a manually scripted process. The hardest part is to look at these manually scripted processes and determine where the source code came from. Even though you checked in all of the code that you wrote, you didn't check in the third-party code that you downloaded from the Internet. Or you didn't check in the code that the team next to you provided for error routines that you borrowed from them. A large portion of the code that creates that final binary may actually be missing from the CM tool.

    OET: So, with new rules and requirements, the "tinker police" are coming out?

    Ragan: Exactly, that's the best way to describe it. They want to know who's tinkering because builds are really important at the end of the day. That's what determines what goes to production, and that's what they need to find out.




