How UML Offers Devs More Control, Not Less
Abstraction tools can be "good, bad or ugly" for devs. See how one IT department began as UML skeptics and ended up finding that the right modeling tools give Java/J2EE devs more control -- not less -- and how the right abstraction tools can save time, money and aggravation without detracting from expert programming.
As devs debate the merits and demerits of modeling, UML and other abstraction tools, Integration Developer News is talking to end users about their "good, bad and ugly" experiences. Our latest interview focuses on a firm whose team began as skeptics and ended up surprised to find UML tools could give their Java/J2EE devs more control -- not less -- over a mission-critical project. One reason: UML actually relies on the maturity of core developer technologies, including J2EE code and patterns.
Developer Opportunities from Abstraction, UML
Richard Seegmiller, CIO for start-up wireless emergency solutions provider Micro-News Network, presented his IT developer staff with the type of "challenging opportunity" not unfamiliar to devs in a high-tech start-up.
The wealth of the opportunity and the depth of the challenge prompted Seegmiller and his MNN team to explore the state-of-the-art for abstraction tools (UML, modeling, code generation, etc.). Seegmiller admits he started his search as a skeptic. "I was skeptical," he told IDN, "and yet I saw some architectural things…which intrigued me." So, with the company's biggest product opportunity ever at stake, Seegmiller decided it was worth the time and effort to investigate.
In the end, the skeptical Seegmiller found, to his surprise, that abstractions and UML tools were more ready-for-primetime than many may think. So, IDN asked Seegmiller, should J2EE developers spend more time looking past Java APIs to interoperability or integration technologies such as UML? His response was instructive.
"Yes, I think so," he said, putting UML into a broader technology context that he thinks many enterprise developers, especially J2EE devs, could benefit from.
"I find it interesting that so many people in our culture are fearful of a change in their job and a change in what they do. If we look back through history, we can see that back in the late 1800s, 80% of our population was involved in the manufacture of food; today it is 2%," Seegmiller posited. "So," he asked, "in that context, what has automation allowed us to do?"
Seegmiller's answer: "[Automation] has allowed us to focus less on mundane survival skills and more on greater and higher ideals. So, here we are at the beginning of the millennium with the unraveling of human DNA. How could this occur if we were afraid to give up what we're doing in favor of a greater and higher ideal? And, so, by automating the mundane, it allows us to focus on new horizons that are yet unconquered." Developers should use automation as an opportunity to begin to tackle the unconquered, rather than rewrite what's already been done, time after time, Seegmiller said.
Inside Seegmiller's Tools Investigation
One reason Seegmiller said that abstraction/UML tools may be ready is not so much because modeling techniques are so strong, but because of the strength of the underlying technologies that models rely on -- such as J2EE code, J2EE patterns, object-oriented programming and code generators.
Let's take a look at Seegmiller's opportunity/challenge equation, which set up the requirements for his tools investigation:
The opportunity: "If we build the right solution, and build it correctly," Seegmiller said, "customers will come." MNN's research showed strong demand for a low-cost, easy-to-deploy emergency-response system.
The challenge: MNN's emergency wireless solution, dubbed SNAP, would need to blend hardened mission-critical features with core flexibilities. Among the requirements:
- Device Portability -- Operate on any number of mobile devices that government workers, teachers, ERTs, etc. might already be carrying (PDAs, cell phones, pagers, industrial radios, etc.).
- Back-end Portability -- While MNN plans to use Sun's Java Application Server Framework, Seegmiller is prototyping the application in Apache/Tomcat and wants to move the application (including the components that integrate to legacy assets) without a lot of added development.
- High Reliability (end-to-end) -- This would require that the code be tightly coupled to the application's data model (along with all CRUD functions needed to support the data model -- Create, Read, Update and Delete).
- Customizable Workflows -- While the overall framework for SNAP would remain consistent across customers, customer-centric emergency response workflows or dataflows would need to be supported, without the need to "re-architect" the entire SNAP framework.
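The "High Reliability" requirement above hinges on code that is tightly coupled to the data model, with a complete set of CRUD functions per entity. A minimal Java sketch of what that implies -- entity and DAO names here are invented for illustration, not taken from MNN's actual SNAP schema:

```java
import java.util.*;

// Hypothetical model entity; fields are illustrative, not from SNAP.
class Incident {
    final String id;
    String description;
    Incident(String id, String description) { this.id = id; this.description = description; }
}

// CRUD contract derived one-to-one from the data model: if the model
// defines an Incident entity, the code must cover all four operations.
interface IncidentDao {
    void create(Incident i);   // Create
    Incident read(String id);  // Read
    void update(Incident i);   // Update
    void delete(String id);    // Delete
}

// In-memory stand-in for what a generated JDBC/EJB implementation would do.
class InMemoryIncidentDao implements IncidentDao {
    private final Map<String, Incident> store = new HashMap<>();
    public void create(Incident i) { store.put(i.id, i); }
    public Incident read(String id) { return store.get(id); }
    public void update(Incident i) { store.put(i.id, i); }
    public void delete(String id) { store.remove(id); }
}
```

The point of the coupling: if every entity in the model carries exactly this contract, nothing in the model can silently go unimplemented.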
Seegmiller also was resource-constrained, so he wanted to prototype a SNAP application with only a handful of Java/J2EE developers in under a month.
Putting J2EE Devs at the Center
Seegmiller candidly admits he wasn't sure he could get all the attributes from one tool, but he was sure of one thing: Getting his J2EE developers the right tools to support development of a tightly coupled app that would have flexibility at the workflow and even the language level would be key to MNN's success.
"I think the real issue [for developers and tools] is the integration of the model-driven architecture from the data modeling through the technology selection and on through the code that is generated," Seegmiller said. The results of his investigation and trials surprised even Seegmiller. Details follow.
Seegmiller and his MNN team, like many J2EE teams, are very methodical in their application design, construction and deployment. MNN starts with a data model, then turns that data model into a coded project, then runs and tests that project, makes tweaks where needed, runs again, and deploys.
This approach, while it results in runnable code (with few surprises) the majority of the time, also has drawbacks, Seegmiller said. It requires many man-hours and many other skills (from architects, business analysts, SQL programmers, integration specialists, etc.) to get a project from design to finished, runnable code.
Seegmiller described the challenge for UML/abstraction tools this way: "Beginning with the domain modeling or the data model, one will drag-and-drop objects onto a canvas, and these objects represent data elements in the data model. One will connect these elements together with other objects; for example, they’re represented as lines in the tool. And these lines connect objects together, and one defines the relationship between these objects," Seegmiller said.
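The objects-and-lines picture Seegmiller describes boils down to a small metamodel: boxes are entities, lines are relationships with a defined multiplicity. A toy Java rendering of that idea (the class and method names here are invented for illustration, not OptimalJ's actual API):

```java
import java.util.*;

// A box on the canvas: an entity with its attributes.
class Entity {
    final String name;
    final List<String> attributes = new ArrayList<>();
    Entity(String name) { this.name = name; }
}

// A line on the canvas: a directed relationship with a multiplicity.
class Relationship {
    final Entity from, to;
    final String multiplicity; // e.g. "1..*"
    Relationship(Entity from, Entity to, String multiplicity) {
        this.from = from; this.to = to; this.multiplicity = multiplicity;
    }
    public String toString() {
        return from.name + " --" + multiplicity + "--> " + to.name;
    }
}
```

A "one responder handles many incidents" line, for example, would be `new Relationship(responder, incident, "1..*")`.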
Putting Abstraction, UML Tools to the Test
Seegmiller had many questions going in, but two would prove crucial:
- Just how much could a tool/framework do to help J2EE devs construct their data model (without too much hands-on help from business analysts and architects); and
- Just how well (if at all) could that tools-based UML data model generate truly "runnable" code (without the hands-on help of SQL, J2EE or other programmers).
"As we were going down this path, certainly there were concerns," Seegmiller confided. "So, we approached this as an experiment: Let's do something on a small scale and see how it does. So, we threw some stuff at it, and within a very short time we demonstrated some capabilities there."
Seegmiller describes his "design-to-code-to-deployment" process this way:
"In the traditional days of modeling, an architect might generate a model and hand it over to some technology expert to convert the model into code. Now, that translation is much more seamless, and even though there’s less hands-on coding, more control can be put directly into the hands of the J2EE developer.
"So, you have your data model, and you wish to express that data model in a technology," Seegmiller went on. "With a traditional approach, such as [IBM] Rational Rose, you take the output and hand it to your SQL programmer. With that [decision to go SQL], you've chosen a technology. Now, the SQL programmer takes that data model and generates code that represents that data model in that technology. Likewise," Seegmiller added, "you could take that same data model and hand it to a .NET programmer, and he would express that in a .NET Framework."
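Seegmiller's point -- one model, "expressed" in whichever technology you choose -- can be sketched as a pair of trivial generators. This is a toy, not how Rational Rose or OptimalJ actually works: the same entity description is rendered either as SQL DDL or as a Java class skeleton.

```java
import java.util.*;

// One model element: an entity with ordered attribute -> SQL-type pairs.
class EntityModel {
    final String name;
    final LinkedHashMap<String, String> attrs;
    EntityModel(String name, LinkedHashMap<String, String> attrs) {
        this.name = name;
        this.attrs = attrs;
    }
}

class ModelGenerators {
    // "Hand it to the SQL programmer": express the model as relational DDL.
    static String toSql(EntityModel m) {
        List<String> cols = new ArrayList<>();
        for (Map.Entry<String, String> e : m.attrs.entrySet())
            cols.add(e.getKey() + " " + e.getValue());
        return "CREATE TABLE " + m.name + " (" + String.join(", ", cols) + ");";
    }

    // "Hand it to the Java programmer": express the same model as a class.
    static String toJava(EntityModel m) {
        StringBuilder sb = new StringBuilder("class " + m.name + " {");
        for (String attr : m.attrs.keySet())
            sb.append(" String ").append(attr).append(";");
        return sb.append(" }").toString();
    }
}
```

The technology choice becomes a parameter of the pipeline rather than a hand-off to a different specialist -- which is the "more control" Seegmiller is describing.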
Based on the value J2EE devs can get from effective abstraction, Seegmiller said he's a bit confused by the negative reaction to such tools among some J2EE devs. "I think J2EE people are familiar with abstraction. In an object-oriented programming language such as Java, one has to be able to abstract, so I don't think this is such a long leap in terms of abstraction," he said.
The List: Seegmiller Quantifies Dev Benefits from UML Tools
Seegmiller said he was particularly interested in tools that would help his J2EE devs trade up their skills; in other words, (1) do more-valued, higher-end functions at the design level, while (2) sidestepping time-consuming lower-level chores. Seegmiller had several functions in mind:
- Getting J2EE Code to Comply with the Model
The main issue was to have the generated code tightly coupled to the data model. In the traditional model, when you hand the data model over to the programmer, you rely on their skills to completely implement the data model and all the CRUD (Create, Read, Update, Delete) functions that go along with it, as well as to accurately represent the data model. With generated code tightly coupled to the model, you have to worry a lot less about "Did everything get built correctly?"
With the integrated development framework of OptimalJ, all of that -- architecture, data modeling, technology mapping and code generation -- is automated.
- Get Rid of CRUD:
To Seegmiller, getting rid of tedious developer chores meant eliminating hand-coding of CRUD functions. While he said many UML/abstraction tools take stabs at this, he found OptimalJ's graphical data models can directly generate CRUD functions that comply with core J2EE patterns, "because those code patterns have been through rigorous testing, and any code that's developed relying on those code patterns can be considered tested," he said.
Another key benefit? "It certainly helps in terms of maintainability of the code. Programmers familiar with Best Practices will feel comfortable with the code that’s generated, and feel like they're 'coming home,' so to speak," Seegmiller added.
- Convert the "Data Model" to "Runnable Code"
Generating CRUD functions is all well and good, but the real boon to J2EE dev productivity, Seegmiller said, would come from a tool that eliminated the need for the architects, analysts, SQL programmers and other specialists otherwise required to get a project from design to deployment.
Here, Seegmiller also found merit in OptimalJ's implementations, especially as they were tied to the underlying J2EE technologies. "The [UML] models can be mapped to our application requirements, and in turn to our chosen underlying technology, which is J2EE, so that runnable code can be generated" directly from the model, he said. Seegmiller found Compuware engineers had taken a life-cycle approach to their support for UML and graphical data models, which gave J2EE devs more intimate integration among the model, the workflow and the underlying J2EE and SQL code needed to turn the model into a runnable application.
- Ensuring J2EE Code Portability
Seegmiller said that as J2EE app server vendors each begin to optimize their own toolsets, he has a growing concern about the portability of his J2EE code. "Using [UML] tends to ensure portability aspects of Java," Seegmiller told IDN.
"There was some skepticism here, but in our laboratory environment we demonstrated code portability across the data center, using Linux and Sun [Solaris]," Seegmiller said. Currently Seegmiller is running SNAP on Apache/Tomcat, and has a demo system running Sun's J2EE stack on Win98.
UML's Pros and Cons for Developer Jobs
At the end of his investigation, Seegmiller found some key areas where he felt abstraction/UML was ready for primetime, and some where it still needs work.
On the plus side, there were substantial productivity gains: "Before OptimalJ, I had explored what it would take for a traditional team of J2EE programmers to develop a fairly restricted sample of our application. That would have been a team of 4 people and would have taken them 4-6 weeks to do what 2 people did in a week," Seegmiller said, adding that this was "a very restricted piece." Once his team "realized how quickly that went, we extended the scope of the project to include some additional capabilities, and we got functionality out of the same 2-person team within another 2 weeks."
On the downside, OptimalJ does not have what Seegmiller would consider to be an out-of-the-box UI capability. "When we talk about use cases, I look at it as encompassing the definition and UI. As far as developing use cases, that's where OptimalJ currently stops," Seegmiller said. "To be able to take a model that you've got and, using OptimalJ, be able to create what the end user actually sees, is not what OptimalJ is doing yet."
But, on balance, Seegmiller finds the current state of UML/data modeling abstraction tools to be sufficient to offer both CxOs and the frontline developer some notable upside, "so long as the expectations and outcomes are well-described from the beginning," he added, saying, "There still is more room for improvement [in UML and data modeling tools], but companies and individual developers can certainly get started."