New Application Development Tools Promise to Boost Code Quality and Reduce Downtime

To improve application code and reduce downtime, financial services firms such as Raymond James are turning to an emerging category of analytical tools to objectively measure quality and inform developers.

Nearly ubiquitous and almost always costly, application outages are the bane of most enterprises -- business halts as IT rushes to extinguish the blaze. Fortunately, the solution is on the horizon.

"Organizations are beginning to realize it's not just about what an application does," asserts Jim Duggan, research VP for Gartner. "It's about the quality of an application in terms of performance, reliability and security." But improving application quality doesn't mean simply working programmers harder, Duggan stresses. Instead, developers need tools to help them work smarter. Enter: a new breed of solutions aimed at doing just that.

Termed "software application analysis and measurement" tools, these solutions provide objective insights into the health of software code across languages, applications, platforms and layers. Correctly applied, according to the experts, these tools measure code quality -- not developer performance. "In fact, when code analysis and measurement becomes the programmer's performance appraisal, then the manager is taking a shortcut," stresses Duggan, who cautions that this approach "is wrong and, worse, incendiary."

Personnel evaluations aside, savvy financial services firms are discovering the value of application analysis. "It's certainly much cheaper to deal with application insufficiencies on the front end, rather than on the fly while your whole trading floor is sitting idle," Duggan observes.

Going Beyond Quality Assurance

St. Petersburg, Fla.-based Raymond James Financial is among the capital markets firms that are embracing application analysis and measurement. Spurred by a senior management change in mid-2010, the institution sought to go beyond standard software testing, explains Margaret Boisvert, Raymond James' senior manager of software quality. "We wanted to focus on overall quality," she says.

As Boisvert researched technology solutions, she says, she quickly discovered another department had already purchased Paris-based CAST's Application Intelligence Platform. "But we lacked sufficient training to evaluate the tool," she recalls. To gain the required expertise, Boisvert's team designed a pilot project to test five applications actively undergoing revisions. Upon running the first application through CAST in October 2010, the tool identified both code violations and significant strengths, Boisvert reports.

"Managers quickly saw the value in knowing technical debits," she says. "And programmers realized the value of objective measurements because they could take pride in the areas where their applications performed well."

As the pilot continued through December, CAST representatives conducted onsite sessions to assist with interpreting tool output, Boisvert adds. During one meeting about a certain section of code, several developers challenged a colleague's assertion that "there is only one way" to program it, she recalls. "This demonstrated the value of the tool for sparking critical conversations," Boisvert insists. "Making developers aware of other ways to accomplish tasks will improve our programming, which is an unexpected -- and important -- benefit of CAST."

Given these early successes, Raymond James committed to fully deploying CAST across its mixed Microsoft and HP NonStop environment, Boisvert relates. For most of the first quarter, she says, the firm continued to baseline the remaining pilot applications, and its development, quality and administration teams learned to navigate CAST. During the spring and summer, Boisvert adds, more applications were added to the CAST platform and, by late September, the solution was scanning 15 applications from the firm's approximately 400-strong application portfolio.

So far, the only noteworthy technology issue was integrating CAST with HP's version of COBOL. "We overcame this in partnership with the vendor," notes Boisvert. "CAST developed a pre-processor for NonStop and we fine-tuned it."

Digging Out From the Code Violation Avalanche

More challenging has been managing the sheer volume of code violations CAST typically reports, Boisvert acknowledges. Out of the box, CAST applies 1,000 best-practice rules in its analyses, according to Lev Lesokhin, VP of worldwide marketing for CAST, who says violation "avalanches" are common.

"We typically see 5,000 to 10,000 violations in an average application," says Lesokhin, who is based in CAST's North American headquarters in New York. "Unless you prioritize the platform's output, development teams get so overwhelmed that it can be paralyzing."

Raymond James' experience proved no exception. "We didn't anticipate the volume of data or the impact CAST could have," Boisvert admits. "It was a little daunting at first. Now we're evaluating the default rules to determine which ones are most appropriate for our organization."

In the process, Raymond James also discovered that its development processes lack standardization, Boisvert notes. "This started an internal discussion about defining our programming standards more clearly," she says.

Another challenge has been change management. Because software development is more art than science, measurement of any kind makes people nervous, according to Boisvert. "Fortunately, fears were quickly calmed when we explained that our intent is to give people more visibility into the quality of their code so they can make improvements themselves," she says.

To that end, the initial programming benchmark will be to simply maintain CAST baselines. "Before we begin focusing on specific areas for improvement, we'll be establishing the expectation that the quality score should not go down between implementations," Boisvert explains.
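As an illustration of that kind of "no regression" gate (a generic sketch, not Raymond James' actual process; the baseline file and the scoring scale are assumptions), a release check might look like this:

```python
import json
import sys

BASELINE_FILE = "quality_baseline.json"  # hypothetical store of per-app scores

def check_against_baseline(app: str, current_score: float) -> bool:
    """Fail the release gate if the quality score dropped since the last baseline."""
    try:
        with open(BASELINE_FILE) as f:
            baselines = json.load(f)
    except FileNotFoundError:
        baselines = {}  # first run: nothing to compare against yet
    previous = baselines.get(app)
    if previous is not None and current_score < previous:
        print(f"{app}: score fell from {previous} to {current_score} -- blocking release")
        return False
    # Score held or improved: record the new baseline for the next implementation.
    baselines[app] = current_score
    with open(BASELINE_FILE, "w") as f:
        json.dump(baselines, f, indent=2)
    return True

if __name__ == "__main__":
    app_name, score = sys.argv[1], float(sys.argv[2])
    sys.exit(0 if check_against_baseline(app_name, score) else 1)
```

Wiring a check like this into the build pipeline enforces the "quality score should not go down" expectation automatically, before any team starts targeting specific areas for improvement.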

Additionally, developers may provide input for honing CAST's defaults. "Our development teams offer feedback after their application is scanned," Boisvert says. Then management reviews the findings to decide whether, and how, to adjust CAST's rules. The feedback process also generates precisely the types of quality conversations Boisvert envisioned at the outset. "We've had some outstanding discussions about coding and architectural principles," she affirms.

Further, Raymond James already is achieving previously impossible goals. "For one application, developers addressed all critical code violations prior to the next deployment," comments Boisvert, who notes that plans are to continue to bring applications into CAST throughout 2012. "We're definitely using the information provided to improve our systems," she says.

Anne Rawland Gabriel is a technology writer and marketing communications consultant based in the Minneapolis/St. Paul metro area. Among other projects, she's a regular contributor to UBM Tech's Bank Systems & Technology, Insurance & Technology and Wall Street & Technology.
