Secrets to Successful Software Integration

As a hosted electronic data exchange company, SPS Commerce in Minneapolis, Minn., is in the business of connecting the software it creates to applications its customers use every day. When the company recently outgrew some of its own systems, it was in a unique position of living its own customers' experience.

Previously, SPS Commerce relied on transferring information manually into Lotus Notes. But managers were concerned about accuracy and timeliness, and also felt that Lotus Notes by itself wasn't an ideal platform for analyzing data. "Our decision was driven by velocity," explains Troy Benesh, SPS Marketing Services manager. "[That is] reducing the amount of set-up time, eliminating manual entry, setting strict data standards and efficiently handing off information to various groups." By integrating Lotus Notes and SPS' Oracle database with Salesforce's customer relationship management (CRM) application, SPS hoped to track users' activities, ensure follow-up tasks were performed in a timely manner and spotlight accountability from one end of the customer chain to the other. Now it was about to find out whether those hopes would be borne out.
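To picture what that hand-off looks like in practice, here is a minimal sketch, in Python, of the kind of automated sync job such an integration relies on: accounts that have not yet been handed off are read from an internal database and turned into follow-up tasks for a CRM. The table, field names and CRM endpoint are hypothetical placeholders -- not SPS' actual schema or Salesforce's API -- and an in-memory SQLite table stands in for the Oracle source.

```python
# Hypothetical illustration of an automated CRM hand-off; all names are invented.
import sqlite3
import json
from urllib import request

CRM_TASK_ENDPOINT = "https://crm.example.com/api/tasks"  # placeholder URL

def fetch_new_accounts(conn):
    """Return accounts flagged as not yet handed off to the CRM."""
    cur = conn.execute(
        "SELECT id, name, owner_email FROM accounts WHERE synced = 0"
    )
    return [{"id": r[0], "name": r[1], "owner": r[2]} for r in cur.fetchall()]

def build_followup_task(account):
    """Translate an internal account row into a CRM follow-up task payload."""
    return {
        "subject": f"Onboard {account['name']}",
        "assignee": account["owner"],
        "due_in_days": 3,          # example business rule: follow up within 3 days
        "source_record": account["id"],
    }

def push_task(task):
    """POST the task to the CRM (a no-op here, since the endpoint is a placeholder)."""
    req = request.Request(
        CRM_TASK_ENDPOINT,
        data=json.dumps(task).encode(),
        headers={"Content-Type": "application/json"},
    )
    # request.urlopen(req) would send it; left out because the URL is fake.
    print("would send:", task)

if __name__ == "__main__":
    # Stand-in for the Oracle source: an in-memory SQLite table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id, name, owner_email, synced)")
    conn.execute("INSERT INTO accounts VALUES (1, 'Acme Retail', 'rep@sps.example', 0)")
    for account in fetch_new_accounts(conn):
        push_task(build_followup_task(account))
```

In a real deployment the job would also mark each account as synced and handle authentication and retries; the point here is only that the hand-off happens without anyone rekeying data.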

Like SPS, many companies face a fundamental dilemma: They are organized vertically with each function using customized software applications and databases, but the customer experience extends throughout all business functions. "That's where software integration can play the greatest role," says Mike Childress, vice president of Applications Portfolio Development at EDS in Plano, Texas. "It cuts across those domains because that's how the real world works."

Unfortunately, in the real world, integration between disparate systems doesn't always go smoothly. With so many elements involved, some key items may get overlooked.


A Failure to Communicate
Software integration is made up of five key components, says Childress. They include the underlying architecture or tools that will be used; governance, where IT and business discuss the scope and priority of requests; the rules put in place for workflow; the behavioral or cultural aspect; and service management. IT departments tend to focus on the first three components when negotiating with integration enablers; the latter two, however, are equally important.

If you don't have a culture that wants to and is incented to leverage the organization's integrated business system, you'll reintegrate the same business functions three or four times during various projects, notes Childress. This IT version of reinventing the wheel is exactly what application integration attempts to avoid.

Service management is the other piece companies often overlook. This component dictates how a company manages its business processes, applications and underlying infrastructure so systems can run 24/7 with zero downtime. Service management also addresses when an application can be taken down for maintenance and provides a solution if a piece of the infrastructure fails.

Also, when integrating two systems, IT managers must make sure they understand the language of both the data and the event architecture. Otherwise, Childress warns, there may be a misunderstanding of basic terms like "employee."
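A toy example makes the risk concrete. In the Python sketch below -- invented field names, not any real system's schema -- the HR system's notion of "employee" includes contractors, while the payroll system's does not; an explicit mapping layer surfaces that difference instead of letting it silently corrupt the integrated data.

```python
# Hypothetical mapping layer between two systems that define "employee" differently.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HrPerson:
    person_id: str
    name: str
    worker_type: str  # "salaried", "hourly" or "contractor" in the HR system

def to_payroll_employee(person: HrPerson) -> Optional[dict]:
    """Map an HR record onto the payroll system's narrower 'employee' concept.

    Returns None for records the target system does not consider employees,
    so the terminology mismatch is handled explicitly rather than silently.
    """
    if person.worker_type == "contractor":
        return None  # contractors are invoiced, not payrolled
    return {
        "employee_id": person.person_id,
        "full_name": person.name,
        "pay_class": "EXEMPT" if person.worker_type == "salaried" else "NONEXEMPT",
    }

people = [
    HrPerson("E-100", "A. Smith", "salaried"),
    HrPerson("C-200", "B. Jones", "contractor"),
]
for p in people:
    print(p.person_id, "->", to_payroll_employee(p))
```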


Increased Connections, Enhanced Performance
Integrating software can improve business performance in several ways. The clearest benefit is the reduction of manual processes -- from the time saved on data entry to the minimization of human error.

Equally important, says Pat Backen, a software engineer at SPS Commerce, is the ability to leverage separate application strengths. "Every application has its plusses and minuses, but with integration methods, companies can fully capitalize on the strengths of each application while minimizing the downside."

Business intelligence also increases when data from separate sources and applications is integrated. An overall view of the business can be developed, allowing better data mining and reporting; the resulting synthesis can improve performance by delivering actionable information that was not available from either independent system before integration.
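As a small, invented illustration of that synthesis: the sketch below joins sale dates from a CRM with go-live dates from a provisioning system to report time-to-live per customer -- a figure neither system can produce on its own.

```python
# Hypothetical cross-system report; the customers and dates are sample data.
from datetime import date

crm_orders = [
    {"customer": "Acme", "signed": date(2007, 1, 5)},
    {"customer": "Globex", "signed": date(2007, 2, 10)},
]
provisioning = [
    {"customer": "Acme", "live": date(2007, 1, 19)},
    {"customer": "Globex", "live": date(2007, 3, 20)},
]

# Index the provisioning data so the two sources can be joined by customer.
live_by_customer = {row["customer"]: row["live"] for row in provisioning}

for order in crm_orders:
    days_to_live = (live_by_customer[order["customer"]] - order["signed"]).days
    print(f"{order['customer']}: {days_to_live} days from signature to go-live")
```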

However, some of the most striking results SPS witnessed as a result of its own systems integration have been in increased levels of customer satisfaction. "Manual data entry had the potential to be a clog in the pipeline," Backen says. "By automating this process, we have greatly reduced the time it takes for our customers to be up and running. This is a win-win-win: happy customers, happy salespeople, happy management."

The bottom line: Before undertaking a systems integration project, address the crucial components -- the enabling technology, the underlying architecture and tools, and the governance. Make sure there is a policy covering the scope and priority of requests for the data to be shared among the different business units. Last but not least, train employees to make the most of the integrated data. That's the ultimate key to improved results.

IT Survival Skills

For IT professionals, certifications in technical areas serve as standard career currency. But even the most gilt-edged diplomas may no longer guarantee advancement. To reach the upper levels of the IT organization, training time is better spent sharpening interpersonal and project management skills.

According to an IT skills research survey by IT Training Magazine, a U.K.-based publication, demand has shifted over the last five years, moving away from technical training and toward business analysis, systems design and project management skills. IT training shops have likewise shifted their offerings to match the demand for these skills.

"The days of code-writing savants that we kept in the back room are over," says John Oberlin, associate vice chancellor and interim CEO at the University of North Carolina at Chapel Hill. "We need well-rounded people with social skills and good judgment across a wide variety of situations."

IT Management is Still Management
The College of Information Technology at Georgia Southern University near Savannah, Ga., has a thriving Professional Development Center, serving companies such as Verisign, newspaper publisher Morris Multimedia and Memorial Health, a large health service organization in the area. What most of these companies want for their IT professionals is project management training, says associate dean Hans Reichgelt.

"Memorial Health was keen on communication," says Reichgelt. The Professional Development Center provided Verisign with PMI (Project Management Institute) Certification development units. Morris Multimedia wanted to improve its facility with Scrum, a flexible project management method designed for very small software development teams focusing on short-term goals. Overall, most of those being trained were already overseeing projects and were being groomed for greater management responsibilities. (article continues)


Similarly, Tom Carpenter, a longtime IT trainer and consultant for LearnKey in St. George, Utah, has found that his course, "Communication Skills for IT Professionals," struck a chord. The genesis of the course, Carpenter recalls, was the perception that information technology people are poor communicators. "I found out this was really true, and that the large majority of people working in IT had put close to 100 percent of their effort into developing their technical skills and almost no effort into developing their interpersonal skills."

The course focuses on "learning to speak the business language and communicating the business value of IT," says Carpenter, who still teaches on technical topics. Since the course was first offered in 2002, enrollment has more than doubled.

Training for Real Life
More and more organizations stress both written and verbal communication skills among their IT staff. University of North Carolina's Oberlin, for example, requires his 450 IT employees to be able to write short summary memorandums.

Training in project management, running business meetings and dispute resolution is also vital. Anthony Orr, the global best practices director for BMC Software in Houston, Texas, prepares employees for an even tougher test: communicating in the midst of chaos. Orr oversees a unique airport simulation workshop designed to give students a better understanding of ITIL (Information Technology Infrastructure Library), a best practices framework for IT. In the simulation, students set up a service desk, add technical specialists and suppliers and manage the dynamics of the environment, all while increasing the complexity of the IT infrastructure.

"They experience what chaotic behavior is like within an organization where the processes aren't aligned correctly, the technology doesn't underpin the processes and there's no communication," says Orr. (article continues)


That's when the learning begins. "People come to the simulation and think, 'Wow, this is what I do when I come to work.' The value to the business is that the greater the level of complexity, the more you're able to be efficient, effective and economical."

Orr notes that the program has been widely adopted across Europe and the Pacific Rim and is becoming increasingly important in the United States. "ITIL is all about service improvement," he says. "ITIL tries to connect business and IT overall."

Five Courses to Learn By
According to these experts, IT employees who want to progress in their careers should take courses in the following five areas:

  • Project management
  • Group leadership
  • Individual time and task management
  • Communication -- across the company, with co-workers and with customers
  • ITIL, or connecting business and IT

As software development becomes more streamlined and end users become more technically savvy, so-called "soft skills," especially project management and communication, will only grow in importance for IT professionals and their employers. IT organizations must emphasize ongoing training in these areas in order to survive, thrive and add value to their businesses.

Migration Without Mishap

Server migrations come in just about every size, shape and variety -- you can change operating systems, hardware platforms, vendors or any combination of the three. But whether you seek to increase capacity, upgrade to the latest technology or standardize on a particular hardware or software platform, managing server migrations effectively remains a significant challenge for data center managers, despite the proliferation of tools meant to enable a smooth transition.

"People have talked for years about the difficulty of moving from one desktop to another," says Bob Gill, managing director of the server sector of TheInfoPro, a consulting firm based in New York City. "But with servers, you have a whole different level of complexity. Even something as simple as migrating the files on a file or print server raises significant issues."

One of the main reasons to migrate is to cut costs, says Jeff Gould, CEO and director of research for Peerstone Research, a technology analyst firm based in San Francisco, Calif. "Over the last five years, companies have been increasingly making the transition from expensive Unix hardware with proprietary processors to commodity platforms," he says. "This means having to choose between Windows and Linux server operating systems, as well as selecting the best hardware vendor."


Experts recommend the following strategies to increase your chances of completing a successful migration:

Let the application drive the migration. According to Gill, this is the number-one rule when planning a migration. "Because the application drives the decision as to what hardware and software platform choices you have, in many ways the decision is beyond the scope of things in your control," he says. Adds Gould, "Because these computers have become commoditized, people care less and less about the operating system or underlying hardware -- what matters is the applications you use, or plan to use. Are those applications supported on a given hardware or software platform? That's the key question."

Seize the opportunity to embrace virtualization, clustering and other high-availability solutions. A server migration isn't just about continuing to do what you've always done; you can also use it to implement other, more advanced data center capabilities. For example, virtualization is the ability to share storage and processing capabilities regardless of the physical source of a resource; clustering is when multiple machines work together so that they can be viewed as a single computer. Both techniques give enterprises access to more, and more powerful, server capabilities than a straight migration from one server to another would.
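The sketch below is only a toy model of that abstraction -- not how any real hypervisor or cluster manager is implemented: callers request capacity from a pool and never need to know which physical node supplies it.

```python
# Toy illustration of resource pooling; node names and capacities are invented.
class NodePool:
    def __init__(self, nodes):
        # nodes: mapping of node name -> free capacity, in arbitrary units
        self.nodes = dict(nodes)

    @property
    def total_capacity(self):
        """The pool is presented to callers as one large resource."""
        return sum(self.nodes.values())

    def place(self, workload, needed):
        """Place a workload on whichever node currently has the most headroom."""
        node = max(self.nodes, key=self.nodes.get)
        if self.nodes[node] < needed:
            raise RuntimeError(f"no single node can host {workload}")
        self.nodes[node] -= needed
        return node

pool = NodePool({"server-a": 16, "server-b": 8, "server-c": 12})
print("pool capacity:", pool.total_capacity)        # seen as a single computer
print("db placed on:", pool.place("database", 10))  # caller never chose a node
```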

Bring IT workers quickly up to speed on the new technologies. The skill set of employees is another critical consideration when migrating servers. Many of the existing environments are Unix-based. For obvious reasons -- since Linux is a flavor of Unix -- it's easier to transition IT workers to Linux rather than Windows. "But although the learning curve is a lot less steep, the ease of use and manageability of Windows Server means that salaries for Windows administrators are less," says Gould. "These are tradeoffs that companies have to make."


Test the new environment thoroughly. "Companies often fail to test the new environment completely enough," says Chip Nickolett, president of Comprehensive Solutions, a data center consultancy based in Brookfield, Wis. "It's far better to catch and address problems pre-migration than post-migration."

Ascertain data integrity. Companies often fail to appreciate the critical nature of data integrity. Even the most minuscule floating-point differences can result in very costly computing errors. Says Nickolett, "The old system must be kept properly isolated when it is still running, before migration to the new system has been implemented."
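One way to make such a check concrete is sketched below, with invented sample rows: migrated rows are compared column by column, with a checksum over the non-numeric columns and an explicit tolerance on the numeric ones, so harmless floating-point representation noise is absorbed while real drift is flagged before cutover.

```python
# Hypothetical post-migration verification; the sample rows are invented.
import hashlib
import math

def key_digest(row, text_cols):
    """Checksum of the non-numeric columns -- a quick way to spot drift in bulk."""
    return hashlib.sha256("|".join(str(row[c]) for c in text_cols).encode()).hexdigest()

def compare_tables(old_rows, new_rows, text_cols, float_cols, tol=1e-9):
    mismatches = []
    for i, (old, new) in enumerate(zip(old_rows, new_rows)):
        if key_digest(old, text_cols) != key_digest(new, text_cols):
            mismatches.append((i, "non-numeric columns differ"))
        for c in float_cols:
            # Explicit tolerance: tiny representation noise passes, real drift fails.
            if not math.isclose(old[c], new[c], rel_tol=tol):
                mismatches.append((i, f"column {c}: {old[c]} != {new[c]}"))
    return mismatches

old = [(1, "ACME", 0.30000000000000004), (2, "Globex", 99.99)]
new = [(1, "ACME", 0.3), (2, "Globex", 99.989999)]
print(compare_tables(old, new, text_cols=[0, 1], float_cols=[2]))
```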

Rein in "power users" One often-overlooked problem is that so-called power users have frequently been allowed to create systems and processes on their own. "The more people who have been allowed to do this, the more likely there will be problems," says Nickolett. "These people are often reluctant to disclose what they have done, they usually don't possess any documentation and are not receptive to having someone take these systems away from them." Getting such users to buy into the migration, and be transparent and open about their former activities that could impact the migration, is therefore essential.

Although automated tools are making the actual migrations easier to accomplish, many strategic and organizational issues still abound. And for many organizations, the main one is achieving a certain strategic "coherence" in the data center. "Are you basically a Unix shop with a few islands of Windows servers and intend to keep that balance? Or do you envision a long-term strategy of moving over to Linux entirely?" asks Gould. "As the applications out there increasingly become available -- and interoperable -- on a number of different server platforms, these are the important questions to ask."

The Ins and Outs of SCM

Advocates of software configuration management tools and techniques tend to resort to metaphor when describing why, once adopted, IT organizations can't live without them.

By far the most frequent comparison is to the assembly line popularized by Henry Ford (but not invented by him, as is widely believed). Before that, workers built complex physical products one at a time: A single person -- or team -- created each part of a product individually and put the parts together, making customized changes to individual parts so that they would fit correctly.

Traditional software development followed this same process, and it was inefficient and prone to delays and cost overruns. Using automated SCM development tools, on the other hand, makes your development process far more predictable and reliable. Costs go down. Errors are dramatically reduced. Indeed, SCM -- like the industrial assembly line -- is transforming the way an entire industry works.

Fran Schmidt will attest to that. She was hired by Source Medical Inc. in Birmingham, Ala., three years ago to be its manager of configuration management systems. Schmidt walked into a situation where the previous software development lifecycle (SDLC) regime had been broken for more than two years.

"It was horrible," she recalls. "They had onsite development, remote development, were operating on a twenty four-by-six schedule, and none of it was in sync. Not to mention that we had extremely tight budgetary and time constraints."

Schmidt had been watching a small company called AccuRev, based in Lexington, Mass., for several years, and decided to try its SCM product. "We were able to almost immediately achieve low maintenance and low cost in our very distributed development environment," she says.

What to Look For

Over the last few years, dozens of SCM tools have been released into the market to provide options for IT development professionals who need help managing this complex process. These products support a broad and diverse range of functionality.


Here are essential capabilities that a good SCM tool will provide:

  • Enable "agile development" Using this increasingly popular incremental development methodology, IT development organizations deliver smaller amounts of code more frequently. "In effect, you deliver more, more often," says Schmidt. "You make fewer errors, and customers are more satisfied as a result." AccuRev's support of agile development is one reason Schmidt chose that product.
  • Promote parallel development A good SCM product also needs to go far beyond version-control software to support parallel rather than linear development, says Keith West, senior software engineer of configuration management at ACI Worldwide, a maker of electronic payment systems based in Omaha, Neb. "We have more than 300 developers working in parallel on 102 customer-specific lines of code. If we tried to manage that complex a development environment using just traditional tools, we'd go crazy."
  • Support your particular business processes Rather than requiring you to adapt your business processes to it, a good SCM tool will be flexible enough to adapt to the way you do things, according to West, who chose Telelogic's Synergy SCM solution for that reason. Telelogic is based in Irvine, California. Agrees Steve Beaver, chief architect of MedAvant Heathcare Solutions, in Norcross, Ga., which provides systems support services to physicians and insurance companies: "In today's IT environment, which increasingly depends on overseas development, you have to pay particular attention to supporting your SDLC processes, and SCM is the only effective way to do that." The more SDLC processes your SCM tool supports, "the easier and more efficient your software development will be," says Beaver, who uses AccuRev because of its SDLC project management capabilities.
  • Facilitate compliance with regulations such as Sarbanes-Oxley (SOX) Many companies are being forced to implement SCM by legislation, according to Tom Tyler, chief technology officer of the Go To Group, Inc., a software automation IT consulting firm based in Bellaire, Md. For example, compliance with SOX requires auditable evidence of access control and policy enforcement for any applications impacting financial statements of public companies, and would be virtually impossible without SCM. Tyler depends on Perforce's SCM to help him manage his IT development consulting activities for recruiting giant Monster.
  • Deliver quick return on investment Cost and ease of use are very important considerations when choosing an SCM product. "There are some very expensive products out there costing tens of thousands of dollars, and although they might do a lot, are very difficult to use and cost-justify," says Paul Gowan, IT development manager for Columbia Analytical Services Inc., an environmental testing firm based in Kelso, Wash. He chose Team 2 from Alexsys Corporation, which costs less than $2,200 for a 20-user license, because it was easy to get his developers rapidly up to speed and productive. "We were careful to weigh the return on investment of SCM, and cost was a big part of our decision," he says.

Indeed, you have to think of SCM in that way -- as an investment rather than an expense. "The payoff in productivity gains and reduced person-hours will far outweigh the cost of implementing it," says Tyler.
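To make the compliance point above concrete, here is a bare-bones sketch of the kind of audit record that "auditable evidence" implies: who changed what, when, and under whose approval. Commercial SCM tools capture this automatically; the structure and field names below are invented purely for illustration.

```python
# Hypothetical audit-trail record; not the schema of any real SCM product.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ChangeRecord:
    item: str            # e.g. a source file or release package
    author: str
    approved_by: str     # separation of duties: approver must differ from author
    ticket: str          # link back to the authorizing change request
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

audit_log = []

def record_change(item, author, approved_by, ticket):
    """Append an immutable, attributable entry to the audit trail."""
    if author == approved_by:
        raise ValueError("author may not approve their own change")
    entry = ChangeRecord(item, author, approved_by, ticket)
    audit_log.append(entry)
    return entry

print(record_change("billing/calc.py", "dev1", "lead2", "CR-4811"))
```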

Expansion Planning For Systems Growth

Something scary is happening in the world of computing. Even though everything is getting smaller and more powerful, we're faced with increasing demand for power to run systems, keep them cool and house them. Disk drive capacity requirements double every year, and as companies add more drives, the issues become how to cover the increased expense of running systems and keeping servers cool -- and where to house everything.

"Server consolidation and virtual servers are a major step toward improving capacity utilization of servers," says Arun Taneja, founder and consulting analyst at Taneja Group, a storage and server analyst consultancy in Hopkinton, Mass. "Every data center person should be looking at both. Companies are also turning to blade servers as another strategic way of consolidating space."

Restricted Space
Companies in dense markets such as Manhattan and London are facing serious space constraints: there physically isn't room for them to increase the size of their data centers. Many organizations are being forced to move their operations to locations with more open space.

Yet moving a data center is among the hardest processes from a planning and execution perspective. The volume of data that companies possess is daunting, and they don't have the luxury of simply turning systems off and putting them onto a truck.


"It's a very expensive proposition, even in a best case scenario," says Steve Duplessie, founder and senior analyst at The Enterprise Strategy Group, Inc., an IT consultancy based in Milford, Mass. "All of a company's applications can be dynamically migrated online to a new site, but IT must make sure everything is synched and online before doing a hard cut over and decommissioning the old site."

But Duplessie cautions that such moves never go perfectly smoothly, and that completing a move and upgrade takes an enormous amount of time -- at least a year -- and bandwidth, on top of the cost.

Starting Over
On the plus side, moving a data center enables a company to start with a clean slate. It can be an opportunity to recognize past infrastructure mistakes and to architect and configure for future needs.

Duplessie suggests that if an old data center site isn't going to go away, it can be used as a disaster recovery backup site. He believes companies are starting to recognize that assets that have depreciated are ideal for housing second-tier applications, or for disaster recovery.

"Many companies prefer to get rid of their old assets and tend to buy infrastructure for infrastructure sake," says Duplessie, "which solves one problem and creates another -- that's the nature of computing."

Another way to conserve data center space and plan for future growth, says Duplessie, is to evaluate data and determine what is mission critical and what isn't. "People are starting to realize that not all data should be treated exactly the same and not all data needs to be on the highest tech, most expensive gizmos."


"I would be willing to bet there is huge amount of data that doesn't need to be spinning around twenty-four hours a day,'' he adds. "Data's value and access patterns change over time, and because we architected it and stored it there on day one doesn't meant that's where it should be on day two-hundred."

If companies moved 75 percent of their data off their primary production storage and onto a lower-cost, lower-power, lower-performing storage area, the savings garnered could be doubled, according to experts.

"If it's not possible to grow physically we have to take a data-centric view,'' says Duplessie. "If we place everything around the life cycle and value and importance of the data, which is variable over time, and don't treat it the same, we have the opportunity to make decisions based on whatever criteria we have. It's treating everything the same that gets us in this power, cooling and space problem because it's easier to make the decision to buy more of what we have -- but it's usually strategically worse for that operation."

Over the long term, companies need to recognize that data has a life cycle and it can be staged on different servers that have different attributes around performance. That will let IT devise plans that capitalize on existing data center usage and minimize the need to add real estate to duplicate what is already in place.
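A toy version of such a lifecycle policy might look like the sketch below: data sets are classified by how recently they were accessed and assigned to progressively cheaper tiers as their value declines. The tier names, age thresholds and sample data sets are invented for illustration.

```python
# Hypothetical data-tiering policy based on last-access age; all values invented.
from datetime import date

TIERS = [
    (30, "tier-1: production SAN"),       # touched within the last 30 days
    (180, "tier-2: low-cost disk"),       # touched within the last 6 months
    (float("inf"), "tier-3: archive"),    # everything older
]

def assign_tier(last_access, today=None):
    """Return the cheapest tier whose age threshold still covers this data set."""
    age_days = ((today or date.today()) - last_access).days
    for max_age, tier in TIERS:
        if age_days <= max_age:
            return tier

datasets = {
    "orders_current": date(2007, 5, 28),
    "orders_q1": date(2007, 2, 10),
    "old_campaign_logs": date(2004, 7, 2),
}
for name, last in datasets.items():
    print(name, "->", assign_tier(last, today=date(2007, 6, 1)))
```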