The promise of good technology drives network pros to look for innovative tools that will advance the business. Yet that requires them also to wade through the sea of useless, inane and in some cases absurd technologies clouding their view and muddying their decisions.
Here we narrow the search with three early-stage technologies promising to make a difference in the enterprise in 2008. Some of the technologies have established their place in the enterprise and others are still working to gain widespread recognition, but each enhances the environment in its own way. The technologies introduce intelligence and efficiency into operations, give the business an edge with IT know-how or bring fluid security to a mercurial environment.
1. A foundation for optimized service and performance
Almost everyone knows you can’t manage an environment without first knowing what it comprises. And while in the past inventory and discovery tools worked to identify and catalog devices, systems and applications, yesterday’s technologies have no chance of keeping up with today’s complex, distributed application architectures. Add to that new IP and service-oriented architecture (SOA) applications, and the tools that relied in part on manual effort, outdated standards and human memory are toast.
Enter application dependency mapping (ADM). This new breed of discovery technology goes beyond compiling a simple inventory of components to generating a map of how the components interact and rely on each other. Network managers equipped with that information can more easily prevent performance problems from reaching users and prioritize work around the most critical applications.
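To make the concept concrete, here is a rough sketch in Python of what a dependency map boils down to: a directed graph of components that can be walked to see which business applications a failing piece of infrastructure would drag down. The component names are invented for illustration; commercial ADM tools discover this data automatically and at far greater depth.

```python
# Minimal sketch of an application dependency map: a directed graph of
# components, walked upward to find which business applications a failing
# component would affect. All names here are hypothetical.
from collections import defaultdict

# "A depends on B" stored as depends_on[A] = {B, ...}
depends_on = {
    "trading-app":   {"app-server-1", "auth-service"},
    "app-server-1":  {"db-cluster", "msg-bus"},
    "auth-service":  {"db-cluster"},
    "reporting-app": {"db-cluster"},
}

# Invert the map so we can ask "who relies on this component?"
required_by = defaultdict(set)
for component, deps in depends_on.items():
    for dep in deps:
        required_by[dep].add(component)

def impacted(component):
    """Return every component transitively affected if `component` fails."""
    affected, stack = set(), [component]
    while stack:
        node = stack.pop()
        for parent in required_by[node]:
            if parent not in affected:
                affected.add(parent)
                stack.append(parent)
    return affected

print(impacted("db-cluster"))
# {'app-server-1', 'auth-service', 'reporting-app', 'trading-app'}
```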
“One of the biggest challenges today for IT organizations is changing from a bottom-up approach, in which they piece together and manage components, to a top-down understanding of what the critical business applications are and how they can proactively manage them so as not to impact the business,” says Evelyn Hubbert, senior analyst at Forrester Research. “IT organizations still very much need to know what the bottom components are, but they need to be able to relate them to the applications that rise to the surface.”
The technology emerged several years ago and large management vendors quickly acquired innovative start-ups that provided the tools to collect inventory, configuration and relationship data on applications in distributed environments. Just to name a few: CA acquired Cendura; EMC acquired nLayers; IBM acquired Collation; HP acquired Mercury Interactive, which had acquired Appilog; and Symantec acquired Relicore. BMC Software has its own ADM technology, part of its Atrium configuration management database (CMDB). Tideway Systems remains one of the only independent ADM specialty vendors.
The industry activity can be overwhelming, Hubbert says. But in the coming year, ADM will shift from a secondary to a primary tool for companies that want proactive management of business-critical applications.
“Companies with dynamic applications, and there are few companies that don’t have such applications, cannot afford for them to go down, and ADM will become part of the infrastructure to support and maintain applications,” she says.
For customers, ADM means getting a complete picture of what they have and knowing better how to optimize it.
“The data center was basically invisible to us and we wanted to make it visible,” says Jacob Hall, vice president of platform and design for Wachovia Corporate and Investment Banking in Charlotte, N.C. After developing custom scripts to do data dependency mapping for mission-critical equities applications, Hall came across Tideway Foundation, and about a year ago became an early adopter of commercial ADM software. Tideway’s automation enabled Hall to deliver top-rate application performance while focusing his efforts on bigger projects such as a green data center initiative, architecture reviews, data center moves and disaster-recovery plans.
Tideway Foundation, a network appliance that uses agentless collection methods, gathers application configuration data from servers and other data-center devices and stores that information in a database of configuration blueprints. It also tracks configuration changes and alerts staff to potential compliance breaches.
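Tideway doesn’t publish its internals, but the blueprint-and-drift pattern the product embodies is straightforward to sketch: snapshot a configuration, fingerprint it, and flag anything that deviates from the stored baseline. The following Python is illustrative only, with invented settings, not Tideway’s implementation.

```python
# A rough sketch of blueprint-based change tracking: snapshot a server's
# configuration, diff it against the stored blueprint, and flag any drift.
# This illustrates the general pattern, not any vendor's mechanism.
import hashlib
import json

blueprint = {"java_version": "1.5.0_12", "heap_mb": 2048, "ssl_enabled": True}

def fingerprint(config):
    """Stable hash of a configuration dict, for quick change detection."""
    return hashlib.sha1(json.dumps(config, sort_keys=True).encode()).hexdigest()

def detect_drift(baseline, current):
    """Return {key: (expected, actual)} for every setting that changed."""
    keys = set(baseline) | set(current)
    return {k: (baseline.get(k), current.get(k))
            for k in keys if baseline.get(k) != current.get(k)}

observed = {"java_version": "1.5.0_12", "heap_mb": 2048, "ssl_enabled": False}

if fingerprint(observed) != fingerprint(blueprint):
    for key, (want, got) in detect_drift(blueprint, observed).items():
        print(f"ALERT: {key} changed from {want!r} to {got!r}")
# ALERT: ssl_enabled changed from True to False
```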
As sophisticated as tools such as Tideway may be, network managers need to be aware these tools can’t perform magic without some upfront work and patience. ADM is an ongoing project, which can reap many rewards, but only with the right amount of commitment, Hall says. “There aren’t any huge roadblocks,” he says. “It just takes time. You didn’t roll out an entire data center in a day so you won’t be able to easily see everything in one day. It requires a very focused effort, but no matter how you do it, you have to get it done because the information is a foundation for so many other initiatives.”
2. Business intelligence brought to real-time events
For business managers, a delay in learning why a customer didn’t receive requested merchandise, service or information isn’t acceptable. That means waiting for IT to uncover the reason why, say, a Web server failed to deliver isn’t an option.
Complex event processing (CEP) technologies promise to speed those answers and enable businesses to stay on top of the IT and application performance issues that pose customer satisfaction problems. The software combines event processing capabilities often seen in management applications with business process and intelligence tools to deliver information in real time about the state of a business application, transaction or service. Data streaming features coupled with event processing capabilities are able to integrate data from multiple sources to deliver actionable information, says James Kobielus, a principal analyst at Current Analysis.
“Its evolution has been many years in the making and CEP is a core capability that enables real-time business intelligence and business activity monitoring. It makes it possible to respond to business events in real time and alert the appropriate people and feed the event information into automated workflows,” Kobielus says. “CEP helps enable automated rules-driven responses to critical events.”
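A toy example helps show what such a rules-driven response looks like under the hood: hold a sliding window over an event stream and fire an alert when a pattern, say repeated checkout failures, crosses a threshold. The Python below is a bare-bones illustration with invented event fields; commercial engines express rules like this declaratively and at far higher volume.

```python
# A toy complex-event-processing rule: watch a stream of events, keep a
# sliding 60-second window, and fire an alert when checkout failures cross
# a threshold. Event fields are hypothetical.
from collections import deque

WINDOW_SECS, THRESHOLD = 60, 3

class FailureRule:
    def __init__(self):
        self.window = deque()  # timestamps of failures inside the window

    def on_event(self, event):
        if event["type"] == "checkout_failed":
            self.window.append(event["ts"])
        # Expire anything older than the window.
        while self.window and event["ts"] - self.window[0] > WINDOW_SECS:
            self.window.popleft()
        if len(self.window) >= THRESHOLD:
            return f"ALERT: {len(self.window)} checkout failures in {WINDOW_SECS}s"
        return None

rule = FailureRule()
stream = [
    {"ts": 0,  "type": "checkout_failed"},
    {"ts": 10, "type": "page_view"},
    {"ts": 20, "type": "checkout_failed"},
    {"ts": 45, "type": "checkout_failed"},
]
for ev in stream:
    alert = rule.on_event(ev)
    if alert:
        print(alert)  # fires on the third failure inside the window
```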
Vendors focusing on CEP include Agent Logic, Aleri, AltoSoft, Coral8, GemStone, SeeWhy, StreamBase Systems and Vhayu. And other large vendors that work in SOA, middleware or business process management (BPM) technologies have been dabbling in CEP. Such companies, Kobielus says, have been beefing up their CEP offerings with strategic acquisitions and product upgrades. For instance, IBM acquired DataMirror in September and BEA updated its WebLogic Event Server application earlier this year to better discern trends in event processes and help companies prevent performance issues or thwart threats.
For customers, CEP ties network and system components to applications and business processes in a way that enables IT to understand how all the pieces come together. “I started in the systems management space, where we used to monitor the CPU of a server or the components. We didn’t monitor the business processes, and we really couldn’t tie the two together. In a lot of cases IT didn’t even know the business processes it was supporting,” says Eric Bruner, senior manager of systems development at Sallie Mae, a financial services provider in Reston, Va.
Sallie Mae recently received an Enterprise All-Star Award from Network World for Bruner’s work in decoding the Web content and application issues plaguing the company by using CEP technology. Sallie Mae uses Coral8 software, which the vendor calls a “data-crunching engine” that can process events from network and systems management software, click streams, message buses and external applications. In the Sallie Mae configuration, TeaLeaf Technology’s CX Web application monitoring software feeds customer online-experience data to Coral8. The goal is to eliminate the lag between an event occurring and IT understanding its potential impact.
CEP is an early adopter’s technology, used in stovepipe fashion for specific applications, Kobielus says. Marketplace confusion and a lack of standards have inhibited CEP from becoming an everyday tool, he adds.
On the standards front, groups such as the Event Processing Technical Society are working on developing common languages, formats and more for CEP tools. In the meantime, enterprise IT managers today should work with their business intelligence, BPM, middleware or SOA vendors on how they can make CEP work in their environments. Enterprise users need to understand the technologies they have today that could augment or enable more advanced CEP capabilities before selecting a vendor or tool, Kobielus says.
“If users want to bring CEP in, they need to ask their vendors to what extent their CEP technology implementation would be consistent or interoperable with their current SOA stack. Without set standards, each user has to understand what they have before they bring in more tools,” he says. “It is entirely possible to create more of a mess with multiple [enterprise service buses], SOA governance tools and CEP products.”
3. Security in a dynamic environment
The advent of SOA represents many changes for IT environments, not the least of which involves security.
SOA introduces flexibility, transforming a static environment into a dynamic one that shifts to meet business needs. SOA applications are loosely coupled and reuse components scattered about an environment, a poor fit for the locked-down nature of most enterprise security technologies in use today. The impending issue of security in an SOA environment has some industry watchers looking for technologies and tools that fall into a bigger category of security-oriented architecture, or, put simply, security for SOA.
“The fundamental security challenge that SOA presents is that by abstracting IT capabilities and data as services, the security for those capabilities is at risk of being lost,” says Jason Bloomberg, principal and senior analyst at ZapThink. “As a result, SOA essentially necessitates enterprisewide identity and access management to maintain the security context for users as they interact with abstracted services — essentially allowing the right users to do what they’re supposed to do. Services must be protected from threats as well — preventing the wrong users from doing what they’re not supposed to do.”
That means instead of locking down systems, IT security executives need to learn how to attach security measures to components that operate independently in the environment. One way to provide such protection is with XML firewalls or content-level protection (such as the technology Citrix just acquired with QuickTree) and endpoint security tools, such as antivirus, network access control and intrusion-prevention tools.
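The content-level checks an XML firewall applies can be as simple as refusing oversized payloads and inline document-type declarations (a common entity-expansion attack vector) before any parsing happens. The Python sketch below illustrates the pattern; the limits and checks are arbitrary examples, not any product’s actual rule set.

```python
# A bare-bones sketch of content-level XML inspection of the kind an XML
# firewall performs: screen a message for size and document-type
# declarations before handing it to a parser. Limits are illustrative only.
import xml.etree.ElementTree as ET

MAX_BYTES = 64 * 1024  # reject anything larger than 64KB

def screen_message(raw: bytes):
    """Return the parsed root element, or raise if the payload looks hostile."""
    if len(raw) > MAX_BYTES:
        raise ValueError("payload exceeds size limit")
    if b"<!DOCTYPE" in raw or b"<!ENTITY" in raw:
        raise ValueError("inline DTDs/entities are not allowed")
    return ET.fromstring(raw)  # parse only after the cheap checks pass

safe = b"<order><sku>42</sku><qty>1</qty></order>"
print(screen_message(safe).tag)  # "order"

hostile = b'<!DOCTYPE x [<!ENTITY a "aaaa">]><x>&a;</x>'
try:
    screen_message(hostile)
except ValueError as err:
    print("blocked:", err)
```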
Another option is entitlement management, an up-and-coming extension of identity and access management technologies built on fine-grained, role-based access controls. These technologies define granular policies for access rights to applications, which could be extended into an SOA environment.
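At its core, an entitlement check replaces a binary “is this user logged in?” with “does any of this user’s roles grant this action on this resource?” evaluated on every service call. The sketch below, with hypothetical roles and resources, shows the shape of such a policy lookup.

```python
# A minimal sketch of an entitlement check: policies grant specific actions
# on specific resources to roles, and every call is evaluated against them.
# Roles, users and resources here are invented for illustration.
ENTITLEMENTS = {
    "trader":  {("quote-service", "read"), ("order-service", "submit")},
    "auditor": {("order-service", "read")},
}

USER_ROLES = {"alice": {"trader"}, "bob": {"auditor"}}

def is_entitled(user, resource, action):
    """True if any of the user's roles grants `action` on `resource`."""
    return any((resource, action) in ENTITLEMENTS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_entitled("alice", "order-service", "submit"))  # True
print(is_entitled("bob",   "order-service", "submit"))  # False
```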
“Security is becoming identity-centric, and this goes well beyond simply directories and into detailed entitlements. That is a move in the right direction,” says Andreas Antonopoulos, a founding partner at Nemertes Research. “We have to move away from the security model of the past that involved one vendor and closed systems with little ability to do anything outside of that system. In an SOA environment, you may have one primary security vendor, but that vendor can accept data from multiple sources to make security more fluid to better address subtle threats and to ensure protection across the components.”
But the issue around securing SOA environments today isn’t the lack of promising concepts; it’s the absence of standards, industry watchers say. Stand-alone SOA security vendors such as Forum Systems and Layer 7 have emerged to augment the market and the security efforts of SOA providers such as SOA Software and Software AG. Yet pure-play security vendors haven’t quite come on board with the effort, which stalls standards and prevents true security-oriented architectures from being developed, Antonopoulos says. For now, that leaves enterprises grafting their existing identity and access management and single sign-on deployments onto SOA initiatives.
“SOA and applications vendors are working more and more on security, but security vendors simply aren’t getting involved. That is a problem, because you need all your security components to be able to talk to each other. Start-ups addressing the issue of SOA security can only take it so far,” Antonopoulos says.
Essentially, security-oriented architecture technologies would be similar to SOA technologies: not one product, but several products equipped to communicate, integrate and secure the overall environment. While OASIS and other standards bodies are working on standards such as Web Services Security (WS-Security), without security vendors signing on to comply there isn’t much headway yet in terms of integrated security across an SOA environment.
“Most systems can talk SOAP, for instance. But typical security systems are deaf, dumb and blind to SOAP. Buyers need to make SOA features critical to their security buying decisions going forward,” Antonopoulos says. “2008 will bring a higher level of adoption for SOA, and that is going to put pressure on security infrastructure to adapt. Security vendors would be well served to work on the integrations and standards that would enable the exchange of information security-oriented architecture requires.”