Getting to the root of network security
(IDG) -- What if IT managers discovered a magic way to shield e-commerce from all things illegal, such as online credit-card heists, denial-of-service attacks, Web page destruction, viruses and data thefts?
Achieving all that doesn't take a magic wand. What it does take is changing how your organization thinks about security so that the lines between security and business processes no longer exist.
It also takes an evolutionary restructuring of the security infrastructure. The goal: proactive, scalable and flexible security that can easily accommodate new applications, mergers and network changes.
"The vast majority of network plumbing gear in use today is misconfigured. We see it all the time with our clients. They bring a wire from the Internet to a switch that carries traffic to both the internal LAN and the Web server," says Stefan Jon Silverman, master technologist at Scient Corp. in San Francisco, which builds e-commerce applications for clients.
"But if you get it right -- access control lists and rigid enforcement of traffic routing -- nobody from the Web server can see into the internal machines," he says.
What do information security professionals want in this replumbed, business-enabling security model?
Already, some vendor tools and network security professionals are implementing such changes.
Security from the beginning
To minimize confusion, Ian Poynter starts with what he terms "security from the beginning." Poynter, president of security consulting firm Jerboa Inc. in Cambridge, Mass., calls this "holistic" security.
Pete van De Gohm, chief security officer at Enron Energy Services in Houston, says holistic security means showing information technology professionals, regardless of whether they specialize in security, the value of the information in their network.
Holistic security involves both network engineering and application development, Poynter says. For example, if network managers watch traffic patterns for load, why not also watch for unauthorized access? On the programming side, why not teach developers how to write code that's free of common vulnerabilities, and teach security professionals how to review this code for security problems?
But code review is typically seen as an inhibitor to business processes, adds Poynter. That's because most code review is done after the application has been developed, which results in average delays of six months, he says.
Instead, the security team should be involved in the entire life cycle of new applications. The team should sit in on development-planning meetings and calculate the security impact of an application before any code rolls off a programmer's fingertips. Then, as the application is developed -- not after -- the security team and the programming team should review code together, thus mitigating myriad risks without slowing development.
"Some 40% of the common vulnerabilities and exposures listed in the CVE database are buffer overflows," Poynter explains. (The Common Vulnerabilities and Exposures (CVE) page, link below, is a widely accepted archive of security problems found in software and hardware). "If we can train programmers to avoid buffer overflows, then we can reduce incidents by 40%."
Code review may help in the realm of home-written applications. But how do you know what security exposures might lie dormant in vendor-developed applications? You don't, say some observers.
"If proprietary software is part of your infrastructure, it's even harder to reverse-engineer the application and see what's inside it," says Yetzer-A, a hacker-in-hiding who's a system administrator at a large East Coast Internet services firm. "Essentially, you're putting security into the hands of others."
Aside from using only open-source products such as the Apache Web Server or the Linux operating system, the only solution Poynter says he sees is user revolt.
If enough large buyers say they won't buy the tool without an independent code review, vendors may also find a way to add security review into development cycles. But that will only happen if customers are willing to wait longer for product releases, something John Pescatore, an analyst at Stamford, Conn.-based Gartner Group Inc., says will never happen.
One program may hold some promise: the Common Criteria Project, part of the National Information Assurance Partnership, which is sponsored by the National Security Agency in Fort Meade, Md., and the National Institute of Standards and Technology.
In its lower-level evaluations, Common Criteria offers vendors (for a steep fee) a review of the security performance of their products and what exposures they bring to mixed network simulations.
So far, 14 products have passed Common Criteria evaluations (functional testing) for Levels 1 to 3 of the year-old program. Redwood City, Calif.-based Check Point Software Technologies Ltd.'s FireWall-1, Murray Hill, N.J.-based Lucent Technologies Inc.'s Managed Firewall and San Jose-based Cisco Systems Inc.'s Secure Pix Firewall are among the products that have passed.
These levels of evaluation don't include code review, which explains why a newly discovered exposure in these firewalls surfaced in May, when the CVE reported that some filters and firewalls, such as FireWall-1 and Santa Clara, Calif.-based Network Associates Inc.'s Gauntlet, are vulnerable to packet fragment attacks, a common form of denial-of-service attack. These attacks confuse routing so an attacker can slip past filtering.
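The fragment trick works because a stateless filter typically inspects only the first fragment of a packet. The sketch below (field names and thresholds are illustrative) flags the two classic variants: a first fragment too tiny to carry the TCP header, and a second fragment offset by eight bytes that can overwrite the ports the filter already approved.

```python
def suspicious_fragment(frag_offset: int, payload_len: int) -> bool:
    """frag_offset is in 8-byte units, as in the IP header."""
    # Tiny initial fragment: too short to hold the full 20-byte TCP header,
    # so the destination port lands in a later, uninspected fragment.
    if frag_offset == 0 and payload_len < 20:
        return True
    # Offset 1 (8 bytes in): an overlapping second fragment that can
    # rewrite the TCP ports after the filter passed the first fragment.
    if frag_offset == 1:
        return True
    return False

# suspicious_fragment(0, 8)  -> True   (tiny first fragment)
# suspicious_fragment(1, 16) -> True   (overlap rewrite)
# suspicious_fragment(5, 64) -> False  (ordinary later fragment)
```

Firewalls that reassemble fragments before filtering close this hole; the vulnerable products filtered fragment by fragment.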
For companies whose only line of defense is a perimeter firewall, that firewall becomes the single point of failure. What's more, the front-end firewall (or "firewall farm") also slows traffic, adding to security's bad rap as a business inhibitor.
To solve these problems, most information security professionals want to distribute firewalls throughout the enterprise.
Some, like Silverman, are getting rid of their front-end firewalls altogether. He has chosen to handle front-end traffic through properly configured routers, switches and a hardened operating system on the Web server itself.
Under this model, routers deny all traffic except that which must get through and bounce all packets they don't know how to route. On the Web server, Silverman doesn't mix Internet traffic with internal traffic. Instead, he uses network interface cards to send business partners to the right, Web traffic to the left and remote workers through the center. "If you remotely manage these Web sites, encrypt the [management] traffic," he says.
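Silverman's traffic segregation amounts to a per-interface policy: each network card carries exactly one class of traffic, and flows never cross interfaces. A minimal sketch (interface names, roles and ports are illustrative):

```python
# Illustrative per-interface policy for a multihomed Web server.
INTERFACE_POLICY = {
    "eth0": {"role": "public web",        "allowed_ports": {80, 443}},
    "eth1": {"role": "business partners", "allowed_ports": {443}},
    "eth2": {"role": "remote management", "allowed_ports": {22}},  # encrypted, per Silverman
}

def accept(interface: str, port: int) -> bool:
    """Deny by default: unknown interfaces and unlisted ports are dropped."""
    policy = INTERFACE_POLICY.get(interface)
    return policy is not None and port in policy["allowed_ports"]

# accept("eth0", 80) -> True
# accept("eth0", 22) -> False  (management traffic never arrives on the public NIC)
```

Because each interface admits only its own traffic class, a compromise of the public-facing path never sees the partner or management channels.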
And firewalls themselves are already moving to other critical points in the network. Hence, the new term distributed firewalls, referring to firewalls that are "independent of topology," wrote Steven Bellovin, a research director at AT&T Labs Research, in a white paper late last year.
Pescatore puts firewalls in three categories.
The first category is the mammoth front-end firewall or firewall cluster. The second is the firewall appliance -- a type of product that many companies are buying to shore up their branch offices. The third is the embedded firewall, which fits on PCs and is especially useful for remote workers connected via Digital Subscriber Lines or cable. An embedded firewall can also be embedded in a chip, a network interface card or a motherboard.
Another safeguard that blocks rogue users but lets the good guys in is the authorization server, part of what Pescatore calls "extranet access management." The authorization server is usually located somewhere at the gateway but can be distributed to other parts of the network as needed.
"I refer to these as access controls and privilege management, which are combinations of directories to store user attributes and rules engines that implement policy, authorization and entitlement," he says.
Silverman adds that, especially in business-to-business environments, access control lists, which list who is authorized to access what, are important because they allow only certain types of traffic to be sent to predetermined destinations.
Access controls in the Unix and Windows NT operating systems can't support this level of granularity, which most companies need in order to authenticate and grant differing access privileges to business partners, customers and users.
For example, as a stock trader, "you're only allowed access to research if you have $100,000 in your account," Pescatore explains. "If you start with this mechanism of business processes, entitlement and privileges, and those get implemented by security rules, you're golden because you don't have a separate business and security policy rule set."
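Pescatore's point is that the business rule *is* the security rule. A minimal rules-engine sketch, using his stock-trader example (all field names and the $100,000 threshold are taken from the example; everything else is illustrative):

```python
# Entitlement rules keyed by resource: each rule is a business-policy
# predicate over user attributes, so there is no separate security rule set.
ENTITLEMENT_RULES = {
    "research": lambda user: user["account_balance"] >= 100_000,
    "trading":  lambda user: user["role"] == "trader",
}

def authorize(user: dict, resource: str) -> bool:
    """Deny by default: unknown resources and failed predicates are refused."""
    rule = ENTITLEMENT_RULES.get(resource)
    return rule is not None and rule(user)

trader       = {"role": "trader", "account_balance": 250_000}
small_trader = {"role": "trader", "account_balance": 40_000}
# authorize(trader, "research")       -> True
# authorize(small_trader, "research") -> False
```

In a real deployment the user attributes would come from the directory and the predicates from the policy store Pescatore describes; the decision logic stays this simple.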
This level of granularity is especially important as wireless devices such as personal digital assistants and cellular phones access networks. Right now, it's difficult or impossible to authenticate and grant privileges to these devices because they have varying -- and inadequate -- authentication capabilities.
New tools are beginning to sprout up around the extranet access-management concept. For example, some intrusion-detection systems (IDS) look for violations of business-based privileges rather than following the earlier strategy of looking at "attack signatures," according to Pescatore.
If an attacker were to hijack a legitimate user account and password, an intuitive IDS would notice when that hijacked account tried to access forbidden applications and operational controls.
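That detection strategy can be sketched as an audit of account activity against an entitlement profile rather than against attack signatures. (Accounts, applications and events below are illustrative.)

```python
# Privilege-based intrusion detection: flag any access outside the
# account's entitlement profile. A hijacked account betrays itself the
# moment it reaches for an application its owner was never granted.
PROFILES = {
    "alice": {"crm", "email"},
}

def audit(events: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """events are (account, application); return the violations."""
    return [(acct, app) for acct, app in events
            if app not in PROFILES.get(acct, set())]

events = [("alice", "crm"), ("alice", "payroll_admin")]
# audit(events) -> [("alice", "payroll_admin")]
```

Note that no signature database is involved: the stolen password is valid, but the behavior is not.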
"Ultimately, IDSs will tell me who's pulling competitive information away from me vs. who's doing business with me," explains Pescatore.
These new products are far from mature, he warns. "Choose these tools by how well they integrate with other management tools, your infrastructure and your high-value business applications," he says.
As companies move to a more holistic, integrated approach to security, centralized management is the only way to glue it all together. The problem is that the handful of security management tools on the market just aren't up to the job of managing complex multivendor environments.
"Even in the security management space, you have a class of companies that address Web security, a class of companies that address single sign-ons, authentication, and perhaps a [virtual private network] or [public-key infrastructure]," says Jonathan Chinitz, vice president and general manager of IntelliSoft Corp., an authentication management services division of Vasco Data Security International Inc. "What companies need is transparent integration across different tools and different architectures."
The major network management vendors -- Tivoli Systems Inc. in Austin, Texas; Computer Associates International Inc. in Islandia, N.Y.; and BMC Software Inc. in Houston -- have also been buying technical security companies and integrating those vendors' security products into their network and systems management tool sets.
Though imperfect, network management vendors may be the ones to drive network security deeper into the infrastructure by commingling network and security management.
That's because some technologies, such as remote firewalls and authentication servers, will already be centrally managed and therefore easier to hook into such an infrastructure, Pescatore explains. But these products must integrate with a customer's security tools and business access policies, he adds.
Point solutions dead?
Randy Sandone says security managers may be able to do away with many of today's security tools once they can run truly secure operating systems.
Most applications run at the all-powerful root or admin level of the operating system because the operating system can't limit a program's privileges to just the services it needs. That's why it's so easy for executable e-mail attachments to spread viruses through e-mail programs such as Microsoft Outlook.
Sandone's company, Argus Systems Inc. in Savoy, Ill., makes a tool called PitBull that partitions privileges on Unix, Solaris and Linux operating systems so no application or no single administrator has root privileges.
"What a compartmentalized operating system does is [let] you define a compartment in which every defined process runs. Within that department, connectivity can only be established with other processes in the rule set you've predefined," says Silverman.
If attackers can't exploit an application to get to the root controls of a machine, they can't use the root to get to other applications, directories or files. "Vulnerabilities will drop significantly," says Poynter.
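The compartment model Silverman describes reduces to a connectivity rule set that never includes root. A sketch (compartment names and the rules are illustrative):

```python
# Compartmentalized connectivity: a process may only reach compartments
# its rule set names, and no rule grants everything. Compromising the
# front end therefore does not yield the whole machine.
COMPARTMENT_RULES = {
    "web_frontend": {"app_server"},   # may connect to the app tier only
    "app_server":   {"database"},
    "database":     set(),            # connects to nothing further
}

def may_connect(src: str, dst: str) -> bool:
    """Deny by default: unknown compartments get no connectivity."""
    return dst in COMPARTMENT_RULES.get(src, set())

# may_connect("web_frontend", "app_server") -> True
# may_connect("web_frontend", "database")   -> False
```

An attacker who owns the `web_frontend` compartment can reach only what its rule set names, one hop at a time, instead of inheriting root's reach across the whole system.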
But he warns not to throw the baby out with the bathwater. Security infrastructure, he says, is much more than a partitioned operating system.
"It's the 'same-old, same-old' all over again. We must rewrite the way we look at security from an infrastructure standpoint," he explains. "We need to evangelize about how security can't be bought at Sears and slapped on. We must build our infrastructures correct from the beginning. Then security really becomes the business enabler it should be."
RELATED IDG.net STORIES:
The Napster security trap
Common Vulnerabilities and Exposures (CVE) page
© 2001 Cable News Network. All Rights Reserved.|