Social Network Security

Social networking sites are designed to let people reach out to one another. As virtual communities of professionals with connections to experts, they can be valuable business resources. But the very interactions that make networking sites valuable are the same ones that can leave corporate networks vulnerable to IT security breaches.

This is becoming an issue as several virtual communities have sprung up that are geared to business executives. These include Ryze, Xing (formerly OpenBc), Ecademy, Hoover's Connect, Spoke and Vshake.

"They definitely pose a problem," says Andre Protas, a researcher at eEye Digital Security Inc., an enterprise security software and research firm in Aliso Viejo, Calif. "Most of these websites were not created with security in mind.''

Often, these communication paths bypass security measures that have been put in place to protect the enterprise, such as firewalls, IDS/IPS, personal firewalls and gateway anti-virus systems, adds Doug Howard, chief operating officer at BT Counterpane, a security firm in Chantilly, Va. "Through peer-to-peer and other technologies that bypass corporate security, you create additional risk for an enterprise."

Protas says he's seen "a proliferation of vulnerabilities" on social networking sites recently. A case in point: the worm that targeted MySpace users, changing the links on their home pages and redirecting them to phishing sites. "That was a pretty serious first punch to MySpace and to social networking sites in general," he says.


Some Sites Are Safer
At LinkedIn, which bills itself as the world's largest business network with 8.5 million users, officials say they are mindful of the potential for security breaches. Protections have been built in, they say, that give users the flexibility to decide what information they want to share and what they would prefer to keep private.

"Privacy and protected communications are key elements of LinkedIn,'' says Allen Blue, vice president of Product Strategy, at LinkedIn in Palo Alto, Calif. "We have created communication and browsing systems, which allow all participants - browsers, message-senders and recipients - to show or hide as much information about themselves as they like, and to protect private information (for instance, email addresses) until they are ready to share that information."

Vshake founder Sagi Richberg says his site was built with privacy and security protection in mind, and that it eliminates spam by acting as a proxy for visitors.

"Even after you pay to contact someone, we act on your behalf as a proxy. You don't get the person's email or telephone number; everything is done via our system and we send the email on your behalf," says Richberg in Ashland, Mass.

As an added security measure, Vshake also has what Richberg calls a "unique verification system." If a Vshake visitor decides to verify him or herself on the site with a driver's license or other identification, lending credibility to the communication, Vshake will send that person a letter containing a random, system-generated number via snail mail. The visitor then has to enter that number on the website for the verification to be accepted. "That means you are who you are," Richberg explains. "Anyone can go to any social networking site and say they're Bill Gates and claim to know people they don't really know. That's another layer we have that no one else has."

Always Employ Basic Security Measures
Protas says the easiest way for IT to prevent network vulnerabilities is simply to use software that lets administrators decide which sites people can access and which sites are blocked. "I would suggest there is no benefit for [corporate] users to be on MySpace so IT should block it. People are spending so much time on it that it's a huge productivity loss."

Howard concurs. "The recommendation is usually to not allow employees to use sites that utilize peer-to-peer communications," he says.

But short of that, since, according to Protas, "there's really no way to filter out the good MySpace from the bad MySpace," IT should make sure every computer on the network is covered by basic Internet security software providing zero-day, anti-virus, anti-phishing and spyware protection.
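
To make the blocking approach concrete, here is a minimal sketch of the kind of domain check such filtering software performs; the blocklist contents and function name are hypothetical, not taken from any product mentioned in this article.

```python
from urllib.parse import urlparse

# Hypothetical blocklist an administrator might maintain.
BLOCKED_DOMAINS = {"myspace.com", "example-social-site.com"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host is on, or under, a blocked domain."""
    host = (urlparse(url).hostname or "").lower()
    # Match the domain itself and any subdomain (e.g. profile.myspace.com).
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

if __name__ == "__main__":
    for url in ("http://profile.myspace.com/user123", "https://intranet.corp/home"):
        print(url, "->", "BLOCKED" if is_blocked(url) else "allowed")
```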

Howard also recommends that the enterprise communicate overall best practices to its employees on a regular basis. These include:

  • Never provide personal information to someone in a social network environment since you never really know who is on the other end
  • Never provide company confidential or proprietary information to someone in a social network environment
  • Never perform file sharing across a social network environment

"It's hard to draw the line to decide what to allow and what not to,'' admits Protas. "So if you're going to try and block users from these sites, make sure you protect the end points."

Implementing a Tiered Storage Strategy

The cost of storage continues to decline even as the capacity of individual drives increases. But rather than move all data to the fastest -- and most expensive -- devices, most enterprises are implementing a tiered storage strategy, in which data is assigned to devices according to its importance.

"Ten years ago, people simply installed systems that allowed them to store all information electronically and to get to it quickly," says Joe Martins, a partner with the Data Mobility Group LLC, in Nashua, N.H. "They didn't start to consider strategies that took into account cost as well as performance issues until 2000," when resolving Y2K issues brought it to their attention.

Dividing the Data
The Enterprise Strategy Group estimates that about 85 percent of the data sitting on high-end storage systems is non- or post-transactional, and that half of it is replicated copies of non-transactional data. "This is a very inefficient use of high-end real estate at best -- and a costly business adventure at worst," says Heidi Biggar, an analyst with the Enterprise Strategy Group, based in Milford, Mass. After all, primary disk storage is expensive to acquire, manage and maintain, and it drains power and cooling resources.

Best practices involve first classifying data into categories. Information deemed mission-critical and data that is frequently accessed and/or changed should be kept on primary storage systems. Data that is less urgent but still important to the business could be kept on SATA-based systems. Data that is static, or only periodically accessed, could be kept on disk-based archives. Finally, tape or optical storage would be used for deep archiving.
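
As a rough illustration of that classification logic, consider the following sketch; the tier names and the 30-day/one-year/five-year thresholds are invented assumptions, not recommendations from the analysts quoted here.

```python
from datetime import datetime, timedelta
from typing import Optional

def assign_tier(mission_critical: bool, last_access: datetime,
                now: Optional[datetime] = None) -> str:
    """Map a data set to a storage tier following the rules above.

    The 30-day/365-day/5-year thresholds are illustrative assumptions only.
    """
    age = (now or datetime.now()) - last_access
    if mission_critical or age <= timedelta(days=30):
        return "tier1-primary"        # mission-critical or frequently accessed
    if age <= timedelta(days=365):
        return "tier2-sata"           # less urgent, still business-relevant
    if age <= timedelta(days=5 * 365):
        return "tier3-disk-archive"   # static or only periodically accessed
    return "tier4-tape-or-optical"    # deep archive

print(assign_tier(False, last_access=datetime(2006, 1, 1),
                  now=datetime(2007, 1, 1)))  # -> "tier2-sata"
```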

Five Storage Strategies
How can you effectively implement a tiered storage system? Here's a five-step plan:

Determine your goals What do you hope to achieve from moving to tiered storage? Reduce costs? Improve performance? Achieve better service levels? Some combination of the above? "Some enterprises are extremely cost-sensitive and are looking to cut storage-related expenses," says Greg Schulz, founder of the StorageIO Group, in Stillwater, Minn. "Others put performance first, as it can directly impact their ability to serve customers. Before you do anything, you have to establish your priorities."

Establish recovery point objectives and recovery time objectives for all data types Before you can determine what data belongs on what kind of storage device, you must decide your recovery point objectives (RPO) -- the point in time to which data must be restored to ensure smooth recovery of business processes that the data supports -- as well as your recovery time objectives (RTO), the maximum amount of time that you can afford for systems to be down after a failure. Without both an RPO and an RTO, you will not be able to implement a viable storage tier strategy. It's not good enough to define one or the other; you have to specify both.
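
One way to capture the outcome of that exercise is a simple table of objectives per data type; in the sketch below, every data type and target value is a hypothetical example.

```python
# Hypothetical RPO/RTO targets, in hours, per data type.
# RPO: maximum tolerable data loss; RTO: maximum tolerable downtime.
OBJECTIVES = {
    "order-transactions": {"rpo_hours": 0.25, "rto_hours": 1},
    "email":              {"rpo_hours": 4,    "rto_hours": 8},
    "file-shares":        {"rpo_hours": 24,   "rto_hours": 24},
}

def meets_objectives(data_type: str, data_loss_h: float, downtime_h: float) -> bool:
    """Check a recovery outcome against both objectives; both must hold."""
    o = OBJECTIVES[data_type]
    return data_loss_h <= o["rpo_hours"] and downtime_h <= o["rto_hours"]

print(meets_objectives("email", data_loss_h=2, downtime_h=10))  # False: RTO missed
```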


Inventory existing storage resources Chances are good that you already possess a number of the storage devices you need to implement a tiered strategy with a minimal investment in additional equipment and software. Find out exactly what you already have installed that can be repurposed into a tiered strategy.

Consider how to best manage the increased complexity Moving to a tiered system always adds complexity to data center operations, but especially so if you switch from a single-vendor to a multi-vendor storage environment. "Make sure you have the software tools, as well as the personnel, to manage this cost-effectively," says Mike Karp, a senior analyst and head of the storage practice at Enterprise Management Associates, in Boulder, Colo. "You don't want to gain efficiency in one area only to lose it in another."

Approach the process in phases Rather than implementing it all at once, it's best to start with one application and two data tiers, and proceed gradually from there. "This makes the process much more manageable -- and there are still immediate and sizable benefits," says Biggar.

All experts agree: Don't write that P.O. for the new drive until you complete your upfront work. "There's so much more involved than simply deciding what product to buy," says Karp.

Establishing a tiered storage system involves a lot of upfront work and investment in time, hardware and software. There's no magic bullet for making it a success, warns Martins. But, he quickly adds, "It pays off in spades in the long run."

Seven Best Practices for Managing Data Storage

Despite the strategic value of data, many enterprises have yet to establish robust storage management strategies that keep important data readily available, the cost of managing it low and its access safe and secure. Given the huge proportion of corporate data that resides in unstructured or semi-structured form -- such as documents, spreadsheets and e-mails -- this is a daunting challenge.

"Data both grows very fast and ages very fast," says Mike Noordyke, president of the Trivalent Group, a consulting firm based in Grand Rapids, Mich. "You need to put strategies in place that allow you to monitor and manage it effectively, or it will impact your firm's ability to function properly."

Data storage experts recommend the following seven practices to help enterprises, both large and small, stay on top of their data storage challenges:

Establish recovery time objectives and put technology and processes in place to enforce them Recovery time objectives (aka RTOs) specify the maximum time allowed between a system crash and when data is restored. "Backup is one thing, but restoring is another," says Noordyke. "You can have the best backup mechanism in the world in place, but if you can't make your data accessible again in a timely manner, your business can take a real hit."
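
One way to keep an RTO honest is to time periodic restore drills against it. The sketch below illustrates the idea; the four-hour objective and the restore_from_backup placeholder are invented stand-ins for whatever backup tooling is actually in place.

```python
import time

RTO_SECONDS = 4 * 60 * 60  # hypothetical 4-hour recovery time objective

def restore_from_backup() -> None:
    """Placeholder for the real restore procedure (vendor tools, scripts...)."""
    time.sleep(1)  # simulate work in this sketch

start = time.monotonic()
restore_from_backup()
elapsed = time.monotonic() - start

if elapsed > RTO_SECONDS:
    print(f"RTO MISSED: restore took {elapsed:.0f}s, objective is {RTO_SECONDS}s")
else:
    print(f"RTO met with {RTO_SECONDS - elapsed:.0f}s to spare")
```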

Classify data into "tiers" By assigning value to data throughout its lifecycle, from  the moment it is created until it is destroyed, organizations can begin to establish policies on when to move data from the most accessible -- and expensive -- disk storage to media that is more difficult to access, but which is (article continues)


substantially cheaper. But it's not enough to merely classify data according to "old" or "new" says Mike Karp, a senior analyst and head of the storage practice at Enterprise Management Associates, in Boulder, Colo. "You have to look at the underlying value, rather than assuming old data is less important than newer information." Concurrently, it's important to think not just in terms of managing capacity, but of performance, says Kris Domich, principal consultant for Data Center and Storage Solutions at Dimension Data, a $3.1 billion IT services firm based in New York. "Implementing monitoring and management tools is a must, so you can collect and see trends in performance as well as space usage over time."
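
The monitoring Domich describes can start very simply: sample used capacity over time and project the growth trend. A minimal sketch, using invented sample data:

```python
from datetime import date

# Hypothetical monthly samples of used capacity, in GB, for one array.
samples = [
    (date(2007, 1, 1), 18_000),
    (date(2007, 2, 1), 19_200),
    (date(2007, 3, 1), 20_500),
]

# Average daily growth across the sampling window.
(d0, g0), (d1, g1) = samples[0], samples[-1]
per_day = (g1 - g0) / (d1 - d0).days

capacity_gb = 25_000  # total usable capacity of the array (invented)
days_left = (capacity_gb - g1) / per_day
print(f"Growing ~{per_day:.0f} GB/day; ~{days_left:.0f} days until the array is full")
```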

Talk to actual users To help classify data into these different "value" tiers, it's critical to keep users in the loop, says Scott Robinson, chief technology officer at Datalink, a Chanhassen, Minn., enterprise storage integrator. "There's no magic bullet; you have to go out and interview business data owners, and gain a first-hand knowledge of the needs of each business unit and how they prioritize their data," he says.

Understand the true cost of storage "Organizations must also understand the actual cost per gigabyte of storing data," says Dan Mack, a principal consultant with Glass House Technologies, an enterprise storage consulting firm based in Framingham, Mass. Firms need to include everything in this calculation: hardware, software, maintenance, employee time, service and support fees from external parties and -- last but not least -- utility costs. "A huge part of the IT department's budget is storage. They need this cost information to make all sorts of decisions, including calculating the true cost of proposed IT projects," he says.
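
As a toy version of that fully loaded calculation, the sketch below sums the cost categories Mack lists and divides by usable capacity; every figure is invented for the example.

```python
# Hypothetical annual costs for one storage array (all figures invented).
costs = {
    "hardware_amortized": 120_000,
    "software_licenses":   30_000,
    "maintenance":         25_000,
    "staff_time":          60_000,
    "external_support":    15_000,
    "power_and_cooling":   20_000,
}
usable_gb = 50_000  # usable capacity of the array in gigabytes

total = sum(costs.values())
print(f"Fully loaded cost: ${total:,}/year -> ${total / usable_gb:.2f} per GB per year")
```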


Limit access to need-to-know data In general, companies make data available to a lot more people than actually need it. "'Deny all' should be the default, and organizations should only give access to those people who really need it," says Domich. "And you have to audit those access rights on a regular basis."
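
In code terms, a "deny all" default reduces to checking an explicit allow list and refusing everything else. A minimal sketch, with hypothetical users and data sets:

```python
# Hypothetical allow list: data set -> users explicitly granted access.
ACCESS = {
    "payroll": {"alice"},
    "sales-forecast": {"bob", "carol"},
}

def can_read(user: str, dataset: str) -> bool:
    """Default deny: access is granted only if explicitly listed."""
    return user in ACCESS.get(dataset, set())

print(can_read("alice", "payroll"))         # True: explicitly granted
print(can_read("alice", "sales-forecast"))  # False: denied by default
```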

Safely destroy, as well as protect, critical data "Completely erasing data is just as important as other aspects of data security," says Greg Schulz, founder of the StorageIO Group, a storage analyst firm based in Stillwater, Minn. "This means not simply discarding old magnetic tape, disk drives, laptops and desktop systems, but destroying the data residing on them."
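
As a bare illustration of the difference between deleting and destroying, the sketch below overwrites a file's contents before removing it. The caveat matters: on SSDs, journaling file systems and RAID arrays, overwriting in software offers no real guarantee, so treat this strictly as an illustration of the principle Schulz describes.

```python
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Illustration only: overwrite a file's bytes, then remove it.

    This does NOT reliably erase data on SSDs, journaling file systems
    or RAID; real destruction needs certified tools or physical means.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to the device
    os.remove(path)

if __name__ == "__main__":
    with open("scratch.bin", "wb") as f:
        f.write(b"sensitive bytes")
    overwrite_and_delete("scratch.bin")
```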

Discriminate between backup and archiving "The growth of data -- whether in e-mails, IMs, database records, or transactions -- is pretty alarming," says Mike Kahn, managing director of The Clipper Group, in Wellesley, Mass. "You need to know, legally, what you need to keep and what simply needs to be archived versus what is needed for backup and recovery purposes." Adds Lauren Whitehouse, an analyst with the Enterprise Strategy Group, an enterprise consulting firm based in Milford, Mass.: "Given the increasing emphasis on compliance and privacy, companies need to be very vigilant to make sure that data is secured as well as readily available when required."

In the end, companies need to have an "information perspective" on all aspects of their data storage, says Joe Martins, a partner at the Data Mobility Group LLC, a Nashua, N.H.-based storage analyst group. "They need to think of storage less as bits and bytes and put it in a business context. This is what makes effective storage management capabilities so important."

The Heat Is On

A strange thing happened when St. Louis, Mo.-based Sisters of Mercy Health System began populating its data center with new servers. "We ran out of power capacity," says Bill Hodges, director of the data center at Mercy, the ninth-largest Catholic healthcare system in the country. Mercy found that the servers were smaller and denser and took up less space. But on the flip side, they required more power and cooling, as well as a bigger generator to handle the increased load.

Then the domino effect kicked in. "Now I have to increase the capacity of the infrastructure that supports the data center," says Hodges. "You fix one thing, but then there are four other areas you have to upgrade to leverage what the technology changes are allowing you to do."

With the trend toward smaller servers and blade servers, enterprises can now get more hardware into a single rack than ever before. But that improvement is a double-edged sword, because packing servers more densely means greater power consumption per rack. And if companies don't upgrade to accommodate the new equipment, they'll find themselves able to use only a fraction of their data center's floor space.

"We're seeing increased density in data centers because more CPUs are being packed into a unit of volume,'' explains Dan Golding, a vice president and senior analyst with Tier1 Research in New York City. "Each chassis is taking up a tremendous amount of power to do its computing. One of the laws of engineering is if you take in a lot of power to do computation, something has to be done with that power, which is turned into heat eventually." (article continues)


Power, Power Everywhere
And it's not just the processors that are getting more powerful -- the same thing is happening to storage devices and Ethernet switches, which take more power to handle more bandwidth, Golding adds. There is more capacity in data centers "to the tune of fifty times the processing power, and a hundred times the storage and networking capacity in a single cabinet. And that means you're using much more power and that's generating much more heat," Golding says. At the same time, because generators and air conditioning units are only so big, data centers are running out of power.

According to research firm International Data Corp., 40 percent of data center end users report that power demand is greater than the supply. What's the answer? Ideally, says Golding, outsourcing or building a better data center designed with more power and greater cooling capacity per square foot. "Designers and electrical and mechanical engineers are building in, for large enterprises, two to three times the cooling and power capacity that presently exists," he says.

"The goal is to guarantee the inlet temperature to the IT equipment so the fans are always pulling in the same temperature as the air,'' says Kevin Dunlap, director of business strategies, Cooling Group, at American Power Consumption Corp (APCC) in St. Louis, Mo. "The easiest way for us to guarantee that temperature is to remove heat from the back of the server and not give it a chance to mix with the air in the rest of room."

It's more efficient to cool at the row level, he says, because when air is blown from a source that's much further away, the air has to be cooled down to a much lower temperature, which takes more energy.

Dunlap says that for energy-saving reasons, the new servers "pull back" when they're not being asked to do a lot of computing. But the cooling system has to be able to respond quickly, and the power has to be there to support the equipment when it springs to life again.

"As computing loads moves around the data center, the power and cooling has to move around data center to mirror that compute load,'' he adds. "That's the next challenge we're facing." (article continues)


Plan to Scale Based on Demand
When planning the layout of a data center, Dunlap recommends figuring out how much capacity is needed today, populating the center for current needs, and matching energy consumption rack by rack. Then, as racks are added, cooling units can be added too, scaling as computing needs grow so that capacity always matches demand.
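
A rough sketch of that rack-by-rack matching, with invented numbers: compare each row's expected electrical draw against the cooling provisioned for it, and flag shortfalls before new racks are added.

```python
# Hypothetical per-rack power draw (kW) and provisioned cooling per row (kW).
racks = {"row1": [6, 8, 12], "row2": [18, 20]}
cooling_capacity_kw = {"row1": 30, "row2": 35}

for row, draws in racks.items():
    load = sum(draws)  # heat load to remove ~= electrical load
    capacity = cooling_capacity_kw[row]
    status = "OK" if load <= capacity else "SHORTFALL: add cooling before adding racks"
    print(f"{row}: load {load} kW vs cooling {capacity} kW -> {status}")
```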

Hodges says APC's hardware enabled Mercy to tap into the building's power supply and redirect capacity that wasn't being used, which extended the life of its existing data center. On top of that, APC's components are modular and can be moved when Mercy ultimately builds its new data center five years down the road.

Since most enterprises have a three- to seven-year equipment replacement cycle, experts suggest doing a usage inventory and then ensuring that power is supplied only to the racks that are being used. "We've seen a shift from cooling the room in general -- looking at the room as one large heat source and trying to cool it with a big air conditioning system -- to targeted cooling solutions where each individual row or rack has its own cooling," says Dunlap. "So it's much more one-to-one."

Will Alternatives to Microsoft Become Mainstream?

Is the Microsoft monolith beginning to crumble? Alternative applications for everyday office tasks like word processing, e-mail, spreadsheets and presentations have been available for years, but their small size and limited resources relegated them to the sidelines.

Now, however, the software giant is facing challenges from rivals of similar size. "IBM and Google have begun to play together," says Kyle McNabb, principal analyst at Forrester in Cambridge, Mass. "If they can work past their differences, then you have something credible and Microsoft should start shaking in its boots."

Sneaking in the Back Door
IBM, Sun, Google, Yahoo and ThinkFree are the leading contenders jostling to gain a substantial foothold on Microsoft's turf in the enterprise space. Their potential is promising, but at the moment their plays are tenuous at best.

"We've talked to over 200 enterprises in the last 18 months and we're seeing a lot of playing around with these alternatives -- usually in IT groups and rogues throughout the enterprise that typically use them without corporate sanctions --but we're not seeing very much actual adoption," adds McNabb. "Where we are seeing adoption in enterprise, it's usually of Sun's StarOffice."(article continues)


Not everyone agrees with McNabb. Vaughan Woods, CEO of Maverick Asia Pte Ltd, a boutique consultancy out of Singapore, sees one solid reason, at least, to consider the contenders' claims. "The reason enterprises look at these products is the high cost of Microsoft."  

A Different Lever
While cost is always a factor in any enterprise decision, it still isn't providing the momentum for change one might expect. That forces the alternative vendors to take a different approach to encouraging wide-scale adoption.

"Adoption happens outside of work," says McNabb. "Google and Yahoo in particular are wooing home adopters, hoping to become so familiar and useful that employees will push the enterprise to adopt the programs as well. That strategy does have legs. After all, that's exactly how Microsoft, particularly Windows, pushed its way into the enterprise."

Microsoft did indeed slide in the back door to invade the enterprise space, but that pervasiveness brought much-needed standardization as well. The threat of the chaos implicit in non-standardized applications has strengthened the ties to the Microsoft mother ship.

"If you review the process of convergence with the MS Office de facto standard in the 1980s, you will get a template for this process," says David McNab, president of Objective Business Services, a boutique consulting firm based in Markham, Ontario. "Office productivity was only realized once we could share information across desktops and networks, and to do that we needed a standard which led to Microsoft's dominance. The next step in evolution is that the Office standards become commoditized -- e.g., open source alternatives abound -- and that's what you are starting to see now."(article continues)


Culture Dictates the Play
Some IT analysts believe the pace of change is governed more by culture than by business principles or cost concerns. They cite an intriguing disparity in alternative uptakes in various countries.

"For a while now, I've noticed that U.S. businesses are quite inflexible when it comes to adopting new technology. There seems to be a particular aversion to using this kind of third-party application for managing data," says Paul C. Williams, senior software engineer at LexisNexis Examen in Sacramento, Calif.

Conversely, he adds, "Businesspeople in developing countries, particularly in India and China, but also in the Eastern Bloc, are quite open to using these new tools. I would expect that trend to continue with corporate adoption of Internet-based applications."

His prediction: Managers in developing countries with large groups of people working collaboratively will tend to use the network-based tools, while the "rugged individualists" in the U.S. will continue to insist on stand-alone applications.

Which Will Switch?
The status quo will always have plenty of supporters, if not outright fans. "I don't expect many businesses to switch to online alternatives in the near future -- just because it is a switch to be made," says Jan Doornaert, project manager at Radar Automation N.V. in Belgium. "Too many vested interests, or 'it works just fine now, why change?' mindsets take over."

But at the end of the day, the needs of the global marketplace may override all other concerns.

"I suspect some changeovers may be financially motivated" says Williams. "But ultimately, due to the increasing need for document control and collaboration, some collaborative platform -- be it Google Docs, Sharepoint or something all together different -- will become adopted throughout businesses."