As we bring our Information Overload summit to a close, we have decided to name and rank the biggest culprits for the overload, the issues which more than anything else are causing companies to drown in a deluge of data.
Some of these problems have been around since the dawn of the computer age. Others are new, brought about by new technologies and different ways of working and servicing IT infrastructure. Nevertheless they all cut into the IT manager's time.
Honourable mention: Web management
Iain Thomson: Watching what people are doing on the internet is one of those tasks that IT managers are increasingly being landed with, but I've yet to meet one who likes the job.
Managers are increasingly overloaded these days, and the prevailing view is that they have more than enough on their plate without playing censor to an entire company. Yes, if someone is spending all their time looking at porn on the internet that's an issue for the company, but it's a problem for management, not for IT.
If a company is that worried about web management then they should hire the services of someone like Websense to do the job for them, not force stretched IT departments to take up the role. The only time the IT department should get involved is after a complaint – either from someone on the floor who's spotted what's going on or from a manager who's concerned about lost productivity.
Shaun Nichols: The task of monitoring and managing web access has only become more difficult as interest in new web services has grown. Sites such as Twitter and Facebook are no longer purely for consumers: many companies are making use of them for promotion and customer relations.
This means that simply blocking everyone off from these services is no longer possible, as they have become work tools.
At the same time, more and more new sites are popping up, with blogging platforms, social networks and casual gaming portals emerging every day, making it far more difficult to keep up with what can and can't be blocked.
Then on top of it all, there's the ever-growing ranks of malware infections and phishing scams connected to web applications and tools, making the risk of security breaches through the browser stronger than ever.
As such, the task of web management at the corporate level is becoming both more complex and crucial at a most inopportune time.
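One way to picture why blanket blocking no longer works is a filter that combines category blocking with a per-site allowlist for sanctioned work tools. The sketch below is purely illustrative – the categories, site names and logic are hypothetical, not the API of any real filtering product:

```python
# Hypothetical category-based web filter with a per-site allowlist.
# Social networking is blocked as a category, but sanctioned work
# tools are let through individually.
BLOCKED_CATEGORIES = {"social", "gaming", "file-sharing"}

# Sites the business relies on even though their category is blocked.
WORK_ALLOWLIST = {"twitter.com", "facebook.com"}

# Toy category database; real products maintain these lists constantly.
SITE_CATEGORIES = {
    "twitter.com": "social",
    "facebook.com": "social",
    "myspace.example": "social",
    "poker.example": "gaming",
}

def is_allowed(host: str) -> bool:
    """Allowlisted work tools always pass; otherwise block by category."""
    if host in WORK_ALLOWLIST:
        return True
    category = SITE_CATEGORIES.get(host, "uncategorised")
    return category not in BLOCKED_CATEGORIES
```

The hard part in practice is not this logic but keeping the category database current as new sites appear, which is exactly the keeping-up problem described above.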
Honourable mention: Integration of Web 2.0 tools
Shaun Nichols: It's one thing to have to deal with cloud computing, taking existing processes and applications online. It's another headache entirely when you're asked to find completely new uses for web tools.
We've all known at least one or two bosses and executives that love to throw about the latest buzzwords and demand that everyone adopt the latest business crazes, even if nobody is completely sure why they are doing so. Blogs, wikis and social networks are increasingly popular for companies as internal tools, and their implementation can be quite a task for IT staffs, particularly when nobody is quite sure how they will be used.
The only reason that we haven't placed this issue higher on the list is because it isn't really IT's problem. Yes, setting up and managing those services takes a bit of time and effort, but the real issue is how those services will be used, and that is mainly the concern of executives, managers and end users.
Web 2.0 tools can be very valuable to a company, but they are only useful when implemented correctly and used to improve communication and collaboration. Really, it's far more a human issue than a technological one.
Iain Thomson: The growth of Web 2.0 has caused some additional headaches for IT managers, but it's not as bad as it could have been.
Because much of this content is user generated, the demands on the IT manager's time aren't too onerous. It's setting up the systems in the first place that's the real time sink.
A lot of IT managers have also been rather smart about how they deploy such systems. Increasingly they will set them up but, in the spirit of user generated content, tap the users themselves to police and edit the information. It's a smart move, and a logical one.
10. Cloud integration
Iain Thomson: In many ways cloud is nothing more than a fashionable term for client/server computing, but no matter – it's this year's thing, and as such there's strong pressure on IT managers to get into cloud services.
While cloud computing offers many advantages, it increasingly looks like firms are better off hiring third parties to set up and run a cloud infrastructure. EMC is currently working with Intel on a do-it-yourself cloud system, and Amazon is involved too. But building a cloud system from scratch is still an enormous responsibility.
It's also something that shouldn't be rushed into. Some board members don't seem to get this. A cloud system is incredibly complicated to set up and operate and the consequences if it all goes wrong are huge.
Shaun Nichols: One of the biggest problems of cloud integration is that not everything goes into the cloud. As a result, companies are left with a mixture of cloud-based services and locally stored applications.
This presents several headaches, the first of which is integration. How do you make your cloud applications compatible with the rest of your software, and how do you make sure that everyone is on the same page?
Then there's the management issue. Instead of simply having to manage who has access to applications and accounts on the local network, administrators now also have to keep track of online identities and access to web based services.
9. Internal/external data breaches
Shaun Nichols: IT has enough to worry about these days, and adding new security worries only adds to the problem.
Companies should already have policies and protections in place to deal with security and data breaches, but the growing piles of data only make it harder. As new storage systems go live and archives expand, the task of managing and tracking access only gets harder, and sometimes files and users can slip through the cracks.
Then of course there's the worry that not only are files left unencrypted and drives unaccounted for, there's also the possibility that people purposely decide to steal data and destroy systems. With more data than ever and fewer people to manage it, the chances of a disgruntled employee causing damage to a system only increase.
Iain Thomson: As we're seeing, the greatest threat to a company's data is not the spooky external hacker but the enemy within.
The insider problem is something that IT managers are only just getting to grips with. The biggest threat is still the clueless user – the idiot who decides to set up their own Wi-Fi point and forgets to lock it, the user who clicks on an unidentified attachment or the half-wit who sets their password as Passw0rd.
But there is also the problem of the wilful thief. This can either be the employee who is leaving for another job with a competitor and is sweetening the deal by bringing over corporate data or someone with a grudge who wants to cause harm. With more and more people getting laid off it's this scenario that is increasingly a concern.
8. OS migration
Iain Thomson: Shaun and I argued over this one, with Shaun thinking it should barely have made an honourable mention. But with the launch of Windows 7 it's higher up the priority list than it has been in the past.
Windows 7 is going to make operating system migration a much bigger deal than it has been in the past. Most companies have steered clear of upgrading from XP to Windows Vista because of the failings of that operating system. Instead XP, which is stable, has been left to rule the roost.
However, upgrading these systems to Windows 7 is going to be a major headache. Moving from XP to Windows 7 requires a full system wipe, and that spells a lot of trouble. I suspect IT managers are simply going to suggest a full hardware upgrade instead.
For those corporates not on Windows the problem is smaller. If you're running Linux the upgrade path is more straightforward, though still not painless. If Apple is the company's platform of choice the job is also less difficult, but such companies make up a tiny fraction of the market.
Shaun Nichols: Last week we noted that the operating system was becoming less and less relevant to the actual practice of computing. And while I maintain that belief, I must also concede that it's still a huge issue, particularly in times of transition, such as what we are now in with the move to Windows 7.
This latest transition could be especially tough for the many companies who opted not to move to Windows Vista. For those companies, there is the unenviable task of taking stock of which machines are capable of running Windows 7 and which will need to be upgraded or replaced.
If you decide to switch to Linux or OS X, you have other headaches to deal with, such as making sure you have the same applications or at least new ones which can handle the old files. While operating systems have become far more compatible in recent years, there are still big problems to deal with.
7. Patch deployment
Shaun Nichols: This is a given to anyone who has spent time in enterprise IT. With more users, more workstations and more software to manage, the process of installing patches and fixes only becomes more difficult.
The addition of virtualised machines and servers only makes things more complicated, particularly as malware loads increase and exploits become increasingly common and dangerous.
One bit of relief has come from the vendors. Companies such as Microsoft, Adobe and Oracle have begun issuing regular, scheduled updates rather than individual fixes for each bug. This allows administrators to set a date and plan ahead for testing and deployment of patches. The downside is that machines can be left vulnerable for longer, but for most the trade-off is more than welcome.
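Microsoft's scheduled cycle, "Patch Tuesday", falls on the second Tuesday of each month, which makes the planning date easy to compute in advance. A small sketch:

```python
from datetime import date, timedelta

def patch_tuesday(year: int, month: int) -> date:
    """Return the second Tuesday of the month (Microsoft's scheduled
    update day), so testing and deployment can be planned around it."""
    first = date(year, month, 1)
    # Days until the first Tuesday (Monday is weekday 0, Tuesday is 1),
    # then one more week to reach the second Tuesday.
    offset = (1 - first.weekday()) % 7
    return first + timedelta(days=offset + 7)

# Windows 7's launch month, for example:
print(patch_tuesday(2009, 10))  # 2009-10-13
```

An administrator can schedule test deployments a fixed number of days after this date, which is precisely the predictability the vendors' move to regular cycles was meant to provide.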
Iain Thomson: Patching has certainly improved, and the lot of the IT manager has got easier. But not so fast, Shaun: much of this easing has come from a change in malware writing rather than any great effort from application and operating system vendors.
In the good old days, when malware writers were simply maladjusted amateurs, computer networks were beset by worms whose job was to spread as fast as possible and provide the author with bragging rights. In such circumstances, when a worm hit, the IT manager had to drop everything and patch systems as soon as possible.
But these days the opposite is true. Malware writers want to get in under the radar and steal all that valuable information without being recognised. Patching is still essential, but the need for it is less visible and I fear this may be fostering a dangerous sense of complacency, particularly given the speed with which patches are reverse engineered.
6. Remote workers
Iain Thomson: It's difficult to dispute the value of home working in most cases. People working from home are generally more productive, happier and healthier than their office brethren. I can say this with some confidence since both Shaun and I are writing this in the comfort of our own homes and communicating electronically.
But home workers are not always the friend of the IT manager. If the worker is using their own PC at home then it is an unknown quantity, and the IT manager can't control the security settings on the remote worker's computer. The one time (that we know about) Microsoft lost source code, for example, came about because a home worker picked up an infection and let hackers into the network.
The other problem comes when staff are working overseas on business trips. When you enter the US, and many other countries, the government retains the right to take a copy of the hard drive of any computer crossing the border, and that can be a security nightmare.
The simplest way to handle this is to issue company hardware to remote workers. For those working from home this ensures that security standards are kept. For those travelling a blank laptop can be issued and then confidential data can be sent via VPN once the traveller has cleared customs.
Shaun Nichols: Iain, we are lucky in that we were working from a branch office to start with. Since even when we are in San Francisco we're remotely accessing systems based in London, telecommuting is pretty much a non-issue. It's also a nice snapshot of just how much we take for granted the work behind setting things up for remote workers.
For IT staff, this sort of thing can be a major headache, as evidenced by the number of companies which specialise in setting up and managing network access for telecommuters. Aside from the headaches of leaving the network open to outside connections, there's also the matter of access controls and oversight of what information is being accessed and stored.
As Iain noted, one good way to solve this is to simply issue employees with hardware for home use, but not every company has an extra notebook to hand out, and sometimes employees will simply insist on connecting with their own machines. Either way, you're left with more machines to manage and more traffic to worry about.
5. Compliance
Shaun Nichols: Whether it's Sarbanes-Oxley, HIPAA or any of the other regulatory acts, more and more firms are being burdened with compliance regulations. Dictating everything from access policies to the use of encryption, local and federal laws are making file protection and management mandatory.
This is a big enough issue on its own, but when combined with the increasing amounts of data and stricter financial pressures, ensuring that everything is in compliance can become a huge task.
What's even more troublesome is the risk involved with not being in compliance. Violations alone can be bad for a company, but should a massive data breach or other incident occur while a company was not meeting government standards, the consequences could be devastating.
Iain Thomson: While regulations are essential for the maintenance of stable, beneficial capitalism, they are also the bane of the IT manager's life.
Companies are having to hold ever-increasing amounts of information to comply with the regulations government has been laying down, and it all needs to be stored for years yet remain accessible when the auditors come knocking.
When these regulations were first brought in management just threw up their hands and ordered the IT department to save everything. After all, storage was dirt cheap and getting a few hundred extra gigabytes cost a pittance compared to the fines the company would accrue if they were found to be in breach of the law.
This however is no longer sustainable. The amount of data companies are generating and the costs of keeping it are growing at such a rate that we are going to need new storage options, or better regulations.
4. Overmanagement by non-IT staff
Iain Thomson: To my mind this should have been higher, but Shaun talked me down. I've lost count of the number of IT managers moaning that they are being asked to do the impossible by managers who have no idea about technology.
The cycle usually goes like this. A salesperson gets a meeting with a senior manager and promises them the moon on a stick with a flashy demo, lots of promises and occasionally a night on the town. I know of one senior IT salesman at a large corporation who can get virtually anything through expenses in the quest for a contract – hookers included.
The manager who's been won over then tells the IT manager about this new technology and insists that it be implemented. With any luck they will do this before the contract has been signed, since they need the IT manager to tell them whether the plan is feasible. However, it isn't unknown for the whole deal to be signed and sealed before the IT manager even hears about it.
The second way this manifests itself is when managers ask for the impossible. They say a little knowledge is a dangerous thing and this is true, especially in IT. There are too many cases of managers watching something like 24 and ordering such systems to be installed in their company, only to be told that they are living in a fantasy world, in the nicest possible way.
Shaun Nichols: There's the old saying that too many cooks spoil the broth. It's even worse if several of those cooks lack the culinary skills to make so much as a bowl of cereal.
We're still in a strange era in that a great many senior executives are at an age where they need to know technology but are just not able to completely grasp it. The type of people who only a few years ago learned how to send email and still worry about teenage anarchists "hacking the mainframe." These are the same people who see IBM commercials during golf telecasts and on Monday morning say "I was watching Tiger Woods sink a putt this weekend when I got this idea I think we should try…"
Add the aforementioned to Iain's picture of slick salespeople and junior executives who feel it is fine to promise the world and then dump all the actual planning and implementation off on the IT people, and you can understand why your company's tech staff can be more than a bit cranky at times.
3. Virtualisation
Shaun Nichols: Virtualisation can be a great way to save money, increase efficiency and generally give IT departments much more to work with.
Unfortunately, it can also make things far more complicated. The problem is elementary: take one physical server and turn it into several virtual ones, and you will be left monitoring, managing and maintaining far more servers than you ever had to before.
The number of tools and systems for managing and monitoring virtualised server deployments is growing every day, which only testifies to how complex the task can be. Not only do you have to manage the various virtualised servers themselves, but there is also the hypervisor and virtualisation platform, as well as the server hardware itself.
All in all, a virtualisation deployment can be a major asset to a company, but it can also be a nightmare for IT when problems arise.
Iain Thomson: Virtualisation is like living in the Playboy mansion with a whisky swimming pool – great in theory, but in practice it can be less than edifying, as anyone who's seen Hugh Hefner or the effects of liver failure can tell you.
Nevertheless it is the wave of the future, and as some have pointed out can bring major cost savings in terms of operation costs. But the downside is increased management time and complexity.
There's no getting around the fact that we will all be running a lot more virtualised servers in the future. But how we handle them will be the true test of the technology. I suspect we're going to see a major shift in management tool technology, on the same scale as the move to object-oriented programming that revolutionised the software industry. It is needed, and cannot come soon enough.
2. Data storage
Iain Thomson: As companies and individuals we are now generating more content than at any point in human history. We're also having to store it for compliance purposes, and this presents the IT manager with something of a problem.
It's an understandable human need to store everything that's done online. However, simply storing the data isn't the only problem, it's when someone wants to access it that the real fun kicks in. A good storage strategy needs to address both concerns.
Storage is essential, particularly off-site storage. If the company takes a physical hit you need off-site backups to be safe. In my first journalism job we had a break-in and lost all our hardware, with two issues of the magazine and a handbook on it and no backups. It took three weeks of hard labour (i.e. 100-hour weeks) to pull us back from financial ruin.
So an IT manager needs to be a master of the craft. Simply copying everything on the hard drives takes a huge amount of space and is rather wasteful. After all, if someone has sent a large PowerPoint presentation to 50 staff there's no point in storing 50 copies when one will do. This explains why the bidding for Data Domain was so fierce.
Storage also remains a security problem. It's scary how often companies create storage systems that don't involve encryption. Miss this and the company not only faces a loss of data but also a law suit.
Shaun Nichols: This is of course the heart of information overload. We are creating more content than ever, through more channels than ever, with more tools than ever and it all has to go somewhere.
No matter how many analysts, hardware vendors or service providers we talk to, the warning is always the same: don't just throw more hard drives at the problem. Archives have become so large and so complex that it's not sufficient to simply increase the storage volume any more. Indeed, with budgets shrinking it isn't even possible for many companies.
Instead, the constant theme seems to be make better use of the storage you have. Iain mentioned de-duplication, erasing multiple copies of a file you only need to back up once. Other suggested fixes include tiering data, moving to online backup systems and using snapshots rather than full system backups.
Whatever the remedy, it's clear that simply expanding storage isn't enough any more; companies have to take a fresh look at how they manage their data and approach storage.
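The de-duplication idea Iain and Shaun describe can be sketched in a few lines: store each unique payload once, keyed by its content hash, and give each recipient only a reference. This is a toy model assuming whole-file hashing – real products such as Data Domain work at the block level with far more sophistication:

```python
import hashlib

# Minimal sketch of content-based de-duplication: one physical copy
# per unique payload, plus lightweight per-recipient references.
store = {}       # content digest -> the single stored copy
mailboxes = {}   # recipient -> list of digests referencing the store

def deliver(recipient: str, payload: bytes) -> None:
    digest = hashlib.sha256(payload).hexdigest()
    store.setdefault(digest, payload)                 # stored only once
    mailboxes.setdefault(recipient, []).append(digest)

# The PowerPoint-to-50-staff scenario from above:
deck = b"quarterly-results.ppt contents" * 5000  # a large attachment
for i in range(50):
    deliver(f"employee{i}@example.com", deck)

print(len(mailboxes), "recipients,", len(store), "stored copy")
```

Fifty deliveries, one physical copy – which is exactly why backing up raw hard drive contents wastes so much space.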
1. Budget constraints
Shaun Nichols: There's never a good time for a recession, but from a technological standpoint, this latest one could not have hit at a worse time.
Advances in hardware, software and network technology have given birth to entirely new fields of the industry, and just as many companies were looking to see the fruits of those new technologies, the economy took a dive and IT budgets everywhere took a major hit.
When you get down to it, the top two items on the list are pretty much interchangeable. The amount of data keeps growing and the budget for managing it keeps shrinking. From these two issues the entire problem of information overload really springs.
The crisis may, however, have a silver lining. Just as the Great Depression brought about economic and social reforms that improved the quality of life in later decades, this latest recession could necessitate advances in the approach to IT management and the business culture that will help speed up the recovery.
Having learned how to do more with less, IT departments could emerge from the crisis better able to manage their systems and with a greater understanding of how to squeeze the most out of the resources on hand.
Iain Thomson: Oh Shaun, you are a little ray of sunshine at times. I hope you're right about the recession being a good thing.
There is never enough money to do everything in IT. The only people with unlimited budgets are the government security services, and even they must bow to the accountants at times. I suspect that a hundred years from now IT managers will still be complaining about having to do too much with too little funding, unless we've reached the Singularity by then and are no longer running the show (and I for one welcome our new overlords).
But on your broader point you may have hit the nail on the head. We have to learn to do more with less, and if the recession helps with that then it's certainly a silver lining in a very dark cloud.