(Christopher Null, ComputerWorld) There's no telling what the future will bring, but one thing is sure: In the world of technology, nothing stays the same for very long. The year 2010 wasn't terribly turbulent for tech, but 2011 is shaping up to be more of a thrill than you might expect. From Android's scorched-earth march across the industry to malware threats that we have yet to wrap our arms around, it seems as if everything is about to change.
With that in mind, here are nine resolutions for the small business operator to think about for 2011:
1. Ignore Android at Your Peril
2. Start Prepping for Windows 8
3. Accept Tablets as Mainstream Devices
4. Make Mobile Security a Big Deal
5. Leave No Stone Unturned When It Comes to Security
6. Develop a Flash/HTML5 Strategy
7. Get Ready for Video
8. Put Your Social Media in Order
9. Figure Out the Cloud
(Michael Friedenberg, ComputerWorld) It's the time of year for bold and brazen predictions, so here's a forecast of the top 10 trends, priorities and events of 2011.
10. Social media will keep dominating the business conversation.
9. The CIO-CMO relationship will change for the better, growing closer and more collaborative.
8. Cloud will move from an overhyped theory to an adopted practice in mainstream business.
7. Mobile moves aggressively into the data and applications arena.
6. Real-time analytics will define and drive the real-time organization.
5. Security breaches will hit an all-time high.
4. A battle will break out between IT and the lines of business over who really owns the user interface.
3. CIOs will continue evolving beyond an operational focus.
2. Vendor consolidations will cause major support issues.
1. CIO turnover will increase if businesses can't scale.
(Larry Dignan, TechRepublic) Gartner on Tuesday outlined the 10 technologies it thinks will give technology execs the most bang for their budgets in 2011. How many of these technologies will be a true hit?
Here’s Gartner’s 2011 list:
1. Cloud computing
2. Mobile apps and media tablets
3. Next-gen analytics
4. Social analytics
5. Social communication and collaboration
6. Video
7. Context-aware computing
8. Ubiquitous computing
9. Storage class memory
10. Fabric-based infrastructure and computers
(Network World) Worldwide enterprise IT spending will rise from $2.38 trillion this year to $2.46 trillion in 2011, a 3.1% increase, the research firm Gartner said Monday.
"Over the next five years, enterprise IT spending will represent a period of timid and at times lackluster growth with spending totaling $2.8 trillion in 2014," Gartner said. Enterprise IT spending in 2009 was $2.33 trillion.
IT budgets in several industries will not match pre-recession levels for several years. Enterprise IT spending as defined by Gartner is a subset of total global IT spending. IT spending as a whole is expected to rise from $3.33 trillion in 2010 to $3.45 trillion in 2011, a Gartner spokesperson said in an e-mail to Network World. While Gartner's report Monday doesn't show massive growth, things are looking much better than last year. One year ago, Gartner called 2009 the "worst year ever" for IT spending, after a decline of 5.2%.
(Maxwell Cooter, CIO) The battle for cloud email is set to heat up as enterprises start to rethink their email strategies, according to Forrester chief analyst Ted Schadler.
In a new Forrester report, Four Giants Compete For Your Cloud Email Business, Schadler explains how the advent of cloud services is going to shake up enterprises' spending on email.
"Email is going to be the first large-scale cloud application," wrote Schadler. "The reasons are simple: Email in the cloud is cheaper; it will evolve faster; and it is a commodity application that an email provider can run." Not only that, it's a great test bed for mastering the issues of working with cloud computing providers. And we're not talking about being a little cheaper, either. Cloud-based email is going to be a lot cheaper "unless you're a 50,000-person company with a highly centralised email platform or you run hardware and software until it's old and crusty and a decade behind the times," Schadler wrote.
But when it comes to deciding which company is going to dominate the market, the picture is not so clear cut. With four major companies offering similarly priced services, the key differentiator is going to be the level of integration each offers.
(Jody Gilbert, TechRepublic) IT certifications are perennial fodder for debate. But within the microcosm of an IT professional's world, certifications have an outsized impact on the future. Just which certifications hold the most value today?
Here’s a list of the 10 accreditations with the greatest potential for technology support professionals, administrators and managers seeking employment within consulting firms or small and midsize organizations:
1. MCITP (Microsoft Certified IT Professional)
2. MCTS (Microsoft Certified Technology Specialist)
3. Network+ (CompTIA)
4. A+ (CompTIA)
5. CSSA (Certified SonicWALL Security Administrator)
6. CCNA (Cisco Certified Network Associate)
7. ACTC (Apple Certified Technical Coordinator)
8. ACSP (Apple Certified Support Professional)
9. CISSP (Certified Information Systems Security Professional)
10. PMP (Project Management Professional)
(Nathan Eddy, eWeek) A new survey of more than 1,200 cost-conscious businesses shows a growing confidence among IT departments to expand budgets and embrace technologies like cloud computing and virtualization.
A report from IT management specialist Spiceworks found small to medium-size business IT professionals expect this year to be better than last, with business IT budgets up in 2010 as the economy seems to be stabilizing. However, the last six months of 2009 proved to be much harder on midmarket IT departments than IT professionals had anticipated, causing businesses to react quickly and curb IT spending.
The average IT budget climbed 9% in 2010 to $117,200, the report found, with 43% of SMBs reporting budget increases. 67% of SMBs said they plan to keep their IT staff the same for the first half of the year, while 20% said they plan to add new full-time staff.
IT budget trends for 2010 indicate that the midmarket IT budget freeze is beginning to thaw. 43% of IT departments reported a budget increase this year, while the share of SMB IT professionals reporting a budget cut declined from 31% to 24% since the last survey. For 2010, the annual IT budget increased 9%, versus a 1% decrease in 2009.
(Chris Preimesberger, eWeek) Based on IDC's first cloud computing survey focused exclusively on servers, the research company predicts that server revenue in the private cloud category will grow from $7.3 billion in 2009 to $11.8 billion in 2014, or about 62%.
IT researcher IDC reported May 10 that the combination of an aging server installed base, IT managers' increasing need to rein in virtual machines, and a general upturn in the buying environment is boosting sales of commodity-type servers used in public and private cloud-computing systems.
Based on its first cloud computing survey focused exclusively on servers, IDC predicted that server revenue in the public cloud category will grow from $582 million in 2009 to $718 million in 2014. Server revenue for the much larger private cloud market will grow from $7.3 billion to $11.8 billion (about 62%) in the same period, IDC said.
(Infoworld Staff, InfoWorld) A survey released yesterday of the technologies IT pros are actively investigating confirms strong business interest in analytics, cloud, and collaboration technologies, but it also shows that open source and ERP are falling off their radars as key concerns.
Among the top 10 areas of interest:
Cloud computing: 57%
Business process management: 46%
Web 2.0/social networking technologies: 45%
Desktop/client virtualization: 45%
Storage virtualization: 44%
Enterprise data management: 43%
Software as a service and Web services: 40%
Collaboration technologies: 40%
Mobile technology is not on the "active investigation" list only because IT has already begun deploying the technology in significant percentages: 52% of respondents say they have already implemented mobile initiatives, and another 29% are actively following the technology.[...]
(Matthew Broersma, ZDNet UK) Less than five percent of all email is delivered to mailboxes, as the rest is junk blocked by spam-fighting efforts, according to Enisa, the European Network and Information Security Agency.
While anti-spam measures are well used by providers, junk email remains a key problem for them and takes up a large part of their annual budgets, according to the report. [...] One-quarter of very small providers said they spent more than €10,000 (£8,700) per year on fighting spam, and one-third of very large providers invested more than €1m per year.
"The data on aborted SMTP connections and filtered emails seems to show that anti-spam measures are currently highly effective," the study says. The result is that only 4.4% of all email was delivered, down from 6% in Enisa's last spam report two years ago.
The agency noted that many providers, though not all, currently use collaborative measures to fight junk mail, such as working with spam-sending ISPs to eliminate the problem. It recommended that more service providers should work together on the problem.
(Laura, Word to the Wise Blog) In 2010, email marketing is going to get much more challenging for everyone. Recipients, and their ISPs, are expecting more and better things from email marketing. Senders who are currently meeting expectations may struggle to meet those rising standards within their current marketing frameworks. Successful marketers will be able to make the switch from sending mail that doesn’t annoy customers to sending mail that recipients truly want.
Authentication While people will probably continue to publish SPF records, their relevance will continue to decrease. As fewer people pay attention to SPF, records may go unmaintained and become stale, further decreasing their use and relevance.
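To see why staleness matters, recall that an SPF record is just a DNS TXT entry listing which hosts may send mail for a domain; every `ip4:` or `include:` term is something an administrator must keep current. Below is a minimal parsing sketch, not a full RFC 7208 implementation, and the record shown is a hypothetical example.

```python
# Minimal sketch of splitting an SPF TXT record into its mechanisms.
# Not a full RFC 7208 evaluator; the example record is hypothetical.

def parse_spf(txt_record):
    """Return a list of (qualifier, mechanism) pairs, or None if not SPF."""
    parts = txt_record.split()
    if not parts or parts[0] != "v=spf1":
        return None  # not an SPF version-1 record
    mechanisms = []
    for term in parts[1:]:
        qualifier = "+"  # default qualifier is Pass
        if term[0] in "+-~?":
            qualifier, term = term[0], term[1:]
        mechanisms.append((qualifier, term))
    return mechanisms

record = "v=spf1 ip4:192.0.2.0/24 include:_spf.example.com -all"
for qualifier, mechanism in parse_spf(record):
    print(qualifier, mechanism)
```

A receiver evaluates these mechanisms left to right against the connecting IP, so a stale `ip4:` range is exactly how a record loses its usefulness.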
Domain based reputation Domain based reputation is on the upswing and that will continue through 2010.[...]Domain based reputation will augment but not replace IP based reputation. It is easy and efficient to check the reputation of a connecting IP address and a receiver can make a preliminary delivery decision without having to accept the full email.
Engagement The buzzword for 2010 is engagement. ISPs will be measuring engagement and making delivery decisions based on how much their users want particular email.
Social Networking While social networking will not replace email marketing any time soon, email marketing can give recipients opportunities to share information with their social networks. Smart senders will provide easy links to make that sharing effortless.
(Paul Mah, TechRepublic) The rush to open up e-mail access over the network can lead to leaks via mobile devices and other unsecured endpoints, and smartphone loyalty will trump device standardization in the enterprise. While it is conceivable for IT departments to lock down access at every nook and cranny, the truth is that not every organization has the resources or expertise to do so. Let's highlight some avenues that might result in the inadvertent leaking of sensitive e-mails.
Avenues for e-mail leakage One common scenario is IT-savvy employees linking their personal laptops to the company’s Exchange Server. Obviously, these additional workstations represent additional points of vulnerability, especially if they are used outside company premises. While it is possible to disable or block HTTP access to forcibly confine e-mail access to the LAN, this is hardly a practical solution against the backdrop of an increasingly mobile workforce. In this context, the use of a VPN does not protect against the risk of e-mail leakage.
For organizations on Microsoft Exchange, it is trivial to enable Exchange ActiveSync so that mobile devices such as Windows Mobile smartphones can access corporate mailboxes. However, this also opens the door to devices such as the Apple iPhone and iPod Touch, as well as other mobile phones that implement the Microsoft ActiveSync protocol. Organizations concerned about the security of such devices can, of course, disable such access from Exchange. However, Exchange push mail represents the most affordable option for many SMBs; they will be hard-pressed to pay the steep licensing fees required to implement a more tightly secured RIM BlackBerry solution with BlackBerry Enterprise Server.
(Linda Tucci, Senior News Writer, TechTarget) The recession has intensified some companies' need to shop around for new vendors and better deals, and in some cases is turning best practice for changing technology vendors on its head. Just ask Bill Yearous, CIO of The Seattle Times, where the IT budget has been reduced by half over the past three years.
When he saw the year-over-year pricing on his Oracle Corp. database increasing at twice the rate of his two other database products, he dropped the contract, ending a decade-plus relationship with the vendor widely reputed to have the industry's premier technology. "It used to be that CIOs would joke that you don't get fired buying the Oracles, IBMs, Microsofts of the world, companies with high service levels, whose software and applications are widely accepted. They are easy choices to make," Yearous said. "It's a little bit counterintuitive that the best technology turns out to be the technology that meets your business needs at the price you can afford, as opposed to who has the purest, best technology."[...]
Changing vendors is never easy, even in good times and especially for a product as "foundational" as major software, according to Duncan Jones, an analyst at Cambridge, Mass.-based Forrester Research Inc. "For software products, it really depends on the product category and what is involved in migrating data, retraining users and redoing integration," Jones said.
By contrast, switching service providers or software resellers - moving from CDW to Insight or vice versa, for example, to source the company's Adobe Systems Inc. software - is less complex. Likewise, discrete niche products, such as a travel expense management program or e-sourcing, can be switched with relative ease, particularly if the software is delivered as a service.
(Kristen Caretta, Associate Editor, SearchCIO-Midmarket) The application options available for the midmarket are many and varied. Two popular alternatives to the more traditional - and often more costly - route of on-premise applications are open source and Software as a Service (SaaS) solutions. Although both provide many benefits, including reduced capital costs and subscription-based pricing models, it's the differences between the two models that may dictate which is best suited for your organization.
Open source solutions are more widely used today, and the question is less about whether to use them and more about how. And although open source and SaaS are two different animals, many people would be surprised to learn how often SaaS offerings are actually open source applications at heart.
Understand the hidden costs of open source Open source solutions and SaaS applications can reduce capital costs -- but what happens down the road, after the initial implementation? Many people believe that open source is free, with the option to upgrade to a supported edition if necessary. In some cases this is true, but according to Liz Herbert, a senior analyst at Forrester Research Inc. in Cambridge, Mass., it's important to weigh the burden these "free editions" can put on IT.
(Cath Everett, ZDNet) Cloud computing is one of the most overhyped phenomena to have hit the IT industry in a long time. It is a business model that definitely has its advantages. The trouble is vendors of all sizes and stripes are so desperate for a piece of the cloud action, they are willing to blur distinctions and fudge definitions for their own ends.
Their headlong pursuit has saddled cloud computing with so many misconceptions that it is sometimes difficult for customers to make informed business choices. ZDNet UK has looked at the most common myths, and debunks five of them here.
Myth 1: Cloud equals SaaS, grid and utility computing The term 'cloud computing' has been hijacked by anyone wanting to make a service sound hip and interesting. Jumping on the latest bandwagon is a favourite pastime in the technology industry, but in this case it is creating confusion among customers, who are unsure what they should be asking for or what they're likely to get for their money.
So to clarify: cloud computing is a form of outsourcing by which vendors supply computing services to lots of customers over the internet. These services can range from applications, such as customer relationship management, to infrastructure, such as storage and the provision of development platforms.
The services are provided by massively scalable datacentres running hundreds of thousands of CPUs as a single compute engine, using virtualisation technology. That approach means workloads are distributed across multiple machines — which can also be located in multiple datacentres — and capacity can be allocated or scaled back according to a customer's needs.
(Neil Roiter, Senior Technology Editor, TechTarget) It's tough to keep pace with the explosive growth of spam if you build an in-house email security solution. Commercial software and appliances are more efficient and have the features to make life easier for IT and end users, but even so, managing email security is just one more chore for companies with limited staff and tight budgets.
Small wonder Software-as-a-Service (SaaS) is increasingly popular among midmarket companies, and it's very likely a good choice for your organization.
The biggest consideration driving this trend is overtaxed staff. Even small companies have to deal with spam volumes that have grown far past the point where a few basic filtering rules are sufficient. In-house antispam using open source software such as SpamAssassin is probably no longer robust enough. What was initially a good, inexpensive solution put together and maintained by your technical hotshots grows increasingly burdensome, because either your hotshot is gone or you're spending too much time keeping it up to date. Cooling and high availability factors to consider High availability is another consideration. Email appliances are very reliable and unlikely to go down, but outages do happen. If you can't afford an outage, you'll need to run a second appliance in parallel in case one goes down. That's not a very attractive option for small organizations because of the extra cost and maintenance, though on the plus side, vendors typically offer deep discounts for backup appliances.
On the other hand, SaaS vendors are well established, with strong, redundant infrastructures and SLAs that guarantee availability. Weighing appliances versus services If you go the SaaS route, you're treating email security as an operating expense rather than a capital expenditure. If that's your preference, one of the prime advantages is that you pay as you go for the number of users. So if your company has laid off employees in this tough economy, you can ratchet costs down, then increase them again as your employee count goes back up. That may be more attractive to management than additional capital investment.
(Joe McKendrick, ZDNet) Can new technology initiatives help pull Wall Street out of the danger zone? A new survey released by IBM and Securities Industry and Financial Markets Association (SIFMA) finds that IT budgets are tight on Wall Street, but things are loosening up, and there’s going to be plenty of demand for new technology initiatives in the near future as firms on the Street look to “transformational” solutions to help better manage risk.
The survey of more than 350 Wall Street IT professionals found a “significant” increase in interest in new technologies and computing models, in particular cloud computing, as firms seek to overcome budgetary restrictions and skills shortages. Almost half of the respondents now see cloud computing as a disruptive force.
The past year has seen marked growth in interest in cloud computing. The number of respondents predicting that cloud computing would force significant business change more than doubled (from 21% in 2008 to 46% in 2009), making it the top disruptive technology, ahead of operational risk modeling and mobile technologies.
Major initiatives underway at most Wall Street firms include enhancing electronic trading tools (69%), improving data capacity and bandwidth (58%), and improving technology framework and infrastructure (58%). It can be assumed that the last item includes SOA efforts.
(Sam Diaz, ZDNet) E-mail overload, though a problem for years, has always been a concern for the overwhelmed user. But even corporate networks can take a beating as the sheer volume of messages has increased. It’s not brain surgery; it makes perfect sense. But every once in a while, we need a kick in the pants to remind us how much of a time-waster e-mail can be and how to deal with it.
We suggest an analysis of your inbox to place messages into four categories:
Coordinating Schedules — Deciding upon common times for meetings and events
Document Collaboration — Working together on documents by sending them back and forth as attachments
Managing Tasks — Sending or receiving task requests and updating status
Group Decisions — Using email to discuss issues or as a voting mechanism to build consensus
You will find that close to half of the emails in your inbox don’t have much to do with “communication” at all, falling instead into one of the above categories. Ironically, email is supposed to be a tool for “asynchronous communication”. A majority of emails are about teams and groups coordinating activities, discussing work-related matters, or actually working on tasks like editing documents and sending them back and forth as attachments.
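As a rough illustration of the four-way triage above, here is a sketch that buckets messages by naive subject-line keyword matching. The keyword lists and sample subjects are purely illustrative; a real classifier would need far more signal than subject lines.

```python
# Illustrative four-category inbox triage via naive keyword matching.
# Keyword lists and sample subjects are made up for this sketch.

CATEGORIES = {
    "Coordinating Schedules": ["meeting", "reschedule", "calendar", "available"],
    "Document Collaboration": ["attached", "attachment", "draft", "revision"],
    "Managing Tasks": ["task", "status", "deadline", "assigned"],
    "Group Decisions": ["vote", "consensus", "agree", "decision"],
}

def categorize(subject):
    """Return the first category whose keywords hit, else 'Communication'."""
    subject_lower = subject.lower()
    for category, keywords in CATEGORIES.items():
        if any(word in subject_lower for word in keywords):
            return category
    return "Communication"  # genuine correspondence

inbox = [
    "Are you available for a meeting Tuesday?",
    "Draft attached - please review",
    "Status update on the migration task",
    "Please vote on the new logo",
    "Happy birthday!",
]
for subject in inbox:
    print(categorize(subject), "|", subject)
```

Running a tally like this over a real mailbox is an easy way to test the "close to half" claim against your own inbox.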
(Scott Lowe, TechRepublic) Between cloud computing, virtualization, and economic conditions, the data center has certainly changed form over the past 10 years. Where there used to be a box for each discrete workload, we now have boxes running virtualized server instances for dozens of workloads, while some services run “in the cloud,” relying on no local servers at all and depending only on the corporate router to reach their users. Every day, more software-as-a-service vendors pop up offering their wares. And today, unprecedented economic conditions are forcing organizations of all types to deeply examine everything they do to make sure that every dollar spent directly supports the bottom line.
With this perfect storm of activity, what’s happening in the data center? In May, CNET quoted an IDC report indicating that worldwide server sales were down 25% in the first quarter when compared to sales of a year ago. In February, ZDNet’s Larry Dignan quoted another IDC report indicating that year-over-year server sales fell 12 percent. There are also published reports claiming that 2009 server sales will plummet more than 20 percent for the year.
Some possible reasons for this downturn: Virtualization. With dozens of workloads now running on a single box, physical server sprawl is a thing of the past (of course, virtual server sprawl is now here to stay!). Fewer servers in the data center directly equates to fewer server sales for each vendor. Virtualization has other benefits beyond simple consolidation; for example, it generally reduces service deployment time and, when deployed in the right way, can be a boon for high availability.
Software-as-a-service. I doubt that we’ve seen the full impact of SaaS, but I can’t imagine that it hasn’t had at least a minor impact on server purchases and sales. Many colleges and universities, for example, are outsourcing student and, sometimes, faculty/staff email to the likes of Google and Microsoft, thus eliminating the need for a server infrastructure supporting those outsourced email services. No longer are those old servers on the replacement cycle.
1. About This White Paper This white paper is sponsored by Gecad Technologies, the developer of AXIGEN, an alternative to Microsoft Exchange. While Exchange is a solid and robust email platform with roughly 150 million users in its installed base, it has a higher TCO than AXIGEN and does not support as wide a variety of server operating systems or client access modes. It is important to note that the goal of this white paper is simply to compare the features and benefits of Exchange and AXIGEN, not to denigrate the many features and capabilities of Exchange.
2. What Should a Mail Server Do? Email Is Critical to Users and Organizations. Virtually anyone who uses email at work understands how important this capability is to getting his or her work done. For example, an Osterman Research survey conducted during March 2009 found that the typical user spends 152 minutes each day working in email. Based on the average workday of nine hours nine minutes found in that survey, the typical user spends 28% of his or her day doing something in an email client. Interestingly, we found virtually identical results for both smaller and larger organizations, with a difference of only one minute in the average time spent using email on a typical day. In addition to spending more than one-quarter of their day in email, most users also check work-related email from home on weekdays after hours and on weekends. Further, a large proportion of users access their work email while on vacation.
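The 28% figure follows directly from the survey numbers; a quick check of the arithmetic:

```python
# Verifying the survey's "28% of the workday" claim:
# 152 minutes in email, over a workday of nine hours nine minutes.
minutes_in_email = 152
workday_minutes = 9 * 60 + 9  # 549 minutes
share = minutes_in_email / workday_minutes
print(f"{share:.0%}")  # rounds to 28%
```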
3. Deployment Scenarios While there is some commonality of system requirements across small organizations, large organizations and carriers that provide hosted email services, there are differences in the requirements that each type of organization has for its email capabilities.
Low-end Enterprise Environments To address the needs of small and medium businesses, an email server needs a variety of key features. These include the basic features and functions needed to satisfy generalist email requirements:
It must be easy to install, configure and manage, particularly for organizations or satellite offices of larger organizations that may not have dedicated IT staff.
It must require as little additional expertise and infrastructure as possible.
It must have a low cost of acquisition and management.
It must provide robust security and filtering capabilities to protect against the growing variety of spam and malware traversing the Internet.
While Exchange can satisfy the email requirements for smaller organizations, it has a much higher total cost of ownership (TCO) compared to AXIGEN.
(Ted Schadler, ZDNet) After surveying 53 information & knowledge management professionals about the cost of email, it is abundantly clear that few firms know their true cost of running email on-premises. And this matters if you’re considering a move to cloud-based email.
While the cloud-based cost of email is pretty transparent (many providers, including Microsoft and Google, publish their per-user per-month costs), the cost of running email on-premises is often a big mystery to everyone, including most CIOs. The big challenge is that the costs are spread throughout the budget: some in the hardware budget, some in the software budget, some in the storage budget, some in the cost of capital budget, some in the staffing budgets, and so on.
But an accurate calculation of on-premises email costs also matters if you are contemplating upgrading your email to a more current version that might support cheaper storage, higher automation, or a reduced email database size thanks to eliminating redundant copies of attachments. You can then compare your current costs against the fully loaded costs of the new system with its higher efficiencies.
When you factor in servers, storage, server software, software maintenance, hardware and software administration, power, archiving, message filtering, mobile costs, even financing, you find out that the cost of email for 15,000-person organizations can be as high as $40 per user per month, and even for a normal information worker without mobile email, it can cost more than $27 per user per month. Of course, you can and should segment your workforce into different tiers — for example, mobile executives, information workers, and occasional users — and provision them with different size mailboxes, email clients, and mobile email.
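Once the scattered line items are gathered in one place, the per-user figure falls out of simple division. Every monthly amount below is an assumed, illustrative number for a hypothetical 15,000-user organization, chosen to land on the $40 figure cited above; substitute your own figures from the hardware, software, storage, and staffing budgets.

```python
# Back-of-the-envelope fully loaded on-premises email cost.
# All line items are illustrative assumptions, not survey data.

USERS = 15_000

monthly_costs = {
    "servers_and_storage": 120_000,
    "server_software_and_maintenance": 150_000,
    "administration_staff": 180_000,
    "power_and_facilities": 40_000,
    "archiving_and_filtering": 60_000,
    "mobile_email": 50_000,
}

total = sum(monthly_costs.values())
per_user_per_month = total / USERS
print(f"Total monthly cost: ${total:,}")
print(f"Cost per user per month: ${per_user_per_month:.2f}")
```

The point of the exercise is less the final number than forcing every budget line into one sum, since that is exactly what most CIOs never see.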
(Edward F. Moltzen, Samara Lynn, ChannelWeb) Cloud computing is the next big thing, or current big thing, in information technology. It's fast. It's cheap. It's easy. It works. What could possibly go wrong? Let's be clear: Don't expect "the cloud" to provide five-nines of availability and don't expect it to be the default solution for those wanting cost-competitiveness. Expect the unexpected.
Before stopping to count all the profit you or a customer could make by deploying a cloud-based solution, consider that just one outage for the span of a few hours could eat up a measurable piece of a company's profit for a quarter. Could never happen, right? It did, though. On July 20, 2008, Amazon.com suffered a catastrophic outage in its S3-hosted storage business. While Amazon managed to get the service up and running in several hours, and the event is now just a blur to many, it impacted thousands of businesses and individuals who had gone to S3 as a convenient, cost-effective way to store lots of data easily.
And, over the past several months, search and online advertising giant Google has suffered notable outages in its Gmail and Google News offerings - outages that the company has left largely unexplained.
(Kristen Caretta, Associate Editor, TechTarget) IT Service Management (ITSM) can often be a hard sell even in a solid economy. Frameworks such as the IT Infrastructure Library (ITIL), COBIT and Six Sigma have often been described as "nice to haves" and not IT necessities. But there are ways that midmarket IT executives can embrace ITSM's focus on organizing processes and workflow to reduce costs and increase customer service, even during a recession.
Consider this advice for making the most of what an ITSM implementation has to offer to achieve a quick return on investment without dedicating too many resources:
To ensure you go down the right path, identify up front what problem you are trying to solve with your ITSM implementation. That might be the need to improve IT performance, the need to educate users about the services you can provide, or the need to increase efficiency and improve process.
"You don't want to implement ITIL if you don't understand the problem that needs to be solved," said David Pultorak, IT consultant and founder of Pultorak & Associates Ltd. in Seattle.
Lee Root, IT division manager for Tulare County in California, created a new ITSM implementation when he worked to merge two county IT departments.
"We had to merge two separate workflows, two separate IT systems, two separate policies and we had to find a new way to manage IT services. We knew what we wanted, we just had to find a less antiquated way to make it happen," Root said.
Rather than creating a hybrid of the two former ITSM practices, Root took the opportunity to start fresh and outline what the IT department had to offer.[...]
If your objective is cost reduction, look at processes over tools and technology. "ITSM implementations do not require a huge investment in time or resources if you pick and choose which aspects will bring about the most efficient return without overextending lean resources," said Ryan Ballmer, principal consultant at Cadence ITSM LLC.
"There is a tremendous opportunity to cut costs and improve operating efficiencies by investing in their ITSM processes and maximizing the ROI for tool and technology purchases they've already made," said Ballmer, who works with midmarket IT executives. "These improvements can position them optimally for larger projects once the capital starts to flow again."
(Matt Stansberry, Executive Editor, TechTarget) Between February and March 2009, SearchDataCenter.com conducted a survey on how data center managers are coping in the economic downturn. Subscribers were contacted by email and 235 end users, primarily in U.S.-based IT shops, responded.
The data shows that IT budgets took a major hit in the first quarter of 2009. Nearly 70% of the respondents said they face budget cuts, and one-third must deal with budget cuts of more than 20%.
The largest cuts across IT budgets come from staff reductions, followed distantly by reductions in spending on server hardware, application software and systems management tools, respectively.
Despite the cuts, demand for IT and data center services keeps growing. And about 30% of respondents are proceeding with all planned data center projects. Twenty-five percent have canceled some data center projects, 17% are scaling all projects back, and only 4% are canceling projects altogether.
Crafting a survival strategy

In the face of increasing compute demand, how are cash-strapped IT organizations getting by? Primarily by making their servers last longer in production. Servers are typically replaced every three years, but two-thirds of IT shops have extended the production life of server deployments in 2009. More than 35% say they'll keep servers in production for six months to a year longer, and 34% say they'll extend server lifecycles by two years.
IT pros said prolonging server work life is a viable option. "This is definitely a workable strategy; systems don't suddenly stop working when they're depreciated off the books. The three-year-lifecycle is more of an accounting issue than a matter of hardware reliability," said Bill Bradford, the senior systems administrator at an energy services firm in Texas.
"As long as a server is doing what it is supposed to do and is keeping up with workload, I don't see anything wrong with keeping it around as long as warranty service and/or spare parts are available. It's an extreme case, but I've seen 12-year-old systems in production simply because they still worked just fine, plenty of spare parts were on hand, and there was no driving reason to put their functionality on a newer platform."
(Chad Perrin, TechRepublic) A recession economy can affect more than just your employer’s revenue stream; it can also affect the software you use. It’s time to start thinking about how to minimize any negative effects this may have on your systems’ security. In today’s recession economy, IT companies may run into financial trouble. Many smaller software vendors are likely to fail or get swallowed up by larger vendors that have a different vision for the software they provide. Even larger organizations may cut major product lines, or perhaps disappear entirely, if things get bad enough.
That may not affect most of us very much (aside from saturating the employment market with people in need of new jobs, I suppose), but it may more directly affect some of us. One such possible effect could be the loss of ongoing development and support for software that we use on a daily basis. Part of maintaining the security of our computers and networks is ensuring that security vulnerabilities are identified and fixed. Unfortunately, when the vendor of a closed source, proprietary piece of software disappears or ends support for the software, it’s not only difficult (in many cases effectively impossible) to get needed security patch support; it’s also illegal to do so.
As you consider future software deployment options, consider the likelihood the software will continue to receive support. Companies with large cash reserves such as Microsoft are more likely to survive the recession intact; software lines that are central to a company’s business model such as Adobe Photoshop are more likely to receive continued support; and, ultimately, popular open source projects are more likely on average to survive the recession intact than closed source software vendors, because their continued development is not as dependent on having a lot of disposable income.
Open source projects aren’t even dependent on any one organization: if the current developers give up on a project, there’s nothing stopping others from picking up where they left off. In fact, as financial belts tighten, open source software projects may actually get stronger, as tighter finances create opportunities for more open source deployments.
(George Spafford, TechTarget) Organizations are eager to improve the effectiveness and efficiency of IT services. The IT Infrastructure Library (ITIL) has much to offer with its IT Service Management (ITSM) philosophy and reference processes. The challenge that groups face when implementing ITIL is that the process must be tailored to the needs of each organization, and it is critical that it's done correctly. As a result, many ITIL projects either stall or outright fail. Herein lies a challenge: How can these projects recover?
First, recognize that recovery takes work - hard work. When an ITSM implementation fails, it risks being viewed as another management fad that hasn't lived up to the hype. This makes recovery much more difficult, because in order for the next attempt at ITSM implementation to succeed, employees must believe it will be successful. Moreover, the next time around must be successful. Three strikes are too many for a failed ITSM implementation.
For groups that have either stalled or failed (and most stalled projects become failed projects), the following suggestions can help you understand what happened and identify potential corrective actions.
Candid feedback

The first step is to understand what went wrong, and there are two methods here. The first is to honestly ask those involved, "What went wrong?" and "How can we prevent this from happening again?" Some groups do so internally and are successful. Candor is critical: unless people can talk freely, the discussion won't be effective.
The other method is to bring in an external group to assess your organization's current state. The assessment will review processes as well as identify problems that occurred during the failed ITSM attempt and gaps between a company's current state and its future goals. From this analysis, an implementation roadmap should be created.
(Linda Tucci, Senior News Writer, SearchCIO-Midmarket) How can companies be free of security vulnerabilities? They could ferret out all the flaws in their computer products and patch them. They could prevent flaws from being exploited by shutting down systems. Of course, neither is good for business or the budget.
That's the view of Peyton Engel, a technical architect who heads the security assessment team at CDW Corp., speaking at the recent Fusion 2009 CEO-CIO Symposium in Madison, Wis. "Instead, companies need to spend less time reacting willy-nilly to security vulnerabilities and more time asking whether threats are likely to affect them," Engel said. He recommends companies identify the point of diminishing returns of patch management by weighing the probability and severity of a security vulnerability, rather than the severity alone.[...]
Calculated hype from security vendors

But calculating risk is itself a risky business. One formula, for example, calls for thinking about risk in terms of annualized loss expectancy (ALE). To determine this, you multiply the single loss expectancy (SLE), or the cost of a single incident, by the annual rate of occurrence (ARO), or the number of incidents per year, to get a dollar figure per year. If the solution the security vendor is trying to sell you costs less than that annual dollar figure, then it is a no-brainer and you should buy it.
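The annualized loss expectancy arithmetic described above takes only a few lines to sketch. The dollar figures here are invented purely for illustration:

```python
def annualized_loss_expectancy(single_loss_expectancy, annual_rate_of_occurrence):
    """ALE = SLE x ARO: the expected yearly loss from one threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical scenario: a breach costing $50,000, expected once every 4 years.
ale = annualized_loss_expectancy(50_000, 0.25)
print(ale)  # 12500.0

# The rule of thumb from the formula: buy the control only if it costs
# less per year than the loss it is expected to prevent.
countermeasure_cost_per_year = 8_000
print(countermeasure_cost_per_year < ale)  # True
```

The catch Engel alludes to is that the ARO is usually a guess, which is exactly where vendor hype can distort the math.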
(Angela Moscaritolo, SecureComputing) This year, enterprises must have the appropriate protection in place to secure their environments from malware arriving via the internet and removable storage devices, according to David Marcus, director of security research and communication for McAfee Avert Labs.
McAfee's “2009 Threat Predictions” report forecasts the most dangerous IT security risks for the year ahead, some of which include web-based malware. Accordingly, organizations must realize that more malware than ever is being created and distributed via the internet through Web 2.0 sites, such as Facebook and MySpace.
“Unless you are looking at that vector, it's a question of when the malware will enter the environment,” Marcus said. “Businesses must analyze employee use of the internet to determine behavior, then make sure they have appropriate protections in place to secure their organization from malware.”
In addition, the incidence of malware distributed through removable storage devices, including USB sticks and cameras, has increased, and this threat is expected to continue this year. The same study finds that, for organizations, this threat represents an accident waiting to happen.
(Erik Eckel, TechRepublic) IT certifications boast numerous benefits. They bolster resumes, encourage higher salaries, and assist in job retention. But which IT certifications are best? Technology professionals generate much debate over just that question. Many claim vendor-specific programs best measure a candidate’s skills, while others propose vendor-independent exams are the only worthy way of measuring real-world expertise. Still other observers believe the highest-level accreditations — Microsoft’s MCSE or new Architect Series certification, Cisco’s CCIE, etc. — are the only credentials that truly hold value.
The best IT certification for you is likely to be different from the best one for another technology professional with different education, skills, and goals, working at a different company in a different industry. For that reason, when pursuing any professional accreditation, you should give much thought and care to your education, experience, skills, goals, and desired career path.
Once a career road map is in place, selecting a potential certification path becomes much easier. And that’s where this list of the industry’s 10 best IT certifications comes into play. While this list may not include the 10 best accreditations for you, it does catalog 10 IT certifications that possess significant value for a wide range of technology professionals.
#1: MCITP The new-generation Microsoft Certified IT Professional credential, or MCITP for short, is likely to become the next big Microsoft certification. Available for a variety of fields of expertise — including database developer, database administrator, enterprise messaging administrator, and server administrator — an MCITP validates a professional’s proven job-role capabilities. Candidates must pass several Microsoft exams that track directly to their job role before earning the new designation.
(Linda Tucci, Senior News Writer, TechTarget) When asked to name the toughest ongoing challenge in business continuity (BC) planning, the majority of midsized organizations say it is inadequate funding. Given that financial hurdle, one strategy for implementing a business continuity program is using savings from your disaster recovery preparations to help BC pay for itself. Here are five elements that can save your organization money:
1. Server consolidation: Fewer systems mean that you will have less to recover in an emergency. "If you can consolidate your systems into a more fault-tolerant configuration, you have reduced your risk footprint dramatically," said Jim Copenhaver, a certified business continuity professional at Key Results Management Inc. in Atlanta. Consolidation saves on operational overhead, including personnel, redundant software licenses and patch management. HVAC, power consumption and network capacity costs also shrink.
2. A technical refresh: Older systems and tape drives incur maintenance costs that never decline. "By performing a technical refresh you can usually eliminate maintenance costs for three years. New equipment is less prone to catastrophic failure and reduces the chance of extended unplanned downtime," Copenhaver said. A good business case will show upper management that a capital expenditure spread out over three years will be revenue-neutral or even represent a savings.
(Beth Pariseau, Senior News Writer, SearchStorage) As interest around the cloud grows, industry experts are debating whether or not to establish a standard application programming interface (API) for sending data to and from cloud service providers.
Proponents of drafting a standard now say it will boost customer adoption to the cloud. Opponents say attempts to standardize an "on ramp" to the cloud this early might stifle innovation.
People on both sides made their point during a Cloud Storage Summit held by the Storage Networking Industry Association (SNIA) in January. Sajai Krishnan, CEO of cloud storage startup ParaScale Inc., said the prospect was raised of basing a potential standard on Amazon's Simple Storage Service (S3) API, which Krishnan said offers advances in the use of HTTP to write and retrieve data over a network.
Amazon rival Nirvanix Inc. also uses HTTP to do writes, but has added more advanced features than S3, such as file system directory structures. "It's not the same old NFS mount," Krishnan said. "The community is still debating what is standard out there."
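The "HTTP as an on-ramp to storage" idea under debate can be sketched at the wire level. This is an illustrative mock-up only: the bucket/key URL layout loosely mirrors the general S3 style, but the host name and headers are invented here, and real services additionally require authentication headers that are omitted:

```python
def build_put_request(host, bucket, key, body):
    """Return the raw HTTP/1.1 request text for storing an object
    at /<bucket>/<key> on an S3-style storage endpoint."""
    return (
        f"PUT /{bucket}/{key} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Length: {len(body)}\r\n"
        f"Content-Type: application/octet-stream\r\n"
        f"\r\n"          # blank line separates headers from the body
        f"{body}"
    )

req = build_put_request("storage.example.com", "reports", "q1.txt", "hello")
print(req.splitlines()[0])  # PUT /reports/q1.txt HTTP/1.1
```

The appeal of this model, and why it became a candidate for standardization, is that any HTTP client can act as a storage client; the disagreement at the summit was over freezing one vendor's dialect of it so early.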
Vincent Franceschini, chairman of SNIA's board of directors, said the group is looking at Amazon's API as a de facto standard because it's one of the earliest and most widely adopted cloud services. However, Franceschini says Amazon has turned down offers to participate in formal discussions.
(Matthew Emmerton) Benchmarks are a critical part of the IT industry. They allow us to measure technological progress, enable buyers to compare offerings from multiple vendors and help vendors improve their products. In the world of database servers, the Transaction Processing Performance Council Benchmark C (TPC-C) has had a starring role in the online transaction processing (OLTP) space for more than 15 years. Few vendors publish data server specifications without TPC-C numbers. Now, TPC offers a new, rich and carefully designed benchmark - TPC-E - intended to become, like TPC-C, a standard measure of data server performance. TPC-E is designed to be broadly representative of contemporary OLTP systems, challenging today's fast processors with a rich data model and demanding calculations.
The TPC-E model

TPC-E is modeled on a brokerage with its database of customers, accounts and securities. In a typical brokerage setting, customers generate data research requests, account inquiries and trade requests. The brokerage house receives trade requests, sends them to the market and returns the results to the customer. When developing TPC-E, TPC consulted with well-known brokerage houses to develop a model that is as realistic as possible. TPC-E uses multiple trade types - market orders, limit orders, stop orders and more. There are several tiers of simulated customers, each with distinct trading behaviors. Simulated companies issue both preferred and common stock. When customers make research requests, the system responds with large text documents, which include large object (LOB) data as well as historic financial and market data.
The resulting model comprises 33 tables, 22 check constraints and referential integrity constraints, including 33 primary and 50 foreign keys. TPC-E has three dimensions of data, posing realistic partitioning challenges. The model implements business logic using 10 transactions, many with various modes of execution. The rich schema and diverse transactions of TPC-E provide new opportunities for vendors to optimize system performance.

Modern database features

TPC-E is designed to match several additional features of contemporary databases:
Some transactions in TPC-E cannot be implemented as a single stored procedure, requiring multiple round trips between the application and the database.
Constraint checking, now a standard feature of all database management systems, is enforced in TPC-E.
TPC-E provides a business intelligence-style transaction (broker-volume) that requires the processing of various-sized data sets, including volatile data.
TPC-E requires database disks to be RAID-protected, reflecting typical real-world configurations. In addition, sponsors must report the time interval between a catastrophic disk failure and the system's return to 95% of pre-failure throughput. Buyers will be able to better judge the tradeoffs between performance and reliability, and vendors will be rewarded for maximizing both together.
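The constraint checking that TPC-E enforces (unlike TPC-C, which predated it as a standard feature) can be illustrated with a toy two-table schema. The table and column names below are invented in the spirit of the brokerage model, not taken from the actual TPC-E specification; SQLite stands in for a production engine:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only on opt-in

conn.execute("""
    CREATE TABLE customer (
        c_id   INTEGER PRIMARY KEY,
        c_tier INTEGER NOT NULL CHECK (c_tier BETWEEN 1 AND 3)
    )""")
conn.execute("""
    CREATE TABLE trade (
        t_id   INTEGER PRIMARY KEY,
        t_c_id INTEGER NOT NULL REFERENCES customer(c_id),
        t_qty  INTEGER NOT NULL CHECK (t_qty > 0)
    )""")

conn.execute("INSERT INTO customer VALUES (1, 2)")
conn.execute("INSERT INTO trade VALUES (100, 1, 50)")  # valid row

# The engine itself rejects rows that violate either kind of constraint:
try:
    conn.execute("INSERT INTO customer VALUES (2, 9)")      # violates CHECK
except sqlite3.IntegrityError as e:
    print("check constraint rejected row:", e)

try:
    conn.execute("INSERT INTO trade VALUES (101, 42, 10)")  # no such customer
except sqlite3.IntegrityError as e:
    print("foreign key rejected row:", e)
```

Because every insert pays this validation cost, a benchmark that mandates constraints measures the engine under more realistic load than one that lets vendors strip them out.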
Economic downturns are a time when a lot of innovation happens quietly under the radar. That’s because some companies get sidetracked trying to stay alive and weaker competitors often go out of business. That leaves the door wide open for innovators who find a way to build a better mousetrap, or offer products that are highly valuable in a down economy, or think up a product that no one knew they needed yet.
For a capital-intensive industry like the technology industry, it’s natural to think that it will be one of the areas of the economy hardest hit by the current recession. However, in spite of the economic storm clouds, 2009 will likely be a watershed year in tech, because of two factors: the developments under way that will not slow down for the recession and the opportunities that are being created or intensified by the recession itself.
1. U.S. broadband investment

Broadband investment is a large part of the massive economic stimulus package that U.S. President Barack Obama wants to use to jump-start the jalopy that is the U.S. economy. Although the $800 billion package - which is expected to pass this week - contains about $6 billion for broadband investments, this is likely just the beginning of the broadband initiatives for the Obama administration.
2. Storage in the cloud

Web-based apps, which many people latched on to in 2008 thanks to cloud computing, remain important, and you can expect them to keep changing the ways people work during 2009. But a much more important development is the arrival of Web-based storage. This is going to fundamentally change computing, because it will untether users from their PCs and let them access their work from anywhere, on any device - eventually including smartphones.

3. Cheap computers unleashed

The Netbook phenomenon shows that a lot of users had a lot more computer than they needed. In addition to Netbooks, 2009 will see the arrival of “Nettops,” Atom-powered desktop computers. Some of these could cost as little as $100 (not including monitor) and could also be used to power thin clients. The result of all these “cheap” PCs being unleashed on the market is that they will drive down the overall cost of a PC and significantly lower the barrier to entry for people buying a PC for the first time or replacing an old one.
4. An opportunity in power

After cloud computing, the next most overused buzzword in 2008 was “Green IT.” And while many Green IT and alternative energy initiatives have been shelved due to lack of funding since the economic meltdown last September, there is still a place for projects that can show a clear ROI through energy savings.
(Mark Schlack, Vice President, Editorial) Recent years have seen large infrastructure investments in IT -- storage area networks, network-attached storage, server farms, etc. -- but the economic downturn has forced some to cut back. While a little more than a third of midmarket companies will spend more on servers, storage and networks, about a quarter will spend less. Those that are spending are mainly spending to support application enhancements and upgrades.
Midmarket CIOs are sticking with what they know -- rack servers and storage are on the shopping list at 55% and 53% of companies, respectively. In the enterprise, blades have replaced racks as the de facto commodity form factor. But like their enterprise counterparts, midmarket IT shops will be investing in server virtualization software in big numbers (51%).
Midmarket companies have been slower to adopt virtualization, with 31% reporting no current deployment. But looking out 12 months, 39% expect limited deployment, 25% anticipate significant production numbers, and 36% expect most production servers to be virtualized. Enthusiasm for server virtualization hasn't spread to the desktop variety: 46% report no plans for 2009.
(Kristen Caretta - Associate Editor, SearchCIO-Midmarket) In tough times, the "free" price tag of open source software is too good for some CIOs to pass by without at least a second glance. But it can be a case of buyer beware: Despite short-term cost benefits, open source applications require sufficient IT staffing resources for the long term to keep up with code changes.
For many CIOs, open source applications have been part of the infrastructure for a while, often as part of tactical projects. But with competitive pricing for enterprise editions and incredible flexibility, some CIOs are now looking strategically at open source options for everything from phone systems to operating systems.
The aggressive rates of the products are catching the attention of midmarket CIOs looking to cut expenses, but Rob Enderle, principal analyst at Enderle Group in San Jose, Calif., advises CIOs to proceed with caution and not overlook the staff resources needed to install and maintain open source tools, which might not include the application support that's standard with other applications. Without sufficient support, any issues will fall in the hands of the IT staff, which may or may not be familiar with the applications.
"Everyone is thinking, 'cut, cut, cut.' CIOs may save some cash in the short term, but if they don't have the necessary resources to support their decisions, they will end up shooting themselves in the foot," Enderle said.
Lately, the Internet news sites and blogs have all been buzzing with reports of privacy infringements by various IT companies, regulation changes and updates on the laws already in effect, all related to data retention policies.
Well, what does all this mean? In effect, the majority of companies providing IT services in one way or another have to abide by these new regulations in order to comply with the new standards. Initially, data retention was not mandatory, but it has become clear (mostly in the past few years) that legal, administrative, commercial and even social interests can be better protected through such an initiative.
This article describes how the Axigen messaging solution (www.axigen.com) is in compliance with both United States and European Union currently effective laws and policies related to data retention and data preservation. These measures were adopted by service providers mainly to minimize the risk of deletion or modification of records and communications.
The Axigen Messaging Solution is fully compliant with all the laws and regulations in effect to date in both the United States and the European Union. In the next few pages, each area will be treated independently and references to the specific requirements enforced for that area alone will be listed.
First Stop: Europe
European Union data retention policies are listed in the Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC. This Directive can be accessed online at the following link: http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32006L0024:EN:HTML
Not a bad way to make an introduction for Red Hat Enterprise Linux 6! This claim of "King of the Linux Hill" is sure to conjure some fuss and debate within the open source community. If the snippet below sparks your interest, be sure to check out the full article ;)
(By Sean Michael Kerner, Internet.com) "On the eve of its next major release, the distro produces new figures showing that it's ahead of rivals in total users.
Counting Linux users is no easy task since there is typically no requirement for users to register their installations. Yet Linux distributions do try to count users in an attempt to quantify their user base and relative footprint in the operating systems space.
Red Hat's Fedora community Linux distribution has now tallied its user base, and it's a number that on the surface would make it the largest installed base of any Linux distribution, with at least 9.5 million users and possibly as many as 10.5 million. Fedora competitor Ubuntu Linux currently claims to have 8 million users.
The Fedora figures come out as the major players in Linux continue jockeying for position as the dominant vendor in the space, while also competing to make inroads against proprietary software. The news also comes as Fedora Project Leader Paul Frields and his team are ramping up to deliver their next release, Fedora 10, which is slated for Nov. 25.
The counting methodology has the potential to overcount the same machine if the box has a dynamic IP address -- for instance, using Dynamic Host Configuration Protocol, or DHCP -- that could change over time." Looks like they could turn that red hat into a red crown if irrefutable research confirms Red Hat Fedora to be number one. But imagine the arguably improbable embarrassment of a definitive user count dethroning the self-proclaimed ruler of the Linux realm - sure wouldn't want to work in their PR department if that happened! How do you feel about Red Hat Fedora? Do you think it merits its alleged no. 1 spot?
Looks like the Aussies have taken down a long-standing myth...
(ITnews.com.au) A recent study has found that the use of instant messaging software in the workplace reduces disruption.
The findings challenge the widespread belief that instant messaging, used in addition to phone and e-mail, may increase workplace interruption and reduce productivity. Researchers found that instant messaging was often used as a substitute for telephone, e-mail, and face-to-face communications. While the use of instant messaging software led to more conversations on the computer, conversations were briefer and less disruptive than in other forms of communication. The non-disruptive nature of instant messaging is said to be due to how people are using the technology. Researchers found that many workers use the technology to check in with coworkers to see when they are available, or to solicit quick answers to questions without engaging in longer, face-to-face conversations. Instant messaging software also gives people the ability to flag their availability or postpone responses to a more convenient time, which reduces workplace disruptions.

It's a definite step up, even though several employers are (or will be) limiting IM to an internal messaging tool. However, when it becomes the standard, instant messaging will be fully embraced for its advantages before being dumped for the next best thing. Any thoughts on what could outdo the comfortable and increasingly popular IM?
A swift research guide on messaging solutions and email communication.
Where do you go to if you find yourself in need of researching modern messaging?
Such needs rarely occur naturally - unless you’re really into technical stuff and live your life in an ongoing symbiosis with the IT elements. So when you have to do this for school or work, you’d like to get it done ASAP so you can move on to the more humane aspects of your life, right? Well, here’s a quick guide that will help you through by telling you where to find quality info.
First of all there’s… wait for it…Wikipedia! Before you start arguing about why I’m wasting your time by stating the obvious, know that Wiki is definitely worth mentioning not only due to the direct information it provides, but also because the references, notes and external links in a page on any subject can lead to great extra content for your research, not to mention trigger some new ideas and approaches.
And in the realm of internet.com there’s Webo, Wiki’s techie web-savvy cousin. Webopedia focuses on internet and computer issues, giving you in depth info on pretty much every subject of interest in the IT world. All this AND you get the option of linking to Webo by integrating the “Term of the day” and/or the Webopedia search engine into your website.
Now for the judge and jury: enter Serverwatch, provider of product reviews, stats, tutorials, news, a glossary and a lot more. Serverwatch is also part of the internet.com portal, so if there is a place for quality broad-spectrum IT and business insight - this is it!
…and of course the Radicati Group which, as they rightfully say, ”provides research on messaging and collaboration, security, email archiving, regulatory compliance, wireless technologies, web services, identity management, instant messaging, unified communications, voice over IP, and more.” Couldn’t have put it better myself!
And for those of you who are more hands-on and looking for cool tools there’s our very own MailRadar with a half dozen tools and tests. BEWARE: Experimenting with these e-toys has been proven to make your inner geek cry tears of joy!
In hoping that this article was useful and easy to digest, I leave you to your research. But if it wasn’t, feel free to slap the author with a comprehensive comment on what should have made it on the list!
When the media saturates our daily existence with dirty words such as financial crisis, subprime loans, credit crunch or the big ol' war on terrorism, you can understand the need of common folks to point the finger at someone or something, hoping at the very least for some façade justice. Sadly, according to the magistrates residing in the Greatest Nation in the World, the angry mobs are to sit this one out, at least for now...
(Washington Post) The White House does not have to make public internal documents examining the potential disappearance of e-mails sent during some of the Bush administration's biggest controversies, a U.S. district judge ruled yesterday.
In a 39-page opinion, Colleen Kollar-Kotelly said that the White House's Office of Administration is not subject to the Freedom of Information Act (FOIA), even though its top officials had complied with the public records law for more than two decades.
The Office of Administration, which performs a variety of services for the Executive Office of the President, announced it would no longer comply with the FOIA last August, three months after an independent watchdog group filed a lawsuit seeking to discover what happened to the e-mails, which may have vanished from White House computer archives.
Acknowledging that the issue "is a close one, and is not easily resolved," Kollar-Kotelly wrote that the Office of Administration "lacks the type of substantial independent authority" necessary for it to fall under the FOIA. She added that the office performs mostly administrative functions, which also exempts it, and that past compliance with the FOIA was "insufficient by itself" to subject it to the law's requirements.
Luckily for the White House, they can't be held accountable for the disappearing act of those pesky presidential emails, and you can see the extent of their "freedom" by reading the full version of this article. However, for the rest of us who aren't untouchable, secure messaging solutions are highly important, so when you want the best one on the market you owe it to yourself to get AXIGEN.
As technology advances, the pros are starting to outweigh the cons.
Email suffers the indignity of being pervasive to the point of obscurity. Much like toilet paper, everyone expects it to be there when needed, and to use it every single day, yet it is rarely given much deliberative thought. As the world becomes virtually papered in billions (trillions?) of emails, the matter becomes an exercise in exhaustion for IT pros who watch in exasperation as the stuff continues to flush their resources.
“In some cases there are cost benefits to outsourcing enterprise email, but in all cases it frees up IT resources,” said Matthew Cain, lead email analyst at Gartner. “And, for many, that’s reason enough to do it.”
Still, outsourcing enterprise email is a task best undertaken with care as email is not merely a disposable commodity, but rather the very pulse of a company. The downside to hosted email is the loss of control as you put your most valuable communications in someone else’s hands. There are quagmires and peripheral obstacles to address before you hand off your email to an outsider: compliance and security issues top the list.
Even as organizations become more dependent on e-mail to run their businesses, many don’t dedicate enough staff or technology to prevent downtime, according to a King Research survey [...]
A total of 220 respondents completed the survey.[...] Of those respondents, 96% reported that during an e-mail outage, there is a huge drop in productivity, and some employees cannot work at all.
A proactive approach with preventive maintenance of the e-mail system can prevent many unexpected outages. IT staffs, however, are finding it increasingly difficult to schedule downtime for server and application updates and maintenance. At the same time, 71% of participants report they spend two hours or less per month on maintenance.
As e-mail becomes more important for doing business and as the number and frequency of e-mails grow, it’s interesting to note that staffing to manage messaging applications is flat. For example, 77% of the survey respondents report that their staffs have not increased in the past 12 months. And 5% report that their staffs have actually decreased in size.
Do you have trouble getting answers to e-mail you send? Are there some people with whom you need to communicate on a regular basis who not only don't answer your e-mail, but also are seldom available to take your phone calls?
Before you chalk up the lack of communication to other people's bad work habits or rudeness, take a close look at your own communication style. [...] Here are some of the most aggravating e-mail faux pas to avoid.
1. Marking e-mail messages you send out with an exclamation point to indicate high importance for routine matters. Yes, you want people to read e-mail you send, and, yes, you think the matter is important. But marking everything as high importance is going to have the opposite effect. Those who frequently receive e-mail from you marked with an exclamation point will start ignoring it -- and be mad at you for sending so many e-mail messages marked high importance.
2. Demanding immediate response when it's not warranted. Just because something is important to you doesn't mean that others should drop what they're doing to answer your question or do what you want done. They have their priorities, too. Not only will they get mad at you, but if requests aren't truly urgent, they'll soon be ignored -- just like the fabled boy who cried "Wolf!" too often. So, if the matter you're discussing in e-mail isn't truly urgent (i.e., no one is going to suffer any harm or damage if whatever you want done isn't handled the same day), then don't ask for immediate action. And, if something really does need to be handled right away, explain why. (And remember to say "Please" and "Thank You.")
As employees become more mobile, technology can help them control the flow of communications with clients and vendors through one point of contact for your business:
1. Leave the office, stay in touch. Employees today don’t necessarily work the same schedule in the same office every day and leave work behind when they go home. Unified messaging makes it easier to work from anywhere at any time. Employees can receive e-mail, fixed-line voice mail, mobile voice mail, mobile text messages and faxes online or through one phone call. Before, checking all five channels of communication required two phone calls, one trip online, and a trip back to the office to check the fax machine’s paper tray.
2. Keep work flowing. Not only can employees check for messages on any communication channel, but they also can respond via the most convenient channel. In other words, if you log into your unified messaging inbox and hear a voice mail on your office phone, you can respond via e-mail or phone. If the content of the message is important to someone else, you can forward it. And if you want to listen to it later when you’ll be away from an Internet connection, you can save it to your desktop.
Tips for composing better e-mails and improving communication with clients and potential investors online.
E-mail is an essential business tool. The average office worker spends 49 minutes managing e-mail daily, while upper level managers spend up to four hours a day on email, according to Nancy Flynn, director of the ePolicy Institute and author of Writing Effective E-mail and E-mail Rules. "All that sending and receiving, responding and deleting takes an enormous toll on workplace productivity," she says. Making better use of e-mail includes communicating more effectively, as well as knowing when to use e-mail, and when another form of communication would be more effective.
E-mail picks up the pace of communications between co-workers and customers: it arrives almost instantly, compared to the slower pace of traditional mail or even special delivery services such as overnight and same-day couriers. It allows workers to be more productive, as it reduces the time spent in face-to-face meetings or even in telephone calls. Workers tend to spend less time composing e-mail than on formal letters, yet the content of e-mail communication is just as important.
Email is pervasive in today's business environment, a fact that raises its own set of issues. In no particular order, here are the top five issues that any business needs to know about.
1. Spam. Spam. And more spam.
Email is both ubiquitous and cheap, which means that it is the easy choice for evildoers looking to cast a large net. While Bill Gates famously predicted that spam would be eliminated by 2007, he was, unfortunately, quite wrong. The volume of spam is growing: according to IronPort, an email management vendor, there were roughly 86 billion spam messages per day in the month of November, 2006, up from 5 billion per day in June of 2006. That's an increase of more than 1,600 percent in five months.
This rise in spam is based almost solely on financial incentives; in a 2005 speech, Mark Loveless of Bindview (later acquired by Symantec), broke down the black market for hacked information: $100-$500 for a known Internet Explorer flaw, $1,000-$5,000 for an unknown exploit; $150-$500 for a list of 5000 IP addresses primed to become a bot network, $500-$5,000 for a list of 100 credit card numbers ... and the list goes on. When you combine those numbers, the estimated annual salary of a skilled hacker falls into the $100,000 - $200,000 range. Which is a nice number to get as an end-of-the-year, tax-free bonus.
There are numerous examples of companies running into email compliance issues, but perhaps one of the most high-profile cases was that of Frank Quattrone, an investment banker with Credit Suisse First Boston. After two trials and three years in court, legal observers said that Quattrone was essentially exonerated of all charges. The root cause of all the trouble? Whether Quattrone, in following company email policy, had obstructed an investigation by the SEC into IPO stock distribution. If following company policy got him three years in court ... you don't want to mess this up.
Forrester Research has a new report out that offers some insights into the communications technologies that enterprises are adopting -- and are still holding off on. There's also a provocative data point on how involved business unit executives are in Unified Communications purchases.
The respondents were asked how much influence three types of roles have in UC purchasing decisions. The biggest influencers were IT executives, not surprisingly: 26% of respondents said these folks exert final decision-making authority, while 52% reported that IT execs wield "heavy influence" on UC buying. Notably, this was very close to the numbers regarding the influence of telecom executives: 23% of respondents said telecom was the final UC decision-maker for their enterprise, and another 51% ranked telecom as a "heavy" influencer.[...]
The factor that Forrester calls out as important is the last category, "A business unit executive who works with the telecom and/or IT departments." And this is pretty noteworthy: 15% of the survey respondents said this type of exec holds final decision-making authority for UC in their enterprises, another 43% see "heavy influence" here, and 34% say such execs hold "some influence" on the UC buying decision.
According to a new survey, almost two thirds of business users have to manage their own e-mail inboxes to stay under corporate capacity limits. And for almost a third, those limits are a paltry 100MB or less.
C2C sponsored the third-party survey to draw attention to its email data management solution called Archive One, but to me the interesting point is just how pathetic most business email systems are -- especially compared to business and even consumer email solutions available in the cloud, often for free.
Among these latest survey findings: - 65 percent of survey respondents contend with mailbox quotas and are forced to self-manage their email to stay operational. - 66 percent take their own measures to save email messages in order to ensure they aren’t lost, with a majority storing email outside their company email system, in some cases even in personal/home email accounts. - 67 percent need to search for an email that is more than three months old at least once a month, with 28 percent spending time searching about once a week or even daily.
The survey also found that those who self-manage email to stay within quotas frequently delete messages, delete attachments, and/or create a PST file – a method used in more than half of organizations surveyed.
When U.S. Cellular's chief operating officer, Jay Ellison, imposed a "no email Friday" rule at his company, he thought it would ease workers' overload. Instead, he got a rebellion:
A growing number of employers, including U.S. Cellular, Deloitte & Touche and Intel, are imposing or trying out "no email" Fridays or weekends. While the bans typically allow emailing clients and customers or responding to urgent matters, the normal flow of routine internal email is halted. Violators are hit with token fines, or just called out by the boss.
The limits aim to encourage more face-to-face and phone contact with customers and co-workers, raise productivity or just give employees a reprieve from the ever-rising email tide. Emails sent by individual corporate users are projected to increase 27% this year, to an average of 47 a day, up from 37 in 2006, says Radicati Group, a Palo Alto, Calif., research and consulting firm. And one-third of users feel stressed by heavy email volume, according to a 2007 study of 177 people by the University of Glasgow and Paisley University in Scotland. Many check email as often as 30 to 40 times an hour, the study showed.
Surfing the web unprotected will leave the average web user with 70 spam messages each day, according to an experiment by security firm McAfee. It invited 50 people from around the world, including five from the UK, to surf without spam filters.
The experiment revealed that UK residents are most likely to be targeted by the infamous Nigerian e-mails and "adult" spam. One UK participant received 5,414 spam e-mails during the month-long trial. But the US still tops the global spam league. Participants in the US received a total of 23,233 spam e-mails during the course of the experiment compared to 15,856 for the second most spammed country - Brazil. In the UK, the five participants racked up 11,965 spam messages during the course of the experiment. Germany attracted the least spam, with just 2,331 junk messages.
GLOBAL SPAM LEAGUE: US - 23,233; Brazil - 15,856; Italy - 15,610; Mexico - 12,229; UK - 11,965; Australia - 9,214; The Netherlands - 6,378; Spain - 5,419; France - 2,597; Germany - 2,331.
TOP TEN MOST POPULAR SPAM CATEGORIES: Advertisements; Financial; Health and medicine; Adult services; Free stuff; Education; IT related; Money making; Credit cards; Watch adverts.
IT professionals may initially be awestruck by the promises of virtualization, but Gartner analysts warn that awe could turn into upset when organizations start to suffer from seven nasty side effects. Here are the reasons Gartner says virtualization is no IT cure-all:
1. Magnified failures. In the physical world, a server hardware failure typically would mean one server failed and backup servers would step in to prevent downtime. In the virtual world, depending on the number of virtual machines residing on a physical box, a hardware failure could impact multiple virtual servers and the applications they host.
2. Degraded performance. Companies looking to ensure top performance of critical applications often dedicate server, network and storage resources for those applications, segmenting them from other traffic to ensure they get the resources they need. With virtualization, sharing resources that can be automatically allocated on demand is the goal in a dynamic environment. At any given time, performance of an application could degrade, perhaps not to a failure, but slower than desired.
Many employees in the modern workplace simply assume their electronic communications are being read by IT administrators. A new study released by IT security firm Cyber-Ark Software shows that those assumptions aren't too far off base.
The survey of 300 senior IT professionals at mid-market and enterprise firms yielded the disturbing news that a third admit that they or fellow administrators have "used the admin password to get at information that is otherwise confidential or sensitive," while nearly half say they have "accessed information on a system that was not relevant" to their jobs.
Presenting the results of their annual "Trust, Security and Passwords" survey at the recent Infosecurity Expo in London, Newton, Mass.-based Cyber-Ark stressed the scandal of the two questions concerning snooping by IT staff, but the bulk of the study concerns more mundane areas of data leakage prevention, such as the frequency with which passwords are changed on computer networks.
Most small and large enterprises are uncertain of the benefits of a unified communications implementation, according to a recent survey of 2008 networking plans from Forrester Research.
Fifty-five percent of the 2,187 North American and European companies queried said there is "confusion about the value" of unified communications for their company. Only 11% of the firms have already deployed it. Another 16% are rolling out and 57% are evaluating or piloting it, Forrester found.
"We were not surprised," says Forrester analyst Ellen Daley, author of the survey's report. "There's been a 21% increase in UC pilots since 2007 but no increase in firms buying UC. A lot of people are talking about UC, a lot more are dipping their toe in; but at the same time they're all saying they're not sure about the value," she says.
Daley says Forrester receives inquiries from clients regularly asking simply: What is UC?
Networkworld.com publishes the conclusions of an extensive study they have conducted on email, web and instant messaging security issues within North-American mid-sized and large enterprises. Among the findings of the study:
- Two out of five organizations have had a virus, worm or Trojan successfully infiltrate their network through e-mail. - More than two in five organizations have experienced corruption of one or more e-mail databases.
On behalf of Proofpoint, Inc., Forrester Consulting fielded an online survey of email decision makers at large US, UK, German, French and Australian companies. Respondents were asked about their concerns, priorities and plans related to the content of email leaving their organizations, as well as related concerns about the risks associated with mobile devices, blogs and message boards, media sharing sites and other electronic communications technologies.
Forrester gathered a total of 424 responses from companies with 1,000 or more employees, including 301 US, 32 UK, 30 German, 31 French and 30 Australian companies. This report summarizes the findings of the 2008 study.
Key Findings, US 2008 - 22% of US companies with 20,000 or more employees surveyed employ staff whose primary or exclusive job function is to read or otherwise monitor outbound email content. - More than 1 in 3 (38.0%) US companies surveyed perform regular audits of outbound email content. - 44% of US companies investigated a suspected email leak of confidential or proprietary information in the past 12 months. 40% investigated a suspected violation of privacy or data protection regulations in the past 12 months.
Online firm Habeas released its annual study of consumer attitudes Wednesday, which showed that email remains the "primary method of communications in personal and business capacities."
The survey found that 67% of respondents prefer email to communicate online, and 65% feel that will still be the case in five years. And vis-à-vis a younger generation "aging out of email," the survey found that 65% in the 18-to-34 demo "will favor email to communicate with businesses in five years."
The research also found that there is a rise in consumer concern about email security issues--with 69% of respondents worried about "being victimized by email fraud scams," up from 62% a year ago. And perhaps giving marketers pause about advertising on mobile devices, it found that 43% of respondents were concerned about spam and virus threats on their wireless devices, up from 36% in 2007.
The leading technology research firm, The Radicati Group Inc., publishes an extensive white paper reviewing the strategic importance of email messaging systems in today’s business environments and the AXIGEN Mail Server.
Email is a vital communication channel for organizations of all sizes. Having a reliable and efficient messaging system can improve employees’ productivity, reduce operating costs, and give any company an edge over its competition.
The number of messages exchanged between users continues to grow quickly. It is projected to increase from about 156 messages/day per user in 2008 to about 233 messages/day per user in 2012. Such a quick increase in volume over the next few years will influence many companies to take a critical look at their solutions, even if they may be currently satisfied with their messaging systems.
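The Radicati figures above imply a fairly steep compound annual growth rate, which a quick back-of-the-envelope calculation can verify:

```python
# Implied compound annual growth rate from the Radicati projection above:
# 156 messages/day per user in 2008 growing to 233 by 2012 (four years).
start, end, years = 156, 233, 4
cagr = (end / start) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 10.5% per year
```

At roughly 10.5% per year, per-user volume increases by about half over the four-year window, which is what drives the capacity pressure described above.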
Email is one of today’s most widely employed means of communication. Whether for personal or business purposes, email is extensively used throughout everyday activities, its speed and reliability making it the communication channel of choice for millions of people worldwide. Figures on email usage are impressive, and growing, as reported by major technology research companies.
According to Email Marketing Reports, The Radicati Group estimates 1.2 billion email users worldwide in 2007. Figures are expected to reach up to 1.6 billion by 2011, as stated in a report issued by the company in October 2007. In a previous study, dated October 2006, Radicati also estimated that approximately 183 billion emails were sent each day in 2006 and that wireless email users would grow "from 14 million in 2006 up to 228 million in 2010". For the business environment, Ferris Research estimated the number of business email users at around 780 million in 2007.
Email communication is certainly popular, but what rules and guidelines bring out the best in it?
Checking email, reading email and answering email can take up hours of time if you let it. But only if you let it. Here are four simple email management rules to help you keep control of your inbox:
1) Let your email program manage your email as much as possible. Email management starts with setting up and using filters. If you're using an email program such as Outlook, you can configure email rules to send your spam directly to the trash - meaning that you don't waste your time reading and deleting it.
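The rule idea described above isn't specific to Outlook; any client or script can apply the same sender/keyword routing. Here is a minimal sketch of such rules, where the keywords, folder names, and sender domain are all hypothetical examples:

```python
# Minimal sketch of rule-based email filtering, similar in spirit to
# Outlook rules. The keyword list, folder names, and newsletter domain
# below are invented for illustration.

SPAM_KEYWORDS = {"viagra", "lottery", "winner"}

def route_message(sender: str, subject: str) -> str:
    """Return the folder a message should be filed into."""
    subject_words = set(subject.lower().split())
    if subject_words & SPAM_KEYWORDS:
        return "Trash"  # spam goes straight to the trash, unread
    if sender.endswith("@newsletter.example.com"):
        return "Newsletters"  # bulk mail filed for later reading
    return "Inbox"

print(route_message("boss@example.com", "Quarterly report"))   # Inbox
print(route_message("x@y.com", "You are a lottery WINNER"))    # Trash
```

Real clients match on many more attributes (headers, attachment types, distribution lists), but the payoff is the same: mail you would only have deleted never costs you attention.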
This white paper will explain how businesses can significantly reduce the costs of their email communication while continuing to provide users with a best-in-class messaging and collaboration solution.
Introduction Today, email is absolutely mission-critical. Communication and collaboration keep your business running. Email and electronically enabled collaboration have become so embedded in normal day-to-day operations that many businesses simply could not function without them. These services enable everything from productivity enhancing collaboration between employees to external communications with customers and business partners and demand 24x7 availability.
Many businesses, however, have found that the cost of providing employees with the latest in messaging and collaboration technology is rapidly escalating. To meet modern business needs, mail servers have had to become more complex – and with that additional complexity come additional management burdens and costs. Furthermore, some mail servers have an upgrade process that is both extremely complex and extremely costly and which may necessitate the purchase of replacement server hardware. Combined, these factors place a considerable drain on corporate resources. The problem is especially severe for small and medium sized businesses (SMBs) which usually do not have access to the same financial or technical resources as large enterprises. In fact, the cost of upgrading has forced many SMBs to expose themselves to risk by continuing to use an older and unsupported version of their mail server.
The escalating cost of email Empowering workers with sophisticated communication and collaboration technology is not a luxury, it is a necessary cost of doing business. However, it is also a cost that has escalated to a point that many businesses are finding difficult to bear. Take Microsoft Exchange Server™ 2007, for example. Exchange Server is the most widely used business email platform and is undoubtedly an extremely capable product – but it is also highly expensive.
Mining of email data could help companies spot dangerous employees before they do damage.
Three researchers at the Air Force Institute of Technology -- James Okolica, Gilbert Peterson, and Robert Mills -- have published a paper that outlines an algorithm for mining email data and identifying patterns of transmission that might tell managers when employees are keeping a secret.
In a nutshell, the algorithm identifies email topics of interest that are communicated outside the organization, but never shared with others inside the organization. The identification of such topics indicates that employees "either have a secret interest in the topic or generally feel alienated from the organization," the paper says.
In the study, researchers applied a data mining concept called Probabilistic Latent Semantic Indexing (PLSI), which has been used to extract specific information from a large body of data. By adding users to the body of data being studied, the researchers were able to identify patterns of content exchanged between specific users.
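The researchers' actual method is PLSI over the full email corpus; purely as an illustration of the underlying idea (and not the paper's algorithm), the "external-only topic" test can be sketched with hand-labeled topics and invented data:

```python
from collections import defaultdict

# Toy illustration of the idea behind the AFIT paper: flag topics that an
# employee discusses in external email but never in internal email. The
# real work uses Probabilistic Latent Semantic Indexing to discover topics;
# this sketch substitutes pre-labeled topic sets. All data is invented.

def external_only_topics(messages):
    """messages: iterable of (direction, topics), where direction is
    'internal' or 'external' and topics is a set of topic labels."""
    seen = defaultdict(set)  # direction -> union of topics observed
    for direction, topics in messages:
        seen[direction] |= set(topics)
    return seen["external"] - seen["internal"]

mail = [
    ("internal", {"budget", "hiring"}),
    ("external", {"budget", "job search"}),
    ("external", {"job search", "resume"}),
]
print(sorted(external_only_topics(mail)))  # ['job search', 'resume']
```

The PLSI step matters because real topics are latent in word co-occurrence rather than labeled; the set difference above only captures the final "outside but never inside" filter.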
PCWorld: Researchers at the University of California, San Diego (UCSD) said they've discovered a critical weakness in the spam ecosystem that could be used to help cut off the promise of economic returns fuelling the huge growth in spam levels:
In a paper delivered at the USENIX Security 2007 conference in Boston, the UCSD researchers said that while spammers use vastly powerful, distributed delivery networks to pump out junk e-mail, it's quite another story for the internet scams that form the real heart of the spam mechanism.
Such scams, for instance selling pharmaceutical products over a website, are typically hosted on a single website, the researchers found. What's more, a single site might host several scams and might also act as a spam relay.
Here is an article that analyzes the results of independent tests performed by PC Magazine Romania comparing the AXIGEN Mail Server against two open source alternatives, Sendmail (with Dovecot) and Postfix (with Cyrus).
The comparative performance study covers the four basic functions of electronic messaging: message receipt, delivery to users’ mailboxes, message storage, and user access to stored emails. Two usage scenarios were considered: business and ISP. The tests consisted of sending messages of a predetermined size to the servers and checking their acceptance in the users’ mailboxes.
The large share of spam in total received email traffic (estimated by Radicati in 2007 at 72% of all traffic) generates frequent periods of intensive server usage. To verify the servers’ ability to respond in overload conditions, their response time to requests on 1, 2, 4 and 8 parallel connections was tested.
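A parallel-connection timing run of the kind described above can be sketched as follows. This is a simulation, not the magazine's test harness: a stub function with an arbitrary delay stands in for a real SMTP transaction, and the batch size is invented.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Sketch of the overload test described above: time a fixed batch of
# messages pushed through 1, 2, 4 and 8 parallel connections. A real test
# would open SMTP sessions; a sleeping stub stands in for the server here.

def send_message(_):
    time.sleep(0.01)  # stand-in for one SMTP transaction
    return True

def run_batch(parallel: int, total: int = 40) -> float:
    """Return the wall-clock time to deliver `total` messages."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=parallel) as pool:
        results = list(pool.map(send_message, range(total)))
    assert all(results)  # every message must be accepted
    return time.perf_counter() - start

for conns in (1, 2, 4, 8):
    print(f"{conns} connection(s): {run_batch(conns):.2f}s")
```

With an I/O-bound stub the batch time drops roughly in proportion to the connection count; on a real server the interesting result is where that scaling stops, which is exactly what the overload test probes.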