Quick-List: Articles in Operating Systems

Companies count the cost of IT failure

The market has rejected Linux desktops. Get over it.

Hardening Linux with Bastille UNIX

Ubuntu 9.10 makes a serious charge toward the enterprise level

Economics and virtualization stunt Linux server growth

All Linux needs is a good commercial

The Incredible Shrinking Operating System

Linux's position in cloud computing efforts

Ubuntu a minor player? Not outside the States

Linux and Windows compromised at boot

Have we now entered the post-OS era?

A look at real-world exploits of Linux security vulnerabilities

More OSes gain hypervisors, but most users choose OS-independent approach

Avoiding disaster recovery pitfalls in VMware and Linux

10 iptables rules to help secure your Linux box

10 obscure Linux applications you need to try

Using Linux in a data center consolidation management strategy

Choosing the best server OS: Linux vs. Windows comparisons

Red Hat Fedora Claims It's the Leader in Linux

Gartner warns of misguided virtualization strategies

Linux made its first desktop breakthrough

Thou shalt fear the GPL...

Linux, the New Botnet Command Center

Articles in Operating Systems

Companies count the cost of IT failure

(Stephen Pritchard, ITPro) Companies are becoming more aware of the direct financial costs of computer downtime, according to a survey of IT managers. One in five businesses loses £10,000 an hour through systems downtime.

Almost one in four companies have suffered an outage that lasted more than one business day, even though IT failures are meant to be covered by their business continuity plans. "The fact that people are willing to state that it costs £10,000 an hour, or even £1m per business day, means that businesses appear to be taking it seriously," said Andrew Barnes from Neverfail.

Greater awareness of the costs of failure to the business does not appear to translate into more resilient systems. The survey found that the number of businesses affected by outages remains stubbornly high: a full 92.8% of companies said they had experienced a failure. Businesses are becoming more aware of the costs of downtime because, in a tough trading climate, retaining customers and maintaining revenues have become higher priorities.
 

The market has rejected Linux desktops. Get over it.

(Jason Hiner, TechSanityCheck) I've been running Linux on PCs since 1998, when Red Hat still cared about the desktop and Mandrake was supposed to be the distribution that was going to bring Linux to the masses. That was also about the time that the mainstream media got infatuated with the story of the free operating system from the Finnish hacker that was going to bring down Microsoft Windows.

Spoiler alert: I'm going to give away the ending now. It never happened. In the decade since it was first proclaimed as the "Windows killer," Linux on the desktop has made virtually no progress in real adoption numbers. According to market share trackers (based on real PC activity and not just sales) such as Net Applications, StatCounter, W3Counter, and others, the market share of Linux has been hovering around just 1-2% of total PC operating system installations for a decade.

Even in the past two years since the netbook phenomenon began with Linux as its primary OS, Linux market share has failed to make a major jump. The chart below, based on Internet visitors tracked by Net Applications, shows the trajectory of Linux desktop market share over the past 24 months.
 

Hardening Linux with Bastille UNIX

(Kevin Beaver, CISSP) Even with the common vulnerabilities I've talked about in the past, Linux is a solid operating system (OS) that stands up well to security tests. This doesn't mean, however, that you should let your guard down. Over time, configuration tweaks, third-party software and human intervention tend to change the security posture of once-secure Linux systems. This will inevitably lead, at best, to dings noted on vulnerability-assessment or audit reports.

But there is a way to establish a solid Linux security foundation and set your business up for future success, and that is hardening your Linux systems using Bastille UNIX, an open source project led by Jay Beale.

Formerly named Bastille Linux, the graphical user interface (GUI)-based Bastille UNIX steps you through the OS-hardening process for Debian, Gentoo, Mandriva, Red Hat and SUSE Linux distributions, as well as HP-UX and Mac OS X. Its intuitive question-and-answer approach allows you to lock your system down without having to worry about fat-fingering or configuring something incorrectly along the way. Bastille is not just a hardening program -- it's also a great learning aid, something that could be used to teach classes.

Bastille UNIX is an easy download and even easier to run. There are several system hardening categories you can choose from, including patches, file permissions, account security, domain name systems and more. As shown in Figure 1, Bastille prompts users with specific questions and offers detailed explanations to ensure that the effects of each action will be understood.
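
For readers who want to try it, here is a minimal sketch of a typical Bastille run on a Red Hat-style system. It assumes the bastille package and its Perl Tk/Curses dependencies are already installed (exact package names vary by distribution):

bastille -x    # step through the graphical question-and-answer session
bastille -c    # or use the text-mode (curses) interface on a headless server
bastille -b    # later, re-apply the saved answers from /etc/Bastille/config in batch mode

The questions map directly onto the hardening categories mentioned above, so the saved configuration file also serves as a record of exactly what was locked down.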
 

Ubuntu 9.10 makes a serious charge toward the enterprise level

(Jack Wallen, TechRepublic) At the end of this month (October 29th, 2009 to be exact), Ubuntu will be releasing its newest take on the Linux operating system. This time around, it should be obvious (even to the biggest of skeptics) that Ubuntu is making some serious inroads into the business and enterprise scene.

Prior to this release, I would have said Ubuntu is the best-of-the-best Linux candidate to take over the desktop scene. It's user friendly, easy to install, stable, secure, and (minus games) everything you would need for desktop computing. Now, however, the ante has been upped. With 9.10 (Karmic Koala), Ubuntu could well be on its way to becoming the best-of-the-best for the business desktop.

"Pshaw!" you say, old naysayer that you are. Well, before you pishposh this off to /dev/null, you might want to take a look at a few of the new features (and improvements to old features) Ubuntu is bringing to life. Many of these new features should make it readily apparent that Canonical is planning some stealthy attack on the enterprise-level desktop. Let's take a look at these features.

Bootup: Ubuntu has already made serious headway in the boot-up process. When 9.04 was released, the goal was the ever-elusive 10-second boot time. With Ubuntu 9.10 they are inching very close to that mark. The alpha release I tested had yet to reach that magic number but it's getting close, clocking in under 20 seconds (and I was testing in a virtual environment, so the installation wasn't getting 100% of the machine's resources).

Software Center:
This is really one of the biggest enterprise-level additions. Gone will be the Add/Remove Software tool and (eventually) Synaptic in favor of the Software Center. Although this tool will be used in the same way as the Add/Remove Software tool, it will have one thing its predecessor didn’t have - Commercial Software. That’s right. Now the enterprise (or SMB) user can go to the Software Center and shop for just about any type of software you can imagine - including non-free, enterprise-grade software! All in one very user-friendly tool.
 

Economics and virtualization stunt Linux server growth

(Leah Rosin, Site Editor, EnterpriseLinux) As the economy struggled in 2008, some IT pros thought Linux adoption and use would grow in enterprise data centers. After all, Linux is supposed to be the low-cost solution. A year ago, nearly half (47%) of the respondents to a SearchDataCenter.com reader survey said they expected to evaluate Linux for data center use, citing lower cost as the primary driver.

But at the end of the third quarter of 2009, things don't appear so rosy for Linux data center evaluation and deployment.

According to this year's "Data Center Decisions 2009: Purchasing Intentions Survey," the number of respondents saying that they use Red Hat Enterprise Linux decreased to 43% in 2009 from 48% in 2008. SUSE Linux Enterprise usage declined slightly as well, to 20% from 21%, compared with the previous year. But use of Ubuntu Server Edition increased somewhat, to 14% from 9%.

With these numbers in mind, consider that 60% of respondents reported that they do not use and are not evaluating Linux on their servers this year, compared with 54% saying the same in 2008. The depressed economy is doubtless a factor, and Enterprise Management Associates analyst Steve Brasen, for one, was not surprised by the results. "In general, every survey I have seen about IT purchases this year shows the numbers are down. Businesses just aren't buying as much equipment as they were last year," he said.
 

All Linux needs is a good commercial

(Jack Wallen, Linux and Open Source - TechRepublic) That’s what commercials do. They take products and make them desirable, no matter what the product. Think about how many bad products have been sold by good commercials. The right commercial can get the public to buy a steaming pile of poo and think they have purchased gold. And that is just what Linux needs. Mark Shuttleworth needs to spend some coin to purchase some air time and create a commercial for the Ubuntu distribution that will become a viral wonder. Make Linux mysterious, sexy, and funny! Make it something the people can talk about.

The scene comes up with Larry David (creator of Seinfeld and star of TV’s Curb Your Enthusiasm) sitting at a desk in front of a laptop. He’s obviously having a bit of trouble (in the way only Larry David can have trouble). He’s getting frustrated at something. He’s growing verklempt over an issue with his laptop. He’s picking it up and shaking it saying, “No, no, no, no, no!” In comes Julia Louis-Dreyfus (Elaine from Seinfeld) who sits down beside him with her own laptop. She opens up the laptop and the welcoming ditty from the Ubuntu GNOME desktop login is heard. Elaine starts working away. Mr. David looks over at Elaine’s laptop and says, “You better be careful, there’s a virus going around here…your computer will catch it.” Mr. David starts to give Elaine’s laptop the “evil eye” and says, “Catch it. Catch it. Catch it!”

It doesn’t catch it. Elaine continues working and eventually says: “Larry, it’s Linux…it won’t catch it. It’s safe.”
Larry: “Well, give me Linux.”
Elaine: “Go get it yourself.”
Larry: “But I just bought this laptop, I don’t want to have to buy anything else.”
Elaine: “It’s free. Now go away.”
Larry: “Free…” The gears are working in Larry’s brain. “What’s the catch. There’s always a catch.”
Elaine: “No catch, Larry. It’s just free.”
Larry looks over Elaine’s shoulder as she continues working.
Larry: “Looks good. Hard to believe all of that is free. Why haven’t I heard of this before?”
Elaine: “You have now. Now go away!”
Larry: “Fine. I’ll go get this Linux.”
Larry grabs his laptop and leaves the room. Fade out with voice of Jerry Seinfeld saying, “If Larry David can use it…”
 

The Incredible Shrinking Operating System

(Saul Hansell, NYtimes) Google’s new Chrome operating system is a challenge to Microsoft in several ways. It will offer a free rival to Windows, which can add $25 to $100 to the price of a computer. But it also represents a conceptual slap at the elaborate array of features that make up the soon-to-be-unveiled Windows 7.

Chrome OS will be positively minimalist by contrast. It will be built on a simple version of Linux that is meant to run only one application: the Chrome browser. Google’s idea is that anything for which you may have wanted a separate software program can be done within the browser instead. Never mind all the other functions and add-on programs you find in Windows.

This vision of the declining importance of the operating system reminds me of a conversation I had recently with Paul Maritz, the chief executive of VMware and a former top Microsoft executive who once ran its operating system unit. “The traditional operating system is becoming less and less important,” Mr. Maritz said. “It’s not going to go away, but it is going to shrink.”
 

Linux's position in cloud computing efforts

(Pam Derringer, News Contributor, TechTarget) Clouds are such big Linux news these days that, in the physical world, it would be raining by now. Or at least heavily overcast. The latest development is the Linux Foundation's report on Linux and cloud computing, which states that Linux is the OS of choice for major cloud platforms now and in the future. The report follows just days after the Distributed Management Task Force (DMTF) formed a collaborative Open Cloud Standards Incubator to develop standard protocols to remove interoperability barriers. Ultimately, the DMTF's goal is to spur cloud adoption by making its workloads more portable and easier to manage. And, finally, Raleigh, N.C.-based Red Hat Inc. recently announced a virtual forum on open source cloud computing July 22 to address interoperability problems and ways to solve them.

The Linux Foundation report said cloud computing is growing because data center costs are escalating and improvements in virtualization, distributed computing and IT management make cloud computing a more feasible option. Linux, in turn, was an obvious choice for cloud computing from the get-go because of its open source, modular architecture, its low cost and its scalability, the report said.

"The fact is that Linux is already the de facto operating system of choice for cloud computing," and will be the foundation of cloud platforms going forward, concluded the report.
 

Ubuntu a minor player? Not outside the States

(Christopher Dawson, Zdnet) There are a lot of people outside the United States doing a lot of incredibly innovative things in education and many of them are doing it cheaply with Linux.

So-called “emerging markets” (which, at this rate, won’t be emerging for long, and will quickly become “overtaking markets”) are rolling out a variety of operating systems and engaging in really progressive learning models. Worldwide, there are 13 million active Ubuntu users with use growing faster than any other distribution. Check out these trends from Google gauging online interest (with breakdowns by region).

In many places, people (far more than the 300 million in the US) are buying their first computers with no preconceptions about what an OS should be. As these markets explode, one has to wonder if our perception that the US is the only market that matters to operating system vendors will change. Microsoft gets it; they are putting incredible amounts of pressure on governments in Brazil to compete with Metasys and are largely proving unsuccessful (this is only one example, of course; Microsoft is working very aggressively in countless other markets).

In China, Ubuntu is gaining traction quickly since, due to rampant piracy, Windows is essentially free in that country. New users are choosing operating systems based on merit rather than price, since price is largely irrelevant in that market.

So perhaps Ubuntu, and Linux in general, will struggle to gain market share on the desktop in the States for now, particularly since Windows 7 looks to be a decent operating system. We need to remember, though, that the US is not the only PC market in the world. In fact, it's a shrinking, saturated market, and many of the international partners with whom we work in this global economy find Windows and Linux to be equally legitimate (or even favor Linux because of its openness and its low-cost or free software).
 

Linux and Windows compromised at boot

(Chad Perrin, TechRepublic) There's a lot of debate over what constitutes a "secure" operating system. The debates seem to become most heated when people compare the Big Three of home desktop OSes — Microsoft Windows, Apple MacOS X, and the Linux family of operating systems. Of course, it's difficult to convincingly offer a definitive declaration that any given operating system is "more secure" than another.

OpenBSD is rightly proud of its record of only two identified remotely exploitable vulnerabilities in default configuration through its entire stable release history, but even this is not proof positive that an OS is the “most secure”, considering that security needs change from one system deployment to another.

Ultimately, any of the widely used general purpose OSes can theoretically be compromised. The recent popularity of virtual machines, allowing one to simultaneously run multiple virtual computers on a single physical hardware platform, has provided hints of one particular threat that may apply even to an OS running outside of the controlled environment of a virtual machine: compromise by altering the OS image in memory during boot. This kind of danger has become something of a common bogeyman for VM users, as they worry that some piece of malware may be able to break free of the limits of the VM, and affect the OS in ways that have not previously been a concern for operating system installs on “bare metal”.

In theory, however, there is no specific reason something similar cannot be done to a system running without the virtual machine environment, as long as malicious security crackers can find ways to access the machine’s boot process itself. This may be prohibitively difficult to achieve remotely, at this time at least, but it presents a very worrisome state of affairs for cases where a security cracker may have physical access to the computer.

In the case of Microsoft Windows and certain Linux distributions, this concern is not just theory. It is also a very concrete reality. Piotr Bania has put together a proof of concept, a boot compromise tool called Kon-Boot, which so far has been tested and confirmed to work on at least four Linux distribution releases and a slew of common MS Windows releases.
 

Have we now entered the post-OS era?

(Jason Hiner, Editor in Chief, TechSanityCheck) Microsoft knew this day was coming. This was the reason it desperately wanted — no, needed — to take down Netscape in 1996. Netscape wasn’t just trying to build a program for reading text and photos across a network of connected computers. Netscape was trying to build a new platform - the ultimate platform - to run software and share information instantly and on a global scale. And no one understood that better than Bill Gates.[...]

There are a lot of reasons for the failure of Windows Vista, but in retrospect the biggest reason was that the OS simply didn't matter that much anymore. Most of the consumers who ended up with Vista simply got it because it came installed when they bought a new computer. The vast majority of them never chose Vista. The group that did have a choice with Vista was businesses, and they chose to avoid it, although not because of any inherent inferiority of Vista. The problem was that there was never a compelling reason to upgrade to Vista. It was the software equivalent of repainting a room and rearranging the furniture.[...]

It didn’t used to be this way. Installing a new operating system used to be like getting a whole new computer. Installing Windows 95 over Windows 3.1? That was a huge improvement. Installing Windows 2000 on top of Windows 95? That was a big leap forward. There were reasons to upgrade back then. [...]

Part of what's going on here is that the computer operating system has achieved a level of maturity and efficiency. You could even say that work on the OS has reached a point of diminishing returns. How much more efficiency can we wring out of it? What other major innovations are waiting out there?[...]

Twenty years ago, we thought the computer was the revolution, but it wasn’t. The advent of the Internet - and the Web browser as one of the ways to harness it - has shown us that the revolution is actually in communications and the dissemination of information. The computer will be to the Information Revolution as the assembly line was to the Industrial Revolution. It will simply be one of the catalysts that helped make it happen.

In the same way, the computer OS simply doesn’t mean as much as it once did, or at least as much as we once thought it did. But, then again, all of us (including Bill Gates) knew this day was coming.
 

A look at real-world exploits of Linux security vulnerabilities

(Kevin Beaver, CISSP) Probably the simplest exploit to carry out against Linux systems is to look for unprotected NetBIOS shares. Weak Samba configurations are often very revealing. For example, file shares created for the sake of convenience can end up coming back to haunt you. I've seen Samba-based Linux shares that provided anyone and everyone on the network with access to sensitive files containing patient health records, and network diagrams with detailed information (e.g., passwords for accessing network infrastructure systems, source code, etc.).

This attack is simple to carry out. All someone needs to do is to be logged into the network as a standard Windows user (i.e. no admin privileges), run a network share finder tool such as what's available in GFI LANguard, and then run a text search tool such as FileLocator Pro. As I have mentioned before, it's really simple for anyone on the network to gain access to sensitive documents they otherwise should not have access to - and no one may ever know about it.

A related attack is one against poorly-configured FTP servers that allow anonymous connections or have accounts with weak or nonexistent passwords.

In one such situation, anonymous FTP provided access to a configuration file that happened to have the password for a financial management database hard-coded into it. You know where things can go from there.

Another Samba exploit can lead to remote user enumeration. When a Linux system's Samba configuration allows for guest (i.e. null session) access, vulnerability scanners such as Nessus and QualysGuard can enumerate the system to glean user names. In most instances an attacker can use these user names in subsequent password-cracking attacks against Linux accounts. In many cases, you can also use a Web vulnerability scanner such as WebInspect or Acunetix Web vulnerability scanner to glean Linux user accounts via an unsecured Apache installation that doesn't have the UserDir Disabled directive in the httpd.conf file.
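
As a quick self-check along the lines of the attacks described above, an administrator can probe his own systems from any networked machine. This is just a sketch; the hostnames and the sample user below are hypothetical:

smbclient -L //fileserver01 -N        # list Samba shares over a null (guest) session
ftp -n fileserver01                   # then try logging in as anonymous with any password
curl -sI http://webserver01/~jsmith/  # differing responses for real vs. bogus users reveal accounts

If the share listing succeeds without credentials, or the anonymous FTP login is accepted, the same enumeration and file-search attacks described here will work for anyone else on the network.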
 

More OSes gain hypervisors, but most users choose OS-independent approach

(Bridget Botelho, News Writer) With the rise of KVM and Hyper-V, the move is clearly afoot to build hypervisors into operating systems, but most users say the OS-independent approach to virtualization is here to stay.

Non-OS based virtualization from VMware can be expensive and resource-intensive because it requires tracking and licensing for virtual machines (VMs) running on disparate OSes. But it also provides much-needed features.

Advantages of the OS-based approach are exemplified by a Hackensack, N.J.-based virtualization user who hates the hassle of assigning VMware virtual machines (VMs) to OS licenses and struggles with tracking and managing VMs running outside the core OS. Some observers say these virtualization management burdens could cause a VMware defection to Red Hat KVM and Microsoft's Hyper-V, which can be built into and managed directly through an OS.

"Why spend money on [virtualization software], third-party tools, add-ons, and support contracts for a virtualization solution from a totally separate vendor when your OS - Windows or Linux - already has the same features built right into them," he wrote on a popular IT community board.

Non-OS hypervisor plusses and minuses

Microsoft's Hyper-V operates on bare metal and is typically used with Windows Server 2008, although Microsoft also supports SUSE Linux Enterprise Server 10 as a guest OS. KVM, or the Kernel-based Virtual Machine, is a Linux-based hypervisor for hosting Red Hat Enterprise Linux (RHEL), Windows, and Sun Microsystems' Solaris OS.

This user also complained about the licenses required by each OS running VMware VMs. "You have a separate financial transaction to worry about."
 

Avoiding disaster recovery pitfalls in VMware and Linux

(Richard Jones, TechTarget) Last year, at several seminars on advanced enterprise virtualization, I asked attendees about their virtualization deployments. Most had deployed VMware with the Virtual Machine File System (VMFS) as the storage file system, but few had deployed virtual machines (VMs) with raw device mapping (RDM) for storage access.

When creating storage for a VM, VMware's cluster file system, VMFS, is the default and follows the simple path, creating virtual disks inside a VMFS logical volume on a logical unit number (LUN). Then when you install a Linux OS, you can use the default. While the simple default settings for creating VMs may speed deployment and appear to reduce management overhead, these default settings pose problems. The central issue is that these settings target the simplest case without regard for particular system requirements. Defaults are often designed for a product evaluation, where ease and speed are the objectives. But particularly when it comes to disaster recovery (DR) and backup, consider the possible long-term effects of these default settings. Let's consider some enterprise scenarios to illustrate how default settings can be a bad move in virtual deployments.

Serverless backup and raw disk maps
As modern applications store ever more data, it takes more time to back up or restore an application's data. This runs counter to business disaster recovery demands for shorter recovery time objectives (RTOs). Compounding the situation is the ability of modern servers, with more processor cores and memory, to host higher VM consolidation ratios that can tax limited host backup/restore I/O bandwidth. All three factors have resulted in backup and recovery times that are unacceptable today compared with just a few years ago.

The solution is to move the backup and restore process away from VMs and to a storage area network (SAN) with serverless backup products such as VMware Consolidated Backup (VCB). Serverless backup requires RDMs to SAN LUNs to allow the backup server direct access to the LUN. But if a VM was created with defaults, you must re-create virtual disks and move data into RDM-configured storage to support a serverless backup architecture. This necessitates a major departure from default configurations for virtual environments.
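
For illustration, a raw device mapping can be created from the ESX service console with vmkfstools. This is only a sketch: the device path and datastore names below are placeholders, and whether to use -r (virtual compatibility) or -z (physical compatibility) depends on what the backup product requires:

vmkfstools -z /vmfs/devices/disks/vmhba1:0:3:0 /vmfs/volumes/datastore1/dbvm/dbdata-rdm.vmdk

The VM's virtual disk then has to be repointed at this mapping file and the data migrated onto the SAN LUN - exactly the rework the default settings force on you after the fact.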
 

10 iptables rules to help secure your Linux box

(Jack Wallen, TechRepublic) The iptables tool is a magnificent means of securing a Linux box. But it can be rather overwhelming. Even after you gain a solid understanding of the command structure and know what to lock down and how to lock it down, iptables can be confusing. But the nice thing about iptables is that it’s fairly universal in its protection. So having a few iptables rules to put together into a script can make this job much easier.

With that in mind, let’s take a look at 10 such commands. Some of these rules will be more server oriented, whereas some will be more desktop oriented.[...]

1: iptables -A INPUT -p tcp --syn -j DROP
This is a desktop-centric rule that will do two things: First, it will allow you to actually work normally on your desktop, since all network traffic going out of your machine will be allowed out. Second, all TCP/IP traffic coming into your machine will simply be dropped. This makes for a solid Linux desktop that does not need any incoming traffic. What if you want to allow specific networking traffic in — for example, ssh for remote management? To do this, you'll need to add an iptables rule for the service and make sure that service rule is run before the rule that drops all incoming traffic.

2: iptables -A INPUT -p tcp --syn --destination-port 22 -j ACCEPT
Let's build on our first command. To allow traffic to reach port 22 (secure shell), you will add this line. Understand that this line will allow any incoming traffic into port 22, which is not the most secure setup on its own. To make it more secure, you'll want to limit which machines can actually connect to port 22 on the machine. Fortunately, you can do this with iptables as well. If you know the IP address of the source machine, you can add the -s SOURCE_ADDRESS option (where SOURCE_ADDRESS is the actual address of the source machine) before the --destination-port portion of the line.
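
Putting the two rules together, with ssh restricted to a single management host, looks roughly like the sketch below. The address 192.0.2.10 is just a placeholder; what matters is the order, because iptables evaluates rules top to bottom and -A appends them in the order the script runs:

iptables -A INPUT -p tcp --syn -s 192.0.2.10 --destination-port 22 -j ACCEPT
iptables -A INPUT -p tcp --syn -j DROP

With the ACCEPT line added first, the management host reaches port 22 while every other new inbound TCP connection hits the catch-all DROP.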
 

10 obscure Linux applications you need to try

(Jack Wallen) Do a search for Linux applications on Freshmeat and you'll get around 11,828 hits. (As of January 12, 2008, that was the tally.) Of those 11,828 applications, which ones are worth using? Certainly not all of them. Still, buried within that grand total you will find a few gems that get zero publicity but are worth giving a go. This article will highlight some of these little-known apps, which range from multimedia to certificate authority tools and anything/everything in between.

#1. Floola
Floola isn't an open source application, but it does run on Linux (as well as OS X and Windows). Floola takes music management (in particular, syncing iPods) one step further. With this nifty application, you can download and convert YouTube videos for playback on your iPod. But unlike some other clunkier applications, Floola does this seamlessly and simply. No commands to enter; it's all GUI. The only possible gotcha is that before you can add videos from YouTube, you have to install ffmpeg on your Linux box. Floola uses ffmpeg for the conversion process.

Don't expect Floola to have all the bells and whistles that iTunes has. Floola offers photo support, Snarl (Windows only) and Growl (Mac only) notifications, notes, iPod repair, export of lists to HTML, language support, lyrics, duplicate and lost file search, artwork support, video support, Google Calendar support, playlists, podcasts, Last.fm support, and more. Floola is simple to use in Linux, as it comes in an executable binary that you can simply copy to the /usr/bin directory and run with the command Floola.
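
Based on the steps described above, getting Floola running on an Ubuntu-era system is roughly the sketch below. The archive name is hypothetical, and the ffmpeg package name may differ by distribution:

sudo apt-get install ffmpeg    # required before YouTube video conversion will work
tar xzf floola_linux.tgz       # unpack the downloaded archive
sudo cp Floola /usr/bin/       # the application ships as a single executable binary
Floola &                       # launch the GUI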

#2. Transkode
Sticking with the multimedia theme, Transkode is a front end for the highly flexible, modular command line toolset Transcode. Transcode is one of the most versatile audio and video converting tools available. Transcode has both a graphical and a text-only interface and supports a vast number of formats including DV, MPEG-2, MPEG-2 Part 2, H.264, Quicktime, AC3, and any format included under libavcodec. Transcode can import DVDs on the fly and record from Video4Linux devices. The problem with Transcode is that the commands can get a bit overwhelming for the average user. Transkode remedies this by employing a user-friendly interface that makes the complex business of converting multimedia format files as simple as it can be.
 

Using Linux in a data center consolidation management strategy

(Ken Milberg) Why is Linux the answer for data center consolidation? To start, for a large IT infrastructure project to be approved in today's climate, it must make real business sense. The project must have a significant return on investment (ROI), which will lower total cost of ownership (TCO). Linux operating systems work well with these demands, as will be explained in this tip.

The primary goal of any data center consolidation project is to lower power and cooling costs, reduce real estate footprint and improve overall server workload utilization. Because data center consolidation usually involves server consolidation, this type of project should also substantially reduce the cost of service contracts and software licensing while further increasing server workload utilization. I've led projects in which the largest single cost reduction was on the software licensing side. (Those of you who have worked with Oracle licensing on your server farms know what I'm talking about.)

Data center consolidation strategies

There are essentially two ways to consolidate data centers. One is a like-for-like forklift move; for example, a customer looking to consolidate four data centers into two simply moves all of the equipment from the four sites into the two remaining data centers. There may be some virtualization and server consolidation involved, but no real effort to redesign or reengineer the infrastructure. The other scenario is to increase efficiencies and improve processes along with the consolidation of data centers. This involves strategic planning regarding the overall direction of the infrastructure. While more complex and difficult, the latter approach is the one that will provide the greatest ROI and the biggest TCO reduction.[...]

Operating system consolidation
If your data center consolidation management strategy is built around Linux, you can keep your mainframe but run Linux on it. The same goes for your IBM p595 Power servers (formerly running Unix) and your commodity x86 blades. At the same time, you may also consider migrating off of Solaris to Linux. These moves can streamline your infrastructure. You will likely not be able to trim staff down to just six people, but you might be able to get by with half of the staff previously needed. This is just one small example of how you can increase efficiencies by doing an operating system consolidation project in conjunction with a data center migration. As you know, Linux is no longer something relegated to the back end of data centers running Web or DNS servers. In fact, many companies already run all their information systems on this platform (e.g., Oracle).
 

Choosing the best server OS: Linux vs. Windows comparisons

(Logan G. Harbaugh) In a way, server operating systems are simpler than workstation OSes. They don't need to support as wide a variety of accessories and generally don't need to run as wide a variety of applications. On the other hand, the applications they run, such as databases, Web servers, email servers, collaborative applications and application servers, can stress both the server OS and the hardware. So choosing the best server operating system can be a trial.

Ten years ago, there were two main choices for a server OS running on commodity hardware: Novell's NetWare 4 and Microsoft's Windows NT. Today, Windows 2008 is still a solid choice, and although NetWare has disappeared into history, Novell's version of Linux is a good choice as well. On the proprietary side, the options are much the same as they were 10 years ago: Unix variants that run on proprietary hardware from Sun, IBM, SGI and others.

Choosing the best server operating system depends largely on a server's function. The easiest choice for a file-and-print server that supports Windows clients running Microsoft Office is Windows 2003 or 2008. While it's possible to support Windows file shares and run a server collaboration application that supports Outlook on a Linux server, it's more complex to set up and run smoothly. On the other hand, a file server supporting Linux workstations or an outward-facing Web server or application server is no more difficult to set up on Linux than on Windows and will probably be more secure in the default configuration and less of a pain to maintain over time.

Windows vs. Linux: Installation, maintenance and security
Both Windows and Linux offer pros and cons. Windows is easy to install and run in its default mode, includes an array of drivers for virtually any type of hardware and has the widest variety of software available. On the other hand, it suffers from frequent security problems and requires critical patches that usually involve rebooting. It is also expensive, from the initial purchase price of the OS and applications to the ongoing maintenance required to keep it stable and updated. Linux requires careful consideration of which available drivers are appropriate for your hardware (including the motherboard) and whether newly released hardware (such as Intel i7 motherboards) is supported yet. It also requires more knowledge to install and run the OS and applications. But at the same time, Linux is generally more stable and secure than Windows, especially the Enterprise editions available from Red Hat and Novell, which use kernel versions that have been around long enough to become completely stable.
 

Red Hat Fedora Claims It's the Leader in Linux

Not a bad way to make an introduction for Red Hat Enterprise Linux 6! This claim of "King of the Linux Hill" is sure to conjure some fuss and debate within the open source community. If the snippet below sparks your interest, be sure to check out the full article ;)

(By Sean Michael Kerner, Internet.com) "On the eve of its next major release, the distro produces new figures showing that it's ahead of rivals in total users.

Counting Linux users is no easy task since there is typically no requirement for users to register their installations. Yet Linux distributions do try and count users in an attempt to quantify their user base and relative footprint in the operating systems space.

Red Hat's Fedora community Linux distribution has now tallied its user base, and it's a number that on the surface would make it the largest installed base of any Linux distribution, with at least 9.5 million users and possibly as many as 10.5 million. Fedora competitor Ubuntu Linux currently claims to have 8 million users.

The Fedora figures come out as the major players in Linux continue jockeying for position as the dominant vendor in the space, while also competing to make inroads against proprietary software. The news also comes as Frields and his team are ramping up to deliver their next release, Fedora 10, which is slated for Nov. 25th.

The counting methodology has the potential to overcount the same machine if the box has a dynamic IP address -- for instance, using Dynamic Host Configuration Protocol, or DHCP -- that could change over time."

Looks like they could turn that red hat into a red crown if irrefutable research confirms Red Hat Fedora to be number one. But imagine the arguably improbable embarrassment of a definitive user count dethroning the self-proclaimed ruler of the Linux realm - sure wouldn't want to work in their PR department if that happened! How do you feel about Red Hat Fedora? Do you think it merits its alleged no. 1 spot?
 

Gartner warns of misguided virtualization strategies

LAS VEGAS -- Too many midmarket CIOs look at server virtualization as a means of reducing space, power and cooling demands in the data center. As a result, they could be shortchanging their businesses.

Realizing savings from consolidation is largely a onetime event, said Gartner Inc. vice president Tom Bittman, who spoke Tuesday at the Gartner Data Center Conference.

In fact, if organizations are getting into virtualization just to address power and cooling issues, they might be making a mistake.
 

Linux made its first desktop breakthrough

It seems that Linux is gaining more and more popularity in the battle for the operating system of choice for cheap desktop computers. After the OLPC (One Laptop Per Child) program decided to use Mandriva instead of a light version of Windows for the XO laptops, Wal-Mart now reports it has sold out its gPC stock.

The gPC runs a Google-friendly Linux operating system built around Ubuntu Linux. The computer is priced at $199 and can be used to perform most common tasks, such as using an office suite or listening to music. Critics, however, have yet to proclaim the gPC a success. Given its very low price, this impressive sales rate may be related only to the product's price tag, not to its actual usefulness.
 

Thou shalt fear the GPL...

The first GPL (General Public License) lawsuit ever in the U.S. has seen the light of day. The Software Freedom Law Center was the one to make the official announcement. The complaint was filed by two main developers of the awesome set of Unix utilities called BusyBox. As it appears, the Monsoon Multimedia company redistributed BusyBox without supplying the source code, as required by version 2 of the GPL.
 

Linux, the New Botnet Command Center

A recent study conducted by eBay on the security threats it faces pointed out some interesting results. Linux systems have become the command centers of choice for phishing operations and botnets. Because of its reputation as a very secure and stable operating system, Linux is often installed on server machines. While botnet soldier systems can join and leave the botnet without concern, the central node must remain available as long as possible to control them. Linux servers are therefore a very valuable resource to any electronic criminal.