At the time a friend was also getting into computers as a way of life. One day I was over at his place and he was showing off a program he liked. His computer of choice was an IBM PC running MS-DOS. I remember being impressed with the program and told him I might get a copy for myself. He told me I couldn’t. “Why?” I asked.
His response was, “It only runs on MS-DOS.”
My reaction was akin to a child learning that there’s no Santa Claus. At the time all I understood about operating systems was that a computer needed one to work. I really didn’t have any conception that there were operating systems out there other than the Apple DOS that ran on the machines in the computer lab.
Those of us who embraced virtualization and cloud computing hoped that these technologies would break the hold operating systems have on the computing experience. They really haven’t. While there have been efforts to make operating systems work in a conceptually similar manner, there are differences that need to be understood in order to have the full benefit of the cloud computing experience.
Remote access: A tale of two operating systems
Facilitating remote access is one such example. Linux supports SSH, which is how most Linux admins access remote Linux servers at the command line. Windows, on the other hand, uses WinRM to facilitate command-line access to remote Windows machines. This creates a problem when you have to move between Linux and Windows targets using a provisioning technology such as Ansible or a CI/CD product such as Jenkins.
If you want to work remotely across operating systems, you need to know how each OS facilitates remote access. Otherwise, you're dead in the water.
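Ansible illustrates the split well: it reaches Linux hosts over SSH by default, while Windows hosts need connection variables telling it to use WinRM. A hypothetical inventory sketch (hostnames, group names and transport settings are placeholders, not a recommended configuration):

```ini
# Hypothetical Ansible inventory; all names are placeholders.
[linux_servers]
app01.example.com
# Linux hosts use Ansible's default connection: SSH.

[windows_servers]
win01.example.com

[windows_servers:vars]
# Windows hosts must be told to use WinRM instead.
ansible_connection=winrm
ansible_winrm_transport=ntlm
ansible_port=5986
```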
There are other differences as well. For example, Windows has a system-wide configuration storage and management technology called the Windows Registry, a holdover from the COM days. Microsoft has been trying to minimize the Registry's role through the .NET/CLR technology, but the Registry is still used by native Windows applications, as well as by .NET applications that rely on COM interop.
A plethora of application installation methods
When it comes to installing applications, the operating system matters. Linux has a variety of package management systems (yum, apt and dnf), and the one you use depends on the Linux distribution. Apple's macOS, which is a Unix variant rather than a Linux, uses Homebrew, MacPorts, and pkg and dmg packaging. Native Windows, on the other hand, uses the MSI-based Windows Installer, and there is an emerging package manager called Chocolatey. Also, for .NET apps, there is the NuGet platform, which facilitates downloading and installing external libraries.
You can’t run yum on a Windows machine. Conversely, you can’t run the MSI installer on Linux. This difference matters a lot when doing automated machine provisioning.
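Provisioning scripts end up branching on which installer exists. A minimal sketch, assuming a POSIX shell (the fallback message is ours; no packages are actually installed here):

```shell
#!/bin/sh
# Sketch: pick an install command based on which package manager is present.
if command -v apt-get >/dev/null 2>&1; then
    installer="apt-get install -y"
elif command -v dnf >/dev/null 2>&1; then
    installer="dnf install -y"
elif command -v yum >/dev/null 2>&1; then
    installer="yum install -y"
else
    installer="none"   # on Windows you'd reach for MSI or Chocolatey instead
fi
echo "installer: $installer"
```

Tools such as Ansible hide this branching behind modules like its generic package module, but under the hood the same per-distribution decision is being made.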
Linux, OSX and Windows will allow you to install a native app by doing nothing more than copying the executable file to disk. However, this method works only under certain conditions, mostly when there are no dependencies on external libraries. When the primary executable binary has a dependency on an external library, then the way the library is known and accessed varies by operating system and, to some degree, application framework.
For example, Windows might use the Registry to locate the dependency. Or it might rely on a library being present in a predefined location in the file system: by default, .NET looks for dependencies in the bin subdirectory of an application if the given library is not present in the Global Assembly Cache (GAC). (Of course, the base .NET binaries need to be installed beforehand in order for any .NET application to work.)
Linux will also follow file system conventions in some cases. Java tries to make dependency management work the same across operating systems; sometimes it does, and sometimes it doesn't, depending on the Java framework in play. For example, the Maven build tool is cross-platform, but it has a dependency management system all its own.
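On Linux you can see this library-resolution convention in action: ldd lists the shared libraries a binary depends on and where each one resolves on disk. Here /bin/ls is just a convenient example binary present on most Linux systems:

```shell
# Show which shared libraries a binary links against, and where the
# dynamic loader found each of them on this particular system.
ldd /bin/ls
```

The paths in the output vary by distribution, which is exactly the point: the resolution rules are a property of the operating system, not of the application.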
Those pesky little command line differences
And then, of course, there are those pesky command line differences. For example, to copy a file in Linux, you use the command cp, while Windows has copy. To get network information in Linux, you use ifconfig (or the newer ip command); in Windows, it's ipconfig. Linux has a filtering command, grep, which is bread and butter for Linux power users, but Windows does not ship with grep (findstr is the closest native equivalent). These are but a few examples of many standard command differences.
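A quick, self-contained taste of the grep workflow, with the rough Windows counterparts noted in comments (the file name is just an example):

```shell
# Windows counterparts, for comparison:
#   copy demo.txt backup.txt   <->  cp demo.txt backup.txt
#   findstr "world" demo.txt   <->  grep "world" demo.txt
printf 'hello\nworld\n' > /tmp/demo.txt
grep 'world' /tmp/demo.txt
# prints: world
```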
In terms of the shell itself, Windows has the standard command prompt (cmd.exe). To provide more power at the command line, Microsoft introduced a beefed-up shell called PowerShell, which has a learning curve all its own. PowerShell has some similarities to the old DOS-style shell but is far more powerful; for example, it supports .NET objects and C# programming at the command line. Linux goes another route: It has the standard shell, sh, as well as beefier shells such as bash, the Korn shell (ksh) and fish.
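The philosophical difference shows up in pipelines: a POSIX shell pipes plain text between small tools, while PowerShell pipes structured .NET objects. A tiny sketch (the PowerShell line is shown only as a comment for comparison):

```shell
# Classic Unix text pipeline: deduplicate lines, then count them.
# A rough PowerShell analogue pipes objects instead:
#   'b','a','b' | Sort-Object -Unique | Measure-Object
printf 'b\na\nb\n' | sort -u | wc -l
# prints: 2
```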
Two distinct approaches to doing security configuration
The way you do security for users and groups on a machine is different too. Handling users and groups is a lot easier at the command line in Linux, while Windows is a bit more complex.
The users and groups GUI hides a lot of the management complexity in Windows. Working through the GUI is the preferred method of interaction in Windows, even for Windows Server. However, this incurs a lot more overhead in the form of more disk, memory, and CPU required to support the GUI. Server editions of Linux ship command line only, so they are more lightweight. Yes, you can add a GUI to a Linux server, but that’s considered the sign of an amateur.
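On Linux, user and group management really is a couple of commands. The privileged commands below are shown as comments because they require root, and 'developers' and 'alice' are placeholder names; the PowerShell equivalents are noted alongside as a rough sketch:

```shell
# Linux (run as root; names are examples):
#   groupadd developers
#   useradd -m -G developers alice
# Rough PowerShell equivalents on Windows:
#   New-LocalGroup -Name developers
#   New-LocalUser -Name alice
#   Add-LocalGroupMember -Group developers -Member alice
# Unprivileged sanity checks: who am I, and what groups am I in?
id -un
id -Gn
```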
Putting it all together
Someone who is used to working with Linux in the cloud and then needs to move over to Windows — for example, to support working with Ranorex Studio — might experience a significant learning curve. The same is true for someone moving from Windows to Linux. Consequently, those working in the cloud will do well to have a basic familiarity working with each OS at the command line. Having such cross-platform understanding will save time and heartache if they ever want to work in an unfamiliar environment.
Not only that, but when you set up your environment, take notes. If it is in the cloud, you can script it up with code. The important thing is to make it reproducible, so the next hire, or replacement, won’t have to go through all that work themselves just to get started.
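Even a short bootstrap script beats notes alone. A minimal sketch of the idea, with the package list and directory name as placeholders (nothing is actually installed here):

```shell
#!/bin/sh
# Minimal reproducible-setup sketch; PACKAGES and the workspace
# directory are placeholders for whatever your environment needs.
set -eu
PACKAGES="git curl"
mkdir -p "$HOME/workspace"
echo "would install: $PACKAGES"
echo "workspace ready at $HOME/workspace"
```

Checked into version control, a script like this becomes the documentation: the next hire runs it instead of reading about it.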