
Removing Microsoft from the equation - long live Linux


TeleSkoop


Have to agree 100% - even their business range of desktop machines now gets bundled with so much rubbish that it takes as long to go through the post-install and remove the unwanted guff as it does to install the image. There was a time when HP business machines came with just the OS and drivers for the hardware in the box....

When I have to work on an HP machine, the first thing I do, where possible, is a clean (re)install. Even with the endless joyful fun of tracking down all the correct drivers*, and the amount of time the actual OS install takes, it's still a better option than trying to fix the existing HP install.

* Good old HP: a lot of the time the model isn't actually to be found on their own site, and, even when it is, there's a chance that the listed drivers are not the correct ones. Both situations occur too often.


Visual Studio 2010 is probably one of the most competent environments out there.

I don't doubt it. It's just that I find it frustrating trawling through endless tabs to find the particular option that has inadvertently been set. I grew up with Makefiles and emacs and TeX, managing thousand-file projects with text files, where I could find everything very quickly. "Visual" environments are just not my cup of tea; I prefer structured text.


I don't doubt it. It's just that I find it frustrating trawling through endless tabs to find the particular option that has inadvertently been set. I grew up with Makefiles and emacs and TeX, managing thousand-file projects with text files, where I could find everything very quickly. "Visual" environments are just not my cup of tea; I prefer structured text.

My only problem is the compilers; I do not mind visual environments per se (though I still love emacs/make, and of course LaTeX). The lack of standards support got me so annoyed that I uninstalled the lot. THAT is a story in itself: the uninstall did not work properly, and I ended up manually cleaning up disk and registry.

There are plenty of tools that use make under the hood, but give you a lot of additional support (suggested completion of methods/parameters, a quick overview of object relationships and inheritance, etc.). As a quick stop-gap we used Dev-CPP under Windows for one or two projects, but it is rather primitive, and borks under Vista. We are having a look at Netbeans (with C/C++ extensions). Portable code is essential for us, because we work on very diverse systems.


There are, though, different requirements for portable scientific programs and for the sort of thing likely to appeal to the "Wii Family" or others besides. <G> But I am but a "recreational" (e.g. M$ Silverlight) programmer, these days. :)

The downside of "Studios" is that, if they go wrong (they have been known to have bugs), you are left with a whole bunch of files, stored in odd places, that are quasi-useless. It is sometimes possible to hand-hack files and to cut and paste source files, but that can take some doing... :)

In fairness to M$, it's quite NICE of them to provide "beta" and "lite" versions for FREE to the home market. But clearly one might not maintain one's "LHC analysis" (whatever) under such things. Though it may well be that they now do such things; software (and hardware!) reliability has increased... :)

But, without inspiration from SGL "debates", I might never have come across:

jokes > Stroustrup C++ 'interview'

The "older" (non-evangelical?) reader might appreciate such things more... :)


Portability is also an issue in "software product lines" used in, e.g., mobile phones. You really want to reuse stuff rather than rewrite. This is why one of our previous professors in software engineering and architecture was snagged by a big mobile phone producer.

There are many free integrated development environments out there. Pick whichever you like best, or pay for a better one if needed. I am still very pleased that some code I wrote in 1990 is still in use today, and ran without any adaptation in a new setting; the only difference was the speed. I recall a discussion in which some people doubted the suggested approach would be fast enough. I reminded them that it was written to work on thousands of data points on an 80286 at 8 MHz, and could handle 500,000 data points in 2D in 15 minutes on a 60 MHz 80486 in the distant past. It processed about 3 million 3D data points per minute at a much higher resolution on a single core of a Core 2 Quad.


Wise men leave science. <sigh> What can one say? "Scientists, Mathematicians... whatever" make GREAT programmers though! :)

It's a shame that "researchers" view programmers as "technicians". I struggle with... "software consultants" too. Heck, everyone's a "Web Designer" these days? My IT relief-teacher cousin's fantasy. Some of this I miss, naturally! But thankfully I no longer have to write my own valueless contribution to "self-promotional" annual confidential appraisals. :)


Wise men leave science. <sigh> What can one say? "Scientists, Mathematicians... whatever" make GREAT programmers though! :)

It's a shame that "researchers" view programmers as "technicians". I struggle with... "software consultants" too. Heck, everyone's a "Web Designer" these days? My IT relief-teacher cousin's fantasy. Some of this I miss, naturally! But thankfully I no longer have to write my own valueless contribution to "self-promotional" annual confidential appraisals. :)

We computer scientists are often considered "engineers" or "technicians" by more "high brow" sciences such as mathematics and astronomy. My general response is "so what?" (until they start messing with our funding :))

I have fun making things work well and work fast. Apart from research, I teach future programmers/software engineers/researchers to think about things like computational complexity, reusability, readability, etc. There is an element of art too: finding the right balance in a host of conflicting requirements for a particular application. That requires much more than the five years of training we can offer.

There is a lot of fun to be had programming and doing science. I try to focus on that.


OK, back on topic. A few remarks on ImageJ vs the Gimp: the former can handle 16-bit-per-pixel and floating-point images, which can be a great benefit for high-dynamic-range images (as occur often in astronomy). Gimp only handles 8 bits per channel. It can be a bit of a hassle finding things in both ImageJ and Gimp, and ImageJ is geared more to the image analysis specialist than to general Photoshop-like use (Gimp is better in that sense). Having said that, it is very straightforward to add plugins to ImageJ, and to automate procedures. Two ways exist: you can simply record a sequence of steps and let ImageJ generate the Java plugin, or you can write something manually.

I have found certain methods were not implemented efficiently, but again, those with programming skills can alter the implementation of supplied plugins at will. A lot of ImageJ plugins are floating around on the web, so first see if someone has made something already before you go and write your own.

People without programming skills may well prefer Gimp, but remember you do not have to be able to programme to create plugins for ImageJ. Besides, it's free, so give it a shot. As mentioned by others, it is available for any OS.


I just tested wxAstrocapture, and it now seems to be working neatly. Maybe I did not plug the camera in before starting the program.

All camera controls work neatly. As I have a lot more free disk space on the /home partition than in Windows, this is very handy. I will also try Registax under Wine; that could prevent the great deal of memory swapping needed under Windows for big files.


I'm running Ubuntu 10.04 as well. Very happy with it. Went Windows cold turkey a few years ago now. Even my wife likes it!

I looked at 11 but there are a few too many basic 'known issues' at this stage for me to take the plunge.

Of course there is the odd annoyance in terms of software incompatibility, but on the whole I feel it's worth it. I do wince (pardon the ppc pun) when I have to run something under Wine - Registax was the latest. Oh for the day when I can remove it... Wine, that is.

Sorry if I missed it somewhere on this thread, but is there a native alternative that compares favourably?

I too use wxAstroCapture and find it does everything I need right now.

Lee


Gimp... I do not see how an 8-bit dynamic range can be used for anything related to astro imaging. In fact, I convert to 32-bit floating point right after dark/flat processing in order to stay within bounds and not lose precision.

If you want to work on Unix or Linux (or Windows or OS X for that matter), the uniform way that covers all platforms is to fork out the money for PixInsight. It is worth every cent!

/per


Gimp... I do not see how an 8-bit dynamic range can be used for anything related to astro imaging. In fact, I convert to 32-bit floating point right after dark/flat processing in order to stay within bounds and not lose precision.

If you want to work on Unix or Linux (or Windows or OS X for that matter), the uniform way that covers all platforms is to fork out the money for PixInsight. It is worth every cent!

/per

Absolutely right. 32-bit float is best for many tasks in processing astrophotos, though 32-bit integer processing has better resolution for certain types of processing, and for larger images Fourier-transform-based techniques really require double precision to avoid all sorts of round-off errors.
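As a rough illustration of why you would promote to float right after dark/flat processing, here is a minimal NumPy sketch of the calibration step. The function name and the synthetic frames are my own invention for illustration, not taken from any particular package:

```python
import numpy as np

def calibrate(light, dark, flat):
    """Basic dark/flat calibration, promoting to float32 up front
    so intermediate results keep precision and can leave the
    16-bit integer range without wrapping or clipping."""
    light = light.astype(np.float32)
    dark = dark.astype(np.float32)
    flat = flat.astype(np.float32)
    corrected = light - dark          # remove thermal signal
    flat_field = flat - dark          # dark-subtract the flat too
    flat_field /= flat_field.mean()   # normalise the flat to ~1.0
    return corrected / flat_field     # correct vignetting and dust

# Synthetic 16-bit frames, just to exercise the function:
rng = np.random.default_rng(0)
light = rng.integers(1000, 60000, (4, 4), dtype=np.uint16)
dark = np.full((4, 4), 500, dtype=np.uint16)
flat = np.full((4, 4), 30000, dtype=np.uint16)
result = calibrate(light, dark, flat)
print(result.dtype)  # float32
```

Had the subtraction been done in uint16, a hot dark pixel brighter than the light frame would silently wrap around; in float32 it simply goes negative and can be dealt with later.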

I use either MATLAB or home-brew code at work. MATLAB is rather expensive, though.


I tried Linux every so often over the years, but it wasn't until Ubuntu came along that I thought it could compete with Windows in general. For several years Ubuntu had various problems, but I preferred it to Windows in most respects. Nowadays Linux/Ubuntu has built-in drivers for pretty much everything, and "just works". It is faster and more code-efficient than Windows. It is also more reliable, though Windows XP with all its updates and SPs has become pretty stable, and virtually everything will work with it where later versions have problems, I'm told.

I have always liked the Linux principles: the modularity, the way the system is protected from rogue applications, easy multi-user features, and much more. But I didn't feel it really "came of age" as a universal OS that everybody could use until a few years ago, once Ubuntu had really got going and the many bugs had been ironed out.

Linux is very handy for sorting out disasters produced under Windows. Data recovery, partition table recovery and, of course, creating separate partitions in the first place.

One of the principal features of Linux is that most applications are freeware, in the public domain, or otherwise effectively free to use. After spending much cash on Windows applications, it was a breath of fresh air when I went fully over to Linux and no longer had to spend scarce cash on software. Now, with the success of free Linux software, much of it has become available for Windows too. Freeware and public domain software was available for Windows before, but there wasn't as much of it, and what there was was not as good as commercial software. Now this freely available software is every bit as good as its commercial equivalent. A lot of cross-platform programming languages have been developed in the last few years that make it relatively easy to produce applications with versions for Mac OS X, Linux and Windows.

As for myself, I use Linux Mint (a derivative of Ubuntu) on my main desktop and an old laptop which runs my weather software, and Windows XP on a few machines for applications which have no Linux version and won't run under wine in Linux. I also use Mac OS X on my no.1 laptop which is a MacBook Pro and a real delight to use.

From a user's point of view I find little difference between the three operating systems. In fact there are a number of apps where I use versions of them on all 3 platforms.


The issues I had with Ubuntu were the common ones of getting no sound out of either the front or rear panel, and of it not starting after an update. I jumped through many hoops to try to get it to go, and it refused on both counts. Ubuntu can be idiot-proof when it's going well, and very basic, but now and again something comes up that requires you to really know programming - something that isn't an issue with Windows. Toylike perhaps, but you know that when you push the power button you're going to see a desktop screen well into the 99.99999s of percent, and you don't need to wade through webpage after webpage of terminal commands to find the one that lets you do something that Windows does and that we take for granted.

It's this that keeps people buying an operating system that costs between £80 and £250 instead of a free one from the internets, and I think Linux users like it that way.


I did find a terminal command that got sound coming from the front panel earphone socket, but it was a bit rubbish having the speakers plugged into the front, and after an update it went back to not working.

I enjoyed Ubuntu while I was using it, the sound issue aside; I even found running Windows games on Wine easy. I still use the boot disk now and again to open documents that Windows has issues with, namely the compatibility of Word, Works, Office, etc., where documents made by one Microsoft programme can't be opened/read/changed by another.


Linux is very handy for sorting out disasters produced under Windows. Data recovery, partition table recovery and, of course, creating separate partitions in the first place.

This is where I will totally disagree with you. All filesystems on Linux suffer from recovery problems. Power off a Linux box accidentally and you will have file system problems. NTFS, the filesystem used in Windows (and I do not count the 9x series as even being an operating system), is based on principles from the VMS filesystem used on the VAX computers from Digital. It is almost impossible to knock it off track. I have powered off servers and workstations for ages, and never - I mean NEVER - have I seen an NTFS disk lose data. In the dark '90s, I accidentally powered off an Alpha server with Windows NT 3.51 and 400 GB of RAID. Mind you, this was the mid '90s, and the disk system comprised two 19" full-height cabinets with 4 GB disks facing both sides! When we powered it up we missed the "press space to NOT check disks" prompt and it went into chkdsk. It took a good six hours and there was no data loss.

Every time I forcefully power off a Linux box it needs to check the disks, and about half of the checks result in data loss. I do not know why. The journaled versions of the ext series of filesystems should be OK, but they constantly amaze me by being so vulnerable.

My two cents... :D

/per


NTFS may be based on the VMS file system, but Linux file systems are based on Unix file systems. All journalling file systems under Linux (ext3, ReiserFS, ext4, and many besides) are very robust. I have never had issues with accidental power downs: the filesystem just replays the journalled transactions and is done with it. NTFS is OK, but even Microsoft has been looking for a replacement for a long time. It was really one of the key innovations planned in project Longhorn.
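The "replay the transactions" idea can be illustrated with a toy write-ahead log in Python. This is purely a sketch of the principle - real filesystems journal at the block level, and all the names here are invented:

```python
import json
import os
import tempfile

class ToyJournal:
    """Toy write-ahead log: record the intended change, mark it
    committed, then (conceptually) apply it. After a crash, replay
    committed entries and discard any torn, uncommitted tail."""

    def __init__(self, path):
        self.path = path
        open(path, "a").close()  # ensure the journal file exists

    def write(self, key, value):
        with open(self.path, "a") as j:
            j.write(json.dumps({"key": key, "value": value}) + "\n")
            j.write("COMMIT\n")   # commit record makes the entry durable
            j.flush()
            os.fsync(j.fileno())  # force it to stable storage

    def recover(self):
        """Replay the journal: only entries followed by COMMIT count."""
        state, pending = {}, None
        with open(self.path) as j:
            for line in j:
                line = line.strip()
                if line == "COMMIT" and pending is not None:
                    state[pending["key"]] = pending["value"]
                    pending = None
                elif line:
                    pending = json.loads(line)
        return state  # the uncommitted tail is simply dropped

path = os.path.join(tempfile.mkdtemp(), "journal.log")
jr = ToyJournal(path)
jr.write("a", 1)
jr.write("b", 2)
# Simulate a crash mid-write: entry appended but no COMMIT record.
with open(path, "a") as j:
    j.write(json.dumps({"key": "c", "value": 3}) + "\n")
print(jr.recover())  # {'a': 1, 'b': 2} - the torn write is discarded
```

The point is that recovery is a deterministic replay rather than a full-disk consistency check, which is why a journalled filesystem comes back up quickly after a power cut.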


NTFS may be based on the VMS file system, but Linux file systems are based on Unix file systems. All journalling file systems under Linux (ext3, ReiserFS, ext4, and many besides) are very robust. I have never had issues with accidental power downs: the filesystem just replays the journalled transactions and is done with it. NTFS is OK, but even Microsoft has been looking for a replacement for a long time. It was really one of the key innovations planned in project Longhorn.

Yes, that is the way it should be, but please try powering off an ext3 system a few times. I lost a couple of configuration files on an ESX 3.5 host system just last Wednesday when it panicked. It wasn't difficult to fix, just very annoying. Three files were garbled. Same with one of my Linux systems last month - one forced power-off and a bunch of files garbled.

NTFS is still robust, and the thinking behind a replacement was more about added functionality. The part-SQL-based file system that was planned for Longhorn (WinFS) was actually scrapped because the same functionality was achieved with filter drivers providing the "libraries".


I have seen some NTFS problems of a similar nature. Ext4 is supposed to be a big improvement, and is now the OpenSUSE default. ReiserFS was very robust, but is no longer under development (no comment :D ). I have heard of people using ZFS under Linux to great effect. That is one of the nice things about Linux: if you don't like a component, replace it.

Where Linux really shines is in multitasking (in server environments). A Dutch bank used to have its web servers running NT on (in those days powerful) dual-Xeon-based machines. These bogged down under a 25-user load. A Linux box (single Pentium II based) had no problems with 1500 users. This is down to the way daemons are handled: in the Windows case, the process table becomes cluttered with (in Unix parlance) zombie processes.

Again, an OS is just a tool. Use the one that best fits your needs.

