
Software question (maybe a stupid one?!)



I had a few challenges with software at SGL5. I tried to minimise what I needed to put on the netbook (given space constraints, to keep it running OK), and minimised it too much ;).

I was wondering whether it was possible to set up a removable drive to hold all the stuff I need for star parties and run the software from there. Would it be possible, or would it be too slow?

Thanks

Helen


The type of software you're describing is known as "portable software" or "portable applications". I think it's possible to turn any piece of software into one with something called "application virtualisation" - don't ask me how, though!

Are you sure you need to do this, though? It's a common misconception that using up space on your hard disk affects performance. What actually slows things down is having many apps running in main memory at once (the exception being a totally full hard disk, which leaves no room for paging).

Although it may sound like I know what I'm talking about when it comes to computers, this is the second time I've typed all of this out, as I managed to hit "post" when the connection was down, doh!


As long as you don't run it all at once it should be fine. What you'll find with a netbook is that the graphics card probably isn't up to it. As demonperformer says, use a memory stick or invest in a separate portable USB hard drive. It's the RAM that affects the speed of your computer, not really the storage space: think of RAM as a car's engine and storage as the car's boot. You can buy more RAM (random access memory) and this will make the netbook quicker, but as I mentioned, if your graphics card is poor then you're on a hiding to nothing. I've got Cartes du Ciel, Stellarium, Deep Sky Stacker and Registax 5, as well as Office and other applications, on my Dell notebook, but I only run one at a time. Hope this helps?


Well, a portable hard drive will drain the batteries faster. You're better off with a pen drive (you can find 32, 64 or even 128GB pens on the market). With this option you just change the install path to the pen drive. Shared libraries will still end up on the OS drive, but all the other files will be on the pen.

Other options:

- Clean the netbook as much as possible and leave only what you really need. You can even uninstall the anti-virus, as long as you pre-check anything that goes on the PC first. I do that on my work machine: no AV there, and every pen/disk is pre-checked on my personal laptop. (The work machine is an HP workstation worth over £2k, with 2 processors (8 cores each), 12GB RAM and SAS disks in RAID 5. The system is optimised for top performance and software development: no Windows effects, no desktop image, just a black desktop with the grey classic Windows start menu.)

- The second option is to have Microsoft's "Virtual PC" installed and keep a virtual machine on a pen drive which you can run whenever you want. I also use this extensively for software deployment test scenarios. Making a virtual machine is a very simple process if you know how to format and set up a regular PC. However, virtualisation needs RAM (the more the better), so I suspect a netbook won't be fast enough while running a virtual PC.


All the main astronomy software (Stellarium, Cartes du Ciel, VMA, EQMOD etc.) runs perfectly fine on my Samsung N130 netbook - which I bought specifically as a sort of "deluxe" controller for my HEQ5.

It's a common misconception that using up space on your hard disk affects performance.

Erm... Hardly a misconception! If there's not enough disk space for virtual memory to operate properly, your computer will slow down dramatically when memory gets tight, eventually becoming unusable.


If you had read the comment fully, you would have noticed that this was already mentioned:

It's a common misconception that using up space on your hard disk affects performance. What actually slows things down is having many apps running in main memory at once (the exception being a totally full hard disk, which leaves no room for paging).


Erm... Hardly a misconception! If there's not enough disk space for virtual memory to operate properly, your computer will slow down dramatically when memory gets tight, eventually becoming unusable.

True, but then again virtual memory is a trick to compensate for a lack of physical memory (RAM), so if you're using that much virtual memory it means your computer is not suited to what you're using it for, and you should either add RAM or get a new one. In fact, the constant page faults (the trigger that forces the OS to swap a chunk of memory from the paging file on disk back into RAM) add yet another processing step and a few extra accesses to disk and RAM, making things even slower. When you push virtual memory to its limits, the OS can lose almost as much processing time handling page faults as it spends actually running your programs.
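If you want to check whether that's happening on your own machine, here's a rough sketch in Python using the third-party psutil package (pip install psutil) - the thresholds are purely illustrative, not official figures:

```python
# Quick look at physical RAM vs. paging file usage (cross-platform).
# Needs the third-party psutil package: pip install psutil
import psutil

ram = psutil.virtual_memory()   # physical memory
swap = psutil.swap_memory()     # paging file / swap

print(f"RAM : {ram.used / 2**20:.0f} MB of {ram.total / 2**20:.0f} MB used ({ram.percent}%)")
print(f"Swap: {swap.used / 2**20:.0f} MB of {swap.total / 2**20:.0f} MB used ({swap.percent}%)")

# Illustrative rule of thumb: RAM nearly full *and* a busy paging file
# while you work is the thrashing scenario described above.
if ram.percent > 90 and swap.percent > 50:
    print("Physical memory is maxed out and the paging file is busy - expect thrashing.")
```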

Disk speed is actually more important than disk space from the performance point of view, especially when you use applications that eat up so much RAM they always need to use the disk as an aid (e.g. video rendering applications). When you plan a PC for that kind of use, it's normal to spend over €200 on a high-speed disk that only has 146GB capacity (e.g. an HP SAS 2.5" 15K 146GB), while a normal SATA disk with over 1TB costs less.


True, but then again virtual memory is a trick to compensate for a lack of physical memory (RAM), so if you're using that much virtual memory it means your computer is not suited to what you're using it for

I don't entirely agree with that - I don't see any harm in having unused applications spooled out to the background if they're dormant for the time being. In practice, it's having loads of active stuff spooling in and out that starts the thrashing and eventual death-on-its-knees of a PC.

Also, there's an awful lot of memory these days getting gobbled up by badly-written "software update checkers" and application pre-loaders (Acrobat Reader Quick Start, Microsoft Office Quick Start, Java Runtime Quick Start etc. etc.). Often these things, and arbitrary "system tray" utilities, consume a surprising amount of memory for little or no function (such as simply adding a pop-up menu to an icon next to the clock).

First thing I did when I purchased my netbook last week was go through the registry and kill off all that junk - including Intel's pointless hotkey server and graphics tray icon processes. There's probably a bunch of system processes that aren't needed either - but I'll get to those in due course.


I was assuming the computer was already optimised and cleaned of all that junk. If you let all that junk in, of course not even the latest model of PC will work as it should.

I was speaking about running processes, as in ones in use by the user ;). Of course there's no point in having dormant services in RAM - that's the way to use virtual memory well. I was referring to when you've got a few programs open, you check the resources, and all your physical memory is used up and there's another 3 or 4GB in the paging file, including memory pages of the programs you're actually using.

Page fault - Wikipedia, the free encyclopedia


I was assuming the computer was already optimised and cleaned of all that junk. If you let all that junk in, of course not even the latest model of PC will work as it should.

But in many cases you're not going to know it's there, or where it came from, or how to get rid of it once it's discovered - especially if you're primarily into astronomy first and computers second.

For example, it may be handy to have a PDF viewer so you can read the instructions that came with your GoTo scope - but how many people are actually aware that installing Acrobat Reader might also install the memory-grabbing "quick start"? It's not so bad in itself, but there are dozens of examples like this (QuickTime etc. etc.) - and the Java Quick Start pre-loader is so buried that no normal PC user would know how to remove it. A lot of stuff like that is often pre-installed when you buy the PC and often has no conventional "uninstall" utility.

Perhaps we ought to publish a guide on SGL for astronomy: "Windows on Netbooks: How to remove the stuff that you don't need, so you can run the stuff you do"... Hmmm... snappier title required though ;)

I remember when someone did the same for Music Production PCs their web site had so much traffic they had to move to a subscription model to pay for the hosting! :)


The easiest route is:

1) go to the Start menu, then Run (or the search box in Windows Vista/7)

2) type "msconfig" and press Enter

3) in the window that opens, go to the Startup tab

4) deactivate everything you don't need - if you're not sure, deactivate it anyway

5) restart

If later on you notice something has stopped working, run msconfig again, re-activate the most likely suspect, restart and test. If it's OK you're done; if not, repeat the process.
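If you'd rather see what's in there before touching anything, a good chunk of what msconfig's Startup tab lists comes from the two classic "Run" registry keys, and you can print those entries with a few lines of Python (read-only, so it can't break anything):

```python
# List startup programs from the HKLM and HKCU "Run" registry keys -
# a large part of what msconfig's Startup tab shows (Windows only).
import winreg

RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

for hive, label in ((winreg.HKEY_LOCAL_MACHINE, "all users"),
                    (winreg.HKEY_CURRENT_USER, "current user")):
    print(f"Startup entries ({label}):")
    try:
        with winreg.OpenKey(hive, RUN_KEY) as key:
            index = 0
            while True:
                try:
                    name, command, _ = winreg.EnumValue(key, index)
                except OSError:        # no more values under this key
                    break
                print(f"  {name}: {command}")
                index += 1
    except FileNotFoundError:
        print("  (key not present)")
```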


As the hard disk becomes full, files can become more fragmented across the disk, making them slower to access. Also, files written to the centre of the hard disk can be twice as slow to access as those at the outer edge, because the surface moves more slowly in the middle.

I found a great utility called UltimateDefrag, a fully customisable defragger which tries to move stuff to the outer edge of the hard disk for faster access. You can make it do this for your Windows boot-up files and program files, then move your documents and least-accessed stuff to the centre.

UltimateDefrag Freeware Edition - Free software downloads and software reviews - CNET Download.com

My (oldish) laptop now cold boots into XP and stops all disk activity within 45 seconds.

PS: I think that when you install applications, you might be able to use the option to install them on an external disk, but I've not tried this.


files written to the centre of the hard disk can be twice as slow to access as those at the outer edge, because the surface moves more slowly in the middle

At the risk of veering off-topic...

Although it's true that the disk's linear velocity changes depending on the lateral position, the angular velocity remains constant - and since disk sectors are angular, pie-shaped slices of the disk, and each sector holds a fixed amount of data regardless of the area it spans, I'm struggling to understand why the data rate would change (over and above any additional seek time).

I'm not saying that it doesn't - I'd just like to understand why - if anyone can provide any links?

The only way I could see the transfer rate changing across the disk (other than by seek time) is if sectors are entirely virtualised these days, and aren't pie-shaped at all...

Anyone know?


I use a prog called 'TuneUp Utilities'.

One of its functions is to find all the pointless Microsoft software add-ons that come with the operating system - the stuff that we never ever use or need, but it's there and it's running without us knowing.

It also has a turbo function that can disable non-essential progs if you need to run a large, RAM-hungry bit of software.


At the risk of veering off-topic...

Although it's true that the disk's linear velocity changes depending on the lateral position, the angular velocity remains constant - and since disk sectors are angular, pie-shaped slices of the disk, and each sector holds a fixed amount of data regardless of the area it spans, I'm struggling to understand why the data rate would change (over and above any additional seek time).

I'm not saying that it doesn't - I'd just like to understand why - if anyone can provide any links?

The only way I could see the transfer rate changing across the disk (other than by seek time) is if sectors are entirely virtualised these days, and aren't pie-shaped at all...

Anyone know?

I think I can answer that one: it's not the angular velocity that's important, it's the speed at which the surface of the disk is travelling, which is greater at the edge than at the centre. As the physical space the sectors and data take up is the same at the edge as in the middle, the sectors and data will pass under the head more quickly at the edge than at the centre. If you run a utility like SiSoft Sandra (free) to benchmark your disk, it will show the read speed tailing off towards the centre of the disk.
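To put some numbers on that (all invented for illustration): if the recording density is constant across the platter - which is how modern "zone bit recording" drives work, with outer tracks holding more sectors - the edge comes out roughly twice as fast as the middle:

```python
# Back-of-the-envelope transfer rates for a constant-density platter
# (zone bit recording). All figures are invented for illustration.
RPM = 5400
REVS_PER_SEC = RPM / 60            # 90 revolutions per second
SECTOR_BYTES = 512

TRACK_SECTORS = {"outer": 1000,    # outer tracks are longer, so they
                 "inner": 500}     # hold more sectors per revolution

for zone, sectors in TRACK_SECTORS.items():
    rate = sectors * SECTOR_BYTES * REVS_PER_SEC / 1e6
    print(f"{zone} track: {rate:.1f} MB/s")   # outer ~46 MB/s, inner ~23 MB/s
```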


As the physical space the sectors and data take up is the same at the edge as in the middle

That's the bit I have a problem with - because (with traditional disk formatting at least) they don't take up the same space. The space is bigger at the edge and the information less dense, so the information transfer rate would be identical, since it's the same amount of data per revolution whether it's at the centre, middle or edge of the disk. It's just like a slice of a pie - as shown on this Wikipedia page.


That's the bit I have a problem with - because (with traditional disk formatting at least) they don't take up the same space. The space is bigger at the edge and the information less dense, so the information transfer rate would be identical, since it's the same amount of data per revolution whether it's at the centre, middle or edge of the disk. It's just like a slice of a pie - as shown on this Wikipedia page.

Yeah, I thought the same. I studied this some years ago, but I wasn't sure so I kept quiet :D. Defragging does help increase startup speed, and performance in software that uses a lot of disk space to operate. Another nice trick is to create a separate partition for the paging file, so it won't fragment the system partition (similar to the behaviour Linux enforces with its swap partition).
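Incidentally, if you're curious where your paging file currently lives, Windows records it in the registry - here's a read-only peek in Python (the key below is the standard location on XP/Vista/7; the value format is "path initial-MB maximum-MB"):

```python
# Show the current paging file configuration from the registry (Windows).
import winreg

KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    paging_files, _ = winreg.QueryValueEx(key, "PagingFiles")
    for entry in paging_files:     # REG_MULTI_SZ comes back as a list of strings
        print(entry)               # e.g. "C:\pagefile.sys 2048 4096"
```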


Intriguing - I think this is worth a thread of its own! :D

I'm no IT expert, by the way, but my evidence is:

1) below is an image from UltimateDefrag, which shows each sector as a fixed physical size (OK, this could just be a pictorial representation)

2) hard disk designers will try to fit as much data per square millimetre as possible, so why would they make it less dense in some areas (near the edge) and more dense near the middle, unless they were trying to maintain the same read/write speed for every position on the disk? That's not really a consumer requirement, whereas more storage space is

3) benchmarking a hard disk shows the speed dropping off towards the centre (there's a rough way to try this yourself at the end of this post)

4) UltimateDefrag explains that the reason for moving stuff to the edge is faster read speed.

I could be wrong of course ;):p
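For anyone who wants to test point 3 without installing a benchmark suite, here's a crude Python sketch: it times reads from the raw disk at a few positions. It only ever reads, so it can't damage anything, but it must be run as administrator, and both \\.\PhysicalDrive0 and the disk size are assumptions you'd adjust for your own machine. On most drives the low offsets map to the outer edge, so you'd expect the first figures to be the fastest.

```python
# Crude sequential-read benchmark at different physical disk offsets.
# Windows only; run as administrator. Reads only, never writes.
import time

SECTOR = 512                  # classic logical sector size (some disks use 4096)
CHUNK = 8 * 1024 * 1024       # read 8 MB per sample
DISK_BYTES = 160 * 10**9      # approximate disk size - adjust to yours

with open(r"\\.\PhysicalDrive0", "rb", buffering=0) as disk:
    for fraction in (0.0, 0.25, 0.5, 0.75, 0.95):
        offset = int(DISK_BYTES * fraction) // SECTOR * SECTOR   # sector-aligned
        disk.seek(offset)
        start = time.time()
        disk.read(CHUNK)
        rate = CHUNK / (time.time() - start) / 1e6
        print(f"{fraction:.0%} of the way in: {rate:.1f} MB/s")
```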


If my memory doesn't fail me, it has to do with the head, not the cylinder: the head can only read up to a certain speed, so if the disk moves faster at the edges the information needs to be more spread out to maintain a constant reading speed. And there were some more reasons I can't recall...

I did take a degree in Informatics and Computer Engineering (rough translation), but I work in software development and hardware is really not my field. There may be exceptions, and this may not be true for every disk, especially as it's been nearly 10 years since I learned it. Technology moves too fast to keep up with everything.

