
Graphics card (for image processing)?



Hi all

I suppose I posed the question on here as a bit of an afterthought since I just ordered a couple of cards. I appreciate that AP and image processing don't require the same GPU facilities as gaming, but my old 32-bit system only has a 512 MB GeForce 9500 GT (from 2009) in it, and my 64-bit i5 only has on-board Intel HD 3000 graphics. So I decided to buy myself a Christmas present :) (nobody else will!). I use the 32-bit system for watching TV as well, and it currently won't display HD channels properly :(. Also, the HD 3000 in the 64-bit machine doesn't seem to support HDMI (I think it's supposed to, but I've never been able to get an HDMI output) and I'd like to use HDMI if I could. Both monitors are 1920x1080. I've ordered an MSI Nvidia GT 730 for the 32-bit system and a Gigabyte GTX 1050 Ti for the 64-bit one. Hopefully, they should keep me going for a bit. Rumour has it that the PixInsight people are going to implement some graphics acceleration at some point - maybe...

Any thoughts about graphics cards generally? Anyone here done any CUDA/parallel programming? With a new card I might try my hand at that one of these days - just for fun :p.

Louise


6 minutes ago, Thalestris24 said:

I suppose I posed the question on here as a bit of an afterthought since I just ordered a couple of cards. [...] Any thoughts about graphics cards generally? Anyone here done any CUDA/parallel programming?

Louise

Using GPU compute libraries isn't simple. Unless you're already a very experienced programmer I wouldn't suggest diving into CUDA or OpenCL. They require... more than a bit of setup.

For image processing, GPU acceleration has the potential for a HUGE performance increase over the CPU, but I don't think much existing software really makes use of it outside scientific computing and 3D rendering (3ds Max, Blender, etc.) - some video encoders and, I think, Adobe After Effects are all I can think of at the moment.

For a glimpse into GPU programming, have a look at this: http://simpleopencl.blogspot.co.uk/2013/06/tutorial-simple-start-with-opencl-and-c.html
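To give a flavour of what that tutorial builds up to: the heart of GPU programming is writing one tiny function (a "kernel") that the runtime then runs once per pixel, across thousands of elements in parallel. This is just an illustrative CPU-side sketch in Python/NumPy, not code from the tutorial - a simple per-pixel linear stretch written both as a single-pixel function and as a whole-image vectorised operation, to show why the per-pixel work parallelises so well:

```python
import numpy as np

# A GPU "kernel" is one small function that runs once per pixel.
# NumPy's vectorised operations mimic that data-parallel style on the CPU:
# one expression, applied element-wise to the whole image at once.

def stretch_pixel(value, black=10.0, white=200.0):
    """Linear stretch for a single pixel value (the 'kernel')."""
    return min(max((value - black) / (white - black), 0.0), 1.0)

def stretch_image(img, black=10.0, white=200.0):
    """The same stretch applied to every pixel at once, NumPy-style."""
    return np.clip((img - black) / (white - black), 0.0, 1.0)

img = np.array([[0.0, 10.0],
                [105.0, 255.0]])
out = stretch_image(img)
# Each output pixel depends only on the matching input pixel,
# which is exactly why a GPU can process them all simultaneously.
print(out)
```

The key property is that no output pixel needs any other pixel's result - that independence is what OpenCL and CUDA exploit when they fan the kernel out across hundreds of cores.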


TBH, the demands on the graphics card are low for 2D applications, provided the card has enough memory to support the resolution you use - and preferably double that, in case you want to run two monitors. The monitor itself matters more, because it's what you use to judge the processing result visually - and the ability to calibrate the monitor finely is also important. CUDA needs special programming and is only suitable for specific tasks that can be split across hundreds of cores. SETI@home and BOINC are well-known apps that made use of it (at the expense of generating a lot of heat!).
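The "split across hundreds of cores" point is the crux of it. Here's a small hedged sketch in plain Python - using CPU threads rather than GPU cores, purely to show the shape of the workload: lots of independent items that can be farmed out to workers, versus a running total that can't be:

```python
from concurrent.futures import ThreadPoolExecutor

def work_item(x):
    # Stands in for one independent chunk of work (e.g. one data
    # block in a SETI@home-style search).
    return x * x

data = list(range(8))

# Independent items: farm them out across a pool of workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(work_item, data))  # input order is preserved

# By contrast, a running total is inherently sequential - each step
# needs the previous one, so extra cores (or GPU threads) don't help.
total = 0
for r in results:
    total += r

print(results, total)
```

If a task fits the first pattern it can scale to hundreds of GPU cores; if it's all the second pattern, a GPU buys you nothing - which is why only a few specific workloads benefit.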

ChrisH


18 minutes ago, pipnina said:

Using GPU compute libraries isn't simple. Unless you're already a very experienced programmer I wouldn't suggest diving into CUDA or OpenCL. [...]

For a glimpse into GPU programming, have a look at this: http://simpleopencl.blogspot.co.uk/2013/06/tutorial-simple-start-with-opencl-and-c.html

Doesn't look too bad :) I'll have a look-see at what Nvidia offers in the way of software support.


Unsurprisingly, Nvidia seems pretty committed to its CUDA support. At a glance, I'm quite impressed by their stuff. I think I could get into this! Especially the AI applications (nothing to do with watching 'Humans', honest!). I do need time and space to focus on it, though, and that probably won't happen until April at the earliest. Hopefully, I'll get the GTX 1050 Ti within a couple of weeks, and then I'll see about setting things up on the 64-bit i5. Lots of reading to do :) I think DIY quantum computing is still a ways off, so 'conventional' parallel computing seems worth investigating. I think this thread probably belongs elsewhere now - if a moderator would like to move it to one of the lounges, ta.

Louise


Archived

This topic is now archived and is closed to further replies.
