Parallel / multi-core: should we care? [modified]
-
It seems Microsoft is making a major push in this direction. Do you think it is much of a concern for the typical business-app world? That is, wouldn't this only really make much of a difference for heavy-load server software or CPU-intensive applications? As far as I can see, for most applications the current methods seem to be fine; maybe I am just missing the big picture. Edit: I guess I should have included that, for most applications, I figure letting the OS handle it seems a winner. It just seems like another thing a developer should not have to hassle with for most software, though it is handy to have as an option. But MS seems to be really heading down that rabbit hole and trying to pull developers with them.
modified on Tuesday, November 18, 2008 7:12 PM
-
Rocky Moore wrote:
for most applications, the current methods seem to be fine
This was probably true when Win 3.1 was running on a 386 - or indeed when Lotus 1-2-3 was running on DOS. I imagine a future where we have massively multi-core processors - imagine, for example, one processor for each pixel on your (3D) display, each constantly calculating. Or one processor per object in your application - objects interacting in real time in parallel, not just 'simulating' parallel processing. If we can work through the programming problems, the leap could be as staggering as the one between the software we're using now and those first text-based green-screen computers of yesteryear. Hope I live long enough to see it!
Life is like a pubic hair on the toilet seat... ...sometimes, you just get pissed off. .\\axxx (That's an 'M')
-
I don't think it'll make a lot of difference for the majority of software - just look at how little load your CPU is under already. It might help something like SQL Server, but even there my uneducated and possibly wrong guess is that disk speed probably makes more of a difference than CPU performance.
-
Rocky Moore wrote:
As far as I can see, for most applications, the current methods seem to be fine, maybe I am just missing the big picture..
In most cases you won't need it. However, it does open a few new doors, should anyone care to go down them. Back when I was working in the oil and gas industry I wrote a predictive analysis and trend monitoring system for various products. We had to limit its input because of CPU usage. Now you could use hourly trends to pick up things like lunch-hour rushes and plan for them, as well as multiple weekly, monthly, seasonal, and yearly trends - even by product. Normally such detail is reduced in order to cut down on CPU overhead, or you just look at the bottom line. But with detailed information like that you can encourage trends: run lunch specials, rush-hour madness sales, etc. Wherever you find a trend you can take advantage of it; wherever there are problems you can attempt to turn them around. The old way, you are already hip-deep in unsold goods before you realize you're not getting anywhere. My boss had a shed full of left-over glassware from a failed promotional gimmick. Don't gimmick - give 'em cheap gas! By predicting subtle trends he made an extra 8 million (for a total profit of 10 million) in the first year of the Gulf War, playing on the changing prices. Often chances for detailed analysis are overlooked because a business programmer doesn't know statistical analysis and data mining methods, let alone advanced filtering and detailed prediction. These are scientific analysis tools and have no place in business... or do they? There is a lot that can be done that is not being done. There is always more to do and to learn; even I don't know everything and can never hope to - anyone who says they know everything probably knows less than you do. But the short answer: to do the same things, to not grow or change, to not offer any new techniques or new information... no, you do not need parallel programming. :-D
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb) John Andrew Holmes "It is well to remember that the entire universe, with one trifling exception, is composed of others."
modified on Tuesday, November 18, 2008 2:38 AM
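The kind of hourly trend monitoring described above can be sketched very simply. This is a hypothetical illustration, not the poster's actual system: a per-hour baseline averaged across days, with hours flagged when they run well above the overall mean (e.g. a lunch-hour rush). The function names and the 1.5x threshold are illustrative assumptions.

```python
# Hypothetical sketch of hourly trend monitoring: average sales per hour
# across several days, then flag hours that run well above the baseline.
# The 1.5x threshold is an arbitrary illustrative choice.

def hourly_baseline(sales):
    """Average sales per hour across all days (sales: list of per-day lists)."""
    hours = len(sales[0])
    return [sum(day[h] for day in sales) / len(sales) for h in range(hours)]

def flag_rush_hours(sales, factor=1.5):
    """Return hour indices whose baseline exceeds the overall mean by `factor`."""
    baseline = hourly_baseline(sales)
    overall = sum(baseline) / len(baseline)
    return [h for h, avg in enumerate(baseline) if avg > factor * overall]

# Three days of sales over a simplified 6-hour window (say 10:00-15:00);
# hour indices 2 and 3 are the lunch rush.
sales = [
    [20, 25, 90, 80, 30, 22],
    [18, 24, 95, 85, 28, 20],
    [22, 26, 88, 78, 32, 24],
]
print(flag_rush_hours(sales))  # [2, 3]
```

The point of the post is that once CPU is cheap, nothing forces you to reduce this to daily or weekly totals - you can keep the full hourly (or per-product) resolution and mine it.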
-
.\\axxx wrote:
imagine, for example, one processor for each pixel on your (3D) display - each constantly calculating
It's called shaders, and it already exists ;) The latest 3D cards have a few hundred parallel stream processors calculating pixel colors :)
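The per-pixel idea is a good example of an "embarrassingly parallel" workload: each pixel's color is a pure function of its coordinates, so no pixel depends on any other. A GPU runs such a function on thousands of hardware threads at once; as a toy analogue (a sketch, not how shaders are actually written), the same structure in plain code looks like a map over the pixel grid:

```python
# Toy analogue of a pixel shader: color is a pure function of (x, y),
# so every pixel can be computed independently - the loop below could be
# split across as many cores (or shader units) as you like.

def shade(x, y, width, height):
    """Illustrative gradient: red rises left-to-right, green top-to-bottom."""
    r = int(255 * x / (width - 1))
    g = int(255 * y / (height - 1))
    return (r, g, 0)

def render(width, height):
    # Each (x, y) is independent: no shared state, no ordering constraints.
    return [[shade(x, y, width, height) for x in range(width)]
            for y in range(height)]

image = render(4, 4)
print(image[0][0], image[3][3])  # corners: (0, 0, 0) and (255, 255, 0)
```

That independence is exactly why graphics scales so well across cores, and why - as the next reply points out - most general-purpose software does not.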
-
I think it depends on which field you are in and what sort of data you have to deal with. For time-consuming stuff, definitely (I had a 12-core CPU grid running code analysis tasks yesterday, and boy, could we have done with that sort of CPU power when I was working on acoustic transponder array calibrations at Sonardyne!). :-\ Like every other technique, it is useful for some applications and irrelevant for others.
Anna :rose: Having a bad bug day? Tech Blog | Anna's Place | Tears and Laughter "If mushy peas are the food of the devil, the stotty cake is the frisbee of God"
-
It's called shaders, and it already exists ;) The latest 3D cards have a few hundred parallel stream processors calculating pixel colors :)
That's funny and true, but kind of irrelevant. NVIDIA and ATI graphics cards do have what's called massively parallel processing hardware, and they get plenty of work done. The reason it's irrelevant is that this hardware was designed specifically for the visual processing algorithms they run, which are usually "embarrassingly parallel", as HPC people call it. This is very different from general-purpose applications, many of which are quite serial (one-at-a-time style). Researchers have put years of effort into running general-purpose computations on graphics cards, and the only ones they can reliably extract better performance from are streaming and data-parallel algorithms - again, unlike most apps. (For good examples, see the RapidMind platform or BrookGPU.)
To the original poster: yes, it matters, but not for now. Right now, processors only have two or four cores that are compatible with single-core instruction sets (x86, mainly). I noticed a performance drop in some apps in the change from a 3.0GHz P4 to a Core Duo with two 1.6GHz cores. Some software just can't use both cores, so it simply slows down. This isn't a big problem right now, as desktop systems usually have hundreds of threads going at the same time, and adding a few dedicated cores that can run more simultaneous threads will definitely help. Sun's T1 processor is an excellent example, with 64 simultaneous threads in hardware for server apps.
However, if they keep subdividing the chips and produce 16-64 core designs with weak processing units and little dedicated memory, that will be a serious problem. We currently don't have the technology to make general-purpose software work efficiently on that many cores. It may even be impossible. ;)
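The worry in the last paragraph can be made precise with Amdahl's law (a standard result, not something from this thread): if a fraction p of a program is parallelizable, the best possible speedup on n cores is 1 / ((1 - p) + p / n), which is capped at 1 / (1 - p) no matter how many cores you add.

```python
# Amdahl's law: upper bound on speedup when only a fraction `p` of the
# work can be parallelized across `n` cores.

def amdahl_speedup(p, n):
    """Maximum speedup for parallel fraction p (0..1) on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# A program that is 90% parallel: 64 cores buy under 9x, and even
# infinitely many cores can never beat 10x.
for cores in (2, 4, 16, 64):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
```

This is why 16-64 weak cores are a poor fit for mostly serial desktop software: the serial fraction dominates long before you run out of cores.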