I bet my mission-critical DOS apps will really fly on all these cores!

Dodeca-core: The Megahertz Race is Now Officially the Multi-core Race

“Don’t be disappointed, AMD is making up for it,” hints one engineer. Further conversations revealed that inter-CPU communication is going to be a big deal with the 45nm refresh. The first breadcrumb comes with a new “native six-core” Shanghai derivative, currently codenamed Istanbul. This processor is clearly targeted at Intel’s recently announced six-core, 45nm Dunnington processor.

But sextuple-core processors have been done, or at least we’ll see the first ones this year. The really neat stuff comes a few months later, when AMD will finally ditch the “native-core” rhetoric. Two separate reports sent to DailyTech from AMD partners indicate that Shanghai and its derivatives will also get the twin-die-per-package treatment.
[…]
A twin-die Istanbul processor could enable 12 cores in a single package. Each of these cores will communicate with the others via the now-enabled HT3.0 interconnect on the processor.

The rabbit hole gets deeper. Since each of these processors will contain a dual-channel memory controller, a single core can emulate quad-channel memory functions by accessing the other dual-channel memory controller on the same socket. This move is likely a preemptive strike against Intel’s Nehalem tri-channel memory controller.
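
For what it’s worth, that “borrow the other die’s memory controller” trick is what operating systems already expose as NUMA: each die’s controller shows up as a memory node, and software simply allocates on whichever node it wants. A rough sketch of what that looks like from user space (mine, not AMD’s or DailyTech’s, and assuming a Linux box with libnuma installed):

```c
/* Rough sketch, not AMD's code: on Linux each die's memory controller is
 * exposed as a NUMA node, so "borrowing" the other die's dual-channel
 * controller is simply a remote-node allocation.
 * Build with: gcc numa_sketch.c -lnuma
 */
#include <stdio.h>
#include <stdlib.h>
#include <numa.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "no NUMA support on this system\n");
        return 1;
    }

    int nodes = numa_max_node() + 1;
    printf("memory nodes visible to the OS: %d\n", nodes);

    size_t len = 64UL * 1024 * 1024;

    /* Served by this die's own dual-channel controller. */
    void *local = numa_alloc_onnode(len, 0);

    /* Served by the other die's controller in the same package: extra hops
     * over the HT link, but both controllers' bandwidth becomes usable. */
    void *remote = (nodes > 1) ? numa_alloc_onnode(len, 1) : NULL;

    if (local)  numa_free(local, len);
    if (remote) numa_free(remote, len);
    return 0;
}
```

Reaching across to the neighbor die’s controller costs extra hops over the HT link, which is presumably part of why the HT3.0 interconnect matters here.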




  1. JimD says:

    It matters not how many “CORES” you might have, M$ WinBLOZE BLOAT-WARE WILL GUM UP THE WORKS – and run no faster than Word for DOS running on DOS !!!

  2. FRAGaLOT says:

    When it comes to multi-cores, it has nothing to do with a bloated OS but with the OS being aware that there are multiple cores and actually USING THEM. Hell, we are still waiting for 64-bit Windows to be mainstream, but it’s as useless as Windows Vista is.

    Granted Microsoft really needs to pull the stick out of their collective asses and fully utilize the hardware we spend thousands of dollars on.

    Also, this story about Windows code being bloated and about to collapse on itself… it’s been like this since the introduction of the registry in Windows 3.11.

    Also, Word for DOS never existed, you idiot. We ran WordPerfect back then.

  3. FRAGaLOT says:

    Microsoft needs to do one of two things.

    Completely rewrite a new OS from the ground up, built around 64-bit CPUs and multiple cores, similar to what Apple tends to do, often orphaning older Mac hardware in the process. Of course this new OS would be completely incompatible with nearly all existing Windows apps and drivers. It’s important for Microsoft to do this and finally SHED itself of the legacy support and inherent security issues that have been holding Windows back since the 1980s.

    The reason why Microsoft never does this is because they’ve been trying to stay “backward compatible” since the DOS days to keep their biggest customers happy: businesses.

    This is why Windows is so held back and has big security flaws: Windows STILL has inherited flaws from DOS, even after 20 years. At least Apple switched to a Unix/BSD kernel for OS X, even before the Intel switch. But Apple could do this since (in the past) Apple had no significant business market share that would be orphaned by it. Microsoft would cripple businesses if they abandoned support like that.

    So the second thing MS could do is offer two products. Keep support for XP/Vista going for all the old farts in business suits, so they don’t go out of business. And have another team develop a new OS for NEW hardware, as I described above, for END users, one that supports the new hardware we all paid thousands of dollars for.

    Keep in mind they already did something like this with the early development of Windows NT, while end users were still using Windows 3.11.

  4. Uncle Dave says:

    “Microsoft needs to… Completely re-write a new OS from the ground up”

    I could be wrong, but I doubt I’ll be around in the next century when that would be finished.

  5. JimD says:

    WRONG, FragALot !!!

    LINK:
    http://www.downloadsquad.com/2005/11/25/free-file/

    But erstwhile columnists at PC Mag were fond of pointing out how migrating to WinBloze 3.1 didn’t really help people using Word for their work …

  6. Mark Derail says:

    #5 beat me to the punch, FragAlot.

    I actually used DOS Word, but then I also used IBM’s dreaded, nightmarish DisplayWrite.
    Because I was forced to. Documentation.

    But I used WordPerfect 4 (then 5) as a programming tool. Macros for IF..THEN..ELSE structures, back when IDEs didn’t exist.

    Just before I left my job at the IBM-only hardware/software BANK, they had finally decided to use WordPerfect 6, which was quickly killed by Microsoft Win 95.

  7. The Pirate says:

    Woot! We can all run Prime95 Lucas-Lehmer iterations and heat the house. I can’t wait, but I bet I’ll have to in order to run anything anywhere close to multi-core max capacity.

  8. Ron Larson says:

    I can’t understand why, with all the new processors, Microsoft can’t run instances of DOS and WinXP as guest OSes under a VM. Those apps and devices that won’t work under a better OS could then be sandboxed into their own instance of a legacy OS.

  9. Thomas says:

    As far as I know, there is no OS that can effectively use 12 cores that is not specially tailored for specific hardware. Managing requests amongst those cores gets more expensive from a performance standpoint and geometrically more complicated as you add CPUs.

    #2
    Every version of NT has had the ability to use multiple CPUs (and thus multiple cores). Every one. I believe XP has a cap at 2-4 cores depending on the version, but the server OSes can go up to much larger numbers. Most, if not all, of Microsoft’s applications are designed to utilize multiple CPUs. However, many commercial applications, most notably games, are not. (A rough sketch of what “using the cores” looks like from an application’s side is at the end of this comment.)

    #3
    RE: Backwards compatibility

    No business is going to buy an OS that will break everyone’s applications. If you want to stay in business you have to provide a migration path. One of my larger customers is still using IE6 because IE7 breaks one of their applications. Businesses want stability.
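
    To make the multi-CPU point above concrete: the OS already reports every logical processor, and it’s the application’s job to spin up threads and spread them out. A toy Win32 sketch of my own (not from any Microsoft sample or any product named in this thread):

    ```c
    /* Toy example: ask Windows how many logical processors it sees,
     * then pin one worker thread to each of them. */
    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    static DWORD WINAPI worker(LPVOID arg)
    {
        printf("worker pinned to core %u\n", (unsigned)(UINT_PTR)arg);
        /* ... CPU-bound work would go here ... */
        return 0;
    }

    int main(void)
    {
        SYSTEM_INFO si;
        GetSystemInfo(&si);                        /* OS reports the cores */
        DWORD cores = si.dwNumberOfProcessors;
        printf("logical processors: %lu\n", cores);

        HANDLE *threads = malloc(cores * sizeof(HANDLE));
        for (DWORD i = 0; i < cores; i++) {
            threads[i] = CreateThread(NULL, 0, worker,
                                      (LPVOID)(UINT_PTR)i, CREATE_SUSPENDED, NULL);
            SetThreadAffinityMask(threads[i], (DWORD_PTR)1 << i); /* pin to core i */
            ResumeThread(threads[i]);
        }
        WaitForMultipleObjects(cores, threads, TRUE, INFINITE);
        free(threads);
        return 0;
    }
    ```

    Whether any given game or word processor actually bothers to do this is another story, which was exactly my point.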

  10. Ah_Yea says:

    #3, Microsoft rewrite Windows? They should, but don’t hold your breath.

    Word for DOS? As shown above, it existed. In fact, it was pretty good, and its thesaurus was vastly superior to the one in Word for XP. It used Roget’s by default and was very well implemented.

    Eliminate backward compatibility? This is not going to happen anytime soon. Many programs, even the most up-to-date versions used on Windows, have ancient embedded code. They still work precisely because Windows allows older code to run.

    Additionally, one of the reasons Windows is popular (at least in the IT circles I have been in) is that many people feel safer using an OS that does not obsolete their saved data. What good is saving your data if the program you need to recall it no longer works? Why save that old DOS dBase file if you cannot run dBase to retrieve it? This may be more of a feel-good factor than anything actually useful, but it does count.

  11. kjackman says:

    I’m glad I’m not seeing the typical “users can’t possibly find a use for X cores” arguments you constantly see on Slashdot. I’m a musician. With modern production software, I could use 32 or 64 cores or more to run all the soft synths, reverbs and delays, and mastering processors required for quality real-time mixdown. Windows CAN handle more than 4 cores; visit the Cakewalk Sonar forums to see end-user conversations in glorious detail. Multicore is a godsend for us.

    With all the multimedia apps used by even the most amateur users (think Garageband), multicore will be very important.

  12. FRAGaLOT says:

    Fine, so there was Word for DOS, but word processor apps aren’t used for benchmarking how fast your PC is, buddy. They are as slow now as they were 20 years ago. My point is that talking about Word isn’t how you show off how slick your new PC is.

    As for Windows being rewritten, why not? Mac OS X was rewritten from version 9, and Linux has a zillion different flavors. What they really need to do is just start over from ground zero and let go of all this aging legacy bullshit that MS has been schlepping for 20 years.

  13. Jopa says:

    Excellent! The more cores the better. The software needs to find a way to use all these puppies though. I believe that it will take around 2 to 3 years before we really see the benefits, but at least the direction is good.
    Personally I prefer Intel chips, but I like it when AMD gives Intel a great fight. AMD brings balance to the Force, so to speak 🙂
    SEXtuple…… mmmm… sounds sexy!

    Do you guys understand that we are taking leaps here towards hardcore, modern supercomputing on the desktop/laptop?

    Now where did I put that 128 core macbook pro…

  14. joaoPT says:

    Funny stuff will happen when AMD starts to include ATI graphics cores in the package, not to do graphics but to process massively parallel data. Think video encoding: H.264 HD on the fly. Feature recognition software: taking your laptop camera and using it to control your Second Life avatar’s expressions. Really, really fast Photoshop plug-ins.

    People, I know that shelling out $$$ for Quad cores today does not pay, unless you’re a 3D artist, but, believe me, it will. It will.

  15. ECA says:

    Let’s see…
    MS makes the programming language that is SUPPOSED to make the OS..
    If the programming language ISN’T up to par, what do you think happens?
    Now I have to ask… WHO WROTE the 64-bit language?? MS?? YUK!

  16. KwadGuy says:

    Who cares if MS optimizes Windows for 12 cores? I mean, what OS functions, exactly, are you gonna use those cores for?

    This is all pointing the way towards significantly greater importance of coarse-grained parallelization for apps that are CPU-bound. The OS is not one of those apps. Audio/video encoding, certain photo filters, scientific/engineering apps, a few games, etc. are the places where coarse-grained parallelization will be useful.

    Scientific apps I run can peg 12-core processors 24/7 using coarse-grained (or fine-grained) parallelization (a toy sketch is at the end of this comment). But these are not Windows-platform apps anyway. If I had a 12-core machine, I’d be running Unix/Linux on it. For people running Linux clusters, it will be great to be able to reduce the number of PCs, as this will drastically reduce both the floor space AND the cooling required.

    A few (3-4) years ago, I set up a small cluster of four P4 3.2GHz Linux boxes to run some scientific apps at home. It worked fine, but now I can do a lot better with a single quad-core Intel box: faster, with support for fine-grained parallelism, and way cheaper to operate.

    I can’t wait for the mega core processors, and I couldn’t care less if Microsoft optimizes their OS for them.
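
    For anyone wondering what I mean by coarse-grained: carve one big, independent job into one large chunk per core and let plain threads chew on it. A toy sketch (mine, purely illustrative, not the scientific codes I actually run):

    ```c
    /* Toy illustration of coarse-grained parallelization: split one big,
     * independent workload into one large chunk per core and run each
     * chunk on its own POSIX thread.
     * Build with: gcc coarse.c -lpthread -lm
     */
    #include <pthread.h>
    #include <stdio.h>
    #include <math.h>

    #define NCORES 12
    #define N      (12UL * 1000 * 1000)

    static double partial[NCORES];

    static void *chunk(void *arg)
    {
        long id = (long)arg;
        unsigned long lo = id * (N / NCORES), hi = lo + N / NCORES;
        double sum = 0.0;
        for (unsigned long i = lo; i < hi; i++)   /* big, independent slice */
            sum += sin((double)i) * sin((double)i);
        partial[id] = sum;
        return NULL;
    }

    int main(void)
    {
        pthread_t t[NCORES];
        for (long i = 0; i < NCORES; i++)
            pthread_create(&t[i], NULL, chunk, (void *)i);

        double total = 0.0;
        for (long i = 0; i < NCORES; i++) {
            pthread_join(t[i], NULL);
            total += partial[i];
        }
        printf("total = %f\n", total);
        return 0;
    }
    ```

    Fine-grained parallelism splits the inner loops instead, which costs more synchronization; either way the OS just has to stay out of the way.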

