NVIDIA is killing the GTX260, GTX275, and GTX285, with the GTX295 almost assured to follow, as it (Nvidia: NVDA) abandons the high- and mid-range graphics card market. Due to a massive series of engineering failures, nearly all of the company’s product line is financially under water, and mismanagement seems to be killing the company.
Not even an hour after we laid out the financial woes surrounding the Nvidia GTX275 and GTX260, word reached us that they are dead. Normally, this would be an update to the original article, but this news has enough dire implications that it needs its own story. Nvidia is in desperate shape, whoop-ass has turned to ash, and the wagons can’t be circled any tighter.
Word from sources deep in the bowels of 2701 San Tomas Expressway tells us that the OEMs have been notified that the GTX285 is EOL’d, the GTX260 will be EOL’d in November or December depending on a few extraneous issues, and the GTX275 will be EOL’d within two weeks. I would expect this to happen around the time ATI launches its Juniper-based boards, so before October 22.
Here’s info on their problems with Intel. And when they wanted to show off a card at a conference, they had to fake it.
Found by Brother Uncle Don.
Probably worth taking this – certainly the conclusions – with a pinch of salt. The article is by Charlie Demerjian, who was on either the Register or the Inquirer, and all of his stories were anti-nVidia. Now, he did expose the nVidia chip-failure fiasco, so it wasn’t all needlessly negative. But put it this way: it was hardly impartial reporting.
This is a bit of a blow for nVidia for sure. ATI has taken a few hits in the past, and now, happily, they’re swinging back. So I wouldn’t call game over quite yet.
1. This is old, FAKE news.
2. Nvidia has already categorically denied this, calling it 100% untrue.
3. Semi-Accurate is funded by AMD/ATI sponsorship. A glance at the site makes it look like a corporate PR site. Charlie’s a whore and a Muppet.
Does Dvorak know what crap is being posted by his minions on his blog?
Semi-Accurate indeed! HAR!
Oh the trials and tribulations of the free market system where innovation makes you better.
And if you can’t innovate, sue.
Actually it makes sense, especially after the partnerships and alliances with M$. The immediate cause is licensing litigation with Intel. Everybody knows that Intel is Microsoft’s beatch. While factual, this post is void of background info. Uncle Dave’s oblivious as always…
ATI has produced some great video cards over the years. Unfortunately, they also produced some of the worst drivers and software interfaces out there. Those like myself, who have been stung several times by ATI, are now Nvidia fans.
As Nvidia is ATI’s only real competition, their demise would no doubt result in even less innovation from ATI. Frankly, I don’t buy this story without more proof.
#6
Or make way for a brand-new competitor, either yet to be discovered or maybe… Intel… with Larrabee.
If AMD/ATI is behind this, this is surely a shot in the foot.
#7 JoaoPT
Larrabee taking the mid/high-end add-in graphics card market is very unlikely in the short/medium term. In fact I think nVidia is trying to make, effectively, a Larrabee, and therein lie many of its problems. Part of nVidia’s being off the rails, for my money, is trying for the HPC market. There’s no money there – it’s a death trap. Ask Seymour Cray, or SGI for that matter.
ATI went pragmatic and is reaping the rewards.
Further down the line, perhaps these highly parallel compute-style engines will deliver. For the moment, people are still trying to figure out how to program them (see the sketch below for what that programming model looks like).
So for the short term at least, ATI is sitting pretty.
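For what it’s worth, here is a minimal sketch of what programming these things looks like today – a generic, made-up CUDA kernel, nothing to do with any particular product mentioned above. The mental shift is that the familiar serial loop disappears and every array element gets its own thread:

#include <cstdio>
#include <cuda_runtime.h>

// Each thread scales one element of the array -- the data-parallel
// mindset that GPU compute forces on you.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;
    float *d_data;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    // Launch enough 256-thread blocks to cover all n elements.
    int blocks = (n + 255) / 256;
    scale<<<blocks, 256>>>(d_data, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d_data);
    std::printf("done\n");
    return 0;
}

Getting real speedups out of that model – memory layout, thread divergence, host-to-device copies – is exactly the part people are still wrestling with.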
I take it with a grain of salt.
Yesterday I bought my second Nvidia card, the GIGABYTE GV-N94TOC-512I GeForce 9400 GT.
The price was too good to pass up. It replaces a 6-year-old ATI card.
I think this has already been addressed by Nvidia as false. But since PC gaming is virtually dead in many ways, I think what they (Nvidia) will try to do is concentrate more on integrated solutions and less on high-end cards. This of course is a busy area of the business, with Intel controlling a lot of the lower-end market for video chips. So the question remains: what is ATI going to do? Will they become the definitive graphics card maker for gamers? Or will this market just fade away? Time will tell.
#8
Maybe not Larrabee, but Larrabee II… and also, who’s got the might and the fab infrastructure? And, by the way, Intel is still continuing with its line of embedded GPUs, neither based on nor derived from Larrabee.
My point is, ATI will not remain as the single graphics mega-contender…
Your point about Fermi being a kind of Larrabee, but coming from the other side, is well put. That’s what they’re doing.
Meh… My GTX280 works just fine; I won’t even bother taking a glimpse at ATI (I hate their drivers and software, plus they’re a real pain to configure on Linux).
These monsters already do the trick. Since game developers aim at the Xbox 360, what’s on the market at the medium end is more than enough for eye-glittering gaming for years to come. And thanks to the economy, with R&D suffering cuts everywhere, that’s no big deal.
I just miss the days when my top-of-the-line PC was obsolete after a year…
One thing not mentioned in any of the comments: ATI does not even try to support Linux any more. I have a GTX 285 on the way due to its supported drivers. My ATI card is limping along until then. Long live nVidia!
#10
Your post makes no sense at all. You contradict yourself in almost every aspect. No PC game market, concentrating on integrated video and producing video cards for gamers. What in those three sentences makes sense to anyone?
PC gaming isn’t dying; it’s just more concentrated in certain genres like MMOs and FPSes. It’s because of dumbass kids these days who don’t know how to spread their interest. Hopefully Starcraft 2 and Diablo 3 will change that when they come out and bring more innovation and ideas. PC gaming can’t be dying if WoW is generating millions of dollars per month from one game. And Chinese gold farmers have never been richer because of MMOs.
Integrated graphics to do what? Games? NO. Video? Yes, to some extent – maybe HD. Video editing? Yes, if you have a few hours to spare. For desktops, the question will still be: is integrated graphics any better than these $30–$40 cards? Chances are it’s still a big NO. In laptops? Yeah, watch that battery die in a few seconds and your money go down the drain.
And if gaming is dying, in your own words, you would think Nvidia made the right choice instead of ATI. But that’s not the case: gaming on PC is still going, and both companies will provide the video power to play those games IF management doesn’t screw up and belly-up the company.
http://cagematch.dvorak.org/index.php/topic,7912.msg34823.html#new
I did another angle on this in Cage Match.
“Intel, meanwhile, has publicly stated that it will combine graphics and a CPU inside of a multi-chip module with its Clarkdale and Arrandale processors. Both offer the option to use a discrete GPU, but de-emphasize their need. Meanwhile, Intel is prepping its own discrete graphics chip, code-named ‘Larrabee’.”
“But because of Intel’s improper claims to customers and the market that we aren’t licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we’ll postpone further chipset investments.”
If anyone has ever used an INTEL GRAPHICS chipset… I’m sorry.
Let’s think about a few things:
1. Intel is having problems utilizing the FULL power of multi-core.
2. WHO needs 64-bit?? Businesses don’t need it. 32-bit is big enough and does well. Only science needs 64-bit. Security?? WHY would you need 64-bit for security?? WHAT is gained?
3. Win7 is a DREAM. Until it’s FULLY released and out there, you won’t KNOW what it can/will do.
As mentioned, it’s pure crap. nVidia did announce they are leaving the chipset market, at least until they work out a deal with Intel (good luck there). The much-rumored nVidia ‘x86’ CPU with GPU on die is likely true, though. I won’t ever count nVidia out.
I have a grudge against Nvidia after the whole mobile-GPU crap they pulled with the Dell computers. They shipped millions of GPUs that they knew were defective and overheating.
The article is crap, and sadly I am one of the people who fell for it. I had to post an apology on my site today about the whole mess.
This is the kind of bad reporting that happens when you let any idiot with a keyboard call themselves a journalist, and it forces the industry to the lowest common denominator of performance. (And yes, I know I am part of the problem, as my staff is too small for the level of scrutiny we used to apply regularly to all content.)
Modern MBA mentality. Increase profit by drastically slashing cost. Eventually something will break.
The Disney California Adventure is a fascinating case study of this failed approach. They built the park on the cheap, despite contrary advice from park engineers, and tried to float it solely on the Disney brand. It failed miserably, and now they are working to implement the more costly original recommendations from the engineers. Old man Disney made his name by giving people value and profited from it. This mentality of increasing profit through aggressive cost cutting is doomed to failure in the long run.
#19. Thanks for setting it straight.
I just ordered a GTX285 for running CUDA code ported from Matlab. It is around 20 times faster than running it on any kind of desktop PC. I’m modeling granulation of pollutant dust particles in a colloidal aqueous medium.
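For anyone curious, here is a rough sketch of how that kind of Matlab-to-CUDA port tends to look – the growth law, array names, and numbers are invented for illustration, not the poster’s actual granulation model. An element-wise Matlab update becomes one GPU thread per particle, and keeping the whole time loop on the card is where 20x-class speedups come from:

#include <vector>
#include <cuda_runtime.h>

// Hypothetical sketch only: advance every particle's radius by a toy
// growth law, one thread per particle (the vectorized Matlab line
// radius = radius + rate.*conc*dt, turned into a kernel).
__global__ void grow(float *radius, const float *conc,
                     float rate, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        radius[i] += rate * conc[i] * dt;
}

int main()
{
    const int n = 100000;
    std::vector<float> radius(n, 1e-6f), conc(n, 0.5f);  // made-up initial values

    float *d_radius, *d_conc;
    cudaMalloc(&d_radius, n * sizeof(float));
    cudaMalloc(&d_conc, n * sizeof(float));
    cudaMemcpy(d_radius, radius.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_conc, conc.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // All 10,000 time steps run on the card; only the result comes back.
    for (int step = 0; step < 10000; ++step)
        grow<<<(n + 255) / 256, 256>>>(d_radius, d_conc, 0.01f, 1e-3f, n);

    cudaMemcpy(radius.data(), d_radius, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(d_radius);
    cudaFree(d_conc);
    return 0;
}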
#16 ECA you’re way off:
1) It’s an issue with the OS and software that don’t utilize multi-core CPUs, not Intel. It’s not like Intel is the only one making multi-core CPUs and *that* makes it their fault. It’s programmers who still think in “old-school” linear terms with CPU threading, and compilers that mostly ignore this.
2) 64-bit addressing allows you to use more memory and bigger hard drives, create higher-resolution and more complex graphics and physics (since a 64-bit CPU can handle bigger numbers per operation), and run more complex algorithms for encryption (security). A 32-bit pointer can only address 2^32 bytes – about 4 GB – while 64-bit blows far past that limit.
Basically it allows more of everything, giving programmers and hardware makers more breathing room to improve things.
3) Windows 7 HAS BEEN fully out for some time, and I don’t just mean the Beta/RC versions: OEM copies have already been released, PC makers are already working with it, and software developers have got their hands on it too. It’s just not in RETAIL yet.
Win7 is really a matured version of Vista anyway, so the drivers are already compatible with it, and it’s going to be out in retail in about two weeks.
ECA, you sound like an old fart who doesn’t like change. You are welcome to keep using your old 32-bit 386 and Windows 3.1 if they keep working for what you need to do. Meanwhile, the rest of us will live in the 21st century with modern computers that let us do more than just play Tetris, WordPerfect, and Lotus 1-2-3.
C’mon, it’s not like we ain’t seen this before. M$/Intel have a plan, and Nvidia is in the way. Wanna bet the next Xbox has Intel graphics? It’s definitely the M$-style business model: compatibility issues between Intel and Nvidia, ensuing legal actions, etc… This all began when Nvidia started making motherboard chipsets and started competing with Intel. After all, if you don’t have any competitors you don’t have to compete or produce a quality product, as M$ demonstrates. Too bad the antitrust laws ain’t worth spit anymore; monopolies are bad for everyone.
#23
The Xbox has ATI graphics, so I don’t get the reason to switch to Intel. Anyway, Intel is not there yet, but could be close to powering the next Xbox. I just don’t think it’s their game (pun not intended).
Intel has respectable GPU know-how that gets dramatically overshadowed by the sheer magnitude of the two bigger GPU contenders. There’s no reason to assume that, given the demise of one of the big ones, Intel would not be perfectly capable of assuming the role…
Who else?
Damn typos: #24 amodedoma, not #23, in my last post…
#23, FRAG..
1. TRUE, as Windows can’t figure out HOW to program what they forced Intel to make.
MS OS,
MS programming language.
And AMD jumped on the boat without doing their OWN testing.
2. Business does not need 64-bit. Unless you want to count the USA debt to the .000000000000000000000000000000000000000000000000000000000000000000000~1 point.
Science might need it, but the only real reason is the HEAVY graphics for games/3D CAD and very few other things.
And it’s NOT that it allows you more – IT REQUIRES more, esp. under Windows.
3. Um, you just admitted that Win7 is just another version of Windows Vista. The only things added are security, and we KNOW Windows is lousy at securing their OWN system from everything, including the USER.
Give me a Windows options screen that lets me see the options for graphics and sound and everything else in 1 nice spot. Give me back my Win2000 trackball driver, LOST under WinXP.
XP was stupid until RC 2, Vista has had its releases, and Win7 is NOT complete until it’s been in the WILD at least 1 year and had the BUGS STOMPED out of it.
If MS would cut back and follow proper procedure in programming, there wouldn’t be as many problems. You DON’T write to the OS.
Windows is requiring MORE AND MORE AND MORE.
They need to go back to square 1 and RELEARN some of the old programming. An OS that requires 1+ gig just to operate SUCKS.
Why would a group like MS bring an OS to an environment IF it didn’t work properly??
Why would INTEL make hardware that wouldn’t WORK with the OS it was created for??
They even bought out a multiprocessor programming company to TEACH THEM.
It will end up with 1 core controlling the others, telling programs which core to run on and what part of RAM is allocated for them.
Hmmm,
Seems a lot of you folks didn’t see this coming!!! Trends over the past few years:
1) PC boxes being replaced with laptops (which have no graphics card slots)
2) consoles with their graphics chips plus a hard disk etc.
3) GPUs being filled with more transistors that are not used
4) CPU multi-core not providing major identified performance gains, so they are placing graphics etc. in them
5) 45nm lithography has 4x the area capacity
6) consoles, again, have grabbed most of the gaming community that desires fancy graphical reality
7) online, flash-based gaming
Look for bad PC gaming outcomes.
This forthcoming demise of the GPU chip leaders could be problematic for the graphics-programming enthusiast, as consoles are still closed to outsiders; the PS3 GPU hole was quickly filled (and they have dropped the Linux partition).
Will the Intels etc. jump into the niche, or are they not interested? GPU programming may soon become a retro pastime.
#25 Joao
The Xbox started out with Nvidia and was switched to ATI back when Nvidia started making chipset processors (2003). Coincidence? I think not. But M$ has other beefs with Nvidia (they produce Linux drivers, for one). Hardware and software have a lot in common: if you’re going to produce something, you need to be deeply concerned about the interoperability of your products or they’re going to fail. That means having intimate knowledge of somebody else’s product. In software that means having inside information on Windows, and M$ has special alliances and partnerships for that; that’s how they strangled the software industry. In hardware that means having intimate knowledge of motherboard architecture and forming strategic alliances and partnerships with other companies you may be competing with. Between M$, Intel, and AMD, they’re planning to slice up the next crop of consoles/PCs amongst themselves and leave Nvidia out. AMD’s in for a surprise though: Intel will eventually produce a half-decent GPU, and that will be the end for AMD. AND finally M$/Intel will rule the technological world!
If Nvidia were smart, they’d use the last of their resources to promote Linux and, dare I say it, Linux gaming and multimedia.