12.28.2006

Rahul's Predictions for 007


It’s been a while since I have written an industry update, so I thought I would end 2006 with some 2007 predictions. Keep in mind, these predictions are simply open predictions based on my own personal hypotheses – nothing is written in stone of course, but as per my usual style I leave my opinions in the open.

Samsung or Hitachi should/will acquire Western Digital: I think Western Digital, being the #2 hard drive manufacturer in the world (maybe a distant number 2, but still number 2), is a prime target for acquisition. I ran the theory by Charlie from the Inquirer sometime early last year and he wrote about it without the complete explanation. Perhaps I will expand on it later - but for now here are a few points:

I think it would benefit Samsung or Hitachi to acquire Western Digital because they could really use the brand penetration that Western Digital has garnered in the enthusiast PC space. I have written before about the “halo effect” that’s driven by the enthusiast community, so there’s no need to re-explain the reasoning here. WD is also doing pretty well in emerging markets and emerging vertical markets. As far as I understand, Samsung has a goal to be #2 in the hard drive space by year-end 2007 – that said, the only way they can do this is to make an acquisition or grow their storage business at an unbelievable rate. By the way, I think Western Digital drives are pretty awesome, and even though Seagate owns a ton of IP, Western Digital keeps coming up with new and innovative ideas. They helped turn a commodity into a “sexy product”. It should also be noted that I own Western Digital stock, although I own it because I believe in it – and this opinion is purely based on my beliefs.

NVIDIA will not be acquired by Intel, but...: Nvidia will work feverishly on a strategy to remain ahead of the curve in the mobile space. It’s unlikely that they will allow Intel to “acquire” them simply because the cultures are like fire and water. I think the *only* way an Intel + Nvidia marriage will work is if Jen-Hsun remains CEO of the entire entity and they do a reverse takeover of Intel. That’s not likely to occur, but if the shareholders of both companies feel it’s a good move then it’s a possibility.

It’s more likely than not that Nvidia will build their own solutions and ultimately there will be three huge companies competing for silicon real estate. Eventually we may even see AMD and Nvidia get even closer - or not. I think it depends on how Intel handles the situation.

AMD will see better days in the future: Intel’s sudden onslaught of technology caught AMD by surprise. There is no way that AMD expected Intel to come across with such aggressive technology so quickly. AMD needs to clean house and make changes soon. They need to get ATI integrated into the machine as quickly as possible. The sooner ATI and AMD “fuse”, the sooner they will create new and innovative technologies that everyone needs. Margins will eventually go back up and revenues will be strong – but until then we’ll see AMD margins drop. I don’t think it’s a pretty situation for the short term – and I’m really looking to the horizon for AMD to do something spectacular again.

ATI should really be proud of the Nintendo Wii as a marquee product... but unfortunately such products don't pay the bills, so ATI needs to aspire to clobber Nvidia one day. I'm optimistic that AMD management will help ATI go down the right path to performance.

Intel will open up on 11 cylinders: It looks like Intel is in the position to open up on 11 of 12 cylinders. Their products will use less power, and they will continue to perform favorably. Short-to-mid term for Intel looks fabulous - of course the AMD+ATI thing probably has them worried, but leave it to the engineers in Israel to come up with a solution, and I think we're in for some more surprises. I think Intel will try to push more "Centrino/Viiv-like" standards to the market; we'll see how that works out for them. I believe they will cut some more heads from the company, and they should take a page out of Mark Hurd's book to become a more sales-driven organization.

Something is happening with Lexmark; what, I have no clue:
Lexmark is still on the climb. I wrote that I thought they would make a possible acquisition, but damned if I thought it would continue on a vertical climb before that happened. Does anyone know what happens when a company climbs too high on rumours, beyond a reasonable value for an acquisition? That’s not a rhetorical question either; if anyone knows, please feel free to pipe in.

Dude! You’re getting a real gaming system: Wait and see. Good things take time, and we’re not prepared to launch things overnight. We’re bringing in some serious muscle to leave zero doubt that this acquisition was the best move both companies could ever make in the space.

Nintendo will sell more Wiis than they could ever dream of: Try playing Madden on an XBOX 360 at 1080i, then play the same game on a Wii at 480p. Need I say more? There's clearly no comparison, and Nintendo has figured out how to evoke deep emotions from console gamers. This is a category killer. Sony should be afraid, very afraid.

Apple should license OSX: If Microsoft Vista Ultimate is ~$400 and Apple OSX is $40, there is clearly an imbalance here. Wouldn't it be interesting if Apple opened up their OS to a select few manufacturers? I think so.

2007 Special Request: I would like to make a special request to both the GPU and CPU semiconductor companies out there.

Please keep power limitations at the top of your priority list! I can’t believe we’re approaching 1.2 kilowatt power supplies, it’s getting insane. These supplies are drawing as much power as the wall socket can handle. There are managers at the top of certain companies who think customers in our space “could care less” about power, and this is not true at all.
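
To put rough numbers on that (a quick back-of-the-envelope sketch, assuming a typical North American 15 A / 120 V circuit and a power supply running around 80% efficiency at full load):

$$
P_{\text{wall}} \approx \frac{1200\ \text{W}}{0.80} = 1500\ \text{W},
\qquad
P_{\text{circuit}} = 120\ \text{V} \times 15\ \text{A} = 1800\ \text{W}
$$

With the usual 80% continuous-load rule, that circuit is really only good for about 1,440 W, so a fully loaded 1.2 kW supply is already asking for more than a single household circuit should continuously deliver.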

Power is a very important consideration for enthusiasts! Noise, reliability, thermals, overclocking – it’s all part of the package. We aren’t interested in loud-ass PCs – and even scarier is that liquid cooling is becoming standard because we are left with little choice! I'm a believer in liquid cooling; in fact, at one point I was a significant shareholder of Cool-It technology. I don't believe liquid cooling should be a forced standard -- it should be installed to enhance the overall experience/performance, not to "cool the shit out of the hardware" because it needs a power station to operate.

That said, it’s likely that liquid cooling will become a necessity in the enthusiast space as long as things continue down the current path.

The winner of this battle will be the company that reverses that trend and continues down a lower-power path. You’re just going to have to trust me on this. If you can take overall system power down while delivering similar or better performance than your competition at higher power, then you’ll be more successful.

That’s my story and I’m sticking to it.

29 comments:

Anonymous said...

I don't want to see Western Digital get acquired. They have always been my favorite brand of hard drive. But if they do, then maybe it will be by Samsung, since they're my favorite brand of monitor.

I have used nVidia cards ever since 3DFX went out of business. But when I get my new DX10 card, it will be an AMD card. ;)

I'm hoping that AMD's future will get brighter with their quad core processor, since that's my next planned CPU upgrade.

I wouldn't touch a Lexmark printer due to their poor drivers.

I'm hoping that the Nintendo Wii will be a success. I have one myself, and I have been having fun playing Zelda on it. But I am a little concerned about the graphics. It would look so much better if it supported HDTV.

I have not used an Apple computer in many years, and I will certainly get one the DAY that they release an AMD-based Apple.

S.Zhu said...

Always an interesting read Rahul - it's nice to get an insider's perspective on how things are going.

-I hope the number of players in hard drives goes down by one more. Besides those high-performance drives, the market is relatively stale on the price and innovation front. Just a few large players should make for more product differentiation.

-NVidia branching off to create its own line of CPUs? Or even a merger with Intel. Both would do much to stifle competition as buyers will be locked into proprietary CPUs, motherboards and perhaps even graphics.

By AMD buying ATi, what was once two 1v1 battles with AMD vs Intel and ATi vs Nvidia becomes a single three-way fight expanding into every category.

-I'm sure AMD will recover and even take the lead. The mass market seems to react slowly to industry changes, especially hardware leaderboards, and by the time most notice Core 2 Duo's advantage, AMD will have competing solutions.

-Lexmark? Their free printer died on me. I think they'll keep riding the rumor wave and try to use the extra funds to turn up a physical asset before the bubble bursts on them.

-I love the Wii. Enough said.

-Honestly, I never considered the PSU a core component until my newest system kept giving me random restarts/blue screens/disappearing video. PC Power+Cooling must be making a killing while environmentalists roll over in their graves.

md said...

Brilliant post Rahul, you're bang on with the power quote. It will be interesting to see your products as well. Most interesting is your perspective on Nvidia, it will be cool to see them compete as a third mega player, but not so cool to see them get sucked up by Intel.

Anonymous said...

It's been awhile since you've posted something, glad to see you're still at it! Keep it up.

Anonymous said...

Lexmark has zero technology, nothing interesting inside. I'm thinking shareholders will have to accept a discounted value for takeover at this point.

Otto said...

Rahul,

I didn't know that you owned part of Cool-it. I understand that they are being used in many enthusiast PCs and possibly Dell coming up, is that true? Are they public yet? Does Voodoo still use them? What about HP?

Rahul Sood said...

Otto, I prefer to keep you (and everyone else) guessing on our choices for upcoming cooling solutions. I know the Cool-it solutions very well, but I can't say what we plan to use going forward.

My relationship with Cool-it was nothing more than that of a shareholder.

I have no knowledge of their inner-workings, and I don't know who they have deals with.

That said, I am confident that we are at the forefront of liquid cooling technology, and no one knows it better than us.

unforgiven said...

I am still trying to figure out what Intel must be planning to mitigate the eventual hostile takeover of the market that is bound to happen when AMD and ATI put out their integrated CPU/GPU's.

Core2 has clearly shown that Intel is running scared and is willing to take losses, put in extreme amounts of investment, anything, to keep AMD down. Unfortunately, they're haunted by ghosts that they can't beat. They will never ever adopt HyperTransport, and coming up with something similar would mean ditching PCI Express, something they've been trying to push for a while.


Even with Core2, Intel is really not able to penetrate the high end server market. Who really gives a shit if one processor can work 30% faster than the competition, when the communication between processors takes 18 times as long. Computation to Communication ratios are sorta, kinda important in parallel algorithms ya know?



Anyway, that brings me back to my point. I really wonder what the hell Intel must be planning already or will come up with.


Whatever it is, I'm pretty certain consumers have it better than they've ever had in the last ten years!

Anonymous said...

Rahul, interesting take ... I am not as optimistic as you on AMD for 2007; 2008 would be the year to look for a rebound -- if 45 nm transitions better and faster than 65 nm did and they can compete with Intel node for node. Right now they are showing troubling times ahead. QuadFX was not as spectacular as needed, the Barcelona demo was of Task Manager (needs a lot of work), and 65 nm out of the gate is appearing very weak. Let's hope they pull it together soon.

Anonymous said...

I laugh every time someone says that Sony should be afraid or it won't sell, etc... I laugh because everyone has based their predictions/figures on North American sales. North America is not the main competition. Asia and Europe are where the final battle takes place. With brand recognition, Sony cannot lose. Nintendo will make up ground and the 360 will do worse in '07 than in '06. (Who the hell buys a console for a game that will be delayed almost 2 years?)

Anonymous said...

As cool as liquid cooling is, I hope VPC never has to resort to it for mainstream computers.

Anonymous said...

Hey, haven't you heard of K8L? What new things has Intel got?

unforgiven said...

Before I continue:
------------
Any views expressed by me here in any of the comments are my personal views and *not* related to, *nor* inspired by my work at, *nor* in any way indicative of any aspect of any corporate policy, roadmap or any non-public information of AMD.
------------



So, working for AMD, I really can't say very much on my take on AMD's projected performance, but let's put it this way: even if AMD just does mediocre for the next two years, in the end, if they put out a CPU/GPU combo, which gamer or graphics designer in their right mind would buy Intel? In fact, why would even a desktop user, who may not care 'too' much about graphics performance, still go with Intel, when AMD would offer such an obvious advantage? This is assuming that AMD's chips *don't* outperform Intel's.

Intel does seem to have some vague ideas about releasing a ray-tracing-capable teraflop chip, but then, Intel is kinda known for technologically marvelous but totally useless ideas - case in point, that pretty little 64-bit processor they tried pushing out a few years ago.

I agree with Rahul that NVidia probably will not sell itself to Intel, if for no other reason than that Intel has a reputation for killing the products of companies it acquires. AMD, on the other hand, tends to acquire companies, integrate their products into its own line, and use them for a technological edge. So if NVidia won't be bought out, it sure as hell won't give its GPU designs to Intel to shove into their die, nor will Intel give over their processor designs to NVidia.


Neither Intel nor NVidia is that well placed to enter the GPU or CPU market (respectively). Neither will give up their designs for merging into the other's product.


So what are they going to do?
*shrug*

sharikouisallwaysright said...

"Please keep power limitations at the top of your priority list! I can’t believe we’re approaching 1.2 kilowatt power supplies, it’s getting insane. These supplies are drawing as much power as the wall socket can handle."

You are right, Sir.
Today I own a mighty 625W PSU and I will stay with it.
If new hardware exceeds this limit, I will not buy it!

I would like it if WD could stay independent.
But Samsung would be the least bad solution to go with imo.

All in all, I think your predictions are bright.

Anonymous said...

"In the end, if they put out a CPU/GPU combo, which gamer, graphics designer in their right minds would buy Intel?"

What exactly is so exciting about a CPU/GPU combo for a gamer? At best, it'll be akin to integrated video on steroids - still far short of what a dedicated $150 mid range GPU can do.

Look at today's demanding games like COD2 and Oblivion. On high details, you are doing well to pull *40 - 50 fps* on a *HIGH END* GPU.

An integrated solution, whether off die or on die like Fusion, just will not work well with the latest 3D games.

Fusion will be just a regular CPU with a crippled GPU on the same die. The heat factor alone will not enable AMD to integrate a high end GPU on the same die.

It might be good for gamers on a tight budget, but other than that, I see nothing particularly exciting about Fusion for an avid gamer like myself who demands high performance and graphical quality in games.

Of course, you are just doing your duty by hyping up Fusion, I would be too if I was an AMD employee. ;)

Rahul Sood said...

"I laugh every time someone says that Sony should be afraid or it won't sell, etc... I laugh because everyone has based their predictions/figures on North American sales. North America is not the main competition. Asia and Europe are where the final battle takes place. With brand recognition, Sony cannot lose. Nintendo will make up ground and the 360 will do worse in '07 than in '06. (Who the hell buys a console for a game that will be delayed almost 2 years?)"

I think you need to revisit your comments. Sony is going to get their ass handed to them in all markets. I don't care if it's Nigeria or fricking Midwestern United States - this deal is done. Nintendo is going to climb up the ladder faster than any console in history - and it's going to be an easy win for them. Microsoft XBOX will only be "less popular" based on the fact that there are two competitors in the space. Either way, Microsoft XBOX will squeeze Sony on the high end as well. Honestly, I think Sony's days are numbered in the console space, unless they buy Nintendo or license some technology from them quickly.

Merc88 said...

I was at Best Buy yesterday... they had a pile of PS3s and 360s and no Wiis...

The salesman said people were bringing back PS3s and putting money down on a Wii...

That says it all to me....

Merc88

Anonymous said...

"I think you need to revisit your comments. Sony is going to get their ass handed to them in all markets. I don't care if it's Nigeria or fricking Midwestern United States - this deal is done. Nintendo is going to climb up the ladder faster than any console in history - and it's going to be an easy win for them. Microsoft XBOX will only be "less popular" based on the fact that there are two competitors in the space. Either way, Microsoft XBOX will squeeze Sony on the high end as well. Honestly, I think Sony's days are numbered in the console space, unless they buy Nintendo or license some technology from them quickly."

Holy crap, Batman - you need to check to see if you have your 'finger on the pulse' like you say you do. I'm not sure whose pulse you're checking.

PS3 drives the market with its brand. What sort of images come to your mind when someone says Playstation, Rahul? Not the PS3, just the name 'Playstation'. Who do you envision playing it?

Now, look at Nintendo. What sort of images do you envision when you think of the word Nintendo? Uhm hmmm. Now, who is playing a 'Nintendo'? (Never mind if it's a Wii or Famicom or whatever.)

Now, which consumer profile drives the market, or more appropriately, who has the most disposable income? 18-25 - male sound about right?

Now, what console system do you see the 18-25 year old males playing? Super frikking Mario in Happy Luck Geisha Polygon Land (Smiles and Friends All-Around Version)?

I don't think so. And now, look at who's playing the Wii - the <18 year olds... who do they emulate? The 18+ year olds.

Until Nintendo starts making Super Mario American Chopper Vice City (read: games for more mature edgy young men with tattoos and a motorcycle), it will always be playing catch up. This is part of what caused its demise back when the original Playstation was released. It was too kiddy.

Playstation has always attracted the 'bad boys', and this DRIVES the market. Wii is nothing but a fad. It's a gimmick. I can't believe you think the Wii is going to be a threat to either the PS or the Xbox!!

Anonymous said...

sony's definitely going to lose this round in the console race, but hopefully this means they'll actually learn from their mistakes (and/or lawsuits) and not make the same mistake twice.

here's to hoping PS4 will be a beast. this round, though, i'm definitely getting a wii.

i'd also have to agree with the power draw thing. i plan to make my next computer a passive-radiator watercooled rig. if power draws end up being too much for passive cooling to handle, then i'd rather fall back on lower-end tech than stupidly high amounts of heat.

Rahul Sood said...

"Playstation has always attracted the 'bad boys', and this DRIVES the market. Wii is nothing but a fad. It's a gimmick. I can't believe you think the Wii is going to be a threat to either the PS or the Xbox!!"

This reminds me of Kevin Rollins' comment in February of 2005, when he said that the iPod was nothing more than a fad. You can hope that the "bad boys" drive the market, as you suggest, but you're way off. The Wii is going to eat the PS3 for lunch, brand name or not. Innovation brings riches, and complacency digs ditches. You can quote me on that too.

Anonymous said...

The difference between the Wii and the iPod is that the iPod is a fashion accessory first but also features tons of functionality. In the era it was introduced, nothing was capable of the storage capacity of the iPod. This sold it first, but now it's fashion that drives sales more than anything. Think about it - a ton of people will never have 2+ gigs of music, but they will definitely be carrying an iPod. It's form over function.

The Wii is a specialized controller in essence. This is its selling feature and is an attribute of form, NOT function and this is why it's selling like hotcakes right now. Not to mention the problems with PS3 distribution.

Xbox or PS could easily release a controller with the same capabilities. And they will. It's inevitable.

But you won't see spit flying or sweat trickling down a character's face in a Wii game. It doesn't have the horsepower. In fact, its specs are downright miserable for a modern gaming system - and people don't know that yet. For this reason alone, the Wii is doomed. It just can't provide a detailed and immersive gaming experience (something most Wii owners will be seriously disappointed about when they find out).

Think about it, how upset will Wii owners be when their boxing and controller-oriented games get tired? They won't have any alternatives, because the system can't handle the quality games that people crave. They will have a short list of games to choose from, because no one can design a WW2 game or flight sim or car racing game when the Wii processor is only running at 700MHz and barely has enough texture memory to handle simple graphics.

And of course Wii appeals to a wider demographic than the Xbox and PS, but that diluted demographic is not a driving force. Nintendo always has been kicked around because it fails to dig into the young adult market in North America.

Not to mention, the Wii doesn't handle DVDs, but this is not a big issue. And at 480p (!!), it will never be a good match for a modern HDTV.

And so far the Wii appears to lack any sort of hackability which has made something like the PSP more successful. Forget about doing cool things with Linux on your Wii (or whatever the nerds will think of that will be a fringe feature of the PS3).

Sure, a lot of people will buy the Wii off the bat for Christmas (think of a tickle-me-Elmo doll), but it doesn't have the capacity to provide high quality gaming.

I'm shocked that you think the Wii will do well!

Rahul Sood said...

"Sure, a lot of people will buy the Wii off the bat for Christmas (think of a tickle-me-Elmo doll), but it doesn't have the capacity to provide high quality gaming."

I think you should buy Madden 2007 on your Sony PS3 and play it at 1080i. Then go try a Wii with Madden 2007 at 480p or worse. You'll find the gaming experience on the Wii to be 10x better regardless of graphics.

With the Wii, Nintendo will not only dig into the young market, but it will dig into the family market as well. Families will insist on the Wii as the console of choice to get their kids off the couch. People will return their Sonys to "upgrade" to the Wii. The tables will turn.

Anonymous said...

If any parents insist on having the Wii as the preferred gaming system in order to promote exercise, they aren't very good parents (or they're misguided, but goodhearted).

If you as a parent rely on the meagre amount of exercise someone can get from swinging their arms around a little bit in order to promote better health (read: less fat kids), the family has a much bigger problem than little chubby Billy. It's called an unhealthy paradigm from top to bottom. In other words, stop letting Billy gobble down Burger King to begin with and forget about using a game system as some sort of supplementary calorie burner.

It's not like parents have been scooping up DDR for their kids to get them to be healthier!

I still believe that no one is going to stick with the Wii. The power behind the Wii is in the games that are played with the special controllers. Once that's old and tired, there's nothing special left. Let's hope the game designers really come through for the Wii platform, or you'll be eating your words!

PS
I don't buy into console game systems. I feel they lack depth - I couldn't imagine playing Civ on a console, and without a keyboard I get too frustrated entering in my name to even start games. Plus I'm almost getting too old to play PC games or so my wifey says. ALMOST. ;)

Anonymous said...

"I still believe that no one is going to stick with the Wii. The power behind the Wii is in the games that are played with the special controllers. Once that's old and tired, there's nothing special left."

as opposed to both the xbox360 and PS3, neither of which have ANYTHING special. oh wow, shiny graphics, nothing we (especially us PC gamers) haven't seen before. 2 years ago.

people are going to get tired of it and ache for shinier graphics with a more familiar interface- just like how they switch from the DS to the PSP. oh wait, what was that? the DS, with its gimmicky touchscreen is outselling Sony's ultrashiny version of the Game Gear?

the wii ALLOWS developers to open up new, creative ways to make games while giving them familiar hardware to program with. the competition forces developers to have huge development budgets, and i'm willing to bet 50% of that goes to trying to make the graphics look better. and maybe another 25% testing code, trying to figure out how to squeeze performance out of the new hardware. the remainder? well, when you spend this kind of money on a game, you don't want to take risks. you go with the tried and true- so we'll just keep seeing more racers, shooters, Madden, and final fantasies.

what have we seen with the PS3 and 360 so far? same old, same old. you're paying $300-600 for a graphics upgrade, nothing more. nothing new on the gameplay front. no memorable experiences- just the same old thumb-twiddling exercise everyone's been doing since the NES.

who's buying? 360 and PS3: serious gamers with lots of money. they make up, what, 5% of the MALE population? who does the wii appeal to? gamers and non-gamers. again, i want to make a DS comparison- mostly because everyone who owns a DS wants a wii. and do you know who owns a DS? half the girls i know. i have never seen so many pink consoles floating around hallways. the other companies talk about girl gamer population and how they're going to make games with feminine appeal and such- nintendo doesn't. chicks buy nintendo, simple as that. the reason everyone seems surprised that there's so many gamer chicks is that they don't realize there's an entire platform with tons of games that appeal to females, without having to advertise them as chick flicks.

any gamer who's played for more than one generation knows that improving graphics is usually just a way to force you to spend more money - you don't remember the graphics, what you remember (and get the most out of) is gameplay. that's at least half the gamer population.

and then there's the people that have never touched a console in their life, pick up the wii and realize it feels natural. the sixaxis and xbox controller have 2 joysticks, 1 dpad, and 8 buttons + start/select. the wiimote has... a few, but who cares, most games seem to use only the dpad and trigger anyway. that's like 50% of the population right there. including grannies, who do, in fact, find this stuff fun. wii wins. no doubt.

unforgiven said...

"What exactly is so exciting about a CPU/GPU combo for a gamer? At best, it'll be akin to integrated video on steroids - still far short of what a dedicated $150 mid range GPU can do."


Wait, I am not quite sure if you understand the architectural implications of having the GPU and CPU as multiple cores on the same die. Are you seriously telling me that a $150 dedicated GPU will outperform a GPU integrated into a chip?

I am not even sure what to say to that. I'd suggest a book by John L. Hennessy and David A. Patterson, called Computer Architecture. That should be a good start.


"Look at todays demanding games like COD2 and Oblivion. On high details, you are doing well to pull *40 - 50 fps* on a *HIGH END* GPU."

So you're telling me that our high end GPU's are more than easily handling all possible games and we really have no real need for any more graphics processing?


"An integrated solution, whether off die or on die like Fusion, just will not work well with the latest 3D games.

Fusion will be just a regular CPU with a crippled GPU on the same die. The heat factor alone will not enable AMD to integrate a high end GPU on the same die."

Now won't you feel really, really stupid if AMD-ATI actually ends up delivering a high end GPU and CPU on the same die?


"It might be good for gamers on a tight budget, but other than that, I see nothing particularly exciting about Fusion for an avid gamer like myself who demands high performance and graphical quality in games."

Considering, according to you, we already have as much power on GPUs as we need, I guess there isn't anything particularly exciting about any GPU or CPU that can be made, is there? Makes the whole discussion moot.


"Of course, you are just doing your duty by hyping up Fusion, I would be too if I was an AMD employee. ;)"


The AMD disclaimer is required; otherwise this has nothing to do with where I work. Believe me, if there is any effect at all from my employment at AMD, it is to dissuade me from expressing opinions as clearly as I would really like to.

Anonymous said...

Unforgiven,

Look at the TDP of current and upcoming DX10 GPUs - creeping up towards the 200W mark. AMD's top CPUs consume 100W+. That is ~300W all up. No super fancy 10 heatpipe heatsink is going to sufficiently cool such a monstrosity. And people thought a 115W Prescott was bad - they ain't seen nothing yet according to you!

If power consumption does not come down drastically in the near future, there is NO WAY we are going to get a high end CPU/GPU on the same die.

I think AMD/ATI would do well to reduce GPU power consumption to the 100W mark before they start considering 'fusing' a high performance GPU and CPU together. Unless they want to make watercooling kits standard fare of course...

Sorry, I just don't share your enthusiasm for Fusion with the current power constraints in place. If GPUs were still like in the good old days, drawing under 50W, then it'd be a different story...

unforgiven said...

Power is something that the entire industry is now taking into consideration as a design constraint. Frankly, one of the reasons power requirements went so far through the roof is simply that Intel engineers were never really given power as a design requirement. AMD has generally been better on power per MFLOPS/MIPS to start with (I guess till the recent Core2 line), so if anyone were able to fix this problem further, I'd bet on them.

I don't foresee power requirements going down, but I do foresee them staying where they are, with engineers building better CPUs more by better design than by increasing the number of transistors they shove onto one piece of silicon.

Anonymous said...

Dear Mr. Sood

I like the fusion idea.

Think of all the computers in the world which don't run any games, because they and their users belong to a company and "have to do work".

I am not "a gamer" at all (except for some sudoku's).


A split between "desktop/menus/popups/IO dialogs/..." on an on-die GPU and "heavy graphics" on a separate card is thinkable too.

For Nvidia, there is still a good "Torrenza" chance to build a CPU for at least stream/mathematical purposes; why not others!

It would need no takeover and would strengthen both.

I could imagine there are two companies in the world which talk about a bit more than what becomes public.

Waiting on the first K8L benches, sitting on my AMD shares.

regards juergen

Anonymous said...

Here's a crazy one:

How about Samsung and Nvidia merging?

This would get Samsung into a new market and potentially, longer term, into the CPU market. I still think Samsung is Intel's greatest threat, as they are the only ones who really have the manufacturing muscle to compete with Intel.

Any thoughts?

PS- I do like the Samsung/WD speculation though....