Forum Index » Soap Box » Buahahahahaha! I shall return! Timely, at that!
BackSlash
Marshal
Galactic Navy


Joined: March 23, 2003
Posts: 11183
From: Bristol, England
Posted: 2005-09-19 12:58   
Quote:

On 2005-09-19 04:42, Diabo|ik wrote:
Last but not least, tell me how the Nvidia part is doing DX9.0 in "software" while DX9.0 has been supported by Nvidia in hardware for the past two generations or more. Do I have to remind you that the 7800 is Longhorn (DX10) ready, feature-set-wise, right now? I've been reading reviews for years, left and right, and since the 3dfx "software T&L/shader" emulation (which was done on the CPU), I have never seen any emulation notes whatsoever on any DX8.0 and/or DX9.0+ part from Nvidia, ATI, or even *cough* lowly S3 *cough*. One more thing: if this emulation thing were true, we'd see it reflected in scores across multiple CPU speeds; the curve would be different from a part that supports it in hardware. Once again, Beyond3D shows us otherwise (an unbiased site of old PowerVR fanboys and technical maniacs). If you can read AND understand their reviews, you have my utmost respect; if you don't, be aware that I can and WILL prove "any" (yeah, go fishing now) sophistic argumentation wrong.

I just love to stretch my tongue in there... Hmmm...



Latest info:

Windows Vista won't be using DirectX. It's using a new graphics engine; it will still support the old DX9, but there is a far more advanced version coming.

The DX9 emulation is done by the drivers, not by the hardware, hence it won't show up as software emulation. The way to prove this is to run Half-Life 2: the game won't actually run it in DX9, because the hardware doesn't support it.

I can prove this if you like, with my brother's 6800GT against my 9800XT and X850XT. His won't support DX9 (falls back to 8.1); mine will, however. I can and WILL back this up if you want; I can just go downstairs and take a screenshot.
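The fallback behaviour being argued about here (a game probing the card's reported capabilities and dropping to an older DirectX codepath) can be sketched roughly like this. The shader-model thresholds and the mapping to codepaths are illustrative assumptions for the sake of the example, not Valve's actual detection logic:

```python
def pick_renderer(shader_model: float) -> str:
    """Choose the highest DirectX codepath the hardware reports support for.

    Thresholds are illustrative: SM2.0+ gets the full DX9 path, SM1.4 the
    DX8.1 path, anything older falls back to the fixed-function DX7 path.
    """
    if shader_model >= 2.0:
        return "dx9"
    if shader_model >= 1.4:
        return "dx8.1"
    return "dx7"

# Hypothetical cards with assumed shader-model values, purely for illustration.
cards = {"card_a": 2.0, "card_b": 1.4, "card_c": 1.1}
paths = {name: pick_renderer(sm) for name, sm in cards.items()}
```

The point of the dispute above is exactly which input this probe sees: a capability the silicon implements, or one the driver merely reports.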
_________________


Sixkiller
Marshal
Courageous Elite Commandos


Joined: May 11, 2005
Posts: 1786
From: Netherlands
Posted: 2005-09-19 13:01   
It's OK to cry, Dom.
_________________



Jar Jar Binks
Grand Admiral

Joined: December 25, 2001
Posts: 556
Posted: 2005-09-19 14:45   
Quote:

On 2005-09-18 18:41, BackSlash *Jack* wrote:
X850XT still owns it in 60% of the tests...

Baaad




Yes, let's refuse to accept the fact that ATI is teh suxor, shall we?

Always has been and always will be...

And give some trustworthy numbers and links to where the X800XT "owns" anything other than yesterday's pizza...


_________________


Bobamelius
Grand Admiral
Galactic Navy


Joined: October 08, 2002
Posts: 2074
From: Ohio
Posted: 2005-09-19 17:50   
THG isn't trustworthy?
_________________


BackSlash
Marshal
Galactic Navy


Joined: March 23, 2003
Posts: 11183
From: Bristol, England
Posted: 2005-09-19 17:54   
Quote:

On 2005-09-19 17:50, Bobamelius wrote:
THG isn't trustworthy?




Thanks Bob

I'm sorry, JJB, but if you're saying THG isn't a valid source, then you really are showing that you don't know what you're talking about.


_________________


Philky!


Joined: July 19, 2004
Posts: 90
Posted: 2005-09-19 18:56   
Nvidia cards do have hardware DX9. Half-Life 2 doesn't support DX9 on Nvidia for some reason; I don't know why, it just doesn't. But that doesn't mean Nvidia doesn't have hardware DX9. It would make no sense if they didn't.
_________________


Diabo|ik
Grand Admiral

Joined: August 16, 2002
Posts: 327
From: Quebec, Canada
Posted: 2005-09-20 16:46   
We all know about Valve's affiliation with ATI...

I need an unbiased source/engine, not some biased "false proofs". Gimme Beyond3D-type facts, not unprovable claims...
_________________
Mostly Retired.

BackSlash
Marshal
Galactic Navy


Joined: March 23, 2003
Posts: 11183
From: Bristol, England
Posted: 2005-09-20 16:48   
Quote:

On 2005-09-20 16:46, Diabo|ik wrote:
We all know about Valves affiliation with ATI...

I need an unbiased source/engine, not some biased "false proofs". Gimme beyond-type facts, not unprovable claims...




No.

Valve CHOSE ATI because their hardware ran FAR better than Nvidia's. Valve would actually have to write custom code to get the FX series to run on par with ATI. They were going to do a partnership deal with whichever one ran best, and ATI did. It's not BIASED, it's purely a business deal; there are NO ATI optimizations whatsoever in the Source engine.

I think I have some authority on that matter, considering...
_________________


Bobamelius
Grand Admiral
Galactic Navy


Joined: October 08, 2002
Posts: 2074
From: Ohio
Posted: 2005-09-20 17:01   
Meh, let's give the ATI vs. nVidia feud a rest and all celebrate in unison the impending return of our beloved Sardaukar!
_________________


Sardaukar
Admiral
Raven Warriors

Joined: October 08, 2002
Posts: 1656
Posted: 2005-09-20 19:40   
At least someone loves me. Man, I'm going to lag into Minuet as soon as I spawn, I just know it.
_________________


Ramius
Fleet Admiral
Agents

Joined: January 12, 2002
Posts: 894
From: Ramius
Posted: 2005-09-20 20:11   
<3 sard
_________________


  Email Ramius
Diabo|ik
Grand Admiral

Joined: August 16, 2002
Posts: 327
From: Quebec, Canada
Posted: 2005-09-21 00:03   
Quote:

No.

Valve CHOSE ATI because their hardware ran FAR better than Nvidia's. Valve would actually have to write custom code to get the FX series to run on par with ATI. They were going to do a partnership deal with whichever one ran best, and ATI did. It's not BIASED, it's purely a business deal; there are NO ATI optimizations whatsoever in the Source engine.

I think I have some authority on that matter, considering...




You do know John Carmack's point of view on this very question? And why, for the exact same reasons, he claims exactly the opposite for his engine? And there are no Nvidia optimizations in that engine either. Now, if you dare to say that Half-Life 2's graphics are more advanced than Doom 3's... Half-Life 2's lighting is good, yes, much like Forsaken's awed me back then, but overall (and we shall forgive Doom 3's very dark settings; heck, even the outdoor and Hell areas looked SWEET, even with a lot of ambient lighting) Doom 3 has my heart graphics-wise. It just looks more real to me than anything else; even your beloved HL2 (I can understand, since you have a hand in it, that it might work you up inside when I tell you this) can't possibly compare. It's, as you said a few posts earlier, like comparing next gen to current gen. I'll go with next-gen games AND next-gen graphics, based on the voodoo of a proven and true programmer (not putting your voodoo in question here, but it'd be rather arrogant to claim that you or anyone at Valve can possibly compare with a worldwide respected and acclaimed legend, a legend that was earned, from the garage days to the top).

We shall call it a draw, *cough* just for the sake of Sard's post *cough*.

Oh well, WELCOME BACK SARD! I LOVE YOU! And I love you too, Jack, otherwise I wouldn't enjoy arguing with you so much.
_________________
Mostly Retired.

BackSlash
Marshal
Galactic Navy


Joined: March 23, 2003
Posts: 11183
From: Bristol, England
Posted: 2005-09-21 02:50   
The Doom 3 engine is as far as it can go... It's built for FPS, and that's it.

Source can be modified for anything: racing, RPG... anything. It can incorporate HDR (FULL HDR, not LDR like D3). It can load far more detailed textures (I can't remember the number, but it's somewhere around 5-7x bigger).
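A minimal sketch of what "full HDR" means in practice: render with luminance values well above 1.0, then tone-map them down into the display's range instead of clipping. This uses the standard Reinhard operator purely as an illustration; it is not the operator Source (or any engine named here) actually implements:

```python
def reinhard_tonemap(luminance: float) -> float:
    """Compress an HDR luminance value in [0, inf) into the displayable [0, 1)."""
    return luminance / (1.0 + luminance)

# An LDR pipeline would clip anything above 1.0 to pure white; tone mapping
# instead compresses bright values smoothly, preserving detail in highlights.
hdr_pixels = [0.25, 1.0, 8.0]                      # arbitrary example luminances
ldr_pixels = [reinhard_tonemap(l) for l in hdr_pixels]
```

Note how the 8.0 sample still lands below 1.0 after mapping, rather than saturating.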

If the Doom 3 engine is so great, and everyone says Nvidia cards are easy to cater for... why does everyone prefer the Source engine for modification in every aspect? There aren't nearly the same number of mods out for each one (talking FPS conversions alone). I mean, someone actually WROTE HDR code for the RPG modification Eclipse (any HL2 owners out there: worth a look). I don't see that sort of engine modification being done in D3. I mean, it looks good, but the people have no real face texturing.

The reason people don't want to mod or use the D3 engine is that no one wants to touch the FX card optimizations with a stick. The reason it takes so long to bring out a Doom 3 mod is that each mod needs to write its own optimizations for levels and all sorts.

Game engines should always come with generic optimizations. Writing card-specific ones is always, always going to slow down development time, and it's going to slow down the speed at which people can modify the game.

Carmack has lost it this time...

I forget the URL, but if you Google hard enough, some guy is actually doing a DirectX port of the D3 engine. It's nearly done, and the results don't go in Nvidia's favour; they go nowhere near it, in fact. Because the engine is so CPU-heavy as well as GPU-heavy, the DirectX 9 calls it has to make just slow down the game. The tester actually had to turn down all the details just to stop the jerking he experienced from textures being loaded while the CPU was bogged down.

The figures show it as well...

SOE are dropping the Nvidia "The way it's meant to be played" badge on PlanetSide, as are other companies. In fact, some "TWIMTBP" games run up to 2x faster on ATI cards, like WoW. It's just silly.

Nvidia need to revamp their technique and dump their "driver-based DirectX" solution in order to run head-on with ATI again. I really can't see them touching ATI with a 10ft barge pole for a while. Sure, Shader Model 3.0 comes in useful, and the 32-bit shaders do too. But no game uses them; the games that do use so little that it barely makes an impact on the FPS on ATI cards. Yes, it's future technology, but what's the use of future technology if no one uses it?

They're making it optional for developers so they can use it in their games, yes. But who's going to fork out $400 for a top-of-the-range Nvidia card NOW to play that one game? That game is also going to alienate ATI card users, and budget/midrange Nvidia users. It just isn't clever to bring out a technology that no one is using yet. Business-wise, it's just not good to alienate customers.

It's like MS's DirectX 9. It's been out for over a year now... yet a year and a half down the road, we have ONLY JUST started seeing DX9 games, and there are only 1-2 of them that support it. Even then they aren't fully DX9; they have DX8.1 and DX7 fallbacks.

Either way, Sard has gone with a very nice card, and the best one I can think of. It's a mid-high-range card, which Nvidia don't do; they don't touch the numbers in between (no 6700GT or 7700GT). Money-wise, the X700 is a sound choice, and is better than the 6600, the 7600, and the X600.

[ This Message was edited by: BackSlash *Jack* on 2005-09-21 02:53 ]
_________________


BackSlash
Marshal
Galactic Navy


Joined: March 23, 2003
Posts: 11183
From: Bristol, England
Posted: 2005-09-21 03:45   
http://apps.ati.com/ir/PressReleaseText.asp?compid=105421&releaseID=758490


_________________


Deathscythe
Admiral

Joined: November 30, 2002
Posts: 420
From: The netherlands
Posted: 2005-09-21 14:39   
ATI rocks and the rest sux. I wasted an Nvidia once: the cooling fan locked up, frying the chip below it :S. Nowadays I use ATI and ATI only.
_________________


http://deathmut.myminicity.com/




Copyright © 2000 - 2024 Palestar Inc. All rights reserved worldwide.
Terms of use - DarkSpace is a Registered Trademark of PALESTAR