AMD reconfirmed today that Star Citizen will support its TressFX tech. While support for TressFX was originally announced last year (at AMD GPU14), the Red team felt the need to remind us that Cloud Imperium does not intend to drop support for it. In addition, a new commercial video has been released that can be viewed below. It’s interesting that at the end of the video we see a female character with – what seems to be – hair powered by AMD’s tech. Enjoy (we’ve also included a new gameplay video that was captured during CitizenCon 2014)!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit era, and considers the SNES one of the best consoles. Still, the PC platform eventually won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
AMD needs to bring out the R9 300 series soon if they want to compete with Nvidia at the moment. The GTX 970 is simply too good a card at just the right price for most PC gamers.
Not really. It’s very similar in performance to the R9 290, which you can now get for the same price as a 970 after the latest price cut, and the 290 will perform better in the Mantle titles: Star Citizen, Battlefield Hardline, Civilization: Beyond Earth, Dragon Age: Inquisition, the next Mass Effect and Mirror’s Edge, plus possible future titles using CryEngine and Frostbite, since both engines now officially support it. That covers most of the graphically heavy games.
Mantle is garbage. Once DX12 is released, there won’t be any reason to buy an AMD GPU just for Mantle.
http://www.hardocp.com/images/articles/1412746309oHQVINIuLi_5_2.gif
Nvidia doesn’t need Mantle as much because AMD’s drivers use much more CPU power than Nvidia’s. For example, Dead Rising 3, CPU benchmarks from the GameGPU site:
i5 4670 @ 3.4 GHz
Nvidia GTX 780: 79 fps min / 118 fps avg
AMD R9 290: 46 fps min / 85 fps avg
That’s a 33 fps difference in minimum fps under DX11. In Battlefield 4, AMD users have to use Mantle in order to see fps similar to what Nvidia users get in DX11. Nvidia did a very good job with their drivers after AMD released Mantle.
Link please? I remember a similar performance issue with the latest Batman game, and it was fixed with the first beta driver that came out right after the game was released.
The new 14.9 driver brings big DX11 improvements in a variety of games, so it might be worth checking performance with that driver first (maybe the issue has already been fixed).
As I wrote in my original post, it’s the GameGPU site; all you have to do is type that into Google:
http://gamegpu.ru/images/remote/http–www.gamegpu.ru-images-stories-Test_GPU-Action-Dead_Rising_3-test-dr_3_proz_amd.jpg
http://gamegpu.ru/images/remote/http–www.gamegpu.ru-images-stories-Test_GPU-Action-Dead_Rising_3-test-dr_3_proz.jpg
And also tech like TrueAudio, Mantle, XDMA CrossFire, HSA, etc., together with the company’s philosophy of using standards in an optimized way across a variety of hardware (HairWorks runs 9 times slower on AMD, but TressFX runs the same on Nvidia as it does on AMD). So it is not only about performance-per-watt efficiency; there are many more parameters to consider, and the philosophy each company presents for PC gaming as an industry.
Maxwell v2.0 is very good and efficient, but do not forget that AMD has a much more versatile chip, with great performance in rendering as well as compute. And there are games even today that leverage those capabilities, CryEngine titles among them. So regardless of how efficient Maxwell is now, AMD might be more powerful once HSA and GPGPU make their way into more games, today and in the very near future.
Actually, I think Nvidia wins for many non-gaming tasks. Video editing programs like Premiere often support doing work directly on an Nvidia GPU. My friend got a new computer for video editing this year and there was no question about which GPU he had to get.
Oh my, what doesn’t this game support/have?
Hmm, video settings, controller settings, normal ship controls, mouse vs. joystick balance, a release date… ^_^
Of all the stated technologies, we haven’t seen anything outside of commercial videos 🙂
Everything you mentioned has been showcased, and you should watch the CitizenCon presentation. Squadron 42 is coming next year, the Star Citizen PU beta is coming late next year, and the full release is set for early 2016. It is too early to announce the exact date.
They’re in the pre-alpha build now; I downloaded it yesterday since I got Star Citizen free with my R9 280.
Yep.
LOL
It comes down to how well this tech runs on hardware other than Nvidia’s. That’s the same reason they used TressFX instead of HairWorks.
Seems we have a truly made-for-PC game.
I hope other devs take note. $55 million for just the promise of a true PC game.
Every single dollar spits in the face of the lie that we’re all just pirates.
It’s no lie, we’re all pirates – space pirates! 😀
stupid
Full presentation in HD:
youtube.com/watch?v=rGVAG-E2tO0
Planetside footage:
youtube.com/watch?v=8gGLE3USB2U
Mantle is the reason there’s a DX12 (as a low-ish level API) in the first place. Since Microsoft wasn’t doing squat about real DX development, AMD took the opportunity of having designed part of the current consoles (which are closer to a PC design) to release an API that brings to PC what happens on consoles (low-level optimizations), something developers had been asking for.
Microsoft would prefer people to game on an Xbox, but if they lose their grip on PC gaming (with DX as the usual standard), they’ll undermine their control of the platform… and they also risk Windows as a standard. If OpenGL or Mantle start gaining traction, it wouldn’t take long for Linux to gain market share (gaming is basically the thing it lacks most, and the reason many people won’t ditch Windows once and for all).
So no, Mantle is not garbage; it pushed the industry forward while Microsoft was worrying about tablets, apps, stores and other useless crap. Plus, some people have gained around 10 to 15 fps from some Mantle implementations with only the initial post-launch patches, which is considerable for a start.
Is it a reason to buy an AMD card when DX12 comes out? Possibly not. Just as, for most people, proprietary stuff like GameWorks or G-Sync isn’t the sole reason to buy Nvidia either. In any case, the industry moved forward, and that’s what matters.
The actual gameplay looks horrible.
It’s not bad at all for a pre-alpha; the combat and the feel in space are rather nice, and it looks stunning even at this phase. You can turn off the motion blur with r_motionblur = 0. The textures are also very detailed.
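For anyone wondering where that line goes: a minimal sketch, assuming the pre-alpha client reads a standard CryEngine-style user.cfg from its install folder (that file and location are my assumption, not something confirmed above); the same line should also work typed into the in-game console if it is exposed.

    r_motionblur = 0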
What do you mean by that? They didn’t really show much gameplay, just flying and walking around. I will admit that the player character in 3rd person view looked like a hunchback trying to hold in a turd but that’s just work in progress animation cycles.
The level of detail in the city’s architecture was really impressive though. It looked awesome.
how about you look into it before opening your useless mouth
You’ve got no basis on which to say that, and comparing against Nvidia’s new GPU to make your point is invalid. Check out the CrossFire benchmarks and you’ll see how much of a difference Mantle makes. Also, yet again, silly people like you don’t know why it helps: it mainly benefits CPU-limited setups, and it’s a nice lightweight graphics API.
Not necessarily $534k, as some players can “melt” their other ships in order to buy another one.
I hope they’ve gotten the bugs out of it. Has it been used in anything but Tomb Raider? It was kind of neat in that, but it was buggy sometimes and the performance hit was huge and frankly not worth it. Didn’t seem ready for prime-time.
Even in pre-alpha the city already looks tonnes better than anything in Mass Effect.
DX12 is bound to one OS. That has worked well for MS for a while, but it might fall apart once Mantle is folded into OpenGL 5: one API for Windows, Mac, Linux and Android phones.
It’s not labeled. What exactly do I see here?