Quixel’s Jungle Environment – Direct-Feed Screenshots in Glorious 4K Resolution

Yesterday we shared a video showing off Quixel’s latest environment, created with Megascans in Unreal Engine 4. As we all know, YouTube greatly degrades visuals, so today we are sharing the first direct-feed images from that map. While Quixel’s environment is not as realistic – in terms of colours – as Crysis, it can easily be described as a true next-gen version of the first Far Cry game. Enjoy the images after the jump (be warned that each one of them is around 13MB in size)!

30 thoughts on “Quixel’s Jungle Environment – Direct-Feed Screenshots in Glorious 4K Resolution”

  1. Although it’s obviously fake-looking because of too much vibrancy or too many post-processing effects, DAMN, these are gorgeous! I wonder when this kind of visual will be used in games, and how processing-hungry it will be.

      1. Unfortunately, yeah. As long as there are hardware limitations, multiplatform devs won’t push their games this far 🙁 .

        There might be hope for PC games, though, if DX12 delivers the performance it promises.

        1. I’m really curious about DX12, and whether they’ll somehow manage to use it on consoles and get something like twice the power out of them.

      2. While I do agree that consoles are pushing back graphics somewhat, I also think they’re not the only reason why we haven’t really seen immensely impressive graphics since the first Crysis.

        I think the main reason we’re not seeing huge advancements is that our GPUs and CPUs are simply not evolving as fast as they used to. Every new generation of GPU and CPU is just a few percent stronger than the last one, and it might not be worth it from a marketing standpoint to develop for the few with high-end multi-GPU configurations.
        I’m sure that if the next GPUs from AMD or Nvidia were somehow twice as powerful as today’s (if they discovered some new technology or something), and not just +10% more fps as usual, then we would see some much more amazing-looking games coming out.
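
        To put that in perspective, here’s a toy compounding calculation (made-up rate for illustration, not benchmark data):

          # How many +10% generations does it take to double performance?
          perf = 1.0
          generations = 0
          while perf < 2.0:
              perf *= 1.10       # +10% per generation
              generations += 1
          print(generations)     # prints 8 -> eight small steps per true 2x leap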

        1. Hmm, but the current gen is 27-35% faster than the last one 😉
          So what I need now is for the R390X to be 40-70% faster than the R290X.
          And I think they can make it possible -> HBM, a 2.5D interposer + the best GCN ever (it’s the last GCN, because Arctic Islands will be something different, and on 16nm).

        2. The secret to DX12 is having plenty of spare GPU power, so that the GPU can process specific CPU tasks it is better suited for, which removes a huge CPU bottleneck. Unfortunately, consoles are already struggling to hit 1080p/30fps, so they don’t have much GPU power to spare; I’m sure they can utilize it in some cases, but nowhere near the level of benefit the PC will reap. The CPU has always been the biggest bottleneck for PC gaming: while GPUs advance 25-40% every year, a four-year-old Intel Sandy Bridge overclocked to 4.5GHz+ often offers the same, or nearly the same, performance in games as a modern processor.
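
          A toy model of that bottleneck argument (all numbers invented for illustration; this is not real DX12 code):

            # Frame time is set by whichever processor finishes last.
            def frame_ms(cpu_ms, gpu_ms):
                return max(cpu_ms, gpu_ms)

            # CPU-bound PC under old-style single-threaded submission:
            print(frame_ms(cpu_ms=16.0, gpu_ms=8.0))    # 16.0 ms -> ~62 fps
            # Same PC if DX12 halves the CPU's submission cost:
            print(frame_ms(cpu_ms=8.0, gpu_ms=8.0))     # 8.0 ms -> 125 fps
            # A GPU-bound console barely moves with the same CPU saving:
            print(frame_ms(cpu_ms=10.0, gpu_ms=33.3))   # 33.3 ms -> ~30 fps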

          1. And just for comparison, here’s how it looked some years ago. I wanted to include actual benchmark charts, but moderation takes an absurd amount of time on this site (these are Tom’s Hardware and AnandTech benchmarks):

            TNT2 Ultra vs GeForce 256 DDR (12 vs 22 fps stock, 28 OC’d, Quake 3 at 1200p)

            GeForce 256 DDR vs GeForce 2 Ultra (25 vs 55 fps, Quake 3 at 1200p)

            GeForce 2 vs GeForce 3 Ti 500 (56 vs 123 fps, Quake 3 at 1200p)

            GeForce 3 Ti 500 vs GeForce 4 Ti 4600 (85 vs 130 fps, UT2003 at 1024×768). ATI released the Radeon 9700 Pro around that time; it scored 177 fps in this benchmark, so a year after the GeForce 3 we still had a ~100% performance increase

            GeForce 4 Ti 4800 vs GeForce FX 5900 (55 vs 113 fps, and Nvidia had an even better card, the 5950; UT2003 at 1200p)

            7900 GTX vs 8800 Ultra (31 vs 71 fps, Doom 3 at 1600p + 4xMSAA)

            The 9800 GTX was a big letdown (almost the same as the 8800)

            8800 GTX vs GTX 280 (26 vs 53 fps, Half-Life 2 at 2560×1600 + 8xMSAA)

            GTX 280 vs GTX 480 (56 vs 100 fps, Far Cry 2 at 1200p + 4xMSAA)

            And since the GTX 580, the performance increases haven’t been spectacular anymore. No wonder graphics quality in new games hasn’t improved much since Crysis 1.
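
            For a quick sanity check, the generational gains implied by the fps pairs above (a throwaway script; numbers copied straight from the list):

              # Percentage gain per generation, from the fps pairs quoted above.
              pairs = [
                  ("TNT2 Ultra -> GeForce 256 DDR", 12, 22),
                  ("GeForce 256 DDR -> GeForce 2 Ultra", 25, 55),
                  ("GeForce 2 -> GeForce 3 Ti 500", 56, 123),
                  ("GeForce 3 Ti 500 -> GeForce 4 Ti 4600", 85, 130),
                  ("GeForce 4 Ti 4800 -> GeForce FX 5900", 55, 113),
                  ("7900 GTX -> 8800 Ultra", 31, 71),
                  ("8800 GTX -> GTX 280", 26, 53),
                  ("GTX 280 -> GTX 480", 56, 100),
              ]
              for step, old, new in pairs:
                  print(f"{step}: +{(new / old - 1) * 100:.0f}%")
              # Every step lands between roughly +53% and +129% -- far above
              # the ~25-40% yearly gains mentioned above.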

        3. Consoles are not pushing graphics back. New consoles, which are also based on the x86 architecture (basically a PC in a box) and equipped with Radeon GPUs, notebooks (x86), and weak hardware (CPUs) in general are pushing physics and AI back.

          Reducing graphics quality is relatively easy for developers: if you developed a game (100% of the graphics content) for specific hardware, you can easily scale the graphics down to run well on the weakest hardware. Of course, upgrading graphics is a different matter, i.e. it’s harder to do.

      3. Nope, the new consoles are based on the x86 architecture. I mean, it’s like you said, “thank notebooks for that”, and those are also based on x86. The new consoles also use Radeon GPUs, and in “modern tools”, i.e. “editors”, you can change/sacrifice the graphics in real time. So, basically: develop a game (best graphics possible) for high-end SLI/CF GPUs, then sacrifice the graphics so it works on the weakest hardware as well (low end, midrange, etc.).

        As you can see, graphics is not the problem; physics and AI are the real problem, because a game with perfect physics can’t work well on weak hardware (a weak CPU), and that’s why they develop games with the weakest hardware in mind (basically). The point is, you can’t sacrifice/change the physics… you can, but it wouldn’t be the same game. Do you know what all multiplatform games have in common? The game has the same animations, AI, and physics on all platforms (PS3/360/PC/PS4/X1), and the only difference is in the graphics and GUI.
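
        A minimal sketch of that split, with hypothetical preset names, values, and a stand-in world object (nothing here comes from a real engine):

          # Only presentation scales per platform; the simulation that drives
          # gameplay (physics, AI, animations) ticks identically everywhere.
          GRAPHICS_PRESETS = {
              "console": {"resolution": (1600, 900),  "shadow_res": 1024},
              "low_pc":  {"resolution": (1280, 720),  "shadow_res": 512},
              "high_pc": {"resolution": (3840, 2160), "shadow_res": 4096},
          }
          SIM_TICK_HZ = 60  # fixed on every platform

          def run_frame(platform, world):
              world.step_physics_and_ai(1.0 / SIM_TICK_HZ)  # same game everywhere
              world.render(**GRAPHICS_PRESETS[platform])    # scaled per platform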

      1. Because real leaves/plants don’t shine like that. The effect used on the leaves/plants for wetness makes them look like they’re made of plastic. Also, things in real life don’t have the kind of vibrant colors seen in the screenshots, which makes them look like CG.

        Edit: Look at Oscar’s edited versions. They’re closer to the real colors, which are duller and less vibrant than the originals.

        If I’m playing a game and have to choose between the two versions, though, I’ll choose the original vibrant one.

  2. Highly detailed pictures, but at the same time the graphics look extremely fake. All the materials/vegetation look like plastic, nothing like real-life vegetation. Crysis 1, Ethan Carter, or even modded Skyrim still have better vegetation.

    1. I don’t think they were going for photorealism; even the lighting is completely off in that case. But it would be a lovely, colorful graphics style for a fantasy game.

  3. When will we start seeing a true next-gen game such as this made for the PC? I’d say there’s zero chance as long as Microsoft & Sony are manipulating the gaming industry.

  4. I’ve used Photoshop to color-correct these pictures and adjust the gamma S-curve, and I also resized them a bit for more manageable viewing. These are quite incredible, to be honest. I didn’t think UE4 could present such impressive vegetated environments… Hats off to the artist. The unforgiving chromatic aberration is a pity, though. Why the hell is it ON by default in the Unreal Engine development tools??? So many artists and devs forget to turn that shit off. Horrible effect imho. (A way to disable it is noted after the links below.)

    http://abload.de/img/j8xwkgm.jpg

    http://abload.de/img/j67cjsi.jpg
    http://abload.de/img/j5jxks5.jpg
    http://abload.de/img/j4pik8l.jpg
    http://abload.de/img/j3s0k4t.jpg
    http://abload.de/img/j20pk1l.jpg
    http://abload.de/img/j1mjjg6.jpg
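
    For anyone who wants to switch it off: if I remember correctly, chromatic aberration in UE4 is the “Scene Fringe” setting, which can be zeroed in a PostProcessVolume (the Scene Fringe intensity) or disabled globally with a console variable:

      r.SceneColorFringeQuality 0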

  5. Hey, Valve’s VR headset just leaked. Gizmodo’s got the rundown. It’s coming out this year, unlike Oculus, which is coming out never.

  6. Well, we are three years into 28nm. We can expect a big jump this year and next, because we’ll probably see 20nm and 16nm GPUs.

  7. In real life nature is saturated and full of colors, but the thing is, light refracts through vegetation and materials, and we see smooth color transitions. In these Quixel jungle pictures there’s no light transmission through the plants at all, and for that reason the plants look fake, and you can’t correct that with Photoshop (though of course a black-and-white picture will always mask things like that).
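
    Games usually fake that transmission with “wrap” diffuse lighting, which softens the light falloff on thin leaves. A minimal sketch of the standard trick (a common approximation; no claim that this is what Quixel’s shader does):

      # Plain Lambert vs "wrap" diffuse, a common foliage approximation.
      # n_dot_l is the cosine between the surface normal and the light.
      def lambert(n_dot_l):
          return max(0.0, n_dot_l)

      def wrap_diffuse(n_dot_l, wrap=0.5):
          # Lets light bleed past the terminator, mimicking light
          # scattering through a thin leaf; wrap=0 reduces to Lambert.
          return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))

      # A leaf edge facing 90 degrees away from the light (n_dot_l = 0):
      print(lambert(0.0))       # 0.0   -> hard black edge, "plastic" look
      print(wrap_diffuse(0.0))  # ~0.33 -> soft transition instead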

  8. Firstly, English is actually my third language (self-taught), if that’s the problem. I apologize if it was not as readable as it should be.

    OT: I basically said that, when it comes to graphics, we can’t blame the NEW consoles (PS4 and X1) without also blaming, for example, weak notebooks. Why? The PS4, X1, and notebooks all use AMD GPUs, and modern development tools, i.e. editors, are very flexible, scalable, multiplatform, and relatively easy to use, which means you can easily sacrifice graphics quality in real time; you first need, of course, to create something, i.e. the graphics-related content (graphics assets, 3D models, etc.). Now, what’s more logical to you:

    1. Develop the graphics for the weakest hardware and then “upscale” or upgrade them to work best on the strongest hardware, which obviously means the game would look and play better on stronger hardware,

    OR

    2. Develop the game (graphics assets, etc.) for the strongest hardware (quad CF/SLI) and then downscale it, i.e. sacrifice graphics quality so it also works on the weakest hardware (with lower graphics quality, of course).

    Obviously, option 2 is the logical one, but developers really don’t care about utilizing the true, maximum potential of high-end SLI/CF setups, for some reason, sadly.

    Now, when it comes to consoles: what do the PS4/X1 have in common with notebooks and PCs?

    A: All these systems are based on the x86 CPU architecture. Previous, older consoles were based on many different architectures (the PowerPC architecture, custom hardware made just for consoles, etc.), but the PS4/X1 use an APU from AMD.
    B: All these systems support AMD graphics (PS4/X1/PC).
    C: Notebooks are not upgradable, just like the X1 and PS4 consoles.

    That means the PS4/X1 consoles are just non-upgradable PCs in a box, using AMD’s CPU, GPU, and GDDR5/DDR3 memory (similar to notebooks). “Same sh**, different package,” I say.

    With that said, we can’t really blame the PS4/X1 consoles ALONE when it comes to graphics, or to be more precise, there is no reason to do so, because GPUs/graphics aren’t the problem, thanks to modern development tools. When it comes to CPUs, however, we can indeed blame all notebooks with weak CPUs, weak desktop CPUs, and the PS4/X1 consoles together, because weak hardware, or should I say weaker CPUs, are holding the gaming industry back when it comes to revolutionary changes in AI, physics, etc. Why? Simply because developers still support old/weak hardware, and they develop games with that weaker hardware in mind (be it consoles or weak CPUs in PCs/notebooks). The thing is, the gameplay mechanics (physics, AI, animations, etc.) in multiplatform games are the same on all systems/platforms, no matter how strong those systems are (PS4, X1, PC, etc.). Why? Again, simple logic: they want to provide the same experience, the same game, on all platforms, which obviously means they still support older and weaker dual-core and quad-core CPUs.

    We will see a revolution in AI and physics when developers/publishers list, say, the Intel i7-5960X as an official minimum requirement. And again, when it comes to graphics, that’s not a big problem, really, because all modern GPUs support the same instructions, functions, shaders, etc., and most importantly, graphics don’t and can’t change the gameplay mechanics: by changing graphics quality you are not changing the gameplay in any way. The problem is the CPU part, i.e. the CPU architecture; that’s why we saw so many bad Xbox 360 ports on PC (not because of the GPU). One more example: for emulators, the CPU is 95%+ of what matters when emulating other systems, and it’s the hardest thing to emulate.
