Sea of Thieves storm screenshot header

Sea of Thieves PC Performance Analysis

Sea of Thieves is a new shared-world adventure game from Rare, available exclusively on Microsoft’s platforms, in which players take the role of a pirate and sail the seas of a fantastical world. The game has just been released, so it’s time to benchmark it and see how it performs on the PC platform.

For this PC Performance Analysis, we used an Intel i7 4930K (overclocked at 4.2GHz) with 8GB RAM, AMD’s Radeon RX580, NVIDIA’s GTX980Ti and GTX690, Windows 10 64-bit and the latest version of the GeForce and Radeon Software drivers. NVIDIA has not included an SLI profile for this game, meaning that our GTX690 behaved similarly to a single GTX680 (notice a trend lately with SLI?).

Rare has added a few graphics settings to tweak. PC gamers can adjust the quality of Shadows, Models, Textures, Water and Lighting. Gamers can also lock their framerate to 15, 30, 60, 90 or 120fps (or leave it uncapped), and there is an option to display Performance Metrics.
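For readers curious what a framerate cap like that does under the hood, here is a minimal sketch of a sleep-based frame limiter in Python. It is purely illustrative and assumes nothing about Rare’s actual implementation; real engines usually pair sleeping with a short busy-wait for tighter pacing.

```python
import time

def run_capped(render_frame, cap_fps=60):
    """Call render_frame() at most cap_fps times per second.

    A loose sketch of a frame limiter: measure how long the frame
    took, then sleep off whatever is left of the frame budget.
    """
    frame_budget = 1.0 / cap_fps  # ~16.7ms at 60fps, ~8.3ms at 120fps
    while True:
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # wait out the remainder
```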

GPU, CPU metrics, Graphics & Screenshots

In order to find out how the game performs on a variety of CPUs, we simulated a dual-core and a quad-core CPU. For our CPU tests, we lowered the resolution to 1024×768 in order to eliminate any possible GPU bottleneck. Without Hyper Threading enabled, our simulated dual-core system was unable to offer a smooth gaming experience. With Hyper Threading enabled, our simulated dual-core was able to run the game with a minimum of 67fps and an average of 93fps. Our six-core and our simulated quad-core systems had no trouble running the game, and performed similarly.
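The article doesn’t spell out how the smaller CPUs were simulated; a common approach is to restrict the game process’s CPU affinity so Windows only schedules it on a subset of logical cores. Here is a minimal sketch using psutil; the executable name is a placeholder, and the core numbering assumes Hyper Threading siblings are enumerated in adjacent pairs.

```python
import psutil

def simulate_cpu(process_name, logical_cpus):
    """Pin a running process to the given logical CPUs, mimicking a
    smaller CPU: e.g. [0, 2] for a dual-core without Hyper Threading,
    or [0, 1, 2, 3] for a dual-core with it (assuming adjacent HT
    siblings -- verify the topology on your own machine first).
    """
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(logical_cpus)  # restrict scheduling
            return proc
    raise RuntimeError(f"{process_name} is not running")

# Hypothetical usage -- the executable name is a placeholder:
# simulate_cpu("SoTGame.exe", [0, 1, 2, 3])
```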

Regarding its GPU requirements, Sea of Thieves needs only a moderate GPU in order to be enjoyed. Our AMD Radeon RX580 was unable to offer a constant 60fps experience on Mythical settings at 1080p; in order to hit a minimum framerate of 60fps, we had to lower our settings to Rare. On the other hand, our GTX980Ti was able to run the game at both 1080p and 1440p with a constant 60fps. At this point, we should note that our GTX980Ti was underused in a variety of scenes (especially in the starting town). Normally, this would hint at a CPU bottleneck; however, our CPU was never stressed in those scenes. Moreover, the game uses the DX11 API instead of the DX12 API, so the underutilization could be caused by an increased number of drawcalls that DX11 cannot feed to the GPU quickly enough (though this is just an assumption).
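One rough way to reproduce this observation on your own system is to log GPU and CPU utilization side by side while playing. Here is a quick sketch using nvidia-smi and psutil; the polling interval and duration are arbitrary choices.

```python
import subprocess
import time

import psutil

def log_utilization(seconds=60, interval=1.0):
    """Print GPU vs per-core CPU utilization once per interval.

    A GPU sitting well below 100% while no single CPU core is
    saturated is the pattern described above.
    """
    for _ in range(int(seconds / interval)):
        gpu = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True,
        ).stdout.strip()
        busiest = max(psutil.cpu_percent(percpu=True))
        print(f"GPU {gpu}% | busiest CPU core {busiest:.0f}%")
        time.sleep(interval)

if __name__ == "__main__":
    log_utilization()
```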

As noted, our GTX980Ti was able to run the game with more than 60fps at all times at 1080p and 1440p. At 4K on Mythical settings, our GTX980Ti pushed a minimum of 40fps and an average of 44fps.

Graphics-wise, Sea of Thieves has a cartoonish look that may put off a number of players. Rare has done an incredible job with its ocean simulation system, and it’s not a stretch to say that this game features some of the best water effects we’ve seen in a video game. Compared to its E3 2015 reveal, however, the game’s lighting system appears to have taken a hit, though it still looks great. Rare has used light propagation volumes for its Global Illumination solution and has created some truly amazing ‘fluffy’ clouds. Most textures are of high quality, though we do have to note that the game’s cartoonish art style makes it easier to hide some low-res textures. Sea of Thieves looks very good and cute, but it does not really push the graphical boundaries of video games.

In conclusion, Sea of Thieves performs great on the PC platform. The game does not require a high-end PC system in order to be enjoyed, and we did not notice any mouse acceleration or smoothing issues. There are also proper mouse+keyboard on-screen indicators, as you’d expect from a PC game. It’s not perfect, as we still can’t figure out what was causing our GPU to be underused in some scenes (again, our guess is an increased number of drawcalls that the DX11 API cannot handle ideally, but as we’ve said, this is just an assumption), so Rare will have to take a look and improve things.

Enjoy!

27 thoughts on “Sea of Thieves PC Performance Analysis”

  1. John, this is another game where I’ve noticed (and it’s been pointed out) that the game lacks SLI support. Why is this happening seemingly all of a sudden? Is there something wrong with SLI that I haven’t been informed of recently? Or is it finally just no longer beneficial or cost-efficient (was it ever?) to run SLI?

    1. Both AMD and NVIDIA are giving up on their multi-GPU tech and, honestly, I can’t blame them. I say let it die; it rarely worked properly, and even when it did, you sometimes had to wait days or weeks for it to be implemented in a game.

    2. SLI/CrossFire aren’t really useful anymore, and the number of gamers running them has dwindled to the point that it’s not worth the effort for Nvidia or AMD to keep supporting them; more importantly, most developers don’t consider supporting them to be worth their effort. Even the need for multi-GPU at 4K has fizzled. A single 1080 Ti can handle that, and 4K adoption has gone to h*ll anyway. Last time I checked the Steam Hardware Survey, only 1 out of 200 gamers reported using 4K. SLI/CrossFire had its day, though. It was cool to see what you could do with it.

      There is a possible use in the future for multi-GPU if it’s needed, but it won’t be anything like SLI/CrossFire. It will be multiple GPUs on a single card with an appropriate interface to connect them, running as a single GPU through drivers. That tech has dried up too, though.

      1. Pretty much. Only a very small number of people run SLI, and those that still do only go with high-end GPUs like the 1080 Ti to play at 4K. Other than that, the majority of PC users don’t care for it.

        It’s just more trouble than it’s worth anymore. A real shame, but that’s just how it is. I was shocked when Gears of War 4 got its SLI patch.

      2. I once had a GTX 295. It was a dual-chip card with onboard SLI that was very expensive. The games that ran on Unreal Engine 3 worked beautifully, but the other games… I still have nightmares.

        1. Yeah, that 295 was a good card. IIRC it was two 260 GPUs on a single card. People used to really get into SLI/CrossFire, but you had to be willing to work with it. I remember one guy who bought three GTX 280s. He spent around $2,000 just on the cards. I remember him saying that after about an hour of gaming the heat would drive him away from his rig. lol He solved that with a fan and dryer vent tubing, venting the heat out of a window that way.

          Seems like people just don’t do radical things like that anymore. Everything, including overclocking, is simple and people like to just plug and play.

          1. Hehe, three 280’s could help you save on heating costs in winter! The GTX 295 was actually 2 x 280s that were underclocked to the 260’s speed. But I still needed to strip the card down and put thermal pads everywhere, because I often hit 90°C in the more intensive games.

            I still loved the card. Really shined like a diamond in games that were SLI friendly.

      3. Thank you @disqus_eNwMfpj0Z2:disqus for taking the time to explain it to me, as somebody who never really looked into the technology and always saw it as somewhat of a gimmick that people with loads of money could get into. I was going to say that a card with two GPUs built onto it would probably be a better idea to move forward with, but if that technology is going by the wayside too, it looks like we’re just going to have incredibly powerful single-chip GPUs from here on out. Which I’m perfectly fine with, if my next card can stomp 4K with ease.

        1. The 2080 Ti, when it comes, should handle 4K easily, but there will be a few games that won’t run at around 60 FPS, whether due to extremely demanding graphics or poor optimization.

    3. It could be the fact that SLI sucks absolute donkey balls and it isn’t worth it to cater to an insanely small portion of the market.

        1. Good job responding like a sensible adult, instead of like @disqus_eNwMfpj0Z2:disqus, who properly explained why. Now go play in your front lawn where mommy and daddy can see you, and make sure you’re not getting into any trouble.

  2. SLI gaming is pretty much done. If a game does not work via forcing alternate frame rendering 1 or 2, then don’t bother playing with bits in NVInspector.

    Which is why my 970s in SLI will be my last SLI setup. It’s been a good run since the days of the Voodoo2 😀

    1. With Moore’s law out the door, don’t count it dead just yet, John. It will be back, just in a different form such as MCM. Large single-die chips are getting too hard and expensive, and die shrinks are happening less often, so there won’t be many options left other than combining smaller chips onto one PCB. Should be interesting times, whatever happens.

      I’m tempted to ditch SLI myself (980ti’s), but that all depends on how good the next top-end Ti card is. If it can do 4K at 90fps in the latest games, which I doubt, I would dump SLI. Somehow I see myself still buying 2x 2080ti’s to handle that upcoming 4K 144Hz HDR monitor.

      1. Have you heard anything recently on that? I’ve seen a couple of articles a while back on the multi-GPU idea but then nothing more so it makes me wonder if Nvidia is still looking at that.

        That 4K HDR 144 Hz monitor looks sweet, but I wonder how many will pay the money to own one. 4K and a 2080 Ti will be the next upgrade for me. People have been saying for years that 4K would go mainstream as monitor prices and the cost of buying enough GPU to run it come down, but it just never has caught on. In fact, it seems to be decreasing in popularity over the years. For a while adoption managed to creep up to around 1% according to the Steam Hardware Survey, and the latest I saw, it was down to 1/2 of 1%, or around 1 out of 200 gamers. 75% are still on 1080p. I’m not saying the Survey is correct, but it does give some indication.

        It will probably take two 2080 Tis for you if you want extremely high FPS. There will always be a small handful of games where even that won’t be enough, but that’s always been the case. And as you know, some games aren’t going to support SLI anyway.

        1. Both companies are looking into MCM, but there hasn’t been much more news on it in months. They won’t have many options left in the future, though; they will have to eventually adopt some form of multi-chip module, especially when they start getting down under 10nm.

          The 4K 144Hz monitor will be niche for sure, but so were the 1440p 144Hz Asus ROG Swift and Acer panels when they came out. It’s just a shame the 4K panel isn’t at least 32 inches minimum; 27 is too small for 4K in my opinion. Besides the Asus ROG Swift 1440p 165Hz panel, I own a 28-inch Samsung 4K 60Hz, and it’s back in the box under the bed!

          Yeah, agreed, 4K PC gaming is still years away from mainstream; even 1440p is still expensive for most. But it takes early adopters parting with their hard-earned cash to eventually bring those prices down for everyone else, and to act as guinea pigs to iron out all the flaws. 🙂

      2. It’s dead in terms of separate cards, but in terms of MCM it would likely work a lot better anyway, especially if the GPUs had a dedicated hardware controller and presented themselves as one to a game, or at least worked seamlessly together. Similar in principle to hybrid graphics on modern laptops, with branding like Nvidia Optimus on top.

        3Dfx had interesting multi-card solutions, and we have seen DX12 software tech run vendor-agnostic GPUs together. The most interesting recent hardware technology, for me, popped up about a decade ago now: the Lucid Hydra engine.

        The idea was that you could pair any GPU with any other GPU (AMD and Nvidia mixes, even different memory sizes; the chip didn’t care) on a motherboard equipped with this processor, and it would split the scene geometry into various parts for each card to render its section. The Hydra chip would do clever stuff like load balancing, latency management and combining the parts into the final frame, to achieve the best scaling from equal (or unequal) GPU performance. You therefore have both cards always rendering at maximum performance. It should have eliminated the stuttering and frame-pacing issues that very often result from the AFR rendering method still commonly used today in multi-GPU setups (see the load-balancing sketch after this comment).

        In theory, dedicated hardware to do this would be the best way. However, it wasn’t cheap at the time, had lots of initial niggles and relied heavily on driver support to function. The company backing it (Lucid) was relatively small. It was very new and experimental, and it didn’t work that well, as the boards that launched with it were practically prototypes. Nvidia in particular also wanted it dead at the time, because it cut them out of the motherboard chipset business (though history shows they were about done there anyway).

        It never stood a chance. But I felt it only needed more time to mature and a big company to take it on and refine it. There was big potential, as it is still a better approach than the long-standing multi-card methods used today.
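        To make the load-balancing idea above concrete, here is a very loose Python sketch of splitting a frame’s scanlines among GPUs in proportion to their measured throughput. It only illustrates the principle; it is not Lucid’s actual algorithm.

        ```python
        def split_frame(height, throughputs):
            """Divide `height` scanlines among GPUs in proportion to
            each GPU's measured throughput (any consistent relative
            units work). Returns (first_row, end_row) pairs.
            """
            total = sum(throughputs)
            slices, start = [], 0
            for i, t in enumerate(throughputs):
                if i == len(throughputs) - 1:
                    end = height  # last GPU takes the remaining rows
                else:
                    end = start + round(height * t / total)
                slices.append((start, end))
                start = end
            return slices

        # A fast and a slow GPU sharing a 1080-row frame:
        # split_frame(1080, [2.0, 1.0]) -> [(0, 720), (720, 1080)]
        ```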

      1. How’s your 970 SLI going? I was thinking of getting a second 970 for my baby, but I’m not sure if that would be a better investment than just getting a new GPU.

        1. New GPUs are expensive, and the new Nvidia GPUs are just around the corner. I would never go SLI again because of the bad support. I recommend selling the 970 when the new Nvidia GPUs come out and getting a strong single GPU.

    1. Depends on the game. In this particular case, you can pair them, and the 1080Ti will be used to its fullest at 4K. In other games, like ACO or Watch_Dogs 2, there is really no point pairing an i5 6500 with a GTX1080Ti.

    1. Slightly delayed so we can test it with the optimized drivers (AMD has already released one; NVIDIA will most likely release theirs today). Rest assured, though, that it’s well optimized.
