An In-Depth Look at Online Multiplayer in Splatoon 2

This article is licensed under a Creative Commons Attribution 4.0 International License.


Splatoon 2 launched in July 2017 on the Nintendo Switch. For the competitive community surrounding the first game, it promised to be the new lease of life we’d been craving. Ever since the announcement of codename “NX”, fans knew there had to be a Splatoon title for it. When it finally arrived, many were clamouring to get online as soon as possible. The transition got underway over the weekend: the die-hard fans began jumping ship from the Wii U and heading over to their new home on the Nintendo Switch – myself included. Since Smash ‘N’ Splash, I haven’t found myself wanting to play Splatoon. Quite frankly, I’ve grown tired of the terrible online multiplayer experience on the Wii U.

For returning players, I’m afraid I don’t bear good news. Before you criticise me for being overly harsh on this game, know that I’m doing so with good reason. I’m as passionate about this series as those who have thrown thousands of hours into it. I’m going to be harsh because I care about the success of Splatoon 2 as much as anyone else in the community. Whilst producing this, Nintendo have sent me through a whirlwind of emotions: uncertainty, awe, doubt, disbelief, joy, laughter, and inevitably pain. I set the standards high for this title because it’s the second iteration; the development team had the pitfalls of the previous game to learn from this time. Either way, I hope you take the time to listen to what I have to say. I’ve written this because I care.

What You Need to Know First

First of all, I would like to explain a few concepts. For those of you who don’t understand how online experiences are created in games, take the time to read through this whole section. You may have existing misconceptions that will affect your understanding of this article. Put all your preexisting thoughts and opinions aside; this bit is important.

Ping

A great way to explain what ping is, and how it works, is to picture a submarine. In The Hunt for Red October (1990), Sean Connery orders his crew to check the distance to the US submarine with “one ping only” – a sonar ping, of course, not a network one. By sending an audio signal into the ocean and waiting for the returning echo, you can approximate the distance between you and the other vessel. In computer networking, ping is essentially the same concept. You send out what is known as an “ICMP Echo Request”, and in most cases the server sends a response. The ping is the time difference between sending a request and receiving a reply – also known as the Round Trip Time (RTT). A higher ping results in more lag between two connected players. With a short distance between the players, and good traffic routing between them, the ping should be very low. On the flip side, if a player is further away, the ping between you and that player will be far greater. In the context of Splatoon, this is why Japanese players lagged an awful lot for European and North American players. Depending on how bad the lag is, high ping players can also give low ping players a bad experience, because their damage arrives far too late. This is also partially why low ping players will take damage from high ping players, even after they have been splatted.
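If you want to see this for yourself without any special tooling, a rough way to measure RTT from a script is to time how long a connection handshake takes. The sketch below is a minimal example in Python – it times TCP handshakes rather than sending a true ICMP echo, and the host and port are just placeholders – but the number it prints is the same round trip concept described above.

```python
import socket
import time

def estimate_rtt(host, port=443, samples=4):
    """Roughly estimate round trip time by timing TCP handshakes.
    Not a true ICMP echo, but it illustrates the same idea."""
    times = []
    for _ in range(samples):
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=2):
            pass  # the handshake completing is our "echo"
        times.append((time.monotonic() - start) * 1000)  # milliseconds
    return sum(times) / len(times)

# Placeholder host - swap in whatever server you want to measure against.
print(f"Approximate RTT: {estimate_rtt('example.com'):.1f} ms")
```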

Update Rates

What adds additional delay in online games is the update frequency between you and other players. For example, if we receive 30 updates per second, there is more delay between updates than if we were receiving 60 updates per second. By sending and receiving more updates per second, we provide players with more time-accurate information about the other players. This can pose a real problem if the updates are too infrequent. If your console does not send updates quickly enough, then damage registration is bulked together into the next update.

In Splatoon, this means a weapon that deals 30 points of damage per hit will instead deal 60 points in the next update if the client can’t “tick” fast enough. By increasing the update rate, you might still bulk ticks together, but at a much faster rate. This reduces how often a player takes damage from behind cover, or after they have been splatted.
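To make the bulking effect concrete, here’s a toy simulation (not Splatoon’s actual netcode – the fire rate and damage values are made up) showing how the same stream of shots gets reported at two different update rates. At the higher rate each shot lands in its own update; at the lower rate, shots regularly arrive welded together as a single chunk of damage.

```python
FIRE_INTERVAL_MS = 50   # hypothetical weapon: one shot every 50 ms
DAMAGE_PER_SHOT = 30    # hypothetical damage per shot

def damage_per_update(update_rate_hz):
    """Return the damage reported in each network update over half a second."""
    update_interval = 1000.0 / update_rate_hz
    updates, pending = [], 0
    shot_time, next_update = 0.0, update_interval
    while next_update <= 500:
        while shot_time < next_update:
            pending += DAMAGE_PER_SHOT      # shots fired since the last update
            shot_time += FIRE_INTERVAL_MS
        updates.append(pending)             # everything pending goes out at once
        pending = 0
        next_update += update_interval
    return updates

print("60 Hz:", damage_per_update(60))  # damage arrives in small, frequent chunks
print("16 Hz:", damage_per_update(16))  # the same damage arrives bulked together
```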

Tick Rates

The update data mentioned above is generated by clients and the server during what is known as a “processing window”. Your console and the server each have a specific amount of time in which they are required to send the next update to other players. If a client is running at a 30Hz update rate, it is required to “tick” within a 33 millisecond time period. The time it takes for a tick to occur is simply known as the “tick duration”. If a console has to do a lot of processing during this time, it will cause more delay between the updates you send, resulting in potential disconnects or increased latency. The faster your client can send out this update data, the earlier other clients will receive it.

There’s an example of this behaviour in Splatoon 1: if enough Seeker Rush specials were activated at once, you could end up disconnecting from the game because it was doing too much processing. The tick rate is limited by how well the game developers have optimized the netcode, and by how powerful your console is at dealing with this data. By optimizing this processing, game developers can make their games feel more responsive in online multiplayer. If the game goes outside this processing window, it results in all sorts of funky stuff: rubber-banding off ledges, teleporting players, the Tower teleporting in Tower Control, shots getting rejected, and so on.
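For anyone curious what that processing window looks like in practice, here’s a bare-bones fixed-timestep loop in Python. It isn’t how Splatoon is written, obviously – it just shows the budget: at 30Hz, every tick has roughly 33 milliseconds to finish its work, and anything that overruns pushes the next update out late.

```python
import time

TICK_RATE_HZ = 30
TICK_BUDGET = 1.0 / TICK_RATE_HZ   # ~33 ms processing window per tick

def simulate_game_state():
    time.sleep(0.001)               # pretend the game logic takes 1 ms

def run_ticks(num_ticks=90):
    """Toy fixed-timestep loop: each tick must finish inside its budget,
    otherwise the update goes out late and other clients see us lag."""
    next_tick = time.monotonic()
    for tick in range(num_ticks):
        start = time.monotonic()
        simulate_game_state()
        overrun = (time.monotonic() - start) - TICK_BUDGET
        if overrun > 0:
            print(f"tick {tick}: overran the processing window by {overrun * 1000:.1f} ms")
        next_tick += TICK_BUDGET
        time.sleep(max(0.0, next_tick - time.monotonic()))

run_ticks()
```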

Multiplayer Networking Solutions

Game developers are not short of options when it comes to introducing online multiplayer into their titles. What matters most is choosing the right system for the job.

Dedicated Servers

One solution is for the game developer to host dedicated servers themselves. Players then connect to those servers for their online battles. This means the game servers are running on powerful hardware, and the bandwidth available to those servers is incredibly high (1,000 Mbps+). By hosting in a data center, you can also guarantee that your server is connected as close to the backbone of the Internet as possible.

This model also means that a solid anti-cheat system is easier to build, because you now have an authoritative system controlled by the game developer at all times.

Clients also have to make fewer connections, resulting in less bandwidth being used to play. This is perfect for mobile devices, as some mobile providers place strict caps on how much data you can use.

Another positive of using dedicated servers, is that the IP addresses of players are not exposed to other players. This is especially important if you wish to attract the attention of popular livestreamers, as you protect their network connection from the risk of attack.

The main downside to running dedicated servers is the expense. They can be expensive to run – especially if you host multiple servers in each region. The costs can be mitigated either by letting the community host these servers themselves, or by introducing a subscription-based service to play online – similar to Xbox Live. On the other hand, if there are not enough servers worldwide, you introduce another problem: too many high ping players.

Client Hosted Servers

The most renowned use of client hosted servers is in the early Call of Duty series. In this system, one client is selected from all the players to act as the server. There are many flaws with this approach, which is why Call of Duty later moved to dedicated servers in most regions – falling back to a client hosted system when the ping to those servers is too high.

However, with this solution a game developer doesn’t have to pay for dedicated servers, because the consoles are doing the work. From a business perspective, this system can cut costs if online multiplayer is not the primary focus of your game.

One of the disadvantages of this system is that it gives the player who is the “host” an unfair advantage. The host is able to see and react to other players before those players can react to the host. Because the host is the server, they effectively have zero lag.

Of course, using your clients as the server also comes with the troubles of a consumer-grade internet connection. The player’s Internet Service Provider might be routing traffic poorly, which increases lag. You also run the risk of losing data packets, something that is common for portable devices running on mobile data plans.

In a worst-case scenario (and a common one too), the client who becomes the server is also situated on a WiFi network. This further increases the risk of packet loss and high latency, because their router may be in an entirely different room.

The player’s router or console may also be under-powered, and not able to cope with the task of running a game server. This, again, will increase delay between updates. In the case of a badly made router (there are lots of them), packet loss is all too commonplace.

Unlike dedicated servers, the IP addresses of all players are exposed to the client that is selected to be the host. This can be a cause for concern, if you don’t trust the other players in the same match as you.

Another difference from dedicated servers: by letting your clients host a match, you open up a vulnerability when it comes to anti-cheat. A client could be running a modified version of your game, which leaves a means for them to cheat during online play. Anti-cheat is much harder to design and introduce for systems like this.

Lastly, we all remember the early days of Call of Duty, where we had to wait for a host migration if that client disappeared from the game. It was easily one of the most frustrating issues with the series.

Peer to Peer Networking

Peer-to-peer is a system where every player makes a connection to every other player in the game. Also known as a “mesh network”, this is the system that Splatoon on the Wii U used. In a peer-to-peer system, the players still get assigned a “host” to some degree. The hosting player doesn’t deal with everything that a client hosted system does, only the core functionality of the game modes. For instance, in Splatoon’s Tower Control mode, the “host” was assigned the task of keeping track of the Tower. In Rainmaker, the host was also responsible for confirming who had the Rainmaker, and what position it was at. Generally, these tasks are assigned to one player because it wouldn’t make sense to calculate the same thing 8 times.

One benefit of this solution is that no player has an advantage over another… to some extent. Ping becomes an even bigger problem here, because you can’t guarantee that the other players all have the same ping to you. On one hand, you may be approached by a low ping player in a match and be able to make a good battle out of it. On the other hand, you could run into a high ping player, who would be much harder to predict. In Splatoon 1, this was even more problematic because there was no way to check a player’s connection to you. A simple in-game indicator, telling us how much a person is lagging, would have been a huge benefit to the competitive community.

Another benefit of this system is that the game developer doesn’t have to spend money on providing servers for their game. Like client hosted multiplayer games, they are still required to provide a matchmaking server. However, the cost of these servers is relatively low, because the requirements to run them are low.

With that in mind, there are a lot of downsides to this system, much like client hosted games. Anti-cheat is one of those issues, because every client now has the option to run a modified version of the game. The host is no longer a limiting factor here: anyone who is capable of modifying the game’s code from their console will be able to cheat online. In Splatoon 1, the only thing preventing this was a checksum message being sent back to the matchmaking server. This, obviously, was spoofed very easily. Not great, really.
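To illustrate why a client-reported checksum is such weak protection, here’s a hypothetical sketch. Everything in it is made up for the example – it’s not Splatoon’s actual handshake – but the flaw is the same: the server has to trust whatever value the client chooses to send.

```python
import hashlib

def honest_checksum(binary_path):
    """What a legitimate client does: hash its own game binary and report it."""
    with open(binary_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# A modified client never has to run that code. It just replays the checksum
# of the unmodified binary, which it already knows:
KNOWN_GOOD_CHECKSUM = "placeholder-hash-of-the-clean-binary"

def spoofed_checksum():
    return KNOWN_GOOD_CHECKSUM   # the matchmaking server can't tell the difference
```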

Splatoon 1 also had really noticeable connection problems to deal with. Honestly, the majority of these were solved on the Nintendo Switch, because the hardware is more powerful. But using consumer-grade internet connections to host a multiplayer game is still a terrible idea. Bad internet routers, coupled with terrible WiFi connections, resulted in an inadequate online experience. The Switch hardware may be more powerful, but as you will find out, these underlying problems have still not been resolved.

Peer-to-peer systems also create more traffic between the consoles than dedicated servers do. Because every client sends updates to every other client (and spectators now…), a lot of data is being thrown around very unnecessarily. Why should every client send the same data out 9 separate times? With 10 consoles in a match (8 players plus 2 spectators) all doing this, that’s a total of 90 one-way connections in the game. Compare that to a dedicated server, which would only require 20 connections between all clients and the server. Knowing this, it’s insane to think peer-to-peer is even a thing for multiplayer games.
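Those connection counts are easy to sanity-check. The quick sketch below just counts one-way traffic streams for the two topologies, using 10 consoles (8 players plus 2 spectators) to match the figures above.

```python
def one_way_streams(consoles, topology):
    """Count one-way traffic streams for a match."""
    if topology == "peer-to-peer":
        return consoles * (consoles - 1)   # everyone sends to everyone else
    if topology == "dedicated":
        return consoles * 2                # one stream up to the server, one down
    raise ValueError(topology)

for n in (8, 10):   # 8 players, or 8 players plus 2 spectators
    print(n, "consoles:",
          one_way_streams(n, "peer-to-peer"), "peer-to-peer streams vs",
          one_way_streams(n, "dedicated"), "with a dedicated server")
```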

Having your game make 9 connections instead of 1 will also make that same consumer-grade internet connection do far more work than it needs to be doing. This worsens the problem of routers having trouble keeping up, and puts WiFi under even more strain. Buying a better router isn’t always an option for a community with such a young player base; they have to make do with what they’ve got. Heck, you could buy all the equipment in the world and still run into issues with terrible ISPs. Peer-to-peer networking in games is a classic case of using the wrong tool for the job.

Splatoon vs Splatoon 2

To understand what has changed in Splatoon 2, we must first look at what we used to play. For this section, I’ve used Wireshark to capture the network traffic between my console and the other players. To make this possible, I’ve attached my Wii U to my desktop via two USB LAN adapters. The Ethernet port that is built into the computer is then connected to the Internet via a cable. The alternative would be to hook the desktop up to a wireless network, but that’s not as reliable for getting accurate measurements. Finally, in software, I’ve configured my computer to share its incoming Internet connection with the console. With this setup, I’m able to inspect the traffic going to and from my Wii U.

Splatoon Network Test Setup

In Wireshark, I open the LAN adapter interface that connects to my Wii U. I only begin logging once the game has actually started, and I stop logging when the in-game timer has finished counting down, or at the end of overtime. The data is then filtered to only include traffic between me and one other player. This data is exported to a CSV file, which I import into a Google Sheets spreadsheet. The spreadsheet then does the rest of the work for me, like calculating the time differences between packets. Lastly, I do a few extra calculations, such as working out an average ping to the clients I was connected to in-game.
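If spreadsheets aren’t your thing, the same number-crunching can be done with a few lines of Python. This is only a rough stand-in for my sheets – it assumes Wireshark’s default packet-list columns (“Time” and “Source”) in the CSV export, and you’d swap in the other player’s IP address yourself.

```python
import csv
import statistics

def average_update_rate(csv_path, peer_ip):
    """Work out the average update rate of packets arriving from one peer."""
    times = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["Source"] == peer_ip:
                times.append(float(row["Time"]))
    deltas = [b - a for a, b in zip(times, times[1:])]
    avg_delta = statistics.mean(deltas)             # seconds between updates
    print(f"average delta: {avg_delta * 1000:.1f} ms")
    return 1.0 / avg_delta                          # updates per second (Hz)

# Example (hypothetical file name and address):
# average_update_rate("splatoon_capture.csv", "192.168.1.50")
```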

Splatoon’s Update Rate

In Splatoon on the Wii U, I was able to get accurate update rate measurements going to and from other players. I have published this data; you are welcome to republish it so long as credit is given as appropriate (Creative Commons Attribution 4.0). Links can be found at the bottom of this article.

To reiterate the above: the update rate in Splatoon was partially responsible for ranked mode objects (the Tower, the Rainmaker, etc.) lagging behind. It was also partly responsible for players still managing to splat you after they had already been splatted. A low update rate also influences the metagame, because weapons with a high fire rate suffer when the game cannot send out updates fast enough to keep up with them.

Splatoon Update Rate Comparison

From this, we can see that Splatoon on the Wii U sat slightly ahead of most other console shooters. Overwatch, Battlefield 1, and Counter-Strike were the only titles with synchronous update rates on client and server. For Splatoon, no data is available for server updates, because the server is the client! As I mentioned before, Splatoon runs on a peer-to-peer system.

Side note: even though CoD:IW has a very high client update rate, this does not mean that those updates will reach other players at that rate, because the server update rate is not synchronous. In fact, in real world performance, CoD:IW is actually worse than Splatoon. Client update rate alone does not show the whole picture.

Splatoon 2’s Update Rate

I ran the same tests as above, the only difference being that Splatoon 2 runs on the Nintendo Switch. In fact, the setup was exactly the same, because Splatoon 2 also uses a peer-to-peer networking system. Even though Nintendo are obviously targeting the Switch as a portable console, the game is still 7–8 times heavier on network usage than it would be with dedicated servers! What really needs to happen is that peer-to-peer should only be a thing for local play, where all the consoles are in the same room. Heck, they have a lobby for this – it’s called “The Shoal”. Alright, here’s the data anyway.

Splatoon 2 Update Rate Comparison

Splatoon 2 runs at an update rate of 16Hz (15.75Hz if you’re being pedantic). That means the game is running roughly 30% slower than the original. This even puts Minecraft ahead of it – a game that is now 7 years old! Adding to that, it’s still peer-to-peer, so we have all the quirks of the old system to battle with. Going forward, Splatoon 2 does not have the system it needs to become seriously competitive. If that is what Nintendo want, then they need to invest the time and effort to make Splatoon 2 a competitive e-sport. Splatoon 2 needs dedicated servers. Badly.

What’s worse is that in 2018, Nintendo plan on charging users to play online. So, right… we’re now going to be paying for our consoles to play amongst themselves? I can’t justify that cost. Even if you factor in matchmaking server costs, it would still leave Nintendo quite a nice profit. If Nintendo are going to charge us for this service, they had better give us dedicated servers and a higher tick rate.

Bandwidth and Transmission Data Size

In an interview with 4gamer, Hisashi Nogami from the development team boasted about the transmission data size being reduced.

Dealing with lag is difficult, but Nintendo is considering various ways of dealing with it. The transmission data size has been decreased thanks to some optimization, meaning connection errors will happen less than in the first Splatoon.

Nintendo Everything: February 3rd, 2017

Good thing I’m able to validate that claim.

From the Wireshark data I captured, I’ve been able to produce rough estimates of how much bandwidth and data Splatoon 1 and 2 use. In Splatoon 1, each packet was 214 bytes on average. However, this value fluctuates wildly, from 90 bytes all the way up to 1,250 bytes. This fluctuation is a decent indication that Splatoon had problems keeping up with updates within the “processing window” mentioned above. A larger packet means more data, and more data usually means the game is bulking multiple updates together, causing lag on the receiving end.

To work out how much bandwidth is required to play, you multiply the average packet size by the update frequency, and then multiply by 16 (2 connections for all 8 players). Convert that figure (in bytes per second) to kilobits per second, and you have your final answer. To save people the math, I’ve done the work for you.

Splatoon 1 Packet Size (Smallest / Largest / Average):
90 bytes / 1250 bytes / 214 bytes

Splatoon 2 Packet Size (Smallest / Largest / Average):
102 bytes / 1062 bytes / 320 bytes

Wait, are my figures correct? The average transmission data size has actually increased by 50%?! How can Nintendo be so deceptive? What do they actually mean by “transmission data size”, if not the size of the packets themselves? I think what they mean to say is “the update rate has been decreased, so there’s less data being sent”. Okay… let’s check that theory.

Splatoon 1 Data Rate (25Hz): 146kbps, 66 MB/hour

Splatoon 2 Data Rate (16Hz): 80kbps, 36 MB/hour

So that’s a 45% reduction in data usage going from Splatoon 1 to 2. Not bad, but that figure doesn’t account for the 30% reduction in update rate over the original game. Also note that these values only cover actual online play, not matchmaking lobbies and other Nintendo online services.

Let’s adjust those figures so we have Splatoon 2 running on the same update rate as the original game:

Splatoon 1 Data Rate (25Hz): 146kbps, 66 MB/hour

Splatoon 2 Data Rate (25Hz): 132kbps, 58 MB/hour

A mere 9.5% reduction in transmission data size, then. Not such a bold claim after all. You can try this yourself with my spreadsheets; I’ve published them at the end of this article.
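Or, if you’d rather script it than open a spreadsheet, the back-of-the-envelope version of that arithmetic looks like this. Exactly how many traffic streams you multiply by is the fuzzy part – it depends on how many send/receive pairs you decide to count – so treat the output as an estimate rather than an exact reproduction of my spreadsheet figures.

```python
def estimate_data_rate(avg_packet_bytes, updates_per_sec, streams=16):
    """Packet size x update rate x number of traffic streams,
    converted to kilobits per second and megabytes per hour."""
    bytes_per_sec = avg_packet_bytes * updates_per_sec * streams
    kbps = bytes_per_sec * 8 / 1000
    mb_per_hour = bytes_per_sec * 3600 / 1_000_000
    return kbps, mb_per_hour

# Average packet sizes and update rates measured above:
# estimate_data_rate(214, 25)     -> Splatoon 1
# estimate_data_rate(320, 15.75)  -> Splatoon 2
```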

Splatoon 2’s transmission data size has not been reduced. The game is using less data, because it’s running 30% slower than the original title.

Actual Real World Latency

As I’ve mentioned in the introductory sections of this article, the ping does not paint the whole picture when it comes to how much lag a player is experiencing. Other variables form the rest of what we experience during online play. By combining what we now know about the update rate with a player’s actual ping, we can get a rough idea of what real world latency we might expect when playing in either Splatoon 1 or 2.

To do this, we take the ping of the player and add the average delta (Δ) time between each update. For the values in my spreadsheet, I’m using a player who was found to have a ping of 74 milliseconds.
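Written out as code, the calculation is nothing more than the ping plus the gap between updates. Using the nominal update rates (rather than the measured deltas from my spreadsheets) lands within a millisecond or so of the figures below.

```python
def real_world_latency(ping_ms, update_rate_hz):
    """Ping plus the average gap between updates - the rough 'felt' delay."""
    return ping_ms + 1000.0 / update_rate_hz

print(real_world_latency(74, 25))     # Splatoon 1: ~114 ms
print(real_world_latency(74, 15.75))  # Splatoon 2: ~137 ms
```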

Splatoon 1 Average Real World Latency: 113 milliseconds

Splatoon 2 Average Real World Latency: 137 milliseconds

As if the situation wasn’t bad enough, the new game adds an extra 24 milliseconds of lag onto every player. Nice work, Nintendo.

Let’s Talk Dedicated Servers.

Nintendo, listen up. If you want this game to be treated competitively, then you need to act fast. Dedicated servers for online battles are what the community has been asking for since the launch of the original game, and there’s a solid reason for it: online play improves drastically. But you’re a business, and like most businesses, you’ll want to know how much this will actually impact your game. Is it worth the investment? I’m going to say absolutely.

Dedicated Server Estimated Performance Improvements

To really make the problem hit home, I’ve compiled some estimated figures from the data gathered in the original game. I combined this with average ping times to Amazon Web Services in London, using the same internet connection I used to play on. Nintendo currently use AWS for Splatoon 1 and 2 matchmaking, so I figured that would be an appropriate baseline (although AWS is very expensive).

Dedicated Server Ping (Manchester UK -> London, UK): 20 milliseconds

Splatoon 1 Real World Latency (Dedicated Server Estimate): 78 milliseconds

Splatoon 2 Real World Latency (Dedicated Server Estimate): 103 milliseconds

Boosting the Update & Tick Rate

But wait! That’s not all. Dedicated servers only address part of the problem. The update rate is still 16Hz in those figures. If you were to increase the update rate to match Battlefield 1 by EA DICE (60Hz), we could expect even more out of the game.

Splatoon 2 Real World Latency (Dedi Server Est. with 60Hz Update Rate): 57 milliseconds
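For transparency, these are back-of-the-envelope estimates: both players are assumed to sit roughly 20 milliseconds from the server, so the hop is counted twice, and the gap between updates is added on top. That little model is enough to land within a couple of milliseconds of the figures above.

```python
def dedicated_latency(server_ping_ms, update_rate_hz):
    """Two hops via the server, plus the average gap between updates."""
    return 2 * server_ping_ms + 1000.0 / update_rate_hz

print(dedicated_latency(20, 25))     # ~80 ms  (Splatoon 1 estimate)
print(dedicated_latency(20, 15.75))  # ~103 ms (Splatoon 2 estimate)
print(dedicated_latency(20, 60))     # ~57 ms  (Splatoon 2 with a 60Hz update rate)
```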

So there we have it:

Dedicated servers and a 60Hz update rate could offer Splatoon 2 players an estimated 45% faster connection in online multiplayer.

I really hope they act on this information.

Summary

Splatoon 2’s online experience is considerably worse than the original game’s in quite a number of categories. The game runs 30% slower than Splatoon 1, meaning the metagame is going to be heavily skewed in favour of high damage, low time-to-kill weapons. Adding to that, players have been drawn into the title by bold claims that have since proven to be absolute nonsense. Good online multiplayer networking plays a huge role in enticing people to play at a competitive level. If Nintendo are dead set on making Splatoon 2 a competitive title, then they should focus their efforts on making dedicated servers possible in 2018. Dedicated servers would also bolster security and anti-cheat measures. Cheating online became a very big issue for the original game, so they would be doing themselves a huge favour.

The content in this new game is refreshing. Improving the networking would make it truly something to get excited about. Until then, I’m going to be playing casually. Salmon Run, here we come.

Published Data

If you would like to view my raw data, I have published it on Google Sheets.
[Splatoon 1 | Splatoon 2]

Credits

  • Battle(non)sense – the introduction to the basics of game networking is an adaptation of his work.
  • Conro @ConroStuff – Peer review

New PC – A Sub £600 Linux Developer Workstation

It’s Here!

Since 2012, I’ve been battling with a really unstable rig for most of my heavy development work: an old AMD FX-8350 that was one of the first off the production line… yup. Really unstable. I moved house earlier this month and have been able to cut my bills in half, so I could finally save enough money to replace it. One day after payday, I did just that. I’m taking some old parts with me for the time being; later down the line, I’ll probably swap out the MBO, GPU, PSU, CPU cooler, and storage. After over a year of debating (and waiting to see what happened with Ryzen), I’ve made my mind up.

Actually coming to a hardware decision was really tough for me. I’m a heavy Linux user who likes to tinker. A lot. When it comes to gaming, I’m pretty light (TF2, Minecraft, Kerbal Space Program, Hearts of Iron 4, etc.), and all of those run natively on Linux.

What I Needed

Right now, I run Fedora 25. If I wanted to, I have the option of throwing the GPU at a virtual machine running Windows. For me, Ryzen just wasn’t an option. I’ve been an AMD user since 2012, and I swear at times I could’ve beaten this PC to death. I don’t want to fight the instabilities of a new CPU architecture any more. I bought the FX-8350 when it first hit the market back in 2012, and every weekend something seems to go wrong with this machine. For once, I just want a stable PC. I don’t care if it has the latest features – I just want something that works ‘good enough’. So, that’s exactly what this is. It’s also a stepping stone: I’m visiting America for a tournament in June and need to have enough money for the trip. Upgrades can come later.

The absolute minimum I would’ve gone with for this system was a quad core with hyperthreading. For quite a while, I was looking at the i7-6800K and going X99, but the TDP on those chips is on par with my dying FX-8350. They’re loud; I want quiet. Skylake was the cheaper option over Kaby Lake, and I don’t mind running a generation behind. For Linux, too, this chip is tried and tested. An easy 65W is going to be really nice to have around, and solid virtualization support is going to be a big win for me.

What I Really Wanted

Then it was a matter of size and sound. I’m a big fan of quiet PCs (something this AMD build is not). I wanted something small enough to sit on my desk, and quiet to the point of being almost inaudible. Often you can only have one or the other, but with a little work I think I have my solution. For the time being, the Thermaltake Core V1 is one of those ‘good enough’ parts. I like what Fractal Design are doing, so I’ll no doubt pick up something from them in the near future. Nonetheless, this is a pretty nice case for the price.

For the power supply, I really wanted to try out some of Be Quiet’s stuff. The old PC was running a Corsair 750M, and let me say – those things have a ton of coil whine. Having done a lot of reading, it was between Seasonic and Be Quiet for this build, but BQ won me over in the end.

Memory was nothing special. I specifically wanted to avoid Kingston, having used their memory in the past and run into some really funky issues with it. Eventually the CPU, MBO and memory will be swapped out anyway – I’m looking at Xeon v5s for a future workstation – and this build will then be re-purposed into a pfSense router and NAS.

The motherboard decision was basically made for me. MSI have always been in my good books and I wanted to stay with them – Asus has not (M5A97-LE 2.0… twitch). With motherboards, you usually have to spend big money to get the reliability and features that are missing on cheaper boards. My experience with MSI, though, has been a pretty solid one: the reliability is there, even on the cheaper MBOs. I once had a PSU die horrifically on a friend’s build I did a few years ago – the motherboard held back the over-voltage no problem. So from experience, and lack of choice (this is one of the few mITX LGA1151 boards they do), this is the one. My only concern is the thermals on the power phases, but since I’m not overclocking, I should be fine.

Leftover Parts

The GPU decision, again, was made for me. I got this GPU on the cheap when building my last rig. At the time, I didn’t really know the difference between the 900, 700, 600, etc. series, and so went with the cheapest. To my horror, months later I realised I’d bought garbage and wanted it gone. Again, it coil whines like crazy and I hate the thing to death. It won’t be with me for much longer, haha.

The Parts List

CPU: Intel – Core i7-6700 3.4GHz Quad-Core Processor

Doesn’t break the bank when it comes to raw performance. Silky smooth experience running Linux Kernel 4.10

Motherboard: MSI – H110I Pro Mini ITX Socket LGA1151

Definitely get the 802.11ac model if you can afford it, but my goodness this board is tiny. A Mini ITX board like this is outstanding for compact, space-saving builds. It will happily take an i7-6700 no problem. Adequate VRM cooling, with great USB 3.1 options. I’ve yet to test the M.2 slot with a Samsung 960 Pro – it’s mounted on the back of the board, FYI. Love the little thing; it’ll serve me well over the next 5 years!

Memory: Corsair – Vengeance LPX 16GB (2 x 8GB) DDR4-2400

Perfect compatibility with beefy CPU coolers; otherwise there’s hardly anything to say about it. It’s memory; it does the job at stock speeds. XMP is supported too, although Linux users will probably want to keep that disabled for good measure.

Storage: Samsung 850 EVO 120GB 2.5″ Solid State Drive

Fast, but it feels a little dated compared to the rest of what’s on the market these days. I’d look at getting one of Samsung’s NVMe SSDs in the future. If you’re running Windows, however, this drive can do “RAPID mode”, where it plays some voodoo with a memory cache. For games, it’s brilliant… just a little too small on the GB side of things.

Storage: Western Digital Red Pro 4TB 3.5″ 7200RPM

Albeit noisy at times, this drive does the job perfectly. Good support for extended SMART features, and it does support idle power down unlike some Seagate drives I’ve had in the past.

Video Card: EVGA GeForce GTX 760 2GB Superclocked ACX

Two words: coil whine. The fans on this thing have always been rather noisy, especially for an EVGA component – they’re usually alright. I don’t know if the model I have is defective, but the coil whine rises above everything else under reasonably normal loads. I bought it a while ago now, and I’ve always hated it. Wish I’d gone with a 960 at the time rather than the 760… a bad choice from when I was a noob at this PC building thing. Performance is lacking significantly too – probably not when the card launched, but a fair number of years later it’s starting to show against 1050 Tis and 1060s. I’ll be ripping this out of my system as soon as I can afford an upgrade.

Case: Thermaltake Core V1 Mini ITX Tower

Fantastic case for the price; I’m actually blown away by it. The only niggle I have is with the 3.5″ drive mounts – it makes zero sense that the cables can’t be routed down to the bottom. But it is a £35 case, so I guess some compromises have to be made.

Power Supply: be quiet! Pure Power 10 CM 500W 80+ Silver Certified ATX

This supply is advertised as semi-modular; however, it IS NOT. Nonetheless, I do like this little thing.

Case Fan: be quiet! SilentWings 3 pwm 59.5 CFM 140mm

Can’t fault it. Noise is… well, there’s basically nothing – that’s the thing. be quiet! have done a fantastic job on this one.