The original post: /r/homenetworking by /u/TheSilverSmith47 on 2024-12-22 23:56:26.

I want to set up my PC to stream games to my Android phone over Sunshine/Moonlight. As a first test, I used my gaming PC as the host and another PC as the client; the host was tested over both Ethernet and WiFi, while the client was on WiFi the whole time. I can't figure out why my stream quality is so bad. Speedtest.net shows my house's Xfinity connection peaking at around 800 Mb/s down and 100 Mb/s up. Not the fastest uplink, but from what I'm reading online it should be plenty fast enough for game streaming.

I tested for packet loss using iperf3 with the protocol set to UDP.
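
For reference, each run looked roughly like this (a minimal sketch of what I did; it assumes iperf3 is on the PATH, that `iperf3 -s` is already running on the client PC, and that the client address below is just a placeholder):

```python
import json
import subprocess

CLIENT = "192.168.1.50"  # placeholder; the client PC running `iperf3 -s`

def udp_test(mbps: float, seconds: int = 10) -> dict:
    """Run one iperf3 UDP test from the host; return the end-of-test summary."""
    result = subprocess.run(
        ["iperf3", "-c", CLIENT, "-u", "-b", f"{mbps}M",
         "-t", str(seconds), "-J"],  # -J: machine-readable JSON output
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)["end"]["sum"]

summary = udp_test(100)
# Field names as iperf3's JSON emits them for UDP, as far as I can tell.
print(f"loss:   {summary['lost_percent']:.3f} %")
print(f"jitter: {summary['jitter_ms']:.2f} ms")
```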

When the host PC was connected to my router over WiFi, I had to drop the test bitrate all the way down to 500 Kb/s to get a loss rate of 0%. Task Manager's network activity confirmed that the host (a laptop) really was uploading to the router at 500 Kb/s. Streaming at that bitrate was terrible; the compression artifacts and latency were unbearable.
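
That number at least makes the awful quality unsurprising; quick back-of-the-envelope math (assuming a 60 fps stream purely for illustration):

```python
# At 500 Kb/s, a 60 fps stream leaves the encoder roughly 8.3 kilobits,
# i.e. about 1 KB of compressed data, per frame -- hence the artifacts.
bitrate_bps = 500_000
fps = 60
print(f"{bitrate_bps / fps / 1000:.1f} kb per frame")  # ~8.3 kb
```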

With the host PC connected to the router over Ethernet, packets didn't start dropping until the test bitrate reached 112.5 Mb/s, and even then the loss rate was only 0.01%. Task Manager confirmed that upload rate as well. Despite this, when actually streaming a game I still had to drop the bitrate down to 8 Mb/s, and the quality and latency were still bad.
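
Since the client PC was on WiFi for every test, I'm guessing its negotiated link rate is what capped things near 112.5 Mb/s. A quick way to check on the client (Windows; `netsh wlan show interfaces` is a stock command, the filtering is my own, and the exact field labels may vary by Windows version):

```python
import subprocess

# Print the client's negotiated WiFi rates and signal strength,
# e.g. the "Receive rate (Mbps)" / "Transmit rate (Mbps)" lines.
out = subprocess.run(
    ["netsh", "wlan", "show", "interfaces"],
    capture_output=True, text=True,
).stdout
for line in out.splitlines():
    if "rate" in line.lower() or "signal" in line.lower():
        print(line.strip())
```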

Given that packet loss is near zero at 112.5 Mb/s between the host and client PCs, I'd expect essentially zero latency, zero dropped frames, and zero compression artifacts. Yet I still get all three to a horrible degree.
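
One thing the iperf3 numbers don't capture is round-trip latency and jitter while the link is actually loaded, so maybe that's where the problem hides. Here's a minimal UDP echo probe I could run between the two PCs (my own sketch, not part of Sunshine/Moonlight; the port and probe count are arbitrary):

```python
import socket
import statistics
import sys
import time

PORT = 9999   # arbitrary unused UDP port
COUNT = 200   # number of probes to send

def server() -> None:
    # Run this on the client PC: echoes every datagram straight back.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    while True:
        data, addr = sock.recvfrom(64)
        sock.sendto(data, addr)

def client(host: str) -> None:
    # Run this on the host PC, pointed at the client's address.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = []
    for i in range(COUNT):
        start = time.perf_counter()
        sock.sendto(str(i).encode(), (host, PORT))
        try:
            sock.recvfrom(64)
            rtts.append((time.perf_counter() - start) * 1000.0)
        except socket.timeout:
            pass  # treat a missing echo as a lost probe
        time.sleep(0.05)
    print(f"replies:    {len(rtts)}/{COUNT}")
    print(f"median RTT: {statistics.median(rtts):.2f} ms")
    print(f"p95 RTT:    {statistics.quantiles(rtts, n=20)[18]:.2f} ms")

if __name__ == "__main__":
    # usage: `python probe.py server` on one PC,
    #        `python probe.py client <server-ip>` on the other
    if sys.argv[1] == "server":
        server()
    else:
        client(sys.argv[2])
```

Running the client side of this while the stream (or an iperf3 test) is active should show whether latency balloons under load, which WiFi is notorious for.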

How can I diagnose where these dropped frames and input latency are coming from?