
Added NETWORKQUALITY to environment and GameUserSettings.ini (default 3) #139

Merged
merged 4 commits into from
Aug 7, 2022

Conversation

@ChalyFlavour ChalyFlavour commented Aug 3, 2022

GameUserSettings.ini has an option for mNetworkQuality.
The game's default value here is mNetworkQuality=0.
This corresponds to the in-game option "Network Quality" with the default value "normal".

I have added an environment variable to run.sh with the default value 3.
This equates to the in-game setting "Ultra".
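
The mechanism described above can be sketched as a small shell snippet. This is a hypothetical illustration, not the repo's actual run.sh: the sed pattern and the config path are assumptions.

```shell
#!/bin/sh
# Sketch only: apply a NETWORKQUALITY environment variable, defaulting
# to 3 ("Ultra"), to a GameUserSettings.ini file. Path and sed pattern
# are assumptions for illustration, not the repo's actual code.
NETWORKQUALITY="${NETWORKQUALITY:-3}"
CONFIG="GameUserSettings.ini"

# Minimal stand-in config for the demonstration
printf 'mNetworkQuality=0\n' > "$CONFIG"

# Overwrite whatever value is present with the configured one
sed -i "s/^mNetworkQuality=.*/mNetworkQuality=${NETWORKQUALITY}/" "$CONFIG"

cat "$CONFIG"
```

Running it without setting NETWORKQUALITY leaves the file at mNetworkQuality=3; exporting NETWORKQUALITY=1 first would write 1 instead.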

Source
https://help.akliz.net/docs/common-satisfactory-errors

Unfortunately, this option is not documented very well. The NetworkQuality setting is defined in Scalability.ini:

```ini
TotalNetBandwidth=104857600
MaxDynamicBandwidth=104857600
MinDynamicBandwidth=10485760
```
The bandwidth settings help avoid sync lag with conveyor belts.

Added NETWORKQUALITY setting
Added NETWORKQUALITY setting

msladek commented Aug 3, 2022

Hey ChalyFlavour, thanks for the PR.

We already set NetworkQuality@3 (and beyond) in the image, see #113 for more details. Do you have a use case where you actually need configurable network quality?


wolveix commented Aug 6, 2022

Closing this per @msladek's comment :) Please feel free to re-open if you have a valid use-case @ChalyFlavour

@wolveix wolveix closed this Aug 6, 2022

ChalyFlavour commented Aug 7, 2022

> Hey ChalyFlavour, thanks for the PR.
>
> We already set NetworkQuality@3 (and beyond) in the image, see #113 for more details. Do you have a use case where you actually need configurable network quality?

Hey Marc, thanks for the hint. As you've already guessed, I hadn't seen #113 before submitting the pull request.

I don't have a use case for a configurable network quality. There shouldn't be any reason to lower it as long as there is enough bandwidth for Satisfactory.

However, I (and people on the server) had some sync issues with conveyor belts and hyper tubes even though the CPU wasn't the bottleneck. So I experimented with the game settings. Every time I changed the NetworkQuality to 3, my problems were solved, while reverting to the default brought the lag back.

@wolveix I'm fine with using my fork if you don't want the configurable network setting in your image. I just thought it wouldn't bloat the image too much.

I have trouble providing technical proof for this one. The average CPU usage on the dedicated server rose with setting 3, and my lags were gone. Some info from the devs would help. Maybe some of the server's code relies on the NetworkQuality setting instead of the bandwidth values, I don't know.


wolveix commented Aug 7, 2022

Hey @ChalyFlavour, that's certainly interesting to hear!

Your comment just prompted me to investigate this further. I noticed that we actually specify mNetworkQuality=0 in GameUserSettings.ini 🤦‍♂️. So while we increase the bandwidth allocations for quality 3 (per @msladek's comment), we actually drop to quality 0 due to this mistake.

I'll push a fix to your repo, and merge this PR :) Thanks for bringing this to our attention!

@wolveix wolveix reopened this Aug 7, 2022
@wolveix wolveix merged commit 38f0380 into wolveix:main Aug 7, 2022

ChalyFlavour commented Aug 7, 2022

Ah, darn it, I didn't notice the @3 in [NetworkQuality@3] myself. I was so fixated on the game settings that I missed that Scalability.ini didn't match them.

Sorry, otherwise I'd have mentioned it in my first PR. The fallback to low quality completely explains the experience I had.
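
For context, the relationship between the two files can be sketched as a Scalability.ini fragment. The bandwidth values are the ones quoted earlier in this thread; the exact layout of the shipped file is an assumption:

```ini
; Sketch only: section name and values as discussed in this thread.
; GameUserSettings.ini's mNetworkQuality=3 selects this @3 bucket;
; with mNetworkQuality=0, these higher bandwidth values never apply.
[NetworkQuality@3]
TotalNetBandwidth=104857600
MaxDynamicBandwidth=104857600
MinDynamicBandwidth=10485760
```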

thanks for merging 🖖
